WO2024049444A1 - Post-attack real-time trajectory recovery of collaborative robotic arms - Google Patents

Post-attack real-time trajectory recovery of collaborative robotic arms

Info

Publication number
WO2024049444A1
Authority
WO
WIPO (PCT)
Prior art keywords
effector
safety
robotic arm
trajectory
robot
Prior art date
Application number
PCT/US2022/042459
Other languages
French (fr)
Inventor
Radhika GHOSAL
Javier FELIP LEON
David Gonzalez Aguirre
Marcio JULIATO
Vuk Lesi
Manoj Sastry
Original Assignee
Intel Corporation
Priority date
Filing date
Publication date
Application filed by Intel Corporation
Priority to PCT/US2022/042459
Publication of WO2024049444A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1674Programme controls characterised by safety, monitoring, diagnostic
    • B25J9/1676Avoiding collision or forbidden zones
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1669Programme controls characterised by programming, planning systems for manipulators characterised by special application, e.g. multi-arm co-operation, assembly, grasping
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W12/00Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/12Detection or prevention of fraud
    • H04W12/121Wireless intrusion detection systems [WIDS]; Wireless intrusion prevention systems [WIPS]
    • H04W12/122Counter-measures against attacks; Protection against rogue devices
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39135For multiple manipulators operating at same time, avoid collision
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40476Collision, planning for collision free path

Definitions

  • a robot is a machine designed to perform an automated set of repetitive tasks under control of a computer. While some robots take human form, others are designed with a form suitable for a given set of tasks. Examples include industrial and medical operating robots with robotic arms, unmanned aerial vehicle (UAV) drones shaped like aircraft, quadrupedal robots shaped like dogs, and even microscopic nanorobots shaped like a flagellum for propulsion through a human body.
  • FIG. 1 illustrates an exemplary robotic system.
  • FIG. 2A illustrates a first view of a system with an intrusion detection system.
  • FIG. 2B illustrates a second view of a system with an intrusion detection system.
  • FIG. 3 illustrates a block diagram of an intrusion detection system.
  • FIG. 4 illustrates a first logic flow.
  • FIG. 5 illustrates a second logic flow.
  • FIG. 6 illustrates an operating environment of a robotic system.
  • FIG. 7A illustrates an operating environment of a robotic system at a start of a security attack.
  • FIG. 7B illustrates an operating environment of a robotic system when a security attack is detected.
  • FIG. 7C illustrates an operating environment of a robotic system when a security attack is neutralized.
  • FIG. 8 illustrates a region of uncertainty for a robotic system.
  • FIG. 9 illustrates an offline phase for a motion planner of an intrusion detection system.
  • FIG. 10 illustrates a robot and defined trajectories.
  • FIG. 11 illustrates a third logic flow.
  • FIG. 12 illustrates an online phase for a motion controller of an intrusion detection system.
  • FIG. 13 illustrates a fourth logic flow.
  • FIG. 14 illustrates a robot and multiple defined trajectories.
  • FIG. 15 illustrates a fifth logic flow.
  • FIG. 16 illustrates a sixth logic flow.
  • FIG. 17 illustrates a computer-readable medium.
  • FIG. 18 depicts an illustrative computer system architecture that may be used to practice exemplary embodiments described herein.
  • Embodiments are generally directed to techniques to secure robots and corresponding automated control systems from security threats. Some embodiments are particularly directed to techniques to monitor robots and take safety measures when a security or cybersecurity attack is detected.
  • an intrusion detection system may monitor computers or controllers arranged to control robots in order to identify and neutralize security attacks against the robots in a robotic system.
  • a robotic safety system receives security alerts from the IDS, and takes remedial or corrective actions to prevent damage to the robots, humans, objects, products, components, or other valuable resources of the robotic system.
  • Robots have replaced humans in performing repetitive and dangerous tasks.
  • industrial robots are routinely used for manufacturing to perform tasks such as welding, painting and assembly.
  • Articulated robots are a common industrial robot with robotic arms that mimic a human arm. As with human arms, the robotic arms are articulated with several degrees of freedom that allow a wide range of movement in three-dimensions.
  • Other types of robots include cartesian robots, cylindrical coordinate robots, spherical coordinate robots, selective compliance assembly robot arm (SCARA) robots, delta or parallel link robots, serial manipulators, parallel manipulators, anthropomorphic robots, and so forth.
  • multiple robots can collaborate together to operate in close proximity to each other within a shared space.
  • Such robots are sometimes referred to as “collaborative robots.”
  • Collaborative robots can have robotic arms (or other parts) designed to operate together to perform shared tasks, such as assembly of a product.
  • Robotic controllers carefully synchronize movement of the robotic arms to safely perform tasks within the shared space while avoiding collisions, which can potentially damage the robotic arms, products, humans or other valuable resources.
  • Motion planning is used to calculate defined paths or trajectories for the robotic arms, which are rigorously followed during real-time operations of the robotic arms under supervision of associated computers or controllers.
  • Collaborative robots may be subject to a security attack.
  • malware can be injected into robotic controllers to modify movement of one or more robotic arms outside defined paths or trajectories. This could potentially cause the robotic arms to collide, either accidentally or on purpose.
  • Intrusion detection systems are deployed to detect security attacks and send alerts to robotic safety systems, which in turn take remedial actions for the collaborative robots.
  • remedial actions include redirecting movement of robotic arms to safety positions or avoiding subsequent arm movements until the attack is neutralized.
  • there is a period of time between attack initiation and attack neutralization where a current position for a compromised robotic arm is uncertain or unknown as a consequence of the malicious actions of the attacker.
  • motion planning for robotic arms is a compute-intensive operation and is generally not feasible during real-time attacks. Consequently, performing real-time motion planning to move robotic arms to safety positions during an attack can be challenging, particularly within tight time frames often measured in milliseconds or microseconds.
  • a robotic safety system is a hardware and/or software component, device or system that is generally responsible for controlling movement of one or more robotic arms of one or more collaborative robots when under a security attack. For instance, the robotic safety system directs movement of a robotic arm of a pair of collaborative robotic arms from a current position to a safety position when another robotic arm of the collaborative robotic arms is under a security attack.
  • a first phase is an offline phase when collaborative robots are non-operational.
  • a motion planner generates a series of safety trajectories for each robotic arm of the collaborative robots, and stores them in a database.
  • a safety trajectory may comprise a path between a current position and a desired end position for a robotic arm that avoids a defined subspace in a three-dimensional (3D) space.
  • a second phase is an online phase when collaborative robots are operational.
  • a motion controller is notified of a security attack on one robotic arm (referred to herein as a “compromised robotic arm”), retrieves a defined safety trajectory for another robotic arm (referred to herein as an “un-compromised robotic arm”) stored in the database, and controls movement of the un-compromised robotic arm along the defined safety trajectory.
  • the robotic safety system can respond to security threats in real-time with faster reaction times, while reducing or eliminating collisions between the un-compromised robotic arm and the compromised robotic arm.
  • the robotic safety system increases safety for humans, robots, objects, products, components, electronic parts, mechanical parts, and other valuable resources that are in relatively close proximity to the compromised robotic arm (e.g., within a reach envelope) during a security event.
  • a motion planner of a robotic safety system exhaustively calculates safety trajectories between a current position for a robotic arm and a safety position for the robotic arm.
  • the safety trajectories are calculated for each defined time interval or time step associated with the robotic arm.
  • the total operational time of each robotic arm is split into discrete points in time (referred to herein as “defined time intervals” or “defined time steps” which are used interchangeably) at some time granularity, usually on the order of milliseconds.
  • the motion planner receives as input a position associated with a given time step of a target robotic arm, and computes an uncertainty region for the target robotic arm.
  • target robotic arm refers to a robot that could potentially become a “compromised robotic arm” in the future during the online phase.
  • the offline phase exhaustively computes the uncertainty region for all time steps in the planned trajectories of the target robot.
  • an uncertainty region comprises a three-dimensional shape that defines boundaries for a reach envelope of a compromised robotic arm during an uncertainty period.
  • An uncertainty period represents all or some of a time period as measured between when a security attack is initiated and when the security attack is neutralized. Note that a time when a security attack is initiated (referred to herein as an “attack time”) is some time before a time when the security attack is detected (referred to herein as a “detect time”).
  • a time when a security attack is neutralized (referred to herein as a “neutralization time”) is some time after an attack time and detect time.
  • the safety trajectories calculated during the offline phase are stored in a database, and indexed for quick retrieval during an online phase.
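  • The disclosure does not prescribe a particular geometry for the uncertainty region; as a minimal illustrative sketch (the spherical approximation, names, and numeric bounds below are assumptions, not the patented method), a conservative region can be bounded by a sphere centered on the last known end-effector position, with a radius that grows with a worst-case end-effector speed over the uncertainty period:

```python
from dataclasses import dataclass


@dataclass
class UncertaintyRegion:
    """Conservative 3D bound on where a compromised end-effector may reach."""
    center: tuple   # last known end-effector position (x, y, z) in meters
    radius: float   # reach bound in meters


def uncertainty_region(last_known_position, max_speed, uncertainty_period, margin=0.05):
    """Bound the reach envelope by assuming worst-case motion at max_speed
    for the whole uncertainty period, plus a safety margin (all values assumed)."""
    radius = max_speed * uncertainty_period + margin
    return UncertaintyRegion(center=tuple(last_known_position), radius=radius)


def point_is_safe(point, region):
    """True if a candidate waypoint lies outside the uncertainty region."""
    dist2 = sum((p - c) ** 2 for p, c in zip(point, region.center))
    return dist2 > region.radius ** 2


# Example: arm last seen at (0.4, 0.2, 0.5) m, worst-case 1.5 m/s, 200 ms window.
region = uncertainty_region((0.4, 0.2, 0.5), max_speed=1.5, uncertainty_period=0.2)
print(round(region.radius, 2))                  # 0.35 m
print(point_is_safe((0.9, 0.2, 0.5), region))   # True: 0.5 m from the center
```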
  • a motion controller of a robotic safety system controls movement of robotic arms along defined trajectories to accomplish shared tasks.
  • an IDS detects a security attack on one of the robotic arms, it notifies the robotic safety system.
  • the robotic safety system obtains the last time step associated with the last good known position of the attacked robot.
  • the motion controller of the robotic safety system retrieves an appropriate safety trajectory for one or more un-compromised robotic arms from the database based on last known positions, poses or configurations for the one or more un-compromised robotic arms, which are derived from defined time steps associated with the un-compromised robotic arms.
  • the motion controller then instructs the un-compromised robotic arms to follow the safety trajectories from a current position of the robotic arms to a safety position for the robotic arms. Since the safety trajectory is pre-computed to avoid an uncertainty region for the compromised robotic arm, there is a lower probability of collisions between the robotic arms. Further, the use of pre-computed safety trajectories allows faster reaction times for the un-compromised robotic arm, relative to real-time motion planning, thereby lowering a probability of collisions even further. As a result, factories employing collaborative robots with increased resiliency against cyberattacks will have higher productivity and less downtime. This could benefit products in the autonomous systems area, such as robotics, self-driving automobiles, and other automated systems.
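  • As a minimal sketch of this online reaction (the alarm payload, database keys, and controller interface below are illustrative assumptions rather than details from the disclosure), the handler only performs a lookup and a dispatch, with no motion planning in the loop:

```python
class StubController:
    """Stand-in for a real arm controller interface (assumed API)."""
    def __init__(self, name):
        self.name = name

    def follow_trajectory(self, trajectory):
        print(f"{self.name}: moving to safety along {len(trajectory)} waypoints")


def handle_ids_alarm(alarm, safety_trajectories, controllers):
    """React to an IDS alarm by redirecting every un-compromised arm.

    alarm:               assumed dict carrying the compromised arm id and the
                         last good time step reported by the IDS
    safety_trajectories: assumed mapping (arm_id, time_step) -> pre-computed
                         list of joint configurations (offline phase output)
    controllers:         assumed mapping arm_id -> controller object
    """
    compromised = alarm["compromised_arm"]
    time_step = alarm["last_good_time_step"]
    for arm_id, controller in controllers.items():
        if arm_id == compromised:
            continue  # the compromised arm is handled by attack neutralization
        # No motion planning here: just a lookup of the pre-computed trajectory.
        trajectory = safety_trajectories[(arm_id, time_step)]
        controller.follow_trajectory(trajectory)


# Example: arm_B is attacked at time step 5; arm_A is redirected to safety.
table = {("arm_A", 5): [[0.0, 0.1], [0.2, 0.3], [0.4, 0.2]]}
controllers = {"arm_A": StubController("arm_A"), "arm_B": StubController("arm_B")}
handle_ids_alarm({"compromised_arm": "arm_B", "last_good_time_step": 5},
                 table, controllers)
```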
  • a and b and c are intended to be variables representing any positive integer.
  • a complete set of components 123 illustrated as components 123-1 through 123-a (where, for example, a = 5) may include components 123-1, 123-2, 123-3, 123-4, and 123-5.
  • the embodiments are not limited in this context.
  • FIG. 1 depicts a robotic system 100.
  • the robotic system 100 may comprise any robotic system designed for a particular application within a robotic system environment 134.
  • robotic systems suitable for the robotic system 100 can include, without limitation, articulated robots, cartesian robots, cylindrical coordinate robots, spherical coordinate robots, selective compliance assembly robot arm (SCARA) robots, delta or parallel link robots, serial manipulators, parallel manipulators, anthropomorphic robots, and so forth.
  • the robotic system 100 is a collaborative robotic system having multiple robots each with one or more articulated robotic arms. However, embodiments are not limited to this particular type of robotic system.
  • the robotic system 100 may comprise multiple collaborative robots 102, 118.
  • Each robot 102, 118 may have one or more robotic arms, such as a first robotic arm 106 and a second robotic arm 122, respectively.
  • the robotic arms 106, 122 may be collaborative robotic arms arranged to move in a shared physical space 138 in order to execute tasks to accomplish shared goals, such as manipulating an object 136.
  • the shared physical space 138 is sometimes referred to as a “workspace” or “taskspace” for the robotic system 100.
  • the object 136 may be any physical object suitable for a given application for the robotic system 100, such as a human patient for a medical operating robot, a product or item for assembly or disassembly by an industrial operating robot, an automobile for welding or painting, and so forth.
  • the robotic system 100 may contain more than two robots and/or robotic arms operating within the shared physical space 138. Embodiments are not limited in this context.
  • the robotic arms 106, 122 are articulated robotic arms.
  • An articulated robotic arm is designed as a series of links connected by motor-actuated joints that extend from a base to an end-effector.
  • Articulated robotic arms often have an anthropomorphic arm structure that mimics a human arm, such as having a shoulder, an elbow and a wrist.
  • the robotic arms 106, 122 are articulated with several degrees of freedom that allow a wide range of movement in three-dimensions.
  • the robotic arms 106, 122 may have up to six joints, since it requires at least six degrees of freedom to place a manipulated object 136 in an arbitrary position and orientation in the shared physical space 138 of the robots 102, 118.
  • the first robotic arm 106 is a mechanical arm with a robot base 104 and one or more links 110, 114 to form a first kinematic chain.
  • Kinematics is used to describe a motion of systems composed of joined parts (e.g., multi-link systems), such as in an engine, robotic arm or human skeleton.
  • the links 110, 114 are connected by joints 108, 112 allowing rotational motion or translational displacement.
  • At the end of the link 114 is a first end-effector 116, which is considered a terminus for the first kinematic chain.
  • the second robotic arm 122 is a mechanical arm with a robot base 120 and one or more links 126, 130 to form a second kinematic chain.
  • the links 126, 130 are connected by joints 124, 128 allowing rotational motion or translational displacement.
  • At the end of the link 130 is a second end-effector 132, which is considered a terminus for the second kinematic chain.
  • the links and joints denoted for the robots 102, 118 are exemplary only, and the robots 102, 118 may have more links and joints not labeled in FIG. 1.
  • the end effectors 116, 132 can be any devices located at the terminus of the robotic arms 106, 122, that are designed to interact with robotic system environment 134, such as manipulating the object 136.
  • An exact implementation for the end-effectors 116, 132 depends on a given application for the robots 102, 118.
  • Examples for the end-effectors 116, 132 can include, without limitation, tools, grippers, pins, vacuums, magnets, welding guns, paint spray gun, surgical tool, drill, cutters, robotic hands, and so forth.
  • the position and orientation of the end-effectors 116, 132 are derived from joint positions using a geometric model of the robotic arms 106, 122, respectively.
  • a reachable workspace within the shared physical space 138 of the end-effectors 116, 132 is a manifold of reachable frames.
  • a reachable workspace is sometimes referred to herein as a “reachable envelope” for a robotic arm 106, 122.
  • a dextrous workspace comprises points of the reachable workspace where the robot can generate velocities that span the complete tangent space at that point, e.g., it can translate a manipulated object 136 with three degrees of freedom, and rotate the object with three degrees of rotation freedom.
  • Relationships between joint space and Cartesian space coordinates of the object 136 held by the robotic arms 106, 122 are in general multiple-valued. A same pose can be reached by the robotic arms 106, 122 in different ways, each with a different set of joint coordinates. Hence the reachable workspace of the robotic arms 106, 122 is divided in configurations (also called assembly modes), in which the kinematic relationships are locally one-to-one.
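  • For intuition about this joint-space/Cartesian-space relationship, the following planar two-link example (an illustrative simplification of the multi-joint arms described above, with assumed link lengths) shows the forward mapping from joint coordinates to an end-effector position and the multi-valued inverse: the same Cartesian point is reached by an elbow-up and an elbow-down configuration, i.e., two assembly modes:

```python
import math

L1, L2 = 0.5, 0.4  # assumed link lengths in meters


def forward_kinematics(q1, q2):
    """End-effector (x, y) of a planar two-link arm from joint angles in radians."""
    x = L1 * math.cos(q1) + L2 * math.cos(q1 + q2)
    y = L1 * math.sin(q1) + L2 * math.sin(q1 + q2)
    return x, y


def inverse_kinematics(x, y):
    """Return both joint solutions (elbow-down and elbow-up) that reach (x, y)."""
    c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    if abs(c2) > 1.0:
        return []  # point lies outside the reachable workspace
    solutions = []
    for sign in (+1.0, -1.0):  # the two assembly modes
        q2 = sign * math.acos(c2)
        q1 = math.atan2(y, x) - math.atan2(L2 * math.sin(q2), L1 + L2 * math.cos(q2))
        solutions.append((q1, q2))
    return solutions


# The same Cartesian point maps back from two different joint configurations.
for q1, q2 in inverse_kinematics(0.6, 0.3):
    print(round(q1, 3), round(q2, 3), "->", forward_kinematics(q1, q2))
```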
  • the robotic arms 106, 122 are programmed to follow one or more defined trajectories in the shared physical space 138 while performing a collaborative task on the object 136.
  • a defined trajectory is a sequence of valid configurations to move an object from a source to a destination.
  • a motion planner implements a motion planning algorithm to compute a continuous path (or series of points) that connects a start configuration S and a goal configuration G, while avoiding collision with known obstacles.
  • An exemplary motion planner and defined trajectories are described in more detail with reference to FIG. 3.
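  • As a minimal sketch of such a motion planning algorithm (a generic sampling-based rapidly-exploring random tree (RRT) in joint space; the step size, bounds, and collision checker are illustrative assumptions and not the specific planner of the disclosure):

```python
import math
import random


def rrt_plan(start, goal, in_collision, bounds, step=0.1, goal_tol=0.15,
             max_iters=5000, seed=0):
    """Grow a rapidly-exploring random tree from a start configuration toward a goal.

    start, goal:  tuples of joint coordinates
    in_collision: callable(config) -> True if the configuration hits an obstacle
    bounds:       list of (low, high) per joint, used for random sampling
    Returns a list of configurations from start toward goal, or None on failure.
    """
    rng = random.Random(seed)
    nodes = [start]
    parent = {0: None}

    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def steer(a, b):
        d = dist(a, b)
        if d <= step:
            return b
        return tuple(x + (y - x) * step / d for x, y in zip(a, b))

    for _ in range(max_iters):
        # Bias 10% of samples toward the goal to speed convergence.
        sample = goal if rng.random() < 0.1 else tuple(
            rng.uniform(lo, hi) for lo, hi in bounds)
        nearest = min(range(len(nodes)), key=lambda i: dist(nodes[i], sample))
        new = steer(nodes[nearest], sample)
        if in_collision(new):
            continue
        nodes.append(new)
        parent[len(nodes) - 1] = nearest
        if dist(new, goal) < goal_tol:
            # Walk back up the tree to recover the path.
            path, i = [], len(nodes) - 1
            while i is not None:
                path.append(nodes[i])
                i = parent[i]
            return list(reversed(path))
    return None


# Example: 2-joint arm; forbid configurations where both joints are near 1.0 rad.
blocked = lambda q: abs(q[0] - 1.0) < 0.3 and abs(q[1] - 1.0) < 0.3
path = rrt_plan((0.0, 0.0), (2.0, 2.0), blocked, [(-0.5, 2.5), (-0.5, 2.5)])
print(len(path) if path else "no path found")
```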
  • FIG. 2A depicts a system 200a.
  • the system 200a represents a control system for the robotic system 100 while operating under normal conditions.
  • the system 200a comprises a network 212 communicatively coupled to an intrusion detection system (IDS) 202.
  • the IDS 202 is communicatively coupled to a robotic safety system 214, which in turn, is communicatively coupled to a pair of controllers 204, 206.
  • the controllers 204, 206 are arranged to control operations for the robots 102, 118, respectively.
  • the controllers 204, 206 may be implemented with various hardware elements, software elements, or a combination of both, as described in more detail with reference to FIG. 18. While two controllers 204, 206 are shown in system 200a, a single controller may be used to control both of the robots 102, 118. Embodiments are not limited in this context.
  • the IDS 202 is a device or software application that monitors a device, network or systems for malicious activity or policy violations.
  • the IDS 202 may be specifically tuned to detect various types of attacks, such as malware injections into the controllers 204, 206, spoofing of sensing and actuation, denial of service of robotic functions, or other specific attack vectors for robotic system 100. Any intrusion activity or violation is typically reported either to other devices in the same network 212, an administrator, and/or collected centrally or locally using a security information and event management (SIEM) system.
  • a SIEM system combines outputs from multiple sources and uses alarm filtering techniques to distinguish malicious activity from false alarms.
  • the IDS 202 may be implemented for other devices in the robotic system environment 134, such as other robotic systems coupled to the network 212, to provide a more comprehensive security solution against an attacker.
  • the IDS 202 can utilize any number of different detection methods to detect an attack.
  • the IDS 202 may implement a signature-based method, a statistical anomaly-based method, a stateful protocol analysis method, a machine-learning based method, or some combination of all four.
  • a signature-based IDS monitors packets in the network 212 and compares with pre-configured and pre-determined attack patterns known as signatures.
  • a statistical anomaly-based or machine-learning based IDS monitors network traffic and compares it against an established baseline. The baseline will identify what is “normal” for the network 212, such as what sort of bandwidth is generally used and what protocols are used. For instance, ensemble models that use the Matthews correlation coefficient to identify unauthorized network traffic have obtained 99.73% accuracy.
  • a stateful protocol analysis IDS identifies deviations of protocol states by comparing observed events with defined profiles of generally accepted definitions of benign activity. It will be appreciated that these detection methods are by way of example and not limitation. Other embodiments may use different detection methods as well. The embodiments are not limited in this respect.
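  • As a minimal sketch of the statistical anomaly-based approach (the monitored feature, baseline samples, and threshold below are illustrative assumptions), a z-score over a traffic feature such as packets per second flags deviations from a learned baseline:

```python
import statistics


class AnomalyBaseline:
    """Flag traffic samples that deviate strongly from a learned baseline."""

    def __init__(self, z_threshold=3.0):
        self.z_threshold = z_threshold
        self.mean = None
        self.stdev = None

    def fit(self, baseline_samples):
        """Learn 'normal' behavior, e.g., packets/second observed during clean operation."""
        self.mean = statistics.fmean(baseline_samples)
        self.stdev = statistics.pstdev(baseline_samples) or 1e-9  # avoid divide-by-zero

    def is_anomalous(self, sample):
        """True if the sample deviates from the baseline by more than z_threshold sigmas."""
        z = abs(sample - self.mean) / self.stdev
        return z > self.z_threshold


ids = AnomalyBaseline()
ids.fit([98, 102, 101, 97, 103, 99, 100, 98])  # packets/sec during normal operation
print(ids.is_anomalous(101))   # False: within the baseline
print(ids.is_anomalous(450))   # True: e.g., command flooding or denial of service
```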
  • the robotic safety system 214 is a device that is implemented on a communications path between the IDS 202 and the controllers 204, 206.
  • the robotic safety system 214 is a software application that receives alerts or alarm signals from the IDS 202 indicating that one or both of the controllers 204, 206 are under a security attack.
  • the robotic safety system 214 receives an alarm notification, it begins robotic recovery operations or decisions to ensure safe operations for one or both of the robots 102, 118 while under attack.
  • One example of robotic recovery operations may include controlling movement for the robotic arms 106, 122 along pre-computed safety trajectories to designated safety positions.
  • a safety position may comprise a resting pose or position for an un-compromised robotic arm where it is outside an uncertainty region for a compromised robotic arm and therefore safe from collisions from the compromised robotic arm.
  • FIG. 2B depicts a system 200b.
  • the system 200b is similar to the system 200a.
  • the system 200b illustrates a case where an attacker 208 has attacked the controller 206 for the robot 118.
  • the attacker 208 has gained unauthorized access and control over operations for the controller 206, and therefore has control over operations for the robot 118 and the robotic arm 122.
  • the attacker 208 can cause the robotic arm 122 to deviate from its normal working defined trajectory to an unknown trajectory, thereby creating a possibility of collisions with the robotic arm 106 of the robot 102 which is operating within the same shared physical space 138.
  • the IDS 202 detects the attack, and immediately sends an alarm 210 to other devices on the network 212, including the robotic safety system 214, the controller 204 of the robot 102, and other devices accessible via the network 212 (e.g., other robots or a SIEM).
  • FIG. 3 depicts a more detailed block diagram of components implemented for the robotic safety system 214.
  • the robotic safety system 214 is representative of any number and type of electronic devices arranged to operate and process messages for the robotic system 100. More particularly, the robotic safety system 214 includes a processing circuitry 302, an interface 322 and a memory 304.
  • the processing circuitry 302 may include circuitry or processor logic, such as, for example, any of a variety of commercial processors.
  • the processing circuitry 302 may include multiple processors, a multi-threaded processor, a multi-core processor (whether the multiple cores coexist on the same or separate dies), and/or a multi-processor architecture of some other variety by which multiple physically separate processors are in some way linked.
  • the processing circuitry 302 may include graphics processing portions and may include dedicated memory, multiple-threaded processing and/or some other parallel processing capability.
  • the processing circuitry 302 may be an application specific integrated circuit (ASIC) or a field programmable gate array (FPGA).
  • the processing circuitry 302 may be circuitry arranged to perform computations related to the robotic system 100, the robots 102, 118, the robotic arms 106, 122, the controllers 204, 206, switching, routing, security, and so forth.
  • the interface 322 may include logic and/or features to support a communication interface.
  • the interface 322 may include one or more interfaces that operate according to various communication protocols or standards to communicate over direct or network communication links. Direct communications may occur via use of communication protocols or standards described in one or more industry standards (including progenies and variants).
  • the interface 322 may facilitate communication over a bus, such as, for example, peripheral component interconnect express (PCIe), non-volatile memory express (NVMe), universal serial bus (USB), system management bus (SMBus), SAS (e.g., serial attached small computer system interface (SCSI)) interfaces, serial AT attachment (SATA) interfaces, or the like.
  • interface 322 may be arranged to support wireless communication protocols or standards, such as, for example, Wi-Fi, Bluetooth, ZigBee, LTE, 5G, 6G or the like.
  • the interface 322 of the robotic safety system 214 may be arranged to communicate over a wired or wireless connection to the other devices in the systems 200a, 200b, such as the IDS 202, the controllers 204, 206, the robots 102, 118, or other devices via the network 212.
  • the memory 304 may include logic, a portion of which includes arrays of integrated circuits, forming non-volatile memory to persistently store data, or a combination of non-volatile memory and volatile memory. It is to be appreciated that the memory 304 may be based on any of a variety of technologies. In particular, the arrays of integrated circuits included in memory 304 may be arranged to form one or more types of memory, such as, for example, dynamic random access memory (DRAM), NAND memory, NOR memory, or the like.
  • the memory 304 includes a set of instructions 306, input data 308, output data 310, a motion planner 312 and a motion controller 314.
  • the memory 304 may optionally include an integrated IDS, such as the IDS 202 as described with reference to FIGS. 2A, 2B.
  • the robotic safety system 214 may also include one or more databases to store information, such as a first database to store configuration information 316, a second database to store defined trajectories 318, and a third database to store safety trajectories 320.
  • the configuration information 316 may generally include various parameters or properties associated with one or both of the robots 102, 118 and/or corresponding robotic arms 106, 122.
  • Examples of configuration information 316 may include mechanical and electrical properties, type information, arm information, base information, link information, joint information, end-effector information, pose information, kinematics, state information, defined time intervals or time steps T1 to TN (also denoted herein as “T1-TN”), material types, physics-related information, object information, and any other information pertaining to the robotic arms 106, 122.
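  • A minimal sketch of how such per-time-step configuration information 316 might be structured (the field names and units are illustrative assumptions, not the schema used by the disclosure):

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class ArmConfiguration:
    """Configuration information for one robotic arm at one defined time step."""
    arm_id: str
    time_step: int
    joint_positions: List[float]       # radians, one entry per joint
    joint_velocities: List[float]      # rad/s, one entry per joint
    joint_accelerations: List[float]   # rad/s^2, one entry per joint
    end_effector_pose: Tuple[float, ...] = (0.0, 0.0, 0.0, 0.0, 0.0, 0.0)  # x, y, z, roll, pitch, yaw
    end_effector_type: str = "gripper"


# Example: a six-joint arm at time step 5.
cfg = ArmConfiguration(
    arm_id="arm_A",
    time_step=5,
    joint_positions=[0.0, -0.4, 1.2, 0.0, 0.8, 0.0],
    joint_velocities=[0.0] * 6,
    joint_accelerations=[0.0] * 6,
)
print(cfg.arm_id, cfg.time_step, len(cfg.joint_positions))
```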
  • the defined trajectories 318 may generally include defined trajectories for the robotic arms 106, 122 while in normal operation, such as when operating together in the shared physical space 138 while performing a collaborative task on the object 136.
  • a defined trajectory is a sequence of valid configurations to move an object from a source to a destination.
  • the motion planner 312 implements a motion planning algorithm to compute a continuous path (or series of points) that connects a start configuration S and a goal configuration G, while avoiding collision with known obstacles.
  • the robotic arms 106, 122 and obstacle geometry is described in a two-dimensional (2D) or three-dimensional (3D) workspace, while motion is represented as a path in configuration space C.
  • a configuration describes a pose of one or both of the robotic arms 106, 122, and the configuration space C is a set of all possible configurations.
  • a set of configurations that avoids collisions with obstacles is called free space.
  • the complement of free space is called obstacle or forbidden space.
  • An obstacle or forbidden space is considered a space to which the robotic arms 106, 122 cannot move.
  • An obstacle or forbidden space is sometimes referred to herein as an “uncertainty region” or “occupied space” for one or both of the robotic arms 106, 122.
  • a target space is a subspace of free space that denotes a space to which the robotic arms 106, 122 can move to reach a targeted destination.
  • a targeted destination is sometimes referred to herein as a “safety position” or “safety pose” for one or both of the robotic arms 106, 122.
  • the safety position or safety pose could be the originally planned end position of a defined trajectory 318 that the robot was trying to achieve before an attack.
  • the configuration of the robot as well as the end-effector 116, 132 trajectory must be safe to avoid collisions.
  • the robotic arms 106, 122 may take alternative trajectories, referred to as safety trajectories 320, as discussed in more detail below.
  • the alternative trajectories are not necessarily the shortest, most efficient, or smoothest trajectories. However, an alternative trajectory should guarantee that the robot can achieve the safety position or safety pose, e.g., an originally planned end position.
  • the safety trajectories 320 may generally define a series of positions between a current position for a given end-effector and a safety position for the given end-effector that moves in a target space and avoids an uncertainty region of another end-effector at a defined time step T1-TN.
  • a safety position may comprise a resting pose or position for a robotic arm where it is outside an uncertainty region for a compromised robotic arm and therefore safe from collisions from the compromised robotic arm.
  • as described in more detail below, the motion planner 312 may receive as input configuration information 316, compute a set of safety trajectories 320 for both robotic arms 106, 122, and store them in a database or other data structure in persistent memory (e.g., flash memory) or external storage (e.g., a hard drive).
  • the motion planner 312 computes safety trajectories 320 for the robotic arms 106, 122 of the robotic system 100. More particularly, for each defined time step T1-TN associated with the robotic arms 106, 122, the motion planner 312 stores a corresponding safety trajectory 320 for the robotic arms 106, 122 that avoids an uncertainty region of a compromised robotic arm. For instance, the motion planner 312 computes a first set of safety trajectories 320 for the robotic arm 106 in anticipation of the robotic arm 122 being compromised by the attacker 208. Conversely, the motion planner 312 computes a second set of safety trajectories 320 for the robotic arm 122 in anticipation of the robotic arm 106 being compromised by the attacker 208.
  • the motion planner 312 computes the safety trajectories 320 during an offline phase, and stores the pre-computed safety trajectories 320 in a database, indexed in a manner for quick retrieval during an online phase for the robotic arms 106, 122 in the event of a security breach or incident compromising one or both robotic arms 106, 122.
  • the safety trajectories 320 are indexed by defined time steps T1-TN. Operations for the motion planner 312 will be discussed in more detail with reference to FIG. 4.
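  • A minimal sketch of such an index (illustrative only; any key-value or relational store could serve), keyed by arm identifier and defined time step so the online phase can retrieve a safety trajectory with a single lookup, with a fallback to the last good known time step when the exact step is missing:

```python
class SafetyTrajectoryStore:
    """Pre-computed safety trajectories indexed for fast retrieval."""

    def __init__(self):
        self._table = {}   # (arm_id, time_step) -> list of joint configurations

    def store(self, arm_id, time_step, trajectory):
        self._table[(arm_id, time_step)] = trajectory

    def retrieve(self, arm_id, time_step):
        """Return the safety trajectory for this arm at this time step.

        Falls back to the nearest earlier time step if the exact one is missing,
        mirroring the use of the last good known position."""
        key = (arm_id, time_step)
        if key in self._table:
            return self._table[key]
        earlier = [t for (a, t) in self._table if a == arm_id and t <= time_step]
        if not earlier:
            raise KeyError(f"no safety trajectory stored for {arm_id}")
        return self._table[(arm_id, max(earlier))]


store = SafetyTrajectoryStore()
store.store("arm_A", 5, [[0.0, 0.1, 0.2], [0.1, 0.2, 0.3]])
print(len(store.retrieve("arm_A", 7)))   # 2: falls back to the entry at time step 5
```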
  • the motion controller 314 assists the controllers 204, 206 in controlling movement for the robotic arms 106, 122.
  • the robotic safety system 214 receives an alarm 210 from the IDS 202
  • the motion controller 314 begins robotic recovery operations or decisions to ensure safe operations for one or both of the robots 102, 118 and corresponding robotic arms 106, 122, respectively, while under attack.
  • robotic recovery operations may include controlling movement for the robotic arms 106, 122 along pre-computed safety trajectories 320 to designated safety positions.
  • the motion controller 314 identifies which of the robotic arms 106, 122 is currently under attack, acquires its last good known position, and retrieves a safety trajectory 320 for the un-compromised robotic arm 106, 122. For instance, the motion controller 314 may identify the robotic arm 122 as compromised and under attack from information contained within the alarm 210. The motion controller 314 can retrieve a safety trajectory 320 for the robotic arm 106 which is un-compromised and at risk of damage by the compromised robotic arm 122. The motion controller 314 can send the retrieved safety trajectory 320 for the un-compromised robotic arm 106 to the controller 204 for the un-compromised robotic arm 106.
  • the controller 204 then causes the un-compromised robotic arm 106 to deviate from its normal defined trajectory 318 and move along the received safety trajectory 320 to guide the un-compromised robotic arm 106 to a safety position.
  • the motion controller 314 performs similar operations for the robotic arm 122 when the robotic arm 106 is compromised.
  • the motion controller 314 may cooperatively operate with the controllers 204, 206, or in some instances, completely take over operations for the robots 102, 118 and corresponding robotic arms 106, 122.
  • the motion controller 314 can send a retrieved safety trajectory 320 or instructions for the retrieved safety trajectory 320 to the appropriate controller 204, 206 for execution by the receiving controller 204, 206.
  • the motion controller 314 can have authority (e.g., root or administrator permissions) to completely bypass the controllers 204, 206 and take direct operational control of the robotic arms 106, 122.
  • the motion controller 314 causes the robotic arm 106 or the robotic arm 122 to deviate from its normal defined trajectory 318 and move along the received safety trajectory 320 to guide the robotic arm 106 or the robotic arm 122 to a predetermined safety position. This may be advantageous when the robotic safety system 214 is uncertain whether a controller 204, 206 of an un-compromised robotic arm 106, 122 has also been compromised. Operations for the motion controller 314 will be discussed in more detail with reference to FIG. 5.
  • FIG. 4 illustrates an embodiment of a logic flow 400.
  • the logic flow 400 may be representative of some or all of the operations executed by one or more embodiments described herein.
  • the logic flow 400 may include some or all of the operations performed by the motion planner 312 to generate or compute safety trajectories 320. Embodiments are not limited in this context.
  • the logic flow 400 illustrates a series of general operational blocks 402 to 408.
  • logic flow 400 receives configuration information for a first robot with a first robotic arm and a first end-effector at a defined time step.
  • logic flow 400 receives an uncertainty region for a second robot with a second robotic arm and a second end-effector, the uncertainty region to comprise a three-dimensional shape that defines boundaries for a reach envelope of the second end-effector at the defined time step.
  • logic flow 400 determines a safety trajectory for the first end-effector based on the uncertainty region, the safety trajectory to define a series of positions between a current position for the first end-effector and a safety position for the first end-effector that avoids the uncertainty region of the second end-effector at the defined time step.
  • logic flow 400 stores the safety trajectory in a database.
  • motion planner 312 plans the safety trajectory so that all parts of all configurations for the robots 102, 118 avoid the uncertainty region as well. This includes, for example, joints, links, rotators, different types of end-effectors, and any other movable parts of the robots 102, 118. In other words, the configuration space taken by an un-compromised robot must also not intersect an uncertainty region of a target robot or compromised robot.
  • the motion planner 312 may implement the logic flow 400 to compute safety trajectories 320 for both of the robotic arms 106, 122.
  • the motion planner 312 may receive configuration information 316 for a first robot 102 with a first robotic arm 106 and a first end-effector 116 at a defined time step T1-TN, where N represents any positive integer.
  • the motion planner 312 may also receive configuration information 316 for a second robot 118 with a second robotic arm 122 and a second end-effector 132 at the defined time step T1-TN.
  • the configuration information 316 for the second robot 118 may include, for example, a last good configuration for the second robot 118 at every possible time of attack.
  • the configuration information 316 for the second robot 118 may further include, for example, an uncertainty region for a second robot 118 with a second robotic arm 122 and a second end-effector 132.
  • the uncertainty region may comprise a three-dimensional (3D) shape or subspace in Cartesian space that defines boundaries for a reach envelope of the second end-effector 132 at the given defined time step T1-TN.
  • the motion planner 312 may determine a safety trajectory 320 for the first end-effector 116 based on the uncertainty region.
  • the safety trajectory may define a series of positions between a current position for the first end-effector 116 and a safety position for the first end-effector 116 that avoids the uncertainty region of the second end-effector 132 at the defined time step T1-TN.
  • the motion planner 312 may store the safety trajectory in the third database with the other safety trajectories 320.
  • the motion planner 312 may repeat the logic flow 400 for each of the robotic arms 106, 122 at each of the defined time steps T1-TN until all possible safety trajectories 320 are computed for all the defined time steps T1-TN.
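  • A minimal sketch of this exhaustive offline loop (the planner and store callables are placeholders for the actual motion planning and database logic, and the arm identifiers are assumptions):

```python
def precompute_safety_trajectories(arms, time_steps, plan_safety_trajectory, store):
    """Offline phase: for every arm and every defined time step, plan a safety
    trajectory that assumes the *other* arm is compromised, and store it.

    arms:                    list of arm ids, e.g. ["arm_A", "arm_B"]
    time_steps:              iterable of defined time steps T1..TN
    plan_safety_trajectory:  callable(arm_id, compromised_id, time_step) -> trajectory
    store:                   callable(arm_id, time_step, trajectory)
    """
    for arm_id in arms:
        for other_id in arms:
            if other_id == arm_id:
                continue
            for t in time_steps:
                trajectory = plan_safety_trajectory(arm_id, other_id, t)
                store(arm_id, t, trajectory)


# Stub run: the "planner" just records which case it was asked to solve.
table = {}
precompute_safety_trajectories(
    ["arm_A", "arm_B"], range(1, 6),
    plan_safety_trajectory=lambda a, c, t: [f"waypoints for {a} if {c} attacked at t{t}"],
    store=lambda a, t, traj: table.setdefault((a, t), traj))
print(len(table))   # 10 entries: 2 arms x 5 time steps
```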
  • the motion planner 312 may implement additional operations associated with computing safety trajectories 320 for the robotic arms 106, 122.
  • the motion planner 312 may further compute safety trajectories 320 using configuration information 316, where each defined time step T1-TN is associated with a defined position, pose or configuration for the first end-effector 116 and the second end-effector 132.
  • the motion planner 312 may further compute safety trajectories 320 using configuration information 316 where the configuration information includes a set of parameters for the first robotic arm 106 at a defined time step T1-TN, the set of parameters including a joint position state for each joint of the first robotic arm 106, a joint velocity state for each joint of the first robotic arm 106, and a joint acceleration state for each joint of the first robotic arm 106.
  • the motion planner 312 may further compute safety trajectories 320 using configuration information 316 where the configuration information 316 includes a set of parameters for the first robotic arm 106 at a defined time step T1-TN, the set of parameters includes an initial kinematic frame and a final kinematic frame for the first robotic arm 106 at the defined time step T1-TN.
  • the motion planner 312 may further compute safety trajectories 320 by determining an uncertainty region for the second robot 118 with the second robotic arm 122 and the second end-effector 132 based on configuration information 316 for the second robot 118, where the configuration information 316 includes a set of parameters for the second robotic arm 122 at a defined time step T1-TN, the set of parameters includes a joint position state for each joint of the second robotic arm 122, a joint velocity state for each joint of the second robotic arm 122, and a joint acceleration state for each joint of the second robotic arm 122.
  • the motion planner 312 may further compute safety trajectories 320 by adapting a safety trajectory 320 to ensure movement of each joint of the first robotic arm 106 remains within joint acceleration parameters.
  • the motion planner 312 may further compute safety trajectories 320 by smoothing a safety trajectory 320 to ensure movement of the first robotic arm 106 remains within acceleration parameters along inflection regions.
  • the motion planner 312 may further compute safety trajectories 320 by regularizing a safety trajectory 320 for safe-trajectory branching and management.
  • the motion planner 312 may further compute safety trajectories 320 by receiving multiple uncertainty regions for the second robot 118 with the second robotic arm 122 and the second end-effector 132, each uncertainty region associated with a position of the second end-effector 132 at a defined time step T1-TN within a series of defined time steps T1-TN.
  • the motion planner 312 may further compute safety trajectories 320 by determining multiple safety trajectories 320 for the first end-effector 116 based on multiple uncertainty regions for the second robot 118 with the second robotic arm 122 and the second end-effector 132, each uncertainty region associated with a position of the second end-effector 132 at a defined time step within a series of defined time steps.
  • the motion planner 312 may further compute safety trajectories 320 by storing multiple safety trajectories 320 for the first end-effector 116 based on multiple uncertainty regions for the second robot 118 with the second robotic arm 122 and the second end-effector 132 in a database.
  • the motion planner 312 may further compute safety trajectories 320 and store and index the safety trajectories 320 for the first end-effector 116 in the database using defined time steps T1-TN and associated positions of the first end-effector 116 at each defined time step T1-TN.
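  • The joint-acceleration adaptation and smoothing mentioned above can be realized in many ways; one simple illustrative approach (an assumption, not the disclosed method) is to uniformly time-scale a safety trajectory so that its worst finite-difference joint acceleration stays within a limit:

```python
def limit_accelerations(trajectory, dt, max_accel):
    """Uniformly time-scale a joint-space trajectory so that the peak joint
    acceleration (finite-difference estimate) stays within max_accel.

    trajectory: list of joint-position vectors sampled every dt seconds
    Returns (trajectory, new_dt); the waypoints are unchanged, only the time
    spacing is stretched, which is one simple way to respect joint limits.
    """
    if len(trajectory) < 3:
        return trajectory, dt
    peak = 0.0
    for i in range(1, len(trajectory) - 1):
        for q_prev, q, q_next in zip(trajectory[i - 1], trajectory[i], trajectory[i + 1]):
            accel = abs(q_next - 2.0 * q + q_prev) / (dt * dt)
            peak = max(peak, accel)
    if peak <= max_accel:
        return trajectory, dt
    # Acceleration scales with 1/dt^2, so stretching dt by sqrt(peak/max) suffices.
    scale = (peak / max_accel) ** 0.5
    return trajectory, dt * scale


# Example: a sharp corner in joint space originally executed at 10 ms steps.
traj = [[0.0], [0.0], [0.2], [0.2]]
_, new_dt = limit_accelerations(traj, dt=0.01, max_accel=50.0)
print(round(new_dt, 4))   # 0.0632: slowed down so the corner stays under 50 rad/s^2
```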
  • FIG. 5 illustrates an embodiment of a logic flow 500.
  • the logic flow 500 may be representative of some or all of the operations executed by one or more embodiments described herein.
  • the logic flow 500 may include some or all of the operations performed by the motion controller 314 to retrieve and execute a safety trajectory 320 for one or both robotic arms 106, 122.
  • Embodiments are not limited in this context.
  • the logic flow 500 illustrates a series of general operational blocks 502 to 514.
  • logic flow 500 monitors a first controller for a first robot with a first robotic arm, the first controller to control movement of a first end-effector of the first robotic arm between a series of positions along a first defined trajectory, each position associated with a defined time step.
  • logic flow 500 monitors a second controller for a second robot with a second robotic arm, the second controller to control movement of a second end- effector of the second robotic arm between a series of positions along a second defined trajectory, each position associated with a defined time step.
  • logic flow 500 detects a cyberattack on the second controller for the second robot.
  • logic flow 500 identifies a defined time step corresponding to a time of attack detection, the identified defined time step to comprise an attack detection time step.
  • logic flow 500 determines a current position for the first end-effector from among the series of positions along the first defined trajectory based on the attack detection time step.
  • logic flow 500 retrieves a safety trajectory from among multiple safety trajectories stored in a database based on the current position for the first end-effector.
  • the safety trajectory defines a series of positions between the current position for the first end-effector (and any other movable parts of a non-compromised robot) and a safety position for the first end-effector (and any other movable parts of a non-compromised robot) that avoids an uncertainty region of the second end-effector.
  • logic flow 500 sends control information to the first controller to cause the first end-effector to move from the current position to the safety position along the safety trajectory in order to prevent a collision with the second end-effector.
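  • Blocks 508 and 510 amount to mapping the wall-clock detection time onto the last completed defined time step and reading off the planned position of the un-compromised end-effector at that step. A minimal sketch under assumed data layouts:

```python
def attack_detection_time_step(detect_time, schedule):
    """Identify the attack detection time step: the last defined time step whose
    timestamp is at or before the detection time.

    schedule: list of (time_step, timestamp_seconds) pairs in increasing order,
              e.g. built from the defined trajectory (an assumed layout).
    """
    candidates = [step for step, ts in schedule if ts <= detect_time]
    if not candidates:
        raise ValueError("attack detected before the trajectory started")
    return max(candidates)


def current_position(time_step, defined_trajectory):
    """Look up the planned end-effector position for that time step."""
    return defined_trajectory[time_step]


# Example: 20 ms time steps; attack detected 47 ms into the run.
schedule = [(i, i * 0.02) for i in range(1, 11)]
trajectory = {i: (0.1 * i, 0.0, 0.5) for i in range(1, 11)}
step = attack_detection_time_step(0.047, schedule)
print(step, current_position(step, trajectory))   # 2 (0.2, 0.0, 0.5)
```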
  • the motion controller 314 can assist the controllers 204, 206 in controlling movement for the robotic arms 106, 122.
  • the motion controller 314 monitors a first controller 204 for a first robot 102 with a first robotic arm 106, the first controller 204 to control movement of a first end-effector 116 of the first robotic arm 106 between a series of positions along a first defined trajectory 318, each position associated with a defined time step T1-TN.
  • the motion controller 314 monitors a second controller 206 for a second robot 118 with a second robotic arm 122, the second controller 206 to control movement of a second end-effector 132 of the second robotic arm 122 between a series of positions along a second defined trajectory 318, each position associated with a defined time step T1-TN.
  • the motion controller 314 detects a cyberattack on the second controller 206 for the second robot 118.
  • the motion controller 314 identifies a defined time step T1-TN corresponding to a time of attack detection, the identified defined time step T1-TN to comprise an attack detection time step.
  • the motion controller 314 determines a current position for the first end-effector 116 from among the series of positions along the first defined trajectory 318 based on the attack detection time step.
  • the motion controller 314 retrieves a safety trajectory 320 from among multiple safety trajectories 320 stored in a database based on the current position for the first end-effector 116, the safety trajectory 320 to define a series of positions between the current position for the first end-effector 116 and a safety position for the first end-effector 116 that avoids an uncertainty region of the second end-effector 132.
  • the motion controller 314 sends control information to the first controller 204 to cause the first end-effector 116 to move from the current position to the safety position along the safety trajectory 320 in order to prevent a collision with the second end-effector 132.
  • the motion controller 314 can implement additional operations associated with controlling movement for the robotic arms 106, 122.
  • the motion controller 314 can assist the controllers 204, 206 in controlling movement for the robotic arms 106, 122, where the first robot 102 and the second robot 118 are collaborative robots arranged to move in a shared physical space 138 in order to execute tasks to accomplish shared goals on an object 136.
  • the motion controller 314 can assist the controllers 204, 206 in controlling movement for the robotic arms 106, 122 where the first robotic arm is a mechanical arm with one or more links to form a kinematic chain, each link connected by a joint allowing rotational motion or translational displacement, and the first end-effector 116 is a terminus for the kinematic chain.
  • the motion controller 314 can assist the controllers 204, 206 in controlling movement for the robotic arms 106, 122 where the second robotic arm 122 is a mechanical arm with one or more links to form a kinematic chain, each link connected by a joint allowing rotational motion or translational displacement, and the second end-effector 132 is a terminus for the kinematic chain.
  • the motion controller 314 can assist the controllers 204, 206 in controlling movement for the robotic arms 106, 122 where the first defined trajectory 318-1 for the first end-effector 116 and the second defined trajectory 318-2 for the second end-effector 132 are arranged to avoid collisions between the first end-effector 116 and the second end-effector 132 when moving within a shared physical space 138.
  • the motion controller 314 can assist the controllers 204, 206 in controlling movement for the robotic arms 106, 122 by detecting the cyberattack on the second controller 206 for the second robot 118 during an uncertainty time period, the uncertainty time period to comprise a time interval between a time of attack and a time of attack neutralization.
  • the motion controller 314 can assist the controllers 204, 206 in controlling movement for the robotic arms 106, 122 by comparing the time of attack detection to each time step associated with each position in the first defined trajectory 318-1 until a match is found to identify the attack detection time step.
  • the motion controller 314 can assist the controllers 204, 206 in controlling movement for the robotic arms 106, 122 by retrieving the current position for the first end- effector 116 from among the series of positions along the first defined trajectory 318-1 based on the attack detection time step from a database.
  • the motion controller 314 can assist the controllers 204, 206 in controlling movement for the robotic arms 106, 122 where the uncertainty region includes a three- dimensional shape that defines boundaries for a reach envelope of the second end-effector 132 after the attack detection time step.
  • the motion controller 314 can assist the controllers 204, 206 in controlling movement for the robotic arms 106, 122 where an uncertainty region is generated for each position associated with a time step for the second end-effector 132 of the second robotic arm 122, and stored in the database before the time of attack detection.
  • the motion controller 314 can assist the controllers 204, 206 in controlling movement for the robotic arms 106, 122 by causing execution of the control information by the first controller 204 to cause the first end-effector 116 to move from the current position to the safety position along the safety trajectory 320 in order to prevent a collision with the second end-effector 132.
  • the motion controller 314 can cease assisting the controllers 204, 206 in controlling movement for the robotic arms 106, 122 upon receiving notification that the detected cyberattack on the second controller 206 for the second robot 118 has been neutralized.
  • FIG. 6 depicts an example of an operating environment 600 representing normal operations for the robotic system 100.
  • the robots 102, 118 of the robotic system 100 each operate in an independent and cooperative manner.
  • the robotic arm 106 of the robot 102 follows a defined trajectory 318-1 while the robotic arm 122 of the robot 118 follows a defined trajectory 318-2.
  • the end-effector 116 of the robotic arm 106 is at a different position along the defined trajectory 318-1.
  • the end-effector 132 of the robotic arm 122 is at a different position along the defined trajectory 318-2.
  • the defined trajectories 318-1, 318-2 may intersect within the shared physical space 138 as each robotic arm 106, 122 works from a start configuration S towards a goal configuration G.
  • the motion planner 312 carefully plans and generates the defined trajectories 318-1, 318-2 to avoid collisions when the robotic arms 106, 122 are moving within the shared physical space 138.
  • each configuration (or pose) of the robotic arms 106, 122 is associated with a defined time step T1-TN.
  • the robotic arm 106 is in a first configuration (or pose) at a first defined time step t1, a second configuration at time step t2, and so forth.
  • the robotic arm 122 is in a first configuration (or pose) at a first defined time step t1, a second configuration at time step t2, and so forth.
  • FIG. 7A depicts an operating environment 700a representing operations of the robotic system 100 at a start of a security attack. Assume at an attack time tA, the attacker 208 successfully compromises and takes control of the controller 206 for the robot 118. Further assume that the attack time tA corresponds with a defined time step t1-tN. In the operating environment 700a, the robotic arms 106, 122 are in respective configurations to avoid collisions with each other.
  • FIG. 7B depicts an operating environment 700b representing operations of the robotic system 100 when a security attack is detected.
  • the IDS 202 detects the attack on the controller 206 for the robot 118 at a detect time tD.
  • the IDS 202 issues an alarm 210 to the robotic safety system 214 and/or the controller 204.
  • the detect time tD corresponds with a defined time step t1-tN.
  • the robotic arms 106, 122 are in respective configurations that could potentially cause collisions with each other.
  • FIG. 7C depicts an operating environment 700c representing operations of the robotic system 100 when a security attack is neutralized.
  • the IDS 202 neutralizes the attack on the controller 206 for the robot 118 at a neutralization time tN.
  • the neutralization time tN corresponds with a defined time step t1-tN (not necessarily the last defined time step in the trajectory).
  • the robotic arms 106, 122 have collided with each other even though the security attack has been neutralized. Consequently, the robotic safety system 214 needs to operate to secure the robot 102 and corresponding robotic arm 106 during at least the time period between the attack time tA and the neutralization time tN.
  • FIG. 8 depicts an operating environment 800.
  • the operating environment 800 illustrates an uncertainty region 802 for the compromised robotic arm 122 of the robot 118 during an uncertainty time period between tA and tN.
  • an attack time tA for an attack on the robot 118 occurs at a defined time step t5.
  • the robotic safety system 214 is uncertain as to the exact position of the end-effector 132 for the compromised robotic arm 122.
  • the robotic safety system 214 is aware of a last known position, configuration or pose for the robotic arm 122 at the defined time step t5 since it has access to the defined trajectory 318-2 for the robotic arm 122 stored in a database.
  • the motion planner 312 can use configuration information 316 for the compromised robotic arm 122 at defined time step t5 to calculate an uncertainty region 802 for the robotic arm 122 after the defined time step t5.
  • the uncertainty region 802 comprises a three-dimensional (3D) shape that defines estimated boundaries for a reach envelope of the second end-effector 132 after the given defined time step t5.
  • the motion planner 312 previously computes the uncertainty region 802 for the compromised robotic arm 122 using configuration information 316 for the compromised robotic arm 122 at defined time step t5, and then computes a safety trajectory 320-1 for the un-compromised robotic arm 106 of the robot 102 based on the uncertainty region 802, which is subsequently stored in a database.
  • the motion controller 314, either directly or indirectly, causes the un-compromised robotic arm 106 to deviate from a current position along the defined trajectory 318-1 to a safety position 804 along the safety trajectory 320-1, which is pre-computed to avoid the uncertainty region 802 of the compromised robotic arm 122 of the compromised robot 118. Since all safety trajectories 320 are computed during the offline phase and stored in a database, the motion controller 314 can quickly retrieve and execute the safety trajectory 320-1 during real-time operation of the robots 102, 118.
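  • For illustration only, the following sketch shows one simple, conservative way an uncertainty region could be approximated as a sphere centered on the last known end-effector position, with a radius that grows with a worst-case end-effector speed and the time elapsed since the last trusted time step; the embodiments do not prescribe this particular construction, and the function names and numeric values below are assumptions.

```python
import math
from typing import Tuple

def uncertainty_sphere(last_known_position: Tuple[float, float, float],
                       max_end_effector_speed: float,
                       elapsed_time: float) -> Tuple[Tuple[float, float, float], float]:
    """Conservative reach envelope: a sphere around the last known position.

    last_known_position: end-effector position at the last trusted time step (meters)
    max_end_effector_speed: worst-case Cartesian speed of the compromised arm (m/s)
    elapsed_time: seconds elapsed since the last trusted time step
    """
    radius = max_end_effector_speed * elapsed_time
    return last_known_position, radius

def point_inside_uncertainty(point, center, radius) -> bool:
    # True if a candidate waypoint for the un-compromised arm would enter the region.
    return math.dist(point, center) <= radius

center, r = uncertainty_sphere((0.6, 0.1, 0.4), max_end_effector_speed=1.0, elapsed_time=0.25)
print(point_inside_uncertainty((0.7, 0.1, 0.4), center, r))  # True: too close to the region
```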
  • FIG. 9 depicts an operating environment 900 for the robotic system 100.
  • the motion planner 312 for the robotic safety system 214 computes a series of safety trajectories 320-1, 320-2, 320-3, 320-4 and 320-5 at corresponding defined time steps t1, t2, t3, t4 and t5, respectively, for the robotic arm 106 of the robot 102.
  • the safety trajectories 320-1 to 320-5 are stored in a database accessible by the robotic safety system 214.
  • the safety trajectories 320-1 to 320-5 are indexed in the database for quick retrieval by the motion controller 314 of the robotic safety system 214.
  • the safety trajectories 320-1 to 320-5 may be indexed in the database by a position, pose or configuration for the robotic arm 106 at a corresponding defined time step t1-t5, respectively.
  • Each of the safety trajectories 320-1 to 320-5 represents an alternative trajectory for the robotic arm 106 from the normal defined trajectory 318-1. More particularly, each of the safety trajectories 320-1 to 320-5 represents an alternative trajectory that starts at a current position for the end-effector 116 of the robotic arm 106 at a defined time step t1-t5 and ends at a safety position 804 for the end-effector 116 of the robotic arm 106.
  • the safety trajectories 320-1 and 320-2 both easily bypass the uncertainty region 802.
  • the safety trajectories 320-3, 320-4 are closer to the uncertainty region 802, and after deviation from the defined trajectory 318-1 at defined time steps t3, t4, respectively, resume following the defined trajectory 318-1 at defined time steps t8, t7, respectively.
  • the safety trajectory 320-5 at defined time step t5 passes more closely to the uncertainty region 802; had the robotic arm 106 continued along the defined trajectory 318-1, it would have traveled directly through the uncertainty region 802 at defined time step t6, thereby leading to potential collisions with the robotic arm 122.
  • FIG. 10 depicts an operating environment 1000 for the robotic system 100.
  • the robotic arm 106 of the robot 102 follows a defined trajectory 318-1 during normal operations.
  • In preparation for a security attack, the motion planner 312 of the robotic safety system 214 generates safety trajectories 320 for the robotic arm 106.
  • the motion planner 312 considers uncertainty regions 802 for the robotic arm 122 of the robot 118 for each defined time step T1-TN.
  • the resulting safety trajectories 320 are a discrete collection of tuples of joint positions at given times, or time deltas for their instantiation.
  • the safety trajectories 320 fulfill position, velocity and acceleration limits, are collision-free with respect to a corresponding uncertainty region 802, denoted as occupied space C in FIG. 10, and are free of self-collisions. The regular distribution of these samples is ensured with respect to the task-space.
  • the motion planner 312 uses kinematics to calculate a motion for the robotic arms 106, 122. Geometric transformations, also called rigid transformations, are used to describe movement of components of the robotic arms 106, 122, thereby simplifying the derivation of equations of motion.
  • the motion planner 312 implements a motion planning algorithm which takes as input a set of configuration information 316 for the robotic arm 106 that includes a set of joint limits, an initial kinematic frame, a final kinematic frame, direct kinematics, inverse kinematics, and an occupied space C.
  • the robot 102 has a set of joints J0-J5.
  • the joints J0-J5 may have associated joint limits represented as minimum and maximum values for [J(t), J'(t), J''(t)], where J(t) represents a joint position state, which is a vector of angular values from the encoders of all joints of the robot 102 at time t whose dimension is the degrees of freedom of the robot 102; J'(t) represents a joint velocity state, which is the angular derivative (with respect to time) computed from the encoders of all joints of the robot 102 at time t; and J''(t) represents a joint acceleration state, which is the angular second derivative (with respect to time) computed from the encoders of all joints of the robot 102 at time t.
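  • For illustration only, the following sketch shows how the joint position, velocity and acceleration states described above might be checked against their limits; the JointLimits structure, the six-joint dimension and the numeric bounds are hypothetical assumptions rather than parameters of the described embodiments.

```python
from dataclasses import dataclass
from typing import Sequence

@dataclass
class JointLimits:
    # Per-joint bounds on position J(t), velocity J'(t) and acceleration J''(t).
    position_min: Sequence[float]
    position_max: Sequence[float]
    velocity_max: Sequence[float]      # absolute value bound, rad/s
    acceleration_max: Sequence[float]  # absolute value bound, rad/s^2

def within_limits(joint_pos, joint_vel, joint_acc, limits: JointLimits) -> bool:
    """Check a single joint state sample (J, J', J'') against its limits."""
    return all(
        lo <= q <= hi and abs(qd) <= vmax and abs(qdd) <= amax
        for q, qd, qdd, lo, hi, vmax, amax in zip(
            joint_pos, joint_vel, joint_acc,
            limits.position_min, limits.position_max,
            limits.velocity_max, limits.acceleration_max,
        )
    )

limits = JointLimits(
    position_min=[-3.14] * 6, position_max=[3.14] * 6,
    velocity_max=[2.0] * 6, acceleration_max=[5.0] * 6,
)
print(within_limits([0.1] * 6, [0.5] * 6, [1.0] * 6, limits))  # True
```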
  • the robot 102 also has a robot base frame 1002 and a robot end-effector frame 1004.
  • the robot base frame 1002 is a kinematic frame in SE(3) defined as a robot base or first element of a kinematic chain.
  • the robot end-effector frame 1004 is a kinematic frame in SE(3) defined as a last element of the kinematic chain.
  • a frame is a time-variant rigid transformation in SE(3) representing a mapping from the robot base frame 1002 (frame B) to the robot end-effector frame 1004 (frame E) at time tb.
  • the uncertainty region 802, denoted as an occupied space C, is defined as an occupied space represented in three dimensions (3D), where C ⊂ R³, which is a subspace where the robotic arm 106 should not be located at any point along a safety trajectory 320.
  • the direct kinematics is defined as a forward-kinematic function that transforms a joint position state to a kinematic frame.
  • the inverse kinematics is defined as an inverse-kinematic function that transforms a kinematic frame to its generating joint position state J, or returns empty if the frame is not reachable.
  • the motion planner 312 can use a rapidly-exploring random trees (RRT) algorithm to generate a safety trajectory 320.
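  • The RRT formulation itself is not reproduced here; purely as an illustrative sketch, the following shows the general RRT idea in a two-joint configuration space, with a circular region standing in for the occupied space C. The step size, iteration budget and helper names are assumptions rather than parameters of the described embodiments.

```python
import math
import random

def rrt_plan(start, goal, collides, joint_min=-3.14, joint_max=3.14,
             step=0.1, goal_tolerance=0.15, max_iterations=5000, seed=0):
    """Minimal RRT in a 2-joint configuration space.

    collides(q) -> True if configuration q lies in the occupied space C.
    Returns a list of configurations from start to goal, or None if not found.
    """
    rng = random.Random(seed)
    nodes = [start]
    parents = {0: None}
    for _ in range(max_iterations):
        sample = (rng.uniform(joint_min, joint_max), rng.uniform(joint_min, joint_max))
        # Extend the nearest tree node one step toward the random sample.
        nearest = min(range(len(nodes)), key=lambda i: math.dist(nodes[i], sample))
        nq = nodes[nearest]
        d = math.dist(nq, sample)
        new = tuple(a + step * (b - a) / d for a, b in zip(nq, sample)) if d > step else sample
        if collides(new):
            continue
        nodes.append(new)
        parents[len(nodes) - 1] = nearest
        if math.dist(new, goal) < goal_tolerance:
            # Walk back through the parents to recover the path.
            path, i = [], len(nodes) - 1
            while i is not None:
                path.append(nodes[i])
                i = parents[i]
            return list(reversed(path))
    return None

occupied = lambda q: math.dist(q, (0.0, 0.0)) < 0.5   # stand-in for occupied space C
path = rrt_plan(start=(-1.0, -1.0), goal=(1.0, 1.0), collides=occupied)
print(len(path) if path else "no trajectory found")
```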
  • the motion planner 312 can also implement a trajectory adapting, smoothing and regularization algorithm A to a safety trajectory 320.
  • the trajectory adapting, smoothing and regularization algorithm A may ensure joint velocities remain within joint limits, smooth a safety trajectory 320 to avoid accelerations higher than the limits along inflection regions, and regularize discrete samples at parametric task-space distances (translation or rotation), allowing homogeneous key-point distributions for branching, as explained further in FIG. 11.
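  • For illustration only, the following sketch shows one way the regularization step could resample a trajectory so that consecutive key points are separated by at most a parametric task-space distance dT (translation only; a full implementation would also bound rotation by dR). The function name and granularity value are hypothetical.

```python
import math

def regularize_translation(points, d_t=0.01):
    """Resample a polyline of end-effector positions so consecutive key points
    are at most d_t meters apart (translation-only regularization sketch)."""
    out = [points[0]]
    for a, b in zip(points, points[1:]):
        seg = math.dist(a, b)
        steps = max(1, math.ceil(seg / d_t))
        for k in range(1, steps + 1):
            # Linear interpolation between the original coarse samples.
            out.append(tuple(ai + (bi - ai) * k / steps for ai, bi in zip(a, b)))
    return out

coarse = [(0.0, 0.0, 0.0), (0.0, 0.05, 0.0), (0.0, 0.05, 0.03)]
print(len(regularize_translation(coarse, d_t=0.01)))  # homogeneous key-point spacing
```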
  • FIG. 11 depicts a logic flow 1100.
  • the logic flow 1100 may be representative of some or all of the operations executed by one or more embodiments described herein.
  • the logic flow 1100 may include some or all of the operations performed by the motion planner 312 to generate or compute safety trajectories 320 during the offline phase 902. Embodiments are not limited in this context.
  • the logic flow 1100 receives a set of configuration information 316 representing full robot and application inputs.
  • the motion planner 312 implements a motion planning algorithm which takes as input a set of configuration information 316 for the robotic arm 106 that includes a set of joint limits, an initial kinematic frame, a final kinematic frame, direct kinematics, inverse kinematics, and an occupied space C.
  • the logic flow 1100 determines whether a safety trajectory 320 exists or is possible for a robotic arm at a given defined time step and uncertainty region. If no safety trajectory 320 exists at block 1104, the motion planner 312 outputs an empty result (no trajectory found) and moves to the next defined time step. If a safety trajectory 320 exists at block 1104, then control moves to block 1106.
  • the logic flow 1100 generates a safety trajectory 320, which is a collision-free trajectory with n samples.
  • the logic flow 1100 performs trajectory adapting, smoothing and regularization.
  • the motion planner 312 can implement a trajectory adapting, smoothing and regularization algorithm A to a safety trajectory 320.
  • For smoothing, if the smoothed trajectory is not within limits along inflection regions, then control is passed back to block 1104 for replanning of the safety trajectory 320.
  • the motion planner 312 receives as input a set of parameters for regularization granularity [dr, dR], where dr represents a translation in millimeters (mm) and dR represents a rotation in radians.
  • the logic flow 1100 performs trajectory post-processing, producing a regularized trajectory with m samples.
  • the safety trajectory 320 is output by the motion planner 312 and stored in the database for use by the motion controller 314 during an online phase for the robotic system 100.
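  • For illustration only, the following sketch shows how the offline phase of logic flow 1100 could be orchestrated: for each defined time step, compute the uncertainty region, plan a safety trajectory, post-process it, and store it in the database. The callables passed in are placeholders, not the described embodiments' actual planners.

```python
def offline_phase(time_steps, configuration_info, compute_uncertainty_region,
                  plan_safety_trajectory, adapt_smooth_regularize, database):
    """Hypothetical orchestration of the offline phase (blocks 1102-1110).

    For every defined time step, compute the compromised arm's uncertainty
    region, plan a collision-free safety trajectory for the other arm,
    post-process it, and store it in the database keyed by time step.
    """
    for t in time_steps:
        region = compute_uncertainty_region(configuration_info, t)
        trajectory = plan_safety_trajectory(configuration_info, t, region)
        if trajectory is None:
            # No safety trajectory exists for this time step; move on.
            continue
        database[t] = adapt_smooth_regularize(trajectory)
    return database

# Toy stand-ins so the sketch runs end to end.
db = offline_phase(
    time_steps=range(1, 6),
    configuration_info={},
    compute_uncertainty_region=lambda cfg, t: ("sphere", t * 0.1),
    plan_safety_trajectory=lambda cfg, t, region: [(t, 0.0), (t, 1.0)],
    adapt_smooth_regularize=lambda traj: traj,
    database={},
)
print(sorted(db))  # [1, 2, 3, 4, 5]
```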
  • FIG. 12 depicts an operating environment 1200.
  • the operating environment 1200 illustrates an online phase 1202 for the robotic system 100.
  • the robotic arm 106 traverses the defined trajectory 318-1 during normal operations.
  • Sometime after defined time step t3, the attacker 208 initiates a cybersecurity attack on the controller 206 of the robot 118, thereby compromising movement of the robotic arm 122.
  • the IDS 202 detects the cybersecurity attack, and sends an alarm 210 to the robotic safety system 214.
  • the motion controller 314 retrieves a safety trajectory 320-1 for the robotic arm 106 indexed at defined time step t4 in the database.
  • the safety trajectory 320-1 was previously generated by the motion planner 312 during the offline phase 902 based on a last known position, pose or configuration of the robotic arm 122 at defined time step t3 and associated uncertainty region 802.
  • the motion controller 314 then causes, indirectly through the controller 204 of the robotic arm 106 or through direct control of the robotic arm 106, the robotic arm 106 to deviate from the defined trajectory 318-1 at the defined time step t4 and instead follow the safety trajectory 320-1 to safely bypass the uncertainty region 802.
  • the robotic arm 106 traverses the safety trajectory 320-1 and arrives at a position, pose or configuration for the robotic arm 106 at the defined time step t7, where it resumes movement along the defined trajectory 318-1 to a safety position 804.
  • FIG. 13 depicts a logic flow 1300.
  • the logic flow 1300 may be representative of some or all of the operations executed by one or more embodiments described herein.
  • the logic flow 1300 may include some or all of the operations performed by the motion controller 314 to retrieve and execute safety trajectories 320 during the online phase 1202. Embodiments are not limited in this context.
  • the logic flow 1300 receives as input a frame and joint information for the robotic arm 106 along the defined trajectory 318-1 at a detect time td for an attack.
  • the logic flow 1300 outputs (jd, td), which represents the instantaneous joint position state at the detect time td.
  • the logic flow 1300 quantizes the instantaneous joint position state, where the quantization represents a rounding to an integer of m bits.
  • the logic flow 1300 then calculates a query index from the quantized joint position state. The result is used as a query index to a database 1308.
  • the database 1308 stores safety trajectories 320 indexed by the query index.
  • the logic flow 1300 retrieves the indexed safety trajectory 320-1 and outputs it to block 1306.
  • the logic flow 1300 receives the indexed safety trajectory 320-1, and outputs joint information and a corresponding frame for the robotic arm 106.
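  • For illustration only, the following sketch shows one way the retrieval described in logic flow 1300 could work: the instantaneous joint position state at detect time is rounded to an m-bit integer key, and that key is used to look up a pre-computed safety trajectory in the database. The quantization scheme, bit width and joint bounds are assumptions rather than the described embodiments' exact rounding function.

```python
def quantize(joint_positions, m_bits=8, joint_min=-3.14, joint_max=3.14):
    """Round each joint angle to an m-bit integer so that nearby poses map
    to the same database key (a hypothetical stand-in for m-bit rounding)."""
    levels = (1 << m_bits) - 1
    span = joint_max - joint_min
    return tuple(round((q - joint_min) / span * levels) for q in joint_positions)

# Offline: index each pre-computed safety trajectory by its quantized start pose.
safety_db = {
    quantize((0.10, -1.50, 1.20)): "safety trajectory 320-1",
    quantize((0.40, -1.20, 1.00)): "safety trajectory 320-2",
}

# Online: quantize the pose observed at detect time and query the database.
observed_at_detect_time = (0.1005, -1.499, 1.201)   # joint part of (jd, td)
print(safety_db.get(quantize(observed_at_detect_time), "no trajectory found"))
```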
  • FIG. 14 depicts an operating environment 1400.
  • the operating environment 1400 illustrates an example of various frames of the end-effector 116 of the robotic arm 106 as it follows the defined trajectory 318-1. It also illustrates a set of current frames at times i, j, k, l, m representing start frames for safety trajectories 320 for the end-effector 116 of the robotic arm 106 to avoid the uncertainty region 802, denoted as occupied space C, for the end-effector 132 of the robotic arm 122 at various defined time steps.
  • Joint positions for the end-effector 116 of the robotic arm 106 along the defined trajectory 318-1 are denoted as (jd, td), which represents an instantaneous joint position state at time td.
  • FIG. 15 illustrates an embodiment of a logic flow 1500.
  • the logic flow 1500 may be representative of some or all of the operations executed by one or more embodiments described herein.
  • the logic flow 1500 may include some or all of the operations performed by the motion planner 312 to generate or compute safety trajectories 320.
  • the logic flow 1500 presents a more general case where the motion planner 312 computes safety trajectories 320 for the entire configuration or pose of the robotic arms 106, 122, including the end-effectors, to avoid an uncertainty region 802. Embodiments are not limited in this context.
  • the logic flow 1500 illustrates a series of general operational blocks 1502 to 1508.
  • logic flow 1500 receives configuration information for a first robot with a first robotic arm at a defined time step.
  • logic flow 1500 receives an uncertainty region for a second robot with a second robotic arm, the uncertainty region to comprise a three-dimensional shape that defines boundaries for a reach envelope of the second robotic arm at the defined time step.
  • logic flow 1500 determines a safety trajectory for the first robotic arm based on the uncertainty region, the safety trajectory to define a series of positions between a current position for the first robotic arm and a safety position for the first robotic arm that avoids the uncertainty region of the second robotic arm at the defined time step.
  • logic flow 1500 stores the safety trajectory in a database.
  • the motion planner 312 receives configuration information 316 for a first robot 102 with a first robotic arm 106 at a defined time step t1-tN.
  • the motion planner 312 receives an uncertainty region 802 for a second robot 118 with a second robotic arm 122.
  • the uncertainty region 802 may comprise a three-dimensional shape that defines boundaries for a reach envelope of the second robotic arm at the defined time step t1-tN.
  • the motion planner 312 determines a safety trajectory 320 for the first robotic arm 106 based on the uncertainty region 802, the safety trajectory 320 to define a series of positions between a current position for the first robotic arm 106 and a safety position 804 for the first robotic arm 106 that avoids the uncertainty region 802 of the second robotic arm 122 at the defined time step t1-tN.
  • the motion planner 312 stores the safety trajectory 320 in a database.
  • FIG. 16 illustrates an embodiment of a logic flow 1600.
  • the logic flow 1600 may be representative of some or all of the operations executed by one or more embodiments described herein.
  • the logic flow 1600 may include some or all of the operations performed by the motion controller 314 to retrieve and execute a safety trajectory 320 for one or both robotic arms 106, 122.
  • the logic flow 1600 presents a more general case where the motion controller 314 retrieves and executes safety trajectories 320 for the entire configuration or pose of the robotic arms 106, 122, including the end-effectors, to avoid an uncertainty region 802. Embodiments are not limited in this context.
  • the logic flow 1600 illustrates a series of general operational blocks 1602 to 1614.
  • logic flow 1600 monitors a first controller for a first robot with a first robotic arm, the first controller to control movement of the first robotic arm between a series of positions along a first defined trajectory, each position associated with a defined time step.
  • logic flow 1600 monitors a second controller for a second robot with a second robotic arm, the second controller to control movement of the second robotic arm between a series of positions along a second defined trajectory, each position associated with a defined time step.
  • logic flow 1600 detects a cyberattack on the second controller for the second robot.
  • logic flow 1600 identifies a defined time step corresponding to a time of attack detection, the identified defined time step to comprise an attack detection time step.
  • logic flow 1600 determines a current position for the first robotic arm from among the series of positions along the first defined trajectory based on the attack detection time step.
  • logic flow 1600 retrieves a safety trajectory from among multiple safety trajectories stored in a database based on the current position for the first robotic arm, the safety trajectory to define a series of positions between the current position for the first robotic arm and a safety position for the first robotic arm that avoids an uncertainty region of the second robotic arm.
  • logic flow 1600 sends control information to the first controller to cause the first robotic arm to move from the current position to the safety position along the safety trajectory in order to prevent a collision with the second robotic arm.
  • motion controller 314 monitors a first controller 204 for a first robot 102 with a first robotic arm 106, the first controller 204 to control movement of the first robotic arm 106 between a series of positions along a first defined trajectory 318, each position associated with a defined time step t1-tN.
  • motion controller 314 monitors a second controller 206 for a second robot 118 with a second robotic arm 122, the second controller 206 to control movement of the second robotic arm 122 between a series of positions along a second defined trajectory 318, each position associated with a defined time step t1-tN.
  • motion controller 314 detects a cyberattack on the second controller 206 for the second robot 118.
  • the motion controller 314 identifies a defined time step t1-tN corresponding to a time of attack detection, the identified defined time step t1-tN to comprise an attack detection time step.
  • the motion controller 314 determines a current position for the first robotic arm 106 from among the series of positions along the first defined trajectory 318 based on the attack detection time step.
  • the motion controller 314 retrieves a safety trajectory 320 from among multiple safety trajectories 320 stored in a database based on the current position for the first robotic arm 106, the safety trajectory 320 to define a series of positions between the current position for the first robotic arm 106 and a safety position 804 for the first robotic arm 106 that avoids an uncertainty region of the second robotic arm 122.
  • the motion controller 314 sends control information to the first controller 204 to cause the first robotic arm 106 to move from the current position to the safety position 804 along the safety trajectory 320 in order to prevent a collision with the second robotic arm 122.
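  • For illustration only, the following sketch ties the online steps of logic flow 1600 together: on an alarm, identify the attack detection time step, look up the current position and its pre-computed safety trajectory, and send control information to the first controller. The data structures and callback are hypothetical stand-ins rather than the described embodiments' interfaces.

```python
def on_alarm(detect_time, defined_trajectory, safety_db, send_control):
    """Hypothetical online handler mirroring blocks 1602-1614.

    defined_trajectory: {time_step: current position of the un-compromised arm}
    safety_db: {time_step: pre-computed safety trajectory for that position}
    send_control: callable that forwards control information to controller 204
    """
    # Identify the attack detection time step (here, simply the nearest
    # defined time step at or before the detect time).
    detection_step = max(t for t in defined_trajectory if t <= detect_time)
    current_position = defined_trajectory[detection_step]
    safety_trajectory = safety_db.get(detection_step)
    if safety_trajectory is None:
        return None  # no pre-computed escape for this pose; other mitigations apply
    send_control({"from": current_position, "follow": safety_trajectory})
    return safety_trajectory

sent = []
result = on_alarm(
    detect_time=4,
    defined_trajectory={1: "p1", 2: "p2", 3: "p3", 4: "p4"},
    safety_db={4: ["p4", "safe-a", "safe-b", "safety position 804"]},
    send_control=sent.append,
)
print(result, sent)
```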
  • FIG. 17 illustrates computer readable storage medium 1700.
  • Computer readable storage medium 1700 may comprise any non-transitory computer-readable storage medium or machine-readable storage medium, such as an optical, magnetic or semiconductor storage medium. In various embodiments, computer readable storage medium 1700 may comprise an article of manufacture.
  • computer readable storage medium 1700 may store computer executable instructions 1702 with which circuitry (e.g., controllers 204, 206, processing circuitry 302, network interface 322, or the like) can execute.
  • computer executable instructions 1702 can include instructions to implement operations described with respect to logic flows 400, 500, 1100 and 1300.
  • Examples of computer readable storage medium 1700 or machine-readable storage medium may include any tangible media capable of storing electronic data, including volatile memory or non-volatile memory, removable or non-removable memory, erasable or non-erasable memory, writeable or re-writeable memory, and so forth.
  • Examples of computer executable instructions 1702 may include any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, object-oriented code, visual code, and the like.
  • FIG. 18 illustrates one example of a system architecture and data processing device that may be used to implement one or more illustrative aspects described herein in a standalone and/or networked environment.
  • Various network nodes such as the data server 1810, web server 1806, computer 1804, and laptop 1802 may be interconnected via a wide area network 1808 (WAN), such as the internet.
  • Other networks may also or alternatively be used, including private intranets, corporate networks, LANs, metropolitan area networks (MANs), wireless networks, personal area networks (PANs), and the like.
  • Network 1808 is for illustration purposes and may be replaced with fewer or additional computer networks.
  • a local area network may have one or more of any known LAN topology and may use one or more of a variety of different protocols, such as ethernet.
  • Devices data server 1810, web server 1806, computer 1804, laptop 1802 and other devices may be connected to one or more of the networks via twisted pair wires, coaxial cable, fiber optics, radio waves or other communication media.
  • Computer software, hardware, and networks may be utilized in a variety of different system environments, including standalone, networked, remote-access (aka, remote desktop), virtualized, and/or cloud-based environments, among others.
  • network refers not only to systems in which remote storage devices are coupled together via one or more communication paths, but also to stand-alone devices that may be coupled, from time to time, to such systems that have storage capability. Consequently, the term "network" includes not only a "physical network" but also a "content network," which is comprised of the data, attributable to a single entity, which resides across all physical networks.
  • the components may include data server 1810, web server 1806, and client computer 1804, laptop 1802.
  • Data server 1810 provides overall access, control and administration of databases and control software for performing one or more illustrative aspects described herein.
  • Data server 1810 may be connected to web server 1806 through which users interact with and obtain data as requested.
  • data server 1810 may act as a web server itself and be directly connected to the internet.
  • Data server 1810 may be connected to web server 1806 through the network 1808 (e.g., the internet), via direct or indirect connection, or via some other network.
  • Users may interact with the data server 1810 using remote computer 1804, laptop 1802, e.g., using a web browser to connect to the data server 1810 via one or more externally exposed web sites hosted by web server 1806.
  • Client computer 1804, laptop 1802 may be used in concert with data server 1810 to access data stored therein, or may be used for other purposes.
  • a user may access web server 1806 using an internet browser, as is known in the art, or by executing a software application that communicates with web server 1806 and/or data server 1810 over a computer network (such as the internet).
  • Servers and applications may be combined on the same physical machines, and retain separate virtual or logical addresses, or may reside on separate physical machines.
  • FIG. 18 illustrates just one example of a network architecture that may be used, and those of skill in the art will appreciate that the specific network architecture and data processing devices used may vary, and are secondary to the functionality that they provide, as further described herein. For example, services provided by web server 1806 and data server 1810 may be combined on a single server.
  • Each component data server 1810, web server 1806, computer 1804, laptop 1802 may be any type of known computer, server, or data processing device.
  • Data server 1810 e.g., may include a processor 1812 controlling overall operation of the data server 1810.
  • Data server 1810 may further include RAM 1816, ROM 1818, network interface 1814, input/output interfaces 1820 (e.g., keyboard, mouse, display, printer, etc.), and memory 1822.
  • Input/output interfaces 1820 may include a variety of interface units and drives for reading, writing, displaying, and/or printing data or files.
  • Memory 1822 may further store operating system software 1824 for controlling overall operation of the data server 1810, control logic 1826 for instructing data server 1810 to perform aspects described herein, and other application software 1828 providing secondary, support, and/or other functionality which may or may not be used in conjunction with aspects described herein.
  • the control logic may also be referred to herein as the data server software control logic 1826.
  • Functionality of the data server software may refer to operations or decisions made automatically based on rules coded into the control logic, made manually by a user providing input into the system, and/or a combination of automatic processing based on user input (e.g., queries, data updates, etc.).
  • Memory 1822 may also store data used in performance of one or more aspects described herein, including a first database 1830 and a second database 1832.
  • the databases 1830, 1832 can store information such as configuration information 316, defined trajectories 318, and/or safety trajectories 320.
  • the first database may include the second database (e.g., as a separate table, report, etc.). That is, the information can be stored in a single database, or separated into different logical, virtual, or physical databases, depending on system design.
  • Web server 1806, computer 1804, laptop 1802 may have similar or different architecture as described with respect to data server 1810.
  • data server 1810 may be spread across multiple data processing devices, for example, to distribute processing load across multiple computers, to segregate transactions based on geographic location, user access level, quality of service (QoS), etc.
  • One or more aspects may be embodied in computer-usable or readable data and/or computer-executable instructions, such as in one or more program modules, executed by one or more computers or other devices as described herein.
  • program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types when executed by a processor in a computer or other device.
  • the modules may be written in a source code programming language that is subsequently compiled for execution, or may be written in a scripting language such as (but not limited to) HTML or XML.
  • the computer executable instructions may be stored on a computer readable medium such as a nonvolatile storage device.
  • Any suitable computer readable storage media may be utilized, including hard disks, CD-ROMs, optical storage devices, magnetic storage devices, and/or any combination thereof.
  • While various transmission (non-storage) media representing data or events as described herein may transfer data between a source and a destination in the form of electromagnetic waves traveling through signal-conducting media such as metal wires, optical fibers, and/or wireless transmission media (e.g., air and/or space), various aspects described herein may be embodied as a method, a data processing system, or a computer program product.
  • various functionalities may be embodied in whole or in part in software, firmware and/or hardware or hardware equivalents such as integrated circuits, field programmable gate arrays (FPGA), and the like.
  • Particular data structures may be used to more effectively implement one or more aspects described herein, and such data structures are contemplated within the scope of computer executable instructions and computer-usable data described herein.
  • At least one computer-readable storage medium may include instructions that, when executed, cause a system to perform any of the computer-implemented methods described herein.
  • a method to reduce robotic collisions includes monitoring a first controller for a first robot with a first robotic arm, the first controller to control movement of a first end-effector of the first robotic arm between a series of positions along a first defined trajectory, each position associated with a defined time interval, monitoring a second controller for a second robot with a second robotic arm, the second controller to control movement of a second end-effector of the second robotic arm between a series of positions along a second defined trajectory, each position associated with a defined time interval, detecting a cyberattack on the second controller for the second robot, identifying a defined time interval corresponding to a time of attack detection, the identified defined time interval to comprise an attack detection time interval, determining a current position for the first end-effector from among the series of positions along the first defined trajectory based on the attack detection time interval, retrieving a safety trajectory from among multiple safety trajectories stored in a database based on the current position for the first end-effector, the safety trajectory to define a series of positions between the current position for the first end-effector and a safety position for the first end-effector that avoids an uncertainty region of the second end-effector, and sending control information to the first controller to cause the first end-effector to move from the current position to the safety position along the safety trajectory in order to prevent a collision with the second end-effector.
  • the method may also include where the first robot and the second robot are collaborative robots arranged to move in a shared physical space in order to execute tasks to accomplish shared goals.
  • the method may also include where the first robotic arm is a mechanical arm with one or more links to form a kinematic chain, each link connected by a joint allowing rotational motion or translational displacement, and the first end-effector is a terminus for the kinematic chain.
  • the method may also include where the second robotic arm is a mechanical arm with one or more links to form a kinematic chain, each link connected by a joint allowing rotational motion or translational displacement, and the second end-effector is a terminus for the kinematic chain.
  • the method may also include where the first defined trajectory for the first end-effector and the second defined trajectory for the second end-effector are arranged to avoid collisions between the first end-effector and the second end-effector when moving within a shared space.
  • the method may also include detecting the cyberattack on the second controller for the second robot during an uncertainty time period, the uncertainty time period to comprise a time interval between a time of attack and a time of attack neutralization.
  • the method may also include comparing the time of attack detection to each time interval associated with each position in the first defined trajectory until a match is found to identify the attack detection time interval.
  • the method may also include retrieving the current position for the first end-effector from among the series of positions along the first defined trajectory based on the attack detection time interval from a database.
  • the method may also include where the uncertainty region includes a three-dimensional shape that defines boundaries for a reach envelope of the second end-effector after the attack detection time interval.
  • the method may also include where an uncertainty region is generated for each position associated with a time interval for the second end-effector of the second robotic arm, and stored in the database before the time of attack detection.
  • the method may also include executing the control information by the first controller to cause the first end-effector to move from the current position to the safety point along the safety trajectory in order to prevent a collision with the second end-effector.
  • the method may also include neutralizing the detected cyberattack on the second controller for the second robot.
  • an apparatus includes a processor circuit.
  • the apparatus also includes a memory storing instructions that, when executed by the processor, cause the processor circuit to monitor a first controller for a first robot with a first robotic arm, the first controller to control movement of a first end-effector of the first robotic arm between a series of positions along a first defined trajectory, each position associated with a defined time interval, monitor a second controller for a second robot with a second robotic arm, the second controller to control movement of a second end-effector of the second robotic arm between a series of positions along a second defined trajectory, each position associated with a defined time interval, detect a cyberattack on the second controller for the second robot, identify a defined time interval corresponding to a time of attack detection, the identified defined time interval to comprise an attack detection time interval, determine a current position for the first end-effector from among the series of positions along the first defined trajectory based on the attack detection time interval, retrieve a safety trajectory from among multiple safety trajectories stored in a database based on the current position for the first end-effector, the safety trajectory to define a series of positions between the current position for the first end-effector and a safety position for the first end-effector that avoids an uncertainty region of the second end-effector, and send control information to the first controller to cause the first end-effector to move from the current position to the safety position along the safety trajectory in order to prevent a collision with the second end-effector.
  • the apparatus may also include where the first robot and the second robot are collaborative robots arranged to move in a shared physical space in order to execute tasks to accomplish shared goals.
  • the apparatus may also include where the first robotic arm is a mechanical arm with one or more links to form a kinematic chain, each link connected by a joint allowing rotational motion or translational displacement, and the first end-effector is a terminus for the kinematic chain.
  • the apparatus may also include where the second robotic arm is a mechanical arm with one or more links to form a kinematic chain, each link connected by a joint allowing rotational motion or translational displacement, and the second end-effector is a terminus for the kinematic chain.
  • the apparatus may also include where the first defined trajectory for the first end-effector and the second defined trajectory for the second end-effector are arranged to avoid collisions between the first end-effector and the second end-effector when moving within a shared space.
  • the apparatus may also include the processor circuit to detect the cyberattack on the second controller for the second robot during an uncertainty time period, the uncertainty time period to comprise a time interval between a time of attack and a time of attack neutralization.
  • the apparatus may also include the processor circuit to compare the time of attack detection to each time interval associated with each position in the first defined trajectory until a match is found to identify the attack detection time interval.
  • the computing apparatus may also include the processor circuit to retrieve the current position for the first end-effector from among the series of positions along the first defined trajectory based on the attack detection time interval from a database.
  • the computing apparatus may also include where the uncertainty region includes a three-dimensional shape that defines boundaries for a reach envelope of the second end-effector after the attack detection time interval.
  • the computing apparatus may also include where an uncertainty region is generated for each position associated with a time interval for the second end-effector of the second robotic arm, and stored in the database before the time of attack detection.
  • the computing apparatus may also include the processor circuit to execute the control information by the first controller to cause the first end-effector to move from the current position to the safety point along the safety trajectory in order to prevent a collision with the second end-effector.
  • the computing apparatus may also include the processor circuit to neutralize the detected cyberattack on the second controller for the second robot.
  • a non-transitory computer-readable storage medium including instructions that when executed by a computer, cause the computer to monitor a first controller for a first robot with a first robotic arm, the first controller to control movement of a first end-effector of the first robotic arm between a series of positions along a first defined trajectory, each position associated with a defined time interval, monitor a second controller for a second robot with a second robotic arm, the second controller to control movement of a second end-effector of the second robotic arm between a series of positions along a second defined trajectory, each position associated with a defined time interval, detect a cyberattack on the second controller for the second robot, identify a defined time interval corresponding to a time of attack detection, the identified defined time interval to comprise an attack detection time interval, determine a current position for the first end-effector from among the series of positions along the first defined trajectory based on the attack detection time interval, retrieve a safety trajectory from among multiple safety trajectories stored in a database based on the current position for the first end-effector, the safety trajectory to define a series of positions between the current position for the first end-effector and a safety position for the first end-effector that avoids an uncertainty region of the second end-effector, and send control information to the first controller to cause the first end-effector to move from the current position to the safety position along the safety trajectory in order to prevent a collision with the second end-effector.
  • the computer-readable storage medium may also include where the first robot and the second robot are collaborative robots arranged to move in a shared physical space in order to execute tasks to accomplish shared goals.
  • the computer-readable storage medium may also include where the first robotic arm is a mechanical arm with one or more links to form a kinematic chain, each link connected by a joint allowing rotational motion or translational displacement, and the first end-effector is a terminus for the kinematic chain.
  • the computer-readable storage medium may also include where the second robotic arm is a mechanical arm with one or more links to form a kinematic chain, each link connected by a joint allowing rotational motion or translational displacement, and the second end-effector is a terminus for the kinematic chain.
  • the computer-readable storage medium may also include where the first defined trajectory for the first end-effector and the second defined trajectory for the second end-effector are arranged to avoid collisions between the first end-effector and the second end-effector when moving within a shared space.
  • the computer-readable storage medium may also include instructions to detect the cyberattack on the second controller for the second robot during an uncertainty time period, the uncertainty time period to comprise a time interval between a time of attack and a time of attack neutralization.
  • the computer-readable storage medium may also include instructions to compare the time of attack detection to each time interval associated with each position in the first defined trajectory until a match is found to identify the attack detection time interval.
  • the computer-readable storage medium may also include instructions to retrieve the current position for the first end-effector from among the series of positions along the first defined trajectory based on the attack detection time interval from a database.
  • the computer-readable storage medium may also include where the uncertainty region includes a three-dimensional shape that defines boundaries for a reach envelope of the second end-effector after the attack detection time interval.
  • the computer-readable storage medium may also include where an uncertainty region is generated for each position associated with a time interval for the second end-effector of the second robotic arm, and stored in the database before the time of attack detection.
  • the computer-readable storage medium may also include instructions to execute the control information by the first controller to cause the first end-effector to move from the current position to the safety point along the safety trajectory in order to prevent a collision with the second end-effector.
  • the computer-readable storage medium may also include instructions to neutralize the detected cyberattack on the second controller for the second robot.
  • a method to compute a trajectory includes receiving configuration information for a first robot with a first robotic arm and a first end-effector at a defined time interval, receiving an uncertainty region for a second robot with a second robotic arm and a second end-effector, the uncertainty region to comprise a three-dimensional shape that defines boundaries for a reach envelope of the second end-effector at the defined time interval, determining a safety trajectory for the first end-effector based on the uncertainty region, the safety trajectory to define a series of positions between a current position for the first end-effector and a safety position for the first end-effector that avoids the uncertainty region of the second end-effector at the defined time interval, and storing the safety trajectory in a database.
  • the method may also include where the defined time interval is associated with a defined position for the first end-effector and the second end-effector.
  • the method may also include where the configuration information includes a set of parameters for the first robotic arm at the defined time interval, the set of parameters includes a joint position state for each joint of the first robotic arm, a joint velocity state for each joint of the first robotic arm, and a joint acceleration state for each joint of the first robotic arm.
  • the method may also include where the configuration information includes a set of parameters for the first robotic arm at the defined time interval, the set of parameters includes an initial kinematic frame and a final kinematic frame for the first robotic arm at the defined time interval.
  • the method may also include determining the uncertainty region for the second robot with the second robotic arm and the second end-effector based on configuration information for the second robot, where the configuration information includes a set of parameters for the second robotic arm at the defined time interval, the set of parameters includes a joint position state for each joint of the second robotic arm, a joint velocity state for each joint of the second robotic arm, and a joint acceleration state for each joint of the second robotic arm.
  • the method may also include adapting the safety trajectory to ensure movement of each joint of the first robotic arm remains within joint acceleration parameters.
  • the method may also include smoothing the safety trajectory to ensure movement of the first robotic arm remains within acceleration parameters along inflection regions.
  • the method may also include regularizing the safety trajectory for safe-trajectory branching and management.
  • the method may also include receiving multiple uncertainty regions for the second robot with the second robotic arm and the second end-effector, each uncertainty region associated with a position of the second end-effector at a defined time interval within a series of defined time intervals.
  • the method may also include determining multiple safety trajectories for the first end-effector based on multiple uncertainty regions for the second robot with the second robotic arm and the second end-effector, each uncertainty region associated with a position of the second end-effector at a defined time interval within a series of defined time intervals.
  • the method may also include storing multiple safety trajectories for the first end-effector based on multiple uncertainty regions for the second robot with the second robotic arm and the second end-effector in the database.
  • the method may also include indexing multiple safety trajectories for the first end-effector in the database using defined time intervals and associated positions of the first end-effector at each defined time interval.
  • a computing apparatus includes a processor circuit.
  • the computing apparatus also includes a memory storing instructions that, when executed by the processor, cause the processor circuit to receive configuration information for a first robot with a first robotic arm and a first end-effector at a defined time interval, receive an uncertainty region for a second robot with a second robotic arm and a second end-effector, the uncertainty region to comprise a three-dimensional shape that defines boundaries for a reach envelope of the second end-effector at the defined time interval, determine a safety trajectory for the first end-effector based on the uncertainty region, the safety trajectory to define a series of positions between a current position for the first end-effector and a safety position for the first end-effector that avoids the uncertainty region of the second end-effector at the defined time interval, and store the safety trajectory in a database.
  • the computing apparatus may also include where the defined time interval is associated with a defined position for the first end-effector and the second end-effector.
  • the computing apparatus may also include where the configuration information includes a set of parameters for the first robotic arm at the defined time interval, the set of parameters includes a joint position state for each joint of the first robotic arm, a joint velocity state for each joint of the first robotic arm, and a joint acceleration state for each joint of the first robotic arm.
  • the computing apparatus may also include where the configuration information includes a set of parameters for the first robotic arm at the defined time interval, the set of parameters includes an initial kinematic frame and a final kinematic frame for the first robotic arm at the defined time interval.
  • the computing apparatus may also include the processor circuit to determine the uncertainty region for the second robot with the second robotic arm and the second end-effector based on configuration information for the second robot, where the configuration information includes a set of parameters for the second robotic arm at the defined time interval, the set of parameters includes a joint position state for each joint of the second robotic arm, a joint velocity state for each joint of the second robotic arm, and a joint acceleration state for each joint of the second robotic arm.
  • the computing apparatus may also include the processor circuit to adapt the safety trajectory to ensure movement of each joint of the first robotic arm remains within joint acceleration parameters.
  • the computing apparatus may also include the processor circuit to smooth the safety trajectory to ensure movement of the first robotic arm remains within acceleration parameters along inflection regions.
  • the computing apparatus may also include the processor circuit to regularize the safety trajectory for safe-trajectory branching and management.
  • the computing apparatus may also include the processor circuit to receive multiple uncertainty regions for the second robot with the second robotic arm and the second end-effector, each uncertainty region associated with a position of the second end-effector at a defined time interval within a series of defined time intervals.
  • the computing apparatus may also include the processor circuit to determine multiple safety trajectories for the first end-effector based on multiple uncertainty regions for the second robot with the second robotic arm and the second end-effector, each uncertainty region associated with a position of the second end-effector at a defined time interval within a series of defined time intervals.
  • the computing apparatus may also include the processor circuit to store multiple safety trajectories for the first end-effector based on multiple uncertainty regions for the second robot with the second robotic arm and the second end-effector in the database.
  • the computing apparatus may also include the processor circuit to index multiple safety trajectories for the first end-effector in the database using defined time intervals and associated positions of the first end-effector at each defined time interval.
  • a non-transitory computer-readable storage medium including instructions that when executed by a computer, cause the computer to receive configuration information for a first robot with a first robotic arm and a first end-effector at a defined time interval, receive an uncertainty region for a second robot with a second robotic arm and a second end-effector, the uncertainty region to comprise a three-dimensional shape that defines boundaries for a reach envelope of the second end-effector at the defined time interval, determine a safety trajectory for the first end-effector based on the uncertainty region, the safety trajectory to define a series of positions between a current position for the first end-effector and a safety position for the first end-effector that avoids the uncertainty region of the second end-effector at the defined time interval, and store the safety trajectory in a database.
  • the computer-readable storage medium may also include where the defined time interval is associated with a defined position for the first end-effector and the second end-effector.
  • the computer-readable storage medium may also include where the configuration information includes a set of parameters for the first robotic arm at the defined time interval, the set of parameters includes a joint position state for each joint of the first robotic arm, a joint velocity state for each joint of the first robotic arm, and a joint acceleration state for each joint of the first robotic arm.
  • the computer-readable storage medium may also include where the configuration information includes a set of parameters for the first robotic arm at the defined time interval, the set of parameters includes an initial kinematic frame and a final kinematic frame for the first robotic arm at the defined time interval.
  • the computer-readable storage medium may also include instructions to determine the uncertainty region for the second robot with the second robotic arm and the second end-effector based on configuration information for the second robot, where the configuration information includes a set of parameters for the second robotic arm at the defined time interval, the set of parameters includes a joint position state for each joint of the second robotic arm, a joint velocity state for each joint of the second robotic arm, and a joint acceleration state for each joint of the second robotic arm.
  • the computer-readable storage medium may also include instructions to adapt the safety trajectory to ensure movement of each joint of the first robotic arm remains within joint acceleration parameters.
  • the computer-readable storage medium may also include instructions to smooth the safety trajectory to ensure movement of the first robotic arm remains within acceleration parameters along inflection regions.
  • the computer-readable storage medium may also include instructions to regularize the safety trajectory for safe-trajectory branching and management.
  • the computer-readable storage medium may also include instructions to receive multiple uncertainty regions for the second robot with the second robotic arm and the second end-effector, each uncertainty region associated with a position of the second end-effector at a defined time interval within a series of defined time intervals.
  • the computer-readable storage medium may also include instructions to determine multiple safety trajectories for the first end-effector based on multiple uncertainty regions for the second robot with the second robotic arm and the second end-effector, each uncertainty region associated with a position of the second end-effector at a defined time interval within a series of defined time intervals.
  • the computer-readable storage medium may also include instructions to store multiple safety trajectories for the first end-effector based on multiple uncertainty regions for the second robot with the second robotic arm and the second end-effector in the database.
  • the computer-readable storage medium may also include instructions to index multiple safety trajectories for the first end-effector in the database using defined time intervals and associated positions of the first end-effector at each defined time interval.
  • a method to compute a trajectory includes receiving configuration information for a first robot with a first robotic arm at a defined time interval, receiving an uncertainty region for a second robot with a second robotic arm, the uncertainty region to comprise a three-dimensional shape that defines boundaries for a reach envelope of the second robotic arm at the defined time interval, determining a safety trajectory for the first robotic arm based on the uncertainty region, the safety trajectory to define a series of positions between a current position for the first robotic arm and a safety position for the first robotic arm that avoids the uncertainty region of the second robotic arm at the defined time interval, and storing the safety trajectory in a database.
  • the method may also include the first robotic arm to have a first end-effector and the second robotic arm to have a second end-effector.
  • a method to reduce robotic collisions includes monitoring a first controller for a first robot with a first robotic arm, the first controller to control movement of the first robotic arm between a series of positions along a first defined trajectory, each position associated with a defined time interval, monitoring a second controller for a second robot with a second robotic arm, the second controller to control movement of the second robotic arm between a series of positions along a second defined trajectory, each position associated with a defined time interval, detecting a cyberattack on the second controller for the second robot, identifying a defined time interval corresponding to a time of attack detection, the identified defined time interval to comprise an attack detection time interval, determining a current position for the first robotic arm from among the series of positions along the first defined trajectory based on the attack detection time interval, retrieving a safety trajectory from among multiple safety trajectories stored in a database based on the current position for the first robotic arm, the safety trajectory to define a series of positions between the current position for the first robotic arm and a safety position for the first robotic arm that avoids an uncertainty region of the second robotic arm, and sending control information to the first controller to cause the first robotic arm to move from the current position to the safety position along the safety trajectory in order to prevent a collision with the second robotic arm.
  • an apparatus to reduce robotic collisions includes a means for monitoring a first controller for a first robot with a first robotic arm, the first controller to control movement of a first end-effector of the first robotic arm between a series of positions along a first defined trajectory, each position associated with a defined time interval, a means for monitoring a second controller for a second robot with a second robotic arm, the second controller to control movement of a second end-effector of the second robotic arm between a series of positions along a second defined trajectory, each position associated with a defined time interval, a means for detecting a cyberattack on the second controller for the second robot, identifying a defined time interval corresponding to a time of attack detection, the identified defined time interval to comprise an attack detection time interval, a means for determining a current position for the first end-effector from among the series of positions along the first defined trajectory based on the attack detection time interval, a means for retrieving a safety trajectory from among multiple safety trajectories stored in a database based on the current position for the first end-effector, the safety trajectory to define a series of positions between the current position for the first end-effector and a safety position for the first end-effector that avoids an uncertainty region of the second end-effector, and a means for sending control information to the first controller to cause the first end-effector to move from the current position to the safety position along the safety trajectory in order to prevent a collision with the second end-effector.
  • an apparatus to compute a trajectory includes a means for receiving configuration information for a first robot with a first robotic arm and a first end-effector at a defined time interval, a means for receiving an uncertainty region for a second robot with a second robotic arm and a second end-effector, the uncertainty region to comprise a three-dimensional shape that defines boundaries for a reach envelope of the second end-effector at the defined time interval, a means for determining a safety trajectory for the first end-effector based on the uncertainty region, the safety trajectory to define a series of positions between a current position for the first end-effector and a safety position for the first end-effector that avoids the uncertainty region of the second end-effector at the defined time interval, and a means for storing the safety trajectory in a database.

Abstract

Techniques to perform post-attack real-time trajectory recovery of collaborative robotic arms are described. An intrusion detection system detects a security attack, and sends an alarm to a robotic safety system. The robotic safety system takes remedial actions to secure one or both of the collaborative robotic arms. In an offline phase, the robotic safety system plans safety trajectories for the collaborative robotic arms and stores them in a database. In an online phase, the robotic safety system receives the alarm from the IDS, and retrieves safety trajectories for the collaborative robotic arms from the database. Other embodiments are described and claimed.

Description

POST-ATTACK REAL-TIME TRAJECTORY RECOVERY OF COLLABORATIVE ROBOTIC ARMS BACKGROUND
[0001] A robot is a machine designed to perform an automated set of repetitive tasks under control of a computer. While some robots take human form, others are designed with a form suitable for a given set of tasks. Examples include industrial and medical operating robots with robotic arms, unmanned aerial vehicles (UAV) drones with a shape of an aircraft, quadrupedal robots shaped like dogs, and even microscopic nano robots shaped like a flagellum for propulsion through a human body. As a robot is controlled by a computer, there is a potential for an attacker to gain unauthorized access to the computer, and thereby take over control of the robot, sometimes to lethal effect. This can cause the robot to harm humans or damage expensive equipment before an attack is detected and neutralized. Consequently, security measures are needed to safeguard against remote attacks and mitigate potential damage to the robot or surrounding environment.
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
[0002] To easily identify the discussion of any particular element or act, the most significant digit or digits in a reference number refer to the figure number in which that element is first introduced.
[0003] FIG. 1 illustrates an exemplary robotic system.
[0004] FIG. 2A illustrates a first view of a system with an intrusion detection system.
[0005] FIG. 2B illustrates a second view of a system with an intrusion detection system.
[0006] FIG. 3 illustrates a block diagram of an intrusion detection system.
[0007] FIG. 4 illustrates a first logic flow.
[0008] FIG. 5 illustrates a second logic flow.
[0009] FIG. 6 illustrates an operating environment of a robotic system.
[0010] FIG. 7A illustrates an operating environment of a robotic system at a start of a security attack.
[0011] FIG. 7B illustrates an operating environment of a robotic system when a security attack is detected.
[0012] FIG. 7C illustrates an operating environment of a robotic system when a security attack is neutralized.
[0013] FIG. 8 illustrates a region of uncertainty for a robotic system.
[0014] FIG. 9 illustrates an offline phase for a motion planner of an intrusion detection system.
[0015] FIG. 10 illustrates a robot and defined trajectories.
[0016] FIG. 11 illustrates a third logic flow.
[0017] FIG. 12 illustrates an online phase for a motion controller of an intrusion detection system.
[0018] FIG. 13 illustrates a fourth logic flow.
[0019] FIG. 14 illustrates a robot and multiple defined trajectories.
[0020] FIG. 15 illustrates a fifth logic flow.
[0021] FIG. 16 illustrates a sixth logic flow.
[0022] FIG. 17 illustrates a computer readable medium.
[0023] FIG. 18 depicts an illustrative computer system architecture that may be used to practice exemplary embodiments described herein.
DETAILED DESCRIPTION
[0024] Embodiments are generally directed to techniques to secure robots and corresponding automated control systems from security threats. Some embodiments are particularly directed to techniques to monitor robots and take safety measures when a security or cybersecurity attack is detected. For example, an intrusion detection system (IDS) may monitor computers or controllers arranged to control robots in order to identify and neutralize security attacks against the robots in a robotic system. A robotic safety system receives security alerts from the IDS, and takes remedial or corrective actions to prevent damage to the robots, humans, objects, products, components, or other valuable resources of the robotic system.
[0025] Robots have replaced humans in performing repetitive and dangerous tasks. For example, industrial robots are routinely used for manufacturing to perform tasks such as welding, painting and assembly. Articulated robots are a common industrial robot with robotic arms that mimic a human arm. As with human arms, the robotic arms are articulated with several degrees of freedom that allow a wide range of movement in three-dimensions. Other types of robots include cartesian robots, cylindrical coordinate robots, spherical coordinate robots, selective compliance assembly robot arm (SCARA) robots, delta or parallel link robots, serial manipulators, parallel manipulators, anthropomorphic robots, and so forth.
[0026] In some cases, multiple robots can collaborate together to operate in close proximity to each other within a shared space. Such robots are sometimes referred to as “collaborative robots.” Collaborative robots can have robotic arms (or other parts) designed to operate together to perform shared tasks, such as assembly of a product. Robotic controllers carefully synchronize movement of the robotic arms to safely perform tasks within the shared space while avoiding collisions, which can potentially damage the robotic arms, products, humans or other valuable resources. Motion planning is used to calculate defined paths or trajectories for the robotic arms, which are rigorously followed during real-time operations of the robotic arms under supervision of associated computers or controllers.
[0027] Collaborative robots may be subject to a security attack. For example, malware can be injected into robotic controllers to modify movement of one or more robotic arms outside defined paths or trajectories. This could potentially cause the robotic arms to collide, either accidentally or on purpose. Intrusion detection systems are deployed to detect security attacks and send alerts to robotic safety systems, which in turn take remedial actions for the collaborative robots. One example of remedial actions includes redirecting movement of robotic arms to safety positions or avoiding subsequent arm movements until the attack is neutralized. However, there is a period of time between attack initiation and attack neutralization where a current position for a compromised robotic arm is uncertain or unknown as a consequence of the malicious actions of the attacker. Further, motion planning for robotic arms is a compute intensive operation and not really feasible during real-time attacks. Consequently, performing real-time motion planning to move robotic arms to safety positions during an attack can be challenging, particularly within tight time frames often measured in milliseconds or microseconds.
[0028] Embodiments attempt to solve these and other problems using a robotic safety system. A robotic safety system is a hardware and/or software component, device or system that is generally responsible for controlling movement of one or more robotic arms of one or more collaborative robots when under a security attack. For instance, the robotic safety system directs movement of a robotic arm of a pair of collaborative robotic arms from a current position to a safety position when another robotic arm of the collaborative robotic arms is under a security attack.
[0029] In order to effectively operate in real-time, the robotic safety system is designed to operate in multiple phases. A first phase is an offline phase when collaborative robots are non-operational. During the offline phase, a motion planner generates a series of safety trajectories for each robotic arm of the collaborative robots, and stores them in a database. In general, a safety trajectory may comprise a path between a current position and a desired end position for a robotic arm following a safe trajectory that avoids a defined subspace in a three-dimensional (3D) space. A second phase is an online phase when collaborative robots are operational. During the online phase, a motion controller is notified of a security attack on one robotic arm (referred to herein as a “compromised robotic arm”), retrieves a defined safety trajectory for another robotic arm (referred to herein as an “un-compromised robotic arm”) stored in the database, and controls movement of the un-compromised robotic arm along the defined safety trajectory. In this manner, the robotic safety system can respond to security threats in real-time with faster reaction times, while reducing or eliminating collisions between the un-compromised robotic arm and the compromised robotic arm. As a result, the robotic safety system increases safety for humans, robots, objects, products, components, electronic parts, mechanical parts, and other valuable resources that are in relatively close proximity to the compromised robotic arm (e.g., within a reach envelope) during a security event.
[0030] More particularly, in an offline phase, a motion planner of a robotic safety system exhaustively calculates safety trajectories between a current position for a robotic arm and a safety position for the robotic arm. The safety trajectories are calculated for each defined time interval or time step associated with the robotic arm. The total operational time of each robotic arm is split into discrete points in time (referred to herein as “defined time intervals” or “defined time steps” which are used interchangeably) at some time granularity, usually on the order of milliseconds. The motion planner receives as input a position associated with a given time step of a target robotic arm, and computes an uncertainty region for the target robotic arm. The term “target robotic arm” refers to a robot that could potentially become a “compromised robotic arm” in the future during the online phase. The offline phase exhaustively computes the uncertainty region for all time steps in the planned trajectories of the target robot. In general, an uncertainty region comprises a three-dimensional shape that defines boundaries for a reach envelope of a compromised robotic arm during an uncertainty period. An uncertainty period represents all or some of a time period as measured between when a security attack is initiated and when the security attack is neutralized. Note that a time when a security attack is initiated (referred to herein as an “attack time”) is sometime before a time when the security attack is detected (referred to herein as a “detect time”). Further note that a time when a security attack is neutralized (referred to herein as a “neutralization time”) is sometime after an attack time and detect time. The safety trajectories calculated during the offline phase are stored in a database, and indexed for quick retrieval during an online phase.
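By way of illustration only, the following Python sketch shows one way the time-step discretization and uncertainty-region computation described above could be approximated. The UncertaintyRegion class, the helper name uncertainty_region, and the spherical over-approximation (worst-case end-effector speed multiplied by the uncertainty period, capped at the arm's reach) are assumptions chosen for the example, not a definition of the embodiments.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class UncertaintyRegion:
    """Sphere that over-approximates the reach envelope of a possibly compromised
    end-effector during the uncertainty period (attack time to neutralization time)."""
    center: Tuple[float, float, float]  # last known end-effector position (x, y, z), meters
    radius: float                       # farthest the end-effector could move before neutralization

def uncertainty_region(last_position, max_speed, uncertainty_period, arm_reach):
    # The region grows with worst-case end-effector speed, but is capped at the
    # kinematic reach of the arm measured from its last known position.
    radius = min(max_speed * uncertainty_period, arm_reach)
    return UncertaintyRegion(center=tuple(last_position), radius=radius)

# Offline phase: one uncertainty region per defined time step T1..TN of the planned trajectory.
planned = [(0.30 + 0.001 * k, 0.10, 0.40) for k in range(100)]  # toy planned end-effector positions
regions = {k + 1: uncertainty_region(p, max_speed=2.0, uncertainty_period=0.5, arm_reach=0.85)
           for k, p in enumerate(planned)}
```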
[0031] In the online phase, a motion controller of a robotic safety system controls movement of robotic arms along defined trajectories to accomplish shared tasks. When an IDS detects a security attack on one of the robotic arms, it notifies the robotic safety system. The robotic safety system obtains the last time step associated with the last good known position of the attacked robot. The motion controller of the robotic safety system retrieves an appropriate safety trajectory for one or more un-compromised robotic arms from the database based on last known positions, poses or configurations for the one or more un-compromised robotic arms, which are derived from defined time steps associated with the un-compromised robotic arms. The motion controller then instructs the un-compromised robotic arms to follow the safety trajectories from a current position of the robotic arms to a safety position for the robotic arms. Since the safety trajectory is pre-computed to avoid an uncertainty region for the compromised robotic arm, there is a lower probability of collisions between the robotic arms. Further, the use of pre-computed safety trajectories allows faster reaction times for the un-compromised robotic arm, relative to real-time motion planning, thereby lowering a probability of collisions even further. As a result, factories employing collaborative robots with increased resiliency against cyberattacks will have higher productivity and less downtime. This could benefit products in the autonomous systems area, such as robotics, self-driving automobiles, and other automated systems.
[0032] With general reference to notations and nomenclature used herein, one or more portions of the detailed description which follows may be presented in terms of program procedures executed on a computer or network of computers. These procedural descriptions and representations are used by those skilled in the art to convey the substances of their work to others skilled in the art. A procedure is here, and generally, conceived to be a self-consistent sequence of operations leading to a desired result. These operations are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical, magnetic, or optical signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It proves convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like. It should be noted, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to those quantities.
[0033] Further, these manipulations are often referred to in terms, such as adding or comparing, which are commonly associated with mental operations performed by a human operator. However, no such capability of a human operator is necessary, or desirable in most cases, in any of the operations described herein that form part of one or more embodiments. Rather, these operations are machine operations. Useful machines for performing operations of various embodiments include digital computers as selectively activated or configured by a computer program stored within that is written in accordance with the teachings herein, and/or include apparatus specially constructed for the required purpose or a digital computer. Various embodiments also relate to apparatus or systems for performing these operations. These apparatuses may be specially constructed for the required purpose. The required structure for a variety of these machines will be apparent from the description given.
[0034] Reference is now made to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for the purpose of explanation, numerous specific details are set forth in order to provide a thorough understanding thereof. It may be evident, however, that the novel embodiments can be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to facilitate a description thereof. The intention is to cover all modification, equivalents, and alternatives within the scope of the claims.
[0035] In the Figures and the accompanying description, the designations “a” and “b” and “c” (and similar designators) are intended to be variables representing any positive integer. Thus, for example, if an implementation sets a value for a = 5, then a complete set of components 123 illustrated as components 123-1 through 123-a (or 123a) may include components 123-1, 123-2, 123-3, 123-4, and 123-5. The embodiments are not limited in this context.
[0036] FIG. 1 depicts a robotic system 100. The robotic system 100 may comprise any robotic system designed for a particular application within a robotic system environment 134. Examples of robotic systems suitable for the robotic system 100 can include, without limitation, articulated robots, cartesian robots, cylindrical coordinate robots, spherical coordinate robots, selective compliance assembly robot arm (SCARA) robots, delta or parallel link robots, serial manipulators, parallel manipulators, anthropomorphic robots, and so forth. In one embodiment, the robotic system 100 is a collaborative robotic system having multiple robots each with one or more articulated robotic arms. However, embodiments are not limited to this particular type of robotic system.
[0037] As shown in FIG. 1 by way of example and not limitation, the robotic system 100 may comprise multiple collaborative robots 102, 118. Each robot 102, 118 may have one or more robotic arms, such as a first robotic arm 106 and a second robotic arm 122, respectively. The robotic arms 106, 122 may be collaborative robotic arms arranged to move in a shared physical space 138 in order to execute tasks to accomplish shared goals, such as manipulating an object 136. The shared physical space 138 is sometimes referred to as a “workspace” or “taskspace” for the robotic system 100. The object 136 may be any physical object suitable for a given application for the robotic system 100, such as a human patient for a medical operating robot, a product or item for assembly or disassembly by an industrial operating robot, an automobile for welding or painting, and so forth. Although only two robots 102, 118 each having a single robotic arm 106, 122, respectively, are shown in FIG. 1, it may be appreciated that the robotic system 100 may contain more than two robots and/or robotic arms operating within the shared physical space 138. Embodiments are not limited in this context.
[0038] In one embodiment, the robotic arms 106, 122 are articulated robotic arms. An articulated robotic arm is designed as a series of links connected by motor-actuated joints that extend from a base to an end-effector. Articulated robotic arms often have an anthropomorphic arm structure that mimics a human arm, such as having a shoulder, an elbow and a wrist. As with human arms, the robotic arms 106, 122 are articulated with several degrees of freedom that allow a wide range of movement in three-dimensions. In one embodiment, for example, the robotic arms 106, 122 may have up to six joints, since it requires at least six degrees of freedom to place a manipulated object 136 in an arbitrary position and orientation in the shared physical space 138 of the robots 102, 118.
[0039] As shown in FIG. 1, the first robotic arm 106 is a mechanical arm with a robot base 104 and one or more links 110, 114 to form a first kinematic chain. Kinematics is used to describe a motion of systems composed of joined parts (e.g., multi-link systems), such as in an engine, robotic arm or human skeleton. The links 110, 114 are connected by joints 108, 112 allowing rotational motion or translational displacement. At the end of the link 114 is a first end-effector 116, which is considered a terminus for the first kinematic chain.
Similarly, the second robotic arm 122 is a mechanical arm with a robot base 120 and one or more links 126, 130 to form a second kinematic chain. The links 126, 130 are connected by joints 124, 128 allowing rotational motion or translational displacement. At the end of the link 130 is a second end-effector 132, which is considered a terminus for the second kinematic chain. Note the links and joints denoted for the robots 102, 118 are exemplary only, and the robots 102, 118 may have more links and joints not labeled in FIG. 1.
[0040] The end effectors 116, 132 can be any devices located at the terminus of the robotic arms 106, 122, that are designed to interact with robotic system environment 134, such as manipulating the object 136. An exact implementation for the end-effectors 116, 132 depends on a given application for the robots 102, 118. Examples for the end-effectors 116, 132 can include, without limitation, tools, grippers, pins, vacuums, magnets, welding guns, paint spray gun, surgical tool, drill, cutters, robotic hands, and so forth.
[0041] The position and orientation of the end-effectors 116, 132 are derived from joint positions using a geometric model of the robotic arms 106, 122, respectively. A reachable workspace within the shared physical space 138 of the end-effectors 116, 132 is a manifold of reachable frames. A reachable workspace is sometimes referred to herein as a “reachable envelope” for a robotic arm 106, 122. A dextrous workspace comprises points of the reachable workspace where the robot can generate velocities that span the complete tangent space at that point, e.g., it can translate a manipulated object 136 with three degrees of freedom, and rotate the object with three degrees of rotation freedom. Relationships between joint space and Cartesian space coordinates of the object 136 held by the robotic arms 106, 122 are in general multiple-valued. A same pose can be reached by the robotic arms 106, 122 in different ways, each with a different set of joint coordinates. Hence the reachable workspace of the robotic arms 106, 122 is divided into configurations (also called assembly modes), in which the kinematic relationships are locally one-to-one.
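As a simplified, non-authoritative illustration of deriving an end-effector position from joint positions, the short Python sketch below uses a two-link planar arm rather than the six-joint arms described above; the function name planar_fk and the link lengths are assumptions for the example. The two calls at the end reach approximately the same end-effector position with different joint coordinates, which is the multiple-valued joint-space-to-Cartesian-space relationship noted above.

```python
import math

def planar_fk(joint_angles, link_lengths):
    """Forward kinematics for a planar serial arm: joint angles -> end-effector pose.

    Returns (x, y, phi), where phi is the end-effector orientation.  A six-joint arm
    would chain homogeneous transforms per joint, but the idea is the same: the
    end-effector frame is the composition of the per-link transforms.
    """
    x = y = phi = 0.0
    for theta, length in zip(joint_angles, link_lengths):
        phi += theta
        x += length * math.cos(phi)
        y += length * math.sin(phi)
    return x, y, phi

# Two joint configurations ("elbow up" / "elbow down") reaching approximately the same
# end-effector position, with different joint coordinates and orientations.
print(planar_fk([0.6, 0.9], [0.4, 0.3]))
print(planar_fk([1.3622, -0.9], [0.4, 0.3]))
```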
[0042] In operation, the robotic arms 106, 122 are programmed to follow one or more defined trajectories in the shared physical space 138 while performing a collaborative task on the object 136. A defined trajectory is a sequence of valid configurations to move an object from a source to a destination. A motion planner implements a motion planning algorithm to compute a continuous path (or series of points) that connects a start configuration S and a goal configuration G, while avoiding collision with known obstacles. An exemplary motion planner and defined trajectories are described in more detail with reference to FIG. 3.
[0043] FIG. 2A depicts a system 200a. The system 200a represents a control system for the robotic system 100 while operating under normal conditions. The system 200a comprises a network 212 communicatively coupled to an intrusion detection system (IDS) 202. The IDS 202 is communicatively coupled to a robotic safety system 214, which in turn, is communicatively coupled to a pair of controllers 204, 206.
[0044] The controllers 204, 206 are arranged to control operations for the robots 102, 118, respectively. The controllers 204, 206 may be implemented with various hardware elements, software elements, or a combination of both, as described in more detail with reference to FIG. 18. While two controllers 204, 206 are shown in system 200a, a single controller may be used to control both of the robots 102, 118. Embodiments are not limited in this context.
[0045] In general, the IDS 202 is a device or software application that monitors a device, network or systems for malicious activity or policy violations. The IDS 202 may be specifically tuned to detect various types of attacks, such as a malware injections into the controllers 204, 206, spoofing of sensing and actuation, Denial of Service of robotic functions, or other specific attack vectors for robotic system 100. Any intrusion activity or violation is typically reported either to other devices in the same network 212, an administrator, and/or collected centrally or locally using a security information and event management (SIEM) system. A SIEM system combines outputs from multiple sources and uses alarm filtering techniques to distinguish malicious activity from false alarms. In addition to monitoring the controllers 204, 206, the IDS 202 may be implemented for other devices in the robotic system environment 134, such as other robotic systems coupled to the network 212, to provide a more comprehensive security solution to an attacker.
[0046] The IDS 202 can utilize any number of different detection methods to detect an attack. For instance, the IDS 202 may implement a signature-based method, a statistical anomaly-based method, a stateful protocol analysis method, a machine-learning based method, or some combination of all four methods. A signature-based IDS monitors packets in the network 212 and compares with pre-configured and pre-determined attack patterns known as signatures. A statistical anomaly-based or machine-learning based IDS monitors network traffic and compares it against an established baseline. The baseline will identify what is “normal” for the network 212, such as what sort of bandwidth is generally used and what protocols are used. For instance, ensemble models that use Matthews correlation coefficient to identify unauthorized network traffic have obtained 99.73% accuracy. A stateful protocol analysis IDS identifies deviations of protocol states by comparing observed events with defined profiles of generally accepted definitions of benign activity. It will be appreciated that these detection methods are by way of example and not limitation. Other embodiments may use different detection methods as well. The embodiments are not limited in this respect.
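As a rough sketch of a statistical anomaly-based check (and not the ensemble models mentioned above), the following Python example flags a traffic sample that deviates from a learned baseline by more than a fixed number of standard deviations. The class name AnomalyIDS, the traffic-rate feature, and the threshold are illustrative assumptions only.

```python
import statistics

class AnomalyIDS:
    """Toy statistical anomaly-based detector: flag traffic samples that deviate
    too far from a learned baseline (e.g., packets/s on the controller network)."""

    def __init__(self, baseline_samples, threshold_sigmas=4.0):
        self.mean = statistics.fmean(baseline_samples)
        self.std = statistics.pstdev(baseline_samples) or 1e-9
        self.threshold = threshold_sigmas

    def is_attack(self, observed_rate):
        # Simple z-score test against the learned baseline.
        z = abs(observed_rate - self.mean) / self.std
        return z > self.threshold

ids = AnomalyIDS(baseline_samples=[980, 1010, 995, 1005, 990])
print(ids.is_attack(1002))   # False: within the normal envelope
print(ids.is_attack(4800))   # True: raise an alarm to the robotic safety system
```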
[0047] The robotic safety system 214 is a device that is implemented on a communications path between the IDS 202 and the controllers 204, 206. In one embodiment, the robotic safety system 214 is a software application that receives alerts or alarm signals from the IDS 202 indicating that one or both of the controllers 204, 206 are under a security attack. When the robotic safety system 214 receives an alarm notification, it begins robotic recovery operations or decisions to ensure safe operations for one or both of the robots 102, 118 while under attack. One example of robotic recovery operations may include controlling movement for the robotic arms 106, 122 along pre-computed safety trajectories to designated safety positions. A safety position may comprise a resting pose or position for an un-compromised robotic arm where it is outside an uncertainty region for a compromised robotic arm and therefore safe from collisions from the compromised robotic arm.
[0048] FIG. 2B depicts a system 200b. The system 200b is similar to the system 200a. However, the system 200b illustrates a case where an attacker 208 has attacked the controller 206 for the robot 118. The attacker 208 has gained unauthorized access and control over operations for the controller 206, and therefore has control over operations for the robot 118 and the robotic arm 122. For instance, the attacker 208 can cause the robotic arm 122 to deviate from its normal working defined trajectory to an unknown trajectory, thereby creating a possibility of collisions with the robotic arm 106 of the robot 102 which is operating within the same shared physical space 138. The IDS 202 detects the attack, and immediately sends an alarm 210 to other devices on the network 212, including the robotic safety system 214, the controller 204 of the robot 102, and other devices accessible via the network 212 (e.g., other robots or a SIEM).
[0049] FIG. 3 depicts a more detailed block diagram of components implemented for the robotic safety system 214. The robotic safety system 214 is representative of any number and type of electronic devices arranged to operate and process messages for the robotic system 100. More particularly, the robotic safety system 214 includes a processing circuitry 302, an interface 322 and a memory 304.
[0050] The processing circuitry 302 may include circuitry or processor logic, such as, for example, any of a variety of commercial processors. In some examples, the processing circuitry 302 may include multiple processors, a multi-threaded processor, a multi-core processor (whether the multiple cores coexist on the same or separate dies), and/or a multi-processor architecture of some other variety by which multiple physically separate processors are in some way linked. Additionally, in some examples, the processing circuitry 302 may include graphics processing portions and may include dedicated memory, multiple-threaded processing and/or some other parallel processing capability. In some examples, the processing circuitry 302 may be an application specific integrated circuit (ASIC) or a field programmable gate array (FPGA). In some examples, the processing circuitry 302 may be circuitry arranged to perform computations related to the robotic system 100, the robots 102, 118, the robotic arms 106, 122, the controllers 204, 206, switching, routing, security, and so forth.
[0051] The interface 322 may include logic and/or features to support a communication interface. For example, the interface 322 may include one or more interfaces that operate according to various communication protocols or standards to communicate over direct or network communication links. Direct communications may occur via use of communication protocols or standards described in one or more industry standards (including progenies and variants). For example, the interface 322 may facilitate communication over a bus, such as, for example, peripheral component interconnect express (PCIe), non-volatile memory express (NVMe), universal serial bus (USB), system management bus (SMBus), SAS (e.g., serial attached small computer system interface (SCSI)) interfaces, serial AT attachment (SATA) interfaces, or the like. In some examples, interface 322 may be arranged to support wireless communication protocols or standards, such as, for example, Wi-Fi, Bluetooth, ZigBee, LTE, 5G, 6G or the like. In one embodiment, the interface 322 of the robotic safety system 214 may be arranged to communicate over a wired or wireless connection to the other devices in the systems 200a, 200b, such as the IDS 202, the controllers 204, 206, the robots 102, 118, or other devices via the network 212.
[0052] The memory 304 may include logic, a portion of which includes arrays of integrated circuits, forming non-volatile memory to persistently store data or a combination of non-volatile memory and volatile memory. It is to be appreciated, that the memory 304 may be based on any of a variety of technologies. In particular, the arrays of integrated circuits included in memory 304 may be arranged to form one or more types of memory, such as, for example, dynamic random access memory (DRAM), NAND memory, NOR memory, or the like.
[0053] The memory 304 includes a set of instructions 306, input data 308, output data 310, a motion planner 312 and a motion controller 314. In one embodiment, the memory 304 may optionally include an integrated IDS, such as the IDS 202 as described with reference to FIGS. 2A, 2B. The robotic safety system 214 may also include one or more databases to store information, such as a first database to store configuration information 316, a second database to store defined trajectories 318, and a third database to store safety trajectories 320.
[0054] The configuration information 316 may generally include various parameters or properties associated with one or both of the robots 102, 118 and/or corresponding robotic arms 106, 122. Examples of configuration information 316 may include mechanical and electrical properties, type information, arm information, base information, link information, joint information, end-effector information, pose information, kinematics, state information, defined time intervals or time steps T1 to TN (also denoted herein as “T1-TN”), material types, physics-related information, object information, and any other information pertaining to the robotic arms 106, 122.
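The exact layout of the configuration information 316 is implementation specific. The following Python sketch merely illustrates one plausible per-arm, per-time-step record; the class name, field names, and values are assumptions chosen for the example.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ArmConfiguration:
    """Configuration information for one robotic arm at one defined time step."""
    time_step: int                        # index into T1..TN
    joint_positions: List[float]          # radians, one per joint
    joint_velocities: List[float]         # rad/s, one per joint
    joint_accelerations: List[float]      # rad/s^2, one per joint
    end_effector_pose: Tuple[float, ...]  # (x, y, z, roll, pitch, yaw)

# Configuration information recorded per arm, per defined time step.
config_info = {
    ("robot_102", 1): ArmConfiguration(1, [0.0] * 6, [0.0] * 6, [0.0] * 6, (0.3, 0.1, 0.4, 0, 0, 0)),
    ("robot_118", 1): ArmConfiguration(1, [0.1] * 6, [0.0] * 6, [0.0] * 6, (0.5, -0.2, 0.4, 0, 0, 0)),
}
```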
[0055] The defined trajectories 318 may generally include defined trajectories for the robotic arms 106, 122 while in normal operation, such as when operating together in the shared physical space 138 while performing a collaborative task on the object 136. As previously discussed, a defined trajectory is a sequence of valid configurations to move an object from a source to a destination. The motion planner 312 implements a motion planning algorithm to compute a continuous path (or series of points) that connects a start configuration S and a goal configuration G, while avoiding collision with known obstacles. The robotic arms 106, 122 and obstacle geometry are described in a two-dimensional (2D) or three-dimensional (3D) workspace, while motion is represented as a path in configuration space C. A configuration describes a pose of one or both of the robotic arms 106, 122, and the configuration space C is a set of all possible configurations. A set of configurations that avoids collisions with obstacles is called free space. The complement of free space is called obstacle or forbidden space. An obstacle or forbidden space is considered a space to which the robotic arms 106, 122 cannot move. An obstacle or forbidden space is sometimes referred to herein as an “uncertainty region” or “occupied space” for one or both of the robotic arms 106, 122. Conversely, a target space is a subspace of free space that denotes a space to which the robotic arms 106, 122 can move to reach a targeted destination. A targeted destination is sometimes referred to herein as a “safety position” or “safety pose” for one or both of the robotic arms 106, 122.
[0056] In one embodiment, the safety position or safety pose could be an original planned end position for a defined trajectory 318 of a robot that it was trying to achieve before an attack. The configuration of the robot as well as the end-effector 116, 132 trajectory must be safe to avoid collisions. To accomplish this task, the robotic arms 106, 122 may take alternative trajectories, referred to as safety trajectories 320, as discussed in more detail below. The alternative trajectories are not necessarily a shortest trajectory, a more efficient trajectory or a smoother trajectory. However, the alternative trajectory should guarantee that the robot can achieve the safety position or safety pose, e.g., an originally planned end position. It is worthy to note, however, that it may not always be possible to achieve the originally planned end position for a robotic arm 106, 122 and/or end-effector 116, 132, respectively. In this case, an alternative trajectory could lead to a safety position or a safety pose which differs from the originally planned end position.
[0057] The safety trajectories 320 may generally define a series of positions between a current position for a given end-effector and a safety position for the given end-effector that moves in a target space and avoids an uncertainty region of another end-effector at a defined time step T1-TN. A safety position may comprise a resting pose or position for a robotic arm where it is outside an uncertainty region for a compromised robotic arm and therefore safe from collisions from the compromised robotic arm. As described in more detail with reference to FIG. 4, during an offline phase, the motion planner 312 may receive as input configuration information 316, computes a set of safety trajectories 320 for both robotic arms 106, 122, and stores them in a database or other data structure in persistent memory (e.g., flash memory) or external storage (e.g., a hard drive).
[0058] In general, the motion planner 312 computes safety trajectories 320 for the robotic arms 106, 122 of the robotic system 100. More particularly, for each defined time step T1-TN associated with the robotic arms 106, 122, the motion planner 312 stores a corresponding safety trajectory 320 for the robotic arms 106, 122 that avoids an uncertainty region of a compromised robotic arm. For instance, the motion planner 312 computes a first set of safety trajectories 320 for the robotic arm 106 in anticipation of the robotic arm 122 being compromised by the attacker 208. Conversely, the motion planner 312 computes a second set of safety trajectories 320 for the robotic arm 122 in anticipation of the robotic arm 106 being compromised by the attacker 208. The motion planner 312 computes the safety trajectories 320 during an offline phase, and stores the pre-computed safety trajectories 320 in a database, indexed in a manner for quick retrieval during an online phase for the robotic arms 106, 122 in the event of a security breach or incident compromising one or both robotic arms 106, 122. For instance, the safety trajectories 320 are indexed by defined time steps T1-TN. Operations for the motion planner 312 will be discussed in more detail with reference to FIG. 4.
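For illustration, an in-memory dictionary can stand in for the persistent safety-trajectory database: entries below are keyed by an arm identifier and a defined time step so the online phase can retrieve a pre-computed safety trajectory with a single lookup. The function names, keys, and waypoint values are assumptions for this sketch.

```python
# Safety-trajectory database sketch, indexed for O(1) lookup in the online phase.
# Key: (un-compromised arm id, defined time step); value: list of end-effector
# waypoints from the position at that time step to the safety position.
safety_db = {}

def store_safety_trajectory(arm_id, time_step, waypoints):
    safety_db[(arm_id, time_step)] = list(waypoints)

def lookup_safety_trajectory(arm_id, attack_detection_step):
    return safety_db[(arm_id, attack_detection_step)]

store_safety_trajectory("robot_102", 7,
                        [(0.37, 0.10, 0.40), (0.30, 0.25, 0.45), (0.10, 0.40, 0.50)])
print(lookup_safety_trajectory("robot_102", 7))
```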
[0059] In general, the motion controller 314 assists the controllers 204, 206 in controlling movement for the robotic arms 106, 122. When the robotic safety system 214 receives an alarm 210 from the IDS 202, the motion controller 314 begins robotic recovery operations or decisions to ensure safe operations for one or both of the robots 102, 118 and corresponding robotic arms 106, 122, respectively, while under attack. One example of robotic recovery operations may include controlling movement for the robotic arms 106, 122 along pre-computed safety trajectories 320 to designated safety positions. In response to an alarm 210, the motion controller 314 identifies which of the robotic arms 106, 122 is currently under attack, acquires its last good known position, and retrieves a safety trajectory 320 for the un-compromised robotic arm 106, 122. For instance, the motion controller 314 may identify the robotic arm 122 as compromised and under attack from information contained within the alarm 210. The motion controller 314 can retrieve a safety trajectory 320 for the robotic arm 106 which is un-compromised and at risk of damage by the compromised robotic arm 122. The motion controller 314 can send the retrieved safety trajectory 320 for the un-compromised robotic arm 106 to the controller 204 for the un-compromised robotic arm 106. The controller 204 then causes the un-compromised robotic arm 106 to deviate from its normal defined trajectory 318 and move along the received safety trajectory 320 to guide the un-compromised robotic arm 106 to a safety position. Likewise, the motion controller 314 performs similar operations for the robotic arm 122 when the robotic arm 106 is compromised.
[0060] The motion controller 314 may cooperatively operate with the controllers 204, 206, or in some instances, completely take over operations for the robots 102, 118 and corresponding robotic arms 106, 122. In the former case, the motion controller 314 can send a retrieved safety trajectory 320 or instructions for the retrieved safety trajectory 320 to the appropriate controller 204, 206 for execution by the receiving controller 204, 206. In the latter case, the motion controller 314 can have authority (e.g., root or administrator permissions) to completely bypass the controllers 204, 206 and take direct operational control of the robotic arms 106, 122. In this case, the motion controller 314 causes the robotic arm 106 or the robotic arm 122 to deviate from its normal defined trajectory 318 and move along the received safety trajectory 320 to guide the robotic arm 106 or the robotic arm 122 to a predetermined safety position. This may be advantageous when the robotic safety system 214 is uncertain whether a controller 204, 206 of an un-compromised robotic arm 106, 122 has also been compromised. Operations for the motion controller 314 will be discussed in more detail with reference to FIG. 5.
[0061] Operations for the disclosed embodiments may be further described with reference to the following figures. Some of the figures may include a logic flow. Although such figures presented herein may include a particular logic flow, it can be appreciated that the logic flow merely provides an example of how the general functionality as described herein can be implemented. Further, a given logic flow does not necessarily have to be executed in the order presented unless otherwise indicated. Moreover, not all acts illustrated in a logic flow may be required in some embodiments. In addition, the given logic flow may be implemented by a hardware element, a software element executed by a processor, or any combination thereof. The embodiments are not limited in this context.
[0062] FIG. 4 illustrates an embodiment of a logic flow 400. The logic flow 400 may be representative of some or all of the operations executed by one or more embodiments described herein. For example, the logic flow 400 may include some or all of the operations performed by the motion planner 312 to generate or compute safety trajectories 320. Embodiments are not limited in this context.
[0063] The logic flow 400 illustrates a series of general operational blocks 402 to 408. In block 402, logic flow 400 receives configuration information for a first robot with a first robotic arm and a first end-effector at a defined time step. In block 404, logic flow 400 receives an uncertainty region for a second robot with a second robotic arm and a second end-effector, the uncertainty region to comprise a three-dimensional shape that defines boundaries for a reach envelope of the second end-effector at the defined time step. In block 406, logic flow 400 determines a safety trajectory for the first end-effector based on the uncertainty region, the safety trajectory to define a series of positions between a current position for the first end-effector and a safety position for the first end-effector that avoids the uncertainty region of the second end-effector at the defined time step. In block 408, logic flow 400 stores the safety trajectory in a database.
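A minimal Python sketch of blocks 402 to 408 follows, assuming the uncertainty region is a sphere and using a crude straight-line-plus-detour rule in place of a full motion planning algorithm. The function plan_safety_trajectory and its parameters are illustrative assumptions, not the planner of the embodiments.

```python
import math

def plan_safety_trajectory(current_pos, safety_pos, region_center, region_radius,
                           n_points=20, margin=0.05):
    """Blocks 402-408 in miniature: produce a waypoint list from the current position
    to the safety position that stays outside a spherical uncertainty region
    (a crude stand-in for a full motion planner)."""
    waypoints = []
    for i in range(n_points + 1):
        t = i / n_points
        # Candidate waypoint on the straight line from current to safety position.
        p = [c + t * (s - c) for c, s in zip(current_pos, safety_pos)]
        d = math.dist(p, region_center)
        if d < region_radius + margin:
            # Inside the forbidden space: push the waypoint radially out of the sphere.
            scale = (region_radius + margin) / max(d, 1e-9)
            p = [rc + (pc - rc) * scale for pc, rc in zip(p, region_center)]
        waypoints.append(tuple(p))
    return waypoints

# Block 408 would then store the result keyed by (arm id, time step), as sketched earlier.
traj = plan_safety_trajectory(current_pos=(0.35, 0.10, 0.40),
                              safety_pos=(-0.20, 0.45, 0.40),
                              region_center=(0.10, 0.25, 0.40),
                              region_radius=0.15)
print(traj[:3])
```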
[0064] It is worthy to note that although certain embodiments discuss a safety trajectory for end-effectors 116, 132 that avoids an uncertainty region for target robots 102, 118 or compromised robots 102, 118, it may be appreciated that motion planner 312 plans the safety trajectory so that all parts of all configurations for the robots 102, 118 avoid the uncertainty region as well. This includes, for example, joints, links, rotators, different types of end-effectors, and any other movable parts of the robots 102, 118. In other words, the configuration space taken by an un-compromised robot must also not intersect an uncertainty region of a target robot or compromised robot.
[0065] During an offline phase, the motion planner 312 may implement the logic flow 400 to compute safety trajectories 320 for both of the robotic arms 106, 122. For instance, the motion planner 312 may receive configuration information 316 for a first robot 102 with a first robotic arm 106 and a first end-effector 116 at a defined time step T1-TN, where N represents any positive integer. The motion planner 312 may also receive configuration information 316 for a second robot 118 with a second robotic arm 122 and a second end-effector 132 at the defined time step T1-TN.
[0066] The configuration information 316 for the second robot 118 may include, for example, a last good configuration for the second robot 118 at every possible time of attack. The configuration information 316 for the second robot 118 may further include, for example, an uncertainty region for a second robot 118 with a second robotic arm 122 and a second end-effector 132. The uncertainty region may comprise a three-dimensional (3D) shape or subspace in Cartesian space that defines boundaries for a reach envelope of the second end-effector 132 at the given defined time step T1-TN. The motion planner 312 may determine a safety trajectory 320 for the first end-effector 116 based on the uncertainty region. The safety trajectory may define a series of positions between a current position for the first end-effector 116 and a safety position for the first end-effector 116 that avoids the uncertainty region of the second end-effector 132 at the defined time step T1-TN. The motion planner 312 may store the safety trajectory in the third database with the other safety trajectories 320. The motion planner 312 may repeat the logic flow 400 for each of the robotic arms 106, 122 at each of the defined time steps T1-TN until all possible safety trajectories 320 are computed for all the defined time steps T1-TN.
[0067] In various embodiments, the motion planner 312 may implement additional operations associated with computing safety trajectories 320 for the robotic arms 106, 122. For example, the motion planner 312 may further compute safety trajectories 320 using configuration information 316, where each defined time step T1-TN is associated with a defined position, pose or configuration for the first end-effector 116 and the second end-effector 132.
[0068] The motion planner 312 may further compute safety trajectories 320 using configuration information 316 where the configuration information includes a set of parameters for the first robotic arm 106 at a defined time step T1-TN, the set of parameters including a joint position state for each joint of the first robotic arm 106, a joint velocity state for each joint of the first robotic arm 106, and a joint acceleration state for each joint of the first robotic arm 106.
[0069] The motion planner 312 may further compute safety trajectories 320 using configuration information 316 where the configuration information 316 includes a set of parameters for the first robotic arm 106 at a defined time step T1-TN, the set of parameters includes an initial kinematic frame and a final kinematic frame for the first robotic arm 106 at the defined time step T1-TN.
[0070] The motion planner 312 may further compute safety trajectories 320 by determining an uncertainty region for the second robot 118 with the second robotic arm 122 and the second end-effector 132 based on configuration information 316 for the second robot 118, where the configuration information 316 includes a set of parameters for the second robotic arm 122 at a defined time step T1-TN, the set of parameters includes a joint position state for each joint of the second robotic arm 122, a joint velocity state for each joint of the second robotic arm 122, and a joint acceleration state for each joint of the second robotic arm 122.
[0071] The motion planner 312 may further compute safety trajectories 320 by adapting a safety trajectory 320 to ensure movement of each joint of the first robotic arm 106 remains within joint acceleration parameters.
[0072] The motion planner 312 may further compute safety trajectories 320 by smoothing a safety trajectory 320 to ensure movement of the first robotic arm 106 remains within acceleration parameters along inflection regions.
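One simple way such smoothing might be approximated is sketched below: interior waypoints of a one-dimensional position profile are relaxed toward the average of their neighbors until the finite-difference acceleration falls within a limit. The function name and the relaxation scheme are assumptions for illustration, not the smoothing method of the embodiments.

```python
def smooth_within_accel_limit(positions, dt, max_accel, max_iters=500, alpha=0.25):
    """Relax interior waypoints of a 1-D position profile (one joint or axis, sampled
    every dt seconds) until the finite-difference acceleration stays within max_accel."""
    q = list(positions)
    if len(q) < 3:
        return q
    for _ in range(max_iters):
        accels = [(q[i - 1] - 2.0 * q[i] + q[i + 1]) / dt ** 2 for i in range(1, len(q) - 1)]
        if max(abs(a) for a in accels) <= max_accel:
            break
        for i in range(1, len(q) - 1):
            # Pull each interior waypoint toward the average of its neighbours,
            # which shrinks the local second difference (the acceleration).
            q[i] += alpha * ((q[i - 1] + q[i + 1]) / 2.0 - q[i])
    return q

print(smooth_within_accel_limit([0.0, 0.0, 0.4, 0.9, 1.0, 1.0], dt=0.1, max_accel=30.0))
```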
[0073] The motion planner 312 may further compute safety trajectories 320 by regularizing a safety trajectory 320 for safe-trajectory branching and management.
[0074] The motion planner 312 may further compute safety trajectories 320 by receiving multiple uncertainty regions for the second robot 118 with the second robotic arm 122 and the second end-effector 132, each uncertainty region associated with a position of the second end-effector 132 at a defined time step T1-TN within a series of defined time steps T1-TN.
[0075] The motion planner 312 may further compute safety trajectories 320 by determining multiple safety trajectories 320 for the first end-effector 116 based on multiple uncertainty regions for the second robot 118 with the second robotic arm 122 and the second end-effector 132, each uncertainty region associated with a position of the second end-effector 132 at a defined time step within a series of defined time steps.
[0076] The motion planner 312 may further compute safety trajectories 320 by storing multiple safety trajectories 320 for the first end-effector 116 based on multiple uncertainty regions for the second robot 118 with the second robotic arm 122 and the second end-effector 132 in a database.
[0077] The motion planner 312 may further compute safety trajectories 320 and store and index the safety trajectories 320 for the first end-effector 116 in the database using defined time steps T1-TN and associated positions of the first end-effector 116 at each defined time step T1-TN.
[0078] FIG. 5 illustrates an embodiment of a logic flow 500. The logic flow 500 may be representative of some or all of the operations executed by one or more embodiments described herein. For example, the logic flow 500 may include some or all of the operations performed by the motion controller 314 to retrieve and execute a safety trajectory 320 for one or both robotic arms 106, 122. Embodiments are not limited in this context.
[0079] The logic flow 500 illustrates a series of general operational blocks 502 to 514. In block 502, logic flow 500 monitors a first controller for a first robot with a first robotic arm, the first controller to control movement of a first end-effector of the first robotic arm between a series of positions along a first defined trajectory, each position associated with a defined time step. In block 504, logic flow 500 monitors a second controller for a second robot with a second robotic arm, the second controller to control movement of a second end-effector of the second robotic arm between a series of positions along a second defined trajectory, each position associated with a defined time step. In block 506, logic flow 500 detects a cyberattack on the second controller for the second robot. In block 508, logic flow 500 identifies a defined time step corresponding to a time of attack detection, the identified defined time step to comprise an attack detection time step. In block 510, logic flow 500 determines a current position for the first end-effector from among the series of positions along the first defined trajectory based on the attack detection time step. In block 512, logic flow 500 retrieves a safety trajectory from among multiple safety trajectories stored in a database based on the current position for the first end-effector. The safety trajectory defines a series of positions between the current position for the first end-effector (and any other movable parts of a non-compromised robot) and a safety position for the first end-effector (and any other movable parts of a non-compromised robot) that avoids an uncertainty region of the second end-effector. In block 514, logic flow 500 sends control information to the first controller to cause the first end-effector to move from the current position to the safety position along the safety trajectory in order to prevent a collision with the second end-effector.
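For illustration, the Python sketch below strings blocks 506 to 514 together, assuming the time-step boundaries are known, the defined trajectory is a list of per-step end-effector positions, and the safety trajectories are keyed as in the earlier database sketch. ControllerStub and on_attack_alarm are hypothetical names introduced only for this example.

```python
import bisect

class ControllerStub:
    """Stand-in for the controller interface; a real system would stream these
    set-points to the arm's servo loop."""
    def execute(self, waypoints):
        print("following safety trajectory:", waypoints)

def on_attack_alarm(detect_time, step_times, defined_trajectory, safety_db,
                    arm_id, controller):
    """Online-phase reaction to an IDS alarm for the other arm (blocks 506-514).

    detect_time        -- time of attack detection reported by the IDS (seconds)
    step_times         -- sorted start times of the defined time steps T1..TN
    defined_trajectory -- planned positions of the un-compromised end-effector, per step
    safety_db          -- pre-computed safety trajectories keyed by (arm_id, step)
    """
    # Block 508: map the detection time onto a defined time step.
    step = bisect.bisect_right(step_times, detect_time)        # 1-based step index
    # Block 510: current position of the un-compromised end-effector at that step.
    current_pos = defined_trajectory[step - 1]
    # Block 512: retrieve the pre-computed safety trajectory for that position.
    safety_traj = safety_db[(arm_id, step)]
    # Block 514: hand the safety trajectory to the arm's controller.
    controller.execute(safety_traj)
    return current_pos, safety_traj

step_times = [k * 0.01 for k in range(10)]                      # 10 ms steps
traj = [(0.30 + 0.01 * k, 0.10, 0.40) for k in range(10)]
db = {("robot_102", 4): [(0.33, 0.10, 0.40), (0.20, 0.30, 0.45)]}
on_attack_alarm(0.034, step_times, traj, db, "robot_102", ControllerStub())
```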
[0080] During an online phase, the motion controller 314 can assist the controllers 204, 206 in controlling movement for the robotic arms 106, 122. For instance, the motion controller 314 monitors a first controller 204 for a first robot 102 with a first robotic arm 106, the first controller 204 to control movement of a first end-effector 116 of the first robotic arm 106 between a series of positions along a first defined trajectory 318, each position associated with a defined time step T1-TN. The motion controller 314 monitors a second controller 206 for a second robot 118 with a second robotic arm 122, the second controller 206 to control movement of a second end-effector 132 of the second robotic arm 122 between a series of positions along a second defined trajectory 318, each position associated with a defined time step T1-TN. The motion controller 314 detects a cyberattack on the second controller 206 for the second robot 118. The motion controller 314 identifies a defined time step T1-TN corresponding to a time of attack detection, the identified defined time step T1-TN to comprise an attack detection time step. The motion controller 314 determines a current position for the first end-effector 116 from among the series of positions along the first defined trajectory 318 based on the attack detection time step. The motion controller 314 retrieves a safety trajectory 320 from among multiple safety trajectories 320 stored in a database based on the current position for the first end-effector 116, the safety trajectory 320 to define a series of positions between the current position for the first end-effector 116 and a safety position for the first end-effector 116 that avoids an uncertainty region of the second end-effector 132. The motion controller 314 sends control information to the first controller 204 to cause the first end-effector 116 to move from the current position to the safety position along the safety trajectory 320 in order to prevent a collision with the second end-effector 132.
[0081] In various embodiments, the motion controller 314 can implement additional operations associated with controlling movement for the robotic arms 106, 122. For example, the motion controller 314 can assist the controllers 204, 206 in controlling movement for the robotic arms 106, 122, where the first robot 102 and the second robot 118 are collaborative robots arranged to move in a shared physical space 138 in order to execute tasks to accomplish shared goals on an object 136.
[0082] The motion controller 314 can assist the controllers 204, 206 in controlling movement for the robotic arms 106, 122 where the first robotic arm 106 is a mechanical arm with one or more links to form a kinematic chain, each link connected by a joint allowing rotational motion or translational displacement, and the first end-effector 116 is a terminus for the kinematic chain.
[0083] The motion controller 314 can assist the controllers 204, 206 in controlling movement for the robotic arms 106, 122 where the second robotic arm 122 is a mechanical arm with one or more links to form a kinematic chain, each link connected by a joint allowing rotational motion or translational displacement, and the second end-effector 132 is a terminus for the kinematic chain.
[0084] The motion controller 314 can assist the controllers 204, 206 in controlling movement for the robotic arms 106, 122 where the first defined trajectory 318-1 for the first end-effector 116 and the second defined trajectory 318-2 for the second end-effector 132 are arranged to avoid collisions between the first end-effector 116 and the second end-effector 132 when moving within a shared physical space 138.
[0085] The motion controller 314 can assist the controllers 204, 206 in controlling movement for the robotic arms 106, 122 by detecting the cyberattack on the second controller 206 for the second robot 118 during an uncertainty time period, the uncertainty time period to comprise a time interval between a time of attack and a time of attack neutralization.
[0086] The motion controller 314 can assist the controllers 204, 206 in controlling movement for the robotic arms 106, 122 by comparing the time of attack detection to each time step associated with each position in the first defined trajectory 318-1 until a match is found to identify the attack detection time step.
[0087] The motion controller 314 can assist the controllers 204, 206 in controlling movement for the robotic arms 106, 122 by retrieving the current position for the first end-effector 116 from among the series of positions along the first defined trajectory 318-1 based on the attack detection time step from a database.
[0088] The motion controller 314 can assist the controllers 204, 206 in controlling movement for the robotic arms 106, 122 where the uncertainty region includes a three-dimensional shape that defines boundaries for a reach envelope of the second end-effector 132 after the attack detection time step.
[0089] The motion controller 314 can assist the controllers 204, 206 in controlling movement for the robotic arms 106, 122 where an uncertainty region is generated for each position associated with a time step for the second end-effector 132 of the second robotic arm 122, and stored in the database before the time of attack detection.
[0090] The motion controller 314 can assist the controllers 204, 206 in controlling movement for the robotic arms 106, 122 by causing execution of the control information by the first controller 204 to cause the first end-effector 116 to move from the current position to the safety position along the safety trajectory 320 in order to prevent a collision with the second end-effector 132.
[0091] The motion controller 314 can cease assisting the controllers 204, 206 in controlling movement for the robotic arms 106, 122 by receiving notification that the detected cyberattack on the second controller 206 for the second robot 118 has been neutralized.
[0092] Various operating environments and specific use-case examples for the robotic safety system 214 of the robotic system 100 will be described with reference to FIGS. 6-16. [0093] FIG. 6 depicts an example of an operating environment 600 representing normal operations for the robotic system 100. As shown in FIG. 6, the robots 102, 118 of the robotic system 100 each operate in an independent and cooperative manner. During an online phase, the robotic arm 106 of the robot 102 follows a defined trajectory 318-1 while the robotic arm 122 of the robot 118 follows a defined trajectory 318-2. At each defined time step T1-TN the end-effector 116 of the robotic arm 106 is at a different position along the defined trajectory 318-1. Similarly, at each defined time step T1-TN the end-effector 132 of the robotic arm 122 is at a different position along the defined trajectory 318-2. The defined trajectories 318-1, 318-2 may intersect within the shared physical space 138 as each robotic arm 106, 122 works from a start configuration S towards a goal configuration G. During normal operations, the motion planner 312 carefully plans and generates the defined trajectories 318-1, 318-2 to avoid collisions when the robotic arms 106, 122 are moving within the shared physical space 138.
[0094] As the robotic arms 106, 122 each follow their respective defined trajectories 318-1, 318-2, each configuration (or pose) of the robotic arms 106, 122 is associated with a defined time step T1-TN. For instance, in the specific example shown in FIG. 6 using defined time steps denoted as t1-tN, the robotic arm 106 is in a first configuration (or pose) at a first defined time step t1, a second configuration at time step t2, and so forth. Similarly, the robotic arm 122 is in a first configuration (or pose) at a first defined time step t1, a second configuration at time step t2, and so forth. As such, the controllers 204, 206 may determine a known configuration for each of the robotic arms 106, 122, including a known position for the end-effectors 116, 132, respectively, by referencing a given defined time step t1-tN. [0095] FIG. 7A depicts an operating environment 700a representing operations of the robotic system 100 at a start of a security attack. Assume at an attack time tA, the attacker 208 successfully compromises and takes control of the controller 206 for the robot 118. Further assume that the attack time tA corresponds with a defined time step t1-tN. In the operating environment 700a, the robotic arms 106, 122 are in respective configurations to avoid collisions with each other.
[0096] FIG. 7B depicts an operating environment 700b representing operations of the robotic system 100 when a security attack is detected. Sometime after the attack time tA, the IDS 202 detects the attack on the controller 206 for the robot 118 at a detect time tD. The IDS 202 issues an alarm 210 to the robotic safety system 214 and/or the controller 204. As with the attack time tA, the detect time tD corresponds with a defined time step t1-tN. In the operating environment 700b, the robotic arms 106, 122 are in respective configurations that could potentially cause collisions with each other.
[0097] FIG. 7C depicts an operating environment 700c representing operations of the robotic system 100 when a security attack is neutralized. Sometime after the detect time tD, the IDS 202 neutralizes the attack on the controller 206 for the robot 118 at a neutralization time tN. As with the attack time tA and the detect time tD, the neutralization time tN corresponds with a defined time step t1-tN (not necessarily the last defined time step TN). In the operating environment 700c, the robotic arms 106, 122 have collided with each other even though the security attack has been neutralized. Consequently, the robotic safety system 214 needs to operate to secure the robot 102 and corresponding robotic arm 106 during at least the time period between tA and tN.
[0098] FIG. 8 depicts an operating environment 800. The operating environment 800 illustrates an uncertainty region 802 for the compromised robotic arm 122 of the robot 118 during an uncertainty time period between tA and tN. As shown in FIG. 8, assume an attack time tA for an attack on the robot 118 occurs at a defined time step t5. After the defined time step t5, the robotic safety system 214 is uncertain as to the exact position of the end-effector 132 for the compromised robotic arm 122. However, the robotic safety system 214 is aware of a last known position, configuration or pose for the robotic arm 122 at the defined time step t5 since it has access to the defined trajectory 318-2 for the robotic arm 122 stored in a database. Based on this information, the motion planner 312 can use configuration information 316 for the compromised robotic arm 122 at defined time step t5 to calculate an uncertainty region 802 for the robotic arm 122 after the defined time step t5. The uncertainty region 802 comprises a three-dimensional (3D) shape that defines estimated boundaries for a reach envelope of the second end-effector 132 after the given defined time step t5.
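By way of a non-limiting illustration, the following Python sketch shows one possible way such an uncertainty region could be approximated offline as a bounding sphere grown from the last known end-effector position. The link reach, maximum end-effector speed, and helper names are illustrative assumptions and are not taken from the present disclosure.

```python
import numpy as np

def uncertainty_region(last_ee_position, max_ee_speed, elapsed_time, link_reach):
    """Approximate the reach envelope of a compromised end-effector as a sphere.

    The sphere is centered on the last known end-effector position (taken from
    the stored defined trajectory at the attack time step); its radius is the
    lesser of (a) how far the end-effector could travel at its maximum speed in
    the elapsed time and (b) the full kinematic reach of the arm.
    """
    radius = min(max_ee_speed * elapsed_time, link_reach)
    return {"center": np.asarray(last_ee_position, dtype=float), "radius": radius}

def point_in_region(point, region):
    """Return True if a 3D point lies inside the uncertainty region."""
    return np.linalg.norm(np.asarray(point, dtype=float) - region["center"]) <= region["radius"]

# Example: last known position at time step t5, 1.0 m/s max speed, 0.5 s horizon.
region = uncertainty_region([0.4, 0.2, 0.3], max_ee_speed=1.0, elapsed_time=0.5, link_reach=0.85)
print(point_in_region([0.5, 0.2, 0.3], region))  # True: too close, must be avoided
```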
[0099] As previously discussed, during an offline phase, the motion planner 312 previously computes the uncertainty region 802 for the compromised robotic arm 122 using configuration information 316 for the compromised robotic arm 122 at defined time step t5, and then computes a safety trajectory 320-1 for the un-compromised robotic arm 106 of the robot 102 based on the uncertainty region 802, which is subsequently stored in a database. During an online phase, the motion controller 314, either directly or indirectly, causes the un-compromised robotic arm 106 to deviate from a current position along the defined trajectory 318-1 to a safety position 804 along the safety trajectory 320-1, which is pre- computed to avoid the uncertainty region 802 of the compromised robotic arm 122 of the compromised robot 118. Since all safety trajectories 320 are computed during the offline phase and stored in a database, the motion controller 314 can quickly retrieve and execute the safety trajectory 320-1 during real-time operation of the robots 102, 118.
[0100] FIG. 9 depicts an operating environment 900 for the robotic system 100. As shown in the operating environment 900, during an offline phase 902, the motion planner 312 for the robotic safety system 214 computes a series of safety trajectories 320-1, 320-2, 320-3, 320-4 and 320-5 at corresponding defined time steps t1, t2, t3, t4 and t5, respectively, for the robotic arm 106 of the robot 102. The safety trajectories 320-1 to 320-5 are stored in a database accessible by the robotic safety system 214. The safety trajectories 320-1 to 320-5 are indexed in the database for quick retrieval by the motion controller 314 of the robotic safety system 214. For instance, the safety trajectories 320-1 to 320-5 may be indexed in the database by a position, pose or configuration for the robotic arm 106 at a corresponding defined time step t1-t5, respectively. Each of the safety trajectories 320-1 to 320-5 represents an alternative trajectory for the robotic arm 106 from a normal defined trajectory 318-1. More particularly, each of the safety trajectories 320-1 to 320-5 represents an alternative trajectory that starts at a current position for the end-effector 116 of the robotic arm 106 at a defined time step t1-t5 and ends at a safety position 804 for the end-effector 116 of the robotic arm 106.
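A minimal sketch of how pre-computed safety trajectories might be stored and indexed for fast retrieval follows. The dictionary keyed by defined time step and the representation of a trajectory as a list of joint tuples are illustrative assumptions rather than the disclosed database layout.

```python
# Offline phase: store one pre-computed safety trajectory per defined time step.
# Each trajectory is a list of joint-position tuples ending at the safety position.
safety_db = {
    "t1": [(0.00, 0.10, 0.20, 0.0, 0.0, 0.0), (0.05, 0.12, 0.22, 0.0, 0.0, 0.0)],
    "t2": [(0.10, 0.15, 0.25, 0.0, 0.0, 0.0), (0.15, 0.18, 0.27, 0.0, 0.0, 0.0)],
    # t3, t4, t5 would be computed the same way against their uncertainty regions
}

def retrieve_safety_trajectory(db, time_step):
    """Online phase: constant-time lookup of the safety trajectory indexed at the
    defined time step where the attack was detected."""
    return db.get(time_step)

trajectory = retrieve_safety_trajectory(safety_db, "t2")
if trajectory is not None:
    print(f"Deviation starts from joint configuration {trajectory[0]}")
```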
[0101] As shown in FIG. 9, at defined time steps t1, t2 the safety trajectories 320-1, 320-2, respectively, both easily bypass the uncertainty region 802. The safety trajectories 320-3, 320-4 are closer to the uncertainty region 802, and after deviation from the defined trajectory 318-1 at defined time steps t3, t4, respectively, resume following the defined trajectory 318-1 at defined time steps t8, t7, respectively. The safety trajectory 320-5 at defined time step t5 passes more closely to the uncertainty region 802; had the robotic arm 106 continued along the defined trajectory 318-1, it would have traveled directly through the uncertainty region 802 at defined time step t6, thereby leading to potential collisions with the robotic arm 122.
[0102] FIG. 10 depicts an operating environment 1000 for the robotic system 100. As shown in the operating environment 1000, the robotic arm 106 of the robot 102 follows a defined trajectory 318-1 during normal operations. In preparation for a security attack, the motion planner 312 of the robotic safety system 214 generates safety trajectories 320 for the robotic arm 106. To generate the safety trajectories 320, the motion planner 312 considers uncertainty regions 802 for the robotic arm 122 of the robot 118 for each defined time step T1-TN. Each resulting safety trajectory 320 is a discrete collection of tuples of joint positions at given times, or time deltas, for its instantiation. The safety trajectories 320 fulfill position, velocity and acceleration limits, are collision-free with respect to a corresponding uncertainty region 802, denoted as occupied space C in FIG. 10, and contain no self-collisions. A regular distribution of these samples is ensured with respect to the task-space.
[0103] The motion planner 312 uses kinematics to calculate a motion for the robotic arms 106, 122. Geometric transformations, also called rigid transformations, are used to describe movement of components of the robotic arms 106, 122, thereby simplifying the derivation of equations of motion. Using the robotic arm 106 as an example, the motion planner 312 implements a motion planning algorithm which takes as input a set of configuration information 316 for the robotic arm 106 that includes a set of joint limits, an initial kinematic frame, a final kinematic frame, direct kinematics, inverse kinematics φ, and an occupied space C.
[0104] As shown in FIG. 10, the robot 102 has a set of joints J0-J5. The joints J0-J5 may have associated joint limits represented as minimum and maximum values for [J(t), J'(t), J"(t)], where J(t) represents a joint position state, which is an angular value from encoders of all joints of the robot at time t whose dimension is the degrees of freedom of the robot 102, J'(t) represents a joint velocity state, which is an angular derivative (with respect to time) computed by the encoders of all joints of the robot 102 at time t, and J"(t) represents a joint acceleration state, which is an angular second derivative (with respect to time) computed by the encoders of all joints of the robot at time t.
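The joint limit description above can be illustrated with a short sketch that checks a sampled joint state against per-joint position, velocity, and acceleration limits; the field names and numeric limits below are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class JointLimits:
    pos_min: float   # lower bound on J(t), radians
    pos_max: float   # upper bound on J(t), radians
    vel_max: float   # bound on |J'(t)|, rad/s
    acc_max: float   # bound on |J''(t)|, rad/s^2

def within_limits(position, velocity, acceleration, limits):
    """Return True if a joint's sampled state respects its configured limits."""
    return (limits.pos_min <= position <= limits.pos_max
            and abs(velocity) <= limits.vel_max
            and abs(acceleration) <= limits.acc_max)

# One entry per joint J0-J5 of the arm (illustrative values).
limits = [JointLimits(-3.14, 3.14, 2.0, 4.0) for _ in range(6)]
print(within_limits(0.5, 1.2, 3.0, limits[0]))  # True
```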
[0105] The robot 102 also has a robot base frame 1002 and a robot end-effector frame 1004. The robot base frame 1002 is a kinematic frame in SE3 defined as a robot base or first element of a kinematic chain. The robot end-effector frame 1004 is a kinematic frame in SE3 defined as a last element of the kinematic chain. A frame at time ta is a time-variant rigid transformation in SE3 representing a mapping from the robot base frame 1002 (frame B) to the robot end-effector frame 1004 (frame E) at time ta. A frame at time tb is a time-variant rigid transformation in SE3 representing a mapping from the robot base frame 1002 (frame B) to the robot end-effector frame 1004 (frame E) at time tb.
[0106] The uncertainty region 802, denoted as an occupied space C, is defined as an occupied space represented in three dimensions (3D), where C ⊂ R3, which is a subspace where the robotic arm 106 should not be located at any point along a safety trajectory 320.
[0107] The direct kinematics is defined as a forward-kinematic function transform of a joint position state J to a kinematic frame.
[0108] The inverse kinematics φ is defined as an inverse-kinematic function transform of a kinematic frame to its generating joint position state J, or empty if the frame is not reachable.
[0109] In one embodiment, for example, the motion planner 312 can use a rapid-exploration random trees (RRT) algorithm to generate a safety trajectory 320 as a collision-free trajectory that avoids the occupied space C.
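As a hedged sketch of how an RRT-style planner could produce such a collision-free joint-space path, the following Python fragment grows a tree of joint configurations while rejecting any configuration flagged by a collision test against the occupied space C. The step size, goal bias, and collision-test callable are simplifying assumptions and do not reproduce the disclosed algorithm.

```python
import random
import numpy as np

def rrt_safety_path(start_q, goal_q, in_collision, joint_lo, joint_hi,
                    step=0.1, max_iters=5000, goal_tol=0.05):
    """Grow a rapidly-exploring random tree in joint space from start_q toward
    goal_q, rejecting any configuration for which in_collision(q) is True
    (e.g., forward kinematics places a link inside the occupied space C)."""
    nodes = [np.asarray(start_q, dtype=float)]
    parents = {0: None}
    goal = np.asarray(goal_q, dtype=float)
    for _ in range(max_iters):
        sample = goal if random.random() < 0.1 else np.random.uniform(joint_lo, joint_hi)
        nearest = min(range(len(nodes)), key=lambda i: np.linalg.norm(nodes[i] - sample))
        direction = sample - nodes[nearest]
        new_q = nodes[nearest] + step * direction / (np.linalg.norm(direction) + 1e-9)
        if in_collision(new_q):
            continue
        nodes.append(new_q)
        parents[len(nodes) - 1] = nearest
        if np.linalg.norm(new_q - goal) < goal_tol:
            path, i = [], len(nodes) - 1
            while i is not None:            # walk back to the root
                path.append(nodes[i])
                i = parents[i]
            return list(reversed(path))
    return None  # no safety trajectory found for this time step
```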
[0110] In one embodiment, for example, the motion planner 312 can also apply a trajectory adapting, smoothing and regularization algorithm A to a safety trajectory 320. The trajectory adapting, smoothing and regularization algorithm A may ensure joint velocities remain within joint limits, smooth a safety trajectory 320 to avoid accelerations higher than the limits along inflection regions, and regularize discrete samples at parametric task-space distances (translation or rotation), allowing homogeneous key-point distributions for branching, as explained further in FIG. 11.
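A minimal sketch of the adapting, smoothing and regularization step is given below, assuming a simple moving-average smoother, a velocity-limit check, and resampling of task-space key points at a fixed translation granularity dr. The algorithm A itself is not reproduced here; these helpers are illustrative only.

```python
import numpy as np

def smooth(path, window=3):
    """Moving-average smoothing of a joint-space path to soften inflection regions."""
    path = np.asarray(path, dtype=float)
    out = path.copy()
    for i in range(1, len(path) - 1):
        lo, hi = max(0, i - window // 2), min(len(path), i + window // 2 + 1)
        out[i] = path[lo:hi].mean(axis=0)
    return out

def within_velocity_limits(path, dt, vel_max):
    """Check that consecutive samples never demand joint speeds above vel_max."""
    path = np.asarray(path, dtype=float)
    return bool(np.all(np.abs(np.diff(path, axis=0)) / dt <= vel_max))

def regularize(ee_points, dr=0.01):
    """Keep task-space key points spaced at least dr apart (translation only),
    giving a homogeneous key-point distribution for branching."""
    kept = [np.asarray(ee_points[0], dtype=float)]
    for p in ee_points[1:]:
        p = np.asarray(p, dtype=float)
        if np.linalg.norm(p - kept[-1]) >= dr:
            kept.append(p)
    return kept
```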
[0111] FIG. 11 depicts a logic flow 1100. The logic flow 1100 may be representative of some or all of the operations executed by one or more embodiments described herein. For example, the logic flow 1100 may include some or all of the operations performed by the motion planner 312 to generate or compute safety trajectories 320 during the offline phase 902. Embodiments are not limited in this context.
[0112] At block 1102, in a first operation, the logic flow 1100 receives a set of configuration information 316 representing full robot and application inputs. For example, the motion planner 312 implements a motion planning algorithm which takes as input a set of configuration information 316 for the robotic arm 106 that includes a set of joint limits, an initial kinematic frame, a final kinematic frame, direct kinematics, inverse kinematics φ, and an occupied space C.
[0113] At block 1104, in a second operation, the logic flow 1100 determines whether a safety trajectory 320 exists or is possible for a robotic arm at a given defined time step and uncertainty region. If no safety trajectory 320 exists at block 1104, the motion planner 312 outputs an empty result indicating that no trajectory was found and moves to the next defined time step. If a safety trajectory 320 exists at block 1104, then control moves to block 1106.
[0114] At block 1106, in a third operation, the logic flow 1100 generates a safety trajectory 320, which is a collision-free trajectory with n samples.
[0115] At block 1108, in a fourth operation, the logic flow 1100 performs trajectory adapting, smoothing and regularization. For example, the motion planner 312 can apply a trajectory adapting, smoothing and regularization algorithm A to a safety trajectory 320. For smoothing, if the smoothed trajectory is not within acceleration limits along inflection regions, then control is passed back to block 1104 for replanning of the safety trajectory 320. For regularization, the motion planner 312 receives as input a set of parameters for regularization granularity [dr, dR], where dr represents a translation in millimeters (mm) and dR represents a rotation in radians.
[0116] At block 1110, in a fifth operation, the logic flow 1100 performs trajectory post-processing with m samples. The safety trajectory 320 is output by the motion planner 312 and stored in the database for use by the motion controller 314 during an online phase for the robotic system 100.
[0117] FIG. 12 depicts an operating environment 1200. The operating environment 1200 illustrates an online phase 1202 for the robotic system 100. As shown in FIG. 12, the robotic arm 106 traverses the defined trajectory 318-1 during normal operations. Sometime after defined time step t3, the attacker 208 initiates a cybersecurity attack on the controller 206 of the robot 118, thereby compromising movement of the robotic arm 122. The IDS 202 detects the cybersecurity attack, and sends an alarm 210 to the robotic safety system 214. The motion controller 314 retrieves a safety trajectory 320-1 for the robotic arm 106 indexed at defined time step t4 in the database. The safety trajectory 320-1 was previously generated by the motion planner 312 during the offline phase 902 based on a last known position, pose or configuration of the robotic arm 122 at defined time step t3 and associated uncertainty region 802. The motion controller 314 then causes, indirectly through the controller 204 of the robotic arm 106 or through direct control of the robotic arm 106, the robotic arm 106 to deviate from the defined trajectory 318-1 at the defined time step t4 and instead follow the safety trajectory 320-1 to safely bypass the uncertainty region 802. The robotic arm 106 traverses the safety trajectory 320-1 and arrives at a position, pose or configuration for the robotic arm 106 at the defined time step t7, where it resumes movement along the defined trajectory 318-1 to a safety position 804.
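A hedged sketch of this online handling follows: on an IDS alarm, the controller deviates onto the pre-computed safety trajectory indexed at the current time step and then resumes the defined trajectory. The controller interface (send_joint_target) and the stored data layout are assumptions made for illustration.

```python
def on_attack_alarm(current_step, defined_trajectory, safety_db, send_joint_target):
    """React to an IDS alarm raised while the arm is at `current_step`.

    1. Retrieve the safety trajectory pre-computed offline for this time step.
    2. Command the arm along it, bypassing the uncertainty region.
    3. Resume the defined trajectory from the step where the detour rejoins it.
    """
    detour = safety_db.get(current_step)
    if detour is None:
        return current_step  # no stored detour: stay on the defined trajectory
    for joint_target in detour["waypoints"]:
        send_joint_target(joint_target)           # deviate along the safety trajectory
    resume_step = detour["resume_step"]
    for joint_target in defined_trajectory[resume_step:]:
        send_joint_target(joint_target)           # continue toward the safety position
    return resume_step
```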
[0118] FIG. 13 depicts a logic flow 1300. The logic flow 1300 may be representative of some or all of the operations executed by one or more embodiments described herein. For example, the logic flow 1300 may include some or all of the operations performed by the motion controller 314 to retrieve and execute safety trajectories 320 during the online phase 1202. Embodiments are not limited in this context.
[0119] At block 1302, the logic flow 1300 receives as input a frame and joint information for the robotic arm 106 along the defined trajectory 318-1 at a detect time td for an attack. The logic flow 1300 outputs (jd, td), which represents the instantaneous joint position state at time td.
[0120] At block 1304, the logic flow 1300 rounds each value of the joint position state to an integer of m bits, then shifts and re-packs the rounded values into an integer of n = (m * Robot DoF) bits. The result is used as a query index to a database 1308. The database 1308 stores safety trajectories 320 indexed by this query index. The logic flow 1300 retrieves an indexed safety trajectory 320-1, and outputs it to the block 1306.
[0121] At block 1306, the logic flow receives the indexed safety trajectory 320-1, and outputs joint information and a frame for the robotic arm 106.
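One possible reading of the index computation in blocks 1302 and 1304 is sketched below: each joint angle of the instantaneous state (jd, td) is rounded to an m-bit integer, and the per-joint integers are shift-packed into a single n = m * DoF bit key used to query the safety-trajectory database. The scaling range and bit widths are assumptions, since the original expressions are not reproduced here.

```python
def quantize_joint(angle, m_bits, lo=-3.1416, hi=3.1416):
    """Round one joint angle to an integer of m bits over its travel range."""
    levels = (1 << m_bits) - 1
    frac = (min(max(angle, lo), hi) - lo) / (hi - lo)
    return int(round(frac * levels))

def pack_index(joint_angles, m_bits=8):
    """Shift and re-pack all quantized joints into an n = m_bits * DoF bit integer
    that serves as the query index into the safety-trajectory database."""
    key = 0
    for angle in joint_angles:
        key = (key << m_bits) | quantize_joint(angle, m_bits)
    return key

safety_db = {}  # populated offline: pack_index(start_joints) -> safety trajectory
jd = (0.1, -0.5, 1.2, 0.0, 0.7, -1.1)   # instantaneous joint state at detect time td
trajectory = safety_db.get(pack_index(jd))
```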
[0122] FIG. 14 depicts an operating environment 1400. The operating environment 1400 illustrates an example of various frames of the end-effector 116 of the robotic arm 106 as it follows the defined trajectory 318-1. It also illustrates a set of current frames at times i, j, k, l, m representing start frames for safety trajectories 320 for the end-effector 116 of the robotic arm 106 to avoid the uncertainty region 802, denoted as occupied space C, for the end-effector 132 of the robotic arm 122 at various defined time steps. A joint position to frame mapping is represented for each of these frames. Translation and rotation distances between frames are represented at a granularity defined by dr, which represents translation in mm, and dR, which represents rotation in radians. Joint positions for the end-effector 116 of the robotic arm 106 along the defined trajectory 318-1 are denoted as (jd, td), which represents an instantaneous joint position state at time td.
[0123] FIG. 15 illustrates an embodiment of a logic flow 1500. The logic flow 1500 may be representative of some or all of the operations executed by one or more embodiments described herein. For example, the logic flow 1500 may include some or all of the operations performed by the motion planner 312 to generate or compute safety trajectories 320. The logic flow 1500 presents a more general case where the motion planner 312 computes safety trajectories 320 for the entire configuration or pose of the robotic arms 106, 122, including the end-effectors, to avoid an uncertainty region 802. Embodiments are not limited in this context.
[0124] The logic flow 1500 illustrates a series of general operational blocks 1502 to 1508. In block 1502, logic flow 1500 receives configuration information for a first robot with a first robotic arm at a defined time step. In block 1504, logic flow 1500 receives an uncertainty region for a second robot with a second robotic arm, the uncertainty region to comprise a three-dimensional shape that defines boundaries for a reach envelope of the second robotic arm at the defined time step. In block 1506, logic flow 1500 determines a safety trajectory for the first robotic arm based on the uncertainty region, the safety trajectory to define a series of positions between a current position for the first robotic arm and a safety position for the first robotic arm that avoids the uncertainty region of the second robotic arm at the defined time step. In block 1508, logic flow 1500 stores the safety trajectory in a database.
[0125] By way of example, in block 1502, the motion planner 312 receives configuration information 316 for a first robot 102 with a first robotic arm 106 at a defined time step T1-TN. In block 1504, the motion planner 312 receives an uncertainty region 802 for a second robot 118 with a second robotic arm 122. The uncertainty region 802 may comprise a three-dimensional shape that defines boundaries for a reach envelope of the second robotic arm at the defined time step T1-TN. In block 1506, the motion planner 312 determines a safety trajectory 320 for the first robotic arm 106 based on the uncertainty region 802, the safety trajectory 320 to define a series of positions between a current position for the first robotic arm 106 and a safety position 804 for the first robotic arm 106 that avoids the uncertainty region 802 of the second robotic arm 122 at the defined time step T1-TN. In block 1508, the motion planner 312 stores the safety trajectory 320 in a database.
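To make this offline flow concrete, the fragment below loops over the defined time steps, obtains the uncertainty region for the second arm, plans a detour for the first arm, and stores it keyed by the time step. The helper names echo the illustrative sketches above and are assumptions rather than the claimed implementation.

```python
def offline_phase(defined_steps, arm1_config_at, arm2_region_at, plan_safety_trajectory):
    """Pre-compute one safety trajectory per defined time step (blocks 1502-1508).

    arm1_config_at(step)  -> configuration information for the first arm at that step
    arm2_region_at(step)  -> uncertainty region of the second arm at that step
    plan_safety_trajectory(config, region) -> detour avoiding the region, or None
    """
    database = {}
    for step in defined_steps:
        config = arm1_config_at(step)                    # block 1502
        region = arm2_region_at(step)                    # block 1504
        detour = plan_safety_trajectory(config, region)  # block 1506
        if detour is not None:
            database[step] = detour                      # block 1508
    return database
```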
[0126] FIG. 16 illustrates an embodiment of a logic flow 1600. The logic flow 1600 may be representative of some or all of the operations executed by one or more embodiments described herein. For example, the logic flow 1600 may include some or all of the operations performed by the motion controller 314 to retrieve and execute a safety trajectory 320 for one or both robotic arms 106, 122. The logic flow 1600 presents a more general case where the motion controller 314 retrieves and executes safety trajectories 320 for the entire configuration or pose of the robotic arms 106, 122, including the end-effectors, to avoid an uncertainty region 802. Embodiments are not limited in this context.
[0127] The logic flow 1600 illustrates a series of general operational blocks 1602 to 1614. In block 1602, logic flow 1600 monitors a first controller for a first robot with a first robotic arm, the first controller to control movement of the first robotic arm between a series of positions along a first defined trajectory, each position associated with a defined time step. In block 1604, logic flow 1600 monitors a second controller for a second robot with a second robotic arm, the second controller to control movement of the second robotic arm between a series of positions along a second defined trajectory, each position associated with a defined time step. In block 1606, logic flow 1600 detects a cyberattack on the second controller for the second robot. In block 1608, logic flow 1600 identifies a defined time step corresponding to a time of attack detection, the identified defined time step to comprise an attack detection time step. In block 1610, logic flow 1600 determines a current position for the first robotic arm from among the series of positions along the first defined trajectory based on the attack detection time step. In block 1612, logic flow 1600 retrieves a safety trajectory from among multiple safety trajectories stored in a database based on the current position for the first robotic arm, the safety trajectory to define a series of positions between the current position for the first robotic arm and a safety position for the first robotic arm that avoids an uncertainty region of the second robotic arm. In block 1614, logic flow 1600 sends control information to the first controller to cause the first robotic arm to move from the current position to the safety position along the safety trajectory in order to prevent a collision with the second robotic arm.
[0128] By way of example, in block 1602, motion controller 314 monitors a first controller 204 for a first robot 102 with a first robotic arm 106, the first controller 204 to control movement of the first robotic arm 106 between a series of positions along a first defined trajectory 318, each position associated with a defined time step T1-TN. In block 1604, motion controller 314 monitors a second controller 206 for a second robot 118 with a second robotic arm 122, the second controller 206 to control movement of the second robotic arm 122 between a series of positions along a second defined trajectory 318, each position associated with a defined time step T1-TN. In block 1606, motion controller 314 detects a cyberattack on the second controller 206 for the second robot 118. In block 1608, the motion controller 314 identifies a defined time step T1-TN corresponding to a time of attack detection, the identified defined time step T1-TN to comprise an attack detection time step. In block 1610, the motion controller 314 determines a current position for the first robotic arm 106 from among the series of positions along the first defined trajectory 318 based on the attack detection time step. In block 1612, the motion controller 314 retrieves a safety trajectory 320 from among multiple safety trajectories 320 stored in a database based on the current position for the first robotic arm 106, the safety trajectory 320 to define a series of positions between the current position for the first robotic arm 106 and a safety position 804 for the first robotic arm 106 that avoids an uncertainty region of the second robotic arm 122. In block 1614, the motion controller 314 sends control information to the first controller 204 to cause the first robotic arm 106 to move from the current position to the safety position 804 along the safety trajectory 320 in order to prevent a collision with the second robotic arm 122.
[0129] FIG. 17 illustrates computer readable storage medium 1700. Computer readable storage medium 1700 may comprise any non-transitory computer-readable storage medium or machine-readable storage medium, such as an optical, magnetic or semiconductor storage medium. In various embodiments, computer readable storage medium 1700 may comprise an article of manufacture. In some embodiments, computer readable storage medium 1700 may store computer executable instructions 1702 that circuitry (e.g., controllers 204, 206, processing circuitry 302, network interface 322, or the like) can execute. For example, computer executable instructions 1702 can include instructions to implement operations described with respect to logic flows 400, 500, 1100 and 1300. Examples of computer readable storage medium 1700 or machine-readable storage medium may include any tangible media capable of storing electronic data, including volatile memory or non-volatile memory, removable or non-removable memory, erasable or non-erasable memory, writeable or re-writeable memory, and so forth. Examples of computer executable instructions 1702 may include any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, object-oriented code, visual code, and the like.
[0130] FIG. 18 illustrates one example of a system architecture and data processing device that may be used to implement one or more illustrative aspects described herein in a standalone and/or networked environment. Various network nodes, such as the data server 1810, web server 1806, computer 1804, and laptop 1802 may be interconnected via a wide area network 1808 (WAN), such as the internet. Other networks may also or alternatively be used, including private intranets, corporate networks, LANs, metropolitan area networks (MANs), wireless networks, personal networks (PANs), and the like. Network 1808 is for illustration purposes and may be replaced with fewer or additional computer networks. A local area network (LAN) may have one or more of any known LAN topology and may use one or more of a variety of different protocols, such as Ethernet. The data server 1810, web server 1806, computer 1804, laptop 1802 and other devices (not shown) may be connected to one or more of the networks via twisted pair wires, coaxial cable, fiber optics, radio waves or other communication media.
[0131] Computer software, hardware, and networks may be utilized in a variety of different system environments, including standalone, networked, remote-access (aka, remote desktop), virtualized, and/or cloud-based environments, among others.
[0132] The term "network" as used herein and depicted in the drawings refers not only to systems in which remote storage devices are coupled together via one or more communication paths, but also to stand-alone devices that may be coupled, from time to time, to such systems that have storage capability. Consequently, the term "network" includes not only a "physical network" but also a "content network," which is comprised of the data, attributable to a single entity, which resides across all physical networks.
[0133] The components may include data server 1810, web server 1806, and client computer 1804, laptop 1802. Data server 1810 provides overall access, control and administration of databases and control software for performing one or more illustrative aspects described herein. Data server 1810 may be connected to web server 1806 through which users interact with and obtain data as requested. Alternatively, data server 1810 may act as a web server itself and be directly connected to the internet. Data server 1810 may be connected to web server 1806 through the network 1808 (e.g., the internet), via direct or indirect connection, or via some other network. Users may interact with the data server 1810 using remote computer 1804, laptop 1802, e.g., using a web browser to connect to the data server 1810 via one or more externally exposed web sites hosted by web server 1806. Client computer 1804, laptop 1802 may be used in concert with data server 1810 to access data stored therein, or may be used for other purposes. For example, from client computer 1804, a user may access web server 1806 using an internet browser, as is known in the art, or by executing a software application that communicates with web server 1806 and/or data server 1810 over a computer network (such as the internet). [0134] Servers and applications may be combined on the same physical machines, and retain separate virtual or logical addresses, or may reside on separate physical machines. FIG. 18 illustrates just one example of a network architecture that may be used, and those of skill in the art will appreciate that the specific network architecture and data processing devices used may vary, and are secondary to the functionality that they provide, as further described herein. For example, services provided by web server 1806 and data server 1810 may be combined on a single server.
[0135] Each component data server 1810, web server 1806, computer 1804, laptop 1802 may be any type of known computer, server, or data processing device. Data server 1810, e.g., may include a processor 1812 controlling overall operation of the data server 1810. Data server 1810 may further include RAM 1816, ROM 1818, network interface 1814, input/output interfaces 1820 (e.g., keyboard, mouse, display, printer, etc.), and memory 1822. Input/output interfaces 1820 may include a variety of interface units and drives for reading, writing, displaying, and/or printing data or files. Memory 1822 may further store operating system software 1824 for controlling overall operation of the data server 1810, control logic 1826 for instructing data server 1810 to perform aspects described herein, and other application software 1828 providing secondary, support, and/or other functionality which may or may not be used in conjunction with aspects described herein. The control logic may also be referred to herein as the data server software control logic 1826. Functionality of the data server software may refer to operations or decisions made automatically based on rules coded into the control logic, made manually by a user providing input into the system, and/or a combination of automatic processing based on user input (e.g., queries, data updates, etc.).
[0136] Memory 1822 may also store data used in performance of one or more aspects described herein, including a first database 1830 and a second database 1832. The databases 1830, 1832 can store information such as configuration information 316, defined trajectories 318, and/or safety trajectories 320. In some embodiments, the first database may include the second database (e.g., as a separate table, report, etc.). That is, the information can be stored in a single database, or separated into different logical, virtual, or physical databases, depending on system design. Web server 1806, computer 1804, laptop 1802 may have similar or different architecture as described with respect to data server 1810. Those of skill in the art will appreciate that the functionality of data server 1810 (or web server 1806, computer 1804, laptop 1802) as described herein may be spread across multiple data processing devices, for example, to distribute processing load across multiple computers, to segregate transactions based on geographic location, user access level, quality of service (QoS), etc.
[0137] One or more aspects may be embodied in computer-usable or readable data and/or computer-executable instructions, such as in one or more program modules, executed by one or more computers or other devices as described herein. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types when executed by a processor in a computer or other device. The modules may be written in a source code programming language that is subsequently compiled for execution, or may be written in a scripting language such as (but not limited to) HTML or XML. The computer executable instructions may be stored on a computer readable medium such as a nonvolatile storage device. Any suitable computer readable storage media may be utilized, including hard disks, CD-ROMs, optical storage devices, magnetic storage devices, and/or any combination thereof. In addition, various transmission (non-storage) media representing data or events as described herein may be transferred between a source and a destination in the form of electromagnetic waves traveling through signal-conducting media such as metal wires, optical fibers, and/or wireless transmission media (e.g., air and/or space). Various aspects described herein may be embodied as a method, a data processing system, or a computer program product.
Therefore, various functionalities may be embodied in whole or in part in software, firmware and/or hardware or hardware equivalents such as integrated circuits, field programmable gate arrays (FPGA), and the like. Particular data structures may be used to more effectively implement one or more aspects described herein, and such data structures are contemplated within the scope of computer executable instructions and computer-usable data described herein.
[0138] It will be appreciated that the exemplary devices shown in the block diagrams described above may represent one functionally descriptive example of many potential implementations. Accordingly, division, omission or inclusion of block functions depicted in the accompanying figures does not infer that the hardware components, circuits, software and/or elements for implementing these functions would necessarily be divided, omitted, or included in embodiments.
[0139] At least one computer-readable storage medium may include instructions that, when executed, cause a system to perform any of the computer-implemented methods described herein.
[0140] Some embodiments may be described using the expression “one embodiment” or “an embodiment” along with their derivatives. These terms mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment. Moreover, unless otherwise noted the features described above are recognized to be usable together in any combination. Thus, any features discussed separately may be employed in combination with each other unless it is noted that the features are incompatible with each other.
[0141] The following examples pertain to further embodiments, from which numerous permutations and configurations will be apparent.
[0142] In one aspect, a method to reduce robotic collisions, includes monitoring a first controller for a first robot with a first robotic arm, the first controller to control movement of a first end-effector of the first robotic arm between a series of positions along a first defined trajectory, each position associated with a defined time interval, monitoring a second controller for a second robot with a second robotic arm, the second controller to control movement of a second end-effector of the second robotic arm between a series of positions along a second defined trajectory, each position associated with a defined time interval, detecting a cyberattack on the second controller for the second robot, identifying a defined time interval corresponding to a time of attack detection, the identified defined time interval to comprise an attack detection time interval, determining a current position for the first end-effector from among the series of positions along the first defined trajectory based on the attack detection time interval, retrieving a safety trajectory from among multiple safety trajectories stored in a database based on the current position for the first end- effector, the safety trajectory to define a series of positions between the current position for the first end-effector and a safety position for the first end-effector that avoids an uncertainty region of the second end-effector, and sending control information to the first controller to cause the first end-effector to move from the current position to the safety position along the safety trajectory in order to prevent a collision with the second end- effector.
[0143] The method may also include where the first robot and the second robot are collaborative robots arranged to move in a shared physical space in order to execute tasks to accomplish shared goals.
[0144] The method may also include where the first robotic arm is a mechanical arm with one or more links to form a kinematic chain, each link connected by a joint allowing rotational motion or translational displacement, and the first end-effector is a terminus for the kinematic chain. [0145] The method may also include where the second robotic arm is a mechanical arm with one or more links to form a kinematic chain, each link connected by a joint allowing rotational motion or translational displacement, and the second end-effector is a terminus for the kinematic chain.
[0146] The method may also include where the first defined trajectory for the first end- effector and the second defined trajectory for the second end-effector are arranged to avoid collisions between the first end-effector and the second end-effector when moving within a shared space.
[0147] The method may also include detecting the cyberattack on the second controller for the second robot during an uncertainty time period, the uncertainty time period to comprise a time interval between a time of attack and a time of attack neutralization.
[0148] The method may also include comparing the time of attack detection to each time interval associated with each position in the first defined trajectory until a match is found to identify the attack detection time interval.
[0149] The method may also include retrieving the current position for the first end-effector from among the series of positions along the first defined trajectory based on the attack detection time interval from a database.
[0150] The method may also include where the uncertainty region includes a three- dimensional shape that defines boundaries for a reach envelope of the second end-effector after the attack detection time interval.
[0151] The method may also include where an uncertainty region is generated for each position associated with a time interval for the second end-effector of the second robotic arm, and stored in the database before the time of attack detection.
[0152] The method may also include executing the control information by the first controller to cause the first end-effector to move from the current position to the safety point along the safety trajectory in order to prevent a collision with the second end-effector. [0153] The method may also include neutralizing the detected cyberattack on the second controller for the second robot. Other technical features may be readily apparent to one skilled in the art from the following figures, descriptions, and claims.
[0154] In one aspect, an apparatus includes a processor circuit. The apparatus also includes a memory storing instructions that, when executed by the processor, cause the processor circuit to monitor a first controller for a first robot with a first robotic arm, the first controller to control movement of a first end-effector of the first robotic arm between a series of positions along a first defined trajectory, each position associated with a defined time interval, monitor a second controller for a second robot with a second robotic arm, the second controller to control movement of a second end-effector of the second robotic arm between a series of positions along a second defined trajectory, each position associated with a defined time interval, detect a cyberattack on the second controller for the second robot, identify a defined time interval corresponding to a time of attack detection, the identified defined time interval to comprise an attack detection time interval, determine a current position for the first end-effector from among the series of positions along the first defined trajectory based on the attack detection time interval, retrieve a safety trajectory from among multiple safety trajectories stored in a database based on the current position for the first end-effector, the safety trajectory to define a series of positions between the current position for the first end-effector and a safety position for the first end-effector that avoids an uncertainty region of the second end-effector, and send control information to the first controller to cause the first end-effector to move from the current position to the safety position along the safety trajectory in order to prevent a collision with the second end- effector.
[0155] The apparatus may also include where the first robot and the second robot are collaborative robots arranged to move in a shared physical space in order to execute tasks to accomplish shared goals.
[0156] The apparatus may also include where the first robotic arm is a mechanical arm with one or more links to form a kinematic chain, each link connected by a joint allowing rotational motion or translational displacement, and the first end-effector is a terminus for the kinematic chain.
[0157] The apparatus may also include where the second robotic arm is a mechanical arm with one or more links to form a kinematic chain, each link connected by a joint allowing rotational motion or translational displacement, and the second end-effector is a terminus for the kinematic chain.
[0158] The apparatus may also include where the first defined trajectory for the first end-effector and the second defined trajectory for the second end-effector are arranged to avoid collisions between the first end-effector and the second end-effector when moving within a shared space.
[0159] The apparatus may also include the processor circuit to detect the cyberattack on the second controller for the second robot during an uncertainty time period, the uncertainty time period to comprise a time interval between a time of attack and a time of attack neutralization. [0160] The apparatus may also include the processor circuit to compare the time of attack detection to each time interval associated with each position in the first defined trajectory until a match is found to identify the attack detection time interval.
[0161] The computing apparatus may also include the processor circuit to retrieve the current position for the first end-effector from among the series of positions along the first defined trajectory based on the attack detection time interval from a database.
[0162] The computing apparatus may also include where the uncertainty region includes a three-dimensional shape that defines boundaries for a reach envelope of the second end- effector after the attack detection time interval.
[0163] The computing apparatus may also include where an uncertainty region is generated for each position associated with a time interval for the second end-effector of the second robotic arm, and stored in the database before the time of attack detection.
[0164] The computing apparatus may also include the processor circuit to execute the control information by the first controller to cause the first end-effector to move from the current position to the safety point along the safety trajectory in order to prevent a collision with the second end-effector.
[0165] The computing apparatus may also include the processor circuit to neutralize the detected cyberattack on the second controller for the second robot. Other technical features may be readily apparent to one skilled in the art from the following figures, descriptions, and claims.
[0166] In one aspect, a non-transitory computer-readable storage medium, the computer- readable storage medium including instructions that when executed by a computer, cause the computer to monitor a first controller for a first robot with a first robotic arm, the first controller to control movement of a first end-effector of the first robotic arm between a series of positions along a first defined trajectory, each position associated with a defined time interval, monitor a second controller for a second robot with a second robotic arm, the second controller to control movement of a second end-effector of the second robotic arm between a series of positions along a second defined trajectory, each position associated with a defined time interval, detect a cyberattack on the second controller for the second robot, identify a defined time interval corresponding to a time of attack detection, the identified defined time interval to comprise an attack detection time interval, determine a current position for the first end-effector from among the series of positions along the first defined trajectory based on the attack detection time interval, retrieve a safety trajectory from among multiple safety trajectories stored in a database based on the current position for the first end-effector, the safety trajectory to define a series of positions between the current position for the first end-effector and a safety position for the first end-effector that avoids an uncertainty region of the second end-effector, and send control information to the first controller to cause the first end-effector to move from the current position to the safety position along the safety trajectory in order to prevent a collision with the second end- effector.
[0167] The computer-readable storage medium may also include where the first robot and the second robot are collaborative robots arranged to move in a shared physical space in order to execute tasks to accomplish shared goals.
[0168] The computer-readable storage medium may also include where the first robotic arm is a mechanical arm with one or more links to form a kinematic chain, each link connected by a joint allowing rotational motion or translational displacement, and the first end-effector is a terminus for the kinematic chain.
[0169] The computer-readable storage medium may also include where the second robotic arm is a mechanical arm with one or more links to form a kinematic chain, each link connected by a joint allowing rotational motion or translational displacement, and the second end-effector is a terminus for the kinematic chain.
[0170] The computer-readable storage medium may also include where the first defined trajectory for the first end-effector and the second defined trajectory for the second end-effector are arranged to avoid collisions between the first end-effector and the second end-effector when moving within a shared space.
[0171] The computer-readable storage medium may also include instructions to detect the cyberattack on the second controller for the second robot during an uncertainty time period, the uncertainty time period to comprise a time interval between a time of attack and a time of attack neutralization.
[0172] The computer-readable storage medium may also include instructions to compare the time of attack detection to each time interval associated with each position in the first defined trajectory until a match is found to identify the attack detection time interval.
[0173] The computer-readable storage medium may also include instructions to retrieve the current position for the first end-effector from among the series of positions along the first defined trajectory based on the attack detection time interval from a database.
[0174] The computer-readable storage medium may also include where the uncertainty region includes a three-dimensional shape that defines boundaries for a reach envelope of the second end-effector after the attack detection time interval.
[0175] The computer-readable storage medium may also include where an uncertainty region is generated for each position associated with a time interval for the second end- effector of the second robotic arm, and stored in the database before the time of attack detection.
[0176] The computer-readable storage medium may also include instructions to execute the control information by the first controller to cause the first end-effector to move from the current position to the safety point along the safety trajectory in order to prevent a collision with the second end-effector.
[0177] The computer-readable storage medium may also include instructions to neutralize the detected cyberattack on the second controller for the second robot. Other technical features may be readily apparent to one skilled in the art from the following figures, descriptions, and claims.
[0178] In one aspect, a method to compute a trajectory, includes receiving configuration information for a first robot with a first robotic arm and a first end-effector at a defined time interval, receiving an uncertainty region for a second robot with a second robotic arm and a second end-effector, the uncertainty region to comprise a three-dimensional shape that defines boundaries for a reach envelope of the second end-effector at the defined time interval, determining a safety trajectory for the first end-effector based on the uncertainty region, the safety trajectory to define a series of positions between a current position for the first end-effector and a safety position for the first end-effector that avoids the uncertainty region of the second end-effector at the defined time interval, and storing the safety trajectory in a database.
[0179] The method may also include where the defined time interval is associated with a defined position for the first end-effector and the second end-effector.
[0180] The method may also include where the configuration information includes a set of parameters for the first robotic arm at the defined time interval, the set of parameters includes a joint position state for each joint of the first robotic arm, a joint velocity state for each joint of the first robotic arm, and a joint acceleration state for each joint of the first robotic arm.
[0181] The method may also include where the configuration information includes a set of parameters for the first robotic arm at the defined time interval, the set of parameters includes an initial kinematic frame and a final kinematic frame for the first robotic arm at the defined time interval.
[0182] The method may also include determining the uncertainty region for the second robot with the second robotic arm and the second end-effector based on configuration information for the second robot, where the configuration information includes a set of parameters for the second robotic arm at the defined time interval, the set of parameters includes a joint position state for each joint of the second robotic arm, a joint velocity state for each joint of the second robotic arm, and a joint acceleration state for each joint of the second robotic arm.
[0183] The method may also include adapting the safety trajectory to ensure movement of each joint of the first robotic arm remains within joint acceleration parameters.
[0184] The method may also include smoothing the safety trajectory to ensure movement of the first robotic arm remains within acceleration parameters along inflection regions.
[0185] The method may also include regularizing the safety trajectory for safe-trajectory branching and management.
[0186] The method may also include receiving multiple uncertainty regions for the second robot with the second robotic arm and the second end-effector, each uncertainty region associated with a position of the second end-effector at a defined time interval within a series of defined time intervals.
[0187] The method may also include determining multiple safety trajectories for the first end-effector based on multiple uncertainty regions for the second robot with the second robotic arm and the second end-effector, each uncertainty region associated with a position of the second end-effector at a defined time interval within a series of defined time intervals.
[0188] The method may also include storing multiple safety trajectories for the first end-effector based on multiple uncertainty regions for the second robot with the second robotic arm and the second end-effector in the database.
[0189] The method may also include indexing multiple safety trajectories for the first end-effector in the database using defined time intervals and associated positions of the first end-effector at each defined time interval. Other technical features may be readily apparent to one skilled in the art from the following figures, descriptions, and claims.
[0190] In one aspect, a computing apparatus includes a processor circuit. The computing apparatus also includes a memory storing instructions that, when executed by the processor, cause the processor circuit to receive configuration information for a first robot with a first robotic arm and a first end-effector at a defined time interval, receive an uncertainty region for a second robot with a second robotic arm and a second end-effector, the uncertainty region to comprise a three-dimensional shape that defines boundaries for a reach envelope of the second end-effector at the defined time interval, determine a safety trajectory for the first end-effector based on the uncertainty region, the safety trajectory to define a series of positions between a current position for the first end-effector and a safety position for the first end-effector that avoids the uncertainty region of the second end-effector at the defined time interval, and store the safety trajectory in a database.
[0191] The computing apparatus may also include where the defined time interval is associated with a defined position for the first end-effector and the second end-effector. [0192] The computing apparatus may also include where the configuration information includes a set of parameters for the first robotic arm at the defined time interval, the set of parameters includes a joint position state for each joint of the first robotic arm, a joint velocity state for each joint of the first robotic arm, and a joint acceleration state for each joint of the first robotic arm.
[0193] The computing apparatus may also include where the configuration information includes a set of parameters for the first robotic arm at the defined time interval, the set of parameters includes an initial kinematic frame and a final kinematic frame for the first robotic arm at the defined time interval.
[0194] The computing apparatus may also include the processor circuit to determine the uncertainty region for the second robot with the second robotic arm and the second end-effector based on configuration information for the second robot, where the configuration information includes a set of parameters for the second robotic arm at the defined time interval, the set of parameters includes a joint position state for each joint of the second robotic arm, a joint velocity state for each joint of the second robotic arm, and a joint acceleration state for each joint of the second robotic arm.
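As a hedged illustration of deriving an uncertainty region from the second arm's joint states, the sketch below bounds how far the second end-effector could move within one defined time interval using per-joint velocity and acceleration states and assumed link lengths; the resulting radius defines a conservative spherical region around the current end-effector position. This is one possible construction, not the specification's exact method.

from dataclasses import dataclass
from typing import List

@dataclass
class JointState:
    position: float      # rad
    velocity: float      # rad/s
    acceleration: float  # rad/s^2

def uncertainty_radius(joints: List[JointState], link_lengths: List[float], dt: float) -> float:
    # Upper-bound the end-effector displacement over one interval dt: each joint can
    # sweep at most |w|*dt + 0.5*|a|*dt^2 radians, and that sweep displaces the tip by
    # at most the swept angle times the arm length distal to the joint.
    radius = 0.0
    for i, j in enumerate(joints):
        sweep = abs(j.velocity) * dt + 0.5 * abs(j.acceleration) * dt * dt
        radius += sweep * sum(link_lengths[i:])
    return radius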
[0195] The computing apparatus may also include the processor circuit to adapt the safety trajectory to ensure movement of each joint of the first robotic arm remains within joint acceleration parameters.
[0196] The computing apparatus may also include the processor circuit to smooth the safety trajectory to ensure movement of the first robotic arm remains within acceleration parameters along inflection regions.
[0197] The computing apparatus may also include the processor circuit to regularize the safety trajectory for safe-trajectory branching and management.
[0198] The computing apparatus may also include the processor circuit to receive multiple uncertainty regions for the second robot with the second robotic arm and the second end-effector, each uncertainty region associated with a position of the second end-effector at a defined time interval within a series of defined time intervals.
[0199] The computing apparatus may also include the processor circuit to determine multiple safety trajectories for the first end-effector based on multiple uncertainty regions for the second robot with the second robotic arm and the second end-effector, each uncertainty region associated with a position of the second end-effector at a defined time interval within a series of defined time intervals.
[0200] The computing apparatus may also include the processor circuit to store multiple safety trajectories for the first end-effector based on multiple uncertainty regions for the second robot with the second robotic arm and the second end-effector in the database.
[0201] The computing apparatus may also include the processor circuit to index multiple safety trajectories for the first end-effector in the database using defined time intervals and associated positions of the first end-effector at each defined time interval. Other technical features may be readily apparent to one skilled in the art from the following figures, descriptions, and claims.
[0202] In one aspect, a non-transitory computer-readable storage medium, the computer-readable storage medium including instructions that, when executed by a computer, cause the computer to receive configuration information for a first robot with a first robotic arm and a first end-effector at a defined time interval, receive an uncertainty region for a second robot with a second robotic arm and a second end-effector, the uncertainty region to comprise a three-dimensional shape that defines boundaries for a reach envelope of the second end-effector at the defined time interval, determine a safety trajectory for the first end-effector based on the uncertainty region, the safety trajectory to define a series of positions between a current position for the first end-effector and a safety position for the first end-effector that avoids the uncertainty region of the second end-effector at the defined time interval, and store the safety trajectory in a database.
[0203] The computer-readable storage medium may also include where the defined time interval is associated with a defined position for the first end-effector and the second end-effector.
[0204] The computer-readable storage medium may also include where the configuration information includes a set of parameters for the first robotic arm at the defined time interval, the set of parameters includes a joint position state for each joint of the first robotic arm, a joint velocity state for each joint of the first robotic arm, and a joint acceleration state for each joint of the first robotic arm.
[0205] The computer-readable storage medium may also include where the configuration information includes a set of parameters for the first robotic arm at the defined time interval, the set of parameters includes an initial kinematic frame and a final kinematic frame for the first robotic arm at the defined time interval.
[0206] The computer-readable storage medium may also include instructions to determine the uncertainty region for the second robot with the second robotic arm and the second end-effector based on configuration information for the second robot, where the configuration information includes a set of parameters for the second robotic arm at the defined time interval, the set of parameters includes a joint position state for each joint of the second robotic arm, a joint velocity state for each joint of the second robotic arm, and a joint acceleration state for each joint of the second robotic arm.
[0207] The computer-readable storage medium may also include instructions to adapt the safety trajectory to ensure movement of each joint of the first robotic arm remains within joint acceleration parameters.
[0208] The computer-readable storage medium may also include instructions to smooth the safety trajectory to ensure movement of the first robotic arm remains within acceleration parameters along inflection regions.
[0209] The computer-readable storage medium may also include instructions to regularize the safety trajectory for safe-trajectory branching and management.
[0210] The computer-readable storage medium may also include instructions to receive multiple uncertainty regions for the second robot with the second robotic arm and the second end-effector, each uncertainty region associated with a position of the second end-effector at a defined time interval within a series of defined time intervals.
[0211] The computer-readable storage medium may also include instructions to determine multiple safety trajectories for the first end-effector based on multiple uncertainty regions for the second robot with the second robotic arm and the second end-effector, each uncertainty region associated with a position of the second end-effector at a defined time interval within a series of defined time intervals.
[0212] The computer-readable storage medium may also include instructions to store multiple safety trajectories for the first end-effector based on multiple uncertainty regions for the second robot with the second robotic arm and the second end-effector in the database.
[0213] The computer-readable storage medium may also include instructions to index multiple safety trajectories for the first end-effector in the database using defined time intervals and associated positions of the first end-effector at each defined time interval.
[0214] In one aspect, a method to compute a trajectory includes receiving configuration information for a first robot with a first robotic arm at a defined time interval, receiving an uncertainty region for a second robot with a second robotic arm, the uncertainty region to comprise a three-dimensional shape that defines boundaries for a reach envelope of the second robotic arm at the defined time interval, determining a safety trajectory for the first robotic arm based on the uncertainty region, the safety trajectory to define a series of positions between a current position for the first robotic arm and a safety position for the first robotic arm that avoids the uncertainty region of the second robotic arm at the defined time interval, and storing the safety trajectory in a database. The method may also include the first robotic arm to have a first end-effector and the second robotic arm to have a second end-effector.
[0215] In one aspect, a method to reduce robotic collisions includes monitoring a first controller for a first robot with a first robotic arm, the first controller to control movement of the first robotic arm between a series of positions along a first defined trajectory, each position associated with a defined time interval, monitoring a second controller for a second robot with a second robotic arm, the second controller to control movement of the second robotic arm between a series of positions along a second defined trajectory, each position associated with a defined time interval, detecting a cyberattack on the second controller for the second robot, identifying a defined time interval corresponding to a time of attack detection, the identified defined time interval to comprise an attack detection time interval, determining a current position for the first robotic arm from among the series of positions along the first defined trajectory based on the attack detection time interval, retrieving a safety trajectory from among multiple safety trajectories stored in a database based on the current position for the first robotic arm, the safety trajectory to define a series of positions between the current position for the first robotic arm and a safety position for the first robotic arm that avoids an uncertainty region of the second robotic arm, and sending control information to the first controller to cause the first robotic arm to move from the current position to the safety position along the safety trajectory in order to prevent a collision with the second robotic arm. The method may also include the first robotic arm to have a first end-effector and the second robotic arm to have a second end-effector.
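As an illustrative sketch of this recovery flow, the following code identifies the attack detection time interval, looks up the first arm's current position for that interval, retrieves a pre-computed safety trajectory (reusing the hypothetical SafetyTrajectoryDB sketched earlier), and streams its waypoints toward the first controller through a caller-supplied send_waypoint callable. The controller and detector interfaces are placeholders, not the claimed implementation.

from typing import Callable, List, Tuple

Position = Tuple[float, float, float]

def recover(attack_time: float,
            interval_starts: List[float],
            planned_positions: List[Position],
            db: "SafetyTrajectoryDB",
            send_waypoint: Callable[[Position], None]) -> bool:
    # Identify the attack detection time interval: the last interval whose start time
    # is not after the detection time.
    candidates = [i for i, start in enumerate(interval_starts) if start <= attack_time]
    if not candidates:
        return False
    t_idx = candidates[-1]
    current = planned_positions[t_idx]          # current position on the first defined trajectory
    trajectory = db.retrieve(t_idx, current)    # pre-computed safety trajectory, if any
    if trajectory is None:
        return False
    for waypoint in trajectory:
        send_waypoint(waypoint)                 # control information sent toward the first controller
    return True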
[0216] Other aspects and embodiments include any of the aforementioned method, apparatus, system and/or computer-readable storage medium examples implemented as a “means plus function” apparatus or system.
[0217] For instance, in one aspect, an apparatus to reduce robotic collisions includes a means for monitoring a first controller for a first robot with a first robotic arm, the first controller to control movement of a first end-effector of the first robotic arm between a series of positions along a first defined trajectory, each position associated with a defined time interval, a means for monitoring a second controller for a second robot with a second robotic arm, the second controller to control movement of a second end-effector of the second robotic arm between a series of positions along a second defined trajectory, each position associated with a defined time interval, a means for detecting a cyberattack on the second controller for the second robot, a means for identifying a defined time interval corresponding to a time of attack detection, the identified defined time interval to comprise an attack detection time interval, a means for determining a current position for the first end-effector from among the series of positions along the first defined trajectory based on the attack detection time interval, a means for retrieving a safety trajectory from among multiple safety trajectories stored in a database based on the current position for the first end-effector, the safety trajectory to define a series of positions between the current position for the first end-effector and a safety position for the first end-effector that avoids an uncertainty region of the second end-effector, and a means for sending control information to the first controller to cause the first end-effector to move from the current position to the safety position along the safety trajectory in order to prevent a collision with the second end-effector.
[0218] For instance, in one aspect, an apparatus to compute a trajectory includes a means for receiving configuration information for a first robot with a first robotic arm and a first end-effector at a defined time interval, a means for receiving an uncertainty region for a second robot with a second robotic arm and a second end-effector, the uncertainty region to comprise a three-dimensional shape that defines boundaries for a reach envelope of the second end-effector at the defined time interval, a means for determining a safety trajectory for the first end-effector based on the uncertainty region, the safety trajectory to define a series of positions between a current position for the first end-effector and a safety position for the first end-effector that avoids the uncertainty region of the second end-effector at the defined time interval, and a means for storing the safety trajectory in a database.
[0219] Other “means plus function” aspects and embodiments are equally applicable to any of the aspects or embodiments described herein.

Claims

What is claimed is:
1. A method to reduce robotic collisions, comprising: monitoring a first controller for a first robot with a first robotic arm, the first controller to control movement of a first end-effector of the first robotic arm between a series of positions along a first defined trajectory, each position associated with a defined time interval; monitoring a second controller for a second robot with a second robotic arm, the second controller to control movement of a second end-effector of the second robotic arm between a series of positions along a second defined trajectory, each position associated with a defined time interval; detecting a cyberattack on the second controller for the second robot; identifying a defined time interval corresponding to a time of attack detection, the identified defined time interval to comprise an attack detection time interval; determining a current position for the first end-effector from among the series of positions along the first defined trajectory based on the attack detection time interval; retrieving a safety trajectory from among multiple safety trajectories stored in a database based on the current position for the first end-effector, the safety trajectory to define a series of positions between the current position for the first end-effector and a safety position for the first end-effector that avoids an uncertainty region of the second end-effector; and sending control information to the first controller to cause the first end-effector to move from the current position to the safety position along the safety trajectory in order to prevent a collision with the second end-effector.
2. The method of claim 1, wherein the first robot and the second robot are collaborative robots arranged to move in a shared physical space in order to execute tasks to accomplish shared goals.
3. The method of claim 1, wherein the first robotic arm is a mechanical arm with one or more links to form a kinematic chain, each link connected by a joint allowing rotational motion or translational displacement, and the first end-effector is a terminus for the kinematic chain.
4. The method of claim 1, wherein the second robotic arm is a mechanical arm with one or more links to form a kinematic chain, each link connected by a joint allowing rotational motion or translational displacement, and the second end-effector is a terminus for the kinematic chain.
5. The method of claim 1, wherein the first defined trajectory for the first end-effector and the second defined trajectory for the second end-effector are arranged to avoid collisions between the first end-effector and the second end-effector when moving within a shared space.
6. The method of claim 1, comprising detecting the cyberattack on the second controller for the second robot during an uncertainty time period, the uncertainty time period to comprise a time interval between a time of attack and a time of attack neutralization.
7. The method of claim 1, comprising comparing the time of attack detection to each time interval associated with each position in the first defined trajectory until a match is found to identify the attack detection time interval.
8. An apparatus comprising: a processor circuit; and a memory storing instructions that, when executed by the processor circuit, cause the processor circuit to: monitor a first controller for a first robot with a first robotic arm, the first controller to control movement of a first end-effector of the first robotic arm between a series of positions along a first defined trajectory, each position associated with a defined time interval; monitor a second controller for a second robot with a second robotic arm, the second controller to control movement of a second end-effector of the second robotic arm between a series of positions along a second defined trajectory, each position associated with a defined time interval; detect a cyberattack on the second controller for the second robot; identify a defined time interval corresponding to a time of attack detection, the identified defined time interval to comprise an attack detection time interval; determine a current position for the first end-effector from among the series of positions along the first defined trajectory based on the attack detection time interval; retrieve a safety trajectory from among multiple safety trajectories stored in a database based on the current position for the first end-effector, the safety trajectory to define a series of positions between the current position for the first end-effector and a safety position for the first end-effector that avoids an uncertainty region of the second end-effector; and send control information to the first controller to cause the first end-effector to move from the current position to the safety position along the safety trajectory in order to prevent a collision with the second end-effector.
9. The apparatus of claim 8, wherein the first robot and the second robot are collaborative robots arranged to move in a shared physical space in order to execute tasks to accomplish shared goals.
10. The apparatus of claim 8, wherein the first robotic arm is a mechanical arm with one or more links to form a kinematic chain, each link connected by a joint allowing rotational motion or translational displacement, and the first end-effector is a terminus for the kinematic chain.
11. The apparatus of claim 8, wherein the second robotic arm is a mechanical arm with one or more links to form a kinematic chain, each link connected by a joint allowing rotational motion or translational displacement, and the second end-effector is a terminus for the kinematic chain.
12. The apparatus of claim 8, wherein the first defined trajectory for the first end-effector and the second defined trajectory for the second end-effector are arranged to avoid collisions between the first end-effector and the second end-effector when moving within a shared space.
13. The apparatus of claim 8, the processor circuit to detect the cyberattack on the second controller for the second robot during an uncertainty time period, the uncertainty time period to comprise a time interval between a time of attack and a time of attack neutralization.
14. The apparatus of claim 8, the processor circuit to compare the time of attack detection to each time interval associated with each position in the first defined trajectory until a match is found to identify the attack detection time interval.
15. A non-transitory computer-readable storage medium, the computer-readable storage medium including instructions that, when executed by a computer, cause the computer to: monitor a first controller for a first robot with a first robotic arm, the first controller to control movement of a first end-effector of the first robotic arm between a series of positions along a first defined trajectory, each position associated with a defined time interval; monitor a second controller for a second robot with a second robotic arm, the second controller to control movement of a second end-effector of the second robotic arm between a series of positions along a second defined trajectory, each position associated with a defined time interval; detect a cyberattack on the second controller for the second robot; identify a defined time interval corresponding to a time of attack detection, the identified defined time interval to comprise an attack detection time interval; determine a current position for the first end-effector from among the series of positions along the first defined trajectory based on the attack detection time interval; retrieve a safety trajectory from among multiple safety trajectories stored in a database based on the current position for the first end-effector, the safety trajectory to define a series of positions between the current position for the first end-effector and a safety position for the first end-effector that avoids an uncertainty region of the second end-effector; and send control information to the first controller to cause the first end-effector to move from the current position to the safety position along the safety trajectory in order to prevent a collision with the second end-effector.
16. The computer-readable storage medium of claim 15, wherein the first robot and the second robot are collaborative robots arranged to move in a shared physical space in order to execute tasks to accomplish shared goals.
17. The computer-readable storage medium of claim 15, wherein the first robotic arm is a mechanical arm with one or more links to form a kinematic chain, each link connected by a joint allowing rotational motion or translational displacement, and the first end-effector is a terminus for the kinematic chain.
18. The computer-readable storage medium of claim 15, wherein the second robotic arm is a mechanical arm with one or more links to form a kinematic chain, each link connected by a joint allowing rotational motion or translational displacement, and the second end-effector is a terminus for the kinematic chain.
19. The computer-readable storage medium of claim 15, wherein the first defined trajectory for the first end-effector and the second defined trajectory for the second end-effector are arranged to avoid collisions between the first end-effector and the second end-effector when moving within a shared space.
20. The computer-readable storage medium of claim 15, the instructions further causing the computer to detect the cyberattack on the second controller for the second robot during an uncertainty time period, the uncertainty time period to comprise a time interval between a time of attack and a time of attack neutralization.
PCT/US2022/042459 2022-09-02 2022-09-02 Post-attack real-time trajectory recovery of collaborative robotic arms WO2024049444A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2022/042459 WO2024049444A1 (en) 2022-09-02 2022-09-02 Post-attack real-time trajectory recovery of collaborative robotic arms


Publications (1)

Publication Number Publication Date
WO2024049444A1 (en)

Family

ID=90098490

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/042459 WO2024049444A1 (en) 2022-09-02 2022-09-02 Post-attack real-time trajectory recovery of collaborative robotic arms

Country Status (1)

Country Link
WO (1) WO2024049444A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05119814A (en) * 1991-05-20 1993-05-18 Brother Ind Ltd Non-interference controller for robot arm
JP2015160253A (en) * 2014-02-26 2015-09-07 キヤノン株式会社 Trajectory generation method, robot device, program and recording medium
US20160008078A1 (en) * 2012-12-10 2016-01-14 Intuitive Surgical Operations, Inc. Collision avoidance during controlled movement of image capturing device and manipulatable device movable arms
JP2016190315A (en) * 2015-03-30 2016-11-10 株式会社トヨタプロダクションエンジニアリング Program creation support method, program creation support device and program
US20210331712A1 (en) * 2019-08-05 2021-10-28 Lg Electronics Inc. Method and apparatus for responding to hacking on autonomous vehicle

