CN115258510A - Robot system with object update mechanism and method for operating the robot system - Google Patents

Robot system with object update mechanism and method for operating the robot system

Info

Publication number
CN115258510A
Application number
CN202210933005.4A
Authority
CN (China)
Prior art keywords
initial, package, sensor data, additional sensor, plan
Legal status
Pending
Other languages
Chinese (zh)
Inventors
金本良树
鲁仙·出杏光
Current Assignee
Mujin Technology
Original Assignee
Mujin Technology
Priority claimed from U.S. Application No. 17/841,545 (published as US20230025647A1)
Application filed by Mujin Technology
Publication of CN115258510A

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B65 CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65G TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G1/00 Storing articles, individually or in orderly arrangement, in warehouses or magazines
    • B65G1/02 Storage devices
    • B65G1/04 Storage devices mechanical
    • B65G1/137 Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed
    • B65G1/1371 Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed with data records
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1684 Tracking a line or surface by means of sensors
    • B65G61/00 Use of pick-up or transfer devices or of manipulators for stacking or de-stacking articles not otherwise provided for
    • B65G63/00 Transferring or trans-shipping at storage areas, railway yards or harbours or in opening mining cuts; Marshalling yard installations

Abstract

A system and method for determining false detections of objects and determining subsequent responses are disclosed herein. The robotic system may use a motion plan derived from an initial detection of a package to transfer the package from a start location to a task location. During implementation of the motion plan, the robotic system may obtain additional sensor data, which may be used to deviate from the initial motion plan and implement an alternate motion plan to transfer the package to the task location.

Description

Robot system with object update mechanism and method for operating the same
The present application is a divisional application of Chinese Application No. CN202210880924.X, filed on July 25, 2022, and entitled "Robot system with object update mechanism and method for operating said robot system."
Cross Reference to Related Applications
This application claims the benefit of U.S. Provisional Patent Application No. 63/225,346, filed on July 23, 2021, which is incorporated herein by reference in its entirety.
Technical Field
The present technology relates generally to robotic systems and, more particularly, to robotic systems having object update mechanisms.
Background
Robots (e.g., machines configured to automatically/autonomously perform physical actions) are now widely used in many fields. For example, robots may be used to perform various tasks (e.g., manipulating or transferring objects) in manufacturing, packaging, transporting, and/or shipping. In performing tasks, the robot may replicate human actions, replacing or reducing the human involvement that would otherwise be required to perform dangerous or repetitive tasks. However, robots often lack the precision necessary to replicate the human sensitivity and/or adaptability needed to perform more complex tasks. For example, robots often have difficulty adapting to unexpected conditions, such as those caused by incorrect object recognition. Accordingly, there remains a need for improved robotic systems and techniques for controlling and managing various aspects of a robot to address unexpected conditions.
Drawings
Fig. 1 illustrates an example environment in which a robotic system transports objects in accordance with one or more embodiments of the present technology.
Fig. 2 is a block diagram illustrating a robotic system in accordance with one or more embodiments of the present technique.
Fig. 3 illustrates a multi-component transfer assembly in accordance with one or more embodiments of the present technology.
Fig. 4 illustrates an example top view of a pair of parcels, according to one or more embodiments of the present technology.
Fig. 5 is a functional block diagram of a robotic system in accordance with one or more embodiments of the present technique.
Fig. 6 is a flow diagram for operating a robotic system, according to one or more embodiments of the present technology.
Detailed Description
Systems and methods for transferring unexpected objects are described herein. The system may include or access registration records describing a set of features (e.g., physical characteristics such as a surface image, dimensions, weight, center-of-mass (CoM) position, etc.) for each expected object. The system may include and/or be in communication with a set of sensors (e.g., vision sensors, weight/torque sensors, etc.) that may obtain measurements of an object corresponding to the feature set. For example, the system may interact with one or more cameras to obtain two-dimensional (2D) and/or three-dimensional (3D) images or depth maps of one or more objects at a starting location. The obtained image may be processed by a detection process, which may include locating edges of the object, estimating or measuring dimensions of the object, determining the identity of the object, determining the location of the object, and the like. The detection results can be used to derive a motion plan for transferring each object from the starting location to the task location. As described in detail below, during implementation of the derived motion plan, the system may obtain other measurements, such as weight measurements and/or corresponding torque vectors, via one or more weight and/or torque sensors on the end effector and/or the robotic arm. The system may compare the obtained measurements with expected data to verify the detected identity of the object. When the measurements and expected data do not match, the system may process the obtained measurements (such as the weight and/or torque vectors) to estimate or calculate the CoM of the target object. In some embodiments, the dynamically obtained CoM may be used to transfer the target object, for example by updating an existing motion plan with the CoM and/or without recalculating the overall motion plan.
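The verification step described above lends itself to a compact illustration. The following Python sketch is not part of the original disclosure; the function name, units, and tolerances are assumptions chosen for clarity:

    # Minimal sketch of the verification step: compare weight/torque readings
    # taken after the initial lift against the registered expectations for the
    # detected object. Tolerances and units (kg, N*m) are illustrative.
    def verify_detection(measured_weight, measured_torque_xy, expected_weight,
                         weight_tol=0.5, torque_tol=0.2):
        """Return True when the live measurements match the registration record.

        measured_torque_xy: horizontal torque components about the grip point;
        these stay near zero when the grip sits directly above the actual CoM.
        """
        weight_ok = abs(measured_weight - expected_weight) <= weight_tol
        torque_mag = (measured_torque_xy[0] ** 2 + measured_torque_xy[1] ** 2) ** 0.5
        return weight_ok and torque_mag <= torque_tol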
During operation, the system may anticipate two or more objects having similar physical characteristics (e.g., differences in the objects' lengths and widths are within a predetermined threshold range). Alternatively or additionally, the system may receive or encounter objects that are not registered but that have physical characteristics similar to one or more of the registered objects. When handling such similar objects, the system may erroneously detect/identify the target/imaged object. As an illustrative example, object 'A' and object 'B' may have lengths and widths that cannot be consistently distinguished (e.g., from a top view of the two objects, given hardware and/or software granularity or system capacity) while having different heights and/or weights. Accordingly, the system may erroneously recognize the actual object 'A' as object 'B'.
A false detection may cause the system to rely on and/or anticipate other false parameters for subsequent processing, such as for deriving and/or implementing a motion plan to transfer the erroneously identified object. For example, a false detection may cause the system to rely on a false height measurement, which may cause accidental collisions during transfer of the falsely detected object. Likewise, a detection error may cause the system to rely on the wrong weight, which may cause gripping failure and loss of the package during transfer.
To prevent such downstream failures, the system may use deviations from expected values to identify and adjust for detection errors. Continuing with the illustrative example, object 'A' may have physical aspects, such as height, weight, and/or CoM, that differ from those of object 'B'. When the system implements a motion plan for object 'B' resulting from a false detection, the system may anticipate a corresponding weight and/or torque vector (e.g., CoM) of object 'B'. When corresponding measurements that differ from the expected parameters are received from the actual object 'A', the system may determine an error condition and/or determine that the initial detection of the target object may be erroneous.
In some embodiments, the system may respond to the detection error by deriving an estimated geometry of the gripped object, such as by enlarging or reducing the footprint (e.g., the length and/or width) of the gripped object based on actual measurements (e.g., the CoM). For example, the system may use the gripping location as a reference and adjust the lateral dimensions of the gripped object so that the actual CoM is located in the center portion of the adjusted footprint. In one or more embodiments, the system can be configured to release the object and then re-grip the object over the actual CoM. Additionally or alternatively, the system may determine a height of the gripped object, such as by using a crossing/line sensor and/or a side-view camera positioned along the transfer path (e.g., above the start location or above the destination location).
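As a hedged sketch of the footprint adjustment above (the coordinate frame and parameter names are assumptions, not the patent's API), the lateral dimensions can be re-centered on the estimated CoM while still covering the grip point:

    # Re-center a length x width footprint on the CoM estimated from torque
    # measurements, enlarging it if needed so it still covers the grip point.
    def adjust_footprint(grip_xy, com_xy, length, width):
        cx, cy = com_xy
        half_l, half_w = length / 2.0, width / 2.0
        half_l = max(half_l, abs(grip_xy[0] - cx))  # grow along length if needed
        half_w = max(half_w, abs(grip_xy[1] - cy))  # grow along width if needed
        # Corners (min_x, min_y, max_x, max_y) of the adjusted footprint.
        return (cx - half_l, cy - half_w, cx + half_l, cy + half_w)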
In some embodiments, the system can update one or more states based on the actual CoM and/or remove one or more initial portions of the corresponding motion plan (e.g., without re-deriving the overall motion plan) to account for differences between the initial and adjusted grip positions. Alternatively, the system may use the detected actual aspects and/or the adjusted footprint to derive an updated motion plan, such as when the difference exceeds one or more predetermined threshold parameters. Further, the system may use the actual measurements to update the registration records, such as by registering a new object or updating the record of the originally detected object. Continuing with the illustrative example described above, the system may create a new registration record for object 'A' using the measured weight/torque vector, CoM, new height, etc. Alternatively, the system may update the record of object 'B' to include the measured weight/torque vector, CoM, new height, etc.
Using real-time/actual measurements, the robotic system may provide effective verification of object detection. Thus, the robotic system may increase throughput by reducing or eliminating errors caused by object false detections. In addition, the robotic system may implement an automated response, such as by re-planning the motion without operator input, to recover from the false detection and successfully transfer the false detected object to the task location. Thus, the robotic system may improve efficiency and overall transfer speed by eliminating the need for human verification or operator-provided response instructions.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the presently disclosed technology. In other embodiments, the techniques described herein may be practiced without these specific details. In other instances, well-known features, such as specific functions or routines, are not described in detail in order to avoid unnecessarily obscuring the present disclosure. Reference in the specification to "an embodiment," "one embodiment," or the like means that a particular feature, structure, material, or characteristic described is included in at least one embodiment of the disclosure. Thus, appearances of such phrases in this specification are not necessarily all referring to the same embodiment. On the other hand, such references are not necessarily mutually exclusive. Furthermore, the particular features, structures, materials, or characteristics may be combined in any suitable manner in one or more embodiments. It is to be understood that the various embodiments shown in the figures are merely illustrative representations and are not necessarily drawn to scale.
For the sake of clarity, several details describing structures or processes that are well known and commonly associated with robotic systems and subsystems, but which may unnecessarily obscure some important aspects of the disclosed technology, are not set forth in the following description. In addition, while the following disclosure sets forth several embodiments of different aspects of the technology, several other embodiments may have different configurations or different components than those described in this section. Accordingly, the disclosed technology may have other embodiments with additional elements or without several elements described below.
Many embodiments or aspects of the disclosure described below may take the form of computer-executable or controller-executable instructions, including routines executed by a programmable computer or controller. One skilled in the relevant art will appreciate that the disclosed techniques can be practiced on computers or controller systems other than those shown and described below. The techniques described herein may be embodied in a special purpose computer or data processor that is specifically programmed, configured, or constructed to perform one or more of the computer-executable instructions described below. Accordingly, the terms "computer" and "controller" are used generically herein to refer to any data processor, and may include internet appliances and hand-held devices (including palm-top computers, wearable computers, cellular or mobile phones, multi-processor systems, processor-based or programmable consumer electronics, network computers, minicomputers, and the like). The information processed by these computers and controllers may be presented at any suitable display medium, including a Liquid Crystal Display (LCD). Instructions for performing computer-or controller-executable tasks may be stored in or on any suitable computer-readable medium including hardware, firmware, or a combination of hardware and firmware. The instructions may be embodied in any suitable memory device, including for example, a flash drive, a USB device, and/or other suitable media, including tangible, non-transitory computer-readable media.
The terms "coupled" and "connected," along with their derivatives, may be used herein to describe a structural relationship between components. It should be understood that these terms are not intended as synonyms for each other. Rather, in particular embodiments, "connected" may be used to indicate that two or more elements are in direct contact with each other. Unless otherwise apparent from the context, the term "coupled" may be used to indicate that two or more elements are in direct or indirect contact with each other (with other intervening elements between them), or that two or more elements cooperate or interact with each other (e.g., interact in a causal relationship, such as for signal transmission/reception or for function calls), or both.
Suitable environment
Fig. 1 is an illustration of an example environment in which a robotic system 100 transports objects in accordance with one or more embodiments of the present technique. The robotic system 100 may include and/or communicate with one or more units (e.g., robots) configured to perform one or more tasks. Aspects of object detection/updating may be practiced or implemented by various units.
For the example shown in fig. 1, the robotic system 100 may include and/or communicate with an unloading unit 102, a transfer unit 104 (e.g., a palletizing robot and/or a picking robot), a transport unit 106, a loading unit 108, or combinations thereof in a warehouse or a distribution/shipping hub. Each unit in the robotic system 100 may be configured to perform one or more tasks. The tasks may be combined in sequence to perform targeted operations, such as unloading an object from a truck or van and storing the object in a warehouse, or unloading an object from a storage location and preparing the object for shipment. For another example, the task may include placing an object at the task location (e.g., on top of the pallet and/or inside the cabinet/cage/box/bin). As described below, the robotic system may derive a plan for placing and/or stacking objects (e.g., a placement location/orientation, a sequence for transferring objects, and/or a corresponding motion plan). Each of the units may be configured to perform a sequence of actions (e.g., by operating one or more components therein) to perform a task according to one or more of the derived plans.
In some embodiments, the task may include manipulating (e.g., moving and/or reorienting) the target object 112 (e.g., one of a package, a box, a case, a cage, a pallet, etc., corresponding to the task being performed), such as moving the target object 112 from the starting location 114 to the task location 116. For example, the unloading unit 102 (e.g., an unpacking robot) may be configured to transfer the target object 112 from a location in a vehicle (e.g., a truck) to a location on a conveyor belt. Moreover, the transfer unit 104 may be configured to transfer the target object 112 from one location (e.g., a conveyor belt, pallet, or cabinet) to another location (e.g., a pallet, cabinet, etc.). For another example, the transfer unit 104 (e.g., a palletizing robot) may be configured to transfer the target objects 112 from a source location (e.g., a pallet, a picking area, and/or a conveyor) to a destination pallet. Upon completion of the operation, the transport unit 106 may transfer the target objects 112 from the area associated with the transfer unit 104 to the area associated with the loading unit 108, and the loading unit 108 may transfer the target objects 112 from the transfer unit 104 to a storage location (e.g., a location on a shelf) (e.g., by moving a pallet carrying the target objects 112). Details regarding the tasks and associated actions/computations are described below.
For illustrative purposes, the robotic system 100 is described in the context of a shipping center; however, it should be understood that the robotic system 100 may be configured to perform tasks in other environments/for other purposes (such as for manufacturing, assembly, packaging, healthcare, and/or other types of automation). It should also be understood that the robotic system 100 may include and/or communicate with other units not shown in fig. 1, such as manipulators, service robots, modular robots, and the like. For example, in some embodiments, other units may include: an unstacking unit for transferring objects from a cage car or pallet onto a conveyor or other pallet; a container switching unit for transferring an object from one container to another container; a packaging unit for packaging an object; a sorting unit for grouping the objects according to one or more characteristics of the objects; a pick-up unit for manipulating (e.g., for sorting, grouping, and/or transferring) the objects differently depending on one or more characteristics of the objects; or a combination thereof.
The robotic system 100 may include a controller 109 coupled to a physical or structural member (e.g., a robotic manipulator arm) connected at joints for movement (e.g., rotational and/or translational displacement). The controller 109 may include devices and/or circuitry (e.g., one or more processors and/or one or more memories) configured to control one or more aspects of implementing tasks. The structural members and joints may form a kinematic chain configured to manipulate an end effector (e.g., gripper) configured to perform one or more tasks (e.g., gripping, spinning, welding, etc.) in accordance with the use/operation of the robotic system 100. The robotic system 100 may include and/or be in communication with actuation devices (e.g., motors, actuators, wires, artificial muscles, electroactive polymers, etc.) configured to drive or manipulate (e.g., displace and/or reorient) the structural members about or at the corresponding joints (e.g., via the controller 109). In some embodiments, the robotic unit may include a transport motor configured to transport the corresponding unit/chassis from one place to another.
The robotic system 100 may include and/or be in communication with sensors configured to obtain information for accomplishing tasks, such as for manipulating structural members and/or transporting robotic units. The sensors may include devices configured to detect or measure one or more physical properties of the robotic system 100 (e.g., the state, condition, and/or position of one or more structural members/joints thereof) and/or the surrounding environment. Some examples of sensors may include accelerometers, gyroscopes, force sensors, strain gauges, tactile sensors, torque sensors, position encoders, and the like.
For example, in some embodiments, the sensor may include one or more imaging devices (e.g., visual and/or infrared cameras, 2D and/or 3D imaging cameras, distance measuring devices, such as lidar or radar, etc.) configured to detect the surrounding environment. The imaging device may generate a representation of the detected environment, such as a digital image and/or a point cloud, which may be processed by machine/computer vision (e.g., for automated inspection, robot guidance, or other robotic applications). The robotic system 100 may process the digital image and/or point cloud to identify the target object 112, the starting location 114, the task location 116, the pose of the target object 112, or a combination thereof.
For manipulating the target object 112, the robotic system 100 may capture and analyze images of a designated area (e.g., a pickup location, such as the interior of a truck or on a conveyor belt) to identify the target object 112 and its starting location 114. Similarly, the robotic system 100 may capture and analyze an image of another designated area (e.g., a drop location for placing objects on a conveyor, a location for placing objects inside a container, or a location on a pallet for stacking purposes) to identify the task location 116. For example, the imaging device may include one or more cameras configured to generate images of the pick-up area, and/or one or more cameras configured to generate images of the task area (e.g., the drop zone). Based on the captured images, the robotic system 100 may determine a starting location 114, a task location 116, associated gestures, a packing/placement plan, a transfer/packing order, and/or other processing results, as described below.
For example, in some embodiments, the sensors may include position sensors (e.g., position encoders, potentiometers, etc.) configured to detect the position of structural members (e.g., robotic arms and/or end effectors) and/or corresponding joints of the robotic system 100. The robotic system 100 may use position sensors to track the position and/or orientation of structural members and/or joints during task performance.
The robotic system 100 may also include or be communicatively coupled to one or more devices 111 separate from the controller 109. For example, the additional devices 111 (e.g., one or more computing devices or subsystems) may include a warehouse management system that oversees overall management of locations and/or inventory records, a delivery coordination system (e.g., an Automated Guided Vehicle (AGV) control system, a conveyor control system, etc.), a sequencer that controls the order of tasks or associated objects, a motion planning system that derives a motion plan for each task, and so forth. In other embodiments, the controller 109 may be configured to perform one or more such functions of the devices 111 (e.g., sequence derivation, motion planning, etc.).
Robot system
Fig. 2 is a block diagram illustrating components of the robotic system 100 in accordance with one or more embodiments of the present technique. For example, in some embodiments, the robotic system 100 (e.g., at one or more of the aforementioned units or components and/or robots) may include electronic/electrical devices, such as one or more processors 202, one or more storage devices 204, one or more communication devices 206, one or more input-output devices 208, one or more actuation devices 212, one or more transport motors 214, one or more sensors 216, or a combination thereof. The various devices may be coupled to one another via wired and/or wireless connections. For example, one or more of the units/components and/or robotic units used in the robotic system 100 may include a bus, such as a system bus, a Peripheral Component Interconnect (PCI) bus or a PCI Express bus, a HyperTransport or Industry Standard Architecture (ISA) bus, a Small Computer System Interface (SCSI) bus, a Universal Serial Bus (USB), an IIC (I2C) bus, or an Institute of Electrical and Electronics Engineers (IEEE) standard 1394 bus (also known as "FireWire"). Also, for example, the robotic system 100 may include and/or communicate with bridges, adapters, controllers, or other signal-related devices for providing wired connections between devices. The wireless connection may be based on, for example, a cellular communication protocol (e.g., 3G, 4G, LTE, 5G, etc.), a wireless Local Area Network (LAN) protocol (e.g., wireless fidelity (Wi-Fi)), a peer-to-peer or inter-device communication protocol (e.g., Bluetooth, Near-Field Communication (NFC), etc.), an Internet of Things (IoT) protocol (e.g., NB-IoT, Zigbee, Z-Wave, LTE-M, etc.), and/or other wireless communication protocols.
Processor 202 may include a data processor (e.g., a Central Processing Unit (CPU), a special purpose computer, and/or an in-vehicle server) configured to execute instructions (e.g., software instructions) stored on a storage device 204 (e.g., computer memory). The processor 202 may implement program instructions to control/interact with other devices to cause the robotic system 100 to perform actions, tasks, and/or operations.
The storage 204 may include a non-transitory computer-readable medium having program instructions (e.g., software) stored thereon. Some examples of storage 204 may include volatile memory (e.g., cache memory and/or Random Access Memory (RAM)) and/or non-volatile memory (e.g., flash memory and/or a disk drive). Other examples of storage 204 may include portable memory drives and/or cloud storage.
In some embodiments, the storage 204 may be used to further store and provide access to processing results, templates (e.g., shape templates, gripping kits, etc.), and/or predetermined data/thresholds. For example, the storage 204 may include a Registration Database System (RDS) that stores registration records 232 (also referred to as primary data). Each registration record 232 may include a description of a corresponding object (e.g., a box, a case, a container, and/or a product) that may be manipulated by the robotic system 100. For one or more of the objects expected to be manipulated by the robotic system 100, the registration record 232 may include one or more physical characteristics or attributes, such as size, shape (e.g., a template for a potential pose and/or a computer-generated model for recognizing objects in different poses), color scheme, image, identification information (e.g., a barcode, a Quick Response (QR) code, a logo, etc., and/or an expected location thereof), expected mass or weight, or any combination thereof. The registration records 232 may also include manipulation-related information about the object, such as the CoM position on each of the objects, one or more templates, expected sensor measurements (e.g., force, torque, pressure, and/or contact measurements) corresponding to one or more actions/maneuvers, visual data (e.g., reference radar/lidar data), or any combination thereof.
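For illustration only, a registration record of this kind might be modeled as follows in Python; every field name here is an assumption rather than the patent's schema:

    from dataclasses import dataclass
    from typing import Optional, Tuple

    @dataclass
    class RegistrationRecord:
        object_id: str
        length_mm: float
        width_mm: float
        height_mm: float
        expected_weight_kg: float
        # CoM position relative to a reference corner of the object.
        com_offset_mm: Tuple[float, float, float]
        surface_image: Optional[bytes] = None  # template for texture matching
        barcode: Optional[str] = None
        # Expected horizontal torque at the nominal grip pose (near zero when
        # the grip is placed directly over the CoM).
        expected_torque_nm: Tuple[float, float] = (0.0, 0.0)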
The storage device 204 may also store object tracking data. In some embodiments, the object tracking data may include a log of the objects that were scanned or manipulated. In some embodiments, the object tracking data may include image data (e.g., pictures, point clouds, real-time video feeds, etc.) of the object at one or more locations (e.g., designated pick or drop locations and/or conveyor belts). In some embodiments, the object tracking data may include a location and/or an orientation of the object at one or more locations.
The communication device 206 may include circuitry configured to communicate with an external or remote device via a network. For example, the communication device 206 may include a receiver, transmitter, modulator/demodulator (modem), signal detector, signal encoder/decoder, connector port, network card, and the like. The communication device 206 may be configured to send, receive, and/or process electrical signals according to one or more communication protocols (e.g., internet Protocol (IP), wireless communication protocols, etc.). In some embodiments, the robotic system 100 may use the communication device 206 to exchange information between units of the robotic system 100 and/or to exchange information with systems or devices external to the robotic system 100 (e.g., for reporting, data collection, analysis, and/or troubleshooting purposes).
Input-output devices 208 may include user interface devices configured to communicate information to and/or receive information from a human operator. For example, input-output devices 208 may include a display 210 and/or other output devices (e.g., speakers, haptic circuitry, or haptic feedback devices, etc.) for communicating information to a human operator. Also, input-output devices 208 may include control or receiving devices such as a keyboard, mouse, touch screen, microphone, user Interface (UI) sensors (e.g., a camera for receiving motion commands), wearable input devices, and the like. In some embodiments, the robotic system 100 may use the input-output device 208 to interact with a human operator in performing an action, task, operation, or a combination thereof.
In some embodiments, a controller (e.g., controller 109 of fig. 1) may include a processor 202, a storage device 204, a communication device 206, and/or an input-output device 208. The controller may be a separate component or part of a unit/assembly. For example, each of the unloading units, transfer assemblies, transport units, and loading units of the system 100 may include one or more controllers. In some embodiments, a single controller may control multiple units or independent components.
The robotic system 100 may include and/or communicate with physical or structural members (e.g., robotic manipulator arms) connected at joints for movement (e.g., rotational and/or translational displacement). The structural members and joints may form a kinematic chain configured to manipulate an end effector (e.g., a gripper) configured to perform one or more tasks (e.g., gripping, spinning, welding, etc.) in accordance with the use/operation of the robotic system 100. The kinematic chain may include actuation devices 212 (e.g., motors, actuators, wires, artificial muscles, electroactive polymers, etc.) configured to drive or manipulate (e.g., displace and/or reorient) the structural members about or at the corresponding joints. In some embodiments, the kinematic chain may include transport motors 214 configured to transport the corresponding unit/chassis from one location to another. For example, the actuation devices 212 and the transport motors 214 may be coupled to or form part of a robotic arm, a linear slide, or another robotic component.
The sensors 216 may be configured to obtain information for accomplishing a task, such as for manipulating a structural member and/or for transporting a robotic unit. The sensors 216 may include devices configured to detect or measure one or more physical properties of the controller, the robotic unit (e.g., the state, condition, and/or position of one or more structural members/joints thereof), and/or the surrounding environment. Some examples of sensors 216 may include contact sensors, proximity sensors, accelerometers, gyroscopes, force sensors, strain gauges, torque sensors, position encoders, pressure sensors, vacuum sensors, and the like.
In some embodiments, for example, the sensor 216 may include one or more imaging devices 222 (e.g., 2-dimensional and/or 3-dimensional imaging devices) configured to detect the surrounding environment. The imaging device may include a camera (including a visual and/or infrared camera), a lidar device, a radar device, and/or other ranging or detection devices. The imaging device 222 may generate a representation (such as a digital image and/or a point cloud) of the detected environment for enabling machine/computer vision (e.g., for automated inspection, robot guidance, or other robotic applications).
Referring now to fig. 1 and 2, the robotic system 100 (e.g., via the processor 202) may process the image data and/or the point cloud to identify the target package 112, the starting location 114, the task location 116, the pose of the target package 112, or a combination thereof. The robotic system 100 may use image data from the imaging device 222 to determine how to access and pick objects. The image of the object can be analyzed to determine a motion plan for positioning the vacuum gripper assembly to grip the target object. The robotic system 100 (e.g., via various units) may capture and analyze images of a designated area (e.g., the interior of a truck, the interior of a container, or the pick-up location of an object on a conveyor belt) to identify the target package 112 and its starting location 114. Similarly, the robotic system 100 may capture and analyze an image of another designated area (e.g., a drop location for placing objects on a conveyor belt, a location for placing objects inside a container, or a location on a pallet for stacking purposes) to identify the task location 116.
Also, for example, the sensors 216 may include position sensors 224 (e.g., position encoders, potentiometers, etc.) configured to detect the position of structural members (e.g., robotic arms and/or end effectors) and/or corresponding joints of the robotic system 100. The robotic system 100 may use the position sensors 224 to track the position and/or orientation of the structural members and/or joints during task performance. The unloading unit, transfer unit, transport unit/assembly, and loading unit disclosed herein may include a sensor 216.
In some embodiments, the sensors 216 may include one or more force sensors 226 (e.g., weight sensors, strain gauges, piezoresistive/piezoelectric sensors, capacitive sensors, and/or other tactile sensors) configured to measure the force applied to the kinematic chain, such as at the end effector. For example, the sensors 216 may be used to determine a load (e.g., a gripped object) on the robotic arm. The force sensors 226 may be attached to or around the end effector and configured such that the resulting measurements represent the weight of the gripped object and/or a torque vector relative to a reference position. In one or more embodiments, the robotic system 100 can process the torque vector, weight, and/or other physical characteristics (e.g., dimensions) of the object to estimate the CoM of the gripped object.
Robot transfer assembly
Fig. 3 illustrates the transfer assembly 104 in accordance with one or more embodiments of the present technology. The transfer assembly 104 may include an imaging system 160 and a robotic arm system 132. The imaging system 160 may provide image data captured from the target environment, which includes the depalletizing platform 110. The robotic arm system 132 may include a robotic arm assembly 139 and an end effector 140 (e.g., a gripper assembly). The robotic arm assembly 139 may position the end effector 140 over a set of objects in the stack 165 located in the picking environment 163.
Fig. 3 shows the end effector 140 carrying a single object or package 112 ("package 112") positioned above the conveyor 120. The end effector 140 may release the package 112 onto the task location 116 of fig. 1 (such as the conveyor belt 120), and the robotic arm system 132 may then retrieve the packages 112a, 112b by positioning the unloaded end effector 140 directly over both packages 112a, 112b. The end effector 140 may then hold one or more of the packages 112a, 112b via vacuum grippers, and the robotic arm system 132 may carry the held package 112a and/or 112b to a location directly above the conveyor 120. The end effector 140 may then release (e.g., simultaneously or sequentially) the packages 112a, 112b onto the conveyor 120. This process may be repeated any number of times to carry objects from the stack 165 to the conveyor 120.
With continued reference to fig. 3, the depalletizing platform 110 may include any platform, surface, and/or structure upon which a plurality of objects or packages 112 (referred to simply as "packages 112") may be stacked and/or staged and prepared for transport. The imaging system 160 may include one or more imaging devices 161 configured to capture image data of the packages 112 on the depalletizing platform 110. The imaging devices 161 may capture range data, position data, video, still images, lidar data, radar data, and/or motion at the pickup environment 163. It should be noted that although the terms "object" and "package" are used herein, the terms include any other item capable of being gripped, lifted, transported, and delivered, such as, but not limited to, a "box," a "carton," or any combination thereof. Further, while polygonal boxes (e.g., rectangular boxes) are shown in the figures disclosed herein, the shapes of the boxes are not limited to such shapes but include any regular or irregular shape that can be gripped, lifted, transported, and delivered.
As with the depalletizing platform 110, the receiving conveyor 120 may include any platform, surface, and/or structure designated to receive a package 112 for further tasks/operations. In some embodiments, the receiving conveyor 120 may include a conveyor system for transporting the packages 112 from one location (e.g., a release point) to another location for further operations (e.g., sorting and/or storage).
Examples of error identification
The robotic system 100 may be configured to respond to and process any misidentification or misdetection of the packages 112. Fig. 4 illustrates an example top view 400 of a pair of packages 112a and 112b (e.g., imaging data from the imaging system 160), according to one or more embodiments of the present technology. Referring to figs. 3 and 4 together, the imaging system 160 may include one or more downward-facing cameras configured to obtain corresponding overhead views of the packages 112a and 112b. Thus, when the packages 112a and 112b have similar lateral dimensions (e.g., length and width), the robotic system 100 may erroneously identify the package. For example, the overhead view image 400 may cause the robotic system 100 to falsely identify the package 112a as the package 112b. Consequently, the robotic system 100 may derive a motion plan for the package 112a that erroneously accounts for the height 192b instead of the height 192a, the CoM 194b instead of the CoM 194a, etc. For example, the motion plan may correspond to a gripping location 196 that corresponds to (e.g., overlaps) the CoM 194b instead of the CoM 194a.
The robotic system 100 may implement at least an initial portion of the resulting motion plan, such as by grasping the package 112a at the gripping location 196 and/or lifting the package 112a. The robotic system 100 may be configured to obtain a set of force measurements while and/or after initially lifting the package 112a. The obtained measurements may include the actual weight of the package 112a and/or a vector 197 (e.g., a torque vector) corresponding to the difference between the actual CoM 194a and the gripping location 196. The robotic system 100 may be configured to compare the obtained sensor output corresponding to the package 112a to an expected sensor output (e.g., the registered characteristics of the package 112b). When the compared outputs do not match, the robotic system 100 may determine a detection error.
In some embodiments, the robotic system 100 may calculate a difference between an expected value and an obtained value of the sensor output. When the calculated difference exceeds a predetermined maximum error condition, the robotic system 100 may signal a fault condition and/or notify an operator. Alternatively or additionally, the robotic system 100 may continue the existing motion plan when the difference is below a predetermined minimum error condition.
The robotic system 100 may be configured to reprocess the package 112a under certain conditions, such as when the difference satisfies a predetermined range or template. For example, the robotic system 100 may derive the estimated geometry 198 based on the vector 197, the expected dimensions, and/or the gripping location 196. The robotic system 100 may derive the estimated geometry 198 such that the actual CoM 194a (e.g., as estimated from the vector 197) is located in a central portion of the estimated geometry 198. Moreover, the robotic system 100 may manipulate the robotic arm and/or implement a process to determine the actual height 192a of the package 112a. The robotic system 100 may use the estimated geometry, the estimated CoM, and/or the actual height to process the package 112a. For example, the robotic system 100 may use the updated measurements or parameters to re-derive a new/replacement motion plan and/or adjust an existing motion plan, such as by increasing the lift height, decreasing the travel speed, or releasing the package 112a and re-gripping it at the actual CoM 194a. The robotic system 100 may also update the registration records 232, such as by creating a new record for the package 112a and/or updating the record for the package 112b with the latest results (e.g., by replacing the height, weight, CoM, etc.).
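The thresholded response described in this and the preceding paragraphs can be summarized with a small decision helper; a minimal sketch, assuming scalar difference metrics and illustrative action labels:

    # Map a measurement difference to one of three responses: negligible
    # differences keep the current plan, large ones raise a fault, and the
    # middle range triggers re-detection and re-planning.
    def respond_to_mismatch(diff, min_threshold, max_threshold):
        if diff < min_threshold:
            return "continue_existing_plan"    # negligible mismatch
        if diff > max_threshold:
            return "signal_fault_and_notify"   # outside recoverable range
        return "update_detection_and_replan"   # adjust or re-derive the plan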
Example system connections
The robotic system 100 may implement a set of functions when initially handling the packages 112 and/or in response to false object detection. Fig. 5 is a functional block diagram of the robotic system 100 in accordance with one or more embodiments of the present technique. The robotic system 100 may include a detection module 502, a planning module 504, and/or an implementation module 506 configured to process the packages 112 and respond to detection errors. The robotic system 100 may implement the detection module 502, the planning module 504, and/or the implementation module 506 using the controller 109 of fig. 1, one or more of the devices 111 of fig. 1, one or more of the processors 202 of fig. 2, or a combination thereof.
The detection module 502 may be configured to detect one or more of the packages 112. In some implementations, the detection module 502 can process the image data 510 received from the imaging devices 161. The detection module 502 may process the image data by extracting features, such as 2D/3D edges, corners, textures (e.g., surface images or designs), etc., from the image data 510. The detection module 502 may compare the extracted features to the registration records 232 of fig. 2 to recognize or identify the registered objects depicted in the image data 510. The detection module 502 may also identify the locations of one or more of the objects depicted in the image data 510 (e.g., the identified registered objects). For example, the detection module 502 may identify/verify that a set of edges belongs to a particular object, while other edges may belong to other/blocked objects. The detection module 502 may generate a detection result (e.g., the initial result 512) for each detected object. The detection result may include the identity of the detected object, the position/pose of the detected object, and/or other details about the detected object.
The detection module 502 may include an auto-registration module 522 configured to automatically (e.g., without user input) register unrecognized and/or unregistered objects. For example, the detection module 502 may determine that (1) a set of edges satisfies one or more predetermined conditions to be considered an object (e.g., an exposed or accessible object), but (2) one or more physical features of the object do not sufficiently match the corresponding features of the registered objects reflected in the registration records 232. The auto-registration module 522 may automatically create a registration record for the unrecognized or unregistered object using one or more of the computed results, such as a portion of the image data 510 that reflects the visual characteristics of the object, one or more dimensions of the object, and so forth. In some embodiments, the detection module 502, the auto-registration module 522, or a combination thereof may use information obtained during the transfer of the object to update the detection results and/or to create or update a registration record. Details regarding the update are described below.
The planning module 504 may process the detection results received from the detection module 502 to derive a motion plan for each detected object. For example, the planning module 504 may identify a start location, a task location, a package travel path, corresponding maneuvers, corresponding settings/commands/timings, or a combination thereof for transferring each of the detected objects. In some embodiments, the planning module 504 may start from the task location and iteratively/incrementally advance toward the starting location, considering new locations along the way, to derive the motion plan. Accordingly, the planning module 504 may generate an initial motion plan 514 for each of the initial detection results 512.
The implementation module 506 may process the motion plans (e.g., the initial plans) received from the planning module 504 to implement the transfer of the corresponding objects. The implementation module 506 may interact with the robotic arm system 132 to implement the plans. For example, the implementation module 506 may generate and transmit control data 516 (e.g., commands and/or settings) corresponding to the motion plan, and the robotic arm system 132 may execute the motion plan according to the control data 516. Based on executing the motion plan, the robotic arm system 132 may transfer the corresponding object.
The robotic arm system 132 may provide implementation feedback 518 that includes one or more measurements obtained during the object transfer. For example, the robotic arm system 132 may measure weight, torque, and/or height (e.g., via profile images) after implementing an initial portion of the motion plan, such as after grasping and lifting the target object. The robotic system 100 may send the measurements, or derivations thereof, to the implementation module 506 and/or the detection module 502 via the implementation feedback 518. In some embodiments, the derivations may include, for example: (1) an estimated CoM position derived from torque measurements, or (2) an object height derived from profile images. In some embodiments, the implementation module 506 and/or the robotic arm system 132 may suspend the transfer while the implementation module 506 and/or the detection module 502 verifies the detection result. In other embodiments, the implementation module 506 and/or the robotic arm system 132 may implement an initial displacement (e.g., lifting and/or lateral movement of the package) to obtain the measurements, and then return/reposition the package until the detection result and motion plan are verified.
The robotic system 100 may use the implementation feedback 518 to verify the detection results. For example, if the detected object matches a registered object, the robotic system 100 may compare the measurements to the corresponding characteristics stored in the registration record for the target package. When the mismatch between the actual measurements and the expected data exceeds a predetermined condition, the robotic system 100 may determine that the initial detection result 512 is erroneous and/or initiate one or more error responses. For example, when the expected set of objects includes another object with a similar appearance and/or size, the detection module 502 may generate an updated detection result 524 that identifies the target object as the other/similar object. Alternatively or additionally, the detection module 502 may use the estimated geometry 198 of fig. 4 to generate the updated detection result 524. The detection module 502 may classify the target object as an unregistered object and provide the estimated geometry 198 as its footprint. Also, the auto-registration module 522 may use the estimated geometry 198 to register the unregistered object.
In response to identifying the detection error, the planning module 504 may generate an alternate motion plan 520 using the updated detection results 524. The implementation module 506 may use the alternate motion plan 520 instead of the initial motion plan 514 to transfer the target object.
Flow of operation
Fig. 6 is a flow diagram of a method 600 for operating a robotic system (e.g., robotic system 100 of fig. 1) according to one or more embodiments of the present disclosure. The method 600 may be used to determine and respond to false detection of a target object. The method 600 may be implemented based on executing instructions stored on one or more of the storage devices 204 of fig. 2 using one or more of the processors 202 of fig. 2. In implementing the motion plan and/or method 600, the processor 202 may send the motion plan or associated set/sequence of commands/settings to a robotic unit (e.g., the transfer assembly 104 of fig. 3 and/or the end effector 140 of fig. 3). Accordingly, transfer assembly 104 and/or end effector 140 may execute a motion plan to grasp and transfer the package.
At block 604, the robotic system 100 may obtain image data (e.g., 2D and/or 3D imaging results, such as the image data 510 of fig. 5). The robotic system 100 may obtain the image data 510 from the imaging system 160 of fig. 3. The obtained image data 510 may depict objects or packages (e.g., the stack 165 of fig. 3) at a starting location, such as the pick-up environment 163 of fig. 3. In some embodiments, the obtained image data may correspond to an overhead view of the packages.
At block 606, the robotic system 100 may detect a target object based on analyzing the obtained image data 510. The robotic system may generate a detection result, such as the initial result 512 of fig. 5. For example, the robotic system 100 may detect edges (e.g., via a Sobel filter) and identify corners or junctions between the edges to determine a continuous surface. The continuous surface may be used to estimate the package location and/or identification. The robotic system 100 may compare the texture of the surface image or area to the corresponding data of the registered objects in the registration records 232 to identify or recognize the package. Alternatively or additionally, the robotic system 100 may measure and/or calculate the dimensions (e.g., length and width) of the determined surface. The resulting dimensions may be compared to corresponding data in the registration records 232 to identify the package. When the analyzed texture and/or dimensions depicted in the image data 510 match corresponding features of a registered object, the robotic system 100 may use the determined identity, location, and/or dimensions of the surface to generate a detection result. In other words, the robotic system 100 may generate a detection result by mapping the matched features of the registered object to the matched portion of the image data.
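A minimal sketch of this detection step, assuming OpenCV and NumPy are available and that registered records carry length/width fields (all names and thresholds are illustrative), might look like the following; a production detector would add texture matching, pose estimation, and occlusion handling:

    import cv2
    import numpy as np

    def detect_candidates(gray_image, records, mm_per_pixel, dim_tol_mm=10.0):
        # Sobel gradients in x and y give an edge-strength map.
        gx = cv2.Sobel(gray_image, cv2.CV_64F, 1, 0, ksize=3)
        gy = cv2.Sobel(gray_image, cv2.CV_64F, 0, 1, ksize=3)
        edges = (np.hypot(gx, gy) > 100).astype(np.uint8)  # illustrative threshold
        contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        matches = []
        for contour in contours:
            x, y, w, h = cv2.boundingRect(contour)
            length, width = sorted((w * mm_per_pixel, h * mm_per_pixel),
                                   reverse=True)
            for rec in records:  # rec: dict with 'id', 'length_mm', 'width_mm'
                if (abs(length - rec["length_mm"]) <= dim_tol_mm
                        and abs(width - rec["width_mm"]) <= dim_tol_mm):
                    matches.append((rec["id"], (x, y, w, h)))
        return matches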
At block 608, the robotic system 100 may derive one or more motion plans. The robotic system 100 may use the planning module 504 of fig. 5 (e.g., a device/system separate from the detection module 502, the implementation module 506, and/or the controller 109 of fig. 1 of fig. 5). In some embodiments, the robotic system 100 may process one motion plan for one parcel in each iteration. In other embodiments, the robotic system 100 may derive a plurality of motion plans for corresponding subsets of packages.
The robotic system 100 may initiate the motion planning process by providing the detection results from the detection module 502 to the planning module 504. The planning module 504 may use the received detection results to derive a motion plan, such as by estimating and evaluating paths of the target package and/or the robotic assembly. For example, the robotic system 100 may begin analyzing from the package location (as indicated by the detection results) and derive a gripping pose of the robotic component according to a predetermined set of rules (e.g., ranges of motion of the joints, collision avoidance rules, etc.). The robotic system 100 may use the gripping pose as a starting point and iteratively derive travel segments that satisfy a predetermined set of rules, such as for collision avoidance and/or minimizing travel time. As a more specific example, the robotic system 100 may derive a first segment for lifting the package until an expected collision or a desired height corresponding to the expected height of the target package and the overall height of the stack or container walls. The robotic system 100 may derive a lateral segment from the end of the first segment toward the task location, until an expected collision or the lateral position of the task location. The robotic system 100 may repeat the iterative process to derive a combination of path segments that avoids collisions and connects the starting location to the task location. In some embodiments, the robotic system 100 may derive the combination of segments in reverse order, such as from a derived placement pose of the package at the task location to the current pose of the package at the starting location. The robotic system 100 may translate the combination of segments into corresponding commands and/or settings to operate the robotic unit or components therein (e.g., actuators, motors, etc.). In some embodiments, the robotic system 100 may derive a set of candidate motion plans and corresponding metrics (e.g., overall transfer times). The robotic system may derive the final motion plan by selecting one or more candidate plans that optimize the corresponding metric.
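The lift-then-lateral segment derivation can be sketched as follows; the waypoint format, clearance margin, and the absence of explicit collision checking are assumptions made for illustration:

    # Derive a simple lift / lateral / lower waypoint sequence. Real planners
    # would check each segment against a collision model and iterate.
    def derive_segments(start_xyz, goal_xyz, stack_height, lift_margin=0.1):
        clearance_z = max(start_xyz[2], goal_xyz[2], stack_height) + lift_margin
        lift = (start_xyz[0], start_xyz[1], clearance_z)   # first segment: lift
        lateral = (goal_xyz[0], goal_xyz[1], clearance_z)  # lateral transfer
        return [start_xyz, lift, lateral, goal_xyz]        # lower into place

    # Usage: the waypoints feed the command/setting translation step above.
    plan = derive_segments((0.0, 0.0, 0.3), (1.2, 0.8, 0.2), stack_height=0.5)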
At block 610, the robotic system 100 (e.g., via the implementation module 506) may begin implementing the derived motion plan. For example, the processor 202, the implementation module 506, and/or the controller 109 may initiate implementation of the motion plan (e.g., the initial plan 514) by sending the motion plan, an initial portion thereof, and/or the corresponding commands/settings to a robotic unit (e.g., the transfer unit 104 of fig. 1). The robotic unit may execute the received plan or the corresponding commands/settings, such as by gripping and lifting the target package to transfer it from the start location 114 to the task location 116.
At block 612, the robotic system 100 may obtain additional sensor data after or while implementing the motion plan or the first/initial portion thereof. For example, the robotic system 100 may obtain weight, height, and/or torque data (e.g., weight vectors and/or different representations of the CoM 194b) from corresponding sensors at or around the end effector. Moreover, the robotic system 100 may obtain height data of the object, such as by lifting the object until a crossing/clearing event is detected at a line sensor located above the starting position, and/or by obtaining and analyzing side-view images of the stack.
In some embodiments, the robotic system 100 may derive an estimate of the actual CoM (e.g., the CoM 194b) based on the measured weight, torque, weight vector, or a combination thereof. For example, the robotic system 100 may use a predetermined process or equation that calculates an estimated position of the CoM (e.g., relative to the gripping position) based on the measured weight and a horizontal component (e.g., direction and/or corresponding magnitude) of the weight vector/torque measurement.
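For a weight-vector-based estimate, one standard relation is the cross product between the CoM offset and the gravity force: with a purely vertical weight of magnitude W applied at horizontal offset (x, y) from the grip point, the torque about the gripper satisfies τx = −yW and τy = xW under one common sign convention. The sketch below applies that relation; the exact sensor frame and sign conventions are assumptions, not details from the disclosure.

```python
def estimate_com_offset(weight_n, torque_x, torque_y):
    """Hypothetical sketch of the CoM estimate in block 612:
    torque = offset x force, with force = (0, 0, -weight_n), gives
        torque_x = -offset_y * weight_n
        torque_y =  offset_x * weight_n
    Signs depend on the sensor frame (an assumption here)."""
    if weight_n <= 0:
        raise ValueError("package must be lifted before estimating the CoM")
    offset_x = torque_y / weight_n   # meters from the gripping position
    offset_y = -torque_x / weight_n
    return offset_x, offset_y
```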
At decision block 614, the robotic system 100 may compare the obtained data to the corresponding entries in the registration record 232 for the identified target object. For example, the robotic system 100 may compare the obtained weight data to the expected weight data in the registration record 232 of the identified target object. Also, the robotic system 100 may use the obtained weight and/or torque data and the actual grip position to estimate the actual CoM of the gripped object. The robotic system 100 may compare the actual CoM with the expected CoM of the identified target object. In some embodiments, the robotic system 100 may compare the obtained height to an expected height.
As an illustrative example, the robotic system 100 may be configured to derive from the detection results a gripping position centered on or located directly above the expected CoM. Thus, the robotic system 100 may compare the weight vector and/or torque measurements from the end effector to predetermined verification thresholds to determine whether the package was accurately detected. When the estimate of the actual CoM differs from the expected CoM, the torque or the horizontal component of the weight vector may exceed the verification threshold, thus allowing the robotic system 100 to determine a false detection. Alternatively or additionally, the robotic system 100 may test for false detection errors using differences between the expected weight, height, and/or lateral dimensions of the detected object and the measured weight, height, and/or lateral dimensions (e.g., obtained via images captured after initially displacing the package).
If the obtained data matches the expected data, the robotic system 100 may complete the existing motion plan, as shown in block 616. If the obtained data does not match the expected data, the robotic system 100 may determine a false detection of the target/currently gripped package. At block 618, the robotic system 100 may calculate a measurement difference between the corresponding values for subsequent evaluation/processing. For example, the robotic system 100 may compare the measurement difference to a predetermined minimum threshold, as shown in decision block 620. The minimum threshold may correspond to a negligible difference (e.g., a difference in weight, CoM location, etc.) between the expected values for the detected object and the actual real-time measurements corresponding to the package. The minimum threshold may be used to decide whether to continue implementing the existing/originally provided motion plan. Thus, when the measurement difference is below such a minimum, the robotic system 100 may continue/complete the implementation of the existing motion plan despite the detected false detection of the package, as shown in block 616.
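The threshold logic of decision blocks 614, 620, and 622 (the latter described below) can be summarized as a three-way classification. The sketch reduces the comparison to a single scalar; in practice the system compares weight, CoM, height, and/or lateral dimensions, each against its own thresholds, which are assumptions here.

```python
def classify_measurement(measured, expected, min_threshold, max_threshold):
    """Hypothetical sketch of decision blocks 614/620/622."""
    difference = abs(measured - expected)
    if difference < min_threshold:
        return "continue_plan"       # negligible: finish plan (block 616)
    if difference > max_threshold:
        return "error_process"       # unrecoverable (block 624)
    return "derive_replacement"      # recoverable (blocks 626-630)
```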
When the measurement difference is significant (e.g., greater than the minimum threshold and non-negligible), the robotic system 100 may perform one or more automatic responses, such as deriving and implementing an alternative/replacement motion plan, before or in preparation for further remedial responses. In some embodiments, the robotic system 100 (e.g., via the controller 109 and/or the implementation module 506) may suspend implementation of the initial motion plan in response to the false detection. In other embodiments, the robotic system 100 (e.g., via the controller 109 and/or the implementation module 506) may implement an automatic replacement operation in response to a sufficient measurement difference. For example, the controller 109 may track progress along the initial motion plan by storing the executed commands/settings and/or the traversed portions of the motion plan. The controller 109 may implement the automatic replacement operation by reversing the stored progress, such as by implementing the stored commands/settings in the reverse direction and/or in the reverse order. Additionally or alternatively, the controller 109 may release the gripped package after the initial lifting. Thus, the controller 109 may reposition the package at the starting location in preparation for deriving and implementing an alternative/replacement motion plan.
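One way to realize the progress tracking and reversal just described is sketched below. The robot interface (apply, inverted, release_gripper) is an assumed abstraction for illustration, not the disclosed controller API.

```python
class ProgressTracker:
    """Hypothetical sketch of the automatic replacement operation:
    store executed commands and replay them in reverse to reposition
    the package at the starting location."""

    def __init__(self, robot):
        self.robot = robot
        self.executed = []  # traversed portions of the motion plan

    def execute(self, command):
        self.robot.apply(command)       # assumed single-command interface
        self.executed.append(command)

    def reverse_progress(self):
        # Implement the stored commands/settings in reverse order and
        # direction, then release the package at the starting location.
        for command in reversed(self.executed):
            self.robot.apply(command.inverted())  # assumed helper
        self.executed.clear()
        self.robot.release_gripper()
```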
Upon determining an appropriate response to the false detection, the robotic system 100 may also compare the measurement difference to a predetermined maximum threshold, as shown in decision block 622. The maximum threshold may correspond to a limit for recoverable errors, such as errors that can be addressed by re-deriving and re-implementing a motion plan. When the measurement difference exceeds the maximum threshold, the robotic system 100 may implement an error process, as shown in block 624. Some examples of the error process may include releasing or repositioning the target package, notifying a human operator, and so forth.
For certain types of recoverable false detections (not shown), the robotic system 100 may dynamically adjust and complete the originally provided motion plan. For example, the robotic system 100 may use the controller 109 and/or the implementation module 506 to dynamically adjust one or more aspects of the initially provided motion plan, without using the planning module 504, to derive an alternative motion plan. Some examples of dynamic adjustment may include (1) adjusting the end effector/package height by at least the calculated height difference, and/or (2) adjusting the speed according to the weight difference and according to a predetermined method/equation.
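A minimal sketch of such in-place adjustments, assuming a simple plan object with waypoints and a speed setting (both hypothetical), is shown below; the inverse-weight speed rule stands in for the predetermined method/equation mentioned above.

```python
def adjust_plan_in_place(plan, height_diff_m, weight_ratio):
    """Hypothetical sketch of dynamic adjustment: raise each waypoint by
    the measured height difference and slow down for a heavier package."""
    for waypoint in plan.waypoints:
        waypoint.z += height_diff_m             # (1) height correction
    plan.speed *= min(1.0, 1.0 / weight_ratio)  # (2) illustrative speed rule
    return plan
```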
In response to an addressable false detection, the robotic system 100 (e.g., via the controller 109 and/or one of the modules shown in fig. 5) may use the obtained actual sensor data to derive an alternate footprint, as shown in block 626. For example, the robotic system 100 may derive the estimated geometry 198 of fig. 4, such as by using the actual CoM and grip position as described above and/or one or more corners detected from the image data (e.g., a 3D image). The robotic system 100 may derive the estimated geometry 198 by enlarging or shrinking the initial footprint such that the actual/measured CoM is located in a central portion of the estimated geometry 198. For example, the robotic system 100 may generate a straight line using one of the 3D corners and the actual CoM as reference points and estimate an opposing corner located along the straight line. The robotic system 100 may estimate the opposing corner position as being opposite of and equidistant from the CoM relative to the reference 3D corner. To derive the alternate footprint, the robotic system 100 may extrapolate boundaries that (1) meet at right angles at the opposing corners, (2) define a rectangular shape, and (3) enclose the actual CoM.
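The corner-reflection construction in block 626 reduces to simple 2D geometry, sketched below under the assumption of an axis-aligned footprint; the disclosed system may equally handle rotated rectangles.

```python
def derive_alternate_footprint(corner_xy, com_xy):
    """Hypothetical sketch of block 626: reflect a detected 3D corner
    across the measured CoM to get the opposing corner, then bound a
    rectangle whose central portion contains the CoM."""
    cx, cy = com_xy
    # Opposing corner lies along the line through the CoM, equidistant.
    opposite = (2 * cx - corner_xy[0], 2 * cy - corner_xy[1])
    x_min, x_max = sorted((corner_xy[0], opposite[0]))
    y_min, y_max = sorted((corner_xy[1], opposite[1]))
    # Boundaries meet at right angles, define a rectangle, enclose the CoM.
    return (x_min, y_min, x_max, y_max)
```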
At block 628, the robotic system 100 may dynamically derive an updated motion plan (e.g., the alternate motion plan 520 of fig. 5), such as after partial implementation of the initial motion plan, when suspending or negating the plan as described above. The robotic system 100 may use the obtained actual sensor data and/or the alternate footprint to derive the alternate motion plan 520. In some embodiments, for example, the controller 109 and/or the implementation module 506 may analyze the implementation feedback 518 of fig. 5 and obtain a replacement plan based on sending a replacement request (e.g., via the implementation feedback 518) to the planning module 504. The request may include the actual sensor data and/or the alternate footprint. In other embodiments, the planning module 504 may receive and analyze the implementation feedback 518 and/or derive the alternate footprint. The planning module 504 may then derive the alternate motion plan 520 based on the received request or the analysis results. For example, the planning module 504 may operate as described above but use the alternate footprint, the estimate of the actual CoM, the measured dimensions (e.g., height and/or lateral dimensions), the measured weight, etc. to derive the alternate motion plan 520. In deriving the alternate motion plan 520, the planning module 504 may effectively treat the package as an unregistered object different from or unrelated to the registration record corresponding to the initial detection result. Accordingly, the planning module 504 may derive the alternate motion plan 520 corresponding to an alternate grip position, an alternate grip strength (e.g., a new number of suction cups for engaging the package), an alternate transfer height, an alternate transfer speed, or a combination thereof that differs from the corresponding component of the initial motion plan. Depending on the default response, the planning module 504 may derive the alternate motion plan 520 to start from the starting location of the repositioned package or from the pause location.
In some embodiments, the planning module 504 may derive the updated motion plan by considering other error modes for enlarged alternate footprints (e.g., double pick-ups, such as when two adjacent packages are inadvertently gripped by the end effector). Also, when the actual height is not initially obtained, the robotic system 100 may derive the updated motion plan to include a maximum lift height corresponding to the maximum height allowed for the robotic unit and/or the maximum height/size among the registered packages in the registration record 232. In some embodiments, the robotic system 100 may place the object in a predetermined area designated for unexpected packages. The planning module 504 may derive the alternate motion plan 520 to include instructions for re-gripping the package centered at the estimate of the actual CoM position rather than at the initial grip position associated with the initial motion plan.
The implementation module 506 may then receive the alternate motion plan 520 from the planning module 504 as a response to the request. At block 630, the robotic system 100 may implement the updated motion plan. In other words, the implementation module 506 may implement the alternate motion plan 520, instead of the initial plan 514 or the remainder thereof, to transfer the package to the task location 116. The robotic system 100 may implement the updated motion plan without notifying or involving a human operator. Thus, the robotic system 100 may implement an automatic response to package misidentification and complete the task without human intervention.
At block 632, the robotic system 100 may update the registration record 232 based on the misidentified object. In some embodiments, the robotic system 100 may create a new record for the gripped/transferred package in addition to or in place of the record associated with the initial detection. The robotic system 100 may include the actual sensor data and/or the corresponding processing results in the new record, such as the obtained weight, the obtained actual height, the actual CoM, and/or the estimated geometry 198. In other embodiments, the robotic system 100 may update the existing record of the originally identified package, such as by replacing the weight, height, lateral dimensions, CoM, etc. with the actual sensor data and/or the corresponding processing results.
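Block 632 amounts to either inserting or overwriting a record keyed by the package identity. The sketch below illustrates both options; the dictionary-based store and the field names are hypothetical.

```python
def update_registration(records, initial_id, measurements, create_new=True):
    """Hypothetical sketch of block 632: create a new registration record
    carrying the measured data, or overwrite the mismatched features of
    the originally matched record."""
    if create_new:
        new_id = f"{initial_id}-auto-{len(records)}"
        records[new_id] = dict(measurements)  # weight, height, CoM, footprint
        return new_id
    records[initial_id].update(measurements)  # replace existing features
    return initial_id
```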
Conclusion
The above detailed description of examples of the disclosed technology is not intended to be exhaustive or to limit the disclosed technology to the precise forms disclosed above. While specific examples of the disclosed technology are described above for illustrative purposes, various equivalent modifications are possible within the scope of the disclosed technology, as those skilled in the relevant art will recognize. For example, while processes or blocks are presented in a given order, alternative implementations may perform routines having steps, or employ systems having blocks, in a different order, and some processes or blocks may be deleted, moved, added, subdivided, combined, and/or modified to provide alternatives or sub-combinations. Each of these processes or blocks may be implemented in a number of different ways. Also, while processes or blocks are at times shown as being performed in series, these processes or blocks may instead be performed or implemented in parallel, or may be performed at different times. Moreover, any specific numbers noted herein are merely examples; alternative implementations may employ different values or ranges.
These and other changes can be made to the disclosed technology in light of the above detailed description. While the detailed description describes certain examples of the disclosed technology, as well as the best mode contemplated, the disclosed technology can be practiced in many ways, no matter how detailed the above description appears in text. Details of the system may vary considerably in their specific implementation while still being encompassed by the technology disclosed herein. As noted above, particular terminology used when describing certain features or aspects of the disclosed technology should not be taken to imply that the terminology is being redefined herein to be restricted to any specific characteristics, features, or aspects of the disclosed technology with which that terminology is associated. Accordingly, the invention is not limited except as by the appended claims. In general, unless the above detailed description explicitly defines terms used in the following claims, such terms should not be construed to limit the disclosed technology to the specific examples disclosed in the specification.
While certain aspects of the invention are presented below in certain claim forms, the applicant contemplates the various aspects of the invention in any number of claim forms. Accordingly, the applicant reserves the right to pursue additional claim forms after filing this application.

Claims (20)

1. A method of operating a robotic system, the method comprising:
generating an initial detection result based on image data, wherein the initial detection result identifies a package depicted in the image data;
initiating implementation of an initial motion plan derived based on the initial detection result, wherein the initial motion plan is for transferring the package from a start location to a task location;
obtaining additional sensor data during implementation of at least a first portion of the initial motion plan;
comparing the additional sensor data to a registration record corresponding to the initial detection result;
determining a false detection based on the comparison;
dynamically deriving an alternate motion plan based on the obtained additional sensor data;
implementing the alternate motion plan instead of the initial motion plan or a second portion thereof to transfer the package to the task location; and
updating a registration record to include the additional sensor data for the package.
2. The method of claim 1, further comprising:
deriving an alternate footprint based on the obtained additional sensor data, wherein
the alternate motion plan is derived based on the alternate footprint rather than the dimensions in the registration record.
3. The method of claim 2, wherein:
the additional sensor data comprises measurements representative of a center of mass (CoM) position of the package; and
the alternate footprint is derived with the CoM position, or an estimate thereof, at a central portion of the alternate footprint.
4. The method of claim 3, wherein the additional sensor data includes a weight vector or torque measurement provided by a robotic unit executing the initial motion plan to transfer the package, the method further comprising:
estimating the CoM position based on the weight vector.
5. The method of claim 3, wherein the alternate motion plan includes instructions for re-gripping the package above the CoM position rather than at an initial gripping position associated with the initial motion plan.
6. The method of claim 1, wherein:
the additional sensor data comprises a measured weight of the package; and
the alternate motion plan is derived based on the measured weight rather than a weight indicated by the registration record.
7. The method of claim 1, wherein:
the additional sensor data comprises a measured height of the package; and
the alternate motion plan is derived based on the measured height rather than a height indicated by the registration record.
8. The method of claim 1, wherein:
updating the registration record includes automatically replacing features of a record within the registration record with the additional sensor data.
9. The method of claim 1, wherein:
updating the registration record includes automatically registering the package by creating a new registration record with the additional sensor data, wherein the new registration record supplements the registration record corresponding to the initial detection result.
10. The method of claim 1, further comprising:
suspending implementation of the initial motion plan when the additional sensor data deviates from a corresponding expected characteristic according to the registration record corresponding to the initial detection result;
wherein
the alternate motion plan is for transferring the package from a pause location and for replacing the second portion of the initial motion plan.
11. The method of claim 10, wherein:
initiating implementation of the initial motion plan comprises receiving the initial motion plan from an external planning module that derived the initial motion plan; and
dynamically obtaining the alternate motion plan includes dynamically adjusting the initial motion plan using a controller separate from the external planning module.
12. The method of claim 1, wherein:
implementing, using a controller, an automatic replacement operation when the additional sensor data deviates from a corresponding expected characteristic according to the registration record associated with the initial detection result, wherein the automatic replacement operation includes reversing the first portion of the initial motion plan and/or releasing the package to place the package back at the start location without communicating with an external planning module;
wherein obtaining the alternate motion plan comprises:
sending a request for the alternate motion plan from the controller to the external planning module, wherein the request includes the additional sensor data; and
receiving the alternate motion plan from the external planning module.
13. The method of claim 12, further comprising:
deriving, using the external planning module, the initial motion plan based on the initial detection result, wherein the initial motion plan corresponds to an initial gripping position, an initial transfer height, an initial transfer speed, or a combination thereof; and
deriving, using the external planning module, the alternate motion plan based on the additional sensor data, wherein
the deriving of the alternate motion plan includes assuming that the package is an unregistered object different from or unrelated to the registration record corresponding to the initial detection result,
the alternate motion plan corresponds to a new gripping position, a new transfer height, a new transfer speed, or a combination thereof, different from the corresponding components of the initial motion plan, and
the alternate motion plan is for transferring the repositioned package from the start location to the task location.
14. A robotic system, the robotic system comprising:
at least one processor;
at least one memory including processor instructions that, when executed, cause the at least one processor to:
implementing an initial motion plan for transferring a package from a starting location to a task location, wherein the initial motion plan is derived based on an initial detection result associated with image data depicting the package at the starting location;
obtaining additional sensor data during implementation of at least a first portion of the initial motion plan;
determining a false detection based on comparing the additional sensor data to a registration record corresponding to the initial detection result;
in response to determining the false detection, dynamically obtaining an alternate motion plan based on the obtained additional sensor data; and
implementing the alternate motion plan instead of the initial motion plan or a second portion thereof to transfer the package to the task location.
15. A non-transitory computer-readable medium comprising processor instructions that, when executed by one or more processors, cause the one or more processors to perform a method comprising:
receiving, from an external planning module, an initial motion plan for transferring a package from a starting location to a task location, wherein the initial motion plan is derived based on an initial detection result associated with image data depicting the package at the starting location;
implementing at least a first portion of the initial motion plan for gripping and lifting the package from the starting location;
obtaining additional sensor data during implementation of at least the first portion of the initial motion plan, wherein the additional sensor data includes measured torque vectors, weights, heights, or combinations thereof for the gripped and lifted package;
determining a false detection based on comparing the additional sensor data to corresponding features in a registration record used to generate the initial detection result;
in response to determining the false detection, dynamically obtaining an alternate motion plan based on the obtained additional sensor data; and
implementing the alternate motion plan instead of the initial motion plan or a second portion thereof to transfer the package to the task location.
16. The non-transitory computer-readable medium of claim 15, wherein:
the false detection is determined when the torque vector corresponds to a center of mass (CoM) position that deviates from an expected CoM position and/or a grip position associated with the initial motion plan; and
the method further comprises:
deriving an alternate footprint based on the obtained additional sensor data, wherein
the alternate motion plan is derived based on the alternate footprint rather than the dimensions in the registration record.
17. The non-transitory computer-readable medium of claim 15, wherein:
the additional sensor data comprises a measured weight and/or a measured height of the package; and
the alternate motion plan is derived based on the measured weight and/or the measured height rather than a weight or height indicated by the registration record.
18. The non-transitory computer-readable medium of claim 15, wherein the method further comprises:
suspending implementation of the initial motion plan when the additional sensor data deviates from a corresponding expected characteristic according to the registration record corresponding to the initial detection result, wherein
the alternate motion plan is for transferring the package from a pause location and for replacing the second portion of the initial motion plan.
19. The non-transitory computer-readable medium of claim 15, wherein the method further comprises:
implementing, using a controller, an automatic replacement operation when the additional sensor data deviates from a corresponding expected characteristic according to the registration record associated with the initial detection result, wherein the automatic replacement operation includes reversing the first portion of the initial motion plan and/or releasing the package to place the package back at the starting location without communicating with the external planning module;
wherein obtaining the alternate motion plan comprises:
sending a request for the alternate motion plan from the controller to the external planning module, wherein the request includes the additional sensor data; and
receiving the alternate motion plan from the external planning module.
20. The non-transitory computer-readable medium of claim 15, wherein the method further comprises:
updating a registration record to include the additional sensor data about the package based on: (1) creating a new registration record representing the package in place of the registration record, the new registration record including the additional sensor data instead of the corresponding features, or (2) replacing the corresponding features in the registration record with the additional sensor data.
CN202210933005.4A 2021-07-23 2022-07-25 Robot system with object update mechanism and method for operating the robot system Pending CN115258510A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US202163225346P 2021-07-23 2021-07-23
US63/225,346 2021-07-23
US17/841,545 2022-06-15
US17/841,545 US20230025647A1 (en) 2021-07-23 2022-06-15 Robotic system with object update mechanism and methods for operating the same
CN202210880924.XA CN115676223A (en) 2021-07-23 2022-07-25 Robot system with object update mechanism and method for operating the same

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202210880924.XA Division CN115676223A (en) 2021-07-23 2022-07-25 Robot system with object update mechanism and method for operating the same

Publications (1)

Publication Number Publication Date
CN115258510A (en)

Family

ID=83784084

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210933005.4A Pending CN115258510A (en) 2021-07-23 2022-07-25 Robot system with object update mechanism and method for operating the robot system

Country Status (1)

Country Link
CN (1) CN115258510A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination