CN115557044A - Robotic system and method with multi-purpose labeling system - Google Patents

Robotic system and method with multi-purpose labeling system

Info

Publication number
CN115557044A
Authority
CN
China
Prior art keywords
labeling
module
label
assembly
conveyor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211009475.8A
Other languages
Chinese (zh)
Inventor
Lei Lei (雷磊)
Yixuan Zhang (张艺轩)
Xu Chen (陈旭)
Yi Xu (徐熠)
Brandon Coats (布兰登·蔻兹)
Rosen Diankov (鲁仙·出杏光)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mujin Technology
Original Assignee
Mujin Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from U.S. application No. 17/885,421 (published as US 2023/0050326 A1)
Application filed by Mujin Technology filed Critical Mujin Technology
Publication of CN115557044A


Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B65 - CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65C - LABELLING OR TAGGING MACHINES, APPARATUS, OR PROCESSES
    • B65C 9/00 - Details of labelling machines or apparatus
    • B65C 9/26 - Devices for applying labels
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 11/00 - Manipulators not otherwise provided for
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B65 - CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65C - LABELLING OR TAGGING MACHINES, APPARATUS, OR PROCESSES
    • B65C 9/00 - Details of labelling machines or apparatus
    • B65C 9/40 - Controls; Safety devices
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/60 - Type of objects
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/60 - Type of objects
    • G06V 20/62 - Text, e.g. of license plates, overlay texts or captions on TV images

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Labeling Devices (AREA)

Abstract

A multi-purpose labeling system may include a conveyor, a visual analysis module, and a labeling assembly. The conveyor may move the object in a first direction. The vision analysis module may include an optical sensor directed at the conveyor to generate image data depicting the object. The labeling assembly may be spaced from the conveyor in a second direction and include a printer, a labeling module, and an alignment assembly. The printer may print a label based on the image data, and the labeling module may have a labeling plate for receiving the label. The alignment assembly may include a lateral movement module, a vertical movement module, and a rotation module for moving the labeling module along or about the first, second, and third directions, and may place the labeling plate adjacent to an object surface.

Description

Robotic system and method with multi-purpose labeling system
This application is a divisional application of Chinese application CN202210977868.1, filed on August 15, 2022 and entitled "Robotic system and method with multi-purpose labeling system."
Cross Reference to Related Applications
This application claims the benefit of U.S. provisional patent application No. 63/232,665, filed on August 13, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The present technology relates generally to robotic systems having labeling systems, and more particularly to labeling systems having automated positioning and placement mechanisms.
Background
With their ever-increasing performance and decreasing cost, robots (e.g., machines configured to automatically and primarily perform physical actions) are now widely used in many areas. For example, robots may be used to perform various tasks (e.g., manipulating, labeling, or transferring objects across space) in manufacturing and/or assembly, packaging and/or wrapping, transportation and/or shipping, and the like. In performing these tasks, a robot may replicate human actions, thereby replacing or reducing the human involvement that would otherwise be required to perform a dangerous or repetitive task.
However, despite technological advances, robots often lack the sophistication necessary to replicate the human interactions needed to perform larger and/or more complex tasks. Accordingly, there remains a need for improved techniques and systems for managing the operation of, and/or interactions between, robots and objects.
Drawings
Fig. 1 is an illustration of an exemplary environment in which a robotic system having a multi-purpose labeling mechanism may operate, in accordance with some embodiments of the present technique.
Fig. 2 is a block diagram illustrating the robotic system of fig. 1 in accordance with some embodiments of the present technique.
Fig. 3 is a front perspective view of a first example multi-purpose labeling system configured in accordance with some embodiments of the present technology.
Fig. 4 is a rear perspective view of a second exemplary multi-purpose labeling system configured in accordance with some embodiments of the present technique.
Fig. 5 is a top view of an object having a pre-existing article on its top surface.
Fig. 6 is a top perspective view of a lateral motion module of a multi-purpose labeling system configured in accordance with some embodiments of the present technique.
Fig. 7 is a front perspective view of a vertical motion module of a multi-purpose labeling system configured in accordance with some embodiments of the present technique.
Fig. 8A and 8B are front perspective views of a rotary module of a multi-purpose labeling system configured in accordance with some embodiments of the present technique.
Fig. 9 is a front perspective view of a label flipping module of a multi-purpose labeling system configured in accordance with some embodiments of the present technique.
Fig. 10 is a front perspective view of a labeling module of a multi-purpose labeling system configured in accordance with some embodiments of the present technique.
Fig. 11A and 11B are bottom perspective views of a label adapter of a multi-purpose labeling system configured in accordance with some embodiments of the present technique.
Fig. 12-15 illustrate a process for labeling objects using the multi-purpose labeling system of fig. 1, in accordance with some embodiments of the present technique.
Fig. 16 is a flow diagram illustrating a process for labeling objects using the multi-purpose labeling system of fig. 1, in accordance with some embodiments of the present technique.
The drawings are not necessarily to scale. Similarly, some components and/or operations may be separated into different blocks or combined into a single block for the purpose of discussing some embodiments of the present technology. In addition, while the present technology is susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and will be described in detail herein. However, there is no intention to limit the technology to the specific implementations described.
For ease of reference, the multi-purpose labeling system and its components are sometimes described herein with reference to top and bottom, upper and lower, upward and downward, longitudinal planes, horizontal planes, x-y planes, vertical planes, and/or z planes, which are oriented spatially relative to the embodiments shown in the figures. However, it should be understood that the multi-purpose labeling system and its components may be moved to and used in different spatial orientations without changing the structure and/or function of the embodiments of the present disclosure.
Detailed Description
Summary
Multi-purpose labeling systems and methods are disclosed herein. Such multi-purpose labeling systems may visually inspect objects in or interfaced with the robotic system to determine physical and identifying information about the objects. Based on the physical and identifying information, the labeling system may determine a target labeling location for placing a label on the object. The labeling system may also print and prepare labels for adhering to objects based on the physical and identifying information. The multi-purpose labeling system may then automatically align the labeling module with the target labeling position, and may place the label on the object at the target labeling position using the labeling module. By automatically identifying information about an object, generating a label for the object, and placing the label on the object, a labeling system may improve the ability of a robotic system to accomplish complex tasks without the need for human intervention. Additionally, aspects of the multi-purpose labeling system may provide further benefits, including, for example: (i) reduced human involvement in object handling and management, (ii) increased robotic system handling speed, and/or (iii) elimination of the need to remove objects from the robotic system to place labels on them, among other benefits.
In various embodiments of the multi-purpose labeling system, the labeling system may include a conveyor, a visual analysis module, and a labeling assembly. The conveyor may move the object in a first direction. The vision analysis module may include an optical sensor that is pointed at the conveyor or a related location to generate image data depicting the object. The labeling assembly may be spaced from the conveyor in a second direction and include a printer, a labeling module, and an alignment assembly. The printer may print a label based on the image data, and the labeling module may have a labeling plate for receiving the label. The alignment assembly may include a lateral movement module, a vertical movement module, and a rotation module for moving the labeling module along or about the first, second, and third directions, and may place the labeling plate adjacent to an object surface. In some embodiments, a labeling system may include one or more controllers having computer-readable media carrying instructions to operate a vision analysis module, a printer, a labeling module, and an alignment assembly.
Embodiments of the labeling system may place labels on objects by optically scanning the objects on a conveyor for visual and physical characteristics. The visual characteristics may include the available labeling space and the object identifier reading content. The physical characteristics may include a size of the object. The labeling system may identify a target labeling location from the available labeling space. Based on the object identifier reading content, the labeling system may prepare a label on a labeling module carried by the alignment assembly. The labeling system may then align the labeling module with the target labeling position based on the physical characteristics using the conveyor and the alignment assembly, and may apply the label to the object using the alignment assembly.
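The scan, locate, print, align, and apply sequence described above can be sketched as follows. This is a minimal, self-contained illustration only; every class, function, and field name is an assumption, not an API defined by the patent.

```python
from dataclasses import dataclass

@dataclass
class ObjectFeatures:
    identifier: str        # contents read from the object's barcode/QR code
    size_mm: tuple         # (length, width, height) of the scanned object
    available_spaces: list # candidate (x, y) label positions on the top surface

def choose_target_location(features: ObjectFeatures):
    """Pick a target labeling location; here, simply the first free space."""
    if not features.available_spaces:
        raise ValueError("no available labeling space on object surface")
    return features.available_spaces[0]

def label_object(features: ObjectFeatures):
    """Run the scan -> locate -> print -> align -> apply sequence."""
    target_x, target_y = choose_target_location(features)
    label_text = f"LABEL:{features.identifier}"  # stand-in for the printed label
    # Alignment: the conveyor positions the object in the first direction (x),
    # while the alignment assembly handles the lateral (y) and vertical (z) axes;
    # the labeling plate descends to the object's top surface.
    plate_z = features.size_mm[2]
    return {"label": label_text, "x": target_x, "y": target_y, "z": plate_z}

placement = label_object(ObjectFeatures("SKU-123", (300, 200, 150), [(40, 25)]))
```

A real implementation would derive `available_spaces` from image analysis and drive physical actuators; here the returned dictionary simply records where the hypothetical labeling plate would press the label.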
For the sake of clarity, several details describing structures or processes that are well known and commonly associated with robotic systems and subsystems but which may unnecessarily obscure some important aspects of the disclosed technology are not set forth in the following description. In addition, while the following disclosure sets forth several embodiments of different aspects of the technology, several other embodiments may have different configurations or components than those described in this section. Accordingly, the disclosed technology may have other embodiments with additional elements or without several elements described below.
Many embodiments or aspects of the disclosure described below may take the form of computer-executable or controller-executable instructions, including routines executed by a programmable computer or controller. One skilled in the relevant art will appreciate that the disclosed techniques can be practiced on computer or controller systems other than those shown and described below. The techniques described herein may be embodied in a special purpose computer or data processor that is specifically programmed, configured, or structured to perform one or more of the computer-executable instructions described below. Thus, the terms "computer" and "controller" as generally used herein refer to any data processor and may include internet appliances and/or applications or handheld devices (including palm-top computers, wearable computers, cellular or mobile phones, multi-processor systems, processor-based or programmable consumer electronics, network computers, minicomputers, and the like). The information processed by these computers and controllers may be presented on any suitable display medium, including a liquid crystal display ("LCD"). Instructions for performing computer- or controller-executable tasks may be stored in or on any suitable computer-readable medium, including hardware, firmware, or a combination of hardware and firmware. The instructions may be contained in any suitable memory device, including, for example, a flash drive, a USB device, and/or other suitable media.
The terms "coupled" and "connected," along with their derivatives, may be used herein to describe structural relationships between components. It should be understood that these terms are not intended as synonyms for each other. Rather, in particular embodiments, "connected" may be used to indicate that two or more elements are in direct contact with each other. Unless otherwise apparent from the context, the term "coupled" may be used to indicate that two or more elements are in contact with each other, either directly or indirectly (with other intervening elements between them), and/or that two or more elements are cooperating or interacting with each other (e.g., interacting in a cause-and-effect relationship, such as for signal transmission/reception or for function calls).
Exemplary Environment for a Robotic System
Fig. 1 is an illustration of an exemplary environment in which a robotic system 100 having a multi-purpose labeling system 104 may operate. The operating environment of the robotic system 100 may include one or more structures, such as robots or robotic devices, configured to perform one or more tasks. Aspects of the multi-purpose labeling system 104 may be practiced or implemented by various structures and/or components.
In the example shown in fig. 1, the robotic system 100 may include an unloading unit 102, a multi-purpose labeling system 104, a transfer unit 106, a transport unit 108, a loading unit 110, or a combination thereof, of a warehouse, distribution center, or shipping center. Each of the units in the robotic system 100 may be configured to perform one or more tasks. The tasks may be combined in a sequence to perform an operation that achieves the goal, such as, for example, (i) unloading an object from a vehicle such as a truck, trailer, van, or rail car (e.g., via unloading unit 102); (ii) Labeling an object (e.g., via the multi-purpose labeling system 104); (iii) Transferring and/or transporting objects from one system to another (e.g., via transfer unit 106, transport unit 108); and/or (iv) store the object in a warehouse or unload the object from a storage location (e.g., via load unit 110). Additionally or alternatively, operations may be performed to achieve different goals, such as loading an object onto a vehicle for transport. In another example, the tasks may include: an object is moved from one location (such as a container, bin, cage, basket, shelf, platform, tray, or conveyor) to another location. Each of the units may be configured to perform a series of actions (such as operating one or more components thereof) to perform a task.
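The combining of unit tasks into a sequenced operation described above can be illustrated with a short sketch; the task names and dispatch scheme are hypothetical simplifications, not part of the patent.

```python
# Illustrative operation: the ordered tasks performed by the units of the
# robotic system (unloading unit, labeling system, transfer/transport units,
# loading unit) to move an object from a vehicle into storage.
OPERATION = ["unload", "label", "transfer", "store"]

def run_operation(units, operation):
    """Dispatch each task, in order, to the unit registered for it."""
    log = []
    for task in operation:
        log.append(units[task](task))
    return log

# Stub units that merely acknowledge their task; a real system would command
# the corresponding robot via its controller.
units = {t: (lambda task: f"{task}:done") for t in OPERATION}
log = run_operation(units, OPERATION)
```

The point of the sketch is the sequencing: each unit performs its task only after the previous task in the operation completes, mirroring the unload, label, transfer, store chain in the text.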
In some embodiments, the task may include interaction with the target object 112, such as manipulation, movement, reorientation, labeling, or a combination thereof of the object. The target object 112 is an object to be handled by the robotic system 100. More specifically, the target object 112 may be a specific object among many objects that is the target of an operation or task of the robotic system 100. For example, the target object 112 may be an object that the robotic system 100 has selected or that is currently being processed, manipulated, moved, reoriented, labeled, or a combination thereof. By way of example, the target object 112 may include a box, a tube, a package, a bundle, a wide variety of individual items, or any other object that may be handled by the robotic system 100.
As an example, the tasks may include: transferring the target object 112 from the object source 114 to the task location 116. The object source 114 is a container for storing objects. The object source 114 may take numerous configurations and forms. For example, the object source 114 may be a platform, such as a tray, rack, or conveyor belt, with or without walls, on which objects may be placed or stacked. As another example, the object source 114 may be a partially or fully enclosed container, such as a bin, cage, or basket, having walls or a lid in which the object may be placed. In some embodiments, the walls of the partially or fully enclosed object source 114 may be transparent, or may include openings or gaps of various sizes, such that portions of the objects contained therein may be visible or partially visible through the walls.
Fig. 1 shows examples of possible functions and operations that may be performed by various units of the robotic system 100 to process the target object 112, and it should be understood that the environments and conditions may differ from those described below. For example, the unloading unit 102 may be a vehicle unloading robot configured to transfer the target object 112 from a location in a vehicle (such as a truck) to a location on a conveyor belt. Once on the conveyor, the target objects 112 may be tagged by the multi-purpose tagging system 104 for identification purposes inside or outside the robotic system, such as identifying the contents of the target objects 112, providing shipping labels, or other similar purposes. Details regarding the multi-purpose labeling system 104 are described below. The transfer unit 106 (such as a palletizing robot) may be configured to transfer the labeled target objects 112 from a position on the conveyor belt to a position on the transport unit 108, such as for loading the target objects 112 on a pallet on the transport unit 108. In another example, the transfer unit 106 may be a picking robot configured to transfer the target object 112 from one container to another container. Upon completion of the operation, the transport unit 108 may transfer the target object 112 from the area associated with the transfer unit 106 to the area associated with the loading unit 110, and the loading unit 110 may transfer the target object 112 from the transfer unit 106 (such as by moving a tray carrying the target object 112) to a storage location (such as a location on a shelf).
For illustrative purposes, the robotic system 100 is described in the context of a shipping center; however, it should be understood that the robotic system 100 may be configured to perform tasks in other environments or for other purposes, such as for manufacturing, assembly, packaging, healthcare, or other types of automation. It should also be understood that the robotic system 100 may include other units not shown in fig. 1, such as manipulators, service robots, modular robots. For example, in some embodiments, the robotic system 100 may include: an unstacking unit for transferring objects from a cage, cart or tray onto a conveyor or other tray; a container switching unit for transferring an object from one container to another container; a packing unit for packing an object; a sorting unit for grouping the objects according to one or more characteristics of the objects; a sorting unit for manipulating (such as sorting, grouping and/or transferring) objects differently according to one or more characteristics of the objects; or a combination thereof.
The robotic system 100 may include a controller 120 configured to interface with and/or control one or more of the robotic units. For example, the controller 120 may include circuitry (e.g., one or more processors, memory, etc.) configured to derive motion plans and/or corresponding commands, settings, etc. for operating corresponding robotic units. The controller 120 may communicate the motion plan, commands, settings, etc. to the robotic unit, and the robotic unit may execute the communicated plan to complete a corresponding task, such as transferring the target object 112 from the object source 114 to the task location 116.
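The controller's role described above, deriving a motion plan and communicating it to a robotic unit for execution, can be sketched as follows. All names are illustrative assumptions; the patent does not define this interface.

```python
def derive_motion_plan(source, destination, waypoints=()):
    """Build an ordered list of positions from the object source to the
    task location, passing through any intermediate waypoints."""
    return [source, *waypoints, destination]

class RoboticUnit:
    """Stub robotic unit that records the plan it is commanded to execute."""
    def __init__(self):
        self.executed = []

    def execute(self, plan):
        # A real unit would actuate joints and transport motors here.
        self.executed.extend(plan)
        return True

# Controller side: derive the plan, then communicate it to the unit.
unit = RoboticUnit()
plan = derive_motion_plan(source=(0, 0), destination=(5, 3), waypoints=[(2, 1)])
done = unit.execute(plan)
```

The separation mirrors the text: the controller circuitry derives plans and settings, while the robotic unit executes the communicated plan to complete its task.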
Suitable system
Fig. 2 is a block diagram illustrating a robotic system 100 in accordance with one or more embodiments of the present technique. In some embodiments, for example, the robotic system 100 may include electronic devices, electrical devices, or combinations thereof, such as a control unit 202 (also sometimes referred to herein as a "processor 202"), a storage unit 204, a communication unit 206, a system input/output ("I/O") device 208 having a system interface (also sometimes referred to herein as a "user interface" or system or user "IF"), one or more actuation devices 212, one or more transport motors 214, one or more sensor units 216, or combinations thereof, coupled to one another, integrated with or coupled to one or more of the units or robots described above in fig. 1, or combinations thereof.
The control unit 202 may be implemented in a number of different ways. For example, the control unit 202 may be a processor, an application specific integrated circuit ("ASIC"), an embedded processor, a microprocessor, hardware control logic, a hardware finite state machine ("FSM"), a digital signal processor ("DSP"), or a combination thereof. The control unit 202 may execute software 210 and/or instructions to provide the intelligence of the robotic system 100.
The control unit 202 may be operatively coupled to the I/O devices 208 to provide control of the control unit 202 to a user. The I/O devices 208 may be used for communication between the user and control unit 202 and other functional units in the robotic system 100. The I/O devices 208 may also be used for communication external to the robotic system 100. I/O device 208 may receive information from other functional units or from external sources and/or may transmit information to other functional units or to external destinations. External sources and external destinations refer to sources and destinations external to the robotic system 100.
I/O device 208 may be implemented in and include different implementations depending on which functional units or external units are interfacing with I/O device 208. For example, the I/O devices 208 may be implemented with pressure sensors, inertial sensors, microelectromechanical systems ("MEMS"), optical circuitry, waveguides, wireless circuitry, wired circuitry, application programming interfaces, or a combination thereof.
The storage unit 204 may store software instructions 210, master data 246, trace data, or a combination thereof. For illustrative purposes, memory cell 204 is shown as a single element, but it should be understood that memory cell 204 may be a distribution of memory elements. Also for illustrative purposes, the robotic system 100 is shown with the storage unit 204 as a single tier storage system, but it should be understood that the robotic system 100 may have a different configuration of the storage unit 204. For example, the storage unit 204 may be formed of different storage technologies, forming a memory hierarchy including different levels of cache, main memory, rotating media, and/or offline storage.
The storage unit 204 may be a volatile memory, a non-volatile memory, an internal memory, an external memory, or a combination thereof. For example, the storage unit 204 may be a non-volatile storage device, such as non-volatile random access memory ("NVRAM"), flash memory, a magnetic disk storage device, and/or a volatile storage device, such as static random access memory ("SRAM"). As another example, the storage unit 204 may be a non-transitory computer medium including non-volatile memory, such as a hard disk drive, NVRAM, a solid state storage ("SSD"), a compact disc ("CD"), a digital video disc ("DVD"), and/or a universal serial bus ("USB") flash memory device. The software 210 may be stored on a non-transitory computer readable medium for execution by the control unit 202.
In some embodiments, the storage unit 204 is used to further store and/or provide access to processing results, predetermined data, thresholds, or combinations thereof. For example, the storage unit 204 may store master data 246 that includes a description of one or more target objects 112 (e.g., a box, a box type, a product, or a combination thereof). In one embodiment, the master data 246 includes dimensions, predetermined shapes, templates for potential poses and/or computer-generated models for recognizing different poses, color schemes, images, identification information (e.g., bar codes, Quick Response ("QR") codes, logos, etc.), expected locations, expected masses, or a combination thereof of one or more target objects 112 expected to be manipulated by the robotic system 100.
In some embodiments, master data 246 includes manipulation-related information about one or more objects that may be encountered or handled by robotic system 100. For example, the manipulation-related information of the objects may include a centroid location on each of the objects, expected sensor measurements (e.g., force, torque, pressure, and/or contact measurements) corresponding to one or more actions, manipulations, or a combination thereof.
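A master-data record of the kind described above can be sketched as a simple keyed lookup. The field set is a plausible subset of what the text lists (dimensions, mass, centroid, identification information), and all names and values are illustrative assumptions.

```python
# Hypothetical master data 246: object descriptions keyed by the contents of
# the object's identifier (e.g., a barcode).
MASTER_DATA = {
    "0123456789012": {
        "dimensions_mm": (300, 200, 150),
        "expected_mass_kg": 2.4,
        "centroid_offset_mm": (150, 100, 70),  # manipulation-related info
        "box_type": "carton-A",
    },
}

def lookup_object(identifier: str):
    """Return the registered description for a scanned identifier, if any."""
    return MASTER_DATA.get(identifier)
```

In operation, a scanned identifier would be resolved against such a registry so the robot can anticipate an object's size, weight, and grasp point before manipulating it; unknown identifiers simply return no record.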
The communication unit 206 may enable external communication to and from the robot system 100. For example, the communication unit 206 may enable the robotic system 100 to communicate with other robotic systems and/or units, external devices (such as external computers, external databases, external machines, external peripheral devices), or combinations thereof, via a communication path 218 (such as a wired or wireless network).
The communication path 218 may span and represent a variety of networks and/or network topologies. For example, the communication path 218 may include wireless communication, wired communication, optical communication, ultrasonic communication, or a combination thereof. For example, satellite communication, cellular communication, Bluetooth, Infrared Data Association standards ("IrDA"), wireless local area network ("WiFi"), and/or worldwide interoperability for microwave access ("WiMAX") are examples of wireless communication that may be included in the communication path 218. Cable, Ethernet, digital subscriber line ("DSL"), fiber optic line, fiber to the home ("FTTH"), and/or plain old telephone service ("POTS") are examples of wired communication that may be included in the communication path 218. Further, the communication path 218 may traverse multiple network topologies and distances. For example, the communication path 218 may include a direct connection, a personal area network ("PAN"), a local area network ("LAN"), a metropolitan area network ("MAN"), a wide area network ("WAN"), or a combination thereof. The robotic system 100 may transmit information between the various units over the communication path 218. For example, information may be transmitted between the control unit 202, the storage unit 204, the communication unit 206, the I/O devices 208, the actuation devices 212, the transport motors 214, the sensor units 216, or a combination thereof.
The communication unit 206 may also function as a communication hub, allowing the robotic system 100 to function as part of the communication path 218 and is not limited to being an endpoint or terminal unit of the communication path 218. The communication unit 206 may include active and/or passive components, such as microelectronics or an antenna, for interacting with the communication path 218.
The communication unit 206 may include a communication interface 248. The communication interface 248 may be used for communication between the communication unit 206 and other functional units in the robotic system 100. Communication interface 248 may receive information from other functional units and/or from external sources and/or may transmit information to other functional units and/or to external destinations. External sources and external destinations refer to sources and destinations external to the robotic system 100.
Communication interface 248 may comprise different implementations depending on which functional units are interfacing with communication unit 206. Communication interface 248 may be implemented with technologies and techniques similar to the implementation of control interface 240.
The I/O devices 208 may include one or more input sub-devices and/or one or more output sub-devices. Examples of input devices of I/O device 208 may include a keypad, touchpad, soft keys, keyboard, microphone, sensor for receiving remote signals, camera for receiving motion commands, or a combination thereof to provide data and/or communication input. Examples of output devices may include a display interface. The display interface may be any graphical user interface, such as a display, a projector, a video screen, and/or combinations thereof.
The control unit 202 may operate the I/O device 208 to present or receive information generated by the robotic system 100. The control unit 202 may also execute software 210 and/or instructions for other functions of the robotic system 100, and for interacting with the communication path 218 via the communication unit 206.
The robotic system 100 may include physical and/or structural members, such as robotic manipulator arms, connected at joints for movement (such as rotational displacement, translational displacement, or a combination thereof). The structural members and joints may form a kinematic chain configured to manipulate an end effector (such as a clamping element) to perform one or more tasks (such as clamping, spinning, welding, and/or labeling) depending on the use or operation of the robotic system 100. The robotic system 100 may include an actuation device 212, such as a motor, actuator, wire, artificial muscle, electroactive polymer, or a combination thereof, configured to drive, manipulate, displace, reorient, label, or a combination thereof, the structural member around or at the corresponding joint. In some embodiments, the robotic system 100 may include a transport motor 214 configured to transport the corresponding unit from one place to another.
The robotic system 100 may include a sensor unit 216 configured to obtain information for performing tasks and operations, such as information for manipulating structural members or for transporting the robotic unit. The sensor unit 216 may include a device configured to detect and/or measure one or more physical properties of the robotic system 100, such as a state, condition, location, information about the object and/or surrounding environment of one or more structural members or joints, or a combination thereof. As an example, the sensor unit 216 may include an imaging device, a system sensor, a contact sensor, and/or a combination thereof.
In some embodiments, the sensor unit 216 includes one or more imaging devices 222. The imaging device 222 may be configured to detect and image the surrounding environment. For example, imaging device 222 may include a 2-dimensional camera ("2D"), a 3-dimensional camera ("3D") (both of which may include a combination of visual and infrared capabilities), a lidar, a radar, other ranging devices, and/or other imaging devices. The imaging device 222 may generate a representation, such as a digital image and/or a point cloud, of the detected environment for implementing machine/computer vision for automated inspection, object measurement, robot guidance, and/or other robotic applications. As described in further detail below, the robotic system 100 may process the digital image, the point cloud, or a combination thereof via the control unit 202 to identify the target object 112 of fig. 1, a pose of the target object 112, a size and/or orientation of the target object 112, or a combination thereof. To manipulate the target object 112, the robotic system 100 may capture and analyze images of a specified area (such as the location of objects inside a truck, inside a container, or on a conveyor belt) to identify the target object 112 and its physical properties, as well as the object source 114 of fig. 1. Similarly, the robotic system 100 may capture and analyze images of another designated area (such as a drop location for placing or labeling objects on a conveyor belt, a location for placing objects inside a container, or a location on a pallet for stacking purposes) to identify the task location 116 of fig. 1.
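As a rough illustration of the kind of machine-vision processing described above, the sketch below estimates a box-like object's footprint and yaw from top-surface points of a point cloud. The function name, the brute-force minimum-area search, and the whole-degree resolution are assumptions for the example, not the patented method.

```python
import math

def object_pose_from_points(points):
    """Estimate (length, width, yaw_deg) of a box-like object from
    top-surface points given as (x, y) tuples in conveyor coordinates."""
    best = None
    # Brute-force search over candidate yaw angles: the yaw of a
    # rectangular object minimizes the area of its rotated bounding box.
    for deg in range(0, 90):
        a = math.radians(deg)
        xs = [x * math.cos(a) + y * math.sin(a) for x, y in points]
        ys = [-x * math.sin(a) + y * math.cos(a) for x, y in points]
        area = (max(xs) - min(xs)) * (max(ys) - min(ys))
        if best is None or area < best[0]:
            best = (area, max(xs) - min(xs), max(ys) - min(ys), deg)
    _, length, width, yaw = best
    return length, width, yaw
```

For an axis-aligned 4 x 2 box, for example, the minimum-area angle is 0 degrees and the recovered dimensions match the box.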
In some embodiments, the sensor unit 216 may include a system sensor 224. The system sensors 224 may monitor the robotic cells within the robotic system 100. For example, system sensors 224 may include units and/or devices for detecting and/or monitoring the position of structural members (such as robotic arms and end effectors, corresponding joints of robotic units, or combinations thereof). As another example, the robotic system 100 may use the system sensors 224 to track the position, orientation, or a combination thereof of the structural members and/or joints during performance of the task. Examples of system sensors 224 may include accelerometers, gyroscopes, position encoders, and/or other similar sensors.
In some embodiments, the sensor unit 216 may include a contact sensor 226, such as a pressure sensor, force sensor, strain gauge, piezoresistive/piezoelectric sensor, capacitive sensor, elastic resistance sensor, torque sensor, linear force sensor, other tactile sensor, and/or any other suitable sensor configured to measure a characteristic associated with direct contact between multiple physical structures and/or surfaces. For example, the contact sensor 226 may measure a characteristic corresponding to the clamping of the end effector on the target object 112, or measure the weight of the target object 112. Accordingly, the contact sensor 226 may output a contact metric representing a quantitative metric, such as a measured force or torque corresponding to a degree of contact and/or stiction between the clamping element and the target object 112. For example, the contact metric may include one or more force or torque readings associated with the force applied by the end effector onto the target object 112.
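The contact-metric idea above can be sketched as a simple threshold check over per-contact force readings. This is illustrative only: the function name, the newton thresholds, and the "balanced readings" rule are assumptions, not the system's actual logic.

```python
def grip_secure(force_readings_n, min_force_n=5.0, max_spread_n=1.0):
    """A grip is treated as 'secure' if every contact point reports at
    least min_force_n newtons and the readings are roughly balanced
    (spread between strongest and weakest contact <= max_spread_n)."""
    if not force_readings_n:
        return False  # no contact data means no confirmed grip
    spread = max(force_readings_n) - min(force_readings_n)
    return min(force_readings_n) >= min_force_n and spread <= max_spread_n
```

A controller could use such a metric to decide whether to proceed with a task or regrip the target object.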
Suitable multi-purpose labeling systems and related components
Fig. 3 is a front perspective view of a first example multi-purpose labeling system 300 (e.g., an example of the multi-purpose labeling system 104 of fig. 1), configured in accordance with some embodiments of the present technology, that can visually inspect an object (e.g., O1, O2) and place a label thereon. More specifically, in some embodiments, the labeling system 300 may: visually inspect objects on the conveyor assembly 330; identify information about each object, such as physical characteristics (e.g., outer dimensions, unobstructed surface area) and/or identifying information (e.g., one or more object and/or object content identifiers); determine (e.g., derive, calculate) a target location (e.g., placement location) for labeling the object; align the labeling module 316 with the target labeling location (e.g., TLL); and prepare and adhere a label to the object at the target labeling location. Aspects of the labeling system 300 may efficiently (e.g., faster, requiring less motion) and/or automatically (e.g., requiring no manual input) prepare and adhere labels to objects within a robotic system (e.g., the robotic system 100 of fig. 1) while avoiding pre-existing labels and/or images on the objects as they pass through the robotic system. By providing automatic labeling, the labeling system 300 may improve object tracking and/or management without (i) human involvement, (ii) slowing operation of the robotic system, and/or (iii) removing objects from the robotic system, among other benefits. Moreover, the labeling system 300 provides benefits over alternative labeling systems by including alignment (e.g., motion) modules that travel along or about dedicated axes, thereby improving efficiency, robustness, and/or accuracy and improving overall system throughput as compared to free-moving and six-degree-of-freedom robots.
For ease of reference, fig. 3 includes an XYZ reference frame corresponding to the illustrated labeling system 300. The x-axis and y-axis are parallel to the ground below the labeling system 300. The x-axis is along the length of the labeling system 300 (e.g., along the length of the conveyor assembly 330) and the y-axis is perpendicular to the x-axis. The z-axis is perpendicular to the ground (e.g., along the height of the labeling system 300). Unless otherwise noted, the reference frame included in subsequent figures is aligned with the reference frame of fig. 3.
As shown in fig. 3, labeling system 300 may include a control cabinet 302 having equipment therein (e.g., one or more of the processors or control units 202 of fig. 2) for managing the operation of labeling system 300, conveyor assembly 330, and/or labeling assembly 310 for visually inspecting objects on conveyor assembly 330 and adhering labels thereto. One or both of control cabinet 302 and labeling assembly 310 may be carried by labeling assembly frame 304. The assembly frame 304 may be coupled to or rest on the ground. In some embodiments, the assembly frame 304 may be coupled to the conveyor assembly 330 and may move with the conveyor assembly (e.g., as the conveyor assembly 330 may telescope, tilt, rotate, and/or otherwise move relative to the ground).
The conveyor assembly 330 may include a conveyor 332 carried by a conveyor support 334 (e.g., housing, post). The conveyor 332 may move the object from a first end of the conveyor assembly 330 to a second end of the conveyor assembly 330 (e.g., in a first direction), and hold (e.g., stop, slowly move) the object along the length of the conveyor assembly 330 (e.g., under a portion of the labeling assembly 310). The conveyor 332 may include one or more linear and/or non-linear motorized belts, rollers, multi-directional rollers, wheels, and/or any suitable mechanism operable to selectively move and/or hold objects thereon. As shown, the conveyor assembly 330 includes a single conveyor 332. In some embodiments, the conveyor assembly 330 may include one or more additional conveyors 332 that, in turn, are used for independent movement of the objects and/or holding the objects thereon. Further, in some embodiments, labeling system 300 may include one or more conveyor assemblies 330 having one or more conveyors 332.
Labeling assembly 310 may include: (i) a visual analysis module 312 for visually inspecting the objects, (ii) a print module 314 for printing labels, (iii) a labeling module 316 for receiving the printed labels and for adhering the labels to the objects, and (iv) a labeling alignment assembly for aligning the labeling module 316 with the target labeling position of each object. In some embodiments, labeling assembly 310 may also include a label flipping module 318 for preparing (e.g., by folding, flipping, and/or peeling) printed labels for the labeling module 316. The labeling alignment assembly may include, for example, a lateral motion module 320 operable along the y-axis, a vertical motion module 322 operable along the z-axis, and/or a rotation module 324 operable about the z-axis, each configured to move the labeling module 316 along and/or about its respective axis. As shown in fig. 3, the vertical motion module 322 and the rotation module 324 are hidden from view by a protective cover.
The object may first interface with the labeling assembly 310 at the visual analysis module 312. The visual analysis module 312 may collect object information (e.g., collected and/or derived from one or more object scans, image data, etc.) for the labeling system 300 to identify the object and/or a target labeling location thereon. The visual analysis module 312 may also collect information for aligning the labeling module 316 with the target labeling location. The target labeling location may be a portion of one or more surfaces of the object that satisfies one or more predetermined conditions for adhering labels. For example, the target labeling location may be separate from (e.g., non-overlapping with) one or more existing labels, images, logos, areas of object surface damage, and/or other similar items so as to leave these uncovered when placing the label. Additionally or alternatively, the target labeling location may be associated with a known and/or preferred location. For example, a known labeling location may facilitate more efficient label reading and/or other object handling (such as packaging and/or clamping) based on industry standards, future handling of the object, and/or customer specifications. Further, in some embodiments, the target labeling location may be a set position for a particular object, regardless of the items on the surface of the object.
The vision analysis module 312 may be coupled to the assembly frame 304 and positioned above the conveyor assembly 330 to analyze the object before the object reaches the labeling assembly 310. The vision analysis module 312 may include one or more imaging and/or optical sensor devices (e.g., imaging device 222 of fig. 2) having a field of view (e.g., VF) directed toward the conveyor assembly 330 or a related location for analyzing the object (e.g., generating image data depicting the object and/or optically scanning the object). For example, the visual analysis module 312 may include: (i) One or more 3D cameras for scanning an exterior surface of an object using one or more vision, infrared, lidar, radar, and/or other ranging features; (ii) One or more 2D cameras for recognizing images, labels and/or other content on a surface of an object; and/or (iii) one or more scanners for reading identifiers (e.g., barcodes, QR, RFID, or similar codes) on the objects.
The object information collected by the 2D and 3D cameras may include physical characteristics of the object. For example, both the 2D camera and the 3D camera may collect the size, rotational orientation (e.g., about the z-axis), and/or position (e.g., along the y-axis) of a surface (e.g., top, side, or sides) of the object (individually or collectively, the object pose). The 3D camera may further collect the height, width, and/or length of the object, as well as other external dimensions of the object when the object is non-rectangular or non-square. The 2D camera may further collect images that identify the texture (e.g., visual characteristics) of one or more surfaces of the object.
The object information collected by the scanner may include identification information (e.g., object identifier read content), such as an object and/or object content identifier (e.g., shipping number, object identifier, content identifier, product model number, etc.). In some embodiments, the identification information may be derived from physical characteristics. For example, labeling system 300 may use the visual analysis module 312, a controller in the control cabinet 302, and/or one or more devices external to the labeling assembly 310 to analyze the object information/image data to identify the target labeling location. In analyzing the object information, the labeling system 300 may derive or detect one or more pieces of identifiable information depicted in the image data, such as physical dimensions, object identifiers, visual/textual patterns, and so forth. The labeling system 300 may compare the identifiable information to the master data 246 of fig. 2 to detect or identify the imaged object. The labeling system 300 may further use registration information in the master data 246 and/or analyze the image data to identify the target labeling location. The labeling system 300 may derive the target labeling location as an area of a minimum desired size, of uniform texture, and/or absent any recognizable pattern (e.g., a bar code, QR code, lettering, or design indicia, etc.).
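A minimal sketch of the free-area derivation described above, assuming axis-aligned rectangles on a flat top surface: scan candidate positions and keep the first spot that fits the label without overlapping any keep-out item (existing labels, barcodes, logos). Function and parameter names are invented for illustration.

```python
def find_label_spot(surface_w, surface_h, label_w, label_h, keep_out, step=1):
    """keep_out: list of (x, y, w, h) rectangles to avoid (pre-existing
    labels, barcodes, logos). Returns (x, y) for the new label's
    lower-left corner, or None if no clear area exists."""
    def overlaps(ax, ay, aw, ah, bx, by, bw, bh):
        # Standard axis-aligned rectangle intersection test.
        return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

    for y in range(0, surface_h - label_h + 1, step):
        for x in range(0, surface_w - label_w + 1, step):
            if not any(overlaps(x, y, label_w, label_h, *r) for r in keep_out):
                return (x, y)
    return None  # surface too cluttered for this label size
```

A production system would additionally weight candidates by texture uniformity and preferred regions, as the text notes; the sketch shows only the non-overlap condition.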
The printing module 314 may print a label to be adhered to the analyzed object using the object information. The print module 314 may include a housing coupled to the assembly frame 304 with the printer in the housing. As shown in fig. 3, the housing is coupled to the assembly frame 304 via a lateral motion module 320. In some embodiments, the housing may instead be directly connected to the assembly frame 304. The printer may prepare and dispense labels from the print module 314 to the labeling module 316 and/or to the label flipping module 318. The printer may print labels having one or more shapes and sizes and one or more background colors and print colors. Further, the printer may print labels having text, images, symbols, and/or any other similar information thereon.
For example, the print module 314 may print rectangular and/or square labels that are less than or equal to 1.0 inch x 1.0 inch (2.5 cm x 2.5 cm) or greater than or equal to 4.0 inches x 6.0 inches (10.2 cm x 15.2 cm). In addition, the printed label may have, for example, black print on a white background; a black background with white font and red symbols; a red background with a yellow image; or any other combination of background and print color and content. In some embodiments, the print module 314 may print non-rectangular and/or non-square labels, such as triangular, circular, oval, and/or any other shape. In addition, the print module 314 may print labels having adhesive on one or more portions thereof. For example, a label may include an adhesive covering a first side (e.g., the side facing the conveyor assembly 330) and/or an adhesive covering at least a portion of a second side (e.g., the side facing away from the conveyor assembly 330), with a protective covering (e.g., over the adhesive) that requires flipping, folding, and/or peeling prior to adhering the label to an object.
As the object is visually analyzed, the labeling assembly 310 may print the label and transfer the label to the labeling module 316. Then, the labeling alignment assembly and the conveyor 332 (collectively, the "alignment elements") may cooperate to align the labeling module 316 with the target labeling position. For example, (i) the conveyor 332 may advance the object along the x-axis (e.g., along a first direction) to align the labeling module 316 with the target labeling position, (ii) the lateral motion module 320 may move the labeling module 316 along the y-axis (e.g., along a second direction) to align with the target labeling position, and (iii) the rotation module 324 may rotate the labeling module 316 about the z-axis to align with the target labeling position. Once aligned along the x-axis and the y-axis and about the z-axis, the vertical motion module 322 may move the labeling module 316 along the z-axis (e.g., along a third direction) to place the labeling module 316 against the top surface of the object and adhere the label thereto.
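The per-axis hand-off above can be sketched as a simple planner that emits one move per misaligned axis and always finishes with the vertical press. The module names mirror the text, but the control API, tolerance, and dictionary layout are invented for illustration.

```python
def align_and_label(tll, pose, tol=0.5):
    """tll and pose: dicts with keys "x", "y", "theta" (degrees).
    Returns the ordered list of (axis, delta) moves to execute before
    the vertical (z) press adheres the label."""
    moves = []
    if abs(tll["x"] - pose["x"]) > tol:
        moves.append(("conveyor_x", tll["x"] - pose["x"]))    # first direction
    if abs(tll["y"] - pose["y"]) > tol:
        moves.append(("lateral_y", tll["y"] - pose["y"]))     # second direction
    if abs(tll["theta"] - pose["theta"]) > tol:
        moves.append(("rotate_z", tll["theta"] - pose["theta"]))
    moves.append(("vertical_z_press", 0))  # last: press label onto object
    return moves
```

Axes already within tolerance produce no move, matching the idea that each dedicated-axis module only engages when needed.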
In some embodiments, one or more of the alignment elements and/or the print module 314 may operate cooperatively and/or sequentially to align the target labeling position with the labeling module 316. For example, upon and/or after visually analyzing the object and identifying the target labeling location, the print module 314 may print a label, the conveyor 332 may advance the object along the x-axis, and/or the lateral motion module 320 may move the labeling module 316 along the y-axis. The vertical motion module 322 and the rotation module 324 may then cooperate to move the labeling module 316 along and about the z-axis, respectively, and place a label on the object. In some embodiments, the vertical motion module 322 and/or the rotation module 324 may operate before, during, or after the conveyor 332 and the lateral motion module 320. Further, the vertical motion module 322 and/or the rotation module 324 may engage when or just after (e.g., 0.5 seconds, 1 second, 5 seconds, etc.) the labeling module 316 is aligned with the target labeling position along the x-, y-, and/or z-axis and/or about the z-axis. Once the label is placed on the object, the labeling module 316 may be retracted by the alignment assembly and prepared to place a label on a subsequent object (e.g., O2). For example, the visual analysis module 312 may visually analyze a subsequent object while the labeling module 316 is aligned with the target labeling position of an object (e.g., O1) and/or while a label is placed on the object (e.g., O1).
Fig. 4 is a rear perspective view of a second example multi-purpose labeling system 400 (e.g., an example of the multi-purpose labeling system 104 of fig. 1), configured in accordance with some embodiments of the present technology, that, similar to the labeling system 300 of fig. 3, can visually inspect an object and place a label thereon. The labeling system 400 of fig. 4 may include one or more or all of the same and/or similar elements performing the corresponding operations of the labeling system 300 of fig. 3. Portions of the labeling system 400 of fig. 4 may correspond to a set of (e.g., three) zones associated with portions of the conveyor assembly 430 for managing visual analysis and labeling of objects. Further, instead of a visual analysis module 312 coupled to the labeling assembly 310 of fig. 3, the labeling system 400 of fig. 4 may include a visual analysis unit 416 that is physically separate from the labeling assembly 310.
The three zones of the labeling system 400 of fig. 4 may include a scan zone, a queuing zone, and a labeling zone corresponding to object processing stages. An object may enter the scan zone on a first portion of the conveyor 432 of the conveyor assembly 430, where the visual analysis unit 416 may identify information about the object (e.g., object information) in preparation for finding the target labeling location. The object may then move to the queuing zone on a second portion of the conveyor 432, where one or more objects may be held, such as while a target labeling location is identified for each object and/or while the labeling assembly 310 prepares a label for the next object. Finally, the object may move to the labeling zone on a third portion of the conveyor 432, where the labeling assembly 310 and the third portion of the conveyor 432 may align the labeling module 316 with the target labeling position and the labeling module 316 of fig. 3 may adhere the label to the object.
Similar to the labeling system 300 of fig. 3, in some embodiments, the labeling system 400 of fig. 4 may include one or more conveyor assemblies 430 having one or more conveyors 432. For example, the first, second, and third portions of the conveyor 432 may correspond to segments of a single conveyor 432 of a single conveyor assembly 430. Alternatively, as another example, the labeling system 400 may include a single conveyor assembly 430 having three conveyors 432, each conveyor corresponding to the first, second, or third portion; or the labeling system 400 may include three conveyor assemblies 430, each corresponding to the first, second, or third portion and having a single conveyor 432.
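For illustration only, the scan/queue/label flow can be modeled as a small per-step pipeline; the stage names follow the text, but the scheduling policy (one object advances per conveyor step, fixed queue capacity) is an assumption.

```python
from collections import deque

def step_zones(scan, queue, label, max_queue=3):
    """Advance objects one zone per conveyor step: scan -> queue -> label.
    scan and queue are deques; label is a list holding the object under
    the labeling module. Returns the objects finished labeling this step."""
    done = list(label)                   # labeled objects exit the system
    label.clear()
    if queue:
        label.append(queue.popleft())    # next queued object moves under labeler
    while scan and len(queue) < max_queue:
        queue.append(scan.popleft())     # scanned objects wait their turn
    return done
```

Running successive steps shows objects draining through all three zones in order.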
The visual analysis unit 416 may be carried by the visual analysis unit frame 404. The visual analysis unit frame 404 may be coupled to the ground or rest on the ground. In some embodiments, the visual analysis unit frame 404 may be coupled to and movable with the conveyor assembly 430. The visual analysis unit 416 may collect object information for the labeling system 400 to identify the object and/or a target labeling location thereon, as well as collect information for aligning the labeling module 316 with the target labeling location. The visual analysis unit 416 may include one or more imaging devices and/or sensors (e.g., imaging device 222 of fig. 2) directed toward the conveyor assembly 430 or a related location. For example, the visual analysis unit 416 may include: (i) one or more 3D cameras, (ii) one or more 2D cameras, (iii) one or more scanners, and/or (iv) one or more sensors for tracking information about the conveyor assembly 430 and/or objects thereon.
One or more 3D cameras, one or more 2D cameras, and one or more scanners may be coupled to any portion of the visual analysis unit frame 404 and positioned to analyze any one or more surfaces of the object. For example, a 3D camera 418 may be disposed at the top, front, or back of the visual analysis unit frame 404 and directed toward the object (e.g., facing toward or away from the labeling assembly 310, respectively) to collect the height, width, and length of the object within the field of view (e.g., VF) for label alignment and placement. One or more 2D cameras 420 may be coupled to the top and/or one or more sides of the visual analysis unit frame 404 to collect images of the top and/or one or more sides of the object to identify the target labeling location. A scanner 422 may be coupled to the top and/or one or more sides of the visual analysis unit frame 404 to collect identifying information from the object. Similarly, one or more sensors 424 may be coupled to the top and/or one or more sides of the visual analysis unit frame 404 for tracking information about the conveyor assembly 430 and/or objects thereon. For example, the sensors 424 may include one or more encoders, switches, force sensors, level sensors, proximity sensors, IR beam sensors, light curtains, and/or any similar sensors for tracking the operation of the conveyor 432, identification information about objects on the conveyor, and/or the position and/or pose of objects on the conveyor.
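As a back-of-the-envelope sketch of conveyor tracking with an encoder (one possible use of the sensors 424), an object's position can be dead-reckoned from encoder counts since the object entered the zone. The counts-per-revolution and roller circumference below are made-up example values.

```python
def object_x_mm(entry_x_mm, encoder_counts, counts_per_rev=1024,
                roller_circumference_mm=200.0):
    """Object position = entry position + belt travel since entry,
    where belt travel is encoder revolutions times roller circumference."""
    travel = (encoder_counts / counts_per_rev) * roller_circumference_mm
    return entry_x_mm + travel
```

For example, one full roller revolution (1024 counts) advances an object by one roller circumference (200 mm) under these assumed constants.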
Fig. 5 is a top view of an object 500 having pre-existing items (e.g., a pre-existing label 502, a pre-existing image 504) on its top surface. The object 500 is an example of an object that may be processed within a robotic system (e.g., robotic system 100 of fig. 1), including visual analysis and labeling by a multi-purpose labeling system (e.g., labeling systems 300, 400 of figs. 3 and 4). When the object 500 is docked with the labeling system, the top and/or one or more sides of the object 500 may be visually analyzed (e.g., by the visual analysis module 312 of fig. 3 or the visual analysis unit 416 of fig. 4) to identify: (i) its surface texture, (ii) identification (e.g., recognition) and/or other information about the object 500, and/or (iii) the pose of the object 500 relative to the robotic system and/or labeling system.
The robotic system and/or labeling system may derive a target labeling location (e.g., TLL) for placing a label (e.g., by the labeling system) on the object 500 and/or print a label for placement on the object 500 based on the surface texture, identification information, and/or other information about the object 500, one or more object surfaces, and/or an item on the object surface. Further, the robotic system and/or labeling system may align a labeling module (e.g., labeling module 316 of fig. 3) with the target labeling position, and the labeling system may place a label thereat based on the object pose.
In some embodiments, the labeling system may derive a target labeling position that is separate from (e.g., does not overlap) and/or relative to the pre-existing item. The labeling system may operate according to one or more predetermined rules to derive a target labeling position. For example, the labeling system may derive the target labeling position based on rules that favor one or more areas (e.g., halves, quadrants, corner areas, etc.), use the pre-existing label 502 and/or the pre-existing image 504 as a reference, and so on. As shown in fig. 5, the labeling system may derive a target labeling position based on using a pre-existing label 502 as a reference. Accordingly, the labeling system may align a first reference edge (e.g., the edge closest to the shared object edge) of the target labeling location with a first edge of the preexisting label 502. The labeling system may identify a second reference edge (e.g., an edge that is perpendicular to the first reference edge and faces a larger or interior region of the object). The labeling system may derive a pose of the target labeling position based on the second reference edge (such as according to the separation distance and/or aligning a corresponding second edge of the label in parallel with the second reference edge). The labeling system may also derive the target labeling location as covering or partially overlapping one or more pre-existing items based on the object information and/or information identified from the pre-existing items. For example, the labeling system may derive the target labeling location of the label as covering an outdated label, covering a label unrelated to the contents of the object, partially covering a previous label (e.g., affixing a barcode label to a previous barcode), or any similar location.
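The edge-reference rule above can be sketched as follows: align the new label's first edge with the pre-existing label's edge, and offset its second edge inward by a separation distance. The coordinate convention (origin at the shared object corner) and the gap value are assumptions for illustration, not the patented derivation.

```python
def place_by_reference(ref_label, gap=0.5):
    """ref_label: (x, y, w, h) of a pre-existing label, with the origin at
    the object corner whose edge the reference label shares. Returns
    (x, y) for the new label's lower-left corner: first edges aligned,
    second edge separated by `gap` toward the object interior."""
    x, y, w, h = ref_label
    new_x = x            # first reference edge: aligned with ref label
    new_y = y + h + gap  # second edge: parallel, offset past ref label
    return (new_x, new_y)
```

So a reference label of height 2 at the corner yields a new-label origin just past it, separated by the gap.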
Fig. 6 is a top perspective view of a lateral motion module 320 of a labeling system configured in accordance with some embodiments of the present technique. Lateral movement module 320 may be a sub-element of the labeling alignment assembly of fig. 3 and 4 and/or may be operable to align labeling module 316 with a target labeling position along the y-axis. The lateral motion module 320 may include a lateral frame 602 movably coupled to an upper portion of the assembly frame 304. The transverse frame 602 may be coupled to the upper portion using any suitable mechanism that allows the transverse motion module 320 to translate along the y-axis. For example, the transverse frame 602 may be coupled to the upper portion by one or more brackets 604 that ride on one or more transverse rails 606 (e.g., rails, slides) coupled between the transverse frame 602 and the front and/or rear of the upper portion.
The transverse frame 602 may be translated along the one or more transverse rails 606 using one or more motors controlled by a robotic system (e.g., robotic system 100 of fig. 1) and/or the labeling system 300. For example, one or more lateral racks 612 may be coupled to the one or more transverse rails 606 and/or an upper portion of the assembly frame 304, and one or more lateral servos 608 may be coupled to the transverse frame 602. Each lateral servo 608 may include one or more lateral pinions 610 that cooperate with the one or more lateral racks 612, and may selectively drive the lateral pinions 610 to translate the lateral motion module 320. In some embodiments, the one or more lateral racks 612 may instead be coupled to the transverse frame 602, and the one or more lateral servos 608 may be coupled to the assembly frame 304. As shown in fig. 6, the lateral motion module 320 includes: (i) a transverse frame 602 coupling the print module 314 to an upper portion of the assembly frame 304, (ii) eight brackets 604 (e.g., four each at the front and back), (iii) four transverse rails 606 (e.g., two at the front and two at the back), and (iv) two lateral servos 608 and two lateral racks 612 (e.g., one of each at the front and back).
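For a rack-and-pinion drive like the one described above, linear travel relates to servo rotation by the pinion's pitch circumference. The sketch below states that relation with assumed example values; it is a geometric identity, not a parameter of the disclosed system.

```python
import math

def rack_travel_mm(pinion_pitch_diameter_mm, revolutions):
    """Linear rack travel = pinion pitch circumference x shaft revolutions."""
    return math.pi * pinion_pitch_diameter_mm * revolutions

def revs_for_travel(pinion_pitch_diameter_mm, travel_mm):
    """Inverse: servo revolutions needed for a desired linear travel."""
    return travel_mm / (math.pi * pinion_pitch_diameter_mm)
```

With a hypothetical 20 mm pitch-diameter pinion, one revolution yields roughly 62.8 mm of lateral travel, and the controller can invert the relation to command a target y-axis displacement.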
Fig. 7 is a front perspective view of a vertical motion module 322 of a labeling system configured in accordance with some embodiments of the present technology. For ease of reference, selected elements of the labeling assembly 310 (such as the assembly frame 304, portions of the lateral motion module 320, and the protective cover of fig. 3 over portions of the vertical motion module 322) are not shown. As shown in fig. 7, the vertical motion module 322 may be a sub-element of the labeling alignment assembly of figs. 3 and 4, and may align the labeling module 316 with the target labeling position along the z-axis (e.g., press the labeling module 316 against the object). The vertical motion module 322 may include a vertical shaft 702 (e.g., a hollow or solid beam, rod, or similar structure) movably coupled to another structure of the print module 314, the label flipping module 318, the lateral motion module 320, and/or the labeling assembly 310, where the vertical shaft 702 carries the labeling module 316.
The vertical shaft 702 may be coupled to the labeling assembly 310 using any suitable mechanism that allows the labeling module 316 to translate along the z-axis. For example, the vertical shaft 702 may be carried by a vertical support assembly 703 that is stationary along the z-axis (relative to the labeling assembly 310) and rotatable about the z-axis. The vertical support assembly 703 may include an upper bracket 704 and a lower bracket 706 coupled to one or more structures extending from the transverse frame 602. In addition, opposing side brackets 708 (or a single side bracket 708) may extend between the upper bracket 704 and the lower bracket 706. In some embodiments, the vertical support assembly 703 may not include the upper bracket 704 or the lower bracket 706. The vertical shaft 702 may extend through the upper bracket 704 and/or the lower bracket 706 and between the side brackets 708.
The vertical shaft 702 may be translated along the z-axis using one or more motors controlled by the robotic system and/or the labeling system. For example, one or more vertical racks 714 may be coupled to the vertical shaft 702, and one or more vertical servos 710 may be coupled to the vertical support assembly 703. Each vertical servo 710 may include one or more vertical pinions 712 interfaced with the one or more vertical racks 714, and may selectively drive the vertical pinions to translate the vertical shaft 702. Additionally, the vertical support assembly 703 may include one or more vertical support gears 716 and/or vertical support cams 718 (e.g., cam rollers, cam surfaces) to maintain alignment of the vertical shaft 702 along the z-axis and to allow smooth movement of the vertical shaft 702 along the z-axis. The vertical support gears 716 may interface with the one or more vertical racks 714. The vertical support cams 718 may interface with a surface of the vertical shaft 702. As shown in fig. 7, the vertical motion module 322 includes: (i) a vertical shaft 702, (ii) an upper bracket 704, (iii) a lower bracket 706, (iv) two opposing side brackets 708, (v) a vertical servo 710 with a vertical pinion 712 coupled thereto, (vi) a vertical rack 714, (vii) three vertical support gears 716, and (viii) two vertical support cams 718.
Any suitable method of rigidly or selectively coupling labeling module 316 to vertical shaft 702 may be used to couple labeling module 316 to the bottom end of the vertical shaft (e.g., the end closest to conveyors 332, 432 of fig. 3, 4). For example, labeling module 316 may be coupled to vertical shaft 702 using a press fit or threaded connection, one or more fasteners, or any similar mechanical or chemical (e.g., epoxy, adhesive) method. In some embodiments, labeling module 316 (or portions thereof) may be integrally formed with vertical shaft 702. Wires, tubes, and/or other structures supporting operation of labeling module 316 (collectively "supply lines") may pass through holes along the length of vertical shaft 702 (e.g., when vertical shaft 702 is hollow) and/or holes along its outer surface. Portions of one or more supply lines extending above vertical shaft 702 may be protected and/or organized within supply harness 720, such as one or more cable tracks or carriers; straps, ties and/or clamps; a cable bushing; and/or any other suitable wire covering and/or organizer.
In some embodiments, vertical motion module 322 may alternatively align labeling module 316 with the target labeling position along the z-axis by vertically translating labeling module 316 and one or more other components of the labeling assembly (e.g., one or more elements of the labeling assembly other than vertical motion module 322). For example, the vertical motion module 322 may be movably coupled to the assembly frame 304, the lateral motion module 320 of fig. 6 may be movably coupled to the vertical motion module 322, and the remainder of the labeling assembly 310 may be coupled to the lateral motion module 320. As another example, the lateral motion module 320 may be movably coupled to the assembly frame 304, the vertical motion module 322 may be movably coupled to the lateral motion module 320, and the remainder of the labeling assembly 310 may be coupled to the vertical motion module 322. In some embodiments, labeling assembly 310 may include a plurality of vertical motion modules 322. For example, labeling assembly 310 may include a vertical motion module 322 between assembly frame 304 and lateral motion module 320 and another between the lateral motion module and labeling module 316. By including a vertical motion module 322 between the assembly frame and the rest of labeling assembly 310 and/or including multiple vertical motion modules 322, labeling assembly 310 may benefit from an additional range of motion along the z-axis, increased operating speeds, and reduced maximum torque and/or lateral forces experienced by each module.
In some embodiments, the vertical motion module 322 may alternatively include the same or a similar mechanism as that which allows the lateral motion module 320 of fig. 6 to translate along the y-axis. In these embodiments, labeling assembly 310, labeling module 316, or lateral motion module 320 may be coupled to assembly frame 304 by one or more carriages (similar to carriage 604 of fig. 6) riding on one or more vertical rails (similar to lateral rails 606 of fig. 6) coupled to assembly frame 304. Vertical motion module 322 may then similarly include a vertical rack that interfaces with a pinion driven by a vertical servomotor to align labeling module 316 with the target labeling position. In these embodiments, the labeling assembly 310 may benefit from increased efficiency and accuracy in moving the labeling module 316 to a target labeling position, thereby increasing overall throughput, as compared to, for example, a free-moving and/or six-degree-of-freedom (e.g., arm-like) robot.
Fig. 8A and 8B are front perspective views of a rotation module 324 of a labeling system configured in accordance with some embodiments of the present technique. In particular, fig. 8A shows the rotation module 324 in an x-axis aligned position (e.g., 0° rotation), while fig. 8B shows the rotation module 324 in a y-axis aligned position (e.g., +90° rotation). For ease of reference, selected elements of labeling assembly 310 (such as portions of assembly frame 304, lateral motion module 320, and print module 314, and the protective covering of fig. 3 over portions of vertical motion module 322) are not included. As shown in fig. 8A and 8B, rotation module 324 may be a sub-element of the labeling alignment assembly of fig. 3 and 4, and may align labeling module 316 with the target labeling position about the z-axis. For example, rotation module 324 may rotate labeling module 316 by any incremental amount of rotation between +180° and -180° from the x-axis.
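Because rotation module 324 operates within +180° to -180° of the x-axis, a controller would typically wrap requested angles into that range (e.g., to avoid winding the supply lines more than a half turn). A hypothetical sketch; the function name and wrapping convention are assumptions, not part of this disclosure:

```python
def normalize_rotation(angle_deg):
    """Map any requested rotation to the equivalent angle in (-180, 180],
    measured from the x-axis-aligned (0 degree) position."""
    wrapped = angle_deg % 360.0  # Python modulo yields a value in [0, 360)
    return wrapped - 360.0 if wrapped > 180.0 else wrapped
```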
The rotation module 324 may include a rotating portion that interfaces with the vertical motion module 322 and that may be rotated by a stationary portion coupled to the print module 314, the label flipping module 318, the lateral motion module 320, and/or any other structure of the labeling assembly 310. The rotating portion may include one or more alignment gears 802 configured to rotate the vertical shaft 702 about the z-axis. Alignment gear 802 may be rotatably coupled to upper bracket 704 and/or lower bracket 706, and may interface with side brackets 708 and/or vertical shaft 702 to rotate vertical shaft 702. For example, alignment gear 802 may be rigidly coupled to and rotate upper bracket 704 and/or lower bracket 706. As another example, vertical shaft 702 may extend through an opening of alignment gear 802, and an inner surface of the opening may press against and rotate vertical shaft 702.
The stationary portion may rotate the rotating portion using one or more motors controlled by the robotic system and/or labeling system 300. For example, one or more rotary servos 804 may each selectively drive a rotary pinion 806 that interfaces with the alignment gear 802 to rotate the vertical motion module 322. As shown in fig. 8A and 8B, the rotation module 324 includes an alignment gear 802 coupled to an upper vertical support bracket 704, a rotation servo 804 coupled to one of the beams extending from the transverse frame 602, and a rotation pinion 806 coupled thereto. Although the components of the alignment assembly described may include servomechanisms and/or gearing to cause partial translation and/or rotation thereof, any suitable rotation and/or translation mechanism may be used. For example, elements of the alignment assembly may additionally or alternatively include electrical (e.g., magnetic), pneumatic, and/or hydraulic linear and/or rotary actuators; a belt and pulley assembly; additional gearing (e.g., worm gears, gear trains, gearbox assemblies); and/or any similar mechanism for operating the alignment assembly.
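The alignment-gear drive implies a simple tooth-count ratio between rotary pinion 806 and the resulting rotation of labeling module 316. A hedged sketch (the tooth counts and function names are illustrative assumptions):

```python
def module_rotation_deg(pinion_teeth, gear_teeth, servo_deg):
    """Rotation of the alignment gear (and hence the labeling module)
    produced by a given rotation of the rotary pinion."""
    return servo_deg * pinion_teeth / gear_teeth

def servo_command_deg(pinion_teeth, gear_teeth, module_deg):
    """Inverse relation: servo rotation required for a desired module rotation."""
    return module_deg * gear_teeth / pinion_teeth
```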
Fig. 9 is a front perspective view of a label flipping module (e.g., label flipping module 318) of a labeling system configured in accordance with some embodiments of the present technology. Label flipping module 318 may receive one or more labels from print module 314 of fig. 3, 4 and prepare and/or transfer the one or more labels to labeling module 316 of fig. 3, 4. For example, label flipping module 318 may receive one or more labels that need to be flipped, folded, and/or peeled; may perform one or more of these operations on the label; and may transfer the label to labeling module 316. Label flipping module 318 may include a transfer plate 902 rotatably coupled to a label flipping frame 904. One or more motors controlled by the robotic system and/or labeling system 300 may rotate (and/or incrementally rotate) the transfer plate 902 between a receiving (e.g., first) position (as shown in fig. 9) and a transferring (e.g., second) position opposite the receiving position. For example, the transfer plate 902 may be rotated 150°, 160°, 170°, 180°, or 190°, or an amount greater than, less than, or any increment therebetween, from the receiving position to the transfer position along arrow 912. The labels may be held on the bottom surface of transfer plate 902 (in the receiving position) by flipping suction assembly 908 (e.g., a vacuum assembly) drawing air through slots 910 in transfer plate 902. Additional operational details of the label flipping module are described below.
Fig. 10 is a front perspective view of a labeling module 316 of a labeling system configured in accordance with some embodiments of the present technique. Labeling module 316 may receive one or more labels from print module 314 of fig. 3, 4 and/or label flipping module 318 of fig. 3, 4 to adhere to an object. The labeling module 316 may include an upper labeling support 1002 coupled to a vertical shaft 702, a labeling plate 1004 spaced therefrom by a compliant assembly 1010, and a labeling suction assembly 1020 (e.g., a vacuum assembly). The compliant assembly 1010 may allow the bottom surface of the labeling plate 1004 to align (e.g., parallel, coplanar, etc.) with the labeling surface of the object. The compliant assembly 1010 may include: one or more compliant posts 1012 movably coupling and holding the upper labeling support 1002 and the labeling plate 1004; and a spring mechanism 1014 biasing the upper labeling support 1002 and the labeling plate 1004 apart. For example, the compliant posts 1012 may be rigidly coupled to the upper labeling support 1002 and slidably coupled to the labeling plate 1004. The spring mechanism 1014 may comprise helical compression springs around the compliant posts 1012 that allow the labeling plate 1004 to move relative to the upper labeling support 1002. The labeling suction assembly 1020 may retain one or more labels on the bottom surface of the labeling plate 1004 by drawing air through slots extending through the labeling plate 1004. In some embodiments, labeling plate 1004 may include an anti-stick material to prevent portions of the labels from adhering to labeling module 316.
Fig. 11A and 11B are bottom perspective views of label adapters 1102, 1104 of a labeling system configured in accordance with some embodiments of the present technology. Specifically, fig. 11A shows a first label adapter 1102 having an array of twenty-one through slots; and fig. 11B shows a second label adapter 1104 having an array of six through slots. The first label adapter 1102 of fig. 11A or the second label adapter 1104 of fig. 11B may be coupled to (e.g., adhered to, fastened to) the label flipping module 318 of fig. 9 and/or the labeling module 316 of fig. 10 to improve the performance of the flipping suction assembly 908 of fig. 9 and/or the labeling suction assembly 1020 of fig. 10, respectively. In some embodiments, first or second label adapters 1102, 1104 may not be included, and the array of through slots may instead extend through transfer plate 902 and/or labeling plate 1004.
The through slot array of first label adapter 1102 may correspond to the shape and/or size of a label that may cover a majority of the bottom surface of transfer plate 902 of fig. 9 and/or labeling plate 1004 of fig. 10. Similarly, the through slot array of second label adapter 1104 may correspond to the shape and/or size of a label covering a small portion of the bottom surface of transfer plate 902 and/or labeling plate 1004. By matching the through slot array to the shape and/or size of the labels, a better seal may be formed between the labels and the bottom surface of transfer plate 902 and/or labeling plate 1004 by flipping suction assembly 908 and/or labeling suction assembly 1020, respectively. In some embodiments, the array of through slots may alternatively correspond to any one or more additional label shapes and/or sizes, may correspond to labels held at certain locations thereon by the label flipping module 318 and/or labeling module 316, and/or may correspond to any other arrangement that improves the performance of the flipping suction assembly 908 and/or labeling suction assembly 1020.
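Selecting between adapters such as 1102 and 1104 amounts to picking the smallest slot-array footprint that still covers the label, which leaves the fewest slots unsealed. A hypothetical selection routine (the adapter names, dimensions, and function name are invented for illustration):

```python
def choose_adapter(label_w, label_h, adapters):
    """Pick the adapter whose slot-array footprint covers the label with the
    least excess area, so as few slots as possible are left unsealed.
    `adapters` maps adapter name -> (footprint width, footprint height)."""
    fitting = {name: (w, h) for name, (w, h) in adapters.items()
               if w >= label_w and h >= label_h}
    if not fitting:
        raise ValueError("no adapter covers the label")
    # smallest covering footprint -> best vacuum seal
    return min(fitting, key=lambda name: fitting[name][0] * fitting[name][1])
```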
Fig. 12-15 illustrate a process for labeling objects using the labeling system 300 and/or robotic system of fig. 3, in accordance with some embodiments of the present technique. The process may generally include: (i) visually analyzing the object (e.g., O1) of fig. 12 to derive a target labeling location (e.g., TLL) thereon and/or a pose thereof, (ii) preparing the label for placement on the object of fig. 13-14, and (iii) aligning the labeling module 316 with the target labeling location and placing the label thereat of fig. 15. Although fig. 12-15 illustrate the labeling process with respect to labeling system 300, labeling system 400 of fig. 4 may follow one or more of the same and/or similar steps performed by its corresponding elements. For example, the conveyor assembly 430 of fig. 4 may perform the operations described with respect to the conveyor assembly 330 of fig. 3, the visual analysis unit 416 of fig. 4 may perform the operations described with respect to the visual analysis module 312 of fig. 3, and/or any other similar operations of the labeling system 300 of fig. 3 may be performed by corresponding elements of the labeling system 400 of fig. 4.
Fig. 12 illustrates a front perspective view of a labeling system 300 that visually analyzes an object to derive a target labeling position thereon and/or a pose thereof, in accordance with some embodiments of the present technology. For example, the conveyor 332 and/or the conveyor assembly 330 may move or hold the object or a portion thereof within a field of view (e.g., VF) of the visual analysis module 312. One or more imaging devices of the vision analysis module 312 may collect object information regarding the physical and/or identifying characteristics of the object. The labeling system 300 may use the collected object information to identify the object, derive a target labeling location on the object, and identify a pose (e.g., a first positional pose) of the object relative to the conveyor assembly 330, the conveyor 332, the labeling system 300, and/or the robotic system, while in some embodiments the object is spaced apart from the labeling module 316. Based on the identified object, target labeling position, and pose, labeling system 300 may prepare the label of fig. 13A-14, and align labeling module 316 with the target labeling position and place the label on the object at the target labeling position of fig. 15.
Fig. 13A-14 illustrate front perspective views of selected components of labeling assembly 310 preparing a label (e.g., label 1300) for placement on an identified object, in accordance with some embodiments of the present technique. In particular, fig. 13A and 13B illustrate a labeling assembly 310 including a label flipping module 318 between a printing module 314 and a labeling module 316; and fig. 14 shows labeling assembly 310 without label flipping module 318 between print module 314 and labeling module 316.
With respect to fig. 13A, in some embodiments, a label 1300 for an identified object may need to be folded after printing and/or before placement on the object. For example, print module 314 may print label 1300 that includes a front portion that extends over the bottom of labeling module 316 and a back portion that extends over the bottom of label flipping module 318 (when transfer plate 902 of fig. 9 is in the receiving position). The bottom side of the front and/or back portions of label 1300 (e.g., facing conveyor 332) may include adhesive for adhering label 1300 together after folding. At least the top side of the back portion may include an adhesive for adhering the label 1300 to an object, and may include information printed thereon. At least the top side of the front portion may include information printed thereon.
Before and/or while the label 1300 is extending from the print module 314 (e.g., printed by the print module, ejected from the print module), the flipping suction assembly 908 of fig. 9 and/or the labeling suction assembly 1020 of fig. 10 may engage to retain the label 1300 on the label flipping module 318 and/or the labeling module 316. Once label 1300 is printed, as shown in fig. 13B: (i) label flipping module 318 may be activated (e.g., transfer plate 902 of fig. 9 may be rotated to the transfer position) to fold label 1300, pressing and adhering the back portion of label 1300 to the front portion, (ii) flipping suction assembly 908 may be disengaged, and/or (iii) label flipping module 318 may be deactivated (e.g., transfer plate 902 may be returned to the receiving position). As shown, the labeling suction assembly 1020 may then hold the prepared (e.g., folded) label 1300 with the adhesive (previously located on the top surface of the back portion) facing the object and target labeling position.
In some embodiments, the label for the identified object may additionally or alternatively need to be flipped after printing. For example, the print module 314 may print a label extending over the bottom of the label flipping module 318 with the adhesive for adhering the label to an object facing the label flipping module 318. In these embodiments, the label may include adhesive on the top surface (e.g., facing the label flipping module 318), and may include information printed on the bottom surface and/or no adhesive on the bottom surface. The flipping suction assembly 908 may engage to hold and temporarily adhere the label to the label flipping module 318 before, while, and/or after the label extends from the print module 314. Once the label is printed and partially adhered to the label flipping module 318: (i) the label flipping module 318 may be activated, (ii) the flipping suction assembly 908 may be disengaged, (iii) the labeling suction assembly 1020 may engage to hold the label against the labeling module 316, and/or (iv) the label flipping module 318 may be deactivated and the label may be separated therefrom. Labeling suction assembly 1020 may hold the prepared (e.g., flipped) label with the adhesive (previously on the top surface) facing the object and target labeling position.
In some embodiments, the label for the identified object needs neither folding nor flipping. For example, as shown in fig. 14, the print module 314 may be adjacent to the labeling module 316 (e.g., without the label flipping module 318 between them) and may print labels directly to the labeling module 316. In these embodiments, the label may include adhesive on the bottom surface (e.g., facing the conveyor 332), and may include information printed on the top surface and/or no adhesive on the top surface. Before, during, and/or after the label extends from the printing module 314, the labeling suction assembly 1020 may engage to hold the label against the labeling module 316.
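The three preparation paths above (fold, flip, or direct transfer) can be summarized as a dispatch on the label's requirements. A minimal sketch, with the function and step names invented for illustration:

```python
def label_prep_steps(needs_fold, needs_flip):
    """Ordered preparation operations for the three cases described above."""
    if needs_fold:   # fig. 13A-13B: fold via the label flipping module
        return ["print", "hold_on_flip_and_labeling_modules",
                "fold", "release_flip_suction"]
    if needs_flip:   # adhesive printed face-up: flip onto the labeling module
        return ["print", "hold_on_flip_module", "flip",
                "transfer_to_labeling_module"]
    # fig. 14: print directly to the labeling module
    return ["print", "hold_on_labeling_module"]
```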
Fig. 15 illustrates a front perspective view of a labeling system 300 aligning a labeling module 316 with a target labeling position and placing a label thereat, in accordance with some embodiments of the present technique. For example, one or more of the alignment elements (e.g., conveyor assembly 330, conveyor 332, lateral motion module 320, vertical motion module 322, and/or rotation module 324) may cooperate simultaneously and/or sequentially to move the object or portion thereof under labeling assembly 310 and align labeling module 316 with the target labeling position (e.g., along and/or about the x, y, and/or z axes). The vertical motion module 322 may press the labeling module 316, with a prepared (e.g., printed, folded, flipped, and/or transferred) label held thereon, against the top surface of the object to adhere the label thereto. Once the label is adhered to the surface of the object, the labeling suction assembly 1020 may be disengaged (e.g., release the label and retract from the top surface of the object). In addition, the lateral motion module 320, the vertical motion module 322, and/or the rotation module 324 may reposition the labeling module 316 to receive a label for a subsequent object. For example, labeling module 316 may be repositioned adjacent to label flipping module 318 and/or printing module 314.
Fig. 16 is a flow diagram illustrating a process 1600 for labeling an object using a labeling system, in accordance with some embodiments of the present technology. The operation of process 1600 is intended to be illustrative and non-limiting. In some embodiments, for example, process 1600 may be accomplished with one or more additional operations not described, without one or more of the operations described, or with operations described and/or not described in an alternative order. As shown in fig. 16, process 1600 may include: optically scanning the object on the object conveyor to obtain visual and physical characteristics (process portion 1602); identifying a target labeling location from the visual features (process portion 1604); preparing an object label on a labeling module carried by the alignment assembly based on the visual characteristics (process portion 1606); aligning the labeling module with the target labeling position using the object conveyor and alignment assembly based on the physical features (process portion 1608); and applying an object label to the object using the alignment assembly based on the physical features (process portion 1610). The process may be performed or carried out by the robotic system 100 of fig. 1 and 2, the labeling system 300 of fig. 3, the labeling system 400 of fig. 4, and/or any similar robot and/or labeling system or portion thereof.
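The five process portions can be read as a simple pipeline. The sketch below is an illustrative control-flow outline only; each callable stands in for one process portion, and none of these names appear in the disclosure:

```python
def label_object(scan, identify_target, prepare_label, align, apply_label):
    """Drive one object through process 1600 using injected stage callables."""
    visual, physical = scan()                    # 1602: optical scan
    target = identify_target(visual)             # 1604: target labeling location
    label = prepare_label(visual)                # 1606: print/fold/flip label
    align(target, physical)                      # 1608: conveyor + alignment assembly
    return apply_label(label, target, physical)  # 1610: press label onto object
```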
Optically scanning the object on the object conveyor to obtain the visual and physical characteristics (process portion 1602) can include moving the object or a portion thereof and/or maintaining it within a field of view of the visual analysis module and/or unit and/or collecting information about the object with one or more imaging devices of the visual analysis module and/or unit. For example, one or more imaging devices may collect information about visual features, such as one or more available labeling spaces and/or one or more object identifier reads. The available labeling space may include a surface area of the object having a minimum desired size and/or uniform texture and/or not including any recognizable pattern (e.g., barcode, QR code, letter or design indicia, etc.). The one or more imaging devices may also collect information about physical features, such as the height, width, and/or length of the object, and/or additional external dimensions; and may collect information about the physical characteristics of the pose of the object relative to the labeling system and/or robotic system. For example, with respect to object pose, the collected information may identify (or be used to identify) a distance and/or rotation of the object and/or one or more object surfaces relative to the labeling system or a portion thereof.
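Finding "available labeling space" can be pictured as searching an occupancy grid of the object's top surface for clear rectangles of a minimum size. The brute-force sketch below is purely illustrative and is not the disclosed vision algorithm:

```python
def find_clear_regions(occupied, min_w, min_h):
    """Return (row, col) of every top-left corner of a min_h x min_w window
    containing no occupied cells (True = barcode, print, or other content)."""
    rows, cols = len(occupied), len(occupied[0])
    regions = []
    for r in range(rows - min_h + 1):
        for c in range(cols - min_w + 1):
            if not any(occupied[r + dr][c + dc]
                       for dr in range(min_h) for dc in range(min_w)):
                regions.append((r, c))
    return regions
```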
Identifying (e.g., deriving) a target labeling location from the visual features (process portion 1604) may include the labeling system and/or the robotic system analyzing the available labeling space to find locations that satisfy one or more predetermined conditions for placing a label. For example, the location may correspond to a location within the available labeling space, a location specified by an industry standard, a location that improves future handling of the object, and/or another location that facilitates more efficient reading of the object's label, such as keeping the label a distance from other surface content, rotating the label to a particular orientation, and so forth.
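One way to encode such predetermined conditions is to filter candidate rectangles by label size and by a minimum clearance from existing surface content. The routine below is a hypothetical illustration; the rectangle format and gap rule are assumptions, not the disclosed method:

```python
def best_placement(candidates, label_w, label_h, keep_clear, min_gap):
    """Return the first candidate rectangle (x, y, w, h) large enough for the
    label and at least min_gap away from every keep-clear rectangle
    (pre-existing barcodes, logos, labels)."""
    def far_enough(cx, cy, cw, ch):
        for kx, ky, kw, kh in keep_clear:
            # overlap test on rectangles expanded by the required margin
            if (cx < kx + kw + min_gap and kx < cx + cw + min_gap and
                    cy < ky + kh + min_gap and ky < cy + ch + min_gap):
                return False
        return True

    for cand in candidates:
        cx, cy, cw, ch = cand
        if cw >= label_w and ch >= label_h and far_enough(cx, cy, cw, ch):
            return cand
    return None
```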
Preparing the object label on the labeling module carried by the alignment assembly based on the visual characteristics (process portion 1606) may include the labeling system and/or the robotic system instructing the labeling assembly to print and prepare the label on the labeling module. The printing module may select content to print on the label based on the available labeling space and/or one or more object identifiers. For example, the print module (or labeling system and/or robotic system) may select the type of label to be printed (e.g., shape, size, color, etc.), and/or the barcode, QR code, letters, and/or design to be printed on the label. The label flipping module may then fold, flip, and/or peel the printed label and/or transfer it to the labeling module. The labeling module may hold the printed label with the adhesive facing the object by engaging a suction assembly.
Based on the physical characteristics, aligning the labeling module with the target labeling position using the object conveyor and alignment assembly (process portion 1608) may include engaging the object conveyor, the lateral motion module, the vertical motion module, and/or the rotation module to move the object or a portion thereof under the labeling assembly. Further, the aligning may comprise deriving an object placement pose in which the labeling module is aligned with the target labeling position. For example, based at least on the height, width, length, and/or pose of the object at the visual analysis module and/or unit, the labeling system and/or robotic system may derive an object placement pose, wherein the object may be located below the labeling assembly, and the labeling module may be aligned with a target labeling position (e.g., a position of the object where the target labeling position is located within the region of possible orientations of the labeling module by the alignment assembly). The labeling system and/or robotic system may also derive a motion plan to align the labeling module with the target labeling position when the object is in the placement pose. The motion plan may include offset distances between the target labeling position and the labeling module as the object moves from its pose at the visual analysis module and/or unit to the placement pose. The offset distances may include distances along and/or about the operational axes of the object conveyor and the elements of the alignment assembly. The object conveyor, lateral motion module, vertical motion module, and/or rotation module may then selectively cooperate simultaneously and/or sequentially to reduce and/or eliminate the respective offset distances. In some embodiments, the vertical motion module may maintain the offset distance between the target labeling position and the labeling module along its operating axis above a certain threshold distance.
For example, the offset along the z-axis may be maintained at greater than 1 inch, 2 inches, or 3 inches (2.5 cm, 5.1 cm, or 7.6 cm).
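The per-axis offset bookkeeping described above can be sketched as follows; the axis-to-actuator mapping and the 1-inch keep-out default are illustrative assumptions, not the disclosed motion planner:

```python
def derive_offsets(target_pose, module_pose):
    """Offsets between the target labeling position and the labeling module
    along/about each operational axis (x: conveyor, y: lateral module,
    z: vertical module, theta: rotation module)."""
    return {axis: target_pose[axis] - module_pose[axis]
            for axis in ("x", "y", "z", "theta")}

def motion_plan(offsets, z_keep_out_mm=25.4):
    """Assign each offset to its actuator, holding the z offset above a
    keep-out distance (here 1 inch = 25.4 mm) until final placement."""
    return {"conveyor": offsets["x"],
            "lateral": offsets["y"],
            "rotation": offsets["theta"],
            "vertical": max(offsets["z"], z_keep_out_mm)}
```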
Based on the physical characteristics, applying the object label to the object using the alignment assembly (process portion 1610) can include pressing the label adhesive against the object at the target labeling location. For example, the vertical motion module may eliminate the offset distance between the target labeling position and the labeling module along its operational axis. The vertical motion module may further press the labeling module against the surface of the object (e.g., apply a force to the object via the labeling module), ensuring that the label adheres to the object. The suction assembly may then be disengaged and the labeling module retracted by one or more elements of the labeling assembly, and the object conveyor may move the object from below the labeling assembly and/or to a subsequent portion of the labeling system and/or robotic system.
Aspects of one or more of the described robotic systems and/or labeling systems may efficiently and/or automatically prepare and adhere labels to objects within the robotic systems. The labels may be affixed to avoid pre-existing labels, images, and/or other items on the object as the object passes through the robotic system. By providing automatic labeling, the robot and/or labeling system may improve object tracking and/or management without human intervention, without slowing operation of the robot system, and/or without removing objects from the robot system.
Examples of the invention
The present techniques are illustrated, for example, in accordance with various aspects described below. For convenience, various examples of aspects of the present technology are described as numbered examples (1, 2, 3, etc.). These are provided as examples and do not limit the present technology. It should be noted that any dependent examples may be combined in any suitable manner and placed in respective independent examples. Other examples may be presented in a similar manner.
1. A multi-purpose labeling system, comprising:
a conveyor operable to move an object in a first direction;
a vision analysis module comprising an optical sensor directed at the conveyor and configured to generate image data depicting the object;
at least one processor and at least one memory component having instructions that, when executed by the processor, perform operations comprising calculating a placement location on the object based on readings by the vision analysis module; and
a labeling assembly spaced from the conveyor in a second direction, the labeling assembly comprising:
a printer configured to print a label based on the image data,
a labeling module having a labeling plate configured to receive the label from the printer, and
an alignment assembly having:
a lateral movement module configured to move the labeling module in a third direction,
a vertical motion module configured to move the labeling module along the second direction, wherein the first, second, and third directions are orthogonal to one another, and
a rotation module configured to rotate the labeling module about the second direction, wherein the alignment assembly is operable to place the labeling plate adjacent to the placement location.
2. The multi-purpose labeling system of example 1, further comprising a label flipping module located between the printer and the labeling module, the label flipping module configured to transfer the label from the printer to the labeling plate.
3. The multi-purpose labeling system of example 2, wherein the label flipping module comprises:
a transfer plate rotatable between a first position and a second position, and
a vacuum assembly, wherein the transfer plate is located above the vacuum assembly in the first position, and wherein the transfer plate is located above the labeling plate in the second position.
4. The multi-purpose labeling system of example 1, further comprising an assembly frame that carries the labeling assembly above the conveyor and spaces the labeling assembly from the conveyor along the second direction.
5. The multi-purpose labeling system of example 4, wherein the lateral motion module is movably coupled to the assembly frame and carries the printer, the labeling module, the vertical motion module, and the rotation module.
6. The multi-purpose labeling system of example 5, wherein the lateral motion module is movably coupled to the assembly frame using a carriage and a track.
7. The multi-purpose labeling system of example 4, wherein the printer is rigidly coupled to the assembly frame, and the lateral motion module is movably coupled to the assembly frame and carries the labeling module, the vertical motion module, and the rotation module.
8. The multi-purpose labeling system of example 1, wherein the at least one memory component has instructions that, when executed by the processor, perform operations further comprising:
deriving a placement pose of the object for affixing the tag on the object at the placement location; and
deriving a motion plan for operating the labeling assembly to apply the label based on the placement pose.
9. The multi-purpose labeling system of example 8, wherein calculating the placement location comprises identifying one or more labels, images, logos, or surface imperfections on the object, and calculating the placement location as non-overlapping with the one or more labels, images, logos, or surface imperfections on the object.
10. The multi-purpose labeling system of example 1, further comprising a visual analysis module frame separate from the labeling assembly and spaced from the labeling assembly along the first direction, wherein the visual analysis module frame carries the visual analysis module above the conveyor and spaces the visual analysis module from the conveyor along the second direction.
11. The multi-purpose labeling system of example 1, wherein the labeling module comprises a compliant assembly configured to align the labeling plate with the surface of the object when the labeling plate is adjacent to the surface of the object.
12. The multi-purpose labeling system of example 1, wherein the image data generated by the vision analysis module comprises 2D image data, 3D image data, or both.
13. A multi-purpose labeling system, comprising:
one or more controllers having a computer-readable medium carrying instructions that, when executed, result in operations comprising:
causing a vision analysis module having an optical sensor directed at a conveyor to generate image data depicting objects on the conveyor;
printing a label based on the image data;
transferring the label to a labeling module having a labeling plate;
calculating a placement location on the object based on the reading by the vision analysis module; and
aligning the labeling module with the placement location using an alignment assembly and the conveyor, wherein the alignment assembly has:
a lateral motion module configured to move the labeling module in a first direction,
a vertical motion module configured to move the labeling module along a second direction, wherein the first direction and the second direction are orthogonal to each other, and
a rotation module configured to rotate the labeling module about the second direction.
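The controller operations recited in example 13 amount to a scan → print → transfer → align → apply pipeline. The sketch below shows one hypothetical way a controller could sequence them; every interface name (`vision.scan`, `align.lateral`, and so on) is a stand-in for unspecified hardware drivers, not the patent's actual API.

```python
# Hedged end-to-end sketch of the example 13 controller operations. The
# hardware objects are assumed callables injected by the caller; the frame
# convention (conveyor owns one axis, alignment assembly owns the rest)
# follows the claim language, but all names are illustrative.
from dataclasses import dataclass

@dataclass
class Placement:
    x_mm: float       # along the conveyor (the first direction)
    y_mm: float       # lateral offset across the conveyor
    theta_deg: float  # rotation about the vertical axis

def label_object(vision, printer, flipper, align, conveyor):
    image = vision.scan()                    # image data depicting the object
    label = printer.print_label(image)       # label content from the scan
    flipper.transfer(label)                  # label onto the labeling plate
    place = vision.compute_placement(image)  # Placement on the object surface
    conveyor.advance_to(place.x_mm)          # conveyor aligns one axis
    align.lateral(place.y_mm)                # lateral motion module
    align.rotate(place.theta_deg)            # rotation module
    align.lower_and_press()                  # vertical motion module applies
    return place
```

The split of duties mirrors the claim: the conveyor supplies motion along the first direction, while the lateral, rotation, and vertical modules handle the remaining degrees of freedom.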
14. The multi-purpose labeling system of example 13, wherein the operations further comprise positioning the labeling plate adjacent to the surface of the object using the alignment assembly to place the label on the surface of the object.
15. The multi-purpose labeling system of example 13, wherein aligning the labeling module with the object based on the reading by the visual analysis module further comprises:
identifying a first positional pose of the object at a first position spaced from the labeling module;
evaluating an offset between the first positional pose and the labeling module; and
operating the conveyor, the lateral motion module, the vertical motion module, and the rotation module to eliminate the offset.
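The offset-elimination step of example 15 is, at bottom, a per-axis subtraction between the detected object pose and the labeling module's pose. The arithmetic below is an illustrative assumption about frame conventions (the conveyor carries the object toward the module; the module moves toward the object on the remaining axes), not a disclosed control law.

```python
# Illustrative math for example 15: given an object pose detected upstream
# of the labeling module, compute the moves that cancel the offset.
# Poses are (x, y, z, theta_deg); names and sign conventions are assumptions.

def compute_moves(object_pose, module_pose):
    ox, oy, oz, oth = object_pose
    mx, my, mz, mth = module_pose
    return {
        "conveyor_advance": mx - ox,   # object rides the conveyor to the module
        "lateral_move":     oy - my,   # module shifts over the object
        "vertical_move":    oz - mz,   # module lowers to the object surface
        "rotation":         oth - mth, # module rotates to match the object pose
    }

# Object detected at the scan station, module 500 mm downstream and 400 mm up:
moves = compute_moves((0.0, 30.0, 150.0, 45.0), (500.0, 0.0, 400.0, 0.0))
```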
16. The multi-purpose labeling system of example 13, wherein aligning the labeling module with the object based on the reading by the visual analysis module further comprises identifying a target labeling location for placing the label on the surface of the object.
17. A method for placing a label on an object using a multi-purpose labeling system, comprising:
optically scanning an object on an object conveyor to obtain a visual characteristic and a physical characteristic, wherein the visual characteristic comprises available labeling space and an object identifier reading, and wherein the physical characteristic comprises a size of the object;
identifying a target labeling location from the available labeling space;
preparing an object label on a labeling module carried by an alignment assembly based on the object identifier reading;
aligning the labeling module with the target labeling position using the object conveyor and the alignment assembly based on the physical characteristic; and
applying the object label to the object using the alignment assembly based on the physical characteristic.
18. The method of example 17, wherein the alignment assembly includes a lateral motion module, and wherein aligning further comprises:
advancing the object conveyor to align the labeling module with the target labeling position in a first direction; and
engaging the lateral motion module to align the labeling module with the target labeling position in a second direction.
19. The method of example 17, wherein the alignment assembly comprises a rotation module, and wherein aligning further comprises:
advancing the object conveyor to align the labeling module with the target labeling position in a first direction, and
engaging the rotation module to rotationally align the labeling module with the target labeling position.
20. The method of example 17, wherein the alignment assembly includes a vertical motion module, and wherein applying further comprises engaging the vertical motion module to bring the labeling module against the object and adhere the object label to the object.
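The "apply" step of examples 17 and 20 — engaging the vertical motion module until the label adheres — can be sketched as a press-until-contact loop. The force-threshold termination, step size, and travel limit below are all assumptions for illustration; the patent does not specify how contact is detected.

```python
# Hedged sketch of the vertical-engagement step: lower the labeling plate in
# small increments until a contact force threshold is reached, let the label
# adhere under light pressure, then retract. Sensor and actuator interfaces
# are stand-in callables; thresholds and step sizes are assumptions.

def press_and_adhere(read_force_n, lower_mm, raise_mm,
                     contact_n=5.0, step_mm=1.0, max_travel_mm=100.0):
    traveled = 0.0
    while read_force_n() < contact_n:
        if traveled >= max_travel_mm:
            raise RuntimeError("no contact within travel limit")
        lower_mm(step_mm)   # vertical motion module steps down
        traveled += step_mm
    # Contact made: label adheres under the plate's pressure, then retract.
    raise_mm(traveled)
    return traveled
```

A compliant assembly between the plate and the vertical axis (example 11) would let the plate seat flush against a slightly tilted surface during the dwell.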
Conclusion
From the foregoing, it will be appreciated that specific embodiments of the technology have been described herein for purposes of illustration; well-known structures and functions have not been shown or described in detail to avoid unnecessarily obscuring the description of the embodiments of the technology. To the extent any material incorporated herein by reference conflicts with the present disclosure, the present disclosure controls. Where the context permits, singular or plural terms may also include the plural or singular terms, respectively. Moreover, unless the word "or" is expressly limited to mean only a single item exclusive of the other items in reference to a list of two or more items, the use of "or" in such a list is to be interpreted as including (a) any single item in the list, (b) all of the items in the list, or (c) any combination of the items in the list. Further, as used herein, the phrase "and/or" as in "A and/or B" refers to A alone, B alone, and both A and B. Furthermore, the terms "comprising," "including," "having," and "with" are used throughout to mean including at least the recited feature(s), such that any greater number of the same features and/or additional types of other features are not precluded.
It should also be understood from the foregoing that various modifications may be made without departing from the present disclosure or the technology. For example, one of ordinary skill in the art will appreciate that various components of the technology can be further divided into subcomponents, or that various components and functions of the technology may be combined and integrated. In addition, certain aspects of the technology described in the context of particular embodiments may also be combined or eliminated in other embodiments. Furthermore, although advantages associated with certain embodiments of the technology have been described in the context of those embodiments, other embodiments may also exhibit such advantages, and not all embodiments need necessarily exhibit such advantages to fall within the scope of the technology. Accordingly, the present disclosure and associated technology can encompass other embodiments not expressly shown or described herein.

Claims (20)

1. A multi-purpose labeling system, comprising:
a conveyor operable to move an object in a first direction;
a vision analysis module comprising an optical sensor directed at the conveyor and configured to generate image data depicting the object;
at least one processor and at least one memory component storing instructions that, when executed by the processor, cause the processor to perform operations comprising calculating a placement location on the object based on readings from the vision analysis module; and
a labeling assembly spaced from the conveyor in a second direction, the labeling assembly comprising:
a printer configured to print a label based on the image data,
a labeling module having a labeling plate configured to receive the label from the printer, and
an alignment assembly having:
a lateral motion module configured to move the labeling module in a third direction,
a vertical motion module configured to move the labeling module along the second direction, wherein the first, second, and third directions are orthogonal to one another, and
a rotation module configured to rotate the labeling module about the second direction, wherein the alignment assembly is operable to place the labeling plate adjacent to the placement location.
2. The multi-purpose labeling system of claim 1, further comprising a label flipping module located between the printer and the labeling module, the label flipping module configured to transfer the label from the printer to the labeling plate.
3. The multi-purpose labeling system of claim 2, wherein the label flipping module comprises:
a transfer plate rotatable between a first position and a second position, and
a vacuum assembly, wherein the transfer plate is located above the vacuum assembly in the first position, and wherein the transfer plate is located above the labeling plate in the second position.
4. The multi-purpose labeling system of claim 1, further comprising an assembly frame that carries the labeling assembly above the conveyor and spaces the labeling assembly from the conveyor along the second direction.
5. The multi-purpose labeling system of claim 4, wherein the lateral motion module is movably coupled to the assembly frame and carries the printer, the labeling module, the vertical motion module, and the rotation module.
6. The multi-purpose labeling system of claim 5, wherein the lateral motion module is movably coupled to the assembly frame using a carriage and a track.
7. The multi-purpose labeling system of claim 4, wherein the printer is rigidly coupled to the assembly frame, and the lateral motion module is movably coupled to the assembly frame and carries the labeling module, the vertical motion module, and the rotation module.
8. The multi-purpose labeling system of claim 1, wherein the instructions, when executed by the processor, cause the at least one processor to perform operations further comprising:
deriving a placement pose of the object for affixing the label to the object at the placement location; and
deriving a motion plan, based on the placement pose, for operating the labeling assembly to affix the label.
9. The multi-purpose labeling system of claim 8, wherein calculating the placement location comprises identifying and avoiding one or more labels, images, logos, or areas of surface damage on the object.
10. The multi-purpose labeling system of claim 1, further comprising a visual analysis module frame separate from the labeling assembly and spaced from the labeling assembly along the first direction, wherein the visual analysis module frame carries the visual analysis module above the conveyor and spaced from the conveyor along the second direction.
11. The multi-purpose labeling system of claim 1, wherein the labeling module comprises a compliant assembly configured to align the labeling plate with the surface of the object when the labeling plate is adjacent to the surface of the object.
12. The multi-purpose labeling system of claim 1, wherein the image data generated by the vision analysis module comprises 2D image data, 3D image data, or both.
13. A multi-purpose labeling system, comprising:
one or more controllers having a computer-readable medium carrying instructions that, when executed, result in operations comprising:
causing a vision analysis module having an optical sensor directed at a conveyor to generate image data depicting objects on the conveyor;
printing a label based on the image data;
transferring the label to a labeling module having a labeling plate;
calculating a placement location on the object based on the reading by the vision analysis module; and
aligning the labeling module with the placement location using an alignment assembly and the conveyor, wherein the alignment assembly has:
a lateral motion module configured to move the labeling module in a first direction,
a vertical motion module configured to move the labeling module along a second direction, wherein the first direction and the second direction are orthogonal to each other, and
a rotation module configured to rotate the labeling module about the second direction.
14. The multi-purpose labeling system of claim 13, wherein the operations further comprise positioning the labeling plate adjacent to the surface of the object using the alignment assembly to place the label on the surface of the object.
15. The multi-purpose labeling system of claim 13, wherein aligning the labeling module with the object based on the reading by the visual analysis module further comprises:
identifying a first positional pose of the object at a first location spaced from the labeling module;
evaluating an offset between the first positional pose and the labeling module; and
operating the conveyor, the lateral motion module, the vertical motion module, and the rotation module to eliminate the offset.
16. The multi-purpose labeling system of claim 13, wherein aligning the labeling module with the object based on the reading by the visual analysis module further comprises identifying a target labeling location for placing the label on the surface of the object.
17. A method for placing a label on an object using a multi-purpose labeling system, comprising:
optically scanning an object on an object conveyor to obtain a visual characteristic and a physical characteristic, wherein the visual characteristic comprises available labeling space and an object identifier reading, and wherein the physical characteristic comprises a size of the object;
identifying a target labeling location from the available labeling space;
preparing an object label on a labeling module carried by an alignment assembly based on the object identifier reading;
aligning the labeling module with the target labeling position using the object conveyor and the alignment assembly based on the physical characteristic; and
applying the object label to the object using the alignment assembly based on the physical characteristic.
18. The method of claim 17, wherein the alignment assembly comprises a lateral motion module, and wherein aligning further comprises:
advancing the object conveyor to align the labeling module with the target labeling position in a first direction; and
engaging the lateral motion module to align the labeling module with the target labeling position in a second direction.
19. The method of claim 17, wherein the alignment assembly comprises a rotation module, and wherein aligning further comprises:
advancing the object conveyor to align the labeling module with the target labeling position in a first direction; and
Engaging the rotation module to rotationally align the labeling module with the target labeling position.
20. The method of claim 17, wherein the alignment assembly comprises a vertical motion module, and wherein applying further comprises engaging the vertical motion module to bring the labeling module against the object and adhere the object label to the object.
CN202211009475.8A 2021-08-13 2022-08-15 Robotic system and method with multi-purpose labeling system Pending CN115557044A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US202163232665P 2021-08-13 2021-08-13
US63/232,665 2021-08-13
US17/885,421 US20230050326A1 (en) 2021-08-13 2022-08-10 Robotic systems with multi-purpose labeling systems and methods
US17/885,421 2022-08-10
CN202210977868.1A CN115703559A (en) 2021-08-13 2022-08-15 Robot system and method with multi-purpose labeling system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202210977868.1A Division CN115703559A (en) 2021-08-13 2022-08-15 Robot system and method with multi-purpose labeling system

Publications (1)

Publication Number Publication Date
CN115557044A 2023-01-03

Family

ID=84777510

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211009475.8A Pending CN115557044A (en) 2021-08-13 2022-08-15 Robotic system and method with multi-purpose labeling system

Country Status (1)

Country Link
CN (1) CN115557044A (en)

Similar Documents

Publication Publication Date Title
CN110329710B (en) Robot system with robot arm adsorption control mechanism and operation method thereof
JP6738112B2 (en) Robot system control device and control method
US20180065806A1 (en) Picking apparatus
CN106002236B (en) Restructural mounting work station
US20210114826A1 (en) Vision-assisted robotized depalletizer
EP1818284A1 (en) Automatic warehouse
US20200277139A1 (en) Warehouse system
US11180317B1 (en) Rotary sortation and storage system
US11110613B2 (en) Holding mechanism, transfer device, handling robot system, and robot handling method
CN114762983A (en) Robot system with clamping mechanism
Aleotti et al. Toward future automatic warehouses: An autonomous depalletizing system based on mobile manipulation and 3d perception
JP2012040669A (en) Bulk picking device and method thereof
CN114118931A (en) Logistics management system and method for monitoring cargo state in real time
US20230050326A1 (en) Robotic systems with multi-purpose labeling systems and methods
CN115557044A (en) Robotic system and method with multi-purpose labeling system
WO2023193773A1 (en) Robotic systems with object handling mechanism and associated systems and methods
Cosma et al. An autonomous robot for indoor light logistics
US20230052763A1 (en) Robotic systems with gripping mechanisms, and related systems and methods
CN112495805A (en) Sorting system and method based on multi-face code reading, electronic terminal and storage medium
WO2023086868A1 (en) Automated product unloading, handling, and distribution
WO2022190102A1 (en) System and method for identifying or acquiring data corresponding to a handled item
CN111618852B (en) Robot system with coordinated transfer mechanism
US10968051B1 (en) Adjustable robotic end of arm tool for multiple object handling
JP2023004755A (en) Handling system, instruction device, handling method, program, and storage medium
US20240149460A1 (en) Robotic package handling systems and methods

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination