CN114833818A - Robot manufacturing system and method - Google Patents
- Publication number
- CN114833818A (application CN202210063237.9A)
- Authority
- CN
- China
- Prior art keywords
- robot
- workpiece
- image
- control unit
- work area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B25J9/161—Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1628—Programme controls characterised by the control loop
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1661—Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Abstract
A robot manufacturing system and method include a robot having a control unit in communication with an operating member. The control unit is configured to operate the operating member with respect to a workpiece within a work area according to a control plan. A presence sensor is configured to detect the presence of a human within the work area. An imaging device is configured to acquire an image of at least a portion of the workpiece in response to the presence sensor detecting the presence of a human within the work area.
Description
Technical Field
Embodiments of the present disclosure relate generally to robotic manufacturing systems and methods, such as may be used to form part of a vehicle.
Background
Various structural members are formed and assembled by robots. Furthermore, human operators often work with robots in certain manufacturing installations. In particular, a cobot (i.e., a collaborative robot) interacts with one or more humans in a workspace to form a structure. During the manufacturing process, the cooperating robots and human operators each perform certain defined tasks on the structure.
During the manufacturing process, one or more human operators check the assembly conditions of the structural members to ensure correct and desired shaping, assembly, and the like. It will be appreciated that manual assembly condition checking is time consuming and prone to human error. For example, humans often walk around the structure being formed and compare the assembled state to a large number (e.g., hundreds or more) of engineering drawings to ensure manufacturing integrity. This procedure is generally adequate for conventional automated assembly processes, in which the robot and the human work independently of each other and typically cannot interfere with each other's work.
However, in a cooperative robotic process in which a human and a robot work together, the work area including a structural member, and the structural member itself, often evolve and change over time. In such installations, the human operator may alter the structural member to a point at which the cooperating robot no longer recognizes the state of the structural member, resulting in a quality failure.
Disclosure of Invention
There is a need for a system and a method for efficiently and effectively manufacturing structural members by a cooperative robotic process. Further, there is a need for a system and method for ensuring consistency in the quality of structural members formed by robotic or collaborative robotic processes.
In view of these needs, certain embodiments of the present disclosure provide a robot manufacturing system that includes a robot having a control unit in communication with an operating member. The control unit is configured to operate the operating member with respect to a workpiece within a work area according to a control plan. A presence sensor is configured to detect the presence of a human within the work area. An imaging device is configured to acquire an image of at least a portion of the workpiece in response to the presence sensor detecting the presence of a human within the work area.
In at least one embodiment, the control unit is further configured to: the image is compared with reference data of the workpiece, if the image conforms to the reference data, the operation of the operating member is continued according to the control plan, and if the image does not conform to the reference data, the operation of the operating member according to the control plan is stopped.
In at least one embodiment, the control unit is further configured to update the reference data based on at least one change between the image and the reference data. As a further example, the control unit is further configured to update the control plan after updating the reference data.
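The conform-or-stop decision described in the preceding paragraphs can be illustrated with a toy sketch. This is not the disclosed implementation; the byte-level comparison and the names `conforms` and `ControlAction` are illustrative assumptions:

```python
from enum import Enum, auto

class ControlAction(Enum):
    CONTINUE = auto()  # image conforms to reference data: keep executing the plan
    STOP = auto()      # image deviates from reference data: halt the plan

def conforms(image: bytes, reference: bytes, tolerance: float = 0.0) -> bool:
    """Toy conformance check: fraction of differing bytes within a tolerance."""
    if len(image) != len(reference):
        return False
    diffs = sum(a != b for a, b in zip(image, reference))
    return diffs / max(len(image), 1) <= tolerance

def evaluate(image: bytes, reference: bytes) -> ControlAction:
    """Continue the control plan only if the acquired image matches the reference."""
    return ControlAction.CONTINUE if conforms(image, reference) else ControlAction.STOP
```

A real system would compare extracted image features or a digital twin rather than raw bytes; the point here is only the branch between continuing and stopping the control plan.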
As an example, a presence sensor is configured to detect the presence of a human being within an operational area of a workpiece within a work area.
As one example, the robot includes one or both of a presence sensor or an imaging device. As another example, one or both of the presence sensor or the imaging device are remote from the robot within the work area.
In at least one embodiment, the imaging device is configured to acquire the image after the human is no longer within the work area.
In at least one embodiment, the imaging device is configured to acquire an image of at least a portion of the workpiece in response to the presence sensor detecting the presence of a human being within an operational area of the workpiece within the work area.
As an example, the robot further comprises a transport allowing the robot to move within the work area.
Certain embodiments of the present disclosure provide a robot manufacturing method, including: operating, by a control unit of the robot, an operating member of the robot with respect to a workpiece within the work area according to a control plan; detecting, by a presence sensor, a presence of a human within a work area; and acquiring, by the imaging device, an image of at least a portion of the workpiece in response to the presence sensor detecting the presence of a human being within the work area.
In at least one embodiment, the robot manufacturing method further comprises: comparing, by the control unit, the image with reference data of the workpiece; continuing, by the control unit, to operate the operating member according to the control plan if the image conforms to the reference data; and stopping, by the control unit, operation of the operating member according to the control plan if the image does not conform to the reference data.
In at least one embodiment, the robot manufacturing method further comprises updating, by the control unit, the reference data based on at least one change between the image and the reference data. In at least one further example, the robot manufacturing method further comprises updating, by the control unit, the control plan after updating the reference data.
Drawings
Fig. 1 shows a schematic block diagram of a robot manufacturing system according to an embodiment of the present disclosure.
Fig. 2 shows a schematic block diagram of a robot according to an embodiment of the present disclosure.
Fig. 3 shows a schematic block diagram of a work area according to an embodiment of the present disclosure.
Fig. 4 shows a schematic block diagram of a robot according to an embodiment of the present disclosure.
FIG. 5 illustrates a perspective view of a workpiece within a work area according to an embodiment of the present disclosure.
Fig. 6 illustrates a perspective view of a robot accessing a workpiece within a work area in accordance with an embodiment of the present disclosure.
FIG. 7 shows a simplified view of an image of a workpiece and reference data for the workpiece, according to an embodiment of the disclosure.
Fig. 8 shows a flow diagram of a robot manufacturing method according to an embodiment of the disclosure.
Fig. 9 shows a flow diagram of a robot manufacturing method according to an embodiment of the present disclosure.
Detailed Description
Certain embodiments of the foregoing summary, as well as the following detailed description, will be better understood when read in conjunction with the appended drawings. As used herein, an element or step recited in the singular and preceded by the word "a" or "an" should be understood as not necessarily excluding plural said elements or steps. Furthermore, references to "one embodiment" are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features. Furthermore, unless explicitly stated to the contrary, embodiments "comprising" or "having" one or more elements having a particular condition may include additional elements not having that condition.
Certain embodiments of the present disclosure provide a system and method for manufacturing a structural member. The system and method include operating the robotic system within the work area and according to a control plan for the robotic system. The system and method also include detecting the presence of a human in the work area via a human presence detector. In response to detecting the presence of a human, the system and method further include detecting an assembly condition of the structure via an assembly condition detector, comparing the detected assembly condition with a previous assembly condition to identify an assembly condition change that occurred due to the presence of a human, and updating the control plan based on the identified assembly condition change.
In at least one embodiment, the method may include populating a digital twin of the workpiece, the digital twin indicating an aspect of the workpiece prior to operation of the robotic system, and deactivating the digital twin in response to detecting the presence of the human. In at least one embodiment, updating the control plan includes refilling the digital twins based on the detected assembly conditions.
In at least one embodiment, disabling the digital twin in response to detecting the presence of a human and responsively updating the control plan may help avoid repairs and/or reworking that may otherwise occur based on the robot not knowing the changes made by the human. Further, deactivating the digital twins in response to detecting the presence of a human may also provide an increase in manufacturing rates, as the system and method may reduce or avoid continuously scanning the entire structure and updating the digital twins during operation.
Certain embodiments of the present disclosure provide a system and method for robust team-style human/robot assembly in which robotics is enhanced using human detectors. The system and method may use the presence of a human being in the vicinity of the workpiece as a signal to deactivate the robot digital twin and refill it with any changes that the mechanic may make while touching the workpiece, thereby ensuring that the digital twin reflects reality. Thus, the manufacturing quality is ensured. In addition, the system and method save a significant amount of equipment down time as compared to conventional methods that suffer from incorrect assembly conditions unknown to the automated equipment.
In at least one embodiment, a robot (e.g., a mobile collaborative robot) maintains a digital twin of part states during a manufacturing process. When the manufacturing process is semi-automated, there are tasks performed by the mechanic prior to the robot operation, and other tasks performed by the mechanic while the robot is operating. For example, temporary fasteners may be installed before the robot drills a hole in the area of interest.
Cooperative robots coordinate humans and robots to improve productivity, but this means that the robots of such systems often need to recognize what a human operator has done. As such, certain embodiments of the present disclosure include robots enhanced with an assembly condition detector and a human proximity detector. For example, the assembly condition detector includes a camera (or other sensor) to determine whether a hole has been drilled, whether filler material has been applied, whether a bracket has been installed, and any other manufacturing requirement that requires verification. The human proximity detector determines when a human is in proximity to the workpiece while the robot is working. Before starting operation, the robot uses the assembly condition detector to populate a digital twin representing the state of the workpiece. The robot may continuously monitor the human proximity detector during performance of a task. If the detector is triggered, the robot may pause and invalidate the digital twin, because the mechanic may have made changes of which the robot is unaware. After the human has left, and before the robot resumes working blindly like a conventional robot, the invalidated digital twin (which may be empty or indeterminate and needs to be updated) causes the assembly condition detector to scan again. The robot then refills the digital twin, automatically determines any changes that need to be made to the robot program, makes those changes, and resumes work.
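The pause/invalidate/rescan cycle described above might be sketched as a small state machine. This is an illustrative sketch only; the `Cobot` class and the `scan` and `human_present` callbacks are hypothetical stand-ins for the assembly condition detector and human proximity detector:

```python
class Cobot:
    """Toy model of the human-triggered digital-twin cycle (names hypothetical)."""

    def __init__(self, scan, human_present):
        self.scan = scan                    # assembly-condition detector callback
        self.human_present = human_present  # human-proximity detector callback
        self.twin = scan()                  # populate the digital twin before work
        self.paused = False

    def step(self):
        """One control-loop iteration; returns True while paused."""
        if self.human_present():
            # A mechanic may have altered the workpiece: pause and
            # invalidate the twin so stale state is never trusted.
            self.paused = True
            self.twin = None
        elif self.paused:
            # The human has left: rescan, refill the twin, resume work.
            self.twin = self.scan()
            self.paused = False
        return self.paused
```

In a usage sequence, a presence reading of `True` pauses work and clears the twin; the next `False` reading triggers a rescan before work resumes.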
Fig. 1 shows a schematic block diagram of a robot manufacturing system 100 according to an embodiment of the present disclosure. The robotic manufacturing system 100 includes a robot 102 that operates on a workpiece 104 within a work area 106. In at least one embodiment, the robotic manufacturing system 100 is a collaborative robotic system in which both the robot 102 and a human operator perform certain operations on the workpiece 104.
The robot 102 includes a control unit 108 in communication with an operating member 110, such as through one or more wired or wireless connections. The control unit 108 is also in communication with the memory 112, e.g., via one or more wired or wireless connections. In at least one embodiment, the control unit 108 includes a memory 112.
The control unit 108 operates the operating member 110 to perform operations on the workpiece 104 based on a control plan 114 stored in a memory 112. The control plan 114 includes or otherwise stores reference data 116 for the workpiece 104 throughout various stages of the manufacturing process. As an example, the reference data 116 may include image data, image transformation data, and/or the like. In at least one example, the reference data 116 includes an image of the workpiece, which may be a digital twin (digital twin) of the workpiece. Such a digital twin may be deactivated, updated, and/or otherwise manipulated. The reference data 116 may be in the same format or a different format than the image 122 acquired by the imaging device 120. In at least one example, the reference data 116 includes: initial reference data 116a associated with the workpiece 104 prior to any operations being performed thereon; final reference data 116n associated with the workpiece 104 after all manufacturing operations have been performed thereon; and intermediate reference data 116i associated with various stages of manufacture between the initial reference data 116a and the final reference data 116 n.
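As an assumed illustration only (none of these names appear in the disclosure), the staged reference data (initial 116a, intermediate 116i, final 116n) could be modeled as a per-stage lookup inside the control plan:

```python
from dataclasses import dataclass, field

@dataclass
class ControlPlan:
    """Toy control plan holding one reference-data snapshot per manufacturing stage."""
    stages: dict = field(default_factory=dict)  # stage index -> reference data

    def reference_for(self, stage: int):
        """Return the reference data associated with a given manufacturing stage."""
        return self.stages[stage]

# Stage 0 ~ initial reference data 116a, last stage ~ final reference data 116n,
# with intermediate stages corresponding to reference data 116i.
plan = ControlPlan(stages={0: "initial", 1: "intermediate", 2: "final"})
```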
The workpiece 104 may be any structural member configured to be formed by a robotic and/or collaborative robotic process. For example, the workpiece 104 may be a structural member of a vehicle. As a further example, the workpiece 104 may be part of a fuselage, a wing, or the like of an aircraft. As another example, the workpiece 104 may be a structural member that forms a portion of a fixed structure (e.g., a residential or commercial building).
The operating member 110 may be any component configured to perform an operation on the workpiece 104. For example, the operating member 110 may be a drill bit, a saw, a milling device, a lathe, a gripper arm, and/or the like. The operating member 110 may be disposed on an end effector, an articulated arm, and/or the like of the robot 102.
The robotic manufacturing system 100 also includes a presence sensor 118 in communication with the control unit 108, such as via one or more wired or wireless connections. In at least one embodiment, the presence sensors 118 are disposed remotely from the robot 102 within the work area 106. For example, the presence sensors 118 may be disposed on a boom, base, table, wall, ceiling, floor, or the like within the work area 106. As another example, the robot 102 includes a presence sensor 118.
The presence sensors 118 are configured to detect the presence of a human being within the work area 106 and/or within a target area within the work area 106. For example, the presence sensor 118 is configured to detect a human, such as by pattern recognition, machine learning, and/or the like. By way of example, the presence sensor 118 is or includes an imaging device, such as a camera, infrared sensor, laser, ultrasonic detector, or the like, that acquires an image and determines whether a human is within the image by recognizing a pattern or image associated with the shape of the human (e.g., legs, torso, and/or other parts).
The robotic manufacturing system 100 also includes an imaging device 120 in communication with the control unit 108, such as via one or more wired or wireless connections. In at least one embodiment, the imaging device 120 is disposed remotely from the robot 102 within the work area 106. For example, the imaging device 120 may be disposed on a boom, base, table, wall, ceiling, floor, or the like within the work area 106. As another example, the robot 102 includes an imaging device 120.
The imaging device 120 is configured to acquire an image of the workpiece 104. For example, the imaging device 120 acquires a plurality of images of the workpiece 104 during the manufacturing process. The imaging device 120 may be a camera, such as a digital camera, an infrared imaging device, a laser imaging device, an ultrasonic imaging device, and/or the like.
In operation, the workpiece 104 is initially set up, for example by one or more human operators positioning the workpiece 104 in a work area 106 where the robot 102 is able to operate on the workpiece 104. After the workpiece 104 is initially set, the imaging device 120 acquires an image 122 of the workpiece 104. The control unit 108 receives image data 124, which includes the image 122 from the imaging device 120. The control unit 108 compares the image 122 with the associated reference data 116 (e.g., the initial reference data 116a) (e.g., within the control plan 114). If the image 122 corresponds to the associated reference data 116 during a particular manufacturing stage, the control unit 108 operates the operating members 110 of the robot 102 to perform one or more operations on the workpiece 104 in accordance with the control plan 114. However, if the image 122 does not correspond to the associated reference data 116, the control unit 108 invalidates the manufacturing process and does not continue operating on the workpiece 104. The imaging device 120 and the control unit 108 operate accordingly during the entire manufacturing process from the initial stage to the final stage.
During the manufacturing process, the presence sensor 118 monitors the work area 106 to determine whether a human has entered the work area. If the presence sensor 118 detects the presence of a human being, the presence sensor 118 outputs a presence signal 126 to the control unit 108. The presence signal 126 indicates the presence of a human within the work area 106.
In response to receiving the presence signal 126, the control unit 108 stops operation on the workpiece 104. In response to the control unit 108 receiving the presence signal 126, the imaging device 120 acquires the image 122 of the workpiece 104. For example, the imaging device 120 acquires the image 122 immediately after the control unit 108 receives the presence signal 126. As another example, the imaging device 120 acquires the image 122 after the control unit 108 determines that the individual is no longer present in the work area 106 based on the presence signal 126. For example, the imaging device 120 is configured to acquire images after a human is no longer within the work area 106. Thus, in at least one example, acquiring an image of at least a portion of a workpiece in response to a presence sensor detecting a presence of a human being within a work area comprises waiting until the detected individual is no longer present within the work area before acquiring the image. For example, acquiring an image of at least a portion of the workpiece in response to the presence sensor detecting the presence of a human being within the work area may include: after detecting the presence of a human being within the work area, determining, by the control unit, that the human being is no longer present in the work area; and acquiring the image after determining that the human is no longer present in the work area. In this manner, the resulting image acquired by the imaging device 120 reflects any work performed by the individual while in the work area 106.
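The wait-until-departure behavior described above can be sketched as follows. The `presence` and `acquire` callbacks stand in for the presence sensor 118 and imaging device 120, and the polling interval and timeout are assumptions not stated in the disclosure:

```python
import time

def acquire_after_departure(presence, acquire, poll_s=0.0, timeout_s=1.0):
    """Block until no human is reported, then capture the workpiece image.

    presence  -- callable returning True while a human is detected
    acquire   -- callable that captures and returns the workpiece image
    """
    deadline = time.monotonic() + timeout_s
    while presence():
        if time.monotonic() > deadline:
            raise TimeoutError("human still present in work area")
        time.sleep(poll_s)
    return acquire()
```

A production system would likely also re-check presence during acquisition; this sketch only shows the ordering constraint (image after departure).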
In at least one embodiment, the control unit 108 may not change the current control plan 114 in response to receiving the presence signal 126. For example, when an individual enters the work area 106, the robot 102 may execute the current control plan 114. However, an individual may be within a certain portion of the work area 106 that does not affect the control plan 114 in another area of the work area 106. In this way, the control unit 108 may operate the robot 102 to complete a particular portion of the current control plan 114 and continue a subsequent portion of the control plan 114 in which the individual is detected by the presence sensor 118. In this way, the reference data 116 may be invalid for subsequent portions of the control plan 114, and the imaging device 120 may then acquire a new image 122 of the workpiece 104, and the control plan 114 may then be updated accordingly.
In at least one embodiment, the image 122 is a digital twin of the workpiece 104. The control unit 108 receives image data 124 comprising an image 122, the image 122 representing the workpiece 104 after the presence signal 126 has been received and the robot 102 has stopped operating on the workpiece 104. The control unit 108 then compares the image 122 with the relevant reference data 116 for the particular manufacturing stage. If the image 122 corresponds to the associated reference data 116, the control unit 108 reengages to operate on the workpiece 104. However, if the image 122 does not correspond to the associated reference data 116, the control unit 108 disables the manufacturing process and the robot 102 may not operate further on the workpiece 104. Instead, the control unit 108 updates the image 122 (e.g., the digital twin of the workpiece 104) and/or the one or more reference data 116 to include changes that may have been made by the human operator, and updates the control plan 114 accordingly. Examples of the control unit 108 updating the image 122 include modifying the image to include or remove a feature, or replacing a portion or the entirety of the image 122.
In at least one embodiment, the control unit 108 updates the image 122 before updating the control plan 114. In at least one other embodiment, the control unit 108 updates the image 122 and the control plan 114 simultaneously.
In at least one embodiment, the presence sensor 118 is configured to detect the presence of a human being within a target area of the work area 106. For example, the presence sensor 118 is configured to detect the presence of a human within an operating area 128 (e.g., within a radius of 10 feet or less) of a particular operation on the workpiece 104. If the presence sensor 118 detects presence within the operating region 128, the control unit 108 stops operation and a comparison is made between the image 122 and the associated reference data 116, as described herein. However, if the presence sensor 118 does not detect presence within the operation area 128, the control unit 108 does not stop operation even if presence is detected within the work area 106 outside the operation area 128.
In at least one embodiment, the control unit 108 determines an operating region 128 relative to the workpiece 104. The control unit 108 may determine the operating region 128 from data stored in the memory 112. As described herein, in at least one embodiment, the control unit 108 may be triggered by the presence sensor only when presence is detected within the operating region 128. Thus, presence detected near only a portion of the workpiece 104 (e.g., the area where work is being performed), rather than anywhere near the workpiece 104 as a whole, may trigger a subsequent operation (e.g., image acquisition).
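A minimal sketch of the operating-region gate described above, assuming planar coordinates and the example 10-foot radius (both assumptions for illustration; the disclosure does not specify a geometry):

```python
import math

def within_operating_region(person_xy, operation_xy, radius_ft=10.0):
    """True if the detected person is within `radius_ft` of the active operation point."""
    return math.dist(person_xy, operation_xy) <= radius_ft

def should_stop(person_xy, operation_xy, in_work_area=True):
    """Stop only for presence inside the operating region 128.

    Presence elsewhere in the work area 106 does not interrupt the control plan.
    """
    return in_work_area and within_operating_region(person_xy, operation_xy)
```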
As described above, the robotic manufacturing system 100 includes the robot 102, and the robot 102 operates on the workpiece 104 within the work area 106 according to the control plan 114. The presence sensor 118 is configured to detect the presence of a human in the work area 106. In response to detecting the presence of a human, the imaging device 120 detects an assembly condition of the workpiece 104, for example, by acquiring an image 122 of the workpiece 104. The control unit 108 then compares the assembly condition (i.e., the image 122) of the workpiece 104 with a previous assembly condition image (e.g., the reference data 116) to identify assembly condition changes that occur due to the presence of a human. The control unit 108 may then update the control plan 114 based on the identified assembly condition change.
In at least one embodiment, the image 122 acquired by the imaging device 120 is a digital twin of the workpiece 104. As an example, a digital twin indicates how the workpiece 104 looks before the operational phase of the robot 102. In at least one embodiment, the control unit 108 deactivates the digital twin (e.g., the image 122) in response to the presence sensor 118 detecting the presence of a human being within the work area 106 and/or the operational area 128 of the work area 106.
In at least one embodiment, the control unit 108 updates the control plan 114 via the digital twin based on the detected assembly conditions (e.g., by inserting an operation in an operating area, such as a through hole to be formed). For example, the control unit 108 may use the detected presence of a person as a signal to deactivate the robot's digital twin and refill the digital twin with any changes that the person may have made when touching the workpiece, thereby ensuring that the digital twin reflects reality.
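The "refill the digital twin" step might be sketched as follows, under the assumption (purely illustrative) that a twin is a mapping from feature name to assembly state:

```python
def refill_twin(old_twin, scanned):
    """Return a fresh twin from the new scan, plus the set of features that changed.

    old_twin -- twin contents before the human entered (may be stale)
    scanned  -- assembly states reported by a fresh scan after the human left
    """
    changed = {k for k in set(old_twin) | set(scanned)
               if old_twin.get(k) != scanned.get(k)}
    return dict(scanned), changed
```

The changed-feature set is what the control unit could use to determine which portions of the control plan 114 need updating.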
As described herein, in at least one embodiment, the robotic manufacturing system 100 includes a robot 102, the robot 102 including a control unit 108 in communication with an operating member 110. The control unit 108 is configured to operate the operating member 110 with respect to the workpiece 104 within the working area 106 according to a control plan 114. The presence sensor 118 is configured to detect the presence of a human within the work area 106. The imaging device 120 is configured to acquire an image 122 of at least a portion of the workpiece 104 in response to the presence sensor 118 detecting the presence of a human being within the work area 106. The control unit 108 is further configured to compare the image 122 with reference data 116 of the workpiece 104. The control unit 108 is further configured to: if the image 122 corresponds to the reference data 116, the operating member 110 continues to be operated according to the control plan 114. The control unit 108 is further configured to: if the image 122 does not correspond to the reference data 116, the operation of the operating member 110 according to the control plan 114 is stopped.
In at least one embodiment, the control unit 108 is further configured to repopulate one or both of the image 122 or the reference data 116 based on at least one change between the image 122 and the reference data 116. For example, the control unit 108 updates the reference data 116a, 116i, and/or 116n to reflect the change (depending on the manufacturing stage). The repopulated reference data 116a, 116i, and/or 116n provide revised and updated reference data 116 for the control plan 114. In at least one embodiment, the control unit 108 is further configured to update the control plan 114 after one or both of the image 122 or the reference data 116 is updated (e.g., based on the updated reference data 116).
As used herein, the terms "control unit," "central processing unit," "CPU," "computer," or similar terms may include any processor-based or microprocessor-based system, including systems using microcontrollers, Reduced Instruction Set Computers (RISC), Application Specific Integrated Circuits (ASICs), logic circuits, and any other circuit or processor, including hardware, software, or combinations thereof, that is capable of executing the functions described herein. These are exemplary only, and thus are not intended to limit in any way the definition and/or meaning of these terms. For example, as described herein, the control unit 108 may be or include one or more processors configured to control operations.
To process data, the control unit 108 is configured to execute a set of instructions stored in one or more data storage units or elements (such as one or more memories). For example, the control unit 108 may include or be coupled to one or more memories. The data storage unit may also store data or other information as desired or needed. The data storage elements may be in the form of physical memory elements within the information source or processor.
The set of instructions may include various commands that instruct the control unit 108 as a processing machine to perform specific operations, such as the methods and processes of the various embodiments of the subject matter described herein. The set of instructions may be in the form of a software program. The software may be in various forms, such as system software or application software. Further, the software may be in the form of a collection of separate programs, a subset of programs within a larger program, or a portion of a program. The software may also include modular programming in the form of object-oriented programming. The processing of input data by a processing machine may be in response to a user command, or in response to the results of a previous processing, or in response to a request made by another processing machine.
The figures of embodiments herein may show one or more control or processing units, such as control unit 108. It should be understood that a processing or control unit may represent circuitry, or portions thereof, which may be implemented as hardware with associated instructions (e.g., software stored on a tangible and non-transitory computer readable storage medium such as a computer hard drive, ROM, RAM, etc.) to perform the operations described herein. The hardware may include state machine circuitry that is hardwired to perform the functions described herein. Alternatively, the hardware may comprise electronic circuitry that includes and/or is coupled to one or more logic-based devices, such as microprocessors, processors, controllers, or the like. Alternatively, the control unit 108 may represent processing circuitry, such as one or more of a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), a microprocessor, and/or the like. The circuitry in various embodiments may be configured to execute one or more algorithms to perform the functions described herein. The one or more algorithms may include aspects of the embodiments disclosed herein, whether or not explicitly identified in a flowchart or a method.
As used herein, the terms "software" and "firmware" are interchangeable, and include any computer program stored in a data storage unit (e.g., one or more memories) for execution by a computer, including RAM memory, ROM memory, EPROM memory, EEPROM memory, and non-volatile RAM (NVRAM) memory. The above data storage unit types are exemplary only, and are thus not limiting as to the types of memory usable for storage of a computer program.
Fig. 2 shows a schematic block diagram of a robot 102 according to an embodiment of the present disclosure. Referring to fig. 1 and 2, in at least one embodiment, the robot 102 includes a presence sensor 118 and an imaging device 120. In at least one other embodiment, the robot 102 includes only one of the presence sensor 118 or the imaging device 120. In at least one other embodiment, the robot 102 does not include the presence sensor 118 or the imaging device 120.
Fig. 3 shows a schematic block diagram of the work area 106 according to an embodiment of the present disclosure. Referring to fig. 1 and 3, in at least one embodiment, the robot 102 does not include a presence sensor 118 or an imaging device 120. Rather, in such embodiments, the presence sensor 118 and the imaging device 120 are remote from the robot 102 within the work area 106.
Fig. 4 shows a schematic block diagram of a robot 102 according to an embodiment of the present disclosure. In at least one embodiment, the robot 102 includes a vehicle 130 in communication with the control unit 108, such as through one or more wired or wireless connections. The vehicle 130 is a mobile platform that allows the robot 102 to move around the work area 106, thereby providing a mobile robot. For example, the vehicle 130 may include wheels, tracks, rollers, articulated legs, or the like.
In at least one embodiment, the robot 102 also includes a navigation subsystem 132 in communication with the control unit 108, such as through one or more wired or wireless connections. The navigation subsystem 132 allows the control unit 108 to know where the robot 102 is within the work area 106 (shown in fig. 1) and to operate the vehicle 130 to move the robot 102 to a desired location within the work area 106.
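The relationship between the navigation subsystem 132 and the vehicle 130 can be sketched as follows. The disclosure does not specify an interface, so the class names, the planar pose representation, and the idealized motion are all assumptions for illustration.

```python
class NavigationSubsystem:
    """Tracks the robot's pose within the work area (illustrative sketch;
    the disclosure does not specify an interface)."""

    def __init__(self):
        self.position = (0.0, 0.0)


class Conveyance:
    """Mobile platform (wheels, tracks, rollers, or legs) that the control
    unit commands to move the robot; motion is idealized here."""

    def __init__(self, navigation):
        self.navigation = navigation

    def move_to(self, target):
        # A real drive would plan and execute a path; for illustration we
        # simply update the tracked pose to the commanded target.
        self.navigation.position = target


nav = NavigationSubsystem()
vehicle = Conveyance(nav)
vehicle.move_to((3.0, 4.0))
# nav.position is now (3.0, 4.0)
```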
In at least one other embodiment, the robot 102 does not include the navigation subsystem 132 and/or the vehicle 130. Rather, the robot 102 may be fixed in a particular location, such as along an assembly line.
Fig. 5 illustrates a perspective view of a workpiece 104 within a work area 106 according to an embodiment of the present disclosure. Referring to fig. 1 and 5, the workpiece 104 is initially set up in the work area 106, for example, by one or more human operators. For example, the workpiece 104 may be a frame member of a vehicle.
Fig. 6 illustrates a perspective view of the robot 102 approaching a workpiece 104 within the work area 106, in accordance with an embodiment of the present disclosure. Referring to fig. 1, 5, and 6, after the workpiece 104 is initially set up, the robot 102 approaches the workpiece 104 to operate thereon according to the control plan 114.
FIG. 7 illustrates a simplified view of an image 122 of a workpiece 104 and reference data 116 of the workpiece 104, in accordance with an embodiment of the present disclosure. Referring to fig. 1, 5, 6, and 7, in response to the presence sensor 118 detecting the presence of a human being within the work area 106 (and/or the operational area 128 of the work area 106), the imaging device 120 acquires an image 122 of the workpiece 104. The control unit 108 then stops the operation of the operating member 110 and compares the image 122 with the reference data 116 for the particular stage of manufacture. If the image 122 corresponds to the reference data 116, the control unit 108 reactivates the operating member 110 to continue operating on the workpiece 104 in accordance with the control plan 114. However, if the image 122 does not correspond to the reference data 116, the control unit 108 disables the manufacturing process and may avoid further operations on the workpiece 104. In at least one embodiment, in response to the image 122 not conforming to the reference data 116, the control unit 108 repopulates the image 122, which provides a current digital twin of the workpiece 104, based on changes made to the workpiece 104 (e.g., additional through holes formed in the workpiece 104, as reflected in the image 122). The control unit 108 updates the control plan 114 and the reference data 116 accordingly and continues to operate the operating member 110 based on the updated control plan 114.
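The full response to a presence event described above — compare, then either resume as-is or fold the human-made changes into the reference and the plan — can be sketched as one function. The function name, the set-of-features encoding, and the assumption that plan steps share identifiers with workpiece features are all illustrative, not from the disclosure.

```python
def handle_presence_event(image, reference, plan):
    """Sketch of the response after a human intrusion: compare the acquired
    image with the stage reference data. On a match, resume the plan
    unchanged; on a mismatch, fold the observed changes into the reference
    (the workpiece's current digital twin) and drop plan steps a human
    already completed. Names and encodings are illustrative assumptions."""
    image, reference = set(image), set(reference)
    if image == reference:
        return reference, plan, "resume"
    changes = image - reference              # e.g., an extra through hole
    updated_reference = reference | changes
    updated_plan = [step for step in plan if step not in changes]
    return updated_reference, updated_plan, "resume_updated"


ref = {"hole_a"}
img = {"hole_a", "hole_b"}                   # a human formed hole_b
plan = ["hole_b", "hole_c"]
new_ref, new_plan, status = handle_presence_event(img, ref, plan)
# new_plan == ["hole_c"]: the already-formed hole is skipped
```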
Fig. 8 shows a flow diagram of a robot manufacturing method according to an embodiment of the disclosure. Referring to fig. 1 and 8, a robotic manufacturing method includes disposing a workpiece 104 within a work area 106 at 200. Next, at 202, the robot 102 is positioned within an operating range of the workpiece 104. The operation range is a range in which the operation member 110 can operate on the workpiece 104.
At 204, the control unit 108 controls the operating members 110 of the robot 102 to operate on the workpiece 104 according to the control plan 114. At 206, the control unit 108 determines (based on the control plan 114) whether the manufacturing process is complete. If the manufacturing process is complete, the robot manufacturing method ends at 208.
However, if the manufacturing process is not complete at 206, the method proceeds to 210 where at 210 the presence sensor 118 detects whether a human being is within the work area 106 (and/or the operational area 128 of the workpiece 104). If presence is not detected, the method returns to 204.
However, if presence is detected, the robot manufacturing method proceeds from 210 to 212, at 212, the control unit 108 stops the operation of the operating member 110. Next, at 214, the imaging device 120 acquires an image 122 of the workpiece 104 after detecting the presence of a human. At 216, the control unit 108 compares the image 122 with reference data 116 stored in the memory 112. The reference data 116 corresponds to the current manufacturing stage determined by the control unit 108.
At 218, the control unit 108 determines whether the image 122 conforms to the reference data 116. If the image 122 conforms to the reference data 116, the method returns to 204.
However, if the control unit 108 determines at 218 that the image 122 does not conform to the reference data 116 (e.g., due to the formation of additional through-holes, cuts, and/or the like in the workpiece 104), then the method proceeds from 218 to 220, at 220, the control unit 108 re-populates the image 122 and/or the reference data 116 with the detected changes. Next, at 222, the control unit 108 updates the control plan 114 based on the detected changes, and the method returns to 204.
Optionally, after 222, human collaboration may be necessary before the robot can continue operating. For example, foreign object debris may be present on the workpiece, and an individual may need to remove it. Thus, after the control plan is updated at 222, operations may cease until further steps are taken, for example, by the individual.
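The overall loop of steps 204 through 222 can be sketched as follows. The step numbers in the comments refer to Fig. 8; everything else — the function signature, the presence events supplied as a set of step indices, and the set/list encoding — is a hypothetical simplification.

```python
def run_plan(plan, presence_at, acquire_image, reference):
    """Sketch of the Fig. 8 loop (steps 204-222): execute control-plan
    operations in order; whenever presence is detected, pause, acquire an
    image, and repopulate the reference data if the image deviates from it.
    Interfaces and encodings are illustrative assumptions."""
    reference = set(reference)
    executed = []
    for idx, operation in enumerate(plan):
        if idx in presence_at:               # 210: human presence detected
            image = set(acquire_image())     # 212/214: stop, then image
            if image != reference:           # 216/218: compare with reference
                reference |= image           # 220: repopulate reference data
                # 222: a fuller sketch would also revise the remaining plan
        executed.append(operation)           # 204: operate on the workpiece
        reference.add(operation)             # the twin now includes this step
    return executed, reference               # 206/208: process complete


done, ref = run_plan(
    plan=["hole_a", "hole_b"],
    presence_at={1},                              # human appears before hole_b
    acquire_image=lambda: {"hole_a", "hole_x"},   # human added hole_x
    reference=set(),
)
# done == ["hole_a", "hole_b"]; ref now also contains "hole_x"
```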
Fig. 9 shows a flow diagram of a robot manufacturing method according to an embodiment of the present disclosure. Referring to fig. 1 and 9, a robot manufacturing method includes providing a robot 102 at 300, the robot 102 including a control unit 108 in communication with an operating member 110; at 302, the operating member 110 is operated by the control unit 108 relative to the workpiece 104 within the work area 106 according to the control plan 114; at 304, detecting the presence of a human within the work area 106 by the presence sensor 118; and at 306, acquiring, by the imaging device 120, an image 122 of at least a portion of the workpiece 104 in response to the presence sensor 118 detecting the presence of a human within the work area 106.
In at least one embodiment, the robot manufacturing method further comprises comparing, by the control unit 108, the image 122 with reference data 116 of the workpiece 104; if the image 122 corresponds to the reference data 116, the operation of the operating member 110 according to the control plan 114 is continued by the control unit 108; and if the image 122 does not correspond to the reference data 116, the operation of the operating member 110 according to the control plan 114 is stopped by the control unit 108.
In at least one embodiment, the robot manufacturing method further includes updating, by the control unit 108, the reference data 116 based on at least one change between the image 122 and the reference data 116.
In at least one embodiment, the robot manufacturing method further comprises updating the control plan 114 by the control unit 108 after updating the reference data 116.
As another example, the detecting includes detecting, by the presence sensor 118, a presence of a human being within the operational area 128 of the workpiece 104 within the work area 106.
As one example, the robot manufacturing method further includes disposing one of the presence sensor 118 or the imaging device 120 on or within the robot 102. As another example, the robot manufacturing method further includes disposing the presence sensor 118 and the imaging device 120 on or within the robot 102. As another example, a robot manufacturing method includes placing one or both of the presence sensor 118 or the imaging device 120 remotely from the robot 102 within the work area 106.
As an example, the robot manufacturing method further includes moving the robot 102 within the work area 106 using a transport 130 (shown in fig. 4).
Further, the present disclosure includes embodiments according to the following clauses:
clause 1. a robotic manufacturing system, comprising:
a robot comprising a control unit in communication with an operating member, wherein the control unit is configured to operate the operating member relative to a workpiece within a work area according to a control plan;
a presence sensor configured to detect a presence of a human within a work area; and
an imaging device configured to acquire an image of at least a portion of the workpiece in response to the presence sensor detecting the presence of a human being within the work area.
Clause 2. the robot manufacturing system of clause 1, wherein the control unit is further configured to:
comparing the image with reference data of the workpiece,
if the image conforms to the reference data, continuing to operate the operating member according to the control plan, and
if the image does not conform to the reference data, stopping operating the operating member according to the control plan.
Clause 3. the robot manufacturing system of clause 2, wherein the control unit is further configured to update the reference data based on at least one change between the image and the reference data.
Clause 4. the robot manufacturing system according to clause 3, wherein the control unit is further configured to update the control plan after updating the reference data.
Clause 5. the robotic manufacturing system of any of clauses 1-4, wherein the presence sensor is configured to detect the presence of the human within an operational area of the workpiece within the work area.
Clause 6. the robot manufacturing system of any of clauses 1-5, wherein the robot includes one or both of the presence sensor or the imaging device.
Clause 7. the robot manufacturing system of any of clauses 1-6, wherein the imaging device is configured to acquire the image after the human is no longer within the work area.
Clause 8. the robotic manufacturing system of any of clauses 1-7, wherein the imaging device is configured to acquire the image of the at least a portion of the workpiece in response to the sensor detecting the presence of the human being within an operational area of the workpiece within the work area.
Clause 9. the robot manufacturing system of any of clauses 1-8, wherein one or both of the sensor or the imaging device is remote from the robot within the work area.
Clause 10. the robot manufacturing system of any of clauses 1-9, wherein the robot further comprises a transport that allows the robot to move within the work area.
Clause 11, a robot manufacturing method, comprising:
operating, by a control unit of the robot, an operating member of the robot with respect to a workpiece within a work area according to a control plan;
detecting, by a presence sensor, a presence of a human within the work area; and
in response to the presence sensor detecting the presence of the human within the work area, acquiring, by an imaging device, an image of at least a portion of the workpiece.
Clause 12. the robot manufacturing method of clause 11, further comprising:
comparing, by the control unit, the image with reference data of the workpiece;
continuing, by the control unit, to operate the operating member according to the control plan if the image conforms to the reference data; and is
Stopping, by the control unit, operating the operating member according to the control plan if the image does not conform to the reference data.
Clause 13. the robot manufacturing method of clause 12, further comprising updating, by the control unit, the reference data based on at least one change between the image and the reference data.
Clause 14. the robot manufacturing method according to clause 13, further comprising updating, by the control unit, the control plan after updating the reference data.
Clause 15. the robot manufacturing method of any of clauses 11-14, wherein the detecting comprises detecting, by the presence sensor, a presence of the human within an operational area of the workpiece within the work area.
Clause 16. the robot manufacturing method of any of clauses 11-15, wherein the acquiring comprises acquiring the image after the human is no longer within the work area.
Clause 17 the robot manufacturing method of any of clauses 11-16, wherein the acquiring comprises acquiring an image of at least a portion of the workpiece in response to the presence sensor detecting the presence of the human being within an operational area of the workpiece within the work area.
Clause 18. the robot manufacturing method of any one of clauses 11 to 17, further comprising moving the robot within the work area using a transport.
Clause 19. a robotic manufacturing system, comprising:
a robot, comprising:
a control unit in communication with an operating member, wherein the control unit is configured to operate the operating member with respect to a workpiece within a work area according to a control plan;
a presence sensor configured to detect a presence of a human within the work area and within an operational area of the workpiece within the work area; and
an imaging device configured to acquire an image of at least a portion of the workpiece in response to the presence sensor detecting the presence of the human being within the work area,
wherein the control unit is further configured to:
comparing the image with reference data of the workpiece,
stopping operating the operating member according to the control plan if the image does not conform to the reference data,
updating the reference data based on at least one change between the image and the reference data, and
updating the control plan after updating the reference data.
Clause 20. the robot manufacturing system of clause 19, wherein the robot further comprises a transport that allows the robot to move within the work area.
As described herein, certain embodiments of the present disclosure provide systems and methods for efficiently and effectively manufacturing structural members by robotic processes, such as collaborative robotic processes. Further, certain embodiments of the present disclosure provide systems and methods for ensuring consistency in the quality of structural members formed by robotic or collaborative robotic processes.
Although various spatial and directional terms, such as top, bottom, lower, middle, lateral, horizontal, vertical, front, etc., may be used to describe embodiments of the present disclosure, it is understood that such terms are used only with respect to the orientations shown in the drawings. The orientation may be reversed, rotated, or otherwise changed such that the upper portion becomes the lower portion and vice versa, horizontal becomes vertical, and so forth.
As used herein, a structure, limitation, or element that is "configured to" perform a task or operation is specifically structurally formed, constructed, or adapted in a manner that corresponds to the task or operation. For the purposes of clarity and avoidance of doubt, an object that can only be modified to perform a task or operation is not "configured to" perform the task or operation as used herein.
It is to be understood that the above description is intended to be illustrative, and not restrictive. For example, the above-described embodiments (and/or aspects thereof) may be used in combination with each other. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the various embodiments of the disclosure without departing from their scope. While the dimensions and types of materials described herein are intended to define the parameters of the various embodiments of the disclosure, these embodiments are by no means limiting and are exemplary embodiments. Many other embodiments will be apparent to those of skill in the art upon reviewing the above description. The scope of various embodiments of the disclosure should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. In the appended claims and this detailed description, the terms "including" and "in which" are used as the plain-English equivalents of the respective terms "comprising" and "wherein". Furthermore, the terms "first," "second," and "third," etc. are used merely as labels, and are not intended to impose numerical requirements on their objects. Furthermore, the limitations of the appended claims are not written in a means-plus-function format and are not intended to be interpreted based on 35 U.S.C. § 112(f), unless and until such claim limitations explicitly use the phrase "means for" followed by a statement of function devoid of further structural elements.
This written description uses examples to disclose various embodiments of the disclosure, including the best mode, and also to enable any person skilled in the art to practice various embodiments of the disclosure, including making and using any devices or systems and performing any incorporated methods. The patentable scope of various embodiments of the disclosure is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal languages of the claims.
Claims (10)
1. A robotic manufacturing system (100), comprising:
a robot (102) comprising a control unit (108) in communication with an operating member (110), wherein the control unit (108) is configured to operate the operating member (110) relative to a workpiece (104) within a work area (106) according to a control plan (114);
a presence sensor (118) configured to detect a presence of a human within the work area (106); and
an imaging device (120) configured to acquire an image (122) of at least a portion of the workpiece (104) in response to the presence sensor (118) detecting the presence of the human being within the work area (106).
2. The robotic manufacturing system (100) of claim 1, wherein the control unit (108) is further configured to:
comparing the image (122) with reference data (116) of the workpiece (104),
if the image (122) corresponds to the reference data (116), continuing to operate the operating member (110) according to the control plan (114), and
-if the image (122) does not correspond to the reference data (116), stopping operating the operating member (110) according to the control plan (114).
3. The robotic manufacturing system (100) according to claim 2, wherein the control unit (108) is further configured to update the reference data (116) based on at least one change between the image (122) and the reference data (116).
4. The robot manufacturing system (100) of claim 3, wherein the control unit (108) is further configured to update the control plan (114) after updating the reference data (116).
5. The robotic manufacturing system (100) according to any one of claims 1-4, wherein the presence sensor (118) is configured to detect the presence of the human being within an operating region (128) of the workpiece (104) within the working region (106).
6. The robotic manufacturing system (100) of any of claims 1-4, wherein the robot (102) includes one or both of the presence sensor (118) or the imaging device (120).
7. The robotic manufacturing system (100) according to any one of claims 1-4, wherein the imaging device (120) is configured to acquire the image (122) after the human being is no longer within the work area (106).
8. The robotic manufacturing system (100) according to any one of claims 1-4, wherein the imaging device (120) is configured to acquire the image (122) of the at least a portion of the workpiece (104) in response to the sensor (118) detecting the presence of the human being within an operating region (128) of the workpiece (104) within the work region (106).
9. The robotic manufacturing system (100) according to any one of claims 1-4, wherein the robot (102) further comprises a transport (130) that allows the robot (102) to move within the work area (106).
10. A robot manufacturing method, comprising:
operating (302), by a control unit (108) of a robot (102), an operating member (110) of the robot (102) with respect to a workpiece (104) within a work area (106) according to a control plan (114);
detecting (304) a presence of a human within the work area (106) by a presence sensor (118); and
acquiring (306), by an imaging device (120), an image (122) of at least a portion of the workpiece (104) in response to the presence sensor (118) detecting the presence of the human within the work area (106).
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202163143981P | 2021-02-01 | 2021-02-01 | |
US63/143,981 | 2021-02-01 |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114833818A true CN114833818A (en) | 2022-08-02 |
Family
ID=82561458
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210063237.9A Pending CN114833818A (en) | 2021-02-01 | 2022-01-20 | Robot manufacturing system and method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114833818A (en) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||