CN117428754A - Error compensation method, device, equipment and storage medium based on workpiece assembly - Google Patents


Info

Publication number
CN117428754A
CN117428754A (application CN202210832476.6A)
Authority
CN
China
Prior art keywords
workpiece
combined
virtual model
robot
assembly
Prior art date
Legal status
Pending
Application number
CN202210832476.6A
Other languages
Chinese (zh)
Inventor
黄亮
刘思仁
薛雷
刘华莹
吴凡
陈家祥
Current Assignee
Commercial Aircraft Corp of China Ltd
Shanghai Aircraft Manufacturing Co Ltd
Original Assignee
Commercial Aircraft Corp of China Ltd
Shanghai Aircraft Manufacturing Co Ltd
Priority date
Filing date
Publication date
Application filed by Commercial Aircraft Corp of China Ltd, Shanghai Aircraft Manufacturing Co Ltd filed Critical Commercial Aircraft Corp of China Ltd
Priority to CN202210832476.6A
Publication of CN117428754A
Legal status: Pending


Classifications

    • B25J 9/1605: Programme controls characterised by the control system, structure, architecture (simulation of manipulator lay-out, design, modelling of manipulator)
    • B25J 9/163: Programme controls characterised by the control loop (learning, adaptive, model based, rule based expert control)
    • B25J 9/1687: Programme controls characterised by the tasks executed (assembly, peg and hole, palletising, straight line, weaving pattern movement)
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses an error compensation method, device and equipment based on workpiece assembly and a storage medium. The method comprises the following steps: acquiring a robot virtual model corresponding to a robot and a combined virtual model corresponding to a combined workpiece in an assembly scene; wherein the combined workpiece comprises at least two sub-workpieces; determining a relative position between the robot virtual model and the combined virtual model; determining a target error gap between different sub-workpieces according to the robot virtual model, the combined virtual model and the relative position; and controlling the robot to perform assembly error compensation on each sub-workpiece based on the target error gap. The embodiment of the invention improves the assembly accuracy and the assembly efficiency of the workpiece.

Description

Error compensation method, device, equipment and storage medium based on workpiece assembly
Technical Field
The present invention relates to the field of data processing technologies, and in particular, to a method, an apparatus, a device, and a storage medium for error compensation based on workpiece assembly.
Background
In the field of assembling workpieces of large-scale structures in aerospace, because of constraints on assembly space and the diversity of workpiece shapes and sizes, the positioning and assembly of a large number of parts is at present completed manually by operators. Such manual assembly demands a high level of operator skill and yields low assembly accuracy and efficiency.
Disclosure of Invention
The invention provides an error compensation method, device and equipment based on workpiece assembly and a storage medium, so as to improve workpiece assembly accuracy and assembly efficiency.
According to an aspect of the present invention, there is provided an error compensation method based on workpiece assembly, the method comprising:
acquiring a robot virtual model corresponding to a robot and a combined virtual model corresponding to a combined workpiece in an assembly scene; wherein the combined workpiece comprises at least two sub-workpieces;
determining a relative position between the robot virtual model and the combined virtual model;
determining a target error gap between different sub-workpieces according to the robot virtual model, the combined virtual model and the relative position;
and controlling the robot to perform assembly error compensation on each sub-workpiece based on the target error gap.
According to another aspect of the present invention, there is provided an error compensation device based on workpiece assembly, the device comprising:
the combined virtual model acquisition module is used for acquiring a robot virtual model corresponding to the robot and a combined virtual model corresponding to the combined workpiece in the assembly scene; wherein the combined workpiece comprises at least two sub-workpieces;
a relative position determination module for determining a relative position between the robot virtual model and the combined virtual model;
the target error gap determining module is used for determining target error gaps among different sub-workpieces according to the robot virtual model, the combined virtual model and the relative positions;
and the error compensation module is used for controlling the robot to carry out assembly error compensation on each sub-workpiece based on the target error gap.
According to another aspect of the present invention, there is provided an electronic apparatus including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the workpiece assembly-based error compensation method of any one of the embodiments of the invention.
According to another aspect of the present invention, there is provided a computer readable storage medium storing computer instructions for causing a processor to implement the workpiece assembly-based error compensation method according to any of the embodiments of the present invention when executed.
According to the embodiment of the invention, the robot virtual model corresponding to the robot and the combined virtual model corresponding to the combined workpiece in the assembly scene are obtained; the relative position between the robot virtual model and the combined virtual model is determined; target error gaps among different sub-workpieces are determined according to the robot virtual model, the combined virtual model and the relative position; and the robot is controlled to perform assembly error compensation on each sub-workpiece based on the target error gap. In this technical scheme, the target error gap between workpieces is determined automatically, enabling the robot to perform positioning-compensated assembly. This improves workpiece assembly accuracy and efficiency, and avoids the low efficiency, spatial limitations, and low accuracy of manually positioning and assembling workpieces.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a method for error compensation based on workpiece assembly according to a first embodiment of the present invention;
FIG. 2 is a flow chart of a method for error compensation based on workpiece assembly according to a second embodiment of the invention;
FIG. 3 is a schematic structural view of an error compensation device based on workpiece assembly according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of an electronic device implementing an error compensation method based on workpiece assembly according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, a technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in which it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present invention without making any inventive effort, shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
Fig. 1 is a flowchart of a workpiece assembly-based error compensation method according to an embodiment of the present invention, where the method may be performed by a workpiece assembly-based error compensation device, and the workpiece assembly-based error compensation device may be implemented in hardware and/or software, and the workpiece assembly-based error compensation device may be configured in an electronic device. As shown in fig. 1, the method includes:
s110, acquiring a robot virtual model corresponding to a robot and a combined virtual model corresponding to a combined workpiece in an assembly scene; wherein the combined workpiece comprises at least two sub-workpieces.
The assembly scene may be a scene in which the robot is controlled to assemble a structural workpiece in the aerospace field. The combined workpiece may be a workpiece after assembly; for example, it may be a workpiece obtained by assembling an aircraft skin with a frame. If the combined workpiece is a workpiece assembled from the aircraft skin and the frame, the corresponding sub-workpieces are the aircraft skin and the frame, respectively.
The robot in the scene may be a robot that assembles the workpiece. The robot virtual model corresponding to the robot may be a spatial model obtained by modeling the robot with modeling software, for example AutoCAD (Autodesk Computer Aided Design). Correspondingly, the combined virtual model corresponding to the combined workpiece may be a spatial model obtained by modeling the combined workpiece. The robot virtual model and the combined virtual model may also be obtained by scanning the corresponding robot and workpiece, or may be obtained directly from software that already contains the relevant models, which is not limited in this embodiment.
S120, determining the relative position between the robot virtual model and the combined virtual model.
The relative position between the robot virtual model and the combined virtual model may be determined by a position tracking device, which may be, for example, a C-Track optical tracker.
For example, a scanner may be installed at the end of the robot, and a tracking device, such as a C-Track optical tracker, may Track the scanner at the end of the robot, thereby determining the relative position of the robot with respect to the tracking device, and further implementing position calibration of the robot. After the robot position calibration is accurate, the robot tail end scanner scans part of the characteristics of the combined workpiece to obtain part of point cloud data of the combined workpiece. And roughly aligning the partial point cloud data of the combined workpiece and the combined virtual model of the combined workpiece, so that the position calibration of the combined workpiece is realized, and the relative position between the robot virtual model and the combined virtual model can be determined.
In an alternative embodiment, determining the relative position between the robot virtual model and the combined virtual model comprises: determining a calibration position of a robot virtual model; controlling a robot to scan the combined workpiece to obtain partial point cloud data corresponding to the combined workpiece; and under the calibration position, carrying out alignment operation on part of the point cloud data and the combined simulation model to obtain the relative position between the robot virtual model and the combined virtual model.
The calibration position of the virtual model of the robot may be a virtual position of the robot where the position of the robot is mapped in the virtual scene in the real assembly scene, where the virtual position corresponds to the position of the robot in the real scene.
The calibration position of the robot virtual model can be determined through the C-Track optical tracker. Specifically, the tracker can measure its position relative to the robot, and the calibration position of the robot virtual model is then determined from that relative position.
The partial point cloud data of the combined workpiece may be point cloud data corresponding to partial features of the combined workpiece. For example, the robot end scanner can be controlled to scan any position of the combined workpiece, so as to obtain partial point cloud data corresponding to partial characteristics of the combined workpiece at the position.
Illustratively, a plurality of calibration points are selected from the partial point cloud data and from the combined simulation model, respectively, with each calibration point in the partial point cloud data corresponding to a calibration point in the combined simulation model. Based on the selected calibration points and a preset fitting algorithm, the partial point cloud data of the combined workpiece are rapidly fitted to the combined simulation model, achieving a rough alignment of the two and thereby yielding the relative position between the robot virtual model and the combined virtual model.
The robot scans the combined workpiece to obtain partial point cloud data, and in the fitting alignment process, the robot is controlled to perform alignment operation on the partial point cloud data and the combined virtual model, so that the relative position between the robot virtual model and the combined virtual model can be obtained after the partial point cloud data and the combined virtual model are roughly aligned.
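The fitting of paired calibration points described above can be sketched with the SVD-based Kabsch algorithm. The patent does not name its fitting algorithm, so this is one common choice, and all point values below are illustrative:

```python
import numpy as np

def rigid_align(source, target):
    """Estimate rotation R and translation t so that R @ source_i + t ≈ target_i.

    source, target: (N, 3) arrays of corresponding calibration points.
    Uses the SVD-based Kabsch algorithm.
    """
    src_c = source.mean(axis=0)
    tgt_c = target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)   # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                    # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = tgt_c - R @ src_c
    return R, t

# Example: model-frame calibration points vs. the same features in the scan
model_pts = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
angle = np.deg2rad(10)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
scan_pts = model_pts @ R_true.T + np.array([0.5, -0.2, 0.1])
R, t = rigid_align(model_pts, scan_pts)
residual = np.abs(R @ model_pts.T + t[:, None] - scan_pts.T).max()
```

With exact correspondences the recovered transform reproduces the scan points to machine precision; with noisy scan data it gives the least-squares best rigid fit, which is what a rough alignment needs.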
In this alternative embodiment, the calibration position of the robot virtual model is determined, and the obtained partial point cloud data corresponding to the combined workpiece are aligned with the combined simulation model under that calibration position. This accurately determines the relative position between the robot virtual model and the combined virtual model, so that workpiece assembly errors can conveniently be compensated based on the accurately determined relative position, improving the accuracy of workpiece error compensation.
It can be understood that the calibration position of the robot virtual model can be determined manually by related personnel, and in order to improve the determination efficiency and the determination accuracy of the calibration position, the calibration position can also be determined in an automatic mode.
In an alternative embodiment, determining the calibration position of the robot virtual model comprises: controlling an end effector of the robot to move to a preset measurement point; acquiring position coordinates of the end effector moving to each measurement point; and determining the calibration position of the robot virtual model according to the position coordinates.
The measurement point location may be a coordinate location arbitrarily selected by a related technician in an operation space of the robot. The number of measurement points may also be preset by a related technician, for example, the number of measurement points may be 4-6.
Illustratively, 4-6 measurement points are selected in the robot operation space in advance, and the end effector of the robot is controlled to move to the selected measurement points in sequence. The end effector may be a scanner, and the device tracking the robot end effector may be a C-Track optical tracker. Specifically, the tracker tracks the moving position of the scanner in real time, determining the position coordinates of the end scanner at each measurement point. From these position coordinates, a transformation matrix from the robot end scanner to the C-Track optical tracker is determined; this fixes the relative position of the robot with respect to the tracker and completes the position calibration of the robot virtual model.
The optional embodiment controls the end effector of the robot to move to a preset measurement point; acquiring position coordinates of the end effector moving to each measurement point; according to the position coordinates, the calibration position of the robot virtual model is determined, automatic determination of the calibration position of the robot virtual model is realized, and the determination efficiency and the determination accuracy of the calibration position of the robot virtual model are improved.
S130, determining target error gaps among different sub-workpieces according to the robot virtual model, the combined virtual model and the relative positions.
It will be appreciated that during assembly of the sub-workpieces, there may be instances where the sub-workpieces are not precisely bonded, for example, where there may be workpiece gaps between the aircraft skin and the frame. The target error gap is a workpiece gap left when the assembly of each sub-workpiece is not fit accurately.
For example, a set of scan points may be selected from the combined virtual model, and the robot is controlled to scan the combined workpiece according to those scan points. At the determined relative position, the robot virtual model is controlled in simulation to scan the combined virtual model according to the scan points, generating a set of scan-path codes that the robot can recognize, for example an NC code file. The robot in the real scene then scans the combined workpiece based on the scan-path codes, producing scan point cloud data corresponding to the combined workpiece. Combining the scan point cloud data with the combined virtual model makes it possible to determine the error between the measured data and the nominal data: the combined virtual model is the nominal data, a model with no gap error between sub-workpieces, while the scan point cloud data are the actually measured data, in which gap errors may exist between the sub-workpieces. The error gaps between the sub-workpieces are determined from the scan point cloud data and the combined virtual model, and at least one of the error gaps found is taken as the target error gap.
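Comparing measured scan points against the gap-free virtual model can be sketched as a nearest-neighbour deviation check. A production system would typically use a mesh distance or a spatial index rather than brute force, and all point values here are illustrative:

```python
import numpy as np

def point_to_model_gaps(scan_pts, model_pts):
    """For each scanned point, distance to the nearest nominal model point.

    A gap-free fit gives distances near zero; a large distance indicates an
    assembly gap between sub-workpieces at that scan location.
    """
    # pairwise distance matrix (N_scan, N_model); fine for modest cloud sizes
    d = np.linalg.norm(scan_pts[:, None, :] - model_pts[None, :, :], axis=2)
    return d.min(axis=1)

# nominal model points along a seam, and scanned points that deviate from them
model = np.array([[0.0, 0, 0], [1, 0, 0], [2, 0, 0]])
scan = np.array([[0.0, 0.2, 0], [1, 0, 0], [2, 0.5, 0]])
gaps = point_to_model_gaps(scan, model)   # per-location candidate error gaps
```

Each entry of `gaps` is a candidate error gap at one scanned location; downstream steps select or aggregate these into the target error gap.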
And S140, controlling the robot to perform assembly error compensation on each sub-workpiece based on the target error gap.
By way of example, a translational rotation matrix corresponding to the target error gap can be determined, and the robot is controlled to automatically adjust the terminal pose based on the translational rotation matrix to realize assembly positioning and error compensation between the sub-workpieces. The translational rotation matrix corresponding to the target error gap may be determined by using an existing technical means, which is not limited in this embodiment.
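Applying the translation-rotation matrix to the robot's end pose can be sketched with homogeneous 4x4 transforms. The 0.5 mm compensation offset below is purely illustrative:

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def compensate_pose(T_end, T_comp):
    """Apply the gap-compensation transform to the current end-effector pose."""
    return T_comp @ T_end

# current end pose: identity orientation at (1, 0, 0)
T_end = make_transform(np.eye(3), np.array([1.0, 0.0, 0.0]))
# compensation: shift 0.5 mm along z to close the measured gap (illustrative)
T_comp = make_transform(np.eye(3), np.array([0.0, 0.0, 0.5]))
T_new = compensate_pose(T_end, T_comp)
```

The new pose `T_new` is then sent to the robot controller so the end effector closes the measured gap before fastening.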
According to the embodiment of the invention, the robot virtual model corresponding to the robot and the combined virtual model corresponding to the combined workpiece in the assembly scene are obtained; the relative position between the robot virtual model and the combined virtual model is determined; target error gaps among different sub-workpieces are determined according to the robot virtual model, the combined virtual model and the relative position; and the robot is controlled to perform assembly error compensation on each sub-workpiece based on the target error gap. In this technical scheme, the target error gap between workpieces is determined automatically, enabling the robot to perform positioning-compensated assembly. This improves workpiece assembly accuracy and efficiency, and avoids the low efficiency, spatial limitations, and low accuracy of manually positioning and assembling workpieces.
Example two
Fig. 2 is a flowchart of an error compensation method based on workpiece assembly according to a second embodiment of the present invention, where the present embodiment is optimized and improved based on the above technical solutions.
Further, the step of determining the target error gaps among different sub-workpieces according to the robot virtual model, the combined virtual model and the relative position is refined into: determining the combined scanning point cloud data corresponding to the combined workpiece according to the robot virtual model, the combined virtual model and the relative position; and determining a target error gap corresponding to the combined workpiece according to the combined scanning point cloud data and the combined virtual model, so as to refine the determination of the target error gap.
As shown in fig. 2, the method comprises the following specific steps:
s210, acquiring a robot virtual model corresponding to a robot and a combined virtual model corresponding to a combined workpiece in an assembly scene; wherein the combined workpiece comprises at least two sub-workpieces.
S220, determining the relative position between the robot virtual model and the combined virtual model.
S230, determining combined scanning point cloud data corresponding to the combined workpiece according to the robot virtual model, the combined virtual model and the relative position.
For example, the scan points may be preselected in the combined virtual model, and the robot virtual model is controlled in simulation, at the determined relative position, to generate a scan path through the preselected scan points. The robot in the real scene then scans the combined workpiece along the scan path, yielding the combined scan point cloud data corresponding to the combined workpiece.
In an alternative embodiment, determining the combined scan point cloud data corresponding to the combined workpiece according to the robot virtual model, the combined virtual model and the relative position includes: determining a scanning point position of the robot according to the combined virtual model; generating a scanning path according to the scanning point positions, the robot virtual model and the relative positions; and controlling the robot to perform point cloud scanning on the combined workpiece based on the scanning path to obtain combined scanning point cloud data corresponding to the combined workpiece.
For example, scan points for scanning the combined workpiece may be selected from the combined virtual model. The scan point may be selected manually by a related technician, or may be selected automatically from a combined virtual model, which is not limited in this embodiment. The number of the scanning points is multiple, and the number of the scanning points can be specifically determined according to actual requirements.
The robot virtual model, in simulation at the determined relative position, is controlled to scan the combined virtual model through the selected scan points in sequence, generating a scan path. The data form of the scan path may be NC code that the robot can recognize. The robot in the real scene then performs a point cloud scan of the combined workpiece along this path, obtaining the combined scan point cloud data: starting from the first scan point, the robot acquires point cloud data along the scan path, and after the last scan point has been scanned, the combined scan point cloud data corresponding to the combined workpiece are generated.
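Generating a robot-readable scan path from the ordered scan points might look like the following. The patent only says the path is an NC code file, so the G-code-style syntax here is an assumption for illustration:

```python
def scan_path_code(points, feed=500):
    """Emit simple G-code-style moves visiting the scan points in order.

    The real controller format is not specified in the patent; G1 linear
    moves are used here purely to illustrate an 'NC code file'.
    """
    lines = ["G90"]  # absolute positioning
    for x, y, z in points:
        lines.append(f"G1 X{x:.3f} Y{y:.3f} Z{z:.3f} F{feed}")
    return "\n".join(lines)

# three illustrative scan points at a constant standoff height of 100 mm
program = scan_path_code([(0, 0, 100), (50, 0, 100), (50, 50, 100)])
```

The resulting program is what the simulated scan pass would export and the real robot controller would execute.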
S240, determining a target error gap corresponding to the combined workpiece according to the combined scanning point cloud data and the combined virtual model.
For example, the combination scanning point cloud data corresponding to the combination workpiece and the combination virtual model may be fitted based on a preset fitting manner, so as to obtain a target error gap corresponding to the combination workpiece.
In an alternative embodiment, the composite workpiece includes a first assembly sub-workpiece and a second assembly sub-workpiece having a conforming assembly relationship; the combined virtual model comprises a first workpiece model corresponding to the first assembly sub-workpiece and a second workpiece model corresponding to the second assembly sub-workpiece; the combined scanning point cloud data comprise first scanning point cloud data corresponding to a first assembly sub-workpiece and second scanning point cloud data corresponding to a second assembly sub-workpiece;
wherein the first and second assembly sub-workpieces may be workpieces having an assembly relationship, for example, the first assembly sub-workpiece may be an aircraft skin and the second assembly sub-workpiece may be a frame. The assembly relationship may be a snug fit of the aircraft skin to the frame.
The combined virtual model may include a first workpiece model corresponding to the first assembly sub-workpiece and a second workpiece model corresponding to the second assembly sub-workpiece. Correspondingly, the combined scanning point cloud data may include first scanning point cloud data corresponding to the first assembly sub-workpiece and second scanning point cloud data corresponding to the second assembly sub-workpiece.
Correspondingly, determining the target error gap corresponding to the combined workpiece according to the combined scanning point cloud data and the combined virtual model comprises the following steps: determining first deviation data corresponding to a first assembly sub-workpiece according to the first workpiece model and the first scanning point cloud data; determining second deviation data corresponding to a second assembly sub-workpiece according to the second workpiece model and the second scanning point cloud data; and determining a target error gap between the first assembly sub-workpiece and the second assembly sub-workpiece according to the first deviation data and the second deviation data.
The first workpiece model and the first scanning point cloud data are fitted to obtain first deviation data corresponding to the first assembly sub-workpiece. The first deviation data may be a deviation color map, and specifically may be generated by fitting the first workpiece model and the first scan point cloud data by using existing deviation color map generating software or tools.
Fitting the second workpiece model and the second scanning point cloud data to obtain second deviation data corresponding to a second assembly sub-workpiece. The second deviation data may be a deviation color map, and specifically may be generated by fitting the second workpiece model and the second scan point cloud data by using existing deviation color map generating software or tools.
And comparing and analyzing the deviation data of the first deviation data and the second deviation data, so as to obtain a target error gap between the first assembly sub-workpiece and the second assembly sub-workpiece.
In an alternative embodiment, determining the target error gap between the first assembly sub-workpiece and the second assembly sub-workpiece based on the first deviation data and the second deviation data comprises: determining at least one candidate error gap between the first assembly sub-workpiece and the second assembly sub-workpiece based on the first deviation data and the second deviation data; and taking the average error gap corresponding to the at least one candidate error gap as the target error gap.
It should be noted that, according to the first deviation data and the second deviation data, at least one candidate error gap between the first assembly sub-workpiece and the second assembly sub-workpiece may be obtained. Wherein each candidate error gap may be an error gap at a different location on the first and second assembly sub-workpieces, and the error gaps may be different in magnitude. An average value of each candidate error gap may be determined, and an average error gap corresponding to each candidate error gap may be used as the target error gap.
Optionally, the minimum mean square error corresponding to the first assembly sub-workpiece and the second assembly sub-workpiece can be determined according to the first deviation data and the second deviation data, and the minimum mean square error is used as the target error gap.
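As an illustrative sketch (not the patented implementation), the deviation fit and gap reduction above can be approximated in a few lines: per-point deviation taken as the nearest-neighbour distance from each scan point to the workpiece model (a stand-in for commercial deviation color-map software), and the target gap taken as the average — or root-mean-square, for the minimum-mean-square-error variant — of candidate gaps. Pairing the first and second deviation sets index-by-index as matched locations is an assumption made here for illustration.

```python
import numpy as np
from scipy.spatial import cKDTree

def deviation_data(model_pts, scan_pts):
    """Per-point deviation of scan points from the workpiece model,
    as nearest-neighbour distances (stand-in for a deviation color map)."""
    tree = cKDTree(model_pts)
    dists, _ = tree.query(scan_pts)
    return dists

def target_error_gap(dev_first, dev_second, use_min_mse=False):
    """Combine the two deviation sets into candidate gaps (assumed
    matched index-by-index), then reduce to one target gap."""
    n = min(len(dev_first), len(dev_second))
    candidates = dev_first[:n] + dev_second[:n]      # gap at each location
    if use_min_mse:
        return float(np.sqrt(np.mean(candidates ** 2)))  # RMS variant
    return float(np.mean(candidates))                    # average gap
```

In practice the deviation fit would run against dense scan clouds and a CAD-derived model mesh; the reduction step is the same.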
S250, controlling the robot to perform assembly error compensation on each sub-workpiece based on the target error gap.
According to this scheme, the combined scan point cloud data corresponding to the combined workpiece is determined from the robot virtual model, the combined virtual model, and the relative position; the target error gap corresponding to the combined workpiece is then determined from the combined scan point cloud data and the combined virtual model. Accurately determining the target error gap allows positioning-compensated assembly to be performed accurately. By determining the target error gap between workpieces automatically, the technical scheme realizes robot positioning-compensated assembly, improves assembly accuracy and efficiency, and avoids the low efficiency, limited applicability, and low accuracy of manual positioning and assembly.
Example III
Fig. 3 is a schematic structural diagram of an error compensation device based on workpiece assembly according to a third embodiment of the present invention. The device is suitable for error compensation of workpieces undergoing fitting assembly and may be realized in hardware and/or software. As shown in fig. 3, it comprises: a combined virtual model acquisition module 301, a relative position determination module 302, a target error gap determination module 303, and an error compensation module 304. Wherein,
the combined virtual model obtaining module 301 is configured to obtain a robot virtual model corresponding to a robot and a combined virtual model corresponding to a combined workpiece in an assembly scene; wherein the combined workpiece comprises at least two sub-workpieces;
a relative position determination module 302 for determining a relative position between the robot virtual model and the combined virtual model;
a target error gap determining module 303, configured to determine a target error gap between different sub-workpieces according to the robot virtual model, the combined virtual model, and the relative position;
and the error compensation module 304 is used for controlling the robot to perform assembly error compensation on each sub-workpiece based on the target error gap.
In the embodiment of the invention, the robot virtual model corresponding to the robot and the combined virtual model corresponding to the combined workpiece in the assembly scene are obtained; the relative position between the robot virtual model and the combined virtual model is determined; the target error gap between different sub-workpieces is determined according to the robot virtual model, the combined virtual model, and the relative position; and the robot is controlled to perform assembly error compensation on each sub-workpiece based on the target error gap. By determining the target error gap between workpieces automatically, the technical scheme realizes robot positioning-compensated assembly, improves assembly accuracy and efficiency, and avoids the low efficiency, limited applicability, and low accuracy of manual positioning and assembly.
Optionally, the target error gap determining module 303 includes:
the combined point cloud data determining unit is used for determining combined scanning point cloud data corresponding to the combined workpiece according to the robot virtual model, the combined virtual model and the relative position;
and the target error gap determining unit is used for determining the target error gap corresponding to the combined workpiece according to the combined scanning point cloud data and the combined virtual model.
Optionally, the combined point cloud data determining unit includes:
the scanning point position determining subunit is used for determining the scanning point position of the robot according to the combined virtual model;
a scan path generation subunit, configured to generate a scan path according to the scan point location, the robot virtual model, and the relative position;
and the combined point cloud data determining subunit is used for controlling the robot to perform point cloud scanning on the combined workpiece based on the scanning path to obtain combined scanning point cloud data corresponding to the combined workpiece.
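The scan-point and scan-path subunits above can be sketched minimally, under the assumption — made here for illustration, not prescribed by the patent — that scan points are placed on a raster grid at a fixed standoff above the combined model's bounding box, and ordered into a boustrophedon (lawnmower) sweep:

```python
import numpy as np

def scan_points_from_model(model_pts, standoff=0.3, nx=4, ny=3):
    """Viewpoints on a raster grid hovering above the model's bounding
    box (one simple, assumed choice of scan-point placement)."""
    lo, hi = model_pts.min(axis=0), model_pts.max(axis=0)
    xs = np.linspace(lo[0], hi[0], nx)
    ys = np.linspace(lo[1], hi[1], ny)
    z = hi[2] + standoff                       # hover above the part
    return np.array([[x, y, z] for y in ys for x in xs])

def lawnmower_path(points, nx):
    """Order grid viewpoints into a lawnmower path: the robot sweeps
    each row, reversing direction on alternate rows."""
    rows = points.reshape(-1, nx, 3).copy()
    rows[1::2] = rows[1::2, ::-1]              # reverse every other row
    return rows.reshape(-1, 3)
```

A real scan path would additionally account for the robot virtual model (reachability, collision avoidance) and the calibrated relative position when converting these viewpoints into joint-space motion.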
Optionally, the combined workpiece includes a first assembly sub-workpiece and a second assembly sub-workpiece having a fitting assembly relationship; the combined virtual model comprises a first workpiece model corresponding to the first assembly sub-workpiece and a second workpiece model corresponding to the second assembly sub-workpiece; and the combined scanning point cloud data comprises first scanning point cloud data corresponding to the first assembly sub-workpiece and second scanning point cloud data corresponding to the second assembly sub-workpiece;
correspondingly, the target error gap determining unit includes:
the first deviation determining subunit is used for determining first deviation data corresponding to the first assembly sub-workpiece according to the first workpiece model and the first scanning point cloud data;
a second deviation determining subunit, configured to determine second deviation data corresponding to the second assembly sub-workpiece according to the second workpiece model and the second scan point cloud data;
and the target error gap determining subunit is used for determining a target error gap between the first assembly sub-workpiece and the second assembly sub-workpiece according to the first deviation data and the second deviation data.
Optionally, the target error gap determining subunit is specifically configured to:
determining at least one candidate error gap between the first assembly sub-workpiece and the second assembly sub-workpiece according to the first deviation data and the second deviation data;
and taking an average error gap corresponding to the at least one candidate error gap as the target error gap.
Optionally, the relative position determining module 302 includes:
the calibration position determining unit is used for determining the calibration position of the robot virtual model;
a partial point cloud data determining unit, configured to control the robot to scan the combined workpiece, so as to obtain partial point cloud data corresponding to the combined workpiece;
and the relative position determining unit is used for performing, at the calibration position, an alignment operation on the partial point cloud data and the combined virtual model to obtain the relative position between the robot virtual model and the combined virtual model.
Optionally, the calibration position determining unit includes:
the measuring point position control subunit is used for controlling the end effector of the robot to move to a preset measuring point position;
a position coordinate acquisition subunit, configured to acquire position coordinates of the end effector moving to each measurement point;
and the calibration position determining subunit is used for determining the calibration position of the robot virtual model according to the position coordinates.
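At its core, the alignment operation that yields the relative position is a rigid registration of the partial point cloud against the combined virtual model. Below is a hedged sketch of one alignment step using the closed-form Kabsch/SVD solution, under the simplifying assumption of known point correspondences; a production system would typically iterate this inside ICP against the partial cloud:

```python
import numpy as np

def rigid_align(source, target):
    """Best-fit rotation R and translation t mapping source points onto
    target points (Kabsch/SVD, known correspondences assumed). This is
    the closed-form core of one ICP alignment step."""
    cs, ct = source.mean(axis=0), target.mean(axis=0)
    H = (source - cs).T @ (target - ct)        # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = ct - R @ cs
    return R, t
```

Applied here, `source` would be the partial point cloud expressed at the calibration position and `target` the corresponding region of the combined virtual model; the recovered (R, t) is the relative position between the robot virtual model and the combined virtual model.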
The error compensation device based on the workpiece assembly provided by the embodiment of the invention can execute the error compensation method based on the workpiece assembly provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
Example IV
Fig. 4 shows a schematic diagram of an electronic device 40 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic devices may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be exemplary only, and are not meant to limit implementations of the invention described and/or claimed herein.
As shown in fig. 4, the electronic device 40 includes at least one processor 41 and memory communicatively connected to the at least one processor 41, such as a read-only memory (ROM) 42 and a random access memory (RAM) 43. The memory stores a computer program executable by the at least one processor, and the processor 41 may perform various suitable actions and processes according to the computer program stored in the ROM 42 or loaded from the storage unit 48 into the RAM 43. The RAM 43 may also store various programs and data required for the operation of the electronic device 40. The processor 41, the ROM 42, and the RAM 43 are connected to each other via a bus 44. An input/output (I/O) interface 45 is also connected to the bus 44.
Various components in electronic device 40 are connected to I/O interface 45, including: an input unit 46 such as a keyboard, a mouse, etc.; an output unit 47 such as various types of displays, speakers, and the like; a storage unit 48 such as a magnetic disk, an optical disk, or the like; and a communication unit 49 such as a network card, modem, wireless communication transceiver, etc. The communication unit 49 allows the electronic device 40 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 41 may be various general and/or special purpose processing components with processing and computing capabilities. Some examples of processor 41 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, Digital Signal Processors (DSPs), and any suitable processor, controller, microcontroller, etc. The processor 41 performs the various methods and processes described above, such as an error compensation method based on workpiece assembly.
In some embodiments, the workpiece assembly-based error compensation method may be implemented as a computer program tangibly embodied on a computer-readable storage medium, such as the storage unit 48. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 40 via the ROM 42 and/or the communication unit 49. When the computer program is loaded into RAM 43 and executed by processor 41, one or more steps of the error compensation method based on workpiece assembly described above may be performed. Alternatively, in other embodiments, the processor 41 may be configured to perform the workpiece assembly-based error compensation method in any other suitable manner (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuit systems, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special purpose or general purpose, and which may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server (also called a cloud computing server or cloud host), a host product in a cloud computing service system that overcomes the drawbacks of difficult management and weak service scalability found in traditional physical hosts and VPS services.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention are achieved, and the present invention is not limited herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (10)

1. An error compensation method based on workpiece assembly, comprising:
acquiring a robot virtual model corresponding to a robot and a combined virtual model corresponding to a combined workpiece in an assembly scene; wherein the combined workpiece comprises at least two sub-workpieces;
determining a relative position between the robot virtual model and the combined virtual model;
determining a target error gap between different sub-workpieces according to the robot virtual model, the combined virtual model and the relative position;
and controlling the robot to perform assembly error compensation on each sub-workpiece based on the target error gap.
2. The method of claim 1, wherein the determining a target error gap between different sub-workpieces according to the robot virtual model, the combined virtual model, and the relative position comprises:
determining combined scanning point cloud data corresponding to the combined workpiece according to the robot virtual model, the combined virtual model and the relative position;
and determining a target error gap corresponding to the combined workpiece according to the combined scanning point cloud data and the combined virtual model.
3. The method of claim 2, wherein the determining the combined scan point cloud data corresponding to the combined workpiece based on the robot virtual model, the combined virtual model, and the relative position comprises:
determining a scanning point position of the robot according to the combined virtual model;
generating a scanning path according to the scanning point positions, the robot virtual model and the relative positions;
and controlling the robot to perform point cloud scanning on the combined workpiece based on the scanning path to obtain combined scanning point cloud data corresponding to the combined workpiece.
4. The method of claim 2, wherein the combined workpiece comprises a first assembly sub-workpiece and a second assembly sub-workpiece having a fitting assembly relationship; the combined virtual model comprises a first workpiece model corresponding to the first assembly sub-workpiece and a second workpiece model corresponding to the second assembly sub-workpiece; and the combined scanning point cloud data comprises first scanning point cloud data corresponding to the first assembly sub-workpiece and second scanning point cloud data corresponding to the second assembly sub-workpiece;
correspondingly, the determining the target error gap corresponding to the combined workpiece according to the combined scanning point cloud data and the combined virtual model comprises the following steps:
determining first deviation data corresponding to the first assembly sub-workpiece according to the first workpiece model and the first scanning point cloud data;
determining second deviation data corresponding to the second assembly sub-workpiece according to the second workpiece model and the second scanning point cloud data;
and determining a target error gap between the first assembly sub-workpiece and the second assembly sub-workpiece according to the first deviation data and the second deviation data.
5. The method of claim 4, wherein the determining a target error gap between the first assembly sub-workpiece and the second assembly sub-workpiece according to the first deviation data and the second deviation data comprises:
determining at least one candidate error gap between the first assembly sub-workpiece and the second assembly sub-workpiece according to the first deviation data and the second deviation data;
and taking an average error gap corresponding to the at least one candidate error gap as the target error gap.
6. The method of any of claims 1-5, wherein the determining a relative position between the robot virtual model and the combined virtual model comprises:
determining a calibration position of the robot virtual model;
controlling the robot to scan the combined workpiece to obtain partial point cloud data corresponding to the combined workpiece;
and performing, at the calibration position, an alignment operation on the partial point cloud data and the combined virtual model to obtain the relative position between the robot virtual model and the combined virtual model.
7. The method of claim 6, wherein the determining the nominal position of the robot virtual model comprises:
controlling an end effector of the robot to move to a preset measurement point;
acquiring position coordinates of the end effector moving to each measurement point;
and determining the calibration position of the robot virtual model according to the position coordinates.
8. An error compensation device based on workpiece assembly, comprising:
the combined virtual model acquisition module is used for acquiring a robot virtual model corresponding to the robot and a combined virtual model corresponding to the combined workpiece in the assembly scene; wherein the combined workpiece comprises at least two sub-workpieces;
a relative position determination module for determining a relative position between the robot virtual model and the combined virtual model;
the target error gap determining module is used for determining target error gaps among different sub-workpieces according to the robot virtual model, the combined virtual model and the relative positions;
and the error compensation module is used for controlling the robot to carry out assembly error compensation on each sub-workpiece based on the target error gap.
9. An electronic device, the electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the workpiece assembly-based error compensation method of any of claims 1-7.
10. A computer readable storage medium storing computer instructions for causing a processor to implement the workpiece assembly-based error compensation method of any of claims 1-7 when executed.
CN202210832476.6A 2022-07-14 2022-07-14 Error compensation method, device, equipment and storage medium based on workpiece assembly — Pending — CN117428754A (en)

Priority application: CN202210832476.6A, filed 2022-07-14
Publication: CN117428754A (en), published 2024-01-23
Family ID: 89554102
Country: CN


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination