CN117726673B - Weld joint position obtaining method and device and electronic equipment
- Publication number
- CN117726673B CN202410171766.XA CN202410171766A
- Authority
- CN
- China
- Prior art keywords
- point cloud
- local
- global
- model
- registration result
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Landscapes
- Image Processing (AREA)
Abstract
The application provides a weld joint position obtaining method and device and electronic equipment, and relates to the technical field of computers. The method comprises the following steps: obtaining a first model point cloud of a workpiece model corresponding to a workpiece to be welded; obtaining a global point cloud of a workpiece to be welded; registering the first model point cloud and the global point cloud to obtain a global registration result; acquiring a first local point cloud of a workpiece to be welded, wherein the global point cloud and the first local point cloud are acquired by point cloud acquisition equipment; unifying the first local point cloud and the first model point cloud under a target coordinate system based on a global registration result to obtain a second local point cloud and a second model point cloud; registering the second local point cloud and the second model point cloud to obtain a local registration result; and obtaining the target weld joint position from the second local point cloud according to the local registration result and the weld joint position information in the second model point cloud. In this way, an accurate weld position can be obtained.
Description
Technical Field
The application relates to the technical field of computers, in particular to a method and a device for obtaining a welding seam position and electronic equipment.
Background
In order to improve production efficiency, robot welding is now widely used. In robot welding, the weld seam position needs to be identified before welding, that is, the position to be welded is identified, and then the robot is controlled to weld according to the identified weld seam position. Currently, two-dimensional images are generally acquired, and weld seam identification is performed on the two-dimensional images to determine the weld seam position. Although this approach can locate the weld seam, its accuracy is limited.
Disclosure of Invention
The embodiment of the application provides a welding seam position obtaining method, a welding seam position obtaining device, electronic equipment and a readable storage medium, which can accurately obtain the welding seam position and improve the welding precision.
Embodiments of the application may be implemented as follows:
In a first aspect, an embodiment of the present application provides a method for obtaining a weld position, the method including:
obtaining a first model point cloud of a workpiece model corresponding to a workpiece to be welded;
obtaining a global point cloud of the workpiece to be welded;
registering the first model point cloud and the global point cloud to obtain a global registration result;
obtaining a first local point cloud of the workpiece to be welded, wherein the global point cloud and the first local point cloud are acquired through a point cloud acquisition device;
Unifying the first local point cloud and the first model point cloud under a target coordinate system based on the global registration result to obtain a second local point cloud and a second model point cloud;
registering the second local point cloud and the second model point cloud to obtain a local registration result;
And obtaining a target weld joint position from the second local point cloud according to the local registration result and the weld joint position information in the second model point cloud.
In a second aspect, an embodiment of the present application provides a weld position obtaining apparatus, including:
the obtaining module is used for obtaining a first model point cloud of a workpiece model corresponding to the workpiece to be welded;
The obtaining module is further used for obtaining the global point cloud of the workpiece to be welded;
The global registration module is used for registering the first model point cloud and the global point cloud to obtain a global registration result;
The acquisition module is further used for acquiring a first local point cloud of the workpiece to be welded, wherein the global point cloud and the first local point cloud are acquired through a point cloud acquisition device;
The conversion module is used for unifying the first local point cloud and the first model point cloud under a target coordinate system based on the global registration result to obtain a second local point cloud and a second model point cloud;
The local registration module is used for registering the second local point cloud and the second model point cloud to obtain a local registration result;
And the position determining module is used for obtaining a target weld position from the second local point cloud according to the local registration result and the weld position information in the second model point cloud.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor and a memory, where the memory stores machine executable instructions executable by the processor, where the processor may execute the machine executable instructions to implement the weld position obtaining method according to the foregoing embodiment.
In a fourth aspect, an embodiment of the present application provides a readable storage medium having stored thereon a computer program which, when executed by a processor, implements the weld position obtaining method according to the foregoing embodiment.
The embodiment of the application provides a weld joint position obtaining method, a device, electronic equipment and a readable storage medium, wherein a first model point cloud of a workpiece model corresponding to a workpiece to be welded and a global point cloud of the workpiece to be welded are obtained; then, registering the first model point cloud and the global point cloud to obtain a global registration result; then, based on a global registration result, unifying the obtained first local point cloud and the first model point cloud of the workpiece to be welded under a target coordinate system to obtain a second local point cloud and a second model point cloud, wherein the first local point cloud and the second local point cloud are acquired through a point cloud acquisition device; and then, registering the second local point cloud and the second model point cloud to obtain a local registration result, and further obtaining a target weld position from the second local point cloud based on the local registration result and weld position information in the second model point cloud. Therefore, a more accurate local registration result can be obtained based on the global and local matrix fusion mode, so that the welding seam position can be conveniently and accurately obtained, and the welding precision is further improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic block diagram of an electronic device according to an embodiment of the present application;
FIG. 2 is a schematic flow chart of a method for obtaining a weld position according to an embodiment of the present application;
FIG. 3 is a flow chart illustrating the sub-steps included in step S130 in FIG. 2;
FIG. 4 is a flow chart illustrating the sub-steps included in step S131 in FIG. 3;
FIG. 5 is a flow chart of the sub-steps included in step S132 in FIG. 3;
FIG. 6 is one of the flow charts of the sub-steps included in step S160 of FIG. 2;
FIG. 7 is a second flowchart illustrating the sub-steps included in the step S160 of FIG. 2;
FIG. 8 is a flow chart illustrating the sub-steps included in step S170 in FIG. 2;
fig. 9 is a schematic block diagram of a weld position obtaining apparatus according to an embodiment of the present application.
Reference numerals: 100 - electronic device; 110 - memory; 120 - processor; 130 - communication unit; 200 - weld position obtaining device; 210 - obtaining module; 220 - global registration module; 230 - conversion module; 240 - local registration module; 250 - position determination module.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments of the present application. The components of the embodiments of the present application generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the application, as presented in the figures, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by a person skilled in the art without making any inventive effort, are intended to be within the scope of the present application.
It is noted that relational terms such as "first" and "second", and the like, are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The inventor finds that point cloud registration can be directly carried out on local point clouds and model point clouds of a workpiece to be welded, and then a welding seam is identified from the local point clouds based on registration results and welding seam position information in the model point clouds.
Initial registration is generally used to determine an initial pose so that the two point clouds are approximately coincident. To further improve registration accuracy, an ICP algorithm is then mostly used as the accurate registration algorithm for optimization. However, a conventional ICP algorithm has relatively strict requirements on the initial registration conditions: it requires a high degree of overlap between the two point clouds when ICP registration is performed, that is, it requires the initial registration result to be accurate; otherwise it easily falls into a local optimum.
If the local point cloud and the model point cloud are directly registered by adopting the mode so as to further determine the welding seam, the welding seam identification effect is poor due to poor accuracy of an initial registration result.
In order to accurately obtain a welding seam position, the embodiment of the application provides a welding seam position obtaining method, a device, electronic equipment and a readable storage medium, and a more accurate local registration result is obtained based on a global and local matrix fusion mode, so that the welding seam position can be accurately obtained conveniently, and the welding precision is improved.
Some embodiments of the present application are described in detail below with reference to the accompanying drawings. The following embodiments and features of the embodiments may be combined with each other without conflict.
Referring to fig. 1, fig. 1 is a block diagram of an electronic device 100 according to an embodiment of the application. The electronic device 100 may be, but is not limited to, a robot, a smart phone, a computer, a server, etc. for welding. The electronic device 100 may include a memory 110, a processor 120, and a communication unit 130. The memory 110, the processor 120, and the communication unit 130 are electrically connected directly or indirectly to each other to realize data transmission or interaction. For example, the components may be electrically connected to each other via one or more communication buses or signal lines.
Wherein the memory 110 is used for storing programs or data. The memory 110 may be, but is not limited to, Random Access Memory (RAM), Read-Only Memory (ROM), Programmable Read-Only Memory (PROM), Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), etc.
The processor 120 is used to read/write data or programs stored in the memory 110 and perform corresponding functions. For example, the memory 110 stores therein a weld position obtaining device including at least one software function module that may be stored in the memory 110 in the form of software or firmware. The processor 120 executes various functional applications and data processing by running software programs and modules stored in the memory 110, such as the weld position obtaining apparatus in the embodiment of the present application, that is, implements the weld position obtaining method in the embodiment of the present application.
The communication unit 130 is configured to establish a communication connection between the electronic device 100 and other communication terminals through a network, and is configured to transmit and receive data through the network.
It should be understood that the structure shown in fig. 1 is merely a schematic diagram of the structure of the electronic device 100, and that the electronic device 100 may further include more or fewer components than those shown in fig. 1, or have a different configuration than that shown in fig. 1. The components shown in fig. 1 may be implemented in hardware, software, or a combination thereof.
Referring to fig. 2, fig. 2 is a flowchart of a method for obtaining a weld position according to an embodiment of the application. The method is applicable to the electronic device 100 described above. The specific flow of the weld position obtaining method is explained in detail below. In this embodiment, the method may include step S110 to step S170.
Step S110, a first model point cloud of a workpiece model corresponding to a workpiece to be welded is obtained.
In this embodiment, the workpiece model corresponding to the workpiece to be welded may be a model drawn by a user in advance for the workpiece to be welded, or may be a model provided by the manufacturer of the workpiece to be welded. The first model point cloud is used to describe the shape of the workpiece model. Alternatively, the first model point cloud may be generated from a workpiece model file of the workpiece to be welded provided by the manufacturer. For example, the manufacturer provides a workpiece model file that can be opened by CAD software, and this file can be used to generate the first model point cloud.
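As an illustrative sketch of this step (not prescribed by the embodiment), a model point cloud can be generated by uniformly sampling the surface of the workpiece model's triangle mesh; the file name, sample count, and use of the trimesh library are assumptions:

```python
import numpy as np
import trimesh

# Hypothetical CAD export of the workpiece model (e.g., an STL mesh).
mesh = trimesh.load("workpiece_model.stl")

# Uniformly sample the mesh surface to obtain the first model point cloud.
model_points = np.asarray(mesh.sample(200000))  # (N, 3), in model coordinates
```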
Step S120, obtaining a global point cloud of the workpiece to be welded.
In this embodiment, the point cloud collecting device may collect point clouds of the workpiece to be welded, so as to obtain a plurality of point clouds of the workpiece to be welded, and then splice the plurality of point clouds, and use the spliced result as a global point cloud of the workpiece to be welded. The coordinate system used by the global point cloud may be a camera coordinate system of the point cloud acquisition device. The global point cloud can describe the approximate shape of the workpiece to be welded, and point cloud collection is not required to be carried out on all surfaces of the workpiece to be welded. Thus, the global point cloud is convenient to obtain quickly.
Step S130, registering the first model point cloud and the global point cloud to obtain a global registration result.
In this embodiment, in a case where the first model point cloud and the global point cloud are obtained, registration processing may be performed, and the obtained processing result is taken as a global registration result. Optionally, the global registration result may be obtained through a conventional registration algorithm, or may be obtained through registration by other manners. The global registration result is a transformation matrix.
Step S140, obtaining a first local point cloud of the workpiece to be welded.
In this embodiment, the first local point cloud may be obtained by the point cloud collecting device through one-time point cloud collection of the workpiece to be welded. The first local point cloud and the point cloud for obtaining the global point cloud may be obtained separately, that is, the point cloud for obtaining the global point cloud may be obtained first, then the global point cloud is obtained through stitching, and then the first local point cloud is acquired again. Or the first local point cloud may be one of a plurality of point clouds for stitching to obtain the global point cloud.
The step S110, the step S120, and the step S140 may be performed simultaneously, or may be performed sequentially, for example, the step S110 may be performed first, the step S120 may be performed second, and the step S140 may be performed later, or the step S120 may be performed first, the step S140 may be performed second, and the step S110 may be performed later, which may be specifically determined according to the actual requirement. Step S130 may be performed before step S140 or after step S140, and may be specifically determined according to actual requirements.
Step S150, based on the global registration result, unifying the first local point cloud and the first model point cloud to the target coordinate system to obtain a second local point cloud and a second model point cloud.
In the case where the global registration result is obtained, the global registration result is taken as the initial value for registering the first local point cloud with the first model point cloud, so that the first local point cloud and the first model point cloud are unified under a target coordinate system, and a second local point cloud and a second model point cloud are obtained. The first local point cloud is transformed into the target coordinate system to obtain the second local point cloud, and the first model point cloud is transformed into the target coordinate system to obtain the second model point cloud. The target coordinate system may be determined according to actual requirements; for example, it may be the camera coordinate system in which the global point cloud is located, or another coordinate system converted from the camera coordinate system.
Step S160, registering the second local point cloud and the second model point cloud to obtain a local registration result.
And under the condition that the second local point cloud and the second model point cloud are obtained, registering the second local point cloud and the second model point cloud by adopting a conventional fine registration algorithm or other modes, and taking a registration result as a local registration result. The local registration result is a transformation matrix.
Step S170, obtaining a target weld position from the second local point cloud according to the local registration result and the weld position information in the second model point cloud.
In this embodiment, the weld position is marked in the first model point cloud; the mark may be made manually in advance or determined in other manners. After the local registration result is obtained, the second model point cloud and the second local point cloud can be unified into one coordinate system according to the local registration result, then the weld seam in the second local point cloud is identified according to the weld position information in the second model point cloud, and the target weld position is further determined. For example, a point in the second local point cloud that matches a weld point in the second model point cloud is taken as a weld point location. The weld position information in the second model point cloud may be obtained based on the weld position marked in the first model point cloud.
In this embodiment, through the registration of the first model point cloud and the global point cloud, a better initial value can be provided for the fine registration of the first local point cloud and the first model point cloud, so that the possibility of sinking into local optimum is reduced, an accurate local registration result is obtained through fine registration, and then a welding seam in the first local point cloud is identified by combining the local registration result and the first model point cloud. Therefore, based on the global and local fusion registration mode, the method is beneficial to obtaining more accurate weld joint positions in welding scenes and improves welding accuracy.
Optionally, in this embodiment, under the condition that the first model point cloud and the global point cloud are obtained, the first model point cloud and the global point cloud may be directly registered, so as to obtain a global registration result. The first model point cloud and the global point cloud can be downsampled, and then the downsampled first model point cloud and global point cloud are registered to obtain a global registration result, so that the data processing capacity can be reduced. Voxel filtering can be performed on both the first model point cloud and the global point cloud, or voxel filtering can be performed on the first model point cloud, normal vector filtering can be performed on the global point cloud, and the like, so that points to be processed are reduced.
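As an illustrative sketch of the voxel-filtering downsampling mentioned above (a minimal NumPy version; the voxel size is an assumed parameter, not a value given by the embodiment):

```python
import numpy as np

def voxel_downsample(points: np.ndarray, voxel_size: float) -> np.ndarray:
    """Keep one representative point (the centroid) per occupied voxel."""
    idx = np.floor(points / voxel_size).astype(np.int64)      # voxel index of each point
    _, inverse = np.unique(idx, axis=0, return_inverse=True)  # group points by voxel
    sums = np.zeros((inverse.max() + 1, 3))
    np.add.at(sums, inverse, points)
    counts = np.bincount(inverse)
    return sums / counts[:, None]

# e.g. model_down = voxel_downsample(model_points, voxel_size=2.0)
```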
As a possible implementation manner, the first model point cloud and the global point cloud may be registered by a conventional coarse registration method, and the obtained registration result is used as the global registration result. The conventional coarse registration method may be, but is not limited to, LORAX algorithm, four-point method, super 4PCS algorithm, and the like.
As another possible implementation manner, the first model point cloud and the global point cloud may be registered in a conventional registration manner including coarse registration and fine registration, and the obtained registration result is used as the global registration result. That is, coarse registration is performed on the first model point cloud and the global point cloud, the two point clouds are unified into one coordinate system according to the coarse registration result, and then fine registration is performed. In this method, coarse registration may be performed by a coarse registration method such as the LORAX algorithm, the four-point method, or the Super4PCS algorithm, and fine registration may be performed by the ICP algorithm, the GICP (Generalized Iterative Closest Point) algorithm, the NDT (Normal Distributions Transform) algorithm, or the like.
GICP is an improved point cloud registration algorithm and is an extension and improvement of the standard ICP algorithm. The ICP algorithm is mainly used to register two point clouds or three-dimensional models so that they are aligned as closely as possible. GICP introduces some improvements on the basis of ICP to improve the robustness and efficiency of registration. Unlike ICP, GICP uses a point-to-plane distance metric, i.e., a point is projected onto the target surface, and the distance of the point to that surface, i.e., the distance between the point before and after projection, is calculated. Such a distance measure can better handle non-rigid deformations and improve the robustness of registration. In addition, smoothness constraints between neighboring points are introduced to better handle noise and improve the stability of registration; this constraint helps to make the registration result smoother and avoids sharp registration errors.
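The point-to-plane distance described above can be sketched as follows (an illustrative NumPy snippet, assuming the target normals are unit length):

```python
import numpy as np

def point_to_plane_residuals(src_pts, dst_pts, dst_normals):
    """Signed distance from each source point to the tangent plane of its
    matched target point: r_i = n_i . (p_i - q_i)."""
    return np.einsum("ij,ij->i", dst_normals, src_pts - dst_pts)
```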
As yet another possible implementation, in order to make global registration relatively accurate and fast, the global registration result may be obtained quickly and accurately by planar rotation and registration based on a conventional registration algorithm in the manner shown in fig. 3. Referring to fig. 3, fig. 3 is a flow chart illustrating the sub-steps included in step S130 in fig. 2. In this embodiment, the step S130 may include sub-steps S131 to S132.
And a sub-step S131, performing plane detection on the global point cloud to obtain a target plane.
In this embodiment, the global point cloud may be subjected to plane detection, so as to detect multiple planes. The detected plane may be taken as the initial plane. Optionally, all the initial planes obtained by detection can be used as target planes, so that the accuracy of the global registration result obtained based on the target planes can be ensured conveniently.
As a possible implementation, the target plane may be determined in the manner shown in fig. 4. Referring to fig. 4, fig. 4 is a flow chart illustrating the sub-steps included in step S131 in fig. 3. In this embodiment, the substep S131 may include substeps S1311 to S1312.
In the substep S1311, for the global point cloud, plane detection is performed to obtain a plurality of initial planes.
Substep S1312, the target plane is selected from the plurality of initial planes.
Alternatively, one initial plane or several initial planes may be selected from the plurality of initial planes as target planes, in descending order of the number of points belonging to each initial plane. The number of initial planes is greater than the number of target planes. In this way, both accuracy and processing speed can be ensured.
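A minimal sketch of the plane detection and selection step, using a plain RANSAC plane fit in NumPy (the embodiment does not prescribe a particular plane-detection algorithm; the thresholds, iteration counts, and point budget below are assumptions):

```python
import numpy as np

def ransac_plane(points, dist_thresh=1.0, iters=500, rng=np.random.default_rng(0)):
    """Fit one plane with RANSAC; returns (unit normal n, offset d, inlier mask)
    for the plane n.x + d = 0."""
    best_mask, best_n, best_d = None, None, None
    for _ in range(iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:               # degenerate (collinear) sample
            continue
        n = n / norm
        d = -n @ sample[0]
        mask = np.abs(points @ n + d) < dist_thresh
        if best_mask is None or mask.sum() > best_mask.sum():
            best_mask, best_n, best_d = mask, n, d
    return best_n, best_d, best_mask

def detect_target_planes(points, num_planes=5, keep=2):
    """Detect several initial planes, then keep those with the most supporting points."""
    planes, remaining = [], points
    for _ in range(num_planes):
        if len(remaining) < 100:
            break
        n, d, mask = ransac_plane(remaining)
        planes.append((int(mask.sum()), n, d))
        remaining = remaining[~mask]  # peel off inliers, detect the next plane
    planes.sort(key=lambda p: p[0], reverse=True)
    return planes[:keep]              # target plane(s) with the largest point support
```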
And step S132, according to the target normal vector and the target plane, obtaining rotation information corresponding to the target plane, and according to the rotation information, registering the first model point cloud and the global point cloud to obtain the global registration result.
In this embodiment, the target normal vector may be determined. The target normal vector is used to represent the direction of one coordinate axis of a three-dimensional coordinate system, and its norm may or may not be 1, which may be determined according to actual requirements. As one possible implementation, the target normal vector is one of the three normal vectors of the workpiece model, namely the x-axis (1, 0, 0), the y-axis (0, 1, 0), and the z-axis (0, 0, 1).
And calculating the rotation information corresponding to the target plane according to the target normal vector and the target plane. The rotation information comprises a rotation shaft and a rotation angle. And then, the rotation information is used as an initial value, and the first model point cloud and the global point cloud are registered to obtain the global registration result. Optionally, an ICP algorithm, GICP algorithm or NDT algorithm may be used to register the first model point cloud and the global point cloud to obtain the global registration result. Thus, the global registration result can be quickly and accurately obtained through plane detection, rotation and registration based on a registration algorithm.
When there is only one target plane, the registration result of the first model point cloud and the global point cloud calculated according to that target plane may be used as the global registration result.
Optionally, in this embodiment, to further improve accuracy of the global registration result, the target planes are multiple, and the global registration result is obtained according to multiple target planes in a manner shown in fig. 5. Referring to fig. 5, fig. 5 is a flow chart illustrating the sub-steps included in step S132 in fig. 3. In this embodiment, the substep S132 may include substeps S1321 to S1323.
In the substep S1321, for each target plane, rotation information corresponding to the target plane is calculated according to the target plane and a target normal vector.
In the substep S1322, for each target plane, according to the rotation information of the target plane, the first model point cloud and the global point cloud, an initial global registration result corresponding to the target plane and an error value corresponding to the initial global registration result are obtained through registration calculation.
Sub-step S1323, taking the initial global registration result corresponding to the minimum error value among the obtained error values as the global registration result.
In this embodiment, when there are a plurality of target planes, the rotation axis and the rotation angle corresponding to each target plane may be calculated according to the target plane and the target normal vector by the following calculation formula:

$$rot = n \times T, \qquad \theta = \arccos\!\left(\frac{n \cdot T}{\lVert n \rVert\,\lVert T \rVert}\right)$$

where $rot$ represents the rotation axis, $\theta$ represents the rotation angle, $n$ represents the normal vector of one target plane, and $T$ represents the target normal vector. The target normal vector is one of $T_x$, $T_y$, $T_z$, where $T_x$ represents (1, 0, 0), $T_y$ represents (0, 1, 0), and $T_z$ represents (0, 0, 1).
After the rotation axis and the rotation angle corresponding to a target plane are obtained, the first model point cloud may be rotated according to the rotation axis and the rotation angle, and the global point cloud and the rotated point cloud may be registered using a conventional registration algorithm; the obtained registration result is taken as the registration result corresponding to that target plane, i.e., an initial global registration result. During the processing based on the conventional registration algorithm, an error value corresponding to the initial global registration result can also be obtained, and this error value describes the registration error. For example, the first model point cloud and the global point cloud are registered based on the rotation information and the ICP algorithm, and an initial global registration result and a corresponding error value can be obtained by calculation.
After the calculation of the rotation information and the registration based on the conventional registration algorithm have been executed for each target plane, an initial global registration result and an error value corresponding to each target plane are obtained. The obtained error values may be compared to determine the minimum error value, and the initial global registration result corresponding to the minimum error value is used as the global registration result. The global registration result may be an RT matrix. In this way, a good coarse result is obtained that facilitates the subsequent registration of the first local point cloud.
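The plane-based initialization and selection described above can be sketched as follows (a minimal illustration, not the exact implementation of the embodiment; `fine_register` is a placeholder for a conventional registration algorithm such as ICP and is assumed to return a 4x4 transform together with an error value):

```python
import numpy as np

def rotation_from_axis_angle(axis, angle):
    """Rodrigues' formula: rotation matrix for a unit axis and an angle."""
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def plane_initialized_registration(model_pts, global_pts, plane_normals,
                                   target_normal, fine_register):
    """One candidate per target plane: rotate the model, refine, keep the lowest error."""
    best_T, best_err = None, np.inf
    for n in plane_normals:
        c = np.clip(n @ target_normal /
                    (np.linalg.norm(n) * np.linalg.norm(target_normal)), -1.0, 1.0)
        axis = np.cross(n, target_normal)
        R0 = np.eye(3) if np.linalg.norm(axis) < 1e-9 else \
            rotation_from_axis_angle(axis, np.arccos(c))
        R0_h = np.eye(4)
        R0_h[:3, :3] = R0
        T, err = fine_register(model_pts @ R0.T, global_pts)  # e.g. ICP refinement
        if err < best_err:
            best_T, best_err = T @ R0_h, err                  # compose refinement with rotation
    return best_T, best_err
```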
In this embodiment, plane rotation and global registration based on a conventional registration algorithm are adopted. Plane detection can identify the plane structures in the scene, and these structures are usually stable features, so the initial position information (i.e., the global registration result) obtained by using plane detection has a certain robustness and can handle some complex scenes. Meanwhile, plane detection can be used to reduce the data volume in the point cloud and extract the main structural information, thereby reducing the computational complexity of the subsequent registration process; by processing the planar structures, the point cloud can be reduced to a subset that is easier to process, so the processing time can be reduced to a considerable extent. In addition, in order to improve the accuracy of the registration matrix obtained from plane rotation, registration based on a conventional registration algorithm is introduced for fine adjustment, so that the global registration result does not deviate significantly, which improves the speed and stability of the subsequent registration.
In this embodiment, under the condition that the global registration result is obtained, the first local point cloud and the first model point cloud may be unified into the target coordinate system according to the global registration result, so as to obtain the second local point cloud and the second model point cloud. When the target coordinate system is the robot base coordinate system and the first local point cloud and the global point cloud are acquired by a point cloud acquisition device mounted at the end of the robot, the first local point cloud may be converted into the second local point cloud according to a hand-eye calibration matrix obtained in advance, and the first model point cloud may be converted into the second model point cloud based on the hand-eye calibration matrix and the global registration result. In this way, the weld position under the robot base coordinate system can be obtained conveniently, which facilitates control.
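A minimal sketch of this coordinate unification under the assumption just described (all transforms are 4x4 homogeneous matrices; the matrix names are illustrative, not terms used by the embodiment):

```python
import numpy as np

def to_homogeneous(points):
    return np.hstack([points, np.ones((len(points), 1))])

def unify_to_base_frame(local_pts_cam, model_pts, T_base_cam, T_cam_model):
    """T_base_cam: hand-eye calibration matrix (camera frame -> robot base frame).
    T_cam_model: global registration result (model frame -> camera frame)."""
    second_local = (to_homogeneous(local_pts_cam) @ T_base_cam.T)[:, :3]
    second_model = (to_homogeneous(model_pts) @ (T_base_cam @ T_cam_model).T)[:, :3]
    return second_local, second_model
```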
Optionally, the second local point cloud and the second model point cloud may be directly registered to obtain a local registration result. Alternatively, the first local point cloud and the first model point cloud may be downsampled first, then converted to obtain the second local point cloud and the second model point cloud, which are registered to obtain a local registration result; or the first local point cloud and the first model point cloud are unified into the target coordinate system to obtain the second local point cloud and the second model point cloud, and the second local point cloud and the second model point cloud are then downsampled and registered to obtain a local registration result. That is, the downsampling may be performed before or after the conversion according to the global registration result. The specific downsampling manner may be determined according to actual requirements. For example, normal vector filtering is performed on the first local point cloud or the second local point cloud while voxel filtering is performed on the first model point cloud or the second model point cloud; or voxel filtering is performed on both.
When a DLP camera (a kind of point cloud acquisition device) performs acquisition, noise and outliers from the natural environment may be captured, which can negatively affect point cloud registration. Such outliers may come from sensor noise, occlusion, moving objects, etc. In this embodiment, to further ensure the accuracy of the local registration result, the abnormal points in the second local point cloud may first be identified and deleted, and then registration is performed in combination with the second model point cloud to obtain the local registration result.
Referring to fig. 6, fig. 6 is a schematic flow chart of the sub-steps included in step S160 in fig. 2. In this embodiment, the step S160 may include sub-steps S161 to S163, S166 to S167.
And substep S161, determining a matching point pair according to the second local point cloud and the second model point cloud.
In this embodiment, a matching point pair includes one point and the nearest matching point of that point; that is, a matching point pair includes two points, one of which is a point in the second local point cloud and the other of which is a point in the second model point cloud. Optionally, the nearest matching point between the second local point cloud and the second model point cloud can be obtained through KD-Tree calculation:

$$p_i = \arg\min_{p \in P} \lVert q_i - p \rVert$$

where $i$ indexes the points in the point cloud $Q$ (i.e., $q_i \in Q$), $Q$ represents the second local point cloud, and $P$ represents the second model point cloud.
In the substep S162, for each matching point pair, an included angle between normal vectors corresponding to two points in the matching point pair is calculated.
And step S163, if the included angle is larger than a preset angle, taking the points belonging to the second local point cloud in the matching point pair as abnormal points.
In this embodiment, for each matching point pair, the normal vector corresponding to each of the two points in the matching point pair may be calculated. For example, the 1 matching point pair includes a point 1 and a point 2, the point 1 is a point in the second local point cloud, and the point 2 is a point in the second model point cloud; for the point 1, calculating to obtain a normal vector of the point 1 according to the point 1 and a reference point corresponding to the point 1 in the second local point cloud, wherein the reference point corresponding to the point 1 is positioned in a certain range of the point 1; similarly, for the point 2, according to the point 2 and the reference point corresponding to the point 2 in the second model point cloud, the normal vector of the point 2 is calculated, and the reference point corresponding to the point 2 is located in a certain range of the point 2. After the normal vectors corresponding to the two points in one matching point pair are calculated, the included angle between the normal vectors corresponding to the two points in the matching point pair can be calculated. The calculation formula of the included angle can be as follows:
$$\theta = \arccos\left(\vec{n}_q \cdot \vec{n}_p\right)$$

where $\vec{n}_q$ represents the unit-vectorized normal vector of the acquisition point and $\vec{n}_p$ represents the unit-vectorized normal vector of the workpiece model point. The acquisition points are points in the second local point cloud, and the workpiece model points are points in the second model point cloud.
After the included angle corresponding to a matching point pair is calculated, the included angle can be compared with a preset angle. If the included angle is larger than the preset angle, the point belonging to the second local point cloud in the matching point pair may be considered to have an adverse effect on registration, and in this case, that point may be determined as an abnormal point. If the included angle is smaller than or equal to the preset angle, the point belonging to the second local point cloud in the matching point pair may be directly considered favourable for local matching and is not deleted, that is, the point is retained; alternatively, when the included angle is not larger than the preset angle, whether to delete the point belonging to the second local point cloud in the matching point pair may be determined in combination with other judgment criteria. The preset angle may be set according to actual requirements, for example, 10 degrees.
And step S166, deleting the abnormal point from the second local point cloud to obtain a third local point cloud.
In the case where the abnormal points are determined, the abnormal points are deleted from the second local point cloud, and the second local point cloud from which the abnormal points have been deleted is taken as the third local point cloud. In this way, the local point cloud used for obtaining the local registration result is smoother, and the influence of noise is reduced.
Substep S167, obtaining the local registration result through registration according to the third local point cloud and the second model point cloud.
In this embodiment, a conventional registration algorithm may be used to register the third local point cloud and the second model point cloud, so as to obtain a local registration result. The conventional registration algorithm may be, but is not limited to, GICP algorithm, NDT algorithm, etc.
As one possible implementation, the local registration result is obtained by GICP registration. GICP registration is an extension to the ICP registration method. To improve the robustness of the registration GICP uses the covariance matrix of the point cloud surface to construct a cost function of the point cloud registration.
The closest points between the second local point cloud and the second model point cloud can be calculated by KD-Tree:

$$b_i = \arg\min_{b \in P} \lVert b - a_i \rVert$$

where $a_i$ and $b_i$ are nearest neighbors, $a_i$ is a point in the second local point cloud, and $b_i$ is a point in the second model point cloud. Each point is assumed to satisfy a Gaussian distribution:

$$a_i \sim N(\hat{a}_i, C_i^{A}), \qquad b_i \sim N(\hat{b}_i, C_i^{B})$$

where $C_i^{A}$ and $C_i^{B}$ are the covariance matrices corresponding to each point. The error between each pair of corresponding points, $d_i = b_i - T a_i$, also satisfies a Gaussian distribution, so the final optimization function is:

$$T = \arg\min_{T} \sum_{i} d_i^{\top} \left(C_i^{B} + T\, C_i^{A}\, T^{\top}\right)^{-1} d_i$$

The local registration result is obtained by iteratively solving for $T$.
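A small illustrative sketch of evaluating this GICP objective for a fixed transform (an assumption-level snippet; a full GICP solver would additionally iterate over the transform and re-estimate the correspondences and covariances):

```python
import numpy as np

def gicp_cost(T, src_pts, dst_pts, src_covs, dst_covs):
    """Sum of Mahalanobis-weighted residuals d_i^T (C_B + R C_A R^T)^-1 d_i."""
    R, t = T[:3, :3], T[:3, 3]
    cost = 0.0
    for a, b, Ca, Cb in zip(src_pts, dst_pts, src_covs, dst_covs):
        d = b - (R @ a + t)
        M = Cb + R @ Ca @ R.T
        cost += d @ np.linalg.solve(M, d)
    return float(cost)
```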
Referring to fig. 7, fig. 7 is a second flowchart illustrating the sub-steps included in step S160 in fig. 2. In this embodiment, the step S160 may further include sub-steps S164 to S165.
Substep S164, for each of the matching point pairs, calculating a distance between two points in the matching point pair.
And step S165, if the distance is larger than the preset distance, taking the points belonging to the second local point cloud in the matching point pair as abnormal points.
In this embodiment, for each matching point pair, the distance between the two points in the pair may be calculated according to the position information of each of the two points. The distance is compared with a preset distance, and if the distance is larger than the preset distance, the point belonging to the second local point cloud in the matching point pair is determined to be an abnormal point. If the distance is smaller than or equal to the preset distance, whether to take the point belonging to the second local point cloud in the matching point pair as an abnormal point may be determined according to other judgment criteria. The abnormal points determined by sub-steps S162-S163 and S164-S165 may be deleted from the second local point cloud to obtain the third local point cloud.
For example, for a matching point pair, the included angle between the normal vectors of the two points and the distance between the two points can both be calculated, and if the included angle and/or the distance is greater than the corresponding threshold value, the point in the matching point pair belonging to the second local point cloud can be determined as an abnormal point.
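A minimal sketch of this combined matching and outlier-rejection step, using SciPy's KD-tree (the angle and distance thresholds are illustrative assumptions, and the normals are assumed to be unit length):

```python
import numpy as np
from scipy.spatial import cKDTree

def filter_local_cloud(local_pts, local_normals, model_pts, model_normals,
                       max_angle_deg=10.0, max_dist=5.0):
    """Delete points of the second local point cloud whose nearest model match
    disagrees in normal direction or lies too far away; return the third local cloud."""
    tree = cKDTree(model_pts)
    dist, idx = tree.query(local_pts)   # nearest matching point for every local point
    cos_angle = np.clip(np.einsum("ij,ij->i", local_normals, model_normals[idx]), -1.0, 1.0)
    angle = np.degrees(np.arccos(cos_angle))
    keep = (angle <= max_angle_deg) & (dist <= max_dist)
    return local_pts[keep]
```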
Based on the global and local fusion registration mode, the method is favorable for obtaining more accurate weld joint positions in welding scenes and improves welding precision. In general, the registration strategy comprehensively utilizing global structure information and local fine tuning can achieve a good registration effect in different point cloud scenes.
Referring to fig. 8, fig. 8 is a flowchart illustrating the sub-steps included in step S170 in fig. 2. In this embodiment, the step S170 may include sub-steps S171 to S172.
And step S171, converting the second model point cloud into a third model point cloud according to the local registration result.
Substep S172, obtaining the target weld position from the third local point cloud according to the weld position information in the third model point cloud.
In this embodiment, in the case where the local registration result is obtained, the second model point cloud may be converted according to the local registration result, and the conversion result is taken as the third model point cloud. Then, the points in the third local point cloud that match the weld points in the third model point cloud are taken as the weld points of the third local point cloud, and the weld points of the third local point cloud are taken as the target weld position. The weld points in the third model point cloud are determined based on the weld position information in the third model point cloud.
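A minimal sketch of this extraction step (the matching radius and the representation of the weld marks as a boolean mask over the model points are assumptions made for illustration):

```python
import numpy as np
from scipy.spatial import cKDTree

def extract_weld_position(local_pts, model_pts, weld_mask, T_local_reg, radius=2.0):
    """Transform the second model point cloud with the local registration result,
    then pick the local points that match the marked weld points of the model."""
    model_h = np.hstack([model_pts, np.ones((len(model_pts), 1))])
    third_model = (model_h @ T_local_reg.T)[:, :3]   # third model point cloud
    weld_model_pts = third_model[weld_mask]          # weld points marked on the model
    dist, idx = cKDTree(local_pts).query(weld_model_pts)
    return np.unique(local_pts[idx[dist <= radius]], axis=0)   # target weld position
```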
In this embodiment, a global registration result is first obtained and applied to the point cloud of the workpiece model to obtain a better measured pose, at which point the second model point cloud can be obtained. Then, the nearest matching points between the second local point cloud and the second model point cloud of the workpiece under the target coordinate system can be calculated through KD-Tree, the normal vector included angle and the distance between matched points are calculated and compared with the corresponding thresholds, and the points exceeding the thresholds are deleted. In this scheme, after the global registration yields a better pose, if a large number of abnormal points (for example, overexposed points) exist in the local point cloud, the included angles between their normal vectors and those of the corresponding matching points in the second model point cloud, and the distances to those matching points, are larger than the thresholds, and such points are regarded as noise points and need to be deleted. Then, registration is performed according to the second model point cloud and the third local point cloud obtained by deleting the noise points, and the target weld position is further obtained from the third local point cloud based on the registration result and the weld position information in the second model point cloud. In this way, through application of the global registration result, KD-tree calculation of the normal vector included angles of the matching points, calculation of the matching point distances, deletion of abnormal points, and registration, the target weld position can be accurately obtained.
In order to perform the corresponding steps in the above embodiments and the various possible ways, an implementation of the weld position obtaining apparatus 200 is given below, and alternatively, the weld position obtaining apparatus 200 may employ the device structure of the electronic device 100 shown in fig. 1 and described above. Further, referring to fig. 9, fig. 9 is a block diagram of a weld position obtaining apparatus 200 according to an embodiment of the application. It should be noted that, the basic principle and the technical effects of the weld position obtaining apparatus 200 provided in this embodiment are the same as those of the foregoing embodiments, and for brevity, reference may be made to the corresponding contents of the foregoing embodiments. In this embodiment, the weld position obtaining apparatus 200 may include: the acquisition module 210, the global registration module 220, the conversion module 230, the local registration module 240, and the location determination module 250.
The obtaining module 210 is configured to obtain a first model point cloud of a workpiece model corresponding to a workpiece to be welded.
The obtaining module 210 is further configured to obtain a global point cloud of the workpiece to be welded.
The global registration module 220 is configured to register the first model point cloud and the global point cloud to obtain a global registration result.
The obtaining module 210 is further configured to obtain a first local point cloud of the workpiece to be welded. The global point cloud and the first local point cloud are acquired through point cloud acquisition equipment.
The conversion module 230 is configured to unify the first local point cloud and the first model point cloud under a target coordinate system based on the global registration result, so as to obtain a second local point cloud and a second model point cloud.
The local registration module 240 is configured to register the second local point cloud and the second model point cloud to obtain a local registration result.
The position determining module 250 is configured to obtain a target weld position from the second local point cloud according to the local registration result and the weld position information in the second model point cloud.
Alternatively, the above modules may be stored in the memory 110 shown in fig. 1 or solidified in an Operating System (OS) of the electronic device 100 in the form of software or Firmware (Firmware), and may be executed by the processor 120 in fig. 1. Meanwhile, data, codes of programs, and the like, which are required to execute the above-described modules, may be stored in the memory 110.
The embodiment of the application also provides a readable storage medium, on which a computer program is stored, which when being executed by a processor, implements the weld position obtaining method.
In summary, the embodiments of the present application provide a method, an apparatus, an electronic device, and a readable storage medium for obtaining a weld position, where a first model point cloud of a workpiece model corresponding to a workpiece to be welded and a global point cloud of the workpiece to be welded are obtained first; then, registering the first model point cloud and the global point cloud to obtain a global registration result; then, based on a global registration result, unifying the obtained first local point cloud and the first model point cloud of the workpiece to be welded under a target coordinate system to obtain a second local point cloud and a second model point cloud, wherein the first local point cloud and the second local point cloud are acquired through a point cloud acquisition device; and then, registering the second local point cloud and the second model point cloud to obtain a local registration result, and further obtaining a target weld position from the second local point cloud based on the local registration result and weld position information in the second model point cloud. Therefore, a more accurate local registration result can be obtained based on the global and local matrix fusion mode, so that the welding seam position can be conveniently and accurately obtained, and the welding precision is further improved.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. The apparatus embodiments described above are merely illustrative, for example, of the flowcharts and block diagrams in the figures that illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form a single part, or each module may exist alone, or two or more modules may be integrated to form a single part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a usb disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The above description is only of alternative embodiments of the present application and is not intended to limit the present application, and various modifications and variations will be apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the protection scope of the present application.
Claims (10)
1. A method of obtaining a weld position, the method comprising:
obtaining a first model point cloud of a workpiece model corresponding to a workpiece to be welded;
obtaining a global point cloud of the workpiece to be welded;
registering the first model point cloud and the global point cloud to obtain a global registration result;
obtaining a first local point cloud of the workpiece to be welded, wherein the global point cloud and the first local point cloud are acquired through a point cloud acquisition device, the global point cloud is obtained by stitching a plurality of point clouds acquired by the point cloud acquisition device, and the first local point cloud is acquired by the point cloud acquisition device in a single point cloud acquisition;
unifying the first local point cloud and the first model point cloud under a target coordinate system based on the global registration result to obtain a second local point cloud and a second model point cloud;
registering the second local point cloud and the second model point cloud to obtain a local registration result;
and obtaining a target weld position from the second local point cloud according to the local registration result and the weld position information in the second model point cloud.
2. The method of claim 1, wherein registering the first model point cloud and the global point cloud to obtain a global registration result comprises:
performing plane detection on the global point cloud to obtain a target plane;
and obtaining rotation information corresponding to the target plane according to a target normal vector and the target plane, and registering the first model point cloud and the global point cloud according to the rotation information to obtain the global registration result, wherein the target normal vector is used for representing the direction of one coordinate axis of a three-dimensional coordinate system, and the rotation information comprises a rotation axis and a rotation angle.
3. The method of claim 2, wherein a plurality of the target planes are provided, and the obtaining rotation information corresponding to the target plane according to the target normal vector and the target plane, and registering the first model point cloud and the global point cloud according to the rotation information to obtain the global registration result, comprises:
for each target plane, calculating rotation information corresponding to the target plane according to the target plane and the target normal vector;
for each target plane, obtaining an initial global registration result corresponding to the target plane and an error value corresponding to the initial global registration result through registration calculation according to the rotation information of the target plane, the first model point cloud and the global point cloud;
and taking the initial global registration result corresponding to the minimum error value among the obtained error values as the global registration result.
4. The method according to claim 2, wherein performing plane detection on the global point cloud to obtain a target plane includes:
performing plane detection on the global point cloud to obtain a plurality of initial planes;
and selecting the target planes from the plurality of initial planes, wherein the number of the initial planes is greater than the number of the target planes.
5. The method of claim 1, wherein registering the second local point cloud and the second model point cloud to obtain a local registration result comprises:
determining matching point pairs according to the second local point cloud and the second model point cloud, wherein each matching point pair comprises two points, one being a point in the second local point cloud and the other being a point in the second model point cloud;
for each matching point pair, calculating the included angle between the normal vectors corresponding to the two points in the matching point pair;
if the included angle is larger than a preset angle, taking the point belonging to the second local point cloud in the matching point pair as an abnormal point;
deleting the abnormal points from the second local point cloud to obtain a third local point cloud;
and obtaining the local registration result through registration according to the third local point cloud and the second model point cloud.
6. The method of claim 5, wherein registering the second local point cloud and the second model point cloud to obtain a local registration result, further comprises:
for each matching point pair, calculating the distance between the two points in the matching point pair;
and if the distance is larger than a preset distance, taking the point belonging to the second local point cloud in the matching point pair as an abnormal point.
7. The method according to claim 5 or 6, wherein the obtaining the target weld position from the second local point cloud based on the local registration result and the weld position information in the second model point cloud comprises:
converting the second model point cloud into a third model point cloud according to the local registration result;
and obtaining the target weld position from the third local point cloud according to the weld position information in the third model point cloud.
8. The method according to any one of claims 1 to 6, wherein the global point cloud is obtained by stitching a plurality of point clouds acquired by the point cloud acquisition device at the end of the robot, and the target coordinate system is a robot base coordinate system; and/or, the processing performed in the process of obtaining the global registration result comprises ICP registration; and/or the processing performed in the process of obtaining the local registration result comprises GICP registration.
9. A weld position obtaining apparatus, characterized by comprising:
The obtaining module is used for obtaining a first model point cloud of a workpiece model corresponding to a workpiece to be welded;
The obtaining module is further used for obtaining the global point cloud of the workpiece to be welded;
The global registration module is used for registering the first model point cloud and the global point cloud to obtain a global registration result;
The obtaining module is further used for obtaining a first local point cloud of the workpiece to be welded, wherein the global point cloud and the first local point cloud are acquired through a point cloud acquisition device, the global point cloud is obtained by stitching a plurality of point clouds acquired by the point cloud acquisition device, and the first local point cloud is acquired by the point cloud acquisition device in a single point cloud acquisition;
The conversion module is used for unifying the first local point cloud and the first model point cloud under a target coordinate system based on the global registration result to obtain a second local point cloud and a second model point cloud;
The local registration module is used for registering the second local point cloud and the second model point cloud to obtain a local registration result;
and the position determining module is used for obtaining a target weld position from the second local point cloud according to the local registration result and the weld position information in the second model point cloud.
10. An electronic device comprising a processor and a memory, the memory storing machine-executable instructions executable by the processor to implement the weld position obtaining method of any one of claims 1 to 8.
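To make the rotation computation recited in claims 2 and 3 more tangible, the following NumPy sketch is offered as an assumption-laden illustration rather than the claimed implementation: it takes the normal of a detected plane and a target normal vector (the direction of one coordinate axis), derives a rotation axis and angle, and builds the corresponding rotation matrix with the Rodrigues formula. The 1e-9 tolerance and the example normals are arbitrary placeholders.

```python
import numpy as np

def rotation_from_plane_normal(plane_normal, target_normal):
    """Rotation information (axis, angle) aligning a detected plane's normal with a
    target normal vector, returned together with the 3x3 rotation matrix."""
    n = np.asarray(plane_normal, dtype=float)
    t = np.asarray(target_normal, dtype=float)
    n /= np.linalg.norm(n)
    t /= np.linalg.norm(t)

    axis = np.cross(n, t)                        # rotation axis
    s = np.linalg.norm(axis)
    c = np.clip(np.dot(n, t), -1.0, 1.0)
    angle = np.arctan2(s, c)                     # rotation angle

    if s < 1e-9:                                 # normals (anti-)parallel: pick any perpendicular axis
        axis = np.cross(n, [1.0, 0.0, 0.0])
        if np.linalg.norm(axis) < 1e-9:
            axis = np.cross(n, [0.0, 1.0, 0.0])
    axis /= np.linalg.norm(axis)

    # Rodrigues formula: R = I + sin(angle) * K + (1 - cos(angle)) * K^2
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    R = np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)
    return axis, angle, R

# Example: rotate a detected plane's normal onto the +Z axis of the target frame.
axis, angle, R = rotation_from_plane_normal([0.1, 0.2, 0.97], [0.0, 0.0, 1.0])
```

In Open3D, for instance, PointCloud.segment_plane returns a plane model (a, b, c, d) whose first three coefficients give such a normal and could feed this helper; claim 3 then simply repeats the registration once per detected plane and keeps the result with the smallest error value.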
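Claims 5 and 6 describe discarding abnormal points from the second local point cloud before the final registration. The sketch below shows one plausible reading of that step using NumPy and SciPy; the nearest-neighbour pairing, the 30 degree angle threshold, and the 2.0 distance threshold are assumptions introduced for illustration, not values taken from the patent.

```python
import numpy as np
from scipy.spatial import cKDTree

def remove_abnormal_points(local_pts, local_normals, model_pts, model_normals,
                           max_angle_deg=30.0, max_dist=2.0):
    """Build matching point pairs (local point <-> nearest model point) and drop local
    points whose pair exceeds the normal-angle or distance threshold (claims 5 and 6).
    Returns the filtered "third local point cloud"."""
    tree = cKDTree(model_pts)
    dist, idx = tree.query(local_pts)                     # one matching pair per local point

    ln = local_normals / np.linalg.norm(local_normals, axis=1, keepdims=True)
    mn = model_normals[idx]
    mn = mn / np.linalg.norm(mn, axis=1, keepdims=True)
    angle = np.degrees(np.arccos(np.clip(np.sum(ln * mn, axis=1), -1.0, 1.0)))

    keep = (angle <= max_angle_deg) & (dist <= max_dist)  # everything else is an abnormal point
    return local_pts[keep]
```

The surviving points would then be registered against the second model point cloud, as the final step of claim 5, for example with GICP as claim 8 suggests.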
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410171766.XA CN117726673B (en) | 2024-02-07 | 2024-02-07 | Weld joint position obtaining method and device and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN117726673A CN117726673A (en) | 2024-03-19 |
CN117726673B true CN117726673B (en) | 2024-05-24 |
Family
ID=90203792
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410171766.XA Active CN117726673B (en) | 2024-02-07 | 2024-02-07 | Weld joint position obtaining method and device and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117726673B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118212233B (en) * | 2024-05-20 | 2024-09-27 | 法奥意威(苏州)机器人系统有限公司 | Linear weld joint identification method and device and electronic equipment |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015006224A1 (en) * | 2013-07-08 | 2015-01-15 | Vangogh Imaging, Inc. | Real-time 3d computer vision processing engine for object recognition, reconstruction, and analysis |
2024-02-07: CN application CN202410171766.XA, patent CN117726673B, status Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112325796A (en) * | 2020-10-26 | 2021-02-05 | 上海交通大学 | Large-scale workpiece profile measuring method based on auxiliary positioning multi-view point cloud splicing |
CN112446844A (en) * | 2020-11-27 | 2021-03-05 | 广东电网有限责任公司肇庆供电局 | Point cloud feature extraction and registration fusion method |
CN115578408A (en) * | 2022-07-28 | 2023-01-06 | 四川大学 | Point cloud registration blade profile optical detection method, system, equipment and terminal |
CN116518864A (en) * | 2023-04-07 | 2023-08-01 | 同济大学 | Engineering structure full-field deformation detection method based on three-dimensional point cloud comparison analysis |
CN116329824A (en) * | 2023-04-24 | 2023-06-27 | 仝人智能科技(江苏)有限公司 | Hoisting type intelligent welding robot and welding method thereof |
CN117475170A (en) * | 2023-12-22 | 2024-01-30 | 南京理工大学 | FPP-based high-precision point cloud registration method guided by local-global structure |
Non-Patent Citations (1)
Title |
---|
Point cloud global registration algorithm based on translation domain estimation; Yang Binhua; Zhao Gaopeng; Liu Lujiang; Bo Yuming; Journal of Computer Applications; 2016-06-10 (Issue 06); full text *
Also Published As
Publication number | Publication date |
---|---|
CN117726673A (en) | 2024-03-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Urban et al. | Multicol-slam-a modular real-time multi-camera slam system | |
Kang et al. | Automatic targetless camera–lidar calibration by aligning edge with gaussian mixture model | |
CN117726673B (en) | Weld joint position obtaining method and device and electronic equipment | |
CN110842901B (en) | Robot hand-eye calibration method and device based on novel three-dimensional calibration block | |
KR20150079730A (en) | Systems and methods of merging multiple maps for computer vision based tracking | |
WO2011115143A1 (en) | Geometric feature extracting device, geometric feature extracting method, storage medium, three-dimensional measurement apparatus, and object recognition apparatus | |
CN111798398B (en) | Point cloud noise reduction method and device, electronic equipment and computer readable storage medium | |
CN116433737A (en) | Method and device for registering laser radar point cloud and image and intelligent terminal | |
O'Byrne et al. | A stereo‐matching technique for recovering 3D information from underwater inspection imagery | |
Scaramuzza et al. | A robust descriptor for tracking vertical lines in omnidirectional images and its use in mobile robotics | |
CN113822996B (en) | Pose estimation method and device for robot, electronic device and storage medium | |
JP6673504B2 (en) | Information processing device, database generation device, method, program, and storage medium | |
CN118212233A (en) | Linear weld joint identification method and device and electronic equipment | |
CN117911628A (en) | Ancient building reconstruction method based on three-dimensional laser scanning and unmanned aerial vehicle oblique photography | |
CN117745778A (en) | Point cloud registration realization method and device, storage medium and electronic equipment | |
CN113538699A (en) | Positioning method, device and equipment based on three-dimensional point cloud and storage medium | |
JP2019105992A (en) | Image processing device, image processing program and image processing method | |
CN111656404B (en) | Image processing method, system and movable platform | |
CN115880453A (en) | Point cloud data matching method, matching device and computer readable medium | |
KR102624644B1 (en) | Method of estimating the location of a moving object using vector map | |
Dantanarayana et al. | Object recognition and localization from 3D point clouds by maximum-likelihood estimation | |
Kovacs et al. | Edge detection in discretized range images | |
WO2017042852A1 (en) | Object recognition appratus, object recognition method and storage medium | |
Xu et al. | Automatic registration method for TLS LiDAR data and image-based reconstructed data | |
Rastgar | Robust self-calibration and fundamental matrix estimation in 3D computer vision |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||