WO2020042032A1 - Gap detection method and system for a visual welding system - Google Patents

Gap detection method and system for a visual welding system Download PDF

Info

Publication number
WO2020042032A1
WO2020042032A1 (PCT/CN2018/103077, CN2018103077W)
Authority
WO
WIPO (PCT)
Prior art keywords
metal parts
gap
dot
image
information
Prior art date
Application number
PCT/CN2018/103077
Other languages
English (en)
French (fr)
Inventor
阳光
王磊
Original Assignee
深圳配天智能技术研究院有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳配天智能技术研究院有限公司 filed Critical 深圳配天智能技术研究院有限公司
Priority to PCT/CN2018/103077 priority Critical patent/WO2020042032A1/zh
Priority to CN201880087341.3A priority patent/CN111630342B/zh
Publication of WO2020042032A1 publication Critical patent/WO2020042032A1/zh

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/14 - Measuring arrangements characterised by the use of optical techniques for measuring distance or clearance between spaced objects or spaced apertures

Definitions

  • The present application relates to the field of gap detection, and in particular to a gap detection method and system for a visual welding system.
  • The present application provides a gap detection method and system for a visual welding system, so as to solve the problem that gap detection today is either difficult or costly.
  • A technical solution adopted in the present application is to provide a gap detection method for a visual welding system, including the steps of: determining, by visual inspection, the arrangement relationship of two metal parts in a region to be welded; when the two metal parts are arranged on different planes in space, projecting dot-shaped laser patterns onto the two metal parts; and determining the gap information between the two metal parts by a structured light detection method according to the dot-shaped laser patterns.
  • Another technical solution is a visual welding system including a visual inspection system for determining, by visual inspection, the arrangement relationship between two metal parts in a region to be welded, and a laser projection device for projecting dot-shaped laser patterns onto two metal parts whose arrangement is determined to be on different planes in space; the visual inspection system further determines the gap information between the two metal parts by a structured light detection method according to the dot-shaped patterns.
  • Another technical solution is a computer storage medium storing a program file capable of implementing any of the above methods.
  • The beneficial effect is a gap detection method and system for a visual welding system which determines the arrangement relationship of two metal parts in a region to be welded by visual inspection and, when the two metal parts are determined to be arranged on different planes in space, projects dot-shaped laser patterns onto the two metal parts and determines the gap information between the two metal parts by a structured light detection method according to the dot-shaped laser patterns, so that gap detection of the two metal parts in the welding area can be achieved.
  • FIG. 1 is a schematic flowchart of a first embodiment of a method for detecting a gap in a visual welding system of the present application
  • FIG. 2 is a schematic flowchart of a second embodiment of a method for detecting a gap in a visual welding system of the present application
  • FIG. 3 is a schematic flowchart of a third embodiment of a method for detecting a gap in a visual welding system of the present application
  • FIG. 4 is a schematic flowchart of a fourth embodiment of a method for detecting a gap in a visual welding system of the present application
  • FIG. 5 is a schematic diagram of a specific embodiment of the embodiments of FIG. 3 and FIG. 4;
  • FIG. 6 is a schematic diagram of a simple principle of coordinate calculation in the specific embodiment shown in FIG. 5;
  • FIG. 7 is a schematic diagram of another simple principle of coordinate calculation in the specific embodiment shown in FIG. 5;
  • FIG. 8 is a schematic structural diagram of an embodiment of a visual welding system of the present application.
  • FIG. 9 is a schematic structural diagram of an embodiment of a computer storage medium of the present application.
  • FIG. 1 is a schematic flowchart of a first embodiment of a gap detection method of a visual welding system of the present application.
  • a specific gap detection method includes the following steps:
  • Since different welding methods are used for metal parts in different arrangements, the two metal parts in the welding area must first be inspected; specifically, image acquisition, image recognition and processing are performed on the two metals by visual inspection.
  • In specific embodiments, the two metal parts are arranged either on different planes in space (skew) or coplanar in space.
  • FIG. 2 is a schematic flowchart of a second embodiment of the gap detection method of the visual welding system of the present application and a sub-embodiment of step S11, which specifically includes the following steps:
  • S111 Perform image acquisition on the area to be welded to obtain a visual inspection image.
  • In this embodiment, the region to be welded is inspected automatically.
  • Image acquisition is performed first, mainly by capturing the entire welding area with a machine, so as to obtain a visual inspection image of the two metal parts in the welding area.
  • S112 Perform image recognition on the visual inspection image to identify two metal parts from the visual inspection image.
  • The present application completes gap detection mainly by projecting laser patterns onto the surfaces of the metal parts; the position information of the two metal parts, including specific coordinates and positional arrangement, must be determined, so the visual inspection image containing the metal parts needs to be recognized in order to identify the metal parts.
  • When the arrangement of the two metal parts in the visual inspection image is on different planes in space, a structured light detection method is used; when the arrangement of the two metal parts is coplanar in space, a brightness detection method is used.
  • In the above embodiment, image acquisition is performed on the welding area, the two metal parts in the acquired visual inspection image are recognized, and the arrangement relationship between the two metal parts is then determined, thereby completing the visual detection and recognition of the two metals in the welding area and determining, from the arrangement, the detection method to be used next.
  • When the two metal parts are detected as being arranged on different planes in space, a structured light detection method is adopted.
  • In this embodiment, dot-shaped laser patterns are specifically projected onto the two metal parts, and image acquisition, recognition and processing are then performed to determine whether a gap exists between the two metal parts, where the gap information includes whether a gap exists and the size of the gap.
  • In the above embodiment, the arrangement of the two metal parts is detected first, and different gap detection methods are then determined based on the different detection results.
  • FIG. 3 is a schematic flowchart of a third embodiment of a gap detection method of the visual welding system of the present application.
  • In this embodiment, corresponding laser patterns are formed on the surfaces of the two metal parts, the spatial position information of the laser patterns projected onto the metal parts is then obtained, the positions of the metal parts corresponding to the laser patterns are derived from it, and the gap information between the two metals is finally determined, where the gap information includes whether a gap exists and the size of the gap. It includes the following steps:
  • Step S131: dot-shaped laser patterns are projected onto the two metal parts, so that no fewer than two dot-shaped laser patterns act on the surface of each of the two metal parts.
  • In this embodiment, dot-shaped laser patterns need to be projected onto the two metal parts, including a first laser pattern and a second laser pattern: at least two dot-shaped laser patterns act on the surface of one of the two skew-arranged metal parts to form a laser pattern there, and at least two dot-shaped laser patterns act on the surface of the other metal part to likewise form a corresponding laser pattern.
  • S132 Perform image acquisition on two metal parts that project a dot-shaped laser pattern to obtain a gap detection image.
  • S133 Perform image recognition on the gap detection image to identify the dot-shaped laser patterns from the gap detection image.
  • In this embodiment, because the dot-shaped laser patterns act on the surfaces of the two metal parts and form corresponding laser patterns, the spatial position information of a dot-shaped laser pattern is effectively the spatial position information of the metal part on which it lies; the laser patterns in the gap detection image therefore need to be identified first.
  • S134 Determine gap information between two metal parts according to position information of the dot-shaped laser pattern in the gap detection image.
  • After the dot-shaped laser patterns are detected in the gap detection image, their position information is obtained by calculation, the position information of the corresponding metal parts is derived from it, and the gap information between the two metal parts is then obtained by comparison and determination.
  • FIG. 4 is a schematic flowchart of a fourth embodiment of the gap detection method of the visual welding system of the present application and a sub-embodiment of step S134 of FIG. 3; points are selected on the identified dot-shaped laser patterns, straight lines are determined, and distances are calculated and compared to obtain the gap between the two metal parts, including the following steps:
  • S1341 Determine the spatial coordinates of at least two reference points on the surfaces of the two metal parts according to the position information of the dot-shaped laser pattern in the gap detection image.
  • The position of a laser pattern acting on a metal part is equivalent to the position information of the metal part on which it lies, so it suffices to compute the spatial position information of the straight line on which the dot-shaped laser patterns lie, which is the spatial position information of the metal part.
  • Specifically, at least two points are needed to calculate a straight line.
  • According to the position information of the laser patterns in the gap detection image, the spatial coordinates of at least two points of each laser pattern are obtained.
  • FIG. 5 is a schematic diagram of a specific embodiment of the embodiments of FIG. 3 and FIG. 4.
  • In this embodiment, the two metal parts are detected by projecting dot-shaped lasers.
  • The welding area includes two metal parts, M and N.
  • By projecting dot-shaped lasers onto the two metal parts, at least two dot-shaped laser patterns act on the surface of metal part M.
  • Following the principle of using the minimum needed, only two dot-shaped laser spots are formed on it, namely laser spot E and laser spot G.
  • Likewise, laser spots F and H are formed on the surface of metal part N.
  • Two points are taken from the laser patterns on each of the two metal parts.
  • According to the structured light detection principle, the horizontal and longitudinal axis coordinates of the selected points can be determined from the image on the one hand, and on the other hand the vertical axis coordinates of the selected points are determined from the offset angle and distance information of the image, so that the spatial coordinates of the selected points can be obtained.
  • On metal part M, points E and G are taken, with coordinates E(x1, y1, z1) and G(x2, y2, z2), corresponding to straight line L1; on metal part N, points F and H are taken, with coordinates F(X1, Y1, Z1) and H(X2, Y2, Z2), corresponding to straight line L2. That is, metal part M corresponds to line L1 and metal part N corresponds to line L2.
  • The three-dimensional coordinates of the points detected by the 2D camera described above can be obtained specifically as follows:
  • The laser from light source 21 forms a laser pattern on the surface of metal part M.
  • Taking one laser spot m, the intersection of the viewing axis of image sensor 11 with the plane of metal part M is point Q; m, image sensor 11 and Q therefore form a right triangle, one of whose angles is θ1.
  • θ1 is the offset angle of image sensor 11 when capturing point m and is known; L1 is the offset distance from m to Q, obtained from the distance between the two points in the captured image. In the right triangle, one leg and one non-right angle are therefore known.
  • The length of the other leg, i.e. the distance from image sensor 11 to point Q, follows from trigonometry; establishing coordinate axes then gives the horizontal, longitudinal and vertical coordinates of the point.
  • Similarly, for metal part N, the laser spot is n, the intersection point is P, the offset angle is θ2 and the offset distance is L2; the distance from image sensor 11 to point P can be obtained in the same way.
  • In other embodiments, the laser pattern may also be projected vertically onto the metal part, with a point P taken on the metal part.
  • The light source, the camera and point P then form a right triangle, and the same trigonometric method is used for the calculation.
  • In that case the offset is the relative distance between the light source and the camera.
  • In yet another embodiment, a method of constructing multiple planes and coordinate systems may be used.
  • After the light source 21 projects dot-shaped laser patterns onto metal part M, the light source 21 and the two rays respectively form a lateral light plane πh and a longitudinal light plane πv, where πc is the image plane; the coordinate system of image sensor 11 is constructed as O_c x_c y_c z_c, where O_p is the intersection of the optical axis of image sensor 11 with the image plane πc.
  • The undistorted image coordinate system is O_u x_u y_u, and the three-dimensional world coordinate system is O_w x_w y_w z_w, where O_c x_c is defined to be parallel to O_u x_u and O_c z_c perpendicular to πc.
  • The model of the entire image sensor 11 can then be expressed by equation (1), where ρ is not 0, (fx, fy) are the effective focal lengths of image sensor 11 in the x and y directions, and (u0, v0) are the principal point coordinates of image sensor 11.
  • ri (i = 1...9) are the elements of the orthogonal rotation matrix R, and tx, ty and tz are the elements of the translation vector T.
  • The spatial point P has a unique projection point p on the image plane πc, i.e. point p corresponds to a unique ray O_c p in space, and P lies on O_c p.
  • The equation of the ray O_c p can be determined from equation (1), and the equations of the light planes πh and πv are determined from equations (2) and (3), respectively.
  • The intersection of O_c p with πh or πv then determines the three-dimensional coordinates of point P in O_c x_c y_c z_c.
  • In each of the above ways, three-dimensional coordinate information of a projected point can be obtained with a 2D camera; specific embodiments are not limited to the above methods.
  • S1342 Determine a spatial straight line equation corresponding to each metal component according to the spatial coordinates of at least two reference points on each metal component.
  • The spatial position of L1 corresponds to metal part M and the spatial position of L2 corresponds to metal part N; that is, the two line equations are the spatial position information equations of the two metal parts, respectively.
  • S1343 Determine the gap information between the two metal parts according to the spatial line equation of each metal part.
  • To obtain the length of the common perpendicular segment between the two lines, select any point Q on L1 and draw through Q a line L3 parallel to L2, so that L1 and L3 define a plane O; then take any point W on L2, and the distance from W to plane O is exactly the length of the common perpendicular segment between L1 and L2.
  • The above is only one way of computing the distance between skew lines.
  • In other embodiments, any other method for determining the distance between two lines may be used, without limitation.
  • The length of the common perpendicular segment is compared with a length threshold, and the gap information between the two metal parts is determined from the difference: if the length of the common perpendicular segment is greater than the threshold length, i.e. the difference is greater than 0, the distance between the two metal parts exceeds the thickness of the parts, which means a gap exists.
  • The value of the difference is then the size of the gap between the two metal parts; if the difference between the length of the common perpendicular segment and the preset length threshold equals 0, it is determined that no gap exists between the two metal parts.
  • FIG. 8 is a schematic structural diagram of an embodiment of a visual welding system of the present application.
  • the vision welding system includes a vision inspection system 10 and a laser projection device 20.
  • The visual inspection system 10 is used to perform visual inspection on the two metal parts in the welding area; after the arrangement of the two metal parts is obtained, the laser projection device 20 projects dot-shaped laser patterns onto the two metal parts determined to be arranged on different planes in space, and the visual inspection system 10 further determines the gap information between the two metal parts by structured light detection according to the dot-shaped laser patterns.
  • The visual inspection system 10 specifically includes an image sensor 11 and a processor 12.
  • In a specific embodiment, the image sensor 11 first performs image acquisition on the welding area to obtain a visual inspection image, and the processor 12 then processes the visual inspection image acquired by the image sensor 11.
  • The processor 12 identifies the two metal parts in the visual inspection image whose gap needs to be detected, further determines the specific arrangement relationship and position information of the identified metal parts, and sends the position and arrangement information to the laser projection device 20.
  • The laser projection device 20 mainly includes a light source 21.
  • When the visual inspection system 10 has determined the positions and the specific arrangement relationship of the two metal parts, the position information and arrangement of the two metal parts sent by the visual inspection system 10 are used to determine the laser detection method.
  • Specifically, based on the information transmitted by the processor 12, the light source 21 projects different laser patterns according to the obtained position information and arrangement information of the two metal parts.
  • The light source 21 of the laser projection device 20 then projects dot-shaped laser patterns onto the two metal parts and, according to the specific position information of the metal parts, ensures that no fewer than two dot-shaped laser patterns act on the surface of each of the two metal parts.
  • The image sensor 11 is responsible for image acquisition of the two metal parts to obtain a gap detection image.
  • From the obtained gap detection image, the processor 12 identifies the laser patterns on the two metal surfaces and determines the spatial coordinates of at least two reference points on the surface of each metal part according to the position information of the laser patterns in the gap detection image.
  • The spatial line equation corresponding to each metal part is determined from the spatial coordinates of its reference points, and the gap information between the two metal parts is further calculated according to the spatial line equations of the metal parts.
  • The image sensor 11 is a low-cost 2D image sensor which cooperates with the processor 12 and the light source 21 to form a 3D structural model of the light source 21, the metal parts and the image sensor 11; based on the structured light principle, images of the metal parts are acquired, and on the one hand the in-plane position information of the selected points gives their horizontal and longitudinal coordinates, while on the other hand the vertical axis coordinates of the selected points are obtained from the offset of the image relative to the light source 21 and the image sensor 11.
  • The processor 12 provided in this embodiment is not limited to image processing and can also perform other processing, such as controlling the projection direction of the light source 21 and controlling the acquisition angle of the image sensor 11.
  • In other embodiments, the processor 12 can also be implemented as an external unit, and to save cost the processor 12 can also perform the related processing for multiple systems at the same time.
  • FIG. 9 is a schematic structural diagram of an embodiment of a computer storage medium of the present application.
  • The program file 31 can be stored in the storage device in the form of a software product, possibly together with recorded computation data.
  • It includes several instructions to enable a computer device (which may be a personal computer, a server or a network device, etc.) or a processor to perform all or part of the steps of the methods in the embodiments of the present application.
  • The aforementioned storage device includes media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc, or terminal devices such as computers, servers, mobile phones and tablets.
  • the present application provides a gap detection method and system for a visual welding system.
  • Through visual detection of the positions and the arrangement of the two metal parts in the welding area, the corresponding laser detection method is selected, a laser pattern of a predetermined shape is projected onto the two metal parts, the positional relationship of the laser patterns is calculated from the acquired laser patterns, and the gap information between the two metal parts is further calculated.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present application discloses a gap detection method and system for a visual welding system. The method includes: determining, by visual inspection, the arrangement relationship of two metal parts in a region to be welded; when the two metal parts are arranged on different planes in space, projecting dot-shaped laser patterns onto the two metal parts; and determining the gap information between the two metal parts by a structured light detection method according to the dot-shaped laser patterns. Through the above method, the present application can implement gap detection for the welding area, realizes the function of a 3D vision detection system with a 2D vision system, improves the technique and saves cost.

Description

Gap detection method and system for a visual welding system [Technical Field]
The present application relates to the field of gap detection, and in particular to a gap detection method and system for a visual welding system.
[Background Art]
In the field of welding, detecting whether a gap exists between the metals to be welded is particularly important. However, today's single-camera vision systems have difficulty effectively detecting the gaps between the three-dimensional metal parts to be welded; at the same time, with the existing technology, spatial and 3D vision systems are relatively expensive and hard to mass-produce, so 2D vision systems are used instead, but an ordinary 2D vision system has many problems when viewing metal, for example the viewing angle, uneven reflections from the metal surface, and grey levels of the metal that are difficult to distinguish from the similar grey levels of the surroundings.
[Summary of the Invention]
The present application provides a gap detection method and system for a visual welding system, so as to solve the problem that gap detection today is either difficult or costly.
One technical solution adopted by the present application is to provide a gap detection method for a visual welding system, including the steps of: determining, by visual inspection, the arrangement relationship of two metal parts in a region to be welded; when the two metal parts are arranged on different planes in space, projecting dot-shaped laser patterns onto the two metal parts; and determining the gap information between the two metal parts by a structured light detection method according to the dot-shaped laser patterns.
To solve the above technical problem, another technical solution adopted by the present application is a visual welding system, including a visual inspection system for determining, by visual inspection, the arrangement relationship of two metal parts in a region to be welded, and a laser projection device for projecting dot-shaped laser patterns onto two metal parts whose arrangement is determined to be on different planes in space; the visual inspection system further determines the gap information between the two metal parts by a structured light detection method according to the dot-shaped patterns.
To solve the above technical problem, another technical solution adopted by the present application is a computer storage medium storing a program file capable of implementing any of the above methods.
The beneficial effects of the present application are as follows: in the gap detection method and system for a visual welding system, the arrangement relationship of the two metal parts in the region to be welded is determined by visual inspection; when the two metal parts are determined to be arranged on different planes in space, dot-shaped laser patterns are projected onto the two metal parts, and the gap information between the two metal parts is determined by a structured light detection method according to the dot-shaped laser patterns, so that gap detection of the two metal parts in the welding area can be achieved.
[Brief Description of the Drawings]
FIG. 1 is a schematic flowchart of a first embodiment of the gap detection method of the visual welding system of the present application;
FIG. 2 is a schematic flowchart of a second embodiment of the gap detection method of the visual welding system of the present application;
FIG. 3 is a schematic flowchart of a third embodiment of the gap detection method of the visual welding system of the present application;
FIG. 4 is a schematic flowchart of a fourth embodiment of the gap detection method of the visual welding system of the present application;
FIG. 5 is a schematic diagram of a specific embodiment of the embodiments of FIG. 3 and FIG. 4;
FIG. 6 is a schematic diagram of a simple principle of the coordinate calculation in the specific embodiment shown in FIG. 5;
FIG. 7 is a schematic diagram of another simple principle of the coordinate calculation in the specific embodiment shown in FIG. 5;
FIG. 8 is a schematic structural diagram of an embodiment of the visual welding system of the present application;
FIG. 9 is a schematic structural diagram of an embodiment of the computer storage medium of the present application.
[Detailed Description of the Embodiments]
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings of the embodiments. Obviously, the described embodiments are only some rather than all of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the scope of protection of the present application.
Referring to FIG. 1, FIG. 1 is a schematic flowchart of a first embodiment of the gap detection method of the visual welding system of the present application; the specific gap detection method includes the following steps:
S11: Determine, by visual inspection, the arrangement relationship of the two metal parts in the welding area.
In this embodiment, since different welding methods are used for metal parts in different arrangements, the two metal parts in the welding area must first be inspected; specifically, image acquisition, image recognition and processing are performed on the two metals by visual inspection.
In specific embodiments, the two metal parts are arranged either on different planes in space (skew) or coplanar in space.
Referring to FIG. 2, FIG. 2 is a schematic flowchart of a second embodiment of the gap detection method of the visual welding system of the present application and a sub-embodiment of step S11, which specifically includes the following steps:
S111: Perform image acquisition on the region to be welded to obtain a visual inspection image.
In this embodiment, the region to be welded is inspected automatically. Image acquisition is performed first, mainly by capturing the entire welding area with a machine, so as to obtain a visual inspection image of the two metal parts in the welding area.
S112: Perform image recognition on the visual inspection image to identify the two metal parts from the visual inspection image.
The present application completes gap detection mainly by projecting laser patterns onto the surfaces of the metal parts, and the position information of the two metal parts, including specific coordinates and positional arrangement, must be determined; the visual inspection image containing the metal parts therefore needs to be recognized in order to identify the metal parts.
S113: Determine the arrangement relationship of the two metal parts in the visual inspection image.
In the present application, different detection methods are used for different metal welding modes, i.e. different detection methods are used for different arrangements of the metals; therefore, after recognition, the position information of the two metal parts is processed to obtain their positional relationship.
When the arrangement of the two metal parts in the visual inspection image is on different planes in space, a structured light detection method is used; when the arrangement of the two metal parts is coplanar in space, a brightness detection method is used.
In the above embodiment, image acquisition is performed on the welding area, the two metal parts in the acquired visual inspection image are recognized, and the arrangement relationship of the two metal parts is then determined, thereby completing the visual detection and recognition of the two metals in the welding area and determining, from the arrangement, the detection method to be used next.
S13: When the two metal parts are arranged on different planes in space, perform detection using the structured light detection method.
When the two metal parts are detected as being arranged on different planes in space, the structured light detection method is used. In this embodiment, specifically, dot-shaped laser patterns are projected onto the two metal parts, and image acquisition, recognition and processing are then performed to determine whether a gap exists between the two metal parts, where the gap information includes whether a gap exists and the size of the gap.
In the above embodiment, the arrangement of the two metal parts is detected first, and different gap detection methods are then determined according to the different detection results.
Referring to FIG. 3, FIG. 3 is a schematic flowchart of a third embodiment of the gap detection method of the visual welding system of the present application. In this embodiment, corresponding laser patterns are formed on the surfaces of the two metal parts, the spatial position information of the laser patterns projected onto the metal parts is then obtained, the positions of the metal parts corresponding to the laser patterns are derived from it, and the gap information between the two metals is finally determined, where the gap information includes whether a gap exists and the size of the gap. The method specifically includes the following steps:
S131: Project dot-shaped laser patterns onto the two metal parts, so that no fewer than two dot-shaped laser patterns act on the surface of each of the two metal parts.
In this embodiment, dot-shaped laser patterns need to be projected onto the two metal parts, including a first laser pattern and a second laser pattern: at least two dot-shaped laser patterns act on the surface of one of the two skew-arranged metal parts to form a laser pattern there, and at least two dot-shaped laser patterns act on the surface of the other metal part to likewise form a corresponding laser pattern.
S132: Perform image acquisition on the two metal parts onto which the dot-shaped laser patterns are projected, to obtain a gap detection image.
After the dot-shaped laser patterns are projected onto the two metal parts and the corresponding laser patterns are formed on their surfaces, a gap detection image must be acquired; compared with the visual inspection image, the gap detection image additionally contains the corresponding laser patterns.
S133: Perform image recognition on the gap detection image to identify the dot-shaped laser patterns from the gap detection image.
In this embodiment, since the dot-shaped laser patterns act on the surfaces of the two metal parts and form corresponding laser patterns, the spatial position information of a dot-shaped laser pattern is effectively the spatial position information of the metal part on which it lies; the laser patterns in the gap detection image must therefore be identified first.
S134: Determine the gap information between the two metal parts according to the position information of the dot-shaped laser patterns in the gap detection image.
After the dot-shaped laser patterns are detected in the gap detection image, their position information is obtained by calculation, the position information of the corresponding metal parts is derived from it, and the gap information between the two metal parts is then obtained by comparison and determination.
Referring to FIG. 4, FIG. 4 is a schematic flowchart of a fourth embodiment of the gap detection method of the visual welding system of the present application and a sub-embodiment of step S134 in FIG. 3; points are selected on the identified dot-shaped laser patterns, straight lines are determined, and distances are calculated and compared to determine whether a gap exists between the two metal parts, which specifically includes the following steps:
S1341: Determine the spatial coordinates of at least two reference points on the surfaces of the two metal parts according to the position information of the dot-shaped laser patterns in the gap detection image.
As explained in the above embodiments, the position of a laser pattern acting on a metal part is equivalent to the position information of the metal part on which it lies, so it is only necessary to compute the spatial position information of the straight line on which the dot-shaped laser patterns lie, which is the spatial position information of the metal part; at least two points are needed to compute a straight line. Specifically, the spatial coordinates of at least two points of each laser pattern are obtained from the position information of the laser patterns in the gap detection image.
Referring further to FIG. 5, FIG. 5 is a schematic diagram of a specific embodiment of the embodiments of FIG. 3 and FIG. 4; in this embodiment, the two metal parts are detected by projecting dot-shaped lasers.
The welding area includes two metal parts, M and N. Dot-shaped lasers are projected onto the two metal parts so that at least two dot-shaped laser patterns act on the surface of metal part M; in this embodiment, following the principle of using the minimum needed, only two dot-shaped laser spots are formed, namely spot E and spot G, and likewise spots F and H are formed on the surface of metal part N.
In a specific embodiment, two points are taken from the laser patterns on each of the two metal parts. According to the structured light detection principle, the horizontal and longitudinal axis coordinates of each selected point can be determined from the image on the one hand, and the vertical axis coordinate of each selected point can be determined from the offset angle and distance information of the image on the other hand, so that the spatial coordinates of the selected points are obtained. On metal part M, points E and G are taken, with coordinates E(x1, y1, z1) and G(x2, y2, z2), corresponding to straight line L1; on metal part N, points F and H are taken, with coordinates F(X1, Y1, Z1) and H(X2, Y2, Z2), corresponding to straight line L2. That is, metal part M corresponds to line L1 and metal part N corresponds to line L2.
The three-dimensional coordinates of the points detected by the 2D camera described above can be obtained specifically as follows:
As shown in FIG. 6, the laser from light source 21 forms a laser pattern on the surface of metal part M. Taking one of its spots, m, the intersection of the viewing axis of image sensor 11 with the plane of metal part M is point Q, so m, image sensor 11 and Q together form a right triangle. One of its angles is θ1, the offset angle of image sensor 11 when capturing point m, which is known; L1 is the offset distance from m to Q, which can be obtained from the distance between the two points after image acquisition. In a right triangle with one leg and one non-right angle known, the length of the other leg, i.e. the distance from image sensor 11 to point Q, follows from trigonometry; after coordinate axes are established, the horizontal, longitudinal and vertical coordinates of the point can be obtained.
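The trigonometric step just described can be sketched in a few lines of Python. Which leg and which angle are taken as known, the function name and the sample numbers are illustrative assumptions; the patent only states that one leg and one non-right angle of the right triangle are known and that the remaining leg follows from a trigonometric function.

```python
import math

def distance_sensor_to_foot(offset_distance: float, offset_angle_rad: float) -> float:
    """In the right triangle formed by laser spot m, the image sensor and foot
    point Q, the leg m-Q (offset_distance) and the non-right angle at the sensor
    (offset_angle_rad) are known; tan(angle) = opposite / adjacent gives the
    other leg, i.e. the sensor-to-Q distance used to build the coordinates."""
    return offset_distance / math.tan(offset_angle_rad)

# Hypothetical numbers: a 30 mm offset between m and Q seen at a 30 degree offset angle.
print(distance_sensor_to_foot(0.030, math.radians(30.0)))  # ~0.052 m
```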
Similarly, for metal part N, the laser spot is n, the intersection point is P, the offset angle is θ2 and the offset distance is L2, and the distance from image sensor 11 to point P can be obtained in the same way.
In other embodiments, the laser pattern may also be projected vertically onto the metal part, with a point P taken on the metal part; the light source, the camera and point P then form a right triangle and the same trigonometric method is used for the calculation, the offset displacement in that case being the relative distance between the light source and the camera.
In yet another embodiment, a method of constructing multiple planes and coordinate systems may be used. As shown in FIG. 7, after the light source 21 projects dot-shaped laser patterns onto metal part M, the light source 21 and the two rays respectively form a lateral light plane πh and a longitudinal light plane πv, where πc is the image plane. The coordinate system of image sensor 11 is constructed as O_c x_c y_c z_c, where O_p is the intersection of the optical axis of image sensor 11 with the image plane πc, the undistorted image coordinate system is O_u x_u y_u, and the three-dimensional world coordinate system is O_w x_w y_w z_w, with O_c x_c defined to be parallel to O_u x_u and O_c z_c perpendicular to πc.
Let p be the perspective projection onto πc of an arbitrary point P on πh or πv, let the three-dimensional world coordinates of P be (x_w, y_w, z_w), and let the coordinates of p in the undistorted image coordinate system be (x_u, y_u).
The model of the entire image sensor 11 can then be expressed as:
Figure PCTCN2018103077-appb-000001
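The equation referenced by the image above is not reproduced in this text. What follows is a reconstruction, not a copy of the original figure: it is the standard pinhole projection model written so as to be consistent with the parameter definitions given below (scale factor ρ, effective focal lengths (f_x, f_y), principal point (u_0, v_0), rotation matrix R with elements r_i, translation vector T with elements t_x, t_y, t_z):

\rho \begin{bmatrix} x_u \\ y_u \\ 1 \end{bmatrix} = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} r_1 & r_2 & r_3 & t_x \\ r_4 & r_5 & r_6 & t_y \\ r_7 & r_8 & r_9 & t_z \end{bmatrix} \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix} \qquad (1)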
where ρ is not 0, (f_x, f_y) are the effective focal lengths of image sensor 11 in the x and y directions, and (u_0, v_0) are the principal point coordinates of image sensor 11; r_i (i = 1...9) are the elements of the orthogonal rotation matrix R, and t_x, t_y, t_z are the elements of the translation vector T.
From equation (1), the spatial point P has a unique projection point p on the image plane πc, i.e. point p corresponds to a unique ray O_c p in space, and P lies on O_c p.
Let the coordinates of an arbitrary point on πh be P_cH = [x_cH, y_cH, z_cH]^T; the equation of πh is then:
a_H x_cH + b_H y_cH + c_H z_cH + d_H = 0 ...........(2)
Similarly, let the coordinates of an arbitrary point on πv be P_cV = [x_cV, y_cV, z_cV]^T; the equation of πv is then:
a_V x_cV + b_V y_cV + c_V z_cV + d_V = 0 ...........(3)
From equations (1)-(3), equation (1) determines the equation of the ray O_c p, equations (2) and (3) determine the equations of the light planes πh and πv respectively, and the intersection of O_c p with πh or πv determines the three-dimensional coordinates of point P in O_c x_c y_c z_c.
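A minimal numerical sketch of this ray/light-plane intersection is given below, assuming the light-plane coefficients are already expressed in the camera frame O_c x_c y_c z_c and that the pixel coordinates are undistorted; the parameter names and numbers are illustrative and are not taken from the patent.

```python
import numpy as np

def triangulate_point(u, v, fx, fy, u0, v0, plane):
    """Intersect the camera ray O_c p defined by the undistorted pixel (u, v)
    with a laser light plane a*x + b*y + c*z + d = 0 given in camera
    coordinates; returns the 3D point P in the camera frame O_c x_c y_c z_c."""
    a, b, c, d = plane
    ray = np.array([(u - u0) / fx, (v - v0) / fy, 1.0])   # direction of O_c p
    t = -d / (a * ray[0] + b * ray[1] + c * ray[2])       # solve the plane equation along the ray
    return t * ray

# Hypothetical intrinsics and a hypothetical light plane 0.2*x + z - 0.6 = 0:
P = triangulate_point(700, 420, fx=1200.0, fy=1200.0, u0=640.0, v0=360.0,
                      plane=(0.2, 0.0, 1.0, -0.6))
print(P)  # approx. [0.0297, 0.0297, 0.594] in the camera frame
```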
Each of the above ways can obtain three-dimensional coordinate information of a projected point with a 2D camera; specific embodiments are not limited to the above methods.
S1342: Determine the spatial line equation corresponding to each metal part according to the spatial coordinates of the at least two reference points on that metal part.
After the spatial coordinates of at least two points of each laser pattern have been obtained, the corresponding spatial line equations are calculated:
L1: (x-x1)/(x2-x1) = (y-y1)/(y2-y1) = (z-z1)/(z2-z1), corresponding to metal part M.
L2: (x-X1)/(X2-X1) = (y-Y1)/(Y2-Y1) = (z-Z1)/(Z2-Z1), corresponding to metal part N.
That is, the spatial position of L1 corresponds to metal part M and the spatial position of L2 corresponds to metal part N; the above two equations are the spatial position information equations of the two metal parts, respectively.
S1343: Determine the gap information between the two metal parts according to the spatial line equations of the metal parts.
Since the preceding steps have already determined that the two metal parts are arranged on different planes in space, L1 and L2 are necessarily skew lines.
Specifically, the length of the common perpendicular segment between the two lines must be calculated. This can be done by selecting an arbitrary point Q on L1 and drawing through Q a line L3 parallel to L2, so that L1 and L3 define a plane O; then take an arbitrary point W on L2, and it only remains to find the distance from W to plane O, which is exactly the length of the common perpendicular segment between L1 and L2. This is only one way of computing the distance between skew lines; in other embodiments, any other method for determining the distance between two lines may be used, without limitation.
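The construction just described (a plane O through L1 parallel to L2, then a point-to-plane distance) is equivalent to the usual closed-form distance between skew lines; the following formula is a formalization of that equivalence using the reference points and is not written out in the patent itself:

d(L_1, L_2) = \frac{\left| \overrightarrow{EF} \cdot ( \overrightarrow{EG} \times \overrightarrow{FH} ) \right|}{\left| \overrightarrow{EG} \times \overrightarrow{FH} \right|}

where \overrightarrow{EG} and \overrightarrow{FH} are the direction vectors of L1 and L2, and \overrightarrow{EF} joins one reference point on each line.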
The length of the common perpendicular segment is then compared with a length threshold, because in specific embodiments the metal parts have a certain thickness, so a distance between the two lines necessarily exists. The difference between the length of the common perpendicular segment and the set length threshold is taken, and the gap information between the two metal parts is determined from the difference: if the length of the common perpendicular segment is greater than the threshold length, i.e. the difference is greater than 0, the distance between the two metal parts is greater than the thickness of the metal parts, which means a gap exists between them, and the value of the difference is the size of the gap between the two metal parts; if the difference between the length of the common perpendicular segment and the preset length threshold equals 0, it is determined that no gap exists between the two metal parts.
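As a sketch of steps S1342 and S1343, assuming the four reference points E, G, F, H already have metric 3D coordinates and that the thickness allowance is given as a single length threshold (the function name, point coordinates and threshold value are illustrative):

```python
import numpy as np

def gap_between_parts(E, G, F, H, length_threshold):
    """Build line L1 from points E, G (part M) and line L2 from F, H (part N),
    compute the common-perpendicular length between the two skew lines, and
    compare it with the preset length threshold."""
    E, G, F, H = (np.asarray(p, dtype=float) for p in (E, G, F, H))
    d1, d2 = G - E, H - F                     # direction vectors of L1 and L2
    n = np.cross(d1, d2)                      # normal of plane O spanned by L1 and a parallel copy of L2
    common_perpendicular = abs(np.dot(F - E, n)) / np.linalg.norm(n)
    difference = common_perpendicular - length_threshold
    return bool(difference > 0), float(max(difference, 0.0))   # (gap exists?, gap size)

# Hypothetical reference points in metres and a 2 mm length threshold:
print(gap_between_parts((0, 0, 0), (0.1, 0, 0),
                        (0, 0.05, 0.004), (0.1, 0.06, 0.004), 0.002))
# -> (True, ~0.002): the lines are about 4 mm apart, so a 2 mm gap is reported.
```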
Referring to FIG. 8, FIG. 8 is a schematic structural diagram of an embodiment of the visual welding system of the present application.
In this embodiment, the visual welding system includes a visual inspection system 10 and a laser projection device 20.
The visual inspection system 10 is used to perform visual inspection on the two metal parts in the welding area; after the arrangement of the two metal parts is obtained, the laser projection device 20 projects dot-shaped laser patterns onto the two metal parts determined to be arranged on different planes in space, and the visual inspection system 10 further determines the gap information between the two metal parts by structured light detection according to the dot-shaped laser patterns.
The visual inspection system 10 specifically includes an image sensor 11 and a processor 12. In a specific embodiment, the image sensor 11 first performs image acquisition on the welding area to obtain a visual inspection image; the processor 12 then processes the visual inspection image acquired by the image sensor 11, identifies the two metal parts in the visual inspection image whose gap needs to be detected, further determines the specific arrangement relationship and position information of the identified metal parts, and sends the position and arrangement information to the laser projection device 20.
The laser projection device 20 mainly includes a light source 21. Once the visual inspection system 10 has determined the positions and the specific arrangement relationship of the two metal parts, the position information and arrangement sent by the visual inspection system 10 are used to determine the laser detection method; specifically, based on the information transmitted by the processor 12, the light source 21 projects different laser patterns according to the obtained position and arrangement information of the two metal parts.
Optionally, when the message sent by the visual inspection system 10 indicates that the two metals are arranged on different planes in space:
the light source 21 of the laser projection device 20 projects dot-shaped laser patterns onto the two metal parts and, according to the specific position information of the metal parts, ensures that no fewer than two dot-shaped laser patterns act on the surface of each of the two metal parts.
After the light source 21 has projected the preset laser patterns onto the surfaces of the two metal parts, the image sensor 11 performs image acquisition on the two metal parts to obtain a gap detection image; from the gap detection image the processor 12 separates out the laser patterns on the two metal surfaces, determines the spatial coordinates of at least two reference points on the surface of each metal part according to the position information of the laser patterns in the gap detection image, determines the spatial line equation corresponding to each metal part from the spatial coordinates of its reference points, and further calculates the gap information between the two metal parts according to the spatial line equations of the metal parts.
The specific detection procedure has already been described in the above embodiments and is not repeated here.
It should be noted that the image sensor 11 provided in this embodiment is a low-cost 2D image sensor; by cooperating with the processor 12 and the light source 21, it forms a 3D structural model of the light source 21, the metal parts and the image sensor 11. Based on the structured light principle, images of the metal parts are acquired: on the one hand, the in-plane position information of the selected points gives their horizontal and longitudinal axis coordinates, and on the other hand, the vertical axis coordinates of the selected points are obtained from the offset of the image relative to the light source 21 and the image sensor 11.
Meanwhile, the processor 12 provided in this embodiment is not limited to image processing and can also perform other processing, such as controlling the projection direction of the light source 21 and the acquisition angle of the image sensor 11; in other embodiments, the processor 12 can also be implemented as an external unit, and to save cost the processor 12 can also perform the related processing for multiple systems at the same time.
Referring to FIG. 9, FIG. 9 is a schematic structural diagram of an embodiment of the computer storage medium of the present application, which stores a program file 31 capable of implementing all of the above methods. The program file 31 can be stored in the above storage device in the form of a software product, possibly together with recorded computation data, and includes several instructions that enable a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to perform all or part of the steps of the methods of the embodiments of the present application.
The aforementioned storage device includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc, or terminal devices such as computers, servers, mobile phones and tablets.
In summary, those skilled in the art will readily understand that the present application provides a gap detection method and system for a visual welding system: based on the visual detection of the positions and the arrangement of the two metal parts in the welding area, the corresponding laser detection method is selected, a laser pattern of a preset shape is projected onto the two metal parts, the positional relationship of the laser patterns is calculated from the acquired laser patterns, and the gap information between the two metal parts is further calculated. Through the above method, gap detection with a 2D vision system replaces expensive gap detection with a 3D vision system and the use of light sources is reduced, so that the gap detection method is optimized, work efficiency is improved and cost is lowered.
The above are only embodiments of the present application and do not thereby limit the patent scope of the present application; any equivalent structural or process transformation made using the contents of the specification and drawings of the present application, or any direct or indirect application in other related technical fields, is likewise included within the scope of patent protection of the present application.

Claims (16)

  1. A gap detection method for a visual welding system, characterized in that the method comprises:
    determining, by visual inspection, an arrangement relationship of two metal parts in a region to be welded;
    when the two metal parts are arranged on different planes in space, projecting dot-shaped laser patterns onto the two metal parts;
    determining gap information between the two metal parts by a structured light detection method according to the dot-shaped laser patterns.
  2. The method according to claim 1, characterized in that the determining, by visual inspection, an arrangement relationship of two metal parts in a region to be welded comprises:
    performing image acquisition on the region to be welded to obtain a visual inspection image;
    performing image recognition on the visual inspection image to identify the two metal parts from the visual inspection image;
    determining the arrangement relationship of the two metal parts in the visual inspection image.
  3. The method according to claim 1, characterized in that the projecting dot-shaped laser patterns onto the two metal parts when the two metal parts are arranged on different planes in space comprises:
    making the number of the dot-shaped laser patterns acting on the surface of each of the two metal parts no fewer than two.
  4. The method according to claim 3, characterized in that the determining gap information between the two metal parts by a structured light detection method according to the dot-shaped laser patterns comprises:
    performing image acquisition on the two metal parts onto which the dot-shaped laser patterns are projected, to obtain a gap detection image;
    performing image recognition on the gap detection image to identify the dot-shaped laser patterns from the gap detection image;
    determining the gap information between the two metal parts according to position information of the dot-shaped laser patterns in the gap detection image.
  5. The method according to claim 4, characterized in that the determining the gap information between the two metal parts according to position information of the dot-shaped laser patterns in the gap detection image comprises:
    determining spatial coordinates of at least two reference points on the surface of each of the two metal parts according to the position information of the dot-shaped laser patterns in the gap detection image;
    determining a spatial line equation corresponding to each metal part according to the spatial coordinates of the at least two reference points on that metal part;
    determining the gap information between the two metal parts according to the spatial line equations of the metal parts.
  6. The method according to claim 5, characterized in that the determining the gap information between the two metal parts according to the spatial line equations of the metal parts comprises:
    calculating, according to the spatial line equations of the metal parts, the length of the common perpendicular segment between the straight lines corresponding to the spatial line equations;
    taking the difference between the length of the common perpendicular segment and a preset length threshold;
    determining the gap information between the two metal parts according to the difference information.
  7. The method according to claim 6, characterized in that the determining the gap information between the two metal parts according to the difference information comprises:
    if the difference between the length of the common perpendicular segment and the preset length threshold is greater than 0, determining that a gap exists between the two metal parts and taking the difference as the size of the gap;
    if the difference between the length of the common perpendicular segment and the preset length threshold equals 0, determining that no gap exists between the two metal parts.
  8. A visual welding system, characterized in that the system comprises:
    a visual inspection system for determining, by visual inspection, an arrangement relationship of two metal parts in a region to be welded;
    a laser projection device for projecting dot-shaped laser patterns onto two metal parts whose arrangement is determined to be on different planes in space;
    the visual inspection system further determining gap information between the two metal parts by a structured light detection method according to the dot-shaped patterns.
  9. The system according to claim 8, characterized in that the visual inspection system comprises:
    an image sensor for performing image acquisition on the region to be welded to obtain a visual inspection image;
    a processor for performing image recognition on the visual inspection image to identify the two metal parts from the visual inspection image, and determining the arrangement relationship of the two metal parts in the visual inspection image.
  10. The system according to claim 9, characterized in that the laser projection device comprises:
    a light source for projecting dot-shaped laser patterns onto the two metal parts respectively, so that the number of the dot-shaped laser patterns acting on the surface of each of the two metal parts is no fewer than two.
  11. The system according to claim 10, characterized in that the image sensor further performs image acquisition on the two metal parts onto which the dot-shaped laser patterns are projected, to obtain a gap detection image.
  12. The system according to claim 11, characterized in that the second processor further: performs image recognition on the gap detection image to identify the dot-shaped laser patterns from the gap detection image; and determines the gap information between the two metal parts according to position information of the dot-shaped laser patterns in the gap detection image.
  13. The system according to claim 12, characterized in that, when detection is performed with the dot-shaped laser patterns, the laser detection system determines spatial coordinates of at least two reference points on the surface of each of the two metal parts according to the position information of the dot-shaped laser patterns in the gap detection image; determines a spatial line equation corresponding to each metal part according to the spatial coordinates of the at least two reference points on that metal part; and determines the gap information between the two metal parts according to the spatial line equations of the metal parts.
  14. The system according to claim 13, characterized in that the determining the gap information between the two metal parts according to the spatial line equations of the metal parts comprises:
    calculating, according to the spatial line equations of the metal parts, the length of the common perpendicular segment between the straight lines corresponding to the spatial line equations;
    taking the difference between the length of the common perpendicular segment and a preset length threshold;
    determining the gap information between the two metal parts according to the difference information.
  15. The system according to claim 14, characterized in that the determining the gap information between the two metal parts according to the difference information comprises:
    if the difference between the length of the common perpendicular segment and the preset length threshold is greater than 0, determining that a gap exists between the two metal parts and taking the difference as the size of the gap;
    if the difference between the length of the common perpendicular segment and the preset length threshold equals 0, determining that no gap exists between the two metal parts.
  16. A computer storage medium, characterized in that it stores a program file capable of implementing the method according to any one of claims 1-7.
PCT/CN2018/103077 2018-08-29 2018-08-29 Gap detection method and system for visual welding system WO2020042032A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2018/103077 WO2020042032A1 (zh) 2018-08-29 2018-08-29 Gap detection method and system for visual welding system
CN201880087341.3A CN111630342B (zh) 2018-08-29 2018-08-29 Gap detection method and system for visual welding system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/103077 WO2020042032A1 (zh) 2018-08-29 2018-08-29 Gap detection method and system for visual welding system

Publications (1)

Publication Number Publication Date
WO2020042032A1 true WO2020042032A1 (zh) 2020-03-05

Family

ID=69644899

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/103077 WO2020042032A1 (zh) Gap detection method and system for visual welding system

Country Status (2)

Country Link
CN (1) CN111630342B (zh)
WO (1) WO2020042032A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113570654A (zh) * 2021-06-16 2021-10-29 上海工程技术大学 基于最小外接矩形的汽车表面缝隙尺寸检测方法及其应用
CN116067280A (zh) * 2022-12-30 2023-05-05 广东富华机械装备制造有限公司 集装箱焊接位置检测方法、装置、存储介质和设备

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113414762B (zh) * 2021-06-09 2024-05-03 配天机器人技术有限公司 焊道路径的偏移方法、装置、机器人及存储装置
KR20230134597A (ko) * 2021-12-29 2023-09-21 컨템포러리 엠퍼렉스 테크놀로지 씨오., 리미티드 머신 비전 검출 방법, 이의 검출 장치 및 검출 시스템
CN116882063A (zh) * 2023-07-24 2023-10-13 深圳市南方众悦科技有限公司 一种汽车配件的自适应选型分析方法及系统


Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5928608A (ja) * 1982-08-09 1984-02-15 Matsushita Electric Ind Co Ltd Welding line detection device
US4942539A (en) * 1988-12-21 1990-07-17 Gmf Robotics Corporation Method and system for automatically determining the position and orientation of an object in 3-D space
GB9116151D0 (en) * 1991-07-26 1991-09-11 Isis Innovation Three-dimensional vision system
JP2000088542A (ja) * 1998-09-09 2000-03-31 Mitsubishi Heavy Ind Ltd Soldering inspection device and inspection method
US7429999B2 (en) * 2004-05-24 2008-09-30 CENTRE DE RECHERCHE INDUSTRIELLE DU QUÉBEC Camera calibrating apparatus and method
CN1617009A (zh) * 2004-09-20 2005-05-18 深圳大学 Three-dimensional digital imaging method based on spatial lattice projection
CN104002021A (zh) * 2014-06-06 2014-08-27 哈尔滨工业大学 Device for automatic recognition and tracking of multi-layer multi-pass weld beads
CN104408732B (zh) * 2014-12-10 2017-07-28 东北大学 Large-field-of-view depth measurement system and method based on omnidirectional structured light
CN106382884A (zh) * 2016-08-18 2017-02-08 广东工业大学 Detection method for scanning a weld seam with a point light source
JP6279060B1 (ja) * 2016-12-02 2018-02-14 ジャパンマリンユナイテッド株式会社 Laser sensor and measurement method
CN106984926B (zh) * 2017-05-03 2018-07-06 武汉科技大学 Weld seam tracking system and weld seam tracking method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0724575A (ja) * 1993-07-13 1995-01-27 Sekisui Chem Co Ltd Method for measuring the distance of a gap
CN105203072A * 2014-06-23 2015-12-30 联想(北京)有限公司 Information processing method and electronic device
CN105571502A * 2015-12-29 2016-05-11 上海交通大学 Method for measuring the weld gap in friction stir welding
CN107824940A * 2017-12-07 2018-03-23 淮安信息职业技术学院 Weld seam visual tracking system and method based on laser structured light

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113570654A (zh) * 2021-06-16 2021-10-29 上海工程技术大学 基于最小外接矩形的汽车表面缝隙尺寸检测方法及其应用
CN116067280A (zh) * 2022-12-30 2023-05-05 广东富华机械装备制造有限公司 集装箱焊接位置检测方法、装置、存储介质和设备
CN116067280B (zh) * 2022-12-30 2023-11-14 广东富华机械装备制造有限公司 集装箱焊接位置检测方法、装置、存储介质和设备

Also Published As

Publication number Publication date
CN111630342A (zh) 2020-09-04
CN111630342B (zh) 2022-04-15

Similar Documents

Publication Publication Date Title
WO2020042032A1 (zh) Gap detection method and system for visual welding system
JP5122948B2 (ja) Apparatus and method for detecting a pointer corresponding to a touch surface
JP5713159B2 (ja) Three-dimensional position and orientation measurement device, method and program using stereo images
CN104424649B (zh) Method and system for detecting moving objects
WO2020206666A1 (zh) Depth estimation method and device based on speckle images, and face recognition system
US8913125B2 (en) Electronic device and method for regulating coordinates of probe measurement system
WO2021031781A1 (zh) Projection image calibration method and device, and projection apparatus
JP2015040856A (ja) 3D scanner
US20160125638A1 (en) Automated Texturing Mapping and Animation from Images
EP2551633B1 (en) Three dimensional distance measuring device and method
KR20090085160A (ko) Interactive input system and method
Guan et al. DeepMix: mobility-aware, lightweight, and hybrid 3D object detection for headsets
JP2015005181A (ja) Information processing device, determination method and determination program
Ziaei et al. Real-time markerless Augmented Reality for Remote Handling system in bad viewing conditions
US20180157328A1 (en) Calibration systems and methods for depth-based interfaces with disparate fields of view
CN114638795A (zh) Online measurement method and system with multiple structured-light measurement units
JP2008309595A (ja) Object recognition device and program used therefor
Rodrigues et al. An intelligent real time 3D vision system for robotic welding tasks
WO2020042030A1 (zh) Gap detection method and system for visual welding system
EP3051492B1 (en) Method and apparatus for determining disparity
US20220292717A1 (en) 3D Object Detection Using Random Forests
CN104457709A (zh) Distance detection method and electronic device
JP2015085434A (ja) Robot, image processing method and robot system
WO2020042031A1 (zh) Gap detection method and system for visual welding system
CN113804195B (zh) Information processing method and device, and indoor map positioning method and device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18932300

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 30.06.2021)

122 Ep: pct application non-entry in european phase

Ref document number: 18932300

Country of ref document: EP

Kind code of ref document: A1