WO2022116218A1 - A line structured light center extraction method for complex surfaces - Google Patents

A line structured light center extraction method for complex surfaces

Info

Publication number
WO2022116218A1
Authority
WO
WIPO (PCT)
Prior art keywords
stripe
line
center
intensity
area
Prior art date
Application number
PCT/CN2020/134128
Other languages
English (en)
French (fr)
Inventor
康连朋
赵昕玥
何再兴
Original Assignee
浙江大学 (Zhejiang University)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 浙江大学 (Zhejiang University)
Priority to US18/035,859 priority Critical patent/US20230401729A1/en
Priority to PCT/CN2020/134128 priority patent/WO2022116218A1/zh
Publication of WO2022116218A1 publication Critical patent/WO2022116218A1/zh

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/10 - Segmentation; Edge detection
    • G06T 7/13 - Edge detection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/50 - Depth or shape recovery
    • G06T 7/521 - Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/10 - Segmentation; Edge detection
    • G06T 7/181 - Segmentation; Edge detection involving edge growing; involving edge linking

Definitions

  • The invention relates to the technical field of computer-vision three-dimensional measurement and industrial automation, and specifically to a line structured light center extraction algorithm used to cope with complex environments and interference from the surfaces of complex parts when a line structured light scanning method is applied to three-dimensional reconstruction, three-dimensional measurement and other applications.
  • Center extraction of the line structured light stripe is a key step of the line structured light scanning process and directly affects the accuracy of the entire system; in particular, when the surface of the measured object is dark in color, strongly reflective, or subject to strong background interference, the accuracy and robustness of stripe center extraction are poor.
  • Existing line structured light center extraction algorithms mainly address cases where background interference is weak and focus on optimizing processing speed and accuracy, while few algorithms handle center extraction on complex object surfaces. As a result, line structured light scanning cannot be used industrially in complex situations, which greatly limits its industrial application scenarios.
  • The present invention considers the characteristics of the light stripe from the overall correlation of the line structured light scanning image and quantitatively characterizes the stripe features by which the human eye judges line structured light stripes, so that good accuracy and robustness can still be achieved in complex situations, expanding the application scenarios of line structured light scanning.
  • the present invention adopts the following technical solutions:
  • A line structured light center extraction method for complex surfaces comprises the following steps:
  • Step 1: Combine the extremum method and the connected-domain filtering method to process the line structured light image and determine accurate center points of the laser stripe as seed points of the center extraction algorithm.
  • Step 2: At each seed point or node position, calculate the intensity characterization value and the direction deviation characterization value from the image gray values over the half-circle of 180° directions, establish a score function to judge the optimal stripe direction, determine the optimal line fitting length, and thereby determine the next node position.
  • Step 3: Perform growth extraction over all nodes and stop the iteration when the image gray value drops sharply, obtaining the centerline of the line structured light stripe. Finally, apply the gray-scale centroid method to a local area around the obtained centerline to obtain the precise center points of the light stripe.
  • In step 1, the extremum method yields the maximum-gray pixel points of the light stripe; by computing the areas of eight-connected domains, small interference regions and reflective regions whose width does not match the stripe features are filtered out; connected domains that satisfy the stripe region features are taken as light stripe regions, the seed points processed by the algorithm are selected from them, and iterative line-segment fitting is performed upward and downward respectively.
  • In step 2, first calculate the average gray value V_L of the pixels adjacent to a straight line of length L in the θ direction starting from the seed point, and obtain the intensity characterization value V_θ in this direction by weighted averaging, as shown in formula (1), where the weight is f_L = R - l and, based on the stripe width, the constant R is set to 20.
  • V_θ characterizes the difference between the intensity feature in this direction and that of the background direction: the larger the intensity characterization value, the greater the difference from the background and the more likely this is the stripe direction.
  • To cope with laser stripes that are too dark on partially dark surfaces, a directional continuity characteristic value θ_dcv is established to represent the change of stripe direction between two consecutive growth steps; the smaller the direction change, the stronger the continuity of the laser stripe and the more likely this is the stripe direction.
  • Finally, a score function is established and evaluated at all extreme points of the intensity characterization values, and the stripe direction corresponding to the maximum of the score function is selected as the growth direction, as shown in formula (2).
  • In step 3, a single seed point can grow into a complete light stripe skeleton; when the gray value of the processed region becomes extremely low, or the intensity characterization value decays too sharply between two consecutive iterations, the loop is stopped and the centerline of the light stripe is obtained.
  • In the local neighborhood of the obtained stripe centerline, the precise center points of the laser stripe are calculated by the gray-scale centroid method, where
  • I is the laser stripe image,
  • I(i, j) is the gray value of the pixel in row i and column j, and
  • (i_0, j_0) are the pixel coordinates of the obtained centerline.
  • The precise center-point coordinate of row i_0 is calculated by formula (3).
  • the present invention has the following beneficial effects:
  • During line structured light scanning, the invention can effectively reduce interference from complex-textured part surfaces, dark part surfaces, reflective part surfaces and the background environment, giving the light stripe center extraction process better robustness and stability and providing a better stripe center extraction scheme.
  • Fig. 1 is the flow chart of the present invention.
  • Fig. 2 is a schematic diagram of the calculation of the intensity characteristic value.
  • Fig. 3 is a schematic diagram of the variation of the intensity characteristic value.
  • Fig. 4, Fig. 5 and Fig. 6 are schematic diagrams of the results of the center extraction method of the present invention.
  • The flow chart of the present invention is shown in Fig. 1.
  • First, the line structured light image is processed by combining the extremum method and the connected-domain filtering method with a strong filtering strategy; the connected-domain region that best matches the laser stripe features is selected, and its center point is taken as a reliable center point of the laser stripe, i.e. the seed point.
  • The average gray value of the pixels adjacent to a straight line of length L in the θ direction starting from seed point P is calculated, and the intensity characterization value in this direction is obtained by weighted averaging; since the intensity along the laser stripe direction differs markedly from that of the background region, the extreme points of the intensity characteristic values are selected as candidate stripe directions, as shown in Fig. 3.
  • The score function over the candidate stripe directions is then calculated according to formula (2), the optimal stripe direction is determined, and the next node is determined for successive growth extraction.
  • Figs. 4, 5 and 6 show the center extraction results for different experimental objects. It can be seen that, in the presence of reflective object surfaces, overly dark laser brightness and complex surface textures, the proposed method can effectively filter out the interference caused by complex surfaces and achieve accurate and robust center extraction, meeting the requirements of line structured light three-dimensional measurement in complex situations.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A line structured light center extraction method for complex object surfaces. First, reliable light stripe seed point positions are selected using the extremum method and a connected-domain filtering algorithm. Then, at each seed point position, an intensity characterization value and a deviation characterization value are calculated from the gray values over the half-circle of 180° directions and combined by weighting into a score function for each direction at that position; the score function is used to judge the stripe direction and the optimal fitting segment length at that position and to generate the next node position, and the centerline of the laser stripe is extracted by successive growth. Finally, the gray-scale centroid method is applied in a local region around the centerline to extract the complete set of stripe center pixels. The method quantifies the features by which the human eye discriminates the light stripe, improves on the row-wise or local-region center extraction of traditional methods, and realizes stripe extraction from a global perspective, so that the stripe center pixels can be extracted completely under complex interference.

Description

A line structured light center extraction method for complex surfaces
Technical Field
The present invention relates to the technical field of computer-vision three-dimensional measurement and industrial automation, and specifically to a line structured light center extraction algorithm used to cope with complex environments and interference from the surfaces of complex parts when a line structured light scanning method is applied to three-dimensional reconstruction, three-dimensional measurement and other applications.
Background Art
At present, many application scenarios in industrial automation, such as grasping randomly stacked objects and three-dimensional measurement of product data, require acquiring three-dimensional point cloud information of objects. As a principle-level method for acquiring three-dimensional point cloud data, line structured light scanning has advantages over area structured light scanning such as higher accuracy and lower sensitivity to reflective objects.
In practical applications, center extraction of the line structured light stripe is a key step of the line structured light scanning process and directly affects the accuracy of the whole system; in particular, when the surface of the measured object is dark in color, strongly reflective, or subject to strong background interference, the accuracy and robustness of stripe center extraction are poor.
Existing line structured light center extraction algorithms mainly study how to optimize processing speed and accuracy when background interference is weak, while few address center extraction on complex object surfaces, so that line structured light scanning cannot be used industrially in complex situations, which greatly limits specific industrial application scenarios.
For application scenarios with interference from complex object surfaces, the present invention considers the characteristics of the light stripe from the overall correlation of the line structured light scanning image and quantitatively characterizes the stripe features by which the human eye judges line structured light stripes, so that good accuracy and robustness can still be achieved in complex situations, expanding the application scenarios of line structured light scanning.
Summary of the Invention
To solve the problem that, in complex scenes, false band-like stripe regions caused by complex textures and dark stripe regions caused by dark surfaces severely degrade stripe center extraction, and that it is difficult to filter out false stripes and detect the real stripe from a local perspective, the present invention proposes a line structured light center extraction method based on laser stripe feature estimation, which filters out interference well and extracts the center pixels of the line structured light stripe accurately and stably.
To solve the above technical problems, the present invention adopts the following technical solution:
A line structured light center extraction method for complex surfaces, characterized by comprising the following steps:
Step 1: Combine the extremum method and the connected-domain filtering method to process the line structured light image and determine accurate center points of the laser stripe as seed points of the center extraction algorithm.
Step 2: At each seed point or node position, calculate the intensity characterization value and the direction deviation characterization value from the image gray values over the half-circle of 180° directions, establish a score function to judge the optimal stripe direction, and determine the optimal line fitting length, thereby determining the next node position.
Step 3: Perform growth extraction over all nodes and stop the iteration when the image gray value drops sharply, obtaining the centerline of the line structured light stripe; finally, apply the gray-scale centroid method to a local area around the obtained centerline to obtain the precise center points of the light stripe.
In step 1, the extremum method yields the maximum-gray pixel points of the light stripe; by computing the areas of eight-connected domains, small interference regions and reflective regions whose width does not match the stripe features are filtered out; connected domains that satisfy the stripe region features are taken as light stripe regions, the seed points processed by the algorithm are selected, and iterative line-segment fitting is performed upward and downward respectively.
Through the above steps, small interference regions and large reflective regions can be removed, while some false stripe regions that satisfy the features are treated as alternative seed points; after centerline extraction, the multiple side-by-side centerlines grown from different seed points are judged and evaluated according to their lengths and their connections with other confirmed centerlines, and the best centerline is determined, so that the centerline is extracted correctly even when erroneous seed points occur.
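The following Python sketch illustrates step 1 under stated assumptions: it uses OpenCV's 8-connected component statistics, and the thresholds (minimum area, maximum width, intensity threshold) as well as the row-wise choice of extrema are illustrative values that are not specified in the patent.

```python
import cv2
import numpy as np

def select_seed_points(img, min_area=50, max_width=40, intensity_thresh=200):
    """Sketch of step 1: extremum detection plus connected-domain filtering.

    min_area, max_width and intensity_thresh are illustrative assumptions;
    the patent only states that small regions and regions whose width does
    not match the stripe features are filtered out.
    """
    # Extremum method: keep the row-wise gray-level maximum above a threshold.
    peaks = np.zeros_like(img, dtype=np.uint8)
    rows = np.arange(img.shape[0])
    cols = img.argmax(axis=1)
    keep = img[rows, cols] >= intensity_thresh
    peaks[rows[keep], cols[keep]] = 255

    # Dilate so neighboring maxima merge into candidate stripe regions.
    candidates = cv2.dilate(peaks, np.ones((3, 3), np.uint8))

    # Eight-connected-domain analysis: drop small interference regions and
    # over-wide reflective regions, keep centroids of the rest as seed points.
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(
        candidates, connectivity=8)
    seeds = []
    for k in range(1, n):  # label 0 is the background
        area = stats[k, cv2.CC_STAT_AREA]
        width = stats[k, cv2.CC_STAT_WIDTH]
        if area >= min_area and width <= max_width:
            x, y = centroids[k]
            seeds.append((int(round(y)), int(round(x))))  # (row, col)
    return seeds
```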
In step 2, first calculate the average gray value V_L of the pixels adjacent to a straight line of length L in the θ direction starting from the seed point, and obtain the intensity characterization value V_θ in this direction by weighted averaging, as shown in formula (1):
[Formula (1): equation rendered as an image in the original publication]
where f_L = R - l,
[equation rendered as an image in the original publication]
and, based on the stripe width, the constant R is set to 20. V_θ characterizes the difference between the intensity feature in this direction and that of the background direction: the larger the intensity characterization value, the greater the difference from the background and the more likely this is the stripe direction.
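The equation image for formula (1) is not reproduced in this text. One plausible reading consistent with the surrounding definitions (average gray values V_l sampled at distances l = 1, ..., R along direction θ, weights f_l = R - l, R = 20) is the normalized weighted average below; this reconstruction is an assumption, not the published equation:

$$ V_\theta = \frac{\sum_{l=1}^{R} f_l \, V_l}{\sum_{l=1}^{R} f_l}, \qquad f_l = R - l, \quad R = 20. $$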
To cope with laser stripes that are too dark on partially dark surfaces, a directional continuity characteristic value θ_dcv is established to represent the change of stripe direction between two consecutive growth steps; the smaller the direction change, the stronger the continuity of the laser stripe and the more likely this is the stripe direction. Finally, a score function is established and evaluated at all extreme points of the intensity characteristic values, and the stripe direction corresponding to the maximum of the score function is selected as the growth direction, as shown in formula (2):
[Formula (2): equation rendered as an image in the original publication]
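Because formula (2) is likewise an image, the exact combination of V_θ and θ_dcv is not recoverable here. The sketch below shows one assumed form of the scored growth step (intensity damped by the direction change); the segment length, the angular sampling and the continuity weight alpha are illustrative assumptions rather than values from the patent:

```python
import numpy as np

def grow_centerline(img, seed, step=10, alpha=0.5, max_nodes=500):
    """Sketch of the scored growth extraction (cf. formulas (1) and (2)).

    step (segment length L), alpha (continuity weight) and the 5-degree
    angular sampling are illustrative assumptions; only the control flow
    follows the text: score candidate directions by intensity and direction
    continuity, and stop when the intensity falls to the background level.
    """
    h, w = img.shape
    background = float(np.median(img))
    thetas = np.deg2rad(np.arange(0, 180, 5))   # half-circle of directions
    nodes = [seed]                               # seed is (row, col)
    theta_prev = np.pi / 2                       # assume a roughly vertical stripe

    def mean_intensity(y, x, theta):
        """Average gray value along a segment of length `step` in direction theta."""
        ls = np.arange(1, step + 1)
        ys = np.clip(np.round(y - ls * np.sin(theta)).astype(int), 0, h - 1)
        xs = np.clip(np.round(x + ls * np.cos(theta)).astype(int), 0, w - 1)
        return float(img[ys, xs].mean())

    while len(nodes) < max_nodes:
        y, x = nodes[-1]
        v = np.array([mean_intensity(y, x, t) for t in thetas])
        if v.max() <= background:                # weaker than the background: stop
            break
        dcv = np.minimum(np.abs(thetas - theta_prev),
                         np.pi - np.abs(thetas - theta_prev))
        score = v * np.exp(-alpha * dcv)         # assumed form of formula (2)
        theta = thetas[int(score.argmax())]
        nodes.append((int(round(y - step * np.sin(theta))),
                      int(round(x + step * np.cos(theta)))))
        theta_prev = theta
    return nodes
```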
In step 3, a single seed point can grow into a complete light stripe skeleton; when the gray value of the processed region becomes extremely low, or the intensity characterization value decays too sharply between two consecutive iterations, the loop is stopped and the centerline of the light stripe is obtained.
In the local neighborhood of the obtained stripe centerline, the precise center points of the laser stripe are calculated by the gray-scale centroid method. Suppose I is the laser stripe image, I(i, j) is the gray value of the pixel in row i and column j, and (i_0, j_0) are the pixel coordinates of the obtained centerline; the precise center-point coordinate of row i_0 is calculated by formula (3):
[Formula (3): equation rendered as an image in the original publication]
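Formula (3) is also rendered as an image. Given the definitions of I(i, j) and (i_0, j_0) above, the standard gray-scale centroid over a local window around the centerline pixel would read as below; the window half-width w is an assumption, since the text only refers to a local neighborhood:

$$ j_0^{*} = \frac{\sum_{j=j_0-w}^{j_0+w} j \cdot I(i_0, j)}{\sum_{j=j_0-w}^{j_0+w} I(i_0, j)} $$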
By adopting the above technical solution, the present invention has the following beneficial effects:
During line structured light scanning, the present invention can effectively reduce interference from complex-textured part surfaces, dark part surfaces, reflective part surfaces and the background environment, giving the light stripe center extraction process better robustness and stability and providing a better stripe center extraction scheme.
Brief Description of the Drawings
The present invention is further described below with reference to the drawings.
Fig. 1 is the flow chart of the present invention;
Fig. 2 is a schematic diagram of the calculation of the intensity characteristic value;
Fig. 3 is a schematic diagram of the variation of the intensity characteristic value;
Fig. 4, Fig. 5 and Fig. 6 are schematic diagrams of the results of the center extraction method of the present invention.
Detailed Description of the Embodiments
The present invention is further described below with reference to the drawings and embodiments. The flow chart of the present invention is shown in Fig. 1.
First, the line structured light image is processed by combining the extremum method and the connected-domain filtering method with a strong filtering strategy; the connected-domain region that best matches the laser stripe features is selected, and its center point is taken as a reliable center point of the laser stripe, i.e. the seed point.
As shown in Fig. 2, the average gray value of the pixels adjacent to a straight line of length L in the θ direction starting from seed point P is calculated, and the intensity characterization value in this direction is obtained by weighted averaging; since the intensity along the laser stripe direction differs markedly from that of the background region, the extreme points of the intensity characteristic values are selected as candidate stripe directions, as shown in Fig. 3. The score function over the candidate stripe directions is then calculated according to formula (2), the optimal stripe direction is determined, and the next node is determined for successive growth extraction. When either of the following conditions is met: 1) after smoothing and filtering, the intensity-difference characteristic values have no obvious extreme point; or 2) the intensity characteristic value is smaller than the background intensity, the growth extraction algorithm is stopped and the centerline of the laser stripe is obtained.
Finally, the obtained centerlines are filtered so that a unique centerline is determined along each row direction, removing interference from the complex object surface, and the center points of the laser stripe are output after applying the gray-scale centroid method of formula (3).
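As a usage illustration of this last step, the short sketch below refines one centerline pixel per row with the gray-scale centroid of its local neighborhood; the half-window of 7 pixels is an assumed value.

```python
import numpy as np

def refine_center(img, i0, j0, w=7):
    """Gray-scale centroid refinement of a centerline pixel (cf. formula (3)).

    The half-window w is an assumption; the text only states that a local
    neighborhood of the centerline is used.
    """
    js = np.arange(max(j0 - w, 0), min(j0 + w + 1, img.shape[1]))
    weights = img[i0, js].astype(float)
    if weights.sum() == 0:
        return float(j0)
    return float((js * weights).sum() / weights.sum())
```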
Figs. 4, 5 and 6 show the center extraction results for different experimental objects. It can be seen that, in the presence of reflective object surfaces, overly dark laser brightness and complex interference from the object's surface texture, the proposed method can effectively filter out the interference caused by complex surfaces and achieve accurate and robust center extraction, meeting the requirements of line structured light three-dimensional measurement in complex situations.
The above are only specific embodiments of the present invention, and the technical features of the present invention are not limited thereto. Any simple change, equivalent substitution or modification made on the basis of the present invention to solve substantially the same technical problem and achieve substantially the same technical effect falls within the protection scope of the present invention.

Claims (2)

  1. A line structured light center extraction method for complex surfaces, characterized by comprising the following steps:
    Step 1: Combine the extremum method and the connected-domain filtering method to process the line structured light image and determine accurate and reliable center points of the laser stripe as seed points of the center extraction algorithm. The extremum method yields the maximum-gray pixel points of the light stripe; by computing the areas of eight-connected domains, small interference regions and reflective regions whose width does not match the stripe features are filtered out; connected domains that satisfy the stripe region features are taken as light stripe regions, and the seed points processed by the algorithm are selected. Through the above steps, small interference regions and large reflective regions can be removed, while some false stripe regions that satisfy the features are treated as alternative seed points. After centerline extraction, the multiple side-by-side centerlines grown from different seed points are judged and evaluated according to their lengths and their connections with other confirmed centerlines, and the best centerline is determined, so that the centerline is extracted correctly even when erroneous seed points occur;
    Step 2: At each seed point or node position, calculate the intensity characterization value and the direction deviation characterization value from the image gray values over the half-circle of 180° directions, establish a score function to judge the optimal stripe direction, and determine the optimal line fitting length, thereby determining the next node position. Based on the characteristics of the bright band-like region of the line laser, and to express the intensity difference between the light stripe and the background region, an intensity characterization value V_θ is established and its extreme points are selected to represent the highlighted features of the light stripe. To cope with laser stripes that are too dark on partially dark surfaces, a directional continuity characteristic value θ_dcv is established to represent the change of stripe direction between two consecutive growth steps; the smaller the direction change, the stronger the continuity of the laser stripe and the more likely this is the stripe direction. Finally, a score function S is established and evaluated at all extreme points of the intensity characterization values, and the stripe direction corresponding to the maximum of S is selected as the growth direction, as given by the formula:
    [Formula: equation rendered as an image in the original publication]
    Step 3: Perform growth extraction over all nodes and stop the iteration when the image gray value drops sharply, obtaining the centerline of the line structured light stripe; finally, apply the gray-scale centroid method to a local area around the obtained centerline to obtain the precise center points of the light stripe.
  2. The line structured light center extraction method for complex surfaces according to claim 1, characterized in that: in step 3, a single seed point can grow into a complete light stripe skeleton; when either of the following conditions is met: 1) after smoothing and filtering, the intensity-difference characteristic values have no obvious extreme point; or 2) the intensity characteristic value is smaller than the background intensity, the growth extraction algorithm is stopped and the centerline of the laser stripe is obtained. After uniqueness filtering of the side-by-side centerlines grown from different seed points, the precise center points of the laser stripe are calculated in the local neighborhood of the obtained stripe centerline using the gray-scale centroid method.
PCT/CN2020/134128 2020-12-05 2020-12-05 Line structured light center extraction method for complex surfaces WO2022116218A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US18/035,859 US20230401729A1 (en) 2020-12-05 2020-12-05 Line structured light center extraction method for complicated surfaces
PCT/CN2020/134128 WO2022116218A1 (zh) 2020-12-05 2020-12-05 Line structured light center extraction method for complex surfaces

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/134128 WO2022116218A1 (zh) 2020-12-05 2020-12-05 Line structured light center extraction method for complex surfaces

Publications (1)

Publication Number Publication Date
WO2022116218A1 true WO2022116218A1 (zh) 2022-06-09

Family

ID=81853695

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/134128 WO2022116218A1 (zh) 2020-12-05 2020-12-05 Line structured light center extraction method for complex surfaces

Country Status (2)

Country Link
US (1) US20230401729A1 (zh)
WO (1) WO2022116218A1 (zh)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09280837A (ja) * 1996-04-18 1997-10-31 Nippon Steel Corp Binarization method for fringe-pattern projection images for shape measurement
KR100684630B1 (ko) * 2006-01-03 2007-02-22 삼성중공업 주식회사 Image processing method for weld seam tracking
CN104657587A (zh) * 2015-01-08 2015-05-27 华中科技大学 Method for extracting the centerline of a laser stripe
CN108592823A (zh) * 2017-12-04 2018-09-28 湖南大学 Decoding method based on binocular-vision color fringe coding
CN111325831A (zh) * 2020-03-04 2020-06-23 中国空气动力研究与发展中心超高速空气动力研究所 Color structured light stripe detection method based on hierarchical clustering and belief propagation
CN112037201A (zh) * 2020-08-31 2020-12-04 河北工程大学 Directed-growth structured light center extraction method based on line structured light images

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116074633A (zh) * 2023-03-06 2023-05-05 宜科(天津)电子有限公司 Automatic multiple exposure method
CN115953459A (zh) * 2023-03-10 2023-04-11 齐鲁工业大学(山东省科学院) Laser stripe centerline extraction method under complex illumination conditions
CN116433707A (zh) * 2023-06-14 2023-07-14 武汉工程大学 Method and system for accurate sub-pixel extraction of line structured light centers against complex backgrounds
CN116433707B (zh) * 2023-06-14 2023-08-11 武汉工程大学 Method and system for accurate sub-pixel extraction of line structured light centers against complex backgrounds
CN117237434A (zh) * 2023-11-15 2023-12-15 太原理工大学 H-beam dimension measurement method and device
CN117237434B (zh) * 2023-11-15 2024-02-09 太原理工大学 H-beam dimension measurement method and device

Also Published As

Publication number Publication date
US20230401729A1 (en) 2023-12-14

Similar Documents

Publication Publication Date Title
WO2022116218A1 (zh) Line structured light center extraction method for complex surfaces
CN101398886B Rapid three-dimensional face recognition method based on binocular passive stereo vision
CN103226821B Stereo matching method based on disparity-map pixel classification correction and optimization
CN102236794B Recognition and pose determination of 3D objects in 3D scenes
CN106530347B Stable high-performance circle feature detection method
CN110766669B Pipeline measurement method based on multi-view vision
CN104850850A Binocular stereo vision image feature extraction method combining shape and color
CN108765476A Polarization image registration method
CN108596975A Stereo matching algorithm for weakly textured regions
CN103727930A Method for calibrating the relative pose of a laser rangefinder and a camera based on edge matching
CN113313815A Real-time three-dimensional reconstruction method for objects grasped by a robotic arm
CN105913013A Binocular vision face recognition algorithm
CN109636790B Method and device for recognizing pipeline structures
CN109993747A Fast image matching method fusing point and line features
CN105513094A Stereo vision tracking method and system based on three-dimensional Delaunay triangulation
CN112085675A Depth image denoising method, foreground segmentation method and human motion monitoring method
CN107610174B Robust plane detection method and system based on depth information
Stentoumis et al. A local adaptive approach for dense stereo matching in architectural scene reconstruction
CN111161308A Dual-band fusion target extraction method based on keypoint matching
Li et al. A center-line extraction algorithm of laser stripes based on multi-Gaussian signals fitting
Xi et al. Research on the algorithm of noisy laser stripe center extraction
Borisagar et al. A novel segment-based stereo matching algorithm for disparity map generation
CN113834447B Adaptive imaging processing method for high-dynamic laser stripes in complex outdoor environments
CN114283199A Point-line fusion semantic SLAM method for dynamic scenes
Zhao et al. Steel plate surface defect recognition method based on depth information

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20964063

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20964063

Country of ref document: EP

Kind code of ref document: A1