WO2022206161A1 - Real-time detection method of block motion based on feature point recognition - Google Patents

Real-time detection method of block motion based on feature point recognition

Info

Publication number
WO2022206161A1
Authority
WO
WIPO (PCT)
Prior art keywords
feature point
block
pixel
real
feature
Prior art date
Application number
PCT/CN2022/074244
Other languages
English (en)
French (fr)
Inventor
Chen Songgui
Chen Hanbao
Zhang Huaqing
Gao Linchun
Zhao Xu
Hu Chuanqi
Peng Cheng
Wang Yina
Ma Jun
Tan Zhonghua
Luan Yingni
Original Assignee
Tianjin Research Institute for Water Transport Engineering, Ministry of Transport
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin Research Institute for Water Transport Engineering, Ministry of Transport
Priority to US17/892,232 priority Critical patent/US20220414901A1/en
Publication of WO2022206161A1 publication Critical patent/WO2022206161A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/223Analysis of motion using block-matching
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/254Analysis of motion involving subtraction of images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/35Determination of transform parameters for the alignment of images, i.e. image registration using statistical methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects

Definitions

  • the invention belongs to the field of ocean engineering, and in particular relates to a real-time detection method of block motion based on feature point identification.
  • damage to a breakwater is usually measured by conditions such as the movement and fracture of its protective (armor) blocks; it is therefore necessary to study laboratory devices for detecting block motion.
  • Commonly used block motion detection methods include visual inspection, close-up photography, and photogrammetry from a fixed position. Visual inspection is useful for checking specific damage: the number of broken and displaced blocks in each section of the breakwater can be checked visually, but the method is time-consuming and unsuitable for inspecting an entire breakwater. Close-up photography merely documents the visual inspection results and is useful for examining localized damage in detail. Photogrammetry from a fixed position, producing overlapping photographs that cover the entire above-water condition of the breakwater, is the most useful and cost-effective breakwater inspection method. Analyzing the state of the breakwater from images acquired before and after a test is feasible; however, detecting and extracting targets in images is computationally demanding and therefore cannot be used in systems with limited computing power. Moreover, all of the above methods analyze the motion and displacement of the blocks only qualitatively; none achieves quantitative or real-time measurement.
  • the present invention proposes a real-time detection method of block motion based on feature point recognition to solve the problem that existing detection methods can analyze the motion and displacement of blocks only qualitatively and achieve neither quantitative nor real-time measurement.
  • a real-time detection method for block motion based on feature point recognition comprising the following steps:
  • S2 Digital images of the blocks on the breakwater surface are captured by the camera, and the digital images are sent to a digital signal processing system based on a field-programmable gate array (FPGA);
  • S3 The digital signal processing system performs feature point detection on the blocks in the image
  • S5 Compare the change of the position of the feature points of the block before and after the test, and obtain the change value of the feature point by taking the difference of the coordinates of the two images before and after the test, so as to obtain the displacement of the block;
  • step S3 includes the following steps:
  • S36 Determine the matching degree of two feature points by calculating the distance between them; the shorter the distance between the two feature points, the higher the matching degree;
  • S37 Screen the feature points corresponding to each block, and retain the feature point with the highest matching degree to represent the block, thereby completing the feature point detection.
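The distance-based matching and screening of steps S36 and S37 above can be sketched as follows. This is an illustrative NumPy sketch, not the patent's FPGA implementation; the descriptor arrays and the block grouping are assumed inputs.

```python
import numpy as np

def match_feature_points(desc_before: np.ndarray, desc_after: np.ndarray):
    """Step S36: judge the matching degree of two feature points by the
    distance between their descriptors (shorter distance = better match).
    Returns, for each 'before' descriptor, the index of the closest
    'after' descriptor and that distance."""
    dists = np.linalg.norm(desc_before[:, None, :] - desc_after[None, :, :], axis=2)
    best = dists.argmin(axis=1)
    return best, dists[np.arange(len(best)), best]

def best_point_per_block(block_ids, match_dists):
    """Step S37: among the feature points belonging to one block, keep only
    the point with the highest matching degree (smallest distance)."""
    kept = {}
    for idx, (bid, d) in enumerate(zip(block_ids, match_dists)):
        if bid not in kept or d < match_dists[kept[bid]]:
            kept[bid] = idx
    return kept  # block id -> index of its representative feature point
```

In practice a ratio test or mutual-consistency check would usually be added, but the patent text only specifies the shortest-distance criterion.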
  • the method of generating the edge points of the digital image with the Hessian matrix used in step S31 is as follows:
  • H(f(x, y)) = \begin{pmatrix} \frac{\partial^2 f}{\partial x^2} & \frac{\partial^2 f}{\partial x \partial y} \\ \frac{\partial^2 f}{\partial x \partial y} & \frac{\partial^2 f}{\partial y^2} \end{pmatrix}, where f(x, y) is the pixel value of the image
  • the discriminant of the Hessian matrix is: \det(H) = \frac{\partial^2 f}{\partial x^2} \cdot \frac{\partial^2 f}{\partial y^2} - \left( \frac{\partial^2 f}{\partial x \partial y} \right)^2
  • when the discriminant of the Hessian matrix attains a local maximum, the current point can be judged to be brighter or darker than the other points in its surrounding neighborhood, and this point is the position of a feature point.
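As a concrete illustration of the Hessian response described above, the sketch below approximates the second derivatives with central finite differences. SURF-style detectors actually use box filters over an integral image, so this is a simplified stand-in rather than the patented implementation.

```python
import numpy as np

def hessian_det_response(img: np.ndarray) -> np.ndarray:
    """Approximate det(H) = f_xx * f_yy - f_xy**2 at every pixel using
    central finite differences of the pixel values f(x, y)."""
    f = img.astype(np.float64)
    f_xx = np.zeros_like(f)
    f_yy = np.zeros_like(f)
    f_xy = np.zeros_like(f)
    f_xx[:, 1:-1] = f[:, 2:] - 2.0 * f[:, 1:-1] + f[:, :-2]
    f_yy[1:-1, :] = f[2:, :] - 2.0 * f[1:-1, :] + f[:-2, :]
    f_xy[1:-1, 1:-1] = (f[2:, 2:] - f[2:, :-2] - f[:-2, 2:] + f[:-2, :-2]) / 4.0
    # a local maximum of this response marks a point brighter or darker
    # than its neighborhood, i.e. a candidate feature point position
    return f_xx * f_yy - f_xy ** 2
```

A point that is an isolated bright or dark spot produces a strong positive response, matching the local-maximum criterion in the text.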
  • in the Gaussian pyramid constructed in step S32, the size of the image remains unchanged; only the size and scale of the Gaussian blur template are changed.
  • the steps of computing the Haar wavelet features in the neighborhood of a pixel used in step S34 are as follows:
  • S342 Assign a Gaussian weight coefficient to the Haar wavelet response value, so that the contribution of the response close to the feature point is larger than that of the response far away from the feature point;
  • S344 Traverse the entire area, and select the direction of the longest vector as the main direction of the feature point.
  • the feature point descriptor generated in step S35 is obtained by taking a 4*4 grid of rectangular sub-regions around the feature point along its main direction and computing, for the pixels of each sub-region, the Haar wavelet features in the horizontal and vertical directions, namely the sum of the horizontal values, the sum of the horizontal absolute values, the sum of the vertical values, and the sum of the vertical absolute values.
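The descriptor construction above can be sketched as follows, assuming the patch around the feature point has already been rotated to its main direction. The simple difference filters stand in for the true Haar wavelets, and the patch size is an arbitrary choice for illustration.

```python
import numpy as np

def surf_like_descriptor(patch: np.ndarray) -> np.ndarray:
    """Build a 4*4*4 = 64-dimensional descriptor from a square patch:
    split it into a 4x4 grid of sub-regions and, per sub-region, record
    (sum dx, sum |dx|, sum dy, sum |dy|) of Haar-like responses."""
    p = patch.astype(np.float64)
    dx = p[:, 1:] - p[:, :-1]   # horizontal Haar-like response
    dy = p[1:, :] - p[:-1, :]   # vertical Haar-like response
    # crop both response maps to a common square size divisible by 4
    n = (min(dx.shape[0], dx.shape[1], dy.shape[0], dy.shape[1]) // 4) * 4
    dx, dy = dx[:n, :n], dy[:n, :n]
    s = n // 4
    feats = []
    for i in range(4):
        for j in range(4):
            cx = dx[i * s:(i + 1) * s, j * s:(j + 1) * s]
            cy = dy[i * s:(i + 1) * s, j * s:(j + 1) * s]
            feats += [cx.sum(), np.abs(cx).sum(), cy.sum(), np.abs(cy).sum()]
    return np.asarray(feats)
```

Keeping both the signed sums and the absolute sums lets the descriptor distinguish uniform gradients from oscillating texture, which is the point of the four per-region statistics named in the text.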
  • the coordinate transformation used in step S4 is as follows:
  • establish the pixel coordinate system o_0uv, where o_0 is the origin of the pixel coordinate system and (u_0, v_0) is the pixel coordinate of the center of the image plane; establish the physical coordinate system o_1xy, where o_1 is the origin of the physical coordinate system
  • u = \frac{x}{dx} + u_0
  • v = \frac{y}{dy} + v_0
  • dx is the physical size of each pixel in the u-axis direction, and dy is the physical size of each pixel in the v-axis direction.
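The coordinate transformation of step S4 and the displacement computation of step S5 can be sketched together: inverting u = x/dx + u_0 and v = y/dy + v_0 maps pixel coordinates to physical coordinates, and the before/after difference gives the block displacement. The numeric pixel sizes in the example are assumptions, not values from the patent.

```python
def pixel_to_physical(u, v, u0, v0, dx, dy):
    """Step S4: invert u = x/dx + u0 and v = y/dy + v0 to map a pixel
    coordinate (u, v) to physical image-plane coordinates (x, y)."""
    return (u - u0) * dx, (v - v0) * dy

def block_displacement(pt_before, pt_after, u0, v0, dx, dy):
    """Step S5: displacement of a block = difference of the physical
    coordinates of its feature point before and after the test."""
    xb, yb = pixel_to_physical(*pt_before, u0, v0, dx, dy)
    xa, ya = pixel_to_physical(*pt_after, u0, v0, dx, dy)
    return xa - xb, ya - yb
```

Because the camera is fixed, (u_0, v_0), dx and dy cancel out of nothing here; they scale the pixel difference into physical units, which is what makes the measurement quantitative rather than merely visual.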
  • the calibration of the camera in step S1 adopts Zhang Zhengyou's calibration method.
  • the present invention has the following beneficial effects:
  • the invention proposes a real-time detection device for block motion based on feature point recognition, which uses a remotely controlled digital camera to send grayscale pixels and a block motion measurement algorithm, implemented in a digital signal processing system based on a field-programmable gate array, to analyze the grayscale pixels and measure the motion of the laboratory blocks.
  • the block motion measurement algorithm builds a Gaussian pyramid. The images in different groups have the same size; what differs is that the template size of the box filter increases from group to group, while the images of different layers within the same group use a filter of the same size but with a gradually increasing scale-space factor, thereby achieving scale invariance. Before generating the feature descriptor, the algorithm rotates the image to the main direction, which guarantees that a feature point always generates its descriptor from the same image region; this achieves rotation invariance and copes better with the scale changes that occur under varying lighting, shadows, and loss of focus.
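The group/layer organisation described above can be sketched as a table of (filter size, scale factor) pairs. The numeric filter sizes and scale factors below are illustrative assumptions, not values from the patent; only the structure follows the text (box-filter size grows between groups, while within a group the size is fixed and only the scale-space factor increases).

```python
def pyramid_parameters(n_groups=3, n_layers=4, base_size=9, size_step=6,
                       base_sigma=1.2, sigma_step=0.4):
    """Return one (filter_size, scale_factor) pair per layer of each group.
    Between groups the box-filter template size increases; within a group
    the size is constant and the scale-space factor grows."""
    params = []
    for g in range(n_groups):
        size = base_size + g * size_step          # grows group to group
        params.append([(size, round(base_sigma + l * sigma_step, 2))
                       for l in range(n_layers)])  # same size, growing sigma
    return params
```

Because the image itself is never resampled, only these filter parameters change, which is what makes the scheme cheap to implement in FPGA hardware.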
  • the block motion measurement algorithm is implemented in hardware in the field-programmable gate array, which greatly reduces the computing power required by the system; the block displacement in the images before and after the test can be calculated in real time with a fast response.
  • the laboratory block motion measuring device of the present invention also has the advantages of low cost, convenient installation, resistance to damage, and low maintenance cost.
  • FIG. 1 is a schematic flowchart of a method for real-time detection of block motion based on feature point identification according to an embodiment of the present invention
  • FIG. 2 is a schematic diagram of a breakwater according to an embodiment of the present invention.
  • 1 - data processor; 2 - digital signal processing system; 3 - digital camera; 4 - image acquisition controller; 5 - water surface; 6 - breakwater; 7 - block.
  • the terms "installed", "connected" and "coupled" should be understood in a broad sense unless otherwise expressly specified and limited; for example, a connection may be fixed, detachable, or integral; it may be mechanical or electrical; and it may be direct, indirect through an intermediate medium, or internal communication between two elements.
  • a method for real-time detection of block 7 motion based on feature point identification includes the following steps:
  • S3 The digital signal processing system performs feature point detection on the block 7 in the image
  • the digital camera is arranged above the water flume and perpendicular to the armor face of the model.
  • the image acquisition controller is directly connected to the digital camera to control continuous shooting of images.
  • the digital signal processing system processes and analyzes the collected images, and transmits the analysis results to data processor.
  • the digital signal processing system is used to process the image in real time
  • the digital camera is configured to send grayscale pixels
  • the camera communicates with the digital signal processing system via, but not limited to, the USB bus
  • the image acquisition controller can remotely control the camera to take consecutive images without touching it.
  • the field-programmable gate array module includes: a video acquisition module, an image storage module, a data processing module, and an image display module.
  • the feature point detection described in step S3 includes the following steps:
  • S36 Determine the matching degree of two feature points by calculating the distance between them; the shorter the distance between the two feature points, the higher the matching degree;
  • S37 Screen the feature points corresponding to each block 7, and retain the feature point with the highest matching degree to represent the block 7, thereby completing the detection of the feature points.
  • the method of generating the edge points of the digital image with the Hessian matrix used in step S31 is as follows:
  • H(f(x, y)) = \begin{pmatrix} \frac{\partial^2 f}{\partial x^2} & \frac{\partial^2 f}{\partial x \partial y} \\ \frac{\partial^2 f}{\partial x \partial y} & \frac{\partial^2 f}{\partial y^2} \end{pmatrix}, where f(x, y) is the pixel value of the image
  • the discriminant of the Hessian matrix is: \det(H) = \frac{\partial^2 f}{\partial x^2} \cdot \frac{\partial^2 f}{\partial y^2} - \left( \frac{\partial^2 f}{\partial x \partial y} \right)^2
  • when the discriminant of the Hessian matrix attains a local maximum, the current point can be judged to be brighter or darker than the other points in its surrounding neighborhood, and this point is the position of a feature point.
  • in the Gaussian pyramid constructed in step S32, the size of the image remains unchanged; only the size and scale of the Gaussian blur template are changed.
  • the steps of computing the Haar wavelet features in the neighborhood of a pixel used in step S34 are as follows:
  • S342 Assign a Gaussian weight coefficient to the Haar wavelet response value, so that the contribution of the response close to the feature point is larger than that of the response far away from the feature point;
  • S344 Traverse the entire area, and select the direction of the longest vector as the main direction of the feature point.
  • the feature point descriptor generated in step S35 is obtained by taking a 4*4 grid of rectangular sub-regions around the feature point along its main direction and computing, for the pixels of each sub-region, the Haar wavelet features in the horizontal and vertical directions, namely the sum of the horizontal values, the sum of the horizontal absolute values, the sum of the vertical values, and the sum of the vertical absolute values.
  • the coordinate transformation method used in step S4 is as follows:
  • o_0 is the origin of the pixel coordinate system, and (u_0, v_0) is the pixel coordinate of the center of the image plane; o_1 is the origin of the physical coordinate system
  • u = \frac{x}{dx} + u_0
  • v = \frac{y}{dy} + v_0
  • dx is the physical size of each pixel in the u-axis direction, and dy is the physical size of each pixel in the v-axis direction.
  • the calibration of the camera in step S1 adopts Zhang Zhengyou's calibration method.
  • the digital camera 3 takes digital images of the breakwater 6 before and after the test under the remote control of the image acquisition controller 4; the grayscale pixels are then transmitted to the digital signal processing system 2 based on the field-programmable gate array, in which feature point detection, coordinate transformation and displacement calculation of the blocks 7 in the image are performed, and the result is finally displayed on the data processor 1.
  • a real-time detection device for laboratory block motion based on feature point recognition uses a remotely controlled digital camera to transmit grayscale pixels, and a block motion measurement algorithm implemented in a digital signal processing system based on a field-programmable gate array analyzes the grayscale pixels to measure the motion of the laboratory blocks.
  • the block motion measurement algorithm builds a Gaussian pyramid. The images in different groups have the same size; what differs is that the template size of the box filter increases from group to group, while the images of different layers within the same group use a filter of the same size but with a gradually increasing scale-space factor, thereby achieving scale invariance. Before generating the feature descriptor, the algorithm rotates the image to the main direction, which guarantees that a feature point always generates its descriptor from the same image region, achieving rotation invariance.
  • the block motion measurement algorithm is implemented in hardware in the field-programmable gate array, which greatly reduces the computing power required by the system; the block displacement in the images before and after the test can be calculated in real time with a fast response.
  • the laboratory block motion measuring device of the present invention also has the advantages of low cost, convenient installation, resistance to damage, and low maintenance cost.

Abstract

The present invention provides a real-time detection method of block motion based on feature point recognition, comprising the following steps: S1: calibrate the camera; S2: capture digital images of the blocks on the breakwater surface with the camera and send the digital images to a digital signal processing system based on a field-programmable gate array; S3: the digital signal processing system performs feature point detection on the blocks in the image; S4: after feature point detection is completed, perform coordinate transformation; S5: compare the positions of the feature points of the blocks before and after the test, and take the difference of the coordinates of the two images before and after the test to obtain the change of the feature points and hence the displacement of the blocks; S6: display the displacement calculation result on the data processor. The real-time detection method of block motion based on feature point recognition according to the present invention solves the problem that existing detection methods can analyze the motion and displacement of blocks only qualitatively and achieve neither quantitative nor real-time measurement.

Description

Real-time detection method of block motion based on feature point recognition
Technical Field
The present invention belongs to the field of ocean engineering, and in particular relates to a real-time detection method of block motion based on feature point recognition.
Background Art
Damage to a breakwater is usually measured by conditions such as the movement and fracture of its armor blocks; it is therefore necessary to study laboratory devices for detecting block motion. Commonly used block motion detection methods include visual inspection, close-up photography, and photogrammetry from a fixed position. Visual inspection is useful for checking specific damage: the number of broken and displaced blocks in each section of the breakwater can be checked visually, but the method is time-consuming and unsuitable for inspecting an entire breakwater. Close-up photography merely documents the visual inspection results and is useful for examining localized damage in detail. Photogrammetry from a fixed position, producing overlapping photographs that cover the entire above-water condition of the breakwater, is the most useful and cost-effective breakwater inspection method. Analyzing the state of the breakwater from images acquired before and after a test is feasible; however, detecting and extracting targets in images is computationally demanding and therefore cannot be used in systems with limited computing power. Moreover, all of the above methods analyze the motion and displacement of the blocks only qualitatively; none achieves quantitative or real-time measurement.
Summary of the Invention
In view of this, the present invention proposes a real-time detection method of block motion based on feature point recognition to solve the problem that existing detection methods can analyze the motion and displacement of blocks only qualitatively and achieve neither quantitative nor real-time measurement.
To achieve the above object, the technical solution of the present invention is realized as follows:
A real-time detection method of block motion based on feature point recognition comprises the following steps:
S1: calibrate the camera;
S2: capture digital images of the blocks on the breakwater surface with the camera, and send the digital images to a digital signal processing system based on a field-programmable gate array;
S3: the digital signal processing system performs feature point detection on the blocks in the image;
S4: after feature point detection is completed, perform coordinate transformation;
S5: compare the positions of the feature points of the blocks before and after the test; taking the difference of the coordinates of the two images before and after the test gives the change of the feature points and hence the displacement of the blocks;
S6: display the displacement calculation result on the data processor.
Further, the feature point detection in step S3 comprises the following steps:
S31: use the Hessian matrix to generate the edge points of the digital image; a Hessian matrix is constructed for every edge point in the image;
S32: build a Gaussian pyramid from the digital image;
S33: compare each pixel processed by the Hessian matrix with the points in its three-dimensional neighborhood; if the pixel is the maximum or minimum among the pixels in the neighborhood, keep it as a preliminary feature point;
S34: compute the Haar wavelet features in the neighborhood of the feature point;
S35: generate the feature point descriptor from the Haar wavelet features;
S36: judge the matching degree of two feature points by calculating the distance between them; the shorter the distance between the two feature points, the higher the matching degree;
S37: screen the feature points corresponding to each block and keep the feature point with the highest matching degree to represent that block, thereby completing the feature point detection.
Further, the method of generating the edge points of the digital image with the Hessian matrix used in step S31 is as follows:
H(f(x, y)) = \begin{pmatrix} \frac{\partial^2 f}{\partial x^2} & \frac{\partial^2 f}{\partial x \partial y} \\ \frac{\partial^2 f}{\partial x \partial y} & \frac{\partial^2 f}{\partial y^2} \end{pmatrix}
where f(x, y) is the pixel value of the image;
The discriminant of the Hessian matrix is:
\det(H) = \frac{\partial^2 f}{\partial x^2} \cdot \frac{\partial^2 f}{\partial y^2} - \left( \frac{\partial^2 f}{\partial x \partial y} \right)^2
When the discriminant of the Hessian matrix attains a local maximum, the current point can be judged to be brighter or darker than the other points in its surrounding neighborhood, and this point is the position of a feature point.
Further, in the Gaussian pyramid constructed in step S32, the size of the image remains unchanged; only the size and scale of the Gaussian blur template are changed.
Further, the steps of computing the Haar wavelet features in the neighborhood of a pixel used in step S34 are as follows:
S341: centered on the feature point, compute the sums of the Haar wavelet responses of all points in the neighborhood in the horizontal and vertical directions;
S342: assign Gaussian weight coefficients to the Haar wavelet response values, so that responses close to the feature point contribute more than responses far from it;
S343: add the Haar wavelet responses in the neighborhood to form a new vector;
S344: traverse the entire region and select the direction of the longest vector as the main direction of the feature point.
Further, the feature point descriptor generated in step S35 is obtained by taking a 4*4 grid of rectangular sub-regions around the feature point along its main direction and computing, for the pixels of each sub-region, the Haar wavelet features in the horizontal and vertical directions, namely the sum of the horizontal values, the sum of the horizontal absolute values, the sum of the vertical values, and the sum of the vertical absolute values.
Further, the coordinate transformation method used in step S4 is as follows:
Establish the pixel coordinate system o_0uv, where o_0 is the origin of the pixel coordinate system and (u_0, v_0) is the pixel coordinate of the center of the image plane; establish the physical coordinate system o_1xy, where o_1 is the origin of the physical coordinate system;
u = \frac{x}{dx} + u_0
v = \frac{y}{dy} + v_0
dx is the physical size of each pixel in the u-axis direction, and dy is the physical size of each pixel in the v-axis direction.
Further, the calibration of the camera in step S1 adopts Zhang Zhengyou's calibration method.
Compared with the prior art, the present invention has the following beneficial effects:
The real-time detection device for block motion based on feature point recognition proposed by the present invention uses a remotely controlled digital camera to send grayscale pixels, and a block motion measurement algorithm implemented in a digital signal processing system based on a field-programmable gate array analyzes the grayscale pixels to measure the motion of the laboratory blocks. The block motion measurement algorithm builds a Gaussian pyramid: the images in different groups have the same size; what differs is that the template size of the box filter increases from group to group, while the images of different layers within the same group use a filter of the same size but with a gradually increasing scale-space factor, thereby achieving scale invariance. Before generating the feature descriptor, the algorithm rotates the image to the main direction, which guarantees that a feature point always generates its descriptor from the same image region; this achieves rotation invariance and copes better with the scale changes that occur under varying lighting, shadows, and loss of focus. Finally, the block motion measurement algorithm is implemented in hardware in the field-programmable gate array, which greatly reduces the computing power required by the system; the displacement of the blocks in the images before and after the test can be calculated in real time with a fast response. In addition, the laboratory block motion measuring device of the present invention has the advantages of low cost, convenient installation, resistance to damage, and low maintenance cost.
Brief Description of the Drawings
The drawings forming a part of the present invention are used to provide a further understanding of the invention; the illustrative embodiments of the invention and their description are used to explain the invention and do not constitute an undue limitation of it. In the drawings:
Fig. 1 is a schematic flowchart of the real-time detection method of block motion based on feature point recognition according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of the breakwater according to an embodiment of the present invention.
Description of reference numerals:
1 - data processor; 2 - digital signal processing system; 3 - digital camera; 4 - image acquisition controller; 5 - water surface; 6 - breakwater; 7 - block.
Detailed Description of the Embodiments
It should be noted that the embodiments of the present invention and the features of the embodiments may be combined with one another without conflict.
In the description of the present invention, it should be understood that orientation or position terms such as "center", "longitudinal", "transverse", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner" and "outer" are based on the orientations or positions shown in the drawings and are used only for convenience and simplicity of description; they do not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation, and therefore cannot be construed as limiting the invention. In addition, the terms "first", "second", etc. are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features; thus, a feature defined by "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present invention, unless otherwise specified, "a plurality of" means two or more.
In the description of the present invention, it should be noted that, unless otherwise expressly specified and limited, the terms "installed", "connected" and "coupled" should be understood in a broad sense: a connection may be fixed, detachable, or integral; mechanical or electrical; direct, indirect through an intermediate medium, or internal communication between two elements. For those of ordinary skill in the art, the specific meanings of the above terms in the present invention can be understood according to the specific circumstances.
The present invention will be described in detail below with reference to the drawings and in conjunction with the embodiments.
As shown in Fig. 1, a real-time detection method of the motion of blocks 7 based on feature point recognition comprises the following steps:
S1: calibrate the camera;
S2: capture digital images of the blocks 7 on the surface of the breakwater 6 with the camera, and send the digital images to a digital signal processing system based on a field-programmable gate array;
S3: the digital signal processing system performs feature point detection on the blocks 7 in the image;
S4: after feature point detection is completed, perform coordinate transformation;
S5: compare the positions of the feature points of the blocks 7 before and after the test; taking the difference of the coordinates of the two images before and after the test gives the change of the feature points and hence the displacement of the blocks 7;
S6: display the displacement calculation result on the data processor.
The digital camera is arranged above the water flume and perpendicular to the armor face of the model; the image acquisition controller is directly connected to the digital camera to control the continuous capture of images; the digital signal processing system processes and analyzes the collected images and transmits the analysis results to the data processor.
The digital signal processing system is used to process the images in real time; the digital camera is configured to send grayscale pixels; the camera communicates with the digital signal processing system via, but not limited to, the USB bus; and the image acquisition controller can remotely control the camera to take consecutive images without touching it.
The field-programmable gate array module includes: a video acquisition module, an image storage module, a data processing module, and an image display module.
As shown in Fig. 1, the feature point detection in step S3 comprises the following steps:
S31: use the Hessian matrix to generate the edge points of the digital image; a Hessian matrix is constructed for every edge point in the image;
S32: build a Gaussian pyramid from the digital image;
S33: compare each pixel processed by the Hessian matrix with the points in its three-dimensional neighborhood; if the pixel is the maximum or minimum among the pixels in the neighborhood, keep it as a preliminary feature point;
S34: compute the Haar wavelet features in the neighborhood of the feature point;
S35: generate the feature point descriptor from the Haar wavelet features;
S36: judge the matching degree of two feature points by calculating the distance between them; the shorter the distance between the two feature points, the higher the matching degree;
S37: screen the feature points corresponding to each block 7 and keep the feature point with the highest matching degree to represent that block 7, thereby completing the feature point detection.
The method of generating the edge points of the digital image with the Hessian matrix used in step S31 is as follows:
H(f(x, y)) = \begin{pmatrix} \frac{\partial^2 f}{\partial x^2} & \frac{\partial^2 f}{\partial x \partial y} \\ \frac{\partial^2 f}{\partial x \partial y} & \frac{\partial^2 f}{\partial y^2} \end{pmatrix}
where f(x, y) is the pixel value of the image;
The discriminant of the Hessian matrix is:
\det(H) = \frac{\partial^2 f}{\partial x^2} \cdot \frac{\partial^2 f}{\partial y^2} - \left( \frac{\partial^2 f}{\partial x \partial y} \right)^2
When the discriminant of the Hessian matrix attains a local maximum, the current point can be judged to be brighter or darker than the other points in its surrounding neighborhood, and this point is the position of a feature point.
In the Gaussian pyramid constructed in step S32, the size of the image remains unchanged; only the size and scale of the Gaussian blur template are changed.
The steps of computing the Haar wavelet features in the neighborhood of a pixel used in step S34 are as follows:
S341: centered on the feature point, compute the sums of the Haar wavelet responses of all points in the neighborhood in the horizontal and vertical directions;
S342: assign Gaussian weight coefficients to the Haar wavelet response values, so that responses close to the feature point contribute more than responses far from it;
S343: add the Haar wavelet responses in the neighborhood to form a new vector;
S344: traverse the entire region and select the direction of the longest vector as the main direction of the feature point.
The feature point descriptor generated in step S35 is obtained by taking a 4*4 grid of rectangular sub-regions around the feature point along its main direction and computing, for the pixels of each sub-region, the Haar wavelet features in the horizontal and vertical directions, namely the sum of the horizontal values, the sum of the horizontal absolute values, the sum of the vertical values, and the sum of the vertical absolute values.
The coordinate transformation method used in step S4 is as follows:
Establish the pixel coordinate system o_0uv, where o_0 is the origin of the pixel coordinate system and (u_0, v_0) is the pixel coordinate of the center of the image plane; establish the physical coordinate system o_1xy, where o_1 is the origin of the physical coordinate system;
u = \frac{x}{dx} + u_0
v = \frac{y}{dy} + v_0
dx is the physical size of each pixel in the u-axis direction, and dy is the physical size of each pixel in the v-axis direction.
Further, the calibration of the camera in step S1 adopts Zhang Zhengyou's calibration method.
As shown in Fig. 2, after the water surface 5 has become completely calm, the digital camera 3 is used, under the remote control of the image acquisition controller 4, to take digital images of the armor face of the breakwater 6 before and after the test; the grayscale pixels are then transmitted to the digital signal processing system 2 based on the field-programmable gate array, in which feature point detection, coordinate transformation and calculation of the displacement of the blocks 7 in the images are performed; finally, the results are displayed on the data processor 1.
The real-time detection device for laboratory block motion based on feature point recognition according to the present invention uses a remotely controlled digital camera to send grayscale pixels, and a block motion measurement algorithm implemented in a digital signal processing system based on a field-programmable gate array analyzes the grayscale pixels to measure the motion of the laboratory blocks. The block motion measurement algorithm builds a Gaussian pyramid: the images in different groups have the same size; what differs is that the template size of the box filter increases from group to group, while the images of different layers within the same group use a filter of the same size but with a gradually increasing scale-space factor, thereby achieving scale invariance. Before generating the feature descriptor, the algorithm rotates the image to the main direction, which guarantees that a feature point always generates its descriptor from the same image region; this achieves rotation invariance and copes better with the scale changes that occur under varying lighting, shadows, and loss of focus. Finally, the block motion measurement algorithm is implemented in hardware in the field-programmable gate array, which greatly reduces the computing power required by the system; the displacement of the blocks in the images before and after the test can be calculated in real time with a fast response. In addition, the laboratory block motion measuring device of the present invention has the advantages of low cost, convenient installation, resistance to damage, and low maintenance cost.
The above are only preferred embodiments of the present invention and are not intended to limit it; any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present invention shall be included within the protection scope of the present invention.

Claims (8)

  1. A real-time detection method of block motion based on feature point recognition, characterized by comprising the following steps:
    S1: calibrate the camera;
    S2: capture digital images of the blocks on the breakwater surface with the camera, and send the digital images to a digital signal processing system based on a field-programmable gate array;
    S3: the digital signal processing system performs feature point detection on the blocks in the image;
    S4: after feature point detection is completed, perform coordinate transformation;
    S5: compare the positions of the feature points of the blocks before and after the test, and take the difference of the coordinates of the two images before and after the test to obtain the change of the feature points, thereby obtaining the displacement of the blocks;
    S6: display the displacement calculation result on a data processor.
  2. The real-time detection method of block motion based on feature point recognition according to claim 1, characterized in that the feature point detection in step S3 comprises the following steps:
    S31: use the Hessian matrix to generate the edge points of the digital image; a Hessian matrix is constructed for every edge point in the image;
    S32: build a Gaussian pyramid from the digital image;
    S33: compare each pixel processed by the Hessian matrix with the points in its three-dimensional neighborhood; if the pixel is the maximum or minimum among the pixels in the neighborhood, keep it as a preliminary feature point;
    S34: compute the Haar wavelet features in the neighborhood of the feature point;
    S35: generate the feature point descriptor from the Haar wavelet features;
    S36: judge the matching degree of two feature points by calculating the distance between them; the shorter the distance between the two feature points, the higher the matching degree;
    S37: screen the feature points corresponding to each block and keep the feature point with the highest matching degree to represent that block, thereby completing the feature point detection.
  3. The real-time detection method of block motion based on feature point recognition according to claim 2, characterized in that the method of generating the edge points of the digital image with the Hessian matrix used in step S31 is as follows:
    H(f(x, y)) = \begin{pmatrix} \frac{\partial^2 f}{\partial x^2} & \frac{\partial^2 f}{\partial x \partial y} \\ \frac{\partial^2 f}{\partial x \partial y} & \frac{\partial^2 f}{\partial y^2} \end{pmatrix}
    where f(x, y) is the pixel value of the image;
    the discriminant of the Hessian matrix is:
    \det(H) = \frac{\partial^2 f}{\partial x^2} \cdot \frac{\partial^2 f}{\partial y^2} - \left( \frac{\partial^2 f}{\partial x \partial y} \right)^2
    when the discriminant of the Hessian matrix attains a local maximum, the current point can be judged to be brighter or darker than the other points in its surrounding neighborhood, and this point is the position of a feature point.
  4. The real-time detection method of block motion based on feature point recognition according to claim 2, characterized in that, in the Gaussian pyramid constructed in step S32, the size of the image remains unchanged and only the size and scale of the Gaussian blur template are changed.
  5. The real-time detection method of block motion based on feature point recognition according to claim 2, characterized in that the steps of computing the Haar wavelet features in the neighborhood of a pixel used in step S34 are as follows:
    S341: centered on the feature point, compute the sums of the Haar wavelet responses of all points in the neighborhood in the horizontal and vertical directions;
    S342: assign Gaussian weight coefficients to the Haar wavelet response values, so that responses close to the feature point contribute more than responses far from it;
    S343: add the Haar wavelet responses in the neighborhood to form a new vector;
    S344: traverse the entire region and select the direction of the longest vector as the main direction of the feature point.
  6. The real-time detection method of block motion based on feature point recognition according to claim 2, characterized in that the feature point descriptor generated in step S35 is obtained by taking a 4*4 grid of rectangular sub-regions around the feature point along its main direction and computing, for the pixels of each sub-region, the Haar wavelet features in the horizontal and vertical directions, namely the sum of the horizontal values, the sum of the horizontal absolute values, the sum of the vertical values, and the sum of the vertical absolute values.
  7. The real-time detection method of block motion based on feature point recognition according to claim 1, characterized in that the coordinate transformation method used in step S4 is as follows:
    establish the pixel coordinate system o_0uv, where o_0 is the origin of the pixel coordinate system and (u_0, v_0) is the pixel coordinate of the center of the image plane; establish the physical coordinate system o_1xy, where o_1 is the origin of the physical coordinate system;
    u = \frac{x}{dx} + u_0
    v = \frac{y}{dy} + v_0
    dx is the physical size of each pixel in the u-axis direction, and dy is the physical size of each pixel in the v-axis direction.
  8. The real-time detection method of block motion based on feature point recognition according to claim 1, characterized in that the calibration of the camera in step S1 adopts Zhang Zhengyou's calibration method.
PCT/CN2022/074244 2021-03-31 2022-01-27 Real-time detection method of block motion based on feature point recognition WO2022206161A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/892,232 US20220414901A1 (en) 2021-03-31 2022-08-22 Real-time detection method of block motion based on feature point recognition

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110352363.1A 2021-03-31 Real-time detection method of block motion based on feature point recognition
CN202110352363.1 2021-03-31

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/892,232 Continuation US20220414901A1 (en) 2021-03-31 2022-08-22 Real-time detection method of block motion based on feature point recognition

Publications (1)

Publication Number Publication Date
WO2022206161A1 true WO2022206161A1 (zh) 2022-10-06

Family

ID=76280822

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/074244 WO2022206161A1 (zh) 2021-03-31 2022-01-27 一种基于特征点识别的块体运动实时检测方法

Country Status (4)

Country Link
US (1) US20220414901A1 (zh)
CN (1) CN112967319A (zh)
LU (1) LU502661B1 (zh)
WO (1) WO2022206161A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117132913A (zh) * 2023-10-26 2023-11-28 Shandong University of Science and Technology Surface horizontal displacement calculation method based on UAV remote sensing and feature recognition matching

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112967319A (zh) * 2021-03-31 2021-06-15 Tianjin Research Institute for Water Transport Engineering, Ministry of Transport Real-time detection method of block motion based on feature point recognition
CN113205541A (zh) * 2021-05-31 2021-08-03 Tianjin Research Institute for Water Transport Engineering, Ministry of Transport Real-time laboratory spatial wave measurement method based on visual edge detection
CN116634285B (zh) * 2023-04-25 2024-02-02 钛玛科(北京)工业科技有限公司 Automatic white balance method of a line-scan camera for raw material inspection equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105551264A (zh) * 2015-12-25 2016-05-04 Shanghai Advanced Research Institute, Chinese Academy of Sciences Vehicle speed detection method based on license plate feature matching
CN106295641A (zh) * 2016-08-09 2017-01-04 Ansteel Group Mining Co., Ltd. Automatic slope displacement monitoring method based on image SURF features
US10255525B1 (en) * 2017-04-25 2019-04-09 Uber Technologies, Inc. FPGA device for image classification
CN112967319A (zh) * 2021-03-31 2021-06-15 Tianjin Research Institute for Water Transport Engineering, Ministry of Transport Real-time detection method of block motion based on feature point recognition
CN113205541A (zh) * 2021-05-31 2021-08-03 Tianjin Research Institute for Water Transport Engineering, Ministry of Transport Real-time laboratory spatial wave measurement method based on visual edge detection

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106408609B (zh) * 2016-09-13 2019-05-31 Jiangsu University Binocular-vision-based method for detecting the end motion pose of a parallel mechanism
CN109118544B (zh) * 2018-07-17 2022-05-27 Nanjing University of Science and Technology Synthetic aperture imaging method based on perspective transformation
CN110135438B (zh) * 2019-05-09 2022-09-27 Harbin Engineering University Improved SURF algorithm based on gradient magnitude pre-computation
CN110634137A (zh) * 2019-09-26 2019-12-31 杭州鲁尔物联科技有限公司 Visual-perception-based bridge deformation monitoring method, apparatus and device
CN111472586A (zh) * 2020-05-27 2020-07-31 Tianjin Research Institute for Water Transport Engineering, Ministry of Transport Production system and production method for armor blocks and their application in tests
CN112258588A (zh) * 2020-11-13 2021-01-22 Jiangsu University of Science and Technology Calibration method and system for a binocular camera, and storage medium
CN112465876A (zh) * 2020-12-11 2021-03-09 Henan Polytechnic University Stereo matching method and device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105551264A (zh) * 2015-12-25 2016-05-04 Shanghai Advanced Research Institute, Chinese Academy of Sciences Vehicle speed detection method based on license plate feature matching
CN106295641A (zh) * 2016-08-09 2017-01-04 Ansteel Group Mining Co., Ltd. Automatic slope displacement monitoring method based on image SURF features
US10255525B1 * 2017-04-25 2019-04-09 Uber Technologies, Inc. FPGA device for image classification
CN112967319A (zh) * 2021-03-31 2021-06-15 Tianjin Research Institute for Water Transport Engineering, Ministry of Transport Real-time detection method of block motion based on feature point recognition
CN113205541A (zh) * 2021-05-31 2021-08-03 Tianjin Research Institute for Water Transport Engineering, Ministry of Transport Real-time laboratory spatial wave measurement method based on visual edge detection

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117132913A (zh) * 2023-10-26 2023-11-28 Shandong University of Science and Technology Surface horizontal displacement calculation method based on UAV remote sensing and feature recognition matching
CN117132913B (zh) 2024-01-26 Surface horizontal displacement calculation method based on UAV remote sensing and feature recognition matching

Also Published As

Publication number Publication date
CN112967319A (zh) 2021-06-15
US20220414901A1 (en) 2022-12-29
LU502661B1 (en) 2022-12-12

Similar Documents

Publication Publication Date Title
WO2022206161A1 (zh) Real-time detection method of block motion based on feature point recognition
CN109360203B (zh) Image registration method, image registration apparatus and storage medium
CN110146030A (zh) Slope surface deformation monitoring system and method based on the checkerboard marker method
CN110197185B (zh) Method and system for monitoring the space under a bridge based on the scale-invariant feature transform algorithm
CN115294145B (zh) Method and system for measuring the sag of a power transmission line
CN113688817A (zh) Meter recognition method and recognition system for automatic inspection
CN111031311A (zh) Imaging quality detection method and apparatus, electronic device and readable storage medium
CN116228780B (zh) Computer-vision-based silicon wafer defect detection method and system
CN113205541A (zh) Real-time laboratory spatial wave measurement method based on visual edge detection
US9204130B2 (en) Method and system for creating a three dimensional representation of an object
JP2016526182A (ja) Method and device for instant adjustment of lens mounting flatness
CN112470189B (zh) Occlusion removal in a light field system
KR20180125095A (ko) Displacement measurement system and method based on a PTZ imaging device
CN116778094B (zh) Building deformation monitoring method and device based on capture from preferred viewing angles
CN112924037A (zh) Infrared body temperature detection system and detection method based on image registration
Jiang et al. Full-field deformation measurement of structural nodes based on panoramic camera and deep learning-based tracking method
CN115880643A (zh) Social distance monitoring method and device based on an object detection algorithm
CN114299153A (zh) Synchronous calibration method and system for a camera array of ultra-large power equipment
CN114549613A (zh) Structural displacement measurement method and device based on a deep super-resolution network
CN112858331A (zh) Detection method and detection system for a VR screen
US11481996B2 (en) Calculation device, information processing method, and storage medium
Désaulniers et al. Performance evaluation of panoramic electro-optic imagers using the TOD method
CN116993803B (zh) Landslide deformation monitoring method and apparatus, and electronic device
CN113362244B (zh) Image processing method based on priority and data usage plan
US20230386077A1 (en) Position estimation system, position estimation method, and computer program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22778349

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22778349

Country of ref document: EP

Kind code of ref document: A1