WO2020051750A1 - Image processing method, edge extraction method, processing device and storage medium - Google Patents

Image processing method, edge extraction method, processing device and storage medium

Info

Publication number
WO2020051750A1
WO2020051750A1 (PCT/CN2018/104900; CN2018104900W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
pixel
analysis window
value
pixel value
Prior art date
Application number
PCT/CN2018/104900
Other languages
English (en)
French (fr)
Inventor
阳光
Original Assignee
深圳配天智能技术研究院有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳配天智能技术研究院有限公司 filed Critical 深圳配天智能技术研究院有限公司
Priority to PCT/CN2018/104900 priority Critical patent/WO2020051750A1/zh
Priority to CN201880087313.1A priority patent/CN111630565B/zh
Publication of WO2020051750A1 publication Critical patent/WO2020051750A1/zh

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/73Deblurring; Sharpening
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/20Image enhancement or restoration using local operators
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20172Image enhancement details
    • G06T2207/20192Edge enhancement; Edge preservation

Definitions

  • The present application relates to the field of image processing, and in particular to an image processing method, an edge extraction method, a processing device, and a storage medium.
  • Edge extraction is a basic operation in image processing and can be applied in different fields.
  • In the industrial field, for example, edge extraction is used to inspect the surface quality of a workpiece: an image of the workpiece surface is acquired and edge extraction is performed on it to detect whether stains or scratches are present. In practice the stains on a workpiece surface are often faint, so the corresponding edges in the image are unclear and hard to detect.
  • The present application provides an image processing method, an edge extraction method, a processing device, and a computer storage medium to solve the problem that edges in an image to be inspected are unclear and difficult to detect.
  • The present application provides an image processing method.
  • The method includes: generating an analysis window applied to the image; calculating the differences between the pixel value of the center pixel and the pixel values of the other pixels covered by the analysis window at its current position on the image; and resetting the pixel value of the center pixel according to the largest of those differences.
  • The present application also provides an image edge extraction method, which includes: processing the pixel values of the image with the above method; selecting pixels whose pixel values satisfy a preset condition as edge points of the image; and performing edge extraction based on the edge points.
  • The present application provides an image processing device.
  • The device includes a processor and a memory.
  • The memory stores a computer program, and the processor is configured to execute the computer program to implement the foregoing method.
  • The present application provides a storage medium for storing a computer program; the computer program can be executed to implement the above method.
  • The method of the present application processes the pixel values of an image: it first generates an analysis window applied to the image, then calculates the differences between the pixel value of the center pixel and the pixel values of the other pixels covered by the analysis window at its current position on the image, and resets the pixel value of the center pixel according to the maximum of those differences, so that the reset pixel value reflects the maximum difference between that pixel and the other pixels.
  • By resetting pixel values in this way, the present application highlights the maximum difference between a center pixel and its neighboring pixels, making the edges in the image clearer and easier to detect.
  • FIG. 1 is a schematic flowchart of an embodiment of the image processing method of the present application;
  • FIG. 2 is a schematic comparison of the pixel values within one analysis window before and after resetting, in the method shown in FIG. 1;
  • FIG. 3 is a schematic flowchart of another embodiment of the image processing method of the present application;
  • FIG. 4 is a schematic diagram of the azimuth angle, relative to the center pixel, of the other pixel corresponding to the maximum difference in the method shown in FIG. 3;
  • FIG. 5 is a schematic comparison of an image before and after processing with the method shown in FIG. 1 or FIG. 3;
  • FIG. 6 is a schematic flowchart of an embodiment of the image edge extraction method of the present application;
  • FIG. 7 is a schematic structural diagram of an embodiment of the image processing device of the present application;
  • FIG. 8 is a schematic structural diagram of an embodiment of the storage medium of the present application.
  • The processing method of the present application resets the pixel values of an image to highlight the difference between each pixel and its neighboring pixels.
  • Specifically, an analysis window applied to the image is generated, and neighborhood differences are computed for the center pixel within the area covered by the window.
  • The maximum of these neighborhood differences reflects the difference between the center pixel and its neighbors, so the pixel value of the center pixel is reset according to that maximum difference; the pixel values of the image are reset by repeating these steps.
  • FIG. 1 is a schematic flowchart of an embodiment of the image processing method of the present application.
  • The image processing method of this embodiment includes the following steps.
  • S11: Generate an analysis window applied to the image.
  • An analysis window applied to the image is generated; that is, an analysis region is delimited for analyzing and computing on the image.
  • Each round of analysis considers only the pixels covered by the analysis window, so the window limits the amount and range of the data analyzed. A larger window therefore makes each analysis more comprehensive but slower; a smaller window makes each analysis faster but less comprehensive.
  • The analysis window may be a rectangular window, a circular window, a fan-shaped window, and so on.
  • In this embodiment the pixels of the image are analyzed, so a rectangular window is generally used.
  • Specifically, the analysis window is a rectangular window corresponding to (2n+1)×(2n+1) pixels, where n is an integer greater than or equal to 1. The rectangular analysis window then has a center point, and the pixels in its coverage area correspondingly include a center pixel (see the sketch below).
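
For illustration only (not part of the original disclosure), the following sketch shows how a (2n+1)×(2n+1) rectangular window centered on a pixel maps to an index range in an image array; the function name, the array layout, and the assumption that the center lies at least n pixels from the border are the editor's additions, not the patent's.

```python
import numpy as np

def covered_pixels(image: np.ndarray, row: int, col: int, n: int) -> np.ndarray:
    """Return the (2n+1) x (2n+1) patch covered by the analysis window
    centered at (row, col); assumes the center is at least n pixels
    away from the image border."""
    return image[row - n:row + n + 1, col - n:col + n + 1]

# Example: with n = 1 the window covers 3 x 3 pixels around the center.
img = np.arange(25, dtype=np.uint8).reshape(5, 5)
print(covered_pixels(img, 2, 2, 1))

```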
  • S12: Calculate the differences between the pixel value of the center pixel and the pixel values of the other pixels covered by the analysis window at its current position on the image.
  • Once the analysis window is placed, the pixels covered by the window at its current position on the image can be determined, and the covered pixels are then analyzed.
  • The pixel values are processed to highlight the differences between a pixel and its neighboring pixels, so the method used in step S12 is to calculate the differences between the pixel value of the center pixel and the pixel values of the other pixels covered by the analysis window.
  • The analysis and calculation within the analysis window in this embodiment centers on the center pixel: the pixel-value differences between the center pixel and the other pixels reveal how the center pixel differs from its neighbors.
  • S13: Reset the pixel value of the center pixel according to the maximum of the differences.
  • After the differences between the center pixel and the other pixels are obtained in step S12, the pixel value of the center pixel is reset so that the new pixel value reflects the difference between the center pixel and the other pixels. In step S13 the reset is based on the maximum of the differences, so the new pixel value reflects the maximum difference between the center pixel and the other pixels.
  • For example, see FIG. 2, a schematic comparison of the pixel values within one analysis window before and after resetting in the method shown in FIG. 1, where the analysis window is a rectangular window corresponding to 3×3 pixels.
  • Before resetting, the pixel value of the center pixel is 45; its difference from the neighboring pixel whose value is 12 is the largest, and that maximum difference is 33. The pixel value of the center pixel is then reset according to the maximum difference of 33.
  • Here the maximum difference is used directly as the new pixel value, so after resetting the pixel value of the center pixel becomes 33 (a minimal code sketch of this single-window reset follows).
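
A minimal sketch of this single-window reset, assuming the difference is taken as the absolute difference of grayscale values (the text does not state whether signed or absolute differences are meant). Only the center value 45 and the neighbor value 12 are taken from FIG. 2; the remaining values are invented for the example.

```python
import numpy as np

def reset_center(window: np.ndarray) -> int:
    """Return the new value of the center pixel of a (2n+1) x (2n+1) window:
    the largest (absolute) difference between the center and the other pixels."""
    n = window.shape[0] // 2
    center = int(window[n, n])
    diffs = np.abs(window.astype(np.int32) - center)
    diffs[n, n] = 0                      # the center is not compared with itself
    return int(diffs.max())

# 3 x 3 window: center 45 and neighbor 12 as in FIG. 2; other values are illustrative.
w = np.array([[40, 50, 38],
              [52, 45, 12],
              [44, 47, 20]], dtype=np.uint8)
print(reset_center(w))                   # 33, the reset value shown in FIG. 2

```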
  • The above steps S12-S13 reset the pixel value of one center pixel.
  • When there is a stain on the workpiece surface, the stain corresponds to a faint edge point in the image.
  • Resetting the pixel value of that edge point makes the edge point clearer.
  • The pixel values of the other pixels in FIG. 2 can also be reset, but not within this analysis window; each of them is reset when it becomes the center pixel of the area covered by the analysis window.
  • This reset is specifically implemented by the following step S14.
  • The analysis of one analysis window position resets the pixel value of one center pixel, but to highlight the edge lines in an image, the pixel values of many pixels must be reset.
  • In this process each pixel's own value is replaced by a difference value: because pixel-value differences are large in edge-line regions and small elsewhere, the values in edge-line regions are replaced with larger difference values, thickening the edge lines, while the values in other regions are replaced with smaller difference values, which relatively deepens the edge lines and makes them more prominent in the image.
  • S14: Move the image and the analysis window relative to each other and return to step S12 until the end condition is satisfied, so that the pixel values of multiple pixels are reset.
  • The relative movement between the image and the analysis window may be moving the analysis window relative to the image along the pixel row or column direction with a step of one pixel.
  • The end condition may be whether the analysis window has traversed the image or a predetermined region of it; if the image or the predetermined region has been traversed, the end condition is satisfied.
  • At most, the predetermined region may be the whole image excluding its border pixels. Because the analysis window occupies a certain area and the computation within it only resets the pixel value of the center point, the pixels at the image border may not be processed with the analysis window.
  • In that case the traversed predetermined region is the whole image minus the border pixels.
  • The predetermined region may also be a region set by the user. For example, when inspecting a workpiece the user can roughly tell where the edge lines are by observing the workpiece; to speed up image analysis, a specific region can be selected as the region to be processed and analyzed.
  • Specifically, the analysis window is moved along the pixel row direction of the image and the pixel values of one row of pixels are reset in turn; the window is then shifted by a step of one pixel along the pixel column direction and moved along the row direction again to reset the pixel values of the next row in turn.
  • The analysis window and the image thus move relative to each other in an S-shaped pattern (a traversal sketch follows below).
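
A sketch of the traversal, under assumptions the embodiment leaves open: absolute differences are used, and the reset values are written to a separate output array so that later window positions still see the original pixel values; the border pixels, which cannot serve as window centers, are left unchanged. A plain row-by-row loop is used here, which gives the same result as the S-shaped movement because each output value depends only on the original image.

```python
import numpy as np

def reset_pixels(image: np.ndarray, n: int = 1) -> np.ndarray:
    """Slide a (2n+1) x (2n+1) analysis window over the image with a
    one-pixel step and replace each center pixel by the maximum absolute
    difference between it and the other covered pixels."""
    src = image.astype(np.int32)
    out = src.copy()                           # border pixels keep their values
    rows, cols = src.shape
    for r in range(n, rows - n):               # predetermined region: image minus border
        for c in range(n, cols - n):
            win = src[r - n:r + n + 1, c - n:c + n + 1]
            out[r, c] = np.abs(win - src[r, c]).max()
    return out.astype(np.uint8)

```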
  • With the pixel-value processing of this embodiment, an analysis window is placed on the image, the pixel-value differences between the center pixel and the other pixels covered by the window are calculated, and the pixel value of the center pixel is reset according to the maximum difference, thereby highlighting the difference between that pixel and the other pixels.
  • According to this reset process, this embodiment can apply the same processing to the other pixels in the image, making the edge lines in the image clearer and easier to detect.
  • FIG. 3 is a schematic flowchart of another embodiment of the image processing method of the present application.
  • The processing method of this embodiment includes the following steps.
  • Steps S21-S22 of this embodiment are substantially similar to steps S11-S12 of the previous embodiment and are not described again.
  • After the differences are calculated, the pixel value of the center pixel is likewise reset according to the maximum of the differences.
  • In this embodiment, however, the maximum difference is not used directly as the pixel value of the center pixel; instead, the pixel value is reset after an angle factor of the maximum difference is taken into account, so as to exclude interfering edge lines.
  • For example, in industrial inspection the lighting angle when the image of the workpiece surface is acquired may introduce interfering edge lines, so when resetting the pixel values an angle factor is additionally considered to eliminate the interference; this is implemented by the following steps S23-S25.
  • S23: Obtain the azimuth angle, relative to the center pixel, of the other pixel corresponding to the maximum of the differences.
  • After the differences are calculated in step S22, the maximum difference is obtained and the other pixel corresponding to it is identified; in this step, the azimuth angle of that pixel relative to the center pixel is obtained, and this azimuth angle represents the angle factor of the maximum difference.
  • FIG. 4 is a schematic diagram of the azimuth angle, relative to the center pixel, of the other pixel corresponding to the maximum difference in the method shown in FIG. 3.
  • The analysis window in FIG. 4 is a rectangular window corresponding to 5×5 pixels.
  • The azimuth angle of the other pixel corresponding to the maximum difference, relative to the center pixel, is α.
  • Obtaining the azimuth angle requires a preset reference coordinate system, namely the coordinate system X-Y in FIG. 4; the azimuth angle is the counterclockwise deflection, relative to the X axis, of the line connecting the center pixel and the other pixel, such as the angle α shown in FIG. 4.
  • The azimuth angle can be determined from the pixel offset, in the X-Y coordinate system, of the other pixel relative to the center pixel.
  • S24: Obtain the angle factor of the maximum difference from the azimuth angle.
  • In step S24 the azimuth angle is converted into an angle factor, for example by performing a predetermined calculation on the azimuth angle, or by taking a trigonometric function value of the azimuth angle, or the radian value corresponding to it, as the angle factor.
  • In this embodiment the azimuth angle is divided by 360° to obtain the angle factor.
  • The resulting angle factor lies between 0 and 1.
  • When the maximum difference is multiplied by the angle factor in the subsequent step, the resulting value can fall between 0 and 225, ensuring that it can appear in the image as a pixel value.
  • S25: Determine the pixel value of the center pixel from the maximum difference and the angle factor.
  • In step S25 the pixel value of the center pixel is determined from the maximum difference and its angle factor, i.e. the value obtained by multiplying the maximum difference by the angle factor is used as the new pixel value (see the sketch below).
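
A sketch of this angle-factor reset for one window position, with several assumptions made explicit: absolute differences are used; the reference X axis is taken along increasing column index and Y along the figure's vertical axis pointing upward, so the offsets are read as in FIG. 4; and the azimuth is the counterclockwise angle of the offset vector from X, reduced to [0°, 360°). The offset (x=1, y=-2) from the description reproduces the stated 296.57°.

```python
import math
import numpy as np

def reset_center_with_angle(window: np.ndarray) -> int:
    """Return max_difference * (azimuth / 360 degrees) for the center pixel,
    where the azimuth is that of the neighbor giving the maximum difference."""
    n = window.shape[0] // 2
    center = int(window[n, n])
    diffs = np.abs(window.astype(np.int32) - center)
    diffs[n, n] = -1                                 # exclude the center itself
    r, c = np.unravel_index(np.argmax(diffs), diffs.shape)
    max_diff = int(diffs[r, c])
    x, y = c - n, n - r                              # assumed X-Y frame: X right, Y up
    azimuth = math.degrees(math.atan2(y, x)) % 360.0
    angle_factor = azimuth / 360.0                   # lies in [0, 1)
    return int(round(max_diff * angle_factor))

# Check against the description's example offset (x=1, y=-2):
print(round(math.degrees(math.atan2(-2, 1)) % 360.0, 2))   # 296.57

```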
  • S26: Move the image and the analysis window relative to each other and return to step S22 until the end condition is satisfied. This step is similar to step S14 of the previous embodiment and is not described again; it likewise resets the pixel values of multiple pixels.
  • In this embodiment the pixel values are reset according to each pixel's maximum difference and its angle factor, and this reset is applied to the image or to a preset region of the image; that is, the reset values of all processed pixels take both the maximum difference and the angle factor into account, so the difference between each pixel and its neighboring pixels can be reflected more accurately in the presence of interference.
  • FIG. 5 is a schematic comparison of an image before and after processing with the method shown in FIG. 1 or FIG. 3. When the image in FIG. 5 is processed, the pixel values analyzed and calculated are grayscale values; the edge lines of the processed image in FIG. 5 are clearly more distinct than those of the image before processing.
  • If needed, the image whose pixel values have been reset may be further binarized, i.e. converted to black and white, for example by mapping grayscale values to 0 or 255 (a one-line sketch follows below).
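
For instance, a simple fixed-threshold binarization; the threshold of 128 here is an arbitrary illustration, since the disclosure does not prescribe how the binarization threshold is chosen.

```python
import numpy as np

def binarize(image: np.ndarray, threshold: int = 128) -> np.ndarray:
    """Map grayscale values to 0 or 255 (black and white) using a fixed threshold."""
    return np.where(image > threshold, 255, 0).astype(np.uint8)

```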
  • FIG. 6 is a schematic flowchart of an embodiment of the image edge extraction method of the present application.
  • The edge extraction method of this embodiment includes the following steps.
  • S31: Process the pixel values of the image with the above method, which resets the grayscale values of the image.
  • S32: Select pixels whose pixel values satisfy a preset condition as edge points of the image; for example, a threshold may be set and pixels whose grayscale values exceed the threshold are taken as edge points.
  • S33: Perform edge extraction based on the selected edge points to complete the edge detection of the image.
  • The edge points may be fitted to edge lines, or the lines connecting the edge points may be used as edge lines, thereby achieving edge extraction (see the sketch below).
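
A sketch of steps S31-S33 under assumptions: the image has already been processed by something like reset_pixels above, the preset condition is "grayscale value exceeds a threshold", and the edge line is illustrated by a simple straight-line fit through the edge points (only one of the options the text mentions; connecting the points or fitting curves would serve equally well).

```python
import numpy as np

def extract_edge_points(processed: np.ndarray, threshold: int) -> np.ndarray:
    """Return the (row, col) coordinates of pixels whose processed grayscale
    value exceeds the threshold; these are taken as the image's edge points."""
    rows, cols = np.nonzero(processed > threshold)
    return np.stack([rows, cols], axis=1)

def fit_edge_line(points: np.ndarray):
    """Fit a straight line col = a * row + b through the edge points,
    as a minimal stand-in for 'fitting the edge points to an edge line'."""
    a, b = np.polyfit(points[:, 0], points[:, 1], deg=1)
    return a, b

```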
  • The above methods can all be applied by an image processing device: their logic is expressed as a computer program and implemented by the image processing device.
  • FIG. 7 is a schematic structural diagram of an embodiment of the image processing device of the present application.
  • The image processing device 100 of this embodiment includes a processor 11 and a memory 12.
  • A computer program is stored in the memory 12, and the processor is configured to execute the computer program to implement the foregoing methods.
  • FIG. 8 is a schematic structural diagram of an embodiment of the storage medium of the present application.
  • The storage medium 200 of this embodiment stores a computer program that can be executed to implement the methods of the foregoing embodiments.
  • The storage medium 200 may be a USB flash drive, an optical disc, a server, or the like.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The present application discloses an image processing method. The method includes: generating an analysis window applied to an image; calculating the differences between the pixel value of the center pixel and the pixel values of the other pixels covered by the analysis window at its current position on the image; and resetting the pixel value of the center pixel according to the maximum of those differences. The method of the present application resets the pixel values of the image to highlight the difference between a center pixel and its neighboring pixels.

Description

Image processing method, edge extraction method, processing device and storage medium [Technical Field]
The present application relates to the field of image processing, and in particular to an image processing method, an edge extraction method, a processing device and a storage medium.
[Background]
Edge extraction is a basic operation in image processing and can be applied in different fields. In the industrial field, for example, edge extraction is used to inspect the surface quality of a workpiece: an image of the workpiece surface is acquired and edge extraction is performed on it to detect whether stains or scratches are present on the surface. In practice the stains on a workpiece surface are often faint, so the corresponding edges in the image are unclear and hard to detect.
[Summary]
The present application provides an image processing method, an edge extraction method, a processing device and a computer storage medium to solve the problem that edges in an image to be inspected are unclear and difficult to detect.
To solve the above technical problem, the present application provides an image processing method. The method includes: generating an analysis window applied to the image; calculating the differences between the pixel value of the center pixel and the pixel values of the other pixels covered by the analysis window at its current position on the image; and resetting the pixel value of the center pixel according to the maximum of the differences.
To solve the above technical problem, the present application provides an image edge extraction method. The method includes: processing the pixel values of the image with the above method; selecting pixels whose pixel values satisfy a preset condition as edge points of the image; and performing edge extraction based on the edge points.
To solve the above technical problem, the present application provides an image processing device. The device includes a processor and a memory; the memory stores a computer program, and the processor is configured to execute the computer program to implement the above method.
To solve the above technical problem, the present application provides a storage medium for storing a computer program; the computer program can be executed to implement the above method.
The method of the present application processes the pixel values of an image: it first generates an analysis window applied to the image, then calculates the differences between the pixel value of the center pixel and the pixel values of the other pixels covered by the analysis window at its current position on the image, and resets the pixel value of the center pixel according to the maximum of the differences, so that the reset pixel value reflects the maximum difference between that pixel and the other pixels. By resetting pixel values in this way, the present application highlights the maximum difference between a center pixel and its neighboring pixels, making the edges in the image clearer and easier to detect.
[Brief Description of the Drawings]
FIG. 1 is a schematic flowchart of an embodiment of the image processing method of the present application;
FIG. 2 is a schematic comparison of the pixel values within one analysis window before and after resetting, in the method shown in FIG. 1;
FIG. 3 is a schematic flowchart of another embodiment of the image processing method of the present application;
FIG. 4 is a schematic diagram of the azimuth angle, relative to the center pixel, of the other pixel corresponding to the maximum difference in the method shown in FIG. 3;
FIG. 5 is a schematic comparison of an image before and after processing with the method shown in FIG. 1 or FIG. 3;
FIG. 6 is a schematic flowchart of an embodiment of the image edge extraction method of the present application;
FIG. 7 is a schematic structural diagram of an embodiment of the image processing device of the present application;
FIG. 8 is a schematic structural diagram of an embodiment of the storage medium of the present application.
[Detailed Description]
The technical solutions of the present application are described clearly and completely below with reference to the embodiments of the present application and the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the present application, not all of them. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the scope of protection of the present application.
The processing method of the present application resets the pixel values of an image to highlight the difference between each pixel and its neighboring pixels. Specifically, an analysis window applied to the image is generated and neighborhood differences are computed for the center pixel within the area covered by the window; the maximum of these differences reflects the difference between the center pixel and its neighbors, so the pixel value of the center pixel is reset according to that maximum difference, and the pixel values of the image are reset by repeating these steps.
For the specific process, refer to FIG. 1, a schematic flowchart of an embodiment of the image processing method of the present application. The image processing method of this embodiment includes the following steps.
S11: Generate an analysis window applied to the image.
In step S11 an analysis window applied to the image is generated; that is, an analysis region is delimited for analyzing and computing on the image. Each round of analysis considers only the pixels covered by the analysis window, so the window limits the amount and range of the data analyzed: a larger window makes each analysis more comprehensive but slower, while a smaller window makes each analysis faster but less comprehensive.
The analysis window may be a rectangular window, a circular window, a fan-shaped window, and so on. In this embodiment the pixels of the image are analyzed, so a rectangular window is generally used; specifically, the analysis window is a rectangular window corresponding to (2n+1)×(2n+1) pixels, where n is an integer greater than or equal to 1. The rectangular analysis window then has a center point, and the pixels in its coverage area correspondingly include a center pixel.
S12: Calculate the differences between the pixel value of the center pixel and the pixel values of the other pixels covered by the analysis window at its current position on the image.
After the analysis window is generated on the image, the pixels covered by the window at its current position, i.e. at its current placement on the image, can be determined, and the covered pixels are then analyzed.
This embodiment processes pixel values to highlight the difference between a pixel and its neighboring pixels, so the method used in step S12 is to calculate the differences between the pixel value of the center pixel and the pixel values of the other pixels covered by the analysis window.
The analysis and calculation within the analysis window in this embodiment centers on the center pixel: the pixel-value differences between the center pixel and the other pixels are obtained, which reveals how the center pixel differs from its neighboring pixels.
S13: Reset the pixel value of the center pixel according to the maximum of the differences.
After the differences between the center pixel and the other pixels are obtained in step S12, the pixel value of the center pixel is reset so that the new pixel value reflects the difference between the center pixel and the other pixels. In step S13 the reset is based on the maximum of the differences, so that the new pixel value reflects the maximum difference between the center pixel and the other pixels.
Consider the example of FIG. 2, a schematic comparison of the pixel values within one analysis window before and after resetting in the method shown in FIG. 1, where the analysis window is a rectangular window corresponding to 3×3 pixels.
Before resetting, the pixel value of the center pixel is 45; its difference from the neighboring pixel whose value is 12 is the largest, and that maximum difference is 33. The pixel value of the center pixel is then reset according to the maximum difference of 33. In FIG. 2 the maximum difference is used directly as the new pixel value, so after resetting the pixel value of the center pixel becomes 33.
The process of steps S12-S13 resets the pixel value of one center pixel. In one implementation, when there is a stain on the workpiece surface that corresponds to a faint edge point in the image, resetting the pixel value of that edge point makes the edge point clearer.
In addition, the pixel values of the other pixels in FIG. 2 can also be reset, but not within this analysis window; each of them is reset when it becomes the center pixel of the area covered by the analysis window, which is achieved by the following step S14.
S14: Move the image and the analysis window relative to each other and return to step S12 until an end condition is satisfied.
With steps S12-S13, the analysis of one analysis window resets the pixel value of one center pixel, but to highlight the edge lines in an image the pixel values of many pixels must be reset. In this process the pixel-value differences are large in edge-line regions and small elsewhere, so replacing each pixel's own value with a difference value replaces the values in edge-line regions with larger difference values, thickening the edge lines, while the values in other regions are replaced with smaller difference values, which relatively deepens the edge lines and makes them more prominent in the image. In step S14 the image and the analysis window are moved relative to each other and the process returns to step S12 until the end condition is satisfied, so that the pixel values of multiple pixels are reset.
In this embodiment, the relative movement between the image and the analysis window may be moving the analysis window relative to the image along the pixel row or column direction with a step of one pixel. The end condition may be whether the analysis window has traversed the image or a predetermined region of the image; if the image or the predetermined region has been traversed, the end condition is satisfied.
At most, the predetermined region may be the whole image excluding its border pixels. Because the analysis window occupies a certain area and the computation within it only resets the pixel value of the center point, the pixels at the image border may not be processed with the analysis window; in that case the traversed predetermined region is the whole image minus the border pixels.
The predetermined region may also be a region set by the user. For example, when inspecting a workpiece the user can roughly tell where the edge lines are by observing the workpiece; to speed up image analysis, a specific region can be selected as the region to be processed and analyzed.
Specifically, the analysis window is moved along the pixel row direction of the image and the pixel values of one row of pixels are reset in turn; the window is then shifted by a step of one pixel along the pixel column direction and moved along the row direction again to reset the pixel values of the next row in turn. The analysis window and the image thus move relative to each other in an S-shaped pattern, and the pixel-reset process ends once the analysis window has traversed the image or the predetermined region of the image.
With the pixel-value processing method of this embodiment, an analysis window is placed on the image, the pixel-value differences between the center pixel and the other pixels covered by the window are calculated, and the pixel value of the center pixel is reset according to the maximum difference, thereby highlighting the difference between that pixel and the other pixels. According to this reset process, the same processing can be applied to the other pixels in the image, making the edge lines in the image clearer and easier to detect.
Refer to FIG. 3, a schematic flowchart of another embodiment of the image processing method of the present application. The processing method of this embodiment includes the following steps.
S21: Generate an analysis window applied to the image.
S22: Calculate the differences between the pixel value of the center pixel and the pixel values of the other pixels covered by the analysis window at its current position on the image.
Steps S21-S22 of this embodiment are substantially similar to steps S11-S12 of the previous embodiment and are not described again. After the pixel-value differences between the center pixel and the other pixels are calculated in step S22, the pixel value of the center pixel is likewise reset according to the maximum of the differences.
In this embodiment, however, the maximum difference is not used directly as the pixel value of the center pixel; instead, the pixel value of the center pixel is reset after an angle factor of the maximum difference is taken into account, which allows interfering edge lines to be excluded. For example, in industrial inspection the lighting angle when acquiring the image of the workpiece surface may introduce interfering edge lines into the image; when processing the pixel values, an angle factor can therefore additionally be taken into account in the reset pixel value to eliminate the interference. This is implemented by the following steps S23-S25.
S23: Obtain the azimuth angle, relative to the center pixel, of the other pixel corresponding to the maximum of the differences.
After the differences are calculated in step S22, the maximum difference is obtained and the other pixel corresponding to it is identified. In this step the azimuth angle of that pixel relative to the center pixel is obtained; this azimuth angle represents the angle factor of the maximum difference.
See, for example, FIG. 4, a schematic diagram of the azimuth angle, relative to the center pixel, of the other pixel corresponding to the maximum difference in the method shown in FIG. 3.
In FIG. 4 the analysis window is a rectangular window corresponding to 5×5 pixels, and the azimuth angle, relative to the center pixel, of the other pixel corresponding to the maximum difference is α. Obtaining the azimuth angle requires a preset reference coordinate system, namely the coordinate system X-Y in FIG. 4; the azimuth angle is the counterclockwise deflection, relative to the X axis, of the line connecting the center pixel and the other pixel, such as the angle α shown in FIG. 4.
The azimuth angle can be determined from the pixel offset of the other pixel relative to the center pixel in the X-Y coordinate system. In FIG. 4, for example, the pixel offset of the other pixel relative to the center pixel is (x=1, y=-2), so the angle can be obtained from α=arctan(y/x) together with the signs of x and y; in FIG. 4, α is 296.57°.
S24: Obtain the angle factor of the maximum difference from the azimuth angle.
After the azimuth angle is obtained, it is converted into an angle factor in step S24, for example by performing a predetermined calculation on the azimuth angle, or by taking a trigonometric function value of the azimuth angle or the radian value corresponding to it and using that trigonometric function value or radian value as the angle factor.
In this embodiment, after the azimuth angle is obtained it is divided by 360° to obtain the angle factor, so that the angle factor lies between 0 and 1; when the maximum difference is multiplied by the angle factor in the subsequent step, the resulting value can fall between 0 and 225, ensuring that it can appear in the image as a pixel value.
S25: Determine the pixel value of the center pixel from the maximum difference and the angle factor.
The value obtained by multiplying the maximum difference by the angle factor is used as the pixel value of the center pixel; that is, in step S25 the pixel value of the center pixel is determined from the maximum difference and its angle factor.
S26: Move the image and the analysis window relative to each other and return to step S22 until the end condition is satisfied.
Step S26 is similar to step S14 of the previous embodiment and is not described again; it likewise resets the pixel values of multiple pixels.
In this embodiment the pixel values are reset according to each pixel's maximum difference and its angle factor, and this reset is applied to the image or to a preset region of the image; that is, the reset values of all processed pixels take both the maximum difference and the angle factor into account, so the difference between each pixel and its neighboring pixels can be reflected more accurately in the presence of interference.
The pixel-value processing methods proposed in the above two embodiments can highlight the difference between a pixel and the other pixels of an image, as shown in FIG. 5, a schematic comparison of an image before and after processing with the method shown in FIG. 1 or FIG. 3. When the image in FIG. 5 is processed, the pixel values analyzed and calculated are grayscale values; the edge lines of the processed image in FIG. 5 are clearly more distinct than those of the image before processing.
For the pixel-value processing of the image, if needed, the image whose pixel values have been reset may be further binarized, i.e. converted to black and white, for example by mapping grayscale values to 0 or 255.
After the pixel-value processing described in the above embodiments, the edge lines in the image are easier to detect. The present application also proposes an image edge extraction method; refer to FIG. 6, a schematic flowchart of an embodiment of the image edge extraction method of the present application. The edge extraction method of this embodiment includes the following steps.
S31: Process the pixel values of the image.
Processing the pixel values of the image with the above method resets the grayscale values of the image.
S32: Select pixels whose pixel values satisfy a preset condition as edge points of the image.
In step S32 a threshold may be set, and pixels whose grayscale values exceed the threshold are taken as edge points.
S33: Perform edge extraction based on the edge points.
Finally, edge extraction is performed on the selected edge points to complete the edge detection of the image. For example, the edge points may be fitted to edge lines, or the lines connecting the edge points may be used as edge lines, thereby achieving edge extraction.
All of the above methods can be applied by an image processing device: their logic is expressed as a computer program and implemented by the image processing device.
Refer to FIG. 7, a schematic structural diagram of an embodiment of the image processing device of the present application. The image processing device 100 of this embodiment includes a processor 11 and a memory 12; the memory 12 stores a computer program, and the processor is configured to execute the computer program to implement the above methods.
Refer to FIG. 8, a schematic structural diagram of an embodiment of the storage medium of the present application. The storage medium 200 of this embodiment stores a computer program that can be executed to implement the methods of the above embodiments; the storage medium 200 may be a USB flash drive, an optical disc, a server, or the like.
The above is only an implementation of the present application and does not thereby limit the scope of its patent. Any equivalent structural or process transformation made using the contents of the specification and drawings of the present application, whether applied directly or indirectly in other related technical fields, is likewise included within the scope of patent protection of the present application.

Claims (12)

  1. An image processing method, characterized in that the method comprises:
    generating an analysis window applied to the image;
    calculating the differences between the pixel value of the center pixel and the pixel values of the other pixels covered by the analysis window at its current position on the image;
    resetting the pixel value of the center pixel according to the maximum of the differences.
  2. The method according to claim 1, characterized in that the method further comprises:
    moving the image and the analysis window relative to each other and returning to the step of calculating the differences between the pixel value of the center pixel and the pixel values of the other pixels covered by the analysis window at its current position, until a predetermined region of the image has been traversed.
  3. The method according to claim 1, characterized in that the analysis window is a rectangular window corresponding to (2n+1)×(2n+1) pixels, where n is an integer greater than or equal to 1.
  4. The method according to claim 1, characterized in that the step of resetting the pixel value of the center pixel according to the maximum of the differences comprises:
    using the maximum of the differences as the pixel value of the center pixel.
  5. The method according to claim 1, characterized in that the step of resetting the pixel value of the center pixel according to the maximum of the differences comprises:
    obtaining the azimuth angle, relative to the center pixel, of the other pixel corresponding to the maximum difference;
    obtaining an angle factor of the maximum difference from the azimuth angle;
    determining the pixel value of the center pixel from the maximum difference and the angle factor.
  6. The method according to claim 5, characterized in that obtaining the angle factor of the maximum difference from the azimuth angle comprises:
    using a trigonometric function value of the azimuth angle as the angle factor; or using the radian value corresponding to the azimuth angle as the angle factor.
  7. The method according to claim 1, characterized in that the step of moving the image and the analysis window relative to each other comprises:
    moving the analysis window relative to the image along the row direction or column direction of the pixels with a step of one pixel.
  8. The method according to claim 1, characterized in that the end condition is whether the analysis window has traversed the image or a predetermined region of the image; if the image or the predetermined region of the image has been traversed, the end condition is satisfied.
  9. The method according to claim 1, characterized in that the pixel value is a grayscale value.
  10. An image edge extraction method, characterized in that the method comprises:
    processing the pixel values of the image with the method according to any one of claims 1-9;
    selecting pixels whose pixel values satisfy a preset condition as edge points of the image;
    performing edge extraction based on the edge points.
  11. An image processing device, characterized in that the device comprises a processor and a memory, the memory storing a computer program, and the processor being configured to execute the computer program to implement the method according to any one of claims 1-10.
  12. A storage medium, characterized in that the storage medium is configured to store a computer program, and the computer program can be executed to implement the method according to any one of claims 1-10.
PCT/CN2018/104900 2018-09-10 2018-09-10 Image processing method, edge extraction method, processing device and storage medium WO2020051750A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2018/104900 WO2020051750A1 (zh) 2018-09-10 2018-09-10 Image processing method, edge extraction method, processing device and storage medium
CN201880087313.1A CN111630565B (zh) 2018-09-10 2018-09-10 Image processing method, edge extraction method, processing device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/104900 WO2020051750A1 (zh) 2018-09-10 2018-09-10 Image processing method, edge extraction method, processing device and storage medium

Publications (1)

Publication Number Publication Date
WO2020051750A1 true WO2020051750A1 (zh) 2020-03-19

Family

ID=69776933

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/104900 WO2020051750A1 (zh) 2018-09-10 2018-09-10 Image processing method, edge extraction method, processing device and storage medium

Country Status (2)

Country Link
CN (1) CN111630565B (zh)
WO (1) WO2020051750A1 (zh)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112215893B (zh) * 2020-10-28 2022-10-28 安徽农业大学 Method, apparatus and device for determining the two-dimensional center coordinates of a target, and ranging system
CN115802056B (zh) * 2023-01-31 2023-05-05 南通凯沃智能装备有限公司 User data compression and storage method for a mobile terminal


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101308573A (zh) * 2008-06-30 2008-11-19 北京中星微电子有限公司 Noise elimination method and device
JP2012123479A (ja) 2010-12-06 2012-06-28 Nanao Corp Edge direction detection device or method thereof
CN105069807A (zh) * 2015-08-28 2015-11-18 西安工程大学 Defect detection method for stamped workpieces based on image processing
CN106210712A (zh) * 2016-08-11 2016-12-07 上海大学 Image dead pixel detection and processing method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113255704A (zh) * 2021-07-13 2021-08-13 中国人民解放军国防科技大学 Pixel-difference convolution edge detection method based on local binary patterns
CN113255704B (zh) * 2021-07-13 2021-09-24 中国人民解放军国防科技大学 Pixel-difference convolution edge detection method based on local binary patterns

Also Published As

Publication number Publication date
CN111630565B (zh) 2024-03-01
CN111630565A (zh) 2020-09-04

Similar Documents

Publication Publication Date Title
CN111630563B (zh) Image edge detection method, image processing device and computer storage medium
JP6934026B2 (ja) System and method for detecting lines in a vision system
JP5699788B2 (ja) Screen area detection method and system
WO2020051750A1 (zh) Image processing method, edge extraction method, processing device and storage medium
CN107515714B (zh) Finger touch recognition method and apparatus, and touch projection device
KR101032446B1 (ko) Apparatus and method for detecting vertices in an image
CN110268222B (zh) Three-dimensional shape measurement device, three-dimensional shape measurement method and storage medium
CN110308817B (zh) Touch action recognition method and touch projection system
WO2018163530A1 (ja) Three-dimensional shape measurement device, three-dimensional shape measurement method, and program
CN109986201B (zh) Weld seam tracking and detection method and apparatus, storage medium, and laser welding device
US10074551B2 (en) Position detection apparatus, position detection method, information processing program, and storage medium
CN116958145B (zh) Image processing method and apparatus, visual inspection system and electronic device
US10386930B2 (en) Depth determining method and depth determining device of operating body
CN105049706A (zh) Image processing method and terminal
CN116109572A (zh) Method and apparatus for detecting weak defects at workpiece edges, and electronic device
CN116542979B (zh) Prediction correction method based on image measurement, and terminal
KR20100034500A (ko) Structure inspection system using an image deblurring technique, and method thereof
WO2018082498A1 (en) Mid-air finger pointing detection for device interaction
JP2018072115A (ja) Line width measurement method, line width measurement program, storage medium and information processing apparatus
CN116342585A (zh) Product defect detection method, apparatus, device and storage medium
Will et al. Features for image processing of OCT images for seam tracking applications in laser welding
CN116416227A (zh) Background image processing method and apparatus
JP6855938B2 (ja) Distance measurement device, distance measurement method and distance measurement program
US9754374B1 (en) Detection of a sheet of light based on second derivative
JP4840822B2 (ja) Image processing method, apparatus and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18933535

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 13.08.2021)

122 Ep: pct application non-entry in european phase

Ref document number: 18933535

Country of ref document: EP

Kind code of ref document: A1