CN111630565B - Image processing method, edge extraction method, processing apparatus, and storage medium - Google Patents


Info

Publication number
CN111630565B
CN111630565B (application CN201880087313.1A)
Authority
CN
China
Prior art keywords
pixel
image
value
analysis window
points
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201880087313.1A
Other languages
Chinese (zh)
Other versions
CN111630565A (en)
Inventor
阳光
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Paitian Robot Technology Co ltd
Original Assignee
Shenzhen Paitian Robot Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Paitian Robot Technology Co ltd filed Critical Shenzhen Paitian Robot Technology Co ltd
Publication of CN111630565A publication Critical patent/CN111630565A/en
Application granted granted Critical
Publication of CN111630565B publication Critical patent/CN111630565B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/73: Deblurring; Sharpening
    • G06T5/20: Image enhancement or restoration using local operators
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/20: Special algorithmic details
    • G06T2207/20172: Image enhancement details
    • G06T2207/20192: Edge enhancement; Edge preservation

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses an image processing method comprising the following steps: generating an analysis window applied to an image; calculating the differences between the pixel value of the central pixel and the pixel values of the other pixels covered by the analysis window at its current position on the image; and resetting the pixel value of the central pixel according to the largest of those differences. Resetting the pixel values of the image in this way highlights the difference between each central pixel and its neighboring pixels.

Description

Image processing method, edge extraction method, processing apparatus, and storage medium
Technical Field
The present invention relates to the field of image processing, and in particular, to an image processing method, an edge extraction method, a processing device, and a storage medium.
Background
Edge extraction is a basic operation in image processing with applications in many fields. In the industrial field, for example, edge extraction is used to inspect the surface quality of a workpiece: an image of the workpiece surface is acquired, and edge extraction is performed on it to detect whether stains or scratches are present. In practice, stains on the workpiece surface are often faint, so the corresponding edges in the image of the workpiece surface are unclear and difficult to detect.
Disclosure of Invention
The application provides an image processing method, an edge extraction method, a processing device, and a computer storage medium to address the problem that edges in an image to be inspected are unclear and difficult to detect.
To solve the above technical problem, the present application provides an image processing method comprising: generating an analysis window applied to an image; calculating the differences between the pixel value of the central pixel and the pixel values of the other pixels covered by the analysis window at its current position on the image; and resetting the pixel value of the central pixel according to the largest of those differences.
To solve the above technical problem, the present application provides an edge extraction method for an image, comprising: processing the pixel values of the image using the method described above; screening pixels whose pixel values satisfy a preset condition as edge points of the image; and performing edge extraction based on the edge points.
To solve the above technical problem, the present application provides an image processing device comprising a processor and a memory in which a computer program is stored; the processor is configured to execute the computer program to implement the above method.
To solve the above technical problem, the present application provides a storage medium storing a computer program that can be executed to implement the above method.
In the method of the present application, the pixel values of an image are processed as follows: an analysis window applied to the image is first generated; the differences between the pixel value of the central pixel and those of the other pixels covered by the window's current position on the image are then calculated; and the pixel value of the central pixel is reset according to the largest of those differences, so that the reset value represents the maximum difference between that pixel and the other pixels. Resetting pixel values in this way highlights the maximum difference between each central pixel and its neighboring pixels, making the edges in the image clearer and easier to detect.
Drawings
FIG. 1 is a flow chart of an embodiment of an image processing method of the present application;
FIG. 2 is a schematic diagram showing the comparison between the pixel values of pixels in an analysis window and the pixel values before and after the pixel values are reset in the method of FIG. 1;
FIG. 3 is a flow chart of another embodiment of an image processing method of the present application;
FIG. 4 is a schematic view illustrating azimuth angles of other pixel points corresponding to the maximum difference in the method shown in FIG. 3 with respect to the center pixel point;
FIG. 5 is a schematic diagram showing a comparison of the images before and after processing by the method shown in FIG. 1 or FIG. 3;
FIG. 6 is a flow chart of an embodiment of an edge extraction method of an image of the present application;
FIG. 7 is a schematic view of a structure of an embodiment of an image processing apparatus of the present application;
FIG. 8 is a schematic diagram illustrating the structure of an embodiment of a storage medium of the present application.
Detailed Description
The technical solutions of the present application will be described clearly and completely below with reference to the embodiments of the present application and the accompanying drawings. The described embodiments are obviously only some, not all, of the embodiments of the present application. All other embodiments obtained by one of ordinary skill in the art from the present disclosure without inventive effort fall within the scope of the present disclosure.
The processing method resets the pixel values of an image to highlight the difference between each pixel and its neighboring pixels. Specifically, an analysis window applied to the image is generated, and the differences between the central pixel within the window's coverage and its neighbors are calculated. The largest of these differences represents the difference between the central pixel and its neighbors, so the pixel value of the central pixel is reset according to that maximum difference. The pixel values across the image are reset by repeating these steps.
Referring to fig. 1, fig. 1 is a schematic flow chart of an embodiment of an image processing method of the present application. The image processing method of the present embodiment includes the following steps.
S11: an analysis window is generated that is applied to the image.
In step S11, an analysis window applied to the image is generated; that is, an analysis region is delimited for the analysis calculations on the image. In each calculation, only the pixels covered by the analysis window are analyzed, so the window bounds the size and extent of the analyzed data: a larger window makes each analysis more comprehensive but slower, while a smaller window makes each analysis faster but less comprehensive.
The analysis window may be rectangular, circular, fan-shaped, and so on. Since the pixels of an image are to be analyzed, a rectangular window is generally adopted. In this embodiment, the analysis window is specifically a rectangular window covering (2n+1)×(2n+1) pixels, where n is an integer greater than or equal to 1. Such a window has a center point, and the pixel at the center of its coverage is the central pixel.
S12: calculate the differences between the pixel value of the central pixel and the pixel values of the other pixels covered by the analysis window at its current position on the image.
After the analysis window is generated on the image, the pixels covered by the window at its currently set position on the image can be determined, and the covered pixels are then analyzed.
In this embodiment, the pixel values are processed to emphasize the difference between each pixel and its neighboring pixels, so step S12 calculates the differences between the pixel value of the central pixel and those of the other pixels covered by the analysis window.
The analysis within the window takes the central pixel as the reference point, yielding the pixel-value differences between the central pixel and the other pixels, i.e., the differences between the central pixel and its neighbors.
S13: reset the pixel value of the central pixel according to the largest of the differences.
Once the differences between the central pixel and the other pixels are known from step S12, the pixel value of the central pixel is reset so that the new value reflects those differences. In step S13 the reset uses the maximum difference, so the new pixel value represents the largest difference between the central pixel and the other pixels.
For example, fig. 2 compares the pixel values within an analysis window before and after the reset performed by the method of fig. 1, where the analysis window is a rectangular window covering 3×3 pixels.
Before the reset, the pixel value of the central pixel is 45, and its difference from the neighboring pixel with value 12 is the largest, namely 33. The pixel value of the central pixel is then reset based on this maximum difference of 33. In fig. 2, the maximum difference is taken directly as the new value, so after the reset the central pixel's value becomes 33.
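The reset illustrated by figs. 1-2 can be sketched in Python (an illustrative rendition, not the patent's implementation). Only the center value 45 and the neighbor value 12 come from the text; the remaining 3×3 values below are made up for illustration, and the signed center-minus-neighbor reading of "difference" is an assumption consistent with the worked example (45 - 12 = 33).

```python
import numpy as np

def reset_center(patch: np.ndarray) -> int:
    """Return the new value of the patch's central pixel: the largest
    difference between the central value and any other covered pixel."""
    n = patch.shape[0] // 2
    diffs = (int(patch[n, n]) - patch.astype(int)).ravel()
    diffs = np.delete(diffs, diffs.size // 2)  # exclude the center itself
    return int(diffs.max())

# 3x3 analysis window as in Fig. 2: center 45, one neighbor 12 (others assumed)
patch = np.array([[40, 44, 38],
                  [12, 45, 43],
                  [41, 39, 42]])
print(reset_center(patch))  # 45 - 12 = 33
```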
Steps S12-S13 reset the pixel value of one central pixel. In one embodiment, when there is a stain on the workpiece surface, the stain corresponds to faint edge points in the image; resetting the pixel values at those edge points makes them clearer.
The pixel values of the other pixels in fig. 2 may also be reset, not within the current analysis window, but when each of them in turn becomes the central pixel of the window's coverage, as implemented in step S14 below.
S14: move the image and the analysis window relative to each other and return to step S12 until an end condition is satisfied.
Steps S12-S13 reset the pixel value of a single central pixel per analysis-window calculation, but to highlight an edge line in an image, the values of many pixels must be reset. In this process, the pixel-value differences in the edge-line region are large and those elsewhere are small; replacing pixel values with these differences therefore assigns large values along the edge line, thickening it, and small values elsewhere, deepening the relative contrast, so the edge line in the image becomes more pronounced. In step S14, the image and the analysis window are moved relative to each other and the method returns to step S12 until the end condition is satisfied, thereby resetting the pixel values of multiple pixels.
In this embodiment, the relative movement between the image and the analysis window may consist of moving the window along the row or column direction in steps of one pixel. The end condition may be whether the analysis window has traversed the image or a predetermined area of it; once it has, the end condition is satisfied.
The predetermined area may be the whole image excluding its border pixels: because the window occupies a certain area and each calculation resets only the value of its center, pixels at the border of the image cannot be covered by a full window, so the traversed area is the whole image minus the border pixels.
The predetermined area may also be set by the user. For example, when inspecting a workpiece, the user may know the position of the edge line from observing the workpiece, and may select a specific region for image processing in order to speed up the analysis.
Specifically, the analysis window is moved along the pixel-row direction of the image, resetting the pixel values of one row in sequence; it is then shifted by one pixel in the column direction and moved along the row direction again to reset the next row. The window and the image thus move relative to each other in an S-shaped path, and the reset process ends when the window has traversed the image or the predetermined area.
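The traversal of step S14 can be sketched as follows. This is an illustrative Python rendition, not the patent's implementation: the row-major sweep below visits the same interior pixels as the S-shaped path (only the visiting order differs), and leaving border pixels at zero is one possible treatment of the skipped edge pixels.

```python
import numpy as np

def reset_image(img: np.ndarray, n: int = 1) -> np.ndarray:
    """Slide a (2n+1)x(2n+1) analysis window over the image interior and
    replace each central pixel with its largest difference to the other
    covered pixels.  Border pixels, which cannot carry a full window,
    are left at 0 here."""
    h, w = img.shape
    src = img.astype(np.int64)
    out = np.zeros((h, w), dtype=np.int64)
    for r in range(n, h - n):            # traverse rows of the interior
        for c in range(n, w - n):        # one-pixel step along each row
            d = (src[r, c] - src[r - n:r + n + 1, c - n:c + n + 1]).ravel()
            d = np.delete(d, d.size // 2)   # drop the center-vs-center term
            out[r, c] = d.max()
    return out
```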
In the image pixel-value processing method of this embodiment, an analysis window is applied to an image, the differences between the central pixel and the other pixels it covers are calculated, and the pixel value of the central pixel is reset according to the largest difference, thereby highlighting the difference between that pixel and the other pixels. The same processing can be applied to the other pixels of the image following this reset procedure, making the edge lines in the image clearer and easier to detect.
Referring to fig. 3, fig. 3 is a flowchart illustrating another embodiment of an image processing method according to the present application. The processing method of the embodiment comprises the following steps.
S21: an analysis window is generated that is applied to the image.
S22: and calculating pixel value difference values of the central pixel point and other pixel points in the pixel points covered by the current position of the analysis window on the image.
Steps S21-S22 of this embodiment are substantially similar to steps S11-S12 of the previous embodiment and are not described again. After the differences between the pixel values of the central pixel and the other pixels are calculated in step S22, the pixel value of the central pixel is reset according to the largest of the differences.
In this embodiment, however, the maximum difference is not used directly as the new pixel value; instead, an angle factor of the maximum difference is taken into account before resetting, which makes it possible to suppress interfering edge lines. For example, in industrial inspection, the lighting angle during acquisition of a workpiece-surface image may introduce interfering edge lines into the image; adding an angle factor to the reset pixel value helps exclude this interference, as implemented in steps S23-S25 below.
S23: obtain the azimuth angle, relative to the central pixel, of the other pixel corresponding to the largest of the differences.
After the differences are calculated in step S22, the maximum difference and the other pixel corresponding to it are identified. In this step, the azimuth angle of that pixel relative to the central pixel is obtained; this azimuth angle represents the angle factor of the maximum difference.
For example, fig. 4 illustrates the azimuth angle, relative to the central pixel, of the other pixel corresponding to the maximum difference in the method of fig. 3.
In fig. 4, the analysis window is a rectangular window covering 5×5 pixels, and the azimuth angle of the pixel corresponding to the maximum difference is α. The azimuth angle is measured against a preset reference frame, namely the X-Y coordinates in fig. 4: it is the counterclockwise deflection, relative to the X axis, of the line from the central pixel to the other pixel, e.g. the angle α shown in fig. 4.
The azimuth angle can be determined from the pixel offset of the other pixel relative to the central pixel in the X-Y frame. In fig. 4, for example, the offset is (x = 1, y = -2), so α can be obtained from α = arctan(y/x) together with the signs of x and y; in fig. 4, α is 296.57°.
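The azimuth computation can be sketched with a quadrant-aware arctangent. This is an illustrative reading of fig. 4: the counterclockwise-from-X convention comes from the text, while the use of `atan2` (rather than arctan plus manual sign handling) is an assumption.

```python
import math

def azimuth_deg(dx: float, dy: float) -> float:
    """Counterclockwise angle in [0, 360) of the offset (dx, dy) from
    the +X axis, matching the reference frame of Fig. 4."""
    return math.degrees(math.atan2(dy, dx)) % 360.0

# Offset of the maximum-difference pixel in Fig. 4: (x = 1, y = -2)
print(round(azimuth_deg(1, -2), 2))  # 296.57
```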
S24: obtain the angle factor of the maximum difference from the azimuth angle.
After the azimuth angle is obtained, step S24 converts it into an angle factor, for example by applying a predetermined calculation to the azimuth angle, or by taking a trigonometric function value of the azimuth angle, or its radian value, as the angle factor.
In this embodiment, the azimuth angle is divided by 360° to obtain the angle factor, so that the factor lies between 0 and 1. The value obtained by multiplying the maximum difference by this factor in the subsequent step then falls between 0 and 255, ensuring that it can be represented in the image as a pixel value.
S25: determine the pixel value of the central pixel from the maximum difference and the angle factor.
In step S25, the value obtained by multiplying the maximum difference by its angle factor is used as the new pixel value of the central pixel; that is, the pixel value of the central pixel is determined from the maximum difference and the angle factor of the maximum difference.
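Steps S24-S25 can be sketched as below (illustrative; the azimuth/360 factor is the embodiment's stated choice, while the function name and sample values are carried over from the earlier worked examples):

```python
def reset_with_angle(max_diff: float, azimuth: float) -> float:
    """New central pixel value: the maximum difference scaled by the
    angle factor azimuth/360, which keeps the result within 0-255 when
    the difference itself is a gray value."""
    factor = azimuth / 360.0          # angle factor in [0, 1)
    return max_diff * factor

# Max difference 33 (Fig. 2) with azimuth 296.57 degrees (Fig. 4)
print(round(reset_with_angle(33, 296.57), 2))  # 27.19
```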
S26: the image and the analysis window are moved relatively and returned to step S22 until the end condition is satisfied.
Step S26 is similar to step S14 of the previous embodiment and is not described again; it likewise resets the pixel values of multiple pixels.
In this embodiment, pixel values are reset according to each pixel's maximum difference and its angle factor, over the whole image or a predetermined area of it. Since every reset value takes both the maximum difference and the angle factor into account, the difference between each pixel and its neighbors can be reflected more accurately in the presence of interference.
The pixel-value processing methods of the two embodiments above highlight the differences between each pixel and the other pixels in an image. Fig. 5, for example, compares an image before and after processing by the method of fig. 1 or fig. 3. The pixel values analyzed for the image in fig. 5 are gray values, and the edge line of the processed image is clearly more pronounced than before processing.
If necessary, the image can additionally be binarized after the pixel-value processing, i.e., converted to black and white, for example by mapping each gray value to 0 or 255.
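A minimal binarization sketch follows; the threshold of 128 is an assumption for illustration, since the text fixes only the output values 0 and 255.

```python
import numpy as np

def binarize(gray: np.ndarray, threshold: int = 128) -> np.ndarray:
    """Map each gray value to 0 or 255 around the given threshold."""
    return np.where(gray >= threshold, 255, 0).astype(np.uint8)

print(binarize(np.array([12, 130, 254])).tolist())  # [0, 255, 255]
```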
After the pixel-value processing described in the above embodiments, edge lines in the image are easier to detect. Referring to fig. 6, fig. 6 is a schematic flow chart of an embodiment of the image edge extraction method of the present application. The edge extraction method of this embodiment includes the following steps.
S31: process the pixel values of the image.
The pixel values of the image are processed using the method described above, which may reset the gray values of the image.
S32: screen pixels whose pixel values satisfy a preset condition as edge points of the image.
In step S32, a threshold may be set, and pixels whose gray value exceeds the threshold are taken as edge points.
S33: perform edge extraction based on the edge points.
Finally, edge extraction is performed on the screened edge points to complete edge detection of the image. For example, the edge points can be fitted to edge lines, or the lines connecting the edge points can be taken as edge lines, thereby realizing edge extraction.
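Steps S32-S33 can be sketched as follows (illustrative only; the threshold value and the representation of edge points as (row, col) coordinates are assumptions, and the fitting of points into lines is left out):

```python
import numpy as np

def edge_points(gray: np.ndarray, threshold: int) -> list:
    """Screen pixels whose (reset) gray value exceeds the threshold and
    return their (row, col) coordinates as edge points (step S32).
    Step S33 would then fit or connect these points into edge lines."""
    rows, cols = np.nonzero(gray > threshold)
    return list(zip(rows.tolist(), cols.tolist()))

print(edge_points(np.array([[0, 200], [30, 180]]), 100))  # [(0, 1), (1, 1)]
```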
The above methods may be implemented by an image processing device of the present application, with the logical process embodied in a computer program executed by that device.
Referring to fig. 7, fig. 7 is a schematic structural diagram of an embodiment of an image processing device of the present application. The image processing device 100 of this embodiment includes a processor 11 and a memory 12. The memory 12 stores a computer program, and the processor 11 executes the program to implement the above-described method.
Referring to fig. 8, fig. 8 is a schematic structural diagram of an embodiment of a storage medium of the present application. The storage medium 200 of this embodiment stores a computer program that can be executed to implement the method of the above embodiments; the storage medium 200 may be a USB flash drive, an optical disk, a server, or the like.
The foregoing description covers only embodiments of the present application and is not intended to limit the scope of the patent; all equivalent structures or equivalent processes made using the description of the present application, whether applied directly or in other related technical fields, are likewise included within the scope of patent protection of the present application.

Claims (11)

1. An image processing method, the method comprising:
generating an analysis window applied to the image;
calculating the difference value between the pixel value of the central pixel point and the pixel value of other pixel points in the pixel points covered by the current position of the analysis window on the image;
and resetting the pixel value of the central pixel point according to the maximum difference value in the difference values.
2. The method according to claim 1, wherein the method further comprises:
and relatively moving the image and the analysis window, and returning to the step of calculating the pixel value difference value between the central pixel point and other pixel points in the pixel points covered by the current position of the analysis window until the preset area of the image is traversed.
3. The method of claim 1, wherein the analysis window is a rectangular window corresponding to (2n+1) x (2n+1) pixels, the n being an integer greater than or equal to 1.
4. The method of claim 1, wherein the step of resetting the pixel value of the center pixel point according to the largest one of the differences comprises:
and taking the maximum difference value in the difference values as the pixel value of the central pixel point.
5. The method of claim 1, wherein the step of resetting the pixel value of the center pixel point according to the largest one of the differences comprises:
acquiring azimuth angles of other pixel points corresponding to the maximum difference value relative to the central pixel point;
obtaining an angle factor of the maximum difference value according to the azimuth angle;
and determining the pixel value of the central pixel point according to the maximum difference value and the angle factor.
6. The method of claim 5, wherein said obtaining an angle factor of the maximum difference from the azimuth angle comprises:
taking the trigonometric function value of the azimuth angle as the angle factor; or taking the radian value corresponding to the azimuth angle as the angle factor.
7. The method of claim 2, wherein the step of relatively moving the image and the analysis window comprises:
and moving the analysis window relative to the image along the row direction or the column direction of the pixels by taking one pixel point as a step length.
8. The method of claim 1, wherein the pixel values are gray scale values.
9. A method of edge extraction of an image, the method comprising:
processing pixel values of the image using the method of any one of claims 1-8;
screening pixel points of which the pixel values meet preset conditions as edge points of the image;
and extracting the edge based on the edge points.
10. An image processing apparatus, characterized in that the apparatus comprises a processor and a memory, the memory having stored therein a computer program for executing the computer program to implement the method of any of claims 1-9.
11. A storage medium storing a computer program executable by a processor to implement the method of any one of claims 1-9.
CN201880087313.1A 2018-09-10 2018-09-10 Image processing method, edge extraction method, processing apparatus, and storage medium Active CN111630565B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/104900 WO2020051750A1 (en) 2018-09-10 2018-09-10 Image processing method, edge extracting method, processing device, and storage medium

Publications (2)

Publication Number / Publication Date
CN111630565A (en): 2020-09-04
CN111630565B (en): 2024-03-01

Family

ID=69776933

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880087313.1A Active CN111630565B (en) 2018-09-10 2018-09-10 Image processing method, edge extraction method, processing apparatus, and storage medium

Country Status (2)

Country Link
CN (1) CN111630565B (en)
WO (1) WO2020051750A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112215893B (en) * 2020-10-28 2022-10-28 安徽农业大学 Method, device and equipment for determining target two-dimensional center coordinate point and ranging system
CN113255704B (en) * 2021-07-13 2021-09-24 中国人民解放军国防科技大学 Pixel difference convolution edge detection method based on local binary pattern
CN115802056B (en) * 2023-01-31 2023-05-05 南通凯沃智能装备有限公司 User data compression storage method for mobile terminal

Citations (3)

Publication number Priority date Publication date Assignee Title
CN101308573A (en) * 2008-06-30 2008-11-19 北京中星微电子有限公司 Method and apparatus for eliminating noise
JP2012123479A (en) * 2010-12-06 2012-06-28 Nanao Corp Edge direction detection device or method of the same
CN105069807A (en) * 2015-08-28 2015-11-18 西安工程大学 Punched workpiece defect detection method based on image processing

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
CN106210712B (en) * 2016-08-11 2018-07-10 上海大学 A kind of dead pixel points of images detection and processing method


Also Published As

Publication number Publication date
WO2020051750A1 (en) 2020-03-19
CN111630565A (en) 2020-09-04


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 518000, Building A, Building 1, Shenzhen International Innovation Valley, Dashi 1st Road, Xili Community, Xili Street, Nanshan District, Shenzhen City, Guangdong Province 1701

Applicant after: Shenzhen Paitian Robot Technology Co.,Ltd.

Address before: 518063 23 Floor (Room 2303-2306) of Desai Science and Technology Building, Yuehai Street High-tech Zone, Nanshan District, Shenzhen City, Guangdong Province

Applicant before: SHENZHEN A&E INTELLIGENT TECHNOLOGY INSTITUTE Co.,Ltd.

GR01 Patent grant