CN112581374A - Speckle sub-pixel center extraction method, system, device and medium - Google Patents


Publication number
CN112581374A
CN112581374A (application CN201910931509.0A)
Authority
CN
China
Prior art keywords
speckle
center
pixel
image
region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910931509.0A
Other languages
Chinese (zh)
Inventor
丁海飞
黄龙祥
朱力
吕方璐
汪博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Guangjian Technology Co Ltd
Original Assignee
Shenzhen Guangjian Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Guangjian Technology Co Ltd filed Critical Shenzhen Guangjian Technology Co Ltd
Priority to CN201910931509.0A priority Critical patent/CN112581374A/en
Publication of CN112581374A publication Critical patent/CN112581374A/en
Pending legal-status Critical Current

Classifications

    • G06T5/70 Denoising; Smoothing (G PHYSICS > G06 COMPUTING; CALCULATING OR COUNTING > G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL > G06T5/00 Image enhancement or restoration)
    • G06T7/11 Region-based segmentation (G06T7/00 Image analysis > G06T7/10 Segmentation; Edge detection)
    • G06T7/136 Segmentation; Edge detection involving thresholding (G06T7/00 Image analysis > G06T7/10 Segmentation; Edge detection)

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a speckle sub-pixel center extraction method, system, device, and medium. The method comprises the following steps: acquiring a speckle image and applying Gaussian filtering to remove background noise from it; performing locally adaptive binarization on the denoised speckle image to generate a binarized image; traversing each pixel of the binarized image and marking connected domains to determine the speckle regions; extracting the pixel-level center of each speckle region from its connected domain; and determining the sub-pixel-level speckle center of each speckle region from its pixel-level center. The invention is highly robust to noise, so the speckle center position can be analyzed with a lower-cost CCD camera, effectively reducing the cost of speckle quality analysis.

Description

Speckle sub-pixel center extraction method, system, device and medium
Technical Field
The invention relates to digital speckle technology, and in particular to a speckle sub-pixel center extraction method, system, device, and medium.
Background
Digital speckle technology describes the speckle energy distribution with the pixel values captured by a camera CCD; its principle is to quantize the speckle energy and to analyze the speckle's technical parameters quantitatively by means of image processing. Among these parameters, the speckle energy center is one of the most important.
However, owing to dark current and nonlinear components inside the camera, the collected speckles are noisy, and their energy distribution and shape differ greatly from the original signal. Especially when a poor camera is used, the signal-to-noise ratio is very low, the speckles become hard to distinguish, and extracting the speckle center becomes difficult.
At present, analysis is performed with instruments such as beam profilers, but such instruments are expensive, require regular maintenance, and their cost is difficult to reduce.
Disclosure of Invention
In view of the defects in the prior art, the invention aims to provide a speckle sub-pixel center extraction method, system, device, and medium.
The speckle subpixel center extraction method provided by the invention comprises the following steps:
step S1: acquiring a speckle image, and carrying out Gaussian filtering on the speckle image to remove background noise in the speckle image;
step S2: performing local self-adaptive binarization processing on the speckle image without the background noise to generate a binarized image;
step S3: traversing each pixel of the binary image, and marking a connected domain to determine each speckle region;
step S4: extracting a pixel level center for each speckle region through a corresponding connected domain;
step S5: and determining the sub-pixel level speckle center of each speckle area according to the pixel level center for each speckle area.
Preferably, the step S2 includes the steps of:
step S201: dividing a plurality of speckle areas in the speckle image with the background noise removed into a plurality of adjacent areas to be binarized according to the pixel values;
step S202: generating a binarization threshold value according to the average value of the pixel values of all pixels in each area to be binarized;
step S203: and carrying out binarization processing on each region to be binarized through a corresponding binarization threshold value to generate the binarization image.
Preferably, the step S3 further includes the following steps:
step S301: scanning the binarized image row by row and column by column starting from pixel (x_0, y_0), and taking a pixel (x, y) as a seed point when its pixel value is not zero;
step S302: growing an eight-neighborhood or four-neighborhood region from the seed point to form a growth region, and marking the pixels in the growth region as v_n, where n is the connected-domain index;
step S303: when the growth of a region is finished, continuing the row-by-row scan from pixel (x_0, y_0); when a pixel value is not zero, judging whether the pixel is already marked, and continuing the scan if it is; when the pixel is not marked, executing step S302 with the pixel as a seed point;
step S304: when all pixels have been traversed, the connected-domain marking is finished and a connected-domain set V = {v_0, v_1, v_2, ..., v_{N-1}} is obtained, where N is the number of connected domains, i.e., the number of speckle regions.
Preferably, the step S4 is specifically: let a connected domain v have the point set {(x_0, y_0), (x_1, y_1), (x_2, y_2), ..., (x_{N-1}, y_{N-1})}, where N is the total number of points in the connected domain; the coordinates of the center point C of the connected domain can then be determined by:

    C.x = (1/N) Σ_{i=0}^{N-1} x_i,   C.y = (1/N) Σ_{i=0}^{N-1} y_i

where C.x is the x coordinate of the center point and C.y is its y coordinate. By traversing all connected domains, the pixel-level centers of all speckle regions are obtained.
Preferably, the step S5 includes the steps of:
step S501: for each speckle region, let the pixel-level center be (x_p, y_p); in the circular integration area centered at (x_p, y_p) whose diameter is the length of the speckle region there are N pixels {(x_i, y_i)}, i = 0, 1, 2, ..., N-1, with gray values {I(x_i, y_i)}; the barycentric coordinates (x_c, y_c) of the area can then be determined by:

    x_c = Σ_i x_i · I(x_i, y_i) / Σ_i I(x_i, y_i),   y_c = Σ_i y_i · I(x_i, y_i) / Σ_i I(x_i, y_i)

step S502: let (x_p, y_p) = (x_c, y_c); taking this point as the new circle center, compute the center of gravity again within the circle whose diameter is the speckle spacing, obtaining a new speckle center of gravity (x_c, y_c);
step S503: when |x_c - x_p| < 0.1 and |y_c - y_p| < 0.1, the center of gravity is considered to have converged and the new speckle center of gravity (x_c, y_c) is the calculated sub-pixel-level speckle center; otherwise, step S501 is executed again until the center of gravity converges.
Preferably, when Gaussian filtering is performed, the value obtained by Gaussian convolution at a pixel point I(x, y) of the speckle region is:

    I'(x, y) = Σ_{s=-a}^{a} Σ_{t=-b}^{b} w(s, t) · I(x-s, y-t)

where

    w(s, t) = (1/(2πσ²)) · exp(-(s² + t²) / (2σ²))

w(s, t) is the weighting factor of the point (x-s, y-t), I(x-s, y-t) is the pixel value of the point (x-s, y-t), σ is the standard deviation of the Gaussian kernel, the weighting factors w(s, t) follow a Gaussian distribution, and a and b are the half-widths of the Gaussian kernel in the x and y directions respectively.
Preferably, when the binarization area is a rectangular area with upper-left corner (x_0, y_0) and lower-right corner (x_1, y_1), the binarization threshold T of the area is determined by:

    T = (1/N) Σ_{x=x_0}^{x_1} Σ_{y=y_0}^{y_1} I(x, y) + c

where T is the binarization threshold of the rectangular area, I(x, y) is the gray value of the point (x, y), N is the total number of pixels in the area, and c is the threshold increment.
The speckle sub-pixel center extraction system provided by the invention is used for implementing the above speckle sub-pixel center extraction method and comprises:
the noise removal module is used for acquiring a speckle image and performing Gaussian filtering on the speckle image so as to remove background noise in the speckle image;
the binarization processing module is used for carrying out local self-adaptive binarization processing on the speckle image without the background noise to generate a binarization image;
a connected domain marking module, configured to traverse each pixel of the binarized image, and mark a connected domain to determine each speckle region;
the pixel level center extraction module is used for extracting a pixel level center for each speckle area through a corresponding connected domain;
and the sub-pixel extraction module is used for determining the sub-pixel level speckle center of each speckle area according to the pixel level center for each speckle area.
The invention provides speckle sub-pixel center extraction equipment, which comprises:
a processor;
a memory having stored therein executable instructions of the processor;
wherein the processor is configured to perform the steps of the speckle subpixel center extraction method of any one of claims 1-7 via execution of the executable instructions.
The invention provides a computer readable storage medium for storing a program which, when executed, implements the steps of the speckle sub-pixel center extraction method.
Compared with the prior art, the invention has the following beneficial effects:
the invention can extract the pixel level center through Gaussian filtering, binarization processing and connected domain marking in sequence, further determines the sub-pixel level speckle center of each speckle area through the gravity center iteration with constraint, has very strong robustness on noise removal, can analyze the speckle center position by using a lower-cost CCD camera, and effectively reduces the analysis cost of the speckle quality.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. The drawings in the following description show only embodiments of the present invention; those skilled in the art can obtain other drawings from them without creative effort. Other features, objects, and advantages of the invention will become more apparent upon reading the detailed description of non-limiting embodiments with reference to the following drawings:
FIG. 1 is a flowchart illustrating the steps of a speckle subpixel center extraction method according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating the steps of binarization processing for a speckle region according to an embodiment of the present invention;
FIG. 3 is a flowchart illustrating the steps for performing connected domain tagging in an embodiment of the present invention;
FIG. 4 is a flowchart illustrating the steps of sub-pixel level speckle center extraction according to an embodiment of the present invention;
FIG. 5(a) is a schematic diagram of a speckle image with noise in an embodiment of the invention;
FIG. 5(b) is a schematic diagram of a Gaussian filtered speckle image in an embodiment of the invention;
fig. 5(c) is a schematic diagram of a speckle image after binarization processing in the embodiment of the invention;
FIG. 5(d) is a schematic diagram of a speckle image with a speckle center determined by a barycentric iteration method according to an embodiment of the present invention;
FIG. 6 is a schematic block diagram of a speckle subpixel center extraction system according to an embodiment of the present invention;
FIG. 7 is a schematic structural diagram of a speckle subpixel center extraction device in an embodiment of the present invention; and
fig. 8 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present invention.
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the invention, but are not intended to limit the invention in any way. It should be noted that variations and modifications can be made by persons skilled in the art without departing from the spirit of the invention, all of which fall within the scope of the present invention.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The technical solution of the present invention will be described in detail below with specific examples. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments.
The invention provides a constrained center-of-gravity iterative digital speckle sub-pixel center extraction method, aiming to solve the problems in the prior art.
The following describes the technical solutions of the present invention and how to solve the above technical problems with specific embodiments. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present invention will be described below with reference to the accompanying drawings.
Fig. 1 is a flowchart of the steps of the speckle sub-pixel center extraction method in an embodiment of the present invention. As shown in fig. 1, the constrained center-of-gravity iterative digital speckle sub-pixel center extraction method provided by the invention includes the following steps:
step S1: acquiring a speckle image, and carrying out Gaussian filtering on the speckle image to remove background noise in the speckle image;
the speckle image comprises a plurality of scattered speckle areas;
the energy distribution of each speckle region conforms to a gaussian distribution, i.e., the central energy is highest and decreases from the center to the outer edge. The ideal speckle region has small energy gradient, no obvious abrupt change of the energy gradient and radial shape in the center, namely, the gradient directions are all directions pointing from the center to the outer edge.
As shown in fig. 5(a), the acquired speckle image is an original speckle pattern with noise, and the energy distribution of the speckle area with noise does not show a distinct gaussian distribution.
When the speckle images are gaussian-filtered, the energy of the filtered speckle images decreases significantly from the center to the periphery as shown in fig. 5 (b).
However, due to the influence of the dark current of the CCD camera, background noise exists, thereby destroying the gaussian distribution state of the speckle region, and particularly, the influence on the edge of the speckle region is larger. In order to reduce the influence of background noise introduced by the CCD camera, gaussian filtering is required to smooth the speckle image.
The choice of Gaussian filtering rests mainly on two considerations. First, convolving each speckle region with a Gaussian kernel amounts to correlating it once, so noise can be filtered out while the shape of the speckle region is preserved.
Second, Gaussian filtering makes the speckle center more prominent, because the computed value is largest when the Gaussian kernel and the speckle region coincide completely, i.e., when the center point of the Gaussian kernel coincides with the theoretical center point of the speckle region.
Assuming there is a point I(x, y) in the speckle region, its value after Gaussian convolution is

    I'(x, y) = Σ_{s=-a}^{a} Σ_{t=-b}^{b} w(s, t) · I(x-s, y-t)

where

    w(s, t) = (1/(2πσ²)) · exp(-(s² + t²) / (2σ²))

w(s, t) is the weighting factor of the point (x-s, y-t), I(x-s, y-t) is the gray value of the point (x-s, y-t), and σ is the standard deviation of the Gaussian kernel. The weighting factors w(s, t) follow a Gaussian distribution, where a and b are the half-widths of the Gaussian kernel in the x and y directions respectively. However, since the energy distribution of the speckle region is a Gaussian distribution symmetric in the x and y directions, a = b in the embodiment of the present invention.
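The filtering above can be sketched in NumPy as follows. This is an illustrative implementation, not the patent's own code; the kernel half-width `a` and standard deviation `sigma` are assumed parameters, and the kernel is square (a = b) as stated in the embodiment.

```python
import numpy as np

def gaussian_kernel(a, sigma=1.0):
    """Build the (2a+1) x (2a+1) weight matrix w(s, t).

    The kernel is square because the speckle energy distribution
    is assumed symmetric in x and y (a == b in the embodiment).
    """
    s = np.arange(-a, a + 1)
    ss, tt = np.meshgrid(s, s, indexing="ij")
    w = np.exp(-(ss**2 + tt**2) / (2.0 * sigma**2))
    return w / w.sum()  # normalize so overall brightness is preserved

def gaussian_filter(image, a=2, sigma=1.0):
    """Smooth a 2-D grayscale image by direct convolution with the kernel."""
    w = gaussian_kernel(a, sigma)
    padded = np.pad(image.astype(float), a, mode="edge")
    out = np.zeros(image.shape, dtype=float)
    h, wd = image.shape
    for x in range(h):
        for y in range(wd):
            # weighted sum of the (2a+1) x (2a+1) neighborhood of (x, y)
            out[x, y] = np.sum(w * padded[x:x + 2*a + 1, y:y + 2*a + 1])
    return out
```

Because the kernel is symmetric, convolution and correlation coincide here; the explicit loop keeps the correspondence with the double sum above visible, at the cost of speed.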
Step S2: performing local self-adaptive binarization processing on the speckle image without the background noise to generate a binarized image;
for the entire speckle image, there are speckle regions with signals and speckle-free regions. Because the energy of the central area and the edge of the whole field angle area is different during speckle projection, if the histogram statistics is carried out on the whole speckle image and the average value is taken for binarization processing, the risk that darker speckles are directly binarized into 0 exists.
Therefore, in the embodiment of the invention, a local adaptive binarization processing method is adopted, namely, the binarization threshold value is determined according to the speckle area characteristics of each part, and because the pixel value projected by the speckle energy is inevitably larger than the peripheral non-energy projection area, the excessively dark speckle area cannot be lost by local binarization.
Fig. 2 is a flowchart of the step of binarizing the speckle region in the embodiment of the present invention, and as shown in fig. 2, the step S2 includes the following steps:
step S201: dividing a plurality of speckle areas in the speckle image with the background noise removed into a plurality of adjacent areas to be binarized according to the pixel values;
step S202: generating a binarization threshold value according to the average value of the pixel values of all pixels in each area to be binarized;
step S203: and carrying out binarization processing on each region to be binarized through a corresponding binarization threshold value to generate the binarization image.
As shown in fig. 5(c), in the binarized image, the speckle region is not affected by light and dark, and is very robust to energy.
For a rectangular area with upper-left corner (x_0, y_0) and lower-right corner (x_1, y_1), the binarization threshold of the area can be determined by

    T = (1/N) Σ_{x=x_0}^{x_1} Σ_{y=y_0}^{y_1} I(x, y) + c

where T is the binarization threshold of the rectangular area, I(x, y) is the pixel value of the point (x, y), N is the total number of pixels in the area, and c is a threshold increment, which can be positive or negative and expresses the tolerance to background noise: the larger c is, the smaller the tolerance to background noise and the more readily a pixel is binarized to zero. After the threshold is found, the area is binarized as

    B(x, y) = 1, if I(x, y) > T;   B(x, y) = 0, if I(x, y) ≤ T

that is, pixels less than or equal to the binarization threshold T are set to 0, and pixels greater than T are set to 1. Obviously, the areas set to 1 are speckle regions and the areas set to 0 are background.
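A minimal sketch of this local adaptive binarization, assuming non-overlapping square tiles; the tile size `block` and the increment `c` are illustrative choices, not values prescribed by the patent.

```python
import numpy as np

def adaptive_binarize(image, block=8, c=2.0):
    """Threshold each block x block tile by its own mean plus increment c."""
    image = image.astype(float)
    out = np.zeros(image.shape, dtype=np.uint8)
    h, w = image.shape
    for x0 in range(0, h, block):
        for y0 in range(0, w, block):
            tile = image[x0:x0 + block, y0:y0 + block]
            T = tile.mean() + c  # T = (1/N) * sum of I(x, y) over the tile + c
            # pixels above the local threshold become 1 (speckle), others 0
            out[x0:x0 + block, y0:y0 + block] = (tile > T).astype(np.uint8)
    return out
```

Because each tile uses its own mean, a dim speckle surrounded by an even dimmer background still exceeds its local threshold, which is the point of binarizing locally rather than with one global histogram.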
Step S3: traversing each pixel of the binary image, and marking a connected domain to determine each speckle region;
the connected domain is one of the most effective and intuitive ways to show whether the pixels represent the same object on the image. The idea is to connect a series of pixels with the same certain property, and the connected paths are usually 4 fields and 8 neighborhoods.
In the embodiment of the invention, for the speckle images, one connected domain is one speckle area, and the number of the connected domains represents the number of the speckle areas because a certain gap exists between the speckle areas.
Fig. 3 is a flowchart of steps of performing connected component labeling in the embodiment of the present invention, and as shown in fig. 3, in the embodiment of the present invention, based on the binarized image, labeling is performed in the following manner:
step S301: scanning the binarized image row by row and column by column starting from pixel (x_0, y_0), and taking a pixel (x, y) as a seed point when its pixel value is not zero;
step S302: growing an eight-neighborhood or four-neighborhood region from the seed point to form a growth region, and marking the pixels in the growth region as v_n, where n is the connected-domain index;
step S303: when the growth of a region is finished, continuing the row-by-row scan from pixel (x_0, y_0); when a pixel value is not zero, judging whether the pixel is already marked, and continuing the scan if it is; when the pixel is not marked, executing step S302 with the pixel as a seed point;
step S304: when all pixels have been traversed, the connected-domain marking is finished and a connected-domain set V = {v_0, v_1, v_2, ..., v_{N-1}} is obtained, where N is the number of connected domains, i.e., the number of speckle regions.
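The seed-point scan and region growing of steps S301 to S304 can be sketched as follows. This is an illustrative stack-based implementation; the patent does not prescribe a particular growing data structure.

```python
import numpy as np

def label_connected_domains(binary, connectivity=8):
    """Scan row by row; grow a region from each unlabeled nonzero seed."""
    if connectivity == 8:
        offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                   (0, 1), (1, -1), (1, 0), (1, 1)]
    else:  # 4-neighborhood
        offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    h, w = binary.shape
    labels = np.zeros((h, w), dtype=int)
    n = 0
    for x in range(h):
        for y in range(w):
            if binary[x, y] and not labels[x, y]:
                n += 1                  # start a new connected domain v_n
                labels[x, y] = n
                stack = [(x, y)]
                while stack:            # region growing from the seed point
                    cx, cy = stack.pop()
                    for dx, dy in offsets:
                        nx, ny = cx + dx, cy + dy
                        if (0 <= nx < h and 0 <= ny < w
                                and binary[nx, ny] and not labels[nx, ny]):
                            labels[nx, ny] = n
                            stack.append((nx, ny))
    return labels, n
```

The returned `n` is the number of connected domains, i.e., the number of speckle regions.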
Step S4: extracting a pixel level center for each speckle region through a corresponding connected domain;
Let a connected domain v have the point set {(x_0, y_0), (x_1, y_1), (x_2, y_2), ..., (x_{N-1}, y_{N-1})}, where N is the total number of points in the connected domain; the coordinates of the center point C of the connected domain can then be determined by:

    C.x = (1/N) Σ_{i=0}^{N-1} x_i,   C.y = (1/N) Σ_{i=0}^{N-1} y_i

where C.x is the x coordinate of the center point and C.y is its y coordinate. By traversing all connected domains, the pixel-level centers of all speckle regions are obtained.
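A sketch of this pixel-level center computation over a label image, assuming the labeling of the previous step produced integer labels 1..n (the function name is illustrative):

```python
import numpy as np

def pixel_level_center(labels, n):
    """C.x = mean of x_i, C.y = mean of y_i over each connected domain."""
    centers = []
    for label in range(1, n + 1):
        xs, ys = np.nonzero(labels == label)  # coordinates of the domain
        centers.append((xs.mean(), ys.mean()))
    return centers
```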
Step S5: and determining the sub-pixel level speckle center of each speckle area according to the pixel level center by a gravity center iteration method under the radius constraint.
And for each speckle area, determining a circular integral area by taking the central coordinate of the pixel level center as a circle center and taking half of the length of the speckle area as a radius, and performing iterative solution by using a gravity center method. The speckle region length is the length in a direction through the center of the pixel level.
The principle of the center-of-gravity method is as follows: each pixel is regarded as a point mass, its mass being the gray value of the pixel. If the integration area is regarded as a sheet of iron and the mass of each point on the sheet is known, then the position of the sheet's center of gravity can be found.
The connected domain itself is not used as the integration region for the following reasons:
The connected domain is obtained from the binarized image, and when the background noise is large, the edge of the binarized region is not the true speckle edge; moreover, the energy of a speckle region decays from the center, so its edge is difficult to determine. Taking half of the speckle-region length as the integration radius does not affect the position of the center of gravity (just as, when computing the center of gravity of a sphere of radius r, the surrounding air has no influence). Keeping the radius within half of the speckle spacing guarantees that there is one and only one speckle region inside the integration region, since the object being solved for is a single speckle region, not a group of them.
Fig. 4 is a flowchart of the steps of extracting the sub-pixel level speckle center in the embodiment of the present invention, and as shown in fig. 4, the step S5 includes the following steps:
step S501: for each speckle region, let the pixel-level center be (x_p, y_p); in the circular integration area centered at (x_p, y_p) whose diameter is the length of the speckle region there are N pixels {(x_i, y_i)}, i = 0, 1, 2, ..., N-1, with gray values {I(x_i, y_i)}; the barycentric coordinates (x_c, y_c) of the area can then be determined by:

    x_c = Σ_i x_i · I(x_i, y_i) / Σ_i I(x_i, y_i),   y_c = Σ_i y_i · I(x_i, y_i) / Σ_i I(x_i, y_i)

step S502: let (x_p, y_p) = (x_c, y_c); taking this point as the new circle center, compute the center of gravity again within the circle whose diameter is the speckle spacing, obtaining a new speckle center of gravity (x_c, y_c);
step S503: when |x_c - x_p| < 0.1 and |y_c - y_p| < 0.1, the center of gravity is considered to have converged and the new speckle center of gravity (x_c, y_c) is the calculated sub-pixel-level speckle center; otherwise, step S501 is executed again until the center of gravity converges.
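The constrained center-of-gravity iteration of steps S501 to S503 can be sketched as follows. The gray-value-weighted centroid and the 0.1-pixel convergence tolerance follow the text; the `max_iter` cap is an added safeguard, and the caller supplies the constraint radius (half the speckle-region length, bounded by half the speckle spacing).

```python
import numpy as np

def subpixel_center(image, xp, yp, radius, tol=0.1, max_iter=50):
    """Iterate the gray-value-weighted centroid inside a circular window."""
    image = image.astype(float)
    h, w = image.shape
    xs, ys = np.mgrid[0:h, 0:w]
    for _ in range(max_iter):
        # pixels inside the circle of given radius around the current center
        mask = (xs - xp)**2 + (ys - yp)**2 <= radius**2
        weights = image[mask]
        xc = np.sum(xs[mask] * weights) / np.sum(weights)
        yc = np.sum(ys[mask] * weights) / np.sum(weights)
        if abs(xc - xp) < tol and abs(yc - yp) < tol:  # step S503: converged
            return xc, yc
        xp, yp = xc, yc                                # step S502: re-center
    return xp, yp
```

Starting from the pixel-level center, each pass re-centers the circular window on the previous centroid, so the window drifts toward the true energy center until the shift falls below the tolerance.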
As shown in fig. 5(d), the sub-pixel-level speckle center extracted after the iterations of the center-of-gravity method is very accurate. The computed center coordinates are sub-pixel values, although the markers can only be drawn rounded to whole pixels.
Fig. 6 is a schematic block diagram of a speckle subpixel center extraction system in an embodiment of the present invention, and as shown in fig. 6, the speckle subpixel center extraction system provided in the present invention is used for implementing the speckle subpixel center extraction method, and includes:
the noise removal module is used for acquiring a speckle image and performing Gaussian filtering on the speckle image so as to remove background noise in the speckle image;
the binarization processing module is used for carrying out local self-adaptive binarization processing on the speckle image without the background noise to generate a binarization image;
a connected domain marking module, configured to traverse each pixel of the binarized image, and mark a connected domain to determine each speckle region;
the pixel level center extraction module is used for extracting a pixel level center for each speckle area through a corresponding connected domain;
and the sub-pixel extraction module is used for determining the sub-pixel level speckle center of each speckle area according to the pixel level center for each speckle area.
The embodiment of the invention also provides a speckle sub-pixel center extraction device, which comprises a processor and a memory in which executable instructions of the processor are stored, wherein the processor is configured to perform the steps of the speckle sub-pixel center extraction method via execution of the executable instructions.
As described above, in the embodiment of the present invention, the pixel-level center is extracted through Gaussian filtering, binarization, and connected-domain marking in sequence, and the sub-pixel-level speckle center of each speckle region is then determined by constrained center-of-gravity iteration, so the speckle center position can be analyzed with a lower-cost CCD camera and the cost of speckle quality analysis is effectively reduced.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or program product. Thus, various aspects of the invention may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.) or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," module "or" platform.
Fig. 7 is a schematic structural diagram of the speckle sub-pixel center extraction device provided by the present invention. An electronic device 600 according to this embodiment of the invention is described below with reference to fig. 7. The electronic device 600 shown in fig. 7 is only an example and should not impose any limitation on the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 7, the electronic device 600 is embodied in the form of a general purpose computing device. The components of the electronic device 600 may include, but are not limited to: at least one processing unit 610, at least one memory unit 620, a bus 630 connecting the different platform components (including the memory unit 620 and the processing unit 610), a display unit 640, etc.
The storage unit stores program code, which can be executed by the processing unit 610 to cause the processing unit 610 to perform the steps according to various exemplary embodiments of the present invention described in the speckle sub-pixel center extraction method section of this specification above. For example, the processing unit 610 may perform the steps shown in fig. 1.
The storage unit 620 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM)6201 and/or a cache memory unit 6202, and may further include a read-only memory unit (ROM) 6203.
The memory unit 620 may also include a program/utility 6204 having a set (at least one) of program modules 6205, such program modules 6205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 630 may be one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 600, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 600 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 650. Also, the electronic device 600 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 660. The network adapter 660 may communicate with other modules of the electronic device 600 via the bus 630. It should be appreciated that although not shown in FIG. 7, other hardware and/or software modules may be used in conjunction with electronic device 600, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage platforms, to name a few.
The embodiment of the present invention further provides a computer-readable storage medium storing a program which, when executed, implements the steps of the depth reconstruction method based on depth data and a boundary. In some possible embodiments, the various aspects of the invention may also be implemented in the form of a program product comprising program code which, when the program product is run on a terminal device, causes the terminal device to carry out the steps according to various exemplary embodiments of the invention described in the depth data and boundary based depth reconstruction method section of this specification.
As shown above, when the program of the computer-readable storage medium of this embodiment is executed, the boundary of each target object in the target image is extracted to generate the target object boundary, and the depth image corresponding to the target image is completed according to that boundary to generate the target depth image. This improves the completeness of the depth image and facilitates depth reconstruction of the target object.
Fig. 7 is a schematic structural diagram of a computer-readable storage medium of the present invention. Referring to fig. 7, a program product 800 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable storage medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable storage medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a readable storage medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C + + or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
In the embodiment of the invention, extraction of the speckle center with sub-pixel precision is achieved through a series of operations: denoising, binarization, connected-domain labeling, pixel-level center extraction, and center-of-gravity iteration. The noise removal is highly robust, so an inexpensive CCD camera can be used to analyze speckle-center positions, effectively reducing the cost of speckle-quality analysis.
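The five-stage pipeline summarized above can be illustrated end to end. The sketch below is ours, not the patent's implementation: it substitutes a single global threshold for the local adaptive binarization and a single gray-weighted centroid pass for the full center-of-gravity iteration, and runs on a tiny synthetic image.

```python
# Minimal sketch of the speckle-center pipeline: threshold -> label -> centroid.
from collections import deque

def extract_centers(img, thresh):
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    centers = []
    for y in range(h):
        for x in range(w):
            if img[y][x] > thresh and not seen[y][x]:
                # Grow an 8-neighborhood region from this seed pixel.
                region, q = [], deque([(x, y)])
                seen[y][x] = True
                while q:
                    cx, cy = q.popleft()
                    region.append((cx, cy))
                    for dx in (-1, 0, 1):
                        for dy in (-1, 0, 1):
                            nx, ny = cx + dx, cy + dy
                            if 0 <= nx < w and 0 <= ny < h and \
                               img[ny][nx] > thresh and not seen[ny][nx]:
                                seen[ny][nx] = True
                                q.append((nx, ny))
                # Gray-weighted centroid of the region (sub-pixel estimate).
                s = sum(img[py][px] for px, py in region)
                xc = sum(px * img[py][px] for px, py in region) / s
                yc = sum(py * img[py][px] for px, py in region) / s
                centers.append((xc, yc))
    return centers

# One bright 3x3 speckle centered at (2, 2) in a 5x5 image.
img = [[0, 0, 0, 0, 0],
       [0, 10, 20, 10, 0],
       [0, 20, 40, 20, 0],
       [0, 10, 20, 10, 0],
       [0, 0, 0, 0, 0]]
print(extract_centers(img, 5))  # -> [(2.0, 2.0)]
```

In practice the local adaptive threshold and the gravity-center iteration of the claims replace the two simplifications made here.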
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description of specific embodiments of the present invention has been presented. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes and modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the invention.

Claims (10)

1. A speckle subpixel center extraction method is characterized by comprising the following steps:
step S1: acquiring a speckle image, and carrying out Gaussian filtering on the speckle image to remove background noise in the speckle image;
step S2: performing local self-adaptive binarization processing on the speckle image without the background noise to generate a binarized image;
step S3: traversing each pixel of the binary image, and marking a connected domain to determine each speckle region;
step S4: extracting a pixel level center for each speckle region through a corresponding connected domain;
step S5: determining the sub-pixel-level speckle center of each speckle region according to its pixel-level center.
2. The speckle subpixel center extraction method of claim 1, wherein said step S2 comprises the steps of:
step S201: dividing a plurality of speckle areas in the speckle image with the background noise removed into a plurality of adjacent areas to be binarized according to the pixel values;
step S202: generating a binarization threshold value according to the average value of the pixel values of all pixels in each area to be binarized;
step S203: and carrying out binarization processing on each region to be binarized through a corresponding binarization threshold value to generate the binarization image.
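The per-region thresholding of steps S201 to S203 can be sketched as follows; the square tiling and the additive increment c are our assumptions (the claim leaves the region shapes open):

```python
# Local adaptive binarization: split the image into tiles and threshold each
# tile by its own mean gray value plus an increment c, as in steps S201-S203.
def adaptive_binarize(img, tile, c=0):
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for ty in range(0, h, tile):
        for tx in range(0, w, tile):
            ys = range(ty, min(ty + tile, h))
            xs = range(tx, min(tx + tile, w))
            pixels = [img[y][x] for y in ys for x in xs]
            t = sum(pixels) / len(pixels) + c  # per-tile threshold
            for y in ys:
                for x in xs:
                    out[y][x] = 1 if img[y][x] > t else 0
    return out

# A dim speckle (value 5) survives because its tile's mean is low, while a
# global threshold tuned to the bright speckle (value 9) would miss it.
img = [[0, 0, 9, 0],
       [0, 0, 9, 9],
       [5, 0, 0, 0],
       [0, 0, 0, 0]]
binimg = adaptive_binarize(img, 2)
```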
3. The speckle subpixel center extraction method of claim 1, wherein said step S3 further comprises the steps of:
Step S301: scanning the binary image from pixels (x0, y0) according to pixel rows and columns, and determining a pixel as a seed point when the pixel value (x, y) of the pixel is not zero;
step S302: growing an eight-neighborhood or four-neighborhood region from the seed point to form a growth region, and marking the pixels in the growth region as v_n, where n is the connected-domain number;
step S303: when the growth of a growth region is finished, continuing the line-by-line scan from the pixel (x0, y0); when the value of a pixel is non-zero, judging whether the pixel has been marked: if it has been marked, continuing the scan; if not, executing step S302 with the pixel as a new seed point;
step S304: when all pixels have been traversed, the connected-domain marking is finished, yielding the connected-domain set V = {v_0, v_1, v_2, ..., v_{N-1}}, where N is the number of connected domains, i.e., the number of speckle regions.
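The seed-growing labeling of steps S301 to S304 can be sketched as follows; the breadth-first growth order and the -1 "unmarked" sentinel are implementation choices of ours:

```python
# Connected-domain labeling by seed growing: scan the binary image row by
# row; each unmarked non-zero pixel seeds a new region grown over its
# 8-neighborhood, as in steps S301-S304.
from collections import deque

def label_regions(bin_img):
    h, w = len(bin_img), len(bin_img[0])
    labels = [[-1] * w for _ in range(h)]  # -1 means "not yet marked"
    n = 0
    for y in range(h):
        for x in range(w):
            if bin_img[y][x] and labels[y][x] == -1:
                q = deque([(x, y)])
                labels[y][x] = n
                while q:
                    cx, cy = q.popleft()
                    for dx in (-1, 0, 1):
                        for dy in (-1, 0, 1):
                            nx, ny = cx + dx, cy + dy
                            if 0 <= nx < w and 0 <= ny < h and \
                               bin_img[ny][nx] and labels[ny][nx] == -1:
                                labels[ny][nx] = n
                                q.append((nx, ny))
                n += 1  # one label per speckle region
    return labels, n

bin_img = [[1, 1, 0, 0],
           [1, 0, 0, 1],
           [0, 0, 0, 1]]
labels, n = label_regions(bin_img)
print(n)  # -> 2
```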
4. The speckle subpixel center extraction method according to claim 1, wherein the step S4 specifically comprises: letting a connected domain v have the point set {(x_0, y_0), (x_1, y_1), (x_2, y_2), ..., (x_{N-1}, y_{N-1})}, where N is the total number of points in the connected domain, the coordinates of the center point C of the connected domain may be determined by the following formula:
C.x = (1/N) · Σ_{i=0}^{N-1} x_i,    C.y = (1/N) · Σ_{i=0}^{N-1} y_i
wherein C.x represents the x coordinate of the center point and C.y represents the y coordinate of the center point; traversing all connected domains yields the pixel-level centers of all speckle regions.
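As a quick illustration of the pixel-level center formula (the function name and sample points are ours):

```python
# Pixel-level center of a connected domain: the arithmetic mean of its
# point coordinates, C.x = (1/N) * sum(x_i), C.y = (1/N) * sum(y_i).
def pixel_center(points):
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    return cx, cy

# Points of one speckle's connected domain.
print(pixel_center([(4, 7), (5, 7), (4, 8), (5, 8), (6, 8)]))  # -> (4.8, 7.6)
```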
5. The speckle subpixel center extraction method of claim 1, wherein said step S5 comprises the steps of:
step S501: for each speckle region, a pixel level center (x) is setp,yp) N pixels exist in an integral area which is used as a circle center and is determined by taking one length of the speckle area as a diameter { (x)i,yi) (ii) a N-1, and the gray-scale value for each pixel is { I (x) } 0,1,2i,yi) (ii) a N-1, i is 0,1,2, the barycentric coordinates (x) of the region are then determinedc,yc) Can be determined by the following equation:
x_c = Σ_{i=0}^{N-1} x_i · I(x_i, y_i) / Σ_{i=0}^{N-1} I(x_i, y_i),    y_c = Σ_{i=0}^{N-1} y_i · I(x_i, y_i) / Σ_{i=0}^{N-1} I(x_i, y_i)
step S502: order (x)p,yp)=(xc,yc) Taking the speckle distance as the center of a circle, continuously calculating the center of gravity in the circle with the speckle distance as the diameter to obtain a new speckle center of gravity (x)c,yc);
Step S503: when (x)c-xp) Absolute value of (2)<0.1 and yc-yp) Absolute value<At 0.1, the center of gravity is considered to be converged, and the new speckle center of gravity (x)c,yc) That is, the calculated sub-pixel level speckle center, otherwise, step S501 is executed again until the center of gravity converges.
6. The speckle subpixel center extraction method of claim 1, wherein during Gaussian filtering the value obtained by performing Gaussian convolution at the pixel point I(x, y) of the speckle region is as follows:
G(x, y) = Σ_{s=-a}^{a} Σ_{t=-b}^{b} w(s, t) · I(x - s, y - t)
wherein

w(s, t) = (1 / (2πσ²)) · exp(-(s² + t²) / (2σ²))

w(s, t) is the weighting factor of the point (x - s, y - t), I(x - s, y - t) is the pixel value of the point (x - s, y - t), and the weighting factors follow a Gaussian distribution in magnitude with standard deviation σ, where a and b respectively denote the dimensions of the Gaussian kernel in the x and y directions.
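Assuming the standard two-dimensional Gaussian form for w(s, t), the kernel and the convolution at a single pixel can be sketched as follows (normalizing the kernel to sum to 1 and ignoring image borders are our choices):

```python
# Gaussian filtering: build a (2a+1) x (2b+1) kernel of weights w(s, t)
# following a Gaussian distribution, then convolve it with the image at
# one pixel.
import math

def gaussian_kernel(a, b, sigma):
    k = [[math.exp(-(s * s + t * t) / (2 * sigma * sigma))
          for s in range(-a, a + 1)]
         for t in range(-b, b + 1)]
    total = sum(sum(row) for row in k)
    return [[v / total for v in row] for row in k]  # weights sum to 1

def convolve_at(img, x, y, kernel, a, b):
    # Value of the Gaussian convolution at pixel (x, y); the caller must
    # keep the kernel footprint inside the image.
    return sum(kernel[t + b][s + a] * img[y - t][x - s]
               for t in range(-b, b + 1) for s in range(-a, a + 1))

kern = gaussian_kernel(1, 1, 1.0)
img = [[0, 0, 0], [0, 9, 0], [0, 0, 0]]
smoothed = convolve_at(img, 1, 1, kern, 1, 1)  # isolated spike is damped
```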
7. The speckle subpixel center extraction method of claim 2, wherein when the region to be binarized is a rectangular region whose upper-left and lower-right corner coordinates are (x_0, y_0) and (x_1, y_1) respectively, the binarization threshold T for this region is determined by the following formula:
T = (1/N) · Σ_{x=x_0}^{x_1} Σ_{y=y_0}^{y_1} I(x, y) + c
where T represents the binarization threshold of the rectangular region, I (x, y) represents the grayscale value of the point (x, y), N represents the total number of pixels of the region, and c is the threshold increment.
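A sketch of the rectangular-region threshold of claim 7; treating c as an additive increment on the region mean is our reading of the claim:

```python
# Binarization threshold of a rectangular region: the mean gray value over
# the rectangle (x0, y0)-(x1, y1) inclusive, plus a threshold increment c.
def region_threshold(img, x0, y0, x1, y1, c):
    pixels = [img[y][x] for y in range(y0, y1 + 1)
                        for x in range(x0, x1 + 1)]
    return sum(pixels) / len(pixels) + c

img = [[1, 2, 3],
       [4, 5, 6],
       [7, 8, 9]]
t = region_threshold(img, 0, 0, 2, 2, c=2)
print(t)  # -> 7.0
```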
8. A speckle subpixel center extraction system for implementing the speckle subpixel center extraction method of any one of claims 1 to 7, comprising:
the noise removal module is used for acquiring a speckle image and performing Gaussian filtering on the speckle image so as to remove background noise in the speckle image;
the binarization processing module is used for carrying out local self-adaptive binarization processing on the speckle image without the background noise to generate a binarization image;
a connected domain marking module, configured to traverse each pixel of the binarized image, and mark a connected domain to determine each speckle region;
the pixel level center extraction module is used for extracting a pixel level center for each speckle area through a corresponding connected domain;
and the sub-pixel extraction module is used for determining the sub-pixel level speckle center of each speckle area according to the pixel level center for each speckle area.
9. A speckle subpixel center extraction apparatus, comprising:
a processor;
a memory having stored therein executable instructions of the processor;
wherein the processor is configured to perform the steps of the speckle subpixel center extraction method of any one of claims 1-7 via execution of the executable instructions.
10. A computer readable storage medium storing a program which when executed performs the steps of the speckle sub-pixel center extraction method of any one of claims 1 to 7.
CN201910931509.0A 2019-09-29 2019-09-29 Speckle sub-pixel center extraction method, system, device and medium Pending CN112581374A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910931509.0A CN112581374A (en) 2019-09-29 2019-09-29 Speckle sub-pixel center extraction method, system, device and medium


Publications (1)

Publication Number Publication Date
CN112581374A true CN112581374A (en) 2021-03-30

Family

ID=75110571

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910931509.0A Pending CN112581374A (en) 2019-09-29 2019-09-29 Speckle sub-pixel center extraction method, system, device and medium

Country Status (1)

Country Link
CN (1) CN112581374A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113762253A (en) * 2021-08-24 2021-12-07 北京的卢深视科技有限公司 Speckle extraction method and device, electronic device and storage medium
CN114332147A (en) * 2021-12-30 2022-04-12 北京的卢深视科技有限公司 Speckle pattern preprocessing method and device, electronic equipment and storage medium
CN114755798A (en) * 2022-03-25 2022-07-15 中国科学院信息工程研究所 Laser focusing control method and system

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1045646A (en) * 1989-03-16 1990-09-26 清华大学 Precision orthogonal scanning video analysis technology and instruments such as speckle pattern
CN101363732A (en) * 2008-09-17 2009-02-11 北京航空航天大学 High frame frequency sun sensor and implementing method thereof
CN101408985A (en) * 2008-09-22 2009-04-15 北京航空航天大学 Method and apparatus for extracting circular luminous spot second-pixel center
CN101571953A (en) * 2009-05-20 2009-11-04 深圳泰山在线科技有限公司 Object detection method, system and stereoscopic vision system
CN102135413A (en) * 2010-12-14 2011-07-27 河南科技大学 Phase vortex based digital speckle correlation measurement method
CN103020988A (en) * 2012-11-27 2013-04-03 西安交通大学 Method for generating motion vector of laser speckle image
CN103455813A (en) * 2013-08-31 2013-12-18 西北工业大学 Method for facula center positioning of CCD image measurement system
CN104574312A (en) * 2015-01-06 2015-04-29 深圳市元征软件开发有限公司 Method and device of calculating center of circle for target image
CN105160692A (en) * 2015-04-09 2015-12-16 南京信息工程大学 First-moment center of mass calculating method for sliding window with threshold
CN107133627A (en) * 2017-04-01 2017-09-05 深圳市欢创科技有限公司 Infrared light spot center point extracting method and device
CN107316318A (en) * 2017-05-26 2017-11-03 河北汉光重工有限责任公司 Aerial target automatic testing method based on multiple subarea domain Background fitting
CN107784669A (en) * 2017-10-27 2018-03-09 东南大学 A kind of method that hot spot extraction and its barycenter determine
CN109883333A (en) * 2019-03-14 2019-06-14 武汉理工大学 A kind of non-contact displacement strain measurement method based on characteristics of image identification technology




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20210330