CN113936050B - Speckle image generation method, electronic device, and storage medium - Google Patents

Speckle image generation method, electronic device, and storage medium

Publication number: CN113936050B (granted); earlier published as application CN113936050A
Application number: CN202111224652.XA
Authority: CN (China)
Original language: Chinese (zh)
Inventors: 李东洋, 化雪诚, 王海彬, 刘祺昌, 户磊
Assignee (current and original): Hefei Dilusense Technology Co Ltd
Legal status: Active
Prior art keywords: speckle, parallax, map, speckle pattern, pattern

Classifications

    • G06T 7/55: Depth or shape recovery from multiple images (under G Physics; G06 Computing, calculating or counting; G06T Image data processing or generation, in general; G06T 7/00 Image analysis; G06T 7/50 Depth or shape recovery)
    • G06T 11/00: 2D [Two Dimensional] image generation
    • G06T 5/50: Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G06T 2207/20216: Image averaging (under G06T 2207/00 Indexing scheme for image analysis or image enhancement; G06T 2207/20 Special algorithmic details; G06T 2207/20212 Image combination)

Abstract

Embodiments of the invention relate to the field of image processing and disclose a speckle image generation method, an electronic device, and a storage medium. The method comprises the following steps: acquiring a depth map and a homologous object image of a given structured light camera; acquiring a disparity map corresponding to the depth map; mapping, with the disparity map as the mapping path, a preset reference speckle pattern toward the object speckle pattern to obtain a parallax speckle pattern; and superimposing the homologous object image and the parallax speckle pattern to obtain the object speckle pattern corresponding to the depth map. By jointly processing the three-dimensional scene information of the given structured light system, namely its depth information and homologous object image, speckle image data corresponding to the depth information is generated, which improves the efficiency of obtaining speckle images.

Description

Speckle image generation method, electronic device, and storage medium
Technical Field
The present invention relates to the field of image processing, and in particular, to a speckle image generation method, an electronic device, and a storage medium.
Background
At present, the most active technical branch in the field of machine vision is depth perception, and speckle structured light is an important part of it. As the most common active stereoscopic vision technology, speckle structured light is further divided into monocular structured light, active binocular structured light, and the like. Speckle image data is of great importance to both traditional algorithms and the depth recovery algorithms of machine learning and even deep learning.
Generally, the depth recovered for a structured-light speckle system by neural-network methods is no better than that recovered for passive binocular color systems; one reason is that the amount of speckle image data available is insufficient for the training scale neural networks require. At present, speckle image data is acquired directly with a speckle structured light camera, and this kind of acquisition suffers from low efficiency.
Disclosure of Invention
An object of an embodiment of the present invention is to provide a speckle image generating method, an electronic device, and a storage medium, which are capable of generating speckle image data corresponding to depth information by comprehensively processing three-dimensional scene information, i.e., depth information and a homologous object image, of a given structured light system, thereby improving efficiency of obtaining a speckle image.
In order to solve the above technical problem, an embodiment of the present invention provides a speckle image generating method, including:
acquiring a depth map and a homologous object image of a given structured light camera;
acquiring a disparity map corresponding to the depth map;
taking the parallax map as a mapping path, and mapping a preset reference speckle pattern to an object speckle pattern to obtain a parallax speckle pattern;
And superposing the homologous object image and the parallax speckle pattern to obtain an object speckle pattern corresponding to the depth map.
An embodiment of the present invention also provides an electronic device, including:
at least one processor; and,
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the speckle image generation method as described above.
Embodiments of the present invention also provide a computer-readable storage medium storing a computer program which, when executed by a processor, implements the speckle image generation method as described above.
Compared with the prior art, embodiments of the present application obtain the depth map and the homologous object image of a given structured light camera; obtain the disparity map corresponding to the depth map; map a preset reference speckle pattern toward the object speckle pattern, with the disparity map as the mapping path, to obtain a parallax speckle pattern; and superimpose the homologous object image and the parallax speckle pattern to obtain the object speckle pattern corresponding to the depth map. By jointly processing the three-dimensional scene information of the given structured light system, namely its depth information and homologous object image, speckle image data corresponding to the depth information is generated, improving the efficiency of obtaining speckle images.
Drawings
Fig. 1 is a first specific flowchart of a speckle image generation method according to an embodiment of the present invention;
Fig. 2 is a second specific flowchart of a speckle image generation method according to an embodiment of the present invention;
Fig. 3 is a third specific flowchart of the speckle image generation method according to an embodiment of the present invention;
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the embodiments of the present invention are described in detail below with reference to the accompanying drawings. Those of ordinary skill in the art will appreciate that numerous technical details are set forth in the embodiments to give the reader a better understanding of the present application; the technical solutions claimed in the present application can, however, be implemented without these technical details, and with various changes and modifications based on the following embodiments.
An embodiment of the present invention relates to a speckle image generation method, and as shown in fig. 1, the speckle image generation method provided in this embodiment includes the following steps.
Step 101: a depth map and a homologous object image are acquired for a given structured light camera.
The structured light camera can be any one of a monocular structured light camera and an active binocular structured light camera; the depth map may be a depth map based on speckle images; the homologous object image can be an infrared image or a color image.
Specifically, the given depth map and the homologous object image may be obtained by actually shooting a target object by a structured light camera; it is also possible to render a depth map and a homologous object image by image synthesis software or the like, and to take these images as a depth map and a homologous object image captured by a given structured light camera.
Step 102: and acquiring a disparity map corresponding to the depth map.
Specifically, based on the type of the structured light camera, the imaging principle, the internal and external parameters, and the correspondence between the depth information and the parallax information, a corresponding parallax map may be acquired based on the depth map.
In one example, the structured light camera may be a monocular structured light camera; correspondingly, the process of obtaining the disparity map corresponding to the depth map in this step may include:
the disparity value d of each pixel point in the disparity map is calculated by the following formula (1):

d = f · l · (1/z - 1/z_r) ………………………(1)

wherein f is the focal length of the monocular structured light camera, l is the baseline, z is the depth value of the pixel point in the depth map, and z_r is the distance from the plane of the reference speckle pattern to the optical center of the lens of the monocular structured light camera.
Specifically, for a monocular structured light system, the disparity map refers to the difference between the column coordinates of same-name pixels in an object map (e.g., the object speckle pattern) and a reference map (e.g., the reference speckle pattern). Because there is a baseline l between the speckle projector and the optical sensor (usually an infrared IR lens, with focal length f), according to the triangulation principle the disparity value between a three-dimensional point (x, y, z) and the reference map (usually a plane, z = z_r) can be calculated by formula (1).
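As a rough sketch of how formula (1) applies pixel-wise to a whole depth map, the following is illustrative only (the function name and the sample numbers f = 600 px, l = 50 mm, z_r = 1000 mm are assumptions, and NumPy is assumed available); pixels with no depth information are kept at zero disparity and treated as invalid:

```python
import numpy as np

def monocular_disparity(depth, f, l, z_r):
    """Per-pixel disparity for a monocular structured-light camera,
    formula (1): d = f * l * (1/z - 1/z_r).  Pixels with no depth
    information (z == 0) are kept at 0 and treated as invalid."""
    depth = np.asarray(depth, dtype=np.float64)
    d = np.zeros_like(depth)
    valid = depth > 0
    d[valid] = f * l * (1.0 / depth[valid] - 1.0 / z_r)
    return d

# Illustrative numbers: f = 600 px, baseline l = 50 mm, reference plane at 1000 mm.
disp = monocular_disparity([[1000.0, 500.0], [0.0, 2000.0]],
                           f=600.0, l=50.0, z_r=1000.0)
# A point on the reference plane has zero disparity; nearer points get
# positive disparity, farther points negative disparity.
```

Note how the sign convention follows the text: a point at the reference plane (z = z_r) has zero disparity.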
In another example, the structured light camera may be an active binocular structured light camera, and the speckle projector is disposed under any one of the binocular structured light cameras; correspondingly, the process of obtaining the disparity map corresponding to the depth map in this step may include:
the disparity value d of each pixel point in the disparity map is calculated by the following formula (2):

d = f · l / z ………………………(2)

wherein f is the focal length of the active binocular structured light camera, l is the baseline, and z is the depth value of the pixel point in the depth map.
Specifically, for an active binocular structured light system, the disparity map refers to the difference in column coordinates between the same-name points of the two binocular images. Because there is a baseline l between the two eyes (usually two RGB cameras of the same type, with focal length denoted f), according to the triangulation principle the disparity information can be obtained by formula (2) and the disparity map constructed from it. This disparity map amounts to placing the speckle projector under the left (or right) eye camera by default. If the speckle projector is placed elsewhere, the system becomes two monocular structured light systems, and the disparity map is obtained according to formula (1).
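Formula (2) is the limit of formula (1) as the reference plane recedes to infinity. A minimal sketch (function name and sample numbers are illustrative, NumPy assumed):

```python
import numpy as np

def binocular_disparity(depth, f, l):
    """Per-pixel disparity for an active binocular rig, formula (2): d = f * l / z.
    Zero-depth pixels are kept at 0 (invalid / non-information areas)."""
    depth = np.asarray(depth, dtype=np.float64)
    d = np.zeros_like(depth)
    valid = depth > 0
    d[valid] = f * l / depth[valid]
    return d

# Same illustrative intrinsics as before: f = 600 px, l = 50 mm.
disp2 = binocular_disparity([[1000.0, 250.0]], f=600.0, l=50.0)
```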
If other data contain scene depth information, the presence of a speckle projector and an optical receiver (focal length f) separated by a baseline l can be simulated, so that a monocular structured light system is constructed manually and the disparity map is calculated with formula (1).
In addition, the speckle projector pattern may use speckles of different spacings and sizes, as well as speckle patterns of different forms. In general, when the given three-dimensional depth scene information is not dense, for example the sparse point cloud information of an object provided by a lidar scanner, the non-information areas in the disparity map may be temporarily set as invalid areas, i.e., non-speckle-point areas that need no processing in subsequent operations.
Step 103: and mapping the preset reference speckle pattern to the object speckle pattern by taking the parallax pattern as a mapping path to obtain the parallax speckle pattern.
Specifically, the reference speckle pattern generally refers to the pseudo-random pattern, with a certain regularity, that appears when the speckles are projected onto a plane. According to the disparity map obtained in the previous step, the preset reference speckle pattern can be mapped toward the object speckle pattern to obtain a speckle pattern as close as possible to the object speckle pattern. Since this speckle pattern is obtained from the disparity values, it may be called the parallax speckle pattern.
Step 104: and superposing the homologous object image and the parallax speckle pattern to obtain an object speckle pattern corresponding to the depth map.
Specifically, after the parallax speckle pattern is obtained, the parallax speckle pattern is superimposed on the homologous object image to form an object speckle pattern.
Compared with the related art, this embodiment obtains the depth map and the homologous object image of a given structured light camera; obtains the disparity map corresponding to the depth map; maps a preset reference speckle pattern toward the object speckle pattern, with the disparity map as the mapping path, to obtain a parallax speckle pattern; and superimposes the homologous object image and the parallax speckle pattern to obtain the object speckle pattern corresponding to the depth map. By jointly processing the three-dimensional scene information of the given structured light system, namely its depth information and homologous object image, speckle image data corresponding to the depth information is generated, improving the efficiency of obtaining speckle images.
Another embodiment of the present invention relates to a speckle image generation method, as shown in fig. 2, which is an improvement of the steps of the method shown in fig. 1 in that the process of obtaining a parallax speckle pattern is refined. As shown in fig. 2, step 103 includes the following sub-steps.
Step 1031: and determining the homonymous pixel points in the parallax speckle pattern and the reference speckle pattern based on the parallax pattern.
Specifically, according to the disparity map obtained in step 102, the reference speckle pattern may be mapped to the disparity speckle image using a direct mapping manner or an indirect manner. This mapping reflects the pixel location variation of the speckle point. The pixel points in the two speckle patterns with the position change relation are the same-name pixel points.
In one example, the following formula (3) may be used to determine the pixel point (x', y') in the parallax speckle pattern that is the same-name pixel point of the pixel point (x, y) in the reference speckle pattern:

x' = x + d(x, y),  y' = y ………………………(3)

wherein d(x, y) is the disparity value corresponding to the pixel point (x, y).
Specifically, formula (3) is the direct mapping method: the pixel point (x', y') on the parallax speckle image is the same-name pixel point corresponding to the pixel point (x, y) on the reference speckle image. Indirect mapping is done by solving formula (3) backwards. Because the disparity value obtained in practice may not be an integer, linear interpolation can be performed in the x direction to obtain the same-name pixel points; no interpolation is needed in the y direction.
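The indirect (backwards) mapping with x-only linear interpolation can be sketched as follows. This is illustrative: the gather direction, the handling of out-of-bounds samples (left at zero, i.e. invalid), and the function name are implementation choices not fixed by the text, and NumPy is assumed:

```python
import numpy as np

def map_reference_to_parallax(ref, disp):
    """Warp a reference speckle pattern into a parallax speckle pattern.

    Indirect mapping: each target pixel (x', y) gathers the reference pixel
    at x = x' - d (formula (3) solved backwards), with linear interpolation
    along x only; no interpolation is needed along y.
    """
    h, w = ref.shape
    out = np.zeros_like(ref, dtype=np.float64)
    for y in range(h):
        for xp in range(w):
            x = xp - disp[y, xp]              # solve x' = x + d(x, y) for x
            x0 = int(np.floor(x))
            t = x - x0                        # sub-pixel fraction in x
            if 0 <= x0 and x0 + 1 < w:
                out[y, xp] = (1.0 - t) * ref[y, x0] + t * ref[y, x0 + 1]
            elif x0 == w - 1 and t == 0.0:    # exact hit on the last column
                out[y, xp] = ref[y, x0]
    return out

ref = np.array([[0.0, 10.0, 20.0, 30.0]])
shift1 = map_reference_to_parallax(ref, np.full((1, 4), 1.0))      # integer disparity
shift_half = map_reference_to_parallax(ref, np.full((1, 4), 0.5))  # sub-pixel disparity
```

With an integer disparity of 1 the row is shifted one column right; with a disparity of 0.5 each output pixel is the average of two neighbouring reference pixels, matching the x-direction interpolation described above.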
Step 1032: and determining homonymous scattered spots based on homonymous pixel points.
Specifically, each speckle point typically contains a plurality of pixel points. Due to the homonymy relationship among the pixel points, the homonymy relationship among the scattered spots can be further obtained, and the specific pixel position of the scattered spot in the parallax speckle pattern is further determined according to the speckle point in the reference speckle pattern.
In one example, the process of determining the homonymous scatters based on the homonymous pixel points can be implemented as follows.
Step 1: and determining pixel points included by each speckle point in the reference speckle pattern.
Specifically, a brightness threshold may be set, and the brightness value of each pixel point compared with it. Pixel points above the brightness threshold are pixel points that form speckle points; pixel points not above it do not form speckle points. The speckle-forming pixel points are then grouped by pixel position: adjacent pixel points located in the same region belong to one speckle point.
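The threshold-then-group step above can be sketched as a flood fill over the thresholded mask. The 4-connectivity and the function name are illustrative assumptions (the text only requires that adjacent pixels in the same region share a speckle), and NumPy is assumed:

```python
import numpy as np
from collections import deque

def segment_speckles(img, thresh):
    """Group pixel points brighter than `thresh` into speckle points by
    4-connected flood fill; returns an integer label map (0 = background)."""
    mask = np.asarray(img) > thresh
    h, w = mask.shape
    labels = np.zeros((h, w), dtype=int)
    next_label = 0
    for sy in range(h):
        for sx in range(w):
            if mask[sy, sx] and labels[sy, sx] == 0:
                next_label += 1               # start a new speckle point
                labels[sy, sx] = next_label
                queue = deque([(sy, sx)])
                while queue:
                    y, x = queue.popleft()
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny, nx] and labels[ny, nx] == 0:
                            labels[ny, nx] = next_label
                            queue.append((ny, nx))
    return labels

labels = segment_speckles(np.array([[9, 9, 0, 0], [0, 0, 0, 9]]), thresh=5)
```

Here the two bright pixels in the first row form one speckle point, and the isolated bright pixel in the second row forms a second one.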
Step 2: and taking pixel points in an area surrounded by the homonymous pixel points corresponding to the edge pixel points included in the speckle points in the parallax speckle pattern as the pixel points included in the corresponding homonymous speckle.
For example, a speckle point in the reference speckle pattern includes a pixel points, where b (b < a) pixel points are edge pixel points, b 'homonymous pixel points corresponding to the b pixel points are determined in the parallax speckle pattern, then an area enclosed by the b' homonymous pixel points as edges is regarded as a speckle, and the speckle point is a homonymous speckle corresponding to the speckle point in the reference speckle pattern.
Step 1033: and determining the brightness distribution of the same-name scattered spots in the parallax speckle pattern according to the brightness distribution of each scattered spot in the reference speckle pattern.
Specifically, although executing step 1032 establishes the correspondence of the speckle pixel positions, the correspondence of the brightness information is still lacking. Since the brightness of a speckle point is inversely proportional to the square of its radius, the mapping of the speckle brightness information is obtained by counting the change in radius of each speckle point before and after mapping. The relationship between brightness and speckle radius in the x direction of the parallax speckle pattern is

I_x ∝ 1/r²

that is, the brightness value I_x at coordinate x is inversely proportional to the square of its radius r from the center point of the speckle point.
In one example, this step may be implemented as follows: for each speckle point in the reference speckle pattern, respectively count the total brightness value I_all of the pixel points contained in the speckle point, and determine from I_all the brightness value I_o of the central pixel point of the corresponding same-name speckle point in the parallax speckle pattern; the brightness of a pixel point at radius r from the central pixel point is then set to I_o / (1 + r)², where r is the number of pixel points and r ≥ 0.
Specifically, after the same-name speckle points in the reference speckle pattern and the parallax speckle pattern are determined, the brightness values of the pixel points contained in a speckle point of the reference speckle pattern are summed to obtain the total brightness value I_all. The corresponding same-name speckle point in the parallax speckle pattern is then located, the brightness value I_o of its central pixel point is set from I_all, and the brightness values of the other pixel points of the same-name speckle point are set so that a pixel point at radius r from the central pixel point has brightness I_o / (1 + r)², where r is the number of pixel points and r ≥ 0. For example, when r = 0 the current pixel point is the central pixel point; when r = 1 the current pixel point lies on the first layer of pixels surrounding the central pixel point, and so on.
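The inverse-square falloff just described can be sketched as follows. The (1 + r)² offset (so that r = 0 selects the central pixel without dividing by zero) and the use of Chebyshev-distance pixel rings are illustrative assumptions; the source gives the exact centre-brightness formula only as an image, so I_o is taken here as a given input:

```python
import numpy as np

def speckle_brightness(size, center_brightness):
    """Brightness of a square speckle patch: the central pixel gets
    center_brightness (I_o), and a pixel r rings away (r = number of
    pixel layers, measured as Chebyshev distance) gets I_o / (1 + r)**2."""
    c = size // 2
    ys, xs = np.mgrid[0:size, 0:size]
    r = np.maximum(np.abs(ys - c), np.abs(xs - c))   # ring index of each pixel
    return center_brightness / (1.0 + r) ** 2

spot = speckle_brightness(5, 100.0)   # 5x5 speckle with I_o = 100
```

The centre pixel keeps I_o, the first surrounding layer gets I_o / 4, the second layer I_o / 9, and so on.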
Compared with the related art, the method and the device determine the homonymous pixel points in the parallax speckle pattern and the reference speckle pattern based on the parallax pattern; determining homonymous scattered spots based on homonymous pixel points; and determining the brightness distribution of the same-name scattered spots in the parallax speckle pattern according to the brightness distribution of each scattered spot in the reference speckle pattern, thereby realizing the mapping of the reference speckle pattern to the object speckle pattern and obtaining the parallax speckle pattern.
Another embodiment of the present invention relates to a speckle image generation method, as shown in fig. 3, which is an improvement over the method steps shown in fig. 1 in that the process of superimposing the images of the homologous objects with the parallax speckles is refined. As shown in fig. 3, step 104 includes the following sub-steps.
Step 1041: and (4) counting the brightness mean value and the contrast corresponding to the pixel points contained in each scattered spot in the parallax speckle pattern in the image of the homologous object.
Specifically, because the homologous object image is usually an infrared or color image, a parallax speckle pattern superimposed directly onto the object image without processing appears obtrusive. In this embodiment, a dodging technique can be used so that the overall brightness and contrast of the speckle points before and after superposition are coordinated with the surrounding pixels. First, for the region corresponding to each speckle point, the brightness mean value I and the contrast value c (the ratio of the effective luminance maximum to the effective luminance minimum) of the pixel points of the homologous infrared image are counted.
Step 1041: calculating the second value contained in each speckle in the object speckle pattern by the following formula (4) based on the brightness mean value and the contrastiBrightness of each pixel pointI i ': the specific calculation method is as follows:
Figure 68273DEST_PATH_IMAGE008
………………………(4)
Wherein, the first and the second end of the pipe are connected with each other,J i the first to be included in each speckle in the parallax speckle patterniThe brightness of each pixel point is determined,Iis the average value of the luminance values,cis the contrast.
Specifically, for the finally generated object speckle pattern, the pixel points in the speckle point region can calculate the corresponding brightness values through the formula (4), and the pixel points in the non-speckle point region still adopt the brightness values of the pixel points with the same name in the parallax speckle pattern.
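The per-speckle statistics of step 1041 can be sketched as follows. The function name is illustrative, "effective" brightness is assumed here to mean non-zero pixels, and NumPy is assumed; the final blend of J_i with I and c is not shown because formula (4) is not reproduced in the source text:

```python
import numpy as np

def speckle_region_stats(obj_img, labels):
    """For each labelled speckle region, measure in the homologous object
    image the brightness mean I and the contrast c (ratio of the maximum
    to the minimum effective brightness; zero-valued pixels are assumed
    non-effective and ignored)."""
    stats = {}
    for lbl in np.unique(labels):
        if lbl == 0:
            continue                       # label 0 = non-speckle area
        vals = obj_img[labels == lbl]
        vals = vals[vals > 0]              # keep only "effective" brightness
        stats[int(lbl)] = (float(vals.mean()), float(vals.max() / vals.min()))
    return stats

obj = np.array([[10.0, 20.0], [0.0, 40.0]])
labs = np.array([[1, 1], [0, 2]])
stats = speckle_region_stats(obj, labs)    # {speckle label: (I, c)}
```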
Compared with the related art, this embodiment counts the brightness mean value and the contrast, in the homologous object image, of the pixel points contained in each speckle point of the parallax speckle pattern, and calculates from them the brightness I_i' of the i-th pixel point contained in each speckle point of the object speckle pattern, thereby ensuring that the overall brightness and contrast of the speckle points before and after superposition are coordinated with the surrounding pixels.
Another embodiment of the invention relates to an electronic device, as shown in FIG. 4, comprising at least one processor 202; and a memory 201 communicatively coupled to the at least one processor 202; wherein the memory 201 stores instructions executable by the at least one processor 202, the instructions being executable by the at least one processor 202 to enable the at least one processor 202 to perform any of the method embodiments described above.
The memory 201 and the processor 202 are connected by a bus, which may comprise any number of interconnected buses and bridges linking together various circuits of one or more processors 202 and the memory 201. The bus may also connect various other circuits, such as peripherals, voltage regulators, and power management circuits, which are well known in the art and are therefore not described further herein. A bus interface provides an interface between the bus and the transceiver. The transceiver may be one element or a plurality of elements, such as a plurality of receivers and transmitters, providing a unit for communicating with various other apparatus over a transmission medium. Data processed by the processor 202 is transmitted over a wireless medium through an antenna, which further receives incoming data and transmits it to the processor 202.
The processor 202 is responsible for managing the bus and general processing and may also provide various functions including timing, peripheral interfaces, voltage regulation, power management, and other control functions. While memory 201 may be used to store data used by processor 202 in performing operations.
Another embodiment of the present invention relates to a computer-readable storage medium storing a computer program. The computer program realizes any of the above-described method embodiments when executed by a processor.
That is, as those skilled in the art can understand, all or part of the steps of the methods in the above embodiments may be implemented by a program instructing the relevant hardware. The program is stored in a storage medium and includes several instructions to cause a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It will be understood by those of ordinary skill in the art that the foregoing embodiments are specific examples for carrying out the invention, and that various changes in form and details may be made therein without departing from the spirit and scope of the invention in practice.

Claims (10)

1. A speckle image generation method, comprising:
Acquiring a depth map and a homologous object image of a given structured light camera;
acquiring a disparity map corresponding to the depth map;
taking the parallax map as a mapping path, and mapping a preset reference speckle map to an object speckle map to obtain a parallax speckle map, wherein the parallax speckle map is a speckle map close to the object speckle map;
and superposing the homologous object image and the parallax speckle pattern to obtain an object speckle pattern corresponding to the depth map.
2. The method of claim 1, wherein the structured light camera is a monocular structured light camera;
the acquiring of the disparity map corresponding to the depth map includes:
calculating the disparity value d of each pixel point in the disparity map by the following formula:

d = f · l · (1/z - 1/z_r)

wherein f is the focal length of the monocular structured light camera, l is the baseline, z is the depth value of the pixel point in the depth map, and z_r is the distance from the plane of the reference speckle pattern to the optical center of the lens of the monocular structured light camera.
3. The method of claim 1, wherein the structured light camera is an active binocular structured light camera and the speckle projector is positioned under any of the binocular structured light cameras;
The acquiring of the disparity map corresponding to the depth map includes:
calculating the disparity value d of each pixel point in the disparity map by the following formula:

d = f · l / z

wherein f is the focal length of the active binocular structured light camera, l is the baseline, and z is the depth value of the pixel point in the depth map.
4. The method according to any one of claims 1 to 3, wherein the mapping a preset reference speckle pattern to an object speckle pattern by using the disparity map as a mapping path to obtain the disparity speckle pattern comprises:
determining homonymous pixel points in the parallax speckle pattern and the reference speckle pattern based on the parallax pattern;
determining homonymous scattered spots based on the homonymous pixel points;
and determining the brightness distribution of the same-name scattered spots in the parallax speckle pattern according to the brightness distribution of each scattered spot in the reference speckle pattern.
5. The method of claim 4, wherein the determining, based on the disparity map, pixels of the disparity speckle map that are of the same name as pixels in the reference speckle map comprises:
determining, by the following formula, the pixel point (x', y') in the parallax speckle pattern that is the same-name pixel point of the pixel point (x, y) in the reference speckle pattern:

x' = x + d(x, y),  y' = y

wherein d(x, y) is the disparity value corresponding to the pixel point (x, y).
6. The method of claim 4, wherein said determining a homonymous blob based on said homonymous pixel points comprises:
determining pixel points included by each speckle point in the reference speckle pattern;
and taking pixel points in an area surrounded by homonymous pixel points corresponding to edge pixel points included in the speckle points in the parallax speckle pattern as pixel points included in the corresponding homonymous speckle.
7. The method of claim 4, wherein the determining the brightness distribution of the homonymous speckle points in the parallax speckle pattern according to the brightness distribution of each speckle point in the reference speckle pattern comprises:
counting, for each speckle point in the reference speckle pattern, the total brightness value I_all of the pixel points included in the speckle point, and determining the brightness value I_o of the central pixel point of the homonymous speckle point corresponding to the speckle point in the parallax speckle pattern as
I_o = I_all / (2π)
and the brightness of a pixel point at radius r from the central pixel point as
I_r = I_o · e^(−r² / 2)
where r is the distance, in pixels, from the central pixel point.
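A minimal sketch of the brightness-distribution step of claim 7, under a unit-variance Gaussian reading of the formulas (I_o = I_all/2π with falloff e^(−r²/2)). The source renders the formulas as images, so this reconstruction is an assumption; it is at least self-consistent, since integrating the unit-sigma Gaussian recovers I_all.

```python
import math
import numpy as np

def render_speckle(size: int, cx: int, cy: int, i_all: float) -> np.ndarray:
    """Render one homonymous speckle on a size x size patch: the center pixel
    gets I_o = I_all / (2*pi) and a pixel at radius r gets I_o * exp(-r**2 / 2).

    With a unit-sigma Gaussian the discrete brightness sums back to roughly
    I_all, matching the 'total brightness value' the claim counts.
    """
    i_o = i_all / (2.0 * math.pi)
    ys, xs = np.mgrid[0:size, 0:size]
    r2 = (xs - cx) ** 2 + (ys - cy) ** 2  # squared distance to the center pixel
    return i_o * np.exp(-r2 / 2.0)

spot = render_speckle(size=7, cx=3, cy=3, i_all=100.0)
print(spot[3, 3], spot.sum())  # center ~= 100 / (2*pi); total ~= 100
```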
8. The method of claim 4, wherein the superimposing the homologous object image with the parallax speckle pattern to obtain the object speckle pattern corresponding to the depth map comprises:
calculating, in the homologous object image, the brightness mean value and the contrast corresponding to the pixel points included in each speckle point in the parallax speckle pattern; and
calculating, based on the brightness mean value and the contrast and by the following formula, the brightness I_i′ of the i-th pixel point included in each speckle point in the object speckle pattern:
I_i′ = I + c · J_i
where J_i is the brightness of the i-th pixel point included in the speckle point in the parallax speckle pattern, I is the brightness mean value, and c is the contrast.
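A sketch of the superposition step of claim 8. The exact formula appears only as an image in the source; the linear form I_i′ = I + c·J_i used here is an assumed reconstruction from the variables the claim lists (J_i, I, c), and the 8-bit clipping is an added convenience, not part of the claim.

```python
import numpy as np

def superimpose(speckle_j: np.ndarray, mean_i: float, contrast_c: float) -> np.ndarray:
    """Superimpose one speckle onto the object image region.

    Assumed form (formula not reproduced in the source): I_i' = I + c * J_i
    for every pixel i of the speckle, clipped to the 8-bit brightness range.
    """
    out = mean_i + contrast_c * speckle_j
    return np.clip(out, 0.0, 255.0)  # keep results displayable as 8-bit brightness

print(superimpose(np.array([10.0, 20.0]), mean_i=100.0, contrast_c=0.5))
```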
9. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the speckle image generation method of any one of claims 1 to 8.
10. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the speckle image generation method according to any one of claims 1 to 8.
CN202111224652.XA 2021-10-21 2021-10-21 Speckle image generation method, electronic device, and storage medium Active CN113936050B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111224652.XA CN113936050B (en) 2021-10-21 2021-10-21 Speckle image generation method, electronic device, and storage medium

Publications (2)

Publication Number Publication Date
CN113936050A CN113936050A (en) 2022-01-14
CN113936050B true CN113936050B (en) 2022-08-12

Family

ID=79281061

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111224652.XA Active CN113936050B (en) 2021-10-21 2021-10-21 Speckle image generation method, electronic device, and storage medium

Country Status (1)

Country Link
CN (1) CN113936050B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114255233B (en) * 2022-03-01 2022-05-31 合肥的卢深视科技有限公司 Speckle pattern quality evaluation method and device, electronic device and storage medium
CN114627174A (en) * 2022-03-30 2022-06-14 杭州萤石软件有限公司 Depth map generation system and method and autonomous mobile device

Citations (2)

Publication number Priority date Publication date Assignee Title
WO2019205887A1 (en) * 2018-04-28 2019-10-31 Oppo广东移动通信有限公司 Method and apparatus for controlling photographing, electronic device, and computer readable storage medium
CN112771573A (en) * 2019-04-12 2021-05-07 深圳市汇顶科技股份有限公司 Depth estimation method and device based on speckle images and face recognition system

Family Cites Families (11)

Publication number Priority date Publication date Assignee Title
WO2017138210A1 (en) * 2016-02-12 2017-08-17 ソニー株式会社 Image pickup apparatus, image pickup method, and image pickup system
WO2018133027A1 (en) * 2017-01-20 2018-07-26 深圳大学 Grayscale constraint-based method and apparatus for integer-pixel search for three-dimensional digital speckle pattern
CN108399596B (en) * 2018-02-07 2020-12-18 深圳奥比中光科技有限公司 Depth image engine and depth image calculation method
WO2019196683A1 (en) * 2018-04-12 2019-10-17 Oppo广东移动通信有限公司 Method and device for image processing, computer-readable storage medium, and electronic device
CN108764052B (en) * 2018-04-28 2020-09-11 Oppo广东移动通信有限公司 Image processing method, image processing device, computer-readable storage medium and electronic equipment
EP3672223B1 (en) * 2018-04-28 2022-12-28 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Data processing method, electronic device, and computer-readable storage medium
WO2019205890A1 (en) * 2018-04-28 2019-10-31 Oppo广东移动通信有限公司 Image processing method, apparatus, computer-readable storage medium, and electronic device
EP3644261B1 (en) * 2018-04-28 2023-09-06 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image processing method, apparatus, computer-readable storage medium, and electronic device
US11176694B2 (en) * 2018-10-19 2021-11-16 Samsung Electronics Co., Ltd Method and apparatus for active depth sensing and calibration method thereof
CN111145342B (en) * 2019-12-27 2024-04-12 山东中科先进技术研究院有限公司 Binocular speckle structured light three-dimensional reconstruction method and system
CN112927280B (en) * 2021-03-11 2022-02-11 北京的卢深视科技有限公司 Method and device for acquiring depth image and monocular speckle structured light system



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220516

Address after: 230091 room 611-217, R & D center building, China (Hefei) international intelligent voice Industrial Park, 3333 Xiyou Road, high tech Zone, Hefei, Anhui Province

Applicant after: Hefei lushenshi Technology Co.,Ltd.

Address before: 100083 room 3032, North B, bungalow, building 2, A5 Xueyuan Road, Haidian District, Beijing

Applicant before: BEIJING DILUSENSE TECHNOLOGY CO.,LTD.

Applicant before: Hefei lushenshi Technology Co.,Ltd.

GR01 Patent grant