CN113936050A - Speckle image generation method, electronic device, and storage medium - Google Patents
- Publication number: CN113936050A
- Application number: CN202111224652.XA
- Authority: CN (China)
- Prior art keywords
- speckle
- speckle pattern
- parallax
- map
- pixel points
- Legal status: Granted
Classifications
- G06T7/55: Depth or shape recovery from multiple images
- G06T11/00: 2D [Two Dimensional] image generation
- G06T5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G06T2207/20216: Image averaging (indexing scheme, special algorithmic details)
Abstract
Embodiments of the invention relate to the field of image processing and disclose a speckle image generation method, an electronic device, and a storage medium. The method includes: acquiring a depth map and a homologous object image of a given structured light camera; acquiring a disparity map corresponding to the depth map; taking the disparity map as a mapping path, mapping a preset reference speckle pattern to the object speckle pattern to obtain a parallax speckle pattern; and superimposing the homologous object image and the parallax speckle pattern to obtain the object speckle pattern corresponding to the depth map. By comprehensively processing the three-dimensional scene information of a given structured light system, i.e., the depth information and the homologous object image, speckle image data corresponding to the depth information are generated, which improves the efficiency of obtaining speckle images.
Description
Technical Field
The present invention relates to the field of image processing, and in particular, to a speckle image generation method, an electronic device, and a storage medium.
Background
At present, depth perception is the most active technical branch in the field of machine vision, and speckle structured light is an important part of depth perception technology. As the most common active stereoscopic vision technology, speckle structured light is further divided into monocular structured light systems, active binocular structured light systems, and the like. Speckle image data are of great importance both to traditional algorithms and to depth recovery algorithms based on machine learning and even deep learning.
Generally, the results obtained when using neural network methods to recover depth information from a structured light speckle system are no better than those of passive binocular color systems; one reason is that the training scale of the neural network is directly limited by the insufficient volume of speckle image data. At present, speckle image data are acquired directly with a speckle structured light camera, and this kind of acquisition suffers from low efficiency.
Disclosure of Invention
An object of embodiments of the present invention is to provide a speckle image generation method, an electronic device, and a storage medium capable of generating speckle image data corresponding to depth information by comprehensively processing the three-dimensional scene information of a given structured light system, i.e., the depth information and a homologous object image, thereby improving the efficiency of obtaining speckle images.
In order to solve the above technical problem, an embodiment of the present invention provides a speckle image generating method, including:
acquiring a depth map and a homologous object image of a given structured light camera;
acquiring a disparity map corresponding to the depth map;
taking the parallax map as a mapping path, and mapping a preset reference speckle pattern to an object speckle pattern to obtain a parallax speckle pattern;
and superposing the homologous object image and the parallax speckle pattern to obtain an object speckle pattern corresponding to the depth map.
An embodiment of the present invention also provides an electronic device, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the speckle image generation method as described above.
Embodiments of the present invention also provide a computer-readable storage medium storing a computer program which, when executed by a processor, implements the speckle image generation method as described above.
Compared with the prior art, embodiments of the present application acquire a depth map and a homologous object image of a given structured light camera; acquire a disparity map corresponding to the depth map; map a preset reference speckle pattern to the object speckle pattern using the disparity map as a mapping path to obtain a parallax speckle pattern; and superimpose the homologous object image and the parallax speckle pattern to obtain the object speckle pattern corresponding to the depth map. By comprehensively processing the three-dimensional scene information of the given structured light system, i.e., the depth information and the homologous object image, speckle image data corresponding to the depth information are generated, which improves the efficiency of obtaining speckle images.
Drawings
Fig. 1 is the first detailed flowchart of a speckle image generation method according to an embodiment of the present invention;
Fig. 2 is the second detailed flowchart of a speckle image generation method according to an embodiment of the present invention;
Fig. 3 is the third detailed flowchart of a speckle image generation method according to an embodiment of the present invention;
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the embodiments of the present invention are described in detail below with reference to the accompanying drawings. Those of ordinary skill in the art will appreciate that numerous technical details are set forth in the various embodiments in order to provide a better understanding of the present application; however, the technical solution claimed in the present application can be implemented without these technical details, and various changes and modifications may be made based on the following embodiments.
An embodiment of the present invention relates to a speckle image generation method, and as shown in fig. 1, the speckle image generation method provided in this embodiment includes the following steps.
Step 101: a depth map and a homologous object image are acquired for a given structured light camera.
The structured light camera can be either a monocular structured light camera or an active binocular structured light camera; the depth map may be a depth map based on a speckle image; and the homologous object image can be an infrared image or a color image.
Specifically, the given depth map and homologous object image may be obtained by actually shooting a target object with a structured light camera; alternatively, a depth map and a homologous object image may be rendered by image synthesis software or the like, and these images treated as a depth map and a homologous object image captured by a given structured light camera.
Step 102: and acquiring a disparity map corresponding to the depth map.
Specifically, based on the type of the structured light camera, the imaging principle, the internal and external parameters, and the correspondence between the depth information and the parallax information, a corresponding parallax map may be acquired based on the depth map.
In one example, the structured light camera may be a monocular structured light camera; correspondingly, the process of obtaining the disparity map corresponding to the depth map in this step may include:
calculating the disparity value d of each pixel point in the disparity map by the following formula (1):

d = f · l · (1/z − 1/z_r)    (1)

where f is the focal length of the monocular structured light camera, l is the baseline, z is the depth value of the pixel point in the depth map, and z_r is the distance from the plane of the reference speckle pattern to the optical center of the lens of the monocular structured light camera.

Specifically, for a monocular structured light system, the disparity map refers to the difference between the column coordinates of pixels of the same name in an object map (e.g., an object speckle pattern) and a reference map (e.g., the reference speckle pattern). Because there is a baseline l between the speckle projector and the optical sensor (usually an infrared IR lens with focal length f), the disparity value between a given three-dimensional point (x, y, z) and the reference map (usually a plane at z = z_r) can be calculated by formula (1) according to the triangulation principle.
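As a concrete illustration, formula (1) can be sketched in a few lines of NumPy; the function name and the NaN convention for pixels without depth are our own, not part of the patent:

```python
import numpy as np

def monocular_disparity(depth, f, l, z_r):
    """Formula (1): d = f * l * (1/z - 1/z_r) for a monocular structured-light
    camera, where the depth map holds z, f is the focal length (in pixels),
    l is the baseline, and z_r is the reference-plane distance.  Pixels
    without depth (z <= 0) are marked NaN as invalid, non-speckle areas."""
    z = np.asarray(depth, dtype=np.float64)
    d = np.full(z.shape, np.nan)
    valid = z > 0
    d[valid] = f * l * (1.0 / z[valid] - 1.0 / z_r)
    return d
```

For example, with f = 500 px, l = 50 mm, z = 1000 mm, and z_r = 2000 mm, the disparity is 12.5 px; a point lying on the reference plane (z = z_r) has zero disparity.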
In another example, the structured light camera may be an active binocular structured light camera, and the speckle projector is disposed under any one of the binocular structured light cameras; correspondingly, the process of obtaining the disparity map corresponding to the depth map in this step may include:
calculating the disparity value d of each pixel point in the disparity map by the following formula (2):

d = f · l / z    (2)

where f is the focal length of the active binocular structured light camera, l is the baseline, and z is the depth value of the pixel point in the depth map.

Specifically, for an active binocular structured light system, the disparity map refers to the difference in column coordinates between homonymous points of the binocular images. Because there is a baseline l between the two eyes (usually two RGB cameras of the same type, with focal length denoted f), the parallax information can be obtained through formula (2) according to the triangulation principle, and a disparity map can then be constructed. This disparity map amounts to placing the speckle projector on the left (or right) eye by default. If the speckle projector is placed at another position, the system becomes two monocular structured light systems, and the disparity map is obtained according to formula (1).

If other data contain scene depth information, the presence of a speckle projector and an optical receiver (with focal length f) can be simulated, with the distance between them taken as the baseline l; a monocular structured light system can thus be constructed manually, and the disparity map calculated using formula (1).
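The active binocular case of formula (2) is even simpler; again the function name and the NaN convention for missing depth are illustrative assumptions:

```python
import numpy as np

def binocular_disparity(depth, f, l):
    """Formula (2): d = f * l / z for an active binocular structured-light
    rig, with focal length f (in pixels) and baseline l; z comes from the
    depth map.  Pixels without depth (z <= 0) are marked NaN."""
    z = np.asarray(depth, dtype=np.float64)
    d = np.full(z.shape, np.nan)
    valid = z > 0
    d[valid] = f * l / z[valid]
    return d
```

With f = 500 px, l = 50 mm, and z = 1000 mm this gives a disparity of 25 px.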
In addition, the speckle projector pattern can use speckles of different spacings and sizes as well as speckle patterns of different forms. When the given three-dimensional depth scene information is not dense, for example sparse point cloud information of an object provided by a lidar scanner, the non-information areas in the disparity map may in this embodiment be temporarily set as invalid areas, i.e., non-speckle-point areas, which need not be processed in subsequent operations.
Step 103: and mapping the preset reference speckle pattern to the object speckle pattern by taking the parallax pattern as a mapping path to obtain the parallax speckle pattern.
Specifically, the reference speckle pattern generally refers to the pseudo-random pattern with a certain regularity that appears when the speckles are projected onto a plane. According to the disparity map obtained in the previous step, the preset reference speckle pattern can be mapped toward the object speckle pattern to obtain a speckle pattern as close as possible to the object speckle pattern. Since this speckle pattern is obtained based on the disparity values, it may be referred to as the parallax speckle pattern.
Step 104: and superposing the homologous object image and the parallax speckle pattern to obtain an object speckle pattern corresponding to the depth map.
Specifically, after the parallax speckle pattern is obtained, the parallax speckle pattern is superimposed on the homologous object image to form an object speckle pattern.
Compared with the related art, the method has the advantages that the depth map and the homologous object image of the given structured light camera are obtained; acquiring a disparity map corresponding to the depth map; mapping a preset reference speckle pattern to an object speckle pattern by taking the parallax pattern as a mapping path to obtain a parallax speckle pattern; and superposing the homologous object image and the parallax speckle pattern to obtain an object speckle pattern corresponding to the depth map. According to the method and the device, the three-dimensional scene information, namely the depth information and the homologous object image of the given structured light system are comprehensively processed to generate the speckle image data corresponding to the depth information, so that the efficiency of obtaining the speckle image is improved.
Another embodiment of the present invention relates to a speckle image generation method, as shown in fig. 2, which is an improvement of the steps of the method shown in fig. 1 in that the process of obtaining a parallax speckle pattern is refined. As shown in fig. 2, step 103 includes the following sub-steps.
Step 1031: and determining the homonymous pixel points in the parallax speckle pattern and the reference speckle pattern based on the parallax pattern.
Specifically, according to the disparity map obtained in step 102, the reference speckle pattern may be mapped to the parallax speckle image using either a direct or an indirect mapping. This mapping reflects the change in pixel position of the speckle points. Pixel points in the two speckle patterns related by this position change are homonymous (same-name) pixel points.
In one example, the following formula (3) may be used to determine that a pixel point (x', y') in the parallax speckle pattern and a pixel point (x, y) in the reference speckle pattern are homonymous pixel points:

x' = x + d(x, y), y' = y    (3)

where d(x, y) is the disparity value corresponding to the pixel point (x, y).

Specifically, formula (3) is the direct mapping method, in which the pixel point (x', y') on the parallax speckle image is the homonymous pixel point corresponding to the pixel point (x, y) on the reference speckle image. Indirect mapping is done by solving formula (3) in reverse. Because the actually calculated disparity value may not be an integer, linear interpolation may be carried out in the x direction to obtain the homonymous pixel points; no interpolation is needed in the y direction.
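The direct mapping with linear interpolation in the x direction can be sketched as follows; the helper name and the accumulation scheme (splitting each reference pixel between the two nearest target columns) are our own reading of the text, with disparity taken as object column minus reference column:

```python
import numpy as np

def warp_reference(ref, disparity):
    """Direct mapping: each reference pixel (x, y) lands at (x + d(x, y), y)
    in the parallax speckle pattern.  Non-integer targets are split between
    the two neighbouring columns by linear interpolation in x; no
    interpolation is needed in y.  NaN disparities (invalid areas) are
    skipped."""
    h, w = ref.shape
    out = np.zeros((h, w), dtype=np.float64)
    for y in range(h):
        for x in range(w):
            d = disparity[y, x]
            if np.isnan(d):
                continue
            xt = x + d                     # target column, possibly fractional
            x0 = int(np.floor(xt))
            a = xt - x0                    # weight of the right-hand column
            if 0 <= x0 < w:
                out[y, x0] += (1 - a) * ref[y, x]
            if 0 <= x0 + 1 < w:
                out[y, x0 + 1] += a * ref[y, x]
    return out
```

A bright reference pixel with a disparity of 0.5, for instance, is spread half-and-half over the two adjacent columns of the parallax speckle pattern.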
Step 1032: and determining homonymous scattered spots based on homonymous pixel points.
Specifically, each speckle point typically contains a plurality of pixel points. Due to the homonymy relationship among the pixel points, the homonymy relationship among the scattered spots can be further obtained, and the specific pixel position of the scattered spot in the parallax speckle pattern is further determined according to the speckle point in the reference speckle pattern.
In one example, the process of determining the homonymous scatters based on the homonymous pixel points can be implemented as follows.
Step 1: and determining pixel points included by each speckle point in the reference speckle pattern.
Specifically, a brightness threshold may be set and the brightness value of each pixel point compared with it. Pixel points above the brightness threshold are pixel points that form speckle points; pixel points not above the threshold do not form speckle points. Then, the pixel points forming speckle points are aggregated and divided according to their pixel positions: adjacent pixel points located in the same region belong to one speckle point.
Step 2: and taking pixel points in an area surrounded by the homonymous pixel points corresponding to the edge pixel points included in the speckle points in the parallax speckle pattern as the pixel points included in the corresponding homonymous speckle.
For example, suppose a speckle point in the reference speckle pattern includes a pixel points, of which b (b < a) are edge pixel points. The b homonymous pixel points corresponding to these b pixel points are determined in the parallax speckle pattern; the area enclosed with these b homonymous pixel points as its edge is then regarded as a speckle, and this speckle is the homonymous speckle corresponding to that speckle point in the reference speckle pattern.
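Step 1 above (threshold the brightness, then aggregate adjacent bright pixels into speckle points) can be sketched with a simple flood fill; the 4-connectivity choice and the function name are assumptions:

```python
import numpy as np
from collections import deque

def segment_speckles(img, thresh):
    """Label speckle points: pixels brighter than `thresh` form speckle
    points, and 4-connected bright neighbours are aggregated into one
    speckle point.  Returns a label map (0 = background, 1..n = ids)."""
    img = np.asarray(img)
    h, w = img.shape
    mask = img > thresh
    labels = np.zeros((h, w), dtype=int)
    next_id = 0
    for sy in range(h):
        for sx in range(w):
            if mask[sy, sx] and labels[sy, sx] == 0:
                next_id += 1
                labels[sy, sx] = next_id
                queue = deque([(sy, sx)])
                while queue:               # flood-fill one speckle point
                    y, x = queue.popleft()
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and labels[ny, nx] == 0:
                            labels[ny, nx] = next_id
                            queue.append((ny, nx))
    return labels
```

Each distinct label then identifies one speckle point whose edge pixels can be mapped through formula (3) as described above.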
Step 1033: and determining the brightness distribution of the same-name scattered spots in the parallax speckle pattern according to the brightness distribution of each scattered spot in the reference speckle pattern.
Specifically, although executing step 1032 establishes the correspondence between the pixel positions of the speckle points, the correspondence of the brightness information is still lacking. Since the brightness information of a speckle point is inversely proportional to the square of its radius, the mapping of the brightness information of the speckle points is obtained by counting the change in radius of each speckle point before and after the mapping. The relationship between brightness and speckle radius in the x direction of the parallax speckle pattern is I_x ∝ 1/r², i.e., the brightness value I_x at coordinate x is inversely proportional to the square of its radius r from the center point of the speckle point.
In one example, this step may be implemented as follows: for each speckle point in the reference speckle pattern, count the total brightness value I_all of the pixel points contained in the speckle point, and set the brightness value I_o of the central pixel point of the corresponding homonymous speckle in the parallax speckle pattern so that the total brightness of the homonymous speckle equals I_all; the brightness of a pixel point at radius r from the central pixel point is then I_o/(1 + r)², where r is the number of pixels.

Specifically, after the homonymous speckles in the reference speckle pattern and the parallax speckle pattern are determined, the brightness values of the pixel points contained in a speckle point in the reference speckle pattern are counted to obtain the total brightness value I_all. Then the homonymous speckle corresponding to that speckle point is found in the parallax speckle pattern, and the brightness value I_o of its central pixel point is set accordingly. After the brightness value I_o of the central pixel point is determined, the brightness values of the other pixel points in the homonymous speckle are set: the brightness of a pixel point at radius r from the central pixel point is I_o/(1 + r)², where r is the number of pixel points and r ≥ 0. For example, when r = 0 the current pixel point is the central pixel point; when r = 1 the current pixel point is in the first layer of pixels surrounding the central pixel point, and so on.
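Under the inverse-square falloff just described, the brightness distribution of a homonymous speckle can be sketched as below; the (1 + r)² form (so that r = 0 is the centre pixel itself) and the normalisation that preserves the total brightness I_all are our reconstruction, since the patent's own formula images are not reproduced in this text:

```python
def speckle_brightness(I_all, radii):
    """Distribute a speckle's total brightness I_all over its pixels.
    `radii` lists, for every pixel of the homonymous speckle, its distance r
    (in pixels) from the centre pixel; each pixel gets I_o / (1 + r)**2,
    with the centre brightness I_o chosen so the values sum back to I_all."""
    weights = [1.0 / (1 + r) ** 2 for r in radii]
    I_o = I_all / sum(weights)
    return [I_o * w for w in weights]
```

For a three-pixel speckle with radii [0, 1, 1] and I_all = 9, this yields brightnesses [6, 1.5, 1.5], which sum back to the original total.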
Compared with the related art, the method and the device determine the homonymous pixel points in the parallax speckle pattern and the reference speckle pattern based on the parallax pattern; determining homonymous scattered spots based on homonymous pixel points; and determining the brightness distribution of the same-name scattered spots in the parallax speckle pattern according to the brightness distribution of each scattered spot in the reference speckle pattern, thereby realizing the mapping of the reference speckle pattern to the object speckle pattern and obtaining the parallax speckle pattern.
Another embodiment of the present invention relates to a speckle image generation method, as shown in fig. 3, which is an improvement over the method steps shown in fig. 1 in that the process of superimposing the images of the homologous objects with the parallax speckles is refined. As shown in fig. 3, step 104 includes the following sub-steps.
Step 1041: in the homologous object image, count the brightness mean value and the contrast corresponding to the pixel points contained in each speckle in the parallax speckle pattern.

Specifically, since the homologous object image is usually an infrared or color image, a parallax speckle pattern superimposed directly onto the object image without processing would appear obtrusive. In this embodiment, a dodging technique can be used to keep the overall brightness and contrast of the speckle points coordinated with the surrounding pixel points before and after the superposition. First, for the region corresponding to each speckle point, the brightness mean value I and the contrast value c (the ratio of the effective maximum brightness to the effective minimum brightness) of the pixel points in the homologous infrared image are counted.
Step 1042: based on the brightness mean value and the contrast, calculate the brightness I_i' of the i-th pixel point contained in each speckle in the object speckle pattern by the following formula (4):

where J_i is the brightness of the i-th pixel point contained in the corresponding speckle in the parallax speckle pattern, I is the brightness mean value, and c is the contrast.
Specifically, in the finally generated object speckle pattern, the brightness values of the pixel points in speckle-point regions are calculated by formula (4), while pixel points in non-speckle-point regions keep the brightness values of their homonymous pixel points in the parallax speckle pattern.
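Since formula (4) appears only as an image in the source, the sketch below substitutes an illustrative rule of the same spirit: centre the speckle brightness on the local mean I and spread it by the local dynamic range, so superimposed speckles stay coordinated with surrounding pixels. Every detail of this rule is an assumption, not the patented formula, and pixels outside speckle regions here simply keep the object-image brightness for simplicity:

```python
import numpy as np

def superimpose(obj_img, speckle, labels):
    """Sketch of step 104: for each labelled speckle region, gather the
    local mean and min/max of the object image, normalise the speckle
    brightness J_i, and re-centre it on the local statistics before
    writing it into the output image."""
    out = obj_img.astype(np.float64).copy()
    for sid in range(1, labels.max() + 1):
        region = labels == sid
        I = obj_img[region].mean()                  # local brightness mean
        lo, hi = obj_img[region].min(), obj_img[region].max()
        J = speckle[region]
        J_norm = (J - J.mean()) / (J.max() - J.min() + 1e-9)
        out[region] = I + J_norm * (hi - lo + 1.0)  # spread by local range
    return out
```

The `labels` map can come from a speckle segmentation of the parallax speckle pattern, so that each speckle is adjusted against its own surroundings.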
Compared with the related art, this embodiment counts, in the homologous object image, the brightness mean value and contrast corresponding to the pixel points contained in each speckle of the parallax speckle pattern, and calculates the brightness I_i' of the i-th pixel point contained in each speckle of the object speckle pattern based on them, thereby ensuring that the overall brightness and contrast of the speckle points before and after superposition remain coordinated with the surrounding pixel points.
Another embodiment of the invention relates to an electronic device, as shown in FIG. 4, comprising at least one processor 202; and a memory 201 communicatively coupled to the at least one processor 202; wherein the memory 201 stores instructions executable by the at least one processor 202, the instructions being executable by the at least one processor 202 to enable the at least one processor 202 to perform any of the method embodiments described above.
Where the memory 201 and the processor 202 are coupled by a bus, the bus may comprise any number of interconnected buses and bridges coupling together various circuits of the processor 202 and the memory 201. The bus may also connect various other circuits, such as peripherals, voltage regulators, and power management circuits, which are well known in the art and therefore not described further herein. A bus interface provides an interface between the bus and a transceiver. The transceiver may be one element or a plurality of elements, such as multiple receivers and transmitters, providing a means for communicating with various other apparatus over a transmission medium. Data processed by the processor 202 is transmitted over a wireless medium through an antenna, which also receives data and passes it to the processor 202.
The processor 202 is responsible for managing the bus and general processing and may also provide various functions including timing, peripheral interfaces, voltage regulation, power management, and other control functions. And memory 201 may be used to store data used by processor 202 in performing operations.
Another embodiment of the present invention relates to a computer-readable storage medium storing a computer program. The computer program realizes any of the above-described method embodiments when executed by a processor.
That is, as those skilled in the art can understand, all or part of the steps of the methods in the embodiments described above may be implemented by a program instructing related hardware. The program is stored in a storage medium and includes several instructions to enable a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It will be understood by those of ordinary skill in the art that the foregoing embodiments are specific examples for carrying out the invention, and that various changes in form and details may be made therein without departing from the spirit and scope of the invention in practice.
Claims (10)
1. A speckle image generation method, comprising:
acquiring a depth map and a homologous object image of a given structured light camera;
acquiring a disparity map corresponding to the depth map;
taking the parallax map as a mapping path, and mapping a preset reference speckle pattern to an object speckle pattern to obtain a parallax speckle pattern;
and superposing the homologous object image and the parallax speckle pattern to obtain an object speckle pattern corresponding to the depth map.
2. The method of claim 1, wherein the structured light camera is a monocular structured light camera;
the acquiring of the disparity map corresponding to the depth map includes:
calculating the disparity value d of each pixel point in the disparity map by the following formula:

d = f · l · (1/z − 1/z_r)

where f is the focal length of the monocular structured light camera, l is the baseline, z is the depth value of the pixel point in the depth map, and z_r is the distance from the plane of the reference speckle pattern to the optical center of the lens of the monocular structured light camera.
3. The method of claim 1, wherein the structured light camera is an active binocular structured light camera and the speckle projector is positioned under any of the binocular structured light cameras;
the acquiring of the disparity map corresponding to the depth map includes:
calculating the disparity value d of each pixel point in the disparity map by the following formula:

d = f · l / z

where f is the focal length of the active binocular structured light camera, l is the baseline, and z is the depth value of the pixel point in the depth map.
4. The method according to any one of claims 1 to 3, wherein the mapping a preset reference speckle pattern to an object speckle pattern by using the disparity map as a mapping path to obtain the disparity speckle pattern comprises:
determining homonymous pixel points in the parallax speckle pattern and the reference speckle pattern based on the parallax pattern;
determining homonymous scattered spots based on the homonymous pixel points;
and determining the brightness distribution of the same-name scattered spots in the parallax speckle pattern according to the brightness distribution of each scattered spot in the reference speckle pattern.
5. The method of claim 4, wherein the determining, based on the disparity map, pixels of the disparity speckle map that are of the same name as pixels in the reference speckle map comprises:
determining, by the following formula, that a pixel point (x', y') in the parallax speckle pattern and a pixel point (x, y) in the reference speckle pattern are homonymous pixel points:

x' = x + d(x, y), y' = y

where d(x, y) is the disparity value corresponding to the pixel point (x, y).
6. The method of claim 4, wherein said determining a homonymous blob based on said homonymous pixel points comprises:
determining pixel points included by each speckle point in the reference speckle pattern;
and taking pixel points in an area surrounded by homonymous pixel points corresponding to edge pixel points included in the speckle points in the parallax speckle pattern as pixel points included in the corresponding homonymous speckle.
7. The method of claim 4, wherein the determining the brightness distribution of the same-name speckle points in the parallax speckle pattern according to the brightness distribution of each speckle point in the reference speckle pattern comprises:
for each speckle point in the reference speckle pattern, counting the total brightness value I_all of the pixel points included in the speckle point, and determining therefrom the brightness value I_o of the central pixel point of the corresponding same-name speckle point in the parallax speckle pattern and the brightness of the pixel points at radius r from the central pixel point, wherein r is a number of pixels.
8. The method of claim 4, wherein the superimposing the homologous object image with the parallax speckle pattern to obtain an object speckle pattern corresponding to the depth map comprises:
calculating, in the homologous object image, the brightness mean value and the contrast of the pixel points included in each speckle point of the parallax speckle pattern; and
calculating, based on the brightness mean value and the contrast, the brightness I_i′ of the i-th pixel point included in each speckle point of the object speckle pattern by the following formula:

wherein J_i is the brightness of the i-th pixel point included in the corresponding speckle point of the parallax speckle pattern, I is the brightness mean value, and c is the contrast.
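Claim 8's formula survives only as an image in the source, so the sketch below uses one plausible combination, I_i′ = I + c · J_i, and the standard deviation as the contrast measure — both are assumptions, not the patent's stated formula:

```python
import numpy as np

def superimpose_speckle(object_img, parallax_speckle, speckle_mask):
    """Hedged sketch of claim 8: modulate each parallax-speckle pixel J_i
    by the brightness mean I and contrast c that the homologous object
    image exhibits over that speckle's pixels.

    object_img: 2-D brightness array of the homologous object image.
    parallax_speckle: 2-D brightness array J of the parallax speckle pattern.
    speckle_mask: integer label map; 0 = background, k > 0 = speckle k.
    """
    out = np.array(parallax_speckle, dtype=np.float64, copy=True)
    for label in np.unique(speckle_mask):
        if label == 0:                       # background, not a speckle
            continue
        pix = speckle_mask == label
        mean_i = object_img[pix].mean()      # brightness mean I
        contrast = object_img[pix].std()     # contrast c (assumed measure)
        out[pix] = mean_i + contrast * parallax_speckle[pix]  # I_i' = I + c*J_i
    return out
```

On a uniformly bright object region (contrast zero) every speckle pixel collapses to the region's mean brightness, which matches the intuition that the object image's local statistics reshape the speckle intensities.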
9. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor;
wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the speckle image generation method of any one of claims 1 to 8.
10. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the speckle image generation method according to any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111224652.XA CN113936050B (en) | 2021-10-21 | 2021-10-21 | Speckle image generation method, electronic device, and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111224652.XA CN113936050B (en) | 2021-10-21 | 2021-10-21 | Speckle image generation method, electronic device, and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113936050A true CN113936050A (en) | 2022-01-14 |
CN113936050B CN113936050B (en) | 2022-08-12 |
Family
ID=79281061
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111224652.XA Active CN113936050B (en) | 2021-10-21 | 2021-10-21 | Speckle image generation method, electronic device, and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113936050B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114255233A (en) * | 2022-03-01 | 2022-03-29 | 合肥的卢深视科技有限公司 | Speckle pattern quality evaluation method and device, electronic device and storage medium |
WO2023185375A1 (en) * | 2022-03-30 | 2023-10-05 | 杭州萤石软件有限公司 | Depth map generation system and method, and autonomous mobile device |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017138210A1 (en) * | 2016-02-12 | 2017-08-17 | Sony Corporation | Image pickup apparatus, image pickup method, and image pickup system |
WO2018133027A1 (en) * | 2017-01-20 | 2018-07-26 | Shenzhen University | Grayscale constraint-based method and apparatus for integer-pixel search for three-dimensional digital speckle pattern |
WO2019153626A1 (en) * | 2018-02-07 | 2019-08-15 | Shenzhen Orbbec Co., Ltd. | Depth image engine and depth image calculation method |
EP3561724A1 (en) * | 2018-04-28 | 2019-10-30 | Guangdong Oppo Mobile Telecommunications Corp., Ltd | Image processing method and device, computer-readable storage medium and electronic device |
WO2019205889A1 (en) * | 2018-04-28 | 2019-10-31 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image processing method, apparatus, computer-readable storage medium, and electronic device |
WO2019205890A1 (en) * | 2018-04-28 | 2019-10-31 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image processing method, apparatus, computer-readable storage medium, and electronic device |
WO2019205887A1 (en) * | 2018-04-28 | 2019-10-31 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Method and apparatus for controlling photographing, electronic device, and computer readable storage medium |
US20200126246A1 (en) * | 2018-10-19 | 2020-04-23 | Samsung Electronics Co., Ltd. | Method and apparatus for active depth sensing and calibration method thereof |
CN111145342A (en) * | 2019-12-27 | 2020-05-12 | 山东中科先进技术研究院有限公司 | Binocular speckle structured light three-dimensional reconstruction method and system |
US20200154033A1 (en) * | 2018-04-28 | 2020-05-14 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Method and Apparatus for Processing Data, and Computer Readable Storage Medium |
US20200151425A1 (en) * | 2018-04-12 | 2020-05-14 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image Processing Method, Image Processing Device, Computer Readable Storage Medium and Electronic Device |
WO2020206666A1 (en) * | 2019-04-12 | 2020-10-15 | Shenzhen Goodix Technology Co., Ltd. | Depth estimation method and apparatus employing speckle image and face recognition system |
CN112927280A (en) * | 2021-03-11 | 2021-06-08 | 北京的卢深视科技有限公司 | Method and device for acquiring depth image and monocular speckle structured light system |
Patent Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017138210A1 (en) * | 2016-02-12 | 2017-08-17 | Sony Corporation | Image pickup apparatus, image pickup method, and image pickup system |
WO2018133027A1 (en) * | 2017-01-20 | 2018-07-26 | Shenzhen University | Grayscale constraint-based method and apparatus for integer-pixel search for three-dimensional digital speckle pattern |
WO2019153626A1 (en) * | 2018-02-07 | 2019-08-15 | Shenzhen Orbbec Co., Ltd. | Depth image engine and depth image calculation method |
US20200151425A1 (en) * | 2018-04-12 | 2020-05-14 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image Processing Method, Image Processing Device, Computer Readable Storage Medium and Electronic Device |
WO2019205887A1 (en) * | 2018-04-28 | 2019-10-31 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Method and apparatus for controlling photographing, electronic device, and computer readable storage medium |
WO2019205890A1 (en) * | 2018-04-28 | 2019-10-31 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image processing method, apparatus, computer-readable storage medium, and electronic device |
WO2019205889A1 (en) * | 2018-04-28 | 2019-10-31 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image processing method, apparatus, computer-readable storage medium, and electronic device |
US20200154033A1 (en) * | 2018-04-28 | 2020-05-14 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Method and Apparatus for Processing Data, and Computer Readable Storage Medium |
EP3561724A1 (en) * | 2018-04-28 | 2019-10-30 | Guangdong Oppo Mobile Telecommunications Corp., Ltd | Image processing method and device, computer-readable storage medium and electronic device |
US20200126246A1 (en) * | 2018-10-19 | 2020-04-23 | Samsung Electronics Co., Ltd. | Method and apparatus for active depth sensing and calibration method thereof |
WO2020206666A1 (en) * | 2019-04-12 | 2020-10-15 | Shenzhen Goodix Technology Co., Ltd. | Depth estimation method and apparatus employing speckle image and face recognition system |
CN112771573A (en) * | 2019-04-12 | 2021-05-07 | 深圳市汇顶科技股份有限公司 | Depth estimation method and device based on speckle images and face recognition system |
CN111145342A (en) * | 2019-12-27 | 2020-05-12 | 山东中科先进技术研究院有限公司 | Binocular speckle structured light three-dimensional reconstruction method and system |
CN112927280A (en) * | 2021-03-11 | 2021-06-08 | 北京的卢深视科技有限公司 | Method and device for acquiring depth image and monocular speckle structured light system |
Non-Patent Citations (2)
Title |
---|
YAO YAO et al.: "Non-invasive depth-resolved imaging through scattering layers via speckle correlations and parallax", Applied Physics Letters * |
XU Renchao et al.: "Digital speckle spatio-temporal correlation three-dimensional surface shape measurement based on binocular vision", Laser Journal * |
Also Published As
Publication number | Publication date |
---|---|
CN113936050B (en) | 2022-08-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10771689B2 (en) | Image processing method and device, computer-readable storage medium and electronic device | |
CN113936050B (en) | Speckle image generation method, electronic device, and storage medium | |
CN103795998B (en) | Image processing method and image processing equipment | |
CN112634374B (en) | Stereoscopic calibration method, device and system for binocular camera and binocular camera | |
WO2017076106A1 (en) | Method and device for image splicing | |
CN113034568B (en) | Machine vision depth estimation method, device and system | |
EP2568253B1 (en) | Structured-light measuring method and system | |
CN106815869B (en) | Optical center determining method and device of fisheye camera | |
JP5633058B1 (en) | 3D measuring apparatus and 3D measuring method | |
CN207766424U (en) | A kind of filming apparatus and imaging device | |
CN110567441B (en) | Particle filter-based positioning method, positioning device, mapping and positioning method | |
US20140056508A1 (en) | Apparatus and method for image matching between multiview cameras | |
CN104424640A (en) | Method and device for carrying out blurring processing on images | |
JP7378219B2 (en) | Imaging device, image processing device, control method, and program | |
CN107991665A (en) | It is a kind of based on fixed-focus camera to target three-dimensional coordinate method for continuous measuring | |
WO2021104308A1 (en) | Panoramic depth measurement method, four-eye fisheye camera, and binocular fisheye camera | |
JP2017021759A (en) | Image processor, image processing method and program | |
CN108881717A (en) | A kind of Depth Imaging method and system | |
CN108924408A (en) | A kind of Depth Imaging method and system | |
CN110602474A (en) | Method, device and equipment for determining image parallax | |
WO2023142352A1 (en) | Depth image acquisition method and device, terminal, imaging system and medium | |
JP2019091122A (en) | Depth map filter processing device, depth map filter processing method and program | |
CN108074250A (en) | Matching power flow computational methods and device | |
US20210215827A1 (en) | Using time-of-flight techniques for stereoscopic image processing | |
CN113808185B (en) | Image depth recovery method, electronic device and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
TA01 | Transfer of patent application right | ||
Effective date of registration: 2022-05-16 |
Address after: Room 611-217, R&D Center Building, China (Hefei) International Intelligent Voice Industrial Park, 3333 Xiyou Road, High-tech Zone, Hefei, Anhui Province, 230091 |
Applicant after: Hefei lushenshi Technology Co., Ltd. |
Address before: Room 3032, North B, Bungalow, Building 2, A5 Xueyuan Road, Haidian District, Beijing, 100083 |
Applicant before: BEIJING DILUSENSE TECHNOLOGY CO., LTD.; Hefei lushenshi Technology Co., Ltd. |
GR01 | Patent grant | ||