CN117395485A - Integrated polarized light field depth perception imaging device and method adopting same

Integrated polarized light field depth perception imaging device and method adopting same

Info

Publication number
CN117395485A
CN117395485A
Authority
CN
China
Prior art keywords
view
light field
polarized light
image
target scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311129533.5A
Other languages
Chinese (zh)
Inventor
李宁
张希雅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Zhuoxi Brain And Intelligence Research Institute
Original Assignee
Hangzhou Zhuoxi Brain And Intelligence Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Zhuoxi Brain And Intelligence Research Institute filed Critical Hangzhou Zhuoxi Brain And Intelligence Research Institute
Priority to CN202311129533.5A priority Critical patent/CN117395485A/en
Publication of CN117395485A publication Critical patent/CN117395485A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/10Beam splitting or combining systems
    • G02B27/12Beam splitting or combining systems operating by refraction only
    • G02B27/123The splitting element being a lens or a system of lenses, including arrays and surfaces with refractive power
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/28Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for polarising
    • G02B27/286Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for polarising for controlling or changing the state of polarisation, e.g. transforming one polarisation state into another
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformation in the plane of the image
    • G06T3/40Scaling the whole image or part thereof
    • G06T3/4007Interpolation-based scaling, e.g. bilinear interpolation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformation in the plane of the image
    • G06T3/40Scaling the whole image or part thereof
    • G06T3/4015Demosaicing, e.g. colour filter array [CFA], Bayer pattern
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformation in the plane of the image
    • G06T3/40Scaling the whole image or part thereof
    • G06T3/4046Scaling the whole image or part thereof using neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformation in the plane of the image
    • G06T3/40Scaling the whole image or part thereof
    • G06T3/4053Super resolution, i.e. output image resolution higher than sensor resolution
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/557Depth or shape recovery from multiple images from light fields, e.g. from plenoptic cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/207Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10052Images from lightfield camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Studio Devices (AREA)

Abstract

The application relates to an integrated polarized light field depth perception imaging device and a method using the device. The device comprises: a main lens for condensing and imaging; a microlens array for acquiring multi-view light field information of a target scene; a micro-polarizer array for applying linear polarization modulation to incident light in different directions; a CMOS image sensor for collecting light intensity information; and an image readout circuit for converting the light intensity information into digital image signals. With this device, multi-view polarized light field images of the target scene under scattering-medium conditions can be collected and preprocessed, images of different polarization directions at different viewing angles can be reconstructed, clear multi-view images meeting a preset resolution can be obtained, and the parallax of the target scene can be estimated from the clear multi-view images to yield a depth map of the target scene. This addresses the difficulty of achieving effective depth measurement under scattering-medium conditions and improves the robustness of depth perception against scattering-medium interference.

Description

Integrated polarized light field depth perception imaging device and method adopting same
Technical Field
The application relates to the technical field of depth perception, in particular to an integrated polarized light field depth perception imaging device and a method adopting the same.
Background
Depth perception technology plays an important role in fields such as autonomous driving, industrial inspection, intelligent robotics, and microscopic imaging.
In the related art, mainstream depth perception techniques generally fall into two categories: active depth perception and passive depth perception. Active techniques mainly include direct time-of-flight (d-ToF, e.g. lidar), indirect time-of-flight (i-ToF), millimeter-wave radar, and structured-light imaging; passive techniques are mainly binocular stereo vision.
Active depth perception achieves higher accuracy thanks to active illumination, but its more complex structure makes it relatively expensive; passive depth perception offers higher spatial resolution, lower cost, and easier deployment, but its depth measurement accuracy is worse than that of active techniques.
A common source of interference in depth perception applications is scattering media, such as fog, haze, and smoke, as well as tissue in microscopic imaging, and such media severely disturb depth perception. For example, lidar and structured light mainly use near-infrared active illumination, which is heavily scattered by media such as haze, severely degrading depth accuracy; millimeter-wave radar penetrates scattering media better thanks to its longer wavelength, but conventional millimeter-wave radar has extremely low spatial resolution, which limits its applications; and binocular stereo vision relies on two visible-light cameras, which struggle to capture clear images through scattering media, making stereo matching difficult.
Therefore, existing depth perception techniques cannot achieve high-resolution depth perception under scattering-medium conditions, which limits development in related fields and needs to be addressed.
Disclosure of Invention
The application provides an integrated polarized light field depth perception imaging device and an anti-scattering-interference depth perception method, which address the difficulty of achieving stable and effective depth measurement under scattering media such as haze, a difficulty that has limited development in related fields, and greatly improve the robustness of depth perception against scattering-medium interference.
An embodiment of a first aspect of the present application provides an integrated polarized light field depth perception imaging device, including:
a main lens for condensing and imaging the target scene; a microlens array for acquiring multi-view light field information of the target scene with a plurality of microlenses, based on the condensed imaging result; a micro-polarizer array for applying linear polarization modulation to incident light in different directions to obtain multi-angle polarization information of the target scene; a CMOS (Complementary Metal Oxide Semiconductor) image sensor for collecting light intensity information passing through the microlens array and the micro-polarizer array, the light intensity information including the multi-view light field information and the multi-angle polarization information; and an image readout circuit for converting the light intensity information into digital image signals to obtain a multi-view polarized light field image of the target scene.
Optionally, in some embodiments, the F-number of the main lens is the same as that of the microlenses in the microlens array.
Optionally, in some embodiments, each microlens corresponds to a plurality of image pixels of the CMOS image sensor.
Optionally, in some embodiments, the micro-polarizer array includes micro-polarizers periodically arranged in four directions, namely 0-degree, 45-degree, 90-degree and 135-degree micro-polarizers.
An embodiment of a second aspect of the present application provides an anti-scattering interference depth perception method, which adopts the integrated polarized light field depth perception imaging device as described above, wherein the method includes the following steps:
acquiring a multi-view polarized light field image of a target scene under scattering-medium conditions with the integrated polarized light field depth perception imaging device; preprocessing the multi-view polarized light field image and reconstructing images of different polarization directions at different viewing angles; processing the images of different polarization directions at different viewing angles to obtain clear multi-view images that meet a preset resolution condition; and estimating the parallax of the target scene from the clear multi-view images to obtain a depth map of the target scene.
Optionally, in some embodiments, preprocessing the multi-view polarized light field image and reconstructing images of different polarization directions at different viewing angles includes: performing polarization demosaicing reconstruction on the polarized light field image based on a preset polarization demosaicing algorithm to obtain a first reconstructed image; and performing light field multi-view high-resolution reconstruction on the first reconstructed image based on a preset light field multi-view reconstruction algorithm to obtain the images of different polarization directions at different viewing angles.
Optionally, in some embodiments, the preset polarization demosaicing algorithm includes an interpolation algorithm or a deep learning algorithm; the preset light field multi-view reconstruction algorithm comprises a deconvolution reconstruction algorithm or a deep learning algorithm.
Optionally, in some embodiments, processing the images of different polarization directions at different viewing angles to obtain clear multi-view images that meet a preset resolution condition includes: processing the images of different polarization directions at different viewing angles according to a preset polarization defogging algorithm based on a polarization descattering mechanism, to obtain clear multi-view images that meet the preset resolution condition.
Optionally, in some embodiments, the predetermined polarization defogging algorithm comprises at least one of a difference method, a Stokes method, and a deep learning method.
Optionally, in some embodiments, estimating the parallax of the target scene from the clear multi-view images to obtain the depth map of the target scene includes: estimating a disparity map of the target scene from the clear multi-view images based on a preset multi-view disparity estimation algorithm to obtain the depth map of the target scene, wherein the preset multi-view disparity estimation algorithm comprises a classical matching algorithm or a deep learning algorithm.
In summary, the application builds an integrated polarized light field depth perception imaging device from a main lens, a microlens array, a micro-polarizer array, a CMOS image sensor and an image readout circuit. With this device, multi-view polarized light field images of the target scene are collected under scattering-medium conditions, the multi-view polarized light field images are preprocessed and reconstructed into images of different polarization directions at different viewing angles, these images are processed into clear multi-view images that meet a preset resolution condition, and finally the parallax of the target scene is estimated from the clear multi-view images to obtain a depth map of the target scene. This addresses the difficulty of achieving stable and effective depth measurement under scattering media such as haze, which has limited development in related fields, and greatly improves the robustness of depth perception against scattering-medium interference.
Additional aspects and advantages of the application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the application.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a schematic diagram of an integrated polarized light field depth perception imaging device provided according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a polarized light field imaging region corresponding to a single microlens in an integrated polarized light field depth perception imaging device according to one embodiment of the present application;
FIG. 3 is a flowchart of an anti-interference depth perception method according to an embodiment of the present application;
FIG. 4 is a flow chart for polarized light field image preprocessing according to one specific embodiment of the present application;
fig. 5 is a flowchart of an anti-interference depth perception method according to one embodiment of the present application.
Detailed Description
Embodiments of the present application are described in detail below, examples of which are illustrated in the accompanying drawings, wherein the same or similar reference numerals refer to the same or similar elements or elements having the same or similar functions throughout. The embodiments described below by referring to the drawings are exemplary and intended for the purpose of explaining the present application and are not to be construed as limiting the present application.
The following describes an integrated polarized light field depth perception imaging device and an anti-scattering-interference depth perception method according to embodiments of the present application with reference to the accompanying drawings. To address the problem mentioned in the background that depth perception techniques struggle to achieve stable and effective depth measurement under scattering media such as haze, which limits development in related fields, the application provides an integrated polarized light field depth perception imaging device in which: a main lens condenses and images a target scene; a microlens array acquires multi-view light field information of the target scene with a plurality of microlenses, based on the condensed imaging result; a micro-polarizer array applies linear polarization modulation to incident light in different directions to obtain multi-angle polarization information of the target scene; a CMOS image sensor collects light intensity information passing through the microlens array and the micro-polarizer array, the light intensity information including the multi-view light field information and the multi-angle polarization information; and an image readout circuit converts the light intensity information into digital image signals to obtain a multi-view polarized light field image of the target scene. This addresses the difficulty of achieving stable and effective depth measurement under scattering media such as haze and greatly improves the robustness of depth perception against scattering-medium interference.
Specifically, fig. 1 is a schematic diagram of an integrated polarized light field depth perception imaging device according to an embodiment of the present application.
As shown in fig. 1, the integrated polarized light field depth perception imaging device 10 includes: a main lens 1, a microlens array 2, a micro-polarizer array 3, a CMOS image sensor 4, and an image readout circuit 5.
Specifically: the main lens 1 condenses and images the target scene; the microlens array 2 acquires multi-view light field information of the target scene with a plurality of microlenses, based on the condensed imaging result; the micro-polarizer array 3 applies linear polarization modulation to incident light in different directions to obtain multi-angle polarization information of the target scene; the CMOS image sensor 4 collects light intensity information passing through the microlens array and the micro-polarizer array, the light intensity information including the multi-view light field information and the multi-angle polarization information; and the image readout circuit 5 converts the light intensity information into digital image signals to obtain a multi-view polarized light field image of the target scene.
Alternatively, in some embodiments, as shown in fig. 1, the primary lens 1 has the same F-number as the microlenses in the microlens array 2.
It will be appreciated that the F-number of the main lens 1 and that of the microlenses in the microlens array 2 should be kept consistent to achieve normal light field imaging.
Alternatively, in some embodiments, as shown in fig. 1, each microlens corresponds to a plurality of image pixels of the CMOS image sensor 4.
The microlens array 2 includes a plurality of microlenses, whose parameters may be set according to practical application requirements and are not specifically limited here. In this embodiment of the present application, a single microlens corresponds to a plurality of image pixels of the CMOS image sensor 4; the number of corresponding pixels is not specifically limited and can be set by those skilled in the art according to actual requirements.
In addition, as shown in fig. 1, the micro polarizer array 3 of the embodiment of the present application is pixel-level, that is, each pixel has a micro polarizer corresponding to it in a certain direction.
Alternatively, in some embodiments, the micro-polarizer array includes micro-polarizers periodically arranged in four directions, namely 0-degree, 45-degree, 90-degree and 135-degree micro-polarizers.
In this way, the linear polarization modulation that the micro-polarizer array 3 applies to incident light in different directions can be used to measure the polarization state of the incident light, and the light intensity information passing through the microlens array and the micro-polarizer array is acquired by the CMOS image sensor 4, thereby realizing polarized light field imaging.
For example, fig. 2 is a schematic diagram of the polarized light field imaging area corresponding to a single microlens in an integrated polarized light field depth perception imaging device according to an embodiment of the present application. As shown in fig. 2, a single microlens 6 corresponds to an 8×8 pixel area of the CMOS image sensor, which contains 0-degree micro-polarizers with corresponding CMOS imaging units 7, 45-degree micro-polarizers with corresponding CMOS imaging units 8, 90-degree micro-polarizers with corresponding CMOS imaging units 9, and 135-degree micro-polarizers with corresponding CMOS imaging units 10. With these periodically arranged micro-polarizers, the 0-degree, 45-degree, 90-degree and 135-degree polarization information of the target scene can be acquired.
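To illustrate how measurements in these four polarization directions are commonly combined (a generic sketch of the standard Stokes relations, not a limitation of the device; the NumPy helper and its name are illustrative assumptions):

```python
import numpy as np

def stokes_from_four_angles(i0, i45, i90, i135):
    """Per-pixel linear Stokes parameters from intensity images captured behind
    0/45/90/135-degree micro-polarizers (standard relations; shapes and the
    small-value guard are illustrative assumptions)."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)      # total intensity
    s1 = i0 - i90                           # 0-degree vs 90-degree component
    s2 = i45 - i135                         # 45-degree vs 135-degree component
    dolp = np.sqrt(s1 ** 2 + s2 ** 2) / np.maximum(s0, 1e-8)  # degree of linear polarization
    aolp = 0.5 * np.arctan2(s2, s1)         # angle of linear polarization (radians)
    return s0, s1, s2, dolp, aolp
```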
In addition, it should be noted that, as shown in fig. 1 and fig. 2, the micro-polarizer array 3 in the embodiment of the present application should be precisely aligned with the pixel array of the CMOS image sensor 4 to avoid crosstalk between pixels. The microlens array 2 should remain aligned with the underlying micro-polarizer array 3 and CMOS image sensor 4 and be kept at an appropriate distance from them; the specific distance is set according to the microlens parameters to ensure high-quality polarized light field imaging.
Polarized light field imaging could instead be implemented with an array of polarization cameras, but this requires a large number of cameras and is costly, bulky, and difficult to apply in many fields. Alternatively, polarized light field imaging can be realized by scanning a single polarization camera over a two-dimensional grid of positions and acquiring an image at each scanning position to simulate a multi-camera array; although only a single polarization camera is needed, this approach is applicable only to static scenes and cannot handle dynamic scenes, which limits its application.
The integrated polarized light field depth perception imaging device provided by the embodiment of the application can obtain multi-view polarized light field information of a target scene, apply linear polarization modulation to incident light in different directions to obtain multi-angle polarization information of the target scene, and finally convert the collected light intensity information into digital image signals to obtain multi-view polarized light field images of the target scene. The device adopts a highly integrated design and can achieve depth perception that resists scattering-medium interference with a single camera; compared with the camera-array and scanning approaches to polarized light field imaging described above, its cost is lower and its range of applications wider, and compared with the active sensing and binocular/multi-camera techniques discussed in the background, its level of integration is higher.
Next, an anti-scattering interference depth sensing method according to an embodiment of the present application, which employs the integrated polarized light field depth sensing imaging device 10 of the above embodiment, will be described with reference to the accompanying drawings.
Specifically, fig. 3 is a flowchart of an anti-interference depth sensing method according to an embodiment of the present application, and as shown in fig. 3, the anti-interference depth sensing method includes the following steps:
in step S301, a multi-view polarized light field image of a target scene under a scattering medium condition is acquired using the integrated polarized light field depth perception imaging device 10.
The scattering medium of the embodiment of the application can be mist, haze, smoke, biological tissues and the like.
Specifically, as shown in fig. 1, the main lens 1 of the integrated polarized light field depth perception imaging device 10 condenses and images the target scene under scattering-medium conditions; the microlenses acquire the multi-view light field information of the target scene; the micro-polarizer array 3 applies linear polarization modulation to incident light in different directions to obtain the multi-angle polarization information of the target scene; the CMOS image sensor 4 collects the multi-view light field information and multi-angle polarization information passing through the microlens array and the micro-polarizer array; and finally the image readout circuit 5 converts the light intensity information into digital image signals to obtain the multi-view polarized light field image of the target scene.
In step S302, the multi-view polarized light field image is preprocessed, and images with different polarization directions under different view angles are reconstructed.
It can be understood that, by preprocessing the acquired multi-view polarized light field image of the target scene and reconstructing it, high-resolution but still hazy (blurred) images of different polarization directions at different viewing angles are obtained.
Optionally, in some embodiments, preprocessing the multi-view polarized light field image, and reconstructing to obtain images with different polarization directions under different view angles includes: performing polarization demosaicing reconstruction on the polarized light field image based on a preset polarization demosaicing algorithm to obtain a first reconstructed image; and carrying out light field multi-view high-resolution reconstruction on the first reconstructed image based on a preset light field multi-view reconstruction algorithm to obtain images with different polarization directions under different view angles.
Optionally, in some embodiments, the preset polarization demosaicing algorithm includes an interpolation algorithm or a deep learning algorithm; the preset light field multi-view reconstruction algorithm comprises a deconvolution reconstruction algorithm or a deep learning algorithm.
Specifically, the preprocessing of the multi-view polarized light field image in the embodiment of the present application includes polarization demosaicing reconstruction and light field multi-view high-resolution reconstruction, where the polarization demosaicing algorithm may be a classical interpolation algorithm or a deep learning algorithm, and the light field multi-view reconstruction algorithm may be a deconvolution reconstruction algorithm, a deep learning algorithm, or the like, which is not limited here.
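For the interpolation option, a minimal sketch of bilinear polarization demosaicing is given below, assuming a 2×2 super-pixel layout; the `pattern` argument and the NumPy/SciPy helper are illustrative assumptions rather than the specific algorithm of this application:

```python
import numpy as np
from scipy.ndimage import convolve

def demosaic_polarization_bilinear(mosaic, pattern=((0, 45), (135, 90))):
    """Bilinear demosaicing of a 2x2 polarization filter array mosaic.
    `pattern` gives the polarizer angles of the top-left 2x2 super-pixel and is
    an assumption; the actual layout depends on the sensor."""
    kernel = np.array([[0.25, 0.5, 0.25],
                       [0.5,  1.0, 0.5],
                       [0.25, 0.5, 0.25]])
    channels = {}
    for dy in range(2):
        for dx in range(2):
            angle = pattern[dy][dx]
            sparse = np.zeros_like(mosaic, dtype=float)
            sparse[dy::2, dx::2] = mosaic[dy::2, dx::2]
            # Interpolate the missing three quarters of samples for this angle.
            channels[angle] = convolve(sparse, kernel, mode='mirror')
    return channels  # dict: polarizer angle -> full-resolution image
```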
In some embodiments, fig. 4 is a flowchart for polarized light field image preprocessing according to one specific embodiment of the present application, and as shown in fig. 4, the polarized light field image preprocessing flow according to the embodiment of the present application includes the following steps:
step S401, an original polarized light field image is obtained;
step S402, performing polarization demosaicing reconstruction on the polarized light field image by using a polarization demosaicing algorithm, such as an interpolation algorithm or a deep learning algorithm, to obtain a first reconstructed image;
step S403, performing light field multi-view high resolution reconstruction on the first reconstructed image by using a light field multi-view reconstruction algorithm, such as a deconvolution reconstruction algorithm or a deep learning algorithm;
in step S404, images with different polarization directions under different viewing angles are generated.
In this way, the reconstructed polarized light field images are obtained through preprocessing, and the images of different polarization directions at different viewing angles can then be further processed so that they meet the required resolution.
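For the light field multi-view reconstruction step, the simplest form of view extraction just rearranges lenslet pixels into sub-aperture views. The sketch below assumes an ideal, sensor-aligned lenslet grid (e.g. the 8×8 blocks of fig. 2) and a single-channel image, and omits the lenslet-center calibration and super-resolution that a practical implementation would need:

```python
import numpy as np

def extract_subaperture_views(lenslet_img, lenslet_size=8):
    """Rearrange a lenslet (plenoptic) image into sub-aperture views.
    Assumes each microlens covers a lenslet_size x lenslet_size pixel block
    aligned to the sensor grid; lenslet_img is a 2D array."""
    h, w = lenslet_img.shape
    ny, nx = h // lenslet_size, w // lenslet_size
    crop = lenslet_img[:ny * lenslet_size, :nx * lenslet_size]
    # Split each axis into (lenslet index, position under the lenslet).
    lf = crop.reshape(ny, lenslet_size, nx, lenslet_size)
    # views[u, v] is the (ny x nx) image seen from angular position (u, v).
    views = lf.transpose(1, 3, 0, 2)
    return views
```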
In step S303, images with different polarization directions under different viewing angles are processed to obtain a multi-viewing angle clear image meeting a preset resolution condition.
Optionally, in some embodiments, processing the images with different polarization directions at different viewing angles to obtain a multi-view sharp image meeting a preset resolution condition includes: based on a polarization descattering mechanism, images in different polarization directions under different view angles are processed according to a preset polarization defogging algorithm, so that a multi-view clear image meeting a preset resolution condition is obtained.
Optionally, in some embodiments, the predetermined polarization defogging algorithm comprises at least one of a difference method, a stokes method, and a deep learning method.
It should be noted that the present application does not specifically limit the polarization defogging algorithm; those skilled in the art may select one according to the actual situation.
Therefore, high-resolution clear images at different viewing angles can be obtained by processing the images of different polarization directions at different viewing angles obtained in step S302 according to a preset polarization defogging algorithm based on a polarization descattering mechanism.
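As one example from the difference-method family (a simplified Schechner-style sketch, not the specific algorithm of this application), the veiling airlight can be estimated from the polarization images with the maximum and minimum haze signal and then removed; the parameters p_airlight and a_inf are assumed to be estimated beforehand, for example from a sky or background region:

```python
import numpy as np

def polarization_dehaze(i_max, i_min, p_airlight, a_inf):
    """Simplified polarization-difference dehazing.
    i_max / i_min: images through the polarizer orientations that maximize /
    minimize the haze signal; p_airlight and a_inf (degree of polarization and
    intensity of the airlight at infinity) are assumed inputs."""
    total = i_max + i_min                                  # total intensity
    airlight = (i_max - i_min) / max(p_airlight, 1e-6)     # estimated veiling light
    transmission = np.clip(1.0 - airlight / a_inf, 0.1, 1.0)
    radiance = (total - airlight) / transmission           # scene radiance estimate
    return radiance
```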
In step S304, the parallax of the target scene is estimated according to the multi-view sharp image, and the depth map of the target scene is obtained.
Optionally, in some embodiments, estimating a disparity map of the target scene according to the multi-view sharp image, to obtain a depth map of the target scene includes: and estimating a disparity map of the target scene according to the multi-view clear image based on a preset multi-view disparity estimation algorithm to obtain a depth map of the target scene, wherein the preset multi-view disparity estimation algorithm comprises a classical matching algorithm or a deep learning algorithm.
In addition, the selection of the multi-view disparity estimation algorithm is not particularly limited, and may be set by those skilled in the art according to requirements.
It can be understood that, according to the preset multi-view parallax estimation algorithm, the parallax map of the scene can be estimated by using the clear multi-view high-resolution image obtained in step S303, so as to finally obtain the depth map of the target scene.
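As one example of the classical matching option (a minimal SAD block-matching sketch over a pair of the reconstructed views, not the specific algorithm of this application), disparity can be estimated as follows, assuming rectified views with purely horizontal parallax:

```python
import numpy as np

def block_matching_disparity(left, right, max_disp=32, block=7):
    """Sum-of-absolute-differences block matching between two grayscale views."""
    h, w = left.shape
    half = block // 2
    disparity = np.zeros((h, w), dtype=np.float32)
    pad_l = np.pad(left, half, mode='edge')
    pad_r = np.pad(right, half, mode='edge')
    for y in range(h):
        for x in range(w):
            patch = pad_l[y:y + block, x:x + block]
            best_cost, best_d = np.inf, 0
            for d in range(min(max_disp, x) + 1):
                cand = pad_r[y:y + block, x - d:x - d + block]
                cost = np.abs(patch - cand).sum()   # SAD matching cost
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disparity[y, x] = best_d
    return disparity
```

Depth then follows from disparity in the usual way, e.g. depth = baseline × focal length / disparity, using the calibrated baseline between the chosen views.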
In order for those skilled in the art to further understand the anti-interference depth perception method of the present application, the following examples are set forth to further illustrate the specific steps of the method:
specifically, fig. 5 is a flowchart of an anti-interference depth sensing method according to an embodiment of the present application, and as shown in fig. 5, the method includes the following steps:
step S501, an integrated polarized light field depth perception imaging device is constructed according to the process of the embodiment;
step S502, collecting polarized light field images of a target scene under scattering-medium conditions (fog, haze, biological tissue, etc.) with the integrated polarized light field depth perception imaging device, and then reconstructing high-resolution blurred images of different polarization directions at different viewing angles through the preset image preprocessing operations;
step S503, based on a polarization descattering mechanism, processing the images with different polarization directions under different viewing angles obtained by processing in step S502 according to a preset polarization defogging algorithm to obtain high-resolution clear images under different viewing angles;
step S504, estimating a parallax map of the scene by using the clear multi-view high-resolution image obtained in step S503 according to a preset multi-view parallax estimation algorithm, thereby further obtaining a depth map of the target scene.
Therefore, the integrated polarized light field depth perception imaging device and the anti-scattering-interference depth perception method acquire multi-view polarization information of the scene in real time, reconstruct clear images of the scene from the multi-view polarized images, and then estimate scene parallax from the clear multi-view images, thereby achieving depth perception that resists scattering-medium interference and solving the problem in the related art that high-resolution, high-precision depth perception is difficult to achieve under scattering-medium conditions.
In addition, the integrated polarized light field depth perception imaging device and the anti-scattering-interference depth perception method have a wide range of applications: they can be used for anti-haze depth perception in macroscopic scenes (such as autonomous driving) and for microscopic depth perception that resists scattering by biological tissue in microscopic scenes.
According to the anti-scattering-interference depth perception method provided by the embodiment of the application, the integrated polarized light field depth perception imaging device collects multi-view polarized light field images of a target scene under scattering-medium conditions; the multi-view polarized light field images are preprocessed and reconstructed into images of different polarization directions at different viewing angles; these images are processed into clear multi-view images that meet a preset resolution condition; and the parallax of the target scene is estimated from the clear multi-view images to obtain a depth map of the target scene. This addresses the difficulty of achieving stable and effective depth measurement under scattering media such as haze, which has limited development in related fields, and greatly improves the robustness of depth perception against scattering-medium interference.
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms are not necessarily directed to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or N embodiments or examples. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. In the description of the present application, the meaning of "N" is at least two, such as two, three, etc., unless explicitly defined otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and additional implementations are included within the scope of the preferred embodiment of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order from that shown or discussed, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the embodiments of the present application.
It is to be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the N steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. As with another embodiment, if implemented in hardware, may be implemented with a combination of any one or more of the following techniques, as is well known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application specific integrated circuits having suitable combinational logic gates, programmable gate arrays, field programmable gate arrays, and the like.
Those of ordinary skill in the art will appreciate that all or a portion of the steps carried out in the method of the above-described embodiments may be implemented by a program to instruct related hardware, where the program may be stored in a computer readable storage medium, and where the program, when executed, includes one or a combination of the steps of the method embodiments.
Although embodiments of the present application have been shown and described above, it will be understood that the above embodiments are illustrative and not to be construed as limiting the application, and that variations, modifications, alternatives, and variations may be made to the above embodiments by one of ordinary skill in the art within the scope of the application.

Claims (10)

1. An integrated polarized light field depth perception imaging device, comprising:
the main lens is used for condensing and imaging the target scene;
the micro lens array is used for acquiring multi-view light field information of the target scene by utilizing a plurality of micro lenses based on a condensation imaging result;
the micro-polaroid array is used for modulating linear polarization of incident light in different directions to obtain multi-angle polarization information of the target scene;
the CMOS image sensor is used for collecting light intensity information passing through the micro lens array and the micro polarizer array, wherein the light intensity information comprises multi-view light field information and multi-angle polarization information;
and the image reading circuit is used for converting the light intensity information into a digital image signal to obtain a multi-view polarized light field image of the target scene.
2. The integrated polarized light field depth perception imaging device of claim 1, wherein the F-number of the main lens is the same as that of the microlenses in the microlens array.
3. The integrated polarized light field depth perception imaging device of claim 1, wherein each microlens corresponds to a plurality of image pixels of the CMOS image sensor.
4. The integrated polarized light field depth perception imaging device of claim 1, wherein the micro-polarizer array comprises micro-polarizers periodically arranged in four directions, namely 0-degree, 45-degree, 90-degree and 135-degree micro-polarizers.
5. An anti-scattering interference depth perception method, characterized in that an integrated polarized light field depth perception imaging device according to any one of claims 1-4 is employed, wherein the method comprises the steps of:
acquiring multi-view polarized light field images of a target scene under the condition of a scattering medium by using the integrated polarized light field depth perception imaging device;
preprocessing the multi-view polarized light field image, and reconstructing to obtain images with different polarization directions under different view angles;
processing the images with different polarization directions under different view angles to obtain a multi-view clear image meeting the preset resolution condition; and
and estimating the parallax of the target scene according to the multi-view clear image to obtain a depth map of the target scene.
6. The method of claim 5, wherein preprocessing the multi-view polarized light field image and reconstructing the multi-view polarized light field image into images with different polarization directions at different view angles, comprises:
performing polarization demosaicing reconstruction on the polarized light field image based on a preset polarization demosaicing algorithm to obtain a first reconstructed image;
and carrying out light field multi-view high-resolution reconstruction on the first reconstructed image based on a preset light field multi-view reconstruction algorithm to obtain images with different polarization directions under different view angles.
7. The method of claim 6, wherein the pre-set polarization demosaicing algorithm comprises an interpolation algorithm or a deep learning algorithm;
the preset light field multi-view reconstruction algorithm comprises a deconvolution reconstruction algorithm or a deep learning algorithm.
8. The method according to claim 5, wherein the processing the images of different polarization directions at different viewing angles to obtain a multi-view sharp image satisfying a preset resolution condition includes:
and processing the images with different polarization directions under different view angles according to a preset polarization defogging algorithm based on a polarization scattering mechanism to obtain a multi-view-angle clear image meeting a preset resolution condition.
9. The method of claim 8, wherein the predetermined polarization defogging algorithm comprises at least one of a difference method, a Stokes method, and a deep learning method.
10. The method according to claim 5, wherein estimating the disparity map of the target scene from the multi-view sharp image, to obtain the depth map of the target scene, comprises:
estimating a parallax image of the target scene according to the multi-view clear image based on a preset multi-view parallax estimation algorithm to obtain a depth image of the target scene, wherein the preset multi-view parallax estimation algorithm comprises a classical matching algorithm or a deep learning algorithm.
CN202311129533.5A 2023-09-01 2023-09-01 Integrated polarized light field depth perception imaging device and method adopting same Pending CN117395485A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311129533.5A CN117395485A (en) 2023-09-01 2023-09-01 Integrated polarized light field depth perception imaging device and method adopting same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311129533.5A CN117395485A (en) 2023-09-01 2023-09-01 Integrated polarized light field depth perception imaging device and method adopting same

Publications (1)

Publication Number Publication Date
CN117395485A true CN117395485A (en) 2024-01-12

Family

ID=89463832

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311129533.5A Pending CN117395485A (en) 2023-09-01 2023-09-01 Integrated polarized light field depth perception imaging device and method adopting same

Country Status (1)

Country Link
CN (1) CN117395485A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination