WO2022177223A1 - Image reconstruction using multiple point spread functions for a camera behind a display - Google Patents

Image reconstruction using multiple point spread functions for a camera behind a display

Info

Publication number
WO2022177223A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
psfs
electronic device
depth position
instructions
Prior art date
Application number
PCT/KR2022/001920
Other languages
English (en)
Inventor
Changgeng Liu
Ernest Rehmatulla Post
Ziwen Jiang
Ye Zhao
Gustavo Alejandro GUAYAQUIL SOSA
Original Assignee
Samsung Electronics Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd.
Publication of WO2022177223A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/73Deblurring; Sharpening
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/20Image enhancement or restoration using local operators
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/571Depth or shape recovery from multiple images from focus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • H04N9/646Circuits for processing colour signals for image enhancement, e.g. vertical detail restoration, cross-colour elimination, contour correction, chrominance trapping filters
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • H04N9/73Colour balance circuits, e.g. white balance circuits or colour temperature control
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging

Definitions

  • This disclosure relates generally to electronic displays, and, more particularly, to the multiple point spread function (PSF) based reconstruction of images captured by a camera disposed behind such displays.
  • Active matrix liquid crystal displays (AMLCDs), active matrix organic light-emitting displays (AMOLEDs), and micro-LED displays are typically the types of displays deployed for use in personal electronic devices (e.g., mobile phones, tablet computers, smartwatches, and so forth).
  • personal electronic devices may generally include a front-facing camera, which may be disposed adjacent to the display, and may be utilized most often by users to capture self-portraits (e.g., "selfies").
  • As front-facing camera systems grow in complexity (e.g., depth cameras), more and more of the area designated for the display of the electronic device may be traded off to expand the area designated for the camera system. This may lead to a reduction in resolution and viewing area of the display.
  • One technique to overcome the reduction in resolution and viewing area of the display may be to dispose the front-facing camera system completely behind or underneath the display panel.
  • However, disposing the front-facing camera system behind the display panel may often degrade images captured by the front-facing camera. It may thus be useful to provide improved techniques for reconstructing images captured by front-facing camera systems disposed behind a display panel.
  • a method for image reconstruction includes capturing, by a camera disposed behind a display panel of an electronic device, an original image through a semi-transparent pixel region of the display panel, and determining a depth position with respect to at least one object identified within the original image.
  • the method further includes obtaining, based on the depth position, a plurality of point spread functions (PSFs) corresponding to a plurality of lateral positions at the depth position, and generating a set of image patches based on the plurality of PSFs.
  • Each image patch of the set of image patches is generated based on a different one of the plurality of PSFs.
  • the method concludes with generating a reconstructed image corresponding to the original image based on the set of image patches.
  • an electronic device for image reconstruction includes a display panel, a camera disposed behind the display panel, one or more computer-readable storage media including instructions, and one or more processors coupled to the storage media and the camera.
  • the one or more processors are configured to execute the instructions to: capture, by a camera disposed behind a display panel of the electronic device, an original image through a semi-transparent pixel region of the display panel; determine a depth position with respect to at least one object identified within the original image; obtain, based on the depth position, a plurality of point spread functions (PSFs) corresponding to a plurality of lateral positions at the depth position; generate a set of image patches based on the plurality of PSFs, wherein each image patch of the set of image patches is generated based on a different one of the plurality of PSFs; and generate a reconstructed image corresponding to the original image based on the set of image patches.
  • a computer-readable medium includes instructions that, when executed by one or more processors of an electronic device, cause the one or more processors to: capture, by a camera disposed behind a display panel of the electronic device, an original image through a semi-transparent pixel region of the display panel; determine a depth position with respect to at least one object identified within the original image; obtain, based on the depth position, a plurality of point spread functions (PSFs) corresponding to a plurality of lateral positions at the depth position; generate a set of image patches based on the plurality of PSFs, wherein each image patch of the set of image patches is generated based on a different one of the plurality of PSFs; and generate a reconstructed image corresponding to the original image based on the set of image patches.
  • FIG. 1a illustrates an example diagram of an electronic device.
  • FIG. 1b illustrates an example system and workflow diagram for reconstructing images captured by a camera disposed behind a display of an electronic device.
  • FIG. 2 illustrates an example system for measuring and determining one or more premeasured PSFs.
  • FIG. 3 illustrates another example system for measuring and determining one or more premeasured PSFs.
  • FIG. 4 illustrates an example system for reconstructing images based on multiple PSFs captured at a particular depth position and differing lateral positions.
  • FIG. 5 illustrates a flow diagram of a method for determining particular sub-regions and the corresponding particular measured and stored PSFs to be utilized for image reconstruction.
  • FIG. 6a illustrates a workflow diagram for reconstructing an image based on measured and stored PSFs at a determined axial depth position and differing lateral positions.
  • FIG. 6b illustrates another workflow diagram for reconstructing an image based on measured and stored PSFs at a determined axial depth position and differing lateral positions.
  • FIG. 7 illustrates a chart including example experimental data.
  • FIG. 8 illustrates a chart including example experimental data.
  • FIG. 9 illustrates a chart including example experimental data.
  • FIG. 10 illustrates a running example for reconstructing an original image utilizing PSFs measured at a particular depth position and at a number of different lateral positions.
  • FIG. 11a illustrates a running example for reconstructing an original image utilizing PSFs measured at a particular depth position and at a number of different lateral positions.
  • FIG. 11b illustrates a running example for reconstructing an original image utilizing PSFs measured at a particular depth position and at a number of different lateral positions.
  • FIG. 12 illustrates a running example for reconstructing an original image utilizing PSFs measured at a particular depth position and at a number of different lateral positions.
  • FIG. 13 illustrates a running example for reconstructing an original image utilizing PSFs measured at a particular depth position and at a number of different lateral positions.
  • FIG. 14 illustrates a running example for reconstructing an original image utilizing PSFs measured at a particular depth position and at a number of different lateral positions.
  • FIG. 15 illustrates another running example for reconstructing an original image utilizing PSFs measured at a particular depth position and at a number of different lateral positions.
  • FIG. 16 illustrates another running example for reconstructing an original image utilizing PSFs measured at a particular depth position and at a number of different lateral positions.
  • FIG. 17 illustrates another running example for reconstructing an original image utilizing PSFs measured at a particular depth position and at a number of different lateral positions.
  • FIG. 18 illustrates another running example for reconstructing an original image utilizing PSFs measured at a particular depth position and at a number of different lateral positions.
  • FIG. 19 illustrates another running example for reconstructing an original image utilizing PSFs measured at a particular depth position and at a number of different lateral positions.
  • FIG. 20 illustrates another running example for reconstructing an original image utilizing PSFs measured at a particular depth position and at a number of different lateral positions.
  • FIG. 21 illustrates a flow diagram of a method for determining particular sub-regions and the corresponding particular measured and stored PSFs to be utilized for image reconstruction.
  • FIG. 22 illustrates a flow diagram of a method for reconstructing an image captured by a camera disposed behind a display of an electronic device using multiple PSFs.
  • FIG. 23 illustrates an example computer system.
  • In a first aspect, a method includes: capturing, by a camera disposed behind a display panel of the electronic device, an original image through a semi-transparent pixel region of the display panel; determining a depth position with respect to at least one object identified within the original image; accessing, based on the depth position, a plurality of point spread functions (PSFs) corresponding to a plurality of lateral positions at the depth position; generating a set of image patches based on the plurality of PSFs, wherein each image patch of the set of image patches is generated based on a different one of the plurality of PSFs; and generating a reconstructed image corresponding to the original image based on the set of image patches.
  • accessing the plurality of PSFs comprises selecting, from a memory of the electronic device, a plurality of premeasured PSFs corresponding to the plurality of lateral positions, respectively.
  • determining the depth position with respect to at least one object comprises calculating the depth position based on a determined image sensor to camera lens distance of the camera or a determined focal length of the camera.
  • generating the set of image patches based on the plurality of PSFs comprises generating a subset of image patches for each color component of a plurality of color components.
  • generating the reconstructed image further comprises de-convolving each image patch of the subset of image patches for each color component of the plurality of color components.
  • generating the reconstructed image further comprises stitching together the subset of image patches for each color component of the plurality of color components.
  • generating the reconstructed image further comprises: performing a filtering of the stitched subset of image patches for each color component of the plurality of color components; and performing a color balancing and correction of the stitched subset of image patches for each color component of the plurality of color components.
  • In a second aspect, an electronic device includes: a display panel; a camera disposed behind the display panel; one or more non-transitory computer-readable storage media including instructions; and one or more processors coupled to the storage media and the camera.
  • the one or more processors are configured to execute the instructions to: capture, by a camera disposed behind a display panel of the electronic device, an original image through a semi-transparent pixel region of the display panel; determine a depth position with respect to at least one object identified within the original image; access, based on the depth position, a plurality of point spread functions (PSFs) corresponding to a plurality of lateral positions at the depth position; generate a set of image patches based on the plurality of PSFs, wherein each image patch of the set of image patches is generated based on a different one of the plurality of PSFs; and generate a reconstructed image corresponding to the original image based on the set of image patches.
  • the instructions to access the plurality of PSFs further comprise instructions to select, from a memory of the electronic device, a plurality of premeasured PSFs corresponding to the plurality of lateral positions, respectively.
  • the instructions to determine the depth position with respect to at least one object further comprise instructions to calculate the depth position based on a determined image sensor to camera lens distance of the camera or a determined focal length of the camera.
  • the instructions to generate the set of image patches based on the plurality of PSFs further comprise instructions to generate a subset of image patches for each color component of a plurality of color components.
  • the instructions to generate the reconstructed image further comprise instructions to de-convolve each image patch of the subset of image patches for each color component of the plurality of color components.
  • the instructions to generate the reconstructed image further comprise instructions to stitch together the subset of image patches for each color component of the plurality of color components.
  • the instructions to generate the reconstructed image further comprise instructions to: perform a filtering of the stitched subset of image patches for each color component of the plurality of color components; and perform a color balancing and correction of the stitched subset of image patches for each color component of the plurality of color components.
  • In a third aspect, a computer-readable medium includes instructions that, when executed by one or more processors of an electronic device, cause the one or more processors to: capture, by a camera disposed behind a display panel of the electronic device, an original image through a semi-transparent pixel region of the display panel; determine a depth position with respect to at least one object identified within the original image; access, based on the depth position, a plurality of point spread functions (PSFs) corresponding to a plurality of lateral positions at the depth position; generate a set of image patches based on the plurality of PSFs, wherein each image patch of the set of image patches is generated based on a different one of the plurality of PSFs; and generate a reconstructed image corresponding to the original image based on the set of image patches.
  • the instructions to access the plurality of PSFs further comprise instructions to select, from a memory of the electronic device, a plurality of premeasured PSFs corresponding to the plurality of lateral positions, respectively.
  • the instructions to determine the depth position with respect to at least one object further comprise instructions to calculate the depth position based on a determined image sensor to camera lens distance of the camera or a determined focal length of the camera.
  • the instructions to generate the set of image patches based on the plurality of PSFs further comprise instructions to generate a subset of image patches for each color component of a plurality of color components.
  • the instructions to generate the reconstructed image further comprise instructions to de-convolve each image patch of the subset of image patches for each color component of the plurality of color components.
  • the instructions to generate the reconstructed image further comprise instructions to stitch together the subset of image patches for each color component of the plurality of color components.
  • the present embodiments are directed toward techniques for reconstructing images captured by a camera disposed behind a display of an electronic device based on multiple PSFs captured at a particular depth and differing lateral positions.
  • the electronic device may capture, by a camera disposed behind a display panel of the electronic device, an original image through a semi-transparent (or transparent, partially transparent, etc.) region of the display panel.
  • the original image may include a number of color components, and more specifically red (R) color components, green (G) color components, and blue (B) color components.
  • the camera may determine the bit depth of the original image.
  • the electronic device may then determine a depth position with respect to at least one object identified within the original image. For example, in particular embodiments, the electronic device may determine the depth position with respect to at least one object by calculating the depth position based on a determined focal length of the camera.
  • the electronic device may then access, based on the depth position, a number of point spread functions (PSFs) corresponding to a number of lateral positions at the depth position. For example, in particular embodiments, the electronic device may access the number of PSFs by selecting, from a memory of the electronic device, a number of premeasured PSFs corresponding to the number of lateral positions, respectively. In particular embodiments, the electronic device may then generate a set of image patches based on the number of PSFs, in which each image patch of the set of image patches is generated based on a different one of the number of PSFs. For example, in particular embodiments, the electronic device may generate the set of image patches based on the number of PSFs by generating a subset of image patches for each color component of the number of color components.
  • the electronic device may then generate a reconstructed image corresponding to the original image based on the set of image patches. For example, in particular embodiments, the electronic device may generate the reconstructed image by de-convolving each image patch of the subset of image patches for each color component of the number of color components. In particular embodiments, the electronic device may further generate the reconstructed image by stitching together the subset of image patches for each color component of the number of color components. In particular embodiments, the electronic device may further generate the reconstructed image by performing a filtering of the stitched subset of image patches for each color component of the number of color components, and performing a color balancing and correction of the stitched subset of image patches for each color component of the number of color components.
  • the present embodiments may increase the viewing area and the resolution of the display of the electronic device by disposing one or more front-facing cameras of the electronic device behind the display.
  • the electronic device may further provide improved graphical user interfaces (GUIs) with a full-screen view in its entirety, as opposed to being limited to displaying battery status, cellular signal strength data, Wi-Fi status, time information, and so forth, around a notch design or hole-punch design.
  • the present techniques may further increase an aesthetic quality of the electronic device, as well as allow a user of the electronic device to display higher resolution images on the display of the electronic device.
  • the present techniques may allow the one or more front-facing cameras to be placed anywhere (e.g., in a center area of the display), as opposed to in a corner or along an edge of the display of the electronic device. This may provide an improved user experience and/or GUI, such as by directing a user taking a selfie to gaze at the center area of the display and further by giving the impression of eye-to-eye contact with another user when the user is participating in a videoconference, a video-telephonic exchange, or other video-streaming service.
  • While the present embodiments are described primarily with respect to reconstructing images captured by a camera disposed behind a display of an electronic device based on multiple PSFs captured at a particular depth and differing lateral positions, the present embodiments further contemplate reconstructing images based on multiple PSFs captured at a particular depth and differing lateral positions utilizing any suitable arrangements of cameras, light sources, and so forth.
  • the present embodiments as described herein may be used for reconstructing images based on multiple PSFs captured at a particular depth and differing lateral positions in any system where images captured by the system may be distorted (e.g., blurred) due to, for example, an object depth position relative to the camera lens being unknown beforehand and the associated PSFs being different for each of various differing lateral positions with respect to a particular object depth position.
  • the particular embodiments may equally apply to applications in which, for example, an image is captured through micro-perforations utilizing a concealed camera and/or utilizing an inverse filter to generate a higher-quality image than that achievable by less advanced optical devices.
  • FIG. 1a illustrates an example diagram 100A of an electronic device 102.
  • the electronic device 102 may include, for example, any of various personal electronic devices 102, such as a mobile phone electronic device, a tablet computer electronic device, a laptop computer electronic device, and so forth.
  • the personal electronic device 102 may include, among other things, one or more processor(s) 104, memory 106, sensors 108, cameras 110, a display 112, input structures 114, network interfaces 116, a power source 118, and an input/output (I/O) interface 120.
  • FIG. 1a is merely one example of a particular implementation and is intended to illustrate the types of components that may be included as part of the electronic device 102.
  • the one or more processor(s) 104 may be operably coupled with the memory 106 to perform various algorithms for providing multiple point spread function based image reconstruction for a camera behind a display.
  • Such programs or instructions executed by the processor(s) 104 may be stored in any suitable article of manufacture that includes one or more tangible, computer-readable media at least collectively storing the instructions or routines, such as the memory 106.
  • the memory 106 may include any suitable articles of manufacture for storing data and executable instructions, such as random-access memory (RAM), read-only memory (ROM), rewritable flash memory, hard drives, and so forth.
  • Programs (e.g., an operating system) encoded on such a computer program product may also include instructions that may be executed by the processor(s) 104 to enable the electronic device 102 to provide various functionalities.
  • the sensors 108 may include, for example, one or more cameras (e.g., depth cameras), touch sensors, microphones, motion detection sensors, thermal detection sensors, light detection sensors, time of flight (ToF) sensors, ultrasonic sensors, infrared sensors, or other similar sensors that may be utilized to detect various user inputs (e.g., user voice inputs, user gesture inputs, user touch inputs, user instrument inputs, user motion inputs, and so forth).
  • the cameras 110 may include any number of cameras (e.g., wide cameras, narrow cameras, telephoto cameras, ultra-wide cameras, depth cameras, and so forth) that may be utilized to capture various 2D and 3D images.
  • the display 112 may include any display architecture (e.g., AMLCD, AMOLED, micro-LED, and so forth), which may provide further means by which users may interact and engage with the electronic device 102.
  • one or more of the cameras 110 may be disposed behind or underneath (e.g., as indicated by the dashed lines of electronic device 102) the display panel of the display 112 (e.g., one or more of the cameras 110 may be completely concealed by the display panel of the display 112), and thus the display 112 may include a transparent pixel region and/or semi-transparent pixel region through which the one or more concealed cameras 110 may detect light, and, by extension, capture images.
  • the one or more cameras 110 may be disposed anywhere behind or underneath the display panel of the display 112, such as at a center area behind the display 112, at an upper area behind the display 112, or at a lower area behind the display 112.
  • the input structures 114 may include any physical structures utilized to control one or more global functions of the electronic device 102 (e.g., pressing a button to power "ON" or power "OFF" the electronic device 102).
  • the network interface 116 may include, for example, any number of network interfaces suitable for allowing the electronic device 102 to access and receive data over one or more cloud-based networks (e.g., a cloud-based service that may service hundreds or thousands of the electronic device 102 and the associated users corresponding thereto) and/or distributed networks.
  • the power source 118 may include any suitable source of power, such as a rechargeable lithium polymer (Li-poly) battery and/or an alternating current (AC) power converter that may be utilized to power and/or charge the electronic device 102 for operation.
  • the I/O interface 120 may be provided to allow the electronic device 102 to interface with various other electronic or computing devices, such as one or more auxiliary electronic devices.
  • FIG. 1b illustrates an example system and workflow diagram 100B for reconstructing images captured by a camera disposed behind a display 112 of an electronic device, in accordance with the presently disclosed embodiments.
  • the electronic device may capture, by an image sensor 122 and camera lens 124 disposed behind a display panel 112 of the electronic device, an image of a real-world scene 126.
  • the image of the real-world scene 126 captured by the image sensor 122 may correspond to an original image 128.
  • the original image 128 may be degraded (e.g., blurred or distorted).
  • the electronic device may retrieve, for one or more pixel regions of the original image 128, the PSFs (e.g., a function of the 3D diffraction pattern of light emitted from an imperceptibly small point light source and captured by one or more image sensors 122) for each of the RGB color components of the original image 128.
  • the PSFs may be stored on the electronic device.
  • the electronic device may determine the respective PSF for each of the RGB color components by selecting (at functional block 132), from the memory of the electronic device, the premeasured PSFs for each of the RGB color components.
  • the electronic device may determine multiple PSFs in various pixel regions of the real-world scene 126 to capture the PSFs' variation with the angle of incidence to the optical axis of the display panel 112.
  • the electronic device may then perform (at functional block 134), for the number of pixel regions of the original image 128, a deconvolution of each of the RGB color components of the original image 128 based on their respective PSFs.
  • the electronic device may perform the deconvolution of each of the RGB color components by performing a Richardson-Lucy deconvolution of each of the RGB color components or by performing a Tikhonov regularized inverse filter deconvolution of each of the RGB color components.
  • other deconvolution techniques may be utilized.
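  • For concreteness, the two named techniques can be sketched in Python with NumPy/SciPy as shown below. This is a minimal illustration, not the disclosed implementation: the function names, the flat initialization of the Richardson-Lucy estimate, the iteration count, and the regularization strength alpha are all assumed choices.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(blurred, psf, iterations=30):
    """Richardson-Lucy deconvolution of one RGB color component.

    blurred: 2D array holding a single color component.
    psf:     2D non-negative kernel normalized so that psf.sum() == 1.
    """
    estimate = np.full(blurred.shape, blurred.mean(), dtype=float)
    psf_mirror = psf[::-1, ::-1]  # correlation kernel for the update step
    for _ in range(iterations):
        reblurred = fftconvolve(estimate, psf, mode="same")
        ratio = blurred / np.maximum(reblurred, 1e-12)  # guard divide-by-zero
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate

def tikhonov_deconvolve(blurred, psf, alpha=1e-2):
    """Tikhonov-regularized inverse filter in the frequency domain.

    Assumes psf is centered in an array of the same shape as blurred;
    alpha damps frequencies that the display panel strongly attenuates.
    """
    H = np.fft.fft2(np.fft.ifftshift(psf))
    B = np.fft.fft2(blurred)
    X = np.conj(H) * B / (np.abs(H) ** 2 + alpha)
    return np.real(np.fft.ifft2(X))
```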
  • the electronic device may then generate (at functional block 136) a reconstructed image 138 corresponding to the original image 128 based on the deconvolutions of each of the RGB color components. As illustrated by comparison of the original image 128 to the reconstructed image 138, the electronic device may generally generate the reconstructed image 138 by removing a blurring effect of the original image 128.
  • FIG. 2 illustrates an example system 200 for measuring and determining one or more premeasured point spread functions (PSFs) (e.g., individually for each of the RGB color components and/or one or more particular monochromatic color components) of an electronic device, in accordance with the presently disclosed embodiments.
  • the example system 200 may be utilized for measuring and determining a number of PSFs.
  • the electronic device may premeasure (e.g., determine experimentally during a calibration process and/or manufacturing process of the electronic device) and store the PSFs of the electronic device.
  • the point light source 140 may emit a light wave in the direction of the electronic device through, for example, a pinhole or other imperceptibly small aperture.
  • the light wave may pass through, for example, the display panel 112 and the camera lens 124, and may be ultimately detected by the image sensor 122.
  • the electronic device may then premeasure the one or more PSFs for each of the RGB color components and/or one or more particular monochromatic color components based on, for example, a sampling of a transfer function corresponding to the display panel 112 in response to the point light source 140.
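  • One plausible way to turn such a point-source capture into a stored PSF is to subtract the sensor's dark level and normalize the result to unit energy. The helper below is a sketch under that assumption; the dark-frame subtraction and clipping are illustrative details rather than steps recited by the disclosure.

```python
import numpy as np

def premeasure_psf(point_source_capture, dark_frame):
    """Convert a capture of the point light source 140 into a stored PSF.

    point_source_capture: 2D array for one color component, imaged through
    the display panel 112; dark_frame: a capture with the source off.
    """
    psf = point_source_capture.astype(float) - dark_frame.astype(float)
    psf = np.clip(psf, 0.0, None)  # sensor noise can dip below the dark level
    return psf / psf.sum()         # unit energy preserves overall brightness
```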
  • FIG. 3 illustrates another example system 300 for measuring and determining one or more premeasured PSFs utilizing on-axis point light source 140A and off-axis point light source 140B, respectively.
  • PSFs measured at different depth positions may include different measured values, and PSFs measured at different lateral positions may likewise each include different measured values. For example, as depicted by FIG. 2 and FIG. 3, PSFs 202, 204, and 206 measured at different depth positions may include different measured values, as well as PSFs 208 and 210 measured at different lateral positions (e.g., (X 1 ,Y 1 ,Z 0 ), (X 2 ,Y 2 ,Z 0 )).
  • the object depth position relative to the camera lens 124 may be unknown beforehand.
  • the suitable PSF at the correct axial position with respect to the object of interest being captured would first have to be determined. Otherwise, in accordance with particular embodiments, the blurred original image 128 may not be reconstructed in a manner that would completely remove the blurring artifacts.
  • Thus, it may be useful to determine the correct axial depth position and then measure multiple PSFs at that axial depth position (e.g., multiple PSFs at a single axial depth position Z 0 but at multiple different lateral positions (X 1 ,Y 1 ,Z 0 ), (X 2 ,Y 2 ,Z 0 )) to compensate for the fact that PSFs measured at different lateral positions may each include a different measured value.
  • FIG. 4 illustrates an example system 400 for reconstructing images based on multiple PSFs (e.g., individually for each of the RGB color components and/or one or more particular monochromatic color components) captured at a particular depth position and differing lateral positions, in accordance with the presently disclosed embodiments.
  • the example system 400 may be utilized as the basis for measuring and determining one or more PSFs.
  • The point light source 140 (e.g., a white LED or an array of white LEDs) may emit a light wave that passes through, for example, the display panel 112 and the camera lens 124, and may be ultimately detected by the image sensor 122.
  • an axial depth position Z 0 may be determined for a particular object of interest (e.g., point light source 140) that may be positioned within a 3D object space 402.
  • the determined axial depth position Z 0 may be calculated utilizing infrared (IR) light and/or structured light and/or one or more depth cameras that may capture the point light source 140 apart from the electronic device and/or by way of the electronic device.
  • the determined axial depth position Z 0 may also be estimated based on the physical experimental setup of the example system 400.
  • a number of PSFs may be measured at a number of different lateral positions (X 1 ,Y 1 ,Z 0 ), (X 2 ,Y 2 ,Z 0 ) at the determined axial depth position Z 0 .
  • the different lateral positions (X 1 ,Y 1 ,Z 0 ), (X 2 ,Y 2 ,Z 0 ) may be located at the determined axial depth position Z 0 and may be defined laterally with respect to the 3D object space 402.
  • For example, a first region 404 (e.g., "Region 1") may be defined laterally with respect to the 3D object space 402 and may correspond to a lateral position value of Z 01 , a second region 406 (e.g., "Region 2") may be defined laterally with respect to the 3D object space 402 and may correspond to a lateral position value of Z 02 , and a third region 408 (e.g., "Region 3") may be defined laterally with respect to the 3D object space 402 and may correspond to a lateral position value of Z 03 .
  • For the first region 404 corresponding to the lateral position value of Z 01 , the second region 406 corresponding to the lateral position value of Z 02 , and the third region 408 corresponding to the lateral position value of Z 03 , the respective measured PSFs corresponding to the differing lateral positions (X 1 ,Y 1 ,Z 0 ), (X 2 ,Y 2 ,Z 0 ), and (X 3 ,Y 3 ,Z 0 ) may represent the PSF for the particular sub-region of pixels (e.g., patch of pixels).
  • a corresponding number of PSFs may be measured at each of the determined number of different lateral positions (X 1 ,Y 1 ,Z 0 ), (X 2 ,Y 2 ,Z 0 ).
  • the electronic device may then store the measured PSFs (e.g., for each of the RGB color components and/or one or more particular monochromatic color components) corresponding to the determined number of different lateral positions (X 1 ,Y 1 ,Z 0 ), (X 2 ,Y 2 ,Z 0 ) into, for example, the memory 106 of FIG. 1a to be later utilized to reconstruct images captured by the camera disposed behind the display panel 112 of the electronic device.
  • any number of regions 404, 406, and 408 and corresponding differing lateral positions (X 1 ,Y 1 ,Z 0 ), (X 2 ,Y 2 ,Z 0 ), and (X 3 ,Y 3 ,Z 0 ) may be determined, and thus any number of PSFs may be measured.
  • the present techniques may allow the electronic device to perform image reconstruction by utilizing the respective measured PSFs to stitch together the reconstructed image based on the particular sub-regions of pixels to which the respective measured PSFs correspond.
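  • A sketch of that per-sub-region reconstruction is shown below, reusing the tikhonov_deconvolve helper from the earlier sketch. The rectangular, non-overlapping patch grid is an assumption; a practical implementation would also blend patch boundaries, which this sketch omits.

```python
import numpy as np

def reconstruct_channel_by_subregions(channel, regions, psfs, alpha=1e-2):
    """Deconvolve each pixel sub-region with its own measured PSF.

    channel: 2D array (one color component of the captured image).
    regions: list of (row_slice, col_slice) tiles covering the image.
    psfs:    one PSF per sub-region, padded/centered to its tile's shape.
    """
    out = np.zeros(channel.shape, dtype=float)
    for (rows, cols), psf in zip(regions, psfs):
        # Each tile is restored with the PSF measured at its lateral position,
        # then written back in place, which stitches the result together.
        out[rows, cols] = tikhonov_deconvolve(channel[rows, cols], psf, alpha)
    return out
```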
  • FIG. 5 illustrates a flow diagram of a method 500 for determining particular sub-regions and the corresponding particular measured and stored PSFs to be utilized for image reconstruction, in accordance with the presently disclosed embodiments.
  • the method 500 may be performed utilizing one or more processing devices (e.g., the one or more processors 104 of FIG. 1a) that may include hardware (e.g., a general purpose processor, a graphic processing unit (GPU), an application-specific integrated circuit (ASIC), a system-on-chip (SoC), a microcontroller, a field-programmable gate array (FPGA), a central processing unit (CPU), an application processor (AP), a visual processing unit (VPU), a neural processing unit (NPU), a neural decision processor (NDP), or any other processing device(s) that may be suitable for processing image data), software (e.g., instructions running/executing on one or more processors), firmware (e.g., microcode), or some combination thereof.
  • one or more blocks of the method 500 may be performed only once or on an as-needed (e.g., per request/instruction) basis, such as when a photograph is being captured.
  • one or more blocks of the method 500 may be performed continuously and/or iteratively (e.g., automatically running multiple times over a duration of time), such as when a video is being recorded or when a camera application is executing and a viewfinder/camera-preview is being continuously displayed.
  • the method 500 may begin at block 502 with the one or more processing devices (e.g., the one or more processors 104 of the electronic device 102 of FIG. 1a) turning on the camera 110 and continuing at block 504 with enabling camera auto focus or manual focus function to focus on an object of interest within a real world scene.
  • the method 500 may continue at block 506 with the one or more processing devices determining an image sensor 122 to camera lens 124 distance Z i of the camera 110 and then at block 508 with determining the axial depth position Z 0 of the object of interest within the real world scene (e.g., based on the lens imaging equation utilizing the known image sensor 122 to camera lens 124 distance Z i ).
  • the method 500 may continue at block 510 with the one or more processing devices determining the particular sub-regions of pixels (e.g., differing lateral positions (X 1 ,Y 1 ,Z 0 ), (X 2 ,Y 2 ,Z 0 ), and (X 3 ,Y 3 ,Z 0 )) corresponding to the determined axial depth position Z 0 and concluding at block 512 with selecting the measured and stored PSFs from, for example, a look-up table (LUT) of the memory 106 corresponding to the particular sub-regions (found region) of pixels (e.g., differing lateral positions (X 1 ,Y 1 ,Z 0 ), (X 2 ,Y 2 ,Z 0 ), and (X 3 ,Y 3 ,Z 0 )) for image reconstruction.
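  • Blocks 506 through 512 can be sketched as follows, assuming a thin-lens model for the "lens imaging equation" (the disclosure does not name a specific one) and a LUT keyed by stored axial depth in millimeters; nearest-depth selection is likewise an assumed policy.

```python
def axial_depth_from_lens(focal_length_mm, sensor_to_lens_mm):
    """Thin-lens imaging equation, 1/f = 1/Zi + 1/Z0, solved for Z0 (block 508)."""
    return 1.0 / (1.0 / focal_length_mm - 1.0 / sensor_to_lens_mm)

def select_stored_psfs(psf_lut, z0_mm):
    """Pick the LUT entry whose stored axial depth is nearest to Z0 (block 512)."""
    nearest_depth = min(psf_lut, key=lambda depth: abs(depth - z0_mm))
    return psf_lut[nearest_depth]
```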
  • FIG. 6a illustrates a workflow diagram 600A for reconstructing an image based on measured and stored PSFs at a determined axial depth position Z 0 and differing lateral positions (X 1 ,Y 1 ,Z 0 ), (X 2 ,Y 2 ,Z 0 ), and (X 3 ,Y 3 ,Z 0 ).
  • an original image may be captured (at functional block 602).
  • Bayer raw image data corresponding to the original image may then be converted (at functional block 604) into separate RGB color components and split (at functional block 606) into a number of image patches (at functional blocks 608A, 608B, and 608C).
  • For each of the RGB color component image patches, deconvolutions (at functional blocks 610A, 610B, and 610C) may be performed utilizing the measured and stored PSFs corresponding to the respective image patches.
  • multiple reconstructed patches may be stitched together (at functional blocks 612A, 612B, and 612C) into a single RGB reconstructed image that may be further filtered (at functional blocks 614A, 614B, and 614C) to reduce noise.
  • Color balancing and correction (at functional block 616) may be performed on the reconstructed image before the finalized stitched-together reconstructed image is output (at functional block 618).
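  • The per-channel portion of this workflow might be organized as below, reusing reconstruct_channel_by_subregions from the earlier sketch. Because the disclosure does not fix particular denoising or color-correction filters, those stages are passed in as caller-supplied functions.

```python
import numpy as np

def reconstruct_rgb(rgb, regions, psfs_per_channel, denoise, color_balance,
                    alpha=1e-2):
    """Workflow of FIG. 6a after Bayer-to-RGB conversion (functional block 604).

    rgb: HxWx3 array; psfs_per_channel[c] lists one PSF per region for
    channel c; denoise and color_balance implement blocks 614 and 616.
    """
    channels = []
    for c in range(3):  # blocks 608-612: per-channel patch deconvolution/stitching
        restored = reconstruct_channel_by_subregions(
            rgb[..., c], regions, psfs_per_channel[c], alpha)
        channels.append(denoise(restored))  # block 614: noise-reduction filtering
    return color_balance(np.stack(channels, axis=-1))  # blocks 616-618
```

  • For instance, passing identity stages (denoise=lambda x: x, color_balance=lambda x: x) exercises only the deconvolution and stitching steps.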
  • FIG. 6b illustrates a workflow diagram 600B that describes the function of the dashed rectangle in FIG. 6a.
  • the respective measured and stored PSFs corresponding to the determined axial depth position Z 0 may be selected (at functional block 622), for example, from one or more LUTs within the memory 106 of the electronic device 102 of FIG. 1a.
  • a further selection of the respective measured PSF corresponding to the particular image patch may be made, and then the deconvolution (at functional block 628) of each of the RGB color components (at functional block 626) may be performed.
  • FIG. 7, FIG. 8, and FIG. 9 illustrate example experimental data, which includes a position data plot 700, a PSF LUT 800, and a position data table 900.
  • the position data plot 700 may depict depth of focus (DOF) plotted against axial depth position Z 0 , in accordance with the presently disclosed embodiments.
  • the PSF LUT 800 may include axial depth position Z 0 (e.g., 300mm, 325mm, 350mm, 375mm, 400mm, 420mm, 425mm, 450mm, and so forth) and the respective measured PSFs corresponding to the differing lateral positions (X 1 ,Y 1 ,Z 0 ), (X 2 ,Y 2 ,Z 0 ), and (X 3 ,Y 3 ,Z 0 ).
  • the position data table 900 may simply depict the relationship between axial depth position Z 0 and the lateral sub-regions (e.g., corresponding to the differing lateral positions (X 1 ,Y 1 ,Z 0 ), (X 2 ,Y 2 ,Z 0 ), and (X 3 ,Y 3 ,Z 0 )) at which the PSFs are measured and stored to the PSF LUT 800.
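  • An illustrative in-memory layout for such a LUT is sketched below; the depth keys follow the example values above, while the 7x7 zero arrays are placeholders for real measured kernels.

```python
import numpy as np

# Each stored axial depth Z0 (mm) maps to one PSF per lateral sub-region.
psf_lut = {
    depth_mm: {region: np.zeros((7, 7))
               for region in ("region_1", "region_2", "region_3")}
    for depth_mm in (300, 325, 350, 375, 400, 420, 425, 450)
}
```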
  • FIGS. 10 through 20 illustrate one or more running examples of the presently disclosed techniques of reconstructing images based on multiple PSFs (e.g., individually for each of the RGB color components and/or one or more particular monochromatic color components) captured at a particular depth position and differing lateral positions.
  • FIGS. 10 to 14 may depict a running example in which an original image may be captured and reconstructed utilizing PSFs measured at, for example, up to a total of 6 different lateral positions (X 1 ,Y 1 ,Z 0 ), (X 2 ,Y 2 ,Z 0 ), (X 3 ,Y 3 ,Z 0 ), (X 4 ,Y 4 ,Z 0 ), (X 5 ,Y 5 ,Z 0 ), and (X 6 ,Y 6 ,Z 0 ).
  • FIGS. 15 to 20 may depict another running example in which an original image may be captured and reconstructed utilizing PSFs measured at, for example, up to a total of 35 different lateral positions (X 1 ,Y 1 ,Z 0 ) to (X 35 ,Y 35 ,Z 0 ).
  • a number of PSFs may be measured at each of a determined number of different lateral positions (e.g., 6 different lateral positions (X 1 ,Y 1 ,Z 0 ), (X 2 ,Y 2 ,Z 0 ), (X 3 ,Y 3 ,Z 0 ), (X 4 ,Y 4 ,Z 0 ), (X 5 ,Y 5 ,Z 0 ), and (X 6 ,Y 6 ,Z 0 )).
  • the six different lateral positions may each correspond to a different subregion of pixels and, as may be observed, may each include a different measured PSF.
  • FIGS. 11a and 11b illustrate a real world scene 1100A and an original image 1100B, respectively.
  • Based on the determined sub-regions and the measured PSFs corresponding to the sub-regions (e.g., corresponding to the six different lateral positions (X 1 ,Y 1 ,Z 0 ), (X 2 ,Y 2 ,Z 0 ), (X 3 ,Y 3 ,Z 0 ), (X 4 ,Y 4 ,Z 0 ), (X 5 ,Y 5 ,Z 0 ), and (X 6 ,Y 6 ,Z 0 )), image patches (e.g., image patches "1", "2", "3", "4", "5", and "6") may be generated and utilized to generate a stitched-together reconstructed image 1200 as depicted in FIG. 12.
  • FIG. 13 shows the six restored patches prior to performing the image stitching process.
  • FIG. 14 depicts the final stitched reconstructed image 1400 from the six restored image patches. It should be appreciated that FIG. 14 is included merely for the purposes of illustration of the presently disclosed techniques.
  • the presently disclosed embodiments may include generating reconstructed images utilizing an N number of PSFs corresponding to an N number of different lateral positions (e.g., (X 1 ,Y 1 ,Z 0 ) to (X N ,Y N ,Z 0 )) to completely recover the original image (e.g., a recovered image more akin to the real world scene 1100A in FIG. 11a).
  • a number of PSFs may be measured at each of a determined number of different lateral positions (e.g., 35 different lateral positions (X 1 ,Y 1 ,Z 0 ) to (X 35 ,Y 35 ,Z 0 )).
  • the 35 different lateral positions (X 1 ,Y 1 ,Z 0 ) to (X 35 ,Y 35 ,Z 0 ) may each correspond to a different sub-region of pixels and, as may be observed, may each include a different measured PSF.
  • FIGS. 16 and 17 illustrate a real world scene 1600 and an original image 1700, respectively.
  • image patches (e.g., image patches "1" to "35") may be generated and utilized to generate a stitched-together reconstructed image 1800 as depicted in FIG. 18 and a final reconstructed image 1900 as depicted in FIG. 19. For example, image patches "11" and "18" are depicted in the stitched-together reconstructed image 1800.
  • FIG. 20 depicts a reconstructed image 2000, which illustrates that, without reconstructing an image based on measured and stored PSFs at a determined axial depth position and differing lateral positions, the reconstructed image 2000 may still include blurring artifacts when generated based on utilizing only a single PSF for the entire reconstructed image.
  • FIG. 21 illustrates a flow diagram of a method 2100 for determining particular sub-regions and the corresponding particular measured and stored PSFs to be utilized for image reconstruction, in accordance with the presently disclosed embodiments.
  • the method 2100 may be performed utilizing one or more processing devices (e.g., the one or more processors 104 of FIG. 1a) that may include hardware (e.g., a general purpose processor, a graphic processing unit (GPU), an application-specific integrated circuit (ASIC), a system-on-chip (SoC), a microcontroller, a field-programmable gate array (FPGA), a central processing unit (CPU), an application processor (AP), a visual processing unit (VPU), a neural processing unit (NPU), a neural decision processor (NDP), or any other processing device(s) that may be suitable for processing image data), software (e.g., instructions running/executing on one or more processors), firmware (e.g., microcode), or some combination thereof.
  • one or more blocks of the method 2100 may be performed only once or on an as-needed (e.g., per request/instruction) basis, such as when a photograph is being captured.
  • one or more blocks of the method 2100 may be performed continuously and/or iteratively (e.g., automatically running multiple times over a duration of time), such as when a video is being recorded or when a camera application is executing and a viewfinder/camera-preview is being continuously displayed.
  • the method 2100 may begin at block 2102 with the one or more processing devices (e.g., the one or more processors 104 of the electronic device 102 of FIG. 1a) turning on the camera 110 and continuing at block 2104 with enabling camera auto focus or manual focus function to focus on an object of interest within a real world scene.
  • the method 2100 may continue at block 2106 with the one or more processing devices reading the focal length and an image sensor 122 to camera lens 124 distance Z i directly from the camera 110 and then at block 2108 with determining the axial depth position Z 0 of the object of interest within the real world scene (e.g., based on the lens imaging equation utilizing the read focal length and distance Z i ).
  • the method 2100 may continue at block 2110 with the one or more processing devices determining the particular sub-regions of pixels (e.g., differing lateral positions (X 1 ,Y 1 ,Z 0 ), (X 2 ,Y 2 ,Z 0 ), and (X 3 ,Y 3 ,Z 0 )) corresponding to the determined axial depth position Z 0 and concluding at block 2112 with selecting the measured and stored PSFs from, for example, a look-up table (LUT) of the memory 106 corresponding to the particular sub-regions (found region) of pixels (e.g., differing lateral positions (X 1 ,Y 1 ,Z 0 ), (X 2 ,Y 2 ,Z 0 ), and (X 3 ,Y 3 ,Z 0 )) for image reconstruction.
  • FIG. 22 illustrates a flow diagram of a method 2200 for reconstructing images captured by a camera disposed behind a display of an electronic device based on multiple PSFs captured at a particular depth and differing lateral positions, in accordance with the presently disclosed embodiments.
  • the method 2200 may be performed utilizing one or more processing devices (e.g., the one or more processors 104 of FIG. 1a) that may include hardware (e.g., a general purpose processor, a graphic processing unit (GPU), an application-specific integrated circuit (ASIC), a system-on-chip (SoC), a microcontroller, a field-programmable gate array (FPGA), a central processing unit (CPU), an application processor (AP), a visual processing unit (VPU), a neural processing unit (NPU), a neural decision processor (NDP), or any other processing device(s) that may be suitable for processing image data), software (e.g., instructions running/executing on one or more processors), firmware (e.g., microcode), or some combination thereof.
  • one or more blocks of the method 2200 may be performed only once or on an as-needed (e.g., per request/instruction) basis, such as when a photograph is being captured.
  • one or more blocks of the method 2200 may be performed continuously and/or iteratively (e.g., automatically running multiple times over a duration of time), such as when a video is being recorded or when a camera application is executing and a viewfinder/camera-preview is being continuously displayed.
  • the method 2200 may begin at block 2202 with the one or more processing devices (e.g., one or more processors 104 of the electronic device 102 of FIG. 1a) capturing, by a camera disposed behind a display panel of the electronic device, an original image through a semi-transparent pixel region of the display panel, in which the original image includes one or more color components.
  • the method 2200 may then continue at block 2204 with the one or more processing devices determining a depth position with respect to at least one object identified within the original image.
  • the method 2200 may then continue at block 2206 with the one or more processing devices accessing, based on the depth position, a plurality of point spread functions (PSFs) corresponding to a plurality of lateral positions at the depth position.
  • the method 2200 may then continue at block 2208 with the one or more processing devices generating a set of image patches based on the plurality of PSFs, wherein each image patch of the set of image patches is generated based on a different one of the plurality of PSFs.
  • the method 2200 may then conclude at block 2210 with the one or more processing devices generating a reconstructed image corresponding to the original image based on the set of image patches.
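  • Tying the earlier sketches together, the block order of method 2200 might look like the following; the numeric camera parameters are invented for illustration only.

```python
# Blocks 2204-2206: depth estimate, then PSF selection at that depth.
z0 = axial_depth_from_lens(focal_length_mm=4.5, sensor_to_lens_mm=4.55)
psfs_at_depth = select_stored_psfs(psf_lut, z0)
# Blocks 2208-2210 would then call reconstruct_rgb with these PSFs to
# generate the per-PSF image patches and stitch the reconstructed image.
```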
  • the present embodiments may increase the viewing area and the resolution of the display panel of the electronic device by disposing one or more front-facing cameras of the electronic device behind the display panel.
  • the electronic device may further provide improved graphical user interfaces (GUIs) with a full-screen view in its entirety, as opposed to being limited to displaying battery status, cellular signal strength data, Wi-Fi status, time information, and so forth, around a notch design or hole-punch design.
  • the present techniques may further increase an aesthetic quality of the electronic device, as well as allow a user of the electronic device to display higher resolution images on the display panel of the electronic device.
  • the present techniques may allow the one or more front-facing cameras to be placed anywhere, such as in a center area of the display panel (e.g., as opposed to in a corner or along an edge of the display panel) of the electronic device.
  • This may provide an improved user experience and/or GUI, such as by directing a user taking a selfie to gaze at the center area of the display panel, and further by giving the impression of eye-to-eye contact with another user when the user is participating in a videoconference, a video-telephonic exchange, or other video-streaming service.
  • FIG. 23 illustrates an example computer system 2300 that may be utilized for reconstructing images captured by a camera disposed behind a display of an electronic device based on multiple PSFs captured at a particular depth and differing lateral positions, in accordance with the presently disclosed embodiments.
  • one or more computer systems 2300 perform one or more steps of one or more methods described or illustrated herein.
  • one or more computer systems 2300 provide functionality described or illustrated herein.
  • software running on one or more computer systems 2300 performs one or more steps of one or more methods described or illustrated herein or provides functionality described or illustrated herein.
  • Particular embodiments include one or more portions of one or more computer systems 2300.
  • reference to a computer system may encompass a computing device, and vice versa, where appropriate.
  • reference to a computer system may encompass one or more computer systems, where appropriate.
  • computer system 2300 may be an embedded computer system, a system-on-chip (SOC), a single-board computer system (SBC) (e.g., a computer-on-module (COM) or system-on-module (SOM)), a desktop computer system, a laptop or notebook computer system, an interactive kiosk, a mainframe, a mesh of computer systems, a mobile telephone, a personal digital assistant (PDA), a server, a tablet computer system, an augmented/virtual reality device, or a combination of two or more of these.
  • one or more computer systems 2300 may perform without substantial spatial or temporal limitation one or more steps of one or more methods described or illustrated herein.
  • one or more computer systems 2300 may perform in real time or in batch mode one or more steps of one or more methods described or illustrated herein.
  • One or more computer systems 2300 may perform at different times or at different locations one or more steps of one or more methods described or illustrated herein, where appropriate.
  • computer system 2300 includes a processor 2302, memory 2304, storage 2306, an input/output (I/O) interface 2308, a communication interface 2310, and a bus 2312.
  • this disclosure describes and illustrates a particular computer system having a particular number of particular components in a particular arrangement, this disclosure contemplates any suitable computer system having any suitable number of any suitable components in any suitable arrangement.
  • processor 2302 includes hardware for executing instructions, such as those making up a computer program.
  • processor 2302 may retrieve (or fetch) the instructions from an internal register, an internal cache, memory 2304, or storage 2306; decode and execute them; and then write one or more results to an internal register, an internal cache, memory 2304, or storage 2306.
  • processor 2302 may include one or more internal caches for data, instructions, or addresses. This disclosure contemplates processor 2302 including any suitable number of any suitable internal caches, where appropriate.
  • processor 2302 may include one or more instruction caches, one or more data caches, and one or more translation lookaside buffers (TLBs). Instructions in the instruction caches may be copies of instructions in memory 2304 or storage 2306, and the instruction caches may speed up retrieval of those instructions by processor 2302.
  • Data in the data caches may be copies of data in memory 2304 or storage 2306 for instructions executing at processor 2302 to operate on; the results of previous instructions executed at processor 2302 for access by subsequent instructions executing at processor 2302 or for writing to memory 2304 or storage 2306; or other suitable data.
  • the data caches may speed up read or write operations by processor 2302.
  • the TLBs may speed up virtual-address translation for processor 2302.
  • processor 2302 may include one or more internal registers for data, instructions, or addresses. This disclosure contemplates processor 2302 including any suitable number of any suitable internal registers, where appropriate.
  • processor 2302 may include one or more arithmetic logic units (ALUs); be a multi-core processor; or include one or more processors 2302.
  • memory 2304 includes main memory for storing instructions for processor 2302 to execute or data for processor 2302 to operate on.
  • computer system 2300 may load instructions from storage 2306 or another source (such as, for example, another computer system 2300) to memory 2304.
  • Processor 2302 may then load the instructions from memory 2304 to an internal register or internal cache.
  • processor 2302 may retrieve the instructions from the internal register or internal cache and decode them.
  • processor 2302 may write one or more results (which may be intermediate or final results) to the internal register or internal cache.
  • Processor 2302 may then write one or more of those results to memory 2304.
  • processor 2302 executes only instructions in one or more internal registers or internal caches or in memory 2304 (as opposed to storage 2306 or elsewhere) and operates only on data in one or more internal registers or internal caches or in memory 2304 (as opposed to storage 2306 or elsewhere).
  • One or more memory buses may couple processor 2302 to memory 2304.
  • Bus 2312 may include one or more memory buses, as described below.
  • one or more memory management units reside between processor 2302 and memory 2304 and facilitate accesses to memory 2304 requested by processor 2302.
  • memory 2304 includes random access memory (RAM).
  • This RAM may be volatile memory, where appropriate. Where appropriate, this RAM may be dynamic RAM (DRAM) or static RAM (SRAM). Moreover, where appropriate, this RAM may be single-ported or multi-ported RAM.
  • Memory 2304 may include one or more memory devices, where appropriate. Although this disclosure describes and illustrates particular memory, this disclosure contemplates any suitable memory.
  • storage 2306 includes mass storage for data or instructions.
  • storage 2306 may include a hard disk drive (HDD), a floppy disk drive, flash memory, an optical disc, a magneto-optical disc, magnetic tape, or a Universal Serial Bus (USB) drive or a combination of two or more of these.
  • Storage 2306 may include removable or non-removable (or fixed) media, where appropriate.
  • Storage 2306 may be internal or external to computer system 2300, where appropriate.
  • storage 2306 is non-volatile, solid-state memory.
  • storage 2306 includes read-only memory (ROM).
  • this ROM may be mask-programmed ROM, programmable ROM (PROM), erasable PROM (EPROM), electrically erasable PROM (EEPROM), electrically alterable ROM (EAROM), or flash memory or a combination of two or more of these.
  • This disclosure contemplates mass storage 2306 taking any suitable physical form.
  • Storage 2306 may include one or more storage control units facilitating communication between processor 2302 and storage 2306, where appropriate. Where appropriate, storage 2306 may include one or more storages 2306. Although this disclosure describes and illustrates particular storage, this disclosure contemplates any suitable storage.
  • I/O interface 2308 includes hardware, software, or both, providing one or more interfaces for communication between computer system 2300 and one or more I/O devices.
  • Computer system 2300 may include one or more of these I/O devices, where appropriate.
  • One or more of these I/O devices may enable communication between a person and computer system 2300.
  • an I/O device may include a keyboard, keypad, microphone, monitor, mouse, printer, scanner, speaker, still camera, stylus, tablet, touch screen, trackball, video camera, another suitable I/O device or a combination of two or more of these.
  • An I/O device may include one or more sensors. This disclosure contemplates any suitable I/O devices and any suitable I/O interfaces 2308 for them.
  • I/O interface 2308 may include one or more device or software drivers enabling processor 2302 to drive one or more of these I/O devices.
  • I/O interface 2308 may include one or more I/O interfaces, where appropriate. Although this disclosure describes and illustrates a particular I/O interface, this disclosure contemplates any suitable I/O interface.
  • communication interface 2310 includes hardware, software, or both providing one or more interfaces for communication (such as, for example, packet-based communication) between computer system 2300 and one or more other computer systems or one or more networks.
  • communication interface 2310 may include a network interface controller (NIC) or network adapter for communicating with an Ethernet or other wire-based network or a wireless NIC (WNIC) or wireless adapter for communicating with a wireless network, such as a WI-FI network.
  • computer system 2300 may communicate with an ad hoc network, a personal area network (PAN), a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), or one or more portions of the Internet or a combination of two or more of these.
  • computer system 2300 may communicate with a wireless PAN (WPAN) (such as, for example, a BLUETOOTH WPAN), a WI-FI network, a WI-MAX network, a cellular telephone network (such as, for example, a Global System for Mobile Communications (GSM) network), or other suitable wireless network or a combination of two or more of these.
  • Computer system 2300 may include any suitable communication interface 2310 for any of these networks, where appropriate.
  • Communication interface 2310 may include one or more communication interfaces 2310, where appropriate.
  • bus 2312 includes hardware, software, or both coupling components of computer system 2300 to each other.
  • bus 2312 may include an Accelerated Graphics Port (AGP) or other graphics bus, an Enhanced Industry Standard Architecture (EISA) bus, a front-side bus (FSB), a HYPERTRANSPORT (HT) interconnect, an Industry Standard Architecture (ISA) bus, an INFINIBAND interconnect, a low-pin-count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCIe) bus, a serial advanced technology attachment (SATA) bus, a Video Electronics Standards Association local (VLB) bus, or another suitable bus or a combination of two or more of these.
  • Bus 2312 may include one or more buses 2312, where appropriate.
  • a computer-readable storage medium or media may include one or more semiconductor-based or other integrated circuits (ICs) (such as, for example, field-programmable gate arrays (FPGAs) or application-specific ICs (ASICs)), hard disk drives (HDDs), hybrid hard drives (HHDs), optical discs, optical disc drives (ODDs), magneto-optical discs, magneto-optical drives, floppy diskettes, floppy disk drives (FDDs), magnetic tapes, solid-state drives (SSDs), RAM-drives, SECURE DIGITAL cards or drives, any other suitable computer-readable storage media, or any suitable combination of two or more of these, where appropriate.
  • references in the appended claims to an apparatus or system or a component of an apparatus or system being adapted to, arranged to, capable of, configured to, enabled to, operable to, or operative to perform a particular function encompass that apparatus, system, or component, whether or not it or that particular function is activated, turned on, or unlocked, as long as that apparatus, system, or component is so adapted, arranged, capable, configured, enabled, operable, or operative. Additionally, although this disclosure describes or illustrates particular embodiments as providing particular advantages, particular embodiments may provide none, some, or all of these advantages.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

A method includes capturing, by a camera disposed behind a display panel of an electronic device, an original image through a semi-transparent pixel region of the display panel, and determining a depth position with respect to at least one object identified within the original image. The method further includes accessing, based on the depth position, a plurality of point spread functions (PSFs) corresponding to a plurality of lateral positions at the depth position, and generating a set of image patches based on the plurality of PSFs. Each image patch of the set of image patches is generated based on a different one of the plurality of PSFs. The method concludes by generating a reconstructed image corresponding to the original image based on the set of image patches.
PCT/KR2022/001920 2021-02-16 2022-02-08 Multiple point spread function based image reconstruction for a camera behind a display WO2022177223A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US17/176,535 2021-02-16
US17/176,535 US11721001B2 (en) 2021-02-16 2021-02-16 Multiple point spread function based image reconstruction for a camera behind a display

Publications (1)

Publication Number Publication Date
WO2022177223A1 (fr) 2022-08-25

Family

ID=82800488

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2022/001920 WO2022177223A1 (fr) 2022-02-08 Multiple point spread function based image reconstruction for a camera behind a display

Country Status (2)

Country Link
US (1) US11721001B2 (fr)
WO (1) WO2022177223A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20220029310A (ko) * 2020-08-31 2022-03-08 Samsung Electronics Co., Ltd. Image sensor, image acquisition apparatus including the image sensor, and operating method thereof
US11722796B2 (en) 2021-02-26 2023-08-08 Samsung Electronics Co., Ltd. Self-regularizing inverse filter for image deblurring
US11893668B2 (en) * 2021-03-31 2024-02-06 Leica Camera Ag Imaging system and method for generating a final digital image via applying a profile to image information

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080165261A1 (en) * 2007-01-09 2008-07-10 Yuji Kamo Imaging apparatus adapted to implement electrical image restoration processing
US20160277658A1 (en) * 2015-03-16 2016-09-22 Dae Kwan Kim Image signal processor and devices including the same
US20170076430A1 (en) * 2014-05-28 2017-03-16 Huawei Technologies Co., Ltd. Image Processing Method and Image Processing Apparatus
US20190355101A1 (en) * 2017-01-12 2019-11-21 Intel Corporation Image refocusing
US20210029336A1 (en) * 2019-07-26 2021-01-28 Samsung Electronics Company, Ltd. Processing images captured by a camera behind a display

Family Cites Families (93)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3971065A (en) 1975-03-05 1976-07-20 Eastman Kodak Company Color imaging array
TW510131B (en) 2000-05-24 2002-11-11 Chi Mei Electronic Corp Image input/output device
JP4389371B2 (ja) 2000-09-28 2009-12-24 Nikon Corporation Image restoration apparatus and image restoration method
AU2003217081A1 (en) * 2002-03-17 2003-09-29 Gareth Paul Bell Optimising point spread function of spatial filter
KR20050107741A (ko) * 2003-01-16 2005-11-15 D-Blur Technologies Ltd. Camera having an image enhancement function
US7053613B2 (en) 2004-06-03 2006-05-30 Fa-Hsuan Lin Method for parallel image reconstruction using automatic regularization
ES2272192B1 (es) 2005-10-14 2008-03-16 Consejo Superior Invet. Cientificas Blind deconvolution and super-resolution method for image sequences and sets, and applications thereof.
DE102005052061A1 (de) 2005-11-01 2007-05-16 Carl Zeiss Imaging Solutions G Method and device for image processing
EP1958151B1 (fr) 2005-11-10 2014-07-30 DigitalOptics Corporation International Image enhancement in the mosaic domain
US7724351B2 (en) 2006-01-30 2010-05-25 Asml Netherlands B.V. Lithographic apparatus, device manufacturing method and exchangeable optical element
US8036481B2 (en) * 2006-07-14 2011-10-11 Eastman Kodak Company Image processing apparatus and image restoration method and program
JP2008070566A (ja) 2006-09-13 2008-03-27 Matsushita Electric Ind Co Ltd Camera system, camera body, interchangeable lens unit, and image blur correction method
US7719719B2 (en) 2006-09-18 2010-05-18 Xerox Corporation Sharpening a halftoned image
US7796872B2 (en) * 2007-01-05 2010-09-14 Invensense, Inc. Method and apparatus for producing a sharp image from a handheld device containing a gyroscope
JP4930109B2 (ja) 2007-03-06 2012-05-16 Sony Corporation Solid-state imaging device and imaging apparatus
JP5188138B2 (ja) 2007-10-15 2013-04-24 Canon Inc. Optical apparatus having an image blur correction device
US8289438B2 (en) * 2008-09-24 2012-10-16 Apple Inc. Using distance/proximity information when applying a point spread function in a portable media device
WO2010081229A1 (fr) 2009-01-19 2010-07-22 The University Of British Columbia Multiplexed imaging
US20100188528A1 (en) 2009-01-28 2010-07-29 Kabushiki Kaisha Toshiba Image recording device, manufacturing apparatus of image recording device, and manufacturing method of image recording device
US8654234B2 (en) * 2009-07-26 2014-02-18 Massachusetts Institute Of Technology Bi-directional screen
US20110075257A1 (en) * 2009-09-14 2011-03-31 The Arizona Board Of Regents On Behalf Of The University Of Arizona 3-Dimensional electro-optical see-through displays
US8587703B2 (en) 2009-12-01 2013-11-19 Aptina Imaging Corporation Systems and methods for image restoration
JP2011128978A (ja) * 2009-12-18 2011-06-30 Sony Corp Information processing apparatus, information processing method, and program
KR101633397B1 (ko) 2010-03-12 2016-06-27 Samsung Electronics Co., Ltd. Image restoration apparatus, image restoration method, and image restoration system
KR101640456B1 (ко) * 2010-03-15 2016-07-19 Samsung Electronics Co., Ltd. Apparatus and method for capturing images through apertures in the pixels of a display panel
JP2011242616A (ja) * 2010-05-19 2011-12-01 Sony Corp Image display device, electronic apparatus, image display system, image acquisition method, and program
US9159270B2 (en) * 2010-08-31 2015-10-13 Dolby Laboratories Licensing Corporation Ambient black level
JP5635844B2 (ja) * 2010-09-06 2014-12-03 Canon Inc. Focus adjustment device and imaging apparatus
WO2012041492A1 (fr) 2010-09-28 2012-04-05 MAX-PLANCK-Gesellschaft zur Förderung der Wissenschaften e.V. Method and device for recovering a digital image from a sequence of observed digital images
KR101781533B1 (ko) 2010-12-23 2017-09-27 Samsung Display Co., Ltd. Image capturing apparatus and image capturing method thereof
CN102812715B (zh) 2011-01-27 2015-08-19 Panasonic Corporation Three-dimensional image capturing device and three-dimensional image capturing method
JP2013005258A (ja) 2011-06-17 2013-01-07 Panasonic Corp Blur correction device, blur correction method, and business form
US9338354B2 (en) 2011-10-03 2016-05-10 Nikon Corporation Motion blur estimation and restoration using light trails
KR101894391B1 (ко) 2011-10-05 2018-09-04 Samsung Electronics Co., Ltd. Diagnostic image generating apparatus, medical imaging system, and beamforming method
WO2013093916A1 (fr) 2011-12-21 2013-06-27 Xceed Imaging Ltd. Optical lens with halo reduction
KR101864452B1 (ко) * 2012-01-12 2018-06-04 Samsung Electronics Co., Ltd. Apparatus and method for image capture and video calls
JP5968073B2 (ja) 2012-05-18 2016-08-10 Canon Inc. Image processing apparatus, imaging apparatus, image processing method, and image processing program
US20130321686A1 (en) * 2012-06-01 2013-12-05 Kar-Han Tan Display-camera system with switchable diffuser
US8989447B2 (en) * 2012-08-13 2015-03-24 Texas Instruments Incorporated Dynamic focus for computational imaging
US9310843B2 (en) 2013-01-02 2016-04-12 Apple Inc. Electronic devices with light sensors and displays
DE112013004507T5 (de) * 2013-03-28 2015-12-31 Fujifilm Corporation Image processing device, image capture device, image processing method, program, and recording medium
KR101462351B1 (ко) * 2013-08-16 2014-11-14 Yeungnam University Industry-Academic Cooperation Foundation Video call apparatus enabling eye contact
WO2015146364A1 (fr) * 2014-03-28 2015-10-01 Fujifilm Corporation Image processing device, imaging device, image processing method, and program
CN104182727B (zh) 2014-05-16 2021-07-30 深圳印象认知技术有限公司 Ultra-thin fingerprint and palm-print acquisition device, and fingerprint and palm-print image acquisition method
US10018851B2 (en) 2014-08-14 2018-07-10 Yakov Soskind Optical field transformation methods and systems
JP6071974B2 (ja) 2014-10-21 2017-02-01 Canon Inc. Image processing method, image processing apparatus, imaging apparatus, and image processing program
US9940717B2 (en) * 2014-12-23 2018-04-10 Intel Corporation Method and system of geometric camera self-calibration quality assessment
CN104537620B (zh) 2014-12-30 2017-04-12 Huazhong University of Science and Technology Direction-adaptive image deblurring method
US9648236B2 (en) * 2015-02-19 2017-05-09 Blackberry Limited Device with a front facing camera having discrete focus positions
US10330566B2 (en) * 2015-03-05 2019-06-25 Eyenetra, Inc. Methods and apparatus for small aperture lensometer
WO2016154392A1 (fr) 2015-03-24 2016-09-29 University Of Florida Research Foundation, Inc. Optical privatizing device, system, and method of use
JP6347763B2 (ja) 2015-05-19 2018-06-27 Canon Inc. Image processing apparatus, imaging apparatus, image processing method, and image processing program
US10548473B2 (en) * 2015-06-23 2020-02-04 Essilor International Optometry measuring scale
US10178381B2 (en) * 2015-07-14 2019-01-08 Microsoft Technology Licensing, Llc Depth-spatial frequency-response assessment
KR20170009601A (ко) 2015-07-17 2017-01-25 Samsung Electronics Co., Ltd. Tomography apparatus and tomographic image processing method therefor
JP6503963B2 (ja) * 2015-07-29 2019-04-24 Omron Corporation Optical device
JP6608763B2 (ja) 2015-08-20 2019-11-20 Toshiba Corporation Image processing apparatus and imaging apparatus
KR101850871B1 (ко) 2015-08-26 2018-04-23 DRTECH Co., Ltd. Radiographic image processing method and radiographic imaging system
US10250782B2 (en) 2015-10-08 2019-04-02 Samsung Electro-Mechanics Co., Ltd. Camera module, electronic device, and method of operating the same using pre-estimated lens-customized point spread function (PSF)
US10416087B2 (en) 2016-01-01 2019-09-17 Kla-Tencor Corporation Systems and methods for defect detection using image reconstruction
KR102419624B1 (ко) 2016-01-21 2022-07-11 Samsung Electronics Co., Ltd. Sensor arrangement structure of an electronic device
US10191577B2 (en) 2016-02-16 2019-01-29 Samsung Electronics Co., Ltd. Electronic device
US9911208B2 (en) 2016-04-11 2018-03-06 Toshiba Medical Systems Corporation Apparatus and method of iterative image reconstruction using regularization-parameter control
US20170316552A1 (en) 2016-04-27 2017-11-02 Ramot At Tel-Aviv University Ltd. Blind image deblurring via progressive removal of blur residual
DE102016217785A1 (de) 2016-09-16 2018-03-22 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Optical arrangement for generating light-field distributions and method for operating an optical arrangement
US10217190B2 (en) 2016-12-27 2019-02-26 Kla-Tencor Corporation System and method for reconstructing high-resolution point spread functions from low-resolution inspection images
US10958841B2 (en) 2017-01-06 2021-03-23 Intel Corporation Integrated image sensor and display pixel
JP2020521174A (ja) * 2017-05-18 2020-07-16 Arizona Board of Regents on Behalf of the University of Arizona Multilayer high-dynamic-range head-mounted display
CN107290846B (zh) 2017-08-04 2019-06-21 Nanjing University of Science and Technology Quantitative phase microscopy imaging method based on annular programmable LED illumination
US10595724B2 (en) * 2017-08-04 2020-03-24 Ho Wa LAI Adaptor for an image capture device for fundus photography
JP6625144B2 (ja) 2018-01-05 2019-12-25 Canon Inc. Image processing method, and imaging apparatus, image processing apparatus, image processing program, storage medium, and lens apparatus using the same
CN108335268B (zh) 2018-01-05 2021-09-07 Guangxi Normal University Color image deblurring method based on blind deconvolution
US11073712B2 (en) 2018-04-10 2021-07-27 Apple Inc. Electronic device display for through-display imaging
EP3837584A4 (fr) * 2018-09-26 2021-09-08 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Method and device for recovering an image passing through a display
CN112888993B (zh) 2018-09-26 2023-05-05 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Imaging device and electronic device
JP6991957B2 (ja) * 2018-11-28 2022-01-13 Toshiba Corporation Image processing apparatus, imaging apparatus, and image processing method
US11038143B2 (en) 2018-12-06 2021-06-15 Samsung Display Co., Ltd. Display device and electronic device having the same
KR20200118266A (ко) 2019-04-03 2020-10-15 Samsung Display Co., Ltd. Display device and method of manufacturing the same
US11516374B2 (en) 2019-06-05 2022-11-29 Synaptics Incorporated Under-display image sensor
KR20200144193A (ко) 2019-06-17 2020-12-29 Samsung Display Co., Ltd. Display device
CN110675347B (zh) 2019-09-30 2022-05-06 Beijing University of Technology Blind image restoration method based on group sparse representation
US11893482B2 (en) * 2019-11-14 2024-02-06 Microsoft Technology Licensing, Llc Image restoration for through-display imaging
WO2021122471A1 (fr) 2019-12-17 2021-06-24 Testa Ilaria Imaging system
KR20210078129A (ко) 2019-12-18 2021-06-28 LG Display Co., Ltd. Display device
KR20210095771A (ко) 2020-01-23 2021-08-03 Samsung Display Co., Ltd. Display device
US20220292637A1 (en) * 2020-06-16 2022-09-15 Mayo Foundation For Medical Education And Research Methods for High Spatial and Temporal Resolution Ultrasound Imaging of Microvessels
EP4150562A4 (fr) 2020-07-02 2024-02-07 Samsung Electronics Co Ltd Electronic device and method for controlling an electronic device
KR20220014764A (ко) 2020-07-29 2022-02-07 Samsung Electronics Co., Ltd. Electronic device and method for generating training data for its artificial-intelligence learning model
KR20220029310A (ко) 2020-08-31 2022-03-08 Samsung Electronics Co., Ltd. Image sensor, image acquisition apparatus including the image sensor, and operating method thereof
KR20220028962A (ко) 2020-08-31 2022-03-08 Samsung Electronics Co., Ltd. Image enhancement method, image enhancement apparatus, and learning method and learning apparatus therefor
CN112202991B (zh) 2020-09-17 2022-05-27 江西欧迈斯微电子有限公司 Camera module, electronic device, optical element, and method for manufacturing the camera module
KR20220058143A (ко) 2020-10-30 2022-05-09 Samsung Electronics Co., Ltd. Image restoration method and apparatus
CN114331886A (zh) 2021-12-23 2022-04-12 Xi'an Technological University Image deblurring method based on deep features

Also Published As

Publication number Publication date
US11721001B2 (en) 2023-08-08
US20220261966A1 (en) 2022-08-18

Similar Documents

Publication Publication Date Title
WO2021020821A1 (fr) Processing images captured by a camera behind a display panel
WO2022177223A1 (fr) Multiple point spread function based image reconstruction for a camera behind a display
WO2017088127A1 (fr) Photographing method, photographing apparatus, and terminal
WO2014189193A1 (fr) Image display method, image display apparatus, and recording medium
WO2017189104A1 (fr) Fusion mask for creating a parallax effect between color and monochrome images for macrophotography
TW201724015A (zh) Imaging method, imaging device, and electronic device
US20140333751A1 (en) Image processing apparatus and system, method for processing image, and program
US10762664B2 (en) Multi-camera processor with feature matching
TW201404143A (zh) Dynamic camera mode switching
WO2019029386A1 (fr) Mobile terminal and imaging method therefor
US11164298B2 (en) Long exposure filter
WO2019029573A1 (fr) Image blurring method, computer-readable storage medium, and computing device
CN113850367A (zh) Network model training method, image processing method, and related devices
WO2020027902A1 (fr) Combined monochrome and chromatic camera sensor
US10063825B2 (en) Method of operating mobile device and mobile system
WO2022169139A2 (fr) Electronic apparatus and control method therefor
JP2023169254A (ja) Imaging element, method of operating an imaging element, program, and imaging system
WO2023063792A1 (fr) Method and apparatus for optical deconvolution
WO2018012704A2 (fr) Image processing device and method
JP2012222508A (ja) Image processing device and image processing program
US11722796B2 (en) Self-regularizing inverse filter for image deblurring
WO2022113866A1 (fr) Detection device, imaging device, detection method, and program
WO2023014130A1 (fr) Search automation for an improved screen structure for under-display camera systems
JP6073403B2 (ja) Image data generation method and image data generation device
CN107277370A (zh) Focusing method and apparatus, computer-readable storage medium, and mobile terminal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22756423

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22756423

Country of ref document: EP

Kind code of ref document: A1