CN114640763A - Under-screen camera sensor, image processing method, electronic device, and storage medium - Google Patents


Info

Publication number
CN114640763A
Authority
CN
China
Prior art keywords
raw image
image
pixel value
array
micro
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210192672.1A
Other languages
Chinese (zh)
Inventor
高昌军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jigan Technology Co., Ltd.
Original Assignee
Beijing Jigan Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jigan Technology Co., Ltd.
Priority to CN202210192672.1A
Publication of CN114640763A
Legal status: Pending

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof
    • G06T5/77
    • G06T5/80
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/84Camera processing pipelines; Components thereof for processing colour signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/40Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image

Abstract

Embodiments of the present application disclose an under-screen camera sensor, an image processing method, an electronic device, and a storage medium. The image processing method includes: acquiring an original RAW image captured by the under-screen camera sensor; extracting, from the original RAW image, a first pixel value for each pixel point corresponding to the first microlens units and generating a first RAW image from the first pixel values; extracting, from the original RAW image, a second pixel value for each pixel point corresponding to the second microlens units and generating a second RAW image from the second pixel values; and performing glare repair processing on a first region of the first RAW image based on the second RAW image to obtain a third RAW image, where the brightness value of each pixel point in the first region is greater than a first value.

Description

Under-screen camera sensor, image processing method, electronic device, and storage medium
Technical Field
The present disclosure relates to the field of machine vision technologies, and in particular, to an under-screen camera sensor, an image processing method, an electronic device, and a storage medium.
Background
In recent years, as electronic devices have become ubiquitous, user demand for full-screen displays has grown stronger. To enable front-facing photography on full-screen electronic devices, manufacturers have devised various schemes for the ordering and size of the screen pixels in the under-screen camera region. These schemes balance, to some extent, the display quality of the under-screen camera region against the amount of light reaching the camera, but problems remain: interference or diffraction of light by the screen pixels in the camera region produces a strong haze and irregular light spots, i.e., image glare.
Existing approaches mainly fuse multiple frames of different brightness and then run a single de-glare algorithm to mitigate glare. Such an algorithm, however, mainly improves overall haze and the glare caused by some surface light sources, linear light sources, and small point light sources; it cannot cover all light sources. For the sun or other strong light sources the improvement is small, and abnormal light bands are often introduced, so the de-glare effect is poor.
Disclosure of Invention
Embodiments of the present application provide an under-screen camera sensor, an image processing method, an electronic device, and a storage medium, to address the poor image de-glare performance of the prior art.
According to a first aspect of the present application, there is disclosed an under-screen camera sensor comprising a pixel array, a color filter array, and a microlens array;
the color filter array and the micro-lens array are both positioned above the pixel array, the micro-lens array comprises a first micro-lens unit and a second micro-lens unit, and the second micro-lens unit is an anti-glare micro-lens.
Optionally, as an embodiment, the color filter array is located above the pixel array and the microlens array is located above the color filter array; or,
the microlens array is located above the pixel array and the color filter array is located above the microlens array.
Optionally, as an embodiment, the color filter array includes multiple repeating sets of filter units, each set includes a plurality of different filter units, and each filter unit is covered by one microlens unit of the microlens array;
in each set of filter units, M filter units are covered by M first microlens units and N filter units are covered by N second microlens units, where M is greater than N and both are integers greater than 1;
the M filter units occupy the same positions in every set of filter units, and the N filter units occupy the same positions in every set of filter units.
Optionally, as an embodiment, each set of filter units is arranged in any one of several manners shown as matrix figures in the original filing; per the embodiments below, these include a quad-Bayer arrangement, an RCCC arrangement, and an arrangement combining red, green, blue, and brightness filter units.
Optionally, as an embodiment, when each set of filter units uses the quad-Bayer arrangement (R R G G / R R G G / G G B B / G G B B), M is 12 and N is 4;
the N filter units in each set of filter units are: R in row i, column j; G in row i, column j+2; G in row i+2, column j; and B in row i+2, column j+2, where 1 ≤ i ≤ 2 and 1 ≤ j ≤ 2.
According to a second aspect of the present application, an image processing method is disclosed for performing de-glare processing on RAW images acquired by the under-screen camera sensor of the first aspect, the method including:
acquiring an original RAW image acquired by the under-screen camera sensor;
extracting a first pixel value of each pixel point corresponding to a first micro-lens unit in the under-screen camera sensor from the original RAW image, and generating a first RAW image based on the first pixel value;
extracting a second pixel value of each pixel point corresponding to a second micro-lens unit in the under-screen camera sensor from the original RAW image, and generating a second RAW image based on the second pixel value, wherein the second micro-lens unit is an anti-glare micro-lens;
determining a first area in the first RAW image, wherein the brightness value of each pixel point in the first area is greater than a first numerical value;
and performing glare repair processing on the first region in the first RAW image based on the second RAW image to obtain a third RAW image.
Optionally, as an embodiment, the performing glare repair processing on the first region in the first RAW image based on the second RAW image to obtain a third RAW image includes:
determining a second region corresponding to the first region in the second RAW image;
for each pixel point Pi in the first area, performing weighted summation operation on the original pixel value of the Pi and the original pixel value of a pixel point Ri at a corresponding position in the second area to obtain a target pixel value of the Pi, wherein the weight value of the Ri is greater than that of the Pi;
for each pixel point Pj in the region outside the first region in the first RAW image, performing a weighted summation of the original pixel value of Pj and the original pixel value of the pixel point Rj at the corresponding position in the second RAW image to obtain a target pixel value of Pj, where the weight of Pj is greater than the weight of Rj;
and replacing the original pixel value of each Pi in the first RAW image with a corresponding target pixel value, and replacing the original pixel value of each Pj with a corresponding target pixel value to obtain a third RAW image.
Optionally, as an embodiment, the method further includes:
carrying out image reconstruction on the third RAW image to obtain a first full-color image;
and displaying the first full-color image on a photographing preview interface of the electronic equipment.
Optionally, as an embodiment, the method further includes:
storing the first RAW image and the second RAW image into a cache queue, wherein the cache queue stores a plurality of first RAW images and corresponding second RAW images with different exposure degrees in the same photographing scene;
when a photographing instruction for a target scene is received, reading a plurality of first RAW images with different exposure degrees and corresponding second RAW images in the target scene from the cache queue;
generating a fourth RAW image according to the first RAW images with different exposure degrees and the corresponding second RAW images under the target scene;
and carrying out image reconstruction on the fourth RAW image to obtain a second full-color image.
According to a third aspect of the present application, an image processing apparatus is disclosed for performing de-glare processing on RAW images acquired by the under-screen camera sensor of the first aspect, the apparatus including:
the acquisition module is used for acquiring an original RAW image acquired by the under-screen camera sensor;
the first generation module is used for extracting a first pixel value of each pixel point corresponding to a first micro-lens unit in the under-screen camera sensor from the original RAW image and generating a first RAW image based on the first pixel value;
a second generation module, configured to extract, from the original RAW image, a second pixel value of each pixel point corresponding to a second microlens unit in the under-screen camera sensor, and generate a second RAW image based on the second pixel values, where the second microlens unit is an anti-glare microlens;
a determining module, configured to determine a first region in the first RAW image, where a brightness value of each pixel in the first region is greater than a first numerical value;
and the repairing module is used for carrying out glare repairing treatment on the first area in the first RAW image based on the second RAW image to obtain a third RAW image.
Optionally, as an embodiment, the repair module includes:
a region determining sub-module, configured to determine a second region in the second RAW image, where the second region corresponds to the first region;
a first calculation submodule, configured to, for each pixel point Pi in the first region, perform a weighted summation of the original pixel value of Pi and the original pixel value of the pixel point Ri at the corresponding position in the second region to obtain a target pixel value of Pi, where the weight of Ri is greater than the weight of Pi;
a second calculation submodule, configured to, for each pixel point Pj in the region outside the first region in the first RAW image, perform a weighted summation of the original pixel value of Pj and the original pixel value of the pixel point Rj at the corresponding position in the second RAW image to obtain a target pixel value of Pj, where the weight of Pj is greater than the weight of Rj;
and the replacing submodule is used for replacing the original pixel value of each Pi in the first RAW image with a corresponding target pixel value and replacing the original pixel value of each Pj with a corresponding target pixel value to obtain a third RAW image.
Optionally, as an embodiment, the apparatus further includes:
the first reconstruction module is used for carrying out image reconstruction on the third RAW image to obtain a first full-color image;
and the display module is used for displaying the first full-color image on a photographing preview interface of the electronic equipment.
Optionally, as an embodiment, the apparatus further includes:
the storage module is used for storing the first RAW image and the second RAW image into a cache queue, wherein the cache queue stores a plurality of first RAW images and corresponding second RAW images with different exposure degrees in the same photographing scene;
the reading module is used for reading a plurality of first RAW images with different exposure degrees and corresponding second RAW images in a target scene from the cache queue when a photographing instruction for the target scene is received;
the third generation module is used for generating a fourth RAW image according to the first RAW image and the corresponding second RAW image with multiple different exposure degrees in the target scene;
and the second reconstruction module is used for carrying out image reconstruction on the fourth RAW image to obtain a second full-color image.
According to a fourth aspect of the present application, an electronic device is disclosed, including a memory, a processor, and a computer program stored on the memory, where the processor executes the computer program to implement the image processing method of the second aspect.
According to a fifth aspect of the present application, a computer readable storage medium is disclosed, having stored thereon a computer program/instructions which, when executed by a processor, implement the image processing method as in the second aspect.
According to a sixth aspect of the application, a computer program product is disclosed, comprising computer programs/instructions which, when executed by a processor, implement the image processing method as in the second aspect.
In the embodiments of the present application, the hardware of the under-screen camera sensor in a full-screen electronic device is improved: in addition to ordinary microlens units, the improved microlens array contains anti-glare microlens units. When a scene is captured, a normal RAW image is obtained through the ordinary microlens units and an anti-glare RAW image through the anti-glare microlens units, and both contain complete information about the scene. The anti-glare RAW image is then used to de-glare the normal RAW image, yielding a glare-free RAW image. Compared with the prior art, because the anti-glare RAW image has lower brightness than the normal RAW image and is free of abnormal glare and obvious haze, using it to de-glare the normal RAW image produces an image that retains complete scene information without glare. This addresses glare under a wide range of light sources, including strong light, and the de-glare effect is good.
Drawings
FIG. 1 is a block diagram of an under-screen camera sensor of one embodiment of the present application;
FIG. 2 is a block diagram of an under-screen camera sensor of another embodiment of the present application;
FIG. 3 is a first exemplary diagram of an arrangement of filter elements in a color filter array according to an embodiment of the present application;
FIG. 4 is a second exemplary diagram of an arrangement of filter elements in a color filter array according to an embodiment of the present application;
FIG. 5 is a third exemplary diagram of an arrangement of filter elements in a color filter array according to an embodiment of the present application;
FIG. 6 is an exemplary diagram of an arrangement of microlens elements in a microlens array according to one embodiment of the present application;
FIG. 7 is a flow diagram of an image processing method of an embodiment of the present application;
FIG. 8 is an exemplary diagram of an image processing method of an embodiment of the present application;
fig. 9 is a schematic configuration diagram of an image processing apparatus according to an embodiment of the present application;
fig. 10 is a block diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, the present application is described in further detail with reference to the accompanying drawings and the detailed description.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts, but those skilled in the art should understand that the embodiments are not limited by the described order of acts, as some steps may be performed in other orders or simultaneously. Further, those skilled in the art should also understand that the embodiments described in the specification are preferred embodiments, and the acts involved are not necessarily required by the embodiments of the present application.
In recent years, research into artificial-intelligence technologies such as computer vision, deep learning, machine learning, image processing, and image recognition has developed rapidly. Artificial Intelligence (AI) is an emerging science and technology that studies and develops theories, methods, techniques, and application systems for simulating and extending human intelligence. AI is a comprehensive discipline involving many technical fields, such as chips, big data, cloud computing, the Internet of Things, distributed storage, deep learning, machine learning, and neural networks. Computer vision, an important branch of AI, aims in particular to let machines perceive the world; computer vision technologies typically include face recognition, liveness detection, fingerprint recognition and anti-counterfeiting verification, biometric recognition, face detection, pedestrian detection, object detection, pedestrian recognition, image processing, image recognition, image semantic understanding, image retrieval, character recognition, video processing, video content recognition, behavior recognition, three-dimensional reconstruction, virtual reality, augmented reality, simultaneous localization and mapping (SLAM), computational photography, and robot navigation and positioning. With the research and progress of AI technology, it has been applied in many fields, such as security, city management, traffic management, building management, park management, face-based access, face-based attendance, logistics management, warehouse management, robots, intelligent marketing, computational photography, mobile-phone imaging, cloud services, smart homes, wearable devices, unmanned driving, autonomous driving, smart healthcare, face payment, face unlocking, fingerprint unlocking, identity verification, smart screens, smart televisions, cameras, the mobile Internet, live streaming, beauty filters, medical cosmetology, and intelligent temperature measurement.
Taking computational photography as an example: to balance the display quality of the under-screen camera region of a full-screen electronic device against the amount of light reaching the under-screen camera, manufacturers have devised various schemes for the arrangement and size of the screen pixels in the camera region. These schemes strike that balance to some extent, but problems remain: interference or diffraction of light by the screen pixels in the camera region produces a strong haze and irregular light spots, i.e., image glare.
Existing approaches mainly fuse multiple frames of different brightness and then run a single de-glare algorithm to mitigate glare. Such an algorithm, however, mainly improves overall haze and the glare caused by some surface light sources, linear light sources, and small point light sources; it cannot cover all light sources. For the sun or other strong light sources the improvement is small, and abnormal light bands are often introduced, so the de-glare effect is poor.
In order to solve the foregoing technical problem, an embodiment of the present application provides an off-screen camera sensor, an image processing method, an electronic device, and a storage medium.
First, the under-screen camera sensor of the embodiments of the present application is described below.
An embodiment of the present application provides an under-screen camera sensor, including: a pixel array, a color filter array, and a microlens array. The color filter array and the microlens array are both located above the pixel array; the microlens array includes first microlens units and second microlens units, and the second microlens units are anti-glare microlenses.
In some embodiments, as shown in FIG. 1, the color filter array in the under-screen camera sensor is located above the pixel array and the microlens array is located above the color filter array.
In other embodiments, as shown in FIG. 2, the microlens array in the under-screen camera sensor is positioned above the pixel array and the color filter array is positioned above the microlens array.
An under-screen camera sensor converts optical signals into electrical signals. Each photosensitive unit on the sensor is called a pixel point, and together the pixel points form the pixel array. The pixel value of each pixel point represents the light intensity sensed at that point, but carries no color information.
To capture color, a Color Filter Array (CFA) covers the surface of the pixel array. The color filter array contains multiple color filter units, each corresponding to one pixel point in the pixel array, and each passing light of only a single color to its pixel point. After this filtering, the pixel value of a pixel point represents the intensity of light of that specific color.
To converge the optical signal, a microlens array also covers the surface of the pixel array. The microlens array contains multiple microlens units, each corresponding to one color filter unit of the color filter array and one pixel point of the pixel array; that is, pixel points, color filter units, and microlens units are in one-to-one correspondence.
In the embodiments of the present application, the first microlens unit is an ordinary microlens, that is, a microlens without an anti-glare function. The second microlens unit may be a microlens with an anti-glare coating or a microlens made of an anti-glare material, that is, a microlens with an anti-glare function.
In some embodiments, the color filter array includes multiple repeating sets of filter units, each set includes a plurality of different filter units, and each filter unit is covered by one microlens unit of the microlens array. In each set of filter units, M filter units are covered by M first microlens units and N filter units are covered by N second microlens units, where M is greater than N and both are integers greater than 1. The M filter units occupy the same positions in every set, and the N filter units occupy the same positions in every set.
In some embodiments, each set of filter units may use any of several arrangements, shown as matrix figures in the original filing and illustrated in the examples below: a quad-Bayer arrangement, an RCCC arrangement, and an arrangement combining red, green, blue, and brightness filter units.
for ease of understanding, the description will be made in conjunction with the exemplary diagrams shown in fig. 3 to 6.
In one example, as shown in fig. 3, each set of filter units may use a quad-Bayer (four-cell Bayer) arrangement. Each set then contains 16 color filter units: 4 adjacent red filter units (denoted "R"), 8 green filter units in two blocks of 4 adjacent units (denoted "G"), and 4 adjacent blue filter units (denoted "B"). A red filter unit passes only red light, a green unit only green light, and a blue unit only blue light; in this arrangement the captured illumination information is 25% red, 50% green, and 25% blue.
In one example, as shown in fig. 4, each set of filter units may use an RCCC arrangement. Each set then contains 4 filter units: 1 red filter unit (denoted "R") and 3 brightness filter units (denoted "C"). The red filter unit passes only red light, while a brightness filter unit passes light of all wavelengths; in this arrangement the captured illumination information is 75% brightness and 25% red.
In one example, as shown in fig. 5, each set of filter units may use an arrangement containing red, green, blue, and brightness filter units, in which the captured illumination information is 50% brightness, 25% green, 12.5% blue, and 12.5% red.
In addition, in the embodiments of the present application, the color filter array may also use arrangements other than these three modes; this is not limited by the embodiments of the present application.
In some embodiments, when each set of filter units uses the quad-Bayer arrangement (R R G G / R R G G / G G B B / G G B B), M is 12 and N is 4. The N filter units in each set are: R in row i, column j; G in row i, column j+2; G in row i+2, column j; and B in row i+2, column j+2, where 1 ≤ i ≤ 2 and 1 ≤ j ≤ 2.
In one example, fig. 6 illustrates the arrangement of the microlens units corresponding to each set of filter units: the microlens units over filter units R, G, and B are first microlens units, and the microlens units over filter units R_deflare, G_deflare, and B_deflare are second microlens units.
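To make the layout concrete, the following is a minimal sketch, not part of the patent, written in Python with NumPy under the assumption that the anti-glare units sit at offset (i, j) = (1, 1); the helper names `deflare_mask` and `sensor_mask` are hypothetical.

```python
import numpy as np

# Quad-Bayer colors of one 4x4 set of filter units:
# each 2x2 single-color block holds R, G, G, or B.
GROUP_COLORS = np.array([
    ["R", "R", "G", "G"],
    ["R", "R", "G", "G"],
    ["G", "G", "B", "B"],
    ["G", "G", "B", "B"],
])

def deflare_mask(i: int = 1, j: int = 1) -> np.ndarray:
    """Boolean 4x4 mask flagging the N = 4 anti-glare units of a set:
    R at (i, j), G at (i, j+2), G at (i+2, j), B at (i+2, j+2),
    with 1 <= i <= 2 and 1 <= j <= 2 (1-based, per the patent)."""
    mask = np.zeros((4, 4), dtype=bool)
    r, c = i - 1, j - 1  # 0-based offset inside each 2x2 block
    for dr, dc in ((0, 0), (0, 2), (2, 0), (2, 2)):
        mask[r + dr, c + dc] = True
    return mask

def sensor_mask(sets_h: int, sets_w: int, i: int = 1, j: int = 1) -> np.ndarray:
    """Tile the per-set mask across the sensor; the anti-glare
    positions are identical in every set, as the embodiment requires."""
    return np.tile(deflare_mask(i, j), (sets_h, sets_w))
```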
Next, an image processing method provided in an embodiment of the present application is described.
Fig. 7 is a flowchart of an image processing method according to an embodiment of the present application, used to perform de-glare processing on RAW images acquired by the under-screen camera sensor described above. As shown in fig. 7, the method may include steps 701 to 705.
in step 701, a RAW image acquired by an off-screen camera sensor is acquired.
In this embodiment, the original RAW image is full-size: it contains one pixel point for every pixel point in the pixel array. For example, if the pixel array of the under-screen camera sensor contains 3000 × 4000 pixel points, the original RAW image also contains 3000 × 4000 pixel points.
In the embodiment of the application, the pixel value of each pixel point in the pixel array of the under-screen camera sensor can be acquired, and the original RAW image is generated according to the acquired pixel value and the arrangement mode of the pixel points in the pixel array.
In step 702, a first pixel value of each pixel point corresponding to a first microlens unit in the under-screen camera sensor is extracted from the original RAW image, and a first RAW image is generated based on the first pixel value.
In the embodiments of the present application, since the first microlens unit is an ordinary microlens, light reaches the pixel points of the under-screen camera sensor normally through the first microlens unit and the color filter unit. The first RAW image separated from the original RAW image is therefore a normal RAW image, but it may contain glare.
In step 703, a second pixel value of each pixel point corresponding to a second microlens unit in the under-screen camera sensor is extracted from the original RAW image, and a second RAW image is generated based on the second pixel values.
In the embodiments of the present application, since the second microlens unit is an anti-glare microlens, reflected and refracted stray light is filtered out as light passes through it, before reaching the pixel points of the under-screen camera sensor via the color filter unit. The second RAW image separated from the original RAW image is therefore an anti-glare RAW image (also called a glare-free RAW image): it has no glare, but its brightness is lower than that of the normal RAW image.
In the embodiments of the present application, since the original RAW image records all information about the photographed scene, the first RAW image and the second RAW image separated from it also contain all information about the scene.
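As a hedged sketch of steps 702 and 703 for the Fig. 6 layout, the code below separates the two RAW images, assuming one anti-glare pixel per 2×2 single-color block at intra-block offset (i−1, j−1) and using averaging as the three-in-one binning operator described later (the patent does not fix the operator; `split_raw` is a hypothetical name).

```python
import numpy as np

def split_raw(raw: np.ndarray, i: int = 1, j: int = 1):
    """Split a full-size RAW frame into the first (normal, binned)
    and second (anti-glare) RAW images for the Fig. 6 layout."""
    h, w = raw.shape
    raw = raw.astype(np.float32)
    # Each 2x2 single-color block holds one anti-glare pixel at
    # offset (i-1, j-1), so the second RAW image is a strided slice.
    second = raw[i - 1::2, j - 1::2]                      # (h/2, w/2)
    # First RAW image: three-in-one binning of the remaining three
    # pixels of each block (average used here as the bin operator).
    block_sums = raw.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))
    first = (block_sums - second) / 3.0                   # (h/2, w/2)
    return first, second
```

Because each 2×2 block is single-color, the binned first RAW image is an ordinary Bayer mosaic (R G / G B per set), and the two outputs are pixel-aligned, which the weighted repair below relies on.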
In step 704, a first region in the first RAW image is determined, wherein the brightness value of each pixel in the first region is greater than a first value.
In the embodiments of the present application, since glare mainly occurs in the highlight areas of an image, a highlight area of the first RAW image must be detected, that is, a first region in which the brightness value of every pixel point is greater than the first value.
In the embodiments of the present application, the first value may be set in advance; once set, it serves as the comparison threshold, and any existing image region detection method may be used to compare the brightness value of each pixel point in the first RAW image against the first value to determine the first region.
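A minimal sketch of step 704 under the assumption of a plain per-pixel comparison (the patent allows any region detection method, and the first value is a tunable threshold it does not fix; `first_region` is a hypothetical name):

```python
import numpy as np

def first_region(first_raw: np.ndarray, first_value: float) -> np.ndarray:
    """Boolean mask of the highlight ('first') region: pixel points
    whose brightness value exceeds the first value."""
    return first_raw > first_value
```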
In step 705, a glare repair process is performed on the first region in the first RAW image based on the second RAW image, so as to obtain a third RAW image.
In the embodiments of the present application, the second RAW image, which is free of glare, is used to perform glare repair on the highlight area of the first RAW image, which suffers from glare, yielding a third RAW image without glare.
In some embodiments, step 705 may specifically include the following steps (not shown in the figure): steps 7051 to 7054.
in step 7051, a second region corresponding to the first region in the second RAW image is determined.
In step 7052, for each pixel point Pi in the first area, a weighted sum operation is performed on the original pixel value of Pi and the original pixel value of the pixel point Ri at the corresponding position in the second area to obtain a target pixel value of Pi, where a weight value of Ri is greater than a weight value of Pi.
In step 7053, for each pixel point Pj in the region outside the first region in the first RAW image, a weighted summation is performed on the original pixel value of Pj and the original pixel value of the pixel point Rj at the corresponding position in the second RAW image to obtain a target pixel value of Pj, where the weight of Pj is greater than the weight of Rj.
In step 7054, the original pixel value of each Pi in the first RAW image is replaced with a corresponding target pixel value, and the original pixel value of each Pj is replaced with a corresponding target pixel value, so as to obtain a third RAW image.
For ease of understanding, the image de-glare process will be described by taking the arrangement of the microlens arrays shown in fig. 6 and the RAW image shown in fig. 8 as examples.
Suppose the microlens array of the under-screen camera sensor is arranged as in fig. 6 and an original RAW image of size 8 × 8 (four 4 × 4 sets of filter units) is obtained from the sensor. The pixel values of the pixel points corresponding to filter units R_deflare, G_deflare, and B_deflare are extracted from the original RAW image to generate the 4 × 4 second RAW image shown in fig. 8. The pixel values of the pixel points corresponding to filter units R, G, and B are also extracted: taking the three pixel points under the three R filter units of a set as an example, their values are binned three-in-one into a single R pixel value; likewise, three-in-one binning of each block of three G pixel points and of the three B pixel points yields two G pixel values and one B pixel value per set. Finally, the 4 × 4 first RAW image shown in fig. 8 is generated.
During glare repair, a weighted summation is performed for each pixel point in the first region (Pi1, Pi2, Pi3, Pi4); for example, the original pixel value of Pi1 and the original pixel value of the pixel point Ri1 in the second region give the target pixel value of Pi1, and the target pixel values of Pi2 to Pi4 are computed in the same way.
For each pixel point in the region outside the first region in the first RAW image (Pj1 through Pj12), the original pixel value of Pj1 and the original pixel value of Rj1 in the second RAW image are weighted-summed to obtain the target pixel value of Pj1; the target pixel values of Pj2 to Pj12 are computed in the same way.
Then the original pixel values of Pi1 to Pi4 in the first RAW image are replaced with the target pixel values of Pi1 to Pi4, and the original pixel values of Pj1 to Pj12 are replaced with the target pixel values of Pj1 to Pj12, finally yielding the third RAW image.
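Putting steps 7051 to 7054 together, a sketch of the repair follows; the weight 0.8 is an illustrative assumption, since the patent only requires the anti-glare pixel Ri to outweigh Pi inside the first region and the normal pixel Pj to outweigh Rj outside it (`glare_repair` is a hypothetical name; it reuses `first_region` and NumPy from the sketches above).

```python
def glare_repair(first_raw: np.ndarray,
                 second_raw: np.ndarray,
                 region: np.ndarray,
                 w: float = 0.8) -> np.ndarray:
    """Steps 7051-7054: weighted sums with the dominant weight w on
    the anti-glare pixel inside the region and on the normal pixel
    outside it, then replacement of every original pixel value."""
    assert 0.5 < w <= 1.0, "the dominant weight must exceed the other"
    inside = w * second_raw + (1.0 - w) * first_raw    # Ri outweighs Pi
    outside = w * first_raw + (1.0 - w) * second_raw   # Pj outweighs Rj
    return np.where(region, inside, outside)           # third RAW image

# For example:
# third = glare_repair(first, second, first_region(first, first_value))
```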
It can be seen from the above that, in this embodiment, the hardware of the under-screen camera sensor in a full-screen electronic device is improved: in addition to ordinary microlens units, the improved microlens array contains anti-glare microlens units. When a scene is captured, a normal RAW image is obtained through the ordinary microlens units and an anti-glare RAW image through the anti-glare microlens units, and both contain complete information about the scene. The anti-glare RAW image is then used to de-glare the normal RAW image, yielding a glare-free RAW image. Compared with the prior art, because the anti-glare RAW image has lower brightness than the normal RAW image and is free of abnormal glare and obvious haze, using it to de-glare the normal RAW image produces an image that retains complete scene information without glare. This addresses glare under a wide range of light sources, including strong light, and the de-glare effect is good.
In the prior art, the de-glare algorithms introduced by device manufacturers are mainly applied during photographing; no corresponding algorithm runs during photo preview. As a result, the image shown in the preview interface differs noticeably from the actually captured image (mainly in that reflection removal, defogging, and similar effects in the preview are much worse than in the captured photo).
In order to solve the above technical problem, in another embodiment provided by the present application, the following steps may also be added on the basis of the embodiment of the method shown in fig. 7:
carrying out image reconstruction on the third RAW image to obtain a first full-color image; and displaying the first full-color image on a photo preview interface of the electronic equipment.
In the embodiments of the present application, glare has already been removed from the third RAW image, but the third RAW image is still in mosaic form. To obtain a full-color image from the incomplete color samples, image reconstruction must be performed on them; that is, the third RAW image must be demosaiced.
In practical applications, any existing demosaicing method may be used on the third RAW image to obtain the corresponding first full-color image; the image format of the first full-color image may be YUV.
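Since the binned first/third RAW image of this layout is an ordinary Bayer mosaic, a stock demosaicer suffices. The sketch below assumes OpenCV as a dependency and 8-bit data; the exact `cv2.COLOR_Bayer*` code depends on the sensor's actual channel order, and `reconstruct_preview` is a hypothetical name.

```python
import cv2
import numpy as np

def reconstruct_preview(third_raw: np.ndarray) -> np.ndarray:
    """Demosaic the third RAW image and convert it to YUV for the
    photo preview interface (assumes an RGGB-style 8-bit mosaic)."""
    mosaic = np.clip(third_raw, 0, 255).astype(np.uint8)
    bgr = cv2.cvtColor(mosaic, cv2.COLOR_BayerBG2BGR)  # demosaic
    return cv2.cvtColor(bgr, cv2.COLOR_BGR2YUV)        # preview format
```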
Thus, in the embodiments of the present application, the corresponding algorithm is applied during photo preview to improve the effect, and the same algorithm is reused when photographing, so the preview effect is consistent with the photographing effect; and because no dedicated de-glare algorithm module is added, the processing time of the overall algorithm flow is reduced.
To shorten image processing during photographing and improve image processing efficiency, the first RAW image and second RAW image generated earlier may be reused at capture time. Accordingly, in a further embodiment provided by the present application, the following steps may be added on the basis of the method embodiment shown in fig. 7:
storing the first RAW image and the second RAW image into a cache queue, wherein the cache queue stores the first RAW images and the corresponding second RAW images with different exposure degrees under the same photographing scene;
when a photographing instruction for a target scene is received, reading a plurality of first RAW images with different exposure degrees and corresponding second RAW images in the target scene from a buffer queue;
generating a fourth RAW image according to the first RAW images with different exposure degrees and the corresponding second RAW images under the target scene;
and performing image reconstruction on the fourth RAW image to obtain a second full-color image.
In the embodiment of the present application, the fourth RAW image may be a RAW image which is free of glare and has an HDR effect.
In this embodiment of the application, the second full-color image is a color image corresponding to the target scene, and the format of the second full-color image may be a JPEG format.
In one example, the target scene corresponds to three sets of RAW images with different exposure levels, which are: (first RAW image at first exposure level, second RAW image at first exposure level), (first RAW image at second exposure level, second RAW image at second exposure level), and (first RAW image at third exposure level, second RAW image at third exposure level).
In some embodiments, referring to step 705 in the embodiment shown in fig. 7, a fifth RAW image at the first exposure level may be generated based on the first RAW image at the first exposure level and the second RAW image at the first exposure level; similarly, a sixth RAW image at the second exposure degree is generated based on the first RAW image at the second exposure degree and the second RAW image at the second exposure degree; generating a seventh RAW image at the third exposure degree based on the first RAW image at the third exposure degree and the second RAW image at the third exposure degree; thereafter, a fourth RAW image is generated based on the fifth, sixth, and seventh RAW images.
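As a sketch of the fourth-RAW generation just described: each exposure's pair is de-glared into the fifth, sixth, and seventh RAW images, which are then merged. A plain average stands in for the HDR merge, whose operator the patent leaves unspecified; `fourth_raw` is a hypothetical name and the helpers come from the earlier sketches.

```python
import numpy as np

def fourth_raw(pairs, first_value: float) -> np.ndarray:
    """pairs: [(first_raw, second_raw), ...] read from the cache
    queue, one pair per exposure level of the target scene."""
    deglared = [glare_repair(f, s, first_region(f, first_value))
                for f, s in pairs]                # fifth/sixth/seventh RAW
    return np.mean(deglared, axis=0)              # stand-in HDR merge
```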
In other embodiments, all of the first RAW image and part of the second RAW image may be selected from three sets of RAW images corresponding to different exposure degrees of the target scene, and the fourth RAW image may be generated based on all of the first RAW image and part of the second RAW image.
Therefore, in the embodiment of the application, when actual shooting is performed, the first RAW image and the second RAW image which are generated before can be multiplexed, so that the time length of image processing in the shooting process is reduced, and the image processing efficiency is improved.
Fig. 9 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application, and as shown in fig. 9, an image processing apparatus 900 may include: an obtaining module 901, a first generating module 902, a second generating module 903, a determining module 904, and a repairing module 905, wherein,
an obtaining module 901, configured to obtain an original RAW image acquired by the off-screen camera sensor;
a first generating module 902, configured to extract, from the original RAW image, a first pixel value of each pixel point corresponding to a first microlens unit in the under-screen camera sensor, and generate a first RAW image based on the first pixel value;
a second generating module 903, configured to extract, from the original RAW image, a second pixel value of each pixel point corresponding to a second microlens unit in the off-screen camera sensor, and generate a second RAW image based on the second pixel value, where the second microlens unit is an anti-glare microlens;
a determining module 904, configured to determine a first region in the first RAW image, where a brightness value of each pixel in the first region is greater than a first numerical value;
a repairing module 905, configured to perform glare repairing on the first region in the first RAW image based on the second RAW image, to obtain a third RAW image.
It can be seen from the above that, in this embodiment, the hardware of the under-screen camera sensor in a full-screen electronic device is improved: in addition to ordinary microlens units, the improved microlens array contains anti-glare microlens units, so that when a scene is captured, a normal RAW image is obtained through the ordinary microlens units and an anti-glare RAW image through the anti-glare microlens units, both containing complete information about the scene; the anti-glare RAW image is then used to de-glare the normal RAW image, yielding a glare-free RAW image. Compared with the prior art, because the anti-glare RAW image has lower brightness than the normal RAW image and is free of abnormal glare and obvious haze, using it to de-glare the normal RAW image produces an image that retains complete scene information without glare. This addresses glare under a wide range of light sources, including strong light, and the de-glare effect is good.
Optionally, as an embodiment, the repair module 905 may include:
a region determination sub-module, configured to determine a second region in the second RAW image, where the second region corresponds to the first region;
a first calculation submodule, configured to, for each pixel point Pi in the first region, perform a weighted summation of the original pixel value of Pi and the original pixel value of the pixel point Ri at the corresponding position in the second region to obtain a target pixel value of Pi, where the weight of Ri is greater than the weight of Pi;
a second calculation submodule, configured to, for each pixel point Pj in the region outside the first region in the first RAW image, perform a weighted summation of the original pixel value of Pj and the original pixel value of the pixel point Rj at the corresponding position in the second RAW image to obtain a target pixel value of Pj, where the weight of Pj is greater than the weight of Rj;
and the replacing submodule is used for replacing the original pixel value of each Pi in the first RAW image with a corresponding target pixel value and replacing the original pixel value of each Pj with a corresponding target pixel value to obtain a third RAW image.
Optionally, as an embodiment, the image processing apparatus 900 may further include:
the first reconstruction module is used for carrying out image reconstruction on the third RAW image to obtain a first full-color image;
and the display module is used for displaying the first full-color image on a photographing preview interface of the electronic equipment.
Optionally, as an embodiment, the image processing apparatus 900 may further include:
the storage module is used for storing the first RAW image and the second RAW image into a cache queue, wherein the cache queue stores a plurality of first RAW images and corresponding second RAW images with different exposure degrees in the same photographing scene;
the reading module is used for reading a plurality of first RAW images with different exposure degrees and corresponding second RAW images in a target scene from the cache queue when a photographing instruction for the target scene is received;
the third generation module is used for generating a fourth RAW image according to the first RAW images with different exposure degrees and the corresponding second RAW images under the target scene;
and the second reconstruction module is used for carrying out image reconstruction on the fourth RAW image to obtain a second full-color image.
Each step, and each specific operation within a step, of the image processing method embodiments provided by the present application may be performed by the corresponding module of the image processing apparatus. For the specific operations performed by each module of the image processing apparatus, refer to the corresponding descriptions in the image processing method embodiments.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
Fig. 10 is a block diagram of an electronic device according to an embodiment of the present application. The electronic device includes a processing component 1022 that further includes one or more processors, and memory resources, represented by memory 1032, for storing instructions, such as application programs, that are executable by the processing component 1022. The application programs stored in memory 1032 may include one or more modules that each correspond to a set of instructions. Further, the processing component 1022 is configured to execute instructions to perform the above-described methods.
The electronic device may also include a power supply component 1026 configured to perform power management for the electronic device, a wired or wireless network interface 1050 configured to connect the electronic device to a network, and an input/output (I/O) interface 1058. The electronic device may operate based on an operating system stored in memory 1032, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
According to yet another embodiment of the present application, there is also provided a computer-readable storage medium having stored thereon a computer program/instructions which, when executed by a processor, implement the steps in the image processing method according to any one of the above-mentioned embodiments.
According to yet another embodiment of the present application, there is also provided a computer program product comprising computer programs/instructions which, when executed by a processor, implement the steps in the image processing method according to any one of the above embodiments.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one of skill in the art, embodiments of the present application may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and so forth) having computer-usable program code embodied therein.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present application have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including the preferred embodiment and all changes and modifications that fall within the true scope of the embodiments of the present application.
Finally, it should also be noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another entity or action, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or terminal that comprises the element.
The under-screen camera sensor, image processing method, electronic device, and storage medium provided by the present application have been described in detail above. Specific examples are used herein to explain the principles and implementation of the present application, and the description of the above embodiments is only intended to help understand the method and core idea of the present application. Meanwhile, those skilled in the art may make changes to the specific implementation and application scope according to the idea of the present application. In summary, the contents of this specification should not be construed as limiting the present application.

Claims (12)

1. An under-screen camera sensor, characterized by comprising a pixel array, a color filter array, and a microlens array;
the color filter array and the micro-lens array are both positioned above the pixel array, the micro-lens array comprises a first micro-lens unit and a second micro-lens unit, and the second micro-lens unit is an anti-glare micro-lens.
2. The under-screen camera sensor of claim 1, wherein
the color filter array is located above the pixel array, and the microlens array is located above the color filter array; or,
the microlens array is located above the pixel array, and the color filter array is located above the microlens array.
3. The under-screen camera sensor of claim 1 or 2, wherein the color filter array comprises multiple repeating sets of filter units, each set comprises a plurality of different filter units, and each filter unit is covered by one microlens unit of the microlens array;
in each set of filter units, M filter units are covered by M first microlens units and N filter units are covered by N second microlens units, wherein M is greater than N and both are integers greater than 1;
the M filter units occupy the same positions in every set of filter units, and the N filter units occupy the same positions in every set of filter units.
4. The under-screen camera sensor of claim 3, wherein each group of filter units is arranged in any one of the following ways:
[arrangement matrix shown in original figure FDA0003524925680000011] and [arrangement matrix shown in original figure FDA0003524925680000012].
5. The under-screen camera sensor of claim 4, characterized in that, in the case where each group of filter units is arranged in the mode of the arrangement matrix shown in original figure FDA0003524925680000021, M is 12 and N is 4;
the N filter units in each group of filter units include: an R unit in the ith row and jth column, a G unit in the ith row and (j+2)th column, a G unit in the (i+2)th row and jth column, and a B unit in the (i+2)th row and (j+2)th column, where 1 ≤ i ≤ 2 and 1 ≤ j ≤ 2.
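By way of illustration only (not part of the claims), a minimal Python sketch of how the positions recited in claim 5 tile across a sensor: the four anti-glare units of each 4×4 group sit at rows {i, i+2} and columns {j, j+2}, 1-indexed within the group. The function name, the NumPy representation and the default i and j are assumptions:

```python
import numpy as np

def anti_glare_mask(height, width, i=1, j=1):
    """Boolean mask of pixels lying under anti-glare (second) micro-lens
    units, for the 4x4 filter group of claim 5 (M = 12, N = 4).

    i and j are the 1-indexed in-group coordinates of claim 5
    (1 <= i, j <= 2); the defaults here are illustrative assumptions.
    """
    mask = np.zeros((height, width), dtype=bool)
    for r in (i - 1, i + 1):         # 0-indexed rows i and i+2 of each group
        for c in (j - 1, j + 1):     # 0-indexed columns j and j+2 of each group
            mask[r::4, c::4] = True  # repeat the pattern over every 4x4 group
    return mask
```

With i = j = 1, each 4×4 group is marked at its 1st and 3rd rows and columns, i.e. one R, two G and one B anti-glare unit per group, consistent with M = 12 and N = 4.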
6. An image processing method for performing glare removal processing on RAW images acquired by the under-screen camera sensor according to any one of claims 1 to 5, the method comprising:
acquiring an original RAW image acquired by the under-screen camera sensor;
extracting a first pixel value of each pixel point corresponding to a first micro-lens unit in the under-screen camera sensor from the original RAW image, and generating a first RAW image based on the first pixel value;
extracting a second pixel value of each pixel point corresponding to a second micro-lens unit in the under-screen camera sensor from the original RAW image, and generating a second RAW image based on the second pixel value, wherein the second micro-lens unit is an anti-glare micro-lens;
determining a first area in the first RAW image, wherein the brightness value of each pixel point in the first area is greater than a first numerical value;
and performing glare repair processing on the first area in the first RAW image based on the second RAW image to obtain a third RAW image.
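Again for illustration only, a sketch of the extraction and region-determination steps of claim 6, reusing anti_glare_mask() above. The claim does not specify how each sub-image is completed to full resolution, so the horizontal-neighbour fill and the threshold value are assumptions:

```python
import numpy as np

def split_raw(raw, mask):
    """Build the first RAW image (normal micro-lens pixels) and the second
    RAW image (anti-glare micro-lens pixels) from one sensor readout.
    Holes in each sub-image are filled from the pixel to the left -- an
    illustrative stand-in for whatever interpolation the pipeline uses."""
    neighbor = np.roll(raw, 1, axis=1)     # value of the pixel to the left
    first = np.where(mask, neighbor, raw)  # keep normal pixels, fill the rest
    second = np.where(mask, raw, neighbor) # keep anti-glare pixels, fill the rest
    return first, second

def first_region(first_raw, first_value=200):
    """First area of claim 6: pixels whose brightness exceeds the
    'first numerical value' (the threshold here is assumed)."""
    return first_raw > first_value
```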
7. The method according to claim 6, wherein the performing glare repair processing on the first region in the first RAW image based on the second RAW image to obtain a third RAW image comprises:
determining a second region in the second RAW image corresponding to the first region;
for each pixel point Pi in the first area, performing a weighted summation of the original pixel value of Pi and the original pixel value of the pixel point Ri at the corresponding position in the second area to obtain a target pixel value of Pi, wherein the weight of Ri is greater than that of Pi;
for each pixel point Pj of the first RAW image outside the first area, performing a weighted summation of the original pixel value of Pj and the original pixel value of the pixel point Rj at the corresponding position in the second RAW image to obtain a target pixel value of Pj, wherein the weight of Pj is greater than that of Rj;
and replacing the original pixel value of each Pi in the first RAW image with its target pixel value, and the original pixel value of each Pj with its target pixel value, to obtain the third RAW image.
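A minimal sketch of the weighted summation of claim 7. The claim fixes only the ordering of the weights, not their values, so the 0.8/0.2 split below is an assumption, as are the names:

```python
import numpy as np

def repair_glare(first_raw, second_raw, region, w_glare=0.8, w_clear=0.2):
    """Weighted fusion of claim 7: inside the glare region the anti-glare
    image dominates (weight of Ri > weight of Pi); outside it the normal
    image dominates (weight of Pj > weight of Rj)."""
    f = first_raw.astype(np.float32)
    s = second_raw.astype(np.float32)
    third = np.where(region,
                     w_glare * s + (1 - w_glare) * f,   # Ri weighted above Pi
                     w_clear * s + (1 - w_clear) * f)   # Pj weighted above Rj
    return third.astype(first_raw.dtype)
```

Pixels inside the glare region thus lean on the anti-glare samples, while the rest of the frame keeps the normal samples as its dominant contribution.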
8. The method of claim 6, further comprising:
carrying out image reconstruction on the third RAW image to obtain a first full-color image;
and displaying the first full-color image on a photographing preview interface of the electronic device.
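The "image reconstruction" of claim 8 corresponds to ordinary RAW-to-full-color processing. A one-line OpenCV sketch, assuming an 8-bit Bayer-RG mosaic purely for demonstration (the sensor's actual filter arrangement differs, and the claim names no demosaicing algorithm):

```python
import cv2

def reconstruct(third_raw):
    # Demosaic a single-channel RAW frame into a full-color BGR image.
    # The Bayer-RG layout is an assumption made for illustration only.
    return cv2.cvtColor(third_raw, cv2.COLOR_BayerRG2BGR)
```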
9. The method of claim 6, further comprising:
storing the first RAW image and the second RAW image into a cache queue, wherein the cache queue stores a plurality of first RAW images and corresponding second RAW images with different exposure degrees in the same photographing scene;
when a photographing instruction for a target scene is received, reading a plurality of first RAW images with different exposure degrees and corresponding second RAW images in the target scene from the cache queue;
generating a fourth RAW image according to the first RAW images with different exposure degrees and the corresponding second RAW images under the target scene;
and carrying out image reconstruction on the fourth RAW image to obtain a second full-color image.
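A hedged sketch of the cache queue of claim 9. The claim specifies neither the queue length nor how the fourth RAW image is generated from the cached frames, so the exposure-normalised average below is only a placeholder, and all names are assumptions:

```python
from collections import deque
import numpy as np

class ExposureCache:
    """Per-scene cache of pairs of first/second RAW images captured at
    different exposure degrees, as described in claim 9."""

    def __init__(self, maxlen=6):
        self.queue = deque(maxlen=maxlen)  # oldest frames drop out automatically

    def push(self, exposure, first_raw, second_raw):
        self.queue.append((exposure, first_raw, second_raw))

    def fuse(self):
        """Placeholder fusion into a 'fourth RAW image': normalise each
        cached first RAW image by its exposure, then average."""
        if not self.queue:
            raise ValueError("no cached frames for this scene")
        stacked = [f.astype(np.float32) / e for e, f, _ in self.queue]
        return np.mean(stacked, axis=0)
```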
10. An electronic device comprising a memory, a processor and a computer program stored on the memory, characterized in that the processor executes the computer program to implement the method of any of claims 6-9.
11. A computer-readable storage medium having a computer program/instructions stored thereon, characterized in that the computer program/instructions, when executed by a processor, implement the method of any one of claims 6-9.
12. A computer program product comprising a computer program/instructions, characterized in that the computer program/instructions, when executed by a processor, implement the method of any one of claims 6-9.
CN202210192672.1A 2022-02-28 2022-02-28 Off-screen camera sensor, image processing method, electronic device, and storage medium Pending CN114640763A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210192672.1A CN114640763A (en) 2022-02-28 2022-02-28 Off-screen camera sensor, image processing method, electronic device, and storage medium


Publications (1)

Publication Number Publication Date
CN114640763A 2022-06-17

Family

ID=81947191

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210192672.1A Pending CN114640763A (en) 2022-02-28 2022-02-28 Off-screen camera sensor, image processing method, electronic device, and storage medium

Country Status (1)

Country Link
CN (1) CN114640763A (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013258602A (en) * 2012-06-13 2013-12-26 Olympus Corp Image pick-up device
CN205647732U (en) * 2015-05-27 2016-10-12 半导体元件工业有限责任公司 Image sensor and image sensor system
US20170070689A1 (en) * 2015-09-08 2017-03-09 Apple Inc. Automatic compensation of lens flare
CN110197832A (en) * 2018-02-26 2019-09-03 爱思开海力士有限公司 Imaging sensor including partition pattern
CN211744556U (en) * 2019-09-27 2020-10-23 深圳传音控股股份有限公司 Camera module and terminal adopting same
CN113826375A (en) * 2019-05-22 2021-12-21 索尼半导体解决方案公司 Light receiving device, solid-state imaging apparatus, electronic apparatus, and information processing system
CN113866854A (en) * 2021-11-10 2021-12-31 深圳市雕拓科技有限公司 Transparent optical element
JP2022011504A (en) * 2020-06-30 2022-01-17 凸版印刷株式会社 Solid state image sensor
CN114223191A (en) * 2019-09-17 2022-03-22 脸谱科技有限责任公司 Polarization capture apparatus, system and method
CN116057951A (en) * 2020-07-31 2023-05-02 株式会社半导体能源研究所 Imaging device, electronic apparatus, and moving object



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination