CN114079731A - Image processing method, image processing device, storage medium and terminal - Google Patents

Image processing method, image processing device, storage medium and terminal

Info

Publication number
CN114079731A
CN114079731A (application number CN202010841170.8A)
Authority
CN
China
Prior art keywords
display screen
diffraction
inverse operation
screen assembly
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010841170.8A
Other languages
Chinese (zh)
Inventor
李志林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010841170.8A priority Critical patent/CN114079731A/en
Publication of CN114079731A publication Critical patent/CN114079731A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/71Circuitry for evaluating the brightness variation

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application discloses an image processing method, an image processing device, a storage medium and a terminal, and relates to the technical field of image processing. First, a target image shot by a camera assembly is acquired, wherein the target image includes a diffraction image area; then an inverse operation result corresponding to the display screen assembly is obtained, wherein the inverse operation result is obtained by performing an inverse operation on the diffraction model of the diffraction occurring at the display screen assembly; finally, the diffraction image area in the target image is eliminated based on the inverse operation result. Because light is diffracted when passing through the display screen assembly and forms a diffraction image area in the target image, performing the inverse operation on the diffraction model of the display screen assembly allows the diffraction image area in the target image to be effectively eliminated according to the inverse operation result, effectively improving the photographing effect of the camera assembly below the display screen assembly.

Description

Image processing method, image processing device, storage medium and terminal
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method, an image processing apparatus, a storage medium, and a terminal.
Background
With the development of science and technology, various terminals have appeared in people's lives. An important function of a terminal is photographing, so improving the quality of the images obtained during photographing has become one of the important research topics in the field.
In the related art, in order to achieve a high screen-to-body ratio, some types of terminals place the camera below the screen. Since light-emitting elements, electrodes and driving circuits are disposed in the screen, a diffraction effect occurs when external light passes through the screen, so that the shooting effect of the camera below the screen is poor.
Disclosure of Invention
The application provides an image processing method, an image processing device, a storage medium and a terminal, which can solve the technical problem in the related art that the diffraction effect occurring when external light passes through a screen makes the photographing effect of a camera below the screen poor.
In a first aspect, an embodiment of the present application provides an image processing method, which is applied to a display screen assembly, where a camera assembly is disposed below the display screen assembly, and the method includes:
acquiring a target image shot by the camera assembly, wherein the target image comprises a diffraction image area;
obtaining an inverse operation result corresponding to the display screen assembly, wherein the inverse operation result is obtained by performing an inverse operation on the diffraction model of the diffraction occurring at the display screen assembly;
and eliminating the diffraction image area in the target image based on the inverse operation result.
Optionally, before acquiring the target image captured by the camera assembly, the method further includes: acquiring the diffraction models of a plurality of display screen assemblies; performing an inverse operation on each diffraction model to obtain the corresponding inverse operation results; and storing each inverse operation result in a database in correspondence with the display screen identifier of each display screen assembly.
Optionally, the acquiring of the diffraction models of the display screen assemblies includes: according to the unit transmittance function and the unit period of the display screen assembly, obtaining the transmittance function of the display screen assembly as:

$$ t(x) = t_{\mathrm{unit}}(x) \otimes \frac{1}{d}\,\mathrm{comb}\!\left(\frac{x}{d}\right) $$

wherein the transmittance function of each unit in the display screen assembly is $t_{\mathrm{unit}}(x)$, the unit period is $d$, $\mathrm{comb}(x)$ is a one-dimensional comb function, and $\otimes$ denotes convolution.
Optionally, the acquiring of the diffraction models of the display screen assemblies further includes:
according to the transmittance function of the display screen assembly, obtaining the light field distribution of the display screen assembly as:

$$ U(x) = \left[t_{\mathrm{unit}}(x) \otimes \frac{1}{d}\,\mathrm{comb}\!\left(\frac{x}{d}\right)\right]\mathrm{rect}\!\left(\frac{x}{L}\right) $$

wherein $\mathrm{rect}(x)$ is a rectangular function, and a plane wave with a width of $L$ is adopted to irradiate the display screen assembly. Optionally, the acquiring of the diffraction models of the display screen assemblies further includes: performing a Fourier transform on the light field distribution of the display screen assembly to obtain the far-field diffraction light field distribution of the display screen assembly as:

$$ \tilde{U}(x) = \left[\mathrm{FT}\{t_{\mathrm{unit}}\}(x)\cdot \mathrm{comb}(dx)\right] \otimes L\,\mathrm{sinc}(Lx) $$
taking the transmittance function of the display screen assembly, the light field distribution of the display screen assembly and the far-field diffraction light field distribution of the display screen assembly as the diffraction model of the display screen assembly;
and sequentially acquiring the diffraction models of the other display screen assemblies.
Optionally, the eliminating the diffraction image area in the target image based on the inverse operation result includes: deconvolving the target image based on the inverse operation result to eliminate a diffraction image area in the target image.
Optionally, after eliminating the diffraction image area in the target image, the method further includes: and restoring the missing image in the target image after the diffraction image area is eliminated.
In a second aspect, an image processing apparatus according to an embodiment of the present application is applied to a display screen assembly, a camera assembly is disposed below the display screen assembly, and the apparatus includes:
the image acquisition module is used for acquiring a target image shot by the camera assembly, and the target image comprises a diffraction image area;
the inverse operation module is used for obtaining an inverse operation result corresponding to the display screen assembly, wherein the inverse operation result is obtained by performing an inverse operation on the diffraction model of the diffraction occurring at the display screen assembly;
and the diffraction eliminating module is used for eliminating the diffraction image area in the target image based on the inverse operation result.
In a third aspect, embodiments of the present application provide a computer storage medium storing a plurality of instructions adapted to be loaded by a processor and to perform the steps of the above-mentioned method.
In a fourth aspect, the present application provides a terminal, including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor implements the steps of the above method when executing the program; the terminal further includes the display screen assembly and the camera assembly in the above method.
The beneficial effects brought by the technical scheme provided by some embodiments of the application at least comprise:
the embodiment of the application provides an image processing method, which comprises the steps of firstly, obtaining a target image shot by a camera assembly, wherein the target image comprises a diffraction image area; then obtaining an inverse operation result corresponding to the display screen assembly, wherein the inverse operation result is obtained by performing inverse operation on a diffraction model which diffracts the display screen assembly; and finally, eliminating the diffraction image area in the target image based on the inverse operation result. Because light can take place the diffraction and form the diffraction image area in the target image when passing through the display screen subassembly, consequently through carrying out the inverse operation to the diffraction model that the display screen subassembly took place the diffraction, can effectively eliminate the processing to the diffraction image area in the target image according to the inverse operation result, effectively improve the photographic effect of the camera subassembly of screen subassembly below.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description show only some embodiments of the present application, and other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic structural diagram of a terminal according to an embodiment of the present disclosure;
fig. 2 is a schematic structural diagram of another terminal provided in an embodiment of the present application;
fig. 3 is a schematic view of an application scenario of an image processing method according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a diffraction formation process of light provided by an embodiment of the present application;
fig. 5 is a schematic flowchart of an image processing method according to an embodiment of the present application;
fig. 6 is a schematic flowchart of an image processing method according to another embodiment of the present application;
FIG. 7 is a schematic view of a diffraction screen model provided in accordance with another embodiment of the present application;
FIG. 8 is a schematic diagram of a far field diffracted optical field distribution provided by another embodiment of the present application;
FIG. 9 is a diagram illustrating an envelope of diffracted light intensity of a one-dimensional diffraction screen according to another embodiment of the present application;
FIG. 10 is a schematic diagram of the diffraction order spacing of a one-dimensional diffraction screen according to another embodiment of the present application;
FIG. 11 is a diagram illustrating far field diffracted intensity of a spot according to another embodiment of the present disclosure;
fig. 12 is a schematic structural diagram of an image processing apparatus according to another embodiment of the present application;
fig. 13 is a schematic structural diagram of an image processing apparatus according to another embodiment of the present application;
fig. 14 is a schematic structural diagram of a terminal according to another embodiment of the present application.
Detailed Description
In order to make the features and advantages of the present application more obvious and understandable, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the drawings in the embodiments of the present application. It is apparent that the described embodiments are only a part of the embodiments of the present application, not all of them. All other embodiments derived by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present application.
Specifically, the image processing method provided in the embodiments of the present application can be applied to various display screen assemblies and terminals. For example, the display area corresponding to the display screen assembly includes at least a first area and a second area, and the transparency of the first area is different from that of the second area. Optionally, the camera in the terminal in the embodiments of the present application adopts an under-display camera technology, that is, the camera assembly of the terminal is disposed below the screen of the display screen assembly, and the screen may be an active-matrix organic light-emitting diode (AMOLED) screen or a passive-matrix organic light-emitting diode (PMOLED) screen.
Referring to fig. 1, fig. 1 is a schematic structural diagram of a terminal according to an embodiment of the present disclosure.
As shown in fig. 1, fig. 1A is a schematic structural view of the terminal with the display screen assembly removed; the terminal is provided with a camera assembly 110. Fig. 1B is a schematic structural view of the display screen assembly 120 of the terminal. The corresponding display area in the display screen assembly 120 includes at least a first area 121 and a second area 122, where the transparencies of the first area 121 and the second area 122 are different. When the display screen assembly 120 is assembled on the terminal, the first area 121 is disposed opposite the camera assembly 110, that is, the first area 121 is located at a position corresponding to the camera assembly 110. The pixel density of the first area 121 may be reduced so that the transparency of the first area 121 is greater than that of the second area 122, so that external light irradiates the camera assembly 110 through the first area 121 and the camera assembly 110 performs imaging according to the light.
Referring to fig. 2, fig. 2 is a schematic structural diagram of another terminal according to an embodiment of the present disclosure.
As shown in fig. 2, fig. 2A is a schematic structural view of the terminal with the display screen assembly removed; a camera assembly 210 is disposed in the terminal. Fig. 2B is a schematic structural view of the display screen assembly 220 of the terminal. The corresponding display area in the display screen assembly 220 includes at least a first area 221, a second area 222 and a third area 223, where the transparencies of the first area 221, the second area 222 and the third area 223 are different. When the display screen assembly 220 is assembled on the terminal, the first area 221 is disposed opposite the camera assembly 210, that is, the first area 221 is located at a position corresponding to the camera assembly 210. To facilitate external light irradiating the camera assembly 210 through the first area 221 so that the camera assembly 210 can perform imaging according to the light, the pixel body of the light-emitting element in the first area 221 may be made of a transparent material, the electrode of the light-emitting element may also be made of a transparent material, the driving circuits of the first area 221 may be disposed in the second area 222, the pixel density of the second area 222 may be reduced, and the pixels of the third area 223 may be disposed normally, so that the transparency of the first area 221 is greater than that of the second area 222 or the third area 223.
Referring to fig. 3, fig. 3 is a schematic view of an application scenario of an image processing method according to an embodiment of the present application.
Optionally, as shown in fig. 3, when a camera assembly is disposed below the display screen assembly and an image is captured by the camera assembly, light passes through the display screen assembly and enters the camera assembly. Diffraction of the light may occur in the display screen assembly, so that diffraction fringes may exist in the image captured by the camera assembly and affect the quality and effect of the image.
The reason why diffraction fringes exist in the image shot by the camera assembly is that, in the design of the display screen assembly, the first area cannot be completely transparent: the electrodes or pixel bodies in the first area may form a shielding layer with gaps, and this shielding layer constitutes a periodic optical grating structure that diffracts incident light; the diffraction effect degrades the shooting effect of the camera assembly below the first area of the screen. Fig. 4 is a schematic diagram of the light diffraction forming process provided in an embodiment of the present application. As shown in fig. 4, the flare phenomenon refers to the lens being affected by non-ideal factors during image formation; for example, light entering the lens through a small hole or small gap in the shielding layer is diffracted, so that the light deviates from its ideal path, causing aberration, and diffraction fringes appear in the captured image. Thus, when light enters the camera assembly through a small hole or slit in the shielding layer of the display screen assembly, diffraction fringes may be present in the image captured by the camera assembly. Therefore, an image processing method is needed to better process the diffraction fringes in the image.
For convenience of description, the following describes an implementation process of the image processing method by taking an example in which the image processing method is applied to a display screen assembly, and a camera assembly is disposed below the display screen assembly.
Referring to fig. 5, fig. 5 is a schematic flowchart of an image processing method according to an embodiment of the present disclosure.
As shown in fig. 5, the image processing method includes:
s501, acquiring a target image shot by the camera assembly, wherein the target image comprises a diffraction image area.
It should be noted that the execution main body in the embodiment of the present application may be, in terms of hardware, for example, a central processing unit (CPU) or another integrated circuit chip in the image processing terminal, and in terms of software, for example, a service related to the image processing method in the image processing terminal, which is not limited here. For convenience of description, the following description takes the execution main body being the central processing unit as an example.
When the user needs to take a picture, the display screen assembly of the terminal can be controlled by voice or touched by a limb, so that the voice receiver or the touch sensor receives a photographing control signal, and the central processing unit controls the camera assembly to take a picture according to the photographing control signal. Due to the influence of the position of the camera assembly in the terminal or the angle at which external incident light irradiates the camera assembly, a diffraction image area similar to that in fig. 4 may or may not appear in the image shot by the camera assembly. Therefore, in order to improve the shooting speed of the terminal, the image can be preprocessed after the central processing unit acquires it. The preprocessing specifically includes judging whether the area of the diffraction image area in the image shot by the camera assembly exceeds a preset area. If the area of the diffraction image area does not exceed the preset area, the image shot by the camera assembly meets the requirement, the image can be considered not to contain a diffraction image area, and the diffraction area in the image does not need to be eliminated. If the area of the diffraction image area exceeds the preset area, the image shot by the camera assembly does not meet the requirement and the diffraction area in the image needs to be eliminated; in this case, the image shot by the camera assembly is determined as the target image, and the target image includes the diffraction image area.
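The preprocessing gate described above can be sketched as follows. This is a minimal illustration in Python, not the claimed implementation; the function name, the boolean mask marking detected diffraction pixels, and the threshold value are all hypothetical:

```python
import numpy as np

def needs_diffraction_removal(diffraction_mask, preset_area):
    """Return True when the detected diffraction image area exceeds the
    preset area threshold, i.e. the image must be treated as a target
    image and its diffraction area eliminated."""
    # area of the diffraction image area, in pixels
    area = int(np.count_nonzero(diffraction_mask))
    return area > preset_area

# example: a 4x4 mask with only 2 diffraction pixels, threshold of 5 pixels
mask = np.zeros((4, 4), dtype=bool)
mask[0, :2] = True
```

In this sketch an image whose mask covers 2 pixels would pass through unchanged, while a mask covering the whole frame would trigger the elimination step.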
S502, obtaining an inverse operation result corresponding to the display screen assembly, wherein the inverse operation result is obtained by performing inverse operation on a diffraction model of the display screen assembly.
It will be appreciated that the camera assembly does not image the full spectrum; rather, it obtains RGB information through color filters of the three RGB colors, and the three colors are combined to provide color-mixed information. The image processing pipeline converts the RAW data collected by the sensor in the camera assembly into RGB data for human eyes to view, and may include white balance, demosaicing, noise reduction, color gamut conversion, gamma correction, and compression of the image. Therefore, diffraction in an image can be eliminated by modeling the real diffraction of the RGB light through the diffraction structure and performing a deconvolution operation. Analysis of the diffraction model of light passing through the display screen assembly shows that the transmittance function of the display screen assembly and the far-field diffraction light intensity form a Fourier transform pair. Therefore, after the target image shot by the camera assembly is obtained, the inverse operation result of the diffraction corresponding to the display screen assembly can be determined for that display screen assembly in advance. The inverse operation result can be obtained through a preliminary experiment: specifically, the diffraction model corresponding to the display screen is obtained first, where the diffraction model includes the calculation process of the diffraction occurring when light passes through the display screen assembly; then the inverse operation is performed on that diffraction model and the inverse operation result is obtained, that is, the calculation process of the diffraction when light passes through the display screen assembly is inverted. The normal light before passing through the display screen assembly can then be recovered based on the inverse operation result, so as to achieve the elimination of the diffraction area in the target image.
And S503, eliminating the diffraction image area in the target image based on the inverse operation result.
Because the diffraction model includes the calculation process of the diffraction occurring when light passes through the display screen assembly, the inverse operation can be performed on the diffraction model corresponding to the display screen assembly and the inverse operation result obtained. Further, since the transmittance function of the display screen assembly and the far-field diffraction light intensity form a Fourier transform pair, the target image can be deconvolved based on the inverse operation result to eliminate the diffraction image area in the target image; that is, the normal light before the diffracted light passed through the display screen assembly is recovered based on the inverse operation result, so that the diffraction area in the target image is eliminated and the photographing effect of the camera assembly below the screen assembly is effectively improved.
In the embodiment of the application, a target image shot by the camera assembly is acquired first, wherein the target image includes a diffraction image area; then an inverse operation result corresponding to the display screen assembly is obtained, wherein the inverse operation result is obtained by performing an inverse operation on the diffraction model of the display screen assembly; finally, the diffraction image area in the target image is eliminated based on the inverse operation result. Because light is diffracted when passing through the display screen assembly and forms a diffraction image area in the target image, performing the inverse operation on the diffraction model of the display screen assembly allows the diffraction image area in the target image to be effectively eliminated according to the inverse operation result, effectively improving the photographing effect of the camera assembly below the display screen assembly.
Referring to fig. 6, fig. 6 is a schematic flowchart illustrating an image processing method according to another embodiment of the present application.
As shown in fig. 6, the method steps include:
s601, acquiring each diffraction model of each display screen component.
Optionally, a suitable diffraction model may be selected according to the actual structure of the display screen assembly; one possible method for obtaining the diffraction model of a display screen assembly is as follows:
for convenience of description, the transmittance function of the diffraction screen of the display screen assembly and the far-field diffracted light intensity form a fourier transform pair relationship in consideration of only one dimension. Referring to fig. 7, fig. 7 is a schematic view of a diffraction screen model according to another embodiment of the present application, and as shown in fig. 7, a display screen assembly is taken as an example, and the display screen assembly is configured to include a plurality of units, and a transmittance function of each unit is tunit(x) The unit period is d, each unit is made of two materials, the width of the first material is a, and the transmittance function of the display screen assembly is obtained according to the unit transmittance function and the unit period of the display screen assembly, that is, the transmittance function of the one-dimensional diffraction screen can be expressed as:
Figure RE-GDA0002725394210000081
wherein the transmittance function of each unit in the display screen assembly is tunit(x) The unit period is d, comb (x) is a one-dimensional comb function.
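As an illustration, the periodic transmittance $t(x)$ above can be sampled on a discrete grid. This is a hypothetical numerical sketch in Python, not part of the claimed method; the grid length, period and transparent width are arbitrary choices:

```python
import numpy as np

def screen_transmittance(n, d, a):
    """Discrete 1-D transmittance of the periodic diffraction screen:
    every unit of period d samples is transparent (t = 1) over its first
    a samples and opaque (t = 0) elsewhere -- the sampled equivalent of
    convolving the unit aperture with a comb of spacing d."""
    x = np.arange(n)
    return ((x % d) < a).astype(float)

# 4 periods of a screen with period 8 samples and transparent width 3 samples
t = screen_transmittance(n=32, d=8, a=3)
```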
Further, according to the transmittance function of the display screen assembly, the light field distribution of the display screen assembly is obtained as:

$$ U(x) = \left[t_{\mathrm{unit}}(x) \otimes \frac{1}{d}\,\mathrm{comb}\!\left(\frac{x}{d}\right)\right]\mathrm{rect}\!\left(\frac{x}{L}\right) $$

wherein $\mathrm{rect}(x)$ is a rectangular function, and a plane wave with a width of $L$ is adopted to irradiate the display screen assembly.
Finally, a Fourier transform is performed on the light field distribution of the display screen assembly to obtain the far-field diffraction light field distribution of the display screen assembly as:

$$ \tilde{U}(x) = \left[\mathrm{FT}\{t_{\mathrm{unit}}\}(x)\cdot \mathrm{comb}(dx)\right] \otimes L\,\mathrm{sinc}(Lx) $$

The transmittance function of the display screen assembly, the light field distribution of the display screen assembly and the far-field diffraction light field distribution of the display screen assembly are taken as the diffraction model of the display screen assembly. Then, similarly to the acquisition of the diffraction model corresponding to one display screen assembly, the diffraction models of the other display screen assemblies can be acquired in turn.
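The Fourier-transform step can be checked numerically: when the illuminated window holds an exact integer number of screen periods, the discrete far field is nonzero only at the diffraction orders, i.e. at frequency bins spaced N/d apart, in line with the comb(dx) factor above. A hypothetical sketch in Python (the grid sizes are arbitrary choices, not values from the patent):

```python
import numpy as np

N, d, a = 1024, 16, 6            # window, unit period, transparent width (samples)
x = np.arange(N)
t = ((x % d) < a).astype(float)  # periodic screen transmittance
U = t                            # plane wave illuminating the full window (L = N)
far_field = np.fft.fft(U)        # far-field diffraction light field (FT of U)
I = np.abs(far_field) ** 2       # diffracted intensity
```

Because the window holds exactly N/d = 64 periods, the intensity vanishes between orders and the orders sit at bins 0, 64, 128, and so on.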
Referring to fig. 8, fig. 8 is a schematic diagram of a far-field diffraction optical field distribution according to another embodiment of the present application.
As shown in fig. 8, each unit here consists only of a transparent part of width $a$ and an opaque part, that is, $t_{\mathrm{unit}}(x) = \mathrm{rect}(x/a)$, so the far-field diffraction light field distribution becomes:

$$ \tilde{U}(x) = \left[a\,\mathrm{sinc}(ax)\cdot \mathrm{comb}(dx)\right] \otimes L\,\mathrm{sinc}(Lx) $$
For the far-field diffraction light field distribution of the display screen assembly, neglecting the constant term, the far-field diffraction pattern can be known to be composed of three parts:
a first part: fourier transform FT (t) of transmittance function of cellunit(x) Which constitutes the envelope of the overall diffracted intensity, the shape of the envelope and the transmittance function t of the cellunit(x) In which t isunit(x) The width is d, and the envelope shape is also related to the spacing d. Referring to fig. 9, fig. 9 is a schematic diagram of a diffraction intensity envelope of a one-dimensional diffraction screen according to another embodiment of the present application.
The second part: the comb function $\mathrm{comb}(dx)$, which determines the spacing between diffraction orders; its value is related only to the unit spacing $d$. The specific order spacing is shown in fig. 10, where fig. 10 is a schematic diagram of the diffraction order spacing of a one-dimensional diffraction screen according to another embodiment of the present application.
The third part: the Fourier transform $\mathrm{sinc}(Lx)$ of the illuminating spot, which determines the size and shape of each diffracted spot and is related only to the spot width $L$. For the specific far-field diffraction intensity of the spot, refer to fig. 11, where fig. 11 is a schematic diagram of the far-field diffraction intensity of the spot according to another embodiment of the present disclosure.
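The first factor — the envelope — can likewise be verified numerically: for an exactly periodic screen, the far-field magnitude at order m equals the number of periods times the unit aperture's discrete Fourier transform at harmonic m. A hypothetical Python sketch (sizes are arbitrary, consistent with the sketch above):

```python
import numpy as np

N, d, a = 1024, 16, 6
x = np.arange(N)
t = ((x % d) < a).astype(float)      # full periodic screen transmittance
F = np.abs(np.fft.fft(t))            # |far field| of the whole screen
unit = np.zeros(d)
unit[:a] = 1.0                       # one transparent/opaque unit
envelope = np.abs(np.fft.fft(unit))  # |FT(t_unit)| sampled at the orders
# order m of the screen sits at bin m*N/d, scaled by the N/d periods
```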
And S602, respectively carrying out inverse operation on each diffraction model to obtain corresponding inverse operation results.
From the above analysis, it can be seen that the diffraction produced by a regular diffraction structure is itself regular, and the diffraction model can be obtained by accurate calculation. By performing the inverse operation on the diffraction calculation process, the information for a single wavelength can be deconvolved from the photographed image, thereby eliminating the diffraction effect caused by the grating structure and obtaining a photographed image close to reality. Since the color filter of the lens is a broadband filter rather than a single-wavelength filter, the result of the model differs from the actual situation, and the coefficients need to be adjusted accordingly.
Therefore, after the diffraction model corresponding to each display screen is obtained, since the diffraction model includes the calculation process of the diffraction occurring when light passes through the display screen assembly, the inverse operation can be performed on the diffraction model corresponding to each display screen assembly and the corresponding inverse operation result obtained.
And S603, correspondingly storing each inverse operation result and the display screen identification of each display screen component to a database.
In order to make it convenient for the central processing unit, after subsequently acquiring the target image shot by the camera assembly, to first obtain the inverse operation result corresponding to the display screen assembly, each inverse operation result can be stored in the database in correspondence with the display screen identifier of its display screen assembly, so that the central processing unit can retrieve it at any time.
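One possible realization of this lookup table is an SQLite store keyed by the display screen identifier. The table name, the identifier value, and the byte serialization below are illustrative assumptions, not part of the embodiment.

```python
import sqlite3
import numpy as np

# Minimal sketch of the lookup table described above: each display screen
# identifier maps to its precomputed inverse-operation result (here a
# serialized inverse filter; IDs and table name are hypothetical).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE inverse_ops (screen_id TEXT PRIMARY KEY, result BLOB)")

inverse_filter = np.arange(8, dtype=np.float64)   # stand-in inverse result
db.execute(
    "INSERT INTO inverse_ops VALUES (?, ?)",
    ("screen-A01", inverse_filter.tobytes()),
)
db.commit()

# Later, the processor looks the result up by the screen identifier:
row = db.execute(
    "SELECT result FROM inverse_ops WHERE screen_id = ?", ("screen-A01",)
).fetchone()
restored = np.frombuffer(row[0], dtype=np.float64)
print(restored)
```

Keying on the screen identifier lets one terminal model ship with several panel variants while still retrieving the right inverse filter at run time.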
And S604, acquiring a target image shot by the camera assembly, wherein the target image comprises a diffraction image area.
For step S604, reference may be made to the detailed description of step S501 in the above embodiment.
S605, obtaining an inverse operation result corresponding to the display screen assembly, wherein the inverse operation result is obtained by performing inverse operation on the diffraction model of the display screen assembly.
Since each inverse operation result is stored in the database in correspondence with the display screen identifier of its display screen assembly, the display screen identifier corresponding to the display screen assembly can be obtained first, and the inverse operation result corresponding to that identifier can then be looked up in the database.
And S606, deconvoluting the target image based on the inverse operation result to eliminate the diffraction image area in the target image.
Since the diffraction model includes the calculation process of the diffraction occurring when light passes through the display screen assembly, and since the transmittance function of the display screen assembly and the far-field diffraction light intensity form a Fourier transform pair, the target image can be deconvolved based on the inverse operation result to eliminate the diffraction image area in the target image. That is, the normal light before the diffracted light passed through the display screen assembly is recovered from the inverse operation result, thereby eliminating the diffraction area in the target image and effectively improving the photographing effect of the camera assembly below the display screen assembly.
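A minimal 2-D sketch of this per-image step, with an assumed toy PSF containing horizontal diffraction orders (none of the values come from the embodiment): deconvolving the captured frame with the precomputed inverse filter suppresses the fringes around a point source.

```python
import numpy as np

# Sketch of the per-image step: apply the stored inverse filter to a captured
# frame by 2-D deconvolution (PSF and values are illustrative).
n = 64
psf = np.zeros((n, n))
psf[0, 0] = 1.0
for k in (8, 16):                      # horizontal diffraction orders
    psf[0, k] = psf[0, -k] = 0.3 / k
psf /= psf.sum()

scene = np.zeros((n, n))
scene[32, 32] = 1.0                    # a point light source

H = np.fft.fft2(psf)
captured = np.real(np.fft.ifft2(np.fft.fft2(scene) * H))   # fringed image

eps = 1e-6                             # regularization (assumed)
H_inv = np.conj(H) / (np.abs(H) ** 2 + eps)                # inverse result
restored = np.real(np.fft.ifft2(np.fft.fft2(captured) * H_inv))

# Fringe energy away from the source drops sharply after deconvolution.
fringe_before = np.abs(captured[32, 40])
fringe_after = np.abs(restored[32, 40])
print(fringe_before, "->", fringe_after)
```

The sidelobe at the first-order position is visible before deconvolution and essentially gone afterwards, matching the point-light-source fringe example in the text.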
S607, restoring the missing image content in the target image after the diffraction image region is eliminated.
The above steps can eliminate the more obvious diffraction, such as the diffraction fringes around a point light source. However, because some image information is lost, an AI algorithm is needed: by performing AI learning on pairs of diffraction-free real images and images captured through the transmissive screen, and by training on typical scenes, resolving power and image quality can be improved. For example, AI learning can be performed on the diffraction-free real image and the through-screen image, and restoration processing can be performed on the missing image content in the target image after the diffraction image area is eliminated, so as to improve image quality.
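As a toy stand-in for this AI learning step (a real system would train a neural network on typical scenes; here the "learned" operator is just a per-frequency linear filter fit by least squares on assumed synthetic training pairs):

```python
import numpy as np

# Learn a restoration operator from pairs of diffraction-free images and their
# through-screen counterparts. Everything here is illustrative: the PSF, the
# synthetic data, and the linear "model" standing in for a trained network.
rng = np.random.default_rng(1)
n, pairs = 64, 200
psf = np.zeros(n); psf[[0, 4]] = [1.0, 0.5]; psf /= psf.sum()
H = np.fft.fft(psf)

clean = rng.random((pairs, n))                          # diffraction-free rows
through = np.real(np.fft.ifft(np.fft.fft(clean) * H))   # through-screen rows

# Fit, per frequency, the scalar mapping through -> clean (least squares).
C, T = np.fft.fft(clean), np.fft.fft(through)
W = np.sum(C * np.conj(T), axis=0) / np.sum(np.abs(T) ** 2, axis=0)

# Apply the learned operator to an unseen example.
test_clean = rng.random(n)
test_through = np.real(np.fft.ifft(np.fft.fft(test_clean) * H))
restored = np.real(np.fft.ifft(np.fft.fft(test_through) * W))
print("max error:", np.max(np.abs(restored - test_clean)))
```

With noiseless synthetic pairs the least-squares fit recovers the exact inverse; the point is the workflow — learn from (clean, through-screen) pairs, then apply to new captures — not the specific model class.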
In the embodiment of the application, a target image shot by the camera assembly is obtained first, the target image including a diffraction image area; then an inverse operation result corresponding to the display screen assembly is obtained, the inverse operation result being obtained by performing an inverse operation on the diffraction model of the display screen assembly; and finally, the diffraction image area in the target image is eliminated based on the inverse operation result. Because light diffracts when passing through the display screen assembly and forms a diffraction image area in the target image, performing an inverse operation on the diffraction model of the display screen assembly allows the diffraction image area in the target image to be effectively eliminated according to the inverse operation result, effectively improving the photographing effect of the camera assembly below the screen assembly.
Referring to fig. 12, fig. 12 is a schematic structural diagram of an image processing apparatus according to another embodiment of the present application.
As shown in fig. 12, the image processing apparatus 1200 is applied to a display screen assembly below which a camera assembly is disposed, and the image processing apparatus 1200 includes:
an image acquisition module 1201, configured to acquire a target image captured by the camera assembly, where the target image includes a diffraction image area;
an inverse operation module 1203, configured to obtain an inverse operation result corresponding to the display screen component, where the inverse operation result is a result obtained by performing an inverse operation on a diffraction model in which the display screen component is diffracted;
and a diffraction eliminating module 1204, configured to perform elimination processing on a diffraction image region in the target image based on the inverse operation result.
Referring to fig. 13, fig. 13 is a schematic structural diagram of an image processing apparatus according to another embodiment of the present application.
As shown in fig. 13, the image processing apparatus 1300 includes:
a model obtaining module 1301, configured to obtain each diffraction model that each display screen component diffracts.
Wherein acquiring each diffraction model in which each display screen assembly diffracts includes:
according to the unit transmittance function and the unit period of the display screen assembly, the transmittance function of the display screen assembly is obtained as follows:
t(x) = t_unit(x) ∗ comb(x/d), where ∗ denotes convolution;
wherein the transmittance function of each unit in the display screen assembly is t_unit(x), the unit period is d, and comb(x) is a one-dimensional comb function.
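The convolution form of this transmittance function can be checked numerically. In the sketch below the grid size, cell period, and cell aperture are illustrative assumptions: convolving one cell's transmittance with a comb of period d replicates the cell across the screen.

```python
import numpy as np

# Numeric check of the formula above: t(x) = t_unit(x) convolved with a comb
# of period d replicates the cell (grid size and cell shape are illustrative).
N, d = 512, 32                    # samples, cell period in samples
t_unit = np.zeros(N)
t_unit[:16] = 1.0                 # one open cell: 16-sample aperture

comb = np.zeros(N)
comb[::d] = 1.0                   # comb(x/d): impulses every d samples

# Circular convolution via FFT (the screen is modeled as periodic here).
t = np.real(np.fft.ifft(np.fft.fft(t_unit) * np.fft.fft(comb)))

# The result is t_unit repeated with period d:
print(np.allclose(t[:d], t[d:2 * d]))
```

This is why the far-field pattern factorizes: convolution with the comb in space becomes multiplication by a comb in the frequency domain.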
Further, acquiring each diffraction model diffracted by each display screen component, further comprises:
according to the transmittance function of the display screen assembly, acquiring the light field distribution of the display screen assembly as follows:
E(x) = t(x) · rect(x/L) = [t_unit(x) ∗ comb(x/d)] · rect(x/L)
wherein rect(x) is a rectangular function, and a plane wave of width L is used to illuminate the display screen assembly.
Further, acquiring each diffraction model diffracted by each display screen component, further comprises:
A Fourier transform is performed on the light field distribution of the display screen assembly to obtain the far-field diffraction light field distribution of the display screen assembly as follows:
Ẽ(x) = F{E(x)} ∝ [T_unit(x) · comb(dx)] ∗ sinc(Lx), where T_unit is the Fourier transform of t_unit and F denotes the Fourier transform;
taking the transmittance function of the display screen assembly, the light field distribution of the display screen assembly and the far-field diffraction light field distribution of the display screen assembly as diffraction models for diffraction of the display screen assembly; and sequentially acquiring diffraction models of other display screen components.
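The Fourier-pair relation used here (multiplication by rect(x/L) in space becomes convolution with a sinc in frequency) can also be verified numerically; the grating period and window width below are illustrative assumptions, not values from the embodiment.

```python
import numpy as np

# Numeric check of the Fourier pair stated above: windowing the grating with
# rect(x/L) convolves its spectrum with a sinc, broadening each diffraction
# order from a single bin into a sinc-shaped spot.
N = 4096
x = np.arange(N)
grating = (x % 64 < 32).astype(float)         # periodic grating, period 64

L = 1024                                      # aperture width in samples
window = (np.abs(x - N // 2) < L // 2).astype(float)

spec_full = np.abs(np.fft.fft(grating))
spec_win = np.abs(np.fft.fft(grating * window))

# The unwindowed grating has energy only at exact order bins; two bins away
# from the first order it is numerically zero, while the windowed spectrum
# has a substantial sinc shoulder there.
order = N // 64                               # first diffraction order bin
print(spec_full[order + 2], spec_win[order + 2])
```

This is the structure the inverse operation exploits: the far-field intensity is the grating spectrum sampled by the comb and smeared by the aperture sinc.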
A result obtaining module 1302, configured to perform inverse operations on the diffraction models respectively and obtain corresponding inverse operation results.
And the data storage module 1303 is used for correspondingly storing each inverse operation result and the display screen identifier of each display screen component to the database.
And an image obtaining module 1304, configured to obtain a target image captured by the camera assembly, where the target image includes a diffraction image area.
The inverse operation module 1305 is configured to obtain an inverse operation result corresponding to the display screen assembly, where the inverse operation result is obtained by performing an inverse operation on the diffraction model of the display screen assembly.
The diffraction eliminating module 1306 is configured to deconvolve the target image based on the inverse operation result to eliminate the diffraction image area in the target image.
A restoring module 1307, configured to perform restoring processing on the missing image in the target image after the diffraction image area is eliminated.
In an embodiment of the present application, an image processing apparatus includes: the image acquisition module, configured to acquire a target image shot by the camera assembly, the target image including a diffraction image area; the inverse operation module, configured to acquire an inverse operation result corresponding to the display screen assembly, the inverse operation result being obtained by performing an inverse operation on the diffraction model of the display screen assembly; and the diffraction eliminating module, configured to eliminate the diffraction image area in the target image based on the inverse operation result. Because light diffracts when passing through the display screen assembly and forms a diffraction image area in the target image, performing an inverse operation on the diffraction model of the display screen assembly allows the diffraction image area in the target image to be effectively eliminated according to the inverse operation result, effectively improving the photographing effect of the camera assembly below the screen assembly.
Embodiments of the present application also provide a computer storage medium, which stores a plurality of instructions, where the instructions are suitable for being loaded by a processor and executing the steps of the image processing method in the foregoing embodiments.
Further, please refer to fig. 14, where fig. 14 is a schematic structural diagram of a terminal according to another embodiment of the present application. As shown in fig. 14, terminal 1400 may include: at least one central processor 1401, at least one network interface 1404, a user interface 1403, a memory 1405, at least one communication bus 1402. The terminal also comprises the display screen assembly and the camera assembly in the embodiment.
The communication bus 1402 is used to realize connection communication between these components.
The user interface 1403 may include a screen (Display) and a Camera (Camera), and the optional user interface 1403 may also include a standard wired interface and a wireless interface.
The network interface 1404 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface), among others.
Central processor 1401 may include one or more processing cores, among others. The central processor 1401 connects the respective parts within the entire terminal 1400 by means of various interfaces and lines, and performs various functions of the terminal 1400 and processes data by operating or executing instructions, programs, code sets or instruction sets stored in the memory 1405 and calling data stored in the memory 1405. Optionally, the central Processing unit 1401 may be implemented in at least one hardware form of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The Central Processing Unit 1401 may integrate one or a combination of several of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. Wherein, the CPU mainly processes an operating system, a user interface, an application program and the like; the GPU is used for rendering and drawing the content required to be displayed by the screen; the modem is used to handle wireless communications. It is to be understood that the modem may be implemented by a single chip without being integrated into the central processor 1401.
The Memory 1405 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). Optionally, the memory 1405 includes a non-transitory computer-readable medium. The memory 1405 may be used to store an instruction, a program, code, a set of codes, or a set of instructions. The memory 1405 may include a program storage area and a data storage area, wherein the program storage area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function, an image playing function, etc.), instructions for implementing the above-described method embodiments, and the like; the storage data area may store data and the like referred to in the above respective method embodiments. Memory 1405 may alternatively be at least one memory device located remotely from central processor 1401 as described above. As shown in fig. 14, the memory 1405, which is a kind of computer storage medium, may include therein an operating system, a network communication module, a user interface module, and an image processing program.
In the terminal 1400 shown in fig. 14, the user interface 1403 is mainly used as an interface for providing input for a user, and acquiring data input by the user; the central processing unit 1401 may be configured to call up the image processing program stored in the memory 1405, and specifically perform the following operations:
acquiring a target image shot by a camera assembly, wherein the target image comprises a diffraction image area;
acquiring an inverse operation result corresponding to the display screen component, wherein the inverse operation result is obtained by performing inverse operation on a diffraction model which diffracts the display screen component;
and eliminating the diffraction image area in the target image based on the inverse operation result.
In one embodiment, before performing the step of acquiring the target image captured by the camera assembly, the central processor 1401 further performs the following steps: acquiring diffraction models of diffraction of display screen components; respectively carrying out inverse operation on each diffraction model and obtaining corresponding inverse operation results; and correspondingly storing each inverse operation result and the display screen identification of each display screen assembly to a database.
In one embodiment, when the central processing unit 1401 is executing to obtain each diffraction model diffracted by each display screen component, the following steps are specifically executed:
according to the unit transmittance function and the unit period of the display screen assembly, the transmittance function of the display screen assembly is obtained as follows:
t(x) = t_unit(x) ∗ comb(x/d), where ∗ denotes convolution;
wherein the transmittance function of each unit in the display screen assembly is t_unit(x), the unit period is d, and comb(x) is a one-dimensional comb function.
In one embodiment, when the central processing unit 1401 performs the step of acquiring each diffraction model diffracted by each display screen component, the following steps are further specifically performed:
according to the transmittance function of the display screen assembly, acquiring the light field distribution of the display screen assembly as follows:
E(x) = t(x) · rect(x/L) = [t_unit(x) ∗ comb(x/d)] · rect(x/L)
wherein rect(x) is a rectangular function, and a plane wave of width L is used to illuminate the display screen assembly.
In one embodiment, when the central processing unit 1401 performs the step of acquiring each diffraction model diffracted by each display screen component, the following steps are further specifically performed:
A Fourier transform is performed on the light field distribution of the display screen assembly to obtain the far-field diffraction light field distribution of the display screen assembly as follows:
Ẽ(x) = F{E(x)} ∝ [T_unit(x) · comb(dx)] ∗ sinc(Lx), where T_unit is the Fourier transform of t_unit and F denotes the Fourier transform;
taking the transmittance function of the display screen assembly, the light field distribution of the display screen assembly and the far-field diffraction light field distribution of the display screen assembly as diffraction models for diffraction of the display screen assembly; and sequentially acquiring diffraction models of other display screen components.
In one embodiment, the central processing unit 1401, when executing the elimination processing of the diffraction image region in the target image based on the result of the inverse operation, specifically executes the following steps: deconvolving the target image based on the result of the inverse operation to eliminate a diffraction image region in the target image.
In one embodiment, the central processor 1401, after performing the elimination of the diffraction image region in the target image, further specifically performs the following steps: and restoring the missing image in the target image after the diffraction image area is eliminated.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of modules is merely a division of logical functions, and an actual implementation may have another division, for example, a plurality of modules or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or modules, and may be in an electrical, mechanical or other form.
Modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical modules, may be located in one place, or may be distributed on a plurality of network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments of the present application may be integrated into one processing module, or each of the modules may exist alone physically, or two or more modules are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode.
The integrated module, if implemented in the form of a software functional module and sold or used as a separate product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method of the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
It should be noted that, for the sake of simplicity, the above-mentioned method embodiments are described as a series of acts or combinations, but those skilled in the art should understand that the present application is not limited by the described order of acts, as some steps may be performed in other orders or simultaneously according to the present application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
The image processing method, apparatus, storage medium and terminal provided by the present application have been described in detail above; those skilled in the art may make modifications within the scope defined by the following claims, and the content of this specification should not be construed as limiting the present application.

Claims (10)

1. An image processing method applied to a display screen assembly, wherein a camera assembly is disposed below the display screen assembly, characterized in that the method comprises the following steps:
acquiring a target image shot by the camera assembly, wherein the target image comprises a diffraction image area;
obtaining an inverse operation result corresponding to the display screen component, wherein the inverse operation result is obtained by performing inverse operation on a diffraction model which diffracts the display screen component;
and eliminating the diffraction image area in the target image based on the inverse operation result.
2. The method of claim 1, wherein prior to acquiring the image of the target captured by the camera assembly, further comprising:
acquiring diffraction models of diffraction of display screen components;
respectively carrying out inverse operation on each diffraction model and obtaining corresponding inverse operation results;
and correspondingly storing each inverse operation result and the display screen identification of each display screen assembly to a database.
3. The method of claim 2, wherein obtaining diffraction models for diffraction of display screen components comprises:
according to the unit transmittance function and the unit period of the display screen assembly, acquiring the transmittance function of the display screen assembly as follows:
t(x) = t_unit(x) ∗ comb(x/d), where ∗ denotes convolution;
wherein the transmittance function of each unit in the display screen assembly is t_unit(x), the unit period is d, and comb(x) is a one-dimensional comb function.
4. The method of claim 3, wherein obtaining each diffraction model for each display screen assembly, further comprises:
according to the transmittance function of the display screen assembly, acquiring the light field distribution of the display screen assembly as follows:
E(x) = t(x) · rect(x/L) = [t_unit(x) ∗ comb(x/d)] · rect(x/L)
wherein rect(x) is a rectangular function, and a plane wave of width L is used to illuminate the display screen assembly.
5. The method of claim 4, wherein obtaining each diffraction model for each display screen assembly, further comprises:
performing a Fourier transform on the light field distribution of the display screen assembly to obtain the far-field diffraction light field distribution of the display screen assembly as follows:
Ẽ(x) = F{E(x)} ∝ [T_unit(x) · comb(dx)] ∗ sinc(Lx), where T_unit is the Fourier transform of t_unit and F denotes the Fourier transform;
taking the transmittance function of the display screen assembly, the light field distribution of the display screen assembly and the far-field diffraction light field distribution of the display screen assembly as a diffraction model of the diffraction of the display screen assembly;
and sequentially acquiring diffraction models of other display screen components.
6. The method of claim 5, wherein the eliminating the diffraction image area in the target image based on the inverse operation result comprises:
deconvolving the target image based on the inverse operation result to eliminate a diffraction image area in the target image.
7. The method of claim 6, wherein after eliminating the diffraction image region in the target image, further comprising:
and restoring the missing image in the target image after the diffraction image area is eliminated.
8. An image processing apparatus applied to a display screen assembly, wherein a camera assembly is disposed below the display screen assembly, characterized in that the apparatus comprises:
the image acquisition module is used for acquiring a target image shot by the camera assembly, and the target image comprises a diffraction image area;
the inverse operation module is used for acquiring an inverse operation result corresponding to the display screen component, and the inverse operation result is obtained by performing inverse operation on a diffraction model which diffracts the display screen component;
and the diffraction eliminating module is used for eliminating the diffraction image area in the target image based on the inverse operation result.
9. A computer storage medium storing a plurality of instructions adapted to be loaded by a processor and to perform the steps of the method according to any of claims 1 to 7.
10. A terminal comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor when executing the program implementing the steps of the method of any one of claims 1 to 7;
the terminal further comprises a display screen assembly and a camera assembly according to any one of claims 1 to 7.
CN202010841170.8A 2020-08-19 2020-08-19 Image processing method, image processing device, storage medium and terminal Pending CN114079731A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010841170.8A CN114079731A (en) 2020-08-19 2020-08-19 Image processing method, image processing device, storage medium and terminal


Publications (1)

Publication Number Publication Date
CN114079731A 2022-02-22

Family

ID=80281942



Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104268421A (en) * 2014-10-10 2015-01-07 中国科学院高能物理研究所 Method for removing blurring effect in X-ray scattering and diffraction experiments
CN110475063A (en) * 2019-08-01 2019-11-19 Oppo广东移动通信有限公司 Image-pickup method and device and storage medium
CN110489580A (en) * 2019-08-26 2019-11-22 Oppo(重庆)智能科技有限公司 Image processing method, device, display screen component and electronic equipment
CN111123538A (en) * 2019-09-17 2020-05-08 印象认知(北京)科技有限公司 Image processing method and method for adjusting diffraction screen structure based on point spread function



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination