CN117596494A - Image pickup module, image generation method, electronic device, and readable storage medium - Google Patents


Info

Publication number
CN117596494A
Authority
CN
China
Prior art keywords
light
pixel unit
image data
image
grating structure
Legal status
Pending
Application number
CN202311512300.3A
Other languages
Chinese (zh)
Inventor
Peng Xiaobo (彭晓波)
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202311512300.3A
Publication of CN117596494A


Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B33/00Colour photography, other than mere exposure or projection of a colour film
    • G03B33/10Simultaneous recording or projection
    • G03B33/12Simultaneous recording or projection using beam-splitting or beam-combining systems, e.g. dichroic mirrors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/54Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95Computational photography systems, e.g. light-field imaging systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Color Television Image Signal Generators (AREA)

Abstract

The application discloses a camera module, an image generation method, an electronic device, and a readable storage medium, belonging to the technical field of imaging. The method includes: controlling a grating structure in the camera module to fully transmit light, and collecting first image data through an image sensor in the camera module; changing grating parameters of the grating structure corresponding to a first pixel unit of the image sensor, based on a diffraction angle, the wavelength of the diffracted light, and the incidence angle of light on the first pixel unit, so as to control the grating structure to diffract first light onto a second pixel unit adjacent to the first pixel unit, and collecting second image data through the image sensor; and generating a first image from the first image data and the second image data.

Description

Image pickup module, image generation method, electronic device, and readable storage medium
Technical Field
The application belongs to the technical field of image pickup, and particularly relates to an image pickup module, an image generation method, electronic equipment and a readable storage medium.
Background
Multispectral imaging is a technology capable of simultaneously acquiring spectral characteristics and spatial image information, and is an important development direction for photoelectric imaging systems. Multispectral imaging systems can provide images with 3 to 20 discrete bands and have found wide application in the agricultural and food fields.
There are various implementation schemes for multispectral technology; multispectral image acquisition can be realized with a spectrometer. At present, an image sensor images by sampling the incident spectral curve at the three primary colors to form three discrete data values, which are finally mixed into color and brightness. As a result, human eyes and cameras can only see color and brightness, and cannot see the details of the spectral curve.
Disclosure of Invention
An object of the embodiments of the present application is to provide a camera module, an image generation method, an electronic device, and a readable storage medium, which can improve the color perception capability and spectral information acquisition capability of the camera module.
In a first aspect, an embodiment of the present application provides a camera module. The camera module includes an image sensor and grating structures, the image sensor includes at least two pixel units, a grating structure is correspondingly disposed above each pixel unit, and when a grating structure is energized, its grating parameters change.
In a second aspect, an embodiment of the present application provides an image generation method executed by an electronic device that includes the camera module according to the first aspect. The method includes: controlling the grating structure in the camera module to fully transmit light, and collecting first image data through the image sensor in the camera module; changing grating parameters of the grating structure corresponding to a first pixel unit of the image sensor, based on a diffraction angle, the wavelength of the diffracted light, and the incidence angle of light on the first pixel unit, so as to control the grating structure to diffract first light onto a second pixel unit adjacent to the first pixel unit, and collecting second image data through the image sensor; and generating a first image from the first image data and the second image data.
In a third aspect, an embodiment of the present application provides an electronic device including the camera module according to the first aspect. The electronic device further includes a control module and a generating module. The control module is configured to control the grating structure in the camera module to fully transmit light and collect first image data through the image sensor in the camera module. The control module is further configured to change the grating parameters of the grating structure corresponding to a first pixel unit, based on a diffraction angle, the wavelength of the diffracted light, and the incidence angle of light on the first pixel unit of the image sensor, so as to control the grating structure to diffract first light onto a second pixel unit adjacent to the first pixel unit, and to collect second image data through the image sensor. The generating module is configured to generate a first image from the first image data and the second image data.
In a fourth aspect, embodiments of the present application provide an electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the method as described in the second aspect.
In a fifth aspect, embodiments of the present application provide a readable storage medium having stored thereon a program or instructions which when executed by a processor implement the steps of the method according to the second aspect.
In a sixth aspect, embodiments of the present application provide a chip including a processor and a communication interface coupled to the processor, the processor being configured to execute programs or instructions to implement the method according to the second aspect.
In a seventh aspect, embodiments of the present application provide a computer program product stored in a storage medium, the program product being executable by at least one processor to implement the method according to the second aspect.
In the embodiments of the present application, a grating structure with adjustable grating parameters is disposed on the pixel structure of the camera module. By adjusting the grating parameters of the grating structure above a pixel unit, the light transmittance of the grating structure can be controlled, and the grating structure can be used to diffract light and change its optical path, realizing multi-color acquisition and thereby improving the color perception capability and spectral information acquisition capability of the camera module.
In the embodiments of the present application, when an image is acquired, the grating structure in the camera module is first controlled to fully transmit light; at this time the grating structure has no diffraction function, and first image data is collected through the image sensor. Then, based on a diffraction angle, the wavelength of the diffracted light, and the incidence angle of light on a first pixel unit of the image sensor, the grating parameters of the grating structure corresponding to the first pixel unit are changed; at this time the grating structure can diffract first light of a specific wavelength onto the pixel unit adjacent to the first pixel unit, and second image data is collected through the image sensor. Finally, a first image is generated from the first image data and the second image data. In this way, the optical path of light on a pixel unit of the image sensor can be changed to diffract specific light, so that image data corresponding to light of any wavelength can be obtained. Multispectral information can thus be acquired through the image sensor, improving the color perception capability and spectral information acquisition capability of the camera module and the final image quality.
Drawings
FIG. 1 is a schematic structural diagram of a camera module in the related art;
FIG. 2 is a schematic structural diagram of a pixel unit in a pixel array of an image sensor in the related art;
FIG. 3 is a schematic structural diagram of a camera module according to an embodiment of the present application;
FIG. 4(A) is one schematic structural diagram of a grating structure according to an embodiment of the present application;
FIG. 4(B) is a second schematic structural diagram of a grating structure according to an embodiment of the present application;
FIG. 4(C) is a third schematic structural diagram of a grating structure according to an embodiment of the present application;
FIG. 5 is a schematic structural diagram of a pixel unit in a pixel array of an image sensor according to an embodiment of the present application;
FIG. 6 is a flowchart of an image generation method according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a color filter array of an image sensor according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a grating structure and a pixel unit according to an embodiment of the present application;
FIG. 9(A) is one schematic diagram of diffraction regions of pixel units according to an embodiment of the present application;
FIG. 9(B) is a second schematic diagram of diffraction regions of pixel units according to an embodiment of the present application;
FIG. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 11 is a second schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 12 is a schematic diagram of the hardware structure of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly described below with reference to the drawings in the embodiments of the present application. It is apparent that the described embodiments are some, but not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application fall within the protection scope of the present application.
The terms "first," "second," and the like in the description of the present application, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances such that the embodiments of the application are capable of operation in sequences other than those illustrated or otherwise described herein, and that the objects identified by "first," "second," etc. are generally of a type and do not limit the number of objects, for example, the first object may be one or more. In addition, "and/or" in the specification means at least one of the connected objects, and the character "/", generally means a relationship in which the associated objects are one kind of "or".
The terms "at least one", and the like in the description of the present application refer to any one, any two, or a combination of two or more of the objects that it comprises. For example, at least one of a, b, c (item) may represent: "a", "b", "c", "a and b", "a and c", "b and c" and "a, b and c", wherein a, b, c may be single or plural. Similarly, the term "at least two" means two or more, and the meaning of the expression is similar to the term "at least one".
The terms of art in the embodiments of the present application are explained as follows.
1. Image sensor
An image sensor is a device that converts an optical image into an electronic signal, and is widely used in digital cameras and other optical devices. Image sensors are mainly classified into two types: the charge-coupled device (CCD) and the complementary metal oxide semiconductor (CMOS) active pixel sensor.
2. CMOS active pixel sensor
A CMOS active pixel sensor is an active pixel sensor built with CMOS semiconductor technology. A circuit located near each photosensor directly converts light energy into a voltage signal. Unlike a CCD, it does not transfer signal charge. An analog-to-digital converter may also be required on the motherboard to convert its output signal into a digital signal.
In a CMOS active pixel sensor, a pixel by itself can only sense brightness. To sense color, a color filter array (color filter array, CFA) needs to cover the pixel units; the CFA filters out light of other wavelength bands and lets light of the desired band pass through to be photoelectrically converted by the pixel, giving the pixel the ability to sense color. However, this approach wastes light energy, and its color perception is relatively coarse.
3. Multispectral imaging techniques
Multispectral imaging mainly exploits the fact that objects absorb light of different wavelengths differently, and meets application requirements such as detection and identification through the light intensity variation of a target object at a set of specific wavelengths in the infrared and near-infrared ranges. It differs in certain respects from hyperspectral imaging: for example, hyperspectral imaging characterizes materials by measuring light intensity variation over continuous wavelength ranges, whereas multispectral imaging uses light intensity variation over a specific set of wavelength ranges. As multispectral imaging technology continues to improve, its range of applications keeps expanding, with important applications in fields such as medicine, agriculture, and security inspection.
The earliest applications of multispectral, hyperspectral, and even ultraspectral cameras or imagers came from aerial photography, that is, generally satellite remote sensing. As the name suggests, multispectral imaging divides an incident full-band or broad-band optical signal into several narrow-band beams according to the spectral resolution (the minimum resolvable wavelength interval) and then images them onto a sensor. If the spectral resolution is high enough, multispectral technology can completely sample the incident spectral curve; it has wide application in civil and military fields (unmanned aerial vehicle reconnaissance, agricultural pest and disease monitoring, soil fertility, water pollution monitoring, and the like).
With the development of software and hardware technologies, the cameras on electronic devices have become increasingly powerful; shooting effect and performance have been pushed to the limit, image quality improvement and function integration have reached a bottleneck, and users' expectations of cameras keep rising. This poses a great challenge to the functions, performance, and effects of electronic device cameras.
Fig. 1 is a schematic structural diagram of a camera module in the related art. As shown in fig. 1, a conventional camera module 10 mainly consists of a lens 11 and an image sensor 12.
The image sensor 12 includes a pixel array, timing control, analog signal processing, analog-to-digital conversion, and the like. The pixel array performs photoelectric conversion, converting photons into electrons; the timing control governs the readout and transfer of the electrical signals; and the analog signal processing module denoises the signal.
The pixel array occupies the largest area of the whole chip and consists of individual pixel units, each corresponding to one pixel in the picture. Each pixel includes a photosensitive area and a readout circuit; the signal of each pixel unit is processed by the analog signal processing module, undergoes analog-to-digital conversion, and is then output to the digital processing module.
Fig. 2 is a schematic diagram of a pixel unit in a pixel array on an image sensor in the related art. As shown in fig. 2, the pixel unit includes a microlens 21 (i.e., an on-chip lens), a color filter 22 disposed under the microlens, a support structure 25 disposed under the color filter 22, and a photoelectric signal converter 24 (i.e., a photodiode) disposed under the support structure 25. The space 23 enclosed by the support structure 25 may be filled with air.
In some embodiments, the pixel unit further includes a support structure 26, where the support structure 26 is configured to house the photoelectric signal converter 24.
It should be noted that, for convenience of description, the pixel unit may be referred to as a pixel.
In some embodiments, the lens is a microlens array overlying the photosensitive element, used to collect light at the opening of the photosensitive region of the pixel; it increases photoelectric conversion efficiency and reduces optical signal crosstalk between adjacent pixels. The color filters come in three types (red, green, and blue), each transmitting only light of the corresponding wavelengths. Because of the filter structure, each pixel senses only one color, and the other two color components must be obtained by interpolation from adjacent pixels, that is, by a demosaicing algorithm, as sketched below. The photoelectric signal converter converts the collected optical signal into an electrical signal, which can be read out through the metal routing.
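For background illustration only, and not as part of the present application, the following Python sketch shows the simplest form of the interpolation (demosaicing) idea mentioned above; real demosaicing algorithms are considerably more elaborate, and the function name and neighbor choice are assumptions made for this example.
```python
import numpy as np

def bilinear_green_at_red(raw: np.ndarray, y: int, x: int) -> float:
    # Estimate the missing green component at a red pixel of a Bayer mosaic
    # by averaging its four green neighbors (assumes an interior pixel).
    return float(raw[y - 1, x] + raw[y + 1, x] + raw[y, x - 1] + raw[y, x + 1]) / 4.0
```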
At present, the image sensor of a camera module senses the brightness and color of light much like the human eye: the CFA covering the pixel units mimics the three kinds of cone cells of the human eye, sampling the light reflection curve to form digital signals, which are then processed by an image signal processor (Image Signal Processor, ISP) into the final image. Sensor imaging therefore samples the incident spectral curve at the three primary colors to form three discrete data values, i.e., three spectral samples, which are finally mixed into color and brightness. Current cameras can thus only recognize color and brightness and cannot see the details of the spectral curve; this is metamerism. As a result, many material features and attributes, such as the difference between real and counterfeit banknotes, the degree of water pollution, or the healthiness of skin tone, cannot be recognized by human eyes or cameras. Multispectral technology can give the sensor this capability. Current multispectral techniques fall into two main categories: time-domain multispectral and spatial-domain multispectral. Time-domain multispectral mainly includes the multi-lens type and the beam-splitting type; spatial-domain multispectral is mainly the on-sensor filter (filter on sensor) approach.
Current multispectral technologies have various implementations, but they cannot be miniaturized and thus cannot serve as mobile phone camera modules, and the industry currently has no multispectral image conversion algorithm, so a captured multispectral photo cannot be converted into an image as seen by human eyes. It is therefore difficult for electronic device cameras to integrate multispectral capability, and many corresponding multispectral applications cannot be realized; for example, a mobile phone camera cannot judge the air quality of the environment, identify harmful substances, or assess skin health.
The present application provides a camera module with a controllable grating structure. A grating structure with adjustable grating parameters is disposed on the pixel structure of the camera module. By adjusting the grating parameters of the grating structure above a pixel unit, the light transmittance of the grating structure can be controlled, and the grating structure can be used to diffract the light above the pixel unit and change its optical path, realizing multi-color acquisition and thus hyperspectral or multispectral functionality. This improves the color perception capability and spectral information acquisition capability of the camera module and, in turn, the final image quality and the user experience.
Fig. 3 is a schematic structural diagram of a camera module according to an embodiment of the present application. As shown in fig. 3, the camera module 30 includes a lens 31, a grating structure array 32, and an image sensor 33. The image sensor 33 includes at least two pixel units, a grating structure is correspondingly disposed above each pixel unit, and when a grating structure is energized, its grating parameters change.
In some embodiments, the camera module 30 further includes an infrared filter disposed below the lens 31 along the optical axis.
In some embodiments, the grating structure is formed by electronically controlled liquid crystal components whose grating parameters are controllable. The liquid crystal components may, for example, be made of a liquid crystal material.
In some embodiments, the grating parameters of the grating structure described above may be adjusted. Further, by changing the grating parameters of the grating structure, diffraction of light of a specific wavelength can be performed by the grating structure.
In some embodiments, the grating structure includes at least two grating units, each of the grating units includes a first subunit and a second subunit, the first subunit and the second subunit have equal width values, and when the first subunit and the second subunit are powered on, the light transmittance of the first subunit and the second subunit changes.
In some embodiments, the width values of the first subunit and the second subunit may be 1-20 nm. Illustratively, the width value of the first subunit may be 1nm, 10nm, 20nm, or the like.
It should be noted that the width values of the first subunit and the second subunit may be specifically set according to actual requirements, which is not limited in the embodiment of the present application.
In some embodiments, the first subunit and the second subunit may have a thickness value of 100nm to 600nm. Illustratively, the first subunit may have a thickness of 100nm, 300nm, or 600nm.
The width of a liquid crystal component refers to its dimension in the horizontal direction, and its thickness refers to its dimension in the vertical direction.
In some embodiments, the first subunit and the second subunit are made light transmissive, opaque, or semi-opaque by controlling the voltages of the first subunit and the second subunit.
In some embodiments, one subunit may be composed of at least one liquid crystal component.
Illustratively, one sub-unit may include 1 liquid crystal component, or one sub-unit may include two liquid crystal components.
In some embodiments, the light transmittance is uniform within a single subunit.
The width value of the above-mentioned one sub-unit is the sum of the width values of at least one liquid crystal element constituting the one sub-unit.
In some embodiments, the liquid crystal assembly is made transparent, opaque or semi-transparent by controlling the voltage of each liquid crystal assembly.
In some embodiments, the grating parameter may be a grating pitch. The grating pitch may be the sum of the widths of the light transmissive liquid crystal cell and the adjacent light opaque liquid crystal cell.
In some embodiments, the grating pitch of the grating structure may be N times the width of the grating unit, where N is a positive integer.
Illustratively, assuming that one grating cell includes two subunits, each having a width value of 10nm, the grating pitch may be 20nm.
In some embodiments, the grating parameters are used to control the diffraction angle, the wavelength of the diffracted light, and the diffraction efficiency of the grating structure, and the effect of controlling the light splitting is achieved by designing the grating structure with variable grating parameters.
The relationship among grating parameters, diffraction angles, and wavelengths of diffracted light for the grating structure is described below with reference to formula (1).
m·λ = d(sin α + sin β_m)  (1)
where d is the grating pitch, also called the grating parameter; m is an integer taking values 0, ±1, ±2, and so on; β_m is the diffraction angle; α is the incidence angle; and λ is the wavelength of the light.
Illustratively, taking light in the 400 nm band with an incidence angle of 30° and a diffraction angle of 31° as an example, substituting these values into formula (1) with m = 1 gives sin 31° + sin 30° = 0.515 + 0.5 = 0.4/d, so d = 0.394 μm = 394 nm; the grating parameter may thus be set to 394 nm (approximately 400 nm).
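For illustration only, and not as part of the claimed subject matter, the following Python sketch solves formula (1) for the grating pitch d; the function name and unit conventions are assumptions made for this example.
```python
import math

def grating_pitch_um(wavelength_um, incidence_deg, diffraction_deg, order=1):
    # Solve formula (1), m * lambda = d * (sin(alpha) + sin(beta_m)), for d.
    alpha = math.radians(incidence_deg)
    beta_m = math.radians(diffraction_deg)
    return order * wavelength_um / (math.sin(alpha) + math.sin(beta_m))

# Worked example from the text: 400 nm light, 30 deg incidence, 31 deg diffraction.
d = grating_pitch_um(0.400, 30.0, 31.0)
print(f"d = {d:.3f} um = {d * 1000:.0f} nm")  # d = 0.394 um = 394 nm
```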
In other words, the diffraction angle, the wavelength of the diffracted light, and the diffraction efficiency are all controlled by the grating parameters, and the effect of controllable light splitting can be achieved by designing a grating structure with variable grating parameters.
It should be noted that a grating structure is an optical element with a period, which may be defined by peaks and valleys embossed on the surface of the material; these cause a periodic change in the refractive index n of the material.
In some embodiments, the period of the grating structure may be at the micro-nano scale, the same order of magnitude as the wavelength of visible light (450-700 nm), so that the deflection of light can be effectively controlled.
The "spectroscopic" principle of the diffraction grating is described below:
Assume the incident light is green light of a single wavelength. The diffraction grating splits it into a plurality of diffraction orders, each propagating in a different direction, including reflected diffraction orders (R0, R±1, R±2, ...) and transmitted diffraction orders (T0, T±1, T±2, ...). The diffraction angle θ_m (m = ±1, ±2, ...) of each order is determined by the incidence angle θ of the light and the period Λ of the grating. By designing the parameters of the grating (e.g., material refractive index n, grating shape, thickness, duty cycle), the diffraction efficiency of a certain order (i.e., a certain direction) can be maximized, so that after diffraction most of the light propagates mainly in that direction.
In some embodiments, 0th-order diffraction is simply transmission (refraction) of the incident light, and the energy in this portion does not change direction. The other diffraction orders can be adjusted by controlling the grating parameters so that energy is concentrated at a particular order, typically mainly the 1st order.
In some embodiments, the grating structure may be a separate grating structure; alternatively, the grating structure may be a part of the grating region in the array of grating structures.
Illustratively, the grating structure array includes a plurality of grating regions, each corresponding to one pixel unit in the pixel array of the image sensor, and one grating region corresponding to one pixel unit may be referred to as one grating structure.
In some embodiments, one pixel unit in the image sensor 33 corresponds to at least one grating structure.
Fig. 4 (A) is a schematic structural diagram of a grating structure according to an embodiment of the present application. As shown in fig. 4 (A), the grating structure includes 13 grating units; each grating unit includes two subunits, each subunit consists of one liquid crystal component, and each liquid crystal component has a width of 10 nm, so one grating unit has a width of 20 nm, that is, the grating pitch is 20 nm. The grating structure is disposed on a substrate. The voltage of each liquid crystal component is controlled to make it transmissive, opaque, or semi-transmissive, thereby forming a grating structure with whatever grating parameters are required.
In some embodiments, the liquid crystal components corresponding to reference numeral "1" are opaque, and the liquid crystal components corresponding to reference numeral "2" are fully light-transmissive.
Fig. 4 (B) is another schematic structural diagram of the grating structure according to an embodiment of the present application. As shown in fig. 4 (B), the grating structure includes 6 grating units; each grating unit includes two subunits, each subunit consists of two liquid crystal components, and each liquid crystal component has a width of 10 nm, so one grating unit has a width of 40 nm, that is, the grating pitch is 40 nm.
Fig. 4 (C) is another schematic structural diagram of the grating structure according to an embodiment of the present application. As shown in fig. 4 (C), the grating structure includes 26 liquid crystal components and is in a fully transmissive state.
It should be noted that if the grating structure is not required to diffract light, all the liquid crystal components can be controlled to remain fully transmissive, in which case the grating structure can be regarded as a plane lens.
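Purely for illustration, the binary on/off patterns of the liquid crystal components in figs. 4 (A) to 4 (C) can be modeled as an array of per-component transmittances. The function below is a sketch under that assumption and is not the implementation of the present application.
```python
def grating_pattern(num_components: int, components_per_period: int) -> list[float]:
    """Per-component transmittance: 1.0 = fully transmissive, 0.0 = opaque.
    components_per_period = 0 models the fully transmissive plane of fig. 4 (C)."""
    if components_per_period == 0:
        return [1.0] * num_components
    half = components_per_period // 2  # one subunit is half a grating unit
    return [1.0 if (i // half) % 2 == 0 else 0.0 for i in range(num_components)]

COMPONENT_WIDTH_NM = 10.0  # width of one liquid crystal component (from the text)

fig_4a = grating_pattern(26, 2)  # fig. 4 (A): 1-component subunits, pitch 20 nm
fig_4b = grating_pattern(24, 4)  # fig. 4 (B): 2-component subunits, pitch 40 nm
fig_4c = grating_pattern(26, 0)  # fig. 4 (C): fully transmissive plane lens
```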
In some embodiments, a grating structure corresponding to each pixel unit in a pixel array of an image sensor is used to diffract light on the pixel unit onto other pixel units adjacent to the pixel unit, so as to enhance the light sensing capability of the adjacent pixel units.
Illustratively, the grating structure on a second pixel cell of the first row in the pixel array may diffract light on the second pixel cell onto the first pixel cell of the first row in the pixel array.
Fig. 5 is a schematic structural diagram of a pixel unit in a pixel array of an image sensor according to an embodiment of the present application. As shown in fig. 5, the pixel unit includes a microlens 51 (i.e., an on-chip lens), a filter 52 disposed below the microlens (the filter 52 may be a white filter), a support structure 55 disposed below the filter 52, and a photoelectric signal converter 54 (i.e., a photodiode) disposed below the support structure 55.
In some embodiments, the space enclosed by the support structure 55 may be filled with air.
In some embodiments, the pixel unit further includes a support structure 56, the support structure 56 being configured to house the photoelectric signal converter 54.
In some embodiments, the image sensor is overlaid with a monochromatic color filter array (Color Filter Array, CFA), i.e., a white filter, so that all bands of light pass through and are perceived by pixels on the image sensor.
A white filter is a fully transmissive filter that does not filter light; that is, all visible light reaching the white filter passes through it.
The image generation method provided in the embodiments of the present application may be executed by an electronic device, or by at least one of a functional module and an entity module in the electronic device capable of implementing the image generation method; this may be determined according to actual use requirements and is not limited in the embodiments of the present application.
The image generating method provided by the embodiment of the application is described in detail below through specific embodiments and application scenes thereof with reference to the accompanying drawings.
Fig. 6 is a flowchart of an image generation method according to an embodiment of the present application. The method may be applied to an electronic device that includes the above-described camera module. As shown in fig. 6, the image generation method may include the following steps S201 to S203:
step S201: the electronic equipment controls the grating structure in the camera module to completely transmit light, and the first image data is acquired through the image sensor in the camera module.
In some embodiments, the electronic device may control the entire grating structure array to be fully transmissive, that is, the grating structure (or grating region) corresponding to each pixel unit in the image sensor is fully transmissive, and then collect the first image data.
It should be noted that, the explanation of the grating structure may be referred to above, and will not be repeated here.
In some embodiments, the light transmission characteristics of the liquid crystal assembly may be controlled by applying different voltages across the liquid crystal assembly of the grating structure.
Illustratively, when the voltage applied across the liquid crystal component is less than a voltage threshold, the component is fully transmissive; as the applied voltage gradually increases, the transmittance of the component changes, transitioning from fully transmissive toward fully opaque; and once the applied voltage exceeds the upper end of this range, the transmittance no longer changes with the applied voltage, that is, the component is fully opaque.
The voltage threshold may, for example, be set to a value in the range of 1 V to 5 V.
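The voltage-to-transmittance behavior described above can be sketched as a simple piecewise function. The linear ramp and the specific endpoint values below are illustrative assumptions; the text only states that the threshold may lie in the 1 V to 5 V range.
```python
def lc_transmittance(voltage_v: float, v_full: float = 1.0, v_opaque: float = 5.0) -> float:
    # Assumed curve: fully transmissive below v_full, fully opaque above v_opaque,
    # and a linear transition in between (the actual curve is device-dependent).
    if voltage_v <= v_full:
        return 1.0
    if voltage_v >= v_opaque:
        return 0.0
    return 1.0 - (voltage_v - v_full) / (v_opaque - v_full)

for v in (0.5, 3.0, 6.0):
    print(v, lc_transmittance(v))  # 1.0, 0.5, 0.0
```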
Illustratively, by controlling the grating structure to be in a fully transmissive state, the grating structure may be made to not diffract incident light.
In some embodiments, the first image data may be monochrome image data.
In some embodiments, the first image data may be image data acquired by the image sensor based on incident full-band visible light.
In some embodiments, the first image data includes spectral energy information of light incident on each pixel unit.
Fig. 7 is a schematic diagram of a color filter array (Color Filter Array, CFA) of an image sensor according to an embodiment of the present disclosure. As shown in fig. 7, the color filter array is a monochromatic color filter array, that is, a white filter layer.
It should be noted that the above-mentioned monochromatic color filter array may be referred to as a mono CFA.
Illustratively, in connection with fig. 7, each pixel unit of the image sensor is covered with a mono CFA, that is, a white filter, and when the image sensor performs image data acquisition, all bands of visible light pass through and are perceived by the pixels.
Compared with a conventional image sensor, the image sensor provided in the embodiments of the present application uses a white filter layer in place of a Bayer filter layer, and can thus obtain a larger light intake and higher sensitivity than a color image sensor with a Bayer filter layer.
Step S202: Based on a diffraction angle, the wavelength of the diffracted light, and the incidence angle of light on a first pixel unit of the image sensor, the electronic device changes the grating parameters of the grating structure corresponding to the first pixel unit, so as to control the grating structure to diffract first light onto a second pixel unit adjacent to the first pixel unit, and collects second image data through the image sensor.
In some embodiments, the electronic device may determine the grating parameter based on the diffraction angle, the wavelength of the diffracted light, and the angle of incidence of the light into the first pixel cell of the image sensor.
Illustratively, in combination with the above formula (1), assuming that the diffraction angle is 31 °, the wavelength of the diffracted light is 400nm, and the incident angle of the light incident on the image sensor is 30 °, substituting the above parameter into the formula (1) results in d=0.394 μm, that is, the grating parameter is 394nm.
In some embodiments, the electronic device may change the light transmission characteristic of the liquid crystal component of the grating structure by adjusting the grating parameter, so that the light on the first pixel unit is diffracted to the adjacent second pixel unit by the grating structure.
In some embodiments, the first pixel unit may be any pixel unit in a pixel array of the image sensor, and the second pixel unit may be a pixel unit adjacent to the first pixel unit.
In some embodiments, the diffraction angle may be a preset diffraction angle, or the diffraction angle may be determined according to the positions of the first pixel unit and the second pixel unit.
In some embodiments, the wavelength of the diffracted light may be a predetermined wavelength.
In some embodiments, the wavelength range of the diffracted light may be 300nm to 1100nm.
In some embodiments, the angle of incidence of light rays incident on the image sensor may be determined from lens parameters.
It should be noted that, viewed from the sensor side of the lens, the maximum angle of light that can be focused onto a pixel unit is defined as the chief ray angle (Chief Ray Angle, CRA). Near the lens axis the CRA is close to zero, and it increases with distance from the lens axis. The CRA is therefore related to the position of the pixel on the sensor.
In some embodiments, the light incident on the first pixel unit of the image sensor may be full-band light.
In some embodiments, the first light may be a light of a specific wavelength that is desired to be diffracted to an adjacent pixel unit.
In some embodiments, the grating structure corresponding to the first pixel unit may be a grating structure area above the first pixel unit in the grating structure array; alternatively, the grating structure corresponding to the first pixel unit may be an independent grating structure disposed above the first pixel unit.
In some embodiments, the grating structure may be a grating region located above the first pixel unit and facing at least a portion of the pixel region of the first pixel unit.
For example, at least a part of the pixel region of the first pixel unit may be a left half of the pixel region, a right half of the pixel region, or a middle half of the pixel region of the first pixel unit.
Fig. 8 is a schematic diagram of the positional relationship between a grating structure and a pixel unit provided in an embodiment of the present application. As shown in fig. 8, a grating structure is located directly above each pixel unit and is used to diffract light from adjacent pixel units at different positions onto that pixel unit. Fig. 8 illustrates three pixel units: after visible light reaches the left and right pixel units, the grating structures (i.e., grating structure regions) above them can diffract light of a specific wavelength band from the left or right pixel unit onto the middle pixel unit, improving the photosensitivity of the middle pixel unit.
In some embodiments, a first light of the light may be diffracted to a second pixel unit adjacent to the first pixel unit by the grating structure to increase the sensitivity of the second pixel unit.
Illustratively, in combination with the above example, take the first pixel unit to be the second pixel unit of the first row in the pixel array of the image sensor and the second pixel unit to be the first pixel unit of the first row; for convenience of description, the second pixel unit of the first row is referred to as pixel A and the first pixel unit of the first row as pixel B. If light with a wavelength of 400 nm on pixel A needs to be diffracted onto pixel B, the grating pitch of the grating structure above pixel A can be adjusted to 394 nm, so that the 400 nm light is diffracted from pixel A onto pixel B, improving the photosensitivity of pixel B to 400 nm light.
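As a quick numerical check (illustrative only, using the same assumed angles as the earlier example), inverting formula (1) confirms that a 394 nm pitch steers light of roughly 400 nm toward the neighboring pixel:
```python
import math

def diffracted_wavelength_um(pitch_um, incidence_deg, diffraction_deg, order=1):
    # Invert formula (1): lambda = d * (sin(alpha) + sin(beta_m)) / m.
    return pitch_um * (math.sin(math.radians(incidence_deg)) +
                       math.sin(math.radians(diffraction_deg))) / order

print(diffracted_wavelength_um(0.394, 30.0, 31.0))  # ~0.400 um, i.e., ~400 nm
```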
In some embodiments, the second image data is image data collected by the image sensor after the first light on the first pixel unit is diffracted to a second pixel unit adjacent to the first pixel unit by the grating structure.
In some embodiments, the second image data includes the spectral energy information of the light on the second pixel unit itself and the spectral energy information of the light diffracted from the first pixel unit to the second pixel unit.
Fig. 9 (a) is a schematic diagram of a diffraction region of a pixel unit provided in an embodiment of the present application, in fig. 9 (a), a first two pixel units in a first row and a second two pixel units in a first row in a pixel array of an image sensor are taken as schematic diagrams, four pixel units are respectively monochrome pixel units, that is, pixel units are covered with a white filter, and cyan light in a left region (i.e., a shadow region) of the first row second pixel unit can be diffracted to a right region (i.e., a region labeled "C") of the first row first pixel unit, so that the first row first pixel can obtain the diffracted cyan light. Similarly, the right area (i.e., the area marked with "Y") of the second pixel unit in the first row can obtain the yellow light diffracted from the left area of the right pixel; yellow light in the left area of the second pixel unit of the second row can be diffracted to the right area (namely, the area marked with 'Y') of the first pixel unit of the second row, so that the first pixel unit of the second row can acquire the diffracted yellow light; the right area of the second row of second pixel cells (i.e., the area labeled "M") is capable of capturing magenta light diffracted from the left area of the right pixel cells.
The following embodiment exemplifies an image generation method provided in the embodiment of the present application with reference to fig. 9 (a) described above.
Example 1: taking the first pixel unit as the first row of the second pixel of the image sensor, the second pixel as the first row of the first pixel unit of the image sensor, the first light as the cyan light, and the wavelength of the diffracted light as the wavelength corresponding to the cyan light as an example, each pixel unit on the image sensor covers the mono CFA, that is, the white filter, so that the light rays of all wave bands can pass through and are perceived by the pixel units, and under the condition, the first exposure is performed to obtain the first image data. And then, according to the incidence angle of the light on the pixel units and the angle of diffraction wanted, calculating the required grating parameters, and controlling the light transmission characteristic of the grating structure on the second pixel units of the first row according to the grating parameters, so that the cyan light in the left area of the second pixel units of the first row is diffracted to the first pixel units of the first row. Thus, the light of the pixel unit and the cyan light diffracted by the partial area of the pixel unit on the right are obtained on the first pixel unit in the first row, and in this case, the second exposure is performed to obtain the second image data. The other pixel units are similar, the grating structure of the third pixel unit in the first row can be controlled, and yellow light on the third pixel unit in the first row is diffracted to the second pixel unit in the first row through the grating structure; controlling a grating structure of a second pixel of the second row, and diffracting yellow light on a second pixel unit of the second row to a first pixel unit of the second row through the grating structure; and controlling the grating structure of the third pixels of the second row, and diffracting magenta light on the third pixel units of the second row to the second pixel units of the second row through the grating structure. In this case, a second exposure is performed to obtain second image data containing the spectral energy of each pixel unit itself and the spectral energy of the specific light diffracted from the adjacent pixel unit. In this way, a variety of desired light rays can be diffracted into the pixel cell, resulting in a variety of spectral energies of the light rays.
In this way, various desired light rays can be diffracted to adjacent pixel units through the grating structures disposed above the pixel units, so that the spectral energy of a variety of light rays can be obtained.
Fig. 9 (B) is another schematic diagram of diffraction regions of pixel units provided in an embodiment of the present application. Fig. 9 (B) shows the first two pixel units of the first row and the first two pixel units of the second row in the pixel array of the image sensor; all four are monochrome pixel units, that is, covered with a white filter. Red light in the left region (i.e., the shaded region) of the second pixel unit of the first row can be diffracted to the right region (i.e., the region labeled "R") of the first pixel unit of the first row, so that the first pixel unit of the first row obtains the diffracted red light. Similarly, the right region of the second pixel unit of the first row (i.e., the region labeled "G") obtains green light diffracted from the left region of the pixel unit to its right; green light in the left region (i.e., the shaded region) of the second pixel unit of the second row can be diffracted to the right region (i.e., the region labeled "G") of the first pixel unit of the second row, so that the first pixel unit of the second row obtains the diffracted green light; and the right region of the second pixel unit of the second row (i.e., the region labeled "B") obtains blue light diffracted from the left region of the pixel unit to its right.
The following embodiment exemplifies an image generation method provided in the embodiment of the present application with reference to fig. 9 (B) described above.
Example 2: taking the first pixel unit as the first row of the second pixel unit of the image sensor, the second pixel unit as the first row of the first pixel unit of the image sensor, the first light is red light, and the wavelength of the diffracted light is the wavelength corresponding to the red light as an example, since each pixel unit on the image sensor covers the mono CFA, that is, the white filter, light rays of all wave bands pass through and are perceived by the pixel units, and under the condition, the first exposure is performed to obtain the first image data. And then, according to the incidence angle of the light on the pixel units and the angle of diffraction wanted, calculating the required grating parameters, and controlling the light transmission characteristic of the grating structure on the second pixel units in the first row according to the grating parameters, so that the red light in the left area of the second pixel units in the first row is diffracted to the first pixel units in the first row. Thus, the light of the pixel unit and the R channel light diffracted by the partial area of the right pixel are obtained from the first pixel unit in the first row, and other pixels are similar, for example, the grating structure of the third pixel unit in the first row is controlled, and the green light on the third pixel unit in the first row is diffracted to the second pixel unit in the first row through the grating structure; controlling the grating structure of the second pixel unit of the second row, and diffracting the green light on the second pixel unit of the second row to the first pixel unit of the second row through the grating structure; and controlling the grating structure of the third pixel unit of the second row, and diffracting blue light on the third pixel unit of the second row to the second pixel unit of the second row through the grating structure. In this case, the second exposure is performed to obtain second image data containing the spectral energy of each pixel unit itself and the spectral energy of the specific light rays (i.e., red light rays, green light rays, and blue light rays) diffracted from the adjacent pixel units are superimposed. In this way, various desired light rays can be diffracted to the adjacent pixel units through the grating structure arranged above the pixel units, so that the spectral energy of various light rays can be obtained.
Step S203: the electronic device generates a first image according to the first image data and the second image data.
Wherein the first light is related to the diffraction angle and the wavelength of the diffracted light.
In some embodiments, the electronic device may compute the difference between the second image data and the first image data.
It should be noted that the first image data includes the spectral energy information of the light incident on each pixel unit, while the second image data includes both the spectral energy information of the light incident on each pixel unit and the spectral energy information of the light diffracted from adjacent pixel units; therefore, differencing the second image data with the first image data yields the spectral energy information of the diffracted light on each pixel unit.
According to the image generation method provided in the embodiments of the present application, when an image is acquired, the grating structure in the camera module is first controlled to fully transmit light; at this time the grating structure has no diffraction function, and first image data is collected through the image sensor. Then, based on a diffraction angle, the wavelength of the diffracted light, and the incidence angle of light on a first pixel unit of the image sensor, the grating parameters of the grating structure corresponding to the first pixel unit are changed; at this time the grating structure can diffract first light of a specific wavelength onto the pixel unit adjacent to the first pixel unit, and second image data is collected through the image sensor. Finally, a first image is generated from the first image data and the second image data. In this way, the optical path of light on a pixel unit of the image sensor can be changed to diffract specific light, so that image data corresponding to light of any wavelength can be obtained. Multispectral information can thus be acquired through the image sensor, improving the color perception capability and spectral information acquisition capability of the camera module and the final image quality.
In some embodiments, the step S203 may include the following steps S203a and S203b:
step S203a: the electronic device subtracts the second image data from the first image data to obtain third image data.
Step S203b: and the electronic equipment generates a first image according to the third image data.
Wherein the third image data includes energy information of the first light.
In some embodiments, the first image data and the second image data may be matrix image data; the difference between them can be obtained by an elementwise difference operation, and from this difference, information containing only the diffracted specific light can be obtained.
For example, assuming that the first image data is represented by matrix 1 and the second image data is represented by matrix 2, the difference obtained by subtracting matrix 1 from matrix 2 is the difference of the image data.
In some embodiments, in combination with Example 1 above, a first pixel unit in the image data obtained by the first exposure (i.e., the first image data) contains white-light information, denoted W. The corresponding pixel unit in the image data obtained by the second exposure (i.e., the second image data) contains both white-light information and cyan-light information, the cyan-light information being denoted C; that is, the image data of this pixel unit in the second exposure can be expressed as W + C. Subtracting the image data obtained by the first exposure from the image data obtained by the second exposure leaves the diffracted cyan-light information, i.e., C. The same applies to the specific light diffracted into other pixels, such as yellow-light information and magenta-light information, so that a monochrome image and an image containing cyan (Cyan), magenta (Magenta), and yellow (Yellow) light information can be obtained.
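Numerically, steps S203a and S203b amount to an elementwise matrix subtraction. The sketch below uses made-up 2x2 values purely to illustrate (W + C) - W = C; the numbers are not measured data.
```python
import numpy as np

# Made-up example values (arbitrary units).
first_image_data = np.array([[100., 120.],   # W: white-light energy per pixel
                             [ 90., 110.]])
diffracted_cyan  = np.array([[ 30.,   0.],   # C: cyan energy diffracted in from neighbors
                             [  0.,  25.]])
second_image_data = first_image_data + diffracted_cyan  # W + C, as captured in exposure 2

third_image_data = second_image_data - first_image_data
print(third_image_data)  # recovers C, the energy of the diffracted light only
```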
In some embodiments, when information of light rays in a plurality of wavelength bands needs to be obtained on one pixel unit, multiple exposure shooting may be performed by the image sensor, and each exposure may obtain information of light rays in a specific wavelength band diffracted onto one pixel unit, and after multiple exposure, information of light rays in a plurality of specific wavelength bands diffracted onto one pixel unit may be obtained.
Illustratively, the 300nm to 1100nm range may be separated into 9 bands. A monochrome image can be obtained by the first exposure, and the information of the light in each separated band can be obtained by the nine subsequent exposures, one band per exposure.
It should be noted that separating into 9 bands is only one possible example; the range may instead be separated into another number of bands, for example 12, according to actual requirements, which is not limited in the embodiments of the present application.
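As a sketch of how such a multi-exposure acquisition might be organised, the following assumes an even split of the 300nm to 1100nm range into 9 bands and reuses the hypothetical sensor and grating handles from the earlier sketch:

```python
import numpy as np

LOW_NM, HIGH_NM, NUM_BANDS = 300.0, 1100.0, 9

# Band edges: 300.0, 388.9, 477.8, ... 1100.0 (nine bands in total).
band_edges = np.linspace(LOW_NM, HIGH_NM, NUM_BANDS + 1)
bands = list(zip(band_edges[:-1], band_edges[1:]))

def capture_multispectral(sensor, grating):
    # Exposure 0: grating fully transmissive -> monochrome reference frame.
    grating.set_fully_transmissive()
    reference = sensor.capture()

    # Exposures 1..9: one band is diffracted onto the pixel unit per exposure.
    band_frames = {}
    for low, high in bands:
        center_nm = 0.5 * (low + high)
        grating.set_parameters(wavelength_nm=center_nm)  # hypothetical call
        frame = sensor.capture()
        band_frames[(low, high)] = frame - reference     # per-band energy
    return reference, band_frames
```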
In some embodiments, by diffracting the desired light onto a pixel unit through the grating structure, the information of light in each band can be acquired. This improves the color sensing capability of the image sensor and opens up more possibilities for multispectral applications of the image sensor, such as automatic environment detection and harmful substance identification.
The foregoing method embodiments, or the various possible implementations within them, may be executed separately, or may be executed in combination with one another provided that no contradiction arises; the specific choice may be determined according to actual use requirements, which is not limited by the embodiments of the present application.
The execution subject of the image generation method provided by the embodiment of the application may be an electronic device. In the embodiment of the present application, the image generation method being executed by an electronic device is taken as an example for description.
Fig. 10 is a schematic structural diagram of an electronic device provided in an embodiment of the present application. As shown in fig. 10, the electronic device 800 may include the camera module provided in the embodiments of the present application, and may further include a control module 801 and a generation module 802, wherein: the control module 801 is configured to control the grating structure in the camera module to completely transmit light, and collect first image data through the image sensor in the camera module; the control module 801 is further configured to change the grating parameters of the grating structure corresponding to a first pixel unit based on the diffraction angle, the wavelength of the diffracted light, and the incidence angle of the light into the first pixel unit of the image sensor, so as to control the grating structure to diffract the first light in the light to a second pixel unit adjacent to the first pixel unit, and collect second image data through the image sensor; and the generation module 802 is configured to generate a first image according to the first image data and the second image data.
In some embodiments, the diffraction angle is determined according to the positions of the first pixel unit and the second pixel unit, and the wavelength of the diffracted light ranges from 300nm to 1100nm.
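The embodiment does not spell out the relationship, but under the standard diffraction grating equation d(sin θ_d − sin θ_i) = mλ, a diffraction angle fixed by the pixel-unit geometry determines the grating pitch required for a given wavelength. The following sketch rests on that assumption, and the pixel pitch and grating-to-sensor distance are invented example values, not figures from this application:

```python
import math

def required_grating_pitch_um(pixel_pitch_um, grating_height_um,
                              wavelength_nm, incidence_angle_deg, order=1):
    # Diffraction angle set by the geometry: the first light must be steered
    # from the first pixel unit onto the adjacent second pixel unit.
    theta_d = math.atan2(pixel_pitch_um, grating_height_um)
    theta_i = math.radians(incidence_angle_deg)
    # Standard grating equation: d * (sin(theta_d) - sin(theta_i)) = m * lambda
    wavelength_um = wavelength_nm / 1000.0
    return order * wavelength_um / (math.sin(theta_d) - math.sin(theta_i))

# Example: 1 um pixel pitch, grating 2 um above the sensor, 500 nm light
# at normal incidence (all values illustrative).
d = required_grating_pitch_um(1.0, 2.0, 500.0, 0.0)
print(f"required pitch = {d:.3f} um")   # -> required pitch = 1.118 um
```

In this reading, sweeping the grating parameters from exposure to exposure selects which wavelength lands on the neighbouring pixel unit, consistent with the 300nm to 1100nm range stated above.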
In some embodiments, the generation module is specifically configured to subtract the first image data from the second image data to obtain third image data, and to generate the first image according to the third image data, wherein the third image data includes energy information of the first light ray.
When the electronic device provided by the embodiment of the application acquires an image, the grating structure in the camera module of the electronic device is first controlled to completely transmit light; at this moment the grating structure has no diffraction function, and first image data are collected through the image sensor. Then, based on the diffraction angle, the wavelength of the diffracted light, and the incidence angle of the light into the first pixel unit of the image sensor, the grating parameters of the grating structure corresponding to the first pixel unit are changed; at this moment the grating structure can diffract the first light with a specific wavelength onto the pixel unit adjacent to the first pixel unit, and second image data are collected through the image sensor. Finally, a first image is generated according to the first image data and the second image data. In this way, the light path of the light over the pixel units of the image sensor can be changed to diffract specific light, so that image data corresponding to light of any wavelength can be obtained and multispectral information can be acquired through the image sensor, which improves the color perception capability and the spectral information acquisition capability of the camera module and improves the final image quality.
The electronic device in the embodiment of the present application may be a terminal or a device other than a terminal. By way of example, the electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a mobile internet device (MID), an augmented reality (AR)/virtual reality (VR) device, a robot, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), and may also be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine, a self-service machine, or the like; the embodiments of the present application are not specifically limited thereto.
The electronic device in the embodiment of the application may be a device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The electronic device provided in the embodiment of the present application can implement each process of the method embodiments of fig. 1 to fig. 9(B) and achieve the same technical effects; to avoid repetition, details are not repeated here.
As shown in fig. 11, the embodiment of the present application further provides an electronic device 900, which includes a processor 901 and a memory 902, where the memory 902 stores a program or instructions executable on the processor 901. When executed by the processor 901, the program or instructions implement each step of the above image generation method embodiments and can achieve the same technical effects; to avoid repetition, details are not repeated here.
The electronic device in the embodiment of the application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 12 is a schematic diagram of the hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 100 includes, but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, and processor 110.
In some embodiments, the electronic device 100 may further include a camera module provided in the embodiments of the present application.
Those skilled in the art will appreciate that the electronic device 100 may further include a power source (e.g., a battery) for powering the various components; the power source may be logically coupled to the processor 110 via a power management system, so that functions such as managing charging, discharging, and power consumption are performed through the power management system. The electronic device structure shown in fig. 12 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than illustrated, combine certain components, or arrange the components differently, which is not described in detail herein.
The processor 110 is configured to control the grating structure in the camera module to completely transmit light, and collect first image data through the image sensor; the processor 110 is further configured to change grating parameters of a grating structure corresponding to a first pixel unit based on a diffraction angle, a wavelength of diffracted light, and an incident angle of light incident on the first pixel unit of the image sensor, so as to control the grating structure to diffract a first light in the light to a second pixel unit adjacent to the first pixel unit, and collect second image data through the image sensor; the processor 110 is configured to generate a first image according to the first image data and the second image data.
In some embodiments, the diffraction angle is determined according to the positions of the first pixel unit and the second pixel unit, and the wavelength of the diffracted light ranges from 300nm to 1100nm.
In some embodiments, the processor 110 is specifically configured to subtract the first image data from the second image data to obtain third image data, and to generate the first image according to the third image data, wherein the third image data includes energy information of the first light ray.
When the electronic device provided by the embodiment of the application acquires an image, the grating structure in the camera module of the electronic device is first controlled to completely transmit light; at this moment the grating structure has no diffraction function, and first image data are collected through the image sensor. Then, based on the diffraction angle, the wavelength of the diffracted light, and the incidence angle of the light into the first pixel unit of the image sensor, the grating parameters of the grating structure corresponding to the first pixel unit are changed; at this moment the grating structure can diffract the first light with a specific wavelength onto the pixel unit adjacent to the first pixel unit, and second image data are collected through the image sensor. Finally, a first image is generated according to the first image data and the second image data. In this way, the light path of the light over the pixel units of the image sensor can be changed to diffract specific light, so that image data corresponding to light of any wavelength can be obtained and multispectral information can be acquired through the image sensor, which improves the color perception capability and the spectral information acquisition capability of the camera module and improves the final image quality.
It should be appreciated that, in embodiments of the present application, the input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042; the graphics processing unit 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The display unit 106 may include a display panel 1061, which may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 107 includes at least one of a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may include a touch detection device and a touch controller. Other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a first memory area storing programs or instructions and a second memory area storing data, where the first memory area may store an operating system and the application programs or instructions required for at least one function (such as a sound playing function, an image playing function, etc.). Further, the memory 109 may include volatile memory or nonvolatile memory, or both. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), a static RAM (SRAM), a dynamic RAM (DRAM), a synchronous DRAM (SDRAM), a double data rate SDRAM (DDR SDRAM), an enhanced SDRAM (ESDRAM), a synchlink DRAM (SLDRAM), or a direct rambus RAM (DRRAM). The memory 109 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
The processor 110 may include one or more processing units. Optionally, the processor 110 integrates an application processor, which primarily handles operations involving the operating system, user interface, application programs, and the like, and a modem processor (such as a baseband processor), which primarily handles wireless communication signals. It will be appreciated that the modem processor may alternatively not be integrated into the processor 110.
The embodiment of the present application further provides a readable storage medium, where a program or instructions are stored. When executed by a processor, the program or instructions implement the processes of the above image generation method embodiments and can achieve the same technical effects; to avoid repetition, details are not repeated here.
Wherein, the processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a computer read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The embodiment of the application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or instructions to implement the processes of the above image generation method embodiments and achieve the same technical effects; to avoid repetition, details are not repeated here.
It should be understood that the chip referred to in the embodiments of the present application may also be referred to as a system-on-chip, a system chip, a chip system, or a system-on-a-chip, etc.
The embodiments of the present application further provide a computer program product stored in a storage medium, where the program product is executed by at least one processor to implement the processes of the above image generation method embodiments and achieve the same technical effects; to avoid repetition, details are not repeated here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element. Furthermore, it should be noted that the scope of the methods and apparatus in the embodiments of the present application is not limited to performing the functions in the order shown or discussed; the functions may also be performed in a substantially simultaneous manner or in the reverse order, depending on the functions involved. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by software plus a necessary general hardware platform, or by hardware alone, although in many cases the former is the preferred implementation. Based on such understanding, the technical solutions of the present application, or the part thereof contributing to the prior art, may be embodied in the form of a computer software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk), including several instructions for causing a terminal (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the methods described in the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive. Inspired by the present application, those of ordinary skill in the art may devise many other forms without departing from the spirit of the present application and the scope of the claims, all of which fall within the protection of the present application.

Claims (12)

1. An image pickup module, characterized by comprising an image sensor and a grating structure, wherein the image sensor comprises at least two pixel units, one grating structure is correspondingly arranged above each pixel unit, and the grating parameters of the grating structure change under the condition that the grating structure is powered on.
2. The camera module of claim 1, wherein the grating structure comprises at least two grating units, the grating units comprise a first subunit and a second subunit, the first subunit and the second subunit have equal width values, and the light transmittance of the first subunit and the second subunit changes when the first subunit and the second subunit are powered on.
3. The camera module of claim 2, wherein the grating pitch of the grating structure is N times the width value of the grating unit, where N is a positive integer.
4. The camera module of claim 2, wherein the first subunit and the second subunit have a width value of 1-20 nm.
5. An image generation method performed by an electronic device, wherein the electronic device comprises the camera module of any one of claims 1 to 4, the method comprising:
controlling the grating structure in the camera module to completely transmit light, and collecting first image data through an image sensor in the camera module;
changing grating parameters of the grating structure corresponding to the first pixel unit based on a diffraction angle, the wavelength of diffracted light and the incidence angle of the light into the first pixel unit of the image sensor so as to control the grating structure to diffract the first light in the light to a second pixel unit adjacent to the first pixel unit, and collecting second image data through the image sensor;
and generating a first image according to the first image data and the second image data.
6. The method of claim 5, wherein the diffraction angle is determined according to the positions of the first pixel unit and the second pixel unit, and the wavelength of the diffracted light ranges from 300nm to 1100nm.
7. The method of claim 5 or 6, wherein the generating a first image from the first image data and the second image data comprises:
subtracting the first image data from the second image data to obtain third image data;
generating the first image according to the third image data;
wherein the third image data includes energy information of the first light ray.
8. An electronic device, characterized in that the electronic device comprises the camera module according to any one of claims 1 to 4, and further comprises a control module and a generation module, wherein:
the control module is used for controlling the grating structure in the camera module to completely transmit light and collecting first image data through the image sensor in the camera module;
the control module is further configured to change the grating parameters of the grating structure corresponding to a first pixel unit based on a diffraction angle, the wavelength of the diffracted light, and the incident angle of the light incident into the first pixel unit of the image sensor, so as to control the grating structure to diffract the first light in the light to a second pixel unit adjacent to the first pixel unit, and collect second image data through the image sensor;
the generation module is used for generating a first image according to the first image data and the second image data.
9. The apparatus of claim 8, wherein the diffraction angle is determined according to positions of the first pixel unit and the second pixel unit, and the wavelength of the diffracted light ranges from 300nm to 1100nm.
10. The apparatus according to claim 8 or 9, wherein the generation module is specifically configured to subtract the first image data from the second image data to obtain third image data;
the generation module is specifically configured to generate the first image according to the third image data;
wherein the third image data includes energy information of the first light ray.
11. An electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the image generation method of any of claims 5-7.
12. A readable storage medium, characterized in that the readable storage medium has stored thereon a program or instructions which, when executed by a processor, implement the steps of the image generation method according to any of claims 5-7.
CN202311512300.3A 2023-11-13 2023-11-13 Image pickup module, image generation method, electronic device, and readable storage medium Pending CN117596494A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311512300.3A CN117596494A (en) 2023-11-13 2023-11-13 Image pickup module, image generation method, electronic device, and readable storage medium

Publications (1)

Publication Number Publication Date
CN117596494A (en) 2024-02-23

Family

ID=89910730

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311512300.3A Pending CN117596494A (en) 2023-11-13 2023-11-13 Image pickup module, image generation method, electronic device, and readable storage medium

Country Status (1)

Country Link
CN (1) CN117596494A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination