CN117714823A - Image pickup module, image pickup method, electronic device, and readable storage medium - Google Patents

Image pickup module, image pickup method, electronic device, and readable storage medium

Info

Publication number
CN117714823A
CN117714823A (Application CN202410097836.1A)
Authority
CN
China
Prior art keywords
grating
light
image
grating structure
zoom magnification
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410097836.1A
Other languages
Chinese (zh)
Inventor
冯先
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202410097836.1A priority Critical patent/CN117714823A/en
Publication of CN117714823A publication Critical patent/CN117714823A/en
Pending legal-status Critical Current


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/54Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32Means for focusing
    • G03B13/34Power focusing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)

Abstract

The application discloses a camera module, a shooting method, an electronic device, and a readable storage medium, belonging to the technical field of image capture. The shooting method includes: controlling, according to a zoom magnification, a grating structure in the camera module to diffract light of at least one color onto a first photosensitive area of an image sensor in the camera module; acquiring at least one first image based on the first photosensitive area; and obtaining, based on the at least one first image, a second image corresponding to the zoom magnification.

Description

Image pickup module, image pickup method, electronic device, and readable storage medium
Technical Field
The present application belongs to the technical field of image capture, and in particular relates to a camera module, a shooting method, an electronic device, and a readable storage medium.
Background
With the continuous development of terminal technology, the shooting function has become an important function of terminal equipment and a main index for evaluating the performance of the terminal equipment. When a user uses a portable terminal device such as a mobile phone to perform shooting, there is also a need for zooming.
At present, terminal devices mainly adopt a relay-type zoom scheme, in which different zoom magnifications are achieved by coordinating multiple cameras with different physical focal lengths, such as an ultra-wide-angle lens, a wide-angle main camera lens, and a periscope telephoto lens. At 1-3× zoom, the device switches to the wide-angle main camera for shooting and applies algorithmic cropping; at 4-10× zoom, it switches to the periscope telephoto lens, which works with the wide-angle main camera and algorithmic cropping to achieve hybrid zoom; at 10-100× zoom, shooting is done entirely by the periscope telephoto lens and the target zoom effect is produced by algorithmic cropping. Because the existing zoom scheme mainly relies on switching lenses and then enlarging and cropping the picture, the image quality after zooming gradually degrades from the center to the edge, and the loss of image quality after zooming is significant.
Disclosure of Invention
An object of the embodiments of the present application is to provide a camera module, a shooting method, a shooting device, an electronic device, and a readable storage medium that can realize a free zoom function without degrading image quality.
In a first aspect, an embodiment of the present application provides a camera module, where the camera module includes a lens, a grating structure, and an image sensor; the image sensor includes at least two pixel units, one grating structure is correspondingly arranged above each pixel unit, grating parameters of the grating structure change when the grating structure is energized so as to change the propagation path of incident light, and the pixel area of the image sensor is larger than the imaging area of the lens.
In a second aspect, an embodiment of the present application provides a shooting method, performed by an electronic device, where the electronic device includes the camera module according to the first aspect, and the method includes: controlling, according to a zoom magnification, a grating structure in the camera module to diffract light of at least one color onto a first photosensitive area of an image sensor in the camera module; acquiring at least one first image based on the first photosensitive area; and obtaining a second image corresponding to the zoom magnification based on the at least one first image.
In a third aspect, an embodiment of the present application provides an electronic device, where the electronic device includes the camera module according to the first aspect, and the electronic device further includes: the control module is used for: according to the zoom magnification, controlling a grating structure in the camera module to diffract light rays of at least one color to a first photosensitive area of an image sensor in the camera module respectively; acquiring at least one first image based on the first photosensitive area; and obtaining a second image corresponding to the zoom magnification based on at least one first image.
In a fourth aspect, embodiments of the present application provide an electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the method as described in the second aspect.
In a fifth aspect, embodiments of the present application provide a readable storage medium having stored thereon a program or instructions which when executed by a processor implement the steps of the method according to the second aspect.
In a sixth aspect, embodiments of the present application provide a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and where the processor is configured to execute a program or instructions to implement a method according to the second aspect.
In a seventh aspect, embodiments of the present application provide a computer program product stored in a storage medium, the program product being executable by at least one processor to implement the method according to the second aspect.
In the embodiments of the present application, a grating structure with adjustable grating parameters is provided on the pixel structure of the camera module. By adjusting the grating parameters of the grating structure above a pixel unit, the grating structure can be used to diffract light and change its optical path, so that light is diffracted onto the corresponding photosensitive area (i.e., pixels) of the image sensor. Because the area of the pixel structure of the image sensor is larger than the imaging area of the lens, the grating can diffract light onto a photosensitive area larger than the imaging area of the lens, enlarging the field of view (zoom-out), or onto a photosensitive area smaller than the imaging area of the lens, shrinking the field of view (zoom-in), thereby realizing free zoom shooting.
In the embodiments of the present application, when shooting, the grating can be controlled according to the zoom magnification to diffract light onto the corresponding photosensitive area of the image sensor, and the image corresponding to that zoom magnification is then acquired. The embodiments of the present application use the diffraction principle of the grating to realize free zooming of the camera module; because there is no need to switch among multiple cameras with different physical focal lengths during zooming, shooting clarity can be improved and image quality preserved while realizing the free zoom function.
Drawings
FIG. 1 is a schematic diagram of a related art camera module;
fig. 2 is a schematic structural diagram of a camera module provided in an embodiment of the present application;
FIG. 3 (A) is a schematic diagram of a grating structure according to an embodiment of the present disclosure;
FIG. 3 (B) is a second schematic diagram of a grating structure according to an embodiment of the present disclosure;
FIG. 3 (C) is a third schematic diagram of a grating structure according to an embodiment of the present disclosure;
fig. 4 is a flowchart of a shooting method provided in an embodiment of the present application;
FIG. 5 is a schematic diagram of zoom magnification and resolution of an image sensor according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of a photographing device according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 8 is a schematic hardware structure of an electronic device according to an embodiment of the present application.
Detailed Description
Technical solutions in the embodiments of the present application will be clearly described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application are within the scope of the protection of the present application.
The terms "first," "second," and the like in the description of the present application, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances such that the embodiments of the application are capable of operation in sequences other than those illustrated or otherwise described herein, and that the objects identified by "first," "second," etc. are generally of a type and do not limit the number of objects, for example, the first object may be one or more. Furthermore, in the description and claims, "and/or" means at least one of the connected objects, and the character "/", generally means that the associated object is an "or" relationship.
The terms "at least one", and the like in the description of the present application refer to any one, any two, or a combination of two or more of the objects that it comprises. For example, at least one of a, b, c (item) may represent: "a", "b", "c", "a and b", "a and c", "b and c" and "a, b and c", wherein a, b, c may be single or plural. Similarly, the term "at least two" means two or more, and the meaning of the expression is similar to the term "at least one".
The terms and terminology referred to in the embodiments of the present application are explained below.
An image sensor: a device that converts an optical image into electronic signals, widely used in digital cameras and other electro-optical devices. Image sensors are currently classified mainly into two types: the Charge-Coupled Device (CCD) and the complementary metal oxide semiconductor (Complementary Metal Oxide Semiconductor, CMOS) active pixel sensor.
The CMOS active pixel sensor is a type of active pixel sensor built with CMOS semiconductor technology. A corresponding circuit is located next to each photosensitive element and directly converts light energy into a voltage signal; unlike a CCD sensor, a CMOS active pixel sensor does not transfer signal charges. A CMOS active pixel sensor may still require an analog-to-digital converter on the board to convert its output signal into a digital signal.
It should be noted that a pixel unit of a CMOS sensor can only sense brightness. To sense color, a color filter array (Color Filter Array, CFA) must be placed over the pixel units; the CFA filters out light of other wavebands and passes light of the desired waveband, which is then photoelectrically converted by the pixel unit. The pixel unit thereby gains the ability to perceive color, but part of the light energy is wasted in this approach, and the resulting color perception is relatively coarse.
"Relay" zoom mode: and the smoother zooming effect is realized by utilizing the difference of the physical focal lengths of the two or more fixed-focus lenses through lens switching and algorithm assistance.
In the relay-type zoom mode, a mobile phone is equipped with 3-4 cameras of different focal lengths, for example a 0.5× ultra-wide-angle camera, a 1× wide-angle camera, and a 2× (or 3×) telephoto camera. The ultra-wide-angle camera is used at its native 0.5× magnification, ultra-wide-angle digital zoom is used between 0.5× and 1×, the wide-angle camera takes over at 1×, wide-angle digital zoom is used between 1× and 2×, the telephoto camera takes over at 2×, and telephoto digital zoom is used above 2×. In this way, shooting requirements of users in different scenes are met.
The specific zoom process is as follows: (1) at a zoom magnification of 0.8×, shooting is performed by the ultra-wide-angle lens alone; (2) between 0.8× and 1×, the ultra-wide-angle lens alone performs digital zoom within its focal length range; (3) at 1×, shooting is performed by the wide-angle main camera alone; (4) between 1× and 5×, the wide-angle main camera alone performs digital zoom within its focal length range; (5) at 5×, shooting is performed by the periscope telephoto lens alone; (6) between 5× and 10×, the wide-angle main camera and the periscope telephoto lens still work together with digital zoom, but shooting is done mainly by the periscope telephoto lens and relies mainly on algorithmic cropping, so the image quality is noticeably degraded. With algorithmic assistance, this process achieves a smooth relay effect over zoom magnifications of 0.8-10×, and although the focal length of any single lens does not change, it can achieve a shooting effect similar to optical zoom.
Nano-grating: an optical element with a periodic structure, the period being formed by the peaks and valleys of the surface relief of the material, which causes a periodic change of the refractive index n in the material. The period of a nano-grating is generally at the micro/nano scale, on the same order of magnitude as the wavelength of visible light (450-700 nm), so it can diffract light effectively. For this reason the nano-grating is also called a diffraction grating. The "light-splitting" principle of the diffraction grating is as follows:
Assume the incident light is green light of a single wavelength. The diffraction grating splits it into several diffraction orders, each propagating in a different direction, including reflective diffraction orders (R0, R±1, R±2, …) and transmissive diffraction orders (T0, T±1, T±2, …). The diffraction angle θm (m = ±1, ±2, …) corresponding to each order is determined by the incidence angle θ of the light and the period Λ of the grating. By designing the grating parameters (material refractive index n, grating shape, thickness, duty cycle, etc.), the diffraction efficiency of a given order (i.e., a given direction) can be maximized, so that after diffraction most of the light propagates mainly along that direction.
After the incident light is diffracted, the 0th-order component simply passes through by refraction and its energy is unchanged. For the other orders, the grating parameters can be controlled so that the energy is concentrated in a particular order, typically mainly in the 1st order.
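To make the order-by-order picture above concrete, the sketch below lists which transmissive diffraction orders can propagate and at what angles, using the grating equation given as formula (1) later in this description. It is purely illustrative: the period, wavelength, incidence angle, and all names are assumptions chosen for the example, not values from the disclosure.

```python
import math

def diffraction_orders(period_um: float, wavelength_um: float,
                       incidence_deg: float, max_order: int = 3):
    """List propagating transmissive diffraction orders from formula (1):
    m*lambda = d*(sin(alpha) + sin(beta_m)). Orders with |sin(beta_m)| > 1
    are evanescent and are skipped."""
    alpha = math.radians(incidence_deg)
    orders = {}
    for m in range(-max_order, max_order + 1):
        s = m * wavelength_um / period_um - math.sin(alpha)
        if abs(s) <= 1.0:
            orders[m] = math.degrees(math.asin(s))
    return orders

# Assumed example: 1.0 um period, 0.55 um (green) light, 10 deg incidence.
for m, beta in diffraction_orders(1.0, 0.55, 10.0).items():
    print(f"order {m:+d}: beta = {beta:6.1f} deg")
```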
The shooting method, the shooting device, the electronic equipment and the medium provided by the embodiment of the application are described in detail through specific embodiments and application scenes thereof with reference to the accompanying drawings.
The shooting method, the shooting device, the electronic equipment and the medium can be applied to shooting scenes.
At present, electronic devices such as mobile phones and cameras offer more and more photographing functions, and consumers' expectations for the photographing experience keep rising, which poses great challenges to the functions, performance, and effects of mobile phones and cameras.
Currently, when a user needs to switch from close-up shooting to long-range shooting in order to capture a distant scene more clearly, the user may pinch two fingers on the shooting preview interface to trigger the electronic device to zoom, for example triggering a switch from a 5× zoom magnification to a 0.8× zoom magnification. The electronic device then switches the camera used for capturing images from the main camera and periscope telephoto lens to the ultra-wide-angle lens according to the zoom magnification, and captures the image with the ultra-wide-angle camera. Because the picture may shake during zooming, image quality may be degraded.
In the shooting method provided by the embodiments of the present application, when the user triggers the electronic device to switch from a 0.8× zoom magnification to a 5× zoom magnification during shooting, the electronic device controls, according to the 5× zoom magnification, the grating structure in the camera module to diffract light of at least one color onto the corresponding photosensitive area of the image sensor in the camera module; after each kind of light is diffracted, an image is collected by the image sensor, and the collected images are then synthesized into one image for display. In this way, the grating can be controlled according to the zoom magnification to diffract light onto the corresponding photosensitive area of the image sensor, and the image corresponding to that zoom magnification is then acquired. The embodiments of the present application use the diffraction principle of the grating to realize free zooming of the camera module; because there is no need to switch among multiple cameras with different physical focal lengths during zooming, shooting clarity can be improved and image quality preserved while realizing the free zoom function.
Fig. 1 is a schematic structural diagram of an image capturing module in the related art, as shown in fig. 1, the image capturing module mainly comprises a lens 1 (lens) and an image sensor 10 (sensor), wherein the lens 1 is used for condensing light, and the image sensor is used for imaging.
Illustratively, the camera module further includes an elastic member 2 for driving the lens assembly to move, a column 3 for supporting the whole camera module, a north pole 4 of a permanent magnet, a south pole 5 of the permanent magnet, and a lens carrier 6 for fixing the lens; the lens carrier 6 and the lens 1 together form the lens assembly.
Illustratively, the camera module further includes an energizing coil 7 arranged below the lens, a pressing plate 8, an infrared filter 9 (IR filter) for filtering infrared light, and a module substrate 11.
The magnetic field generated when the coil is energized is opposite to the magnetic field of the permanent magnet, producing a reaction force; the stronger the induced magnetic field of the coil, the larger the reaction force. When the reaction force exceeds the gravity of the lens assembly, it pushes the lens assembly upward, and when the reaction force finally balances the spring force of the elastic member plus the weight of the module, the lens position is stabilized. The module substrate 11 is used to fix the sensor and to transmit the image in the sensor out through the circuitry on the substrate.
The embodiments of the present application provide a camera module with a controllable grating structure. A grating structure with adjustable grating parameters is arranged on the pixel structure of the camera module. By adjusting the grating parameters of the grating structure above a pixel unit, the light transmittance of the grating structure can be changed, so the grating structure can be used to diffract the light above the pixel unit and change its optical path, realizing a zoom function and thus smooth zooming.
Fig. 2 is a schematic structural diagram of a camera module provided by an embodiment of the present application. As shown in Fig. 2, the camera module includes a lens 21, a grating structure 22, and an image sensor 23 arranged along the optical axis direction. The image sensor 23 includes at least two pixel units, and one grating structure is arranged above each pixel unit. When the grating structure is energized, its grating parameters change so as to change the propagation path of incident light, and the pixel area of the image sensor 23 is larger than the imaging area of the lens.
In some embodiments, the pixel area of the image sensor 23 is at least 1.5 times the imaging area of the lens 21.
In some embodiments, the camera module further includes an infrared filter 24, and the infrared filter 24 is disposed below the grating structure 22 and above the image sensor 23.
In some embodiments, the grating structure 22 is formed by an electrically controlled liquid crystal assembly with controllable grating parameters. The material of the electrically controlled liquid crystal assembly may be a liquid crystal material.
In some embodiments, the grating parameters of the grating structure described above may be adjusted. Further, by changing the grating parameters of the grating structure, diffraction of light of a specific wavelength by the grating structure can be achieved.
In some embodiments, the grating structure includes at least two grating units, each of the grating units includes a first subunit and a second subunit, the first subunit and the second subunit have equal width values, and when the first subunit and the second subunit are powered on, the light transmittance of the first subunit and the second subunit changes.
In some embodiments, the width values of the first subunit and the second subunit may be 1-20 nm. Illustratively, the width values of the first subunit, the second subunit may be 1nm, 10nm, 20nm, or the like.
It should be noted that the width values of the first subunit and the second subunit may be specifically set according to actual requirements, which is not limited in the embodiment of the present application.
In some embodiments, the first subunit and the second subunit may have a thickness value of 100nm to 600nm. The thickness of the first sub-unit, the second sub-unit may be 100nm, 300nm or 600nm, for example.
The width of the first or second sub-unit refers to the width in the horizontal direction, and the thickness of the first or second sub-unit refers to the thickness in the vertical direction.
In some embodiments, the first subunit and the second subunit are made light transmissive, opaque, or semi-opaque by controlling the voltages of the first subunit and the second subunit.
In some embodiments, the first subunit and the second subunit may be liquid crystal assemblies, and one grating unit includes at least two liquid crystal assemblies.
In some embodiments, the liquid crystal assembly is made transparent, opaque or semi-transparent by controlling the voltage of each liquid crystal assembly.
In some embodiments, the grating parameter may be a grating pitch. The grating pitch may be the sum of the width values of the transmissive liquid crystal element and the adjacent opaque liquid crystal element.
In some embodiments, the grating pitch of the grating structure may be N times the width of the grating unit, where N is a positive integer.
Illustratively, assuming that one grating cell includes two subunits, each having a width value of 10nm, the grating pitch may be 20nm.
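As a purely illustrative sketch (the element width, the True/False encoding, and the helper name are assumptions, not part of the disclosed embodiments), the relationship between the liquid crystal element pattern and the grating pitch can be expressed as follows:

```python
ELEMENT_WIDTH_NM = 10  # assumed width of each liquid crystal element

def pitch_from_pattern(period_pattern: list) -> int:
    """Pitch = number of elements in one repeating period * element width."""
    return len(period_pattern) * ELEMENT_WIDTH_NM

# Fig. 3(A)-style pattern: one transmissive + one opaque element per period -> 20 nm pitch.
print(pitch_from_pattern([True, False]))               # 20
# Fig. 3(B)-style pattern: two transmissive + two opaque elements per period -> 40 nm pitch.
print(pitch_from_pattern([True, True, False, False]))  # 40
```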
In some embodiments, the grating parameters are used to control the diffraction angle, the wavelength of the diffracted light, and the diffraction efficiency of the grating structure, and the effect of controlling the light splitting is achieved by designing the grating structure with variable grating parameters.
The relationship among grating parameters, diffraction angles, and wavelengths of diffracted light for the grating structure is described below with reference to formula (1).
m*λ = d(sin α + sin β_m)    (1)
where d is the grating pitch (also called the grating parameter), m is an integer taking the values 0, ±1, ±2, …, β_m is the diffraction angle, α is the incidence angle, and λ is the wavelength of the light.
Illustratively, taking light with a wavelength of 400 nm, an incidence angle of 30°, and a diffraction angle of 31°, substituting these values into formula (1) gives sin 30° + sin 31° = 0.5 + 0.515 = 0.4 μm / d, so d = 0.394 μm = 394 nm; the grating parameter may therefore be about 394 nm (roughly 400 nm).
The diffraction angle, the wavelength of the diffracted light and the diffraction efficiency are controlled by the grating parameters, and the effect of controlling the light splitting can be achieved by designing the grating structure with variable grating parameters.
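Formula (1) can be checked numerically. The following minimal sketch (not part of the disclosure; function and variable names are chosen here only for illustration) solves the grating equation for the pitch d and reproduces the 394 nm value from the example above:

```python
import math

def grating_pitch(wavelength_um: float, incidence_deg: float,
                  diffraction_deg: float, order: int = 1) -> float:
    """Solve m*lambda = d*(sin(alpha) + sin(beta_m)) for the pitch d (in um)."""
    alpha = math.radians(incidence_deg)
    beta = math.radians(diffraction_deg)
    return order * wavelength_um / (math.sin(alpha) + math.sin(beta))

# Worked example from the description: 400 nm light, 30 deg incidence, 31 deg diffraction.
d = grating_pitch(wavelength_um=0.4, incidence_deg=30.0, diffraction_deg=31.0)
print(f"grating pitch d = {d:.3f} um = {d * 1000:.0f} nm")  # ~0.394 um, i.e. ~394 nm
```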
In some embodiments, the grating structure may be a separate grating structure; alternatively, the grating structure may be a part of the grating region in the array of grating structures.
Illustratively, the grating structure array includes a plurality of grating regions, each corresponding to one pixel unit in the pixel array of the image sensor, and one grating region corresponding to one pixel unit may be referred to as one grating structure.
In some embodiments, one pixel unit in the image sensor 23 corresponds to at least one grating structure.
Fig. 3(A) is a schematic structural diagram of a grating structure according to an embodiment of the present application. As shown in Fig. 3(A), the grating structure includes 13 grating units; each grating unit includes two grating subunits, each subunit consists of one liquid crystal element 10 nm wide, so one grating unit is 20 nm wide, i.e., the grating pitch is 20 nm, and the grating structure is arranged on a substrate. By controlling the voltage of each liquid crystal element so that it is transmissive, opaque, or semi-transmissive, a grating structure with any required grating parameter can be formed.
In some embodiments, the liquid crystal elements corresponding to reference numeral "1" are opaque, and the liquid crystal elements corresponding to reference numeral "2" are fully transmissive.
Fig. 3 (B) is another schematic structural diagram of the grating structure according to the embodiment of the present application. As shown in fig. 3 (B), the grating structure includes 6 grating units, each comprising two sub-units, each sub-unit being composed of two liquid crystal modules, each liquid crystal module having a width of 10nm, and one grating unit having a width value of 40nm, that is, a grating pitch of 40nm.
Fig. 3 (C) is another schematic structural diagram of the grating structure according to the embodiment of the present application. As shown in fig. 3 (C), the grating structure includes 26 liquid crystal modules, and is in a fully transmissive state.
It should be noted that if the grating structure is not required to diffract light, all the liquid crystal elements can be controlled to remain fully transmissive, and the grating structure can then be regarded as a plain flat lens.
In some embodiments, the grating structure corresponding to each pixel unit in the pixel array of the image sensor can diffract the light on the pixel unit onto other pixel units adjacent to the pixel unit, so as to improve the photosensitive capacity of the adjacent pixel units.
Illustratively, the grating structure on a second pixel cell of the first row in the pixel array may diffract light on the second pixel cell onto the first pixel cell of the first row in the pixel array.
The embodiment of the application provides electronic equipment, which comprises the camera module shown in the figure 2.
The execution body of the shooting method provided in the embodiments of the present application may be an electronic device, including a mobile or non-mobile electronic device, or may be a functional module or functional entity in the electronic device capable of implementing the shooting method; this may be determined according to actual use requirements and is not limited by the embodiments of the present application.
An embodiment of the present application provides a photographing method, and fig. 4 shows a flowchart of the photographing method provided in the embodiment of the present application, as shown in fig. 4, where the photographing method may include the following steps S401 to S403.
Step S401: the electronic equipment controls the grating structure in the camera module to diffract light rays of at least one color to the first photosensitive area of the image sensor in the camera module respectively according to the zoom multiplying power.
In some embodiments, the zoom magnification may be a zoom magnification input by a user, or the zoom magnification may be a zoom magnification determined by the electronic device according to an input parameter of the user in the shooting preview interface. For example, after the user double-finger zooms the shooting preview interface, the electronic device determines the zoom magnification according to the zoom information of the preview interface.
It will be appreciated that the image produced by a zoom lens varies with its focal length. A wide-angle focal length expands the shooting range, while a telephoto focal length magnifies the subject, so the shooting range can be changed freely even without the photographer moving; such a change can be understood as a change in "zoom magnification".
Three common zoom magnifications are described below:
0.6×: the method refers to that the field of view is increased, and the shot object is reduced, namely, the super wide angle shooting is performed;
1×: i.e. a normal photographing size mode;
2×: refers to the reduction of the field of view, and the magnification of the photographed object is 2 times that of the original.
In some embodiments, the electronic device may determine, according to the zoom magnification, a photosensitive area, i.e., a first photosensitive area, of the entire photosensitive area of the image sensor, to which light needs to be incident.
Illustratively, the zoom magnification is 2×, that is, the image needs to be enlarged to twice the original size, and the photosensitive area on the image sensor to which light needs to be incident may be determined according to the size of the image acquired at the 2× zoom magnification.
It can be understood that the image sensor is composed of a plurality of pixel units, each with its own position coordinates, and each pixel unit in the image sensor corresponds to a pixel in the acquired image. The size of the image acquired at a given zoom factor can be determined from the zoom magnification and the parameters of the image sensor, so the size of the photosensitive area of the image sensor can in turn be determined from the size of the image to be acquired.
It should be noted that the image size corresponding to a zoom magnification may be determined from measured data: for example, images at different zoom magnifications are captured by optical or digital zoom, the image size at each zoom magnification is determined, and the zoom magnification is recorded together with the corresponding image size.
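The disclosure does not spell out the exact mapping from zoom magnification to the first photosensitive area, so the sketch below is only a hypothetical illustration: it assumes the region's width and height scale inversely with the zoom magnification relative to the 1× imaging area (zoom-out enlarges the region, zoom-in shrinks it, consistent with the behavior described above), and all names and dimensions are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Region:
    cx: int      # centre x (pixels)
    cy: int      # centre y (pixels)
    width: int   # region width (pixels)
    height: int  # region height (pixels)

def first_photosensitive_region(zoom: float,
                                imaging_width_px: int, imaging_height_px: int,
                                sensor_width_px: int, sensor_height_px: int) -> Region:
    """Hypothetical mapping: the region the light is diffracted onto scales
    inversely with the zoom magnification relative to the 1x imaging area,
    clamped to the sensor's pixel array."""
    width = min(int(imaging_width_px / zoom), sensor_width_px)
    height = min(int(imaging_height_px / zoom), sensor_height_px)
    return Region(sensor_width_px // 2, sensor_height_px // 2, width, height)

# Example: a 4000x3000 imaging area on a 5000x3750 sensor (about 1.56x larger by area).
print(first_photosensitive_region(0.8, 4000, 3000, 5000, 3750))  # zoom-out: larger region
print(first_photosensitive_region(2.0, 4000, 3000, 5000, 3750))  # zoom-in: smaller region
```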
In some embodiments, the electronic device may determine diffraction parameters of the grating structures corresponding to the pixel units according to the coordinate information of the pixel units in the first photosensitive area, and then control the grating structures to diffract the incident light according to the diffraction parameters so as to diffract the light to the first photosensitive area of the image sensor.
In some embodiments, the at least one color of light comprises: red light (R), green light (G) and blue light (B). Alternatively, the light of at least one color includes: cyan light (C), magenta light (M), and yellow light (Y).
It should be noted that the light of at least one color may be light of other colors, for example, violet light, which is not limited in the embodiment of the present application.
The wavelength range of red light is 580-650 nm, that of blue light is 400-480 nm, and that of green light is 480-580 nm.
In some embodiments, when diffracting light of different wavebands, the electronic device may determine the center wavelength of each waveband and then control the grating structure to diffract light at that center wavelength.
In some embodiments, a plurality of pixel cells may be included in the first photosensitive region, such as may include R, B, and G pixel cells. Thus, all R, G and B light information reaching the first photosensitive area can be converted into corresponding electrical information by the first photosensitive area, thereby obtaining image data.
Step S402: the electronic equipment acquires at least one first image based on the first photosensitive area.
In some embodiments, a first image corresponds to a color light.
Illustratively, where the at least one image includes image 1, image 2, and image 3, image 1 may correspond to red light, image 2 may correspond to green light, and image 3 may correspond to blue light.
In some embodiments, the at least one image includes an image captured by an image sensor after the electronic device diffracts each light ray through the grating structure.
In some embodiments, the electronic device may separately control the grating structure to diffract each of the light rays of the at least one color, and after diffracting each of the light rays, control the image sensor to collect an image, and then combine the collected image (i.e., the first image) after diffracting each of the light rays into one image, i.e., the second image. In other words, the electronic device may acquire an image by the image sensor under the condition that the grating structure is controlled to diffract the light corresponding to the required diffraction band range to the first photosensitive region.
In some embodiments, when light of a plurality of center wavelengths needs to be diffracted, the electronic device may control the grating structure to diffract light of each center wavelength in turn onto the first photosensitive area; each time light of one wavelength is diffracted, the electronic device collects one first image through the image sensor, thereby obtaining a plurality of first images.
Step S403: the electronic equipment obtains a second image corresponding to the zoom magnification based on at least one first image.
In some embodiments, in the case of obtaining a plurality of first images, the electronic device may perform a synthesis process on the plurality of first images to obtain a second image.
It will be appreciated that when the desired diffraction band range includes a plurality of wavelengths, the electronic device is required to control the grating structure to diffract a plurality of times, and each time the grating structure diffracts, the electronic device acquires an image once by the image sensor, and finally synthesizes a second image from all the acquired images.
The process of the electronic device control grating diffracting various light rays and capturing an image by the image sensor is exemplarily described below.
The electronic device may control the grating structure to diffract the R light onto the first photosensitive area of the image sensor and collect an image through the image sensor to obtain an R-channel image; then control the grating structure to diffract the G light onto the first photosensitive area and collect an image to obtain a G-channel image; then control the grating structure to diffract the B light onto the first photosensitive area and collect an image to obtain a B-channel image. Finally, the electronic device performs image synthesis on the R-channel, G-channel, and B-channel images to obtain an RGB image, i.e., the second image. In this way, the electronic device can sequentially diffract light of different wavebands onto the photosensitive area of the image sensor through the grating structure and collect image data after each diffraction, thereby obtaining a multi-channel image (i.e., the second image) at the given zoom factor.
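A minimal sketch of this per-color capture loop is shown below. The center wavelengths are rough midpoints of the band ranges given earlier, and `set_grating_for` and `capture_channel` are hypothetical callbacks standing in for the grating control and sensor readout, which the disclosure does not specify at code level.

```python
import numpy as np

# Assumed centre wavelengths (um), taken roughly from the band ranges above.
CENTER_WAVELENGTH_UM = {"R": 0.615, "G": 0.53, "B": 0.44}

def capture_second_image(set_grating_for, capture_channel, zoom: float) -> np.ndarray:
    """Sequentially diffract R, G and B light onto the first photosensitive
    area, capture one first image per colour, then stack the three channel
    images into the second (RGB) image."""
    channels = []
    for name, wavelength in CENTER_WAVELENGTH_UM.items():
        set_grating_for(wavelength, zoom)   # steer this band onto the region
        channels.append(capture_channel())  # one "first image" per colour
    return np.stack(channels, axis=-1)      # H x W x 3 "second image"

# Minimal usage with stub callbacks (a real module would drive hardware here).
rgb = capture_second_image(lambda wl, z: None,
                           lambda: np.zeros((1500, 2000), dtype=np.uint16),
                           zoom=2.0)
print(rgb.shape)  # (1500, 2000, 3)
```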
In the shooting method provided by the embodiments of the present application, the electronic device controls, according to the zoom magnification, the grating structure in the camera module to diffract light of at least one color onto the first photosensitive area of the image sensor in the camera module, acquires at least one first image based on the first photosensitive area, and obtains a second image corresponding to the zoom magnification based on the at least one first image. With this method, the grating can be controlled according to the zoom magnification to diffract light onto the corresponding photosensitive area of the image sensor, and the image corresponding to that zoom magnification is then acquired; free zooming of the camera module is realized using the diffraction principle of the grating, and since no switching among multiple cameras with different physical focal lengths is needed during zooming, shooting clarity can be improved and image quality ensured while realizing the free zoom function.
In some embodiments of the present application, the step S401 of controlling the grating structure in the image capturing module to diffract the light of at least one color to the first photosensitive area of the image sensor in the image capturing module according to the zoom magnification may include the following steps S401a to S401d:
Step S401a: the electronic equipment determines a first photosensitive area based on the zoom multiplying power and lens parameters of a lens in the camera module.
Step S401b: the electronic device determines the diffraction angle of the light according to the incidence angle of the light on the grating structure and the position information of the first photosensitive area.
The diffraction angle is an angle for diffracting light to the first photosensitive area.
Step S401c: the electronic device determines a first grating parameter according to the wavelength, the incidence angle and the diffraction angle of the light.
Step S401d: the electronic device adjusts grating parameters of the grating structure to first grating parameters so that the grating structure diffracts the light to the first photosensitive area.
In some embodiments, the position information of the first photosensitive region may be coordinate information of the first photosensitive region, i.e., coordinate information of the pixel array in the first photosensitive region.
In some embodiments, the electronic device may determine an image size of the image to be captured according to the zoom magnification and the imaging area of the lens, and then determine the first photosensitive region according to the image size. For example, the electronic device may determine, according to the image size, coordinate information of at least a portion of the pixel unit arrays of the image sensor that need to receive light, where the at least a portion of the pixel unit arrays that need to receive light are the first photosensitive areas.
The image size refers to an actual imaging size of an image acquired by the image sensor.
In some embodiments, the electronic device may determine the diffraction angle based on an angle of incidence of the light on the grating structure, the coordinate information of the first photosensitive region, and the coordinate information of the second photosensitive region. The second photosensitive area is an area corresponding to the light without diffraction.
When the zoom magnification is 1×, i.e., when the image is not zoomed, the light does not need to be diffracted by the grating structure; the grating structure may then be in a fully transmissive state, and the light passes through the grating onto the photosensitive area, i.e., the second photosensitive area.
In some embodiments, the wavelength of the light may be a predetermined wavelength, for example, the wavelength of the light may include a center wavelength of red light, a center wavelength of green light, and a center wavelength of blue light.
In some embodiments, the angle of incidence of light rays incident on the image sensor may be determined from lens parameters.
It should be noted that, viewed from the sensor side of the lens, the maximum angle at which light can be focused onto a pixel unit is defined as the chief ray angle (Chief Ray Angle, CRA). Near the lens axis the chief ray angle is close to zero, and it increases as the distance from the lens axis increases. The CRA is therefore related to the position of the pixel on the sensor.
The process by which the electronic device determines the grating parameter from the wavelength, incidence angle, and diffraction angle of the light is described below by way of example.
Illustratively, in combination with formula (1) above, assuming the diffraction angle is 31°, the wavelength of the diffracted light is 400 nm, and the incidence angle of the light on the image sensor is 30°, substituting these parameters into formula (1) gives d = 0.394 μm, i.e., the grating parameter is 394 nm.
In some embodiments, the electronic device may change the light transmission characteristic of the liquid crystal component of the grating structure by adjusting the grating parameter, so that the light incident to the grating structure is diffracted to the first photosensitive region through the grating structure.
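Putting steps S401b-S401d together, the following sketch illustrates one plausible computation chain. The geometric model for the diffraction angle (arctangent of the lateral offset over the grating-to-sensor distance) and all numeric values are assumptions made only for illustration; the disclosure states only that the angle is derived from the incidence angle and the position information of the first photosensitive area.

```python
import math

def diffraction_angle_deg(incidence_deg: float, target_offset_um: float,
                          grating_to_sensor_um: float) -> float:
    """Hypothetical geometry: the angle needed to steer light from its
    undiffracted landing point to the first photosensitive area, taken as
    atan(lateral offset / grating-to-sensor distance)."""
    return math.degrees(math.atan2(target_offset_um, grating_to_sensor_um))

def first_grating_parameter_um(wavelength_um: float, incidence_deg: float,
                               diffraction_deg: float, order: int = 1) -> float:
    """Formula (1): m*lambda = d*(sin(alpha) + sin(beta_m)), solved for d."""
    return order * wavelength_um / (math.sin(math.radians(incidence_deg)) +
                                    math.sin(math.radians(diffraction_deg)))

def apply_grating_parameter(pitch_um: float) -> None:
    """Placeholder for step S401d: drive the liquid crystal elements so that
    transmissive/opaque stripes repeat with the computed pitch (the actual
    electrical drive is hardware-specific and only printed here)."""
    print(f"apply electrical pattern for pitch {pitch_um * 1000:.0f} nm")

# Steps S401b-S401d for one colour channel (all values are illustrative only).
beta = diffraction_angle_deg(incidence_deg=30.0, target_offset_um=600.0,
                             grating_to_sensor_um=1000.0)
d = first_grating_parameter_um(wavelength_um=0.52, incidence_deg=30.0,
                               diffraction_deg=beta)
apply_grating_parameter(d)
```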
In some embodiments of the present application, the process of adjusting the grating parameter of the grating structure to the first grating parameter in step S401d may include the following step S401d1:
step S401d1: the electronic device adjusts the grating parameters of the grating structure to the first grating parameters by applying electrical information corresponding to the first grating parameters to at least one grating structure.
The at least one grating structure includes the grating structure corresponding to the light when the light is incident.
In some embodiments, the electrical information may be a voltage or a current.
It should be noted that, for the method of applying electrical information to the grating structure and of controlling the grating structure to be fully transmissive, reference may be made to the related description of the camera module in the above embodiments; to avoid repetition, it is not described again here.
Therefore, the grating structure can accurately diffract the light to the photosensitive area determined according to the zoom factor, so that the image sensor can accurately acquire the image data corresponding to the required zoom factor.
In some embodiments of the present application, the above step S402 may include the following step S402a:
Step S402a: after diffracting the light of each color, the electronic device controls the image sensor to acquire a first image at the first resolution.
The first resolution is determined according to a zoom magnification interval in which the zoom magnification is located, and the zoom magnifications in different zoom magnification intervals correspond to different resolutions.
In some embodiments, after diffracting the light of one color, the electronic device controls the image sensor to acquire, at the first resolution, one image corresponding to that color; after the light of each color has been diffracted and the corresponding image acquired through the first photosensitive area of the image sensor, at least one first image is obtained.
In some embodiments, the zoom magnification interval described above may include one or more zoom magnification intervals. Illustratively, the above-described zoom magnification section may include a first magnification section, a second magnification section, and a third magnification section, wherein the zoom magnification of the first magnification section includes 0.2 to 1×, the zoom magnification of the second magnification section includes 1 to 4×, and the zoom magnification of the third magnification section includes 4× and above.
In some embodiments, different zoom magnification intervals correspond to different resolution modes, and different resolution modes correspond to different resolutions.
The first magnification interval corresponds to a first resolution mode: when the image sensor is in the first resolution mode, its resolution is relatively low, and one pixel of the acquired image corresponds to one color. The second magnification interval corresponds to a second resolution mode: in this mode the resolution is higher than in the first resolution mode, and four pixels of the acquired image correspond to one color. The third magnification interval corresponds to a third resolution mode: in this mode the resolution is higher than in the second resolution mode, and sixteen pixels of the acquired image correspond to one color.
It can be understood that the resolution of the image sensor determines the resolution of the collected image: in a low-resolution mode the image sensor has stronger light-sensing capability and the collected image has richer color, while in a high-resolution mode the captured image has higher definition.
It should be noted that the first resolution mode may be a binning mode (i.e., a 16-in-1 mode), the second resolution mode may be a 2× mode (i.e., a four-in-one mode), and the third resolution mode may be a 4× mode, i.e., a full-size mode.
In some embodiments, the image sensor switches between the plurality of resolution modes.
The process of switching the resolution of the image sensor described above is exemplarily described below with reference to the accompanying drawings.
As shown in fig. 5, the image sensor can switch between different resolution modes. Under normal light sensing, i.e., at a zoom magnification of 1×, the 2× mode is used. When zooming out, i.e., when the field of view is enlarged, the grating diffracts the light so that it spreads out, and the image sensor switches to the binning mode to keep the sampling rate unchanged. When zooming in, i.e., when the field of view is reduced, the grating diffracts the light so that it contracts, and the image sensor switches to the 4× mode to maintain the sampling rate.
Illustratively, at a zoom magnification of 0.2×, the electronic device adjusts the resolution mode of the image sensor to the binning mode and captures an image through the image sensor to obtain a color-rich, large field-of-view image.
For example, when the zoom magnification is 1×, the electronic device adjusts the resolution mode of the image sensor to the 2× mode and captures the image through the image sensor, improving the image resolution so as to obtain a larger-view image with richer color and higher definition.
Illustratively, at a zoom magnification of 4 x, the electronic device adjusts the resolution mode of the image sensor to 4 x mode and captures an image through the image sensor, increasing the image resolution to obtain a higher definition smaller field of view image.
In the embodiments of the present application, an image sensor whose resolution mode can be dynamically switched is used in the camera module to match the changes in optical field of view corresponding to different zoom magnifications, so that the sampling rate is kept unchanged and a higher-quality image is obtained.
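A compact sketch of the zoom-interval-to-resolution-mode mapping described above is given below; the handling of the interval boundaries is an assumption, and the mode labels simply restate the binning, 2×, and 4× modes.

```python
def resolution_mode(zoom: float) -> str:
    """Map a zoom magnification to the image sensor resolution mode used when
    capturing the first images, following the intervals described above
    (0.2-1x, 1-4x, 4x and above); exact boundary handling is assumed."""
    if zoom < 1.0:
        return "binning (16-in-1)"   # zoom-out: light spread over more pixels
    elif zoom < 4.0:
        return "2x (4-in-1)"         # normal range
    else:
        return "4x (full size)"      # zoom-in: light concentrated, full resolution

for z in (0.2, 1.0, 2.5, 4.0, 10.0):
    print(z, "->", resolution_mode(z))
```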
The foregoing method embodiments, or various possible implementation manners in the method embodiments, may be executed separately, or may be executed in combination with each other on the premise that no contradiction exists, and may be specifically determined according to actual use requirements, which is not limited by the embodiments of the present application.
According to the shooting method provided by the embodiment of the application, the execution subject can be electronic equipment. In the embodiment of the present application, an electronic device executing a shooting method is taken as an example, and the electronic device provided in the embodiment of the present application is described.
Fig. 6 is a schematic structural diagram of a photographing device provided by an embodiment of the present application. As shown in Fig. 6, the photographing device 600 includes the camera module of the foregoing embodiments, and may further include a control module 601, an acquisition module 602, and an obtaining module 603. The control module 601 is configured to control, according to the zoom magnification, the grating structure in the camera module to diffract light of at least one color onto the first photosensitive area of the image sensor in the camera module; the acquisition module 602 is configured to acquire at least one first image based on the first photosensitive area; and the obtaining module 603 is configured to obtain, based on the at least one first image acquired by the acquisition module, a second image corresponding to the zoom magnification.
In some embodiments of the present application, the control module is specifically configured to: determining a first photosensitive area based on the zoom magnification and lens parameters of a lens in the camera module; determining a diffraction angle of the light according to the incidence angle of the light on the grating structure and the position information of the first photosensitive area, wherein the diffraction angle is an angle for diffracting the light to the first photosensitive area; determining a first grating parameter according to the wavelength, incidence angle and diffraction angle of the light; and adjusting the grating parameters of the grating structure to the first grating parameters so that the grating structure diffracts the light to the first photosensitive area.
In some embodiments of the present application, the control module is specifically configured to: adjust the grating parameters of the grating structure to the first grating parameters by applying electrical information corresponding to the first grating parameters to at least one grating structure, where the at least one grating structure includes the grating structure corresponding to the light when the light is incident.
In some embodiments of the present application, the acquisition module is specifically configured to control the image sensor to acquire a first image according to a first resolution after diffracting light of each color; the first resolution is determined according to a zoom magnification interval in which the zoom magnification is located, and the zoom magnifications in different zoom magnification intervals correspond to different resolutions.
With the photographing device provided by the embodiments of the present application, the device controls, according to the zoom magnification, the grating structure in the camera module to diffract light of at least one color onto the first photosensitive area of the image sensor in the camera module, acquires at least one first image based on the first photosensitive area, and obtains a second image corresponding to the zoom magnification based on the at least one first image. In this way, when shooting, the grating can be controlled according to the zoom magnification to diffract light onto the corresponding photosensitive area of the image sensor, and the image corresponding to that zoom magnification is then acquired.
The photographing device in the embodiments of the present application may be an electronic device, or may be a component in the electronic device, for example an integrated circuit or a chip. The electronic device may be a terminal, or may be a device other than a terminal. By way of example, the electronic device may be a mobile phone, tablet computer, notebook computer, palmtop computer, vehicle-mounted electronic device, mobile internet device (Mobile Internet Device, MID), augmented reality (augmented reality, AR)/virtual reality (Virtual Reality, VR) device, robot, wearable device, ultra-mobile personal computer (UMPC), netbook, or personal digital assistant (personal digital assistant, PDA), and may also be a server, network attached storage (Network Attached Storage, NAS), personal computer (personal computer, PC), television (TV), teller machine, self-service machine, or the like; the embodiments of the present application are not specifically limited in this respect.
The photographing device in the embodiments of the present application may be a device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The photographing device provided in the embodiments of the present application can implement each process implemented by the method embodiments of fig. 1 to 5 and achieve the same technical effects; to avoid repetition, details are not described here again.
Optionally, as shown in fig. 7, an embodiment of the present application further provides an electronic device 700 including a processor 701 and a memory 702, where the memory 702 stores a program or instructions executable on the processor 701. When the program or instructions are executed by the processor 701, the steps of the foregoing shooting method embodiments are implemented, and the same technical effects can be achieved; to avoid repetition, details are not described here again.
The electronic device in the embodiment of the application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 8 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 100 includes, but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, and processor 110.
Those skilled in the art will appreciate that the electronic device 100 may further include a power source (e.g., a battery) for powering the various components, and the power source may be logically coupled to the processor 110 via a power management system to implement functions such as charging management, discharging management, and power consumption management. The electronic device structure shown in fig. 8 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than shown, may combine certain components, or may have a different arrangement of components, which is not described in detail here.
In some embodiments, the electronic device 100 includes a camera module, the camera module includes a lens, an image sensor, and a grating structure, the image sensor includes at least two pixel units, one grating structure is correspondingly disposed above each pixel unit, grating parameters of the grating structure change so as to change the propagation path of incident light when the grating structure is energized, and the pixel area of the image sensor is larger than the imaging area of the lens.
In some embodiments, the pixel area of the image sensor is at least 1.5 times the imaging area of the lens.
In some embodiments, the grating structure includes at least two grating units, the grating units include a first subunit and a second subunit, the first subunit and the second subunit have equal width values, and when the first subunit and the second subunit are powered on, the light transmittance of the first subunit and the second subunit changes.
In some embodiments, the first subunit and the second subunit have a width value of 1-20 nm.
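The grating-unit/subunit structure described above can be modelled with a small data structure; the switching rule at the end is only one assumed way in which toggling subunit transmittance could realise an electrically tunable effective grating period, and is not stated by the embodiment.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Subunit:
    width_nm: float            # 1-20 nm in the described embodiment
    transmissive: bool = True  # transmittance changes when energized

@dataclass
class GratingUnit:
    first: Subunit
    second: Subunit

@dataclass
class GratingStructureModel:
    units: List[GratingUnit] = field(default_factory=list)

    def set_effective_period(self, opaque_every_n: int) -> None:
        """Make every n-th unit opaque (assumed control rule), so the effective
        grating period becomes roughly n times the unit pitch."""
        for i, unit in enumerate(self.units):
            opaque = (i % opaque_every_n == 0)
            unit.first.transmissive = not opaque
            unit.second.transmissive = not opaque
```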
The processor 110 is configured to control the grating structure in the image capturing module to diffract light of at least one color to the first photosensitive area of the image sensor in the image capturing module according to the zoom magnification; acquiring at least one first image based on the first photosensitive area; and obtaining a second image corresponding to the zoom magnification based on the at least one first image.
In some embodiments of the present application, the processor 110 is specifically configured to: determining a first photosensitive area based on the zoom magnification and lens parameters of a lens in the camera module; determining a diffraction angle of the light according to the incidence angle of the light on the grating structure and the position information of the first photosensitive area, wherein the diffraction angle is an angle for diffracting the light to the first photosensitive area; determining a first grating parameter according to the wavelength, the incidence angle and the diffraction angle of the light; and adjusting the grating parameters of the grating structure to the first grating parameters so that the grating structure diffracts the light to the first photosensitive area.
In some embodiments of the present application, the processor 110 is specifically configured to adjust the grating parameters of the grating structures to the first grating parameters by applying electrical information corresponding to the first grating parameters to at least one grating structure; the at least one grating structure includes the grating structure corresponding to the light when the light is incident.
In some embodiments of the present application, the processor 110 is specifically configured to control the image sensor to acquire a first image at a first resolution after the light of each color has been diffracted; the first resolution is determined according to the zoom magnification interval in which the zoom magnification falls, and zoom magnifications in different zoom magnification intervals correspond to different resolutions.
With the electronic device provided in the embodiments of the present application, the electronic device controls, according to the zoom magnification, the grating structure in the camera module to diffract light of at least one color to the first photosensitive area of the image sensor in the camera module, acquires at least one first image based on the first photosensitive area, and obtains a second image corresponding to the zoom magnification based on the at least one first image. In this way, when the electronic device shoots, the grating can be controlled according to the zoom magnification to diffract the light to the corresponding photosensitive area of the image sensor, and an image corresponding to the zoom magnification is then obtained.
It should be appreciated that in the embodiments of the present application, the input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042, and the graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 107 includes at least one of a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may include two parts: a touch detection device and a touch controller. Other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys and a switch key), a trackball, a mouse, and a joystick, which are not described in detail here.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a first storage area storing programs or instructions and a second storage area storing data, where the first storage area may store an operating system, and application programs or instructions required for at least one function (such as a sound playing function and an image playing function). Further, the memory 109 may include volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), a static RAM (SRAM), a dynamic RAM (DRAM), a synchronous DRAM (SDRAM), a double data rate SDRAM (DDR SDRAM), an enhanced SDRAM (ESDRAM), a synchronous link DRAM (SLDRAM), or a direct Rambus RAM (DRRAM). The memory 109 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
The processor 110 may include one or more processing units; optionally, the processor 110 integrates an application processor, which primarily handles operations involving the operating system, the user interface, application programs, and the like, and a modem processor, which primarily handles wireless communication signals, such as a baseband processor. It will be appreciated that the modem processor may alternatively not be integrated into the processor 110.
An embodiment of the present application further provides a readable storage medium storing a program or instructions. When the program or instructions are executed by a processor, each process of the foregoing shooting method embodiments is implemented, and the same technical effects can be achieved; to avoid repetition, details are not described here again.
The processor is the processor in the electronic device described in the foregoing embodiments. The readable storage medium includes a computer-readable storage medium, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
An embodiment of the present application further provides a chip, the chip including a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to run a program or instructions to implement each process of the foregoing shooting method embodiments and achieve the same technical effects; to avoid repetition, details are not described here again.
It should be understood that the chip referred to in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip chip, etc.
An embodiment of the present application further provides a computer program product, stored in a storage medium, where the program product is executed by at least one processor to implement each process of the foregoing shooting method embodiments and achieve the same technical effects; to avoid repetition, details are not described here again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element. Furthermore, it should be noted that the scope of the methods and apparatus in the embodiments of the present application is not limited to performing the functions in the order shown or discussed; depending on the functions involved, they may also be performed in a substantially simultaneous manner or in the reverse order. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general hardware platform, or of course by hardware alone, although in many cases the former is the preferred implementation. Based on such an understanding, the technical solutions of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a computer software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk), including several instructions for causing a terminal (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the methods described in the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive. Enlightened by the present application, those of ordinary skill in the art may make many other forms without departing from the spirit of the present application and the scope of the claims, all of which fall within the protection of the present application.

Claims (15)

1. A camera module, comprising a lens, an image sensor, and a grating structure, wherein the image sensor comprises at least two pixel units, one grating structure is correspondingly arranged above each pixel unit, grating parameters of the grating structure change so as to change the propagation path of incident light when the grating structure is energized, and a pixel area of the image sensor is larger than an imaging area of the lens.
2. The camera module of claim 1, wherein a pixel area of the image sensor is at least 1.5 times an imaging area of the lens.
3. The camera module of claim 1, wherein the grating structure comprises at least two grating units, the grating units comprise a first subunit and a second subunit, the first subunit and the second subunit have equal width values, and the light transmittance of the first subunit and the second subunit changes when the first subunit and the second subunit are powered on.
4. The camera module of claim 3, wherein the first subunit and the second subunit have a width value of 1-20 nm.
5. A shooting method, performed by an electronic device, wherein the electronic device includes the camera module according to any one of claims 1 to 4, the method comprising:
according to the zoom magnification, controlling a grating structure in the camera module to diffract light rays of at least one color to a first photosensitive area of an image sensor in the camera module respectively;
acquiring at least one first image based on the first photosensitive area;
and obtaining a second image corresponding to the zoom magnification based on the at least one first image.
6. The method of claim 5, wherein controlling the grating structure in the camera module to diffract light of at least one color to the first photosensitive area of the image sensor in the camera module according to the zoom magnification, respectively, comprises:
determining the first photosensitive area based on the zoom magnification and lens parameters of a lens in the camera module;
determining a diffraction angle of the light according to the incidence angle of the light on the grating structure and the position information of the first photosensitive area, wherein the diffraction angle is an angle for diffracting the light to the first photosensitive area;
determining a first grating parameter according to the wavelength of the light, the incident angle and the diffraction angle;
and adjusting grating parameters of the grating structure to the first grating parameters so that the grating structure diffracts the light to the first photosensitive area.
7. The method of claim 6, wherein adjusting the grating parameters of the grating structure to the first grating parameters comprises:
adjusting grating parameters of the grating structures to the first grating parameters by applying electrical information corresponding to the first grating parameters to at least one grating structure;
wherein the at least one grating structure comprises a grating structure corresponding to the light ray when the light ray is incident.
8. The method of claim 5, wherein acquiring at least one first image based on the first photosensitive area comprises:
after the light of each color is diffracted, controlling the image sensor to acquire a first image according to a first resolution;
the first resolution is determined according to a zoom magnification interval in which the zoom magnification is located, and different zoom magnification intervals correspond to different resolutions.
9. The method according to any one of claims 5 to 8, wherein the at least one color of light comprises: red light, green light, and blue light.
10. An electronic device, comprising the camera module according to any one of claims 1 to 4, the electronic device further comprising a control module, an acquisition module, and an obtaining module, wherein:
the control module is used for controlling, according to the zoom magnification, the grating structure in the camera module to diffract light rays of at least one color to a first photosensitive area of the image sensor in the camera module;
the acquisition module is used for acquiring at least one first image based on the first photosensitive area;
the obtaining module is used for obtaining a second image corresponding to the zoom magnification based on the at least one first image acquired by the acquisition module.
11. The electronic device according to claim 10, wherein the control module is specifically configured to:
determining the first photosensitive area based on the zoom magnification and lens parameters of a lens in the camera module;
determining a diffraction angle of the light according to the incidence angle of the light on the grating structure and the position information of the first photosensitive area, wherein the diffraction angle is an angle for diffracting the light to the first photosensitive area;
determining a first grating parameter according to the wavelength of the light, the incident angle and the diffraction angle;
and adjusting grating parameters of the grating structure to the first grating parameters so that the grating structure diffracts the light to the first photosensitive area.
12. The electronic device of claim 11, wherein the control module is specifically configured to:
adjusting grating parameters of the grating structures to the first grating parameters by applying electrical information corresponding to the first grating parameters to at least one grating structure;
wherein the at least one grating structure comprises a grating structure corresponding to the light ray when the light ray is incident.
13. The electronic device according to claim 10, wherein the acquisition module is specifically configured to control the image sensor to acquire a first image according to a first resolution after the light of each color is diffracted;
The first resolution is determined according to a zoom magnification interval in which the zoom magnification is located, and the zoom magnifications in different zoom magnification intervals correspond to different resolutions.
14. An electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the shooting method of any of claims 5-9.
15. A readable storage medium, characterized in that the readable storage medium has stored thereon a program or instructions which, when executed by a processor, implement the steps of the shooting method according to any one of claims 5 to 9.
CN202410097836.1A 2024-01-23 2024-01-23 Image pickup module, image pickup method, electronic device, and readable storage medium Pending CN117714823A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410097836.1A CN117714823A (en) 2024-01-23 2024-01-23 Image pickup module, image pickup method, electronic device, and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410097836.1A CN117714823A (en) 2024-01-23 2024-01-23 Image pickup module, image pickup method, electronic device, and readable storage medium

Publications (1)

Publication Number Publication Date
CN117714823A true CN117714823A (en) 2024-03-15

Family

ID=90150077

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410097836.1A Pending CN117714823A (en) 2024-01-23 2024-01-23 Image pickup module, image pickup method, electronic device, and readable storage medium

Country Status (1)

Country Link
CN (1) CN117714823A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination