CN117528209A - Image pickup module, electronic device, focusing method, focusing device and readable storage medium

Publication number
CN117528209A
Authority
CN
China
Prior art keywords
grating
distance
detection sensor
light
focusing
Legal status
Pending
Application number
CN202311506195.2A
Other languages
Chinese (zh)
Inventor
韩志良
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202311506195.2A
Publication of CN117528209A

Abstract

The application discloses a camera module, an electronic device, a focusing method, a focusing device and a readable storage medium, and belongs to the technical field of photography. The camera module provided by the embodiments of the application includes: a housing; a lens assembly movably disposed within the housing; and a grating assembly disposed within the housing, the grating assembly being used to detect the moving distance of the lens assembly within the housing during focusing.

Description

Image pickup module, electronic device, focusing method, focusing device and readable storage medium
Technical Field
The application belongs to the technical field of photography, and particularly relates to a camera module, electronic equipment, a focusing method, a focusing device and a readable storage medium.
Background
In general, when capturing an image with an electronic device, the electronic device determines the in-focus position at which the lens assembly of the camera module needs to be located for the camera module to be in an in-focus state, controls the lens assembly to move toward that position, then collects a preview image through the camera module, and determines that the camera module is in the in-focus state when the sharpness of the preview image is greater than a preset threshold, so that the electronic device can capture a clear image through the camera module in the in-focus state.
However, because there are errors in how the electronic device controls the movement of the lens assembly, the position of the lens assembly may need to be adjusted repeatedly, and it can take a long time before a clear preview image is acquired through the camera module, which results in poor focusing capability of the electronic device.
Disclosure of Invention
An objective of the embodiments of the present application is to provide a camera module, an electronic device, a focusing method, a focusing device and a readable storage medium, which can solve the problem of poor focusing capability of the electronic device.
In a first aspect, an embodiment of the present application provides a camera module, where the camera module includes: a housing; a lens assembly movably disposed within the housing; and a grating assembly disposed within the housing, the grating assembly being used to detect the moving distance of the lens assembly within the housing during focusing.
In a second aspect, embodiments of the present application provide an electronic device, including: the camera module of the first aspect.
In a third aspect, an embodiment of the present application provides a focusing method, performed by the electronic device in the second aspect, the method including: in a case where a camera module of the electronic device is in an out-of-focus state, acquiring a first focusing distance, where the first focusing distance is the distance that the lens assembly of the camera module needs to move for the camera module to be in an in-focus state; controlling the lens assembly to move within the housing of the camera module, and detecting, through the grating assembly of the camera module, a first moving distance of the lens assembly within the housing; and determining, according to the first moving distance and the first focusing distance, whether the camera module is in the in-focus state.
In a fourth aspect, an embodiment of the present application provides a focusing device including the camera module according to the first aspect, the focusing device further including: an acquisition module, used to acquire a first focusing distance in a case where the camera module of the focusing device is in an out-of-focus state, where the first focusing distance is the distance that the lens assembly of the camera module needs to move for the camera module to be in an in-focus state; and a processing module, used to control the lens assembly to move within the housing of the camera module, detect, through the grating assembly of the camera module, a first moving distance of the lens assembly within the housing, and determine, according to the first moving distance and the first focusing distance, whether the camera module is in the in-focus state.
In a fifth aspect, embodiments of the present application provide an electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the method as described in the third aspect.
In a sixth aspect, embodiments of the present application provide a readable storage medium having stored thereon a program or instructions which when executed by a processor implement the steps of the method according to the third aspect.
In a seventh aspect, embodiments of the present application provide a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and where the processor is configured to execute a program or instructions to implement the steps of the method according to the third aspect.
In an eighth aspect, embodiments of the present application provide a computer program product stored in a storage medium, the program product being executed by at least one processor to implement the steps of the method according to the third aspect.
In the embodiments of the present application, because the grating assembly is disposed in the camera module, the moving distance of the lens assembly within the housing can be detected directly through the grating assembly during focusing, so that whether the camera module is in an in-focus state can be determined according to this moving distance and the position of the camera module can be adjusted in real time, without collecting a preview image through the camera module and calculating its sharpness. Therefore, the time the electronic device takes to determine whether the camera module is in the in-focus state can be reduced, the focusing capability of the electronic device is improved, and a clear image can be captured quickly.
Drawings
Fig. 1 is a first schematic side view of a camera module according to an embodiment of the present application;
Fig. 2 is a second schematic side view of a camera module according to an embodiment of the present application;
Fig. 3 is a third schematic side view of a camera module according to an embodiment of the present application;
Fig. 4 is a fourth schematic side view of a camera module according to an embodiment of the present application;
Fig. 5 is a fifth schematic side view of a camera module according to an embodiment of the present application;
Fig. 6 is a first schematic top view of a camera module according to an embodiment of the present application;
Fig. 7 is a second schematic top view of a camera module according to an embodiment of the present application;
Fig. 8 is a third schematic top view of a camera module according to an embodiment of the present application;
Fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 10 is a schematic flow chart of a focusing method according to an embodiment of the present application;
Fig. 11 is a schematic structural diagram of a focusing device according to an embodiment of the present application;
Fig. 12 is a first schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application;
Fig. 13 is a second schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application.
Detailed Description
Technical solutions in the embodiments of the present application will be described clearly below with reference to the drawings in the embodiments of the present application. It is apparent that the described embodiments are some, but not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application fall within the scope of protection of the present application.
The terms "first," "second," and the like in the description of the present application, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged, as appropriate, such that embodiments of the present application may be implemented in sequences other than those illustrated or described herein, and that the objects identified by "first," "second," etc. are generally of a type and not limited to the number of objects, e.g., the first object may be one or more. In addition, "and/or" in the specification means at least one of the connected objects, and the character "/", generally means a relationship in which the associated objects are one kind of "or".
The terms "at least one", and the like in the description of the present application refer to any one, any two, or a combination of two or more of the objects that it comprises. For example, at least one of a, b, c (item) may represent: "a", "b", "c", "a and b", "a and c", "b and c" and "a, b and c", wherein a, b, c may be single or plural. Similarly, the term "at least two" means two or more, and the meaning of the expression is similar to the term "at least one".
The image capturing module, the electronic device, the focusing method, the device and the readable storage medium provided in the embodiments of the present application are described in detail below with reference to the accompanying drawings by means of specific embodiments and application scenarios thereof.
Fig. 1 shows a possible schematic structural diagram of a camera module provided in an embodiment of the present application, as shown in fig. 1, the camera module includes: a housing 10; a lens assembly 11, the lens assembly 11 being movably disposed within the housing 10; and a grating assembly 12, wherein the grating assembly 12 is disposed in the housing 10, and the grating assembly 12 is used for detecting a moving distance of the lens assembly 11 in the housing 10 during focusing.
In some embodiments of the present application, the shape of the housing 10 may be any of a rectangle, a circle, and an oval, and the housing 10 has a cavity disposed therein, where the cavity is used to accommodate the other components of the camera module, such as the lens assembly 11 and the grating assembly 12.
In some embodiments of the present application, in conjunction with fig. 1, as shown in fig. 2, the camera module may further include: a voice coil motor (not shown in the drawings), a post 13, an infrared filter (IR filter) 14, a photosensitive chip 15, and a printed circuit board 16, where the post 13 is disposed on an inner side wall of the housing 10 and is used to support the camera module, the voice coil motor is disposed on an inner side wall of the post 13, the printed circuit board 16 is disposed on the bottom of the housing 10, the voice coil motor is used to control the lens assembly 11 to move, the IR filter 14 and the photosensitive chip 15 are disposed between the voice coil motor and the printed circuit board 16, the IR filter 14 is used to filter infrared light, and the photosensitive chip 15 is used for imaging based on the light passing through the lens assembly 11 and the IR filter 14.
It should be noted that, for an explanation of how the voice coil motor controls the movement of the lens assembly 11, reference may be made to the specific description in the related art, which is not repeated in the embodiments of the present application.
In some embodiments of the present application, the lens assembly 11 may be a zoom lens, and the lens assembly 11 may include at least one lens.
In some embodiments of the present application, the lens assembly 11 may be connected through an elastic member to an inner wall of the housing 10, which may be an inner side wall or the bottom. The elastic member may be a spring or an elastic sheet.
In some embodiments of the present application, the lens assembly 11 may be moved in the housing 10 in a direction away from the bottom of the housing 10 by the driving of the voice coil motor; alternatively, the lens assembly 11 may be moved in the housing 10 in a direction toward the bottom of the housing 10 by the driving of the voice coil motor.
In some embodiments of the present application, the grating assembly 12 may include a light emitting source, a receiver, and at least one grating, where the light emitting source and the receiver may be disposed on an inner side wall of the housing 10, a portion of the at least one grating is disposed on the end of the lens assembly 11 near the bottom of the housing 10, another portion of the at least one grating is disposed on the inner side wall of the housing 10, and the two portions are disposed opposite to each other. In this way, the grating assembly 12 can emit light toward the at least one grating through the light emitting source, the receiver can receive the corresponding reflected light, and the moving distance of the lens assembly 11 within the housing 10 can be determined according to the reflected light. The light emitting source and the receiver may be integrally provided components.
In some embodiments, in conjunction with fig. 1, as shown in fig. 3, the grating assembly includes: a first grating 121, the first grating 121 being disposed within the housing 10; a second grating 122, one end of the second grating 122 being connected with the lens assembly 11, the second grating 122 being disposed opposite the first grating 121, the grating parameters of the second grating 122 and the first grating 121 being matched, and a reflective layer 1221 being further disposed on the side surface of the second grating 122 away from the first grating 121; and a detection sensor 123, the detection sensor 123 being disposed within the housing 10 and opposite the first grating 121, the detection sensor 123 being used to emit first light toward the first grating 121 and to calculate the moving distance of the second grating 122 according to received second light.
In some embodiments, one end of the first grating 121 may be fixedly connected to the inner side wall of the housing 10, and one end of the second grating 122 may be fixedly connected to the end of the lens assembly 11 near the bottom of the housing 10.
In some embodiments, the plane in which the first grating 121 lies is parallel to the plane in which the second grating 122 lies, and the size of the first grating 121 may be smaller than or equal to the size of the second grating 122. For example, the length of the first grating 121 may be less than or equal to the length of the second grating 122.
In some embodiments, the grating parameters of the second grating 122 and the first grating 121 may be the same. Wherein the grating parameters may comprise at least one of: grating period, grating line width, duty cycle, etc. The grating period may be understood as a distance between adjacent grating lines, and the duty cycle may be understood as a ratio of a width of a gap between adjacent grating lines to the grating period.
In some embodiments of the present application, an included angle is formed between the grating lines of the second grating 122 and the first grating 121, and the included angle ranges from 0.1 ° to 1 °.
It should be noted that, the "an included angle exists between the gate lines of the second grating 122 and the first grating 121" can be understood as: the projection of the first grating 121 on the plane of the second grating 122 is not 0 with respect to the second grating 122.
It can be understood that, because the grating parameters of the second grating 122 and the first grating 121 are matched, and an included angle exists between the grating lines of the second grating 122 and the first grating 121, after the first light passes through the first grating 121 and the second grating 122 and is reflected by the reflective layer 1221 back to the detection sensor 123 as the second light, the image corresponding to the second light contains stripes of alternating brightness, namely moire fringes, so that the moving distance of the second grating 122, that is, the moving distance of the lens assembly 11 within the housing 10, can be determined from these alternating bright and dark stripes.
Therefore, because an included angle exists between the grating lines of the second grating and the first grating, and the included angle can be set to 0.1° to 1°, the spacing between the alternating bright and dark stripes in the image corresponding to the second light falls within a preset distance range, and the moving distance of the second grating can be calculated accurately from these stripes, so that the moving distance of the lens assembly within the housing can be obtained.
In some embodiments, the reflective layer 1221 may be a mirror surface of a mirror, and the reflective layer 1221 may also be a reflective coating.
In some embodiments, the detection sensor 123 may be a specific semiconductor device, and the specific semiconductor device includes a light emitting source, a receiver, and a processor.
The light emitting source may emit specific light, where the specific light may be light with a wavelength belonging to a specific wavelength band, and the specific wavelength band may be a band that does not overlap with the wavelength band of light receivable by the photosensitive chip. For example, the light emitting source may be an infrared emitter, and the specific light may be infrared light with a wavelength of 1100 nm.
The receiver may be an infrared receiver, the light emitting source and the receiver may be distributed in different areas on one side of the detection sensor 123, or the light emitting source and the receiver may be staggered in the same area on one side of the detection sensor 123.
The processor is configured to perform signal shaping, data processing and physical position conversion on the received second light, so as to determine a moving distance of the second grating 122, that is, a moving distance of the lens assembly 11 in the housing 10.
In some embodiments, one end of the detection sensor 123 may be fixedly connected to the inner sidewall of the housing 10, and a plane in which the detection sensor 123 is located is parallel to a plane in which the first grating 121 is located.
The specific structure of the detection sensor 123 will be exemplified below by taking as an example that the light emitting source and the receiver may be distributed in different areas of one side of the detection sensor 123.
In some embodiments of the present application, in conjunction with fig. 3, as shown in fig. 4, the detection sensor includes a first pixel region 1231 and a second pixel region 1232; the first pixel region 1231 is used for emitting the first light, and the second pixel region 1232 is used for receiving the second light.
It is understood that the light emitting sources may be distributed in the first pixel region 1231 and the receivers may be distributed in the second pixel region 1232.
Therefore, because different pixel regions in the detection sensor can be used respectively for emitting the first light and receiving the second light, only the detection sensor needs to be provided, and the moving distance of the second grating, and thus the moving distance of the lens assembly within the housing, can be calculated accurately through the detection sensor alone, without additionally providing other components. The cost of the camera module can therefore be reduced.
In this embodiment, the second light is the light reflected to the detection sensor 123 by the reflective layer 1221 after the first light passes through the grating lines of the first grating 121 and the second grating 122.
It will be appreciated that the receiver of the detection sensor 123 may receive the second light, and the processor of the detection sensor 123 may perform signal shaping, data processing and physical position conversion on the second light, so that the moving distance of the second grating 122, that is, the moving distance of the lens assembly 11 in the housing 10, may be determined.
Therefore, after the first light passes through the first grating and the second grating, the reflective layer reflects the resulting light (namely the second light) to the detection sensor, and the image corresponding to the second light contains alternating bright and dark stripes, so that the detection sensor can accurately determine the moving distance of the second grating, that is, the moving distance of the lens assembly within the housing, according to these stripes, and the accuracy with which the detection sensor determines the moving distance of the lens assembly within the housing can be improved.
In some embodiments, in conjunction with fig. 3, as shown in fig. 5, the grating assembly 12 further includes: and a light condensing member 124, the light condensing member 124 being disposed between the detection sensor 123 and the first grating 121, the light condensing member 124 being configured to condense the first light.
In some embodiments, the light condensing member 124 may specifically be a condenser lens.
In some embodiments, one end of the light condensing member 124 may be fixedly connected to the inner side wall of the housing 10.
Therefore, because the light condensing member can be disposed between the detection sensor and the first grating, after the detection sensor emits the first light, the light condensing member can condense the first light so that it is directed toward the first grating. This increases the intensity of the light passing through the grating lines of the first grating and the second grating, and thus the intensity of the second light, which improves the clarity of the alternating bright and dark stripes in the image corresponding to the second light, so that the detection sensor can accurately determine the moving distance of the second grating, that is, the moving distance of the lens assembly within the housing.
The embodiments of the present application provide a camera module that includes a housing, a lens assembly movably disposed within the housing, and a grating assembly disposed within the housing, so that the camera module can detect, through the grating assembly, the distance the lens assembly moves within the housing during focusing. Because the grating assembly is disposed in the camera module, the moving distance of the lens assembly within the housing can be detected directly and in real time through the grating assembly during focusing, so that whether the camera module is in an in-focus state can be accurately determined according to this moving distance and the position of the camera module can be adjusted in real time, without collecting a preview image through the camera module and calculating its sharpness. Therefore, the time the electronic device takes to determine whether the camera module is in the in-focus state can be reduced, the focusing capability of the electronic device is improved, and a clear image can be captured quickly.
In some embodiments of the present application, the number of grating elements 12 described above may be at least one.
It should be noted that, for the description of the structure of each grating assembly 12, reference may be made to the specific description in the above embodiments, and the embodiments of the present application will not be repeated here.
In some embodiments, in the case where the number of grating elements 12 is one, the one grating element 12 may be disposed at one side of the center of the lens element 11.
For example, fig. 6 shows a schematic top view of a camera module according to an embodiment of the present application. As shown in fig. 6, the camera module includes one grating assembly, such as the grating assembly 125; the grating assembly 125 is disposed in the housing 10 and is located at one side of the center of the lens assembly 11.
In some embodiments, the number of grating elements 12 is at least two; wherein at least two grating assemblies 12 are arranged around the lens assembly 11.
It will be appreciated that at least two grating assemblies 12 may be evenly distributed around the center of the lens assembly 11.
For example, as shown in fig. 7, the camera module includes two grating assemblies, such as a grating assembly 126 and a grating assembly 127, where the grating assembly 126 and the grating assembly 127 are disposed in the housing 10 and are uniformly distributed around the center of the lens assembly 11, that is, the grating assembly 126 and the grating assembly 127 are symmetrical with respect to the center of the lens assembly 11.
For example, as shown in fig. 8, the camera module includes four grating assemblies, such as a grating assembly 128, a grating assembly 129, a grating assembly 130 and a grating assembly 131, where the grating assembly 128, the grating assembly 129, the grating assembly 130 and the grating assembly 131 are disposed in the housing 10 and are uniformly distributed around the center of the lens assembly 11.
In some embodiments, in the case where the number of the grating assemblies 12 is at least two, the at least two grating assemblies 12 may each detect a moving distance of the lens assembly 11 within the housing 10, and the moving distance of the lens assembly 11 in space, for example the moving distance of the lens assembly 11 in a direction toward a side wall of the housing 10, may be determined based on the detected at least two moving distances.
Therefore, because at least two grating assemblies can be provided, and each grating assembly can detect its own moving distance of the lens assembly within the housing, the moving distance of the lens assembly in space can be determined according to the at least two moving distances, and the camera module can use this spatial moving distance to optimize the captured image in other ways, reducing the probability of tilt and local blurring in the captured image and improving the image quality of images captured by the electronic device.
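As an illustration of how the readings of multiple grating assemblies might be combined, the sketch below assumes two grating assemblies placed symmetrically about the lens center (as in fig. 7); the averaging and the tilt estimate are illustrative assumptions, since the embodiments above do not specify the combination formula.

```python
import math

def combine_grating_readings(d1_mm: float, d2_mm: float, baseline_mm: float):
    """Combine the travel distances reported by two grating assemblies.

    d1_mm, d2_mm: distances measured by the two grating assemblies (mm).
    baseline_mm: assumed spacing between the two grating assemblies (mm).
    Returns the average axial travel of the lens assembly and an estimated
    tilt toward one side wall of the housing (illustrative only).
    """
    axial_travel = (d1_mm + d2_mm) / 2.0                # net movement along the optical axis
    tilt_rad = math.atan2(d1_mm - d2_mm, baseline_mm)   # unequal travel implies tilt
    return axial_travel, tilt_rad

# Example: one side moved 0.102 mm, the other 0.098 mm, with an 8 mm baseline
travel, tilt = combine_grating_readings(0.102, 0.098, 8.0)
print(f"axial travel = {travel:.3f} mm, tilt = {math.degrees(tilt):.3f} deg")
```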
Fig. 9 shows a possible structural schematic diagram of an electronic device provided in an embodiment of the present application, as shown in fig. 9, the electronic device 20 includes: the camera module 21 in the above embodiment.
In some embodiments of the present application, the processor of the electronic device 20 may be electrically connected to the camera module 21.
The embodiments of the present application provide an electronic device that includes the camera module in the above embodiments. Because the grating assembly is disposed in the camera module, the moving distance of the lens assembly within the housing can be detected directly and in real time through the grating assembly during focusing, so that whether the camera module is in an in-focus state can be accurately determined according to this moving distance and the position of the camera module can be adjusted in real time, without collecting a preview image through the camera module and calculating its sharpness. Therefore, the time the electronic device takes to determine whether the camera module is in the in-focus state can be reduced, the focusing capability of the electronic device is improved, and a clear image can be captured quickly.
According to the focusing method provided by the embodiment of the application, the execution body can be a focusing device, or an electronic device, or a functional module or entity in the electronic device. In the embodiment of the present application, an example of a focusing method executed by an electronic device is described.
Fig. 10 shows a flowchart of a focusing method provided in an embodiment of the present application, where the focusing method may be applied to the electronic device in the above embodiment. As shown in fig. 10, the focusing method provided in the embodiment of the present application may include the following steps 101 to 103.
Step 101, under the condition that an imaging module of the electronic equipment is in a defocus state, the electronic equipment acquires a first focusing distance.
In some embodiments of the present application, when the electronic device displays the shooting preview interface, the electronic device may collect at least one preview image through the camera module, perform a calculation on the at least one preview image using a preset algorithm, and determine, according to the calculation result, whether the camera module is in an out-of-focus state, so that the electronic device can acquire the first focusing distance upon determining that the camera module is in the out-of-focus state.
In some embodiments, the preset algorithm may be an autofocus algorithm, which may be either of the following: a Contrast Auto Focus (CAF) algorithm or a Phase Detection Auto Focus (PDAF) algorithm.
It should be noted that, for the description of the CAF algorithm and the PDAF algorithm, reference may be made to the specific description in the related art, and the embodiments of the present application are not repeated herein.
In this embodiment of the present application, the first focusing distance is the distance that the lens assembly of the camera module needs to move for the camera module to be in an in-focus state.
In some embodiments of the present application, the electronic device may calculate the first focusing distance using the phase information of the at least one preview image.
It should be noted that, for the description of calculating the first focal distance by using the phase information of at least one preview image for the electronic device, reference may be made to the specific description in the related art, and the embodiments of the present application are not repeated herein.
In some embodiments of the present application, the electronic device may further determine a first focusing direction according to the phase information of the at least one preview image, so that after a preset duration has elapsed from the time when the first focusing distance is obtained, the electronic device may control the lens assembly of the camera module to move by the first focusing distance along the first focusing direction.
It can be understood that the electronic device can control the lens assembly of the camera module to move the first focusing distance along the first focusing direction, so that the camera module is in a focusing state.
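A minimal sketch of step 101 is given below. It assumes a PDAF-style linear calibration in which the measured phase difference is converted to the required lens travel by a calibrated gain; the function name, the gain value, and the threshold are illustrative assumptions and are not details taken from the embodiments above.

```python
def acquire_first_focus_distance(phase_diff_px: float,
                                 gain_um_per_px: float = 2.5,
                                 defocus_threshold_px: float = 0.2):
    """Estimate the first focusing distance from PDAF phase information.

    phase_diff_px: phase difference measured on the preview image (pixels).
    gain_um_per_px: assumed calibration gain converting pixels to lens
                    travel in micrometers.
    Returns (is_defocused, first_focus_distance_um, direction), where
    direction is +1 or -1 along the first focusing direction.
    """
    is_defocused = abs(phase_diff_px) > defocus_threshold_px
    first_focus_distance_um = abs(phase_diff_px) * gain_um_per_px
    direction = 1 if phase_diff_px > 0 else -1
    return is_defocused, first_focus_distance_um, direction
```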
Step 102, the electronic device controls the lens assembly to move in the shell of the camera module, and detects a first moving distance of the lens assembly in the shell through the grating assembly of the camera module.
In some embodiments of the present application, the electronic device may control the grating assembly to emit light, and determine the first movement distance according to a reflected light of the light.
In some embodiments of the present application, the grating assembly includes a first grating, a second grating, and a detection sensor; the first grating is disposed within the housing; one end of the second grating is connected with the lens assembly, the second grating is disposed opposite the first grating, the grating parameters of the second grating and the first grating are matched, and a reflective layer is disposed on the side surface of the second grating away from the first grating; the detection sensor is disposed within the housing and opposite the first grating. In some embodiments, step 102 may be specifically implemented by the following steps 102a to 102c.
Step 102a, the electronic device controls the lens assembly to move in the housing of the camera module, controls the detection sensor of the grating assembly to emit first light to the first grating, and receives second light through the detection sensor.
In this embodiment of the present application, the second light is: the first light rays penetrate through the grating lines of the first grating and the second grating and then are reflected to the light rays of the detection sensor through the reflecting layer.
It should be noted that, for the electronic device to control the detection sensor to emit the first light to the first grating and receive the second light through the detection sensor, reference may be made to the specific description in the above embodiment, and this embodiment of the present application will not be repeated here.
Step 102b, the electronic device generates a first image according to the second light through the detection sensor.
In some embodiments of the present application, the electronic device may receive the second light through the receiver of the detection sensor, and generate a first image matrix according to the second light, where the first image matrix includes a plurality of matrix elements, where each matrix element is a pixel value of one pixel of the receiver after receiving the second light, so that the first image may be obtained according to the first image matrix.
In this embodiment of the present application, the first image includes at least two first stripes arranged at intervals.
In some embodiments of the present application, the at least two first stripes may specifically be: fringes with alternating brightness and darkness, such as "moire fringes".
Step 102c, the electronic device determines a first moving distance based on the first position information of the at least two first stripes.
In some embodiments of the present application, the first position information may specifically be position information of a third stripe among the at least two first stripes. The number of the third stripes may be at least one. In the case where the number of the third stripes is one, the third stripe may be any one of the at least two first stripes, and may be a bright stripe or a dark stripe. In the case where the number of the third stripes is at least two, the third stripes may be at least a part of the at least two first stripes, and may be all bright stripes, all dark stripes, or partly bright and partly dark stripes.
In some embodiments of the present application, the first location information may specifically be: coordinate information of the third stripe in the first image; the electronic device may perform image detection on the first image to determine the first location information.
In some embodiments of the present application, the electronic device may determine the moving distance of the third stripe based on the first position information, and calculate the first moving distance according to the moving distance of the third stripe.
In the case that the number of the third stripes is at least two, the electronic device may determine an average value of the moving distances of the third stripes, and then calculate the first moving distance according to the average value.
Therefore, after the electronic device controls the detection sensor to emit the first light and receives the second light through the detection sensor, the electronic device can generate the first image according to the second light and determine the first moving distance directly from the position information of the at least two first stripes in the first image, without collecting a preview image through the camera module and determining its sharpness. Therefore, the time the electronic device takes to determine whether the camera module is in an in-focus state, and thus the time required to capture a clear image, can be reduced.
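The following sketch illustrates steps 102b and 102c under stated assumptions: the receiver output is treated as a one-dimensional intensity profile, the bright-fringe positions are taken as the position information, and a simple peak average stands in for the image detection that the description leaves unspecified; the conversion factor from fringe shift to grating travel is passed in as an assumed parameter.

```python
import numpy as np

def bright_fringe_positions(profile: np.ndarray) -> np.ndarray:
    """Return the indices of local intensity maxima (bright fringes) in a
    1-D profile built from the receiver pixel values."""
    left = profile[1:-1] > profile[:-2]
    right = profile[1:-1] > profile[2:]
    return np.where(left & right)[0] + 1

def first_moving_distance(profile_before: np.ndarray,
                          profile_after: np.ndarray,
                          pixel_pitch_mm: float,
                          fringe_to_grating_factor: float) -> float:
    """Estimate the first moving distance from the fringe shift between the
    image taken before the lens moves and the image taken after it moves.

    fringe_to_grating_factor: assumed conversion between fringe shift and
    grating travel (the preset correspondence described in the text)."""
    pos_before = bright_fringe_positions(profile_before)
    pos_after = bright_fringe_positions(profile_after)
    n = min(len(pos_before), len(pos_after))
    # average shift of corresponding bright fringes, in pixels
    shift_px = float(np.mean(pos_after[:n] - pos_before[:n]))
    first_position_distance_mm = abs(shift_px) * pixel_pitch_mm
    return first_position_distance_mm * fringe_to_grating_factor
```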
In some embodiments of the present application, before step 102, the focusing method provided in the embodiments of the present application may further include the following steps 201 and 202, and step 102c may be specifically implemented by the following step 102c1.
Step 201, the electronic device controls the detection sensor to emit third light to the first grating, and receives fourth light through the detection sensor.
In some embodiments of the present application, the electronic device may control the detection sensor to emit the third light and receive the fourth light through the detection sensor within a preset duration from a time when the first focal distance is acquired.
In this embodiment of the present application, the fourth light is: the third light is reflected to the light of the detection sensor through the reflecting layer after passing through the grating lines of the first grating and the second grating.
Step 202, the electronic device generates a second image according to the fourth light through the detection sensor.
It should be noted that, for the description of the electronic device generating the second image according to the fourth light, reference may be made to the specific description in the foregoing embodiment, and this embodiment of the present application will not be repeated here.
In this embodiment of the present application, the second image includes at least two second stripes arranged at intervals.
In some embodiments of the present application, the at least two second stripes may specifically be: fringes with alternating brightness and darkness, such as "moire fringes".
Step 102c1, the electronic device determines a first moving distance according to the first position information and the second position information of the at least two second stripes.
In this embodiment of the present application, the second location information is: and position information of a fourth stripe corresponding to the third stripe among the at least two second stripes.
The "fourth stripe corresponding to the third stripe" may be understood as: the fourth stripe and the third stripe are diffraction stripes of the same grating.
In some embodiments of the present application, the electronic device may calculate a linear distance between the first position information and the second position information, and calculate the first movement distance according to the linear distance.
It should be noted that, for the description of calculating the linear distance between the first position information and the second position information by the electronic device, reference may be made to the specific description in the related art, and the embodiments of the present application are not repeated herein.
Therefore, before the lens assembly is controlled to move within the housing, the electronic device can control the detection sensor to emit the third light and receive the fourth light through the detection sensor, so that the electronic device can generate the second image according to the fourth light through the detection sensor. The electronic device can then accurately calculate the first moving distance according to the first position information and the second position information of the fourth stripe, corresponding to the third stripe, among the at least two second stripes in the second image, so that the accuracy of the calculated first moving distance can be improved.
In some embodiments of the present application, step 102c1 may be specifically implemented by the following steps 102c1a and 102c1b.
Step 102c1a, the electronic device calculates a first location distance between the first location information and the second location information.
In some embodiments of the present application, the electronic device may calculate a linear distance between the first location information and the second location information to obtain the first location distance. It is understood that the first position distance is a straight line distance between the first position information and the second position information.
Step 102c1b, the electronic device calculates a first moving distance according to the first position distance and a preset corresponding relation.
In this embodiment of the present application, the above-mentioned preset corresponding relationship has an association relationship with a first preset distance and a second preset distance, where the first preset distance is a distance between the detection sensor and the first grating, and the second preset distance is a distance between the first grating and the second grating.
In some embodiments of the present application, the preset correspondence may include at least one of the following: at least one first corresponding relation, a first preset value; each first corresponding relation is a corresponding relation between a position distance and a moving distance.
In some embodiments of the present application, in the case where the preset correspondence includes at least one first correspondence, the electronic device may determine, from at least one position distance in the at least one first correspondence, one position distance that matches the first position distance, and then determine, as the first movement distance, one movement distance corresponding to the one position distance.
It should be noted that the above "matching" may be understood as the same, or the difference between the two is less than or equal to the preset threshold value.
In some embodiments of the present application, the first preset value may be: the reciprocal value of the included angle between the grating lines of the first grating and the second grating.
Illustratively, assuming that the angle between the grating lines of the first grating and the second grating is 0.01rad, the first preset value may be a reciprocal value of 0.01, i.e. 1/0.01, i.e. 100.
In some embodiments of the present application, in a case where the preset correspondence includes a first preset value, the electronic device may determine a product of the first position distance and the first preset value as the first movement distance.
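A short sketch of step 102c1b follows, implementing the two forms of the preset correspondence described above: a lookup among first correspondences between position distance and moving distance, and the product with the first preset value (the reciprocal of the grating-line angle). The table contents, the matching threshold, and the function name are illustrative assumptions.

```python
from typing import Dict, Optional

def moving_distance_from_correspondence(first_position_distance_mm: float,
                                        table: Optional[Dict[float, float]] = None,
                                        angle_rad: Optional[float] = None,
                                        match_threshold_mm: float = 0.05) -> float:
    """Compute the first moving distance from the first position distance,
    following the preset correspondence described in the text above."""
    if table:
        # first correspondences: pick the stored position distance that matches
        best = min(table, key=lambda d: abs(d - first_position_distance_mm))
        if abs(best - first_position_distance_mm) <= match_threshold_mm:
            return table[best]
    if angle_rad is None:
        raise ValueError("need either a matching table entry or the grating-line angle")
    first_preset_value = 1.0 / angle_rad   # e.g. 1 / 0.01 rad = 100, as in the text
    return first_position_distance_mm * first_preset_value
```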
In the embodiments of the present application, the "moire fringes" have the following characteristics:
(1) Law of change: when the first grating and the second grating are relatively displaced by one grating pitch (i.e., the grating pitch of the first grating or of the second grating), the "moire fringes" shift by one fringe spacing. Due to the diffraction and interference effects of light, the variation of the moire fringes approximately follows a sine (or cosine) function, and the number of variation periods of the moire fringes is synchronized with the number of grating pitches by which the first grating and the second grating are relatively displaced.
(2) Amplification: in the case where the included angle between the grating lines of the first grating and the second grating is small, the pitch ω of the "moire fringes", the grating pitch W, and the included angle θ between the grating lines of the first grating and the second grating satisfy, approximately:
ω = W / θ
where the included angle θ between the grating lines of the first grating and the second grating is in radians (rad), and the grating pitch W is in millimeters (mm).
As can be seen from the above characteristics, the distance of relative movement between the first grating and the second grating (i.e., the first moving distance) is in a multiple relationship with the movement distance of the "moire fringes", so that the first position distance can be calculated first, and the distance of relative movement between the first grating and the second grating (i.e., the first moving distance) can then be calculated from the product of the first position distance and the first preset value.
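As an illustrative calculation of the amplification characteristic (the numbers are not taken from the original text): with a grating pitch W of 0.02 mm and an included angle θ of 0.01 rad, the fringe pitch is ω = W / θ = 0.02 / 0.01 = 2 mm, that is, the fringe pattern magnifies the grating pitch by a factor of 1/θ = 100, which is why even a very small displacement of the second grating produces a fringe movement that the detection sensor can resolve.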
Therefore, the electronic device can calculate the first position distance between the first position information and the second position information, and then accurately calculate the first moving distance according to the first position distance and the preset corresponding relation.
Step 103, the electronic device determines whether the camera module is in a focusing state according to the first moving distance and the first focusing distance.
In some embodiments of the present application, the electronic device may determine whether the camera module is in an in-focus state according to whether the first moving distance and the first in-focus distance are matched.
It should be noted that, the above "matching" can be understood as: the same, or a difference between the two is less than or equal to a preset difference.
In some embodiments of the present application, under the condition that the first moving distance and the first focusing distance are matched, the electronic device may determine that the image capturing module is in a focusing state, and at this time, the electronic device may capture a clear image.
In some embodiments of the present application, in the case where the first moving distance and the first focusing distance do not match, the electronic device may determine that the camera module is still in the out-of-focus state. The electronic device may then calculate the difference between the first focusing distance and the first moving distance to obtain a second moving distance, control the lens assembly to move by the second moving distance, and execute the above steps 101 to 103 again until it is determined that the camera module is in the in-focus state.
In some embodiments, in the case where the second moving distance is less than 0, the electronic device may control the lens assembly to move by the absolute value of the second moving distance in the direction opposite to the first focusing direction; or, in the case where the second moving distance is greater than 0, control the lens assembly to move by the second moving distance along the first focusing direction.
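Putting steps 101 to 103 together, a minimal control-loop sketch is shown below; the tolerance, the iteration limit, and the driver callables (`move_lens`, `measure_moved_distance`) are illustrative assumptions standing in for the voice coil motor control and the grating-assembly readout described above.

```python
def focus(first_focus_distance_um, direction, move_lens, measure_moved_distance,
          tolerance_um=0.5, max_iterations=5):
    """Drive the lens assembly until the grating-assembly reading matches the
    first focusing distance (steps 101 to 103, iterated as described above)."""
    target_um = first_focus_distance_um
    for _ in range(max_iterations):
        move_lens(direction * target_um)       # step 102: command the voice coil motor
        moved_um = measure_moved_distance()    # step 102: grating-assembly readout
        second_um = target_um - moved_um       # step 103: compare with the target
        if abs(second_um) <= tolerance_um:
            return True                        # camera module is in the in-focus state
        if second_um < 0:                      # overshoot: reverse the focusing direction
            direction, target_um = -direction, -second_um
        else:                                  # undershoot: keep moving the same way
            target_um = second_um
    return False
```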
The embodiments of the present application provide a focusing method. In the case where the camera module of the electronic device is in an out-of-focus state, the electronic device may first acquire a first focusing distance, which is the distance that the lens assembly of the camera module needs to move for the camera module to be in an in-focus state, then control the lens assembly to move within the housing of the camera module and detect, through the grating assembly of the camera module, a first moving distance of the lens assembly within the housing, so that the electronic device can determine, according to the first moving distance and the first focusing distance, whether the camera module is in the in-focus state. Because the grating assembly is disposed in the camera module, the moving distance of the lens assembly within the housing can be detected directly and in real time through the grating assembly during focusing, so that whether the camera module is in the in-focus state can be accurately determined according to this moving distance and the position of the camera module can be adjusted in real time, without collecting a preview image through the camera module and calculating its sharpness. Therefore, the time the electronic device takes to determine whether the camera module is in the in-focus state can be reduced, the focusing capability of the electronic device is improved, and a clear image can be captured quickly.
According to the focusing method provided by the embodiment of the application, the execution body can be a focusing device. In the embodiment of the present application, a focusing method performed by a focusing device is taken as an example, and the focusing device provided in the embodiment of the present application is described.
Fig. 11 shows a schematic diagram of one possible structure of the focusing device involved in the embodiment of the present application. As shown in fig. 11, the focusing device 50 may include the camera module in the above embodiment, and the focusing device 50 may further include:
the obtaining module 51 is configured to obtain a first focusing distance when the camera module of the focusing device 50 is in a defocus state, where the first focusing distance is: the camera module is in a focusing state, and the lens component of the camera module needs to move in distance. The processing module 52 is used for controlling the lens assembly to move in the shell of the camera module and detecting the first moving distance of the lens assembly in the shell through the grating assembly of the camera module; and determining whether the camera module is in an in-focus state according to the first moving distance and the first in-focus distance acquired by the acquisition module 51.
The embodiments of the present application provide a focusing device. Because the grating assembly is disposed in the camera module of the focusing device, the moving distance of the lens assembly within the housing can be detected directly and in real time through the grating assembly during focusing, so that whether the camera module is in an in-focus state can be accurately determined according to this moving distance and the position of the camera module can be adjusted in real time, without collecting a preview image through the camera module and calculating its sharpness. Therefore, the time the electronic device takes to determine whether the camera module is in the in-focus state can be reduced, the focusing capability of the electronic device is improved, and a clear image can be captured quickly.
In one possible implementation, the grating assembly includes a first grating, a second grating, and a detection sensor; the first grating is disposed within the housing; one end of the second grating is connected with the lens assembly, the second grating is disposed opposite the first grating, the grating parameters of the second grating and the first grating are matched, and a reflective layer is disposed on the side surface of the second grating away from the first grating; the detection sensor is disposed within the housing and opposite the first grating. The processing module 52 is specifically used to control the detection sensor to emit first light toward the first grating and receive second light through the detection sensor, where the second light is the light reflected to the detection sensor by the reflective layer after the first light passes through the first grating and the second grating; to generate, through the detection sensor and according to the second light, a first image that includes at least two first stripes arranged at intervals; and to determine the first moving distance based on the first position information of the at least two first stripes.
In one possible implementation, the processing module 52 is further used to, before controlling the lens assembly to move within the housing of the camera module, control the detection sensor to emit third light toward the first grating and receive fourth light through the detection sensor, where the fourth light is the light reflected to the detection sensor by the reflective layer after the third light passes through the first grating and the second grating; and to generate, through the detection sensor and according to the fourth light, a second image that includes at least two second stripes arranged at intervals. The processing module 52 is specifically used to determine the first moving distance according to the first position information and the second position information of the at least two second stripes.
In one possible implementation, the processing module 52 is specifically configured to calculate a first location distance between the first location information and the second location information; and calculating a first moving distance according to the first position distance and a preset corresponding relation, wherein the preset corresponding relation has an association relation with a first preset distance and a second preset distance, the first preset distance is the distance between the detection sensor and the first grating, and the second preset distance is the distance between the first grating and the second grating.
The focusing device in the embodiment of the application may be an electronic device, or may be a component in the electronic device, for example, an integrated circuit or a chip. The electronic device may be a terminal, or may be other devices than a terminal. Illustratively, the electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted electronic device, a mobile internet appliance (mobile internet device, MID), an augmented reality (augmented reality, AR)/Virtual Reality (VR) device, a robot, a wearable device, an ultra-mobile personal computer (ultra-mobile personal computer, UMPC), a netbook or a personal digital assistant (personal digital assistant, PDA), or the like, and may also be a server, a network attached storage (network attached storage, NAS), a personal computer (personal computer, PC), a Television (TV), a teller machine, a self-service machine, or the like, which is not particularly limited in the embodiments of the present application.
The focusing device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android operating system, an iOS operating system, or other possible operating systems, which are not specifically limited in the embodiments of the present application.
The focusing device provided in the embodiment of the present application can implement each process implemented by the embodiments of the methods of fig. 1 to 10, so as to achieve the same technical effects, and in order to avoid repetition, a detailed description is omitted here.
In some embodiments, as shown in fig. 12, the embodiment of the present application further provides an electronic device 60, including a processor 61 and a memory 62, where a program or an instruction capable of being executed on the processor 61 is stored in the memory 62, and the program or the instruction is executed by the processor 61 to implement each process step of the above focusing method embodiment, and achieve the same technical effect, so that repetition is avoided and no further description is given here.
The electronic devices in the embodiments of the present application include mobile electronic devices and non-mobile electronic devices.
Fig. 13 is a schematic hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 100 includes, but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, and processor 110.
The electronic device further includes a camera module, and the camera module includes: a housing; a lens assembly movably disposed within the housing; and a grating assembly disposed within the housing, where the grating assembly is used for detecting the moving distance of the lens assembly in the housing during focusing.
Those skilled in the art will appreciate that the electronic device 100 may further include a power source (e.g., a battery) for powering the various components, and the power source may be logically coupled to the processor 110 via a power management system, so that functions such as charging management, discharging management, and power consumption management are performed via the power management system. The electronic device structure shown in fig. 13 does not constitute a limitation on the electronic device; the electronic device may include more or fewer components than shown, combine certain components, or arrange the components differently, which is not described in detail herein.
The processor 110 is configured to obtain a first focusing distance in a case that the camera module of the electronic device is in a defocus state, where the first focusing distance is the distance that the lens assembly of the camera module needs to move for the camera module to be in a focusing state; control the lens assembly to move in the housing of the camera module, and detect a first moving distance of the lens assembly in the housing through the grating assembly of the camera module; and determine, according to the first moving distance and the first focusing distance, whether the camera module is in the focusing state.
The embodiments of the present application provide an electronic device. Because the grating assembly is disposed in the camera module, the moving distance of the lens assembly in the housing can be detected directly and in real time through the grating assembly during focusing, and whether the camera module is in the focusing state can be determined according to this moving distance, so that the position of the lens assembly can be adjusted in real time without collecting a preview image through the camera module and calculating the definition of the preview image. Therefore, the time required for the electronic device to determine that the camera module is in the focusing state can be reduced, the focusing capability of the electronic device is improved, and a clear image can be captured quickly.
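Illustratively, and without limiting the embodiments of the present application, the control flow described above can be pictured with the following Python sketch. The camera interface (is_defocused, get_first_focusing_distance, move_lens, read_grating_moving_distance), the tolerance value, and the iteration limit are assumptions introduced only for this illustration and are not part of the application.

```python
FOCUS_TOLERANCE_UM = 2.0  # assumed acceptable residual error, in micrometres

def focus_with_grating_feedback(camera, max_iterations=5):
    """Hedged sketch of the closed-loop focusing flow: move the lens toward the
    in-focus position and verify the move with the grating assembly instead of
    re-evaluating preview-image sharpness."""
    for _ in range(max_iterations):
        if not camera.is_defocused():
            return True
        # First focusing distance: how far the lens still needs to travel.
        target_um = camera.get_first_focusing_distance()
        camera.move_lens(target_um)
        # First moving distance: how far the grating assembly says the lens moved.
        measured_um = camera.read_grating_moving_distance()
        if abs(target_um - measured_um) <= FOCUS_TOLERANCE_UM:
            return True  # treated as the focusing state
        # Otherwise, correct the residual error on the next pass.
    return False
```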
In some embodiments of the present application, the grating assembly includes a first grating, a second grating, and a detection sensor. The first grating is disposed in the housing; one end of the second grating is connected to the lens assembly, the second grating is disposed opposite to the first grating, grating parameters of the second grating match those of the first grating, and a reflecting layer is disposed on a surface of the second grating facing away from the first grating. The detection sensor is disposed in the housing and opposite to the first grating, and is used for emitting first light to the first grating and calculating the moving distance of the second grating according to received second light, where the second light is light that passes through the first grating and the second grating and is then reflected to the detection sensor by the reflecting layer.
In some embodiments of the present application, an included angle is formed between the grating lines of the second grating and those of the first grating, and the included angle ranges from 0.1° to 1°.
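Illustratively, the small included angle can be read as a moiré arrangement: by the standard small-angle moiré approximation (a general optics relation, not a formula stated in this application), two gratings of equal pitch p crossed at an angle θ (in radians) produce fringes with a spacing of roughly d ≈ p/θ, and a lateral shift Δx of one grating moves those fringes by roughly Δx/θ. Assuming, only for illustration, a pitch of 5 µm and an included angle of 0.5° (about 0.0087 rad), the fringe spacing would be on the order of 0.57 mm, and a 1 µm movement of the second grating would shift the fringes by roughly 0.11 mm, which makes the movement far easier for the detection sensor to resolve than the raw grating displacement.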
In some embodiments of the present application, the detection sensor includes a first pixel area and a second pixel area; the first pixel area is used for emitting first light rays, and the second pixel area is used for receiving second light rays.
In some embodiments of the present application, the grating assembly further includes: the light gathering piece is arranged between the detection sensor and the first grating and is used for gathering the first light.
In some embodiments of the present application, the number of grating assemblies is at least two, and the at least two grating assemblies are disposed around the lens assembly.
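The purpose of using at least two grating assemblies is not stated here; one plausible, purely illustrative reading is that readings taken on opposite sides of the lens assembly can be combined, for example averaged, so that a slight tilt of the lens assembly does not bias the measured travel. The Python sketch below expresses only that assumption.

```python
def combined_moving_distance(per_assembly_distances_um):
    """Hedged sketch: combine readings from several grating assemblies.

    Averaging readings taken on opposite sides of the lens assembly is one
    simple way to cancel a tilt component; the application itself does not
    specify how multiple readings are used.
    """
    if not per_assembly_distances_um:
        raise ValueError("at least one grating assembly reading is required")
    return sum(per_assembly_distances_um) / len(per_assembly_distances_um)
```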
Based on the above structure, in one possible implementation, the processor 110 is specifically configured to control the detection sensor to emit first light to the first grating and receive second light through the detection sensor, where the second light is light that passes through the first grating and the second grating and is then reflected to the detection sensor by the reflecting layer; generate, through the detection sensor, a first image according to the second light, where the first image includes at least two first stripes arranged at intervals; and determine the first moving distance based on first position information of the at least two first stripes.
Based on the above structure, in one possible implementation, the processor 110 is further configured to control the detection sensor to emit third light to the first grating and receive fourth light through the detection sensor, where the fourth light is light that passes through the first grating and the second grating and is then reflected to the detection sensor by the reflecting layer; and generate, through the detection sensor, a second image according to the fourth light, where the second image includes at least two second stripes arranged at intervals.
The processor 110 is specifically configured to determine the first moving distance according to the first position information and the second position information of the at least two second stripes.
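How the stripe positions are extracted from the first image and the second image is not described above. Illustratively, and only as an assumed approach, the NumPy sketch below collapses the sensor image along the stripe direction and reports intensity peaks as stripe positions; none of the names or thresholds come from this application.

```python
import numpy as np

def stripe_positions(image: np.ndarray, min_separation: int = 5) -> np.ndarray:
    """Hedged sketch: estimate stripe (fringe) positions from a sensor image.

    Collapses the 2-D image along the stripe direction, then reports the
    column indices of local intensity maxima as stripe positions.
    """
    profile = image.mean(axis=0)  # 1-D intensity profile across the stripes
    # Local maxima that are brighter than their neighbours and above the mean.
    candidates = [
        i for i in range(1, len(profile) - 1)
        if profile[i] > profile[i - 1]
        and profile[i] >= profile[i + 1]
        and profile[i] > profile.mean()
    ]
    # Keep peaks at least `min_separation` pixels apart (crude non-max suppression).
    positions = []
    for idx in candidates:
        if not positions or idx - positions[-1] >= min_separation:
            positions.append(idx)
    return np.asarray(positions)

def fringe_shift(first_positions: np.ndarray, second_positions: np.ndarray) -> float:
    """Average shift between matched stripes of the two images, in pixels."""
    n = min(len(first_positions), len(second_positions))
    return float(np.mean(first_positions[:n] - second_positions[:n]))
```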
Based on the above structure, in one possible implementation, the processor 110 is specifically configured to calculate a first position distance between the first position information and the second position information, and calculate the first moving distance according to the first position distance and a preset corresponding relation, where the preset corresponding relation has an association relation with a first preset distance and a second preset distance, the first preset distance being the distance between the detection sensor and the first grating, and the second preset distance being the distance between the first grating and the second grating.
It should be appreciated that in embodiments of the present application, the input unit 104 may include a graphics processor (graphics processing unit, GPU) 1041 and a microphone 1042, the graphics processor 1041 processing image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 107 includes at least one of a touch panel 1071 and other input devices 1072. The touch panel 1071 is also referred to as a touch screen. The touch panel 1071 may include two parts of a touch detection device and a touch controller. Other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and so forth, which are not described in detail herein.
Memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a first memory area storing programs or instructions and a second memory area storing data, wherein the first memory area may store an operating system, application programs or instructions (such as a sound playing function, an image playing function, etc.) required for at least one function, and the like. Further, the memory 109 may include volatile memory or nonvolatile memory, or the memory 109 may include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an electrically Erasable EPROM (EEPROM), or a flash memory. The volatile memory may be random access memory (random access memory, RAM), static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (ddr SDRAM), enhanced SDRAM (ESDRAM), synchronous Link DRAM (SLDRAM), and direct memory bus RAM (DRRAM). Memory 109 in embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
Processor 110 may include one or more processing units; in some embodiments, processor 110 integrates an application processor that primarily processes operations involving an operating system, user interface, application programs, and the like, and a modem processor that primarily processes wireless communication signals, such as a baseband processor. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored, and when the program or the instruction is executed by a processor, the program or the instruction realizes each process of the above focusing method embodiment, and the same technical effects can be achieved, so that repetition is avoided, and no further description is given here.
Wherein the processor is a processor in the electronic device described in the above embodiments. The readable storage medium includes a computer-readable storage medium, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The embodiment of the application further provides a chip, the chip includes a processor and a communication interface, the communication interface is coupled with the processor, the processor is used for running a program or an instruction, implementing each process of the focusing method embodiment, and achieving the same technical effect, so as to avoid repetition, and no redundant description is provided herein.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip, etc.
The embodiments of the present application provide a computer program product stored in a storage medium, where the program product is executed by at least one processor to implement the respective processes of the above-mentioned focusing method embodiments, and achieve the same technical effects, and for avoiding repetition, a detailed description is omitted here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element. Furthermore, it should be noted that the scope of the methods and apparatuses in the embodiments of the present application is not limited to performing the functions in the order shown or discussed; the functions may also be performed substantially simultaneously or in a reverse order depending on the functions involved. For example, the described methods may be performed in an order different from that described, and steps may also be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general hardware platform, or, of course, by hardware, although in many cases the former is the preferred implementation. Based on such understanding, the technical solutions of the present application, or the part thereof contributing to the prior art, may essentially be embodied in the form of a computer software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk), including several instructions for causing a terminal (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the methods described in the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive, and many forms may be made by those of ordinary skill in the art without departing from the spirit of the present application and the scope of the claims, which are also within the protection of the present application.

Claims (17)

1. A camera module, wherein the camera module comprises:
a housing;
a lens assembly movably arranged in the housing; and
a grating assembly arranged in the housing, wherein the grating assembly is used for detecting a moving distance of the lens assembly in the housing during focusing.
2. The camera module of claim 1, wherein the grating assembly comprises:
a first grating arranged in the housing;
a second grating, one end of which is connected with the lens assembly, wherein the second grating is arranged opposite to the first grating, grating parameters of the second grating match those of the first grating, and a reflecting layer is further arranged on a side surface of the second grating facing away from the first grating; and
a detection sensor arranged in the housing and opposite to the first grating, wherein the detection sensor is used for emitting first light to the first grating and calculating a moving distance of the second grating according to received second light;
wherein the second light is light that is reflected to the detection sensor by the reflecting layer after the first light passes through the first grating and the second grating.
3. The camera module of claim 2, wherein an included angle is formed between the grating lines of the second grating and the grating lines of the first grating, and the included angle ranges from 0.1° to 1°.
4. The camera module of claim 2, wherein the detection sensor comprises a first pixel region and a second pixel region;
the first pixel area is used for emitting the first light, and the second pixel area is used for receiving the second light.
5. The camera module of claim 2, wherein the grating assembly further comprises:
the light gathering piece is arranged between the detection sensor and the first grating and is used for gathering the first light.
6. The camera module of any of claims 1 to 5, wherein the number of grating assemblies is at least two; wherein at least two of the grating assemblies are disposed around the lens assembly.
7. An electronic device comprising the camera module of any one of claims 1 to 6.
8. A focusing method performed by the electronic device of claim 7, the method comprising:
in a case that the camera module of the electronic device is in a defocus state, acquiring a first focusing distance, wherein the first focusing distance is a distance that the lens assembly of the camera module needs to move for the camera module to be in a focusing state;
controlling the lens assembly to move in the housing of the camera module, and detecting a first moving distance of the lens assembly in the housing through the grating assembly of the camera module;
and determining whether the camera module is in a focusing state according to the first moving distance and the first focusing distance.
9. The method of claim 8, wherein the grating assembly comprises a first grating, a second grating, and a detection sensor; the first grating is arranged in the housing; one end of the second grating is connected with the lens assembly, the second grating is arranged opposite to the first grating, grating parameters of the second grating match those of the first grating, and a reflecting layer is further arranged on a side surface of the second grating facing away from the first grating; and the detection sensor is arranged in the housing and is arranged opposite to the first grating;
The detecting, by the grating assembly of the camera module, the first moving distance of the lens assembly in the housing includes:
controlling the detection sensor to emit first light to the first grating, and receiving second light through the detection sensor, wherein the second light is light that passes through the first grating and the second grating and is then reflected to the detection sensor by the reflecting layer;
generating a first image according to the second light rays through the detection sensor, wherein the first image comprises at least two first stripes which are arranged at intervals;
determining the first moving distance based on first position information of at least two of the first stripes.
10. The method of claim 9, wherein before the controlling the lens assembly to move in the housing of the camera module, the method further comprises:
controlling the detection sensor to emit third light to the first grating, and receiving fourth light through the detection sensor, wherein the fourth light is light that passes through the first grating and the second grating and is then reflected to the detection sensor by the reflecting layer;
generating a second image according to the fourth light through the detection sensor, wherein the second image comprises at least two second stripes which are arranged at intervals;
the determining the first moving distance based on the first position information of at least two first stripes includes:
and determining the first moving distance according to the first position information and the second position information of at least two second stripes.
11. The method of claim 10, wherein the determining the first moving distance according to the first position information and the second position information of the at least two second stripes comprises:
calculating a first position distance between the first position information and the second position information;
and calculating the first moving distance according to the first position distance and a preset corresponding relation, wherein the preset corresponding relation has an association relation with a first preset distance and a second preset distance, the first preset distance is the distance between the detection sensor and the first grating, and the second preset distance is the distance between the first grating and the second grating.
12. A focusing device comprising the camera module according to any one of claims 1 to 6, the focusing device further comprising:
an acquisition module, configured to acquire a first focusing distance in a case that the camera module of the focusing device is in a defocus state, wherein the first focusing distance is a distance that the lens assembly of the camera module needs to move for the camera module to be in a focusing state; and
a processing module, configured to control the lens assembly to move in the housing of the camera module, and detect a first moving distance of the lens assembly in the housing through the grating assembly of the camera module; and determine whether the camera module is in a focusing state according to the first moving distance and the first focusing distance.
13. The focusing device of claim 12, wherein the grating assembly comprises a first grating, a second grating, and a detection sensor; the first grating is arranged in the housing; one end of the second grating is connected with the lens assembly, the second grating is arranged opposite to the first grating, grating parameters of the second grating match those of the first grating, and a reflecting layer is further arranged on a side surface of the second grating facing away from the first grating; and the detection sensor is arranged in the housing and is arranged opposite to the first grating;
The processing module is specifically configured to control the detection sensor to emit first light to the first grating and receive second light through the detection sensor, wherein the second light is light that passes through the first grating and the second grating and is then reflected to the detection sensor by the reflecting layer; generate, through the detection sensor, a first image according to the second light, wherein the first image comprises at least two first stripes which are arranged at intervals; and determine the first moving distance based on first position information of the at least two first stripes.
14. The focusing device of claim 13, wherein the processing module is further configured to, before controlling the lens assembly to move in the housing of the camera module, control the detection sensor to emit third light to the first grating and receive fourth light through the detection sensor, wherein the fourth light is light that passes through the first grating and the second grating and is then reflected to the detection sensor by the reflecting layer; and generate, through the detection sensor, a second image according to the fourth light, wherein the second image comprises at least two second stripes which are arranged at intervals; and the processing module is specifically configured to determine the first moving distance according to the first position information and second position information of the at least two second stripes.
15. The focusing device according to claim 14, wherein the processing module is configured to calculate a first position distance between the first position information and the second position information; and calculating the first moving distance according to the first position distance and a preset corresponding relation, wherein the preset corresponding relation has an association relation with a first preset distance and a second preset distance, the first preset distance is the distance between the detection sensor and the first grating, and the second preset distance is the distance between the first grating and the second grating.
16. An electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the focusing method according to any one of claims 8 to 11.
17. A readable storage medium, characterized in that the readable storage medium has stored thereon a program or instructions which, when executed by a processor, implement the steps of the focusing method according to any one of claims 8 to 11.
CN202311506195.2A 2023-11-13 2023-11-13 Image pickup module, electronic device, focusing method, focusing device and readable storage medium Pending CN117528209A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311506195.2A CN117528209A (en) 2023-11-13 2023-11-13 Image pickup module, electronic device, focusing method, focusing device and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311506195.2A CN117528209A (en) 2023-11-13 2023-11-13 Image pickup module, electronic device, focusing method, focusing device and readable storage medium

Publications (1)

Publication Number Publication Date
CN117528209A true CN117528209A (en) 2024-02-06

Family

ID=89762047

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311506195.2A Pending CN117528209A (en) 2023-11-13 2023-11-13 Image pickup module, electronic device, focusing method, focusing device and readable storage medium

Country Status (1)

Country Link
CN (1) CN117528209A (en)

Similar Documents

Publication Publication Date Title
KR101331543B1 (en) Three-dimensional sensing using speckle patterns
TWI585436B (en) Method and apparatus for measuring depth information
RU2502136C2 (en) Combined object capturing system and display device and associated method
US11399139B2 (en) High dynamic range camera assembly with augmented pixels
JP2014174357A (en) Imaging apparatus, imaging system, signal processor, program, and storage medium
CN104683693A (en) Automatic focusing method
US10791286B2 (en) Differentiated imaging using camera assembly with augmented pixels
CN110213491B (en) Focusing method, device and storage medium
CN107241592B (en) Imaging device and imaging method
CN109714539B (en) Image acquisition method and device based on gesture recognition and electronic equipment
US11509803B1 (en) Depth determination using time-of-flight and camera assembly with augmented pixels
JP2012015642A (en) Imaging device
JP2022107533A (en) Systems, methods and apparatuses for focus selection using image disparity
JP5857712B2 (en) Stereo image generation apparatus, stereo image generation method, and computer program for stereo image generation
CN112543284B (en) Focusing system, method and device
CN112824935B (en) Depth imaging system, method, device and medium based on modulated light field
CN113163114A (en) Image focusing method, device, equipment and medium
JP6368593B2 (en) Image processing program, information processing system, and image processing method
CN117528209A (en) Image pickup module, electronic device, focusing method, focusing device and readable storage medium
US11283970B2 (en) Image processing method, image processing apparatus, electronic device, and computer readable storage medium
JP2023517830A (en) Depth image generation method and device, reference image generation method and device, electronic device, and computer program
CN104683694A (en) Terminal
JP2011090166A (en) Stereo imaging apparatus
CN113132709B (en) Binocular distance measuring device, binocular distance measuring method and electronic equipment
JP6386837B2 (en) Image processing program, information processing system, information processing apparatus, and image processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination