CN113873132B - Lens module, mobile terminal, shooting method and shooting device

Lens module, mobile terminal, shooting method and shooting device

Info

Publication number
CN113873132B
CN113873132B
Authority
CN
China
Prior art keywords
laser
shooting
target
focusing
image distance
Prior art date
Legal status
Active
Application number
CN202111250296.9A
Other languages
Chinese (zh)
Other versions
CN113873132A (en)
Inventor
陈典浩 (Chen Dianhao)
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202111250296.9A
Publication of CN113873132A
Application granted
Publication of CN113873132B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 Constructional details
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • H04N23/671 Focus control based on electronic image sensor signals in combination with active ranging signals, e.g. using light or sound signals emitted toward objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Automatic Focus Adjustment (AREA)

Abstract

The application discloses a lens module, a mobile terminal, a shooting method and a shooting device, and belongs to the technical field of shooting. The lens module comprises a laser emitter, a laser receiver and an image sensor; the laser transmitter is used for transmitting laser signals; the laser receiver comprises at least two laser receiving units, wherein the at least two laser receiving units are used for receiving laser signals and filtering non-visible light signals; the image sensor comprises at least two pixel units, wherein the at least two pixel units are arranged in one-to-one correspondence with the at least two laser receiving units; the incident light signal is transmitted to the pixel unit after passing through the laser receiving unit.

Description

Lens module, mobile terminal, shooting method and shooting device
Technical Field
The application belongs to the technical field of shooting, and particularly relates to a lens module, a mobile terminal, a shooting method and a shooting device.
Background
With the rapid development of electronic technology and image processing technology, the shooting function of the terminal is more and more powerful. In the shooting process, the lens can acquire a clear image of the target position by focusing on the target position. In the related art, the laser auto-focusing position defaults to the center of the preview screen, so the focusing effect is not ideal when focusing on a position away from the image center in the shooting preview interface.
Disclosure of Invention
An object of the embodiments of the present application is to provide a lens module, a mobile terminal, a shooting method and a shooting device, which can solve the problem of poor focusing effect in the related art.
In a first aspect, embodiments of the present application provide a lens module including a laser emitter, a laser receiver, and an image sensor. The laser transmitter is used for transmitting laser signals; the laser receiver comprises at least two laser receiving units, wherein the at least two laser receiving units are used for receiving laser signals and filtering non-visible light signals; the image sensor comprises at least two pixel units, wherein the at least two pixel units are arranged in one-to-one correspondence with the at least two laser receiving units; the incident light signal is transmitted to the pixel unit after passing through the laser receiving unit.
In a second aspect, an embodiment of the present application provides a mobile terminal, where the mobile terminal includes a lens module as described in the first aspect.
In a third aspect, an embodiment of the present application provides a photographing method, including: receiving laser signals by at least two laser receiving units of the laser receiver; calculating at least two image distances of at least two pixel units according to the laser signals; determining a target image distance according to at least two image distances; focusing processing is carried out based on the target image distance, and a target file is obtained through shooting, wherein the target file comprises at least one of the following: images, video.
In a fourth aspect, an embodiment of the present application provides a photographing apparatus, where the photographing apparatus includes a first receiving module, a first processing module, a second processing module, and a third processing module. The first receiving module is used for receiving laser signals through at least two laser receiving units of the laser receiver; the first processing module is used for calculating at least two image distances of at least two pixel units according to the laser signals; the second processing module is used for determining a target image distance according to at least two image distances; the third processing module is used for carrying out focusing processing based on the target image distance, shooting to obtain a target file, and the target file comprises at least one of the following: images, video.
In a fifth aspect, embodiments of the present application provide an electronic device comprising a processor, a memory and a program or instruction stored on the memory and executable on the processor, the program or instruction when executed by the processor implementing the steps of the method according to the third aspect.
In a sixth aspect, embodiments of the present application provide a readable storage medium having stored thereon a program or instructions which when executed by a processor implement the steps of the method according to the third aspect.
In a seventh aspect, embodiments of the present application provide a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and where the processor is configured to execute a program or instructions to implement a method according to the third aspect.
In the embodiment of the application, the laser receiver and the image sensor are integrated in the same lens to obtain the image distance corresponding to all pixel units in the view finding range of the image sensor, so that the accurate focusing of any position, any object or any area in a shooting preview interface can be realized.
Drawings
Fig. 1 is a schematic structural diagram of a lens module according to an embodiment of the present application;
Fig. 2 is a schematic diagram of a laser receiver and an image sensor receiving laser signals according to an embodiment of the present application;
Fig. 3 is a schematic diagram of the principle of lens imaging;
Fig. 4 is a first flowchart of a shooting method according to an embodiment of the present application;
Fig. 5 is a second flowchart of a shooting method according to an embodiment of the present application;
Fig. 6 is a first interface schematic diagram of a shooting method according to an embodiment of the present application;
Fig. 7 is a third flowchart of a shooting method according to an embodiment of the present application;
Fig. 8 is a second interface schematic diagram of a shooting method according to an embodiment of the present application;
Fig. 9 is a third interface schematic diagram of a shooting method according to an embodiment of the present application;
Fig. 10 is a schematic structural diagram of a photographing apparatus according to an embodiment of the present application;
Fig. 11 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 12 is a hardware schematic diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Technical solutions in the embodiments of the present application will be clearly described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application are within the scope of the protection of the present application.
The terms "first", "second", and the like in the description and claims are used to distinguish between similar objects and do not necessarily describe a particular sequence or chronological order. It is to be understood that the data so used may be interchanged where appropriate, so that embodiments of the present application may be implemented in sequences other than those illustrated or described herein. The objects identified by "first", "second", etc. are generally of one type, and the number of objects is not limited; for example, the first object may be one or more. Furthermore, in the description and claims, "and/or" means at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
In the related art, single-point laser focusing is widely applied in the focusing process of terminals with a shooting function by virtue of advantages such as fast focusing speed and good adaptability to dim-light, low-detail environments, thereby covering more application scenes and bringing a better and faster focusing experience.
However, single-point laser focusing can only focus on the central area of the preview screen, i.e., on a central position point. Objects in non-central areas of the shooting preview interface generally do not belong to the focusable region of interest, and when focusing on an object outside the center of the preview interface, the focusing effect is not ideal.
The lens module, the mobile terminal, the shooting method, the shooting device, the electronic equipment and the readable storage medium provided by the embodiment of the application are described in detail below by means of specific embodiments and application scenes thereof with reference to the accompanying drawings.
As shown in fig. 1 and 2, the lens module includes a laser emitter 110, a laser receiver 120, and an image sensor 130.
The lens module is provided with an independent laser emitter 110, and the laser emitter 110 is used for emitting laser signals. It will be appreciated that the laser emitter 110 may periodically emit a laser beam within a certain wavelength range.
As shown in fig. 2, the image sensor 130 divides an optical image on a light receiving surface thereof into a plurality of pixel units 131, converts the optical signal into a usable electrical signal, and uses the electrical signal for imaging.
It will be appreciated that the image sensor 130 includes at least two pixel units, and the laser receiver 120 includes at least two laser receiving units 121 in one-to-one correspondence with the pixel units. The at least two laser receiving units 121 of the laser receiver 120 are used to receive the laser signal and filter out non-visible light signals.
When single-point laser focusing is adopted in the related art, the lens module includes a lens for imaging as well as a single laser emitter and a laser receiver. The angle of view of the laser received by the laser receiver differs from the angle of view of the visible light received by the imaging lens; therefore, the laser received by the laser receiver does not completely cover the viewfinder range of the imaging lens, i.e., some positions within the viewfinder range of the lens cannot be focused by laser.
Unlike the lens module in the related art described above, the inventors integrate the laser receiver 120 in front of the image sensor 130, and the laser receiver 120 and the image sensor 130 are both integrated in the lens 140. By removing the filter coating on the lens, light in all wave bands can enter the lens 140. All incident light signals transmitted through the lens are converged by the focusing lens and then reach the laser receiver 120.
The incident light signals include at least a laser light signal, a visible light signal, and an invisible light signal emitted from the laser emitter 110 and reflected back by the object. The incident light signal is transmitted to the pixel unit 131 after passing through the laser receiving unit 121.
Each laser receiving unit 121 receives the laser signal for its one-to-one corresponding pixel unit 131, and the object distance corresponding to that pixel unit 131 can be determined from the flight time of the laser. The flight time of the laser refers to the time during which the laser propagates from emission to reception within one laser emission period. From the flight time, the object distance of the photographed object can be calculated, and the lens module is then focused according to the obtained object distance.
As shown in fig. 3, according to the Gaussian formula for object imaging:

1/f = 1/u + 1/v

the relation between the object distance and the image distance can be obtained, where f is the focal length of the lens, u is the distance between the photographed object and the optical center of the lens (the object distance), and v is the distance between the optimal imaging position and the optical center of the lens (the image distance).
It will be appreciated that when the lens 140 is fixed, the focal length f of the lens is a fixed value, and the object distance u varies inversely with the image distance v: the farther the photographed object is from the lens, the larger the object distance u and the smaller the image distance v; the closer the object is to the lens, the smaller the object distance u and the larger the image distance v.
When shooting, the lens module obtains the object distance from the laser flight time to determine the optimal image distance for imaging, thereby achieving accurate focusing and obtaining a clearer image.
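For illustration, the following Python sketch applies the Gaussian formula; the function name and the sample focal length and object distances are assumptions for the example, not values from this disclosure:

```python
def image_distance(f_mm: float, u_mm: float) -> float:
    """Solve the Gaussian imaging formula 1/f = 1/u + 1/v for the image distance v.

    f_mm: focal length of the lens (mm)
    u_mm: object distance (mm); must exceed f_mm for a real image
    """
    if u_mm <= f_mm:
        raise ValueError("object distance must exceed the focal length")
    return f_mm * u_mm / (u_mm - f_mm)

# The farther the object, the smaller the image distance:
print(image_distance(5.0, 500.0))  # ~5.05 mm
print(image_distance(5.0, 100.0))  # ~5.26 mm
```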
It will be appreciated that since the laser receiver 120 is integrated within the same lens 140 as the image sensor 130, the angle of view of the laser receiver 120 coincides with the angle of view of the image sensor 130. Each pixel unit 131 in the entire view-finding range of the image sensor 130 can focus by means of laser focusing, and the laser focusing area is not limited to a specific region of interest, so that accurate focusing on objects corresponding to all pixels in the view-finding range of the image sensor 130 is realized.
According to the lens module of the embodiment of the application, the laser receiver 120 and the image sensor 130 are integrated in the same lens 140, so the image distances corresponding to all pixel units 131 in the viewfinder range of the image sensor 130 are obtained, and accurate focusing on any position, any object, or any area in the shooting preview interface can be realized. Since each pixel can be measured to obtain a corresponding image distance, pixel-level focusing can be realized, improving the accuracy of laser focusing. The laser focusing of the embodiment of the application is also compatible with other focusing schemes, forming hybrid focusing schemes for different application scenes; it is therefore applicable to more focusing scenes and brings users a better, faster, and more accurate focusing experience.
The embodiment of the application also provides a mobile terminal, which comprises the lens module.
It is understood that mobile terminals include, but are not limited to, cell phones, tablet computers, cameras, wearable devices, and the like.
Taking a mobile phone as an example, the lens module can be arranged on the back of the mobile phone and used as a rear lens module, and the lens module can also be arranged on the front of the mobile phone and used as a front lens module.
When a user shoots by using the mobile phone, the mobile phone can meet focusing requirements of the user on any position of a shooting preview interface in a display screen of the mobile phone, so that clear imaging of different positions of the shooting preview interface is realized, and mobile phone shooting experience of the user is improved.
According to the mobile terminal provided by the embodiment of the application, the laser receiver and the image sensor are integrated in the same lens, so the image distances corresponding to all pixel units in the viewfinder range of the image sensor are obtained, and accurate focusing on any position, any object, or any area in the shooting preview interface can be realized. Since each pixel can be measured to obtain a corresponding image distance, pixel-level focusing can be realized, improving the accuracy of laser focusing. The laser focusing of the embodiment of the application is also compatible with other focusing schemes, forming hybrid focusing schemes for different application scenes; it is therefore applicable to more focusing scenes and brings users a better, faster, and more accurate focusing experience.
The embodiment of the application also provides a shooting method, which can be applied to the mobile terminal and can be specifically executed by hardware or software in the mobile terminal. The execution subject of the shooting method may be the mobile terminal, a control device of the mobile terminal, or the like.
The execution subject of the shooting method provided in the embodiment of the present application may also be an electronic device, or a functional module or functional entity in the electronic device capable of implementing the shooting method. The electronic device in the embodiment of the present application includes, but is not limited to, a mobile phone, a tablet computer, a camera, a wearable device, and the like. The shooting method provided in the embodiment of the present application is described below with the electronic device as the execution body.
As shown in fig. 4, the photographing method includes: step 410, step 420, step 430 and step 440.
Step 410, receiving a laser signal by at least two laser receiving units of a laser receiver.
When focusing is needed for shooting, the laser emitter emits laser light to the shot object. The electronic device obtains the flight time of the laser by receiving the laser reflected by the shot object.
It can be understood that while each pixel unit of the image sensor receives the visible light signal for imaging, the at least two laser receiving units can receive the laser signal; the flight time of the laser is determined according to the laser signal received by each laser receiving unit, and the object distance of the pixel unit corresponding to that laser receiving unit is determined accordingly.
Step 420, calculating at least two image distances of at least two pixel units according to the laser signals.
According to the Gaussian imaging formula, the image distance corresponding to each pixel unit can be determined from the object distance corresponding to that pixel unit. The image distances corresponding to all pixels in the viewfinder range of the image sensor form an image distance set.
It will be appreciated that the laser receiver is always in a laser detection state after the laser transmitter emits laser light. The laser emitted by the laser emitter can be 930nm-940nm in wavelength range, and the emission period of the laser can be 53.2ns.
In this embodiment, each pixel unit is located in a different row and column within the viewfinder range of the image sensor, and each pixel unit has a corresponding row and column number. Within one laser emission period, once the laser receiver detects laser with a wavelength of 930nm-940nm at the pixel position in the ith row and jth column, the time at that moment is recorded, and the time interval Δt_(i,j) between the laser emission time and that moment is calculated; Δt_(i,j) is the time of flight of the laser at that pixel position.
After one emission period ends, the object distance corresponding to each pixel unit is calculated from its time interval Δt_(i,j) as u_(i,j) = c · Δt_(i,j), where c is the speed of light. In this way the panoramic depth of all pixels in the viewfinder range is obtained, giving the set {u_(i,j)} of object distances of all pixel units. According to the Gaussian formula, the image distance corresponding to the pixel in the ith row and jth column is calculated as v_(i,j) = f · u_(i,j) / (u_(i,j) − f), giving the set {v_(i,j)} of image distances of all pixel units, where f is the focal length of the focusing lens.
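A minimal sketch of this per-pixel calculation is given below, assuming NumPy arrays for the per-pixel time intervals; the array shapes, names, and sample values are illustrative only, and the formulas follow the description above:

```python
import numpy as np

C = 299_792_458.0  # speed of light c, in m/s

def image_distance_map(dt: np.ndarray, f: float) -> np.ndarray:
    """Convert per-pixel laser times of flight into per-pixel image distances.

    dt: time intervals Δt(i,j) in seconds, one per pixel unit
    f:  focal length of the focusing lens, in meters
    """
    # Object distance set {u(i,j)} = c * Δt(i,j), as stated in the description
    # (practical ranging systems usually halve the round-trip time; that
    # factor is omitted here to match the text).
    u = C * dt
    # Image distance set {v(i,j)} = f * u / (u - f), from the Gaussian formula.
    return f * u / (u - f)

# Illustrative 2x2 "viewfinder" of times of flight, in seconds
dt = np.array([[3.3e-9, 6.7e-9],
               [3.3e-9, 13.3e-9]])
v = image_distance_map(dt, f=0.005)  # 5 mm focal length
```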
Step 430, determining a target image distance according to at least two image distances.
When the electronic device focuses, the image distance corresponding to each pixel unit within the viewfinder range of the image sensor is determined from the flight time of the laser received by the laser receiver, and there are at least two pixel units. Since the image distances corresponding to different pixel units differ, the target image distance for focusing needs to be determined.
The manner of determining the focusing image distance is described below from two different implementation perspectives.
1. The focusing image distance is determined by receiving a user input.
Referring to fig. 5, the photographing method may further include steps 510 and 520 before at least two laser receiving units of the laser receiver receive the laser signals.
Step 510, receiving a first input of a user to a shooting preview interface.
It can be appreciated that, with users' increasing demands on the imaging quality of electronic devices and their preference for personalized photographing experiences, more and more users are accustomed to actively focusing on a target object within the viewfinder range.
The electronic device is provided with a display screen, and the shooting preview interface on the display screen displays the viewfinder range of the image sensor, so that the user can preview the imaging picture in real time.
The target object that the user wants to focus on is displayed in a target area of the shooting preview interface, and the electronic device focuses on the target object in the target area according to the first input of the user.
Wherein the first input may be represented as at least one of:
first, the first input may be represented as a touch operation including, but not limited to, a click operation, a press operation, and the like.
In this embodiment, receiving the first input of the user may be performed by receiving a touch operation of the user in a display area of the display screen of the electronic device.
For example, when a user shoots a person by using a mobile phone, the mobile phone receives an input that the user clicks the face position in the shooting preview interface of the screen of the mobile phone, and the mobile phone can automatically focus the person position in the shooting preview interface.
Second, the first input may appear as a voice input.
In this embodiment, the target voice may trigger the electronic device to focus on the target object.
For example, when a user shoots a person with the mobile phone, upon receiving a voice command such as the assistant wake phrase "Xiao V, Xiao V" followed by a shooting instruction, the mobile phone can automatically focus on the position of the person in the shooting preview interface.
Third, the first input may appear as a physical key input.
In this embodiment, a physical key corresponding to determining the focusing image distance is provided on the body of the electronic device. The physical key can be a knob or a physical slide bar; different rotation angles of the knob or different sliding distances of the slide bar correspond to different object distances or image distances within the viewfinder range of the image sensor.
For example, receiving the input of the user may be represented as receiving the user's rotation of a focusing knob on the camera lens, where the current position of the knob corresponds to a specific object distance or image distance, so that the focusing image distance is determined conveniently.
Of course, in other embodiments, the first input may also take other forms, including but not limited to character input, etc., which may be specifically determined according to actual needs, which is not limited in this embodiment of the present application.
Step 520, responsive to the first input, determines a target area.
After receiving the first input of the user, the electronic device can respond to the first input to focus on a target object at a target position in the shooting preview interface.
Before focusing, the target area where a plurality of target pixel units corresponding to the target object are located is determined. The target area can be determined in at least one of the following ways:
First, the target area is an area surrounding the user's touch position and containing a target number of pixel units.
In this embodiment, when the first input received by the electronic device is a touch operation, the electronic device circles the touch position within the viewfinder range by drawing a circle centered on the user's touch point on the display screen with a target length as the radius; this circular area is the target area.
The circular area can cover a certain area, and the focusing requirement of a user is met.
Second, the area within the outline of the target object is the target area.
The target object is the object that the user desires to focus on. The target object can be determined through the user's touch operation or through the user's voice input; determination is not limited to these two methods, and the target object can also be determined automatically by the electronic device.
In this embodiment, when the electronic device receives the first input from the user, the electronic device automatically identifies the outline of the target object by using the artificial intelligence recognition algorithm, and the area within the outline is the target area.
For example, a user shoots a person using a mobile phone, and when focusing, the mobile phone receives a click of the user on a face position in a shooting preview interface, and the mobile phone performs image recognition based on the click position and recognizes the outline of the person. In this case, the region within the character outline is the target region.
Automatically identifying the target object can reduce the user's operation difficulty, reduce focusing position errors caused by mistaken clicks to a certain extent, and further improve the user's operation experience.
Of course, in other embodiments, the target area may be confirmed in other manners, including but not limited to, taking the smallest rectangular area covering all pixels of the target object as the target area, which may be specifically determined according to actual needs, which is not limited in the embodiments of the present application.
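As a rough sketch of the first manner above (a circular target area around the touch point), the following helper builds a boolean mask of the pixel units inside the circle; the function name, resolution, and radius below are illustrative assumptions:

```python
import numpy as np

def circular_target_area(touch_row: int, touch_col: int,
                         rows: int, cols: int, radius: float) -> np.ndarray:
    """Boolean mask of the pixel units inside the circular target area A
    centered on the user's touch point."""
    i, j = np.ogrid[:rows, :cols]
    return (i - touch_row) ** 2 + (j - touch_col) ** 2 <= radius ** 2

# e.g. a radius of 1/8 of the narrow side of the preview, as in the later example
mask = circular_target_area(touch_row=300, touch_col=500,
                            rows=1080, cols=1920, radius=1080 / 8)
```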
After confirming the target area, the electronic device can focus on the target area. In the focusing process, the corresponding image distances are confirmed according to the object distances of the pixel units in the target area, and the target image distance, i.e., the actual focusing image distance, is then obtained from these image distances.
It will be appreciated that after determining the focus area based on the first input from the user, the electronic device determines the target image distance for focusing.
The electronic device determines the target image distance in order to control the motor to drive the focusing lens to move to the target image distance position for focusing. The electronic device may determine the target image distance for the target area in at least one of the following ways:
firstly, taking the average value of the image distance values corresponding to all pixel units in the target area as the target image distance.
In this embodiment, the target area includes at least two pixel units, and an average value of the image distances of the at least two pixel units corresponding to the target area is calculated to obtain the target image distance.
Taking the average value of the image distances of the pixel units as the focusing image distance can better balance the focusing requirements of all pixel units in the current target area, meet the user's focusing requirement on the target object, and make the overall focusing effect on the target object better.
And secondly, selecting the maximum value in the image distance values corresponding to all pixel units in the target area as the target image distance.
In this embodiment, the electronic device focuses on the position closest to the lens in the target area, so that objects closer to the lens are clearer in the final image.
Thirdly, selecting the mode value in the image distance values corresponding to all the pixel units in the target area as the target image distance.
In this embodiment, taking the mode of the image distance of the pixel unit in the target area as the target image distance can take into consideration the focusing requirement of more pixels at the target object, and can highlight the more obvious features of the target object in final imaging, so that the focusing effect of the target object as a whole is better.
Of course, in other embodiments, the target image distance of the target area may be determined by other manners, including but not limited to using the median of the image distances of the plurality of pixel units in the target area as the target image distance of the target object, which is not limited in the embodiments of the present application.
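A sketch of these target-area strategies, assuming an image-distance array v and a boolean target-area mask like the one above (the helper name and the mode handling via np.unique are assumptions; real-valued image distances would normally be binned before taking a mode):

```python
import numpy as np

def target_image_distance(v: np.ndarray, mask: np.ndarray,
                          strategy: str = "mean") -> float:
    """Pick the target image distance from the image distances of the
    pixel units inside the target area (boolean mask)."""
    region = v[mask]
    if strategy == "mean":      # balance all pixel units in the target area
        return float(region.mean())
    if strategy == "max":       # focus the point closest to the lens
        return float(region.max())
    if strategy == "mode":      # most common image distance in the area
        values, counts = np.unique(region, return_counts=True)
        return float(values[counts.argmax()])
    if strategy == "median":
        return float(np.median(region))
    raise ValueError(f"unknown strategy: {strategy}")
```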
2. The focusing image distance is determined automatically.
When focusing is needed, the electronic equipment can automatically determine the target image distance. The determination of the target image distance may be performed automatically in at least one of the following ways:
firstly, selecting the maximum value in the image distance set corresponding to all pixel units as the target image distance.
Determining the target image distance according to the at least two image distances comprises obtaining the maximum image distance in the at least two image distances to obtain the target image distance. Namely, the electronic equipment selects the maximum value in the image distance set corresponding to all pixel units in the view finding range of the image sensor as the target image distance. And according to an imaging Gaussian formula, the object distance value corresponding to the maximum image distance value is minimum.
In this focusing mode, the electronic device focuses on the position closest to the lens within the viewfinder range of the image sensor, meeting the user's close-range or macro shooting requirements, ensuring that the object closest to the lens is shot clearly, and enriching the user's experience.
And secondly, selecting the minimum value in the image distance set corresponding to all pixel units as a target image distance.
Determining the target image distance according to the at least two image distances comprises obtaining the minimum image distance in the at least two image distances to obtain the target image distance. The electronic equipment selects the minimum value in the image distance set corresponding to all pixel units in the view finding range of the image sensor as the focusing image distance. And according to an imaging Gaussian formula, an object distance value corresponding to the minimum image distance value is the largest. Under the focusing mode, the electronic equipment can focus the position farthest from the lens in the view-finding range of the image sensor, so that the final imaging of an object farther from the lens can be clearer.
Thirdly, selecting a median value in the image distance set corresponding to all the pixel units as a target image distance.
The electronic equipment selects the median of the image distances corresponding to all pixel units in the view finding range of the image sensor as the focusing image distance. Under the focusing mode, the electronic equipment can focus the position of the middle value of the depth of view distance in the view-finding range of the image sensor, the focusing position is between the two conditions of adopting the maximum value and the minimum value of the image distance, objects at different positions in the view-finding range can be considered, and the imaging effect of objects with different distances from the lens can be ensured.
Fourth, the mode value in the image distance set corresponding to all the pixel units is selected as the target image distance.
The electronic device selects the mode in the image distance corresponding to all pixels in the view finding range of the image sensor as the target image distance. Under the focusing mode, the electronic equipment takes the modes in all image distances as focusing image distances, so that the optimal focusing effect of as many pixels as possible in the view-finding range can be ensured, the focusing effect in a larger range in the view-finding range can be considered, and the imaging quality is improved.
Of course, in other embodiments, the electronic device may also automatically determine the target image distance in other manners, including but not limited to selecting an average value of all target image distances as the focusing image distance, which may be specifically determined according to actual needs, which is not limited in the embodiments of the present application.
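The automatic modes apply the same statistics to all pixel units in the viewfinder range; a standalone sketch follows (names are assumptions, and, as above, a real implementation would bin image distances before taking a mode):

```python
import numpy as np

def auto_target_image_distance(v: np.ndarray, strategy: str = "mode") -> float:
    """Automatically pick the target image distance from the image-distance
    set of all pixel units in the viewfinder range."""
    flat = v.ravel()
    if strategy == "max":      # nearest object to the lens in focus
        return float(flat.max())
    if strategy == "min":      # farthest object in focus
        return float(flat.min())
    if strategy == "median":   # compromise between near and far objects
        return float(np.median(flat))
    if strategy == "mode":     # sharp focus for as many pixels as possible
        values, counts = np.unique(flat, return_counts=True)
        return float(values[counts.argmax()])
    raise ValueError(f"unknown strategy: {strategy}")
```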
The shooting method is further described below by taking a specific example.
In the scene of shooting a flower stand with a mobile phone, the user opens the shooting application of the mobile phone; when the user aims at the flower stand and holds the mobile phone steady briefly, the mobile phone judges that the user needs to focus.
The mobile phone controls the laser emitter to emit laser with a wavelength of 930nm-940nm at a period of 53.2ns. The laser receiver is always in a laser detection state after the laser emitter emits the laser; within one laser emission period, once laser with a wavelength of 930nm-940nm is detected at the pixel unit position in the ith row and jth column of the image sensor, the time at that moment is recorded, and the time interval between the laser emission time and the detection time is calculated.
After one emission period is finished, calculating a set of corresponding object distances and image distances according to the time interval corresponding to each pixel unit.
In order to simplify the operation of the user and reduce the threshold of photographing by the user, the mobile phone can automatically determine the target image distance based on the image distances corresponding to all the current pixel units. In this embodiment, the mobile phone uses the mode among all the image distances as the focus image distance.
As shown in fig. 6, the shooting preview interface contains a plurality of flowers 610 at varying distances from the lens, some nearer and some farther, and a shooting control 620 for controlling shooting; the pixel units corresponding to the petals of the flowers 610 occupy most of the pixel unit positions within the viewfinder range of the image sensor. The petals of some of the flowers 610 are close to the lens, so during auto-focusing, the object distance corresponding to the mode of all image distances is the most common distance between the petals of the flowers 610 and the lens in the figure.
In this embodiment, the motor in the lens module drives the focusing lens to move to the position corresponding to the target image distance, and at this time, the plurality of flowers 610 with similar distances from the lens in the shooting preview interface on the mobile phone screen are clearly displayed, in this case, the number of the clear flowers 610 in the shooting preview interface can be maximized. The user can determine whether to perform the next shooting operation according to the imaging effect in the shooting preview interface, and at this time, the number of clear flowers 610 in the shot image is the largest.
Step 440, focusing based on the target image distance, and shooting to obtain a target file, wherein the target file comprises at least one of the following: images, video.
The position of the focusing lens is determined according to the target image distance of the pixel unit, and the motor drives the focusing lens to move to the position corresponding to the optimal focusing image distance.
In the actual execution process, according to the far-focus image distance v_inf and its corresponding motor instruction code_inf, and the near-focus image distance v_micr and its corresponding motor instruction code_micr, recorded in the electronic device for the lens, the optimal focusing image distance v_f is converted by linear interpolation into the required motor instruction code_f, which controls the motor to move the focusing lens to the position of the optimal focusing image distance v_f. The motor instruction code_f satisfies the following formula:

code_f = code_inf + (v_f − v_inf) / (v_micr − v_inf) × (code_micr − code_inf)
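A sketch of this interpolation; the function mirrors the formula above, and the calibration values in the usage lines are illustrative assumptions:

```python
def focus_motor_code(v_f: float, v_inf: float, code_inf: int,
                     v_micr: float, code_micr: int) -> int:
    """Linearly interpolate between the recorded far-focus (v_inf, code_inf)
    and near-focus (v_micr, code_micr) calibration points to obtain the
    motor instruction for the optimal focusing image distance v_f."""
    ratio = (v_f - v_inf) / (v_micr - v_inf)
    return round(code_inf + ratio * (code_micr - code_inf))

# Illustrative calibration: far focus at v = 5.00 mm -> code 0,
# near focus at v = 5.40 mm -> code 1023
code_f = focus_motor_code(v_f=5.26, v_inf=5.00, code_inf=0,
                          v_micr=5.40, code_micr=1023)  # ≈ 665
```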
at this time, the electronic device focuses according to the optimal focusing image distance and shoots. Shooting to obtain a target file, wherein the target file comprises at least one of the following: images, video.
It can be understood that the laser receiver and the image sensor are integrated in the same lens module. In the focusing process, the angles of view of the light received by the laser receiver and by the image sensor are consistent; the lens can receive the laser reflected by all objects within the viewfinder range of the image sensor, the laser receiver receives the reflected laser per pixel unit within that range, and the electronic device can thus obtain the object distance and image distance corresponding to each pixel within the viewfinder range of the image sensor.
When focusing is carried out, the electronic equipment can determine the focusing image distance according to the image distance of any pixel according to the requirement of a user, and can realize accurate focusing of any position or any object in the view-finding range of the image sensor.
According to the shooting method provided by the embodiment of the application, the image distances corresponding to all pixel units within the viewfinder range of the image sensor are obtained through the laser receiver, so accurate focusing on any position and any object in the shooting preview interface can be realized, laser focusing can be applied to more focusing scenes, and the user's focusing experience is improved.
In some embodiments, as shown in fig. 7, step 440 (focusing based on the target image distance and shooting to obtain a target file, where the target file includes at least one of images and video) may include step 441, step 442, and step 443.
Step 441, focusing is performed based on the target image distance.
It can be understood that the motor drives the focusing lens to move to the target image distance, and a preview image focused according to the target image distance in the current view-finding range of the image sensor can be displayed in the shooting preview interface.
Step 442, receiving a second input from the user to the shooting preview interface.
When the user is satisfied with the current focusing effect, the second input of the user is used for controlling the electronic equipment to shoot according to the current focusing effect.
Wherein the second input may be represented as at least one of:
first, the second input may be represented as a touch operation including, but not limited to, a click operation, a press operation, a slide operation, and the like.
In this embodiment, receiving the second input of the user may be performed by receiving a touch operation of the user in a display area of the display screen of the electronic device.
In order to reduce the user's misoperation rate, the action area of the second input may be limited to a specific area, such as the lower middle area of the shooting preview interface; alternatively, in the state of displaying the shooting preview interface, a target control may be displayed on the current interface, and touching the target control realizes the second input.
For example, when the user shoots with the mobile phone, receiving the second input may be represented as receiving a touch operation in which the user clicks the shooting control while the shooting preview interface is displayed on the mobile phone display screen.
Second, the second input may appear as a physical key input.
In this embodiment, the body of the electronic device is provided with a physical key corresponding to shooting, and receiving the second input of the user may be expressed as receiving the user pressing the corresponding physical key; the second input may also be a combined operation of pressing multiple physical keys simultaneously.
For example, when the user shoots with the mobile phone, receiving the second input may be represented as receiving a pressing operation on the volume key while the shooting preview interface is displayed on the mobile phone display screen.
Third, the second input may appear as a voice input.
In this embodiment, receiving the second input from the user may be represented as receiving a voice command from the user.
For example, when a user shoots with the mobile phone, the mobile phone is triggered to shoot upon receiving a voice command such as "eggplant" (the Chinese counterpart of "cheese") while the shooting preview interface is displayed on the display screen of the mobile phone.
Of course, in other embodiments, the second input may take other forms, including but not limited to character input, etc., which may be specifically determined according to actual needs, which is not limited in this embodiment of the present application.
And step 443, responding to the second input, and shooting to obtain a target file.
The electronic device is capable of responding to the second input of the user and shooting. The electronic device shoots to obtain a target file, where the target file includes at least one of an image or a video. The target file obtained by shooting can be stored in a storage unit of the electronic device.
According to the shooting method provided by the embodiment of the application, the second input of the user is received for shooting under the condition of focusing according to the target image distance, so that the image or video which accords with the expected focusing effect of the user can be obtained, and the shooting experience of the user is improved.
In some shooting scenes, when the user is not satisfied with the current auto-focusing effect, the electronic device can switch the focusing mode to re-focus by receiving a third input of the user on the target position or the target object in the shooting preview interface.
The third input may be represented as a touch input, and the electronic device refocuses by receiving a click operation of the user on the target object or the target position in the photographing preview interface.
For example, when a user takes a photo of a person with a mobile phone, referring to fig. 8 and 9, in a shooting preview interface of a display screen of the mobile phone, there are a person 810, a background building 820 behind the person, and a street 830 in front of the person. The bottom center of the shooting preview interface of the mobile phone display screen is also displayed with a shooting control 840 for shooting control.
When the mobile phone detects that the photographing application is opened, the laser emitter starts to emit laser at a certain period. After the laser receiver receives the reflected laser in the current period, the mobile phone calculates the set of object distances and image distances corresponding to each pixel unit in the shooting preview interface according to the laser flight time interval of each pixel unit, and selects the maximum image distance as the target image distance.
The motor in the mobile phone lens module drives the lens to move to the position corresponding to the focusing image distance, and the image focused according to this image distance is displayed in the shooting preview interface of the display screen. The street 830 in the image is nearest to the lens, so the preview of the street 830 is the clearest, while the face of the photographed person 810 is blurred and its details are not clear enough.
In order to improve the shooting quality of the face of the person 810, as shown in fig. 8, after receiving the third input of the user clicking the display screen, the mobile phone detects that the contact position is on the background building 820 close to the face of the person 810, i.e., the black dot position in the figure. The mobile phone takes a circular area centered on the contact position as the target area A; the set of pixel units in the target area A is {(i, j) | (i, j) ∈ A}, and the radius of the target area A can be set to 1/8 of the length of the narrow side of the shooting preview interface of the display screen.
The laser receiver receives the reflected laser of the first period after the user clicks the display screen, and the mobile phone calculates the corresponding object distance and image distance set {v_(i,j) | (i, j) ∈ A} according to the laser flight time interval corresponding to each pixel of the target area in the shooting preview interface, and selects the average value of the image distances of the pixel units in the target area, v_f = avg({v_(i,j) | (i, j) ∈ A}), as the target image distance.
In the above case, since the actual click position of the user on the shooting preview interface deviates somewhat from the user's desired focusing position, after focusing according to the average image distance, the actual focusing position is not at the face of the person 810 but at some position between the face of the person 810 and the background building 820, and the actual focusing effect is not good.
As shown in fig. 9, the mobile phone receives the third input of clicking the display screen by the user again, and the mobile phone detects that the contact point position of clicking the display screen by the user is located at the center of the face of the person 810, that is, the black point position in the figure. The mobile phone can take a circular area taking the contact point position as the center as a target area A.
The laser receiver receives the reflected laser of the first period after the user clicks the display screen again; the mobile phone calculates the set of object distances and image distances according to the laser flight time interval corresponding to each pixel in the target area A in the shooting preview interface, and selects the average value of the image distances of the pixels in the target area as the focusing image distance for focusing.
In the above case, the face of the person 810 can be accurately focused, and after receiving a photographing instruction that the user clicks the photographing control 840, the mobile phone starts imaging by the image sensor and obtains an image file meeting the focusing requirement of the user.
In the actual shooting process, the operation difficulty of a user can be reduced through automatic focusing, the user can conveniently shoot, the user can also focus by himself, personalized focusing requirements of the user are met, and the satisfaction degree of shooting of the user is improved.
It should be noted that, in the photographing method provided in the embodiment of the present application, the execution subject may be a photographing device, or a control module in the photographing device for executing the photographing method. In the embodiment of the present application, a method for performing shooting by a shooting device is taken as an example, and the shooting device provided in the embodiment of the present application is described.
The embodiment of the application also provides a shooting device.
As shown in fig. 10, the photographing apparatus includes: a first receiving module 1010, a first processing module 1020, a second processing module 1030, and a third processing module 1040.
The first receiving module 1010 is configured to receive laser signals through at least two laser receiving units of the laser receiver; the first processing module 1020 is configured to calculate at least two image distances of at least two pixel units according to the laser signal; the second processing module 1030 is configured to determine a target image distance according to at least two image distances; the third processing module 1040 is configured to perform focusing processing based on the target image distance, and capture a target file, where the target file includes at least one of the following: images, video.
According to the shooting device provided by the embodiment of the application, the laser receiver and the image sensor are integrated in the same lens, so the image distances corresponding to all pixel units within the viewfinder range of the image sensor are obtained, and accurate focusing on any position, any object, or any area in the shooting preview interface can be realized. Since each pixel can be measured to obtain a corresponding image distance, pixel-level focusing can be realized, improving the accuracy of laser focusing. The laser focusing of the embodiment of the application is also compatible with other focusing schemes, forming hybrid focusing schemes for different application scenes; it is therefore applicable to more focusing scenes and brings users a better, faster, and more accurate focusing experience.
In some embodiments, the photographing device further includes a second receiving module and a fourth processing module.
The second receiving module is used for receiving a first input of a user to the shooting preview interface; the fourth processing module is used for responding to the first input and determining a target area; the second processing module 1030 is further configured to calculate an average value of image distances of at least two pixel units corresponding to the target area, so as to obtain a target image distance.
According to the shooting device provided by the embodiment of the application, taking the average value of the image distances of the pixel units as the focusing image distance can better balance the focusing requirements of all pixel units in the current target area, meet the user's focusing requirement on the target object, and make the overall focusing effect on the target object better.
In some embodiments, the second processing module 1030 is further configured to obtain a maximum image distance of the at least two image distances, to obtain the target image distance.
According to the shooting device provided by the embodiment of the application, the electronic device can focus on the position closest to the lens within the viewfinder range of the image sensor, meeting the user's close-range or macro shooting requirements, ensuring that the object closest to the lens is shot clearly, and enriching the user's experience.
In some embodiments, the third processing module 1040 is further configured to perform focusing processing based on the target image distance; receiving a second input of a user to a shooting preview interface; in response to the second input, a target file is photographed.
According to the shooting device provided by the embodiment of the application, the second input of the user is received for shooting under the condition of focusing according to the target image distance, so that the image or video which accords with the expected focusing effect of the user can be obtained, and the shooting experience of the user is improved.
The photographing device in the embodiment of the application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a cell phone, tablet computer, notebook computer, palm computer, vehicle-mounted electronic device, wearable device, ultra-mobile personal computer (ultra-mobile personal computer, UMPC), netbook or personal digital assistant (personal digital assistant, PDA), etc., and the non-mobile electronic device may be a server, network attached storage (Network Attached Storage, NAS), personal computer (personal computer, PC), television (TV), teller machine or self-service machine, etc., and the embodiments of the present application are not limited in particular.
The photographing device in the embodiment of the application may be a device having an operating system. The operating system may be an Android operating system, an IOS operating system, or other possible operating systems, which is not specifically limited in the embodiments of the present application.
The photographing device provided in the embodiment of the present application can implement each process implemented by the embodiments of the methods of fig. 1 to 9, and in order to avoid repetition, a detailed description is omitted here.
Optionally, as shown in fig. 11, the embodiment of the present application further provides an electronic device 1100, including a processor 1101, a memory 1102, and a program or instruction stored in the memory 1102 and capable of running on the processor 1101. When executed by the processor 1101, the program or instruction implements each process of the above shooting method embodiment and can achieve the same technical effect; to avoid repetition, details are not repeated here.
The electronic device in the embodiment of the application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 12 is a schematic hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 1200 includes, but is not limited to: a radio frequency unit 1201, a network module 1202, an audio output unit 1203, an input unit 1204, a sensor 1205, a display unit 1206, a user input unit 1207, an interface unit 1208, a memory 1209, and a processor 1210.
Those skilled in the art will appreciate that the electronic device 1200 may also include a power source (e.g., a battery) for powering the various components; the power source may be logically connected to the processor 1210 through a power management system, which manages functions such as charging, discharging, and power consumption. The electronic device structure shown in Fig. 12 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than illustrated, combine certain components, or arrange the components differently, which is not described in detail here.
Wherein the processor 1210 is configured to receive laser signals through at least two laser receiving units of the laser receiver;
the processor 1210 is further configured to calculate at least two image distances of at least two pixel units according to the laser signal;
the processor 1210 is further configured to determine a target image distance from the at least two image distances;
the processor 1210 is further configured to perform focusing processing based on the target image distance, and to shoot to obtain a target file, where the target file includes at least one of the following: an image or a video.
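As a minimal illustrative sketch of this processing chain, assume time-of-flight ranging per laser receiving unit and a thin-lens model; the function names, the units, and the select parameter below are assumptions for illustration, not the patented implementation:

    SPEED_OF_LIGHT_MM_PER_NS = 299.792  # c, in millimetres per nanosecond

    def image_distance_from_tof(tof_ns, focal_length_mm):
        """Convert one pixel unit's laser round-trip time into an image distance:
        time of flight gives the object distance u, and the thin-lens equation
        1/f = 1/u + 1/v gives the image distance v = f*u / (u - f), valid for u > f."""
        u = SPEED_OF_LIGHT_MM_PER_NS * tof_ns / 2.0  # halve the round trip
        return (focal_length_mm * u) / (u - focal_length_mm)

    def compute_target_image_distance(tof_map_ns, focal_length_mm, select):
        """tof_map_ns maps each pixel unit (row, col) to its measured time of
        flight; select reduces the per-pixel image distances to one target."""
        image_distances = {
            pixel: image_distance_from_tof(t, focal_length_mm)
            for pixel, t in tof_map_ns.items()
        }
        return select(image_distances)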
According to the electronic equipment provided by the embodiment of the application, the laser receiver and the image sensor are integrated in the same lens, so that an image distance is obtained for every pixel unit within the framing range of the image sensor, and any position, object, or area in the shooting preview interface can be focused accurately. Because each pixel can measure its own image distance, pixel-level focusing can be realized and the accuracy of laser focusing is improved. The laser focusing of the embodiment of the application is also compatible with other focusing schemes, forming a hybrid focusing scheme for different application scenes, so that the application suits more focusing scenes and brings users a better, faster, and more accurate focusing experience.
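One way such a hybrid scheme could be arranged, sketched under the assumption that a laser result can be judged unusable (the laser_af and contrast_af callables are hypothetical stand-ins, since the embodiment does not fix a fallback order):

    def hybrid_focus(laser_af, contrast_af, scene):
        # Prefer pixel-level laser focusing; fall back to a slower
        # contrast-based sweep when the laser return is unusable,
        # e.g. the subject is out of range or reflects too little signal.
        image_distance = laser_af(scene)
        return image_distance if image_distance is not None else contrast_af(scene)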
Optionally, the user input unit 1207 is configured to receive a first input of the user to the shooting preview interface;
the processor 1210 is further configured to determine a target area in response to the first input;
the processor 1210 is further configured to calculate an average value of the image distances of the at least two pixel units corresponding to the target area, so as to obtain the target image distance.
According to the electronic equipment provided by the embodiment of the application, taking the average of the pixel units' image distances as the focusing image distance balances the focusing requirements of all pixel units in the current target area, meeting the user's focusing requirement for a target object and producing a better overall focusing effect on the target object.
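A minimal sketch of this averaging strategy, assuming the per-pixel image distances computed above and a target area given as an iterable of pixel coordinates derived from the first input:

    def target_image_distance_average(image_distances, target_area):
        # Average only the pixel units that fall inside the user-selected area;
        # assumes the area overlaps at least one measured pixel.
        selected = [image_distances[p] for p in target_area if p in image_distances]
        return sum(selected) / len(selected)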
Optionally, the processor 1210 is further configured to take the maximum of the at least two image distances as the target image distance.
According to the electronic equipment provided by the embodiment of the application, the electronic equipment can focus on the position closest to the lens within the framing range of the image sensor, meeting the user's close-range or macro shooting requirements, ensuring that the object closest to the lens is shot clearly, and enriching the user experience.
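The nearest-object strategy is equally small; the thin-lens reasoning in the comment explains why taking the maximum works, and the function is again an illustrative assumption rather than the patented implementation:

    def target_image_distance_nearest(image_distances):
        # By the thin-lens equation 1/f = 1/u + 1/v, the image distance v grows
        # as the object distance u shrinks toward f, so the maximum image
        # distance belongs to the object closest to the lens.
        return max(image_distances.values())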
Optionally, the processor 1210 is further configured to perform focusing processing based on the target image distance;
the user input unit 1207 is further configured to receive a second input of the user to the shooting preview interface;
the processor 1210 is further configured to capture a target file in response to the second input.
According to the electronic equipment provided by the embodiment of the application, shooting is triggered by the user's second input only after focusing has been performed according to the target image distance, so the resulting image or video matches the focusing effect the user expects, improving the user's shooting experience.
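The focus-then-capture ordering can be sketched as follows, where lens and second_input are hypothetical stand-ins for the focusing motor driver and the user-input event:

    def capture_on_second_input(lens, target_image_distance, second_input):
        lens.move_to(target_image_distance)   # focusing processing comes first
        if second_input.received:             # shoot only on the user's second input
            return lens.capture()             # target file: an image or a video
        return None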
It should be understood that, in the embodiment of the present application, the input unit 1204 may include a graphics processing unit (GPU) 12041 and a microphone 12042; the graphics processor 12041 processes image data of still pictures or videos obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 1206 may include a display panel 12061, which may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 1207 includes a touch panel 12071, also called a touch screen, and other input devices 12072. The touch panel 12071 may include two parts: a touch detection device and a touch controller. The other input devices 12072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described in detail here. The memory 1209 may be used to store software programs as well as various data, including but not limited to application programs and an operating system. The processor 1210 may integrate an application processor, which mainly handles the operating system, user interfaces, applications, and the like, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may alternatively not be integrated into the processor 1210.
The embodiment of the present application further provides a readable storage medium storing a program or instruction. When executed by a processor, the program or instruction implements each process of the above shooting method embodiments and achieves the same technical effects; to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer-readable storage medium, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The embodiment of the application further provides a chip, the chip including a processor and a communication interface coupled to the processor. The processor is configured to run a program or instruction to implement each process of the above shooting method embodiments and achieve the same technical effects; to avoid repetition, details are not repeated here.
It should be understood that the chip referred to in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises that element. Furthermore, it should be noted that the scope of the methods and apparatus in the embodiments of the present application is not limited to performing the functions in the order shown or discussed; depending on the functions involved, the functions may also be performed in a substantially simultaneous manner or in the reverse order. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general-purpose hardware platform, or by hardware alone, though in many cases the former is the preferred implementation. Based on such understanding, the technical solutions of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a computer software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk), including several instructions for causing a terminal (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the methods described in the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive. In light of the present application, those of ordinary skill in the art may devise many other forms without departing from the spirit of the present application and the scope of the claims, all of which fall within the protection of the present application.

Claims (8)

1. A shooting method, applied to a mobile terminal, wherein the mobile terminal comprises a lens module, and the lens module comprises:
a laser transmitter for transmitting a laser signal;
the laser receiver comprises at least two laser receiving units, wherein the at least two laser receiving units are used for receiving laser signals and filtering non-visible light signals;
the image sensor comprises at least two pixel units, wherein the at least two pixel units are arranged in one-to-one correspondence with the at least two laser receiving units;
the incident light signal is transmitted to the pixel unit after passing through the laser receiving unit;
the method comprises the following steps:
receiving laser signals by at least two laser receiving units of the laser receiver;
calculating at least two image distances of at least two pixel units according to the laser signals;
determining a target image distance according to at least two image distances;
performing focusing processing based on the target image distance, and shooting to obtain a target file, wherein the target file comprises at least one of the following: an image or a video.
2. The shooting method of claim 1, wherein before the receiving of the laser signals by the at least two laser receiving units of the laser receiver, the method further comprises:
receiving a first input of a user to a shooting preview interface;
determining a target area in response to the first input;
the determining of the target image distance according to the at least two image distances comprises:
and calculating the average value of the image distances of at least two pixel units corresponding to the target area to obtain the target image distance.
3. The shooting method of claim 1, wherein the determining of the target image distance according to the at least two image distances comprises:
obtaining the maximum image distance of the at least two image distances to obtain the target image distance.
4. The shooting method of claim 1, wherein the performing of focusing processing based on the target image distance and shooting to obtain a target file comprises:
focusing based on the target image distance;
receiving a second input of a user to a shooting preview interface;
and, in response to the second input, shooting to obtain the target file.
5. A shooting device, comprising a lens module, wherein the lens module comprises:
a laser transmitter for transmitting a laser signal;
the laser receiver comprises at least two laser receiving units, wherein the at least two laser receiving units are used for receiving laser signals and filtering non-visible light signals;
the image sensor comprises at least two pixel units, wherein the at least two pixel units are arranged in one-to-one correspondence with the at least two laser receiving units;
the incident light signal is transmitted to the pixel unit after passing through the laser receiving unit;
the apparatus further comprises:
the first receiving module is used for receiving laser signals through at least two laser receiving units of the laser receiver;
the first processing module is used for calculating at least two image distances of at least two pixel units according to the laser signals;
the second processing module is used for determining a target image distance according to at least two image distances;
the third processing module is used for performing focusing processing based on the target image distance and shooting to obtain a target file, wherein the target file comprises at least one of the following: an image or a video.
6. The shooting device of claim 5, further comprising:
the second receiving module is used for receiving a first input of a user to the shooting preview interface;
a fourth processing module for determining a target area in response to the first input;
the second processing module is further configured to calculate an average value of image distances of at least two pixel units corresponding to the target area, so as to obtain the target image distance.
7. The shooting device of claim 5, wherein the second processing module is further configured to obtain the maximum image distance of the at least two image distances, to obtain the target image distance.
8. The shooting device of claim 5, wherein the third processing module is further configured to: perform focusing processing based on the target image distance; receive a second input of a user to a shooting preview interface; and, in response to the second input, shoot to obtain a target file.
CN202111250296.9A 2021-10-26 2021-10-26 Lens module, mobile terminal, shooting method and shooting device Active CN113873132B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111250296.9A CN113873132B (en) 2021-10-26 2021-10-26 Lens module, mobile terminal, shooting method and shooting device

Publications (2)

Publication Number Publication Date
CN113873132A CN113873132A (en) 2021-12-31
CN113873132B true CN113873132B (en) 2024-03-22

Family

ID=78998124

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111250296.9A Active CN113873132B (en) 2021-10-26 2021-10-26 Lens module, mobile terminal, shooting method and shooting device

Country Status (1)

Country Link
CN (1) CN113873132B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006177712A (en) * 2004-12-21 2006-07-06 Canon Inc Semiconductor device and its manufacturing method
JP2015162562A (en) * 2014-02-27 2015-09-07 株式会社ニコン Imaging apparatus and digital camera
CN206311755U (en) * 2016-12-30 2017-07-07 北醒(北京)光子科技有限公司 A kind of multi-thread range unit of solid-state
CN112954217A (en) * 2021-02-25 2021-06-11 维沃移动通信有限公司 Electronic equipment and focusing method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7169863B2 (en) * 2018-12-03 2022-11-11 キヤノン株式会社 Imaging device, its control method, and calculation method
US20200344405A1 (en) * 2019-04-25 2020-10-29 Canon Kabushiki Kaisha Image pickup apparatus of measuring distance from subject to image pickup surface of image pickup device and method for controlling the same

Also Published As

Publication number Publication date
CN113873132A (en) 2021-12-31

Similar Documents

Publication Publication Date Title
US8134597B2 (en) Camera system with touch focus and method
CN107592466B (en) Photographing method and mobile terminal
CN110035218B (en) Image processing method, image processing device and photographing equipment
CN111314597A (en) Terminal, focusing method and device
CN112637500B (en) Image processing method and device
CN110049221B (en) Shooting method and mobile terminal
CN114500837B (en) Shooting method and device and electronic equipment
CN112291473B (en) Focusing method and device and electronic equipment
CN112312016A (en) Shooting processing method and device, electronic equipment and readable storage medium
CN113873132B (en) Lens module, mobile terminal, shooting method and shooting device
CN112637495A (en) Shooting method, shooting device, electronic equipment and readable storage medium
CN108600623B (en) Refocusing display method and terminal device
CN110602397A (en) Image processing method, device, terminal and storage medium
CN115484383B (en) Shooting method and related device
CN113794833B (en) Shooting method and device and electronic equipment
CN113014799B (en) Image display method and device and electronic equipment
US11252341B2 (en) Method and device for shooting image, and storage medium
CN114025100A (en) Shooting method, shooting device, electronic equipment and readable storage medium
CN108833794B (en) Shooting method and mobile terminal
CN115134517A (en) Shooting control method and device and storage medium
CN112399092A (en) Shooting method and device and electronic equipment
CN110913130A (en) Shooting method and electronic equipment
CN112954211B (en) Focusing method and device, electronic equipment and readable storage medium
CN114125417B (en) Image sensor, image pickup apparatus, image pickup method, image pickup apparatus, and storage medium
WO2024076531A1 (en) Hybrid auto-focus system with robust macro object priority focusing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant