CN113873132A - Lens module, mobile terminal, shooting method and shooting device - Google Patents


Info

Publication number
CN113873132A
CN113873132A (application CN202111250296.9A)
Authority
CN
China
Prior art keywords
laser
shooting
target
focusing
image distance
Prior art date
Legal status
Granted
Application number
CN202111250296.9A
Other languages
Chinese (zh)
Other versions
CN113873132B (en)
Inventor
陈典浩
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202111250296.9A
Publication of CN113873132A
Application granted
Publication of CN113873132B
Status: Active

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/50: Constructional details
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/67: Focus control based on electronic image sensor signals
    • H04N 23/671: Focus control based on electronic image sensor signals in combination with active ranging signals, e.g. using light or sound signals emitted toward objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Automatic Focus Adjustment (AREA)

Abstract

The application discloses a lens module, a mobile terminal, a shooting method and a shooting device, belonging to the technical field of camera shooting. The lens module comprises a laser transmitter, a laser receiver and an image sensor. The laser transmitter is used for transmitting a laser signal; the laser receiver comprises at least two laser receiving units, which are used for receiving laser signals and filtering out non-visible light signals; the image sensor comprises at least two pixel units arranged in one-to-one correspondence with the at least two laser receiving units; and an incident light signal is transmitted to the pixel units after passing through the laser receiving units.

Description

Lens module, mobile terminal, shooting method and shooting device
Technical Field
The application belongs to the technical field of camera shooting, and particularly relates to a lens module, a mobile terminal, a shooting method and a shooting device.
Background
With the rapid development of electronic technology and image processing technology, the shooting functions of terminals have become more and more powerful. During shooting, the lens can acquire a clear image of a target position by focusing on that position. In the related art, the position of laser auto-focusing defaults to the center of the preview screen, so when focusing on a position away from the image center in the shooting preview interface, the focusing effect is not ideal.
Disclosure of Invention
The embodiments of the present application aim to provide a lens module, a mobile terminal, a shooting method and a shooting device, which can solve the problem of poor focusing effect in the related art.
In a first aspect, an embodiment of the present application provides a lens module, which includes a laser transmitter, a laser receiver, and an image sensor. The laser transmitter is used for transmitting a laser signal; the laser receiver comprises at least two laser receiving units, and the at least two laser receiving units are used for receiving laser signals and filtering out non-visible light signals; the image sensor comprises at least two pixel units, and the at least two pixel units and the at least two laser receiving units are arranged in a one-to-one correspondence manner; and the incident light signal is transmitted to the pixel unit after passing through the laser receiving unit.
In a second aspect, an embodiment of the present application provides a mobile terminal, which includes the lens module according to the first aspect.
In a third aspect, an embodiment of the present application provides a shooting method, including: receiving laser signals through at least two laser receiving units of a laser receiver; calculating at least two image distances of at least two pixel units according to the laser signals; determining a target image distance according to at least two image distances; focusing is carried out based on the target image distance, and a target file is obtained through shooting and comprises at least one of the following items: images, videos.
In a fourth aspect, an embodiment of the present application provides a shooting device, which includes a first receiving module, a first processing module, a second processing module, and a third processing module. The first receiving module is used for receiving laser signals through at least two laser receiving units of the laser receiver; the first processing module is used for calculating at least two image distances of at least two pixel units according to the laser signals; the second processing module is used for determining a target image distance according to at least two image distances; the third processing module is used for carrying out focusing processing based on the target image distance and obtaining a target file by shooting, wherein the target file comprises at least one of the following items: images, videos.
In a fifth aspect, the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps of the method according to the third aspect.
In a sixth aspect, the present application provides a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the third aspect.
In a seventh aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the third aspect.
In the embodiment of the application, the laser receiver and the image sensor are integrated in the same lens, so that the image distances corresponding to all pixel units in the viewing range of the image sensor are obtained, and accurate focusing on any position, any object or any area in the shooting preview interface can be realized. Since each pixel can measure one image distance, pixel-level focusing can be realized and the accuracy of laser focusing is improved.
Drawings
Fig. 1 is a schematic structural diagram of a lens module according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of a laser receiver and an image sensor provided in an embodiment of the present application for receiving a laser signal;
FIG. 3 is a schematic diagram of a lens imaging principle;
fig. 4 is a schematic flowchart of a shooting method provided in an embodiment of the present application;
fig. 5 is a second schematic flowchart of a shooting method according to an embodiment of the present application;
FIG. 6 is a schematic interface diagram of a shooting method provided in an embodiment of the present application;
fig. 7 is a third schematic flowchart of a shooting method according to an embodiment of the present application;
fig. 8 is a second schematic interface diagram of the photographing method according to the embodiment of the present application;
fig. 9 is a third schematic interface diagram of a shooting method according to the embodiment of the present application;
fig. 10 is a schematic structural diagram of a shooting device provided in an embodiment of the present application;
fig. 11 is a schematic structural diagram of an electronic device provided in an embodiment of the present application;
fig. 12 is a hardware schematic diagram of an electronic device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein are intended to be within the scope of the present disclosure.
The terms "first", "second" and the like in the description and claims of the present application are used to distinguish between similar elements and not necessarily to describe a particular sequential or chronological order. It should be appreciated that data so used may be interchanged under appropriate circumstances, so that the embodiments of the application can be practiced in sequences other than those illustrated or described herein. The terms "first", "second" and the like are generally used generically and do not limit the number of objects; for example, a first object can be one or more than one. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the preceding and succeeding objects.
In the related art, single-point laser focusing is widely applied in the focusing process of terminals with a shooting function by virtue of its advantages of high focusing speed and good adaptability to dim-light and low-detail environments, covering more application scenes and bringing a better and faster focusing experience.
However, single-point laser focusing can only focus on the center area of the preview picture, that is, on a single center position point. An object in a non-center area of the shooting preview interface generally cannot become the focused region of interest, so when focusing on an object away from the center of the preview interface, the focusing effect is not ideal.
The lens module, the mobile terminal, the shooting method, the shooting device, the electronic device and the readable storage medium provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
As shown in fig. 1 and 2, the lens module includes a laser transmitter 110, a laser receiver 120, and an image sensor 130.
The lens module is provided with a dedicated laser emitter 110 for emitting laser signals. It is understood that the laser emitter 110 may periodically emit laser within a certain wavelength range.
As shown in fig. 2, the image sensor 130 divides an optical image on a light receiving surface thereof into a plurality of pixel units 131, converts the optical signal into a usable electric signal, and uses the usable electric signal for imaging.
It is understood that the image sensor 130 includes at least two pixel units, and the laser receiver 120 includes at least two laser receiving units 121 in one-to-one correspondence with the pixel units. The laser receiving units 121 receive the laser signal and filter out non-visible light signals.
When single-point laser focusing is adopted in the related art, the lens module comprises a lens for imaging plus an independent laser transmitter and laser receiver. The angle of view of the laser received by the laser receiver differs from the angle of view of the visible light received by the imaging lens, so the laser received by the laser receiver cannot completely cover the viewing range of the imaging lens; that is, parts of the lens's viewing range cannot be focused by laser.
Unlike the lens module in the related art, the inventor integrates the laser receiver 120 in front of the image sensor 130, and both the laser receiver 120 and the image sensor 130 are integrated in the lens 140. By removing the filter coating on the lens, light rays of all wavelength bands can enter the lens 140. All incident light signals transmitted through the lens will first reach the laser receiver 120 after being converged by the focusing lens.
The incident optical signals include at least laser signals emitted by the laser emitter 110 that are reflected back through the object, visible light signals, and invisible light signals. The incident light signal passes through the laser receiving unit 121 and then is transmitted to the pixel unit 131.
Each laser receiving unit 121 receives the laser signal for the pixel unit 131 it corresponds to one-to-one, and the object distance of that pixel unit 131 can be determined from the flight time of the laser, i.e. the time the laser travels from emission to reception within one laser emission period. The object distance of the photographed object is calculated from the flight time, and the lens module then focuses according to the obtained object distance.
As shown in fig. 3, the Gaussian formula for imaging,

1/f = 1/u + 1/v,

gives the relation between the object distance and the image distance, where f is the focal length of the lens, u is the distance between the photographed object and the optical center of the lens (the object distance), and v is the distance between the optimal imaging position and the optical center of the lens (the image distance).
It is understood that when the lens 140 shoots at a given moment, the focal length f of the lens is constant, and the object distance u and the image distance v vary inversely: the farther the photographed object is from the lens, the greater the object distance u and the smaller the image distance v; the closer the photographed object is to the lens, the smaller the object distance u and the larger the image distance v.
When shooting is carried out, the lens module determines the optimal image distance of imaging through the object distance obtained by the laser flight time, and then accurate focusing is realized to obtain a clearer image.
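As a minimal illustration of this relation, the following Python sketch rearranges the Gaussian formula to compute the image distance from the object distance; the focal length and object distances below are illustrative values, not taken from the application:

```python
# Thin-lens (Gaussian) relation: 1/f = 1/u + 1/v, so v = f*u / (u - f) for u > f.

def image_distance(f_mm: float, u_mm: float) -> float:
    """Image distance v for focal length f and object distance u (all in mm)."""
    if u_mm <= f_mm:
        raise ValueError("object must lie beyond the focal length for a real image")
    return f_mm * u_mm / (u_mm - f_mm)

# The farther the subject, the smaller v (it approaches f from above):
print(image_distance(5.0, 500.0))    # ~5.051 mm for a subject 0.5 m away
print(image_distance(5.0, 10000.0))  # ~5.003 mm for a subject 10 m away
```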
It can be understood that since the laser receiver 120 and the image sensor 130 are integrated in the same lens 140, the angle of view of the laser receiver 120 is identical to the angle of view of the image sensor 130. Each pixel unit 131 in the whole viewing range of the image sensor 130 can be focused by laser focusing, and the laser focusing area is not limited to a specific region of interest, so that the objects corresponding to all pixels in the viewing range of the image sensor 130 can be focused accurately.
According to the lens module of the embodiment of the application, the laser receiver 120 and the image sensor 130 are integrated in the same lens 140, and the image distances corresponding to all pixel units 131 in the viewing range of the image sensor 130 are obtained, so that accurate focusing on any position, any object or any area in the shooting preview interface can be achieved. Since each pixel can measure its own image distance, pixel-level focusing can be achieved and the accuracy of laser focusing is improved. Moreover, the laser focusing of the embodiment of the application is compatible with other focusing schemes, forming hybrid focusing schemes for different application scenes, so that the lens module is suitable for more focusing scenes and brings the user a better, faster and more accurate focusing experience.
The embodiment of the application also provides a mobile terminal, which comprises the lens module.
It is understood that mobile terminals include, but are not limited to, cell phones, tablets, computers, cameras, wearable devices, and the like.
Taking a mobile phone as an example, the lens module can be disposed on the back side of the mobile phone and used as a rear lens module, and the lens module can also be disposed on the front side of the mobile phone and used as a front lens module.
When a user uses the mobile phone to shoot, the mobile phone can meet the focusing requirement of the user for any position of a shooting preview interface in a mobile phone display screen, so that clear imaging of different positions of the shooting preview interface is realized, and the mobile phone shooting experience of the user is improved.
According to the mobile terminal provided by the embodiment of the application, the laser receiver and the image sensor are integrated in the same lens, the image distances corresponding to all pixel units in the view range of the image sensor are obtained, accurate focusing on any position, any object or any area in a shooting preview interface can be realized, as each pixel can correspondingly measure one image distance, pixel-level focusing can be realized, and the accuracy of laser focusing is improved.
The embodiment of the present application further provides a shooting method, where the shooting method may be applied to the above-mentioned mobile terminal, and may be specifically executed by hardware or software in the mobile terminal. The execution subject of the shooting method can be a mobile terminal, or a control device of the mobile terminal, and the like.
In the shooting method provided by the embodiment of the present application, an execution subject of the shooting method may be an electronic device or a functional module or a functional entity capable of implementing the shooting method in the electronic device, the electronic device mentioned in the embodiment of the present application includes, but is not limited to, a mobile phone, a tablet computer, a camera, a wearable device, and the like, and the shooting method provided by the embodiment of the present application is described below with the electronic device as the execution subject.
As shown in fig. 4, the photographing method includes: step 410, step 420, step 430 and step 440.
Step 410, receiving laser signals through at least two laser receiving units of a laser receiver.
When focusing is needed during shooting, the laser emitter emits laser to irradiate the shot object. The electronic device obtains the flight time of the laser by receiving the laser reflected by the photographic subject.
It can be understood that when each pixel unit of the image sensor needs to receive a visible light signal for imaging, at least two laser receiving units may receive a laser signal, determine the flight time of laser according to the laser signal received by the laser receiving unit, and determine the object distance of the pixel unit corresponding to the laser receiving unit.
And step 420, calculating at least two image distances of at least two pixel units according to the laser signals.
According to the imaging Gaussian formula, the image distance corresponding to each pixel unit can be determined according to the object distance corresponding to each pixel unit. The image distances corresponding to all pixels in the view range of the image sensor form an image distance set.
It will be appreciated that the laser receiver is always in the laser sensing state after the laser transmitter emits the laser light. The laser emitter can emit laser in the wavelength range of 930-940 nm and the laser emitting period can be 53.2 ns.
In this embodiment, each pixel unit in the viewing range of the image sensor lies in a particular row and column. Within one laser emission period, once the laser receiver detects laser with a wavelength of 930 nm-940 nm at the pixel position in row i, column j, the time of the detection is recorded and the interval Δt_{i,j} between that time and the laser emission time is calculated; Δt_{i,j} is the flight time of the laser at that pixel position.

After one emission period ends, the object distance corresponding to each pixel unit is calculated from its time interval Δt_{i,j} as u_{i,j} = c·Δt_{i,j}, where c is the speed of light. The panoramic depth of all pixels in the viewing range is thereby obtained, i.e. the set of object distances {u_{i,j}} of all pixel units. The image distance corresponding to the pixel in row i, column j then follows from the Gaussian formula,

v_{i,j} = f·u_{i,j} / (u_{i,j} − f),

giving the set of all pixel-unit image distances {v_{i,j}}, where f is the focal length of the focusing lens.
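A sketch of this per-pixel pipeline is shown below, assuming NumPy and hypothetical timing values. Note that the text states u = c·Δt, whereas round-trip time-of-flight ranging conventionally halves the product (u = c·Δt/2), so the halving is left as an explicit option here:

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def image_distance_map(dt: np.ndarray, f: float, halve_round_trip: bool = True) -> np.ndarray:
    """Per-pixel image distances v[i,j] from per-pixel laser flight times dt[i,j].

    dt is in seconds and f (focal length) in metres. The text states
    u = c * dt; conventional round-trip ToF uses u = c * dt / 2.
    """
    u = C * dt / (2.0 if halve_round_trip else 1.0)  # object distances {u_ij}, metres
    return f * u / (u - f)                           # Gaussian formula -> {v_ij}, metres

# Hypothetical 2x3 patch of flight times (subjects roughly 1-3 m away):
dt = np.array([[6.7e-9, 6.7e-9, 13.3e-9],
               [6.7e-9, 20.0e-9, 13.3e-9]])
v_map = image_distance_map(dt, f=0.005)  # 5 mm focal length
```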
And step 430, determining the target image distance according to the at least two image distances.
When the electronic equipment is focused, the image distance corresponding to each pixel unit in the view range of the image sensor is determined through the flight time of the laser received by the laser receiver, and the number of the pixel units is at least two. Since the image distances corresponding to each pixel unit are different, the target image distance for focusing needs to be determined.
The following describes how the focusing image distance is determined, from two different implementation perspectives.
First, an in-focus image distance is determined by receiving user input.
Referring to fig. 5, before at least two laser receiving units of the laser receiver receive the laser signals, the photographing method may further include steps 510 and 520.
Step 510, receiving a first input of a user to the shooting preview interface.
It can be understood that, as users' requirements for the imaging quality of electronic devices rise and personalized photographing experiences are increasingly favored, more and more users are accustomed to actively focusing on a target object within the viewing range.
The electronic equipment is provided with a display screen for displaying the view finding range of the image sensor, and a shooting preview interface on the display screen displays the view finding range of the image sensor, so that a user can conveniently preview an imaging picture in real time.
And displaying a target object which the user wants to focus at a target area in the shooting preview interface, and focusing the target object at the target area in the shooting preview interface by the electronic equipment according to a first input of the user.
Wherein the first input may be expressed in at least one of the following ways:
first, the first input may be represented by a touch operation, including but not limited to a click operation, a press operation, and the like.
In this embodiment, receiving the first input of the user may be represented by receiving a touch operation of the user on a display area of a display screen of the electronic device.
For example, when a user uses a mobile phone to shoot a person, the mobile phone receives an input that the user clicks the position of the face in the shooting preview interface of the mobile phone screen, and the mobile phone can automatically focus on the position of the person in the shooting preview interface.
Second, the first input may be represented as a voice input.
In this embodiment, the target voice may trigger the electronic device to focus on the target object.
For example, when a user uses a mobile phone to shoot a person, and the mobile phone receives a voice command such as "Xiao V, Xiao V, shoot the person", the mobile phone can automatically focus on the position of the person in the shooting preview interface.
Third, the first input may be represented as a physical key input.
In this embodiment, the body of the electronic device is provided with a physical key corresponding to the determination of the focusing image distance. The physical key can be a knob or a physical slide bar, and different rotation angles of the knob and different sliding distances of the physical slide bar correspond to different object distances or image distances corresponding to all pixels in a view range of the image sensor.
For example, receiving the input of the user may be represented as receiving the user's rotation of a focus knob on the camera lens, where the current knob position corresponds to a particular object distance or image distance value, thereby determining the focusing image distance.
Of course, in other embodiments, the first input may also be in other forms, including but not limited to character input, and the like, which may be determined according to actual needs, and this is not limited in this application.
Step 520, in response to the first input, determining a target area.
The electronic equipment can respond to the first input to focus a target object at a target position in the shooting preview interface after receiving the first input of the user.
Before focusing, determining a target area where a plurality of target pixel units corresponding to a target object are located, wherein the determination mode of the target area can be expressed as at least one of the following modes:
first, the target area is an area surrounding the user touch operation position and including a target number of pixel units.
In this embodiment, when the electronic device receives the first input of the user as a touch operation, the electronic device makes a circle around the touch operation position within the view range by taking the touch point of the user on the display screen as a center of the circle and the target length as a radius, and the circular area is a target area.
The circular area can cover a certain area, and the focusing requirement of a user is met.
Secondly, the area in the contour of the target object is the target area.
The target object is an object that the user desires to focus on. The confirmation of the target object can be determined by the touch operation of the user and can also be confirmed by the voice input of the user, the confirmation modes of the target object are not limited to the above two modes, and the target object can also be automatically confirmed by the electronic equipment.
In this embodiment, when the electronic device receives the first input of the user, the electronic device automatically identifies a contour of the target object through an artificial intelligence identification algorithm, and an area within the contour is a target area.
For example, a user uses a mobile phone to shoot a person, when focusing is performed, the mobile phone receives a click of the user on a face position in a shooting preview interface, and the mobile phone performs image recognition based on the click position and identifies a person outline. In this case, the region within the human figure outline is the target region.
The operation difficulty of the user can be reduced by automatically identifying the target object, the focusing position error caused by mistaken clicking of the user is reduced to a certain extent, and the operation experience of the user is improved.
Of course, in other embodiments, the target area may also be determined in other manners, including but not limited to using a minimum rectangular area covering all pixels of the target object as the target area, which may be determined according to actual needs, and this is not limited in this application.
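A minimal sketch of the first determination mode is given below, assuming NumPy and a pixel-grid coordinate system; the function name and radius choice are this sketch's own, not the application's:

```python
import numpy as np

def circular_target_region(shape: tuple, touch_rc: tuple, radius_px: int) -> np.ndarray:
    """Boolean mask of the pixel units within radius_px of the touch point.

    shape is (rows, cols) of the sensor's pixel grid; touch_rc is the
    (row, col) of the user's touch point mapped onto that grid.
    """
    rows, cols = np.ogrid[:shape[0], :shape[1]]
    r0, c0 = touch_rc
    return (rows - r0) ** 2 + (cols - c0) ** 2 <= radius_px ** 2

# e.g. a radius of 1/8 of the short side, as in the worked example later in the text:
mask = circular_target_region((3000, 4000), touch_rc=(1200, 2200), radius_px=3000 // 8)
```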
After confirming the target area, the electronic device can focus on the target area. In the focusing process, the corresponding image distance is determined according to the object distance of the pixel unit in the target area, and the target image distance is obtained according to the image distance and is the actual focusing image distance.
It is understood that, after determining the focusing area according to the first input of the user, the electronic device determines the target image distance for focusing.
The electronic equipment determines a target image distance and is used for controlling the motor to drive the focusing lens to move to the position of the target image distance for focusing. The electronic device may determine the target image distance of the target position according to at least one of the following manners:
firstly, the average value of the image distance values corresponding to all the pixel units in the target area is used as the target image distance.
In this embodiment, the target region includes at least two pixel units, and an average value of image distances of the at least two pixel units corresponding to the target region is calculated to obtain the target image distance.
The average value of the image distances of the pixel units is used as the focusing image distance, so that the focusing requirements of the pixel units in the current target area can be met, the focusing requirements of a user on a target object can be met in a better mode, and the overall focusing effect of the target object is better.
And secondly, selecting the maximum value of the image distance values corresponding to all the pixel units in the target area as the target image distance.
In this embodiment, the electronic device can focus on the position closest to the lens within the target area, so that the final imaging effect of the object closer to the lens is clearer.
And thirdly, selecting a mode value in the image distance values corresponding to all the pixel units in the target area as the target image distance.
In this embodiment, taking the mode of the image distance of the pixel unit in the target area as the target image distance can take the focusing requirement of more pixels at the target object into consideration, and can highlight the more obvious features of the target object in the final imaging, so that the focusing effect of the target object as a whole is better.
Of course, in other embodiments, the target image distance of the target area may also be determined in other manners, including but not limited to using the median of the image distances of a plurality of pixel units in the target area as the target image distance of the target object, which may be determined according to actual needs, and is not limited in this application.
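The selection strategies above can be summarized in one reducer. The sketch below shares the assumptions of the earlier sketches; the "min" case is used by the automatic modes described next, and taking the mode of a continuous image distance requires quantization, which is this sketch's own assumption:

```python
import numpy as np

def target_image_distance(v_map: np.ndarray, mask: np.ndarray, strategy: str = "mean") -> float:
    """Reduce the image distances of the pixel units in the target area to one value."""
    v = v_map[mask]
    if strategy == "mean":            # balances all pixel units in the area
        return float(v.mean())
    if strategy == "max":             # nearest subject in the area rendered sharpest
        return float(v.max())
    if strategy == "min":             # farthest subject rendered sharpest
        return float(v.min())
    if strategy == "median":          # middle of the area's depth range
        return float(np.median(v))
    if strategy == "mode":            # most common (quantized) image distance
        vals, counts = np.unique(np.round(v, 6), return_counts=True)
        return float(vals[counts.argmax()])
    raise ValueError(f"unknown strategy: {strategy}")
```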
And secondly, automatically determining the focusing image distance.
When focusing is needed, the electronic equipment can automatically determine the target image distance. When automatically determining the target image distance, the determination may be performed in at least one of the following manners:
firstly, the maximum value in the image distance set corresponding to all the pixel units is selected as the target image distance.
Determining the target image distance according to the at least two image distances comprises acquiring the maximum image distance of the at least two image distances to obtain the target image distance. Namely, the electronic device selects the maximum value in the image distance set corresponding to all pixel units in the view range of the image sensor as the target image distance. And according to the imaging Gaussian formula, the object distance value corresponding to the maximum image distance value is minimum.
Under the focusing mode, the electronic equipment can focus the position which is closest to the lens in the view finding range of the image sensor, the shooting requirement of a user for short distance or micro distance is met, the object which is closest to the lens can be shot clearly, and the user experience is enriched.
And secondly, selecting the minimum value in the image distance set corresponding to all the pixel units as the target image distance.
Determining the target image distance according to the at least two image distances comprises obtaining the minimum image distance of the at least two image distances to obtain the target image distance. The electronic equipment selects the minimum value in the image distance set corresponding to all the pixel units in the view range of the image sensor as the focusing image distance. And according to the imaging Gaussian formula, the object distance value corresponding to the minimum image distance value is maximum. Under the focusing mode, the electronic equipment can focus the position farthest from the lens in the view range of the image sensor, so that the final imaging of an object far away from the lens can be clearer.
And thirdly, selecting median values in the image distance set corresponding to all the pixel units as target image distances.
The electronic equipment selects the median of the image distances corresponding to all the pixel units in the view range of the image sensor as the focusing image distance. Under the focusing mode, the electronic equipment can focus the position of the middle value of the depth of field distance in the view finding range of the image sensor, the focusing position is between the two conditions of adopting the maximum value and the minimum value of the image distance, objects at different positions in the view finding range can be considered, and the imaging effect of the objects with different distances from the lens can be guaranteed.
And fourthly, selecting the mode numerical values in the image distance set corresponding to all the pixel units as the target image distance.
And the electronic equipment selects the mode in the image distances corresponding to all the pixels in the view range of the image sensor as the target image distance. Under the focusing mode, the electronic equipment takes the mode of all image distances as the focusing image distance, can ensure that pixels in a viewing range have the best focusing effect as many as possible, can give consideration to the focusing effect in a wider range in the viewing range, and improves the imaging quality.
Of course, in other embodiments, the electronic device may also automatically determine the target image distance according to other manners, including but not limited to selecting an average value of all target image distances as the focusing image distance, which may be determined according to actual needs, and this is not limited in this application.
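Reusing the hypothetical target_image_distance reducer from the sketch above, automatic selection is the same reduction applied with an all-pixels mask:

```python
# All pixel units in the image sensor's viewing range participate:
full = np.ones(v_map.shape, dtype=bool)

v_macro = target_image_distance(v_map, full, "max")     # nearest object sharp (close-up)
v_far   = target_image_distance(v_map, full, "min")     # farthest object sharp
v_mid   = target_image_distance(v_map, full, "median")  # balance near and far
v_many  = target_image_distance(v_map, full, "mode")    # sharp for the most pixel units
```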
The above-mentioned shooting method is further described below with reference to a specific scene.
In the scene of shooting flowers with a mobile phone, the user opens the shooting application of the mobile phone; when the user aims the lens at the flowers and keeps the mobile phone steady for a short time, the mobile phone judges that the user needs to focus.
The mobile phone controls the laser transmitter to transmit laser with the wavelength range of 930nm-940nm according to the period of 53.2ns, the laser receiver is always in a laser detection state after the laser transmitter transmits the laser, in one laser transmission period, once the position of the pixel unit in the ith row and the jth column of the image sensor detects the laser with the wavelength of 930nm-940nm, the time at the moment is recorded, and the time interval between the time and the laser transmission time is calculated.
And after one emission period is finished, calculating a set of corresponding object distances and image distances according to the time interval corresponding to each pixel unit.
In order to simplify the operation of the user and reduce the threshold of photographing by the user, the mobile phone can automatically determine the target image distance based on the image distances corresponding to all the current pixel units. In the present embodiment, the mobile phone sets the mode among all the image distances as the focused image distance.
As shown in fig. 6, the shooting preview interface contains a plurality of flowers 610 at different distances from the lens and a shooting control 620 for controlling shooting, and the pixel units corresponding to the petals of the flowers 610 occupy most of the pixel unit positions in the viewing range of the image sensor. The petals of some of the flowers 610 are close to the lens, so in automatic focusing, the object distance corresponding to the mode of all image distances is the mode of the distances between the petals of the multiple flowers 610 in the figure and the lens.
In this embodiment, the motor in the lens module drives the focusing lens to move to the position corresponding to the target image distance, and at this time, a plurality of flowers 610 with a close distance from the lens in the shooting preview interface on the mobile phone screen are clearly displayed, so that the number of clear flowers 610 in the shooting preview interface is the largest. The user can judge whether to perform the next shooting operation according to the imaging effect in the shooting preview interface, and the clear flowers 610 in the shot image are the largest in number.
Step 440, focusing is performed based on the target image distance, and a target file is obtained by shooting, wherein the target file comprises at least one of the following items: images, videos.
And determining the position of the focusing lens according to the target image distance of the pixel unit, and driving the focusing lens to move to the position corresponding to the optimal focusing image distance by the motor.
In the actual execution process, the far-focus image distance v_inf stored in the electronic device for the lens, with its corresponding motor command code_inf, and the near-focus image distance v_micr, with its corresponding motor command code_micr, can be used to convert the optimal focusing image distance v_f into the required motor command code_f by linear interpolation, which controls the motor and moves it to the position of the optimal focusing image distance v_f. The motor command code_f satisfies:

code_f = code_inf + (v_f − v_inf) / (v_micr − v_inf) · (code_micr − code_inf).
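A sketch of that interpolation follows, with hypothetical calibration values; the function and variable names are this sketch's own, not the application's:

```python
def motor_code(v_f: float, v_inf: float, code_inf: int, v_micr: float, code_micr: int) -> int:
    """Motor command for the target image distance v_f, linearly interpolated
    between the calibrated far-focus (v_inf, code_inf) and near-focus
    (v_micr, code_micr) pairs stored in the device."""
    t = (v_f - v_inf) / (v_micr - v_inf)
    return round(code_inf + t * (code_micr - code_inf))

# Hypothetical calibration: far focus at code 100, near (macro) focus at code 900.
code_f = motor_code(v_f=5.05e-3, v_inf=5.0e-3, code_inf=100, v_micr=5.6e-3, code_micr=900)
```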
at this time, the electronic device focuses and photographs according to the optimal focusing image distance. Shooting to obtain a target file, wherein the target file comprises at least one of the following items: images, videos.
It can be understood that the laser receiver and the image sensor are integrated in the same lens module, in the focusing process, the field angles of the light received by the laser receiver and the image sensor are consistent, the lens can receive the laser reflected by all objects in the viewing range of the image sensor, the laser receiver can receive the reflected laser in the viewing range of the image sensor according to pixel units, and the electronic device can obtain the object distance and the image distance corresponding to each pixel in the viewing range of the image sensor.
When focusing is carried out, the electronic equipment can determine the focusing image distance according to the image distance of any pixel according to the requirements of a user, and the accurate focusing of any position or any object in the view range of the image sensor can be realized.
According to the shooting method provided by the embodiment of the application, the image distances corresponding to all pixel units in the framing range of the image sensor are obtained through the laser receiver, so that accurate focusing of all positions and all objects in the shooting preview interface can be realized, laser focusing can be suitable for more focusing scenes, and the focusing experience of a user is improved.
In some embodiments, as shown in fig. 7, step 440, performing a focusing process based on the target image distance, and capturing a target file, where the target file includes at least one of the following: the image and video may further include steps 441, 442 and 443.
And step 441, carrying out focusing processing based on the target image distance.
It can be understood that the motor drives the focusing lens to move to the target image distance, and a preview image focused by the target image distance in the current view range of the image sensor can be displayed in the shooting preview interface.
And step 442, receiving a second input of the shooting preview interface from the user.
And when the user is satisfied with the current focusing effect, the second input of the user is used for controlling the electronic equipment to shoot according to the current focusing effect.
Wherein the second input may be expressed in at least one of the following ways:
first, the second input may be represented by a touch operation, including but not limited to a click operation, a press operation, a slide operation, and the like.
In this embodiment, receiving the second input of the user may be represented by receiving a touch operation of the user on a display area of a display screen of the electronic device.
In order to reduce the user's misoperation rate, the action area of the second input can be limited to a specific area, such as the lower middle area of the shooting preview interface; alternatively, the second input can be realized by displaying a target control on the current interface and touching that control while the shooting preview interface is displayed.
For example, when the user takes a picture with a mobile phone, receiving the second input of the user may be represented as receiving a touch operation in which the user clicks a shooting control while the shooting preview interface is displayed on the mobile phone screen.
Second, the second input may be represented as a physical key input.
In this embodiment, the body of the electronic device is provided with a physical key corresponding to the shooting, and receives the second input of the user, which may be expressed as receiving the second input of the user pressing the corresponding physical key; the second input may also be a combined operation of pressing a plurality of physical keys simultaneously.
For example, when the user takes a picture with a mobile phone, receiving the second input from the user may be represented as receiving the user's press of a volume key while the shooting preview interface is displayed on the mobile phone screen.
Third, the second input may be presented as a voice input.
In this embodiment, receiving the second input by the user may be embodied as receiving a voice instruction from the user.
For example, when a user shoots with a mobile phone, the mobile phone is triggered to shoot when it receives a voice such as "eggplant" (the Chinese counterpart of "cheese") while the shooting preview interface is displayed on the mobile phone screen.
Of course, in other embodiments, the second input may also be represented in other forms, including but not limited to character input, and the like, which may be determined according to actual needs, and this is not limited in this application.
And step 443, responding to the second input, and shooting to obtain the target file.
The electronic device can respond to the second input of the user and perform shooting. The electronic equipment obtains a target file after shooting, and the target file comprises at least one of an image or a video. The target file shot by the electronic equipment can be stored in a storage unit of the electronic equipment.
According to the shooting method provided by the embodiment of the application, the second input of the user is received for shooting under the condition of focusing according to the target image distance, so that the image or the video which accords with the expected focusing effect of the user can be obtained, and the shooting experience of the user is improved.
In some shooting scenarios, when the user is not satisfied with the effect of the current auto-focusing, the electronic device may switch the focusing mode to focus again by receiving a third input of the user to the target position or the target object in the shooting preview interface.
The third input may be represented as a touch input, and the electronic device performs focusing again by receiving a click operation of a user on a target object or a target position in the shooting preview interface.
For example, when a user takes a memorial photo of a figure using a cell phone, referring to fig. 8 and 9, a preview of the photo of the cell phone display screen is shown with a person 810, a background building 820 behind the person, and a street 830 in front of the person. And a shooting control 840 for shooting control is also displayed in the center of the bottom of the shooting preview interface of the mobile phone display screen.
When the mobile phone detects that the photographing application is opened, the laser transmitter starts to transmit laser in a certain period, after the laser receiver receives the reflected laser in the current period, the mobile phone calculates a set of object distances and image distances corresponding to each pixel unit according to the laser flight time interval corresponding to each pixel unit in the photographing preview interface, and selects the maximum image distance as a target image distance.
A motor in the mobile phone lens module drives a lens to move to a position corresponding to a focusing image distance, an image focused according to the focusing image distance is displayed in a shooting preview interface of a display screen, a street 830 in the image is closest to the lens, the preview effect of the street 830 is clearest, the face of a shot person 810 is fuzzy, and the details of the face are not clear enough.
To improve the shooting quality of the face of the person 810, as shown in fig. 8, after the mobile phone receives a third input in which the user clicks the display screen, the mobile phone detects that the touch point position is located where the background building 820 is close to the face of the person 810, i.e., the black point position in the figure. The mobile phone can take a circular area centered on the touch point position as target area A, whose set of pixel units is {(i, j) : (i, j) ∈ A}; the radius of target area A can be set to 1/8 of the length of the narrow side of the shooting preview interface of the display screen.
The laser receiver receives the reflected laser in the first period after the user clicks the display screen, and the mobile phone calculates the set of object distances and corresponding image distances {v_{i,j} : (i, j) ∈ A} according to the laser flight time interval of each pixel of the target area in the shooting preview interface, and selects the average image distance of the target area's pixel units, v_f = avg({v_{i,j} : (i, j) ∈ A}), as the target image distance.
In the above case, since there is a certain deviation between the actual click position of the user on the shooting preview interface and the desired focusing position of the user, after focusing according to the average image distance, the actual focusing position is not at the position of the face of the person 810 but at a position between the face of the person 810 and the background building 820, and the actual focusing effect is not good.
As shown in fig. 9, the mobile phone receives a third input of the user clicking the display again, and the mobile phone detects that the touch point position of the user clicking the display is located at the center of the face of the person 810, i.e. the position of the black point in the figure. The mobile phone may use a circular area centered on the contact position as the target area a.
The laser receiver receives the reflected laser in the first period after the user clicks the display screen again, the mobile phone calculates a set of object distances and image distances corresponding to the laser according to the laser flight time interval corresponding to each pixel in the target area A in the shooting preview interface, and selects the average value of the image distances corresponding to the pixels in the target area as the focusing image distance for focusing.
Under the above condition, the face of the person 810 can be focused accurately, and after the mobile phone receives a photographing instruction that the user clicks the photographing control 840, the image sensor starts to image and obtain an image file meeting the focusing requirement of the user.
In the actual shooting process, automatic focusing reduces the user's operation difficulty and makes shooting convenient, while manual focusing meets the user's personalized focusing requirements and improves the user's shooting satisfaction.
In the shooting method provided by the embodiment of the present application, the execution subject may be a shooting device, or a control module in the shooting device for executing the shooting method. The embodiment of the present application takes a method for executing shooting by a shooting device as an example, and describes the shooting device provided by the embodiment of the present application.
The embodiment of the application also provides a shooting device.
As shown in fig. 10, the photographing apparatus includes: a first receiving module 1010, a first processing module 1020, a second processing module 1030, and a third processing module 1040.
The first receiving module 1010 is configured to receive laser signals through at least two laser receiving units of a laser receiver; the first processing module 1020 is configured to calculate at least two image distances of at least two pixel units according to the laser signal; the second processing module 1030 is configured to determine a target image distance according to at least two image distances; the third processing module 1040 is configured to perform focusing processing based on the target image distance, and obtain a target file by shooting, where the target file includes at least one of the following: images, videos.
According to the shooting device provided by the embodiment of the application, the laser receiver and the image sensor are integrated in the same lens, the image distances corresponding to all pixel units in the view range of the image sensor are obtained, accurate focusing on any position, any object or any area in a shooting preview interface can be realized, and as each pixel can correspondingly measure one image distance, pixel-level focusing can be realized, and the accuracy of laser focusing is improved.
In some embodiments, the photographing apparatus further includes a second receiving module and a fourth processing module.
The second receiving module is used for receiving first input of a user to the shooting preview interface; the fourth processing module is used for responding to the first input and determining a target area; the second processing module 1030 is further configured to calculate an average value of image distances of at least two pixel units corresponding to the target area, so as to obtain a target image distance.
According to the shooting device provided by the embodiment of the application, the average value of the image distances of the pixel units is used as the focusing image distance, so that the focusing requirements of the pixel units in the current target area can be met, the focusing requirements of a user on a target object can be met in a better mode, and the overall focusing effect of the target object is better.
In some embodiments, the second processing module 1030 is further configured to obtain a maximum image distance of the at least two image distances to obtain the target image distance.
According to the shooting device provided by the embodiment of the application, the electronic equipment can focus the position which is closest to the lens in the framing range of the image sensor, the shooting requirement of a user for short distance or micro distance is met, the object which is closest to the lens can be clearly shot, and the user experience is enriched.
In some embodiments, the third processing module 1040 is further configured to perform a focusing process based on the target image distance; receiving a second input of the user to the shooting preview interface; and responding to the second input, and shooting to obtain the target file.
According to the shooting device provided by the embodiment of the application, the second input of the user is received for shooting under the condition of focusing according to the target image distance, the image or the video which accords with the expected focusing effect of the user can be obtained, and the shooting experience of the user is improved.
The shooting device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiments of the present application are not particularly limited.
The photographing apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android (Android) operating system, an IOS operating system, or other possible operating systems, which is not specifically limited in the embodiments of the present application.
The shooting device provided in the embodiment of the present application can implement each process implemented by the method embodiments of fig. 1 to 9, and is not described here again to avoid repetition.
Optionally, as shown in fig. 11, an electronic device 1100 is further provided in an embodiment of the present application, and includes a processor 1101, a memory 1102, and a program or an instruction stored in the memory 1102 and executable on the processor 1101, where the program or the instruction is executed by the processor 1101 to implement each process of the foregoing shooting method embodiment, and can achieve the same technical effect, and no repeated description is provided here to avoid repetition.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 12 is a schematic hardware structure diagram of an electronic device implementing an embodiment of the present application.
The electronic device 1200 includes, but is not limited to: radio frequency unit 1201, network module 1202, audio output unit 1203, input unit 1204, sensors 1205, display unit 1206, user input unit 1207, interface unit 1208, memory 1209, and processor 1210.
Those skilled in the art will appreciate that the electronic device 1200 may further comprise a power source (e.g., a battery) for supplying power to the various components; the power source may be logically connected to the processor 1210 via a power management system, so as to implement charging, discharging and power-consumption management functions. The electronic device structure shown in fig. 12 does not constitute a limitation of the electronic device, and the electronic device may include more or fewer components than shown, combine some components, or arrange the components differently; details are not repeated here.
The processor 1210 is configured to receive laser signals through at least two laser receiving units of the laser receiver;
the processor 1210 is further configured to calculate at least two image distances of at least two pixel units according to the laser signals;
the processor 1210 is further configured to determine a target image distance from the at least two image distances;
the processor 1210 is further configured to perform focusing processing based on the target image distance and to shoot a target file, where the target file includes at least one of the following: an image, a video.
In the electronic device provided by this embodiment of the application, the laser receiver and the image sensor are integrated in the same lens, and an image distance is obtained for each pixel unit within the framing range of the image sensor. Accurate focusing on any position, object, or region in the shooting preview interface can therefore be achieved; and because each pixel measures its own image distance, pixel-level focusing is possible, improving the accuracy of laser focusing.
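As a rough sketch of this pipeline (assumptions, not code from this application: the 5 mm focal length and all names below are illustrative), each laser receiving unit yields a time-of-flight sample for its pixel unit; the object distance u follows from the speed of light, and the thin-lens equation 1/f = 1/u + 1/v gives that pixel's image distance v:

SPEED_OF_LIGHT = 299_792_458.0  # m/s
FOCAL_LENGTH_M = 0.005          # assumed focal length f (5 mm)

def object_distance_from_tof(round_trip_s: float) -> float:
    """The laser's time of flight covers the subject distance twice (out and back)."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0

def image_distance(u: float, f: float = FOCAL_LENGTH_M) -> float:
    """Thin-lens equation 1/f = 1/u + 1/v, solved for the image distance v."""
    return f * u / (u - f)

def image_distance_map(tof_map):
    """One image distance per pixel unit, from its laser receiving unit's sample."""
    return [[image_distance(object_distance_from_tof(t)) for t in row]
            for row in tof_map]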
Optionally, the user input unit 1207 is configured to receive a first input from the user on the shooting preview interface;
the processor 1210 is further configured to determine a target area in response to the first input;
the processor 1210 is further configured to calculate an average value of the image distances of the at least two pixel units corresponding to the target area, so as to obtain the target image distance.
In the electronic device provided by this embodiment of the application, the average of the image distances of the plurality of pixel units is used as the focusing image distance, so the user's focusing requirement for the target object can be better satisfied and the target object is kept sharp as a whole.
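A minimal sketch of this averaging strategy, assuming distances is the per-pixel image-distance map from the previous sketch and area is the set of (row, column) pixel coordinates covered by the user's first input (both names are hypothetical):

def target_image_distance_by_average(distances, area):
    """Average the image distances of the pixel units inside the target area."""
    values = [distances[y][x] for (y, x) in area]
    return sum(values) / len(values)

For a subject such as a face spanning many pixels, the average keeps the whole region acceptably sharp rather than favoring a single point.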
Optionally, the processor 1210 is further configured to acquire the maximum image distance of the at least two image distances as the target image distance.
With the electronic device provided by this embodiment of the application, the device can focus on the position closest to the lens within the framing range of the image sensor. This satisfies the user's need for close-range or macro shooting, allows the object nearest to the lens to be captured clearly, and enriches the user experience.
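A corresponding sketch for this closest-subject strategy: under the thin-lens relation above, v grows as u shrinks toward f, so the largest image distance in the frame belongs to the subject nearest the lens (distances is the same hypothetical map as before):

def target_image_distance_by_max(distances):
    """The largest image distance in the frame, i.e. the nearest subject."""
    return max(v for row in distances for v in row)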
Optionally, the processor 1210 is further configured to perform focusing processing based on the target image distance;
the user input unit 1207 is further configured to receive a second input from the user on the shooting preview interface;
the processor 1210 is further configured to shoot a target file in response to the second input.
With the electronic device provided by this embodiment of the application, shooting is triggered by the user's second input only after focusing has been performed according to the target image distance, so an image or video matching the focusing effect the user expects can be obtained, improving the shooting experience.
It should be understood that, in this embodiment of the present application, the input unit 1204 may include a Graphics Processing Unit (GPU) 12041 and a microphone 12042; the graphics processing unit 12041 processes image data of a still picture or a video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The display unit 1206 may include a display panel 12061, and the display panel 12061 may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 1207 includes a touch panel 12071 and other input devices 12072. The touch panel 12071, also referred to as a touch screen, may include a touch detection device and a touch controller. Other input devices 12072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail here. The memory 1209 may be used to store software programs as well as various data, including but not limited to application programs and an operating system. The processor 1210 may integrate an application processor, which mainly handles the operating system, user interface, and applications, and a modem processor, which mainly handles wireless communications. It is to be appreciated that the modem processor may not be integrated into the processor 1210.
An embodiment of the present application further provides a readable storage medium storing a program or instruction which, when executed by a processor, implements each process of the above shooting method embodiments and achieves the same technical effect; details are not repeated here to avoid repetition.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
An embodiment of the present application further provides a chip including a processor and a communication interface coupled to the processor, where the processor is configured to run a program or instruction to implement each process of the above shooting method embodiments and achieve the same technical effect; details are not repeated here to avoid repetition.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed; the functions may be performed in a substantially simultaneous manner or in reverse order, e.g., the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. A lens module, comprising:
the laser transmitter is used for transmitting a laser signal;
the laser receiver comprises at least two laser receiving units, and the at least two laser receiving units are used for receiving laser signals and filtering out non-visible light signals;
the image sensor comprises at least two pixel units, and the at least two pixel units are arranged in one-to-one correspondence with the at least two laser receiving units;
wherein an incident light signal is transmitted to the pixel unit after passing through the laser receiving unit.
2. A mobile terminal comprising the lens module according to claim 1.
3. A photographing method applied to the mobile terminal according to claim 2, the method comprising:
receiving laser signals through at least two laser receiving units of a laser receiver;
calculating at least two image distances of at least two pixel units according to the laser signals;
determining a target image distance according to at least two image distances;
performing focusing processing based on the target image distance, and shooting to obtain a target file, wherein the target file comprises at least one of the following: an image, a video.
4. The shooting method according to claim 3, wherein before the receiving laser signals through at least two laser receiving units of the laser receiver, the method further comprises:
receiving a first input of a user to a shooting preview interface;
determining a target area in response to the first input;
and the determining a target image distance according to at least two image distances comprises:
calculating an average value of the image distances of the at least two pixel units corresponding to the target area, so as to obtain the target image distance.
5. The shooting method according to claim 3, wherein the determining a target image distance according to at least two image distances comprises:
acquiring a maximum image distance of the at least two image distances to obtain the target image distance.
6. The shooting method according to claim 3, wherein the performing focusing processing based on the target image distance and shooting to obtain a target file comprises:
performing focusing processing based on the target image distance;
receiving a second input of the user to the shooting preview interface;
and in response to the second input, shooting to obtain the target file.
7. A shooting device, comprising:
the first receiving module is used for receiving laser signals through at least two laser receiving units of the laser receiver;
the first processing module is used for calculating at least two image distances of at least two pixel units according to the laser signals;
the second processing module is used for determining a target image distance according to at least two image distances;
a third processing module, configured to perform focusing processing based on the target image distance and to shoot a target file, wherein the target file comprises at least one of the following: an image, a video.
8. The shooting device according to claim 7, further comprising:
the second receiving module is used for receiving first input of a user to the shooting preview interface;
a fourth processing module for determining a target area in response to the first input;
the second processing module is further configured to calculate an average value of image distances of at least two pixel units corresponding to the target area, so as to obtain the target image distance.
9. The shooting device according to claim 7, wherein the second processing module is further configured to acquire a maximum image distance of the at least two image distances to obtain the target image distance.
10. The shooting device according to claim 7, wherein the third processing module is further configured to: perform focusing processing based on the target image distance; receive a second input of the user to the shooting preview interface; and, in response to the second input, shoot to obtain the target file.
CN202111250296.9A 2021-10-26 2021-10-26 Lens module, mobile terminal, shooting method and shooting device Active CN113873132B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111250296.9A CN113873132B (en) 2021-10-26 2021-10-26 Lens module, mobile terminal, shooting method and shooting device


Publications (2)

Publication Number Publication Date
CN113873132A 2021-12-31
CN113873132B 2024-03-22

Family

ID=78998124

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111250296.9A Active CN113873132B (en) 2021-10-26 2021-10-26 Lens module, mobile terminal, shooting method and shooting device

Country Status (1)

Country Link
CN (1) CN113873132B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006177712A (en) * 2004-12-21 2006-07-06 Canon Inc Semiconductor device and its manufacturing method
JP2015162562A (en) * 2014-02-27 2015-09-07 株式会社ニコン Imaging apparatus and digital camera
CN206311755U (en) * 2016-12-30 2017-07-07 北醒(北京)光子科技有限公司 A kind of multi-thread range unit of solid-state
US20200177781A1 (en) * 2018-12-03 2020-06-04 Canon Kabushiki Kaisha Image capturing apparatus, method for controlling image capturing apparatus, and calculation method
US20200344405A1 (en) * 2019-04-25 2020-10-29 Canon Kabushiki Kaisha Image pickup apparatus of measuring distance from subject to image pickup surface of image pickup device and method for controlling the same
CN112954217A (en) * 2021-02-25 2021-06-11 维沃移动通信有限公司 Electronic equipment and focusing method


Also Published As

Publication number Publication date
CN113873132B (en) 2024-03-22

Similar Documents

Publication Publication Date Title
US9300858B2 (en) Control device and storage medium for controlling capture of images
CN110035218B (en) Image processing method, image processing device and photographing equipment
CN111314597A (en) Terminal, focusing method and device
CN112312016B (en) Shooting processing method and device, electronic equipment and readable storage medium
CN110266957B (en) Image shooting method and mobile terminal
CN112532881B (en) Image processing method and device and electronic equipment
CN110213480A (en) A kind of focusing method and electronic equipment
CN112291473B (en) Focusing method and device and electronic equipment
CN114500837B (en) Shooting method and device and electronic equipment
CN113840070A (en) Shooting method, shooting device, electronic equipment and medium
CN113747067B (en) Photographing method, photographing device, electronic equipment and storage medium
CN112543284A (en) Focusing system, method and device
CN113873132B (en) Lens module, mobile terminal, shooting method and shooting device
CN110602397A (en) Image processing method, device, terminal and storage medium
CN113794833B (en) Shooting method and device and electronic equipment
US11252341B2 (en) Method and device for shooting image, and storage medium
CN115037867B (en) Shooting method, shooting device, computer readable storage medium and electronic equipment
CN112653841B (en) Shooting method and device and electronic equipment
CN114025100A (en) Shooting method, shooting device, electronic equipment and readable storage medium
CN115134517A (en) Shooting control method and device and storage medium
CN112261300A (en) Focusing method and device and electronic equipment
CN111726531A (en) Image shooting method, processing method, device, electronic equipment and storage medium
CN113315904A (en) Imaging method, imaging device, and storage medium
CN114125417B (en) Image sensor, image pickup apparatus, image pickup method, image pickup apparatus, and storage medium
KR102458470B1 (en) Image processing method and apparatus, camera component, electronic device, storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant