CN113472977A - Macro imaging method and terminal - Google Patents

Macro imaging method and terminal

Info

Publication number
CN113472977A
CN113472977A
Authority
CN
China
Prior art keywords
lens
terminal
camera module
image sensor
macro
Prior art date
Legal status
Pending
Application number
CN202110598543.8A
Other languages
Chinese (zh)
Inventor
王海燕 (Wang Haiyan)
叶海水 (Ye Haishui)
苏蔚 (Su Wei)
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202110598543.8A
Publication of CN113472977A

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B3/00Focusing arrangements of general interest for cameras, projectors or printers
    • G03B3/02Focusing arrangements of general interest for cameras, projectors or printers moving lens along baseboard
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/02Constructional features of telephone sets
    • H04M1/0202Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026Details of the structure or mounting of specific components
    • H04M1/0264Details of the structure or mounting of specific components for a camera module assembly
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00Optical objectives specially designed for the purposes specified below
    • G02B13/24Optical objectives specially designed for the purposes specified below for reproducing or copying at short object distances
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/02Mountings, adjusting means, or light-tight connections, for optical elements for lenses
    • G02B7/04Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification
    • G02B7/08Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification adapted to co-operate with a remote control mechanism
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B9/00Optical objectives characterised both by the number of the components and their arrangements according to their sign, i.e. + or -
    • G02B9/62Optical objectives characterised both by the number of the components and their arrangements according to their sign, i.e. + or - having six components only
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32Means for focusing
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/02Bodies
    • G03B17/12Bodies with means for supporting objectives, supplementary lenses, filters, masks, or turrets
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M1/72439User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/54Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/62Control of parameters via user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/64Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/667Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/698Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00Optical objectives specially designed for the purposes specified below
    • G02B13/001Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras
    • G02B13/0015Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras characterised by the lens design
    • G02B13/002Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras characterised by the lens design having at least one aspherical surface
    • G02B13/0045Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras characterised by the lens design having at least one aspherical surface having five or more lenses
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/52Details of telephonic subscriber devices including functional features of a camera

Abstract

The application provides a macro imaging method and terminal, relating to the field of photographing technology, which can solve the prior-art problem of low imaging quality when the subject is close to the image sensor. The terminal comprises a camera module, an input component, an output component, and a processor, where the camera module comprises, from the object side to the image side, a lens, a lens driving device, and an image sensor. The lens supports sharp imaging when the distance between the subject and the image sensor is within a macro range. The lens driving device drives the lens to move along the optical axis when the distance between the subject and the image sensor is within the macro range, where the driving stroke of the lens driving device is related to the closest focusing distance of the terminal. The processor controls the lens driving device so that the lens completes focusing on the subject. The input component receives a shooting instruction entered by the user, the shooting instruction being used to capture the focused picture. The output component outputs the captured picture.

Description

Macro imaging method and terminal
Technical Field
The application relates to the technical field of terminal photographing, in particular to a macro imaging method and a terminal.
Background
Generally, a user can take still pictures or record video through a camera on a smart terminal. At present, smart-terminal cameras follow either a fixed-focus design or a zoom design. In a fixed-focus design, the focal length of the camera is a fixed value, for example 27 mm, 30 mm, 54 mm, or another value. In a zoom design, the focal length of the camera is adjustable. In typical use, so that the camera can focus both at infinity and at close range, the focusing distance of the camera usually starts at about 7 cm or more.
In many scenarios, a user needs to shoot at a short distance, for example to photograph an insect close to the lens. However, at closer focusing distances, such as 1-5 cm, the camera of an existing smart terminal produces blurred images, and the resulting image quality is low.
Disclosure of Invention
The embodiment of the application provides a terminal, which can obtain high-quality imaging in a shooting scene with a focusing distance of 1-5 cm.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical solutions:
in a first aspect, an embodiment of the present application provides a terminal, which includes a camera module, an input component, an output component, and a processor, where the camera module includes, from the object side to the image side, a lens, a lens driving device, and an image sensor.
The lens supports sharp imaging when the distance between the subject and the image sensor is within a macro range. The lens driving device drives the lens to move along the optical axis when the distance between the subject and the image sensor is within the macro range, where the driving stroke of the lens driving device is related to the closest focusing distance of the terminal. The processor controls the lens driving device so that the lens completes focusing on the subject. The input component receives a shooting instruction entered by the user, the shooting instruction being used to capture the focused picture. The output component outputs the captured picture. In this way, when the distance between the subject and the image sensor is within the macro range, the processor can control the lens driving device so that the lens successfully focuses on the subject.
In one possible design, the macro range is 1-5 cm.
Optionally, the lens is an ultra-wide-angle lens whose field of view (FOV) is greater than or equal to 100°, and the equivalent focal length of the ultra-wide-angle lens ranges from 10 mm to 20 mm.
Optionally, the ultra-wide-angle lens has negative distortion in the edge field of view, the negative distortion being greater than or equal to -30%, and the vertical-axis magnification of the ultra-wide-angle lens in the central field of view ranges from 0.03 to 0.43.
Optionally, the number of lens elements in the ultra-wide-angle lens ranges from 5 to 8, and the target surface size of the image sensor ranges from 1/3.06 inch to 1/2.78 inch.
In one possible design, the lens is an inner-focusing lens, and the processor is further configured to adjust the focal length of the inner-focusing lens.
Optionally, the inner-focusing lens comprises one or more lens elements of variable optical power, and the optical power of these variable-power elements is associated with the focal length of the inner-focusing lens.
The processor adjusting the focal length of the inner-focusing lens may specifically be implemented as: adjusting the optical power of the one or more variable-power lens elements so as to adjust the focal length of the inner-focusing lens.
Optionally, the refractive index of a variable-power lens element is related to its optical power.
Accordingly, the processor adjusting the optical power of the one or more variable-power lens elements may specifically be implemented as: controlling the current or voltage applied to a variable-power element so as to change its refractive index and thereby adjust its optical power.
Optionally, the shape of a variable-power lens element is related to its optical power.
Accordingly, the processor adjusting the optical power of the one or more variable-power lens elements may specifically be implemented as: controlling the deformation of a variable-power element so as to adjust its optical power.
Optionally, the variable-power lens element is an electro-optical lens or a deformable lens.
Thus, the refractive index of a variable-power element can be changed by applying an electric field to it, or the element can be pushed and squeezed into deformation by a driving device, so that its optical power changes and the focal length of the inner-focusing lens is adjusted. In this way, the terminal can support sharp imaging when the subject is close to the image sensor.
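As an editor's aside (not part of the patent text), the thin-lens lensmaker's equation makes both mechanisms concrete: optical power rises with the refractive index and with tighter surface curvature. The sketch below is a minimal Python illustration with assumed values, not a design from the application:

```python
def thin_lens_power_diopters(n: float, r1_mm: float, r2_mm: float) -> float:
    """Thin-lens lensmaker's equation: power grows with refractive index n
    and with tighter surface curvature (smaller radii, given in mm)."""
    return (n - 1.0) * (1000.0 / r1_mm - 1000.0 / r2_mm)

# Two ways to raise optical power, mirroring the two mechanisms above:
print(thin_lens_power_diopters(1.50, 20.0, -20.0))  # baseline: 50 D
print(thin_lens_power_diopters(1.55, 20.0, -20.0))  # higher n (electric field): 55 D
print(thin_lens_power_diopters(1.50, 15.0, -15.0))  # tighter curvature (squeezed): ~66.7 D
```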
In one possible design, the terminal further comprises a lens driving device. The inner-focusing lens comprises n lens elements arranged in sequence along the optical axis, among which are one or more movable lens groups. Each movable lens group comprises one or more movable elements, i.e. elements whose position along the optical axis relative to the lens is variable, and the relative positions of the movable elements along the optical axis are related to the focal length of the inner-focusing lens.
The lens driving device drives the one or more movable lens groups in the inner-focusing lens to move along the optical axis so as to adjust the focal length of the inner-focusing lens.
In this manner, driving by the lens driving device changes the relative positions of the movable elements along the optical axis, that is, the spacing between the elements in the lens, and thereby the optical characteristics of the whole lens, such as its focal length, can be changed. In the embodiments of the application, by dynamically adjusting the spacing between the elements in the lens, the focal length of the lens can be adjusted, enabling the terminal to form a relatively sharp image at macro distances.
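Again as an illustrative aside, the focal length of a two-element system depends on the spacing between the elements, so moving a group changes the spacing d and hence the focal length of the whole lens. All numbers below are assumed, not taken from the application:

```python
def combined_focal_mm(f1_mm: float, f2_mm: float, d_mm: float) -> float:
    """Two thin lenses separated by d: 1/f = 1/f1 + 1/f2 - d/(f1*f2)."""
    return 1.0 / (1.0 / f1_mm + 1.0 / f2_mm - d_mm / (f1_mm * f2_mm))

# Moving one group (changing the spacing d) retunes the whole lens:
for d in (0.5, 1.0, 1.5):
    print(f"d = {d} mm -> f = {combined_focal_mm(4.0, -6.0, d):.2f} mm")
```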
In a second aspect, an embodiment of the present application provides a macro imaging method applied to a terminal comprising a camera module, an input component, an output component, and a processor, where the camera module comprises, from the object side to the image side, a lens, a lens driving device, and an image sensor. The lens supports sharp imaging when the distance between the subject and the image sensor is within a macro range. The method comprises the following steps:
if the distance between the subject and the image sensor is detected to be within the macro range, the processor controls the lens driving device to drive the lens to move along the optical axis so that the lens focuses on the subject. The input component receives a shooting instruction entered by the user, the shooting instruction being used to capture the focused picture, after which the output component outputs the captured picture.
In one possible design, after the terminal detects that the distance between the subject and the image sensor is within the macro range, the terminal may further perform the following steps:
the output component outputs a first interface, and the first interface is used for prompting a user whether to start macro shooting.
According to the macro imaging method provided by the embodiment of the application, the terminal can detect whether the distance between a shot object and the image sensor meets the macro, when the macro is met, the lens driving device in the terminal pushes the lens to move along the optical axis so as to finish focusing, and then clear imaging can be shot under the macro.
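To make the flow concrete, here is a minimal editorial sketch of the detect-then-focus logic. The patent defines no software API, so LensDriver, its stroke mapping, and on_distance_detected are all hypothetical names; the stroke endpoints echo the ~400 um at 1 cm and ~50 um at 7 cm examples given later in the description:

```python
MACRO_RANGE_CM = (1.0, 5.0)  # the macro range named in the embodiments

class LensDriver:
    """Hypothetical stand-in for the voice coil motor / MEMS / piezoceramic driver."""

    def stroke_um_for(self, distance_cm: float) -> float:
        # Illustrative linear interpolation between ~400 um at 1 cm and ~50 um at 7 cm.
        return max(0.0, 400.0 - (distance_cm - 1.0) * (350.0 / 6.0))

    def move_lens(self, stroke_um: float) -> None:
        print(f"moving lens {stroke_um:.0f} um along the optical axis")

def on_distance_detected(distance_cm: float, driver: LensDriver) -> bool:
    """Drive focusing when the subject falls inside the macro range."""
    lo, hi = MACRO_RANGE_CM
    if lo <= distance_cm <= hi:
        driver.move_lens(driver.stroke_um_for(distance_cm))
        return True  # the input component may now accept the user's shutter press
    return False

print(on_distance_detected(3.0, LensDriver()))  # moves the lens ~283 um, prints True
```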
In one possible design, the macro range is 1-5 cm.
In one possible design, the lens is an ultra-wide-angle lens whose field of view (FOV) is greater than or equal to 100°, and the equivalent focal length of the ultra-wide-angle lens ranges from 10 mm to 20 mm.
Optionally, the ultra-wide-angle lens has negative distortion in the edge field of view, the negative distortion being greater than or equal to -30%, and the vertical-axis magnification of the ultra-wide-angle lens in the central field of view ranges from 0.03 to 0.43.
Optionally, the number of lens elements in the ultra-wide-angle lens ranges from 5 to 8, and the target surface size of the image sensor ranges from 1/3.06 inch to 1/2.78 inch.
In one possible design, the lens is an inner-focusing lens. The processor controlling the lens driving device to drive the lens along the optical axis so that the lens focuses on the subject may specifically be implemented as: the processor controls the lens driving device to drive the inner-focusing lens along the optical axis and adjusts the focal length of the inner-focusing lens, so that the inner-focusing lens completes focusing on the subject.
Optionally, the terminal, through the processor, controls the current or voltage applied to a variable-power lens element to adjust its optical power, or controls the variable-power element to deform so as to adjust its optical power. Of course, the processor may also change the optical power of the variable-power element in other ways to adjust the focal length of the inner-focusing lens.
According to the macro imaging method provided by the embodiments of the application, when the terminal detects that the distance between the subject and the image sensor falls within the macro range, it can change the optical power of the lens by controlling its deformation or refractive index so as to adjust the lens focal length, and focusing can be completed through the lens driving device, so that high-quality imaging is obtained at macro distances.
Drawings
Fig. 1 is a schematic structural diagram of a terminal according to an embodiment of the present application;
fig. 2 is a schematic view illustrating a setting of a camera module on a terminal according to an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of a camera module according to an embodiment of the present disclosure;
fig. 4 is a schematic view illustrating a connection between a voice coil motor and a lens according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of a camera module with an ultra-wide-angle lens according to an embodiment of the present disclosure;
FIG. 6 is a schematic view of a field of view;
FIG. 7 is a photograph taken over a macro range of a conventional mobile phone and a mobile phone according to an embodiment of the present application;
fig. 8 is a picture taken by the terminal in a macro range according to the embodiment of the present application;
fig. 9 is a picture taken by the terminal in a macro range according to the embodiment of the present application;
fig. 10 is a first schematic structural diagram of a camera module with an internal focusing lens according to an embodiment of the present disclosure;
fig. 11 is a schematic structural diagram of a camera module with an internal focusing lens according to an embodiment of the present disclosure;
FIG. 12 is a flowchart of a method for macro imaging according to an embodiment of the present disclosure;
FIG. 13 is a flowchart of a method for macro imaging according to an embodiment of the present disclosure;
FIG. 14 is a flowchart of a method for macro imaging according to an embodiment of the present disclosure;
fig. 15 is a first view of a macro imaging scenario provided in an embodiment of the present application;
fig. 16 is a second scene schematic diagram of macro imaging according to an embodiment of the present application;
fig. 17 is a third scene schematic diagram of macro imaging according to the embodiment of the present application.
Description of reference numerals:
1 - lens;
2 - voice coil motor.
Detailed Description
First, terms related to embodiments of the present application are explained:
field of view (FOV): referring to fig. 6, in the optical device, an angle formed by two edges of the maximum range where an object image of a subject can pass through the lens is referred to as a field of view. The size of the field of view determines the field of view of the optical instrument, the larger the field of view. That is, objects within the field of view may be captured through the lens, while objects outside the field of view are not visible. In fig. 6, ab is the diameter of the visual range, point c is the center of the visual range, oc is the object distance, and ω is the field of view.
Target surface size of image sensor: refers to the size of the photosensitive element in the image sensor.
Equivalent focal length: because the size of the photosensitive element of the image sensor differs between camera modules, the same lens produces different imaging effects when paired with different photosensitive elements. For ease of understanding and expression, the focal lengths of different lenses are therefore converted, by a proportionality coefficient, into the equivalent focal length of a standard camera, where the standard camera may be a full-frame camera. For the conversion method, reference may be made to the prior art; it is not repeated here.
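A minimal editorial sketch of that conversion, assuming the common crop-factor method against a full-frame diagonal; the lens and sensor values are illustrative, not from the patent:

```python
FULL_FRAME_DIAGONAL_MM = 43.27  # diagonal of a 36 mm x 24 mm full-frame sensor

def equivalent_focal_length_mm(actual_focal_mm: float, sensor_diagonal_mm: float) -> float:
    """Convert an actual focal length to its full-frame equivalent via the crop factor."""
    crop_factor = FULL_FRAME_DIAGONAL_MM / sensor_diagonal_mm
    return actual_focal_mm * crop_factor

# Illustrative: a 2.5 mm lens on a sensor with a ~6.46 mm diagonal maps to
# roughly 16.7 mm equivalent, inside the 10-20 mm range quoted above.
print(equivalent_focal_length_mm(2.5, 6.46))
```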
Depth of field: after the camera module completes focusing, the range over which the image of the subject on the photosensitive element is clear and sharp. The larger this clear range, the deeper the depth of field; the smaller it is, the shallower the depth of field. Depth of field is also related to the background blurring effect: generally, a shallow depth of field gives a better background blurring effect, and a deep depth of field gives a poorer one.
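For illustration only (not from the patent), the standard hyperfocal approximation shows why a close focusing distance implies a shallow depth of field; all parameter values below are assumed:

```python
def total_depth_of_field_mm(f_mm: float, n: float, c_mm: float, s_mm: float) -> float:
    """Total DoF from the standard hyperfocal approximation.
    f: focal length, n: f-number, c: circle of confusion, s: subject distance."""
    h = f_mm * f_mm / (n * c_mm) + f_mm                       # hyperfocal distance
    near = s_mm * (h - f_mm) / (h + s_mm - 2 * f_mm)          # near limit of sharpness
    far = s_mm * (h - f_mm) / (h - s_mm) if s_mm < h else float("inf")
    return far - near

# A close subject gives millimetre-scale DoF (strong background blur);
# a distant subject gives a much deeper DoF.
print(total_depth_of_field_mm(f_mm=2.5, n=1.8, c_mm=0.002, s_mm=50))    # ~2.7 mm
print(total_depth_of_field_mm(f_mm=2.5, n=1.8, c_mm=0.002, s_mm=1000))  # ~1.7 m
```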
The terms "first" and "second" and the like in the description and drawings of the present application are used for distinguishing different objects or for distinguishing different processes for the same object, and are not used for describing a specific order of the objects. Furthermore, the terms "including" and "having," and any variations thereof, as referred to in the description of the present application, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements but may alternatively include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. It should be noted that in the embodiments of the present application, words such as "exemplary" or "for example" are used to indicate examples, illustrations or explanations. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
The terminal provided in the embodiments of the present application may be a portable electronic device with a camera function, such as a mobile phone, a wearable device, an augmented reality (AR) / virtual reality (VR) device, a tablet computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA); the examples are not intended to be limiting in any way. Exemplary embodiments of the portable electronic device include, but are not limited to, devices running iOS, Android, or other operating systems. The portable electronic device may also be another portable electronic device, such as a laptop computer with a touch-sensitive surface (e.g., a touch panel). It should also be understood that in other embodiments of the present application, the electronic device may not be a portable electronic device but a desktop computer with a touch-sensitive surface (e.g., a touch panel).
As shown in fig. 1 and fig. 2, the terminal in the embodiment of the present application may be a mobile phone 100. The embodiment will be specifically described below by taking the mobile phone 100 as an example.
As shown in fig. 1, the mobile phone 100 may specifically include: processor 101, Radio Frequency (RF) circuitry 102, memory 103, touch screen 104, bluetooth device 105, one or more sensors 106, Wi-Fi device 107, positioning device 108, audio circuitry 109, peripheral interface 110, and power supply 111. These components may communicate over one or more communication buses or signal lines (not shown in fig. 1). Those skilled in the art will appreciate that the hardware configuration shown in fig. 1 is not intended to be limiting, and that the handset 100 may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes the components of the handset 100 in detail with reference to fig. 1:
the processor 101 is a control center of the mobile phone 100, connects various parts of the mobile phone 100 by using various interfaces and lines, and executes various functions of the mobile phone 100 and processes data by running or executing an application program (App for short) stored in the memory 103 and calling data stored in the memory 103. In some embodiments, processor 101 may include one or more processing units. For example, the processor 101 may be configured to control and adjust a focal length of a lens in the camera module. For a detailed description of the processor control to adjust the focal length of the lens, see below. The processor 101 is further configured to control a lens driving device in the camera module, drive the lens to move along the optical axis, and adjust the focal length of the inner focusing lens, so that the lens focuses on the object.
The RF circuit 102 may be used for receiving and transmitting wireless signals during the sending and receiving of information or during calls. In particular, the RF circuit 102 may receive downlink data from the base station and deliver it to the processor 101 for processing, and it transmits uplink data to the base station. Typically, the radio frequency circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the RF circuitry 102 may also communicate with other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to the global system for mobile communications, general packet radio service, code division multiple access, wideband code division multiple access, long term evolution, e-mail, and the short message service.
The memory 103 is used for storing application programs and data, and the processor 101 executes various functions and data processing of the mobile phone 100 by running the application programs and data stored in the memory 103. The memory 103 mainly includes a program storage area and a data storage area. The storage program area may store an operating system, and an application program required by at least one function (such as a sound playing function, an image playing function, and the like). The storage data area may store data (e.g., audio data, a phonebook, etc.) created from use of the handset 100. Further, the memory 103 may include high speed random access memory, and may also include non-volatile memory, such as a magnetic disk storage device, a flash memory device, or other volatile solid state storage device. The memory 103 may store various operating systems. For example, the iOS operating system developed by apple inc, the Android operating system developed by google inc, and so on.
The handset may include an input component and an output component. The input component can receive the user's input operations on the mobile phone, for example voice operations or touch operations entered by the user. The output component can output the results of the mobile phone's internal data processing to the user, for example as voice or as an output interface. The input component and the output component may be integrated together. As one possibility, the touch screen 104 combines the touch pad 104-1 as an input component with the display screen 104-2 as an output component; that is, the touch screen 104 may include the touch pad 104-1 and the display screen 104-2. As an input component, the touch pad 104-1 collects touch events on or near it by the user of the mobile phone 100 (e.g., operations performed by the user on or near the touch pad 104-1 using a finger, a stylus, or any other suitable object) and sends the collected touch information to another component, such as the processor 101.
Wherein a touch event by a user near the touch pad 104-1 may be referred to as a hover touch. Hover touch may refer to a user not needing to directly contact the touchpad in order to select, move, or drag a target (e.g., an icon, etc.), but only needing to be located near the terminal in order to perform a desired function. In the context of a hover touch application, the terms "touch," "contact," and the like do not imply a direct contact to the touch screen, but rather a nearby or near contact.
Specifically, two types of capacitive sensors, i.e., a mutual capacitive sensor and a self capacitive sensor, may be disposed in the touch pad 104-1, and the two types of capacitive sensors may be alternately arrayed on the touch pad 104-1. The mutual capacitance sensor is used to implement normal conventional multi-touch, i.e. detect a gesture when a user touches the touch pad 104-1. While self-capacitance sensors can generate signals that are more powerful than mutual capacitance, thereby detecting finger touches that are farther from the touch pad 104-1. Thus, when the user's finger is hovering over the screen, the self-capacitance sensor generates a signal that is greater than the mutual-capacitance sensor, allowing the cell phone 100 to detect a user's gesture over the screen, e.g., 20mm above the touch pad 104-1.
Optionally, the touch pad 104-1 capable of performing floating touch may be implemented by using capacitive type, infrared sensing, ultrasonic wave, or the like. In addition, the touch pad 104-1 can be implemented by various types, such as resistive, capacitive, infrared, and surface acoustic wave. The display screen 104-2 may serve as an output component for displaying information entered by or provided to the user, as well as various menus for the handset 100. The display screen 104-2 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The touch pad 104-1 may be overlaid on the display screen 104-2, and when the touch pad 104-1 detects a touch event thereon or nearby, it may be communicated to the processor 101 to determine the type of touch event, and the processor 101 may then provide a corresponding visual output on the display screen 104-2 based on the type of touch event.
Although in FIG. 1, the touch pad 104-1 and the display screen 104-2 are shown as two separate components to implement the input and output functions of the cell phone 100, in some embodiments, the touch pad 104-1 and the display screen 104-2 may be integrated to implement the input and output functions of the cell phone 100.
It is understood that the touch screen 104 is formed by stacking multiple layers of materials, and only the touch pad (layer) and the display screen (layer) are shown in the embodiment of the present application, and other layers are not described in the embodiment of the present application. In addition, in some other embodiments of the present invention, the touch pad 104-1 may be covered on the display screen 104-2, and the size of the touch pad 104-1 is larger than that of the display screen 104-2, so that the display screen 104-2 is completely covered under the touch pad 104-1, or the touch pad 104-1 may be disposed on the front surface of the mobile phone 100 in a full-panel manner, that is, the touch of the user on the front surface of the mobile phone 100 can be sensed by the mobile phone, so that the full-touch experience on the front surface of the mobile phone can be realized. In other embodiments, the touch pad 104-1 is disposed on the front surface of the mobile phone 100 in a full-panel manner, and the display screen 104-2 may also be disposed on the front surface of the mobile phone 100 in a full-panel manner, so that a frameless structure can be implemented on the front surface of the mobile phone.
Illustratively, in the embodiments of the present application, an input component such as the touch pad 104-1 is configured to receive a shooting instruction entered by the user, where the shooting instruction instructs the terminal to capture the focused picture. An output component such as the display screen 104-2 is configured to output the picture captured after focusing. For example, referring to (d) in fig. 15, the user selects the photographing option 1505 by touching the touch pad 104-1 to enter a shooting instruction; the terminal then captures the focused picture, and the display screen 104-2 outputs the captured picture.
In the embodiment of the present application, the mobile phone 100 may further have a fingerprint recognition function. For example, the fingerprint acquisition device 112 may be disposed on the back side of the handset 100, or the fingerprint acquisition device 112 may be disposed on the front side of the handset 100 (e.g., below the touch screen 104). For another example, the fingerprint acquisition device 112 may be configured in the touch screen 104 to realize the fingerprint identification function, i.e., the fingerprint acquisition device 112 may be integrated with the touch screen 104 to realize the fingerprint identification function of the mobile phone 100. In this case, the fingerprint acquisition device 112 is disposed in the touch screen 104, may be a part of the touch screen 104, and may be disposed in the touch screen 104 in other manners. Additionally, the fingerprint acquisition device 112 may also be implemented as a full panel fingerprint acquisition device. Thus, the touch screen 104 can be viewed as a panel that can be fingerprinted anywhere. The fingerprint acquisition device 112 may send the acquired fingerprint to the processor 101 for processing (e.g., fingerprint verification, etc.) by the processor 101. The main component of the fingerprint acquisition device 112 in the present embodiment is a fingerprint sensor, which may employ any type of sensing technology, including but not limited to optical, capacitive, piezoelectric, or ultrasonic sensing technologies, among others.
The handset 100 may also include a bluetooth device 105 for enabling data exchange between the handset 100 and other short-range terminals (e.g., cell phones, smart watches, etc.). The bluetooth device 105 in the embodiment of the present application may be an integrated circuit or a bluetooth chip, etc.
The handset 100 may also include at least one sensor 106, such as a light sensor, a motion sensor, an image sensor, and other sensors. In particular, the light sensor may include an ambient light sensor and a proximity sensor. The ambient light sensor can adjust the brightness of the display screen of the touch screen 104 according to the brightness of ambient light, and the proximity sensor can turn off the power supply of the display screen when the mobile phone 100 moves to the ear. As one kind of motion sensor, the accelerometer can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used in applications that recognize the posture of the mobile phone (such as landscape/portrait switching, related games, and magnetometer posture calibration) and in vibration-recognition functions (such as a pedometer or tap detection). The image sensor may be disposed in the camera module 115 and converts the picture captured by the camera module 115 into an electrical signal. Illustratively, a charge-coupled device (CCD) image sensor offers high resolution, allowing it to sense and identify fine objects, and a large light-sensing area. A complementary metal-oxide-semiconductor (CMOS) image sensor has a power-saving characteristic and can reduce the power consumption of the mobile phone when taking still pictures or recording video.
In addition, the mobile phone 100 may also be configured with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which are not described herein again.
The Wi-Fi device 107 provides the mobile phone 100 with network access in accordance with Wi-Fi standard protocols. Through the Wi-Fi device 107, the mobile phone 100 can connect to a Wi-Fi access point, helping the user send and receive e-mail, browse web pages, access streaming media, and so on, and providing the user with wireless broadband Internet access. In other embodiments, the Wi-Fi device 107 can also act as a Wi-Fi wireless access point and provide Wi-Fi network access to other terminals.
The positioning device 108 provides a geographical position for the handset 100. It is understood that the positioning device 108 may be a receiver for the Global Positioning System (GPS) or for a positioning system such as the BeiDou satellite navigation system or Russia's GLONASS. After receiving the geographical position sent by the positioning system, the positioning device 108 passes the information to the processor 101 for processing, or to the memory 103 for storage. In still other embodiments, the positioning device 108 may be a receiver for the Assisted Global Positioning System (AGPS), in which an assisted positioning server helps the positioning device 108 perform ranging and positioning services; in this case, the assisted positioning server provides positioning assistance by communicating over a wireless communication network with the positioning device 108 (i.e., the GPS receiver) of a terminal such as the handset 100. In other embodiments, the positioning device 108 may rely on positioning technology based on Wi-Fi access points. Because every Wi-Fi access point has a globally unique Media Access Control (MAC) address, the terminal, with Wi-Fi enabled, can scan and collect the broadcast signals of surrounding Wi-Fi access points and thus obtain the MAC addresses they broadcast. The terminal sends data identifying the Wi-Fi access points (such as the MAC addresses) to the location server over the wireless communication network; the location server retrieves the geographical location of each Wi-Fi access point, computes the geographical location of the terminal from the strengths of the Wi-Fi broadcast signals, and sends it to the positioning device 108 of the terminal.
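The patent does not specify how the location server combines the access-point data; as one illustrative possibility, a signal-strength-weighted centroid over known AP positions could look like this (all names and values are hypothetical):

```python
def weighted_centroid(aps: list[tuple[float, float, float]]) -> tuple[float, float]:
    """Estimate the terminal position as the signal-strength-weighted centroid of
    known AP positions. aps: list of (lat, lon, rssi_dbm) tuples."""
    weights = [10 ** (rssi / 10.0) for _, _, rssi in aps]  # dBm -> linear power
    total = sum(weights)
    lat = sum(w * ap[0] for w, ap in zip(weights, aps)) / total
    lon = sum(w * ap[1] for w, ap in zip(weights, aps)) / total
    return lat, lon

# Stronger signals pull the estimate toward their access point:
print(weighted_centroid([(31.20, 121.40, -40), (31.21, 121.41, -60), (31.20, 121.42, -70)]))
```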
The audio circuitry 109, speaker 113, and microphone 114 provide an audio interface between the user and the handset 100. The audio circuit 109 can convert received audio data into an electrical signal and transmit it to the speaker 113, which converts it into a sound signal for output. Conversely, the microphone 114 converts collected sound signals into electrical signals, which the audio circuit 109 receives and converts into audio data; the audio data is then output to the RF circuit 102 for transmission to, for example, another mobile phone, or to the memory 103 for further processing.
Peripheral interface 110, which is used to provide various interfaces for external input/output devices (e.g., keyboard, mouse, external display, external memory, SIM card, etc.). For example, the mouse is connected through a Universal Serial Bus (USB) interface, and the Subscriber Identity Module (SIM) card provided by a telecom operator is connected through a metal contact on a SIM card slot. Peripheral interface 110 may be used to couple the aforementioned external input/output peripherals to processor 101 and memory 103.
The mobile phone 100 may further include a power supply device 111 (such as a battery and a power management chip) for supplying power to each component, and the battery may be logically connected to the processor 101 through the power management chip, so as to implement functions of managing charging, discharging, and power consumption through the power supply device 111.
The mobile phone 100 may further include a camera module 115, which may serve as the terminal's camera for taking still pictures, recording video, and so on. As a possible implementation, the camera module 115 includes, from the object side to the image side, a lens, a lens driving device, and an image sensor. A detailed description of the camera module 115 is given in the following embodiments.
Although not shown in fig. 1, the mobile phone 100 may further include a flash, a micro-projector, a Near Field Communication (NFC) device, and so on, which are not described in detail herein.
The terminal provided in the embodiments of the present application is explained in detail below. Hereinafter, a mobile phone is mainly used as the example terminal; this is stated here once and not repeated. Referring to fig. 2, taking the mobile phone 200 as an example, the camera module 201 in the mobile phone 200 may be the rear camera shown in fig. 2, disposed at the top of the back of the phone. Of course, the camera module may also be placed elsewhere, for example inside the phone, popping out to shoot when the user needs to take a picture.
Referring to fig. 3, the structure of a camera module in a terminal according to an embodiment of the present application is shown. The camera module 201 includes, from the object side to the image side, a lens 301, a lens driving device 302, and an image sensor 303. It should be noted that the components in fig. 3 are only exemplary; the actual shape and size of the components are not limited to those shown in fig. 3.
The object side refers to the side of the photographed object (referred to simply as the subject), and the image side refers to the side on which the image sensor forms the image. The lens driving device includes, but is not limited to, a voice coil motor, a piezoelectric ceramic, or a Micro-Electro-Mechanical System (MEMS). The image sensor includes, but is not limited to, the above-mentioned CCD and CMOS image sensors.
The lens driving device drives the lens to move along the optical axis. The driving stroke of the lens driving device is related to the closest focusing distance of the lens. In the embodiments of the application, the driving stroke of the motor enables the closest focusing distance of the lens to fall within 1-5 cm.
The focusing distance refers to the object-to-image distance, i.e., the sum of the distance from the subject to the lens and the distance from the lens to the image sensor, in other words the distance between the subject and the image sensor. The closest focusing distance is the shortest distance at which the subject can still be brought into focus; the subject being in focus means that it forms a relatively sharp image on the image sensor, so the closest focusing distance is the shortest subject-to-image-sensor distance at which a relatively sharp image can be formed.
Taking a motor as an example of the lens driving device: when the subject is close to the image sensor, for example at 1 cm, the motor drives the lens along the optical axis through a certain stroke (e.g., 400 um), so that the subject is in focus at 1 cm from the image sensor. When the subject is, for example, 7 cm from the image sensor, the motor drives the lens along the optical axis through a different stroke (e.g., 50 um), so that the subject is in focus at 7 cm from the image sensor. In the embodiments of the application, the driving stroke of the motor enables the closest focusing distance of the lens to fall within 1-5 cm. That is, when the distance between the subject and the image sensor is within 1-5 cm, the subject can be brought into focus, i.e., it can form a relatively sharp image on the image sensor.
It should be noted that the lens driving device mainly pushes the lens along the optical axis to the optimal imaging position. The driving stroke of the lens driving device may differ depending on which lens is fitted in the terminal. Illustratively, if the terminal is fitted with lens 1 and the stroke range of the lens driving device is 0-400 um, the lens driving device can push the lens to the optimal imaging position along the optical axis when the subject is 1-5 cm from the image sensor. As another example, if the terminal is fitted with lens 2 and the driving stroke range of the lens driving device is 0-300 um, the lens driving device can likewise push the lens to the optimal imaging position when the subject is 1-5 cm from the image sensor. It can be seen that the driving stroke range of the lens driving device may differ between lenses.
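As a rough editorial check (not from the patent), the thin-lens equation shows why closer subjects require a longer stroke; the focal length below is assumed, and real lens designs will differ:

```python
def image_distance_mm(f_mm: float, object_distance_mm: float) -> float:
    """Thin-lens equation 1/f = 1/v + 1/u, solved for the image distance v."""
    return 1.0 / (1.0 / f_mm - 1.0 / object_distance_mm)

f = 2.0  # assumed focal length in mm
# Extra lens travel needed to refocus from a subject at 7 cm to one at 1 cm:
travel_mm = image_distance_mm(f, 10.0) - image_distance_mm(f, 70.0)
print(f"required stroke: {travel_mm * 1000:.0f} um")  # ~441 um
```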
It is understood that the lens driving device and the lens may be connected in some manner. Taking the voice coil motor as an example, the lens and the voice coil motor may optionally be connected by the inter-embedded threaded structure shown in fig. 4 (a). Specifically, this structure relies mainly on the threads of the voice coil motor 2 and the lens 1 engaging each other to form an initial bond, after which adhesive is dispensed from the upper end of the threaded structure to fix the outer surface of the lens 1 to the inner surface of the voice coil motor 2, binding the lens 1 and the voice coil motor 2 together. Alternatively, the lens 1 and the voice coil motor 2 may be connected by the unthreaded smooth-surface structure shown in fig. 4 (b); for the specific method of unthreaded smooth-surface connection, reference may be made to the prior art, which is not repeated here. Of course, the lens and the voice coil motor may also be connected in other ways, which the embodiments of the present application do not limit. Likewise, for the connection between MEMS or piezoelectric ceramics and the lens, reference may be made to the prior art; the embodiments of the present application are not limited in this respect.
In the embodiments of the application, to support the subject forming a relatively sharp image when it is 1-5 cm from the image sensor, at least one of the following three kinds of lens can be used:
case 1: the lens is a fixed-focus ultra-wide-angle lens.
Illustratively, the field of view (FOV) of the ultra-wide-angle lens is greater than or equal to 100°, and the equivalent focal length of the ultra-wide-angle lens ranges from 10 mm to 20 mm.
It should be noted that, in the embodiments of the present application, the terminal may implement macro imaging by fitting different lenses, and the specific parameters of different lenses differ. Generally, when the parameters of a lens fall within the parameter ranges given in the embodiments of the present application, the terminal can implement macro imaging. For example, when the FOV of the ultra-wide-angle lens is 110° and the equivalent focal length is 15 mm, macro imaging can be achieved by adjusting the other parameters of the lens, such as curvature and refractive index. In the embodiments of the application, macro imaging means that the subject can form a relatively sharp image when it is 1-5 cm from the image sensor; this is stated here once and not repeated.
Referring to fig. 5, an exemplary ultra-wide-angle lens structure is provided in an embodiment of the present application. The ultra-wide-angle lens is composed of six lens elements. From the object side to the image side, the optical power of the first element L1 is negative and that of the second element L2 is positive, with a stop STO arranged between L1 and L2. The optical power of the third element L3 is negative, that of the fourth element L4 is positive, that of the fifth element L5 is positive, and that of the sixth element L6 is negative. The FOV of the ultra-wide-angle lens may be 100° or some value greater than 100°, and its equivalent focal length may be a value within 10-20 mm. The distance from the first element L1 to the image sensor 303 is defined as the Total Track Length (TTL); with IH the half-image height of the lens, IH/TTL ranges from 0.5 to 0.6. Of course, the ultra-wide-angle lens in the embodiments of the present application may also have other structures and other numbers of elements; for example, it may be composed of five elements, and the optical power, curvature, and so on of each element from the object side to the image side may be set according to the actual situation. Alternatively, the ultra-wide-angle lens may adopt an existing structure; the embodiments of the application do not limit its specific structure.
In the embodiments of the application, the short equivalent focal length of the ultra-wide-angle lens (10-20 mm) yields a shorter closest focusing distance; that is, focusing can succeed even when the lens is very close to the subject, and high-quality, high-definition imaging can be obtained.
Referring to fig. 7, (a) in fig. 7 is an image captured by a conventional mobile phone at macro distance (for example, with the subject 5 cm from the lens); the image is blurred. Fig. 7 (b) is an image taken at macro distance by a mobile phone according to an embodiment of the present application; in fig. 7 (b), the details of the insect and the leaf are captured clearly.
In addition, when the lens is close to the subject, the focus distance is short, so the depth of field of the captured image is shallow and the image obtains a good background blurring effect.
Referring to fig. 8, a picture taken by the mobile phone in the embodiment of the present application has a better background blurring effect.
Optionally, the vertical-axis magnification of the lens in the central field of view ranges from 0.03 to 0.43. The lens has negative distortion in the edge field of view, and the negative distortion is greater than or equal to -30%.
The vertical-axis magnification is the magnification in the direction perpendicular to the optical axis; its value is the ratio of the imaging size to the actual size of the object in that direction. The edge field of view covers the range 0.8-1. Specifically, referring to fig. 6, the whole visible range is divided into N parts, with the maximum field position being 1 and the central field of view being 0; the range 0.8-1 is the edge field of view, i.e., α and β are edge fields of view. Negative distortion means that the vertical-axis magnification of the lens in the edge field of view is smaller than that in the central field of view. Thus, when the camera module shoots a close-up (microscopic) scene, the lower magnification at the edge field of view mimics the magnification falloff that increasing object distance produces when shooting a distant (macroscopic) scene, so the camera module can capture images with a better perspective effect.
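Using the common definition of relative distortion (stated here only as an illustration, with the symbols being our own),

$$m_{\perp}=\frac{y'}{y},\qquad D_{\text{edge}}=\frac{m_{\text{edge}}-m_{\text{center}}}{m_{\text{center}}}\times 100\%$$

so, for example, a central vertical-axis magnification of 0.40 combined with an edge magnification of 0.30 gives D_edge = -25%, within the stated bound of at least -30%.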
Referring to fig. 9, in the pictures taken by the mobile phone according to the embodiment of the present application, the microscopic scene (the dolls on the desktop) and the macroscopic scene (the buildings and the like in fig. 9) both show a good perspective effect, giving the pictures a more stereoscopic appearance.
Optionally, the number of lens elements in the lens ranges from 5 to 8, and the target surface size of the image sensor ranges from 1/3.06 inch to 1/2.78 inch. Optionally, the lens elements are made of plastic, glass, or a combination of plastic and glass. Optionally, the aperture of the lens ranges from F2.4 to F1.8.
Case 2: the lens is an inner focusing lens. The inner focusing lens comprises n lens elements arranged in sequence along the optical axis, among which are one or more movable lens groups. Each movable lens group includes one or more movable lenses, a movable lens being a lens element whose position along the optical axis relative to the lens is variable; the position of the movable lenses along the optical axis is related to the focal length of the inner focusing lens.
In case 2, the terminal further comprises a lens driving device for driving one or more movable lens groups in the inner focusing lens to move along the optical axis to adjust the focal length of the inner focusing lens.
Optionally, the lens driving device may be a voice coil motor, a MEMS device, or a piezoelectric ceramic.
Optionally, when the lens driving device drives the movable lenses, the relative positions along the optical axis of the movable lenses within the same movable lens group do not change; that is, the lens driving device moves each movable lens group along the optical axis as a whole. For example, if the lens driving device drives the first lens element of a movable lens group 100 µm toward the object side along the optical axis, it correspondingly drives the second lens element of the same group 100 µm toward the object side. The moving distance and moving direction along the optical axis may differ between different movable lens groups; for example, in fig. 10, L2 and L3 are driven toward the object side by distance 1, while L4 is driven toward the image side by distance 2. The moving distance and direction may also be the same for different groups; the embodiments of the present application do not limit the specific movement rule of the movable lens groups.
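The rigid-group behavior can be sketched as follows (Python); the element names, positions, and sign convention (positive = toward the object side) are illustrative assumptions:

```python
# Illustrative sketch: each movable lens group is displaced as a rigid unit,
# so the spacing between elements inside a group never changes.
def move_groups(element_positions_um, groups, displacements_um):
    """element_positions_um: dict of element name -> position along the axis (um).
    groups: list of lists of element names, e.g. [["L2", "L3"], ["L4"]].
    displacements_um: per-group displacement, e.g. [+100, -50]."""
    new_positions = dict(element_positions_um)
    for group, d in zip(groups, displacements_um):
        for element in group:
            new_positions[element] += d   # the whole group shifts together
    return new_positions

positions = {"L1": 0, "L2": 800, "L3": 1400, "L4": 2100, "L5": 2700, "L6": 3300}
# Drive the L2/L3 group 100 um toward the object side, L4 50 um toward the image side
print(move_groups(positions, [["L2", "L3"], ["L4"]], [+100, -50]))
```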
The movable lens may be connected to the lens driving device in some manner; for example, the movable lens may be attached to the lens driving device by adhesive dispensing. Of course, the connection between the movable lens and the lens driving device may also follow other manners in the prior art, which the embodiments of the present application do not limit. For example, fig. 10 shows an exemplary inner focusing lens according to an embodiment of the present application. The lens driving device is a motor and n is 6; among the 6 lens elements, the movable lenses L2 and L3 form one movable lens group, and L4 forms another. The movable lenses L2 and L3 are attached to the motor by dispensing, as is the movable lens L4. Accordingly, the motor can drive L2, L3, and L4 to move along the optical axis relative to the inner focusing lens.
In the embodiments of the present application, the relative positions along the optical axis of the movable lenses within the lens are changed by the lens driving device; that is, the spacing between the lens elements changes, which changes the optical characteristics of the whole lens, such as its focal length. By dynamically adjusting the spacing between the lens elements, the focal length of the lens can be adjusted, enabling the terminal to form a clear image at macro distance.
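The effect of element spacing on focal length is captured, in the simplest two-thin-lens approximation (an illustration, not the full multi-element design), by

$$\frac{1}{f}=\frac{1}{f_{1}}+\frac{1}{f_{2}}-\frac{d}{f_{1}f_{2}}$$

where f1 and f2 are the focal lengths of the two elements and d is their separation: varying d alone changes the combined focal length f.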
It should be noted that the lens driving device driving the movable lenses differs from the lens driving device pushing the whole lens, mentioned above. Driving the movable lenses within the lens along the optical axis changes the spacing between lens elements in order to adjust the focal length of the lens. Pushing the whole lens along the optical axis adjusts the object distance and image distance in order to find the position of the lens at which the subject images most clearly.
Fig. 10 is only an example of the inner focusing lens in the embodiments of the present application; in actual use, the number of lens elements in the lens, and which element or elements are movable, may be configured as needed, and the embodiments of the present application are not limited in this respect.
Case 3: the lens is an inner focusing lens. Referring to fig. 11, the inner focusing lens includes one or more variable-power lenses (e.g., lenses L1 and L4 in fig. 11), whose focal power is associated with the focal length of the lens.
Optical power characterizes how strongly an optical device refracts an incident parallel beam: the greater the optical power, the more strongly the parallel beam is refracted. When the optical power is greater than 0, the refraction is convergent; when it is less than 0, the refraction is divergent.
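In standard optics notation (a known relation, not specific to this application), the focal power φ of a thin lens in air is

$$\varphi=\frac{1}{f},\qquad \varphi=(n-1)\left(\frac{1}{R_{1}}-\frac{1}{R_{2}}\right)$$

where f is the focal length, n the refractive index, and R1 and R2 the surface radii of curvature; this makes explicit the two levers used below, namely changing the lens shape (R1, R2) or its refractive index n.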
The variable-power lens may change shape under the influence of an electric field (e.g., a varying current or voltage), its shape being related to its focal power; or it may change refractive index under the influence of an electric field, its refractive index being related to its focal power.
Accordingly, the processor in the terminal can adjust the focal length of the inner focusing lens by adjusting the focal power of the variable-power lens, through controlling its deformation or refractive index. Specifically, the processor may control the current or voltage applied to the variable-power lens to change its refractive index, thereby adjusting its focal power and hence the focal length of the inner focusing lens. Alternatively, the processor may control the variable-power lens to deform, likewise adjusting its focal power and hence the focal length of the inner focusing lens. Here, the processor may control a driving device that pushes or presses the lens to deform it.
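A minimal closed-loop sketch of this control idea follows (Python). The driver interface (set_voltage, focus_metric) and the contrast-search strategy are our own assumptions; the application only states that current/voltage or deformation changes the power:

```python
# Illustrative sketch: the processor sweeps the drive voltage of a
# variable-power lens and keeps the setting that maximizes a sharpness
# metric. All interfaces here are hypothetical stand-ins.
def autofocus_variable_power(lens_driver, sensor, v_min=0.0, v_max=60.0, steps=20):
    best_v, best_score = v_min, float("-inf")
    for i in range(steps + 1):
        v = v_min + (v_max - v_min) * i / steps
        lens_driver.set_voltage(v)     # changes refractive index/shape -> power
        score = sensor.focus_metric()  # e.g. image contrast/sharpness
        if score > best_score:
            best_v, best_score = v, score
    lens_driver.set_voltage(best_v)    # settle on the sharpest setting
    return best_v
```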
Optionally, the variable-power lens is an electro-optical lens or a deformable lens. An electro-optical lens is made of a material whose refractive index varies under an electric field. A deformable lens can be deformed by a driving device, which may be a motor, a MEMS device, or the like. Of course, the variable-power lens is not limited to these two kinds and may use other materials, which the embodiments of the present application do not limit.
In the embodiments of the application, applying an electric field to the variable-power lenses, such as L1 and L4, adjusts the focal length of the whole lens, so that the terminal can form a clear image at macro distance.
The camera module shown in fig. 3 may further include other components. For example, an infrared cut filter 304 is disposed between the lens 301 and the image sensor 303 to filter out the near-infrared and ultraviolet bands of ambient light. Optionally, the thickness of the infrared cut filter is 0.11 mm or 0.21 mm, and its material is resin or blue glass. Of course, the infrared cut filter may also use other materials and/or thicknesses; the embodiments of the present application do not limit the material or thickness of the filter.
An embodiment of the present application further provides a macro imaging method; referring to fig. 12, the method is applied to the terminal described in case 1 above. The terminal is provided with a camera module, an input component, and an output component; the camera module is provided, from the object side to the image side, with a lens, a lens driving device, and an image sensor, the lens being an ultra-wide-angle lens. The method includes the following steps:
S1201, the input component receives an operation input by the user to start the camera.
Illustratively, the input component may be the touch pad 104-1; see fig. 15 (a), where the user touches the camera icon 1501 displayed on the screen, and the touch pad 104-1 collects the user's input and transmits it to the processor, which starts the camera. Referring to fig. 15 (b), a camera interface 1502 of the terminal is shown; this interface may be displayed to the user by the display screen 104-2 of the terminal.
S1202, the processor detects the distance between the subject and the image sensor.
(Optional) S1203, when the terminal detects that the distance between the subject and the image sensor is within the macro range, the output component outputs a first interface 1504, which prompts the user whether to start macro shooting.
The macro range is 1-5 cm.
Optionally, the terminal's processor 101 measures the distance between the subject and the image sensor using laser ranging; the specific principle and process of laser ranging follow the prior art and are not detailed here. Alternatively, the processor 101 acquires an image from the image sensor and, when the image is blurred, may preliminarily determine that the subject is close to the image sensor.
Optionally, the processor 101 feeds back the measured distance to the lens driving apparatus.
Referring to fig. 15, after the camera of the terminal is turned on, if the terminal detects that the distance from the subject to the image sensor is within the macro range, then as shown in fig. 15 (c), the terminal's output component, i.e., the display screen 104-2, outputs the first interface 1504 to ask the user whether to turn on macro shooting and thereby obtain better close-range imaging quality.
S1204, the input component receives a first operation input by the user, the first operation instructing the terminal to start macro shooting.
As shown in fig. 15 (c), the display screen 104-2 displays the options "yes" and "no", and the user can input the first operation through an input component, for example by touching the option "yes" via the touch pad 104-1 shown in fig. 1. Optionally, the touch pad 104-1 sends the collected touch information (i.e., that the user clicked "yes") to the processor for processing.
Alternatively, when the user touches the option "no" via, for example, the touch pad 104-1, the terminal may determine that the user's real shooting intention is not macro photography, and in that case the terminal may take the picture using an existing method.
S1205, under the control of the processor, the lens driving device drives the lens to move along the optical axis and focuses on the subject.
For example, the terminal may focus on the subject automatically: after receiving the first operation of the user and determining to start macro photography, the processor of the terminal controls the lens driving device, which drives the ultra-wide-angle lens to move along the optical axis, completing the focusing process. The terminal may also receive a focusing operation input by the user on the interface and adjust the position of the ultra-wide-angle lens along the optical axis accordingly. For example, referring to fig. 15 (d), the user may select the focus by touching the insect displayed on the display screen; after receiving this input, the terminal takes the insect as the focus and adjusts the position of the ultra-wide-angle lens along the optical axis.
S1206, the input component receives a shooting instruction input by the user, instructing the terminal to capture the focused picture.
For example, referring to fig. 15 (d), if the user wants to take a still picture at macro distance, the user inputs a shooting instruction by clicking the photographing option 1505 via the touch pad 104-1; if the user wants to record a video at macro distance, the user inputs the instruction by clicking the option 1506.
Of course, referring to (a) in fig. 17, if the terminal has turned on "voice-controlled photographing", the user can input a shooting instruction by voice through an input component such as a microphone. Alternatively, the user may input the shooting instruction in other manners through other input components, which are not described here again.
S1207, the output component outputs the picture taken after focusing.
Referring to fig. 15 (d), the user may click the photo option 1505 through the touch pad 104-1 to trigger the terminal to take a picture at macro distance, and the picture is output by an output component, such as the display screen 104-2.
Of course, when the terminal detects that the distance from the subject to the image sensor is within the macro range, it may skip outputting the first interface 1504 shown in fig. 15 (c) and instead turn on the macro shooting mode automatically: the ultra-wide-angle lens is driven by the lens driving device, focusing is completed, and the focused picture is then taken and output. That is, in fig. 12, S1203 and S1204 are optional steps.
With the macro imaging method provided by the embodiments of the application, the terminal can detect whether the distance between the subject and the image sensor satisfies the macro condition; when it does, the lens driving device in the terminal pushes the lens along the optical axis to complete focusing, so that a clear image can be captured at macro distance.
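The overall case-1 flow (S1201-S1207) can be sketched as follows (Python). All component interfaces here are hypothetical stand-ins for the terminal's processor, lens driving device, and input/output components:

```python
# Illustrative sketch of the case-1 macro imaging flow (S1201-S1207).
MACRO_RANGE_CM = (1.0, 5.0)

def macro_capture_flow(camera, rangefinder, ui, lens_driver):
    ui.wait_for_camera_launch()                       # S1201: user starts the camera
    d = rangefinder.distance_to_subject_cm()          # S1202: e.g. laser ranging
    if MACRO_RANGE_CM[0] <= d <= MACRO_RANGE_CM[1]:
        if not ui.confirm("Enable macro shooting?"):  # S1203/S1204 (optional)
            return camera.capture()                   # user declined: shoot normally
        lens_driver.focus_along_axis(d)               # S1205: drive lens, focus
    ui.wait_for_shutter()                             # S1206: shooting instruction
    return ui.show(camera.capture())                  # S1207: output the picture
```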
The embodiment of the present application further provides a macro imaging method, applied to the terminal described in case 2 above. The terminal is provided with a camera module, an input component, and an output component; the camera module is provided, from the object side to the image side, with an inner focusing lens, a lens driving device, and an image sensor. The inner focusing lens comprises n lens elements arranged in sequence along the optical axis, among which are one or more movable lens groups; each movable lens group comprises one or more movable lenses, a movable lens being a lens element whose position along the optical axis relative to the lens is variable, that position being related to the focal length of the lens. Referring to fig. 13, the method includes steps S1201 to S1204, S1301, S1302, S1206, and S1207:
For S1201 to S1204, see the descriptions above; they are not repeated here.
S1301, under the control of the processor, the lens driving device drives one or more movable lens groups in the inner focusing lens to move along the optical axis to adjust the focal length of the inner focusing lens.
Taking the lens driving device as a motor as an example, referring to fig. 10, the motor can drive the movable lens group composed of L2 and L3 to move along the optical axis toward the object side, so as to adjust the focal length of the lens.
S1302, under the control of the processor, the lens driving device drives the inner focusing lens to move along the optical axis to focus on the subject.
The descriptions of S1206 and S1207 can be found above and are not repeated here.
With the macro imaging method provided by the embodiments of the application, when the terminal detects that the distance between the subject and the image sensor satisfies the macro condition, the lens driving device can drive one or more movable lenses along the optical axis to dynamically adjust the focal length of the lens, and can also push the whole lens along the optical axis to complete focusing, so that a clear image can be formed at macro distance.
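The case-2 flow differs from case 1 only in the focusing step, which becomes two stages; the following sketch (Python) shows that ordering, with a hypothetical driver API and the group displacements chosen arbitrarily for illustration:

```python
# Illustrative case-2 variant: the motor first moves the movable lens
# group(s) to set the focal length (S1301), then moves the whole inner
# focusing lens along the axis to focus (S1302).
def macro_focus_case2(lens_driver, distance_cm):
    lens_driver.move_group("L2_L3", toward="object", distance_um=100)  # S1301
    lens_driver.move_group("L4", toward="image", distance_um=50)       # S1301
    lens_driver.focus_along_axis(distance_cm)                          # S1302
```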
The embodiment of the present application further provides a macro imaging method, applied to the terminal described in case 3 above. The terminal is provided with a camera module, an input component, an output component, and a processor; the camera module is provided, from the object side to the image side, with a lens, a lens driving device, and an image sensor, the lens being an inner focusing lens that includes one or more variable-power lenses whose focal power is associated with the focal length of the inner focusing lens. Referring to fig. 14, the method includes steps S1201 to S1204, S1401, S1402, S1206, and S1207:
For S1201 to S1204, see the descriptions above; they are not repeated here.
S1401, the processor adjusts the focal power of one or more variable-power lenses in the inner focusing lens so as to adjust the focal length of the lens.
Optionally, the terminal's processor controls the current or voltage applied to the variable-power lens to adjust its focal power, or controls the variable-power lens to deform to the same end. Of course, the processor may also change the focal power of the variable-power lens in other ways to adjust the focal length of the inner focusing lens.
S1402, the processor controls the lens driving device so that it drives the inner focusing lens to move along the optical axis to focus on the subject.
The descriptions of S1206 and S1207 can be found above and are not repeated here.
With the macro imaging method provided by the embodiments of the application, when the terminal detects that the distance between the subject and the image sensor satisfies the macro condition, the terminal can change the focal power of the variable-power lens by controlling its deformation or refractive index so as to adjust the focal length of the lens, and can complete focusing via the lens driving device, so that high-quality imaging is obtained at macro distance.
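Analogously to case 2, the case-3 flow replaces the group movement with an electrical power adjustment; a minimal sketch (Python), with set_power_voltage and the voltage values being purely hypothetical:

```python
# Illustrative case-3 variant: the processor first adjusts the optical power
# of the variable-power element(s) electrically (S1401), then the lens
# driving device focuses the whole lens along the axis (S1402).
def macro_focus_case3(processor, lens_driver, distance_cm):
    processor.set_power_voltage(element="L1", volts=12.0)  # S1401: change power
    processor.set_power_voltage(element="L4", volts=8.5)   # S1401
    lens_driver.focus_along_axis(distance_cm)              # S1402
```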
Referring to fig. 16, in other embodiments of the present application, after the camera of the terminal is turned on, as shown in fig. 16 (a), the user may trigger the terminal to jump to the mode selection interface 1601 shown in fig. 16 (b) by clicking the mode option 1503; then, on the mode selection interface 1601, the user may click the macro shooting option 1602 to trigger the terminal to perform macro imaging. A terminal with the structure of case 1 above may perform S1205 to S1207 after detecting that the user clicks the macro shooting option 1602; a terminal with the structure of case 2 may perform S1301, S1302, S1206, and S1207; and a terminal with the structure of case 3 may perform S1401, S1402, S1206, and S1207.
Of course, the terminal may also jump to the mode selection interface 1601 in other manners, for example, when receiving a left-sliding operation of the user in the camera interface 1502, the terminal jumps to the mode selection interface 1601.
Alternatively, in some scenarios the user may not know the actual effect of macro shooting; in this case the terminal may prompt the user about the effect of macro shooting or other related information. As shown in fig. 16 (b), when the user selects macro shooting, the terminal may output an interface prompt, for example a pop-up box reading "Macro shooting supports clear imaging when the subject is 1-5 cm away from the image sensor", with options "yes" and "no". When the user touches "yes", the terminal can determine that the user's real shooting intention is macro photography, and accordingly the terminal performs the macro imaging method described above.
In addition, in other embodiments of the present application, the user may preset the macro shooting function of the terminal. For example, as shown in (a) of fig. 17, on the setting interface 1701 the user can enable the macro shooting function by clicking the macro shooting start option 1702. Optionally, after the macro shooting function is enabled, the terminal may execute the macro imaging method above; when the function is not enabled, the terminal does not have permission to execute it. When the user wants to perform macro photography while the option 1702 shown in fig. 17 (a) is off, the terminal may output a prompt interface asking the user to turn on the macro shooting function so that the terminal can form a clear image at macro distance.
It should be noted that the terminal may enter the setting interface 1701 in various ways; for example, when the terminal receives a right-slide operation of the user on the camera interface 1502, it may jump to the setting interface 1701. The embodiments of the application do not limit the way the terminal enters the setting interface. Furthermore, the terminal can save the user's choices on the setting interface; subsequently, when the user turns on the camera and the terminal detects that the subject is close to the image sensor, the terminal can execute the macro imaging method, achieving clear imaging at macro distance.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (20)

1. A camera module is characterized in that the camera module comprises a lens and an image sensor,
the lens comprises N lenses, the value of N is any integer in an interval of 5-8, the distance from the first lens L1 to the image sensor is Total Track Length (TTL), the value range of the ratio of the half-image height of the lens to the TTL is 0.5-0.6, the field angle FOV of the lens is greater than or equal to 100 degrees, the aperture value range of the lens is F2.4-F1.8, and the value range of the equivalent focal Length of the camera module is 10-20 mm;
the camera module further comprises a lens driving device, and the lens driving device is used for driving the lens to move along the optical axis of the lens.
2. The camera module according to claim 1, wherein the closest focusing distance of the camera module ranges from 1 to 5 cm, the closest focusing distance being the smallest distance between a subject and the image sensor at which the subject can form a clear image on the image sensor.
3. The camera module of claim 1, wherein the target surface size of the image sensor ranges from 1/3.06 inch to 1/2.78 inch.
4. The camera module of claim 1, wherein a stop is disposed between the first lens L1 and the second lens L2 from the object side to the image side of the lens.
5. The camera module of claim 1, wherein the lens comprises 6 lenses.
6. The camera module of claim 4, wherein the focal power of the first lens L1 from the object side to the image side of the lens barrel is negative, the focal power of the second lens L2 is positive, the focal power of the third lens L3 is negative, the focal power of the fourth lens L4 is positive, the focal power of the fifth lens L5 is positive, and the focal power of the sixth lens L6 is negative.
7. The camera module according to any one of claims 1 to 6, wherein a vertical axis magnification of the lens in the central view field is in a range of 0.03 to 0.43, and the vertical axis magnification is a ratio of an imaging size to an actual size of an object along a direction perpendicular to an optical axis.
8. The camera module of any one of claims 1 to 6, wherein the lens has negative distortion in the edge field of view, the negative distortion being greater than or equal to -30%, where negative distortion means that the vertical-axis magnification of the lens in the edge field of view is smaller than the vertical-axis magnification of the lens in the central field of view.
9. The camera module as claimed in claim 1, wherein the lens elements of the lens are made of plastic or glass, or a mixed material of plastic and glass.
10. A terminal, characterized in that it comprises a camera module according to any one of claims 1 to 9.
11. A terminal is characterized by comprising a camera module, an input component, an output component and a processor, wherein the camera module comprises a lens, a lens driving device and an image sensor;
the input component is used for receiving input operation of a user;
the processor is used for detecting the distance between a shot object and the image sensor;
the output component is used for outputting the pictures or videos generated after shooting;
the lens driving device is used for driving the lens to move along an optical axis, so that the terminal focuses on the shot object;
the lens comprises N lenses, the value of N is any integer in an interval of 5-8, the distance from the first lens L1 to the image sensor is Total Track Length (TTL), the value range of the ratio of the half-image height of the lens to the TTL is 0.5-0.6, the field angle FOV of the lens is greater than or equal to 100 degrees, and the aperture value range of the lens is F2.4-F1.8;
the value range of the equivalent focal length of the camera module is 10-20 mm.
12. A terminal as claimed in claim 11, wherein a stop is provided between the first lens L1 and the second lens L2 of the lens.
13. The terminal of claim 11, wherein the target surface size of the image sensor ranges from 1/3.06 to 1/2.78 inches.
14. The terminal of claim 11, wherein the lens has negative distortion in the edge field of view, the negative distortion being greater than or equal to -30%, where negative distortion means that the vertical-axis magnification of the lens in the edge field of view is smaller than the vertical-axis magnification of the lens in the central field of view.
15. A macro imaging method, wherein the method is applied to a terminal, the terminal comprises the camera module according to any one of claims 1 to 9, and the method comprises:
receiving a first operation input by a user, starting a camera, and displaying a preview interface;
the terminal enters a macro shooting mode;
the terminal focuses on a shot object;
and shooting the object in response to a shooting instruction input by a user.
16. The method of claim 15, wherein the terminal entering the macro photography mode comprises:
and when the distance between the shot object and the image sensor is within a microspur range, the terminal automatically starts a microspur shooting mode, and the value of the microspur range is between 1cm and 5 cm.
17. The method of claim 15, wherein the terminal entering the macro photography mode comprises:
when the distance between the subject and the image sensor is within a macro range, automatically displaying a first interface, wherein the first interface is used for prompting the user whether to start the macro shooting mode, the macro range being 1 cm to 5 cm;
and responding to an instruction for selecting the macro shooting mode input by a user, and enabling the terminal to enter the macro shooting mode.
18. The method of claim 15, wherein the terminal entering the macro photography mode comprises:
displaying a first button;
and receiving a second operation of clicking the first button by the user, and responding to the second operation to enter the macro shooting mode by the terminal.
19. The method according to any one of claims 15 to 18, wherein after the terminal enters the macro shooting mode, the method further comprises:
and displaying a macro shooting interface prompt.
20. The method of any one of claims 15 to 18, wherein the focusing of the subject by the terminal comprises:
receiving focusing operation input by a user, and focusing the shot object in response to the focusing operation.
CN202110598543.8A 2018-10-16 2018-10-16 Microspur imaging method and terminal Pending CN113472977A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110598543.8A CN113472977A (en) 2018-10-16 2018-10-16 Microspur imaging method and terminal

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110598543.8A CN113472977A (en) 2018-10-16 2018-10-16 Microspur imaging method and terminal
CN201811206371.XA CN109788089B (en) 2018-10-16 2018-10-16 Microspur imaging method and terminal

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201811206371.XA Division CN109788089B (en) 2018-10-16 2018-10-16 Microspur imaging method and terminal

Publications (1)

Publication Number Publication Date
CN113472977A true CN113472977A (en) 2021-10-01

Family

ID=66496327

Family Applications (3)

Application Number Title Priority Date Filing Date
CN201811206371.XA Active CN109788089B (en) 2018-10-16 2018-10-16 Microspur imaging method and terminal
CN202110595845.XA Active CN113472976B (en) 2018-10-16 2018-10-16 Microspur imaging method and terminal
CN202110598543.8A Pending CN113472977A (en) 2018-10-16 2018-10-16 Microspur imaging method and terminal

Family Applications Before (2)

Application Number Title Priority Date Filing Date
CN201811206371.XA Active CN109788089B (en) 2018-10-16 2018-10-16 Microspur imaging method and terminal
CN202110595845.XA Active CN113472976B (en) 2018-10-16 2018-10-16 Microspur imaging method and terminal

Country Status (8)

Country Link
US (2) US11405538B2 (en)
EP (1) EP3846432B1 (en)
JP (2) JP2022511621A (en)
KR (1) KR102324921B1 (en)
CN (3) CN109788089B (en)
ES (1) ES2961008T3 (en)
RU (1) RU2762887C1 (en)
WO (1) WO2020078346A1 (en)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11949990B2 (en) 2018-10-05 2024-04-02 Google Llc Scale-down capture preview for a panorama capture user interface
CN109788089B (en) 2018-10-16 2021-05-18 华为技术有限公司 Microspur imaging method and terminal
CN110351405A (en) * 2019-07-03 2019-10-18 肯维捷斯(武汉)科技有限公司 A kind of mobile communication equipment with microcosmic imaging function
CN110515258A (en) * 2019-08-05 2019-11-29 肯维捷斯(武汉)科技有限公司 A kind of close-shot luminaire and the imaging device comprising the luminaire
CN110661950A (en) * 2019-09-30 2020-01-07 联想(北京)有限公司 Camera module and electronic equipment
US10984513B1 (en) 2019-09-30 2021-04-20 Google Llc Automatic generation of all-in-focus images with a mobile camera
US11032486B2 (en) 2019-10-11 2021-06-08 Google Llc Reducing a flicker effect of multiple light sources in an image
CN111127699A (en) * 2019-11-25 2020-05-08 爱驰汽车有限公司 Method, system, equipment and medium for automatically recording automobile defect data
CN110769144A (en) * 2019-11-27 2020-02-07 Oppo广东移动通信有限公司 Imaging device and mobile terminal
CN111092977B (en) * 2019-12-31 2021-05-18 维沃移动通信有限公司 Electronic device
CN111294511B (en) * 2020-02-06 2021-12-14 北京小米移动软件有限公司 Focusing method and device of camera module and storage medium
US11190689B1 (en) 2020-07-29 2021-11-30 Google Llc Multi-camera video stabilization
CN111757102B (en) * 2020-08-03 2022-03-15 Oppo广东移动通信有限公司 Ultra-micro distance camera definition detection device and method
CN112422787B (en) * 2020-10-28 2022-04-22 维沃移动通信有限公司 Camera module and electronic equipment
AU2021240322B2 (en) 2021-01-07 2023-08-17 Joji AONUMA Versatile camera device mountable to pole
JP7110450B1 (en) * 2021-06-15 2022-08-01 丈二 青沼 Versatile camera device mounted on a rod
US20220345591A1 (en) * 2021-04-22 2022-10-27 David Shau Underwater Camera Operations
CN113676642A (en) * 2021-08-17 2021-11-19 Oppo广东移动通信有限公司 Camera assembly, control method thereof and electronic equipment
CN113866945A (en) * 2021-09-30 2021-12-31 玉晶光电(厦门)有限公司 Optical imaging lens
CN114633692B (en) * 2022-03-14 2023-10-03 深圳市艾为智能有限公司 Application method of eccentric lens in CMS system
CN114640768A (en) * 2022-03-15 2022-06-17 Oppo广东移动通信有限公司 Shooting assembly, electronic equipment and control method
CN115086518A (en) * 2022-06-06 2022-09-20 Oppo广东移动通信有限公司 Camera and electronic device

Family Cites Families (55)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4478247B2 (en) * 1999-07-06 2010-06-09 キヤノン株式会社 Zoom lens
US6545818B2 (en) * 1999-05-10 2003-04-08 Canon Kabushiki Kaisha Zoom lens and camera system
US20040179687A1 (en) * 2003-03-14 2004-09-16 Cheng-Shing Lai Method for transmitting copyrighted electronic documents in a wireless communication system
US7598997B2 (en) 2004-01-14 2009-10-06 Ricoh Company, Ltd. Imaging apparatus and focus control method based on a number of automatic focus scan stages, and recording medium storing a program for executing such a method
JP2006014054A (en) * 2004-06-28 2006-01-12 Nec Access Technica Ltd Portable electronic apparatus with camera and close-up photographing preventing method thereby
JP2006042028A (en) * 2004-07-28 2006-02-09 Canon Inc Image processor, and control method, control program and storage medium for image processor
JP2006217131A (en) * 2005-02-02 2006-08-17 Matsushita Electric Ind Co Ltd Imaging apparatus
JP2006243092A (en) * 2005-03-01 2006-09-14 Konica Minolta Opto Inc Wide angle lens
US7403341B2 (en) 2005-06-16 2008-07-22 Olympus Imaging Corp. Zoom lens system and electronic image pickup apparatus using the same
CN100406946C (en) * 2005-08-04 2008-07-30 亚洲光学股份有限公司 Macroshot lens
CN101009773B (en) 2006-01-24 2010-10-13 光宝科技股份有限公司 Digital camera module
JP2008015274A (en) * 2006-07-06 2008-01-24 Olympus Imaging Corp Digital camera
JP5607398B2 (en) * 2009-04-07 2014-10-15 富士フイルム株式会社 IMAGING LENS, IMAGING DEVICE, AND PORTABLE TERMINAL DEVICE
CN201434931Y (en) * 2009-06-09 2010-03-31 宁波舜宇车载光学技术有限公司 Super wide angle megapixel vehicle-mounted camera lens
CN101701793B (en) * 2009-10-29 2011-11-02 天津三星光电子有限公司 Method for measuring distance between object and shooting camera by utilizing digital camera
KR20110064156A (en) 2009-12-07 2011-06-15 삼성전자주식회사 Imaging device and its manufacturing method
JP2011123334A (en) * 2009-12-11 2011-06-23 Canon Inc Rear attachment lens, and photographing optical system including the same
US9097887B2 (en) * 2010-05-24 2015-08-04 Nikon Corporation Telescope optical system and optical device provided therewith
EP2397880B1 (en) * 2010-06-16 2017-04-12 Ricoh Company, Ltd. Image-forming lens, and camera device and portable information terminal device with the image-forming lens
JP2012173435A (en) * 2011-02-18 2012-09-10 Tamron Co Ltd Fixed-focus lens
CN102778745B (en) * 2012-08-24 2015-01-07 江西联创电子股份有限公司 Lens imaging system of high-pixel fish-eye lens
JP2015022145A (en) * 2013-07-19 2015-02-02 富士フイルム株式会社 Image capturing lens and image capturing device having the same
CN103543515B (en) * 2013-09-26 2015-09-30 宁波舜宇红外技术有限公司 A kind of novel LONG WAVE INFRARED wide-angle lens
CN103576296B (en) * 2013-10-30 2015-10-28 浙江舜宇光学有限公司 A kind of pick-up lens
JP6222564B2 (en) * 2013-12-27 2017-11-01 コニカミノルタ株式会社 Imaging lens, lens unit, imaging device, digital still camera, and portable terminal
JP6489634B2 (en) * 2014-11-18 2019-03-27 オリンパス株式会社 Inner focus macro lens and imaging apparatus using the same
CN104639831B (en) * 2015-01-05 2018-12-11 信利光电股份有限公司 A kind of camera and the method for expanding the depth of field
KR101748260B1 (en) * 2015-04-23 2017-06-16 엘지전자 주식회사 Camera module
JP6034917B2 (en) * 2015-04-23 2016-11-30 オリンパス株式会社 Rear focus lens system and image pickup apparatus including the same
CN204613498U (en) * 2015-05-04 2015-09-02 嘉兴中润光学科技有限公司 Small-sized wide-angle lens
US20170038504A1 (en) * 2015-08-05 2017-02-09 Lustrous Electro-Optic Co., Ltd. Close-up shots system and close-up shots module
RU2607842C1 (en) * 2015-08-28 2017-01-20 Публичное акционерное общество "Красногорский завод им. С.А. Зверева" Macro lens with variable magnification
JPWO2017057662A1 (en) 2015-09-30 2018-07-12 株式会社ニコン Zoom lens, optical device, and method of manufacturing zoom lens
CN105204140B (en) * 2015-10-28 2017-07-28 东莞市宇瞳光学科技股份有限公司 A kind of tight shot
JP2017102354A (en) 2015-12-04 2017-06-08 キヤノン株式会社 Image capturing lens and image capturing device having the same
CN105759406B (en) * 2016-04-01 2018-09-21 浙江舜宇光学有限公司 Pick-up lens
CN106405788A (en) * 2016-05-31 2017-02-15 信华精机有限公司 High-definition ultra-wide-angle lens
KR20180005464A (en) * 2016-07-06 2018-01-16 삼성전자주식회사 Optical lens assembly and electronic apparatus having the same
CN106019540B (en) * 2016-07-27 2018-10-09 广东弘景光电科技股份有限公司 High pixel ultra-wide angle optical system and its camera lens of application
CN106254768B (en) * 2016-07-29 2018-05-22 广东欧珀移动通信有限公司 Microshot processing method, device and terminal device
CN106657770B (en) 2016-11-04 2019-11-29 上海斐讯数据通信技术有限公司 The calculation method and system of mobile terminal autofocus lens microspur value
CN206321849U (en) 2016-12-07 2017-07-11 深圳市爱科环球实业有限公司 Mobile phone is set to shoot the external lens of wide-angle effect image plus microspur effect image
KR20180068585A (en) * 2016-12-14 2018-06-22 삼성전자주식회사 Optical lens assembly and method of forming an image using the same
WO2018192125A1 (en) * 2017-04-17 2018-10-25 浙江舜宇光学有限公司 Camera lens
CN106980170B (en) * 2017-05-04 2022-05-27 威海嘉瑞光电科技股份有限公司 Optical lens for ultra-wide-angle high-definition aerial photography instrument
CN206818961U (en) * 2017-06-02 2017-12-29 谢莉 A kind of wide-angle adds microspur mobile telephone external camera lens
CN107300756B (en) * 2017-08-23 2019-12-10 浙江舜宇光学有限公司 Camera lens
CN107390352A (en) * 2017-08-23 2017-11-24 速讯光学科技(苏州)有限公司 A kind of large aperture bugeye lens and lens Shooting Technique
CN107577032B (en) * 2017-09-19 2024-01-26 舜宇光学(中山)有限公司 Low-distortion wide-angle lens
CN207502801U (en) * 2017-09-19 2018-06-15 舜宇光学(中山)有限公司 Low distortion wide-angle lens
CN107613209A (en) 2017-09-29 2018-01-19 努比亚技术有限公司 A kind of image-pickup method, terminal and computer-readable recording medium
CN107835344B (en) * 2017-11-27 2020-08-07 信利光电股份有限公司 Ultra-micro distance camera shooting equipment and ultra-micro distance camera shooting system
KR101892897B1 (en) * 2018-04-09 2018-08-28 삼성전기주식회사 Optical system for camera
WO2019205944A1 (en) * 2018-04-28 2019-10-31 宁波舜宇车载光学技术有限公司 Optical lens and imaging device
CN109613685A (en) * 2019-02-19 2019-04-12 浙江舜宇光学有限公司 Pick-up lens group

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102547084A (en) * 2011-09-22 2012-07-04 上海兴禄科技实业有限公司 Camera capable of achieving long-range shooting and super macro shooting
US20160147044A1 (en) * 2014-11-25 2016-05-26 Fujifilm Corporation Imaging lens and imaging apparatus equipped with the imaging lens
CN107015345A (en) * 2016-01-28 2017-08-04 Kolen株式会社 Lens optical system
CN107121757A (en) * 2017-06-28 2017-09-01 广东思锐光学股份有限公司 A kind of bugeye lens
CN113472976A (en) * 2018-10-16 2021-10-01 华为技术有限公司 Microspur imaging method and terminal

Also Published As

Publication number Publication date
CN109788089A (en) 2019-05-21
US11683574B2 (en) 2023-06-20
EP3846432A4 (en) 2021-11-17
JP2022511621A (en) 2022-02-01
EP3846432B1 (en) 2023-08-30
CN113472976B (en) 2022-11-25
ES2961008T3 (en) 2024-03-07
CN109788089B (en) 2021-05-18
US20220217255A1 (en) 2022-07-07
RU2762887C1 (en) 2021-12-23
WO2020078346A1 (en) 2020-04-23
US20210329150A1 (en) 2021-10-21
EP3846432A1 (en) 2021-07-07
KR20210060560A (en) 2021-05-26
US11405538B2 (en) 2022-08-02
CN113472976A (en) 2021-10-01
KR102324921B1 (en) 2021-11-10
JP2023029944A (en) 2023-03-07

Similar Documents

Publication Publication Date Title
CN109788089B (en) Microspur imaging method and terminal
EP3662314B1 (en) Optical lens assembly and electronic apparatus having the samecross-reference to related application
US10447908B2 (en) Electronic device shooting image
KR102491564B1 (en) foldable electronic device with flexible display
KR102187571B1 (en) Mobile terminal
CN105718187B (en) Mobile terminal and method of controlling content of mobile terminal
EP2824910B1 (en) Electronic device and method of operating the same
WO2019033411A1 (en) Panoramic shooting method and device
KR20190008610A (en) Mobile terminal and Control Method for the Same
CN107637063B (en) Method for controlling function based on gesture of user and photographing device
KR20170079545A (en) Mobile terminal and method for controlling the same
CN111083375A (en) Focusing method and electronic equipment
CN111385525B (en) Video monitoring method, device, terminal and system
KR102129797B1 (en) Mobile terminal and controlling method thereof
CN112396076A (en) License plate image generation method and device and computer storage medium
US10771680B2 (en) Mobile terminal and corresponding control method for changing the length of a control icon based on a size, position and/or a moving speed of a first object in a preview image
CN112637495A (en) Shooting method, shooting device, electronic equipment and readable storage medium
JP2015138263A (en) Lens module and imaging module, and imaging unit
CN113709353B (en) Image acquisition method and device
CN110636197B (en) Shooting processing method and electronic equipment
CN108646384B (en) Focusing method and device and mobile terminal
CN210297875U (en) Camera device for mobile terminal and mobile terminal
CN215344778U (en) Camera module and mobile terminal
CN110602381B (en) Depth of field detection method and device, storage medium and terminal
CN113052408B (en) Method and device for community aggregation

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination