WO2020078346A1 - Macro imaging method and terminal - Google Patents

Macro imaging method and terminal

Info

Publication number
WO2020078346A1
Authority
WO
WIPO (PCT)
Prior art keywords
lens
terminal
macro
power
image sensor
Prior art date
Application number
PCT/CN2019/111213
Other languages
English (en)
French (fr)
Inventor
王海燕
叶海水
苏蔚
Original Assignee
Huawei Technologies Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to US17/286,378 (US11405538B2)
Priority to RU2021113634A (RU2762887C1)
Priority to EP19874716.4A (EP3846432B1)
Priority to KR1020217011330A (KR102324921B1)
Priority to ES19874716T (ES2961008T3)
Priority to JP2021520932A (JP2022511621A)
Publication of WO2020078346A1
Priority to US17/702,491 (US11683574B2)
Priority to JP2022193051A (JP2023029944A)

Classifications

    • H04M1/0264: Structure or mounting of a camera module assembly in a portable telephone set
    • H04N23/55: Optical parts specially adapted for electronic image sensors; mounting thereof
    • G03B3/02: Focusing arrangements moving the lens along the baseboard
    • H04N5/00: Details of television systems
    • G02B13/24: Optical objectives for reproducing or copying at short object distances
    • G02B7/08: Lens mountings with focusing or varying-magnification mechanism adapted to co-operate with a remote control mechanism
    • G02B9/62: Optical objectives having six components only
    • G03B13/32: Means for focusing
    • G03B17/12: Camera bodies with means for supporting objectives, supplementary lenses, filters, masks, or turrets
    • H04M1/72403: User interfaces of mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72439: User interfaces with interactive means for internal management of image or video messages
    • H04N23/54: Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N23/57: Cameras or camera modules specially adapted for being embedded in other devices
    • H04N23/62: Control of camera parameters via user interfaces
    • H04N23/631 and H04N23/632: Graphical user interfaces for controlling image capture or setting capture parameters, including display or modification of preview images
    • H04N23/64: Computer-aided capture of images
    • H04N23/667: Camera operation mode switching, e.g. between still and video modes
    • H04N23/67: Focus control based on electronic image sensor signals
    • H04N23/698: Control for achieving an enlarged field of view, e.g. panoramic image capture
    • H04N23/80: Camera processing pipelines; components thereof
    • G02B13/0045: Miniaturised objectives for electronic devices having at least one aspherical surface and five or more lenses
    • H04M2250/52: Telephonic subscriber devices including functional features of a camera

Definitions

  • The present application relates to the field of terminal photographing, and in particular to a macro imaging method and terminal.
  • The camera designs of smart terminals are divided into fixed-focus designs and zoom designs.
  • In a fixed-focus design, the focal length of the camera is a fixed value; for example, it may be 27 mm, 30 mm, 54 mm, or another value.
  • In a zoom design, the focal length of the camera can be adjusted.
  • In the prior art, the focusing distance of the camera is usually 7 cm or more. When the user needs to take a close-up picture, for example of a bug close to the lens, at a closer focusing distance such as 1 to 5 cm, the image captured by an existing smart terminal is blurred and the resulting image quality is low.
  • An embodiment of the present application provides a terminal capable of obtaining high-quality imaging in shooting scenarios with a focusing distance of 1 to 5 cm.
  • an embodiment of the present application provides a terminal.
  • the terminal includes a camera module, an input component, an output component, and a processor.
  • the camera module includes a lens, a lens driving device, and an image sensor from the object side to the image side.
  • the lens is used to support clear imaging when the distance between the subject and the image sensor is within the macro range.
  • the lens driving device is used to drive the lens to move along the optical axis when the distance between the subject and the image sensor is within the macro range, wherein the driving stroke of the lens driving device is related to the closest focusing distance of the terminal.
  • the processor is used to control the lens driving device so that the lens can focus on the subject.
  • the input component is used to receive a shooting instruction input by a user, and the shooting instruction is used to shoot a focused picture.
  • the output part is used to output the captured pictures. In this way, when the distance between the subject and the image sensor is within the macro range, the processor can control the lens driving device so that the lens successfully focuses on the subject.
  • the macro range is 1 to 5 cm.
  • the lens is an ultra-wide-angle lens
  • the FOV of the ultra-wide-angle lens is greater than or equal to 100°
  • the equivalent focal length of the ultra-wide-angle lens ranges from 10 mm to 20 mm.
  • the ultra-wide angle lens has negative distortion in the edge field of view, and the negative distortion is greater than or equal to -30%, and the range of the vertical axis magnification of the ultra-wide angle lens in the center field of view is 0.03 to 0.43.
  • the number of lenses in the ultra-wide-angle lens ranges from 5 to 8
  • the target surface size of the image sensor ranges from 1/3.06 inch to 1/2.78 inch.
  • the lens is an internal focus lens.
  • the processor is also used to adjust the focal length of the internal focusing lens.
  • the internal focusing lens includes one or more lenses with variable power, and the power of the variable power lens is related to the focal length of the internal focusing lens.
  • the processor is used to adjust the focal length of the internal focusing lens, which can be specifically implemented as: adjusting the power of one or more variable power lenses to adjust the focal length of the internal focusing lens.
  • In one case, the refractive index of the variable-power lens is related to its optical power. The processor, to adjust the power of one or more variable-power lenses, may specifically be configured to control the current or voltage applied to a variable-power lens so as to change its refractive index, thereby adjusting its optical power.
  • In another case, the shape of the variable-power lens is related to its optical power. The processor, to adjust the power of one or more variable-power lenses, may specifically be configured to control the deformation of a variable-power lens so as to adjust its optical power.
  • the lens with variable optical power is a lens made of electro-optic material or a deformable lens.
  • That is, the refractive index of the variable-power lens can be changed by applying an electric field to it, or the lens can be deformed by a driving device pushing or squeezing it, thereby changing its optical power and hence the focal length of the internal focusing lens. In this way, the terminal can image clearly when the subject is close to the image sensor.
  • the terminal further includes a lens driving device.
  • the internal focusing lens includes n lenses arranged in sequence along the optical axis, and the n lenses include one or more movable lens groups. Each movable lens group includes one or more movable lenses, where a movable lens is a lens whose position along the optical axis can change relative to the lens barrel, and the relative position of the movable lens along the optical axis is related to the focal length of the internal focusing lens.
  • the lens driving device is used to drive one or more movable lens groups in the internal focusing lens to move along the optical axis to adjust the focal length of the internal focusing lens.
  • Driven by the lens driving device, the relative positions of the movable lenses in the lens along the optical axis change, that is, the spacing between the lenses changes, and thus the optical characteristics of the entire lens, such as its focal length, change. In other words, in the embodiments of the present application, the focal length of the lens can be adjusted by dynamically adjusting the spacing between the lenses, enabling the terminal to form a clearer image at macro distances.
  • an embodiment of the present application provides a macro imaging method.
  • the method is applied to a terminal.
  • the terminal includes a camera module, an input component, an output component, and a processor.
  • the camera module includes, from the object side to the image side, a lens, a lens driving device, and an image sensor. Among them, the lens supports clear imaging when the distance between the subject and the image sensor is within the macro range.
  • the method includes:
  • the processor controls the lens driving device to drive the lens to move along the optical axis, so that the lens focuses on the subject.
  • the input part receives the shooting instruction input by the user, and the shooting instruction is used to take a picture after focusing, and then, the output part outputs the taken picture.
  • the terminal may also perform the following steps:
  • the output component outputs a first interface, which is used to prompt the user whether to start macro shooting.
  • the terminal can detect whether the distance between the subject and the image sensor falls within the macro range.
  • the lens driving device in the terminal pushes the lens along the optical axis to complete focusing, so that a clearer image can be captured at macro distances; the overall flow is sketched below.
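  • The flow above can be made concrete with a minimal, hypothetical Python sketch. Names such as LensDriver, focus_position_um, and the 400 µm default stroke are illustrative assumptions, not values or interfaces from the patent; a real terminal would implement this inside its camera driver stack.

```python
# Illustrative control flow: detect a macro-range subject, prompt the user,
# drive the lens along the optical axis, then capture and output the picture.

MACRO_RANGE_CM = (1.0, 5.0)  # macro range stated in the embodiments

class LensDriver:
    """Stand-in for a voice coil motor / piezoelectric / MEMS driver."""
    def __init__(self, stroke_um=400.0):
        self.stroke_um = stroke_um   # driving stroke bounds the closest focusing distance
        self.position_um = 0.0

    def move_to(self, target_um):
        # Clamp to the physical driving stroke before moving the lens.
        self.position_um = max(0.0, min(target_um, self.stroke_um))

def focus_position_um(distance_cm, stroke_um=400.0):
    """Toy mapping: the closer the subject, the larger the lens extension."""
    near, far = MACRO_RANGE_CM
    frac = (far - distance_cm) / (far - near)
    return stroke_um * max(0.0, min(1.0, frac))

def macro_shoot(distance_cm, driver, capture, prompt_user, shutter_pressed):
    if not (MACRO_RANGE_CM[0] <= distance_cm <= MACRO_RANGE_CM[1]):
        return None                                 # not a macro scene
    if not prompt_user("Start macro shooting?"):    # the "first interface"
        return None
    driver.move_to(focus_position_um(distance_cm))  # drive lens along the optical axis
    if shutter_pressed():                           # shooting instruction from input component
        return capture()                            # picture goes to the output component
    return None

# Example usage with trivial stubs:
picture = macro_shoot(
    distance_cm=3.0,
    driver=LensDriver(),
    capture=lambda: "focused_macro_frame",
    prompt_user=lambda msg: True,
    shutter_pressed=lambda: True,
)
print(picture)  # -> focused_macro_frame
```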
  • the macro range is 1 to 5 cm.
  • the lens is an ultra-wide-angle lens
  • the FOV of the ultra-wide-angle lens is greater than or equal to 100°
  • the equivalent focal length of the ultra-wide-angle lens ranges from 10 mm to 20 mm.
  • the ultra-wide angle lens has negative distortion in the edge field of view, and the negative distortion is greater than or equal to -30%, and the range of the vertical axis magnification of the ultra-wide angle lens in the center field of view is 0.03 to 0.43.
  • the number of lenses in the ultra-wide-angle lens ranges from 5 to 8
  • the target surface size of the image sensor ranges from 1/3.06 inch to 1/2.78 inch.
  • the lens is an internal focus lens.
  • the processor controls the lens driving device to drive the lens to move along the optical axis, so that the lens can focus on the subject.
  • Specifically, the processor can control the lens driving device to drive the internal focusing lens to move along the optical axis, and control and adjust the focal length of the internal focusing lens, so that the internal focusing lens can focus on the subject.
  • the terminal controls the current or voltage input to the variable power lens through the processor to adjust the power of the variable power lens.
  • the terminal controls the variable power lens to deform through the processor to adjust the power of the variable power lens.
  • the terminal processor can also control the power of the variable power lens to change in other ways to adjust the focal length of the internal focusing lens.
  • That is, when the terminal detects that the distance between the subject and the image sensor is within the macro range, the terminal can change the optical power of the lens by controlling the deformation or refractive index of the lens, thereby adjusting the focal length of the lens; focusing can then be completed by the lens driving device, so that high-quality imaging is obtained at macro distances.
  • FIG. 1 is a schematic structural diagram of a terminal provided by an embodiment of this application.
  • FIG. 2 is a schematic diagram of setting a camera module on a terminal provided by an embodiment of the present application
  • FIG. 3 is a schematic structural diagram of a camera module provided by an embodiment of the present application.
  • FIG. 4 is a schematic diagram of connection between a voice coil motor and a lens provided by an embodiment of the present application
  • FIG. 5 is a schematic structural diagram of a camera module with an ultra-wide-angle lens provided by an embodiment of the present application.
  • Figure 6 is a schematic diagram of the field of view
  • FIG. 9 is a picture taken by a terminal provided in an embodiment of the present application in a macro range
  • FIG. 10 is a schematic structural diagram 1 of a camera module with an internal focusing lens provided by an embodiment of the present application;
  • FIG. 11 is a second schematic structural view of a camera module with an internal focusing lens provided by an embodiment of the present application.
  • FIG. 13 is a flowchart of a method for macro imaging provided by an embodiment of the present application.
  • FIG. 15 is a schematic diagram 1 of a macro imaging scenario provided by an embodiment of the present application;
  • FIG. 16 is a schematic diagram 2 of a macro imaging scenario provided by an embodiment of the present application;
  • FIG. 17 is a schematic diagram 3 of a macro imaging scenario provided by an embodiment of the present application.
  • Field of view (FOV): see FIG. 6. With the lens of the optical instrument as the apex, the angle formed by the two edges of the maximum range through which the object image of the subject can pass through the lens is called the field of view.
  • The size of the field of view determines the visual range of the optical instrument: the larger the field of view, the larger the visual range. In other words, objects within the field of view can be photographed through the lens, while objects outside it are not visible.
  • In FIG. 6, ab is the diameter of the visible range, point c is the center of the visible range, oc is the object distance, and the angle at the lens subtended by ab (angle aob) is the field of view.
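  • From this geometry, the field of view follows directly from the visible-range diameter ab and the object distance oc. A small illustrative sketch (the numeric values are invented for the example):

```python
import math

def field_of_view_deg(ab_mm, oc_mm):
    """FOV (angle aob): ab is the visible-range diameter, oc the object distance."""
    return math.degrees(2 * math.atan((ab_mm / 2) / oc_mm))

# Example: a 100 mm wide visible range at a 42 mm object distance
print(round(field_of_view_deg(100, 42), 1))  # ~99.9 degrees, i.e. about 100 degrees
```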
  • Image sensor target surface size refers to the size of the photosensitive element in the image sensor.
  • Equivalent focal length: because the photosensitive elements of the image sensors in different camera modules differ in size, the same lens used with different photosensitive elements produces different imaging effects. For comparison, the focal lengths of different lenses are converted, according to a certain scale factor, into the equivalent focal length of a standard camera, where the standard camera may be a full-frame camera. For the method of converting the focal length of a lens into the equivalent focal length of a standard camera, reference may be made to the prior art, and details are not repeated here.
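  • One common convention (an illustration, not part of the patent text) scales the real focal length by the crop factor, the ratio of the full-frame sensor diagonal (about 43.27 mm) to the diagonal of the actual sensor:

```python
import math

FULL_FRAME_DIAGONAL_MM = 43.27  # diagonal of a 36 mm x 24 mm full-frame sensor

def equivalent_focal_length(actual_focal_mm, sensor_w_mm, sensor_h_mm):
    """Scale the real focal length by the sensor's crop factor."""
    crop = FULL_FRAME_DIAGONAL_MM / math.hypot(sensor_w_mm, sensor_h_mm)
    return actual_focal_mm * crop

# Example: a 2.0 mm lens on a small phone sensor (about 4.7 mm x 3.5 mm, roughly 1/3 inch)
print(round(equivalent_focal_length(2.0, 4.7, 3.5), 1))  # ~14.8 mm, inside the 10-20 mm range
```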
  • Depth of Field refers to the clear and sharp range of the subject's imaging on the photosensitive element when the camera module completes focusing.
  • the depth of field is related to the background blur effect. Generally, a shallow depth of field corresponds to a better background blur effect, and a deep depth of field corresponds to a poor background blur effect.
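  • The link between close focusing and background blur can be made concrete with the standard thin-lens approximation DoF ≈ 2·N·c·u²/f², where u is the subject distance, N the f-number, c the circle of confusion, and f the focal length (valid for u much larger than f and used here only to show the trend; the numbers below are assumptions, not values from the patent):

```python
def depth_of_field_mm(u_mm, f_mm=2.0, n_stop=2.0, coc_mm=0.003):
    """Approximate DoF ~ 2*N*c*u^2/f^2 (trend illustration for u >> f)."""
    return 2 * n_stop * coc_mm * u_mm ** 2 / f_mm ** 2

for u in (50, 100, 500):  # subject 5 cm, 10 cm, 50 cm from the lens
    print(u, "mm ->", round(depth_of_field_mm(u), 1), "mm of sharp depth")
# 50 mm -> 7.5 mm, 100 mm -> 30.0 mm, 500 mm -> 750.0 mm:
# the closer the focus, the shallower the depth of field and the stronger the blur.
```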
  • The terms "first" and "second" in the specification and drawings of the present application are used to distinguish different objects, or to distinguish different treatments of the same object, rather than to describe a specific order of objects.
  • The terms "including" and "having" and any variations thereof mentioned in the description of this application are intended to cover non-exclusive inclusions.
  • A process, method, system, product, or device that includes a series of steps or units is not limited to the listed steps or units, but optionally also includes other steps or units that are not listed, or other steps or units inherent to these processes, methods, products, or devices.
  • The words "exemplary" or "for example" are used as examples, illustrations, or explanations.
  • The terminal provided in the embodiments of the present application may be a portable electronic device with a camera function, such as a mobile phone, a wearable device, an augmented reality (AR) / virtual reality (VR) device, a tablet computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA).
  • Exemplary embodiments of portable electronic devices include, but are not limited to, portable electronic devices running iOS, Android, or other operating systems.
  • The above portable electronic device may also be another portable electronic device, such as a laptop with a touch-sensitive surface (for example, a touch panel). It should also be understood that, in some other embodiments of the present application, the electronic device may not be a portable electronic device, but a desktop computer with a touch-sensitive surface (such as a touch panel).
  • the terminal in the embodiment of the present application may be a mobile phone 100.
  • the following uses the mobile phone 100 as an example to specifically describe the embodiment.
  • The mobile phone 100 may specifically include components such as: a processor 101, a radio frequency (RF) circuit 102, a memory 103, a touch screen 104, a Bluetooth device 105, one or more sensors 106, a Wi-Fi device 107, a positioning device 108, an audio circuit 109, a peripheral interface 110, and a power supply device 111. These components can communicate through one or more communication buses or signal lines (not shown in FIG. 1).
  • the hardware structure shown in FIG. 1 does not constitute a limitation on the mobile phone, and the mobile phone 100 may include more or fewer components than those illustrated, or combine certain components, or arrange different components.
  • the processor 101 is the control center of the mobile phone 100, and uses various interfaces and lines to connect various parts of the mobile phone 100, by running or executing an application program (App for short) stored in the memory 103, and calling data stored in the memory 103, Perform various functions of the mobile phone 100 and process data.
  • the processor 101 may include one or more processing units.
  • the processor 101 may be used to control and adjust the focal length of the lens in the camera module. For a detailed description of the processor's control of adjusting the focal length of the lens, see below.
  • the processor 101 is also used to control the lens driving device in the camera module, drive the lens to move along the optical axis, and adjust the focal length of the internal focusing lens, so that the lens can focus on the subject.
  • the radio frequency circuit 102 can be used for receiving and sending wireless signals during the process of receiving and sending information or talking.
  • For example, the radio frequency circuit 102 may receive downlink data from a base station and deliver it to the processor 101 for processing, and it sends uplink data to the base station.
  • the radio frequency circuit includes but is not limited to an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency circuit 102 can also communicate with other devices through wireless communication. Wireless communication can use any communication standard or protocol, including but not limited to global mobile communication system, general packet radio service, code division multiple access, broadband code division multiple access, long-term evolution, e-mail, short message service, etc.
  • the memory 103 is used to store application programs and data, and the processor 101 executes various functions and data processing of the mobile phone 100 by running the application programs and data stored in the memory 103.
  • the memory 103 mainly includes a storage program area and a storage data area.
  • the storage program area may store an operating system and application programs required by at least one function (such as a sound playback function, an image playback function, etc.).
  • the storage data area may store data created according to the use of the mobile phone 100 (such as audio data, phone book, etc.).
  • the memory 103 may include a high-speed random access memory, and may also include a non-volatile memory, such as a magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the memory 103 may store various operating systems. For example, the iOS operating system developed by Apple, and the Android operating system developed by Google.
  • the mobile phone may include an input component and an output component.
  • the input component can receive user input operations on the mobile phone, for example, receive voice operations input by the user, receive touch operations input by the user, and so on.
  • the output component can output the result of data processing inside the mobile phone to the user; for example, the mobile phone outputs voice and displays interfaces through the output component.
  • the input and output components can be integrated together.
  • the touch screen 104 integrates an input component touch panel 104-1 and an output component display screen 104-2.
  • the touch screen 104 may include a touch panel 104-1 and a display screen 104-2.
  • the touchpad 104-1 can be used as an input component to collect touch events by the user of the mobile phone 100 on or near it (for example, an operation performed by the user on or near the touchpad 104-1 using a finger, a stylus, or any other suitable object), and to send the collected touch information to another device such as the processor 101.
  • the user's touch event in the vicinity of the touchpad 104-1 can be called floating touch.
  • Floating touch can mean that the user does not need to directly touch the touchpad to select, move, or drag an object (such as an icon, etc.), but only needs the user to be near the terminal to perform the desired function.
  • the terms "touch”, “contact”, etc. do not imply direct contact with the touch screen, but near or close contact.
  • two capacitive sensors may be provided in the touch panel 104-1, namely, a mutual capacitance sensor and a self-capacitance sensor.
  • the two capacitive sensors may be alternately arranged in an array on the touch panel 104-1.
  • the mutual capacitance sensor is used to realize the normal traditional multi-touch, that is, detect the gesture of the user when touching the touch panel 104-1.
  • the self-capacitance sensor can generate a stronger signal than the mutual capacitance sensor, thereby detecting finger induction farther from the touchpad 104-1.
  • Because the signal generated by the self-capacitance sensor is larger than that generated by the mutual capacitance sensor, the mobile phone 100 can detect a user's gesture above the screen, for example, 20 mm above the touchpad 104-1.
  • the touch panel 104-1 capable of floating touch can be implemented by capacitive, infrared light sensing, ultrasonic waves, and the like.
  • the touch panel 104-1 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic waves.
  • the display screen 104-2 can be used as an output component for displaying information input by the user or information provided to the user and various menus of the mobile phone 100.
  • the display screen 104-2 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like.
  • the touchpad 104-1 can be overlaid on the display screen 104-2. When the touchpad 104-1 detects a touch event on or near it, the event is transmitted to the processor 101 to determine its type, and the processor 101 may then provide corresponding visual output on the display screen 104-2 according to the type of touch event.
  • Although in FIG. 1 the touchpad 104-1 and the display screen 104-2 are two independent components realizing the input and output functions of the mobile phone 100, in some embodiments the touchpad 104-1 may be integrated with the display screen 104-2 to realize the input and output functions of the mobile phone 100.
  • the touch screen 104 is formed by stacking multiple layers of materials. Only the touch panel (layer) and the display screen (layer) are shown in the embodiments of the present application; other layers are not described.
  • the touchpad 104-1 may cover the display screen 104-2, and the size of the touchpad 104-1 may be larger than that of the display screen 104-2, so that the display screen 104-2 is completely covered under the touchpad 104-1; alternatively, the touchpad 104-1 may be configured on the front of the mobile phone 100 in the form of a full panel, that is, any touch by the user on the front of the mobile phone 100 can be perceived by the phone, achieving a full-touch experience on the front of the phone.
  • In some other embodiments, the touchpad 104-1 is arranged on the front of the mobile phone 100 in the form of a full panel, and the display screen 104-2 may also be arranged on the front of the mobile phone 100 in the form of a full panel, so that a bezel-less structure can be realized on the front of the phone.
  • an input component such as a touch panel 104-1 is used to receive a shooting instruction input by a user, and the shooting instruction is used to instruct the terminal to take a picture after focusing.
  • An output part such as the display screen 104-2 is used to output the picture taken after focusing.
  • For example, the user selects the photographing option 1505 by touching the touchpad 104-1 to input a shooting instruction; the terminal then takes a picture after focusing, and the display screen 104-2 outputs the picture taken by the terminal after focusing.
  • the mobile phone 100 may also have a fingerprint recognition function.
  • the fingerprint collecting device 112 may be arranged on the back of the mobile phone 100, or the fingerprint collecting device 112 may be arranged on the front of the mobile phone 100 (for example, below the touch screen 104).
  • the fingerprint collection device 112 may be configured in the touch screen 104 to implement the fingerprint recognition function, that is, the fingerprint collection device 112 may be integrated with the touch screen 104 to implement the fingerprint recognition function of the mobile phone 100.
  • the fingerprint collecting device 112 is configured in the touch screen 104, and may be a part of the touch screen 104, or may be configured in the touch screen 104 in other ways.
  • the fingerprint collection device 112 can also be implemented as a full-board fingerprint collection device. Therefore, the touch screen 104 can be regarded as a panel that can perform fingerprint recognition at any position.
  • the fingerprint collecting device 112 may send the collected fingerprint to the processor 101, so that the processor 101 processes the fingerprint (eg, fingerprint verification, etc.).
  • the main component of the fingerprint collection device 112 in the embodiment of the present application is a fingerprint sensor, which may use any type of sensing technology, including but not limited to optical, capacitive, piezoelectric, or ultrasonic sensing technology.
  • the mobile phone 100 may also include a Bluetooth device 105 for implementing data exchange between the mobile phone 100 and other short-range terminals (such as mobile phones, smart watches, etc.).
  • the Bluetooth device 105 in the embodiment of the present application may be an integrated circuit or a Bluetooth chip.
  • the mobile phone 100 may further include at least one sensor 106, such as a light sensor, a motion sensor, an image sensor, and other sensors.
  • the light sensor may include an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display screen of the touch screen 104 according to the brightness of the ambient light
  • the proximity sensor can turn off the power of the display screen when the mobile phone 100 moves to the ear.
  • the accelerometer sensor can detect the magnitude of acceleration in various directions (generally along three axes), and can detect the magnitude and direction of gravity when at rest; it can be used for applications that recognize the phone's attitude (such as landscape/portrait switching, related games, and magnetometer attitude calibration), vibration recognition related functions (such as a pedometer or tap detection), and so on.
  • the image sensor may be provided in the camera module 115 and used to convert the captured image of the camera module 115 into an electrical signal.
  • For example, a charge-coupled device (CCD) image sensor has high resolution and a large photosensitive area, and can sense and identify fine objects.
  • As another example, a complementary metal-oxide-semiconductor (CMOS) image sensor is power-saving, which can reduce the power consumption of the mobile phone when shooting still photos or videos.
  • the mobile phone 100 may also be equipped with other sensors such as a gyroscope, barometer, hygrometer, thermometer, infrared sensor, etc., which will not be repeated here.
  • the Wi-Fi device 107 is used to provide the mobile phone 100 with network access following Wi-Fi related standard protocols.
  • the mobile phone 100 can connect to a Wi-Fi access point through the Wi-Fi device 107, thereby helping the user send and receive e-mails, browse web pages, access streaming media, and so on; it provides the user with wireless broadband Internet access.
  • the Wi-Fi device 107 may also serve as a Wi-Fi wireless access point, and may provide Wi-Fi network access for other terminals.
  • the positioning device 108 is used to provide a geographic location for the mobile phone 100. It can be understood that the positioning device 108 may specifically be a receiver of a positioning system such as the global positioning system (GPS), the BeiDou satellite navigation system, or the Russian GLONASS system. After receiving the geographic location sent by the positioning system, the positioning device 108 sends the information to the processor 101 for processing, or to the memory 103 for storage. In some other embodiments, the positioning device 108 may also be a receiver of an assisted global positioning system (AGPS); the AGPS system acts as an assistance server to help the positioning device 108 complete ranging and positioning services.
  • the auxiliary positioning server communicates with the positioning device 108 (ie, GPS receiver) of the terminal, such as the mobile phone 100, through a wireless communication network to provide positioning assistance.
  • the positioning device 108 may also use positioning technology based on Wi-Fi access points. Since each Wi-Fi access point has a globally unique media access control (MAC) address, the terminal can, when Wi-Fi is turned on, scan and collect the broadcast signals of surrounding Wi-Fi access points and thus obtain the MAC addresses they broadcast.
  • the terminal sends the data identifying the Wi-Fi access points (such as the MAC addresses) to the location server through the wireless communication network; the location server retrieves the geographic location of each Wi-Fi access point, calculates the geographic location of the terminal in combination with the strength of the Wi-Fi broadcast signals, and sends it to the positioning device 108 of the terminal.
  • the audio circuit 109, the speaker 113, and the microphone 114 may provide an audio interface between the user and the mobile phone 100.
  • the audio circuit 109 may transmit the converted electrical signal of the received audio data to the speaker 113, where the speaker 113 converts it into a sound signal and outputs it.
  • the microphone 114 converts the collected sound signal into an electrical signal, which is received by the audio circuit 109 and converted into audio data; the audio data is then output to the RF circuit 102 to be sent to, for example, another mobile phone, or output to the memory 103 for further processing.
  • the peripheral interface 110 is used to provide various interfaces for external input/output devices (such as a keyboard, a mouse, an external display, an external memory, or a subscriber identity module card). For example, it connects to a mouse through a universal serial bus (USB) interface, and connects, through metal contacts in the card slot, to a subscriber identity module (SIM) card provided by a telecommunications operator.
  • the peripheral interface 110 may be used to couple the above-mentioned external input / output peripheral devices to the processor 101 and the memory 103.
  • the mobile phone 100 may further include a power supply device 111 (such as a battery and a power management chip) that supplies power to various components.
  • the battery may be logically connected to the processor 101 through the power management chip, so that functions such as charging, discharging, and power consumption management are realized through the power supply device 111.
  • the mobile phone 100 may further include a camera module 115, which may be a camera of the terminal used for shooting still photos or videos.
  • the camera module 115 includes a lens, a lens driving device, and an image sensor from the object side to the image side. For a detailed description of the camera module 115, refer to the following embodiments.
  • the mobile phone 100 may further include a flash lamp, a micro-projection device, a near field communication (NFC) device, and so on, and details are not described herein again.
  • the terminal provided in the embodiment of the present application will be described in detail as follows.
  • the following mainly uses the terminal as a mobile phone as an example for illustration, which is described here in a unified manner and will not be described in detail below.
  • the camera module 201 in the mobile phone 200 may be the rear camera shown in FIG. 2, and the rear camera is disposed on the top of the back of the mobile phone.
  • the camera module can also be set in other locations, such as inside the mobile phone, and when the user has a shooting requirement, the camera module is popped up for shooting.
  • FIG. 3 shows an exemplary structure of a camera module in a terminal according to an embodiment of the present application.
  • the camera module 201 includes a lens 301, a lens driving device 302, and an image sensor 303 from the object side to the image side.
  • each component in FIG. 3 is only exemplary, and the actual shape and size of the components are not limited to those shown in FIG. 3.
  • the object side refers to the side of the object to be photographed (referred to as the subject), and the image side refers to the side where the image sensor images.
  • Lens driving devices include but are not limited to voice coil motors, piezoelectric ceramics, and Micro-Electro-Mechanical Systems (MEMS).
  • Image sensors include, but are not limited to, the aforementioned CCD image sensor and CMOS image sensor.
  • the lens driving device is used to drive the lens to move along the optical axis.
  • the driving stroke of the lens driving device is related to the closest focusing distance of the lens.
  • the driving stroke of the motor can make the closest focusing distance of the lens range from 1 to 5 cm.
  • the focusing distance refers to the object-image distance, that is, the sum of the distance between the subject and the lens and the distance between the lens and the image sensor; in other words, the distance between the subject and the image sensor.
  • the closest focusing distance refers to the closest focusing distance at which the subject focuses.
  • Subject focusing means that the subject can form a clearer image on the image sensor; that is, the closest focusing distance is the closest distance between the subject and the image sensor at which a clearer image can still be formed.
  • For example, the motor drives the lens to move a certain distance along the optical axis (for example, 400 µm), so that the subject is focused when it is 1 cm from the image sensor.
  • For another example, the motor drives the lens to move a certain distance along the optical axis (for example, 50 µm), so that the subject is focused when it is 7 cm from the image sensor.
  • the driving stroke of the motor can make the closest focusing distance of the lens range from 1 to 5 cm. That is, when the distance of the subject from the image sensor is within a range of 1 to 5 cm, the subject can be focused, that is, the subject can be formed into a clearer image on the image sensor.
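  • As a worked illustration (idealizing the multi-element lens as a thin lens; the 2 mm focal length is an assumption, roughly a 15 mm full-frame equivalent, not a value from the patent), the required stroke can be estimated from the thin-lens equation 1/f = 1/u + 1/v with the focusing distance D = u + v:

```python
import math

def lens_extension_um(focus_distance_mm, f_mm):
    """Extension beyond the infinity position needed to focus at distance D = u + v.

    Solves u + v = D and 1/u + 1/v = 1/f for the image distance v
    (the smaller root of x^2 - D*x + f*D = 0), returning v - f in micrometres.
    """
    d = focus_distance_mm
    disc = d * d - 4 * f_mm * d
    if disc < 0:
        raise ValueError("subject is closer than the lens can focus")
    v = (d - math.sqrt(disc)) / 2
    return (v - f_mm) * 1000

for d_cm in (1, 5, 7):
    print(d_cm, "cm ->", round(lens_extension_um(d_cm * 10, 2.0)), "um of stroke")
# 1 cm -> ~764 um, 5 cm -> ~87 um, 7 cm -> ~61 um: focusing at 1 cm needs an
# order of magnitude more stroke, which is why the driving stroke of the motor
# bounds the closest focusing distance.
```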
  • the above lens driving device is mainly used to push the lens along the optical axis to the optimal imaging position.
  • the driving stroke of the lens driving device may be different.
  • For example, the terminal is provided with a lens 1, and the stroke range of the lens driving device is 0 to 400 µm, so that when the subject is 1 to 5 cm from the image sensor, the lens driving device can push the lens along the optical axis to the optimal imaging position.
  • For another example, the terminal is provided with a lens 2, and the driving stroke range of the lens driving device is 0 to 300 µm; in this way, when the subject is 1 to 5 cm from the image sensor, the lens driving device can push the lens along the optical axis to the optimal imaging position.
  • the driving stroke range of the lens driving device may be different for different lenses.
  • the lens driving device and the lens can be connected in a certain manner.
  • the lens and the voice coil motor may be connected with a screw-tooth interlocking structure as shown in (a) of FIG. 4.
  • this structure mainly relies on the screw threads between the voice coil motor 2 and the lens 1 cooperating to form a preliminary binding force, after which glue is applied from the upper end of the screw-thread interlocking structure, fixing the outer surface of the lens 1 to the inner surface of the voice coil motor 2 and thereby combining the lens 1 and the voice coil motor 2.
  • the lens 1 and the voice coil motor 2 may be connected with a non-screw tooth smooth surface structure as shown in (b) of FIG. 4.
  • the lens and the voice coil motor can also be connected in other ways, which is not limited in the embodiments of the present application.
  • For the connection between a MEMS device or a piezoelectric ceramic and the lens, reference may also be made to the prior art, which is not limited in the embodiments of the present application.
  • At least one of the following three lenses may be used:
  • Case 1: the lens is a fixed-focus ultra-wide-angle lens.
  • the field of view (FOV) of the ultra-wide-angle lens is greater than or equal to 100°, and the equivalent focal length of the ultra-wide-angle lens ranges from 10 mm to 20 mm.
  • In a specific implementation, the terminal can realize macro imaging by setting different lenses, and the specific parameters of different lenses will differ. Generally, when the parameters of the lens fall within the parameter ranges mentioned in the embodiments of the present application, the terminal can realize macro imaging. For example, when the FOV of the ultra-wide-angle lens is 110° and the equivalent focal length is 15 mm, other parameters of the ultra-wide-angle lens, such as curvature and refractive index, can be adjusted to achieve macro imaging of the terminal.
  • In the embodiments of the present application, macro imaging means that a subject can form a clearer image at a distance of 1 to 5 cm from the image sensor; this is described here once and will not be repeated below.
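  • The FOV and equivalent-focal-length ranges above are mutually consistent under a thin-lens approximation. The following illustrative check (not part of the patent) computes the diagonal FOV implied by a full-frame-equivalent focal length:

```python
import math

FULL_FRAME_DIAGONAL_MM = 43.27  # 36 mm x 24 mm standard camera

def diagonal_fov_deg(equiv_focal_mm):
    """Diagonal FOV of a lens quoted by its full-frame-equivalent focal length."""
    return math.degrees(2 * math.atan(FULL_FRAME_DIAGONAL_MM / (2 * equiv_focal_mm)))

for f in (10, 15, 20):
    print(f, "mm equivalent ->", round(diagonal_fov_deg(f), 1), "degrees")
# ~130.4, ~110.5 and ~94.4 degrees: the 10-20 mm equivalent range brackets
# the >= 100 degree ultra-wide-angle criterion.
```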
  • the ultra wide-angle lens is composed of 6 lenses. Among them, the power of the first lens L1 from the object side to the image side is negative, and the power of the second lens L2 is positive. A diaphragm STO is provided between L1 and L2.
  • the third lens L3 has a negative power
  • the fourth lens L4 has a positive power
  • the fifth lens L5 has a positive power
  • the sixth lens L6 has a negative power.
  • the FOV value of the ultra-wide-angle lens may be 100°, or a value greater than 100°
  • the equivalent focal length of the ultra-wide-angle lens may be a certain value from 10 to 20 mm.
  • the distance from the first lens L1 to the image sensor 303 is defined as the total track length (TTL), the half-image height of the lens is IH, and IH/TTL is in the range 0.5 to 0.6.
  • the ultra-wide-angle lens of the embodiment of the present application may also have other structures and other numbers of lenses.
  • the ultra-wide-angle lens is composed of 5 lenses, and the power and curvature of each lens from the object side to the image side may be based on actual conditions. set up.
  • the ultra-wide-angle lens may also adopt an existing structure.
  • the embodiments of the present application do not limit the specific structure of the ultra-wide-angle lens.
  • Because the equivalent focal length of the ultra-wide-angle lens is short (10 to 20 mm), a shorter closest focusing distance can be obtained; that is, even when the lens is close to the subject, it can still focus successfully and obtain high-quality, high-definition imaging.
  • FIG. 7(a) is an image taken by a conventional mobile phone at a macro distance (for example, the subject is 5 cm from the lens); the image is blurred.
  • FIG. 7 (b) is an image taken by the mobile phone in macro mode in the embodiment of the present application. In FIG. 7 (b), details of small insects and leaves are photographed more clearly.
  • In addition, when the lens is closer to the subject, the shorter focusing distance makes the depth of field of the captured image shallower, so the captured image obtains a better background blur effect.
  • FIG. 8 is a picture taken by a mobile phone according to an embodiment of the present application; the picture has a good background blur effect.
  • the vertical magnification of the lens in the central field of view ranges from 0.03 to 0.43.
  • the lens has negative distortion in the edge field of view, and the negative distortion is greater than or equal to -30%.
  • the vertical axis magnification refers to the magnification in the direction of the vertical optical axis, and its value is the ratio of the imaging size to the actual size of the object along the direction of the vertical optical axis.
  • The edge field of view is between 0.8 and 1. Specifically, referring to FIG. 6, the entire visual range is divided into N parts, with the maximum visual range recorded as 1 and the central field of view recorded as 0; the region from 0.8 to 1 is the edge field of view. Negative distortion means that the vertical axis magnification of the lens in the edge field of view is smaller than that in the central field of view.
  • the lower magnification in the edge field of view is equivalent to the reduction of magnification caused by increased object distance when shooting a distant scene, so that the camera module can capture an image with better perspective.
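  • The central-field vertical-axis magnification can be tied to the same thin-lens quantities as m = v/u, the ratio of image distance to object distance (again assuming a 2 mm focal length purely for illustration):

```python
import math

def center_magnification(focus_distance_mm, f_mm=2.0):
    """Vertical-axis magnification m = v/u for a thin lens focused at D = u + v."""
    d = focus_distance_mm
    disc = d * d - 4 * f_mm * d
    u = (d + math.sqrt(disc)) / 2  # object distance (larger root)
    v = d - u                      # image distance
    return v / u

for d_cm in (1, 3, 5):
    print(d_cm, "cm ->", round(center_magnification(d_cm * 10), 2))
# 1 cm -> 0.38, 3 cm -> 0.08, 5 cm -> 0.04: spanning much of the
# 0.03-0.43 central-field magnification range stated above.
```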
  • FIG. 9 is a picture taken by a mobile phone according to an embodiment of the present application, in which micro scenes (a few small dolls on the desktop) and macro scenes (the buildings and so on in FIG. 9) both have better perspective effects, making the picture more three-dimensional.
  • the number of lenses in the lens ranges from 5 to 8, and the target surface size of the image sensor ranges from 1/3.06 inch to 1/2.78 inch.
  • the lens material is plastic or glass, or a mixture of plastic and glass.
  • the aperture range of the lens is F2.4 to F1.8.
  • Case 2: the lens is an internal focusing lens.
  • the internal focusing lens includes n lenses arranged in sequence along the optical axis, and the n lenses include one or more movable lens groups.
  • Each movable lens group includes one or more movable lenses.
  • the movable lens refers to a lens whose position along the optical axis can change relative to the lens barrel. The position of the movable lens along the optical axis is related to the focal length of the internal focusing lens.
  • the terminal further includes a lens driving device for driving one or more movable lens groups in the internal focusing lens to move along the optical axis to adjust the focal length of the internal focusing lens.
  • the lens driving device may be a voice coil motor, MEMS, or piezoelectric ceramic.
  • When the lens driving device drives a movable lens to move, the relative positions along the optical axis of the movable lenses within the same movable lens group do not change; that is, the lens driving device takes the movable lens group as a whole and moves the entire group along the optical axis.
  • For example, the lens driving device drives the first lens in a movable lens group to move 100 µm along the optical axis and, correspondingly, drives the second lens in the same movable lens group to move 100 µm along the optical axis.
  • The moving distance and moving direction along the optical axis may differ between different movable lens groups. For example, in FIG. 10, L2 and L3 are driven to move toward the object side along the optical axis by distance 1, while L4 is driven to move toward the image side along the optical axis by distance 2.
  • the moving distance and moving direction along the optical axis between different movable lens groups may also be the same, and the specific movement rule of the movable lens group is not limited in the embodiments of the present application.
  • the movable lens can be connected with the lens driving device in a certain way; for example, the movable lens can be connected with the lens driving device by glue dispensing.
  • the connection between the movable lens and the lens driving device can also refer to other methods in the prior art, which is not limited in the embodiments of the present application.
  • the lens driving device is a motor
  • the value of n is 6, among the 6 lenses
  • the movable lens L2 and the movable lens L3 constitute a movable lens group
  • L4 is another movable lens group.
  • the movable lenses L2, L3 and the motor are combined by dispensing, and the movable lens L4 and the motor are also combined by dispensing.
  • the motor can drive L2, L3, L4 to move relative to the inner focusing lens in the direction along the optical axis.
  • the relative position of the movable lens in the lens along the optical axis changes through the driving of the lens driving device, that is, the spacing between the lenses in the lens changes.
  • Optical characteristics such as focal length may change. That is, in the embodiment of the present application, by dynamically adjusting the spacing between the lenses in the lens, the focal length of the lens can be adjusted, thereby enabling the terminal to form a clearer image under macro.
  • the lens driving device pushes the movable lens and the lens driving device mentioned above pushes the lens, which is a different process.
  • the lens driving device pushes the movable lens in the lens to move along the optical axis, and its purpose is to adjust the focal length of the lens by changing the spacing between the lenses in the lens.
  • the lens driving device pushes the lens to move along the optical axis. Its purpose is to adjust the object distance and image distance through the movement of the lens along the optical axis to determine the optimal position of the lens when the subject can be clearly imaged.
  • FIG. 10 is only an example of the internal focusing lens in the embodiment of the present application.
  • the number of lenses included in the lens, which specific lens or which lenses are movable lenses, can be set separately, this application
  • the embodiment is not limited.
  • the lens is an internal focus lens.
  • the internal focusing lens includes one or more lenses with variable power (such as the lenses L1 and L4 in FIG. 11), and the power of the variable power lens is related to the focal length of the lens .
  • the optical power is used to characterize the refractive power of the optical device to the incident parallel light beam.
  • the greater the power the more pronounced the refraction of the parallel beam.
  • the refraction is convergent, and when the power is less than 0, the refraction is divergent.
  • a variable power lens has a variable shape under the action of an electric field (such as a changing current or voltage).
  • the shape of the variable power lens is related to the power of the variable power lens, or
  • the refractive index of the variable-degree lens is variable under the action of an electric field, and the refractive index of the variable-power lens is related to the optical power of the variable-power lens.
  • the processor in the terminal can adjust the power of the variable power lens by controlling the deformation or refractive index of the variable power lens to adjust the focal length of the internal focusing lens.
  • the processor is used to adjust the focal length of the internal focusing lens, which may be specifically implemented as follows: the processor controls the current or voltage input to the variable power lens to change the refractive index of the variable power lens, To achieve the purpose of adjusting the variable power lens, thereby adjusting the focal length of the internal focusing lens.
  • the processor is used to adjust the focal length of the internal focusing lens, and may specifically be implemented as: a processor to control the deformation of the variable power lens to adjust the power of the variable power lens Purpose, thereby adjusting the focal length of the inner focusing lens.
  • the processor controls the deformation of the variable power lens, specifically, the processor can control the driving device, and the driving device pushes and squeezes the lens to deform.
  • the lens with variable optical power is a lens made of electro-optic material or a deformable lens.
  • the electroluminescent material refers to a material with a variable refractive index under the action of an electric field.
  • the deformable lens can be deformed by the driving device.
  • the driving device may be a motor, MEMS, or the like.
  • the material of the lens with variable optical power is not limited to the above two, but may be other materials, which are not limited in the embodiments of the present application.
  • an electric field can be applied to lenses with variable optical powers such as L1 and L4 to change the optical power of the lenses, thereby adjusting the focal length of the entire lens, so that the terminal can become clearer at macro Like.
  • the camera module shown in FIG. 3 may further include other components.
  • an infrared cut filter 304 is provided between the lens 301 and the image sensor 303 to filter out the near infrared and ultraviolet light bands in the ambient light.
  • the thickness of the infrared cut filter is 0.11 mm or 0.21 mm, and the material of the infrared cut filter is resin or blue glass.
  • the infrared cut filter may also be other materials and / or filters with other thicknesses. The embodiments of the present application do not limit the material and thickness of the filter.
  • An embodiment of the present application also provides a macro imaging method.
  • the method is applied to the terminal shown in case 1 above.
  • the terminal is provided with a camera module, an input component, and an output component.
  • a lens, a lens driving device, and an image sensor are provided.
  • the lens is an ultra-wide-angle lens. The method includes the following steps:
  • the input component receives an operation of turning on the camera input by the user to turn on the camera.
  • the input component may be a touchpad 104-1.
  • the user touches and clicks the camera icon 1501 displayed on the screen, and the touchpad 104-1 collects information input by the user to start the camera operation. And send this information to the processor for further processing to turn on the camera. See (b) in FIG. 15 for the camera interface 1502 of the terminal. This interface can be displayed to the user by the display screen 104-2 of the terminal.
  • the processor detects the distance between the subject and the image sensor.
  • the terminal detects that the distance between the subject and the image sensor is within the macro range, and the output component outputs a first interface 1504, which is used to prompt the user whether to start macro shooting.
  • the macro range refers to 1 to 5 cm.
  • the terminal processor 101 uses a laser ranging method to measure the distance between the subject and the image sensor.
  • the specific principle and process of laser ranging can be referred to the prior art, and will not be repeated here.
  • the processor 101 collects the imaging on the image sensor, and when the imaging is relatively blurred, it can initially determine that the distance between the subject and the image sensor is relatively short.
  • the processor 101 feeds back the measured distance to the lens driving device.
  • the terminal output part that is, the display 104-2 outputs the first
  • the interface 1504 is used to prompt the user whether to enable macro shooting to obtain better close-up imaging quality.
  • the input component receives a first operation input by the user, and the first operation is used to instruct the terminal to start macro shooting.
  • the display screen 104-2 displays the options “Yes” and “No”, and the user can input the first operation through the input part, for example, touch the option “through the touchpad 104-1 shown in FIG. 1 Yes".
  • the touch panel 104-1 sends the collected touch information (ie, the user clicks the option "Yes") to a processor for processing, for example.
  • the terminal may determine that the user's actual shooting intention is not macro shooting, and at this time, the terminal may use an existing method to take a picture.
  • the lens driving device drives the lens to move along the optical axis to focus on the subject.
  • the terminal can automatically focus on the subject, that is, after receiving the first operation of the user and determining that macro shooting is started, the processor of the terminal can control the lens driving device, and then the lens driving device drives the ultra-wide-angle lens along The optical axis moves to complete the focusing process.
  • the terminal can also receive the focusing operation input by the user on the mobile phone interface, and adjust the position of the ultra-wide-angle lens along the optical axis according to the focusing operation. For example, referring to FIG. 15 (d), the user can perform the focus selection operation by touching the bug displayed on the display screen. After receiving the user input, the terminal uses the bug as the focus and adjusts the position of the ultra-wide-angle lens along the optical axis .
  • the input component receives a shooting instruction input by the user to instruct the terminal to shoot the focused picture.
  • the terminal may input voice through an input component such as a microphone to input a shooting instruction.
  • the user can also input a shooting instruction through other input components and other methods, which will not be repeated in the embodiments of the present application.
  • the output part outputs the picture taken after focusing.
  • the user can click the photographing option 1505 through the touchpad 104-1 to trigger the terminal to take pictures under macro, and the output part, such as the display screen 104-2, outputs the pictures taken under macro.
  • the first interface 1504 shown in (c) of FIG. 15 may not be output, but the macro shooting mode is automatically turned on and passed
  • the lens driving device drives the ultra-wide-angle lens to complete focusing, and then takes and outputs the focused image, that is, in FIG. 12, S1203 and S1204 are optional steps.
  • the terminal can detect whether the distance between the subject and the image sensor satisfies the macro.
  • the lens driving device in the terminal pushes the lens to move along the optical axis to Focusing is completed, so that a clearer image can be taken under macro.
  • An embodiment of the present application also provides a macro imaging method, which is applied to the terminal shown in the above case 2.
  • the terminal is provided with a camera module, an input component, and an output component.
  • the camera module is provided with an internal focusing lens, a lens driving device, and an image sensor from the object side to the image side.
  • the internal focusing lens includes n lenses arranged in sequence along the optical axis , N lenses include one or more movable lens groups. Each movable lens group includes one or more movable lenses.
  • the movable lens is a lens with a variable position along the optical axis relative to the lens. The position of the movable lens along the optical axis is related to the focal length of the lens. Referring to FIG. 13, the method includes steps S1201 to S1204, S1301, S1302, S1206, and S1207:
  • the lens driving device drives one or more movable lens groups in the internal focusing lens to move along the optical axis to adjust the focal length of the internal focusing lens.
  • the motor can drive the movable lens group composed of L2 and L3 to move along the optical axis to the object side, thereby adjusting the focal length of the lens.
  • the lens driving device drives the internal focusing lens to move along the optical axis to focus on the subject.
  • the lens driving device in the terminal can drive one or more movable lenses along the optical axis Move to dynamically adjust the focal length of the lens, and can drive the lens to move along the optical axis through the lens drive device during macro, and complete the focus, so that the image can be clear even in macro.
  • An embodiment of the present application further provides a macro imaging method, which is applied to the terminal shown in the above case 3.
  • the terminal is provided with a camera module, an input component, an output component, and a processor.
  • the camera module is provided with a lens, a lens driving device, and an image sensor from the object side to the image side.
  • the lens is an internal focus lens, and the internal focus lens includes one or For more than one variable power lens, the power of the variable power lens is related to the focal length of the internal focusing lens.
  • the method includes steps S1201 to S1204, S1401, S1402, S1206, and S1207:
  • S1401 The processor controls to adjust the power of one or more lenses with variable optical power in the internal focusing lens to adjust the focal length of the lens.
  • the terminal controls the current or voltage input to the variable power lens through the processor to adjust the power of the variable power lens.
  • the terminal controls the variable power lens to deform through the processor to adjust the power of the variable power lens.
  • the terminal processor can also control the power of the variable power lens to change in other ways to adjust the focal length of the internal focusing lens.
  • the processor controls the lens driving device so that the lens driving device drives the internal focusing lens to move along the optical axis to focus on the subject.
  • the terminal when the terminal detects that the distance between the object and the image sensor satisfies the macro, the terminal can change the optical power of the lens by controlling the deformation or refractive index of the lens, and then adjust The focal length of the lens, and focus can be achieved by the lens drive device, so that high-quality imaging can be obtained at macro.
  • the user after turning on the camera of the terminal, as shown in (a) of FIG. 16, the user can trigger the terminal to jump to (b) of FIG. 16 by clicking the mode option 1503
  • the mode selection interface 1601 is shown.
  • the user can click the macro shooting option 1602 to trigger the terminal to perform macro imaging.
  • the terminal having the structure shown in the above case 1 may perform the above-mentioned S1205 to S1207.
  • the terminal with the structure shown in the above case 2 can execute the above-mentioned S1301, S1302, S1206, and S1207.
  • the terminal having a structure such as that shown in case 3 above can execute the above-mentioned S1401, S1402, S1206, and S1207.
  • the terminal may also jump to the mode selection interface 1601 in other ways.
  • the user may jump to the mode selection interface 1601.
  • the method is not limited.
  • the terminal may prompt the user of the effect of macro shooting or other macro shooting information.
  • the terminal can output an interface prompt, for example, a prompt box "Macro shooting can support clear imaging of the subject at an image distance of 1 to 5 cm"
  • the prompt box can set options “Yes” and “No”.
  • the terminal can determine that the real intention of the user's shooting is macro shooting, and thus the terminal performs the above macro shooting method.
  • the user may also preset the macro shooting function of the terminal. For example, as shown in (a) of FIG. 17, in the setting interface 1701, the user may click the macro shooting start option 1702 to start the macro shooting function.
  • the terminal may execute the above macro imaging method.
  • the terminal does not enable the macro shooting function, the terminal does not have the authority to execute the above macro imaging method.
  • the terminal may output a prompt interface to prompt the user to turn on the macro shooting function, so that the terminal can form a clear image in macro.
  • the terminal may enter the setting interface 1701 in various ways.
  • the terminal may jump to the setting interface 1502 when it receives a user's right-slide operation on the camera interface 1502.
  • the embodiment of the present application does not limit the way in which the terminal enters the setting interface.
  • the terminal can save the settings made by the user on the setting interface. Later, when the user turns on the camera, if the terminal detects that the subject is closer to the image sensor, the terminal can perform the above-mentioned macro imaging method to realize the macro Clear imaging.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Business, Economics & Management (AREA)
  • Business, Economics & Management (AREA)
  • Studio Devices (AREA)
  • Automatic Focus Adjustment (AREA)
  • Focusing (AREA)
  • Lens Barrels (AREA)
  • Lenses (AREA)

Abstract

本申请提供一种微距成像方法及终端,涉及拍照技术领域,能够解决现有技术中被摄物在距离图像传感器较近时成像质量较低的问题。终端包括摄像模组、输入部件、输出部件和处理器,摄像模组从物侧到像侧包括镜头、镜头驱动装置以及图像传感器。其中,镜头,用于支持被摄物与图像传感器之间的距离处于微距范围内时清晰成像。镜头驱动装置,用于当被摄物与图像传感器之间的距离处于微距范围内时,驱动镜头沿着光轴移动,其中,镜头驱动装置的驱动行程与终端的最近对焦距离相关。处理器,用于控制镜头驱动装置,以使得镜头对被摄物完成对焦。输入部件,用于接收用户输入的拍摄指令,拍摄指令用于拍摄对焦后的图片。输出部件,用于输出拍摄的图片。

Description

微距成像的方法及终端
本申请要求于2018年10月16日提交中国国家知识产权局、申请号为201811206371.X、发明名称为“微距成像的方法及终端”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及终端拍照技术领域,尤其涉及一种微距成像的方法及终端。
背景技术
通常,用户可以通过智能终端上的摄像头拍摄静态照片或者摄录动态视频。目前,智能终端的摄像头设计方式分为定焦设计和变焦设计。其中,定焦设计中,摄像头的焦距为确定值,例如,摄像头的焦距可为27mm或者30mm或者54mm或者其他数值。在变焦设计中,摄像头的焦距可调整。在一般使用场景中,为了使摄像头在距离被摄物无穷远和近距离下均能对焦,摄像头的对焦距离范围通常在7cm以上。
在许多应用场景中,用户需拍摄近距离的图片,比如,用户想拍摄距离镜头很近的小虫的图片。然而,对于更近的对焦距离,比如1~5cm,现有智能终端的摄像头成像模糊,获得的图像品质较低。
发明内容
本申请实施例提供一种终端,能够在诸如1~5cm对焦距离的拍摄场景中获得高质量成像。
为达到上述目的,本申请实施例采用如下技术方案:
第一方面,本申请实施例提供一种终端,终端包括摄像模组、输入部件、输出部件和处理器,摄像模组从物侧到像侧包括镜头、镜头驱动装置以及图像传感器。
其中,镜头,用于支持被摄物与图像传感器之间的距离处于微距范围内时清晰成像。镜头驱动装置,用于当被摄物与图像传感器之间的距离处于微距范围内时,驱动镜头沿着光轴移动,其中,镜头驱动装置的驱动行程与终端的最近对焦距离相关。处理器,用于控制镜头驱动装置,以使得镜头对被摄物完成对焦。输入部件,用于接收用户输入的拍摄指令,拍摄指令用于拍摄对焦后的图片。输出部件,用于输出拍摄的图片。如此,当被摄物与图像传感器的距离处于微距范围内时,处理器能够控制镜头驱动装置,以使得镜头对被摄物成功对焦。
在一种可能的设计中,微距范围为1~5cm。
可选的,镜头为超广角镜头,超广角镜头的视场FOV大于或等于100°,超广角镜头的等效焦距的取值范围为10mm~20mm。
可选的,超广角镜头在边缘视场具有负畸变,且负畸变大于或等于-30%,超广角镜头在中心视场的垂轴放大率的范围为0.03~0.43。
可选的,超广角镜头中镜片数目的范围为5~8,图像传感器的靶面尺寸的范围为1/3.06~1/2.78。
在一种可能的设计中,镜头为内调焦镜头。处理器,还用于调整内调焦镜头的焦距。
可选的,内调焦镜头包括一个或一个以上光焦度可变的镜片,光焦度可变的镜片的光焦度与内调焦镜头的焦距相关联。
处理器,用于调整内调焦镜头的焦距,具体可以实现为:用于调整一个或一个以上光焦度可变镜片的光焦度,以调整内调焦镜头的焦距。
可选的,光焦度可变镜片的折射率与光焦度可变镜片的光焦度相关。
处理器,用于调整一个或一个以上光焦度可变镜片的光焦度,具体可以实现为:用于控制输入到光焦度可变镜片的电流或电压,以改变光焦度可变镜片的折射率,以调整光焦度可变镜片的光焦度。
或者,光焦度可变镜片的形状与光焦度可变镜片的光焦度相关。
相应的,处理器,用于调整一个或一个以上光焦度可变镜片的光焦度,具体可以实现为:用于控制光焦度可变镜片发生形变,以调整光焦度可变镜片的光焦度。
可选的,光焦度可变的镜片为电致材质的镜片或者可形变镜片。
如此,可以通过为光焦度可变镜片施加电场改变其折射率,或者通过驱动装置推动、挤压光焦度可变镜片发生形变,进而改变光焦度可变镜片的光焦度,以调整内调焦镜头的焦距。如此,能够使得终端支持在被摄物距离图像传感器较近时清晰成像。
在一种可能的设计中,终端还包括镜片驱动装置。内调焦镜头包括沿光轴依次排列的n个镜片,n个镜片包括一个或者一个以上可移动镜片组,每一可移动镜片组包括一个或一个以上可移动镜片,可移动镜片为相对镜头沿光轴的位置可变的镜片,可移动镜片沿光轴的相对位置与内调焦镜头的焦距相关。
镜片驱动装置,用于驱动内调焦镜头中的一个或一个以上可移动镜片组沿着光轴移动,以调整内调焦镜头的焦距。
如此,在本申请实施例中,通过镜片驱动装置的驱动,镜头中可移动镜片之间沿光轴的相对位置发生变化,也就是,镜头中镜片之间的间距发生了变化,由此,整个镜头的光学特性,比如焦距可能发生变化。即本申请实施例中,通过动态调整镜头中镜片之间的间距,可以调整镜头的焦距,进而使得终端在微距下能够成较为清晰的像。
第二方面,本申请实施例提供一种微距成像方法,该方法应用于终端,终端包括摄像模组、输入部件、输出部件和处理器,摄像模组从物侧到像侧包括镜头、镜头驱动装置以及图像传感器。其中,镜头支持被摄物与图像传感器之间的距离处于微距范围内时清晰成像。该方法包括:
若检测到被摄物与图像传感器之间的距离处于微距范围内,处理器控制镜头驱动装置,以驱动镜头沿着光轴移动,以使得镜头对被摄物完成对焦。输入部件接收用户输入的拍摄指令,拍摄指令用于拍摄对焦后的图片,之后,输出部件输出拍摄的图片。
在一种可能的设计中,在终端检测到被摄物与图像传感器之间的距离处于微距范围内之后,终端还可以执行如下步骤:
输出部件输出第一界面,第一界面用于提示用户是否开启微距拍摄。
本申请实施例提供的微距成像方法,终端可检测被摄物与图像传感器之间的距离是否满足微距,在满足微距时,终端中的镜头驱动装置推动镜头沿着光轴移动,以完成对焦,进而在微距下能够拍摄较为清晰的成像。
在一种可能的设计中,微距范围为1~5cm。
在一种可能的设计中,镜头为超广角镜头,超广角镜头的视场FOV大于或等于100°,超广角镜头的等效焦距的取值范围为10mm~20mm。
可选的,超广角镜头在边缘视场具有负畸变,且负畸变大于或等于-30%,超广角镜头在中心视场的垂轴放大率的范围为0.03~0.43。
可选的,超广角镜头中镜片数目的范围为5~8,图像传感器的靶面尺寸的范围为1/3.06~1/2.78。
在一种可能的设计中,镜头为内调焦镜头。处理器控制镜头驱动装置,驱动镜头沿着光轴移动,以使得镜头对被摄物完成对焦,具体可以实现为:处理器控制镜头驱动装置,以驱动内调焦镜头沿着光轴移动,并且控制调整内调焦镜头的焦距,以使得内调焦镜头对被摄物完成对焦。
可选的,终端通过处理器控制输入到光焦度可变镜片的电流或电压,以调整光焦度可变镜片的光焦度。或者,终端通过处理器控制光焦度可变镜片发生形变,以调整光焦度可变镜片的光焦度。当然,终端处理器还可以通过其他方式控制光焦度可变镜片的光焦度发生变化,以调整内调焦镜头的焦距。
本申请实施例提供的微距成像方法,当终端检测到被摄物与图像传感器之间的距离满足微距时,终端可通过控制镜片的形变或者折射率来改变镜片的光焦度,进而调整镜头的焦距,并且,可通过镜头驱动装置完成对焦,从而在微距时能够获得高质量的成像。
附图说明
图1为本申请实施例提供的终端的结构示意图;
图2为本申请实施例提供的摄像模组在终端上的设置示意图;
图3为本申请实施例提供的摄像模组的结构示意图;
图4为本申请实施例提供的音圈马达与镜头的连接示意图;
图5为本申请实施例提供的具有超广角镜头的摄像模组的结构示意图;
图6为视场的示意图;
图7为现有手机和本申请实施例的手机在微距范围内拍摄的图片;
图8为本申请实施例提供的终端在微距范围内拍摄的图片;
图9为本申请实施例提供的终端在微距范围内拍摄的图片;
图10为本申请实施例提供的具有内调焦镜头的摄像模组的结构示意图一;
图11为本申请实施例提供的具有内调焦镜头的摄像模组的结构示意图二;
图12为本申请实施例提供的微距成像的方法流程图;
图13为本申请实施例提供的微距成像的方法流程图;
图14为本申请实施例提供的微距成像的方法流程图;
图15为本申请实施例提供的微距成像的场景示意图一;
图16为本申请实施例提供的微距成像的场景示意图二;
图17为本申请实施例提供的微距成像的场景示意图三。
附图标记说明:
1-镜头,
2-音圈马达。
具体实施方式
首先,对本申请实施例涉及的术语进行说明:
视场(Field of view,FOV):参见图6,在光学仪器中,以光学仪器的镜头为顶点,以被摄物的物像可通过镜头的最大范围的两条边缘构成的夹角,称为视场。视场的大小决定了光学仪器的视野范围,视场越大,视野就越大。也就是说,在视场内的物体可以通过镜头被拍摄,在视场外的物体不可视。图6中,ab为可视范围的直径,c点为可视范围的中心,oc为物距,ω为视场。
图像传感器的靶面尺寸:指的是图像传感器中感光元件的尺寸。
等效焦距:由于不同摄像模组中图像传感器的感光元件的大小不同,同样的镜头搭配不同的感光元件使用,其成像效果是不同的,为了方便理解和表述,将不同镜头的焦距按照一定的比例系数换算成标准相机的等效焦距,其中,标准相机可以为全画幅相机。将不同镜头的焦距转化为标准相机的等效焦距的方法可参见现有技术,这里不再赘述。
景深:指的是摄像模组完成对焦时,被摄物在感光元件上成像的清楚、锐利范围,成像的清楚范围越大,景深越深,成像的清楚范围越小,景深越浅。此外,景深与背景虚化效果相关。通常,浅景深对应较好的背景虚化效果,深景深对应较差的背景虚化效果。
本申请的说明书以及附图中的术语“第一”和“第二”等是用于区别不同的对象,或者用于区别对同一对象的不同处理,而不是用于描述对象的特定顺序。此外,本申请的描述中所提到的术语“包括”和“具有”以及它们的任何变形,意图在于覆盖不排他的包含。例如包含了一系列步骤或单元的过程、方法、系统、产品或设备没有限定于已列出的步骤或单元,而是可选地还包括其他没有列出的步骤或单元,或可选地还包括对于这些过程、方法、产品或设备固有的其它步骤或单元。需要说明的是,本申请实施例中,“示例性的”或者“例如”等词用于表示作例子、例证或说明。本申请实施例中被描述为“示例性的”或者“例如”的任何实施例或设计方案不应被解释为比其它实施例或设计方案更优选或更具优势。确切而言,使用“示例性的”或者“例如”等词旨在以具体方式呈现相关概念。
本申请实施例提供的终端可以是包含摄像功能的便携式电子设备,诸如手机、可穿戴设备、增强现实(Augmented Reality,AR)\虚拟现实(Virtual Reality,VR)设备、平板电脑、笔记本电脑、超级移动个人计算机(Ultra-Mobile personal computer, UMPC)、上网本、个人数字助理(Personal Digital Assistant,PDA)等,本申请实施例对此不作任何限制。便携式电子设备的示例性实施例包括但不限于搭载
Figure PCTCN2019111213-appb-000001
Figure PCTCN2019111213-appb-000002
或者其他操作系统的便携式电子设备。上述便携式电子设备也可以是其他便携式电子设备,诸如具有触敏表面(例如触控面板)的膝上型计算机(laptop)等。还应当理解的是,在本申请其他一些实施例中,上述电子设备也可以不是便携式电子设备,而是具有触敏表面(例如触控面板)的台式计算机等。
如图1和图2所示,本申请实施例中的终端可以为手机100。下面以手机100为例对实施例进行具体说明。
如图1所示,手机100具体可以包括:处理器101、射频(RF)电路102、存储器103、触摸屏104、蓝牙装置105、一个或多个传感器106、Wi-Fi装置107、定位装置108、音频电路109、外设接口110以及电源装置111等部件。这些部件可通过一根或多根通信总线或信号线(图1中未示出)进行通信。本领域技术人员可以理解,图1中示出的硬件结构并不构成对手机的限定,手机100可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置。
下面结合图1对手机100的各个部件进行具体的介绍:
处理器101是手机100的控制中心,利用各种接口和线路连接手机100的各个部分,通过运行或执行存储在存储器103内的应用程序(简称App),以及调用存储在存储器103内的数据,执行手机100的各种功能和处理数据。在一些实施例中,处理器101可包括一个或多个处理单元。示例性的,处理器101可用于控制调整摄像模组中镜头的焦距。关于处理器控制调整镜头焦距的具体描述,可参见后文。处理器101,还用于控制摄像模组中的镜头驱动装置,驱动镜头沿着光轴移动,并且调整内调焦镜头的焦距,以使得镜头对被摄物完成对焦。
射频电路102可用于在收发信息或通话过程中,无线信号的接收和发送。特别地,射频电路102可以将基站的下行数据接收后,给处理器101处理。另外,将涉及上行的数据发送给基站。通常,射频电路包括但不限于天线、至少一个放大器、收发信机、耦合器、低噪声放大器、双工器等。此外,射频电路102还可以通过无线通信和其他设备通信。无线通信可以使用任一通信标准或协议,包括但不限于全球移动通讯系统、通用分组无线服务、码分多址、宽带码分多址、长期演进、电子邮件、短消息服务等。
存储器103用于存储应用程序以及数据,处理器101通过运行存储在存储器103的应用程序以及数据,执行手机100的各种功能以及数据处理。存储器103主要包括存储程序区以及存储数据区。其中,存储程序区可存储操作系统、至少一个功能所需的应用程序(比如声音播放功能、图像播放功能等)。存储数据区可以存储根据使用手机100时所创建的数据(比如音频数据、电话本等)。此外,存储器103可以包括高速随机存取存储器,还可以包括非易失存储器,例如磁盘存储器件、闪存器件或其他易失性固态存储器件等。存储器103可以存储各种操作系统。例如,苹果公司所开发的iOS操作系统,谷歌公司所开发的Android操作系统等。
手机可以包括输入部件和输入部件。其中,输入部件可接收用户对手机的输入操作,比如,接收用户输入的语音操作,接收用户输入的触摸操作等。输出部件可将手机内部的数据处理结果输出给用户,比如,手机通过输出部件输出语音、输出界面等。示例性 的,输入部件和输出部件可集成在一起。比如,作为一种可能,触摸屏104中集成了输入部件触控板104-1和输出部件显示屏104-2。触摸屏104可以包括触控板104-1和显示屏104-2。其中,触控板104-1可作为输入部件,采集手机100的用户在其上或附近的触摸事件(比如用户使用手指、触控笔等任何适合的物体在触控板104-1上或在触控板104-1附近的操作),并将采集到的触摸信息发送给其他器件例如处理器101。
其中,用户在触控板104-1附近的触摸事件可以称之为悬浮触控。悬浮触控可以是指,用户无需为了选择、移动或拖动目标(例如图标等)而直接接触触控板,而只需用户位于终端附近以便执行所想要的功能。在悬浮触控的应用场景下,术语“触摸”、“接触”等不会暗示用于直接接触触摸屏,而是附近或接近的接触。
具体的,可以在触控板104-1内设置两种电容式传感器,即互电容传感器和自电容传感器,这两种电容传感器可以交替地阵列排布在触控板104-1上。其中,互电容传感器用于实现正常传统的多点触控,即检测用户接触触控板104-1时的手势。而自电容传感器能够产生比互电容更为强大的信号,从而检测到距离触控板104-1更远的手指感应。因此,当用户的手指在屏幕上悬停时,由于自电容传感器产生的信号要比互电容传感器产生的信号大,使得手机100可以检测到在屏幕上方,例如,距离触控板104-1上方20mm处用户的手势。
可选的,能够进行悬浮触控的触控板104-1可以采用电容式、红外光感以及超声波等实现。此外,可以采用电阻式、电容式、红外线以及表面声波等多种类型来实现触控板104-1。显示屏104-2可作为输出部件,用于显示由用户输入的信息或提供给用户的信息以及手机100的各种菜单。可以采用液晶显示器、有机发光二极管等形式来配置显示屏104-2。触控板104-1可以覆盖在显示屏104-2之上,当触控板104-1检测到在其上或附近的触摸事件后,传送给处理器101以确定触摸事件的类型,随后处理器101可以根据触摸事件的类型在显示屏104-2上提供相应的视觉输出。
虽然在图1中,触控板104-1与显示屏104-2是作为两个独立的部件来实现手机100的输入和输出功能,但是在某些实施例中,可以将触控板104-1与显示屏104-2集成而实现手机100的输入和输出功能。
可以理解的是,触摸屏104是由多层的材料堆叠而成,本申请实施例中只展示出了触控板(层)和显示屏(层),其他层在本申请实施例中不予记载。另外,在本发明其他一些实施例中,触控板104-1可以覆盖在显示屏104-2之上,并且触控板104-1的尺寸大于显示屏104-2的尺寸,使得显示屏104-2全部覆盖在触控板104-1下面,或者,上述触控板104-1可以以全面板的形式配置在手机100的正面,也即用户在手机100正面的触摸均能被手机感知,这样就可以实现手机正面的全触控体验。在其他一些实施例中,触控板104-1以全面板的形式配置在手机100的正面,显示屏104-2也可以以全面板的形式配置在手机100的正面,这样在手机的正面就能够实现无边框的结构。
示例性的,在本申请实施例中,诸如触控板104-1的输入部件,用于接收用户输入的拍摄指令,拍摄指令用于指示终端拍摄对焦后的图片。诸如显示屏104-2的输出部件,用于输出对焦后所拍摄的图片。比如,参见图15中(d),用户通过触摸点击触控板104-1选择拍照选项1505,以输入拍摄指令,进而,终端在对焦后拍摄图片,并由显示屏104-2输出终端对焦后所拍摄图片。
在本申请实施例中,手机100还可以具有指纹识别功能。例如,可以在手机100的背面配置指纹采集器件112,或者在手机100的正面(例如触摸屏104的下方)配置指纹采集器件112。又例如,可以在触摸屏104中配置指纹采集器件112来实现指纹识别功能,即指纹采集器件112可以与触摸屏104集成在一起来实现手机100的指纹识别功能。在这种情况下,该指纹采集器件112配置在触摸屏104中,可以是触摸屏104的一部分,也可以以其他方式配置在触摸屏104中。另外,该指纹采集器件112还可以被实现为全面板指纹采集器件。因此,可以把触摸屏104看成是任何位置都可以进行指纹识别的一个面板。该指纹采集器件112可以将采集到的指纹发送给处理器101,以便处理器101对该指纹进行处理(例如指纹验证等)。本申请实施例中的指纹采集器件112的主要部件是指纹传感器,该指纹传感器可以采用任何类型的感测技术,包括但不限于光学式、电容式、压电式或超声波传感技术等。
手机100还可以包括蓝牙装置105,用于实现手机100与其他短距离的终端(例如手机、智能手表等)之间的数据交换。本申请实施例中的蓝牙装置105可以是集成电路或者蓝牙芯片等。
手机100还可以包括至少一种传感器106,比如光传感器、运动传感器、图像传感器以及其他传感器。具体地,光传感器可包括环境光传感器及接近传感器。其中,环境光传感器可根据环境光线的明暗来调节触摸屏104的显示屏的亮度,接近传感器可在手机100移动到耳边时,关闭显示屏的电源。作为运动传感器的一种,加速计传感器可检测各个方向上(一般为三轴)加速度的大小,静止时可检测出重力的大小及方向,可用于识别手机姿态的应用(比如横竖屏切换、相关游戏、磁力计姿态校准)、振动识别相关功能(比如计步器、敲击)等。图像传感器可设置在摄像模组115中,用于将摄像模组115的拍摄的画面转换为电信号。示例性的,电荷耦合器件(Charged Coupled Device,CCD)图像传感器具有高解析度(High Resolution),即可感测以及识别精细物体,并且,具有较大感光面积(Large Field of View)。互补金属氧化物半导体(Complementary Metal-Oxide-Semiconductor,CMOS)图像传感器具有省电的特性,能够降低手机在拍摄静态照片或动态视频时的功耗。
此外,手机100还可配置的陀螺仪、气压计、湿度计、温度计、红外线传感器等其他传感器,在此不再赘述。
Wi-Fi装置107,用于为手机100提供遵循Wi-Fi相关标准协议的网络接入,手机100可以通过Wi-Fi装置107接入到Wi-Fi接入点,进而帮助用户收发电子邮件、浏览网页和访问流媒体等,它为用户提供了无线的宽带互联网访问。在其他一些实施例中,该Wi-Fi装置107也可以作为Wi-Fi无线接入点,可以为其他终端提供Wi-Fi网络接入。
定位装置108,用于为手机100提供地理位置。可以理解的是,该定位装置108具体可以是全球定位系统(GPS)或北斗卫星导航系统、俄罗斯GLONASS等定位系统的接收器。定位装置108在接收到上述定位系统发送的地理位置后,将该信息发送给处理器101进行处理,或者发送给存储器103进行保存。在另外的一些实施例中,该定位装置108还可以是辅助全球卫星定位系统(AGPS)的接收器,AGPS系统通过作为辅助定位服务器来协助定位装置108完成测距和定位服务,在这种情况下,辅助定位服务器通过无线通信网络与终端例如手机100的定位装置108(即GPS接收器)通信而提供定位协助。 在另外的一些实施例中,该定位装置108也可以是基于Wi-Fi接入点的定位技术。由于每一个Wi-Fi接入点都有一个全球唯一的媒体访问控制(Media Access Control,MAC)地址,终端在开启Wi-Fi的情况下即可扫描并收集周围的Wi-Fi接入点的广播信号,因此可以获取到Wi-Fi接入点广播出来的MAC地址。终端将这些能够标示Wi-Fi接入点的数据(例如MAC地址)通过无线通信网络发送给位置服务器,由位置服务器检索出每一个Wi-Fi接入点的地理位置,并结合Wi-Fi广播信号的强弱程度,计算出该终端的地理位置并发送到该终端的定位装置108中。
音频电路109、扬声器113、麦克风114可提供用户与手机100之间的音频接口。音频电路109可将接收到的音频数据转换后的电信号,传输到扬声器113,由扬声器113转换为声音信号输出。另一方面,麦克风114将收集的声音信号转换为电信号,由音频电路109接收后转换为音频数据,再将音频数据输出至RF电路102以发送给比如另一手机,或者将音频数据输出至存储器103,以便进一步处理。
外设接口110,用于为外部的输入/输出设备(例如键盘、鼠标、外接显示器、外部存储器、用户识别模块卡等)提供各种接口。例如通过通用串行总线(Universal Serial Bus,USB)接口与鼠标连接,通过用户识别模块卡卡槽上的金属触点与电信运营商提供的用户识别模块卡(Subscriber Identification Module,SIM)卡进行连接。外设接口110可以被用来将上述外部的输入/输出外围设备耦接到处理器101和存储器103。
手机100还可以包括给各个部件供电的电源装置111(比如电池和电源管理芯片),电池可以通过电源管理芯片与处理器101逻辑相连,从而通过电源装置111实现管理充电、放电、以及功耗管理等功能。
手机100还可以包括摄像模组115,摄像模组115可以是终端的摄像头。用于拍摄静态照片或者动态视频等。作为一种可能的实现方式,摄像模组115从物侧到像侧包括镜头、镜头驱动装置以及图像传感器。摄像模组115的详细说明可参见下文实施例。
尽管图1中并未示出,手机100还可以包括闪光灯、微型投影装置、近场通信(Near Field Communication,NFC)装置等,在此不再赘述。
如下对本申请实施例提供的终端进行详细说明。下文主要以终端为手机进行举例说明,在此统一说明,下文不再赘述。参见图2,以手机200为例,手机200中的摄像模组201可以为图2所示的后置摄像头,后置摄像头设置在手机背面的顶端。当然,摄像模组还可以设置在其他位置,比如设置在手机内部,当用户有拍摄需求时,弹出摄像模组,以进行拍摄。
参见图3,为本申请实施例终端中摄像模组的示例性结构。摄像模组201从物侧到像侧包括镜头301、镜头驱动装置302以及图像传感器303。需要说明的是,图3中各个部件仅仅是示例性的部件,部件的实际形状和大小、尺寸并不局限于图3中所列的情况。
其中,物侧指的是被拍摄物体(简称被摄物)的一侧,像侧指的是图像传感器成像的一侧。镜头驱动装置包括但不限于音圈马达、压电陶瓷、微机电系统(Micro-Electro-Mechanical System,MEMS)。图像传感器包括但不限于上述提及的CCD图像传感器、CMOS图像传感器。
镜头驱动装置,用于驱动镜头沿着光轴移动。镜头驱动装置的驱动行程与镜头的最 近对焦距离相关。在本申请实施例中,马达的驱动行程能够使得镜头的最近对焦距离的范围为1~5cm。
其中,对焦距离指的是物像之间的距离,即被摄物到镜头的距离与镜头到图像传感器的距离之和,也就是被摄物与图像传感器之间的距离。最近对焦距离指的是被摄物对焦的最近对焦距离。被摄物对焦指的是被摄物能够在图像传感器上成较为清晰的像,即,最近对焦距离为能够成较清晰像时,被摄物距离图像传感器的最近距离。
以镜头驱动装置为马达为例,当被摄物距离图像传感器的距离较近,比如为1cm时,马达驱动镜头沿着光轴移动一定的行程(比如移动400um),使得被摄物在距离图像传感器1cm时对焦。当被摄物距离图像传感器的距离比如为7cm时,马达驱动镜头沿着光轴移动一定的行程(比如移动50um),使得被摄物在距离图像传感器7cm时对焦。在本申请实施例中,马达的驱动行程能够使得镜头的最近对焦距离的范围为1~5cm。也就是,被摄物距离图像传感器的距离在1~5cm范围内时,被摄物能够对焦,即被摄物可在图像传感器上成较为清晰的像。
需要说明的是,上述镜头驱动装置主要用于沿光轴推动镜头,并将镜头沿光轴推到最佳的成像位置。当终端中设置不同的镜头时,镜头驱动装置的驱动行程可能有所不同。示例性的,终端设置有镜头1,镜头驱动装置的行程范围为0~400um,如此,在被摄物距离图像传感器1~5cm时,镜头驱动装置可将镜头沿光轴推到最佳成像位置。又例如,终端设置有镜头2,镜头驱动装置的驱动行程范围为0~300um,如此,在被摄物距离图像传感器1~5cm时,镜头驱动装置可将镜头沿光轴推到最佳成像位置。可见,针对不同的镜头,镜头驱动装置的驱动行程范围也可能有所不同。
可以理解的是,镜头驱动装置与镜头之间可以通过某种方式进行连接。以音圈马达为例,可选的,镜头与音圈马达之间可以采用如图4中(a)所示的螺牙互嵌式结构连接。具体的,这种结构主要依靠音圈马达2和镜头1之间的螺牙相互配合,形成初步结合力,之后从螺牙互嵌式结构的上端通过点胶固定,从而使镜头1的外表面和音圈马达2的内表面固定,使镜头1和音圈马达2结合在一起。或者,镜头1与音圈马达2之间可以采用如图4中(b)所示的无螺牙光面结构连接,关于无螺牙光面结构连接的具体方法可参见现有技术,这里不再赘述。当然,镜头与音圈马达之间还可以采用其他方式连接,本申请实施例对此不进行限制。并且,MEMS、压电陶瓷各自与镜头的连接关系也可参见现有技术的方式,本申请实施例对此不加以限制。
在本申请实施例中,为了支持被摄物在距离图像传感器1~5cm时能够成较清晰的像,可以使用如下3种镜头中的至少一种:
情况1:镜头为定焦的超广角镜头。
示例性的,超广角镜头的视场(Field of view,FOV)大于或等于100°,超广角镜头的等效焦距的取值范围为10mm~20mm。
需要说明的是,本申请实施例中,终端通过设置不同的镜头,可实现微距成像。其中,不同镜头的具体参数会有所不同。一般来说,当镜头的参数落在本申请实施例中提到的参数范围内时,终端可实现微距成像。比如,当超广角镜头的FOV为110°、等效焦距为15mm时,可通过调整超广角镜头的其他参数,比如曲率、折射率等实现终端的微距成像。本申请实施例中,微距成像指的是,被摄物在距离图像传感器1~5cm时能够 成较清晰的像,此处统一说明,下文不再赘述。
参见图5,为本申请实施例提供的一种示例性的超广角镜头的结构。该超广角镜头由6枚镜片构成。其中,从物侧到像侧的第一片镜片L1的光焦度为负,第二片镜片L2的光焦度为正。L1和L2之间设置有光阑STO。第三片镜片L3的光焦度为负,第四片镜片L4的光焦度为正,第五片镜片L5的光焦度为正,第六片镜片L6的光焦度为负。该超广角镜头的FOV数值可以为为100°,也可以为大于100°的某一数值,该超广角镜头的等效焦距可以是10~20mm中的某一数值。定义第一片镜片L1到图像传感器303的距离为总长度(Total Track Length,TTL),镜头的半像高为IH,IH/TTL的范围为0.5~0.6。当然,本申请实施例的超广角镜头还可以具有其他结构和其他数目的镜片,比如,超广角镜头由5枚镜片构成,且从物侧到像侧各个镜片的光焦度、曲率等可根据实际情况设定。或者,超广角镜头还可以采用现有结构,本申请实施例对超广角镜头的具体结构不进行限制。
在本申请实施例中,超广角镜头的等效焦距较短(10~20mm),因此,可获得更短的最近对焦距离,也就是,在镜头距离被摄物的距离较近时,也能够成功对焦,获得高质量、高清晰度的成像。
参见图7,图7中(a)为现有手机在微距(比如被摄物距离镜头5cm)时拍摄的图像,成像较为模糊。图7中(b)为本申请实施例中手机在微距时拍摄的图像,图7中(b)中,小虫和树叶的细节被拍摄的较为清晰。
此外,在镜头距离被摄物的距离较近时,由于对焦距离较近,所拍摄图像的景深较浅,使得所拍摄图像获得较好的背景虚化效果。
参见图8,为本申请实施例中手机所拍摄的画面,该画面具有较好的背景虚化效果。
可选的,镜头在中心视场的垂轴放大率的范围为0.03~0.43。镜头在边缘视场具有负畸变,且负畸变大于或等于-30%。
其中,垂轴放大率指的是垂直光轴方向的放大率,其数值为成像尺寸与物体实际尺寸沿垂直光轴方向的比值。边缘视场为0.8~1之间的视场。具体的,参见图6,将整个可视范围划分为N个部分,最大可视范围记为1,中心视场记为0,0.8~1为边缘视场,即α、β为边缘视场。负畸变指的是,镜头在边缘视场的垂轴放大率小于镜头在中心视场的垂轴放大率。如此,摄像模组在拍摄微缩景观时,边缘视场较低的放大率相当于拍摄宏观景观时因物距增大带来的放大率降低,使得摄像模组能够拍摄出透视效果较好的图像。
参见图9,为本申请实施例中手机所拍摄的画面,其中,微观景物(桌面上的几个小玩偶)与宏观景物(图9中的建筑物等)具有较好的透视效果,使得画面更加有立体感。
可选的,镜头中镜片数目的范围为5~8,图像传感器的靶面尺寸的范围为1/3.06~1/2.78。可选的,镜片的材质为塑胶或玻璃,或者为塑胶和玻璃的混合材质。可选的,镜头的光圈范围为F2.4~F1.8。
情况2:镜头为内调焦镜头。且该内调焦镜头包括沿光轴依次排列的n个镜片,n个镜片包括一个或者一个以上可移动镜片组。每一可移动镜片组包括一个或一个以上可移动镜片,可移动镜片指的是相对镜头沿光轴的位置可变的镜片,可移动镜片沿光轴的 位置与内调焦镜头的焦距相关。
情况2中,终端还包括镜片驱动装置,用于驱动内调焦镜头中的一个或一个以上可移动镜片组沿着光轴移动,以调整内调焦镜头的焦距。
可选的,镜片驱动装置可以为音圈马达、MEMS、压电陶瓷。
可选的,镜头驱动装置在驱动可移动镜片进行移动时,同一可移动镜片组中的可移动镜片之间沿光轴的相对位置不变,即镜头驱动装置将可移动镜片组作为一个整体,对可移动镜片组这一整体沿光轴进行移动。比如,镜头驱动装置驱动可移动镜片组中的第一镜片沿着光轴向物侧移动100um,相应的,驱动同一可移动镜片组中的第二镜片也沿着光轴向物侧移动100um。不同可移动镜片组之间的沿光轴的移动距离和移动方向可以不同。比如,图10中,驱动L2、L3沿光轴向物侧移动,且移动距离为距离1,驱动L4沿光轴向像侧移动,且移动距离为距离2。不同可移动镜片组之间沿光轴的移动距离和移动方向也可相同,本申请实施例对可移动镜片组的具体移动规律不做限定。
其中,可移动镜片可与镜片驱动装置采用某种方式进行连接,比如,可移动镜片可通过点胶方式与镜片驱动装置连接。当然,可移动镜片与镜片驱动装置的连接方式还可以参见现有技术的其他方式,本申请实施例对此不进行限制。示例性的,参见图10,为本申请实施例的一种示例性的内调焦镜头。其中,镜片驱动装置为马达,n取值为6,这6片镜片中,可移动镜片L2、可移动镜片L3构成可移动镜片组,L4为另一个可移动镜片组。其中,可移动镜片L2、L3与马达通过点胶相结合,可移动镜片L4与马达之间也通过点胶相结合。相应的,马达可驱动L2、L3、L4在沿着光轴的方向与内调焦镜头发生相对移动。
在本申请实施例中,通过镜片驱动装置的驱动,镜头中可移动镜片之间沿光轴的相对位置发生变化,也就是,镜头中镜片之间的间距发生了变化,由此,整个镜头的光学特性,比如焦距可能发生变化。即本申请实施例中,通过动态调整镜头中镜片之间的间距,可以调整镜头的焦距,进而使得终端在微距下能够成较为清晰的像。
需要说明的是,镜头驱动装置推动可移动镜片与上述提及的镜头驱动装置推动镜头,是不同的过程。镜头驱动装置推动镜头中可移动镜片沿光轴移动,其目的是通过改变镜头中镜片之间的间距,以调整镜头的焦距。镜头驱动装置推动镜头沿着光轴移动,其目的是通过镜头沿光轴的移动,调整物距和像距,以确定被摄物能够清晰成像时镜头的最佳位置。
其中,图10仅作为本申请实施例中内调焦镜头的一个示例,在实际使用中,镜头包括的镜片数目、具体哪一个或哪几个镜片为可移动镜片,可另行设定,本申请实施例不做限制。
情况3:镜头为内调焦镜头。参见图11,该内调焦镜头包括一个或一个以上光焦度可变的镜片(比如图11中的镜片L1、L4),光焦度可变的镜片的光焦度与镜头的焦距相关联。
其中,光焦度用于表征光学器件对入射平行光束的屈折本领。光焦度越大,平行光束屈折程度越显著。光焦度大于0时,屈折是会聚性的,光焦度小于0时,屈折是发散性的。
光焦度可变镜片在电场(例如变化的电流或电压)作用下的形状可变,光焦度可变 的镜片的形状与光焦度可变的镜片的光焦度相关,或者,光焦度可变的镜片在电场作用下的折射率可变,光焦度可变的镜片的折射率与光焦度可变的镜片的光焦度相关。
相应的,终端中的处理器,可通过控制光焦度可变镜片的形变或折射率,调整光焦度可变镜片的光焦度,以调整内调焦镜头的焦距。可选的,处理器,用于调整内调焦镜头的焦距,具体可以实现为:处理器控制输入到光焦度可变镜片的电流或电压,以改变光焦度可变镜片的折射率,以达到调整光焦度可变镜片的目的,从而调整内调焦镜头的焦距。或者,处理器,用于调整内调焦镜头的焦距,具体还可以实现为:处理器,用于控制光焦度可变镜片发生形变,以达到调整光焦度可变镜片的光焦度的目的,从而调整内调焦镜头的焦距。这里,处理器控制光焦度可变镜片发生形变,具体可以使处理器控制驱动装置,驱动装置推动、挤压镜片发生形变。
可选的,光焦度可变的镜片为电致材质的镜片或者可形变镜片。其中,该电致材质指的是在电场作用下折射率可变的材质。该可形变镜片可以在驱动装置的驱动下发生形变。驱动装置可以是马达、MEMS等。当然,光焦度可变的镜片的材质并不局限于上述两种,还可以为其他材质,本申请实施例对此不进行限制。
在本申请实施例中,可以通过为诸如L1、L4这类光焦度可变的镜片施加电场,改变镜片的光焦度,进而调整整个镜头的焦距,使得终端在微距时能够成较清晰的像。
需要说明的是,图3所示的摄像模组还可以包括其他部件。比如,在镜头301和图像传感器303之间设置红外截止滤光片304,用于滤除环境光中的近红外和紫外光波段。可选的,红外截止滤光片的厚度为0.11mm或0.21mm,红外截止滤光片的材料为树脂或蓝玻璃。当然,红外截止滤光片还可以为其他材料,和/或具有其他厚度的滤光片,本申请实施例对滤光片的材质和厚度不进行限制。
本申请实施例还提供一种微距成像方法,参见图12,该方法应用于上述情况1所示终端,终端设置有摄像模组、输入部件和输出部件,摄像模组从物侧至像侧设置有镜头、镜头驱动装置以及图像传感器,镜头为超广角镜头,该方法包括如下步骤:
S1201、输入部件接收用户输入的开启相机的操作,以开启相机。
示例性的,输入部件可为触控板104-1,参见图15中(a),用户触摸点击屏幕上显示的相机图标1501,触控板104-1采集用户输入的开启相机操作的信息,并将该信息传送给处理器做进一步处理,以开启相机。参见图15中(b)所示,为终端的相机界面1502。该界面可由终端的显示屏104-2显示给用户。
S1202、处理器检测被摄物与图像传感器之间的距离。
(可选的)S1203、终端检测到被摄物与图像传感器之间的距离处于微距范围内,输出部件输出第一界面1504,第一界面1504用于提示用户是否开启微距拍摄。
其中,微距范围指的是1~5cm。
可选的,终端处理器101使用激光测距方式,测量被摄物与图像传感器之间的距离。激光测距的具体原理和过程可参见现有技术,这里不再赘述。或者,处理器101采集图像传感器上的成像,当成像较为模糊时可初步判定被摄物与图像传感器之间距离较近。
可选的,处理器101将所测量距离反馈给镜头驱动装置。
参见图15,在开启终端的相机后,若终端检测到被摄物距离图像传感器的距离处于微距,如图15中(c)所示,终端输出部件,即显示屏104-2输出第一界面1504,以提 示用户是否开启微距拍摄,以获得较好的近距离拍摄成像质量。
S1204、输入部件接收用户输入的第一操作,第一操作用于指示终端开启微距拍摄。
如图15中(c)所示,显示屏104-2显示选项“是”、“否”,用户可通过输入部件输入第一操作,比如通过图1所示触控板104-1触摸选项“是”。可选的,触控板104-1将采集到的触摸信息(即用户点击选项“是”)发送给诸如处理器处理。
可选的,当用户通过诸如触控板104-1触摸选项“否”,终端可确定用户的真实拍摄意图并非微距拍摄,此时,终端可使用现有方法拍摄图片。
S1205、在处理器的控制下,镜头驱动装置驱动镜头沿着光轴移动,对被摄物进行对焦。
示例性的,终端可自动对被摄物进行对焦,即在接收到用户的第一操作,确定开启微距拍摄后,终端的处理器可控制镜头驱动装置,进而镜头驱动装置驱动超广角镜头沿着光轴移动,完成对焦过程。终端还可以接收用户在手机界面上输入的对焦操作,并根据该对焦操作进行调整超广角镜头沿着光轴的位置。比如,参见图15中(d),用户可通过触摸显示屏中显示的小虫来进行焦点选择操作,终端接收到用户输入之后,将小虫作为焦点,并调整超广角镜头沿着光轴的位置。
S1206、输入部件接收用户输入的拍摄指令,以指示终端拍摄对焦后的图片。
示例性的,参见图15中(d)所示,假设用户想在微距时拍摄静态照片,则用户通过触控板104-1点击拍照选项1505输入拍摄指令,假设用户想在微距时摄录动态视频,则用户通过触控板104-1点击摄录选项1506输入拍摄指令。
当然,参见图17中(a)所示,若终端已开启“声控拍照”,则用户可通过诸如麦克风的输入部件录入语音,以输入拍摄指令。或者,用户还可以通过其他输入部件,其他方式输入拍摄指令,本申请实施例在此不再赘述。
S1207、输出部件输出对焦后拍摄的图片。
参见图15中(d),用户可通过触控板104-1点击拍照选项1505,触发终端在微距下拍摄图片,并由输出部件,比如显示屏104-2输出微距下拍摄的图片。
当然,在终端检测到被摄物距离图像传感器的距离处于微距范围内时,还可以不输出图15中(c)所示的第一界面1504,而是自动开启微距拍摄模式,并通过镜头驱动装置驱动超广角镜头,完成对焦,之后,拍摄并输出对焦后的图片,也就是说,图12中,S1203和S1204为可选步骤。
本申请实施例提供的微距成像方法,终端可检测被摄物与图像传感器之间的距离是否满足微距,在满足微距时,终端中的镜头驱动装置推动镜头沿着光轴移动,以完成对焦,进而在微距下能够拍摄较为清晰的成像。
本申请实施例还提供一种微距成像方法,该方法应用于上述情况2所示的终端。终端设置有摄像模组、输入部件、输出部件,摄像模组从物侧至像侧设置有内调焦镜头、镜头驱动装置以及图像传感器,内调焦镜头包括沿光轴依次排列的n个镜片,n个镜片包括一个或者一个以上可移动镜片组。每一可移动镜片组包括一个或一个以上可移动镜片,可移动镜片为相对镜头沿光轴的位置可变的镜片,可移动镜片沿光轴的位置与镜头的焦距相关。参见图13,该方法包括步骤S1201至S1204、S1301、S1302、S1206和S1207:
其中,S1201至S1204的描述可参见上文,这里不再赘述。
S1301、在处理器的控制下,镜片驱动装置驱动内调焦镜头中的一个或一个以上可移动镜片组沿着光轴移动,以调整内调焦镜头的焦距。
以镜头驱动装置为马达为例,参见图10,马达可驱动L2、L3构成的可移动镜片组沿着光轴向物侧移动,从而调整镜头的焦距。
S1302、在处理器的控制下,镜头驱动装置驱动内调焦镜头沿着光轴移动,以对被摄物进行对焦。
S1206和S1207的描述可参见上文,这里不再赘述。
本申请实施例提供的微距成像方法,当终端检测到被摄物与图像传感器之间的距离满足微距时,终端中的镜头驱动装置可驱动一个或一个以上的可移动镜片沿着光轴移动,以动态调整镜头的焦距,并且,能够在微距时通过镜头驱动装置推动镜头沿着光轴移动,完成对焦,使得在微距时也能够成像清晰。
本申请实施例还提供一种微距成像方法,该方法应用于上述情况3所示终端。终端设置有摄像模组、输入部件、输出部件以及处理器,摄像模组从物侧至像侧设置有镜头、镜头驱动装置以及图像传感器,镜头为内调焦镜头,内调焦镜头包括一个或一个以上光焦度可变的镜片,光焦度可变的镜片的光焦度与内调焦镜头的焦距相关联。参见图14,该方法包括步骤S1201至S1204、S1401、S1402、S1206和S1207:
其中,S1201至S1204的描述可参见上文,这里不再赘述。
S1401、处理器控制调整内调焦镜头中一个或一个以上光焦度可变的镜片的光焦度,以调整镜头的焦距。
可选的,终端通过处理器控制输入到光焦度可变镜片的电流或电压,以调整光焦度可变镜片的光焦度。或者,终端通过处理器控制光焦度可变镜片发生形变,以调整光焦度可变镜片的光焦度。当然,终端处理器还可以通过其他方式控制光焦度可变镜片的光焦度发生变化,以调整内调焦镜头的焦距。
S1402、处理器控制镜头驱动装置,以使得镜头驱动装置驱动内调焦镜头沿着光轴移动,以对被摄物进行对焦。
S1206和S1207的描述可参见上文,这里不再赘述。
本申请实施例提供的微距成像方法,当终端检测到被摄物与图像传感器之间的距离满足微距时,终端可通过控制镜片的形变或者折射率来改变镜片的光焦度,进而调整镜头的焦距,并且,可通过镜头驱动装置完成对焦,从而在微距时能够获得高质量的成像。
参见图16,在本申请的另一些实施例中,在开启终端的相机之后,如图16中(a)所示,用户可通过点击模式选项1503,触发终端跳转至图16中(b)所示的模式选择界面1601,之后,在模式选择界面1601,用户可点击微距拍摄选项1602,触发终端执行微距成像。可选的,具有上述情况1所示结构的终端在检测到用户点击微距拍摄选项1602后,可执行上述S1205至S1207。具有上述情况2所示结构的终端在检测到用户点击微距拍摄选项1602后,可执行上述S1301、S1302、S1206、S1207。具有诸如上述情况3所示结构的终端在检测到用户点击微距拍摄选项1602后,可执行上述S1401、S1402、S1206、S1207。
当然,终端还可通过其他方式跳转至模式选择界面1601,比如,接收到用户在1502 相机界面中的左滑操作时,跳转至模式选择界面1601,本申请实施例对进入模式选择界面的方式不进行限定。
可选的,在有些场景中,用户可能并不了解微距拍摄的实际拍摄效果,这种情况下,终端可向用户提示微距拍摄的效果,或者其他微距拍摄的信息。如图16(b)所示,用户选择开启微距拍摄,此时,终端可输出界面提示,比如,弹出提示框“微距拍摄可支持被摄物距离图像传感器1~5cm时的清晰成像”,并且,提示框可设置选项“是”、“否”,当用户触摸“是”,终端可确定用户拍摄的真实意图是微距拍摄,由此,终端执行上述微距拍摄方法。
此外,在本申请的另一些实施例中,用户还可以预先设置终端的微距拍摄功能。比如,如图17中(a)所示,在设置界面1701,用户可通过点击微距拍摄开启选项1702开启微距拍摄功能。可选的,在终端开启微距拍摄功能后,终端可以执行上述微距成像方法,在终端未开启微距拍摄功能时,终端不具备执行上述微距成像方法的权限。当用户想实现微距拍摄时,若图17中(a)所示的1702为关闭,则终端可输出提示界面,提示用户开启微距拍摄功能,使得终端在微距时可成清晰的像。
需要说明的是,终端可通过多种方式进入设置界面1701,比如,终端接收到用户在相机界面1502的右滑操作时,可跳转至设置界面1502。本申请实施例对终端进入设置界面的方式不进行限制。进而,终端可保存用户在设置界面进行的设置,后续,当用户开启相机时,若终端检测到被摄物距离图像传感器较近,则终端可执行上述的微距成像方法,实现微距下的清晰成像。
以上,仅为本发明的具体实施方式,但本发明的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本发明揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本发明的保护范围之内。因此,本发明的保护范围应以权利要求的保护范围为准。

Claims (16)

  1. 一种终端,其特征在于,所述终端包括摄像模组、输入部件、输出部件和处理器,所述摄像模组从物侧到像侧包括镜头、镜头驱动装置以及图像传感器;
    所述镜头,用于支持被摄物与图像传感器之间的距离处于微距范围内时清晰成像;
    所述镜头驱动装置,用于当被摄物与所述图像传感器之间的距离处于微距范围内时,驱动所述镜头沿着光轴移动,其中,所述镜头驱动装置的驱动行程与所述终端的最近对焦距离相关;
    所述处理器,用于控制所述镜头驱动装置,以使得所述镜头对所述被摄物完成对焦;
    所述输入部件,用于接收用户输入的拍摄指令,所述拍摄指令用于拍摄对焦后的图片;
    所述输出部件,用于输出拍摄的图片。
  2. 根据权利要求1所述的终端,其特征在于,所述微距范围为1~5cm。
  3. 根据权利要求1或2所述的终端,其特征在于,所述镜头为超广角镜头,所述超广角镜头的视场FOV大于或等于100°,所述超广角镜头的等效焦距的取值范围为10mm~20mm。
  4. 根据权利要求1至3任一项所述的终端,其特征在于,所述超广角镜头在边缘视场具有负畸变,且负畸变大于或等于-30%,所述超广角镜头在中心视场的垂轴放大率的范围为0.03~0.43。
  5. 根据权利要求1至4中任一项所述的终端,其特征在于,所述超广角镜头中镜片数目的范围为5~8,所述图像传感器的靶面尺寸的范围为1/3.06~1/2.78。
  6. 根据权利要求1或2所述的终端,其特征在于,所述镜头为内调焦镜头;
    所述处理器,还用于调整所述内调焦镜头的焦距。
  7. 根据权利要求6所述的终端,其特征在于,所述内调焦镜头包括一个或一个以上光焦度可变的镜片,光焦度可变的镜片的光焦度与所述内调焦镜头的焦距相关联;
    所述处理器,用于调整所述内调焦镜头的焦距,包括:用于调整一个或一个以上光焦度可变镜片的光焦度,以调整所述内调焦镜头的焦距。
  8. 根据权利要求7所述的终端,其特征在于,所述光焦度可变镜片的折射率与所述光焦度可变镜片的光焦度相关;
    所述处理器,用于调整一个或一个以上光焦度可变镜片的光焦度,包括:用于控制输入到光焦度可变镜片的电流或电压,以改变光焦度可变镜片的折射率,以调整所述光焦度可变镜片的光焦度。
  9. 根据权利要求7所述的终端,其特征在于,所述光焦度可变镜片的形状与所述光焦度可变镜片的光焦度相关;
    所述处理器,用于调整一个或一个以上光焦度可变镜片的光焦度,包括:用于控制光焦度可变镜片发生形变,以调整所述光焦度可变镜片的光焦度。
  10. 根据权利要求7至9任一项所述的终端,其特征在于,光焦度可变的镜片为电致材质的镜片或者可形变镜片。
  11. 根据权利要求6所述的终端,其特征在于,所述终端还包括镜片驱动装置,
    所述内调焦镜头包括沿光轴依次排列的n个镜片,所述n个镜片包括一个或者一个以上可移动镜片组,每一可移动镜片组包括一个或一个以上可移动镜片,可移动镜片为相对镜头沿光轴的位置可变的镜片,可移动镜片沿光轴的相对位置与所述内调焦镜头的焦距相关;
    所述镜片驱动装置,用于驱动所述内调焦镜头中的一个或一个以上可移动镜片组沿着光轴移动,以调整所述内调焦镜头的焦距。
  12. 一种微距成像方法,其特征在于,应用于终端,所述终端包括摄像模组、输入部件、输出部件和处理器,所述摄像模组从物侧到像侧包括镜头、镜头驱动装置以及图像传感器;其中,所述镜头支持被摄物与图像传感器之间的距离处于微距范围内时清晰成像,所述方法包括:
    若检测到被摄物与图像传感器之间的距离处于微距范围内,所述处理器控制所述镜头驱动装置,以驱动所述镜头沿着光轴移动,以使得所述镜头对所述被摄物完成对焦;
    所述输入部件接收用户输入的拍摄指令,所述拍摄指令用于拍摄对焦后的图片;
    所述输出部件输出拍摄的图片。
  13. 根据权利要求12所述的微距成像方法,其特征在于,在所述终端检测到被摄物与所述图像传感器之间的距离处于微距范围内之后,所述方法还包括:
    所述输出部件输出第一界面,所述第一界面用于提示用户是否开启微距拍摄。
  14. 根据权利要求12或13所述的微距成像方法,其特征在于,所述微距范围为1~5cm。
  15. 根据权利要求12-14任一项所述的微距成像方法,其特征在于,所述镜头为超广角镜头,所述超广角镜头的视场FOV大于或等于100°,所述超广角镜头的等效焦距的取值范围为10mm~20mm。
  16. 根据权利要求12至14任一项所述的微距成像方法,其特征在于,所述镜头为内调焦镜头;
    所述处理器控制所述镜头驱动装置,驱动所述镜头沿着光轴移动,以使得所述镜头对所述被摄物完成对焦,包括:所述处理器控制所述镜头驱动装置,以驱动所述内调焦镜头沿着光轴移动,并且控制调整所述内调焦镜头的焦距,以使得所述内调焦镜头对所述被摄物完成对焦。
PCT/CN2019/111213 2018-10-16 2019-10-15 微距成像的方法及终端 WO2020078346A1 (zh)

Priority Applications (8)

Application Number Priority Date Filing Date Title
US17/286,378 US11405538B2 (en) 2018-10-16 2019-10-15 Macro imaging method and terminal
RU2021113634A RU2762887C1 (ru) 2018-10-16 2019-10-15 Оконечное устройство и способ макросъемки
EP19874716.4A EP3846432B1 (en) 2018-10-16 2019-10-15 Macro imaging method and terminal
KR1020217011330A KR102324921B1 (ko) 2018-10-16 2019-10-15 매크로 이미징 방법 및 단말기
ES19874716T ES2961008T3 (es) 2018-10-16 2019-10-15 Método y terminal de imagen macro
JP2021520932A JP2022511621A (ja) 2018-10-16 2019-10-15 マクロイメージング方法および端末
US17/702,491 US11683574B2 (en) 2018-10-16 2022-03-23 Macro imaging method and terminal
JP2022193051A JP2023029944A (ja) 2018-10-16 2022-12-01 マクロイメージング方法および端末

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811206371.XA CN109788089B (zh) 2018-10-16 2018-10-16 微距成像的方法及终端
CN201811206371.X 2018-10-16

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US17/286,378 A-371-Of-International US11405538B2 (en) 2018-10-16 2019-10-15 Macro imaging method and terminal
US17/702,491 Continuation US11683574B2 (en) 2018-10-16 2022-03-23 Macro imaging method and terminal

Publications (1)

Publication Number Publication Date
WO2020078346A1 true WO2020078346A1 (zh) 2020-04-23

Family

ID=66496327

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/111213 WO2020078346A1 (zh) 2018-10-16 2019-10-15 微距成像的方法及终端

Country Status (8)

Country Link
US (2) US11405538B2 (zh)
EP (1) EP3846432B1 (zh)
JP (2) JP2022511621A (zh)
KR (1) KR102324921B1 (zh)
CN (3) CN113472976B (zh)
ES (1) ES2961008T3 (zh)
RU (1) RU2762887C1 (zh)
WO (1) WO2020078346A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7387827B2 (ja) 2021-06-15 2023-11-28 丈二 青沼 観察及び誘導用カメラシステム
US11860517B2 (en) 2021-01-07 2024-01-02 Joji AONUMA Versatile camera device mountable to pole

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020072267A1 (en) 2018-10-05 2020-04-09 Google Llc Scale-down capture preview for a panorama capture user interface
CN113472976B (zh) * 2018-10-16 2022-11-25 华为技术有限公司 微距成像的方法及终端
CN110351405A (zh) * 2019-07-03 2019-10-18 肯维捷斯(武汉)科技有限公司 一种具有微观成像功能的移动通讯设备
CN110515258A (zh) * 2019-08-05 2019-11-29 肯维捷斯(武汉)科技有限公司 一种近摄照明器及包含该照明器的成像设备
US10984513B1 (en) 2019-09-30 2021-04-20 Google Llc Automatic generation of all-in-focus images with a mobile camera
CN110661950A (zh) * 2019-09-30 2020-01-07 联想(北京)有限公司 摄像头模组以及电子设备
US11032486B2 (en) 2019-10-11 2021-06-08 Google Llc Reducing a flicker effect of multiple light sources in an image
CN111127699A (zh) * 2019-11-25 2020-05-08 爱驰汽车有限公司 汽车缺陷数据自动录入方法、系统、设备及介质
CN110769144A (zh) * 2019-11-27 2020-02-07 Oppo广东移动通信有限公司 成像装置及移动终端
CN111092977B (zh) * 2019-12-31 2021-05-18 维沃移动通信有限公司 电子设备
CN111294511B (zh) * 2020-02-06 2021-12-14 北京小米移动软件有限公司 相机模组的对焦方法及装置、存储介质
US11190689B1 (en) 2020-07-29 2021-11-30 Google Llc Multi-camera video stabilization
CN111757102B (zh) * 2020-08-03 2022-03-15 Oppo广东移动通信有限公司 超微距摄像头清晰度检测装置及方法
CN112422787B (zh) * 2020-10-28 2022-04-22 维沃移动通信有限公司 摄像模组及电子设备
US20220345591A1 (en) * 2021-04-22 2022-10-27 David Shau Underwater Camera Operations
CN113676642A (zh) * 2021-08-17 2021-11-19 Oppo广东移动通信有限公司 摄像头组件及其控制方法、电子设备
CN113866945A (zh) * 2021-09-30 2021-12-31 玉晶光电(厦门)有限公司 光学成像镜头
CN114633692B (zh) * 2022-03-14 2023-10-03 深圳市艾为智能有限公司 一种偏心镜头在cms系统中的应用方法
CN114640768A (zh) * 2022-03-15 2022-06-17 Oppo广东移动通信有限公司 拍摄组件、电子设备和控制方法
CN115086518A (zh) * 2022-06-06 2022-09-20 Oppo广东移动通信有限公司 摄像头和电子装置

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040179687A1 (en) * 2003-03-14 2004-09-16 Cheng-Shing Lai Method for transmitting copyrighted electronic documents in a wireless communication system
CN102547084A (zh) * 2011-09-22 2012-07-04 上海兴禄科技实业有限公司 一种既能实现远距拍摄又能实现超微距拍摄的摄像机
CN106254768A (zh) * 2016-07-29 2016-12-21 广东欧珀移动通信有限公司 微距拍摄处理方法、装置和终端设备
CN109788089A (zh) * 2018-10-16 2019-05-21 华为技术有限公司 微距成像的方法及终端

Family Cites Families (56)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6545818B2 (en) * 1999-05-10 2003-04-08 Canon Kabushiki Kaisha Zoom lens and camera system
JP4478247B2 (ja) * 1999-07-06 2010-06-09 キヤノン株式会社 ズームレンズ
US7598997B2 (en) 2004-01-14 2009-10-06 Ricoh Company, Ltd. Imaging apparatus and focus control method based on a number of automatic focus scan stages, and recording medium storing a program for executing such a method
JP2006014054A (ja) * 2004-06-28 2006-01-12 Nec Access Technica Ltd カメラ付携帯電子機器による接写防止方法及びカメラ付携帯電子機器
JP2006042028A (ja) * 2004-07-28 2006-02-09 Canon Inc 画像処理装置、画像処理装置の制御方法、制御プログラム及び記憶媒体
JP2006217131A (ja) 2005-02-02 2006-08-17 Matsushita Electric Ind Co Ltd 撮像装置
JP2006243092A (ja) 2005-03-01 2006-09-14 Konica Minolta Opto Inc 広角レンズ
US7403341B2 (en) * 2005-06-16 2008-07-22 Olympus Imaging Corp. Zoom lens system and electronic image pickup apparatus using the same
CN100406946C (zh) * 2005-08-04 2008-07-30 亚洲光学股份有限公司 微距摄像镜头
CN101009773B (zh) 2006-01-24 2010-10-13 光宝科技股份有限公司 数字相机模块
JP2008015274A (ja) * 2006-07-06 2008-01-24 Olympus Imaging Corp デジタルカメラ
JP5607398B2 (ja) * 2009-04-07 2014-10-15 富士フイルム株式会社 撮像レンズおよび撮像装置、ならびに携帯端末機器
CN201434931Y (zh) * 2009-06-09 2010-03-31 宁波舜宇车载光学技术有限公司 超广角百万像素车载镜头
CN101701793B (zh) * 2009-10-29 2011-11-02 天津三星光电子有限公司 利用数码相机测定物体与拍摄相机之间距离的方法
KR20110064156A (ko) * 2009-12-07 2011-06-15 삼성전자주식회사 촬상 장치 및 그 제조방법
JP2011123334A (ja) * 2009-12-11 2011-06-23 Canon Inc リアアタッチメントレンズ及びそれを有する撮影光学系
JP5532456B2 (ja) * 2010-05-24 2014-06-25 株式会社ニコン 望遠鏡光学系及びこれを備える光学装置
EP2397880B1 (en) 2010-06-16 2017-04-12 Ricoh Company, Ltd. Image-forming lens, and camera device and portable information terminal device with the image-forming lens
JP2012173435A (ja) * 2011-02-18 2012-09-10 Tamron Co Ltd 固定焦点レンズ
CN102778745B (zh) * 2012-08-24 2015-01-07 江西联创电子股份有限公司 高像素鱼眼镜头的透镜成像系统
JP2015022145A (ja) * 2013-07-19 2015-02-02 富士フイルム株式会社 撮像レンズおよび撮像レンズを備えた撮像装置
CN103543515B (zh) * 2013-09-26 2015-09-30 宁波舜宇红外技术有限公司 一种新型长波红外广角镜头
CN103576296B (zh) * 2013-10-30 2015-10-28 浙江舜宇光学有限公司 一种摄像镜头
JP6222564B2 (ja) * 2013-12-27 2017-11-01 コニカミノルタ株式会社 撮像レンズ、レンズユニット、撮像装置、デジタルスチルカメラ及び携帯端末
JP6489634B2 (ja) * 2014-11-18 2019-03-27 オリンパス株式会社 インナーフォーカスマクロレンズ及びそれを用いた撮像装置
JP2016099550A (ja) * 2014-11-25 2016-05-30 富士フイルム株式会社 撮像レンズおよび撮像レンズを備えた撮像装置
CN104639831B (zh) * 2015-01-05 2018-12-11 信利光电股份有限公司 一种照相机及拓展景深的方法
JP6034917B2 (ja) * 2015-04-23 2016-11-30 オリンパス株式会社 リアフォーカスレンズ系及びそれを備えた撮像装置
KR101748260B1 (ko) * 2015-04-23 2017-06-16 엘지전자 주식회사 카메라 모듈
CN204613498U (zh) * 2015-05-04 2015-09-02 嘉兴中润光学科技有限公司 小型广角镜头
US20170038504A1 (en) * 2015-08-05 2017-02-09 Lustrous Electro-Optic Co., Ltd. Close-up shots system and close-up shots module
RU2607842C1 (ru) * 2015-08-28 2017-01-20 Публичное акционерное общество "Красногорский завод им. С.А. Зверева" Макрообъектив с переменным увеличением
CN108139572B (zh) 2015-09-30 2021-03-12 株式会社尼康 变焦镜头以及光学设备
CN105204140B (zh) * 2015-10-28 2017-07-28 东莞市宇瞳光学科技股份有限公司 一种定焦镜头
JP2017102354A (ja) 2015-12-04 2017-06-08 キヤノン株式会社 撮影レンズ及びそれを有する撮像装置
KR101834728B1 (ko) * 2016-01-28 2018-03-06 주식회사 코렌 촬영 렌즈 광학계
CN105759406B (zh) * 2016-04-01 2018-09-21 浙江舜宇光学有限公司 摄像镜头
CN106405788A (zh) * 2016-05-31 2017-02-15 信华精机有限公司 一种高清超广角镜头
KR20180005464A (ko) * 2016-07-06 2018-01-16 삼성전자주식회사 옵티칼 렌즈 어셈블리 및 이를 포함한 전자 장치
CN106019540B (zh) * 2016-07-27 2018-10-09 广东弘景光电科技股份有限公司 高像素超广角光学系统及其应用的镜头
CN106657770B (zh) 2016-11-04 2019-11-29 上海斐讯数据通信技术有限公司 移动终端自动对焦镜头微距值的计算方法和系统
CN206321849U (zh) 2016-12-07 2017-07-11 深圳市爱科环球实业有限公司 使手机拍摄出广角效果图像加微距效果图像的外接镜头
KR20180068585A (ko) * 2016-12-14 2018-06-22 삼성전자주식회사 옵티칼 렌즈 어셈블리 및 이를 이용한 이미지 형성 방법
US10921570B2 (en) * 2017-04-17 2021-02-16 Zhejiang Sunny Optical Co., Ltd Camera lens assembly
CN106980170B (zh) * 2017-05-04 2022-05-27 威海嘉瑞光电科技股份有限公司 一种超广角高清航拍仪用光学镜头
CN206818961U (zh) * 2017-06-02 2017-12-29 谢莉 一种广角加微距手机外置镜头
CN107121757B (zh) * 2017-06-28 2023-03-21 广东思锐光学股份有限公司 一种超广角镜头
CN107390352A (zh) * 2017-08-23 2017-11-24 速讯光学科技(苏州)有限公司 一种大光圈超广角镜头及透镜注塑工艺
CN107300756B (zh) * 2017-08-23 2019-12-10 浙江舜宇光学有限公司 摄像镜头
CN107577032B (zh) * 2017-09-19 2024-01-26 舜宇光学(中山)有限公司 低畸变广角镜头
CN207502801U (zh) * 2017-09-19 2018-06-15 舜宇光学(中山)有限公司 低畸变广角镜头
CN107613209A (zh) 2017-09-29 2018-01-19 努比亚技术有限公司 一种图像采集方法、终端和计算机可读存储介质
CN107835344B (zh) * 2017-11-27 2020-08-07 信利光电股份有限公司 一种超微距摄像设备及一种超微距摄像系统
KR101892897B1 (ko) * 2018-04-09 2018-08-28 삼성전기주식회사 카메라용 광학계
WO2019205944A1 (zh) * 2018-04-28 2019-10-31 宁波舜宇车载光学技术有限公司 光学镜头及成像设备
CN109613685A (zh) * 2019-02-19 2019-04-12 浙江舜宇光学有限公司 摄像镜头组

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040179687A1 (en) * 2003-03-14 2004-09-16 Cheng-Shing Lai Method for transmitting copyrighted electronic documents in a wireless communication system
CN102547084A (zh) * 2011-09-22 2012-07-04 上海兴禄科技实业有限公司 一种既能实现远距拍摄又能实现超微距拍摄的摄像机
CN106254768A (zh) * 2016-07-29 2016-12-21 广东欧珀移动通信有限公司 微距拍摄处理方法、装置和终端设备
CN109788089A (zh) * 2018-10-16 2019-05-21 华为技术有限公司 微距成像的方法及终端

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
BAI, YUFENG: "Macro Lens", PHOTOGRAPHY KNOWLEDGE, 31 July 2006 (2006-07-31), pages 13, XP009526888, ISBN: 978-7-204-08445-6 *
CAI, LIN: "Internal Focusing", ENCYCLOPAEDIA OF PHOTOGRAPHY, 30 September 1994 (1994-09-30), pages 184, XP009526949, ISBN: 7-5364-2761-1 *
CHANG, JUN ET AL.: "Deformable Mirror", MODERN REFLECTIVE ZOOM OPTICAL SYSTEM, 30 September 2017 (2017-09-30), pages 101 - 102, XP009526951, ISBN: 978-7-118-10930-6 *
LEI JIAN: "Comprehensive Handbook of Practical Use of Digital Single Lens Reflex", 29 February 2012, CHINA ELECTRIC POWER PRESS, CN, ISBN: 978-7-5123-2417-6, article LEI JIAN: "Chapter 1: The Focal Length and Angle of View of Lens", pages: 2 - 12, XP009526946 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11860517B2 (en) 2021-01-07 2024-01-02 Joji AONUMA Versatile camera device mountable to pole
JP7387827B2 (ja) 2021-06-15 2023-11-28 丈二 青沼 観察及び誘導用カメラシステム

Also Published As

Publication number Publication date
KR20210060560A (ko) 2021-05-26
CN113472977B (zh) 2024-05-14
US20210329150A1 (en) 2021-10-21
CN113472976A (zh) 2021-10-01
JP2023029944A (ja) 2023-03-07
KR102324921B1 (ko) 2021-11-10
CN113472977A (zh) 2021-10-01
US11683574B2 (en) 2023-06-20
EP3846432B1 (en) 2023-08-30
US20220217255A1 (en) 2022-07-07
EP3846432A4 (en) 2021-11-17
JP2022511621A (ja) 2022-02-01
RU2762887C1 (ru) 2021-12-23
ES2961008T3 (es) 2024-03-07
CN109788089A (zh) 2019-05-21
EP3846432A1 (en) 2021-07-07
CN109788089B (zh) 2021-05-18
CN113472976B (zh) 2022-11-25
US11405538B2 (en) 2022-08-02

Similar Documents

Publication Publication Date Title
WO2020078346A1 (zh) 微距成像的方法及终端
US10511758B2 (en) Image capturing apparatus with autofocus and method of operating the same
US10015383B2 (en) Mobile terminal
WO2021073346A1 (zh) 摄像头模组和终端设备
US9874998B2 (en) Mobile terminal and controlling method thereof
WO2019033411A1 (zh) 一种全景拍摄方法及装置
EP3154255B1 (en) Imaging device and video generation method
US9470875B2 (en) Image pickup device
CN107637063B (zh) 用于基于用户的手势控制功能的方法和拍摄装置
JP6403368B2 (ja) 携帯端末、画像検索プログラムおよび画像検索方法
WO2021057529A1 (zh) 摄像模组及终端设备
CN114174887A (zh) 包括两个光折叠元件的相机
CN111385525B (zh) 视频监控方法、装置、终端及系统
US20200169660A1 (en) Mobile terminal and control method therefor
US9467550B2 (en) Communication apparatus, external apparatus, and control method therefor
CN210297875U (zh) 用于移动终端的摄像装置及移动终端
JP2015192362A (ja) 画像再生装置,画像再生方法およびその制御プログラム
CN115048572A (zh) 数据处理方法、装置、服务器及存储介质
JP2015065616A (ja) 撮像装置、触覚情報出力プログラムおよび触覚情報出力方法
JP2015088774A (ja) カメラ装置、画像処理プログラムおよび画像処理方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19874716

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021520932

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2019874716

Country of ref document: EP

Effective date: 20210401

Ref document number: 20217011330

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE