CN113056692B - Lens assembly and electronic device comprising same


Info

Publication number
CN113056692B
Authority
CN
China
Prior art keywords
lens
lens assembly
electronic device
information
image
Prior art date
Legal status
Active
Application number
CN201980073299.4A
Other languages
Chinese (zh)
Other versions
CN113056692A (en)
Inventor
金东佑
金昶翰
Current Assignee
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd
Publication of CN113056692A
Application granted
Publication of CN113056692B

Classifications

    • G02B13/004: Miniaturised objectives for electronic devices (e.g., portable telephones, webcams, PDAs, small digital cameras) characterised by the lens design, having at least one aspherical surface and four lenses
    • G02B13/0045: Miniaturised objectives for electronic devices characterised by the lens design, having at least one aspherical surface and five or more lenses
    • G02B13/008: Miniaturised objectives for electronic devices designed for infrared light
    • G02B13/14: Optical objectives for use with infrared or ultraviolet radiation
    • G02B5/208: Filters for use with infrared or ultraviolet radiation, e.g., for separating visible light from infrared and/or ultraviolet radiation
    • G02B9/34: Optical objectives characterised by the number of components and their signs, having four components only
    • G02B9/60: Optical objectives characterised by the number of components and their signs, having five components only
    • G01C3/08: Optical rangefinders using electric radiation detectors to obtain the final indication
    • G01S17/894: 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g., time-of-flight cameras or flash lidar
    • G01S7/4816: Constructional features of lidar receivers alone, e.g., arrangements of optical elements
    • G06T7/586: Depth or shape recovery from multiple images from multiple light sources, e.g., photometric stereo
    • G06T2207/10048: Image acquisition modality: infrared image
    • G06T2207/30196: Subject of image: human being; person
    • G06V10/147: Image acquisition: details of sensors, e.g., sensor lenses
    • G06V20/10: Scenes; scene-specific elements: terrestrial scenes
    • G06V40/10: Recognition of human or animal bodies (e.g., vehicle occupants or pedestrians) or body parts (e.g., hands)
    • H04N23/55: Optical parts specially adapted for electronic image sensors; mounting thereof
    • H04N23/56: Cameras or camera modules provided with illuminating means
    • H04N23/74: Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • H04N23/80: Camera processing pipelines; components thereof
    • H04N23/90: Arrangement of cameras or camera modules, e.g., multiple cameras in TV studios or sports stadiums

Abstract

According to one embodiment, a lens assembly may include at least four lenses sequentially arranged along an optical axis from a subject toward an image sensor. Among the at least four lenses, a first lens disposed closest to the subject may have a visible light transmittance ranging from 0% to 5%, and, among the subject-side surfaces and the image-sensor-side surfaces of the remaining lenses other than the first lens, at least four surfaces may include inflection points. The lens assembly, or an electronic device including the lens assembly, may be implemented in various manners according to embodiments.

Description

Lens assembly and electronic device comprising same
Technical Field
One or more embodiments of the present disclosure relate generally to optical devices, for example, to a lens assembly including a plurality of lenses and to an electronic device including the lens assembly.
Background
Optical devices (e.g., cameras capable of capturing images or video) have been widely used. Recently, digital cameras and video cameras having solid-state image sensors, such as Charge Coupled Device (CCD) or Complementary Metal Oxide Semiconductor (CMOS) sensors, have become widespread. In some applications, these solid-state image sensors have replaced film-type optical devices because they make images easy to store, reproduce, and transfer.
Recently, manufacturers of devices such as smartphones have increasingly combined multiple optical devices (e.g., a telephoto camera and a wide-angle camera) in a single electronic device to improve the quality of captured images and to provide various visual effects. For example, images of an object may be obtained via a plurality of cameras having different optical characteristics and then synthesized into a processed image. Such optical devices may be installed in electronic devices dedicated to image capture, such as digital cameras, and recently they have also been mounted in miniaturized portable electronic devices such as mobile communication terminals.
Disclosure of Invention
Technical problem
As electronic devices such as mobile communication terminals have become widespread, their appearance has been continuously refined to satisfy users' expectations for good design. However, an optical device (e.g., a camera) of an electronic device is inevitably exposed so that it can receive light from the surrounding environment, and it may therefore interfere with the design of the electronic device. Meanwhile, functions such as object recognition, augmented reality, and three-dimensional scanning that rely on an optical device are also being incorporated into electronic devices, so an additional, miniaturized optical device that detects distance information (e.g., depth) about an object may be mounted in the electronic device. However, this distance-detecting optical device may likewise be exposed, and thus may also spoil the external appearance of the electronic device.
Technical proposal
According to an embodiment, a lens assembly may include at least four lenses sequentially arranged along an optical axis from a subject toward an image sensor. Among the at least four lenses, a first lens disposed closest to the subject may have a visible light transmittance ranging from 0% to 5%, and, among the subject-side surfaces and the image-sensor-side surfaces of the remaining lenses other than the first lens, at least four surfaces may include inflection points.
According to an embodiment, an electronic device may include: a first camera including a lens assembly and configured to acquire first information about a subject based on light incident through the lens assembly; at least one second camera configured to acquire second information about the subject different from the first information; and a processor or an image signal processor. The lens assembly may include at least four lenses sequentially arranged along an optical axis from the subject toward an image sensor. Among the at least four lenses, a first lens disposed closest to the subject may have a visible light transmittance ranging from 0% to 5%, and, among the subject-side surfaces and the image-sensor-side surfaces of the remaining lenses other than the first lens, at least four surfaces may include inflection points. The processor or the image signal processor may be configured to perform user authentication based on the first information.
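To make the claimed data flow easier to follow, here is a minimal sketch under assumed names: a hypothetical first (infrared) camera supplies the first information, a second camera supplies other information such as a color image, and the processor authenticates the user from the first information alone. Nothing below is the patent's or any real device's API.

```python
# Hypothetical sketch only: all names, types, and the matching step are
# illustrative assumptions, not an actual device API.
from dataclasses import dataclass

@dataclass
class Frame:
    modality: str   # "ir" for the first camera, "rgb" for the second
    data: bytes     # stand-in for a raw sensor readout

def acquire_first_info() -> Frame:
    # Placeholder for reading near-infrared/depth data from the first camera.
    return Frame(modality="ir", data=b"\x01\x02\x03")

def acquire_second_info() -> Frame:
    # Placeholder for reading a color image from the second camera.
    return Frame(modality="rgb", data=b"\x04\x05\x06")

def authenticate(first_info: Frame, enrolled: bytes) -> bool:
    # A real device would run face or iris matching on the IR/depth data;
    # a byte comparison stands in for that matching step here.
    return first_info.modality == "ir" and first_info.data == enrolled

first_info = acquire_first_info()
second_info = acquire_second_info()  # e.g., for display or synthesis, not authentication
print(authenticate(first_info, enrolled=b"\x01\x02\x03"))  # True in this toy run
```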
Additional aspects will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the presented embodiments.
Advantageous effects
According to an embodiment, among the lenses of the lens assembly, the first lens disposed closest to the subject may have a transmittance with respect to visible light ranging from 0% to 5%. The lens assembly may therefore not be visually discernible to a user even when it is mounted on an electronic device; in other words, the lens assembly may not be visually distinguishable from the rest of the exterior of the electronic device. Thus, the lens assembly and the electronic device according to some embodiments can be visually coordinated in appearance. According to an embodiment, the lens assembly includes at least four lenses, at least four of whose surfaces have inflection points, whereby the lens assembly can have a large-aperture characteristic while having a short total length. Accordingly, the lens assembly according to some embodiments can be easily mounted in a miniaturized electronic device.
Drawings
The above and other aspects, features and advantages of the present disclosure will become more apparent from the following detailed description when taken in conjunction with the accompanying drawings in which:
FIG. 1 is a block diagram of an electronic device in a network environment according to various embodiments;
FIG. 2 is a block diagram illustrating a camera module according to various embodiments;
FIG. 3 is a perspective view showing the front face of an electronic device according to an embodiment;
FIG. 4 is a perspective view showing the back side of the electronic device shown in FIG. 3;
FIG. 5 is a configuration diagram showing a lens assembly according to one of the various embodiments;
FIGS. 6(a), 6(b), and 6(c) are graphs showing the spherical aberration, astigmatism, and distortion rate, respectively, of a lens assembly according to one of the various embodiments;
FIG. 7 is a configuration diagram showing a lens assembly according to another one of the various embodiments;
FIGS. 8(a), 8(b), and 8(c) are graphs showing the spherical aberration, astigmatism, and distortion rate, respectively, of a lens assembly according to another one of the various embodiments;
FIG. 9 is a configuration diagram showing a lens assembly according to another one of the various embodiments;
FIGS. 10(a), 10(b), and 10(c) are graphs showing the spherical aberration, astigmatism, and distortion rate, respectively, of a lens assembly according to another one of the various embodiments;
FIG. 11 is a configuration diagram showing a lens assembly according to another one of the various embodiments;
FIGS. 12(a), 12(b), and 12(c) are graphs showing the spherical aberration, astigmatism, and distortion rate, respectively, of a lens assembly according to another one of the various embodiments;
FIG. 13 is a flowchart illustrating a control method of an electronic device including a lens assembly according to an embodiment; and
FIG. 14 is a flowchart illustrating another control method of an electronic device including a lens assembly according to an embodiment.
Detailed Description
Mode for the invention
Since the present disclosure is susceptible of various modifications and alternative embodiments, certain exemplary embodiments will be described in detail with reference to the accompanying drawings. It should be understood, however, that the disclosure is not limited to the particular embodiments, but includes all modifications, equivalents, and alternatives falling within the spirit and scope of the disclosure.
For the description of the drawings, like reference numerals may be used to refer to like or related elements. It is to be understood that the singular form of a noun corresponding to an item may include one or more of the things, unless the context clearly indicates otherwise. As used herein, each of the phrases such as "A or B", "at least one of A and B", "at least one of A or B", "A, B, or C", "at least one of A, B, and C", and "at least one of A, B, or C" may include all possible combinations of the items listed together in the corresponding phrase. Although ordinal terms such as "first" and "second" may be used to describe various elements, these elements are not limited by those terms; the terms are used only to distinguish one element from another. For example, a first element could be termed a second element and, similarly, a second element could be termed a first element, without departing from the scope of the present disclosure. As used herein, the term "and/or" includes any and all combinations of one or more of the associated items. It will be understood that if an element (e.g., a first element) is referred to, with or without the term "operatively" or "communicatively", as "coupled with" or "connected with" another element (e.g., a second element), the element may be coupled with the other element directly (e.g., by wire), wirelessly, or via a third element.
Furthermore, relative terms such as "front surface", "rear surface", "top surface", and "bottom surface", described with respect to the directions in the drawings, may be replaced with ordinal numbers such as "first" and "second". The order of such ordinal numbers is determined by the order in which they are mentioned, or arbitrarily, and may be changed if necessary.
In this disclosure, the terminology is used to describe specific embodiments and is not intended to limit the disclosure. As used herein, singular forms are intended to include plural forms as well, unless the context clearly indicates otherwise. In the description, it should be understood that the terms "comprises" or "comprising" indicate the presence of a feature, a number, a step, an operation, a structural element, a component, or a combination thereof, and do not preclude the presence or addition of one or more other features, numbers, steps, operations, structural elements, components, or combinations thereof.
Unless defined otherwise, all terms used herein, including technical and scientific terms, have the same meanings as commonly understood by one of ordinary skill in the art to which this disclosure belongs. Unless explicitly defined in this specification, terms such as those defined in commonly used dictionaries should be interpreted as having meanings consistent with their meanings in the context of the relevant art, and should not be interpreted in an idealized or overly formal sense.
In the present disclosure, an electronic device may be any device, and may be referred to as a terminal, a portable terminal, a mobile terminal, a communication terminal, a portable mobile terminal, a touch screen, or the like.
For example, the electronic device may be a smart phone, a portable phone, a game machine, a TV, a display unit, a head-up display unit for a vehicle, a notebook computer, a laptop computer, a tablet Personal Computer (PC), a Personal Media Player (PMP), a Personal Digital Assistant (PDA), or the like. The electronic device may be implemented as a portable communication terminal having a wireless communication function and a pocket size. Further, the electronic device may be a flexible device or a flexible display device.
The electronic device may communicate with an external electronic device such as a server or perform operations through interworking with the external electronic device. For example, the electronic device may transmit an image photographed by the camera and/or position information detected by the sensor unit to the server through the network. The network may be, but is not limited to, a mobile or cellular communication network, a Local Area Network (LAN), a Wireless Local Area Network (WLAN), a Wide Area Network (WAN), the internet, a Small Area Network (SAN), etc.
According to some embodiments, a lens assembly implementing an optical device that detects distance information about an object while visually coordinating with the appearance of an electronic device may be provided.
According to some embodiments, a lens assembly that is easily mounted on a miniaturized electronic device may be provided.
Fig. 1 is a block diagram illustrating an electronic device 101 in a network environment 100 according to various embodiments. Referring to fig. 1, an electronic device 101 in a network environment 100 may communicate with an electronic device 102 via a first network 198 (e.g., a short-range wireless communication network) or with an electronic device 104 or server 108 via a second network 199 (e.g., a long-range wireless communication network). According to an embodiment, the electronic device 101 may communicate with the electronic device 104 via the server 108. According to an embodiment, the electronic device 101 may include a processor 120, a memory 130, an input device 150, a sound output device 155, a display device 160, an audio module 170, a sensor module 176, an interface 177, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a Subscriber Identity Module (SIM) 196, or an antenna module 197. In some embodiments, at least one of the components (e.g., display device 160 or camera module 180) may be omitted from electronic device 101, or one or more other components may be added to electronic device 101. In some embodiments, some of the components may be implemented as a single integrated circuit. For example, the sensor module 176 (e.g., a fingerprint sensor, iris sensor, or illuminance sensor) may be implemented embedded in the display device 160 (e.g., a display).
The processor 120 may run, for example, software (e.g., program 140) to control at least one other component (e.g., hardware component or software component) of the electronic device 101 that is connected to the processor 120, and may perform various data processing or calculations. According to one embodiment, as at least part of the data processing or calculation, the processor 120 may load commands or data received from another component (e.g., the sensor module 176 or the communication module 190) into the volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in the non-volatile memory 134. According to an embodiment, the processor 120 may include a main processor 121 (e.g., a Central Processing Unit (CPU) or an Application Processor (AP)) and an auxiliary processor 123 (e.g., a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a sensor hub processor or a Communication Processor (CP)) that is operatively independent or combined with the main processor 121. Additionally or alternatively, the auxiliary processor 123 may be adapted to consume less power than the main processor 121, or may be adapted specifically for a specified function. The auxiliary processor 123 may be implemented separately from the main processor 121 or as part of the main processor 121.
The auxiliary processor 123 may control at least some of the functions or states related to at least one component of the electronic device 101 (e.g., the display device 160, the sensor module 176, or the communication module 190) instead of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active state (e.g., running an application). According to an embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another component (e.g., the camera module 180 or the communication module 190) functionally related to the auxiliary processor 123.
The memory 130 may store various data used by at least one component of the electronic device 101 (e.g., the processor 120 or the sensor module 176). The various data may include, for example, software (e.g., program 140) and input data or output data for commands associated therewith. Memory 130 may include volatile memory 132 or nonvolatile memory 134.
The program 140 may be stored as software in the memory 130, and the program 140 may include, for example, an Operating System (OS) 142, middleware 144, or applications 146.
The input device 150 may receive commands or data from outside the electronic device 101 (e.g., a user) to be used by other components of the electronic device 101 (e.g., the processor 120). The input device 150 may include, for example, a microphone, a mouse, a keyboard, or a digital pen (e.g., a stylus).
The sound output device 155 may output a sound signal to the outside of the electronic device 101. The sound output device 155 may include, for example, a speaker or a receiver. The speaker may be used for general purposes, such as playing multimedia or recordings, and the receiver may be used for incoming calls. Depending on the embodiment, the receiver may be implemented separately from the speaker or as part of the speaker.
Display device 160 may visually provide information to the outside (e.g., user) of electronic device 101. The display device 160 may include, for example, a display, a holographic device, or a projector, and a control circuit for controlling a corresponding one of the display, the holographic device, and the projector. According to an embodiment, the display device 160 may include touch circuitry adapted to detect touches or sensor circuitry (e.g., pressure sensors) adapted to measure the strength of forces caused by touches.
The audio module 170 may convert sound into electrical signals and vice versa. According to an embodiment, the audio module 170 may obtain sound via the input device 150, or output sound via the sound output device 155 or headphones of an external electronic device (e.g., the electronic device 102) that is directly (e.g., wired) or wirelessly connected to the electronic device 101.
The sensor module 176 may detect an operational state (e.g., power or temperature) of the electronic device 101 or an environmental state (e.g., a state of a user) external to the electronic device 101 and then generate an electrical signal or data value corresponding to the detected state. According to an embodiment, the sensor module 176 may include, for example, a gesture sensor, a gyroscope sensor, an atmospheric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an Infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
Interface 177 may support one or more specific protocols that will be used to connect with an external electronic device (e.g., electronic device 102) directly (e.g., wired) or wirelessly with electronic device 101. According to an embodiment, interface 177 may include, for example, a High Definition Multimedia Interface (HDMI), a Universal Serial Bus (USB) interface, a Secure Digital (SD) card interface, or an audio interface.
The connection terminals 178 may include connectors via which the electronic device 101 may be physically connected with an external electronic device (e.g., the electronic device 102). According to an embodiment, the connection end 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (e.g., a headphone connector).
The haptic module 179 may convert the electrical signal into a mechanical stimulus (e.g., vibration or motion) or an electrical stimulus that may be recognized by the user via his or her sense of touch or kinesthetic sense. According to an embodiment, the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrostimulator.
The camera module 180 may capture still images or moving images. According to an embodiment, the camera module 180 may include one or more lenses, an image sensor, an image signal processor, or a flash.
The power management module 188 may manage power supply to the electronic device 101. According to one embodiment, the power management module 188 may be implemented as at least part of, for example, a Power Management Integrated Circuit (PMIC).
Battery 189 may power at least one component of electronic device 101. According to an embodiment, battery 189 may include, for example, a primary non-rechargeable battery, a rechargeable battery, or a fuel cell.
The communication module 190 may support establishing a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108) and performing communication via the established communication channel. The communication module 190 may include one or more communication processors capable of operating independently of the processor 120 (e.g., an Application Processor (AP)) and supporting direct (e.g., wired) or wireless communication. According to an embodiment, the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a Global Navigation Satellite System (GNSS) communication module) or a wired communication module 194 (e.g., a Local Area Network (LAN) communication module or a Power Line Communication (PLC) module). A corresponding one of these communication modules may communicate with the external electronic device via the first network 198 (e.g., a short-range communication network such as Bluetooth™, wireless fidelity (Wi-Fi) Direct, or Infrared Data Association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a cellular network, the Internet, or a computer network (e.g., a LAN or Wide Area Network (WAN))). These various types of communication modules may be implemented as a single component (e.g., a single chip), or may be implemented as multiple components (e.g., multiple chips) separate from each other. The wireless communication module 192 may identify and authenticate the electronic device 101 in a communication network, such as the first network 198 or the second network 199, using subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196.
The antenna module 197 may transmit signals or power to the outside of the electronic device 101 (e.g., an external electronic device) or receive signals or power from the outside of the electronic device 101 (e.g., an external electronic device). According to an embodiment, the antenna module 197 may include an antenna including a radiating element composed of a conductive material or conductive pattern formed in or on a substrate (e.g., PCB). According to an embodiment, the antenna module 197 may include a plurality of antennas. In this case, at least one antenna suitable for a communication scheme used in a communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas, for example, by the communication module 190 (e.g., the wireless communication module 192). Signals or power may then be transmitted or received between the communication module 190 and the external electronic device via the selected at least one antenna. According to an embodiment, further components (e.g., a Radio Frequency Integrated Circuit (RFIC)) other than radiating elements may additionally be formed as part of the antenna module 197.
At least some of the above components may be interconnected via an inter-peripheral communication scheme (e.g., bus, general Purpose Input and Output (GPIO), serial Peripheral Interface (SPI), or Mobile Industrial Processor Interface (MIPI)) and communicatively communicate signals (e.g., commands or data) therebetween.
According to an embodiment, commands or data may be transmitted or received between the electronic device 101 and the external electronic device 104 via the server 108 connected to the second network 199. Each of the electronic devices 102 and 104 may be a device of the same type as, or a different type from, the electronic device 101. According to an embodiment, all or some of the operations to be performed at the electronic device 101 may be performed at one or more of the external electronic device 102, the external electronic device 104, or the server 108. For example, if the electronic device 101 should perform a function or a service automatically, or in response to a request from a user or another device, the electronic device 101, instead of or in addition to executing the function or service itself, may request the one or more external electronic devices to perform at least part of the function or service. The one or more external electronic devices that receive the request may perform the requested part of the function or service, or an additional function or service related to the request, and transmit the result of the performance to the electronic device 101. The electronic device 101 may provide the result, with or without further processing, as at least part of a reply to the request. To that end, cloud computing, distributed computing, or client-server computing technology may be used, for example.
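As a rough, self-contained illustration of this offloading pattern, the sketch below stands in for the request/response exchange with an in-process stub; a real device would use an actual transport (e.g., a network request), and all names here are invented.

```python
# Minimal sketch of the offloading pattern described above; illustrative only.

def run_locally(task: str) -> str:
    return f"local-result({task})"

def external_device_execute(task: str) -> str:
    # Stub standing in for the external electronic device or server that
    # receives the request, performs part of the function, and replies.
    return f"remote-result({task})"

def perform(task: str, offload: bool) -> str:
    # Instead of (or in addition to) executing the function itself, the
    # device may request an external device to perform at least part of it.
    partial = external_device_execute(task) if offload else run_locally(task)
    # The result may be provided as the reply, with or without further
    # processing; uppercasing is a trivial stand-in for that processing.
    return partial.upper()

print(perform("depth-map-generation", offload=True))   # REMOTE-RESULT(DEPTH-MAP-GENERATION)
print(perform("depth-map-generation", offload=False))  # LOCAL-RESULT(DEPTH-MAP-GENERATION)
```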
Fig. 2 is a block diagram 200 illustrating a camera module 280 in accordance with various embodiments. Referring to fig. 2, the camera module 280 may include a lens assembly 210, a flash 220, an image sensor 230, an image stabilizer 240, a memory 250 (e.g., a buffer memory), or an image signal processor 260. The lens assembly 210 may collect light emitted or reflected from an object whose image is to be captured. The lens assembly 210 may include one or more lenses. According to an embodiment, the camera module 280 may include a plurality of lens assemblies 210; in this case, the camera module 280 may form, for example, a dual camera, a 360-degree camera, or a spherical camera. Some of the plurality of lens assemblies 210 may have the same lens properties (e.g., viewing angle, focal length, auto-focus, f-number, or optical zoom), or at least one lens assembly may have one or more lens properties different from those of the other lens assemblies. The lens assembly 210 may include, for example, a wide-angle lens or a telephoto lens.
The flash 220 may emit light for enhancing light reflected from the subject. According to an embodiment, the flash 220 may include one or more Light Emitting Diodes (LEDs), such as, for example, red Green Blue (RGB) LEDs, white LEDs, infrared (IR) LEDs, or Ultraviolet (UV) LEDs), or a xenon lamp. The image sensor 230 may obtain an image corresponding to the object by converting light emitted or reflected from the object and transmitted through the lens assembly 210 into an electrical signal. According to an embodiment, the image sensor 230 may include one image sensor selected from among image sensors having different properties, such as an RGB sensor, a black-and-white (BW) sensor, an IR sensor, or a UV sensor, a plurality of image sensors having the same properties, or a plurality of image sensors having different properties. Each of the image sensors included in the image sensor 230 may be implemented using, for example, a Charge Coupled Device (CCD) sensor or a Complementary Metal Oxide Semiconductor (CMOS) sensor.
The image stabilizer 240 may move the image sensor 230 or at least one lens included in the lens assembly 210 in a specific direction, or control an operational property of the image sensor 230 (e.g., adjust the readout timing), in response to movement of the camera module 280 or of the electronic device including the camera module 280. This makes it possible to compensate for at least part of the negative effects (e.g., image blur) of the movement on the image being captured. According to an embodiment, the image stabilizer 240 may sense such movement of the camera module 280 or the electronic device 101 using a gyro sensor (not shown) or an acceleration sensor (not shown) disposed inside or outside the camera module 280. According to an embodiment, the image stabilizer 240 may be implemented as, for example, an optical image stabilizer.
The memory 250 may at least temporarily store at least a portion of the images acquired via the image sensor 230 for a subsequent image processing task. For example, if multiple images are captured quickly, or if image capture is delayed due to shutter lag, the acquired raw images (e.g., Bayer-pattern or high-resolution images) may be stored in the memory 250 while their corresponding copies (e.g., low-resolution images) are previewed via the display device 160. Then, if a specified condition is satisfied (e.g., by a user input or a system command), at least a portion of the raw images stored in the memory 250 may be retrieved and processed by, for example, the image signal processor 260. According to an embodiment, the memory 250 may be configured as at least a portion of the memory 130, or as a separate memory that operates independently of the memory 130.
The image signal processor 260 may perform one or more image processing operations on an image acquired via the image sensor 230 or an image stored in the memory 250. The one or more image processing operations may include, for example, depth map generation, three-dimensional (3D) modeling, panorama generation, feature point extraction, image synthesis, or image compensation (e.g., noise reduction, resolution adjustment, brightness adjustment, blurring, sharpening, or softening). Additionally or alternatively, the image signal processor 260 may perform control (e.g., exposure time control or readout timing control) over at least one of the components included in the camera module 280 (e.g., the image sensor 230). An image processed by the image signal processor 260 may be stored back in the memory 250 for further processing, or may be provided to an external component outside the camera module 280 (e.g., the memory 130, the display device 160, the electronic device 102, the electronic device 104, or the server 108). According to an embodiment, the image signal processor 260 may be configured as at least a portion of the processor 120, or as a separate processor that operates independently of the processor 120. If the image signal processor 260 is configured as a separate processor, at least one image processed by the image signal processor 260 may be displayed by the processor 120 via the display device 160 as it is, or after being further processed.
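As a loose illustration of such a processing chain, the sketch below strings together two of the listed operations (noise reduction and brightness adjustment) on a synthetic frame. The functions and the use of NumPy are assumptions made for illustration; they are not the image signal processor's actual implementation.

```python
import numpy as np

def denoise(img: np.ndarray) -> np.ndarray:
    # Simple 3x3 box blur as a stand-in for real noise reduction.
    padded = np.pad(img, 1, mode="edge")
    return sum(padded[y:y + img.shape[0], x:x + img.shape[1]]
               for y in range(3) for x in range(3)) / 9.0

def adjust_brightness(img: np.ndarray, gain: float) -> np.ndarray:
    # Scale pixel values and keep them in the displayable [0, 1] range.
    return np.clip(img * gain, 0.0, 1.0)

def pipeline(raw: np.ndarray) -> np.ndarray:
    return adjust_brightness(denoise(raw), gain=1.2)

frame = np.random.rand(480, 640)   # stand-in for a sensor readout in [0, 1]
processed = pipeline(frame)
print(processed.shape, float(processed.max()) <= 1.0)  # (480, 640) True
```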
According to an embodiment, the electronic device 101 may include multiple camera modules 280 having different properties or functions. In this case, at least one camera module 280 of the plurality of camera modules 280 may form, for example, a wide-angle camera, and at least another camera module 280 of the plurality of camera modules 280 may form a tele camera. Similarly, at least one camera module 280 of the plurality of camera modules 280 may form, for example, a front camera, and at least another camera module 280 of the plurality of camera modules 280 may form a rear camera.
Fig. 3 is a perspective view illustrating a front side of an electronic device 300 (e.g., electronic device 101 of fig. 1) according to an embodiment. Fig. 4 is a perspective view illustrating a rear surface of the electronic device 300 shown in fig. 3.
Referring to fig. 3 and 4, an electronic device 300 (e.g., the electronic device 101 of fig. 1) according to an embodiment may include a housing 310 having a first face (or front face) 310A, a second face (or back face) 310B, and a side face 310C surrounding the space between the first face 310A and the second face 310B. In another embodiment (not shown), the term "housing 310" may refer to a structure forming a portion of the first face 310A, the second face 310B, and the side face 310C of fig. 3. According to an embodiment, at least a portion of the first face 310A may be formed by a substantially transparent front plate 302 (e.g., a glass plate or a polymer plate including various coatings). In another embodiment, the front plate 302 may be coupled to the housing 310 to form an interior space within the housing 310. In some embodiments, the term "inner space" may denote the inner space, defined by the outer edge of the housing 310, that accommodates at least a portion of the display 301 (or the display device 160 in fig. 1) described later.
According to an embodiment, the second face 310B may be formed by a substantially opaque rear plate 311. The rear plate 311 may be made of, for example, coated or colored glass, ceramic, polymer, or metal (e.g., aluminum, stainless steel (STS), or magnesium), or a combination of two or more of these materials. The side face 310C may be formed by a side frame structure (or "side member") 318 coupled to the front plate 302 and the rear plate 311, and the side frame structure may also be made of metal and/or polymer. In an embodiment, the rear plate 311 and the side frame structure 318 may be integrally formed and made of the same material (e.g., a metal such as aluminum).
In the illustrated embodiment, the front plate 302 may include, at its long opposite side edges, two first regions 310D that curve seamlessly from the first face 310A and extend toward the rear plate 311. In the illustrated embodiment (see fig. 4), the rear plate 311 may include, at its long opposite side edges, two second regions 310E that curve seamlessly from the second face 310B and extend toward the front plate 302. In an embodiment, the front plate 302 (or the rear plate 311) may include only one of the first regions 310D (or the second regions 310E). In another embodiment, some of the first regions 310D or the second regions 310E may not be included. In the above embodiments, when viewed from the side of the electronic device 300, the side frame structure 318 may have a first thickness (or width) on a side that does not include the first region 310D or the second region 310E (e.g., the side where the connector hole 308 is formed), and a second thickness, smaller than the first thickness, on a side that includes the first region 310D or the second region 310E (e.g., the side where the key input device 317 is disposed).
According to an embodiment, the electronic device 300 may include at least one of a display 301, audio modules 303, 307, and 314, sensor modules 304, 316, and 319, camera modules 305, 312, and 313, a key input device 317, a light emitting element 306, and connector holes 308 and 309. In embodiments, at least one component (e.g., key input device 317 or light emitting element 306) may be omitted from electronic device 300, or electronic device 300 may additionally include other components.
The display 301 (e.g., the display device 160 in fig. 1) may be exposed through a substantial portion of the front plate 302. In an embodiment, at least a portion of the display 301 may be exposed through the front plate 302 forming the first face 310A and the first area 310D. In an embodiment, the edge of the display 301 may be formed to have a contour shape substantially identical to that of the front plate 302 adjacent thereto. In another embodiment (not shown), the distance between the outer contour of the display 301 and the outer contour of the front plate 302 may be substantially constant in order to expand the exposed area of the display 301.
In another embodiment (not shown), a recess or an opening may be formed in a portion of the screen display area (e.g., the active area) of the display 301 or in an area outside the screen display area (e.g., the inactive area), and at least one of the audio module 314 (e.g., the audio module 170 of fig. 1), the sensor module 304 (e.g., the sensor module 176 of fig. 1), the camera module 305 (e.g., the camera module 180), and the light emitting element 306 may be aligned with the recess or the opening. In another embodiment (not shown), at least one of the audio module 314, the sensor module 304, the camera module 305, the fingerprint sensor 316, and the light emitting element 306 may be disposed behind the screen display area of the display 301. In another embodiment (not shown), the display 301 may be coupled to, or disposed adjacent to, a touch sensing circuit, a pressure sensor capable of measuring the intensity (pressure) of a touch, and/or a digitizer that detects a magnetic-field-type stylus pen. In some embodiments, at least some of the sensor modules 304 and 319 and/or at least some of the key input devices 317 may be disposed in the first region 310D and/or the second region 310E.
The audio modules 303, 307, and 314 may include a microphone hole 303 and speaker holes 307 and 314. The microphone hole 303 may include a microphone disposed therein to pick up sound from the external environment; in an embodiment, a plurality of microphones may be disposed therein to detect the direction of sound. The speaker holes 307 and 314 may include an external speaker hole 307 and a telephone call receiver hole 314. In some embodiments, the speaker holes 307 and 314 and the microphone hole 303 may be implemented as a single hole, or a speaker (e.g., a piezoelectric speaker) may be included without the speaker holes 307 and 314.
The sensor modules 304, 316, and 319 may generate electrical signals or data values corresponding to various internal operating states of the electronic device 300 or various external environmental conditions. The sensor modules 304, 316, and 319 may include, for example, a first sensor module 304 (e.g., a proximity sensor) and/or a second sensor module (not shown) (e.g., a fingerprint sensor) disposed on a first side 310A of the housing 310, and/or a third sensor module 319 (e.g., a HRM sensor) and/or a fourth sensor module 316 (e.g., a fingerprint sensor) disposed on a second side 310B of the housing 310. The fingerprint sensor may be disposed not only on the first face 310A (e.g., display 301) of the housing 310, but also on the second face 310B. The electronic device 300 may further include at least one of a sensor module (not shown) such as a gesture sensor, a gyroscope sensor, an atmospheric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a color sensor, an Infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
The camera modules 305, 312, and 313 may include a first camera device 305 disposed on the first face 310A of the electronic device 300, a second camera device 312 disposed on the second face 310B, and/or a flash 313. The camera modules 305 and 312 may include one or more lenses, an image sensor, and/or an image signal processor. The flash 313 may include, for example, a light emitting diode or a xenon lamp. In an embodiment, two or more lenses (e.g., an infrared camera lens, a wide-angle lens, and a telephoto lens) and image sensors may be disposed on one face of the electronic device 300.
The key input device 317 may be disposed on the side 310C of the housing 310. In another embodiment, the electronic device 300 may not include some or all of the key input devices 317 described above, and the omitted key input devices 317 may be implemented in another form, such as soft keys on the display 301. In some embodiments, the key input device 317 may be integrated with the sensor module 316.
The light emitting element 306 may be arranged on, for example, a first face 310A of the housing 310. The light emitting element 306 may optically provide status information of the electronic device 300, for example. In another embodiment, the light emitting element 306 may provide a light source that cooperates with, for example, the camera module 305. The light emitting element 306 may include, for example, an LED, an IR LED, and a xenon lamp.
The connector holes 308 and 309 may include a first connector hole 308 and/or a second connector hole 309, wherein the first connector hole 308 is capable of receiving a connector (e.g., a USB connector) for transmitting and receiving power and/or data to and from an external electronic device, and the second connector hole 309 is capable of receiving a connector (e.g., a headphone jack) for transmitting and receiving audio signals to and from the external electronic device.
Fig. 5 is a configuration diagram illustrating a lens assembly 400 according to one of the various embodiments.
Referring to fig. 5, according to one of various embodiments, a lens assembly 400 (e.g., lens assembly 210 of fig. 2) may include a plurality of lenses 401, 402, 403, and 404, and may also include an image sensor 406 in this embodiment. For example, the image sensor 406 may be mounted in an electronic device (e.g., the electronic devices 101 and 300 in fig. 1 or 3) or an optical device (e.g., the camera module 280 in fig. 2), and the plurality of lenses 401, 402, 403, and 404 constituting the lens assembly 400 may be mounted in the electronic device or the optical device and aligned with the image sensor 406. In an embodiment, the lens assembly 400 may be disposed in any one of the camera modules 305, 312, and 313 of fig. 3 or 4.
According to an embodiment, the plurality of lenses 401, 402, 403, and 404 may include a first lens 401, a second lens 402, a third lens 403, and/or a fourth lens 404 sequentially arranged from the object O side toward the image sensor 406. For example, the first to fourth lenses 401, 402, 403, and 404 may be aligned on the optical axis A together with the image sensor 406. In another embodiment, a plurality of second lenses 402 may be provided, and the first to fourth lenses 401, 402, 403, and 404 may be plastic aspherical lenses. In an embodiment, the first lens 401 may have a transmittance with respect to visible light ranging from 0% to 5%. Accordingly, even if the lens assembly 400 is mounted in the electronic device 300 shown in fig. 3, it is difficult for a user viewing the exterior of the electronic device 300 to visually recognize the lens assembly 400. In an embodiment, the first lens 401, as actually manufactured, may not completely block visible light. For example, even if the design specification for the visible light transmittance is 0%, the actually manufactured first lens 401 may have a transmittance with respect to visible light of about 0.001%.
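As a small numeric illustration of this concealment condition, the check below verifies that sampled visible-band transmittance values stay within the 0% to 5% range; the sampled wavelengths and values are invented for illustration.

```python
def hides_lens(transmittance_by_nm: dict, limit: float = 0.05) -> bool:
    # Maps wavelength in nm to transmittance in [0, 1]; the first-lens spec
    # requires every visible-band sample (taken as 380-780 nm) to be <= 5%.
    return all(t <= limit for wl, t in transmittance_by_nm.items() if 380 <= wl <= 780)

# Invented sample values; 850 nm lies outside the visible band and is ignored.
measured = {400: 0.001, 550: 0.002, 700: 0.004, 850: 0.95}
print(hides_lens(measured))  # True: all visible samples are at or below 5%
```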
According to an embodiment, the first lens 401 may have positive refractive power and may be arranged on the object O side to face the object O, with an aperture stop S interposed between the object and the first lens 401. Because the first lens 401 has positive refractive power, the total length (TL) of the lens assembly 400 (e.g., the distance from the object-side surface S2 of the first lens 401 to the imaging surface IS of the image sensor 406) and/or the outer diameters of the second to fourth lenses 402, 403, and 404 can be reduced. In an embodiment, the first lens 401 may be a meniscus lens convex toward the object O side, and the image-sensor-side surface S3 of the first lens 401 may have an inflection point. For example, on the image-sensor-side surface S3 of the first lens 401, the central portion aligned on the optical axis A may be concave while the peripheral portion toward the edge is convex. A first lens 401 having the above-described shape may enable the lens assembly 400 to form a large-aperture optical system having an F-number ranging from 1.1 to 1.7. In an embodiment, the first lens 401 having the above-described shape and positive refractive power may facilitate miniaturization of the lens assembly 400 and correction of its spherical aberration.
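As a worked check of the large-aperture figure, recall that the F-number is the effective focal length divided by the entrance-pupil diameter, N = f / D. The focal length below is an assumed illustrative value, not one taken from the patent.

```python
def f_number(focal_length_mm: float, pupil_diameter_mm: float) -> float:
    # N = f / D: smaller N means a larger aperture for a given focal length.
    return focal_length_mm / pupil_diameter_mm

f = 2.8  # assumed effective focal length in mm
# Pupil diameters that roughly bracket the claimed F-number range of 1.1 to 1.7:
print(round(f_number(f, 2.55), 2))  # ~1.1 (fast end of the range)
print(round(f_number(f, 1.65), 2))  # ~1.7 (slow end of the range)
```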
According to an embodiment, the first lens 401 may have a transmittance with respect to visible light of 5% or less and may be made of a material optimized for near-infrared rays. Accordingly, the first lens 401 may have a low visible-light reflectance, mitigating or preventing visual discrimination by a user. In an embodiment, when the lens assembly 400 is installed in an electronic device (e.g., the electronic device 300 in fig. 3), the first lens 401 may be given a color (e.g., black) matching the exterior color of the device. Therefore, even when the lens assembly or the optical device is mounted on the electronic device, degradation of the external appearance of the electronic device can be reduced or prevented.
According to an embodiment, the second lens 402 may have a positive refractive power and may be arranged adjacent to the first lens 401. In some embodiments, a plurality of second lenses 402 may be disposed between the first lens 401 and the third lens 403, which will be described in more detail later with reference to fig. 11. The second lens 402 may be a meniscus lens convex toward the object O side, and both its object side surface S4 and its image sensor side surface S5 may include inflection points.
According to an embodiment, the third lens 403 may have a positive refractive power and may be arranged adjacent to the second lens 402. The third lens 403 may be a meniscus lens convex toward the image sensor 406 side, and both its object side surface S6 and its image sensor side surface S7 may include inflection points.
According to an embodiment, the fourth lens 404 may have a negative refractive power and may be disposed adjacent to the third lens 403. In an embodiment, the fourth lens 404 may be a meniscus lens convex toward the object O side, and both its object side surface S8 and its image sensor side surface S9 may include inflection points. Since the fourth lens 404 has a negative refractive power, light incident on a peripheral portion of the image formed on the imaging surface IS may have an incident angle of 30 degrees or more. Accordingly, the Back Focal Length (BFL) of the lens assembly 400 may be reduced, which reduces the total length of the lens assembly 400.
According to an embodiment, among the first to fourth lenses 401, 402, 403, and 404, at least four of the lens surfaces of the second lens 402, the third lens 403, and/or the fourth lens 404 (e.g., object-side surfaces S4, S6, and S8 and image sensor-side surfaces S5, S7, and S9) may include inflection points. Since a plurality of lens surfaces include inflection points, the lens assembly 400 may be implemented as a large-aperture optical system, which is advantageous for aberration correction. In an embodiment, by the combination of the refractive powers and shapes of the first to fourth lenses 401, 402, 403, and 404, a light beam formed on the peripheral portion of the imaging surface IS of the image sensor 406 has an incident angle of about 30 degrees, and the lens assembly 400 can secure a viewing angle of 70 degrees or more. In an embodiment, at least the second to fourth lenses 402, 403, and 404 may be made of plastic to facilitate molding or machining the lenses into the designed shapes. In some embodiments, when a plurality of second lenses 402 are arranged, aberration correction may be easier and the resolution of an optical device (e.g., a camera or an electronic device including the lens assembly 400) may be increased.
According to an embodiment, all of the first to fourth lenses 401, 402, 403, and 404 may be meniscus lenses. In a lens assembly including four or five lenses, when at least four of the lenses are meniscus lenses, the total length of the lens assembly can be kept small even if the focal length of the lens assembly is small, and aberration can be corrected well. For example, when implemented with four or five lenses, the lens assembly 400 may be miniaturized while having high optical performance.
According to an embodiment, the lens assembly 400 may include a bandpass filter 405. For example, the bandpass filter 405 may be disposed between the fourth lens 404 and the image sensor 406. In an embodiment, the bandpass filter 405 largely blocks visible light (e.g., has a visible light transmittance ranging from 0% to 1%) and may have a transmittance ranging from 90% to 99% with respect to light having a wavelength between 800nm and 1000nm (e.g., near infrared light). Since the bandpass filter 405 is arranged as described above, light incident on the image sensor 406 may be substantially limited to light in the near infrared region. In some embodiments, the bandpass filter 405 may not completely block visible light when actually manufactured. For example, even if the design specification of the visible light transmittance is 0%, the bandpass filter 405 may actually have a transmittance with respect to visible light of about 0.001%.
According to an embodiment, the bandpass filter 405 may transmit light having a specific wavelength between 800nm and 1000nm. For example, the light transmitted by the bandpass filter 405 may have at least one of the wavelengths 850±5nm, 940±5nm, or 980±5nm. In another embodiment, the bandpass filter 405 may block light having wavelengths outside the transmission range. Although specific wavelengths of light transmitted by the bandpass filter 405 are disclosed herein, these values are merely examples, and the present disclosure is not limited to them. The bandpass filter 405 may be designed or manufactured to have appropriate optical characteristics according to the required specifications of the lens assembly 400 or of a camera module or electronic device including the lens assembly 400.
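For illustration, the passband described above can be modeled as a simple step function. The following is a minimal sketch and not part of the patent; the representative transmittance values are assumptions chosen from the ranges given above, and a real filter would have a smooth roll-off:

```python
# Idealized model of the bandpass filter 405 described above.
NEAR_IR_PASSBAND = (800.0, 1000.0)  # nm; transmittance 90%-99% per the text

def transmittance(wavelength_nm: float) -> float:
    """Return an idealized transmittance for the given wavelength."""
    lo, hi = NEAR_IR_PASSBAND
    if lo <= wavelength_nm <= hi:
        return 0.95     # representative value inside the 90%-99% passband
    return 0.00001      # visible light is largely blocked (~0.001%)

for wl in (550.0, 850.0, 940.0, 980.0):  # green light and the example NIR lines
    print(f"{wl:6.1f} nm -> T = {transmittance(wl):.5f}")
```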
According to an embodiment, an electronic device (e.g., electronic device 300 in fig. 3) equipped with the lens assembly 400 may emit light having a near infrared wavelength using at least a portion of a light source device (e.g., light emitting device 306 in fig. 3). In some embodiments, the light source device may be embedded in the camera module itself (e.g., camera module 280 in fig. 2), or may be disposed adjacent to the camera module as a separate component (e.g., camera module 305 in fig. 3).
According to an embodiment, the light source device (e.g., the light emitting device 306 in fig. 3) may comprise an infrared emitting diode or a near infrared laser light source. Light emitted from the light source device may be reflected by the object O and may be incident on the image sensor 406 through the lens assembly 400. Based on the time taken for the reflected light to reach the image sensor 406, the electronic device (e.g., the processor 120 in fig. 1 or the image signal processor 260 in fig. 2) may detect first information, e.g., distance information (e.g., depth information) of the object O relative to the electronic device. For example, the lens assembly 400 may include the bandpass filter 405 so as to constitute a near infrared camera that suppresses interference from visible light or from infrared rays whose wavelengths are unnecessary for detecting the distance information of an object. The processor (e.g., processor 120 in fig. 1 or image signal processor 260 in fig. 2) may include a microprocessor or any suitable type of processing circuit, such as one or more general purpose processors (e.g., ARM-based processors), Digital Signal Processors (DSPs), Programmable Logic Devices (PLDs), Application-Specific Integrated Circuits (ASICs), Field-Programmable Gate Arrays (FPGAs), Graphics Processing Units (GPUs), video card controllers, or the like. Further, it will be appreciated that when a general purpose computer accesses code for implementing the processes shown herein, execution of the code converts the general purpose computer into a special purpose computer for performing those processes. Some of the functions and steps provided in the figures may be implemented in hardware, software, or a combination of both, and may be executed in whole or in part within the programmed instructions of a computer.
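The distance detection described above follows the direct time-of-flight relation: the emitted light travels to the object and back, so the object distance is half the round-trip path. A minimal sketch follows; the function name and example values are illustrative, not from the patent:

```python
# Direct time-of-flight: distance = c * t / 2, since the measured time t
# covers the round trip from the light source to the object and back.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def distance_from_round_trip(t_seconds: float) -> float:
    """Object distance derived from the measured round-trip time."""
    return SPEED_OF_LIGHT * t_seconds / 2.0

# Example: a round-trip time of about 3.34 ns corresponds to roughly 0.5 m.
print(distance_from_round_trip(3.34e-9))  # ~0.50 (meters)
```

Applied per pixel of the image sensor 406, this relation yields the depth map mentioned later with reference to fig. 14.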
According to an embodiment, the lens assembly 400 may satisfy the following equation 1.
[Equation 1]
1.5 ≤ TL/ImgH ≤ 3.5
Here, "TL" may represent the total length of the lens assembly 400 (e.g., the distance from the object-side surface S2 of the first lens 401 to the imaging surface IS of the image sensor 406), and "ImgH" may represent the maximum image height of an image formed on the imaging surface IS. The term "image height" refers to the maximum distance from the edge of an image formed on the imaging surface IS to the optical axis a. When the ratio of the total length TL to the maximum image height ImgH exceeds 3.5, this means that the total length is relatively long, and thus the lens assembly 400 may be difficult to install in a miniaturized electronic device. When the ratio of the total length TL to the maximum image height ImgH does not reach 1.5, this means that the total length of the lens assembly 400 may be too short, thereby limiting the number of lenses. When the number of lenses is limited, the number of lens surfaces including inflection points is also reduced, and thus aberration correction may be limited. When equation 1 is satisfied, when implemented with four or five lenses, the lens assembly 400 may have a total length of about 3mm and may be easily installed in a miniaturized electronic device while ensuring good performance of an optical device (e.g., the camera module 280).
Since the optical device including the lens assembly 400 is capable of detecting distance information about a subject or a specific portion of the subject, etc., the optical device may be used as a part of a security camera, an object recognition or user authentication camera, or a thermal imaging camera, and may be combined with other optical devices (e.g., a tele camera or a wide camera) in order to provide functions such as augmented reality and three-dimensional scanner functions in a small electronic device such as a mobile communication terminal. In some embodiments, the optical device including the lens assembly 400 described above may be implemented as at least some of the components including the camera module 280 of fig. 2.
According to an embodiment, the lens assembly 400 may satisfy the following equation 2.
[Equation 2]
1 ≤ f1/f ≤ 10
Here, "f1" may represent a focal length of the first lens 401, and "f" may represent a total focal length of the lens assembly 400. When the ratio of the focal length f1 of the first lens 401 to the total focal length f of the lens assembly 400 exceeds 10, miniaturization of the lens assembly 400 may be limited, and when the ratio does not reach 1, it may be difficult to manufacture or combine lenses satisfying design characteristics. For example, the lens assembly 400 according to the embodiment can be easily manufactured or combined using the lenses 401, 402, 403, and 404 while being miniaturized.
According to an embodiment, the lens assembly 400 may satisfy the following equation 3.
[Equation 3]
0.15 ≤ tmax/f ≤ 0.5
Here, "tmax" may represent a center thickness of the lens having the maximum thickness, and "f" may represent a total focal length of the lens assembly 400. The "center thickness" may represent the thickness of each of the first to fourth lenses 401, 402, 403, and 404 on the optical axis a, and may be represented by "tn" (where "n" is a natural number). For example, "t1" in fig. 4 may represent the center thickness of the first lens 401, and in the lens assembly 400 shown in fig. 4, the center thickness t3 of the third lens 403 is 0.552388, which may be the largest. In one embodiment, the total focal length of the lens assembly 400 is 1.921, and the ratio of the maximum center thickness tmax to the focal length f may satisfy equation 3. When the ratio of the maximum center thickness tmax of the lens assembly 400 to the total focal length f exceeds 0.5, miniaturization of the lens assembly 400 may be limited due to the thickness of the lens, and when the ratio does not reach 0.15, it may be difficult to manufacture the lens assembly satisfying the design characteristics. For example, the lens assembly 400 according to the embodiment can be easily manufactured or combined using lenses 401, 402, 403, and 404 while being miniaturized.
The lens data of the lens assembly 400 is shown in table 1 below, where "aperture stop" may indicate the opening surface of the aperture stop S, and "S2 to S11" may indicate the surfaces of the respective lenses 401, 402, 403, and 404 and/or the bandpass filter 405. The lens assembly 400 has a focal length of 1.921mm, an F-number of 1.64, a viewing angle of 75 degrees, and a maximum image height of 1.5mm, and may satisfy at least one of the above equations.
TABLE 1
[Table 1: lens data for the lens assembly 400; provided only as an image in the original publication.]
The aspherical coefficients of the first to fourth lenses 401, 402, 403, and 404 are shown in tables 2 and 3 below; the shape of each aspherical surface can be calculated from these coefficients by the following equation 4.
[Equation 4]
x = (c·y²) / (1 + √(1 − (1+K)·c²·y²)) + A·y⁴ + B·y⁶ + C·y⁸ + D·y¹⁰ + E·y¹² + F·y¹⁴ + G·y¹⁶ + H·y¹⁸ + J·y²⁰
Here, "x" may represent a distance from the lens vertex in the direction of the optical axis a, "y" may represent a distance in the direction perpendicular to the optical axis a, "C" may represent an inverse of a radius of curvature at the lens vertex, "K" may represent a quadratic constant, and "a", "B", "C", "D", "E", "F", "G", "H", and "J" may represent aspherical coefficients, respectively.
TABLE 2
S2 S3 S4 S5
K -99.000000 0.378418 0.636931 314602
A 0.182436E+01 -0.591959E+00 -0.227167E+00 0.232548E+00
B -0.158916E+02 -0.325118E+01 -0.341451E+01 -0.253700E+01
C 0.100487E+03 0.284453E+02 0.107911E+02 0.168462E+01
D -0.430830E+03 -0.144168E+03 -0.304075E+02 0.155140E+02
E 0.121683E+04 0.481397E+03 0.100819E+03 -0.550828E+02
F -0.222664E+04 -0.105573E+04 -0.266383E+03 0.810567E+02
G 0.253624E+04 0.145083E+04 0.452565E+03 -0.583465E+02
H -0.162931E+04 -0.111587E+04 -0.422345E+03 0.178095E+02
J 0.449670E+03 0.365391E+03 0.165255E+03 -0.898805E+00
TABLE 3
S6 S7 S8 S9
K -56.769953 -7.707943 0.111614 -2.146231
A -0.308161E-01 -0.886176E+00 -0.853398E+00 -0.465917E+00
B 0.175201E+01 0.196136E+01 0.149970E+01 -0.156316E+00
C -0.126893E+02 -0.317631E+00 -0.783564E+01 0.264827E+01
D 0.489301E+02 -0.189369E+02 0.309094E+02 -0.716632E+01
E -0.111380E+03 0.782428E+02 -0.766578E+02 0.106322E+02
F 0.155315E+03 -0.157127E+03 0.117704E+03 -0.961967E+01
G -0.129721E+03 0.180423E+03 -0.109170E+03 0.526560E+01
H 0.593016E+02 -0.112294E+03 0.560674E+02 -0.160148E+01
J -0.114332E+02 0.290752E+02 -0.122650E+02 0.207401E+00
Fig. 6 (a) is a graph representing the spherical aberration of a lens assembly (e.g., lens assembly 400 in fig. 5) according to one of the various embodiments; the horizontal axis shows the longitudinal spherical aberration coefficient, the vertical axis shows the normalized distance from the center of the optical axis, and the curves show the change in longitudinal spherical aberration with the wavelength of light. Fig. 6 (b) is a graph representing the astigmatism of the lens assembly 400, and fig. 6 (c) is a graph showing the distortion rate of the lens assembly 400.
Referring to (c) in fig. 6, an image captured by the lens assembly 400 may show some distortion at points deviating from the optical axis A; such distortion, however, commonly occurs in optical devices using optical lenses or lens assemblies. With a distortion rate of about 1%, this short-focal-length optical system can provide good optical characteristics.
In the following detailed description, components that can be easily understood from the foregoing embodiments may be denoted by the same reference numerals as in the foregoing embodiments or their reference numerals may be omitted, and their detailed descriptions may likewise be omitted.
Fig. 7 is a diagram showing a configuration of a lens assembly 500 according to another embodiment of various embodiments. Fig. 8 (a) is a graph showing spherical aberration of the lens assembly 500 according to another embodiment of the various embodiments, fig. 8 (b) is a graph showing astigmatism of the lens assembly 500 according to another embodiment of the various embodiments, and fig. 8 (c) is a graph showing distortion rate of the lens assembly 500 according to another embodiment of the various embodiments.
Referring to (a) to (c) in fig. 7 and 8, the lens assembly 500 may include first to fourth lenses 501, 502, 503, and 504 and a bandpass filter 505, and the first to fourth lenses 501, 502, 503, and 504 may be sequentially arranged along the optical axis A from the object O side toward the image sensor 506 side. Although the first to fourth lenses 501, 502, 503, and 504 differ in some details (such as lens data) from the previous embodiment shown in fig. 5, the lens assembly 500 may satisfy at least one of the conditions described in the previous embodiment. These conditions may include the transmittance characteristics of the first lens 501 (e.g., those of the first lens 401 of fig. 5), the refractive powers and lens surface shapes of the first to fourth lenses 501, 502, 503, and 504, the number of lens surfaces including inflection points, the conditions presented in equations 1 to 3, and the materials of the first to fourth lenses 501, 502, 503, and 504.
According to an embodiment, the lens assembly 500 may not include a separate aperture stop (e.g., the aperture stop S in fig. 5). For example, in the lens assembly 500, the object side surface S2 of the first lens 501 may serve as the opening surface (aperture stop) of the lens assembly 500.
The lens data of the lens assembly 500 is shown in table 4 below, where "object" may indicate a subject, and "S2 to S11" may indicate surfaces of the respective lenses 501, 502, 503, and 504 and/or the bandpass filter 505. The lens assembly 500 has a focal length of 1.904mm, an F-number of 1.12, a viewing angle of 78.4 degrees, and a maximum image height of 1.5mm, and may satisfy at least one of the above equations.
TABLE 4
[Table 4: lens data for the lens assembly 500; provided only as an image in the original publication.]
The aspherical coefficients of the first to fourth lenses 501, 502, 503, and 504 are shown in tables 5 and 6 below.
TABLE 5
S2 S3 S4 S5
K -99.000000 9.912100 0.560161 1.399029
A 0.177479E+01 -0.512059E+00 -0.799243E+00 0.379203E+00
B -0.153624E+02 -0.658051E+00 0.394245E+01 -0.488573E+01
C 0.956621E+02 0.607808E+01 -0.364164E+02 0.250366E+02
D -0.406319E+03 -0.285070E+02 0.183957E+03 -0.941454E+02
E 0.114256E+04 0.975832E+02 -0.529614E+03 0.224331E+03
F -0.208597E+04 -0.224377E+03 0.909663E+03 -0.323657E+03
G 0.236747E+04 0.311028E+03 -0.922233E+03 0.273768E+03
H -0.151368E+04 -0.232020E+03 0.508203E+03 -0.125026E+03
J 0.415783E+03 0.712984E+02 -0.117075E+03 0.238290E+02
TABLE 6
S6 S7 S8 S9
K -99.000000 -34.966230 0.134549 -1.879691
A 0.996477E+00 -0.171170E+01 -0.169106E-01 -0.194544E+00
B -0.585933E+01 0.119472E+02 -0.331380E+01 -0.778342E+00
C 0.286682E+02 -0.597450E+02 0.173504E+02 0.344459E+01
D -0.104628E+03 0.202028E+03 -0.627653E+02 -0.798102E+01
E 0.239036E+03 -0.449788E+03 0.150600E+03 0.118735E+02
F -0.324465E+03 0.645678E+03 -0.231241E+03 -0.115421E+02
G 0.254356E+03 -0.568071E+03 0.216897E+03 0.705220E+01
H -0.106395E+03 0.276336E+03 -0.112753E+03 -0.244943E+01
J 0.183891E+02 -0.566704E+02 0.247908E+02 0.367326E+00
Fig. 9 is a diagram showing a configuration of a lens assembly 600 according to another embodiment of various embodiments. Fig. 10 (a) is a graph showing spherical aberration of the lens assembly 600 according to another embodiment of the various embodiments, fig. 10 (b) is a graph showing astigmatism of the lens assembly 600 according to another embodiment of the various embodiments, and fig. 10 (c) is a graph showing distortion rate of the lens assembly 600 according to another embodiment of the various embodiments.
Referring to (a) to (c) in fig. 9 and 10, the lens assembly 600 may include first to fourth lenses 601, 602, 603, and 604 and a bandpass filter 605, and the first to fourth lenses 601, 602, 603, and 604 may be sequentially arranged along the optical axis A from the object O side toward the image sensor 606 side. Although the first to fourth lenses 601, 602, 603, and 604 differ in some details (such as lens data) from the previous embodiments shown in fig. 5 and 7, the lens assembly 600 may satisfy at least one of the conditions described in the previous embodiments.
The lens data of the lens assembly 600 is shown in table 7 below, where "aperture stop" may indicate the opening surface of the aperture stop S, and "S2 to S11" may indicate the surfaces of the respective lenses 601, 602, 603, and 604 and/or the bandpass filter 605. The lens assembly 600 has a focal length of 1.98mm, an F-number of 1.26, a viewing angle of 70 degrees, and a maximum image height of 1.5mm, and may satisfy at least one of the above equations.
TABLE 7
[Table 7: lens data for the lens assembly 600; provided only as images in the original publication.]
The aspherical coefficients of the first to fourth lenses 601, 602, 603, and 604 are shown in tables 8 and 9 below.
TABLE 8
S2 S3 S4 S5
K -97.609965 10.466068 -3.491424 -0.252535
A 0.186199E+01 -0.545894E+00 -0.408448E+00 0.183394E+00
B -0.160596E+02 -0.333579E+01 -0.303131E+01 -0.233221E+01
C 0.100929E+03 0.287563E+02 0.113124E+02 0.168605E+01
D -0.430992E+03 -0.142907E+03 -0.316400E+02 0.153625E+02
E 0.121608E+04 0.478270E+03 0.996370E+02 -0.552770E+02
F -0.222664E+04 -0.105605E+04 -0.266595E+03 0.813297E+02
G 0.253624E+04 0.145083E+04 0.452563E+03 -0.583171E+02
H -0.162931E+04 -0.111587E+04 -0.422347E+03 0.178161E+02
J 0.449670E+03 0.365391E+03 0.165253E+03 -0.898791E+00
TABLE 9
S6 S7 S8 S9
K -99.000000 -9.108304 0.180233 -2.427213
A -0.583806E-01 -0.871045E+00 -0.729710E+00 -0.333802E+00
B 0.180546E+01 0.198405E+01 0.145495E+01 -0.279173E+00
C -0.128125E+02 -0.216180E+00 -0.784100E+01 0.270348E+01
D 0.488753E+02 -0.192729E+02 0.309174E+02 -0.715750E+01
E -0.111350E+03 0.777014E+02 -0.766539E+02 0.106195E+02
F 0.155386E+03 -0.156475E+03 0.117700E+03 -0.962403E+01
G -0.129618E+03 0.180420E+03 -0.109163E+03 0.526627E+01
H 0.592991E+02 -0.112296E+03 0.560650E+02 -0.159734E+01
J -0.114403E+02 0.290727E+02 -0.122668E+02 0.205865E+00
Fig. 11 is a view showing a configuration of a lens assembly 700 according to another embodiment of various embodiments. Fig. 12 (a) is a graph showing spherical aberration of the lens assembly 700 according to another embodiment of the various embodiments, fig. 12 (b) is a graph showing astigmatism of the lens assembly 700 according to another embodiment of the various embodiments, and fig. 12 (c) is a graph showing distortion rate of the lens assembly 700 according to another embodiment of the various embodiments.
Referring to (a) to (c) in fig. 11 and 12, the lens assembly 700 may include first to fourth lenses 701, 702, 703, and 704 and a bandpass filter 705, and the first to fourth lenses 701, 702, 703, and 704 may be sequentially arranged along the optical axis A from the object O side toward the image sensor 706 side. According to an embodiment, the second lens 702 may include an object-side lens 702a and an image sensor-side lens 702b. For example, the lens assembly 700 according to this embodiment may include five lenses, and the number of lenses included in the lens assembly may be appropriately selected in consideration of various design conditions, such as aberration correction and the total length of the lens assembly.
Although the first to fourth lenses 701, 702, 703 and 704 have differences in some details (such as lens shape and lens data) from the previous embodiments shown in fig. 5, 7 and 9, the lens assembly 700 may satisfy at least one of the conditions described by the previous embodiments.
The lens data of the lens assembly 700 is shown in table 10 below, where "aperture stop" may indicate the opening surface of the aperture stop S, and "S2 to S13" may indicate the surfaces of the respective lenses 701, 702, 703, and 704 and/or the bandpass filter 705. The lens assembly 700 has a focal length of 1.96mm, an F-number of 1.62, a viewing angle of 74 degrees, and a maximum image height of 1.5mm, and may satisfy at least one of the above equations.
Table 10
[Table 10: lens data for the lens assembly 700; provided only as images in the original publication.]
The aspherical coefficients of the first to fourth lenses 701, 702, 703, and 704 are shown in tables 11 and 12 below.
TABLE 11
[Table 11: aspherical coefficients of the lenses of the lens assembly 700; provided only as an image in the original publication.]
Table 12
[Table 12: aspherical coefficients of the lenses of the lens assembly 700; provided only as images in the original publication.]
Referring again to fig. 1 to 4, a camera module or an electronic device (e.g., the camera module 280 of fig. 2 or the electronic device 300 of fig. 3) including the above-described lens assembly 400, 500, 600, or 700 will be described.
According to some embodiments, the lens assembly 400, 500, 600, or 700 described above may be provided as the lens assembly 210 of fig. 2. In an embodiment, a camera module (e.g., camera module 280 in fig. 2) including such a lens assembly 400, 500, 600, or 700 may be implemented as camera modules 305, 312, and 313 in fig. 3 or 4. In some embodiments, the camera module 305 disposed on the front side of the electronic device 300 of fig. 3 may include a plurality of cameras, e.g., a first camera and a second camera. In an embodiment, the first camera of the camera module 305 may include the lens assembly 400, 500, 600, or 700 as described above, and may detect distance information about an object using near infrared rays. The second camera of the camera module 305 may be a camera for capturing light in the visible spectrum. For example, the second camera may detect or acquire second information about the object, such as at least one of color information, luminance information, saturation information, and contrast information. In some embodiments, the second camera may include a plurality of cameras. Thus, in this example, the first camera may include a near infrared camera, and the second camera may be composed of a combination of a tele camera and a wide camera.
According to some embodiments, a camera module (e.g., camera module 280 in fig. 2) including the lens assembly 400, 500, 600, or 700 may be used for security purposes, for example in public or living spaces, depending on design parameters such as the outside diameter of the lenses or the total length of the lens assembly. For example, the lens assembly 400, 500, 600, or 700 or the camera module 280 may be used as a closed-circuit security camera, a camera for recognizing objects in a vehicle, or a thermal imaging camera. In another embodiment, the lens assembly 400, 500, 600, or 700 may be manufactured to have a total length of about 3mm. When so designed, the lens assembly 400, 500, 600, or 700 may be installed in a personal electronic device, such as a mobile communication terminal, to provide functions such as user authentication, object recognition, augmented reality, and three-dimensional scanning.
According to some embodiments, the electronic device 300 emits light (e.g., infrared or near infrared light) toward the object using the light source device, and the first camera of the camera module 305 may detect first information about the object, such as distance information (e.g., depth information), by detecting the light emitted from the light source device and reflected by the object. In an embodiment, the light source device may include an infrared emitting diode or a near infrared laser light source, and the light emitting device 306 of the electronic device 300 may be used as this light source device. In another embodiment, the electronic device 300 may include a light source device for emitting the light used to detect distance information, separate from the light emitting device 306.
Hereinafter, an example of a method of controlling an electronic device including the lens assembly 400, 500, 600, or 700 or the camera module 280 or 305 will be described with reference to fig. 13 and 14.
Fig. 13 is a flowchart illustrating a control method (1300) of an electronic device including a lens assembly according to an embodiment.
Referring to fig. 13, the control method 1300 is a method of performing user authentication based on first information acquired via a camera (e.g., the first camera of the camera module 305 in fig. 3) including the lens assembly 400, 500, 600, or 700. For example, a processor or image signal processor of the electronic device (e.g., processor 120 in fig. 1 or image signal processor 260 in fig. 2) may perform user authentication by comparing the first information with information stored in a memory (e.g., memory 130 or 250 in fig. 1 or 2). Through user authentication, the electronic device may release the lock mode or may perform the authentication required by an application currently executed on the electronic device.
According to an embodiment, operation 1301 is an operation of determining whether to perform a process requiring authentication. When an electronic device (e.g., the electronic device 300 in fig. 3) is in, for example, its lock mode, or when the electronic device is executing a specific application requiring authentication, the processor 120 or the image signal processor 260 may determine whether a preset condition is satisfied. The "preset condition" may include, for example, a user request (e.g., a touch input or button operation) for releasing the lock mode, or an authentication request from an executed application.
According to an embodiment, operation 1302 is an operation of imaging a subject. When the preset condition is satisfied in operation 1301, the processor 120 or the image signal processor 260 may activate the first camera of the camera module 305 to image the subject. According to an embodiment, the first camera may be implemented as a near infrared camera by including the lens assembly 400, 500, 600, or 700 described above. In some embodiments, the electronic device may emit light (e.g., near infrared laser light) using a light source device (e.g., the light emitting device 306 of fig. 3), and the emitted light may be reflected by the subject and be incident on the first camera. The first camera may acquire first information, for example, distance information (e.g., depth information) about the subject, from the incident light.
According to an embodiment, operation 1303 is an operation of determining whether the user is a registered user, and the processor 120 or the image signal processor 260 may perform authentication on the subject by comparing the acquired first information with the user information stored in the memory 130 or 250. When it is determined that the user is a registered user as a result of performing authentication on the subject, the processor 120 or the image signal processor 260 may perform operation 1304.
According to an embodiment, operation 1304 is an operation of performing a preset operation mode. When the user information is authenticated, the electronic device (e.g., the processor 120 or the image signal processor 260) may perform a preset operation mode. The "preset operation mode" may be a mode in which the lock mode is released to activate the electronic device 300 or in which the subsequent processing of the application continues. When user authentication fails, the processor 120 or the image signal processor 260 may re-perform the control method 1300 from operation 1301.
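The flow of operations 1301 to 1304 can be summarized in code form. The following is only an illustrative sketch; the device object and all helper names (authentication_requested, capture_depth_image, match_user, unlock) are hypothetical, and only the operation flow is taken from the text:

```python
# Illustrative sketch of operations 1301-1304 of fig. 13.
def control_method_1300(device):
    while True:
        # Operation 1301: wait for a process that requires authentication
        # (e.g., a request to release the lock mode or an application request).
        if not device.authentication_requested():
            continue
        # Operation 1302: emit near infrared light and image the subject with
        # the first camera to acquire first information (depth/distance).
        first_info = device.first_camera.capture_depth_image()
        # Operation 1303: compare with the registered user information in memory.
        if device.match_user(first_info):
            # Operation 1304: release the lock mode or continue the application.
            device.unlock()
            return
        # Authentication failed: re-perform from operation 1301.
```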
Fig. 14 is a flowchart illustrating another control method (1400) of an electronic device including a lens assembly according to an embodiment.
Referring to fig. 14, the control method 1400 is an example of a method of capturing a subject image using at least one of the first camera and the second camera of the camera module 305. The second camera may detect or acquire second information about the subject, for example, color information, luminance information, saturation information, and contrast information. According to an embodiment, in a first mode in which the subject is imaged using both the first camera and the second camera, the processor 120 or the image signal processor 260 may generate the subject image by combining the first information acquired by the first camera and the second information acquired by the second camera. According to another embodiment, in a second mode of the control method 1400, the processor or the image signal processor may generate the subject image based on the second information acquired by the second camera. The operations for generating an image in the first mode and the second mode will now be described in more detail.
According to an embodiment, operation 1401 is an operation of determining an imaging mode, in which the processor 120 or the image signal processor 260 may determine whether to perform imaging and/or select between the first mode and the second mode. The first mode may include, for example, a mode of imaging the subject using both the first camera and the second camera, and the second mode may include a mode of imaging the subject using the second camera.
According to an embodiment, in operation 1402a of the first mode, the processor 120 or the image signal processor 260 may image the subject using both the first camera and the second camera. According to an embodiment, light (e.g., near infrared light) emitted from a light source device (e.g., the light emitting device 306 in fig. 3) may be incident on the first camera, and the first camera may acquire first information about the subject from the incident light. The first information may include distance information (e.g., depth information) of the subject, so that, for example, the processor 120 or the image signal processor 260 may detect or identify the three-dimensional shape of the subject. For example, the first information may include data serving as a basis for generating a depth map or a three-dimensional model, from which feature points of the subject may be extracted. The second camera may detect or acquire second information about the subject, such as color information, brightness information, saturation information, and contrast information.
According to an embodiment, operation 1403a of the first mode is an operation of generating the subject image, and the processor 120 or the image signal processor 260 may generate the subject image by combining the first information and the second information acquired in operation 1402a. In an embodiment, the generated subject image may have a three-dimensional visual effect by including the first information (e.g., depth or distance information). In another embodiment, the generated subject image may be an image in which a specific portion of the subject is emphasized based on the first information, or in which the subject is rendered with three-dimensional characteristics.
According to an embodiment, operation 1404a of the first mode is an operation of storing the generated subject image, and the processor 120 or the image signal processor 260 may store the subject image generated in operation 1403a in the memory 130 or 250.
As described above, an electronic device (e.g., electronic device 300 in fig. 3) according to some embodiments may use a camera module (e.g., a first camera of camera module 305) including a lens assembly to obtain distance information (e.g., depth information) about a subject. In some embodiments, the electronic device 300 may provide various visual effects on the subject image by combining information acquired by another camera (e.g., the second camera of the camera module 305) with information acquired by the first camera.
According to an embodiment, in operation 1402b of the second mode, the processor 120 or the image signal processor 260 may image the subject using the second camera. The second camera may detect or acquire second information about the subject, such as color information, brightness information, saturation information, and contrast information.
According to an embodiment, operation 1403b of the second mode is an operation of generating the subject image, and the processor 120 or the image signal processor 260 may generate the subject image based on the second information acquired in operation 1402b.
According to an embodiment, operation 1404b of the second mode is an operation of storing the generated subject image, and the processor 120 or the image signal processor 260 may store the subject image generated in operation 1403b in the memory 130 or 250.
In some embodiments, operation 1402b of the second mode may actually be performed using both the first camera and the second camera. In operation 1403b of generating the subject image, however, the processor 120 or the image signal processor 260 may exclude the information acquired by the first camera and generate the subject image based only on the information acquired by the second camera.
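The mode selection of the control method 1400 can likewise be sketched in code. The capture, combine, and generate helpers below are hypothetical stand-ins for the processing performed by the processor 120 or the image signal processor 260; only the operation flow is taken from the text:

```python
# Illustrative sketch of control method 1400 of fig. 14.
def control_method_1400(device, mode):
    if mode == "first":
        # Operations 1402a-1403a: image with both cameras and combine the
        # first information (depth) with the second information (color, etc.).
        first_info = device.first_camera.capture_depth_image()
        second_info = device.second_camera.capture_color_image()
        image = device.combine(first_info, second_info)
    else:
        # Operations 1402b-1403b: even if both cameras run, the image is
        # generated from the second information only.
        second_info = device.second_camera.capture_color_image()
        image = device.generate_from(second_info)
    # Operations 1404a/1404b: store the generated subject image in memory.
    device.memory.store(image)
```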
Note that the control method described with reference to fig. 13 or 14 is merely an example, and does not limit the present disclosure. The control method described with reference to fig. 13 or 14 may include at least a portion of the configuration or operation of the processor or image signal processor described with reference to fig. 2.
According to an embodiment, a lens assembly (e.g., lens assembly 400, 500, 600, or 700 in fig. 5, 7, 9, or 11) includes at least four lenses sequentially arranged along the optical axis from the subject to the image sensor. Of the at least four lenses, a first lens (e.g., the first lens 401, 501, 601, or 701 in fig. 5, 7, 9, or 11) disposed closest to the subject has a visible light transmittance ranging from 0% to 5%, and, among the subject-side surfaces and image sensor-side surfaces of the remaining lenses other than the first lens, at least four surfaces include inflection points.
According to an embodiment, a lens assembly may have: a first lens (e.g., the first lens 401, 501, 601, or 701 in fig. 5, 7, 9, or 11) having a positive refractive power and disposed closest to the subject; at least one second lens (e.g., second lens 402, 502, 602, or 702 in fig. 5, 7, 9, or 11) disposed adjacent to the first lens and having a positive refractive power; a third lens (e.g., third lens 403, 503, 603, or 703 in fig. 5, 7, 9, or 11) disposed adjacent to the second lens and having a positive refractive power; and a fourth lens (e.g., fourth lens 404, 504, 604, or 704 in fig. 5, 7, 9, or 11) disposed adjacent to the third lens, closest to the image sensor (e.g., image sensor 406, 506, 606, or 706 in fig. 5, 7, 9, or 11), and having a negative refractive power.
In an embodiment, the first, second, third, and fourth lenses may be sequentially arranged along an optical axis (e.g., the optical axis A in fig. 5, 7, 9, or 11) from the subject to the image sensor.
According to an embodiment, the lens assembly may have an F-number ranging from 1.0 to 1.7.
According to an embodiment, the lens assembly may satisfy the following conditional expression 1:
1.5 ≤ TL/ImgH ≤ 3.5
where "TL" may represent a distance from the object-side surface of the first lens to the imaging surface of the image sensor, and "ImgH" may represent a maximum image height of an image formed on the imaging surface.
According to an embodiment, the lens assembly may satisfy the following conditional expression 2:
1 ≤ f1/f ≤ 10
where "f1" may represent the focal length of the first lens, and "f" may represent the total focal length of the lens assembly.
According to an embodiment, the first lens may be a meniscus lens, wherein an object side surface of the first lens is convex, and an image sensor side surface of the first lens is concave.
According to an embodiment, all lenses of the at least four lenses may be meniscus lenses.
According to an embodiment, the lens assembly may satisfy the following conditional expression 3:
0.15 ≤ tmax/f ≤ 0.5
where "tmax" may denote a center thickness of the lens having the maximum thickness on the optical axis, and "f" may denote a total focal length of the lens assembly.
According to an embodiment, at least one of the at least four lenses may be a plastic aspherical lens.
According to an embodiment, the image sensor side surface of the first lens may include the following inflection point shape: the center portion aligned on the optical axis is concave, and the peripheral portion on the edge side is convex.
According to an embodiment, the lens assembly may further include: a bandpass filter (e.g., bandpass filter 405, 505, 605, or 705 in fig. 5, 7, 9, or 11) is disposed between a lens (e.g., fourth lens 404, 504, 604, or 704 in fig. 5, 7, 9, or 11) disposed closest to the image sensor and the image sensor, and the bandpass filter may have a transmittance with respect to light having a wavelength between 800nm and 1000nm ranging from 90% to 99% and a transmittance with respect to visible light ranging from 0% to 1%.
According to an embodiment, an electronic device (e.g., electronic device 101 or 300 in fig. 1 or 3) may include: a first camera (e.g., one of the camera modules 305 in fig. 3) including a lens assembly and configured to acquire first information about a subject based on light incident through the lens assembly; at least one second camera (e.g., another one of the camera modules 305 in fig. 3) configured to acquire second information about the subject different from the first information; and a processor or image signal processor (e.g., processor 120 in fig. 1 or image signal processor 260 in fig. 2), wherein the lens assembly may include at least four lenses sequentially arranged along the optical axis from the subject to the image sensor. Among the at least four lenses, a first lens disposed closest to the subject may have a visible light transmittance ranging from 0% to 5%; among the subject-side surfaces and image sensor-side surfaces of the remaining lenses other than the first lens, at least four surfaces may include inflection points; and the processor or the image signal processor may be configured to perform user authentication based on the first information.
According to an embodiment, the above-described electronic device may further include a light source device (e.g., the light emitting device 306 in fig. 3), and the first camera may acquire the first information by receiving light emitted from the light source device and reflected by the subject.
According to an embodiment, the light source device may comprise a near infrared laser light source.
According to an embodiment, the first information may include at least distance information about the subject.
According to an embodiment, the lens assembly may satisfy the following conditional expression 4:
1.5 ≤ TL/ImgH ≤ 3.5
where "TL" may represent a distance from the object-side surface of the first lens to the imaging surface of the image sensor, and "ImgH" may represent a maximum image height of an image formed on the imaging surface.
According to an embodiment, the processor or the image signal processor may be configured to generate an image of the subject by combining the first information and the second information.
According to an embodiment, the second information may include at least one of color information, luminance information, saturation information, and contrast information of the subject.
According to an embodiment, the first information may include at least distance information of the subject, the second information may include at least one of color information, luminance information, saturation information, and contrast information of the subject, and the processor or the image signal processor may be configured to generate an image of the subject by combining the first information and the second information.
According to an embodiment, the lens assembly may satisfy the following conditional expression 5:
0.15 ≤ tmax/f ≤ 0.5
where "tmax" may denote a center thickness of the lens having the maximum thickness on the optical axis, and "f" may denote a total focal length of the lens assembly.
According to an embodiment, the lens assembly or the electronic device including the lens assembly may further include: a band pass filter disposed between the lens disposed closest to the image sensor and the image sensor, and the band pass filter may have a transmittance for light having a wavelength between 800nm and 1000nm ranging from 90% to 99% and may have a transmittance for visible light ranging from 0% to 1%.
Some of the above-described embodiments of the present disclosure may be implemented in hardware or firmware, or through the execution of software or computer code that is stored in a recording medium such as a CD-ROM, a Digital Versatile Disc (DVD), a magnetic tape, a RAM, a floppy disk, a hard disk, or a magneto-optical disk, or that is downloaded over a network and stored on a local recording medium after being initially stored on a remote recording medium or a non-transitory machine-readable medium. The methods described herein may thus be rendered, via such software stored on the recording medium, using a general purpose computer, a special purpose processor, or programmable or dedicated hardware (such as an ASIC or FPGA). As will be appreciated in the art, the computer, processor, microprocessor controller, or programmable hardware includes memory components (e.g., RAM, ROM, or flash memory) that can store or receive software or computer code that, when accessed and executed by the computer, processor, or hardware, implements the processes described herein.
While the present disclosure has been shown and described with reference to certain embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present disclosure as defined by the appended claims.

Claims (13)

1. A lens assembly (400, 500, 600, 700), comprising:
at least four lenses (401-405, 501-505, 601-605, 701-705) including a first lens, a second lens, a third lens, and a fourth lens sequentially arranged along an optical axis from a subject to an image sensor (406, 506, 606, 706),
wherein a first lens (401, 501, 601, 701) of the at least four lenses (401-405, 501-505, 601-605, 701-705) disposed closest to the subject has a visible light transmittance ranging from 0% to 5%,
wherein at least four of the object side surfaces and the image sensor side surfaces of the remaining lenses excluding the first lens (401, 501, 601, 701) include inflection points,
wherein the fourth lens has a negative refractive power,
wherein a light beam formed on a peripheral portion of an imaging plane of the image sensor has an incident angle of not less than 30 degrees,
wherein all of the at least four lenses (401-405, 501-505, 601-605, 701-705) are meniscus lenses, and
wherein the image sensor side surface (S3) of the first lens (401, 501, 601, 701) has the following inflection point shape: the center portion aligned on the optical axis is concave, and the peripheral portion on the edge side is convex.
2. The lens assembly (400, 500, 600, 700) of claim 1, wherein the lens assembly (400, 500, 600, 700) has an F-number ranging from 1.0 to 1.7.
3. The lens assembly (400, 500, 600, 700) of claim 1, wherein the lens assembly (400, 500, 600, 700) satisfies the following conditional expression 1:
1.5 ≤ TL/ImgH ≤ 3.5
wherein "TL" represents a distance from the object side surface (S2) of the first lens (401, 501, 601, 701) to the Imaging Surface (IS) of the image sensor (406, 506, 606, 706), and "ImgH" represents a maximum image height of an image formed on the Imaging Surface (IS).
4. The lens assembly (400, 500, 600, 700) of claim 1, wherein the lens assembly (400, 500, 600, 700) satisfies the following conditional expression 2:
1 ≤ f1/f ≤ 10
wherein "f1" denotes a focal length of the first lens (401, 501, 601, 701), and "f" denotes a total focal length of the lens assembly (400, 500, 600, 700).
5. The lens assembly (400, 500, 600, 700) of claim 1, wherein the first lens (401, 501, 601, 701) is a meniscus lens, wherein the object side surface (S2) of the first lens (401, 501, 601, 701) is convex, and the image sensor side surface (S3) of the first lens (401, 501, 601, 701) is concave.
6. The lens assembly (400, 500, 600, 700) of claim 1, wherein the lens assembly (400, 500, 600, 700) satisfies the following conditional expression 3:
0.15 ≤ tmax/f ≤ 0.5
wherein "tmax" denotes a center thickness of the lens having the maximum thickness on the optical axis, and "f" denotes a total focal length of the lens assembly (400, 500, 600, 700).
7. The lens assembly (400, 500, 600, 700) of claim 1, wherein at least one of the at least four lenses (401-405, 501-505, 601-605, 701-705) is a plastic aspheric lens.
8. The lens assembly (400, 500, 600, 700) of claim 1, further comprising:
a bandpass filter (405, 505, 605, 705) arranged between a lens of the at least four lenses (401-405, 501-505, 601-605, 701-705) arranged closest to the image sensor (406, 506, 606, 706) and the image sensor (406, 506, 606, 706),
wherein the band pass filter (405, 505, 605, 705) has a transmittance for light having a wavelength between 800nm and 1000nm in a range from 90% to 99%, and has a transmittance for visible light in a range from 0% to 1%.
9. An electronic device (101, 300), comprising:
a first camera (305) comprising the lens assembly (400, 500, 600, 700) of any of claims 1 to 8, and configured to acquire first information about a subject based on light incident through the lens assembly (400, 500, 600, 700);
at least one second camera (305) configured to acquire second information different from the first information about the subject; and
a processor or image signal processor (120, 260),
wherein the processor or the image signal processor (120, 260) is configured to perform user authentication based on the first information.
10. The electronic device (101, 300) of claim 9, further comprising:
a light source device (306),
wherein the first camera (305) acquires first information by receiving light emitted from the light source device (306) and reflected by the subject.
11. The electronic device (101, 300) of claim 10, wherein the light source device (306) comprises an infrared emitting diode or a near infrared laser light source.
12. The electronic device (101, 300) of claim 9, wherein the processor or the image signal processor (120, 260) is configured to generate an image of the subject by combining first information and second information.
13. The electronic device (101, 300) of claim 9, wherein the first information includes at least distance information about the subject,
the second information includes color information, brightness information, saturation information, or contrast information of the subject, and
the processor or the image signal processor (120, 260) is configured to generate an image of the subject by combining the first information and the second information.
CN201980073299.4A 2018-11-14 2019-10-17 Lens assembly and electronic device comprising same Active CN113056692B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
KR1020180139644A KR20200055944A (en) 2018-11-14 2018-11-14 Lens assembly and electronic device with the same
KR10-2018-0139644 2018-11-14
PCT/KR2019/013650 WO2020101193A1 (en) 2018-11-14 2019-10-17 Lens assembly and electronic device including the same

Publications (2)

Publication Number Publication Date
CN113056692A (en) 2021-06-29
CN113056692B (en) 2023-05-23

Family

ID=70551232

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980073299.4A Active CN113056692B (en) 2018-11-14 2019-10-17 Lens assembly and electronic device comprising same

Country Status (5)

Country Link
US (1) US11598933B2 (en)
EP (1) EP3847494A4 (en)
KR (1) KR20200055944A (en)
CN (1) CN113056692B (en)
WO (1) WO2020101193A1 (en)





Also Published As

Publication number Publication date
WO2020101193A1 (en) 2020-05-22
EP3847494A1 (en) 2021-07-14
US11598933B2 (en) 2023-03-07
KR20200055944A (en) 2020-05-22
EP3847494A4 (en) 2021-10-13
CN113056692A (en) 2021-06-29
US20200150387A1 (en) 2020-05-14


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant