WO2020085873A1 - Camera and terminal comprising same - Google Patents

Camera and terminal comprising same

Info

Publication number
WO2020085873A1
WO2020085873A1 (PCT/KR2019/014212)
Authority
WO
WIPO (PCT)
Prior art keywords
light
pattern light
pattern
output
processor
Prior art date
Application number
PCT/KR2019/014212
Other languages
English (en)
Korean (ko)
Inventor
이지은
신윤섭
정용우
Original Assignee
LG Electronics Inc. (엘지전자 주식회사)
Priority date
Filing date
Publication date
Application filed by LG Electronics Inc.
Publication of WO2020085873A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; control thereof
    • H04N 23/56: Cameras or camera modules provided with illuminating means
    • H04N 23/57: Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 3/00: Simple or compound lenses
    • G02B 3/12: Fluid-filled or evacuated lenses
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; ACCESSORIES THEREFOR
    • G03B 17/00: Details of cameras or camera bodies; accessories therefor
    • G03B 17/02: Bodies
    • G03B 17/12: Bodies with means for supporting objectives, supplementary lenses, filters, masks, or turrets

Definitions

  • the present invention relates to a camera and a terminal having the same, and more particularly, to a slim camera capable of reducing heat generation and size by using a single light source unit, and a terminal having the same.
  • the camera is a device for photographing images. Recently, as cameras have been adopted in mobile terminals, research on camera miniaturization has been conducted.
  • when a depth camera is mounted on a mobile terminal, a method of simultaneously outputting a plurality of pattern lights from a plurality of light source units has conventionally been used to calculate depth information of the subject.
  • An object of the present invention is to provide a camera capable of reducing heat generation and size using a single light source unit, and a terminal having the same.
  • Another object of the present invention is to provide a camera capable of calculating depth information with high resolution by alternately outputting a plurality of pattern lights, and a terminal having the same.
  • Another object of the present invention is to provide a camera capable of changing the output period of pattern light according to temperature, and a terminal having the same.
  • a camera and a terminal having the same according to an embodiment of the present invention, for achieving the above object, include: a light source unit having a plurality of light sources and outputting a first pattern light; a light conversion unit that converts the first pattern light from the light source unit to generate a second pattern light different from the first pattern light, and outputs one of the first pattern light and the second pattern light; a light detector that detects a first reflected light received from the subject in response to the first pattern light output from the light conversion unit and a second reflected light received from the subject in response to the second pattern light; and a processor that controls the first pattern light to be output at a first time point, controls the second pattern light to be output at a second time point after the first time point, and generates depth information on the subject based on the first reflected light and the second reflected light.
  • the light conversion unit may transmit the first pattern light as-is at the first time point, and output the converted second pattern light at the second time point.
  • the optical patterns of the first pattern light and the second pattern light do not overlap.
  • the light conversion unit may include a micro lens or a liquid lens.
  • the camera and the terminal having the same may further include a lens that projects, to the outside, the first pattern light or the second pattern light output from the light conversion unit.
  • the processor can control the first pattern light and the second pattern light to be alternately output at regular intervals.
  • the camera and the terminal having the same further include a temperature detector, and the processor may control the output periods of the first pattern light and the second pattern light to vary according to the detected temperature.
  • the processor may control the output period of the first pattern light and the second pattern light to increase as the detected temperature increases.
  • the processor may lower the resolution of the first pattern light output from the light source unit as the detected temperature increases.
  • the processor may control the plurality of light sources to change the resolution of the first pattern light output from the light source unit.
  • the processor may control the resolution of the first pattern light to decrease as the amount of movement of the camera increases.
  • the processor may control the output period of the first pattern light and the second pattern light to increase as the amount of movement of the camera increases.
  • the processor may apply a first electric signal to the liquid lens at a first time point and a second electric signal to the liquid lens at a second time point.
  • the light conversion unit may include a lens driver for applying an electrical signal to the liquid lens, and a sensor unit for sensing the curvature of the liquid lens formed based on the electrical signal.
  • the sensor unit can detect a change in the size or area of the boundary region between the insulator on the electrode in the liquid lens and the electrically conductive aqueous solution.
  • the sensor unit may sense the capacitance formed by the electrically conductive aqueous solution and the electrode in response to a change in the size or area of that boundary region.
  • the light conversion unit may further include a plurality of conductive lines that supply a plurality of electrical signals output from the lens driver to the liquid lens, and a switching element disposed between one of the plurality of conductive lines and the sensor unit.
  • the processor may calculate the curvature of the liquid lens based on the capacitance sensed by the sensor unit, and output a pulse width variable signal to the lens driver based on the calculated curvature and target curvature.
  • the processor may control the duty of the pulse-width-variable signal to increase.
  • a camera and a terminal having the same according to an embodiment of the present invention include: a light source unit having a plurality of light sources and outputting a first pattern light; a light conversion unit that converts the first pattern light from the light source unit to generate a second pattern light different from the first pattern light, and outputs one of the first pattern light and the second pattern light; a light detection unit that detects a first reflected light received from the subject in response to the first pattern light output from the light conversion unit and a second reflected light received from the subject in response to the second pattern light; and a processor that controls the first pattern light to be output at a first time point, controls the second pattern light to be output at a second time point after the first time point, and generates depth information on the subject based on the first reflected light and the second reflected light. Accordingly, heat generation and size can be reduced by using a single light source unit.
  • since the light conversion unit can form a wider light gap in the first pattern light, heat generation can be reduced and the camera can be made compact.
  • the resolution of the light pattern obtained by synthesizing the first reflected light and the second reflected light increases, so that high-resolution depth information can finally be calculated.
  • the light conversion unit may transmit the first pattern light as-is at the first time point, and output the converted second pattern light at the second time point. Accordingly, since the light gap in the first pattern light can be formed wide, heat generation can be reduced, and the size of the camera can be configured compactly.
  • the optical patterns of the first pattern light and the second pattern light do not overlap. Accordingly, since the light gap in the first pattern light can be formed wide, heat generation can be reduced, and the size of the camera can be configured compactly.
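As an illustration of why non-overlapping patterns help, the sketch below (not from the patent; all names and grid sizes are hypothetical) interleaves two dot grids whose dots never coincide, so their union has twice the dot density of either pattern alone:

```python
def make_pattern(size, offset):
    # Hypothetical dot-pattern grid: the set of (row, col) positions
    # where a light dot falls. offset=1 shifts the second pattern so
    # its dots land between the dots of the first (no overlap).
    return {(r, c) for r in range(offset, size, 2)
                   for c in range(offset, size, 2)}

first = make_pattern(8, 0)    # first pattern light
second = make_pattern(8, 1)   # second pattern light, shifted

assert not (first & second)   # optical patterns do not overlap
combined = first | second     # synthesis of the two reflected lights
assert len(combined) == len(first) + len(second)  # dot density doubles
```

Here the union of the two reflected patterns plays the role of the synthesized light from which the processor calculates depth information.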
  • the light conversion unit may include a micro lens or a liquid lens. Accordingly, it is possible to easily implement the second pattern light in which the first pattern light and the light pattern do not overlap.
  • the camera and the terminal having the same may further include a lens that projects, to the outside, the first pattern light or the second pattern light output from the light conversion unit. Accordingly, the quality of the pattern light projected to the outside can be improved.
  • the processor can control the first pattern light and the second pattern light to be alternately output at regular intervals.
  • the camera and the terminal having the same further include a temperature detector, and the processor may control the output periods of the first pattern light and the second pattern light to vary according to the detected temperature. Accordingly, it is possible to reduce the heat generation of the camera.
  • the processor may control the output period of the first pattern light and the second pattern light to increase as the detected temperature increases. Accordingly, it is possible to reduce the heat generation of the camera.
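A minimal sketch of such a temperature policy, assuming illustrative thresholds and a hypothetical base period (the patent does not specify concrete values):

```python
def pattern_output_period_ms(temp_c, base_period_ms=30):
    # Hypothetical policy: lengthen the alternation period of the
    # first/second pattern light as the detected temperature rises,
    # so the light source duty (and heat generation) drops.
    if temp_c < 40:
        return base_period_ms        # normal operation
    elif temp_c < 60:
        return base_period_ms * 2    # moderate throttling
    return base_period_ms * 4        # strong throttling
```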
  • the processor may lower the resolution of the first pattern light output from the light source unit as the detected temperature increases. Accordingly, it is possible to reduce the heat generation of the camera.
  • the processor may control the plurality of light sources to change the resolution of the first pattern light output from the light source unit. Accordingly, the resolution of the depth information can be varied while reducing the heat generation of the camera.
  • the processor may control the resolution of the first pattern light to decrease as the amount of movement of the camera increases. Accordingly, the accuracy of depth information can be improved.
  • the processor may control the output period of the first pattern light and the second pattern light to increase as the amount of movement of the camera increases. Accordingly, the accuracy of depth information can be improved.
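The two motion-dependent controls above can be sketched together; the thresholds and the dot-count parameter are illustrative assumptions, not values from the patent:

```python
def adjust_for_motion(motion_px, full_dots=1000):
    # Hypothetical policy: the larger the detected camera movement
    # (here in pixels of apparent shift), the lower the pattern
    # resolution and the longer the output period, which helps keep
    # the alternately captured patterns consistent with each other.
    if motion_px < 2:
        return {"dots": full_dots, "period_ms": 30}
    if motion_px < 10:
        return {"dots": full_dots // 2, "period_ms": 60}
    return {"dots": full_dots // 4, "period_ms": 120}
```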
  • the processor may apply a first electric signal to the liquid lens at a first time point and a second electric signal to the liquid lens at a second time point. Accordingly, the liquid lens can be easily driven.
  • the light conversion unit may include a lens driver for applying an electrical signal to the liquid lens, and a sensor unit for sensing the curvature of the liquid lens formed based on the electrical signal. Accordingly, the curvature of the liquid lens can be easily calculated.
  • the sensor unit can detect a change in the size or area of the boundary region between the insulator on the electrode in the liquid lens and the electrically conductive aqueous solution. Accordingly, the curvature of the liquid lens can be easily calculated.
  • the sensor unit may sense the capacitance formed by the electrically conductive aqueous solution and the electrode in response to a change in the size or area of that boundary region. Accordingly, the curvature of the liquid lens can be easily calculated.
  • the light conversion unit may further include a plurality of conductive lines that supply a plurality of electrical signals output from the lens driver to the liquid lens, and a switching element disposed between one of the plurality of conductive lines and the sensor unit. Accordingly, the curvature of the liquid lens can be easily calculated.
  • the processor may calculate the curvature of the liquid lens based on the capacitance sensed by the sensor unit, and output a pulse width variable signal to the lens driver based on the calculated curvature and target curvature. Accordingly, the liquid lens can be implemented to have a desired target curvature.
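A rough sketch of this sensing-and-control loop, assuming a hypothetical linear capacitance-to-curvature model and a simple duty step; none of the constants come from the patent:

```python
def curvature_from_capacitance(cap_pf, cap_flat_pf=10.0, gain=0.05):
    # Hypothetical linear model: the sensed capacitance grows with the
    # contact area between the conductive aqueous solution and the
    # insulator on the electrode, which in turn tracks the curvature.
    return max(0.0, (cap_pf - cap_flat_pf) * gain)

def next_duty(duty, measured_curvature, target_curvature, step=0.02):
    # One step of the processor's control: raise the duty of the
    # pulse-width-variable signal while the measured curvature is
    # below target, and lower it when above.
    if measured_curvature < target_curvature:
        return min(1.0, duty + step)
    if measured_curvature > target_curvature:
        return max(0.0, duty - step)
    return duty
```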
  • the processor may control the duty of the pulse-width-variable signal to increase. Accordingly, the liquid lens can be implemented to have a desired target curvature.
  • FIG. 1A is a perspective view of a mobile terminal, which is one example of a terminal according to an embodiment of the present invention, viewed from the front.
  • FIG. 1B is a rear perspective view of the mobile terminal shown in FIG. 1A.
  • FIG. 2 is a block diagram of the mobile terminal of FIG. 1.
  • FIG. 3A is an internal cross-sectional view of the RGB camera of FIG. 1B.
  • FIG. 3B is an internal block diagram of the RGB camera of FIG. 1B.
  • FIG. 4 is an internal block diagram of the depth camera of FIG. 1B.
  • FIG. 5A is a view referred to for describing an operation of the depth camera of FIG. 1B.
  • FIG. 5B illustrates a depth image generated based on the calculated distance information.
  • FIG. 6 is an internal structure diagram of the light output unit of FIG. 4.
  • FIG. 7 is a flowchart illustrating a method of operating a depth camera according to an embodiment of the present invention.
  • FIGS. 8A to 11 are views referred to for explaining the operation method of FIG. 7.
  • FIGS. 12A and 12B are views illustrating a driving method of a liquid lens.
  • FIGS. 13A to 13C are views showing the structure of the liquid lens.
  • FIGS. 14A to 14E are views for explaining the variable lens curvature of a liquid lens.
  • FIGS. 15A and 15B are various examples of internal block diagrams of the light conversion unit.
  • the suffixes "module" and "unit" for components used in the following description are given merely for ease of writing the present specification and do not by themselves carry particularly important meanings or roles. Therefore, "module" and "unit" may be used interchangeably.
  • FIG. 1A is a perspective view of a mobile terminal, which is an example of a terminal according to an embodiment of the present invention, viewed from the front, and FIG. 1B is a rear perspective view of the mobile terminal shown in FIG. 1A.
  • the case forming the appearance of the mobile terminal 100 is formed by the front case (100-1) and the rear case (100-2).
  • Various electronic components may be embedded in the space formed by the front case 100-1 and the rear case 100-2.
  • the display 180, the first sound output module 153a, the first camera 195a, and the first to third user input units 130a, 130b, and 130c may be disposed on the front case 100-1.
  • a fourth user input unit 130d, a fifth user input unit 130e, and first to third microphones 123a, 123b, and 123c may be disposed on the side surfaces of the rear case 100-2.
  • the display 180 may overlap the touch pad in a layer structure, so that the display 180 may operate as a touch screen.
  • the first sound output module 153a may be implemented in the form of a receiver or speaker.
  • the first camera 195a may be embodied in a form suitable for taking an image or video for a user.
  • the microphone 123 may be implemented in a form suitable for receiving a user's voice, other sounds, and the like.
  • the first to fifth user input units 130a, 130b, 130c, 130d, and 130e, and the sixth and seventh user input units 130f and 130g, which will be described later, may be collectively referred to as a user input unit 130.
  • the first and second microphones 123a and 123b may be arranged on the upper side of the rear case 100-2, that is, the upper side of the mobile terminal 100, to collect audio signals, and the third microphone 123c may be arranged on the lower side of the rear case 100-2, that is, the lower side of the mobile terminal 100, to collect audio signals.
  • a second camera 195b, a third camera 195c, and a fourth microphone may be additionally mounted on the rear surface of the rear case 100-2, and the sixth and seventh user input units 130f and 130g and the interface unit 175 may be disposed on the side of the rear case 100-2.
  • the second camera 195b has a shooting direction substantially opposite to the first camera 195a, and may have a different pixel count from the first camera 195a.
  • a flash (not shown) and a mirror (not shown) may be additionally disposed adjacent to the second camera 195b.
  • the second camera 195b may be an RGB camera
  • the third camera 195c may be a depth camera for calculating a distance from the subject.
  • a second sound output module (not shown) may be additionally disposed in the rear case 100-2.
  • the second audio output module may implement a stereo function together with the first audio output module 153a, and may be used for a call in speakerphone mode.
  • a power supply unit 190 for supplying power to the mobile terminal 100 may be mounted on the rear case 100-2 side.
  • the power supply unit 190 may be detachably coupled to the rear case 100-2 for charging, for example, as a rechargeable battery.
  • the fourth microphone 123d may be disposed in front of the rear case 100-2, that is, on the back of the mobile terminal 100, for collecting audio signals.
  • FIG. 2 is a block diagram of the mobile terminal of FIG. 1.
  • the mobile terminal 100 includes a wireless communication unit 110, an audio / video (A / V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, and a memory 160, an interface unit 175, a control unit 170, and a power supply unit 190.
  • these components may be combined into one component, or one component may be subdivided into two or more components, when implemented in an actual application.
  • the wireless communication unit 110 may include a broadcast reception module 111, a mobile communication module 113, a wireless Internet module 115, a short-range communication module 117, and a GPS module 119.
  • the broadcast receiving module 111 may receive at least one of a broadcast signal and broadcast-related information from an external broadcast management server through a broadcast channel.
  • the broadcast signal and / or broadcast-related information received through the broadcast receiving module 111 may be stored in the memory 160.
  • the mobile communication module 113 can transmit and receive a radio signal with at least one of a base station, an external terminal, and a server on a mobile communication network.
  • the wireless signal may include a voice call signal, a video call signal, or various types of data according to transmission / reception of text / multimedia messages.
  • the wireless Internet module 115 refers to a module for wireless Internet access, and the wireless Internet module 115 may be built in or external to the mobile terminal 100.
  • the short-range communication module 117 refers to a module for short-range communication.
  • Bluetooth, radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB), ZigBee, or near field communication (NFC) may be used as the short-range communication technology.
  • the Global Position System (GPS) module 119 receives position information from a plurality of GPS satellites.
  • the A / V (Audio / Video) input unit 120 is for inputting an audio signal or a video signal, which may include a camera 195 and a microphone 123.
  • the camera 195 may process a video frame such as a still image or video obtained by an image sensor in a video call mode or a shooting mode. Then, the processed image frame may be displayed on the display 180.
  • the image frames processed by the camera 195 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more cameras 195 may be provided according to a configuration aspect of the terminal.
  • the microphone 123 may receive an external audio signal in a call mode, a recording mode, a voice recognition mode, or the like, and process it into electrical voice data.
  • a plurality of microphones 123 may be arranged at different positions.
  • the audio signal received from each microphone may be processed by the control unit 170 or the like.
  • the user input unit 130 generates key input data input by the user to control the operation of the terminal.
  • the user input unit 130 may be configured with a key pad, a dome switch, a touch pad (resistive / capacitive), and the like, through which the user can input commands or information by pressing or touching.
  • when the touch pad forms a mutual layer structure with the display 180, which will be described later, it may be referred to as a touch screen.
  • the sensing unit 140 detects the current state of the mobile terminal 100, such as the open / closed state of the mobile terminal 100, the location of the mobile terminal 100, or user contact, and generates sensing signals for controlling the operation of the mobile terminal 100.
  • the sensing unit 140 may include a proximity sensor 141, a pressure sensor 143, a motion sensor 145, a touch sensor 146, and the like.
  • the proximity sensor 141 may detect the presence or absence of an object approaching the mobile terminal 100 or an object present in the vicinity of the mobile terminal 100 without mechanical contact.
  • the proximity sensor 141 may detect a proximity object using a change in an alternating magnetic field, a change in a static magnetic field, or a rate of change in capacitance.
  • the pressure sensor 143 may detect whether pressure is applied to the mobile terminal 100 and the size of the pressure.
  • the motion sensor 145 may detect the position or movement of the mobile terminal 100 using an acceleration sensor, a gyro sensor, or the like.
  • the touch sensor 146 may detect a touch input by a user's finger or a touch input by a specific pen.
  • the touch screen panel may include a touch sensor 146 for detecting location information, intensity information, and the like of the touch input.
  • the sensing signal sensed by the touch sensor 146 may be transmitted to the controller 170.
  • the output unit 150 is for outputting an audio signal, a video signal, or an alarm signal.
  • the output unit 150 may include a display 180, an audio output module 153, an alarm unit 155, and a haptic module 157.
  • the display 180 displays and outputs information processed by the mobile terminal 100. For example, when the mobile terminal 100 is in a call mode, a UI (User Interface) or GUI (Graphic User Interface) related to the call is displayed. When the mobile terminal 100 is in a video call mode or a shooting mode, the captured or received images may be displayed individually or simultaneously, together with the related UI and GUI.
  • when the display 180 and the touch pad form a mutual layer structure and are configured as a touch screen, the display 180 may be used not only as an output device but also as an input device through which information can be entered by a user's touch.
  • the audio output module 153 may output audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception, call mode or recording mode, voice recognition mode, broadcast reception mode, or the like. In addition, the audio output module 153 outputs an audio signal related to a function performed in the mobile terminal 100, for example, a call signal reception sound and a message reception sound.
  • the sound output module 153 may include a speaker, a buzzer, and the like.
  • the alarm unit 155 outputs a signal for notifying the occurrence of an event in the mobile terminal 100.
  • the alarm unit 155 outputs a signal for notifying the occurrence of an event in a form other than an audio signal or a video signal.
  • a signal can be output in the form of vibration.
  • the haptic module 157 generates various tactile effects that the user can feel.
  • a typical example of the tactile effect generated by the haptic module 157 is a vibration effect.
  • the intensity and pattern of vibration generated by the haptic module 157 are convertible, and different vibrations can be synthesized and output or sequentially output.
  • the memory 160 may store programs for the processing and control of the control unit 170, and may also temporarily store input or output data (e.g., a phone book, messages, still images, and videos).
  • the interface unit 175 serves as an interface with all external devices connected to the mobile terminal 100.
  • the interface unit 175 may receive data from an external device or receive power and transmit it to each component inside the mobile terminal 100, and allow data inside the mobile terminal 100 to be transmitted to the external device.
  • the control unit 170 controls the overall operation of the mobile terminal 100 by generally controlling the operation of each part. For example, it may perform related control and processing for voice calls, data communication, video calls, and the like. Also, the control unit 170 may include a multimedia playback module 181 for multimedia playback. The multimedia playback module 181 may be configured with hardware in the control unit 170 or may be configured with software separately from the control unit 170. Meanwhile, the control unit 170 may include an application processor (not shown) for driving an application. Alternatively, the application processor (not shown) may be provided separately from the control unit 170.
  • the power supply unit 190 may receive external power or internal power under the control of the control unit 170 to supply power required for the operation of each component.
  • FIG. 3A is an internal cross-sectional view of the RGB camera of FIG. 1B.
  • FIG. 3A is an example of a cross-sectional view of the camera 195b.
  • the camera 195b may include an aperture 194, a lens device 193, and an image sensor 820.
  • the aperture 194 can open and close the light incident to the lens device 193.
  • the lens device 193 may include a plurality of lenses adjusted for variable focus.
  • the image sensor 820 may include an RGB filter 915b and a sensor array 911b that converts an optical signal into an electrical signal, to sense RGB color.
  • the image sensor 820 may sense and output an RGB image.
  • FIG. 3B is an internal block diagram of the RGB camera of FIG. 1B.
  • FIG. 3B is an example of a block diagram for the camera 195b.
  • the camera 195b may include a lens device 193, an image sensor 820, and an image processor 830.
  • the lens device 193 may receive incident light, and may include a plurality of lenses adjusted for variable focus.
  • the image processor 830 may generate an RGB image based on an electrical signal from the image sensor 820.
  • the image processor 830 may generate an image signal based on incident light that has passed through the lens device 193. Accordingly, an image signal may be generated using the image sensor 820 when photographing the rear side.
  • the exposure time of the image sensor 820 may be adjusted based on an electrical signal.
  • the RGB image from the image processor 830 may be transferred to the control unit 170 of the mobile terminal 100.
  • the control unit 170 of the mobile terminal 100 may output a control signal to the lens device 193 for the movement of the lens in the lens device 193 or the like.
  • for example, a control signal for auto focusing may be output to the lens device 193.
  • the control unit 170 of the mobile terminal 100 may output an aperture control signal for adjusting the light transmittance of the aperture 194 to the aperture 194.
  • the control unit 170 of the mobile terminal 100 may control the aperture 194 to be opened in a first period and closed in a second period.
  • FIG. 4 is an internal block diagram of the depth camera of FIG. 1B.
  • the depth camera 195c of FIG. 4 may include an optical output unit 210, an optical photodetector 280, a temperature detector 260, and a processor 270.
  • the processor 270 may output a sinusoidal drive signal Tx of a predetermined frequency to the optical output unit 210.
  • the optical output unit 210 may output the first pattern light or the second pattern light to the external subject 40 based on the sinusoidal wave driving signal, that is, the transmission signal Tx.
  • the light output unit 210 may output a first pattern light at a first time point and output a second pattern light at a second time point after the first time point.
  • the light output unit 210 may alternately output the first pattern light and the second pattern light.
  • the first pattern light or the second pattern light output to the external subject 40 is scattered or reflected by the external subject 40. Accordingly, the first reception light or the second reception light that is scattered or reflected by the external subject 40 may be received by the depth camera 195c.
  • the photodetector 280 may receive the first received light or the second received light, and convert it into a received signal that is an electrical signal.
  • the electrical signal Rx converted by the photodetector 280 may be transmitted to the processor 270.
  • the processor 270 may calculate distance information on an external subject based on the transmission signal Tx and the reception signal Rx.
  • the processor 270 may synthesize the first received light and the second received light, and calculate distance information on an external subject based on the synthesized light.
  • the processor 270 may calculate distance information on the external subject based on the difference between the first pattern light and the second pattern light and on the light in which the first and second received lights are synthesized.
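If the sinusoidal drive signal is used as in a conventional continuous-wave time-of-flight camera (an assumption for illustration; the description only states that a sinusoidal transmission signal Tx of a predetermined frequency is used), the distance follows from the phase shift between Tx and Rx:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def distance_from_phase(phase_shift_rad, mod_freq_hz):
    # Continuous-wave time-of-flight relation: the round trip delays
    # the received sinusoid by dphi = 4 * pi * f * d / c, hence
    # d = c * dphi / (4 * pi * f).
    return C * phase_shift_rad / (4 * math.pi * mod_freq_hz)
```

For example, a half-cycle phase shift at a 20 MHz modulation frequency corresponds to a distance of roughly 3.75 m.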
  • the distance information calculated by the depth camera 195c may be transmitted to the control unit 170 of the mobile terminal 100.
  • controller 170 of the mobile terminal 100 may generate a depth image based on the distance information calculated by the depth camera 195c.
  • the processor 270 in the depth camera 195c may generate a depth image based on the calculated distance information.
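For a sinusoidally modulated source like the one driven by the transmission signal Tx, distance is commonly recovered from the phase shift between Tx and the reception signal Rx via d = c·Δφ/(4πf). A minimal sketch of that common relation; the function name and the 20 MHz modulation frequency are illustrative assumptions, not values from this description:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def distance_from_phase(phase_tx: float, phase_rx: float, mod_freq_hz: float) -> float:
    """Distance from the phase shift between the sinusoidal drive signal Tx
    and the received signal Rx: d = c * delta_phi / (4 * pi * f)."""
    delta_phi = (phase_rx - phase_tx) % (2.0 * math.pi)  # wrap into [0, 2*pi)
    return C * delta_phi / (4.0 * math.pi * mod_freq_hz)

# a pi/2 phase shift at a 20 MHz modulation frequency -> roughly 1.87 m
d = distance_from_phase(0.0, math.pi / 2.0, 20e6)
```

Note that the unambiguous range of this relation is c/(2f), about 7.5 m at 20 MHz, which is why the modulation frequency is a design trade-off.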
  • the light output unit 210 outputs the first pattern light to the external subject 40 at the first time point, and outputs the second pattern light to the external subject 40 at the second time point after the first time point.
  • the light output unit 210 may alternately output the first pattern light and the second pattern light.
  • the second pattern light is a pattern light generated by replicating the first pattern light, and it is preferable that the light pattern of the second pattern light does not overlap with the light pattern of the first pattern light.
  • since the light output unit 210 uses a single light source unit, it is possible to reduce the heat and size of the depth camera 195c.
  • since the light conversion unit 320 outputs a different pattern light at each time point, the light gap in the first pattern light can be formed wide; accordingly, heat generation can be reduced, and the size of the depth camera 195c can also be configured compactly.
  • the resolution of the light pattern increases by the synthesis of the first reflected light and the second reflected light, and, finally, it is possible to calculate high-resolution depth information.
  • the temperature detection unit 260 can detect the temperature of the light output unit 210.
  • the detected temperature information T may be input to the processor 270.
  • the processor 270 may control the first pattern light and the second pattern light to be alternately output at regular intervals.
  • the processor 270 may control the output periods of the first pattern light and the second pattern light to vary according to the detected temperature. Accordingly, it is possible to reduce the heat generated by the depth camera 195c.
  • the processor 270 may control the output periods of the first pattern light and the second pattern light to increase as the detected temperature increases. Accordingly, it is possible to reduce the heat generated by the depth camera 195c.
  • the processor 270 may control the resolution of the first pattern light output from the light source unit 310 to decrease as the detected temperature increases. Accordingly, it is possible to reduce the heat generated by the depth camera 195c.
  • the processor 270 may control the plurality of light sources to vary the resolution of the first pattern light output from the light source unit 310. Accordingly, it is possible to change the resolution of the depth information while reducing the heat generated by the depth camera 195c.
  • the processor 270 may control the resolution of the first pattern light to decrease as the amount of movement of the depth camera 195c increases. Accordingly, the accuracy of depth information can be improved.
  • the processor 270 may control the output period of the first pattern light and the second pattern light to increase as the amount of movement of the depth camera 195c increases. Accordingly, the accuracy of depth information can be improved.
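The two control policies above (lengthening the alternating output period as the detected temperature or the camera movement grows) can be sketched as follows; the threshold values and the doubling factor are illustrative assumptions, not values from this description:

```python
def output_period(base_period_ms: float, temp_c: float, motion: float,
                  temp_thresh_c: float = 45.0, motion_thresh: float = 1.0) -> float:
    """Lengthen the alternating output period of the first/second pattern
    light as temperature or camera motion grows (hypothetical policy)."""
    period = base_period_ms
    if temp_c > temp_thresh_c:
        period *= 2.0  # fewer emissions per second -> less heat
    if motion > motion_thresh:
        period *= 2.0  # slower alternation while the camera moves
    return period
```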
  • FIG. 5A is a view referred to for describing an operation of the depth camera of FIG. 1B.
  • the depth camera 195c selectively outputs the first pattern light or the second pattern light to the external subject 40, and may receive the first reflected light or the second reflected light scattered or reflected by the external subject 40.
  • the distance information may be calculated using the difference between the first pattern light or the second pattern light and the first reflected light or the second reflected light.
  • FIG. 5B illustrates a depth image generated based on the calculated distance information.
  • the calculated distance information may be represented as the luminance image 65, as shown in FIG. 5B.
  • Various distance information values of the external subject can be displayed as corresponding luminance levels.
  • when the depth is close, the luminance level may be high (the brightness may be bright), and when the depth is far, the luminance level may be low (the brightness may be dark).
  • FIG. 6 is an internal structure diagram of the light output unit of FIG. 4.
  • the light output unit 210 may include a light source unit 310 that includes a plurality of light sources and outputs the first pattern light, and a light conversion unit 320 that replicates the first pattern light from the light source unit 310 to generate a second pattern light different from the first pattern light and outputs any one of the first pattern light and the second pattern light.
  • the light conversion unit 320 may output the first pattern light to the external subject 40 at the first time point, and may output the second pattern light to the external subject 40 at the second time point after the first time point.
  • the light output unit 210 may alternately output the first pattern light and the second pattern light.
  • the second pattern light is a pattern light generated by replicating the first pattern light, and it is preferable that the light pattern of the second pattern light does not overlap with the light pattern of the first pattern light.
  • the light output unit 210 uses a single light source unit, it is possible to reduce heat and size of the depth camera 195c.
  • the light output unit 210 may further include a lens 330 that projects the first pattern light or the second pattern light output from the light conversion unit 320 to the outside. Accordingly, the quality of the pattern light projected to the outside can be improved.
  • the lens 330 may change a direction in which light travels, and a condensing lens or the like may be used for this purpose.
  • the light conversion unit 320 may duplicate the first pattern light from the light source unit 310 to generate a second pattern light different from the first pattern light and output it.
  • the light conversion unit 320 may transmit and output the first pattern light at the first time point, and may output the converted second pattern light at the second time point. Accordingly, since the light gap in the first pattern light can be formed wide, heat generation can be reduced, and the size of the depth camera 195c can be configured compactly.
  • the optical patterns of the first pattern light and the second pattern light do not overlap. Accordingly, since the light gap in the first pattern light can be formed wide, heat generation can be reduced, and the size of the depth camera 195c can be configured compactly.
  • the light conversion unit 320 may include a micro lens or a liquid lens. Accordingly, it is possible to easily implement the second pattern light in which the first pattern light and the light pattern do not overlap.
  • the processor 270 may apply a first electrical signal to the liquid lens 500 at the first time point, and may apply a second electrical signal to the liquid lens 500 at the second time point. Accordingly, the liquid lens 500 can be easily driven.
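As a rough sketch of this time-point control, the processor can alternate two drive signals so that the liquid lens 500 passes the first pattern light PTa on even frames and shifts it into the second pattern light PTb on odd frames; the signal names V1/V2 and the frame-parity scheme are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class LiquidLensDriver:
    """Hypothetical sketch: at the first time point the liquid lens passes the
    first pattern light unchanged; at the second it shifts it into the second
    pattern light by applying a different electrical signal."""
    applied_signal: str = "none"

    def on_time_point(self, t_index: int) -> str:
        # even frames -> first electrical signal (no deflection, pattern PTa)
        # odd frames  -> second electrical signal (shifted pattern PTb)
        self.applied_signal = "V1" if t_index % 2 == 0 else "V2"
        return "PTa" if t_index % 2 == 0 else "PTb"
```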
  • FIG. 7 is a flowchart illustrating an operation method of a depth camera according to an embodiment of the present invention.
  • FIGS. 8A to 11 are views referred to for describing the operation method of FIG. 7.
  • the processor 270 in the depth camera 195c determines whether it is the first viewpoint (S705) and, if applicable, controls to output the first pattern light (S710).
  • the light source unit 310 in the light output unit 210 includes a plurality of light sources and can output a first pattern light based on the plurality of light sources.
  • the light source unit 310 may include a plurality of vertical cavity surface emitting laser diodes (VCSELs) and, as shown in FIG. 8A, may output the first pattern light PTa as a surface light source at the first time point T1m.
  • as shown in FIG. 8A, the light conversion unit 320 that receives the first pattern light PTa from the light source unit 310 may output the first pattern light PTa at the first time point T1m without separate light conversion.
  • the liquid lens 500 may not change the light traveling direction.
  • the lens 330 may output the first pattern light PTa to the outside at the first time point T1m, as shown in FIG. 8A.
  • in step S705, if it is not the first time point, the processor 270 determines whether it is the second time point (S707) and, if applicable, controls the second pattern light to be output (S715).
  • the light source unit 310 may output the first pattern light PTa as the surface light source at the second time point T2m.
  • as shown in FIG. 8B, the light conversion unit 320 that receives the first pattern light PTa from the light source unit 310 may perform light conversion at the second time point T2m, so that the first pattern light PTa is converted into the second pattern light PTb and output.
  • the light patterns of the first pattern light PTa and the second pattern light PTb do not overlap in order to increase the optical resolution.
  • the light conversion unit 320 may shift the first pattern light PTa in a constant direction or the like.
  • the liquid lens 500 can change the direction of light travel.
  • the lens 330 may output the second pattern light PTb to the outside at the second time point T2m, as shown in FIG. 8B (S710).
  • the processor 270 may apply a first electrical signal to the liquid lens 500 at the first time point, and may apply a second electrical signal to the liquid lens 500 at the second time point. Accordingly, the liquid lens 500 can be easily driven.
  • the photodetector 280 may receive and detect the first received light reflected from the external object in response to the first pattern light (S720).
  • the photodetector 280 may receive and detect the second received light reflected from the external object in response to the second pattern light (S725).
  • the processor 270 may receive, from the photodetector 280, an electrical signal corresponding to the first reflected light and an electrical signal corresponding to the second reflected light at the first time point and the second time point, respectively, and synthesize them (S730),
  • the processor 270 may synthesize the first reflected light and the second reflected light from the photodetector 280 after the first time point and the second time point.
  • the processor 270 may calculate depth information based on the synthesized signal or the synthesized light (S735).
  • FIG. 8C (a) illustrates the first pattern light PTa, FIG. 8C (b) illustrates the second pattern light PTb, and FIG. 8C (c) illustrates the composite light PTc in which the first pattern light PTa and the second pattern light PTb are synthesized.
  • the optical pattern resolution of the composite light PTc is approximately twice the optical pattern resolution of the first pattern light PTa or the second pattern light PTb.
  • the optical pattern resolution can be equal to or higher than the conventional method of simultaneously outputting a plurality of pattern lights.
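Treating each pattern light as a set of dot coordinates, the doubling of the optical pattern resolution follows from interleaving the replicated, shifted pattern with the original; a toy sketch (the coordinates and shift amount are illustrative):

```python
def shift_pattern(pattern, dx):
    """Replicate a dot pattern shifted by dx (the second pattern light)."""
    return {(x + dx, y) for (x, y) in pattern}

def synthesize(pta, ptb):
    """Union of the two non-overlapping patterns (composite light PTc)."""
    return pta | ptb

pta = {(0, 0), (4, 0), (0, 4), (4, 4)}  # sparse first pattern light
ptb = shift_pattern(pta, 2)             # shifted copy, no overlap with pta
ptc = synthesize(pta, ptb)              # twice as many dots as pta
```

Because the two patterns do not overlap, every emitted dot contributes a distinct sample, which is why the dots of the first pattern can be spaced widely without losing final resolution.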
  • the processor 270 may synthesize the first reflected light and the second reflected light, and calculate distance information based on the synthesized light.
  • FIG. 8D illustrates the first composite light PTam and the second composite light PTBm projected on the first surface SFa and the second surface SFb.
  • the first composite light PTam may be obtained by combining the first pattern light PTa and the second pattern light PTb of FIG. 8C.
  • the distance difference between the adjacent first light Pa and the second light Pb in the first composite light PTam may be represented by d, and the distance difference between the adjacent first light Pa' and the second light Pb' in the second composite light PTBm may be represented by d1.
  • the processor 270 may calculate distance information on the external subject according to the difference between d and d1.
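One simple way to see how the difference between d and d1 carries distance information: for a diverging projector, the spacing between adjacent projected dots grows roughly linearly with the distance to the surface. A sketch under that simplified assumption (the linear model and the reference-distance calibration are illustrative, not from this description):

```python
def relative_distance(spacing_ref: float, spacing_obs: float, dist_ref: float) -> float:
    """Under a diverging projector, adjacent projected dots spread apart
    linearly with surface distance, so an observed spacing (d1) relative to
    a calibrated reference spacing (d) at a known distance yields the
    surface distance (simplified pinhole-projection assumption)."""
    return dist_ref * spacing_obs / spacing_ref

# dots observed twice as far apart -> surface twice as far away
dist = relative_distance(1.0, 2.0, 0.5)
```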
  • the temperature detection unit 260 may reduce heat generation of the depth camera 195c based on the sensed temperature information.
  • the depth camera 195c and the mobile terminal 100 having the same further include a temperature detector 260, and the processor 270, according to the detected temperature, the first pattern light and the second The output period of the pattern light can be controlled to be variable. Accordingly, it is possible to reduce the heat generated by the depth camera 195c.
  • the processor 270 may control the output periods of the first pattern light and the second pattern light to increase as the detected temperature increases. Accordingly, it is possible to reduce the heat generated by the depth camera 195c.
  • FIG. 10A illustrates that, when the temperature is Ta, the output periods of the first pattern light PTa and the second pattern light PTb are each 2T1, and the output interval between the alternately output first pattern light PTa and second pattern light PTb is T1.
  • FIG. 10B illustrates that, when the temperature is Tb higher than Ta, the output interval between the alternately output first pattern light PTa and second pattern light PTb is T2.
  • here, T2 is preferably larger than T1. Accordingly, it is possible to reduce the heat generated by the depth camera 195c.
  • the processor 270 may control the higher the detected temperature, the lower the resolution of the first pattern light output from the light source unit 310. Accordingly, it is possible to reduce the heat generated by the depth camera 195c.
  • the processor 270 may perform control on a plurality of light sources to change the resolution of the first pattern light output from the light source unit 310. Accordingly, it is possible to change the resolution of the depth information while reducing the heat generated by the depth camera 195c.
  • the processor 270 may control the output periods of the first pattern light and the second pattern light to increase as the amount of movement of the depth camera 195c increases. Accordingly, the accuracy of depth information can be improved.
  • the processor 270 may control the resolution of the first pattern light to decrease as the movement amount of the depth camera 195c increases. Accordingly, the accuracy of depth information can be improved.
  • FIG. 11A illustrates that, when the temperature is Ta1, the output periods of the first pattern light PTam and the second pattern light PTbm are each 2T1, and the output interval between the alternately output first pattern light PTam and second pattern light PTbm is T1.
  • FIG. 11B illustrates that, when the temperature is Tb1 higher than Ta1, the output periods of the first pattern light PTa and the second pattern light PTb are each 2T1, and the output interval between the alternately output first pattern light PTa and second pattern light PTb is T1.
  • preferably, the amount of the pattern of the first pattern light PTa and the second pattern light PTb in FIG. 11B is smaller than the amount of the pattern of the first pattern light PTam and the second pattern light PTbm in FIG. 11A.
  • that is, as the detected temperature increases, the pattern light resolution of the output pattern light may be lowered. Accordingly, while reducing the heat generated by the depth camera 195c, it is still possible to calculate the depth information for the external subject.
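Lowering the pattern light resolution at high temperature can be done by driving only a subset of the plurality of light sources (e.g. VCSEL emitters); a hypothetical policy that halves the emitter count above a threshold:

```python
def active_sources(all_sources, temp_c, hot_thresh_c=45.0):
    """Drive only every other emitter above a temperature threshold,
    halving the pattern resolution and the dissipated heat (hypothetical
    policy; the threshold value is illustrative)."""
    if temp_c <= hot_thresh_c:
        return list(all_sources)
    return [s for i, s in enumerate(all_sources) if i % 2 == 0]
```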
  • FIGS. 12A and 12B are views illustrating a driving method of a liquid lens.
  • FIG. 12A (a) illustrates that the first voltage V1 is applied to the liquid lens 500, and the liquid lens operates like a concave lens.
  • FIG. 12A (b) illustrates that the liquid lens 500 is applied with a second voltage V2 that is greater than the first voltage V1, so that the liquid lens does not change the traveling direction of light.
  • FIG. 12A (c) illustrates that the liquid lens 500 is applied with a third voltage V3 greater than the second voltage V2, so that the liquid lens operates like a convex lens.
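The three drive states of FIG. 12A (first voltage V1 → concave, larger second voltage V2 → no change in light direction, still larger third voltage V3 → convex) can be summarized as a mapping from applied voltage to lens behavior; the numeric thresholds below are purely illustrative:

```python
def lens_mode(v: float, v_flat: float = 40.0, tol: float = 1.0) -> str:
    """Electrowetting sketch: below the flat-state voltage the lens is
    concave, near it the lens leaves the light direction unchanged, and
    above it the lens is convex. Threshold values are made up, not from
    the description."""
    if v < v_flat - tol:
        return "concave"
    if v > v_flat + tol:
        return "convex"
    return "flat"
```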
  • although the curvature or diopter of the liquid lens is changed according to the level of the applied voltage in these examples, the present invention is not limited thereto, and the curvature or diopter of the liquid lens may be changed according to the pulse width of the applied pulse.
  • FIG. 12B (a) illustrates that the liquid in the liquid lens 500 has a symmetrical curved surface and thus operates like a convex lens.
  • FIG. 12B (b) illustrates that as the liquid in the liquid lens 500 has an asymmetrical curved surface, the traveling direction of light is changed upward.
  • FIGS. 13A to 13C are views showing the structure of the liquid lens.
  • FIG. 13A shows a top view of the liquid lens
  • FIG. 13B shows a bottom view of the liquid lens
  • FIG. 13C shows a cross-sectional view taken along line I-I' of FIGS. 13A and 13B.
  • FIG. 13A is a view corresponding to the right side of the liquid lens 500 of FIGS. 12A to 12B
  • FIG. 13B may be a view corresponding to the left side of the liquid lens 500 of FIGS. 12A to 12B.
  • the liquid lens 500 may have a common electrode (COM) 520 disposed thereon, as illustrated in FIG. 13A.
  • the common electrode (COM) 520 may be disposed in a tube shape, and the liquid 530 may be disposed in a lower region of the common electrode (COM) 520, particularly in a region corresponding to the hollow.
  • an insulator (not shown) may be disposed between the common electrode (COM) 520 and the liquid.
  • a plurality of electrodes LA to LD 540a to 540d may be disposed under the common electrode COM 520, particularly under the liquid 530.
  • the plurality of electrodes LA to LD 540a to 540d in particular, may be disposed in a form surrounding the liquid 530.
  • a plurality of insulators 550a to 550d for insulation may be disposed between the plurality of electrodes LA to LD 540a to 540d and the liquid 530.
  • the liquid lens 500 may include a common electrode (COM) 520, a plurality of electrodes (LA to LD) (540a to 540d) spaced apart from the common electrode (COM) 520, and a liquid 530 and an electrically conductive aqueous solution (595 in FIG. 13C) disposed between the common electrode (COM) 520 and the plurality of electrodes (LA to LD) (540a to 540d).
  • the liquid lens 500 may include a plurality of electrodes (LA to LD) (540a to 540d) on the first substrate 510, a plurality of insulators insulating the plurality of electrodes (LA to LD) (540a to 540d), the electrically conductive aqueous solution 595 on the plurality of insulators, the liquid 530 on the electrically conductive aqueous solution 595, a common electrode (COM) 520 spaced apart from the liquid 530, and a second substrate 515 on the common electrode (COM) 520.
  • the common electrode 520 may have a hollow shape and be formed in a tube shape.
  • the liquid 530 and the electrically conductive aqueous solution 595 may be disposed in the hollow region.
  • the liquid 530 may be arranged in a circular shape, as shown in FIGS. 13A to 13B.
  • the liquid 530 at this time may be a non-conductive liquid such as oil.
  • the hollow region may increase in size toward the upper portion, and accordingly, the plurality of electrodes (LA to LD) (540a to 540d) may decrease in size from the lower portion to the upper portion.
  • it is illustrated that the first electrode (LA) 540a and the second electrode (LB) 540b among the plurality of electrodes (LA to LD) (540a to 540d) are formed to be inclined, so that their size becomes smaller toward the upper portion.
  • unlike the illustrated example, it is also possible that the plurality of electrodes (LA to LD) (540a to 540d) are formed at an upper portion and the common electrode 520 is formed at a lower portion.
  • FIGS. 13A to 13C illustrate four electrodes as the plurality of electrodes, but the present invention is not limited thereto, and a different number of electrodes, two or more, may be formed.
  • when a pulsed electrical signal is applied to the first electrode (LA) 540a and the second electrode (LB) 540b, a potential difference occurs between the common electrode 520 and the first electrode (LA) 540a and the second electrode (LB) 540b.
  • accordingly, the shape of the electrically conductive aqueous solution 595 changes, and in response to the shape change of the electrically conductive aqueous solution 595, the shape of the liquid 530 changes.
  • the present invention proposes a method of simply and quickly detecting the curvature formed in the liquid 530 according to the electric signal applied to each of the plurality of electrodes (LA to LD) (540a to 540d) and the common electrode 520.
  • the sensor unit 962 in the present invention detects a change in the size or area of the boundary region Ac0 between the first insulator 550a on the first electrode 540a in the liquid lens 500 and the electrically conductive aqueous solution 595.
  • AM0 is exemplified as the area of the boundary area Ac0.
  • the area of the boundary region Ac0 in contact with the electrically conductive aqueous solution 595 among the inclined portions of the first insulator 550a on the first electrode 540a is AM0.
  • at this time, the liquid 530 is neither concave nor convex, and its surface is parallel to the first substrate 510.
  • the curvature at this time can be defined as 0, for example.
  • in the boundary region Ac0 in contact with the electrically conductive aqueous solution 595 among the inclined portions of the first insulator 550a on the first electrode 540a, a capacitance C according to Equation 1 below may be formed.
  • [Equation 1] C = εA / d
  • here, ε may represent the dielectric constant of the first insulator (dielectric) 550a, A may represent the area of the boundary region Ac0, and d may represent the thickness of the first insulator 550a.
  • since ε and d are fixed values, it is the area of the boundary region Ac0 that greatly affects the capacitance C.
  • as the shape of the electrically conductive aqueous solution 595 changes, the area of the boundary region Ac0 varies.
  • in the present invention, the area of the boundary region Ac0 is sensed using the sensor unit 962, or the capacitance C formed in the boundary region Ac0 is sensed.
  • the capacitance of FIG. 13C can be defined as CAc0.
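Equation 1 is the familiar parallel-plate relation, so the sensed capacitance scales directly with the boundary area A; a quick numeric check (the relative permittivity, area, and insulator thickness are made-up example values):

```python
def boundary_capacitance(epsilon: float, area: float, thickness: float) -> float:
    """Equation 1: capacitance C = eps * A / d formed across the insulator
    between an electrode and the conductive aqueous solution."""
    return epsilon * area / thickness

# illustrative numbers: eps_r = 3 insulator, 1 mm^2 contact area, 1 um thick
EPS0 = 8.854e-12  # vacuum permittivity, F/m
c = boundary_capacitance(3 * EPS0, 1e-6, 1e-6)  # ~27 pF
```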
  • FIGS. 14A to 14E are diagrams illustrating various curvatures of the liquid lens 500.
  • FIG. 14A illustrates that the first curvature Ria is formed in the liquid 530 according to the application of electric signals to the plurality of electrodes (LA to LD) (540a to 540d) and the common electrode 520, respectively.
  • AMa (> AM0) is illustrated as the area of the boundary area Aaa.
  • the area of the boundary region Aaa in contact with the electrically conductive aqueous solution 595 among the inclined portions of the first insulator 550a on the first electrode 540a is AMa.
  • according to Equation 1, since the area of the boundary region Aaa in FIG. 14A is larger than that in FIG. 13C, the capacitance of the boundary region Aaa is larger. Meanwhile, the capacitance of FIG. 14A can be defined as CAaa, which has a larger value than the capacitance CAc0 of FIG. 13C.
  • the first curvature Ria may be defined as having a positive polarity value.
  • it may be defined that the first curvature Ria has a +2 level.
  • FIG. 14B illustrates that a second curvature Rib is formed in the liquid 530 according to the application of electric signals to the plurality of electrodes (LA to LD) (540a to 540d) and the common electrode 520, respectively.
  • AMb (> AMa) is illustrated as the area of the boundary region Aba.
  • the area of the boundary region Aba in contact with the electrically conductive aqueous solution 595 among the inclined portions of the first insulator 550a on the first electrode 540a is AMb.
  • according to Equation 1, since the area of the boundary region Aba in FIG. 14B is larger than that in FIG. 14A, the capacitance of the boundary region Aba is larger. Meanwhile, the capacitance of FIG. 14B can be defined as CAba, which has a larger value than the capacitance CAaa of FIG. 14A.
  • the second curvature Rib may be defined as having a larger positive polarity value than the first curvature Ria.
  • for example, it may be defined that the second curvature Rib has a +4 level.
  • the liquid lens 500 operates as a convex lens, and accordingly, the output light LP1a in which the incident light LP1 is concentrated is output.
  • FIG. 14C illustrates that a third curvature Ric is formed in the liquid 530 according to the application of electric signals to the plurality of electrodes (LA to LD) (540a to 540d) and the common electrode 520, respectively.
  • AMa is illustrated as the area of the left boundary region Aca, and AMb (> AMa) is illustrated as the area of the right boundary region Acb.
  • it is exemplified that the area of the boundary region Aca in contact with the electrically conductive aqueous solution 595 among the inclined portions of the first insulator 550a on the first electrode 540a is AMa, and the area of the boundary region Acb in contact with the electrically conductive aqueous solution 595 among the inclined portions of the second insulator 550b on the second electrode 540b is AMb.
  • the capacitance of the left boundary region Aca may be CAaa
  • the capacitance of the right boundary region Acb may be CAba
  • the third curvature Ric may be defined as having a positive polarity value.
  • it may be defined that the third curvature Ric has a +3 level.
  • the liquid lens 500 operates as a convex lens, and accordingly, the output light LP1b in which the incident light LP1 is more concentrated toward one side is output.
  • FIG. 14D illustrates that a fourth curvature Rid is formed in the liquid 530 according to the application of electric signals to the plurality of electrodes (LA to LD) (540a to 540d) and the common electrode 520, respectively.
  • AMd ( ⁇ AM0) is illustrated as the area of the boundary region Ada.
  • the area of the boundary region Ada in contact with the electrically conductive aqueous solution 595 among the inclined portions of the first insulator 550a on the first electrode 540a is AMd.
  • according to Equation 1, since the area of the boundary region Ada in FIG. 14D is smaller than that in FIG. 13C, the capacitance of the boundary region Ada is smaller. Meanwhile, the capacitance of FIG. 14D can be defined as CAda, which has a smaller value than the capacitance CAc0 of FIG. 13C.
  • the fourth curvature Rid may be defined as having a negative polarity value.
  • for example, it may be defined that the fourth curvature Rid has a -2 level.
  • FIG. 14E illustrates that a fifth curvature Rie is formed in the liquid 530 according to the application of electric signals to the plurality of electrodes (LA to LD) (540a to 540d) and the common electrode 520, respectively.
  • AMe ( ⁇ AMd) is illustrated as the area of the boundary area Aea.
  • the area of the boundary region (Aea) in contact with the electrically conductive aqueous solution 595 among the inclined portions of the first insulator 550a on the first electrode 540a is AMe.
  • according to Equation 1, since the area of the boundary region Aea in FIG. 14E is smaller than that in FIG. 14D, the capacitance of the boundary region Aea is smaller. Meanwhile, the capacitance of FIG. 14E can be defined as CAea, which has a smaller value than the capacitance CAda of FIG. 14D.
  • the fifth curvature Rie may be defined as having a negative polarity value.
  • for example, it may be defined that the fifth curvature Rie has a -4 level.
  • accordingly, the liquid lens 500 operates as a concave lens, and the output light LP1c in which the incident light LP1 is diverged is output.
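Since the capacitances of FIGS. 13C to 14E order monotonically with the curvature level (CAea < CAda < CAc0 < CAaa < CAba for levels -4 to +4), the processor can look up a curvature level from a sensed capacitance via a calibration table; the capacitance values below are invented for illustration:

```python
# hypothetical calibration table: sensed boundary capacitance (pF) versus
# curvature level, monotonically increasing as in FIGS. 13C to 14E
# (CAea < CAda < CAc0 < CAaa < CAba  <->  levels -4, -2, 0, +2, +4)
CAL_CAPACITANCE_PF = [10.0, 15.0, 20.0, 25.0, 30.0]
CAL_LEVEL = [-4, -2, 0, +2, +4]

def curvature_level(cap_pf: float) -> int:
    """Return the curvature level whose calibrated capacitance is nearest
    to the sensed value."""
    nearest = min(range(len(CAL_CAPACITANCE_PF)),
                  key=lambda k: abs(CAL_CAPACITANCE_PF[k] - cap_pf))
    return CAL_LEVEL[nearest]
```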
  • FIGS. 15A and 15B are various examples of internal block diagrams of the light conversion unit.
  • FIG. 15A is an example of an internal block diagram of a light conversion unit.
  • the light conversion unit 320a of FIG. 15A may include a lens driving unit 860, a pulse width variable control unit 840, a power supply unit 890, and a liquid lens 500.
  • the pulse width variable control unit 840 outputs a pulse width variable signal V in response to a target curvature, and the lens driving unit 860 may output voltages to the plurality of electrodes and the common electrode of the liquid lens 500 using the pulse width variable signal V and the voltage Vx of the power supply unit 890.
  • the light conversion unit 320a of FIG. 15A may operate as an open loop system to vary the curvature of the liquid lens.
  • 15B is another example of an internal block diagram of a light conversion unit.
  • the light conversion unit 320b may include a liquid lens 500, a lens driving unit 960 that applies an electric signal to the liquid lens 500, a sensor unit 962 that detects the curvature of the liquid lens 500 formed by the electric signal, and a processor 970 that controls the lens driving unit 960 so that the liquid lens 500 forms a target curvature based on the sensed curvature.
  • the light conversion unit 320b may not include the processor 970, and the processor 970 may be provided in the processor 270 of FIG. 4.
  • the sensor unit 962 may detect a change in size or area of the area of the boundary region Ac0 between the insulator on the electrode in the liquid lens 500 and the electrically conductive aqueous solution 595. Accordingly, it is possible to quickly and accurately sense the curvature of the lens.
  • the light conversion unit 320b according to an embodiment of the present invention may further include a power supply unit 990 for supplying power, and an AD converter 967 that converts a signal related to the capacitance detected by the sensor unit 962 into a digital signal.
  • the light conversion unit 320b may further include a plurality of conductive lines CA1 and CA2 for supplying electric signals from the lens driving unit 960 to each electrode (the common electrode and the plurality of electrodes) in the liquid lens 500, and a switching element SWL disposed between any one of the plurality of conductive lines, CA2, and the sensor unit 962.
  • the switching element SWL is disposed between the conductive line CA2 for applying an electrical signal to any one of the plurality of electrodes in the liquid lens 500 and the sensor unit 962.
  • the contact point between the conductive line CA2 and one end of the switching element SWL or the liquid lens 500 may be referred to as node A.
  • the switching element SWL may be turned on.
  • during the on period of the switching element SWL, the sensor unit 962 may detect, based on the electrical signal from the liquid lens 500, the size or area of the boundary region Ac0 between the insulator on the electrode in the liquid lens 500 and the electrically conductive aqueous solution 595, or may sense the capacitance of the boundary region Ac0.
  • the switching element SWL is turned off, and an electrical signal may be continuously applied to the electrodes in the liquid lens 500. Accordingly, a curvature may be formed in the liquid 530.
  • the switching element SWL is turned off, and an electrical signal is not applied to an electrode in the liquid lens 500 or a low-level electrical signal is applied.
  • the switching element SWL may be turned on.
  • the processor 970 may control the pulse width of the pulse width variable control signal supplied to the lens driving unit 960 to increase in order to reach the target curvature.
  • accordingly, a time difference between pulses applied to the common electrode 520 and the plurality of electrodes may be increased, and the curvature formed in the liquid 530 may be increased.
  • when the switching element SWL is turned on and connected to the sensor unit 962, if an electric signal is applied to the electrode in the liquid lens 500, a curvature is formed in the liquid lens 500, and an electrical signal corresponding to the curvature formation may be supplied to the sensor unit 962 through the switching element SWL.
  • during the on period of the switching element SWL, the sensor unit 962 may detect, based on the electrical signal from the liquid lens 500, the size or area of the boundary region Ac0 between the insulator on the electrode in the liquid lens 500 and the electrically conductive aqueous solution 595, or may sense the capacitance of the boundary region Ac0.
  • the processor 970 may calculate the curvature based on the sensed capacitance and determine whether the target curvature has been reached. When the target curvature is reached, the processor 970 may control a corresponding electrical signal to be supplied to each electrode.
  • as the curvature of the liquid 530 is formed, that curvature can be sensed immediately, so the curvature of the liquid lens 500 can be determined quickly and accurately.
  • the lens driving unit 960 and the sensor unit 962 may be formed as one module 965.
  • the processor 970 may control the level of the voltage applied to the liquid lens 500 to increase or the pulse width to increase.
  • the processor 970 may calculate a curvature of the liquid lens 500 based on the capacitance sensed by the sensor unit 962.
  • the processor 970 may determine that the curvature of the liquid lens 500 increases as the capacitance sensed by the sensor unit 962 increases.
  • the processor 970 may control the liquid lens 500 to have a target curvature.
  • the processor 970 may calculate the curvature of the liquid lens 500 based on the capacitance detected by the sensor unit 962 and, based on the calculated curvature and the target curvature, output the pulse-width-variable signal V to the lens driving unit 960.
  • the lens driving unit 960 may use the pulse-width-variable signal V and the voltages Lv1 and Lv2 from the power supply unit 990 to output corresponding electrical signals to the plurality of electrodes LA to LD 540a to 540d and to the common electrode 520.
  • an electrical signal is applied to the liquid lens 500 so that the curvature of the lens is varied; thus the curvature of the lens can be changed quickly and accurately.
  • the processor 970 may include an equalizer 972 that calculates a curvature error based on the calculated curvature and the target curvature, and a pulse-width-variable control unit 940 that outputs the pulse-width-variable signal V based on the calculated curvature error.
  • the processor 970 may control the duty of the pulse-width-variable signal V to increase based on the calculated curvature error. Accordingly, the curvature of the liquid lens 500 can be changed quickly and accurately.
  • the processor 970 may receive focus information AF from the image processing unit 930 and shake information OIS from a gyro sensor (not shown), and determine a target curvature based on the focus information AF and the shake information OIS.
  • the update period of the determined target curvature is preferably longer than the update period of the curvature calculated from the sensed capacitance of the liquid lens 500.
  • the camera 195c described with reference to FIGS. 4 to 15B may be employed in various electronic devices such as the mobile terminal 100 of FIG. 2, a vehicle, a TV, a drone, a robot, a robot cleaner, and a door.
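The feedback behavior described above (capacitance sensed while SWL is on, curvature estimated from it, pulse duty widened until the target curvature is reached) can be sketched as follows. This is a minimal illustration, not the patent's implementation: the linear capacitance-to-curvature mapping, the proportional gain, and all function names are assumptions.

```python
# Minimal sketch of the curvature feedback loop (assumed model, not the
# patent's implementation): capacitance of the boundary area Ac0 is read
# while the switching element is on, curvature is estimated from it, and
# the pulse duty is widened until the target curvature is reached.

def capacitance_to_curvature(cap_pf: float, gain: float = 0.05) -> float:
    """Assumed monotone model: larger sensed capacitance -> larger curvature."""
    return gain * cap_pf

def update_duty(duty: float, target_curvature: float, cap_pf: float,
                kp: float = 0.1, duty_max: float = 1.0) -> float:
    """One control step: increase the pulse duty while curvature is below target."""
    error = target_curvature - capacitance_to_curvature(cap_pf)
    return min(max(duty + kp * error, 0.0), duty_max)

# Curvature (1.0) below target (2.0) -> duty grows from 0.5 to 0.6.
print(update_duty(duty=0.5, target_curvature=2.0, cap_pf=20.0))  # 0.6
```

A real controller would of course run this step at the capacitance-sensing rate, which (per the description above) is faster than the target-curvature update rate.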

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Optics & Photonics (AREA)
  • Studio Devices (AREA)

Abstract

The present invention relates to a camera and a terminal comprising the same. A camera according to one embodiment of the present invention comprises: a light source unit having a plurality of light sources and outputting a first pattern light; a light conversion unit for converting the first pattern light from the light source unit to generate a second pattern light different from the first pattern light, and outputting either the first pattern light or the second pattern light; a light detection unit for detecting first reflected light received from a subject in response to the first pattern light output from the light conversion unit and second reflected light received from the subject in response to the second pattern light; and a processor for controlling the first pattern light to be output at a first time, controlling the second pattern light to be output at a second time after the first time, and generating depth information about the subject on the basis of the first reflected light and the second reflected light. Owing to these features, heat generation and size can be reduced by means of a light source unit.
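The capture sequence in the abstract (first pattern light output at a first time, second pattern light at a second time, with a reflection sensed for each) can be sketched as a minimal sequencer. All names here are hypothetical, and the depth computation itself is omitted since the abstract does not specify it.

```python
# Minimal sketch (hypothetical names) of the two-pattern, time-multiplexed
# capture described in the abstract: the first pattern light is output and
# its reflection sensed, then the second pattern light, in that order.

from dataclasses import dataclass

@dataclass
class Frame:
    pattern: str        # which pattern light was active when sensing
    reflection: list    # sensed reflected-light samples (stubbed here)

def capture_sequence(emit, sense):
    """Emit each pattern in order and pair it with the sensed reflection."""
    frames = []
    for pattern in ("first_pattern", "second_pattern"):
        emit(pattern)                           # light conversion unit outputs the pattern
        frames.append(Frame(pattern, sense()))  # light detection unit reads the reflection
    return frames

# Stub emitter/sensor just to show the ordering; depth computation is omitted.
emitted = []
frames = capture_sequence(emit=emitted.append, sense=lambda: [0.0])
print([f.pattern for f in frames])  # ['first_pattern', 'second_pattern']
```

The single light source unit serves both patterns, which is what allows the reduction in heat generation and size claimed in the abstract.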
PCT/KR2019/014212 2018-10-26 2019-10-25 Camera and terminal comprising same WO2020085873A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020180128608A KR102553487B1 (ko) 2018-10-26 2018-10-26 Camera and terminal comprising same
KR10-2018-0128608 2018-10-26

Publications (1)

Publication Number Publication Date
WO2020085873A1 true WO2020085873A1 (fr) 2020-04-30

Family

ID=70331747

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/014212 WO2020085873A1 (fr) Camera and terminal comprising same

Country Status (2)

Country Link
KR (1) KR102553487B1 (fr)
WO (1) WO2020085873A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114945091A (zh) * 2022-07-19 2022-08-26 星猿哲科技(深圳)有限公司 深度相机的温度补偿方法、装置、设备及存储介质
EP4164219A4 (fr) * 2020-06-04 2024-01-17 LG Electronics, Inc. Dispositif de caméra et dispositif électronique le comprenant

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021256576A1 (fr) * 2020-06-16 2021-12-23 엘지전자 주식회사 Dispositif de génération d'image de profondeur et son procédé de fonctionnement
WO2024035157A1 (fr) * 2022-08-11 2024-02-15 엘지이노텍 주식회사 Dispositif de caméra

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130141635A1 (en) * 1998-03-19 2013-06-06 LED Tech Development LLC Apparatus and method for l.e.d. illumination
KR101530930B1 (ko) * 2008-08-19 2015-06-24 Samsung Electronics Co., Ltd. Pattern projection apparatus, three-dimensional image forming apparatus having the same, and variable-focus liquid lens used therein
KR20170049274A (ko) * 2015-10-28 2017-05-10 Samsung Electronics Co., Ltd. Depth image photographing apparatus and method
US20180234673A1 (en) * 2015-09-25 2018-08-16 Intel Corporation Online compensation of thermal distortions in a stereo depth camera
US10089738B2 (en) * 2016-08-30 2018-10-02 Microsoft Technology Licensing, Llc Temperature compensation for structured light depth imaging system


Also Published As

Publication number Publication date
KR102553487B1 (ko) 2023-07-07
KR20200046861A (ko) 2020-05-07

Similar Documents

Publication Publication Date Title
WO2020085873A1 (fr) Camera and terminal comprising same
WO2016182132A1 (fr) Mobile terminal and control method therefor
WO2020096423A1 (fr) Prism apparatus and camera comprising same
WO2017078328A1 (fr) Lens driving device, and camera module and optical instrument comprising same
WO2018052228A1 (fr) Dual camera module, optical device, camera module, and camera module operating method
WO2020262876A1 (fr) Camera module and optical device comprising same
WO2014148698A1 (fr) Display device and control method therefor
WO2019225978A1 (fr) Camera and terminal comprising same
WO2019117652A1 (fr) Prism apparatus, and photographing apparatus comprising same
WO2020213862A1 (fr) Camera module and optical device
WO2016129778A1 (fr) Mobile terminal and control method therefor
WO2020060235A1 (fr) Camera device
WO2015194773A1 (fr) Display device and control method therefor
WO2017086538A1 (fr) Mobile terminal and control method therefor
WO2022097981A1 (fr) Electronic device having camera module and method for operating same
WO2021080061A1 (fr) Electronic pen sensing device and electronic apparatus comprising same
WO2021215752A1 (fr) Optical device, and camera device and electronic device comprising same
WO2020045867A1 (fr) Prism module, camera comprising same, and image display device
WO2021141453A1 (fr) Camera module and electronic device comprising same
WO2017196023A1 (fr) Camera module and auto-focusing method therefor
WO2017082472A1 (fr) Mobile terminal and control method therefor
WO2019225979A1 (fr) Camera and terminal comprising same
WO2019225928A1 (fr) Camera and terminal comprising same
WO2019190159A1 (fr) Prism apparatus, and camera comprising same
WO2021006679A1 (fr) Lens curvature variation apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19875834

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19875834

Country of ref document: EP

Kind code of ref document: A1