WO2020189923A1 - Optical device, and camera device and electronic device comprising same - Google Patents


Info

Publication number
WO2020189923A1
WO2020189923A1 PCT/KR2020/002939
Authority
WO
WIPO (PCT)
Prior art keywords
light
pattern
sensor
control unit
light emitting
Prior art date
Application number
PCT/KR2020/002939
Other languages
English (en)
Korean (ko)
Inventor
조아영
이창환
신윤섭
지석만
정용우
김항태
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Publication of WO2020189923A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M 1/72454 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to context-related or environment-related conditions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/02 Constructional features of telephone sets
    • H04M 1/22 Illumination; Arrangements for improving the visibility of characters on dials
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/725 Cordless telephones
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/204 Image signal generators using stereoscopic image cameras
    • H04N 13/207 Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • H04N 13/236 Image signal generators using stereoscopic image cameras using a single 2D image sensor using varifocal lenses or mirrors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/204 Image signal generators using stereoscopic image cameras
    • H04N 13/254 Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/271 Image signal generators wherein the generated image signals comprise depth maps or disparity maps
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/296 Synchronisation thereof; Control thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 2201/00 Electronic components, circuits, software, systems or apparatus used in telephone systems
    • H04M 2201/34 Microprocessors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 2250/00 Details of telephonic subscriber devices
    • H04M 2250/12 Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 2250/00 Details of telephonic subscriber devices
    • H04M 2250/52 Details of telephonic subscriber devices including functional features of a camera
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 30/00 Reducing energy consumption in communication networks
    • Y02D 30/70 Reducing energy consumption in communication networks in wireless communication networks

Definitions

  • the present invention relates to an optical device, and to a camera device and an electronic device having the same, and more particularly to an optical device, camera device, and electronic device that acquire depth information through a patterned light emitting device and a sensor device operating in synchronization, thereby reducing the amount of computation and increasing the accuracy of the depth information.
  • the three-dimensional image includes shape information and color information.
  • the shape information can be obtained using a depth image.
  • the depth information refers to distance information between an optical device that photographs an image and a target area, or between the optical device and each object existing in the target area.
  • to obtain depth information, a technology using the time of flight (ToF) of a light source, a structured light camera, or an active stereo camera is used.
  • the structured light camera method has limitations in miniaturization due to the physical size of the light receiving unit, and thus it is difficult to apply to mobile devices.
  • the active stereo camera method has difficulty employing a high-resolution sensor because the amount of computation required to obtain depth information is large.
  • an object of the present invention is to obtain depth information through a patterned light emitting device and a sensor device operating in synchronization, thereby reducing an amount of computation and increasing the accuracy of depth information.
  • another object of the present invention is to minimize the size of the light emitting device, reduce power consumption, and increase the light recognition rate of the sensor device by including a vertical-cavity surface-emitting laser and a light conversion optical member.
  • the controller may control the sensor device so that the sensor device operates in synchronization with a point in time when the light emitting element emits light.
  • the sensor device includes an asynchronous type sensor that senses pixel-based data; the asynchronous type sensor outputs a signal whenever the amount of change in received light is equal to or greater than a set value, and the controller may control the light emitting device so that the intensity of the pattern light output from the light emitting element is equal to or greater than a set value.
  • the sensor device includes a frame type sensor, and the control unit may generate a pattern of received light by deriving a difference value for each pixel between a plurality of frames sensed by the frame type sensor.
  • the sensor device includes a first sensor device and a second sensor device, and the control unit may calculate the parallax distance by comparing a first pattern detected by the first sensor device with a second pattern detected by the second sensor device, and may obtain depth information of the target area using the calculated parallax distance.
  • the control unit may compare each pixel of the reference pattern with each pixel of the detected pattern of received light, and calculate the parallax distance as the distance between corresponding pixels along a first direction parallel to the base line connecting the light emitting device and the sensor device.
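The parallax-to-depth relation implied above can be sketched with standard triangulation. The following is a minimal illustration, assuming a pinhole model in which `focal_length_px` is the focal length in pixels and `baseline_m` is the length of the base line connecting the light emitting device and the sensor device; the function name and the numeric values are illustrative assumptions, not taken from the patent:

```python
def depth_from_parallax(parallax_px: float,
                        focal_length_px: float,
                        baseline_m: float) -> float:
    """Triangulated depth Z = f * b / d for a pixel parallax d along the base line."""
    if parallax_px <= 0:
        raise ValueError("parallax must be positive for a finite depth")
    return focal_length_px * baseline_m / parallax_px

# Example: an 800 px focal length and a 5 cm base line with a 20 px parallax
# give a depth of 2.0 m.
print(depth_from_parallax(20.0, 800.0, 0.05))
```

A larger parallax therefore corresponds to a closer object, which is why comparing corresponding pixels along the base line direction is sufficient to recover depth.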
  • the pattern light may include at least one line pattern light forming a straight line, or at least one line dot pattern light in which a plurality of dots form a straight line.
  • the pattern light may be formed perpendicular to the base line connecting the light emitting device and the sensor device.
  • a distance between lines of the at least one line pattern light or the at least one line dot pattern light may be greater than or equal to a maximum pixel parallax distance.
  • the target region is composed of at least one divided region, the pattern light is sequentially emitted to each divided region, and the sensor device sequentially detects the received light in which the emitted pattern light is reflected from each divided region; the control unit acquires depth information of each divided region using the detected pattern of each received light, and obtains depth information of the target area using the acquired depth information of each divided region.
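The sequential per-region scan described above can be outlined as follows. This is a sketch under assumed callback interfaces: `emit`, `detect`, and `depth_of` are hypothetical stand-ins for the light emitting device, the sensor device, and the control unit's depth computation, none of which are specified at this level of detail in the source:

```python
def scan_target_area(regions, emit, detect, depth_of):
    """Emit pattern light to each divided region in turn, detect the reflected
    pattern, and merge the per-region depth results for the whole target area.

    regions  -- iterable of region identifiers
    emit     -- callback: direct the pattern light at one region
    detect   -- callback: return the detected received-light pattern of a region
    depth_of -- callback: turn a detected pattern into depth information
    """
    depth_map = {}
    for region in regions:            # sequential emission, one region at a time
        emit(region)                  # light emitting device outputs the pattern
        pattern = detect(region)      # sensor device detects the reflection
        depth_map[region] = depth_of(pattern)
    return depth_map
```

Scanning region by region keeps the number of simultaneously projected lines small, which is one way the matching ambiguity and the per-step computation can be reduced.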
  • the light emitting device may further comprise at least one first optical member that generates duplicate light by replicating the light emitted from the light emitting element into a plurality of lights.
  • the first optical member may be a diffractive optical element.
  • the light emitting device further comprises a second optical member that converts the duplicate light or the pattern light; the light emitting device may be arranged so that light passes in the order of the light emitting element, the first optical member, and the second optical member, or in the order of the light emitting element, the second optical member, and the first optical member.
  • the second optical member may include at least one lens or at least one diffractive optical element.
  • the light emitting element may be a vertical-cavity surface-emitting laser (VCSEL).
  • the light emitting device may include a plurality of light sources arranged in a line in a first direction parallel to the base line connecting the light emitting device and the sensor device.
  • the light emitting device may include a plurality of light sources arranged in a matrix form along a first direction parallel to the base line connecting the light emitting device and the sensor device and a second direction perpendicular to the first direction.
  • the controller may control the light source so that the plurality of light sources sequentially emit light individually.
  • the control unit may control the light sources so that light sources belonging to the same row among the plurality of light sources emit light simultaneously, and the rows emit light sequentially.
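The two driving schemes for a matrix of light sources, individual sequential emission and simultaneous per-row emission, can be illustrated with two hypothetical helper functions that list which sources fire at each step (the `rows x cols` layout and function names are illustrative assumptions):

```python
def individual_sequence(rows, cols):
    """Each light source fires alone, in row-major order: one (row, col) per step."""
    return [[(r, c)] for r in range(rows) for c in range(cols)]

def row_sequence(rows, cols):
    """All light sources of one row fire together; the rows fire one after another."""
    return [[(r, c) for c in range(cols)] for r in range(rows)]
```

For a `rows x cols` matrix, the first scheme takes `rows * cols` emission steps while the second takes only `rows`, at the cost of projecting a whole line of pattern light at once.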
  • an optical device according to an embodiment of the present invention, and a camera device and an electronic device having the same, obtain depth information through a patterned light emitting device and a sensor device that operate in synchronization, and thus have the effect of reducing the amount of computation and increasing the accuracy of depth information.
  • the optical device according to an embodiment of the present invention, and the camera device and electronic device having the same, include a light emitting device comprising a vertical-cavity surface-emitting laser and a light conversion optical member, and thus have the effect of minimizing the size of the light emitting device, reducing power consumption, and increasing the light recognition rate of the sensor device.
  • FIG. 1 is a view showing the appearance of a mobile terminal as an example of an electronic device according to an embodiment of the present invention.
  • FIG. 2 is a block diagram of the mobile terminal of FIG. 1.
  • FIG. 3A is a view showing an optical device according to an embodiment of the present invention.
  • FIG. 3B is a view showing an optical device according to another embodiment of the present invention.
  • FIGS. 4A and 4B are diagrams illustrating calculation of a parallax distance in each optical device of FIGS. 3A and 3B.
  • FIGS. 5A to 5C are views showing a light emitting device according to an embodiment of the present invention.
  • FIG. 6 is a diagram illustrating an example of the second optical member of FIG. 5.
  • FIG. 7 is a view showing a light emitting element of a light emitting device according to an embodiment of the present invention.
  • FIG. 8 is a diagram illustrating that an optical device emits light to a target region and receives a pattern of received light according to an exemplary embodiment of the present invention.
  • FIGS. 9 to 11 are views referenced for describing FIG. 8.
  • FIG. 12 is a diagram illustrating that an optical device emits light to a target area and receives a pattern of received light according to another exemplary embodiment of the present invention.
  • FIGS. 13A to 13F are views referenced for describing FIG. 12.
  • FIG. 14 is a diagram illustrating that an optical device emits light to a target area and receives a pattern of received light according to another exemplary embodiment of the present invention.
  • FIGS. 15A to 15C are views referenced for describing FIG. 14.
  • FIG. 16 is a diagram illustrating that an optical device emits light to a target area and receives a pattern of received light according to another exemplary embodiment of the present invention.
  • FIGS. 17A and 17B are views referenced for describing FIG. 16.
  • FIG. 18 is a flowchart of obtaining depth information of an optical device according to an embodiment of the present invention.
  • the suffixes “module” and “unit” for components used in the following description are given or used interchangeably in consideration only of ease of drafting the specification, and do not by themselves have distinct meanings or roles; therefore, “module” and “unit” may be used interchangeably.
  • the electronic device described in the present specification is mainly described with a mobile terminal as an example for convenience, but is not limited thereto.
  • the electronic device may include a vehicle video device such as a black box, a CCTV (closed-circuit TV) camera, a mobile terminal, and the like.
  • the mobile terminal may include a mobile phone, a smart phone, a notebook computer, a digital broadcasting terminal, a personal digital assistant (PDA), a navigation system, a tablet PC, a wearable device, and the like.
  • FIG. 1 is a view showing the appearance of a mobile terminal as an example of an electronic device according to an embodiment of the present invention.
  • (a) is a front view
  • (b) is a side view
  • (c) is a rear view
  • (d) is a bottom view.
  • a case forming the exterior of the mobile terminal 100 is formed by a front case 100-1 and a rear case 100-2.
  • Various electronic components may be embedded in the space formed by the front case 100-1 and the rear case 100-2.
  • a display 180, a first camera device 195a, a first sound output module 153a, and the like may be disposed on the front case 100-1.
  • first to second user input units 130a and 130b may be disposed on the side of the rear case 100-2.
  • when the display 180 and a touch pad overlap in a layered structure, the display 180 can operate as a touch screen.
  • the first sound output module 153a may be implemented in the form of a receiver or speaker.
  • the first camera device 195a may be implemented in a form suitable for capturing an image or video of a user or the like.
  • the microphone 123 may be implemented in a form suitable for receiving a user's voice or other sound.
  • the first to second user input units 130a and 130b and a third user input unit 130c to be described later may be collectively referred to as a user input unit 130.
  • a first microphone (not shown) for collecting audio signals may be disposed on the upper side of the rear case 100-2, that is, on the upper side of the mobile terminal 100, and a second microphone 123 for collecting audio signals may be disposed on the lower side of the rear case 100-2, that is, on the lower side of the mobile terminal 100.
  • a second camera device 195b, a third camera device 195c, a flash 196, and a third user input unit 130c may be disposed on the rear surface of the rear case 100-2.
  • the second and third camera devices 195b and 195c may have a photographing direction substantially opposite to that of the first camera device 195a, and may have a different number of pixels from the first camera device 195a.
  • the second camera device 195b and the third camera device 195c may have different angles of view to expand the shooting range.
  • a mirror (not shown) may be additionally disposed adjacent to the third camera device 195c.
  • another camera device may be further installed adjacent to the third camera device 195c and used for capturing a 3D stereoscopic image, or may be used for capturing an additional angle of view.
  • the flash 196 may be disposed adjacent to the third camera device 195c, and shines light toward the subject when a subject is photographed with the third camera device 195c.
  • a second sound output module 153b may be additionally disposed on the rear case 100-2.
  • the second sound output module 153b may implement a stereo function together with the first sound output module 153a, and may be used for calls in a speakerphone mode.
  • a power supply unit (not shown) for supplying power to the mobile terminal 100 may be mounted on the rear case 100-2 side.
  • the power supply unit 190 is, for example, a rechargeable battery, and may be configured integrally with the rear case 100-2, or may be detachably coupled to the rear case 100-2 for charging or the like.
  • FIG. 2 is a block diagram of the mobile terminal of FIG. 1.
  • the mobile terminal 100 includes a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, and a memory. (160), an interface unit 175, a terminal control unit 170, and may include a power supply unit 190.
  • the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 113, a wireless Internet module 115, a short range communication module 117, and a GPS module 119.
  • the broadcast receiving module 111 may receive at least one of a broadcast signal and broadcast related information from an external broadcast management server through a broadcast channel.
  • the broadcast signal and/or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.
  • the mobile communication module 113 may transmit and receive a wireless signal with at least one of a base station, an external terminal, and a server on a mobile communication network.
  • the wireless signal may include a voice call signal, a video call signal, or various types of data according to transmission/reception of text/multimedia messages.
  • the wireless Internet module 115 refers to a module for wireless Internet access, and the wireless Internet module 115 may be built-in or external to the mobile terminal 100.
  • the short-range communication module 117 refers to a module for short-range communication.
  • Bluetooth, Radio Frequency Identification (RFID), infrared data association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), and the like may be used for short-range communication.
  • the GPS (Global Position System) module 119 receives location information from a plurality of GPS satellites.
  • the A/V (Audio/Video) input unit 120 is for inputting an audio signal or a video signal, and may include a camera device 195 and a microphone 123.
  • the camera device 195 may process an image frame such as a still image or a video obtained by an image sensor in a video call mode or a photographing mode.
  • the processed image frame may be displayed on the display 180.
  • the image frame processed by the camera device 195 may be stored in the memory 160 or transmitted to the outside through the wireless communication unit 110. Two or more camera devices 195 may be provided depending on the configuration aspect of the electronic device.
  • the microphone 123 may receive an external audio signal in a call mode, a recording mode, a voice recognition mode, or the like, and process it into electrical voice data.
  • a plurality of microphones 123 may be disposed at different positions.
  • the audio signal received from each microphone may be processed by the terminal controller 170 or the like.
  • the user input unit 130 generates key input data input by a user to control the operation of an electronic device.
  • the user input unit 130 may include a key pad, a dome switch, a touch pad (resistive/capacitive), and the like, through which a command or information can be input by a user's pressing or touching operation.
  • when the touch pad forms a mutual layer structure with the display 180 to be described later, this may be referred to as a touch screen.
  • the sensing unit 140 detects the current state of the mobile terminal 100, such as the open/closed state of the mobile terminal 100, the location of the mobile terminal 100, and whether a user is in contact with it, and can generate a sensing signal for controlling the operation of the mobile terminal 100.
  • the sensing unit 140 may include a proximity sensor 141, a pressure sensor 143, a motion sensor 145, a touch sensor 146, and the like.
  • the proximity sensor 141 may detect the presence or absence of an object approaching the mobile terminal 100 or an object existing in the vicinity of the mobile terminal 100 without mechanical contact.
  • the proximity sensor 141 may detect a proximity object by using a change in an AC magnetic field or a change in a static magnetic field, or by using a rate of change in capacitance.
  • the pressure sensor 143 may detect whether pressure is applied to the mobile terminal 100 and the magnitude of the pressure.
  • the motion sensor 145 may detect the position or movement of the mobile terminal 100 using an acceleration sensor or a gyro sensor.
  • the touch sensor 146 may detect a touch input by a user's finger or a touch input by a specific pen.
  • the touch screen panel may include a touch sensor 146 for sensing location information and intensity information of a touch input.
  • the sensing signal sensed by the touch sensor 146 may be transmitted to the terminal controller 170.
  • the output unit 150 is for outputting an audio signal, a video signal, or an alarm signal.
  • the output unit 150 may include a display 180, an audio output module 153, an alarm unit 155, and a haptic module 157.
  • the display 180 displays and outputs information processed by the mobile terminal 100. For example, when the mobile terminal 100 is in a call mode, a user interface (UI) or a graphical user interface (GUI) related to a call is displayed. In addition, when the mobile terminal 100 is in a video call mode or a photographing mode, the photographed or received images can be displayed individually or simultaneously, and a UI and a GUI are displayed.
  • when the display 180 and the touch pad form a mutual layer structure to constitute a touch screen, the display 180 may be used not only as an output device but also as an input device through which a user can input information by touch.
  • the sound output module 153 may output audio data received from the wireless communication unit 110 or stored in the memory 160 during call signal reception, in a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. In addition, the sound output module 153 outputs audio signals related to functions performed in the mobile terminal 100, for example, a call signal reception sound and a message reception sound.
  • the sound output module 153 may include a speaker, a buzzer, and the like.
  • the alarm unit 155 outputs a signal for notifying the occurrence of an event in the mobile terminal 100.
  • the alarm unit 155 may output a signal notifying the occurrence of an event in a form other than an audio signal or a video signal, for example, in the form of vibration.
  • the haptic module 157 generates various tactile effects that a user can feel.
  • a typical example of the tactile effect generated by the haptic module 157 is a vibration effect.
  • when the haptic module 157 generates vibration as a tactile effect, the intensity and pattern of the generated vibration can be varied, and different vibrations may be synthesized and output, or output sequentially.
  • the memory 160 may store a program for processing and control by the terminal controller 170, and may also perform a function of temporarily storing input or output data (e.g., phonebook, messages, still images, videos, etc.).
  • the interface unit 175 serves as an interface with all external devices connected to the mobile terminal 100.
  • the interface unit 175 may receive data from an external device or receive power and transmit the data to each component inside the mobile terminal 100, and transmit data inside the mobile terminal 100 to an external device.
  • the mobile terminal 100 may be provided with a fingerprint recognition sensor for recognizing a user's fingerprint, and the terminal controller 170 may use fingerprint information sensed through the fingerprint recognition sensor as an authentication means.
  • the fingerprint recognition sensor may be embedded in the display 180 or the user input unit 130.
  • the terminal control unit 170 typically controls the operation of each unit to control the overall operation of the mobile terminal 100. For example, it is possible to perform related control and processing for voice calls, data communication, and video calls.
  • the terminal control unit 170 may include a multimedia playback module 181 for multimedia playback.
  • the multimedia playback module 181 may be configured as hardware in the terminal control unit 170 or may be configured as software separately from the terminal control unit 170.
  • the terminal control unit 170 may include an application processor (not shown) for driving an application.
  • the application processor (not shown) may be provided separately from the terminal control unit 170.
  • the power supply unit 190 may receive external power and internal power under the control of the terminal control unit 170 to supply power necessary for operation of each component.
  • the power supply unit 190 may have a connection port, and an external charger supplying power for charging the battery may be electrically connected to the connection port. Meanwhile, the power supply unit 190 may be configured to charge the battery in a wireless manner without using the connection port.
  • 3A is a view showing an optical device according to an embodiment of the present invention.
  • the optical device 10 may include a controller 500, a light emitting device 200, and a sensor device 300.
  • the light emitting device 200 may include at least one light emitting element 220 and a light emission control unit 210.
  • the light emitting element 220 may be an element that emits output light of a specific wavelength.
  • the light emitting element 220 may emit infrared light. Specifically, it may emit infrared light having a wavelength of 940 nm or 850 nm.
  • the light emitting device 220 may be a laser diode that converts an electrical signal into an optical signal.
  • the light-emitting element 220 may output light in the form of a pulse under the control of the light-emitting control unit 210.
  • the light emission control unit 210 may provide information related to the light emission operation of the light emitting element 220 to the control unit 500. Specifically, the light emission control unit 210 may provide the control unit 500 with first timestamp information including the time point at which the light emitting element 220 is turned on to emit light.
  • the sensor device 300 may detect received light in which the pattern light of a specific wavelength emitted from the light emitting device 200 is reflected from the target area.
  • the sensor device 300 may be disposed to face the same direction as the light emitting direction of the light emitting device 200.
  • the sensor device 300 may include a sensor control unit 310, a sensor 320, a filter 330, a lens 340, and an aperture 350.
  • the aperture 350 opens and closes the path of light incident on the lens 340 and adjusts the amount of incident light.
  • the lens 340 condenses the light incident through the aperture 350 onto the sensor 320.
  • the filter 330 may pass only light of a specific wavelength among the received light to the sensor 320.
  • Light of a specific wavelength may be infrared. Therefore, the filter 330 may be a band pass filter that transmits infrared rays.
  • the filter 330 may be a filter capable of passing infrared light having a wavelength of 940 nm or 850 nm.
  • the sensor 320 may detect received light having a specific wavelength that has passed through the filter 330 among received light corresponding to the output light.
  • the sensor 320 may be an asynchronous sensor that senses pixel-based data.
  • the sensor 320 outputs an event signal whenever the amount of change in the received light incident on a specific pixel becomes greater than or equal to a set value. Therefore, as the amount of change in the received light increases, the number of event signals output from the sensor 320 may increase.
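A minimal model of this per-pixel event behavior might look like the following; the intensity trace and threshold are invented for illustration, and the reset of the reference level on each event is an assumption about typical event-based sensors rather than a detail stated here:

```python
def events_for_pixel(intensities, threshold):
    """Return the sample indices at which a pixel would emit an event signal,
    i.e. whenever the received-light change since the last event reaches the
    set value `threshold`."""
    events = []
    last = intensities[0]               # reference level of the pixel
    for i, value in enumerate(intensities[1:], start=1):
        if abs(value - last) >= threshold:
            events.append(i)            # change reached the set value: event
            last = value                # reference level resets on each event
    return events

# A fast-changing signal produces events; a slowly drifting one may produce none.
print(events_for_pixel([0, 1, 5, 5, 0], 4))
```

This is why the controller is described as keeping the pattern light intensity at or above a set value: the reflected light must change the pixel intensity strongly enough to cross the event threshold.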
  • the sensor control unit 310 may transmit event signal information output from the sensor 320 to the control unit 500, and the control unit 500 may transmit the received event signal information to the light emission control unit 210.
  • the sensor control unit 310 may receive output light emission period information of the light emitting element 220 from the light emission control unit 210 through the control unit 500, and may control the operation of the sensor 320 so that the sensor 320 is activated in synchronization with the ON section of the light emitting element 220.
  • the sensor controller 310 may control the sensor 320 so that the operation period of the sensor 320 includes an ON period of the light emitting element 220.
  • the length of the operation section of the sensor 320 may be determined by the performance of the photodiode formed in the sensor 320.
  • the operation or sensing timing of the sensor 320 may be synchronized with the operation or light emission timing of the light emitting element 220.
  • the sensor 320 may be a general frame-type sensor.
  • in this case, the sensor 320 may output only magnitude information of the received light incident on a specific pixel.
  • the sensor control unit 310 may receive, from the frame-type sensor 320, the output value of each pixel sensed over a plurality of frames, and derive the difference in the output value of each pixel between consecutive frames.
  • the sensor control unit 310 may determine that an event signal has occurred whenever the magnitude of the derived difference becomes greater than or equal to the set value.
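The frame-difference approach above can be sketched as follows. This is an illustrative sketch, not code from the patent: the array shapes and threshold value are assumptions, and the "event" is simply a per-pixel difference between consecutive frames that meets or exceeds a set value.

```python
# Sketch: deriving event signals from a frame-type sensor by thresholding
# per-pixel differences between consecutive frames, as the sensor control
# unit 310 is described doing. Threshold and shapes are illustrative.
import numpy as np

def derive_events(prev_frame: np.ndarray, next_frame: np.ndarray,
                  threshold: int) -> np.ndarray:
    """Return a boolean mask: True where an 'event' occurred, i.e. the
    per-pixel difference between the two frames meets or exceeds the
    set threshold value."""
    # Convert to a signed type so the subtraction cannot wrap around.
    diff = np.abs(next_frame.astype(np.int32) - prev_frame.astype(np.int32))
    return diff >= threshold

# Example: a single pixel brightens sharply between frames.
prev = np.zeros((4, 4), dtype=np.uint8)
nxt = prev.copy()
nxt[1, 2] = 200          # received-light intensity jumps at pixel (1, 2)
events = derive_events(prev, nxt, threshold=50)
```

Only the pixel whose change exceeded the threshold is flagged, which mirrors how the sensor control unit emulates an asynchronous event sensor on frame data.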
  • the controller 500 may control the operation of the light emitting device 200 and the sensor device 300 through the light emission control unit 210 and the sensor control unit 310. Meanwhile, the control unit 500 may itself include both the light emission control unit 210 and the sensor control unit 310, in which case the operation of the light emitting device 200 and the sensor device 300 is controlled directly by the control unit 500.
  • the control unit 500 may be included in the terminal control unit 170, and the terminal control unit 170 can control all components of the optical device 10.
  • the controller 500 may calculate a parallax distance by comparing the detected pattern of received light with a reference pattern, and obtain depth information of the target area by using the calculated parallax distance. A configuration related to obtaining depth information will be described in detail later with reference to FIGS. 4A and 4B.
  • the controller 500 may synchronize the operation of the sensor device 300 and the light emitting device 200. Specifically, the control unit 500 may receive, from the light emission control unit 210, first time stamp information containing the time point at which the light emitting element 220 is turned on to emit light.
  • the controller 500 may transmit the received first time stamp information to the sensor controller 310.
  • the sensor controller 310 may control the sensor 320 to operate in synchronization with the ON period of the light emitting element 220 based on the first time stamp information received from the controller 500.
  • the controller 500 may receive, from the sensor controller 310, the pattern data of the received light sensed by the sensor 320 together with second time stamp information containing the time point at which the corresponding data was sensed.
  • the control unit 500 may match and store the pattern data of the received light and second time stamp information in a storage unit (not shown), and may derive depth information using the stored information.
  • the controller 500 may control the intensity of the output light output from the light emitting element 220.
  • the controller 500 may control the light emitting device 200 so that the intensity of the output light output from the light emitting element 220 is equal to or greater than a set value. This is because when the intensity of the output light is less than or equal to the set value, the sensor 320 cannot output an event signal according to the received light corresponding to the pattern light.
  • the controller 500 receives the received-light data detected by the sensor device 300, and may control the light emitting device 200 to change at least one of the emission period or the emission intensity of the output light of the light emitting element 220 according to the amount of change in the received light.
  • here, the amount of change means the amount of change in luminance or the amount of change in light intensity.
  • for example, when the control unit 500 controls the output light to be output with a first emission intensity and the change amount of the received light increases from a first change amount to a greater second change amount, the output light may be controlled to be output with a second emission intensity smaller than the first emission intensity. Accordingly, the control unit 500 can adaptively adjust the emission intensity of the output light of the optical device 10 according to the presence or absence of various light sources in the surrounding environment, and increase the accuracy of depth information.
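A minimal sketch of such an adaptive control law follows. The patent does not give a specific mapping, so the linear interpolation, the working range of change amounts, and all numeric values here are assumptions; the only constraints taken from the text are that a larger change amount yields a smaller emission intensity, bounded below by the set value the sensor needs to emit event signals.

```python
# Sketch (illustrative control law, not the patent's): map the measured
# change amount of the received light to an emission intensity.
MIN_INTENSITY = 0.2   # assumed set value below which events cannot be sensed
MAX_INTENSITY = 1.0

def adapt_intensity(change_amount: float,
                    change_lo: float = 10.0,
                    change_hi: float = 100.0) -> float:
    """Smaller change amount -> larger (first) emission intensity;
    greater change amount -> smaller (second) emission intensity."""
    # Normalize the change amount into [0, 1] over an assumed working range.
    t = (change_amount - change_lo) / (change_hi - change_lo)
    t = min(max(t, 0.0), 1.0)
    # Interpolate downward from MAX_INTENSITY to MIN_INTENSITY.
    return MAX_INTENSITY - t * (MAX_INTENSITY - MIN_INTENSITY)

low_change = adapt_intensity(10.0)    # first change amount -> first intensity
high_change = adapt_intensity(100.0)  # second, greater change amount
```

The clamp keeps the intensity at or above the minimum, matching the requirement that the output light stay above the set value.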
  • the control unit 500 may control the light emitting device 200 to vary the emission period of the output light of the light emitting element 220 based on motion information of the optical device 10. For example, when the mobile terminal 100 including the optical device 10 is mounted on a moving vehicle such as a car, a robot, or a drone, the control unit 500 may vary the emission period of the output light of the light emitting element 220 based on the movement information of the mobile terminal 100.
  • alternatively, the controller 500 may control the light emitting device 200 to vary the emission period of the output light of the light emitting element in consideration of only the movement speed of the object.
  • FIG. 3B is a view showing an optical device according to another embodiment of the present invention.
  • an optical device 20 may include a control unit 500, a light emitting device 200, a first sensor device 300, and a second sensor device 400.
  • the light emitting device 200 and the first sensor device 300 have the same configuration as the light emitting device 200 and the sensor device 300 of the optical device 10 according to the exemplary embodiment illustrated in FIG. 3A. Therefore, in this description, only the control operation of the second sensor device 400 and the controller 500 will be described.
  • the second sensor device 400 may be the same sensor device as the first sensor device 300.
  • the second sensor device 400 may be positioned in a straight line with the light emitting device 200 and the first sensor device 300, and may be disposed to face the same direction as the light emitting direction of the light emitting device 200.
  • the second sensor device 400 and the first sensor device 300 may be disposed at positions symmetrical to each other with respect to the light emitting device 200.
  • the distance between the second sensor device 400 and the first sensor device 300 and the light emitting device 200 may be the same.
  • however, the positions of the second sensor device 400 and the first sensor device 300 are not limited thereto; the separation distances may differ from each other, and both the second sensor device 400 and the first sensor device 300 may be located on one side of the light emitting device 200.
  • the second sensor device 400 may detect received light in which the pattern light of a specific wavelength emitted from the light emitting device 200 is reflected from the target area.
  • the second sensor device 400 may include a second sensor control unit 410, a second sensor 420, a filter 430, a second lens 440, and a second aperture 450.
  • the second sensor 420 may detect received light of a specific wavelength that has passed through the filter 430 among received light corresponding to the output light.
  • the second sensor 420 may be the same sensor as the first sensor 320 of the first sensor device 300.
  • the second sensor 420 may be an asynchronous type sensor that senses pixel-based data or a general frame type sensor.
  • the control unit 500 may control the operation of the light emitting device 200, the first sensor device 300, and the second sensor device 400 through the light emission control unit 210, the first sensor control unit 310, and the second sensor control unit 410.
  • meanwhile, the control unit 500 may itself include the light emission control unit 210, the first sensor control unit 310, and the second sensor control unit 410, in which case the operation of the light emitting device 200, the first sensor device 300, and the second sensor device 400 is controlled directly by the control unit 500.
  • the control unit 500 may be included in the terminal control unit 170, and the terminal control unit 170 can control all components of the optical device 20.
  • the controller 500 may calculate a parallax distance by comparing the first pattern detected by the first sensor device 300 with the second pattern detected by the second sensor device 400, and may obtain depth information of the target area using the calculated parallax distance.
  • alternatively, the control unit 500 may calculate a parallax distance by comparing the first pattern detected by the first sensor device 300 with a reference pattern, and may correct the calculated parallax distance using the result of comparing the second pattern detected by the second sensor device 400 with the reference pattern.
  • the controller 500 may obtain depth information of the target area based on the corrected parallax distance information. The configuration related thereto will be described later in detail with reference to FIGS. 4A and 4B.
  • the controller 500 may synchronize the operations of the first sensor device 300, the second sensor device 400, and the light emitting element 220. Specifically, the control unit 500 may receive, from the light emission control unit 210, first time stamp information containing the time point at which the light emitting element 220 is turned on to emit light.
  • the control unit 500 may transmit the received first time stamp information to the first sensor control unit 310 and the second sensor control unit 410. Based on the first time stamp information received from the control unit 500, the first sensor control unit 310 and the second sensor control unit 410 may control the first sensor 320 and the second sensor 420 to operate in synchronization with the ON period of the light emitting element 220.
  • the control unit 500 may receive, from the first sensor control unit 310 and the second sensor control unit 410, the data sensed by the first sensor 320 and the second sensor 420 together with second time stamp information of the corresponding data.
  • the control unit 500 may store sensing data, second time stamp information, and the like in a storage unit (not shown), and may derive depth information by using the stored information.
  • the controller 500 may control the intensity of the output light output from the light emitting element 220.
  • the controller 500 may control the light emitting device 200 so that the intensity of the output light output from the light emitting element 220 is equal to or greater than a set value.
  • FIG. 4A is a diagram illustrating calculating a parallax distance in the optical device 10 of FIG. 3A.
  • the light emitting device 200 emits pattern light OB1 in the direction or angle V1 toward the target area.
  • V1 is perpendicular to the base line BL connecting the positions of the light emitting device 200 and the sensor device 300; hereinafter, the direction parallel to the base line BL is referred to as the 'x-axis direction' and the direction perpendicular to it is referred to as the 'z-axis direction'.
  • the emitted pattern light OB1 may be irradiated to the first object A1, the second object A2, and the third object A3 positioned at different distances from the light emitting device 200.
  • Patterned light reflected from each of the objects A1, A2, and A3 may be incident on the sensor device 300 in the direction or angle of V2.
  • the received light received by the sensor device 300 may be sensed by the sensor 320 through the aperture 350, the lens 340, and the filter 330.
  • the pattern light OB1 emitted from the light emitting device 200 may be formed of at least one line pattern light in a linear shape or at least one line dot pattern light in which a plurality of dots form a straight line.
  • the controller 500 may designate the pattern light OB1 as a reference pattern.
  • the pattern light OB1 may be formed as a straight line extending in the direction perpendicular to the base line BL connecting the positions of the light emitting device 200 and the sensor device 300 (hereinafter referred to as the 'y-axis direction').
  • the pattern of the received light reflected from the object located at an infinite distance from the light emitting device 200 and the sensor device 300 has a parallax distance close to zero. Also, as the distance between the light emitting device 200 and the sensor device 300 and the object becomes closer, the parallax distance increases.
  • the controller 500 may store, in the storage unit as reference distance data, the parallax distance information of the received-light pattern matched with the distance information between the light emitting device 200 and sensor device 300 and the object, and may calculate depth information of the target area by comparing the per-pixel parallax distance information derived from the measured received-light pattern with the stored reference distance data.
  • the relationship between the parallax distance and the distance information of the object may vary according to the separation distance between the light emitting device 200 and the sensor device 300. Therefore, when the separation distance between the light emitting device 200 and the sensor device 300 is not fixed, the parallax distance, the distance information of the object, and the separation distance information of the light emitting device 200 and the sensor device 300 may be matched together and saved as reference distance data.
  • the controller 500 may then calculate depth information of the target area by comparing the per-pixel parallax distance information derived from the measured received-light pattern and the separation distance information of the light emitting device 200 and the sensor device 300 with the stored reference distance data.
  • the pattern OB2 of the received light detected by the sensor device 300 may have a different shape from the patterned light OB1 emitted from the light emitting device 200.
  • the first pattern OB2a reflected from the first object A1, which is located closest to the light emitting device 200 and the sensor device 300, has the largest parallax distance D1 in the x-axis direction compared with the original position of the pattern light OB1.
  • the third pattern OB2c reflected from the third object A3, which is located farthest from the light emitting device 200 and the sensor device 300, has the smallest parallax distance D3 in the x-axis direction compared with the original position of the pattern light OB1.
  • the controller 500 finds the point of the received-light pattern having the same y coordinate as a specific y coordinate of the pattern light OB1, and can derive the parallax distance in the x-axis direction by comparing the x coordinate of that point with the original x coordinate of the pattern light OB1.
  • the control unit 500 sequentially derives the parallax distance in the x-axis direction from the bottom y coordinate of the pattern light to the top y coordinate, and based on the derived parallax distance information, can derive depth information of the target area at the y coordinates irradiated by the pattern light.
  • the controller 500 obtains the x coordinate of the received-light pattern OB2 for each y coordinate of the pattern light OB1, and may derive the parallax distance D1 of the first object A1, the parallax distance D2 of the second object A2, and the parallax distance D3 of the third object A3.
  • the controller 500 compares the derived D1, D2, and D3 with the reference data stored in the storage unit, and can derive depth information of the first object A1, the second object A2, and the third object A3 corresponding to the irradiated portion of the pattern light OB1. Accordingly, parallax distance and depth information can be calculated effectively from the received light of the linear pattern light reflected from the objects.
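The per-y-coordinate parallax computation described above can be sketched as follows. Note one substitution: where the patent compares the derived parallax distances against stored reference distance data, this sketch recovers depth with the standard triangulation relation depth = f * B / parallax, which is an assumption made for illustration; all numeric values and names are hypothetical.

```python
# Sketch: for each y coordinate of the emitted line pattern, the x shift
# of the received pattern gives the parallax distance; depth is then
# recovered by triangulation (an illustrative stand-in for the patent's
# stored reference-distance table).
def parallax_per_y(emitted_x: float, received: dict) -> dict:
    """received maps y -> x coordinate of the detected pattern point.
    Returns y -> parallax distance in the x-axis direction."""
    return {y: abs(x - emitted_x) for y, x in received.items()}

def depth_from_parallax(parallax: float, focal_px: float, baseline: float) -> float:
    """Triangulation: closer objects yield larger parallax."""
    return focal_px * baseline / parallax

emitted_x = 100.0
received = {0: 130.0, 1: 115.0, 2: 105.0}   # closer object -> larger shift
d = parallax_per_y(emitted_x, received)
depths = {y: depth_from_parallax(p, focal_px=600.0, baseline=0.05)
          for y, p in d.items()}
```

The y coordinate with the largest x shift maps to the smallest depth, matching the behavior of patterns OB2a and OB2c above.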
  • the controller 500 may control the light emitting device 200 to continuously and repeatedly emit the pattern light OB1.
  • a plurality of line patterns having different x-coordinates may be simultaneously emitted to the target area.
  • the distance in the x-axis direction between each line pattern may be set to be greater than or equal to a predetermined value.
  • the emitted pattern light OB1 may be repeatedly emitted to the target region so that the x-coordinate changes continuously.
  • the controller 500 may compare the second time stamp information matched with the pattern data of the received light with the first time stamp information of the light emitting element 220 to calculate the parallax distance.
  • the control unit 500 may derive parallax distance and depth information by comparing pattern data and pattern light OB1 data having the same time stamp value or a difference between the time stamp values within a certain range.
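The time-stamp pairing described above can be sketched as follows. The record layout and tolerance value are assumptions; the only behavior taken from the text is that pattern data and pattern-light data are paired when their time stamp values are equal or differ by no more than a certain range.

```python
# Sketch: pairing received-light pattern data (second time stamps) with
# emitted pattern-light data (first time stamps) within a tolerance.
def match_by_timestamp(emissions, detections, tol):
    """emissions: list of (t_emit, pattern_id);
    detections: list of (t_detect, data).
    Returns (pattern_id, data) pairs whose time stamps differ by at
    most tol."""
    pairs = []
    for t_emit, pid in emissions:
        for t_det, data in detections:
            if abs(t_det - t_emit) <= tol:
                pairs.append((pid, data))
    return pairs

emissions = [(0.000, "T1"), (0.010, "T2")]
detections = [(0.0004, "pat_a"), (0.0102, "pat_b"), (0.5, "stray")]
pairs = match_by_timestamp(emissions, detections, tol=0.001)
```

The stray detection far from any emission time stamp is discarded, so each parallax computation only uses received light belonging to the matching emission.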
  • FIG. 4B is a diagram illustrating the calculation of the parallax distance in the optical device 20 of FIG. 3B.
  • pattern light reflected from each of the objects A1, A2, and A3 may be incident on the first sensor device 300 in the direction or angle of V2, and on the second sensor device 400 in the direction or angle of V3.
  • the received light received by the first sensor device 300 and the second sensor device 400 may be sensed by a sensor through an aperture, a lens, and a filter, respectively.
  • the first received-light pattern OB2 detected by the first sensor device 300 and the second received-light pattern OB3 detected by the second sensor device 400 may have shapes different from that of the pattern light OB1 emitted from the light emitting device 200.
  • the controller 500 finds the points of the first and second received-light patterns having the same y coordinate as a specific y coordinate of the pattern light OB1, and can derive the parallax distance in the x-axis direction by comparing the x coordinates of those points with the original x coordinate of the pattern light OB1.
  • the control unit 500 sequentially derives the parallax distance in the x-axis direction from the bottom y coordinate of the pattern light to the top y coordinate, and based on the derived parallax distance information, can derive depth information of the target area at the y coordinates irradiated by the pattern light.
  • the control unit 500 obtains the x coordinate of the first received-light pattern OB2 and the x coordinate of the second received-light pattern OB3 for each y coordinate of the pattern light OB1, and may derive the parallax distance D1 of the first object A1, the parallax distance D2 of the second object A2, and the parallax distance D3 of the third object A3.
  • the controller 500 compares the derived D1, D2, and D3 with the reference data stored in the storage unit, and can derive depth information of the first object A1, the second object A2, and the third object A3 corresponding to the irradiated portion of the pattern light OB1. Accordingly, parallax distance and depth information can be calculated effectively from the received light of the linear pattern light reflected from the objects.
  • alternatively, the control unit 500 finds the first received-light pattern and the second received-light pattern having the same y coordinate as a specific y coordinate of the pattern light OB1, and may derive the parallax distance in the x-axis direction by directly comparing the x coordinates of the two patterns.
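Both two-sensor variants described above can be sketched as follows. The symmetric sensor placement (equal baselines on either side of the light emitting device, so the pattern shifts in opposite x directions on the two sensors) is an assumption taken from the symmetric arrangement mentioned earlier; the averaging used as the "correction" is an illustrative choice, since the patent does not specify the correction formula.

```python
# Sketch: (a) compare each received pattern with the reference pattern
# and average the two parallax estimates as a correction, or (b) compare
# the two received x coordinates directly.
def corrected_parallax(ref_x: float, x1: float, x2: float) -> float:
    """Average the first-sensor and second-sensor parallax estimates
    (illustrative correction for symmetric sensor placement)."""
    d1 = abs(x1 - ref_x)          # first sensor vs reference pattern
    d2 = abs(x2 - ref_x)          # second sensor vs reference pattern
    return (d1 + d2) / 2.0

def direct_parallax(x1: float, x2: float) -> float:
    """Directly compare the two received x coordinates (stereo form)."""
    return abs(x1 - x2)

ref_x = 100.0
x1, x2 = 112.0, 88.0              # opposite shifts on the symmetric sensors
avg = corrected_parallax(ref_x, x1, x2)
direct = direct_parallax(x1, x2)
```

With symmetric sensors the direct stereo parallax is twice the per-sensor value, so either formulation can feed the same depth lookup once scaled consistently.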
  • FIG. 5A is a diagram illustrating a light emitting device of an optical device according to an embodiment of the present invention.
  • the light-emitting device 200 may include at least one light-emitting element 220 and a light-emitting control unit 210, and may further include at least one first optical member 230.
  • the first optical member 230 is a device that duplicates the light emitted from the light emitting device 220 into a plurality of lights to generate duplicate light.
  • the first optical member 230 may be a diffractive optical element (DOE).
  • the diffractive optical element is an optical element that can replicate input light into a plurality of identical beams and output them.
  • the shape and number of the replicated beams may vary depending on the type of diffractive optical element.
  • the light emitting device 200 including the first optical member 230 may simultaneously emit a plurality of pattern lights, spaced apart from each other, onto the target area.
  • the plurality of pattern lights may have different x coordinates and may be emitted from the light emitting device 200 with the same separation distance between them.
  • the distance in the x-axis direction between each line pattern may be set to be greater than or equal to a predetermined value.
  • FIGS. 5B and 5C are diagrams illustrating a light emitting device 200 of an optical device according to another exemplary embodiment of the present invention.
  • the light emitting device 200 may include at least one light emitting element 220, a light emission control unit 210, and at least one first optical member 230, and may further include a second optical member 240.
  • the second optical member 240 may be arranged in the order of 'light emitting element 220, first optical member 230, second optical member 240'. In this case, the second optical member 240 transmits the light duplicated by the first optical member 230. Meanwhile, the members may instead be disposed in the order of 'light emitting element 220, second optical member 240, first optical member 230'. In this case, the second optical member 240 transmits the pattern light emitted from the light emitting element 220.
  • the second optical member 240 may convert copy light or pattern light.
  • the second optical member 240 may convert the duplicated light or patterned light into duplicated or patterned light expanded in the x-axis direction or the y-axis direction.
  • the second optical member 240 may convert the duplicated light or the patterned light so that the patterned light irradiated to the target area has an appropriate angle of view.
  • the second optical member 240 may convert the duplicated light to minimize distortions such as pincushion distortion generated when the first optical member 230 creates the duplicated light. Accordingly, the second optical member 240 can greatly reduce the replication distortion caused by the first optical member 230.
  • FIG. 6 is a diagram illustrating an example of the second optical member 240 in the light emitting device 200 of FIG. 5B.
  • the second optical member 240 may include at least one lens. Meanwhile, the second optical member 240 may include at least one diffractive optical element.
  • light emitted from each of the light sources 220a, 220b, 220c, and 220d of the light emitting element 220 passes through the first optical member 230, is replicated into a plurality of replica lights, and its pattern may be converted while passing through the second optical member 240 composed of a plurality of lenses.
  • FIG. 7 is a view showing a light emitting element of a light emitting device according to an embodiment of the present invention.
  • the light emitting element 220 may include at least one light source. Specifically, the light emitting element 220 may have a form in which at least one vertical resonance surface emitting laser (VCSEL) is connected to or integrated with a circuit board (not shown).
  • the vertical resonance surface-emission laser light source is a light source used in the optical communication field, and has features of low cost, high temperature stability, and mass productivity.
  • the vertical resonance surface-emission laser light source has advantages of high modulation speed, narrow radiation angle, low operating current, and high conversion efficiency.
  • the vertical resonance surface-emitting laser light source has an advantage of being able to mass-produce by printing a plurality of highly efficient laser light sources on a semiconductor wafer in a pattern form.
  • the optical device may have advantages such as low cost, miniaturization, low power, high efficiency, and high stability by using a vertical resonance surface emitting laser as a light emitting device.
  • the light emitting element 220 may include a plurality of light sources (T1, T2, ..., TN) arranged in a line in the x-axis direction parallel to the base line BL connecting the light emitting device 200 and the sensor device 300.
  • the plurality of light sources may be light sources included in one vertical resonance surface emitting laser VS1 of 1 * N segment.
  • the controller 500 may control the light emitting device 200 so that the light sources T1, T2, ..., TN included in the vertical resonance surface emitting laser VS1 sequentially emit light.
  • the light emitting element 220 may include a plurality of light sources (T11, T12, ..., TPN) arranged in a matrix along the x-axis direction parallel to the base line BL connecting the light emitting device 200 and the sensor device 300 and along the y-axis direction perpendicular thereto. In this case, the plurality of light sources may be light sources of a form (VS1, VS2, ..., VSP) in which a plurality of 1 * N segment vertical resonance surface emitting lasers are combined in the y-axis direction.
  • the control unit 500 may control the light emitting device 200 so that the light sources (T11, T12, ..., TPN) included in the plurality of vertical resonance surface emitting lasers VS1, VS2, ..., VSP sequentially emit light.
  • controller 500 may control the light emitting device 200 so that light sources belonging to the same column among the plurality of light sources simultaneously emit light, and each column sequentially emit light.
  • the control unit 500 may control the light emitting device 200 so that light sources having the same x coordinate among the light sources included in the plurality of vertical resonance surface emitting lasers VS1, VS2, ..., VSP simultaneously emit light.
  • for example, the control unit 500 may control the light emitting device 200 so that the first light source T11 of the first vertical resonance surface emitting laser VS1, the first light source T21 of the second vertical resonance surface emitting laser VS2, ..., and the first light source TP1 of the P-th vertical resonance surface emitting laser VSP emit light simultaneously, while the light sources T11, T12, ..., T1N within one vertical resonance surface emitting laser sequentially emit light.
  • alternatively, the light emitting element 220 may include a plurality of light sources (T11, T12, ..., TNP) arranged in a matrix along the x-axis direction parallel to the base line BL connecting the light emitting device 200 and the sensor device 300 and along the y-axis direction perpendicular thereto. In this case, the plurality of light sources may be light sources of a form (VS1, VS2, ..., VSN) in which a plurality of 1 * P segment vertical resonance surface emitting lasers are combined in the x-axis direction.
  • the control unit 500 may control the light emitting device 200 so that the light sources (T11, T21, ..., TNP) included in the plurality of vertical resonance surface emitting lasers VS1, VS2, ..., VSN sequentially emit light.
  • for example, the control unit 500 may control the first light source T11 of the first vertical resonance surface emitting laser VS1, the first light source T21 of the second vertical resonance surface emitting laser VS2, and the first light source T31 of the third vertical resonance surface emitting laser VS3 to sequentially emit light.
  • the controller 500 may control the plurality of light sources arranged in a matrix to emit light sequentially, starting from the light source T11 at the upper left, then proceeding through the second row, and so on until the last light source TNP at the lower right.
  • controller 500 may control the light emitting device 200 so that light sources belonging to the same column among the plurality of light sources simultaneously emit light, and each column sequentially emit light.
  • the controller 500 may control the light emitting device 200 so that each of the vertical resonance surface emitting lasers VS1, VS2, ..., VSN sequentially emit light.
  • for example, the control unit 500 may control the light emitting device 200 so that all light sources (T11, T12, ..., T1P) of the first vertical resonance surface emitting laser VS1 emit light simultaneously, then all light sources (T21, T22, ..., T2P) of the second vertical resonance surface emitting laser VS2 emit light simultaneously, and finally all light sources (TN1, TN2, ..., TNP) of the N-th vertical resonance surface emitting laser VSN emit light simultaneously.
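The firing schedules described above can be sketched as simple sequence generators. This is an illustrative sketch: the index convention (row = one vertical resonance surface emitting laser, column = shared x coordinate) follows the description, but the function names and matrix dimensions are assumptions.

```python
# Sketch: three firing orders for an N x P matrix of VCSEL light sources.
# Each schedule is a list of steps; each step lists the (row, col)
# sources that emit light simultaneously at that step.
def raster_order(n_cols: int, n_rows: int):
    """Fire every source individually, left-to-right, top-to-bottom."""
    return [[(r, c)] for r in range(n_rows) for c in range(n_cols)]

def column_order(n_cols: int, n_rows: int):
    """Fire all sources sharing an x coordinate (one column) at once."""
    return [[(r, c) for r in range(n_rows)] for c in range(n_cols)]

def row_order(n_cols: int, n_rows: int):
    """Fire all sources of one vertical resonance surface emitting
    laser (one row) at once, row by row."""
    return [[(r, c) for c in range(n_cols)] for r in range(n_rows)]

cols_sched = column_order(n_cols=4, n_rows=3)   # 4 steps of 3 sources each
```

Column order emits one vertical line pattern per step with a stepping x coordinate, which is the scanning behavior the depth calculation above relies on.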
  • FIG. 8 is a diagram illustrating an optical device emitting light to a target region and receiving a pattern of received light according to an exemplary embodiment of the present invention.
  • FIGS. 9 to 11 are views referenced for describing FIG. 8.
  • the light emitting device 200 of the optical device 20 may include the light emission control unit 210, the light emitting element 220, the first optical member 230, and the second optical member 240.
  • the light emitting element 220 may include one 1 * N segment vertical resonance surface emitting laser.
  • the controller 500 may control the light emitting device 200 so that the light sources T1, T2, T3, ..., TN of the light emitting element 220 sequentially emit light.
  • Each light source may be a dot light source.
  • the first optical member 230 may be an optical member that replicates light into M * L beams.
  • the first optical member 230 may replicate one point light source that emits light at a specific time into M * L point light sources.
  • the first optical member 230 duplicates the light of the first light source T1 into M dots in the y-axis direction to form a linear first line dot pattern (T_1.1.1, T_1.2.1, T_1.3.1, ..., T_1.M.1), and at the same time may duplicate second to L-th line dot patterns (T_1.1.L, T_1.2.L, T_1.3.L, ..., T_1.M.L) having the same shape as the first line dot pattern and separated from one another by the same distance.
  • that is, the first light source T1 may be duplicated into L line dot patterns each including M dots.
  • alternatively, a line dot pattern including M dots may be duplicated as a continuous line pattern rather than a dot pattern.
  • the first optical member 230 may replicate the patterns of the second light source T2 to the N-th light source TN in the same manner. Since the first light source T1 to the N-th light source TN are separated by the same distance, each line dot pattern duplicated by the first optical member 230 can be irradiated to different points without overlapping.
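The M * L replication described above can be sketched as a position computation. This is an illustrative sketch: the dot pitch along y and the line spacing d0 along x are assumed values, and the coordinate convention (source dot at the origin) is a simplification.

```python
# Sketch: positions of the M * L replicas of one emitted dot. One dot is
# duplicated into M dots along the y axis (one line dot pattern), and
# that line is duplicated L times along the x axis with spacing d0.
def replicate_dot(x0: float, y0: float, m: int, l: int,
                  dy: float, d0: float):
    """Return the (x, y) positions of the M * L replicated dots for a
    source dot at (x0, y0)."""
    return [(x0 + j * d0, y0 + i * dy)
            for j in range(l)          # L line dot patterns along x
            for i in range(m)]         # M dots per line along y

dots = replicate_dot(x0=0.0, y0=0.0, m=5, l=3, dy=1.0, d0=10.0)
xs = {x for x, _ in dots}              # distinct x coordinates = L lines
```

Because all source dots of the light emitting element share the same replication offsets, shifting x0 by the source spacing reproduces the non-overlapping patterns of T2 through TN.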
  • FIG. 9(a) is a view showing the pattern light emitted from the light emitting device 200 when the first light source T1 emits light, and FIG. 9(b) is a view showing the pattern of the received light in which that pattern light is reflected from the target area and received.
  • when the first light source T1 emits light at a specific point in time, the light emitting device 200 emits L line dot patterns, each having M linearly arranged dots. Each of the line dot patterns may be spaced apart from the others by a predetermined distance D0.
  • the distance D0 at which each line dot pattern is separated from each other may be greater than or equal to the maximum pixel parallax distance.
  • the maximum pixel parallax distance is a parallax distance measured from a received light pattern reflected by an object located at the closest distance that can be photographed by the first sensor device 300 or the second sensor device 400.
  • if the spacing is smaller, the pattern of the received light reflected from an object very close to the optical device 20 may have a parallax distance greater than the distance between the line dot patterns. In that case, some patterns of the received light may be matched with adjacent line dot pattern regions, and the control unit 500 may not be able to calculate accurate depth information.
  • the received light may have a pattern in which each dot is shifted in the x-axis direction compared to the emitted pattern light. This is because the depths of the target regions from which the emitted pattern light is reflected are different.
  • each pixel of the sensors 320 and 420 may be matched one-to-one with each dot of the received light pattern, a plurality of pixels may be matched with one dot, or a plurality of dots may be matched with one pixel. However, in order to increase the accuracy of the calculated depth information, it is preferable that at least one pixel matches each dot.
  • the control unit 500 may compare the position of each line dot pattern of the emitted pattern light with the position of each dot pattern of the received light, and calculate the parallax distances (D_1.1.2, D_1.2.2, D_1.3.2, ..., D_1.M.2) for each dot.
  • the control unit 500 may calculate the parallax distance by taking the difference in x coordinates between a dot of the emitted pattern light and the dot of the received light that has the same y coordinate.
  • the controller 500 may calculate a parallax distance for L line dot patterns of the emitted pattern light.
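The per-dot parallax computation described above (same y coordinate, x-coordinate difference) can be illustrated in a few lines. The data layout, a mapping from y coordinate to x coordinate for one line dot pattern, is an assumption made for the sketch; the patent does not prescribe a data structure.

```python
# Illustrative per-dot parallax: for dots sharing the same y coordinate,
# the parallax distance is the x-coordinate difference between the
# emitted reference pattern and the received pattern.

def parallax_distances(reference, received):
    """Map each reference dot's y coordinate to its x-axis parallax.

    reference, received: dicts {y_coordinate: x_coordinate} for one
    emitted line dot pattern and the corresponding received dots.
    """
    return {y: received[y] - reference[y]
            for y in reference if y in received}

ref = {0: 10.0, 1: 10.0, 2: 10.0}     # one emitted line dot pattern
rcv = {0: 12.5, 1: 11.0, 2: 10.25}    # the same dots after reflection
print(parallax_distances(ref, rcv))   # {0: 2.5, 1: 1.0, 2: 0.25}
```

Repeating this over the L line dot patterns of one emission yields the parallax data for that pattern light.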
  • FIGS. 10 and 11 are diagrams illustrating patterns of patterned light emitted from the light emitting device 200 by the second light source T2 and the Nth light source TN, and a pattern of received light from which the patterned light is reflected.
  • when the second light source T2 emits light, the light emitting device 200 emits pattern light, and the sensor devices 300 and 400 may receive the received light, i.e., the pattern light reflected from the target area.
  • the line dot patterns emitted when the second light source T2 emits light may be spaced apart from, without overlapping, the line dot patterns emitted when the first light source T1 emits light.
  • line dot patterns may be emitted to different points by sequential light emission from the third light source T3 to the Nth light source TN.
  • the control unit 500 may compare the position of each line dot pattern of the pattern lights, emitted by the sequential light emission of the second light source T2 to the Nth light source TN, with the position of each dot pattern of the received lights, and calculate the parallax distance for each dot.
  • the control unit 500 can thus calculate the parallax distance for the entire target area, store the calculated parallax distance information in the storage unit, and calculate the depth information of the entire target area by comparing it with the distance data.
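The patent describes converting stored parallax data to depth by comparison with distance data, without fixing a formula. One common way such a mapping is realized in structured-light systems is the triangulation relation depth = focal length × baseline / parallax; the sketch below uses that textbook relation with made-up parameter values, purely for illustration.

```python
# Textbook triangulation sketch (not the patent's stored-data method):
# depth is inversely proportional to the measured pixel parallax.

def depth_from_parallax(parallax_px, focal_px, baseline_mm):
    """Depth in mm from pixel parallax via the triangulation relation."""
    if parallax_px <= 0:
        return float("inf")   # no measurable shift -> effectively far away
    return focal_px * baseline_mm / parallax_px

# e.g. 500 px focal length, 20 mm emitter-to-sensor baseline:
print(depth_from_parallax(2.5, focal_px=500.0, baseline_mm=20.0))  # 4000.0
```

Near objects produce large parallax and small depth values, which is why the minimum line-pattern spacing D0 must exceed the maximum pixel parallax distance.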
  • the controller 500 may compare the second time stamp information matched with the pattern data of the received light with the first time stamp information of the light emitting device 220 to calculate the parallax distance.
  • the controller 500 may derive parallax distance and depth information by comparing the dot pattern of the received light with the line dot pattern of the pattern light whose time stamp value is the same or differs by less than a certain range.
  • the controller 500 may derive the parallax distance and depth information through only the time stamp and x, y coordinate comparison. Accordingly, the amount of computation required for deriving depth information can be reduced.
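The time-stamp matching step above can be sketched as follows. The list layout, the pattern placeholders, and the tolerance value are assumptions for illustration; the point is only that each received frame is compared against the one emitted reference pattern whose first time stamp matches its second time stamp, limiting the search space.

```python
# Sketch of first/second time stamp matching: a received frame is paired
# only with emitted patterns whose time stamp is equal or within a
# tolerance, so the x, y comparison runs against a single reference.

def match_by_timestamp(emitted, received_ts, tolerance=0.001):
    """Return emitted patterns whose time stamp is within tolerance.

    emitted: list of (timestamp, pattern) pairs (first time stamps);
    received_ts: second time stamp recorded when the sensor operated.
    """
    return [pattern for ts, pattern in emitted
            if abs(ts - received_ts) <= tolerance]

emitted = [(0.000, "T1 pattern"), (0.010, "T2 pattern"), (0.020, "T3 pattern")]
print(match_by_timestamp(emitted, received_ts=0.0102))  # ['T2 pattern']
```

Restricting the comparison to one matched pattern per frame is what reduces the amount of computation required for deriving depth information.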
  • FIG. 12 is a diagram illustrating that an optical device emits light to a target area and receives a pattern of received light according to another exemplary embodiment of the present invention
  • FIGS. 13A to 13F are views referenced for describing FIG. 12.
  • the optical device 20 may include a light emitting control unit 210, a light emitting element 220, a first optical member 230, and a second optical member 240.
  • the light emitting element 220 may include a plurality of 1 * N segment vertical resonance surface-emitting lasers. In the present description, a case in which two vertical resonance surface-emitting lasers are used is described, but the present invention is not limited thereto.
  • the control unit 500 may control the light-emitting device 200 so that each light source (T11, T12, T13, ..., T1N, T21, ..., T2N) of the light-emitting element 220 emits light sequentially.
  • the first optical member 230 may be an optical member that replicates light into M * L copies.
  • the first optical member 230 may replicate one point light source that emits light at a specific time into M * L point light sources.
  • the light emitted from each of the first to Nth light sources (T11, T12, ..., T1N) included in the first vertical resonance surface-emitting laser may be duplicated into L line dot patterns, each including M dots. Since the first light source T11 to the Nth light source T1N are spaced apart by the same distance, each line dot pattern duplicated by the first optical member 230 may be irradiated to a different point without overlapping.
  • similarly, the light emitted by the N+1th to 2Nth light sources (T21, T22, ..., T2N) included in the second vertical resonance surface-emitting laser may be duplicated into L line dot patterns, each including M dots.
  • the first vertical resonance surface emitting laser and the second vertical resonance surface emitting laser may be disposed adjacent to each other in the y-axis direction.
  • the plurality of line dot patterns into which the light emitted from the first to Nth light sources is duplicated and the plurality of line dot patterns into which the light emitted from the N+1th to 2Nth light sources is duplicated do not overlap each other, and may be sequentially emitted to different areas, the first area AR1 and the second area AR2.
  • the target area may be composed of at least one divided area.
  • the first area AR1 and the second area AR2 correspond to divided areas of the target area.
  • FIGS. 13a to 13f are diagrams showing the pattern light emitted from the light emitting device 200 by light emission of the first light source T11 through the 2Nth light source T2N.
  • the light emitting device 200 emits pattern light by light emission from the first light source T11, and the sensor devices 300 and 400 are configured to reflect the emitted pattern light from the target area. It can receive received light.
  • the light emitting device 200 and the sensor devices 300 and 400 may sequentially repeat the processes of emitting patterned light and receiving received light from light emission of the first light source T11 to light emission of the Nth light source T1N. Accordingly, the light emitting device 200 may emit pattern light to the first area AR1 of the target area and receive the reflected light to derive depth information.
  • similarly, the light emitting device 200 and the sensor devices 300 and 400 may sequentially repeat the processes of emitting pattern light and receiving received light, from light emission of the N+1th light source T21 to light emission of the 2Nth light source T2N. Accordingly, the light emitting device 200 may emit pattern light to the second area AR2 of the target area and receive the reflected light to derive depth information.
  • the control unit 500 may derive first depth information of the first area AR1 and second depth information of the second area AR2.
  • the control unit 500 may derive depth information of the entire target area by using the first depth information and the second depth information.
  • the controller 500 may control only one of the first vertical resonance surface-emitting laser and the second vertical resonance surface-emitting laser to operate. Accordingly, the control unit 500 may derive depth information only for a specific region of interest among the first region AR1 and the second region AR2.
  • the light emitting element 220 may include a plurality of 1 * N segment vertical resonance surface-emitting lasers, and the control unit 500 may divide the target area into the same number of areas as the number of vertical resonance surface-emitting lasers and derive depth information only for at least one specific region of interest.
  • the control unit 500 may control the light emitting device 200 so that the light intensity emitted by the N light sources (T11, T12, ..., T1N) included in the first vertical resonance surface-emitting laser differs from the light intensity emitted by the N light sources (T21, T22, ..., T2N) included in the second vertical resonance surface-emitting laser. Accordingly, pattern light having a different intensity is emitted for each divided area, improving the accuracy of depth information measurement.
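The per-region intensity control above can be sketched as a drive schedule that assigns a different intensity to each laser bank. The bank contents, the intensity values, and the `build_drive_schedule` helper are hypothetical; real VCSEL drivers expose hardware-specific interfaces not described in the patent.

```python
# Hedged sketch: two VCSEL banks driven at different intensities so each
# divided area (AR1, AR2) receives pattern light of a different brightness.

def build_drive_schedule(banks, intensities):
    """Pair each laser bank's light sources with its drive intensity."""
    return [(source, intensity)
            for bank, intensity in zip(banks, intensities)
            for source in bank]

bank1 = ["T11", "T12", "T13"]    # first VCSEL's sources (area AR1)
bank2 = ["T21", "T22", "T23"]    # second VCSEL's sources (area AR2)
schedule = build_drive_schedule([bank1, bank2], intensities=[0.8, 1.0])
# the second bank is driven harder, e.g. for a farther or darker area
```

A scheduler like this would let the control unit raise the intensity only for the region of interest whose reflections are weakest.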
  • FIG. 14 is a diagram illustrating that an optical device emits light to a target area and receives a pattern of received light according to another embodiment of the present invention
  • FIGS. 15A to 15C are views referenced for describing FIG. 14.
  • the optical device 20 may include a light emitting control unit 210, a light emitting element 220, a first optical member 230, and a second optical member 240.
  • the light emitting element 220 may include a plurality of 1 * M segment vertical resonance surface-emitting lasers. In this description, the case in which there are N vertical resonance surface-emitting lasers will be described.
  • the plurality of vertically resonant surface-emission lasers may be combined in the x-axis direction.
  • the controller 500 may control the light emitting device 200 so that light sources belonging to the same column among the plurality of light sources emit light simultaneously, and the columns emit light sequentially.
  • the controller 500 may control the light emitting device 200 so that each of the vertical resonance surface-emission lasers VS1, VS2, ..., VSN sequentially emit light.
  • the control unit 500 may control the light emitting device 200 so that all light sources (T11, T12, ..., T1M) of the first vertical resonance surface-emitting laser VS1 emit light simultaneously, then all light sources (T21, T22, ..., T2M) of the second vertical resonance surface-emitting laser VS2 emit light simultaneously, and finally all light sources (TN1, TN2, ..., TNM) of the Nth vertical resonance surface-emitting laser VSN emit light simultaneously.
  • the first optical member 230 may be an optical member that replicates light into 1 * L copies.
  • the first optical member 230 may replicate one point light source that emits light at a specific time into 1 * L point light sources.
  • the light simultaneously emitted by the first to Mth light sources (T11, T12, ..., T1M) included in the first vertical resonance surface-emitting laser may be duplicated into L line dot patterns, each including M dots.
  • similarly, the light simultaneously emitted by the M light sources included in each of the second to Nth vertical resonance surface-emitting lasers may be duplicated into L line dot patterns, each including M dots.
  • the first vertical resonance surface-emitting laser and the second vertical resonance surface-emitting laser may be disposed adjacent to each other in the x-axis direction. Since the first vertical resonance surface-emitting laser VS1 to the Nth vertical resonance surface-emitting laser VSN are spaced apart by the same distance, each line dot pattern duplicated by the first optical member 230 may be irradiated to a different point without overlapping.
  • FIGS. 15A to 15C are diagrams showing the pattern light emitted when the M light sources (T11, T12, ..., T1M) of the first vertical resonance surface-emitting laser VS1, the M light sources (T21, ...) of the second vertical resonance surface-emitting laser VS2, and so on, respectively emit light simultaneously.
  • by sequentially emitting light from the first vertical resonance surface-emitting laser VS1 to the Nth vertical resonance surface-emitting laser VSN, the controller 500 may sequentially emit pattern light, consisting of L line dot patterns each including M dots, to the target area.
  • the controller 500 may calculate a parallax distance for each dot by comparing the position of each line dot pattern of the pattern lights with the position of each dot pattern of the received lights.
  • the control unit 500 can thus calculate the parallax distance for the entire target area, store the calculated parallax distance information in the storage unit, and calculate the depth information of the entire target area by comparing it with the distance data.
  • since the optical device 20 according to the present exemplary embodiment has a plurality of light sources that emit light simultaneously, the decrease in the brightness of the pattern light that occurs as the first optical member 230 replicates the light can be minimized.
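The brightness remark above reduces to simple arithmetic: if the optical member splits the light into L copies, the power landing on each dot scales with the number of simultaneously emitting sources divided by the replication factor. The function and numbers below are a toy illustration of that proportionality, not a radiometric model.

```python
# Toy model: per-dot brightness ~ simultaneous sources / replication factor.
# Firing a whole 1 x M column at once offsets the loss from L-fold replication.

def relative_dot_brightness(simultaneous_sources, replication_factor):
    return simultaneous_sources / replication_factor

single = relative_dot_brightness(1, 10)   # one source, 10x replication
column = relative_dot_brightness(5, 10)   # a whole 1 x 5 column at once
print(single, column)                     # 0.1 0.5
```

Driving the full column therefore raises per-dot brightness fivefold in this example, which is the advantage claimed for the simultaneous-emission embodiment.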
  • FIG. 16 is a diagram illustrating that an optical device emits light to a target area and receives a pattern of received light according to another exemplary embodiment of the present invention, and FIGS. 17A to 17B are views referenced for describing FIG. 16.
  • the optical device 20 may include a light emitting control unit 210, a light emitting element 220, a first optical member 230, and a second optical member 240.
  • the light emitting device 220 may include a plurality of 1 * M segment vertical resonance surface emitting lasers.
  • a case in which 2N vertical resonance surface-emitting lasers are used is described, but the present invention is not limited thereto; 3N or 4N vertical resonance surface-emitting lasers are also possible.
  • the N vertical resonance surface-emitting lasers may be in the form of laser modules coupled in the x-axis direction.
  • two laser modules may be disposed adjacent to each other in the y-axis direction or may be combined.
  • the controller 500 may control the light emitting device 200 so that each of the vertical resonance surface emitting lasers VS1, VS2, ..., VS2N sequentially emit light.
  • the control unit 500 may control the light emitting device 200 so that all light sources (T11, T12, ..., T1M) of the first vertical resonance surface-emitting laser VS1 emit light simultaneously, then all light sources (T21, T22, ..., T2M) of the second vertical resonance surface-emitting laser VS2 emit light simultaneously, and finally all light sources (T2N1, T2N2, ..., T2NM) of the 2Nth vertical resonance surface-emitting laser VS2N emit light simultaneously.
  • the first optical member 230 may be an optical member that replicates light into 1 * L copies.
  • the first optical member 230 may replicate one point light source that emits light at a specific time into 1 * L point light sources.
  • the light simultaneously emitted by the first to Mth light sources (T11, T12, ..., T1M) included in the first vertical resonance surface-emitting laser may be duplicated into L line dot patterns, each including M dots.
  • similarly, the light simultaneously emitted by the M light sources included in each of the second to 2Nth vertical resonance surface-emitting lasers may be duplicated into L line dot patterns, each including M dots.
  • the first vertical resonance surface-emitting laser and the second vertical resonance surface-emitting laser may be disposed adjacent to each other in the x-axis direction. Since the first vertical resonance surface-emitting laser VS1 to the 2Nth vertical resonance surface-emitting laser VS2N are spaced apart by the same distance, each line dot pattern duplicated by the first optical member 230 may be irradiated to a different point without overlapping.
  • the first laser module and the second laser module may be disposed adjacent to each other in the y-axis direction or may be combined.
  • the line dot patterns emitted by the first to Nth vertical resonance surface-emitting lasers (VS1, VS2, ..., VSN) included in the first laser module and the line dot patterns emitted by the N+1th to 2Nth vertical resonance surface-emitting lasers (VSN+1, VSN+2, ..., VS2N) included in the second laser module do not overlap each other, and may be sequentially emitted to different areas, the first area AR1 and the second area AR2.
  • FIGS. 17A and 17B are diagrams illustrating the pattern light emitted by the first vertical resonance surface-emitting laser VS1 and the pattern light emitted by the 2Nth vertical resonance surface-emitting laser VS2N, respectively.
  • the light emitting device 200 and the sensor devices 300 and 400 may sequentially repeat the processes of emitting pattern light and receiving received light, from the simultaneous light emission of the light sources included in the first vertical resonance surface-emitting laser VS1 to the simultaneous light emission of the light sources included in the Nth vertical resonance surface-emitting laser VSN. Accordingly, the light emitting device 200 may emit pattern light to the first area AR1 of the target area and receive the reflected light to derive depth information.
  • similarly, the light emitting device 200 and the sensor devices 300 and 400 may sequentially repeat the processes of emitting pattern light and receiving received light, from the simultaneous light emission of the light sources included in the N+1th vertical resonance surface-emitting laser VSN+1 to the simultaneous light emission of the light sources included in the 2Nth vertical resonance surface-emitting laser VS2N. Accordingly, the light emitting device 200 may emit pattern light to the second area AR2 of the target area and receive the reflected light to derive depth information.
  • the control unit 500 may derive first depth information of the first area AR1 and second depth information of the second area AR2.
  • the control unit 500 may derive depth information of the entire target area by using the first depth information and the second depth information.
  • the controller 500 may control only one of the first laser module and the second laser module to operate. Accordingly, the control unit 500 may derive depth information only for a specific region of interest among the first region AR1 and the second region AR2.
  • the light emitting element 220 may include a plurality of laser modules arranged adjacent to each other in the y-axis direction or combined, and the control unit 500 may divide the target area into the same number of areas as the number of laser modules and derive depth information only for at least one specific region of interest.
  • the control unit 500 may control the light emitting device 200 so that the light intensity emitted by the plurality of light sources (T11, T12, ..., TNM) included in the first laser module differs from the light intensity emitted by the plurality of light sources (TN+11, TN+12, ..., T2NM) included in the second laser module. Accordingly, pattern light having a different intensity is emitted for each divided area, improving the accuracy of depth information measurement.
  • FIG. 18 is a flowchart of obtaining depth information of an optical device according to an embodiment of the present invention.
  • the controller 500 controls the light emitting device 200 so that the light emitting device 200 emits patterned light of a specific wavelength to the target area (S1801).
  • the controller 500 controls the sensor devices 300 and 400 so that the timing at which the light emitting element 220 emits light and the timing at which the sensor devices 300 and 400 operate are synchronized (S1802).
  • the controller 500 controls the sensor devices 300 and 400 so that the sensor devices 300 and 400 detect the received light in conjunction with the timing at which the light-emitting element 220 emits light (S1803).
  • the control unit 500 matches the first time stamp information, including information about the point in time at which the light-emitting element 220 emits light, with information about the emitted pattern light (the reference pattern), and stores it in the storage unit.
  • the control unit 500 stores second time stamp information including information about the timing at which the sensor devices 300 and 400 operate.
  • the controller 500 calculates a parallax distance by comparing the detected pattern of the received light with a reference pattern (S1804).
  • the control unit 500 compares the first time stamp information with the second time stamp information, matches time stamp entries having the same value or differing by less than a predetermined value, and compares the corresponding reference pattern with the pattern of the received light.
  • the controller 500 obtains depth information of the target area by using the calculated parallax distance (S1805).
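The flow S1801 to S1805 above can be sketched as one acquisition cycle. Every function body below is a placeholder stub; only the ordering (emit, synchronize, detect, compare, derive depth) follows the flowchart, and the numeric values are arbitrary stand-ins.

```python
# Sketch of one S1801-S1805 cycle; all callbacks are placeholder stubs.

def acquire_depth(emit, synchronize, detect, compare, to_depth):
    pattern, t_emit = emit()               # S1801: emit pattern light
    synchronize(t_emit)                    # S1802: sync sensor timing
    received = detect(t_emit)              # S1803: detect received light
    parallax = compare(pattern, received)  # S1804: parallax vs reference
    return to_depth(parallax)              # S1805: depth information

# Minimal stand-ins to show the call order:
depth = acquire_depth(
    emit=lambda: ("ref", 0.0),
    synchronize=lambda t: None,
    detect=lambda t: "rcv",
    compare=lambda p, r: 2.5,
    to_depth=lambda d: 4000.0 / d,
)
print(depth)                               # 1600.0
```

In a real device each callback would wrap the light emitting control unit, the sensor devices, and the control unit's parallax and depth routines.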

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Environmental & Geological Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The present invention relates to an optical device, and to a camera device and an electronic device comprising the same. The optical device according to one embodiment of the present invention comprises: a light emitting device including at least one light emitting element and emitting pattern light of a specific wavelength toward a target area; at least one sensor device for detecting received light corresponding to the emitted pattern light of the specific wavelength reflected from the target area; and a control unit for calculating a parallax distance by comparing the pattern of the detected received light with a reference pattern and acquiring depth information of the target area using the calculated parallax distance, wherein the control unit can control the sensor device so that it operates in synchronization with the point in time at which the light emitting element emits light. Accordingly, miniaturization and low power consumption of the device are possible, and the amount of computation can be reduced while the accuracy of the depth information is increased.
PCT/KR2020/002939 2019-03-15 2020-02-28 Dispositif optique, et dispositif caméra et dispositif électronique le comprenant WO2020189923A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962818762P 2019-03-15 2019-03-15
US62/818,762 2019-03-15

Publications (1)

Publication Number Publication Date
WO2020189923A1 true WO2020189923A1 (fr) 2020-09-24

Family

ID=72520399

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2020/002939 WO2020189923A1 (fr) 2019-03-15 2020-02-28 Dispositif optique, et dispositif caméra et dispositif électronique le comprenant

Country Status (2)

Country Link
KR (1) KR20200110172A (fr)
WO (1) WO2020189923A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130019581A (ko) * 2011-08-17 2013-02-27 (주)화이버 옵틱코리아 비점수차가 적용된 패턴 빔을 이용하는 3차원 이미지 촬영 장치 및 방법
KR20130027671A (ko) * 2011-09-08 2013-03-18 한국전자통신연구원 깊이 정보 획득장치 및 이를 포함하는 3차원 정보 획득 시스템
KR20140041012A (ko) * 2012-09-27 2014-04-04 오승태 다중 패턴 빔을 이용하는 3차원 촬영 장치 및 방법
KR20180104970A (ko) * 2017-03-14 2018-09-27 엘지전자 주식회사 단말기 및 그 제어 방법
US20190011721A1 (en) * 2014-08-12 2019-01-10 Mantisvision Ltd. System, method and computer program product to project light pattern

Also Published As

Publication number Publication date
KR20200110172A (ko) 2020-09-23

Similar Documents

Publication Publication Date Title
WO2014014238A1 (fr) Système et procédé de fourniture d'une image
WO2018169135A1 (fr) Terminal, et procédé de commande associé
WO2019225978A1 (fr) Caméra et terminal la comprenant
WO2017204498A1 (fr) Terminal mobile
WO2021006366A1 (fr) Dispositif d'intelligence artificielle pour ajuster la couleur d'un panneau d'affichage et procédé associé
WO2019117652A1 (fr) Appareil à prisme, et appareil photographique comprenant celui-ci
WO2018139790A1 (fr) Terminal mobile/portatif
WO2019231042A1 (fr) Dispositif d'authentification biométrique
WO2021251549A1 (fr) Dispositif d'affichage
WO2022010122A1 (fr) Procédé pour fournir une image et dispositif électronique acceptant celui-ci
WO2021215752A1 (fr) Dispositif optique, et dispositif de caméra et dispositif électronique le comprenant
WO2015194773A1 (fr) Dispositif d'affichage et son procédé de commande
WO2019208915A1 (fr) Dispositif électronique pour acquérir une image au moyen d'une pluralité de caméras par ajustage de la position d'un dispositif extérieur, et procédé associé
WO2020209624A1 (fr) Dispositif de visiocasque et procédé de fonctionnement associé
WO2022060126A1 (fr) Dispositif électronique comprenant un module de caméra
WO2020189923A1 (fr) Dispositif optique, et dispositif caméra et dispositif électronique le comprenant
WO2019022492A1 (fr) Caméra, et appareil d'affichage d'images l'incluant
WO2022098204A1 (fr) Dispositif électronique et procédé de fourniture de service de réalité virtuelle
WO2019240318A1 (fr) Terminal mobile et son procédé de commande
WO2017131246A1 (fr) Terminal mobile et son procédé de commande
WO2017094936A1 (fr) Terminal mobile et son procédé de commande
WO2021201619A1 (fr) Terminal mobile et procédé de commande associé
WO2019190159A1 (fr) Appareil à prismes, et caméra comprenant cet appareil
WO2024117730A1 (fr) Dispositif électronique permettant d'identifier un objet et son procédé de commande
WO2020085525A1 (fr) Terminal mobile et son procédé de commande

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20772865

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20772865

Country of ref document: EP

Kind code of ref document: A1