US20120154537A1 - Image sensors and methods of operating the same - Google Patents
- Publication number: US20120154537A1 (application US 13/239,925)
- Authority: US (United States)
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion)
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/50—Control of the SSIS exposure
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/254—Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/481—Constructional features, e.g. arrangements of optical elements
- G01S7/4811—Constructional features, e.g. arrangements of optical elements common to transmitter and receiver
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/271—Image signal generators wherein the generated image signals comprise depth maps or disparity maps
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/56—Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
Definitions
- Example embodiments relate to image sensors and methods of operating the same.
- An image sensor is a photo-detection device that converts optical signals including image and/or distance (i.e., depth) information of an object into electrical signals.
- Various types of image sensors such as charge-coupled device (CCD) image sensors, CMOS image sensors (CIS), etc., have been developed to provide high quality image information of the object.
- the three-dimensional image sensor may obtain the depth information using infrared light or near-infrared light as a light source.
- light energy may be wasted since light is projected even on regions where an object of interest does not exist.
- a method of operating a three-dimensional image sensor includes measuring a distance of an object from the three-dimensional image sensor using light emitted by a light source module, and adjusting an emission angle of the light emitted by the light source module based on the measured distance.
- the three-dimensional image sensor includes the light source module.
- the measuring includes emitting the light with a desired emission angle from the light source module, and measuring the distance of the object from the three-dimensional image sensor by detecting a time-of-flight of the emitted light.
- the adjusting decreases the emission angle of the light as the distance of the object from the three-dimensional image sensor increases.
- the light source module includes a light source and a lens, and the adjusting adjusts an interval between the light source and the lens based on the measured distance.
- the adjusting moves the light source or the lens such that a separation between the light source and the lens increases as the distance of the object from the three-dimensional image sensor increases.
- the light source module includes a light source and a lens, and the adjusting adjusts a refractive index of the lens based on the measured distance.
- the adjusting increases the refractive index of the lens as the distance of the object from the three-dimensional image sensor increases.
- the light source module includes a light source and a lens, and the adjusting adjusts a curvature of the lens based on the measured distance.
- the adjusting increases the curvature of the lens as the distance of the object from the three-dimensional image sensor increases.
- the method further includes adjusting an amplitude of the light emitted by the light source module according to an increment or a decrement of the emission angle of the light.
- the adjusting decreases the amplitude of the light as the emission angle of the light decreases.
- a method of operating a three-dimensional image sensor includes obtaining position information of an object using light emitted by a light source module, and adjusting a relative position of the light source to the lens based on the obtained position information of the object.
- the three-dimensional image sensor includes the light source module having a light source and a lens.
- the position information of the object includes at least one of a distance of the object from the three-dimensional image sensor, a horizontal position of the object, a vertical position of the object and a size of the object.
- the relative position of the light source to the lens includes at least one of an interval between the light source and the lens, a horizontal position of the light source and a vertical position of the light source.
- the position information of the object includes a horizontal position and a vertical position of the object, and the adjusting moves the light source or the lens such that the object, the lens and the light source are positioned in a straight line.
- a method of operating an image sensor includes obtaining position information of an object using light emitted by a light source module, and adjusting an emission angle of the light emitted by the light source module based on the obtained position information.
- the image sensor includes the light source.
- the obtaining the position information of the object includes emitting the light with a desired emission angle from the light source module; and measuring a distance of the object from the image sensor by detecting a time-of-flight of the emitted light.
- the adjusting adjusts the emission angle of the light based on the distance of the object from the image sensor.
- the adjusting adjusts the emission angle of the light by adjusting a separation between a light source and a lens based on the measured distance.
- the light source module includes the light source and the lens.
- the adjusting adjusts the emission angle of the light by adjusting a refractive index of a lens included in the light source module based on the measured distance.
- the adjusting adjusts the emission angle of the light by adjusting a curvature of the lens based on the measured distance.
- the method further includes adjusting an amplitude of the light emitted by the light source module according to an increase or a decrease in the emission angle of the light.
- the measuring at least twice samples light reflected from the object, calculates a phase difference between the emitted and reflected light based on the at least two samples, and detects the time-of-flight of the emitted light.
- an image sensor includes a light source module configured to transmit light to an object, a pixel array configured to receive light reflected from the object, and a controller configured to obtain position information of the object based on the received light and to adjust an emission angle of the light transmitted by the light source module based on the obtained position information.
- the controller is configured to obtain a distance of the object from the image sensor as the position information and to adjust the emission angle of the transmitted light based on the distance.
- the controller is further configured to obtain the distance of the object from the image sensor by at least twice sampling the light reflected from the object, calculating a phase difference between the transmitted and reflected light based on the at least two samples and detecting a time-of-flight of the light.
- the light source module includes a light source and a lens
- the controller is configured to adjust the emission angle of the transmitted light by adjusting a separation between the lens and the light source based on the obtained distance.
- the light source module includes a light source and a lens
- the controller is configured to adjust the emission angle of the transmitted light by adjusting a refractive index of the lens based on the obtained distance.
- the light source module includes a light source and a lens
- the controller is configured to adjust the emission angle of the transmitted light by adjusting a curvature of the lens based on the obtained distance.
- the controller is configured to adjust an amplitude of the light transmitted by the light source module according to the adjustment of the emission angle of the light.
- FIG. 1 is a block diagram illustrating a three-dimensional image sensor according to example embodiments.
- FIG. 2 is a diagram for describing an example of calculating a distance of an object by a three-dimensional image sensor of FIG. 1 .
- FIG. 3 is a flow chart illustrating a method of operating a three-dimensional image sensor according to example embodiments.
- FIG. 4 is a diagram for describing an example of measuring a distance of an object according to the method of FIG. 3 .
- FIGS. 5A and 5B are diagrams for describing examples of adjusting an emission angle of light according to the method of FIG. 3 .
- FIG. 6 is a flow chart illustrating a method of operating a three-dimensional image sensor according to example embodiments.
- FIG. 7 is a diagram for describing an example of measuring a horizontal position and a vertical position of an object according to the method of FIG. 6 .
- FIG. 8 is a diagram for describing examples of adjusting a relative position of a light source to a lens according to the method of FIG. 6 .
- FIG. 9 is a flow chart illustrating a method of operating a three-dimensional image sensor according to example embodiments.
- FIG. 10 is a diagram for describing an example of measuring a size of an object according to the method of FIG. 9 .
- FIG. 11 is a flow chart illustrating a method of operating a three-dimensional image sensor according to example embodiments.
- FIG. 12 is a block diagram illustrating a camera including a three-dimensional image sensor according to example embodiments.
- FIG. 13 is a block diagram illustrating a computing system including a three-dimensional image sensor according to example embodiments.
- FIG. 14 is a block diagram illustrating an example of an interface used in a computing system of FIG. 13 .
- It will be understood that, although the terms first, second, third etc. may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer or section from another region, layer or section. Thus, a first element, component, region, layer or section discussed below could be termed a second element, component, region, layer or section without departing from the teachings of the present inventive concept.
- spatially relative terms such as “beneath,” “below,” “lower,” “above,” “upper” and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as “below” or “beneath” other elements or features would then be oriented “above” the other elements or features. Thus, the exemplary term “below” can encompass both an orientation of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
- Example embodiments are described herein with reference to cross-sectional illustrations that are schematic illustrations of idealized example embodiments (and intermediate structures). As such, variations from the shapes of the illustrations as a result, for example, of manufacturing techniques and/or tolerances, are to be expected. Thus, example embodiments should not be construed as limited to the particular shapes of regions illustrated herein but are to include deviations in shapes that result, for example, from manufacturing. For example, an implanted region illustrated as a rectangle will, typically, have rounded or curved features and/or a gradient of implant concentration at its edges rather than a binary change from implanted to non-implanted region.
- a buried region formed by implantation may result in some implantation in the region between the buried region and the surface through which the implantation takes place.
- the regions illustrated in the figures are schematic in nature and their shapes are not intended to illustrate the actual shape of a region of a device and are not intended to limit the scope of the present inventive concept.
- FIG. 1 is a block diagram illustrating a three-dimensional image sensor according to example embodiments.
- a three-dimensional image sensor 100 includes a pixel array 110 , an analog-to-digital conversion (ADC) unit 120 , a row scanning circuit 130 , a column scanning circuit 140 , a control unit 150 and a light source module 200 .
- the pixel array 110 may include depth pixels receiving light RX that is reflected from an object 160 after being transmitted to the object 160 by the light source module 200 .
- the depth pixels may convert the received light RX into electrical signals.
- the depth pixels may provide information about a distance of the object 160 from the three-dimensional image sensor 100 and/or black-and-white image information.
- the pixel array 110 may further include color pixels for providing color image information.
- the three-dimensional image sensor 100 may be a three-dimensional color image sensor that provides the color image information and the depth information.
- an infrared filter and/or a near-infrared filter may be formed on the depth pixels, and a color filter (e.g., red, green and blue filters) may be formed on the color pixels.
- a ratio of the number of the depth pixels to the number of the color pixels may vary as desired.
- the ADC unit 120 may convert an analog signal output from the pixel array 110 into a digital signal. According to example embodiments, the ADC unit 120 may perform a column analog-to-digital conversion that converts analog signals in parallel using a plurality of analog-to-digital converters respectively coupled to a plurality of column lines. According to example embodiments, the ADC unit 120 may perform a single analog-to-digital conversion that sequentially converts the analog signals using a single analog-to-digital converter.
- the ADC unit 120 may further include a correlated double sampling (CDS) unit for extracting an effective signal component.
- the CDS unit may perform an analog double sampling that extracts the effective signal component based on a difference between an analog reset signal including a reset component and an analog data signal including a signal component.
- the CDS unit may perform a digital double sampling that converts the analog reset signal and the analog data signal into two digital signals and extracts the effective signal component based on a difference between the two digital signals.
- the CDS unit may perform a dual correlated double sampling that performs both the analog double sampling and the digital double sampling.
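- As a concrete illustration of the double sampling described above, the following Python sketch shows a purely digital correlated double sampling step: the reset level and the data level are digitized separately and the effective signal is taken as their difference. The array names, code values and noise behavior are illustrative assumptions, not part of the patent.

```python
import numpy as np

def digital_cds(reset_codes, data_codes):
    """Digital double sampling: subtract the digitized data level from the
    digitized reset level so the offset common to both samples cancels."""
    return reset_codes.astype(np.int32) - data_codes.astype(np.int32)

# Hypothetical 10-bit ADC codes for four pixels of one row.
reset_codes = np.array([512, 515, 510, 513])  # reset component (offset + reset level)
data_codes = np.array([300, 420, 505, 100])   # data component (offset + photo signal)
print(digital_cds(reset_codes, data_codes))   # effective signal component per pixel
```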
- the row scanning circuit 130 may receive control signals from the control unit 150 , and may control a row address and a row scan of the pixel array 110 . To select a row line among a plurality of row lines, the row scanning circuit 130 may apply a signal for activating the selected row line to the pixel array 110 . According to example embodiments, the row scanning circuit 130 may include a row decoder that selects a row line of the pixel array 110 and a row driver that applies a signal for activating the selected row line.
- the column scanning circuit 140 may receive control signals from the control unit 150 , and may control a column address and a column scan of the pixel array 110 .
- the column scanning circuit 140 may output a digital output signal from the ADC unit 120 to a digital signal processing circuit (not shown) and/or to an external host (not shown).
- the column scanning circuit 140 may provide the ADC unit 120 with a horizontal scan control signal to sequentially select a plurality of analog-to-digital converters included in the ADC unit 120 .
- the column scanning circuit 140 may include a column decoder that selects one of the plurality of analog-to-digital converters and a column driver that applies an output of the selected analog-to-digital converter to a horizontal transmission line.
- the horizontal transmission line may have a bit width corresponding to that of the digital output signal.
- the control unit 150 may control the ADC unit 120 , the row scanning circuit 130 , the column scanning circuit 140 and the light source module 200 .
- the control unit 150 may provide the ADC unit 120 , the row scanning circuit 130 , the column scanning circuit 140 and the light source module 200 with control signals, such as a clock signal, a timing control signal, or the like.
- the control unit 150 may include a control logic circuit, a phase locked loop circuit, a timing control circuit, a communication interface circuit, or the like.
- the light source module 200 may emit light of a desired (or, alternatively predetermined) wavelength.
- the light source module 200 may emit infrared light and/or near-infrared light.
- the light source module 200 may include a light source 210 and a lens 220 .
- the light source 210 may be controlled by the control unit 150 to emit the light TX of a desired intensity and/or characteristic (for example, periodic).
- the intensity and/or characteristic of the light TX may be controlled such that the light TX has a waveform of a pulse wave, a sine wave, a cosine wave, or the like.
- the light source 210 may be implemented by a light emitting diode (LED), a laser diode, or the like.
- the lens 220 may be configured to adjust an emission angle (or an angle of transmission) of the light TX output from the light source 210 .
- an interval between the light source 210 and the lens 220 may be controlled by the control unit 150 to adjust the emission angle of the light TX.
- the control unit 150 may control the light source module 200 to emit the light TX having the periodic intensity.
- the light TX emitted by the light source module 200 may be reflected from the object 160 back to the three-dimensional image sensor 100 as the received light RX.
- the received light RX may be incident on the depth pixels, and the depth pixels may be activated by the row scanning circuit 130 to output analog signals corresponding to the received light RX.
- the ADC unit 120 may convert the analog signals output from the depth pixels into digital data DATA.
- the digital data DATA may be provided to the control unit 150 by the column scanning circuit 140 and/or the ADC 120 .
- a calculation unit 155 included in the control unit 150 may calculate a distance of the object 160 from the three-dimensional image sensor 100 , a horizontal position of the object 160 , a vertical position of the object 160 and/or a size of the object 160 based on the digital data DATA.
- the control unit 150 may control the emission angle or a projection (or incident) region of the light TX based on the distance, the horizontal position, the vertical position and/or the size of the object 160 .
- the control unit 150 may control an interval between the light source 210 and the lens 220 , a relative position (or, a placement) of the light source 210 and the lens 220 with respect to each other, a refractive index of the lens 220 , a curvature of the lens 220 , or the like. Accordingly, the light TX emitted by the light source module 200 may be focused on a region where the object 160 of interest is located, thereby improving the accuracy of the depth information provided from the depth pixels.
- the control unit 150 may adjust an amplitude of the light TX (or the maximum intensity of the light TX during each period) according to a decrement or an increment of the emission angle of the light TX or according to a size of a region on which the light TX is projected (or incident). For example, the control unit 150 may decrease the amplitude of the light TX as the emission angle of the light TX decreases. As a result, in the three-dimensional image sensor 100 according to example embodiments, the power consumption may be reduced.
- the digital data DATA and/or the depth information may be provided to the digital signal processing circuit and/or the external host.
- the pixel array 110 may include color pixels, and the color image information as well as the depth information may be provided to the digital signal processing circuit and/or the external host.
- the accuracy of the depth information may be improved, and the power consumption may be reduced.
- FIG. 2 is a diagram for describing an example of calculating a distance of an object by a three-dimensional image sensor of FIG. 1 .
- light TX emitted by a light source module 200 may have a periodic intensity and/or characteristic.
- the intensity (for example, the number of photons per unit area) of the light TX may have a waveform of a sine wave.
- the light TX emitted by the light source module 200 may be reflected from the object 160 , and then may be incident on the pixel array 110 as received light RX.
- the pixel array 110 may periodically sample the received light RX. According to example embodiments, during each period of the received light RX (for example, corresponding to a period of the transmitted light TX), the pixel array 110 may perform a sampling on the received light RX by sampling, for example, at two sampling points having a phase difference of about 180 degrees, at four sampling points having a phase difference of about 90 degrees, or at more than four sampling points.
- the pixel array 110 may extract four samples A 0 , A 1 , A 2 and A 3 of the received light RX at phases of about 90 degrees, about 180 degrees, about 270 degrees and about 360 degrees per period, respectively.
- the received light RX may have an offset B that is different from an offset of the light TX emitted by the light source module 200 due to background light, a noise, or the like.
- the offset B of the received light RX may be calculated by Equation 1.
- A 0 represents an intensity of the received light RX sampled at a phase of about 90 degrees of the emitted light TX
- A 1 represents an intensity of the received light RX sampled at a phase of about 180 degrees of the emitted light TX
- A 2 represents an intensity of the received light RX sampled at a phase of about 270 degrees of the emitted light TX
- A 3 represents an intensity of the received light RX sampled at a phase of about 360 degrees of the emitted light TX.
- the received light RX may have an amplitude A lower than that of the light TX emitted by the light source module 200 due to loss (for example, light loss).
- the amplitude A of the received light RX may be calculated by Equation 2.
- Black-and-white image information about the object 160 may be provided by respective depth pixels included in the pixel array 110 based on the amplitude A of the received light RX.
- the received light RX may be delayed with respect to the emitted light TX by a phase difference φ corresponding, for example, to twice the distance of the object 160 from the three-dimensional image sensor 100 .
- the phase difference φ between the emitted light TX and the received light RX may be calculated by Equation 3.
- the phase difference φ between the emitted light TX and the received light RX may, for example, correspond to a time-of-flight (TOF).
- f represents a modulation frequency, which is a frequency of the intensity of the emitted light TX (or a frequency of the intensity of the received light RX).
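- The bodies of Equations 1 to 3 are not reproduced in this text, so the Python sketch below uses the conventional four-phase (four-tap) demodulation formulas that such equations typically take: the offset B as the mean of the four samples, the amplitude A from the two sample differences, the phase difference from their arctangent, and the distance from the resulting time-of-flight. The exact formulas, the atan2 sign convention and the numerical values are assumptions for illustration, not the patent's literal equations.

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def four_phase_demodulation(a0, a1, a2, a3, f_mod):
    """Conventional 4-tap ToF demodulation from samples taken at 90-degree steps.

    a0..a3 : intensities sampled at about 90, 180, 270 and 360 degrees of the
             emitted light TX (A0..A3 in the description above).
    f_mod  : modulation frequency f of the emitted light intensity [Hz].
    """
    offset_b = (a0 + a1 + a2 + a3) / 4.0                            # offset B (cf. Equation 1)
    amplitude_a = math.sqrt((a0 - a2) ** 2 + (a1 - a3) ** 2) / 2.0  # amplitude A (cf. Equation 2)
    phase = math.atan2(a0 - a2, a1 - a3)                            # phase difference (cf. Equation 3)
    tof = phase / (2.0 * math.pi * f_mod)                           # time-of-flight
    distance = C * tof / 2.0                                        # light travels to the object and back
    return offset_b, amplitude_a, phase, distance

# Hypothetical samples for one depth pixel at a 20 MHz modulation frequency.
print(four_phase_demodulation(1200.0, 900.0, 600.0, 900.0, 20e6))
```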
- This calculation of the distance may be performed with respect to each depth pixel included in the pixel array 110 .
- a calculation unit 155 may perform the calculation on respective digital data DATA corresponding to the respective depth pixels to generate a plurality of distance values of a plurality of portions in an image frame. That is, each distance value may be calculated based on a signal output from one depth pixel, and may be a distance value of one portion (or one point) in the image frame. Further, the calculation unit 155 may distinguish the object 160 of interest from a background image in the image frame based on the distance values. For example, the calculation unit 155 may detect the object 160 of interest such that the calculation unit 155 may determine portions of the image frame having relatively high distance values as the object 160 of interest. Thus, the calculation unit 155 may generate depth information about the object 160 of interest by generating the distance values and by detecting the object 160 of interest.
- the calculation unit 155 may further calculate a horizontal/vertical position of the object 160 of interest based on the distance values. For example, the calculation unit 155 may measure a relative position of the object 160 of interest from the center in the image frame, and may adjust the relative position using the distance information about the object 160 of interest to generate the horizontal/vertical position of the object 160 of interest.
- the calculation unit 155 may further calculate a size of the object 160 of interest based on the distance values. For example, the calculation unit 155 may measure a size of the object 160 of interest in the image frame, and may adjust the measured size of the object 160 of interest using the distance information about the object 160 of interest to generate the size of the object 160 of interest.
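- The following Python sketch illustrates, under stated assumptions, how the per-pixel distance values could be turned into depth information about the object of interest: a simple threshold separates the object from the background, and its distance, offset from the frame center and pixel size are then derived. The threshold-based segmentation, the assumption that the object lies on the near side of the threshold, and all names and values are hypothetical; the calculation unit 155 is not described at this level of detail.

```python
import numpy as np

def detect_object_of_interest(depth_map, threshold_m):
    """Hypothetical sketch: separate the object of interest from the background
    by thresholding the per-pixel distance values, then derive its distance,
    offset from the frame center and pixel size."""
    mask = depth_map < threshold_m  # illustrative assumption: object is nearer than the background
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    center_row, center_col = rows.mean(), cols.mean()
    height_px = rows.max() - rows.min() + 1
    width_px = cols.max() - cols.min() + 1
    mean_distance = depth_map[mask].mean()
    # Offset of the object from the frame center; a fixed pixel offset corresponds
    # to a larger physical offset the farther away the object is.
    frame_center = ((depth_map.shape[0] - 1) / 2, (depth_map.shape[1] - 1) / 2)
    return {
        "distance_m": float(mean_distance),
        "offset_px": (center_row - frame_center[0], center_col - frame_center[1]),
        "size_px": (int(height_px), int(width_px)),
    }

# Hypothetical 4x4 depth map with a near object in the upper-left corner.
depth = np.full((4, 4), 3.0)
depth[0:2, 0:2] = 1.0
print(detect_object_of_interest(depth, threshold_m=2.0))
```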
- the three-dimensional image sensor 100 may obtain depth information about the object 160 using the light TX emitted by the light source module 200 .
- Although FIG. 2 illustrates the light TX of which the intensity has a waveform of a sine wave, the three-dimensional image sensor 100 may use the light TX of which the intensity has various types of waveforms, according to example embodiments.
- the three-dimensional image sensor 100 may extract the depth information according to the waveform of the intensity of the light TX, a structure of a depth pixel, or the like.
- FIG. 3 is a flow chart illustrating a method of operating a three-dimensional image sensor, according to example embodiments.
- FIG. 4 is a diagram for describing an example of measuring a distance of an object according to the method of FIG. 3 .
- FIGS. 5A and 5B are diagrams for describing examples of adjusting an emission angle of light according to the method of FIG. 3 .
- a three-dimensional image sensor 100 measures a distance DIST of an object 160 from the three-dimensional image sensor 100 using light TX emitted by a light source module 200 (S 310 ).
- the light TX generated by a light source 210 may be emitted through a lens 220 . If the light source 210 and the lens 220 have a first interval ITV 1 , the emitted light TX may have a first emission angle ⁇ 1 .
- the first emission angle ⁇ 1 may be the maximum emission angle of the light TX emitted by the light source module 200 .
- the three-dimensional image sensor 100 may measure the distance DIST of the object 160 from the three-dimensional image sensor 100 by detecting the light RX reflected from the object 160 to the three-dimensional image sensor 100 .
- the three-dimensional image sensor 100 adjusts the emission angle of the light TX emitted by the light source module 200 based on the distance DIST of the object 160 (S 330 ). According to example embodiments, as illustrated in FIG. 5A , the three-dimensional image sensor 100 may adjust the interval between (or, the separation) the light source 210 and the lens 220 to a second interval ITV 2 so that the light TX emitted by the light source module 200 has a second emission angle ⁇ 2 . For example, a control unit 150 may control the light source module 200 to decrease the emission angle of the light TX as the distance DIST of the object 160 increases.
- the control unit 150 may move the light source 210 such that the interval between the light source 210 and the lens 220 increases as the distance DIST of the object 160 increases.
- the control unit 150 may move the lens 220 such that the interval between the light source 210 and the lens 220 increases as the distance DIST of the object 160 increases.
- the three-dimensional image sensor 100 may adjust a curvature of the lens 220 so that the light TX emitted by the light source module 200 has the second emission angle ⁇ 2 .
- the control unit 150 may increase the curvature of the lens 220 (i.e. decrease a radius of curvature of the lens 220 ) as the distance DIST of the object 160 increases.
- the three-dimensional image sensor 100 may adjust a refractive index of the lens 220 so that the light TX emitted by the light source module 200 has the second emission angle ⁇ 2 .
- the control unit 150 may increase the refractive index of the lens 220 as the distance DIST of the object 160 increases.
- the three-dimensional image sensor 100 may adjust any one or two or all of the interval between the light source 210 and lens 220 , a curvature of the lens 220 and a refractive index of the lens 220 .
- Since the emission angle of the light TX emitted by the light source module 200 is adjusted corresponding to the distance DIST of the object 160 , light energy projected on the object 160 may be increased even with less power consumption, and the accuracy of depth information obtained by the three-dimensional image sensor 100 may be improved.
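- A minimal Python sketch of the control policy in S 310 and S 330 follows, assuming the goal is to keep the projected spot roughly at a fixed target size: the emission angle is narrowed as the measured distance grows, and the drive amplitude is scaled down along with the illuminated angle, as in the amplitude adjustment described next. The target spot size, the maximum emission angle and the quadratic amplitude scaling are illustrative assumptions, not values from the patent.

```python
import math

def adjust_emission(distance_m,
                    target_spot_diameter_m=0.5,
                    max_emission_angle_deg=60.0,
                    max_amplitude=1.0):
    """Hypothetical control policy: narrow the emission angle as the object gets
    farther away, and reduce the drive amplitude with it so the irradiance on the
    object stays roughly constant. All constants are illustrative."""
    # Full emission angle needed to keep the projected spot at the target diameter.
    angle_deg = math.degrees(2.0 * math.atan((target_spot_diameter_m / 2.0) / distance_m))
    angle_deg = min(angle_deg, max_emission_angle_deg)

    # Scale amplitude roughly with the illuminated solid angle (angle squared for
    # small angles), mirroring "decrease the amplitude as the emission angle decreases".
    amplitude = max_amplitude * (angle_deg / max_emission_angle_deg) ** 2
    return angle_deg, amplitude

for d in (0.5, 1.0, 2.0, 4.0):
    print(d, adjust_emission(d))
```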
- the three-dimensional image sensor 100 may emit the light TX with the maximum amplitude before adjusting the emission angle of the light TX, and may decrease the amplitude of the light TX according to a decrement of the emission angle of the light TX. Accordingly, the power consumed by the light source module 200 may be reduced.
- alternatively, the light TX may initially be emitted with a minimum amplitude, and the amplitude may later be increased, up to the maximum, depending on the emission angle of the light TX.
- FIG. 6 is a flow chart illustrating a method of operating a three-dimensional image sensor, according to example embodiments.
- FIG. 7 is a diagram for describing an example of measuring a horizontal position and a vertical position of an object according to the method of FIG. 6 .
- FIG. 8 is a diagram for describing examples of adjusting a relative position (or, alternatively a placement) of a light source to a lens according to the method of FIG. 6 .
- a three-dimensional image sensor 100 measures a horizontal position HP 1 and/or a vertical position VP 1 of an object 160 using light TX emitted by a light source module 200 (S 410 ).
- the object 160 may be placed at a distance HP 1 in a positive horizontal direction and/or a distance VP 1 in a positive vertical direction from a straight line connecting the center of a light source 210 and the center of a lens 220 .
- This straight line may be assumed to pass vertically through the plane of the paper and through the point of intersection of the horizontal and vertical axes shown in FIG. 7 .
- the light TX emitted by the light source module 200 may be reflected from the object 160 to the three-dimensional image sensor 100 .
- the received light RX may be converted into data DATA by depth pixels included in a pixel array 110 and an ADC unit 120 , and a control unit 150 may measure the horizontal position HP 1 and/or the vertical position VP 1 of the object 160 based on the data DATA.
- the three-dimensional image sensor 100 adjusts a relative position (or, the placement) of the light source 210 to the lens 220 based on the horizontal position HP 1 and/or the vertical position VP 1 of the object 160 (S 430 ).
- the control unit 150 may move the light source 210 by a desired (or, alternatively predetermined) distance HP 2 in a negative horizontal direction and/or by a desired (or, alternatively predetermined) distance VP 2 in a negative vertical direction based on the positive horizontal position HP 1 and/or the positive vertical position VP 1 of the object 160 .
- a ratio of the adjusted horizontal position HP 2 of the light source 210 to the measured horizontal position HP 1 of the object 160 may correspond to a ratio of a distance of the light source 210 from the lens 220 to a distance of the object 160 from the lens 220
- a ratio of the adjusted vertical position VP 2 of the light source 210 to the measured vertical position VP 1 of the object 160 may correspond to the ratio of the distance of the light source 210 from the lens 220 to the distance of the object 160 from the lens 220 .
- the control unit 150 may move the lens 220 by a desired (or, alternatively predetermined) distance in a positive horizontal direction and/or by a desired (or, alternatively predetermined) distance VP 2 in a positive vertical direction based on the positive horizontal position HP 1 and/or the positive vertical position VP 1 of the object 160 .
- the control unit 150 may move the light source 210 or the lens 220 in a horizontal direction and/or a vertical direction based on the horizontal position HP 1 and/or the vertical position VP 1 of the object 160 so that the light source 210 , the lens 220 and the object 160 are positioned in a straight line.
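- The ratio rule in the preceding bullets can be sketched in a few lines of Python: the light source is shifted opposite to the object's horizontal/vertical offset, scaled by the ratio of the source-to-lens distance to the object-to-lens distance, so that the light source 210 , the lens 220 and the object 160 end up on a straight line. The function name and the example numbers are assumptions for illustration.

```python
def reposition_light_source(object_offset, lens_to_object_m, lens_to_source_m):
    """Hypothetical sketch of S 430: shift the light source opposite to the object's
    offset, scaled by the ratio of the source-to-lens distance to the object-to-lens
    distance, so that light source, lens and object lie on one straight line.

    object_offset    : (HP1, VP1) offset of the object from the lens axis [m]
    lens_to_object_m : distance from the lens to the object [m]
    lens_to_source_m : distance from the lens to the light source [m]
    """
    ratio = lens_to_source_m / lens_to_object_m
    hp1, vp1 = object_offset
    hp2 = -hp1 * ratio  # move by HP2 in the negative horizontal direction
    vp2 = -vp1 * ratio  # move by VP2 in the negative vertical direction
    return hp2, vp2

# Hypothetical numbers: object 0.3 m right and 0.1 m up, 2 m away; source 5 mm from the lens.
print(reposition_light_source((0.3, 0.1), lens_to_object_m=2.0, lens_to_source_m=0.005))
```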
- the control unit 150 may adjust an emission angle of the light TX emitted by the light source module 200 according to a distance of the object 160 from the three-dimensional image sensor 100 and/or a size of the object 160 , and may adjust (for example, decrease) an amplitude of the emitted light TX.
- Since the position of the light source 210 and the lens 220 is adjusted according to the horizontal position HP 1 and/or the vertical position VP 1 of the object 160 , light energy projected (or incident) on the object 160 may be increased even with less power consumption. Accordingly, the accuracy of depth information obtained by the three-dimensional image sensor 100 may be improved, and the power consumed by the light source module 200 may be reduced.
- FIG. 9 is a flow chart illustrating a method of operating a three-dimensional image sensor according to example embodiments.
- FIG. 10 is a diagram for describing an example of measuring a size of an object according to the method of FIG. 9 .
- a three-dimensional image sensor 100 measures a size of an object 160 using light TX emitted by a light source module 200 (S 510 ).
- the light TX emitted by the light source module 200 may be reflected from the object 160 back to the three-dimensional image sensor 100 .
- the received light RX may be converted into data DATA by depth pixels included in a pixel array 110 and an ADC unit 120 , and a control unit 150 may measure the size of the object 160 based on the data DATA.
- the three-dimensional image sensor 100 adjusts a relative position of the light source 210 to the lens 220 based on the size of the object 160 (S 530 ).
- the control unit 150 may adjust an emission angle of the light TX so that the emitted light TX is focused on a region 170 corresponding to the size of the object 160 .
- the control unit 150 may move the light source 210 and/or the lens 220 to adjust any one or two or all of an interval between the light source 210 and the lens 220 , a refractive index and a curvature of the lens 220 .
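- As a small illustrative calculation, the emission angle needed for the projected region 170 to just cover the measured object size can be derived from that size and the distance. The helper below is a hypothetical Python sketch; the margin factor and the numbers are assumptions, and the patent does not give such a formula.

```python
import math

def emission_angle_for_object(size_m, distance_m, margin=1.1):
    """Hypothetical: choose the emission angle so the projected region just covers
    the measured object size (with a small margin), as in S 530."""
    return math.degrees(2.0 * math.atan(margin * (size_m / 2.0) / distance_m))

print(emission_angle_for_object(size_m=0.4, distance_m=2.0))  # about 12.6 degrees
```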
- Since the relative position of the light source 210 and the lens 220 and/or the property of the lens 220 is adjusted according to the size of the object 160 , light energy projected on the object 160 may be increased even with less power consumption. Accordingly, the accuracy of depth information obtained by the three-dimensional image sensor 100 may be improved, and the power consumed by the light source module 200 may be reduced.
- FIG. 11 is a flow chart illustrating a method of operating a three-dimensional image sensor according to example embodiments.
- a three-dimensional image sensor 100 obtains position information of an object 160 using light TX emitted by a light source module 200 (S 610 ).
- the position information may include a distance of the object 160 from the three-dimensional image sensor 100 , a horizontal position of the object 160 , a vertical position of the object 160 , a size of the object 160 , or the like.
- the three-dimensional image sensor 100 adjusts a relative position of a light source 210 to a lens 220 based on the position information of the object 160 (S 630 ). For example, the three-dimensional image sensor 100 may adjust an interval between the light source 210 and the lens 220 , a refractive index of the lens 220 and/or a curvature of the lens 220 based on the distance of the object 160 from the three-dimensional image sensor 100 . According to example embodiments, the three-dimensional image sensor 100 may move the light source 210 and/or the lens 220 in a horizontal direction and/or in a vertical direction based on the horizontal position and the vertical position of the object 160 so that the light source 210 , the lens 220 and the object 160 are positioned in a straight line.
- the three-dimensional image sensor 100 may adjust any one, or two or all of the interval between the light source 210 and the lens 220 , the refractive index of the lens 220 and the curvature of the lens 220 based on the size of the object 160 so that the light TX emitted by the light source module 200 is focused on a region corresponding to the object 160 of interest.
- the three-dimensional image sensor 100 may emit the light TX with the maximum amplitude before adjusting the relative position of the light source 210 and the lens 220 , and may decrease the amplitude of the light TX after adjusting the relative position. Accordingly, the power consumption by the light source module 200 may be reduced.
- alternatively, the light TX may initially be emitted with a minimum amplitude, and the amplitude may then be increased up to the maximum.
- Since the relative position of the light source 210 and the lens 220 is adjusted based on the position information of the object 160 , light energy projected on the object 160 may be increased even with less power consumption. Accordingly, the accuracy of depth information obtained by the three-dimensional image sensor 100 may be improved, and the power consumed by the light source module 200 may be reduced.
- Although FIGS. 1 through 11 illustrate an example of the light source module 200 including the light source 210 and the lens 220 , in example embodiments the light source module 200 may include a reflector along with, or instead of, the lens 220 .
- FIG. 12 is a block diagram illustrating a camera including a three-dimensional image sensor according to example embodiments.
- a camera 800 includes a receiving lens 810 , a three-dimensional image sensor 100 , a motor unit 830 and an engine unit 840 .
- the three-dimensional image sensor 100 may include a three-dimensional image sensor chip 820 and a light source module 200 .
- the three-dimensional image sensor chip 820 and the light source module 200 may be implemented as separate devices, or may be implemented such that at least one component of the light source module 200 is included in the three-dimensional image sensor chip 820 .
- the receiving lens 810 may focus incident light on a photo-receiving region (e.g., depth pixels and/or color pixels) of the three-dimensional image sensor chip 820 .
- the three-dimensional image sensor chip 820 may generate data DATA 1 including depth information and/or color image information based on the incident light passing through the receiving lens 810 .
- the data DATA 1 generated by the three-dimensional image sensor chip 820 may include depth data generated using infrared light or near-infrared light emitted by the light source module 200 , and RGB data of a Bayer pattern generated using external visible light.
- the three-dimensional image sensor chip 820 may provide the data DATA 1 to the engine unit 840 in response to a clock signal CLK.
- the three-dimensional image sensor chip 820 may interface with the engine unit 840 using a mobile industry processor interface (MIPI) and/or a camera serial interface (CSI).
- the motor unit 830 may control the focusing of the lens 810 or may perform shuttering in response to a control signal CTRL received from the engine unit 840 .
- a relative position of a light source 210 and a lens 220 included in the light source module 200 may be adjusted by the motor unit 830 and/or the three-dimensional image sensor chip 820 .
- the engine unit 840 may control the three-dimensional image sensor 100 and the motor unit 830 .
- the engine unit 840 may process the data DATA 1 received from the three-dimensional image sensor chip 820 .
- the engine unit 840 may generate three-dimensional color data based on the received data DATA 1 .
- the engine unit 840 may generate YUV data including a luminance component, a difference between the luminance component and a blue component, and a difference between the luminance component and a red component based on the RGB data, or may generate compressed data, such as joint photography experts group (JPEG) data.
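- As a short illustration of the conversion performed by the engine unit 840, the Python sketch below forms a luminance component and the blue-difference and red-difference components from RGB data. The BT.601-style luminance weights are an assumption; the patent does not specify the coefficients.

```python
def rgb_to_yuv(r, g, b):
    """Hypothetical sketch of the engine unit's RGB-to-YUV conversion: a luminance
    component plus blue-difference and red-difference components. BT.601 weights
    are assumed; the patent does not specify them."""
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luminance component
    u = b - y                              # difference between the blue and luminance components
    v = r - y                              # difference between the red and luminance components
    return y, u, v

print(rgb_to_yuv(200, 120, 40))
```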
- the engine unit 840 may be coupled to a host/application 850 , and may provide data DATA 2 to the host/application 850 based on a master clock signal MCLK.
- the engine unit 840 may interface with the host/application 850 using a serial peripheral interface (SPI) and/or an inter integrated circuit (I2C) interface.
- FIG. 13 is a block diagram illustrating a computing system including a three-dimensional image sensor according to example embodiments.
- a computing system 1000 includes a processor 1010 , a memory device 1020 , a storage device 1030 , an input/output device 1040 , a power supply 1050 and/or a three-dimensional image sensor 100 .
- the computing system 1000 may further include a port for communicating with electronic devices, such as a video card, a sound card, a memory card, a USB device, etc.
- the processor 1010 may perform specific calculations and/or tasks.
- the processor 1010 may be a microprocessor, a central process unit (CPU), a digital signal processor, or the like.
- the processor 1010 may communicate with the memory device 1020 , the storage device 1030 and the input/output device 1040 via an address bus, a control bus and/or a data bus.
- the processor 1010 may be coupled to an extension bus, such as a peripheral component interconnect (PCI) bus.
- the memory device 1020 may store data for operating the computing system 1000 .
- the memory device 1020 may be implemented by a dynamic random access memory (DRAM), a mobile DRAM, a static random access memory (SRAM), a phase change random access memory (PRAM), a resistance random access memory (RRAM), a nano floating gate memory (NFGM), a polymer random access memory (PoRAM), a magnetic random access memory (MRAM), a ferroelectric random access memory (FRAM), or the like.
- the storage device 1030 may include a solid state drive, a hard disk drive, a CD-ROM, or the like.
- the input/output device 1040 may include an input device, such as a keyboard, a mouse, a keypad, etc., and an output device, such as a printer, a display device, or the like.
- the power supply 1050 may supply power to the computing system 1000 .
- the three-dimensional image sensor 100 may be coupled to the processor 1010 via the buses or other desired communication links. As described above, the three-dimensional image sensor 100 may adjust a projection region of light emitted by a light source module based on position information of an object of interest, thereby improving the accuracy of the depth information and reducing the power consumption.
- the three-dimensional image sensor 100 and the processor 1010 may be integrated in one chip, or may be implemented as separate chips.
- the three-dimensional image sensor 100 and/or components of the three-dimensional image sensor 100 may be packaged in various desired forms, such as package on package (PoP), ball grid arrays (BGAs), chip scale packages (CSPs), plastic leaded chip carrier (PLCC), plastic dual in-line package (PDIP), die in waffle pack, die in wafer form, chip on board (COB), ceramic dual in-line package (CERDIP), plastic metric quad flat pack (MQFP), thin quad flat pack (TQFP), small outline IC (SOIC), shrink small outline package (SSOP), thin small outline package (TSOP), system in package (SIP), multi chip package (MCP), wafer-level fabricated package (WFP), or wafer-level processed stack package (WSP).
- the computing system 1000 may be any computing system including the three-dimensional image sensor 100 .
- the computing system 1000 may include a digital camera, a mobile phone, a smart phone, a personal digital assistant (PDA), a portable multimedia player (PMP), or the like.
- FIG. 14 is a block diagram illustrating an example of an interface used in a computing system of FIG. 13 .
- a computing system 1100 may employ or support a MIPI interface, and may include an application processor 1110 , a three-dimensional image sensor 1140 and a display device 1150 .
- a CSI host 1112 of the application processor 1110 may perform a serial communication with a CSI device 1141 of the three-dimensional image sensor 1140 using a camera serial interface (CSI).
- the CSI host 1112 may include a deserializer DES
- the CSI device 1141 may include a serializer SER.
- a DSI host 1111 of the application processor 1110 may perform a serial communication with a DSI device 1151 of the display device 1150 using a display serial interface (DSI).
- the DSI host 1111 may include a serializer SER
- the DSI device 1151 may include a deserializer DES.
- the computing system 1100 may further include a radio frequency (RF) chip 1160 .
- a physical layer PHY 1113 of the application processor 1110 may perform data transfer with a physical layer PHY 1161 of the RF chip 1160 using a MIPI DigRF.
- the PHY 1113 of the application processor 1110 may interface (or, alternatively communicate) with a DigRF MASTER 1114 for controlling the data transfer with the PHY 1161 of the RF chip 1160 .
- the computing system 1100 may further include a global positioning system (GPS) 1120 , a storage device 1170 , a microphone 1180 , a DRAM 1185 and/or a speaker 1190 .
- the computing system 1100 may communicate with external devices using an ultra wideband (UWB) communication 1210 , a wireless local area network (WLAN) communication 1220 , a worldwide interoperability for microwave access (WIMAX) communication 1230 , or the like.
- example embodiments are not limited to configurations or interfaces of the computing systems 1000 and 1100 illustrated in FIGS. 13 and 14 .
- Example embodiments may be used in any three-dimensional image sensor or any system including the three-dimensional image sensor, such as a computer, a digital camera, a three-dimensional camera, a mobile phone, a personal digital assistant (PDA), a scanner, a navigator, a video phone, a monitoring system, an auto focus system, a tracking system, a motion capture system, an image stabilizing system, or the like.
Abstract
According to example embodiments, a method of operating a three-dimensional image sensor comprises measuring a distance of an object from the three-dimensional image sensor using light emitted by a light source module, and adjusting an emission angle of the light emitted by the light source module based on the measured distance. The three-dimensional image sensor includes the light source module.
Description
- This U.S. non-provisional application claims priority under 35 U.S.C. §119 to Korean Patent Application No. 2010-0131147 filed on Dec. 21, 2010 in the Korean Intellectual Property Office (KIPO), the contents of which are incorporated herein by reference in their entirety.
- 1. Technical Field
- Example embodiments relate to image sensors and methods of operating the same.
- 2. Description of the Related Art
- An image sensor is a photo-detection device that converts optical signals including image and/or distance (i.e., depth) information of an object into electrical signals. Various types of image sensors, such as charge-coupled device (CCD) image sensors, CMOS image sensors (CIS), etc., have been developed to provide high quality image information of the object. Recently, a three-dimensional (3D) image sensor is being researched and developed which provides depth information as well as two-dimensional image information.
- The three-dimensional image sensor may obtain the depth information using infrared light or near-infrared light as a light source. In a conventional three-dimensional image sensor, light energy may be wasted since light is projected even on regions where an object of interest does not exist.
- According to example embodiments, a method of operating a three-dimensional image sensor includes measuring a distance of an object from the three-dimensional image sensor using light emitted by a light source module, and adjusting an emission angle of the light emitted by the light source module based on the measured distance. The three-dimensional image sensor includes the light source module.
- According to example embodiments, the measuring includes emitting the light with a desired emission angle from the light source module, and measuring the distance of the object from the three-dimensional image sensor by detecting a time-of-flight of the emitted light.
- According to example embodiments, the adjusting decreases the emission angle of the light as the distance of the object from the three-dimensional image sensor increases.
- According to example embodiments, the light source module includes a light source and a lens, and the adjusting adjusts an interval between the light source and the lens based on the measured distance.
- According to example embodiments, the adjusting moves the light source or the lens such that a separation between the light source and the lens increases as the distance of the object from the three-dimensional image sensor increases.
- According to example embodiments, the light source module includes a light source and a lens, and the adjusting adjusts a refractive index of the lens based on the measured distance.
- According to example embodiments, the adjusting increases the refractive index of the lens as the distance of the object from the three-dimensional image sensor increases.
- According to example embodiments, the light source module includes a light source and a lens, and the adjusting adjusts a curvature of the lens based on the measured distance.
- According to example embodiments, the adjusting increases the curvature of the lens as the distance of the object from the three-dimensional image sensor increases.
- According to example embodiments, the method further includes adjusting an amplitude of the light emitted by the light source module according to an increment or a decrement of the emission angle of the light.
- According to example embodiments, the adjusting decreases the amplitude of the light as the emission angle of the light decreases.
- According to example embodiments, a method of operating a three-dimensional image sensor includes obtaining position information of an object using light emitted by a light source module, and adjusting a relative position of the light source to the lens based on the obtained position information of the object. The three-dimensional image sensor includes the light source module having a light source and a lens.
- According to example embodiments, the position information of the object includes at least one of a distance of the object from the three-dimensional image sensor, a horizontal position of the object, a vertical position of the object and a size of the object.
- According to example embodiments, the relative position of the light source to the lens includes at least one of an interval between the light source and the lens, a horizontal position of the light source and a vertical position of the light source.
- According to example embodiments, the position information of the object includes a horizontal position and a vertical position of the object, and the adjusting moves the light source or the lens such that the object, the lens and the light source are positioned in a straight line.
- According to example embodiments, a method of operating an image sensor includes obtaining position information of an object using light emitted by a light source module, and adjusting an emission angle of the light emitted by the light source module based on the obtained position information. The image sensor includes the light source module.
- According to example embodiments, the obtaining the position information of the object includes emitting the light with a desired emission angle from the light source module; and measuring a distance of the object from the image sensor by detecting a time-of-flight of the emitted light.
- According to example embodiments, the adjusting adjusts the emission angle of the light based on the distance of the object from the image sensor.
- According to example embodiments, the adjusting adjusts the emission angle of the light by adjusting a separation between a light source and a lens based on the measured distance. The light source module includes the light source and the lens.
- According to example embodiments, the adjusting adjusts the emission angle of the light by adjusting a refractive index of a lens included in the light source module based on the measured distance.
- According to example embodiments, the adjusting adjusts the emission angle of the light by adjusting a curvature of the lens based on the measured distance.
- According to example embodiments, the method further includes adjusting an amplitude of the light emitted by the light source module according to an increase or a decrease in the emission angle of the light.
- According to example embodiments, the measuring at least twice samples light reflected from the object, calculates a phase difference between the emitted and reflected light based on the at least two samples and detects the time-of-flight of the emitted light.
- According to example embodiments, an image sensor includes a light source module configured to transmit light to an object, a pixel array configured to receive light reflected from the object, and a controller configured to obtain position information of the object based on the received light and to adjust an emission angle of the light transmitted by the light source module based on the obtained position information.
- According to example embodiments, the controller is configured to obtain a distance of the object from the image sensor as the position information and to adjust the emission angle of the transmitted light based on the distance.
- According to example embodiments, the controller is further configured to obtain the distance of the object from the image sensor by at least twice sampling the light reflected from the object, calculating a phase difference between the transmitted and reflected light based on the at least two samples and detecting a time-of-flight of the light.
- According to example embodiments, the light source module includes a light source and a lens, and the controller is configured to adjust the emission angle of the transmitted light by adjusting a separation between the lens and the light source based on the obtained distance.
- According to example embodiments, the light source module includes a light source and a lens, and the controller is configured to adjust the emission angle of the transmitted light by adjusting a refractive index of the lens based on the obtained distance.
- According to example embodiments, the light source module includes a light source and a lens, and the controller is configured to adjust the emission angle of the transmitted light by adjusting a curvature of the lens based on the obtained distance.
- According to example embodiments, the controller is configured to adjust an amplitude of the light transmitted by the light source module according to the adjustment of the emission angle of the light.
- The above and other features and advantages will become more apparent by describing in detail example embodiments with reference to the attached drawings. The accompanying drawings are intended to depict example embodiments and should not be interpreted to limit the intended scope of the claims. The accompanying drawings are not to be considered as drawn to scale unless explicitly noted.
-
FIG. 1 is a block diagram illustrating a three-dimensional image sensor according to example embodiments. -
FIG. 2 is a diagram for describing an example of calculating a distance of an object by a three-dimensional image sensor of FIG. 1. -
FIG. 3 is a flow chart illustrating a method of operating a three-dimensional image sensor according to example embodiments. -
FIG. 4 is a diagram for describing an example of measuring a distance of an object according to the method of FIG. 3. -
FIGS. 5A and 5B are diagrams for describing examples of adjusting an emission angle of light according to the method of FIG. 3. -
FIG. 6 is a flow chart illustrating a method of operating a three-dimensional image sensor according to example embodiments. -
FIG. 7 is a diagram for describing an example of measuring a horizontal position and a vertical position of an object according to the method of FIG. 6. -
FIG. 8 is a diagram for describing examples of adjusting a relative position of a light source to a lens according to the method of FIG. 6. -
FIG. 9 is a flow chart illustrating a method of operating a three-dimensional image sensor according to example embodiments. -
FIG. 10 is a diagram for describing an example of measuring a size of an object according to the method of FIG. 9. -
FIG. 11 is a flow chart illustrating a method of operating a three-dimensional image sensor according to example embodiments. -
FIG. 12 is a block diagram illustrating a camera including a three-dimensional image sensor according to example embodiments. -
FIG. 13 is a block diagram illustrating a computing system including a three-dimensional image sensor according to example embodiments. -
FIG. 14 is a block diagram illustrating an example of an interface used in a computing system of FIG. 13. - Various example embodiments will be described more fully hereinafter with reference to the accompanying drawings, in which some example embodiments are shown. The present inventive concept may, however, be embodied in many different forms and should not be construed as limited to the example embodiments set forth herein. In the drawings, the sizes and relative sizes of layers and regions may be exaggerated for clarity.
- It will be understood that when an element or layer is referred to as being “on,” “connected to” or “coupled to” another element or layer, it can be directly on, connected or coupled to the other element or layer or intervening elements or layers may be present. In contrast, when an element is referred to as being “directly on,” “directly connected to” or “directly coupled to” another element or layer, there are no intervening elements or layers present. Like numerals refer to like elements throughout. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
- It will be understood that, although the terms first, second, third etc. may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer or section from another region, layer or section. Thus, a first element, component, region, layer or section discussed below could be termed a second element, component, region, layer or section without departing from the teachings of the present inventive concept.
- Spatially relative terms, such as “beneath,” “below,” “lower,” “above,” “upper” and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as “below” or “beneath” other elements or features would then be oriented “above” the other elements or features. Thus, the exemplary term “below” can encompass both an orientation of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
- The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting of the present inventive concept. As used herein, the singular forms “a,” “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
- Example embodiments are described herein with reference to cross-sectional illustrations that are schematic illustrations of idealized example embodiments (and intermediate structures). As such, variations from the shapes of the illustrations as a result, for example, of manufacturing techniques and/or tolerances, are to be expected. Thus, example embodiments should not be construed as limited to the particular shapes of regions illustrated herein but are to include deviations in shapes that result, for example, from manufacturing. For example, an implanted region illustrated as a rectangle will, typically, have rounded or curved features and/or a gradient of implant concentration at its edges rather than a binary change from implanted to non-implanted region. Likewise, a buried region formed by implantation may result in some implantation in the region between the buried region and the surface through which the implantation takes place. Thus, the regions illustrated in the figures are schematic in nature and their shapes are not intended to illustrate the actual shape of a region of a device and are not intended to limit the scope of the present inventive concept.
- Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this inventive concept belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
-
FIG. 1 is a block diagram illustrating a three-dimensional image sensor according to example embodiments. - Referring to
FIG. 1, a three-dimensional image sensor 100 includes a pixel array 110, an analog-to-digital conversion (ADC) unit 120, a row scanning circuit 130, a column scanning circuit 140, a control unit 150 and a light source module 200. - The
pixel array 110 may include depth pixels receiving light RX that is reflected from an object 160 after being transmitted to the object 160 by the light source module 200. The depth pixels may convert the received light RX into electrical signals. The depth pixels may provide information about a distance of the object 160 from the three-dimensional image sensor 100 and/or black-and-white image information. - The
pixel array 110 may further include color pixels for providing color image information. In this case, the three-dimensional image sensor 100 may be a three-dimensional color image sensor that provides the color image information and the depth information. According to example embodiments an infrared filter and/or a near-infrared filter may be formed on the depth pixels, and a color filter (e.g., red, green and blue filters) may be formed on the color pixels. According to example embodiments, a ratio of the number of the depth pixels to the number of the color pixels may vary as desired. - The
ADC unit 120 may convert an analog signal output from the pixel array 110 into a digital signal. According to example embodiments, the ADC unit 120 may perform a column analog-to-digital conversion that converts analog signals in parallel using a plurality of analog-to-digital converters respectively coupled to a plurality of column lines. According to example embodiments, the ADC unit 120 may perform a single analog-to-digital conversion that sequentially converts the analog signals using a single analog-to-digital converter. - According to example embodiments, the
ADC unit 120 may further include a correlated double sampling (CDS) unit for extracting an effective signal component. According to example embodiments, the CDS unit may perform an analog double sampling that extracts the effective signal component based on a difference between an analog reset signal including a reset component and an analog data signal including a signal component. According to example embodiments, the CDS unit may perform a digital double sampling that converts the analog reset signal and the analog data signal into two digital signals and extracts the effective signal component based on a difference between the two digital signals. According to example embodiments, the CDS unit may perform a dual correlated double sampling that performs both the analog double sampling and the digital double sampling. - The
row scanning circuit 130 may receive control signals from the control unit 150, and may control a row address and a row scan of the pixel array 110. To select a row line among a plurality of row lines, the row scanning circuit 130 may apply a signal for activating the selected row line to the pixel array 110. According to example embodiments, the row scanning circuit 130 may include a row decoder that selects a row line of the pixel array 110 and a row driver that applies a signal for activating the selected row line. - The
column scanning circuit 140 may receive control signals from the control unit 150, and may control a column address and a column scan of the pixel array 110. The column scanning circuit 140 may output a digital output signal from the ADC unit 120 to a digital signal processing circuit (not shown) and/or to an external host (not shown). For example, the column scanning circuit 140 may provide the ADC unit 120 with a horizontal scan control signal to sequentially select a plurality of analog-to-digital converters included in the ADC unit 120. According to example embodiments, the column scanning circuit 140 may include a column decoder that selects one of the plurality of analog-to-digital converters and a column driver that applies an output of the selected analog-to-digital converter to a horizontal transmission line. The horizontal transmission line may have a bit width corresponding to that of the digital output signal. - The
control unit 150 may control the ADC unit 120, the row scanning circuit 130, the column scanning circuit 140 and the light source module 200. The control unit 150 may provide the ADC unit 120, the row scanning circuit 130, the column scanning circuit 140 and the light source module 200 with control signals, such as a clock signal, a timing control signal, or the like. According to example embodiments, the control unit 150 may include a control logic circuit, a phase locked loop circuit, a timing control circuit, a communication interface circuit, or the like. - The
light source module 200 may emit light of a desired (or, alternatively predetermined) wavelength. For example, the light source module 200 may emit infrared light and/or near-infrared light. The light source module 200 may include a light source 210 and a lens 220. The light source 210 may be controlled by the control unit 150 to emit the light TX of a desired intensity and/or characteristic (for example, periodic). For example, the intensity and/or characteristic of the light TX may be controlled such that the light TX has a waveform of a pulse wave, a sine wave, a cosine wave, or the like. The light source 210 may be implemented by a light emitting diode (LED), a laser diode, or the like. The lens 220 may be configured to adjust an emission angle (or an angle of transmission) of the light TX output from the light source 210. For example, an interval between the light source 210 and the lens 220 may be controlled by the control unit 150 to adjust the emission angle of the light TX. - Hereinafter, an operation of the three-
dimensional image sensor 100 according to example embodiments will be described below. - The
control unit 150 may control the light source module 200 to emit the light TX having the periodic intensity. The light TX emitted by the light source module 200 may be reflected from the object 160 back to the three-dimensional image sensor 100 as the received light RX. The received light RX may be incident on the depth pixels, and the depth pixels may be activated by the row scanning circuit 130 to output analog signals corresponding to the received light RX. The ADC unit 120 may convert the analog signals output from the depth pixels into digital data DATA. The digital data DATA may be provided to the control unit 150 by the column scanning circuit 140 and/or the ADC unit 120. - A
calculation unit 155 included in thecontrol unit 150 may calculate a distance of theobject 160 from the three-dimensional image sensor 100, a horizontal position of theobject 160, a vertical position of theobject 160 and/or a size of theobject 160 based on the digital data DATA. Thecontrol unit 150 may control the emission angle or a projection (or incident) region of the light TX based on the distance, the horizontal position, the vertical position and/or the size of theobject 160. For example, thecontrol unit 150 may control an interval between thelight source 210 and thelens 220, a relative position (or, a placement) of thelight source 210 and thelens 220 with respect to each other, a refractive index of thelens 220, a curvature of thelens 220, or the like. Accordingly, the light TX emitted by thelight source module 200 may be focused on a region where theobject 160 of interest is located, thereby improving the accuracy of the depth information provided from the depth pixels. Further, thecontrol unit 150 may adjust an amplitude of the light TX (or the maximum intensity of the light TX during each period) according to a decrement or an increment of the emission angle of the light TX or according to a size of a region on which the light TX is projected (or incident). For example, thecontrol unit 150 may decrease the amplitude of the light TX as the emission angle of the light TX decreases. As a result, in the three-dimensional image sensor 100 according to example embodiments, the power consumption may be reduced. - The digital data DATA and/or the depth information may be provided to the digital signal processing circuit and/or the external host. According to example embodiments, the
pixel array 110 may include color pixels, and the color image information as well as the depth information may be provided to the digital signal processing circuit and/or the external host. - As described above, in the three-
dimensional image sensor 100 according to example embodiments, since the light TX emitted by the light source module 200 is focused on the object 160 of interest, the accuracy of the depth information may be improved, and the power consumption may be reduced. -
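The operation described above can be summarized as a simple measure-then-adjust loop. The Python sketch below is only an illustrative outline, not the implementation of the embodiments described above: the sensor and light_source_module objects and their methods are hypothetical stand-ins for the pixel array 110, the calculation unit 155 and the light source module 200, and the mapping from distance to emission angle is an assumption.

```python
def measure_and_focus(sensor, light_source_module,
                      wide_angle_deg=60.0, min_angle_deg=10.0):
    # 1. Emit with a wide (maximum) emission angle and full amplitude.
    light_source_module.set_emission_angle(wide_angle_deg)
    light_source_module.set_amplitude(1.0)

    # 2. Sample the reflected light RX and compute the distance of the object.
    samples = sensor.capture_depth_samples()
    distance_m = sensor.calculate_distance(samples)

    # 3. Narrow the emission angle as the object gets farther away
    #    (assumed mapping, clamped between the minimum and maximum angles).
    angle_deg = wide_angle_deg / max(distance_m, 1.0)
    angle_deg = max(min_angle_deg, min(wide_angle_deg, angle_deg))
    light_source_module.set_emission_angle(angle_deg)

    # 4. Reduce the drive amplitude according to the decrement of the angle.
    light_source_module.set_amplitude(angle_deg / wide_angle_deg)
```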
FIG. 2 is a diagram for describing an example of calculating a distance of an object by a three-dimensional image sensor of FIG. 1. - Referring to
FIGS. 1 and 2, light TX emitted by a light source module 200 may have a periodic intensity and/or characteristic. For example, the intensity (for example, the number of photons per unit area) of the light TX may have a waveform of a sine wave. - The light TX emitted by the
light source module 200 may be reflected from the object 160, and then may be incident on the pixel array 110 as received light RX. The pixel array 110 may periodically sample the received light RX. According to example embodiments, during each period of the received light RX (for example, corresponding to a period of the transmitted light TX), the pixel array 110 may perform a sampling on the received light RX by sampling, for example, at two sampling points having a phase difference of about 180 degrees, at four sampling points having a phase difference of about 90 degrees, or at more than four sampling points. For example, the pixel array 110 may extract four samples A0, A1, A2 and A3 of the received light RX at phases of about 90 degrees, about 180 degrees, about 270 degrees and about 360 degrees per period, respectively. - The received light RX may have an offset B that is different from an offset of the light TX emitted by the
light source module 200 due to background light, a noise, or the like. The offset B of the received light RX may be calculated by Equation 1: B = (A0 + A1 + A2 + A3) / 4. -
- Here, A0 represents an intensity of the received light RX sampled at a phase of about 90 degrees of the emitted light TX, A1 represents an intensity of the received light RX sampled at a phase of about 180 degrees of the emitted light TX, A2 represents an intensity of the received light RX sampled at a phase of about 270 degrees of the emitted light TX, and A3 represents an intensity of the received light RX sampled at a phase of about 360 degrees of the emitted light TX.
- The received light RX may have an amplitude A lower than that of the light TX emitted by the
light source module 200 due to loss (for example, light loss). The amplitude A of the received light RX may be calculated by Equation 2. -
- Black-and-white image information about the
object 160 may be provided by respective depth pixels included in thepixel array 110 based on the amplitude A of the received light RX. - The received light RX may be delayed by a phase difference Φ corresponding, for example, to a double of the distance of the
object 160 from the three-dimensional image sensor 100 with respect to the emitted light TX. The phase difference Φ between the emitted light TX and the received light RX may be calculated by Equation 3. -
- The phase difference Φ between the emitted light TX and the received light RX may, for example, correspond to a time-of-flight (TOF). The distance of the
object 160 from the three-dimensional image sensor 100 may be calculated by an equation, “R=c*TOF/2”, where R represents the distance of theobject 160, and c represents the speed of light. Further, the distance of theobject 160 from the three-dimensional image sensor 100 may also be calculated by Equation 4 using the phase difference Φ between the emitted light TX and the received light RX. -
- Here, f represents a modulation frequency, which is a frequency of the intensity of the emitted light TX (or a frequency of the intensity of the received light RX).
- This calculation of the distance may be performed with respect to each depth pixel included in the
pixel array 110. For example, a calculation unit 155 may perform the calculation on respective digital data DATA corresponding to the respective depth pixels to generate a plurality of distance values of a plurality of portions in an image frame. That is, each distance value may be calculated based on a signal output from one depth pixel, and may be a distance value of one portion (or one point) in the image frame. Further, the calculation unit 155 may distinguish the object 160 of interest from a background image in the image frame based on the distance values. For example, the calculation unit 155 may detect the object 160 of interest such that the calculation unit 155 may determine portions of the image frame having relatively high distance values as the object 160 of interest. Thus, the calculation unit 155 may generate depth information about the object 160 of interest by generating the distance values and by detecting the object 160 of interest. - In some embodiments, the
calculation unit 155 may further calculate a horizontal/vertical position of the object 160 of interest based on the distance values. For example, the calculation unit 155 may measure a relative position of the object 160 of interest from the center in the image frame, and may adjust the relative position using the distance information about the object 160 of interest to generate the horizontal/vertical position of the object 160 of interest. - In some embodiments, the
calculation unit 155 may further calculate a size of the object 160 of interest based on the distance values. For example, the calculation unit 155 may measure a size of the object 160 of interest in the image frame, and may adjust the measured size of the object 160 of interest using the distance information about the object 160 of interest to generate the size of the object 160 of interest. - As described above, the three-
dimensional image sensor 100 according to example embodiments may obtain depth information about the object 160 using the light TX emitted by the light source module 200. Although FIG. 2 illustrates the light TX of which the intensity has a waveform of a sine wave, the three-dimensional image sensor 100 may use the light TX of which the intensity has various types of waveforms, according to example embodiments. Further, the three-dimensional image sensor 100 may extract the depth information according to the waveform of the intensity of the light TX, a structure of a depth pixel, or the like. -
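The four-phase relations described above with reference to FIG. 2 can be written out compactly. The Python sketch below is a non-authoritative illustration using the sample names A0-A3, the offset B, the amplitude A, the phase difference and the modulation frequency f from the text; the sign convention of the arctangent and the example numbers are assumptions.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def tof_depth(a0, a1, a2, a3, mod_freq_hz):
    """Estimate offset B, amplitude A, phase difference and distance R from four
    samples taken a quarter period apart (90, 180, 270 and 360 degrees)."""
    offset = (a0 + a1 + a2 + a3) / 4.0                            # offset B
    amplitude = math.sqrt((a0 - a2) ** 2 + (a1 - a3) ** 2) / 2.0  # amplitude A
    phase = math.atan2(a0 - a2, a3 - a1) % (2.0 * math.pi)        # phase difference (convention assumed)
    tof = phase / (2.0 * math.pi * mod_freq_hz)                   # phase difference -> time-of-flight
    distance = C * tof / 2.0                                      # R = c * TOF / 2
    return offset, amplitude, phase, distance

# Example: with a 20 MHz modulation frequency, a phase difference of pi/2
# corresponds to a distance of roughly 1.87 m.
print(tof_depth(150.0, 100.0, 50.0, 100.0, 20e6))
```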
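Building on the per-pixel distance values, the following is a minimal sketch of how a calculation unit could separate an object of interest from the background and derive its image-frame position and size. The thresholding rule, the median background estimate and the plain-Python data layout are assumptions for illustration only.

```python
def detect_object(distance_map, width, rel_threshold=0.15):
    """Find the object of interest in a row-major list of per-pixel distance
    values, then return its center (pixels) and bounding-box size (pixels)."""
    background = sorted(distance_map)[len(distance_map) // 2]  # median distance as background
    object_pixels = [
        (i % width, i // width)
        for i, d in enumerate(distance_map)
        if abs(d - background) > rel_threshold * background    # pixels that stand out from the background
    ]
    if not object_pixels:
        return None
    xs = [x for x, _ in object_pixels]
    ys = [y for _, y in object_pixels]
    center = (sum(xs) / len(xs), sum(ys) / len(ys))            # horizontal/vertical position
    size = (max(xs) - min(xs) + 1, max(ys) - min(ys) + 1)      # object size in the frame
    return center, size
```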
FIG. 3 is a flow chart illustrating a method of operating a three-dimensional image sensor, according to example embodiments. FIG. 4 is a diagram for describing an example of measuring a distance of an object according to the method of FIG. 3. FIGS. 5A and 5B are diagrams for describing examples of adjusting an emission angle of light according to the method of FIG. 3. - Referring to
FIGS. 1, 3, 4, 5A and 5B, a three-dimensional image sensor 100 measures a distance DIST of an object 160 from the three-dimensional image sensor 100 using light TX emitted by a light source module 200 (S310). The light TX generated by a light source 210 may be emitted through a lens 220. If the light source 210 and the lens 220 have a first interval ITV1, the emitted light TX may have a first emission angle θ1. According to example embodiments, the first emission angle θ1 may be the maximum emission angle of the light TX emitted by the light source module 200. The three-dimensional image sensor 100 may measure the distance DIST of the object 160 from the three-dimensional image sensor 100 by detecting the light RX reflected from the object 160 to the three-dimensional image sensor 100. - The three-
dimensional image sensor 100 adjusts the emission angle of the light TX emitted by the light source module 200 based on the distance DIST of the object 160 (S330). According to example embodiments, as illustrated in FIG. 5A, the three-dimensional image sensor 100 may adjust the interval (or separation) between the light source 210 and the lens 220 to a second interval ITV2 so that the light TX emitted by the light source module 200 has a second emission angle θ2. For example, a control unit 150 may control the light source module 200 to decrease the emission angle of the light TX as the distance DIST of the object 160 increases. According to example embodiments, the control unit 150 may move the light source 210 such that the interval between the light source 210 and the lens 220 increases as the distance DIST of the object 160 increases. According to example embodiments, the control unit 150 may move the lens 220 such that the interval between the light source 210 and the lens 220 increases as the distance DIST of the object 160 increases. - According to example embodiments, as illustrated in
FIG. 5B, the three-dimensional image sensor 100 may adjust a curvature of the lens 220 so that the light TX emitted by the light source module 200 has the second emission angle θ2. For example, the control unit 150 may increase the curvature of the lens 220 (i.e., decrease a radius of curvature of the lens 220) as the distance DIST of the object 160 increases. - According to example embodiments, as illustrated in
FIG. 5B, the three-dimensional image sensor 100 may adjust a refractive index of the lens 220 so that the light TX emitted by the light source module 200 has the second emission angle θ2. For example, the control unit 150 may increase the refractive index of the lens 220 as the distance DIST of the object 160 increases. According to example embodiments, the three-dimensional image sensor 100 may adjust any one, two, or all of the interval between the light source 210 and the lens 220, a curvature of the lens 220 and a refractive index of the lens 220. - In the method of operating the three-
dimensional image sensor 100 according to example embodiments, since the emission angle of the light TX emitted by the light source module 200 is adjusted corresponding to the distance DIST of the object 160, light energy projected on the object 160 may be increased even with less power consumption, and the accuracy of depth information obtained by the three-dimensional image sensor 100 may be improved. - Further, in example embodiments, the three-
dimensional image sensor 100 may emit the light TX with the maximum amplitude before adjusting the emission angle of the light TX, and may decrease the amplitude of the light TX according to a decrement of the emission angle of the light TX. Accordingly, the power consumed by the light source module 200 may be reduced. Alternatively, the light TX may initially be emitted with a minimum amplitude, and the amplitude may later be increased according to the emission angle of the light TX. -
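One way to realize the distance-dependent emission angle described above is sketched below in Python. The coverage radius at the object and the clamp limits are assumptions rather than values from the embodiments; the point is only that the angle shrinks as the distance DIST grows.

```python
import math

def target_emission_angle(distance_m, coverage_radius_m=0.5,
                          min_angle_deg=5.0, max_angle_deg=60.0):
    """Full emission angle (degrees) that keeps the beam spanning roughly
    coverage_radius_m at the measured distance, clamped to sensible limits."""
    full_angle = 2.0 * math.degrees(math.atan(coverage_radius_m / distance_m))
    return min(max_angle_deg, max(min_angle_deg, full_angle))

# At 1 m the angle is about 53 degrees; at 5 m it narrows to about 11 degrees.
print(target_emission_angle(1.0), target_emission_angle(5.0))
```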
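The amplitude reduction mentioned above can, for example, track the solid angle of the emission cone so that the irradiance reaching the object stays roughly constant as the beam narrows. The proportionality below is an assumption; the text only states that the amplitude decreases as the emission angle decreases.

```python
import math

def scaled_amplitude(angle_deg, max_angle_deg=60.0, max_amplitude=1.0):
    """Scale the drive amplitude with the solid angle of the emission cone."""
    def cone_solid_angle(full_angle_deg):
        return 2.0 * math.pi * (1.0 - math.cos(math.radians(full_angle_deg) / 2.0))
    return max_amplitude * cone_solid_angle(angle_deg) / cone_solid_angle(max_angle_deg)

# Narrowing the beam from 60 to 20 degrees allows roughly a 9x lower amplitude.
print(scaled_amplitude(20.0))
```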
FIG. 6 is a flow chart illustrating a method of operating a three-dimensional image sensor, according to example embodiments. FIG. 7 is a diagram for describing an example of measuring a horizontal position and a vertical position of an object according to the method of FIG. 6. FIG. 8 is a diagram for describing examples of adjusting a relative position (or, alternatively a placement) of a light source to a lens according to the method of FIG. 6. - Referring to
FIGS. 1, 6, 7 and 8, a three-dimensional image sensor 100 measures a horizontal position HP1 and/or a vertical position VP1 of an object 160 using light TX emitted by a light source module 200 (S410). For example, the object 160 may be placed at a distance HP1 in a positive horizontal direction and/or a distance VP1 in a positive vertical direction from a straight line connecting the center of a light source 210 and the center of a lens 220. This straight line may be assumed to pass vertically through the plane of the paper and through the point of intersection of the horizontal and vertical axes shown in FIG. 7. The light TX emitted by the light source module 200 may be reflected from the object 160 to the three-dimensional image sensor 100. The received light RX may be converted into data DATA by depth pixels included in a pixel array 110 and an ADC unit 120, and a control unit 150 may measure the horizontal position HP1 and/or the vertical position VP1 of the object 160 based on the data DATA. - The three-
dimensional image sensor 100 adjusts a relative position (or, the placement) of the light source 210 to the lens 220 based on the horizontal position HP1 and/or the vertical position VP1 of the object 160 (S430). According to example embodiments, as illustrated in FIG. 8, the control unit 150 may move the light source 210 by a desired (or, alternatively predetermined) distance HP2 in a negative horizontal direction and/or by a desired (or, alternatively predetermined) distance VP2 in a negative vertical direction based on the positive horizontal position HP1 and/or the positive vertical position VP1 of the object 160. For example, a ratio of the adjusted horizontal position HP2 of the light source 210 to the measured horizontal position HP1 of the object 160 may correspond to a ratio of a distance of the light source 210 from the lens 220 to a distance of the object 160 from the lens 220, and a ratio of the adjusted vertical position VP2 of the light source 210 to the measured vertical position VP1 of the object 160 may correspond to the ratio of the distance of the light source 210 from the lens 220 to the distance of the object 160 from the lens 220. - According to example embodiments, the
control unit 150 may move the lens 220 by a desired (or, alternatively predetermined) distance in a positive horizontal direction and/or by a desired (or, alternatively predetermined) distance VP2 in a positive vertical direction based on the positive horizontal position HP1 and/or the positive vertical position VP1 of the object 160. - According to example embodiments, the
control unit 150 may move the light source 210 or the lens 220 in a horizontal direction and/or a vertical direction based on the horizontal position HP1 and/or the vertical position VP1 of the object 160 so that the light source 210, the lens 220 and the object 160 are positioned in a straight line. - Further, the
control unit 150 may adjust an emission angle of the light TX emitted by the light source module 200 according to a distance of the object 160 from the three-dimensional image sensor 100 and/or a size of the object 160, and may adjust (for example, decrease) an amplitude of the emitted light TX. - As described above, in the method of operating the three-
dimensional image sensor 100 according to example embodiments, since the position of the light source 210 and the lens 220 is adjusted according to the horizontal position HP1 and/or the vertical position VP1 of the object 160, light energy projected (or incident) on the object 160 may be increased even with less power consumption. Accordingly, the accuracy of depth information obtained by the three-dimensional image sensor 100 may be improved, and the power consumed by the light source module 200 may be reduced. -
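A minimal sketch of the repositioning rule described above: the light source is shifted opposite to the measured object offsets HP1/VP1, scaled by the ratio of the source-to-lens distance to the object-to-lens distance, so that the light source, the lens and the object end up on one straight line. The function name and the metric units are illustrative assumptions.

```python
def reposition_light_source(object_hp_m, object_vp_m,
                            object_to_lens_m, source_to_lens_m):
    """Return the offsets (HP2, VP2) to apply to the light source so that the
    light source, the lens and the object become collinear."""
    ratio = source_to_lens_m / object_to_lens_m
    hp2 = -object_hp_m * ratio   # move opposite to the object's horizontal offset
    vp2 = -object_vp_m * ratio   # move opposite to the object's vertical offset
    return hp2, vp2

# Example: object 0.4 m right and 0.2 m up at 2 m from the lens,
# light source 10 mm behind the lens -> shift by (-2 mm, -1 mm).
print(reposition_light_source(0.4, 0.2, 2.0, 0.010))
```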
FIG. 9 is a flow chart illustrating a method of operating a three-dimensional image sensor according to example embodiments. FIG. 10 is a diagram for describing an example of measuring a size of an object according to the method of FIG. 9. - Referring to
FIGS. 1, 9 and 10, a three-dimensional image sensor 100 measures a size of an object 160 using light TX emitted by a light source module 200 (S510). The light TX emitted by the light source module 200 may be reflected from the object 160 back to the three-dimensional image sensor 100. The received light RX may be converted into data DATA by depth pixels included in a pixel array 110 and an ADC unit 120, and a control unit 150 may measure the size of the object 160 based on the data DATA. - The three-
dimensional image sensor 100 adjusts a relative position of the light source 210 to the lens 220 based on the size of the object 160 (S530). According to example embodiments, as illustrated in FIG. 10, the control unit 150 may adjust an emission angle of the light TX so that the emitted light TX is focused on a region 170 corresponding to the size of the object 160. For example, as illustrated in FIGS. 5A and 5B, the control unit 150 may move the light source 210 and/or the lens 220 to adjust any one, two, or all of an interval between the light source 210 and the lens 220, a refractive index and a curvature of the lens 220. - As described above, in the method of operating the three-
dimensional image sensor 100 according to example embodiments, since the relative position of the light source 210 and the lens 220 and/or the property of the lens 220 is adjusted according to the size of the object 160, light energy projected on the object 160 may be increased even with less power consumption. Accordingly, the accuracy of depth information obtained by the three-dimensional image sensor 100 may be improved, and the power consumed by the light source module 200 may be reduced. -
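Following the size-based adjustment above, the emission angle can be chosen to just cover the region 170 corresponding to the object. The sketch below assumes the object size and distance recovered from the depth data are expressed in meters, and the safety margin is an illustrative assumption.

```python
import math

def angle_for_object(object_size_m, distance_m, margin=1.2):
    """Full emission angle (degrees) that covers an object of the given size
    at the given distance, with a small margin around it."""
    half_extent = margin * object_size_m / 2.0
    return 2.0 * math.degrees(math.atan(half_extent / distance_m))

# A 0.5 m wide object at 2 m is covered by a beam of roughly 17 degrees.
print(angle_for_object(0.5, 2.0))
```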
FIG. 11 is a flow chart illustrating a method of operating a three-dimensional image sensor according to example embodiments. - Referring to
FIGS. 1 and 11, a three-dimensional image sensor 100 obtains position information of an object 160 using light TX emitted by a light source module 200 (S610). For example, the position information may include a distance of the object 160 from the three-dimensional image sensor 100, a horizontal position of the object 160, a vertical position of the object 160, a size of the object 160, or the like. - The three-
dimensional image sensor 100 adjusts a relative position of a light source 210 to a lens 220 based on the position information of the object 160 (S630). For example, the three-dimensional image sensor 100 may adjust an interval between the light source 210 and the lens 220, a refractive index of the lens 220 and/or a curvature of the lens 220 based on the distance of the object 160 from the three-dimensional image sensor 100. According to example embodiments, the three-dimensional image sensor 100 may move the light source 210 and/or the lens 220 in a horizontal direction and/or in a vertical direction based on the horizontal position and the vertical position of the object 160 so that the light source 210, the lens 220 and the object 160 are positioned in a straight line. According to example embodiments, the three-dimensional image sensor 100 may adjust any one, two, or all of the interval between the light source 210 and the lens 220, the refractive index of the lens 220 and the curvature of the lens 220 based on the size of the object 160 so that the light TX emitted by the light source module 200 is focused on a region corresponding to the object 160 of interest. - According to example embodiments, the three-
dimensional image sensor 100 may emit the light TX with the maximum amplitude before adjusting the relative position of the light source 210 and the lens 220, and may decrease the amplitude of the light TX after adjusting the relative position. Accordingly, the power consumption by the light source module 200 may be reduced. Alternatively, the light TX may initially be emitted with a minimum amplitude, and the amplitude may then be increased after the relative position is adjusted. - As described above, in the method of operating the three-
dimensional image sensor 100 according to example embodiments, since the relative position of thelight source 210 and thelens 220 is adjusted based on the position information of theobject 160, light energy projected on theobject 160 may be increased even with less power consumption. Accordingly, the accuracy of depth information obtained by the three-dimensional image sensor 100 may be improved, and the power consumed by thelight source module 200 may be reduced. - Although
FIGS. 1 through 11 illustrate an example of the light source module 200 including the light source 210 and the lens 220, in example embodiments the light source module 200 may include a reflector along with, or instead of, the lens 220. -
FIG. 12 is a block diagram illustrating a camera including a three-dimensional image sensor according to example embodiments. - Referring to
FIG. 12, a camera 800 includes a receiving lens 810, a three-dimensional image sensor 100, a motor unit 830 and an engine unit 840. The three-dimensional image sensor 100 may include a three-dimensional image sensor chip 820 and a light source module 200. In example embodiments, the three-dimensional image sensor chip 820 and the light source module 200 may be implemented as separate devices, or may be implemented such that at least one component of the light source module 200 is included in the three-dimensional image sensor chip 820. - The receiving
lens 810 may focus incident light on a photo-receiving region (e.g., depth pixels and/or color pixels) of the three-dimensional image sensor chip 820. The three-dimensional image sensor chip 820 may generate data DATA1 including depth information and/or color image information based on the incident light passing through the receiving lens 810. For example, the data DATA1 generated by the three-dimensional image sensor chip 820 may include depth data generated using infrared light or near-infrared light emitted by the light source module 200, and RGB data of a Bayer pattern generated using external visible light. The three-dimensional image sensor chip 820 may provide the data DATA1 to the engine unit 840 in response to a clock signal CLK. In example embodiments, the three-dimensional image sensor chip 820 may interface with the engine unit 840 using a mobile industry processor interface (MIPI) and/or a camera serial interface (CSI). - The
motor unit 830 may control the focusing of the lens 810 or may perform shuttering in response to a control signal CTRL received from the engine unit 840. In example embodiments, a relative position of a light source 210 and a lens 220 included in the light source module 200 may be adjusted by the motor unit 830 and/or the three-dimensional image sensor chip 820. - The
engine unit 840 may control the three-dimensional image sensor 100 and the motor unit 830. The engine unit 840 may process the data DATA1 received from the three-dimensional image sensor chip 820. For example, the engine unit 840 may generate three-dimensional color data based on the received data DATA1. In example embodiments, the engine unit 840 may generate YUV data including a luminance component, a difference between the luminance component and a blue component, and a difference between the luminance component and a red component based on the RGB data, or may generate compressed data, such as Joint Photographic Experts Group (JPEG) data. The engine unit 840 may be coupled to a host/application 850, and may provide data DATA2 to the host/application 850 based on a master clock signal MCLK. In example embodiments, the engine unit 840 may interface with the host/application 850 using a serial peripheral interface (SPI) and/or an inter-integrated circuit (I2C) interface. -
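The RGB-to-YUV conversion mentioned above (a luminance component plus two color-difference components) can be illustrated with the standard BT.601 coefficients; the text does not specify which coefficients the engine unit 840 uses, so the values below are an assumption.

```python
def rgb_to_yuv(r, g, b):
    """Convert one RGB sample to Y (luminance), U (blue minus luma) and
    V (red minus luma) using BT.601-style coefficients."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.492 * (b - y)   # difference between the blue component and the luminance
    v = 0.877 * (r - y)   # difference between the red component and the luminance
    return y, u, v

# Example: a mid-gray pixel has zero color-difference components.
print(rgb_to_yuv(128, 128, 128))  # (128.0, 0.0, 0.0)
```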
FIG. 13 is a block diagram illustrating a computing system including a three-dimensional image sensor according to example embodiments. - Referring to
FIG. 13, a computing system 1000 includes a processor 1010, a memory device 1020, a storage device 1030, an input/output device 1040, a power supply 1050 and/or a three-dimensional image sensor 100. Although it is not illustrated in FIG. 13, the computing system 1000 may further include a port for communicating with electronic devices, such as a video card, a sound card, a memory card, a USB device, etc. - The
processor 1010 may perform specific calculations and/or tasks. For example, the processor 1010 may be a microprocessor, a central processing unit (CPU), a digital signal processor, or the like. The processor 1010 may communicate with the memory device 1020, the storage device 1030 and the input/output device 1040 via an address bus, a control bus and/or a data bus. The processor 1010 may be coupled to an extension bus, such as a peripheral component interconnect (PCI) bus. The memory device 1020 may store data for operating the computing system 1000. For example, the memory device 1020 may be implemented by a dynamic random access memory (DRAM), a mobile DRAM, a static random access memory (SRAM), a phase change random access memory (PRAM), a resistance random access memory (RRAM), a nano floating gate memory (NFGM), a polymer random access memory (PoRAM), a magnetic random access memory (MRAM), a ferroelectric random access memory (FRAM), or the like. The storage device 1030 may include a solid state drive, a hard disk drive, a CD-ROM, or the like. The input/output device 1040 may include an input device, such as a keyboard, a mouse, a keypad, etc., and an output device, such as a printer, a display device, or the like. The power supply 1050 may supply power to the computing system 1000. - The three-
dimensional image sensor 100 may be coupled to the processor 1010 via the buses or other desired communication links. As described above, the three-dimensional image sensor 100 may adjust a projection region of light emitted by a light source module based on position information of an object of interest, thereby improving the accuracy of the depth information and reducing the power consumption. The three-dimensional image sensor 100 and the processor 1010 may be integrated in one chip, or may be implemented as separate chips. - In example embodiments, the three-
dimensional image sensor 100 and/or components of the three-dimensional image sensor 100 may be packaged in various desired forms, such as package on package (PoP), ball grid arrays (BGAs), chip scale packages (CSPs), plastic leaded chip carrier (PLCC), plastic dual in-line package (PDIP), die in waffle pack, die in wafer form, chip on board (COB), ceramic dual in-line package (CERDIP), plastic metric quad flat pack (MQFP), thin quad flat pack (TQFP), small outline IC (SOIC), shrink small outline package (SSOP), thin small outline package (TSOP), system in package (SIP), multi chip package (MCP), wafer-level fabricated package (WFP), or wafer-level processed stack package (WSP). - The
computing system 1000 may be any computing system including the three-dimensional image sensor 100. For example, the computing system 1000 may include a digital camera, a mobile phone, a smart phone, a personal digital assistant (PDA), a portable multimedia player (PMP), or the like. -
FIG. 14 is a block diagram illustrating an example of an interface used in a computing system of FIG. 13. - Referring to
FIG. 14, a computing system 1100 may employ or support a MIPI interface, and may include an application processor 1110, a three-dimensional image sensor 1140 and a display device 1150. A CSI host 1112 of the application processor 1110 may perform a serial communication with a CSI device 1141 of the three-dimensional image sensor 1140 using a camera serial interface (CSI). In example embodiments, the CSI host 1112 may include a deserializer DES, and the CSI device 1141 may include a serializer SER. A DSI host 1111 of the application processor 1110 may perform a serial communication with a DSI device 1151 of the display device 1150 using a display serial interface (DSI). In example embodiments, the DSI host 1111 may include a serializer SER, and the DSI device 1151 may include a deserializer DES. - The
computing system 1100 may further include a radio frequency (RF) chip 1160. A physical layer PHY 1113 of the application processor 1110 may perform data transfer with a physical layer PHY 1161 of the RF chip 1160 using a MIPI DigRF. The PHY 1113 of the application processor 1110 may interface (or, alternatively communicate) with a DigRF MASTER 1114 for controlling the data transfer with the PHY 1161 of the RF chip 1160. The computing system 1100 may further include a global positioning system (GPS) 1120, a storage device 1170, a microphone 1180, a DRAM 1185 and/or a speaker 1190. The computing system 1100 may communicate with external devices using an ultra wideband (UWB) communication 1210, a wireless local area network (WLAN) communication 1220, a worldwide interoperability for microwave access (WIMAX) communication 1230, or the like. However, example embodiments are not limited to configurations or interfaces of the computing systems of FIGS. 13 and 14. - Example embodiments may be used in any three-dimensional image sensor or any system including the three-dimensional image sensor, such as a computer, a digital camera, a three-dimensional camera, a mobile phone, a personal digital assistant (PDA), a scanner, a navigator, a video phone, a monitoring system, an auto focus system, a tracking system, a motion capture system, an image stabilizing system, or the like.
- While example embodiments have been particularly shown and described, it will be understood by one of ordinary skill in the art that variations in form and detail may be made therein without departing from the spirit and scope of the claims.
Claims (24)
1. A method of operating a three-dimensional image sensor, the method comprising:
measuring a distance of an object from the three-dimensional image sensor using light emitted by a light source module, the three-dimensional image sensor including the light source module; and
adjusting an emission angle of the light emitted by the light source module based on the measured distance.
2. The method of claim 1 , wherein the measuring comprises:
emitting the light with a desired emission angle from the light source module; and
measuring the distance of the object from the three-dimensional image sensor by detecting a time-of-flight of the emitted light.
3. The method of claim 1 , wherein the adjusting decreases the emission angle of the light as the distance of the object from the three-dimensional image sensor increases.
4. The method of claim 1 , wherein the light source module includes a light source and a lens, and wherein the adjusting adjusts an interval between the light source and the lens based on the measured distance.
5. The method of claim 4 , wherein the adjusting moves the light source or the lens such that a separation between the light source and the lens increases as the distance of the object from the three-dimensional image sensor increases.
6. The method of claim 1 , wherein the light source module includes a light source and a lens, and wherein the adjusting
adjusts a refractive index of the lens based on the measured distance.
7. The method of claim 6 , wherein the adjusting
increases the refractive index of the lens as the distance of the object from the three-dimensional image sensor increases.
8. The method of claim 1 , wherein the light source module includes a light source and a lens, and wherein the adjusting
adjusts a curvature of the lens based on the measured distance.
9. The method of claim 8 , wherein the adjusting
increases the curvature of the lens as the distance of the object from the three-dimensional image sensor increases.
10. The method of claim 1 , further comprising:
adjusting an amplitude of the light emitted by the light source module according to an increment or a decrement of the emission angle of the light.
11. The method of claim 10 , wherein the adjusting
decreases the amplitude of the light as the emission angle of the light decreases.
12. A method of operating a three-dimensional image sensor, the method comprising:
obtaining position information of an object using light emitted by a light source module, the three-dimensional image sensor including the light source module having a light source and a lens; and
adjusting a relative position of the light source to the lens based on the obtained position information of the object.
13. The method of claim 12 , wherein the position information of the object includes at least one of a distance of the object from the three-dimensional image sensor, a horizontal position of the object, a vertical position of the object and a size of the object.
14. The method of claim 12 , wherein the relative position of the light source to the lens includes at least one of an interval between the light source and the lens, a horizontal position of the light source and a vertical position of the light source.
15. The method of claim 12 , wherein the position information of the object includes a horizontal position and a vertical position of the object, and wherein the adjusting
moves the light source or the lens such that the object, the lens and the light source are positioned in a straight line.
16. A method of operating an image sensor, the method comprising:
obtaining position information of an object using light emitted by a light source module, the image sensor including the light source module; and
adjusting an emission angle of the light emitted by the light source module based on the obtained position information.
17. The method of claim 16 , wherein the obtaining the position information of the object comprises:
emitting the light with a desired emission angle from the light source module; and
measuring a distance of the object from the image sensor by detecting a time-of-flight of the emitted light.
18. The method of claim 17 , wherein the adjusting adjusts the emission angle of the light based on the distance of the object from the image sensor.
19. The method of claim 17 , wherein the adjusting adjusts the emission angle of the light by adjusting a separation between a light source and a lens based on the measured distance, the light source module including the light source and the lens.
20. The method of claim 17 , wherein the adjusting adjusts the emission angle of the light by adjusting a refractive index of a lens included in the light source module based on the measured distance.
21. The method of claim 17 , wherein the adjusting adjusts the emission angle of the light by adjusting a curvature of the lens based on the measured distance.
22. The method of claim 17 , further comprising:
adjusting an amplitude of the light emitted by the light source module according to an increase or a decrease in the emission angle of the light.
23. The method of claim 17 , wherein the measuring at least twice samples light reflected from the object, calculates a phase difference between the emitted and reflected light based on the at least two samples and detects the time-of-flight of the emitted light.
24.-30. (canceled)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/883,453 US10602086B2 (en) | 2010-12-21 | 2015-10-14 | Methods of operating image sensors |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020100131147A KR101694797B1 (en) | 2010-12-21 | 2010-12-21 | Method of operating a three-dimensional image sensor |
KR10-2010-0131147 | 2010-12-21 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/883,453 Division US10602086B2 (en) | 2010-12-21 | 2015-10-14 | Methods of operating image sensors |
Publications (1)
Publication Number | Publication Date |
---|---|
US20120154537A1 true US20120154537A1 (en) | 2012-06-21 |
Family
ID=46233860
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/239,925 Abandoned US20120154537A1 (en) | 2010-12-21 | 2011-09-22 | Image sensors and methods of operating the same |
US14/883,453 Active US10602086B2 (en) | 2010-12-21 | 2015-10-14 | Methods of operating image sensors |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/883,453 Active US10602086B2 (en) | 2010-12-21 | 2015-10-14 | Methods of operating image sensors |
Country Status (3)
Country | Link |
---|---|
US (2) | US20120154537A1 (en) |
KR (1) | KR101694797B1 (en) |
CN (1) | CN102547355B (en) |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101938648B1 (en) * | 2012-10-23 | 2019-01-15 | 삼성전자주식회사 | Mobile system including image sensor, method of operating image sensor and method of operating mobile system |
KR102153045B1 (en) * | 2013-12-04 | 2020-09-07 | 삼성전자주식회사 | Wavelength separation device and 3-dimensional image acquisition apparatus including the wavelength separation device |
CN104967763B (en) * | 2015-06-09 | 2019-04-26 | 联想(北京)有限公司 | A kind of image acquisition device, image-pickup method and electronic equipment |
KR102311688B1 (en) | 2015-06-17 | 2021-10-12 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
CN105472224B (en) * | 2016-01-05 | 2017-03-15 | 北京度量科技有限公司 | A kind of passive optical motion capture equipment and its application |
CN107710741B (en) * | 2016-04-21 | 2020-02-21 | 华为技术有限公司 | Method for acquiring depth information and camera device |
JP7103354B2 (en) * | 2017-05-24 | 2022-07-20 | ソニーグループ株式会社 | Information processing equipment, information processing methods, and programs |
US11828849B2 (en) * | 2017-11-28 | 2023-11-28 | Sony Semiconductor Solutions Corporation | Illumination device, time of flight system and method |
CN109981993B (en) * | 2017-12-28 | 2021-02-05 | 舜宇光学(浙江)研究院有限公司 | Depth camera projector power consumption control method and application thereof |
CN109031334A (en) * | 2018-08-27 | 2018-12-18 | 湖北工业大学 | A kind of safety monitoring system and method based on 3-D image ranging |
CN109788174B (en) * | 2019-01-11 | 2021-05-07 | 维沃移动通信有限公司 | Light supplementing method and terminal |
KR102432531B1 (en) * | 2019-10-23 | 2022-08-18 | 세메스 주식회사 | Droplet inspection module and droplet inspection method |
KR20210063674A (en) * | 2019-11-25 | 2021-06-02 | 엘지이노텍 주식회사 | Camera module |
EP4050377A4 (en) * | 2020-03-16 | 2023-08-09 | Shenzhen Goodix Technology Co., Ltd. | Three-dimensional image sensing system and related electronic device, and time-of-flight ranging method |
US20240056669A1 (en) * | 2021-01-15 | 2024-02-15 | Lg Innotek Co., Ltd. | Camera module |
KR102473423B1 (en) * | 2021-05-03 | 2022-12-02 | 삼성전기주식회사 | Time of flight camera |
CN118210372A (en) * | 2022-12-16 | 2024-06-18 | 高通科技公司 | Regulation and control method, tracking method, system, equipment and storage medium |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2927179B2 (en) * | 1994-06-15 | 1999-07-28 | ミノルタ株式会社 | 3D shape input device |
HUP9700348A1 (en) | 1997-02-04 | 1998-12-28 | Holografika E.C. | Method and device for displaying three-dimensional pictures |
JP4265214B2 (en) * | 2002-12-20 | 2009-05-20 | ソニー株式会社 | Lighting device and camera |
JP2004333736A (en) * | 2003-05-06 | 2004-11-25 | Mitsubishi Electric Corp | Portable equipment |
US7178250B2 (en) * | 2004-07-21 | 2007-02-20 | Irwin Industrial Tool Company | Intersecting laser line generating device |
CN1984766B (en) * | 2004-07-28 | 2012-01-25 | 东洋制罐株式会社 | Thermal crystallization system of saturated polyester hollow body and its heating method |
JP4659414B2 (en) | 2004-09-01 | 2011-03-30 | アバゴ・テクノロジーズ・イーシービーユー・アイピー(シンガポール)プライベート・リミテッド | Light emitting diode and light emission control system using the same |
WO2006118037A1 (en) * | 2005-04-28 | 2006-11-09 | Matsushita Electric Industrial Co., Ltd. | Optical pickup device, optical information device provided with such optical pickup device, and optical information recording/reproducing device provided with such optical information device |
JP2007195020A (en) | 2006-01-20 | 2007-08-02 | Sanyo Electric Co Ltd | Camera |
US8496575B2 (en) * | 2006-11-14 | 2013-07-30 | Olympus Corporation | Measuring endoscope apparatus, program and recording medium |
KR100825919B1 (en) | 2006-12-18 | 2008-04-28 | 엘지전자 주식회사 | Distance measurement sensor, moving robot having the distance measurement sensor and driving method for moving robot using the moving robot |
US8385153B2 (en) | 2007-11-08 | 2013-02-26 | Lite-On It Corporation | Light control system with multiple ultrasonic receivers |
JP5239326B2 (en) | 2007-12-19 | 2013-07-17 | ソニー株式会社 | Image signal processing apparatus, image signal processing method, image projection system, image projection method and program |
JP4879218B2 (en) * | 2008-04-25 | 2012-02-22 | シャープ株式会社 | Lens body, light source unit, and illumination device |
JP2011133836A (en) * | 2009-12-24 | 2011-07-07 | Shinten Sangyo Co Ltd | Camera-equipped mobile phone with linear drive led light emitting device |
- 2010-12-21: KR application KR1020100131147A (patent KR101694797B1), active (IP Right Grant)
- 2011-09-22: US application US13/239,925 (publication US20120154537A1), not active (abandoned)
- 2011-12-13: CN application CN201110421183.0A (patent CN102547355B), active
- 2015-10-14: US application US14/883,453 (patent US10602086B2), active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20010043335A1 (en) * | 1993-12-20 | 2001-11-22 | Toshio Norita | Measuring system with improved method of reading image data of an object |
US20050162639A1 (en) * | 2002-08-03 | 2005-07-28 | Joerg Stierle | Method and device for optically measuring distance |
US20070237363A1 (en) * | 2004-07-30 | 2007-10-11 | Matsushita Electric Works, Ltd. | Image Processing Device |
US20060235753A1 (en) * | 2005-04-04 | 2006-10-19 | Denso Corporation | Vehicular user hospitality system |
US20080174860A1 (en) * | 2006-11-06 | 2008-07-24 | University Of Massachusetts | Systems and methods of all-optical fourier phase contrast imaging using dye doped liquid crystals |
US20100278030A1 (en) * | 2007-09-20 | 2010-11-04 | Yasuhiro Tanaka | Optical pickup optical system and optical pickup device having the same |
Cited By (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8837932B2 (en) * | 2012-06-01 | 2014-09-16 | Hon Hai Precision Industry Co., Ltd. | Camera and auto-focusing method of the camera |
US20130322863A1 (en) * | 2012-06-01 | 2013-12-05 | Hon Hai Precision Industry Co., Ltd. | Camera and auto-focusing method of the camera |
US20130342653A1 (en) * | 2012-06-20 | 2013-12-26 | Honeywell International Inc. | Cargo sensing |
US10158842B2 (en) * | 2012-06-20 | 2018-12-18 | Honeywell International Inc. | Cargo sensing detection system using spatial data |
US10120432B2 (en) * | 2012-11-13 | 2018-11-06 | Lg Display Co., Ltd. | Touch sensing system and method of controlling power consumption thereof |
US20160224095A1 (en) * | 2012-11-13 | 2016-08-04 | Lg Display Co., Ltd. | Touch sensing system and method of controlling power consumption thereof |
US20150022545A1 (en) * | 2013-07-18 | 2015-01-22 | Samsung Electronics Co., Ltd. | Method and apparatus for generating color image and depth image of object by using single filter |
US20160377720A1 (en) * | 2014-01-29 | 2016-12-29 | Lg Innotek Co., Ltd. | Device for extracting depth information |
US10094926B2 (en) * | 2014-01-29 | 2018-10-09 | Lg Innotek Co., Ltd. | Device for extracting depth information |
US9857877B2 (en) * | 2014-08-25 | 2018-01-02 | Samsung Electronics Co., Ltd. | Apparatus and method of recognizing movement of subject |
US20160054812A1 (en) * | 2014-08-25 | 2016-02-25 | Samsung Electronics Co., Ltd. | Apparatus and method of recognizing movement of subject |
US10401969B2 (en) | 2014-08-25 | 2019-09-03 | Samsung Electronics Co., Ltd. | Apparatus and method of recognizing movement of subject |
US20160073061A1 (en) * | 2014-09-04 | 2016-03-10 | Adesa, Inc. | Vehicle Documentation System |
US9942453B2 (en) | 2014-10-21 | 2018-04-10 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
WO2016064096A3 (en) * | 2014-10-21 | 2017-05-04 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
US9769456B2 (en) * | 2014-11-13 | 2017-09-19 | Samsung Electronics Co., Ltd. | Camera for measuring depth image and method of measuring depth image |
US20160138910A1 (en) * | 2014-11-13 | 2016-05-19 | Samsung Electronics Co., Ltd. | Camera for measuring depth image and method of measuring depth image |
US20160165214A1 (en) * | 2014-12-08 | 2016-06-09 | Lg Innotek Co., Ltd. | Image processing apparatus and mobile camera including the same |
US9989630B2 (en) * | 2015-05-13 | 2018-06-05 | Infineon Technologies Ag | Structured-light based multipath cancellation in ToF imaging |
CN108495039A (en) * | 2015-05-19 | 2018-09-04 | 广东欧珀移动通信有限公司 | A kind of camera angle correcting method and terminal |
US10594974B2 (en) | 2016-04-07 | 2020-03-17 | Tobii Ab | Image sensor for vision based on human computer interaction |
US10580234B2 (en) | 2017-01-20 | 2020-03-03 | Adesa, Inc. | Vehicle documentation system |
US11885981B2 (en) | 2018-06-22 | 2024-01-30 | Lg Innotek Co., Ltd. | Camera module |
US20220321789A1 (en) * | 2019-05-27 | 2022-10-06 | Lg Innotek Co., Ltd. | Camera module |
US11997383B2 (en) * | 2019-05-27 | 2024-05-28 | Lg Innotek Co., Ltd. | Camera module extracting distance information based on a time-of-flight method |
CN112153442A (en) * | 2019-06-28 | 2020-12-29 | Oppo广东移动通信有限公司 | Playing method, device, terminal, television equipment, storage medium and electronic equipment |
Also Published As
Publication number | Publication date |
---|---|
KR20120069833A (en) | 2012-06-29 |
CN102547355B (en) | 2017-04-12 |
KR101694797B1 (en) | 2017-01-11 |
US20160037094A1 (en) | 2016-02-04 |
CN102547355A (en) | 2012-07-04 |
US10602086B2 (en) | 2020-03-24 |
Similar Documents
Publication | Title | Publication Date |
---|---|---|
US10602086B2 (en) | Methods of operating image sensors | |
US20120236121A1 (en) | Methods of Operating a Three-Dimensional Image Sensor Including a Plurality of Depth Pixels | |
US20130229491A1 (en) | Method of operating a three-dimensional image sensor | |
US10186045B2 (en) | Methods of and apparatuses for recognizing motion of objects, and associated systems | |
US8687174B2 (en) | Unit pixel, photo-detection device and method of measuring a distance using the same | |
KR102007279B1 (en) | Depth pixel included in three-dimensional image sensor, three-dimensional image sensor including the same and method of operating depth pixel included in three-dimensional image sensor | |
US9324758B2 (en) | Depth pixel included in three-dimensional image sensor and three-dimensional image sensor including the same | |
US9485440B2 (en) | Signal processing unit and signal processing method | |
US9673236B2 (en) | Pixel array of an image sensor and image sensor | |
KR20120075739A (en) | Image processing system and image processing method | |
US20130222543A1 (en) | Method and apparatus for generating depth information from image | |
US9103722B2 (en) | Unit pixels, depth sensors and three-dimensional image sensors including the same | |
KR20120111013A (en) | A tree-dimensional image sensor and method of measuring distance using the same | |
KR101938648B1 (en) | Mobile system including image sensor, method of operating image sensor and method of operating mobile system | |
US20130062500A1 (en) | Image sensor apparatus using shaded photodetector for time of flight determination | |
US20140166858A1 (en) | Methods of Operating Depth Pixel Included in Three-Dimensional Image Sensor and Methods of Operating Three-Dimensional Image Sensor | |
WO2022241942A1 (en) | Depth camera and depth calculation method | |
US12020406B2 (en) | Image signal processing method, image sensing device including an image signal processor | |
US20210281752A1 (en) | Imaging device and electronic device including the same | |
CN112596065A (en) | Time-of-flight sensor and method of calibrating errors therein | |
US20220018946A1 (en) | Multi-function time-of-flight sensor and method of operating the same | |
KR20120111092A (en) | Image pick-up apparatus | |
KR20120110614A (en) | A tree-dimensional image sensor | |
CN202230718U (en) | Spatial light modulator controller | |
US20220385841A1 (en) | Image sensor including image signal processor and operating method of the image sensor |
Legal Events
Code | Title | Description |
---|---|---|
AS | Assignment | Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: CHANG, SEUNG-HYUK; PARK, YOON-DONG; LEE, YONG-JEI. REEL/FRAME: 026987/0035. Effective date: 20110811 |
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |