US20130335725A1 - Color sensor insensitive to distance variations - Google Patents
- Publication number
- US20130335725A1 (application US 13/973,692)
- Authority
- US
- United States
- Prior art keywords
- color
- distance
- sensing pixel
- tof
- pixel
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/10—Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
- H04N25/11—Arrangement of colour filter arrays [CFA]; Filter mosaics
- H04N25/13—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
- H04N25/134—Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
- G01S17/32—Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated
- G01S17/36—Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated with phase comparison between the received signal and the contemporaneously transmitted signal
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
- G01S17/894—3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/703—SSIS architectures incorporating pixels for producing signals other than image signals
- H04N25/705—Pixels for depth measurement, e.g. RGBZ
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2209/00—Details of colour television systems
- H04N2209/04—Picture signal generators
- H04N2209/041—Picture signal generators using solid-state devices
- H04N2209/042—Picture signal generators using solid-state devices having a single pick-up sensor
- H04N2209/047—Picture signal generators using solid-state devices having a single pick-up sensor using multispectral pick-up elements
Definitions
- the subject specification relates generally to solid state sensors and in particular to color determination/correction based upon distance measurement/determination.
- a means for capturing an image digitally is an image sensor, which converts an optical image to an electrical signal, as commonly employed in digital cameras and other imaging devices.
- Typical image sensors comprise a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) active-pixel sensor.
- A CCD is an analog device: when light strikes the individual photosensors (pixels) comprising the image sensor, the received light is held as an electric charge in each pixel. The charge in each pixel is read, converted to a voltage, and further converted to digital information from which a digital image can be created.
- In a CMOS sensor, additional circuitry is employed to convert a voltage into digital data.
- Both CCD and CMOS systems operate utilizing poly-silicon gates, and have their advantages and disadvantages.
- Hybrid image sensors, such as scientific CMOS (sCMOS), are available that combine a CCD imaging substrate with CMOS readout integrated circuits (ROICs).
- color determination can be affected by the distance between a color pixel and the object in question. Further, distance can affect one portion of the electromagnetic spectrum differently from another. For example, color readings made from the red light portion (e.g., about 650 nm) may be more affected by distance than readings made from the blue light portion (e.g., about 475 nm). Accordingly, for color correction of an image, differing degrees of color correction may be required for readings taken from different portions of the electromagnetic spectrum.
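The wavelength-dependent correction described above can be illustrated with a short sketch. The per-channel coefficients and the linear correction model below are invented for illustration only; they are not taken from this disclosure:

```python
# Hypothetical per-channel sensitivity of a color reading to viewing
# distance; red (~650 nm) is assumed here to be more distance-sensitive
# than blue (~475 nm), purely for illustration.
DISTANCE_GAIN = {"r": 0.08, "g": 0.05, "b": 0.03}  # fractional change per metre

def correct_channel(raw, channel, distance_m, ref_distance_m=1.0):
    """Scale a raw channel reading back to a reference viewing distance."""
    gain = DISTANCE_GAIN[channel]
    # Linear model: readings drift proportionally to distance offset.
    return raw * (1.0 + gain * (distance_m - ref_distance_m))
```

Under this model, a red reading taken at 2 m would receive a larger correction than a blue reading taken at the same distance, mirroring the differing per-spectrum effects described above.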
- information regarding coloration of an object can be color-corrected based upon the viewing distance between a color sensing device (e.g., a pixel in a photosensor) and the object being viewed.
- a Time of Flight (ToF) sensor/pixel is associated with a pixel (e.g., a color pixel) receiving light from the object.
- Phase shift analysis of electromagnetic radiation received at the ToF sensor is performed and, accordingly, the distance from the ToF sensor (and hence of the associated color pixel) to the object can be determined.
- a color value generated by the color pixel can be corrected to a color value in accordance with the measured distance. Effectively, the color pixel is calibrated based upon the measured distance.
- a radiation source for the ToF sensor can be electromagnetic radiation, where, for example, the electromagnetic radiation can be from the infrared or visible light portions of the electromagnetic radiation spectrum.
- a plurality of color pixel and ToF pixel arrangements are available. Color pixels and ToF pixels can be incorporated into the same image sensor.
- a chip can be manufactured comprising pixel groupings where pixels are created to perform color sensing (e.g., red, green, blue (RGB)) and other pixels are ToF pixels created to perform distance measuring.
- a plurality of chips can be manufactured whereby a number of the chips are color sensing chips while the other chips are manufactured as ToF sensors.
- Each pixel on a color sensing chip is associated with a pixel on a ToF sensor.
- color and distance sensing components can be combined on a single integrated circuit along with means for processing readings received from the color and distance sensing components.
- a common radiation source can be employed to generate illumination for the color sensor as well as allowing distance measurements to be performed by the ToF sensor.
- the radiation source employed to perform ToF measurements is of a different part of the electromagnetic spectrum compared with the radiation source employed to illuminate the object for color sensing by a color pixel.
- a ToF sensor operating in grayscale and producing grayscale images can be altered to produce color images, where such alteration can be performed using color filters that limit the wavelengths of radiation incident on the ToF sensor and its ToF pixel(s).
- a single LED can be employed to illuminate an object and facilitate distance measurements, e.g., the LED emits white light.
- a plurality of LEDs can be employed, where each LED emits electromagnetic radiation of a particular wavelength.
- the radiation generated by the plurality of LEDs can be combined to produce illumination having a plurality of wavelengths, e.g., red, green, and blue LEDs combined to form white light illumination.
- calibration of a color pixel can be performed by employing look-up tables and/or algorithms which provide correlations between color, distance and color correction.
- FIG. 1 illustrates a system for determining color-correction based upon object distance, in accordance with an aspect.
- FIG. 2 illustrates a measurement process employing electromagnetic radiation to determine position of an object, in accordance with an aspect.
- FIG. 3 illustrates a system for color correcting an image based upon distance measurements provided by associated ToF sensor(s), in accordance with an aspect.
- FIG. 4 illustrates an arrangement of pixels for color correcting an image based upon distance measurements provided by associated ToF sensor(s), in accordance with an aspect.
- FIG. 5 illustrates an arrangement of pixels and associated ToF pixel(s) for color correcting an image based upon distance measurements provided by associated ToF sensor(s), in accordance with an aspect.
- FIG. 6 illustrates a plurality of color sensor and associated ToF pixel(s) arrangements for color correcting an image based upon distance measurements provided by the associated ToF pixel(s), in accordance with an aspect.
- FIG. 7 illustrates an arrangement of pixels and associated ToF pixel(s) for color correcting an image based upon distance measurements provided by the associated ToF pixel(s), in accordance with an aspect.
- FIG. 8 illustrates a system for color image generation with a ToF sensor, in accordance with an aspect.
- FIG. 9 illustrates a system for determining color-correction based upon object distance, in accordance with an aspect.
- FIG. 10 illustrates a system for determining color-correction based upon object distance, in accordance with an aspect.
- FIG. 11 illustrates an example methodology for determining color-correction based upon object distance, in accordance with an aspect.
- FIG. 12 illustrates an example methodology for identifying whether a ToF pixel is inoperable, in accordance with an aspect.
- FIG. 13 illustrates an example methodology for generating a color image using a ToF sensor, in accordance with an aspect.
- a component can be, but is not limited to being, a process running on a processor, a microprocessor, a microcontroller, a chip, an integrated circuit, an object, an executable, a thread of execution, a program, and/or a computer.
- an application running on a controller and the controller can be a component.
- One or more components can reside within a process and/or thread of execution and a component can be localized on one computer and/or distributed between two or more computers.
- an interface can include I/O components as well as associated processor, application, and/or API components.
- FIG. 1 illustrates system 100 for determining viewing distance to an object to facilitate sensor calibration and color sensing.
- An image sensor 100 comprises a Time of Flight (ToF) sensor 110 and a color sensor 120 .
- the image sensor 100 receives radiation from respective ranges of the electromagnetic spectrum to facilitate operation of ToF sensor 110 and color sensor 120 .
- a ToF radiation source 140 provides electromagnetic energy to facilitate operation of the ToF sensor 110
- a radiation source 150 provides electromagnetic energy to facilitate operation of the color sensor 120 .
- Electromagnetic energy emitted from sources 140 and 150 is reflected from object 130 and respectively captured by ToF sensor 110 and color sensor 120 .
- Controller 160 can control operation of the various components comprising system 100 (e.g., ToF sensor 110 , color sensor 120 , etc.) either on an individual basis or a combination of components, thereby enabling determination of distance d between object 130 and ToF sensor 110 , and correction (as required) of the received color of object 130 . Based upon measurements of distance d, controller 160 can make a determination regarding any required calibration of color sensor 120 and accordingly, “correction” of the received color of object 130 . Such automatic calibration of color sensor 120 can negate the need for labor-intensive calibration procedures which may be required to ensure operation of a color sensor that does not adjust readings on account of sensor to object distance.
- the viewing distance of the color sensor 120 can be automatically adjusted based upon the refresh rate of the ToF sensor 110 , and any ToF pixels comprising ToF sensor 110 .
- Readings obtained from color sensor 120 and ToF sensor 110 can be utilized by controller 160 to generate a color corrected output (e.g., an image) which can be forwarded to an external device 170 .
- the corrected output can include both color information and distance information for respective pixels thereby facilitating generation of a 3D image.
- External device 170 can provide means for display of an image generated from information received from the color sensor 120 , from the ToF sensor 110 , and any color correction effected by controller 160 .
- External device 170 can further comprise means for data processing, where information associated with the color correction process (e.g., readings obtained by image sensor 100 , ToF sensor 110 , color sensor 120 , etc.) can be provided to the external device for subsequent processing.
- an environmental sensor 180 can be utilized to obtain information regarding the environment of operation.
- environmental sensor 180 can detect that the environment of operation has been negatively affected by atmospheric particulates (e.g., smoke) and that the output of radiation source 150 or ToF radiation source 140 requires adjustment accordingly, e.g., by controller 160 .
- While ToF sensor 110 and color sensor 120 are shown as being incorporated into a common image sensor 100 component, the invention is not so limited: ToF sensor 110 and color sensor 120 can be combined to form a common image sensor 100 , or ToF sensor 110 and color sensor 120 can operate in isolation. Embodiments of such combination/isolation are presented infra. It is to be further appreciated that while various components comprising system 100 are shown as individual devices (e.g., image sensor 100 , ToF sensor 110 , color sensor 120 , ToF radiation source 140 , radiation source 150 , controller 160 , external device 170 , and storage 910 (ref. FIG. 9 )),
- one or more components comprising system 100 can be housed in a common housing, e.g., image sensor 100 , ToF sensor 110 , color sensor 120 , ToF radiation source 140 , controller 160 , and storage 910 are integrated into a single housing or a single chip, microcontroller, integrated circuit, and the like.
- external device 170 is shown as being external to controller 160 , the two components can be combined in a common housing, where, for example, the external device 170 is a display device facilitating display of a color corrected image, color data, distance data, etc., as generated by controller 160 and any associated components.
- ToF radiation source 140 and radiation source 150 may utilize radiation from different portions of the electromagnetic spectrum, e.g., ToF radiation source 140 utilizes IR radiation and radiation source 150 utilizes visible light
- a common radiation source can be employed with the various aspects presented herein.
- ToF radiation source 140 and radiation source 150 can both employ radiation having a common wavelength such as visible red light.
- a radiation source 150 can employ red visible light to illuminate the object and the ToF radiation source 140 can utilize red visible light from which phase shift measurements and, accordingly, distance measurements can be determined, as described below.
- an image generated using the common radiation source can be monochromatic.
- the ToF radiation source 140 and/or radiation source 150 can comprise a combination of light sources operating concurrently.
- ToF radiation source 140 and/or radiation source 150 can be a laser, or similar mechanism for emitting electromagnetic radiation, wherein ToF sensor 110 and/or color sensor 120 have suitable functionality to perform respective distance measurement/image generation using a laser source.
- ToF sensor 110 has been developed for application in the field of distance measurement to facilitate such operations as detection of persons, objects, obstacles, etc., as well as measuring position, distance, dimensions, etc.
- a ToF sensor 110 operates in conjunction with a ToF radiation source 140 .
- ToF radiation source 140 generates radiation of a specific part of the electromagnetic spectrum and, based upon the wavelength of the radiation, the distance an object 130 resides from the ToF radiation source 140 and a ToF sensor 110 can be determined.
- the ToF radiation source 140 emits light from the infrared (IR) portion of the electromagnetic spectrum.
- ToF technology is based upon determining a phase shift φ between a signal emitted from the ToF radiation source 140 and the signal reflected from object 130 and received at the ToF sensor 110 . From the determined phase shift φ, the distance d between object 130 and ToF sensor 110 can be determined as d = (c/(4π·f_m))·φ, where:
- d = distance
- c = speed of light
- f_m = modulation frequency (which is a function of the radiated power of an emitter)
- φ = arctan((A3 − A1)/(A0 − A2)), where A0 , A1 , A2 , and A3 are samples of the received signal taken at fixed durations of time
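The phase-shift relationship above can be exercised numerically. A minimal sketch using the standard four-sample ("four-bucket") continuous-wave ToF formulation; `atan2` is used in place of `arctan` so the full 0–360° phase range is covered:

```python
import math

def tof_distance(a0, a1, a2, a3, f_m, c=299_792_458.0):
    """Estimate one-way object distance from four phase samples of the
    reflected signal (four-bucket continuous-wave ToF)."""
    # Phase shift between emitted and received signal.
    phi = math.atan2(a3 - a1, a0 - a2)
    if phi < 0:                      # keep phase in [0, 2*pi)
        phi += 2 * math.pi
    # d = c * phi / (4 * pi * f_m): half the round-trip path length.
    return c * phi / (4 * math.pi * f_m)
```

For a 20 MHz modulation, a quarter-cycle phase shift (90°) corresponds to roughly 1.87 m of one-way distance.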
- While FIG. 2 presents ToF technology based upon determination of phase shift φ, other ToF principles are available, for example pulse metering, where a pulse is transmitted and the time-to-receive is measured.
- ToF sensor 110 and color sensor 120 can operate in isolation or in combination. Various embodiments of such operations are presented.
- a ToF sensor 110 can be combined with a color sensor 120 in the same semiconductor chip.
- ToF pixels can be combined with color sensor pixels to form an image sensor 310 having distance measuring functionality.
- Each pixel in a cluster of pixels (e.g., the group of four pixels 320 , 330 , 340 and 350 ) has a designated function: pixels 320 , 330 , and 340 are utilized to capture light from the respective red (R), green (G), and blue (B) wavelength portions of the visible light portion of the electromagnetic spectrum.
- Pixel 350 is employed to operate as a ToF sensor.
- the chip can be divided into four groups of 12×50 pixels having respective R, G, B color sensing and ToF determining functionality.
- While FIG. 3 shows pixel clustering having an RGB-d arrangement (where d is equivalent to the ToF pixel), the various aspects are not so limited: any clustering of pixels can be utilized that facilitates suitable image capture by pixels capturing visible light along with the requisite pixels having ToF capability.
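Indexing such an RGB-d mosaic reduces to a modulo lookup on pixel coordinates. The particular 2×2 ordering below is assumed for illustration and is not specified by the disclosure:

```python
# One possible 2x2 RGB-d mosaic (layout assumed for illustration):
#   R G
#   B d      where "d" is the ToF (distance) pixel.
ROLES = {(0, 0): "R", (0, 1): "G", (1, 0): "B", (1, 1): "d"}

def pixel_role(row, col):
    """Return the function of the pixel at (row, col) in the repeating mosaic."""
    return ROLES[(row % 2, col % 2)]
```

Any other clustering (hexagonal, Bayer-like, etc.) would substitute a different lookup table or geometry, per the "not so limited" remark above.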
- FIG. 4 illustrates an image sensor 410 comprising pixels 420 - 450 having a hexagonal profile and arranged in a honeycomb layout, where pixels 420 , 430 , and 440 are utilized to capture light from the respective red (R), green (G), and blue (B) wavelength portions of the visible light portion of the electromagnetic spectrum.
- Pixel 450 is employed to operate as a ToF sensor.
- FIG. 5 presents system 500 comprising a color sensor 510 constructed separately from, but operating in conjunction with a ToF sensor 520 .
- the pixels of color sensor 510 have been arranged in a Bayer GRGB layout, where each pixel cluster is operating in conjunction with an associated ToF sensor pixel.
- the cluster of pixels G 1 R 1 G 1 B 1 can be color corrected based upon distance measurements obtained from ToF 1 in the ToF sensor
- pixel cluster G 2 R 2 G 2 B 2 can be color corrected based upon ToF 2
- pixel cluster G 3 R 3 G 3 B 3 can be color corrected based upon ToF 3 , etc.
- the size of a ToF pixel and a color pixel do not have to be the same, and accordingly, the ToF pixel and the color pixel do not have to have the same resolution.
- the size of the respective ToF pixels is not as critical as that of the color pixels, as the ToF pixels may play no part in the actual image construction and resolution.
- FIG. 6 , system 600 , presents a plurality of color sensor and ToF pixel arrangements, where a high resolution color sensor 610 comprises a plurality of color RGB pixels 620 .
- the high resolution color sensor 610 can be coupled to a low resolution ToF chip 630 .
- the low resolution ToF chip 630 can comprise one or more ToF pixels 640 .
- a high resolution color sensor 610 (comprising a plurality of color RGB pixels 620 ) can be coupled to a low resolution ToF chip 630 comprising a single ToF pixel 640 (as shown in FIG. 6 , lower image).
- a high resolution color sensor 610 (comprising a plurality of color RGB pixels 620 ) can be coupled to a low resolution ToF chip 630 comprising a plurality of ToF pixels 640 .
- the high resolution color sensor 610 can be broken up into groupings of color RGB pixels 620 and distance determinations performed with respect to an associated ToF pixel 640 .
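The grouping of high resolution color pixels against a low resolution ToF chip can be sketched as an integer-division lookup, where each group of color pixels shares the distance measured by its associated ToF pixel. Group sizes and depth values below are illustrative assumptions:

```python
def tof_for_color_pixel(row, col, tof_depths, group_h, group_w):
    """Look up the distance for a color pixel from the low-resolution
    ToF array covering the same field of view; each group_h x group_w
    block of color pixels shares one ToF pixel's measurement."""
    return tof_depths[row // group_h][col // group_w]

# Example: a 4x4 color sensor served by a 2x2 ToF chip (2x2 pixel groups).
depths = [[1.0, 1.2],
          [1.5, 1.1]]
```

With a single-ToF-pixel chip (FIG. 6, lower image), `group_h` and `group_w` would simply span the whole color array, so every color pixel shares one distance.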
- the various embodiments shown in FIG. 6 are suitable for measuring the distance of approximately planar surfaces, as well as finding application in low cost system(s).
- FIG. 7 presents system 700 comprising color sensor 710 constructed separately from, but operating in conjunction with ToF sensor 720 .
- the hexagonal pixels of color sensor 710 have been arranged in a honeycomb layout, where each pixel is operating in conjunction with an associated ToF sensor pixel.
- pixel R x can be color corrected based upon distance measurements obtained from TOF m in the ToF sensor, with G x associated with ToF n , G y associated with ToF o , etc.
- pixels operating across the visible portion of the electromagnetic spectrum can be employed.
- grayscale pixels can be employed, where the pixel responds to the amount of incident light; for example, as the amount of incident light increases, the output voltage of the pixel increases.
- a filter can be placed over grayscale pixels whereby the filter is employed to filter respective portions of the electromagnetic spectrum while allowing a particular wavelength, plurality of wavelengths, or a range of wavelengths to pass through to the underlying grayscale pixels.
- a filter can be employed to restrict the wavelengths of light striking the color sensor, e.g., the filter is a “green light” filter and allows only light having a wavelength of about 550 nm to pass through to the light sensor.
- FIG. 8 presents system 800 illustrating color filters being employed on a ToF sensor.
- Grayscale ToF sensor 810 can be divided into four regions: the first region functions as a ToF sensor 820 and is employed to determine a distance D from the ToF sensor 820 to an object (e.g., object 130 ).
- the second, third and fourth regions of grayscale ToF sensor 810 can be respectively covered with a red filter 830 , a green filter 840 , and a blue filter 850 .
- Object 130 can be illuminated with a white-light source (e.g., a light emitting diode (LED) 860 producing white light) with the respective filter (e.g., filter 830 , 840 , 850 ) allowing a particular portion of the visible spectrum to be recorded at the underlying portion of grayscale ToF sensor 810 .
- a 3D color image can be compiled.
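Compiling the 3D color image amounts to merging the three filtered grayscale readings with the depth measurements into one RGBZ array. A minimal sketch, assuming the four regions have already been read out as equally sized 2D arrays (an assumption for illustration):

```python
def assemble_rgbz(red, green, blue, depth):
    """Combine per-region grayscale readings taken through red, green,
    and blue filters with the ToF depth map into one RGBZ image,
    represented as nested lists of (R, G, B, Z) tuples."""
    return [
        [(r, g, b, z) for r, g, b, z in zip(rr, gg, bb, zz)]
        for rr, gg, bb, zz in zip(red, green, blue, depth)
    ]
```

In practice the four sensor regions view the scene from slightly different positions, so real systems would also need registration between regions; that step is omitted here.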
- a white-light LED (e.g., LED 860 ) can be driven with a modulation suitable for distance measurement (e.g., about 20 MHz).
- alternatively, illumination can be provided by red, green and blue LEDs 870 , illuminated in sequence with a frequency modulation of about 20 MHz.
- the ToF sensor can be responsive to one or more wavelengths in the visible portion of the electromagnetic spectrum.
- the ToF radiation source 140 can emit light from the red portion of the visible spectrum (e.g., light with a wavelength of about 650 nm).
- the red light would be received by imaging pixels (e.g., color sensor 120 ) responsive to light having a red wavelength, and the ToF sensor 110 can operate (as shown in FIG. 2 ) by determining the position of object 130 in accordance with the phase shift of the received red light.
- the effect of variation of an object surface can be compensated for.
- an associated color pixel can be “calibrated” in accordance with the measured viewing distance.
- the color pixel can be calibrated based upon the distance measured by the ToF sensor, and any measurements made by the color sensor can be adjusted in accord with the calibration adjustment.
- the viewing distance of the color pixel can be automatically adjusted based upon the refresh rate of the ToF sensor.
- the RGB and Bayer layouts can be further complemented by red-green-blue-emerald (RGBE), cyan-yellow-yellow-magenta (CYYM), cyan-yellow-green-magenta (CYGM), and the like.
- system 900 is illustrated and comprises various components previously discussed, image sensor 100 comprising ToF sensor 110 , color sensor 120 , ToF radiation source 140 , coupled to controller 160 which is further coupled to external device 170 , etc. Further, system 900 includes a storage device 910 on which are stored one or more look up tables 920 and/or algorithms 930 .
- a look up table 920 can provide correlation between a distance measurement and a degree of color correction to be applied to a reading received from color sensor 120 based upon a distance measurement received from ToF sensor 110 . Similarly, a reading received from color sensor 120 can be “calibrated” with an algorithm 930 utilizing a distance measurement received from ToF sensor 110 .
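A hedged sketch of such a look-up table, with linear interpolation between distance entries; the table values below are invented for illustration and are not from the disclosure:

```python
from bisect import bisect_left

# Hypothetical look-up table: (distance in metres, correction factor
# applied to the raw color reading). Values are illustrative only.
LUT = [(0.5, 1.00), (1.0, 1.05), (2.0, 1.12), (4.0, 1.25)]

def correction_factor(distance_m):
    """Linearly interpolate a correction factor from the look-up table,
    clamping to the end entries outside the tabulated range."""
    dists = [d for d, _ in LUT]
    i = bisect_left(dists, distance_m)
    if i == 0:
        return LUT[0][1]
    if i == len(LUT):
        return LUT[-1][1]
    (d0, f0), (d1, f1) = LUT[i - 1], LUT[i]
    t = (distance_m - d0) / (d1 - d0)
    return f0 + t * (f1 - f0)
```

The same lookup could equally be replaced by an algorithm 930 (a fitted function of distance), which trades storage for computation.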
- FIG. 10 illustrates system 1000 depicting a microcontroller/integrated circuit according to an aspect.
- System 1000 comprises a microcontroller/integrated circuit 1010 on which are combined a processing unit 1050 along with an image sensor 1020 comprising a ToF sensor 1030 and a color sensor 1040 .
- rather than image sensor 100 operating as a separate device to controller 160 , the various components can be combined on a single device, e.g., microcontroller/integrated circuit 1010 .
- Microcontroller/integrated circuit 1010 can be designed such that an image sensor 1020 , comprising ToF sensor 1030 and color sensor 1040 , resides on the same chip as an associated processing unit 1050 .
- image sensor 1020 , ToF sensor 1030 , color sensor 1040 , and processing unit 1050 have the same functionality as that described earlier with reference to FIG. 1 , and respective components image sensor 100 , ToF sensor 110 , color sensor 120 , and controller 160 .
- ToF radiation source 140 , radiation source 150 , external device 170 , and/or environmental sensor 180 can also be located on the microcontroller/integrated circuit 1010 .
- ToF sensor 110 can be a color or grayscale ToF sensor.
- FIG. 11 illustrates an example methodology 1100 for determining color correction based upon viewing distance.
- light reflected from an object (e.g., object 130 ) is received at one or more color pixels (e.g., color pixels 320 - 340 , 420 - 440 , G 1 R 1 G 1 B 1 , R x-z G x-z B x , 1040 , etc.) comprising one or more color sensors (e.g., image sensors 100 , 710 , 1010 , and color sensors 120 , 310 , 410 , 510 , and 610 , etc.).
- each color pixel has an associated ToF pixel (e.g., ToF pixels 350 , 450 , ToF 1 , TOF m-n , 1030 , etc.) comprising one or more ToF sensors (e.g., image sensor 100 and sensors 310 , 410 , 520 , and 620 ). Electromagnetic radiation reflected from the object is received at the ToF pixel.
- the distance from a particular ToF pixel to the object surface from which the ToF radiation was received is determined (e.g., by controller 160 , processing unit 1050 , etc.).
- a ToF sensor determines distance based upon a phase shift between radiation emitted from a radiation source toward an object and the radiation received at a ToF pixel as reflected from the object.
- a phase shift of up to 360° can be determined, where 360° represents one wavelength of the radiation, 180° equals ½ wavelength, etc.
- from the phase shift, the wavelength portion can be calculated and a corresponding distance determined
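The phase-to-distance relation above can be sketched as follows. This is an illustrative Python sketch, not the specification's implementation: it assumes the four-sample arctangent relation given later in the description of FIG. 2 (Δφ = arctan((A3−A1)/(A0−A2))) and the standard continuous-wave ToF result that one full 360° of phase corresponds to one round trip of the modulation wavelength, giving an unambiguous range of c/(2·f_m).

```python
import math

C = 299_792_458.0  # speed of light, m/s

def phase_from_samples(a0, a1, a2, a3):
    """Recover the phase shift (radians) from four amplitude samples
    A0..A3 taken at fixed offsets in time, per
    delta_phi = arctan((A3 - A1) / (A0 - A2))."""
    return math.atan2(a3 - a1, a0 - a2) % (2 * math.pi)

def distance_from_phase(delta_phi, f_mod):
    """Convert a phase shift to object distance. A full 2*pi of phase
    corresponds to one round trip of the modulation wavelength, so the
    unambiguous range is c / (2 * f_mod)."""
    unambiguous_range = C / (2 * f_mod)  # ~7.5 m at 20 MHz
    return (delta_phi / (2 * math.pi)) * unambiguous_range
```

For example, at a 20 MHz modulation frequency a 180° phase shift corresponds to roughly half of the ~7.5 m unambiguous range.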
- the distance from a color pixel to the object should be about equal to the respective distance of a ToF pixel to the object (or a known distance ratio, e.g., 1:1, 1:2, 1:5, etc.), where the ToF pixel is being utilized to determine the distance of the particular color pixel to the object.
- the ToF pixel is being employed to determine a corresponding distance from a color pixel to an object to facilitate calibration of the color pixel.
- the color reading obtained by the color pixel can be corrected in accordance with the determined distance (e.g., by controller 160 ).
- Any suitable method can be employed to perform the color correction, including, but not limited to, employing a look-up table (e.g., look-up table 920) comprising correlations between color reading(s), object-to-pixel distance(s), and correction factor(s) based thereon.
- Alternatively, an algorithm(s) (e.g., algorithm(s) 930) can be employed, whereby a color reading and associated distance measurement are entered, correction factor(s) determined, and a corrected color value generated.
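One way such a look-up table might operate is sketched below. The table contents and channel names are hypothetical placeholders, not values from the specification; a real table (e.g., look-up table 920) would be populated from calibration data correlating color readings, distances, and correction factors.

```python
import bisect

# Hypothetical correction table: per channel, (distance_m, factor) pairs.
# Real entries would come from calibration of the actual sensor.
CORRECTION = {
    "red":  [(0.5, 1.00), (2.0, 1.08), (5.0, 1.20)],
    "blue": [(0.5, 1.00), (2.0, 1.03), (5.0, 1.07)],
}

def correction_factor(channel, distance):
    """Linearly interpolate a correction factor for the given distance,
    clamping to the table's end values outside its range."""
    table = CORRECTION[channel]
    dists = [d for d, _ in table]
    i = bisect.bisect_left(dists, distance)
    if i == 0:
        return table[0][1]
    if i == len(table):
        return table[-1][1]
    (d0, f0), (d1, f1) = table[i - 1], table[i]
    t = (distance - d0) / (d1 - d0)
    return f0 + t * (f1 - f0)

def correct_reading(channel, raw_value, distance):
    """Apply the distance-dependent factor to a raw color reading."""
    return raw_value * correction_factor(channel, distance)
```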
- one or more color-corrected values can be outputted.
- the color-corrected values can be outputted to an external device (e.g., external device 170 ) for presentation, e.g., where the external device comprises means for display and the color corrected image can be presented thereon.
- the external device can comprise printing means, such as a printer, whereby the color corrected image is generated in hard format.
- the external device can comprise further means for data processing, where information associated with the color correction process (e.g., readings obtained by color pixels 320 - 340 , 420 - 440 , G 1 R 1 G 1 B 1 , R x-z G x-z B x , 1040 , etc.; readings obtained by image sensor 100 , and ToF sensors 110 , 310 , 410 , 520 , 620 and 710 ; color corrected values as generated by look-up table(s) 820 and/or algorithm(s) 830 ; the look-up table(s) 820 and/or algorithm(s) 830 utilized to generate the color-corrected values; operating settings of the ToF radiation source 140 ; operating settings of radiation source 150 , etc.) can be provided to the external device for subsequent processing.
- methodology 1100 shows the process 1110 of capturing light reflected from an object (e.g., object 130) by one or more color pixels as an operation preceding that of 1120, where radiation is received at one or more ToF pixels, and 1130, where the object distance is determined based upon the ToF data
- the various aspects presented herein are not so limited, however, and the operation of determining the object distance can be performed prior to capturing light by a color pixel, with the operation 1140 of color correction performed thereafter.
- methodology 1100 discloses object to sensor distance being determined based upon determination of phase shift
- Other principles of ToF technologies are available, for example, pulse metering where a pulse is transmitted and the time-to-receive is measured.
- FIG. 12 illustrates an example methodology 1200 for correcting operation of a color sensing pixel where an associated ToF pixel is erroneous or inoperable.
- light reflected from an object is captured by one or more color sensing pixels (e.g., color pixels 320-340, 420-440, G1R1G1B1, Rx-zGx-zBx, etc.) comprising an image sensor (e.g., image sensor 100, 1020, and color sensors 120, 310, 410, 510, 610, 710, 810, 1040, etc.)
- improbable distance measurements or no measurements are obtained from a ToF pixel associated with the one or more color sensing pixels.
- a color sensing pixel is calibrated based upon distance measurements received from a ToF pixel associated with the color pixel.
- distance readings from two or more ToF pixels can be compared to determine whether a particular ToF pixel is generating a correct output.
- two adjacent ToF pixels may be producing disparate readings, e.g., a first ToF pixel (e.g., FIG. 5, ToF1) indicates an object is 7 m away, while a second ToF pixel (e.g., FIG. 5, ToF2) indicates the object to be 1.5 m away.
- Assume the object has a planar surface aligned perpendicular to the sightline of the ToF sensor, so the two readings should agree. By comparing the readings of the two ToF pixels against those of adjacent ToF pixels (e.g., FIG. 5, ToF3 or ToF4), it is possible to identify which ToF pixel is erroneous. Alternatively, during an attempt to determine the degree with which a color sensing pixel is to be calibrated, no readings may be available from the ToF pixel associated with the color sensing pixel during execution of the calibration determination (e.g., by controller 160). In either case, it is considered that the ToF pixel is either providing erroneous values or is inoperable.
- Where a ToF pixel is inoperable/erroneous (e.g., FIG. 5, ToF1 is inoperable), readings are taken from an adjacent ToF pixel (e.g., FIG. 5, ToF2, ToF3 or ToF4) which is performing correctly.
- the distance from the alternative ToF pixel to the object surface is determined (e.g., by controller 160 , processing unit 1050 ).
- the alternative distance measurements are applied to the color sensing pixel.
- the alternative distance measurement is referenced when determining a correction value from a lookup table (e.g., lookup table 920 ) associated with the color reading obtained from the subject color sensing pixel.
- the alternative distance measurement can be utilized when applying a calibration algorithm (e.g., algorithm 930) to calibrate the color reading obtained from the subject color sensing pixel.
- color corrected (e.g., calibrated) values for the subject color sensing pixel are outputted.
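A minimal sketch of the substitution logic in methodology 1200, under the assumption that readings from adjacent ToF pixels can stand in for an erroneous or missing one; the tolerance value and the 1-D neighbourhood are illustrative choices, not part of the specification.

```python
def substitute_distance(distances, index, tolerance=0.5):
    """Return a usable distance for the ToF pixel at `index` in a row of
    readings. A reading of None (no measurement) or one disagreeing with
    the mean of its immediate neighbours by more than `tolerance` metres
    is treated as erroneous, and the neighbour mean is substituted."""
    neighbours = [distances[i] for i in (index - 1, index + 1)
                  if 0 <= i < len(distances) and distances[i] is not None]
    if not neighbours:
        return distances[index]          # nothing to compare against
    mean = sum(neighbours) / len(neighbours)
    reading = distances[index]
    if reading is None or abs(reading - mean) > tolerance:
        return mean                      # adjacent ToF pixel(s) stand in
    return reading                       # reading is plausible; keep it
```

With the example above (a first pixel reporting 7 m while its neighbours report about 1.5 m), the 7 m reading would be flagged and the neighbour value substituted.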
- FIG. 13 illustrates an example methodology 1300 for generating a color corrected image from a grayscale ToF sensor (e.g., ToF sensor 810).
- a ToF sensor for viewing distance determination, may only operate in the infrared portion of the electromagnetic spectrum (as discussed supra). Further, as discussed above, a ToF sensor may only produce a grayscale 3D image. However, by employing color filters and appropriate illumination, a 3D color image can be produced.
- To facilitate distance measurement a portion of a ToF sensor is maintained as a pixel(s) region for measuring distance (e.g., ToF-d 820 ).
- The remaining portion of the ToF sensor can be covered with filters (e.g., ToF red filter 830, ToF green filter 840, ToF blue filter 850) to allow a desired portion of visible light to pass through to the ToF sensor pixel array below.
- an object (e.g., object 130) is illuminated.
- Illumination can be facilitated by any suitable illumination source (e.g., ToF radiation source 140 , radiation source 150 ), where such illumination source can be an LED device, laser, etc.
- the LED device can comprise a single LED emitting light across the visible portion of the electromagnetic spectrum (e.g., LED 860), where the LED device can operate with a frequency modulation suitable for ToF determination, e.g., about 20 MHz.
- the LED device can alternatively comprise a plurality of LEDs, where each LED emits light from a different portion of the electromagnetic spectrum (e.g., red, green, and blue LEDs 870). The plurality of LEDs can operate in sequence and with the frequency modulation required for distance measurement, e.g., about 20 MHz.
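The sequential strobing described above can be sketched as a simple capture loop. `set_led` and `grab_frame` are hypothetical hardware hooks standing in for an LED driver and the sensor readout; they are not part of the specification.

```python
def capture_color_frames(set_led, grab_frame,
                         channels=("red", "green", "blue"), f_mod=20e6):
    """Strobe one LED at a time, modulated at the ToF frequency, and
    grab one grayscale frame per channel for later recombination."""
    frames = {}
    for channel in channels:
        set_led(channel, f_mod)         # enable only this LED, ~20 MHz
        frames[channel] = grab_frame()  # frame lit by this channel alone
    return frames
```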
- the distance of the object is determined based upon measurements obtained from the portion of the ToF sensor employed to facilitate distance measurement.
- the respective portions of the ToF sensor being employed to receive light reflected from the object and thereby create a 3D image of the object are calibrated.
- the actual measurements received from each color sensing portion are adjusted.
- the degree of calibration required for one color sensing region may not equal a degree of calibration required for another color sensing region.
- the degree of color correction required to correct a red light reading may be greater than that required for a blue light reading, when both readings are made over the same distance.
- one or more color-corrected values can be combined and outputted. Values taken from each of the red, green, and blue light sensing portions of the ToF sensor can be combined to create a color image. Owing to the respective readings being accompanied by distance measurements it is possible to create a 3D color image.
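As a sketch of the per-channel, distance-dependent correction and recombination described above (with red assumed to need a larger correction than blue over the same distance, per the preceding discussion), one might write the following; the linear gain model and its coefficients are purely illustrative assumptions.

```python
# Hypothetical per-channel gain per metre of viewing distance; red is
# assumed to need more compensation than blue, per the text above.
GAIN_PER_METRE = {"red": 0.04, "green": 0.02, "blue": 0.01}

def build_3d_pixel(raw, distance):
    """Combine corrected R, G, B readings with a ToF distance into a
    single 3D colour pixel (r, g, b, d). `raw` maps channel -> reading."""
    corrected = {ch: raw[ch] * (1.0 + GAIN_PER_METRE[ch] * distance)
                 for ch in ("red", "green", "blue")}
    return (corrected["red"], corrected["green"], corrected["blue"], distance)
```

Repeating this for every pixel region yields the distance-annotated colour values from which a 3D colour image can be assembled.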
- the color-corrected values can be output to an external device (e.g., external device 170 ) for presentation, printing, further data processing, etc.
- computer readable media can include but are not limited to magnetic storage devices (e.g., hard disk, floppy disk, magnetic strips . . . ), optical disks (e.g., compact disk (CD), digital versatile disk (DVD) . . . ), smart cards, and flash memory devices (e.g., card, stick, key drive . . . ).
- a carrier wave can be employed to carry computer-readable electronic data such as those used in transmitting and receiving electronic mail or in accessing a network such as the Internet or a local area network (LAN).
- LAN local area network
- the word “exemplary” is used herein to mean serving as an example, instance, or illustration. Any aspect or design described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects or designs. Rather, use of the word exemplary is intended to disclose concepts in a concrete fashion.
- the term “or” is intended to mean an inclusive “or” rather than an exclusive “or”. That is, unless specified otherwise, or clear from context, “X employs A or B” is intended to mean any of the natural inclusive permutations. That is, if X employs A; X employs B; or X employs both A and B, then “X employs A or B” is satisfied under any of the foregoing instances.
- the articles “a” and “an” as used in this application and the appended claims should generally be construed to mean “one or more” unless specified otherwise or clear from context to be directed to a singular form.
Abstract
Description
- This patent application is a continuation of U.S. patent application Ser. No. 12/938,499, filed Nov. 3, 2010 and entitled “COLOR SENSOR INSENSITIVE TO DISTANCE VARIATIONS,” the entirety of which is incorporated herein by reference.
- The subject specification relates generally to solid state sensors and in particular to color determination/correction based upon distance measurement/determination.
- A means for capturing an image digitally is an image sensor, which converts an optical image to an electrical signal, as commonly employed in digital cameras and other imaging devices. Typical image sensors comprise a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) active-pixel sensor. A CCD is an analog device: when light strikes the individual photo sensors (pixels) comprising the image sensor, the received light is held as an electric charge in each pixel. The charge in each pixel is read, converted to a voltage, and further converted to digital information from which a digital image can be created. With a CMOS sensor, additional circuitry is employed to convert a voltage into digital data. Both CCD and CMOS systems operate utilizing poly-silicon gates, and each has its advantages and disadvantages. Hybrid image sensors, scientific CMOS (sCMOS), are available that combine a CCD imaging substrate with a CMOS readout integrated circuit (ROIC).
- By knowing the viewing distance between a color pixel and an object, it is possible to compile readings from a plurality of color pixels along with a distance measurement for each color pixel and thereby create a 3D image. Color determination can be affected by the distance between a color pixel and the object in question. Further, distance can affect one particular portion of the electromagnetic spectrum differently than another portion. For example, color readings made from the red light portion (e.g., about 650 nm) may be more affected by distance than readings made from the blue light portion (e.g., about 475 nm). Accordingly, for color correction of an image, differing degrees of color correction may be required for readings taken from different portions of the electromagnetic spectrum.
- The following discloses a simplified summary of the specification in order to provide a basic understanding of some aspects of the specification. This summary is not an extensive overview of the specification. It is intended to neither identify key or critical elements of the specification nor delineate the scope of the specification. Its sole purpose is to disclose some concepts of the specification in a simplified form as a prelude to the more detailed description that is disclosed later.
- With the disclosed aspects presented herein, information regarding coloration of an object can be color-corrected based upon the viewing distance between a color sensing device (e.g., a pixel in a photosensor) and the object being viewed. A Time of Flight (ToF) sensor/pixel is associated with a pixel (e.g., a color pixel) receiving light from the object. Phase shift analysis of electromagnetic radiation received at the ToF sensor is performed and, accordingly, the distance from the ToF sensor (and accordingly the distance of the associated color pixel) to the object can be determined. By knowing the distance from the color pixel to the object, a color value generated by the color pixel can be corrected in accordance with the measured distance. Effectively, the color pixel is calibrated based upon the measured distance. A radiation source for the ToF sensor can be electromagnetic radiation, where, for example, the electromagnetic radiation can be from the infrared or visible light portions of the electromagnetic radiation spectrum.
- A plurality of color pixel and ToF pixel arrangements are available. Color pixels and ToF pixels can be incorporated into the same image sensor. A chip can be manufactured comprising pixel groupings where pixels are created to perform color sensing (e.g., red, green, blue (RGB)) and other pixels are ToF pixels created to perform distance measuring. Alternatively, a plurality of chips can be manufactured whereby a number of the chips are color sensing chips while the other chips are manufactured as ToF sensors. Each pixel on a color sensing chip is associated with a pixel on a ToF sensor. In another aspect, color and distance sensing components can be combined on a single integrated circuit along with means for processing readings received from the color and distance sensing components.
- In one aspect, a common radiation source can be employed to generate illumination for the color sensor as well as allowing distance measurements to be performed by the ToF sensor. In another aspect, the radiation source employed to perform ToF measurements is of a different part of the electromagnetic spectrum compared with a radiation source employed to illuminate the object and color sensing by a color pixel.
- A ToF sensor operating in grayscale and producing grayscale images can be altered to produce color images, whereby such alteration can be performed using color filters which limit the wavelengths of radiation incident on the ToF sensor and its constituent ToF pixel(s).
- In another aspect, a single LED can be employed to illuminate an object and facilitate distance measurements, e.g., the LED emits white light. In another aspect, a plurality of LEDs can be employed, where each LED emits electromagnetic radiation of a particular wavelength. In a further aspect, the radiation generated by the plurality of LEDs can be combined to produce illumination having a plurality of wavelengths, e.g., red, blue, and green LEDs combined to form white light illumination.
- In an aspect, calibration of a color pixel can be performed by employing look-up tables and/or algorithms which provide correlations between color, distance and color correction.
- The following description and the annexed drawings set forth certain illustrative aspects of the specification. These aspects are indicative, however, of but a few of the various ways in which the principles of the specification can be employed. Other advantages and novel features of the specification will become apparent from the following detailed description of the specification when considered in conjunction with the drawings.
- FIG. 1 illustrates a system for determining color-correction based upon object distance, in accordance with an aspect.
- FIG. 2 illustrates a measurement process employing electromagnetic radiation to determine position of an object, in accordance with an aspect.
- FIG. 3 illustrates a system for color correcting an image based upon distance measurements provided by associated ToF sensor(s), in accordance with an aspect.
- FIG. 4 illustrates an arrangement of pixels for color correcting an image based upon distance measurements provided by associated ToF sensor(s), in accordance with an aspect.
- FIG. 5 illustrates an arrangement of pixels and associated ToF pixel(s) for color correcting an image based upon distance measurements provided by associated ToF sensor(s), in accordance with an aspect.
- FIG. 6 illustrates a plurality of color sensor and associated ToF pixel(s) arrangements for color correcting an image based upon distance measurements provided by the associated ToF pixel(s), in accordance with an aspect.
- FIG. 7 illustrates an arrangement of pixels and associated ToF pixel(s) for color correcting an image based upon distance measurements provided by the associated ToF pixel(s), in accordance with an aspect.
- FIG. 8 illustrates a system for color image generation with a ToF sensor, in accordance with an aspect.
- FIG. 9 illustrates a system for determining color-correction based upon object distance, in accordance with an aspect.
- FIG. 10 illustrates a system for determining color-correction based upon object distance, in accordance with an aspect.
- FIG. 11 illustrates an example methodology for determining color-correction based upon object distance, in accordance with an aspect.
- FIG. 12 illustrates an example methodology for identifying whether a ToF pixel is inoperable, in accordance with an aspect.
- FIG. 13 illustrates an example methodology for generating a color image using a ToF sensor, in accordance with an aspect.
- The claimed subject matter is now described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the claimed subject matter. It can be evident, however, that the claimed subject matter can be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to facilitate describing the claimed subject matter.
- As used in this application, the terms “component,” “module,” “system,” “interface,” or the like are generally intended to refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution. For example, a component can be, but is not limited to being, a process running on a processor, a microprocessor, a microcontroller, a chip, an integrated circuit, an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a controller and the controller can be a component. One or more components can reside within a process and/or thread of execution and a component can be localized on one computer and/or distributed between two or more computers. As another example, an interface can include I/O components as well as associated processor, application, and/or API components.
- FIG. 1 illustrates system 100 for determining viewing distance to an object to facilitate sensor calibration and color sensing. An image sensor 100 comprises a Time of Flight (ToF) sensor 110 and a color sensor 120. The image sensor 100 receives radiation from respective ranges of the electromagnetic spectrum to facilitate operation of ToF sensor 110 and color sensor 120. A ToF radiation source 140 provides electromagnetic energy to facilitate operation of the ToF sensor 110, while a radiation source 150 provides electromagnetic energy to facilitate operation of the color sensor 120. Electromagnetic energy emitted from sources 140 and 150 is reflected from object 130 and respectively captured by ToF sensor 110 and color sensor 120. Controller 160 can control operation of the various components comprising system 100 (e.g., ToF sensor 110, color sensor 120, etc.) either on an individual basis or as a combination of components, thereby enabling determination of distance d between object 130 and ToF sensor 110, and correction (as required) of the received color of object 130. Based upon measurements of distance d, controller 160 can make a determination regarding any required calibration of color sensor 120 and, accordingly, "correction" of the received color of object 130. Such automatic calibration of color sensor 120 can negate the need for labor-intensive calibration procedures which may be required to ensure operation of a color sensor that does not adjust readings on account of sensor to object distance.
- Further, the viewing distance of the
color sensor 120, and any pixels it comprises, can be automatically adjusted based upon the refresh rate of theToF sensor 110, and any ToF pixels comprisingToF sensor 110. Readings obtained fromcolor sensor 120 andToF sensor 110 can be utilized bycontroller 160 to generate a color corrected output (e.g., an image) which can be forwarded to anexternal device 170. The corrected output can include both color information and distance information for respective pixels thereby facilitating generation of a 3D image.External device 170 can provide means for display of an image generated from information received from thecolor sensor 120, from theToF sensor 110, and any color correction effected bycontroller 160.External device 170 can further comprise means for data processing, where information associated with the color correction process (e.g., readings obtained byimage sensor 100, T ofsensor 110,color sensor 120, etc.) can be provided to the external device for subsequent processing. - Further, to take into account the environmental circumstances in which
system 100 is operating, an environmental sensor 180 can be utilized to obtain information regarding the environment of operation. For example, environmental sensor 180 can detect that the environment of operation has been negatively affected by atmospheric particulates (e.g., smoke) and that the output of radiation source 150 or ToF radiation source 140 requires corresponding adjustment, e.g., by controller 160.
- It is to be appreciated that while
ToF sensor 110 and color sensor 120 are shown as being incorporated into a common image sensor 100 component, the invention is not so limited: ToF sensor 110 and color sensor 120 can be combined to form a common image sensor 110, or ToF sensor 110 and color sensor 120 can operate in isolation. Embodiments of such combination/isolation are presented infra. It is to be further appreciated that while various components comprising system 100 are shown as individual devices (e.g., image sensor 100, ToF sensor 110, color sensor 120, ToF radiation source 140, radiation source 150, controller 160, external device 170, and storage 910 (ref. FIG. 9), etc.), the various aspects, as presented herein, are not so limited, and one or more components comprising system 100 can be housed in a common housing, e.g., image sensor 100, ToF sensor 110, color sensor 120, ToF radiation source 140, controller 160, and storage 710 are integrated into a single housing or a single chip, microcontroller, integrated circuit, and the like. Further, while external device 170 is shown as being external to controller 160, the two components can be combined in a common housing, where, for example, the external device 170 is a display device facilitating display of a color corrected image, color data, distance data, etc., as generated by controller 160 and any associated components.
- It is to be appreciated that while
ToF radiation source 140 and radiation source 150 may utilize radiation from different portions of the electromagnetic spectrum, e.g., ToF radiation source 140 utilizes IR radiation and radiation source 150 utilizes visible light, a common radiation can be employed with the various aspects presented herein. For example, ToF radiation source 140 and radiation source 150 can both employ radiation having a common wavelength, such as visible red light. A radiation source 150 can employ red visible light to illuminate the object, and the ToF radiation source 140 can utilize red visible light from which phase shift measurements and, accordingly, distance measurements can be determined, as described below. Accordingly, in another aspect, an image generated using the common radiation source can be monochromatic. Further, in another aspect, the ToF radiation source 140 and/or radiation source 150 can contain a combined light operating concurrently (e.g., red, green, and blue LEDs combining to produce white light) which is modulated and utilized for distance measurement (e.g., by ToF sensor 110) and color sensing (e.g., by color sensor 120) simultaneously. In a further aspect, ToF radiation source 140 and/or radiation source 150 can be a laser, or similar mechanism for emitting electromagnetic radiation, wherein ToF sensor 110 and/or color sensor 120 have suitable functionality to perform respective distance measurement/image generation using a laser source.
- For the purpose of understanding the various aspects presented herein, operation of the
ToF sensor 110 is now briefly described. ToF sensor 110 has been developed for application in the field of distance measurement to facilitate such operations as detection of persons, objects, obstacles, etc., as well as measuring position, distance, dimensions, etc. As mentioned above, a ToF sensor 110 operates in conjunction with a ToF radiation source 140. ToF radiation source 140 generates radiation of a specific part of the electromagnetic spectrum and, based upon the wavelength of the radiation, the distance an object 130 resides from the ToF radiation source 140 and a ToF sensor 110 can be determined. In one aspect, the ToF radiation source 140 emits light from the infrared (IR) portion of the electromagnetic spectrum. With reference to FIG. 2, ToF technology is based upon determining a phase shift Δφ between a signal emitted from the ToF radiation source 140 and a signal reflected from object 130 received at the ToF sensor 110. From the determined phase shift Δφ, the distance d between object 130 and ToF sensor 110 can be determined per Equation A, d = (c·Δφ)/(4π·f_m), where d = distance, c = speed of light, f_m = modulation frequency (which is a function of the radiated power of an emitter), and Δφ = arctan((A3−A1)/(A0−A2)), where A0, A1, A2, and A3 are samples taken at fixed durations of time. It is to be appreciated that while FIG. 2 presents ToF technology based upon determination of phase shift Δφ, other principles of ToF technologies are available, for example, pulse metering where a pulse is transmitted and the time-to-receive is measured.
- As previously mentioned,
ToF sensor 110 and color sensor 120 can operate in isolation or in combination. Various embodiments of such operations are presented. In one aspect, a ToF sensor 110 can be combined with a color sensor 120 in the same semiconductor chip. With reference to FIG. 3, ToF pixels can be combined with color sensor pixels to form an image sensor 310 having distance measuring functionality. Each pixel in a cluster of pixels (e.g., the group of four pixels shown in FIG. 3) has a defined role: pixels 320, 330, and 340 perform color sensing, while pixel 350 is employed to operate as a ToF sensor. For example, in a chip comprising 24×100 pixels, the chip can be divided into four groups of 12×50 pixels having respective R, G, B color sensing and ToF determining functionality.
- It is to be appreciated that while
FIG. 3 shows pixel clustering having an RGB-d arrangement (where d is equivalent to the ToF pixel), the various aspects are not so limited, whereby any clustering of pixels can be utilized to facilitate suitable image capture by pixels performing capture of visible light along with requisite pixels having ToF capability. FIG. 4 illustrates an image sensor 410 comprising pixels 420-450 having a hexagonal profile and arranged in a honeycomb layout, where pixels 420, 430, and 440 perform color sensing. Pixel 450 is employed to operate as a ToF sensor.
-
FIG. 5 presents system 500 comprising a color sensor 510 constructed separately from, but operating in conjunction with, a ToF sensor 520. The pixels of color sensor 510 have been arranged in a Bayer GRGB layout, where each pixel cluster operates in conjunction with an associated ToF sensor pixel. For example, the cluster of pixels G1R1G1B1 can be color corrected based upon distance measurements obtained from ToF1 in the ToF sensor, pixel cluster G2R2G2B2 can be color corrected based upon ToF2, and pixel cluster G3R3G3B3 can be color corrected based upon ToF3, etc.
- Further, as shown in
FIG. 5, the size of a ToF pixel and a color pixel do not have to be the same, and accordingly, the ToF pixel and a color pixel do not have to have the same resolution. For image quality, typically the smaller the color pixels comprising color sensor 510 the better; but given that the ToF pixels are being employed to calibrate the color pixels (and readings from the color pixels), the size of the respective ToF pixels is not as critical as that of the color pixels, as the ToF pixels may play no part in the actual image construction and resolution.
-
FIG. 6, system 600, presents a plurality of color sensor and ToF pixel arrangements where a high resolution color sensor 610 comprises a plurality of color RGB pixels 620. The high resolution color sensor 610 can be coupled to a low resolution ToF chip 630. As shown in FIG. 6, the low resolution ToF chip can comprise one or more ToF pixels 640. In one aspect, a high resolution color sensor 610 (comprising a plurality of color RGB pixels 620) can be coupled to a low resolution ToF chip 630 comprising a single ToF pixel 640 (as shown in FIG. 6, lower image). In another aspect, a high resolution color sensor 610 (comprising a plurality of color RGB pixels 620) can be coupled to a low resolution ToF chip 630 comprising a plurality of ToF pixels 640. As shown in the upper two embodiments (FIG. 6, middle and upper images), the high resolution color sensor 610 can be broken up into groupings of color RGB pixels 620, and distance determinations performed with respect to an associated ToF pixel 640. The various embodiments shown in FIG. 6 are suitable for measuring the distance of approximately planar surfaces, as well as finding application in low cost system(s).
-
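The association between a high-resolution color sensor and a low-resolution ToF chip can be sketched as a coordinate mapping. The function below is an illustrative assumption about how such an association might be computed, not a mechanism recited in the specification.

```python
def tof_pixel_for(color_x, color_y, color_w, color_h, tof_w, tof_h):
    """Map a pixel on a high-resolution color sensor to its associated
    pixel on a low-resolution ToF chip by scaling coordinates. With a
    single-ToF-pixel chip (tof_w == tof_h == 1), every color pixel maps
    to that one pixel, matching the lower arrangement in FIG. 6."""
    tx = color_x * tof_w // color_w   # integer scaling keeps the index
    ty = color_y * tof_h // color_h   # inside the ToF grid
    return tx, ty
```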
FIG. 7 presents system 700 comprising color sensor 710 constructed separately from, but operating in conjunction with, ToF sensor 720. The hexagonal pixels of color sensor 710 have been arranged in a honeycomb layout, where each pixel operates in conjunction with an associated ToF sensor pixel. For example, pixel Rx can be color corrected based upon distance measurements obtained from ToFm in the ToF sensor, with Gx associated with ToFn, Gy associated with ToFo, etc.
- In an alternative aspect, rather than having pixels which operate in specific portions of the visible electromagnetic spectrum (e.g., R, G, or B), pixels operating across the visible portion of the electromagnetic spectrum can be employed. Alternatively, grayscale pixels can be employed, where the pixel responds to the amount of incident light; for example, as the amount of incident light increases, the output voltage of the pixel increases. Further, in another embodiment, a filter can be placed over grayscale pixels, whereby the filter is employed to filter respective portions of the electromagnetic spectrum while allowing a particular wavelength, plurality of wavelengths, or a range of wavelengths to pass through to the underlying grayscale pixels. Accordingly, where
radiation source 150 comprises electromagnetic radiation having a range of wavelengths, e.g., visible light with wavelengths of about 390 to 750 nm, a filter can be employed to restrict the range of wavelengths striking the color sensor, e.g., the filter is a "green light" filter and allows light having a wavelength of about 550 nm to pass through to the light sensor. - Current ToF sensors can operate using grayscale pixels. By employing color filters it is possible to create a ToF sensor having color sensing capabilities, thereby allowing a grayscale ToF sensor to generate 3D color images.
FIG. 8 presents system 800 illustrating color filters being employed on a ToF sensor. Grayscale ToF sensor 810 can be divided into four regions; the first region functions as a ToF sensor 820 and is employed to determine a distance D from the ToF sensor 820 to an object (e.g., object 130). The second, third and fourth regions of grayscale ToF sensor 810 can be respectively covered with a red filter 830, green filter 840, or blue filter 850. Object 130 can be illuminated with a white-light source (e.g., a light emitting diode (LED) 860 producing white light), with the respective filters (e.g., filters 830, 840, 850) determining which portion of the reflected light reaches each region of grayscale ToF sensor 810. With the known distance measurement (as measured by ToF sensor 820), any required sensor calibration/adjustment based on the distance measurement, and the (as required) corrected color readings from the respective red, green and blue portions (e.g., 830, 840, 850) of the grayscale ToF sensor 810, a 3D color image can be compiled. Further, a white-light LED (e.g., LED 860) can be operating with a modulation suitable for distance measurement, e.g., about 20 MHz, or in an alternative embodiment illumination can be provided by red, green and blue LEDs 870, illuminated in sequence with a frequency modulation of about 20 MHz. - In a further aspect, rather than the ToF sensor being responsive to electromagnetic radiation having an infrared wavelength, the ToF sensor can be responsive to one or more wavelengths in the visible portion of the electromagnetic spectrum. For example, with reference to
FIG. 1, the ToF radiation source 140 can emit light from the red portion of the visible spectrum (e.g., light with a wavelength of about 650 nm). The red light would be received by imaging pixels (e.g., color sensor 120) responsive to light having a red wavelength, and the ToF sensor 110 can operate (as shown in FIG. 2) by determining the position of object 130 in accordance with the phase shift of the received red light. - In another embodiment, the effect of variation of an object surface can be compensated for. Rather than assuming an object surface is planar and oriented perpendicular to the image sensor, a ToF sensor can be employed to determine the viewing distance, and an associated color pixel can be "calibrated" in accordance with the measured viewing distance. As described above, the color pixel can be calibrated based upon the distance measured by the ToF sensor, and any measurements made by the color sensor can be adjusted in accord with the calibration adjustment. The viewing distance of the color pixel can be automatically adjusted based upon the refresh rate of the ToF sensor.
- It is to be appreciated that while the various aspects presented herein present color sensors comprising an RGB layout and a Bayer layout, the various aspects are not so limited and any suitable color sensor layout can be utilized. For example, the RGB and Bayer layouts can be further complemented by red-green-blue-emerald (RGBE), cyan-yellow-yellow-magenta (CYYM), cyan-yellow-green-magenta (CYGM), and the like.
- Turning to
FIG. 9, system 900 is illustrated and comprises various components previously discussed: image sensor 100 comprising ToF sensor 110 and color sensor 120, ToF radiation source 140, coupled to controller 160, which is further coupled to external device 170, etc. Further, system 900 includes a storage device 910 on which are stored one or more look-up tables 920 and/or algorithms 930. A look-up table 920 can provide a correlation between a distance measurement and a degree of color correction to be applied to a reading received from color sensor 120, based upon a distance measurement received from ToF sensor 110. Similarly, a reading received from color sensor 120 can be "calibrated" with an algorithm 930 utilizing a distance measurement received from ToF sensor 110. -
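A minimal sketch of how a look-up table such as 920 might be consulted follows; the distance bands, gain values, and function name are illustrative assumptions, as the source does not specify the table's contents.

```python
# Hypothetical look-up table correlating a distance measurement (from ToF
# sensor 110) with a correction gain applied to a color sensor 120 reading.
# Band boundaries and gain values are illustrative assumptions only.

DISTANCE_BANDS_M = [(0.0, 0.5), (0.5, 1.0), (1.0, 2.0), (2.0, float("inf"))]
GAINS = [1.00, 1.05, 1.12, 1.25]

def correct_reading(raw_value, distance_m):
    """Scale a raw color-pixel reading by the gain for its distance band."""
    for (lo, hi), gain in zip(DISTANCE_BANDS_M, GAINS):
        if lo <= distance_m < hi:
            return raw_value * gain
    raise ValueError("distance must be non-negative")

assert correct_reading(100, 0.25) == 100.0
```

An algorithm 930 would play the same role but compute the gain from a formula rather than a banded table.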
FIG. 10 illustrates system 1000 depicting a microcontroller/integrated circuit according to an aspect. System 1000 comprises a microcontroller/integrated circuit 1010 on which a processing unit 1050 is combined with an image sensor 1020 comprising a ToF sensor 1030 and a color sensor 1040. As described previously, rather than various components being separately located (e.g., with reference to FIG. 1, image sensor 100 operating as a separate device to controller 160), the various components can be combined on a single device, e.g., microcontroller/integrated circuit 1010. Microcontroller/integrated circuit 1010 can be designed such that an image sensor 1020, comprising ToF sensor 1030 and color sensor 1040, resides on the same chip as an associated processing unit 1050. It is to be appreciated that image sensor 1020, ToF sensor 1030, color sensor 1040, and processing unit 1050 have the same functionality as that described earlier with reference to FIG. 1 and the respective components image sensor 100, ToF sensor 110, color sensor 120, and controller 160. Further, while not shown, it is to be appreciated that ToF radiation source 140, radiation source 150, external device 170, and/or environmental sensor 180 can also be located on the microcontroller/integrated circuit 1010. It is to be further appreciated that ToF sensor 1030 can be a color or grayscale ToF sensor. -
FIG. 11 illustrates an example methodology 1100 for determining color correction based upon viewing distance. At 1110, light reflected from an object, e.g., object 130, is captured by one or more color pixels (e.g., color pixels 320-340, 420-440, G1R1G1B1, Rx-zGx-zBx, 1040, etc.) comprising one or more of the image sensors and color sensors described previously. - At 1120 each color pixel has an associated ToF pixel (e.g.,
ToF pixels described previously with reference to image sensor 100 and the associated sensors). - At 1130 the distance from a particular ToF pixel to the object surface from which the ToF radiation was received is determined (e.g., by
controller 160, processing unit 1050, etc.). In one aspect, a ToF sensor determines distance based upon a phase shift between the radiation transmitted from a radiation source to an object and the radiation received at a ToF pixel as reflected from the object. A phase shift of up to 360° can be determined, where 360° represents one wavelength of the radiation, 180° equals ½ wavelength, etc. Accordingly, by knowing the phase shift, the wavelength portion can be calculated and a distance with respect to the wavelength portion can be determined. For an accurate determination to be obtained, the distance from a color pixel to the object should be about equal to the respective distance of a ToF pixel to the object (or a known distance ratio, e.g., 1:1, 1:2, 1:5, etc.), where the ToF pixel is being utilized to determine the distance of the particular color pixel to the object. In effect, the ToF pixel is being employed to determine a corresponding distance from a color pixel to an object to facilitate calibration of the color pixel. - At 1140, based upon the determined distance between the ToF pixel and the object, and accordingly the respective distance of the color pixel associated with the ToF pixel, the color reading obtained by the color pixel can be corrected in accordance with the determined distance (e.g., by controller 160). Any suitable method can be employed to perform the color correction. Suitable methods include, but are not limited to, employing a look-up table (e.g., look-up table 920) comprising correlations between color reading(s), object-to-pixel distance(s), and correction factor(s) based thereon. Alternatively, an algorithm (e.g., algorithm 930) can be utilized into which a color reading and associated distance measurement are entered, correction factor(s) determined, and a corrected color value generated.
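The phase-shift distance determination described at 1130 can be sketched numerically. The round-trip factor of two and the speed of light are standard physics; the function name is an assumption, and the 20 MHz figure is taken from the text's own examples.

```python
# Sketch of continuous-wave ToF ranging: the measured phase shift (0-360
# degrees) is a fraction of the modulation wavelength, and the light covers
# the sensor-to-object distance twice (out and back), hence the factor of 2.

C = 299_792_458.0  # speed of light, m/s

def distance_from_phase(phase_deg, mod_freq_hz):
    """Distance implied by a phase shift at a given modulation frequency."""
    wavelength_m = C / mod_freq_hz
    return (phase_deg / 360.0) * wavelength_m / 2.0  # halve for round trip

# At the ~20 MHz modulation mentioned in the text, 360 degrees of phase
# corresponds to an unambiguous range of roughly 7.5 m.
assert abs(distance_from_phase(360.0, 20e6) - 7.495) < 0.01
```

Phase wraps past 360° are ambiguous, which is why the modulation frequency bounds the maximum measurable distance.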
- At 1150 one or more color-corrected values can be outputted. The color-corrected values can be outputted to an external device (e.g., external device 170) for presentation, e.g., where the external device comprises means for display and the color-corrected image can be presented thereon. Alternatively, the external device can comprise output means such as a printer, and the color-corrected image is generated in hard format. Further, the external device can comprise further means for data processing, where information associated with the color correction process (e.g., readings obtained by color pixels 320-340, 420-440, G1R1G1B1, Rx-zGx-zBx, 1040, etc.; readings obtained by
image sensor 100 and the ToF sensors; operating settings of ToF radiation source 140; operating settings of radiation source 150, etc.) can be provided to the external device for subsequent processing. - It is to be appreciated that while
methodology 1100 shows the process 1110 of capturing light reflected from an object, e.g., object 130, by one or more color pixels as an operation preceding that of 1120, where radiation is received at one or more ToF pixels, and 1130, determining the object distance based upon the ToF data, the various aspects presented herein are not so limited: the operation of determining the object distance can be performed prior to capturing light by a color pixel, with the operation 1140 of color correction performed thereafter. - It is to be further appreciated that while
methodology 1100 discloses object-to-sensor distance being determined based upon determination of phase shift, other ToF principles are available, for example, pulse metering, where a pulse is transmitted and the time to receive its reflection is measured. -
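The pulse-metering alternative mentioned above can be sketched in a few lines; the 10 ns example value is an assumption for illustration.

```python
# Sketch of pulse-metering ToF: a pulse is emitted, the time until its
# reflection returns is measured, and distance is c * t / 2 (the pulse
# travels to the object and back).

C = 299_792_458.0  # speed of light, m/s

def distance_from_pulse(round_trip_s):
    """Distance from the measured emission-to-return time of a pulse."""
    return C * round_trip_s / 2.0

# A 10 ns round trip corresponds to roughly 1.5 m.
assert abs(distance_from_pulse(10e-9) - 1.499) < 0.001
```

Unlike phase-shift measurement, pulse metering has no 360° wrap ambiguity, at the cost of requiring very fine timing resolution.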
FIG. 12 illustrates an example methodology 1200 for correcting operation of a color sensing pixel where an associated ToF pixel is erroneous or inoperable. At 1210, during calibration of one or more color sensing pixels (e.g., color pixels 320-340, 420-440, G1R1G1B1, Rx-zGx-zBx, etc.) which comprise an image sensor, a first ToF pixel (e.g., FIG. 5, ToF1) indicates an object is 7 m away, while a second ToF pixel (e.g., FIG. 5, ToF2) indicates the object to be 1.5 m away. It is known that the object has a planar surface and is aligned perpendicular to the sightline of the ToF sensor. Accordingly, it is possible to identify which ToF pixel is erroneous by comparing the readings of adjacent ToF pixels (e.g., FIG. 5, ToF3 or ToF4). Alternatively, during an attempt to determine a degree with which a color sensing pixel is to be calibrated, during execution of the calibration determination (e.g., by controller 160), no readings are available from the ToF pixel associated with the color sensing pixel. Based on the above, it is considered that the ToF pixel is either providing erroneous values or is inoperable. - At 1220 a determination is made to identify an alternative ToF pixel which can provide readings to be employed in calibrating the subject color pixel. In one aspect where a ToF pixel is inoperable/erroneous (e.g.,
FIG. 5, ToF1 is inoperable), readings are taken from an adjacent ToF pixel (e.g., FIG. 5, ToF2, ToF3 or ToF4) which is performing correctly. - At 1230 the distance from the alternative ToF pixel to the object surface is determined (e.g., by
controller 160, processing unit 1050). - At 1240 the alternative distance measurement is applied to the color sensing pixel. For example, the alternative distance measurement is referenced when determining a correction value from a look-up table (e.g., look-up table 920) associated with the color reading obtained from the subject color sensing pixel. In another example, the alternative distance measurement can be utilized when applying a calibration algorithm (e.g., algorithm 930) to calibrate the color reading obtained from the subject color sensing pixel.
- At 1250 color-corrected (e.g., calibrated) values for the subject color sensing pixel are outputted.
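Methodology 1200 can be sketched as a neighbour-consistency check over ToF readings. The tolerance value and the median-based comparison are assumptions, since the source only states that adjacent pixel readings are compared on a known planar, perpendicular target.

```python
# Hypothetical sketch of methodology 1200: if a ToF pixel's reading is
# missing (None) or disagrees with its neighbours on a known planar,
# perpendicular target, substitute an adjacent pixel's reading.

def reading_for_pixel(readings, idx, tolerance_m=0.5):
    """Return readings[idx] if present and consistent with neighbours,
    otherwise the median of the valid neighbouring readings."""
    neighbours = sorted(r for i, r in enumerate(readings)
                        if i != idx and r is not None)
    median = neighbours[len(neighbours) // 2]
    own = readings[idx]
    if own is not None and abs(own - median) <= tolerance_m:
        return own
    return median  # alternative ToF pixel reading (steps 1220-1240)

# ToF1 reports 7 m while ToF2-ToF4 agree on about 1.5 m: ToF1 is rejected.
assert reading_for_pixel([7.0, 1.5, 1.4, 1.6], 0) == 1.5
```

The substituted reading then feeds the same look-up table or algorithm used at step 1240.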
-
FIG. 13 illustrates an example methodology 1300 for generating a color corrected image from a grayscale ToF sensor. At 1310 a grayscale ToF sensor (e.g., ToF sensor 810) is divided into respective portions for performing viewing distance determination and color sensing. A ToF sensor, for viewing distance determination, may only operate in the infrared portion of the electromagnetic spectrum (as discussed supra). Further, as discussed above, a ToF sensor may only produce a grayscale 3D image. However, by employing color filters and appropriate illumination, a 3D color image can be produced. To facilitate distance measurement, a portion of a ToF sensor is maintained as a pixel region for measuring distance (e.g., ToF-d 820). Other portions of the ToF sensor can be covered with filters (e.g., red filter 830, green filter 840, blue filter 850) to allow a desired portion of visible light to pass through to the ToF sensor pixel array below. - At 1320 an object (e.g., object 130) is illuminated. Illumination can be facilitated by any suitable illumination source (e.g.,
ToF radiation source 140, radiation source 150), where such illumination source can be an LED device, laser, etc. The LED device can comprise a single LED emitting light across the visible portion of the electromagnetic spectrum (e.g., LED 860), where the LED device can operate with a frequency modulation suitable for ToF determination, e.g., about 20 MHz. In an alternative embodiment the LED device can comprise a plurality of LEDs, where each LED emits light from a different portion of the electromagnetic spectrum (e.g., red, green, and blue LEDs 870). The plurality of LEDs can operate in sequence and with the frequency modulation required for distance measurement, e.g., about 20 MHz. - At 1330, the distance of the object is determined based upon measurements obtained from the portion of the ToF sensor employed to facilitate distance measurement.
- At 1340, based upon the determined object distance, the respective portions of the ToF sensor being employed to receive light reflected from the object, and thereby create a 3D image of the object, are calibrated. In accordance with the determined calibration(s) for each color sensing portion of the ToF sensor, the actual measurements received from each color sensing portion are adjusted. It is to be appreciated that the degree of calibration required for one color sensing region may not equal the degree of calibration required for another color sensing region. For example, the degree of color correction required to correct a red light reading may be greater than that required for a blue light reading, when both readings are made over the same distance.
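The point that different channels may need different degrees of correction at the same distance can be illustrated with assumed per-channel coefficients; the linear gain model and every value below are hypothetical, not taken from the source.

```python
# Hypothetical per-channel correction: the gain applied at a given distance
# differs between colour-sensing regions; here the assumed red coefficient
# grows faster with distance than the blue one.

CHANNEL_COEFF_PER_M = {"red": 0.10, "green": 0.06, "blue": 0.04}  # assumed

def channel_gain(channel, distance_m):
    """Distance-dependent gain for one colour-sensing region of the sensor."""
    return 1.0 + CHANNEL_COEFF_PER_M[channel] * distance_m

# Over the same 2 m path, the red reading needs more correction than the blue.
assert channel_gain("red", 2.0) > channel_gain("blue", 2.0)
```

In practice the coefficients would come from calibration data stored in a look-up table or algorithm such as 920/930.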
- At 1350 one or more color-corrected values can be combined and outputted. Values taken from each of the red, green, and blue light sensing portions of the ToF sensor can be combined to create a color image. Owing to the respective readings being accompanied by distance measurements, it is possible to create a 3D color image. The color-corrected values can be output to an external device (e.g., external device 170) for presentation, printing, further data processing, etc.
- For purposes of simplicity of explanation, methodologies that can be implemented in accordance with the various aspects disclosed herein were shown and described as a series of blocks. However, it is to be understood and appreciated that the various aspects disclosed herein are not limited by the order of the blocks, as some blocks can occur in different orders and/or concurrently with other blocks from what is depicted and described herein. Moreover, not all illustrated blocks are required to implement the methodologies described supra. Additionally, it should be further appreciated that the methodologies disclosed throughout this specification are capable of being stored on an article of manufacture to facilitate transporting and transferring such methodologies to computers. The term article of manufacture, as used, is intended to encompass a computer program accessible from any computer-readable device, carrier, or media.
- The aforementioned systems have been described with respect to interaction among several components. It should be appreciated that such systems and components can include those components or sub-components specified therein, some of the specified components or sub-components, and/or additional components. Sub-components can also be implemented as components communicatively coupled to other components rather than included within parent components. Additionally, it should be noted that one or more components could be combined into a single component providing aggregate functionality. The components could also interact with one or more other components not specifically described herein but known by those of skill in the art.
- Furthermore, the various aspects as presented herein can be implemented as a method, apparatus, or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof to control a computer to implement the various disclosed aspects. The term “article of manufacture” as used herein is intended to encompass a computer program accessible from any computer-readable device, carrier, or media. For example, computer readable media can include but are not limited to magnetic storage devices (e.g., hard disk, floppy disk, magnetic strips . . . ), optical disks (e.g., compact disk (CD), digital versatile disk (DVD) . . . ), smart cards, and flash memory devices (e.g., card, stick, key drive . . . ). Additionally it should be appreciated that a carrier wave can be employed to carry computer-readable electronic data such as those used in transmitting and receiving electronic mail or in accessing a network such as the Internet or a local area network (LAN). Of course, those skilled in the art will recognize many modifications can be made to this configuration without departing from the scope or spirit of the claimed subject matter.
- Moreover, the word “exemplary” is used herein to mean serving as an example, instance, or illustration. Any aspect or design described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects or designs. Rather, use of the word exemplary is intended to disclose concepts in a concrete fashion. As used in this application, the term “or” is intended to mean an inclusive “or” rather than an exclusive “or”. That is, unless specified otherwise, or clear from context, “X employs A or B” is intended to mean any of the natural inclusive permutations. That is, if X employs A; X employs B; or X employs both A and B, then “X employs A or B” is satisfied under any of the foregoing instances. In addition, the articles “a” and “an” as used in this application and the appended claims should generally be construed to mean “one or more” unless specified otherwise or clear from context to be directed to a singular form.
- What has been described above includes examples of the subject specification. It is, of course, not possible to describe every conceivable combination of components or methodologies for purposes of describing the subject specification, but one of ordinary skill in the art can recognize that many further combinations and permutations of the subject specification are possible. Accordingly, the subject specification is intended to embrace all such alterations, modifications and variations that fall within the spirit and scope of the appended claims. Furthermore, to the extent that the term “includes” is used in either the detailed description or the claims, such term is intended to be inclusive in a manner similar to the term “comprising” as “comprising” is interpreted when employed as a transitional word in a claim.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/973,692 US20130335725A1 (en) | 2010-11-03 | 2013-08-22 | Color sensor insensitive to distance variations |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/938,499 US8542348B2 (en) | 2010-11-03 | 2010-11-03 | Color sensor insensitive to distance variations |
US13/973,692 US20130335725A1 (en) | 2010-11-03 | 2013-08-22 | Color sensor insensitive to distance variations |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/938,499 Continuation US8542348B2 (en) | 2010-11-03 | 2010-11-03 | Color sensor insensitive to distance variations |
Publications (1)
Publication Number | Publication Date |
---|---|
US20130335725A1 true US20130335725A1 (en) | 2013-12-19 |
Family
ID=45092192
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/938,499 Active 2031-01-29 US8542348B2 (en) | 2010-11-03 | 2010-11-03 | Color sensor insensitive to distance variations |
US13/973,692 Abandoned US20130335725A1 (en) | 2010-11-03 | 2013-08-22 | Color sensor insensitive to distance variations |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/938,499 Active 2031-01-29 US8542348B2 (en) | 2010-11-03 | 2010-11-03 | Color sensor insensitive to distance variations |
Country Status (3)
Country | Link |
---|---|
US (2) | US8542348B2 (en) |
EP (1) | EP2451150B1 (en) |
CN (1) | CN102685402B (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130123015A1 (en) * | 2011-11-15 | 2013-05-16 | Samsung Electronics Co., Ltd. | Image sensor, operation method thereof and apparatuses including the same |
WO2016105656A1 (en) | 2014-12-22 | 2016-06-30 | Google Inc. | Image sensor and light source driver integrated in a same semiconductor package |
CN106231193A (en) * | 2016-08-05 | 2016-12-14 | 深圳市金立通信设备有限公司 | A kind of image processing method and terminal |
US9615013B2 (en) * | 2014-12-22 | 2017-04-04 | Google Inc. | Image sensor having multiple output ports |
DE102018108379A1 (en) * | 2018-04-09 | 2019-10-10 | pmdtechnologies ag | Transit Time pixels |
US10580807B2 (en) | 2017-10-24 | 2020-03-03 | Stmicroelectronics, Inc. | Color pixel and range pixel combination unit |
WO2020167682A1 (en) * | 2019-02-12 | 2020-08-20 | Viavi Solutions Inc. | Time of flight sensor device and method of use |
EP4009625A1 (en) * | 2020-12-03 | 2022-06-08 | Artilux Inc. | Multi-application optical sensing apparatus and method thereof |
Families Citing this family (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012049773A (en) * | 2010-08-26 | 2012-03-08 | Sony Corp | Imaging apparatus and method, and program |
US20130176550A1 (en) * | 2012-01-10 | 2013-07-11 | Ilia Ovsiannikov | Image sensor, image sensing method, and image photographing apparatus including the image sensor |
US9784577B2 (en) * | 2012-03-16 | 2017-10-10 | Lg Innotek Co., Ltd. | Measuring distance from object by using size of pattern projected onto object |
US9516248B2 (en) | 2013-03-15 | 2016-12-06 | Microsoft Technology Licensing, Llc | Photosensor having enhanced sensitivity |
DE102013103333A1 (en) * | 2013-04-03 | 2014-10-09 | Karl Storz Gmbh & Co. Kg | Camera for recording optical properties and room structure properties |
US9435641B2 (en) | 2013-06-20 | 2016-09-06 | Analog Devices, Inc. | Optical angle measurement |
US9274202B2 (en) | 2013-06-20 | 2016-03-01 | Analog Devices, Inc. | Optical time-of-flight system |
KR20150010230A (en) * | 2013-07-18 | 2015-01-28 | 삼성전자주식회사 | Method and apparatus for generating color image and depth image of an object using singular filter |
CN105308953A (en) * | 2013-07-19 | 2016-02-03 | 谷歌技术控股有限责任公司 | Asymmetric sensor array for capturing images |
US10591969B2 (en) | 2013-10-25 | 2020-03-17 | Google Technology Holdings LLC | Sensor-based near-field communication authentication |
KR102277309B1 (en) * | 2014-01-29 | 2021-07-14 | 엘지이노텍 주식회사 | Apparatus and method for extracting depth map |
JP6328965B2 (en) * | 2014-03-12 | 2018-05-23 | スタンレー電気株式会社 | Distance image generating apparatus and distance image generating method |
US9871065B2 (en) | 2014-12-22 | 2018-01-16 | Google Inc. | RGBZ pixel unit cell with first and second Z transfer gates |
US9591247B2 (en) | 2014-12-22 | 2017-03-07 | Google Inc. | Image sensor having an extended dynamic range upper limit |
US9425233B2 (en) | 2014-12-22 | 2016-08-23 | Google Inc. | RGBZ pixel cell unit for an RGBZ image sensor |
US9741755B2 (en) | 2014-12-22 | 2017-08-22 | Google Inc. | Physical layout and structure of RGBZ pixel cell unit for RGBZ image sensor |
US9508681B2 (en) | 2014-12-22 | 2016-11-29 | Google Inc. | Stacked semiconductor chip RGBZ sensor |
US20160182846A1 (en) | 2014-12-22 | 2016-06-23 | Google Inc. | Monolithically integrated rgb pixel array and z pixel array |
DE102015101902A1 (en) * | 2015-02-10 | 2016-08-11 | Osram Opto Semiconductors Gmbh | Detector and lidar system |
US10412363B1 (en) * | 2015-03-26 | 2019-09-10 | 3D Patents, Llc | Light-field display with micro-lens alignment adapted color and brightness |
CN107534764B (en) * | 2015-04-30 | 2020-03-17 | 深圳市大疆创新科技有限公司 | System and method for enhancing image resolution |
US11191489B2 (en) * | 2016-01-15 | 2021-12-07 | Koninklijke Philips N.V. | Device, system and method for generating a photoplethysmographic image carrying vital sign information of a subject |
US10321114B2 (en) * | 2016-08-04 | 2019-06-11 | Google Llc | Testing 3D imaging systems |
US10810753B2 (en) * | 2017-02-27 | 2020-10-20 | Microsoft Technology Licensing, Llc | Single-frequency time-of-flight depth computation using stereoscopic disambiguation |
US20180301484A1 (en) * | 2017-04-17 | 2018-10-18 | Semiconductor Components Industries, Llc | Image sensors with high dynamic range and autofocusing hexagonal pixels |
US10802117B2 (en) | 2018-01-24 | 2020-10-13 | Facebook Technologies, Llc | Systems and methods for optical demodulation in a depth-sensing device |
US10805594B2 (en) | 2018-02-08 | 2020-10-13 | Facebook Technologies, Llc | Systems and methods for enhanced depth sensor devices |
US10735640B2 (en) | 2018-02-08 | 2020-08-04 | Facebook Technologies, Llc | Systems and methods for enhanced optical sensor devices |
WO2019211485A1 (en) * | 2018-05-04 | 2019-11-07 | Peter Ehbets | Handheld non-contact multispectral measurement device with position correction |
CN113728246A (en) * | 2019-04-22 | 2021-11-30 | 株式会社小糸制作所 | ToF camera, vehicle lamp, and automobile |
CN111965655A (en) * | 2019-05-02 | 2020-11-20 | 广州印芯半导体技术有限公司 | Multimedia system applying time-of-flight ranging and operation method thereof |
US11070757B2 (en) * | 2019-05-02 | 2021-07-20 | Guangzhou Tyrafos Semiconductor Technologies Co., Ltd | Image sensor with distance sensing function and operating method thereof |
WO2021005659A1 (en) * | 2019-07-05 | 2021-01-14 | パナソニックセミコンダクターソリューションズ株式会社 | Information processing system, sensor system, information processing method, and program |
US11674797B2 (en) | 2020-03-22 | 2023-06-13 | Analog Devices, Inc. | Self-aligned light angle sensor using thin metal silicide anodes |
US11871132B2 (en) * | 2020-05-08 | 2024-01-09 | Omnivision Technologies, Inc. | Devices and methods for obtaining three-dimensional shape information using polarization and time-of-flight detection pixel cells |
EP4283329B1 (en) * | 2022-05-24 | 2024-07-03 | Sick Ag | Device and method for jointly detecting a colour and a distance of an object |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6611617B1 (en) * | 1995-07-26 | 2003-08-26 | Stephen James Crampton | Scanning apparatus and method |
US7560679B1 (en) * | 2005-05-10 | 2009-07-14 | Siimpel, Inc. | 3D camera |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3584845B2 (en) | 2000-03-16 | 2004-11-04 | 日立ハイテク電子エンジニアリング株式会社 | Test device and test method for IC device |
US6456793B1 (en) | 2000-08-03 | 2002-09-24 | Eastman Kodak Company | Method and apparatus for a color scannerless range imaging system |
US7274393B2 (en) | 2003-02-28 | 2007-09-25 | Intel Corporation | Four-color mosaic pattern for depth and image capture |
JP2007526453A (en) | 2004-01-28 | 2007-09-13 | カネスタ インコーポレイテッド | Single chip red, green, blue, distance (RGB-Z) sensor |
US7538889B2 (en) * | 2004-02-18 | 2009-05-26 | Hewlett-Packard Development Company, L.P. | Calibration feedback-control circuit for diffraction light devices |
US7283240B2 (en) * | 2005-08-24 | 2007-10-16 | Xerox Corporation | Spectrophotometer target distance variation compensation |
US20070052540A1 (en) * | 2005-09-06 | 2007-03-08 | Rockwell Automation Technologies, Inc. | Sensor fusion for RFID accuracy |
US7869649B2 (en) * | 2006-05-08 | 2011-01-11 | Panasonic Corporation | Image processing device, image processing method, program, storage medium and integrated circuit |
DE112007003293B4 (en) * | 2007-01-26 | 2021-01-21 | Trimble Jena Gmbh | Optical instrument and method for obtaining distance and image information |
KR101467509B1 (en) * | 2008-07-25 | 2014-12-01 | 삼성전자주식회사 | Image sensor and operating method for image sensor |
US7880888B2 (en) * | 2009-03-17 | 2011-02-01 | Rockwell Automation Technologies, Inc. | Photoelectric sensor for sensing a target |
US8487235B2 (en) * | 2009-04-13 | 2013-07-16 | Rockwell Automation Technologies, Inc. | Photoelectric sensor for sensing a target at a predetermined location |
2010
- 2010-11-03 US US12/938,499 patent/US8542348B2/en active Active

2011
- 2011-11-03 EP EP11187652.0A patent/EP2451150B1/en active Active
- 2011-11-03 CN CN201110353153.0A patent/CN102685402B/en active Active

2013
- 2013-08-22 US US13/973,692 patent/US20130335725A1/en not_active Abandoned
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6611617B1 (en) * | 1995-07-26 | 2003-08-26 | Stephen James Crampton | Scanning apparatus and method |
US7560679B1 (en) * | 2005-05-10 | 2009-07-14 | Siimpel, Inc. | 3D camera |
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130123015A1 (en) * | 2011-11-15 | 2013-05-16 | Samsung Electronics Co., Ltd. | Image sensor, operation method thereof and apparatuses including the same |
US9025829B2 (en) * | 2011-11-15 | 2015-05-05 | Samsung Electronics Co., Ltd. | Image sensor, operation method thereof and apparatuses including the same |
US10257455B2 (en) * | 2014-12-22 | 2019-04-09 | Google Llc | Image sensor and light source driver integrated in a same semiconductor package |
US9581696B2 (en) * | 2014-12-22 | 2017-02-28 | Google Inc. | Image sensor and light source driver integrated in a same semiconductor package |
US9615013B2 (en) * | 2014-12-22 | 2017-04-04 | Google Inc. | Image sensor having multiple output ports |
US9866740B2 (en) * | 2014-12-22 | 2018-01-09 | Google Llc | Image sensor having multiple output ports |
EP3238255A4 (en) * | 2014-12-22 | 2018-08-01 | Google LLC | Image sensor and light source driver integrated in a same semiconductor package |
US10182182B2 (en) * | 2014-12-22 | 2019-01-15 | Google Llc | Image sensor having multiple output ports |
USRE49664E1 (en) * | 2014-12-22 | 2023-09-19 | Google Llc | Image sensor and light source driver integrated in a same semiconductor package |
WO2016105656A1 (en) | 2014-12-22 | 2016-06-30 | Google Inc. | Image sensor and light source driver integrated in a same semiconductor package |
USRE49748E1 (en) * | 2014-12-22 | 2023-12-05 | Google Llc | Image sensor and light source driver integrated in a same semiconductor package |
CN106231193A (en) * | 2016-08-05 | 2016-12-14 | 深圳市金立通信设备有限公司 | Image processing method and terminal |
US10580807B2 (en) | 2017-10-24 | 2020-03-03 | Stmicroelectronics, Inc. | Color pixel and range pixel combination unit |
DE102018108379A1 (en) * | 2018-04-09 | 2019-10-10 | pmdtechnologies ag | Time-of-flight pixels |
DE102018108379B4 (en) | 2018-04-09 | 2024-03-07 | pmdtechnologies ag | Time-of-flight pixels |
US11340111B2 (en) | 2019-02-12 | 2022-05-24 | Viavi Solutions Inc. | Sensor device and method of use |
US10876889B2 (en) | 2019-02-12 | 2020-12-29 | Viavi Solutions Inc. | Sensor device and method of use |
WO2020167682A1 (en) * | 2019-02-12 | 2020-08-20 | Viavi Solutions Inc. | Time of flight sensor device and method of use |
US11852531B2 (en) | 2019-02-12 | 2023-12-26 | Viavi Solutions Inc. | Sensor device and method of use |
EP4009625A1 (en) * | 2020-12-03 | 2022-06-08 | Artilux Inc. | Multi-application optical sensing apparatus and method thereof |
US11624653B2 (en) | 2020-12-03 | 2023-04-11 | Artilux, Inc. | Multi-application optical sensing apparatus and method thereof |
US12007280B2 (en) | 2020-12-03 | 2024-06-11 | Artilux, Inc. | Multi-application optical sensing apparatus and method thereof |
Also Published As
Publication number | Publication date |
---|---|
CN102685402B (en) | 2015-03-11 |
US8542348B2 (en) | 2013-09-24 |
EP2451150A2 (en) | 2012-05-09 |
EP2451150B1 (en) | 2022-04-20 |
EP2451150A3 (en) | 2017-06-21 |
CN102685402A (en) | 2012-09-19 |
US20120105823A1 (en) | 2012-05-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8542348B2 (en) | Color sensor insensitive to distance variations | |
US11575843B2 (en) | Image sensor modules including primary high-resolution imagers and secondary imagers | |
US11172186B2 (en) | Time-Of-Flight camera system | |
US10120066B2 (en) | Apparatus for making a distance determination | |
TW201544848A (en) | Structured-stereo imaging assembly including separate imagers for different wavelengths | |
CN108353158B (en) | Image capturing apparatus and control method thereof | |
JP6717887B2 (en) | Distance measuring device having distance correction function | |
US11914078B2 (en) | Calibration of a depth sensing array using color image data | |
US9103663B2 (en) | Depth sensor, method of calculating depth in the same | |
TW201539012A (en) | Optical imaging modules and optical detection modules including a time-of-flight sensor | |
US20150271892A1 (en) | Light emitting apparatus | |
CN110779681A (en) | Distance measuring device for detecting abnormality of optical system | |
CN110661940A (en) | Imaging system with depth detection and method of operating the same | |
JP2005249723A (en) | Display output unit for image containing temperature distribution, and control method therefor | |
US20210382153A1 (en) | Method and apparatus for characterizing a time-of-flight sensor and/or a cover covering the time-of-flight sensor | |
US10863116B2 (en) | Solid-state image capture device, image capture system, and object identification system | |
US11610339B2 (en) | Imaging processing apparatus and method extracting a second RGB ToF feature points having a correlation between the first RGB and TOF feature points | |
TWI727081B (en) | Photographic components and photography system | |
JP6245440B2 (en) | Image reading apparatus and image reading program | |
WO2024079989A1 (en) | Display device with detection function | |
JP2017133931A (en) | Image creation device, and distance image and color image creation method | |
TW202430910A (en) | Display device with detection function |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: CEDES SAFETY & AUTOMATION AG, SWITZERLAND
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HARDEGGER, MARTIN;BERNER, RETO;GALERA, RICHARD;AND OTHERS;SIGNING DATES FROM 20100928 TO 20101025;REEL/FRAME:031065/0381
Owner name: ROCKWELL AUTOMATION TECHNOLOGIES, INC., OHIO
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HARDEGGER, MARTIN;BERNER, RETO;GALERA, RICHARD;AND OTHERS;SIGNING DATES FROM 20100928 TO 20101025;REEL/FRAME:031065/0381
|
AS | Assignment |
Owner name: ROCKWELL AUTOMATION TECHNOLOGIES, INC., OHIO
Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE NAME OF INVENTOR MANFRED NORBERT STEIN FROM INCORRECTLY ENTERED MANFRED NORBERT NORBERT PREVIOUSLY RECORDED ON REEL 031065 FRAME 0381. ASSIGNOR(S) HEREBY CONFIRMS THE NAME OF MANFRED NORBERT STEIN;ASSIGNORS:HARDEGGER, MARTIN;BERNER, RETO;GALERA, RICHARD;AND OTHERS;SIGNING DATES FROM 20100928 TO 20101025;REEL/FRAME:031969/0163
Owner name: CEDES SAFETY & AUTOMATION AG, SWITZERLAND
Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE NAME OF INVENTOR MANFRED NORBERT STEIN FROM INCORRECTLY ENTERED MANFRED NORBERT NORBERT PREVIOUSLY RECORDED ON REEL 031065 FRAME 0381. ASSIGNOR(S) HEREBY CONFIRMS THE NAME OF MANFRED NORBERT STEIN;ASSIGNORS:HARDEGGER, MARTIN;BERNER, RETO;GALERA, RICHARD;AND OTHERS;SIGNING DATES FROM 20100928 TO 20101025;REEL/FRAME:031969/0163
|
AS | Assignment |
Owner name: ROCKWELL AUTOMATION SAFETY AG, SWITZERLAND
Free format text: CHANGE OF NAME;ASSIGNOR:CEDES SAFETY & AUTOMATION AG;REEL/FRAME:037513/0154
Effective date: 20150501
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |