WO2014014838A2 - Interactive illumination for gesture and/or object recognition - Google Patents

Interactive illumination for gesture and/or object recognition

Info

Publication number
WO2014014838A2
Authority
WO
WIPO (PCT)
Prior art keywords
illumination
processor
target
light source
laser
Prior art date
Application number
PCT/US2013/050551
Other languages
French (fr)
Other versions
WO2014014838A3 (en)
Inventor
Richard William NEUMANN
Original Assignee
2R1Y
Priority date
Filing date
Publication date
Application filed by 2R1Y filed Critical 2R1Y
Publication of WO2014014838A2
Publication of WO2014014838A3
Priority to US14/597,819 (published as US20160006914A1)


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4814Constructional features, e.g. arrangements of optical elements of transmitters alone
    • G01S7/4815Constructional features, e.g. arrangements of optical elements of transmitters alone using multiple transmitters
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4817Constructional features, e.g. arrangements of optical elements relating to scanning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • G06F3/0325Detection arrangements using opto-electronic means using a plurality of light emitters or reflectors or a plurality of detectors forming a reference frame from which to derive the orientation of the object, e.g. by triangulation or on the basis of reference deformation in the picked up image
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/74Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/30Transforming light or analogous information into electric information
    • H04N5/33Transforming infrared radiation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification

Definitions

  • The embodiments here relate to an illumination system for illuminating a target area for image capture, in order to allow for three-dimensional object recognition and target mapping.
  • the disclosure includes methods and systems including a system for target illumination and mapping, comprising, a light source and an image sensor, the light source configured to, communicate with a processor, scan a target area within a field of view, receive direction from the processor regarding projecting light within the field of view on at least one target, the image sensor configured to, communicate with the processor, receive reflected illumination from the target area within the field of view, generate data regarding the received reflected illumination, and send the data regarding the received reflected illumination to the processor.
  • Such systems where the light source is an array of light emitting diodes (LEDs).
  • Such systems where the light source is a laser, where the laser is at least one of, amplitude modulated and pulse width modulated.
  • the direction received from the processor includes direction to track the at least one target.
  • the data regarding the received reflected illumination includes information that would allow the processor to determine the distance from the system to the select target via triangulation.
  • Such systems where the light source is further configured to receive direction from the processor to illuminate the tracked target in motion.
  • Such systems where the light source is further configured to block illumination of particular areas on the at least one select target via direction from the processor.
  • Such systems where the image sensor is further configured to generate gray shade image data based on the received infrared illumination, and assign visible colors to gray shades of the image data.
  • Such systems where the image sensor is a complementary metal oxide semiconductor (CMOS).
  • Such systems where the light source and the image sensor include optical filters.
  • Such systems where the light source is a laser.
  • Another example system includes a system for illuminating a target area, including, a directionally controlled laser light source, and an image sensor, the directionally controlled laser light source configured to, communicate with a processor, scan the target area, receive direction on illuminating specific selected targets within the target area from the processor, where the laser is at least one of, amplitude modulated and pulse width modulated, and the image sensor configured to communicate with the processor, receive the laser light reflected off of the target area, generate data regarding the received reflected laser light, and send the data regarding the received laser light to the processor.
  • Such systems where the data regarding the received reflected laser light is configured to allow the processor to calculate a depth map.
  • Such systems where the image sensor is a complementary metal oxide semiconductor (CMOS).
  • Such systems where the image sensor is a charge coupled device (CCD).
  • Such systems where the data regarding the received reflected laser light is configured to allow the processor to calculate a point cloud.
  • Such systems where the directional control is via at least one of a single axis micro electromechanical system mirror (MEMS) and a dual axis MEMS.
  • Another example method includes a method for target illumination and mapping, including, via a light source, communicating with a processor, scanning a target area within a field of view, receiving direction from the processor regarding projecting light within the field of view on at least one target, via an image sensor, communicating with the processor, receiving reflected illumination from the target area within the field of view, generating data regarding the received reflected illumination, and sending the data regarding the received reflected illumination to the processor.
  • Such methods where the light source is an array of light emitting diodes (LEDs).
  • the light source is a laser
  • the laser is at least one of, amplitude modulated and pulse width modulated.
  • the laser is an infrared laser and the image sensor is configured to receive and process infrared energy.
  • Such methods where the direction received from the processor includes direction to track the at least one target.
  • Such methods further comprising, via the light source, receiving direction from the processor to illuminate the tracked target in motion.
  • Such methods further comprising, via the light source, blocking illumination of particular areas on the at least one select target via direction from the processor.
  • Such methods where the scan of the target area is a raster scan.
  • Such methods where the raster scan is completed within one frame of the image sensor.
  • Such methods where the light source includes at least one of, a single axis micro electromechanical system mirror (MEMS) and a dual axis MEMS, to direct the light.
  • Such methods where the tracking the selected target includes more than one selected target.
  • Such methods further comprising, via the image sensor, generating gray shade image data based on the received infrared illumination, and assigning visible colors to gray shades of the image data.
  • Such methods where the image sensor is a complementary metal oxide semiconductor (CMOS).
  • Another example method includes a method for illuminating a target area, comprising, via a directionally controlled laser light source, communicating with a processor, scanning the target area, receiving direction on illuminating specific selected targets within the target area from the processor, where the laser is at least one of, amplitude modulated and pulse width modulated, and via an image sensor, communicating with the processor, receiving the laser light reflected off of the target area, generating data regarding the received reflected laser light, and sending the data regarding the received laser light to the processor.
  • Such methods further comprising, via the laser light source, receiving direction from the processor to illuminate at least two target objects with different illumination patterns.
  • Such methods where the data regarding the received reflected laser light is configured to allow the processor to calculate a depth map.
  • Such methods where the image sensor is a complementary metal oxide semiconductor (CMOS).
  • Such methods where the light source and the image sensor include optical filters.
  • Such methods where the data regarding the received reflected laser light is configured to allow the processor to calculate a point cloud.
  • Such methods where the directional control is via at least one of a single axis micro electromechanical system mirror (MEMS) and a dual axis MEMS.
  • Such methods where the directional control is via at least one rotating mirror.
  • Such methods further comprising, via the laser light source, receiving direction to send a pulse of energy to a unique part of the target area, creating pixels for the image sensor.
  • Such methods where the laser is a continuous wave laser.
  • Another example system includes a system for target area illumination, comprising, a directional illumination source and image sensor, the directional illumination source configured to, communicate with a processor, receive direction to illuminate the target area from the processor, and project illumination on the target area, where the laser is at least one of, amplitude modulated and pulse width modulated, and the image sensor configured to, communicate with the processor, capture reflected illumination off of the target area, generate data regarding the captured reflected illumination, and send the data regarding the capture reflected illumination to the processor, where the illumination source and the image sensor share an aperture and which a throw angle of the directed illumination and a field of view angle of the reflected captured illumination are matched.
  • Such systems where the laser is an infrared laser and the image sensor is configured to receive and process infrared energy.
  • the laser includes at least one of a single axis micro electromechanical system mirror (MEMS) and a dual axis MEMS to direct the light.
  • the data regarding the captured reflected illumination includes information regarding triangulation for distance measurements.
  • the illumination source is further configured to receive instruction regarding motion tracking of the select target.
  • the shared aperture is at least one of adjacent, common and objective.
  • Another example method includes a method for target area illumination, comprising, via a directional illumination source, communicating with a processor, receiving direction to illuminate the target area from the processor, and projecting illumination on the target area, where the laser is at least one of, amplitude modulated and pulse width modulated, and via an image sensor, communicating with the processor, capturing reflected illumination off of the target area, generating data regarding the captured reflected illumination, and sending the data regarding the capture reflected illumination to the processor, where the illumination source and the image sensor share an aperture and which a throw angle of the directed illumination and a field of view angle of the reflected captured illumination are matched.
  • the laser is an infrared laser and the image sensor is configured to receive and process infrared energy
  • the shared aperture is at least one of adjacent, common and objective.
  • Another example system includes a system for illuminating a target area, including, a light source and an image sensor, the light source configured to, communicate with a processor, illuminate a target area with at least one pattern of light, within a field of view, receive direction to illuminate at least one select target within the target area from the processor, and receive information regarding illuminating the at least one select target with at least one calibrated pattern of light, from the processor, where the laser is at least one of, amplitude modulated and pulse width modulated, and the image sensor configured to, communicate with the processor, receive reflected illumination patterns from the at least one select target within the field of view, generate data regarding the received reflected illumination patterns, and send data about the received reflected illumination patterns to the processor, where the data includes, information allowing the processor to determine distance to the at least one select target via triangulation of the illumination and received reflected illumination, and information regarding structured light of the at least one received reflected illumination patterns.
  • Such systems where the pattern is at least one of, alternating illuminated and non-illuminated stripes, intensity modulated stripes, sequential sinusoidal, trapezoidal, Moiré pattern, multi-wavelength 3D, continuously varying, striped indexing, segmented stripes, coded stripes, indexing gray scale, De Bruijn sequence, pseudo-random binary, mini-pattern, wavelength coded grid, and wavelength dot array.
  • Such systems where the light source is further configured to change illumination patterns.
  • Such systems where the light source is a laser.
  • Such systems where the direction to illuminate at least one select target includes direction to track the motion of the at least one select target.
  • Another example system includes a system for allowing mapping of a target area, comprising, a laser and an image sensor, the laser configured to, communicate with a processor, receive direction to illuminate at least one select target with a pattern of light, project illumination on the at least one select target with the pattern of light, receive information regarding calibration of the pattern of light, and project calibrated illumination on the at least one select target, and
  • the image sensor configured to, communicate with the processor, receive reflected laser illumination patterns from the at least one select target, generate data regarding the received reflected laser illumination patterns, and send the data regarding the received reflected laser illumination to the processor, where the data includes information that would allow the processor to, determine distance via triangulation, generate a map of the target area via 3D surface measurements, and generate a point cloud of the select target.
  • Such systems where the pattern is at least one of, alternating illuminated and non-illuminated stripes, intensity modulated stripes, sequential sinusoidal, trapezoidal, Moiré pattern, multi-wavelength 3D, continuously varying, striped indexing, segmented stripes, coded stripes, indexing gray scale, De Bruijn sequence, pseudo-random binary, mini-pattern, wavelength coded grid, and wavelength dot array.
  • the light source is further configured to change illumination patterns.
  • the laser is further configured to receive direction to track a motion of the selected target.
  • the image sensor is at least one of complementary metal oxide semiconductor (CMOS) and charge coupled device (CCD).
  • Another example method includes a method for illuminating a target area, comprising, via a light source, communicating with a processor, illuminating a target area with at least one pattern of light, within a field of view, receiving direction to illuminate at least one select target within the target area from the processor, and receiving information regarding illuminating the at least one select target with at least one calibrated pattern of light, from the processor, where the laser is at least one of, amplitude modulated and pulse width modulated, and via an image sensor, communicating with the processor, receiving reflected illumination patterns from the at least one select target within the field of view, generating data regarding the received reflected illumination patterns, and sending data about the received reflected illumination patterns to the processor, where the data includes, information allowing the processor to determine distance to the at least one select target via triangulation of the illumination and received reflected illumination, and information regarding structured light of the at least one received reflected illumination patterns.
  • Such methods where the pattern is at least one of, alternating illuminated and non-illuminated stripes, intensity modulated stripes, sequential sinusoidal, trapezoidal, Moiré pattern, multi-wavelength 3D, continuously varying, striped indexing, segmented stripes, coded stripes, indexing gray scale, De Bruijn sequence, pseudo-random binary, mini-pattern, wavelength coded grid, and wavelength dot array.
  • Such methods further comprising, via the light source, projecting a new illumination pattern.
  • the light source is a laser.
  • Such methods where the direction to illuminate at least one select target includes direction to track the motion of the at least one select target.
  • Another example method includes a method for allowing mapping of a target area, comprising, via a laser, communicating with a processor, receiving direction to illuminate at least one select target with a pattern of light, projecting illumination on the at least one select target with the pattern of light, receiving information regarding calibration of the pattern of light, projecting calibrated illumination on the at least one select target, via an image sensor, communicating with the processor, receiving reflected laser illumination patterns from the at least one select target, generating data regarding the received reflected laser illumination patterns, and sending the data regarding the received reflected laser illumination to the processor, where the data includes information that would allow the processor to, determine distance via triangulation, generate a map of the target area via 3D surface measurements, and generate a point cloud of the select target.
  • Such methods where the pattern is at least one of, alternating illuminated and non-illuminated stripes, intensity modulated stripes, sequential sinusoidal, trapezoidal, Moiré pattern, multi-wavelength 3D, continuously varying, striped indexing, segmented stripes, coded stripes, indexing gray scale, De Bruijn sequence, pseudo-random binary, mini-pattern, wavelength coded grid, and wavelength dot array.
  • Such methods further comprising, via the light source, projecting a new illumination pattern.
  • Such methods further comprising, via the laser, receiving direction to track a motion of the selected target.
  • the image sensor is at least one of complementary metal oxide semiconductor (CMOS) and charge coupled device (CCD).
  • Another example system includes a system for target illumination and mapping, comprising, an infrared light source and an image sensor, the infrared light source configured to, communicate with a processor, illuminate a target area within a field of view, receive direction from the processor, to illuminate at least one select target within the field of view, project illumination on the at least one select target, where the laser is at least one of, amplitude modulated and pulse width modulated, and the image sensor, having a dual band pass filter, configured to, communicate with the processor, receive reflected illumination from the target area within the field of view, receive reflected illumination from the at least one select target within the target area, generate data regarding the received reflected illumination, and send the data to the processor.
  • Such systems where the dual band pass filter is configured to allow visible light, and light at the wavelengths emitted by the infrared light source, to pass.
  • Such systems where the visible light wavelengths are between 400 nm and 700 nm.
  • Such systems where the dual band pass filter includes a notch filter.
  • the image sensor is at least one of a complementary metal oxide semiconductor (CMOS) and a charge coupled device (CCD)
  • the infrared light source includes at least one of a single axis micro electromechanical system mirror (MEMS) and a dual axis MEMS to direct the light.
  • Another example method includes a method for target illumination and mapping, comprising, via an infrared light source, communicating with a processor, illuminating a target area within a field of view, receiving direction from the processor, to illuminate at least one select target within the field of view, projecting illumination on the at least one select target, where the laser is at least one of, amplitude modulated and pulse width modulated, and via an image sensor, having a dual band pass filter, communicating with the processor, receiving reflected illumination from the target area within the field of view, receiving reflected illumination from the at least one select target within the target area, generating data regarding the received reflected illumination, and sending the data to the processor.
  • Such methods where the dual band pass filter is configured to allow visible light, and light at the wavelengths emitted by the infrared light source, to pass.
  • Such methods where the visible light wavelengths are between 400 nm and 700 nm.
  • Such methods where the dual band pass filter includes a notch filter.
  • the image sensor is at least one of a complementary metal oxide semiconductor (CMOS) and a charge coupled device (CCD)
  • the infrared light source includes at least one of a single axis micro electromechanical system mirror (MEMS) and a dual axis MEMS to direct the light.
  • Another example system includes a system for target illumination and mapping, comprising, a laser light source and an image sensor, the laser light source configured to, communicate with a processor, project square wave illumination to at least one select target, where the square wave includes at least a leading edge and a trailing edge, send information to the processor regarding the time the leading edge of the square wave illumination was projected and the time the trailing edge of the square wave was projected, where the laser is at least one of, amplitude modulated and pulse width modulated, and the image sensor configured to, communicate with the processor, receive at least one reflected square wave illumination from the at least one select target, generate a signal based on the received reflected square wave illumination, where the signal includes at least information regarding the received time of the leading edge and received time of the trailing edge of the square wave, and send the signal regarding the received reflected square wave illumination to the processor.
  • Another example method includes a method for target illumination and mapping, comprising, via a laser light source, communicating with a processor, projecting square wave illumination to at least one select target, where the square wave includes at least a leading edge and a trailing edge, sending information to the processor regarding the time the leading edge of the square wave illumination was projected and the time the trailing edge of the square wave was projected, where the laser is at least one of, amplitude modulated and pulse width modulated, and via an image sensor, communicating with the processor, receiving at least one reflected square wave illumination from the at least one select target, generating a signal based on the received reflected square wave illumination, where the signal includes at least information regarding the received time of the leading edge and received time of the trailing edge of the square wave, and sending the signal regarding the received reflected square wave illumination to the processor.
  • Such methods further comprising, via the laser light source, projecting a pulse of energy, where the square wave leading edge is caused by the laser pulse on and the trailing edge is caused by the laser pulse off.
  • Such methods further comprising, via the laser light source, projecting energy with a new polarization, where the square wave is caused by a change of polarization.
  • Such methods further comprising, via the laser light source, switching gain in order to change polarization.
  • Such methods where the image sensor is a current assisted photon demodulation (CAPD).
  • Another example system includes a system for target illumination and mapping, comprising, an infrared laser light source and an image sensor, the infrared laser light source configured to, communicate with a processor, illuminate at least one select target within a field of view, where the laser is at least one of, amplitude modulated and pulse width modulated, and the image sensor configured to, communicate with the processor, receive reflected illumination from the at least one select target within the field of view, create a signal based on the received reflected illumination, and send the signal to the processor, where the signal includes at least information that would allow the processor to map the target area and generate an image of the target area.
  • Such systems where the image is a gray scale image.
  • Such systems where the signal further includes information that would allow the processor to assign visible colors to the gray scale.
  • Such systems where the infrared laser light source is further configured to receive direction from the processor to illuminate a select target.
  • Such systems where the infrared laser light source is further configured to receive direction from the processor to track the motion of the select target and maintain illumination on the select target.
  • Another example method includes a method for target illumination and mapping, comprising, via an infrared laser light source, communicating with a processor, illuminating at least one select target within a field of view, where the laser is at least one of, amplitude modulated and pulse width modulated, and via an image sensor, communicating with the processor, receiving reflected illumination from the at least one select target within the field of view, creating a signal based on the received reflected illumination, and sending the signal to the processor, where the signal includes at least information that would allow the processor to map the target area and generate an image of the target area.
  • Such methods where the image is a gray scale image.
  • Such methods where the signal further includes information that would allow the processor to assign visible colors to the gray scale.
  • Such methods where the infrared laser light source is further configured to receive direction from the processor to illuminate a select target.
  • Such methods where the infrared laser light source is further configured to receive direction from the processor to track the motion of the select target and maintain illumination on the select target.
  • Another example system includes a system for target illumination comprising, an illumination device in communication with an image sensor, the illumination device further configured to, communicate with a processor, project low level full scan illumination to a target area, where the laser is at least one of, amplitude modulated and pulse width modulated, the image sensor further configured to, communicate with the processor, receive reflected illumination from the target area, the processor configured to, identify specific target areas of interest, map the target area, set a value of the number of image pulses for one scan, calculate the energy intensity of each pulse, calculate the total intensity per frame, and compare the total intensity per frame to an eye safety limit, the computing system further configured to, direct the illumination device to scan if the total intensity per frame is less than the eye safety limit, and direct the illumination device to stop scan if the total intensity per frame is greater than or equal to the eye safety limit.
  • the illumination source includes a laser and a micro electromechanical system mirror (MEMS) to direct the light.
  • Another example method includes a method for target illumination comprising, via an illumination device, communicating with a processor, projecting low level full scan illumination to a target area, where the laser is at least one of, amplitude modulated and pulse width modulated, via an image sensor, communicating with the processor, receiving reflected illumination from the target area, via the processor, identifying specific target areas of interest, mapping the target area, setting a value of the number of image pulses for one scan, calculating the energy intensity of each pulse, calculating the total intensity per frame, and comparing the total intensity per frame to an eye safety limit, directing the illumination device to scan if the total intensity per frame is less than the eye safety limit, and directing the illumination device to stop scan if the total intensity per frame is greater than or equal to the eye safety limit.
  • Such methods further comprising, via the processor, communicating to a user an error message if the total intensity per frame is greater than or equal to the eye safety limit.
  • Such methods further comprising, via the processor, if the total intensity per frame is greater than or equal to the eye safety limit, mapping the target area, setting a new value of the number of image pulses for one scan, calculating the energy intensity of each pulse, calculating the total intensity per frame, and comparing the total intensity per frame to an eye safety limit.
  • the computing system is further configured to track the specific target of interest and direct the illumination source to illuminate the specific area of interest.
  • the illumination source includes a laser and a micro electromechanical system mirror (MEMS) to direct the light.
  • Another example system includes a system for target illumination and mapping, comprising, a directed light source, at least one image projector, and an image sensor, the directed light source configured to, communicate with a processor, illuminate at least one select target area within a field of view, receive direction to illuminate an at least one select target, where the laser is at least one of, amplitude modulated and pulse width modulated, the image sensor configured to, communicate with the processor, receive reflected illumination from the at least one select target within the target area, create data regarding the received reflected illumination, send data regarding the received reflected illumination to the processor, and the image projector configured to, communicate with the processor, receive direction to project an image on the at least one select target, and project an image on the at least one select target.
  • Such systems where the directed light source is an infrared laser.
  • Such systems where the data regarding the received reflected illumination includes information regarding the distance from the system to the target via triangulation.
  • Such systems where the image projector is calibrated to the distance calculation from the processor, where calibration includes adjustments to a throw angle of the image projector.
  • Such systems where the image projector is further configured to project at least two images on at least two different identified and tracked targets.
  • Such systems where the image sensor is at least one of a complementary metal oxide semiconductor (CMOS) and a charge coupled device (CCD).
  • Another example system includes a system for target illumination and mapping, comprising, a directed light source and an image sensor, the directed light source configured to, communicate with a processor, illuminate at least one target area within a field of view, receive direction to track a selected target within the target area from the processor, receive direction to project an image on the tracked selected target from the processor, project an image on the tracked selected target according to the received direction, the image sensor configured to, communicate with the processor, receive reflected illumination from the at least one select target within the field of view, generate data regarding the received reflected illumination, and send the received reflected illumination data to the processor.
  • the image sensor is at least one of a complementary metal oxide semiconductor (CMOS) and a charge coupled device (CCD).
  • Another example method includes a method for target illumination and mapping, comprising, via a directed light source, communicating with a processor, illuminating at least one select target area within a field of view, receiving direction to illuminate an at least one select target, where the laser is at least one of, amplitude modulated and pulse width modulated, via an image sensor, communicating with the processor, receiving reflected illumination from the at least one select target within the target area, creating data regarding the received reflected illumination, sending data regarding the received reflected illumination to the processor, and via an image projector, communicating with the processor, receiving direction to project an image on the at least one select target, and projecting an image on the at least one select target.
  • Such methods where the directed light source is an infrared laser.
  • Such methods where the data regarding the received reflected illumination includes information regarding the distance from the system to the target via triangulation.
  • Such methods where the image projector is calibrated to the distance calculation from the processor, where calibration includes adjustments to a throw angle of the image projector.
  • Such methods further comprising, via the image projector, projecting at least two images on at least two different identified and tracked targets.
  • Such methods where the image sensor is at least one of a complementary metal oxide semiconductor (CMOS) and a charge coupled device (CCD).
  • Another example method includes a method for target illumination and mapping, comprising, via a directed light source, communicating with a processor, illuminating at least one target area within a field of view, receiving direction to track a selected target within the target area from the processor, receiving direction to project an image on the tracked selected target from the processor, projecting an image on the tracked selected target according to the received direction, via an image sensor, communicating with the processor, receiving reflected illumination from the at least one select target within the field of view, generating data regarding the received reflected illumination, and sending the received reflected illumination data to the processor.
  • the directed light source is a visible light laser and the image is a laser scan image, where the laser is at least one of, amplitude modulated and pulse width modulated.
  • the image sensor is at least one of a complementary metal oxide semiconductor (CMOS) and a charge coupled device (CCD).
  • Another example system includes a system for target illumination and mapping, comprising, a directional light source and an image sensor, the directional light source configured to, communicate with a processor, illuminate at least one target area within a field of view with a scan of at least one pixel point, receive direction to illuminate the target with additional pixel points over time for additional calculations of distance, from the at least one processor, the image sensor configured to, communicate with the processor, receive a reflection of the at least one pixel point from the at least one select target within the field of view, generate data regarding the received pixel reflection, send the data regarding the received pixel reflection to the at least one processor, where the data includes information that the processor could analyze and determine distance from the system to the target via triangulation, and where the data further includes information regarding the relative proximity between the directional light source and the image sensor.
  • Such systems where the directional light source is a laser, and at least one of, amplitude modulated and pulse width modulated.
  • Such systems where the data further includes information that the processor could analyze and determine a depth map, based on the calculations of distance of the at least one target pixel point.
  • Such systems where the data further includes information that the processor could analyze and determine the distance between the system and the target via triangulation among the directed light source, the image sensor, and the additional pixel points.
  • Such systems where the directional light source is further configured to receive direction to illuminate the selected target with at least one pixel point from the processor.
  • Another example method includes a method for target illumination and mapping, comprising, via a directional light source, communicating with a processor, illuminating at least one target area within a field of view with a scan of at least one pixel point, receiving direction to illuminate the target with additional pixel points over time for additional calculations of distance, from the at least one processor, via an image sensor, communicating with the processor, receiving a reflection of the at least one pixel point from the at least one select target within the field of view, generating data regarding the received pixel reflection, sending the data regarding the received pixel reflection to the at least one processor, where the data includes information that the processor could analyze and determine distance from the system to the target via triangulation, and where the data further includes information regarding the relative proximity between the directional light source and the image sensor.
  • Such methods where the directional light source is a laser, and at least one of, amplitude modulated and pulse width modulated.
  • Such methods where the data further includes information that the processor could analyze and determine a depth map, based on the calculations of distance of the at least one target pixel point.
  • Such methods where the data further includes information that the processor could analyze and determine the distance between the system and the target via triangulation among the directed light source, the image sensor, and the additional pixel points.
  • Such methods further comprising, via the directional light source receiving direction to illuminate the selected target with at least one pixel point from the processor.
  • Another example system includes a system for biometric analysis, comprising, a directed laser light source and an image sensor, the directed laser light source configured to communicate with a processor, illuminate a target area within a field of view, receive direction to illuminate at least one select target in the target area, receive direction to illuminate a biometric area of the at least one select target, where the laser is at least one of, amplitude modulated and pulse width modulated, and the image sensor configured to, communicate with the processor, receive reflected illumination from the at least one target area within the field of view, generate data regarding the received reflected illumination, send the generated data to the processor, where the data includes at least information that would allow the processor to map the target area, identify the select target within the target area, and determine a biometric reading of the at least one select target.
  • Such systems where the biometric reading is at least one of, skin deflection, skin reflectivity, and oxygen absorption.
  • the illumination is a pattern of illumination
  • the computing system is further configured to analyze the reflected pattern illumination from the target.
  • the data contains further information that would allow the processor to calculate a distance from the system to the target via triangulation.
  • the light source is further configured to receive calibration information of the illumination pattern, and project the calibrated pattern on the at least one select target.
  • Another example method includes a method for biometric analysis, comprising, via a directed laser light source, communicating with a processor, illuminating a target area within a field of view, receiving direction to illuminate at least one select target in the target area, receiving direction to illuminate a biometric area of the at least one select target, where the laser is at least one of, amplitude modulated and pulse width modulated, and via an image sensor, communicating with the processor, receiving reflected illumination from the at least one target area within the field of view, generating data regarding the received reflected illumination, sending the generated data to the processor, where the data includes at least information that would allow the processor to map the target area, identify the select target within the target area, and determine a biometric reading of the at least one select target.
  • Such methods where the biometric reading is at least one of, skin deflection, skin reflectivity, and oxygen absorption.
  • Such methods where the illumination is a pattern of illumination, and where the computing system is further configured to analyze the reflected pattern illumination from the target.
  • Such methods where the data contains further information that would allow the processor to calculate a distance from the system to the target via triangulation.
  • Such methods further comprising, via the light source, receiving calibration information of the illumination pattern, and projecting the calibrated pattern on the at least one select target.
  • Another example system includes a system for target illumination and mapping, comprising, a directed light source, and an image sensor, the light source having an aperture and configured to, illuminate a target area within a field of view, via an incremental scan, where each increment has a unique outbound angle from the light source aperture, and a unique inbound angle to the image sensor aperture, send data regarding the incremental outbound angles to the processor, and the image sensor having an aperture and configured to, receive reflected illumination from the at least one select target within the field of view, generate data regarding the received reflected illumination including inbound angles, and send the data regarding the received reflected illumination to the processor, where the data regarding the outbound angles and the data regarding the inbound angles include information used to calculate a distance from the system to the target via triangulation, and where the distance between light source aperture and the image capture aperture is relatively fixed.
  • Such systems where the directed light source is a laser, where the laser is at least one of, amplitude modulated and pulse width modulated.
  • Such systems where the data regarding the outbound angles and the data regarding the inbound angles further include information used to calculate a depth map based on the illumination.
  • Such systems where the data regarding the outbound angles and the data regarding the inbound angles further include information used to calculate a point cloud based on the depth map.
  • Another example method includes a method for target illumination and mapping.
  • Such a method including, via a directed light source, having an aperture, illuminating a target area within a field of view, via an incremental scan, where each increment has a unique outbound angle from the light source aperture, and a unique inbound angle to the image sensor aperture, sending data regarding the incremental outbound angles to the processor, and via an image sensor, having an aperture, receiving reflected illumination from the at least one select target within the field of view, generating data regarding the received reflected illumination including inbound angles, and sending the data regarding the received reflected illumination to the processor, where the data regarding the outbound angles and the data regarding the inbound angles include information used to calculate a distance from the system to the target via triangulation, and where the distance between light source aperture and the image capture aperture is relatively fixed.
  • the directed light source is a laser, where the laser is at least one of, amplitude modulated and pulse width modulated.
  • the image sensor includes optical filters.
  • the data regarding the outbound angles and the data regarding the inbound angles further include information used to calculate a depth map based on the illumination.
  • Methods here where the data regarding the outbound angles and the data regarding the inbound angles further include information used to calculate a point cloud based on the depth map.
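
Several of the definitions above rely on triangulation between the light-source aperture and the image-sensor aperture, using a relatively fixed baseline and the outbound and inbound angles of each scan increment, to measure distance to the target. The sketch below illustrates that geometry with the law of sines; it is a minimal illustration, and the function name, angle conventions, and example values are assumptions rather than anything specified in the application.

```python
import math

def distance_by_triangulation(baseline_m, outbound_deg, inbound_deg):
    """Estimate depth to an illuminated point from triangulation.

    baseline_m   -- fixed separation between the light-source and sensor apertures
    outbound_deg -- angle of the projected beam, measured from the baseline
    inbound_deg  -- angle of the received reflection, measured from the baseline
    """
    a = math.radians(outbound_deg)
    b = math.radians(inbound_deg)
    c = math.pi - a - b          # angle at the target (triangle angles sum to 180 degrees)
    if c <= 0:
        raise ValueError("angles do not form a valid triangle")
    # Law of sines: the side opposite the inbound angle is the source-to-target range.
    range_from_source = baseline_m * math.sin(b) / math.sin(c)
    # Perpendicular distance from the baseline to the target.
    return range_from_source * math.sin(a)

# Example: 60 mm baseline, beam steered at 80 degrees, reflection received at 85 degrees.
print(round(distance_by_triangulation(0.060, 80.0, 85.0), 3), "m")
```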
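
Other definitions note that the reflected-illumination data should let the processor calculate a depth map and, from it, a point cloud. One conventional way to make that second step concrete is to back-project each depth pixel through a pinhole camera model, as sketched below; the intrinsic parameters are placeholder assumptions, not values from the application.

```python
import numpy as np

def depth_map_to_point_cloud(depth_m, fx, fy, cx, cy):
    """Convert an HxW depth map (metres) to an Nx3 point cloud.

    fx, fy -- focal lengths in pixels; cx, cy -- principal point in pixels.
    Pixels with zero depth (no return) are dropped.
    """
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]

# Example with a synthetic 4x4 depth map and assumed intrinsics.
depth = np.full((4, 4), 1.5)
cloud = depth_map_to_point_cloud(depth, fx=600.0, fy=600.0, cx=2.0, cy=2.0)
print(cloud.shape)  # (16, 3)
```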
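
The methods above also require that a raster scan of the target area complete within one frame of the image sensor. Whether a particular mirror and laser configuration can meet that constraint reduces to a simple timing budget; the resolution, frame rate, and MEMS line rate below are illustrative assumptions.

```python
def scan_fits_in_frame(lines, pixels_per_line, frame_rate_hz, mems_line_rate_hz):
    """Check that a full raster scan completes within one sensor frame.

    lines, pixels_per_line -- scan resolution
    frame_rate_hz          -- image-sensor frame rate
    mems_line_rate_hz      -- scan lines the mirror sweeps per second
    Also reports the pixel clock the laser modulator would need.
    """
    frame_period_s = 1.0 / frame_rate_hz
    scan_time_s = lines / mems_line_rate_hz
    pixel_clock_hz = lines * pixels_per_line * frame_rate_hz
    return scan_time_s <= frame_period_s, scan_time_s, pixel_clock_hz

ok, t, clk = scan_fits_in_frame(lines=480, pixels_per_line=640,
                                frame_rate_hz=30.0, mems_line_rate_hz=18_000.0)
print(ok, round(t * 1e3, 2), "ms per scan,", clk / 1e6, "MHz pixel clock")
```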
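
Where the image sensor generates gray-shade image data from the received infrared illumination and visible colors are assigned to the gray shades, a simple false-color ramp is one way to picture that assignment. The three-band ramp below is an assumed mapping for illustration, not the scheme claimed in the application.

```python
import numpy as np

def gray_to_false_color(gray):
    """Map an HxW uint8 gray-shade image to an HxWx3 RGB image.

    Dark shades map toward blue, mid shades toward green, and bright
    shades toward red, making IR intensity differences visible.
    """
    g = gray.astype(np.float32) / 255.0
    r = np.clip(2.0 * g - 1.0, 0.0, 1.0)      # ramps up over the bright half
    b = np.clip(1.0 - 2.0 * g, 0.0, 1.0)      # ramps down over the dark half
    gch = 1.0 - np.abs(2.0 * g - 1.0)         # peaks at mid gray
    rgb = np.stack([r, gch, b], axis=-1)
    return (rgb * 255).astype(np.uint8)

# Example: a horizontal gray ramp becomes a blue-to-green-to-red image.
ramp = np.tile(np.arange(256, dtype=np.uint8), (8, 1))
print(gray_to_false_color(ramp).shape)  # (8, 256, 3)
```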
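
For the structured-light definitions, the listed patterns include alternating illuminated and non-illuminated stripes and intensity modulated stripes. Generating a single projector frame of either kind is straightforward; the resolution and stripe width below are arbitrary illustrative choices.

```python
import numpy as np

def stripe_pattern(width_px, height_px, stripe_px, modulated=False):
    """One projector frame of vertical stripes (values 0..255).

    modulated=False -> alternating illuminated and non-illuminated stripes.
    modulated=True  -> sinusoidally intensity-modulated stripes, as used in
                       phase-shift style structured light.
    """
    x = np.arange(width_px)
    if modulated:
        row = 127.5 * (1.0 + np.sin(2.0 * np.pi * x / (2 * stripe_px)))
    else:
        row = np.where((x // stripe_px) % 2 == 0, 255.0, 0.0)
    return np.tile(row.astype(np.uint8), (height_px, 1))

frame = stripe_pattern(640, 480, stripe_px=16, modulated=False)
print(frame.shape, frame[0, :40:8])  # (480, 640) [255 255 0 0 255]
```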
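
The dual band pass filter described above passes visible light (roughly 400 nm to 700 nm) together with light at the wavelengths emitted by the infrared source. An idealized transmission model is sketched below; the 850 nm centre wavelength and 15 nm half-width are assumptions, since this summary does not fix those values.

```python
def dual_bandpass_transmits(wavelength_nm, ir_center_nm=850.0, ir_half_width_nm=15.0):
    """Idealized dual band-pass: a visible window plus a narrow IR window.

    Returns True if the wavelength falls in either pass band. The 850 nm
    centre and 15 nm half-width are illustrative assumptions.
    """
    in_visible = 400.0 <= wavelength_nm <= 700.0
    in_ir_band = abs(wavelength_nm - ir_center_nm) <= ir_half_width_nm
    return in_visible or in_ir_band

for wl in (450, 650, 750, 850, 950):
    print(wl, dual_bandpass_transmits(wl))
# 450 True, 650 True, 750 False, 850 True, 950 False
```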
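
In the square-wave embodiments, the system records when the leading and trailing edges were projected and when they were received. One plausible use of those timestamps is a time-of-flight range calculation that averages the two edge delays; the sketch below is illustrative only and not taken from the specification.

```python
C = 299_792_458.0  # speed of light, m/s

def tof_distance(tx_lead_s, tx_trail_s, rx_lead_s, rx_trail_s):
    """Range from the round-trip delay of a square illumination pulse.

    The delay is estimated from both the leading and trailing edges and
    averaged, which suppresses edge-detection jitter common to both edges.
    """
    delay_lead = rx_lead_s - tx_lead_s
    delay_trail = rx_trail_s - tx_trail_s
    round_trip = 0.5 * (delay_lead + delay_trail)
    return C * round_trip / 2.0   # divide by 2: out and back

# Example: edges return about 10 ns after they were projected -> roughly 1.5 m.
print(round(tof_distance(0.0, 50e-9, 10e-9, 60e-9), 3), "m")
```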
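
Finally, the eye-safety definitions describe setting the number of image pulses for one scan, calculating the energy of each pulse and the total intensity per frame, and scanning only while that total stays below an eye safety limit. The decision loop below follows those steps in outline; the limit value and the pulse-halving fallback are illustrative assumptions.

```python
def plan_safe_scan(pulses_per_scan, pulse_energy_j, eye_safety_limit_j_per_frame=1e-4):
    """Decide whether a scan may run, following the per-frame budget described above.

    Total intensity per frame is taken here as pulses * energy-per-pulse.
    If the budget is exceeded, the pulse count is halved and re-checked
    (the limit value and the halving strategy are illustrative assumptions).
    """
    while pulses_per_scan > 0:
        total_per_frame_j = pulses_per_scan * pulse_energy_j
        if total_per_frame_j < eye_safety_limit_j_per_frame:
            return {"scan": True, "pulses": pulses_per_scan,
                    "energy_per_frame_j": total_per_frame_j}
        pulses_per_scan //= 2
    return {"scan": False, "pulses": 0, "energy_per_frame_j": 0.0}

print(plan_safe_scan(pulses_per_scan=307_200, pulse_energy_j=2e-10))
```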
  • Figure 1 is a perspective view of components consistent with certain aspects related to the innovations herein.
  • Figures 2A - 2B show an example monolithic array and projection lens, front side and perspective view consistent with certain aspects related to the innovations herein.
  • Figures 3A - 3B are front, top, side, and perspective views showing an example array consistent with certain aspects related to the innovations herein.
  • Figures 4A - 4B are front, top, side, and perspective views showing an example array with a flexible PCB consistent with certain aspects related to the innovations herein.
  • Figure 5 is an illustration of an example full / flood array illuminated target area consistent with certain aspects related to the innovations herein.
  • Figures 6A - 6E are a perspective view and sequence illustrations of example array column illuminations consistent with certain aspects related to the innovations herein.
  • Figures 7A - 7E are a perspective view and sequence illustrations of example sub-array illuminations consistent with certain aspects related to the innovations herein.
  • Figures 8A - 8E are a perspective view and sequence illustrations of example single array element illuminations consistent with certain aspects related to the innovations herein.
  • Figure 9 is a perspective view of example system components of certain directional illumination embodiments herein.
  • Figures 10A-10D show example views of various possible scanning mechanism designs consistent with certain aspects related to the innovations herein.
  • Figure 11 is a depiction of a target area illuminated by an example directional scanning illumination consistent with certain aspects related to the innovations herein.
  • Figure 12 depicts an example embodiment of a 2-axis MEMS consistent with certain aspects related to the innovations herein.
  • Figure 13 depicts an example embodiment of a configuration using two single-axis MEMS mirrors according to certain embodiments herein.
  • Figure 14 depicts an example embodiment including a single rotating polygon and a single axis mirror consistent with certain aspects related to the innovations herein.
  • Figure 15 depicts an example embodiment including dual polygons consistent with certain aspects related to the innovations herein.
  • Figure 16 is a depiction of an example full target illumination consistent with certain aspects related to the innovations herein.
  • Figure 17 is an illustration of an illumination utilized to create a subject outline consistent with certain aspects related to the innovations herein.
  • Figure 18 is an illustration of illumination of a sub-set of the subject, consistent with certain aspects related to the innovations herein.
  • Figure 19 is an illustration of illumination of multiple sub-sets of the subject, consistent with certain aspects related to the innovations herein.
  • Figure 20 depicts an example skeletal tracking of a target consistent with certain aspects related to the innovations herein.
  • Figure 21 depicts an example projection of a pattern onto a target area consistent with certain aspects related to the innovations herein.
  • Figure 22 is a flow chart depicting target illumination and image recognition consistent with certain aspects related to the innovations herein.
  • Figure 23 illustrates system components and their interaction with both ambient full spectrum light and directed NIR consistent with certain aspects related to the innovations herein.
  • Figure 24 is a perspective view of an example video imaging sensing assembly consistent with certain aspects related to the innovations herein.
  • Figure 25 is an associated graph of light transmission through a certain example filter consistent with certain aspects related to the innovations herein.
  • Figure 26A is a perspective view of the video imaging sensing assembly of the present invention illustrating one combined notch and narrow band optical filter utilizing two elements consistent with certain aspects related to the innovations herein.
  • Figure 26B is an associated graph of light transmission through certain example filters of certain embodiments herein.
  • Figure 27A is a perspective view of an example video imaging sensing assembly illustrating three narrow band filters of different frequencies consistent with certain aspects related to the innovations herein.
  • Figure 27B is an associated graph of light transmission through certain example filters consistent with certain aspects related to the innovations herein.
  • Figure 28 is a perspective view of triangulation embodiment components consistent with certain aspects related to the innovations herein.
  • Figure 29 is a depiction of block areas of a subject as selected by the user or recognition software consistent with certain aspects related to the innovations herein.
  • Figure 30 is a depiction of a single spot map as determined by the user or recognition software consistent with certain aspects related to the innovations herein.
  • Figure 31 depicts an example embodiment showing superimposed distance measurements in mm as related to certain embodiments herein.
  • Figure 32 depicts an example multiple spot map as determined by the user or recognition software consistent with certain aspects related to the innovations herein.
  • Figure 33 depicts an example embodiment showing superimposed distance in mm and table as related to certain embodiments herein.
  • Figure 34 depicts an example embodiment showing axial alignment of the components of directed light source and the image sensor consistent with certain aspects related to the innovations herein.
  • Figure 35 shows an example embodiment with a configuration including axial alignment and no angular component to the light source consistent with certain aspects related to the innovations herein.
  • Figure 36 shows an example embodiment with a configuration including axial alignment and an angular component to the light source consistent with certain aspects related to the innovations herein.
  • Figures 37A-37C depict an example embodiment showing top, side, and axial views of configurations consistent with certain aspects related to the innovations herein.
  • Figures 38A-38C depict an example embodiment showing top, side, and axial views of a configuration according to certain embodiments herein with a horizontal and vertical offset between the image sensor and the illumination device.
  • Figure 39 depicts an example embodiment configuration including axial alignment and an angular component to the light source with an offset in the Z axis between the image sensor and the illumination device consistent with certain aspects related to the innovations herein.
  • Figure 40 depicts an example embodiment of a process flow and screenshots consistent with certain aspects related to the innovations herein.
  • Figure 41 depicts an example embodiment including light interacting with an image sensor consistent with certain aspects related to the innovations herein.
  • Figure 42 depicts an example embodiment of image spots overlaid on a monochrome pixel map of a sensor consistent with certain aspects related to the innovations herein.
  • Figure 43 shows an example perspective view of an example of illumination being directed onto a human forehead for biometrics purposes consistent with certain aspects related to the innovations herein.
  • Figure 44A shows an example embodiment of sequential triangulation and a perspective view including one line of sequential illumination being directed into a room with a human figure consistent with certain aspects related to the innovations herein.
  • Figure 44B shows an example embodiment of sequential triangulation and a perspective view including select pixels consistent with certain aspects related to the innovations herein.
  • Figure 45 shows an example embodiment of a human subject with a projected image consistent with certain aspects related to the innovations herein.
  • Figure 46A is an example embodiment showing a human subject with a projected illumination incorporating safety eye blocking consistent with certain aspects related to the innovations herein.
  • Figure 46B is another example embodiment showing a human subject with a projected illumination incorporating safety eye blocking consistent with certain aspects related to the innovations herein.
  • Figure 47A is a detailed illustration of a human eye and the small output window of the illumination device.
  • Figure 47B is a human eye pupil relative to the small illumination device output window.
  • Figure 47C is a detailed illustration of a human eye and the large output window of the illumination device.
  • Figure 47D is a human eye pupil relative to the large illumination device output window.
  • Figure 48A is an example embodiment showing a chart assigning color values to shades of gray consistent with certain aspects related to the innovations herein.
  • Figure 48B shows an example perspective view of certain embodiments herein including illumination directed onto a human figure after color enhancement consistent with certain aspects related to the innovations herein.
  • Figure 49A is an example graph showing a square wave formed by different systems consistent with certain aspects related to the innovations herein.
  • Figure 49B is an example perspective view illustrating one line of a propagated square wave consistent with certain aspects related to the innovations herein.
  • Figure 50A is an example perspective view of the throw angle effect on projected patterns consistent with certain aspects related to the innovations herein.
  • Figure 50B is an example perspective view showing calibrated projected patterns to compensate for distance consistent with certain aspects related to the innovations herein.
  • Figure 50C is an example perspective view of oriented calibration based on object shape consistent with certain aspects related to the innovations herein.
  • Figure 51 is an example table of projected pattern methodologies consistent with certain aspects related to the innovations herein.
  • Figure 52A is a perspective view of an example of an adjacent configuration consistent with certain aspects related to the innovations herein.
  • Figure 52B is a perspective view of an example system consistent with certain aspects related to the innovations herein.
  • Figure 52C is a perspective view of an example of an objective configuration consistent with certain aspects related to the innovations herein.
  • Enhanced software and hardware control of light sources has opened vast possibilities for gesture recognition, depth-of-field measurement, image/object tracking, and three dimensional imaging, among other things.
  • the embodiments here may work with such software and/or systems to illuminate targets, capture image information of the illuminated targets, and analyze that information for use in any number of operational situations. Additionally, certain embodiments may be used to measure distances to objects and/or targets in order to aid in mapping of three dimensional space, create depth of field maps and/or point clouds.
  • Object or gesture recognition is useful in many technologies today. Such technology can allow for system/software control using human gestures instead of keyboard or voice control.
  • the technology may also be used to map physical spaces and analyze movement of physical objects. To do so, certain embodiments may use an illumination coupled with a camera or image sensor in various configurations to map the target area.
  • the illumination could be sourced any number of ways including but not limited to arrays of Light Emitting Diodes (LEDs) or directional scanning laser light.
  • Direction and eye safety may be achieved, depending on the configuration of the system, by utilizing an addressable array of emitting devices or using a scanning mechanism, while minimizing illumination to non-targeted areas, thus reducing the overall energy required as compared with flood illumination.
  • the system may also be used to calculate the amount of illumination required, the total output power, and help determine the duration of each cycle of illumination.
  • the system may then compare the illumination requirements to any number of maximum eye safe levels in order to adjust any of the parameters for safety. This may also result in directing the light onto certain areas to improve illumination there, while minimizing illumination of other areas.
  • Various optics, filters, durations, intensities and polarizations could also be used to modify the light used to illuminate the objects in order to obtain additional illuminated object data.
  • the image capture could be through any of various cameras and image sensors.
  • Various filters, lenses and focus features could be used to capture the illuminated object data and send it to computing hardware and/or software for manipulation and analysis.
  • individual illumination elements may be grouped into columns or blocks to simplify the processing by the computers.
  • targeted areas could be thus illuminated.
  • Other examples, using directional illumination sources, could be used to project pixels of light onto a target area.
  • Such example segments/areas may each be illuminated for an approximately equal fraction of the frame period such that an image capture device, such as a Complementary Metal Oxide Semiconductor (CMOS) camera, may view and interpret the illumination as homogeneous illumination for the duration of one frame or refresh.
  • the illumination and image capture should be properly timed to ensure that the targeted areas are illuminated during the time that the image capture device collects data.
  • the illumination source(s) and the image capture should synchronize in order to ensure proper data capture. If the image capture and illumination are out of sync, the system will have difficulty determining whether the target object has moved or whether the illumination merely missed the target.
  • distance calculations derived from using the illumination and capture systems described herein may add to the information that the system may use to calculate and map three dimensional space. This may be accomplished, in certain embodiments, using triangulation measurements among the illumination source, the image capture device(s) and the illuminated object(s).
  • certain example systems may include certain components, including
  • an illumination source, such as an addressable array of semiconductor light emitting devices or a directional source using lasers, together with projection optics or a mechanical structure for spreading the light if an array of sources is used
  • an image capture device such as a CMOS, Charge Coupled Device (CCD) or other imaging device, which may incorporate a short band pass filter allowing visible and specific IR/NIR in certain embodiments
  • computing devices such as microprocessor(s), which may be used in conjunction with computing instructions to control the array or directional illumination source; database(s) and/or data storage to store data as it is collected; and object and/or gesture recognition instructions to interpret and analyze the captured image information.
  • Recognition instructions/software could be used to help analyze any captured images for any number of purposes, including identifying the subject requiring directed illumination and sending commands to the microprocessor controlling the array, identifying only the necessary elements to energize so as to direct illumination on the target, thereby creating the highest possible level of eye safe illumination on the target.
  • the system may utilize object tracking technology such as recognition software, to locate a person's eyes who may be in the target field, and block the light from a certain area around them for eye safety.
  • Such an example may keep emitted light from a person's eyes, and allow the system to raise the light intensity in other areas of illumination, while keeping the raised intensity light away from the eyes of a user or person within the system's range.
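  • A minimal sketch of this eye-blocking behavior, assuming the recognition software reports eye regions as bounding boxes in the illumination device's pixel coordinates; the function, box format, and margin value are hypothetical illustrations.

```python
def apply_eye_blackout(intensity_map, eye_boxes, margin=2):
    """Zero out illumination pixels around detected eyes.

    intensity_map -- 2D list of per-pixel drive intensities (rows x cols)
    eye_boxes     -- list of (row_min, row_max, col_min, col_max) boxes
                     reported by the recognition software (hypothetical format)
    margin        -- extra pixels of padding kept dark around each box
    """
    rows, cols = len(intensity_map), len(intensity_map[0])
    for r_min, r_max, c_min, c_max in eye_boxes:
        for r in range(max(0, r_min - margin), min(rows, r_max + margin + 1)):
            for c in range(max(0, c_min - margin), min(cols, c_max + margin + 1)):
                intensity_map[r][c] = 0          # never illuminate the eye region
    return intensity_map

# Example: a 20 x 10 intensity map with one eye box kept dark.
imap = [[255] * 10 for _ in range(20)]
apply_eye_blackout(imap, [(4, 6, 2, 4)])
print(imap[5][3])   # 0 -> this pixel stays dark
```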
  • the illumination of the target field may be accomplished a number of ways.
  • One such way is through an array of illumination sources such as LEDs.
  • Figure 1 illustrates an example system utilizing such illumination sources.
  • the illumination source may be timed in accordance with the image capture device's frame duration and rate. In this way, during one open frame time of the image capture device/camera, which can be any amount of time but is often 1/30th, 1/60th or 1/120th of a second, the illumination source may illuminate the target and/or target area.
  • These illumination sources can operate a number of ways during that one frame time, including turning on all elements, or a select number of elements, all at the same power level or intensity and for the entire frame duration.
  • Other examples include turning the illumination sources on all at the same intensity or power but changing the length of time each is on within the frame time. Still other examples include changing the power or intensity of the illumination sources while keeping the on-time the same for all, and yet another is changing both the power and the time the illumination sources are on.
  • the effective output power for the array may be measured over time to help calculate safe levels of exposure, for example, to the human eye.
  • eye safety limits may be calculated from the output power averaged over time. This output power would be affected by the variations in illumination time and intensity disclosed above.
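  • One way to express this output-power-over-time budget, assuming each energized element contributes its output power multiplied by its on-time within the frame; the function names, units, and the example limit are assumptions for illustration.

```python
def frame_average_power(elements, frame_time_s):
    """Average optical output over one frame.

    elements     -- list of (output_power_w, on_time_s) pairs, one per
                    element energized during the frame
    frame_time_s -- duration of one image-capture frame (e.g. 1/60 s)
    """
    energy_j = sum(p * t for p, t in elements)      # power x time per element
    return energy_j / frame_time_s                   # time-averaged output power

def is_eye_safe(elements, frame_time_s, l_max_w):
    """Compare the frame's average output against an assumed eye-safe limit."""
    return frame_average_power(elements, frame_time_s) <= l_max_w

# Example: four elements at 20 mW, each on for a quarter of a 1/60 s frame.
frame = 1.0 / 60.0
elems = [(0.020, frame / 4.0)] * 4
print(frame_average_power(elems, frame), is_eye_safe(elems, frame, 0.010))
```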
  • the illumination device 102 is arranged as an array 102 utilizing diverging projection optics 104, housed on a physical mechanical structure 106.
  • the array of illumination sources is arranged to generate directed illumination 108 on a particular target area 110, shown in this example as a human form 112 and an object 114, but the target could be any number of things.
  • the illumination device 102 in Figure 1 is connected to a computer system including an example microprocessor 116, as well as the image capture system shown here as a video imaging camera 118, lens tube 120, camera lens 122, and camera filter 124.
  • the system is also shown in communication with a computer system including object recognition software or instructions 126 that can enable the system to direct and/or to control the illumination in any number of ways described herein.
  • the array 102 is shown connected to a computing system including a microprocessor 116 which can individually address and drive the different semiconductor light emitting devices 102 through an electronic control system.
  • the example microprocessor 116 may be in communication with a memory or data storage (not pictured) for storing predefined and/or user generated command sequences.
  • the computing system is further shown with an abstract of recognition software 126, which can enable the software to control the directed illumination.
  • these objects are shown in exploded and/or exaggerated forms, whereas in practice they may take any number of shapes and configurations. Here, they are shown as sometimes separate and symbolic icons.
  • the illumination device 202 may comprise a monolithic array 202 of semiconductor light emitting devices 206 and projection optics 204, such as a lens, arranged between the array 202 of semiconductor light emitting devices 206 and the target area.
  • the array 202 may be any number of things including but not limited to, separate Light Emitting Diodes (LEDs), Edge Emitting Lasers (EELs), Vertical Cavity Surface Emitting Lasers (VCSELs) or other types of semiconductor light emitting devices.
  • the monolithic array 202 is arranged on a printed circuit board (PCB) 208, along with associated driving electronics.
  • the semiconductor light emitting devices 206 are uniformly distributed over the area of the array 202 thereby forming a matrix. Any kind of arrangement of light sources could be used, in order to allow for the light to be projected and directed toward the target area.
  • the number of semiconductor light emitting devices 206 used may vary. For example, an array provided with a 10 x 20 matrix of LEDs may result in proper directed illumination for a particular target area. For standalone devices, a PCB array of discrete semiconductor light emitting devices such as LEDs may suffice, for example as an auxiliary system for a laptop or television.
  • the semiconductor light emitting devices 206 are either physically offset or the alignment of alternating columns is offset such that it creates a partially overlapping pattern of illumination. This partially overlapping pattern is described below, for example later in Figure 5.
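  • A toy sketch of individually addressing elements of a matrix such as the 10 x 20 example above; the class, the 0-255 intensity scale, and the column helper are hypothetical and stand in for whatever driving electronics the final design uses.

```python
class LedMatrix:
    """Toy model of an addressable 10 x 20 array of emitters."""

    def __init__(self, cols=10, rows=20):
        self.cols, self.rows = cols, rows
        self.state = [[0] * cols for _ in range(rows)]   # 0 = off

    def set_element(self, col, row, intensity):
        """Drive a single emitter; intensity 0..255 is an assumed scale."""
        if not (0 <= col < self.cols and 0 <= row < self.rows):
            raise IndexError("element outside the array")
        self.state[row][col] = intensity

    def set_column(self, col, intensity):
        """Energize a whole 1-wide column, as in the column-scan examples."""
        for row in range(self.rows):
            self.state[row][col] = intensity

array = LedMatrix()
array.set_column(3, 128)        # light column 4 of 10 at half intensity
array.set_element(0, 0, 255)    # and one corner element at full intensity
```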
  • the illumination device 306 may include an array of semiconductor light emitting devices 306 and a mechanical structure 302, or framework with a defined curvature, onto which are mounted PCBs each carrying one or more semiconductor light emitting devices 306 X-wide by Y-tall, arranged with a defined angle of curvature and attached to a physical frame.
  • the sub-array PCBs 310 may comprise a sub-array of semiconductor light emitting devices 306 X-wide by Y-tall, hereinafter referred to as sub-array.
  • Each sub-array may include any number of illumination sources including but not limited to, separate LEDs, EELs, VCSELs or other types of semiconductor light emitting devices.
  • the array 302 with sub-array PCBs 310 may include associated driving electronics.
  • the semiconductor light emitting devices 306 may be uniformly distributed over the area of the array 302 sub-array PCBs 310 thereby forming a matrix.
  • the number of semiconductor light emitting devices 306 used in the matrix may vary and the determination may be predefined, or defined by the user or the software.
  • An illumination device, for example, may include a 10 x 20 array of LEDs for directed illumination.
  • a PCB sub-array of discrete semiconductor light emitting devices such as LEDs may be used for an auxiliary system for a laptop or television.
  • the array 302 could be constructed of monolithic sub-arrays, i.e., single chip devices having all of the semiconductor light emitting devices on a single chip.
  • Figure 3B shows a perspective view of a curved array from Figure 3A.
  • the illumination device 402 may include an array of semiconductor light emitting devices 406, a flexible PCB 412 arranged with a defined angle of curvature which may be attached to a physical frame, including associated driving electronics.
  • the semiconductor light emitting devices 406 may be uniformly distributed over the area of the array 402 thereby forming a matrix.
  • the number of semiconductor light emitting devices 406 used in the matrix may vary and the determination may be predefined, or defined by the user or the software.
  • an illumination device provided with 10 x 20 array LEDs may provide sufficient directed illumination for a particular application.
  • a flexible PCB made up of discrete semiconductor light emitting devices, such as LEDs, may also be used, for example in an auxiliary system for a laptop or television.
  • Figure 4B shows another example view of the curved array from Figure 4 A.
  • Figure 5 depicts an illustration of an example array 502 and what a target area 520 energized and/or illuminated by the array 502 may look like.
  • each example circle 522 depicts the coverage area of one of the light emitting devices or illumination sources 506.
  • the coverage of each light emitting device 522 may overlap with the adjacent coverage 522, depending on the width of the light emitting device beam and the distance of the target object 530 from the array 502.
  • any arrangement of single illumination devices could be used in any combination.
  • the example in Figure 5 shows all of the devices on at once.
  • Figure 6A depicts an example of the system illuminating a target area and a human 630.
  • the system could also be used to target anything else in the target area, such as, an object 632.
  • the example array 602 is shown with one example column of light sources and their respective light beam coverage circles 622. Using an example column defined as one element or light source wide by X elements tall (1 x 10 in this example, but the number of elements can vary), the system is used to illuminate specific targets.
  • only certain precise areas of the overall target area require illumination.
  • the system could first identify those precise areas within the overall target area using object recognition, and then illuminate those precise areas to highlight them for additional granularity.
  • the system may provide those coordinates to the computing system including the microprocessor which in turn may calculate the correct precise area elements to illuminate and/or energize.
  • the system could also decipher safety parameters such as the safe duration of that illumination during one cycle.
  • FIGS. 6B and 6C depict the first and the second illuminated columns in an example sequence, where the light emitting array 602 is shown with a particular column in dark, corresponding to a light coverage 622 on the target area.
  • Figure 6B shows an example where column one of four is lit; 6C shows two of four, etc.
  • Figure 6D depicts the last column of the sequence to be illuminated, which is four of four in the example sequence shown here.
  • the system's sequential illumination is shown in parts.
  • Figure 6E depicts what the camera would see in an example duration of one cycle corresponding to the amount of time of one capture frame. In this example, that is columns one through four, with the light coverage circles 622 now overlapping.
  • the illumination source could flip through multiple iterations of illuminating a target, within the time of one camera or image capture device shutter frame.
  • the multiple and sequential illumination cycles show up in one frame of image capture, and to the image capture device, appear as if they are all illuminated at once. Any number of configurations, illumination patterns and timing could be used, depending on the situation.
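  • A sketch of the timing idea just described: one capture frame is divided into equal slots, one per column (or block, or element), so the image sensor integrates every sub-illumination into a single frame; the 60 fps frame rate and the equal-slot assumption are illustrative.

```python
def schedule_groups(num_groups, fps=60.0):
    """Split one capture frame into equal illumination slots.

    Returns a list of (start_s, duration_s) tuples, one per group
    (column, block, or single element), covering exactly one frame so
    the camera integrates all of them in a single exposure.
    """
    frame_time = 1.0 / fps
    slot = frame_time / num_groups
    return [(i * slot, slot) for i in range(num_groups)]

# Four columns illuminated back-to-back within one 1/60 s frame.
for start, dur in schedule_groups(4):
    print(f"column starts at {start*1000:.2f} ms, on for {dur*1000:.2f} ms")
```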
  • Figure 7A depicts another example of system's ability to illuminate different target areas for capture and recognition.
  • the goal is to recognize and identify an example target 730, which could be anything, such as an object 732.
  • This example uses blocks of elements projecting their respective beams of illumination 722 defined as Y number of elements wide by X elements tall (in this example 2X2 but the number of elements can vary). This is different than the columns shown in Figures 6A-6E.
  • the system may be used to identify the coordinates of the area which requires illumination and provides that to the microprocessor which in turn may calculate the correct elements to energize and the safe duration of that illumination during one cycle.
  • Figures 7B and 7C depict the first and the second illuminated blocks in the example sequence.
  • 7B is one of seven
  • 7C is two of seven
  • Figure 7D depicts the last block of the sequence to be illuminated, which is seven of seven.
  • Figure 7E depicts what the camera may see illuminated within the duration of one example frame, which is blocks one through seven and all of the illumination circles 722 now overlapping. As described in Figure 6E, Figure 7E is the culmination of multiple illuminations, all illuminated at some time during one frame of the image capture device.
  • Figure 8A depicts an example of the system identifying targets, such as a human 830, but the target could be anything, such as an object 832, within a target area.
  • This example uses individual illumination sources or elements, which allow the image capture devices and computer / software to identify the coordinates of the area which may require specific illumination. Thus, the system can then calculate the specific target elements to illuminate and/or energize for greater granularity, or safety measures.
  • Figures 8B and 8C depict examples of the first and the second illuminated elements in the example sequence.
  • 8B is one of twenty
  • 8C is two of twenty.
  • Figure 8D depicts the last element of the sequence to be illuminated, which is twenty of twenty.
  • Figure 8E depicts what the camera or image capture device may see in duration of one frame.
  • the illumination sources have illuminated one through twenty, now with illumination circles 822, all overlapping the adjacent one, and the image capture device detects all of the illumination within one frame.
  • Example embodiments here may be configured to determine certain operational statistics. Such statistics may include measuring the amount, intensity and/or power the system puts out. This can be used, for example, to ensure that safety limits are met, such as eye safety limits for projection of IR/NIR.
  • the system may utilize information provided by the illumination source and image sensors to determine the correct duration of each element during one cycle, the period between refreshes, or the time length of one frame.
  • F/E = P - the length of time one element or block of elements is energized during a cycle
  • the system may verify the eye safe limits of each cycle.
  • Each semiconductor light emitting device may be assigned a value corresponding to the eye safe limits determined for the array and associated optics.
  • the variables which determine eye safe limits vary greatly depending upon the size of the external aperture, wavelength of light, mode, coherence, and duration. The specific criteria will be established to match the specifications of the final design, establishing an Lmax, the maximum eye safe level per cycle. If the calculated illumination for a cycle exceeds Lmax, the system may reduce the intensity and/or on-time of the elements until the cycle is at or below Lmax.
  • a directional illumination may be used.
  • the target area and subsequent targeted subject areas may be illuminated using a scanning process or a process that uses a fixed array of Micro Electrical Mechanical Systems ("MEMS") mirrors.
  • Any kind of example laser direction control could be used, and more examples are discussed below.
  • any resolution of directional scan could be used, depending on the ability to pulse the illumination source, laser for example, and the direction control system to move the laser beam.
  • the laser may be pulsed, and the MEMS may be moved, directing each separate pulse, so that separate pixels are able to be illuminated on a target area, during the time it takes the camera or image capture system to open for one frame. More granularity/resolution could be achieved if the laser could be pulsed faster and/or the directional control could move faster. Any combination of these could add to the number of pixels that could be illuminated during one frame time.
  • the illumination projection device may have, for example, the ability to control the intensity of each pixel, by controlling the output power or light intensity for each pulse.
  • the intensity of each pulse can be controlled by the amount of electrical current being applied to the semiconductor light emitting device; by sub-dividing the pulse into smaller increments and controlling the number of sub-pulses on during one pulse; or, in the case of an array of MEMs, by controlling the duration of the pulse for which the light is directed to the output, for example.
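  • A minimal sketch of the sub-pulse approach to per-pixel intensity, assuming each pulse window is divided into a fixed number of equal sub-pulses; the sub-pulse count of 16 is an arbitrary example.

```python
def subpulses_on(target_fraction, subpulses_per_pulse=16):
    """Number of equal sub-pulses to switch ON for a desired intensity.

    target_fraction -- desired intensity as a fraction of full output (0..1)
    Returns an integer count; the delivered intensity is quantized to
    1/subpulses_per_pulse steps.
    """
    target_fraction = min(max(target_fraction, 0.0), 1.0)
    return round(target_fraction * subpulses_per_pulse)

print(subpulses_on(0.3))   # about 5 of 16 sub-pulses ON for ~30% intensity
```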
  • Scanned light may be precisely directed on a targeted area to minimize illumination to non-targeted areas. This may reduce the overall energy required to conduct proper image capture, as compared with the level of flood illumination required to achieve the same level of illumination on a particular target. Instructions and/or software may be used to help calculate the amount of illumination required for an image capture, the output power of each pulse of illumination to achieve that, the number of pulses per scanning sequence, and help determine the total optical output of each frame of illumination.
  • the system may specifically direct illumination to both stationary and in-motion objects and targets such as humans.
  • the system may perform a complete illumination of the entire target area, thus allowing the recognition software to check for new objects or changes in the subject(s) being targeted.
  • a light- shaping diffuser can be arranged between the semiconductor light emitting device(s) and the projection optics, to create blurred images of the pulses. Blurring may reduce the dark or un- illuminated transitions between the projected pixels of illumination. Utilization of a diffuser may have the effect of improving eye safe output thus allowing for increased levels of illumination emitted by the device.
  • the device can produce dots or targets of illumination at key points on the subject for the purpose of calculating distance or providing reference marks for collection of other information. Distance calculations are disclosed in more detail below.
  • Figure 9 illustrates an example illumination device 950, utilizing diverging projection optics 952, to generate directed illumination 954 on a target area 910, as identified in this example as human form 912 and object 914.
  • the illumination device 950 in this example is connected to a microprocessor 916, a video imaging sensor 918, lens tube 920, camera lens 924, camera filter 922, object recognition software 926, enabling the recognition software to control the illumination.
  • these objects are shown in exaggerated and/or exploded forms, whereas in practice they may take any number of shapes and configurations. Here, they are shown as sometimes separate and symbolic icons.
  • the illumination device 950 may be configured to be in communication with and/or connected to a computing device such as a microprocessor 916 which can control the scanning mechanism and the semiconductor light emitting device 950.
  • the microprocessor 916 which may be equipped with and/or in communication with memory or storage for storing predefined and/or user generated command sequences.
  • the computing system may receive instructions from recognition software 926, thereby enabling the system to control the directed illumination.
  • Figure 9 also illustrates example embodiments based on an embodiment where a single image sensor 918 is utilized to obtain both red, green, blue (“RGB”) and NIR data for enhancing the ability of machine vision and recognition software 926.
  • This may require the utilization of a band pass filter 924 to allow for RGB imaging and a narrow band filter 922 closely matched to the wavelength of a NIR light source 954 used for augmenting the illumination.
  • the optical filtration can be accomplished by single or multiple element filters.
  • the NIR light source 954 can be from light emitting devices such as, for example but not limited to, LEDs, EEL, VCSELs, DPL, or other semiconductor-based light sources.
  • the way of directing the light onto the subject area 912 can be via many sources including a MEMS device 950 such as a dual axis or eye MEMS mirror, two single axis MEMS mirrors working in conjunction, a multiple MEMS mirror array, or a liquid crystal array, as examples.
  • Other reflective devices could also be used to accurately point a directed light source, such as a laser beam.
  • these objects are shown in exaggerated forms, whereas in practice they may take any number of shapes and configurations. Here, they are shown as sometimes separate and symbolic icons.
  • a light shaping diffuser (not pictured), can be arranged somewhere after the illumination device 950 and the projection optics 952 to create a blurred projected pixel.
  • the light shaping diffuser may create a blurred projection of the light and a more homogenous overlap of illumination.
  • the light shaping diffuser also has the added effect of allowing for increased levels of illumination while remaining within eye safe limits.
  • the illumination device 1050 includes a semiconductor light emitting device 1056, and a scanning mechanism 1058, projection optics 1052, such as a lens.
  • the illumination device can include a semiconductor light emitting device 1056, such as any number of devices including but not limited to, an LED, EEL, single element or an array of VCSELs, DPL, or other semiconductor based light emitting device, producing light in the infrared and or near infrared light wavelengths.
  • the intensity per pulse can be controlled by a change in numerous things, including: input current, which correlates to a change in output power; frequency, which would divide each pulse into sub-pulses of an equal energy output, with intensity determined by the number of sub-pulses "ON" during one pulse; or, in the case of an array where each element of the array has a fixed output, the number of elements "ON" during one pulse.
  • the light may be directed to the scanning mechanism 1058 through a beam splitter 1060.
  • the scanning mechanism 1058 may be a digital light processor (DLP) or similar device using an array of MEMs mirrors, LCOS (Liquid Crystal On Silicon), LBS (Laser Beam Steering), or combination of two single axis MEMs mirrors or a dual axis or "Eye" type of MEM as mirrors.
  • the vertical scan could perform a linear scan at a low frequency (60 Hz as an example display refresh rate), whereas the horizontal scan requires a higher frequency (for example, greater than 90 kHz for a 1920 x 1080 HD display).
  • the stability of the scan in either direction could affect the results; therefore, stability within one pixel, for example, could provide good resolution.
  • Figure 10B shows an alternate embodiment than Figure 10A, where the semiconductor light emitting device 1056 is aligned differently, and without a reflector 1062 needed, as in Figure 10A, before the beam splitter 1060.
  • the reflector 1062 could be a partial mirror as well, allowing light to pass from one side and reflecting from another.
  • the illumination device 1050 includes a semiconductor light emitting device 1056, an additional semiconductor light emitting device 1057 which may be a source of white light or a single source emitting either visible red, green and blue light or a secondary source of IR/NIR light, a scanning mechanism 1058, and projection optics 1052, such as a lens.
  • the illumination device 1050 can include a semiconductor light emitting device 1056, such as, any number of things including but not limited to, an LED, EEL, single element or an array of VCSELs, DPL, or other semiconductor based light emitting devices, producing light in the infrared and/or near infrared light wavelengths.
  • the intensity per pulse can be controlled by a change in: input current, which correlates to a change in output power; frequency, which would divide each pulse into sub-pulses of an equal energy output, with intensity determined by the number of sub-pulses "ON" during one pulse; or, in the case of an array where each element of the array has a fixed output, the number of elements "ON" during one pulse.
  • the light may be directed to the scanning mechanism 1058 through a beam splitter 1060.
  • a reflector 1062 is shown between the light emitting device 1056 and the beam splitter 1060.
  • the reflector 1062 could be a partial mirror as well, allowing light to pass from one side and reflecting from another.
  • the scanning mechanism 1058 may be any number of things including but not limited to, a DLP or similar device using an array of MEMs mirrors, LCOS, LBS , or combination of two single axis MEMs mirrors or a dual axis or "Eye" type of MEMs mirrors.
  • the vertical scan could perform a linear scan at a low frequency (60 Hz for a typical display refresh rate), whereas the horizontal scan requires a higher frequency (greater than 90 kHz for a 1920 x 1080 HD display), for example. If scan in either direction is stable, within one pixel resolution, less error correction is needed.
  • the illumination device 1050 includes a semiconductor light emitting device 1056, and additional semiconductor light emitting devices 1057 which may be single sources emitting visible red, green and blue light or a secondary source of IR/NIR light, a scanning mechanism 1058, and projection optics 1052, such as a lens.
  • light emitting devices 1057 could be any number of single colored lasers including but not limited to red, green and blue, and the associated differing wavelengths. These illumination sources, for instance lasers 1057 could each have a unique wavelength or wavelengths as well.
  • the illumination device can include a semiconductor light emitting device, such as any number of things including but not limited to, an LED, EEL, single element or an array of VCSELs, DPL, or other semiconductor based light emitting device, producing light in the infrared and or near infrared light wavelengths.
  • the intensity per pulse can be controlled by a change in: input current, which correlates to a change in output power; frequency, which would divide each pulse into sub-pulses of an equal energy output, with intensity determined by the number of sub-pulses "ON" during one pulse; or, in the case of an array where each element of the array has a fixed output, the number of elements "ON" during one pulse.
  • the light may be directed to the scanning mechanism 1058 through a beam splitter 1060.
  • the scanning mechanism 1058 may be any number of things including but not limited to, a DLP or similar device using an array of MEMs mirrors, LCOS, LBS, or combination of two single axis MEMs mirrors or a dual axis or "Eye" type of MEMs mirrors.
  • the vertical scan could perform a linear scan at a low frequency (60 Hz for a typical display refresh rate), whereas the horizontal scan may require a higher frequency (greater than 90 kHz for a 1920 x 1080 HD display).
  • Figure 11 depicts an example illustration of how the system may scan the subject area being illuminated.
  • This kind of example scan is an interlaced scan. Any number of other example scan patterns may be used to scan an illuminated area; the one in Figure 11 is merely exemplary.
  • the scanning mechanism may produce a scanned illumination in other patterns, such as but not limited to, a raster, progressive or de-interlaced or other format depending upon the requirements of the overall system.
  • each horizontal line is divided into pixels which are illuminated with one or more pulses per pixel.
  • Each pulse width/length becomes a pixel, as the MEMS or reflector scans the line in a continuous motion and then moves to the next horizontal line.
  • 407,040 pixels may cover the target area, which is limited by the characteristics of the steering mechanism, in this example with 848 pixels per horizontal line and 480 horizontal lines.
  • Other numbers of pixels may also be used.
  • if the MEMS can move through 480 lines in the vertical axis and 848 positions in the horizontal axis, assuming the laser can pulse at the appropriate rate, 407,040 pixels could be projected to cover a target area.
  • any other numbers of pixels may be used depending on the situation and the ability of the laser to pulse and the directional control to position each pulse emission.
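  • The arithmetic behind the 848 x 480 example, along with the pulse rate the laser would need to sustain at a given frame rate; the 60 fps figure is an assumption for illustration, not a requirement from the disclosure.

```python
def scan_requirements(h_pixels=848, v_lines=480, fps=60.0):
    """Pixels per frame and the pulse rate the laser must sustain."""
    pixels_per_frame = h_pixels * v_lines          # 848 x 480 = 407,040
    pulse_rate_hz = pixels_per_frame * fps         # one pulse per pixel
    line_rate_hz = v_lines * fps                   # horizontal lines per second
    return pixels_per_frame, pulse_rate_hz, line_rate_hz

print(scan_requirements())   # (407040, 24422400.0, 28800.0)
```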
  • Example embodiments here may be used to determine certain operational statistics. Such statistics may include measuring the amount, intensity and/or power the system puts out. This can be used, for example, to ensure that safety limits are met, such as eye safety limits for projection of IR/NIR.
  • the system, and in some embodiments the microprocessor computer system may be instructed via code which may utilize the information provided from the illumination source and/or image sensor to help determine the correct duration of each pulse during one frame.
  • Recognition software analyzes image information from a CMOS or CCD sensor.
  • the software determines the area(s) of interest.
  • the coordinates of that area(s) of interest are sent to a microprocessor with the additional information as to the refresh rate / scanning rate / fps (frames per second), of the system.
  • n - total number of pixels/pulses in a scan
  • I - energy intensity (may also be defined as luminous intensity or radiant intensity)
  • F FPS - length of time of one frame or one complete scan per second
  • each light pulse may be assigned a value corresponding to the eye safe limits as determined by the semiconductor light emitting device and associated optics.
  • the specific criteria will be established using the specifications of the final design of the light emitting device. This may establish an Lmax - maximum eye safety level per frame. If the calculated total for one frame, Fi, exceeds Lmax, the system will reduce I and/or P until Fi ≤ Lmax.
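  • A sketch of one reading of this check: sum the per-pulse contributions over a frame and scale the intensity down until the total is at or below Lmax; the variable names follow the definitions above, but the reduction factor and iteration cap are arbitrary.

```python
def enforce_frame_limit(pulses, l_max, reduction=0.9, max_iters=50):
    """Reduce pulse intensity until the frame total is eye safe.

    pulses -- list of (intensity, pulse_time_s) pairs, one per pixel/pulse
    l_max  -- maximum allowed total for one frame (same units as I * P)
    Returns the adjusted pulse list, or None if the limit cannot be met.
    """
    for _ in range(max_iters):
        f_i = sum(i * p for i, p in pulses)        # frame total: sum of I * P
        if f_i <= l_max:
            return pulses
        pulses = [(i * reduction, p) for i, p in pulses]   # lower I, keep P
    return None                                    # give up rather than exceed Lmax

# 1,000 pulses at intensity 1.0 for 1 microsecond each, against a limit of 5e-4.
safe = enforce_frame_limit([(1.0, 1e-6)] * 1000, l_max=5e-4)
print(safe is not None and sum(i * p for i, p in safe) <= 5e-4)   # True
```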
  • the system may include additional eye safe protections.
  • the system incorporates object recognition and motion tracking software in order to identify and track a target human's eyes.
  • the system may create a blacked out space preventing the scan from illuminating or shining light directly at the identified eyes of a target human.
  • the system may also include hardware protection which incorporates circuitry designed with a current limiting system that prevents the semiconductor light emitting device from exceeding the power necessary to drive it beyond the maximum safe output level.
  • Figure 12 illustrates one example of a way to steer an illumination source, such as a laser, here by a dual axis MEMS device. Any kind of beam steering technology could be used, but in this example embodiment, a MEMS is utilized.
  • outgoing laser beam 1254 from the light source is directed onto the horizontal scan plane 1260 which directs the beam in a horizontal motion as indicated by horizontal direction of rotation 1230.
  • the horizontal scan plane 1260 may be attached to the vertical scan plane 1270.
  • the vertical scan plane 1270 and horizontal scan plane 1260 may direct the light in a vertical motion as indicated by vertical direction of rotation 1240. Both scan planes may be attached to a MEMS frame 1280.
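  • A sketch of how a controller might map a target pixel onto horizontal and vertical deflection angles for a dual axis MEMS, assuming a rectangular field of view centered on the optical axis; the field-of-view values are illustrative assumptions, and the returned values are optical angles (the mechanical mirror rotation would be roughly half).

```python
def pixel_to_mirror_angles(col, row, h_pixels=848, v_lines=480,
                           h_fov_deg=60.0, v_fov_deg=40.0):
    """Map a pixel index to horizontal/vertical optical angles (degrees).

    (0, 0) is the upper-left pixel; angles are measured from the centre
    of the field of view, so the centre pixel maps to roughly (0, 0).
    """
    h_angle = (col / (h_pixels - 1) - 0.5) * h_fov_deg
    v_angle = (row / (v_lines - 1) - 0.5) * v_fov_deg
    return h_angle, v_angle

print(pixel_to_mirror_angles(424, 240))   # near the centre of the scan
```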
  • FIG. 13 shows an example embodiment using two single axis MEMS instead of one dual axis MEMS as shown in Figure 12.
  • a system of creating a raster scan uses two single axis MEMS or mirrors to steer an illumination from a source, in this example, a laser beam.
  • Outgoing laser beam 1354 from the illumination source 1350 is directed onto the vertical scan mirror 1360 which directs the beam in a vertical motion.
  • the outgoing laser beam 1354 is then directed to the horizontal mirror 1362 which may create a horizontal sweeping pattern.
  • the combined horizontal and vertical motions of the mirrors or MEMS enables the device to direct light in a sweeping pattern.
  • the system can also be used to direct pulses of laser light at different points in space, by reflecting each pulse in a different area. Progressive illumination of the target using a pulsed illumination source may result in a scanning of a target area over a given time as disclosed above. Certain methods of scanning may be referred to as a raster scan and can produce an image in an interlaced, de-interlaced, or progressive method, for example.
  • Figure 14 illustrates an example embodiment of creating a raster scan utilizing one single axis MEMS or mirror 1460 and one rotating polygon mirror 1464.
  • Outgoing laser beam 1454 from the light source 1450 is directed onto the vertical mirror 1460 which directs the beam in a vertical motion.
  • the outgoing laser beam 1454 is then directed to the rotating polygon mirror 1464 which creates a horizontal sweeping motion of the outgoing laser beam 1454.
  • the combined horizontal and vertical motions of the mirror and the rotating polygon enable the device to direct light in a sweeping pattern.
  • This method of scanning is referred to as a raster scan and can produce an image in a number of scan patterns including but not limited to interlaced, de-interlaced, or progressive methods.
  • Figure 15 illustrates an example system of creating a raster scan utilizing two rotating polygon mirrors.
  • outgoing laser beam 1554 from the light source 1550 is directed onto the rotating polygon mirror 1560 which directs the beam in a vertical motion.
  • the outgoing laser beam 1554 is then directed to another rotating polygon mirror 1564 which creates a horizontal sweeping motion of the outgoing laser beam 1554.
  • the combined horizontal and vertical motions of the rotating polygon mirrors enable the device to direct light in a sweeping pattern.
  • This method of scanning is referred to as a raster scan and can produce an image in an interlaced, de-interlaced, or progressive method.
  • Certain embodiments may use other ways to beam steer an illumination source, and the examples described here are not intended to be limiting.
  • Other examples such as electromagnetic control of crystal reflection and/or refraction may be used to steer laser beams as well as others.
  • the users and/or system may desire to highlight a specific target within the target area field of view. This may be for any number of reasons including but not limited to object tracking, gesture recognition, 3D mapping, or any number of other reasons. Examples here include embodiments that may aid in any or all of these purposes, or others.
  • the example embodiments in the system here may first recognize an object that is selected by a user and/or the system via instructions to the computing portions. After the target is identified, the illumination portions of the system may be used to illuminate any or all of the identified targets or areas of the target. Through motion tracking, the illumination source may track the objects and change the illumination as necessary.
  • the next few example figures disclose different illumination methods that may be used in any number of example embodiments.
  • Figure 16 depicts an illustration of the effect of a targeted subject being illuminated, in this case a human form 1612.
  • the subject of illumination could be other animate or inanimate objects or combinations thereof.
  • This type of targeted illumination may be accomplished by first illuminating and recognizing a target, then directing subsequent illumination only on the specific target, in this case, a human.
  • Figure 17 depicts an illustration of the effect of a targeted subject form having only the outline illuminated 1712.
  • the subject of outlined illumination could be other animate or inanimate objects or combinations thereof (not pictured).
  • Figure 18 depicts an illustration of the effect of a sub-area of targeted subject form being illuminated in this case the right hand 1812.
  • the subject of sub-area illumination could be other animate or inanimate objects or combinations thereof (not pictured).
  • FIG. 19 depicts an illustration of the effect of multiple sub-areas of targeted subject form being illuminated in this case the right hand 1912, the face 1913 and left hand 1915.
  • the subject of multiple sub-areas illumination could be other animate or inanimate objects or combinations thereof (not pictured).
  • For a given target or target area, it may be desirable to project light on only certain areas of that target, depending on the purpose of the illumination.
  • For target motion tracking for example, it may be desirable to merely illuminate certain areas of the target, to allow for the system to only have to process those areas, which represent the entire target object to be tracked.
  • Figure 20 depicts an illustration of the effect of illumination of skeletal tracking and highlighting of key skeletal points 2012. This may allow the system to track the target using only certain skeletal points, and not have to illuminate the entire target, and process information about the entire surface of the target to track its motion.
  • the skeletal tracking and key points could be other animate objects or combinations thereof (not pictured). Again, to accomplish such targeted illumination, a target must be first illuminated and then recognized and then subsequent illumination targeted.
  • Figure 21 depicts an example illustration of the effect of illumination of targeted subject with a grid pattern 2112.
  • This pattern may be used by the recognition or other software to determine additional information such as depth and texture. Further discussion below, describes examples that utilize such pattern illuminations.
  • the scanning device may also be used to project outlines, fill, skeletal lines, skeletal points, "Z" tags for distance, De Bruijn grids, structured light, or other patterns, for example as required by the recognition software.
  • In other embodiments, the system is capable of producing and combining any number of illumination styles and patterns as required by the recognition system.
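  • A minimal sketch of generating one such pattern, a simple grid confined to a target's bounding box, as a list of scanner pixels; the box format and spacing are illustrative, and outlines, skeletal points, or structured-light codes would be produced the same way from recognition output.

```python
def grid_pattern(box, spacing=8):
    """Pixel coordinates for a grid confined to a target bounding box.

    box     -- (row_min, row_max, col_min, col_max) from recognition software
    spacing -- pixels between grid lines
    Returns a list of (row, col) pixels the scanner should illuminate.
    """
    r_min, r_max, c_min, c_max = box
    pixels = []
    for r in range(r_min, r_max + 1):
        for c in range(c_min, c_max + 1):
            if (r - r_min) % spacing == 0 or (c - c_min) % spacing == 0:
                pixels.append((r, c))
    return pixels

print(len(grid_pattern((100, 180, 300, 360), spacing=10)))
```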
  • a flow chart depicts one example of how the system may determine certain operational statistics. Such statistics may include measuring the amount, intensity and/or power the system puts out. This can be used, for example, to ensure that safety limits are met, such as eye safety limits for projection of IR/NIR. Also, the flow chart may be used to demonstrate calculations of multiple embodiments, such as the array illumination example with fixed intensity, an array with variable intensity, and also a raster scanned example using lasers described later in this disclosure, for example. [00212] The flow chart begins with the illumination device 2210, whatever embodiment that takes, as disclosed here, directing low level full scan illumination over the entire target area 2220. This allows the system to capture one frame of the target area, and the image sensor may receive that entire image 2230. From that image, the length of time of one frame or one complete scan per second may inform how the illumination device operates 2240.
  • Next, the microprocessor or system in general 2250 may determine a specific area of interest in the target area to illuminate specifically 2252. Using this information, once the system is satisfied that the identified area of interest is properly identified, the system may then map the target area and, based on that information, calculate the total level of intensity for one frame 2260. In examples where power output or total illumination per frame is important to eye safety, or to some other parameter, the system can validate this calculation against a stored or accessible maximum number or value 2270. If the calculated total intensity is less than or equal to the stored maximum, the system and/or microprocessor may provide the illumination device with instructions to complete one entire illumination scan of the target area 2280.
  • Otherwise, the system may recalculate the intensity at a lower level 2274 and repeat the calculation 2260. If the calculated number cannot be reduced to a level lower than or equal to the stored maximum, the system may be configured to not illuminate the target area 2272, or to perform some other function to limit eye exposure, and/or return an error message. This process may then repeat for every frame, or may be sampled randomly or at a certain interval.
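  • The decision structure of this flow chart, rendered as a control loop over hypothetical device, sensor, and recognizer helpers; none of the helper names come from the disclosure, and the 0.9 reduction factor is arbitrary; the sketch only mirrors the steps and reference numerals described above.

```python
def illumination_cycle(device, sensor, recognizer, stored_max, min_intensity):
    """One pass of a Figure 22 style control loop (hypothetical helpers)."""
    device.full_scan(level="low")                     # low-level scan of whole area (2220)
    frame = sensor.capture_frame()                    # one frame of the target area (2230)
    area = recognizer.find_area_of_interest(frame)    # area to illuminate (2252)
    intensity = device.plan_intensity(area)           # total intensity for one frame (2260)
    while intensity > stored_max:                     # validate against stored maximum (2270)
        intensity *= 0.9                              # recalculate at a lower level (2274)
        if intensity < min_intensity:
            return None                               # do not illuminate / report error (2272)
    device.scan(area, intensity)                      # complete one illumination scan (2280)
    return intensity
```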
  • a light shaping diffuser (reference Figure 1, at 104), may be arranged somewhere after the array (not pictured) to create a smooth projection of the semiconductor light emitting devices in the array.
  • the light shaping diffuser (not pictured) may create a smooth projection of the semiconductor light emitting devices in the array and a more homogenous overlap of illumination.
  • the light shaping diffuser (not pictured) may also have an added effect of allowing for increased levels of illumination while remaining within eye safe limits.
  • image capture devices may use a shutter or other device to break up image capture into frames. Examples of common durations are 1/30th, 1/60th or 1/120th of a second.
  • Video imaging sensors may utilize an optical filter designed to cut out or block light outside the range visible to a human being, including IR/NIR. This could make utilizing IR/NIR an ineffective means of illumination in certain examples here.
  • the optical filter may be replaced with one that is specifically designed to allow for both the visible range of wavelengths and a specific band of IR/NIR that matches that of the illumination device. This may reduce the distortion created by the IR/NIR, while allowing for the maximum response to the IR/NIR.
  • the optical filter is replaced with one specifically designed to allow for both the visible range of wavelengths and a specific band of IR/NIR that matches that of the semiconductor light source. This may help reduce the distortion created by the IR/NIR, while allowing for the maximum response to the IR/NIR.
  • the optical filter is replaced with one specifically designed to block all wavelengths except only a specific band of IR/NIR that matches that of the semiconductor light source.
  • a semiconductor light emitting device may be used to produce light in the infrared and or near infrared light wavelengths defined as 750nm to 1mm, for example.
  • the projection optics may be a projection lens.
  • IR/NIR could be used in certain situations, even if natural ambient light is present.
  • the use of IR in or around the 976nm range could be used by the illumination source, and filters on the image capture system could be arranged to only see this 976nm range.
  • the natural ambient light has a dark spot, or very low emission in the 976nm range.
  • a combined ambient and NIR device may be used for directed illumination utilizing single CMOS sensor.
  • a dual band pass filter may be incorporated into the optical path of an imaging sensor.
  • This path may include a lens, an IR blocking filter, and an imaging sensor of various resolutions.
  • the IR blocking filter may be replaced by a dual band pass filter including a band pass filter, which may allow visible light to pass in approximate wavelengths between 400nm and 700nm, and a narrow band pass or notch filter, which is closely matched to that of the IR/NIR illumination source.
  • Figure 23 illustrates the interaction of the physical elements of example embodiments here.
  • An illumination device 2350 such as a dual axis or eye MEMS mirror or an array or other method which could direct an NIR light source, producing a source of augmented illumination onto the subject area 2312.
  • Ambient light 2370 and NIR light 2354 are reflected off of the subject area 2312.
  • Reflected ambient light 2372 and reflected NIR 2355 pass through lens 2322.
• a combined optical filter 2324 may allow only visible light and a specific narrow range of IR to pass into optical housing 2320, blocking all other wavelengths of light from reaching image sensor 2318.
  • these objects are shown in exploded and/or exaggerated forms, whereas in practice they may take any number of shapes and configurations. Here, they are shown as sometimes separate and symbolic icons.
  • Figure 24 depicts such an example in a side view of a CMOS or CCD camera 2440.
• This figure depicts a lens 2442, a filter 2444, and an optional lens tube 2446 or optics housing. Any number of lenses and filters of different sorts may be used, depending on the configuration of the embodiment and the purpose of the image capture. Also, many kinds of image capture devices could be used to receive the reflected illumination and pass it to computing devices for analysis and/or manipulation.
• Still referring to FIG. 24, other embodiments of this device may have the order of the filter 2444 and the lens 2442 reversed. Still other embodiments of this device may have the lens 2442 and the filter 2444 combined, wherein the lens is coated and has the same filtering properties as a discrete filter element. This may be done to reduce cost and the number of parts, and could include any number of coatings and layers.
• Still referring to Figure 24, other embodiments may have the camera manufactured in such a way that the sensitivity of the device acts in a similar manner to that of a commercially available camera with a filter 2444.
  • the camera could be receptive to visible light and to only one specific range of IR/NIR, blocking out all of the other wavelengths of IR/NIR and non-visible light.
  • This example device could still require a lens 2442 for the collection of light.
• the filter may only block above 700nm, allowing the inherent loss of responsivity of the sensor below 400nm to act like a filter.
• the filter may block some or all of IR/NIR above 700nm, typically referred to as an IR blocking filter.
• This filter may include a notch, or narrow band, allowing a desired wavelength of IR to pass; in this example, 850nm, as shown by line 2508 in Figure 25.
  • Figure 25 depicts an example graph of the wavelength responsivity enabled by an example filter.
• the x axis shows wavelength in nanometers (nm) and the y axis shows percent sensitivity, 0 - 100%, as decimal values 0.0 to 1.0. Specific wavelengths are dependent upon the CMOS or CCD camera being utilized and the wavelength of the semiconductor light emitting devices.
  • the vertically shaded area 2502 represents the typical sensitivity of a CMOS or CCD video imaging device.
  • the "graduated rectangular bar” 2506 represents the portion of the spectrum that is "visible" to the human eye.
  • the “dashed” line 2508 represents the additional responsivity of the proposed filter.
  • the optical filters may be combined into one element 2444.
  • the example depicts an image sensor 2440, optical housing 2446, lens 2442, the combined filter 2444 blocking light below 400nm, between 700nm and 845nm, and 855nm and above.
• the example is illustrated assuming an NIR light source at 850nm; wavelengths between 800nm and 1000nm may be utilized depending upon the specific device requirements.
• the band pass range of +/- 5nm is for example only; the actual width of the band pass may be wider or narrower based on specific device requirements.
  • two optical filters are combined.
• the example depicts an image sensor 2640, optical housing 2646, lens 2642, a filter 2643 blocking below 400nm, and a narrow band filter 2644 blocking light between 700nm and 845nm, with transmittance between 845nm and 855nm, and blocking above 855nm.
• the example is illustrated assuming an NIR light source at 850nm; wavelengths between 800nm and 1000nm may be utilized depending upon the specific device requirements.
• the band pass range of +/- 5nm is for example only; the actual width of the band pass may be wider or narrower based on specific device requirements.
  • Figure 26B is a graphical depiction of example CMOS sensitivity to light.
  • the x axis shows wavelength in nanometers (nm) and the y axis shows percent sensitivity.
• This example shows from 300nm to 1100nm 2602 (vertically shaded); the spectrum of light visible to the human eye, 400nm - 700nm 2606 ("graduated rectangular bar"); transmittance of the filter from 0% to 100% across the spectrum 300nm to 1100nm (2608, dashed).
  • the range covered by element is depicted above the graph.
  • the narrow band filter 2644 blocking light between 700nm and 845nm, transmittance between 845nm and 855nm, blocking above 855nm is shown as arrow 2645.
• the filter 2643 blocking below 400nm is shown as arrow 2647.
  • three optical filters may be combined.
• the example depicts an image sensor 2740, optical housing 2746, lens 2742, a band filter 2743 blocking below 400nm, a narrow band filter 2780 between 700nm and 845nm, and a filter 2782 blocking above 855nm.
• the example is illustrated assuming an NIR light source at 850nm; wavelengths between 800nm and 1000nm may be utilized depending upon the specific device requirements.
• the band pass range of +/- 5nm is for example only; the actual width of the band pass may be wider or narrower based on specific device requirements.
  • Figure 27B is a graphical depiction of typical CMOS sensitivity to light.
• the x axis shows wavelength in nanometers (nm) and the y axis shows percent sensitivity. This example shows from 300nm to 1100nm (2702, shaded); the spectrum of light visible to the human eye, 400nm - 700nm (2706, black); transmittance of the filter from 0% to 100% across the spectrum 300nm to 1100nm (2708, dashed).
• The range covered by band filter 2743 blocking below 400nm is depicted as an arrow 2747, the range covered by the narrow band filter 2780 between 700nm and 845nm is depicted as an arrow 2781, and the range covered by filter 2782, blocking above 855nm, is shown as an arrow 2745.
  • the system can alternate between RGB and NIR images by either the utilization of computing systems and/or software to filter out RGB and NIR, or by turning off the NIR illumination for a desired period of time.
  • Polarization of a laser may also be utilized to alternate and differentiate objects.
  • the optical filter or combination of filters may be used to block all light except a selected range of NIR light, blocking light in the visible range completely.
  • Certain embodiments here may be used to determine distances, such as the distance from the example system to a target person, object, or specific area. This can be done as shown here in the example embodiments, using a single camera/image capture device and a scanning projection system for directing points of illumination. These distance measurement embodiments may be used in conjunction with many of the target illumination and image capture embodiments described in this disclosure. They could be used alone as well, or combined with other technologies.
  • the example embodiments here accomplish this by matching the projected points of illumination with a captured image at a pixel level.
• image recognition is performed over the target area in order to identify certain areas of interest to track, such as skeletal points on a human, or corners of a box, or any number of things.
  • a series of coordinates may then be assigned to each key identified point.
  • These coordinates may be sent to a computing system which may include microprocessing capabilities and which may in turn control a semiconductor light emitting device that may be coupled to a mechanism that scans the light across an area of interest.
  • the system may be configured to project light only on pixels that correspond to the specified area previously identified. Each pixel in the sequence may then be assigned a unique identifier. An image sensor could then collect the image within the field of view and assign a matching identifier to each projected pixel. The projected pixel's corresponding imaged pixel may be assigned horizontal and vertical angles or slope coordinates. With a known distance between the projection and image source, there is sufficient information to calculate distance to each point using triangulation calculations disclosed in examples here.
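• A minimal sketch, under assumed names and geometry, of the bookkeeping described above: each projected point receives a unique identifier and angular values, the matching imaged point receives the same identifier, and the pair is triangulated against the known separation between the projection and image source (the sign convention for the angles depends on the actual geometry):

import math

# Hypothetical record keeping for projected/imaged points (names are assumptions).
baseline_mm = 60.0   # known distance between projection source and image sensor

projected = {}   # identifier -> (h_angle_rad, v_angle_rad)
imaged = {}      # identifier -> (h_angle_rad, v_angle_rad)

def register_projected(point_id, h_angle, v_angle):
    projected[point_id] = (h_angle, v_angle)

def register_imaged(point_id, h_angle, v_angle):
    imaged[point_id] = (h_angle, v_angle)

def distance_for(point_id):
    """Triangulate range for a matched projected/imaged pair (horizontal case).

    The sum of tangents is one common convention; the sign depends on
    whether the outgoing angle is measured toward or away from the sensor."""
    alpha, _ = projected[point_id]   # outgoing angle from projector axis
    theta, _ = imaged[point_id]      # incoming angle from sensor axis
    return baseline_mm / (math.tan(theta) + math.tan(alpha))

register_projected("p0", math.radians(1.0), 0.0)
register_imaged("p0", math.radians(2.0), 0.0)
print(f"{distance_for('p0'):.1f} mm")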
  • the system may direct one or more points or pixels of light onto a target area such as a human subject or object.
• the example device may include a scanning device using a dual axis or two single axis MEMS, rotating polygon mirrors, or other method for directing light; a collimated light source such as a semiconductor or diode laser which can generate a single pixel; a CMOS, CCD or other imaging device which may incorporate a short band pass filter allowing visible and/or specific IR/NIR; a microprocessor(s) controlling the scanning device; and object and/or gesture recognition software and a microprocessor.
  • the human or the software may identify the specific points for distance measurement.
  • the coordinates of the points may be identified by the image sensor and the computing system and sent to the system which controls the light source and direction of projection.
  • the device may energize the light at a pixel (input) corresponding to the points to be measured (output).
  • the device may assign a unique identifier to each illuminated point along with its vertical and horizontal angular components.
  • the projected points and captured image may be synchronized. This may help reduce the probability that an area of interest has moved before a measurement can be taken.
  • the imaged spot location may be compared to projected locations. If the variance between the expected projected spots map and the imaged spots is within a set tolerance then the system may accept them as matching.
• the image sensor may produce one frame of information and transmit that to the software on the microprocessor.
  • a frame refers to one complete scan of the target area and is the incremental period of time that the image sensor collects one image of the field of view.
  • the software may be used to analyze the image information, identify projected pixels, assign and store information about the location of each point and match it to the illuminated point. Each image pixel may also be assigned angular values for horizontal and vertical orientation.
  • a trigonometric calculation can be used to help determine the depth from the device to each illuminated spot.
  • the resultant distances can either be augmented to the display for human interpretation or passed onto software for further processing.
  • FIG. 28 illustrates an overview of the triangulation distance example embodiments here. These embodiments are not exclusive of the image illumination and capture embodiments disclosed here, for example, they may be used alone, or to augment, complement, and/or aid the image illumination and capture to help gather information and/or data about the target area for the system.
  • the system is operating in a subject area 2810, here, a room.
  • the illumination device 2850 in this example controlled by a microprocessor 2826, is used to project a beam 2854 to illuminate a point on a target 2812.
  • the reflection of the beam 2855 may be captured by the image sensor 2820. Data from that capture may then be transmitted to the microprocessor 2826.
  • Other objects in the room may similarly be identified, such as the briefcase 2814. Data from such an example system may be used to calculate distances to illuminated objects, as will be discussed further below.
  • Figure 29 illustrates an example of how the initial image recognition may be accomplished, in order to later target specific areas for illumination.
  • a human 2912 may be identified.
  • the identification of the area of interest is indicated by rectangular segments 2913. These rectangular segments may be any kind of area identification, used for the system to later target more specific areas to illuminate.
  • the examples shown here are illustrative only.
  • Figure 29 also shows an example object 2914 which could also be identified by a larger area 2915. If computer instructions or software is not used to recognize objects or targets, human intervention could be used.
  • a touch screen or cursor could be used to outline or identify targets of interest - to inform the system of what to focus illumination on, shown here by a traced line around the object.
  • Figure 30 illustrates an example scenario of a target area as seen by the image capture device, and/or caused to be displayed on a visual monitor for human interaction.
  • Example gesture recognition software and software on the microprocessor could use the rectangular segments shown in Figure 29, to direct an illuminated point 3016 on specific areas of a target human 3012.
  • object 3014 could also receive a directed illuminated point 3018.
  • Figure 31 illustrates an example imaged scenario as might be seen on a computer screen or monitor where the system has caused the display to show the calculated distance measurement from the system to the illuminated points 3118 and 3116 which are located on the object 3114 and human targets 3112, respectively.
• In a display of the image, the distance calculations "1375" 3116 and "1405" 3118 show up on the screen. They could take any form or be in any unit of measurement; here they show up as 1375 and 1405 without showing the units of measurement, as an example.
  • Figure 32 illustrates a typical imaged scenario as might be seen on a computer screen or monitor showing multiple points illuminated for depth measurement.
  • the system with gesture recognition capabilities such as those from software could use the rectangular segments as depicted in Figure 29 to direct multiple illuminated points 3234 on a target human 3212.
  • a similar process may be used to direct multiple illuminated points 3236 onto an object 3214.
  • the system could be used to automatically select the human target 3212 and a human interface could be used to select the object 3214. This is only an example, and any combination of automatic and/or manually selected targets could be acquired and identified for illumination.
  • Figure 33 illustrates an example embodiment where the system causes display on a computer screen or monitor showing the superimposing of the distance from the illumination device to the multiple illuminated points 3334 in tabular form 3342.
  • the example multiple illuminated points are shown with labels of letters, which in turn are used to show the example distance measurements in the table 3342.
  • Figure 33 also depicts the manually selected object 3314 with multiple illuminated points 3336 superimposed on the image 3340 in this case showing "1400,” "1405,” “1420” and "1425" as distance calculations, without units depicted, as an example.
  • Figure 34 illustrates an example of an embodiment of the physical relationship among components of the illumination device 3450 and the image sensor 3420. The relationship among these components may be used in distance calculations of the reflected illumination off of a target, as disclosed here.
  • the illumination device 3450 may include a light source 3456 which can be any number of sources, such as a semiconductor laser, LED, diode laser, VCSEL or laser array, or a non-coherent collimated light source.
  • the light may pass through an optical component 3460 which may be used to direct the light onto the reflective system, in this example, a MEMS device 3458.
• the light may then be directed onto the area of interest; here the example beam is shown directed off the figure along axis 3480.
  • the image capture device / camera / sensor shows the central Z axis 3482 for the image sensor 3420.
  • the MEMS device 3458 also has a horizontal axis line 3484 and a vertical axis line 3486.
  • the image sensor 3420 may include components such as a lens 3442 and a CMOS or CCD image sensor 3440.
  • the image sensor 3440 has a central Z axis 3482 which may also be the path of illumination beam returning from reflection off the target to the center of the sensor 3440 in this example.
• the image sensor 3440 has a horizontal axis line 3484 and a vertical axis line 3488.
  • both the MEMS 3458 and the image sensor 3440 are offset both horizontally and vertically 3490 wherein z axis 3480 and 3482 are parallel, but the horizontal axis 3484 and the vertical axes 3488 and 3486 are offset by a vertical and/or horizontal value. In such examples, these offsets would have to be accounted for in the distance and triangulation calculations. As discussed throughout this document, the relationships and/or distance between the illumination source and the image capture z axis lines may be used in triangulation calculations.
  • the MEMS 3458 and the image sensor 3440 are aligned, wherein they share the horizontal axis 3484, and where their respective vertical axes 3488 and 3486 are parallel, and axial lines 3482 and 3480 are parallel.
  • Physical aspects of the components of the device may prevent the point of reflection of the directing device and the surface plane of the image sensor from being on the same plane, creating an offset such as discussed here.
  • the offset may be intentionally introduced into the device as a means of improving functionality.
  • the offset is a known factor and becomes an additional internal calibration to the distance algorithm.
  • Figure 35 illustrates an example of how data for triangulation calculations may be captured, which could be used in example embodiments to calculate distance to an illuminated object.
  • the result of using the data in trigonometric calculations may be used to determine the distance D, 3570 from device to point P, 3572.
  • Point P can be located any distance from the back wall of the subject area 3574 to the illumination device 3550.
  • Outgoing laser beam 3554 is directed in this example from the illumination device 3550 to a point P 3572 on a subject area 3574.
• the reflected laser beam 3555 reflects back and is captured by the image sensor 3520. In this example the image sensor 3520 and the illumination device 3550 are aligned as illustrated earlier in Figure 34.
• Distance h 3576 is known in this example, and the angle represented by θ, 3578 can be determined as further illustrated in this disclosure. In this illustration there is no angular component to outgoing laser beam 3554.
• the central Z axis for the illumination device, represented by line 3580, and that of the image sensor 3520, represented by line 3582, are parallel. Using the functions described above, the distance D 3570 can be determined.
  • the directed light is pointed parallel to the image sensor with an offset some distance "h” 3576 in the horizontal plane, and the subject area lies a distance "D" 3570 away.
• the illuminated point "P" 3572 appearing in the camera's field of view is offset from the center through an angle θ, 3578, as shown in Figure 35:
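• The equation itself is not reproduced in this extract; a reconstruction consistent with the quantities defined above (outgoing beam parallel to the sensor axis, offset h, image spot at angle θ) would be:

  D = h / tan(θ)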
  • Figure 36 illustrates an example calculation of distance where the angle the illumination source uses to illuminate the target is not directly down its own z axis.
  • the trigonometric calculation may be used to determine the distance D, 3670 from device to point P, 3672.
• Point P, 3672 can be located any distance from the back wall of the subject area 3674 to the illumination device 3650.
  • Outgoing laser beam 3654 is directed from the illumination device 3650 to a point P, 3672 on a subject area 3674.
  • the returning laser beam 3655 reflects back and is captured by the image sensor 3620.
  • the image sensor 3620 and the illumination device 3650 are aligned as further illustrated in Figure 34.
• Distance h, 3676 is known and the angle represented by θ, 3678 can be determined as described further herein.
• the angular component α, 3688 of the outgoing laser beam 3654 can be determined based upon the horizontal and vertical coordinate of that pixel as described above.
  • h' 3682 and x 3684 may be calculated.
  • the distance D 3670 can be determined.
• the distance D 3670 can be determined from the angle θ, 3678, the angle α, 3688, and the directed spot "offset distance" h, 3676:
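• The equation is again not reproduced in this extract; one reconstruction from the quantities defined above is the following, with the sign depending on whether the angular component α is measured toward or away from the sensor axis:

  D = h / (tan(θ) ± tan(α))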
  • Figures 37 A, B and C show an example where in addition to the offset X 3784 of the outgoing laser beam 3754 there is also a vertical offset Y, 3790.
• the distance D 3770 is determined exactly as before, in the equation above.
• once the distance D 3770 is known, and given the out-of-plane angle φ 3792 of the directed spot, the vertical position y of the image spot P 3772 can be determined through:
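• The relationship is not reproduced in this extract; a reconstruction consistent with the quantities above would be:

  y = D · tan(φ)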
  • Figures 38 A, B and C further illustrates Figures 36 and 37, where there is an X and Y offset between the illumination device 3850 and the image sensor 3820.
• the variable k' 3896 is also shown, as the offset distance between the illumination device 3850 z axis 3882 and the point P where the illumination pixel hits the object 3874.
• Figure 39 shows an example embodiment similar to Figures 37-38, but in this example there is an additional horizontal and vertical offset 3998 introduced, where the directed illumination device is offset from the image sensor 3920 in the X, Y, and Z axes.
  • Figure 40 illustrates the flow of information from identification of the point(s) to be measured through the calculation of the distance and display or passing of that information.
  • Column A shows what a screen may look like if the human interface is responsible for image recognition.
  • Column B shows a scenario where software is used to detect certain images. The center column describes what may happen at each section.
  • recognition occurs, 4002, where the camera or image sensor device is used to provide image data for analysis.
  • the system may assign to each area of interest, any number of things such as Pixel identification information, a unique identifier, a time stamp, and/or calculate or table angle, 4006.
  • the system and/or microprocessor may transmit a synchronizing signal to the image sensor, and pixel command to the illumination device 4008.
  • the system may then illuminate the subject area with a spot of illumination, 4010.
  • the image sensor may report the location of the pixels associated with the spot 4012.
  • the system and/or microprocessor may analyze the pixel values associated with imaged spot, match imaged pixel to illuminated spot and assign a location to pixel to calculate the angle value, 4014.
  • the microprocessor and/or system may calculate a value for depth, or distance from the system, 4016. Then the system may return a value for depth to the microprocessor for display, 4018. This is shown as a display of data on the example screen in 4018B. Then, the system may repeat the process 4020 as needed as the objects move over time.
  • Certain examples have the active FOV - Field Of View of the directed light and the capture FOV of the image sensor aligned for the calculations used in measuring distances. This calibration of the system may be accomplished using a software application.
  • input video data can be configured for streaming in a video format over a network.
  • Figure 41 shows an example image capture system embodiment.
  • the light in this example, reflected laser light
  • the image sensor example is made up of a number of cells, which, when energized by light, produce an electrical charge, which in turn may be mapped by the system in order to understand where that light source is located. The system can turn these charged cells into an image.
  • a returned reflected laser beam 4156, 4158, 4160, and 4162 returning from the area of interest along the center Z axis 4186 is identified by the CMOS or CCD image sensor 4140.
  • Each point or pixel of light that is directed onto an area of interest, or target, may be captured with a unique pixel location, based on where the reflected light hits the image sensor 4140.
  • Returning pixels 4156, 4158, 4160, 4162 represent examples of unique points with angular references different from 4186. That is, the reflected light beams are captured at different angles, relative to the z axis 4186.
  • Each cell or pixel therefore has a unique coordinate identification and a unique set of angular values in relationship to the horizontal axis 4184 and the vertical axis 4188.
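• As an illustration only (the resolution and field of view below are assumptions, not values from this disclosure), the assignment of angular values to pixel coordinates can be sketched with a simple pinhole model:

import math

def pixel_to_angles(col, row, width=1280, height=720,
                    hfov_deg=60.0, vfov_deg=40.0):
    """Map a pixel coordinate to horizontal/vertical angles (radians)
    relative to the sensor's central Z axis, using a pinhole camera model."""
    fx = (width / 2) / math.tan(math.radians(hfov_deg) / 2)
    fy = (height / 2) / math.tan(math.radians(vfov_deg) / 2)
    h_angle = math.atan((col - width / 2) / fx)
    v_angle = math.atan((row - height / 2) / fy)
    return h_angle, v_angle

# The centre pixel maps to (0, 0); corners map to roughly +/- half the FOV.
print(pixel_to_angles(640, 360))
print(tuple(math.degrees(a) for a in pixel_to_angles(1279, 719)))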
  • Figure 42 illustrates an example image capture device that is using error correction to estimate information about the target object from which the light reflected.
• the reflected light hits certain cells of the image capture sensor. But in certain examples, the light does not strike the center of one sensor cell. Sometimes, in examples, the light strikes more than one cell, or an intersection of more than one cell.
• the system may have to interpolate and estimate which of the cells receives most of the returned light, or use different calculations and/or algorithms in order to estimate angular values. In some examples, the system may estimate where returning pixels 4256, 4258, 4260, 4262, will be captured by the image sensor 4250.
• At pixel 4262, the light is centered on one pixel and/or cell and overflows partially onto eight adjacent pixels and/or cells.
  • Pixel 4260 depicts the situation where the light is centered evenly across four pixels and/or cells.
  • Pixels and/or cells 4256 and 4258 depict examples of the light having an uneven distribution across several pixels and/or cells of the image sensor 4250.
  • the probability that a projected spot will be captured on only one pixel of the image sensor is low.
  • An embedded algorithm will be used to determine the most likely pixel from which to assign the angular value.
  • the imaged spot is centered on one pixel and overlaps eight others. The charged value of the center pixel is highest and would be used.
  • the spot is equally distributed over 4 pixels.
• a fixed algorithm may be used, selecting the top left pixel or lower right, etc.
  • a more sophisticated algorithm may also be utilized where factors from prior frames or adjacent spots are incorporated into the equation as weighting factors.
• a third example may be where there is no one definite pixel. Charge weighting would be one method of selecting one pixel.
  • a fixed algorithm could also be utilized.
• a weighted average of the angular values could be calculated for the imaged spot, creating new unique angular values.
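• A minimal sketch of the charge-weighted option described above, with illustrative cell values; the weighting scheme, units and numbers are assumptions:

def weighted_spot_angles(cells):
    """cells: list of (h_angle, v_angle, charge) for the pixels a spot covers.

    Returns charge-weighted average angles, creating new unique angular
    values for the imaged spot (one of the options described above)."""
    total = sum(charge for _, _, charge in cells)
    h = sum(h_angle * charge for h_angle, _, charge in cells) / total
    v = sum(v_angle * charge for _, v_angle, charge in cells) / total
    return h, v

# A spot spread unevenly over three adjacent cells (angles in degrees here)
cells = [(1.00, 2.00, 30.0), (1.05, 2.00, 60.0), (1.05, 2.05, 10.0)]
print(weighted_spot_angles(cells))   # -> (~1.035, ~2.005)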
  • the image sensor may send data information to the system for analysis and computations regarding mapping, distance, etc., for example.
  • Different example embodiments may utilize different sources of light in order to help the system differentiate the emitted and reflected light.
  • the system may polarize one laser beam pulse, send it toward an example target, and then change the polarization for all of the other pulses.
  • the system may receive the reflected laser beam pulse, with the unique polarization, and be able to identify the location of the specific target, differentiated from all of the other returned beams.
  • Any combination of such examples could be used to identify and differentiate any number of specific targets in the target field. These could be targets that were identified by the system or by human intervention, through an object recognition step earlier in the process, for example.
  • the system may be used to measure biometrics including a person's heartbeat if they are in the target area. This may be done with the system described here via various measurement techniques.
  • One such example may be because the human face changes reflectivity to IR depending upon how much blood is under the skin, which may be correlated to heart beat.
• Another technique draws from Eulerian Video Magnification, a method that identifies a subject area in a video, magnifies that area, and compares frame to frame motion which may be imperceptible to a human observer. Utilizing these technologies, a system can infer a human heart beat from a distance of several meters. Some systems need to capture images at a high frame rate, which requires sufficient lighting. Oftentimes ambient lighting is not enough for acceptable image capture. One way to deal with this may include an embodiment here that uses directed illumination, according to the disclosures here, to illuminate a specific area of a subject, thus enhancing the ability of a system to function in non-optimal lighting conditions or at significant distances.
  • Technologies that utilize a video image for determining biometric information may require particular illumination such that the systems can capture an acceptable video image at frame rates fast enough to capture frame to frame changes.
  • Ambient lighting may not provide sufficient illumination, and augmented illumination may not be available or in certain circumstances it may not be desirable to provide high levels of visible light, such as a sleeping person, or where the subject is in crowded environment, or at a distance making conventional lighting alternatives unacceptable.
  • Certain embodiments here include using illumination which can incorporate directing IR/NIR.
  • Such embodiments may determine distance and calibrate projected patterns onto a desired object or human, which may help determine surface contours, depth maps and generating point clouds.
  • the system may direct illumination onto one or more areas of a human subject or object.
  • Such a system to direct illumination may be controlled by a human or by software designed to recognize specific areas which require enhanced illumination.
  • the system may work in conjunction with a CMOS, CCD or other imaging device, software which controls the projecting device, object and/or gesture recognition software or human interface, software which analyzes the video image and a microprocessor.
  • a human user, or the recognition software may analyze the image received from the image sensor, identify the subject or subjects of interest, assign one or more areas which require augmented or enhanced illumination.
  • the system may then direct illumination onto those specifically identified areas. If the system is integrated with motion track capabilities, the illumination can be changed with each frame to match the movement of the subject area.
• the imaging system may then capture the video image and transfer that to the analysis software. Changes to the position, size and intensity of the illumination can be made as the analysis software provides feedback to the software controlling the illumination. Analysis of the processed video images may be passed on to other programs and applications.
  • Embodiments of this technology may include the use of color enhancement software which allows the system to replace the levels of gray scale produced in a monochromatic IR image with color equivalents.
  • software which utilizes minute changes in skin color reflectivity may not be able to function with a monochromatic image file.
  • the system may then be able to interpret frame to frame changes.
  • Example embodiments may be used for collecting biometrics such as heart/pulse rate from humans and other living organisms. Examples of these can be a sleeping baby, patients in intensive care, elderly patients, and other applications where non-physical and non-light invasive monitoring is desired.
  • Example embodiments here could be used in many applications. For instance, example embodiments may be used for collecting information about non-human living organisms as well. For example, some animals cannot easily be contained for physical examination. This may be due to danger they may pose to humans, physical size, or the desire to monitor their activity without disturbing them. As another example, certain embodiments may be used for security systems. By isolating an individual in a crowd, a user could determine if that isolated target had an elevated heart rate, which could indicate an elevated level of anxiety. Some other example embodiments may be used for monitoring inanimate objects in non-optimal lighting conditions, such as production lines, and inventory management, for example.
  • Figure 43 illustrates an example embodiment where the biometric of a human target 4312 is desired from a distance of several meters. The distance could vary depending on the circumstances and level of accuracy desired, but this example is one of several meters.
  • recognition software could identify an area of interest, using object recognition methods and/or systems.
  • the coordinates of the target object may then be sent to the illumination device controlling the directed illumination 4320.
  • the example laser beam 4320 may then be sent to and reflected 4322 to be captured by an image sensor (not pictured), and transmitted to the system for analysis.
  • the illumination can be adjusted to optimally illuminate a specific area as depicted in the figure detail 4324 showing an example close up of the target and reflection off of a desired portion of the target person 4312.
  • This example beam could be motion tracked to follow the target, adjusted, or redirected depending on the circumstances. This may allow for the system to continue to track and monitor an identified subject area even if the object is in motion, and continue to gather biometric information and/or update the information.
  • Certain example embodiments here include the ability to create sequential triangulated depth maps.
  • Such depth maps may provide three-dimensional representation of surfaces of an area based on relative distance from an area to an image sensor.
  • the term is related to and may be analogous to depth buffer, Z-buffer, Z-buffering and Z-depth, for example.
  • Certain examples of these provide the Z or distance aspect as a relative value as each point relates to another.
  • Such example technologies may incorporate a method of using sequentially triangulated points.
  • a system that utilizes triangulation may generate accurate absolute distances from the device to the surface area. Furthermore, when the triangulated points are placed and captured sequentially, an accurate depth map of an area may be generated.
  • certain embodiments here may direct light onto specific target area(s), and more specifically to an interactive projected illumination system which may enable identification of an illuminated point and calculation of the distance from the device to that point by using trigonometric calculations referred to as triangulation.
  • a system may direct illumination onto a target area using projected points of light at specific intervals along a horizontal axis then steps down a given distance and repeats, until the entire area is scanned.
  • Each pixel may be unique and identified and matched to an imaged pixel captured by an image sensor.
  • the uniqueness of each pixel may be from a number of identifiers.
  • each projected pixel may have a unique outbound angle and each returning pixel also has a unique angle.
• the angles, combined with a known distance between the point of directed illumination and the image sensor, may enable the system to calculate, using triangulation, the distance to each point.
• the imaged pixel, with an assigned Z, depth or distance component, can be further processed to produce a depth map and, with additional processing, a point cloud.
  • Figure 44A illustrates an example embodiment generating one row of points 4414 with a human subject 4412 also in the room.
  • each point illuminated has unique and known angular value from its projection.
  • each point in this example has a unique sequential value based on time and location. These points can be timed and spaced so as to prevent overlap or confusion by the system.
  • Figure 44B illustrates example reflected pixels 4424. These reflected points are captured by an image sensor.
• each imaged pixel also has unique identifiers such as angular values and time, as in Figure 44A.
  • the unique identification of projected pixels and captured pixels may allow the system to match a projected point with an imaged point.
  • distance can be calculated from the device to the surfaces in the field of view.
  • This depth or distance information, "Z” can be associated with a corresponding imaged pixel to create a depth map of the scanned target area or objects. Further processing of the depth map can produce a point cloud.
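• As a sketch only, one simple convention (an assumption, not the disclosed method) for turning imaged pixels with assigned angular values and a Z/depth component into points of a point cloud:

import math

def depth_map_to_point_cloud(depth_and_angles):
    """depth_and_angles: iterable of (h_angle_rad, v_angle_rad, depth).

    Converts each imaged pixel with an assigned Z/depth into an (x, y, z)
    point, one simple convention for building a point cloud from a depth map."""
    cloud = []
    for h_angle, v_angle, depth in depth_and_angles:
        x = depth * math.tan(h_angle)
        y = depth * math.tan(v_angle)
        cloud.append((x, y, depth))
    return cloud

samples = [(math.radians(-5), math.radians(2), 1400.0),
           (math.radians(0), math.radians(0), 1375.0)]
print(depth_map_to_point_cloud(samples))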
  • Such example depth maps or point clouds may be utilized by other software systems to create three dimensional or "3D" representations of a viewed area, object and human recognition, including facial recognition and skeletal recognition.
  • the example embodiments may capture data in order to infer object motion. This may even include human gesture recognition.
• Certain example embodiments may produce the illumination scans in various ways, for example, a vertical scan which increments horizontally. Additionally, certain embodiments may use projected points that are sequential but not equally spaced in time.
• Some embodiments may incorporate a random or asymmetric aspect to the pattern of points illuminated. This could enable the system to change points frame to frame and through software fill in the gaps between imaged pixels to provide a more complete depth map.
  • some example embodiments either manually or as a function of the software, selectively pick one or more areas within a viewed area to limit the creation of a depth map. By reducing the area mapped, the system may run faster having less data to process. The system may also be dynamically proportioned such that it may provide minimal mapping of the background or areas of non or lesser interest and increase the resolution in those areas of greater interest, thus creating a segmented or hybrid depth map.
  • Certain example embodiments could be used to direct the projection of images at targets.
• Such an example could use directed illumination incorporating IR/NIR wavelengths of light to improve the ability of object and gesture recognition systems to function in adverse lighting conditions.
• Augmented reality refers to systems that allow the human user to experience computer generated enhancements to real environments. This could be accomplished with either a monitor or display, or through some form of projected image. In the case of a projected image, a system could work in low light environments to keep the projected image from being washed out by ambient light sources.
  • recognition systems can be given improved abilities to identify objects and motion without creating undesirable interference with projected images.
  • object recognition, object tracking and distance measuring are described elsewhere herein and could be used in these example embodiments to find and track targets.
• Multiple targets could be identified by the system, according to the embodiments disclosed herein. By identifying more than one target, the system could project different or the same image on more than one target object, including motion tracking them. Thus, more than one human could find unique projections on them during a video game, or projected backgrounds could illuminate walls or objects in the room as well, for example.
  • the targets could be illuminated with a device that projects various images.
  • This projector could be integrated with the tracking and distance systems or a separate device. Either way, in some embodiments, the two systems could be calibrated to correct for differences in projected throw angles.
  • Any different kind of projection could be sent to a particularly identified object and/or human target.
  • the projected image could be monochrome or multicolored. In such a way, the system could be used with video games to project images around a target area. It could also have uses in medicine, entertainment, automotive, maintenance, education and security, just as examples.
  • Figure 45 illustrates an example embodiment showing an interactive game scenario.
• the directed illumination has enabled recognition software to identify where a human 4512 is located in the field of view, identified by the system according to any of the example ways described herein.
  • the software may also define the basic size and shape of the subject for certain projections to be located.
• the example system may then adjust the image accordingly and project it onto the subject, in this example an image of a spider 4524.
  • Certain example embodiments here include the ability to recognize areas or objects onto which projection of IR/NIR or other illumination is not desired, and block projection to those areas.
• An example includes recognizing a human user's eyes or face, and keeping the IR/NIR projection away from the eyes or face for safety reasons.
  • Certain example embodiments disclosed here include using directed illumination incorporating IR/NIR wavelengths of light for object and gesture recognition systems to function in adverse lighting conditions. Any system which utilizes light in the infrared spectrum when interacting with humans or other living creatures has the added risk of eye safety. Devices which utilize IR/NIR in proximity to humans can incorporate multiple ways of safeguarding eyes.
  • light is projected in the IR/NIR wavelength onto specifically identified areas, thus providing improved illumination in adverse lighting conditions for object or gesture recognition systems.
  • the illuminated area may then be captured by a CMOS or CCD image sensor.
  • the example embodiment may identify human eyes and provide the coordinates of those eyes to the system which in turn blocks the directed illumination from beaming light directly at the eyes.
  • Figures 46A and 46B illustrate examples of how the system may be able to block IR/NIR projection to a human subject's eyes.
  • the image is captured with a CMOS or CCD image sensor and the image is sent to a microprocessor where one aspect of the software identifies the presence of human eyes in the field of view.
  • the example embodiment may then send the coordinates of the eyes to the embodiment which controls the directed illumination.
  • the embodiment may then create a section of blocked or blank illumination, as directed. As the directed illumination is scanned across a blanked area the light source is turned off. This prevents IR/NIR light from beaming directly into the eyes of a human.
  • Figure 46A is an example of a human subject 4612 with projected illumination 4624 incorporating eye blocking 4626.
  • Figure 46B is an example of a close up of human subject 4612 with a projected illumination incorporating 4624 eye blocking 4626.
• Sensitive equipment that directed IR/NIR could damage may be located in the target area. Cameras may be present, and flooding their sensors with IR illumination may wash the camera out or damage the sensors. Any motivation to block the IR/NIR could drive the embodiment to block out or restrict the amount of IR/NIR or other illumination to a particular area. Additionally, the system could be configured to infer eye location by identifying other aspects of the body. An example of this may be to recognize and identify the arms or the torso of a human target, calculate a probable relative position of the head, and reduce or block the amount of directed illumination accordingly.
  • Certain example embodiments here include the ability to adjust the size of the output window and the relative beam divergence as it relates to the overall eye safe operation of the device.
  • a divergent scanned beam has the added effect of increasing the illuminated spot on the retina, which reduces the harmful effect of IR/NIR over the same period of time.
  • Figure 47A and 47B illustrate the impact of output window size to the human eye 4722.
  • Safe levels of IR are determined by intensity over area over time. The lower the intensity is for a given period of time, the safer the MPE or maximum permissible exposure is.
  • the output window 4724 is relatively the same height as the pupil 4722, in this example an output window 7mm tall by 16mm wide and the average dilated pupil is 7mm, approximately 34.4% of the light exiting the output window can enter the eye. If the output window is doubled in size 4726 to 14mm tall and 32mm wide, the maximum light that could enter the pupil drops to 8.6% as illustrated in Figures 47C and 47D, for example.
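• The 34.4% and 8.6% figures above follow from the ratio of the dilated pupil's area to the output window's area; a quick arithmetic check:

import math

def pupil_fraction(window_w_mm, window_h_mm, pupil_d_mm=7.0):
    """Fraction of the output window area covered by a dilated pupil,
    i.e. the maximum share of the exiting light that could enter the eye."""
    pupil_area = math.pi * (pupil_d_mm / 2) ** 2
    return pupil_area / (window_w_mm * window_h_mm)

print(f"{pupil_fraction(16, 7):.1%}")    # ~34.4% for a 16mm x 7mm window
print(f"{pupil_fraction(32, 14):.1%}")   # ~8.6% when the window is doubled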
  • FIG 47A is a detailed illustration of 47B showing the relationship of elements of the device to a human eye at close proximity.
  • Light from a semiconductor laser 4762 or other light source passes through optical element 4766 and is directed onto a 2D MEMs 4768 or other device designed to direct or steer a beam of light.
  • the angular position of the MEMs reflects each pixel of a raster scanned image with a unique angle which creates an effective throw angle of each scan or frame of illumination.
  • the scanned range of light exits the device through an output window 4726 which dimensionally matches the image size of the scanned area.
  • the human eye 4712 is assumed to be located as close as possible to the exit window.
  • a portion of the light from the exit window can enter the pupil 4722 and is focused on the back or retina of the eye 4728.
• the angular nature of each sequential pixel causes the focused area to be larger than that of a collimated beam. This has the same effect as if the beam had passed through a divergent lens.
  • FIG 47C is a detailed illustration of 47D showing the relationship of elements of the device to a human eye at some distance.
  • Light from a semiconductor laser 4762 or other light source passes through optical element 4766 and is directed onto a 2D MEMs 4768 or other device designed to direct or steer a beam of light.
  • the angular position of the MEMs reflects each pixel of a raster scanned image with a unique angle which creates an effective throw angle of each scan or frame of illumination.
  • the scanned range of light exits the device through an output window 4726 which dimensionally matches the image size of the scanned area.
  • the human eye 4712 is assumed to be located as close as possible to the exit window.
  • a portion of the light from the exit window can enter the pupil 4722 and is focused on the back or retina of the eye 4730.
• the angular nature of each sequential pixel causes the focused area to be larger than that of a collimated beam. This has the same effect as if the beam had passed through a divergent lens.
• the greater the throw angle of the device, the more even small changes in the distance from the MEMs to the output window will have a positive effect on reducing the total amount of light which can enter the eye.
  • An embodiment of this technology incorporates the ability for the device to dynamically adjust the effective size of the output window.
• the system can effectively adjust the output window to optimize the use of directed illumination while maximizing eye safety.
  • Certain embodiments here also may incorporate adding the distance from the device to the human and calibrating the intensity of the directed illumination in accordance with the distance. In this embodiment even if the eyes are not detectable, a safe level of IR/NIR can be utilized.
• Certain example embodiments here may include color variation of the projected illumination. This may be useful because systems using directed illumination may incorporate IR/NIR wavelengths of light. These are outside of the spectrum of light visible to humans. When this light is captured by a CMOS or CCD imaging sensor, it may generate a monochromatic image, normally depicted in black and white or gray scale. Humans and image processing systems may rely on color variation to distinguish edges, objects, shapes and motion. In situations where IR/NIR directed illumination works in conjunction with a system that requires color information, specific colors can be artificially assigned to each level of grey for display. Furthermore, by artificially applying the color values, differentiation between subtle variations in gray can be emphasized, thus improving the image for humans.
  • directing illumination in the IR/ NIR wavelength onto specifically identified areas may provide augmented illumination, as disclosed in here.
  • Such an example illumination may then be captured by a CMOS or CCD image sensor.
• the system may then apply color values to each shade of gray and either pass that information on to other software for further processing or display the image on a monitor for a human observer.
  • Projected color is additive, adding light to make different colors, intensity, etc.
• 8-bit color provides 256 levels for each projection device, such as a laser or LED, etc.
  • the range is 0-255 since 0 is a value.
• 24-bit color (8 bits x 3 channels) results in 16.8 million colors.
  • the system processing the IR/NIR signals may return black, white and shades of gray in order to interpret the signals.
  • Many IR cameras produce 8 bit gray scale. And it may be very difficult for a human to discern the difference between gray 153 and gray 154.
  • Factors include the quality and calibration of the monitor, the ambient lighting, the observer's biological sensitivity, number of rods versus cones in the eye, etc. The same problem exists for gesture and object recognition software - it has to interpret grey scale into something meaningful.
  • Embodiments here include the ability to add back color values to the grey scales.
  • the system may set grey 153 to be red 255 and 154 to be green 255, or any other settings, this being only one example.
• color levels may be assigned to each grey scale value. For example, everything below 80 gets 0,0,0, or black, everything above 130 gets 255,255,255, or white, and the middle range is expanded.
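• A minimal sketch of one such mapping, following the thresholds in the example above (below 80 becomes black, above 130 becomes white, the middle range expanded); the particular color ramp is an illustrative assumption:

def gray_to_rgb(gray, low=80, high=130):
    """Map an 8-bit gray value to an RGB triple.

    Values below `low` become black, values above `high` become white, and
    the middle range is expanded across a simple red-to-green ramp so that
    adjacent gray levels (e.g. 153 vs 154) become easier to distinguish."""
    if gray < low:
        return (0, 0, 0)
    if gray > high:
        return (255, 255, 255)
    t = (gray - low) / (high - low)          # 0.0 .. 1.0 across the middle range
    return (int(255 * (1 - t)), int(255 * t), 0)

for g in (60, 80, 105, 130, 200):
    print(g, gray_to_rgb(g))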
  • Figure 48A illustrates a nine level gray scale with arbitrarily assigned R-red G-green B-blue values using an 8 bit RGB additive index color scale. Because the assignment of color to gray is artificial the scale and assignments can be in formats that are best matched to the post enhancement systems. Any variation of assigned colors may be used, the example shown in Figure 48A is illustrative only.
  • Figure 48B illustrates an example image captured inclusive of a subject 4812 which has been color enhanced according to the assignments of color from Figure 48A.
  • the colors, Red, Green and Blue show up in the amounts indicated in Figure 48A, according to the level of grey scale assigned by the example system.
  • the monochrome system assigned a pixel a grey scale of 5
  • the system here would assign 0 Red, 0 Blue and 200 Green to that pixel, making it a certain shade of green on the display of 48B.
  • a grey scale assignment of 1 would assign 150 Red, 0 Green and 0 Blue, assigning a certain shade of red to the pixels with that grey scale value.
  • the grey scale shading becomes different scales of colors instead of a monochrome scale.
• some example embodiments could apply color enhancement to select areas only, once a target is identified and illuminated. Some embodiments may enable a nonlinear allocation of color. In such an embodiment, thresholds can be assigned to the levels. An example of this could be to take all low levels and assign them the same color, or black, thus accentuating a narrower range of gray.
  • Certain example embodiments could include identification of a particular target by a human user/observer of the displayed image to be enhanced. This could be accomplished with a mouse, touch screen or other gesture recognition which would allow the observer to indicate an area of interest.
  • Certain embodiments here also include the ability to utilize propagation of a light- based square wave, and more specifically an interactive raster scanning system/method for directing a square wave.
  • directed illumination and ToF - Time-Of-Flight imaging may be used to map and determine distance of target objects and areas.
  • Square waves are sometimes used by short range TOF or time-of-flight depth mapping technologies.
• an array of LEDs may be turned on and off at a certain rate, to create a square wave.
  • the LEDs may switch polarity to create waves of light with square waves of polarity shifted. In some embodiments, when these waves bounce off or reflect off objects, the length of the wave may change. This may allow Current Assisted Photon Demodulating (CAPD) image sensors to create a depth map.
• CAPD - Current Assisted Photon Demodulating.
  • projected light from LEDs may not be suitable for generating square waves without using current modulation to switch the polarity of the LEDs, thus resulting in optical switching.
• a single Continuous Wave (CW) laser may be pulsed at high rates, for example 1.1 nanoseconds, with the timing adjusted such that a sweeping laser may create a uniform wave front.
  • Some example embodiments here include using a directed single laser beam which is configured to produce a raster scan based on a 2D MEMs or similar optical steering device.
  • a continuous wave laser such as a semiconductor laser which can be either amplitude modulated or pulse width modulated, or both, is used as the source for generating the square wave.
  • a raster scan can form an interlaced, de- interlaced, or progressive pattern.
• when the laser is reflected off of a beam steering mechanism capable of generating a raster scan, an area of interest can be fully illuminated during one complete scan or frame.
  • Some raster scans are configured to have horizontal lines made up of a given number of pixels and a given number of horizontal lines.
• at each pixel the laser can be turned on.
  • the on time as well as the optical power or amplitude of each pixel may be controlled by the system, generating one or more pulses of a square wave.
• when the pulses for each sequential pixel are timed such that they are in phase with the desired wave format, they may generate a wave front that will appear to the imaging system as if generated as a single wave front.
  • further control over the placement of the square wave may be accomplished where a human/user or a system may analyze the reflected image received from the image sensor, and help identify the subject or subjects of interest. The system may then control the directed illumination to only illuminate a desired area. This can reduce the amount of processing required by the imaging system, as well as allow for a higher level of intensity, which also improves the system performance.
  • Figure 49A is an example representative graph which shows four cycles of an example square wave.
• Dotted line 4922 shows a sample wave generated by a gain shifted LED.
• Dashed line 4924 represents an example pulse which is generated by an example semiconductor laser. These example lasers may have switching times that are beneficial to such a system and allow for particular square wave propagation, as shown, with little or nearly no noise on the wave propagation.
  • Solid line 4926 illustrates how the example pulses may be kept in phase if the constraints of the system prevent sequential pulses.
  • Figure 49B illustrates an example target area including a target human figure 4912 in an example room where a propagated square wave generated by system for directed illumination 4916 is used.
  • an example embodiment may use an optical switching mechanism to switch a laser on and off, producing clean pulses to reflect off of a target.
• when the pulses are in phase, they may form uniform wave fronts 4918.
  • the returning, reflected waves (not pictured) can then be captured and analyzed for demodulation of the square waves.
  • certain embodiments include using gain switching to change the polarity of the laser, creating on and off pulses at various intervals.
• A factor that these methodologies, and others not described here, have in common is the need to optimize the pattern projected onto a subject.
• the frequency of the pattern, or number of times it repeats, the number of lines, and other aspects of the pattern affect the system's ability to accurately derive information.
  • Alternating patterns in some examples are necessary to produce the interference or fringe patterns required for the methodology's algorithm.
  • the orientation of the patterns projected onto a subject and the general orientation of the subject influence various characteristics related to optimal data extraction.
  • the ability to dynamically adjust the projected patterns on a subject may improve accuracy (the deviation between calculated and actual dimensions) as well as resolution (the number of final data points), and increase information gathering and processing speeds.
  • Certain embodiments here include the ability to direct light onto specific target area(s), determining distance to one or more points and calibrating a projected pattern accordingly. This may be done with directed illumination and single or multipoint distance calculation used in conjunction with projected patterns including structured light, phase shift, or other methods of using projected light patterns to determine surface contours, depth maps, or to generate point clouds.
  • a projected pattern from a single source will diverge the further it is from the origin; this divergence is characterized by the throw angle.
  • the projected pattern will increase in size, because of the divergence.
  • the subject will occupy a smaller portion of the imaged area as a result of the FOV or viewing angle of the camera.
  • the combined effect of the projected throw angle and the captured FOV may increase the distortion of the projected image.
  • a calibrated projection system may be helpful to map an area and objects in an area where objects may be at different locations relative to the camera.
  • a system that incorporates directed illumination with the ability to determine distance from a projector to one or more subject areas is used to statically or dynamically adjust projected patterns, as disclosed above. Further, some example embodiments may be able to segment a viewed area and adjust patterns accordingly for multiple areas simultaneously. Such example embodiments may analyze each segment independently and combine the results to create independent depth maps or combine independent depth maps into one. And such example embodiments may be used to determine if a flat wall or background is present and eliminate the background, either by not projecting upon it or by removing it in post processing.
[00351] An embodiment of this system incorporates a system for detecting when either a projected or captured frame is corrupted, or torn. Corruption of a projected or captured image file may result from a number of errors introduced into the system.
  • the system can recognize that either a corrupt image has been projected or that a corrupted image has been captured. The system then may identify the frame such that later processes can discard the frame, repair the frame or determine if the frame is useable.
  • Some embodiments here may determine depth, 3D contours and/or distance and incorporate dynamically calibrating the patterns for optimization. Such examples may be used to determine distance and calibrate projected patterns onto a desired object or human, which may help determine surface contours, depth maps and generating point clouds.
  • one or more points or pixels of light may be directed onto a human subject or an object. Such direction may be via a separate device, or an integrated one combined with a projector, able to direct projected patterns which can be calibrated by the system.
  • the patterns may be projected with a visible wavelength of light or a wavelength in IR/NIR.
  • the projector system may work in conjunction with a CMOS, CCD or other imaging device; software which controls the projecting device; object and/or gesture recognition software or a human interface; and a microprocessor as disclosed herein.
  • a human/user or the recognition software analyzes the image received from the image sensor, identifies the subject or subjects of interest, and assigns one or more points for distance calculation.
  • the system may calculate the distance to each projected point.
  • the distance information may be passed onto the software which controls the projected pattern.
  • the system may then combine the distance information with information about the relative location and shape of the chosen subject areas.
  • the system may then determine which pattern, pattern size and orientation to use depending on the circumstances.
  • the projector may then illuminate the subject areas with the chosen pattern.
  • the patterns may be captured by the image sensor and analyzed by software which outputs information in the form of a 3D representation of the subject, a depth map, point cloud or other data about the subject, for example.
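The following sketch illustrates one way the distance information could feed the pattern-calibration step described above: it scales the stripe width, in projector pixels, so that a stripe covers roughly the same physical width on subjects at different distances. The throw angle, projector resolution, and function names are illustrative assumptions, not details taken from this disclosure.

```python
import math

def calibrated_stripe_width_px(distance_m: float,
                               desired_stripe_m: float,
                               throw_angle_deg: float = 40.0,
                               projector_cols: int = 1280) -> int:
    """Return the stripe width in projector pixels needed for a stripe to span
    roughly desired_stripe_m on a subject at distance_m. Farther subjects need
    fewer projector pixels per stripe because each pixel covers more area."""
    footprint_m = 2.0 * distance_m * math.tan(math.radians(throw_angle_deg) / 2.0)
    metres_per_pixel = footprint_m / projector_cols
    return max(1, round(desired_stripe_m / metres_per_pixel))

# Keep roughly 2 cm stripes on subjects at 1.5 m and at 3.0 m.
near_px = calibrated_stripe_width_px(1.5, 0.02)
far_px = calibrated_stripe_width_px(3.0, 0.02)   # about half the pixels per stripe
```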
  • Figure 50A illustrates an example embodiment using non-calibrated phase shift patterns projected onto human subjects 5012, 5013, and 5014.
  • the effect of the throw angle as indicated by reference lines 5024 is illustrated as bands 5016, 5018, and 5020.
  • Figure 50B illustrates an example embodiment, similar to 50A but where the pattern has been calibrated.
  • the phase shift pattern is projected onto human subjects 5032, 5033, and 5034.
  • the example system may determine the distance from the subjects of interest to the image sensor and generate a uniquely calibrated pattern for each subject.
  • patterns 5036, 5038, and 5040 are calibrated such that they will produce the same number and line characteristics on each subject. This may be useful for the system to use in other calculations as described herein.
  • the system can segment the area and project uniquely calibrated patterns onto each subject. In such a way, segmented depth maps can be compared and added together to create a complete depth map of an area. And in such an example, the distance calculating ability of the system can also be used to determine the existence of a wall and other non-critical areas. The example system may use this information to eliminate these areas from analysis.
  • Figure 50C illustrates an example embodiment showing an ability to determine the general orientation of an object, in this example a vertical object 5064 and a horizontal object 5066.
  • phase shifting is optimized when the patterns run perpendicular to the general orientation of the subject.
  • the example system may identify the general orientation of a subject area and adjust the X, Y orientation of the pattern.
  • the patterns projected in Figures 50A, B and C are exemplary only. Any number of patterns may be used in the ways described here.
  • Figure 51 shows a table depicting some examples of projected patterns that can be used with dynamic calibration. These examples discussed below are not meant to be exclusive of other options but exemplary only. Further, the examples below only describe the patterns for reference purposes and are not intended as explanations of the process nor the means by which data is extracted from the patterns.
  • sequential binary coding is comprised of alternating black (off) and white (on) stripes generating a sequence of projected patterns, such that each point on the surface of the subject is represented by a unique binary code.
  • N patterns can code 2^N stripes; in the example of a 5-bit pattern, the result is 32 stripes.
  • the example pattern series is 2 stripes (1 black, 1 white), then 4, 8, 16 and 32.
  • a depth map of the subject can be derived.
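A minimal sketch of the sequential binary coding just described is given below: five patterns code 32 stripes, with each column of the projection carrying a unique 5-bit code. Array sizes and names are illustrative assumptions.

```python
import numpy as np

def binary_stripe_patterns(n_bits: int = 5, width_px: int = 640) -> np.ndarray:
    """Return n_bits binary stripe patterns (one row per projected frame).
    Together the frames give each column a unique n_bits-bit code, so
    N patterns code 2**N stripes (5 patterns -> 32 stripes)."""
    columns = np.arange(width_px)
    stripe_index = columns * (2 ** n_bits) // width_px    # which of the 2**N stripes
    patterns = np.zeros((n_bits, width_px), dtype=np.uint8)
    for bit in range(n_bits):
        # Most-significant bit first: frame 0 has 2 stripes, the last frame 32.
        patterns[bit] = (stripe_index >> (n_bits - 1 - bit)) & 1
    return patterns

frames = binary_stripe_patterns()   # the 2, 4, 8, 16, 32 stripe sequence
```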
  • Directed illumination as described here controls the illumination of an area at a pixel level.
  • the system has the ability to control amplitude of each pixel from zero, or off, to a maximum level.
  • An image captured by a sensor can be analyzed by the system or a human observer to determine an area or areas of interest.
  • the system determines the boundaries of the subject to be illuminated with a pattern and, with the known distance, applies a calibration algorithm to the projected pattern; by controlling which projected pixels are turned on during one frame, it optimizes the pattern illumination.
  • there is a series of 5 separate patterns projected by controlling the projected pixels; the software can change the projected pattern each frame or set of frames as required.
  • Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration.
  • the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest.
  • the system has the ability to analyze each segment independently.
  • the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image.
  • the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
  • sequential gray code, 5112, is similar to sequential binary code referenced in 5110, with the use of intensity modulated stripes instead of binary on/off patterns. This increases the level of information that can be derived with the same or fewer patterns.
  • where L represents the number of intensity levels and N the number of patterns in a sequence, L^N unique points can be coded per line; in this example L is 4 and N is 3, resulting in 4^3, or 64, unique points in one line.
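The sketch below generates an intensity-coded sequence consistent with the numbers above (4 intensity levels, 3 patterns, 64 unique columns). For simplicity it uses a plain base-L digit code rather than a true reflected Gray code, so it is a sketch of the idea rather than of the exact pattern family.

```python
import numpy as np

def intensity_coded_patterns(levels: int = 4, n_patterns: int = 3) -> np.ndarray:
    """Return n_patterns stripe patterns whose per-column intensity
    combinations give each of levels**n_patterns columns a unique code
    (4**3 = 64 in the example above). Intensities are normalized to [0, 1]."""
    n_cols = levels ** n_patterns
    codes = np.arange(n_cols)
    patterns = np.zeros((n_patterns, n_cols))
    for p in range(n_patterns):
        digit = (codes // levels ** (n_patterns - 1 - p)) % levels   # base-L digit
        patterns[p] = digit / (levels - 1)                           # map to [0, 1]
    return patterns

frames = intensity_coded_patterns()   # 3 frames, 64 uniquely coded columns
```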
  • Directed illumination as described here controls the illumination of an area at a pixel level.
  • the system has the ability to control amplitude of each pixel from zero, or off, to a maximum level.
  • An image captured by a sensor can be analyzed by the system or a human observer to determine an area or areas of interest.
  • the system determines the boundaries of the subject to be illuminated with a pattern and, with the known distance, applies a calibration algorithm to the projected pattern; by controlling which projected pixels are turned on during one frame, it optimizes the pattern illumination.
  • there is a series of 3 separate patterns projected by controlling the projected pixels; the software can change the projected pattern each frame or set of frames as required.
  • Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration.
  • the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest.
  • the system has the ability to analyze each segment independently.
  • the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image.
  • the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
  • phase shifting 5114 utilizes the projection of sequential sinusoidal patterns onto a subject area.
  • a series of three sinusoidal fringe patterns, represented as I_N, is projected onto the area of interest.
  • the intensities for each pixel (x, y) of the three patterns may be described as:
      I_1(x, y) = I_0(x, y) + I_mod(x, y) cos(φ(x, y) − θ)
      I_2(x, y) = I_0(x, y) + I_mod(x, y) cos(φ(x, y))
      I_3(x, y) = I_0(x, y) + I_mod(x, y) cos(φ(x, y) + θ)
  • where I_1(x, y), I_2(x, y), and I_3(x, y) are the intensities of the three patterns, I_0(x, y) is the background component, I_mod(x, y) is the modulation signal amplitude, φ(x, y) is the phase, and θ is the constant phase-shift angle.
  • Phase unwrapping is the process that converts the wrapped phase to the absolute phase.
  • the phase information that can be retrieved and unwrapped is derived from the intensities in the three fringe patterns.
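As a numerical sketch of the phase retrieval step, the function below recovers the wrapped phase from three captured fringe images under the common assumption that the constant phase-shift angle θ is 2π/3; the background and modulation terms cancel out of the ratio. This is a standard three-step formula, not code taken from this disclosure.

```python
import numpy as np

def wrapped_phase(i1: np.ndarray, i2: np.ndarray, i3: np.ndarray) -> np.ndarray:
    """Recover the wrapped phase phi(x, y) from three fringe images captured
    with phase shifts of -theta, 0 and +theta, assuming theta = 2*pi/3.
    I_0(x, y) and I_mod(x, y) cancel, leaving only the phase."""
    return np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)
```

The wrapped result still needs phase unwrapping, as noted above, before it can be converted to absolute depth.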
  • Directed illumination as described here controls the illumination of an area at a pixel level.
  • the system has the ability to control amplitude of each pixel from zero, or off, to a maximum level.
  • An image captured by a sensor can be analyzed by the system or a human observer to determine an area or areas of interest.
  • the system determines the boundaries of the subject to be illuminated with a pattern and, with the known distance, applies a calibration algorithm to the projected pattern; by controlling which projected pixels are turned on during one frame, it optimizes the pattern illumination.
  • there is a series of 3 separate patterns projected by controlling the projected pixels; the software can change the projected pattern each frame or set of frames as required.
  • Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration.
  • the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest.
  • the system has the ability to analyze each segment independently.
  • the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image.
  • the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
  • Trapezoidal 5116. This method is similar to that described in 5114 phase shifting, but replaces a sinusoidal pattern with trapezoidal-shaped gray levels. Interpretation of the data into a depth map is similar, but can be more computationally efficient.
  • Directed illumination as described here controls the illumination of an area at a pixel level.
  • the system has the ability to control amplitude of each pixel from zero, or off, to a maximum level.
  • An image captured by a sensor can be analyzed by the system or a human observer to determine an area or areas of interest.
  • the system determines the boundaries of the subject to be illuminated with a pattern and, with the known distance, applies a calibration algorithm to the projected pattern; by controlling which projected pixels are turned on during one frame, it optimizes the pattern illumination.
  • there is a series of 3 separate patterns projected by controlling the projected pixels; the software can change the projected pattern each frame or set of frames as required.
  • Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration.
  • the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest.
  • the system has the ability to analyze each segment independently.
  • the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image.
  • the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
  • One embodiment example of this is a hybrid method, 5118, in which gray coding as described in 5112 and phase shifting as described in 5114 are combined to form a precise series of patterns with reduced ambiguity.
  • the gray code pattern determines the non-ambiguous range of phase while phase shifting provides increased sub-pixel resolution.
  • 4 patterns of a gray code are combined with 4 patterns of phase shifting to create an 8 frame sequence.
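A minimal sketch of how the two pattern families could be combined is shown below: the gray-code frames yield a coarse, non-ambiguous band index, and the phase-shift frames yield a wrapped phase within that band. The band-index decoding itself is assumed to have been done already, and offset conventions are simplified.

```python
import numpy as np

def absolute_phase(wrapped: np.ndarray, band_index: np.ndarray) -> np.ndarray:
    """Combine the wrapped phase from the phase-shift frames with the
    non-ambiguous band index decoded from the gray-code frames, removing the
    2*pi ambiguity while keeping the sub-pixel resolution of phase shifting."""
    return wrapped + 2.0 * np.pi * band_index
```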
  • Directed illumination as described here controls the illumination of an area at a pixel level.
  • the system has the ability to control amplitude of each pixel from zero, or off, to a maximum level.
  • An image captured by a sensor can be analyzed by the system or a human observer to determine an area or areas of interest.
  • the system determines the boundaries of the subject to be illuminated with a pattern and, with the known distance, applies a calibration algorithm to the projected pattern; by controlling which projected pixels are turned on during one frame, it optimizes the pattern illumination.
  • there is a series of 8 separate patterns projected by controlling the projected pixels; the software can change the projected pattern each frame or set of frames as required.
  • Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration.
  • the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest.
  • the system has the ability to analyze each segment independently.
  • the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image.
  • the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
  • One embodiment example of this utilizes a Moiré pattern, 5120, which is based on the geometric interference between two patterns. The overlap of the patterns forms a series of dark and light fringes. These patterns can be interpreted to derive depth information.
  • Directed illumination as described here controls the illumination of an area at a pixel level.
  • the system has the ability to control amplitude of each pixel from zero, or off, to a maximum level.
  • An image captured by a sensor can be analyzed by the system or a human observer to determine an area or areas of interest.
  • the system determines the boundaries of the subject to be illuminated with a pattern and, with the known distance, applies a calibration algorithm to the projected pattern; by controlling which projected pixels are turned on during one frame, it optimizes the pattern illumination.
  • there is a series of at least 2 separate patterns projected by controlling the projected pixels; the software can change the projected pattern each frame or set of frames as required.
  • Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration.
  • the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest.
  • the system has the ability to analyze each segment independently.
  • the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image.
  • the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
  • Multi-wavelength also referred to as Rainbow 3D, 5122
  • Rainbow 3D is based upon spatially varying wavelengths projected onto the subject.
  • based on the fixed distance D between the directed illumination source and the image sensor, and the angle θ between the image sensor and a particular wavelength of light, unique points can be identified on a subject and, utilizing methods of triangulation, distances to each point can be calculated.
  • This system can utilize light in the visible spectrum or wavelengths in the IR/NIR spaced far enough apart that they can be subsequently separated by the system.
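The triangulation step can be sketched as below: given the baseline D between the directed illumination source and the image sensor and the two angles measured from that baseline, the perpendicular distance to the identified point follows from the law of sines. The numeric values are illustrative assumptions.

```python
import math

def triangulate_depth(baseline_m: float,
                      source_angle_deg: float,
                      camera_angle_deg: float) -> float:
    """Perpendicular distance from the baseline to a point seen by both the
    directed illumination source and the image sensor. The angles are measured
    from the baseline to the outgoing ray and to the returning ray."""
    a = math.radians(source_angle_deg)
    b = math.radians(camera_angle_deg)
    # Law of sines: depth = D * sin(a) * sin(b) / sin(a + b)
    return baseline_m * math.sin(a) * math.sin(b) / math.sin(a + b)

z = triangulate_depth(0.10, 80.0, 75.0)   # 10 cm baseline -> roughly 0.23 m
```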
  • Directed illumination as described here controls the illumination of an area at a pixel level.
  • the system has the ability to control amplitude of each pixel from zero, or off, to a maximum level.
  • An image captured by a sensor can be analyzed by the system or a human observer to determine an area or areas of interest.
  • the system determines the boundaries of the subject to be illuminated with a pattern and, with the known distance, applies a calibration algorithm to the projected pattern; by controlling which projected pixels are turned on during one frame, it optimizes the pattern illumination.
  • Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration.
  • the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest.
  • the system has the ability to analyze each segment independently.
  • the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image.
  • the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
  • One embodiment example of this is a continuously varying code, 5124, which can be formed utilizing three additive wavelengths, oftentimes primary color channels of RGB or unique wavelengths of IR/NIR, such that when added together they form a continuously varying pattern.
  • the interpretation of the captured image is similar to that as described in 5122.
  • Directed illumination as described here controls the illumination of an area at a pixel level.
  • the system has the ability to control amplitude of each pixel from zero, or off, to a maximum level.
  • An image captured by a sensor can be analyzed by the system or a human observer to determine an area or areas of interest.
  • the system determines the boundaries of the subject to be illuminated with a pattern and, with the known distance, applies a calibration algorithm to the projected pattern; by controlling which projected pixels are turned on during one frame, it optimizes the pattern illumination.
  • Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration.
  • the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest.
  • the system has the ability to analyze each segment independently.
  • the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image.
  • the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
  • striped indexing utilizes multiple wavelengths selected far enough apart to prevent cross talk noise from the imaging sensor.
  • the wavelengths may be in the visible spectrum, generated by the combination of primary additive color sources such as RGB, or a range of IR/NIR. Stripes may be replaced with patterns to enhance the resolution of the image capture. The interpretation of the captured image is similar to that described in 5122.
[00395] Systems that utilize this methodology require changing the projected pattern for each frame in the sequence.
  • Projection systems face additional challenges in projecting the pattern evenly across a subject: when a subject moves along the Z axis (its distance from the device), the throw angle of the projection alters the number of lines falling on the subject, and when a subject moves laterally, only a portion of the pattern, or a disproportionately spaced pattern, may be projected onto the subject.
  • Directed illumination as described here controls the illumination of an area at a pixel level.
  • the system has the ability to control amplitude of each pixel from zero, or off, to a maximum level.
  • An image captured by a sensor can be analyzed by the system or a human observer to determine an area or areas of interest.
  • the system determines the boundaries of the subject to be illuminated with a pattern and, with the known distance, applies a calibration algorithm to the projected pattern; by controlling which projected pixels are turned on during one frame, it optimizes the pattern illumination.
  • Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration.
  • the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest.
  • the system has the ability to analyze each segment independently.
  • the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image.
  • the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
  • One embodiment example of this is the use of segmented stripes, 5128, where, to provide additional information about a pattern, a code is introduced within a stripe. This creates a unique pattern for each line and, when known by the system, can allow one stripe to be easily identified from another.
  • Directed illumination as described here controls the illumination of an area at a pixel level.
  • the system has the ability to control amplitude of each pixel from zero, or off, to a maximum level.
  • An image captured by a sensor can be analyzed by the system or a human observer to determine an area or areas of interest.
  • the system determines the boundaries of the subject to be illuminated with a pattern and, with the known distance, applies a calibration algorithm to the projected pattern; by controlling which projected pixels are turned on during one frame, it optimizes the pattern illumination.
  • Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration.
  • the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest.
  • the system has the ability to analyze each segment independently.
  • the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image.
  • the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
  • gray scale 5130
  • amplitude modulation provides for control of the intensity
  • stripes can be given gray scale values.
  • a 3 level sequence can be black, gray, and white.
  • the gray stripes can be created by setting the level of each projected pixel at some value between 0 and the maximum.
  • the gray can be generated by a pattern of on/off pixels producing an average illumination of a stripe equivalent to the level of gray, or by reducing the on time of the pixel such that, during one frame of exposure of an imaging device, the on time is a fraction of the full exposure.
  • the charge level of the imaged pixels is then proportionally less than that of full on and greater than off.
  • An example of a pattern sequence is depicted below where B represents black, W represents white, and G represents gray. The pattern is depicted such that the sequence does not necessarily repeat, as long as no two identical values appear next to each other.
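Since the original sequence figure is not reproduced here, the sketch below simply generates an arbitrary valid sequence of the kind described: three levels (B, G, W) with no two identical values adjacent. The seed and function name are illustrative assumptions.

```python
import random

def gray_scale_stripe_sequence(n_stripes: int, seed: int = 0) -> str:
    """Build a stripe sequence over B (black), G (gray) and W (white) in which
    no two adjacent stripes share the same value, so neighbouring stripes can
    always be distinguished from one another."""
    rng = random.Random(seed)
    levels = ["B", "G", "W"]
    sequence = [rng.choice(levels)]
    for _ in range(n_stripes - 1):
        sequence.append(rng.choice([v for v in levels if v != sequence[-1]]))
    return "".join(sequence)

print(gray_scale_stripe_sequence(20))   # e.g. a 20-stripe B/G/W sequence
```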
  • Directed illumination as described here controls the illumination of an area at a pixel level.
  • the system has the ability to control amplitude of each pixel from zero, or off, to a maximum level.
  • An image captured by a sensor can be analyzed by the system or a human observer to determine an area or areas of interest.
  • the system determines the boundaries of the subject to be illuminated with a pattern and, with the known distance, applies a calibration algorithm to the projected pattern; by controlling which projected pixels are turned on during one frame, it optimizes the pattern illumination.
  • Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration.
  • the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest.
  • the system has the ability to analyze each segment independently.
  • the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image.
  • the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
  • De Bruijn sequence 5132 refers to a cyclic sequence of patterns where no pattern of elements repeats during the cycle in either an upward or downward progression through the cycle.
  • the decoding of a De Bruijn sequence requires less computational work than other similar patterns.
  • the variation in the pattern may be color/wavelength, width or combination of width and color/wavelength.
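The standard recursive construction of a De Bruijn sequence is sketched below; for k symbols (for example stripe colors/wavelengths or widths) and window length n, every run of n consecutive stripes in the cycle is unique, which is what allows one stripe to be identified from its neighbours. The digit alphabet is an illustrative stand-in for the actual stripe attributes.

```python
def de_bruijn(k: int, n: int) -> str:
    """Return a De Bruijn sequence B(k, n): a cyclic string over k symbols in
    which every possible length-n subsequence appears exactly once."""
    alphabet = "0123456789"[:k]
    a = [0] * (k * n)
    seq = []

    def db(t: int, p: int) -> None:
        if t > n:
            if n % p == 0:
                seq.extend(a[1 : p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return "".join(alphabet[i] for i in seq)

print(de_bruijn(3, 3))   # 27 symbols; every window of 3 stripes is unique
```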
  • Projection systems face additional challenges in projecting the pattern evenly across a subject: when a subject moves along the Z axis (its distance from the device), the throw angle of the projection alters the number of lines falling on the subject, and when a subject moves laterally, only a portion of the pattern, or a disproportionately spaced pattern, may be projected onto the subject.
  • Directed illumination as described here controls the illumination of an area at a pixel level.
  • the system has the ability to control amplitude of each pixel from zero, or off, to a maximum level.
  • An image captured by a sensor can be analyzed by the system or a human observer to determine an area or areas of interest.
  • the system determines the boundaries of the subject to be illuminated with a pattern and, with the known distance, applies a calibration algorithm to the projected pattern; by controlling which projected pixels are turned on during one frame, it optimizes the pattern illumination.
  • Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration.
  • the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest.
  • the system has the ability to analyze each segment independently.
  • the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image.
  • the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
  • pseudo-random binary 5134
  • Pseudorandom binary arrays utilize a mathematical algorithm to generate a pseudo-random pattern of points which can be projected onto each segment.
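A very small sketch of generating such a pattern for one segment is below. Using a seeded generator keeps the projected pattern known to the system so captured points can be matched back to it; note that this simple sketch does not enforce the window-uniqueness property of a formal pseudorandom binary array.

```python
import numpy as np

def pseudo_random_binary_pattern(rows: int, cols: int, seed: int = 42) -> np.ndarray:
    """Reproducible pseudo-random binary dot pattern for one projection
    segment; 1 means the projected pixel is on, 0 means off."""
    rng = np.random.default_rng(seed)
    return (rng.random((rows, cols)) < 0.5).astype(np.uint8)

segment_pattern = pseudo_random_binary_pattern(64, 64)   # one 64 x 64 segment
```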
  • Directed illumination as described here controls the illumination of an area at a pixel level.
  • the system has the ability to control amplitude of each pixel from zero, or off, to a maximum level.
  • An image captured by a sensor can be analyzed by the system or a human observer to determine an area or areas of interest.
  • the system determines the boundaries of the subject to be illuminated with a pattern and, with the known distance, applies a calibration algorithm to the projected pattern; by controlling which projected pixels are turned on during one frame, it optimizes the pattern illumination.
  • Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration.
  • the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest.
  • the system has the ability to analyze each segment independently.
  • the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image.
  • the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
  • One embodiment example of this is similar to the methodology described in 5134, where the binary points can be replaced by a point made up of multiple values generating a mini-pattern or code word, 5136. Each projected mini-pattern or code word creates a unique point identifier in each grid segment.
  • Directed illumination as described here controls the illumination of an area at a pixel level.
  • the system has the ability to control amplitude of each pixel from zero, or off, to a maximum level.
  • An image captured by a sensor can be analyzed by the system or a human observer to determine an area or areas of interest.
  • the system determines the boundaries of the subject to be illuminated with a pattern and, with the known distance, applies a calibration algorithm to the projected pattern; by controlling which projected pixels are turned on during one frame, it optimizes the pattern illumination.
  • Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration.
  • the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest.
  • the system has the ability to analyze each segment independently.
  • the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image.
  • the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
  • One embodiment example of this is a color/wavelength coded grid, 5138. In some instances it may be beneficial to have grid lines with alternating colors/wavelengths.
  • Directed illumination as described here controls the illumination of an area at a pixel level.
  • the system has the ability to control amplitude of each pixel from zero, or off, to a maximum level.
  • An image captured by a sensor can be analyzed by the system or a human observer to determine an area or areas of interest.
  • the system determines the boundaries of the subject to be illuminated with a pattern and, with the known distance, applies a calibration algorithm to the projected pattern; by controlling which projected pixels are turned on during one frame, it optimizes the pattern illumination.
  • Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration.
  • the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest.
  • the system has the ability to analyze each segment independently.
  • the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image.
  • the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
  • One embodiment example of this is a color/wavelength dot array, 5140 where unique wavelengths are assigned to points within each segment.
  • visible colors of R red, G green, and B blue are used. These could also be unique wavelengths of IR/NIR spaced far enough apart such as to minimize the cross talk that might occur on the image sensor.
  • Directed illumination as described here controls the illumination of an area at a pixel level.
  • the system has the ability to control amplitude of each pixel from zero, or off, to a maximum level.
  • An image captured by a sensor can be analyzed by the system or a human observer to determine an area or areas of interest.
  • the system determines the boundaries of the subject to be illuminated with a pattern and, with the known distance, applies a calibration algorithm to the projected pattern; by controlling which projected pixels are turned on during one frame, it optimizes the pattern illumination.
  • Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration.
  • the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest.
  • the system has the ability to analyze each segment independently.
  • the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image.
  • the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
  • One embodiment of this is the ability of the system to combine multiple methods into hybrid methods, 5142.
  • the system determines areas of interest and segments the area. The system can then determine which method or combination/hybrid of methods is best suited for the given subject. Distance information can be used to calibrate the pattern for the object.
  • the result is a segmented projected pattern where a specific pattern or hybrid pattern is calibrated to optimize data about each subject area. Factors influencing the patterns selected may include, but are not limited to, whether the subject is living, inanimate, moving, or stationary, its relative distance from the device, general lighting, and environmental conditions.
  • the system processes each segment as a unique depth map or point cloud. The system can further recombine the segmented pieces to form a more complete map of the viewed area.
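A schematic sketch of the per-segment selection and stitching described above is given below. The factor names, the specific selection rules, and the placeholder depth maps are all illustrative assumptions rather than logic taken from this disclosure.

```python
def choose_method(segment: dict) -> str:
    # Illustrative rules only; a real system could weigh many more factors
    # (lighting, environment, relative distance, and so on).
    if segment.get("moving"):
        return "single-shot pseudo-random"      # one frame, tolerant of motion
    if segment.get("living"):
        return "gray code + phase shift"        # hybrid, sub-pixel detail
    return "phase shift"                        # static, inanimate subjects

def build_scene_map(segments: list) -> dict:
    """Assign a method per segment, then stitch the per-segment results
    (placeholders here) back into one structure keyed by segment id."""
    scene = {}
    for seg in segments:
        # In a real system the calibrated pattern would be projected and the
        # captured frames processed into a depth map or point cloud here.
        scene[seg["id"]] = {"method": choose_method(seg), "depth_map": None}
    return scene

print(build_scene_map([{"id": 1, "living": True, "moving": False},
                       {"id": 2, "living": False, "moving": True}]))
```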
  • Directed illumination as described here controls the illumination of an area at a pixel level.
  • the system has the ability to control amplitude of each pixel from zero, or off, to a maximum level.
  • An image captured by a sensor can be analyzed by the system or a human observer to determine an area or areas of interest.
  • the system determines the boundaries of the subject to be illuminated with a pattern and, with the known distance, applies a calibration algorithm to the projected pattern; by controlling which projected pixels are turned on during one frame, it optimizes the pattern illumination.
  • the software can change the projected pattern each frame or set of frames as required.
  • Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration.
  • the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest.
  • the system has the ability to analyze each segment independently.
  • the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image.
  • the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
  • Some embodiments include features for directing light onto specific target area(s), and image capture, when used in a closed or open loop system. Such an example embodiment may include use of a shared optical aperture for both the directed illumination and image sensor to help achieve matched throw angles and FOV angles.
  • Certain embodiments may include a device for directing illumination and an image sensor that share the same aperture and for some portion of the optical path have comingled light paths.
  • the path may split, thus allowing the incoming light to be directed to an image sensor.
  • the outgoing light path may exit through the same aperture as the incoming light.
  • Such an example embodiment may provide an optical system where the throw angle of the directed illumination and the FOV angle of the incoming light are matched. This may create a physically calibrated incoming and outgoing optical path. This may also create a system which requires only one optical opening in a device.
  • Figure 52A illustrates an adjacent configuration example where the outgoing and incoming light paths share the same aperture but are not comingled paths.
  • light from a semiconductor laser or other light emitting device 5212 is directed by an optical element (not pictured) to a 2D MEMs (not pictured) or other mechanism for directing the beam.
  • the outgoing light 5220 is then reflected off of a prism 5218 through the shared aperture (not pictured).
  • Incoming light 5228 is reflected off of the same prism 5218 through a lens (not pictured) and onto an image sensor 5226.
  • the prism 5218 can be replaced by two mirrors that occupy the same relative surface (not pictured).
  • This example configuration assumes that some degree of misalignment between the image and the directed illumination may be corrected for by other means such as system calibration algorithms.
  • Figure 52B illustrates an example embodiment of one example configuration where the outgoing and incoming light paths share the same aperture and for some portion the optical path is comingled.
  • light from a semiconductor laser or other light emitting device 5234 is directed by an optical element (not pictured) to a 2D MEMs (not pictured), for example, but any other mechanism could be used to direct the beam.
  • the outgoing light 5240 passes through a polarized element 5238 and continues through the shared aperture (not pictured).
  • Incoming light 5242 enters the shared aperture and is reflected off of the polarized element 5238 onto an image sensor 5246. This provides a simple configuration to achieve coincident apertures.
  • Figure 52C illustrates an example embodiment of an example configuration where outgoing and incoming light paths share the same common objective lens and for some portion the optical path is comingled.
  • light from a semiconductor laser or other light emitting device 5252 is directed by an optical element (not pictured) to a 2D MEMs (not pictured) or other mechanism for directing the beam.
  • the outgoing light 5272 passes through lens 5258 to a scan format lens 5260 which creates a focused spot that maps the directed illumination to the same dimensions as the image sensor active area.
  • the outgoing light then passes through optical element 5262, through a polarized element 5264 and exits through common objective lens 5266.
  • incoming light enters through the common objective lens 5266 and is reflected off of the polarized element 5264 and onto the image sensor 5270.
  • Certain example embodiments may allow for a secondary source of illumination such as a visible light projector to be incorporated into the optical path of the directed illumination device. And certain example embodiments may allow for a secondary image sensor, enabling as an example for one image sensor designed for visible light and one designed for IR/NIR to share the same optical path.
  • features consistent with the present inventions may be implemented via computer-hardware, software and/or firmware.
  • the systems and methods disclosed herein may be embodied in various forms including, for example, a data processor, such as a computer that also includes a database, digital electronic circuitry, firmware, software, computer networks, servers, or in combinations of them.
  • While the disclosed implementations describe specific hardware components, systems and methods consistent with the innovations herein may be implemented with any combination of hardware, software and/or firmware.
  • the above-noted features and other aspects and principles of the innovations herein may be implemented in various environments.
  • Such environments and related applications may be specially constructed for performing the various routines, processes and/or operations according to the invention or they may include a general-purpose computer or computing platform selectively activated or reconfigured by code to provide the necessary functionality.
  • the processes disclosed herein are not inherently related to any particular computer, network, architecture, environment, or other apparatus, and may be implemented by a suitable combination of hardware, software, and/or firmware.
  • various general-purpose machines may be used with programs written in accordance with teachings of the invention, or it may be more convenient to construct a specialized apparatus or system to perform the required methods and techniques.
  • aspects of the method and system described herein, such as the logic may be implemented as functionality programmed into any of a variety of circuitry, including programmable logic devices (“PLDs”), such as field programmable gate arrays (“FPGAs”), programmable array logic (“PAL”) devices, electrically programmable logic and memory devices and standard cell-based devices, as well as application specific integrated circuits.
  • Some other possibilities for implementing aspects include: memory devices, microcontrollers with memory (such as EEPROM), embedded microprocessors, firmware, software, etc.
  • aspects may be embodied in microprocessors having software-based circuit emulation, discrete logic (sequential and combinatorial), custom devices, fuzzy (neural) logic, quantum devices, and hybrids of any of the above device types.
  • the underlying device technologies may be provided in a variety of component types, e.g., metal-oxide semiconductor field-effect transistor (MOSFET) technologies such as complementary metal-oxide semiconductor (CMOS), bipolar technologies such as emitter-coupled logic (ECL), polymer technologies (e.g., silicon-conjugated polymer and metal-conjugated polymer-metal structures), mixed analog and digital, and so on.
  • transfers of such formatted data and/or instructions by carrier waves include, but are not limited to, transfers (uploads, downloads, e-mail, etc.) over the Internet and/or other computer networks via one or more data transfer protocols (e.g., HTTP, FTP, SMTP, and so on).

Abstract

Methods and systems described here may be used for target illumination and mapping. Certain embodiments include a light source and an image sensor, where the light source configured to, communicate with a processor, scan a target area within a field of view, receive direction from the processor regarding projecting light within the field of view on at least one target, the image sensor configured to, communicate with the processor, receive reflected illumination from the target area within the field of view, generate data regarding the received reflected illumination; and send the data regarding the received reflected illumination to the processor.

Description

INTERACTIVE ILLUMINATION FOR GESTURE AND/OR OBJECT
RECOGNITION
TECHNICAL FIELD
[0001] The embodiments here relate to an illumination system for illumination of a target area for image capture in order to allow for three dimensional object recognition and target mapping.
BACKGROUND
[0002] Current object recognition illumination and measuring systems do not provide energy efficient illumination. Thus there is a need for an improved, cost efficient illumination device for illumination of a target object such as a human.
SUMMARY
[0003] The disclosure includes methods and systems including a system for target illumination and mapping, comprising, a light source and an image sensor, the light source configured to, communicate with a processor, scan a target area within a field of view, receive direction from the processor regarding projecting light within the field of view on at least one target, the image sensor configured to, communicate with the processor, receive reflected illumination from the target area within the field of view, generate data regarding the received reflected illumination, and send the data regarding the received reflected illumination to the processor.
[0004] Such systems where the light source is an array of light emitting diodes (LEDs). Such systems where the light source is a laser, where the laser is at least one of, amplitude modulated and pulse width modulated.
[0005] Such systems where the laser is an infrared laser and the image sensor is configured to receive and process infrared energy. Such systems where the direction received from the processor includes direction to track the at least one target. Such systems where the data regarding the received reflected illumination includes information that would allow the processor to determine the distance from the system to the select target via triangulation.
[0006] Such systems where the light source is further configured to receive direction from the processor to illuminate the tracked target in motion. Such systems where the light source is further configured to block illumination of particular areas on the at least one select target via direction from the processor.
[0007] Such systems where the target is a human, and where the particular areas on the at least one select target are areas which correspond to eyes of the target. Such systems where the scan of the target area is a raster scan. Such systems where the raster scan is completed within one frame of the image sensor.
[0008] Such systems where the light source includes at least one of, a single axis micro electromechanical system mirror (MEMS) and a dual axis MEMS, to direct the light. Such systems where the light source includes at least one of, a rotating mirror. Such systems where the tracking the selected target includes more than one selected target.
[0009] Such systems where the image sensor is further configured to generate gray shade image data based on the received infrared illumination, and assign visible colors to gray shades of the image data. Such systems where the image sensor is a complementary metal oxide semiconductor (CMOS). Such systems where the image sensor is a charge coupled device (CCD). Such systems where the light source and the image sensor include optical filters. Such systems where the light source is a laser.
[0010] Another example system includes a system for illuminating a target area, including, a directionally controlled laser light source, and an image sensor, the directionally controlled laser light source configured to, communicate with a processor, scan the target area, receive direction on illuminating specific selected targets within the target area from the processor, where the laser is at least one of, amplitude modulated and pulse width modulated, and the image sensor configured to communicate with the processor, receive the laser light reflected off of the target area, generate data regarding the received reflected laser light, and send the data regarding the received laser light to the processor.
[0011] Such systems where the laser light source is further configured to receive direction from the processor to illuminate at least two target objects with different illumination patterns. Such systems where the data regarding the received reflected laser light is configured to allow the processor to calculate a depth map. Such systems where the image sensor is a complementary metal oxide semiconductor (CMOS).
[0012] Such systems where the image sensor is a charge coupled device (CCD). Such systems where the light source and the image sensor include optical filters. Such systems where the data regarding the received reflected laser light is configured to allow the processor to calculate a point cloud. Such systems where the directional control is via at least one of a single axis micro electromechanical system mirror (MEMS) and a dual axis MEMS.
[0013] Such systems where the directional control is via at least one rotating mirror. Such systems where the laser is a continuous wave laser, and the laser light source is further configured to receive direction to send a pulse of energy to a unique part of the target area, creating pixels for the image sensor.
[0014] Another example method includes a method for target illumination and mapping, including, via a light source, communicating with a processor, scanning a target area within a field of view, receiving direction from the processor regarding projecting light within the field of view on at least one target, via an image sensor, communicating with the processor, receiving reflected illumination from the target area within the field of view, generating data regarding the received reflected illumination, and sending the data regarding the received reflected
illumination to the processor.
[0015] Such methods where the light source is an array of light emitting diodes (LEDs). Such methods where the light source is a laser, where the laser is at least one of, amplitude modulated and pulse width modulated. Such methods where the laser is an infrared laser and the image sensor is configured to receive and process infrared energy.
[0016] Such methods where the direction received from the processor includes direction to track the at least one target. Such methods where the data regarding the received reflected illumination includes information that would allow the processor to determine the distance from the system to the select target via triangulation. Such methods further comprising, via the light source, receiving direction from the processor to illuminate the tracked target in motion.
[0017] Such methods further comprising, via the light source, blocking illumination of particular areas on the at least one select target via direction from the processor. Such methods where the target is a human, and where the particular areas on the at least one select target are areas which correspond to eyes of the target. Such methods where the scan of the target area is a raster scan. Such methods where the raster scan is completed within one frame of the image sensor. Such methods where the light source includes at least one of, a single axis micro electromechanical system mirror (MEMS) and a dual axis MEMS, to direct the light. Such methods where the light source includes at least one of, a rotating mirror. [0018] Such methods where the tracking the selected target includes more than one selected target. Such methods further comprising, via the image sensor, generating gray shade image data based on the received infrared illumination, and assigning visible colors to gray shades of the image data. Such methods where the image sensor is a complementary metal oxide
semiconductor (CMOS). Such methods where the image sensor is a charge coupled device (CCD). Such methods where the light source and the image sensor include optical filters. Such methods where the light source is a laser.
[0019] Another example method includes a method for illuminating a target area, comprising, via a directionally controlled laser light source, communicating with a processor, scanning the target area, receiving direction on illuminating specific selected targets within the target area from the processor, where the laser is at least one of, amplitude modulated and pulse width modulated, and via an image sensor, communicating with the processor, receiving the laser light reflected off of the target area, generating data regarding the received reflected laser light, and sending the data regarding the received laser light to the processor. Such methods further comprising, via the laser light source, receiving direction from the processor to illuminate at least two target objects with different illumination patterns.
[0020] Such methods where the data regarding the received reflected laser light is configured to allow the processor to calculate a depth map. Such methods where the image sensor is a complementary metal oxide semiconductor (CMOS). Such methods where the image sensor is a charge coupled device (CCD). Such methods where the light source and the image sensor include optical filters.
[0021] Such methods where the data regarding the received reflected laser light is configured to allow the processor to calculate a point cloud. Such methods where the directional control is via at least one of a single axis micro electromechanical system mirror (MEMS) and a dual axis MEMS. Such methods where the directional control is via at least one rotating mirror. Such methods further comprising, via the laser light source, receiving direction to send a pulse of energy to a unique part of the target area, creating pixels for the image sensor. Such methods where the laser is a continuous wave laser.
[0022] Another example system includes a system for target area illumination, comprising, a directional illumination source and image sensor, the directional illumination source configured to, communicate with a processor, receive direction to illuminate the target area from the processor, and project illumination on the target area, where the laser is at least one of, amplitude modulated and pulse width modulated, and the image sensor configured to, communicate with the processor, capture reflected illumination off of the target area, generate data regarding the captured reflected illumination, and send the data regarding the capture reflected illumination to the processor, where the illumination source and the image sensor share an aperture and which a throw angle of the directed illumination and a field of view angle of the reflected captured illumination are matched.
[0023] Such systems where the laser is an infrared laser and the image sensor is configured to receive and process infrared energy. Such systems where the laser includes at least one of a single axis micro electromechanical system mirror (MEMS) and a dual axis MEMS to direct the light. Such systems where the data regarding the captured reflected illumination includes information regarding triangulation for distance measurements. Such systems where the illumination source is further configured to receive instruction regarding motion tracking of the select target. Such systems where the shared aperture is at least one of adjacent, common and objective.
[0024] Another example method includes a method for target area illumination, comprising, via a directional illumination source, communicating with a processor, receiving direction to illuminate the target area from the processor, and projecting illumination on the target area, where the laser is at least one of, amplitude modulated and pulse width modulated, and via an image sensor, communicating with the processor, capturing reflected illumination off of the target area, generating data regarding the captured reflected illumination, and sending the data regarding the captured reflected illumination to the processor, where the illumination source and the image sensor share an aperture and in which a throw angle of the directed illumination and a field of view angle of the reflected captured illumination are matched. Such methods where the laser is an infrared laser and the image sensor is configured to receive and process infrared energy, and where the shared aperture is at least one of adjacent, common and objective.
[0025] Such methods where the laser includes at least one of a single axis micro
electromechanical system mirror (MEMS) and a dual axis MEMS to direct the light. Such methods where the data regarding the captured reflected illumination includes information regarding triangulation for distance measurements. [0026] Another example system includes a system for illuminating a target area, comprising, a light source and an image sensor, the light source configured to, communicate with a processor, illuminate a target area with at least one pattern of light, within a field of view, receive direction to illuminate at least one select target within the target area from the processor, and receive information regarding illuminating the at least one select target with at least one calibrated pattern of light, from the processor, where the laser is at least one of, amplitude modulated and pulse width modulated, and the image sensor configured to, communicate with the processor, receive reflected illumination patterns from the at least one select target within the field of view, generate data regarding the received reflected illumination patterns, and send data about the received reflected illumination patterns to the processor, where the data includes, information allowing the processor to determine distance to the at least one select target via triangulation of the illumination and received reflected illumination, and information regarding structured light of the at least one received reflected illumination patterns.
[0027] Such systems where the pattern is at least one of, alternating illuminated and non-illuminated stripes, intensity modulated stripes, sequential sinusoidal, trapezoidal, Moiré pattern, multi-wavelength 3D, continuously varying, striped indexing, segmented stripes, coded stripes, indexing gray scale, De Bruijn sequence, pseudo-random binary, mini-pattern, wavelength coded grid, and wavelength dot array. Such systems where the light source is further configured to change illumination patterns. Such systems where the light source is a laser. Such systems where the direction to illuminate at least one select target includes direction to track the motion of the at least one select target.
[0028] Another example system includes a system for allowing mapping of a target area, comprising, a laser and an image sensor, the laser configured to, communicate with a processor, receive direction to illuminate at least one select target with a pattern of light, project
illumination on the at least one select target with the pattern of light, receive information regarding calibration of the pattern of light, project calibrated illumination on the at least one select target, the image sensor configured to, communicate with the processor, receive reflected laser illumination patterns from the at least one select target, generate data regarding the received reflected laser illumination patterns, and send the data regarding the received reflected laser illumination to the processor, where the data includes information that would allow the processor to, determine distance via triangulation, generate a map of the target area via 3D surface measurements, and generate a point cloud of the select target.
[0029] Such systems where the pattern is at least one of, alternating illuminated and non-illuminated stripes, intensity modulated stripes, sequential sinusoidal, trapezoidal, Moiré pattern, multi-wavelength 3D, continuously varying, striped indexing, segmented stripes, coded stripes, indexing gray scale, De Bruijn sequence, pseudo-random binary, mini-pattern, wavelength coded grid, and wavelength dot array. Such systems where the light source is further configured to change illumination patterns. Such systems where the laser is further configured to receive direction to track a motion of the selected target. Such systems where the image sensor is at least one of complementary metal oxide semiconductor (CMOS) and charge coupled device (CCD).
[0030] Another example method includes a method for illuminating a target area, comprising, via a light source, communicating with a processor, illuminating a target area with at least one pattern of light, within a field of view, receiving direction to illuminate at least one select target within the target area from the processor, and receiving information regarding illuminating the at least one select target with at least one calibrated pattern of light, from the processor, where the laser is at least one of, amplitude modulated and pulse width modulated, and via an image sensor, communicating with the processor, receiving reflected illumination patterns from the at least one select target within the field of view, generating data regarding the received reflected illumination patterns, and sending data about the received reflected illumination patterns to the processor, where the data includes, information allowing the processor to determine distance to the at least one select target via triangulation of the illumination and received reflected illumination, and information regarding structured light of the at least one received reflected illumination patterns.
[0031] Such methods where the pattern is at least one of, alternating illuminated and non-illuminated stripes, intensity modulated stripes, sequential sinusoidal, trapezoidal, Moiré pattern, multi-wavelength 3D, continuously varying, striped indexing, segmented stripes, coded stripes, indexing gray scale, De Bruijn sequence, pseudo-random binary, mini-pattern, wavelength coded grid, and wavelength dot array. Such methods further comprising, via the light source, projecting a new illumination pattern. Such methods where the light source is a laser. Such methods where the direction to illuminate at least one select target includes direction to track the motion of the at least one select target.
[0032] Another example method includes a method for allowing mapping of a target area, comprising, via a laser, communicating with a processor, receiving direction to illuminate at least one select target with a pattern of light, projecting illumination on the at least one select target with the pattern of light, receiving information regarding calibration of the pattern of light, projecting calibrated illumination on the at least one select target, via an image sensor, communicating with the processor, receiving reflected laser illumination patterns from the at least one select target, generating data regarding the received reflected laser illumination patterns, and sending the data regarding the received reflected laser illumination to the processor, where the data includes information that would allow the processor to, determine distance via triangulation, generate a map of the target area via 3D surface measurements, and generate a point cloud of the select target.
[0033] Such methods where the pattern is at least one of, alternating illuminated and non-illuminated stripes, intensity modulated stripes, sequential sinusoidal, trapezoidal, Moiré pattern, multi-wavelength 3D, continuously varying, striped indexing, segmented stripes, coded stripes, indexing gray scale, De Bruijn sequence, pseudo-random binary, mini-pattern, wavelength coded grid, and wavelength dot array. Such methods further comprising, via the light source, projecting a new illumination pattern. Such methods further comprising, via the laser, receiving direction to track a motion of the selected target. Such methods where the image sensor is at least one of complementary metal oxide semiconductor (CMOS) and charge coupled device (CCD).
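As one hedged illustration of how a coded-stripe pattern from the lists above might be generated (the disclosure does not mandate any particular encoding), the sketch below builds a binary-reflected Gray-code stripe set; the 256-column width and 8-bit depth are arbitrary choices.

```python
import numpy as np

def gray_code_stripe_patterns(width, n_bits):
    """Return n_bits one-row stripe patterns, one per Gray-code bit.

    Each projected frame encodes one bit of every column's index, so the
    projector column that produced a reflected stripe can be decoded from
    the captured frame set.
    """
    columns = np.arange(width)
    gray = columns ^ (columns >> 1)        # binary-reflected Gray code per column
    patterns = []
    for bit in range(n_bits - 1, -1, -1):
        stripe = (gray >> bit) & 1         # 0 = dark column, 1 = illuminated column
        patterns.append(stripe.astype(np.uint8))
    return patterns

# Example: 8 frames uniquely label 256 projector columns
frames = gray_code_stripe_patterns(width=256, n_bits=8)
print(len(frames), frames[0][:8])
```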
[0034] Another example system includes a system for target illumination and mapping, comprising, an infrared light source and an image sensor, the infrared light source configured to, communicate with a processor, illuminate a target area within a field of view, receive direction from the processor, to illuminate at least one select target within the field of view, project illumination on the at least one select target, where the laser is at least one of, amplitude modulated and pulse width modulated, and the image sensor, having a dual band pass filter, configured to, communicate with the processor, receive reflected illumination from the target area within the field of view, receive reflected illumination from the at least one select target within the target area, generate data regarding the received reflected illumination, and send the data to the processor. Such systems where the dual band pass filter is configured to allow visible light and light at the wavelengths emitted by the infrared light source, to pass. Such systems where the visible light wavelengths are between 400nm and 700nm. Such systems where the dual band pass filter includes a notch filter. Such systems where the image sensor is at least one of a complementary metal oxide semiconductor (CMOS) and a charge coupled device (CCD), and where the infrared light source includes at least one of a single axis micro electromechanical system mirror (MEMS) and a dual axis MEMS to direct the light.
[0035] Another example method includes a method for target illumination and mapping, comprising, via an infrared light source, communicating with a processor, illuminating a target area within a field of view, receiving direction from the processor, to illuminate at least one select target within the field of view, projecting illumination on the at least one select target, where the laser is at least one of, amplitude modulated and pulse width modulated, and via an image sensor, having a dual band pass filter, communicating with the processor, receiving reflected illumination from the target area within the field of view, receiving reflected illumination from the at least one select target within the target area, generating data regarding the received reflected illumination, and sending the data to the processor.
[0036] Such methods where the dual band pass filter is configured to allow visible light and light at the wavelengths emitted by the infrared light source, to pass. Such methods where the visible light wavelengths are between 400nm and 700nm. Such methods where the dual band pass filter includes a notch filter. Such methods where the image sensor is at least one of a complementary metal oxide semiconductor (CMOS) and a charge coupled device (CCD), and where the infrared light source includes at least one of a single axis micro electromechanical system mirror (MEMS) and a dual axis MEMS to direct the light.
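A minimal sketch of the dual band pass behavior described above, assuming the 400nm-700nm visible band from the text plus a hypothetical NIR pass band centered at 850nm with a 20nm width (the disclosure does not specify the NIR wavelength):

```python
def dual_band_pass(wavelength_nm,
                   visible=(400.0, 700.0),
                   nir_center_nm=850.0, nir_width_nm=20.0):
    """Return True if a wavelength falls in either pass band of the filter.

    The 400nm-700nm visible band comes from the text; the 850nm center and
    20nm width of the NIR band are assumptions about the laser wavelength.
    """
    in_visible = visible[0] <= wavelength_nm <= visible[1]
    in_nir = abs(wavelength_nm - nir_center_nm) <= nir_width_nm / 2.0
    return in_visible or in_nir

assert dual_band_pass(550.0)       # visible green passes
assert dual_band_pass(845.0)       # light near the assumed laser line passes
assert not dual_band_pass(760.0)   # blocked between the two bands
```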
[0037] Another example system includes a system for target illumination and mapping, comprising, a laser light source and an image sensor, the laser light source configured to, communicate with a processor, project square wave illumination to at least one select target, where the square wave includes at least a leading edge and a trailing edge, send information to the processor regarding the time the leading edge of the square wave illumination was projected and the time the trailing edge of the square wave was projected, where the laser is at least one of, amplitude modulated and pulse width modulated, and the image sensor configured to, communicate with the processor, receive at least one reflected square wave illumination from the at least one select target, generate a signal based on the received reflected square wave illumination, where the signal includes at least information regarding the received time of the leading edge and received time of the trailing edge of the square wave, and send the signal regarding the received reflected square wave illumination to the processor.
[0038] Such systems where the laser light source is further configured to pulse, and where the square wave leading edge is caused by the laser pulse on and the trailing edge is caused by the laser pulse off. Such systems where the laser light source is further configured to change polarization, and where the square wave is caused by a change of polarization. Such systems where the laser light source is further configured to switch gain in order to change polarization. Such systems where the image sensor is a current assisted photon demodulation (CAPD).
[0039] Another example method includes a method for target illumination and mapping, comprising, via a laser light source, communicating with a processor, projecting square wave illumination to at least one select target, where the square wave includes at least a leading edge and a trailing edge, sending information to the processor regarding the time the leading edge of the square wave illumination was projected and the time the trailing edge of the square wave was projected, where the laser is at least one of, amplitude modulated and pulse width modulated, and via an image sensor, communicating with the processor, receiving at least one reflected square wave illumination from the at least one select target, generating a signal based on the received reflected square wave illumination, where the signal includes at least information regarding the received time of the leading edge and received time of the trailing edge of the square wave, and sending the signal regarding the received reflected square wave illumination to the processor.
[0040] Such methods, further comprising, via the laser light source, projecting a pulse of energy, where the square wave leading edge is caused by the laser pulse on and the trailing edge is caused by the laser pulse off. Such methods, further comprising, via the laser light source, projecting energy with a new polarization, where the square wave is caused by a change of polarization. Such methods further comprising, via the laser light source switching gain in order to change polarization. Such methods where the image sensor is a current assisted photon demodulation (CAPD).
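To make the edge-timing idea concrete, the sketch below estimates distance from the projected and received times of a square wave's leading and trailing edges; averaging the two edges is one possible choice and is not necessarily how the described systems combine them.

```python
C = 299_792_458.0  # speed of light, m/s

def distance_from_edges(tx_lead_s, tx_trail_s, rx_lead_s, rx_trail_s):
    """Estimate target distance from a reflected square wave's edge times.

    The round-trip delay is measured from each edge independently and
    averaged; distance is half the round trip times the speed of light.
    """
    round_trip = 0.5 * ((rx_lead_s - tx_lead_s) + (rx_trail_s - tx_trail_s))
    return 0.5 * round_trip * C

# Example: both edges return about 20 ns late -> roughly 3 m to the target
print(distance_from_edges(0.0, 50e-9, 20e-9, 70e-9))  # ~3.0
```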
[0041] Another example system includes a system for target illumination and mapping, comprising, an infrared laser light source and an image sensor, the infrared laser light source configured to, communicate with a processor, illuminate at least one select target within a field of view, where the laser is at least one of, amplitude modulated and pulse width modulated, and the image sensor configured to, communicate with the processor, receive reflected illumination from the at least one select target within the field of view, create a signal based on the received reflected illumination, and send the signal to the processor, where the signal includes at least information that would allow the processor to map the target area and generate an image of the target area.
[0042] Such systems where the image is a gray scale image. Such systems where the signal further includes information that would allow the processor to assign visible colors to the gray scale. Such systems where the infrared laser light source is further configured to receive direction from the processor to illuminate a select target. Such systems where the infrared laser light source is further configured to receive direction from the processor to track the motion of the select target and maintain illumination on the select target.
[0043] Another example method includes a method for target illumination and mapping, comprising, via an infrared laser light source, communicating with a processor, illuminating at least one select target within a field of view, where the laser is at least one of, amplitude modulated and pulse width modulated, and via an image sensor, communicating with the processor, receiving reflected illumination from the at least one select target within the field of view, creating a signal based on the received reflected illumination, and sending the signal to the processor, where the signal includes at least information that would allow the processor to map the target area and generate an image of the target area.
[0044] Such methods where the image is a gray scale image. Such methods where the signal further includes information that would allow the processor to assign visible colors to the gray scale. Such methods where the infrared laser light source is further configured to receive direction from the processor to illuminate a select target. Such methods where the infrared laser light source is further configured to receive direction from the processor to track the motion of the select target and maintain illumination on the select target.
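A hedged sketch of assigning visible colors to the gray scale generated from infrared returns, as described above; the blue/green/red banding scheme is an arbitrary false-color choice, not a mapping taken from the disclosure.

```python
import numpy as np

def gray_to_false_color(gray):
    """Map an 8-bit gray-scale IR image to an RGB false-color image.

    Dark pixels are pushed toward blue, mid tones toward green, bright
    pixels toward red; the banding scheme is illustrative only.
    """
    g = gray.astype(np.float32) / 255.0
    rgb = np.zeros(gray.shape + (3,), dtype=np.uint8)
    rgb[..., 0] = (np.clip((g - 0.5) * 2.0, 0.0, 1.0) * 255).astype(np.uint8)  # red
    rgb[..., 1] = ((1.0 - np.abs(g - 0.5) * 2.0) * 255).astype(np.uint8)       # green
    rgb[..., 2] = (np.clip((1.0 - g) * 2.0, 0.0, 1.0) * 255).astype(np.uint8)  # blue
    return rgb

colored = gray_to_false_color(np.random.randint(0, 256, (480, 640), dtype=np.uint8))
print(colored.shape)  # (480, 640, 3)
```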
[0045] Another example system includes a system for target illumination comprising, an illumination device in communication with an image sensor, the illumination device further configured to, communicate with a processor, project low level full scan illumination to a target area, where the laser is at least one of, amplitude modulated and pulse width modulated, the image sensor further configured to, communicate with the processor, receive reflected illumination from the target area, the processor configured to, identify specific target areas of interest, map the target area, set a value of the number of image pulses for one scan, calculate the energy intensity of each pulse, calculate the total intensity per frame, and compare the total intensity per frame to an eye safety limit, the computing system further configured to, direct the illumination device to scan if the total intensity per frame is less than the eye safety limit, and direct the illumination device to stop scan if the total intensity per frame is greater than or equal to the eye safety limit.
[0046] Such systems where the processor is further configured to communicate to a user an error message if the total intensity per frame is greater than or equal to the eye safety limit. Such systems where the processor is further configured to, if the total intensity per frame is greater than or equal to the eye safety limit, map the target area, set a new value of the number of image pulses for one scan, calculate the energy intensity of each pulse, calculate the total intensity per frame, and compare the total intensity per frame to an eye safety limit. Such systems where the computing system is further configured to track the specific target of interest and direct the illumination source to illuminate the specific area of interest. Such systems where the illumination source includes a laser and a micro electromechanical system mirror (MEMS) to direct the light.
[0047] Another example method includes a method for target illumination comprising, via an illumination device, communicating with a processor, projecting low level full scan illumination to a target area, where the laser is at least one of, amplitude modulated and pulse width modulated, via an image sensor, communicating with the processor, receiving reflected illumination from the target area, via the processor, identifying specific target areas of interest, mapping the target area, setting a value of the number of image pulses for one scan, calculating the energy intensity of each pulse, calculating the total intensity per frame, and comparing the total intensity per frame to an eye safety limit, directing the illumination device to scan if the total intensity per frame is less than the eye safety limit, and directing the illumination device to stop scan if the total intensity per frame is greater than or equal to the eye safety limit.
[0048] Such methods further comprising, via the processor, communicating to a user an error message if the total intensity per frame is greater than or equal to the eye safety limit. Such methods further comprising, via the processor, if the total intensity per frame is greater than or equal to the eye safety limit, mapping the target area, setting a new value of the number of image pulses for one scan, calculating the energy intensity of each pulse, calculating the total intensity per frame, and comparing the total intensity per frame to an eye safety limit. Such methods where the computing system is further configured to track the specific target of interest and direct the illumination source to illuminate the specific area of interest. Such methods where the illumination source includes a laser and a micro electromechanical system mirror (MEMS) to direct the light.
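The eye-safety control flow described above can be sketched roughly as follows; the millijoule units, the hard-coded limit, and the strategy of halving the pulse count on retry are assumptions for illustration only.

```python
def plan_scan(pulse_energy_mj, pulses_per_scan, eye_safety_limit_mj, max_retries=3):
    """Decide whether a scan may run, mirroring the per-frame check above.

    Total intensity per frame = pulse energy x pulse count. If that meets or
    exceeds the eye-safety limit, a new (smaller) pulse count is tried; the
    units, limit, and halving strategy are assumptions for illustration.
    """
    for _ in range(max_retries + 1):
        frame_total = pulse_energy_mj * pulses_per_scan
        if frame_total < eye_safety_limit_mj:
            return {"scan": True, "pulses": pulses_per_scan, "frame_total_mj": frame_total}
        pulses_per_scan //= 2        # set a new value of image pulses and re-check
        if pulses_per_scan == 0:
            break
    return {"scan": False, "error": "total intensity per frame exceeds the eye safety limit"}

print(plan_scan(pulse_energy_mj=0.02, pulses_per_scan=1000, eye_safety_limit_mj=10.0))
```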
[0049] Another example system includes a system for target illumination and mapping, comprising, a directed light source, at least one image projector, and an image sensor, the directed light source configured to, communicate with a processor, illuminate at least one select target area within a field of view, receive direction to illuminate an at least one select target, where the laser is at least one of, amplitude modulated and pulse width modulated, the image sensor configured to, communicate with the processor, receive reflected illumination from the at least one select target within the target area, create data regarding the received reflected illumination, send data regarding the received reflected illumination to the processor, and the image projector configured to, communicate with the processor, receive direction to project an image on the at least one select target, and project an image on the at least one select target.
[0050] Such systems where the directed light source is an infrared laser. Such systems where the data regarding the received reflected illumination includes information regarding the distance from the system to the target via triangulation. Such systems where the image projector is calibrated to the distance calculation from the processor, where calibration includes adjustments to a throw angle of the image projector. Such systems where the image projector is further configured to project at least two images on at least two different identified and tracked targets. Such systems where the image sensor is at least one of a complementary metal oxide semiconductor (CMOS) and a charge coupled device (CCD). Such systems where the directed light source is configured to project a pattern of illumination on the select target.
[0051] Another example system includes a system for target illumination and mapping, comprising, a directed light source and an image sensor, the directed light source configured to, communicate with a processor, illuminate at least one target area within a field of view, receive direction to track a selected target within the target area from the processor, receive direction to project an image on the tracked selected target from the processor, project an image on the tracked selected target according to the received direction, the image sensor configured to, communicate with the processor, receive reflected illumination from the at least one select target within the field of view, generate data regarding the received reflected illumination, and send the received reflected illumination data to the processor. Such systems where the directed light source is a visible light laser and the image is a laser scan image, where the laser is at least one of, amplitude modulated and pulse width modulated. Such systems where the image sensor is at least one of a complementary metal oxide semiconductor (CMOS) and a charge coupled device (CCD).
[0052] Another example method includes a method for target illumination and mapping, comprising, via a directed light source, communicating with a processor, illuminating at least one select target area within a field of view, receiving direction to illuminate an at least one select target, where the laser is at least one of, amplitude modulated and pulse width modulated, via an image sensor, communicating with the processor, receiving reflected illumination from the at least one select target within the target area, creating data regarding the received reflected illumination, sending data regarding the received reflected illumination to the processor, and via an image projector, communicating with the processor, receiving direction to project an image on the at least one select target, and projecting an image on the at least one select target.
[0053] Such methods where the directed light source is an infrared laser. Such methods where the data regarding the received reflected illumination includes information regarding the distance from the system to the target via triangulation. Such methods where the image projector is calibrated to the distance calculation from the processor, where calibration includes adjustments to a throw angle of the image projector. Such methods, further comprising, via the image projector, projecting at least two images on at least two different identified and tracked targets. Such methods where the image sensor is at least one of a complementary metal oxide semiconductor (CMOS) and a charge coupled device (CCD). Such methods further comprising, via the directed light source, projecting a pattern of illumination on the select target.
[0054] Another example method includes a method for target illumination and mapping, comprising, via a directed light source, communicating with a processor, illuminating at least one target area within a field of view, receiving direction to track a selected target within the target area from the processor, receiving direction to project an image on the tracked selected target from the processor, projecting an image on the tracked selected target according to the received direction, via an image sensor, communicating with the processor, receiving reflected illumination from the at least one select target within the field of view, generating data regarding the received reflected illumination, and sending the received reflected illumination data to the processor.
[0055] Such methods where the directed light source is a visible light laser and the image is a laser scan image, where the laser is at least one of, amplitude modulated and pulse width modulated. Such methods where the image sensor is at least one of a complementary metal oxide semiconductor (CMOS) and a charge coupled device (CCD).
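One way to picture the throw-angle calibration described above (a geometric sketch only, under a simple pinhole-style projector model): the projected width grows with distance, so the throw angle can be re-solved to hold a desired overlay size on a tracked target.

```python
import math

def projected_width(distance_m, throw_angle_deg):
    """Width of the projected image at a given distance for a full throw angle."""
    return 2.0 * distance_m * math.tan(math.radians(throw_angle_deg) / 2.0)

def throw_angle_for_width(distance_m, desired_width_m):
    """Throw angle needed to keep the projected image at a desired width."""
    return math.degrees(2.0 * math.atan(desired_width_m / (2.0 * distance_m)))

# Keep a 0.5 m wide overlay on a target tracked from 2 m out to 4 m
for d in (2.0, 3.0, 4.0):
    print(d, "m ->", round(throw_angle_for_width(d, 0.5), 2), "degrees")
```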
[0056] Another example system includes a system for target illumination and mapping, comprising, a directional light source and an image sensor, the directional light source configured to, communicate with a processor, illuminate at least one target area within a field of view with a scan of at least one pixel point, receive direction to illuminate the target with additional pixel points over time for additional calculations of distance, from the at least one processor, the image sensor configured to, communicate with the processor, receive a reflection of the at least one pixel point from the at least one select target within the field of view, generate data regarding the received pixel reflection, send the data regarding the received pixel reflection to the at least one processor, where the data includes information that the processor could analyze and determine distance from the system to the target via triangulation, and where the data further includes information regarding the relative proximity between the directional light source and the image sensor.
[0057] Such systems where the directional light source is a laser, and at least one of, amplitude modulated and pulse width modulated. Such systems where the data further includes information that the processor could analyze and determine a depth map, based on the calculations of distance of the at least one target pixel point. Such systems where the data further includes information that the processor could analyze and determine the distance between the system and the target via triangulation among the directed light source, the image sensor, and the additional pixel points. Such systems where the directional light source is further configured to receive direction to illuminate the selected target with at least one pixel point from the processor.
[0058] Another example method includes a method for target illumination and mapping, comprising, via a directional light source, communicating with a processor, illuminating at least one target area within a field of view with a scan of at least one pixel point, receiving direction to illuminate the target with additional pixel points over time for additional calculations of distance, from the at least one processor, via an image sensor, communicating with the processor, receiving a reflection of the at least one pixel point from the at least one select target within the field of view, generating data regarding the received pixel reflection, sending the data regarding the received pixel reflection to the at least one processor, where the data includes information that the processor could analyze and determine distance from the system to the target via triangulation, and where the data further includes information regarding the relative proximity between the directional light source and the image sensor.
[0059] Such methods where the directional light source is a laser, and at least one of, amplitude modulated and pulse width modulated. Such methods, where the data further includes information that the processor could analyze and determine a depth map, based on the calculations of distance of the at least one target pixel point. Such methods where the data further includes information that the processor could analyze and determine the distance between the system and the target via triangulation among the directed light source, the image sensor, and the additional pixel points. Such methods further comprising, via the directional light source receiving direction to illuminate the selected target with at least one pixel point from the processor.
[0060] Another example system includes a system for biometric analysis, comprising, a directed laser light source and an image sensor, the directed laser light source configured to communicate with a processor, illuminate a target area within a field of view, receive direction to illuminate at least one select target in the target area, receive direction to illuminate a biometric area of the at least one select target, where the laser is at least one of, amplitude modulated and pulse width modulated, and the image sensor configured to, communicate with the processor, receive reflected illumination from the at least one target area within the field of view, generate data regarding the received reflected illumination, send the generated data to the processor, where the data includes at least information that would allow the processor to map the target area, identify the select target within the target area, and determine a biometric reading of the at least one select target.
[0061] Such systems where the biometric reading is at least one of, skin deflection, skin reflectivity, and oxygen absorption. Such systems where the illumination is a pattern of illumination, and where the computing system is further configured to analyze the reflected pattern illumination from the target. Such systems where the data contains further information that would allow the processor to calculate a distance from the system to the target via triangulation. Such systems where the light source is further configured to receive calibration information of the illumination pattern, and project the calibrated pattern on the at least one select target.
[0062] Another example method includes a method for biometric analysis, comprising, via a directed laser light source, communicating with a processor, illuminating a target area within a field of view, receiving direction to illuminate at least one select target in the target area, receiving direction to illuminate a biometric area of the at least one select target, where the laser is at least one of, amplitude modulated and pulse width modulated, and via an image sensor, communicating with the processor, receiving reflected illumination from the at least one target area within the field of view, generating data regarding the received reflected illumination, sending the generated data to the processor, where the data includes at least information that would allow the processor to map the target area, identify the select target within the target area, and determine a biometric reading of the at least one select target.
[0063] Such methods where the biometric reading is at least one of, skin deflection, skin reflectivity, and oxygen absorption. Such methods where the illumination is a pattern of illumination, and where the computing system is further configured to analyze the reflected pattern illumination from the target. Such methods where the data contains further information that would allow the processor to calculate a distance from the system to the target via triangulation. Such methods further comprising, via the light source, receiving calibration information of the illumination pattern, and projecting the calibrated pattern on the at least one select target.
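As a loosely related illustration only (the disclosure does not specify how a biometric reading is computed), a periodic component of a skin-reflectivity time series could be turned into a pulse-rate estimate with an FFT, as sketched below; the sample rate and frequency band are assumptions.

```python
import numpy as np

def estimate_pulse_bpm(reflectivity, sample_rate_hz):
    """Estimate pulse rate from a skin-reflectivity time series.

    Finds the strongest spectral peak in the 0.7-3.0 Hz band (42-180 bpm).
    Illustrative only; the disclosure does not specify this computation.
    """
    signal = reflectivity - np.mean(reflectivity)          # remove the DC level
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate_hz)
    band = (freqs >= 0.7) & (freqs <= 3.0)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0

# Synthetic 72 bpm (1.2 Hz) reflectivity signal sampled at 30 frames per second
t = np.arange(0, 10, 1 / 30)
series = 1.0 + 0.02 * np.sin(2 * np.pi * 1.2 * t) + 0.005 * np.random.randn(t.size)
print(round(estimate_pulse_bpm(series, 30)))  # ~72
```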
[0064] Another example system includes a system for target illumination and mapping, comprising, a directed light source, and an image sensor, the light source having an aperture and configured to, illuminate a target area within a field of view, via an incremental scan, where each increment has a unique outbound angle from the light source aperture, and a unique inbound angle to the image sensor aperture, send data regarding the incremental outbound angles to the processor, and the image sensor having an aperture and configured to, receive reflected illumination from the at least one select target within the field of view, generate data regarding the received reflected illumination including inbound angles, and send the data regarding the received reflected illumination to the processor, where the data regarding the outbound angles and the data regarding the inbound angles include information used to calculate a distance from the system to the target via triangulation, and where the distance between light source aperture and the image capture aperture is relatively fixed.
[0065] Such systems where the directed light source is a laser, where the laser is at least one of, amplitude modulated and pulse width modulated. Such systems where the image sensor includes optical filters. Such systems where the data regarding the outbound angles and the data regarding the inbound angles further include information used to calculate a depth map based on the illumination. Such systems where the data regarding the outbound angles and the data regarding the inbound angles further include information used to calculate a point cloud based on the depth map.
[0066] Another example method includes a method for target illumination and mapping. Such a method including, via a directed light source, having an aperture, illuminating a target area within a field of view, via an incremental scan, where each increment has a unique outbound angle from the light source aperture, and a unique inbound angle to the image sensor aperture, sending data regarding the incremental outbound angles to the processor, and via an image sensor, having an aperture, receiving reflected illumination from the at least one select target within the field of view, generating data regarding the received reflected illumination including inbound angles, and sending the data regarding the received reflected illumination to the processor, where the data regarding the outbound angles and the data regarding the inbound angles include information used to calculate a distance from the system to the target via triangulation, and where the distance between light source aperture and the image capture aperture is relatively fixed.
[0067] Methods here where the directed light source is a laser, where the laser is at least one of, amplitude modulated and pulse width modulated. Methods here where the image sensor includes optical filters. Methods here where the data regarding the outbound angles and the data regarding the inbound angles further include information used to calculate a depth map based on the illumination. Methods here where the data regarding the outbound angles and the data regarding the inbound angles further include information used to calculate a point cloud based on the depth map.

BRIEF DESCRIPTION OF THE DRAWINGS
[0068] For a better understanding of the embodiments described in this application, reference should be made to the Detailed Description below, in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout the figures.
[0069] Figure 1 is a perspective view of components consistent with certain aspects related to the innovations herein.
[0070] Figures 2A - 2B show an example monolithic array and projection lens, front side and perspective view consistent with certain aspects related to the innovations herein.
[0071] Figures 3A - 3B are front, top, side, and perspective views showing an example array consistent with certain aspects related to the innovations herein.
[0072] Figures 4A - 4B are front, top, side, and perspective views showing an example array with a flexible PCB consistent with certain aspects related to the innovations herein.
[0073] Figure 5 is an illustration of an example full / flood array illuminated target area consistent with certain aspects related to the innovations herein.
[0074] Figures 6A - 6E are a perspective view and sequence illustrations of example array column illuminations consistent with certain aspects related to the innovations herein.
[0075] Figures 7A - 7E are a perspective view and sequence illustrations of example sub- array illuminations consistent with certain aspects related to the innovations herein.
[0076] Figures 8A - 8E are a perspective view and sequence illustrations of example single array element illuminations consistent with certain aspects related to the innovations herein.
[0077] Figure 9 is a perspective view of example system components of certain directional illumination embodiments herein.
[0078] Figures 10A-10D show example views of various possible scanning mechanism designs consistent with certain aspects related to the innovations herein.
[0079] Figure 11 is a depiction of a target area illuminated by an example directional scanning illumination consistent with certain aspects related to the innovations herein.
[0080] Figure 12 depicts an example embodiment of a 2-axis MEMS consistent with certain aspects related to the innovations herein.
[0081] Figure 13 depicts an example embodiment of a two single-axis MEMS configuration according to certain embodiments herein.
[0082] Figure 14 depicts an example embodiment including a single rotating polygon and a single axis mirror consistent with certain aspects related to the innovations herein.
[0083] Figure 15 depicts an example embodiment including dual polygons consistent with certain aspects related to the innovations herein.
[0084] Figure 16 is a depiction of an example full target illumination consistent with certain aspects related to the innovations herein.
[0085] Figure 17 is an illustration of an illumination utilized to create a subject outline consistent with certain aspects related to the innovations herein.
[0086] Figure 18 is an illustration of illumination of a sub-set of the subject, consistent with certain aspects related to the innovations herein.
[0087] Figure 19 is an illustration of illumination of multiple sub-sets of the subject, consistent with certain aspects related to the innovations herein.
[0088] Figure 20 depicts an example skeletal tracking of a target consistent with certain aspects related to the innovations herein.
[0089] Figure 21 depicts an example projection of a pattern onto a target area consistent with certain aspects related to the innovations herein.
[0090] Figure 22 is a flow chart depicting target illumination and image recognition consistent with certain aspects related to the innovations herein.
[0091] Figure 23 illustrates system components and their interaction with both ambient full spectrum light and directed NIR consistent with certain aspects related to the innovations herein.
[0092] Figure 24 is a perspective view of an example video imaging sensing assembly consistent with certain aspects related to the innovations herein.
[0093] Figure 25 is an associated graph of light transmission through a certain example filter consistent with certain aspects related to the innovations herein.
[0094] Figure 26A is a perspective view of the video imaging sensing assembly of the present invention illustrating one combined notch and narrow band optical filter utilizing two elements consistent with certain aspects related to the innovations herein.
[0095] Figure 26B is an associated graph of light transmission through certain example filters of certain embodiments herein.
[0096] Figure 27A is a perspective view of an example video imaging sensing assembly illustrating three narrow band filters of different frequencies consistent with certain aspects related to the innovations herein.
[0097] Figure 27B is an associated graph of light transmission through certain example filters consistent with certain aspects related to the innovations herein.
[0098] Figure 28 is a perspective view of triangulation embodiment components consistent with certain aspects related to the innovations herein.
[0099] Figure 29 is a depiction of block areas of a subject as selected by the user or recognition software consistent with certain aspects related to the innovations herein.
[00100] Figure 30 is a depiction of a single spot map as determined by the user or recognition software consistent with certain aspects related to the innovations herein.
[00101] Figure 31 depicts an example embodiment showing superimposed distance measurements in mm as related to certain embodiments herein.
[00102] Figure 32 depicts an example multiple spot map as determined by the user or recognition software consistent with certain aspects related to the innovations herein.
[00103] Figure 33 depicts an example embodiment showing superimposed distance in mm and table as related to certain embodiments herein.
[00104] Figure 34 depicts an example embodiment showing axial alignment of the components of directed light source and the image sensor consistent with certain aspects related to the innovations herein.
[00105] Figure 35 shows an example embodiment with a configuration including axial alignment and no angular component to the light source consistent with certain aspects related to the innovations herein.
[00106] Figure 36 shows an example embodiment with a configuration including axial alignment and an angular component to the light source consistent with certain aspects related to the innovations herein.
[00107] Figures 37A-37C depict an example embodiment showing top, side, and axial views of configurations consistent with certain aspects related to the innovations herein.
[00108] Figures 38A-38C depict an example embodiment showing top, side, and axial views of a configuration according to certain embodiments herein with a horizontal and vertical offset between the image sensor and the illumination device.
[00109] Figure 39 depicts an example embodiment configuration including axial alignment and an angular component to the light source with an offset in the Z axis between the image sensor and the illumination device consistent with certain aspects related to the innovations herein.
[00110] Figure 40 depicts an example embodiment of a process flow and screenshots consistent with certain aspects related to the innovations herein.
[00111] Figure 41 depicts an example embodiment including light interacting with an image sensor consistent with certain aspects related to the innovations herein.
[00112] Figure 42 depicts an example embodiment of image spots overlaid on a monochrome pixel map of a sensor consistent with certain aspects related to the innovations herein.
[00113] Figure 43 shows an example perspective view of an example of illumination being directed onto a human forehead for biometrics purposes consistent with certain aspects related to the innovations herein.
[00114] Figure 44A shows an example embodiment of sequential triangulation and a perspective view including one line of sequential illumination being directed into a room with a human figure consistent with certain aspects related to the innovations herein.
[00115] Figure 44B shows an example embodiment of sequential triangulation and a perspective view including select pixels consistent with certain aspects related to the innovations herein.
[00116] Figure 45 shows an example embodiment of a human subject with a projected image consistent with certain aspects related to the innovations herein.
[00117] Figure 46A is an example embodiment showing a human subject with a projected illumination incorporating safety eye blocking consistent with certain aspects related to the innovations herein.
[00118] Figure 46B is another example embodiment showing a human subject with a projected illumination incorporating safety eye blocking consistent with certain aspects related to the innovations herein.
[00119] Figure 47A is a detailed illustration of a human eye and the small output window of the illumination device.
[00120] Figure 47B illustrates a human eye pupil relative to the small illumination device output window.
[00121] Figure 47C is a detailed illustration of a human eye and the large output window of the illumination device.
[00122] Figure 47D illustrates a human eye pupil relative to the large illumination device output window.
[00123] Figure 48A is an example embodiment showing a chart assigning color values to shades of gray consistent with certain aspects related to the innovations herein.
[00124] Figure 48B shows an example perspective view of certain embodiments herein including illumination directed onto a human figure after color enhancement consistent with certain aspects related to the innovations herein.
[00125] Figure 49A is an example graph showing a square wave formed by different systems consistent with certain aspects related to the innovations herein.
[00126] Figure 49B is an example perspective view illustrating one line of a propagated square wave consistent with certain aspects related to the innovations herein.
[00127] Figure 50A is an example perspective view of the throw angle effect on projected patterns consistent with certain aspects related to the innovations herein.
[00128] Figure 50B is an example perspective view showing calibrated projected patterns to compensate for distance consistent with certain aspects related to the innovations herein.
[00129] Figure 50C is an example perspective view of oriented calibration based on object shape consistent with certain aspects related to the innovations herein.
[00130] Figure 51 is an example table of projected pattern methodologies consistent with certain aspects related to the innovations herein.
[00131] Figure 52A is a perspective view of an example of an adjacent configuration consistent with certain aspects related to the innovations herein.
[00132] Figure 52B is a perspective view of an example system consistent with certain aspects related to the innovations herein.
[00133] Figure 52C is a perspective view of an example of an objective configuration consistent with certain aspects related to the innovations herein.
DETAILED DESCRIPTION
[00134] Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a sufficient understanding of the subject matter presented herein. But it will be apparent to one of ordinary skill in the art that the subject matter may be practiced without these specific details. Moreover, the particular embodiments described herein are provided by way of example and should not be used to limit the scope of the inventions to these particular embodiments. In other instances, well-known data structures, timing protocols, software operations, procedures, and components have not been described in detail so as not to unnecessarily obscure aspects of the embodiments of the invention.
Overview
[00135] Enhanced software and hardware control of light sources has led to vast possibilities when it comes to gesture recognition, depth-of-field measurement, image/object tracking, three dimensional imaging, among other things. The embodiments here may work with such software and/or systems to illuminate targets, capture image information of the illuminated targets, and analyze that information for use in any number of operational situations. Additionally, certain embodiments may be used to measure distances to objects and/or targets in order to aid in mapping of three dimensional space, create depth of field maps and/or point clouds.
[00136] Object or gesture recognition is useful in many technologies today. Such technology can allow for system/software control using human gestures instead of keyboard or voice control. The technology may also be used to map physical spaces and analyze movement of physical objects. To do so, certain embodiments may use an illumination coupled with a camera or image sensor in various configurations to map the target area. The illumination could be sourced any number of ways including but not limited to arrays of Light Emitting Diodes (LEDs) or directional scanning laser light.
[00137] In some instances, visible-spectrum light may not be available at the level necessary for image sensors to adequately detect, or augmenting the scene with visible light at that level may not be desirable; therefore infrared/near infrared (IR/NIR) illumination may be used in such systems.
[00138] There are numerous infrared/ near infrared (IR/NIR) illumination systems on the market which produce non-directed flood type illumination. However, providing a directed source of illumination may require a dynamic connection between the recognition
software/hardware and the source of illumination. Issues of human eye safety also place constraints on the total amount of IR/NIR illumination that can safely be used.
[00139] Direction and eye safety may be achieved, depending on the configuration of the system, by utilizing an addressable array of emitting devices or using a scanning mechanism, while minimizing illumination to non-targeted areas, thus reducing the overall energy required as compared with flood illumination. The system may also be used to calculate the amount of illumination required and the total output power, and to help determine the duration of each cycle of illumination. The system may then compare the illumination requirements to any number of maximum eye safe levels in order to adjust any of the parameters for safety. This may also result in directing the light toward certain areas to improve illumination there while minimizing illumination in other areas.
[00140] Various optics, filters, durations, intensities and polarizations could also be used to modify the light used to illuminate the objects in order to obtain additional illuminated object data. The image capture could be through any of various cameras and image sensors. Various filters, lenses and focus features could be used to capture the illuminated object data and send it to computing hardware and/or software for manipulation and analysis.
[00141] In certain examples, using an array of illumination sources, individual illumination elements may be grouped into columns or blocks to simplify the processing by the computers. In a directional illumination embodiment, targeted areas could be thus illuminated. Other examples, using directional illumination sources, could be used to project pixels of light onto a target area.
[00142] Such example segments/areas may each be illuminated for an approximately equal fraction of the frame time such that an image capture device, such as a Complementary Metal Oxide Semiconductor (CMOS) camera, may view and interpret the illumination as homogeneous illumination for the duration of one frame or refresh.
[00143] The illumination and image capture should be properly timed to ensure that the targeted areas are illuminated during the time that the image capture device collects data. Thus, the illumination source(s) and the image capture should synchronize in order to ensure proper data capture. If the image capture and illumination are out of sync, the system may be unable to determine whether the target object has moved or whether the illumination merely missed the target.
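A minimal sketch of the frame-locked timing idea in the two paragraphs above, assuming equal illumination slots per frame; the frame rate, segment count, and duty cycle shown are hypothetical.

```python
def illumination_schedule(frame_rate_hz, n_segments, duty_cycle=1.0):
    """Split each camera frame into equal illumination slots.

    Each targeted segment is lit for an equal fraction of the frame so the
    sensor integrates what appears to be homogeneous illumination. Returns
    (start, duration) offsets in seconds from the frame trigger.
    """
    frame_time = 1.0 / frame_rate_hz
    slot = frame_time / n_segments
    return [(i * slot, slot * duty_cycle) for i in range(n_segments)]

# Example: a 60 fps camera with 8 illumination segments per frame
for start, duration in illumination_schedule(60.0, 8):
    print(f"fire at +{start * 1e3:.2f} ms for {duration * 1e3:.2f} ms")
```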
[00144] Further, distance calculations derived from using the illumination and capture systems described herein may add to the information that the system may use to calculate and map three dimensional space. This may be accomplished, in certain embodiments, using triangulation measurements among the illumination source, the image capture device(s) and the illuminated object(s).
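A rough sketch of such a triangulation calculation, assuming a fixed baseline between the emitter and sensor apertures and angles measured from that baseline (the angle convention and the example numbers are assumptions, not values from the disclosure):

```python
import math

def triangulate_distance(baseline_m, outbound_deg, inbound_deg):
    """Perpendicular distance from the baseline to the illuminated spot.

    Angles are measured from the baseline at the emitter (outbound) and at
    the sensor (inbound); the law of sines gives the range from the sensor,
    and the perpendicular distance follows.
    """
    a = math.radians(outbound_deg)
    b = math.radians(inbound_deg)
    gamma = math.pi - a - b                      # angle at the illuminated spot
    range_from_sensor = baseline_m * math.sin(a) / math.sin(gamma)
    return range_from_sensor * math.sin(b)

# 10 cm emitter-to-sensor baseline, spot seen at 80 degrees from each end
print(round(triangulate_distance(0.10, 80.0, 80.0), 3))  # ~0.284 m
```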
[00145] Thus, certain example systems may include combinations of the following components: an illumination source, such as an addressable array of semiconductor light emitting devices or a directional source using lasers; projection optics or a mechanical structure for spreading the light, if an array of sources is used; an image capture device, such as a CMOS, Charge Coupled Device (CCD), or other imaging device, which may incorporate a short band pass filter allowing visible and specific IR/NIR light in certain embodiments; computing devices, such as microprocessor(s), which may be used in conjunction with computing instructions to control the array or directional illumination source; database(s) and/or data storage to store information as it is collected; and object and/or gesture recognition instructions to interpret and analyze the captured image information. Recognition instructions/software could be used to help analyze captured images for any number of purposes, including identifying the subject requiring directed illumination and sending commands to the microprocessor controlling the array that identify only the elements necessary to energize so as to direct illumination on the target, thereby creating the highest possible level of eye safe illumination on the target.
[00146] In some example embodiments, for safety, the system may utilize object tracking technology, such as recognition software, to locate the eyes of a person who may be in the target field and block the light from a certain area around them for eye safety. Such an example may keep emitted light away from a person's eyes and allow the system to raise the light intensity in other areas of illumination, while keeping the raised-intensity light away from the eyes of any user or person within the system's range.
Detailed Examples
[00147] A preferred embodiment of the present invention will be described with reference to Figures 1 to 52C.
Array of Illumination Sources
[00148] As described above, the illumination of the target field may be accomplished in a number of ways. One such way is through an array of illumination sources such as LEDs. Figure 1 illustrates an example system utilizing such illumination sources. To illuminate a target or target area, the illumination source may be timed in accordance with the image capture device's frame duration and rate. In this way, during one open frame time of the image capture device/camera, which can be any amount of time but is often 1/30th, 1/60th or 1/120th of a second, the illumination source may illuminate the target and/or target area. These illumination sources can operate in a number of ways during that one frame time, including turning on all elements, or a select number of elements, all at the same power level or intensity and for the entire frame duration. Other examples include turning all the illumination sources on at the same intensity or power but changing the length of time each is on within the frame time. Still other examples include changing the power or intensity of the illumination sources while keeping the on-time the same for all, and yet another is changing both the power and the time the illumination sources are on.
[00149] As will be discussed in more detail below, the effective output power for the array may be measured over time to help calculate safe levels of exposure, for example, to the human eye. Thus, an eye safety limit may be calculated by evaluating output power over time. This output power would be affected by the variations in illumination time and intensity disclosed above.
[00150] In Figure 1, the illumination device 102 is arranged as an array 102 utilizing diverging projection optics 104, housed on a physical mechanical structure 106. The array of illumination sources is arranged to generate directed illumination 108 on a particular target area 110, shown in this example as containing a human form 112 and an object 114, though the target could be any number of things. The illumination device 102 in Figure 1 is connected to a computer system including an example microprocessor 116, as well as the image capture system, shown here as a video imaging camera 118, lens tube 120, camera lens 122, and camera filter 124. The system is also shown in communication with a computer system including object recognition software or instructions 126 that can enable the system to direct and/or to control the illumination in any number of ways described herein.
[00151] In this example, the array 102 is shown connected to a computing system including a microprocessor 116 which can individually address and drive the different semiconductor light emitting devices 102 through an electronic control system. The example microprocessor 116 may be in communication with a memory or data storage (not pictured) for storing predefined and/or user generated command sequences. The computing system is further shown with an abstract representation of recognition software 126, which can enable the software to control the directed illumination. In the example drawing, these objects are shown in exploded and/or exaggerated forms, whereas in practice they may take any number of shapes and configurations. Here, they are shown as sometimes separate and symbolic icons.
[00152] As depicted in the example shown in Figure 2A, the illumination device 202 may comprise a monolithic array of semiconductor light emitting devices 206 and projection optics 204, such as a lens, arranged between the array 202 of semiconductor light emitting devices 206 and the target area. The array 202 may comprise any number of device types including but not limited to separate Light Emitting Diodes (LEDs), Edge Emitting Lasers (EELs), Vertical Cavity Surface Emitting Lasers (VCSELs) or other types of semiconductor light emitting devices.
[00153] In the example shown in Figure 2B, the monolithic array 202 is arranged on a printed circuit board (PCB) 208, along with associated driving electronics. The semiconductor light emitting devices 206 are uniformly distributed over the area of the array 202 thereby forming a matrix. Any kind of arrangement of light sources could be used, in order to allow for the light to be projected and directed toward the target area.
[00154] The number of semiconductor light emitting devices 206 used may vary. For example, an array provided with 10 x 20 LEDs may result in proper directed illumination for a particular target area. For standalone devices, such as an auxiliary system for a laptop or television, a PCB array of discrete semiconductor light emitting devices such as LEDs may suffice.
[00155] In one example embodiment herein, the semiconductor light emitting devices 206 are either physically offset or the alignment of alternating columns is offset such that it creates a partially overlapping pattern of illumination. This partially overlapping pattern is described below, for example later in Figure 5.
[00156] As depicted in Figure 3A, the illumination device may include an array of semiconductor light emitting devices 306 and a mechanical structure 302, or framework, with a defined curvature onto which PCBs are mounted, each PCB carrying one or more semiconductor light emitting devices 306 arranged X-wide by Y-tall and attached to the physical frame at a defined angle of curvature. The sub-array PCBs 310 may comprise a sub-array of semiconductor light emitting devices 306 X-wide by Y-tall, hereinafter referred to as a sub-array. Each sub-array may include any number of illumination sources including but not limited to separate LEDs, EELs, VCSELs or other types of semiconductor light emitting devices. The array 302 with sub-array PCBs 310 may include associated driving electronics. The semiconductor light emitting devices 306 may be uniformly distributed over the area of the sub-array PCBs 310 of the array 302, thereby forming a matrix. The number of semiconductor light emitting devices 306 used in the matrix may vary and the determination may be predefined, or defined by the user or the software. An illumination device, for example, may include a 10 x 20 array of LEDs for directed illumination. For standalone devices, such as an auxiliary system for a laptop or television, a PCB sub-array of discrete semiconductor light emitting devices such as LEDs may be used. In some embodiments, the array 302 could be constructed of monolithic sub-arrays, i.e., single chip devices having all of the semiconductor light emitting devices on a single chip. Figure 3B shows a perspective view of the curved array from Figure 3A.
[00157] As depicted in Figure 4A, the illumination device 402 may include an array of semiconductor light emitting devices 406, a flexible PCB 412 arranged with a defined angle of curvature which may be attached to a physical frame, including associated driving electronics. The semiconductor light emitting devices 406 may be uniformly distributed over the area of the array 402 thereby forming a matrix. The number of semiconductor light emitting devices 406 used in the matrix may vary and the determination may be predefined, or defined by the user or the software. For example, an illumination device provided with 10 x 20 array LEDs may provide sufficient directed illumination for a particular application. For standalone devices, such as an auxiliary system for a laptop or television, a flexible PCB made up of discrete
semiconductor light emitting devices such as LEDs would suffice. Figure 4B shows another example view of the curved array from Figure 4A.
[00158] Figure 5 depicts an illustration of an example array 502 and what a target area 520 energized and/or illuminated by the array 502 may look like. In the figure, each example circle 522 depicts the coverage area of one of the light emitting devices or illumination sources 506. As can be seen from the example, the coverage 522 of each light emitting device may overlap with the adjacent coverage 522, depending on the width of the light emitting device beam and the distance of the target object 530 from the array 502. As will be described in detail below, any arrangement of single illumination devices could be used in any combination. The example in Figure 5 shows all of the devices on at once.
[00159] Figure 6A depicts an example of the system illuminating a target area and a human 630. The system could also be used to target anything else in the target area, such as an object 632. The example array 602 is shown with one example column of light sources and their respective light beam coverage circles 622. Using an example column defined as one element or light source wide by X elements tall (in this example 1x10, but the number of elements can vary), the system is used to illuminate specific targets.
[00160] In certain embodiments, only certain precise areas of the overall target area require illumination. The system could first identify those precise areas within the overall target area using object recognition, and then illuminate that precise area or areas to highlight them for additional granularity. Thus, using the coordinates of a precise area which requires specific illumination, the system may provide those coordinates to the computing system including the microprocessor, which in turn may calculate the correct precise area elements to illuminate and/or energize. The system could also determine safety parameters such as the safe duration of that illumination during one cycle.
[00161] For example, in a case where Columns = 4, the calculation for one column could be P = F/4, where P is the length of time an element or block of elements is energized during a cycle and F is the duration of one cycle.
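For illustration only, the following is a minimal sketch in Python of how the P = F/Columns timing described above might be scheduled within one capture frame; the helper names (illuminate_columns_for_frame, energize_column) are hypothetical and are not part of any particular embodiment.

def illuminate_columns_for_frame(columns_to_light, frame_duration_s, energize_column):
    """Energize each selected column for an equal share P = F / N of one frame."""
    n = len(columns_to_light)
    if n == 0:
        return
    p = frame_duration_s / n  # P = F/N, on-time per column during one cycle
    for col in columns_to_light:
        energize_column(col, on_time_s=p)

# Example usage: four columns share one 1/60th-second frame, so P = F/4.
if __name__ == "__main__":
    def print_energize(col, on_time_s):
        print("column %d: on for %.3f ms" % (col, on_time_s * 1000.0))
    illuminate_columns_for_frame([1, 2, 3, 4], 1.0 / 60.0, print_energize)

In a hardware implementation the same division would simply set the drive timing for each column's elements.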
[00162] The system could be used to sequentially illuminate a given example area. Figures 6B and 6C depict the first and the second illuminated columns in an example sequence, where the light emitting array 602 is shown with a particular column in dark, corresponding to a light coverage 622 on the target area. Figure 6B shows an example where one column of four is lit, Figure 6C shows two of four, and so on. Figure 6D depicts the last column of the sequence to be illuminated, which is four of four in the example sequence shown here. Thus, the system's sequential illumination is shown in parts.
[00163] Figure 6E depicts what the camera would see in an example duration of one cycle corresponding to the amount of time of one capture frame. In this example, that is columns one through four, with the light coverage circles 622 now overlapping. In other words, in this example, the illumination source could flip through multiple iterations of illuminating a target within the time of one camera or image capture device shutter frame. Thus, to the image capture device, the multiple and sequential illumination cycles show up in one frame of image capture and appear as if they are all illuminated at once. Any number of configurations, illumination patterns and timing could be used, depending on the situation.

[00164] Figure 7A depicts another example of the system's ability to illuminate different target areas for capture and recognition. In this example, the goal is to recognize and identify an example target 730, but it could be anything, such as an object 732. This example uses blocks of elements projecting their respective beams of illumination 722, defined as Y number of elements wide by X elements tall (in this example 2x2, but the number of elements can vary). This is different than the columns shown in Figures 6A-6E. In the examples of Figures 7A-7E, the system may be used to identify the coordinates of the area which requires illumination and provide those to the microprocessor, which in turn may calculate the correct elements to energize and the safe duration of that illumination during one cycle.
[00165] In one such example calculation, the number of Blocks = 7. Therefore for one block P = F/7.
[00166] Figures 7B and 7C depict the first and the second illuminated blocks in the example sequence. 7B is one of seven, 7C is two of seven. Figure 7D depicts the last block of the sequence to be illuminated, which is seven of seven.
[00167] Figure 7E depicts what the camera may see illuminated within the duration of one example frame, which is blocks one through seven and all of the illumination circles 722 now overlapping. As described in Figure 6E, Figure 7E is the culmination of multiple illuminations, all illuminated at some time during one frame of the image capture device.
[00168] Figure 8A depicts an example of the system identifying targets, such as a human 830, within a target area; the target could be anything, such as an object 832. This example uses individual illumination sources or elements, which allow the image capture devices and computer/software to identify the coordinates of the area which may require specific illumination. Thus, the system can then calculate the specific target elements to illuminate and/or energize for greater granularity or safety measures.
[00169] In this example, the calculation may include where Elements = 20. Therefore for one element P = F/20.
[00170] Figures 8B and 8C depict examples of the first and the second illuminated elements in the example sequence. 8B is one of twenty, 8C is two of twenty. Figure 8D depicts the last element of the sequence to be illuminated, which is twenty of twenty.
[00171] Figure 8E depicts what the camera or image capture device may see in the duration of one frame. In this way, the illumination sources have illuminated elements one through twenty, now with illumination circles 822 all overlapping the adjacent ones, and the image capture device detects all of the illumination within one frame.
Eye Safety for Array Embodiments
[00172] Example embodiments here may be configured to determine certain operational statistics. Such statistics may include measuring the amount, intensity and/or power the system puts out. This can be used, for example, to ensure that safety limits are met, such as eye safety limits for projection of IR/NIR. The system may utilize information provided by the illumination source and image sensors to determine the correct duration of each element during one cycle, that is, the period between refreshes or the time length of one frame.
E = number of semiconductor light emitting devices to be energized
F = duration of one cycle
F/E = P, the length of time one element or block of elements is energized during a cycle
[00173] Further, the system may verify the eye safe limits of each cycle. Each semiconductor light emitting device may be assigned a value corresponding to the eye safe limits determined for the array and associated optics. As the variables which determine eye safe limits vary greatly depending upon the size of the external aperture, wavelength of light, mode, coherence, and duration, the specific criteria will be established matching the specifications of the final design, establishing an Lmax - the maximum eye safe level per cycle. If
E x P > Lmax
the system will reduce P until E x P < Lmax.
[00174] If no allowable solution exists for E x P < Lmax then the system may shift into a fail safe mode which may prevent any element of the array from energizing and return an error message to the recognition software. The process flow is described later in this disclosure in Figure 22.
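The check described in paragraphs [00172]-[00174] can be summarized with a short illustrative sketch in Python; the function plan_cycle and the minimum on-time parameter p_min are hypothetical placeholders, and Lmax would come from the eye safety analysis of the final design.

def plan_cycle(num_elements, cycle_duration_s, l_max, p_min=0.0):
    """Return a per-element on-time P such that E x P <= Lmax, or None to
    signal fail-safe mode (no element is energized and an error is returned).

    num_elements      E, semiconductor light emitting devices to be energized
    cycle_duration_s  F, duration of one cycle
    l_max             Lmax, maximum eye safe value per cycle
    """
    if num_elements <= 0 or l_max <= 0:
        return None
    p = cycle_duration_s / num_elements  # P = F/E
    if num_elements * p > l_max:
        p = l_max / num_elements         # reduce P until E x P <= Lmax
    if p < p_min:
        return None                      # no allowable solution exists
    return p

# Example: 20 elements sharing a 1/60th-second cycle against an assumed Lmax.
print(plan_cycle(20, 1.0 / 60.0, l_max=0.01))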
Scanning Directional Illumination Source
[00175] In certain example embodiments, a directional illumination may be used. In such examples, the target area and subsequent targeted subject areas may be illuminated using a scanning process or a process that uses a fixed array of Micro Electrical Mechanical Systems ("MEMS") mirrors. Any kind of example laser direction control could be used, and more examples are discussed below. Additionally, any resolution of directional scan could be used, depending on the ability to pulse the illumination source, laser for example, and the direction control system to move the laser beam. In certain examples, the laser may be pulsed, and the MEMS may be moved, directing each separate pulse, so that separate pixels are able to be illuminated on a target area, during the time it takes the camera or image capture system to open for one frame. More granularity/resolution could be achieved if the laser could be pulsed faster and/or the directional control could move faster. Any combination of these could add to the number of pixels that could be illuminated during one frame time.
[00176] Regarding the scanning pattern for the light illumination source, many options could be utilized, including but not limited to raster, interlaced, de-interlaced, progressive or other methods. The illumination projection device may have, for example, the ability to control the intensity of each pixel by controlling the output power or light intensity for each pulse. The intensity of each pulse can be controlled by the amount of electrical current being applied to the semiconductor light emitting device, by sub-dividing the pulse into smaller increments and controlling the number of sub-pulses on during one pulse, or, in the case of an array of MEMS, by controlling the duration of the pulse during which the light is directed to the output, for example.
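As one hedged illustration of the sub-pulse approach just described, the short Python sketch below maps a desired per-pixel intensity to a count of equal-energy sub-pulses switched "ON" during that pixel's pulse window; the name sub_pulses_on and the count of sixteen sub-pulses are assumptions for the example only.

def sub_pulses_on(desired_fraction, sub_pulses_per_pixel):
    """Map a desired pixel intensity (0.0 to 1.0) to the number of equal-energy
    sub-pulses to switch ON within that pixel's pulse window."""
    clamped = max(0.0, min(1.0, desired_fraction))
    return round(clamped * sub_pulses_per_pixel)

# Example: with 16 sub-pulses available per pixel, 60% intensity -> 10 ON.
print(sub_pulses_on(0.6, 16))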
[00177] Scanned light may be precisely directed on a targeted area to minimize illumination to non-targeted areas. This may reduce the overall energy required to conduct proper image capture, as compared with the level of flood illumination required to achieve the same level of illumination on a particular target. Instructions and/or software may be used to help calculate the amount of illumination required for an image capture, the output power of each pulse of illumination to achieve that, the number of pulses per scanning sequence, and help determine the total optical output of each frame of illumination.
[00178] The system may specifically direct illumination to both stationary and in-motion objects and targets such as humans. Thus, for the first frame and every X frames thereafter, as directed by the recognition software or a default setting within the microprocessor, the system may perform a complete illumination of the entire target area, thus allowing the recognition software to check for new objects or changes in the subject(s) being targeted. In some embodiments, a light-shaping diffuser can be arranged between the semiconductor light emitting device(s) and the projection optics to create blurred images of the pulses. Blurring may reduce the dark or un-illuminated transitions between the projected pixels of illumination. Utilization of a diffuser may have the effect of improving eye safe output, thus allowing for increased levels of illumination emitted by the device.
[00179] According to certain embodiments, the device can produce dots or targets of illumination at key points on the subject for the purpose of calculating distance or providing reference marks for collection of other information. Distance calculations are disclosed in more detail below.
[00180] Figure 9 illustrates an example illumination device 950, utilizing diverging projection optics 952, to generate directed illumination 954 on a target area 910, identified in this example as a human form 912 and an object 914. The illumination device 950 in this example is connected to a microprocessor 916, a video imaging sensor 918, a lens tube 920, a camera lens 924, a camera filter 922, and object recognition software 926, enabling the recognition software to control the illumination. In the example drawing, these objects are shown in exaggerated and/or exploded forms, whereas in practice they may take any number of shapes and configurations. Here, they are shown as sometimes separate and symbolic icons.
[00181] The illumination device 950 may be configured to be in communication with and/or connected to a computing device such as a microprocessor 916 which can control the scanning mechanism and the semiconductor light emitting device 950. The microprocessor 916 may be equipped with and/or in communication with memory or storage for storing predefined and/or user generated command sequences. Further, the computing system may receive instructions from recognition software 926, thereby enabling the system to control the directed illumination.
[00182] Figure 9 also illustrates example embodiments where a single image sensor 918 is utilized to obtain both red, green, blue ("RGB") and NIR data for enhancing the ability of machine vision and recognition software 926. This may require the utilization of a band pass filter 924 to allow for RGB imaging and a narrow band filter 922 closely matched to the wavelength of the NIR light source 954 used for augmenting the illumination. The optical filtration can be accomplished by single or multiple element filters. The NIR light source 954 can be from light emitting devices such as, for example but not limited to, LEDs, EELs, VCSELs, DPLs, or other semiconductor-based light sources. The light can be directed onto the subject area 912 in many ways, including via a MEMS device 950 such as a dual axis or "eye" MEMS mirror, two single axis MEMS mirrors working in conjunction, a multiple MEMS mirror array, or a liquid crystal array, as examples. Other reflective devices could also be used to accurately point a directed light source, such as a laser beam. In the example drawing, these objects are shown in exaggerated forms, whereas in practice they may take any number of shapes and configurations. Here, they are shown as sometimes separate and symbolic icons.
[00183] In certain example embodiments, a light shaping diffuser (not pictured), can be arranged somewhere after the illumination device 950 and the projection optics 952 to create a blurred projected pixel. The light shaping diffuser may create a blurred projection of the light and a more homogenous overlap of illumination. The light shaping diffuser also has the added effect of allowing for increased levels of illumination while remaining within eye safe limits.
[00184] Turning now to Figures 10A and 10B, the illumination device 1050 includes a semiconductor light emitting device 1056, a scanning mechanism 1058, and projection optics 1052, such as a lens. The illumination device can include a semiconductor light emitting device 1056 such as any number of devices including but not limited to an LED, EEL, single element or array of VCSELs, DPL, or other semiconductor based light emitting device, producing light in the infrared and/or near infrared wavelengths. The intensity per pulse can be controlled in numerous ways, including: a change in input current, which correlates to a change in output power; a change in frequency, which would divide each pulse into sub-pulses of an equal energy output, with the intensity being determined by the number of sub-pulses "ON" during one pulse; or, in the case of an array where each element of the array has a fixed output, the number of elements "ON" during one pulse. The light may be directed to the scanning mechanism 1058 through a beam splitter 1060. The scanning mechanism 1058 may be a digital light processor (DLP) or similar device using an array of MEMS mirrors, LCOS (Liquid Crystal On Silicon), LBS (Laser Beam Steering), a combination of two single axis MEMS mirrors, or a dual axis or "Eye" type of MEMS mirror. The vertical scan could perform a linear scan at a low frequency (60 Hz, as an example display refresh rate), whereas the horizontal scan requires a higher frequency (for example, greater than 90 kHz for a 1920 x 1080 HD display). The stability of the scan in either direction could affect the results; stability within one pixel, for example, could provide good resolution.
[00185] Figure 10B shows an alternate embodiment to Figure 10A, where the semiconductor light emitting device 1056 is aligned differently and does not need a reflector 1062 before the beam splitter 1060, as in Figure 10A. The reflector 1062 could be a partial mirror as well, allowing light to pass from one side and reflecting from the other.
[00186] As depicted in Figure 10C, the illumination device 1050 includes a semiconductor light emitting device 1056; an additional semiconductor light emitting device 1057, which may be a source of white light, a single source emitting visible red, green and blue light, or a secondary source of IR/NIR light; a scanning mechanism 1058; and projection optics 1052, such as a lens. The illumination device 1050 can include a semiconductor light emitting device 1056 such as any number of things including but not limited to an LED, EEL, single element or array of VCSELs, DPL, or other semiconductor based light emitting device, producing light in the infrared and/or near infrared wavelengths. The intensity per pulse can be controlled by a change in input current, which correlates to a change in output power; by a change in frequency, which would divide each pulse into sub-pulses of an equal energy output, with the intensity being determined by the number of sub-pulses "ON" during one pulse; or, in the case of an array where each element of the array has a fixed output, by the number of elements "ON" during one pulse. The light may be directed to the scanning mechanism 1058 through a beam splitter 1060.
[00187] In the figure, a reflector 1062 is shown between the light emitting device 1056 and the beam splitter 1060. The reflector 1062 could be a partial mirror as well, allowing light to pass from one side and reflecting from the other. The scanning mechanism 1058 may be any number of things including but not limited to a DLP or similar device using an array of MEMS mirrors, LCOS, LBS, a combination of two single axis MEMS mirrors, or a dual axis or "Eye" type of MEMS mirror. The vertical scan could perform a linear scan at a low frequency (60 Hz for a typical display refresh rate), whereas the horizontal scan requires a higher frequency (greater than 90 kHz for a 1920 x 1080 HD display), for example. If the scan in either direction is stable to within one pixel resolution, less error correction is needed.
[00188] As depicted in Figure 10D, the illumination device 1050 includes a semiconductor light emitting device 1056; additional semiconductor light emitting devices 1057, which may be single sources emitting visible red, green and blue light or a secondary source of IR/NIR light; a scanning mechanism 1058; and projection optics 1052, such as a lens. In certain embodiments, the light emitting devices 1057 could be any number of single colored lasers including but not limited to red, green and blue, with their associated differing wavelengths. These illumination sources, for instance lasers 1057, could each have a unique wavelength or wavelengths as well. The illumination device can include a semiconductor light emitting device such as any number of things including but not limited to an LED, EEL, single element or array of VCSELs, DPL, or other semiconductor based light emitting device, producing light in the infrared and/or near infrared wavelengths. The intensity per pulse can be controlled by a change in input current, which correlates to a change in output power; by a change in frequency, which would divide each pulse into sub-pulses of an equal energy output, with the intensity being determined by the number of sub-pulses "ON" during one pulse; or, in the case of an array where each element of the array has a fixed output, by the number of elements "ON" during one pulse. The light may be directed to the scanning mechanism 1058 through a beam splitter 1060. The scanning mechanism 1058 may be any number of things including but not limited to a DLP or similar device using an array of MEMS mirrors, LCOS, LBS, a combination of two single axis MEMS mirrors, or a dual axis or "Eye" type of MEMS mirror. The vertical scan could perform a linear scan at a low frequency (60 Hz for a typical display refresh rate), whereas the horizontal scan may require a higher frequency (greater than 90 kHz for a 1920 x 1080 HD display).
[00189] Figure 11 depicts an example illustration of how the system may scan the subject area being illuminated. This kind of example scan is an interlaced scan. Any number of other example scan patterns may be used to scan an illuminated area; the one in Figure 11 is merely exemplary. In other embodiments of Figure 11, the scanning mechanism may produce a scanned illumination in other patterns, such as but not limited to a raster, progressive, de-interlaced or other format, depending upon the requirements of the overall system.
[00190] In this example, using a directionally controlled pulsed laser, each horizontal line is divided into pixels which are illuminated with one or more pulses per pixel. Each pulse width/length becomes a pixel, as the MEMS or reflector scans the line in a continuous motion and then moves to the next horizontal line. For example, 407,040 pixels may cover the target area, which is limited by the characteristics of the steering mechanism, in this example with 848 pixels per horizontal line and 480 horizontal lines. Other numbers of pixels may also be used. For example, if the MEMS can move 480 lines in the vertical axis and 848 positions in the horizontal axis, and assuming the laser can pulse at the appropriate rate, 407,040 pixels could be projected to cover a target area. As this is limited by the laser pulse length and the time it takes for the directional control system to aim the beam, any other numbers of pixels may be used depending on the situation, the ability of the laser to pulse, and the directional control's ability to position each pulse emission.
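To give a rough sense of the pulse rates involved, the following Python sketch computes the pulse rate needed to place one pulse per pixel over the full grid each frame; the 30 frames-per-second figure is an assumed example, and steering settle time is ignored.

def required_pulse_rate(pixels_per_line, lines_per_frame, frames_per_second):
    """Pulses per second required for one pulse per pixel over the whole grid,
    every frame (directional-control overhead not included)."""
    return pixels_per_line * lines_per_frame * frames_per_second

# Example: 848 x 480 = 407,040 pixels per frame; at an assumed 30 frames per
# second, the source would need roughly 12.2 million pulses per second.
print(required_pulse_rate(848, 480, 30))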
Eye Safety for Directed Illumination / Scan Embodiments
[00191] Example embodiments here may be used to determine certain operational statistics. Such statistics may include measuring the amount, intensity and/or power the system puts out. This can be used, for example, to ensure that safety limits are met, such as eye safety limits for projection of IR/NIR. The system, and in some embodiments the microprocessor computer system, may be instructed via code which may utilize the information provided from the illumination source and/or image sensor to help determine the correct duration of each pulse during one frame.
[00192] Recognition software analyzes image information from a CMOS or CCD sensor. The software determines the area(s) of interest. The coordinates of the area(s) of interest are sent to a microprocessor along with additional information such as the refresh rate / scanning rate / fps (frames per second) of the system.
P = number of pulses "ON" during one scan
n = total number of pixels/pulses in a scan
I = energy intensity of each pulse
energy intensity may also be defined as luminous intensity or radiant intensity
S = scanned lines per cycle or frame
F = FPS - length of time of one frame or one complete scan per second
Fi = total intensity per frame, or ∑(I1, I2, ... In) x F
[00193] Further, the system may also verify the eye safe limits of each frame. In such an example, each light pulse may be assigned a value corresponding to the eye safe limits as determined by the semiconductor light emitting device and associated optics. As the variables which determine eye safe limits vary greatly depending upon the size of the external aperture, wavelength of light, mode, coherence, and duration, the specific criteria will be established using the specifications of the final design of the light emitting device. This may establish an Lmax - the maximum eye safe level per frame. If Fi > Lmax, the system will reduce I and/or P until Fi < Lmax.
[00194] If no solution exists for Fi < Lmax, then the system may shift into a fail safe mode which will prevent the current cycle from energizing and return an error message to the recognition software.
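A minimal sketch of this per-frame check, written in Python for illustration, is shown below; plan_frame, the uniform scaling of I, and the minimum usable intensity i_min are assumptions of the sketch rather than a prescribed implementation (the system could equally reduce P by dropping pulses).

def plan_frame(pulse_intensities, frame_time_s, l_max, i_min=0.0):
    """Check Fi = sum(I1..In) x F against Lmax; scale intensities down if the
    limit is exceeded, or return None to signal fail-safe mode for the cycle."""
    fi = sum(pulse_intensities) * frame_time_s
    if fi <= l_max:
        return list(pulse_intensities)
    scale = l_max / fi
    reduced = [i * scale for i in pulse_intensities]
    if any(i < i_min for i in reduced):
        return None  # no solution within the eye safe limit; report an error
    return reduced

# Example: 1,000 equal pulses in a 1/30th-second frame against an assumed Lmax.
plan = plan_frame([0.002] * 1000, 1.0 / 30.0, l_max=0.05)
print("fail-safe" if plan is None else round(plan[0], 6))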
[00195] The system may include additional eye safe protections. In one embodiment, the system incorporates object recognition and motion tracking software in order to identify and track a target human's eyes. Where it is possible for eye tracking software to identify the biological eyes, the system may create a blacked out space preventing the scan from illuminating or shining light directly at the identified eyes of a target human.
[00196] The system may also include hardware protection which incorporates circuitry designed with a current limiting system that prevents the semiconductor light emitting device from exceeding the power necessary to drive it beyond the maximum safe output level.
Examples for Directing Illumination
[00197] Discussed below are directed illumination example embodiments that could be used with any of the embodiments herein to capture the image, and also be used for distance measurement, depending on the embodiment.
[00198] Figure 12 illustrates one example of a way to steer an illumination source, such as a laser, here by a dual axis MEMS device. Any kind of beam steering technology could be used, but in this example embodiment, a MEMS is utilized. In this example, the outgoing laser beam 1254 from the light source is directed onto the horizontal scan plane 1260 which directs the beam in a horizontal motion as indicated by horizontal direction of rotation 1230. The horizontal scan plane 1260 may be attached to the vertical scan plane 1270. The vertical scan plane 1270 and horizontal scan plane 1260 may direct the light in a vertical motion as indicated by vertical direction of rotation 1240. Both scan planes may be attached to a MEMS frame 1280. The combined horizontal and vertical motions of the scan planes allow the device to direct light in a sweeping pattern. This method of scanning is referred to as a raster scan and can produce an image in a number of scan patterns, such as but not limited to an interlaced, de-interlaced, or progressive method.

[00199] Figure 13 shows an example embodiment using two single axis MEMS instead of one dual axis MEMS as shown in Figure 12. In this example, a system of creating a raster scan uses two single axis MEMS or mirrors to steer illumination from a source, in this example, a laser beam. The outgoing laser beam 1354 from the illumination source 1350 is directed onto the vertical scan mirror 1360 which directs the beam in a vertical motion. The outgoing laser beam 1354 is then directed to the horizontal mirror 1362 which may create a horizontal sweeping pattern. The combined horizontal and vertical motions of the mirrors or MEMS enable the device to direct light in a sweeping pattern. The system can also be used to direct pulses of laser light at different points in space, by reflecting each pulse in a different area. Progressive illumination of the target using a pulsed illumination source may result in a scanning of a target area over a given time as disclosed above. Certain methods of scanning may be referred to as a raster scan and can produce an image in an interlaced, de-interlaced, or progressive method, for example.
[00200] Figure 14 illustrates an example embodiment of creating a raster scan utilizing one single axis MEMS or mirror 1460 and one rotating polygon mirror 1464. Outgoing laser beam 1454 from the light source 1450 is directed onto the vertical mirror 1460 which directs the beam in a vertical motion. In this example, the outgoing laser beam 1454 is then directed to the rotating polygon mirror 1464 which creates a horizontal sweeping motion of the outgoing laser beam 1454. The combined horizontal and vertical motions of the mirror and the rotating polygon enable the device to direct light in a sweeping pattern. This method of scanning is referred to as a raster scan and can produce an image in a number of scan patterns including but not limited to interlaced, de-interlaced, or progressive method.
[00201] Figure 15 illustrates an example system of creating a raster scan utilizing two rotating polygon mirrors. In this example, outgoing laser beam 1554 from the light source 1550 is directed onto the rotating polygon mirror 1560 which directs the beam in a vertical motion. The outgoing laser beam 1554 is then directed to another rotating polygon mirror 1564 which creates a horizontal sweeping motion of the outgoing laser beam 1554. The combined horizontal and vertical motions of the rotating polygon mirrors enable the device to direct light in a sweeping pattern. This method of scanning is referred to as a raster scan and can produce an image in an interlaced, de-interlaced, or progressive method.
[00202] Certain embodiments may use other ways to beam steer an illumination source, and the examples described here are not intended to be limiting. Other approaches, such as electromagnetic control of crystal reflection and/or refraction, may also be used to steer laser beams.
Illumination Examples
[00203] In certain example embodiments, the users and/or system may desire to highlight a specific target within the target area field of view. This may be for any number of reasons including but not limited to object tracking, gesture recognition, 3D mapping, or other purposes. Examples here include embodiments that may aid in any or all of these purposes, or others.
[00204] The example embodiments in the system here may first recognize an object that is selected by a user and/or the system via instructions to the computing portions. After the target is identified, the illumination portions of the system may be used to illuminate any or all of the identified targets or areas of the target. Through motion tracking, the illumination source may track the objects and change the illumination as necessary. The next few example figures disclose different illumination methods that may be used in any number of example
embodiments.
[00205] Figure 16 depicts an illustration of the effect of a targeted subject being illuminated, in this case a human form 1612. In other example embodiments of Figure 16 (not pictured), the subject of illumination could be other animate or inanimate objects or combinations thereof. This type of targeted illumination may be accomplished by first illuminating and recognizing a target, then directing subsequent illumination only on the specific target, in this case, a human.
[00206] Figure 17 depicts an illustration of the effect of a targeted subject form having only the outline illuminated 1712. In other example embodiments of Figure 17, the subject of outlined illumination could be other animate or inanimate objects or combinations thereof (not pictured).
[00207] Figure 18 depicts an illustration of the effect of a sub-area of a targeted subject form being illuminated, in this case the right hand 1812. In other example embodiments of Figure 18, the subject of sub-area illumination could be other animate or inanimate objects or combinations thereof (not pictured).
[00208] In certain embodiments, once identified, particular target areas require a focus of illumination in order to isolate the area of interest. This may be for gesture recognition, for example. One such example embodiment is shown in Figure 19, which depicts an illustration of the effect of multiple sub-areas of a targeted subject form being illuminated, in this case the right hand 1912, the face 1913 and the left hand 1915. In other example embodiments of Figure 19, the subject of multiple sub-area illumination could be other animate or inanimate objects or combinations thereof (not pictured).
[00209] Once a target or target area is identified, it may be desirable to project light on only certain areas of that target, depending on the purpose of illumination. For target motion tracking, for example, it may be desirable to illuminate only certain areas of the target, allowing the system to process only those areas, which represent the entire target object to be tracked. One such example is shown in Figure 20, which depicts an illustration of the effect of illumination of skeletal tracking and highlighting of key skeletal points 2012. This may allow the system to track the target using only certain skeletal points, without having to illuminate the entire target and process information about its entire surface to track its motion. In other example embodiments of Figure 20, the skeletal tracking and key points could be applied to other animate objects or combinations thereof (not pictured). Again, to accomplish such targeted illumination, a target must first be illuminated, then recognized, and then the subsequent illumination targeted.
[00210] Figure 21 depicts an example illustration of the effect of illumination of a targeted subject with a grid pattern 2112. This pattern may be used by the recognition or other software to determine additional information such as depth and texture. Further discussion below describes examples that utilize such pattern illuminations. The scanning device may also be used to project outlines, fill, skeletal lines, skeletal points, "Z" tags for distance, De Bruijn grids, structured light or other patterns, for example, as required by the recognition software. In other
embodiments, the system is capable of producing and combining any number of illumination styles and patterns as required by the recognition system.
Maximum Illumination and Eye Safety
[00211] Turning to Figure 22, a flow chart depicts one example of how the system may determine certain operational statistics. Such statistics may include measuring the amount, intensity and/or power the system puts out. This can be used, for example, to ensure that safety limits are met, such as eye safety limits for projection of IR/NIR. Also, the flow chart may be used to demonstrate calculations of multiple embodiments, such as the array illumination example with fixed intensity, an array with variable intensity, and also a raster scanned example using lasers described later in this disclosure, for example.

[00212] The flow chart begins with the illumination device 2210, whatever embodiment that takes, as disclosed here, directing low level full scan illumination over the entire target area 2220. This allows the system to capture one frame of the target area and the image sensor may receive that entire image 2230. From that image, the length of time of one frame, or one complete scan per second, may inform how the illumination device operates 2240. Next, the microprocessor, or system in general 2250, may determine a specific area of interest in the target area to illuminate specifically 2252. Using this information, once the system is satisfied that the area of interest is properly identified, the system may then map the target area and, based on that information, calculate the total level of intensity for one frame 2260. In examples where power output or total illumination per frame is important to eye safety, or some other parameter, the system can validate this calculation against a stored or accessible maximum number or value 2270. If the calculated total intensity is less than or equal to the stored maximum, the system and/or microprocessor may provide the illumination device with instructions to complete one entire illumination scan of the target area 2280. If the calculated total is greater than the stored or accessed maximum number, the system may recalculate the intensity to a lower level 2274 and repeat the calculation 2260. If the calculated total cannot be reduced to a level lower than or equal to the stored maximum, the system may be configured to not illuminate the target area 2272, or to perform some other function to limit eye exposure, and/or return an error message. This process may then repeat for every frame, or may be sampled randomly or at a certain interval.
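For readability only, the per-frame flow of Figure 22 can be sketched as the Python routine below; the illuminator, sensor and recognizer objects and their method names are hypothetical interfaces standing in for whichever illumination, capture and recognition components a given embodiment uses.

def run_frame(illuminator, sensor, recognizer, l_max, max_retries=3):
    """One pass through the Figure 22 style flow: low level full scan, area of
    interest selection, intensity calculation, validation against a stored
    maximum, then either a full directed scan or a fail-safe result."""
    illuminator.full_scan(low_level=True)               # 2220: low level full scan
    frame = sensor.capture_frame()                      # 2230: image sensor receives frame
    area = recognizer.find_area_of_interest(frame)      # 2252: area of interest
    intensity = illuminator.plan_intensity(area)        # 2260: total intensity for one frame
    for _ in range(max_retries):
        if intensity <= l_max:                          # 2270: validate against maximum
            illuminator.directed_scan(area, intensity)  # 2280: complete the scan
            return True
        intensity = illuminator.reduce_intensity(area, intensity)  # 2274: recalculate lower
    return False                                        # 2272: do not illuminate, report error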
[00213] Other kinds of examples of power or illumination measurement may be used in various circumstances, besides the illustration here for eye safety. For example, there may be light sensitive instruments in the target area, there may be system power limitations that must be met, etc. Similar methods as those described here may be used to check and/or verify the system power out to the illuminated target area. Specific eye safety calculations for each of the methodologies of illumination are described elsewhere in this disclosure.
[00214] In some embodiments of this device, a light shaping diffuser (reference Figure 1, at 104), may be arranged somewhere after the array (not pictured) to create a smooth projection of the semiconductor light emitting devices in the array. The light shaping diffuser (not pictured) may create a smooth projection of the semiconductor light emitting devices in the array and a more homogenous overlap of illumination. The light shaping diffuser (not pictured) may also have an added effect of allowing for increased levels of illumination while remaining within eye safe limits.
[00215] In other examples, image capture devices may use a shutter or other device to break up image capture into frames. Examples of common durations are 1/30th, 1/60th or 1/120th of a second.
Examples that Incorporate Optical Elements
[00216] Video imaging sensors may utilize an optical filter designed to cut out or block light outside the range visible to a human being, including IR/NIR. This could make utilizing IR/NIR an ineffective means of illumination in certain examples here. And according to certain embodiments here, the optical filter may be replaced with one that is specifically designed to allow for both the visible range of wavelengths and a specific band of IR/NIR that matches that of the illumination device. This may reduce the distortion created by the IR/NIR, while allowing for the maximum response to the IR/NIR.
[00217] According to certain embodiments, the optical filter is replaced with one specifically designed to allow for both the visible range of wavelengths and a specific band of IR/NIR that matches that of the semiconductor light source. This may help reduce the distortion created by the IR/NIR, while allowing for the maximum response to the IR/NIR.
[00218] According to certain embodiments, the optical filter is replaced with one specifically designed to block all wavelengths except only a specific band of IR/NIR that matches that of the semiconductor light source.
[00219] According to certain embodiments, a semiconductor light emitting device may be used to produce light in the infrared and/or near infrared wavelengths, defined as 750nm to 1mm, for example. In some embodiments, the projection optics may be a projection lens.
[00220] IR/NIR could be used in certain situations, even if natural ambient light is present. In certain embodiments, the use of IR in or around the 976nm range could be used by the illumination source, and filters on the image capture system could be arranged to only see this 976nm range. In such examples, the natural ambient light has a dark spot, or very low emission, in the 976nm range. Thus, if the example system focuses the projected and captured IR in that 976nm range, it may be able to be used where natural light is present, and still be able to illuminate and capture images.

[00221] In certain embodiments, a combined ambient and NIR device may be used for directed illumination utilizing a single CMOS sensor.
[00222] In such an example system, a dual band pass filter may be incorporated into the optical path of an imaging sensor. This path may include a lens, an IR blocking filter, and an imaging sensor of various resolutions. In certain embodiments, the IR blocking filter may be replaced by a dual band pass filter including a band pass filter, which may allow visible light to pass in approximate wavelengths between 400nm and 700nm, and a narrow band pass or notch filter, which is closely matched to that of the IR/NIR illumination source.
[00223] Figure 23 illustrates the interaction of the physical elements of example embodiments here. An illumination device 2350, such as a dual axis or eye MEMS mirror, an array, or another method which could direct an NIR light source, produces a source of augmented illumination onto the subject area 2312. Ambient light 2370 and NIR light 2354 are reflected off of the subject area 2312. Reflected ambient light 2372 and reflected NIR 2355 pass through lens 2322. A combined optical filter 2324 may allow only visible light and a specific narrow range of IR to pass into optical housing 2320, blocking all other wavelengths of light from reaching image sensor 2318. In the example drawing, these objects are shown in exploded and/or exaggerated forms, whereas in practice they may take any number of shapes and configurations. Here, they are shown as sometimes separate and symbolic icons.
[00224] Turning now to an example of the image capture device/sensor, Figure 24 depicts such an example in a side view of a CMOS or CCD camera 2440. This figure depicts a lens 2442, a filter 2444, and an optional lens tube 2446 or optics housing. Any number of
combinations of lenses and filters of different sorts may be used, depending on the configuration of the embodiment and the purpose of the image capture. Also, many kinds of image capture devices could be used to receive the reflected illumination and pass it to computing devices for analysis and/or manipulation.
[00225] Referring again to Figure 24, other embodiments of this device may have the order of the filter 2444 and the lens 2442 reversed. Still other embodiments of this device may have the lens 2442 and the filter 2444 combined, wherein the lens is coated and has the same filtering properties as a discrete filter element. This may be done to reduce cost and the number of parts and could include any number of coatings and layers.

[00226] Still referring to Figure 24, other embodiments may have the camera manufactured in such a way that the sensitivity of the device acts in a similar manner to that of a commercially available camera with a filter 2444. In such an example, the camera could be receptive to visible light and to only one specific range of IR/NIR, blocking out all of the other wavelengths of IR/NIR and non-visible light. This example device could still require a lens 2442 for the collection of light. Such examples are described in more detail below, for example in Figures 25, 26B and 27B.
[00227] Still referring to Figure 24, such an example combined filter that blocks light below visible 400nm is shown below by line 2547 in Figure 25. Such a filter may also block above the visible 700nm as shown below by line 2545 in Figure 25.
[00228] According to one embodiment, the filter may only block above 700nm allowing the inherent loss of responsivity of the sensor below the 400nm to act like a filter. The filter may block some or all of IR/NIR above 700nm typically referred to as an IR blocking filter.
[00229] In other embodiments of this device, the filter may only block above 700nm allowing the inherent loss of responsivity of the sensor below the 400nm to act like a filter. This filter may include a notch, or narrow band, allowing a desired wavelength of IR to pass. In this example, 850nm, as shown by line 2508 in Figure 25.
[00230] Figure 25 depicts an example graph of the wavelength responsivity enabled by an example filter. The x axis shows wavelength in nanometers (nm) and the y axis shows percent sensitivity 0 - 100% as decimal values 0.0 to 1.1. Specific wavelengths are dependent upon the CMOS or CCD camera being utilized and the wavelength of the semiconductor light emitting devices. The vertically shaded area 2502 represents the typical sensitivity of a CMOS or CCD video imaging device. The "graduated rectangular bar" 2506 represents the portion of the spectrum that is "visible" to the human eye. The "dashed" line 2508 represents the additional responsivity of the proposed filter.
[00231] Turning again to Figure 24, in this example embodiment, the optical filters may be combined into one element 2444. In Figure 24, the example depicts an image sensor 2440, optical housing 2446, lens 2442, and the combined filter 2444 blocking light below 400nm, between 700nm and 845nm, and at 855nm and above. The example is illustrated assuming an NIR light source at 850nm; wavelengths between 800nm and 1000nm may be utilized depending upon the specific device requirements. The band pass range of +/- 5nm is for example only; the actual width of the band pass may be wider or narrower based on specific device requirements.
[00232] According to one embodiment, two optical filters are combined. In Figure 26A, the example depicts an image sensor 2640, optical housing 2646, lens 2642, a filter <400nm 2643, and a narrow band filter 2644 blocking light between 700nm and 845nm, with transmittance between 845nm and 855nm, and blocking above 855nm. The example is illustrated assuming an NIR light source at 850nm; wavelengths between 800nm and 1000nm may be utilized depending upon the specific device requirements. The band pass range of +/- 5nm is for example only; the actual width of the band pass may be wider or narrower based on specific device requirements.
[00233] Figure 26B is a graphical depiction of example CMOS sensitivity to light. The x axis shows wavelength in nanometers (nm) and the y axis shows percent sensitivity. This example shows sensitivity from 300nm to 1100nm 2602 (vertically shaded); the spectrum of light visible to the human eye, 400nm - 700nm 2606 ("graduated rectangular bar"); and transmittance of the filter from 0% to 100% across the spectrum 300nm to 1100nm (2608, dashed). The range covered by each element is depicted above the graph. The narrow band filter 2644, blocking light between 700nm and 845nm, with transmittance between 845nm and 855nm, and blocking above 855nm, is shown as arrow 2645. The filter <400nm 2643 is shown as arrow 2647.
[00234] According to certain embodiments, three optical filters may be combined. In Figure 27A, the example depicts an image sensor 2740, optical housing 2746, lens 2742, a band filter <400nm 2743, a narrow band filter 2780 between 700nm and 845nm, and a filter 2782 blocking above 855nm. The example is illustrated assuming an NIR light source at 850nm; wavelengths between 800nm and 1000nm may be utilized depending upon the specific device requirements. The band pass range of +/- 5nm is for example only; the actual width of the band pass may be wider or narrower based on specific device requirements.
[00235] Figure 27B is a graphical depiction of typical CMOS sensitivity to light. The x axis shows wavelength in nanometers (nm) and the y axis shows percent sensitivity. This example shows sensitivity from 300nm to 1100nm (2702, shaded); the spectrum of light visible to the human eye, 400nm - 700nm (2706, black); and transmittance of the filter from 0% to 100% across the spectrum 300nm to 1100nm (2708, dashed). The range covered by the band filter <400nm 2743 is depicted as an arrow 2747, the range covered by the narrow band filter 2780 between 700nm and 845nm is depicted as an arrow 2781, and the range covered by the filter 2782 blocking above 855nm is shown as an arrow 2745.
[00236] In some example embodiments of this device, the system can alternate between RGB and NIR images by either the utilization of computing systems and/or software to filter out RGB and NIR, or by turning off the NIR illumination for a desired period of time. Polarization of a laser for example, may also be utilized to alternate and differentiate objects.
[00237] In other embodiments of this device, the optical filter or combination of filters may be used to block all light except a selected range of NIR light, blocking light in the visible range completely.
Triangulation and Distance Measurement
[00238] Certain embodiments here may be used to determine distances, such as the distance from the example system to a target person, object, or specific area. This can be done as shown here in the example embodiments, using a single camera/image capture device and a scanning projection system for directing points of illumination. These distance measurement embodiments may be used in conjunction with many of the target illumination and image capture embodiments described in this disclosure. They could be used alone as well, or combined with other technologies.
[00239] The example embodiments here accomplish this by matching the projected points of illumination with a captured image at a pixel level. In such an example, first, image recognition is performed over the target area in order to identify certain areas of interest to track, such as skeletal points on a human, corners of a box, or any number of things. A series of coordinates may then be assigned to each key identified point. These coordinates may be sent to a computing system which may include microprocessing capabilities and which may in turn control a semiconductor light emitting device that may be coupled to a mechanism that scans the light across an area of interest.
[00240] The system may be configured to project light only on pixels that correspond to the specified area previously identified. Each pixel in the sequence may then be assigned a unique identifier. An image sensor could then collect the image within the field of view and assign a matching identifier to each projected pixel. The projected pixel's corresponding imaged pixel may be assigned horizontal and vertical angles or slope coordinates. With a known distance between the projection and image source, there is sufficient information to calculate distance to each point using triangulation calculations disclosed in examples here.
[00241] According to certain example embodiments, the system may direct one or more points or pixels of light onto a target area such as a human subject or object. The example device may include a scanning device using a dual axis or two single axis MEMS, rotating polygon mirrors, or another method for directing light; a collimated light source such as a semiconductor or diode laser which can generate a single pixel; a CMOS, CCD or other imaging device which may incorporate a short band pass filter allowing visible and/or specific IR/NIR; a microprocessor(s) controlling the scanning device; and object and/or gesture recognition software and a microprocessor.
[00242] In regard to using the system for distance measurement, the human or the software may identify the specific points for distance measurement. The coordinates of the points may be identified by the image sensor and the computing system and sent to the system which controls the light source and direction of projection. As the directing device scans, the device may energize the light at a pixel (input) corresponding to the points to be measured (output). The device may assign a unique identifier to each illuminated point along with its vertical and horizontal angular components.
[00243] The projected points and captured image may be synchronized. This may help reduce the probability that an area of interest has moved before a measurement can be taken. The imaged spot location may be compared to projected locations. If the variance between the expected projected spots map and the imaged spots is within a set tolerance then the system may accept them as matching.
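As a rough illustration of this matching step, the sketch below (Python, with illustrative coordinates and a hypothetical pixel tolerance, none of which are specified in this disclosure) pairs each expected projected spot with the nearest imaged spot and accepts the pair only if the separation falls within the set tolerance:

```python
def match_spots(projected, imaged, tol_px=3.0):
    """Match projected spots to imaged spots; accept only pairs within tol_px."""
    matches = {}
    for pid, (px, py) in projected.items():
        # Nearest imaged spot to this projected spot
        iid, (ix, iy) = min(imaged.items(),
                            key=lambda kv: (kv[1][0] - px) ** 2 + (kv[1][1] - py) ** 2)
        if ((ix - px) ** 2 + (iy - py) ** 2) ** 0.5 <= tol_px:
            matches[pid] = iid        # variance within tolerance: accept as matching
    return matches

# Expected projected spot map vs. spots actually found in the captured frame
projected = {0: (120, 88), 1: (300, 90)}
imaged = {"a": (121.2, 88.4), "b": (305.8, 91.0)}
print(match_spots(projected, imaged))   # {0: 'a'}; spot 1 drifted beyond tolerance
```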
[00244] The image sensor may produce one frame of information and transmit it to the software on the microprocessor. A frame refers to one complete scan of the target area and is the incremental period of time during which the image sensor collects one image of the field of view. The software may be used to analyze the image information, identify projected pixels, assign and store information about the location of each point and match it to the illuminated point. Each image pixel may also be assigned angular values for horizontal and vertical orientation.
[00245] Based on the projected and imaged angles combined with known distance between the projector and image sensor, a trigonometric calculation can be used to help determine the depth from the device to each illuminated spot. The resultant distances can either be augmented to the display for human interpretation or passed onto software for further processing.
[00246] Figure 28 illustrates an overview of the triangulation distance example embodiments here. These embodiments are not exclusive of the image illumination and capture embodiments disclosed here; for example, they may be used alone, or to augment, complement, and/or aid the image illumination and capture to help gather information and/or data about the target area for the system. In this example embodiment, the system is operating in a subject area 2810, here, a room. The illumination device 2850, in this example controlled by a microprocessor 2826, is used to project a beam 2854 to illuminate a point on a target 2812. The reflection of the beam 2855 may be captured by the image sensor 2820. Data from that capture may then be transmitted to the microprocessor 2826. Other objects in the room may similarly be identified, such as the briefcase 2814. Data from such an example system may be used to calculate distances to illuminated objects, as will be discussed further below.
[00247] Figure 29 illustrates an example of how the initial image recognition may be accomplished, in order to later target specific areas for illumination. Using an example image capture system including a camera and object or gesture recognition, a human 2912 may be identified. The identification of the area of interest is indicated by rectangular segments 2913. These rectangular segments may be any kind of area identification, used for the system to later target more specific areas to illuminate. The examples shown here are illustrative only. Figure 29 also shows an example object 2914 which could also be identified by a larger area 2915. If computer instructions or software is not used to recognize objects or targets, human intervention could be used. A touch screen or cursor could be used to outline or identify targets of interest - to inform the system of what to focus illumination on, shown here by a traced line around the object.
[00248] Figure 30 illustrates an example scenario of a target area as seen by the image capture device, and/or caused to be displayed on a visual monitor for human interaction. The example shows, as might be seen on a computer screen or monitor, a single point illuminated 3016 for depth measurement on a target human 3012. Example gesture recognition software and software on the microprocessor could use the rectangular segments shown in Figure 29 to direct an illuminated point 3016 on specific areas of a target human 3012. A similar process may be used for the examples that are manually identified. Likewise, object 3014 could also receive a directed illuminated point 3018. These points will be discussed later for distance calculations.
[00249] Figure 31 illustrates an example imaged scenario as might be seen on a computer screen or monitor where the system has caused the display to show the calculated distance measurement from the system to the illuminated points 3118 and 3116, which are located on the object 3114 and human target 3112, respectively. In this example display of the image, the distance calculations "1375" 3116 and "1405" 3118 show up on the screen. They could take any form or be in any unit of measurement; here they show up as 1375 and 1405 without showing the units of measurement, as an example.
[00250] Figure 32 illustrates a typical imaged scenario as might be seen on a computer screen or monitor showing multiple points illuminated for depth measurement. The system with gesture recognition capabilities such as those from software could use the rectangular segments as depicted in Figure 29 to direct multiple illuminated points 3234 on a target human 3212. A similar process may be used to direct multiple illuminated points 3236 onto an object 3214. In certain examples, the system could be used to automatically select the human target 3212 and a human interface could be used to select the object 3214. This is only an example, and any combination of automatic and/or manually selected targets could be acquired and identified for illumination.
[00251] Figure 33 illustrates an example embodiment where the system causes display on a computer screen or monitor showing the superimposing of the distance from the illumination device to the multiple illuminated points 3334 in tabular form 3342. The example multiple illuminated points are shown with labels of letters, which in turn are used to show the example distance measurements in the table 3342. Figure 33 also depicts the manually selected object 3314 with multiple illuminated points 3336 superimposed on the image 3340 in this case showing "1400," "1405," "1420" and "1425" as distance calculations, without units depicted, as an example.
[00252] Figure 34 illustrates an example of an embodiment of the physical relationship among components of the illumination device 3450 and the image sensor 3420. The relationship among these components may be used in distance calculations of the reflected illumination off of a target, as disclosed here. The illumination device 3450, as detailed above in Figure 10, may include a light source 3456 which can be any number of sources, such as a semiconductor laser, LED, diode laser, VCSEL or laser array, or a non-coherent collimated light source. The light may pass through an optical component 3460 which may be used to direct the light onto the reflective system, in this example, a MEMS device 3458. The light may then be directed onto the area of interest; here the example beam is shown directed off the Figure 3480.
[00253] Turning to the image capture device / camera / sensor, this example illustration shows the central Z axis 3482 for the image sensor 3420. The MEMS device 3458 also has a horizontal axis line 3484 and a vertical axis line 3486. The image sensor 3420 may include components such as a lens 3442 and a CMOS or CCD image sensor 3440. The image sensor 3440 has a central Z axis 3482 which may also be the path of the illumination beam returning from reflection off the target to the center of the sensor 3440 in this example. The image sensor 3440 has a horizontal axis line 3484 and a vertical axis line 3488. In this example both the MEMS 3458 and the image sensor 3440 are offset both horizontally and vertically 3490, wherein the z axes 3480 and 3482 are parallel, but the horizontal axis 3484 and the vertical axes 3488 and 3486 are offset by a vertical and/or horizontal value. In such examples, these offsets would have to be accounted for in the distance and triangulation calculations. As discussed throughout this document, the relationships and/or distance between the illumination source and the image capture z axis lines may be used in triangulation calculations.
[00254] In some example embodiments, the MEMS 3458 and the image sensor 3440 are aligned, wherein they share the horizontal axis 3484, and where their respective vertical axes 3488 and 3486 are parallel, and axial lines 3482 and 3480 are parallel.
[00255] Physical aspects of the components of the device may prevent the point of reflection of the directing device and the surface plane of the image sensor from being on the same plane, creating an offset such as discussed here. The offset may be intentionally introduced into the device as a means of improving functionality. The offset is a known factor and becomes an additional internal calibration to the distance algorithm.
[00256] Figure 35 illustrates an example of how data for triangulation calculations may be captured, which could be used in example embodiments to calculate distance to an illuminated object. The result of using the data in trigonometric calculations may be used to determine the distance D, 3570 from the device to point P, 3572. Point P can be located any distance from the back wall of the subject area 3574 to the illumination device 3550. Outgoing laser beam 3554 is directed in this example from the illumination device 3550 to a point P 3572 on a subject area 3574. The reflected laser beam 3555 reflects back and is captured by the image sensor 3520. In this example the image sensor 3520 and the illumination device 3550 are aligned as illustrated earlier in Figure 34. Distance h 3576 is known in this example, and the angle represented by Θ, 3578 can be determined as further illustrated in this disclosure. In this illustration there is no angular component to the outgoing laser beam 3554. The central Z axis for the illumination device, represented by line 3580, and that of the image sensor 3520, represented by line 3582, are parallel. Using the functions described above, the distance D 3570 can be determined.
[00257] In one example, the directed light is pointed parallel to the image sensor with an offset some distance "h" 3576 in the horizontal plane, and the subject area lies a distance "D" 3570 away. The illuminated point "P" 3572 appearing in the camera's field of view is offset from the center through an angle Θ, 3578, all as shown in Figure 35:
[00258] Assuming a known angle Θ 3578, using the separation between the directed spot at P 3572 and the center of the image sensor's field of view in the image, and the directed spot offset distance h, 3576 then the distance D 3570 is:
[00259] D = h / Tan(Θ)
[00260] Because the image sensor and directed spot are parallel, the point P 3572 is a fixed distance h 3576 away from the centerline of the image sensor, so the absolute position (relative to the image device) of point P 3572 is known.
[00261] Thus, if the center of the focal plane of the image sensor is at a point (X, Y, Z) = (0, 0, 0), then P = (h, 0, D).
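A short numerical sketch of this parallel-beam case, with an assumed offset h and imaged angle Θ used purely for illustration (not values from the disclosure):

```python
import math

def distance_parallel(h_mm, theta_deg):
    """Parallel case: the beam is parallel to the sensor's z axis, offset by h.
    D = h / tan(theta), where theta is the imaged spot's angle off the centerline."""
    return h_mm / math.tan(math.radians(theta_deg))

h = 60.0      # projector-to-sensor offset, mm (illustrative)
theta = 2.5   # angle of the imaged spot off the sensor centerline, degrees (illustrative)
D = distance_parallel(h, theta)
print(f"D = {D:.0f} mm, P = ({h}, 0.0, {D:.0f})")   # ~1374 mm with these numbers
```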
[00262] Figure 36 illustrates an example calculation of distance where the angle the illumination source uses to illuminate the target is not directly down its own z axis. In this example, the trigonometric calculation may be used to determine the distance D, 3670 from the device to point P, 3672. Point P, 3672 can be located any distance from the back wall of the subject area 3674 to the illumination device 3650. Outgoing laser beam 3654 is directed from the illumination device 3650 to a point P, 3672 on a subject area 3674. The returning laser beam 3655 reflects back and is captured by the image sensor 3620. In this example the image sensor 3620 and the illumination device 3650 are aligned as further illustrated in Figure 34. Distance h, 3676 is known and the angle represented by Θ, 3678 can be determined as described further herein. In this illustration the angular component α, 3688 of the outgoing laser beam 3654 can be determined based upon the horizontal and vertical coordinate of that pixel as described above. In this example h' 3682 and x 3684 may be calculated. Using the function described above, the distance D 3670 can be determined.
[00263] The same approach can be used to determine the distance of many points all lying in the same plane as in the above example. In this case, the output direction of the directed spot is changed, at some angle α relative to the line parallel to the image sensor, as shown in Figure 36.
[00264] The image point P 3672 will be located a distance x = h + h' away from the centerline of the image sensor (where, in Figure 36, x is 3684, h is 3676 and h' is 3682), as shown in Figure 36. In this case, the distance D 3670 can be determined from the angles Θ 3678 and α 3688 and the directed spot "offset distance" h 3676:
D = h / [Tan(Θ) - Tan(α)]
[00265] With the distance D 3670 known, the absolute position x 3684 of the image point can be determined, since:
[00266] x = D Tan(Θ)
[00267] The absolute position of point P = (D Tan(Θ), 0, D).
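The same relationship with a steered beam, again with purely illustrative numbers:

```python
import math

def distance_angled(h_mm, theta_deg, alpha_deg):
    """Angled case: the outgoing beam is steered by alpha relative to the line
    parallel to the image sensor. D = h / (tan(theta) - tan(alpha))."""
    return h_mm / (math.tan(math.radians(theta_deg)) - math.tan(math.radians(alpha_deg)))

h, theta, alpha = 60.0, 6.0, 3.5    # offset (mm), imaged angle, projected angle (illustrative)
D = distance_angled(h, theta, alpha)
x = D * math.tan(math.radians(theta))    # horizontal position of the spot
print(f"D = {D:.0f} mm, P = ({x:.0f}, 0, {D:.0f})")
```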
[00268] Figures 37A, B and C show an example where, in addition to the offset X 3784 of the outgoing laser beam 3754, there is also a vertical offset Y, 3790. The numerals correspond to the same numerals in Figure 36, with the addition of Beta 3792, the vertical angle. This scenario is depicted in Figure 37A from a top view, Figure 37B from a side view, and Figure 37C from an axial view.
[00269] To obtain both horizontal and vertical position information, it is sufficient to direct the spot with two known angles: an angle α 3758 in the horizontal plane (as in the case shown above) and an angle β 3794 out of the plane; these are shown in Figure 37.
[00270] The distance D 3770 is determined exactly as before using the equation above. With the distance D 3770 known and the out-of-plane angle β 3792 of the directed spot, the vertical position y of the image spot P 3772 can be determined through:
[00271] y = D Tan(β)
[00272] The absolute position is known through equations above:
[00273] P = (D Tan(Θ), D Tan(β), D).
[00274] Figures 38 A, B and C further illustrate Figures 36 and 37, where there is an X and Y offset between the illumination device 3850 and the image sensor 3820. In this example, there is an additional offset k, 3894 shown in the axial view as the vertical distance between the image sensor 3820 and the illumination device 3850 and in the side view as the distance between the z axis of the image capture device 3820 and the z axis of the illumination device 3850. The variable k' 3896 is also shown as the offset distance between the illumination device 3850 z axis 3882 and the point P where the illumination pixel hits the object 3874.
[00275] Due to the possible 3-dimensional nature of objects to be imaged, it may be useful to have two "independent" measures of the distance D 3870. This can be accomplished by offsetting the directed spot in both the horizontal and vertical directions. This most general case is illustrated in Figures 38 A, B and C.
[00276] Since the directed spot is now offset a distance k 3894 in the vertical direction, there is an independent measure of D 3870 analogous to that in the equation above, using the vertical output angle of the directed spot, β 3892, and the angle φ, which uses the vertical separation between the directed spot at P and the center of the image sensor's field of view in the image:
[00277] D = k / [Tan(φ) - Tan(β)].
The vertical position y 3890 is now given by
y = D Tan(φ),
The absolute position of the image spot P = (D Tan(Θ), D Tan(φ), D).
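A sketch of this most general case, in which the horizontal offset h and the vertical offset k each give an independent estimate of D; the offsets and angles are assumed values for illustration only:

```python
import math

def depth(offset_mm, imaged_deg, projected_deg):
    """D = offset / (tan(imaged angle) - tan(projected angle)); used for both
    the horizontal pair (h, theta, alpha) and the vertical pair (k, phi, beta)."""
    return offset_mm / (math.tan(math.radians(imaged_deg)) -
                        math.tan(math.radians(projected_deg)))

h, theta, alpha = 60.0, 6.0, 3.5    # horizontal offset and angles (illustrative)
k, phi, beta = 25.0, 4.0, 3.0       # vertical offset and angles (illustrative)

D_h = depth(h, theta, alpha)        # independent estimate from the horizontal geometry
D_v = depth(k, phi, beta)           # independent estimate from the vertical geometry
D = (D_h + D_v) / 2.0               # e.g. combine the two measures
P = (D * math.tan(math.radians(theta)),   # x = D Tan(theta)
     D * math.tan(math.radians(phi)),     # y = D Tan(phi)
     D)
print(D_h, D_v, P)
```

For a physically consistent geometry the two estimates would agree to within measurement error, so comparing them offers a built-in sanity check when imaging 3-dimensional objects.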
[00278] Figure 39 shows an example embodiment similar to Figures 37-38, but in this example there is an additional horizontal and vertical offset 3998 introduced, where the directed illumination device is offset from the image sensor 3920 in the X, Y, and Z axes.
[00279] Figure 40 illustrates the flow of information from identification of the point(s) to be measured through the calculation of the distance and display or passing of that information.
Column A shows what a screen may look like if the human interface is responsible for image recognition. Column B shows a scenario where software is used to detect certain images. The center column describes what may happen at each section.
[00280] First, recognition occurs, 4002, where the camera or image sensor device is used to provide image data for analysis. Next, either the human or software is used to identify an area of interest 4004. Then, the system may assign to each area of interest any number of things, such as pixel identification information, a unique identifier, a time stamp, and/or a calculated or tabled angle, 4006. Next, the system and/or microprocessor may transmit a synchronizing signal to the image sensor, and a pixel command to the illumination device 4008. The system may then illuminate the subject area with a spot of illumination, 4010. Then the image sensor may report the location of the pixels associated with the spot 4012. Next, the system and/or microprocessor may analyze the pixel values associated with the imaged spot, match the imaged pixel to the illuminated spot and assign a location to the pixel to calculate the angle value, 4014. Next, the microprocessor and/or system may calculate a value for depth, or distance from the system, 4016. Then the system may return a value for depth to the microprocessor for display, 4018. This is shown as a display of data on the example screen in 4018B. Then, the system may repeat the process 4020 as needed as the objects move over time.
[00281] Certain examples have the active FOV - Field Of View of the directed light and the capture FOV of the image sensor aligned for the calculations used in measuring distances. This calibration of the system may be accomplished using a software application.
[00282] According to some embodiments, input video data can be configured for streaming in a video format over a network.
[00283] Figure 41 shows an example image capture system embodiment. As the light, in this example, reflected laser light, is returned from reflecting off of the target, it passes through the lens 4142 and onto the image sensor 4140. The image sensor example here is made up of a number of cells, which, when energized by light, produce an electrical charge, which in turn may be mapped by the system in order to understand where that light source is located. The system can turn these charged cells into an image.
[00284] In this example, returned reflected laser beams 4156, 4158, 4160, and 4162 returning from the area of interest along the center Z axis 4186 are identified by the CMOS or CCD image sensor 4140. Each point or pixel of light that is directed onto an area of interest, or target, may be captured with a unique pixel location, based on where the reflected light hits the image sensor 4140. Returning pixels 4156, 4158, 4160, 4162, represent examples of unique points with angular references different from 4186. That is, the reflected light beams are captured at different angles, relative to the z axis 4186. Each cell or pixel therefore has a unique coordinate identification and a unique set of angular values in relationship to the horizontal axis 4184 and the vertical axis 4188.
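One plausible way to assign those per-pixel angular values, sketched with an assumed pinhole model and illustrative resolution and field-of-view numbers (a real device would use its calibrated optics):

```python
import math

def pixel_to_angles(px, py, width=1280, height=720, h_fov_deg=60.0, v_fov_deg=40.0):
    """Return (horizontal, vertical) angles, in degrees, of an imaged pixel
    measured from the sensor's central z axis."""
    cx, cy = (width - 1) / 2.0, (height - 1) / 2.0
    # Effective focal lengths in pixels: half the FOV maps to the sensor edge.
    fx = cx / math.tan(math.radians(h_fov_deg / 2.0))
    fy = cy / math.tan(math.radians(v_fov_deg / 2.0))
    return (math.degrees(math.atan((px - cx) / fx)),
            math.degrees(math.atan((py - cy) / fy)))

print(pixel_to_angles(640, 360))    # near the center: both angles close to zero
print(pixel_to_angles(1279, 719))   # corner pixel: roughly (+30, +20) degrees
```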
[00285] Not only can these reflected beams be used to map the image, as discussed, they may also be used to triangulate the distance of objects.
[00286] Figure 42 illustrates an example image capture device that is using error correction to estimate information about the target object from which the light reflected. As was seen in Figure 41, the reflected light hits certain cells of the image capture sensor. But in certain examples, the light does not strike the center of one sensor cell. Sometimes, in examples, the light strikes more than one cell or an intersection of more than one cell. The system may have to interpolate and estimate which of the cells receives most of the returned light, or use different calculations and/or algorithms in order to estimate angular values. In some examples, the system may estimate where returning pixels 4256, 4258, 4260, 4262, will be captured by the image sensor 4250. In the case of pixel 4262 the light is centered on one pixel and/or cell and overflows partially onto eight adjacent pixels and/or cells. Pixel 4260 depicts the situation where the light is centered evenly across four pixels and/or cells. Pixels and/or cells 4256 and 4258 depict examples of the light having an uneven distribution across several pixels and/or cells of the image sensor 4250.
[00287] The probability that a projected spot will be captured on only one pixel of the image sensor is low. An embedded algorithm will be used to determine the most likely pixel from which to assign the angular value. In certain examples in Figure 42 the imaged spot is centered on one pixel and overlaps eight others. The charged value of the center pixel is highest and would be used. In certain examples in Figure 42, the spot is equally distributed over four pixels. In this case a fixed algorithm may be used, selecting the top left pixel or lower right, etc. A more sophisticated algorithm may also be utilized where factors from prior frames or adjacent spots are incorporated into the equation as weighting factors. A third example may be where there is no one definite pixel. Charge weighting would be one method of selecting one pixel. A fixed algorithm could also be utilized. In another embodiment of this invention, a weighted average of the angular values could be calculated for the imaged spot, creating new unique angular values.
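A compact sketch of two of these selection strategies, using hypothetical charge values: picking the highest-charge cell, and computing a charge-weighted centroid that yields new, sub-pixel coordinates from which angular values could be derived:

```python
def locate_spot(cells):
    """cells maps (row, col) -> charge. Returns the most likely single cell
    (highest charge) and a charge-weighted centroid of the whole spot."""
    peak = max(cells, key=cells.get)
    total = sum(cells.values())
    centroid = (sum(r * v for (r, _), v in cells.items()) / total,
                sum(c * v for (_, c), v in cells.items()) / total)
    return peak, centroid

# A spot centered on cell (10, 20) that overflows onto neighbors (values illustrative)
cells = {(10, 20): 250, (10, 21): 90, (9, 20): 60, (11, 20): 55, (10, 19): 45}
peak, centroid = locate_spot(cells)
print(peak)       # (10, 20) -- the highest-charge cell
print(centroid)   # about (10.0, 20.1) -- a sub-pixel, charge-weighted estimate
```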
[00288] Once the system has captured the reflected light energy and mapped it on the image sensor, the image sensor may send data to the system for analysis and computations regarding mapping, distance, etc., for example.
[00289] Different example embodiments may utilize different sources of light in order to help the system differentiate the emitted and reflected light. For example, the system may polarize one laser beam pulse, send it toward an example target, and then change the polarization for all of the other pulses. In this way, the system may receive the reflected laser beam pulse, with the unique polarization, and be able to identify the location of the specific target, differentiated from all of the other returned beams. Any combination of such examples could be used to identify and differentiate any number of specific targets in the target field. These could be targets that were identified by the system or by human intervention, through an object recognition step earlier in the process, for example.
Biometric Example Embodiments
[00290] In certain example embodiments, the system may be used to measure biometrics including a person's heartbeat if they are in the target area. This may be done with the system described here via various measurement techniques.
[00291] One such example relies on the fact that the human face changes reflectivity to IR depending upon how much blood is under the skin, which may be correlated to heart beat.
[00292] Another technique draws from Eulerian Video Magnification, a method that identifies a subject area in a video, magnifies that area and compares frame-to-frame motion which may be imperceptible to a human observer. Utilizing these technologies a system can infer a human heart beat from a distance of several meters. Some systems need to capture images at a high frame rate, which requires sufficient lighting. Oftentimes ambient lighting is not enough for acceptable image capture. One way to deal with this may include an embodiment here that uses directed illumination, according to the disclosures here, to illuminate a specific area of a subject, thus enhancing the ability of a system to function in non-optimal lighting conditions or at significant distances.
[00293] Technologies that utilize a video image for determining biometric information may require particular illumination such that the systems can capture an acceptable video image at frame rates fast enough to capture frame to frame changes. Ambient lighting may not provide sufficient illumination, and augmented illumination may not be available, or in certain circumstances it may not be desirable to provide high levels of visible light, such as with a sleeping person, where the subject is in a crowded environment, or at a distance making conventional lighting alternatives unacceptable. Certain embodiments here include using illumination which can incorporate directed IR/NIR.
[00294] Such embodiments may determine distance and calibrate projected patterns onto a desired object or human, which may help determine surface contours and depth maps and generate point clouds. And in some embodiments, the system may direct illumination onto one or more areas of a human subject or object. Such a system to direct illumination may be controlled by a human or by software designed to recognize specific areas which require enhanced illumination. The system may work in conjunction with a CMOS, CCD or other imaging device, software which controls the projecting device, object and/or gesture recognition software or human interface, software which analyzes the video image and a microprocessor.
[00295] A human user, or the recognition software, may analyze the image received from the image sensor, identify the subject or subjects of interest, and assign one or more areas which require augmented or enhanced illumination. The system may then direct illumination onto those specifically identified areas. If the system is integrated with motion tracking capabilities, the illumination can be changed with each frame to match the movement of the subject area. The imaging system may then capture the video image and transfer that to the analysis software. Changes to the position, size and intensity of the illumination can be made when the analysis software provides feedback to the software controlling the illumination. Analysis of the processed video images may be passed onto other programs and applications.
[00296] Embodiments of this technology may include the use of color enhancement software which allows the system to replace the levels of gray scale produced in a monochromatic IR image with color equivalents. In such an example, software which utilizes minute changes in skin color reflectivity may not be able to function with a monochromatic image file. When the gray scale is replaced by assigned color, the system may then be able to interpret frame to frame changes.
[00297] Example embodiments may be used for collecting biometrics such as heart/pulse rate from humans and other living organisms. Examples of these can be a sleeping baby, patients in intensive care, elderly patients, and other applications where non-physical and non-light invasive monitoring is desired.
[00298] Example embodiments here could be used in many applications. For instance, example embodiments may be used for collecting information about non-human living organisms as well. For example, some animals cannot easily be contained for physical examination. This may be due to danger they may pose to humans, physical size, or the desire to monitor their activity without disturbing them. As another example, certain embodiments may be used for security systems. By isolating an individual in a crowd, a user could determine if that isolated target had an elevated heart rate, which could indicate an elevated level of anxiety. Some other example embodiments may be used for monitoring inanimate objects in non-optimal lighting conditions, such as production lines, and inventory management, for example.
[00299] Figure 43 illustrates an example embodiment where the biometric of a human target 4312 is desired from a distance of several meters. The distance could vary depending on the circumstances and level of accuracy desired, but this example is one of several meters. In this example, recognition software could identify an area of interest, using object recognition methods and/or systems. The coordinates of the target object may then be sent to the illumination device controlling the directed illumination 4320. The example laser beam 4320 may then be sent to the target and its reflection 4322 captured by an image sensor (not pictured), and transmitted to the system for analysis. The illumination can be adjusted to optimally illuminate a specific area as depicted in the figure detail 4324, showing an example close up of the target and reflection off of a desired portion of the target person 4312.
[00300] This example beam could be motion tracked to follow the target, adjusted, or redirected depending on the circumstances. This may allow for the system to continue to track and monitor an identified subject area even if the object is in motion, and continue to gather biometric information and/or update the information.
Sequential Triangulated Depth Map Example Embodiments
[00301] Certain example embodiments here include the ability to create sequential triangulated depth maps. Such depth maps may provide three-dimensional representation of surfaces of an area based on relative distance from an area to an image sensor. The term is related to and may be analogous to depth buffer, Z-buffer, Z-buffering and Z-depth, for example. Certain examples of these provide the Z or distance aspect as a relative value as each point relates to another. Such example technologies may incorporate a method of using sequentially triangulated points. A system that utilizes triangulation may generate accurate absolute distances from the device to the surface area. Furthermore, when the triangulated points are placed and captured sequentially, an accurate depth map of an area may be generated.
[00302] As described above, certain embodiments here may direct light onto specific target area(s), and more specifically to an interactive projected illumination system which may enable identification of an illuminated point and calculation of the distance from the device to that point by using trigonometric calculations referred to as triangulation.
[00303] According to some embodiments, a system may direct illumination onto a target area using projected points of light at specific intervals along a horizontal axis, then step down a given distance and repeat, until the entire area is scanned. Each pixel may be uniquely identified and matched to an imaged pixel captured by an image sensor. The uniqueness of each pixel may come from a number of identifiers. For example, each projected pixel may have a unique outbound angle and each returning pixel also has a unique angle. Thus, for example, the angles combined with a known distance between the point of directed illumination and the image sensor may enable the system to calculate, using triangulation, the distance to each point. The imaged pixel with an assigned Z, depth or distance component can be further processed to produce a depth map and, with additional processing, a point cloud.
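A minimal sketch of how matched projected/imaged angle pairs might be turned into a sparse depth map using the triangulation relation above; the baseline, pixel coordinates and angles are assumed values for illustration:

```python
import math

def depth_map_from_matches(matches, baseline_mm):
    """matches maps an imaged pixel (row, col) -> (imaged angle, projected angle),
    both in degrees. Each matched pixel receives a Z value via triangulation."""
    zmap = {}
    for (row, col), (theta, alpha) in matches.items():
        z = baseline_mm / (math.tan(math.radians(theta)) - math.tan(math.radians(alpha)))
        zmap[(row, col)] = z
    return zmap

matches = {(120, 300): (6.0, 3.5), (121, 340): (6.4, 3.5), (122, 380): (6.9, 3.5)}
print(depth_map_from_matches(matches, baseline_mm=60.0))   # sparse Z values in mm
```

Further processing could densify such a sparse map or convert it into a point cloud, as described in the surrounding paragraphs.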
[00304] Figure 44A illustrates an example embodiment generating one row of points 4414 with a human subject 4412 also in the room. In this example, each point illuminated has unique and known angular value from its projection. And each point in this example has a unique sequential value based on time and location. These points can be timed and spaced so as to prevent overlap or confusion by the system.
[00305] Figure 44B illustrates example reflected pixels 4424. These reflected points are captured by an image sensor. In this example, each imaged pixel also has unique identifiers such as angular values and time, as in Figure 44A.
[00306] Thus, in this example embodiment, the unique identification of projected pixels and captured pixels may allow the system to match a projected point with an imaged point. Given the known angles and distance between the source of directed illumination and the image capturing device, by use of triangulation described above, distance can be calculated from the device to the surfaces in the field of view. This depth or distance information, "Z," can be associated with a corresponding imaged pixel to create a depth map of the scanned target area or objects. Further processing of the depth map can produce a point cloud. Such example depth maps or point clouds may be utilized by other software systems to create three dimensional or "3D" representations of a viewed area, object and human recognition, including facial recognition and skeletal recognition. Thus, the example embodiments may capture data in order to infer object motion. This may even include human gesture recognition.
[00307] Certain example embodiments may produce the illumination scans in various ways, for example, a vertical scan which increments horizontally. Additionally, certain embodiments may use projected points that are sequential but not equally spaced in time.
[00308] Some embodiments may incorporate a random or asymmetric aspect to the pattern of points illuminated. This could enable the system to change points frame to frame and, through software, fill in the gaps between imaged pixels to provide a more complete depth map.
[00309] And some example embodiments, either manually or as a function of the software, selectively pick one or more areas within a viewed area to limit the creation of a depth map. By reducing the area mapped, the system may run faster, having less data to process. The system may also be dynamically proportioned such that it provides minimal mapping of the background or areas of no or lesser interest and increases the resolution in those areas of greater interest, thus creating a segmented or hybrid depth map.
Augmented Reality
[00310] Certain example embodiments could be used to direct the projection of images at targets. Such an example could use directed illumination incorporating IR/NIR wavelengths of light to improve the ability of object and gesture recognition systems to function in adverse lighting conditions. Augmented reality refers to systems that allow the human user to experience computer generated enhancements to real environments. This could be accomplished with either a monitor or display, or through some form of projected image. In the situation of a projected image, a system could work in low light environments to prevent the projected image from being washed out by ambient light sources. When combined with a directed illumination device that operates in the IR/NIR wavelengths, recognition systems can be given improved abilities to identify objects and motion without creating undesirable interference with projected images. Such example object recognition, object tracking and distance measuring are described elsewhere herein and could be used in these example embodiments to find and track targets.
[00311] Multiple targets could be identified by the system, according to the embodiments disclosed herein. By identifying more than one target, the system could project different or the same image on more than one target object, including motion tracking them. Thus, more than one human could find unique projections on them during a video game, or projected
backgrounds could illuminate walls or objects in the room as well, for example.
[00312] Once found and tracked, the targets could be illuminated with a device that projects various images. This projector could be integrated with the tracking and distance systems or be a separate device. Either way, in some embodiments, the two systems could be calibrated to correct for differences in projected throw angles.
[00313] Any different kind of projection could be sent to a particularly identified object and/or human target. The projected image could be monochrome or multicolored. In such a way, the system could be used with video games to project images around a target area. It could also have uses in medicine, entertainment, automotive, maintenance, education and security, just as examples.
[00314] Figure 45 illustrates an example embodiment showing an interactive game scenario. In this example embodiment, the directed illumination has enabled recognition software to identify where a human 4512 is located in the field of view, and the human has been identified by the system according to any of the example ways described herein. The software may also define the basic size and shape of the subject for certain projections to be located. The example system may then adjust the image accordingly and project it onto the subject, in this example an image of a spider 4524.
Eye Safety Example Embodiment
[00315] Certain example embodiments here include the ability to recognize areas or objects onto which projection of IR/NIR or other illumination is not desired, and block projection to those areas. An example includes recognizing a human user's eyes or face, and keeping the IR/NIR projection away from the eyes or face for safety reasons.
[00316] Certain example embodiments disclosed here include using directed illumination incorporating IR/NIR wavelengths of light for object and gesture recognition systems to function in adverse lighting conditions. Any system which utilizes light in the infrared spectrum when interacting with humans or other living creatures has the added risk of eye safety. Devices which utilize IR/NIR in proximity to humans can incorporate multiple ways of safeguarding eyes.
[00317] According to some embodiments, light is projected in the IR/NIR wavelength onto specifically identified areas, thus providing improved illumination in adverse lighting conditions for object or gesture recognition systems. The illuminated area may then be captured by a CMOS or CCD image sensor. The example embodiment may identify human eyes and provide the coordinates of those eyes to the system which in turn blocks the directed illumination from beaming light directly at the eyes.
[00318] Figures 46A and 46B illustrate examples of how the system may be able to block IR/NIR projection to a human subject's eyes. For example, the image is captured with a CMOS or CCD image sensor and the image is sent to a microprocessor where one aspect of the software identifies the presence of human eyes in the field of view. The example embodiment may then send the coordinates of the eyes to the embodiment which controls the directed illumination. The embodiment may then create a section of blocked or blank illumination, as directed. As the directed illumination is scanned across a blanked area the light source is turned off. This prevents IR/NIR light from beaming directly into the eyes of a human.
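A simplified sketch of the blanking step, assuming the recognition stage reports eye regions as rectangular pixel boxes in the scan frame (the frame size and box coordinates are illustrative, not from the disclosure):

```python
def apply_eye_blanking(scan_frame, blocked_regions):
    """Turn the light source off (intensity 0) for every scan pixel inside a
    blocked region, so no IR/NIR is beamed directly at the eyes."""
    for r0, c0, r1, c1 in blocked_regions:
        for row in range(r0, r1 + 1):
            for col in range(c0, c1 + 1):
                scan_frame[row][col] = 0
    return scan_frame

frame = [[200] * 8 for _ in range(6)]       # a tiny illustrative scan frame
apply_eye_blanking(frame, [(1, 2, 2, 4)])   # blank the box covering the eyes
for row in frame:
    print(row)
```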
[00319] Figure 46A is an example of a human subject 4612 with projected illumination 4624 incorporating eye blocking 4626.
[00320] Figure 46B is an example of a close up of human subject 4612 with a projected illumination incorporating 4624 eye blocking 4626.
[00321] There may be other reasons to block certain objects in the target area from IR/NIR or other radiation. Sensitive equipment that directed IR/NIR could damage may be located in the target area. Cameras may be present whose sensors could be washed out or damaged by flooding them with IR illumination. Any kind of motivation to block the IR/NIR could drive the embodiment to block out or restrict the amount of IR/NIR or other illumination to a particular area. Additionally, the system could be configured to infer eye location by identifying other aspects of the body. An example of this may be to recognize and identify the arms or the torso of a human target and calculate a probable relative position of a head and reduce or block the amount of directed illumination accordingly.
[00322] Certain example embodiments here include the ability to adjust the size of the output window and the relative beam divergence as it relates to the overall eye safe operation of the device. The larger the output window of the device, which represents the closest point a human eye can be placed relative to the light source, and/or the greater the divergence of the throw angle of the scanned beam, the less IR/NIR can enter the eye over a given period of time. A divergent scanned beam has the added effect of increasing the illuminated spot on the retina, which reduces the harmful effect of IR/NIR over the same period of time.
[00323] Figures 47A and 47B illustrate the impact of output window size on the human eye 4722. Safe levels of IR are determined by intensity over area over time. The lower the intensity for a given period of time, the greater the margin relative to the MPE, or maximum permissible exposure. In Figure 47B, where the output window 4724 is approximately the same height as the pupil 4722 (in this example an output window 7mm tall by 16mm wide, with an average dilated pupil of 7mm), approximately 34.4% of the light exiting the output window can enter the eye. If the output window is doubled in size 4726 to 14mm tall and 32mm wide, the maximum light that could enter the pupil drops to 8.6%, as illustrated in Figures 47C and 47D, for example.
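The percentages above follow from simple geometry: the area of a 7mm pupil divided by the area of the rectangular output window. A small worked check, assuming the pupil sits flush against the window:

```python
import math

def pupil_fraction(window_h_mm, window_w_mm, pupil_d_mm=7.0):
    """Fraction of the light leaving a rectangular output window that can pass
    through a circular pupil pressed against it (worst-case geometry)."""
    pupil_area = math.pi * (pupil_d_mm / 2.0) ** 2
    return pupil_area / (window_h_mm * window_w_mm)

print(f"{pupil_fraction(7, 16):.1%}")    # ~34.4% for the 7mm x 16mm window
print(f"{pupil_fraction(14, 32):.1%}")   # ~8.6% when the window is doubled
```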
[00324] Figure 47A is a detailed illustration of 47B showing the relationship of elements of the device to a human eye at close proximity. Light from a semiconductor laser 4762 or other light source passes through optical element 4766 and is directed onto a 2D MEMs 4768 or other device designed to direct or steer a beam of light. The angular position of the MEMs reflects each pixel of a raster scanned image with a unique angle, which creates an effective throw angle for each scan or frame of illumination. The scanned range of light exits the device through an output window 4726 which dimensionally matches the image size of the scanned area. The human eye 4712 is assumed to be located as close as possible to the exit window. A portion of the light from the exit window can enter the pupil 4722 and is focused on the back, or retina, of the eye 4728. The angular nature of each sequential pixel causes the focused area to be larger than that of a collimated beam. This has the same effect as if the beam had passed through a divergent lens.
[00325] Figure 47C is a detailed illustration of 47D showing the relationship of elements of the device to a human eye at some distance. Light from a semiconductor laser 4762 or other light source passes through optical element 4766 and is directed onto a 2D MEMs 4768 or other device designed to direct or steer a beam of light. The angular position of the MEMs reflects each pixel of a raster scanned image with a unique angle, which creates an effective throw angle for each scan or frame of illumination. The scanned range of light exits the device through an output window 4726 which dimensionally matches the image size of the scanned area. The human eye 4712 is assumed to be located as close as possible to the exit window. A portion of the light from the exit window can enter the pupil 4722 and is focused on the back, or retina, of the eye 4730. The angular nature of each sequential pixel causes the focused area to be larger than that of a collimated beam. This has the same effect as if the beam had passed through a divergent lens. The greater the throw angle of the device, the more even small changes in the distance from the output window to the MEMs will reduce the total amount of light which can enter the eye.
[00326] An embodiment of this technology incorporates the ability for the device to dynamically adjust the effective size of the output window. By controlling the MEMs in such a way as to change the throw angle, or by changing the horizontal and vertical scan rates, the system can effectively adjust the output window to optimize the use of directed illumination while maximizing eye safety.
[00327] Certain embodiments here also may incorporate adding the distance from the device to the human and calibrating the intensity of the directed illumination in accordance with the distance. In this embodiment even if the eyes are not detectable, a safe level of IR/NIR can be utilized.
Color Enhanced IR
[00328] Certain example embodiments here may include color variation of the projected illumination. This may be useful because systems using directed illumination may incorporate IR/NIR wavelengths of light. These are outside of the spectrum of light visible to humans. When this light is captured by a CMOS or CCD imaging sensor, it may generate a monochromatic image normally depicted in black and white or gray scale. Humans and image processing systems may rely on color variation to distinguish edges, objects, shapes and motion. In situations where IR/NIR directed illumination works in conjunction with a system that requires color information, specific colors can be artificially assigned to each level of gray for display. Furthermore, by artificially applying the color values, differentiation between subtle variations in gray can be emphasized, thus improving the image for humans.
[00329] According to certain embodiments, directing illumination in the IR/NIR wavelength onto specifically identified areas may provide augmented illumination, as disclosed herein. Such example illumination may then be captured by a CMOS or CCD image sensor. In certain embodiments, the system may then apply color values to each shade of gray and either pass that information on to other software for further processing or display the image on a monitor for a human observer.
[00330] Projected color is additive, adding light to make different colors, intensities, etc. For example, 8 bit color provides 256 levels for each projection device such as a laser or LED; the range is 0-255 since 0 is a value. For example, 24 bit color (8 bits x 3 channels) results in 16.8 million colors.
[00331] Referring to IR/NIR, the system processing the IR/NIR signals may return black, white and shades of gray in order to interpret the signals. Many IR cameras produce 8 bit gray scale. And it may be very difficult for a human to discern the difference between gray 153 and gray 154. Factors include the quality and calibration of the monitor, the ambient lighting, the observer's biological sensitivity, number of rods versus cones in the eye, etc. The same problem exists for gesture and object recognition software - it has to interpret grey scale into something meaningful.
[00332] Embodiments here include the ability to add color values back to the gray scales. The system may set gray 153 to be red 255 and 154 to be green 255, or any other settings, this being only one example. Using various assignment methods and systems, color levels may be assigned to each gray scale value. For example, everything below 80 gets 0,0,0, or black, everything above 130 gets 255,255,255, or white, and the middle range is expanded.
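A toy sketch of such a mapping, using the thresholds mentioned above (below 80 becomes black, above 130 becomes white) and an assumed red-to-green ramp for the expanded middle band; the specific ramp is illustrative, not prescribed by this disclosure:

```python
def gray_to_rgb(gray):
    """Map an 8 bit gray value to an (R, G, B) triple for display."""
    if gray < 80:
        return (0, 0, 0)                  # low levels collapse to black
    if gray > 130:
        return (255, 255, 255)            # high levels collapse to white
    t = (gray - 80) / 50.0                # expand the 80..130 band across the ramp
    return (int(255 * (1 - t)), int(255 * t), 0)

print(gray_to_rgb(60))    # (0, 0, 0)
print(gray_to_rgb(153))   # (255, 255, 255)
print(gray_to_rgb(105))   # (127, 127, 0) -- mid-band gray spread across a color ramp
```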
[00333] Figure 48A illustrates a nine level gray scale with arbitrarily assigned R-red G-green B-blue values using an 8 bit RGB additive index color scale. Because the assignment of color to gray is artificial the scale and assignments can be in formats that are best matched to the post enhancement systems. Any variation of assigned colors may be used, the example shown in Figure 48A is illustrative only.
[00334] Figure 48B illustrates an example image captured inclusive of a subject 4812 which has been color enhanced according to the assignments of color from Figure 48A. Thus, the colors, Red, Green and Blue show up in the amounts indicated in Figure 48A, according to the level of grey scale assigned by the example system. Thus, for example, if the monochrome system assigned a pixel a grey scale of 5, the system here would assign 0 Red, 0 Blue and 200 Green to that pixel, making it a certain shade of green on the display of 48B. A grey scale assignment of 1 would assign 150 Red, 0 Green and 0 Blue, assigning a certain shade of red to the pixels with that grey scale value. In such a way, the grey scale shading becomes different scales of colors instead of a monochrome scale.
[00335] And some example embodiments could apply color enhancement to select areas of the display only, once a target is identified and illuminated. Some embodiments may enable a nonlinear allocation of color. In such an embodiment, thresholds can be assigned to the levels. An example of this could be to take all low levels and assign them the same color, or black, thus accentuating a narrower range of gray.
[00336] And certain example embodiments could include identification of a particular target by a human user/observer of the displayed image to be enhanced. This could be accomplished with a mouse, touch screen or other gesture recognition which would allow the observer to indicate an area of interest.
Square Wave Propagation
[00337] Certain embodiments here also include the ability to utilize propagation of a light-based square wave, and more specifically an interactive raster scanning system/method for directing a square wave. In such a way, directed illumination and ToF - Time-Of-Flight imaging may be used to map and determine distance of target objects and areas.
[00338] Square waves are sometimes used by short range TOF or time-of-flight depth mapping technologies. For example, an array of LEDs may be turned on and off at a certain rate to create a square wave.
[00339] In some embodiments, the LEDs may switch polarity to create waves of light with square waves of polarity shifted. In some embodiments, when these waves bounce off or reflect off objects, the length of the wave may change. This may allow Current Assisted Photon Demodulating (CAPD) image sensors to create a depth map.
[00340] In certain examples, projected light from LEDs may not be suitable for generating square waves without using current modulation to switch the polarity of the LEDs, thus resulting in optical switching. In such embodiments, a single Continuous Wave (CW) laser may be pulsed at high rates, for example 1.1 nanoseconds, and the timing adjusted such that a sweeping laser may create a uniform wave front.
[00341] Some example embodiments here include using a directed single laser beam which is configured to produce a raster scan based on a 2D MEMs or similar optical steering device. In this example, a continuous wave laser such as a semiconductor laser, which can be either amplitude modulated or pulse width modulated, or both, is used as the source for generating the square wave. Also, in this example embodiment, a raster scan can form an interlaced, de-interlaced, or progressive pattern. When the laser is reflected off of a beam steering mechanism capable of generating a raster scan, an area of interest can be fully illuminated during one complete scan or frame. Some raster scans are configured to have horizontal lines made up of a given number of pixels and a given number of horizontal lines. In such an example, during each pixel the laser can be turned on. The on time as well as the optical power or amplitude of each pixel may be controlled by the system, generating one or more pulses of a square wave. In this example, when timed such that the pulses for each sequential pixel are in phase with the desired wave format, they may generate a wave front that will appear to the imaging system as if it had been generated as a single wave front.
[00342] In some embodiments, further control over the placement of the square wave may be accomplished where a human/user or a system may analyze the reflected image received from the image sensor, and help identify the subject or subjects of interest. The system may then control the directed illumination to only illuminate a desired area. This can reduce the amount of processing required by the imaging system, as well as allow for a higher level of intensity, which also improves the system performance.
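A rough sketch of the timing idea: fire a pulse in a pixel's time slot only when a rising edge of the desired square-wave modulation falls inside that slot, so every emitted pulse stays in phase even when the scan cannot pulse on every sequential pixel. The pixel clock and modulation frequency below are assumed values, not taken from this disclosure:

```python
import math

def pixel_pulse_schedule(pixel_clock_hz, mod_freq_hz, num_pixels):
    """Return (pixel index, fire time in seconds) for each pixel whose time slot
    contains a rising edge of the modulation square wave."""
    pixel_period = 1.0 / pixel_clock_hz
    mod_period = 1.0 / mod_freq_hz
    schedule = []
    for n in range(num_pixels):
        slot_start, slot_end = n * pixel_period, (n + 1) * pixel_period
        edge = math.ceil(slot_start / mod_period) * mod_period  # next rising edge
        if edge < slot_end:
            schedule.append((n, edge))    # energize the laser during this pixel
    return schedule

# e.g. a 100 MHz pixel clock with a 25 MHz square-wave modulation (illustrative)
for pixel, t in pixel_pulse_schedule(100e6, 25e6, 10):
    print(pixel, f"{t * 1e9:.0f} ns")     # pulses land at 0, 40, 80 ns (pixels 0, 4, 8)
```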
[00343] Figure 49A is an example representative graph which shows four cycles of an example square wave. Dotted line 4922 shows a sample wave generated by a gain-shifted LED. Dashed line 4924 represents an example pulse which is generated by an example semiconductor laser. These example lasers may have switching times that are beneficial to such a system and allow for particular square wave propagation, as shown, with little or nearly no noise on the wave propagation. Solid line 4926 illustrates how the example pulses may be kept in phase if the constraints of the system prevent sequential pulses.
[00344] Figure 49B illustrates an example target area including a target human figure 4912 in an example room where a propagated square wave generated by system for directed illumination 4916 is used. In such a way, an example embodiment may use an optical switching mechanism to switch a laser on and off, producing clean pulses to reflect off of a target. In an example where in-phase pulses are used, they may form uniform wave fronts 4918. Thus, the returning, reflected waves (not pictured) can then be captured and analyzed for demodulation of the square waves. Additionally, certain embodiments include using gain switching to change the polarity of the laser, creating on and off pulses at various intervals.
Dynamically Calibrated Projected Patterns for 3D Surface Imaging
[00345] There are many elements which impact the performance of 3D surface imaging methodologies which rely on the projection of patterns of light onto a subject. These systems analyze the captured image of the patterns on the subject through various algorithms. These algorithms derive information which allows those systems to generate depth maps or point clouds, databases which can be used by other systems to infer three dimensional characteristics of a two dimensional image. This information can be further processed to extract such information as gesture, human, facial and object recognition.
[00346] A factor which these methodologies and others not described here have in common is the need to optimize the pattern projected onto a subject. The frequency of the pattern, or number of times it repeats, the number of lines, and other aspects of the pattern affect the system's ability to accurately derive information. Alternating patterns in some examples are necessary to produce the interference or fringe patterns required for the methodology's algorithm. In other methods the orientation of the patterns projected onto the subject and the general orientation of the subject influence various characteristics related to optimal data extraction.
[00347] The ability to dynamically adjust the projected patterns on a subject may improve the accuracy (the deviation between calculated and actual dimensions) as well as the resolution (the number of final data points), and increase information gathering and processing speeds.
[00348] Certain embodiments here include the ability to direct light onto specific target area(s), determining distance to one or more points and calibrating a projected pattern accordingly. This may be done with directed illumination and single or multipoint distance calculation used in conjunction with projected patterns including structured light, phase shift, or other methods of using projected light patterns to determine surface contours, depth maps or generation of a point clouds.
[00349] For example, a projected pattern from a single source will diverge the further it is from the origin; this divergence is characterized by the throw angle. As a subject moves further away from the projector, the projected pattern will increase in size because of the divergence. And, as a subject gets further away from a camera, the subject will occupy a smaller portion of the imaged area as a result of the FOV or viewing angle of the camera. The combined effect of the projected throw angle and the captured FOV may increase the distortion of the projected image. Thus, a calibrated projection system may be helpful to map an area and objects in an area where objects may be at different distances from the camera.
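A sketch of the kind of calibration this enables: given the projector's throw angle and a measured distance to the subject, choose how many projector pixels one stripe should span so that the stripe keeps a constant physical width on subjects at different distances. Every numeric value here is an assumption for illustration:

```python
import math

def projected_frame_width_mm(throw_angle_deg, distance_mm):
    """Physical width of the full projected frame at a given distance."""
    return 2.0 * distance_mm * math.tan(math.radians(throw_angle_deg) / 2.0)

def stripe_pixels_for_width(target_stripe_mm, throw_angle_deg, distance_mm,
                            pattern_width_px=1280):
    """Projector pixels per stripe so the stripe is target_stripe_mm wide on the subject."""
    mm_per_pixel = projected_frame_width_mm(throw_angle_deg, distance_mm) / pattern_width_px
    return max(1, round(target_stripe_mm / mm_per_pixel))

for d_mm in (1000.0, 2000.0, 3000.0):   # subjects at roughly 1 m, 2 m and 3 m
    px = stripe_pixels_for_width(target_stripe_mm=20.0, throw_angle_deg=40.0, distance_mm=d_mm)
    print(d_mm, px)    # closer subjects need more pixels per stripe (about 35, 18, 12)
```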
[00350] A system that incorporates directed illumination with the ability to determine distance from a projector to one or more subject areas is used to statically or dynamically adjust projected patterns, as disclosed above. Further, some example embodiments may be able to segment a viewed area and adjust patterns accordingly for multiple areas simultaneously. Such example embodiments may analyze each segment independently and combine the results to create independent depth maps or combine independent depth maps into one. And such example embodiments may be used to determine if a flat wall or background is present and either eliminate the background from being projected upon or remove it in post processing.
[00351] An embodiment of this system incorporates a system for detecting when either a projected or captured frame is corrupted, or torn. Corruption of a projected or captured image file may result from a number of errors introduced into the system. In this example of a corrupt frame of information, the system can recognize either that a corrupt image has been projected or that a corrupted image has been captured. The system then may identify the frame such that later processes can discard the frame, repair the frame or determine if the frame is useable.
[00352] Some embodiments here may determine depth, 3D contours and/or distance, and incorporate dynamically calibrating the patterns for optimization. Such examples may be used to determine distance and calibrate patterns projected onto a desired object or human, which may help in determining surface contours, generating depth maps, and generating point clouds.
[00353] According to certain embodiments, one or more points or pixels of light may be directed onto a human subject or an object. Such direction may be via a separate device, or an integrated one combined with a projector, able to direct projected patterns which can be calibrated by the system. The patterns may be projected with a visible wavelength of light or a wavelength in the IR/NIR. The projector system may work in conjunction with a CMOS, CCD, or other imaging device; software which controls the projecting device; object and/or gesture recognition software or a human interface; and a microprocessor as disclosed herein.
Structured Light for Surface Imaging
[00354] For example, a human/user or the recognition software analyzes the image received from the image sensor, identifies the subject or subjects of interest, and assigns one or more points for distance calculation. The system may calculate the distance to each projected point. The distance information may be passed on to the software which controls the projected pattern. The system may then combine the distance information with information about the relative location and shape of the chosen subject areas. The system may then determine which pattern, pattern size, and orientation to use depending on the circumstances. The projector may then illuminate the subject areas with the chosen pattern. The patterns may be captured by the image sensor and analyzed by software which outputs information in the form of a 3D representation of the subject, a depth map, a point cloud, or other data about the subject, for example.
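The flow described in [00354] can be summarized, purely as an illustrative sketch, by the following Python outline. Every callable passed in (capture, find_subjects, measure_distance, and so on) is a placeholder for functionality described elsewhere in this disclosure, not an API it defines.

```python
def depth_mapping_cycle(capture, find_subjects, measure_distance,
                        calibrate_patterns, project_and_capture, decode, stitch):
    """One pass of the loop outlined above, expressed over caller-supplied
    callables.  All of them are placeholders for functionality described
    elsewhere in this disclosure, not APIs it defines."""
    frame = capture()
    depth_maps = []
    for subject in find_subjects(frame):                  # subject areas of interest
        distance = measure_distance(subject)              # e.g. single-point triangulation
        patterns = calibrate_patterns(subject, distance)  # pattern, size, orientation
        images = [project_and_capture(p, subject) for p in patterns]
        depth_maps.append(decode(images, distance))       # per-subject depth map
    return stitch(depth_maps)                             # combined depth map / point cloud
```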
[00355] Figure 50A illustrates an example embodiment using non-calibrated phase shift patterns projected onto human subjects 5012, 5013, and 5014. The effect of the throw angle, as indicated by reference lines 5024, is illustrated by bands 5016, 5018, and 5020. In this example, the further from the point of origin the subject is, the wider the bands become and the fewer bands are projected onto the subject.
[00356] Figure 50B illustrates an example embodiment similar to 50A, but where the pattern has been calibrated. Thus, in 50B, the phase shift pattern is projected onto human subjects 5032, 5033, and 5034. Using calibrated patterns, the example system may determine the distance from the subjects of interest to the image sensor and generate a uniquely calibrated pattern for each subject. In this example, patterns 5036, 5038, and 5040 are calibrated such that they will produce the same number of lines and the same line characteristics on each subject. This may be useful for the system to use in other calculations as described herein.
[00357] Continuing to refer to figure 50B, there may be multiple subjects at varying distances from the device. In such instances the system can segment the area and project uniquely calibrated patterns onto each subject. In such a way, segmented depth maps can be compared and added together to create a complete depth map of an area. And in such an example, the distance calculating ability of the system can also be used to determine the existence of a wall and other non-critical areas. The example system may use this information to eliminate these areas from analysis.
[00358] Figure 50C illustrates an example embodiment showing an ability to determine the general orientation of an object, in this example a vertical object 5064 and a horizontal object 5066. In this example, phase shifting is optimized when the patterns run perpendicular to the general orientation of the subject. The example system may identify the general orientation of a subject area and adjust the X, Y orientation of the pattern. The patterns projected in Figures 50A, 50B, and 50C are exemplary only. Any number of patterns may be used in the ways described here.
[00359] Figure 51 shows a table depicting some examples of projected patterns that can be used with dynamic calibration. These examples discussed below are not meant to be exclusive of other options but exemplary only. Further, the examples below only describe the patterns for reference purposes and are not intended as explanations of the process nor the means by which data is extracted from the patterns.
[00360] One embodiment example of this is sequential binary coding, 5110, which is comprised of alternating black (off) and white (on) stripes generating a sequence of projected patterns, such that each point on the surface of the subject is represented by a unique binary code. N patterns can code 2^N stripes; in the example of a 5-bit pattern, the result is 32 stripes. The example pattern series is 2 stripes (1 black, 1 white), then 4, 8, 16, and 32. When the images are captured and combined by the software, 32 unique x, y coordinates for each point along a line can be identified. Utilizing triangulation for each of the 32 points, the z-distance can be calculated. When the data from multiple lines are combined, a depth map of the subject can be derived.
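A minimal sketch of how such a 2^N binary stripe sequence might be generated and decoded is shown below, assuming a NumPy environment; the frame size and the 5-bit depth are example values only.

```python
import numpy as np

def binary_stripe_frames(n_bits, width, height):
    """Sequence of n_bits on/off vertical-stripe frames.  The bits observed at any
    column across the sequence form that column's unique code (2**n_bits codes)."""
    stripe_index = (np.arange(width) * (2 ** n_bits)) // width   # 0 .. 2**n_bits - 1
    frames = []
    for bit in range(n_bits - 1, -1, -1):                        # coarsest stripes first
        row = ((stripe_index >> bit) & 1).astype(np.uint8) * 255
        frames.append(np.tile(row, (height, 1)))
    return frames

def decode_column(bits):
    """Recover a stripe index from the per-frame on/off observations at one column."""
    value = 0
    for b in bits:
        value = (value << 1) | int(b)
    return value

frames = binary_stripe_frames(5, 640, 4)                 # 5-bit example: 2, 4, 8, 16, 32 stripes
print(decode_column(f[0, 100] > 127 for f in frames))    # stripe index at column 100 -> 5
```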
[00361] Systems that utilize this methodology require changing the projected pattern for each frame in the sequence. Projection systems have additional challenges in projecting the pattern evenly across a subject, especially one that may move along a Z axis, distance from the device, as the throw angle of the projection will alter the number of lines on the subject, or one where the subject may move allowing for only a portion or disproportionate spacing of the pattern to be projected onto the subject.
[00362] Directed illumination as described here controls the illumination of an area at a pixel level. The system has the ability to control amplitude of each pixel from zero, or off, to a maximum level. An image captured by a sensor can be analyzed by system or a human observer to determine an area or areas of interest. Using triangulation or other method of determining the distance to the object from the device, the system determines the boundaries of the subject to be illuminated with a pattern and with the known distance, applies a calibration algorithm to the projected pattern and by controlling which projected pixels are turned on during one frame, optimize the pattern illumination. Further, in this example there is a series of 5 separate patterns projected, by controlling the projected pixels, the software can change the projected pattern each frame or frames as required.
[00363] Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration. In these instances the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest. The system has the ability to analyze each segment independently. To form a cohesive 3D map of independent segments, the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image. As the subjects move, the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
[00364] One embodiment example of this is sequential gray code, 5112, which is similar to the sequential binary code referenced in 5110, with the use of intensity-modulated stripes instead of binary on/off patterns. This increases the level of information that can be derived with the same or fewer patterns. In this example, L represents the levels of intensity and N the number of patterns in a sequence. Further in this example, there are 4 levels of intensity: black (off), white (100% on), 1st step gray (33% on), and 2nd step gray (66% on), or L = 4. N, the number of patterns in a sequence, is 3 in this example, resulting in 4^3, or 64, unique points in one line.
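The L^N relationship can be illustrated with the following sketch, which generalizes the previous binary example to L intensity levels; the 4-level, 3-pattern configuration matches the example in [00364], while the resolution values are arbitrary.

```python
import numpy as np

def multilevel_stripe_frames(levels, n_patterns, width, height):
    """n_patterns frames of stripes drawn from `levels` intensity steps.  A column's
    per-frame levels form a base-L digit string, giving levels**n_patterns codes."""
    n_codes = levels ** n_patterns                       # e.g. 4**3 = 64, as in [00364]
    code = (np.arange(width) * n_codes) // width         # unique code per stripe column
    step = 255 // (levels - 1)                           # 0, 85, 170, 255 ~ off/33%/66%/on
    frames = []
    for digit in range(n_patterns - 1, -1, -1):          # most significant digit first
        level = (code // (levels ** digit)) % levels
        frames.append(np.tile((level * step).astype(np.uint8), (height, 1)))
    return frames

frames = multilevel_stripe_frames(levels=4, n_patterns=3, width=640, height=4)
print(len(frames), int(frames[0].min()), int(frames[0].max()))   # 3 frames, values 0..255
```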
[00365] Systems that utilize this methodology require changing the projected pattern for each frame in the sequence. Projection systems have additional challenges in projecting the pattern evenly across a subject, especially one that may move along a Z axis, distance from the device, as the throw angle of the projection will alter the number of lines on the subject, or one where the subject may move allowing for only a portion or disproportionate spacing of the pattern to be projected onto the subject.
[00366] Directed illumination as described here, controls the illumination of an area at a pixel level. The system has the ability to control amplitude of each pixel from zero, or off, to a maximum level. An image captured by a sensor can be analyzed by system or a human observer to determine an area or areas of interest. Using triangulation or other method of determining the distance to the object from the device, the system determines the boundaries of the subject to be illuminated with a pattern and with the known distance, applies a calibration algorithm to the projected pattern and by controlling which projected pixels are turned on during one frame, optimize the pattern illumination. Further, in this example there is a series of 3 separate patterns projected, by controlling the projected pixels, the software can change the projected pattern each frame or frames as required.
[00367] Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration. In these instances the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest. The system has the ability to analyze each segment independently. To form a cohesive 3D map of independent segments, the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image. As the subjects move, the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
[00368] One embodiment example of this is sequentially projected phase shifting, 5114, which utilizes the projection of sequential sinusoidal patterns onto a subject area. In this example a series of three of sinusoidal fringe patterns represented as IN, are projected onto the area of interest. The intensities for each pixel (x, y) of the three the patterns are described as
!ι(χ, y) = ¾(x, y) + Imod(x, y) cos((p(x, y) - Θ),
(x, y) = Io(x, y) + Imod(x, y) cos((p(x, y)),
I3(x, y) = I0(x, y) + Imod(x, y) cos((p(x, y) + Θ),
where I^x, y), I2(x, y), and I3(x, y) are the intensities of three patterns, I0(x, y) is the component background, Imod(x, y) is the modulation signal amplitude, φ(χ, y) is the phase, and Θ is the constant phase-shift angle.
[00369] Phase unwrapping is the process that converts the wrapped phase to the absolute phase. The phase information that can be retrieved and unwrapped is derived from the intensities in the three fringe patterns.
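For illustration only, the sketch below recovers the wrapped phase directly from the three intensity equations in [00368] (the I0 and Imod terms cancel out of the ratio) and applies a simple row-wise unwrap; the synthetic fringe parameters are made-up test values, and a production system would typically use a more robust unwrapping method.

```python
import numpy as np

def wrapped_phase(i1, i2, i3, theta):
    """Wrapped phase from three fringe images with constant phase shift theta;
    the background I0 and modulation Imod terms cancel out of the ratio."""
    num = (1.0 - np.cos(theta)) * (i1 - i3)
    den = np.sin(theta) * (2.0 * i2 - i1 - i3)
    return np.arctan2(num, den)                  # wrapped into (-pi, pi]

# Synthetic check with made-up fringe parameters: rebuild a known phase ramp.
x = np.linspace(0.0, 6.0 * np.pi, 512)
phi_true = np.tile(x, (4, 1))
theta = 2.0 * np.pi / 3.0                        # 120 degree shift between patterns
i0, imod = 100.0, 50.0
i1 = i0 + imod * np.cos(phi_true - theta)
i2 = i0 + imod * np.cos(phi_true)
i3 = i0 + imod * np.cos(phi_true + theta)
phi = np.unwrap(wrapped_phase(i1, i2, i3, theta), axis=1)   # simple row-wise unwrap
print(np.allclose(phi, phi_true))                # True: the ramp is recovered
```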
[00370] Systems that utilize this methodology require changing the projected pattern for each frame in the sequence. Projection systems have additional challenges in projecting the pattern evenly across a subject, especially one that may move along a Z axis, distance from the device, as the throw angle of the projection will alter the number of lines on the subject, or one where the subject may move allowing for only a portion or disproportionate spacing of the pattern to be projected onto the subject.
[00371] Directed illumination as described here, controls the illumination of an area at a pixel level. The system has the ability to control amplitude of each pixel from zero, or off, to a maximum level. An image captured by a sensor can be analyzed by system or a human observer to determine an area or areas of interest. Using triangulation or other method of determining the distance to the object from the device, the system determines the boundaries of the subject to be illuminated with a pattern and with the known distance, applies a calibration algorithm to the projected pattern and by controlling which projected pixels are turned on during one frame, optimize the pattern illumination. Further, in this example there is a series of 3 separate patterns projected, by controlling the projected pixels, the software can change the projected pattern each frame or frames as required.
[00372] Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration. In these instances the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest. The system has the ability to analyze each segment independently. To form a cohesive 3D map of independent segments, the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image. As the subjects move, the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
[00373] One embodiment example of this is Trapezoidal, 5116. This method is similar to that described in 5114 phase shifting, but replaces a sinusoidal pattern with trapezoidal-shaped gray levels. Interpretation of the data into a depth map is similar, but can be more computationally efficient.
[00374] Systems that utilize this methodology require changing the projected pattern for each frame in the sequence. Projection systems have additional challenges in projecting the pattern evenly across a subject, especially one that may move along a Z axis, distance from the device, as the throw angle of the projection will alter the number of lines on the subject, or one where the subject may move allowing for only a portion or disproportionate spacing of the pattern to be projected onto the subject.
[00375] Directed illumination as described here, controls the illumination of an area at a pixel level. The system has the ability to control amplitude of each pixel from zero, or off, to a maximum level. An image captured by a sensor can be analyzed by system or a human observer to determine an area or areas of interest. Using triangulation or other method of determining the distance to the object from the device, the system determines the boundaries of the subject to be illuminated with a pattern and with the known distance, applies a calibration algorithm to the projected pattern and by controlling which projected pixels are turned on during one frame, optimize the pattern illumination. Further, in this example there is a series of 3 separate patterns projected, by controlling the projected pixels, the software can change the projected pattern each frame or frames as required.
[00376] Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration. In these instances the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest. The system has the ability to analyze each segment independently. To form a cohesive 3D map of independent segments, the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image. As the subjects move, the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
[00377] One embodiment example of this is a hybrid method, 5118, in which the gray coding described in 5112 and the phase shifting described in 5114 are combined to form a precise series of patterns with reduced ambiguity. The gray code pattern determines the non-ambiguous range of phase, while phase shifting provides increased sub-pixel resolution. In this example, 4 patterns of a gray code are combined with 4 patterns of phase shifting to create an 8-frame sequence.
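A hedged sketch of the combination step: once the gray-code frames identify which fringe period a pixel falls in, that index removes the 2π ambiguity of the wrapped phase. The numbers below are hypothetical.

```python
import numpy as np

def absolute_phase(wrapped_phi, period_index):
    """Combine the wrapped phase from the phase-shift frames with the coarse
    period index recovered from the gray-code frames; the index removes the
    2*pi ambiguity, giving an absolute phase usable for depth calculation."""
    return wrapped_phi + 2.0 * np.pi * period_index

# Hypothetical pixel: wrapped phase 1.2 rad observed inside gray-coded period 5.
print(float(absolute_phase(np.float64(1.2), 5)))   # -> about 32.616
```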
[00378] Systems that utilize this methodology require changing the projected pattern for each frame in the sequence. Projection systems have additional challenges in projecting the pattern evenly across a subject, especially one that may move along a Z axis, distance from the device, as the throw angle of the projection will alter the number of lines on the subject, or one where the subject may move allowing for only a portion or disproportionate spacing of the pattern to be projected onto the subject.
[00379] Directed illumination as described here, controls the illumination of an area at a pixel level. The system has the ability to control amplitude of each pixel from zero, or off, to a maximum level. An image captured by a sensor can be analyzed by system or a human observer to determine an area or areas of interest. Using triangulation or other method of determining the distance to the object from the device, the system determines the boundaries of the subject to be illuminated with a pattern and with the known distance, applies a calibration algorithm to the projected pattern and by controlling which projected pixels are turned on during one frame, optimize the pattern illumination. Further, in this example there is a series of 8 separate patterns projected, by controlling the projected pixels, the software can change the projected pattern each frame or frames as required.
[00380] Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration. In these instances the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest. The system has the ability to analyze each segment independently. To form a cohesive 3D map of independent segments, the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image. As the subjects move, the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
[00381] One embodiment example of this utilizes a Moiré pattern, 5120, which is based on the geometric interference between two patterns. The overlap of the patterns forms a series of dark and light fringes. These patterns can be interpreted to derive depth information.
[00382] Systems that utilize this methodology require changing the projected pattern for each frame in the sequence. Projection systems have additional challenges in projecting the pattern evenly across a subject, especially one that may move along a Z axis, distance from the device, as the throw angle of the projection will alter the number of lines on the subject, or one where the subject may move allowing for only a portion or disproportionate spacing of the pattern to be projected onto the subject.
[00383] Directed illumination as described here, controls the illumination of an area at a pixel level. The system has the ability to control amplitude of each pixel from zero, or off, to a maximum level. An image captured by a sensor can be analyzed by system or a human observer to determine an area or areas of interest. Using triangulation or other method of determining the distance to the object from the device, the system determines the boundaries of the subject to be illuminated with a pattern and with the known distance, applies a calibration algorithm to the projected pattern and by controlling which projected pixels are turned on during one frame, optimize the pattern illumination. Further, in this example there is a series of at least 2 separate patterns projected, by controlling the projected pixels, the software can change the projected pattern each frame or frames as required.
[00384] Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration. In these instances the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest. The system has the ability to analyze each segment independently. To form a cohesive 3D map of independent segments, the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image. As the subjects move, the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
[00385] One embodiment example of this is multi-wavelength, also referred to as Rainbow 3D, 5122, which is based upon spatially varying wavelengths projected onto the subject. With a known physical relationship D between the directed illumination and the image sensor, and the calculated value of Θ, the angle between the image sensor and a particular wavelength of light λ, unique points can be identified on a subject and, utilizing methods of triangulation, distances to each point can be calculated.
[00386] This system can utilize light in the visible spectrum or IR/NIR wavelengths spaced far enough apart that they can subsequently be separated by the system.
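As an illustrative sketch of the triangulation referred to in [00385], the following computes the distance from the baseline D given the projector-side and camera-side ray angles via the law of sines; the baseline and angle values are invented for the example.

```python
import math

def triangulated_depth(baseline_m, proj_angle_deg, cam_angle_deg):
    """Perpendicular distance from the projector-camera baseline D to a point,
    given the two ray angles measured from the baseline (law of sines)."""
    a = math.radians(proj_angle_deg)   # angle of the projected wavelength's ray
    b = math.radians(cam_angle_deg)    # angle of the ray seen by the image sensor
    return baseline_m * math.sin(a) * math.sin(b) / math.sin(a + b)

# Invented numbers: 10 cm baseline, 80 deg projector ray, 75 deg camera ray.
print(round(triangulated_depth(0.10, 80.0, 75.0), 4))   # approx. 0.2251 m
```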
[00387] Systems that utilize this methodology require changing the projected pattern for each frame in the sequence. Projection systems have additional challenges in projecting the pattern evenly across a subject, especially one that may move along a Z axis, distance from the device, as the throw angle of the projection will alter the number of lines on the subject, or one where the subject may move allowing for only a portion or disproportionate spacing of the pattern to be projected onto the subject.
[00388] Directed illumination as described here, controls the illumination of an area at a pixel level. The system has the ability to control amplitude of each pixel from zero, or off, to a maximum level. An image captured by a sensor can be analyzed by system or a human observer to determine an area or areas of interest. Using triangulation or other method of determining the distance to the object from the device, the system determines the boundaries of the subject to be illuminated with a pattern and with the known distance, applies a calibration algorithm to the projected pattern and by controlling which projected pixels are turned on during one frame, optimize the pattern illumination.
[00389] Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration. In these instances the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest. The system has the ability to analyze each segment independently. To form a cohesive 3D map of independent segments, the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image. As the subjects move, the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
[00390] One embodiment example of this is a continuously varying code, 5124, which can be formed utilizing three additive wavelengths, oftentimes the primary color channels of RGB or unique wavelengths of IR/NIR, such that when added together they form a continuously varying pattern. The interpretation of the captured image is similar to that described in 5122.
[00391] Systems that utilize this methodology require changing the projected pattern for each frame in the sequence. Projection systems have additional challenges in projecting the pattern evenly across a subject, especially one that may move along a Z axis, distance from the device, as the throw angle of the projection will alter the number of lines on the subject, or one where the subject may move allowing for only a portion or disproportionate spacing of the pattern to be projected onto the subject.
[00392] Directed illumination as described here, controls the illumination of an area at a pixel level. The system has the ability to control amplitude of each pixel from zero, or off, to a maximum level. An image captured by a sensor can be analyzed by system or a human observer to determine an area or areas of interest. Using triangulation or other method of determining the distance to the object from the device, the system determines the boundaries of the subject to be illuminated with a pattern and with the known distance, applies a calibration algorithm to the projected pattern and by controlling which projected pixels are turned on during one frame, optimize the pattern illumination.
[00393] Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration. In these instances the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest. The system has the ability to analyze each segment independently. To form a cohesive 3D map of independent segments, the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image. As the subjects move, the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
[00394] One embodiment example of this is striped indexing, 5126, which utilizes multiple wavelengths selected far enough apart to prevent cross-talk noise at the imaging sensor. The wavelengths may be in the visible spectrum, generated by the combination of primary additive color sources such as RGB, or in a range of IR/NIR. Stripes may be replaced with patterns to enhance the resolution of the image capture. The interpretation of the captured image is similar to that described in 5122.

[00395] Systems that utilize this methodology require changing the projected pattern for each frame in the sequence. Projection systems have additional challenges in projecting the pattern evenly across a subject, especially one that may move along a Z axis, distance from the device, as the throw angle of the projection will alter the number of lines on the subject, or one where the subject may move allowing for only a portion or disproportionate spacing of the pattern to be projected onto the subject.
[00396] Directed illumination as described here, controls the illumination of an area at a pixel level. The system has the ability to control amplitude of each pixel from zero, or off, to a maximum level. An image captured by a sensor can be analyzed by system or a human observer to determine an area or areas of interest. Using triangulation or other method of determining the distance to the object from the device, the system determines the boundaries of the subject to be illuminated with a pattern and with the known distance, applies a calibration algorithm to the projected pattern and by controlling which projected pixels are turned on during one frame, optimize the pattern illumination.
[00397] Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration. In these instances the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest. The system has the ability to analyze each segment independently. To form a cohesive 3D map of independent segments, the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image. As the subjects move, the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
[00398] One embodiment example of this is the use of segmented stripes, 5128, where, to provide additional information about a pattern, a code is introduced within a stripe. This creates a unique pattern for each line and, when known by the system, can allow one stripe to be easily identified from another.
[00399] Systems that utilize this methodology require changing the projected pattern for each frame in the sequence. Projection systems have additional challenges in projecting the pattern evenly across a subject, especially one that may move along a Z axis, distance from the device, as the throw angle of the projection will alter the number of lines on the subject, or one where the subject may move allowing for only a portion or disproportionate spacing of the pattern to be projected onto the subject.
[00400] Directed illumination as described here, controls the illumination of an area at a pixel level. The system has the ability to control amplitude of each pixel from zero, or off, to a maximum level. An image captured by a sensor can be analyzed by system or a human observer to determine an area or areas of interest. Using triangulation or other method of determining the distance to the object from the device, the system determines the boundaries of the subject to be illuminated with a pattern and with the known distance, applies a calibration algorithm to the projected pattern and by controlling which projected pixels are turned on during one frame, optimize the pattern illumination.
[00401] Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration. In these instances the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest. The system has the ability to analyze each segment independently. To form a cohesive 3D map of independent segments, the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image. As the subjects move, the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
[00402] One embodiment example of this is stripe indexing gray scale, 5130, where amplitude modulation provides for control of the intensity, so stripes can be given gray-scale values. In a simple example, a 3-level sequence can be black, gray, and white. The gray stripes can be created by setting the level of each projected pixel at some value between 0 and the maximum. In a non-amplitude-modulated system, the gray can be generated by a pattern of on/off pixels producing an average illumination of a stripe equivalent to a level of gray, or by reducing the on time of the pixel such that during one frame of exposure of an imaging device the on time is a fraction of the full exposure. In such an example the charge level of the imaged pixels is proportionally less than that of full on and greater than off. An example of a pattern sequence is depicted below, where B represents black, W represents white, and G represents gray. The sequence does not necessarily repeat, as long as no two identical values appear next to each other.
BWGWBGWGBGWBGBWBGW.

[00403] Systems that utilize this methodology require changing the projected pattern for each frame in the sequence. Projection systems have additional challenges in projecting the pattern evenly across a subject, especially one that may move along a Z axis, distance from the device, as the throw angle of the projection will alter the number of lines on the subject, or one where the subject may move allowing for only a portion or disproportionate spacing of the pattern to be projected onto the subject.
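A small sketch of how a stripe sequence with the property described in [00402] (no value repeated next to itself) might be generated and checked; the three-symbol alphabet and the seed are arbitrary choices for the example.

```python
import random

def stripe_sequence(length, values=("B", "G", "W"), seed=0):
    """Stripe sequence in which no value appears next to itself, in the spirit of
    the B/W/G example in [00402]."""
    rng = random.Random(seed)
    seq = [rng.choice(values)]
    while len(seq) < length:
        seq.append(rng.choice([v for v in values if v != seq[-1]]))
    return "".join(seq)

def no_adjacent_repeats(seq):
    return all(a != b for a, b in zip(seq, seq[1:]))

print(stripe_sequence(18), no_adjacent_repeats("BWGWBGWGBGWBGBWBGW"))   # ..., True
```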
[00404] Directed illumination as described here, controls the illumination of an area at a pixel level. The system has the ability to control amplitude of each pixel from zero, or off, to a maximum level. An image captured by a sensor can be analyzed by system or a human observer to determine an area or areas of interest. Using triangulation or other method of determining the distance to the object from the device, the system determines the boundaries of the subject to be illuminated with a pattern and with the known distance, applies a calibration algorithm to the projected pattern and by controlling which projected pixels are turned on during one frame, optimize the pattern illumination.
[00405] Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration. In these instances the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest. The system has the ability to analyze each segment independently. To form a cohesive 3D map of independent segments, the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image. As the subjects move, the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
[00406] One embodiment example of this is a De Bruijn sequence, 5132, which refers to a cyclic sequence of patterns where no pattern of elements repeats during the cycle in either an upward or downward progression through the cycle. In this example, a three-element pattern where each element has only 2 values, 1 or 0, generates a cyclic sequence containing 2^3 = 8 unique patterns (000, 001, 010, 011, 100, 101, 110, 111). These sequences generate a pattern where no variation is adjacent to a similar pattern. The decoding of a De Bruijn sequence requires less computational work than other similar patterns. The variation in the pattern may be color/wavelength, width, or a combination of width and color/wavelength.

[00407] Systems that utilize this methodology require changing the projected pattern for each frame in the sequence. Projection systems have additional challenges in projecting the pattern evenly across a subject, especially one that may move along a Z axis, distance from the device, as the throw angle of the projection will alter the number of lines on the subject, or one where the subject may move allowing for only a portion or disproportionate spacing of the pattern to be projected onto the subject.
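For reference, a standard way to construct such a sequence is the Lyndon-word (FKM) algorithm sketched below; this is a general textbook construction, not a method defined by this disclosure.

```python
def de_bruijn(k, n):
    """De Bruijn sequence B(k, n): a cyclic sequence over k symbols in which every
    length-n word appears exactly once (standard Lyndon-word / FKM construction)."""
    a = [0] * k * n
    sequence = []

    def db(t, p):
        if t > n:
            if n % p == 0:
                sequence.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return sequence

print("".join(map(str, de_bruijn(2, 3))))   # 00010111: each 3-bit word occurs once per cycle
```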
[00408] Directed illumination as described here, controls the illumination of an area at a pixel level. The system has the ability to control amplitude of each pixel from zero, or off, to a maximum level. An image captured by a sensor can be analyzed by system or a human observer to determine an area or areas of interest. Using triangulation or other method of determining the distance to the object from the device, the system determines the boundaries of the subject to be illuminated with a pattern and with the known distance, applies a calibration algorithm to the projected pattern and by controlling which projected pixels are turned on during one frame, optimize the pattern illumination.
[00409] Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration. In these instances the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest. The system has the ability to analyze each segment independently. To form a cohesive 3D map of independent segments, the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image. As the subjects move, the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
[00410] One embodiment example of this is pseudo-random binary, 5134, which utilizes a 2D grid pattern that segments the projected area into smaller areas, with a unique pattern projected into each sub-area such that one area is identifiable from adjacent segments. Pseudo-random binary arrays utilize a mathematical algorithm to generate a pseudo-random pattern of points which can be projected onto each segment.
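A minimal sketch of the idea, assuming a NumPy environment: each grid cell is filled from its own seeded pseudo-random generator so that neighbouring cells carry distinguishable dot arrangements. The cell size, dot density, and seeding scheme are illustrative choices, not the algorithm contemplated by the disclosure.

```python
import numpy as np

def pseudo_random_grid(rows, cols, cell=8, density=0.3, seed=42):
    """Binary dot pattern built cell by cell; each grid cell is filled from its own
    seeded generator so neighbouring cells carry distinguishable arrangements."""
    pattern = np.zeros((rows * cell, cols * cell), dtype=np.uint8)
    for r in range(rows):
        for c in range(cols):
            rng = np.random.default_rng(seed + r * cols + c)   # unique seed per cell
            block = (rng.random((cell, cell)) < density).astype(np.uint8)
            pattern[r * cell:(r + 1) * cell, c * cell:(c + 1) * cell] = block
    return pattern

p = pseudo_random_grid(4, 6)
print(p.shape, int(p.sum()))   # (32, 48) grid of dots, roughly 30% of them on
```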
[00411] Systems that utilize this methodology require changing the projected pattern for each frame in the sequence. Projection systems have additional challenges in projecting the pattern evenly across a subject, especially one that may move along a Z axis, distance from the device, as the throw angle of the projection will alter the number of lines on the subject, or one where the subject may move allowing for only a portion or disproportionate spacing of the pattern to be projected onto the subject.
[00412] Directed illumination as described here, controls the illumination of an area at a pixel level. The system has the ability to control amplitude of each pixel from zero, or off, to a maximum level. An image captured by a sensor can be analyzed by system or a human observer to determine an area or areas of interest. Using triangulation or other method of determining the distance to the object from the device, the system determines the boundaries of the subject to be illuminated with a pattern and with the known distance, applies a calibration algorithm to the projected pattern and by controlling which projected pixels are turned on during one frame, optimize the pattern illumination.
[00413] Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration. In these instances the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest. The system has the ability to analyze each segment independently. To form a cohesive 3D map of independent segments, the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image. As the subjects move, the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
[00414] One embodiment example of this is similar to the methodology described in 5134, where the binary points can be replaced by a point made up of multiple values generating a mini-pattern or code word, 5136. Each projected mini-pattern or code word creates a unique point identifier in each grid segment.
[00415] Systems that utilize this methodology require changing the projected pattern for each frame in the sequence. Projection systems have additional challenges in projecting the pattern evenly across a subject, especially one that may move along a Z axis, distance from the device, as the throw angle of the projection will alter the number of lines on the subject, or one where the subject may move allowing for only a portion or disproportionate spacing of the pattern to be projected onto the subject.
[00416] Directed illumination as described here, controls the illumination of an area at a pixel level. The system has the ability to control amplitude of each pixel from zero, or off, to a maximum level. An image captured by a sensor can be analyzed by system or a human observer to determine an area or areas of interest. Using triangulation or other method of determining the distance to the object from the device, the system determines the boundaries of the subject to be illuminated with a pattern and with the known distance, applies a calibration algorithm to the projected pattern and by controlling which projected pixels are turned on during one frame, optimize the pattern illumination.
[00417] Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration. In these instances the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest. The system has the ability to analyze each segment independently. To form a cohesive 3D map of independent segments, the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image. As the subjects move, the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
[00418] One embodiment example of this is a color/wavelength coded grid, 5138. In some instances it may be beneficial to have grid lines with alternating colors/wavelengths.
[00419] Systems that utilize this methodology require changing the projected pattern for each frame in the sequence. Projection systems have additional challenges in projecting the pattern evenly across a subject, especially one that may move along a Z axis, distance from the device, as the throw angle of the projection will alter the number of lines on the subject, or one where the subject may move allowing for only a portion or disproportionate spacing of the pattern to be projected onto the subject.
[00420] Directed illumination as described here, controls the illumination of an area at a pixel level. The system has the ability to control amplitude of each pixel from zero, or off, to a maximum level. An image captured by a sensor can be analyzed by system or a human observer to determine an area or areas of interest. Using triangulation or other method of determining the distance to the object from the device, the system determines the boundaries of the subject to be illuminated with a pattern and with the known distance, applies a calibration algorithm to the projected pattern and by controlling which projected pixels are turned on during one frame, optimize the pattern illumination.
[00421] Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration. In these instances the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest. The system has the ability to analyze each segment independently. To form a cohesive 3D map of independent segments, the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image. As the subjects move, the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
[00422] One embodiment example of this is a color/wavelength dot array, 5140 where unique wavelengths are assigned to points within each segment. In this example visible colors of R red, G green, and B blue are used. These could also be unique wavelengths of IR/NIR spaced far enough apart such as to minimize the cross talk that might occur on the image sensor.
[00423] Systems that utilize this methodology require changing the projected pattern for each frame in the sequence. Projection systems have additional challenges in projecting the pattern evenly across a subject, especially one that may move along a Z axis, distance from the device, as the throw angle of the projection will alter the number of lines on the subject, or one where the subject may move allowing for only a portion or disproportionate spacing of the pattern to be projected onto the subject.
[00424] Directed illumination as described here, controls the illumination of an area at a pixel level. The system has the ability to control amplitude of each pixel from zero, or off, to a maximum level. An image captured by a sensor can be analyzed by system or a human observer to determine an area or areas of interest. Using triangulation or other method of determining the distance to the object from the device, the system determines the boundaries of the subject to be illuminated with a pattern and with the known distance, applies a calibration algorithm to the projected pattern and by controlling which projected pixels are turned on during one frame, optimize the pattern illumination.
[00425] Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration. In these instances the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest. The system has the ability to analyze each segment independently. To form a cohesive 3D map of independent segments, the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image. As the subjects move, the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
[00426] One embodiment of this is the ability of the system to combine multiple methods into hybrid methods, 5142. The system determines areas of interest and segments the area. The system can then determine which method, or combination/hybrid of methods, is best suited for the given subject. Distance information can be used to calibrate the pattern for the object. The result is a segmented projected pattern where a specific pattern or hybrid pattern is calibrated to optimize data about each subject area. Factors influencing the patterns selected may include, but are not limited to, whether the subject is living, inanimate, moving, or stationary, its relative distance from the device, general lighting, and environmental conditions. The system processes each segment as a unique depth map or point cloud. The system can further recombine the segmented pieces to form a more complete map of the viewed area.
[00427] Systems that utilize this methodology require changing the projected pattern for each frame in the sequence. Projection systems have additional challenges in projecting the pattern evenly across a subject, especially one that may move along a Z axis, distance from the device, as the throw angle of the projection will alter the number of lines on the subject, or one where the subject may move allowing for only a portion or disproportionate spacing of the pattern to be projected onto the subject.
[00428] Directed illumination as described here, controls the illumination of an area at a pixel level. The system has the ability to control amplitude of each pixel from zero, or off, to a maximum level. An image captured by a sensor can be analyzed by system or a human observer to determine an area or areas of interest. Using triangulation or other method of determining the distance to the object from the device, the system determines the boundaries of the subject to be illuminated with a pattern and with the known distance, applies a calibration algorithm to the projected pattern and by controlling which projected pixels are turned on during one frame, optimize the pattern illumination. Further, in this example any number of separate patterns may be projected; by controlling the projected pixels, the software can change the projected pattern each frame or frames as required.

[00429] Subject areas of interest may be located at distances from the device which are not optimal for one level of calibration. In these instances the system is configured such that the projection area is segmented and uniquely calibrated projected patterns are generated for each area of interest. The system has the ability to analyze each segment independently. To form a cohesive 3D map of independent segments, the information used to segment the areas of interest is utilized to reassemble or stitch the depth map for each segment into one image. As the subjects move, the system is configured to reanalyze the environment and dynamically calibrate the projected patterns accordingly.
Shared Aperture
[00430] Some embodiments include features for directing light onto specific target area(s) and for image capture when used in a closed or open loop system. Such an example embodiment may include the use of a shared optical aperture for both the directed illumination and the image sensor to help achieve matched throw angles and FOV angles.
[00431] For example, there may generally be three basic methodologies for optically configuring a shared aperture: adjacent, common, and objective, with variations on these basic configurations to best fit the overall system design objectives.
[00432] Certain embodiments may include a device for directing illumination and an image sensor that share the same aperture and for some portion of the optical path have comingled light paths. In such an example, at some point the path may split, thus allowing the incoming light to be directed to an image sensor. Continuing with this example, the outgoing light path may exit through the same aperture as the incoming light. Such an example embodiment may provide an optical system where the throw angle of the directed illumination and the FOV angle of the incoming light are matched. This may create a physically calibrated incoming and outgoing optical path. This may also create a system which requires only one optical opening in a device.
[00433] Figure 52A illustrates an adjacent configuration example where the outgoing and incoming light paths share the same aperture but are not comingled paths. In this example, light from a semiconductor laser or other light emitting device 5212 is directed by an optical element (not pictured) to a 2D MEMs (not pictured) or other mechanism for directing the beam. The outgoing light 5220 is then reflected off of a prism 5218 through the shared aperture (not pictured). Incoming light 5228 is reflected off of the same prism 5218 through a lens (not pictured) and onto an image sensor 5226. In this configuration example, the prism 5218 can be replaced by two mirrors that occupy the same relative surface (not pictured). This example configuration assumes that some degree of misalignment between the image and the directed illumination may be corrected for by other means such as system calibration algorithms.
[00434] Figure 52B illustrates an example embodiment of one example configuration where the outgoing and incoming light paths share the same aperture and, for some portion, the optical path is comingled. In this example, light from a semiconductor laser or other light emitting device 5234 is directed by an optical element (not pictured) to a 2D MEMs (not pictured), for example, but any other mechanism could be used to direct the beam. The outgoing light 5240 passes through a polarized element 5238 and continues through the shared aperture (not pictured). Incoming light 5242 enters the shared aperture and is reflected off of the polarized element 5238 onto an image sensor 5246. This provides a simple way to achieve coincident apertures.
[00435] Figure 52C illustrates an example embodiment of an example configuration where outgoing and incoming light paths share the same common objective lens and for some portion the optical path is comingled. In this example, light from a semiconductor laser or other light emitting device 5252, for example, is directed by an optical element (not pictured) to a 2D MEMs (not pictured) or other mechanism for directing the beam. The outgoing light 5272 passes through lens 5258 to a scan format lens 5260 which creates a focused spot that maps the directed illumination to the same dimensions as the image sensor active area. The outgoing light then passes through optical element 5262, through a polarized element 5264 and exits through common objective lens 5266. In the example embodiment, incoming light enters through the common objective lens 5266 and is reflected off of the polarized element 5264 and onto the image sensor 5270.
[00436] Certain example embodiments may allow for a secondary source of illumination such as a visible light projector to be incorporated into the optical path of the directed illumination device. And certain example embodiments may allow for a secondary image sensor, enabling as an example one image sensor designed for visible light and one designed for IR/NIR to share the same optical path.

Conclusion
[00437] It should be noted that in this disclosure, the notion of "black and white" is in reference to the IR gray scale and is for purposes of human understanding only. One of skill in the art understands that in dealing with IR and IR outputs, a real "black and white" as the human eye perceives it may not exist. Instead, for IR, black is actually the absence of illumination, or binary "off." White, in additive illumination, is the full spectrum of visible light (400-700nm) combined; for IR (700-1000nm), "white" does not mean anything in itself but is relative to the binary "on."
[00438] The inventive aspects here have mainly been described above with reference to a few embodiments. However, as is readily appreciated by a person skilled in the art, other embodiments than the ones disclosed above are equally possible within the scope of the invention.
[00439] The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated.
[00440] As disclosed herein, features consistent with the present inventions may be implemented via computer-hardware, software and/or firmware. For example, the systems and methods disclosed herein may be embodied in various forms including, for example, a data processor, such as a computer that also includes a database, digital electronic circuitry, firmware, software, computer networks, servers, or in combinations of them. Further, while some of the disclosed implementations describe specific hardware components, systems and methods consistent with the innovations herein may be implemented with any combination of hardware, software and/or firmware. Moreover, the above-noted features and other aspects and principles of the innovations herein may be implemented in various environments. Such environments and related applications may be specially constructed for performing the various routines, processes and/or operations according to the invention or they may include a general-purpose computer or computing platform selectively activated or reconfigured by code to provide the necessary functionality. The processes disclosed herein are not inherently related to any particular computer, network, architecture, environment, or other apparatus, and may be implemented by a suitable combination of hardware, software, and/or firmware. For example, various general-purpose machines may be used with programs written in accordance with teachings of the invention, or it may be more convenient to construct a specialized apparatus or system to perform the required methods and techniques.
[00441] Aspects of the method and system described herein, such as the logic, may be implemented as functionality programmed into any of a variety of circuitry, including programmable logic devices ("PLDs"), such as field programmable gate arrays ("FPGAs"), programmable array logic ("PAL") devices, electrically programmable logic and memory devices and standard cell-based devices, as well as application specific integrated circuits. Some other possibilities for implementing aspects include: memory devices, microcontrollers with memory (such as EEPROM), embedded microprocessors, firmware, software, etc. Furthermore, aspects may be embodied in microprocessors having software-based circuit emulation, discrete logic (sequential and combinatorial), custom devices, fuzzy (neural) logic, quantum devices, and hybrids of any of the above device types. The underlying device technologies may be provided in a variety of component types, e.g., metal-oxide semiconductor field-effect transistor
("MOSFET") technologies like complementary metal-oxide semiconductor ("CMOS"), bipolar technologies like emitter-coupled logic ("ECL"), polymer technologies (e.g., silicon-conjugated polymer and metal-conjugated polymer-metal structures), mixed analog and digital, and so on.
[00442] It should also be noted that the various logic and/or functions disclosed herein may be enabled using any number of combinations of hardware, firmware, and/or as data and/or instructions embodied in various machine-readable or computer-readable media, in terms of their behavioral, register transfer, logic component, and/or other characteristics. Computer-readable media in which such formatted data and/or instructions may be embodied include, but are not limited to, non-volatile storage media in various forms (e.g., optical, magnetic or semiconductor storage media) and carrier waves that may be used to transfer such formatted data and/or instructions through wireless, optical, or wired signaling media or any combination thereof. Examples of transfers of such formatted data and/or instructions by carrier waves include, but are not limited to, transfers (uploads, downloads, e-mail, etc.) over the Internet and/or other computer networks via one or more data transfer protocols (e.g., HTTP, FTP, SMTP, and so on).

[00443] Unless the context clearly requires otherwise, throughout the description and the claims, the words "comprise," "comprising," and the like are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense; that is to say, in a sense of "including, but not limited to." Words using the singular or plural number also include the plural or singular number respectively. Additionally, the words "herein," "hereunder," "above," "below," and words of similar import refer to this application as a whole and not to any particular portions of this application. When the word "or" is used in reference to a list of two or more items, that word covers all of the following interpretations of the word: any of the items in the list, all of the items in the list and any combination of the items in the list.
[00444] Although certain presently preferred implementations of the invention have been specifically described herein, it will be apparent to those skilled in the art to which the invention pertains that variations and modifications of the various implementations shown and described herein may be made without departing from the spirit and scope of the invention. Accordingly, it is intended that the invention be limited only to the extent required by the applicable rules of law.
[00445] The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated.

Claims

CLAIMS
What is claimed is:
1. A system for target illumination and mapping, comprising,
a light source and an image sensor;
the light source configured to,
communicate with a processor;
scan a target area within a field of view;
receive direction from the processor regarding projecting light within the field of view on at least one target;
the image sensor configured to,
communicate with the processor;
receive reflected illumination from the target area within the field of view; generate data regarding the received reflected illumination; and send the data regarding the received reflected illumination to the processor.
2. The system of claim 1 wherein the light source is an array of light emitting diodes (LEDs).
3. The system of claim 1 wherein the light source is a laser, wherein the laser is at least one of, amplitude modulated and pulse width modulated.
4. The system of claim 3 wherein the laser is an infrared laser and the image sensor is configured to receive and process infrared energy.
5. The system of claim 4 wherein the direction received from the processor includes direction to track the at least one target.
6. The system of claim 5 wherein the data regarding the received reflected illumination includes information that would allow the processor to determine the distance from the system to the select target via triangulation.
7. The system of claim 5 wherein the light source is further configured to receive direction from the processor to illuminate the tracked target in motion.
8. The system of claim 7 wherein the light source is further configured to block illumination of particular areas on the at least one select target via direction from the processor.
9. The system of claim 8 wherein the target is a human; and
wherein the particular areas on the at least one select target are areas which correspond to eyes of the target.
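By way of illustration only, the following sketch shows one way a processor might implement the eye-protection behavior of claims 8 and 9: scan points that fall inside detected eye regions are blanked so the steered light source skips them. The region format, guard-band margin, and function names are assumptions for illustration, not taken from the specification.

```python
# Illustrative sketch (not from the specification): blanking scan points that fall
# inside detected eye regions so the steered light source skips them.
# The EyeRegion format and its margin are assumptions for illustration only.

from dataclasses import dataclass

@dataclass
class EyeRegion:
    x_min: float  # scan-field coordinates, same units as the scan grid
    y_min: float
    x_max: float
    y_max: float
    margin: float = 0.02  # extra guard band around each eye

def laser_enable_mask(scan_points, eye_regions):
    """Return a list of booleans: True = laser may fire at this scan point,
    False = point lies inside (or near) an eye region and must be blanked."""
    mask = []
    for (x, y) in scan_points:
        blocked = any(
            (r.x_min - r.margin) <= x <= (r.x_max + r.margin)
            and (r.y_min - r.margin) <= y <= (r.y_max + r.margin)
            for r in eye_regions
        )
        mask.append(not blocked)
    return mask

if __name__ == "__main__":
    # 5x5 toy raster over a unit field of view with one detected eye region.
    points = [(x / 4, y / 4) for y in range(5) for x in range(5)]
    eyes = [EyeRegion(0.4, 0.4, 0.6, 0.6)]
    mask = laser_enable_mask(points, eyes)
    print(f"{mask.count(False)} of {len(points)} scan points blanked")
```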
10. The system of claim 4 wherein the scan of the target area is a raster scan.
11. The system of claim 10 wherein the raster scan is completed within one frame of the image sensor.
12. The system of claim 10 wherein the light source includes at least one of, a single axis micro electromechanical system mirror (MEMS) and a dual axis MEMS, to direct the light.
13. The system of claim 10 wherein the light source includes at least one rotating mirror.
14. The system of claim 7 wherein the tracking of the selected target includes more than one selected target.
15. The system of claim 4 wherein the image sensor is further configured to generate gray shade image data based on the received infrared illumination; and
assign visible colors to gray shades of the image data.
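As a non-limiting sketch of the gray-shade-to-color assignment in claims 15 and 44, the snippet below scales a raw infrared reading to an 8-bit gray shade and assigns it a visible false color. The three-segment blue-to-green-to-red palette and the 10-bit sensor range are arbitrary choices, not values from the specification.

```python
# Illustrative sketch: assigning visible colors to gray shades derived from
# received infrared intensity. The three-segment palette below is arbitrary;
# the specification does not prescribe a particular mapping.

def ir_to_gray(ir_value, ir_max):
    """Scale a raw IR intensity reading to an 8-bit gray shade."""
    return max(0, min(255, int(round(255 * ir_value / ir_max))))

def gray_to_false_color(gray):
    """Map an 8-bit gray shade to an (R, G, B) tuple via a blue->green->red ramp."""
    if gray < 128:                      # dark: blue fading into green
        t = gray / 127
        return (0, int(255 * t), int(255 * (1 - t)))
    t = (gray - 128) / 127              # bright: green fading into red
    return (int(255 * t), int(255 * (1 - t)), 0)

if __name__ == "__main__":
    for raw in (10, 500, 900, 1023):    # e.g. 10-bit sensor readings
        g = ir_to_gray(raw, ir_max=1023)
        print(raw, "->", g, "->", gray_to_false_color(g))
```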
16. The system of claim 1 wherein the image sensor is a complementary metal oxide semiconductor (CMOS).
17. The system of claim 1 wherein the image sensor is a charge coupled device (CCD).
18. The system of claim 1 wherein the light source and the image sensor include optical filters.
19. The system of claim 12 wherein the light source is a laser.
20. A system for illuminating a target area, comprising,
a directionally controlled laser light source, and an image sensor;
the directionally controlled laser light source configured to,
communicate with a processor;
scan the target area,
receive direction on illuminating specific selected targets within the target area from the processor,
wherein the laser is at least one of, amplitude modulated and pulse width modulated; and
the image sensor configured to,
communicate with the processor;
receive the laser light reflected off of the target area;
generate data regarding the received reflected laser light; and send the data regarding the received laser light to the processor.
21. The system of claim 20 wherein the laser light source is further configured to receive direction from the processor to illuminate at least two target objects with different illumination patterns.
22. The system of claim 20 wherein the data regarding the received reflected laser light is configured to allow the processor to calculate a depth map.
23. The system of claim 20 wherein the image sensor is a complementary metal oxide semiconductor (CMOS).
24. The system of claim 20 wherein the image sensor is a charge coupled device (CCD).
25. The system of claim 20 wherein the light source and the image sensor include optical filters.
26. The system of claim 22 wherein the data regarding the received reflected laser light is configured to allow the processor to calculate a point cloud.
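As a non-limiting sketch of the depth-map and point-cloud calculations referenced in claims 22 and 26, the snippet below back-projects a per-pixel depth map through an assumed pinhole camera model. The focal lengths and principal point are placeholder values; the claims do not prescribe a camera model.

```python
# Illustrative sketch: turning a per-pixel depth map into a point cloud by
# back-projecting through an assumed pinhole camera model. The intrinsics
# below are placeholders, not values taken from the specification.

def depth_map_to_point_cloud(depth, fx, fy, cx, cy):
    """depth: 2-D list of distances in meters (0 = no return).
    Returns a list of (X, Y, Z) points in the camera frame."""
    cloud = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z <= 0:
                continue  # skip pixels with no valid depth
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            cloud.append((x, y, z))
    return cloud

if __name__ == "__main__":
    # 4x4 toy depth map, 2 m everywhere except one missing pixel.
    toy_depth = [[2.0] * 4 for _ in range(4)]
    toy_depth[1][2] = 0.0
    pts = depth_map_to_point_cloud(toy_depth, fx=500.0, fy=500.0, cx=2.0, cy=2.0)
    print(len(pts), "points; first:", pts[0])
```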
27. The system of claim 20 wherein the directional control is via at least one of a single axis micro electromechanical system mirror (MEMS) and a dual axis MEMS.
28. The system of claim 20 wherein the directional control is via at least one rotating mirror.
29. The system of claim 20 wherein the laser is a continuous wave laser; and the laser light source is further configured to receive direction to send a pulse of energy to a unique part of the target area, creating pixels for the image sensor.
30. A method for target illumination and mapping, comprising,
via a light source,
communicating with a processor;
scanning a target area within a field of view;
receiving direction from the processor regarding projecting light within the field of view on at least one target;
via an image sensor,
communicating with the processor;
receiving reflected illumination from the target area within the field of view; generating data regarding the received reflected illumination; and sending the data regarding the received reflected illumination to the processor.
31. The method of claim 30 wherein the light source is an array of light emitting diodes (LEDs).
32. The method of claim 30 wherein the light source is a laser, wherein the laser is at least one of, amplitude modulated and pulse width modulated.
33. The method of claim 32 wherein the laser is an infrared laser and the image sensor is configured to receive and process infrared energy.
34. The method of claim 33 wherein the direction received from the processor includes direction to track the at least one target.
35. The method of claim 34 wherein the data regarding the received reflected illumination includes information that would allow the processor to determine the distance from the system to the select target via triangulation.
36. The method of claim 34 further comprising, via the light source,
receiving direction from the processor to illuminate the tracked target in motion.
37. The method of claim 36 further comprising, via the light source,
blocking illumination of particular areas on the at least one select target via direction from the processor.
38. The method of claim 37 wherein the target is a human; and
wherein the particular areas on the at least one select target are areas which correspond to eyes of the target.
39. The method of claim 33 wherein the scan of the target area is a raster scan.
40. The method of claim 39 wherein the raster scan is completed within one frame of the image sensor.
41. The method of claim 39 wherein the light source includes at least one of, a single axis micro electromechanical system mirror (MEMS) and a dual axis MEMS, to direct the light.
42. The method of claim 39 wherein the light source includes at least one rotating mirror.
43. The method of claim 36 wherein the tracking of the selected target includes more than one selected target.
44. The method of claim 33 further comprising, via the image sensor,
generating gray shade image data based on the received infrared illumination; and assigning visible colors to gray shades of the image data.
45. The method of claim 30 wherein the image sensor is a complementary metal oxide semiconductor (CMOS).
46. The method of claim 30 wherein the image sensor is a charge coupled device (CCD).
47. The method of claim 30 wherein the light source and the image sensor include optical filters.
48. The method of claim 41 wherein the light source is a laser.
49. A method for illuminating a target area, comprising,
via a directionally controlled laser light source,
communicating with a processor;
scanning the target area,
receiving direction on illuminating specific selected targets within the target area from the processor,
wherein the laser is at least one of, amplitude modulated and pulse width modulated; and
via an image sensor,
communicating with the processor;
receiving the laser light reflected off of the target area;
generating data regarding the received reflected laser light; and sending the data regarding the received laser light to the processor.
50. The method of claim 49 further comprising, via the laser light source,
receiving direction from the processor to illuminate at least two target objects with different illumination patterns.
51. The method of claim 49 wherein the data regarding the received reflected laser light is configured to allow the processor to calculate a depth map.
52. The method of claim 49 wherein the image sensor is a complementary metal oxide semiconductor (CMOS).
53. The method of claim 49 wherein the image sensor is a charge coupled device (CCD).
54. The method of claim 49 wherein the light source and the image sensor include optical filters.
55. The method of claim 51 wherein the data regarding the received reflected laser light is configured to allow the processor to calculate a point cloud.
56. The method of claim 49 wherein the directional control is via at least one of a single axis micro electromechanical system mirror (MEMS) and a dual axis MEMS.
57. The method of claim 49 wherein the directional control is via at least one rotating mirror.
58. The method of claim 49 further comprising, via the laser light source,
receiving direction to send a pulse of energy to a unique part of the target area, creating pixels for the image sensor.
59. The method of claim 49 wherein the laser is a continuous wave laser.
60. A system for target area illumination, comprising, a directional illumination source and image sensor;
the directional illumination source configured to,
communicate with a processor;
receive direction to illuminate the target area from the processor; and project illumination on the target area;
wherein the laser is at least one of, amplitude modulated and pulse width modulated; and
the image sensor configured to,
communicate with the processor;
capture reflected illumination off of the target area;
generate data regarding the captured reflected illumination; and send the data regarding the captured reflected illumination to the processor; wherein the illumination source and the image sensor share an aperture and wherein a throw angle of the directed illumination and a field of view angle of the reflected captured illumination are matched.
61. The system of claim 60 wherein the laser is an infrared laser and the image sensor is configured to receive and process infrared energy.
62. The system of claim 61 wherein the laser includes at least one of a single axis micro electromechanical system mirror (MEMS) and a dual axis MEMS to direct the light.
63. The system of claim 62 wherein the data regarding the captured reflected illumination includes information regarding triangulation for distance measurements.
64. The system of claim 62 wherein the illumination source is further configured to receive instruction regarding motion tracking of the select target.
65. The system of claim 60 wherein the shared aperture is at least one of adjacent, common and objective.
66. A method for target area illumination, comprising,
via a directional illumination source,
communicating with a processor;
receiving direction to illuminate the target area from the processor; and projecting illumination on the target area;
wherein the laser is at least one of, amplitude modulated and pulse width modulated; and
via an image sensor,
communicating with the processor;
capturing reflected illumination off of the target area;
generating data regarding the captured reflected illumination; and sending the data regarding the captured reflected illumination to the processor; wherein the illumination source and the image sensor share an aperture and wherein a throw angle of the directed illumination and a field of view angle of the reflected captured illumination are matched.
67. The method of claim 66 wherein the laser is an infrared laser and the image sensor is configured to receive and process infrared energy; and
wherein the shared aperture is at least one of adjacent, common and objective.
68. The method of claim 67 wherein the laser includes at least one of a single axis micro electromechanical system mirror (MEMS) and a dual axis MEMS to direct the light.
69. The method of claim 68 wherein the data regarding the captured reflected illumination includes information regarding triangulation for distance measurements.
70. A system for illuminating a target area, comprising,
a light source and an image sensor;
the light source configured to,
communicate with a processor; illuminate a target area with at least one pattern of light, within a field of view;
receive direction to illuminate at least one select target within the target area from the processor; and
receive information regarding illuminating the at least one select target with at least one calibrated pattern of light, from the processor,
wherein the laser is at least one of, amplitude modulated and pulse width modulated; and
the image sensor configured to,
communicate with the processor;
receive reflected illumination patterns from the at least one select target within the field of view;
generate data regarding the received reflected illumination patterns; and send data about the received reflected illumination patterns to the processor,
wherein the data includes,
information allowing the processor to determine distance to the at least one select target via triangulation of the illumination and received reflected illumination; and
information regarding structured light of the at least one received reflected illumination patterns.
71. The system of claim 70 wherein the pattern is at least one of,
alternating illuminated and non-illuminated stripes, intensity modulated stripes, sequential sinusoidal, trapezoidal, Moiré pattern, multi-wavelength 3D, continuously varying, striped indexing, segmented stripes, coded stripes, indexing gray scale, De Bruijn sequence, pseudo-random binary, mini-pattern, wavelength coded grid, and wavelength dot array.
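For illustration only, the sketch below generates two of the listed pattern families, alternating illuminated and non-illuminated stripes and intensity-modulated (sinusoidal) stripes, as 2-D intensity grids a light source could project. The resolution, stripe period, and phase are arbitrary values chosen for the example.

```python
# Illustrative sketch: two of the listed structured-light pattern families as
# 2-D intensity grids. Resolution, stripe period, and phase are arbitrary.

import math

def binary_stripes(width, height, period):
    """Alternating illuminated / non-illuminated vertical stripes (values 0 or 1)."""
    return [[1 if (x // period) % 2 == 0 else 0 for x in range(width)]
            for _ in range(height)]

def sinusoidal_stripes(width, height, period, phase=0.0):
    """Intensity-modulated stripes: values in [0, 1] varying sinusoidally across x."""
    return [[0.5 + 0.5 * math.cos(2 * math.pi * x / period + phase)
             for x in range(width)]
            for _ in range(height)]

if __name__ == "__main__":
    on_off = binary_stripes(16, 4, period=4)
    sin_pat = sinusoidal_stripes(16, 4, period=8, phase=math.pi / 2)
    print(on_off[0])
    print([round(v, 2) for v in sin_pat[0]])
```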
72. The system of claim 71 wherein the light source is further configured to change illumination patterns.
73. The system of claim 70 wherein the light source is a laser.
74. The system of claim 71 wherein the direction to illuminate at least one select target, includes direction to track the motion of the at least one select target.
75. A system for allowing mapping of a target area, comprising,
a laser and an image sensor;
the laser configured to,
communicate with a processor;
receive direction to illuminate at least one select target with a pattern of light;
project illumination on the at least one select target with the pattern of light;
receive information regarding calibration of the pattern of light;
project calibrated illumination on the at least one select target;
the image sensor configured to,
communicate with the processor;
receive reflected laser illumination patterns from the at least one select target;
generate data regarding the received reflected laser illumination patterns; and
send the data regarding the received reflected laser illumination to the processor,
wherein the data includes information that would allow the processor to,
determine distance via triangulation;
generate a map of the target area via 3D surface measurements; and
generate a point cloud of the select target.
76. The system of claim 75 wherein the pattern is at least one of, alternating illuminated and non-illuminated stripes, intensity modulated stripes, sequential sinusoidal, trapezoidal, Moiré pattern, multi-wavelength 3D, continuously varying, striped indexing, segmented stripes, coded stripes, indexing gray scale, De Bruijn sequence, pseudo-random binary, mini-pattern, wavelength coded grid, and wavelength dot array.
77. The system of claim 76 wherein the light source is further configured to change illumination patterns.
78. The system of claim 76 wherein the laser is further configured to receive direction to track a motion of the selected target.
79. The system of claim 76 wherein the image sensor is at least one of complementary metal oxide semiconductor (CMOS) and charge coupled device (CCD).
80. A method for illuminating a target area, comprising,
via a light source,
communicating with a processor;
illuminating a target area with at least one pattern of light, within a field of view; receiving direction to illuminate at least one select target within the target area from the processor; and
receiving information regarding illuminating the at least one select target with at least one calibrated pattern of light, from the processor,
wherein the laser is at least one of, amplitude modulated and pulse width modulated; and
via an image sensor,
communicating with the processor;
receiving reflected illumination patterns from the at least one select target within the field of view;
generating data regarding the received reflected illumination patterns; and sending data about the received reflected illumination patterns to the processor, wherein the data includes, information allowing the processor to determine distance to the at least one select target via triangulation of the illumination and received reflected illumination; and
information regarding structured light of the at least one received reflected illumination patterns.
81. The method of claim 80 wherein the pattern is at least one of,
alternating illuminated and non-illuminated stripes, intensity modulated stripes, sequential sinusoidal, trapezoidal, Moiré pattern, multi-wavelength 3D, continuously varying, striped indexing, segmented stripes, coded stripes, indexing gray scale, De Bruijn sequence, pseudo-random binary, mini-pattern, wavelength coded grid, and wavelength dot array.
82. The method of claim 81 further comprising, via the light source,
projecting a new illumination pattern.
83. The method of claim 80 wherein the light source is a laser.
84. The method of claim 81 wherein the direction to illuminate at least one select target, includes direction to track the motion of the at least one select target.
85. A method for allowing mapping of a target area, comprising,
via a laser,
communicating with a processor;
receiving direction to illuminate at least one select target with a pattern of light; projecting illumination on the at least one select target with the pattern of light; receiving information regarding calibration of the pattern of light; projecting calibrated illumination on the at least one select target; via an image sensor,
communicating with the processor;
receiving reflected laser illumination patterns from the at least one select target; generating data regarding the received reflected laser illumination patterns; and sending the data regarding the received reflected laser illumination to the processor,
wherein the data includes information that would allow the processor to, determine distance via triangulation;
generate a map of the target area via 3D surface measurements; and
generate a point cloud of the select target.
86. The method of claim 85 wherein the pattern is at least one of,
alternating illuminated and non-illuminated stripes, intensity modulated stripes, sequential sinusoidal, trapezoidal, Moiré pattern, multi-wavelength 3D, continuously varying, striped indexing, segmented stripes, coded stripes, indexing gray scale, De Bruijn sequence, pseudo-random binary, mini-pattern, wavelength coded grid, and wavelength dot array.
87. The method of claim 86, further comprising, via the light source,
projecting a new illumination pattern.
88. The method of claim 86, further comprising, via the laser,
receiving direction to track a motion of the selected target.
89. The method of claim 86 wherein the image sensor is at least one of complementary metal oxide semiconductor (CMOS) and charge coupled device (CCD).
90. A system for target illumination and mapping, comprising,
an infrared light source and an image sensor;
the infrared light source configured to,
communicate with a processor;
illuminate a target area within a field of view;
receive direction from the processor, to illuminate at least one select target within the field of view;
project illumination on the at least one select target, wherein the laser is at least one of, amplitude modulated and pulse width modulated; and
the image sensor, having a dual band pass filter, configured to,
communicate with the processor;
receive reflected illumination from the target area within the field of view; receive reflected illumination from the at least one select target within the target area;
generate data regarding the received reflected illumination; and send the data to the processor.
91. The system of claim 90 wherein the dual band pass filter is configured to allow visible light and light at the wavelengths emitted by the infrared light source, to pass.
92. The system of claim 91 wherein the visible light wavelengths are between 400nm and 700nm.
93. The system of claim 91 wherein the dual band pass filter includes a notch filter.
94. The system of claim 90 wherein the image sensor is at least one of a complementary metal oxide semiconductor (CMOS) and a charge coupled device (CCD), and
wherein the infrared light source includes at least one of a single axis micro electromechanical system mirror (MEMS) and a dual axis MEMS to direct the light.
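As an illustrative sketch of the dual band pass filter recited in claims 90 through 93, the toy model below passes visible wavelengths (400 nm to 700 nm, per claim 92) together with a narrow band around the infrared source wavelength. The 850 nm center and 15 nm half-width are assumptions; the claims do not fix the infrared wavelength.

```python
# Illustrative sketch: a toy transmission model for a dual band-pass filter --
# pass visible light plus a narrow band around the infrared source wavelength.
# The 850 nm center and +/-15 nm width are assumed values for illustration.

def dual_band_pass(wavelength_nm, ir_center_nm=850.0, ir_half_width_nm=15.0):
    """Return True if the filter passes this wavelength."""
    visible = 400.0 <= wavelength_nm <= 700.0
    ir_band = abs(wavelength_nm - ir_center_nm) <= ir_half_width_nm
    return visible or ir_band

if __name__ == "__main__":
    for wl in (380, 550, 700, 780, 845, 850, 870, 940):
        print(f"{wl} nm -> {'pass' if dual_band_pass(wl) else 'block'}")
```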
95. A method for target illumination and mapping, comprising,
via an infrared light source,
communicating with a processor;
illuminating a target area within a field of view;
receiving direction from the processor, to illuminate at least one select target within the field of view;
projecting illumination on the at least one select target, wherein the laser is at least one of, amplitude modulated and pulse width modulated; and
via an image sensor, having a dual band pass filter,
communicating with the processor;
receiving reflected illumination from the target area within the field of view; receiving reflected illumination from the at least one select target within the target area;
generating data regarding the received reflected illumination; and sending the data to the processor.
96. The method of claim 95 wherein the dual band pass filter is configured to allow visible light and light at the wavelengths emitted by the infrared light source, to pass.
97. The method of claim 96 wherein the visible light wavelengths are between 400nm and 700nm.
98. The method of claim 96 wherein the dual band pass filter includes a notch filter.
99. The method of claim 95 wherein the image sensor is at least one of a complementary metal oxide semiconductor (CMOS) and a charge coupled device (CCD), and
wherein the infrared light source includes at least one of a single axis micro electromechanical system mirror (MEMS) and a dual axis MEMS to direct the light.
100. A system for target illumination and mapping, comprising,
a laser light source and an image sensor;
the laser light source configured to,
communicate with a processor;
project square wave illumination to at least one select target,
wherein the square wave includes at least a leading edge and a trailing edge; send information to the processor regarding the time the leading edge of the square wave illumination was projected and the time the trailing edge of the square wave was projected,
wherein the laser is at least one of, amplitude modulated and pulse width modulated; and
the image sensor configured to,
communicate with the processor;
receive at least one reflected square wave illumination from the at least one select target;
generate a signal based on the received reflected square wave illumination, wherein the signal includes at least information regarding the received time of the leading edge and received time of the trailing edge of the square wave; and send the signal regarding the received reflected square wave illumination to the processor.
101. The system of claim 100 wherein the laser light source is further configured to pulse, and wherein the square wave leading edge is caused by the laser pulse on and the trailing edge is caused by the laser pulse off.
102. The system of claim 100 wherein the laser light source is further configured to change polarization, and
wherein the square wave is caused by a change of polarization.
103. The system of claim 102 wherein the laser light source is further configured to switch gain in order to change polarization.
104. The system of claim 100 wherein the image sensor is a current assisted photon
demodulation (CAPD).
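By way of illustration of claims 100 and 105, the sketch below estimates range from the projected and received timestamps of a square wave's leading and trailing edges. Averaging the two edge delays is one plausible use of both timestamps; the claims only require that the timestamps be reported to the processor.

```python
# Illustrative sketch: estimating range from the projected and received times of a
# square wave's leading and trailing edges. Averaging the two edge delays is an
# assumed design choice, not a requirement stated in the claims.

C_M_PER_S = 299_792_458.0  # speed of light

def edge_time_of_flight_distance(tx_lead_s, tx_trail_s, rx_lead_s, rx_trail_s):
    """Return estimated one-way distance in meters from edge timestamps (seconds)."""
    lead_delay = rx_lead_s - tx_lead_s
    trail_delay = rx_trail_s - tx_trail_s
    round_trip = 0.5 * (lead_delay + trail_delay)  # average the two edge delays
    return 0.5 * round_trip * C_M_PER_S            # halve for the return path

if __name__ == "__main__":
    # A target roughly 3 m away delays both edges by about 20 ns.
    d = edge_time_of_flight_distance(0.0, 50e-9, 20.0e-9, 70.0e-9)
    print(f"estimated distance: {d:.3f} m")
```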
105. A method for target illumination and mapping, comprising,
via a laser light source, communicating with a processor;
projecting square wave illumination to at least one select target, wherein the square wave includes at least a leading edge and a trailing edge;
sending information to the processor regarding the time the leading edge of the square wave illumination was projected and the time the trailing edge of the square wave was projected,
wherein the laser is at least one of, amplitude modulated and pulse width modulated; and
via an image sensor,
communicating with the processor;
receiving at least one reflected square wave illumination from the at least one select target;
generating a signal based on the received reflected square wave illumination, wherein the signal includes at least information regarding the received time of the leading edge and received time of the trailing edge of the square wave; and
sending the signal regarding the received reflected square wave illumination to the processor.
106. The method of claim 105, further comprising, via the laser light source,
projecting a pulse of energy,
wherein the square wave leading edge is caused by the laser pulse on and the trailing edge is caused by the laser pulse off.
107. The method of claim 105, further comprising, via the laser light source,
projecting energy with a new polarization,
wherein the square wave is caused by a change of polarization.
108. The method of claim 107, further comprising, via the laser light source,
switching gain in order to change polarization.
109. The method of claim 105 wherein the image sensor is a current assisted photon demodulation (CAPD).
110. A system for target illumination and mapping, comprising,
an infrared laser light source and an image sensor;
the infrared laser light source configured to,
communicate with a processor;
illuminate at least one select target within a field of view,
wherein the laser is at least one of, amplitude modulated and pulse width modulated; and
the image sensor configured to,
communicate with the processor;
receive reflected illumination from the at least one select target within the field of view;
create a signal based on the received reflected illumination; and send the signal to the processor,
wherein the signal includes at least information that would allow the processor to map the target area and generate an image of the target area.
111. The system of claim 110 wherein the image is a gray scale image.
112. The system of claim 111 wherein the signal further includes information that would allow the processor to assign visible colors to the gray scale.
113. The system of claim 110 wherein the infrared laser light source is further configured to receive direction from the processor to illuminate a select target.
114. The system of claim 113 wherein the infrared laser light source is further configured to receive direction from the processor to track the motion of the select target and maintain illumination on the select target.
115. A method for target illumination and mapping, comprising,
via an infrared laser light source,
communicating with a processor;
illuminating at least one select target within a field of view,
wherein the laser is at least one of, amplitude modulated and pulse width modulated; and
via an image sensor,
communicating with the processor;
receiving reflected illumination from the at least one select target within the field of view;
creating a signal based on the received reflected illumination; and sending the signal to the processor,
wherein the signal includes at least information that would allow the processor to map the target area and generate an image of the target area.
116. The method of claim 115 wherein the image is a gray scale image.
117. The method of claim 116 wherein the signal further includes information that would allow the processor to assign visible colors to the gray scale.
118. The method of claim 115 wherein the infrared laser light source is further configured to receive direction from the processor to illuminate a select target.
119. The method of claim 118 wherein the infrared laser light source is further configured to receive direction from the processor to track the motion of the select target and maintain illumination on the select target.
120. A system for target illumination comprising,
an illumination device in communication with an image sensor;
the illumination device further configured to,
communicate with a processor; project low level full scan illumination to a target area,
wherein the laser is at least one of, amplitude modulated and pulse width modulated;
the image sensor further configured to,
communicate with the processor;
receive reflected illumination from the target area;
the processor configured to,
identify specific target areas of interest,
map the target area,
set a value of the number of image pulses for one scan,
calculate the energy intensity of each pulse,
calculate the total intensity per frame, and
compare the total intensity per frame to an eye safety limit;
the computing system further configured to,
direct the illumination device to scan if the total intensity per frame is less than the eye safety limit, and
direct the illumination device to stop scan if the total intensity per frame is greater than or equal to the eye safety limit.
121. The system of claim 120 wherein the processor is further configured to communicate to a user an error message if the total intensity per frame is greater than or equal to the eye safety limit.
122. The system of claim 120 wherein the processor is further configured to, if the total intensity per frame is greater than or equal to the eye safety limit,
map the target area;
set a new value of the number of image pulses for one scan;
calculate the energy intensity of each pulse;
calculate the total intensity per frame; and
compare the total intensity per frame to an eye safety limit.
123. The system of claim 120 wherein the computing system is further configured to track the specific target of interest and direct the illumination source to illuminate the specific area of interest.
124. The system of claim 120 wherein the illumination source includes a laser and a micro electromechanical system mirror (MEMS) to direct the light.
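As a non-limiting sketch of the eye-safety gate in claims 120 through 122, the snippet below computes the total optical energy delivered per frame from the pulse count and per-pulse energy, compares it to a safety limit, and either allows the scan or sets a reduced pulse count and re-checks. The halving strategy and the numeric values are assumptions for illustration.

```python
# Illustrative sketch of the eye-safety comparison: total energy per frame is the
# number of image pulses per scan times the energy of each pulse; scanning is
# allowed only while that total stays below the eye-safety limit. The halving
# retry and all numbers below are assumptions.

def frame_energy_j(pulses_per_scan, energy_per_pulse_j):
    """Total optical energy delivered to the scene in one frame (one scan)."""
    return pulses_per_scan * energy_per_pulse_j

def plan_scan(pulses_per_scan, energy_per_pulse_j, eye_safety_limit_j):
    """Return an allowed pulse count, lowering it until the frame energy is below
    the limit (the 'set a new value ... and compare' loop of claim 122), or 0 if
    even a single pulse exceeds the limit (claim 121 would report an error)."""
    while pulses_per_scan > 0:
        if frame_energy_j(pulses_per_scan, energy_per_pulse_j) < eye_safety_limit_j:
            return pulses_per_scan          # scan allowed at this setting
        pulses_per_scan //= 2               # try a reduced-resolution scan
    return 0                                # stop scan and signal an error

if __name__ == "__main__":
    allowed = plan_scan(pulses_per_scan=10_000,
                        energy_per_pulse_j=5e-9,
                        eye_safety_limit_j=25e-6)
    print("allowed pulses per frame:", allowed)
```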
125. A method for target illumination comprising,
via an illumination device,
communicating with a processor;
projecting low level full scan illumination to a target area,
wherein the laser is at least one of, amplitude modulated and pulse width modulated;
via an image sensor,
communicating with the processor;
receiving reflected illumination from the target area;
via the processor,
identifying specific target areas of interest,
mapping the target area,
setting a value of the number of image pulses for one scan,
calculating the energy intensity of each pulse,
calculating the total intensity per frame, and
comparing the total intensity per frame to an eye safety limit;
directing the illumination device to scan if the total intensity per frame is less than the eye safety limit, and
directing the illumination device to stop scan if the total intensity per frame is greater than or equal to the eye safety limit.
126. The method of claim 125, further comprising, via the processor,
communicating to a user an error message if the total intensity per frame is greater than or equal to the eye safety limit.
127. The method of claim 125, further comprising, via the processor,
if the total intensity per frame is greater than or equal to the eye safety limit,
mapping the target area;
setting a new value of the number of image pulses for one scan; calculating the energy intensity of each pulse;
calculating the total intensity per frame; and
comparing the total intensity per frame to an eye safety limit.
128. The method of claim 125 wherein the computing system is further configured to track the specific target of interest and direct the illumination source to illuminate the specific area of interest.
129. The method of claim 125 wherein the illumination source includes a laser and a micro electromechanical system mirror (MEMS) to direct the light.
130. A system for target illumination and mapping, comprising,
a directed light source, at least one image projector, and an image sensor;
the directed light source configured to,
communicate with a processor;
illuminate at least one select target area within a field of view;
receive direction to illuminate an at least one select target,
wherein the laser is at least one of, amplitude modulated and pulse width modulated;
the image sensor configured to,
communicate with the processor;
receive reflected illumination from the at least one select target within the target area;
create data regarding the received reflected illumination;
send data regarding the received reflected illumination to the processor; and the image projector configured to,
communicate with the processor;
receive direction to project an image on the at least one select target; and project an image on the at least one select target.
131. The system of claim 130 wherein the directed light source is an infrared laser.
132. The system of claim 131 wherein the data regarding the received reflected illumination includes information regarding the distance from the system to the target via triangulation.
133. The system of claim 132 wherein the image projector is calibrated to the distance calculation from the processor,
wherein calibration includes adjustments to a throw angle of the image projector.
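Purely for illustration of the throw-angle calibration in claim 133, the sketch below computes the full throw angle needed to keep a projected image at a desired physical width at the distance the processor has triangulated. The desired image width is an assumed parameter; the claim only states that calibration includes a throw-angle adjustment.

```python
# Illustrative sketch: adjusting the projector's throw angle so the projected
# image keeps a desired physical width at the triangulated target distance.
# The desired width is an assumption made for this example.

import math

def required_throw_angle_deg(target_distance_m, desired_image_width_m):
    """Full horizontal throw angle (degrees) that yields the desired image width
    at the given distance, for a projector centered on the target."""
    half_angle = math.atan((desired_image_width_m / 2.0) / target_distance_m)
    return math.degrees(2.0 * half_angle)

if __name__ == "__main__":
    for distance in (1.0, 2.0, 4.0):  # meters, e.g. from triangulation
        angle = required_throw_angle_deg(distance, desired_image_width_m=0.5)
        print(f"{distance:.1f} m -> throw angle {angle:.1f} deg")
```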
134. The system of claim 130 wherein the image projector is further configured to project at least two images on at least two different identified and tracked targets.
135. The system of claim 130 wherein the image sensor is at least one of a complementary metal oxide semiconductor (CMOS) and a charge coupled device (CCD).
136. The system of claim 130 wherein the directed light source is configured to project a pattern of illumination on the select target.
137. A system for target illumination and mapping, comprising,
a directed light source and an image sensor;
the directed light source configured to,
communicate with a processor;
illuminate at least one target area within a field of view;
receive direction to track a selected target within the target area from the processor; receive direction to project an image on the tracked selected target from the processor;
project an image on the tracked selected target according to the received direction;
the image sensor configured to,
communicate with the processor;
receive reflected illumination from the at least one select target within the field of view;
generate data regarding the received reflected illumination; and send the received reflected illumination data to the processor.
138. The system of claim 137 wherein the directed light source is a visible light laser and the image is a laser scan image,
wherein the laser is at least one of, amplitude modulated and pulse width modulated.
139. The system of claim 137 wherein the image sensor is at least one of a complementary metal oxide semiconductor (CMOS) and a charge coupled device (CCD).
140. A method for target illumination and mapping, comprising,
via a directed light source,
communicating with a processor;
illuminating at least one select target area within a field of view; receiving direction to illuminate an at least one select target,
wherein the laser is at least one of, amplitude modulated and pulse width modulated;
via an image sensor,
communicating with the processor;
receiving reflected illumination from the at least one select target within the target area;
creating data regarding the received reflected illumination; sending data regarding the received reflected illumination to the processor; and via an image projector,
communicating with the processor;
receiving direction to project an image on the at least one select target; and projecting an image on the at least one select target.
141. The method of claim 140 wherein the directed light source is an infrared laser.
142. The method of claim 141 wherein the data regarding the received reflected illumination includes information regarding the distance from the system to the target via triangulation.
143. The method of claim 142 wherein the image projector is calibrated to the distance calculation from the processor,
wherein calibration includes adjustments to a throw angle of the image projector.
144. The method of claim 140, further comprising, via the image projector,
projecting at least two images on at least two different identified and tracked targets.
145. The method of claim 140 wherein the image sensor is at least one of a complementary metal oxide semiconductor (CMOS) and a charge coupled device (CCD).
146. The method of claim 140 further comprising, via the directed light source,
projecting a pattern of illumination on the select target.
147. A method for target illumination and mapping, comprising,
via a directed light source,
communicating with a processor;
illuminating at least one target area within a field of view;
receiving direction to track a selected target within the target area from the processor; receiving direction to project an image on the tracked selected target from the processor;
projecting an image on the tracked selected target according to the received direction;
via an image sensor,
communicating with the processor;
receiving reflected illumination from the at least one select target within the field of view;
generating data regarding the received reflected illumination; and sending the received reflected illumination data to the processor.
148. The method of claim 147 wherein the directed light source is a visible light laser and the image is a laser scan image,
wherein the laser is at least one of, amplitude modulated and pulse width modulated.
149. The method of claim 147 wherein the image sensor is at least one of a complementary metal oxide semiconductor (CMOS) and a charge coupled device (CCD).
150. A system for target illumination and mapping, comprising,
a directional light source and an image sensor,
the directional light source configured to,
communicate with a processor;
illuminate at least one target area within a field of view with a scan of at least one pixel point;
receive direction to illuminate the target with additional pixel points over time for additional calculations of distance, from the at least one processor;
the image sensor configured to,
communicate with the processor;
receive a reflection of the at least one pixel point from the at least one select target within the field of view; generate data regarding the received pixel reflection;
send the data regarding the received pixel reflection to the at least one processor,
wherein the data includes information that the processor could analyze and determine distance from the system to the target via triangulation, and
wherein the data further includes information regarding the relative proximity between the directional light source and the image sensor.
151. The system of claim 150 wherein the directional light source is a laser, and at least one of, amplitude modulated and pulse width modulated.
152. The system of claim 151 wherein the data further includes information that the processor could analyze and determine a depth map, based on the calculations of distance of the at least one target pixel point.
153. The system of claim 151 wherein the data further includes information that the processor could analyze and determine the distance between the system and the target via triangulation among the directed light source, the image sensor, and the additional pixel points.
154. The system of claim 151 wherein the directional light source is further configured to receive direction to illuminate the selected target with at least one pixel point from the processor.
155. A method for target illumination and mapping, comprising,
via a directional light source,
communicating with a processor;
illuminating at least one target area within a field of view with a scan of at least one pixel point;
receiving direction to illuminate the target with additional pixel points over time for additional calculations of distance, from the at least one processor;
via an image sensor,
communicating with the processor; receiving a reflection of the at least one pixel point from the at least one select target within the field of view;
generating data regarding the received pixel reflection;
sending the data regarding the received pixel reflection to the at least one processor,
wherein the data includes information that the processor could analyze and determine distance from the system to the target via triangulation, and
wherein the data further includes information regarding the relative proximity between the directional light source and the image sensor.
156. The method of claim 155 wherein the directional light source is a laser, and at least one of, amplitude modulated and pulse width modulated.
157. The method of claim 156, wherein the data further includes information that the processor could analyze and determine a depth map, based on the calculations of distance of the at least one target pixel point.
158. The method of claim 156 wherein the data further includes information that the processor could analyze and determine the distance between the system and the target via triangulation among the directed light source, the image sensor, and the additional pixel points.
159. The method of claim 156, further comprising, via the directional light source
receiving direction to illuminate the selected target with at least one pixel point from the processor.
160. A system for biometric analysis, comprising,
a directed laser light source and an image sensor;
the directed laser light source configured to,
communicate with a processor;
illuminate a target area within a field of view;
receive direction to illuminate at least one select target in the target area; receive direction to illuminate a biometric area of the at least one select target,
wherein the laser is at least one of, amplitude modulated and pulse width modulated; and
the image sensor configured to,
communicate with the processor;
receive reflected illumination from the at least one target area within the field of view;
generate data regarding the received reflected illumination;
send the generated data to the processor,
wherein the data includes at least information that would allow the processor to map the target area, identify the select target within the target area, and determine a biometric reading of the at least one select target.
161. The system of claim 160 wherein the biometric reading is at least one of, skin deflection, skin reflectivity, and oxygen absorption.
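As a loosely illustrative sketch of the "oxygen absorption" reading in claim 161, the snippet below compares the target's reflectivity under two source wavelengths, in the spirit of pulse oximetry. It is a toy index only; the wavelengths, the logarithmic absorption estimate, and any calibration to a physiological value are assumptions not taken from the specification.

```python
# Illustrative sketch: a toy oxygen-absorption index from reflectivity measured
# under a red and an infrared source. Not a calibrated SpO2 computation; the
# quantities and example numbers are assumptions for illustration only.

import math

def reflectivity(reflected_power_w, incident_power_w):
    """Fraction of the projected power returned from the illuminated skin patch."""
    return reflected_power_w / incident_power_w

def absorption_ratio(red_reflect, ir_reflect):
    """Ratio of red to infrared absorption; oxygenated blood absorbs relatively
    less red light, so a lower ratio loosely indicates higher oxygenation."""
    red_abs = -math.log(max(red_reflect, 1e-9))
    ir_abs = -math.log(max(ir_reflect, 1e-9))
    return red_abs / ir_abs

if __name__ == "__main__":
    r_red = reflectivity(reflected_power_w=0.30e-3, incident_power_w=1.0e-3)
    r_ir = reflectivity(reflected_power_w=0.45e-3, incident_power_w=1.0e-3)
    print(f"red/IR absorption ratio: {absorption_ratio(r_red, r_ir):.2f}")
```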
162. The system of claim 160 wherein the illumination is a pattern of illumination, and wherein the computing system is further configured to analyze the reflected pattern illumination from the target.
163. The system of claim 162 wherein the data contains further information that would allow the processor to calculate a distance from the system to the target via triangulation.
164. The system of claim 163 wherein the light source is further configured to receive calibration information of the illumination pattern; and
project the calibrated pattern on the at least one select target.
165. A method for biometric analysis, comprising,
via a directed laser light source,
communicating with a processor; illuminating a target area within a field of view;
receiving direction to illuminate at least one select target in the target area;
receiving direction to illuminate a biometric area of the at least one select target, wherein the laser is at least one of, amplitude modulated and pulse width modulated; and
via an image sensor,
communicating with the processor;
receiving reflected illumination from the at least one target area within the field of view;
generating data regarding the received reflected illumination;
sending the generated data to the processor,
wherein the data includes at least information that would allow the processor to map the target area, identify the select target within the target area, and determine a biometric reading of the at least one select target.
166. The method of claim 165 wherein the biometric reading is at least one of, skin deflection, skin reflectivity, and oxygen absorption.
167. The method of claim 165 wherein the illumination is a pattern of illumination, and wherein the computing system is further configured to analyze the reflected pattern illumination from the target.
168. The method of claim 167 wherein the data contains further information that would allow the processor to calculate a distance from the system to the target via triangulation.
169. The method of claim 168, further comprising, via the light source,
receiving calibration information of the illumination pattern; and
projecting the calibrated pattern on the at least one select target.
170. A system for target illumination and mapping, comprising,
a directed light source, and an image sensor, the light source having an aperture and configured to,
illuminate a target area within a field of view, via an incremental scan, wherein each increment has a unique outbound angle from the light source aperture, and a unique inbound angle to the image sensor aperture;
send data regarding the incremental outbound angles to the processor; and the image sensor having an aperture and configured to,
receive reflected illumination from the at least one select target within the field of view;
generate data regarding the received reflected illumination including inbound angles; and
send the data regarding the received reflected illumination to the processor;
wherein the data regarding the outbound angles and the data regarding the inbound angles include information used to calculate a distance from the system to the target via triangulation; and
wherein the distance between the light source aperture and the image capture aperture is relatively fixed.
171. The system of claim 170 wherein the directed light source is a laser, wherein the laser is at least one of, amplitude modulated and pulse width modulated.
172. The system of claim 171 wherein the image sensor includes optical filters.
173. The system of claim 171 wherein the data regarding the outbound angles and the data regarding the inbound angles further include information used to calculate a depth map based on the illumination.
174. The system of claim 173 wherein the data regarding the outbound angles and the data regarding the inbound angles further include information used to calculate a point cloud based on the depth map.
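For illustration of the triangulation recited in claims 170 and 175, the sketch below solves the triangle formed by the fixed baseline between the light-source aperture and the image-sensor aperture, the outbound angle of a scan increment, and the inbound angle of its reflection. Measuring both angles from the baseline is an assumed convention; the claims do not specify one.

```python
# Illustrative sketch: angle-angle-baseline triangulation. The outbound angle at
# the light-source aperture and the inbound angle at the image-sensor aperture
# are measured from the fixed baseline joining the two apertures (an assumed
# convention), and the law of sines gives the range to the scan increment.

import math

def triangulate(baseline_m, outbound_deg, inbound_deg):
    """Return (range_from_source_m, perpendicular_distance_m) for one scan increment."""
    a = math.radians(outbound_deg)   # angle at the light-source aperture
    b = math.radians(inbound_deg)    # angle at the image-sensor aperture
    third = math.pi - a - b          # angle at the target (triangle angle sum)
    if third <= 0:
        raise ValueError("rays do not converge in front of the baseline")
    range_from_source = baseline_m * math.sin(b) / math.sin(third)  # law of sines
    perpendicular = range_from_source * math.sin(a)                 # depth off the baseline
    return range_from_source, perpendicular

if __name__ == "__main__":
    # 10 cm baseline, outbound ray at 80 deg, reflection arriving at 75 deg.
    rng, depth = triangulate(0.10, 80.0, 75.0)
    print(f"range from source: {rng:.3f} m, depth off baseline: {depth:.3f} m")
```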
175. A method for target illumination and mapping, comprising,
via a directed light source, having an aperture,
illuminating a target area within a field of view, via an incremental scan,
wherein each increment has a unique outbound angle from the light source aperture, and a unique inbound angle to the image sensor aperture;
sending data regarding the incremental outbound angles to the processor; and via an image sensor, having an aperture,
receiving reflected illumination from the at least one select target within the field of view;
generating data regarding the received reflected illumination including inbound angles; and
sending the data regarding the received reflected illumination to the processor; wherein the data regarding the outbound angles and the data regarding the inbound angles include information used to calculate a distance from the system to the target via triangulation; and
wherein the distance between the light source aperture and the image capture aperture is relatively fixed.
176. The method of claim 175 wherein the directed light source is a laser, wherein the laser is at least one of, amplitude modulated and pulse width modulated.
177. The method of claim 176 wherein the image sensor includes optical filters.
178. The method of claim 176 wherein the data regarding the outbound angles and the data regarding the inbound angles further include information used to calculate a depth map based on the illumination.
179. The method of claim 178 wherein the data regarding the outbound angles and the data regarding the inbound angles further include information used to calculate a point cloud based on the depth map.
PCT/US2013/050551 2012-07-15 2013-07-15 Interactive illumination for gesture and/or object recognition WO2014014838A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/597,819 US20160006914A1 (en) 2012-07-15 2015-01-15 Interactive Illumination for Gesture and/or Object Recognition

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US201261671764P 2012-07-15 2012-07-15
US61/671,764 2012-07-15
US201261682299P 2012-08-12 2012-08-12
US61/682,299 2012-08-12
US201361754914P 2013-01-21 2013-01-21
US61/754,914 2013-01-21

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/597,819 Continuation US20160006914A1 (en) 2012-07-15 2015-01-15 Interactive Illumination for Gesture and/or Object Recognition

Publications (2)

Publication Number Publication Date
WO2014014838A2 true WO2014014838A2 (en) 2014-01-23
WO2014014838A3 WO2014014838A3 (en) 2014-05-01

Family

ID=49949351

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2013/050551 WO2014014838A2 (en) 2012-07-15 2013-07-15 Interactive illumination for gesture and/or object recognition

Country Status (2)

Country Link
US (1) US20160006914A1 (en)
WO (1) WO2014014838A2 (en)

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150256722A1 (en) * 2014-03-07 2015-09-10 Canon Kabushiki Kaisha Image capturing apparatus and control method of image capturing apparatus
WO2016005976A3 (en) * 2014-07-08 2016-04-07 Oculus Vr, Llc Method and system for adjusting light pattern for structured light imaging
WO2016087273A1 (en) * 2014-12-01 2016-06-09 Koninklijke Philips N.V. Device and method for skin detection
CN106030239A (en) * 2014-01-29 2016-10-12 Lg伊诺特有限公司 Device for extracting depth information and method thereof
EP3104209A3 (en) * 2015-05-20 2017-01-25 Oculus VR, LLC Method and system for generating light pattern using polygons
US20170068319A1 (en) * 2015-09-08 2017-03-09 Microvision, Inc. Mixed-Mode Depth Detection
US9648698B2 (en) 2015-05-20 2017-05-09 Facebook, Inc. Method and system for generating light pattern using polygons
CN106679671A (en) * 2017-01-05 2017-05-17 大连理工大学 Navigation marking graph recognition method based on laser data
WO2017106053A1 (en) * 2015-12-16 2017-06-22 Oculus Vr, Llc Range-gated depth camera assembly
WO2017112044A1 (en) * 2015-12-26 2017-06-29 Intel Corporation Stereo depth camera using vcsel with spatially and temporally interleaved patterns
CN107250841A (en) * 2015-02-19 2017-10-13 皇家飞利浦有限公司 Infrared laser light irradiation apparatus
CN108027438A (en) * 2015-09-20 2018-05-11 高通股份有限公司 Light detection and ranging (LIDAR) system with two-beam guiding
WO2018152061A1 (en) * 2017-02-14 2018-08-23 Sony Corporation Using micro mirrors to improve the field of view of a 3d depth map
WO2018169758A1 (en) 2017-03-13 2018-09-20 OPSYS Tech Ltd. Eye-safe scanning lidar system
US10200584B2 (en) 2014-03-07 2019-02-05 Canon Kabushiki Kaisha Image capturing apparatus, control method of image capturing apparatus, and program
EP3477437A1 (en) * 2017-10-27 2019-05-01 Thomson Licensing Method of remote-stimulated device illumination through photoluminescence and corresponding apparatus
WO2019141699A1 (en) * 2018-01-18 2019-07-25 Robert Bosch Gmbh Method for operating a lighting device or a camera apparatus, control device and camera apparatus
US20190391271A1 (en) * 2017-03-31 2019-12-26 Huawei Technologies Co., Ltd. Apparatus and method for scanning and ranging with eye-safe pattern
EP3113070B1 (en) * 2015-04-10 2021-06-02 Google LLC Method and system for optical user recognition
WO2021231559A1 (en) * 2020-05-13 2021-11-18 Luminar, Llc Lidar system with high-resolution scan pattern
US11320538B2 (en) 2019-04-09 2022-05-03 OPSYS Tech Ltd. Solid-state LIDAR transmitter with laser control
US11353559B2 (en) 2017-10-09 2022-06-07 Luminar, Llc Adjustable scan patterns for lidar system
US11415675B2 (en) 2017-10-09 2022-08-16 Luminar, Llc Lidar system with adjustable pulse period
US11494926B1 (en) * 2021-07-01 2022-11-08 Himax Technologies Limited Method for performing hybrid depth detection with aid of adaptive projector, and associated apparatus
US11513195B2 (en) 2019-06-10 2022-11-29 OPSYS Tech Ltd. Eye-safe long-range solid-state LIDAR system
CN116342497A (en) * 2023-03-01 2023-06-27 天津市鹰泰利安康医疗科技有限责任公司 Three-dimensional mapping method and system for inner wall of human body cavity
WO2023153139A1 (en) * 2022-02-09 2023-08-17 株式会社小糸製作所 Projector, and measuring device
US11740331B2 (en) 2017-07-28 2023-08-29 OPSYS Tech Ltd. VCSEL array LIDAR transmitter with small angular divergence
US11762068B2 (en) 2016-04-22 2023-09-19 OPSYS Tech Ltd. Multi-wavelength LIDAR system
US11802943B2 (en) 2017-11-15 2023-10-31 OPSYS Tech Ltd. Noise adaptive solid-state LIDAR system
US11846728B2 (en) 2019-05-30 2023-12-19 OPSYS Tech Ltd. Eye-safe long-range LIDAR system using actuator
US11906663B2 (en) 2018-04-01 2024-02-20 OPSYS Tech Ltd. Noise adaptive solid-state LIDAR system
US11965964B2 (en) 2020-04-07 2024-04-23 OPSYS Tech Ltd. Solid-state LIDAR transmitter with laser control

Families Citing this family (114)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB201217721D0 (en) * 2012-10-03 2012-11-14 Holition Ltd Video image processing
US9467680B2 (en) 2013-12-12 2016-10-11 Intel Corporation Calibration of a three-dimensional acquisition system
US20160006954A1 (en) * 2014-07-03 2016-01-07 Snap Vision Technologies LLC Multispectral Detection and Processing From a Moving Platform
US10427336B2 (en) * 2014-11-20 2019-10-01 Baker Hughes, A Ge Company, Llc Periodic structured composite and articles therefrom
US10289820B2 (en) * 2015-02-24 2019-05-14 Motorola Mobility Llc Multiuse 3D IR for electronic device
KR102311688B1 (en) 2015-06-17 2021-10-12 엘지전자 주식회사 Mobile terminal and method for controlling the same
US10578726B2 (en) * 2015-08-18 2020-03-03 Sikorsky Aircraft Corporation Active sensing system and method of sensing with an active sensor system
US9946259B2 (en) 2015-12-18 2018-04-17 Raytheon Company Negative obstacle detector
US10382701B2 (en) 2016-01-27 2019-08-13 Raytheon Company Active imaging systems and method
US10602070B2 (en) * 2016-01-27 2020-03-24 Raytheon Company Variable magnification active imaging system
US9903566B1 (en) * 2016-05-06 2018-02-27 Darryl R. Johnston Portable floor light
US10267915B2 (en) 2016-06-07 2019-04-23 Raytheon Company Optical system for object detection and location
US9961333B1 (en) 2016-06-10 2018-05-01 X Development Llc System and method for light field projection
US10212785B2 (en) * 2016-06-13 2019-02-19 Google Llc Staggered array of individually addressable light-emitting elements for sweeping out an angular range
US9909862B2 (en) 2016-06-13 2018-03-06 Google Llc Curved array of light-emitting elements for sweeping out an angular range
US10659764B2 (en) 2016-06-20 2020-05-19 Intel Corporation Depth image provision apparatus and method
US10609359B2 (en) 2016-06-22 2020-03-31 Intel Corporation Depth image provision apparatus and method
DE102016211983A1 (en) * 2016-06-30 2018-01-04 Robert Bosch Gmbh System and method for user recognition and / or gesture control
US10474297B2 (en) * 2016-07-20 2019-11-12 Ams Sensors Singapore Pte. Ltd. Projecting a structured light pattern onto a surface and detecting and responding to interactions with the same
US10705412B2 (en) * 2016-07-27 2020-07-07 Seikowave, Inc. Thermal management system for 3D imaging systems, opto-mechanical alignment mechanism and focusing mechanism for 3D imaging systems, and optical tracker for 3D imaging systems
US10827163B2 (en) * 2016-08-09 2020-11-03 Facebook Technologies, Llc Multiple emitter illumination source for depth information determination
US9891516B1 (en) 2016-08-23 2018-02-13 X Development Llc Methods for calibrating a light field projection system
US9723693B1 (en) * 2016-08-24 2017-08-01 Abl Ip Holding Llc Lighting devices configurable for generating a visual signature
US20190196019A1 (en) * 2016-08-31 2019-06-27 Singapore University Of Technology And Design Method and device for determining position of a target
US10345681B2 (en) * 2016-10-17 2019-07-09 Nokia Of America Corporation Compressive imaging using structured illumination
US10091496B2 (en) 2016-11-28 2018-10-02 X Development Llc Systems, devices, and methods for calibrating a light field projection system
IT201700021559A1 (en) * 2017-02-27 2018-08-27 St Microelectronics Srl Procedure for controlling laser beams, and corresponding device, apparatus and computer product
US10720069B2 (en) * 2017-04-17 2020-07-21 Rosemount Aerospace Inc. Method and system for aircraft taxi strike alerting
US10542245B2 (en) * 2017-05-24 2020-01-21 Lg Electronics Inc. Mobile terminal and method for controlling the same
US11151235B2 (en) 2017-08-01 2021-10-19 Apple Inc. Biometric authentication techniques
GB201713512D0 (en) * 2017-08-23 2017-10-04 Colordyne Ltd Apparatus and method for projecting and detecting light on a 2D or 3D surface, e.g. for semantic lighting based therapy
WO2019037105A1 (en) 2017-08-25 2019-02-28 深圳市汇顶科技股份有限公司 Power control method, ranging module and electronic device
US10613228B2 (en) * 2017-09-08 2020-04-07 Microsoft Technology Licensing, Llc Time-of-flight augmented structured light range-sensor
KR20200104372A (en) 2017-12-27 2020-09-03 에티컨, 엘엘씨 Hyperspectral imaging in a light-deficient environment
US11675050B2 (en) 2018-01-09 2023-06-13 Innovusion, Inc. LiDAR detection systems and methods
JP2019158691A (en) * 2018-03-15 2019-09-19 セイコーエプソン株式会社 Controller, robot, robot system, and method for recognizing object
JP2021097254A (en) * 2018-03-23 2021-06-24 ソニーグループ株式会社 Signal processing apparatus, signal processing method, image capturing apparatus, and medical application image capturing apparatus
WO2019199775A1 (en) 2018-04-09 2019-10-17 Innovusion Ireland Limited Lidar systems and methods for exercising precise control of a fiber laser
US10663567B2 (en) 2018-05-04 2020-05-26 Microsoft Technology Licensing, Llc Field calibration of a structured light range-sensor
US10452947B1 (en) * 2018-06-08 2019-10-22 Microsoft Technology Licensing, Llc Object recognition using depth and multi-spectral camera
CN114114295A (en) * 2018-06-15 2022-03-01 图达通爱尔兰有限公司 LIDAR system and method for focusing on a range of interest
US10627709B2 (en) * 2018-06-29 2020-04-21 Ricoh Company, Ltd. Light source, projection device, measurement device, robot, electronic device, mobile object, and shaping apparatus
JP7379859B2 (en) * 2018-06-29 2023-11-15 株式会社リコー Light sources, projection devices, measurement devices, robots, electronic devices, moving objects, and modeling devices
WO2020025382A1 (en) * 2018-08-01 2020-02-06 Lumileds Holding B.V. Depth map generator
DE112019005684T5 (en) 2018-11-14 2021-08-05 Innovusion Ireland Limited LIDAR SYSTEMS AND METHODS USING A MULTI-FACET MIRROR
US11006097B1 (en) * 2018-12-28 2021-05-11 Facebook, Inc. Modeling for projection-based augmented reality system
US11245875B2 (en) 2019-01-15 2022-02-08 Microsoft Technology Licensing, Llc Monitoring activity with depth and multi-spectral camera
US10747314B1 (en) * 2019-04-02 2020-08-18 GM Global Technology Operations LLC Tracking system with infrared camera
US11747478B2 (en) * 2019-05-15 2023-09-05 Electronic Theatre Controls, Inc. Stage mapping and detection using infrared light
US11412152B2 (en) 2019-06-20 2022-08-09 Cilag Gmbh International Speckle removal in a pulsed hyperspectral imaging system
US10952619B2 (en) * 2019-06-20 2021-03-23 Ethicon Llc Hyperspectral and fluorescence imaging and topology laser mapping with minimal area monolithic image sensor
US10979646B2 (en) 2019-06-20 2021-04-13 Ethicon Llc Fluorescence imaging with minimal area monolithic image sensor
US11903563B2 (en) 2019-06-20 2024-02-20 Cilag Gmbh International Offset illumination of a scene using multiple emitters in a fluorescence imaging system
US11375886B2 (en) 2019-06-20 2022-07-05 Cilag Gmbh International Optical fiber waveguide in an endoscopic system for laser mapping imaging
US11671691B2 (en) 2019-06-20 2023-06-06 Cilag Gmbh International Image rotation in an endoscopic laser mapping imaging system
US11294062B2 (en) 2019-06-20 2022-04-05 Cilag Gmbh International Dynamic range using a monochrome image sensor for hyperspectral and fluorescence imaging and topology laser mapping
US11550057B2 (en) 2019-06-20 2023-01-10 Cilag Gmbh International Offset illumination of a scene using multiple emitters in a fluorescence imaging system
US11389066B2 (en) 2019-06-20 2022-07-19 Cilag Gmbh International Noise aware edge enhancement in a pulsed hyperspectral, fluorescence, and laser mapping imaging system
US11288772B2 (en) 2019-06-20 2022-03-29 Cilag Gmbh International Super resolution and color motion artifact correction in a pulsed fluorescence imaging system
US11674848B2 (en) 2019-06-20 2023-06-13 Cilag Gmbh International Wide dynamic range using a monochrome image sensor for hyperspectral imaging
US11265491B2 (en) 2019-06-20 2022-03-01 Cilag Gmbh International Fluorescence imaging with fixed pattern noise cancellation
US11754500B2 (en) 2019-06-20 2023-09-12 Cilag Gmbh International Minimizing image sensor input/output in a pulsed fluorescence imaging system
US11122968B2 (en) 2019-06-20 2021-09-21 Cilag Gmbh International Optical fiber waveguide in an endoscopic system for hyperspectral imaging
US11218645B2 (en) 2019-06-20 2022-01-04 Cilag Gmbh International Wide dynamic range using a monochrome image sensor for fluorescence imaging
US11012599B2 (en) 2019-06-20 2021-05-18 Ethicon Llc Hyperspectral imaging in a light deficient environment
US11276148B2 (en) 2019-06-20 2022-03-15 Cilag Gmbh International Super resolution and color motion artifact correction in a pulsed fluorescence imaging system
US11589819B2 (en) 2019-06-20 2023-02-28 Cilag Gmbh International Offset illumination of a scene using multiple emitters in a laser mapping imaging system
US11172811B2 (en) 2019-06-20 2021-11-16 Cilag Gmbh International Image rotation in an endoscopic fluorescence imaging system
US11398011B2 (en) 2019-06-20 2022-07-26 Cilag Gmbh International Super resolution and color motion artifact correction in a pulsed laser mapping imaging system
US11877065B2 (en) 2019-06-20 2024-01-16 Cilag Gmbh International Image rotation in an endoscopic hyperspectral imaging system
US11898909B2 (en) 2019-06-20 2024-02-13 Cilag Gmbh International Noise aware edge enhancement in a pulsed fluorescence imaging system
US11252326B2 (en) 2019-06-20 2022-02-15 Cilag Gmbh International Pulsed illumination in a laser mapping imaging system
US11432706B2 (en) 2019-06-20 2022-09-06 Cilag Gmbh International Hyperspectral imaging with minimal area monolithic image sensor
US11633089B2 (en) 2019-06-20 2023-04-25 Cilag Gmbh International Fluorescence imaging with minimal area monolithic image sensor
US11280737B2 (en) 2019-06-20 2022-03-22 Cilag Gmbh International Super resolution and color motion artifact correction in a pulsed fluorescence imaging system
US11540696B2 (en) 2019-06-20 2023-01-03 Cilag Gmbh International Noise aware edge enhancement in a pulsed fluorescence imaging system
US11172810B2 (en) 2019-06-20 2021-11-16 Cilag Gmbh International Speckle removal in a pulsed laser mapping imaging system
US11700995B2 (en) 2019-06-20 2023-07-18 Cilag Gmbh International Speckle removal in a pulsed fluorescence imaging system
US11154188B2 (en) 2019-06-20 2021-10-26 Cilag Gmbh International Laser mapping imaging and videostroboscopy of vocal cords
US11622094B2 (en) 2019-06-20 2023-04-04 Cilag Gmbh International Wide dynamic range using a monochrome image sensor for fluorescence imaging
US11793399B2 (en) 2019-06-20 2023-10-24 Cilag Gmbh International Super resolution and color motion artifact correction in a pulsed hyperspectral imaging system
US11937784B2 (en) 2019-06-20 2024-03-26 Cilag Gmbh International Fluorescence imaging in a light deficient environment
US11187658B2 (en) 2019-06-20 2021-11-30 Cilag Gmbh International Fluorescence imaging with fixed pattern noise cancellation
US11758256B2 (en) 2019-06-20 2023-09-12 Cilag Gmbh International Fluorescence imaging in a light deficient environment
US11284784B2 (en) 2019-06-20 2022-03-29 Cilag Gmbh International Controlling integral energy of a laser pulse in a fluorescence imaging system
US11221414B2 (en) 2019-06-20 2022-01-11 Cilag Gmbh International Laser mapping imaging with fixed pattern noise cancellation
US11187657B2 (en) 2019-06-20 2021-11-30 Cilag Gmbh International Hyperspectral imaging with fixed pattern noise cancellation
US11237270B2 (en) 2019-06-20 2022-02-01 Cilag Gmbh International Hyperspectral, fluorescence, and laser mapping imaging with fixed pattern noise cancellation
US11516387B2 (en) 2019-06-20 2022-11-29 Cilag Gmbh International Image synchronization without input clock and data transmission clock in a pulsed hyperspectral, fluorescence, and laser mapping imaging system
US11134832B2 (en) 2019-06-20 2021-10-05 Cilag Gmbh International Image rotation in an endoscopic hyperspectral, fluorescence, and laser mapping imaging system
US11083366B2 (en) 2019-06-20 2021-08-10 Cilag Gmbh International Driving light emissions according to a jitter specification in a fluorescence imaging system
US11233960B2 (en) 2019-06-20 2022-01-25 Cilag Gmbh International Fluorescence imaging with fixed pattern noise cancellation
US11412920B2 (en) 2019-06-20 2022-08-16 Cilag Gmbh International Speckle removal in a pulsed fluorescence imaging system
US11624830B2 (en) 2019-06-20 2023-04-11 Cilag Gmbh International Wide dynamic range using a monochrome image sensor for laser mapping imaging
US11716543B2 (en) 2019-06-20 2023-08-01 Cilag Gmbh International Wide dynamic range using a monochrome image sensor for fluorescence imaging
US11892403B2 (en) 2019-06-20 2024-02-06 Cilag Gmbh International Image synchronization without input clock and data transmission clock in a pulsed fluorescence imaging system
US11716533B2 (en) 2019-06-20 2023-08-01 Cilag Gmbh International Image synchronization without input clock and data transmission clock in a pulsed fluorescence imaging system
US11471055B2 (en) 2019-06-20 2022-10-18 Cilag Gmbh International Noise aware edge enhancement in a pulsed fluorescence imaging system
US11213194B2 (en) 2019-06-20 2022-01-04 Cilag Gmbh International Optical fiber waveguide in an endoscopic system for hyperspectral, fluorescence, and laser mapping imaging
US11925328B2 (en) 2019-06-20 2024-03-12 Cilag Gmbh International Noise aware edge enhancement in a pulsed hyperspectral imaging system
US11931009B2 (en) 2019-06-20 2024-03-19 Cilag Gmbh International Offset illumination of a scene using multiple emitters in a hyperspectral imaging system
US11457154B2 (en) 2019-06-20 2022-09-27 Cilag Gmbh International Speckle removal in a pulsed hyperspectral, fluorescence, and laser mapping imaging system
US10841504B1 (en) 2019-06-20 2020-11-17 Ethicon Llc Fluorescence imaging with minimal area monolithic image sensor
US11533417B2 (en) 2019-06-20 2022-12-20 Cilag Gmbh International Laser scanning and tool tracking imaging in a light deficient environment
US11320517B2 (en) * 2019-08-22 2022-05-03 Qualcomm Incorporated Wireless communication with enhanced maximum permissible exposure (MPE) compliance
CN112857234A (en) * 2019-11-12 2021-05-28 峻鼎科技股份有限公司 Measuring method and device for combining two-dimensional and height information of object
TWI722703B (en) * 2019-12-09 2021-03-21 財團法人工業技術研究院 Projecting appartus and projecting calibration method
CN111275776A (en) * 2020-02-11 2020-06-12 北京淳中科技股份有限公司 Projection augmented reality method and device and electronic equipment
KR20210112525A (en) 2020-03-05 2021-09-15 에스케이하이닉스 주식회사 Camera Module Having an Image Sensor and a Three-Dimensional Sensor
US11182632B1 (en) 2020-08-25 2021-11-23 Toshiba Global Commerce Solutions Holdings Corporation Methods and systems including an edge device camera configured to capture variable image data amounts for audited shopping and related computer program products
KR20220060891A (en) * 2020-11-05 2022-05-12 주식회사 케이티 Apparatus for LIDAR
CN116935467A (en) * 2021-08-12 2023-10-24 荣耀终端有限公司 Data processing method and device
US11397071B1 (en) * 2021-09-14 2022-07-26 Vladimir V. Maslinkovskiy System and method for anti-blinding target game
US20230308767A1 (en) * 2022-03-23 2023-09-28 L3Harris Technologies, Inc. Smart illumination for nightvision using semi-transparent detector array

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8491135B2 (en) * 2010-01-04 2013-07-23 Microvision, Inc. Interactive projection with gesture recognition

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110013024A1 (en) * 1999-05-11 2011-01-20 Pryor Timothy R Camera based interaction and instruction
US20030052169A1 (en) * 1999-06-07 2003-03-20 Metrologic Instruments, Inc. Planar laser illumination and imaging (PLIIM) based camera system for producing high-resolution 3-D images of moving 3-D objects
US6877662B2 (en) * 1999-06-07 2005-04-12 Metrologic Instruments, Inc. Led-based planar light illumination and imaging (PLIIM) based camera system employing real-time object coordinate acquisition and producing to control automatic zoom and focus imaging optics
US20100296535A1 (en) * 2004-03-16 2010-11-25 Leica Geosystems Ag Laser operation for survey instruments
US20110212778A1 (en) * 2004-08-19 2011-09-01 Igt Virtual input system
US20090278684A1 (en) * 2008-05-12 2009-11-12 Robert Bosch Gmbh Scanning security detector
US20120051588A1 (en) * 2009-12-21 2012-03-01 Microsoft Corporation Depth projector system with integrated vcsel array
US20120157200A1 (en) * 2010-12-21 2012-06-21 Microsoft Corporation Intelligent gameplay photo capture

Cited By (79)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10802148B2 (en) 2014-01-29 2020-10-13 Lg Innotek Co., Ltd. Device for extracting depth information and method thereof
US10338221B2 (en) 2014-01-29 2019-07-02 Lg Innotek Co., Ltd. Device for extracting depth information and method thereof
CN106030239A (en) * 2014-01-29 2016-10-12 Lg伊诺特有限公司 Device for extracting depth information and method thereof
US11680789B2 (en) 2014-01-29 2023-06-20 Lg Innotek Co., Ltd. Device for extracting depth information and method thereof
JP2017505907A (en) * 2014-01-29 2017-02-23 エルジー イノテック カンパニー リミテッド Depth information extraction apparatus and method
JP7237114B2 (en) 2014-01-29 2023-03-10 エルジー イノテック カンパニー リミテッド Depth information extractor
CN106030239B (en) * 2014-01-29 2020-10-09 Lg伊诺特有限公司 Apparatus and method for extracting depth information
EP3101382A4 (en) * 2014-01-29 2017-08-30 LG Innotek Co., Ltd. Device for extracting depth information and method thereof
JP2021165749A (en) * 2014-01-29 2021-10-14 エルジー イノテック カンパニー リミテッド Depth information extraction device
US20150256722A1 (en) * 2014-03-07 2015-09-10 Canon Kabushiki Kaisha Image capturing apparatus and control method of image capturing apparatus
US10200584B2 (en) 2014-03-07 2019-02-05 Canon Kabushiki Kaisha Image capturing apparatus, control method of image capturing apparatus, and program
JP2017521660A (en) * 2014-07-08 2017-08-03 フェイスブック,インク. Method and system for adjusting an optical pattern for structured optical imaging
CN106716175B (en) * 2014-07-08 2019-08-02 脸谱科技有限责任公司 Method and system for adjusting light pattern for structured light imaging
CN106716175A (en) * 2014-07-08 2017-05-24 脸谱公司 Method and system for adjusting light pattern for structured light imaging
AU2015287252B2 (en) * 2014-07-08 2019-04-04 Facebook Technologies, Llc Method and system for adjusting light pattern for structured light imaging
KR102372449B1 (en) * 2014-07-08 2022-03-10 페이스북 테크놀로지스, 엘엘씨 Method and system for adjusting light pattern for structured light imaging
WO2016005976A3 (en) * 2014-07-08 2016-04-07 Oculus Vr, Llc Method and system for adjusting light pattern for structured light imaging
KR20170027776A (en) * 2014-07-08 2017-03-10 페이스북, 인크. Method and system for adjusting light pattern for structured light imaging
EP3167311A4 (en) * 2014-07-08 2018-01-10 Facebook Inc. Method and system for adjusting light pattern for structured light imaging
US10996049B2 (en) 2014-07-08 2021-05-04 Facebook Technologies, Llc Method and system for adjusting light pattern for structured light imaging
EP3627186A1 (en) * 2014-07-08 2020-03-25 Facebook Technologies, LLC Method and system for adjusting light pattern for structured light imaging
CN110360953A (en) * 2014-07-08 2019-10-22 脸谱科技有限责任公司 Method and system for adjusting light pattern for structured light imaging
US10408605B2 (en) 2014-07-08 2019-09-10 Facebook Technologies, Llc Method and system for adjusting light pattern for structured light imaging
AU2015287252C1 (en) * 2014-07-08 2019-08-22 Facebook Technologies, Llc Method and system for adjusting light pattern for structured light imaging
WO2016087273A1 (en) * 2014-12-01 2016-06-09 Koninklijke Philips N.V. Device and method for skin detection
US10242278B2 (en) 2014-12-01 2019-03-26 Koninklijke Philips N.V. Device and method for skin detection
CN107250841A (en) * 2015-02-19 2017-10-13 皇家飞利浦有限公司 Infrared laser light irradiation apparatus
EP3113070B1 (en) * 2015-04-10 2021-06-02 Google LLC Method and system for optical user recognition
CN107850706A (en) * 2015-05-20 2018-03-27 脸谱公司 Method and system for generating light pattern using polygons
EP3104209A3 (en) * 2015-05-20 2017-01-25 Oculus VR, LLC Method and system for generating light pattern using polygons
US9648698B2 (en) 2015-05-20 2017-05-09 Facebook, Inc. Method and system for generating light pattern using polygons
US9842407B2 (en) 2015-05-20 2017-12-12 Facebook, Inc. Method and system for generating light pattern using polygons
CN107850706B (en) * 2015-05-20 2019-08-02 脸谱科技有限责任公司 Method and system for generating light pattern using polygons
US20170068319A1 (en) * 2015-09-08 2017-03-09 Microvision, Inc. Mixed-Mode Depth Detection
WO2017044204A1 (en) 2015-09-08 2017-03-16 Microvision, Inc. Mixed-mode depth detection
EP3347738A4 (en) * 2015-09-08 2018-09-05 Microvision, Inc. Mixed-mode depth detection
KR102595996B1 (en) * 2015-09-08 2023-10-30 마이크로비젼, 인코퍼레이티드 Mixed-mode depth detection
KR20180039674A (en) * 2015-09-08 2018-04-18 마이크로비젼, 인코퍼레이티드 Mixed mode depth detection
US10503265B2 (en) 2015-09-08 2019-12-10 Microvision, Inc. Mixed-mode depth detection
CN108027441A (en) * 2015-09-08 2018-05-11 微视公司 Mixed mode depth detection
CN108027438A (en) * 2015-09-20 2018-05-11 高通股份有限公司 Light detection and ranging (LIDAR) system with dual beam steering
US10061020B2 (en) 2015-09-20 2018-08-28 Qualcomm Incorporated Light detection and ranging (LIDAR) system with dual beam steering
US10708577B2 (en) 2015-12-16 2020-07-07 Facebook Technologies, Llc Range-gated depth camera assembly
WO2017106053A1 (en) * 2015-12-16 2017-06-22 Oculus Vr, Llc Range-gated depth camera assembly
EP3391648B1 (en) * 2015-12-16 2021-07-07 Facebook Technologies, LLC Range-gated depth camera assembly
US9992474B2 (en) 2015-12-26 2018-06-05 Intel Corporation Stereo depth camera using VCSEL with spatially and temporally interleaved patterns
WO2017112044A1 (en) * 2015-12-26 2017-06-29 Intel Corporation Stereo depth camera using vcsel with spatially and temporally interleaved patterns
US11762068B2 (en) 2016-04-22 2023-09-19 OPSYS Tech Ltd. Multi-wavelength LIDAR system
CN106679671A (en) * 2017-01-05 2017-05-17 大连理工大学 Navigation marking graph recognition method based on laser data
WO2018152061A1 (en) * 2017-02-14 2018-08-23 Sony Corporation Using micro mirrors to improve the field of view of a 3d depth map
US11016178B2 (en) 2017-03-13 2021-05-25 OPSYS Tech Ltd. Eye-safe scanning LIDAR system
US11927694B2 (en) 2017-03-13 2024-03-12 OPSYS Tech Ltd. Eye-safe scanning LIDAR system
EP3596492A4 (en) * 2017-03-13 2020-12-16 Opsys Tech Ltd Eye-safe scanning lidar system
WO2018169758A1 (en) 2017-03-13 2018-09-20 OPSYS Tech Ltd. Eye-safe scanning lidar system
CN110402398A (en) * 2017-03-13 2019-11-01 欧普赛斯技术有限公司 Eye-safe scanning lidar system
JP2020510208A (en) * 2017-03-13 2020-04-02 オプシス テック リミテッド Eye-safe scanning LIDAR system
JP7037830B2 (en) 2017-03-13 2022-03-17 オプシス テック リミテッド Eye-safe scanning lidar system
CN110402398B (en) * 2017-03-13 2023-12-01 欧普赛斯技术有限公司 Eye-safe scanning lidar system
US11828829B2 (en) * 2017-03-31 2023-11-28 Huawei Technologies Co., Ltd. Apparatus and method for scanning and ranging with eye-safe pattern
US20190391271A1 (en) * 2017-03-31 2019-12-26 Huawei Technologies Co., Ltd. Apparatus and method for scanning and ranging with eye-safe pattern
US11740331B2 (en) 2017-07-28 2023-08-29 OPSYS Tech Ltd. VCSEL array LIDAR transmitter with small angular divergence
US11415676B2 (en) 2017-10-09 2022-08-16 Luminar, Llc Interlaced scan patterns for lidar system
US11415675B2 (en) 2017-10-09 2022-08-16 Luminar, Llc Lidar system with adjustable pulse period
US11353559B2 (en) 2017-10-09 2022-06-07 Luminar, Llc Adjustable scan patterns for lidar system
EP3477437A1 (en) * 2017-10-27 2019-05-01 Thomson Licensing Method of remote-stimulated device illumination through photoluminescence and corresponding apparatus
US11802943B2 (en) 2017-11-15 2023-10-31 OPSYS Tech Ltd. Noise adaptive solid-state LIDAR system
WO2019141699A1 (en) * 2018-01-18 2019-07-25 Robert Bosch Gmbh Method for operating a lighting device or a camera apparatus, control device and camera apparatus
US11906663B2 (en) 2018-04-01 2024-02-20 OPSYS Tech Ltd. Noise adaptive solid-state LIDAR system
US11320538B2 (en) 2019-04-09 2022-05-03 OPSYS Tech Ltd. Solid-state LIDAR transmitter with laser control
US11846728B2 (en) 2019-05-30 2023-12-19 OPSYS Tech Ltd. Eye-safe long-range LIDAR system using actuator
US11513195B2 (en) 2019-06-10 2022-11-29 OPSYS Tech Ltd. Eye-safe long-range solid-state LIDAR system
US11965964B2 (en) 2020-04-07 2024-04-23 OPSYS Tech Ltd. Solid-state LIDAR transmitter with laser control
WO2021231559A1 (en) * 2020-05-13 2021-11-18 Luminar, Llc Lidar system with high-resolution scan pattern
US11841440B2 (en) 2020-05-13 2023-12-12 Luminar Technologies, Inc. Lidar system with high-resolution scan pattern
US11194048B1 (en) 2020-05-13 2021-12-07 Luminar, Llc Lidar system with high-resolution scan pattern
US11494926B1 (en) * 2021-07-01 2022-11-08 Himax Technologies Limited Method for performing hybrid depth detection with aid of adaptive projector, and associated apparatus
WO2023153139A1 (en) * 2022-02-09 2023-08-17 株式会社小糸製作所 Projector, and measuring device
CN116342497A (en) * 2023-03-01 2023-06-27 天津市鹰泰利安康医疗科技有限责任公司 Three-dimensional mapping method and system for inner wall of human body cavity
CN116342497B (en) * 2023-03-01 2024-03-19 天津市鹰泰利安康医疗科技有限责任公司 Three-dimensional mapping method and system for inner wall of human body cavity

Also Published As

Publication number Publication date
US20160006914A1 (en) 2016-01-07
WO2014014838A3 (en) 2014-05-01

Similar Documents

Publication Title
US20160006914A1 (en) Interactive Illumination for Gesture and/or Object Recognition
US10767981B2 (en) Systems and methods for estimating depth from projected texture using camera arrays
JP6546349B2 (en) Depth mapping using structured light and time of flight
US9030529B2 (en) Depth image acquiring device, system and method
EP3065622B1 (en) Mapping the ocular surface
US9885459B2 (en) Pattern projection using micro-lenses
US11330243B2 (en) System and method for 3D scanning
US20160364903A1 (en) 3d geometric modeling and 3d video content creation
US9215449B2 (en) Imaging and processing using dual clocks
US8150142B2 (en) Depth mapping using projected patterns
US9842407B2 (en) Method and system for generating light pattern using polygons
CN106572340A (en) Camera shooting system, mobile terminal and image processing method
US20020057438A1 (en) Method and apparatus for capturing 3D surface and color thereon in real time
CN106454287A (en) Combined camera shooting system, mobile terminal and image processing method
US20040105580A1 (en) Acquisition of three-dimensional images by an active stereo technique using locally unique patterns
CN107783353A (en) For catching the apparatus and system of stereopsis
KR102481774B1 (en) Image apparatus and operation method thereof
CN104903680B (en) Method for controlling the linear dimensions of three-dimensional objects
Zanuttigh et al. Operating principles of structured light depth cameras
CN114647084A (en) MEMS galvanometer based extended reality projection with eye tracking
KR102184210B1 (en) 3d camera system
JP4241250B2 (en) Three-dimensional shape measuring apparatus and method
EP3104209B1 (en) Method and system for generating light pattern using polygons
Kraus Wireless Optical Communication: Infrared, Time-Of-Flight, Light Fields, and Beyond

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 13820063

Country of ref document: EP

Kind code of ref document: A2

122 EP: PCT application non-entry in European phase

Ref document number: 13820063

Country of ref document: EP

Kind code of ref document: A2