US20160328854A1 - Distance sensor - Google Patents

Distance sensor

Info

Publication number
US20160328854A1
US20160328854A1 (application US15/149,429)
Authority
US
United States
Prior art keywords
diffractive optical
projection
light source
light sources
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/149,429
Inventor
Akiteru Kimura
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Magik Eye Inc
Original Assignee
Magik Eye Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Magik Eye Inc
Priority to US15/149,429
Assigned to Magik Eye Inc. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KIMURA, AKITERU
Publication of US20160328854A1
Legal status: Abandoned

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C3/02 Details
    • G01C3/06 Use of electric means to obtain final indication
    • G01C3/08 Use of electric radiation detectors
    • G06T7/0051
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F21 LIGHTING
    • F21V FUNCTIONAL FEATURES OR DETAILS OF LIGHTING DEVICES OR SYSTEMS THEREOF; STRUCTURAL COMBINATIONS OF LIGHTING DEVICES WITH OTHER ARTICLES, NOT OTHERWISE PROVIDED FOR
    • F21V5/00 Refractors for light sources
    • F21V5/008 Combination of two or more successive refractors along an optical axis
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/02 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/026 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring distance between sensor and object
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B6/00 Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
    • G02B6/0001 Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings specially adapted for lighting devices or systems
    • G02B6/0005 Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings specially adapted for lighting devices or systems the light guides being of the fibre type
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B6/00 Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
    • G02B6/24 Coupling light guides
    • G02B6/26 Optical coupling means
    • G02B6/34 Optical coupling means utilising prism or grating
    • G06K9/00805
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01S DEVICES USING THE PROCESS OF LIGHT AMPLIFICATION BY STIMULATED EMISSION OF RADIATION [LASER] TO AMPLIFY OR GENERATE LIGHT; DEVICES USING STIMULATED EMISSION OF ELECTROMAGNETIC RADIATION IN WAVE RANGES OTHER THAN OPTICAL
    • H01S5/00 Semiconductor lasers
    • H01S5/40 Arrangement of two or more semiconductor lasers, not provided for in groups H01S5/02 - H01S5/30
    • H01S5/42 Arrays of surface emitting lasers
    • H01S5/423 Arrays of surface emitting lasers having a vertical cavity
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/698 Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • H04N5/2256
    • H04N5/23238
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/42 Diffraction optics, i.e. systems including a diffractive element being designed for providing a diffractive effect
    • G02B27/4233 Diffraction optics, i.e. systems including a diffractive element being designed for providing a diffractive effect having a diffractive element [DOE] contributing to a non-imaging application
    • G02B27/425 Diffraction optics, i.e. systems including a diffractive element being designed for providing a diffractive effect having a diffractive element [DOE] contributing to a non-imaging application in illumination systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10052 Images from lightfield camera
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T2207/30261 Obstacle
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56 Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means

Definitions

  • the present disclosure relates generally to computer vision systems and relates more particularly to sensors for measuring the distance between a vehicle and an object or point in space.
  • Unmanned vehicles, such as robotic vehicles and drones, typically rely on computer vision systems for obstacle detection and navigation in the surrounding environment.
  • These computer vision systems typically rely on various sensors that acquire visual data from the surrounding environment, which the computer vision systems process in order to gather information about the surrounding environment. For instance, data acquired via one or more imaging sensors may be used to determine the distance from the vehicle to a particular object or point in the surrounding environment.
  • a distance sensor includes a projection light source, a first light guiding means positioned to guide light emitted by the projection light source, a diffractive optical element positioned to split the light guided by the first light guiding means into a plurality of projection beams traveling in different directions, and an image capturing device positioned to capture an image of a field of view, including a projection pattern created by an incidence of the plurality of projection beams on an object in the field of view.
  • In another embodiment, a distance sensor includes a plurality of projection light sources, a first plurality of optical fibers, wherein a first end of each optical fiber of the first plurality of optical fibers is coupled to one projection light source of the plurality of projection light sources, a plurality of diffractive optical elements, wherein each diffractive optical element of the plurality of diffractive optical elements is coupled to a second end of one optical fiber of the first plurality of optical fibers, a plurality of illumination light sources, comprising light sources that are different from the plurality of projection light sources, a second plurality of optical fibers, wherein a first end of each optical fiber of the second plurality of optical fibers is coupled to one illumination light source of the plurality of illumination light sources, a plurality of illumination optics, wherein each illumination optic of the plurality of illumination optics is coupled to a second end of one optical fiber of the second plurality of optical fibers, and an image capturing device, wherein the plurality of diffractive optical elements and the plurality of illumination optics are arranged in a ring around a central optical axis of the image capturing device.
  • a distance sensor comprises a plurality of vertical cavity surface emitting lasers arranged on a circuit board, a plurality of gradient-index lenses, wherein each gradient-index lens of the plurality of gradient-index lenses is positioned to collimate a beam of light produced by one vertical cavity surface emitting laser of the plurality of vertical cavity surface emitting lasers, a plurality of diffractive optical elements, wherein each diffractive optical element of the plurality of diffractive optical elements is positioned to split a beam collimated by one gradient-index lens of the plurality of gradient-index lenses into a plurality of beams traveling in different directions, and a plurality of Powell lenses, wherein each Powell lens of the plurality of Powell lenses is positioned to generate a projection pattern from a plurality of beams generated by one diffractive optical element of the plurality of diffractive optical elements.
  • FIG. 1A illustrates a cross-sectional view of a first embodiment of a distance sensor of the present disclosure
  • FIG. 1B illustrates a top view of the distance sensor of FIG. 1A ;
  • FIG. 2 illustrates an example field of view of the distance sensor of FIGS. 1A and 1B ;
  • FIG. 3A illustrates a cross-sectional view of a second embodiment of a distance sensor of the present disclosure
  • FIG. 3B illustrates a top view of the distance sensor of FIG. 3A ;
  • FIG. 4 illustrates a flowchart of a method for calculating the distance from a sensor to an object or point in space
  • FIG. 5 illustrates a triangulation technique by which the distance from a sensor to an object or point may be calculated
  • FIG. 6A illustrates a cross-sectional view of a third embodiment of a distance sensor of the present disclosure
  • FIG. 6B illustrates a top view of the distance sensor of FIG. 6A ;
  • FIG. 7 illustrates a first example projection pattern that may be generated by the distance sensor of FIG. 6 ;
  • FIG. 8 illustrates a second example projection pattern that may be generated by the distance sensor of FIG. 6 ;
  • FIG. 9 depicts a portion of a fourth embodiment of a distance sensor of the present disclosure.
  • FIG. 10 depicts a high-level block diagram of a general-purpose computer suitable for use in performing the functions described herein.
  • the present disclosure relates to a distance sensor.
  • Distance sensors may be used in unmanned vehicles in order to help a computer vision system determine the distance from the vehicle to a particular object or point in the surrounding environment.
  • a distance sensor may project one or more beams of light onto the object or point and then compute the distance according to time of flight (TOF), analysis of the reflected light (e.g., lidar), or other means.
  • Conventional distance sensors of this type tend to be bulky, however, and thus may not be suitable for use in compact vehicles.
  • the sensors can be very expensive to manufacture and tend to have a limited field of view. For instance, even using an arrangement of multiple conventional imaging sensors provides a field of view that is less than 360 degrees.
  • Embodiments of the disclosure provide a compact distance sensor that is economical to manufacture, includes few or no moving parts, and can measure distances in a field of view of up to 360 degrees.
  • the sensor uses a set of beam splitting means such as an array of optical fibers and diffractive optical elements (DOEs) to generate a plurality of projection points around a wide angle lens.
  • Each of the plurality of projection points emits a plurality of beams into a field of view. From the appearances of the beams, the sensor can measure distances in a 180 degree hemispherical field of view. By mounting two such sensors back-to-back, distances can be measured in a 360 degree field of view.
  • the DOEs make it possible to split a beam generated by a single light source (e.g., laser) into multiple projection beams that are projected onto an object or point in the field of view.
  • beams emitted by multiple light sources are split by the DOEs.
  • the distance from the sensor to the object or point can then be calculated in one cycle of projection and image capture from the multiple projections.
  • a compact sensor capable of measuring distances in fields of view of up to 360 degrees can provide meaningful data for applications including unmanned vehicle navigation, endoscopy, and other applications.
  • FIGS. 1A and 1B illustrate a first embodiment of a distance sensor 100 of the present disclosure.
  • FIG. 1A illustrates a cross-sectional view of the distance sensor 100
  • FIG. 1B illustrates a top view of the distance sensor 100 of FIG. 1A .
  • the distance sensor 100 may be mounted, for example, to an unmanned vehicle, or may be used as part of an endoscope.
  • the distance sensor 100 comprises a plurality of components arranged in a compact configuration.
  • the components include at least one light source 104 , a first beam splitting means, hereinafter referred to as a first diffractive optical element 106 , a plurality of light guiding means such as optical fibers 102 1 - 102 n (hereinafter collectively referred to as “optical fibers 102 ”), an array of second beam splitting means, hereinafter referred to as second diffractive optical elements 108 1 - 108 n (and hereinafter collectively referred to as “second diffractive optical elements 108 ”), and an imaging sensor 110 including a wide-angle lens 112 .
  • the components are arranged substantially symmetrically about a central axis A-A′.
  • the central axis A-A′ coincides with the optical axis of the imaging sensor 110 .
  • the light source 104 is positioned at a first end of the central axis A-A′.
  • the light source 104 is a laser light source that emits a single beam of light along the central axis A-A′.
  • the single beam emitted by the light source 104 may also be referred to as the “primary beam.”
  • the light source 104 emits light of a wavelength that is known to be relatively safe to human vision (e.g., infrared).
  • the light source 104 may include circuitry to adjust the intensity of its output.
  • the light source 104 may emit light in pulses, so as to mitigate the effects of ambient light on image capture.
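  • For illustration only, one common way such pulsing can be exploited is to capture one frame with the light source on and one with it off and difference them, so that static ambient light largely cancels. The sketch below is an assumed scheme, not one defined by this disclosure; the light_source and camera interfaces are hypothetical.

```python
import numpy as np

# Illustrative ambient-light suppression via pulsed capture (an assumed
# scheme, not defined by the disclosure). "light_source" and "camera" are
# hypothetical hardware interfaces; camera.capture() is assumed to return a
# grayscale image as a NumPy array of 8-bit values.
def capture_pattern_frame(light_source, camera):
    light_source.on()
    lit = camera.capture().astype(np.int32)    # projected pattern + ambient light
    light_source.off()
    dark = camera.capture().astype(np.int32)   # ambient light only
    # Differencing removes (approximately) the ambient contribution, provided
    # the scene and ambient illumination are static between the two frames.
    return np.clip(lit - dark, 0, 255).astype(np.uint8)
```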
  • the first diffractive optical element (DOE) 106 is positioned along the central axis A-A′ in proximity to the light source 104 (e.g., “in front” of the light source 104 , relative to the direction in which light emitted by the light source 104 propagates).
  • the first DOE 106 is positioned to intercept the single beam of light emitted by the light source 104 and to split the single or primary beam into a plurality of secondary beams.
  • the first DOE 106 is any optical component that is capable of splitting the primary beam into a plurality of secondary beams that diverge from the primary beam in different directions.
  • the first DOE 106 may include a conical mirror, a holographic film, a micro lens, or a line generator (e.g., a Powell lens).
  • the plurality of secondary beams are arranged in a cone shape.
  • the primary beam may be split by means other than diffraction.
  • Each of the optical fibers 102 is coupled at one end to the first DOE 106 and is coupled at the other end to one of the DOEs 108 in the array of second DOEs 108 . In this way, each of the optical fibers 102 is positioned to carry at least one of the secondary beams produced by the first DOE 106 to one of the DOEs 108 in the array of second DOEs 108 .
  • the array of second DOEs 108 is positioned along the central axis A-A′ in proximity to the first DOE 106 (e.g., “in front” of the first DOE 106 , relative to the direction in which light emitted by the light source 104 propagates).
  • the array of second DOEs 108 is positioned such that the first DOE 106 is positioned between the light source 104 and the array of second DOEs 108 .
  • the second DOEs 108 are arranged in a ring-shaped array, with the central axis A-A′ passing through the center of the ring and the second DOEs 108 spaced at regular intervals around the ring.
  • the second DOEs 108 are spaced approximately thirty degrees apart around the ring.
  • the array of second DOEs 108 is positioned “behind” a principal point of the imaging sensor 110 (i.e., the point where the optical axis A-A′ intersects the image plane), relative to the direction in which light emitted by the light source 104 propagates.
  • Each second DOE 108 is positioned to receive one of the secondary beams transmitted by one of the optical fibers 102 from the first DOE 106 and to split the secondary beam into a plurality of (e.g., two or more) tertiary beams that are directed away from the second DOE 108 in a radial manner.
  • each second DOE 108 defines a projection point of the sensor 100 from which a group of projection beams (or tertiary beams) is emitted into the field of view.
  • each respective plurality of tertiary beams fans out to cover a range of approximately one hundred degrees.
  • the second DOEs 108 are any optical components that are capable of splitting a respective secondary beam into a plurality of tertiary beams that diverge from the secondary beam in different directions.
  • each second DOE 108 may include a conical mirror, a holographic film, a micro lens, or a line generator (e.g., a Powell lens).
  • the secondary beams are split by a means other than diffraction.
  • each plurality of tertiary beams is arranged in a fan or radial pattern, with equal angles between each of the beams.
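  • To make the projection geometry concrete, the sketch below generates unit direction vectors for such a set of tertiary beams: a ring of projection points spaced approximately thirty degrees apart, each fanning its beams over approximately one hundred degrees with equal angles between adjacent beams. The number of beams per projection point is an assumption chosen for illustration; the disclosure does not fix these values in code.

```python
import math

# Illustrative sketch of the projection geometry described above (not code
# from the disclosure). Each of the 12 projection points (spaced ~30 degrees
# apart around the ring) emits a fan of beams spanning ~100 degrees, with
# equal angles between adjacent beams. beams_per_point is an assumed value.
def tertiary_beam_directions(num_points=12, beams_per_point=7, fan_degrees=100.0):
    directions = []
    step = fan_degrees / max(beams_per_point - 1, 1)
    for p in range(num_points):
        ring_angle = math.radians(p * 360.0 / num_points)
        radial = (math.cos(ring_angle), math.sin(ring_angle))  # outward direction of this point
        for b in range(beams_per_point):
            # tilt each beam out of the ring plane, symmetric about the radial direction
            tilt = math.radians(-fan_degrees / 2.0 + b * step)
            directions.append((radial[0] * math.cos(tilt),
                               radial[1] * math.cos(tilt),
                               math.sin(tilt)))
    return directions
```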
  • each of the second DOEs 108 is configured to project tertiary beams that create a different visual pattern on a surface. For example, one second DOE 108 may project a pattern of dots, while another second DOE 108 may project a pattern of lines or x's.
  • the imaging sensor 110 is positioned along the central axis A-A′, in the middle of the array of second DOEs 108 (e.g., at least partially “in front” of the array of second DOEs 108 , relative to the direction in which light emitted by the light source 104 propagates).
  • the imaging sensor 110 is an image capturing device, such as a still or video camera.
  • the imaging sensor 110 includes a wide-angle lens 112 , such as a fisheye lens, that creates a hemispherical field of view.
  • the imaging sensor 110 includes circuitry for calculating the distance from the distance sensor 100 to an object or point.
  • the imaging sensor includes a network interface for communicating captured images over a network to a processor, where the processor calculates the distance from the distance sensor 100 to an object or point and then communicates the calculated distance back to the distance sensor 100 .
  • the distance sensor 100 uses a single light source (e.g., light source 104 ) to produce multiple projection points from which sets of projection beams (e.g., comprising patterns of dots or lines) are emitted.
  • the distance from the distance sensor 100 to an object can be calculated from the appearances of the projection beams in the field of view (as discussed in greater detail below).
  • the use of the first and second DOEs makes it possible to generate a plurality of projection points around the lens, from the single beam of light emitted by the light source. This, plus the use of the optical fibers 102 to transmit light from the first DOE to the second DOEs, allows the distance sensor 100 to maintain a relatively compact form factor while measuring distance within a wide field of view.
  • the imaging sensor 110 and the light source 104 can also be mounted in the same plane in order to make the design more compact; however, in one embodiment, the second DOEs 108 1 - 108 n are positioned behind the principal point of the imaging sensor 110 in order to increase the field of view that can be covered by the projection beams (e.g., such that the depth angle of the field of view is closer to a full 180 degrees, or, in some cases, even greater).
  • when each of the second DOEs 108 projects tertiary beams of a different pattern, the circuitry in the imaging sensor can easily determine which beams in a captured image were created by which of the second DOEs 108 . This facilitates the distance calculations, as discussed in greater detail below.
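  • As an illustration of why distinct per-point patterns help, the short sketch below groups detected image features by the pattern type that produced them. The feature detector and the pattern labels are hypothetical placeholders; the disclosure does not specify this software.

```python
# Hypothetical grouping of detected pattern features by projection point.
# "features" is assumed to be a list of (pattern_type, measured_angle) pairs
# produced by some upstream detector; pattern_to_doe maps each pattern type
# (e.g., "dots", "lines", "x") to the second DOE known to project it.
def group_features_by_doe(features, pattern_to_doe):
    groups = {doe_id: [] for doe_id in pattern_to_doe.values()}
    for pattern_type, measured_angle in features:
        doe_id = pattern_to_doe[pattern_type]   # unambiguous when patterns differ
        groups[doe_id].append(measured_angle)
    return groups

# Example usage with hypothetical labels:
# group_features_by_doe([("dots", 0.31), ("lines", -0.12)], {"dots": 1, "lines": 2})
```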
  • FIG. 2 illustrates an example field of view 200 of the distance sensor 100 of FIGS. 1A and 1B .
  • certain components of the distance sensor 100 are also illustrated in an exploded view.
  • the field of view 200 is substantially hemispherical in shape.
  • the plurality of tertiary light beams produced by the distance sensor 100 projects a pattern of light onto the “virtual” hemisphere.
  • the patterns are represented by the series of concentric circles that are illustrated where each tertiary beam meets the hemisphere.
  • the circles are depicted as gradually decreasing in size as the distance from the distance sensor 100 increases, in order to show how the patterns created by the tertiary beams change visually by object distance.
  • the field of view of the distance sensor 100 covers approximately 180 degrees.
  • the field of view can be expanded to approximately 360 degrees by mounting two distance sensors back-to-back, e.g., such that their respective light sources emit primary beams in two different directions separated by approximately 180 degrees.
  • FIGS. 3A and 3B illustrate a second embodiment of a distance sensor 300 of the present disclosure.
  • FIG. 3A illustrates a cross-sectional view of the distance sensor 300
  • FIG. 3B illustrates a top view of the distance sensor 300 of FIG. 3A .
  • the distance sensor 300 may be mounted, for example, to an unmanned vehicle, or may be used as part of an endoscope.
  • the distance sensor 300 comprises a plurality of components arranged in a compact configuration.
  • the components include a plurality of light sources 304 1 - 304 m (hereinafter collectively referred to as “light sources 304 ”), a first beam splitting means, hereinafter referred to as a first diffractive optical element 306 , a plurality of light guiding means such as optical fibers 302 1 - 302 n (hereinafter collectively referred to as “optical fibers 302 ”), an array of second beam splitting means, hereinafter referred to as second diffractive optical elements 308 1 - 308 n (and hereinafter collectively referred to as “second diffractive optical elements 308 ”), and an imaging sensor 310 including a wide-angle lens 312 .
  • the components are arranged substantially symmetrically about a central axis A-A′.
  • the central axis A-A′ coincides with the optical axis of the imaging sensor 310 .
  • the light sources 304 are positioned at a first end of the central axis A-A′.
  • the light sources 304 comprise a plurality of laser light sources that each emit a single beam of light along the central axis A-A′.
  • a single beam emitted by one of the light sources 304 may also be referred to as the “primary beam.”
  • the light sources 304 may include two or more different types of light sources, such as two or more laser light sources that emit light of different wavelengths.
  • each of the light sources 304 emits light of a wavelength that is known to be relatively safe to human vision (e.g., infrared).
  • one or more of the light sources 304 may include circuitry to adjust the intensity of its output.
  • one or more of the light sources 304 may emit light in pulses, so as to mitigate the effects of ambient light on image capture.
  • the first diffractive optical element (DOE) 306 is positioned along the central axis A-A′ in proximity to the light sources 304 (e.g., “in front” of the light sources 304 , relative to the direction in which light emitted by the light sources 304 propagates).
  • the first DOE 306 is positioned to intercept the single beam of light emitted by one of the light sources 304 and to split the single or primary beam into a plurality of secondary beams.
  • the first DOE 306 is any optical component that is capable of splitting the primary beam into a plurality of secondary beams that diverge from the primary beam in different directions.
  • the first DOE 306 may include a conical mirror, a holographic film, a micro lens, or a line generator (e.g., a Powell lens).
  • the plurality of secondary beams are arranged in a cone shape.
  • the primary beam may be split by means other than diffraction.
  • Each of the optical fibers 302 is coupled at one end to the first DOE 306 and is coupled at the other end to one of the DOEs 308 in the array of second DOEs 308 . In this way, each of the optical fibers 302 is positioned to carry at least one of the secondary beams produced by the first DOE 306 to one of the DOEs 308 in the array of second DOEs 308 .
  • the array of second DOEs 308 is positioned along the central axis A-A′ in proximity to the first DOE 306 (e.g., “in front” of the first DOE 306 , relative to the direction in which light emitted by the light sources 304 propagates).
  • the array of second DOEs 308 is positioned such that the first DOE 306 is positioned between the light sources 304 and the array of second DOEs 308 .
  • the second DOEs 308 are arranged in a ring-shaped array, with the central axis A-A′ passing through the center of the ring and the second DOEs 308 spaced at regular intervals around the ring.
  • the second DOEs 308 are spaced approximately thirty degrees apart around the ring.
  • the array of second DOEs 308 is positioned “behind” a principal point of the imaging sensor 310 (i.e., the point where the optical axis A-A′ intersects the image plane), relative to the direction in which light emitted by the light sources 304 propagates.
  • Each second DOE 308 is positioned to receive one of the secondary beams transmitted by one of the optical fibers 302 from the first DOE 306 and to split the secondary beam into a plurality of (e.g., two or more) tertiary beams that are directed away from the second DOE 308 in a radial manner.
  • each second DOE 308 defines a projection point of the sensor 300 from which a group of projection beams (or tertiary beams) is emitted into the field of view.
  • each respective plurality of tertiary beams fans out to cover a range of approximately one hundred degrees.
  • the second DOEs 308 are any optical components that are capable of splitting a respective secondary beam into a plurality of tertiary beams that diverge from the secondary beam in different directions.
  • each second DOE 308 may include a conical mirror, a holographic film, a micro lens, or a line generator (e.g., a Powell lens).
  • the secondary beams are split by a means other than diffraction.
  • each plurality of tertiary beams is arranged in a fan or radial pattern, with equal angles between each of the beams.
  • each of the second DOEs 308 is configured to project tertiary beams that create a different visual pattern on a surface. For example, one second DOE 308 may project a pattern of dots, while another second DOE 308 may project a pattern of lines or x's.
  • the imaging sensor 310 is positioned along the central axis A-A′, in the middle of the array of second DOEs 308 (e.g., at least partially “in front” of the array of second DOEs 308 , relative to the direction in which light emitted by the light sources 304 propagates).
  • the imaging sensor 310 is an image capturing device, such as a still or video camera.
  • the imaging sensor 310 includes a wide-angle lens 312 , such as a fisheye lens, that creates a hemispherical field of view.
  • the imaging sensor 310 includes circuitry for calculating the distance from the distance sensor 300 to an object or point.
  • the imaging sensor includes a network interface for communicating captured images over a network to a processor, where the processor calculates the distance from the distance sensor 300 to an object or point and then communicates the calculated distance back to the distance sensor 300 .
  • FIG. 4 illustrates a flowchart of a method 400 for calculating the distance from a sensor to an object or point in space.
  • the method 400 may be performed by a processor integrated in an imaging sensor (such as the imaging sensor 110 illustrated in FIG. 1A or the imaging sensor 310 illustrated in FIG. 3A ) or a general purpose computing device as illustrated in FIG. 10 and discussed below.
  • in step 404, a light source is activated to generate a primary beam of light.
  • a single primary beam is generated by a single light source; however, in other embodiments, multiple primary beams are generated by multiple light sources.
  • the light source or light sources comprise a laser light source.
  • in step 406, the primary beam is split into a plurality of secondary beams using a first beam splitting means (e.g., a diffractive optical element) that is positioned in the path along which the primary beam propagates.
  • the first beam splitting means may be, for example, a conical mirror. Step 406 is performed, for example, when the distance sensor (of which the imaging sensor is a part) includes only a single light source.
  • in step 408, each beam in the plurality of secondary beams is split into a plurality of projection or tertiary beams using a second beam splitting means (e.g., a second diffractive optical element) in an array of beam splitting means.
  • a plurality of second beam splitting means are positioned in a ring, such that each second beam splitting means is positioned in the path along which one of the secondary beams propagates.
  • at least some of the second beam splitting means are conical mirrors.
  • the method 400 may proceed directly from step 404 to step 408 . In this case, each primary beam of a plurality of primary beams (generated using the plurality of light sources) is directly split into a plurality of projection beams by one of the second beam splitting means.
  • in step 410, at least one image of the object or point is captured.
  • the image includes a pattern that is projected onto the object or point and onto the surrounding space.
  • the pattern is created by each of the projection beams projecting a series of dots, lines, or other shapes onto the object, point, or surrounding space.
  • in step 412, the distance from the sensor to the object or point is calculated using information from the images captured in step 410.
  • a triangulation technique is used to calculate the distance.
  • the positional relationships between parts of the patterns projected by the sensor can be used as the basis for the calculation.
  • the method 400 ends in step 414 .
  • the method 400 , in combination with the sensor depicted in FIGS. 1A-1B or in FIGS. 3A-3B , can measure the distance from the sensor to an object or point in space in a single cycle of image capture and calculation.
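  • The single projection-and-capture cycle described above can be summarized in software terms. The sketch below is a minimal illustration assuming hypothetical light-source and camera interfaces and a separate distance-calculation routine; the beam splitting of steps 406-408 is performed optically by the DOEs, not in code.

```python
# Minimal sketch of one cycle of method 400 (FIG. 4). All interfaces here
# (light_source, camera, calculate_distances) are hypothetical placeholders;
# the disclosure does not define a software API.
def run_measurement_cycle(light_source, camera, calculate_distances):
    light_source.activate()              # step 404: generate the primary beam(s)
    # steps 406-408: the DOEs split the beam(s) optically into projection beams
    image = camera.capture()             # step 410: capture the projected pattern
    light_source.deactivate()            # optional: pulse the source between cycles
    return calculate_distances(image)    # step 412: e.g., triangulation (see FIG. 5)
```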
  • FIG. 5 illustrates a triangulation technique by which the distance from the sensor to the object or point may be calculated in step 412 .
  • FIG. 5 illustrates the example imaging sensor 110 of FIG. 1 , as well as two of the projection points, which may be defined by two of the second diffractive optical elements 108 1 and 108 2 .
  • Each of the projection points emits a respective projection beam 500 1 and 500 2 , which is incident upon the object to create a respective point 502 1 and 502 2 (e.g., dot or line) in a pattern.
  • These points 502 1 and 502 2 are detected by the imaging sensor 110 and may be used to calculate the distance, D, between the imaging sensor 110 and the object as follows:
  • α 2 is the angle formed between the projection beam 500 2 and a central axis c 2 of the second diffractive optical element 108 2
  • α 1 is the angle formed between the projection beam 500 1 and a central axis c 1 of the second diffractive optical element 108 1
  • θ 2 is the angle formed between the central optical axis O of the imaging sensor 110 and the angle at which the imaging sensor 110 perceives the point 502 2 created by the projection beam 500 2
  • θ 1 is the angle formed between the central optical axis O of the imaging sensor 110 and the angle at which the imaging sensor 110 perceives the point 502 1 created by the projection beam 500 1 .
  • EQN. 1 is derived from the following relationships:
  • EQNs. 2 and 3 allow one to calculate the distance from a source of a projection pattern (comprising, e.g., a pattern of dots) to an object onto which the projection pattern is projected.
  • the distance is calculated based on the positional relationship between the points of light (e.g., the dots) that form the projection pattern when the points of light are emitted by different projection points around the source.
  • the positional relationships between the points of light are known a priori (i.e., not measured as part of the calculation).
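  • As a simplified numerical illustration of this kind of triangulation (a single-beam planar case, not a reproduction of the disclosure's EQN. 1), consider one projection point offset by a known distance b from the imaging sensor's optical axis, emitting a beam at a known angle relative to an axis parallel to the optical axis, with the resulting dot perceived by the imaging sensor at a known angle from its optical axis. The symbol names and geometry below are assumptions made for illustration.

```python
import math

def distance_from_single_beam(b, alpha, theta):
    """Planar triangulation sketch (an illustration, not the disclosure's EQN. 1).

    b     -- known offset of the projection point from the imaging sensor's optical axis
    alpha -- angle of the projection beam, measured from an axis parallel to the optical axis (radians)
    theta -- angle at which the imaging sensor perceives the projected dot (radians)

    The beam gives x = b + D*tan(alpha); the camera ray gives x = D*tan(theta).
    Equating the two and solving for the distance D along the optical axis:
    """
    denom = math.tan(theta) - math.tan(alpha)
    if abs(denom) < 1e-12:
        raise ValueError("beam and camera ray are parallel; no intersection")
    return b / denom

# Example: projection point 30 mm off-axis, beam at 20 degrees, dot perceived at 35 degrees
# distance_from_single_beam(30.0, math.radians(20.0), math.radians(35.0)) ~= 89.2 mm
```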
  • FIGS. 6A and 6B illustrate a third embodiment of a distance sensor 600 of the present disclosure.
  • FIG. 6A illustrates a cross-sectional view of the distance sensor 600
  • FIG. 6B illustrates a top view of the distance sensor 600 of FIG. 6A .
  • the distance sensor 600 may be suitable for use as part of an endoscope.
  • the distance sensor 600 comprises a plurality of components arranged in a compact configuration.
  • the components include a plurality of illumination light sources (hereinafter collectively referred to as “illumination light sources 602 ,” of which one illumination light source 602 1 is visible in FIG. 6A ), a plurality of projection light sources (hereinafter collectively referred to as “projection light sources 604 ,” of which one projection light source 604 1 is visible in FIG. 6A ), a plurality of light guiding means such as optical fibers (hereinafter collectively referred to as “optical fibers 606 ,” of which two optical fibers 606 1 and 606 6 are visible in FIG. 6A ), a plurality of beam splitting means, hereinafter referred to as diffractive optical elements (hereinafter collectively referred to as “DOEs 608 ,” of which one DOE 608 1 is visible in FIG. 6A ), a plurality of collimating lenses, such as gradient-index (GRIN) lenses (hereinafter collectively referred to as “GRIN lenses 616 ,” of which one GRIN lens 616 1 is visible in FIG. 6A ), a plurality of Powell lenses 614 1 - 614 3 (hereinafter collectively referred to as “Powell lenses 614 ”), a plurality of illumination optics assemblies 618 1 - 618 3 (hereinafter collectively referred to as “illumination optics assemblies 618 ”), and an imaging sensor 610 including a wide-angle lens 612 .
  • the components are arranged substantially symmetrically about a central axis A-A′.
  • the central axis A-A′ coincides with the optical axis of the imaging sensor 610 .
  • the illumination light sources 602 and the projection light sources 604 are positioned at a first end of the central axis A-A′.
  • the illumination light sources 602 comprise a plurality of light emitting diodes (LEDs) that each emit illumination in a direction substantially parallel to the central axis A-A′.
  • LEDs light emitting diodes
  • the projection light sources 604 comprise a plurality of laser light sources, such as vertical cavity surface emitting lasers (VCSELs), that each emit a single beam of light in a direction substantially parallel to the central axis A-A′.
  • VCSELs vertical cavity surface emitting lasers
  • a single beam emitted by one of the projection light sources 604 may also be referred to as the “primary beam.”
  • each of the projection light sources 604 emits light of a wavelength that is known to be relatively safe to human vision (e.g., infrared).
  • the projection light sources 604 may include circuitry to adjust the intensity of their output.
  • one or more of the projection light sources 604 may emit light in pulses, so as to mitigate the effects of ambient light on image capture.
  • the illumination light sources 602 are connected to the illumination optics assemblies 618 via a first subset of the optical fibers 606 .
  • the illumination optics assemblies 618 may include lenses, diffractive elements, or other components that help to direct illumination generated by the illumination light sources 602 in the appropriate directions.
  • the illumination optics assemblies 618 are positioned along the central axis A-A′ in proximity to the illumination light sources 602 (e.g., “in front” of the illumination light sources 602 , relative to the direction in which light emitted by illumination light sources 602 propagates).
  • as more clearly illustrated in FIG. 6B , the illumination optics assemblies 618 are arranged in a ring-shaped array, with the central axis A-A′ passing through the center of the ring and the illumination optics assemblies 618 spaced at regular intervals around the ring, e.g., alternating with the Powell lenses 614 .
  • the illumination optics assemblies 618 are spaced approximately 120 degrees apart around the ring.
  • the illumination optics assemblies 618 are positioned “behind” a principal point of the imaging sensor 610 (i.e., the point where the optical axis A-A′ intersects the image plane), relative to the direction in which light emitted by the illumination light sources 602 propagates.
  • the projection light sources 604 are connected to the GRIN lenses 616 via a second subset of the optical fibers 606 .
  • GRIN lenses 616 are positioned along the central axis A-A′ in proximity to the projection light sources 604 (e.g., “in front” of the projection light sources 604 , relative to the direction in which light emitted by the projection light sources 604 propagates).
  • Each GRIN lens 616 is coupled to one of the DOEs 608 , which is in turn coupled to one of the Powell lenses 614 .
  • each set of GRIN lens 616 , DOE 608 , and Powell lens 614 receives, via an optical fiber 606 , a single beam of light emitted by one of the projection light sources 604 and splits the single or primary beam into a plurality of secondary beams.
  • the DOEs 608 may be any optical components that are capable of splitting a primary beam into a plurality of secondary beams that diverge from the primary beam in different directions.
  • the DOEs 608 may include a conical mirror, a holographic film, a micro lens, or a line generator.
  • the primary beam may be split by means other than diffraction.
  • the sensor 600 may include three sets each of: (1) illumination light source 602 , optical fiber 606 , and illumination optics 618 (hereinafter an “illumination assembly”); and (2) projection light source 604 , optical fiber 606 , GRIN lens 616 , DOE 608 , and Powell lens 614 (hereinafter a “projection assembly”).
  • the illumination assemblies and projection assemblies may be arranged in an alternating fashion around the central axis A-A′, such that each illumination assembly is spaced from the next illumination assembly by approximately 120 degrees, and each projection assembly is spaced from the next projection assembly by approximately 120 degrees. In further embodiments, however, different numbers of illumination assemblies and projection assemblies may be used. In addition, the spacing between illumination assemblies and projection assemblies may be varied.
  • the imaging sensor 610 is positioned along the central axis A-A′, in the middle of the ring of illumination assemblies and projection assemblies (e.g., at least partially “in front” of the illumination assemblies and projection assemblies, relative to the direction in which light emitted by the illumination assemblies and projection assemblies propagates).
  • the imaging sensor 610 is an image capturing device, such as a still or video camera.
  • the imaging sensor 610 includes a wide-angle lens 612 , such as a fisheye lens, that creates a hemispherical field of view.
  • the imaging sensor 610 includes circuitry for calculating the distance from the distance sensor 600 to an object or point.
  • the imaging sensor includes a network interface for communicating captured images over a network to a processor, where the processor calculates the distance from the distance sensor 600 to an object or point and then communicates the calculated distance back to the distance sensor 600 .
  • FIG. 7 illustrates a first example projection pattern that may be generated by the distance sensor 600 of FIG. 6 .
  • the projection pattern comprises a plurality of groups of parallel lines (e.g., where each projection assembly of the distance sensor 600 projects one group of parallel lines).
  • the planes of the groups of parallel lines may intersect near the central axis A-A′.
  • FIG. 8 illustrates a second example projection pattern that may be generated by the distance sensor 600 of FIG. 6 .
  • the projection pattern comprises a plurality of groups of parallel lines (e.g., where each projection assembly of the distance sensor 600 projects one group of parallel lines).
  • the planes of the groups of parallel lines may intersect to form a triangular shape whose center is located near the central axis A-A′.
  • FIG. 9 depicts a portion of a fourth embodiment of a distance sensor of the present disclosure.
  • FIG. 9 illustrates an optical unit 900 of a distance sensor, which may be used to project a pattern for detection by an imaging sensor (not shown).
  • a plurality of similarly configured optical units may be arranged on a common circuit board to project a plurality of projection patterns into a field of view.
  • the optical unit 900 comprises a plurality of components arranged in a compact configuration.
  • the components include a circuit board 902 , a light source 904 , a light guiding means such as a collimator lens 906 , a beam splitting means, hereinafter referred to as a diffractive optical element (DOE) 908 , and a line generator 910 .
  • the components are arranged substantially symmetrically on the circuit board 902 , about a central axis A-A′.
  • the central axis A-A′ coincides with the optical axis of the imaging sensor of the distance sensor.
  • the light source 904 may be positioned directly on the circuit board 902 , and in one embodiment, the light source 904 is positioned at a first end of the central axis A-A′.
  • the light source 904 is a laser light source, such as a VCSEL, that emits a single beam of light along the central axis A-A′.
  • the single beam emitted by the light source 904 may also be referred to as the “primary beam.”
  • the light source 904 emits light of a wavelength that is known to be relatively safe to human vision (e.g., infrared).
  • the light source 904 may include circuitry to adjust the intensity of its output.
  • the light source 904 may emit light in pulses, so as to mitigate the effects of ambient light on image capture.
  • the collimator lens 906 is positioned along the central axis A-A′ in proximity to the light source 904 (e.g., “in front” of the light source 904 , relative to the direction in which light emitted by the light source 904 propagates). In particular, the collimator lens 906 is positioned to intercept the single beam of light emitted by the light source 904 and to focus the single or primary beam onto the DOE 908 .
  • the DOE 908 is positioned along the central axis A-A′ in proximity to the light source 904 (e.g., “in front” of the light source 904 , relative to the direction in which light emitted by the light source 904 propagates).
  • the DOE 908 is positioned to intercept the single beam of light focused by the collimator lens 906 and to split the single or primary beam into a plurality of secondary beams.
  • the DOE 908 is any optical component that is capable of splitting the primary beam into a plurality of secondary beams that diverge from the primary beam in different directions.
  • the DOE 908 may include a conical mirror, a holographic film, or a micro lens.
  • the plurality of secondary beams are arranged in a cone shape.
  • the primary beam may be split by means other than diffraction.
  • the line generator 910 is positioned along the central axis A-A′ in proximity to the light source 904 (e.g., “in front” of the light source 904 , relative to the direction in which light emitted by the light source 904 propagates).
  • the line generator 910 is positioned to intercept the plurality of secondary beams produced by the DOE 908 and to further split the secondary beams into a plurality of tertiary beams.
  • the line generator 910 is a Powell lens.
  • because the optical unit 900 can be fabricated without optical fibers, it can be fabricated on a very small scale for use in compact distance sensors.
  • FIG. 10 depicts a high-level block diagram of a general-purpose computer suitable for use in performing the functions described herein.
  • the system 1000 comprises one or more hardware processor elements 1002 (e.g., a central processing unit (CPU), a microprocessor, or a multi-core processor), a memory 1004 , e.g., random access memory (RAM) and/or read only memory (ROM), a module 1005 for calculating distance, and various input/output devices 1006 (e.g., storage devices, including but not limited to, a tape drive, a floppy drive, a hard disk drive or a compact disk drive, a receiver, a transmitter, a lens and optics, an output port, an input port and a user input device (such as a keyboard, a keypad, a mouse, a microphone and the like)).
  • the general-purpose computer may employ a plurality of processor elements.
  • the general-purpose computer of this figure is intended to represent each of those multiple general-purpose computers.
  • one or more hardware processors can be utilized in supporting a virtualized or shared computing environment.
  • the virtualized computing environment may support one or more virtual machines representing computers, servers, or other computing devices. In such virtualized virtual machines, hardware components such as hardware processors and computer-readable storage devices may be virtualized or logically represented.
  • the present disclosure can be implemented in software and/or in a combination of software and hardware, e.g., using application specific integrated circuits (ASIC), a programmable logic array (PLA), including a field-programmable gate array (FPGA), or a state machine deployed on a hardware device, a general purpose computer or any other hardware equivalents, e.g., computer readable instructions pertaining to the method(s) discussed above can be used to configure a hardware processor to perform the steps, functions and/or operations of the above disclosed methods.
  • instructions and data for the present module or process 1005 for calculating distance can be loaded into memory 1004 and executed by hardware processor element 1002 to implement the steps, functions or operations as discussed above in connection with the example method 400 .
  • when a hardware processor executes instructions to perform “operations”, this could include the hardware processor performing the operations directly and/or facilitating, directing, or cooperating with another hardware device or component (e.g., a co-processor and the like) to perform the operations.
  • the processor executing the computer readable or software instructions relating to the above described method(s) can be perceived as a programmed processor or a specialized processor.
  • the present module 1005 for calculating distance (including associated data structures) of the present disclosure can be stored on a tangible or physical (broadly non-transitory) computer-readable storage device or medium, e.g., volatile memory, non-volatile memory, ROM memory, RAM memory, magnetic or optical drive, device or diskette and the like.
  • the computer-readable storage device may comprise any physical devices that provide the ability to store information such as data and/or instructions to be accessed by a processor or a computing device such as a computer or an application server.

Abstract

In one embodiment, a distance sensor includes a projection light source, a first light guiding means positioned to guide light emitted by the projection light source, a diffractive optical element positioned to split the light guided by the first light guiding means into a plurality of projection beams traveling in different directions, and an image capturing device positioned to capture an image of a field of view, including a projection pattern created by an incidence of the plurality of projection beams on an object in the field of view.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of U.S. Provisional Patent Application Ser. No. 62/159,286, filed May 10, 2015, which is herein incorporated by reference in its entirety.
  • FIELD OF THE DISCLOSURE
  • The present disclosure relates generally to computer vision systems and relates more particularly to sensors for measuring the distance between a vehicle and an object or point in space.
  • Unmanned vehicles, such as robotic vehicles and drones, typically rely on computer vision systems for obstacle detection and navigation in the surrounding environment. These computer vision systems, in turn, typically rely on various sensors that acquire visual data from the surrounding environment, which the computer vision systems process in order to gather information about the surrounding environment. For instance, data acquired via one or more imaging sensors may be used to determine the distance from the vehicle to a particular object or point in the surrounding environment.
  • SUMMARY
  • In one embodiment, a distance sensor includes a projection light source, a first light guiding means positioned to guide light emitted by the projection light source, a diffractive optical element positioned to split the light guided by the first light guiding means into a plurality of projection beams traveling in different directions, and an image capturing device positioned to capture an image of a field of view, including a projection pattern created by an incidence of the plurality of projection beams on an object in the field of view.
  • In another embodiment, a distance sensor includes a plurality of projection light sources, a first plurality of optical fibers, wherein a first end of each optical fiber of the first plurality of optical fibers is coupled to one projection light source of the plurality of projection light sources, a plurality of diffractive optical elements, wherein each diffractive optical element of the plurality of diffractive optical elements is coupled to a second end of one optical fiber of the first plurality of optical fibers, a plurality of illumination light sources, comprising light sources that are different from the plurality of projection light sources, a second plurality of optical fibers, wherein a first end of each optical fiber of the second plurality of optical fibers is coupled to one illumination light source of the plurality of illumination light sources, a plurality of illumination optics, wherein each illumination optic of the plurality of illumination optics is coupled to a second end of one optical fiber of the second plurality of optical fibers, and an image capturing device, wherein the plurality of diffractive optical elements and the plurality of illumination optics are arranged in a ring around a central optical axis of the image capturing device.
  • In another embodiment, a distance sensor comprises a plurality of vertical cavity surface emitting lasers arranged on a circuit board, a plurality of gradient-index lenses, wherein each gradient-index lens of the plurality of gradient-index lenses is positioned to collimate a beam of light produced by one vertical cavity surface emitting laser of the plurality of vertical cavity surface emitting lasers, a plurality of diffractive optical elements, wherein each diffractive optical element of the plurality of diffractive optical elements is positioned to split a beam collimated by one gradient-index lens of the plurality of gradient-index lenses into a plurality of beams traveling in different directions, and a plurality of Powell lenses, wherein each Powell lens of the plurality of Powell lenses is positioned to generate a projection pattern from a plurality of beams generated by one diffractive optical element of the plurality of diffractive optical elements.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The teaching of the present disclosure can be readily understood by considering the following detailed description in conjunction with the accompanying drawings, in which:
  • FIG. 1A illustrates a cross-sectional view of a first embodiment of a distance sensor of the present disclosure;
  • FIG. 1B illustrates a top view of the distance sensor of FIG. 1A;
  • FIG. 2 illustrates an example field of view of the distance sensor of FIGS. 1A and 1B;
  • FIG. 3A illustrates a cross-sectional view of a second embodiment of a distance sensor of the present disclosure;
  • FIG. 3B illustrates a top view of the distance sensor of FIG. 3A;
  • FIG. 4 illustrates a flowchart of a method for calculating the distance from a sensor to an object or point in space;
  • FIG. 5 illustrates a triangulation technique by which the distance from a sensor to an object or point may be calculated;
  • FIG. 6A illustrates a cross-sectional view of a third embodiment of a distance sensor of the present disclosure;
  • FIG. 6B illustrates a top view of the distance sensor of FIG. 6A;
  • FIG. 7 illustrates a first example projection pattern that may be generated by the distance sensor of FIG. 6;
  • FIG. 8 illustrates a second example projection pattern that may be generated by the distance sensor of FIG. 6;
  • FIG. 9 depicts a portion of a fourth embodiment of a distance sensor of the present disclosure; and
  • FIG. 10 depicts a high-level block diagram of a general-purpose computer suitable for use in performing the functions described herein.
  • To facilitate understanding, identical reference numerals have been used, where possible, to designate identical elements that are common to the figures.
  • DETAILED DESCRIPTION
  • In one embodiment, the present disclosure relates to a distance sensor. Distance sensors may be used in unmanned vehicles in order to help a computer vision system determine the distance from the vehicle to a particular object or point in the surrounding environment. For instance, a distance sensor may project one or more beams of light onto the object or point and then compute the distance according to time of flight (TOF), analysis of the reflected light (e.g., lidar), or other means. Conventional distance sensors of this type tend to be bulky, however, and thus may not be suitable for use in compact vehicles. Moreover, the sensors can be very expensive to manufacture and tend to have a limited field of view. For instance, even using an arrangement of multiple conventional imaging sensors provides a field of view that is less than 360 degrees.
  • Embodiments of the disclosure provide a compact distance sensor that is economical to manufacture, includes few or no moving parts, and can measure distances in a field of view of up to 360 degrees. In one embodiment, the sensor uses a set of beam splitting means such as an array of optical fibers and diffractive optical elements (DOEs) to generate a plurality of projection points around a wide-angle lens. Each of the plurality of projection points emits a plurality of beams into a field of view. From the appearances of the beams, the sensor can measure distances in a 180 degree hemispherical field of view. By mounting two such sensors back-to-back, distances can be measured in a 360 degree field of view. The DOEs make it possible to split a beam generated by a single light source (e.g., laser) into multiple projection beams that are projected onto an object or point in the field of view. However, in other embodiments, beams emitted by multiple light sources are split by the DOEs. The distance from the sensor to the object or point can then be calculated in one cycle of projection and image capture from the multiple projections. A compact sensor capable of measuring distances in fields of view of up to 360 degrees can provide meaningful data for applications such as unmanned vehicle navigation and endoscopy.
  • FIGS. 1A and 1B illustrate a first embodiment of a distance sensor 100 of the present disclosure. In particular, FIG. 1A illustrates a cross-sectional view of the distance sensor 100, while FIG. 1B illustrates a top view of the distance sensor 100 of FIG. 1A. The distance sensor 100 may be mounted, for example, to an unmanned vehicle, or may be used as part of an endoscope.
  • As illustrated in FIG. 1A, the distance sensor 100 comprises a plurality of components arranged in a compact configuration. The components include at least one light source 104, a first beam splitting means, hereinafter referred to as a first diffractive optical element 106, a plurality of light guiding means such as optical fibers 102 1-102 n (hereinafter collectively referred to as “optical fibers 102”), an array of second beam splitting means, hereinafter referred to as second diffractive optical elements 108 1-108 n (and hereinafter collectively referred to as “second diffractive optical elements 108”), and an imaging sensor 110 including a wide-angle lens 112.
  • The components are arranged substantially symmetrically about a central axis A-A′. In one embodiment, the central axis A-A′ coincides with the optical axis of the imaging sensor 110. In one embodiment, the light source 104 is positioned at a first end of the central axis A-A′. In one embodiment, the light source 104 is a laser light source that emits a single beam of light along the central axis A-A′. Hereinafter, the single beam emitted by the light source 104 may also be referred to as the “primary beam.” In one embodiment, the light source 104 emits light of a wavelength that is known to be relatively safe to human vision (e.g., infrared). In a further embodiment, the light source 104 may include circuitry to adjust the intensity of its output. In a further embodiment, the light source 104 may emit light in pulses, so as to mitigate the effects of ambient light on image capture.
  • The first diffractive optical element (DOE) 106 is positioned along the central axis A-A′ in proximity to the light source 104 (e.g., “in front” of the light source 104, relative to the direction in which light emitted by the light source 104 propagates). In particular, the first DOE 106 is positioned to intercept the single beam of light emitted by the light source 104 and to split the single or primary beam into a plurality of secondary beams. The first DOE 106 is any optical component that is capable of splitting the primary beam into a plurality of secondary beams that diverge from the primary beam in different directions. For example, in one embodiment, the first DOE 106 may include a conical mirror, a holographic film, a micro lens, or a line generator (e.g., a Powell lens). In this case, the plurality of secondary beams are arranged in a cone shape. In further embodiments, the primary beam may be split by means other than diffraction.
  • Each of the optical fibers 102 is coupled at one end to the first DOE 106 and is coupled at the other end to one of the DOEs 108 in the array of second DOEs 108. In this way, each of the optical fibers 102 is positioned to carry at least one of the secondary beams produced by the first DOE 106 to one of the DOEs 108 in the array of second DOEs 108.
  • The array of second DOEs 108 is positioned along the central axis A-A′ in proximity to the first DOE 106 (e.g., “in front” of the first DOE 106, relative to the direction in which light emitted by the light source 104 propagates). In particular, the array of second DOEs 108 is positioned such that the first DOE 106 is positioned between the light source 104 and the array of second DOEs 108. As more clearly illustrated in FIG. 1B, in one embodiment, the second DOEs 108 are arranged in a ring-shaped array, with the central axis A-A′ passing through the center of the ring and the second DOEs 108 spaced at regular intervals around the ring. For instance, in one embodiment, the second DOEs 108 are spaced approximately thirty degrees apart around the ring. In one embodiment, the array of second DOEs 108 is positioned “behind” a principal point of the imaging sensor 110 (i.e., the point where the optical axis A-A′ intersects the image plane), relative to the direction in which light emitted by the light source 104 propagates.
  • Each second DOE 108 is positioned to receive one of the secondary beams transmitted by one of the optical fibers 102 from the first DOE 106 and to split the secondary beam into a plurality of (e.g., two or more) tertiary beams that are directed away from the second DOE 108 in a radial manner. Thus, each second DOE 108 defines a projection point of the sensor 100 from which a group of projection beams (or tertiary beams) is emitted into the field of view. In one embodiment, each respective plurality of tertiary beams fans out to cover a range of approximately one hundred degrees. The second DOEs 108 are any optical components that are capable of splitting a respective secondary beam into a plurality of tertiary beams that diverge from the secondary beam in different directions. For example, in one embodiment, each second DOE 108 may include a conical mirror, a holographic film, a micro lens, or a line generator (e.g., a Powell lens). In other embodiments, however, the secondary beams are split by a means other than diffraction.
  • In one embodiment, each plurality of tertiary beams is arranged in a fan or radial pattern, with equal angles between each of the beams. In one embodiment, each of the second DOEs 108 is configured to project tertiary beams that create a different visual pattern on a surface. For example, one second DOE 108 may project a pattern of dots, while another second DOE 108 may project a pattern of lines or x's.
  • The imaging sensor 110 is positioned along the central axis A-A′, in the middle of the array of second DOEs 108 (e.g., at least partially “in front” of the array of second DOEs 108, relative to the direction in which light emitted by the light source 104 propagates). In one embodiment, the imaging sensor 110 is an image capturing device, such as a still or video camera. As discussed above, the imaging sensor 110 includes a wide-angle lens 112, such as a fisheye lens, that creates a hemispherical field of view. In one embodiment, the imaging sensor 110 includes circuitry for calculating the distance from the distance sensor 100 to an object or point. In another embodiment, the imaging sensor includes a network interface for communicating captured images over a network to a processor, where the processor calculates the distance from the distance sensor 100 to an object or point and then communicates the calculated distance back to the distance sensor 100.
  • Thus, in one embodiment, the distance sensor 100 uses a single light source (e.g., light source 104) to produce multiple projection points from which sets of projection beams (e.g., comprising patterns of dots or lines) are emitted. The distance from the distance sensor 100 to an object can be calculated from the appearances of the projection beams in the field of view (as discussed in greater detail below). In particular, the use of the first and second DOEs makes it possible to generate a plurality of projection points around the lens, from the single beam of light emitted by the light source. This, plus the use of the optical fibers 102 to transmit light from the first DOE to the second DOEs, allows the distance sensor 100 to maintain a relatively compact form factor while measuring distance within a wide field of view. The imaging sensor 110 and the light source 104 can also be mounted in the same plane in order to make the design more compact; however, in one embodiment, the second DOEs 108 1-108 n are positioned behind the principal point of the imaging sensor 110 in order to increase the field of view that can be covered by the projection beams (e.g., such that the depth angle of the field of view is closer to a full 180 degrees, or, in some cases, even greater).
  • Moreover, since each of the second DOEs 108 projects tertiary beams of a different pattern, the circuitry in the imaging sensor can easily determine which beams in a captured image were created by which of the second DOEs 108. This facilitates the distance calculations, as discussed in greater detail below.
  • FIG. 2 illustrates an example field of view 200 of the distance sensor 100 of FIGS. 1A and 1B. In FIG. 2, certain components of the distance sensor 100 are also illustrated in an exploded view. As shown, the field of view 200 is substantially hemispherical in shape. Furthermore, the plurality of tertiary light beams produced by the distance sensor 100 projects a pattern of light onto the “virtual” hemisphere. The patterns are represented by the series of concentric circles that are illustrated where each tertiary beam meets the hemisphere. The circles are depicted as gradually decreasing in size as the distance from the distance sensor 100 increases, in order to show how the patterns created by the tertiary beams change visually with object distance.
  • As shown in FIG. 2, the field of view of the distance sensor 100 covers approximately 180 degrees. In one embodiment, the field of view can be expanded to approximately 360 degrees by mounting two distance sensors back-to-back, e.g., such that their respective light sources emit primary beams in two different directions separated by approximately 180 degrees.
  • Although the sensor 100 is illustrated as including only a single light source 104 (which reduces the total number of components in the sensor 100), in alternative embodiments, the sensor may include a plurality of light sources. FIGS. 3A and 3B, for example, illustrate a second embodiment of a distance sensor 300 of the present disclosure. In particular, FIG. 3A illustrates a cross-sectional view of the distance sensor 300, while FIG. 3B illustrates a top view of the distance sensor 300 of FIG. 3A. The distance sensor 300 may be mounted, for example, to an unmanned vehicle, or may be used as part of an endoscope.
  • As illustrated in FIG. 3A, the distance sensor 300 comprises a plurality of components arranged in a compact configuration. The components include a plurality of light sources 304 1-304 m (hereinafter collectively referred to as “light sources 304”), a first beam splitting means, hereinafter referred to as a first diffractive optical element 306, a plurality of light guiding means such as optical fibers 302 1-302 n (hereinafter collectively referred to as “optical fibers 302”), an array of second beam splitting means, hereinafter referred to as second diffractive optical elements 308 1-308 n (and hereinafter collectively referred to as “second diffractive optical elements 308”), and an imaging sensor 310 including a wide-angle lens 312.
  • The components are arranged substantially symmetrically about a central axis A-A′. In one embodiment, the central axis A-A′ coincides with the optical axis of the imaging sensor 310. In one embodiment, the light sources 304 are positioned at a first end of the central axis A-A′. In one embodiment, the light sources 304 comprise a plurality of laser light sources that each emit a single beam of light along the central axis A-A′. Hereinafter, a single beam emitted by one of the light sources 304 may also be referred to as the “primary beam.” The light sources 304 may include two or more different types of light sources, such as two or more laser light sources that emit light of different wavelengths. In one embodiment, each of the light sources 304 emits light of a wavelength that is known to be relatively safe to human vision (e.g., infrared). In a further embodiment, one or more of the light sources 304 may include circuitry to adjust the intensity of its output. In a further embodiment, one or more of the light sources 304 may emit light in pulses, so as to mitigate the effects of ambient light on image capture.
  • The first diffractive optical element (DOE) 306 is positioned along the central axis A-A′ in proximity to the light sources 304 (e.g., “in front” of the light sources 304, relative to the direction in which light emitted by the light sources 304 propagates). In particular, the first DOE 306 is positioned to intercept the single beam of light emitted by one of the light sources 304 and to split the single or primary beam into a plurality of secondary beams. The first DOE 306 is any optical component that is capable of splitting the primary beam into a plurality of secondary beams that diverge from the primary beam in different directions. For example, in one embodiment, the first DOE 306 may include a conical mirror, a holographic film, a micro lens, or a line generator (e.g., a Powell lens). In this case, the plurality of secondary beams are arranged in a cone shape. In further embodiments, the primary beam may be split by means other than diffraction.
  • Each of the optical fibers 302 is coupled at one end to the first DOE 306 and is coupled at the other end to one of the DOEs 308 in the array of second DOEs 308. In this way, each of the optical fibers 302 is positioned to carry at least one of the secondary beams produced by the first DOE 306 to one of the DOEs 308 in the array of second DOEs 308.
  • The array of second DOEs 308 is positioned along the central axis A-A′ in proximity to the first DOE 306 (e.g., “in front” of the first DOE 306, relative to the direction in which light emitted by the light sources 304 propagates). In particular, the array of second DOEs 308 is positioned such that the first DOE 306 is positioned between the light sources 304 and the array of second DOEs 308. As more clearly illustrated in FIG. 3B, in one embodiment, the second DOEs 308 are arranged in a ring-shaped array, with the central axis A-A′ passing through the center of the ring and the second DOEs 308 spaced at regular intervals around the ring. For instance, in one embodiment, the second DOEs 308 are spaced approximately thirty degrees apart around the ring. In one embodiment, the array of second DOEs 308 is positioned “behind” a principal point of the imaging sensor 310 (i.e., the point where the optical axis A-A′ intersects the image plane), relative to the direction in which light emitted by the light sources 304 propagates.
  • Each second DOE 308 is positioned to receive one of the secondary beams transmitted by one of the optical fibers 302 from the first DOE 306 and to split the secondary beam into a plurality of (e.g., two or more) tertiary beams that are directed away from the second DOE 308 in a radial manner. Thus, each second DOE 308 defines a projection point of the sensor 300 from which a group of projection beams (or tertiary beams) is emitted into the field of view. In one embodiment, each respective plurality of tertiary beams fans out to cover a range of approximately one hundred degrees. The second DOEs 308 are any optical components that are capable of splitting a respective secondary beam into a plurality of tertiary beams that diverge from the secondary beam in different directions. For example, in one embodiment, each second DOE 308 may include a conical mirror, a holographic film, a micro lens, or a line generator (e.g., a Powell lens). In other embodiments, however, the secondary beams are split by a means other than diffraction.
  • In one embodiment, each plurality of tertiary beams is arranged in a fan or radial pattern, with equal angles between each of the beams. In one embodiment, each of the second DOEs 308 is configured to project tertiary beams that create a different visual pattern on a surface. For example, one second DOE 308 may project a pattern of dots, while another second DOE 308 may project a pattern of lines or x's.
  • The imaging sensor 310 is positioned along the central axis A-A′, in the middle of the array of second DOEs 308 (e.g., at least partially “in front” of the array of second DOEs 308, relative to the direction in which light emitted by the light sources 304 propagates). In one embodiment, the imaging sensor 310 is an image capturing device, such as a still or video camera. As discussed above, the imaging sensor 310 includes a wide-angle lens 312, such as a fisheye lens, that creates a hemispherical field of view. In one embodiment, the imaging sensor 310 includes circuitry for calculating the distance from the distance sensor 300 to an object or point. In another embodiment, the imaging sensor includes a network interface for communicating captured images over a network to a processor, where the processor calculates the distance from the distance sensor 300 to an object or point and then communicates the calculated distance back to the distance sensor 300.
  • FIG. 4 illustrates a flowchart of a method 400 for calculating the distance from a sensor to an object or point in space. In one embodiment, the method 400 may be performed by a processor integrated in an imaging sensor (such as the imaging sensor 110 illustrated in FIG. 1A or the imaging sensor 310 illustrated in FIG. 3A) or a general purpose computing device as illustrated in FIG. 10 and discussed below.
  • The method 400 begins in step 402. In step 404, a light source is activated to generate a primary beam of light. In one embodiment, a single primary beam is generated by a single light source; however, in other embodiments, multiple primary beams are generated by multiple light sources. In one embodiment, the light source or light sources comprise a laser light source.
  • In optional step 406, the primary beam is split into a plurality of secondary beams using a first beam splitting means (e.g., a diffractive optical element) that is positioned in the path along which the primary beam propagates. The first beam splitting means may be, for example, a conical mirror. Step 406 is performed, for example, when the distance sensor (of which the imaging sensor is a part) includes only a single light source.
  • In step 408, each beam in the plurality of secondary beams is split into a plurality of projection or tertiary beams using a second beam splitting means (e.g., second diffractive optical element) in an array of beam splitting means. In one embodiment, a plurality of second beam splitting means are positioned in a ring, such that each second beam splitting means is positioned in the path along which one of the secondary beams propagates. In one embodiment, at least some of the second beam splitting means are conical mirrors. In one embodiment, where the distance sensor comprises a plurality of light sources, the method 400 may proceed directly from step 404 to step 408. In this case, each primary beam of a plurality of primary beams (generated using the plurality of light sources) is directly split into a plurality of projection beams by one of the second beam splitting means.
  • In step 410, at least one image of the object or point is captured. The image includes a pattern that is projected onto the object or point and onto the surrounding space. The pattern is created by each of the projection beams projecting a series of dots, lines, or other shapes onto the object, point, or surrounding space.
  • In step 412, the distance from the sensor to the object or point is calculated using information from the images captured in step 410. In one embodiment, a triangulation technique is used to calculate the distance. For example, the positional relationships between parts of the patterns projected by the sensor can be used as the basis for the calculation.
  • The method 400 ends in step 414. Thus, the method 400, in combination with the sensor depicted in FIGS. 1A-1B or in FIGS. 3A-3B, can measure the distance from the sensor to an object or point in space in a single cycle of image capture and calculation.
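  • For illustration only, the following minimal Python sketch mirrors one cycle of the method 400. The hardware interactions (activating the light source, reading the imaging sensor) are represented by hypothetical stub functions that are not part of this disclosure; only the overall control flow of steps 404-412 is shown.

    # Minimal sketch of one projection-and-capture cycle (method 400).
    # The stubs below are hypothetical placeholders, not a real driver API.

    def activate_light_source():
        """Step 404: drive the laser light source (hardware-specific stub)."""
        pass

    def capture_image():
        """Step 410: capture one image of the projected pattern (stub).
        A real implementation would return the pixel coordinates of the
        detected dots or lines."""
        return []

    def calculate_distances(pattern_points):
        """Step 412: placeholder for triangulating a distance for each
        detected point (see the EQN. 1 sketch after EQN. 3 below)."""
        return []

    def measure_once():
        activate_light_source()              # step 404
        # Steps 406-408 (beam splitting) happen optically in the DOEs;
        # no computation is required here.
        points = capture_image()             # step 410
        return calculate_distances(points)   # step 412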
  • FIG. 5, for example, illustrates a triangulation technique by which the distance from the sensor to the object or point may be calculated in step 412. In particular, FIG. 5 illustrates the example imaging sensor 110 of FIG. 1, as well as two of the projection points, which may be defined by two of the second diffractive optical elements 108 1 and 108 2. The projection points are spaced equal distances, x, from the imaging sensor 110, such that there is a distance of s between the two projection points (e.g., x=s/2). Each of the projection points emits a respective projection beam 500 1 and 500 2, which is incident upon the object to create a respective point 502 1 and 502 2 (e.g., dot or line) in a pattern. These points 502 1 and 502 2 are detected by the imaging sensor 110 and may be used to calculate the distance, D, between the imaging sensor 110 and the object as follows:

  • D=s/(−tan α2+tan α1+tan θ2+tan θ1)   (EQN. 1)
  • where α2 is the angle formed between the projection beam 500 2 and a central axis c2 of the second diffractive optical element 108 2, α1 is the angle formed between the projection beam 500 1 and a central axis c1 of the second diffractive optical element 108 1, θ2 is the angle formed between the central optical axis O of the imaging sensor 110 and the direction in which the imaging sensor 110 perceives the point 502 2 created by the projection beam 500 2, and θ1 is the angle formed between the central optical axis O of the imaging sensor 110 and the direction in which the imaging sensor 110 perceives the point 502 1 created by the projection beam 500 1.
  • EQN. 1 is derived from the following relationships:

  • D*tan α1+D*tan θ1=x   (EQN. 2)

  • D*tan α2+D*tan θ2=s−x   (EQN. 3)
  • EQNs. 2 and 3 allow one to calculate the distance from a source of a projection pattern (comprising, e.g., a pattern of dots) to an object onto which the projection pattern is projected. The distance is calculated based on the positional relationship between the points of light (e.g., the dots) that form the projection pattern when the points of light are emitted by different projection points around the source. In this embodiment, the positional relationships between the points of light are known a priori (i.e., not measured as part of the calculation).
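  • As a concrete numerical sketch, the short Python function below evaluates EQN. 1 exactly as written above, given the projection-point separation s and the four angles of FIG. 5 (assumed here to be supplied in radians under the sign convention of that figure). The function name and the example values in the comment are illustrative assumptions, not part of the disclosure; the units of the returned distance follow whatever units are chosen for s.

    import math

    def distance_eqn1(s, alpha1, alpha2, theta1, theta2):
        """Distance D between the imaging sensor and the object per EQN. 1.

        s               -- separation between the two projection points
        alpha1, alpha2  -- projection-beam angles measured from the DOE axes c1, c2
        theta1, theta2  -- observation angles measured from the optical axis O
        (all angles in radians)
        """
        return s / (-math.tan(alpha2) + math.tan(alpha1)
                    + math.tan(theta2) + math.tan(theta1))

    # Example with hypothetical values: projection points 30 mm apart.
    # d = distance_eqn1(0.030, math.radians(10), math.radians(-10),
    #                   math.radians(5), math.radians(5))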
  • As discussed above, one application to which the distance sensor of the present disclosure may be well-suited is endoscopy, due to its compact size and ability to measure distances in fields of view of up to 360 degrees. FIGS. 6A and 6B, for example, illustrate a third embodiment of a distance sensor 600 of the present disclosure. In particular, FIG. 6A illustrates a cross-sectional view of the distance sensor 600, while FIG. 6B illustrates a top view of the distance sensor 600 of FIG. 6A. The distance sensor 600 may be suitable for use as part of an endoscope.
  • As illustrated in FIG. 6A, the distance sensor 600 comprises a plurality of components arranged in a compact configuration. The components include a plurality of illumination light sources (hereinafter collectively referred to as “illumination light sources 602,” of which one illumination light source 602 1 is visible in FIG. 6A), a plurality of projection light sources (hereinafter collectively referred to as “projection light sources 604,” of which one projection light source 604 1 is visible in FIG. 6A), a plurality of light guiding means such as optical fibers (hereinafter collectively referred to as “optical fibers 606,” of which two optical fibers 606 1 and 606 6 are visible in FIG. 6A), a plurality of beam splitting means, hereinafter referred to as diffractive optical elements (hereinafter collectively referred to as “DOEs 608,” of which one DOE 608 1 is visible in FIG. 6A), a plurality of collimating lenses, such as gradient-index (GRIN) lenses (hereinafter collectively referred to as “GRIN lenses 616,” of which one GRIN lens 616 1 is visible in FIG. 6A), a plurality of Powell lenses 614 1-614 3 (hereinafter collectively referred to as “Powell lenses 614”), a plurality of illumination optics assemblies 618 1-618 3 (hereinafter collectively referred to as “illumination optics assemblies 618”), and an imaging sensor 610 including a wide-angle lens 612.
  • The components are arranged substantially symmetrically about a central axis A-A′. In one embodiment, the central axis A-A′ coincides with the optical axis of the imaging sensor 610. In one embodiment, the illumination light sources 602 and the projection light sources 604 are positioned at a first end of the central axis A-A′. In one embodiment, the illumination light sources 602 comprise a plurality of light emitting diodes (LEDs) that each emit illumination in a direction substantially parallel to the central axis A-A′.
  • In one embodiment, the projection light sources 604 comprise a plurality of laser light sources, such as vertical cavity surface emitting lasers (VCSELs), that each emit a single beam of light in a direction substantially parallel to the central axis A-A′. Hereinafter, a single beam emitted by one of the projection light sources 604 may also be referred to as the “primary beam.” In one embodiment, each of the projection light sources 604 emits light of a wavelength that is known to be relatively safe to human vision (e.g., infrared). In a further embodiment, the projection light sources 604 may include circuitry to adjust the intensity of their output. In a further embodiment, one or more of the projection light sources 604 may emit light in pulses, so as to mitigate the effects of ambient light on image capture.
  • The illumination light sources 602 are connected to the illumination optics assemblies 618 via a first subset of the optical fibers 606. The illumination optics assemblies 618 may include lenses, diffractive elements, or other components that help to direct illumination generated by the illumination light sources 602 in the appropriate directions. In one embodiment, the illumination optics assemblies 618 are positioned along the central axis A-A′ in proximity to the illumination light sources 602 (e.g., “in front” of the illumination light sources 602, relative to the direction in which light emitted by illumination light sources 602 propagates). As more clearly illustrated in FIG. 6B, in one embodiment, the illumination optics assemblies 618 are arranged in a ring-shaped array, with the central axis A-A′ passing through the center of the ring and the illumination optics assemblies 618 spaced at regular intervals around the ring, e.g., alternating with the Powell lenses 614. For instance, in one embodiment, the illumination optics assemblies 618 are spaced approximately 120 degrees apart around the ring. In one embodiment, the illumination optics assemblies 618 are positioned “behind” a principal point of the imaging sensor 610 (i.e., the point where the optical axis A-A′ intersects the image plane), relative to the direction in which light emitted by the illumination light sources 602 propagates.
  • The projection light sources 604 are connected to the GRIN lenses 616 via a second subset of the optical fibers 606. In one embodiment, GRIN lenses 616 are positioned along the central axis A-A′ in proximity to the projection light sources 604 (e.g., “in front” of the projection light sources 604, relative to the direction in which light emitted by the projection light sources 604 propagates). Each GRIN lens 616 is coupled to one of the DOEs 608, which is in turn coupled to one of the Powell lenses 614. Together, each set of GRIN lens 616, DOE 608, and Powell lens 614 receives, via an optical fiber 606, a single beam of light emitted by one of the projection light sources 604 and splits the single or primary beam into a plurality of secondary beams. The DOEs 608 may be any optical components that are capable of splitting a primary beam into a plurality of secondary beams that diverge from the primary beam in different directions. For example, in one embodiment, the DOEs 608 may include a conical mirror, a holographic film, a micro lens, or a line generator. In further embodiments, the primary beam may be split by means other than diffraction.
  • As illustrated in FIG. 6B, the sensor 600 may include three sets each of: (1) illumination light source 602, optical fiber 606, and illumination optics 618 (hereinafter an “illumination assembly”); and (2) projection light source 604, optical fiber 606, GRIN lens 616, DOE 608, and Powell lens 614 (hereinafter a “projection assembly”). The illumination assemblies and projection assemblies may be arranged in an alternating fashion around the central axis A-A′, such that each illumination assembly is spaced from the next illumination assembly by approximately 120 degrees, and each projection assembly is spaced from the next projection assembly by approximately 120 degrees. In further embodiments, however, different numbers of illumination assemblies and projection assemblies may be used. In addition, the spacing between illumination assemblies and projection assemblies may be varied.
  • The imaging sensor 610 is positioned along the central axis A-A′, in the middle of the ring of illumination assemblies and projection assemblies (e.g., at least partially “in front” of the illumination assemblies and projection assemblies, relative to the direction in which light emitted by the illumination assemblies and projection assemblies propagates). In one embodiment, the imaging sensor 610 is an image capturing device, such as a still or video camera. As discussed above, the imaging sensor 610 includes a wide-angle lens 612, such as a fisheye lens, that creates a hemispherical field of view. In one embodiment, the imaging sensor 610 includes circuitry for calculating the distance from the distance sensor 600 to an object or point. In another embodiment, the imaging sensor includes a network interface for communicating captured images over a network to a processor, where the processor calculates the distance from the distance sensor 600 to an object or point and then communicates the calculated distance back to the distance sensor 600.
  • As discussed above, the distance sensor 600 may be especially well-suited for endoscopy applications, due to its ability to emit both illuminating radiation and a projection pattern. FIG. 7 illustrates a first example projection pattern that may be generated by the distance sensor 600 of FIG. 6. As illustrated, the projection pattern comprises a plurality of groups of parallel lines (e.g., where each projection assembly of the distance sensor 600 projects one group of parallel lines). The planes of the groups of parallel lines may intersect near the central axis A-A′.
  • FIG. 8 illustrates a second example projection pattern that may be generated by the distance sensor 600 of FIG. 6. As illustrated, the projection pattern comprises a plurality of groups of parallel lines (e.g., where each projection assembly of the distance sensor 600 projects one group of parallel lines). The planes of the groups of parallel lines may intersect to form a triangular shape whose center is located near the central axis A-A′.
  • FIG. 9 depicts a portion of a fourth embodiment of a distance sensor of the present disclosure. In particular, FIG. 9 illustrates an optical unit 900 of a distance sensor, which may be used to project a pattern for detection by an imaging sensor (not shown). A plurality of similarly configured optical units may be arranged on a common circuit board to project a plurality of projection patterns into a field of view.
  • As illustrated in FIG. 9, the optical unit 900 comprises a plurality of components arranged in a compact configuration. The components include a circuit board 902, a light source 904, a light guiding means such as a collimator lens 906, a beam splitting means, hereinafter referred to as a diffractive optical element (DOE) 908, and a line generator 910.
  • The components are arranged substantially symmetrically on the circuit board 902, about a central axis A-A′. In one embodiment, the central axis A-A′ coincides with the optical axis of the imaging sensor of the distance sensor. The light source 904 may be positioned directly on the circuit board 902, and in one embodiment, the light source 904 is positioned at a first end of the central axis A-A′. In one embodiment, the light source 904 is a laser light source, such as a VCSEL, that emits a single beam of light along the central axis A-A′. Hereinafter, the single beam emitted by the light source 904 may also be referred to as the “primary beam.” In one embodiment, the light source 904 emits light of a wavelength that is known to be relatively safe to human vision (e.g., infrared). In a further embodiment, the light source 904 may include circuitry to adjust the intensity of its output. In a further embodiment, the light source 904 may emit light in pulses, so as to mitigate the effects of ambient light on image capture.
  • The collimator lens 906 is positioned along the central axis A-A′ in proximity to the light source 904 (e.g., “in front” of the light source 904, relative to the direction in which light emitted by the light source 904 propagates). In particular, the collimator lens 906 is positioned to intercept the single beam of light emitted by the light source 904 and to focus the single or primary beam onto the DOE 908.
  • The DOE 908 is positioned along the central axis A-A′ in proximity to the light source 904 (e.g., “in front” of the light source 904, relative to the direction in which light emitted by the light source 904 propagates). In particular, the DOE 908 is positioned to intercept the single beam of light focused by the collimator lens 906 and to split the single or primary beam into a plurality of secondary beams. The DOE 908 is any optical component that is capable of splitting the primary beam into a plurality of secondary beams that diverge from the primary beam in different directions. For example, in one embodiment, the DOE 908 may include a conical mirror, a holographic film, or a micro lens. In this case, the plurality of secondary beams are arranged in a cone shape. In further embodiments, the primary beam may be split by means other than diffraction.
  • The line generator 910 is positioned along the central axis A-A′ in proximity to the light source 904 (e.g., “in front” of the light source 904, relative to the direction in which light emitted by the light source 904 propagates). In particular, the line generator 910 is positioned to intercept the plurality of secondary beams produced by the DOE 908 and to further split the secondary beams into a plurality of tertiary beams. In one embodiment, the line generator 910 is a Powell lens.
  • Because the optical unit 900 can be fabricated without optical fibers, it can be fabricated on a very small scale for use in compact distance sensors.
  • FIG. 10 depicts a high-level block diagram of a general-purpose computer suitable for use in performing the functions described herein. As depicted in FIG. 10, the system 1000 comprises one or more hardware processor elements 1002 (e.g., a central processing unit (CPU), a microprocessor, or a multi-core processor), a memory 1004, e.g., random access memory (RAM) and/or read only memory (ROM), a module 1005 for calculating distance, and various input/output devices 1006 (e.g., storage devices, including but not limited to, a tape drive, a floppy drive, a hard disk drive or a compact disk drive, a receiver, a transmitter, a lens and optics, an output port, an input port and a user input device (such as a keyboard, a keypad, a mouse, a microphone and the like)). Although only one processor element is shown, it should be noted that the general-purpose computer may employ a plurality of processor elements. Furthermore, although only one general-purpose computer is shown in the figure, if the method(s) as discussed above is implemented in a distributed or parallel manner for a particular illustrative example, i.e., the steps of the above method(s) or the entire method(s) are implemented across multiple or parallel general-purpose computers, then the general-purpose computer of this figure is intended to represent each of those multiple general-purpose computers. Furthermore, one or more hardware processors can be utilized in supporting a virtualized or shared computing environment. The virtualized computing environment may support one or more virtual machines representing computers, servers, or other computing devices. In such virtualized virtual machines, hardware components such as hardware processors and computer-readable storage devices may be virtualized or logically represented.
  • It should be noted that the present disclosure can be implemented in software and/or in a combination of software and hardware, e.g., using application specific integrated circuits (ASIC), a programmable logic array (PLA), including a field-programmable gate array (FPGA), or a state machine deployed on a hardware device, a general purpose computer or any other hardware equivalents, e.g., computer readable instructions pertaining to the method(s) discussed above can be used to configure a hardware processor to perform the steps, functions and/or operations of the above disclosed methods. In one embodiment, instructions and data for the present module or process 1005 for calculating distance (e.g., a software program comprising computer-executable instructions) can be loaded into memory 1004 and executed by hardware processor element 1002 to implement the steps, functions or operations as discussed above in connection with the example method 400. Furthermore, when a hardware processor executes instructions to perform “operations”, this could include the hardware processor performing the operations directly and/or facilitating, directing, or cooperating with another hardware device or component (e.g., a co-processor and the like) to perform the operations.
  • The processor executing the computer readable or software instructions relating to the above described method(s) can be perceived as a programmed processor or a specialized processor. As such, the present module 1005 for calculating distance (including associated data structures) of the present disclosure can be stored on a tangible or physical (broadly non-transitory) computer-readable storage device or medium, e.g., volatile memory, non-volatile memory, ROM memory, RAM memory, magnetic or optical drive, device or diskette and the like. More specifically, the computer-readable storage device may comprise any physical devices that provide the ability to store information such as data and/or instructions to be accessed by a processor or a computing device such as a computer or an application server.
  • While various embodiments have been described above, it should be understood that they have been presented by way of example only, and not limitation. Thus, the breadth and scope of a preferred embodiment should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.

Claims (20)

What is claimed is:
1. An apparatus, comprising:
a projection light source;
a first light guiding means positioned to guide light emitted by the projection light source;
a diffractive optical element positioned to split the light guided by the first light guiding means into a plurality of projection beams traveling in different directions; and
an image capturing device positioned to capture an image of a field of view, including a projection pattern created by an incidence of the plurality of projection beams on an object in the field of view.
2. The apparatus of claim 1, wherein the projection light source comprises a single laser light source.
3. The apparatus of claim 1, wherein the projection light source comprises a plurality of laser light sources.
4. The apparatus of claim 3, wherein the plurality of laser light sources comprises at least two different types of laser light sources.
5. The apparatus of claim 1, wherein the first light guiding means comprises a plurality of optical fibers.
6. The apparatus of claim 5, wherein the diffractive optical element comprises a plurality of diffractive optical elements, and wherein each optical fiber of the plurality of optical fibers is coupled to one diffractive optical element of the plurality of diffractive optical elements.
7. The apparatus of claim 6, further comprising:
an additional diffractive optical element, separate from the plurality of diffractive optical elements, coupling the projection light source to the plurality of optical fibers.
8. The apparatus of claim 6, wherein the plurality of diffractive optical elements is arranged in a ring around a central optical axis of the image capturing device.
9. The apparatus of claim 1, further comprising:
an illumination light source, comprising a light source different from the projection light source;
a second light guiding means positioned to guide light emitted by the illumination light source; and
an illumination optic positioned to direct the light from the second light guiding means into the field of view.
10. The apparatus of claim 9, wherein the illumination optic comprises a plurality of illumination optics, the diffractive optical element comprises a plurality of diffractive optical elements, and the plurality of illumination optics and the plurality of diffractive optical elements are arranged in an alternating pattern in a ring configuration around a central optical axis of the image capturing device.
11. The apparatus of claim 9, further comprising:
a collimating lens positioned between the first light guiding means and the diffractive optical element; and
a line generator, wherein the line generator is positioned such that the diffractive optical element is situated between the collimating lens and the line generator.
12. The apparatus of claim 11, wherein the line generator is a Powell lens.
13. The apparatus of claim 11, wherein the collimating lens is a gradient-index lens.
14. The apparatus of claim 1, wherein the projection light source comprises a plurality of vertical cavity surface emitting lasers arranged on a circuit board.
15. The apparatus of claim 14, wherein the light guiding means comprises a collimating lens positioned between the projection light source and the diffractive optical element.
16. The apparatus of claim 15, wherein the collimating lens is a gradient-index lens.
17. The apparatus of claim 15, further comprising:
a line generator, wherein the line generator is positioned such that the diffractive optical element is situated between the collimating lens and the line generator.
18. The apparatus of claim 17, wherein the line generator is a Powell lens.
19. An apparatus, comprising:
a plurality of projection light sources;
a first plurality of optical fibers, wherein a first end of each optical fiber of the first plurality of optical fibers is coupled to one projection light source of the plurality of projection light sources;
a plurality of diffractive optical elements, wherein each diffractive optical element of the plurality of diffractive optical elements is coupled to a second end of one optical fiber of the first plurality of optical fibers;
a plurality of illumination light sources, comprising light sources that are different from the plurality of projection light sources;
a second plurality of optical fibers, wherein a first end of each optical fiber of the second plurality of optical fibers is coupled to one illumination light source of the plurality of illumination light sources;
a plurality of illumination optics, wherein each illumination optic of the plurality of illumination optics is coupled to a second end of one optical fiber of the second plurality of optical fibers; and
an image capturing device, wherein the plurality of diffractive optical elements and the plurality of illumination optics are arranged in a ring around a central optical axis of the image capturing device.
20. An apparatus, comprising:
a plurality of vertical cavity surface emitting lasers arranged on a circuit board;
a plurality of gradient-index lenses, wherein each gradient-index lens of the plurality of gradient-index lenses is positioned to collimate a beam of light produced by one vertical cavity surface emitting laser of the plurality of vertical cavity surface emitting lasers;
a plurality of diffractive optical elements, wherein each diffractive optical element of the plurality of diffractive optical elements is positioned to split a beam collimated by one gradient-index lens of the plurality of gradient-index lenses into a plurality of beams traveling in different directions; and
a plurality of Powell lenses, wherein each Powell lens of the plurality of Powell lenses is positioned to generate a projection pattern from a plurality of beams generated by one diffractive optical element of the plurality of diffractive optical elements.
US15/149,429 2015-05-10 2016-05-09 Distance sensor Abandoned US20160328854A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/149,429 US20160328854A1 (en) 2015-05-10 2016-05-09 Distance sensor

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201562159286P 2015-05-10 2015-05-10
US15/149,429 US20160328854A1 (en) 2015-05-10 2016-05-09 Distance sensor

Publications (1)

Publication Number Publication Date
US20160328854A1 true US20160328854A1 (en) 2016-11-10

Family

ID=57222485

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/149,323 Active 2036-10-28 US10228243B2 (en) 2015-05-10 2016-05-09 Distance sensor with parallel projection beams
US15/149,429 Abandoned US20160328854A1 (en) 2015-05-10 2016-05-09 Distance sensor

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US15/149,323 Active 2036-10-28 US10228243B2 (en) 2015-05-10 2016-05-09 Distance sensor with parallel projection beams

Country Status (8)

Country Link
US (2) US10228243B2 (en)
EP (2) EP3295119A4 (en)
JP (3) JP6761817B2 (en)
KR (2) KR20180006377A (en)
CN (2) CN107850420B (en)
HK (2) HK1252962A1 (en)
TW (2) TWI687652B (en)
WO (2) WO2016182982A1 (en)

Cited By (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10163216B2 (en) 2016-06-15 2018-12-25 Hand Held Products, Inc. Automatic mode switching in a volume dimensioner
US20190026912A1 (en) * 2017-07-24 2019-01-24 Hand Held Products, Inc. Dual-pattern optical 3d dimensioning
US10218964B2 (en) 2014-10-21 2019-02-26 Hand Held Products, Inc. Dimensioning system with feedback
US10225544B2 (en) 2015-11-19 2019-03-05 Hand Held Products, Inc. High resolution dot pattern
US10228452B2 (en) 2013-06-07 2019-03-12 Hand Held Products, Inc. Method of error correction for 3D imaging device
US10240914B2 (en) 2014-08-06 2019-03-26 Hand Held Products, Inc. Dimensioning system with guided alignment
US10249030B2 (en) 2015-10-30 2019-04-02 Hand Held Products, Inc. Image transformation for indicia reading
US10247547B2 (en) 2015-06-23 2019-04-02 Hand Held Products, Inc. Optical pattern projector
US10321127B2 (en) 2012-08-20 2019-06-11 Intermec Ip Corp. Volume dimensioning system calibration systems and methods
US10339352B2 (en) 2016-06-03 2019-07-02 Hand Held Products, Inc. Wearable metrological apparatus
US10386468B2 (en) * 2015-03-05 2019-08-20 Hanwha Techwin Co., Ltd. Photographing apparatus and method
US10393508B2 (en) 2014-10-21 2019-08-27 Hand Held Products, Inc. Handheld dimensioning system with measurement-conformance feedback
US10402956B2 (en) 2014-10-10 2019-09-03 Hand Held Products, Inc. Image-stitching for dimensioning
US10401872B2 (en) * 2017-05-23 2019-09-03 Gopro, Inc. Method and system for collision avoidance
WO2019182881A1 (en) * 2018-03-20 2019-09-26 Magik Eye Inc. Distance measurement using projection patterns of varying densities
US10467806B2 (en) 2012-05-04 2019-11-05 Intermec Ip Corp. Volume dimensioning systems and methods
US10488192B2 (en) 2015-05-10 2019-11-26 Magik Eye Inc. Distance sensor projecting parallel patterns
US20190377088A1 (en) * 2018-06-06 2019-12-12 Magik Eye Inc. Distance measurement using high density projection patterns
US10584962B2 (en) 2018-05-01 2020-03-10 Hand Held Products, Inc System and method for validating physical-item security
US10593130B2 (en) 2015-05-19 2020-03-17 Hand Held Products, Inc. Evaluating image values
US10589860B2 (en) * 2017-05-23 2020-03-17 Gopro, Inc. Spherical infrared emitter
US10593014B2 (en) * 2018-03-26 2020-03-17 Ricoh Company, Ltd. Image processing apparatus, image processing system, image capturing system, image processing method
US10612958B2 (en) 2015-07-07 2020-04-07 Hand Held Products, Inc. Mobile dimensioner apparatus to mitigate unfair charging practices in commerce
US10635922B2 (en) 2012-05-15 2020-04-28 Hand Held Products, Inc. Terminals and methods for dimensioning objects
US10679076B2 (en) 2017-10-22 2020-06-09 Magik Eye Inc. Adjusting the projection system of a distance sensor to optimize a beam layout
CN111492262A (en) * 2017-10-08 2020-08-04 魔眼公司 Distance measurement using warp-wise grid pattern
US10747227B2 (en) 2016-01-27 2020-08-18 Hand Held Products, Inc. Vehicle positioning and object avoidance
US10775165B2 (en) 2014-10-10 2020-09-15 Hand Held Products, Inc. Methods for improving the accuracy of dimensioning-system measurements
US10885761B2 (en) 2017-10-08 2021-01-05 Magik Eye Inc. Calibrating a sensor system including multiple movable sensors
US10908013B2 (en) 2012-10-16 2021-02-02 Hand Held Products, Inc. Dimensioning system
US10909708B2 (en) 2016-12-09 2021-02-02 Hand Held Products, Inc. Calibrating a dimensioner using ratios of measurable parameters of optically-perceptible geometric elements
US10931883B2 (en) 2018-03-20 2021-02-23 Magik Eye Inc. Adjusting camera exposure for three-dimensional depth sensing and two-dimensional imaging
US11002537B2 (en) 2016-12-07 2021-05-11 Magik Eye Inc. Distance sensor including adjustable focus imaging sensor
US11019249B2 (en) * 2019-05-12 2021-05-25 Magik Eye Inc. Mapping three-dimensional depth map data onto two-dimensional images
US11029762B2 (en) 2015-07-16 2021-06-08 Hand Held Products, Inc. Adjusting dimensioning results using augmented reality
US11047672B2 (en) 2017-03-28 2021-06-29 Hand Held Products, Inc. System for optically dimensioning
US11172108B2 (en) 2017-06-13 2021-11-09 Sharp Kabushiki Kaisha Imaging device
US11320537B2 (en) 2019-12-01 2022-05-03 Magik Eye Inc. Enhancing triangulation-based three-dimensional distance measurements with time of flight information
US11474209B2 (en) 2019-03-25 2022-10-18 Magik Eye Inc. Distance measurement using high density projection patterns
US11475584B2 (en) * 2018-08-07 2022-10-18 Magik Eye Inc. Baffles for three-dimensional sensors having spherical fields of view
US11483503B2 (en) 2019-01-20 2022-10-25 Magik Eye Inc. Three-dimensional sensor including bandpass filter having multiple passbands
US11580662B2 (en) 2019-12-29 2023-02-14 Magik Eye Inc. Associating three-dimensional coordinates with two-dimensional feature points
US11639846B2 (en) 2019-09-27 2023-05-02 Honeywell International Inc. Dual-pattern optical 3D dimensioning
US11688088B2 (en) 2020-01-05 2023-06-27 Magik Eye Inc. Transferring the coordinate system of a three-dimensional camera to the incident point of a two-dimensional camera
US20230328218A1 (en) * 2022-04-08 2023-10-12 Himax Technologies Limited Structured light projector and three-dimensional image sensing apparatus

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170072319A (en) 2014-10-24 2017-06-26 매직 아이 인코포레이티드 Distance sensor
EP3295119A4 (en) 2015-05-10 2019-04-10 Magik Eye Inc. Distance sensor
TWI758367B (en) * 2016-12-07 2022-03-21 美商麥吉克艾公司 Distance sensor projecting parallel patterns
US10440349B2 (en) * 2017-09-27 2019-10-08 Facebook Technologies, Llc 3-D 360 degrees depth projector
WO2020040390A1 (en) * 2018-08-23 2020-02-27 엘지전자 주식회사 Apparatus and method for generating three-dimensional image
CN109633675B (en) * 2019-01-25 2021-04-13 广州市慧建科技有限公司 Laser emitting device
EP3910286B1 (en) * 2020-05-12 2022-10-26 Hexagon Technology Center GmbH Improving structured light projection through the minimization of visual artifacts by way of deliberately introduced optical aberrations
KR102313115B1 (en) * 2021-06-10 2021-10-18 도브텍 주식회사 Autonomous flying drone using artificial intelligence neural network

Family Cites Families (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62287215A (en) * 1986-06-06 1987-12-14 Olympus Optical Co Ltd Optical system device for endoscope lighting
US4914460A (en) 1987-05-29 1990-04-03 Harbor Branch Oceanographic Institution Inc. Apparatus and methods of determining distance and orientation
JP3141681B2 (en) 1994-04-27 2001-03-05 キヤノン株式会社 Optical system with anti-vibration function
JPH08555A (en) 1994-06-16 1996-01-09 Fuji Photo Optical Co Ltd Illumination device of endoscope
US5980454A (en) * 1997-12-01 1999-11-09 Endonetics, Inc. Endoscopic imaging system employing diffractive optical elements
US7581681B2 (en) 1998-03-24 2009-09-01 Metrologic Instruments, Inc. Tunnel-type digital imaging system for use within retail shopping environments such as supermarkets
JP2001161630A (en) * 1999-12-09 2001-06-19 Olympus Optical Co Ltd Endoscope
US7940299B2 (en) 2001-08-09 2011-05-10 Technest Holdings, Inc. Method and apparatus for an omni-directional video surveillance system
DE10308383A1 (en) * 2003-02-27 2004-09-16 Storz Endoskop Produktions Gmbh Method and optical system for measuring the topography of a measurement object
JP4644540B2 (en) * 2005-06-28 2011-03-02 富士通株式会社 Imaging device
US20070091174A1 (en) 2005-09-30 2007-04-26 Topcon Corporation Projection device for three-dimensional measurement, and three-dimensional measurement system
US20070156051A1 (en) * 2005-12-29 2007-07-05 Amit Pascal Device and method for in-vivo illumination
JP4760391B2 (en) * 2006-01-13 2011-08-31 カシオ計算機株式会社 Ranging device and ranging method
JP4799216B2 (en) 2006-03-03 2011-10-26 富士通株式会社 Imaging device having distance measuring function
US8471892B2 (en) 2006-11-23 2013-06-25 Z. Jason Geng Wide field-of-view reflector and method of designing and making same
JP5242101B2 (en) * 2007-08-31 2013-07-24 オリンパスメディカルシステムズ株式会社 Capsule endoscope
US8187097B1 (en) 2008-06-04 2012-05-29 Zhang Evan Y W Measurement and segment of participant's motion in game play
US8531650B2 (en) * 2008-07-08 2013-09-10 Chiaro Technologies LLC Multiple channel locating
US8334900B2 (en) * 2008-07-21 2012-12-18 The Hong Kong University Of Science And Technology Apparatus and method of optical imaging for medical diagnosis
JP5484098B2 (en) 2009-03-18 2014-05-07 三菱電機株式会社 Projection optical system and image display apparatus
JP4991787B2 (en) * 2009-04-24 2012-08-01 パナソニック株式会社 Reflective photoelectric sensor
GB0921461D0 (en) * 2009-12-08 2010-01-20 Qinetiq Ltd Range based sensing
US8320621B2 (en) * 2009-12-21 2012-11-27 Microsoft Corporation Depth projector system with integrated VCSEL array
US20110188054A1 (en) * 2010-02-02 2011-08-04 Primesense Ltd Integrated photonics module for optical projection
CN103124514B (en) * 2010-08-24 2016-09-28 Osram GmbH Color-tunable light source unit having a unit and a fluorescence unit
JP5163713B2 (en) * 2010-08-24 2013-03-13 カシオ計算機株式会社 Distance image sensor, distance image generation device, distance image data acquisition method, and distance image generation method
TWI428558B (en) 2010-11-10 2014-03-01 Pixart Imaging Inc Distance measurement method and system, and processing software thereof
JP5830270B2 (en) * 2011-05-24 2015-12-09 オリンパス株式会社 Endoscope apparatus and measuring method
US10054430B2 (en) 2011-08-09 2018-08-21 Apple Inc. Overlapping pattern projector
WO2013024822A1 (en) * 2011-08-12 2013-02-21 株式会社フジクラ Optical fiber structure, illumination device, endoscope, and optical fiber structure manufacturing method
JP2015513070A (en) * 2012-01-31 2015-04-30 スリーエム イノベイティブ プロパティズ カンパニー Method and apparatus for measuring the three-dimensional structure of a surface
WO2013129387A1 (en) * 2012-03-01 2013-09-06 日産自動車株式会社 Range finding device and range finding method
EP2833095B1 (en) * 2012-03-28 2023-06-28 Fujitsu Limited Imaging device
JP6250040B2 (en) * 2012-05-18 2017-12-20 Siemens Healthcare Diagnostics Inc. Fisheye lens analyzer
EP2696590B1 (en) 2012-08-06 2014-09-24 Axis AB Image sensor positioning apparatus and method
US9741184B2 (en) * 2012-10-14 2017-08-22 Neonode Inc. Door handle with optical proximity sensors
US9927571B2 (en) 2012-10-24 2018-03-27 Seereal Technologies S.A. Illumination device
US9285893B2 (en) * 2012-11-08 2016-03-15 Leap Motion, Inc. Object detection and tracking with variable-field illumination devices
WO2014106843A2 (en) * 2013-01-01 2014-07-10 Inuitive Ltd. Method and system for light patterning and imaging
US8768559B1 (en) 2013-01-22 2014-07-01 Qunomic Virtual Technology, LLC Line projection system
US9142019B2 (en) * 2013-02-28 2015-09-22 Google Technology Holdings LLC System for 2D/3D spatial feature processing
US10105149B2 (en) 2013-03-15 2018-10-23 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery
US9364167B2 (en) 2013-03-15 2016-06-14 Lx Medical Corporation Tissue imaging and image guidance in luminal anatomic structures and body cavities
US20140320605A1 (en) * 2013-04-25 2014-10-30 Philip Martin Johnson Compound structured light projection system for 3-D surface profiling
EP3092601A4 (en) * 2014-01-06 2017-11-29 Eyelock Llc Methods and apparatus for repetitive iris recognition
GB2522248A (en) * 2014-01-20 2015-07-22 Promethean Ltd Interactive system
JP6370177B2 (en) 2014-09-05 2018-08-08 株式会社Screenホールディングス Inspection apparatus and inspection method
US20160128553A1 (en) 2014-11-07 2016-05-12 Zheng Jason Geng Intra-Abdominal Lightfield 3D Endoscope and Method of Making the Same
JP6484072B2 (en) 2015-03-10 2019-03-13 アルプスアルパイン株式会社 Object detection device
EP3295119A4 (en) 2015-05-10 2019-04-10 Magik Eye Inc. Distance sensor

Cited By (60)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10467806B2 (en) 2012-05-04 2019-11-05 Intermec Ip Corp. Volume dimensioning systems and methods
US10635922B2 (en) 2012-05-15 2020-04-28 Hand Held Products, Inc. Terminals and methods for dimensioning objects
US10805603B2 (en) 2012-08-20 2020-10-13 Intermec Ip Corp. Volume dimensioning system calibration systems and methods
US10321127B2 (en) 2012-08-20 2019-06-11 Intermec Ip Corp. Volume dimensioning system calibration systems and methods
US10908013B2 (en) 2012-10-16 2021-02-02 Hand Held Products, Inc. Dimensioning system
US10228452B2 (en) 2013-06-07 2019-03-12 Hand Held Products, Inc. Method of error correction for 3D imaging device
US10240914B2 (en) 2014-08-06 2019-03-26 Hand Held Products, Inc. Dimensioning system with guided alignment
US10859375B2 (en) 2014-10-10 2020-12-08 Hand Held Products, Inc. Methods for improving the accuracy of dimensioning-system measurements
US10402956B2 (en) 2014-10-10 2019-09-03 Hand Held Products, Inc. Image-stitching for dimensioning
US10775165B2 (en) 2014-10-10 2020-09-15 Hand Held Products, Inc. Methods for improving the accuracy of dimensioning-system measurements
US10810715B2 (en) 2014-10-10 2020-10-20 Hand Held Products, Inc. System and method for picking validation
US10393508B2 (en) 2014-10-21 2019-08-27 Hand Held Products, Inc. Handheld dimensioning system with measurement-conformance feedback
US10218964B2 (en) 2014-10-21 2019-02-26 Hand Held Products, Inc. Dimensioning system with feedback
US10386468B2 (en) * 2015-03-05 2019-08-20 Hanwha Techwin Co., Ltd. Photographing apparatus and method
US10488192B2 (en) 2015-05-10 2019-11-26 Magik Eye Inc. Distance sensor projecting parallel patterns
US10593130B2 (en) 2015-05-19 2020-03-17 Hand Held Products, Inc. Evaluating image values
US11403887B2 (en) 2015-05-19 2022-08-02 Hand Held Products, Inc. Evaluating image values
US11906280B2 (en) 2015-05-19 2024-02-20 Hand Held Products, Inc. Evaluating image values
US10247547B2 (en) 2015-06-23 2019-04-02 Hand Held Products, Inc. Optical pattern projector
US10612958B2 (en) 2015-07-07 2020-04-07 Hand Held Products, Inc. Mobile dimensioner apparatus to mitigate unfair charging practices in commerce
US11029762B2 (en) 2015-07-16 2021-06-08 Hand Held Products, Inc. Adjusting dimensioning results using augmented reality
US10249030B2 (en) 2015-10-30 2019-04-02 Hand Held Products, Inc. Image transformation for indicia reading
US10225544B2 (en) 2015-11-19 2019-03-05 Hand Held Products, Inc. High resolution dot pattern
US10747227B2 (en) 2016-01-27 2020-08-18 Hand Held Products, Inc. Vehicle positioning and object avoidance
US10872214B2 (en) 2016-06-03 2020-12-22 Hand Held Products, Inc. Wearable metrological apparatus
US10339352B2 (en) 2016-06-03 2019-07-02 Hand Held Products, Inc. Wearable metrological apparatus
US10417769B2 (en) 2016-06-15 2019-09-17 Hand Held Products, Inc. Automatic mode switching in a volume dimensioner
US10163216B2 (en) 2016-06-15 2018-12-25 Hand Held Products, Inc. Automatic mode switching in a volume dimensioner
US11002537B2 (en) 2016-12-07 2021-05-11 Magik Eye Inc. Distance sensor including adjustable focus imaging sensor
US10909708B2 (en) 2016-12-09 2021-02-02 Hand Held Products, Inc. Calibrating a dimensioner using ratios of measurable parameters of optically-perceptible geometric elements
US11047672B2 (en) 2017-03-28 2021-06-29 Hand Held Products, Inc. System for optically dimensioning
US10401872B2 (en) * 2017-05-23 2019-09-03 Gopro, Inc. Method and system for collision avoidance
US10589860B2 (en) * 2017-05-23 2020-03-17 Gopro, Inc. Spherical infrared emitter
US11172108B2 (en) 2017-06-13 2021-11-09 Sharp Kabushiki Kaisha Imaging device
US10733748B2 (en) * 2017-07-24 2020-08-04 Hand Held Products, Inc. Dual-pattern optical 3D dimensioning
US20190026912A1 (en) * 2017-07-24 2019-01-24 Hand Held Products, Inc. Dual-pattern optical 3D dimensioning
EP3435026A1 (en) * 2017-07-24 2019-01-30 Hand Held Products, Inc. Dual-pattern optical 3d dimensioning
US10885761B2 (en) 2017-10-08 2021-01-05 Magik Eye Inc. Calibrating a sensor system including multiple movable sensors
CN111492262A (en) * 2017-10-08 2020-08-04 Magik Eye Inc. Distance measurement using a longitudinal grid pattern
US11199397B2 (en) 2017-10-08 2021-12-14 Magik Eye Inc. Distance measurement using a longitudinal grid pattern
US10679076B2 (en) 2017-10-22 2020-06-09 Magik Eye Inc. Adjusting the projection system of a distance sensor to optimize a beam layout
US10931883B2 (en) 2018-03-20 2021-02-23 Magik Eye Inc. Adjusting camera exposure for three-dimensional depth sensing and two-dimensional imaging
US11381753B2 (en) 2018-03-20 2022-07-05 Magik Eye Inc. Adjusting camera exposure for three-dimensional depth sensing and two-dimensional imaging
WO2019182881A1 (en) * 2018-03-20 2019-09-26 Magik Eye Inc. Distance measurement using projection patterns of varying densities
US11062468B2 (en) 2018-03-20 2021-07-13 Magik Eye Inc. Distance measurement using projection patterns of varying densities
CN112166345A (en) * 2018-03-20 2021-01-01 Magik Eye Inc. Distance measurement using projected patterns of varying density
EP3769121A4 (en) * 2018-03-20 2021-12-29 Magik Eye Inc. Distance measurement using projection patterns of varying densities
US10593014B2 (en) * 2018-03-26 2020-03-17 Ricoh Company, Ltd. Image processing apparatus, image processing system, image capturing system, image processing method
US10584962B2 (en) 2018-05-01 2020-03-10 Hand Held Products, Inc. System and method for validating physical-item security
US20190377088A1 (en) * 2018-06-06 2019-12-12 Magik Eye Inc. Distance measurement using high density projection patterns
US11474245B2 (en) * 2018-06-06 2022-10-18 Magik Eye Inc. Distance measurement using high density projection patterns
US11475584B2 (en) * 2018-08-07 2022-10-18 Magik Eye Inc. Baffles for three-dimensional sensors having spherical fields of view
US11483503B2 (en) 2019-01-20 2022-10-25 Magik Eye Inc. Three-dimensional sensor including bandpass filter having multiple passbands
US11474209B2 (en) 2019-03-25 2022-10-18 Magik Eye Inc. Distance measurement using high density projection patterns
US11019249B2 (en) * 2019-05-12 2021-05-25 Magik Eye Inc. Mapping three-dimensional depth map data onto two-dimensional images
US11639846B2 (en) 2019-09-27 2023-05-02 Honeywell International Inc. Dual-pattern optical 3D dimensioning
US11320537B2 (en) 2019-12-01 2022-05-03 Magik Eye Inc. Enhancing triangulation-based three-dimensional distance measurements with time of flight information
US11580662B2 (en) 2019-12-29 2023-02-14 Magik Eye Inc. Associating three-dimensional coordinates with two-dimensional feature points
US11688088B2 (en) 2020-01-05 2023-06-27 Magik Eye Inc. Transferring the coordinate system of a three-dimensional camera to the incident point of a two-dimensional camera
US20230328218A1 (en) * 2022-04-08 2023-10-12 Himax Technologies Limited Structured light projector and three-dimensional image sensing apparatus

Also Published As

Publication number Publication date
US10228243B2 (en) 2019-03-12
HK1252962A1 (en) 2019-06-06
EP3295119A1 (en) 2018-03-21
WO2016182982A1 (en) 2016-11-17
KR20180005659A (en) 2018-01-16
KR20180006377A (en) 2018-01-17
TW201706563A (en) 2017-02-16
JP2020148784A (en) 2020-09-17
TWI687652B (en) 2020-03-11
JP2018514783A (en) 2018-06-07
WO2016182985A1 (en) 2016-11-17
HK1253380A1 (en) 2019-06-14
CN107850420A (en) 2018-03-27
EP3295119A4 (en) 2019-04-10
US20160327385A1 (en) 2016-11-10
CN107896506A (en) 2018-04-10
EP3295118A1 (en) 2018-03-21
JP6761817B2 (en) 2020-09-30
TW201706564A (en) 2017-02-16
JP2018514780A (en) 2018-06-07
EP3295118A4 (en) 2018-11-21
CN107850420B (en) 2021-01-12

Similar Documents

Publication Publication Date Title
US20160328854A1 (en) Distance sensor
CN107407553B (en) Distance sensor
KR102543275B1 (en) Distance sensor projecting parallel patterns
US10488192B2 (en) Distance sensor projecting parallel patterns
US11475584B2 (en) Baffles for three-dimensional sensors having spherical fields of view
US20200182974A1 (en) Vertical cavity surface emitting laser-based projector
US20230324516A1 (en) Time of flight-based three-dimensional sensing system
KR20170031185A (en) Wide field-of-view depth imaging
KR20210033528A (en) Detector to determine the position of at least one object
EP3596425B1 (en) Optoelectronic devices for collecting three-dimensional data
EP3889541A1 (en) Imaging system and method using projection apparatus

Legal Events

Date Code Title Description
AS Assignment

Owner name: MAGIK EYE INC., NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KIMURA, AKITERU;REEL/FRAME:038514/0446

Effective date: 20160506

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE