FI128501B - Stereo imaging apparatus - Google Patents

Stereo imaging apparatus

Info

Publication number
FI128501B
Authority
FI
Finland
Prior art keywords
image
light
axis
input element
imaging device
Prior art date
Application number
FI20186073A
Other languages
Finnish (fi)
Swedish (sv)
Other versions
FI20186073A1 (en)
Inventor
Jukka-Tapani Mäkinen
Kai Ojala
Original Assignee
Teknologian Tutkimuskeskus Vtt Oy
Priority date
Filing date
Publication date
Application filed by Teknologian Tutkimuskeskus Vtt Oy filed Critical Teknologian Tutkimuskeskus Vtt Oy
Priority to FI20186073A priority Critical patent/FI128501B/en
Priority to PCT/FI2019/050886 priority patent/WO2020120842A1/en
Publication of FI20186073A1 publication Critical patent/FI20186073A1/en
Application granted granted Critical
Publication of FI128501B publication Critical patent/FI128501B/en

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00Stereoscopic photography
    • G03B35/08Stereoscopic photography by simultaneous recording
    • G03B35/10Stereoscopic photography by simultaneous recording having single camera with stereoscopic-base-defining system
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/14Measuring arrangements characterised by the use of optical techniques for measuring distance or clearance between spaced objects or spaced apertures
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01DMEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D5/00Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member to another variable where the form or nature of the sensing member does not constrain the means for converting; Transducers not specially adapted for a specific variable
    • G01D5/26Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member to another variable where the form or nature of the sensing member does not constrain the means for converting; Transducers not specially adapted for a specific variable characterised by optical transfer means, i.e. using infrared, visible, or ultraviolet light
    • G01D5/32Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member to another variable where the form or nature of the sensing member does not constrain the means for converting; Transducers not specially adapted for a specific variable characterised by optical transfer means, i.e. using infrared, visible, or ultraviolet light with attenuation or whole or partial obturation of beams of light
    • G01D5/34Mechanical means for transferring the output of a sensing member; Means for converting the output of a sensing member to another variable where the form or nature of the sensing member does not constrain the means for converting; Transducers not specially adapted for a specific variable characterised by optical transfer means, i.e. using infrared, visible, or ultraviolet light with attenuation or whole or partial obturation of beams of light the beams of light being detected by photocells
    • G01D5/36Forming the light into pulses
    • G01D5/38Forming the light into pulses by diffraction gratings
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00Optical objectives specially designed for the purposes specified below
    • G02B13/06Panoramic objectives; So-called "sky lenses" including panoramic objectives having reflecting surfaces
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B17/00Systems with reflecting surfaces, with or without refracting elements
    • G02B17/08Catadioptric systems
    • G02B17/0804Catadioptric systems using two curved mirrors
    • G02B17/0808Catadioptric systems using two curved mirrors on-axis systems with at least one of the mirrors having a central aperture
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B17/00Systems with reflecting surfaces, with or without refracting elements
    • G02B17/08Catadioptric systems
    • G02B17/0856Catadioptric systems comprising a refractive element with a reflective surface, the reflection taking place inside the element, e.g. Mangin mirrors
    • G02B17/086Catadioptric systems comprising a refractive element with a reflective surface, the reflection taking place inside the element, e.g. Mangin mirrors wherein the system is made of a single block of optical material, e.g. solid catadioptric systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/10Beam splitting or combining systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00Optical elements other than lenses
    • G02B5/18Diffraction gratings
    • G02B5/1814Diffraction gratings structurally combined with one or more further optical elements, e.g. lenses, mirrors, prisms or other diffraction gratings
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B37/00Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
    • G03B37/06Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe involving anamorphosis
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/207Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/271Image signal generators wherein the generated image signals comprise depth maps or disparity maps
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/698Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/02Bodies
    • G03B17/17Bodies with reflectors arranged in beam forming the photographic image, e.g. for reducing dimensions of camera

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

An apparatus (500) comprises: - an imaging device (CAM1) comprising an input element (100), a focusing unit (300) and an image sensor (DET1), wherein the input element (100) has a symmetry axis (AX1), wherein a viewing region (ZONE1) surrounds the input element (100), and wherein the imaging device (CAM1) is arranged to form an annular image (IMG1) of the viewing region (ZONE1) on the image sensor (DET1), - a curved diffractive element (G1) to provide a first diffracted light ray (LR2a) and a second light ray (LR2b, LR2c) by diffracting light (LB1) received from a first object point (P0) located in the viewing region (ZONE1) such that the first diffracted light ray (LR2a) propagates towards the axis (AX1) in a first azimuthal direction (φ1), and such that the second diffracted light ray (LR2b, LR2c) propagates towards the axis (AX1) in a second different azimuthal direction (φ2), - wherein the imaging device (CAM1) is arranged to form a first image point (P1a) of the first object point (P0) by focusing light of the first diffracted light ray (LR2a) to the image sensor (DET1), and the imaging device (CAM1) is arranged to form a second different image point (P1b, P1c) of the first object point (P0) by focusing light of the second diffracted light ray (LR2b, LR2c) to the image sensor (DET1).

Description

STEREO IMAGING APPARATUS
FIELD

Some aspects relate to capturing a stereo image.
BACKGROUND

It is known that a stereo image of an object may be captured by capturing a first image of the object with a first camera, by capturing a second image of the object with a second camera, and by associating the first captured image with the second captured image.
It is known that the distance to the object may be determined by capturing the first image of the object with the first camera, by capturing the second image of the object with the second camera, and by comparing the first captured image with the second captured image.
It is known that the distance between a single camera and an object may be determined e.g. by capturing a first image of the object when the camera is at a first transverse position, by moving the camera in a transverse direction, by capturing a second image of the object when the camera is in a second transverse position, and by comparing the first captured image with the second captured image.
US 2018/039050 discloses an optical system for panoramic stereoscopic imaging, the optical system comprising an outer reflector, an inner reflector, and a camera. The outer reflector comprises a plurality of striations and a generally paraboloidal major structure having a wide end and a narrow end, the narrow end facing toward the camera. The outer reflector is configured to reflect light to the camera. The inner reflector is inside the outer reflector. The outer reflector is partially reflective and partially transmissive. The inner reflector comprises a generally paraboloidal reflective surface having a wide end and a narrow end, the narrow end facing towards the camera. The inner reflector is configured to reflect light that is transmitted through the outer reflector to the camera.
US 2015/346582 discloses an imaging device, which comprises an omnidirectional lens.
US 2011/074917 discloses a panoramic imaging system comprising a panoramic imaging lens, a relay lens for condensing a light output from the panoramic imaging lens, and an imaging device for acquiring an image according to the condensed light and converting the image to an electric signal.
US 2016/077315 discloses a panoramic camera, which comprises a convex reflector having an axially symmetric aspheric surface that provides a virtual curved and compressed image of a panoramic scene with a non-parabolic image compression, and a decompression lens to decompress the virtual curved and compressed image into a real image with parabolic image decompression.
US 2015/098090 discloses an optical position-measuring device for detecting a position of two objects movable relative to one another.
The position-measuring device comprises a scanning system adapted to form first and second scanning beam paths to generate a group of phase-shifted signals at an output end from interfering partial beams of rays.
The scanning system is also adapted to form a third scanning beam path to determine position along a second lateral shift direction.
A light source is adapted to supply a beam to the scanning system via a first light guide and common coupling-in optics for all three scanning beam paths.

SUMMARY

Some versions may relate to a stereo imaging apparatus. Some versions may relate to an apparatus for measuring a distance. Some versions may relate to a method for capturing a stereo image. Some versions may relate to a method for measuring a distance. Some versions may relate to a method for monitoring position of an object.
Some versions may relate to measuring the position of a vehicle. Some versions may relate to a position monitoring device. Some versions may relate to a vehicle, which comprises a position monitoring device.

According to an aspect, there is provided an apparatus (500), comprising:
- an imaging device (CAM1) comprising an input element (100), a focusing unit (300) and an image sensor (DET1), wherein the input element (100) has a symmetry axis (AX1), wherein a viewing region (ZONE1) surrounds the input element (100), and wherein the imaging device (CAM1) is arranged to form an annular image (IMG1) of the viewing region (ZONE1) on the image sensor (DET1),
- a curved diffractive element (G1) to provide a first diffracted light ray (LR2a) and a second light ray (LR2b, LR2c) by diffracting light (LB1) received from a first object point (P0) located in the viewing region (ZONE1) such that the first diffracted light ray (LR2a) propagates towards the axis (AX1) in a first azimuthal direction (φ1), and such that the second light ray (LR2b, LR2c) propagates towards the axis (AX1) in a second different azimuthal direction (φ2),
- wherein the imaging device (CAM1) is arranged to form a first image point (P1a) of the first object point (P0) by focusing light of the first diffracted light ray (LR2a) to the image sensor (DET1), and the imaging device (CAM1) is arranged to form a second different image point (P1b, P1c) of the first object point (P0) by focusing light of the second light ray (LR2b, LR2c) to the image sensor (DET1).

Further aspects are defined in the claims.

The stereo imaging apparatus may comprise a curved diffraction grating, which surrounds an input element of an omnidirectional imaging device. The input element may be e.g. a catadioptric lens or a paraboloid reflector, which has an axis of symmetry. The imaging device may have a wide viewing region about the symmetry axis. The imaging device and the apparatus may have a viewing region, which completely surrounds the axis. The viewing region may represent a 360° angle about the axis. The input element of the apparatus may be referred to e.g. as an omnidirectional lens or as a panoramic lens. The imaging device may be referred to e.g. as an omnidirectional camera or as a panoramic camera.
The stereo imaging apparatus may form an annular image of the viewing region on the image sensor. The apparatus may be arranged to capture the annular stereo image formed on the image sensor.
The annular stereo image may comprise a first partial image of an object and a second partial image of an object, wherein the angular position of the first partial image with respect to the second partial image may depend on the distance between the object and the input element. The apparatus may be arranged to detect the angular position of the first partial image, with respect to the second partial image, and the apparatus may be arranged to determine a distance value from the detected angular position of the first partial image. Distance information may be determined from angular separation between adjacent partial images of the annular image. The distance information may be determined from the annular image e.g. by using image recognition and/or by using directed illumination. The apparatus may be configured to determine a distance between an object and the apparatus by analyzing the captured annular image. The analysis of the captured annular image may also be performed in a distributed manner, e.g. by using a service running on a remote computer and/or in an internet server.
The apparatus may have a 360° horizontal view around the vertical axis. The monitoring device may provide position information by measuring the positions of objects. The monitoring device may provide position information e.g. for controlling operation of a vehicle. The information may be used e.g. for controlling the velocity and/or direction of the vehicle.

The apparatus may be arranged to provide information about the positions of objects located near said apparatus. The position information may be used e.g. for controlling operation of an autonomous vehicle. The control system of an autonomous vehicle may use real time information about positions of the objects e.g. in order to control the velocity and/or direction of the vehicle.
The viewing region may completely surround the grating element. The viewing region may correspond to a 360° horizontal view around the vertical axis. The apparatus may measure distances to objects which are within the viewing region. The apparatus may measure distances to multiple objects by analyzing a single captured image. The objects may be located at arbitrary positions around the grating element. The monitoring device does not need to comprise any moving parts. In particular, the monitoring device does not need to comprise a rotating mirror.
BRIEF DESCRIPTION OF THE DRAWINGS

In the following examples, several variations will be described in more detail with reference to the appended drawings, in which

Fig. 1 shows, by way of example, in a three-dimensional view, an omnidirectional imaging device,
Fig. 2 shows, by way of example, in a three-dimensional view, a stereo imaging apparatus,
Fig. 3 shows, by way of example, in a three-dimensional view, a portion of a diffraction grating,
Fig. 4 shows, by way of example, in a top view, diffracted light rays formed by a cylindrical diffraction grating,
Fig. 5a shows, by way of example, in a top view, diffracted light rays formed by the cylindrical diffraction grating, in a situation where the object point is at a first distance,
Fig. 5b shows, by way of example, in a top view, diffracted light rays formed by the cylindrical diffraction grating, in a situation where the object point is at a second different distance,
Fig. 5c shows, by way of example, in a three-dimensional view, the azimuthal direction of a first light ray, and the azimuthal direction of a second light ray,
Fig. 6a shows, by way of example, in a cross-sectional side view, propagation of light in the stereo imaging apparatus,
Fig. 6b shows, by way of example, in a three-dimensional view, propagation of light in the stereo imaging apparatus,
Fig. 7 shows, by way of example, in a top view, angular position of a first partial image and angular position of a second partial image,
Fig. 8 shows, by way of example, in a side view, an effective input pupil of the imaging device,
Fig. 9 shows, by way of example, in a three-dimensional view, upper and lower boundary of a viewing region of the stereo imaging apparatus,
Fig. 10 shows, by way of example, in a three-dimensional view, a vehicle, which comprises the stereo imaging apparatus,
Fig. 11 shows, by way of example, functional units of the stereo imaging apparatus,
Fig. 12 shows, by way of example, in a cross-sectional side view, a stereo imaging apparatus, which comprises a curved reflective surface, and
Fig. 13 shows, by way of example, in a cross-sectional side view, a stereo imaging apparatus, which comprises a wavefront modifying unit.

DETAILED DESCRIPTION
Referring to Fig. 1, the omnidirectional imaging device CAM1 may comprise an input element 100, an aperture stop AS1, a focusing unit 300, and an image sensor DET1. The input element 100 may gather light LB1 from the viewing region ZONE1. The input element 100 may receive light LB1 from the viewing region ZONE1, and the input element 100 may deflect at least a part of the received light to focusing optics 300. The input element 100 may provide deflected light LB10 towards the focusing unit 300. The focusing unit 300 may form the annular image IMG1 on the image sensor DET1 by focusing the deflected light LB10. The input element 100 may provide a deflected beam LB10 e.g. by reflecting light LB1 gathered from the viewing region ZONE1, and the focusing unit 300 may provide focused light LB12 by focusing light of a deflected light beam LB10. The focused light may impinge on the image sensor DET1 so as to form an annular image IMG1.
The omnidirectional imaging device CAM1 may be arranged to receive light from the viewing region ZONE1. The imaging device CAM1 may be arranged to form the annular image IMG1 on an image sensor DET1. The image sensor DET1 may convert the optical image IMG1 into a digital image. The imaging device CAM1 may be arranged to capture an annular image IMG1, which comprises images of objects OBJ1 located in the viewing region ZONE1. The captured image IMG1 may be subsequently processed and/or analyzed e.g. by using a data processor CNT1 (Fig. 11).
The imaging device CAM1 may form an image SUB1 of an object OBJ1, which is located in the viewing region ZONE1. The input element 100 may gather light LB1 from the object OBJ1. The image SUB1 may be referred to e.g. as a partial image or as a sub-image. The imaging device CAM1 may form an image point P1 of a point P0 of the object OBJ1. The image point P1 may be a point of the partial image SUB1. The annular image IMG1 may comprise the partial image SUB1, which in turn may comprise the image point P1.

The input element 100 may be axially symmetric with respect to the axis AX1. The viewing region ZONE1 may completely surround the axis AX1. The input element 100 may be e.g. a catadioptric lens, which is arranged to receive light from the viewing region ZONE1. The catadioptric lens may have e.g. two refractive surfaces and two reflective surfaces to provide a folded optical path.
The folded optical path may allow reducing the size of the imaging device CAM1. The imaging device CAM1 may have a low height, due to the folded optical path.
The optical elements of the imaging device CAM1 may be axially symmetric with respect to the symmetry axis AX1. The imaging device CAM1 may form an annular image IMG1 of a viewing region ZONE1, which surrounds the symmetry axis AX1.
The image sensor DET1 may be e.g. a CCD sensor or a CMOS sensor. CCD means charge coupled device. CMOS means Complementary Metal Oxide Semiconductor. The imaging device CAM1 may comprise an aperture stop AS1 e.g. to define an input pupil EPU1 of the imaging device CAM1. The aperture stop AS1 may e.g. limit or reduce optical aberrations of the imaging device CAM1. The aperture stop AS1 may be positioned e.g. between the input element 100 and the focusing unit 300. SX, SY and SZ may denote orthogonal directions. The symmetry axis AX1 may be parallel with the direction SZ. The vertical direction SZ may be parallel to the direction of gravity, but the direction SZ does not need to be parallel to the direction of gravity.
A reference plane REF1 may be perpendicular to the axis AX1. The viewing region ZONE1 may have a first conical boundary ZB1 and a second conical boundary ZB2. α1 may denote an angle between the first conical boundary ZB1 and the reference plane REF1. α2 may denote an angle between the second conical boundary ZB2 and the reference plane REF1. The angle α1 may be e.g. in the range of +10° to +60°, wherein the angle α2 may be e.g. in the range of -60° to 0°.

The reference plane REF1 may be e.g. a horizontal reference plane. The viewing region ZONE1 may have an upper conical boundary ZB1 and a lower conical boundary ZB2. The vertical field of view (= α1 + α2) of the device 100 may be e.g. in the range of 10° to 120°.

The imaging device CAM1 may further comprise a wavefront modifying unit 200 to increase the resolution of the image IMG1 (Fig. 13). The wavefront modifying unit 200 may e.g. correct wavefront distortion of a light beam received from the input element 100. The wavefront modifying unit 200 may e.g. collimate a light beam received from the input element 100.

Referring to Fig. 2, the stereo imaging apparatus 500 may comprise a curved diffraction grating element G1 and the omnidirectional imaging device CAM1. The curved diffraction grating element G1 may surround the input element 100 of the imaging device CAM1. The grating element G1 may provide diffracted light LB2a, LB2b, LB2c by diffracting light received from an object point P0 of an object OBJ1. The object OBJ1 may be located in the viewing region ZONE1. The object point P0 may be located in the viewing region ZONE1.

The apparatus 500 may comprise:
- an input element 100 to gather light from a viewing region ZONE1,
- an image sensor DET1,
- a focusing unit 300 to form an annular image IMG1 of the viewing region ZONE1 on the image sensor DET1 by focusing the gathered light, and
- a diffractive element G1 surrounding the input element 100,
wherein the diffractive element G1 is arranged to provide diffracted light rays by diffracting light gathered from an object point P0 located in the viewing region ZONE1, wherein the imaging device CAM1 is arranged to form partial images SUB1a, SUB1b, SUB1c on the image sensor DET1 by focusing light of the diffracted light rays to the image sensor DET1.
The curved diffraction grating G1 may completely surround the input element 100 of the imaging device CAM1. The curved diffraction grating G1 may be e.g. a substantially cylindrical diffractive element, which may encircle the input element 100. The diffractive features DF1 of the grating element G1 may be located on a substantially cylindrical surface.
The grating G1 may provide two or more diffracted light beams LB2a, LB2b, LB2c by diffracting light received from the point P0 of the object OBJ1. The grating G1 may provide a first diffracted light beam LB2a by diffracting light LB1 received from the point P0. The grating G1 may provide a second diffracted light beam LB2b by diffracting light LB1 received from the point P0. The grating G1 may provide a third diffracted light beam LB2c by diffracting light LB1 received from the point P0. The first diffracted light beam LB2a may be formed e.g. in the diffraction order m=1. The second diffracted light beam LB2b may be formed e.g. in the diffraction order m=0. The third diffracted light beam LB2c may be formed e.g. in the diffraction order m=-1. The diffracted light beams LB2a, LB2b, LB2c may propagate towards the symmetry axis AX1 of the input element 100. The imaging device CAM1 may form a first image point P1a by focusing light of the first diffracted light beam LB2a. The imaging device CAM1 may form a second image point P1b by focusing light of the second diffracted light beam LB2b. The imaging device CAM1 may form a third image point P1c by focusing light of the third diffracted light beam LB2c.
The angle φ1 may denote the angular distance between the first image point P1a and the second image point P1b of the annular image IMG1, with respect to the symmetry axis AX1. The angle φ2 may denote the angular distance between the second image point P1b and the third image point P1c of the annular image IMG1, with respect to the symmetry axis AX1.

The angular distances φ1, φ2 may depend on the distance L1 between the point P0 and the symmetry axis AX1. The angular distance φ1 and/or φ2 may be determined from the annular image IMG1. The distance L1 may be determined from the angular distance φ1 and/or φ2, respectively.

The grating element G1 may be e.g. a transmissive surface relief grating, which comprises a plurality of substantially parallel grooves positioned according to a grating constant dG.
The diffractive element G1 of the apparatus 500 may be formed e.g. by wrapping diffractive foil around the input element 100 of the imaging device CAM1. The diffractive element G1 of the apparatus 500 may be a substantially cylindrical piece of diffractive foil, which surrounds the input element 100.
Referring to Fig. 3, the diffraction grating G1 may comprise a plurality of longitudinal diffractive features DF1. POR1 may denote a portion of the curved diffraction grating element G1. POR1 may denote a portion of a cylindrical diffraction grating element G1. The diffractive features DF1 may be diffractive lines. The diffractive features DF1 may be e.g. longitudinal grooves and/or ridges. The symbol dG denotes the grating constant of the diffraction grating G1. The grating constant dG may mean the distance between centers of adjacent similar diffractive linear features DF1. The line density NG of the grating G1 may be equal to the inverse (1/dG) of the grating constant dG. The line density NG means the number of substantially similar diffractive features per unit length. The line density NG (= 1/dG) may indicate e.g. the number of diffractive grooves DF1 per a length of 1 mm. The line density NG of the grating G1 may be e.g. in the range of 50/mm to 1200/mm. The line density NG of the grating G1 may be e.g. in the range of 50 lines per mm to 1200 lines per mm.

The diffraction grating G1 may form one or more diffracted light rays LR2a1, LR2a0, LR2a-1 by diffracting light of an input ray LR1a received from an object point P0. The light ray LR2a1 may be diffracted e.g. in the diffraction order m=1. The light ray LR2a0 may be diffracted e.g. in the diffraction order m=0. The light ray LR2a-1 may be diffracted e.g. in the diffraction order m=-1.
The angle θm=1 may denote the angle between the direction of the light ray LR2a1 and the surface normal N1 of the grating G1. The angle θm=0 may denote the angle between the direction of the light ray LR2a0 and the surface normal N1. The angle θm=-1 may denote the angle between the direction of the light ray LR2a-1 and the surface normal N1. The angle θin may denote the angle between the direction of the input ray LR1a and the surface normal N1.
The diffracted light ray LR2a0 may be substantially parallel with the input ray LR1a. The directions of the diffracted light rays may be determined e.g. according to the following diffraction grating equation:

dG · (sin θin − sin θm) = m · λ    (1)

m may denote the diffraction order. λ may denote the wavelength of light. For example, θm=1 = 40° in a situation where the wavelength λ = 650 nm and the grating constant dG = (1/1000) mm. The line density NG of the grating G1 may then be equal to 1000/mm. A numerical sketch of equation (1) is given below.

The imaging apparatus 500 may optionally comprise an optical filter FIL1 to limit the spectral bandwidth of light impinging on the image sensor DET1 (Fig. 6a). The optical filter FIL1 may have a fixed bandwidth. The bandwidth of the optical filter FIL1 may be e.g. narrower than or equal to 20 nm, advantageously narrower than or equal to 10 nm. The bandwidth of the optical filter FIL1 may be e.g. in the range of 1 nm to 20 nm. The bandwidth of the optical filter FIL1 may be e.g. in the range of 1 nm to 10 nm. The spectral position of the center of the passband of the filter FIL1 may be fixed. The optical filter FIL1 may also have an adjustable bandwidth. The spectral position of the center of the passband of the filter FIL1 may also be adjustable.

Alternatively, or in addition, the object OBJ1 may be illuminated with illuminating light, which has a narrow bandwidth. The bandwidth of the optical filter FIL1 and/or the illuminating light may be e.g. narrower than or equal to 20 nm, advantageously narrower than or equal to 10 nm. The object OBJ1 may be illuminated e.g. with laser light which has a bandwidth less than 1 nm.

Referring to Fig. 4, the apparatus 500 may comprise a substantially cylindrical diffractive element G1, which may be substantially concentric with the symmetry axis AX1. For example, a diffraction grating film G1 may be e.g. wrapped around the symmetry axis AX1 to form a substantially cylindrical diffractive element. The longitudinal diffractive features DF1 of the diffractive element G1 may be substantially parallel with the symmetry axis AX1.
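As a minimal numerical sketch of equation (1), assuming the plain scalar grating equation and the example values quoted above (the function name and the unit convention are illustrative, not taken from the patent):

```python
import math

def diffraction_angle(theta_in_deg, d_um, wavelength_um, m):
    """Solve d*(sin(theta_in) - sin(theta_m)) = m*lambda for theta_m (degrees).

    Returns None when the requested order is evanescent (|sin| > 1),
    i.e. the order does not propagate."""
    s = math.sin(math.radians(theta_in_deg)) - m * wavelength_um / d_um
    if abs(s) > 1.0:
        return None
    return math.degrees(math.asin(s))

# Values from the text: 1000 lines/mm -> dG = 1 um, wavelength 650 nm.
for m in (-1, 0, 1):
    print(m, diffraction_angle(0.0, 1.0, 0.65, m))
# m=0 leaves the ray undeflected; m=+1 and m=-1 give about -40.5 and +40.5
# degrees, matching the roughly 40 degree figure quoted above for |m| = 1.
```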
The apparatus 500 may receive light LB1 from the object point P0 located in the viewing region ZONE1. The light LB1 may comprise light rays LR1a, LR1b, LR1c. The grating G1 may form diffracted light rays LR2a1, LR2a0, LR2a-1 by diffracting light of the light ray LR1a, corresponding to the diffraction orders m=1, m=0, and m=-1, respectively. The grating G1 may form diffracted light rays LR2b1, LR2b0, LR2b-1 by diffracting light of the light ray LR1b. The grating G1 may form diffracted light rays LR2c1, LR2c0, LR2c-1 by diffracting light of the light ray LR1c.
The diffracted light rays LR2a-1, LR2b0, LR2c1 may propagate towards the symmetry axis AX1. The angle φ1 may denote the (azimuthal) angle between the light rays LR2a-1 and LR2b0. The angle φ2 may denote the (azimuthal) angle between the light rays LR2b0 and LR2c1. The angles φ1, φ2 between the directions of the diffracted light rays LR2a-1, LR2b0, LR2c1 may depend on the distance L1 between the object point P0 and the symmetry axis AX1.

Referring to Fig. 5a, the stereo imaging apparatus 500 may simultaneously form several partial images SUB1a, SUB1b, SUB1c of the same object OBJ1 on the image sensor DET1. The angular position of the partial images SUB1a and SUB1c in the image IMG1 may depend on the distance L1 of the object OBJ1. The angular distance φ1, φ2 between the image points P1a, P1b, P1c (when viewed from the symmetry axis AX1) may depend on the distance L1. The distance L1 may be determined e.g. from the angle φ1, from the angle φ2 and/or from the sum φ1 + φ2.

The omnidirectional imaging device CAM1 may substantially maintain the azimuthal direction of each light ray which propagates towards the axis AX1 and subsequently through the input element 100 of the omnidirectional imaging device CAM1 to the image sensor DET1.
rG1 may denote the distance between the grating element G1 and the symmetry axis AX1. The inner diameter of the substantially cylindrical grating element G1 may be equal to 2·rG1. The grating element G1 may be in contact with the input element 100, or the inner diameter of the grating element G1 may be greater than the outer diameter of the input element 100. Selecting a large distance rG1 may facilitate measuring large distances L1. Selecting a small distance rG1 may minimize the size of the apparatus 500.

Fig. 5b shows a second situation where the distance L1 between an object OBJ1 and the axis AX1 is smaller than the distance L1 of Fig. 5a. The angular values φ1 and φ2 of Fig. 5b are smaller than the angular values φ1 and φ2 of Fig. 5a. The angle φ1 may increase with increasing distance L1.

Referring to Fig. 5c, the azimuthal direction of a light ray may be defined with respect to a reference direction (e.g. -SX or +SX) and/or with respect to the azimuthal direction of a reference light ray (e.g. LR2b0). The grating element G1 may receive light rays LR1a, LR1b from an object point P0. The light ray LR1a may impinge on the grating element G1 at a point PG1a. The light ray LR1b may impinge on the grating element G1 at a different point PG1b. The grating G1 may form light rays LR2a1, LR2a0, LR2a-1 by diffracting light of the light ray LR1a, corresponding to the diffraction orders m=1, m=0, and m=-1. The grating G1 may form light rays LR2b1, LR2b0, LR2b-1 by diffracting light of the light ray LR1b. The first light ray LR2a-1 and the second light ray LR2b0 may propagate towards the axis AX1. REF1 may denote a plane, which is perpendicular to the axis AX1. The plane REF1 may be defined by the directions SX and SY. The axis AX1 may be parallel with the direction SZ.
The first diffracted light ray LR2a-1 may have a projection LR2a'-1 on the plane REF1. The second light ray LR2b0 may have a projection LR2b'0 on the plane REF1. The azimuth angle φa of the ray LR2a-1 may denote the angle between the projection LR2a'-1 of the first diffracted light ray LR2a-1 and a reference direction (-SX). The azimuth angle φb of the ray LR2b0 may denote the angle between the projection LR2b'0 of the second light ray LR2b0 and the reference direction (-SX).

The azimuthal direction of the ray LR2a-1 may mean the direction of the projection LR2a'-1 of the ray LR2a-1 on the plane REF1. The azimuthal direction of the ray LR2b0 may mean the direction of the projection LR2b'0 of the ray LR2b0 on the plane REF1. The azimuthal direction of the ray LR2c1 (see Fig. 5b) may mean the direction of the projection of the ray LR2c1 on the plane REF1.

The angle φa may specify the azimuthal direction of the ray LR2a-1 with respect to the reference direction (-SX). The angle φb may specify the azimuthal direction of the ray LR2b0 with respect to the reference direction (-SX). The angle φ1 may specify the azimuthal direction of the ray LR2a-1 with respect to the projection LR2b'0. The angle φ2 may specify the azimuthal direction of the ray LR2c1 with respect to the projection LR2b'0.
The light ray LR1a may have a projection LR1a' on the plane REF1. The light ray LR1a may have a zenith angle β1a with respect to the direction (-SZ). The light ray LR1b may have a projection LR1b' on the plane REF1. The light ray LR1b may have a zenith angle β1b with respect to the direction (-SZ).
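The text above does not state a closed-form relation between the azimuthal separation and the distance L1. Purely as an illustration, the following sketch assumes in-plane rays, a thin cylindrical grating of radius rG1, an m=0 reference ray passing through the axis AX1, and an order-m ray leaving the grating along the local radial normal; under those assumptions, triangle geometry gives φ = θin − arcsin(rG1·sin θin / L1), with sin θin = m·λ/dG, which can be inverted for L1. Both the model and the function names are hypothetical, not taken from the patent:

```python
import math

def phi_from_distance(L1, r_g1, wavelength, d_g, m=1):
    """Azimuthal separation phi (radians) between the m=0 ray and an
    order-m ray for an object point at distance L1 from the axis AX1.
    All lengths must use the same unit."""
    sin_theta_in = m * wavelength / d_g      # grating equation with theta_m = 0
    theta_in = math.asin(sin_theta_in)
    return theta_in - math.asin(r_g1 * sin_theta_in / L1)

def distance_from_phi(phi, r_g1, wavelength, d_g, m=1):
    """Invert the model: estimate L1 from a measured azimuthal separation."""
    sin_theta_in = m * wavelength / d_g
    theta_in = math.asin(sin_theta_in)
    return r_g1 * sin_theta_in / math.sin(theta_in - phi)

# Example in millimetres: 1000 lines/mm grating, 650 nm light, rG1 = 20 mm.
phi = phi_from_distance(L1=500.0, r_g1=20.0, wavelength=650e-6, d_g=1e-3)
print(math.degrees(phi))                            # ~39.1 degrees
print(distance_from_phi(phi, 20.0, 650e-6, 1e-3))   # ~500.0 mm recovered
```

Consistent with Figs. 5a and 5b, φ shrinks as the object approaches the grating and tends towards θin for distant objects, and a larger radius rG1 spreads the usable range of φ over larger distances L1.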
Fig. 6a shows, by way of example, propagation of light through the apparatus 500.

The input element 100 may be e.g. a catadioptric element, which comprises a refractive input surface SRF1, a first reflective surface SRF2, a second reflective surface SRF3, and a refractive output surface SRF4. The grating element G1 may provide diffracted light LB2 by diffracting light LB1 received e.g. from an object point P0.
The input surface SRF1 may provide refracted light LB3 by refracting the diffracted light LB2. The input surface SRF1 may refract the light LB3 towards the first reflective surface SRF2. The first reflective surface SRF2 may reflect light LB4 towards the second reflective surface SRF3. The second reflective surface SRF3 may reflect light LB5 towards the output surface SRF4. The output surface SRF4 may provide output light LB10 to the focusing unit 300 through an aperture stop AS1. The focusing unit 300 may focus light received from the input element 100 to the image sensor DET1. The focusing unit 300 may provide focused light LB12 to the image sensor DET1. The focusing unit 300 may form an image point P1 on the image sensor DET1 by focusing the light received from the input element 100.
The aperture stop AS1 of the apparatus 500 may limit the effective aperture of the omnidirectional imaging device CAM1 such that the device CAM1 may form a substantially sharp partial image SUB1a by focusing light diffracted by the curved diffractive element G1. The f-number of the omnidirectional imaging device CAM1 may be e.g. substantially equal to 1.68, the vertical field of view of the viewing region ZONE1 may be e.g. from -15° to +15°, the outer diameter of the annular image IMG1 may be e.g. substantially equal to 8 mm, and the inner diameter of the annular image IMG1 may be e.g. substantially equal to 4 mm.
The diameter of the input element 100 may be e.g. 24 mm, and the height of the omnidirectional imaging device CAM1 may be e.g. 24 mm in the direction of the symmetry axis AX1. The diffractive features DF1 may be e.g. linear grooves. The line density NG of the cylindrical grating element G1 may be e.g. 1000 lines/mm. The line density NG of the cylindrical grating element G1 may be e.g. in the range of 50/mm to 1200/mm.
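For orientation, the example figures above imply rough plate scales for the annular image. The short computation below assumes, purely for illustration, a linear mapping of the vertical field of view onto the radial span of the annulus:

```python
import math

r_outer_mm, r_inner_mm = 8.0 / 2, 4.0 / 2        # annular image radii from the text
vfov_deg = 15.0 - (-15.0)                        # vertical field of view, -15..+15 deg
radial_scale = vfov_deg / (r_outer_mm - r_inner_mm)      # deg of elevation per mm
r_mid_mm = 0.5 * (r_outer_mm + r_inner_mm)
azimuthal_scale = 360.0 / (2 * math.pi * r_mid_mm)       # deg of azimuth per mm of arc
print(radial_scale, azimuthal_scale)             # 15.0 deg/mm and ~19.1 deg/mm
```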
Referring to Fig. 6b, a light ray LR2a-1 diffracted in the diffraction order m=-1 towards the axis AX1 may contribute to forming a first image point P1a on the image sensor DET1. A light ray LR2b0 diffracted in the diffraction order m=0 towards the axis AX1 may contribute to forming a second image point P1b. A light ray LR2c1 diffracted in the diffraction order m=1 towards the axis AX1 may contribute to forming a third image point P1c. The image points P1a, P1b, P1c may be images of the same object point P0 located in the viewing region ZONE1. The distance L1 of the point P0 may be determined e.g. from the angular position of the image point P1a, with respect to the points P1b, P1c.

Referring to Fig. 7, the image points P1a, P1b, P1c may be formed at a radial distance rP1a from the center of the annular image IMG1. The apparatus 500 may form the image points P1a, P1b, P1c of the object point P0 substantially at the same radial distance rP1a from the axis AX1.

rP1a may denote a distance between the image point P1a and the center of the annular image IMG1. rP1b may denote a distance between the image point P1b and the center of the annular image IMG1. rP1c may denote a distance between the image point P1c and the center of the annular image IMG1. The radial distance rP1b may be substantially equal to the radial distance rP1a. The radial distance rP1c may be substantially equal to the radial distance rP1a.

The apparatus 500 may form three or more image points P1a, P1b, P1c of the object point P0 such that the image points P1a, P1b, P1c are not on the same line.
The annular image IMG1 may have an inner boundary IB1 and an outer boundary OB2. The inner boundary 1B1 may be e.g. the image of the first conical boundary ZB1 of the viewing region VIEW1. The outer boundary OB2 may be e.g. the image of the second conical boundary ZB2 of the viewing region VIEVWV1.
Alternatively, the inner boundary IB1 may be the image of the second conical boundary ZB2 and the outer boundary OB2 may be the image of the first conical boundary ZB1.
The center of the annular image IMG1 may coincide with the symmetry axis = AX1. The annular image IMG1 may have an inner radius rmin and an outer o radius rmax. The diameter dmax may be substantially equal to 2:rmax.
S + The image information representing the omnidirectional viewing region ZONE 1 T may be formed in the annular region between the inner radius ruin and the & 30 outer radius rmax.
I = Q The ratio of the inner radius ruin to the outer radius rmax may be e.g. in the 3 range of 0.3t0 0.7.
N
The annular image IMG1 may surround a central region CREG1. The apparatus 500 may operate such that light gathered from the omnidirectional viewing region ZONE1 is not focused to the central region CREG1.
The method may comprise using an image recognition algorithm to recognize the partial images SUB1a, SUB1b, SUB1c of an object OBJ1 in the annular image IMG1. The apparatus 500 may be arranged to operate such that the angular distance 91 between the image points P1a and P1b is equal to the angular distance 92 between the image points P1b, P1c. The method may comprise using information about the equality of the angles (m1=02) for verifying and/or facilitating image recognition. The method may comprise determining whether the annular image comprises three substantially identical partial images, which are separated by the same angular distance ¢1 (=¢2).
The input element 100 and the focusing unit 300 may be arranged to form the annular optical image IMG1 on the image sensor DET1.
Referring to Fig. 8, the imaging device CAM1 may comprise an aperture stop AS1 to define the width wer of an effective optical aperture EPU1 of the imaging device CAM1. The imaging device CAM1 may comprise an aperture stop AS1 to define the height her of the effective optical aperture EPU1. The aperture EPU1 may also be called e.g. as an entrance pupil. The aperture stop AS1 may define the entrance pupil EPU1 of the imaging device CAM1 by preventing propagation of marginal rays.
o The aperture stop AS1 may improve the sharpness of the image IMG1 by N preventing propagation of marginal rays, which could cause blurring of the 5 optical image IMG1. The aperture stop AS1 may be arranged to prevent 2 propagation of rays, which may cause blurring of the optical image IMG1. The N 30 aperture stop AS1 may be arranged to define the dimensions entrance pupil E EPU1. The entrance pupil EPU1 may have a width wer and a height her.
R 3 For example, the aperture stop AS1 may define an entrance pupil EPE of the = imaging device CAM1 such that the effective F-number of the imaging device N 35 CAM1 is in the range of 1.5 to 10.
The aperture stop AS1 may improve the sharpness of the image IMG1 by limiting optical aberrations caused by the curved diffraction grating G1. The width wer of the effective optical aperture of the imaging device CAM1 may be e.g. smaller than 10% of the radius of curvature rai of the grating element G1, so as to limit optical aberrations caused by the curved diffraction grating G1. The radius of curvature rc of the grating element G1 may be e.g. greater than times the width wer of the effective optical aperture of the imaging device CAM1, so as to limit optical aberrations caused by the curved diffraction grating G1. 10 Referring to Fig. 9, the omnidirectional viewing region ZONE1 may completely surround the input element 100. The viewing region ZONE1 may have an upper conical boundary ZB1 and a lower conical boundary ZB2. a1 may denote the angle between the upper boundary ZB1 and a horizontal plane REF1. a2 may denote the angle between the lower boundary ZB2 and the horizontal plane REF 1. The angle amax may be e.g. in the range of 10° to 20°. The angle amin may be e.g. in the range of -20* to +5°. The difference oamax- min May be may be called e.g. as the vertical field of view.
The horizontal field of view of the imaging device CAM1 may be e.g. substantially equal to 360°. The horizontal field of view of the apparatus 500 may be e.g. substantially equal to 360°. Referring to Fig. 10, the apparatus 500 may be installed e.g. to vehicle 1000. The vehicle 1000 may be e.g. an automobile, a bus, a train, or a tram.
The apparatus 500 may be installed e.g. to a ship (i.e. boat). The apparatus may o be installed to a moving and/or stationary body 1000. An object OBJ1 may be O located in the viewing region of the apparatus 500. The apparatus 500 may be + arranged to detect the position of one or more objects OBJ1 with respect to T the input element 100 of the apparatus 500. The apparatus 500 may be N 30 arranged to provide position information POS1. The position information POS1 E may comprise information about the position of the object OBJ1 with respect o to the apparatus 500. o = For example, a corner of an object OBJ1 may be used as an object point PO.
N 35 For example, one or more corners, edges and/or surfaces of the object OBJ1 may be used as detectable features, which may be detected by an imagerecognition algorithm, so as to determine a distance L1. The annular image may comprise partial images SUB1a, SUB1b, SUB1c of a detectable feature of the object. The apparatus 500 may determine the distance L1 from the angular separation 91, p2 of the partial images SUB1a, SUB1b, SUB1c.
The apparatus 500 may operate in an environment which comprises one or more objects OBJ1. The position information POS1 may be used e.g. for avoiding collision of the vehicle 1000 with the object OBJ1. The position information POS1 may be used e.g. for optimizing a route of the vehicle 1000 with respect to the object OBJ1. The position information POS1 may be used e.g. for tracking the position of the vehicle 1000. The position information POS1 may be used e.g. for predicting the position of the vehicle 1000.
The method may comprise controlling the velocity and/or direction of movement of the vehicle 1000 based on the measured position of the object OBJ1. The positions of one or more objects OBJ1 may be measured by using the apparatus 500. The distance between and object OBJ1 and the apparatus 500 may be measured by using the apparatus 500. The distance L1 between the object OBJ1 and the apparatus 500 may be monitored by using the apparatus
500. The apparatus 500 may be arranged measure the velocity of the object OBJ1 with respect to the apparatus 500. The apparatus 500 may be arranged o measure the velocity of the apparatus 500 with respect to the object OBJ1. N The apparatus 500 may be arranged to detect a change of distance between 5 the object OBJ1 and the apparatus 500. An object or obstacle OBJ1 may T comprise a surface portion R1a and/or R1b. N 30 E The apparatus 500 may be attached to a vehicle 1000. The vehicle may be o moving at a non-zero velocity with respect to an obstacle OBJ1. A vehicle 1000 3 may comprise the apparatus 500. The position of the vehicle 1000 may be = monitored by using the apparatus 500. The position of the vehicle 1000 with N 35 respect to one or more obstacles may be monitored by using the apparatus
500. The velocity of the vehicle 1000 may be monitored by using the apparatus
500. A collision between the vehicle 1000 may be avoided by using position information provided by the apparatus 500. A route for the vehicle 1000 may be selected based on information about the positions of the obstacles. The vehicle may be e.g. a ground vehicle, an airborne vehicle, or a boat. The vehicle may be e.g. a car, a bus, a train, a motorcycle, a helicopter, or a flying device. Referring to Fig. 11, a position monitoring apparatus 500 may comprise the grating G1 and the imaging device CAM1. The imaging device CAM1 may comprise the image sensor DET1 for capturing the annular image IMG1. The apparatus 500 may comprise a data processing unit CNT1 for performing data processing operations. The processing unit CNT1 may be configured to determine a distance L1 by analyzing the captured image IMG1. The data processing unit CNT1 may comprise one or more data processors. The unit CNT1 may be configured to process image data. The memory MEM3 may comprise computer program PROG1. The computer program code PROG1 may be configured to, when executed on at least one processor CNT1, cause the apparatus 500 to capture the annular image IMG1. The computer program code PROG1 may be configured to, when executed on at least one processor CNT1, cause the apparatus 500 to determine a distance from the captured image IMG1. The image sensor DET1 may convert the annular optical image IMG1 into a digital image DIMG1. The apparatus 500 may comprise a memory MEM1 for o storing the digital image DIMG1.
S + The apparatus 500 may comprise a memory MEM2 for storing determined 2 position data POS1. The position data POS1 may comprise e.g. the N 30 coordinates of one or more objects. The apparatus 500 may provide position E information POS1.
R 3 The apparatus 500 may comprise a memory MEMS for storing computer = program PROG1. The computer program PROG1 may comprise computer N 35 program code configured to, when executed on at least one processor CNT1, cause the apparatus 500 to measure the positions of the objects OBJ1.
The apparatus 500 may comprise a communication unit RXTX1 to send measured position data POS1. The apparatus 500 may send the position data POS1 e.g. to a control unit of a traffic control system. The apparatus 500 may send the position data POS1 e.g. to a surveillance system. The apparatus 500 may send the position data POS1 e.g. to a control system of a vehicle 1000. COM1 denotes a communication signal. The communication unit RKTX1 may be arranged to send the data e.g. by wireless communication, by an electrical cable, or by an optical cable. The communication unit RXTX1 may be arranged to send the data to the Internet and/or to a mobile communications network. The apparatus 500 may optionally comprise e.g. a user interface UIF1. The user interface UIF1 may comprise e.g. a display for displaying information to a user. The user interface UIF1 may be arranged to display e.g. distance information L1. The apparatus 500 may be arranged to provide information about the presence of objects e.g. for controlling lighting. The apparatus 500 may be arranged to provide information about the movements of objects e.g. for controlling lighting. The apparatus 500 may be arranged to provide information about the presence of objects e.g. for stopping operation of an industrial robot. The apparatus 500 may be arranged to provide information for a surveillance and/or security system. The apparatus 500 may be arranged to provide information about the presence of objects e.g. for initiating an alarm. The apparatus 500 may be arranged to provide information about the movements of objects e.g. for o initiating an alarm.
S + Referring to Fig. 12, the apparatus 500 may comprise an omnidirectional ? imaging device CAM1 and a substantially cylindrical diffraction grating G1. The & 30 imaging device CAM1 may comprise an input element 100, a focusing unit E 300, and an image sensor DET1. The input element 100 may comprise e.g. a o curved reflective surface SRF5. The input element 100 may comprise single 3 curved reflector SRF5. The reflector SRF5 may be e.g. a paraboloid reflector. = hyperboloid, conical, spherical, or ellipsoidal reflector. A single paraboloid N 35 — hyperboloid, conical, spherical, or ellipsoidal reflector may be used as the input element 100.
The grating G1 may provide diffracted light LB2 by diffracting light LB1 received from an object OBJ1. The surface SRF5 may reflect diffracted light towards the focusing optics 300. The focusing optics 300 may form the annular image IMG1 on the image sensor DET1 by focusing the light reflected from the surface SRF5. The distance L1 to the object OBJ1 may be determined from the angular distance between the partial images of the image IMG1. Referring to Fig. 13, the apparatus 500 may comprise an omnidirectional imaging device CAM1 and a substantially cylindrical diffraction grating G1. The imaging device CAM1 may comprise an input element 100, an aperture stop AS1, a focusing unit 300, and an image sensor DET1. The imaging device CAM1 may optionally comprise a wavefront modifying unit 200. The distance L1 to the object OBJ1 may be determined from the angular distance between the partial images of the image IMG1. The apparatus 500 may be arranged to form the annular optical image IMG1 on the image sensor DET1, by diffracting and focusing light of several light beams LB14, LB12 For example, a first light beam LB1; may be received from a first object OBJ1, and a second light beam LB1> may be received from a second object.
The first light beam LB14 may propagate in a first direction DIR.
The second light beam LB1> may propagate in a second different direction DIR2. The apparatus 500 may be arranged to form a first image point P11 on the image sensor DET1 by diffracting and focusing light of the first light beam o LB14. The apparatus 500 may be arranged to form a second image point P12 O on the image sensor DET by diffracting and focusing light of the second light 3 beam LB11. & 30 The input element 100, the optical elements of the (optional) modifying unit E 200, the aperture stop AS1, and the optical elements of the focusing unit 300 o may be substantially axially symmetric with respect to the axis AX1. o = The input element 100 may be substantially axially symmetric about the axis N 35 AX1. The optical components of the imaging apparatus 500 may be substantially axially symmetric about the axis AX1. The input element 100 maybe axially symmetric about the axis AX1. The axis AX1 may be called e.g. as the symmetry axis, or as the symmetry axis. The input surface SRF1 of the input element 100 may provide a first refracted light beam LB3 by refracting light of a diffracted light beam LB2. The first reflective surface SRF2 may provide a first reflected beam LB4 by reflecting light of the first refracted beam LB3. The second reflective surface SRF3 may provide a second reflected beam LB5 by reflecting light of the first reflected beam LB4. The output surface SRF4 may provide an output beam LB10 by refracting light of the second reflected beam LBS. The input element 100 may be arranged to operate e.g. such that the second reflected beam LB5 formed by the second reflective surface SRF3 does not intersect the first refracted light beam LB3 formed by the input surface SRF1.
The input element 100 may comprise or consist of substantially transparent material, e.g. glass or plastic. The light may propagate from the input surface SRF1 to the output surface SRF4 in a substantially homogeneous material without propagating in a gas. The reflective surfaces SRF2, SRF3 of the input element LNS1 may be arranged to reflect light by total internal reflection (TIR).

The focusing unit 300 may comprise one or more lenses 301, 302, 303, 304. The aperture stop AS1 may be positioned between the input element 100 and the focusing unit 300. The center of the aperture stop AS1 may substantially coincide with the axis AX1. The aperture stop AS1 may be substantially circular. The aperture stop AS1 may limit transverse dimensions of light beams which propagate from the input element 100 to the focusing unit 300. The aperture stop AS1 may limit transverse dimensions of light beams which pass through the aperture stop AS1. The aperture stop AS1 may define the entrance pupil EPU1 of the imaging device CAM1 (Fig. 8).
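Whether SRF2 and SRF3 can rely on TIR depends only on the refractive index of the element material and the local angle of incidence: TIR requires the incidence angle to exceed the critical angle arcsin(n_outside/n_inside). A minimal check, with an assumed index of 1.52 for glass (the patent does not specify the material index):

```python
import math

def tir_occurs(incidence_deg: float, n_inside: float, n_outside: float = 1.0) -> bool:
    """True if a ray hitting the surface from inside the element is totally
    internally reflected (incidence angle measured from the surface normal)."""
    critical_deg = math.degrees(math.asin(n_outside / n_inside))
    return incidence_deg > critical_deg

# n = 1.52 gives a critical angle of about 41.1 degrees, so e.g. a 50 degree
# incidence at SRF2 or SRF3 would be reflected without any mirror coating:
print(tir_occurs(50.0, 1.52))   # True
```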
The imaging device CAM1 may optionally comprise a wavefront modifying unit 200 to modify the wavefront of light beams which are coupled out of the input element 100. The wavefront modifying unit 200 may form an intermediate beam by modifying the wavefront of an output beam which is coupled out of the element 100 through the output surface SRF4. The wavefront modifying unit 200 may form e.g. a substantially collimated light beam. The aperture stop AS1 may be arranged to limit the transverse dimensions of the intermediate beam. The light of the intermediate beam may be focused on the image sensor DET1 by the focusing unit 300. The focusing unit 300 may be arranged to form a focused beam by focusing light of the intermediate beam. The input element 100 may also be arranged to operate such that the wavefront modifying unit 200 is not needed. The output beam of the input element 100 may be directly coupled to the focusing unit 300.

For the person skilled in the art, it will be clear that modifications and variations of the devices and the methods according to the present invention are perceivable. The figures are schematic. The particular embodiments described above with reference to the accompanying drawings are illustrative only and not meant to limit the scope of the invention, which is defined by the appended claims.

Claims (14)

1. An apparatus (500), comprising:
- an imaging device (CAM1) comprising an input element (100), a focusing unit (300) and an image sensor (DET1), wherein the input element (100) has a symmetry axis (AX1), wherein a viewing region (ZONE1) surrounds the input element (100), and wherein the imaging device (CAM1) is arranged to form an annular image (IMG1) of the viewing region (ZONE1) on the image sensor (DET1),
characterized in that the apparatus (500) further comprises a curved diffractive element (G1) to provide a first diffracted light ray (LR2a) and a second light ray (LR2b, LR2c) by diffracting light (LB1) received from a first object point (P0) located in the viewing region (ZONE1) such that the first diffracted light ray (LR2a) propagates towards the axis (AX1) in a first azimuthal direction (φ1), and such that the second light ray (LR2b, LR2c) propagates towards the axis (AX1) in a second different azimuthal direction (φ2),
wherein the imaging device (CAM1) is arranged to form a first image point (P1a) of the first object point (P0) by focusing light of the first diffracted light ray (LR2a) to the image sensor (DET1), and the imaging device (CAM1) is arranged to form a second different image point (P1b, P1c) of the first object point (P0) by focusing light of the second light ray (LR2b, LR2c) to the image sensor (DET1),
wherein the input element (100) is arranged to provide deflected light (LB10) by deflecting light of the light rays (LR2a, LR2b, LR2c) received from the diffractive element (G1), and the focusing unit (300) is arranged to focus the deflected light (LB10) to the image sensor (DET1),
wherein the diffractive element (G1) comprises a plurality of substantially linear diffractive features (DF1), the diffractive features (DF1) are located on a substantially cylindrical surface which is concentric with an axis (AX1) of symmetry of the input element (100), the linear diffractive features (DF1) are substantially parallel with the axis (AX1), and the radial distance (rP1a) of the first image point (P1a) from the center (AX1) of the annular image (IMG1) is substantially equal to the radial distance (rP1b) of the second image point (P1b) from the center (AX1) of the annular image (IMG1).
2. The apparatus (500) of claim 1, wherein the input element (100) is axially symmetric with respect to a first axis (AX1), and wherein the viewing region (ZONE1) surrounds the first axis (AX1).
3. The apparatus (500) of claim 1 or 2, wherein the line density (1/dG1) of the diffractive element (G1) is in the range of 50/mm to 1200/mm.
4. The apparatus (500) according to any of the claims 1 to 3, wherein the diffractive element (G1) comprises a diffractive foil wrapped around the input element (100).
5. The apparatus (500) according to any of the claims 1 to 4, wherein the radius of curvature (rC1) of the grating element (G1) is greater than 10 times the width (wEFF) of the effective optical aperture of the imaging device (CAM1).
6. The apparatus (500) according to any of the claims 1 to 5, comprising a spectrally selective optical filter (FIL1) to limit spectral bandwidth of light focused to the image sensor (DET1).
7. The apparatus (500) according to any of the claims 1 to 6, wherein the apparatus (500) comprises a data processing unit (CNT1) configured to detect the angular position (φ1) of the first image point (P1a) with respect to the angular position (φ2) of the second image point (P1b, P1c), and to determine a distance value (L1) from the detected angular position (φ1).
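One conceivable implementation of the detection step of claim 7 is to unwrap the annular image into an azimuthal intensity profile and estimate the angular offset between the partial images from the circular autocorrelation of that profile. The sketch below is an illustrative assumption, not the patent's algorithm: the function names, ring limits, bin count, and peak-picking rule are placeholders.

```python
import numpy as np

def azimuthal_profile(img: np.ndarray, cx: float, cy: float,
                      r_min: float, r_max: float, n_bins: int = 720) -> np.ndarray:
    """Sum pixel intensities of the annular image over the ring
    r_min <= r <= r_max, binned by azimuth angle around the center (cx, cy)."""
    ys, xs = np.indices(img.shape)
    r = np.hypot(xs - cx, ys - cy)
    phi = np.arctan2(ys - cy, xs - cx)               # -pi .. pi
    ring = (r >= r_min) & (r <= r_max)
    bins = ((phi[ring] + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins
    profile = np.zeros(n_bins)
    np.add.at(profile, bins, img[ring])
    return profile

def angular_separation(profile: np.ndarray) -> float:
    """Azimuthal offset (rad) between two partial images, estimated as the
    lag of the strongest non-zero peak of the circular autocorrelation."""
    f = np.fft.rfft(profile - profile.mean())
    acf = np.fft.irfft(f * np.conj(f))
    acf[0] = -np.inf                                 # ignore the zero-lag peak
    lag = int(np.argmax(acf[: len(profile) // 2]))
    return 2 * np.pi * lag / len(profile)
```

The returned angle plays the role of the detected angular position; the distance value (L1) could then be obtained e.g. with a model such as the triangulation sketch given earlier in the description.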
8. The apparatus (500) according to any of the claims 1 to 7, wherein the apparatus (500) is configured to recognize a first partial image (SUB1a) of a first object (OBJ1) and a second partial image (SUB1b, SUB1c) of the first object (OBJ1) by image recognition.
9. The apparatus (500) according to any of the claims 1 to 8, wherein the input element (100) is a catadioptric element, which comprises a refractive input surface (SRF1), a first reflective surface (SRF2), a second reflective surface (SRF3), and a refractive output surface (SRF4).
10. The apparatus (500) according to any of the claims 1 to 8, wherein the input element (100) comprises a reflective surface selected from a group consisting of paraboloid surface, hyperboloid surface, conical surface, and ellipsoid surface.
11. The apparatus (500) according to any of the claims 1 to 10, wherein a first angle (α1) defining the first boundary (ZB1) of the viewing region (ZONE1) is in the range of +10° to +60° with respect to a horizontal plane (REF1), and a second angle (α2) defining a second boundary (ZB2) of the viewing region (ZONE1) with respect to a horizontal plane (REF1) is in the range of -60° to -10°.
12. A vehicle (1000), comprising the apparatus (500) according to any of the claims 1 to 11.
13. A method, comprising:
- using an omnidirectional imaging device (CAM1) to form an annular image (IMG1) of a viewing region (ZONE1) on an image sensor (DET1), wherein the viewing region (ZONE1) surrounds an input element (100) of the imaging device (CAM1), and wherein the input element (100) has an axis (AX1) of symmetry,
characterized in that the method further comprises using a curved diffractive element (G1) to provide a first diffracted light ray (LR2a) and a second light ray (LR2b, LR2c) by diffracting light (LB1) received from a first object point (P0) located in the viewing region (ZONE1) such that the first diffracted light ray (LR2a) propagates towards the axis (AX1) in a first azimuthal direction (φ1), and such that the second light ray (LR2b, LR2c) propagates towards the axis (AX1) in a second different azimuthal direction (φ2),
- forming a first image point (P1a) of the first object point (P0) by focusing light of the first diffracted light ray (LR2a) to the image sensor (DET1), and
- forming a second different image point (P1b, P1c) of the first object point (P0) by focusing light of the second light ray (LR2b, LR2c) to the image sensor (DET1),
wherein the input element (100) is arranged to provide deflected light (LB10) by deflecting light of the light rays (LR2a, LR2b, LR2c) received from the diffractive element (G1), and the focusing unit (300) is arranged to focus the deflected light (LB10) to the image sensor (DET1),
wherein the diffractive element (G1) comprises a plurality of substantially linear diffractive features (DF1), the diffractive features (DF1) are located on a substantially cylindrical surface which is concentric with an axis (AX1) of symmetry of the input element (100), the linear diffractive features (DF1) are substantially parallel with the axis (AX1), and the radial distance (rP1a) of the first image point (P1a) from the center (AX1) of the annular image (IMG1) is substantially equal to the radial distance (rP1b) of the second image point (P1b) from the center (AX1) of the annular image (IMG1).
14. The method of claim 13, comprising detecting an angular position (φ1) of the first image point (P1a) with respect to the second image point (P1b, P1c), and determining a distance value (L1) from the detected angular position (φ1).
FI20186073A 2018-12-13 2018-12-13 Stereo imaging apparatus FI128501B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
FI20186073A FI128501B (en) 2018-12-13 2018-12-13 Stereo imaging apparatus
PCT/FI2019/050886 WO2020120842A1 (en) 2018-12-13 2019-12-12 Stereo imaging apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
FI20186073A FI128501B (en) 2018-12-13 2018-12-13 Stereo imaging apparatus

Publications (2)

Publication Number Publication Date
FI20186073A1 FI20186073A1 (en) 2020-06-14
FI128501B true FI128501B (en) 2020-06-30

Family

ID=68965919

Family Applications (1)

Application Number Title Priority Date Filing Date
FI20186073A FI128501B (en) 2018-12-13 2018-12-13 Stereo imaging apparatus

Country Status (2)

Country Link
FI (1) FI128501B (en)
WO (1) WO2020120842A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114995044A (en) * 2021-02-26 2022-09-02 中强光电股份有限公司 Omnidirectional display device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102033300A (en) * 2009-09-30 2011-04-27 鸿富锦精密工业(深圳)有限公司 Panoramic lens and pan-shot system with panoramic lens
US20130147919A1 (en) * 2011-12-09 2013-06-13 California Institute Of Technology Multi-View Difraction Grating Imaging With Two-Dimensional Displacement Measurement For Three-Dimensional Deformation Or Profile Output
DE212015000145U1 (en) * 2014-05-30 2017-01-13 Teknologian Tutkimuskeskus Vtt Oy Omnidirectional imaging device
WO2016140928A1 (en) * 2015-03-01 2016-09-09 Arkive, Llc Panoramic stereoscopic imaging systems

Also Published As

Publication number Publication date
FI20186073A1 (en) 2020-06-14
WO2020120842A1 (en) 2020-06-18

Similar Documents

Publication Publication Date Title
JP6746328B2 (en) Optical system, imaging apparatus and projection apparatus including the same
KR101076986B1 (en) Solid Catadioptric Lens with a Single Viewpoint
RU2699312C1 (en) Optical system having a refracting surface and a reflecting surface, and an image capturing device and a projection device, which include it
KR20010024698A (en) An omnidirectional imaging apparatus
US20190346569A1 (en) Optical assembly and a lidar device having an optical assembly of this type
CA2974124A1 (en) Ranging system, integrated panoramic reflector and panoramic collector
JPH0719861A (en) Scanning type optical range finder
US20140362232A1 (en) Objective lens with hyper-hemispheric field of view
KR20140119719A (en) Optical system intended to measure brdf, bsdf and btdf
US20140340472A1 (en) Panoramic bifocal objective lens
FI128501B (en) Stereo imaging apparatus
JP2019101181A (en) Imaging device
US10789730B2 (en) Method and apparatus for monitoring a position
JP7134925B2 (en) stereo camera
CN108604055B (en) Omnidirectional catadioptric lens with odd-order aspheric profile or multiple lenses
US20190154885A1 (en) Panoramic imaging system
JP2019028127A (en) Optical system, and imaging apparatus and projection apparatus including the same
Pernechele Hyper-hemispheric and bifocal panoramic lenses
US10178372B2 (en) Long focal length monocular 3D imager
CN114185243A (en) Non-blind-area multi-view panoramic stereo imaging device
JP2009180752A (en) Imaging optical system and range finder
JP7043375B2 (en) Stereo camera, in-vehicle lighting unit, and stereo camera system
JP6732442B2 (en) Lightwave distance measuring device
US20190033566A1 (en) Optical system including refractive surface and reflective surface, and imaging apparatus and projection apparatus including the same
JP7264749B2 (en) Stereo camera and stereo camera integrated light unit

Legal Events

Date Code Title Description
FG Patent granted

Ref document number: 128501

Country of ref document: FI

Kind code of ref document: B