EP3748393A1 - Surround-view imaging system - Google Patents

Surround-view imaging system

Info

Publication number
EP3748393A1
EP3748393A1
Authority
EP
European Patent Office
Prior art keywords
lens
image
lens system
imaging
imaging system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP19178290.3A
Other languages
German (de)
French (fr)
Inventor
David Musick
Norbert Leclerc
Hendrik Zachmann
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jabil Optics Germany GmbH
Original Assignee
Jabil Optics Germany GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jabil Optics Germany GmbH filed Critical Jabil Optics Germany GmbH
Priority to EP19178290.3A (EP3748393A1)
Priority to JP2020097979A (JP2020197713A)
Priority to US16/892,463 (US11588992B2)
Publication of EP3748393A1
Priority to US18/157,140 (US11838663B2)

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/894 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70 SSIS architectures; Circuits associated therewith
    • H04N25/703 SSIS architectures incorporating pixels for producing signals other than image signals
    • H04N25/705 Pixels for depth measurement, e.g. RGBZ
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/08 Systems determining position data of a target for measuring distance only
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4816 Constructional features, e.g. arrangements of optical elements of receivers alone
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00 Optical objectives specially designed for the purposes specified below
    • G02B13/06 Panoramic objectives; So-called "sky lenses" including panoramic objectives having reflecting surfaces
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B17/00 Systems with reflecting surfaces, with or without refracting elements
    • G02B17/08 Catadioptric systems
    • G02B17/0856 Catadioptric systems comprising a refractive element with a reflective surface, the reflection taking place inside the element, e.g. Mangin mirrors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/58 Means for changing the camera field of view without moving the camera body, e.g. nutating or panning of optics or image sensors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/698 Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Signal Processing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Multimedia (AREA)
  • Optics & Photonics (AREA)
  • Lenses (AREA)
  • Studio Devices (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
  • Blocking Light For Cameras (AREA)
  • Camera Bodies And Camera Details Or Accessories (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The present invention refers to a surround-view imaging system for time-of-flight (TOF) depth sensing applications and a time-of-flight depth sensing based collision avoidance system comprising such an imaging system. The imaging system (100) for time-of-flight depth sensing applications comprises a lens system (10), adapted for imaging angles of view (AOV) larger than 120° in an image (16) on an image plane (14); a sensor system (20), adapted to convert at least a part of the image (16) in the image plane (14) into an electronic image signal; and an evaluation electronics (30), adapted to analyze the electronic image signal and to output resulting environmental information; wherein the lens system (10) and/or the sensor system (20) are designed for specifically imaging fields of view (FOV) starting at zenithal angles larger than 60°.

Description

  • The present invention refers to a surround-view imaging system for time-of-flight (TOF) depth sensing applications and a time-of-flight depth sensing based collision avoidance system comprising such an imaging system. Various types of optical systems as well as sensor configurations can provide a range of desired zenithal and azimuthal angle combinations for a surround-view time-of-flight depth sensing based collision avoidance system.
  • Technological Background
  • In many fields, including security, automotive and robotics, there is an increasing demand to obtain a surround-view perspective for time-of-flight depth measurements relative to a given reference point. This reference point typically corresponds to some illumination and imaging system. Depending on the application, this reference point may be in a stationary position (as is often the case with security cameras), or it may be positioned on a moving object (such as an automobile, a forklift, or a mobile robot).
  • For obtaining a surround-view image, multiple individual imaging systems are typically required, and they have to be arranged such that their individual fields of view (FOV) can be combined in the form of a panoramic environmental view. Implementing multiple imaging systems produces higher costs, especially when the various sensors are also considered. Therefore, technical solutions that create a comparable surround-view image on a single sensor with a single lens system are in demand. In this context, surround-view means that a 360° panoramic view can be imaged by a single imaging system (at least in principle).
  • A typical time-of-flight depth sensing system consists of an illumination system including beam forming (e.g. electronic and/or optical beam forming in a temporal and/or spatial manner), an imaging system comprising a receiving optics (e.g. a single lens or a lens system/objective) and a sensor for image detection, and an evaluation electronics for calculating the distances from the detected image signal and, if required, setting alarms. The illumination system typically sends out modulated or pulsed light. The distance of an object can be calculated from the time-of-flight which the emitted light requires for traveling from the illumination system to the object and back to the receiving optics.
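  • As an illustration of this relation (not part of the patent text), the short Python sketch below computes the object distance from the measured round-trip time for pulsed operation and from the measured phase shift for modulated operation; function names and example values are assumptions chosen for illustration.

      import math

      SPEED_OF_LIGHT = 299_792_458.0  # m/s

      def distance_from_round_trip(delta_t_seconds: float) -> float:
          # Pulsed case: the light travels to the object and back,
          # so the one-way distance is half of c * delta_t.
          return 0.5 * SPEED_OF_LIGHT * delta_t_seconds

      def distance_from_phase_shift(phase_rad: float, mod_freq_hz: float) -> float:
          # Continuous-wave case: the phase shift of the modulation encodes the
          # round-trip time; unambiguous only up to c / (2 * mod_freq_hz).
          return SPEED_OF_LIGHT * phase_rad / (4.0 * math.pi * mod_freq_hz)

      print(distance_from_round_trip(20e-9))           # ~3.0 m for a 20 ns round trip
      print(distance_from_phase_shift(math.pi, 20e6))  # ~3.7 m at 20 MHz modulation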
  • Optical beam forming can be achieved by a beam shaping optics included in the illumination system. The beam shaping optics and the receiving optics can be separate optical elements (one-way optics), or the beam shaping optics and the receiving optics can use single, multiple or all components of the corresponding optics commonly (two-way optics). Such time-of-flight depth sensing systems are typically referred to as systems using light/laser detection and ranging (LiDAR/LaDAR).
  • A surround-view image can be produced by using a wide-angle lens (e.g. a 'fisheye' lens or rectilinear lens) as the first lens in a lens system of an imaging system. Wide-angle lenses can have an angle of view (AOV), i.e., the maximum zenithal angle for which a lens can provide an image, of more than 180°. Lenses with an AOV of more than 180° are also called ultra wide-angle lenses. Angles of view up to around 300° can be achieved. In a normal axially symmetric imaging system the imageable azimuthal angle range is typically 360°, which allows surround-view in the azimuthal direction. Therefore, with an ultra wide-angle lens, solid angles Ω of up to around 3π steradian can be imaged. Wide-angle lenses typically show a strong curvilinear barrel distortion, which can to some degree be optically corrected in rectilinear lenses. An optical barrel distortion correction can also be included in the design of an associated lens system. Lens systems with an angle of coverage larger than 180° are called ultra wide-angle lens systems.
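  • As a plausibility check (an illustrative calculation, not taken from the patent): for an axially symmetric lens with full 360° azimuthal coverage, the imaged solid angle of a cone with full opening angle AOV is Ω = 2π(1 - cos(AOV/2)), so an AOV of 240° already corresponds to 3π steradian. The sketch below evaluates this for a few angles of view.

      import math

      def solid_angle_sr(aov_degrees: float) -> float:
          # Spherical-cap solid angle of a cone whose full opening angle is the AOV.
          half_angle = math.radians(aov_degrees / 2.0)
          return 2.0 * math.pi * (1.0 - math.cos(half_angle))

      for aov in (180, 200, 240):
          omega = solid_angle_sr(aov)
          print(f"AOV {aov} deg -> {omega:.2f} sr ({omega / math.pi:.2f} * pi)")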
  • A collision avoidance system for ground-based applications often requires surround-view imaging only in a limited zenithal angle range near the ground. For collision avoidance systems in vehicles, where the optical axis of the lens system is typically pointing upwards "into the sky", the lower zenithal angles between 0° and 60° are in most cases only of minor interest. However, standard wide-angle lens systems are designed to provide the best imaging results for the central region of the image. In the outer regions, the image often shows only a reduced sharpness and the linearity of the projection becomes low. Even when applying elaborate optical correction, aberration effects like enhanced curvature and optical distortion in the zone of interest (ZOI) are difficult to handle with standard ultra wide-angle lens systems. Due to the wide angle of view, object rays enter the optical system from all possible directions, causing noise by undesired internal reflections inside the lens system. Further, bright sunlight or other intense light sources outside the ZOI can over-illuminate the sensor and decrease the signal-to-noise ratio (SNR) of the sensor detection. The electronic image signal of the sensor will thus include the undesired parts of the image, which still have to be processed by the calculation electronics with some effort.
  • The objective problem of the invention is therefore related to the problem of providing a surround-view imaging system for time-of-flight depth sensing applications and a time-of-flight depth sensing based collision avoidance system comprising such an imaging system which avoid or at least minimize the problems in the prior art. In particular, the invention refers to a surround-view imaging system in which a range of desired zenithal and azimuthal angle combinations for a surround-view time-of-flight based collision avoidance system shall be provided.
  • Summary of Invention
  • The invention solves the objective problem by at least providing an imaging system comprising a lens system, adapted for imaging angles of view larger than 120° in an image on an image plane; a sensor system, adapted to convert at least a part of the image in the image plane into an electronic image signal; and an evaluation electronics, adapted to analyze the electronic image signal and to output resulting environmental information; wherein the lens system and/or the sensor system are designed for specifically imaging fields of view starting at zenithal angles larger than 60°.
  • Preferably, the lens system is further adapted to image around the optical axis of the lens system (axially symmetric imaging) in an image on an image plane perpendicular to the optical axis of the lens system (perpendicular imaging). However, some components of the lens system may also be arranged off-axis, or the image plane could be shifted and/or tilted with respect to the optical axis of the optical system. Such embodiments allow an increased flexibility for matching the FOV of the imaging system to a desired ZOI of a specific TOF depth sensing application.
  • The environmental information which is outputted by the evaluation electronics can be any type of information contained in the image signal. In some embodiments only a trigger signal may be outputted as environmental information. The trigger signal can be utilized to trigger an external event. In a collision avoidance system for a mobile robot, for example, it may not be required to reconstruct a full three-dimensional image (3D image) or a 3D point cloud from the image signal. A simple trigger signal correlated to a minimum allowable distance will be sufficient to avoid a collision by instantly stopping the robot. Other types of environmental information may be two-dimensional images (2D images) or fully reconstructed 3D images or 3D point clouds. These images may include optical distortions caused by the lens system. However, in a collision avoidance system an additional calculation of the distances with predefined lens system data could be required. The evaluation electronics can further output environmental information in the form of identified metadata based on image recognition results. In this case the outputted environmental information could also be, for example, an array consisting of two elements: an object identifier and a numerical value for the distance from the reference point to the recognized object.
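  • A minimal sketch of such a trigger-only evaluation, assuming the electronic image signal has already been converted into per-pixel distances; the threshold value, array layout and function name are illustrative assumptions rather than details from the patent.

      import numpy as np

      MIN_ALLOWED_DISTANCE_M = 0.5  # illustrative safety threshold

      def collision_trigger(depth_map_m: np.ndarray) -> bool:
          # Fires if any valid depth sample falls below the allowed minimum;
          # invalid pixels are assumed to be encoded as NaN and are ignored.
          valid = depth_map_m[np.isfinite(depth_map_m)]
          return bool(valid.size) and float(valid.min()) < MIN_ALLOWED_DISTANCE_M

      depths = np.array([[1.2, 0.8], [0.4, np.nan]])
      if collision_trigger(depths):
          print("object below minimum allowable distance - stop")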
  • In some applications, it may also be required to present the environmental information to a human. While a collision avoidance system may be able to work with 2D images, 3D images or 3D point clouds including optical distortions from the lens system, for a human these images should be presented distortion free or distortion corrected. Therefore, the evaluation electronics may be further adapted to correct optical distortions in the image signal and to output undistorted image information. That means predefined lens system data is used by the evaluation electronics to correct the optical distortions in the electronic image signal to output environmental information as undistorted image information. For ease in displaying or processing the image gathered by the sensor, software or an equivalent circuitry may be implemented in the evaluation electronics to remove the distortion from the electronic image signal and to form image information with a rectangular or trapezoidal format. From the undistorted image information also a distortion-free 3D point cloud may be calculated.
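  • One common way to arrive at such a rectangular format from an axially symmetric surround-view image is to unwrap the annular image region into a panorama. The sketch below assumes a simple linear relation between zenithal angle and image radius; in practice, the predefined lens system data mentioned above would replace this assumption. All names are illustrative.

      import numpy as np

      def unwrap_annulus(img: np.ndarray, center: tuple[float, float],
                         r_inner: float, r_outer: float,
                         out_height: int = 128, out_width: int = 720) -> np.ndarray:
          # Rows of the output correspond to radii (zenithal angles), columns to
          # azimuthal angles; nearest-neighbour sampling keeps the sketch short.
          cx, cy = center
          radii = np.linspace(r_inner, r_outer, out_height)
          azimuths = np.linspace(0.0, 2.0 * np.pi, out_width, endpoint=False)
          r, phi = np.meshgrid(radii, azimuths, indexing="ij")
          x = np.clip(np.round(cx + r * np.cos(phi)).astype(int), 0, img.shape[1] - 1)
          y = np.clip(np.round(cy + r * np.sin(phi)).astype(int), 0, img.shape[0] - 1)
          return img[y, x]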
  • The invention is based on the finding that for most TOF depth sensing applications using wide-angle lens systems the corresponding ZOI lies in fields of view starting at zenithal angles larger than 60°. Lower zenithal angle ranges may be only of minor importance for such applications. As standard wide-angle lens systems are typically designed to provide the best imaging results in the central region of the image, most of the imaging capabilities are wasted for imaging regions outside the ZOI. By designing a lens system, for example, specifically for imaging FOV starting at zenithal angles larger than 60°, the lens system can be optimized for the relevant ZOI of a specific TOF depth sensing application. In particular, the lens system may be optimized to provide diffraction-limited imaging for all imaging points in the FOV. The imaging for zenithal angles less than 60° can be fully neglected in the design of a corresponding lens system.
  • In a preferred embodiment, the lens system is a panomorph lens system using panomorph distortion as a design parameter for increasing the magnification of zenithal angles in the field of view compared to zenithal angles outside the field of view. A panomorph lens system is specifically designed to improve the optical performance in a predefined ZOI. The image of the FOV in the image plane is then enlarged compared to the image regions outside the FOV. When the image is detected by the sensing system, the image resolution can thus be enhanced. By using panomorph distortion in the lens system, the image resolution for a predefined sensor can be optimized by adapting the image scale to the available sensor surface. Therefore, the lens system can be optimized for specifically imaging FOV starting at zenithal angles larger than 60°.
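  • A hedged illustration of what such a distortion profile can look like: the piecewise-linear mapping below assigns a larger radial scale (image radius per degree of zenithal angle) inside an assumed ZOI of 60° to 105° than outside it, so the ZOI occupies most of the image height. The breakpoints and scale factors are illustrative assumptions, not design data from the patent.

      import numpy as np

      def panomorph_radius(theta_deg, zoi=(60.0, 105.0),
                           gain_inside=2.0, gain_outside=0.5):
          # Radial image coordinate (arbitrary units) versus zenithal angle; the
          # slope is gain_inside within the ZOI and gain_outside elsewhere, so
          # the ZOI is magnified on the sensor.
          theta = np.asarray(theta_deg, dtype=float)
          lo, hi = zoi
          r_lo = gain_outside * lo
          r_hi = r_lo + gain_inside * (hi - lo)
          return np.where(theta < lo, gain_outside * theta,
                          np.where(theta <= hi, r_lo + gain_inside * (theta - lo),
                                   r_hi + gain_outside * (theta - hi)))

      print(panomorph_radius([0, 60, 105, 150]))  # [  0.   30.  120.  142.5]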
  • In another preferred embodiment, the lens system is an anamorphic lens system adapted to change the aspect ratio of the image in the image plane. In an anamorphic lens system, cylindrical and/or toroidal lenses are used for non-axially symmetric imaging. Anamorphic designs may be useful if a predefined sensor and the image in the image plane show different aspect ratios. In that case, the sensor may not be able to detect the whole image, or parts of the sensor will not be used for imaging, which means that available image resolution is wasted. For matching the different aspect ratios, anamorphic distortion can be integrated in the lens system. Also in this embodiment the lens system can be optimized for specifically imaging FOV starting at zenithal angles larger than 60°.
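  • A small sketch of the aspect-ratio argument, assuming a circular image of known diameter and a rectangular sensor: the two anamorphic magnifications below stretch the image by different amounts horizontally and vertically so that it just fills the sensor. All numerical values are illustrative.

      def anamorphic_scales(image_diameter_mm: float,
                            sensor_width_mm: float,
                            sensor_height_mm: float) -> tuple[float, float]:
          # Horizontal and vertical magnifications that map a circular image
          # onto a rectangular sensor so that the full sensor area is used.
          return (sensor_width_mm / image_diameter_mm,
                  sensor_height_mm / image_diameter_mm)

      sx, sy = anamorphic_scales(6.0, 6.4, 4.8)  # 6 mm image circle, 6.4 mm x 4.8 mm sensor
      print(sx, sy)                              # ~1.07 horizontal, 0.8 vertical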
  • In another preferred embodiment, the central region of the entrance aperture of the lens system is covered by a blind. The blind blocks rays entering from undesired small zenithal angles below 60°. Preferably, the blind can be a surface matched (e.g. curved) circular blind, a corresponding elliptical blind or a corresponding freeform blind. The blind may also cover zenithal angles larger than 60° for some regions. The form of the blind can be used to further define the effective FOV of the imaging system, which means that specific zenithal and/or azimuthal angle ranges may be selectively blocked by the blind.
  • In an alternative preferred embodiment, at least a single region other than the central region of the entrance aperture of the lens system is covered by a blind. This embodiment can be used to block specific zenithal and/or azimuthal angle ranges by the blind in lens systems where another design approach is used for specifically imaging FOV starting at zenithal angles larger than 60°. In this case, the blind provides a flexible blocking function.
  • In another preferred embodiment, the lens system is a catadioptric lens system in which refractive and reflective optical elements are combined. A catadioptric lens system is typically used when extremely compact lens system designs are required. Further, chromatic and off-axis aberration can be minimized within such systems. Preferably, the first lens of the lens system is a biconcave lens element in which object rays are reflected by a single total internal reflection. Also preferred is that the first lens of the lens system is a complex freeform lens element in which object rays are reflected by two total internal reflections. However, reflections may also occur on surfaces of the lens system which are designed as metallic or dielectric mirrors.
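  • For context on the total internal reflections used in these first lens elements, the critical angle follows from Snell's law; the sketch below evaluates it for an assumed refractive index of 1.53 (the index value is illustrative, the patent does not state material data here).

      import math

      def tir_critical_angle_deg(n_lens: float, n_outside: float = 1.0) -> float:
          # Rays hitting the internal surface at incidence angles larger than this
          # value are totally reflected without any mirror coating (Snell's law).
          return math.degrees(math.asin(n_outside / n_lens))

      print(tir_critical_angle_deg(1.53))  # ~40.8 deg for an assumed index of 1.53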
  • In another preferred embodiment, the lens system comprises plastic lenses, glass lenses or a combination thereof. Plastic lenses have a lower weight and a favorable price compared to standard glass lenses; however, glass lenses can provide a higher optical quality. A lens system which combines both types of lens materials can have good optical quality, a low weight and a lower price compared to lens systems comprising only glass lenses.
  • In another preferred embodiment, the sensor system comprises at least a single 2D sensor, at least a single 1D sensor or a combination thereof. A single 2D sensor may detect the image of the complete FOV of the imaging system. However, for some applications the full azimuthal or zenithal angle range of the FOV may not be required. For such applications the 2D sensor may be arranged such that the central region of the sensor is located outside the optical axis of the lens system. For sensors with a large aspect ratio, such a spatially "shifted" detection has the additional advantage that it can increase the fill factor of the sensor. Another option is to combine two or more sensors with smaller detection areas for detecting the image. Non-relevant image regions can thus be omitted during detection. The image can also be detected by one or more 1D sensors arranged inside the image of the FOV. 2D sensors and 1D sensors may also be used in combination.
  • In another preferred embodiment, a detection of the central region of the image is omitted by the sensor system. That means, the central region of the image is not detected by the sensor system and the electronic image signal does not contain image information for this region. Therefore, the electronic image signal of the sensor will not include undesired image information which would otherwise have to be processed by the calculation electronics. Due to less information, energy can be saved and the calculations of the calculation electronics can be accelerated.
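  • The patent describes omitting detection of the central region at the sensor level; the software mask below merely illustrates the same effect on a full frame, assuming the image centre and the inner radius of the imaged FOV are known. Names and values are illustrative.

      import numpy as np

      def mask_central_region(frame: np.ndarray, center: tuple[float, float],
                              inner_radius_px: float) -> np.ndarray:
          # Set all pixels inside inner_radius_px around the image centre to NaN;
          # small zenithal angles carry no information about the ZOI.
          h, w = frame.shape[:2]
          yy, xx = np.mgrid[0:h, 0:w]
          inside = (xx - center[0]) ** 2 + (yy - center[1]) ** 2 < inner_radius_px ** 2
          out = frame.astype(float).copy()
          out[inside] = np.nan
          return out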
  • In another preferred embodiment, the FOV preferably comprises zenithal angles between 80° and 100°, more preferably between 60° and 90° and even more preferably between 90° and 120°. These preferred fields of view correspond to a typical ZOI in a collision avoidance system.
  • In another preferred embodiment, the sensor system is combined with an emitter array. The sensor system and the emitter array may consist of individual elements, or the sensor system and the emitter array form a combined transceiver system. A single lens system can therefore be used as a two-way optics for the illumination system and the imaging system of a conventional TOF depth sensing system. Using a single lens system reduces costs, weight and size of a corresponding collision avoidance system. Moreover, a combined transceiver system may have lower costs compared to an individual sensor system and a corresponding emitter array.
  • In another preferred embodiment, the sensor system and the emitter array form a coherent transceiver. With coherent TOF depth sensing systems the effective SNR can be increased compared to non-coherent systems such that higher detection efficiencies can be achieved even with less emitted light power or at long distances.
  • According to another aspect of the invention, there is provided a collision avoidance system comprising an imaging system according to the invention. A collision avoidance system is an electro-optical system adapted to provide means and methods for detecting and preventing collisions in a monitored environment. The collision avoidance system can be part of a security assistance system and may comprise, as a further component, control electronics adapted to determine a possible collision and to initiate suitable measures to avoid such a collision. Suitable measures may range from simply issuing a warning message up to taking over full control over a protected apparatus or system to avert any damage.
  • Further aspects of the invention could be learned from the following description.
  • Brief Description of the Drawings
  • In the following, the invention will be described in further detail. The examples given are adapted to describe the invention.
  • Fig. 1
    shows calculated ray paths in a first embodiment of a lens system according to the invention;
    Fig. 2
    shows calculated ray paths in a second embodiment of a lens system according to the invention;
    Fig. 3
    shows calculated ray paths in a third embodiment of a lens system according to the invention;
    Fig. 4
    shows calculated ray paths in a fourth embodiment of a lens system according to the invention;
    Fig. 5
    shows calculated ray paths in a fifth embodiment of a lens system according to the invention;
    Fig. 6
    shows calculated ray paths in further exemplary embodiments of lens systems according to the invention;
    Fig. 7
    shows illustrations of single 2D image sensors detecting a reduced azimuthal angle range of the image;
    Fig. 8
    shows an illustration of using anamorphic distortion for maximizing the achievable image resolution;
    Fig. 9
    shows exemplary embodiments of a sensor system according to the invention; and
    Fig. 10
    shows a schematic view of an exemplary embodiment of an imaging system according to the invention.
  • Detailed Description of the Invention
  • Fig. 1 shows calculated ray paths in a first embodiment of a lens system 10 according to the invention. The inset shows the definition of the zenithal angles φ and azimuthal angles θ with respect to the optical axis 12 of the lens system 10. The depicted lens system 10 is fully refractive and consists of 10 glass lenses. However, the number of lenses and the material type can be replaced with other quantities and materials. The lens system is designed for a zenithal field of view (FOV) of 20° starting at a zenithal angle of 80°. The minimum angle of view (AOV) of the lens system 10 is thus 200°. All depicted rays are focused onto a common image plane 14, which is perpendicular to the central optical axis 12 of the lens system 10.
  • Fig. 2 shows calculated ray paths in a second embodiment of a lens system 10 according to the invention. The lens system 10 is fully refractive and consists of six plastic lenses. However, the number of lenses and the material type can be replaced by other quantities and materials. The lens system is designed for a zenithal FOV of 30° starting at a zenithal angle of 75°. The minimum AOV of the lens system 10 is thus 210°. As can be seen for the bundle of rays near the center of the lens system, rays under small zenithal angles are not focused well in the image plane 14, while rays in the FOV are sharply focused in the image plane 14. However, by targeting the design for a desired zenithal FOV as a zone of interest (ZOI), the requirements for a plastic lens design are highly simplified such that the lens system 10 can be designed in a less complex manner compared to glass-based wide-angle lens systems 10.
  • Fig. 3 shows calculated ray paths in a third embodiment of a lens system 10 according to the invention. The depicted lens system 10 corresponds to the lens system 10 shown in Fig. 2. Additionally a blind 18 covers the central area of the lens system 10 to block rays entering from the undesired smaller zenithal angles. However, a blind 18 can be used on any lens system 10 according to the invention. Preferably, the blind 18 can be a circular structure following the surface of the lens system on the object side (so-called curved circular blind 18). Other shapes of the blind 18 are possible to allow an individual transmittance of additional rays from selected zenithal angle ranges or to block specific azimuthal angle ranges. For example, a blind 18 can also be elliptically shaped or in the form of two circular blinds attached to one another along a section of their circumferences (with or without curvature). As a blind 18 blocks rays entering from the smaller zenithal angles, only rays in the FOV are focused in the image plane 14.
  • Fig. 4 shows calculated ray paths in a fourth embodiment of a lens system 10 according to the invention. The lens system 10 is a catadioptric lens system 10 comprising refractive and reflective optical components. The first component of the depicted lens system 10 is a biconcave lens element 40 in which object rays are reflected by a single total internal reflection 42 (TIR). Object rays in the FOV enter the lens element 40 from the side and are reflected at the inner surface on the top of the lens element 40. In this embodiment, a zenithal FOV of 30° is realized starting at a zenithal angle of 75°. Rays from outside the FOV are practically blocked as the condition for TIR and the concave shape of the top surface of the lens element 40 are limiting the imageable zenithal angle range. An additional blind may thus not be required for blocking undesired smaller zenithal angles.
  • Fig. 5 shows calculated ray paths in a fifth embodiment of a lens system according to the invention. Also this lens system 10 is a catadioptric lens system 10 comprising refractive and reflective optical components. The first component of the depicted lens system 10 is a complex freeform lens element 50 in which object rays are reflected by two total internal reflections 52, 54. Object rays in the FOV enter the freeform lens element 50 from the side and are reflected first at the inner surface on the bottom of the lens element 50 and second at the inner surface on the top of the lens element 50. In this embodiment, a zenithal FOV of 30° is realized starting at a zenithal angle of 75°. The lens element 50 is formed such that only rays in the FOV can enter the following parts of the lens system 10. In particular, the freeform shapes of the surfaces of the top and of the bottom of the lens element 50 are designed such that zenithal rays with angles not corresponding to the FOV are blocked. Also in this embodiment, an additional blind may not be required for blocking undesirable smaller zenithal angles.
  • Fig. 6 shows calculated ray paths in further exemplary embodiments of lens systems 10 according to the invention. The depicted lens systems 10 are similar to the lens system 10 shown in Fig. 1; however, any lens system according to the invention could be applied. In the figure different realizations of a zenithal FOV in such lens systems 10 are illustrated. The left lens system 10 has a zenithal angular range from 60° to 90° corresponding to a FOV of 30°. The right lens system 10 shows a zenithal angular range from 90° to 120°, which again corresponds to a FOV of 30° (minimum AOV is 240°). However, the position and the size of the FOV can be selected from a wide zenithal angular range. Preferred are zenithal fields of view between 20° and 40° which are selected in a zenithal angular range starting from 60° and reaching up to more than 150°.
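  • The minimum AOV values quoted for the individual embodiments follow from twice the largest zenithal angle in the FOV; the short check below (illustrative code, function name assumed) reproduces the numbers given for the embodiments above.

      def minimum_aov_deg(fov_start_deg: float, fov_width_deg: float) -> float:
          # The lens must image out to the largest zenithal angle of the FOV on
          # both sides of the optical axis, hence AOV = 2 * (start + width).
          return 2.0 * (fov_start_deg + fov_width_deg)

      print(minimum_aov_deg(80.0, 20.0))  # 200 deg (Fig. 1)
      print(minimum_aov_deg(75.0, 30.0))  # 210 deg (Figs. 2 to 5)
      print(minimum_aov_deg(90.0, 30.0))  # 240 deg (Fig. 6, right)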
  • Fig. 7 shows illustrations of single 2D image sensors 20 detecting a reduced azimuthal angle range of the image 16. Detecting only a reduced azimuthal angle range may be desired when a full panoramic perspective is not required for a specific application. If the imaging system 100 is installed such that a part of the FOV is obscured and can thus not be used for imaging or collision avoidance, the sensor 20 may be shifted along the image plane in relation to the optical axis 12. Other options are to change the size of the sensor 20 or to adapt the lens system 10 to maximize the area of detection on the sensor 20. In the left illustration, the azimuthal FOV is around 270°, while in the right illustration an azimuthal FOV of 180° is imaged on the sensor. In the illustrations, the smallest zenithal angles in the FOV are imaged at the inner border A of the image 16, while the largest zenithal angles in the FOV are imaged at the outer border B of the image 16. By limiting the detected azimuthal angular range, the image resolution can be increased by using the full detection area of the sensor 20 for the remaining azimuthal angular range.
  • Fig. 8 shows an illustration of using anamorphic distortion to maximize the achievable image resolution. An axially symmetric lens system 10 images the FOV as a circular image 16 in the image plane 14. When a single 2D sensor 20 with a rectangular detection surface is used, a large number of pixels may remain unused because of the different aspect ratios of the image 16 and the sensor 20. By using anamorphic distortion in the lens system 10, preferably by adding cylindrical and/or toroidal lenses, the aspect ratios can be matched such that the image 16 is detected by a maximum number of pixels on the sensor 20. The applied anamorphic distortion thus creates different magnifications in the horizontal and vertical directions on the image sensor 20 in the image plane 14, which increases the usage of the pixels on the image sensor 20 and therefore allows better light collection. Furthermore, with a larger magnification, the resolution of the detection can be enhanced for the magnified regions in the image 16. Therefore, the imaging system 100 can be aligned such that some regions of the FOV are imaged with increased optical quality.
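The aspect-ratio matching by anamorphic distortion amounts to different horizontal and vertical magnifications; the sketch and example dimensions below are assumptions for illustration, and in a real design the factors are realized by cylindrical and/or toroidal lens surfaces rather than in software.

    def anamorphic_scales(image_diameter, sensor_width, sensor_height):
        """Horizontal and vertical magnification factors that stretch a circular image of
        the given diameter into an ellipse inscribed in the rectangular sensor, so that
        the available pixel area is used as fully as possible."""
        return sensor_width / image_diameter, sensor_height / image_diameter

    mx, my = anamorphic_scales(image_diameter=4.0, sensor_width=6.4, sensor_height=4.8)
    print(mx, my)   # 1.6x horizontal and 1.2x vertical magnification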
  • Fig. 9 shows exemplary embodiments of a sensor system 20 according to the invention. The sensor system 20 comprises at least a single 2D detector, at least a single 1D detector or a combination thereof. By selecting a specific type of sensor arrangement, the image can be covered fully or only partly, allowing a specialized detection of different parts of the image 16. The illustrations show how two or four rectangular detectors can be used to detect the image 16. In all shown embodiments, detection of the central region of the image 16 in the image plane 14 is omitted by the sensor system 20. This saves costs, allows an increased optical resolution and avoids time- and energy-consuming processing of undesired information in the electronic image signal by the evaluation electronics 30.
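The omitted central region can be expressed as a simple read-out mask; the coordinates and radii below are placeholders, as the actual detector layout is given only by the figure.

    def in_image_annulus(px, py, cx, cy, r_inner, r_outer):
        """True if the pixel (px, py) lies in the annular image between the inner border A
        and the outer border B; pixels inside r_inner carry no image information and need
        not be read out or processed by the evaluation electronics."""
        r_sq = (px - cx) ** 2 + (py - cy) ** 2
        return r_inner ** 2 <= r_sq <= r_outer ** 2

    print(in_image_annulus(120, 100, cx=100, cy=100, r_inner=30, r_outer=80))  # False, central region
    print(in_image_annulus(160, 100, cx=100, cy=100, r_inner=30, r_outer=80))  # True, on the annulus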
  • Fig. 10 shows a schematic view of an exemplary embodiment of an imaging system 100 according to the invention. The depicted imaging system 100 comprises a lens system 10, adapted for imaging angles of view larger than 120° symmetrically around the optical axis 12 of the lens system 10 in an image 16 on an image plane 14 perpendicular to the optical axis 12 of the lens system 10; a sensor system 20, adapted to convert at least a part of the image 16 in the image plane 14 into an electronic image signal; and an evaluation electronics 30, adapted to analyze the electronic image signal and to output resulting environmental information; wherein the lens system 10 and/or the sensor system 20 are designed for specifically imaging fields of view starting at zenithal angles larger than 80°.
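The signal path of Fig. 10 can be summarized as three stages chained one after the other; the interfaces in the following sketch are hypothetical and not defined in the original text.

    from dataclasses import dataclass
    from typing import Any, Callable

    @dataclass
    class ImagingSystem:
        """Sketch of the signal path: the lens system forms the image, the sensor system
        converts (part of) it into an electronic image signal, and the evaluation
        electronics turns that signal into environmental information, e.g. distances
        for collision avoidance."""
        lens_system: Callable[[Any], Any]             # scene -> image in the image plane
        sensor_system: Callable[[Any], Any]           # image -> electronic image signal
        evaluation_electronics: Callable[[Any], Any]  # signal -> environmental information

        def capture(self, scene: Any) -> Any:
            image = self.lens_system(scene)
            signal = self.sensor_system(image)
            return self.evaluation_electronics(signal)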
  • Reference List
    10 lens system
    12 optical axis
    14 image plane
    16 image
    18 blind
    20 sensor system
    30 evaluation electronics
    40 biconcave lens element
    42 total internal reflection
    50 complex freeform lens element
    52 first total internal reflection
    54 second total internal reflection
    100 imaging system
    θ azimuthal angle
    φ zenithal angle
    AOV angle of view
    FOV field of view
    TOF time-of-flight
    ZOI zone of interest
    A, B borders of the imaged FOV

Claims (15)

  1. Imaging system (100) for time-of-flight depth sensing applications, comprising:
    a lens system (10), adapted for imaging angles of view larger than 120° in an image (16) on an image plane (14);
    a sensor system (20), adapted to convert at least a part of the image (16) in the image plane (14) into an electronic image signal; and
    an evaluation electronics (30), adapted to analyze the electronic image signal and to output resulting environmental information;
    wherein the lens system (10) and/or the sensor system (20) are designed for specifically imaging fields of view starting at zenithal angles larger than 60°.
  2. Imaging system according to claim 1, wherein the evaluation electronics (30) is further adapted to correct optical distortions in the image signal and to output undistorted image information.
  3. Imaging system according to one of the preceding claims, wherein the lens system (10) is a panomorph lens system (10) using panomorph distortion as a design parameter for increasing the magnification of zenithal angles in the field of view compared to zenithal angles outside the field of view.
  4. Imaging system according to one of the preceding claims, wherein the lens system (10) is an anamorphic lens system (10) adapted to change the aspect ratio of the image (16) in the image plane (14).
  5. Imaging system according to one of the preceding claims, wherein the central region of the entrance aperture of the lens system (10) is covered by a blind (18).
  6. Imaging system according to one of the preceding claims, wherein the lens system (10) is a catadioptric lens system.
  7. Imaging system according to claim 6, wherein the first lens of the lens system (10) is a biconcave lens element (40) in which object rays are reflected by a single total internal reflection (42).
  8. Imaging system according to claim 6, wherein the first lens of the lens system (10) is a complex freeform lens element (50) in which object rays are reflected by two total internal reflections (52, 54).
  9. Imaging system according to one of the preceding claims, wherein the lens system (10) comprises plastic lenses, glass lenses or a combination thereof.
  10. Imaging system according to one of the preceding claims, wherein the sensor system (20) comprises at least a single 2D sensor, at least a single 1D sensor or a combination thereof.
  11. Imaging system according to one of the preceding claims, wherein a detection of the central region of the image (16) is omitted by the sensor system (20).
  12. Imaging system according to one of the preceding claims, wherein the field of view comprises zenithal angles between 80° and 100°, between 60° and 90° or between 90° and 120°.
  13. Imaging system according to one of the preceding claims, wherein the sensor system (20) is combined with an emitter array.
  14. Imaging system according to claim 13, wherein the sensor system (20) and the emitter array form a coherent transceiver.
  15. Collision avoidance system comprising an imaging system according to one of the preceding claims.
EP19178290.3A 2019-06-04 2019-06-04 Surround-view imaging system Withdrawn EP3748393A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP19178290.3A EP3748393A1 (en) 2019-06-04 2019-06-04 Surround-view imaging system
JP2020097979A JP2020197713A (en) 2019-06-04 2020-06-04 Surround view imaging system
US16/892,463 US11588992B2 (en) 2019-06-04 2020-06-04 Surround-view imaging system
US18/157,140 US11838663B2 (en) 2019-06-04 2023-01-20 Surround-view imaging system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP19178290.3A EP3748393A1 (en) 2019-06-04 2019-06-04 Surround-view imaging system

Publications (1)

Publication Number Publication Date
EP3748393A1 true EP3748393A1 (en) 2020-12-09

Family

ID=66770270

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19178290.3A Withdrawn EP3748393A1 (en) 2019-06-04 2019-06-04 Surround-view imaging system

Country Status (3)

Country Link
US (2) US11588992B2 (en)
EP (1) EP3748393A1 (en)
JP (1) JP2020197713A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020012059A1 (en) * 1996-06-24 2002-01-31 Wallerstein Edward P. Imaging arrangement which allows for capturing an image of a view at different resolutions
US20100238568A1 (en) * 2006-06-15 2010-09-23 Olympus Corporation Optical System
US20160188985A1 (en) * 2014-12-24 2016-06-30 Samsung Electronics Co., Ltd. Lens assembly, obstacle detecting unit using the same, and moving robot having the same
US20170310952A1 (en) * 2014-10-10 2017-10-26 Conti Temic Microelectronic Gmbh Stereo Camera for Vehicles
DE102017125686A1 (en) * 2017-11-03 2019-05-09 Pepperl + Fuchs Gmbh Optical scanner

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6459451B2 (en) * 1996-06-24 2002-10-01 Be Here Corporation Method and apparatus for a panoramic camera to capture a 360 degree image
JP3764953B2 (en) * 2002-04-17 2006-04-12 立山マシン株式会社 Method for recording image conversion parameters in an annular image
US7567274B2 (en) * 2002-12-09 2009-07-28 Frank Edughom Ekpar Method and apparatus for creating interactive virtual tours
JP2004215228A (en) * 2002-12-17 2004-07-29 Sony Corp Photographing device
US20100002070A1 (en) * 2004-04-30 2010-01-07 Grandeye Ltd. Method and System of Simultaneously Displaying Multiple Views for Video Surveillance
JP4766841B2 (en) * 2003-09-08 2011-09-07 株式会社オートネットワーク技術研究所 Camera device and vehicle periphery monitoring device mounted on vehicle
US7593057B2 (en) * 2004-07-28 2009-09-22 Microsoft Corp. Multi-view integrated camera system with housing
US7920200B2 (en) * 2005-06-07 2011-04-05 Olympus Corporation Image pickup device with two cylindrical lenses
JPWO2012060269A1 (en) * 2010-11-04 2014-05-12 コニカミノルタ株式会社 Image processing method, image processing apparatus, and imaging apparatus
US9239389B2 (en) * 2011-09-28 2016-01-19 Samsung Electronics Co., Ltd. Obstacle sensor and robot cleaner having the same
US9383753B1 (en) * 2012-09-26 2016-07-05 Google Inc. Wide-view LIDAR with areas of special attention
DK2835973T3 (en) * 2013-08-06 2015-11-30 Sick Ag 3D camera and method for recording three-dimensional image data
JP6032163B2 (en) * 2013-09-11 2016-11-24 トヨタ自動車株式会社 3D object recognition apparatus, 3D object recognition method, and moving body
US9854164B1 (en) * 2013-12-31 2017-12-26 Ic Real Tech, Inc. Single sensor multiple lens camera arrangement
WO2016044264A1 (en) * 2014-09-15 2016-03-24 Remotereality Corporation Compact panoramic camera: optical system, apparatus, image forming method
US10578719B2 (en) * 2016-05-18 2020-03-03 James Thomas O'Keeffe Vehicle-integrated LIDAR system
DE102017108569B4 (en) * 2017-04-21 2019-03-14 Mekra Lang Gmbh & Co. Kg Camera system for a motor vehicle, mirror replacement system with such a camera system and driver assistance system with such a system
JP2019101181A (en) * 2017-11-30 2019-06-24 キヤノン株式会社 Imaging device
US10222474B1 (en) * 2017-12-13 2019-03-05 Soraa Laser Diode, Inc. Lidar systems including a gallium and nitrogen containing laser light source
JP7081473B2 (en) * 2018-03-02 2022-06-07 株式会社リコー Imaging optical system, imaging system and imaging device

Also Published As

Publication number Publication date
US11588992B2 (en) 2023-02-21
US20200389611A1 (en) 2020-12-10
US11838663B2 (en) 2023-12-05
US20230156361A1 (en) 2023-05-18
JP2020197713A (en) 2020-12-10


Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20210610