DE202014101550U1 - 3D camera for capturing three-dimensional images - Google Patents

3D camera for capturing three-dimensional images

Info

Publication number
DE202014101550U1
Authority
DE
Germany
Prior art keywords
mirror
3d camera
according
panorama
optics
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
DE202014101550.7U
Other languages
German (de)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sick AG
Original Assignee
Sick AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sick AG filed Critical Sick AG
Priority to DE202014101550.7U
Publication of DE202014101550U1
Application status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical means
    • G01B11/24 Measuring arrangements characterised by the use of optical means for measuring contours or curvatures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C3/10 Measuring distances in line of sight; Optical rangefinders using a parallactic triangle with variable angles and a base of fixed length in the observation station, e.g. in the instrument
    • G01C3/14 Measuring distances in line of sight; Optical rangefinders using a parallactic triangle with variable angles and a base of fixed length in the observation station, e.g. in the instrument with binocular observation at a single point, e.g. stereoscopic type
    • G01S17/894
    • G01S17/931
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4811 Constructional features, e.g. arrangements of optical elements common to transmitter and receiver
    • G01S7/4812 Constructional features, e.g. arrangements of optical elements common to transmitter and receiver transmitted and received beams following a coaxial path
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4814 Constructional features, e.g. arrangements of optical elements of transmitters alone
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4816 Constructional features, e.g. arrangements of optical elements of receivers alone
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B13/00 Optical objectives specially designed for the purposes specified below
    • G02B13/06 Panoramic objectives; So-called "sky lenses" including panoramic objectives having reflecting surfaces
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B17/00 Systems with reflecting surfaces, with or without refracting elements
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00 Special procedures for taking photographs; Apparatus therefor
    • G03B15/02 Illuminating scene
    • G03B15/03 Combinations of cameras with lighting apparatus; Flash units
    • G03B15/05 Combinations of cameras with electronic flash apparatus; Electronic flash units
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00 Stereoscopic photography
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B37/00 Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/271 Image signal generators wherein the generated image signals comprise depth maps or disparity maps

Abstract

3D camera (10) having an image sensor (14) for capturing three-dimensional image data from a field of view (30) and panorama mirror optics (22) arranged upstream of the image sensor (14), characterized in that the geometry of the panorama mirror optics (22) does not form a body of revolution.

Description

  • The invention relates to a 3D camera for capturing three-dimensional image data with panoramic mirror optics according to the preamble of claim 1.
  • In contrast to a conventional camera, a 3D camera also captures depth information and thus generates three-dimensional image data with distance values for the individual pixels; such an image is also referred to as a distance image or depth map. The additional distance dimension can be exploited in a variety of applications to obtain more information about the objects in the scene captured by the camera and thus to solve various tasks.
  • In automation technology, objects can be detected and classified on the basis of three-dimensional image information in order to make further automated processing steps dependent on which objects were recognized, preferably including their position and orientation. In this way, for example, the control of robots or various actuators at a conveyor belt can be supported.
  • For vehicles that operate on public roads or in enclosed areas, especially in the area of factory and logistics automation, a 3D camera should capture the entire environment and in particular a planned infrastructure as completely as possible and in three dimensions. This applies to virtually all imaginable vehicles, be it those with drivers such as cars, trucks, work machines and forklifts or driverless vehicles such as AGVs (Automated Guided Vehicle) or industrial trucks. The image data is used to enable autonomous navigation or assist a driver, inter alia to detect obstacles, avoid collisions, or facilitate loading and unloading of goods in transit, including boxes, pallets, containers or trailers.
  • Various methods are known for determining the depth information, such as time-of-flight measurements or stereoscopy. In a light transit time measurement, a light signal is emitted and the time until reception of the remitted light signal is measured. One distinguishes between pulse and phase methods. Regardless of the exact time-of-flight method, such cameras are referred to as TOF (time-of-flight) cameras. Stereoscopic methods are modeled on binocular spatial vision and search for associated image elements in two images taken from different perspectives; from their disparity, with knowledge of the optical parameters of the stereo camera, the distance is estimated by triangulation. Stereo systems can operate passively, i.e. with ambient light alone, or have their own illumination, which preferably generates an illumination pattern in order to enable the distance estimation even in structureless scenes. In another 3D imaging method, known for example from US 7,433,024, an illumination pattern is recorded by only one camera and the distance is estimated by evaluating the pattern. The distance relations underlying these methods are summarized in the sketch below.
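  • A minimal sketch of the underlying textbook distance relations, assuming a measured pulse round-trip time, a phase shift at a known modulation frequency, and a stereo disparity in pixels (the formulas are the usual ones and are not taken from the patent text itself):

      import math

      C = 299_792_458.0  # speed of light in m/s

      def distance_from_pulse(round_trip_time_s: float) -> float:
          # Pulse method: the light travels to the object and back.
          return C * round_trip_time_s / 2.0

      def distance_from_phase(phase_shift_rad: float, mod_freq_hz: float) -> float:
          # Phase method: unambiguous only within half the modulation wavelength.
          return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

      def distance_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
          # Stereo triangulation: depth from the disparity of associated image elements.
          return focal_px * baseline_m / disparity_px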
  • The field of view (FOV) of such 3D cameras is limited to less than 180°, even with fisheye lenses, and in practice mostly even to below 90°. Expanding the field of view by using several cameras is conceivable, but entails considerable hardware and adjustment effort.
  • In the prior art, for example from US 6,157,018 or WO 01/76233 A1, various mirror optics are known for achieving omnidirectional 3D imaging. Such cameras are called catadioptric cameras because of the combination of an imaging optics with downstream mirror optics. Nayar and Baker in "Catadioptric image formation", Proceedings of the 1997 DARPA Image Understanding Workshop, New Orleans, May 1997, pages 1341-1437, have shown that a so-called single viewpoint condition must be fulfilled for a rectification. This is satisfied for common mirror shapes such as elliptical, parabolic, hyperbolic or conical mirrors.
  • It is also known to combine such panorama mirror optics with a light transit time method, in order to obtain in this way an omnidirectional 3D camera.
  • Examples are DE 20 2006 014 939 U1, DE 20 2011 052 106 U1, EP 2 354 806 A1 or DE 10 2010 004 095 A1. However, the mirror optics are not well adapted for many applications, especially in the field of vehicles, which is why the measurement resolution of available 3D TOF image sensors, already low to begin with, is not well utilized.
  • It is therefore an object of the invention to improve a 3D camera with panoramic mirror optics.
  • This object is achieved by a 3D camera for capturing three-dimensional image data according to claim 1. The 3D camera captures a very large angular range of up to 360° with the aid of panorama mirror optics and is therefore designed for omnidirectional imaging. The invention is based on the basic idea of deliberately deviating from the usual shape of panorama mirror optics as a body of revolution. This still ensures continuous monitoring over the large angular range, while the shaping achieves a redistribution of the measuring points, so that particularly interesting subareas are captured more precisely at the expense of less interesting subareas. A typical example of such an application-specific redistribution is a vehicle that requires most measuring points in the direction of travel, relatively many in the reverse direction, but fewer measuring points to the sides. The 3D camera thus keeps everything in view and at the same time concentrates on the essential parts.
  • The invention has the advantage that the entire environment can be captured three-dimensionally and omnidirectionally over up to 360° with optimally adapted spatial and temporal resolution. The 3D camera remains compact and comparatively inexpensive.
  • The 3D camera preferably has an illumination unit with upstream panorama mirror optics. All variants of the panorama mirror optics of the image sensor mentioned below can also apply to the panorama mirror optics of the illumination unit. In a preferred embodiment, these panorama mirror optics have the same shape, or even one and the same panorama mirror optics is used for both the image sensor and the illumination unit. The redistribution of the light of the illumination unit is not quite as critical for the measurement result, as long as enough light power is available, so that the panorama mirror optics of the illumination unit may have greater tolerances, both in itself and relative to the panorama mirror optics of the image sensor.
  • The 3D camera is preferably designed as a time-of-flight camera with a light transit time unit for determining the light transit time of a light signal that is emitted by the illumination unit, remitted by objects in the field of view and detected in the image sensor. Determining the distance by a time-of-flight method is in some respects less complex than triangulation, i.e. a passive or active stereo camera or a camera with evaluation of an illumination pattern. In a triangulating system, a base distance between the participating units, i.e. the camera or cameras and the illumination, is indispensable. A 3D measurement is then only possible beyond a minimum distance, determined by the base distance and therefore relatively large, at which the illumination and detection areas overlap. A time-of-flight method, in contrast, offers a particularly small dead zone in the near range (a simplified estimate is sketched after this paragraph). Likewise because of the possibility of a minimal or even vanishing base distance, a low overall height can be achieved, which is particularly useful on vehicles where the 3D camera would otherwise protrude significantly beyond the vehicle. The light transit time measurement is robust against ambient light, such as sunlight or room lighting, and is thus suitable for outdoor use. It also provides dense depth maps. Finally, the basic light transit time method can be adopted directly from single-beam systems; at most, image transformations and rectifications are still required in order to obtain perspective image sections. A small number of measuring points in a subarea is sufficient, because each pixel measures the distance by itself and, unlike in triangulation, no correlation window spanning several pixels has to be evaluated. This in turn opens up more freedom to redistribute the measuring points with panorama mirror optics that are not designed as a body of revolution.
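  • The influence of the base distance on the dead zone can be illustrated with a simplified geometric model that assumes parallel optical axes and symmetric illumination and detection cones; the numbers are illustrative only:

      import math

      def triangulation_min_distance(base_m: float, illum_fov_deg: float, camera_fov_deg: float) -> float:
          # Distance at which the illumination and detection cones begin to overlap,
          # i.e. the near limit (dead zone) of a triangulating arrangement.
          return base_m / (math.tan(math.radians(illum_fov_deg) / 2.0)
                           + math.tan(math.radians(camera_fov_deg) / 2.0))

      # e.g. a 10 cm base distance and two 60 deg cones give a dead zone of roughly 8.7 cm,
      # which shrinks toward zero as the base distance vanishes (the time-of-flight case).
      print(triangulation_min_distance(0.10, 60.0, 60.0))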
  • The image sensor and the illumination unit are preferably assigned a common panorama mirror optics with coinciding transmit and receive paths. For this purpose, for example, a beam splitter or a light guide arrangement is used. Since, as just explained, the base distance can vanish with a time-of-flight method, in contrast to triangulating methods, the possibility exists of merging the transmit and receive paths, thereby maximizing many of the advantages mentioned in the previous paragraph and saving one panorama mirror optics through the dual function.
  • The panorama mirror optics preferably has an elevation axis through a vertex and is described, in horizontal sections with respect to the elevation axis, by a function of the mirror radius in dependence on the circumferential angle. The function of the mirror radius thus descriptively describes a contour line. Preferably, the optical axis of the image sensor lies on the elevation axis; the panorama mirror optics is thus aligned with its vertex toward the image sensor. In contrast to conventional panorama mirror optics designed as a body of revolution, the elevation axis is no longer an axis of symmetry, since at least in one horizontal section the function of the mirror radius is not constant and the respective contour line of the outer contour is therefore not a circle (horizontal variation of the mirror shape). As a result, the lines on the outer contour perpendicular to the contour lines inevitably differ from those of a truncated cone or circular cylinder. These lines can, however, be changed further beyond what the contour lines already impose (vertical variation of the mirror shape).
  • The function of the mirror radius is preferably continuously differentiable at least once. In this way, shaded areas due to sharp corners and edges are avoided, and a smooth panorama mirror optics is also easier to manufacture.
  • The function of the mirror radius preferably has at least one maximum and at least one minimum. Corresponding to the number of maxima and minima, there result subareas of the field of view with many measuring points and subareas with few measuring points.
  • The function of the mirror radius preferably has, in several and in particular all horizontal sections, the same course, merely scaled by a constant factor. This results in a more regular geometry of the panorama mirror optics, which is easier to design and manufacture. The identical shaping of the contour lines leads to well-defined angular sectors of a specific measuring point density. The scaling is required because the panorama mirror optics would otherwise present the image sensor with a flank parallel to its optical axis. The contour lines should therefore enclose a larger area as the distance from the image sensor increases; a small sketch of this construction follows below.
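  • A minimal sketch of this construction, assuming an example radius function with two maxima and two minima (comparable to Fig. 3e) and a simple linear scaling of the contour lines with height; the concrete numbers are purely illustrative:

      import numpy as np

      def mirror_radius(phi, r0=10.0, a=0.3):
          # Example radius function over the circumferential angle: smooth,
          # strictly positive, with two maxima and two minima (cf. Fig. 3e).
          return r0 * (1.0 + a * np.cos(2.0 * phi))

      def contour_line(z, z_max=20.0, n=360):
          # Horizontal section at height z above the vertex: the same course for
          # all heights, scaled by a factor that grows with the distance from the
          # image sensor so that no flank runs parallel to the optical axis.
          phi = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
          scale = z / z_max
          r = scale * mirror_radius(phi)
          return np.stack([r * np.cos(phi), r * np.sin(phi), np.full(n, z)], axis=1)

      # point cloud of the optically effective outer contour
      surface = np.concatenate([contour_line(z) for z in np.linspace(1.0, 20.0, 40)])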
  • The outer contour of the panorama mirror optics preferably has a curvature in the height direction. In this way, a vertical variation of the mirror shape is introduced in addition to the horizontal variations of the mirror shape previously specified by the function of the mirror radius. For this purpose, the outer contour is considered in sections through the panorama mirror optics in planes that each contain the elevation axis. These are exactly the above-mentioned lines on the outer contour that are perpendicular to the contour lines. In a conical panorama mirror optics, the outer contour in this vertical direction would be described by straight lines. This preferred embodiment deliberately differs therefrom by a curvature in order to vary the measuring point distribution in the height direction. The curvature may be a simple convex or concave curvature, as in a hyperboloid or paraboloid, but curvatures in an S-shape or even a free-form are also conceivable; see the sketch following this paragraph.
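  • Building on the previous sketch, the linear scaling can be replaced by a curved vertical profile; the profile shapes below are merely illustrative stand-ins for the convex, concave and S-shaped generator lines mentioned above:

      import numpy as np

      def vertical_profile(u, shape="convex"):
          # Radius scale factor over the normalized height u in [0, 1].
          # "linear" corresponds to a conical generator line; the other choices
          # bulge it outward, curve it inward or bend it in an S-shape
          # (cf. Figs. 4d and 4e).
          if shape == "linear":
              return u
          if shape == "convex":
              return np.sqrt(u)
          if shape == "concave":
              return u ** 2
          if shape == "s-shaped":
              return 3.0 * u**2 - 2.0 * u**3
          raise ValueError(shape)

      # In the previous sketch, scale = z / z_max would be replaced by
      # scale = vertical_profile(z / z_max, "concave"), for example.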
  • The panorama mirror optics preferably has a different height in different angular ranges. This is often associated with the elevation axis of the panorama mirror optics, including the vertex, being arranged off-center. It then does not make sense to design the panorama mirror optics with the same height everywhere, since part of the mirror surface would then have no optical effect.
  • The optical axis of the image sensor and/or of the illumination unit is preferably offset from the elevation axis. In this embodiment, the alignment of the panorama mirror optics with elevation axis and vertex on the optical axis of the image sensor is deviated from by a parallel offset and/or a tilt of at least a few degrees. There is thus a preferred side of the panorama mirror optics from which more measuring points are detected than from the other side.
  • A receiving optics of the image sensor and/or a transmitting optics of the light source is preferably not rotationally symmetrical. This serves for spot shaping in an X-Y plane perpendicular to the optical axis of the image sensor or of the light source. Such spot shaping can compensate for distortions in the X-Y direction caused by the shape of the panorama mirror optics.
  • The panorama mirror optics is preferably rotatable and/or tiltable. The rotation follows, for example, a steering movement of a vehicle, so that a preferred subarea of the field of view with a particularly large number of measuring points remains in the direction of travel even when cornering. Tilting can, for example, compensate for pitching movements on bumps or on sections with gradients, detected by means of an acceleration sensor.
  • The panorama mirror optics are preferably deformable. This offers even more degrees of freedom than just a turn or a tilt and makes it possible to dynamically adapt the measuring point density to the current requirements. For this purpose, an elastic mirror surface can be deformed with an actuator. Similar effects can also be achieved by micromirrors (MEMS).
  • The invention will be explained in more detail below, also with regard to further features and advantages, by way of example on the basis of embodiments and with reference to the accompanying drawings. The figures of the drawing show:
  • Fig. 1a a sectional view of a 3D camera with a stacked arrangement of image sensor and illumination;
  • Fig. 1b a sectional view of a 3D camera with an oppositely oriented arrangement of image sensor and illumination;
  • Fig. 1c a sectional view of a 3D camera with image sensor and illumination facing each other;
  • Fig. 2a a sectional view of a 3D camera with a common transmit and receive path by means of a beam splitter;
  • Fig. 2b a sectional view of a 3D camera with a common transmit and receive path by means of a light guide and circulator;
  • Fig. 3a a representation of the function of the mirror radius in dependence on the circumferential angle as a constant;
  • Fig. 3b a representation of the function of the mirror radius in dependence on the circumferential angle as a rising straight line;
  • Fig. 3c a representation of the function of the mirror radius in dependence on the circumferential angle as a polygon;
  • Fig. 3d a representation of the function of the mirror radius in dependence on the circumferential angle with one maximum and one minimum;
  • Fig. 3e a representation of the function of the mirror radius in dependence on the circumferential angle with two maxima and two minima;
  • Fig. 3f a representation of the function of the mirror radius in dependence on the circumferential angle with three maxima and three minima of different amplitude;
  • Fig. 4a a sectional view of a panorama mirror with vertical asymmetry due to two different tilt angles and a parallel offset;
  • Fig. 4b a sectional view of a panorama mirror with vertical asymmetry and yet equal viewing angle ranges due to two different tilt angles and different maximum heights;
  • Fig. 4c a sectional view of a panorama mirror with vertical asymmetry and equal tilt angles at different maximum heights;
  • Fig. 4d a sectional view of a panorama mirror with vertical asymmetry due to a convex or concave curvature of the outer contour;
  • Fig. 4e a sectional view of a panorama mirror with vertical asymmetry due to an S-shaped or freely curved outer contour;
  • Fig. 5a a sectional view of the beam path when using non-rotationally symmetrical imaging optics with a conical panorama mirror; and
  • Fig. 5b a sectional view of the beam path when using non-rotationally symmetrical imaging optics with a panorama mirror having an elliptical cross-sectional profile.
  • Fig. 1 shows a sectional view of a 3D camera 10. The 3D camera 10 has, on the one hand, a camera unit 12 with an image sensor 14 and a control and evaluation unit 16 and, on the other hand, an illumination unit 18 with a light source 20. Transmitting and receiving lenses are not shown for simplicity. The image sensor 14 has a plurality of pixels and, together with the control and evaluation unit 16, is capable of measuring the transit time of a pulsed or periodically modulated light signal of the illumination unit 18 to an object and back, and from that the object distance on the basis of the speed of light. Such TOF chips are known per se and are therefore not described here in detail. In addition to the usual matrix arrangement of identical rectangular pixels, it is also known to adapt the pixel arrangement or pixel size to the particular radial arrangement in the 3D camera 10. Furthermore, it is known to implement the time-of-flight measurement, instead of in a separate control and evaluation unit 16, directly on the chip of the image sensor 14. Incidentally, the functionality of the control and evaluation unit can be distributed almost arbitrarily over the camera unit 12, the illumination unit 18 or further subordinate or higher-level units that are not shown.
  • The camera unit 12 and the illumination unit 18 are each assigned mirror optics or catadioptric optics, referred to as panorama mirrors 22, 24. The geometric design of these mirror optics is discussed in detail below. In Fig. 1, the panorama mirrors 22, 24 are shown in simplified form as triangles, which in three dimensions corresponds to a cone; because of its rotational symmetry, such a cone does not fall within the scope of the claims.
  • The panorama mirrors 22, 24 ensure that the field of view 26 of the camera unit 12 and the illumination field 28 of the illumination unit 18 are extended to a large angular range of up to 360°. A light transit time measurement is only possible in an overlap area 30, which represents the actual field of view of the 3D camera 10. This overlap area 30 should therefore be maximized and, in order to avoid dead zones, should begin as close to the 3D camera 10 as possible. This is achieved by choosing the base distance between camera unit 12 and illumination unit 18 to be very small. A time-of-flight method permits this, in contrast to a triangulating method such as stereoscopy, since the base distance does not enter into the time-of-flight measurement. A small base distance is even advantageous, since it reduces systematic errors due to shading and parasitic influences on the optical path length.
  • Fig. 1a shows a stacked arrangement of camera unit 12 and illumination unit 18. The optical axis of the camera unit 12 lies on the optical axis of the illumination unit 18. Of course, the reverse arrangement with camera unit 12 and illumination unit 18 rotated by 180° would also be conceivable. Fig. 1b shows an alternative arrangement in which camera unit 12 and illumination unit 18 are oriented in opposite directions. This allows the electrical components, such as image sensor 14, control and evaluation unit 16 or light source 20, to be connected to or integrated with one another without interference by a panorama mirror 22, 24.
  • To minimize the base distance and maximize the overlap area 30, however, an arrangement according to Fig. 1c, in which camera unit 12 and illumination unit 18 face each other, is most advantageous. In addition to the smallest dead zone, this has the advantage that the panorama mirrors 22, 24 can be formed as a common component and the lowest overall height is achieved.
  • With a time-of-flight method, arrangements with a vanishing base distance are even possible in order to optimize these advantages, in which case the virtual origins coincide or the transmit and receive paths are merged. This is useful even for panorama mirrors known per se, without the geometric features to be described below.
  • Fig. 2a shows a corresponding 3D camera 10 with a beam splitter 32. The transmit-side and receive-side beam paths at the panorama mirror 22, 24 and in the field of view and illumination field 26, 28 are thus identical except for their direction. The dead zone is therefore almost completely eliminated, because the overlap area 30 is congruent with the field of view and illumination field 26, 28. The overall height is reduced once more, and only one common panorama mirror 22, 24 is needed.
  • In the embodiment according to Fig. 2a, a separate receiving optics 34 and a separate transmitting optics 36 are provided. It is conceivable instead to use only one common objective, which is then arranged between beam splitter 32 and panorama mirror 22, 24.
  • The arrangement of further optical elements in the common or in the separate part of the beam path is conceivable. For example, the losses of the beam splitting can ideally be reduced to almost zero by using polarized light, for instance from a laser light source or by means of polarization filters in front of the image sensor 14 and the light source 20, a polarizing beam splitter 32 and a λ/4 plate between the beam splitter 32 and the panorama mirror 22, 24.
  • Fig. 2b shows an embodiment of the 3D camera 10 in which a circulator 38 is used in place of the beam splitter 32. With the help of a lens 40 and light guides 42a-c, the arrangement of the components within the 3D camera can be chosen freely within certain limits.
  • In various embodiments of the invention, the panorama mirrors 22, 24 are shaped in various ways so that they do not form bodies of revolution. This serves to redistribute the measuring point density in a targeted manner according to the needs of the application, thereby creating subareas of high measuring resolution at the expense of other subareas of reduced measuring resolution. The design of either panorama mirror 22, 24 has analogous effects; preferably, both panorama mirrors 22, 24 are in any case formed identically, and the description is given by way of example for the panorama mirror 22 of the image sensor.
  • A conventional conical panorama mirror whose axis of symmetry coincides with the optical axis of the image sensor 14 may serve as a reference to illustrate the considerations. Its axis of symmetry is the elevation axis of the panorama mirror, and in horizontal sections perpendicular to the elevation axis the outer contour forms a circle, which can be visualized as a contour line as on a map. The outer contour within a horizontal section is described by a function of the mirror radius, which in the case of the circle is constant. In the height direction, the radius of the circle is scaled by a factor corresponding to the height. In this vertical direction, i.e. in the respective lines perpendicular to the contour lines, the outer contour of a conical panorama mirror forms a straight line.
  • This symmetrical shape and arrangement is now varied in many ways in various embodiments, and these variations can also be combined with one another in arbitrary hybrid forms. Fig. 3 shows possible contour lines, i.e. functions of the mirror radius, as a horizontal variation of the shape of the panorama mirror 22. Fig. 4 shows examples of an additional vertical variation of the lines perpendicular to the contour lines of Fig. 3; in this case, an offset between the optical axis of the image sensor 14 and the elevation axis may also be introduced at the same time. Finally, Fig. 5 shows an additional non-rotationally symmetrical design of the receiving optics, which incidentally could also be combined with a conventional panorama mirror.
  • With a non-rotationally symmetrical panorama mirror 22, the lateral resolution or the measuring point density distribution can be varied horizontally and/or vertically and designed to be angle-dependent. This serves to adapt to specific applications. For example, in vehicles that primarily travel forwards and backwards, it may be advantageous to achieve a higher point density and thus resolution in the area in front of and behind the vehicle in order to reliably detect obstacles, while a lower measuring point density is sufficient in the lateral angular ranges, where the environment only needs to be captured roughly. It may be necessary to compensate for distortion artifacts by means of calibration and evaluation algorithms.
  • Figs. 3a-f show by way of example a multiplicity of possible functions of the mirror radius in dependence on the circumferential angle, i.e. contour lines within a horizontal section. If the elevation axis through the vertex of the panorama mirror 22 is defined as the Z axis, the horizontal section corresponding to the paper plane in Figs. 3a-f runs in the X-Y plane. The contour lines in different horizontal sections usually vary at least by a scaling factor that is constant over all circumferential angles, but may also change in shape and, for example, change at different heights from the shape of one of Figs. 3a-f to the shape of another of Figs. 3a-f. In contrast to the pure scaling, which is always conceivable, however, such a change of shape is not possible while maintaining a reasonably smooth outer contour of the three-dimensional panorama mirror 22. It should be noted that Figs. 3a-f initially show shapes in which the vertex of the panorama mirror 22 is central, but there are also embodiments with an offset, as shown later with reference to Fig. 4.
  • Fig. 3a shows for comparison a mirror radius that is constant as a function of the circumferential angle. This produces a circle as the contour line. If all contour lines are circles, a body of revolution results, so that a horizontal section through a panorama mirror 22 according to the invention yields such a circle at most at some heights and preferably nowhere.
  • The contour line according to Fig. 3b varies its mirror radius linearly with the circumferential angle. Therefore, after one full revolution, a jump occurs in the function of the mirror radius to bridge the difference between minimum and maximum. Such discontinuities result in shading effects or shadowed angular areas, and such a panorama mirror 22 would also be very difficult to manufacture. According to the invention, the mirror radius is therefore preferably continuous.
  • For quite similar reasons, simple continuity usually does not suffice; rather, the function of the mirror radius should even be continuously differentiable at least once with respect to the circumferential angle. Fig. 3c shows, by way of illustration, a polygon, in this case a regular hexagon, as an example of a continuous but not everywhere differentiable function of the mirror radius. Since the equivalent origin of the illuminating beam is usually not exactly at the center of the mirror but farther away, shaded areas result at the corners in this case and thus no omnidirectional imaging. If the imaginary ray origin is closer than the mirror center, corresponding overlap regions result in which different image segments overlap. Only when the imaginary ray origin coincides with the center of the mirror does a homogeneous, omnidirectional image without shading and overlap result. However, it is practically impossible to realize this for all heights, so that functions of the mirror radius which are not continuously differentiable are preferably excluded. The small numerical sketch below illustrates the kinks of such a polygonal radius function.
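  • A short numerical check of this point, assuming a regular hexagon described by its apothem; the radius function is continuous, but its first derivative jumps at the corners:

      import numpy as np

      def polygon_radius(phi, apothem=1.0, n=6):
          # Radius of a regular n-gon as a function of the circumferential angle:
          # periodic with period 2*pi/n, continuous, but kinked at the corners.
          local = np.mod(phi + np.pi / n, 2.0 * np.pi / n) - np.pi / n
          return apothem / np.cos(local)

      phi = np.linspace(0.0, 2.0 * np.pi, 100001)
      r = polygon_radius(phi)
      dr = np.gradient(r, phi)
      print(r.min(), r.max())            # apothem vs. circumradius (about 1.0 and 1.155)
      print(np.abs(np.diff(dr)).max())   # isolated spikes at the corners: r(phi) is not C1 there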
  • A panorama mirror 22 according to the invention should therefore preferably have, at every height, a cross-sectional profile that is described by a non-constant and at least once continuously differentiable function of the mirror radius. Such a function inevitably has at least one maximum and one minimum, which correspond to a subarea with increased and a subarea with reduced measuring point density. Of course, the conditions for the mirror shape only apply to the optically effective areas of the panorama mirror 22.
  • Figs. 3d-f show some examples which satisfy these conditions. In Fig. 3d there are exactly one minimum and one maximum, between which the corresponding areas of higher and lower measuring point density merge into one another. Fig. 3e shows an elliptical profile with two maxima and two minima, resulting in two opposite areas of increased measuring point density. This can be advantageous, for example, in vehicles that travel primarily forwards or backwards, where a higher angular resolution or measuring point density is needed specifically in the area in front of and behind the vehicle in order to reliably detect obstacles. In the lateral angular ranges in which the two minima lie, the environment only needs to be captured roughly, and for this the lower measuring point density is sufficient. As Fig. 3f illustrates, more than two maxima and minima are generally possible if more than two regions of increased or reduced measuring point density are desired. In this case, the height or amplitude of the maxima and minima can also vary, as Fig. 3f illustrates by the example of one maximum that is less pronounced than the other two. A simple family of such radius functions is sketched below.
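  • A simple family of radius functions of this kind can be sketched as a sum of cosine terms; the orders and amplitudes below are illustrative and merely reproduce the qualitative behavior of Figs. 3d to 3f:

      import numpy as np

      def radius_profile(phi, r0, terms):
          # Smooth, 2*pi-periodic radius function built from cosine terms.
          # terms = [(k, a, phi0), ...]: a term of order k contributes k maxima
          # and k minima of relative amplitude a around the offset angle phi0.
          r = np.ones_like(phi)
          for k, a, phi0 in terms:
              r += a * np.cos(k * (phi - phi0))
          return r0 * r

      phi = np.linspace(0.0, 2.0 * np.pi, 720)
      one_pair    = radius_profile(phi, 10.0, [(1, 0.3, 0.0)])                  # cf. Fig. 3d
      two_pairs   = radius_profile(phi, 10.0, [(2, 0.3, 0.0)])                  # cf. Fig. 3e
      three_pairs = radius_profile(phi, 10.0, [(3, 0.2, 0.0), (1, 0.05, 0.0)])  # cf. Fig. 3f, unequal amplitudes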
  • The course of the function of the mirror radius inevitably results not only in radial asymmetries within a horizontal sectional plane, but also in vertical asymmetries. In addition, however, vertical variations of the geometry of the panorama mirror 22 can also be used deliberately to vary the vertical measuring point density or the vertical viewing angle range θ and thus adapt it to the application.
  • Figs. 4a-e each show vertical sections with some examples. It should be noted that, because of the horizontal variation explained with reference to Fig. 3, the vertical sections differ depending on the circumferential angle, and in Figs. 4a-e only one fixed circumferential angle is illustrated in each case. The vertical variations can then be characterized by several parameters: tilt angles α1, α2 and resulting vertical viewing angle ranges θ1, θ2 in respectively opposite radial directions, an offset Δx between the optical axis of the image sensor 14 and the vertex of the panorama mirror 22, as well as maximum heights zmax1, zmax2 of the panorama mirror 22, the latter referring not necessarily to the physical extent but to the optically effective outer contour. How a tilt angle translates into a vertical viewing direction is illustrated by the small sketch following this paragraph.
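  • The effect of a tilt angle can be illustrated in a two-dimensional vertical section using only the law of reflection; the assumption of a locally straight mirror flank and a ray leaving the camera parallel to the elevation axis is a simplification for illustration:

      import numpy as np

      def reflected_elevation(beta_deg, alpha_deg):
          # The camera looks upward along the z axis; a ray leaving it at angle
          # beta from the vertical hits a mirror flank whose generator line is
          # tilted by alpha from the vertical. Returns the elevation of the
          # reflected ray above the horizontal, in degrees.
          beta, alpha = np.radians(beta_deg), np.radians(alpha_deg)
          d = np.array([np.sin(beta), np.cos(beta)])      # incoming ray (x, z)
          n = np.array([np.cos(alpha), -np.sin(alpha)])   # downward-facing surface normal
          r = d - 2.0 * np.dot(d, n) * n                  # law of reflection
          return np.degrees(np.arctan2(r[1], r[0]))

      # A flank tilted by 45 deg deflects the on-axis ray to the horizon; different
      # tilt angles on opposite mirror sides therefore direct the view further up
      # on one side and further down on the other.
      print(reflected_elevation(0.0, 45.0))   # ~0 deg
      print(reflected_elevation(0.0, 40.0))   # ~+10 deg (further up)
      print(reflected_elevation(0.0, 50.0))   # ~-10 deg (further down)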
  • In Fig. 4a, the tilt angles on the opposite mirror sides are different, α1 ≠ α2. As a result, the 3D camera 10 looks in different vertical directions via the two opposite mirror sides, namely further up on the left and further down on the right. In addition, an offset Δx ≠ 0 causes a larger viewing angle range on the right than on the left, θ1 ≠ θ2. In the vertical sections at other circumferential angles, which are not shown, a continuous transition between the two situations on the right and left side of Fig. 4a results. The contour line or horizontal profile of this embodiment could, for example, be that of Fig. 3d.
  • It is also possible, as in Fig. 4b, to provide two different tilt angles α1 ≠ α2 without an offset Δx, i.e. in a centric or central arrangement. As a result, the panorama mirror 22 is higher on one side than on the other, zmax1 > zmax2. As in Fig. 4a, the 3D camera 10 looks in different vertical directions on the opposite sides, but now with the same vertical measuring point densities over the entire circumference and equal vertical viewing angle ranges, θ1 = θ2. Again, the contour line of Fig. 3d can result.
  • Alternatively again, according to Fig. 4c, a vertex shift Δx ≠ 0 can be made with equal tilt angles α1 = α2. This in turn causes different heights of the panorama mirror 22, zmax1 > zmax2. Now the viewing angle ranges on the right and on the left are different, θ1 > θ2, with the same vertical measuring point density or angular resolution.
  • Instead of straight vertical contours of the panorama mirror 22, curvatures are also possible. Fig. 4d shows an example with a concave and a convex curvature in the vertical direction. This results in a reduction or an enlargement of the viewing angle ranges, θ1 ≠ θ2, and consequently in an increase or a decrease of the vertical measuring point density.
  • As Fig. 4e illustrates, even more complicated vertical curvatures are conceivable, shown there using the example of an S-shaped contour and a free-form contour. Depending on whether a section of the vertical contour is convex or concave, the measuring point density increases or decreases for certain vertical angles, as indicated by different hatching in the case of the S-shaped contour.
  • In all embodiments, the vertical contour of the panorama mirror 22 preferably runs continuously, or even continuously differentiably, with height and increases strictly monotonically. Areas in which the monotonicity is violated result in flat or even inwardly curved regions that the image sensor 14 cannot reach by reflection, and thus regularly lead to gaps in the image.
  • The course of the vertical contour in vertical sections at circumferential angles other than those shown in Figs. 4a-e depends on the horizontal profile or the contour line. A contour line according to Fig. 3d leads to a continuous transition between the illustrated extremes. With contour lines according to Figs. 3e-f with two or three maxima, however, there are also other vertical contours that do not simply represent a continuous intermediate form. Under the already mentioned condition that the function of the mirror radius is continuously differentiable, almost arbitrary combinations and transitions of the contours shown in Figs. 3 and 4 can be used. By combining horizontal and vertical asymmetries, practically arbitrary subareas of increased and reduced measuring point density can be defined.
  • In the previous considerations it was assumed that the beam path of the image sensor 14 or of its imaging optics 44 lies rotationally symmetrically with respect to the panorama mirror 22 and that the distribution of the measuring points or pixels of the image sensor 14 is at least approximately homogeneous in the radial as well as in the circumferential direction. But this does not have to be the case. For example, the distortion of the imaging optics 44 can be used to vary the measuring point density in the vertical direction. The imaging optics 44 can also be deliberately designed to be non-rotationally symmetrical.
  • Fig. 5 illustrates this with the example of an imaging optics with cylindrical lenses, which has a different magnification in the X direction and Y direction, i.e. a different optical imaging of the image sensor 14 onto the panorama mirror 22. Fig. 5a shows a different spot diameter in the X direction and Y direction on the panorama mirror 22, produced by an imaging optics having a cylindrical lens 44a and optionally at least one further lens 44b. Depending on the mirror shape, the measuring point density in the X direction and Y direction can thus be designed differently. In Fig. 5a, a conical panorama mirror 22 is shown for simplicity, because a redistribution of the measuring point density is possible even by the non-rotationally symmetrical imaging optics alone. The effect can be understood analogously for any other mirror shape, as explained by way of example with reference to Figs. 3 and 4.
  • A non-rotationally symmetrical imaging optics 44 is particularly advantageous for an elliptical cross-sectional profile according to Fig. 3e. Such a panorama mirror 22 is shown in Fig. 5b. Here, the beam path between image sensor 14 and panorama mirror 22 can be adapted directly to the different diameters of the two main axes of the ellipse with the help of the cylindrical lens 44a. Distorting effects of the panorama mirror 22 on individual pixels are compensated by the imaging optics. A dotted line in Fig. 5b further illustrates how, in addition, an angular adaptation in the vertical direction can be made by an exemplary parabolic vertical contour of the panorama mirror 22, and the vertical field of view can thus be extended at the expense of the vertical resolution.
  • Instead of cylindrical lenses 44a, it is also possible to use diffractive or holographic optics and refractive free-form optics in order to create arbitrary deviations from rotational symmetry of the imaging optics 44. Here, too, a continuous or continuously differentiable course of the imaging properties of the imaging optics 44 over the cross section of the image sensor 14 is preferably desirable. Furthermore, strongly rectangular image sensors with very different side lengths, i.e. with a high aspect ratio, or with a different number of pixels in the X direction and Y direction can be used, which has a very similar effect.
  • It is noted again that the embodiments have been explained with reference to the panorama mirror 22 of the image sensor 14, but apply quite analogously to the panorama mirror 24, the beam path of the light source 20 in the illumination unit 18 and its transmitting optics.
  • The previous embodiments are based on rigid panorama mirrors 22, 24. It is, however, also conceivable to make the panorama mirrors 22, 24 movable or variable in themselves and thus to adapt the measuring point density distribution even dynamically to changing requirements during operation, such as a current direction of travel. In a vehicle, as mentioned several times, it may be advantageous to achieve a higher measuring point density in front and behind than laterally. If the steering of the vehicle is now actuated to make a turn, the panorama mirrors 22, 24 can be rotated in synchronism with the steering movement, so that increased measurement resolution is always achieved in the current direction of travel. The panorama mirrors 22, 24 can be not only rotatable but also tiltable, in order to vary the viewing angle ranges in the vertical direction, for example in the case of pitching movements of a vehicle. A sketch of such a coupling follows below.
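  • Such a coupling of mirror rotation and tilt to the vehicle state can be sketched as a small control function; the interface, gains and limits are hypothetical and only illustrate the idea:

      from dataclasses import dataclass

      @dataclass
      class MirrorPose:
          rotation_deg: float   # rotation of the panorama mirror about the elevation axis
          tilt_deg: float       # tilt of the panorama mirror against the elevation axis

      def track_vehicle_state(steering_angle_deg: float, pitch_deg: float,
                              gain: float = 1.0, max_tilt_deg: float = 5.0) -> MirrorPose:
          # Rotate the mirror in sync with the steering movement so that the subarea
          # of highest measuring point density stays in the direction of travel, and
          # counter-tilt against pitching detected e.g. by an acceleration sensor.
          rotation = gain * steering_angle_deg
          tilt = max(-max_tilt_deg, min(max_tilt_deg, -pitch_deg))
          return MirrorPose(rotation, tilt)

      # e.g. a left turn while the vehicle noses down over a bump:
      pose = track_vehicle_state(steering_angle_deg=15.0, pitch_deg=-2.0)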
  • In principle, it is even conceivable to change the shape of the optically effective outer contour of the panorama mirrors 22, 24 dynamically. One possibility for this is the use of an elastic panorama mirror 22, 24 which is then deformed electrically, pneumatically, thermally or by other physical effects. Another possibility is the use of adaptive mirror systems based on MEMS (Micro-Electro-Mechanical Systems) or MOEMS (Micro-Opto-Electro-Mechanical Systems), which effectively behave as a variable outer contour of the panorama mirror 22, 24.
  • CITATIONS CONTAINED IN THE DESCRIPTION
  • This list of the documents listed by the applicant has been generated automatically and is included solely for the better information of the reader. The list is not part of the German patent or utility model application. The DPMA assumes no liability for any errors or omissions.
  • Cited patent literature
    • US 7433024 [0005]
    • US 6157018 [0007]
    • WO 0176233 A1 [0007]
    • DE 202006014939 U1 [0009]
    • DE 202011052106 U1 [0009]
    • EP 2354806 A1 [0009]
    • DE 102010004095 A1 [0009]
  • Cited non-patent literature
    • Nayar and Baker in "Catadioptric image formation", Proceedings of the 1997 DARPA Image Understanding Workshop, New Orleans, May 1997, pp. 1341-1437 [0007]

Claims (14)

  1. 3D camera (10) having an image sensor (14) for capturing three-dimensional image data from a field of view (30) and panorama mirror optics (22) arranged upstream of the image sensor (14), characterized in that the geometry of the panorama mirror optics (22) does not form a body of revolution.
  2. 3D camera (10) according to claim 1, which has an illumination unit (18) with upstream panorama mirror optics (24).
  3. 3D camera (10) according to claim 2, wherein the image sensor (14) and the illumination unit (18) are assigned a common panorama mirror optics (22, 24) with coinciding transmit and receive paths.
  4. 3D camera (10) according to claim 2 or 3, which is designed as a time-of-flight camera with a light transit time unit (16) for determining the light transit time of a light signal that is emitted by the illumination unit (18), remitted by objects in the field of view (30) and detected in the image sensor (14).
  5. 3D camera (10) according to any one of the preceding claims, wherein the panorama mirror optics (22, 24) has an elevation axis through a vertex and is described, in horizontal sections (x, y) with respect to the elevation axis (z), by a function (r(φ)) of the mirror radius (r) in dependence on the circumferential angle (φ).
  6. 3D camera (10) according to claim 5, wherein the function (r(φ)) of the mirror radius (r) is continuously differentiable at least once.
  7. 3D camera (10) according to claim 5 or 6, wherein the function (r(φ)) of the mirror radius (r) has at least one maximum and at least one minimum.
  8. 3D camera (10) according to any one of claims 5 to 7, wherein the function (r(φ)) of the mirror radius (r) has, in several and in particular all horizontal sections, the same course, merely scaled by a constant factor.
  9. 3D camera (10) according to any one of claims 5 to 8, wherein the outer contour of the panorama mirror optics (22, 24) has a curvature in the height direction (z).
  10. 3D camera (10) according to any one of claims 5 to 9, wherein the panorama mirror optics (22, 24) has a different height (zmax) in different angular ranges (φ).
  11. 3D camera (10) according to any one of the preceding claims, wherein the optical axis of the image sensor (14) and/or of the illumination unit (18) is offset from the elevation axis (z).
  12. 3D camera (10) according to any one of the preceding claims, wherein a receiving optics (44) of the image sensor (14) and/or a transmitting optics of the light source (20) is not rotationally symmetrical.
  13. 3D camera (10) according to any one of the preceding claims, wherein the panorama mirror optics (22, 24) is rotatable and/or tiltable.
  14. 3D camera (10) according to any one of the preceding claims, wherein the panorama mirror optics (22, 24) is deformable.
DE202014101550.7U 2014-04-02 2014-04-02 3D camera for capturing three-dimensional images Active DE202014101550U1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE202014101550.7U DE202014101550U1 (en) 2014-04-02 2014-04-02 3D camera for capturing three-dimensional images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DE202014101550.7U DE202014101550U1 (en) 2014-04-02 2014-04-02 3D camera for capturing three-dimensional images

Publications (1)

Publication Number Publication Date
DE202014101550U1 true DE202014101550U1 (en) 2015-07-07

Family

ID=53677130

Family Applications (1)

Application Number Title Priority Date Filing Date
DE202014101550.7U Active DE202014101550U1 (en) 2014-04-02 2014-04-02 3D camera for capturing three-dimensional images

Country Status (1)

Country Link
DE (1) DE202014101550U1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102017005527A1 (en) 2017-06-10 2018-01-04 Daimler Ag Device for optically detecting an environment and vehicle with such a device
WO2018086813A1 (en) * 2016-11-10 2018-05-17 Osram Gmbh Tof camera, motor vehicle, method for producing a tof camera and method for determining a distance to an object
DE102017204073A1 (en) 2017-03-13 2018-09-13 Osram Gmbh Tof camera, motor vehicle, method for manufacturing a tof camera, and method for determining a distance to an object
DE102017215329A1 (en) * 2017-09-01 2019-03-07 Pepperl + Fuchs Gmbh Door sensor device
WO2019048148A1 (en) * 2017-09-06 2019-03-14 Robert Bosch Gmbh Scanning system and transmitting and receiving device for a scanning system

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6157018A (en) 1997-12-13 2000-12-05 Ishiguro; Hiroshi Omni directional vision photograph device
WO2001076233A1 (en) 2000-03-30 2001-10-11 Genex Technologies, Inc. Method and apparatus for omnidirectional imaging
DE202006014939U1 (en) 2006-09-28 2006-11-30 Sick Ag 3 D scene imaging super wide angle optical system has super wide angle optics feeding light travel time recording camera chip
US7433024B2 (en) 2006-02-27 2008-10-07 Prime Sense Ltd. Range mapping using speckle decorrelation
DE102010004095A1 (en) 2010-01-07 2011-04-21 Deutsches Zentrum für Luft- und Raumfahrt e.V. Device for three-dimensional detection of environment in e.g. service robotics for self-localization, has hyperboloid mirror for refracting or reflecting light towards camera that is formed as time-of-flight-camera
EP2354806A1 (en) 2010-02-01 2011-08-10 Sick Ag Optoelectronic sensor
DE202011052106U1 (en) 2011-11-25 2012-01-03 Sick Ag Distance measuring optoelectronic sensor

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6157018A (en) 1997-12-13 2000-12-05 Ishiguro; Hiroshi Omni directional vision photograph device
WO2001076233A1 (en) 2000-03-30 2001-10-11 Genex Technologies, Inc. Method and apparatus for omnidirectional imaging
US7433024B2 (en) 2006-02-27 2008-10-07 Prime Sense Ltd. Range mapping using speckle decorrelation
DE202006014939U1 (en) 2006-09-28 2006-11-30 Sick Ag 3 D scene imaging super wide angle optical system has super wide angle optics feeding light travel time recording camera chip
DE102010004095A1 (en) 2010-01-07 2011-04-21 Deutsches Zentrum für Luft- und Raumfahrt e.V. Device for three-dimensional detection of environment in e.g. service robotics for self-localization, has hyperboloid mirror for refracting or reflecting light towards camera that is formed as time-of-flight-camera
EP2354806A1 (en) 2010-02-01 2011-08-10 Sick Ag Optoelectronic sensor
DE202011052106U1 (en) 2011-11-25 2012-01-03 Sick Ag Distance measuring optoelectronic sensor

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Nayar and Baker in "Catadioptric image formation", Proceedings of the 1997 DARPA Image Understanding Workshop, New Orleans, May 1997, pages 1341-1437

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018086813A1 (en) * 2016-11-10 2018-05-17 Osram Gmbh Tof camera, motor vehicle, method for producing a tof camera and method for determining a distance to an object
DE102017204073A1 (en) 2017-03-13 2018-09-13 Osram Gmbh Tof camera, motor vehicle, method for manufacturing a tof camera, and method for determining a distance to an object
DE102017005527A1 (en) 2017-06-10 2018-01-04 Daimler Ag Device for optically detecting an environment and vehicle with such a device
DE102017215329A1 (en) * 2017-09-01 2019-03-07 Pepperl + Fuchs Gmbh Door sensor device
WO2019048148A1 (en) * 2017-09-06 2019-03-14 Robert Bosch Gmbh Scanning system and transmitting and receiving device for a scanning system

Similar Documents

Publication Publication Date Title
JP3719095B2 (en) Behavior detection apparatus and gradient detection method
US20140152975A1 (en) Method for dynamically adjusting the operating parameters of a tof camera according to vehicle speed
US20170219713A1 (en) Vehicle with Multiple Light Detection and Ranging Devices (LIDARs)
US20160146940A1 (en) Opto-electronic detection device and method for sensing the surroundings of a motor vehicle by scanning
EP1589484B1 (en) Method for detecting and/or tracking objects
JP2012509464A (en) Six-degree-of-freedom measuring device and method
JP2014029604A (en) Moving object recognition system, moving object recognition program, and moving object recognition method
JP2014222429A (en) Image processor, distance measuring device, mobile object apparatus control system, mobile object, and program for image processing
US7130745B2 (en) Vehicle collision warning system
US9880263B2 (en) Long range steerable LIDAR system
US8014002B2 (en) Contour sensor incorporating MEMS mirrors
WO2014125153A1 (en) System and method for scanning a surface and computer program implementing the method
CN104898125B (en) The small-sized LIDAR of low cost for automobile
US8571302B2 (en) Method and apparatus to build 3-dimensional grid map and method and apparatus to control automatic traveling apparatus using the same
EP3057063A1 (en) Object detection device and vehicle using same
Yi et al. An omnidirectional stereo vision system using a single camera
US9551791B2 (en) Surround sensing system
Lin et al. High resolution catadioptric omni-directional stereo sensor for robot vision
US9201424B1 (en) Camera calibration using structure from motion techniques
JP2009188980A (en) Stereo camera having 360 degree field of view
US10088296B2 (en) Method for optically measuring three-dimensional coordinates and calibration of a three-dimensional measuring device
US9602811B2 (en) Method for optically measuring three-dimensional coordinates and controlling a three-dimensional measuring device
EP2696166A2 (en) Optical measurement device and vehicle
WO2006050430A2 (en) Optical tracking system using variable focal length lens
JP2019050035A (en) Methods and systems for object detection using laser point clouds

Legal Events

Date Code Title Description
R207 Utility model specification
R150 Term of protection extended to 6 years