EP3894925A1 - Multi-channel imaging device and device with a multi-aperture imaging device

Multi-channel imaging device and device with a multi-aperture imaging device

Info

Publication number
EP3894925A1
Authority
EP
European Patent Office
Prior art keywords
view
beam deflection
image
imaging device
image sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP19813525.3A
Other languages
German (de)
English (en)
Inventor
Frank Wippermann
Jacques DUPARRÉ
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fraunhofer Gesellschaft zur Forderung der Angewandten Forschung eV
Original Assignee
Fraunhofer Gesellschaft zur Forderung der Angewandten Forschung eV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fraunhofer Gesellschaft zur Forderung der Angewandten Forschung eV filed Critical Fraunhofer Gesellschaft zur Forderung der Angewandten Forschung eV
Publication of EP3894925A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/08Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
    • G02B26/0816Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light by means of one or more reflecting elements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/271Image signal generators wherein the generated image signals comprise depth maps or disparity maps
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/08Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00Optical elements other than lenses
    • G02B5/08Mirrors
    • G02B5/09Multifaceted or polygonal mirrors, e.g. polygonal scanning mirrors; Fresnel mirrors
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00Optical elements other than lenses
    • G02B5/20Filters
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00Optical elements other than lenses
    • G02B5/20Filters
    • G02B5/208Filters for use with infrared or ultraviolet radiation, e.g. for separating visible light from infrared and/or ultraviolet radiation
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00Optical elements other than lenses
    • G02B5/20Filters
    • G02B5/26Reflecting filters
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/02Bodies
    • G03B17/17Bodies with reflectors arranged in beam forming the photographic image, e.g. for reducing dimensions of camera
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B19/00Cameras
    • G03B19/18Motion-picture cameras
    • G03B19/22Double cameras
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B37/00Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
    • G03B37/02Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe with scanning movement of lens or cameras
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B37/00Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
    • G03B37/04Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe with cameras or projectors providing touching or overlapping fields of view
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/207Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/254Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/58Means for changing the camera field of view without moving the camera body, e.g. nutating or panning of optics or image sensors

Definitions

  • Multi-channel imaging device and device with a multi-aperture imaging device
  • the present invention relates to a multi-channel imaging device and to a device with a multi-channel imaging device.
  • the present invention further relates to a portable device having a multi-aperture imaging device.
  • the object of the present invention is therefore to provide a multi-aperture imaging device which delivers a high level of image information while at the same time requiring little installation space.
  • a key concept of the present invention is the recognition that the above object can be achieved in that a high level of image information can be obtained by recording the overall field of view in different wavelength ranges, which allows a small number of recording channels and thus enables small dimensions and low costs.
  • a multi-aperture imaging device comprises an image sensor; an array of optical channels arranged next to one another, each optical channel comprising an optical system for imaging at least a partial field of view of an overall field of view onto an image sensor region of the image sensor.
  • the multi-aperture imaging device has a beam deflection device for deflecting a beam path of the optical channels, the beam deflection device having a first beam deflection region which is effective for a first wavelength range of electromagnetic radiation passing through the optical channels, and a second beam deflection region which is effective for a second wavelength range of the electromagnetic radiation passing through the optical channels, the second wavelength range being different from the first wavelength range.
  • the multi-aperture imaging device is designed to capture a first image of the entire field of view using the first beam deflection region with the image sensor, so that the first image is based on the first wavelength range; and to capture a second image of the entire field of view using the second beam deflection region with the image sensor, so that the second image is based on the second wavelength region.
  • the multi-aperture imaging device is designed to determine a depth map for the first image using the second image. This enables depth information to be obtained for the entire field of view.
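As an aside, one way such a depth map could be derived is sketched below, assuming a structured-light style approach in which the second (e.g. near-infrared) image observes a projected pattern whose lateral shift encodes distance. The function, its parameters and the block-matching strategy are illustrative assumptions and are not taken from the patent text.

```python
# Hypothetical sketch: deriving a coarse depth map from the second (NIR) image,
# assuming a projected NIR pattern whose lateral shift encodes object distance.
# All names, parameters and the matching strategy are illustrative assumptions.
import numpy as np

def depth_from_nir(nir_image, reference_pattern, focal_px, baseline_m,
                   block=16, max_shift=32):
    """Coarse block-matching depth estimate (structured-light style)."""
    h, w = nir_image.shape
    depth = np.zeros((h // block, w // block))
    for by in range(h // block):
        for bx in range(w // block):
            y, x = by * block, bx * block
            patch = nir_image[y:y + block, x:x + block].astype(float)
            best_err, best_d = np.inf, 0
            # search horizontal shifts of the stored reference pattern
            for d in range(max_shift):
                if x + d + block > w:
                    break
                ref = reference_pattern[y:y + block, x + d:x + d + block].astype(float)
                err = np.sum((patch - ref) ** 2)
                if err < best_err:
                    best_err, best_d = err, d
            # triangulation: larger shift -> closer object
            depth[by, bx] = focal_px * baseline_m / best_d if best_d > 0 else np.inf
    return depth
```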
  • the first beam deflection area is arranged on a first side of the beam deflection device and the second beam deflection area is arranged on a second side opposite the first side, and the beam deflection device is designed such that the first side is arranged facing the image sensor for capturing a first image of the total field of view, and the second side is arranged facing the image sensor for capturing a second image of the total field of view.
  • a first side of the beam deflection device has a coating that differs from a second, opposite side, in order to be effective in the first or second wavelength range.
  • the beam deflection device is designed to reflect the first wavelength range when effective in the first wavelength range and to at least partially absorb wavelength ranges differing therefrom, and/or the beam deflection device is designed to reflect the second wavelength range when effective in the second wavelength range and to at least partially absorb wavelength ranges differing therefrom.
  • the overall field of view is a first overall field of view
  • the multi-aperture imaging device has a first line of sight for capturing the first overall field of view and a second line of sight towards a second overall field of view.
  • the multi-aperture imaging device is designed to capture a third image of the second overall field of view using the first beam deflection region with the image sensor, so that the third image is based on the first wavelength range; and to acquire a fourth image of the second total field of view using the second beam deflection region with the image sensor, so that the fourth image is based on the second wavelength region.
  • Both total, possibly spatially spaced, fields of view can thus be recorded in both wavelength ranges.
  • the first overall field of view and the second overall field of view are arranged along different main directions of the multi-aperture imaging device, and when a progressive rotary movement is carried out, the beam deflection regions deflect the beam path alternately in the direction of the first overall field of view and of the second overall field of view, and alternately with the first beam deflection region and the second beam deflection region.
  • This can be an implemented or theoretical consideration of the sequence of movements.
  • Exemplary embodiments provide in particular that a shortest path and therefore a shortest actuating time are implemented in order to change a position or orientation of the beam deflection device, so that the beam deflection device can be moved in different directions.
  • the beam deflection device is designed to set an angle of incidence of 45° ± 10° of the first beam deflection region with respect to the image sensor in order to obtain the first image of the overall field of view, and an angle of incidence of 45° ± 10° of the second beam deflection region with respect to the image sensor in order to obtain the second image of the overall field of view.
  • This setting angle enables the beam path to be deflected by approximately 90° and contributes to a small size of the multi-aperture imaging device, since the small thickness of the multi-aperture imaging device can be used to advantage.
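For orientation only, under the assumption that a beam deflection region acts as a plane mirror, the relation between the setting angle and the resulting deflection is:

```latex
% Assumption: the facet acts as a flat mirror. A facet tilted by an angle
% \alpha relative to the incoming beam path deflects the beam by
\delta = 2\alpha , \qquad
\alpha = 45^\circ \pm 10^\circ \;\Rightarrow\; \delta = 90^\circ \pm 20^\circ .
```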
  • the multi-aperture imaging device is designed to capture the entire field of view through at least two partial fields of view and to capture at least one of the partial fields of view through at least a first optical channel and a second optical channel. This enables the occlusion effects to be avoided or reduced.
  • the multi-aperture imaging device is designed to segment the total field of view into exactly two partial fields of view and to capture exactly one of the partial fields of view through a first optical channel and a second optical channel. This enables the occlusion to be reduced or avoided and, at the same time, a small number of optical channels, which enables a small size and/or low costs.
  • the first optical channel and the second optical channel are spaced apart by at least one further optical channel in the array.
  • This enables the occlusion effects to be avoided or reduced.
  • occlusion effects can be reduced or avoided.
  • a first partial field of view is captured by channels to the left and right of the channel which captures a second partial field of view, in particular when the total field of view is divided into exactly two partial fields of view along a vertical direction or perpendicular to a direction along which the optical channels are arranged next to one another in the array, the line extension direction.
  • the beam deflection device is formed as an array of facets, each optical channel being assigned to a facet, and each of the facets having the first beam deflection area and the second beam deflection area.
  • the facets of the array of facets are formed as reflectors that are reflective on both sides and plane-parallel on both sides. This enables simple design of the facets.
  • the image sensor regions are designed for image generation in the first wavelength region and for image generation in the second wavelength region. This enables a space-saving design of the image sensor.
  • pixels of the image sensor areas are designed for image generation in the first wavelength range and at least partially for image generation in the second wavelength range. This can be done, for example, by arranging appropriate filters and / or by integrating or substituting appropriately configured photo cells in groups of photo cells, for example a Bayer pattern.
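A possible reading of such a substituted pixel pattern is sketched below; the concrete RGB-IR layout and the helper function are assumptions chosen only to illustrate how the two wavelength ranges could share one image sensor area.

```python
# Illustrative sketch only: one possible "Bayer-like" layout in which one green
# photocell per 2x2 cell is replaced by a NIR-sensitive photocell (RGB-IR).
# The concrete layout is an assumption; the patent only mentions substituting
# photocells in groups such as a Bayer pattern.
import numpy as np

def split_rgbir_mosaic(raw):
    """Separate a raw RGB-IR mosaic into sparse R, G, B and IR planes.

    Assumed 2x2 cell:   R  G
                        IR B
    """
    r  = raw[0::2, 0::2]   # top-left of each cell
    g  = raw[0::2, 1::2]   # top-right
    ir = raw[1::2, 0::2]   # bottom-left (replaces the second green)
    b  = raw[1::2, 1::2]   # bottom-right
    return r, g, b, ir

# usage: r, g, b, ir = split_rgbir_mosaic(np.zeros((480, 640), dtype=np.uint16))
```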
  • the first wavelength range comprises a visible spectrum and the second wavelength range comprises an infrared spectrum, in particular a near infrared spectrum.
  • the multi-aperture imaging device furthermore has an illumination device which is designed to emit a temporal or spatial illumination pattern with a third wavelength range which at least partially corresponds to the second wavelength range. This enables targeted illumination of the entire field of view with light of the second wavelength range, so that the arrangement of further illumination sources for this wavelength range can be dispensed with.
  • the multi-aperture imaging device is designed to capture the entire field of view at least stereoscopically. This enables an additional increase in the image information obtained.
  • the beam deflection device is designed to block or attenuate the second wavelength range with the first beam deflection region and to block or attenuate the first wavelength range with the second beam deflection region. This enables the wavelength ranges to be isolated during the deflection, so that only light intended for the desired recording strikes the image sensor.
  • a device comprises a multi-aperture imaging device according to the invention and is designed to generate a depth map of the overall field of view.
  • the device does not have an additional infrared camera.
  • the device is designed to record the entire field of view from a perspective and to provide no stereoscopic image of the overall field of view.
  • This embodiment is particularly advantageous in combination with the generation of depth information based on the different wavelength ranges, which makes it possible to dispense with additional imaging modules for stereoscopic purposes.
  • Figure 1 is a schematic perspective view of a device according to an embodiment.
  • FIG. 2 shows a schematic view of a main side of a device according to a further exemplary embodiment
  • 3a shows a beam deflecting device and a state of diaphragms in a first operating state;
  • 3b shows the beam deflection device and the diaphragms in a second operating state
  • 4a shows a schematic view of the beam deflection device according to an exemplary embodiment, which comprises a plurality of beam deflection regions
  • 4b shows a schematic view of the beam deflection device according to an alternative configuration to FIG. 4a and according to an exemplary embodiment
  • FIG. 5a shows a schematic perspective view of an imaging device according to an exemplary embodiment
  • FIG. 5b shows a schematic perspective view of a multi-aperture imaging device according to an exemplary embodiment, which has an illumination device which is designed to emit a temporal or spatial illumination pattern;
  • FIG. 5c shows a schematic side sectional view of a modified imaging device, in which the beam deflection device can be rotated between a first position of the first operating state and a second position;
  • 6a is a schematic view of an overall field of view, which comprises four overlapping partial fields of view;
  • FIG. 6b shows a division of the total field of view modified with respect to FIG. 6a, in which a partial field of view is recorded twice and partial fields of view are arranged alongside one another along a first direction;
  • FIG. 6c shows a division of the total field of view modified with respect to FIG. 6a, in which a partial field of view is recorded twice and partial fields of view are arranged side by side along a second direction;
  • FIG. 7a shows a schematic perspective view of a device which comprises two multi-aperture imaging devices for stereoscopic imaging of an entire field of view, according to an exemplary embodiment
  • FIG. 7b shows a schematic perspective view of a device which comprises two multi-aperture imaging devices, according to an exemplary embodiment, which is designed to obtain the depth information from the image in one of the wavelength ranges instead of from a stereoscopic image;
  • FIG. 7c shows a schematic perspective view of a preferred embodiment of a multi-aperture imaging device according to an exemplary embodiment, which has a single viewing direction;
  • FIG. 8 shows a schematic structure comprising a first multi-aperture imaging device and a second multi-aperture imaging device with a common image sensor;
  • 9a-d are schematic views of a multi-aperture imaging device according to one embodiment that uses different wavelength ranges;
  • FIG. 10 shows a schematic graph of a sensitivity of an image sensor region of the image sensor of the multi-aperture imaging device over the wavelengths of a first and second wavelength region according to an exemplary embodiment.
  • the following exemplary embodiments relate to the use of different wavelength ranges for imaging on an image sensor.
  • the wavelength relates to electromagnetic radiation, especially light.
  • An example of different wavelength ranges is, for example, the use of visible light, for example in a wavelength range from approximately 380 nm to approximately 650 nm.
  • a wavelength range that differs from this can, for example, be an ultraviolet spectrum with wavelengths of less than 380 nm and/or an infrared spectrum with wavelengths of more than 700 nm, approximately from 1,000 nm to approximately 1,000 µm, in particular a near-infrared spectrum with wavelengths in a range from approximately 700 nm or 780 nm to approximately 3 µm.
  • the first and the second wavelength range comprise wavelengths that are at least partially different from one another.
  • the wavelength ranges have no overlaps.
  • the wavelength ranges have an overlap, which is, however, only partial, so that there are wavelengths in both ranges that enable a differentiation.
  • a beam deflection area can be a surface area or an area of an object that is configured to deflect a beam path at least in a specific wavelength range.
  • This can be a sequence of at least one applied layer, for example dielectric but also electrically conductive layers, which provide or set a reflectivity. It can be an electrically passive or active property.
  • a main side of a device may be understood, in the embodiments described herein, to be a side of a housing or of the device that has a large or the largest dimension compared to other sides.
  • a first main side may refer to a front side and a second main side may refer to a back side.
  • Secondary sides can be understood to mean sides or surfaces that connect the main sides to one another.
  • Although the exemplary embodiments described below relate to portable devices, the aspects set forth can easily be transferred to other mobile or immobile devices. It goes without saying that the described portable devices can be installed in other devices, for example in vehicles. Furthermore, the housing of a device can be designed so that it is not portable. Therefore, the exemplary embodiments described below are not intended to be limited to portable devices, but can relate to any implementation of a device.
  • the portable device 10 comprises a housing 12 with a first transparent area 14a and a second transparent area 14b.
  • the housing 12 can be formed from an opaque plastic, a metal or the like.
  • the transparent areas 14a and / or 14b can be made in one piece with the housing 12 may be formed or be formed in several pieces.
  • the transparent areas 14a and / or 14b can be recesses in the housing 12, for example.
  • a transparent material can be arranged in a region of the cutouts or the transparent regions 14a and / or 14b.
  • Transparent materials of the transparent areas 14a and/or 14b may be transparent to electromagnetic radiation at least in a wavelength range to which an imaging device, in particular a multi-aperture imaging device 16, or an image sensor thereof is sensitive.
  • the transparent regions 14a and / or 14b can be partially or completely opaque in different wavelength ranges.
  • the imaging device 16 can be designed to detect a first and a second wavelength range, for example to detect a visible wavelength range and an at least partially different wavelength range.
  • the imaging device or multi-aperture imaging device 16 is arranged inside the housing 12.
  • the imaging device 16 comprises a beam deflecting device 18 and an image capturing device 19.
  • the image capturing device 19 can comprise two or more optical channels, each having one or more optics for changing (such as bundling, focusing or scattering) an optical path of the imaging device 16 and an image sensor.
  • Optics can be disjoint, that is to say unshared and channel-specific, with respect to different optical channels.
  • It is also possible for the optics to have elements which act jointly for two, more or all optical channels, for example a common converging lens or a common filter combined with a channel-specific lens.
  • the image acquisition device 19 can have one or more image sensors, the associated beam paths of which are directed through one or more optical channels onto the beam deflection device 18 and deflected by the latter.
  • the at least two optical channels can be deflected such that they capture overlapping partial visual fields (partial object areas) of an overall visual field (overall object area).
  • the imaging device 16 can be referred to as a multi-aperture imaging device. Each image sensor area of the image sensor can be assigned to an optical channel.
  • a constructional gap can be arranged between adjacent image sensor areas, or the image sensor areas can be implemented as different image sensors or as parts thereof; alternatively or additionally, it is also possible that neighboring image sensor areas directly adjoin one another and are separated from one another only by the readout of the image sensor.
  • the portable device 10 has a first operating state and a second operating state.
  • the operating state can be correlated with a tilt, position or orientation of the beam deflection device 18. This can relate to which wavelength range is deflected by the beam deflection device 18, by using sides with different effectiveness for the deflection.
  • two different operating states can be related to the direction in which the beam path is deflected. In the exemplary multi-aperture imaging device 16, for example, 4 operating states could exist, two for two different viewing directions and two for the different wavelength ranges.
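Purely for illustration, these states could be enumerated as follows; the names are assumptions, and only the count of two viewing directions times two wavelength ranges comes from the text above.

```python
# Minimal sketch of the four operating states mentioned above (two viewing
# directions x two wavelength ranges). Names are illustrative assumptions.
from enum import Enum
from itertools import product

class ViewingDirection(Enum):
    FRONT = "front"   # e.g. through transparent area 14a
    BACK = "back"     # e.g. through transparent area 14b

class WavelengthRange(Enum):
    VISIBLE = "visible"        # deflected by the first beam deflection area
    NEAR_INFRARED = "nir"      # deflected by the second beam deflection area

OPERATING_STATES = list(product(ViewingDirection, WavelengthRange))
assert len(OPERATING_STATES) == 4
```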
  • the beam deflection device 18 has a first beam deflection area which is effective for the first wavelength range of electromagnetic radiation passing through the optical channels, and a second beam deflection region which is effective for the second wavelength range of the electromagnetic radiation passing through the optical channels and which is different from the first wavelength range.
  • the beam deflection device 18 can deflect the beam path 22 of the imaging device 16 such that it runs through the first transparent region 14a, as is indicated by the beam path 22a.
  • the beam deflection device 18 can be designed to deflect the beam path 22 of the imaging device 16 such that it runs through the second transparent region 14b, as is indicated by the beam path 22b.
  • This can also be understood to mean that the beam deflection device 18 directs the beam path 22 through one of the transparent regions 14a and / or 14b at a time and based on the operating state.
  • a position of a visual field (object area), which is captured by the imaging device 16 can be arranged to be variable in space.
  • the first beam deflection area effective for the first wavelength range and the second beam deflection area effective for the second wavelength range can be used alternately to deflect the beam paths of the optical channels or the beam path 22. This makes it possible to steer that part of the spectrum in the direction of the image sensor for which the beam deflection area is effective.
  • For example, the beam deflection region can have a band-pass functionality and deflect, i.e. reflect, those wavelength ranges for which the band-pass functionality is designed, while other wavelength ranges are suppressed, filtered out or at least greatly attenuated, for example by at least 20 dB, at least 40 dB or at least 60 dB.
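As a reference, interpreting the stated attenuations as power ratios (a standard decibel definition, not something the patent spells out):

```latex
% Standard decibel relation (power ratio), given here only as a reference:
T = 10^{-A/10}, \qquad
T(20\,\mathrm{dB}) = 10^{-2},\quad
T(40\,\mathrm{dB}) = 10^{-4},\quad
T(60\,\mathrm{dB}) = 10^{-6}.
```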
  • the beam deflection areas can be arranged on the same side of the beam deflection device 18, which offers advantages in the case of beam deflection devices that can be displaced in translation.
  • different beam deflection areas can also be arranged on different sides of the beam deflection device 18, which can alternately face the image sensor based on a rotational movement of the beam deflection device 18.
  • An angle of attack can be arbitrary. However, it is advantageous to use an angle of approximately 45 ° when using two possibly opposite viewing directions of the multi-aperture imaging device 16, so that a rotational movement of 90 ° is sufficient to change the viewing direction. On the other hand, with just one viewing direction, a further degree of freedom can be selected.
  • the total field of view of the respective line of sight can be captured with wavelength ranges that are different from one another, in that the multi-aperture imaging device is designed to capture a first image of the entire field of view using the first beam deflection region with the image sensor, so that the first image is based on the first wavelength range; and to capture a second image of the entire field of view using the second beam deflection region with the image sensor, so that the second image is based on the second wavelength region.
  • a wavelength range that is not visible to the human eye can be used to obtain additional image information, such as depth maps.
  • the portable device 10 may include a first diaphragm 24a and a second diaphragm 24b.
  • the diaphragm 24a is arranged in a region of the transparent region 14a and is designed to optically at least partially close the transparent region 14a in a closed state of the diaphragm 24a.
  • the diaphragm 24a is designed to close off the transparent region 14a completely or to at least 50%, at least 90% or at least 99% of the surface of the transparent region 14a in the closed state.
  • the diaphragm 24b is designed to close the transparent area 14b in the same or similar manner as described for the diaphragm 24a in connection with the transparent area 14a.
  • the diaphragm 24b can at least partially optically close the transparent area 14b, so that false light enters the interior of the housing 12 through the transparent area 14b only to a small extent or possibly not at all. This keeps the influence of false light entering through the transparent area 14b on the recording of the visual field in the first operating state small.
  • the diaphragm 24a can at least partially optically close the transparent region 14a.
  • the diaphragms 24a and/or 24b can be designed such that they close the transparent areas 14a and 14b in such a way that false light from unwanted directions (in which, for example, the detected visual field is not arranged) enters only to a small extent or not at all.
  • the diaphragms 24a and 24b can be formed continuously and can be arranged with respect to all optical channels of the imaging device 16. This means that the diaphragms 24a and 24b can be used by all optical channels of the multi-aperture imaging device based on the respective operating state. According to one embodiment, no individual round diaphragms are arranged for each optical channel, but instead an aperture 24a or 24b is used, which is used by all optical channels.
  • the diaphragms 24a and / or 24b can be shaped following a polygon, for example rectangular, oval, round or elliptical.
  • Switching between the first and the second operating state can comprise, for example, a movement of the beam deflection device 18 based on a translatory movement 26 and / or based on a rotary movement 28.
  • the diaphragms 24a and / or 24b can be designed, for example, as a mechanical diaphragm.
  • the diaphragms 24a and / or 24b can be designed as an electrochromic diaphragm. This enables a small number of mechanically moving parts.
  • an embodiment of the diaphragms 24a and / or 24b as an electrochromic diaphragm enables the transparent areas 14a and / or 14b to be opened and / or closed silently, as well as a good embodiment that can be integrated into the optics of the portable device 10.
  • the diaphragms 24a and / or 24b can be designed such that they are hardly or not perceived by a user in a closed state, since there are few visual differences from the housing 12.
  • the housing 12 can be formed flat.
  • the main sides 13a and/or 13b may be arranged parallel to one another, in space for example in an x/y plane or another plane.
  • Secondary sides or secondary surfaces 15a and / or 15b between the main sides 13a and 13b can be arranged obliquely or perpendicularly in space, the main sides 13a and / or 13b and / or the secondary sides 15a and / or 15b can be curved or flat.
  • An extension of the housing 12 along a first housing direction z between the main sides 13a and 13b, for example parallel or antiparallel to a surface normal of a display of the portable device 10, may be small when it is compared with further dimensions of the housing 12, i.e. along a direction of extension of the main side 13a and/or 13b.
  • the minor sides 15a and 15b can be parallel or anti-parallel to the surface normal of a display.
  • the main sides 13a and / or 13b can be arranged perpendicular to a surface normal of a display of the portable device 10 in space.
  • an extension of the housing along the x direction and / or the y direction can have at least three times, at least five times or at least seven times an extension of the housing 12 along the first extension z.
  • the expansion of the housing z can be understood in a simplifying manner, but without restricting effect, as the thickness or depth of the housing 12.
  • the portable device 20 may include the device 10.
  • the portable device 20 may include a display 33, such as a screen or display.
  • the device 20 can be a portable communication device, such as a mobile phone (smartphone), a tablet computer, a mobile music player, a monitor or screen device, which has the imaging device 16.
  • the transparent region 14a and / or the transparent region 14b can be arranged in a region of the housing 12 in which the display 33 is arranged. This means that the diaphragm 24a and / or 24b can be arranged in a region of the display 33.
  • the transparent area 14a and / or 14b and / or the panel 24a or 24b can be covered by the display 33.
  • information from the display can be displayed at least temporarily.
  • the presentation of the information can be any operation of the portable device 20.
  • a viewfinder function can be represented on the display 33, in which a field of view can be represented which is scanned or detected by the imaging device inside the housing 12.
  • images that have already been captured or any other information can be displayed.
  • the transparent area 14a and/or the diaphragm 24a may be hidden by the display 33, so that the transparent area 14a and/or the diaphragm 24a are hardly or not at all perceptible during operation of the portable device 20.
  • the transparent regions 14a and 14b can each be arranged in at least one main side 13a of the housing 12 and / or in an opposite main side.
  • the housing 12 can have a transparent area at the front and a transparent area at the rear.
  • front and back can be arbitrarily replaced by other terms, such as left and right, top and bottom or the like, without restricting the exemplary embodiments described here.
  • the transparent regions 14a and / or 14b can be arranged on a secondary side.
  • the transparent areas can be arranged as desired and / or depending on the directions in which the beam paths of the optical channels can be deflected.
  • the display 33 can be designed, for example, to be temporarily deactivated during the acquisition of an image with the imaging device, or to have its transparency increased in the region where the beam path leaves the housing 12.
  • the display 33 can also remain active in this area, for example if the display 33 emits no or hardly any electromagnetic radiation in a relevant wavelength range into the interior of the portable device 20 or the housing 12 or towards the imaging device 16.
  • FIG. 3a shows the beam deflection device 18 and a state of the multi-aperture imaging device, which is associated, for example, with an operating state of the first aperture 24a and the second aperture 24b.
  • the aperture 24b can at least partially close the transparent region 14b at times, so that false light penetrates to a small or no extent into the interior of the housing of the portable device through the transparent region 14b.
  • 3b shows the beam deflecting device 18, the diaphragm 24a and the diaphragm 24b in a second operating state, the beam deflecting device 18 having carried out, for example, the rotary movement 28 and thus having a viewing direction changed by 90°.
  • the beam deflecting device now deflects the beam path with a beam deflection region 18B which is effective for the second wavelength range, so that an overall field of view arranged in the viewing direction of the beam path 22b can be captured in the second wavelength range.
  • the first viewing direction shown in FIG. 3a would be taken up again, but under the influence of the beam deflection area 18B.
  • a larger number of overall fields of view, for example 2, 3 or more, can thus also be acquired.
  • the beam deflection device 18 can deflect the beam path 22 such that it runs as a beam path 22b through the transparent area 14b, while the diaphragm 24a optically at least partially closes the transparent areas 14a.
  • the diaphragm 24b can have an at least partially or completely open state.
  • the open state can refer to a transparency of the panel.
  • an electrochromic diaphragm can be called open or closed, depending on a control state, without moving mechanical components.
  • a diaphragm 24b designed as an electrochromic diaphragm can be partially or completely transparent at least temporarily during the second operating state for a wavelength range to be detected by the imaging device.
  • during the first operating state as shown in FIG. 3a, the aperture 24b can be partially or completely non-transparent or opaque for this wavelength range. Switching between the first operating state according to FIG. 3a and the second operating state according to FIG. 3b can be based on the rotational movement 28 of the deflection device 18 and/or based on a translatory movement, as described in connection with FIGS. 4a and 4b, or can comprise at least one of these movements.
  • the imaging device can comprise a plurality or a plurality of optical channels, for example two, four or a greater number.
  • if, for example, the imaging device has four optical channels, the beam deflecting device 18 can comprise a number of beam deflection elements 32a-h corresponding to the number of optical channels multiplied by the number of operating states between which the beam deflecting device 18 or the portable device can be switched.
  • the beam deflection elements 32a and 32e can be assigned to a first optical channel, the beam deflection element 32a deflecting the beam path of the first optical channel in the first operating state and the beam deflection element 32e deflecting the beam path of the first optical channel in the second operating state.
  • the beam deflecting elements 32b and 32f, 32c and 32g or 32d and 32h can be assigned to further optical channels.
  • the beam deflection device can be translationally movable along the translatory movement direction 26 and / or can be moved back and forth between a first position and a second position of the beam deflection device 18 with respect to the optical channels of the imaging device in order to switch between the first operating state and the second operating state.
  • a distance 34 over which the beam deflection device 18 is moved between the first position and the second position can correspond to at least a distance between four optical channels of the imaging device.
  • the beam deflection device 18 can have a block-wise sorting of the beam deflection elements 32a-h.
  • For example, the beam deflection elements 32a-d can be designed to deflect the beam paths of the imaging device in a first viewing direction towards a first field of view, wherein each optical channel can be assigned to a partial field of view of the overall field of view.
  • the beam deflecting elements 32e-h can be designed to deflect the beam paths of the imaging device in a second viewing direction towards a second field of view, wherein each optical channel can be assigned to a partial field of view of the overall field of view.
  • the beam deflecting elements 32a-h can be, for example, regions of the beam deflecting device 18 which are curved differently from one another or flat facets of a facet mirror.
  • the beam deflecting device 18 can be understood as an array of facets and/or deflecting elements 32a-h which are inclined at different angles to one another, so that beam paths of optical channels striking the deflecting elements 32a-d are deflected by them into partial fields of view of the field of view of the first operating state, and beam paths striking the deflecting elements 32e-h are deflected by them into different partial fields of view of a field of view of the second operating state.
  • FIG. 4b shows a schematic view of the beam deflection device 18 according to a configuration that is different from the configuration according to FIG. 4a.
  • while the configuration according to FIG. 4a can be understood as a block-wise sorting of the beam deflection elements 32a-h based on the operating state, the configuration according to FIG. 4b can be understood as a channel-wise sorting of the beam deflection elements 32a-h based on the sequence of the optical channels of the imaging device.
  • the beam deflection elements 32a and 32e assigned to the first optical channel can be arranged adjacent to one another.
  • the beam deflection elements 32b and 32f, 32c and 32g or 32d and 32h which can be assigned to the optical channels 2, 3 or 4, can be arranged adjacent to one another.
  • a distance 34 ′ over which the beam deflection device 18 is moved in order to be moved back and forth between the first position and the second position may be less than the distance 34, for example a quarter or a half thereof. This enables an additionally reduced design of the imaging device and / or the portable device.
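A rough comparison of the two sorting schemes is sketched below; the assumption of one beam deflection element per channel pitch and two operating states is only meant to illustrate why the channel-wise sorting of FIG. 4b allows a shorter stroke, and is not a geometry prescribed by the patent.

```python
# Back-of-the-envelope sketch of the translation stroke needed to switch
# operating states for the two sorting schemes of Fig. 4a/4b. The geometry
# (one deflection element per channel pitch, two operating states) is an
# assumption used only for illustration.

def stroke_block_wise(num_channels: int, element_pitch: float) -> float:
    """Fig. 4a: elements grouped per state -> shift by a whole block."""
    return num_channels * element_pitch

def stroke_channel_wise(element_pitch: float) -> float:
    """Fig. 4b: the two elements of a channel are adjacent -> shift by one element."""
    return element_pitch

if __name__ == "__main__":
    pitch = 1.0  # arbitrary unit
    print(stroke_block_wise(4, pitch))   # 4.0 -> corresponds to distance 34
    print(stroke_channel_wise(pitch))    # 1.0 -> corresponds to the shorter distance 34'
```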
  • the beam deflection elements can also provide different types of beam deflection areas, so that a pure assignment to optical channels is provided, so that a first optical channel is deflected, for example, by deflection either with the beam deflection element 32a in the first wavelength range or by deflection with the beam deflection element 32e in the second wavelength range.
  • the rotational movement can be combined with the translatory movement. It is conceivable, for example, that a translatory movement switches between the wavelength ranges, i.e. that the different beam deflecting elements 32a-h are arranged on a common side of the beam deflecting device 18, while a design that is reflective on both sides makes it possible to switch the viewing direction, or vice versa.
  • FIG. 4c shows a schematic side sectional view of a beam deflecting element 32, as can be used for a beam deflecting device described here, such as the beam deflecting device 18 of FIGS. 4a or 4b.
  • the beam deflecting element 32 can have a polygonal cross-section. Although a triangular cross section is shown, it can also be any other polygon.
  • the cross section can also have at least one curved surface; in particular in the case of reflecting surfaces, however, an embodiment which is at least flat in sections can be advantageous in order to avoid imaging errors. Beam deflection regions with different effective wavelengths can be arranged on different, opposite main sides 35a and 35b.
  • the beam deflecting element 32 has, for example, a first side 35a, a second side 35b and a third side 35c. At least two sides, for example sides 35a and 35b, are reflective, so that the beam deflecting element 32 is reflective on both sides.
  • the sides 35a and 35b can be main sides of the beam deflecting element 32, that is to say sides whose area is larger than the side 35c.
  • the beam deflecting element 32 can be wedge-shaped and reflective on both sides.
  • a further surface can be arranged opposite the surface 35c, that is to say between the surfaces 35a and 35b, but this is considerably smaller than the surface 35c.
  • the wedge formed by the surfaces 35a, 35b and 35c does not run to an arbitrary point, but is provided with a surface on the pointed side and is therefore truncated.
  • FIG. 4d shows a schematic side sectional view of the beam deflecting element 32, in which a suspension or a displacement axis 37 of the beam deflecting element 32 is described.
  • the displacement axis 37 about which the beam deflection element 32 can be moved in the beam deflection device 18 in a rotational and / or translatory manner, can be displaced eccentrically with respect to a centroid 43 of the cross section.
  • the center of area can alternatively also be a point which describes the half-dimension of the beam deflecting element 32 along a thickness direction 45 and along a direction 47 perpendicular thereto.
  • the displacement axis can, for example, remain unchanged along a thickness direction 45 and have any offset in a direction perpendicular to it.
  • an offset along the thickness direction 45 is also conceivable.
  • the displacement can, for example, take place in such a way that when the beam deflecting element 32 rotates about the displacement axis 37, a higher adjustment path is obtained than when rotating about the surface center of gravity 43.
  • due to the displacement of the displacement axis 37, the path by which the edge between the sides 35a and 35b moves during a rotation is increased for the same rotation angle compared to a rotation about the centroid 43.
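As a small-angle illustration (an assumption for orientation, not a formula from the patent), the travel of the edge scales with its distance from the rotation axis:

```latex
% Small-angle estimate (assumption): for a rotation by \Delta\varphi the edge
% between the sides 35a and 35b travels approximately
s \approx r \, \Delta\varphi ,
% where r is the distance from the displacement axis 37 to that edge; an
% eccentric axis that increases r therefore increases the travel s for the
% same rotation angle, compared to a rotation about the centroid 43.
```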
  • the beam deflecting element 32 is preferably arranged such that the edge between the sides 35a and 35b, that is to say the pointed side of the wedge-shaped cross section, faces the image sensor. A respective other side 35a or 35b can thus deflect the beam path of the optical channels by means of slight rotational movements.
  • the rotation can be carried out in such a way that a space requirement of the beam deflection device along the thickness direction 45 is small, since a movement of the beam deflection element 32 such that a main side is perpendicular to the image sensor is not necessary.
  • the side 35c can also be referred to as the secondary side or the rear side.
  • a plurality of beam deflection elements can be connected to one another in such a way that a connecting element is arranged on the side 35c, or runs through the cross section of the beam deflection elements, that is to say is arranged in the interior of the beam deflection elements, for example in the region of the displacement axis 37.
  • the holding element can be arranged so that it does not protrude, or protrudes only to a small extent, i.e. at most 50%, at most 30% or at most 10%, beyond the beam deflecting element 32 along the direction 45, so that the holding element does not increase or determine the extent of the overall structure along the direction 45.
  • the extension in the thickness direction 45 can alternatively be determined by the lenses of the optical channels, i.e. these have the dimension defining the minimum thickness.
  • the beam deflecting element 32 can be formed from glass, ceramic, glass ceramic, plastic, metal or a combination of these materials and / or other materials.
  • the beam deflecting element 32 can be arranged such that the tip, ie the edge between the main sides 35a and 35b, faces the image sensor.
  • the beam deflecting elements can be held in such a way that the holding takes place only on the rear side or inside the beam deflecting elements, i.e. the main sides are not covered.
  • a common holding or connecting element can extend over the rear side 35c.
  • the axis of rotation of the beam deflecting element 32 can be arranged eccentrically.
  • FIG. 4e shows a schematic perspective view of a multi-aperture imaging device 40 which comprises an image sensor 36 and a single-line array 38 of optical channels 42a-d arranged next to one another.
  • the beam deflection device 18 comprises a number of beam deflection elements 32a-d, which can correspond to the number of optical channels. Alternatively, a smaller number of beam deflecting elements can be arranged, for example if at least one beam deflecting element is used by two optical channels. Alternatively, a higher number can also be arranged, for example if the deflection direction of the beam deflection device 18 is switched by a translational movement, as is described in connection with FIGS. 4a and 4b. Each beam deflection element 32a-d can be assigned to an optical channel 42a-d.
  • the beam steering elements 32a-d can be formed as a plurality of elements 32 according to FIGS. 4c and 4d. Alternatively, at least two, several or all of the beam deflecting elements 32a-d can be formed in one piece.
  • 4f shows a schematic side sectional view of the beam deflecting element 32, the cross section of which is formed as a free-form surface.
  • the side 35c can have a cutout 49 which enables a holding element to be fastened, the cutout 49 also being able to be formed as a projecting element, for example as a tongue of a tongue and groove system.
  • the cross section also has a fourth side 35d, which has a smaller surface area than the main sides 35a and 35b and connects the same to one another.
  • FIG. 4g shows a schematic side sectional view of a first beam deflecting element 32a and a second beam deflecting element 32b lying behind in the direction of illustration.
  • the cutouts 49a and 49b can be arranged such that they are essentially congruent, so that an arrangement of a connecting element in the cutouts is made possible.
  • FIG. 4h shows a schematic perspective view of the beam deflecting device 18, which comprises, for example, four beam deflecting elements 32a-d which are connected to a connecting element 51.
  • the connecting element can be usable in order to be translationally and / or rotationally movable by an actuator.
  • the connecting element 51 can be formed in one piece and can run along an extension direction, for example the y-direction in FIG. 4e, on or in the beam deflection elements 32a-d.
  • the connecting element 51 can also be connected only to at least one side of the beam deflection device 18, for example if the beam deflection elements 32a-d are formed in one piece.
  • a connection to an actuator and / or a connection of the beam deflecting elements 32a-d can also be made in any other way, for example by means of gluing, wringing or soldering.
  • the imaging device 16 comprises the beam deflection device 18, an image sensor 36 and a single-line array 38 of optical channels 42a-d arranged next to one another.
  • Each optical channel 42a-d can have optics that are designed to optically influence beam paths 22-1 to 22-4 of the imaging device 16.
  • the optics can be channel-specific or have common components for groups of two or more optical channels.
  • the image sensor 36 can comprise image sensor areas 44a-d, wherein the beam paths 22-1 to 22-4 of the optical channels 42a-d can each strike an image sensor area 44a-d.
  • an optical channel 42a-d and/or a beam path 22-1 to 22-4 can be assigned to each image sensor area 44a-d.
  • the beam deflection device 18 can be designed to deflect the beam paths 22-1 to 22-4 in different directions and/or to deflect different wavelengths, for example based on different operating states of the portable device and/or on different positions of the beam deflection device 18, as described in connection with FIGS. 1, 2, 3a, 3b and 4a-h. This means that the imaging device 16 can be formed as a multi-aperture imaging device 40 or can comprise this.
  • the image sensor areas 44a-d can, for example, each be formed from a chip that comprises a corresponding pixel array, wherein the image sensor areas can be mounted on a common substrate or a common circuit board.
  • the image sensor regions 44a-d can be formed in each case from a part of a common pixel array which extends continuously over the image sensor regions 44a-d, the common pixel array being formed, for example, on a single chip. For example, only the pixel values of the common pixel array in the image sensor areas 44a-d are then read out.
  • Different mixtures of these alternatives are of course also possible, such as the presence of one Chips for two or more channels and a further chip for still other channels or the like.
  • these can be mounted, for example, on one or more boards, such as, for example, all together or in groups or the like.
  • the single-line array 38 can have a carrier 39, on which optics 41a-d of the optical channels are arranged.
  • the carrier 39 can be passed by the optical beam paths 22-1 to 22-4 used for the imaging in the individual optical channels.
  • the optical channels of the multi-aperture imaging device can traverse the carrier 39 between the beam deflection device 18 and the image sensor 36.
  • the carrier 39 can keep a relative position between the optics 41 a-d stable.
  • the carrier 39 can be made transparent and comprise, for example, a glass material and / or a polymer material.
  • the optics 41a-d can be arranged on at least one surface of the carrier 39.
  • the carrier 39 is not, or only insignificantly, i.e. by at most 20%, at most 10% or at most 5%, larger than a corresponding dimension of the optics 41a-d along the direction parallel to a main side of the image sensor 36 and perpendicular to the line extension direction 56.
  • the beam deflection device can be designed in such a way that in the first position and in the second position it deflects the beam paths 22-1 to 22-4 of the optical channels 42a-d in mutually different directions. This means that the deflected beam paths 22-1 to 22-4 can have an angle to one another, as is described in connection with FIG. 6a.
  • the optical channels 42a-d can be arranged in at least one line along a line extension direction 56.
  • the array 38 can be formed as a multi-line array comprising at least two lines or as a single-line array comprising (exactly) one line of optical channels.
  • the optical channels can be directed by the beam deflecting device 18 to variable visual fields based on a set viewing direction.
  • the optical channels can be at an angle to one another within a viewing direction, so that the optical channels are directed into partial visual fields of the overall visual field that overlap one another at most partially.
  • the different angles of the optical channels can be obtained based on the optics of the optical channels and / or based on a different deflection of the optical channels on the beam deflection device 18.
  • the imaging device 16 can comprise an actuator 48a, which is, for example, part of an optical image stabilizer 46a and / or can be used to change the position or position of the beam deflection device 18.
  • the optical image stabilizer 46 can be designed to enable optical image stabilization of an image captured by the image sensor 36.
  • the actuator 48a can be designed to generate a rotational movement 52 of the beam deflection device 18.
  • the rotational movement 52 can take place about an axis of rotation 54, the axis of rotation 54 of the beam deflecting device 18 being able to be arranged in a central region of the beam deflecting device 18 or apart therefrom.
  • the rotational movement 52 can be superimposed on the rotational movement 28 or the translatory movement 26 for switching the beam deflection device between a first and a second position or operating state. If the beam deflection device 18 is translationally movable, the translatory movement 26 can be arranged in space parallel to a line extension direction 56 of the one-line array 38.
  • the line extension direction 56 can refer to a direction along which the optical channels 42a-d are arranged next to one another.
  • an optical image stabilization can be obtained along a first image axis 58, possibly perpendicular to the line extension direction 56.
  • the optical image stabilizer 46 can comprise an actuator 48b, which is designed to translate the single-line array 38 along the line extension direction 56. Based on the translational movement of the one-line array 38 along the line extension direction 56, an optical image stabilization along a second image axis 62 can be obtained, possibly parallel to the line extension direction 56 or parallel to the direction of movement of the one-line array 38.
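A minimal sketch of how such a two-axis stabilization could map a measured image displacement onto the two actuators (rotation of the beam deflection device for image axis 58, translation of the array for image axis 62); the gains, parameter values and interface names are assumptions, not details taken from the text:

```python
import math

def stabilization_commands(shift_x_px, shift_y_px, pixel_pitch_mm,
                           focal_length_mm, mirror_gain=0.5):
    """Map a measured image shift (in pixels) to the two actuators.

    shift_y_px (along image axis 58) is compensated by rotating the beam
    deflection device; shift_x_px (along image axis 62) is compensated by
    translating the single-line array along the line extension direction.
    mirror_gain: assumed factor, since rotating a mirror by an angle
    deflects the reflected ray by roughly twice that angle.
    """
    translation_mm = -shift_x_px * pixel_pitch_mm
    angle_rad = math.atan2(shift_y_px * pixel_pitch_mm, focal_length_mm)
    rotation_deg = -math.degrees(angle_rad) * mirror_gain
    return rotation_deg, translation_mm

# Example call with assumed pixel pitch and focal length.
rot_deg, trans_mm = stabilization_commands(shift_x_px=3.0, shift_y_px=-2.0,
                                           pixel_pitch_mm=0.0012,
                                           focal_length_mm=4.0)
```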
  • the actuators 48a and 48b can be formed, for example, as a piezoelectric actuator, pneumatic actuator, hydraulic actuator, direct current motor, stepper motor, thermal actuator, electrostatic actuator, electrostrictive actuator and / or magnetostrictive actuator.
  • the actuators 48a and 48b can be formed identically or differently from one another.
  • an actuator can also be arranged, which is designed to move the beam deflection device 18 in a rotary manner and the single-line array 38 in a translatory manner.
  • the axis of rotation 54 can be parallel to the line extension direction 56.
  • the rotational movement 52 about the axis of rotation 54 can lead to a small installation space requirement of the imaging device 16 along a direction parallel to the image axis 58, so that the portable device, which comprises the imaging device 16 inside a housing, can also have a small dimension.
  • the portable device can have a flat housing.
  • the translatory movement 26 can, for example, be carried out parallel or substantially parallel to an extension of a main side 13a and/or 13b of the device 10, so that any additional installation space required for switching the beam deflection device between operating states can be arranged along the line extension direction 56 and/or no installation space needs to be provided along a thickness direction of the device.
  • the actuators 48a and / or 48b can be arranged along the line extension direction and / or perpendicularly thereto, parallel to an extension direction from main sides of the housing of the device.
  • actuators for switching between operating states and/or actuators of the optical image stabilizer can be arranged next to, in front of and behind an extension between the image sensor, the single-line array 38 and the beam deflection device 18, while an arrangement above and/or below is omitted in order to keep the overall height of the imaging device 16 low.
  • actuators for switching the operating state and / or the optical image stabilizer can be arranged in a plane in which the image sensor 36, the single-line array 38 and the beam deflection device 18 are arranged.
  • the actuator 48b and / or other actuators can be designed to change a distance between the image sensor 36 and the single-line array 38 or the optics of the optical channels.
  • the actuator 48b can be designed, for example, to move the single-line array 38 and/or the image sensor 36 relative to one another along a beam path of the beam paths 22-1 to 22-4 or perpendicular to the line extension direction 56 in order to change the focus of the image of the field of view and/or to obtain an autofocus function.
  • the imaging device 16 can have a focus device that is designed to change the focus of the imaging device.
  • the focus device can be designed to provide a relative movement between the single-row array 38 and the image sensor 36.
  • the focus device can be designed to perform the relative movement while executing a movement of the beam deflection device 18 that is simultaneous with the relative movement.
  • the actuator 48b or another actuator can be designed to keep a distance between the single-line array 38 and the beam deflection device 18 at least essentially constant or, if no additional actuator is used, at least essentially or, if necessary, exactly constant, that is, to move the beam deflection device 18 to the same extent as the single-line array 38.
  • an implementation of a focus function can lead to an increased dimension (thickness) of the device.
  • the imaging device 16 can have a focus device for changing a focus.
  • the focus device can be designed to provide a relative movement (focusing movement) between at least one optics 41a-d of the optical channels of the multi-aperture imaging device 16 and the image sensor 36.
  • the focus device can have an actuator for providing the relative movement, for example the actuator 48b and / or 48a.
  • the beam deflection device 18 can be moved simultaneously with the focusing movement by appropriate design or use, possibly using a further actuator. This means that a distance between the single-line array 38 and the beam deflection device remains unchanged and/or that the beam deflection device 18 is moved simultaneously or with a time delay to the same or a comparable extent as the focusing movement, so that at least at one point in time the field of view captured by the multi-aperture imaging device is unchanged compared to before the change in focus.
  • the imaging device 16 comprises a control device 53, which is designed to receive image information from the image sensor 36.
  • an image of the total field of view is evaluated, which is obtained by deflecting the beam paths 22-1 to 22-4 of the optical channels 42a to 42d with the first beam deflection area, and a corresponding, i.e. matching, image is evaluated, which is obtained by deflecting the beam paths 22-1 to 22-4 of the optical channels 42a to 42d with the second beam deflection area, the order of the first and second imaging being arbitrary.
  • the control device 53 can generate two overall images of the acquired total field of view, for example using methods for stitching, wherein a first overall image is based on the first wavelength range and a second overall image is based on the second wavelength range.
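A possible way for the control device to assemble such overall images is ordinary image stitching. The following sketch uses OpenCV purely as an example; the text does not prescribe a particular stitching algorithm, and the variable names are assumptions:

```python
import cv2

def stitch_overall_image(partial_images):
    """Stitch a list of overlapping partial images into one overall image."""
    stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
    status, overall = stitcher.stitch(partial_images)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return overall

# One overall image per wavelength range, e.g.:
# overall_vis = stitch_overall_image(vis_partial_images)   # first wavelength range
# overall_nir = stitch_overall_image(nir_partial_images)   # second wavelength range
```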
  • the control device can be designed to determine a depth map for the first exposure using the second exposure, for example based on a wavelength range that is not visible to humans, such as an infrared range, in particular a near infrared range (NIR).
  • the control device can be designed, for example, to evaluate a pattern visible in the second wavelength range.
  • a predefined pattern, for example a dot pattern in the NIR wavelength range, can be emitted in the direction of the overall field of view and a distortion of the pattern can be evaluated in the second image or recording.
  • the distortion can be correlated with depth information.
  • the control device 53 can be designed to provide the depth map by evaluating the depth information.
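The correlation between pattern distortion and depth can be illustrated with a simple triangulation sketch in the style of structured-light systems; the baseline, focal length and the notion of a per-pixel pattern shift are assumptions for illustration, not details taken from the text:

```python
def depth_from_pattern_shift(shift_px, focal_length_px, baseline_mm):
    """Triangulation: the lateral shift of a projected dot relative to its
    reference position in the undistorted pattern maps to a depth value."""
    if shift_px <= 0:
        return float("inf")  # no measurable distortion -> object far away
    return focal_length_px * baseline_mm / shift_px

def depth_map(shift_image, focal_length_px=1400.0, baseline_mm=40.0):
    """Apply the triangulation to every pixel of the evaluated NIR image."""
    return [[depth_from_pattern_shift(s, focal_length_px, baseline_mm)
             for s in row] for row in shift_image]

# shift_image would hold, per pixel, the displacement of the observed dot
# pattern relative to the emitted reference pattern.
```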
  • temporal information can also be evaluated, for example with knowledge of a temporal variance of the pattern.
  • the lighting source can be designed to emit the temporal and / or spatial lighting pattern with a third wavelength range that completely or partially encompasses the second wavelength range, so that the third wavelength range at least partially corresponds to the second wavelength range.
  • a partial reflection of the wavelengths of the emitted pattern represents a sufficient source for the second wavelength range arriving at the image sensor and also includes wavelength shifts or partial reflections, for example based on absorptions.
  • the second wavelength range and the third wavelength range may also be congruent.
  • the deflected beam paths of the optical channels can run through a transparent area of a housing of the device, wherein an aperture can be arranged in the transparent area.
  • the diaphragm can have an open state for the two, the plurality or all of the optical channels. This means that the diaphragms can be effective for at least two optical channels of the multi-aperture imaging device.
  • the diaphragm 24b can optically at least partially close the transparent region 14b for the two, the plurality or all optical channels.
  • the aperture 24a can optically at least partially close the transparent region 14a for the two, the plurality or all optical channels.
  • FIG. 5b shows a schematic perspective view of the multi-aperture imaging device 16 according to an exemplary embodiment, in which the array 38 has, for example, two optical channels which comprise optics 41a-b, any higher number being possible, for example three, four, five or more.
  • Each of the optical channels comprising the optics 41a and 41b is designed to capture a respective partial field of view 64a or 64b of a total field of view 60.
  • the partial fields of view 64a and 64b overlap with one another and together form the overall field of view 60.
  • the multi-aperture imaging device 16 comprises an illumination device 55, which is designed to emit a temporal or spatial illumination pattern 55a, in particular in the direction of the overall field of view 60.
  • the illumination pattern 55a can comprise a third wavelength range, which at least partially overlaps or corresponds to the second wavelength range, so that a deflection of the beam paths using the second beam deflection region images the pattern, distorted in the overall field of view, onto the image sensor, where it can be evaluated by the control device 53.
  • FIG. 5 c shows a schematic side sectional view of a modified imaging device 16 ′, in which the beam deflection device 18 can be moved between a first position Pos1 of the first operating state and a second position Pos2 of the second operating state based on a rotational movement 52 ′ about the axis of rotation 54.
  • the imaging device 16 ' can have a first viewing direction 57a.
  • the imaging device 16 ′ can have a second viewing direction 57b.
  • Main sides 59a and 59b of the beam deflection device 18 can be formed reflectively as a mirror and/or as facet elements.
  • the beam deflection device 18 can be switchable via a center position 61, so that a distance between parallel planes 63a and 63b, which can describe a minimum dimension of the imaging device 16' along a normal direction of the planes 63a and 63b, is determined by the dimensions of the image sensor 36 and of the array 38 and is not influenced by a movement of the beam deflection device 18.
  • the rotational movement 52 can be superimposed on the rotational movement 28. Put simply, a superimposition of switching and optical image stabilization can be implemented.
  • Actuators of the multi-aperture imaging device can be arranged such that they are at least partially arranged between two planes 63a and 63b, which are spanned by the sides of a cuboid.
  • the sides of the cuboid can be aligned parallel to one another and parallel to the line extension direction of the array and to a part of the beam path of the optical channels between the image sensor and the beam deflection device.
  • the volume of the cuboid is minimal and nevertheless includes the image sensor, the array and the beam deflection device and their operational movements.
  • a thickness direction of the multi-aperture imaging device can be arranged normal to the planes 63a and / or 63b.
  • the actuators can have a dimension or extension parallel to the thickness direction. A portion of at most 50%, at most 30% or at most 10% of this dimension can protrude beyond the plane 63a and/or 63b from the area between the planes 63a and 63b, or protrude out of that area.
  • the actuators thus protrude, for example, at most insignificantly beyond the plane 63a and/or 63b.
  • the actuators do not protrude beyond the planes 63a and/or 63b. The advantage of this is that an expansion of the multi-aperture imaging device along the thickness direction is not increased by the actuators.
  • a volume of the multi-aperture imaging device can have a small or minimal installation space between the planes 63a and 63b.
  • an installation space of the multi-aperture imaging device can be large or arbitrarily large.
  • the volume of the virtual cuboid is influenced, for example, by an arrangement of the image sensor 36, the array 38 and the beam deflecting device, and the arrangement of these components in accordance with the exemplary embodiments described can be such that the installation space of these components along the direction perpendicular to the planes, and therefore the distance between the planes 63a and 63b, is small or minimal. Compared to other arrangements of the components, the volume and/or the distance between other sides of the virtual cuboid can be increased.
  • FIG. 6a shows a schematic view of an overall field of view 60, which comprises four overlapping partial fields of view 64a-d.
  • the partial fields of view 64a-d are arranged, for example, along two directions H and V in the object area, which for example, but not restrictively, can denote a horizontal direction and a vertical direction. Any other directional arrangement is possible.
  • Referring to FIG. 5a, for example, the beam path 22-1 can be directed towards the partial field of view 64a, the beam path 22-2 towards the partial field of view 64b, the beam path 22-3 towards the partial field of view 64c and/or the beam path 22-4 towards the partial field of view 64d.
  • the beam paths 22-1 to 22-4 are directed in different directions from one another.
  • although the total field of view 60 is captured in the described exemplary embodiment by means of four optical channels which capture the partial fields of view 64a-d, the total field of view 60 can also be captured by any other number of partial fields of view greater than 1, i.e. at least two, at least three, at least five, at least seven or more.
  • FIG. 6b shows a division of the total field of view 60 that is also possible and has been modified with respect to FIG. 6a, which is captured, for example, by only two partial fields of view 64a and 64b.
  • the partial fields of view 64a and 64b can, for example, be arranged along the direction V or, as shown in FIG. 6c, along the direction H and overlap with one another in order to enable effective image merging.
  • the partial fields of view are only shown with different sizes for better differentiation, even if this can mean a corresponding optional implementation.
  • An assignment of the partial visual fields 64a and 64b to the optical channels and a relative alignment of the array 14 can in principle be arbitrary.
  • a direction along which the partial visual fields are arranged for example V in FIG. 6b or H in FIG. 6c, can be arranged as desired in relation to the line extension direction 56 of the array 14.
  • An arrangement in which the line extension direction 56 and the direction along which the partial fields of view are arranged are perpendicular to each other at least within a tolerance range of ±25°, ±15° or ±5°, preferably exactly perpendicular, is advantageous.
  • the line extension direction 56 is, for example, arranged parallel to the direction H, which is arranged perpendicular to V.
  • the line extension direction 56 is likewise rotated in accordance with the arrangement of the partial fields of view 64a and 64b, which is rotated with respect to FIG. 6b, so that the line extension direction 56 is parallel to V or, within the specified tolerance range, perpendicular to H.
  • the optical channels 42a-c and the image sensor areas 44a-c could thus overlap in the representation plane of FIG. 6c or be congruent within the tolerance range, and are shown offset from one another for the sake of illustration.
  • Multi-aperture imaging devices in accordance with exemplary embodiments can be designed to capture the entire field of view 60 through at least two partial fields of view 64a-b.
  • At least one of the partial fields of view, unlike singly captured partial fields of view such as the partial field of view 64b or the partial fields of view according to the explanations for FIG. 6a, can be captured by at least a first optical channel 42a and a second optical channel 42c.
  • the total field of view can be segmented into exactly two partial fields of view 64a and 64b.
  • Exactly one of the partial fields of view, for example the partial field of view 64a, can be detected by two optical channels 42a and 42c.
  • Other partial fields of view can be captured through one channel.
  • multi-aperture imaging devices provide for the use of exactly two optical channels in order to image the two partial fields of view 64a and 64b in the respective wavelength range or in both wavelength ranges.
  • occlusions or occlusion effects can occur, which means that instead of a double detection of a region of the field of view arranged behind an object, only one viewing angle is captured.
  • some exemplary embodiments provide for capturing at least one of the partial fields of view 64a and/or 64b with a further optical channel 42a-c, so that at least this partial field of view is captured several times, in particular twice.
  • a different number of partial fields of view recorded twice and / or a different number of partial fields of view and / or a different number of optical channels is also possible.
  • optical channels 42a and 42c and/or image sensor areas 44a and 44c can be used for multiple detection of a partial field of view.
  • the optical channels used for multiple detection of the partial field of view 64a can be arranged symmetrically around an optical channel 42b for detecting the other partial field of view, be spaced apart from one another in the array 14 by at least one optical channel 42b directed at another partial field of view, and/or have an enlarged or maximum distance from one another within the array, in order to allow some degree of disparity.
  • FIG. 7a shows a schematic perspective view of a device 70i, which comprises a first multi-aperture imaging device 16a and a second multi-aperture imaging device 16b, and is designed to stereoscopically capture the entire field of view 60 with the multi-aperture imaging devices.
  • the overall field of view 60 is arranged, for example, on a main side 13b facing away from the main side 13a.
  • the multi-aperture imaging devices 16a and 16b can capture the entire field of view 60 through transparent areas 14a and 14c, respectively, with diaphragms 24a and 24c arranged in the main side 13b being at least partially transparent.
  • Apertures 24b and 24d arranged in the main side 13a can at least partially optically close transparent areas 14b and 14d, respectively, so that an amount of false light from a side facing the main side 13a, which could falsify the images of the multi-aperture imaging devices 16a and/or 16b, is at least reduced.
  • although the multi-aperture imaging devices 16a and 16b are shown spaced apart from one another, they can also be arranged spatially adjacent or combined.
  • the single-line arrays of the imaging devices 16a and 16b can be arranged side by side or parallel to one another.
  • the single-line arrays can form a line with one another, with each multi-aperture imaging device 16a and 16b having a single-line array.
  • the imaging devices 16a and 16b can have a common beam deflection device and / or a common carrier 39 and / or a common image sensor 36.
  • the transparent areas 14a-d can additionally be equipped with a switchable diaphragm 24a-d, which covers the optical structure in the case of non-use.
  • the aperture 24a-d can comprise a mechanically moving part. The movement of the mechanically moved part can take place using an actuator, as described for example for the actuators 48a and 48b.
  • the diaphragm can be electrically controllable and comprise an electrochromic layer or an electrochromic layer sequence.
  • According to a preferred exemplary embodiment, a device 702 is configured similarly to device 70i, but is designed such that the depth information is created from the recording in one of the wavelength ranges instead of from a stereoscopic recording, for example by evaluating a pattern distortion in a non-visible wavelength range.
  • the device 70 is designed and set up, for example, with only a single imaging device 16 in order to record the entire field of view from one perspective, namely that of the imaging device 16, and not to record stereoscopic images of the overall field of view.
  • the device 70 can also be designed in accordance with the preferred embodiment in order to provide or generate a depth map of the overall field of view, for example by evaluating a pattern distortion in one of the detected wavelength ranges, for example by means of the control device 53, a calculation device of the device 70, or the imaging device 16 set up for this purpose.
  • the device 70 can be implemented without an additional infrared camera that complements or expands the imaging device 16, since such functionality is already implemented in the imaging device 16, possibly including the lighting device 55.
  • the imaging device 16 of a device 703 is designed to have only one viewing direction compared to the devices 70i and 702, so that an arrangement of a corresponding viewing window in other directions and the anyway optional diaphragms can be dispensed with.
  • the devices 702 and 703 can also be designed to create a depth map of the overall field of view.
  • FIG. 8 shows a schematic structure comprising a first multi-aperture imaging device 16a and a second multi-aperture imaging device 16b, as can be arranged, for example, in the imaging system 70i.
  • the multi-aperture imaging devices 16a and 16b can be formed in whole or in part as a common multi-aperture imaging device.
  • the single-line arrays 38a and 38b form a common line.
  • the image sensors 36a and 36b can be mounted on a common substrate or on a common circuit carrier such as a common circuit board or a common flexboard. Alternatively, the image sensors 36a and 36b can also comprise substrates that are different from one another.
  • exemplary embodiments provide multi-aperture imaging devices comprising a common image sensor, a common array and/or a common beam deflection device 18, as well as further multi-aperture imaging devices which have separate components.
  • An advantage of a common image sensor, a common single-line array and/or a common beam deflection device is that a movement of a respective component can be obtained with great precision by actuating a small number of actuators, and synchronization between actuators can be reduced or avoided. Furthermore, high thermal stability can be obtained.
  • further multi-aperture imaging devices can have a common array, a common image sensor and/or a common beam deflection device. By arranging at least one further group of imaging optical channels, any number of which can be implemented, the multi-aperture imaging device can be designed to capture the entire field of view at least stereoscopically.
  • the beam paths or optical axes can be directed in different directions from the beam deflection device. This can be obtained by deflecting the beam paths during a deflection on the beam deflection device and / or by means of the optics in a manner different from parallelism.
  • the beam paths or optical axes can deviate from parallelism before or without beam deflection.
  • this fact is described by saying that the channels can be provided with a kind of pre-divergence. With this pre-divergence of the optical axes it would be possible, for example, that not all facet inclinations of the facets of the beam deflection device differ, but that some groups of channels, for example, have facets with the same inclination or are directed onto them.
  • the latter can then be formed in one piece or continuously merging into one another, quasi as a single facet which is assigned to this group of channels adjacent in the line extension direction.
  • the divergence of the optical axes of these channels could then originate from the divergence of these optical axes, as is achieved by a lateral offset between optical centers of the optics of the optical channels and image sensor regions of the channels.
  • the pre-divergence could be limited to one level, for example.
  • the optical axes could run in a common plane before or without beam deflection, but divergent in this plane, and the facets only cause an additional divergence in the other, transverse plane.
  • the optics can allow (pre) divergence of the beam paths along a first (image) direction and the beam deflection device can allow a divergence of the beam paths along a second (image) direction.
  • the aforementioned possible pre-divergence can be achieved, for example, by the optical centers of the optics lying on a straight line along the line extension direction, while the centers of the image sensor areas are arranged so as to deviate from the projection of the optical centers along the normal of the plane of the image sensor areas onto points on a straight line in the image sensor plane, for example at points which deviate channel-specifically from the points on the aforementioned straight line in the image sensor plane along the line extension direction and/or along the direction perpendicular to both the line extension direction and the image sensor normal.
  • alternatively, pre-divergence can be achieved by the centers of the image sensor areas lying on a straight line along the line extension direction, while the centers of the optics are arranged so as to deviate from the projection of the centers of the image sensor areas along the normal of the plane of the optical centers of the optics onto points on a straight line in the optics center plane, for example at points which deviate channel-specifically from the points on the aforementioned straight line in the optics center plane along the line extension direction and/or along the direction perpendicular to both the line extension direction and the normal of the optics center plane.
  • it is advantageous if the aforementioned channel-specific deviation from the respective projection extends only in the line extension direction, that is, if the optical axes lie in a common plane and are provided with a pre-divergence. Both optical centers and image sensor area centers then lie on a straight line parallel to the line extension direction, but with different intermediate distances.
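How a channel-specific lateral offset between an optical center and the center of its image sensor area translates into a viewing angle (pre-divergence) can be estimated with a simple pinhole relation; the focal length and offset values below are assumptions for illustration:

```python
import math

def viewing_angle_deg(lateral_offset_mm, focal_length_mm):
    """Tilt of a channel's optical axis caused by offsetting the image
    sensor area center laterally with respect to the optical center."""
    return math.degrees(math.atan2(lateral_offset_mm, focal_length_mm))

# Assumed focal length and channel-specific offsets along the line extension direction.
focal_length_mm = 4.0
offsets_mm = [-0.3, -0.1, 0.1, 0.3]
pre_divergence = [viewing_angle_deg(o, focal_length_mm) for o in offsets_mm]
```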
  • a lateral offset between lenses and image sensors in a lateral direction perpendicular to the line extension direction would lead to an increase in the overall height.
  • a pure in-plane offset in the line extension direction does not change the overall height, but possibly results in fewer facets and/or in facets that have a tilt in only one angular orientation, which simplifies the structure.
  • for example, a facet can be arranged with respect to a group of neighboring optical channels, inclined only in one direction and parallel to the line extension direction.
  • optical channels are assigned to the same partial field of view, e.g. for the purpose of super resolution or to increase the resolution with which the corresponding partial field of view is scanned through these channels.
  • the optical channels within such a group would then run parallel, for example before beam deflection, and would be deflected by a facet onto a partial field of view.
  • pixel images of the image sensor of one channel of a group would lie in intermediate positions between images of the pixels of the image sensor of another channel of this group.
  • the above exemplary embodiments can thus be implemented in the form of a multi-aperture imaging device and / or a device comprising such a multi-aperture imaging device, with a single-line channel arrangement, each channel transmitting a partial field of view of an overall field of view and the partial fields of view partially overlapping.
  • a setup with several such multi-aperture imaging devices for stereo, trio, quattro etc. setups for 3D image recording is possible.
  • the plurality of modules can be designed as a continuous line.
  • the connected line could use identical actuators and a common beam deflecting element.
  • One or more reinforcing substrates possibly present in the beam path can extend over the entire line, which can form a stereo, trio, quattro structure.
  • the optical axes can also run divergently without a beam deflection device, so that fewer facets are required on the beam deflection device.
  • the facets then advantageously have only one angle component.
  • the image sensor can be in one piece, have only one coherent pixel matrix or several interrupted ones.
  • the image sensor can be composed of many partial sensors which are arranged next to one another on a printed circuit board, for example.
  • An autofocus drive can be designed such that the beam deflecting element is moved synchronously with the optics or is stationary.
  • Sub-modules can also be constructed as a system.
  • the sub-modules or systems can be installed in a housing, such as a smartphone.
  • the systems can be arranged in one or more rows and / or rows and at any location.
  • two imaging devices 16 can be arranged in the housing 12 in order to enable stereoscopic detection of a visual field.
  • the device 70 comprises further multi-aperture imaging devices 16, so that the overall field of view 60 can be scanned with more than two multi-aperture imaging devices.
  • This enables a number of partially overlapping channels which, due to their viewing directions adapted to each channel, record the entire field.
  • at least one further arrangement of channels according to the exemplary embodiments described here and / or the described arrangement of channels can be arranged, which can be formed as exactly one line or as separate modules.
  • the single-line array can be arranged in multiple lines with a further line, wherein the further line of optical channels can be assigned to a further multi-aperture imaging device.
  • the optical channels of the further line can also record overlapping partial areas and together cover the entire field of view. This enables a stereo, trio, quattro, etc. structure to be obtained from array cameras, which consist of channels which partially overlap and cover the entire field of view within their sub-grouping.
  • multi-aperture cameras with a linear channel arrangement can comprise a plurality of optical channels which are arranged next to one another and each transmit parts of the overall field of view.
  • as a mirror beam deflection device, a mirror can advantageously be arranged in front of the imaging lenses, which mirror can be used for beam deflection and for reducing the overall height of the device.
  • with a channel-by-channel mirror, such as a facet mirror, where the facets can be flat or arbitrarily curved or provided with a free-form surface, it can be advantageous to construct the imaging optics of the channels essentially identically, whereas the viewing directions of the channels are influenced or predetermined by the individual facets of the mirror array.
  • the imaging optics of the channels can be designed or shaped differently, so that there are different viewing directions.
  • the deflecting mirror (beam deflecting device) can be rotatably mounted, the axis of rotation being perpendicular to the optical channels, ie parallel to the line extension direction of the channels.
  • the deflecting mirror can be reflective on both sides, whereby metallic or dielectric layers or layer sequences can be arranged in order to obtain a reflectivity.
  • Rotation or translational displacement of the mirror can be carried out in an analog, bistable or multi-stable manner. A position can be understood to be stable if a force has to be applied in order to move along a predefined direction, wherein falling below that force can result in the beam deflection device stopping or returning.
  • the analog rotation can be used for a one-dimensional adaptation of the image position, which can be understood as optical image stabilization. For example, a movement of only a few degrees can be sufficient, for example ≤ 15°, ≤ 10° or ≤ 1°.
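As a rough plausibility check of these small angles, the image shift produced by an analog mirror rotation can be estimated as follows; the factor of two reflects that rotating a mirror deflects the reflected ray by twice the rotation angle, and the focal length and pixel pitch are assumed values:

```python
import math

def image_shift_px(mirror_rotation_deg, focal_length_mm, pixel_pitch_mm):
    """Approximate image shift caused by rotating the deflection mirror."""
    ray_deflection_rad = 2.0 * math.radians(mirror_rotation_deg)
    shift_mm = math.tan(ray_deflection_rad) * focal_length_mm
    return shift_mm / pixel_pitch_mm

# A rotation of 1 degree with a 4 mm focal length and 1.2 um pixel pitch:
print(image_shift_px(1.0, focal_length_mm=4.0, pixel_pitch_mm=0.0012))
```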
  • the bistable or multiple stable rotation of the mirror can be used to switch the viewing direction of the camera. For example, you can switch between the viewing directions in front of, next to and behind the display.
  • Analog and bistable / multi-stable movements or positions can be combined, ie overlaid.
  • the exemplary embodiments described here can be used to replace solutions in portable devices, such as smartphones, which use two cameras with different viewing directions forwards and backwards, by a structure which comprises only one imaging device.
  • the structure can be distinguished, for example, by the fact that the viewing window in the housing for the cameras with the viewing direction to the front and to the rear is arranged in the same position, ie opposite in the upper or lower housing cover. Areas of these housing covers, which are arranged for the beam passage, can be transparent and, in the case of the use of visible light, can consist of or include glass and / or polymers.
  • FIGS. 9a-d show schematic side sectional views of a beam deflection device according to exemplary embodiments, which can be used on its own or as part of a device according to the invention, such as device 70i, 702 and/or 703.
  • the side sectional views shown relate, for example, to the respective facets of a faceted beam deflection device.
  • the beam deflection device can be formed, for example, as an array of facets.
  • a facet can be assigned to each optical channel, wherein each facet can deflect one or more optical channels.
  • Each of the facets can have a corresponding first beam deflection area and second beam deflection area.
  • the facets of the array of facets can be formed as reflective mirrors on both sides.
  • the beam deflection device can be moved such that the front edge of the facet is moved slightly up and down for alternate deflection with different sides, without the surface normal of the sides 35a and 35b being parallel to a surface normal of the image sensor.
  • a simple and/or small installation size along the line extension direction of the array can be obtained by the beam deflection device being rotatably supported by 90° or more, for example approximately 180° or even 360°.
  • the four positions mentioned can be obtained by a purely rotary movement, so that additional facets and / or a translatory movement can be dispensed with.
  • This also enables a simple configuration of the facets as plane-parallel mirrors, for example as a single plane-parallel mirror with adjustment of the divergence of the beam paths by means of the optics and / or as mutually inclined or tilted plane-parallel facets which adjust the divergence in whole or in part.
  • FIG. 9a shows a schematic side sectional view of a multi-aperture imaging device 90 according to an exemplary embodiment, in which opposite sides 18A and 18B of the beam deflection device are designed to deflect a beam path 22 in such a way that mutually different wavelength ranges are deflected at the sides 18A and 18B.
  • the beam deflection device is shown in a first position, in which the side 18A faces the image sensor 36.
  • the beam deflection device 18 has a first beam deflection area, which is formed, for example, on the side 18A and which is effective for a first wavelength range of electromagnetic radiation passing through the optical channel, for example the visible wavelength range.
  • the beam deflection device has a second beam deflection region 18B, which is effective, for example, for a second wavelength range of the electromagnetic radiation passing through the optical channel, such as ultraviolet (UV), infrared (IR) or near infrared (NIR), which is different from the first wavelength range.
  • the wavelength ranges can be disjoint, but can also partially overlap as long as they are at least partially different and thus enable different image information to be obtained.
  • the beam deflection device can be designed, in order to obtain a first image of the total field of view, to set an angle of attack α1 of the first beam deflection region 18A with respect to the image sensor of, for example, 45°.
  • the beam deflection device 18 can be brought into a second position, in which the side 18B faces the image sensor, so that the side 18B is effective, for example in order to deflect NIR light.
  • the beam deflection device 18 can be rotated by 180° from the first position.
  • the beam deflecting region 18A can be arranged on a first side of the beam deflecting device 18 and the second beam deflecting region 18B can be arranged on a second side which is arranged opposite the first side.
  • the beam deflecting device 18 can be designed as a whole or in the individual beam deflecting elements such that the first side is arranged facing the image sensor for capturing a first image of the total field of view, and the second side is arranged facing the image sensor for capturing a second image of the total field of view.
  • a rotary and / or translatory movement can be used to change the sides facing the image sensor.
  • a plane-parallel configuration of the beam deflection device or of its facet enables the facet or beam deflection device 18, in order to obtain a second image of the total field of view, for example using the second wavelength range, to set an angle of incidence α2 of the second beam deflection area 18B with respect to the image sensor of 45° within a tolerance range of ±10°, ±5° or ±2°.
  • the tolerance ranges can compensate, for example, for the fact that beam deflection elements have an angle of attack slightly different from 45°, which results from an inclination or tilting of different facets of the beam deflection device 18 relative to one another, so that the individual facets or deflection regions have an angle of approximately 45° on average but deviate from it due to their individual inclination.
  • the beam deflection regions 18A and 18B can be obtained by coatings which are configured differently from one another and which are either reflective or non-reflective in the first or second wavelength range.
  • Exemplary embodiments provide that a corresponding coating with one or more layers is provided on the sides of the beam deflection device 18 in order to produce the beam deflection regions 18A and 18B.
  • These layers can have, for example, one or more dielectric layers, the layer thickness of which can be adapted to the angle of incidence of the beam deflection device.
  • in order to suppress unwanted wavelength ranges, in particular of the respective other wavelength range, some embodiments have a region for absorbing certain wavelengths, such as a volume absorber or the like.
  • the area can be covered by the coating, so that, for example, some wavelengths are reflected first and non-reflected, for example transmitted, wavelength areas are absorbed.
  • the corresponding wavelengths can be reflected by the coating, while other wavelengths, for example at least undesired parts of the second wavelength range, can be passed, i.e. transmitted, by these layers.
  • the area for absorption arranged behind the coating can absorb these portions in order to avoid or at least reduce the image in the multi-aperture imaging device thereby being adversely affected.
  • a complementary device for absorbing unwanted parts of the first wavelength range, which is effective when the second beam deflection region 18B is used for beam deflection, can be arranged on the second side.
  • FIG. 9c shows the beam deflection device 18 in an optional third position, in which the side 18A again faces the image sensor, but the inclination is selected such that the beam paths are deflected in the direction of a second overall field of view, which differs, for example, from the first overall field of view of FIGS. 9a and 9b.
  • FIG. 9d shows the beam deflection device in an optional fourth position, in which the side 18B again faces the image sensor, so that the side 18B is effective, for example in order to deflect light from the second overall field of view in the direction of the image sensor 36.
  • the additional positions according to FIG. 9c and FIG. 9d for the acquisition of the second total field of view enable the second overall field of view to be recorded using the first beam deflection region 18A with the image sensor, so that this recording is based on the first wavelength range.
  • the second overall field of view can be imaged with a further image, specifically using the beam deflection region 18B with the image sensor, so that this image is based on the second wavelength range.
  • the two overall fields of view can be arranged along different main directions of the multi-aperture imaging device, for example along opposite directions, that is, along approximately 180 ° different directions.
  • along a progressive rotational movement through the positions of FIGS. 9a-d, the beam deflection regions can, for example, deflect the beam path alternately in the direction of the first overall field of view and the second overall field of view, and alternately with the first beam deflection region 18A and the second beam deflection region 18B.
  • This can be a possible but not absolutely necessary sequence of movements. Rather, that direction of rotation can always be selected, for example, which enables the shortest and/or fastest change of position, so that it is possible to switch between the positions in any order, in particular in the case of detection of a third overall field of view along a third direction and/or in an arrangement of the total fields of view at an angle not equal to 180°.
  • the positions according to FIGS. 9a-9d can be approached in any order, for example in steps of approximately 45° in each case.
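Selecting the direction of rotation that yields the shortest change of position can be sketched with simple modular arithmetic; the concrete angles assigned to the positions are assumptions chosen only for illustration:

```python
def shortest_rotation(current_deg, target_deg):
    """Signed rotation in the range (-180, 180] that reaches the target fastest."""
    delta = (target_deg - current_deg) % 360.0
    if delta > 180.0:
        delta -= 360.0
    return delta

# Assumed angles for the four positions sketched in FIGS. 9a-9d.
positions = {"9a": 0.0, "9b": 180.0, "9c": 90.0, "9d": 270.0}
move = shortest_rotation(positions["9b"], positions["9c"])  # -> -90.0
```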
  • a translational displacement of the beam deflection device can also be implemented.
  • pixels of the image sensor can be designed to be effective for both wavelength ranges and/or cells with different sensitivity can be arranged spatially adjacent, so that at least the image sensor area is sensitive to both wavelength ranges.
  • the image sensor areas can be designed for image generation in the first wavelength range and for image generation in the second wavelength range.
  • CMOS pixels can be sensitive in the visual and NIR range at the same time
  • the overlying color filter array ("CFA" - in the visual range typically in a Bayer arrangement) can, depending on the color (red, green, blue; or magenta, cyan, yellow), also contain "filter pixels", only some of which also transmit the NIR, and only partially, but that is sufficient.
  • individual cells can be replaced or implemented by cells that are only sensitive in the NIR
  • pixels of the image sensor areas can be configured for image generation in the first wavelength range and for image generation in the second wavelength range.
  • the invention thus relates to a beam deflection device in facetVISION architecture with different configurations of the mirror front and mirror rear sides, wherein facetVISION is related to the multi-aperture imaging devices described herein.
  • a key idea is to design the deflecting mirror so that it has different functionalities on the front and rear.
  • the first side reflects the visual spectral range (VIS) at the desired beam deflection angle, but not the near infrared (NIR), and the second side reflects NIR at the desired beam deflection angle, but not VIS, all of this being achieved, for example, through differently configured dielectric layer systems on the first and second mirror sides.
  • By switching the mirror, the device can be used as a VIS or NIR camera.
  • the mirror no longer necessarily has a wedge shape, but can be a simple plane-parallel plate, a 180° rotation being used for the VIS/NIR mirror switching. Any negative installation space implications in the rotating area of the mirror can be remedied by opening and closing cover glasses at the location of the windows (through-openings of the device).
  • the camera can be constructed with a one-sided viewing direction ("world" or "selfie"); mirror switching (180°) then only serves to change the recorded spectral range. However, the structure can still allow front and rear viewing directions, then e.g. in 90° turning steps of the mirror: world-VIS, selfie-NIR, world-NIR, selfie-VIS.
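The switching logic of these four states in 90° turning steps can be written as a small lookup; the assignment of angles to states is an assumption chosen only to illustrate the idea:

```python
# Assumed assignment of mirror angles to viewing direction and spectral range.
MIRROR_STATES = {
    0: ("world", "VIS"),
    90: ("selfie", "NIR"),
    180: ("world", "NIR"),
    270: ("selfie", "VIS"),
}

def state_for_angle(angle_deg):
    """Snap an angle to the nearest 90-degree state and report its meaning."""
    key = int(round(angle_deg / 90.0)) % 4 * 90
    return MIRROR_STATES[key]

print(state_for_angle(182.0))  # -> ('world', 'NIR')
```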
  • the image sensor areas can be configured for image generation in the first wavelength range 66 and for image generation in the second wavelength range 68.
  • the first wavelength range 66 is arranged, for example, between a first lower wavelength λ1 and a first upper wavelength λ2, with λ1 < λ2.
  • the second wavelength range 68 is arranged, for example, between a second lower wavelength λ3 and a second upper wavelength λ4, with λ3 < λ4.
  • while the second wavelength range 68 here has longer wavelengths than the first wavelength range 66, it is also possible for the second wavelength range 68 to have shorter wavelengths than the first wavelength range 66.
  • the wavelength ranges 66 and 68 can overlap with one another, but can also be spaced apart from one another by an intermediate region 72.
  • the image sensor area can be designed to generate image data at least in the wavelength ranges 66 and 68, which means that it has, at least in the wavelength ranges 66 and 68, a sensitivity E1 which is increased compared to a sensitivity E0 at which the image sensor area, for example, does not generate image data or image signals because it is insensitive to these wavelengths.
  • the beam deflection can be carried out selectively for the wavelength ranges 66 and 68, so that wavelengths outside the respective wavelength range for which the beam deflection range is currently effective are attenuated or filtered out, it being sufficient that at least the wavelengths are suppressed or attenuated which are arranged in the complementary wavelength range.
  • a wavelength range for which the image sensor is insensitive can also be deflected by the beam deflection region 18A and / or 18B.
  • the image sensor area can also be designed for imaging outside the wavelength ranges 66 and 68.
  • the image sensor area can have a plurality of pixels (picture elements).
  • Each pixel can consist of at least one, preferably several, imaging, i.e. photosensitive, sensor cells. These can be arranged freely or according to a pattern, for example a Bayer pattern.
  • a sensitivity of the image sensor area for the second wavelength area 68 can be obtained, for example, by a first subset of pixels being sensitive to the first wavelength area 66 and a second subset of other pixels being sensitive to the second wavelength area 68.
  • pixels of the first subset and pixels of the second subset can be interleaved with one another and arranged alternately.
  • Pixels of the image sensor areas can thus be designed for image generation in the first wavelength range 66 and / or at least partially for image generation in the second wavelength range 68.
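Separating such interleaved pixel subsets into a first-range image and a second-range image can be sketched as follows; the checkerboard interleaving and the array sizes are assumptions, since the text leaves the exact arrangement open:

```python
import numpy as np

def split_interleaved(raw, second_range_mask):
    """Split a raw frame into the first-range (e.g. VIS) and second-range
    (e.g. NIR) contributions using a mask of the second-range pixels."""
    first = np.where(~second_range_mask, raw, 0)
    second = np.where(second_range_mask, raw, 0)
    return first, second

# Assumed checkerboard interleaving of first- and second-range cells.
h, w = 8, 8
raw_frame = np.arange(h * w, dtype=np.uint16).reshape(h, w)
nir_mask = (np.indices((h, w)).sum(axis=0) % 2).astype(bool)
vis_img, nir_img = split_interleaved(raw_frame, nir_mask)
```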
  • Although some aspects have been described in connection with a device, it is understood that these aspects also represent a description of the corresponding method, so that a block or a component of a device is also to be understood as a corresponding method step or as a feature of a method step. Analogously, aspects that have been described in connection with or as a method step also represent a description of a corresponding block or detail or feature of a corresponding device.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Toxicology (AREA)
  • Human Computer Interaction (AREA)
  • Electromagnetism (AREA)
  • Studio Devices (AREA)
  • Cameras In General (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
  • Camera Bodies And Camera Details Or Accessories (AREA)

Abstract

The invention relates to a multi-aperture imaging device comprising an image sensor, an array of optical channels arranged next to one another, each optical channel comprising optics for imaging at least one partial field of view of an overall field of view onto an image sensor region, and a beam deflection device for deflecting the beam path of the optical channels, the beam deflection device comprising a first beam deflection region which is effective for a first wavelength range of electromagnetic radiation passing through the optical channel, and a second beam deflection region which is effective for a second wavelength range, different from the first, of the electromagnetic radiation passing through the optical channels.
EP19813525.3A 2018-12-10 2019-12-03 Dispositif de reproduction multi-canaux et appareil comportant un dispositif de reproduction multi-ouvertures Pending EP3894925A1 (fr)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE102018221358 2018-12-10
DE102018222830.2A DE102018222830A1 (de) 2018-12-10 2018-12-21 Multikanalabbildungsvorrichtung und vorrichtung mit einer multiaperturabbildungsvorrichtung
PCT/EP2019/083521 WO2020120230A1 (fr) 2018-12-10 2019-12-03 Dispositif de reproduction multi-canaux et appareil comportant un dispositif de reproduction multi-ouvertures

Publications (1)

Publication Number Publication Date
EP3894925A1 true EP3894925A1 (fr) 2021-10-20

Family

ID=70776860

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19813525.3A Pending EP3894925A1 (fr) 2018-12-10 2019-12-03 Dispositif de reproduction multi-canaux et appareil comportant un dispositif de reproduction multi-ouvertures

Country Status (8)

Country Link
US (1) US11611736B2 (fr)
EP (1) EP3894925A1 (fr)
JP (1) JP7285931B2 (fr)
KR (1) KR20210095679A (fr)
CN (1) CN113227867B (fr)
DE (1) DE102018222830A1 (fr)
TW (1) TWI742480B (fr)
WO (1) WO2020120230A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3816646A1 (fr) * 2019-10-28 2021-05-05 Koninklijke Philips N.V. Système de surveillance comportant une caméra et un miroir non magnétique pour un système d'examen par résonance magnétique

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6342940B1 (en) 1998-12-28 2002-01-29 Futaba Denshi Kogyo Kabushiki Kaisha Optical printer
JP2000190566A (ja) * 1998-12-28 2000-07-11 Futaba Corp 光プリントヘッド及び光プリンタ
JP2004054108A (ja) * 2002-07-23 2004-02-19 Nikon Corp 光路分割光学素子とこれを用いた顕微鏡
US8784301B2 (en) * 2011-08-12 2014-07-22 Intuitive Surgical Operations, Inc. Image capture unit and method with an extended depth of field
KR101371388B1 (ko) * 2012-06-29 2014-03-10 국방과학연구소 다중 선형검출기를 이용한 회전형 전방위 360도 파노라믹 적외선 영상 생성장치
US20140055624A1 (en) * 2012-08-23 2014-02-27 Microsoft Corporation Switchable camera mirror apparatus
US20140110585A1 (en) * 2012-10-19 2014-04-24 James Justice Multi-Spectral Sensor System Comprising a Plurality of Buttable Focal Plane Arrays
DE102015215833A1 (de) * 2015-08-19 2017-02-23 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Multiaperturabbildungsvorrichtung mit Optiksubstrat
DE102015215841B4 (de) * 2015-08-19 2017-06-01 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Vorrichtung mit einer Multikanalabbildungsvorrichtung und Verfahren zum Herstellen derselben
DE102015215836B4 (de) * 2015-08-19 2017-05-18 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Multiaperturabbildungsvorrichtung mit einer reflektierende Facetten aufweisenden Strahlumlenkvorrichtung
US20170096144A1 (en) * 2015-10-05 2017-04-06 Ford Global Technologies, Llc System and Method for Inspecting Road Surfaces
DE102015220566B4 (de) * 2015-10-21 2021-03-18 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Vorrichtung mit einer Multiaperturabbildungsvorrichtung, Verfahren zum Bereitstellen derselben und Verfahren zum Erfassen eines Gesamtgesichtsfeldes
AT518355B1 (de) * 2016-03-08 2018-04-15 Ait Austrian Inst Tech Gmbh Anordnung zur Erstellung von Bildern
DE102016208210A1 (de) 2016-05-12 2017-11-16 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. 3d-multiaperturabbildungsvorrichtungen, multiaperturabbildungsvorrichtung, verfahren zum bereitstellen eines ausgangssignals einer 3d-multiaperturabbildungsvorrichtung und verfahren zum erfassen eines gesamtgesichtsfeldes
KR101789317B1 (ko) * 2016-11-04 2017-10-23 한남대학교 산학협력단 멀티스케일 이미징 시스템
DE102017208709B3 (de) 2017-05-23 2018-10-25 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Multiaperturabbildungsvorrrichtung und Verfahren zum Bereitstellen einer Multiaperturabbildungsvorrichtung

Also Published As

Publication number Publication date
DE102018222830A1 (de) 2020-06-10
WO2020120230A1 (fr) 2020-06-18
TW202024718A (zh) 2020-07-01
US11611736B2 (en) 2023-03-21
JP7285931B2 (ja) 2023-06-02
CN113227867A (zh) 2021-08-06
JP2022512354A (ja) 2022-02-03
CN113227867B (zh) 2023-02-17
TWI742480B (zh) 2021-10-11
KR20210095679A (ko) 2021-08-02
US20210281819A1 (en) 2021-09-09

Similar Documents

Publication Publication Date Title
EP3338133B1 (fr) Dispositif comprenant un dispositif de reproduction à multiples canaux et son procédé de fabrication
EP3225021B1 (fr) Dispositif de reproduction multiouverture, système de reproduction et procédé de fourniture d'un dispositif de reproduction multiouverture
EP3403392B1 (fr) Détecteur optique à ouvertures multiples et methode de détection de signaux optiques
EP3366032B1 (fr) Dispositif comportant un dispositif de reproduction à ouvertures multiples, procédé de fabrication de celui-ci et procédé de détection d'un champ visuel global
EP3371650B1 (fr) Dispositif imageur multi-ouvertures, système imageur et procédé de détection d'une zone d'objet
EP3338132B1 (fr) Dispositif d'imagerie à ouverture multiple, dispositif portatif et procédé de fabrication d'un dispositif d'imagerie à ouverture multiple
EP3860108B1 (fr) Dispositif de reproduction multi-ouverture, système de reproduction et procédé de fourniture d'un dispositif de reproduction multi-ouverture
EP3632094B1 (fr) Dispositif de reproduction multi-ouverture, système de reproduction et procédé de fourniture d'un dispositif de reproduction multi-ouverture
DE102015215833A1 (de) Multiaperturabbildungsvorrichtung mit Optiksubstrat
EP3593201B1 (fr) Dispositif de reproduction multiouverture, système de reproduction et procédé de fourniture d'un dispositif de reproduction multi-ouverture
EP2294483A1 (fr) Système de projection
EP3981145A1 (fr) Dispositif de reproduction multi-canaux et appareil comportant un dispositif de reproduction multi-ouvertures
EP3894925A1 (fr) Dispositif de reproduction multi-canaux et appareil comportant un dispositif de reproduction multi-ouvertures
DE102017012197B4 (de) Multiaperturabbildungsvorrichtung, Abbildungssystem und Verfahren zum Bereitstellen einer Multiaperturabbildungsvorrichtung
DE102015017384B4 (de) Vorrichtung mit einer Multiaperturabbildungsvorrichtung, Verfahren zum Bereitstellen derselben und Verfahren zum Erfassen eines Gesamtgesichtsfeldes

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20210520

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
RAP3 Party data changed (applicant data changed or rights of an application transferred)

Owner name: FRAUNHOFER-GESELLSCHAFT ZUR FOERDERUNG DER ANGEWANDTEN FORSCHUNG E.V.

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20230808