WO2023202387A1 - Optical imaging module, optical imaging system and terminal device - Google Patents

Optical imaging module, optical imaging system and terminal device

Info

Publication number
WO2023202387A1
Authority
WO
WIPO (PCT)
Application number
PCT/CN2023/086660
Other languages: English (en), French (fr)
Inventors: 陈廷爱, 王庆平, 杨沫
Original Assignee: Huawei Technologies Co., Ltd. (华为技术有限公司)
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Publication of WO2023202387A1

Classifications

    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00: Optical objectives specially designed for the purposes specified below
    • G02B13/001: Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras
    • G02B13/0015: Miniaturised objectives for electronic devices characterised by the lens design
    • G02B13/002: Miniaturised objectives for electronic devices having at least one aspherical surface
    • G02B13/06: Panoramic objectives; so-called "sky lenses", including panoramic objectives having reflecting surfaces
    • G02B13/18: Optical objectives with lenses having one or more non-spherical faces, e.g. for reducing geometrical aberration
    • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/10: Beam splitting or combining systems
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; ACCESSORIES THEREFOR
    • G03B30/00: Camera modules comprising integrated lens units and imaging units, specially adapted for being embedded in other devices, e.g. mobile phones or vehicles

Definitions

  • the present application relates to the field of optical imaging technology, and in particular, to an optical imaging module, an optical imaging system and a terminal device.
  • With the development of science and technology, users have increasingly high requirements for the imaging capability of optical lenses; for example, imaging with a very large field of view is required.
  • conventional optical lenses cannot achieve ultra-wide field of view shooting; specially designed optical lenses are required, such as fisheye lenses (see Figure 1a), panoramic reflection lenses (see Figure 1b) or panoramic annular lenses (see Figure 1c), etc.
  • optical distortion refers to the ratio of the difference between the real image height (y_chief) and the ideal image height (y_ref) to the ideal image height, which can be expressed by Formula 1 below; the ideal image height can be expressed by Formula 2 below.
  • Optical distortion = 100% × (y_chief - y_ref) / y_ref    (Formula 1)
  • y_ref = f × tan(θ)    (Formula 2)
  • where f represents the focal length and θ represents half of the field of view angle; a numerical example follows.
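  • A purely illustrative numerical check of Formulas 1 and 2 (the focal length, image height and half field of view below are hypothetical values, not taken from the embodiments):

```python
import math

def ideal_image_height(f_mm: float, half_fov_deg: float) -> float:
    """Formula 2: y_ref = f * tan(theta)."""
    return f_mm * math.tan(math.radians(half_fov_deg))

def optical_distortion_percent(y_chief_mm: float, f_mm: float, half_fov_deg: float) -> float:
    """Formula 1: optical distortion = 100% * (y_chief - y_ref) / y_ref."""
    y_ref = ideal_image_height(f_mm, half_fov_deg)
    return 100.0 * (y_chief_mm - y_ref) / y_ref

# Hypothetical numbers: a 2 mm focal length whose chief ray lands at
# 1.55 mm for a 40 degree half field of view.
print(round(optical_distortion_percent(1.55, 2.0, 40.0), 2))  # -7.64, i.e. barrel distortion
```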
  • This application provides an optical imaging module, an optical imaging system and a terminal device, which can not only realize imaging with a large field of view, but also reduce the optical distortion of the captured image.
  • the application provides an optical imaging module.
  • the optical imaging module includes a first lens component, a second lens component, a third lens component, a light combining component and a detection component.
  • the angle between the first optical axis of the first lens component and the second optical axis of the second lens component is greater than 0° and less than 180°, or greater than 180° and less than 360°.
  • the first lens component is used to propagate the first light ray from the first field of view to the light combining component; the second lens component is used to propagate the second light ray from the second field of view to the light combining component.
  • the first field of view and the second field of view partially overlap; the light combining component is used to mix the first light and the second light to obtain a third light; the third lens component is used to focus the third light from the light combining component onto the detection component; the detection component is used to form an image based on the focused third light.
  • the first light rays of the first field of view can be collected as much as possible into the light combining component through the first lens assembly;
  • the second light rays of the second field of view can be collected as much as possible into the transmission aperture of the light combining component through the second lens component;
  • the light combining component mixes the first light and the second light to obtain a third light, achieving the aliasing of the first field of view and the second field of view;
  • the first light ray and the second light ray are coupled to the same detection component, so that the smaller first field of view and second field of view can subsequently be unmixed, separated and re-stitched.
  • the image formed based on the first field of view and the image formed based on the second field of view can be unmixed, separated and displayed separately, so that one of the fields of view can be flexibly selected as the observation field, which can be applied to directional observation or peripheral-vision observation.
  • the angle between the first optical axis and the second optical axis is equal to 90°.
  • setting the angle between the first optical axis and the second optical axis to 90° helps to simplify the design and assembly of the optical imaging module.
  • the field of view angle of the first field of view is greater than or equal to 90° and less than or equal to 135°; and/or the field of view angle of the second field of view is greater than or equal to 90° and less than or equal to 135°.
  • the field of view angle of the first field of view is greater than or equal to 98° and less than or equal to 130°; and/or the field of view angle of the second field of view is greater than or equal to 98° and less than or equal to 130°.
  • the optical distortion of the synthetic field of view after combining the first field of view and the second field of view is small, and imaging of an extremely large field of view can be achieved.
  • the first light ray propagated through the first lens component is parallel light or non-parallel light; and/or the second light ray propagated through the second lens component is parallel light or non-parallel light.
  • Converting the first light of the first field of view into parallel light through the first lens assembly helps to reduce the difficulty of assembling the various lens groups. Converting the first light in the first field of view into non-parallel light through the first lens component helps reduce the size of the entire optical imaging module during design.
  • the light combining component includes a polarizing light splitting element, which is used to reflect the first polarized light in the first light ray to the third lens component and to transmit the second polarized light in the second light ray to the third lens component.
  • the polarization state of the first polarized light is different from the polarization state of the second polarized light.
  • the polarization splitting element can be used to mix the first polarized light in the first light ray and the second polarized light in the second light ray, thereby obtaining the third light ray.
  • the polarizing light splitting element includes a first light combining surface, and the first light combining surface is coated with a polarizing light splitting film or etched with a metal wire grid.
  • the detection component includes a first filter layer and a first photosensitive layer. The first filter layer includes N first filter blocks, and a first filter block includes n first filter units; at least two of the n first filter units allow different polarization states of the third light to pass through, and N and n are both integers greater than 1. The first photosensitive layer includes P first pixel blocks, a first pixel block includes p first pixels, p is greater than or equal to n, and P is a positive integer.
  • in this way, each first pixel block can detect light of all polarization states (i.e., both the first polarized light and the second polarized light). When p is greater than n, multiple first pixels can correspond to one first filter unit, that is, pixel binning can be achieved, which helps to increase the signal-to-noise ratio of the formed image, as sketched below.
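  • A minimal sketch of pixel binning, assuming a single-channel raw frame (array sizes and noise model are illustrative, not from the patent):

```python
import numpy as np

def bin_pixels(raw: np.ndarray, factor: int = 2) -> np.ndarray:
    """Average factor x factor blocks of pixels into one output pixel.

    Averaging k independent pixel readings reduces noise standard deviation
    by roughly sqrt(k), which is why binning raises the image SNR.
    """
    h, w = raw.shape
    h, w = h - h % factor, w - w % factor          # crop to a multiple of factor
    blocks = raw[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

raw = np.random.poisson(100, size=(8, 8)).astype(float)  # toy sensor frame
print(bin_pixels(raw).shape)  # (4, 4)
```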
  • the detection component further includes a second filter layer; the second filter layer includes M second filter blocks, and a second filter block includes m second filter units; at least two of the m second filter units allow different wavelength ranges of the third light to pass through, and M and m are both integers greater than 1; one second filter unit corresponds to one first filter unit.
  • the m first filter units corresponding to the m second filter units belonging to the same second filter block allow the third light to pass through with the same polarization state; or, the n second filter units corresponding to the n first filter units belonging to the same first filter block allow the same wavelength range of the third light to pass through.
  • the first filter layer allows light of all polarization states (i.e., the first polarized light and the second polarized light) to pass through, and the second filter layer allows all wavelength bands to pass through. In this way, the first light and the second light can completely cover the polarization sensor, thereby improving the utilization rate of the polarization sensor. Furthermore, by collecting one frame of original image based on this detection component, an image with a larger field of view and smaller optical distortion can be obtained, which can be applied to scenes such as video streaming. It also does not require high-capacity image processing bandwidth, helping to save processor computing power.
  • the light combining component is a spectrum splitting element; the spectrum splitting element is used to reflect light of k first wavebands in the first light to the third lens assembly and to transmit light of k second wavebands in the second light to the third lens assembly; one first waveband corresponds to one second waveband, and k is an integer greater than 1.
  • the spectral splitting element can be used to mix part of the first light (i.e., k light rays of the first wave band) and part of the second light (i.e., k light rays of the second wave band), thereby obtaining the third light.
  • the spectrum splitting element includes a second light combining surface, and the second light combining surface includes a multi-passband spectral film.
  • the detection component includes a third filter layer and a second photosensitive layer; the third filter layer includes Q third filter blocks, and a third filter block includes at least 2k third filter units; the 2k third filter units allow different wavelength ranges of the third light to pass through, and Q is a positive integer; the second photosensitive layer includes Q second pixel blocks, a second pixel block includes at least 2k second pixels, one second pixel block corresponds to one third filter block, and one second pixel corresponds to one third filter unit.
  • the light combining component is specifically configured to reflect the first light to the third lens component during the first period, and to transmit the second light to the third lens component during the second period.
  • the light combining component includes a dichroic mirror coated with an electrically controlled film; or a deflectable reflector; or a light splitting prism, a first switch and a second switch, wherein the first switch is located between the first lens component and the light splitting prism, and the second switch is located between the second lens assembly and the light splitting prism.
  • the first switch includes a first electronically controlled switch or a first liquid crystal light valve; and/or the second switch includes a second electronically controlled switch or a second liquid crystal light valve.
  • the detection component includes a fourth filter layer and a third photosensitive layer;
  • the fourth filter layer includes H fourth filter blocks, and a fourth filter block includes h fourth filter units; at least two of the h fourth filter units allow different wavelength ranges of the third light to pass through, and H and h are both integers greater than 1;
  • the third photosensitive layer includes H third pixel blocks; a third pixel block includes h third pixels, one third pixel block corresponds to one fourth filter block, and one fourth filter unit corresponds to one third pixel.
  • This detection component is compatible with existing image sensors. Moreover, by collecting 2 frames of original images (that is, collecting one frame in the first period and one frame in the second period), an image with a larger field of view and smaller optical distortion can be obtained, which can be applied to scenarios such as video streaming. Moreover, high-capacity image processing bandwidth is not required, helping to save processor computing power.
  • this application provides an optical imaging system, including two optical imaging modules according to the first aspect or any implementation of the first aspect, where the detection components of the two optical imaging modules are connected through a bus.
  • the present application provides a terminal device.
  • the terminal device includes the optical imaging module of the above first aspect or any implementation of the first aspect; or, the terminal device includes the optical imaging system of the above second aspect or any implementation of the second aspect.
  • the terminal device may also include a processor, and the processor may be used to control the imaging of the optical imaging module.
  • Figure 1a is a schematic structural diagram of a fisheye lens in the prior art
  • Figure 1b is a schematic structural diagram of a panoramic ring lens in the prior art
  • Figure 1c is a schematic structural diagram of a panoramic reflection lens in the prior art
  • Figure 2a is a schematic diagram of an application scenario in which an optical imaging module provided by this application is integrated into a smartphone;
  • Figure 2b is a schematic diagram of an application scenario in which an optical imaging module provided by this application is integrated into a smart tablet;
  • Figure 2c is a schematic diagram of an application scenario in which an optical imaging module provided by this application is integrated into a smart bracelet;
  • Figure 2d is a schematic diagram of an application scenario in which an optical imaging module provided by this application is integrated into a camera;
  • Figure 2e is a schematic diagram of an application scenario in which an optical imaging module provided by this application is integrated into a sweeping robot;
  • Figure 2f is a schematic diagram of an application scenario in which an optical imaging module provided by this application is integrated into a vehicle;
  • Figure 2g is a schematic diagram of an application scenario in which an optical imaging module provided by this application is integrated into AR glasses;
  • Figure 2h is a schematic diagram of an application scenario in which an optical imaging module provided by this application is integrated into a roadside unit;
  • Figure 3 is a schematic structural diagram of an optical imaging module provided by this application.
  • Figure 4a is a schematic structural diagram of a first lens assembly provided by this application.
  • Figure 4b is a schematic structural diagram of another first lens assembly provided by this application.
  • Figure 5 is a schematic diagram of the angle between the first optical axis and the second optical axis provided by the present application.
  • Figure 6a is a schematic structural diagram of a third lens assembly provided by the present application.
  • Figure 6b is a schematic structural diagram of another third lens assembly provided by this application.
  • Figure 7a is a schematic diagram of the simulation results of a field curvature provided by this application.
  • Figure 7b is a schematic diagram of the simulation results of optical distortion provided by this application.
  • Figure 8a is a schematic diagram of the simulation results of another field curvature provided by this application.
  • Figure 8b is a schematic diagram of another simulation result of optical distortion provided by this application.
  • Figure 9 is a schematic diagram of a polarizing beam splitter provided by the present application.
  • Figure 10 is a light-splitting schematic diagram of a spectrum splitting element provided by this application;
  • Figure 11 is a light-splitting schematic diagram of another light combining component provided by this application;
  • Figure 12 is a schematic structural diagram of a detection component provided by this application.
  • Figure 13 is a schematic diagram of the corresponding relationship between a first filter block and a second filter block provided by this application;
  • Figure 14a is a schematic diagram of the corresponding relationship between another first filter block and a second filter block provided by this application;
  • Figure 14b is a schematic diagram of a first filter block in a first filter layer provided by this application.
  • Figure 15 is a schematic diagram of the corresponding relationship between another first filter block and a second filter block provided by this application;
  • Figure 16 is a schematic structural diagram of a detection component provided by this application.
  • Figure 17a is a schematic diagram of a third filter layer provided by this application.
  • Figure 17b is a schematic diagram of another third filter layer provided by this application.
  • Figure 18 is a schematic structural diagram of a third image sensor provided by this application.
  • Figure 19a is a schematic diagram of a fourth filter layer provided by this application.
  • Figure 19b is a schematic diagram of another fourth filter layer provided by this application.
  • Figure 19c is a schematic diagram of another fourth filter layer provided by this application.
  • Figure 19d is a schematic diagram of another fourth filter layer provided by this application.
  • Figure 20 is a schematic structural diagram of a detection component provided by this application.
  • Figure 21a is a schematic structural diagram of an optical imaging module provided by this application.
  • Figure 21b is a schematic structural diagram of an optical imaging module provided by this application.
  • Figure 22 is a schematic structural diagram of an optical imaging system provided by this application.
  • Figure 23 is a schematic structural diagram of a terminal device provided by this application.
  • the optical imaging module in this application can be integrated into a terminal device or provided in a component of the terminal device.
  • the terminal device can be, for example, a smartphone (see Figure 2a), a smart tablet (see Figure 2b), a smart bracelet (see Figure 2c), a smart home device (see the camera shown in Figure 2d), smart manufacturing equipment, a game console, a robot (see the sweeping robot shown in Figure 2e), or intelligent transportation equipment (such as automated guided vehicles (AGV) or unmanned transport vehicles, etc.).
  • a smartphone integrated with the optical imaging module can be called an ultra-wide-angle smartphone. When taking photos with an ultra-wide-angle smartphone, the large-field image's understanding of light, shadow and scene can help users choose the best shooting angle and shooting position, so as to obtain better shooting results.
  • large-field images can also be used to help outdoor users relax their attention, allowing outdoor users to focus on the game screen while walking and still avoid dangers.
  • the ultra-wide-angle camera can be used in security monitoring, such as panoramic monitoring or environmental monitoring.
  • the optical imaging module can also be integrated on vehicles (such as unmanned vehicles, smart vehicles, electric vehicles, digital vehicles, etc.) or intelligent transportation equipment as ultra-wide-angle vehicle cameras, as shown in Figure 2f.
  • Ultra-wide-angle vehicle-mounted cameras can obtain measurement information such as the distance of surrounding objects in real time or periodically, thereby providing necessary information for lane correction, vehicle distance maintenance, reversing and other operations.
  • because the ultra-wide-angle vehicle-mounted camera can observe the surrounding environment with the help of large-field imaging (such as a front-view short-focus field >135°, or a surround view >180°), it can achieve: a) target recognition and classification, such as lane line recognition, traffic light recognition, traffic sign recognition, etc.; b) passable space detection (freespace), for example, dividing the safe boundary (drivable area) for the vehicle, mainly with respect to other vehicles, ordinary road edges, curb edges, boundaries with no visible obstacles, unknown boundaries, etc.; c) detection of laterally moving targets, such as the detection and tracking of pedestrians and vehicles crossing intersections; d) positioning and map creation, such as positioning and map creation based on visual simultaneous localization and mapping (SLAM) technology. Ultra-wide-angle vehicle cameras can be used in fields such as unmanned driving, autonomous driving, assisted driving, smart driving or connected cars.
  • the optical imaging module in this application can also be integrated into a near eye display (NED) device (which can be called an ultra-wide-angle NED device).
  • the ultra-wide-angle NED device can be, for example, an augmented reality (AR) device or a virtual reality (VR) device; AR devices may include but are not limited to AR glasses or AR helmets, and VR devices may include but are not limited to VR glasses or VR helmets.
  • Figure 2g takes AR glasses as an example; users can wear AR glasses to play games, watch videos, participate in virtual meetings, live broadcasts, or video shopping, etc.
  • Ultra-wide-angle NED devices can use the illumination information obtained from larger field-of-view images to assist in enhancing the lighting and shadows of generated AR virtual objects, or help NED devices generate multiple interactive virtual objects in a large field-of-view space.
  • the optical imaging module provided in this application can also be applied in a variety of other scenarios, and is not limited to the above example scenarios.
  • optical imaging modules can also be installed on drones as airborne cameras.
  • the optical imaging module can also be installed on roadside traffic equipment (such as a roadside unit (RSU)) as a roadside traffic camera, see Figure 2h, thus enabling intelligent vehicle-road collaboration, etc.
  • the form and position of the optical imaging module are only examples.
  • the optical imaging module can also be set in other possible positions, and the optical imaging module can also be set in other possible forms; this application does not restrict this.
  • the optical imaging module proposed in this application will be described in detail below with reference to Figures 3 to 22.
  • the optical imaging module may include a first lens component, a second lens component, a third lens component, a light combining component and a detection component.
  • the angle θ between the first optical axis of the first lens component and the second optical axis of the second lens component is greater than 0° and less than 180°, or greater than 180° and less than 360°; please refer to Figure 5 below. It can also be understood that the first optical axis and the second optical axis are not parallel (that is, θ is not equal to 180°) and do not overlap (that is, θ is not equal to 0°).
  • the first lens component is used to propagate the first light from the first field of view (or sub-field of view 1) to the light combining component.
  • the second lens component is used to propagate the second light from the second field of view (or sub-field of view 2) to the light combining component.
  • the angle θ between the center line of the first field of view and the center line of the second field of view is the same as the angle θ between the first optical axis and the second optical axis.
  • the light combining component is used to mix the first light and the second light to obtain a third light. It can also be understood that part of the first light and part of the second light are mixed into the third light after passing through the combined light component.
  • the third lens component is used to focus the third light from the light combining component to the detection component.
  • the detection component is used to form an image according to the focused third light. It should be noted that the names of the light combining components in this application are only examples, and the light combining components can also split light first and then mix.
  • the first field of view is different from the second field of view, and the field angle of the first field of view and the field of view angle of the second field of view may be the same or different.
  • the field of view angle of the first field of view (or the full field of view angle of the first field of view) is greater than or equal to 90° and less than or equal to 135°; further, the field of view angle of the first field of view is greater than 98° and less than 130°. And/or, the field of view angle of the second field of view (or the full field of view angle of the second field of view) is greater than or equal to 90° and less than or equal to 135°; further, the field of view angle of the second field of view is greater than 98° and less than 130°.
  • the field of view angle of the first field of view is 90°, 100°, 105°, 110°, 120°, 125°, 130° or 135°, etc.
  • the field of view angle of the second field of view is 90°, 100°, 105°, 110°, 120°, 125°, 130° or 135°, etc.
  • the first field of view partially overlaps with the second field of view. In this way, the synthesis of the first field of view and the second field of view can be achieved (for example, it can be unmixed, separated, re-stitched, etc.).
  • for example, the combined field of view angle of the two fields of view is 190°. For another example, when the field of view angle of the first field of view is 100° and the field of view angle of the second field of view is 120°, the combined field of view angle of the first field of view and the second field of view is 215°, the overlap between the first field of view and the second field of view is 15%, and the minimum field of view angle can be 96°. A sketch of the combination geometry follows.
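  • One plausible reading of the combination geometry (an inference from the description above, not a statement from the patent: each lens contributes half of its field of view on the outer side of its own optical axis):

```python
def combined_fov(axis_angle_deg: float, fov1_deg: float, fov2_deg: float) -> float:
    """Combined span of two sub-fields whose optical axes differ by axis_angle_deg.

    The outermost rays of the two sub-fields are separated by
    axis_angle + fov1/2 + fov2/2.
    """
    return axis_angle_deg + fov1_deg / 2 + fov2_deg / 2

# Two 100 degree sub-fields with optical axes 90 degrees apart:
print(combined_fov(90, 100, 100))  # 190, matching the first example above
```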
  • the first lens component and the second lens component may also be called a front lens, and the third lens component may also be called a rear lens.
  • the third lens assembly is located at a certain distance behind the exit of the light combining assembly. It can be understood that the optical imaging module may not include a third lens component. Based on this, the first lens component is required to focus the first light to the detection component, and the second lens component is required to focus the second light to the detection component.
  • the first light ray of the first field of view can be collected as much as possible into the transmission aperture of the light combining component through the first lens component; the second light ray of the second field of view can be collected as much as possible into the transmission aperture of the light combining component through the second lens component; the light combining component mixes the first light and the second light to obtain the third light, achieving the aliasing of the first field of view and the second field of view; the first light and the second light are coupled to the same detection component, so that by unmixing, separating and re-stitching the smaller first field of view and second field of view, imaging of a very large field of view can be achieved while reducing the optical distortion of the resulting image.
  • the optical distortion is less than 10% and the combined field of view is greater than 190°.
  • the image formed based on the first field of view and the image formed based on the second field of view can be displayed separately after unmixing and separation, so that one of the fields of view can be flexibly selected as the observation field, which can be applied to directional observation or peripheral-vision observation.
  • the optical imaging module includes a first lens component and a second lens component, and may further include a third lens component.
  • the third lens component and the second lens component may be the same as the first lens component, or they may be different.
  • the differences among the third lens assembly, the second lens assembly and the first lens assembly may include, but are not limited to, differences in the number of lenses included and/or in optical parameters.
  • the optical parameters may include but are not limited to the radius, thickness, refractive index, Abbe number, material, etc. of the lens.
  • the first lens component is taken as an example for detailed introduction below.
  • the first lens component includes at least one lens.
  • the lens included in the first lens assembly may be a spherical lens or an aspherical lens. It can also be understood that the first lens assembly may include a single spherical lens, or a single aspherical lens, or a combination of multiple spherical lenses, or a combination of multiple aspherical lenses, or a combination of at least one spherical lens and at least one aspherical lens.
  • the lenses may include, but are not limited to, concave lenses and convex lenses; further, there are many different types of convex lenses and concave lenses; for example, convex lenses include biconvex lenses, plano-convex lenses, meniscus lenses, etc., and concave lenses include biconcave lenses, plano-concave lenses, concave-convex lenses, etc.
  • the first lens assembly formed by a combination of multiple spherical lenses and/or aspherical lenses helps to improve imaging quality and reduce aberrations.
  • the material of the lens in the first lens assembly may be optical materials such as glass, resin, or crystal.
  • when the material of the lens is resin, it helps to reduce the weight of the optical imaging module; when the material of the lens is glass, it helps to further improve the image quality.
  • for example, the first lens assembly includes at least one lens made of glass. It should be understood that when the first lens assembly includes at least three lenses, some of the lenses may be made of resin, some of glass, and some of crystal; or some of the lenses may be made of resin and some of crystal; or some of the lenses may be made of resin and some of glass; or some of the lenses may be made of glass and some of crystal; or all of the lenses may be made of resin; or all of glass; or all of crystal; this application does not limit this.
  • Figure 4a is a schematic structural diagram of a first lens component provided by the present application.
  • the first lens component includes two lenses.
  • the two lenses included in the first lens assembly can also be of other shapes, please refer to Figure 4b.
  • the field of view angle of the first field of view may be 100°.
  • the first lens component may be a component with optical power that can converge the first light rays in the first field of view, that is, the first light rays entering the light combining component are non-parallel rays.
  • in another example, the first lens component is a component without optical power; based on this, the first light ray entering the light combining component is a parallel light ray. Due to engineering constraints and limitations of the receiving aperture of the light combining component, the maximum field of view of the first light entering the light combining component is usually set to ±35°.
  • Figure 5 exemplarily shows the angle θ between the first optical axis of the first lens assembly and the second optical axis of the second lens assembly.
  • the angle θ between the first optical axis and the second optical axis can also be other angles greater than 0° and less than 180°, for example, 45°, 90°, 100°, 120°, 130°, etc.; or, the angle θ between the first optical axis and the second optical axis can also be other angles greater than 180° and less than 360°, for example, 190°, 200°, 240°, etc., which are not listed one by one here.
  • Figure 6a and Figure 6b are schematic structural diagrams of two third lens assemblies provided by this application.
  • the third lens assembly includes three lenses. It should be noted that the third lens assembly may include more or fewer than three lenses. Regarding the lenses, please refer to the aforementioned introduction of the lenses in the first lens assembly, which will not be repeated here.
  • the reference wavelengths used for the field curvature simulation are 486.1 nanometers (nm), 587.5 nm and 656.3 nm. It can be understood that the reference wavelengths used can also be 470 nm, 510 nm, 555 nm, 610 nm, 650 nm, etc.
  • the simulated field curvature results can be seen in Figure 7a below, and the optical distortion can be seen in Figure 7b below. It can be seen from Figure 7a that the field curvature is relatively reasonable. Therefore, the aberrations based on the first lens assembly and the second lens assembly shown in Figure 4a and the third lens assembly shown in Figure 6a are small and can be easily corrected.
  • the reference wavelengths used in the field curvature simulation are 486.1 nanometers (nm), 587.5 nm and 656.3 nm.
  • the simulated field curvature results can be seen in Figure 8a below, and the optical distortion can be seen in Figure 8b below. It can be seen from Figure 8a that the field curvature is relatively reasonable. Therefore, the aberrations based on the first lens assembly and the second lens assembly shown in Figure 4b and the third lens assembly shown in Figure 6b are small and can be easily corrected.
  • the light combining component can mix the first light from the first lens component and the second light from the second lens component to obtain the third light.
  • the first light and the second light are mixed into the third light after passing through the light combining component.
  • the light combining component includes a polarizing light splitting element.
  • the polarization splitting element performs light splitting based on the polarization states of the received first light and the second light.
  • the polarizing beam splitter element may be a polarizing beam splitter (PBS).
  • Figure 9 is a schematic diagram of a polarizing beam splitter provided in this application.
  • the polarizing beam splitter can be made by plating one or more layers of polarizing light splitting film, or etching a metal wire grid, on the inclined surface of a right-angle prism (which can be called the first light combining surface), and then bonding the prisms together through an adhesive layer.
  • the polarizing beam splitter can divide the incident light (including P-polarized light and S-polarized light) into horizontally polarized light (ie, P-polarized light) and vertically polarized light (ie, S-polarized light).
  • P-polarized light passes completely, S-polarized light is reflected at an angle of 45 degrees, and the exit direction of S-polarized light is at an angle of 90 degrees to the exit direction of P-polarized light.
  • a PBS has both transmission and reflection characteristics. Generally, the reflectance for S-polarized light is above 99.5%, and the transmittance for P-polarized light is above 91%. A PBS can achieve a relatively wide working wavelength range of the light splitting film and a relatively high degree of polarization, as sketched numerically below.
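  • A toy numerical model of the light combining behavior (the 99.5% and 91% efficiencies come from the paragraph above; the unit input intensities are made up for illustration):

```python
def pbs_combine(s_in_first: float, p_in_second: float,
                r_s: float = 0.995, t_p: float = 0.91) -> float:
    """Intensity of the third ray leaving the PBS toward the third lens.

    The S component of the first ray is reflected with efficiency r_s and the
    P component of the second ray is transmitted with efficiency t_p; the two
    leave through the same exit port, so their intensities add.
    """
    return r_s * s_in_first + t_p * p_in_second

print(pbs_combine(1.0, 1.0))  # ~1.905 of the 2.0 input units reach the detector
```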
  • the polarization splitting element is used to reflect the first polarized light in the first light ray to the third lens component and transmit the second polarized light in the second light ray to the third lens component; the polarization state of the first polarized light is different from the polarization state of the second polarized light. Further, optionally, the first polarized light and the second polarized light are polarized lights whose polarization states are perpendicular to each other.
  • the first polarized light is S-polarized light
  • the second polarized light is P-polarized light
  • the S-polarized light of the first light is reflected by the first light combining surface of the polarizing beam splitting element to the third lens component, and the P-polarized light of the second light is transmitted by the first light combining surface of the polarization splitting element to the third lens component.
  • the third light ray includes S-polarized light of the first light ray and P-polarized light of the second light ray.
  • the polarizing beam splitting prism given above is only an example, and the polarizing beam splitting element may also be a polarizing beam splitting plate, for example.
  • the polarizing light splitting plate is formed by plating one or more layers of polarizing light splitting film on the surface of a glass plate (which can be called the first light combining surface) or by etching a metal wire grid on it.
  • This application does not limit the specific form of the polarizing light splitting element. All forms that can realize the functions of the polarizing light splitting element in this application are within the protection scope of this application.
  • the optical imaging module may also include a light-absorbing structure that can absorb the P-polarized light of the first light transmitted by the first light combining surface of the polarization splitting element and the S-polarized light of the second light reflected by it, thereby reducing interference from unnecessary light when the optical imaging module is imaging.
  • the light combining component includes a spectrum splitting element.
  • the spectral light splitting element performs light splitting based on the wavelength bands of the received first light and the second light. It can be understood that the colors of light corresponding to different wavelength bands are also different. Therefore, the spectrum splitting element can also be understood as performing light splitting based on the colors of the received first light and the second light.
  • for example, the spectrum splitting element can be plated with a multi-passband spectral film (or called a color separation film) on the second light combining surface. The multi-passband spectral film can separate monochromatic light or narrow-band polychromatic light of different bands from polychromatic light.
  • the wavelength range allowed to be transmitted or reflected by each passband in the multi-passband spectral film can be selected according to actual needs.
  • the spectrum splitting element is used to reflect k light rays of the first wavelength band among the first light rays to the third lens assembly, and transmit k light rays of the second wavelength band among the second light rays to the third lens assembly.
  • a light of the first waveband corresponds to a light of the second waveband
  • k is an integer greater than 1.
  • the k first wave bands have different wavelength ranges
  • the k second wave bands have different wavelength ranges.
  • the third rays include k rays of the first waveband and k rays of the second waveband.
  • Figure 10 is a light-splitting schematic diagram of a spectrum splitting element provided by this application.
  • in Figure 10, the multi-passband spectral film R1R2G1G2B1B2 coated on the second light combining surface of the spectrum splitting element is taken as an example. The multi-passband spectral film R1R2G1G2B1B2 means that the spectrum splitting element reflects the band corresponding to color R1, the band corresponding to color G1 and the band corresponding to color B1 (the reflected bands can be collectively called the first bands), and transmits the band corresponding to color R2, the band corresponding to color G2 and the band corresponding to color B2 (the transmitted bands can be collectively called the second bands).
  • the band corresponding to each color in Figure 10 can be represented by a rectangular block, and the abscissa represents the wavelength.
  • the colors of the three first wavebands of the first light reflected by the spectrum splitting element are represented as R1G1B1, and the colors of the three second wavebands of the second light transmitted by the spectrum splitting element are represented as R2G2B2.
  • the third light ray may include the light ray corresponding to R1G1B1 in the first light ray and the light ray corresponding to R2G2B2 in the second light ray.
  • the first band of color R1 corresponds to the second band of color R2; color R1 and color R2 can synthesize (or approximately synthesize) the color R in a conventional Bayer image sensor, or may be another defined color R. The first band of color G1 corresponds to the second band of color G2; color G1 and color G2 can synthesize (or approximately synthesize) the color G in a conventional Bayer image sensor, or may be another defined color G. The first band of color B1 corresponds to the second band of color B2; color B1 and color B2 can synthesize (or approximately synthesize) the color B in a conventional Bayer image sensor, or may be another defined color B.
  • the center wavelength of the first band may be greater than the center wavelength of the corresponding second band, or the center wavelength of the first band may be less than the center wavelength of the corresponding second band, where "1" in R1G1B1 indicates a relatively short wavelength (λ) and "2" indicates a relatively long wavelength.
  • the multi-passband spectral film on the second light combining surface of the spectrum splitting element can also be R2R1G2G1B2B1 (representing reflection of R2G2B2 and transmission of R1G1B1), or R1R2Y1Y2B1B2 (representing reflection of R1Y1B1 and transmission of R2Y2B2), or C1C2M1M2Y1Y2 (representing reflection of C1M1Y1 and transmission of C2M2Y2); these will not be listed one by one here.
  • the multi-passband spectral film can be selected according to actual needs.
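  • A minimal sketch of the R1R2G1G2B1B2 routing convention; the center wavelengths below are invented for illustration (the patent only fixes the pairing of "1" and "2" bands, not their exact values):

```python
# Hypothetical center wavelengths (nm) for the six passbands of an
# R1R2G1G2B1B2 film on the second light combining surface.
REFLECTED = {"R1": 620, "G1": 530, "B1": 450}    # first bands, taken from the first light
TRANSMITTED = {"R2": 650, "G2": 560, "B2": 480}  # second bands, taken from the second light

def route(band: str) -> str:
    """Which way a band leaves the spectrum splitting element."""
    if band in REFLECTED:
        return "reflected to the third lens (from the first light)"
    if band in TRANSMITTED:
        return "transmitted to the third lens (from the second light)"
    return "absorbed or blocked"

for b in ("R1", "R2", "G1", "G2", "B1", "B2"):
    print(b, "->", route(b))
```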
  • the first light ray may include light rays corresponding to R1R2G1G2B1B2, and the second light rays may also include light rays corresponding to R1R2G1G2B1B2.
  • the optical imaging module may also include a light-absorbing structure that can absorb light corresponding to R2G2B2 in the first light and light corresponding to R1G1B1 in the second light.
  • the light combining component includes a time-division multiplexed light splitting element.
  • the light combining component can also perform light splitting based on time periods. Specifically, the light combining component is specifically configured to reflect the first light to the third lens component during the first period, and transmit the second light to the third lens component during the second period.
  • for example, the time-division multiplexed light splitting element includes a light splitting prism, a first switch (shutter) and a second switch; in other words, it includes a light splitting prism configured with a first switch and a second switch, where the light splitting prism may be an ordinary light splitting prism.
  • Figure 11 is a light-splitting schematic diagram of another light combining component provided by this application.
  • the light combining component includes a first switch, a second switch and a beam splitting prism.
  • the first switch is located between the first lens assembly and the light splitting prism and is used to control whether the first light ray passes through;
  • the second switch is located between the second lens assembly and the light splitting prism and is used to control whether the second light ray passes through.
  • referring to (a) in Figure 11, when the first switch is turned on (a dotted line indicates that a switch is in the on state) and the second switch is turned off (a solid line indicates that a switch is in the off state), the first light can pass through the first switch and be reflected by the light splitting prism to the third lens assembly, while the second light is blocked by the second switch and cannot enter the light splitting prism. Referring to (b) in Figure 11, when the first switch is turned off and the second switch is turned on, the second light can pass through the second switch and be transmitted through the light splitting prism to the third lens assembly, while the first light is blocked by the first switch and cannot enter the light splitting prism. A sketch of this two-period sequencing follows.
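  • A minimal sketch of the two-period capture sequencing, assuming hypothetical driver callbacks set_switches() and read_sensor() (the patent only fixes the alternation, not any API):

```python
from dataclasses import dataclass

@dataclass
class Frame:
    period: int   # 1 => first field of view, 2 => second field of view
    data: bytes   # raw image payload (placeholder)

def capture_pair(set_switches, read_sensor) -> tuple[Frame, Frame]:
    """One time-division cycle: open one switch at a time, grab a frame each."""
    set_switches(first_open=True, second_open=False)   # first period
    f1 = Frame(period=1, data=read_sensor())
    set_switches(first_open=False, second_open=True)   # second period
    f2 = Frame(period=2, data=read_sensor())
    return f1, f2
```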
  • the first switch includes a first electronically controlled switch or a first liquid crystal light valve; and/or the second switch includes a second electronically controlled switch or a second liquid crystal light valve.
  • the light combining component includes a deflectable reflector.
  • in the first period, the reflective surface of the deflectable mirror faces the first lens assembly, reflects the first light to the third lens assembly, and blocks the second light from entering the third lens assembly.
  • the angle between the reflective surface of the deflectable mirror and the first optical axis of the first lens assembly is equal to (90° - θ/2).
  • for example, when θ is equal to 90°, the angle between the reflective surface of the deflectable mirror and the first optical axis of the first lens assembly is equal to 45°, and the reflective surface of the deflectable mirror can reflect all of the first light to the third lens assembly.
  • in the second period, the reflective surface of the deflectable mirror is used to block the first light from entering the third lens assembly and allow the second light to be transmitted to the third lens assembly.
  • for example, the reflective surface of the deflectable mirror is parallel to the second optical axis of the second lens component and perpendicular to the first optical axis of the first lens component.
  • it should be noted that the rotation point of the deflectable reflector cannot be at the intersection of the first optical axis and the second optical axis, but should be located close to the first lens assembly, so as to ensure that during the second period the second light is not blocked and the first light is prevented from entering the third lens assembly. In other words, during the second period the deflectable reflector needs to meet two conditions: 1) allow the second light to enter the third lens assembly; 2) prevent the first light from entering the third lens assembly.
  • the light combining component includes a beam splitter coated with an electrically controlled film.
  • in the first period, the electrically controlled film of the beam splitter is controlled so that the first light is reflected to the third lens assembly and the second light is blocked from entering the third lens assembly; in the second period, the electrically controlled film of the beam splitter is controlled so that the second light is transmitted to the third lens assembly and the first light is blocked from entering the third lens assembly.
  • the optical imaging module may also include a light-absorbing structure that can absorb invalid light at any time.
  • the second light in the first period is invalid light, and the first light in the second period is also invalid light; absorbing them reduces the interference of unnecessary light when the optical imaging module is imaging.
  • the third light ray can be focused on the detection component, and the detection component can form an image based on the focused third light ray. Further, optionally, the detection component can photoelectrically convert the received third light to obtain an electrical signal, and form an image based on the electrical signal.
  • the detection component includes a first image sensor.
  • the first image sensor is a polarization sensor
  • the pixels of the polarization sensor are polarization pixels.
  • for example, the detection component is of type A and the light combining component is of the above-mentioned structure one; that is, the detection component is a polarization sensor and the light combining component is a polarization splitting element.
  • the polarization state that a polarized pixel allows to pass matches a polarization state contained in the third light obtained after the light combining component.
  • the third light ray includes P polarized light and S polarized light
  • the polarized pixel can respond to P polarized light or S polarized light.
  • the detection component is a polarization sensor
  • the polarization sensor includes a first filter layer and a first photosensitive layer.
  • the polarization sensor may further include a second filter layer.
  • the first filter layer is a polarization filter layer
  • the second filter layer is a color filter layer.
  • the color filter layer may be, for example, a color mosaic filter layer. It should be noted that the order of the first filter layer and the second filter layer can be interchanged, that is, the second filter layer is located between the first filter layer and the first photosensitive layer.
  • the first filter layer includes N first filter blocks, the first filter block is the smallest repeatable block of the first filter layer, and the first filter block includes n first filter units, so At least two of the n first filter units allow the third light to pass through different polarization states, and both N and n are integers greater than 1.
  • for example, the first filter block includes two first filter units: one first filter unit allows the P-polarized light in the third light to pass (vertical-line filling indicates that P-polarized light is allowed to pass), and the other first filter unit allows the S-polarized light in the third light to pass (horizontal-line filling indicates that S-polarized light is allowed to pass).
  • the second filter layer includes M second filter blocks, the second filter block is the smallest repeatable block of the second filter layer, and the second filter block includes m second filter units; at least two of the m second filter units allow different wavelength ranges of the third light to pass through, and M and m are both integers greater than 1.
  • for example, the second filter block includes three second filter units, which allow different wavebands of the third light to pass through: R means red light is allowed to pass, G means green light is allowed to pass, and B means blue light is allowed to pass.
  • one second filter unit corresponds to one first filter unit.
  • the first photosensitive layer can divide the first pixel blocks based on the number n of first filter units included in the first filter block and the number m of second filter units included in the second filter block.
  • the m first filter units corresponding to the m second filter units belonging to the same second filter block allow the third light to pass through the same polarization state.
  • for example, the three first filter units in the first filter layer corresponding to the three second filter units RGB included in the second filter block all allow the same polarized light to pass, that is, all allow the P-polarized light in the third light to pass or all allow the S-polarized light to pass. Correspondingly, the first pixel block includes 2×3 first pixels.
  • in another example, the three first filter units in the first filter layer corresponding to the three second filter units RGB included in the second filter block all allow the same polarized light to pass, and the first pixel block includes 3×2 first pixels.
  • one polarization state can correspond to all wavebands, or one waveband can correspond to all polarization states.
  • each first pixel can detect all polarized light (i.e., P-polarized light and S-polarized light) and all wavelength bands of light that are allowed to pass through.
  • the n second filter units corresponding to the n first filter units belonging to the same first filter block allow the third light to pass through the same wavelength range.
  • for example, the first filter block includes four first filter units: one first filter unit allows the P-polarized light in the third light to pass (vertical-line filling indicates that P-polarized light is allowed to pass), another first filter unit allows the S-polarized light in the third light to pass (horizontal-line filling indicates that S-polarized light is allowed to pass), another first filter unit allows both the P-polarized light and the S-polarized light in the third light to pass (left-tilted filling indicates that P-polarized light and S-polarized light are allowed to pass at the same time), and the remaining first filter unit likewise allows both the S-polarized light and the P-polarized light in the third light to pass (right-tilted filling indicates that S-polarized light and P-polarized light are allowed to pass at the same time).
  • the four second filter units corresponding to the four first filter units allow the same wavelength range of the third light to pass through.
  • for example, the second filter layer is a monochromatic filter layer, which only allows light of a certain color in the third light to pass; or it can be a full-color filter layer, which allows white light to pass.
  • the number m of the second filter units included in the second filter block can be understood as equal to 1.
  • the first photosensitive layer divides the first pixel block according to the number n of first filter units included in the first filter block and the number m of second filter units included in the second filter block.
  • for example, when the first filter block includes 4 first filter units and the second filter block includes 4 second filter units, the first pixel block includes 4×4 first pixels; other combinations will not be listed one by one here. In this way, the first light and the second light can completely cover the polarization sensor, thereby improving the utilization of the polarization sensor, as sketched below.
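  • A minimal sketch of unmixing the two fields of view from one polarization-sensor frame, assuming a hypothetical column-alternating P/S layout (the actual filter block layouts are those of Figures 13 to 15):

```python
import numpy as np

# Toy 4x4 raw frame; even columns respond to P (second view),
# odd columns respond to S (first view).
raw = np.arange(16, dtype=float).reshape(4, 4)
p_mask = np.zeros((4, 4), dtype=bool)
p_mask[:, ::2] = True
s_mask = ~p_mask

def unmix(frame: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Keep one polarization channel; fill gaps from a neighboring column."""
    out = np.where(mask, frame, np.nan)
    return np.where(np.isnan(out), np.roll(out, 1, axis=1), out)

view1 = unmix(raw, s_mask)  # image contribution of the first field of view
view2 = unmix(raw, p_mask)  # image contribution of the second field of view
```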
  • the detection component includes a second image sensor.
  • the second image sensor is a spectrum sensor
  • the pixels of the spectrum sensor are spectrum pixels.
  • for example, the light combining component is of the above-mentioned structure two; that is, the detection component is a spectrum sensor and the light combining component is a spectrum splitting element.
  • Figure 16 is a schematic structural diagram of a detection component provided by this application.
  • the detection component is a spectrum sensor, and the spectrum sensor includes a third filter layer and a second photosensitive layer.
  • the third filter layer may be a color filter layer.
  • For the color filter layer please refer to the relevant introduction in type A mentioned above, which will not be described again here.
  • the third filter layer includes Q third filter blocks, Q is a positive integer, the third filter block is the smallest repeatable block of the third filter layer, and the third filter block includes at least 2k third filter blocks.
  • optical unit a third filter unit is used to receive from The third light of the spectrum splitting element is a light of a first wavelength band or a light of a second wavelength band, and the 2k third filter units allow different wavelength bands of the third light to pass through.
  • the number of third filter units in the third filter block equals the number of passbands of the multi-passband spectral film coated on the second light combining surface.
  • for example, the third filter block includes 6 third filter units, which pass the light corresponding to R1, R2, G1, G2, B1 and B2 respectively; the distribution of the six third filter units is shown in Figure 17a.
  • correspondingly, a second pixel block includes 2×3 second pixels.
  • the second photosensitive layer includes Q second pixel blocks; a second pixel block includes at least 2k second pixels, one second pixel block corresponds to one third filter block, and one second pixel corresponds to one third filter unit.
  • the distribution of the six third filter units shown in Figure 17a is only one possible example; any other possible distribution may also be used, which is not limited in this application.
  • taking the multi-passband spectral film R1R2G1G2B1B2 coated on the second light combining surface as an example,
  • the third filter block includes 4×4 third filter units; these 16 third filter units pass the light corresponding to R1, R2, G1, G2, B1 and B2, distributed as shown in Figure 17b.
  • correspondingly, a second pixel block includes 4×4 second pixels, and one second pixel corresponds to one third filter unit.
  • the smallest repeatable third filter block in the third filter layer may also include more third filter units than shown in Figure 17b, which is not limited in this application.
  • the number of third filter units in the third filter block may also differ from the number of passbands of the multi-passband spectral film coated on the second light combining surface.
  • the detection component includes a third image sensor.
  • the third image sensor may be a common image sensor (i.e., a Bayer image sensor).
  • correspondingly, the light combining component has the above-mentioned structure 3.
  • the third image sensor may include a fourth filter layer and a third photosensitive layer, see Figure 18.
  • the fourth filter layer may be a color filter layer.
  • the fourth filter layer includes H fourth filter blocks, and the fourth filter block is the smallest repeatable block of the fourth filter layer.
  • the fourth filter block includes h fourth filter units, at least two of which pass different wavebands of the third light; H and h are both integers greater than 1.
  • Figure 19a is a schematic diagram of a fourth filter layer provided by this application.
  • the fourth filter layer includes 3×3 fourth filter blocks.
  • the fourth filter block includes 2×2 fourth filter units, which can be expressed as RGGB: R indicates that the fourth filter unit passes the red-band light in the third light, G indicates that it passes the green-band light, and B indicates that it passes the blue-band light.
  • alternatively, the 2×2 fourth filter units of the fourth filter block can be expressed as RYYB, where R indicates that the fourth filter unit passes the red-band light in the third light, Y the yellow-band light, and B the blue-band light.
  • the third photosensitive layer includes H third pixel blocks; the third pixel block is the smallest repeatable block of the third photosensitive layer and includes h third pixels; one third pixel block corresponds to one fourth filter block, and one fourth filter unit corresponds to one third pixel.
  • the fourth filter layer can also have other possible distributions (array sketches of these mosaic layouts follow at the end of this list).
  • for example, the fourth filter layer may include 1×3 fourth filter units, expressed as RGB, where R, G and B indicate that the unit passes the red-, green- and blue-band light in the third light respectively; see Figure 19b or Figure 19c.
  • the fourth filter layer may also include 4×4 fourth filter units, see Figure 19d, in which 4 fourth filter units pass the red-band light in the third light, 8 pass the green-band light, and 4 pass the blue-band light.
  • the fourth filter layer may also be a monochromatic filter layer, which passes only light of one color in the third light; or it may be a full-color (black-and-white) filter layer, see Figure 20.
  • the full-color filter layer passes white light.
  • the shapes of the first, second, third and fourth filter layers given above are only examples.
  • the filter units may also have other geometrically symmetric shapes (such as regular hexagons or rectangles).
  • the filter units can be closely arranged to form a filter layer, and the filter layer can be rectangular or square.
  • when the filter unit is square,
  • the filter layer formed by closely arranged units can be square or rectangular;
  • when the filter unit is a regular hexagon,
  • the filter layer formed by closely arranged units is roughly rectangular or square overall, and its edges may be uneven.
  • the optical imaging module may include a first lens component, a second lens component, a light combining component, a third lens component and a detection component.
  • the angle between the first optical axis of the first lens component and the second optical axis of the second lens component equals 90°.
  • the first light of the first field of view propagates through the first lens component to the light combining component,
  • and the second light of the second field of view propagates through the second lens component to the light combining component.
  • the light combining component mixes the first light and the second light
  • to obtain the third light; the third light is focused onto the detection component through the third lens component,
  • and the detection component forms an image based on the focused third light.
  • the first image S1 corresponding to the first field of view and the second image S2 corresponding to the second field of view can be unmixed, separated, calibrated, stitched, etc. in a corresponding manner, and then synthesized into a large-field-of-view image that can be observed directly.
  • the distortion of the synthesized image is controllable: the field of view of the synthesized image is >190°, and the optical distortion is <10%.
  • take the light combining component as the polarization splitting element of the above-mentioned structure 1,
  • the first filter layer in the detection component as shown in Figure 14b above,
  • and the second filter layer as a monochromatic filter layer as an example.
  • the first pixel block includes 2×2 pixels, and each first pixel block in the detection component can detect the first light of the first field of view and the second light of the second field of view.
  • among the four first pixels of a first pixel block, one first pixel can detect S-polarized light,
  • one first pixel can detect P-polarized light,
  • and two first pixels can detect S-polarized light and P-polarized light simultaneously; using the following formula 3, the first image S1 of the first field of view and the second image S2 of the second field of view can be separated out.
  • I0 indicates the detected S-polarized light,
  • I90 indicates the detected P-polarized light,
  • and I135 and I45 each indicate P-polarized light and S-polarized light detected at the same time.
  • Step a, image preprocessing: mainly includes feature point extraction and recognition, intrinsic parameter reading, distortion correction, brightness adjustment, etc.
  • Step b, image registration: mainly includes coordinate transformation and perspective matrix solving.
  • Step c, image synthesis: mainly includes image fusion, boundary processing, etc.
  • Step d, image display: mainly includes data transmission and display.
  • this application can also provide an optical imaging system.
  • Figure 22 shows an optical imaging system provided by this application.
  • the optical imaging system includes two optical imaging modules of any of the above embodiments; further, the two detection components of the two imaging modules can be connected through a bus.
  • the optical imaging system is equivalent to rotating one of the optical imaging modules 180° around the imaging plane of its detection component; it can also be understood that the imaging surfaces (or photosensitive surfaces) of the detection components of the two optical imaging modules face away from each other.
  • the detection components of the two optical imaging modules are connected through a bus, and the data of the two detection components can be output together through the bus, which helps avoid desynchronization between the two detection components, so that synchronization can be achieved in hardware.
  • the terminal device may include at least one processor 2301 and the optical imaging module 2302 of any of the above embodiments. Further, optionally, the terminal device may also include a memory 2303. Memory 2303 is used to store programs or instructions, and the processor calls the programs or instructions to control the imaging of the above-mentioned optical imaging module. Processor 2301 executes instructions stored in a non-transitory computer-readable medium such as memory 2303. For the optical imaging module 2302, refer to the related introduction above, which is not repeated here. Processor 2301 may also be multiple computing devices that control individual components or subsystems of terminal device 2300 in a distributed manner.
  • processor 2301 may be a circuit with signal (or data) processing capability.
  • in one implementation, the processor may be a circuit with instruction reading and execution capability, such as a central processing unit (CPU), a microprocessor, a graphics processing unit (GPU, which can be understood as a kind of microprocessor), or a digital signal processor (DSP); in another implementation, the processor can realize a certain function through the logical relationship of a hardware circuit, and the logical relationship of the hardware circuit is fixed or reconfigurable.
  • for example, the processor is a hardware circuit implemented by an application-specific integrated circuit (ASIC) or a programmable logic device (PLD), such as an FPGA.
  • in a reconfigurable hardware circuit, the process of the processor loading a configuration file to configure the hardware circuit can be understood as the process of the processor loading instructions to realize the functions of some or all of the above units.
  • the processor can also be a hardware circuit designed for artificial intelligence, which can be understood as a kind of ASIC, such as a neural network processing unit (NPU), a tensor processing unit (TPU), or a deep learning processing unit (DPU).
  • although Figure 23 functionally illustrates the processor, the memory and the other elements of the terminal device in the same block, one of ordinary skill in the art will understand that the processor and the memory may actually comprise multiple processors or memories that are not stored within the same physical enclosure.
  • for example, the memory may be a hard disk drive or another storage medium located in a housing different from that of processor 2301.
  • as another example, the processor can also be remote from the terminal device but able to communicate wirelessly with the terminal device.
  • memory 2303 may contain instructions (e.g., program logic) that can be read by processor 2301 to perform various functions of terminal device 2300, including the functions described above. Memory 2303 may also contain additional instructions, including instructions for sending data to, receiving data from, interacting with, and/or controlling other systems of the terminal device. In addition to instructions, memory 2303 can also store data, such as image information acquired by the optical imaging module 2302.
  • the memory can be, for example, a random access memory (RAM), a flash memory, a read-only memory (ROM), a programmable ROM (PROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), a register, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium well known in the art.
  • An exemplary storage medium is coupled to the processor such that the processor can read information from the storage medium and write information to the storage medium.
  • the functional framework of the terminal device shown in Figure 23 is only an example.
  • the terminal device 2300 may include more, fewer or different apparatuses, and each apparatus may include more, fewer or different components.
  • the apparatuses and components shown can be combined or divided in any manner, which is not specifically limited in this application.
  • the terminal device may be a vehicle (such as an unmanned vehicle, a smart vehicle, an electric vehicle, or a digital vehicle), a robot, a mapping device, a drone, a smart home device (such as a television, a sweeping robot, a smart desk lamp, an audio system, an intelligent lighting system, an electrical control system, home background music, a home theater system, an intercom system, or video surveillance), intelligent manufacturing equipment (such as industrial equipment), intelligent transportation equipment (such as an AGV, an unmanned transport vehicle, or a truck), or a smart terminal (a mobile phone, a watch, a computer, a tablet, a PDA, a desktop computer, a headset, a speaker, a wearable device, a vehicle-mounted device, a virtual reality device, an augmented reality device, etc.).
  • "perpendicular" does not mean absolutely perpendicular; a certain engineering tolerance may be allowed.
  • "at least one" means one or more, and "multiple" means two or more.
  • "and/or" describes an association between associated objects and indicates that three relationships are possible; for example, A and/or B can mean: A alone, A and B together, or B alone, where A and B can be singular or plural.
  • "at least one of the following" or similar expressions refer to any combination of these items, including any combination of single items or plural items.
  • for example, at least one of a, b or c can mean: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b and c can each be single or multiple.
  • in the textual descriptions of this application, the character "/" generally indicates an "or" relationship between the associated objects.
  • in the formulas of this application, the character "/" indicates a "division" relationship between the associated objects.
  • the word "exemplarily" is used to mean serving as an example, illustration or explanation; any embodiment or design described as an "example" herein should not be construed as preferred or more advantageous than other embodiments or designs. Rather, the use of the word "example" is intended to present a concept in a concrete manner and does not constitute a limitation on this application.
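As referenced above, the fourth-filter-layer mosaics enumerated in these points can be written down as small arrays and tiled into a full filter layer. The sketch below is illustrative only: the cell placements within each block are assumptions inferred from the textual description, since the figures themselves are not reproduced here.

```python
import numpy as np

# Candidate fourth filter blocks named in the text; the exact placement of
# cells within each block is an assumption, not taken from the figures.
RGGB = np.array([["R", "G"], ["G", "B"]], dtype=object)      # Figure 19a
RYYB = np.array([["R", "Y"], ["Y", "B"]], dtype=object)      # RGGB variant
RGB_STRIPE = np.array([["R", "G", "B"]], dtype=object)       # Figure 19b/19c
# Figure 19d-style 4x4 block: each RGGB cell doubled in both directions,
# giving 4 red, 8 green and 4 blue units per block, as stated above.
QUAD = np.repeat(np.repeat(RGGB, 2, axis=0), 2, axis=1)

def tile_layer(block: np.ndarray, blocks_y: int, blocks_x: int) -> np.ndarray:
    """Tile the smallest repeatable filter block into a full filter layer."""
    return np.tile(block, (blocks_y, blocks_x))

assert (QUAD == "R").sum() == 4 and (QUAD == "G").sum() == 8
print(tile_layer(RGGB, 3, 3))  # the 3x3-block layer of Figure 19a
```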

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Studio Devices (AREA)
  • Lenses (AREA)

Abstract

An optical imaging module, an optical imaging system and a terminal device, used to solve the problem in the prior art that an ultra-large field of view and small optical distortion cannot be achieved at the same time; applicable to fields such as security surveillance, unmanned driving or intelligent driving. The optical imaging module includes: a first lens component for propagating first light of a first field of view to a light combining component, and a second lens component for propagating second light of a second field of view to the light combining component, the first field of view partially overlapping the second field of view, and the angle α between the first optical axis of the first lens component and the second optical axis of the second lens component satisfying 0°<α<180° or 180°<α<360°. The light combining component mixes the first light and the second light to obtain third light. A third lens component focuses the third light onto a detection component. The detection component forms an image based on the focused third light. In this way, imaging with an ultra-large field of view can be achieved while the optical distortion of the formed image is reduced.

Description

Optical imaging module, optical imaging system and terminal device
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims priority to Chinese Patent Application No. 202210407071.8, filed with the China National Intellectual Property Administration on April 18, 2022 and entitled "Optical imaging module, optical imaging system and terminal device", which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
This application relates to the field of optical imaging technology, and in particular to an optical imaging module, an optical imaging system and a terminal device.
BACKGROUND
With the development of science and technology, users have ever higher requirements for the imaging function of optical lenses, for example imaging with an ultra-large field of view. However, conventional optical lenses cannot achieve ultra-wide-field shooting; specially designed optical lenses are required, such as fisheye lenses (see Figure 1a), panoramic reflective lenses (see Figure 1b) or panoramic annular lenses (see Figure 1c).
Although these optical lenses can capture images with an ultra-large field of view, the captured images suffer from large optical distortion, which not only twists the image edges but also degrades the resolving power at the edges. At present, these specially designed optical lenses can achieve ultra-large-field imaging, but the obtained images have large optical distortion. Optical distortion is the ratio of the difference between the real image height (y_chief) and the ideal image height (y_ref) to the ideal image height, expressed by formula 1 below; the ideal image height is given by formula 2.
optical distortion = 100% × (y_chief − y_ref) / y_ref   (formula 1)
y_ref = f × tanθ   (formula 2)
where f is the focal length and θ is half the field of view.
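As a quick numeric illustration of formulas 1 and 2, the sketch below evaluates the optical distortion for hypothetical values of the chief-ray image height, focal length and half field of view; the numbers are made up for illustration and do not come from this application.

```python
import math

def ideal_image_height(f: float, theta_deg: float) -> float:
    """Formula 2: y_ref = f * tan(theta), theta being half the field of view."""
    return f * math.tan(math.radians(theta_deg))

def optical_distortion_percent(y_chief: float, f: float, theta_deg: float) -> float:
    """Formula 1: 100% x (y_chief - y_ref) / y_ref."""
    y_ref = ideal_image_height(f, theta_deg)
    return 100.0 * (y_chief - y_ref) / y_ref

# Hypothetical example: f = 2.0 mm and a 45 deg half field give y_ref = 2.0 mm;
# a real chief-ray height of 1.9 mm then gives -5% optical distortion.
print(optical_distortion_percent(y_chief=1.9, f=2.0, theta_deg=45.0))
```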
In summary, how to achieve imaging with an ultra-large field of view while keeping the optical distortion of the captured image small is a technical problem that urgently needs to be solved.
SUMMARY
This application provides an optical imaging module, an optical imaging system and a terminal device, which can achieve imaging with an ultra-large field of view while reducing the optical distortion of the captured image.
According to a first aspect, this application provides an optical imaging module that includes a first lens component, a second lens component, a third lens component, a light combining component and a detection component, where the angle between the first optical axis of the first lens component and the second optical axis of the second lens component is greater than 0° and less than 180°, or greater than 180° and less than 360°. The first lens component propagates first light from a first field of view to the light combining component; the second lens component propagates second light from a second field of view to the light combining component, the first field of view partially overlapping the second field of view; the light combining component mixes the first light and the second light to obtain third light; the third lens component focuses the third light from the light combining component onto the detection component; and the detection component forms an image based on the focused third light.
Based on the above solution, the first lens component collects as much of the first light of the first field of view as possible into the transmission aperture of the light combining component, and the second lens component does the same for the second light of the second field of view; the light combining component mixes the first light and the second light into the third light, aliasing the first and second fields of view; and the first and second light are coupled onto the same detection component. By unmixing, separating and then stitching the two smaller fields of view, imaging with an ultra-large field of view can be achieved while the optical distortion of the formed image is reduced, for example optical distortion <10% with a combined field of view greater than 190°. Further, the image formed from the first field of view and the image formed from the second field of view can be unmixed, separated and then displayed separately, so that either field of view can be flexibly selected as the observation field, making the module suitable for directional or panoramic observation.
In a possible implementation, the angle between the first optical axis and the second optical axis is equal to 90°. Designing this angle to be 90° helps simplify the design and assembly of the optical imaging module.
In a possible implementation, the field of view of the first field of view is greater than or equal to 90° and less than or equal to 135°; and/or the field of view of the second field of view is greater than or equal to 90° and less than or equal to 135°.
Further, optionally, the field of view of the first field of view is greater than or equal to 98° and less than or equal to 130°; and/or the field of view of the second field of view is greater than or equal to 98° and less than or equal to 130°.
On this basis, the combined field of view synthesized from the first and second fields of view has small optical distortion, and imaging with an ultra-large field of view can be achieved.
In a possible implementation, the first light propagated through the first lens component is parallel or non-parallel light; and/or the second light propagated through the second lens component is parallel or non-parallel light.
Converting the first light of the first field of view into parallel light through the first lens component helps reduce the assembly difficulty between the lens groups; converting it into non-parallel light helps shrink the overall size of the optical imaging module at the design stage.
In a possible implementation, the light combining component includes a polarization splitting element, which reflects the first polarized light in the first light to the third lens component and transmits the second polarized light in the second light to the third lens component, the polarization state of the first polarized light being different from that of the second polarized light.
Through the polarization splitting element, the first polarized light in the first light can be mixed with the second polarized light in the second light to obtain the third light.
Further, optionally, the polarization splitting element includes a first light combining surface coated with a polarization splitting film or etched with a metal wire grid.
In a possible implementation, the detection component includes a first filter layer and a first photosensitive layer. The first filter layer includes N first filter blocks; a first filter block includes n first filter units, at least two of which pass different polarization states of the third light; N and n are both integers greater than 1. The first photosensitive layer includes P first pixel blocks; a first pixel block includes p first pixels, p being greater than or equal to n, and P being a positive integer.
Making the number of first pixels in a first pixel block greater than or equal to the number of first filter units in a first filter block ensures that every first pixel block can detect all the polarized light (i.e., the first and second polarized light). Moreover, when p is greater than n, multiple first pixels can correspond to one first filter unit, i.e., pixel binning, which helps increase the signal-to-noise ratio of the formed image.
In a possible implementation, the detection component further includes a second filter layer. The second filter layer includes M second filter blocks; a second filter block includes m second filter units, at least two of which pass different wavebands of the third light; M and m are both integers greater than 1. One second filter unit corresponds to one first filter unit; either the m first filter units corresponding to the m second filter units of the same second filter block pass the same polarization state of the third light, or the n second filter units corresponding to the n first filter units of the same first filter block pass the same waveband of the third light.
Partitioning the first pixel blocks according to the number n of first filter units in a first filter block and the number m of second filter units in a second filter block ensures that every first pixel block can detect light of all polarization states passed by the first filter layer (i.e., the first and second polarized light) and light of all wavebands passed by the second filter layer. The first and second light can thus completely cover the polarization sensor, improving its utilization. Further, an image with a large field of view and small optical distortion can be obtained from a single raw frame captured by this detection component, making it applicable to scenarios such as streaming video; and no high-capacity image-processing bandwidth is required, which helps save processor computing power.
In a possible implementation, the light combining component is a spectral splitting element, which reflects light of k first wavebands in the first light to the third lens component and transmits light of k second wavebands in the second light to the third lens component, one first waveband corresponding to one second waveband, k being an integer greater than 1.
Through the spectral splitting element, part of the first light (the light of the k first wavebands) can be mixed with part of the second light (the light of the k second wavebands) to obtain the third light.
Further, optionally, the spectral splitting element includes a second light combining surface that includes a multi-passband spectral film.
In a possible implementation, the detection component includes a third filter layer and a second photosensitive layer. The third filter layer includes Q third filter blocks; a third filter block includes at least 2k third filter units, which pass different wavebands of the third light; Q is a positive integer. The second photosensitive layer includes Q second pixel blocks; a second pixel block includes at least 2k second pixels; one second pixel block corresponds to one third filter block, and one second pixel corresponds to one third filter unit.
An image with a large field of view and small optical distortion can be obtained from a single raw frame captured by this detection component, making it applicable to scenarios such as streaming video; and no high-capacity image-processing bandwidth is required, which helps save processor computing power.
In a possible implementation, the light combining component is specifically configured to reflect the first light to the third lens component during a first period, and to transmit the second light to the third lens component during a second period.
Exemplarily, the light combining component includes a beam splitter coated with an electrically controlled film; or a deflectable mirror; or a beam splitting prism, a first shutter and a second shutter, the first shutter being located between the first lens component and the beam splitting prism, and the second shutter between the second lens component and the beam splitting prism. Further, the first shutter includes a first electrically controlled shutter or a first liquid crystal light valve; and/or the second shutter includes a second electrically controlled shutter or a second liquid crystal light valve.
In a possible implementation, the detection component includes a fourth filter layer and a third photosensitive layer. The fourth filter layer includes H fourth filter blocks; a fourth filter block includes h fourth filter units, at least two of which pass different wavebands of the third light; H and h are both integers greater than 1. The third photosensitive layer includes H third pixel blocks; a third pixel block includes h third pixels; one third pixel block corresponds to one fourth filter block, and one fourth filter unit corresponds to one third pixel.
This detection component is compatible with existing image sensors. Moreover, an image with a large field of view and small optical distortion can be obtained by capturing two raw frames (one in the first period and one in the second period), making it applicable to scenarios such as streaming video; and no high-capacity image-processing bandwidth is required, which helps save processor computing power.
According to a second aspect, this application provides an optical imaging system including two optical imaging modules of the first aspect or any implementation thereof, the detection components of the two modules being connected through a bus.
According to a third aspect, this application provides a terminal device including an optical imaging module of the first aspect or any implementation thereof, or an optical imaging system of the second aspect or any implementation thereof.
Further, optionally, the terminal device may also include a processor for controlling the imaging of the optical imaging module.
For the technical effects achievable by any of the second to third aspects, refer to the description of the beneficial effects of the first aspect above, which is not repeated here.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1a is a schematic structural diagram of a fisheye lens in the prior art;
Figure 1b is a schematic structural diagram of a panoramic annular lens in the prior art;
Figure 1c is a schematic structural diagram of a panoramic reflective lens in the prior art;
Figure 2a is a schematic diagram of an application scenario in which an optical imaging module provided by this application is integrated into a smartphone;
Figure 2b is a schematic diagram of an application scenario in which an optical imaging module provided by this application is integrated into a smart tablet;
Figure 2c is a schematic diagram of an application scenario in which an optical imaging module provided by this application is integrated into a smart band;
Figure 2d is a schematic diagram of an application scenario in which an optical imaging module provided by this application is integrated into a camera;
Figure 2e is a schematic diagram of an application scenario in which an optical imaging module provided by this application is integrated into a sweeping robot;
Figure 2f is a schematic diagram of an application scenario in which an optical imaging module provided by this application is integrated into a vehicle;
Figure 2g is a schematic diagram of an application scenario in which an optical imaging module provided by this application is integrated into AR glasses;
Figure 2h is a schematic diagram of an application scenario in which an optical imaging module provided by this application is integrated into a roadside unit;
Figure 3 is a schematic structural diagram of an optical imaging module provided by this application;
Figure 4a is a schematic structural diagram of a first lens component provided by this application;
Figure 4b is a schematic structural diagram of another first lens component provided by this application;
Figure 5 is a schematic diagram of the angle between a first optical axis and a second optical axis provided by this application;
Figure 6a is a schematic structural diagram of a third lens component provided by this application;
Figure 6b is a schematic structural diagram of another third lens component provided by this application;
Figure 7a is a schematic diagram of simulation results of field curvature provided by this application;
Figure 7b is a schematic diagram of simulation results of optical distortion provided by this application;
Figure 8a is a schematic diagram of other simulation results of field curvature provided by this application;
Figure 8b is a schematic diagram of other simulation results of optical distortion provided by this application;
Figure 9 is a schematic diagram of light splitting by a polarizing beam splitter provided by this application;
Figure 10 is a schematic diagram of light splitting by a spectral splitting element provided by this application;
Figure 11 is a schematic diagram of light splitting by another light combining component provided by this application;
Figure 12 is a schematic structural diagram of a detection component provided by this application;
Figure 13 is a schematic diagram of the correspondence between first filter blocks and second filter blocks provided by this application;
Figure 14a is a schematic diagram of another correspondence between first filter blocks and second filter blocks provided by this application;
Figure 14b is a schematic diagram of first filter blocks in a first filter layer provided by this application;
Figure 15 is a schematic diagram of another correspondence between first filter blocks and second filter blocks provided by this application;
Figure 16 is a schematic structural diagram of a detection component provided by this application;
Figure 17a is a schematic diagram of a third filter layer provided by this application;
Figure 17b is a schematic diagram of another third filter layer provided by this application;
Figure 18 is a schematic structural diagram of a third image sensor provided by this application;
Figure 19a is a schematic diagram of a fourth filter layer provided by this application;
Figure 19b is a schematic diagram of another fourth filter layer provided by this application;
Figure 19c is a schematic diagram of another fourth filter layer provided by this application;
Figure 19d is a schematic diagram of another fourth filter layer provided by this application;
Figure 20 is a schematic structural diagram of a detection component provided by this application;
Figure 21a is a schematic structural diagram of an optical imaging module provided by this application;
Figure 21b is a schematic structural diagram of another optical imaging module provided by this application;
Figure 22 is a schematic structural diagram of an optical imaging system provided by this application;
Figure 23 is a schematic structural diagram of a terminal device provided by this application.
DETAILED DESCRIPTION
The embodiments of this application are described in detail below with reference to the accompanying drawings.
The following introduces possible application scenarios of this application. These introductions are intended to help those skilled in the art understand, and do not limit the scope of protection claimed by this application.
In one possible application scenario, the optical imaging module of this application can be integrated into a terminal device or into a part of a terminal device. The terminal device may be, for example, a smartphone (see Figure 2a), a smart tablet (see Figure 2b), a smart band (see Figure 2c), a smart home device (the camera shown in Figure 2d), intelligent manufacturing equipment, a game console, a robot (the sweeping robot shown in Figure 2e), or intelligent transportation equipment (such as an automated guided vehicle (AGV) or an unmanned transport vehicle). Exemplarily, taking the optical imaging module integrated into a smartphone (which may be called an ultra-wide-angle smartphone) as an example, photographing with such a phone can use the large-field image's grasp of light, shadow and scene to help the user choose the best shooting angle and position, thereby obtaining a better shot. Alternatively, when playing games with an ultra-wide-angle smartphone or a game console, the large-field image can help outdoor users release their gaze point, allowing them to watch the game screen while walking and avoiding danger as much as possible. Taking the module integrated into a camera (an ultra-wide-angle camera) as an example, it can be applied to security surveillance, for example panoramic monitoring or environment monitoring.
In another possible application scenario, the optical imaging module can also be integrated into a vehicle (for example an unmanned vehicle, a smart vehicle, an electric vehicle or a digital vehicle) or intelligent transportation equipment as an ultra-wide-angle vehicle-mounted camera, as shown in Figure 2f. Such a camera can obtain measurement information such as distances to surrounding objects in real time or periodically, providing the information needed for lane correction, distance keeping, reversing and other operations. Because the ultra-wide-angle vehicle-mounted camera can observe the surroundings with large-field imaging (for example forward short-focus >135°, surround view >180°), it enables: a) target recognition and classification, such as lane-line recognition, traffic-light recognition and traffic-sign recognition; b) freespace detection, for example dividing the safe driving boundary (drivable area), mainly dividing vehicles, ordinary road edges, curb edges, boundaries with no visible obstacle, and unknown boundaries; c) detection of laterally moving targets, such as detecting and tracking pedestrians and vehicles crossing an intersection; d) localization and map creation, for example based on visual simultaneous localization and mapping (SLAM) technology. The ultra-wide-angle vehicle-mounted camera can be used in fields such as unmanned driving, autonomous driving, assisted driving, intelligent driving or connected vehicles.
In yet another possible application scenario, the optical imaging module of this application can also be integrated into a near eye display (NED) device (an ultra-wide-angle NED device), for example an augmented reality (AR) device or a virtual reality (VR) device; the AR device may include but is not limited to AR glasses or an AR helmet, and the VR device may include but is not limited to VR glasses or a VR helmet. Referring to Figure 2g, taking AR glasses as an example, a user can wear them to play games, watch videos, attend virtual meetings, livestream or shop by video. An ultra-wide-angle NED device can use the illumination information obtained from the large-field image to enhance the lighting and shadows of generated AR virtual objects, or help the NED device generate multiple mutually integrated and interactive virtual objects over a large field of view.
It should be noted that the above application scenarios are only examples; the optical imaging module provided by this application can also be applied in many other scenarios, not limited to those exemplified above. For example, the module can be mounted on a drone as an airborne camera. As another example, it can be mounted on roadside traffic equipment (such as a road side unit (RSU)) as a roadside traffic camera, see Figure 2h, enabling intelligent vehicle-road cooperation. In addition, in the scenarios given above the form and position of the optical imaging module are only examples; it can also be placed at other possible positions and take other possible forms, which this application does not limit.
Based on the above, the optical imaging module proposed by this application is described in detail below with reference to Figures 3 to 22.
As shown in Figure 3, a schematic structural diagram of an optical imaging module provided by this application. The module may include a first lens component, a second lens component, a third lens component, a light combining component and a detection component. The angle α between the first optical axis of the first lens component and the second optical axis of the second lens component is greater than 0° and less than 180°, or greater than 180° and less than 360° (see Figure 5 below); in other words, the first and second optical axes are neither parallel (α ≠ 180°) nor coincident (α ≠ 0°). The first lens component propagates the first light from the first field of view (or sub-field 1) to the light combining component. The second lens component propagates the second light from the second field of view (or sub-field 2) to the light combining component. The angle β between the center line of the first field of view and that of the second field of view equals the angle α between the first and second optical axes. The light combining component mixes the first light and the second light to obtain the third light; that is, part of the first light and part of the second light are mixed inside the light combining component into the third light. The third lens component focuses the third light from the light combining component onto the detection component, and the detection component forms an image from the focused third light. It should be noted that the name "light combining component" is only an example; the component may also split light first and then mix it.
The first field of view differs from the second field of view; their angular sizes may be the same or different. Exemplarily, the field of view of the first field of view (also called its full field angle) is greater than or equal to 90° and less than or equal to 135°, and further may be greater than 98° and less than 130°; and/or the field of view of the second field of view (its full field angle) is greater than or equal to 90° and less than or equal to 135°, and further may be greater than 98° and less than 130°. For example, the field of view of the first field of view is 90°, 100°, 105°, 110°, 120°, 125°, 130° or 135°, and that of the second field of view is 90°, 100°, 105°, 110°, 120°, 125°, 130° or 135°. Further, the first and second fields of view partially overlap, so that they can be combined (for example by unmixing, separation and re-stitching). For example, if 10% of the first field of view overlaps 10% of the second and both are 100°, the combined field of view is 190°. As another example, with a 100° first field of view and a 120° second field of view overlapping by 5°, the combined field of view is 215°. As another example, if the two fields of view are equal and a 15% overlap is required, their minimum field of view can be 96°.
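The combined field of view in the worked examples above follows from simple addition minus the shared overlap; a minimal sketch reproducing the two numeric cases:

```python
def combined_fov(fov1_deg: float, fov2_deg: float, overlap_deg: float) -> float:
    """Field of view after stitching two partially overlapping sub-fields."""
    return fov1_deg + fov2_deg - overlap_deg

print(combined_fov(100.0, 100.0, overlap_deg=0.10 * 100.0))  # 190.0 deg
print(combined_fov(100.0, 120.0, overlap_deg=5.0))           # 215.0 deg
```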
In a possible implementation, the first and second lens components may also be called front lenses, and the third lens component a rear lens. The third lens component is located at a certain distance behind the exit of the light combining component. It is understood that the optical imaging module may also omit the third lens component, in which case the first lens component must focus the first light onto the detection component, and the second lens component must focus the second light onto the detection component.
Based on the above optical imaging module, the first lens component collects as much of the first light of the first field of view as possible into the transmission aperture of the light combining component, and the second lens component does the same for the second light of the second field of view; the light combining component mixes the first light and the second light into the third light, aliasing the first and second fields of view; and the first and second light are coupled onto the same detection component. By unmixing, separating and then stitching the two smaller fields of view, imaging with an ultra-large field of view can be achieved while the optical distortion of the formed image is reduced, for example optical distortion <10% with a combined field of view greater than 190°. Further, the image formed from the first field of view and the image formed from the second field of view can be unmixed, separated and then displayed separately, so that either field of view can be flexibly selected as the observation field, making the module suitable for directional or panoramic observation.
The functional components shown in Figure 3 are each introduced below to give exemplary specific implementations.
1. Lens components
In a possible implementation, the optical imaging module includes a first lens component and a second lens component, and further may include a third lens component. The third and second lens components may be the same as the first lens component, or different; the differences may include but are not limited to the number of lenses and/or the optical parameters, where the optical parameters may include but are not limited to the radius of curvature, thickness, refractive index, Abbe number and material of the lenses.
For ease of description, the first lens component is introduced in detail below as an example.
In a possible implementation, the first lens component includes at least one lens, which may be a spherical lens or an aspherical lens. In other words, the first lens component may include a single spherical lens, a single aspherical lens, a combination of multiple spherical lenses, a combination of multiple aspherical lenses, or a combination of at least one spherical lens and at least one aspherical lens. It is understood that lenses may include but are not limited to concave and convex lenses; further, there are various types of each, for example biconvex, plano-convex and meniscus convex lenses, and biconcave, plano-concave and meniscus concave lenses. A first lens component formed from a combination of multiple spherical and/or aspherical lenses helps improve imaging quality and reduce aberrations.
Further, optionally, the lenses of the first lens component may be made of optical materials such as glass, resin or crystal. Resin lenses help reduce the mass of the optical imaging module; glass lenses help further improve imaging quality. Further, to suppress thermal drift effectively, the first lens component includes at least one glass lens. It should be understood that when the first lens component includes at least three lenses, some lenses may be resin, some glass and some crystal; or some resin and some glass; or some glass and some crystal; or all resin; or all glass; or all crystal; this application does not limit this.
Referring to Figure 4a, a schematic structural diagram of a first lens component provided by this application, taking a first lens component with two lenses as an example. It should be noted that the two lenses may also have other shapes, see Figure 4b; based on the first lens component of Figure 4b, the field of view of the first field of view can be 100°.
In a possible implementation, the first lens component may be a component with optical power and converge the first light of the first field of view, so that the first light entering the light combining component is non-parallel; or it may be a component without optical power, in which case the first light entering the light combining component is parallel. Owing to engineering constraints and the limited receiving aperture of the light combining component, the maximum field of the first light entering the light combining component is usually set to ≤35°.
Taking identical first and second lens components as an example, Figure 5 exemplarily shows the angle α between the first optical axis of the first lens component and the second optical axis of the second lens component, here with α = 90°. It should be noted that α may also be any other value greater than 0° and less than 180°, for example 45°, 90°, 100°, 120° or 130°; or any other value greater than 180° and less than 360°, for example 190°, 200° or 240°; they are not listed one by one here.
Referring to Figures 6a and 6b, schematic structural diagrams of two third lens components provided by this application, each taking three lenses as an example. It should be noted that the third lens component may include more or fewer than three lenses; for the lenses, see the introduction in the first lens component above, which is not repeated here.
Field curvature and optical distortion were simulated with the first and second lens components of Figure 4a and the third lens component of Figure 6a, using reference wavelengths of 486.1 nanometers (nm), 587.5 nm and 656.3 nm (reference wavelengths such as 470 nm, 510 nm, 555 nm, 610 nm and 650 nm could also be used). The simulated field curvature is shown in Figure 7a and the optical distortion in Figure 7b. Figure 7a shows that the field curvature is reasonable; the aberrations of the combination of Figures 4a and 6a are therefore small and easy to correct. Figure 7b shows that the optical distortion of this combination is small, with a maximum below 27%, indicating that the imaging distortion of this optical imaging module is small.
Field curvature and optical distortion were likewise simulated with the first and second lens components of Figure 4b and the third lens component of Figure 6b, using reference wavelengths of 486.1 nm, 587.5 nm and 656.3 nm. The simulated field curvature is shown in Figure 8a and the optical distortion in Figure 8b. Figure 8a shows that the field curvature is reasonable; the aberrations of this combination are small and easy to correct. Figure 8b shows that the optical distortion is small, with a maximum below 14%, indicating that the imaging distortion of this optical imaging module is also small.
2. Light combining component
In a possible implementation, the light combining component mixes the first light from the first lens component and the second light from the second lens component to obtain the third light. In other words, the first and second light are aliased into the third light after passing through the light combining component.
Based on the splitting principle of the light combining component, three possible structures are exemplarily shown below.
Structure 1: the light combining component includes a polarization splitting element.
In a possible implementation, the polarization splitting element splits light based on the polarization states of the received first and second light. Exemplarily, it may be a polarizing beam splitter (PBS). Referring to Figure 9, a schematic diagram of light splitting by a PBS provided by this application. A PBS can be made by coating one or more polarization splitting films on the hypotenuse (which may be called the first light combining surface) of a right-angle prism, or etching a metal wire grid there, and then cementing with an adhesive layer. Using the property that at Brewster-angle incidence the transmittance of P-polarized light (indicated by arrows) is 1 while that of S-polarized light (indicated by dots) is less than 1, after the beam passes through the film multiple times at the Brewster angle, an optical element is obtained in which the P-polarized component is fully transmitted while the vast majority (at least 90%) of the S-polarized component is reflected. Exemplarily, the PBS splits incident light (containing P- and S-polarized light) into horizontally polarized light (P-polarized) and vertically polarized light (S-polarized): the P-polarized light passes completely, the S-polarized light is reflected at a 45° angle, and the exit direction of the S-polarized light is at 90° to that of the P-polarized light. In other words, a PBS has both transmission and reflection characteristics; typically its reflectivity for S-polarized light exceeds 99.5% and its transmittance for P-polarized light exceeds 91%. With a PBS, the splitting film works over a relatively wide wavelength range with a relatively high degree of polarization.
Specifically, the polarization splitting element reflects the first polarized light in the first light to the third lens component and transmits the second polarized light in the second light to the third lens component, the polarization state of the first polarized light being different from that of the second. Further, optionally, the first and second polarized light have mutually perpendicular polarization states. Exemplarily, the first polarized light is S-polarized and the second P-polarized: the S-polarized part of the first light is reflected by the first light combining surface of the polarization splitting element to the third lens component, and the P-polarized part of the second light is transmitted through it to the third lens component. It is understood that the third light includes the S-polarized part of the first light and the P-polarized part of the second light.
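The role of the first light combining surface can be summarized in a few lines of code. This is an idealized sketch (perfect splitting ratios, no losses), not a model of a real PBS:

```python
import numpy as np

def pbs_combine(first_s: np.ndarray, first_p: np.ndarray,
                second_s: np.ndarray, second_p: np.ndarray):
    """Idealized polarization splitting element of structure 1: the S-polarized
    part of the first light is reflected toward the third lens component and
    the P-polarized part of the second light is transmitted toward it; the
    other two components leave the useful path (to be absorbed, see below)."""
    third_s = first_s                  # from the first field of view
    third_p = second_p                 # from the second field of view
    absorbed = (first_p, second_s)     # handled by the light absorbing structure
    return third_s, third_p, absorbed
```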
It is understood that the polarization splitting prism given above is only an example; the polarization splitting element may also be, for example, a polarization splitting plate, formed by coating one or more polarization splitting films on the surface (the first light combining surface) of a glass plate or etching a metal wire grid on it. This application does not limit the specific form of the polarization splitting element; any form that can realize the function of the polarization splitting element in this application falls within the protection scope of this application.
It should be noted that the optical imaging module may also include a light absorbing structure, which absorbs the P-polarized part of the first light transmitted through the first light combining surface and the S-polarized part of the second light reflected by it, reducing interference from unnecessary light during imaging.
Structure 2: the light combining component includes a spectral splitting element.
In a possible implementation, the spectral splitting element splits light based on the wavebands of the received first and second light. It is understood that light of different wavebands has different colors, so the spectral splitting element can also be understood as splitting based on the colors of the received first and second light. Exemplarily, the element can be made by coating a multi-passband spectral film (or dichroic film) on the second splitting interface; such a film can separate monochromatic light or narrow-band polychromatic light of different wavebands from polychromatic light. The wavelength ranges that each passband of the multi-passband spectral film transmits or reflects can be chosen according to actual needs.
Specifically, the spectral splitting element reflects light of k first wavebands in the first light to the third lens component and transmits light of k second wavebands in the second light to the third lens component, one first waveband corresponding to one second waveband, k being an integer greater than 1. Further, the k first wavebands have different wavelength ranges, as do the k second wavebands. It is understood that the third light includes the light of the k first wavebands and the light of the k second wavebands.
Referring to Figure 10, a schematic diagram of light splitting by a spectral splitting element provided by this application, taking a multi-passband spectral film R1R2G1G2B1B2 coated on the second light combining surface as an example. The film R1R2G1G2B1B2 means the spectral splitting element reflects the wavebands corresponding to colors R1, G1 and B1 (the reflected wavebands may collectively be called the first wavebands) and transmits the wavebands corresponding to colors R2, G2 and B2 (collectively the second wavebands). In Figure 10 each color's waveband is represented by a rectangular block, with wavelength on the horizontal axis. In other words, the spectral splitting element reflects the three first wavebands of the first light, denoted R1G1B1, and transmits the three second wavebands of the second light, denoted R2G2B2. The third light thus includes the R1G1B1 light of the first light and the R2G2B2 light of the second light. Further, the first waveband of color R1 corresponds to the second waveband of color R2; R1 and R2 can synthesize (or approximately synthesize) the color R of a conventional Bayer image sensor, or some other defined color R; likewise G1 with G2 for color G, and B1 with B2 for color B. Further, optionally, the center wavelength of a first waveband may be greater or smaller than that of the corresponding second waveband, where the "1" in R1G1B1 denotes the relatively short wavelength (λ) and "2" the relatively long one.
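In code form, the mixing performed by the R1R2G1G2B1B2 film amounts to picking complementary band sets from the two inputs; a minimal sketch under the same example:

```python
# Bands reflected from the first light and transmitted from the second light
# by the R1R2G1G2B1B2 film of Figure 10.
REFLECTED = ("R1", "G1", "B1")     # first wavebands
TRANSMITTED = ("R2", "G2", "B2")   # second wavebands

def spectral_combine(first_light: dict, second_light: dict) -> dict:
    """Inputs map waveband name -> intensity; the third light keeps the
    reflected bands of the first light and the transmitted bands of the
    second light."""
    third = {band: first_light[band] for band in REFLECTED}
    third.update({band: second_light[band] for band in TRANSMITTED})
    return third
```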
It is understood that the multi-passband spectral film on the second splitting interface of the spectral splitting element may also be R2R1G2G1B2B1 (reflecting R2G2B2 and transmitting R1G1B1), or R1R2Y1Y2B1B2 (reflecting R1Y1B1 and transmitting R2Y2B2), or C1C2M1M2Y1Y2 (reflecting C1M1Y1 and transmitting C2M2Y2); they are not listed one by one here. This application does not limit which wavebands the spectral splitting element transmits or reflects; the multi-passband spectral film can be chosen according to actual needs.
It should be noted that the first light may include light corresponding to R1R2G1G2B1B2, and so may the second light. The optical imaging module may also include a light absorbing structure, which absorbs the R2G2B2 light in the first light and the R1G1B1 light in the second light.
Structure 3: the light combining component includes a time-division multiplexed splitting element.
In a possible implementation, the light combining component may also split light by time period. Specifically, the light combining component reflects the first light to the third lens component during a first period, and transmits the second light to the third lens component during a second period.
Three possible time-division multiplexed splitting elements are exemplarily shown below.
Structure 3.1: the time-division multiplexed splitting element includes a beam splitting prism, a first shutter and a second shutter.
In other words, it includes a beam splitting prism configured with a first shutter and a second shutter; the prism is an ordinary beam splitting prism. Referring to Figure 11, a schematic diagram of light splitting by another light combining component provided by this application. The first shutter, located between the first lens component and the prism, controls whether the first light passes; the second shutter, located between the second lens component and the prism, controls whether the second light passes. Referring to (a) in Figure 11, during the first period the first shutter is open (a dashed line indicates an open shutter) and the second shutter is closed (a solid line indicates a closed shutter); the first light passes through the first shutter and is reflected by the prism to the third lens component, while the second light is blocked by the second shutter and cannot enter the prism. Referring to (b) in Figure 11, during the second period the first shutter is closed and the second is open; the second light passes through the second shutter and is transmitted through the prism to the third lens component, while the first light is blocked by the first shutter and cannot enter the prism.
In a possible implementation, the first shutter includes a first electrically controlled shutter or a first liquid crystal light valve; and/or the second shutter includes a second electrically controlled shutter or a second liquid crystal light valve.
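The timing of structure 3.1 can be sketched as two exposures with complementary shutter states. The `expose` callback stands in for the sensor read-out and is an assumption for illustration, not an interface from this application:

```python
from dataclasses import dataclass

@dataclass
class Shutter:
    is_open: bool = False

def capture_two_periods(first_shutter: Shutter, second_shutter: Shutter, expose):
    """Structure 3.1: only the first field of view reaches the sensor in the
    first period, only the second field of view in the second period."""
    first_shutter.is_open, second_shutter.is_open = True, False   # period 1
    frame_fov1 = expose()
    first_shutter.is_open, second_shutter.is_open = False, True   # period 2
    frame_fov2 = expose()
    return frame_fov1, frame_fov2
```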
Structure 3.2: the light combining component includes a deflectable mirror.
In a possible implementation, during the first period the reflective surface of the deflectable mirror faces the first lens component, reflecting the first light to the third lens component and blocking the second light from entering it. Specifically, the angle between the reflective surface and the first optical axis of the first lens component equals (90° − α/2); for example, when α = 90° this angle equals 45°, and the reflective surface reflects all the first light to the third lens component. During the second period the reflective surface blocks the first light from entering the third lens component and allows the second light to be transmitted to it; for example, when α = 90°, the reflective surface is parallel to the second optical axis of the second lens component and perpendicular to the first optical axis of the first lens component. It should be noted that the rotation point of the deflectable mirror must not be at the intersection of the first and second optical axes, but near the first lens component, to ensure that during the second period the second light is not blocked while the first light is prevented from entering the third lens component. In other words, during the second period the deflectable mirror must satisfy two conditions: 1) it allows the second light into the third lens component; 2) it prevents the first light from entering the third lens component.
Structure 3.3: the light combining component includes a beam splitter coated with an electrically controlled film.
During the first period, the electrically controlled film of the beam splitter is controlled to reflect the first light to the third lens component and block the second light from entering it; during the second period, the film is controlled to transmit the second light to the third lens component and block the first light from entering it.
It should be noted that the optical imaging module may also include a light absorbing structure, which absorbs the invalid light at any moment — the second light is invalid during the first period, and the first light is invalid during the second period — reducing interference from unnecessary light during imaging.
3. Detection component
In a possible implementation, the third light can be focused on the detection component, and the detection component forms an image from the focused third light. Further, optionally, the detection component photoelectrically converts the received third light into an electrical signal and forms the image from the electrical signal.
Based on the coding scheme of the detection component, three possible types are exemplarily shown below.
Type A: the detection component includes a first image sensor.
In a possible implementation, the first image sensor is a polarization sensor whose pixels are polarization pixels. If the detection component is of type A, the light combining component has structure 1; in other words, the detection component is a polarization sensor and the light combining component a polarization splitting element. Further, the polarization transmittance of a polarization pixel matches that of the third light obtained after the light combining component; for example, the third light includes P- and S-polarized light, and a polarization pixel responds to P- or S-polarized light.
Referring to Figure 12, a schematic structural diagram of a detection component provided by this application: a polarization sensor including a first filter layer and a first photosensitive layer, and further possibly a second filter layer. The first filter layer is a polarization filter layer, and the second filter layer is a color filter layer, for example a color mosaic filter layer. It should be noted that the order of the first and second filter layers can be swapped, i.e., the second filter layer may lie between the first filter layer and the first photosensitive layer.
The first filter layer includes N first filter blocks, a first filter block being the smallest repeatable block of the first filter layer and including n first filter units, at least two of which pass different polarization states of the third light; N and n are both integers greater than 1. Referring to (1) in Figure 13 or (1) in Figure 14a, the first filter block includes two first filter units: one passes the P-polarized light in the third light (vertical-line fill indicates that P-polarized light is allowed through) and the other passes the S-polarized light (horizontal-line fill). Further, optionally, the second filter layer includes M second filter blocks, a second filter block being the smallest repeatable block of the second filter layer and including m second filter units, at least two of which pass different wavebands of the third light; M and m are both integers greater than 1. Referring to (2) in Figure 13 or (2) in Figure 14a, the second filter block includes three filter units passing mutually different wavebands of the third light: R passes red light, G green light and B blue light. One second filter unit corresponds to one first filter unit.
To ensure that every first pixel block can detect light of all polarization states and all wavebands, the first photosensitive layer can partition its first pixel blocks according to the number n of first filter units in a first filter block and the number m of second filter units in a second filter block, for example p = n × m first pixels per first pixel block, a first pixel block being the smallest repeatable block of the first photosensitive layer.
In a possible implementation, the m first filter units corresponding to the m second filter units of the same second filter block pass the same polarization state of the third light. Combining (1) and (2) in Figure 13, the three first filter units corresponding to the three second filter units RGB of a second filter block all pass either the P-polarized light of the third light or the S-polarized light; correspondingly, the first pixel block includes 2×3 first pixels. Combining (1) and (2) in Figure 14a, the same correspondence holds and the first pixel block includes 3×2 first pixels. In this way, one polarization state can correspond to all wavebands, or one waveband to all polarization states; in other words, every first pixel block can detect all the polarized light (i.e., P- and S-polarized light) and all the passed wavebands of light.
Alternatively, the n second filter units corresponding to the n first filter units of the same first filter block pass the same waveband of the third light. Referring to Figure 14b, the first filter block includes four first filter units: one passes the P-polarized light of the third light (vertical-line fill), one the S-polarized light (horizontal-line fill), one both P- and S-polarized light (left-tilted fill indicates that P- and S-polarized light are allowed through simultaneously), and one likewise both S- and P-polarized light (right-tilted fill). Further, the four second filter units corresponding to these four first filter units pass the same waveband of the third light; for example, the second filter layer is a monochromatic filter layer, which passes only light of one color in the third light, or a full-color filter layer, which passes white light. On this basis, the number m of second filter units in a second filter block can be taken as equal to 1, and the number of pixels in a first pixel block is p = n × 1.
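The Figure 14b block can be encoded as a 2×2 array of polarization pass states and tiled into a full first filter layer. The placement of the four states within the block below is an assumption; the figure fixes only the set of four units:

```python
import numpy as np

# "P" passes P-polarized light, "S" passes S-polarized light,
# "PS" passes both (the two tilted fills of Figure 14b).
FIRST_FILTER_BLOCK = np.array([["S",  "PS"],
                               ["PS", "P"]], dtype=object)

def build_first_filter_layer(blocks_y: int, blocks_x: int) -> np.ndarray:
    """Tile the smallest repeatable block into a polarization filter layer."""
    return np.tile(FIRST_FILTER_BLOCK, (blocks_y, blocks_x))

print(build_first_filter_layer(2, 2))
```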
It should be noted that partitioning the first pixel blocks according to the number n of first filter units in a first filter block and the number m of second filter units in a second filter block can also be done in other possible ways. Referring to (1) and (2) in Figure 15, the first filter block includes 4 first filter units, the second filter block includes 4 second filter units, and the first pixel block includes 4×4 first pixels; other cases are not listed one by one here. In this way, the first light and the second light can completely cover the polarization sensor, improving its utilization.
Type B: the detection component includes a second image sensor.
In a possible implementation, the second image sensor is a spectral sensor whose pixels are spectral pixels. Correspondingly, the light combining component has structure 2; in other words, the detection component is a spectral sensor and the light combining component a spectral splitting element.
Referring to Figure 16, a schematic structural diagram of a detection component provided by this application: a spectral sensor including a third filter layer and a second photosensitive layer. The third filter layer may be a color filter layer; for the color filter layer, see the relevant introduction in type A above, which is not repeated here.
The third filter layer includes Q third filter blocks, Q being a positive integer; a third filter block is the smallest repeatable block of the third filter layer and includes at least 2k third filter units, one third filter unit receiving light of one first waveband or one second waveband in the third light from the spectral splitting element, and the 2k third filter units passing different wavebands of the third light. In a possible implementation, the number of third filter units in a third filter block equals the number of passbands of the multi-passband spectral film coated on the second light combining surface. For example, for the film R1R2G1G2B1B2 above, the third filter block includes 6 third filter units, which pass the light corresponding to R1, R2, G1, G2, B1 and B2 respectively, distributed as shown in Figure 17a; correspondingly, a second pixel block includes 2×3 pixels. Further, the second photosensitive layer includes Q second pixel blocks, a second pixel block including at least 2k second pixels; one second pixel block corresponds to one third filter block, and one second pixel to one third filter unit. It should be noted that the distribution of the six third filter units in Figure 17a is only one possible example; any other possible distribution may be used, which this application does not limit.
Taking the film R1R2G1G2B1B2 coated on the second light combining surface, the third filter block may include 4×4 third filter units; these 16 third filter units pass the light corresponding to R1, R2, G1, G2, B1 and B2, distributed as shown in Figure 17b; correspondingly, a second pixel block includes 4×4 second pixels, one second pixel per third filter unit. It should be noted that the smallest repeatable third filter block in the third filter layer may also include more third filter units than in Figure 17b, which this application does not limit. Moreover, the number of third filter units in a third filter block may differ from the number of passbands of the multi-passband spectral film coated on the second light combining surface.
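A spectral mosaic of this kind is convenient to handle as a tiled label array from which per-band pixel masks are extracted. The 2×3 ordering below is an assumed layout for the six passbands, since Figure 17a is not reproduced here:

```python
import numpy as np

THIRD_FILTER_BLOCK = np.array([["R1", "G1", "B1"],
                               ["R2", "G2", "B2"]], dtype=object)

def band_mask(layer: np.ndarray, band: str) -> np.ndarray:
    """Boolean mask of the second pixels behind one waveband's filter units,
    e.g. to pull the R1 plane out of a raw spectral frame."""
    return layer == band

layer = np.tile(THIRD_FILTER_BLOCK, (4, 4))  # an 8x12 third filter layer
print(band_mask(layer, "R1").sum())          # 16 pixels receive R1 light
```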
Based on the detection components of types A and B above, an image with a large field of view and small optical distortion can be obtained by capturing a single raw frame, making them applicable to scenarios such as streaming video; no high-capacity image-processing bandwidth is required, which helps save processor computing power.
Type C: the detection component includes a third image sensor.
In a possible implementation, the third image sensor may be a common image sensor (i.e., a Bayer image sensor). Correspondingly, the light combining component has structure 3 above.
In a possible implementation, the third image sensor may include a fourth filter layer and a third photosensitive layer, see Figure 18. The fourth filter layer may be a color filter layer. The fourth filter layer includes H fourth filter blocks, a fourth filter block being the smallest repeatable block of the fourth filter layer and including h fourth filter units, at least two of which pass different wavebands of the third light; H and h are both integers greater than 1. Referring to Figure 19a, a schematic diagram of a fourth filter layer provided by this application, taking 3×3 fourth filter blocks as an example; the fourth filter block includes 2×2 fourth filter units, which can be expressed as RGGB, where R indicates that the fourth filter unit passes the red-band light of the third light, G the green-band light, and B the blue-band light. It should be noted that the 2×2 fourth filter units of the fourth filter block can also be expressed as RYYB, where R indicates that the fourth filter unit passes the red-band light of the third light, Y the yellow-band light, and B the blue-band light. Further, the third photosensitive layer includes H third pixel blocks, a third pixel block being the smallest repeatable block of the third photosensitive layer and including h third pixels; one third pixel block corresponds to one fourth filter block, and one fourth filter unit to one third pixel.
It should be noted that the fourth filter layer may also have other possible distributions. For example, it may include 1×3 fourth filter units, expressed as RGB, where R, G and B indicate that the unit passes the red-, green- and blue-band light of the third light respectively; see Figure 19b or Figure 19c. As another example, it may include 4×4 fourth filter units, see Figure 19d, in which 4 fourth filter units pass the red-band light of the third light, 8 pass the green light, and 4 pass the blue light.
It should be noted that the fourth filter layer may, for example, also be a monochromatic filter layer, which passes only light of one color in the third light; or a full-color (black-and-white) filter layer, see Figure 20, where the full-color filter layer passes white light.
Based on the type C detection component above, an image with a large field of view and small optical distortion can be obtained by capturing two raw frames (one in the first period and one in the second period), making it applicable to scenarios such as streaming video; no high-capacity image-processing bandwidth is required, which helps save processor computing power.
It is understood that the shapes of the first, second, third and fourth filter layers given above are only examples. The filter units may also have other geometrically symmetric shapes (such as regular hexagons or rectangles); they can be closely arranged into a filter layer that is rectangular or square. When the filter unit is square, the closely arranged layer can be square or rectangular; when it is a regular hexagon, the closely arranged layer is roughly rectangular or square overall, and its edges may be uneven.
Based on the above, two specific implementations of the optical imaging module are given below with concrete hardware structures, to facilitate further understanding of the structure of the optical imaging module and of its imaging process. It should be noted that, absent special notes or logical conflicts, the modules given above can be combined according to their internal logical relationships to form other possible optical imaging modules.
As shown in Figures 21a and 21b, schematic structural diagrams of two optical imaging modules provided by this application. The optical imaging module may include a first lens component, a second lens component, a light combining component, a third lens component and a detection component, here taking the angle between the first optical axis of the first lens component and the second optical axis of the second lens component equal to 90° as an example. The first light of the first field of view propagates through the first lens component to the light combining component, and the second light of the second field of view propagates through the second lens component to the light combining component; the light combining component aliases the first and second light to obtain the third light, the third light is focused by the third lens component onto the detection component, and the detection component forms an image from the focused third light. For detailed introductions of the first lens component, second lens component, light combining component, third lens component and detection component, see the relevant descriptions above, which are not repeated here.
After an image is obtained with the above optical imaging module, the first image S1 corresponding to the first field of view and the second image S2 corresponding to the second field of view can be unmixed, separated, calibrated and stitched in a corresponding manner, and then synthesized into a large-field image that can be observed directly. For example, the distortion of the synthesized image is controllable, its field of view is >190°, and its optical distortion is <10%.
For ease of description, take the light combining component as the polarization splitting element of structure 1 above, the first filter layer of the detection component as in Figure 14b above, the second filter layer as a monochromatic filter layer, and the first pixel block as including 2×2 pixels; every first pixel block in the detection component can then detect the first light of the first field of view and the second light of the second field of view. Among the four first pixels of a first pixel block, one first pixel can detect S-polarized light, one can detect P-polarized light, and two can detect S- and P-polarized light simultaneously; using formula 3 below, the first image S1 of the first field of view and the second image S2 of the second field of view can be separated out.
where I0 denotes the detected S-polarized light, I90 the detected P-polarized light, I135 the simultaneously detected P- and S-polarized light, and I45 likewise the simultaneously detected P- and S-polarized light.
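Formula 3 itself is not reproduced in this text, so the following is only one plausible reconstruction of the separation it performs, based on the channel definitions above (I0 sees the S-polarized first field of view, I90 the P-polarized second field of view, and I45/I135 see both); treat it as an assumption rather than the application's formula:

```python
import numpy as np

def separate_fields(i0, i90, i45, i135):
    """Per-block separation of the two sub-field images from the four
    polarization channels of a 2x2 first pixel block (assumed mapping)."""
    s1 = np.asarray(i0, dtype=float)     # first image S1 (S-polarized light)
    s2 = np.asarray(i90, dtype=float)    # second image S2 (P-polarized light)
    mixed = 0.5 * (np.asarray(i45, dtype=float) + np.asarray(i135, dtype=float))
    residual = mixed - 0.5 * (s1 + s2)   # deviation from ideal mixing
    return s1, s2, residual
```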
Further, processing the first image S1 and the second image S2 as follows generates a large-field image that the user can view directly (a code sketch of these four steps follows step d below).
Step a, image preprocessing: mainly includes feature point extraction and recognition, intrinsic parameter reading, distortion correction, brightness adjustment, etc.
Step b, image registration: mainly includes coordinate transformation and perspective matrix solving.
Step c, image synthesis: mainly includes image fusion, boundary processing, etc.
Step d, image display: mainly includes data transmission and display.
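A compact sketch of steps a through d using common OpenCV primitives is given below; it assumes 8-bit BGR inputs, simplifies fusion to a plain paste (real boundary processing would blend the seam), and is an illustration of the flow rather than the application's exact processing:

```python
import cv2
import numpy as np

def stitch_pair(s1, s2, camera_matrix=None, dist_coeffs=None):
    # Step a: preprocessing - optional distortion correction from intrinsics.
    if camera_matrix is not None and dist_coeffs is not None:
        s1 = cv2.undistort(s1, camera_matrix, dist_coeffs)
        s2 = cv2.undistort(s2, camera_matrix, dist_coeffs)
    # Step a: feature point extraction and matching on grayscale copies.
    g1 = cv2.cvtColor(s1, cv2.COLOR_BGR2GRAY)
    g2 = cv2.cvtColor(s2, cv2.COLOR_BGR2GRAY)
    sift = cv2.SIFT_create()
    k1, d1 = sift.detectAndCompute(g1, None)
    k2, d2 = sift.detectAndCompute(g2, None)
    matches = cv2.BFMatcher().knnMatch(d1, d2, k=2)
    good = [m for m, n in matches if m.distance < 0.75 * n.distance]
    # Step b: registration - solve the perspective (homography) matrix.
    src = np.float32([k1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([k2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    h_mat, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    # Step c: synthesis - warp S1 into S2's frame and paste S2 over it.
    h, w = s2.shape[:2]
    canvas = cv2.warpPerspective(s1, h_mat, (2 * w, h))
    canvas[0:h, 0:w] = s2
    # Step d (data transmission and display) is left to the caller.
    return canvas
```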
Based on the architecture and functional principles of the optical imaging module described above, this application can also provide an optical imaging system. Referring to Figure 22, an optical imaging system provided by this application. The system includes two optical imaging modules of any of the above embodiments; further, the two detection components of the two imaging modules can be connected through a bus.
In a possible implementation, the optical imaging system is equivalent to rotating one of the two optical imaging modules 180° around the imaging plane of its detection component; it can also be understood that the imaging surfaces (or photosensitive surfaces) of the detection components of the two optical imaging modules face away from each other.
Based on this optical imaging system, 360° imaging can be achieved. Moreover, since the detection components of the two optical imaging modules are connected through a bus, the data of the two detection components can be output together over the bus, which helps avoid desynchronization between them, so that synchronization is achieved in hardware.
Based on the architecture and functional principles of the optical imaging module described above, this application can also provide a terminal device. Referring to Figure 23, the terminal device may include at least one processor 2301 and the optical imaging module 2302 of any of the above embodiments. Further, optionally, the terminal device may also include a memory 2303, used to store programs or instructions; the processor calls the programs or instructions to control the imaging of the above optical imaging module. Processor 2301 executes instructions stored in a non-transitory computer-readable medium such as memory 2303. For the optical imaging module 2302, see the relevant introduction above, which is not repeated here. Processor 2301 may also be multiple computing devices that control individual components or subsystems of terminal device 2300 in a distributed manner.
Processor 2301 may be a circuit with signal (or data) processing capability. In one implementation, the processor may be a circuit with instruction reading and execution capability, such as a central processing unit (CPU), a microprocessor, a graphics processing unit (GPU, which can be understood as a kind of microprocessor), or a digital signal processor (DSP). In another implementation, the processor can realize a certain function through the logical relationship of a hardware circuit, which is fixed or reconfigurable, for example a hardware circuit implemented as an application-specific integrated circuit (ASIC) or a programmable logic device (PLD), such as an FPGA. In a reconfigurable hardware circuit, the process of the processor loading a configuration file to configure the hardware circuit can be understood as the processor loading instructions to realize the functions of some or all of the above units. In addition, the processor may be a hardware circuit designed for artificial intelligence, which can be understood as a kind of ASIC, such as a neural network processing unit (NPU), a tensor processing unit (TPU), or a deep learning processing unit (DPU). Although Figure 23 functionally illustrates the processor, the memory and the other elements in the same block, those of ordinary skill in the art will understand that the processor and the memory may actually comprise multiple processors or memories not stored within the same physical enclosure; for example, the memory may be a hard disk drive or another storage medium located in a housing different from that of processor 2301. As another example, the processor may be remote from the terminal device but able to communicate with it wirelessly.
In some embodiments, memory 2303 may contain instructions (e.g., program logic) readable by processor 2301 to perform various functions of terminal device 2300, including the functions described above. Memory 2303 may also contain additional instructions, including instructions to send data to, receive data from, interact with and/or control other systems of the terminal device. Besides instructions, memory 2303 may also store data, such as image information acquired by optical imaging module 2302.
The memory may be, for example, a random access memory (RAM), a flash memory, a read-only memory (ROM), a programmable ROM (PROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), a register, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium well known in the art. An exemplary storage medium is coupled to the processor so that the processor can read information from, and write information to, the storage medium.
It should be noted that the functional framework of the terminal device shown in Figure 23 is only an example; in other examples, terminal device 2300 may include more, fewer or different apparatuses, and each apparatus may include more, fewer or different components. Moreover, the apparatuses and components shown may be combined or divided in any manner, which this application does not specifically limit.
Exemplarily, the terminal device may be a vehicle (such as an unmanned vehicle, smart vehicle, electric vehicle or digital vehicle), a robot, a mapping device, a drone, a smart home device (such as a television, sweeping robot, smart desk lamp, audio system, intelligent lighting system, electrical control system, home background music, home theater system, intercom system or video surveillance), intelligent manufacturing equipment (such as industrial equipment), intelligent transportation equipment (such as an AGV, unmanned transport vehicle or truck), or a smart terminal (a mobile phone, watch, computer, tablet, PDA, desktop computer, headset, speaker, wearable device, vehicle-mounted device, virtual reality device, augmented reality device, etc.).
In the embodiments of this application, absent special notes or logical conflicts, the terms and/or descriptions of different embodiments are consistent and may be cited in one another, and the technical features of different embodiments may be combined according to their internal logical relationships to form new embodiments.
In this application, "perpendicular" does not mean absolutely perpendicular; a certain engineering tolerance may be allowed. "At least one" means one or more, and "multiple" means two or more. "And/or" describes an association between associated objects and indicates three possible relationships; for example, A and/or B may mean: A alone, both A and B, or B alone, where A and B may be singular or plural. "At least one of the following" or similar expressions refer to any combination of these items, including any combination of single or plural items. For example, at least one of a, b or c may mean: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b and c may each be single or multiple. In the textual descriptions of this application, the character "/" generally indicates an "or" relationship between the associated objects; in the formulas of this application, it indicates a "division" relationship. In addition, the word "exemplarily" is used to mean serving as an example, illustration or explanation; any embodiment or design described as an "example" in this application should not be construed as preferred or more advantageous than other embodiments or designs. Rather, the use of the word "example" is intended to present a concept in a concrete manner and does not limit this application.
It is understood that the various numerical designations in this application are merely distinctions made for ease of description and do not limit the scope of the embodiments of this application. The sequence numbers of the above processes do not imply an order of execution; the execution order of each process should be determined by its function and internal logic. Terms such as "first" and "second" distinguish similar objects and need not describe a particular order or sequence. Moreover, the terms "include" and "have" and any variants thereof are intended to cover non-exclusive inclusion, for example covering a series of steps or units; a method, system, product or device is not necessarily limited to the steps or units expressly listed, but may include other steps or units not expressly listed or inherent to the process, method, product or device.
The above are only specific implementations of this application, but the protection scope of this application is not limited thereto; any variation or replacement readily conceivable by a person skilled in the art within the technical scope disclosed in this application shall fall within the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (17)

  1. An optical imaging module, characterized by comprising a first lens component, a second lens component, a third lens component, a light combining component and a detection component, wherein an angle between a first optical axis of the first lens component and a second optical axis of the second lens component is greater than 0° and less than 180°, or greater than 180° and less than 360°;
    the first lens component is configured to propagate first light from a first field of view to the light combining component;
    the second lens component is configured to propagate second light from a second field of view to the light combining component, the first field of view partially overlapping the second field of view;
    the light combining component is configured to mix the first light and the second light to obtain third light;
    the third lens component is configured to focus the third light from the light combining component onto the detection component;
    the detection component is configured to form an image based on the focused third light.
  2. The module according to claim 1, characterized in that the angle between the first optical axis and the second optical axis is equal to 90°.
  3. The module according to claim 1 or 2, characterized in that a field of view of the first field of view is greater than or equal to 90° and less than or equal to 135°; and/or,
    a field of view of the second field of view is greater than or equal to 90° and less than or equal to 135°.
  4. The module according to any one of claims 1 to 3, characterized in that the first light propagated through the first lens component is parallel light or non-parallel light; and/or,
    the second light propagated through the second lens component is parallel light or non-parallel light.
  5. The module according to any one of claims 1 to 4, characterized in that the light combining component comprises a polarization splitting element;
    the polarization splitting element is configured to reflect first polarized light in the first light to the third lens component and to transmit second polarized light in the second light to the third lens component, a polarization state of the first polarized light being different from a polarization state of the second polarized light.
  6. The module according to claim 5, characterized in that the polarization splitting element comprises a first light combining surface, the first light combining surface being coated with a polarization splitting film or etched with a metal wire grid.
  7. The module according to claim 5 or 6, characterized in that the detection component comprises a first filter layer and a first photosensitive layer;
    the first filter layer comprises N first filter blocks, a first filter block comprises n first filter units, at least two of the n first filter units pass different polarization states of the third light, and N and n are both integers greater than 1;
    the first photosensitive layer comprises P first pixel blocks, a first pixel block comprises p first pixels, p is greater than or equal to n, and P is a positive integer.
  8. The module according to claim 7, characterized in that the detection component further comprises a second filter layer;
    the second filter layer comprises M second filter blocks, a second filter block comprises m second filter units, at least two of the m second filter units pass different wavebands of the third light, and M and m are both integers greater than 1;
    wherein one second filter unit corresponds to one first filter unit; the m first filter units corresponding to the m second filter units of the same second filter block pass the same polarization state of the third light, or the n second filter units corresponding to the n first filter units of the same first filter block pass the same waveband of the third light.
  9. The module according to any one of claims 1 to 4, characterized in that the light combining component is a spectral splitting element;
    the spectral splitting element is configured to reflect light of k first wavebands in the first light to the third lens component and to transmit light of k second wavebands in the second light to the third lens component, one first waveband corresponding to one second waveband, k being an integer greater than 1.
  10. The module according to claim 9, characterized in that the spectral splitting element comprises a second light combining surface, the second light combining surface comprising a multi-passband spectral film.
  11. The module according to claim 9 or 10, characterized in that the detection component comprises a third filter layer and a second photosensitive layer;
    the third filter layer comprises Q third filter blocks, a third filter block comprises at least 2k third filter units, the 2k third filter units pass different wavebands of the third light, and Q is a positive integer;
    the second photosensitive layer comprises Q second pixel blocks, a second pixel block comprises at least 2k second pixels, one second pixel block corresponds to one third filter block, and one second pixel corresponds to one third filter unit.
  12. The module according to any one of claims 1 to 4, characterized in that the light combining component is specifically configured to:
    reflect the first light to the third lens component during a first period, and transmit the second light to the third lens component during a second period.
  13. The module according to claim 12, characterized in that the light combining component comprises any one of the following:
    a beam splitter coated with an electrically controlled film;
    a deflectable mirror;
    a beam splitting prism, a first shutter and a second shutter, the first shutter being located between the first lens component and the beam splitting prism, and the second shutter being located between the second lens component and the beam splitting prism.
  14. The module according to claim 13, characterized in that the first shutter comprises a first electrically controlled shutter or a first liquid crystal light valve; and/or,
    the second shutter comprises a second electrically controlled shutter or a second liquid crystal light valve.
  15. The module according to any one of claims 12 to 14, characterized in that the detection component comprises a fourth filter layer and a third photosensitive layer;
    the fourth filter layer comprises H fourth filter blocks, a fourth filter block comprises h fourth filter units, at least two of the h fourth filter units pass different wavebands of the third light, and H and h are both integers greater than 1;
    the third photosensitive layer comprises H third pixel blocks, a third pixel block comprises h third pixels, one third pixel block corresponds to one fourth filter block, and one fourth filter unit corresponds to one third pixel.
  16. An optical imaging system, characterized by comprising two optical imaging modules according to any one of claims 1 to 15;
    wherein the detection components of the two optical imaging modules are connected through a bus.
  17. A terminal device, characterized by comprising a processor and an optical imaging module according to any one of claims 1 to 15;
    the processor being configured to control the imaging of the optical imaging module.
PCT/CN2023/086660 2022-04-18 2023-04-06 一种光学成像模组、光学成像系统及终端设备 WO2023202387A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210407071.8 2022-04-18
CN202210407071.8A CN116953898A (zh) 2022-04-18 2022-04-18 一种光学成像模组、光学成像系统及终端设备

Publications (1)

Publication Number Publication Date
WO2023202387A1 true WO2023202387A1 (zh) 2023-10-26

Family

ID=88419075

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/086660 WO2023202387A1 (zh) 2022-04-18 2023-04-06 一种光学成像模组、光学成像系统及终端设备

Country Status (2)

Country Link
CN (1) CN116953898A (zh)
WO (1) WO2023202387A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117519216A (zh) * 2024-01-08 2024-02-06 中建八局检测科技有限公司 一种基于传感器联合导航检测避障的运料小车

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101750862A (zh) * 2008-12-16 2010-06-23 康佳集团股份有限公司 一种投影照明系统
CN106483636A (zh) * 2015-08-25 2017-03-08 洛克威尔自动控制技术股份有限公司 用于极宽的视场的模块化透镜
CN107390348A (zh) * 2016-05-17 2017-11-24 杭州海康机器人技术有限公司 光学成像装置和摄像机
CN107741274A (zh) * 2017-10-19 2018-02-27 中国科学院西安光学精密机械研究所 一种微型偏振光谱成像探测系统及方法
JP2020194061A (ja) * 2019-05-28 2020-12-03 セイコーエプソン株式会社 投射型表示装置
CN113540138A (zh) * 2021-06-03 2021-10-22 奥比中光科技集团股份有限公司 一种多光谱图像传感器及其成像模块

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101750862A (zh) * 2008-12-16 2010-06-23 康佳集团股份有限公司 一种投影照明系统
CN106483636A (zh) * 2015-08-25 2017-03-08 洛克威尔自动控制技术股份有限公司 用于极宽的视场的模块化透镜
CN107390348A (zh) * 2016-05-17 2017-11-24 杭州海康机器人技术有限公司 光学成像装置和摄像机
CN107741274A (zh) * 2017-10-19 2018-02-27 中国科学院西安光学精密机械研究所 一种微型偏振光谱成像探测系统及方法
JP2020194061A (ja) * 2019-05-28 2020-12-03 セイコーエプソン株式会社 投射型表示装置
CN113540138A (zh) * 2021-06-03 2021-10-22 奥比中光科技集团股份有限公司 一种多光谱图像传感器及其成像模块

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117519216A (zh) * 2024-01-08 2024-02-06 中建八局检测科技有限公司 一种基于传感器联合导航检测避障的运料小车
CN117519216B (zh) * 2024-01-08 2024-03-08 中建八局检测科技有限公司 一种基于传感器联合导航检测避障的运料小车

Also Published As

Publication number Publication date
CN116953898A (zh) 2023-10-27

Similar Documents

Publication Publication Date Title
JP5724755B2 (ja) 撮像システム
CN207529011U (zh) 变焦透镜、投射型显示装置及摄像装置
CN103064171B (zh) 一种高分辨率大视场光学成像系统
CN102866480B (zh) 一种基于计算成像技术的大视场光学成像系统
WO2023202387A1 (zh) 一种光学成像模组、光学成像系统及终端设备
CN205880490U (zh) 一种全景成像装置
US11756975B2 (en) Image sensor and image sensing method to generate high sensitivity image through thin lens element and micro lens array
US7777970B2 (en) Super-wide-angle lens and imaging system having same
Sun et al. Single-lens camera based on a pyramid prism array to capture four images
CN103004218B (zh) 三维摄像装置、摄像元件、透光部、及图像处理装置
CN112817151A (zh) 一种波导镜片及ar显示装置
CN102474649B (zh) 三维摄像装置及透光板
CN100490501C (zh) 实时无失真成像的全景视频系统
JP6004073B2 (ja) 光学系及び撮像装置
JP5783314B2 (ja) 全天球型光学系および撮像システム
CN203587870U (zh) 一种多视角摄像镜头模组
CN102109681B (zh) 色彩分光系统
CN108449539A (zh) 多镜头成像装置及多镜头成像系统
JP5839135B2 (ja) 全天球型光学系及び撮像装置
CN207352264U (zh) 全景成像系统和电子设备
CN207516656U (zh) 一种用于不同视角成像的成像装置
RU108651U1 (ru) Растровая система воспроизведения объемного изображения
CN207473188U (zh) 变焦透镜、投射型显示装置及摄像装置
US11899169B2 (en) Lens assembly and electronic device including the same
CN207352263U (zh) 全景成像系统和电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23791043

Country of ref document: EP

Kind code of ref document: A1