CN111787301A - Lens, three-dimensional imaging method, device, equipment and storage medium - Google Patents


Info

Publication number
CN111787301A
CN111787301A
Authority
CN
China
Legal status
Pending
Application number
CN202010542078.1A
Other languages
Chinese (zh)
Inventor
吕键
唐攀
陈磊
Current Assignee
Guangdong Launca Medical Device Technology Co ltd
Original Assignee
Guangdong Launca Medical Device Technology Co ltd
Application filed by Guangdong Launca Medical Device Technology Co ltd filed Critical Guangdong Launca Medical Device Technology Co ltd
Priority to CN202010542078.1A priority Critical patent/CN111787301A/en
Publication of CN111787301A publication Critical patent/CN111787301A/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/204 Image signal generators using stereoscopic image cameras
    • H04N 13/207 Image signal generators using stereoscopic image cameras using a single 2D image sensor
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 13/00 Optical objectives specially designed for the purposes specified below
    • G02B 13/001 Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras
    • G02B 13/0055 Miniaturised objectives for electronic devices employing a special optical element
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; ACCESSORIES THEREFOR
    • G03B 35/00 Stereoscopic photography
    • G03B 35/08 Stereoscopic photography by simultaneous recording

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Optics & Photonics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a lens, a three-dimensional imaging method, an apparatus, a device, and a storage medium. The lens has an incident axis and comprises a lens element that includes a first sub-lens and at least two second sub-lenses; the second sub-lenses have a non-rotationally symmetric structure. The first sub-lens includes a first effective light-passing portion, each second sub-lens includes a second effective light-passing portion, and the second effective light-passing portions of any two second sub-lenses are rotationally symmetric about the incident axis. The first effective light-passing portion of the lens element passes an incident beam to form a first image on the image side of the lens; the second effective light-passing portions pass incident beams to form, on the image side of the lens, as many second images as there are second effective light-passing portions, the first image and the second images being pairwise spaced apart. Because three-dimensional imaging is achieved with a single lens, the lateral size of the three-dimensional imaging system is greatly reduced, and the system can flexibly perform three-dimensional scanning in narrow spaces.

Description

Lens, three-dimensional imaging method, device, equipment and storage medium
Technical Field
The present invention relates to the field of three-dimensional imaging technologies, and in particular, to a lens, a three-dimensional imaging method, an apparatus, a device, and a storage medium.
Background
In conventional three-dimensional scanning systems, images of the subject at different angles are generally acquired through multiple lenses to form three-dimensional point cloud data; point cloud matching is then performed by computing the similarity of the point clouds at two adjacent scanning moments, i.e., image data captured at different positions are stitched together as the lenses move continuously, finally reconstructing a complete three-dimensional model of the object. However, because multiple lenses must be arranged, conventional three-dimensional scanning systems are too large, making three-dimensional scanning of narrow spaces difficult.
Disclosure of Invention
In view of the above, it is necessary to provide a lens, a three-dimensional imaging method, an apparatus, a device, and a storage medium that address the problem of reducing the size of a three-dimensional scanning system.
A lens having an incident axis, comprising a lens element, the lens element including a first sub-lens and at least two second sub-lenses, the second sub-lenses having a non-rotationally symmetric structure, the first sub-lens including a first effective light-passing portion, each second sub-lens including a second effective light-passing portion, and the second effective light-passing portions of any two of the second sub-lenses being rotationally symmetric about the incident axis;
the first effective light-passing portion of the lens element passes an incident beam to form a first image on the image side of the lens; each second effective light-passing portion of the lens element passes an incident beam, forming on the image side of the lens as many second images as there are second effective light-passing portions, the first image and the second images being spaced apart from one another.
In this lens, the non-rotationally symmetric structure reduces the radial size of the second sub-lenses so that at least two sub-lenses fit inside a single lens; through the lens, at least two images of the subject at different angles can be obtained, yielding the corresponding three-dimensional point cloud data. The lens can also capture a nearly synchronous image through the first sub-lens to supplement the point cloud data obtained through the second sub-lenses, producing a more accurate and stable three-dimensional model. Because the first sub-lens and the second sub-lenses are all housed in one lens, three-dimensional imaging of the subject is achieved with a single lens, greatly reducing the lateral size of the three-dimensional imaging system and allowing flexible, efficient three-dimensional scanning of narrow spaces.
In one embodiment, the first sub-lens and each of the second sub-lenses in the lens element are structurally capable of being spliced into one lens rotationally symmetric about the optical axis.
In one embodiment, the first sub-lens and each of the second sub-lenses in the lens element are spaced apart from each other.
In one embodiment, the lens includes any one of the following solutions:
the lens element comprises one first sub-lens and two second sub-lenses, and the first sub-lens is arranged between the two second sub-lenses;
the lens element includes one of the first sub-lenses and two of the second sub-lenses, the first sub-lenses and the two of the second sub-lenses being disposed around the incident axis.
In one embodiment, the first sub-lens of the lens element has a centrosymmetric structure.
In one embodiment, the lens includes a first imaging unit and at least two second imaging units. The first imaging unit includes at least two first sub-lenses arranged along the direction of the incident axis, and each second imaging unit includes at least two second sub-lenses arranged along the direction of the incident axis. The number of first sub-lenses in the first imaging unit equals the number of second sub-lenses in any one second imaging unit, each first sub-lens belongs to one of the lens elements, and each second sub-lens belongs to one of the lens elements.
In one embodiment, the lens includes first apertures and second apertures; the number of first apertures equals the number of first sub-lenses in the lens elements, and the number of second apertures equals the number of second sub-lenses. Viewed along the incident axis, the projection of each first sub-lens onto a plane perpendicular to the incident axis overlaps that of one first aperture, and the projection of each second sub-lens onto that plane overlaps that of one second aperture.
In one embodiment, any two of the second apertures are rotationally symmetric about the incident axis.
In one embodiment, all the second apertures have the same aperture size.
In one embodiment, the first aperture is larger than the second aperture.
In one embodiment, the lens includes a lens barrel, the lens element is disposed in the lens barrel, the lens barrel is provided with an entrance aperture at an object end, and the incident axis passes through the entrance aperture.
A three-dimensional imaging module comprises an image sensor and the above lens, the image sensor being disposed on the image side of the lens. With this lens, the three-dimensional imaging module achieves three-dimensional imaging of the subject through a single lens, effectively reducing the module's lateral size and the installation space required inside a three-dimensional imaging device, so that the device can perform more efficient and flexible three-dimensional scanning of narrow spaces.
In one embodiment, the number of the image sensors is one.
In one embodiment, the three-dimensional imaging module comprises a first optical filter and at least two second optical filters, the first optical filter overlaps with a projection of the first sub-lens in the lens element on a plane perpendicular to the incident axis, each second sub-lens in the lens element overlaps with a projection of one second optical filter on a plane perpendicular to the incident axis, and the first optical filter and the second optical filters are used for filtering light rays with different wavelengths.
In one embodiment, the three-dimensional imaging module comprises a light source fixed relative to the lens, the light source being used to illuminate the subject.
A three-dimensional imaging device comprises the above three-dimensional imaging module. By adopting this module, the three-dimensional imaging device can perform more efficient and flexible three-dimensional scanning of narrow spaces.
A three-dimensional imaging device comprises the lens of any one of the above embodiments. By adopting this lens, the three-dimensional imaging device can perform more efficient and flexible three-dimensional scanning of narrow spaces.
A three-dimensional imaging method, applied to the lens of any one of the above embodiments, comprises the following steps:
acquiring a first image and a second image of the same frame within a preset time, the first image carrying two-dimensional surface information of the subject;
obtaining, from at least two second images of the same frame, one frame of a three-dimensional information image carrying three-dimensional point cloud information;
and determining a three-dimensional model of the subject for that frame from the first image and the three-dimensional information image of the same frame.
In this three-dimensional imaging method, a first image is acquired through the first sub-lens of the lens and second images are acquired through the second sub-lenses. Since the first image carries two-dimensional surface information of the subject, it serves as a two-dimensional information image, while a three-dimensional information image with three-dimensional point cloud information is derived from the at least two mutually spaced second images. Each frame thus carries both two-dimensional and three-dimensional information, and combining the two-dimensional information image with the three-dimensional information image markedly improves the accuracy of each frame's three-dimensional reconstruction. Moreover, because the first and second sub-lenses are housed in one lens, three-dimensional imaging of the subject is achieved with a single lens, greatly reducing the lateral size of the three-dimensional imaging system and making the method readily applicable to narrow spaces.
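The per-frame flow described above can be sketched as a short pipeline. The callables `triangulate` and `fuse` are hypothetical placeholders for the point-cloud computation and the 2-D/3-D combination, which the text does not specify:

```python
def reconstruct_frame(first_image, second_images, triangulate, fuse):
    """Per-frame pipeline sketch: one two-dimensional information image plus
    at least two second images yield one frame's three-dimensional model.
    `triangulate` and `fuse` are hypothetical stand-ins for the actual
    point-cloud and fusion computations, which the text leaves open."""
    if len(second_images) < 2:
        raise ValueError("at least two second images are required per frame")
    # three-dimensional information image (point cloud) from the second images
    three_d_info = triangulate(second_images)
    # the frame's model combines 2-D surface info with the 3-D information image
    return fuse(first_image, three_d_info)
```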
In one embodiment, the first image includes at least one of color, texture, and brightness of the subject.
In one embodiment, the method further comprises:
acquiring the three-dimensional information images of two adjacent frames;
performing feature matching processing on the three-dimensional information images of the two adjacent frames to obtain a point cloud matching result;
and splicing the three-dimensional information images of the two adjacent frames according to the point cloud matching result.
In one embodiment, a three-dimensional imaging method includes: processing the three-dimensional information images of two adjacent frames with the iterative closest point (ICP) algorithm so as to stitch the two frames' three-dimensional information images together.
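The ICP stitching mentioned above can be sketched compactly. The following function is a minimal, generic point-to-point ICP (nearest-neighbour correspondence plus the Kabsch/SVD rigid fit), not the patent's implementation, which is unspecified:

```python
import numpy as np

def icp(src, dst, iters=20):
    """Minimal point-to-point ICP sketch: rigidly aligns the (N, 3) point
    cloud `src` onto the (M, 3) cloud `dst` and returns the moved copy.
    Illustrative only; practical scanners add outlier rejection,
    point-to-plane error metrics, and convergence tests."""
    src = np.asarray(src, dtype=float).copy()
    dst = np.asarray(dst, dtype=float)
    for _ in range(iters):
        # 1. correspondences: pair each src point with its nearest dst point
        sq = ((src[:, None, :] - dst[None, :, :]) ** 2).sum(axis=-1)
        matched = dst[sq.argmin(axis=1)]
        # 2. best rigid transform for these pairs (Kabsch / SVD solution)
        mu_s, mu_d = src.mean(axis=0), matched.mean(axis=0)
        H = (src - mu_s).T @ (matched - mu_d)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:             # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_d - R @ mu_s
        # 3. apply the estimated transform and iterate
        src = src @ R.T + t
    return src
```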
In one embodiment, the method further comprises:
acquiring the first images of two adjacent frames;
performing feature matching processing on the first images of the two adjacent frames to obtain a two-dimensional matching result;
and splicing the first images of the two adjacent frames according to the two-dimensional matching result.
In one embodiment, the step of performing feature matching on the first images of two adjacent frames includes any one of:
applying scale-invariant feature transform (SIFT) processing to corresponding features in the first images of the two adjacent frames;
applying speeded-up robust features (SURF) processing to corresponding features in the first images of the two adjacent frames;
and applying structure-from-motion processing to corresponding features in the first images of the two adjacent frames.
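The matching step behind these options can be illustrated with a generic Lowe-style ratio test over precomputed descriptor vectors (such as those produced by SIFT or SURF; descriptor extraction itself is omitted, and the function is an illustrative sketch rather than the patent's method):

```python
import numpy as np

def match_descriptors(d1, d2, ratio=0.75):
    """Ratio-test matching sketch between two frames' feature descriptors.
    d1: (N, k) and d2: (M, k) descriptor arrays (e.g. SIFT/SURF vectors).
    Returns a list of (i, j) index pairs where d1[i] matches d2[j]."""
    matches = []
    for i, d in enumerate(d1):
        dist = np.linalg.norm(d2 - d, axis=1)
        order = np.argsort(dist)
        best, second = order[0], order[1]
        # keep only unambiguous matches: best clearly beats the runner-up
        if dist[best] < ratio * dist[second]:
            matches.append((i, int(best)))
    return matches
```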
In one embodiment, the predetermined time is less than or equal to 200 ms.
In one embodiment, the acquiring the first image and the second image of the same frame within the preset time includes:
making the interval time t between the first image and the second image of the same frame satisfy t ≤ 1 ms, with the exposure times of the first image and the second image each within 100 ms.
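The timing constraints above (interval t ≤ 1 ms, each exposure within 100 ms, preset time on the order of 200 ms) can be collected into a small validity check. The function, its parameter names, and the reading that the whole acquisition fits in the preset window are illustrative assumptions, not language from the patent:

```python
def timing_ok(interval_ms, exposure_first_ms, exposure_second_ms,
              frame_window_ms=200.0):
    """Check the stated frame-timing constraints: interval t <= 1 ms,
    each exposure within 100 ms, and (assumed here) the whole
    acquisition fitting in the preset window (<= 200 ms)."""
    return (interval_ms <= 1.0
            and exposure_first_ms <= 100.0
            and exposure_second_ms <= 100.0
            and exposure_first_ms + interval_ms + exposure_second_ms
                <= frame_window_ms)
```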
In one embodiment, a three-dimensional imaging method includes: projecting a first flash onto the subject to acquire the first image and projecting a second flash onto the subject to acquire the second image, the first flash and the second flash being projected alternately.
In one embodiment, a first flash of light is projected on a subject to acquire the first image.
In one embodiment, a second flash of light is projected on the subject to acquire the second image.
A three-dimensional imaging apparatus comprising:
an acquisition module for acquiring a first image and a second image of the same frame within a preset time, the first image carrying two-dimensional surface information of the subject;
the processing module is used for obtaining a frame of three-dimensional information image with three-dimensional point cloud information according to at least two second images of the same frame; and
and the determining module is used for determining a three-dimensional model of the shot object in the frame according to the first image and the three-dimensional information image in the same frame.
In one embodiment, the three-dimensional imaging device further comprises a projection module, wherein the projection module is used for projecting a first flash and a second flash to a shot object within a preset time;
the acquisition module is used for acquiring a first image according to the first flash and acquiring a second image according to the second flash.
A three-dimensional imaging apparatus comprising:
a projector for projecting a first flash and a second flash to an object within a preset time;
a memory storing a computer program;
a receiver for acquiring, within the preset time, a first image according to the first flash and a second image according to the second flash, the first image carrying two-dimensional surface information of the subject;
a processor configured to execute the computer program on the memory to implement: obtaining a frame of three-dimensional information image with three-dimensional point cloud information according to at least two second images of the same frame; and determining a three-dimensional model of the shot object in the frame according to the first image and the three-dimensional information image of the same frame.
In one embodiment, the projector is capable of sequentially projecting a first flash and a second flash to the object within a preset time.
A storage medium having stored thereon a computer program which, when executed by a processor, carries out the steps of the three-dimensional imaging method of any one of the above.
The above three-dimensional imaging method, apparatus, device, and storage medium determine the three-dimensional model of the subject's surface from a first image carrying its two-dimensional surface information together with a three-dimensional information image carrying its three-dimensional surface information, effectively improving the accuracy of each frame's three-dimensional reconstruction. They can also be applied to continuous three-dimensional scanning: combined with the stitching of the first images and the three-dimensional information images, continuous scanning of the subject yields a stable and accurate continuous three-dimensional model.
Drawings
FIG. 1 is a schematic view of a three-dimensional imaging module including a lens according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a partial structure of the lens barrel shown in FIG. 1;
FIG. 3 is a schematic distribution diagram of an image corresponding to the lens of FIG. 2;
FIG. 4 is a schematic view of a three-dimensional imaging module including a lens according to another embodiment of the present disclosure;
FIG. 5 is a schematic view of a partial structure of a lens barrel according to another embodiment of the present application;
FIG. 6 is a schematic distribution diagram of an image corresponding to the lens of FIG. 5;
FIG. 7 is a schematic view of a partial structure of a lens barrel according to another embodiment of the present application;
FIG. 8 is a schematic view of a partial structure of a lens barrel according to another embodiment of the present application;
FIG. 9 is a schematic diagram illustrating a distribution of an image corresponding to the lens of FIG. 8;
FIG. 10 is a schematic view of a three-dimensional imaging module including a lens according to another embodiment of the present disclosure;
fig. 11 is a schematic partial structural diagram of a three-dimensional imaging device according to an embodiment of the present application;
fig. 12 is a schematic flowchart of a three-dimensional imaging method according to an embodiment of the present application;
fig. 13 is a schematic flow chart of a three-dimensional imaging method according to another embodiment of the present application;
fig. 14 is a schematic flowchart illustrating a process of stitching a three-dimensional information image in a three-dimensional imaging method according to an embodiment of the present application;
fig. 15 is a schematic flowchart illustrating stitching of a first image in a three-dimensional imaging method according to an embodiment of the present application;
fig. 16 is a block diagram of a three-dimensional imaging device according to an embodiment of the present application;
fig. 17 is a block diagram of a three-dimensional imaging apparatus according to another embodiment of the present application;
fig. 18 is an internal structural view of a three-dimensional imaging apparatus according to an embodiment of the present application.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in detail below. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein.
In the description of the present invention, it is to be understood that the terms "central," "longitudinal," "lateral," "length," "width," "thickness," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," "clockwise," "counterclockwise," "axial," "radial," "circumferential," and the like are used in the orientations and positional relationships indicated in the drawings for convenience in describing the invention and to simplify the description, and are not intended to indicate or imply that the referenced device or element must have a particular orientation, be constructed and operated in a particular orientation, and are not to be considered limiting of the invention.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
In the present invention, unless otherwise expressly stated or limited, the terms "mounted," "connected," "secured," and the like are to be construed broadly and can, for example, denote fixed connection, detachable connection, or integral formation; mechanical or electrical connection; direct connection or indirect connection through intervening media; or internal communication between two elements or an interaction between them. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific situation.
In the present invention, unless otherwise expressly stated or limited, a first feature being "on" or "under" a second feature may mean that the two features are in direct contact or in indirect contact through an intermediate. Also, a first feature being "on," "over," or "above" a second feature may mean that the first feature is directly or obliquely above the second feature, or simply that the first feature is at a higher level than the second feature. A first feature being "under," "below," or "beneath" a second feature may mean that the first feature is directly or obliquely below the second feature, or simply that the first feature is at a lower level than the second feature.
It will be understood that when an element is referred to as being "secured to" or "disposed on" another element, it can be directly on the other element or intervening elements may also be present. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may also be present. The terms "vertical," "horizontal," "upper," "lower," "left," "right," and the like as used herein are for illustrative purposes only and do not denote a unique embodiment.
In conventional three-dimensional scanning systems, images of the subject at different angles are generally acquired through multiple lenses to form three-dimensional point cloud data; point cloud matching is then performed by computing point cloud similarity at adjacent scanning moments, i.e., image data captured at different positions are stitched together as the lenses move continuously, finally reconstructing a complete three-dimensional model of the subject. However, because multiple lenses must be arranged, such systems are too large, making three-dimensional scanning of narrow spaces difficult.
Referring to fig. 1, some embodiments of the present application provide a three-dimensional imaging module 20, which includes a lens 10 and an image sensor 210, the image sensor 210 being disposed on the image side of the lens 10. The image sensor 210 may be a CCD (charge-coupled device) or a CMOS (complementary metal-oxide-semiconductor) sensor. The imaging surface 103 of the lens 10 coincides with the photosensitive surface of the image sensor 210.
The lens 10 has positive optical power and converges image information of the subject onto the imaging surface 103 to form an image. The lens 10 includes a lens barrel 100 and a lens element 110 having a special-shaped structure; the lens element 110 is mounted in the lens barrel 100, the object end of the lens barrel 100 is provided with a light inlet 1001, and the central axis of the light inlet 1001 is collinear with the incident axis 101 of the lens 10 (a small deviation is permissible), the incident axis 101 being a virtual reference axis. In some embodiments the light entry hole 1001 may be oval, rectangular, etc. The incident axis 101 of the lens 10 should be perpendicular to the photosensitive surface and pass through its center. Light beams from the subject are converged by the lens 10 to form a corresponding number of images on the photosensitive surface of the image sensor 210. In particular, when there is only one image sensor 210, all of the images can be formed on that sensor, effectively controlling the module's lateral size and enabling a compact design of the three-dimensional imaging module 20.
Referring to fig. 1 and 2, in the embodiment shown there, the lens element includes a first sub-lens 111 and two second sub-lenses 112, the first sub-lens 111 and the two second sub-lenses 112 being spaced apart in the direction perpendicular to the incident axis 101, with the object-side surfaces of the first sub-lens 111 and the second sub-lenses 112 facing the light entry hole. The second sub-lens 112 has a non-rotationally symmetric structure: there is no axis parallel to the incident axis 101 about which the second sub-lens 112 could be rotated by an angle θ (0° < θ < 360°) and still coincide with its unrotated state.
The incident axis 101 passes through the first sub-lens 111, which is centrosymmetric about the incident axis 101. The two second sub-lenses 112 are disposed on opposite sides of the first sub-lens 111 in the direction perpendicular to the incident axis 101 and are centrosymmetric about the incident axis 101; that is, rotating one second sub-lens 112 by 180° around the incident axis 101 brings it into coincidence with the other (a slight deviation is permissible). Two second sub-lenses 112 in this centrosymmetric relationship are structurally identical; for example, their object-side surfaces have the same surface shape, as do their image-side surfaces. In particular, in this embodiment the first sub-lens 111 and the two second sub-lenses 112 may be cut from the same parent lens, i.e., they can be structurally spliced back into one lens that is rotationally symmetric about its optical axis. When the first sub-lens 111 and the second sub-lenses 112 are each cut from one lens, the cutting path is parallel to the lens's optical axis; after cutting, the cut surfaces of the first sub-lens 111 and the second sub-lenses 112 are planar and remain parallel to each other when mounted in the lens barrel, and the cut sub-lenses are spaced apart in the direction perpendicular to the optical axis of the lens. The object-side and image-side surfaces of the parent lens may each be spherical or aspherical.
In this embodiment, the first sub-lens 111 and each second sub-lens 112 include an arc-shaped edge 1107, which is the portion of the sub-lens's structure farthest from the incident axis 101; if the first sub-lens 111 and the second sub-lenses 112 were spliced into a complete lens, these arc-shaped edges 1107 would form the edge of the maximum effective light-passing area of the lens's object-side or image-side surface. It should be noted that a practical lens also includes a clamping portion, which does not transmit light; the various symmetry relationships of the first sub-lens 111, the second sub-lens 112, and the lens described in this application do not refer to the clamping portion.
With further reference to fig. 3, the first sub-lens 111 includes a first effective light-passing portion 1111, and each second sub-lens 112 includes a second effective light-passing portion 1121. The first effective light-passing portion 1111 of the first sub-lens 111 passes an incident beam to form a first image 105 on the image side of the lens 10; the second effective light-passing portions of the second sub-lenses 112 likewise pass incident beams to form, on the image side of the lens 10, at least two mutually spaced second images 106 equal in number to the second effective light-passing portions 1121 of the lens elements; that is, after being converged by each second effective light-passing portion 1121, the incident beam forms one second image 106. It should be noted that the first image 105 formed by the first sub-lens 111 and the second images 106 formed by the second sub-lenses 112 are spaced apart and do not overlap. In some embodiments, the first effective light-passing portion 1111 and the second effective light-passing portions 1121 of the same lens element 110 are spaced apart from each other. The region of the first sub-lens 111 through which the beam forming the first image 105 passes is the first effective light-passing portion 1111, and the region of the second sub-lens 112 through which the beam forming the second image 106 passes is the second effective light-passing portion 1121. In the embodiments of this application, any two second effective light-passing portions 1121 should be as rotationally symmetric about the incident axis 101 as possible, ensuring that each second image 106 is formed from a symmetric viewing angle of the subject and thus facilitating accurate three-dimensional data of the subject's surface from the second images 106.
The rotational symmetry relationship in this application includes, but is not limited to, the ability of one component to overlap another component after being rotated by 90°, 135°, 180°, 225°, or another angle about the rotational-symmetry axis.
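As an illustration of the symmetry relationship just defined, the check can be sketched numerically on point sets sampled from two components' outlines. The helper below, its point-set representation, and the choice of the z-axis as the rotational-symmetry axis are hypothetical conveniences, not part of the application:

```python
import numpy as np

def rotationally_symmetric(points_a, points_b, angle_deg, tol=1e-6):
    """Check whether rotating point set A about the axis (taken here as the
    z-axis, i.e. the incident axis) by angle_deg makes it coincide with B."""
    th = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(th), -np.sin(th)],
                    [np.sin(th),  np.cos(th)]])
    rotated = points_a @ rot.T
    # Compare as unordered sets: every rotated point of A needs a close
    # partner in B, and vice versa.
    dists = np.linalg.norm(rotated[:, None, :] - points_b[None, :, :], axis=-1)
    return bool(dists.min(axis=1).max() < tol and dists.min(axis=0).max() < tol)
```

For two second sub-lenses on opposite sides of the incident axis, the outlines relate by a 180° rotation, so `rotationally_symmetric(a, b, 180.0)` holds while other angles generally fail.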
The first sub-lens 111 and each second sub-lens 112 are spaced from one another so that the imaged images (the first image 105 and the second images 106) on the imaging plane 103 are separated, allowing the system terminal to perform three-dimensional analysis on the corresponding features in each imaged image.
The shape of the photosensitive surface of the image sensor 210 is generally rectangular. In the embodiment of fig. 1, the separation direction of the two second sub-lenses 112 is parallel to the length direction of the photosensitive surface, and the separation distance between the second sub-lenses 112 in that direction is greater than or equal to half the length of the photosensitive surface, which facilitates forming two separated imaged images on the photosensitive surface. The separation distance between the sub-lenses is understood as the minimum distance between the two sub-lenses in the direction parallel to the length direction. Further, in that direction, the separation distance between the two second sub-lenses 112 should be less than or equal to three-quarters of the length of the photosensitive surface, to prevent an excessive separation distance from offsetting the imaged images so far that imaging quality degrades.
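The two bounds above (at least one half, at most three quarters of the photosensitive surface's length) can be captured in a small predicate. The function name and millimetre units are illustrative only, not terminology from the application:

```python
def spacing_within_bounds(gap_mm: float, sensor_length_mm: float) -> bool:
    # Separation between the two second sub-lenses, measured along the
    # length direction of the photosensitive surface, should satisfy
    #   L/2 <= gap <= 3L/4
    # per the embodiment described above.
    return sensor_length_mm / 2 <= gap_mm <= 3 * sensor_length_mm / 4
```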
In summary, by adopting the lens 10 described above, the transverse dimension of the three-dimensional imaging module 20 can be effectively reduced and the usable space of the module extended, so that the three-dimensional imaging module 20 can perform more efficient and flexible three-dimensional imaging of narrow spaces. It should be noted that the three-dimensional imaging module 20 is not limited to a single image sensor 210; two or more image sensors 210 may be provided, each image sensor 210 corresponding to one or two imaged images.
Referring to fig. 2 and 3, if the lens element 110 in the lens 10 were a complete lens, the subject would form an original image on the imaging plane 103 of the lens 10 after being converged by that lens. When the lens is divided into the first sub-lens 111 and the two second sub-lenses 112 along a direction perpendicular to the incident axis 101, with the sub-lenses spaced from one another, the incident light beam passes through each sub-lens and correspondingly forms a new imaged image on the imaging plane 103; that is, the original imaged image on the imaging plane 103 gradually separates into three new imaged images as the separation distance between the sub-lenses increases. The direction of the spacing between the imaged images depends in part on the direction of the spacing between the sub-lenses. When the separation distance between the sub-lenses is sufficiently large, the three new imaged images are completely separated and do not overlap. At this point, terminal analysis of features such as pits and bumps in the two spaced second images 106 yields three-dimensional information such as the depth and height of the corresponding features. Terminal analysis methods include, but are not limited to, binocular distance measurement, or comparing the two second images 106 by cross-correlation, least squares, and the like to obtain a three-dimensional point cloud of the subject.
Taking binocular distance measurement as an example: for a concave or convex feature on the subject, the corresponding images of the concavity or convexity in the imaged images exhibit different degrees of blur, and the spaced first imaging unit 1021 and second imaging units 1022 image the feature from different angles, so the lens 10 also provides a binocular vision effect. Terminal analysis of the same feature in the two second images 106, for example analyzing the blur of the feature images and/or the distance between the feature images in the two images, yields the depth information of the feature. With the lens 10 of the above embodiments, the three-dimensional information of the subject surface can be reconstructed from two-dimensional imaging information of the subject, realizing three-dimensional imaging of the subject surface.
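In the idealized rectified case, the binocular principle mentioned here reduces to the classic triangulation relation Z = f·B/d, with the baseline B playing the role of the spacing between the two second imaging units. The sketch below assumes a pinhole model with the focal length expressed in pixels; it is an illustration of the principle, not the application's actual algorithm:

```python
def depth_from_disparity(focal_px: float, baseline_mm: float,
                         disparity_px: float) -> float:
    # Ideal rectified stereo pair: a feature shifted by `disparity_px`
    # pixels between the two second images lies at depth Z = f * B / d.
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_mm / disparity_px
```

For instance, with a 1000 px focal length and a 4 mm baseline, a 20 px disparity corresponds to a feature 200 mm away; smaller disparities map to greater depths.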
On the other hand, the first image 105, a color image carrying two-dimensional information, can be obtained through the first sub-lens 111, and the three-dimensional information obtained through the second sub-lenses 112 can then be superimposed on this two-dimensional image, yielding more accurate and comprehensive three-dimensional imaging. Further, when a three-dimensional imaging system using the lens 10 must scan while moving continuously, the system may apply Iterative Closest Point (ICP) processing to the three-dimensional point cloud information in the imaged images at two adjacent scanning moments, matching the point clouds in the second images at those moments so as to splice the imaged images of adjacent regions. Meanwhile, the system can splice the imaged images of two adjacent scanning moments by applying feature algorithms such as Scale-Invariant Feature Transform (SIFT) or Speeded-Up Robust Features (SURF) to the first images 105 carrying two-dimensional information obtained at the two moments. In other words, the three-dimensional imaging system can splice the two-dimensional information images (such as color images) of two adjacent scanning moments by feature matching, while splicing the three-dimensional information images (such as three-dimensional point clouds) of those moments by iterative-closest-point processing. The combined action of the two splicing methods effectively improves the splicing accuracy of the images obtained while the system moves continuously, so that a continuous and complete three-dimensional model can be reconstructed.
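The ICP step referred to above can be sketched with a plain NumPy implementation: brute-force nearest-neighbour matching followed by the SVD (Kabsch) solve for the best rigid motion, iterated until the point clouds align. A real scanner would add k-d trees, outlier rejection, and an initial guess from the 2-D feature matches; treat this as a minimal illustration under those caveats:

```python
import numpy as np

def best_rigid_transform(src, dst):
    # Least-squares rotation + translation (Kabsch/SVD) mapping src onto dst,
    # assuming row i of src corresponds to row i of dst.
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:        # correct an improper (reflective) solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

def icp(src, dst, iters=50):
    # Iterative closest point: repeatedly match each source point to its
    # nearest destination point, then solve for and apply the rigid motion.
    cur = src.copy()
    for _ in range(iters):
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(axis=-1)
        matched = dst[d2.argmin(axis=1)]
        R, t = best_rigid_transform(cur, matched)
        cur = cur @ R.T + t
    return cur
```

Given the point clouds from two adjacent scanning moments, `icp(cloud_b, cloud_a)` returns `cloud_b` expressed in the frame of `cloud_a`, which is exactly the registration needed to splice adjacent regions.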
In the design of the above embodiments, it is only necessary to space the sub-lenses in the lens 10 apart in a direction perpendicular to the incident axis 101 for spaced imaged images to appear, so that two imaged images of the subject at different angles can be obtained with a single lens 10. Compared with a design using two or more lenses 10, the single-lens design greatly reduces the transverse size of the three-dimensional imaging system and also helps reduce the size of the fixing structure that mounts the lens 10 in the three-dimensional imaging device, so that the device can better perform three-dimensional imaging of narrow spaces. For example, when the lens 10 is installed in the probe of an endoscope, only one lens 10 is required for three-dimensional information acquisition, so the size of the probe can be effectively reduced, improving the probe's operational flexibility in narrow spaces.
In the embodiment shown in fig. 1, each first sub-lens 111 forms a first imaging unit 1021, and each first imaging unit 1021 corresponds to one first image 105; that is, the incident light beam, after being adjusted by the first imaging unit 1021, forms the first image 105 on the imaging plane 103 of the lens 10. Each second sub-lens 112 forms a second imaging unit 1022, and each second imaging unit 1022 corresponds to one second image 106; that is, the incident light beam, after being adjusted by a second imaging unit 1022, forms a second image 106 on the imaging plane 103 of the lens 10. The incident light beams, after being adjusted by the imaging units, correspondingly form multiple imaged images on the image side.
In some embodiments, the lens 10 includes apertures whose number equals the number of sub-lenses in the lens element 110, each sub-lens (first sub-lens 111 or second sub-lens 112) corresponding one-to-one with an aperture in the direction parallel to the incident axis 101; the aperture corresponding to the first sub-lens 111 is the first aperture 121, and the aperture corresponding to a second sub-lens 112 is a second aperture 122. The first sub-lens 111 and its corresponding first aperture 121 together form the first imaging unit 1021, and a second sub-lens 112 and its corresponding second aperture 122 together form a second imaging unit 1022. In the embodiments of the present application, the sub-lens and aperture in any imaging unit are arranged along a direction parallel to the incident axis 101, and in that direction the projection of each sub-lens onto the imaging plane 103 overlaps the projection of its corresponding aperture. The incident light beam forms a corresponding imaged image on the imaging plane 103 after being modulated by each imaging unit; the aperture in each imaging unit controls the depth of field and brightness of the image, and its diameter may be fixed or adjustable.
In addition, the second apertures 122 of the second imaging units 1022 should be as symmetric as possible about the incident axis 101 of the lens 10, and their diameters should be the same, to ensure that the brightness of the second images 106 formed by the second imaging units 1022 tends to be uniform, which benefits the accuracy of the three-dimensional information obtained from the second images 106. The apertures can also limit marginal rays, suppressing the spherical aberration those rays introduce, and control the depth of field of the imaged image. In some embodiments, each aperture in the lens 10 is a component independent of the lens barrel 100; in that case the apertures can be assembled together when the lens element 110 is installed into the lens barrel 100.
In the embodiment of fig. 1, the first aperture 121 is disposed on the image side of the first sub-lens 111 and the second apertures 122 on the image sides of the second sub-lenses 112; the line connecting the centers of the first aperture 121 and a second aperture 122 is perpendicular to the incident axis 101, and in the direction parallel to the incident axis 101 the projection of the first sub-lens 111 onto the imaging plane 103 overlaps that of the first aperture 121, while the projection of each second sub-lens 112 overlaps that of its second aperture 122. In other embodiments, the first aperture 121 may be disposed on the object side of the first sub-lens 111 and the second apertures 122 on the object sides of the second sub-lenses 112, with the line connecting the centers of the first sub-lens 111 and a second sub-lens 112 still perpendicular to the incident axis 101. The symmetric arrangement of the sub-lenses and apertures about the incident axis 101 helps improve the consistency of brightness, definition, and size among the imaged images, and in turn the accuracy of the terminal analysis.
It should be noted that, since the second imaging units 1022 are mainly used to acquire three-dimensional information, the diameters of the second apertures 122 should be kept consistent so that the brightness, depth of field, and other properties of the second images 106 tend to be consistent, improving the accuracy of the terminal analysis. The first imaging unit 1021 is mainly used to obtain two-dimensional color imaging, so in some embodiments the diameter of the first aperture 121 may be larger than that of a second aperture 122, increasing the brightness of the first image 105 and ensuring the sharpness of the two-dimensional color image. Because the types of image obtained by the first imaging unit 1021 and the second imaging units 1022 are independent of each other, in some embodiments the structure of the first effective light-passing portion 1111 of the first sub-lens 111 may differ from that of the second effective light-passing portions 1121 of the second sub-lenses 112, though they may also be the same.
It should be noted that in some embodiments each sub-lens in the lens element 110 is coated with a light-shielding film on its object-side surface and/or image-side surface, with a light-passing region reserved in each film; the light-shielding film then functions as an aperture stop, and the size of the light-passing region can be regarded as the stop's diameter.
In addition, to prevent incident light beams outside the first sub-lens 111 and the second sub-lenses 112 from reaching the image sensor 210, in some embodiments the lens 10 further includes a light blocking plate 130 connected between the sub-lenses of the lens element 110; the light blocking plate 130 is opaque. It may be a metal or plastic plate and may be disposed perpendicular to the incident axis 101. The light blocking plate 130 may carry a black coating to prevent incident light reflected off it from forming stray light within the lens 10. By connecting the sub-lenses, the light blocking plate 130 also improves the mounting stability between them.
In addition, in some embodiments, to prevent interfering light from reaching the imaging plane 103, the three-dimensional imaging module 20 further includes a filter disposed between the lens 10 and the image sensor 210, or on the object side of the lens 10, for example covering the light inlet 1001 of the lens barrel 100. Depending on the waveband of the working light, the filter includes at least one of a visible-light band-pass filter, an infrared band-pass filter, and an infrared cut-off filter. In some embodiments, the three-dimensional imaging module 20 includes a first filter 221 and at least two second filters 222; the projection of the first filter 221 onto a plane perpendicular to the incident axis 101 overlaps that of the first sub-lens 111 of the lens element 110, the projection of each second sub-lens 112 of the lens element 110 overlaps that of one second filter 222, and the first filter 221 and the second filters 222 filter out light of different wavelengths. For example, in one embodiment, when the first imaging unit 1021 converges incident light in the visible band to form a color image, the first filter 221 is an infrared cut-off filter that removes infrared light; when the second imaging units 1022 converge incident light in the infrared band to form infrared images, the second filters 222 are infrared band-pass filters that remove light outside the expected band. Of course, in other embodiments the second images 106 may be formed by converging visible light of a specific wavelength, such as 587.56 nm or 555 nm, in which case the corresponding second filters 222 should be band-pass filters for that waveband.
The filtering relationships of the first filter 221 and the second filters 222 can vary and are not limited to the above schemes, as long as the light interfering with the first image 105 and the second images 106 is eliminated.
In some embodiments, the first filter 221 and the second filters 222 are formed as one integrated piece, reducing the number of filters and thereby the alignment requirements and processing complexity of the module. The first filter 221 and the second filters 222 may instead be arranged at an interval; the specific arrangement can be determined by actual design requirements.
On the other hand, in some embodiments, when the second images 106 are formed by converging light of a specific waveband, a light source 230 may additionally be disposed in the three-dimensional imaging module 20 to illuminate the subject with light of that waveband; the light source 230 is fixed relative to the lens 10. For example, when the specific waveband is 900 nm infrared light, the second filters 222 may be narrow band-pass filters for 900 nm, filtering out the interference of incident light outside 900 nm with the second images 106. A light source 230 emitting 587.56 nm, 555 nm, or another wavelength may likewise be chosen, with the second filters 222 being band-pass filters for the corresponding waveband. In other embodiments, instead of providing a filter, a filter film may be provided on the object-side surface and/or image-side surface of the second sub-lenses 112 to achieve the filtering effect. The wavelengths emitted by the light source 230 and passed by the filters above are only specific examples; in an actual product they are determined by the design requirements and are not detailed further here.
In some embodiments, the three-dimensional scanning system obtains the first image 105 first: the light source 230 emits white light to illuminate the subject, so that the three-dimensional imaging module 20 can capture the first image 105 in color. Of course, when ambient light is sufficient for three-dimensional imaging, a white light source 230 need not be provided. Subsequently, the light source 230 illuminates the subject with light of the specific wavelength so that the three-dimensional imaging module 20 can obtain the corresponding second images 106; monochromatic illumination improves the accuracy with which the system analyzes each second image 106 to obtain the three-dimensional point cloud. In some embodiments, the time interval between acquiring the first image 105 and acquiring the second images 106 is less than 1 ms, and each imaging (exposure) time is at most 100 ms, so that the first image 105 and the second images 106 can be regarded as acquired at the same time and as imaging the same region of the subject. In other embodiments, the three-dimensional scanning system may also acquire the first image 105 and the second images 106 simultaneously.
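The timing rule above (inter-frame gap under 1 ms, each exposure at most 100 ms) can be expressed as a simple predicate. The function and parameter names are illustrative, not terminology from the application:

```python
def frames_effectively_simultaneous(gap_ms: float, exposures_ms) -> bool:
    # Per the embodiment: the first and second images may be treated as
    # captured "at the same time" when the gap between their acquisitions
    # is under 1 ms and every exposure lasts at most 100 ms.
    return gap_ms < 1.0 and all(t <= 100.0 for t in exposures_ms)
```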
In addition, in some embodiments, when the surface features of the subject provide very poor contrast, a projection element may be disposed in the three-dimensional imaging module 20 to project a special optical pattern onto the subject surface, improving the contrast of the surface and the detection sensitivity. Optical patterns include, but are not limited to, stripes, spots, and grids.
On the other hand, the arrangement of the second sub-lenses 112 and the apertures in the present application is not limited to that of the above embodiments. Referring to fig. 4, in some embodiments the axial direction of each second sub-lens 112 is inclined to the incident axis 101, with the object-side surface of each tilted second sub-lens 112 closer to the incident axis 101 than its image-side surface. In some embodiments the angle between the axial direction of a second sub-lens 112 and the incident axis 101 of the lens 10 is 1° to 20°. Obliquely arranged second sub-lenses 112 increase the separation distance between the second images 106; that is, the second sub-lenses 112 can produce separated imaged images at a smaller separation distance between the sub-lenses, which helps further reduce the lateral dimension of the lens 10. Controlling the tilt angle also helps prevent the feature-bearing area of an imaged image from exceeding the imaging range of the image sensor 210 because of an excessive separation between the imaged images. Likewise, in some embodiments the second aperture 122 corresponding to a second sub-lens 112 is tilted synchronously with it, the central axis of the tilted second aperture 122 being parallel to the axial direction of the corresponding sub-lens, to ensure the brightness uniformity of the second image 106. The tilted arrangement of each second sub-lens 112 and second aperture 122 can also be understood as tilting the corresponding second imaging unit 1022 as a whole, and the second imaging units 1022 should remain rotationally symmetric about the incident axis 101 after being tilted.
It should be noted that in some embodiments the axial direction of a second sub-lens 112 can be understood as follows: when the axial direction of each second sub-lens 112 in a lens element is parallel to the central axis of the first sub-lens 111, each second sub-lens 112 can be translated to splice with the first sub-lens 111 into a complete lens.
On the other hand, the specific position of each aperture can vary and is not limited to the scheme presented in fig. 1. In some embodiments the line connecting the centers of the two second apertures 122 is parallel to the line connecting the centers of gravity of the two second sub-lenses 112; in other embodiments the line connecting the aperture centers is inclined to the line connecting the centers of the two sub-lenses. The position of the corresponding imaged image changes with the position of the aperture. For example, suppose the separation direction between the two second sub-lenses 112 is parallel to the length direction of the photosensitive surface; when the line connecting the aperture centers is inclined to the line connecting the centers of gravity of the two second sub-lenses 112, the two second images 106 are spaced not only along the length direction of the photosensitive surface but also along a direction inclined to it, i.e., the two second images 106 are spaced along a diagonal of the photosensitive surface, which increases the utilization of the photosensitive surface.
In the embodiment shown in fig. 5, instead of disposing the two second imaging units 1022 on opposite sides of the first imaging unit 1021, the two second imaging units 1022 may be disposed on the same side of the first imaging unit 1021, with the first imaging unit 1021 and the two second imaging units 1022 arranged around the incident axis 101; the two second imaging units 1022 remain rotationally symmetric about the incident axis 101.
Referring to fig. 6, which corresponds to the arrangement of the first sub-lens 111 and the second sub-lenses 112 in the embodiment of fig. 5, the arrangement of the first image 105 and the second images 106 varies with the positions of the first imaging unit 1021 and the second imaging units 1022. Beyond the arrangements shown in the above embodiments, the first imaging unit 1021 and the second imaging units 1022 may be arranged in other ways, with correspondingly different arrangements of the first image 105 and the second images 106; further variations are not detailed here.
Besides the spaced arrangement, the sub-lenses in the elements of the lens 10 can also be arranged in a staggered manner to obtain spaced imaged images. Referring to fig. 7, in some embodiments the first sub-lens 111 and the two second sub-lenses 112 that could be spliced into a complete lens are staggered in a direction perpendicular to the incident axis 101; the two staggered second sub-lenses remain in contact with the first sub-lens 111, and if the staggering translation of the two second sub-lenses 112 were reversed, they could again be spliced with the first sub-lens into a complete lens. Fig. 6 shows the arrangement of the imaged images corresponding to the sub-lens arrangement of this embodiment. With the staggered arrangement, the separation distance between the two second images 106 increases with the stagger distance between the two second sub-lenses 112, and the separation direction of each imaged image depends partly on the stagger direction of the two sub-lenses and partly on the position of the second aperture 122.
Further, the positional relationship between the aperture and the sub-lens also determines the direction and distance of the separation between the imaged images. In some embodiments, the second apertures 122 in the two second imaging units 1022 have a separation distance in a direction perpendicular to the incident axis 101, and the separation distance has a magnitude that directly affects the separation distance of the two second images 106 in the direction.
By spacing and staggering the sub-lenses of the lens element 110 and adjusting the positions of the apertures, the imaged images can be obtained flexibly in the desired arrangement and separation relationships. The arrangements of the sub-lenses and of the apertures are not limited to the descriptions of the above embodiments; any arrangement that obtains the desired images by the above principles falls within the scope of the present application.
Further, the number of second sub-lenses 112 in the lens element 110 may be two, three, four, or more, as in the above embodiments, but the second effective light-passing portions 1121 of any two second sub-lenses 112 should be rotationally symmetric about the incident axis 101. All the sub-lenses are still disposed in one lens barrel, and they can likewise be formed by splitting one lens, each split second sub-lens 112 having a non-rotationally-symmetric structure. Compared with multiple lenses 10 each carrying a complete lens, each sub-lens in this design has a smaller radial size than the complete lens, so all of them can be installed in one lens 10, reducing the transverse size of the module, while the incident light beams still form separate imaged images after passing through the sub-lenses.
Specifically, referring to fig. 8, in some embodiments the lens element 110 includes one first sub-lens 111 and three second sub-lenses 112; the projections of the first sub-lens 111 and the second sub-lenses 112 onto the imaging plane 103, taken along the direction parallel to the incident axis 101, are all fan-shaped, the four sub-lenses are arranged at intervals, and the four sub-lenses are rotationally symmetric about the incident axis 101 of the lens 10. If the four sub-lenses were moved toward the incident axis 101, they could be spliced into a complete lens. Specifically, a complete lens can be split equally into four sub-lenses along paths that pass through and are parallel to the lens's central axis; the four sub-lenses are then translated by the same distance along the radial direction of the original lens, and the translated sub-lenses, fixed by the lens barrel 100, belong to one lens element 110, which is rotationally symmetric about the incident axis 101. In some embodiments the diameter of the first aperture 121 may be the same as or different from that of the second apertures 122. While three second imaging units 1022 are illustrated above, in other embodiments four, five, or more second sub-lenses 112 may be disposed in the lens element. In addition, the structural relationship between the first sub-lens 111 and the second sub-lenses 112 in some embodiments is not limited to the above description; any arrangement derivable simply from the above principles should be considered within the scope of the present application and is not detailed further here.
Referring to fig. 9, in the embodiment shown in fig. 8, four imaging units (one first imaging unit 1021 and three second imaging units 1022) are disposed around the incident axis 101, and the effective light-passing portions in any two imaging units are rotationally symmetric about the incident axis 101, so that the corresponding four imaged images (one first image 105 and three second images 106) will be correspondingly arranged around the center of the image plane 103 and spaced from each other.
The above embodiments mainly describe the case where one lens element 110 is provided in the lens 10. However, the lens 10 in some embodiments may instead be provided with at least two lens elements 110; the number of lens elements 110 in the lens 10 may be two, three, four, five, or more, with the lens elements 110 arranged sequentially along the direction of the incident axis 101. In these embodiments the lens 10 still includes a lens barrel 100, and each lens element 110 is disposed in the lens barrel 100. In some embodiments, different lens elements may be split from lenses of different structures, each complete lens being split into the first sub-lens 111 and the second sub-lenses 112 of one lens element. For a lens 10 with more than two lens elements 110, the structure of the lens 10 can be regarded as an equal division of a lens group practically applicable in a product, including but not limited to a telephoto lens group, a wide-angle lens group, a macro lens group, and the like. Note that, besides being cut from one lens, the first sub-lens 111 and the second sub-lenses 112 may each be formed separately.
In the embodiments of the present application, the lens elements 110 each contain the same number of first sub-lenses 111 and the same number of second sub-lenses 112; for example, each lens element 110 includes one first sub-lens 111 and two second sub-lenses 112. The first sub-lens 111 of each lens element 110 corresponds to the first sub-lenses 111 of the other lens elements 110, each group of corresponding first sub-lenses 111 constituting one first imaging unit 1021, which in some embodiments further includes the first aperture 121. Each second sub-lens 112 of a lens element 110 corresponds to one second sub-lens 112 in each of the other lens elements 110, each group of corresponding second sub-lenses 112 constituting one second imaging unit 1022, which in some embodiments further includes a second aperture 122. In the direction parallel to the incident axis 101, the projections of the sub-lenses of the same imaging unit onto the imaging plane 103 overlap. In particular, in some embodiments any two adjacent sub-lenses in an imaging unit can be spaced apart from each other, or can form a cemented structure.
Referring specifically to fig. 10, in an embodiment of the present application, the lens 10 includes five lens elements 110, and each lens element 110 includes one first sub-lens 111 and two second sub-lenses 112. The first sub-lens 111 and the second sub-lenses 112 in the same lens element may be split from one complete lens; the shapes of the sub-lenses and the directions in which the sub-lenses are separated from the incident axis 101 may refer to the embodiment described with reference to fig. 1. The lens 10 further includes a first aperture 121 corresponding to the first sub-lenses 111 and a second aperture 122 corresponding to the second sub-lenses 112. Along the direction parallel to the incident axis 101, the projections of each first sub-lens 111 and the first aperture 121 on the imaging plane 103 overlap, and the projections on the imaging plane 103 of each second sub-lens 112 and the second aperture 122 belonging to the same second imaging unit 1022 overlap. The first aperture 121 may be disposed between the first sub-lens 111 closest to the image side and the image sensor 210, between any two first sub-lenses 111, or on the object side of the first sub-lens 111 farthest from the image sensor 210; the second aperture 122 is disposed similarly. In these embodiments, each second imaging unit should be symmetric with respect to the incident axis 101, so that the brightness, depth of field, and size of the corresponding second images 106 tend to be uniform.
After being conditioned by the first imaging unit 1021, the incident light beam forms a first image 105 on the imaging surface 103; after being conditioned by the second imaging unit 1022, it forms a second image 106 on the imaging surface 103. Both the direction and the distance of the interval between the first image 105 and the second image 106 depend on the direction and distance of the interval between the first imaging unit 1021 and the second imaging unit 1022, and also on the positions at which the first aperture 121 and the second aperture 122 are disposed.
In addition, in some embodiments, the first imaging unit 1021 and the second imaging unit 1022 may be disposed obliquely to the incident axis 101, that is, the axial directions of the first imaging unit 1021 and the second imaging unit 1022 are oblique to the incident axis 101; in this case, the sub-lens on the object side of an imaging unit is closer to the incident axis 101 than the sub-lens on the image side.
Referring to fig. 11, an embodiment of the present application further provides a three-dimensional imaging device 30, which may include the three-dimensional imaging module 20 of any of the above embodiments. The three-dimensional imaging device 30 may be applied in fields such as medical treatment and industrial manufacturing. Owing to the small transverse size of the three-dimensional imaging module 20, the three-dimensional imaging device 30 can perform efficient and flexible three-dimensional detection in narrow spaces. For example, when the three-dimensional imaging module 20 is disposed in the probe of an apparatus, the small size of the module allows the probe to be made smaller, increasing the flexibility of probe operation in narrow spaces. Moreover, because the three-dimensional imaging module 20 can obtain both a two-dimensional information image and a three-dimensional information image of the same region of the subject, the stitching of images from two adjacent scanning moments can be improved when the device performs continuous three-dimensional scanning, raising the efficiency, accuracy, and stability of continuous scanning.
In some embodiments, the three-dimensional imaging device 30 includes, but is not limited to, a smartphone, a dental camera device, an industrial inspection device, a drone, a vehicle-mounted camera device, and the like.
Some embodiments of the present application also provide a three-dimensional imaging method, which may be implemented with the lens of the above embodiments. The three-dimensional imaging method of these embodiments can image a subject in three dimensions with high quality, and when the subject is continuously scanned to obtain a complete three-dimensional model of the scanned area, the accuracy of the three-dimensional model formed during continuous three-dimensional scanning can be improved.
Referring to fig. 12, in some embodiments, a three-dimensional imaging method includes the steps of:
step S410: acquiring a first image and a second image of the same frame within preset time, wherein the first image has two-dimensional surface information of a shot object;
step S420: obtaining a frame of three-dimensional information image with three-dimensional surface information according to at least two second images of the same frame;
step S430: and determining a three-dimensional model of the shot object in the frame according to the first image and the three-dimensional information image of the same frame.
Wherein the first image is obtainable through a first sub-lens in a lens element and the second image is obtainable through a second sub-lens in a lens element, each second image having a one-to-one correspondence with each second sub-lens in any lens element.
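As a control-flow sketch, steps S410 to S430 above can be expressed as a small pipeline. The function name and its `reconstruct`/`fuse` parameters are hypothetical illustrations, not the patented implementation:

```python
def three_dimensional_imaging(first_image, second_images, reconstruct, fuse):
    """Sketch of steps S410-S430 (illustrative names, not the patent's API).

    first_image   -- one frame's 2D-surface-information image (step S410)
    second_images -- at least two images from the second sub-lenses (step S410)
    reconstruct   -- builds a 3D information image from the second images (step S420)
    fuse          -- combines the 2D and 3D information into a model (step S430)
    """
    if len(second_images) < 2:
        raise ValueError("each frame needs at least two second images")
    three_d_info_image = reconstruct(second_images)   # step S420
    return fuse(first_image, three_d_info_image)      # step S430
```

The callables stand in for the disparity reconstruction and superposition steps discussed later in the text.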
The imaging obtained during one shortest imaging period of the subject can be regarded as one frame. Each frame includes at least one first image and at least two second images. The total time for acquiring the first image and the second images of the same frame should be kept short; if acquisition takes too long, the images can no longer be treated as images of the same region of the subject. In general, since three-dimensional imaging of a subject often requires moving the lens to achieve continuous scanning, the imaging of each frame should be completed within a very short time to ensure that the images of the frame depict the same region of the subject.
In some embodiments, the preset time in step S410 may be controlled to within 200 ms; for example, the preset time may be 50 ms, 70 ms, 100 ms, 130 ms, 150 ms, 180 ms, or 200 ms. By controlling the total acquisition time of the first image and the second images of the same frame to within 200 ms, it can be further ensured that they image the same region of the subject. Here, acquiring the first image and the second images of the same frame within the preset time may be understood as: within the same frame, controlling the time from the start of exposure of the first imaged image to the end of exposure of the last imaged image of the frame to within the preset time. Note that the first image and the second images of the same frame may or may not be acquired simultaneously, and their acquisition order is arbitrary; however, the second images of the same frame should be acquired simultaneously to ensure the accuracy of the three-dimensional information image obtained in step S420.
In particular, in some embodiments, to prevent the first image and the second images from interfering with each other during acquisition (for example, when the first image is formed by full-band visible light and the second images by monochromatic visible light), the first image and the second images may be acquired separately with a short interval between them. In some embodiments, the interval t between acquiring the first image and the second images of the same frame satisfies 0 < t ≤ 1 ms; this interval can be understood as the time from the end of exposure of the first image to the start of exposure of a second image of the same frame, and the exposure order of the first image and the second images may be reversed. Because the interval is extremely short, the first image and the second images of the same frame, acquired sequentially or simultaneously within this interval, can be regarded as imaging the same region of the subject, so that the three-dimensional model of that region can be represented accurately.
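The timing constraints described above (whole-frame acquisition within the preset time, and a 0 < t ≤ 1 ms gap when the first and second images are acquired separately) can be checked with a small helper. The function name and the representation of exposures as (start, end) pairs in milliseconds are illustrative assumptions:

```python
def frame_timing_valid(exposures, preset_ms=200.0, max_gap_ms=1.0):
    """Check one frame's timing against the constraints in the text (a sketch).

    exposures -- list of (start_ms, end_ms) exposure windows for the frame's
                 images, the first image listed first, then the second images
                 (which are assumed to expose simultaneously).
    """
    starts = [s for s, _ in exposures]
    ends = [e for _, e in exposures]
    total = max(ends) - min(starts)          # whole frame within the preset time
    gap = exposures[1][0] - exposures[0][1]  # first-to-second acquisition gap
    return total <= preset_ms and 0 < gap <= max_gap_ms
```

This models only the separate-acquisition case; as noted later in the text, overlapping exposures are also permitted when the imaging bands allow it.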
In addition, in some embodiments, the exposure times of the first image and the second images are controlled to within 100 ms, which likewise prevents an overly long exposure from causing the first image and the second images to depict different regions of the subject.
In the three-dimensional imaging method, the first sub-lens of the lens is used to acquire the first image, and the second sub-lenses are used to acquire the second images. The first image carries two-dimensional surface information of the subject (at least one of color, texture, brightness, and the like) and can therefore serve as a two-dimensional information image, while a three-dimensional information image carrying three-dimensional surface information (e.g., depressions and protrusions) is obtained from at least two mutually spaced second images. Each frame thus contains both two-dimensional and three-dimensional information, and by combining the two images the three-dimensional information image is superimposed on the two-dimensional information image, so that the final three-dimensional model embodies the two-dimensional and three-dimensional information of the subject simultaneously, effectively improving the accuracy of three-dimensional imaging for each frame. Furthermore, because the first sub-lens and the second sub-lenses are all arranged in one lens, three-dimensional imaging of the subject can be realized through a single lens, greatly reducing the transverse size of the three-dimensional imaging system and allowing the method to be applied flexibly and efficiently in narrow spaces.
In some embodiments, there are various methods for obtaining the three-dimensional information image by processing the mutually spaced second images taken from different angles. Any common method that computes subject depth information from the second images using the principle of binocular disparity may be considered within the scope of the present application.
Specifically, in some embodiments, a dense three-dimensional point cloud of the imaged region of the subject in a frame may be obtained by applying image processing, such as cross-correlation or the least squares method, to the second images of that frame; the depth information of the subject is then reflected by the three-dimensional point cloud. Accordingly, in some embodiments, after the step of acquiring the second images, step S420 may specifically include: reconstructing, from at least two second images of the same frame, a three-dimensional information image carrying three-dimensional point cloud information for that frame. Three-dimensional shapes of the subject surface, such as depressions and protrusions, are then determined from the three-dimensional point cloud information.
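For a single matched point, the binocular-disparity principle underlying step S420 reduces to the classic pinhole relation Z = f·B/d. The sketch below assumes a rectified pair of second images; the parameter names are illustrative:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_mm):
    """Depth of one matched point from binocular disparity: Z = f * B / d.

    disparity_px -- pixel disparity of the point between two second images
    focal_px     -- focal length expressed in pixels
    baseline_mm  -- separation between the two second imaging units
    Returns the depth Z in the same length unit as the baseline.
    """
    if disparity_px <= 0:
        raise ValueError("a visible point must have positive disparity")
    return focal_px * baseline_mm / disparity_px
```

Applying this to every matched pixel pair yields the dense point cloud the text describes.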
There are also various ways of forming the first image and the second images. In some embodiments, the first image may be formed from ambient light reflected by the subject: ambient light strikes the subject, is reflected into the lens, and, after being converged by the first sub-lens, finally forms the first image on the image sensor at the image side of the lens. Ambient light is generally white light such as sunlight or lamp light, and the first image then carries two-dimensional information of the subject such as color, texture, and brightness. In other embodiments, the first image may be obtained by illuminating the subject with a specific light beam. Referring to fig. 13, in some embodiments, in performing step S410, the three-dimensional imaging method further includes step S401: projecting a first flash onto the subject, the first image being formed by the light reflected from the subject into the lens. The first flash may be white light or monochromatic light. When the first flash is white light, the first image carries information on the color, texture, and brightness of the subject; when the first flash is monochromatic, the first image mainly carries information on the texture and brightness of the subject.
Similarly, in some embodiments, the second images may also be formed from ambient light reflected by the subject: ambient light strikes the subject, is reflected into the lens, and, after being converged by each second sub-lens, forms second images equal in number to the second sub-lenses of the lens element; the three-dimensional information image is then obtained by computational analysis of at least two of the second images. Note that in some embodiments, when the first image and the second images are both imaged with white light, their exposure times may overlap and separate acquisition is unnecessary. That is, when the imaging band of one of the first image and the second image (the wavelengths of light allowed to form the corresponding image) is contained in, or identical to, the imaging band of the other, the two may be acquired simultaneously, with overlapping exposure times, or separately. When the imaging band of one is contained in that of the other, corresponding optical filters are disposed on the incident light paths of the first image and the second images to prevent the two from interfering.
For example, in some embodiments, the first image corresponds to all light in the visible region (400 nm-780 nm), i.e., the first image may be a color image, while the second images correspond only to 700 nm red light. In this case, an infrared cut filter may be disposed on the incident light path of the first image to filter out infrared light, and a 700 nm narrow band-pass filter may be disposed on the incident light path of each second image to filter out incident light outside 700 nm. Thus, in these embodiments, when a white flash is projected onto the subject, the first image finally formed is a color image and each second image is a 700 nm red image, and the filters prevent them from interfering with each other. When the first image and the second images can be formed from the same flash, the flash projected onto the subject may remain on continuously until the three-dimensional imaging operation ends.
As can be seen from the above, in some embodiments, the first image and the second image can be acquired simultaneously by one flash, so that the three-dimensional imaging procedure can be simplified.
On the other hand, referring to fig. 13, in some embodiments, the three-dimensional imaging method further includes step S402: a second flash is projected against the subject. A second image is acquired by receiving a second flash of light reflected by the subject. The second flash light may be white light or may be monochromatic light, such as visible light or infrared light of a certain wavelength.
When the imaging brightness of the second image needs to be enhanced, the imaging brightness can be enhanced by increasing the illumination brightness of the second flash, and in order to avoid the second flash from interfering with the first image, the first flash and the second flash can be projected alternately to acquire the first image and the second image separately. For example, the first image is a color (400nm-780nm) image, the second image is a red light (700nm) image, and then white light is projected to the object to obtain the first image, and high-brightness 700nm red light is projected to the object after the exposure is finished to obtain the second image of the same frame, so that the interference of the high-brightness red light to the white light image is prevented.
In acquiring images of the same frame, the first flash may have the same exposure time as the first image, the second flash may have the same exposure time as the second image, and the first flash and the second flash may be projected at an interval time. When the first image and the second image of the next frame need to be acquired, a new projection cycle is performed again to project the first flash and the second flash.
In summary, referring to fig. 13, in some embodiments, step S410 includes the following steps:
step S401: projecting a first flash of light to a subject;
step S411: acquiring a first image with two-dimensional surface information of a shot object;
step S402: projecting a second flash of light to the subject;
step S412: a second image is acquired.
After obtaining the second image, the three-dimensional imaging method includes:
step S421: reconstructing a three-dimensional information image with three-dimensional point cloud information according to at least two second images of the same frame;
step S430: and determining a three-dimensional model of the shot object in the frame according to the first image and the three-dimensional information image of the same frame.
In the above embodiment, when the first image and the second image of the same frame are acquired correspondingly, the first flash and the second flash should be projected within a preset time, and the first image and the second image of the same frame are acquired within the preset time.
For step S430, in some embodiments, the method specifically includes: superposing the two-dimensional surface information in the first image of the same frame with the three-dimensional surface information in the three-dimensional information image to determine the three-dimensional model of the subject in that frame. In this way, the two-dimensional and three-dimensional information of the subject surface can be displayed simultaneously in the superimposed three-dimensional model.
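Superimposing the two-dimensional surface information onto the three-dimensional information image can be pictured as attaching, to each 3D point, the value of the first image at that point's projection. The `project` callable and the data layout below are illustrative assumptions, not the patent's implementation:

```python
def superimpose(first_image, point_cloud, project):
    """Attach 2D surface information (e.g. colour) to each 3D point (step S430).

    first_image -- 2D array (list of rows) of surface information values
    point_cloud -- iterable of 3D points from the three-dimensional info image
    project     -- maps a 3D point to its (row, col) in the first image; its
                   form depends on the camera model, which the text leaves open
    """
    model = []
    for point in point_cloud:
        r, c = project(point)
        model.append((point, first_image[r][c]))  # 3D point + its 2D info
    return model
```

The result is a textured model in which each point carries both depth and surface information.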
On the other hand, in some embodiments, the three-dimensional imaging method described above may also be applied to a continuous three-dimensional scanning process of a subject, and referring to fig. 14, the three-dimensional imaging method further includes:
step S442: acquiring three-dimensional information images of two adjacent frames;
step S444: carrying out feature matching processing on the three-dimensional information images of two adjacent frames to obtain a point cloud matching result;
step S446: and splicing the three-dimensional information images of two adjacent frames according to the point cloud matching result.
In step S442 above, the three-dimensional information image of each of the two adjacent frames is obtained by processing at least two second images of that frame. The step of obtaining the three-dimensional information image of a frame from at least two of its second images may specifically be: reconstructing a three-dimensional information image carrying a three-dimensional point cloud from at least two second images of the same frame. In this step, a dense three-dimensional point cloud of the imaged region of the subject in the frame can be reconstructed by applying image processing, such as cross-correlation or the least squares method, to at least two second images of the same frame, yielding a three-dimensional information image whose point cloud reflects the depth information of the subject.
In step S444, iterative closest point (ICP) processing may be performed on the three-dimensional information images of the two adjacent frames, so as to match the three-dimensional point clouds of the two frames and obtain a point cloud matching result; step S446 is then performed, stitching the three-dimensional information images of the two adjacent frames according to the point cloud matching result to reconstruct a continuous three-dimensional model of the subject.
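The ICP idea of step S444 can be illustrated with a deliberately reduced version that estimates only the translation between the point clouds of two adjacent frames (real ICP also estimates a rotation; all names here are illustrative):

```python
def icp_translation(cloud_a, cloud_b, iterations=10):
    """Toy ICP restricted to translation, sketching step S444's idea only.

    Each iteration matches every point of cloud_b to its nearest neighbour in
    cloud_a and shifts cloud_b by the mean residual; the accumulated shift
    aligns frame b's point cloud with frame a's.
    """
    shift = [0.0, 0.0, 0.0]
    b = [list(p) for p in cloud_b]
    for _ in range(iterations):
        residuals = []
        for p in b:
            # nearest neighbour of p in cloud_a (squared Euclidean distance)
            q = min(cloud_a, key=lambda a: sum((ai - pi) ** 2 for ai, pi in zip(a, p)))
            residuals.append([qi - pi for qi, pi in zip(q, p)])
        step = [sum(r[i] for r in residuals) / len(residuals) for i in range(3)]
        for p in b:
            for i in range(3):
                p[i] += step[i]
        for i in range(3):
            shift[i] += step[i]
    return shift
```

In step S446 the estimated transform would then be applied to place the two frames' clouds in a common coordinate system for stitching.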
Stitching can use not only the three-dimensional information in the three-dimensional information images but also the two-dimensional information in the first images, and the two kinds of stitching assist and correct each other, which can greatly improve the accuracy and completeness of the continuous three-dimensional model. To this end, referring to fig. 15, in some embodiments, the three-dimensional imaging method further comprises:
step S452: acquiring first images of two adjacent frames of an object;
step S454: performing feature matching processing on the first images of two adjacent frames to obtain a two-dimensional matching result;
step S456: and splicing the first images of two adjacent frames according to the two-dimensional matching result.
In step S454, the feature matching processing performed on the first images of adjacent frames includes any one of:

scale-invariant feature transform (SIFT) processing of corresponding features in the first images of adjacent frames; or

speeded-up robust features (SURF) processing of corresponding features in the first images of adjacent frames; or

structure-from-motion (SfM) processing of corresponding features in the first images of adjacent frames.
Each of the above feature matching methods performs feature computation on the first images of adjacent frames. For example, feature points (e.g., extreme points) in the images of adjacent frames can be detected and their positions and direction values (gradient values) obtained; comparing this information for corresponding feature points in the first images of adjacent frames yields the movement distance, movement direction, and rotation angle between the frames. Step S456 is then performed according to the matching result to stitch the first images of the adjacent frames into a continuous two-dimensional information image of the scanned area. Note that steps S442, S444, and S446 and steps S452, S454, and S456 may be executed sequentially or simultaneously once the corresponding first images and three-dimensional information images have been obtained.
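As a minimal stand-in for the matching described above, the displacement between adjacent first images can be estimated from already-matched feature point positions (e.g., the output of SIFT/SURF matching). This sketch recovers only the mean translation, not the rotation angle the text also mentions; all names are illustrative:

```python
def estimate_shift(features_a, features_b):
    """Estimate frame-to-frame image motion from matched 2D features (step S454).

    features_a, features_b -- lists of (x, y) positions of the *same* feature
    points in two adjacent first images, in matching order. Returns the mean
    displacement (dx, dy) used to stitch the images in step S456.
    """
    n = len(features_a)
    dx = sum(b[0] - a[0] for a, b in zip(features_a, features_b)) / n
    dy = sum(b[1] - a[1] for a, b in zip(features_a, features_b)) / n
    return dx, dy
```

Averaging over many matched points makes the estimate robust to individual mismatches.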
Therefore, the continuous three-dimensional model of the shot object can be reconstructed more stably and accurately according to the splicing information of the first image and the splicing information of the three-dimensional information image. In some embodiments, the final continuous three-dimensional model may be only the stitched continuous three-dimensional information image, while the stitching process of the first image is used to assist in correcting the stitching of the three-dimensional information image. In other embodiments, the successive two-dimensional information images and three-dimensional information images are superimposed in conjunction with the stitching process of the first image and the three-dimensional information image to obtain a three-dimensional model in the scanned area with two-dimensional information (color, texture, shading, etc.) and three-dimensional information (depressions, protrusions, etc.).
In some scenarios, the geometric features of the subject surface are not distinct, so a three-dimensional information image contains too little useful information to support reliable stitching during continuous scanning.
By combining the stitching processing of the first image and the three-dimensional information image, the three-dimensional imaging method in the embodiment of the application can obtain a stable and accurate three-dimensional model in the continuous scanning process of the shot object.
In particular, when the geometric features of the subject surface are not distinct, the three-dimensional imaging method can rely mainly on computing the two-dimensional information of the first images of adjacent frames to achieve image stitching during continuous scanning; conversely, when two-dimensional information such as color and texture on the subject surface is not distinct, the method can rely mainly on computing the three-dimensional information images of adjacent frames. The three-dimensional imaging method can therefore be applied in a variety of scenarios and offers stable and reliable continuous scanning performance. Furthermore, because the first sub-lens and the second sub-lenses are all arranged in one lens, three-dimensional imaging of the subject can be realized through a single lens, greatly reducing the transverse size of the three-dimensional imaging system and allowing the method to be applied flexibly and efficiently in narrow spaces.
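The scene-adaptive fallback described above can be summarized as a small selection rule. The feature counts and the threshold value are illustrative assumptions, not taken from the patent:

```python
def choose_stitching(geom_feature_count, texture_feature_count, threshold=50):
    """Pick the stitching cue for a scene (sketch of the fallback logic above).

    geom_feature_count    -- distinct 3D geometric features in the frame pair
    texture_feature_count -- distinct 2D colour/texture features
    threshold             -- assumed tuning parameter for "not distinct"
    """
    if geom_feature_count < threshold <= texture_feature_count:
        return "2d"    # weak geometry: rely on first-image (2D) matching
    if texture_feature_count < threshold <= geom_feature_count:
        return "3d"    # weak texture: rely on 3D information images
    return "both"      # use both cues, mutually assisting and correcting
```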
Referring to fig. 16, some embodiments of the present application further provide a three-dimensional imaging apparatus 600, the three-dimensional imaging apparatus 600 including:
an obtaining module 602, configured to obtain a first image and a second image of the same frame within a preset time, where the first image has two-dimensional surface information of an object;
a processing module 604, configured to obtain a frame of three-dimensional information image with three-dimensional point cloud information according to at least two second images of the same frame; and
the determining module 606 is configured to determine a three-dimensional model of the subject in the frame according to the first image and the three-dimensional information image of the same frame.
Referring to fig. 17, in some embodiments, the three-dimensional imaging apparatus 600 further includes a projection module 608, where the projection module 608 is configured to project a first flash and a second flash to the subject within a preset time, and the acquisition module 602 is configured to acquire a first image according to the first flash and acquire a second image according to the second flash.
For the specific definition of each module of the three-dimensional imaging apparatus 600, reference may be made to the above definition of the three-dimensional imaging method, which is not repeated here. The modules of the three-dimensional imaging apparatus 600 may be implemented in whole or in part by software, hardware, or a combination thereof. The modules may be embedded in hardware in, or independent of, a processor of the three-dimensional imaging device, or stored in software form in a memory of the three-dimensional imaging device, so that the processor can invoke and execute the operations corresponding to the modules.
For example, in some embodiments, the above-mentioned obtaining module 602 includes a lens and an image sensor disposed on an image side of the lens, where the incident light is adjusted by a first sub-lens and a second sub-lens in the lens to form a first image and a second image on a photosensitive surface of the image sensor, respectively.
Referring to fig. 18, some embodiments of the present application further provide a three-dimensional imaging apparatus including:
a projector for projecting a first flash and a second flash to an object within a preset time;
a memory storing a computer program;
the receiver is used for acquiring a first image according to the first flash in preset time and acquiring a second image according to the second flash, wherein the first image has two-dimensional surface information of a shot object;
a processor configured to execute a computer program on a memory to implement: obtaining a frame of three-dimensional information image with three-dimensional point cloud information according to at least two second images of the same frame; and determining a three-dimensional model of the shot object in the frame according to the first image and the three-dimensional information image of the same frame.
The projector, receiver, memory and processor are connected by a system bus.
In some embodiments, the definition of the projector may be as defined above for the three-dimensional imaging method, and is not described here in detail. For example, in some embodiments, the projector can sequentially project the first flash and the second flash to the object within a preset time. For the projector, in some embodiments, the projector may be a light source capable of projecting a flash of light, and in particular, but not limited to, a laser light source or an LED light source.
In some embodiments, the receiver includes the lens described in any of the above embodiments, and an image sensor disposed on an image side of the lens, the first flash light is adjusted by the first sub-lens in the lens to form a first image on a photosensitive surface of the image sensor, the second flash light is adjusted by the second sub-lens to form a second image on the photosensitive surface of the image sensor, and the image sensor is capable of transmitting signals of the first image and the second image to the processor.
In some embodiments, the processor may be defined as described above for the three-dimensional imaging method, and is not described here. In the above, the memory stores a computer program, and the processor executes the computer program to implement the steps in the above method embodiments.
It will be understood by those skilled in the art that the structure shown in fig. 18 is a block diagram of only the portion of the structure relevant to the present application and does not limit the three-dimensional imaging apparatus to which the present application is applied; a specific three-dimensional imaging apparatus may include more or fewer components than shown in the drawings, combine certain components, or arrange the components differently.
In an embodiment, a computer-readable storage medium is provided, in which a computer program is stored which, when being executed by a processor, carries out the steps of the above-mentioned method embodiments.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments can be implemented by a computer program instructing related hardware; the program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the above method embodiments. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein can include at least one of non-volatile and volatile memory. Non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, optical storage, or the like. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM can take many forms, such as static random access memory (SRAM) or dynamic random access memory (DRAM).
The three-dimensional imaging method, three-dimensional imaging apparatus 600, three-dimensional imaging device, and storage medium described above determine the three-dimensional model of the object surface from both the first image, which carries the two-dimensional information of the object surface, and the three-dimensional information image, which carries the three-dimensional information of the object surface, so the three-dimensional imaging accuracy of each frame can be effectively improved. Further, they can also be applied to continuous three-dimensional scanning: by combining the stitching processing of the first images and the three-dimensional information images described above, a stable and accurate continuous three-dimensional model can be obtained when the subject is scanned continuously.
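One way to read "determining the three-dimensional model from the first image and the three-dimensional information image" is to texture the point cloud with the two-dimensional image. The sketch below assumes a pinhole projection model and point coordinates already expressed in the first image's camera frame, neither of which is specified in this application; the function name `colorize_point_cloud` and its parameters are illustrative.

```python
import numpy as np

def colorize_point_cloud(points, image, fx, fy, cx, cy):
    """Attach two-dimensional texture from the first image to each 3D point.

    Assumes a pinhole model with focal lengths (fx, fy) and principal
    point (cx, cy); points projecting outside the image keep intensity 0.
    """
    # Project each 3D point (x, y, z) to pixel coordinates (u, v).
    u = (fx * points[:, 0] / points[:, 2] + cx).round().astype(int)
    v = (fy * points[:, 1] / points[:, 2] + cy).round().astype(int)
    h, w = image.shape[:2]
    valid = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    intensity = np.zeros(len(points), dtype=float)
    intensity[valid] = image[v[valid], u[valid]]
    return np.column_stack([points, intensity])  # (x, y, z, intensity) per point
```

A point at (0, 0, 1) with cx = cy = 2 projects to pixel (2, 2) and picks up that pixel's intensity.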
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; nevertheless, as long as a combination contains no contradiction, it should be considered within the scope of this specification.
The above embodiments express only several implementations of the present invention, and their descriptions are relatively specific and detailed, but they should not therefore be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the inventive concept, all of which fall within the protection scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (14)

1. A lens having an incident axis, comprising a lens element, wherein the lens element comprises a first sub-lens and at least two second sub-lenses, each second sub-lens has a non-rotationally symmetric structure, the first sub-lens comprises a first effective light-passing portion, each second sub-lens comprises a second effective light-passing portion, and the second effective light-passing portions of any two second sub-lenses are rotationally symmetric about the incident axis;
the first effective light-passing portion of the lens element is configured to pass an incident beam to form a first image on the image side of the lens; the second effective light-passing portions of the lens element are each configured to pass an incident beam so as to form, on the image side of the lens, second images equal in number to the second effective light-passing portions, the first image and the second images being spaced apart from one another.
2. The lens according to claim 1, wherein the first sub-lens and the second sub-lenses of the lens element can be spliced into a single lens that is rotationally symmetric about an optical axis.
3. The lens according to claim 1 or 2, wherein the first sub-lens and the second sub-lenses of the lens element are spaced apart from one another.
4. The lens according to claim 1 or 2, comprising a first imaging unit and at least two second imaging units, wherein the first imaging unit includes at least two of the first sub-lenses arranged along the direction of the incident axis, each second imaging unit includes at least two of the second sub-lenses arranged along the direction of the incident axis, the number of first sub-lenses in the first imaging unit equals the number of second sub-lenses in any one of the second imaging units, and each first sub-lens and each second sub-lens is included in one of the lens elements.
5. The lens according to claim 1 or 2, comprising first apertures and second apertures, wherein the number of first apertures equals the number of first sub-lenses in the lens element and the number of second apertures equals the number of second sub-lenses in the lens element; viewed along a direction parallel to the incident axis, the projection of the first sub-lens of the lens element on a plane perpendicular to the incident axis overlaps that of one of the first apertures, and the projection of each second sub-lens of the lens element on that plane overlaps that of one of the second apertures.
6. A three-dimensional imaging method applied to the lens according to any one of claims 1 to 5, the method comprising:
acquiring a first image and a second image of the same frame within a preset time, wherein the first image has two-dimensional surface information of a subject;
obtaining a frame of three-dimensional information image having three-dimensional point cloud information according to at least two second images of the same frame; and
determining a three-dimensional model of the subject for the frame according to the first image and the three-dimensional information image of the same frame.
7. The three-dimensional imaging method according to claim 6, further comprising:
acquiring the three-dimensional information images of two adjacent frames;
performing feature matching processing on the three-dimensional information images of the two adjacent frames to obtain a point cloud matching result;
and stitching the three-dimensional information images of the two adjacent frames according to the point cloud matching result.
8. The three-dimensional imaging method according to claim 6 or 7, characterized in that the method further comprises:
acquiring the first images of two adjacent frames;
performing feature matching processing on the first images of the two adjacent frames to obtain a two-dimensional matching result;
and stitching the first images of the two adjacent frames according to the two-dimensional matching result.
9. The three-dimensional imaging method according to claim 6, wherein the preset time is less than or equal to 200 ms.
10. A three-dimensional imaging apparatus, comprising:
an acquisition module, configured to acquire a first image and a second image of the same frame within a preset time, wherein the first image has two-dimensional surface information of a subject;
a processing module, configured to obtain a frame of three-dimensional information image having three-dimensional point cloud information according to at least two second images of the same frame; and
a determining module, configured to determine a three-dimensional model of the subject for the frame according to the first image and the three-dimensional information image of the same frame.
11. The three-dimensional imaging apparatus according to claim 10, further comprising a projection module configured to project a first flash and a second flash to the subject within a preset time;
wherein the acquisition module is configured to acquire the first image according to the first flash and the second image according to the second flash.
12. A three-dimensional imaging apparatus, comprising:
a projector, configured to project a first flash and a second flash to a subject within a preset time;
a memory, storing a computer program;
a receiver, configured to acquire, within the preset time, a first image according to the first flash and a second image according to the second flash, wherein the first image has two-dimensional surface information of the subject; and
a processor, configured to execute the computer program on the memory to: obtain a frame of three-dimensional information image having three-dimensional point cloud information according to at least two second images of the same frame; and determine a three-dimensional model of the subject for the frame according to the first image and the three-dimensional information image of the same frame.
13. The three-dimensional imaging apparatus according to claim 12, wherein the projector is capable of sequentially projecting the first flash and the second flash to the subject within the preset time.
14. A storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 6 to 9.
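The stitching of claims 7 and 8 pairs features between two adjacent frames and then splices the frames with the resulting match. For the point cloud case of claim 7, a minimal sketch is a rigid alignment via the Kabsch algorithm. It assumes the feature-matching stage has already paired the two clouds row by row, which the claim leaves to its own matching step; the function name `stitch_point_clouds` is illustrative, not from this application.

```python
import numpy as np

def stitch_point_clouds(prev_pts, curr_pts):
    """Rigidly align the current frame's point cloud to the previous frame's.

    Inputs are (N, 3) arrays whose rows are already matched pairs. The
    Kabsch algorithm recovers the rotation R and translation t that
    minimize ||R @ curr + t - prev|| in the least-squares sense.
    """
    mu_p, mu_c = prev_pts.mean(axis=0), curr_pts.mean(axis=0)
    # Cross-covariance of the centered clouds.
    H = (curr_pts - mu_c).T @ (prev_pts - mu_p)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection: force det(R) = +1.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_p - R @ mu_c
    aligned = curr_pts @ R.T + t
    # The stitched cloud is the union of the previous and aligned points.
    return np.vstack([prev_pts, aligned]), R, t
```

For noise-free matched pairs the aligned points coincide with the previous frame's; with real scan data the same step is typically iterated inside an ICP loop.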
CN202010542078.1A 2020-06-15 2020-06-15 Lens, three-dimensional imaging method, device, equipment and storage medium Pending CN111787301A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010542078.1A CN111787301A (en) 2020-06-15 2020-06-15 Lens, three-dimensional imaging method, device, equipment and storage medium


Publications (1)

Publication Number Publication Date
CN111787301A (en) 2020-10-16

Family

ID=72756500

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010542078.1A Pending CN111787301A (en) 2020-06-15 2020-06-15 Lens, three-dimensional imaging method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111787301A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021253149A1 (en) * 2020-06-15 2021-12-23 广东朗呈医疗器械科技有限公司 Lens, three-dimensional imaging module, apparatus, method, device and storage medium
CN115247777A (en) * 2022-07-20 2022-10-28 重庆长安汽车股份有限公司 Automobile tail lamp for improving qualified rate of appearance gap and surface difference, adjusting method and vehicle

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040125381A1 (en) * 2002-12-26 2004-07-01 Liang-Chia Chen Miniature three-dimensional contour scanner
US20070188601A1 (en) * 2006-02-13 2007-08-16 Janos Rohaly Three-channel camera systems with non-collinear apertures
CN104935915A (en) * 2015-07-17 2015-09-23 珠海康弘发展有限公司 Imaging device and three-dimensional imaging system and method
CN106791498A (en) * 2016-11-18 2017-05-31 成都微晶景泰科技有限公司 Image position method, lens array imaging method and device
CN108369369A (en) * 2015-12-14 2018-08-03 奥林巴斯株式会社 Photographic device
CN110505385A (en) * 2019-08-29 2019-11-26 Oppo广东移动通信有限公司 Imaging system, terminal and image acquiring method
US20190379881A1 (en) * 2018-06-08 2019-12-12 Dentsply Sirona Inc. Device, method and system for generating dynamic projection patterns in a camera
CN111238494A (en) * 2018-11-29 2020-06-05 财团法人工业技术研究院 Carrier, carrier positioning system and carrier positioning method




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination