CN116250246A - Image pickup module device, multi-image pickup module, image pickup system, electronic apparatus, and auto-zoom imaging method - Google Patents


Info

Publication number
CN116250246A
CN116250246A (application CN202180060940.8A)
Authority
CN
China
Prior art keywords
unit, camera, camera module, image, distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202180060940.8A
Other languages
Chinese (zh)
Inventor
戎琦
袁栋立
王启
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ningbo Sunny Opotech Co Ltd
Original Assignee
Ningbo Sunny Opotech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN202010753090.7A external-priority patent/CN114070997A/en
Priority claimed from CN202010751498.0A external-priority patent/CN114070994B/en
Application filed by Ningbo Sunny Opotech Co Ltd filed Critical Ningbo Sunny Opotech Co Ltd
Publication of CN116250246A publication Critical patent/CN116250246A/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules

Abstract

Disclosed are a camera module device, a multi-camera module, a camera system, an electronic device, and an auto-zoom imaging method. The multi-camera module comprises a first camera unit with a zoom function and a second camera unit in a preset relative positional relationship with the first camera unit. Based on an adjustment instruction, a driving assembly of the first camera unit drives at least some lenses in at least one lens group of the first camera unit to perform optical zooming. The adjustment instruction is generated from the distance between the multi-camera module and the subject, and that distance is obtained from the first and second camera units by the principle of binocular ranging. The structural configuration of the multi-camera module thus enables automatic optical zooming based on the distance to the subject, providing a better shooting experience.

Description

Image pickup module device, multi-image pickup module, image pickup system, electronic apparatus, and auto-zoom imaging method
Technical Field
The present application relates to the field of camera modules, and in particular to a camera module device, a multi-camera module, a camera system, an electronic device, and an auto-zoom imaging method.
Background
With the popularity of mobile electronic devices, the technologies that help users acquire images (e.g., photos or videos) with the camera modules of those devices have developed rapidly. In particular, with the development of smartphones, consumers pursue increasingly diverse shooting functions and demand higher imaging quality, which poses new challenges for camera modules.
In recent years, camera modules have evolved from single-camera to multi-camera designs, and more recently camera modules with optical zoom capability have been mounted on smartphones to meet the requirements of shooting at different distances.
In current schemes, when shooting with a camera module that has an optical zoom function, the zoom operation must be performed manually. In one common approach, the user slides a finger on the screen to control the optical zoom and switch between near and far views. This manual zoom operation has several drawbacks in actual use.
First, the user holds the mobile electronic device in one hand and slides on the screen with the other, which reduces the stability of the grip and results in blurred, unclear images or videos.
Second, when the user shoots at long range through the zoomed camera module, the adverse effects of hand shake are amplified, degrading the shooting experience.
Third, the adjustment amount is difficult to control by manual zooming; the view is often over-enlarged or over-reduced, so the user typically needs several manual adjustments before the shooting range and imaging quality meet expectations.
Therefore, there is a need for a multi-camera module with auto-zoom, and an imaging scheme for it.
Disclosure of Invention
An advantage of the present application is to provide a multi-camera module, a camera system, an electronic device, and an auto-zoom imaging method in which the structural configuration of the multi-camera module enables automatic optical zooming based on the distance between the module and the subject, so as to provide a better shooting experience.
Another advantage of the present application is to provide a multi-camera module, a camera system, an electronic device, and an auto-zoom imaging method in which the multi-camera module measures distance with its own camera units, supplying the distance information required for automatic optical zooming. In other words, the multi-camera module according to embodiments of the present application extends the functions of its existing camera units to implement auto-zoom, and therefore realizes automatic optical zooming without an additional ranging module.
Other advantages and features of the present application will become apparent from the following description, and may be realized by means of the instrumentalities and combinations particularly pointed out in the claims.
To achieve at least one of the above objects or advantages, the present application provides a multi-camera module, comprising:
a first camera unit, comprising a photosensitive chip, at least one lens group located on a photosensitive path of the photosensitive chip, and a driving assembly for driving at least some lenses in the at least one lens group to perform optical zooming; and
a second camera unit in a preset relative positional relationship with the first camera unit;
wherein the driving assembly is configured to drive at least some lenses in the at least one lens group to perform optical zooming based on an adjustment instruction; the adjustment instruction is generated based on the distance between the multi-camera module and a subject; and that distance is calculated based at least in part on a first image of the subject acquired by the first camera unit, a second image of the subject acquired by the second camera unit, and the relative positional relationship between the first camera unit and the second camera unit.
In the multi-camera module according to the present application, the at least one lens group comprises a first lens group and a second lens group, and the driving assembly comprises a first driving element configured to drive at least some lenses in the first lens group to perform optical zooming based on the adjustment instruction.
In the multi-camera module according to the present application, the driving assembly further includes a second driving element configured to drive the second lens group for optical focusing based on the adjustment instruction.
In the multi-camera module according to the present application, the first camera unit further includes a reflective element disposed on a photosensitive path of the photosensitive chip for turning imaging light.
In the multi-camera module according to the present application, the driving assembly further includes an anti-shake mechanism for driving the reflective element to perform optical anti-shake.
In the multi-camera module according to the present application, the driving assembly further includes an anti-shake mechanism for driving the first lens group and/or the second lens group to perform optical anti-shake.
In the multi-camera module according to the present application, the multi-camera module further comprises a third camera unit in a preset positional relationship with the first camera unit, a third equivalent focal length of the third camera unit being greater than a second equivalent focal length of the second camera unit; wherein when a first distance between the multi-camera module and the subject, obtained based at least in part on the first image of the subject acquired by the first camera unit, the second image of the subject acquired by the second camera unit, and the relative positional relationship between the first camera unit and the second camera unit, exceeds a preset threshold, the adjustment instruction is instead generated based on a second distance between the multi-camera module and the subject, the second distance being obtained based at least in part on the first image of the subject acquired by the first camera unit, a third image of the subject acquired by the third camera unit, and the relative positional relationship between the first camera unit and the third camera unit.
According to another aspect of the present application, there is also provided an image capturing system, comprising:
the multi-camera module as described above; and
a processor communicatively coupled to the multi-camera module, wherein the processor is configured to generate the adjustment instruction based on the distance between the multi-camera module and a subject.
In the image capturing system according to the present application, the processor is further configured to fuse the first image of the subject acquired by the first camera unit after optical zooming with the second image of the subject acquired by the second camera unit to obtain a fused image of the subject; or to fuse the first image of the subject acquired by the first camera unit after optical zooming with the third image of the subject acquired by the third camera unit to obtain a fused image of the subject.
According to still another aspect of the present application, there is also provided an auto-zoom imaging method including:
obtaining a zoom instruction;
in response to the zoom instruction, acquiring a distance between a multi-camera module and a subject, wherein the multi-camera module comprises a first camera unit;
generating an adjustment instruction based on the distance, wherein the adjustment instruction is used to drive a driving assembly of the first camera unit to drive at least some lenses in at least one lens group of the first camera unit to perform optical zooming; and
fusing a first image of the subject acquired by the first camera unit after optical zooming with images of the subject acquired by other camera units of the multi-camera module to obtain a fused image.
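The steps above can be sketched as a short pipeline: measure the distance by binocular ranging, then translate it into an adjustment instruction for the zoom driver. The distance-to-focal-length mapping below is a hypothetical placeholder; the patent only specifies that the adjustment instruction is derived from the distance.

```python
# Minimal sketch of the auto-zoom flow. All numeric ranges are illustrative,
# not taken from the patent.

def binocular_distance(focal_px: float, baseline_mm: float, disparity_px: float) -> float:
    """Depth from disparity: Z = f * B / d (binocular ranging)."""
    return focal_px * baseline_mm / disparity_px

def adjustment_instruction(distance_mm: float, min_f: float = 10.0, max_f: float = 50.0) -> float:
    """Map subject distance to a target focal length (hypothetical linear map)."""
    t = min(distance_mm / 10000.0, 1.0)   # clamp at 10 m
    return min_f + t * (max_f - min_f)

def auto_zoom(focal_px: float, baseline_mm: float, disparity_px: float) -> float:
    """Distance -> adjustment instruction (target focal length for the driver)."""
    z = binocular_distance(focal_px, baseline_mm, disparity_px)
    return adjustment_instruction(z)

print(auto_zoom(1000.0, 20.0, 4.0))  # distance 5000 mm -> focal length 30.0
```

The image-fusion step is omitted here; it would combine the zoomed first image with the other units' images.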
In the auto-zoom imaging method according to the present application, acquiring the distance between the multi-camera module and the subject in response to the zoom instruction comprises:
obtaining a first image of the subject by the first imaging unit;
obtaining a second image of the subject by the second imaging unit; and
obtaining a first distance between the multi-camera module and the subject based at least in part on the first image, the second image, and the relative positional relationship between the first camera unit and the second camera unit, the first distance serving as the distance between the multi-camera module and the subject.
In the auto-zoom imaging method according to the present application, before obtaining the first image of the subject by the first camera unit, the method further comprises: pre-driving the driving assembly of the first camera unit to drive at least some lenses in at least one lens group of the first camera unit to perform optical zooming.
In the auto-zoom imaging method according to the present application, acquiring the distance between the multi-camera module and the subject in response to the zoom instruction further comprises:
when the first distance is greater than a preset threshold, activating a third camera unit of the multi-camera module to obtain a third image of the subject through the third camera unit, a third equivalent focal length of the third camera unit being greater than a second equivalent focal length of the second camera unit; and
obtaining a second distance between the multi-camera module and the subject based at least in part on the first image, the third image, and the relative positional relationship between the first camera unit and the third camera unit, the second distance serving as the distance between the multi-camera module and the subject.
In the auto-zoom imaging method according to the present application, the adjustment instruction is further configured to drive the first driving element of the driving assembly to drive at least some lenses in the first lens group of the at least one lens group to perform optical zooming.
In the auto-zoom imaging method according to the present application, the adjustment instruction is further configured to drive the second driving element of the driving assembly to drive the second lens group of the at least one lens group to perform optical focusing.
In the auto-zoom imaging method according to the present application, fusing the first image of the subject acquired by the first camera unit after optical zooming with images of the subject acquired by other camera units of the multi-camera module to obtain a fused image comprises: fusing the first image of the subject acquired by the first camera unit after optical zooming with the second image of the subject acquired by the second camera unit to obtain a fused image.
In the auto-zoom imaging method according to the present application, fusing the first image of the subject acquired by the first camera unit after optical zooming with images of the subject acquired by other camera units of the multi-camera module to obtain a fused image comprises: fusing the first image of the subject acquired by the first camera unit after optical zooming with the third image of the subject acquired by the third camera unit to obtain a fused image.
In the auto-zoom imaging method according to the present application, the method further comprises: moving the multi-camera module based on the motion trajectory of the subject so that the subject always remains within the shooting window of the multi-camera module.
In the auto zoom imaging method according to the present application, the auto zoom imaging method further includes: the reflecting element of the first imaging unit is moved based on the movement locus of the subject.
In the auto zoom imaging method according to the present application, the auto zoom imaging method further includes: the reflective element of the first imaging unit is rotated based on the movement locus of the subject.
According to still another aspect of the present application, there is also provided a camera module device, including:
an image capturing unit, comprising a photosensitive chip, at least one lens group located on a photosensitive path of the photosensitive chip, and a driving assembly for driving at least some lenses in the at least one lens group to perform optical zooming; and
a ranging unit;
wherein the driving assembly is configured to drive at least some lenses in the at least one lens group to perform optical zooming based on an adjustment instruction; the adjustment instruction is generated based on the distance between the camera module device and the subject; and that distance is measured by the ranging unit.
In the camera module device according to the present application, the at least one lens group comprises a first lens group and a second lens group, and the driving assembly comprises a first driving element configured to drive at least some lenses in the first lens group to perform optical zooming based on the adjustment instruction.
In the image capturing module apparatus according to the present application, the driving assembly further includes a second driving element configured to drive the second lens group for optical focusing based on the adjustment instruction.
In the image capturing module device according to the present application, the image capturing unit further includes a reflective element disposed on the photosensitive path of the photosensitive chip for turning the imaging light.
In the camera module device according to the present application, the driving assembly further includes an anti-shake mechanism for driving the reflecting element to perform optical anti-shake.
In the camera module device according to the present application, the driving assembly further includes an anti-shake mechanism for driving the first lens group and/or the second lens group to perform optical anti-shake.
In the camera module device according to the present application, the ranging unit comprises a projector configured to project a detection signal of a specific wavelength toward the subject, and a receiver configured to receive the detection signal reflected back from the subject; the distance between the ranging unit and the subject is determined based on the time-of-flight principle.
In the camera module device according to the present application, the detection signal may be an ultrasonic detection signal, a millimeter-wave detection signal, or a laser pulse detection signal.
In the camera module device according to the present application, the ranging unit is implemented as a TOF camera unit, so that the distance between the camera module device and the subject is acquired by the TOF camera unit.
According to another aspect of the present application, there is also provided an image pickup system including:
the camera module device as described above; and
a processor communicatively coupled to the camera module device, wherein the processor is configured to generate the adjustment instruction based on the distance between the camera module device and a subject.
According to still another aspect of the present application, there is also provided an auto-zoom imaging method including:
obtaining a zoom instruction;
in response to the zoom instruction, acquiring a distance between the camera module device and a subject through the ranging unit;
generating an adjustment instruction based on the distance, wherein the adjustment instruction is used to drive the driving assembly of the image capturing unit to drive at least some lenses in at least one lens group of the image capturing unit to perform optical zooming; and
obtaining an image of the subject acquired by the image capturing unit after optical zooming.
In the auto-zoom imaging method according to the present application, acquiring the distance between the camera module device and the subject through the ranging unit in response to the zoom instruction comprises:
projecting a detection signal toward the subject;
receiving the detection signal reflected back from the subject; and
determining the distance between the ranging unit and the subject based on the time-of-flight principle, the distance between the ranging unit and the subject being taken as the distance between the camera module device and the subject.
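The time-of-flight computation above reduces to distance = speed × round-trip time / 2, with the propagation speed depending on the signal type (light for laser and millimeter-wave signals, sound for ultrasound). A minimal sketch:

```python
# Time-of-flight ranging: the detection signal travels to the subject and
# back, so the one-way distance is speed * t / 2.

SPEED_OF_LIGHT = 299_792_458.0   # m/s (laser pulse, millimeter wave)
SPEED_OF_SOUND = 343.0           # m/s in air, approximate (ultrasound)

def tof_distance(round_trip_s: float, speed: float = SPEED_OF_LIGHT) -> float:
    """Distance in meters from a measured round-trip time in seconds."""
    return speed * round_trip_s / 2.0

# A 20 ns round trip of a laser pulse corresponds to about 3 m:
print(round(tof_distance(20e-9), 3))  # 2.998
```

This is why laser TOF needs nanosecond-scale timing while ultrasonic ranging can work with millisecond-scale timing.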
In the auto-zoom imaging method according to the present application, the adjustment instruction is further configured to drive the first driving element of the driving assembly to drive at least some lenses in the first lens group of the at least one lens group to perform optical zooming.
In the auto-zoom imaging method according to the present application, the adjustment instruction is further configured to drive the second driving element of the driving assembly to drive the second lens group of the at least one lens group to perform optical focusing.
In the auto-zoom imaging method according to the present application, the method further comprises: moving the camera module device based on the motion trajectory of the subject so that the subject always remains within the shooting window of the camera module device.
In the auto-zoom imaging method according to the present application, the method further comprises: moving the reflective element of the image capturing unit based on the motion trajectory of the subject.
In the auto-zoom imaging method according to the present application, the method further comprises: rotating the reflective element of the image capturing unit based on the motion trajectory of the subject.
Further objects and advantages of the present application will become fully apparent from the following description and the accompanying drawings.
These and other objects, features, and advantages of the present application will become more fully apparent from the following detailed description, the accompanying drawings, and the appended claims.
Drawings
The foregoing and other objects, features and advantages of the present application will become more apparent from the following more particular description of embodiments of the present application, as illustrated in the accompanying drawings. The accompanying drawings are included to provide a further understanding of embodiments of the application; they are incorporated in and constitute a part of this specification, serve to explain the application, and do not constitute a limitation of the application. In the drawings, like reference numerals generally refer to like parts or steps.
Fig. 1 illustrates a schematic diagram of a multi-shot camera module according to an embodiment of the present application.
Fig. 2 illustrates a schematic diagram of the binocular distance principle according to an embodiment of the present application.
Fig. 3 illustrates a schematic diagram of another multi-shot camera module according to an embodiment of the present application.
Fig. 4 illustrates a schematic diagram of an imaging system according to an embodiment of the present application.
Fig. 5 illustrates a perspective schematic view of an electronic device according to an embodiment of the present application.
Fig. 6 illustrates a flow chart of an auto zoom imaging method according to an embodiment of the present application.
Fig. 7 illustrates a first schematic diagram of tracking a subject object in an auto-zoom imaging method according to an embodiment of the present application.
Fig. 8 illustrates a second schematic diagram of tracking a subject object in an auto-zoom imaging method according to an embodiment of the present application.
Fig. 9 illustrates a third schematic diagram of tracking a subject object in an auto-zoom imaging method according to an embodiment of the present application.
Fig. 10 illustrates a schematic diagram of an image capturing module apparatus according to an embodiment of the present application.
Fig. 11 illustrates a ranging schematic of a ranging unit according to an embodiment of the present application.
Fig. 12 illustrates a schematic diagram of the ranging unit implemented as a TOF camera unit according to an embodiment of the present application.
Fig. 13 illustrates a schematic view of a partition of a projection area of a projection element of the TOF camera unit according to an embodiment of the present application.
Fig. 14 illustrates another schematic diagram of the ranging unit implemented as a TOF camera unit according to an embodiment of the present application.
Fig. 15 illustrates a flowchart of an auto zoom imaging method according to an embodiment of the present application.
Fig. 16 illustrates a schematic diagram of an imaging system according to an embodiment of the present application.
Fig. 17 illustrates a perspective schematic view of an electronic device according to an embodiment of the present application.
Fig. 18 illustrates a first schematic diagram of tracking a subject object in an auto-zoom imaging method according to an embodiment of the present application.
Fig. 19 illustrates a second schematic diagram of tracking a subject object in an auto-zoom imaging method according to an embodiment of the present application.
Fig. 20 illustrates a third schematic diagram of tracking a subject object in an auto-zoom imaging method according to an embodiment of the present application.
Detailed Description
Hereinafter, example embodiments according to the present application will be described in detail with reference to the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application and not all of the embodiments of the present application, and it should be understood that the present application is not limited by the example embodiments described herein.
Exemplary Multi-shot Camera Module
Fig. 1 illustrates a multi-camera module according to an embodiment of the present application. The multi-camera module is structurally configured to perform automatic optical zooming based on the distance between the module and a subject, so as to provide a better shooting experience.
As shown in fig. 1, the multi-camera module 710 according to the embodiment of the present application includes a first camera unit 711 having an optical zoom function and a second camera unit 712 in a preset relative positional relationship with the first camera unit 711. Specifically, the first camera unit 711 includes a photosensitive chip 7111, at least one lens group 7112 disposed on the photosensitive path of the photosensitive chip 7111, and a driving assembly 7113 for driving at least some lenses in the at least one lens group 7112 to perform optical zooming.
Accordingly, both the first camera unit 711 and the second camera unit 712 can capture images of a subject. In the embodiment of the present application, their role is further extended: the two units measure the distance between the multi-camera module 710 and the subject based on the principle of binocular ranging. That is, the distance is calculated at least in part from the first image of the subject captured by the first camera unit 711, the second image of the subject captured by the second camera unit 712, and the relative positional relationship between the two units. The driving assembly 7113 of the first camera unit 711 is then configured to drive at least some lenses in the at least one lens group 7112 to perform optical zooming based on an adjustment instruction generated from this distance information. In this way, the structural configuration of the multi-camera module 710 implements an automatic optical zoom function.
Fig. 2 illustrates the principle of binocular ranging according to an embodiment of the present application. As shown in fig. 2, P is a point on the subject; O_R and O_T are the optical centers of the first camera unit 711 and the second camera unit 712, respectively; p and p' are the imaging points of P on the photosensitive chips 7111 of the two units; f is the effective focal length of the second camera unit 712; B is the center distance (baseline) between the first camera unit 711 and the second camera unit 712; and Z is the depth (i.e., distance) to be calculated. Let X be the distance from imaging point p to p':
X = B − (X_R − X_T)
By similar triangles:
(B − (X_R − X_T)) / B = (Z − f) / Z
It follows that:
Z = f·B / (X_R − X_T)
The focal length f and the baseline B can be obtained by calibration, so the depth Z (i.e., the distance) can be determined once the disparity d = X_R − X_T is obtained.
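The derivation above reduces to a one-line computation once f, B, and the disparity are known. The numeric values below are illustrative, not taken from the patent:

```python
# Binocular ranging: depth Z = f * B / d, where d = X_R - X_T is the disparity
# between the two imaging points of the same subject point.

def depth_from_disparity(focal_px: float, baseline_mm: float, disparity_px: float) -> float:
    """Return the depth (same unit as the baseline) from disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_mm / disparity_px

# Example: f = 1000 px, baseline B = 20 mm, disparity d = 4 px -> Z = 5000 mm
print(depth_from_disparity(1000.0, 20.0, 4.0))  # 5000.0
```

Note the inverse relationship: as the subject moves farther away, the disparity shrinks, which is why ranging accuracy degrades with distance.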
It should be noted that the first camera unit 711 and the second camera unit 712 exhibit radial distortion due to the characteristics of their optical lenses; the degree of this distortion can be described by three parameters k1, k2, and k3. In addition, because assembly errors leave the photosensitive chip 7111 not perfectly parallel to the optical lens, the imaging of the two units also exhibits tangential distortion, described by two parameters p1 and p2. Calibration of a single camera unit mainly determines its intrinsic parameters (including but not limited to the focal length f, the principal point Cx, Cy, and the five distortion parameters above) and its extrinsic parameters (the world coordinates of the calibration target). Calibration of the binocular camera pair (the combination of the first camera unit 711 and the second camera unit 712) must additionally derive the relative position between the two units, i.e., the rotation matrix R and the translation vector t of the second camera unit 712 relative to the first camera unit 711.
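A sketch of the five-parameter distortion model described above (three radial and two tangential coefficients, here written with the conventional names k1, k2, k3, p1, p2), applied to normalized image coordinates; the coefficient values in the example are hypothetical:

```python
# Brown-Conrady distortion: radial terms scale with r^2, r^4, r^6;
# tangential terms model the sensor/lens tilt.

def distort(x: float, y: float, k1: float, k2: float, k3: float,
            p1: float, p2: float) -> tuple:
    """Map an undistorted normalized point (x, y) to its distorted position."""
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2**2 + k3 * r2**3
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return xd, yd

# With all coefficients zero the mapping is the identity:
print(distort(0.1, 0.2, 0, 0, 0, 0, 0))  # (0.1, 0.2)
```

Undistortion (the correction applied after calibration) inverts this mapping, typically by iteration, since the forward model has no closed-form inverse.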
After calibration, the images acquired by the first camera unit 711 and the second camera unit 712 can be corrected based on the calibrated intrinsic parameters and the binocular relative positional relationship, eliminating distortion and performing row alignment for binocular ranging.
It should be noted that, in the embodiment of the present application, the ranging accuracy required of the multi-camera module 710 is relatively low. To reduce the workload, calibration may therefore be performed only at selected points, for example points within the depth-of-field range (i.e., the window area) of the first camera unit 711 and the second camera unit 712. For instance, the nearest depth-of-field point X1 and the farthest depth-of-field point X2 are selected, the line between them is divided into N equal segments, and calibration is performed at the nearest point, the farthest point, and each division point; the acquired parameters are then burned into storage. This improves ranging efficiency and reduces the workload required for calibration.
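The point-selection scheme above amounts to sampling N + 1 equally spaced depths between the nearest and farthest depth-of-field points. A minimal sketch, with illustrative depth values:

```python
# Divide the line between the nearest and farthest depth-of-field points into
# N equal segments; calibrate at both endpoints and every division point.

def calibration_depths(z_near: float, z_far: float, n: int) -> list:
    """Return the n + 1 calibration depths (endpoints included)."""
    step = (z_far - z_near) / n
    return [z_near + i * step for i in range(n + 1)]

print(calibration_depths(100.0, 500.0, 4))  # [100.0, 200.0, 300.0, 400.0, 500.0]
```

Distances between the sampled depths would be interpolated from the burned-in parameters at runtime.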
Further, in the present embodiment, the second image capturing unit 712 may be configured as a main image capturing unit for capturing a second image of the subject, and preferably has a relatively large angle of view, for example greater than 60°; the first image capturing unit 711 is configured as a sub image capturing unit that adjusts its focal length based on the distance between the multi-camera module and the subject and captures a first image of the subject. Accordingly, after the first image and the second image are obtained, the multi-camera module 710 can fuse the first image and the second image to generate a fused image with higher imaging quality.
More specifically, as shown in fig. 1, in the embodiment of the present application, the at least one lens group 7112 of the first image capturing unit 711 includes a first lens group 7114 and a second lens group 7115, and the driving assembly 7113 includes a first driving element 7117, and the first driving element 7117 is configured to drive at least part of lenses in the first lens group 7114 to perform optical zooming based on the adjustment instruction. That is, in the embodiment of the present application, the first driving element 7117 is a zoom driver for driving at least part of lenses in the first lens group 7114 to move for optical zooming.
As shown in fig. 1, in the embodiment of the present application, the driving assembly 7113 of the first image capturing unit 711 further includes a second driving element 7118 configured to drive the second lens group 7115 to perform optical focusing based on the adjustment instruction. That is, in the embodiment of the present application, the first image capturing unit 711 also has a focusing function, and the second driving element 7118 is a focusing driver. It should be appreciated that, after optical zooming by the first driving element 7117, the second driving element 7118 can drive the second lens group 7115 to move so as to achieve optical focusing and also to compensate for the influence of the optical zooming, thereby improving imaging quality. In other words, the first driving element 7117 and the second driving element 7118 respectively drive the first lens group 7114 and the second lens group 7115 to jointly realize optical zooming while keeping the TTL (Total Track Length) unchanged, so that high imaging quality is maintained after zooming: the first lens group 7114 is driven by the first driving element 7117 to realize zooming, and the second lens group 7115 is driven by the second driving element 7118 to realize compensation and/or focusing, thereby obtaining a high-quality image.
It should be noted that, in other examples of the present application, the first driving element 7117 and the second driving element 7118 may be implemented as the same driver (i.e., the zoom driver and the focus driver are the same driver), or may have a unitary structure, which is not limited in the present application. Preferably, the first lens group 7114 and the second lens group 7115 are both fixed to a common connection structure so that they do not deviate during movement; this structure may be a guide rail (both the first lens group 7114 and the second lens group 7115 assembled to the same guide rail) or a hole-and-shaft structure (the first lens group 7114 and the second lens group 7115 connected by a shaft and moving along the shaft).
It should be understood that, in the embodiment of the present application, the at least one lens group 7112 may further include a greater number of lens groups, for example a third lens group 7126 whose position is fixed as a fixed lens group, which is not limited in this application.
It is also worth mentioning that some terminal devices (e.g., smartphones) impose a requirement on the thickness of the multi-camera module 710, i.e., the thickness of the multi-camera module 710 must be kept below a certain value. Accordingly, in other examples of the present application, the first image capturing unit 711 may be implemented as a periscopic image capturing unit; in these examples, the first image capturing unit 711 further includes a reflective element 7119 disposed on the photosensitive path of the photosensitive chip 7111 for turning the imaging light.
In order to further improve the imaging performance of the first image capturing unit 711, in some examples of the present application, the first image capturing unit 711 is further configured with an optical anti-shake function. For example, in some examples of the present application, the reflective element 7119 disposed on the photosensitive path of the photosensitive chip 7111 for turning the imaging light may be driven for this purpose; alternatively, the driving assembly 7113 further includes an anti-shake mechanism for driving the first lens group 7114 and/or the second lens group 7115 to perform optical anti-shake, thereby compensating for errors caused by the photographer's hand shake.
In practical applications, when the subject is far from the multi-camera module 710, the first camera unit 711 and the second camera unit 712 may be unable to measure the distance, that is, the distance between the subject and the multi-camera module 710 may exceed their ranging range. Accordingly, as shown in fig. 3, in the embodiment of the present application, a third camera unit 713 with a farther shooting range may be added to the original structure of the multi-camera module 710, that is, the third equivalent focal length of the third camera unit 713 is greater than the second equivalent focal length of the second camera unit 712. The third image capturing unit 713 and the first image capturing unit 711 then obtain the distance between the multi-camera module 710 and the subject by the binocular ranging principle, that is, the distance is calculated based at least in part on the first image of the subject acquired by the first image capturing unit 711, the third image of the subject acquired by the third image capturing unit 713, and the relative positional relationship between the first image capturing unit 711 and the third image capturing unit 713.
More specifically, in the embodiment of the present application, when the distance between the multi-camera module 710 and the subject is greater than a preset threshold, the first camera unit 711 and the third camera unit 713 cooperate to measure the distance; when the distance is less than the preset threshold, the first camera unit 711 and the second camera unit 712 cooperate to measure it. That is, in some examples of the present application, the multi-camera module 710 may further include a determining module configured to judge the shooting distance so as to decide which two camera units are activated for ranging.
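The judging logic of the determining module can be sketched as below. This is a hedged illustration only: the threshold value and the unit names are assumptions, not values from this application.

```python
PRESET_THRESHOLD_MM = 3000.0  # assumed value for illustration only

def select_ranging_pair(distance_mm: float) -> tuple[str, str]:
    """Decide which pair of camera units performs binocular ranging.
    Beyond the threshold, the long-focus third unit replaces the second."""
    if distance_mm > PRESET_THRESHOLD_MM:
        return ("first_camera_unit_711", "third_camera_unit_713")
    return ("first_camera_unit_711", "second_camera_unit_712")
```

In practice the module would first obtain a coarse first distance with the default pair, then switch pairs when that distance exceeds the threshold.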
In summary, the multi-camera module 710 according to the embodiment of the present application is illustrated, and the structural configuration of the multi-camera module 710 enables the multi-camera module 710 to perform automatic optical zooming based on the distance between the multi-camera module 710 and the object to be photographed, so as to provide a better photographing experience.
Exemplary Imaging System
According to another aspect of the present application, there is also provided an imaging system.
Fig. 4 illustrates a schematic diagram of the imaging system according to an embodiment of the present application.
As shown in fig. 4, the imaging system 730 includes the multi-camera module 710 as described above and a processor 720 communicatively coupled to the multi-camera module 710, wherein the processor 720 is configured to generate adjustment instructions based on a distance of the multi-camera module 710 relative to a subject. Accordingly, the driving component 7113 of the first image capturing unit 711 drives at least part of the lenses in the at least one lens group 7112 to perform optical zooming after receiving the adjustment command, and in this way, the image capturing system 730 implements an auto-zoom capturing function.
Accordingly, in the embodiment of the present application, the processor 720 is further configured to fuse the first image of the subject acquired by the first camera unit 711 after optical zooming with the second image of the subject acquired by the second camera unit 712, so as to obtain a fused image of the subject.
Alternatively, in the embodiment of the present application, the processor 720 is further configured to fuse the first image of the subject acquired by the first camera unit 711 after optical zooming with the third image of the subject acquired by the third camera unit 713, so as to obtain a fused image of the subject.
Exemplary Electronic Device
According to another aspect of the application, an electronic device is also provided.
Fig. 5 illustrates a perspective schematic view of an electronic device according to an embodiment of the present application.
As shown in fig. 5, the electronic device 7100 according to the embodiment of the present application includes an electronic device main body 7101 and the multi-camera module 710 as described above assembled to the electronic device main body 7101. In implementation, the multi-camera module 710 is preferably disposed on the back of the electronic device main body 7101 as a rear camera module, but may also be disposed on the front of the electronic device main body 7101 as a front camera module.
As shown in fig. 5, in the embodiment of the present application, the electronic device main body 7101 includes a screen and an integrated circuit, where the screen may be used to display the image data collected by the multi-camera module 710, and the integrated circuit may be used to process the image data collected by the multi-camera module 710, so as to control the multi-camera module 710 to implement an auto-zoom shooting function.
Exemplary Auto-Zoom Imaging Method
According to yet another aspect of the present application, there is also provided an auto-zoom imaging method.
Fig. 6 illustrates a flow chart of an auto zoom imaging method according to an embodiment of the present application.
As shown in fig. 6, the auto zoom imaging method according to an embodiment of the present application includes the steps of: s7110, a zooming instruction is obtained; s7120, in response to the zoom instruction, acquiring a distance between the multi-shot image capturing module 710 and a subject, the multi-shot image capturing module 710 including a first image capturing unit 711; s7130, based on the distance, generating an adjustment instruction, where the adjustment instruction is used to drive the driving component 7113 of the first image capturing unit 711 to drive at least part of lenses in the at least one lens group 7112 of the first image capturing unit 711 to perform optical zooming; and S7140, fusing the first image of the subject acquired by the first image capturing unit 711 after performing optical zooming with the images of the subject acquired by the other image capturing units of the multi-camera module 710 to obtain a fused image.
In step S7110, a zoom instruction is acquired. In the embodiment of the present application, the zoom instruction includes, but is not limited to, a single tap on the subject displayed on the screen, a two-finger pinch or stretch gesture on the screen, and the like. It should be appreciated that the zoom instruction may be set in advance based on the needs of the user, and must not conflict with other instruction settings.
In step S7120, in response to the zoom instruction, a distance between the multi-shot image capturing module 710 and the subject is acquired, the multi-shot image capturing module 710 including the first image capturing unit 711.
In an example of the present application, in response to the zoom instruction, a process of acquiring a distance between the multi-shot camera module 710 and a subject includes: first, a first image of the subject is obtained by the first image pickup unit 711 and a second image of the subject is obtained by the second image pickup unit 712; then, a first distance between the multi-shot camera module 710 and the subject is obtained based at least in part on the first image, the second image, and a relative positional relationship between the first camera unit 711 and the second camera unit 712, wherein the first distance is a distance between the multi-shot camera module 710 and the subject.
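After the rectification described in the calibration section (distortion removed, rows aligned), obtaining the first distance from the first and second images reduces to the classic rectified-stereo triangulation Z = f · B / d. The sketch below shows that generic relation; it is not code from this application, and the numbers are invented.

```python
def binocular_distance_mm(focal_px: float, baseline_mm: float,
                          disparity_px: float) -> float:
    """Distance to the subject from the focal length (pixels), the baseline
    between the two camera units (mm), and the measured disparity (pixels)."""
    if disparity_px <= 0.0:
        raise ValueError("no disparity: subject out of ranging range")
    return focal_px * baseline_mm / disparity_px

# e.g. f = 1000 px, baseline = 20 mm, disparity = 10 px
distance = binocular_distance_mm(1000.0, 20.0, 10.0)
```

A vanishing disparity corresponds to the far-subject case discussed earlier, where the second unit must be replaced by the longer-focus third unit.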
It should be noted that, when ranging is performed by the first image capturing unit 711 and the second image capturing unit 712 according to the binocular ranging principle, in order to capture a better first image and thereby improve ranging accuracy, the driving assembly 7113 may move at least some of the lenses in the at least one lens group 7112 of the first image capturing unit 711 in advance to perform optical zooming before the binocular ranging is carried out.
That is, in the embodiment of the present application, before the first image of the subject is obtained by the first image capturing unit 711, it further includes: the driving component 7113 of the first image capturing unit 711 is pre-driven to drive at least part of the lenses in the at least one lens group 7112 of the first image capturing unit 711 for optical zooming.
It should be appreciated that, because the first image capturing unit 711 is pre-zoomed in step S7120, the moving distance that the driving assembly 7113 must cover for zooming in step S7130 is reduced, which improves zooming efficiency and enhances the shooting experience.
In another example of the present application, in response to the zoom instruction, a process of acquiring a distance between the multi-shot camera module 710 and a subject further includes: in response to the first distance being greater than a preset threshold, activating a third imaging unit 713 of the multi-camera module 710 to obtain a third image of the subject through the third imaging unit 713, wherein a third equivalent focal length of the third imaging unit 713 is greater than a second equivalent focal length of the second imaging unit 712; and obtaining a second distance between the multi-shot camera module 710 and the subject based at least in part on the first image, the third image, and a relative positional relationship between the first camera unit 711 and the third camera unit 713, wherein the second distance is a distance between the multi-shot camera module 710 and the subject.
In step S7130, based on the distance, an adjustment instruction is generated, where the adjustment instruction is used to drive the driving component 7113 of the first image capturing unit 711 to drive at least part of the lenses in the at least one lens group 7112 of the first image capturing unit 711 to perform optical zooming.
Specifically, in the embodiment of the present application, the adjustment instruction is configured to drive the first driving element 7117 of the driving component 7113 to drive at least part of the lenses in the first lens group 7114 of the at least one lens group 7112 to perform optical zooming; and, the adjustment command is further used for driving the second driving element 7118 of the driving component 7113 to drive the second lens group 7115 of the at least one lens group 7112 to perform optical focusing.
That is, in the embodiment of the present application, after the optical zooming is performed by the first driving element 7117, optical focusing is performed by the second driving element 7118 to implement compensation, thereby improving the post-zoom imaging quality of the first image capturing unit 711.
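The two-stage drive of step S7130 can be sketched as a pair of travel commands: the adjustment instruction first moves the zoom group (first driving element 7117), then moves the focus group (second driving element 7118) to compensate so that TTL stays unchanged. All stroke values below are invented placeholders; in a real module they would come from the burned-in calibration data, not from this application.

```python
# assumed lookup tables: zoom ratio -> travel of each lens group, in micrometres
ZOOM_STROKE_UM = {1.0: 0.0, 2.0: 120.0, 3.0: 260.0}   # first lens group 7114
FOCUS_COMP_UM = {1.0: 0.0, 2.0: -35.0, 3.0: -80.0}    # second lens group 7115

def adjustment_instruction(zoom_ratio: float) -> dict[str, float]:
    """Return the travel commands for the two driving elements."""
    return {
        "first_driving_element_um": ZOOM_STROKE_UM[zoom_ratio],
        "second_driving_element_um": FOCUS_COMP_UM[zoom_ratio],
    }
```

Keeping the compensation travel in a table indexed by zoom ratio is one simple way to guarantee the TTL constraint: every zoom stroke is paired with the focus stroke measured for it at calibration time.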
In step S7140, the first image of the subject acquired by the first image capturing unit 711 after performing optical zooming and the images of the subject acquired by the other image capturing units of the multi-camera module 710 are fused to obtain a fused image.
In an example of the present application, fusing the first image of the subject acquired by the first image capturing unit 711 after performing optical zooming and the images of the subject acquired by the other image capturing units of the multi-image capturing module 710 to obtain a fused image includes: the first image of the subject acquired by the first image pickup unit 711 after performing optical zooming and the second image of the subject acquired by the second image pickup unit 712 are fused to obtain a fused image.
In another example of the present application, fusing the first image of the subject acquired by the first image capturing unit 711 after performing optical zooming and the images of the subject acquired by the other image capturing units of the multi-camera module 710 to obtain a fused image includes: the first image of the subject acquired by the first image capturing unit 711 after optical zooming and the third image of the subject acquired by the third image capturing unit 713 are fused to obtain a fused image.
In summary, an auto-zoom imaging method according to an embodiment of the present application has been illustrated; it is implemented based on the structural configuration of the multi-camera module 710 so as to improve the photographer's shooting experience.
Specifically, the auto-zoom imaging method can be applied both to still-image shooting and to video shooting. When a still image is captured, the multi-camera module 710 (or the electronic device) is usually kept stationary, and the photographer can complete the shot simply by issuing a zoom instruction and allowing the automatic zoom to finish.
When the auto-zoom imaging method is applied to video shooting, the subject may move during shooting, and the movement may be irregular, in particular along the shooting direction (i.e., the distance changes), which is difficult to match by moving the device while keeping the shape and size of the subject in the video unchanged or maintaining imaging quality. When the subject moves left and right or up and down relative to the photographer, the photographer can move the device in the same direction to keep the subject at the center of the frame. To better meet the requirements of video shooting, in practical applications the photographer can shoot in the following ways:
1. the photographer does not move the electronic device, and the whole multi-camera module 710 is driven by a driver to realize tracking shooting, as shown in fig. 7;
2. the photographer does not move the electronic device, the multi-camera module 710 is also kept stationary relative to the electronic device, and the reflective element of the first camera unit 711 is moved to achieve tracking shooting, as shown in fig. 8;
3. the photographer does not move the electronic device, the multi-camera module 710 is also kept stationary relative to the electronic device, and the reflective element of the first camera unit 711 is rotated to achieve tracking shooting, as shown in fig. 9.
Further, the distance between the subject and the photographer often changes during video shooting; if the zoom magnification is not adjusted in time, the image blurs or the apparent size of the subject changes. The automatic zoom enables timely adjustment of the zoom magnification, so that during video shooting the subject remains sharply imaged and its size in the video stays unchanged. That is, when applied to video shooting, the present invention can ensure that the subject always remains at the center of the frame (or at the position desired by the photographer) with unchanged sharpness and size.
Accordingly, in an embodiment of the present application, the auto-zoom imaging method may further include: based on the motion track of the shot object, the multi-shot camera module 710 is moved, so that the shot object is always located in the shooting window of the multi-shot camera module 710.
Accordingly, in an embodiment of the present application, the auto-zoom imaging method may further include: the reflecting element 7119 of the first image capturing unit 711 is moved based on the motion trajectory of the subject.
Accordingly, in an embodiment of the present application, the auto-zoom imaging method may further include: the reflecting element 7119 of the first image capturing unit 711 is rotated based on the motion trajectory of the subject.
With the above method, even if the subject moves toward or away from the camera during video shooting, the required zoom magnification is adjusted so that the size and position of the subject displayed in the image remain unchanged, thereby improving the shooting experience.
Exemplary Camera Module arrangement
As shown in fig. 10, an image capturing module apparatus according to an embodiment of the present application is illustrated, where the configuration of the image capturing module apparatus enables automatic optical zooming of the image capturing module apparatus based on a distance between the image capturing module apparatus and a subject to provide a better capturing experience.
As shown in fig. 10, the image capturing module apparatus 810 according to the embodiment of the present application includes an image capturing unit 811 having an optical zoom function, and a ranging unit 812 configured to measure a distance between the image capturing module apparatus 810 and a subject. Specifically, as shown in fig. 10, the image capturing unit 811 includes a photosensitive chip 8111, at least one lens group 8112 located on a photosensitive path set by the photosensitive chip 8111, and a driving assembly 8113 for driving at least part of lenses in the at least one lens group 8112 to perform optical zooming.
Accordingly, in the embodiment of the present application, the distance measurement unit 812 may measure the distance information between the image capturing module device 810 and the object to be captured, so that the driving component 8113 of the image capturing unit 811 may be configured to drive at least part of the lenses in the at least one lens group 8112 to perform optical zooming based on the adjustment instruction generated by the distance information, and in this way, the image capturing module device 810 is structurally configured to implement an automatic optical zooming function.
It should be noted that, in the embodiment of the present application, the image capturing unit 811 and the ranging unit 812 are two components structurally integrated in the image capturing module apparatus 810, rather than structurally separate components. Specifically, in the camera module device 810, the ranging unit 812 and the camera unit 811 may be integrally formed by molding or the like to manufacture the camera module device 810, and the camera module device 810 as a whole is connected to other peripheral devices, such as an image processor.
In particular, in the embodiment of the present application, the distance measuring unit 812 may obtain the distance between the camera module device 810 and the subject by the time-of-flight principle. Depending on the wavelength of the detection signal used, the distance measuring unit 812 may be implemented by means such as ultrasonic waves, millimeter-wave radar, or lidar.
Fig. 11 illustrates a ranging schematic of the ranging unit 812 according to an embodiment of the present application. As shown in fig. 11, the ranging unit 812 includes a projector 8121 and a receiver 8122; the projector 8121 is configured to project a detection signal of a specific wavelength toward the subject, and the receiver 8122 is configured to receive the detection signal reflected back from the subject and determine the distance between the ranging unit 812 and the subject based on the time-of-flight principle.
Specifically, when the ranging unit 812 performs ranging by ultrasonic waves, the projector 8121 projects an ultrasonic detection signal toward the subject; the signal propagates through the air, strikes the subject, is reflected back, and is received by the receiver 8122. A timer measures the duration t from emission to reception, and the distance between the ranging unit 812 and the subject is then S = 340·t/2, where S is the distance, t is the round-trip time, and 340 m/s is the propagation speed of sound in air.
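The relation S = 340·t/2 can be sketched directly: the detection signal travels to the subject and back, so the one-way distance is half the round-trip time multiplied by the propagation speed. The same relation holds for the laser-pulse case with the speed of light in place of the speed of sound; the example values below are illustrative only.

```python
def time_of_flight_distance_m(round_trip_s: float,
                              speed_m_s: float = 340.0) -> float:
    """One-way distance from the measured round-trip time of the detection
    signal; the default speed is that of sound in air (~340 m/s)."""
    return speed_m_s * round_trip_s / 2.0

# e.g. a 10 ms ultrasonic round trip corresponds to about 1.7 m;
# for a lidar pulse, pass speed_m_s=299_792_458.0 instead
ultrasonic = time_of_flight_distance_m(0.01)
```

Because the speed enters only as a parameter, one function covers the ultrasonic, millimeter-wave, and lidar variants of the ranging unit 812.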
When the distance measuring unit 812 measures distance by millimeter-wave radar, the projector 8121 projects a millimeter-wave detection signal toward the subject, where millimeter waves are electromagnetic waves in the 30-300 GHz band (wavelength 1-10 mm). Because millimeter waves attenuate relatively little in the atmosphere, longer distances can be detected and perceived; a long-range radar can achieve perception and detection beyond 200 m.
When the ranging unit 812 performs ranging by using a laser radar, the projector 8121 projects a laser pulse detection signal to the photographed object, and the laser pulse detection signal propagates in the air, and is reflected back and received by the receiver 8122 after hitting the photographed object. Accordingly, the timer calculates the time period from when the laser pulse detection signal is emitted to when the laser pulse detection signal is received, and then the distance between the ranging unit 812 and the subject can be obtained based on the time period and the propagation speed of the laser pulse detection signal.
It should be understood that, in the embodiment of the present application, the ranging unit 812 is an integrated component of the camera module device 810, and thus, the distance between the ranging unit 812 and the photographed object is the distance between the camera module device 810 and the photographed object.
It should be noted that the ranging unit 812 may be one or more of the above types, so as to achieve ranging of different distances and improve the accuracy of overall ranging.
Preferably, in the embodiment of the present application, the distance measurement unit 812 is implemented as a TOF camera unit 812A; that is, the TOF camera unit 812A preferably obtains not only distance information by the time-of-flight principle but also image data of the measured object (i.e., texture information of the subject).
As one of ordinary skill in the art will appreciate, as the market develops, terminal devices, and in particular smartphones, increasingly adopt a rear-facing TOF camera unit 812A, which may be used to capture 3D images or for other applications. Accordingly, in the embodiment of the present application, the distance information acquired by the TOF camera unit 812A is put to use. It should be understood that in an existing TOF camera unit 812A the main function is to acquire texture information and depth information of the subject, so its accuracy must be sufficiently high; in the embodiment of the present application, however, the TOF camera unit 812A is mainly used to provide distance information, so the accuracy requirement can be relaxed to reduce cost. It should be noted that the projection power may be relatively high when ranging is performed, and therefore eye safety must be ensured.
Fig. 12 illustrates a schematic diagram of the ranging unit 812 implemented as a TOF camera unit 812A according to an embodiment of the present application. As shown in fig. 12, the TOF camera unit 812A includes a projecting unit 8121A and a receiving unit 8122A, wherein the projecting unit 8121A includes a projecting element 8123A, an optical element 8124A, a circuit board 8125A, and a bracket 8126A, the projecting element 8123A is attached to the circuit board 8125A, the optical element 8124A is held on a projection path of the projecting element 8123A by the bracket 8126A, and the detection signal projected by the projecting element 8123A is optically processed; the receiving component 8122A is configured to receive a detection signal from a subject to obtain a distance between the TOF imaging unit 812A and the subject based on a time-of-flight law.
Further, as shown in fig. 12, the projection assembly 8121A further includes a detection element 8127A configured to detect whether the operating state of the TOF camera unit 812A is abnormal. For example, in an example of the present application, the detection element 8127A is a photodiode (PD) for detecting the energy of the projection signal generated by the projection element 8123A; of course, the detection element 8127A may also be implemented as another mechanism for detecting whether the projection element 8123A operates normally.
Further, in the present embodiment, the projecting element 8123A is implemented as a VCSEL laser projector divided into a plurality of regions, each containing a plurality of laser projection dots. By way of example and not limitation, as shown in fig. 13, the projection element 8123A is divided into four projection areas A, B, C, and D, where the number of projection dots in area A is smaller than in the other areas but the energy per dot in area A is larger. Accordingly, area A is preferably used for projection when ranging, while the other areas B, C, and D, with their larger numbers of projection dots, offer higher measurement accuracy and are suitable for other applications, such as obtaining depth information. It should be understood that the number of regions into which the projection element 8123A is divided is not limited in this application; the division only needs to satisfy the condition that at least one region has fewer dots than the others, and preferably the region with the most dots has 3-10 times as many as the region with the fewest.
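The region choice above can be sketched as picking the sparsest region: fewest dots means the highest energy per dot, which favors long-distance ranging, while the denser regions serve depth mapping. The region names and dot counts below are assumptions chosen to satisfy the stated 3-10x ratio, not values from this application.

```python
# assumed dot counts per projection area (fewest-dot area "A" used for ranging)
REGION_DOT_COUNTS = {"A": 100, "B": 500, "C": 700, "D": 900}

def ranging_region(dot_counts: dict[str, int]) -> str:
    """Pick the region with the fewest projection dots for ranging."""
    return min(dot_counts, key=dot_counts.get)
```

Here the densest region (900 dots) has 9 times as many dots as the sparsest (100), within the preferred 3-10x range.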
Fig. 14 illustrates another schematic diagram of the ranging unit 812 implemented as a TOF camera unit 812A according to an embodiment of the present application. In contrast to the TOF camera unit 812A illustrated in fig. 12, in this embodiment the projection assembly 8121A further includes a collimation element 8129A disposed between the projection element 8123A and the optical element 8124A and configured to integrate the detection signals projected by the projection element 8123A. Accordingly, when the projection area of the projection element 8123A is divided into a plurality of regions, the collimation element collimates the generated detection signal during ranging, so that the detection signal can be projected farther, thereby reducing the power consumption required for projection. Preferably, the projecting element 8123A is implemented as a VCSEL laser projector whose projected dots form a regular pattern.
It should be noted that, in the embodiment of the present application, when the ranging unit 812 is implemented as the TOF camera unit, that is, when the camera module device 810 is implemented as a multi-camera module, the camera unit 811 and the TOF camera unit 812A are two camera units structurally integrated in the camera module device 810, rather than structurally separate camera modules. Specifically, in the camera module device 810, the TOF camera unit 812A and the camera unit 811 may be integrally formed, for example by molding, to manufacture the camera module device 810, and the camera module device 810 as a whole is connected to other peripheral devices, such as an image processor.
It should be noted that, in the embodiment of the present application, the ranging unit 812 may also serve as an auxiliary ranging tool during calibration of the imaging unit 811 of the imaging module device 810, that is, scenes at different distances are captured and the resulting calibration data are burned into the imaging unit 811.
Further, as shown in fig. 10, in the embodiment of the present application, at least one lens group 8112 of the image capturing unit 811 includes a first lens group 8114 and a second lens group 8115, and the driving assembly 8113 includes a first driving element 8117, and the first driving element 8117 is configured to drive at least part of lenses in the first lens group 8114 to perform optical zooming based on the adjustment instruction. That is, in the embodiment of the present application, the first driving element 8117 is a zoom driver for driving at least part of lenses in the first lens group 8114 to move for optical zooming.
As shown in fig. 10, in the embodiment of the present application, the driving assembly 8113 of the image capturing unit 811 further includes a second driving element 8118, and the second driving element 8118 is configured to drive the second lens group 8115 for optical focusing and/or compensation based on the adjustment instruction. That is, in the embodiment of the present application, the image capturing unit 811 further has a focusing function, and the second driving element 8118 is a focusing driver. It should be appreciated that, after optical zooming by the first driving element 8117, the second driving element 8118 can drive the second lens group 8115 to move to compensate for the zoom, so as to improve the imaging quality.
It should be noted that, in other examples of the present application, the first driving element 8117 and the second driving element 8118 may be implemented as the same driver (i.e., the zoom driver and the focus driver are one and the same), or the first driving element 8117 and the second driving element 8118 may be of a unitary structure, which is not limited in the present application.
It should be understood that in the embodiment of the present application, the at least one lens group 8112 may further include a greater number of lens groups, for example, further includes a third lens group 8116, and the position of the third lens group 8116 is fixed as a fixed lens group, which is not limited in this application.
It is also worth mentioning that for some terminal devices (e.g. smart phones), there is a requirement for the thickness of the camera module device 810, i.e. it is necessary to ensure that the thickness of the camera module device 810 is less than a certain value. Accordingly, in other examples of the present application, the image pickup unit 811 may be implemented as a periscopic image pickup unit 811, and accordingly, in these examples, the image pickup unit 811 further includes a reflecting element 8119 disposed on a photosensitive path of the photosensitive chip 8111 for turning imaging light.
In order to further improve the imaging performance of the imaging unit 811, in some examples of the present application, the imaging unit 811 is further configured with an optical anti-shake function. For example, in some examples of the present application, the driving assembly 8113 further includes an anti-shake mechanism for driving the reflecting element 8119 to perform optical anti-shake; alternatively, the anti-shake mechanism drives the first lens group 8114 and/or the second lens group 8115 to perform optical anti-shake, thereby compensating for errors caused by the photographer's hand shake.
In summary, the image capturing module device 810 according to the embodiment of the present application is illustrated, and the structural configuration of the image capturing module device 810 enables the image capturing module device 810 to perform automatic optical zooming based on the distance between the image capturing module device 810 and the object to be captured, so as to provide a better capturing experience.
In order to explain how the camera module device 810 performs auto-optical zooming (i.e., how the camera unit 811 performs auto-optical zooming), an auto-zoom imaging method applied to the camera module device 810 is described below.
Schematic automatic zoom imaging method
Fig. 15 illustrates a flowchart of an auto zoom imaging method according to an embodiment of the present application.
As shown in fig. 15, the auto zoom imaging method according to an embodiment of the present application includes the steps of: s8110, obtaining a zooming instruction; s8120, in response to the zoom instruction, acquiring a distance between the image capturing module device 810 and the object, where the image capturing module device 810 includes an image capturing unit 811; s8130, based on the distance, generating an adjustment instruction, where the adjustment instruction is used to drive the driving component 8113 of the image capturing unit 811 to drive at least part of lenses in the at least one lens group 8112 of the image capturing unit 811 to perform optical zooming; and S8140, obtaining an image of the subject acquired by the imaging unit 811 after the optical zooming.
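The four steps S8110 to S8140 can be sketched as a single control loop. All object and method names below (`get_distance`, `make_adjustment_instruction`, and so on) are hypothetical stand-ins for the hardware operations described in the text, not APIs defined by the patent:

```python
# Minimal sketch of the S8110-S8140 auto-zoom flow.

def auto_zoom_capture(camera, ranging_unit, zoom_instruction):
    # S8110: a zoom instruction has been received (tap, pinch, ...)
    assert zoom_instruction is not None
    # S8120: measure the module-to-subject distance
    distance = ranging_unit.get_distance()
    # S8130: turn the distance into a drive instruction for the lens groups
    instruction = camera.make_adjustment_instruction(distance)
    camera.drive_assembly.apply(instruction)   # optical zoom (+ focus)
    # S8140: capture the image after zooming
    return camera.capture()
```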
In step S8110, a zoom instruction is acquired. In the embodiment of the present application, the zoom instruction includes, but is not limited to, a single tap on the subject displayed on the screen, a two-finger pinch gesture on the screen, and the like. It should be appreciated that the zoom instruction may be preset according to the user's needs and must not conflict with other preset instructions.
In step S8120, in response to the zoom instruction, a distance between the image capturing module device 810 and the subject is acquired, the image capturing module device 810 including an image capturing unit 811.
In an example of the present application, in response to the zoom instruction, a process of acquiring a distance between the camera module device 810 and the photographed object includes: the projector 8121 of the ranging unit 812 projects a detection signal to the photographed object; then, the detection signal reflected from the photographed object is received; then, a distance between the ranging module and the subject is determined based on a time-of-flight rule, wherein the distance between the ranging module and the subject is set as a distance between the camera module device 810 and the subject.
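The time-of-flight determination in the last step reduces to one line: the detection signal traverses the module-to-subject distance twice, so the distance is half the round-trip time multiplied by the propagation speed (the speed of light, for laser pulses):

```python
# The time-of-flight rule: distance = c * round_trip_time / 2.

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_seconds):
    """Distance to the subject from the echo's round-trip time."""
    return C * round_trip_seconds / 2.0
```

For example, a round trip of about 6.7 nanoseconds corresponds to a subject roughly one metre away.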
In the embodiment of the application, the detection signals include, but are not limited to, millimeter wave detection signals, ultrasonic detection signals and laser pulse detection signals.
Preferably, in the embodiment of the present application, the distance measurement unit 812 is implemented as a TOF camera unit 812A.
When the ranging unit 812 is implemented as the TOF camera unit 812A, the detection signal it projects is a laser pulse, which can pose a safety hazard to human eyes. Accordingly, before the TOF camera unit 812A performs ranging, the auto-zoom imaging method further includes: acquiring an image of the subject by the TOF camera unit 812A; analyzing the image to determine whether it contains human eyes; and, in response to the image containing human eyes, delaying ranging; otherwise, starting ranging.
Alternatively, in another embodiment of the present application, the auto-zoom imaging method further includes: in response to the image containing human eyes, projecting the detection signal for ranging from a region of the TOF camera unit 812A with relatively small projection energy. That is, in this example, the projection area of the TOF camera unit 812A includes a plurality of regions, some of which generate projection beams of lower energy; when human eyes appear in the image, ranging is performed using the projection beam from a lower-energy region.
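The eye-safety gating described above amounts to a small decision rule; `detect_eyes` here stands in for whatever image-analysis routine the module uses and is purely hypothetical:

```python
# Sketch of the eye-safety gate that runs before TOF ranging.
# detect_eyes is a hypothetical callable: image -> bool.

def choose_ranging_action(image, detect_eyes, has_low_energy_region):
    """Return how ranging should proceed given a preview image."""
    if not detect_eyes(image):
        return "range_full_power"          # no eyes: range normally
    if has_low_energy_region:
        return "range_low_energy_region"   # eyes present: use weak beam
    return "delay_ranging"                 # otherwise wait and retry
```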
In step S8130, based on the distance, an adjustment instruction is generated, where the adjustment instruction is used to drive the driving component 8113 of the image capturing unit 811 to drive at least part of the lenses in the at least one lens group 8112 of the image capturing unit 811 to perform optical zooming.
Specifically, in the embodiment of the present application, the adjustment instruction is configured to drive the first driving element 8117 of the driving assembly 8113 to drive at least part of lenses in the first lens group 8114 of the at least one lens group 8112 to perform optical zooming; and, the adjustment instruction is further configured to drive the second driving element 8118 of the driving assembly 8113 to drive the second lens group 8115 of the at least one lens group 8112 to perform optical focusing.
That is, in the embodiment of the present application, after the optical zooming is performed by the first driving element 8117, the optical focusing is performed by the second driving element 8118, so as to implement compensation, and improve the imaging quality of the imaging unit 811 after zooming.
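The two-stage drive (zoom by the first driving element, then focus compensation by the second) can be illustrated with a distance-to-position lookup. The table values and the linear compensation rule below are invented for the sketch; a real module would use calibrated data:

```python
# Illustrative two-stage drive: zoom position from measured distance,
# then a focus-compensation step for the second lens group.

ZOOM_TABLE = [  # (max distance in m, first-group position in um) - toy values
    (0.5, 0), (2.0, 120), (10.0, 300), (float("inf"), 450),
]

def zoom_position(distance_m):
    """First driving element: zoom-group position for a given distance."""
    for max_d, pos in ZOOM_TABLE:
        if distance_m <= max_d:
            return pos

def focus_compensation(zoom_pos):
    """Second driving element: toy linear compensation trailing the zoom."""
    return round(0.25 * zoom_pos)
```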
In step S8140, an image of the subject acquired by the imaging unit 811 after optical zooming is obtained. That is, after optical zooming, an image of a subject is acquired by the image capturing unit 811 after zooming.
In summary, an auto-zoom imaging method applicable to the above camera module device 810 according to the embodiments of the present application has been illustrated; it is implemented based on the structural configuration of the camera module device 810 and improves the photographer's shooting experience.
Specifically, the auto-zoom imaging method can be applied to both image shooting and video shooting. In image shooting, the camera module device 810 (or the electronic apparatus) is usually held steady, so the photographer can complete the capture simply by issuing a zoom instruction and letting the camera module device complete automatic zooming.
When the auto-zoom imaging method is applied to video shooting, the subject may move during shooting, and the movement may be irregular. In order to better meet the requirements of video shooting, in practical applications, the photographer can shoot in the following ways:
1. The photographer does not move the electronic apparatus, and the camera module device 810 as a whole is driven by a driver to achieve tracking shooting, as shown in fig. 18;
2. The photographer does not move the electronic apparatus, the camera module device 810 also remains stationary relative to the electronic apparatus, and the reflecting element of the camera unit 811 is moved to achieve tracking shooting, as shown in fig. 19;
3. The photographer does not move the electronic apparatus, the camera module device 810 also remains stationary relative to the electronic apparatus, and the reflecting element of the camera unit 811 is turned to achieve tracking shooting, as shown in fig. 20.
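The third strategy relies on a basic property of a plane reflector: rotating it by half an angle swings the reflected line of sight by the full angle. A sketch of the drive-angle computation, using a small-angle approximation and an illustrative field of view (both are assumptions, not figures from the patent):

```python
# Sketch: how far to turn the reflecting element so a subject that has
# drifted from the image center is re-centered. hfov_deg is illustrative.

def mirror_drive_angle_deg(pixel_offset, image_width_px, hfov_deg=80.0):
    """Mirror rotation that re-centers the subject (small-angle approx.)."""
    # angular offset of the subject from the optical axis
    subject_angle = (pixel_offset / image_width_px) * hfov_deg
    # a plane mirror rotated by theta/2 deflects the ray by theta
    return subject_angle / 2.0
```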
Accordingly, in an embodiment of the present application, the auto-zoom imaging method may further include: based on the motion track of the photographed object, the photographing module device 810 is moved so that the photographed object is always located in the photographing window of the photographing module device 810.
Accordingly, in an embodiment of the present application, the auto-zoom imaging method may further include: the reflecting element 8119 of the imaging unit 811 is moved based on the movement locus of the subject.
Accordingly, in an embodiment of the present application, the auto-zoom imaging method may further include: the reflecting element 8119 of the imaging unit 811 is rotated based on the movement locus of the subject.
According to the method, during video shooting, as the subject moves toward or away from the camera, the required zoom factor is adjusted so that the size and position of the subject displayed in the image remain unchanged, thereby improving the shooting experience.
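The constant-size behavior follows from thin-lens magnification being approximately proportional to focal length divided by subject distance, so scaling the focal length with the measured distance keeps the subject the same size in the frame. A minimal sketch under that assumption:

```python
# Keep the subject's image size constant as it moves: magnification of a
# distant subject is roughly f / d, so scale f with the measured distance.

def focal_length_for_constant_size(f0_mm, d0_m, d_m):
    """Focal length that preserves image size as the subject moves d0 -> d."""
    return f0_mm * (d_m / d0_m)
```

For example, a subject framed at 24 mm from 2 m away needs roughly 48 mm when it retreats to 4 m, which is the zoom adjustment the method performs automatically.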
Schematic image pickup system
According to another aspect of the present application, there is also provided an imaging system.
Fig. 16 illustrates a schematic diagram of the imaging system according to an embodiment of the present application.
As shown in fig. 16, the camera system 830 includes the camera module device 810 as described above and a processor 820 communicatively coupled to the camera module device 810, wherein the processor 820 is configured to generate the adjustment instruction based on the distance of the camera module device 810 relative to the subject. Accordingly, the driving assembly 8113 of the camera unit 811, after receiving the adjustment instruction, drives at least some of the lenses in the at least one lens group 8112 to perform optical zooming. In this way, the camera system 830 implements an auto-zoom shooting function.
Schematic electronic device
According to another aspect of the application, an electronic device is also provided.
Fig. 17 illustrates a perspective schematic view of an electronic device according to an embodiment of the present application.
As shown in fig. 17, the electronic device 8100 according to an embodiment of the present application includes an electronic device main body 8101 and the above-described camera module device 810 assembled to the electronic device main body 8101. In a specific implementation, the camera module device 810 is preferably disposed on the back of the electronic device main body 8101 to serve as a rear camera module; of course, it may also be disposed on the front of the electronic device main body 8101 to serve as a front camera module.
As shown in fig. 17, in the embodiment of the present application, the electronic device main body 8101 includes a screen and an integrated circuit, where the screen may be used to display the image data collected by the camera module device 810, and the integrated circuit may be used to process the image data collected by the camera module device 810, so as to control the camera module device 810 to implement an auto zoom shooting function.
It will be appreciated by persons skilled in the art that the embodiments of the invention described above and shown in the drawings are by way of example only and are not limiting. The objects of the present invention have been fully and effectively achieved. The functional and structural principles of the present invention have been shown and described in the embodiments, and the embodiments may be modified or practiced without departing from those principles.

Claims (36)

  1. A multi-camera module, comprising:
    the first camera unit comprises a photosensitive chip, at least one lens group positioned on a photosensitive path of the photosensitive chip and a driving component used for driving at least part of lenses in the at least one lens group to carry out optical zooming; and
    the second camera shooting unit is in a preset relative position relation with the first camera shooting unit;
    the driving component is configured to drive at least part of lenses in the at least one lens group to perform optical zooming based on an adjustment instruction, the adjustment instruction is generated based on the distance between the multi-shot camera module and a shot object, the distance between the multi-shot camera module and the shot object is calculated based at least in part on a first image of the shot object acquired by the first camera unit, a second image of the shot object acquired by the second camera unit, and the relative position relationship between the first camera unit and the second camera unit.
  2. The multi-camera module of claim 1 wherein the at least one lens group comprises a first lens group and a second lens group, the drive assembly comprising a first drive element configured to drive at least a portion of the lenses of the first lens group for optical zooming based on the adjustment instructions.
  3. The multi-camera module of claim 2 wherein the drive assembly further comprises a second drive element configured to drive the second lens group based on the adjustment instructions.
  4. The multi-camera module of claim 1, wherein the first camera unit further comprises a reflective element disposed on a photosensitive path of the photosensitive chip for turning imaging light.
  5. The multi-camera module of claim 4 wherein the drive assembly further comprises an anti-shake mechanism for driving the reflective element for optical anti-shake.
  6. The multi-camera module of claim 2 wherein the drive assembly further comprises an anti-shake mechanism for driving the first lens group and/or the second lens group for optical anti-shake.
  7. The multi-camera module of claim 1, further comprising a third camera unit having a preset positional relationship with the first camera unit, a third equivalent focal length of the third camera unit being greater than a second equivalent focal length of the second camera unit, wherein the adjustment instruction is generated based on a second distance of the multi-camera module relative to the subject when a distance of the multi-camera module relative to the subject obtained based at least in part on the first image of the subject acquired by the first camera unit, the second image of the subject acquired by the second camera unit, and the relative positional relationship between the first camera unit and the second camera unit exceeds a preset threshold, wherein the second distance is calculated based at least in part on the first image of the subject acquired by the first camera unit, the third image of the subject acquired by the third camera unit, and the relative positional relationship between the first camera unit and the third camera unit.
  8. An image pickup system, comprising:
    a multi-shot camera module according to any one of claims 1 to 7; and
    a processor communicatively coupled to the multi-camera module, wherein the processor is configured to generate adjustment instructions based on a distance of the multi-camera module relative to a subject.
  9. The imaging system of claim 8, wherein the processor is further configured to fuse a first image of the subject acquired by the first camera unit after optical zooming with a second image of the subject acquired by the second camera unit to obtain a fused image of the subject; or to fuse the first image of the subject acquired by the first camera unit after optical zooming with a third image of the subject acquired by the third camera unit to obtain a fused image of the subject.
  10. An electronic device comprising a multi-camera module according to any one of claims 1-7.
  11. An automatic zoom imaging method, comprising:
    obtaining a zooming instruction;
    responding to the zooming instruction, and acquiring the distance between the multi-shot camera module and a shot target, wherein the multi-shot camera module comprises a first camera unit;
    Generating an adjustment instruction based on the distance, wherein the adjustment instruction is used for driving a driving component of the first camera unit to drive at least part of lenses in at least one lens group of the first camera unit to perform optical zooming; and
    and fusing the first image of the shot target acquired by the first camera unit after optical zooming with the images of the shot target acquired by other camera units of the multi-camera module to obtain a fused image.
  12. The auto-zoom imaging method of claim 11, wherein obtaining a distance between a multi-shot camera module and a subject in response to the zoom instruction comprises:
    obtaining a first image of the subject by the first imaging unit;
    obtaining a second image of the subject by the second imaging unit; and
    based at least in part on the first image, the second image, and a relative positional relationship between the first imaging unit and the second imaging unit, a first distance between the multi-camera module and the subject is obtained, wherein the first distance is a distance between the multi-camera module and the subject.
  13. The auto-zoom imaging method of claim 12, wherein, before obtaining the first image of the subject by the first imaging unit, further comprising:
    the driving assembly of the first camera shooting unit is pre-driven to drive at least part of lenses in at least one lens group of the first camera shooting unit to carry out optical zooming.
  14. The auto-zoom imaging method of claim 12, wherein acquiring the distance between the multi-shot camera module and the subject in response to the zoom instruction further comprises:
    when the first distance is larger than a preset threshold value, a third shooting unit of the multi-shooting module is started to obtain a third image of the shot target through the third shooting unit, and a third equivalent focal length of the third shooting unit is larger than a second equivalent focal length of the second shooting unit; and
    and obtaining a second distance between the multi-camera module and the object based at least in part on the first image, the third image and the relative positional relationship between the first camera unit and the third camera unit, wherein the second distance is the distance between the multi-camera module and the object.
  15. The auto-zoom imaging method of any one of claims 12 to 14, wherein the adjustment instructions are further for driving a first driving element of the driving assembly to drive at least part of the lenses of the first lens group of the at least one lens group to perform optical zooming.
  16. The auto-zoom imaging method of claim 15, wherein the adjustment instructions are further for driving a second driving element of the driving assembly to bring a second lens group of the at least one lens group into optical focus and/or compensation.
  17. The auto-zoom imaging method of claim 12, wherein fusing the first image of the subject acquired by the first imaging unit after optical zooming with the images of the subject acquired by the other imaging units of the multi-camera module to obtain a fused image, comprises:
    and fusing the first image of the shot target acquired by the first camera unit after optical zooming with the second image of the shot target acquired by the second camera unit to obtain a fused image.
  18. The auto-zoom imaging method of claim 14, wherein fusing the first image of the subject acquired by the first imaging unit after optical zooming with the images of the subject acquired by the other imaging units of the multi-camera module to obtain a fused image, comprises:
    And fusing the first image of the shot object acquired by the first shooting unit after optical zooming with the third image of the shot object acquired by the third shooting unit to obtain a fused image.
  19. The auto-zoom imaging method of claim 11, further comprising:
    and moving the multi-shot shooting module based on the movement track of the shot target so that the shot target is always positioned in a shooting window of the multi-shot shooting module.
  20. The auto-zoom imaging method of claim 11, further comprising:
    the reflective element of the first imaging unit is moved and/or rotated based on the movement locus of the subject.
  21. A camera module apparatus, comprising:
    the image pickup unit comprises a photosensitive chip, at least one lens group positioned on a photosensitive path of the photosensitive chip and a driving assembly used for driving at least part of lenses in the at least one lens group to carry out optical zooming; and
    a ranging unit;
    the driving component is configured to drive at least part of lenses in the at least one lens group to perform optical zooming based on an adjustment instruction, the adjustment instruction is generated based on the distance between the camera module device and the shot object, and the distance between the camera module device and the shot object is measured by the ranging unit.
  22. The camera module device of claim 21, wherein the at least one lens group comprises a first lens group and a second lens group, the drive assembly comprising a first drive element configured to drive at least a portion of the lenses of the first lens group for optical zooming based on the adjustment instructions.
  23. The camera module device of claim 22, wherein the drive assembly further comprises a second drive element configured to drive the second lens group for optical focusing based on the adjustment instructions.
  24. The camera module device of claim 21, wherein the camera unit further comprises a reflective element disposed on a photosensitive path of the photosensitive chip for turning imaging light.
  25. The camera module device of claim 24, wherein the drive assembly further comprises an anti-shake mechanism for driving the reflective element for optical anti-shake.
  26. The camera module device of claim 22, wherein the drive assembly further comprises an anti-shake mechanism for driving the first lens group and/or the second lens group for optical anti-shake.
  27. The camera module apparatus of claim 21, wherein the ranging unit comprises a projector configured to project a detection signal having a specific wavelength to a subject, and a receiver configured to receive the detection signal reflected back from the subject and determine a distance between the ranging module and the subject based on a time-of-flight law.
  28. The camera module device according to claim 21, wherein the ranging unit is implemented as a TOF camera unit to acquire a distance between the camera module device and a subject object by the TOF camera unit.
  29. An image pickup system, comprising:
    a camera module apparatus according to any one of claims 21 to 28; and
    and a processor communicatively coupled to the camera module device, wherein the processor is configured to generate adjustment instructions based on a distance of the camera module device relative to a subject.
  30. An electronic device comprising a camera module arrangement according to any one of claims 21-28.
  31. An automatic zoom imaging method, comprising:
    Obtaining a zooming instruction;
    responding to the zooming instruction, and acquiring the distance between the camera module device and the shot target through a ranging module;
    generating an adjustment instruction based on the distance, wherein the adjustment instruction is used for driving a driving component of the image capturing unit to drive at least part of lenses in at least one lens group of the image capturing unit to perform optical zooming; and
    an image of the subject acquired by the image pickup unit after optical zooming is obtained.
  32. The auto-zoom imaging method of claim 31, wherein, in response to the zoom instruction, obtaining, by a ranging module, a distance between a camera module device and a subject object, comprises:
    projecting a detection signal to a shot target;
    receiving the detection signal reflected back from the photographed object; and
    and determining the distance between the ranging module and the shot target based on a time-of-flight rule, wherein the distance between the ranging module and the shot target is set as the distance between the camera module device and the shot target.
  33. The auto-zoom imaging method of claim 31, wherein the adjustment instructions are further for driving the first driving element of the driving assembly to drive at least a portion of the lenses of the first lens group of the at least one lens group to perform optical zooming.
  34. The auto-zoom imaging method of claim 33, wherein the adjustment instructions are further for driving a second driving element of the driving assembly to bring a second lens group of the at least one lens group into optical focus.
  35. The auto-zoom imaging method of claim 31, further comprising:
    and moving the camera module device based on the movement track of the shot target so that the shot target is always positioned in a shooting window of the camera module device.
  36. The auto-zoom imaging method of claim 31, further comprising:
    the reflective element of the imaging unit is moved and/or rotated based on the movement locus of the subject.
CN202180060940.8A 2020-07-30 2021-07-30 Image pickup module device, multi-image pickup module, image pickup system, electronic apparatus, and auto-zoom imaging method Pending CN116250246A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
CN202010753090.7A CN114070997A (en) 2020-07-30 2020-07-30 Multi-camera module, camera system, electronic equipment and automatic zooming imaging method
CN202010751498.0A CN114070994B (en) 2020-07-30 2020-07-30 Image pickup module device, image pickup system, electronic apparatus, and auto-zoom imaging method
CN2020107514980 2020-07-30
CN2020107530907 2020-07-30
PCT/CN2021/109581 WO2022022682A1 (en) 2020-07-30 2021-07-30 Photographing module apparatus, multi-camera photographing module, photographing system, electronic device, and auto-zoom imaging method

Publications (1)

Publication Number Publication Date
CN116250246A true CN116250246A (en) 2023-06-09

Family

ID=80037659

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180060940.8A Pending CN116250246A (en) 2020-07-30 2021-07-30 Image pickup module device, multi-image pickup module, image pickup system, electronic apparatus, and auto-zoom imaging method

Country Status (2)

Country Link
CN (1) CN116250246A (en)
WO (1) WO2022022682A1 (en)


Also Published As

Publication number Publication date
WO2022022682A1 (en) 2022-02-03

Similar Documents

Publication Publication Date Title
US20200326502A1 (en) Interchangeable lens, camera body, and camera
US7405762B2 (en) Camera having AF function
TWI471630B (en) Auto-focus system and method of a digital camera
JP4115801B2 (en) 3D imaging device
JP5168797B2 (en) Imaging device
CN105704380A (en) Camera focusing method and electric device
JP2008541161A (en) Digital camera with triangulation autofocus system and associated method
CN101251706A (en) Optical die set, camera and mobile terminal equipment
CN108833795B (en) Focusing method and device of image acquisition equipment
EP3328056B1 (en) Focusing processing method and apparatus, and terminal device
CN103475805A (en) Active range focusing system and active range focusing method
EP4006623A1 (en) Optical anti-shake apparatus and control method
US7162151B2 (en) Camera
JP2001272591A (en) Electronic still camera
WO2017028652A1 (en) Lens, camera, package inspection system and image processing method
CN103852954A (en) Method for achieving phase focusing
CN116250246A (en) Image pickup module device, multi-image pickup module, image pickup system, electronic apparatus, and auto-zoom imaging method
CN112640421A (en) Exposure method, exposure device, shooting equipment, movable platform and storage medium
JP2001141982A (en) Automatic focusing device for electronic camera
CN114070994B (en) Image pickup module device, image pickup system, electronic apparatus, and auto-zoom imaging method
JP2001141984A (en) Automatic focusing device for electronic camera
JP2007233033A (en) Focusing device and imaging apparatus
JP2001141983A (en) Automatic focusing device for electronic camera
CN113973171B (en) Multi-camera shooting module, camera shooting system, electronic equipment and imaging method
JP4740477B2 (en) Stereoscopic imaging adapter lens and stereoscopic imaging system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination