CN114928688A - Array camera module, electronic equipment with array camera module and image processing method - Google Patents


Info

Publication number
CN114928688A
CN114928688A (application number CN202210566600.9A)
Authority
CN
China
Prior art keywords
camera module
optical lens
photosensitive chip
array
chip
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210566600.9A
Other languages
Chinese (zh)
Other versions
CN114928688B
Inventor
赵波杰
方银丽
陈振宇
Current Assignee
Ningbo Sunny Opotech Co Ltd
Original Assignee
Ningbo Sunny Opotech Co Ltd
Priority date
Filing date
Publication date
Application filed by Ningbo Sunny Opotech Co Ltd
Priority application: CN202210566600.9A
Publication of CN114928688A
Application granted
Publication of CN114928688B
Legal status: Active


Classifications

    All under H ELECTRICITY → H04 ELECTRIC COMMUNICATION TECHNIQUE → H04N PICTORIAL COMMUNICATION, e.g. TELEVISION → H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof:
    • H04N23/54 — Constructional details: mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N23/10 — Generating image signals from different wavelengths
    • H04N23/55 — Constructional details: optical parts specially adapted for electronic image sensors; mounting thereof
    • H04N23/67 — Control of cameras or camera modules: focus control based on electronic image sensor signals
    • H04N23/951 — Computational photography systems: using two or more images to influence resolution, frame rate or aspect ratio

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Color Television Image Signal Generators (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Studio Devices (AREA)

Abstract

The array camera module comprises a first camera module, a second camera module and a circuit board, wherein the first camera module and the second camera module are each electrically connected to the circuit board. The first camera module comprises a first photosensitive chip and a first optical lens corresponding to the first photosensitive chip, and the second camera module comprises a second photosensitive chip and a second optical lens corresponding to the second photosensitive chip. The equivalent focal length f1 of the first optical lens is smaller than the equivalent focal length f2 of the second optical lens, the field angle FOV1 of the first optical lens is larger than the field angle FOV2 of the second optical lens, the first photosensitive chip is a color photosensitive chip, and the second photosensitive chip is a black-and-white photosensitive chip.

Description

Array camera module, electronic equipment with array camera module and image processing method
Technical Field
The present invention relates to the field of camera modules, and in particular, to an array camera module, an electronic device having the same, and an image processing method.
Background
With the advancement of science and technology and the development of the economy, people place ever higher demands on the camera functions of portable electronic equipment such as tablet computers, video cameras and smartphones. Under this trend, the array camera module has emerged. As the name suggests, an array camera module includes two or more camera modules; compared with a traditional single camera module, it can realize multi-feature functions through cooperation between the modules, for example achieving an "optical zoom" effect through two camera modules of different focal lengths.
More specifically, the existing array camera module with the "optical zoom" function includes two camera modules with different focal lengths, namely a wide-angle camera module (small focal length, large field angle) and a telephoto camera module (large focal length, small field angle). The wide-angle camera module has a wider viewing range but has difficulty clearly capturing distant objects, while the telephoto camera module has a narrower viewing range and can capture objects at a greater distance. The "optical zoom" function is therefore realized through the complementary matching of the wide-angle and telephoto camera modules together with a corresponding background image processing algorithm.
However, since the telephoto camera module has a relatively large height dimension, the overall product height of the array camera module is difficult to reduce. This is a significant obstacle to the application of the array camera module, for example when it is assembled into an electronic device as its image capturing tool, and in particular it is difficult to meet the current trend toward thinner electronic devices.
Furthermore, in the existing array camera module with the "optical zoom" function, the photosensitive chips corresponding to both the telephoto camera module and the wide-angle camera module are color photosensitive chips. During image acquisition, the amount of incoming light of the telephoto camera module is insufficient due to its height limitation; in addition, the photosensitive capability of the color photosensitive chip is relatively weak, so the color image acquired by the telephoto camera module is dark. This drawback is more pronounced when the ambient light is relatively dim.
Therefore, some existing array camera modules with an "optical zoom" function have no choice but to close the telephoto camera module in a dark environment and set only the wide-angle camera module to the working state. That is, when shooting in the dark, some conventional array camera modules are array camera modules in name only and cannot fully exert their imaging advantages.
In addition, due to the size limitation of the telephoto camera module, the existing array camera module with the "optical zoom" function is generally limited to about 2x "optical zoom". Increasing the "optical zoom" magnification requires increasing the height of the telephoto camera module, which correspondingly increases the overall product height of the array camera module and destroys its industrial utility.
Disclosure of Invention
An object of the present invention is to provide an array camera module, an electronic device having the same and an image processing method, wherein the array camera module includes a first camera module and a second camera module, an equivalent focal length f1 of the first camera module is smaller than an equivalent focal length f2 of the second camera module, a first photosensitive chip of the first camera module is a color photosensitive chip, and a second photosensitive chip of the second camera module is a black and white photosensitive chip to enhance capturing capability of light.
Another objective of the present invention is to provide an array camera module, an electronic device having the array camera module, and an image processing method, wherein the second photosensitive chip of the second camera module is a black-and-white photosensitive chip, and the area of the unit pixel of the black-and-white photosensitive chip can be reduced compared to the area of the unit pixel of the color photosensitive chip, so that the size of the second photosensitive chip can be correspondingly reduced under the requirement of the same number of pixels, so that the second photosensitive chip of the second camera module can be disposed closer to the second optical lens. That is, the back focus required for the second camera module is reduced, so that the overall height dimension of the array camera module can be reduced.
Another objective of the present invention is to provide an array camera module, an electronic device having the array camera module, and an image processing method, wherein the second photosensitive chip of the second camera module is a black-and-white photosensitive chip, so as to reduce the overall height of the array camera module, thereby properly increasing the equivalent focal length f2 of the second camera module on the premise of ensuring that the height of the array camera module meets a certain size requirement, so as to improve the "optical zoom" magnification of the array camera module, for example, more than 2 times "optical zoom".
Another objective of the present invention is to provide an array camera module, an electronic device having the array camera module, and an image processing method, wherein the unit pixel size of the black-and-white photosensitive chip can be reduced to less than 1.0 × 1.0 um, so that, compared with the color photosensitive chip, a black-and-white photosensitive chip with the same number of pixels has a relatively smaller size, which is beneficial to reducing the overall height dimension of the array camera module.
Another object of the present invention is to provide an array camera module, an electronic device having the array camera module, and an image processing method, in which the black and white photosensitive chip has relatively good detail rendering capability, so that when a relatively distant target is photographed by the second camera module, an image acquired by the second camera module has relatively better detail features, thereby improving the visual effect of subsequent image synthesis.
Another objective of the present invention is to provide an array camera module, an electronic device having the array camera module, and an image processing method, wherein the photosensitive chip of the first camera module is a color photosensitive chip and the photosensitive chip of the second camera module is a black-and-white photosensitive chip, and the pixel density of the black-and-white photosensitive chip (under the same chief ray angle, CRA) is higher than or equal to that of the color photosensitive chip, so that in the subsequent "optical zoom" process, the image of a relatively distant target acquired by the second camera module can maintain a relatively high definition level.
Another objective of the present invention is to provide an array camera module, an electronic device having the array camera module, and an image processing method, wherein a first image acquired by the first camera module and a second image acquired by the second camera module can be fused according to a predetermined image fusion method, so as to achieve a visual effect of "optical zoom".
Other advantages and features of the invention will become apparent from the following description and may be realized by means of the instrumentalities and combinations particularly pointed out in the appended claims.
In accordance with the present invention, the foregoing and other objects and advantages can be achieved by an array camera module comprising:
a first camera module;
a second camera module; and
a circuit board, wherein the first camera module and the second camera module are each electrically connected to the circuit board, wherein the first camera module includes a first photosensitive chip and a first optical lens corresponding to the first photosensitive chip, wherein the second camera module includes a second photosensitive chip and a second optical lens corresponding to the second photosensitive chip, wherein a focal length f1 of the first optical lens is smaller than a focal length f2 of the second optical lens, a field angle FOV1 of the first optical lens is larger than a field angle FOV2 of the second optical lens, and wherein the first photosensitive chip is a color photosensitive chip and the second photosensitive chip is a black-and-white photosensitive chip.
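The claimed relationship between f1 < f2 and FOV1 > FOV2 can be checked with the usual thin-lens field-of-view formula. The sketch below is illustrative only: the shared 6 mm sensor diagonal and the 4 mm and 8 mm focal lengths are assumptions, not values from the patent.

```python
import math

# Hedged sketch: for a sensor of diagonal d and a lens of focal length f,
# the diagonal field of view is FOV = 2*atan(d / (2*f)), so a smaller
# focal length yields a larger field of view (f1 < f2 implies FOV1 > FOV2).
def diagonal_fov_deg(sensor_diag_mm: float, focal_mm: float) -> float:
    return math.degrees(2.0 * math.atan(sensor_diag_mm / (2.0 * focal_mm)))

# Hypothetical numbers: same 6 mm sensor diagonal, 4 mm wide vs 8 mm tele.
fov1 = diagonal_fov_deg(6.0, 4.0)  # wide-angle module
fov2 = diagonal_fov_deg(6.0, 8.0)  # telephoto module
assert fov1 > fov2
print(round(fov1, 1), round(fov2, 1))  # -> 73.7 41.1
```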
In an embodiment of the invention, the unit pixel area of the black-and-white photosensitive chip is smaller than that of the color photosensitive chip, so that the size of the second photosensitive chip is relatively reduced under the requirement of the same pixel number, so that the second optical lens can be relatively more adjacently mounted on the top side of the second photosensitive chip.
In an embodiment of the invention, a unit pixel size of the black-and-white photosensitive chip is set to be less than 1.0 × 1.0um.
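To illustrate why a sub-1.0 um unit pixel shrinks the chip, the following sketch computes the active-area dimensions as pixel count times pixel pitch. The 12-megapixel resolution and the 1.2 um and 0.8 um pitches are hypothetical examples, not values stated in the patent.

```python
# Hedged sketch: at a fixed pixel count, the active-area dimensions of a
# photosensitive chip scale linearly with the unit pixel pitch, which is
# why a sub-1.0 um black-and-white chip can be smaller than a color chip
# of the same resolution.
def active_area_mm(width_px: int, height_px: int, pitch_um: float):
    return (width_px * pitch_um / 1000.0, height_px * pitch_um / 1000.0)

color_wh = active_area_mm(4000, 3000, 1.2)  # hypothetical 1.2 um color chip
mono_wh = active_area_mm(4000, 3000, 0.8)   # hypothetical 0.8 um mono chip
print(color_wh, mono_wh)  # mono active area is 2/3 the size per side
```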
In an embodiment of the invention, the circuit board includes a first circuit board and a second circuit board, wherein the first camera module is conductively mounted on the first circuit board, and the second camera module is conductively mounted on the second circuit board.
In an embodiment of the invention, the array camera module further includes a bracket, wherein the bracket is fixed to the outer peripheries of the first camera module and the second camera module for positioning the first camera module and the second camera module.
In an embodiment of the invention, the first camera module further includes a first base, the first base is mounted on the circuit board and forms a first light-passing hole, the first light-passing hole corresponds to the first photosensitive chip, and the first optical lens is mounted on a top side of the first base and is retained in a photosensitive path of the first photosensitive chip; the second camera module also comprises a second base, the second base is arranged on the circuit board, a second light through hole is formed, the second light through hole corresponds to the second photosensitive chip, and the second optical lens is arranged on the top side of the second base and is kept in a photosensitive path of the second photosensitive chip.
In an embodiment of the invention, the first base integrally extends to the second base.
According to another aspect of the present invention, an array camera module is provided, which includes:
a first camera module;
a second camera module; and
a circuit board, wherein the first camera module and the second camera module are each electrically connected to the circuit board, wherein the first camera module comprises a first photosensitive chip and a first optical lens corresponding to the first photosensitive chip, and the second camera module comprises a second photosensitive chip and a second optical lens corresponding to the second photosensitive chip, wherein the focal length f1 of the first optical lens is smaller than the focal length f2 of the second optical lens and the field angle FOV1 of the first optical lens is larger than the field angle FOV2 of the second optical lens, wherein the first photosensitive chip is a color photosensitive chip and the second photosensitive chip is a black-and-white photosensitive chip, and wherein the pixel point area of the black-and-white photosensitive chip is smaller than that of the color photosensitive chip, so that under the same pixel-count requirement the second photosensitive chip is relatively reduced in size and the second optical lens can be mounted relatively closer to a top side of the second photosensitive chip.
According to another aspect of the present invention, the present invention further provides an image processing method for an array camera module, wherein the image processing method comprises the steps of:
acquiring first image information of a detected target by a first camera module;
acquiring second image information of a detected object by a second camera module, wherein the field angle FOV1 of the first camera module is larger than the field angle FOV2 of the second camera module; and
fusing the first image information and the second image information to obtain third image information, wherein the third image information is associated with the first image information and the second image information.
Further objects and advantages of the invention will be fully apparent from the ensuing description and drawings.
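A minimal sketch of the acquire-acquire-fuse steps above is given below. It is not the patent's actual fusion algorithm: it assumes the telephoto module's field of view corresponds to the central half of the wide image, and it transfers the black-and-white tele image's per-pixel luminance onto the matching crop of the color wide image. All shapes and the 2x FOV ratio are assumptions for illustration.

```python
import numpy as np

# Hedged sketch of the fusion step: keep the wide module's color ratios,
# replace its luminance with the (sharper) tele module's luminance.
def fuse(wide_rgb: np.ndarray, tele_gray: np.ndarray) -> np.ndarray:
    h, w, _ = wide_rgb.shape
    # Central crop of the wide image assumed to cover the tele FOV.
    crop = wide_rgb[h // 4: 3 * h // 4, w // 4: 3 * w // 4].astype(np.float64)
    lum = crop.mean(axis=2, keepdims=True)
    # Scale each color pixel so its luminance matches the tele image
    # (guarding against division by zero in black regions).
    ratio = tele_gray[..., None] / np.maximum(lum, 1e-6)
    return np.clip(crop * ratio, 0, 255).astype(np.uint8)

wide = np.full((8, 8, 3), 100, dtype=np.uint8)   # flat gray color frame
tele = np.full((4, 4), 200, dtype=np.float64)    # brighter mono frame
third = fuse(wide, tele)
print(third.shape, int(third[0, 0, 0]))  # -> (4, 4, 3) 200
```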
These and other objects, features and advantages of the present invention will become more fully apparent from the following detailed description, the accompanying drawings and the claims.
Drawings
Fig. 1 is a schematic perspective view of an array camera module according to a first preferred embodiment of the invention.
Fig. 2 is an exploded view of the array camera module according to the above preferred embodiment.
Fig. 3 is a partially enlarged perspective exploded view of a first photosensitive chip and a second photosensitive chip of the dual camera module according to the above preferred embodiment.
Fig. 4 is a schematic diagram illustrating a change in relative distance between the second optical lens and the second photosensitive chip (from h1 to h2) and a change in size of the second optical lens when the second photosensitive chip corresponding to the second camera module is replaced with a black-and-white photosensitive chip from a color photosensitive chip.
Fig. 5 is a schematic cross-sectional view illustrating a change (from h1 to h2) in the relative distance between the second optical lens and the second photosensitive chip when the second photosensitive chip corresponding to the second image capturing module is replaced by a black-and-white photosensitive chip from a color photosensitive chip.
Fig. 6 is a schematic cross-sectional view of the array camera module according to the above preferred embodiment.
Fig. 7 shows a modified embodiment of the array camera module according to the above preferred embodiment.
Fig. 8 shows another modified embodiment of the array camera module according to the above preferred embodiment.
Fig. 9 shows a further modified embodiment of the array camera module according to the above preferred embodiment.
Fig. 10 shows another modified embodiment of the array camera module according to the above preferred embodiment.
Fig. 11 is a view showing still another modified example of the array camera module according to the above preferred embodiment.
Fig. 12 shows a further modified embodiment of the array camera module according to the above preferred embodiment.
Fig. 13 shows another modified embodiment of the array camera module according to the above preferred embodiment.
Fig. 14 is a view of another variation of the array camera module according to the above preferred embodiment, which illustrates that an optical lens at the top side of the second optical lens of the second camera module is a glass lens.
Fig. 15 is a cross-sectional view of an array camera module according to a second preferred embodiment of the invention.
Fig. 16 is a schematic diagram illustrating an image capturing effect of the array camera module according to the first preferred embodiment and the second preferred embodiment.
Fig. 17 is a block diagram illustrating an image processing method of an array camera module according to the present invention.
Fig. 18 is a block diagram illustrating step S3 of the image processing method according to the present invention.
Fig. 19 is a block diagram illustrating step S31 of the image processing method according to the present invention.
Fig. 20 is a block diagram illustrating step S33 of the image processing method according to the present invention.
Fig. 21 shows an example of one of the corresponding relationships between the array pixels and the imaging units in step S311 of the image processing method provided by the present invention.
Fig. 22 illustrates a corresponding solution relationship between the RGB information of each imaging unit and the corresponding array pixel in the step S311 of the image processing method provided by the present invention.
Fig. 23 shows an example of a second corresponding relationship between the array pixels and the imaging units in step S311 of the image processing method provided by the present invention.
Fig. 24 shows an example of the correspondence relationship between the array pixels and the imaging units of the step S311A of the image processing method provided by the present invention.
Fig. 25 is a schematic diagram illustrating a process of processing a first image and a second image acquired by the array camera module according to the image processing method.
Fig. 26 is another schematic diagram illustrating an image capturing effect of the array camera module according to the first preferred embodiment and the second preferred embodiment.
Fig. 27 is a block diagram illustrating an image processing method corresponding to the image capturing effect of the array camera module of fig. 26.
Fig. 28 is a schematic block diagram of the image processing method corresponding to the image capturing effect of the array camera module of fig. 27.
Fig. 29 is a schematic diagram illustrating a process of processing a first image and a second image captured by the array camera module according to the image processing method illustrated in fig. 27.
Fig. 30 is a schematic perspective view illustrating the array camera module assembled in an electronic device.
Fig. 31 is another perspective view illustrating the array camera module assembled in an electronic device.
Detailed Description
The following description is presented to disclose the invention so as to enable any person skilled in the art to practice the invention. The preferred embodiments in the following description are given by way of example only, and other obvious variations will occur to those skilled in the art. The underlying principles of the invention, as defined in the following description, may be applied to other embodiments, variations, modifications, equivalents, and other technical solutions without departing from the spirit and scope of the invention.
It will be understood by those skilled in the art that in the present disclosure, the terms "longitudinal," "lateral," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like are used in an orientation or positional relationship indicated in the drawings for ease of description and simplicity of description, and do not indicate or imply that the referenced devices or components must be constructed and operated in a particular orientation and thus are not to be considered limiting.
It is to be understood that the terms "a" and "an" are to be interpreted as meaning that a number of elements in one embodiment may be one and a number of elements in another embodiment may be plural, and the terms "a" and "an" are not to be interpreted as limiting the number.
As shown in fig. 1 to fig. 3, an array camera module according to a first preferred embodiment of the invention is illustrated, wherein the array camera module comprises a first camera module 10 and a second camera module 20, and the first camera module 10 and the second camera module 20 cooperate to realize an "optical zoom" imaging function. In particular, in the present invention, the equivalent focal length f1 of the first camera module 10 is set to be smaller than the equivalent focal length f2 of the second camera module 20, and accordingly the field angle FOV1 of the first camera module 10 is larger than the field angle FOV2 of the second camera module 20. That is to say, as shown in fig. 1, the first camera module 10 is configured as a wide-angle camera module and the second camera module 20 is configured as a telephoto camera module. During shooting, the viewing range of the wide-angle camera module is wider, but it is difficult to clearly capture details of distant objects, while the viewing range of the telephoto camera module's optical lens is narrower and can capture objects at a relatively greater distance. The "optical zoom" function is thus implemented by the complementary matching of the wide-angle and telephoto camera modules together with a corresponding background image processing algorithm.
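As a back-of-the-envelope illustration of the wide/tele cooperation described above, the nominal "optical zoom" ratio of such a dual-module system is commonly taken as the ratio of the two equivalent focal lengths. The 26 mm and 52 mm equivalent focal lengths below are hypothetical, not values from the patent.

```python
# Hedged sketch: nominal zoom ratio f2/f1 for a wide/tele module pair.
def optical_zoom_ratio(f1_mm: float, f2_mm: float) -> float:
    """Return f2/f1, requiring a valid wide-angle/telephoto pairing."""
    if f1_mm <= 0 or f2_mm <= f1_mm:
        raise ValueError("expected 0 < f1 < f2 for a wide/tele pair")
    return f2_mm / f1_mm

# A 26 mm-equivalent wide module and a 52 mm-equivalent tele module give
# the roughly 2x zoom cited for existing modules in the background section.
print(optical_zoom_ratio(26.0, 52.0))  # -> 2.0
```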
The existing array camera module with the "optical zoom" function has a relatively large height dimension, so the overall height of the product cannot be reduced. This limitation greatly restricts the application of the array camera module and, in particular, cannot meet the demand for thinner electronic devices. Meanwhile, as noted above, in the existing array camera module with the "optical zoom" function, the photosensitive chips corresponding to both the telephoto camera module and the wide-angle camera module are color photosensitive chips. During image acquisition, the amount of incoming light of the telephoto camera module is insufficient due to its height limitation; in addition, the photosensitive capability of the color photosensitive chip is relatively weak, so the color image acquired by the telephoto camera module is dark. This drawback is more pronounced when the ambient light is dim, with the result that the color image captured by the telephoto camera module is not suitable for subsequent image synthesis and may even negatively affect it.
Accordingly, the array camera module provided by the invention aims to solve the technical problems of the array camera module in the aspects of overall size, imaging effect and the like so as to better adapt to the current development trend of electronic equipment.
More specifically, as shown in fig. 3, in the preferred embodiment of the present invention, the first camera module 10 includes a first photosensitive chip 11 and a first optical lens 12, and the first optical lens 12 is held in a photosensitive path of the first photosensitive chip 11, so that the imaging light of the detected object collected by the first optical lens 12 can reach the first photosensitive chip 11 along the photosensitive path, and an imaging reaction occurs at the first photosensitive chip 11. The second camera module 20 includes a second photosensitive chip 21 and a second optical lens 22, and similarly, the second optical lens 22 is held in a photosensitive path of the second photosensitive chip 21, so that the imaging light of the detected object collected by the second optical module can reach the second photosensitive chip 21 along the photosensitive path, and an imaging reaction occurs at the second photosensitive chip 21.
Particularly, in the preferred embodiment of the present invention, the first photosensitive chip 11 corresponding to the first optical lens 12 is a color photosensitive chip 110, and the second photosensitive chip 21 corresponding to the second optical lens 22 is a black-and-white photosensitive chip 210. That is, in the present invention, the photosensitive chip of the telephoto camera module is changed from the existing color photosensitive chip to a black-and-white photosensitive chip. The size of the unit pixel 2130 of the black-and-white photosensitive chip 210 can be reduced compared with that of the color photosensitive chip, so that under the same pixel-count requirement the black-and-white photosensitive chip 210 can be smaller than a color photosensitive chip with the same number of pixels. Accordingly, when the size of the second photosensitive chip 21 of the second camera module 20 is reduced, the second optical lens 22 of the second camera module 20 may be mounted closer to the top side of the second photosensitive chip 21. Intuitively, the mounting position of the second optical lens 22 is thereby lowered, so that the overall height dimension of the array camera module can be effectively reduced.
More specifically, as shown in fig. 3, the color sensor chip 110 has a multi-layer structure comprising a microlens layer 111, a color filter layer 112 and a photosensitive layer 113. When the light collected by the first optical lens 12 reaches the color sensor chip 110, it first passes through the microlens layer 111 and is filtered by the color filter layer 112, so that the light finally reaching each pixel 1130 of the photosensitive layer 113 is monochromatic (red, green or blue). It should be understood by those skilled in the art that, in later imaging, an imaging unit is constructed from a set of the pixels 1130 of the color sensor chip 110, for example one imaging unit per 4 pixels 1130, and the original color information of the measured object is restored by a corresponding algorithm. Based on these imaging characteristics, the unit pixel size of the conventional color sensor chip is usually larger than 1 × 1 um, for example 1.2 × 1.2 um, to ensure that each pixel 1130 can acquire the corresponding monochromatic light information.
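The "4 pixels per imaging unit" idea can be sketched as follows, assuming an RGGB Bayer layout (the patent does not specify the filter pattern): each 2x2 block of monochromatic samples is collapsed into one RGB imaging unit, averaging the two green samples. The sample values are hypothetical.

```python
import numpy as np

# Hedged sketch: reconstruct one RGB value per 2x2 RGGB imaging unit.
# This is a simplified stand-in for the "corresponding algorithm" the
# text mentions, not the patent's actual demosaicing method.
def rggb_to_rgb(mosaic: np.ndarray) -> np.ndarray:
    r = mosaic[0::2, 0::2]                              # top-left: red
    g = (mosaic[0::2, 1::2] + mosaic[1::2, 0::2]) / 2.0  # two greens, averaged
    b = mosaic[1::2, 1::2]                              # bottom-right: blue
    return np.stack([r, g, b], axis=-1)

# Hypothetical 2x2 unit: R=250, G=120 and 130, B=40.
mosaic = np.array([[250.0, 120.0],
                   [130.0, 40.0]])
print(rggb_to_rgb(mosaic)[0, 0])  # R, averaged G, B for the single unit
```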
In short, the difference between the black-and-white sensor chip 210 and the color sensor chip 110 is that no color filter layer is disposed between the microlens layer 211 and the photosensitive layer 213 of the black-and-white sensor chip 210. That is, the light finally reaching each pixel 2130 of the black-and-white sensor chip 210 is white light. In addition, the black-and-white sensor chip 210 does not collect color information of the target to be measured, and each pixel 2130 of the black-and-white sensor chip 210 is used directly for imaging, unlike the color sensor chip 110, in which a certain number of pixels 1130 must form an imaging unit for the color image synthesis operation. Based on this property, the black-and-white sensor chip 210 can reduce the size of its unit pixel 2130 relative to the size of the unit pixel 1130 of the color sensor chip 110 while still ensuring imaging, thereby reducing the size of the second photosensitive chip.
More specifically, as shown in fig. 4 and fig. 5, the unit pixel size of the conventional color sensor chip 210P is substantially 1.0 × 1.0 um or more, while the unit pixel 2130 size of the black-and-white sensor chip 210 may be less than 1.0 × 1.0 um, such as 0.8 × 0.8 um or 0.9 × 0.9 um. That is, when the second photosensitive chip 21 of the second camera module 20 is changed from the existing color photosensitive chip 210P to the black-and-white photosensitive chip 210, the size of the second photosensitive chip 21 can be relatively reduced, so that the relative distance between the second optical lens 22 and the second photosensitive chip 21 of the second camera module 20 can be reduced, thereby reducing the overall height of the array camera module.
As shown in fig. 5, when the area of the second photosensitive chip 21, especially its imaging area, is reduced, the optical path required for external light to reach the corresponding photosensitive area of the second photosensitive chip 21 through the second optical lens 22 can be shortened (from h1 to h2), so that the overall height of the array camera module is effectively reduced. When the imaging area corresponding to the second optical lens 22 is reduced, the influence of the off-axis aberration of the second optical lens 22 is also smaller. Meanwhile, the position of a given field range used to evaluate the imaging capability of the lens (for example, the 0.6 field) lies closer to the center of the second photosensitive chip 21, so that the absolute size of the field angle of the second optical lens 22 can be reduced, further reducing the difficulty of lens design. It can also be understood that, under the same quality requirement, a smaller chip (the second photosensitive chip 21) or a smaller imaging surface allows a smaller lens (the second optical lens 22), as shown in fig. 4. It should be appreciated that the distance h1 between the second optical lens 22P and the second photosensitive chip 21P and the distance h2 between the second optical lens 22 and the second photosensitive chip 21 shown in fig. 4 and fig. 5 are only schematic: they show that when the size of the second photosensitive chip 21 is reduced, the distance between the second optical lens 22 and the second photosensitive chip 21 can be reduced, and that the overall size of the second optical lens 22 can be reduced compared with the existing telephoto optical lens, so that the overall size of the array camera module can be further reduced.
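The geometry behind this size reduction can be illustrated with the standard image-height relation h = f·tan(θ): for a fixed focal length, a smaller sensor needs a smaller field angle. The numerical values below are purely illustrative and are not taken from the patent.

```python
import math

def required_half_fov_deg(sensor_half_diag_mm: float, focal_mm: float) -> float:
    """Half field of view (degrees) an optical lens of the given focal
    length must cover for a sensor of the given half-diagonal, from the
    image-height relation h = f * tan(theta)."""
    return math.degrees(math.atan(sensor_half_diag_mm / focal_mm))

# Hypothetical numbers: shrinking the chip half-diagonal from 2.0 mm to
# 1.6 mm at the same 6 mm focal length narrows the field angle the
# second optical lens must cover, easing the lens design.
fov_large = required_half_fov_deg(2.0, 6.0)
fov_small = required_half_fov_deg(1.6, 6.0)
assert fov_small < fov_large
```

Equivalently, the same field angle can be covered with a shorter focal length, which is the height saving the passage describes.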
In addition, since the black-and-white photosensitive chip 210 is not provided with a color filter layer 112, such as a Bayer filter, the light radiated to the second photosensitive chip 21 is not attenuated by a color filter layer 112 and retains a relatively high light intensity. Therefore, even if the amount of incoming light is insufficient, the image acquired by the second camera module 20 still has relatively high light-dark contrast information thanks to the strong light-sensing performance of the black-and-white photosensitive chip 210. That is, when the second photosensitive chip 21 of the second camera module 20 is configured as a black-and-white photosensitive chip 210, the array camera module has relatively good dark-state shooting performance, so that even when shooting in a dark state, the array camera module still provides a relatively high-quality "optical zoom" visual effect, and the telephoto camera module does not need to be selectively turned off, as it does in some existing array camera modules with an "optical zoom" function.
It should be noted that, as those skilled in the art will know, the "optical zoom" magnification of the array camera module depends on the relative relationship between the equivalent focal length f1 of the first camera module 10 and the equivalent focal length f2 of the second camera module 20. The reason why the "optical zoom" magnification of a conventional array camera module with the "optical zoom" function usually does not exceed 2 times is that raising the magnification requires raising the equivalent focal length f2 of the second camera module 20. However, an increase of the equivalent focal length f2 of the second camera module 20 means that the height of the second optical lens 22 of the second camera module 20 needs to be lengthened, which is clearly at odds with the requirement of reducing the size of the array camera module.
However, in the preferred embodiment of the present invention, the second photosensitive chip 21 of the second camera module 20 is implemented as a black-and-white photosensitive chip 210, and the size of the black-and-white photosensitive chip 210 is relatively reduced compared with that of a color photosensitive chip, so that the second optical lens 22 of the second camera module 20 can be brought closer to the second photosensitive chip 21. Therefore, in the present invention, the "optical zoom" magnification of the array camera module can be set above 2 times, for example 3 times; it is only required that the reduction of the height dimension brought by the black-and-white photosensitive chip 210 (including the installation sinking amount (h1-h2) of the second optical lens 22 and the reduction of the height dimension of the second optical lens 22 itself) be larger than the length increase caused by raising the equivalent focal length f2 of the second optical lens 22.
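The magnification relationship stated above can be made concrete: the "optical zoom" ratio of a dual module is the ratio of the two equivalent focal lengths. The focal-length values below are hypothetical examples, not figures from the patent.

```python
def optical_zoom_ratio(f1_equiv_mm: float, f2_equiv_mm: float) -> float:
    """'Optical zoom' magnification of the array camera module: the ratio
    of the telephoto equivalent focal length f2 to the wide-angle
    equivalent focal length f1."""
    return f2_equiv_mm / f1_equiv_mm

# Hypothetical equivalent focal lengths, for illustration only:
assert optical_zoom_ratio(26.0, 52.0) == 2.0   # a typical 2x pairing
assert optical_zoom_ratio(26.0, 78.0) == 3.0   # 3x requires a longer f2
```

This is why pushing past 2x demands a longer f2, and why the height savings from the smaller black-and-white chip must outweigh that growth.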
It is also worth mentioning that, in order to enable "lossless zoom" of an image during subsequent optical zooming, i.e., so that the sharpness of the image remains at a high level throughout the zoom, the second photosensitive chip 21 preferably has a pixel density per unit angle of view higher than or equal to that of the first photosensitive chip 11. It will be appreciated that when the second photosensitive chip 21 has a higher number of pixels, it can perform a function similar to "super-sampling", so that during optical zooming, even if the image is processed accordingly, for example cropped or enlarged, it can maintain a relatively high level of sharpness, thereby visually achieving the technical effect of "lossless zoom". However, this is difficult to implement in the conventional array camera module with the "optical zoom" function, because when the number of pixels of the second photosensitive chip 21P is increased, the size of the second photosensitive chip 21P increases accordingly, so that, to match the field-angle characteristic of the second optical lens 22P of the second camera module 20P, the bottom surface of the second optical lens 22P must be moved farther from the second photosensitive chip 21P; at the same time, the size of the second optical lens 22P itself also increases, which further increases the overall height of the array camera module.
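The "pixel density per unit angle of view" condition can be expressed as pixels per degree along one image axis. The chip widths and field angles below are hypothetical, chosen only to illustrate the comparison the passage makes.

```python
def angular_pixel_density(pixels_across: int, fov_deg: float) -> float:
    """Pixels per degree of field of view along one image axis."""
    return pixels_across / fov_deg

# Hypothetical figures: a 4000-px-wide wide-angle chip over an 80-degree
# FOV versus a 3000-px-wide telephoto chip over a 30-degree FOV.
wide = angular_pixel_density(4000, 80.0)   # first photosensitive chip
tele = angular_pixel_density(3000, 30.0)   # second photosensitive chip
assert tele >= wide   # the "lossless zoom" condition stated above
```

With these (illustrative) numbers the telephoto chip samples each degree of scene twice as densely as the wide chip, which is what lets a cropped-and-enlarged telephoto view hold its sharpness.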
Accordingly, in the preferred embodiment of the present invention, the second photosensitive chip 21 uses the black-and-white photosensitive chip 210 instead of the color photosensitive chip 210P. As mentioned above, owing to the photosensitive imaging feature of the black-and-white photosensitive chip 210, the size of its unit pixel 2130 can be reduced compared with the area of the unit pixel 2130P of the color photosensitive chip 210P. Therefore, in the preferred embodiment of the present invention, the pixel count of the second photosensitive chip 21 can be appropriately raised while the size of the second photosensitive chip 21 is still kept smaller than that of the color photosensitive chip 210P serving as the second photosensitive chip 21P. In this way, on the one hand, the size of the second photosensitive chip 21 can be reduced to lower the installation height of the second optical lens 22; on the other hand, with the pixel count of the second photosensitive chip 21 appropriately raised, the image of a relatively distant measured target acquired by the second camera module 20 can maintain a relatively high definition level during the subsequent optical zoom process. In other words, adopting the technical scheme in which the second photosensitive chip 21 is a black-and-white photosensitive chip gives the designer more design parameter choices, so that imaging effect and overall size can be better balanced.
Further, in the preferred embodiment of the present invention, the array camera module is an integrated array camera module. As shown in fig. 6, the first camera module 10 and the second camera module 20 are conductively mounted on the same circuit board 30, so that the circuit board 30 provides a positioning and mounting base for the first camera module 10 and the second camera module 20. In the subsequent mounting and calibration process, the reference plane defined by the circuit board 30 can be used as the base plane for calibrating the optical axes of the first camera module 10 and the second camera module 20 to be parallel.
It is easily contemplated that, in another embodiment of the present invention, as shown in fig. 7, the array camera module may be implemented as a split array camera module, wherein the first camera module 10 and the second camera module 20 are conductively mounted to a first circuit board 31 and a second circuit board 32, respectively. Accordingly, in order to ensure that the relative installation positions of the first camera module 10 and the second camera module 20 satisfy a certain relationship, for example that their optical axes are parallel or kept at a predetermined distance, the array camera module further includes a bracket 40, wherein the bracket 40 is fixed to the outer peripheries of the first camera module 10 and the second camera module 20 by a bonding adhesive layer so as to position the first camera module 10 and the second camera module 20.
In order to further limit and stabilize the relative position between the first camera module 10 and the second camera module 20, in some embodiments of the present invention, as shown in fig. 8, the array camera module further includes a connecting frame beam 60, wherein the connecting frame beam 60 extends between the first camera module 10 and the second camera module 20.
Further, in the preferred embodiment of the present invention, the first camera module 10 further includes a first base 51 mounted on the circuit board 30, wherein the first optical lens 12 is mounted on the top side of the first base 51, so that the first optical lens 12 is held by the first base 51 on the photosensitive path of the first photosensitive chip 11. It should be noted that the first base 51 forms a first light-passing hole 510 corresponding at least to the photosensitive area of the first photosensitive chip 11, so that the first light-passing hole 510 forms part of the photosensitive path between the first optical lens 12 and the first photosensitive chip 11. Similarly, the second camera module 20 further includes a second base 52 mounted on the circuit board 30, wherein the second optical lens 22 is mounted on the top side of the second base 52, so that the second optical lens 22 is held by the second base 52 on the photosensitive path of the second photosensitive chip 21. It should be noted that the second base 52 forms a second light-passing hole 520 corresponding at least to the photosensitive area of the second photosensitive chip 21, so that the second light-passing hole 520 forms part of the photosensitive path between the second optical lens 22 and the second photosensitive chip 21.
It should be appreciated that, in the present invention, the first base 51 and the second base 52 may be implemented as a unitary structure or as a split structure. As shown in fig. 3, when the first base 51 and the second base 52 form a unitary structure, the first base 51 integrally extends to the second base 52, so that the first camera module 10 and the second camera module 20 can be further limited by the unitary structure of the first base 51 and the second base 52. It should be appreciated that when the first base 51 and the second base 52 form a unitary structure, they may be integrally formed, for example by a molding process, so as to improve industrial efficiency. Of course, in order to allow the first camera module 10 and the second camera module 20 to be positioned relatively more freely, the first base 51 and the second base 52 may be implemented as a split structure, i.e., the first base 51 and the second base 52 are separate components, as shown in fig. 6.
In the process of assembling and packaging the first camera module 10 and the second camera module 20, the first photosensitive chip 11 and the second photosensitive chip 21 may be mounted on the circuit board 30 by a conventional COB (Chip On Board) process, wherein the first photosensitive chip 11 and the second photosensitive chip 21 are each electrically connected to the circuit board 30 by a set of leads. Of course, in other embodiments of the present invention, the first photosensitive chip 11 and the second photosensitive chip 21 can be mounted on and conducted to the circuit board 30 by other methods, such as an FC (Flip Chip) process.
Accordingly, the base 50 (the first base 51 and the second base 52) can be mounted to the circuit board 30 by conventional gluing, providing corresponding mounting bases for the first optical lens 12 and the second optical lens 22. Of course, in other embodiments of the present invention, as shown in fig. 9 or fig. 10, the base 50 (the first base 51 and the second base 52) may be mounted at the corresponding positions of the circuit board 30 by other methods, for example by integrally molding the base 50 (the first base 51 and the second base 52) at the corresponding positions of the circuit board 30 through a molding or pressing process. More specifically, in the molding or pressing process, the base 50 (the first base 51 and the second base 52) may be formed by a MOB (Molding On Board), MOC (Molding On Chip) or MOG (Molding On Glass) process. It should be appreciated that the first light-passing hole 510 and the second light-passing hole 520 of the base 50 (the first base 51 and the second base 52) are formed simultaneously when the base 50 is molded.
Further, in the preferred embodiment of the invention, the first camera module 10 may further include a first lens carrying element 13, wherein the first optical lens 12 is mounted on the first lens carrying element 13, and the first lens carrying element 13 is mounted on the first base 51, in such a way that the first optical lens 12 is maintained in the photosensitive path of the first photosensitive chip 11. Similarly, the second camera module 20 further includes a second lens carrying element 23, wherein the second optical lens 22 is mounted on the second lens carrying element 23, and the second lens carrying element 23 is mounted on the second base 52, in such a way as to maintain the second optical lens 22 in the photosensitive path of the second photosensitive chip 21.
It should be noted that, in the present invention, the lens carrying elements 13, 23 can be implemented as supporting lens barrels 13A, 23A, so that the corresponding camera modules 10, 20 are implemented as fixed-focus camera modules, or the lens carrying elements 13, 23 can be implemented as driving elements 13B, 23B, so that the corresponding camera modules 10, 20 are implemented as movable-focus camera modules. It should be appreciated that, in the present invention, the types and combinations of the first camera module 10 and the second camera module 20 of the array camera module may take any form, as shown in fig. 10 to 13, and are not intended to limit the scope of the present invention.
Further, in the preferred embodiment of the present invention, the optical lenses (the first optical lens 12 and the second optical lens 22) may be an integral optical lens or a split optical lens. In particular, when the optical lens (the first optical lens 12 and the second optical lens 22) is a split type optical lens, the optical lens includes at least two optical lens cells to form the optical lens (the first optical lens 12 and the second optical lens 22) by assembly of the at least two optical lens cells. It should be appreciated that the respective types and combination of the first optical lens 12 and the second optical lens 22 can be adjusted according to the imaging and size requirements of the first camera module 10 and the second camera module 20, and the invention is not limited thereto.
It should be noted that the optical lenses (the first optical lens 12 and the second optical lens 22) each include at least one optical lens element (120, 220), wherein the at least one optical lens element (120, 220) is arranged in sequence within the optical lens according to a predetermined optical design. Specifically, as shown in fig. 14, in the preferred embodiment of the present invention, the optical lens element 220 located on top of the second optical lens 22 is a glass lens 211. It should be appreciated that, because the glass lens 211 has a relatively high refractive index, it has a stronger light-converging capability than a plastic lens, so that the size of the other lens elements in the direction perpendicular to the optical axis can be reduced accordingly; and, with the curvature of the lens elements unchanged, their size in the optical axis direction can be correspondingly reduced, so that the overall height of the second optical lens is reduced.
Further, as shown in fig. 3, in the preferred embodiment of the present invention, the first camera module 10 further includes a first filter element 14, such as an infrared filter element, wherein the first filter element 14 is installed between the first optical lens 12 and the first photosensitive chip 11 and is maintained in the photosensitive path of the first photosensitive chip 11, so that, through the first filter element 14, the image captured by the first camera module 10 is closer to the visual effect of human eyes. Similarly, the second camera module 20 further includes a second filter element 24, such as an infrared filter element, wherein the second filter element 24 is installed between the second optical lens 22 and the second photosensitive chip 21 and is maintained in the photosensitive path of the second photosensitive chip 21, so that, through the second filter element 24, the image captured by the second camera module 20 is closer to the visual effect of human eyes.
It should be noted that the structure of the array camera module is described above by way of example, so that those skilled in the art can more fully understand the technical features of the array camera module provided by the present invention; it is not intended to limit the present invention.
As shown in fig. 15, an array camera module according to a second preferred embodiment of the present invention is illustrated, wherein the array camera module of the second preferred embodiment is a modified embodiment of the first preferred embodiment. In particular, in the second preferred embodiment of the present invention, the array camera module is implemented as a periscopic array camera module, which includes a first camera module 10A and a second camera module 20A, wherein the first camera module 10A is configured and mounted in the same way as the first camera module 10 of the first preferred embodiment, and the second camera module 20A is implemented as a "periscopic" camera module. In other words, the term "periscopic" camera module means that the second camera module is mounted in a "lying" manner, so that the overall height dimension of the second camera module 20A depends on its dimensions in the length-width direction.
Similarly, in the preferred embodiment of the present invention, the first camera module 10A includes a first photosensitive chip 11A and a first optical lens 12A, wherein the first camera module 10A is longitudinally mounted, so that the first optical lens 12A is longitudinally retained in the photosensitive path of the first photosensitive chip 11A. Correspondingly, the second camera module 20A includes a second photosensitive chip 21A and a second optical lens 22A, wherein the second camera module 20A is transversely arranged with respect to the first camera module, so that the second optical lens 22A is transversely held in the photosensitive path of the second photosensitive chip 21A. In particular, in the preferred embodiment of the present invention, the first photosensitive chip 11A of the first camera module is a color photosensitive chip 110A, and the second photosensitive chip 21A of the second camera module is a black-and-white photosensitive chip 210A. As mentioned above, the size of the unit pixel 2130A of the black-and-white photosensitive chip 210A can be reduced compared with that of the color photosensitive chip 210P, so that, for the same number of pixels, the size of the black-and-white photosensitive chip 210A can be smaller than that of the color photosensitive chip 210P. Since, in the preferred embodiment of the present invention, the second camera module 20A is implemented as a "periscopic" camera module, i.e., the second camera module 20A is installed in a "lying" manner, the overall height dimension of the second camera module 20A depends on its dimensions in the length-width direction. Therefore, in the preferred embodiment of the present invention, when the black-and-white photosensitive chip 210A is used as the second photosensitive chip 21A of the second camera module 20A instead of the existing color photosensitive chip, the overall height of the second camera module 20A can be directly reduced.
Further, as shown in fig. 15, the second camera module 20A implemented as a "periscopic" camera module further includes a light conversion unit 221A and a driving unit 222A for adjusting the light conversion unit, wherein the light conversion unit 221A corresponds to the second optical lens 22A and is laterally retained in the photosensitive path of the second photosensitive chip 21A. In particular, the light conversion unit 221A has the function of directionally changing the light propagation direction: as shown in fig. 15, during specific operation, imaging light longitudinally radiated to the light conversion unit 221A is redirected at the light conversion unit 221A, then transversely passes through the second optical lens 22A, and finally undergoes an imaging reaction at the second photosensitive chip 21A.
It should be noted that, in the preferred embodiment of the present invention, the second photosensitive chip 21A of the second camera module 20A is implemented as a black-and-white photosensitive chip 210A, and the overall size of the second photosensitive chip 21A can be reduced owing to the imaging characteristics of the black-and-white photosensitive chip 210A. Further, corresponding to the field angle FOV2 and the imaging area of the second optical lens 22A, the second optical lens 22A may be mounted closer to the front side of the second photosensitive chip 21A. Viewed intuitively, the overall length dimension of the second camera module can thus be reduced. In addition, when the size of the second photosensitive chip 21A is reduced, in order to match the imaging range of the second photosensitive chip 21A, the overall size (length and width) of the second optical lens 22A can be reduced compared with the existing telephoto optical lens, so that the overall length and height of the second camera module can be further reduced.
It should be appreciated that when the overall size of the second optical lens 22A and the second photosensitive chip 21A of the second camera module 20A is reduced, other parts of the second camera module, such as the light conversion unit 221A, the driving unit 222A and the driving element (not shown in the figure), can be reduced accordingly, so that the overall size of the second camera module is effectively reduced.
Similarly, when the second photo sensor 21A of the second camera module 20A is implemented as a black and white photo sensor 210A, the dark state shooting capability of the array camera module, the "optical zoom" magnification of the array camera module, and the number of pixels of the second photo sensor 21A can all be optimized.
Further, the array camera module further includes an image processing system, wherein the image processing system operates according to a preset image processing method to realize a preset image synthesis function, such as "optical zoom" or "background blurring".
In particular, in the preferred embodiment of the present invention, as shown in fig. 17, the image processing method of the array camera module includes the steps of:
S1, obtaining first image information of a measured target by the first camera module;
S2, obtaining second image information of the measured target by the second camera module; and
S3, fusing the first image information and the second image information to generate third image information, wherein the third image information is associated with the first image information and the second image information.
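The patent does not give a fusion formula at this point in the description. Purely as a hedged toy sketch of step S3, one overlapping pixel could keep the chroma ratio of the wide-angle color sample while adopting the monochrome chip's finer luminance; the function name and the formula are this sketch's own assumptions, not the patented method.

```python
def fuse_pixels(color_px, luma):
    """Toy fusion of one overlapping pixel: keep the chroma ratio of the
    color sample (first image information) but adopt the monochrome
    sample's luminance (second image information). Illustrative only;
    the patent does not specify this formula."""
    r, g, b = color_px
    cur = max(1, (r + g + b) // 3)   # current mean brightness, avoid div by 0
    return tuple(min(255, c * luma // cur) for c in (r, g, b))

# S1 yields the color sample, S2 the luma sample, S3 the fused result.
# When the luma already matches the color sample's mean, it is unchanged:
assert fuse_pixels((120, 60, 30), 70) == (120, 60, 30)
```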
It should be noted that, in the present invention, the field angle FOV1 of the first camera module 10, 10A of the array camera module is larger than the field angle FOV2 of the second camera module 20, 20A, i.e., the first camera module 10, 10A is set as a wide-angle camera module and the second camera module 20, 20A is set as a telephoto camera module. As shown in fig. 16, during shooting, the wide-angle camera module (the first camera module 10, 10A) has a wider view but cannot capture the details of a distant object, while the telephoto camera module (the second camera module 20, 20A) has a narrower view but can capture a relatively distant object, so that the first image information obtained by the first camera module 10, 10A contains the measured-target image content of the second image information obtained by the second camera module 20, 20A. Further, by means of this imaging feature, the array camera module can achieve special imaging effects such as "optical zoom" based on the first image information and the second image information.
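The containment of the telephoto view inside the wide-angle view can be quantified by comparing the tangents of the two half field angles; the FOV values below are hypothetical, used only to illustrate the relation.

```python
import math

def tele_crop_fraction(fov1_deg: float, fov2_deg: float) -> float:
    """Approximate linear fraction of the wide-angle frame (FOV1) that
    the telephoto frame (FOV2) occupies, from the half-angle tangents."""
    return (math.tan(math.radians(fov2_deg / 2))
            / math.tan(math.radians(fov1_deg / 2)))

frac = tele_crop_fraction(80.0, 30.0)   # hypothetical FOV1 > FOV2
assert 0.0 < frac < 1.0   # tele view sits strictly inside the wide view
```

During "optical zoom", the fused output transitions across exactly this central region of the wide frame.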
In particular, in the present invention, the first photosensitive chip 11, 11A of the first camera module 10, 10A is a color photosensitive chip 110, 110A. It will be appreciated by those skilled in the art that the color photosensitive chip 110, 110A includes a color filter layer 112, 112A, so that the light finally reaching each pixel 1130, 1130A on the photosensitive layer 113, 113A of the color photosensitive chip 110, 110A is monochromatic. For convenience of understanding and illustration, the first photosensitive chip 11, 11A is configured to include an M × N pixel array, that is, the first image information includes the monochromatic light information collected by each pixel 1130, 1130A of the M × N pixel array. In the subsequent image processing, the original color information of the measured target needs to be restored through a corresponding synthesis operation.
Specifically, in the process of acquiring the color information of the measured target with the color photosensitive chips 110, 110A, each pixel 1130, 1130A of the first photosensitive chip 11, 11A collects only monochromatic light information, such as R, G or B monochromatic light information. Therefore, in order to restore the color information of the measured target, each group of adjacent X × X pixels 1130, 1130A (X being a positive integer greater than or equal to 2) of the color photosensitive chips 110, 110A is set as an imaging unit, and the color information of the measured target is synthesized and restored from the monochromatic light information of the pixels 1130, 1130A in that imaging unit. Accordingly, it follows by calculation that if each group of X × X pixels 1130, 1130A of the color photosensitive chips 110, 110A forms an imaging unit, an (M-X+1) × (N-X+1) array of imaging units can be obtained, wherein each imaging unit includes the color information (RGB color information) of the measured target. Specifically, in the preferred embodiment of the present invention, every 4 adjacent (2 × 2) pixels 1130, 1130A of the color photosensitive chips 110, 110A are set as an imaging unit, and an (M-1) × (N-1) array of imaging units is obtained through the synthesis operation. For example, fig. 21 illustrates a 4 × 4 color photosensitive chip 110, 110A, including pixels Pixel 11, Pixel 12 … through Pixel 44; according to the above algorithm for forming imaging units, a 3 × 3 array of imaging units can be obtained, including imaging unit 11, imaging unit 12, imaging unit 13 … through imaging unit 33. Correspondingly, imaging unit 11 is composed of pixels Pixel 11, Pixel 12, Pixel 21 and Pixel 22 of the color photosensitive chips 110, 110A, imaging unit 12 is composed of Pixel 12, Pixel 13, Pixel 22 and Pixel 23, and the remaining imaging units of the array are obtained by analogy.
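The overlapping X × X grouping described above is a sliding window over the pixel array. A minimal sketch (0-based indices, so that imaging unit "12" of the 4 × 4 chip covers Pixel 12, Pixel 13, Pixel 22 and Pixel 23 as in the text):

```python
def sliding_imaging_units(m: int, n: int, x: int):
    """Top-left (row, col) index of each overlapping X-by-X imaging unit,
    giving the (M-X+1) x (N-X+1) array of units described in the text."""
    return [[(i, j) for j in range(n - x + 1)] for i in range(m - x + 1)]

units = sliding_imaging_units(4, 4, 2)          # the 4x4 chip, 2x2 units
assert len(units) == 3 and len(units[0]) == 3   # a 3x3 array of units
top, left = units[0][1]                         # imaging unit "12"
covered = [(top + di, left + dj) for di in (0, 1) for dj in (0, 1)]
assert covered == [(0, 1), (0, 2), (1, 1), (1, 2)]   # Pixels 12, 13, 22, 23
```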
For convenience of description, the color information of white light collected by a pixel 1130, 1130A is denoted EEE (in hexadecimal, where E represents 14 and the component order is R, G, B), the color information of red light is E00, that of green light is 0E0, and that of blue light is 00E. For example, the imaging unit 11 framed in fig. 22 includes a group of pixels Pixel 11, Pixel 12, Pixel 21 and Pixel 22, which are G11, R12, B21 and G22, respectively. When G11, R12, B21 and G22 receive their corresponding monochromatic lights, the color information of the imaging unit 11 is calculated as the average of the R, G and B components of G11, R12, B21 and G22, expressed by the formula: E00 + 00E + (0E0 + 0E0)/2 = EEE (R, G, B components in order). Further, when one of the pixels 1130, 1130A does not receive its corresponding monochromatic light information, for example when G11 or G22 receives no green information, the color information of that pixel 1130, 1130A is set to black information, denoted 000, and the color information of the imaging unit becomes: E00 + 00E + (000 + 0E0)/2 = E7E. The RGB color information of the other imaging units is acquired by analogy.
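The averaging rule above can be checked mechanically. In the sketch below each sample carries one channel letter and a 0–15 value, with 0 standing in for the "black information" of an unexposed pixel:

```python
def unit_color(samples):
    """Average the R, G and B components gathered by one imaging unit.
    Each sample is (channel, value) with value 0..15 as in the
    hexadecimal example above; an unexposed pixel contributes 0."""
    sums = {"R": [0, 0], "G": [0, 0], "B": [0, 0]}   # [total, count]
    for channel, value in samples:
        sums[channel][0] += value
        sums[channel][1] += 1
    return tuple(sums[c][0] // max(1, sums[c][1]) for c in ("R", "G", "B"))

# G11=E, R12=E, B21=E, G22=E  ->  EEE
assert unit_color([("G", 14), ("R", 14), ("B", 14), ("G", 14)]) == (14, 14, 14)
# One green pixel unexposed (000): green averages (E + 0)/2 = 7  ->  E7E
assert unit_color([("G", 14), ("R", 14), ("B", 14), ("G", 0)]) == (14, 7, 14)
```

Both results match the worked formulas in the text (EEE and E7E).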
It should be noted that, in other embodiments of the present invention, the color filter layer 112, 112A of the color photosensitive chip 110, 110A can be implemented in other forms, such as RGBW. It should be appreciated that when the color filter layer 112, 112A takes another form, the RGB color-information synthesis algorithm for each imaging unit does not change, but the resulting RGB color-information values of each imaging unit vary with the color filter layer 112, 112A. Therefore, the specific color-information synthesis algorithm of each imaging unit when the color filter layer 112, 112A of the color photosensitive chip 110, 110A is changed is not repeated here.
However, the construction of each imaging unit, i.e., the combination of pixels 1130, 1130A of the color photosensitive chips 110, 110A corresponding to each imaging unit, can be changed accordingly. For example, in another embodiment of the present invention, as shown in fig. 23, every 9 adjacent (3 × 3) pixels 1130, 1130A of the color photosensitive chips 110, 110A may be selectively set as an imaging unit, and an (M-2) × (N-2) array of imaging units is obtained through the corresponding operations, wherein each imaging unit includes RGB color information of the measured target. Accordingly, the RGB color information of each imaging unit is the average of the R, G, B color-component information collected by the pixels 1130, 1130A included in that imaging unit, wherein a pixel 1130, 1130A without light exposure is set as black information. For example, fig. 23 illustrates a 5 × 5 color photosensitive chip 110, 110A, including pixels Pixel 11, Pixel 12 … through Pixel 55; according to the above algorithm for forming imaging units, a 3 × 3 array of imaging units can be obtained, including imaging unit 11, imaging unit 12 … through imaging unit 33. Correspondingly, imaging unit 11 is composed of pixels Pixel 11, Pixel 12, Pixel 13, Pixel 21, Pixel 22, Pixel 23, Pixel 31, Pixel 32 and Pixel 33 of the color photosensitive chips 110, 110A, imaging unit 12 is composed of Pixel 12, Pixel 13, Pixel 14, Pixel 22, Pixel 23, Pixel 24, Pixel 32, Pixel 33 and Pixel 34, and the remaining imaging units of the array are obtained by analogy.
In addition, fig. 24 shows another modified embodiment of the present invention, wherein each imaging unit includes a group of adjacent X × X pixels 1130, 1130A (where X ≥ 2) of the color photosensitive chips 110, 110A, and the pixels 1130, 1130A included in different imaging units do not overlap with each other, so that an (M/X) × (N/X) array of imaging units is finally obtained, wherein each imaging unit carries RGB color information of the measured target. For example, fig. 24 illustrates a 4 × 4 color photosensitive chip 110, 110A, whose pixels are Pixel 11, Pixel 12, …, Pixel 44, and from which a 2 × 2 array of imaging units (the imaging unit 11, the imaging unit 12, the imaging unit 21 and the imaging unit 22) is obtained according to the above construction. Correspondingly, the imaging unit 11 is composed of pixels Pixel 11, Pixel 12, Pixel 21 and Pixel 22 of the color photosensitive chips 110, 110A; the imaging unit 12 is composed of pixels Pixel 13, Pixel 14, Pixel 23 and Pixel 24; and the remaining imaging units of the array are obtained by analogy.
It should be understood by those skilled in the art that in other embodiments of the present invention, the combination manner of the imaging units of the color photosensitive chips 110, 110A can be adjusted according to the actual application, which is not limited by the present invention.
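As a minimal sketch (in Python/NumPy, with illustrative function names not taken from the patent), the two imaging-unit constructions described above, the overlapping 3 × 3 sliding window of fig. 23 and the non-overlapping X × X blocks of fig. 24, can be expressed as:

```python
import numpy as np

def sliding_imaging_units(rgb, x=3):
    """Overlapping construction: every adjacent x*x group of pixels of an
    M*N color chip forms one imaging unit, giving (M-x+1)*(N-x+1) units.
    rgb has shape (M, N, 3); each unit's RGB is the mean of its pixels."""
    m, n, _ = rgb.shape
    out = np.empty((m - x + 1, n - x + 1, 3))
    for i in range(m - x + 1):
        for j in range(n - x + 1):
            out[i, j] = rgb[i:i + x, j:j + x].mean(axis=(0, 1))
    return out

def block_imaging_units(rgb, x=2):
    """Non-overlapping construction: disjoint x*x blocks give an
    (M/x)*(N/x) array of imaging units (M, N assumed divisible by x)."""
    m, n, _ = rgb.shape
    return rgb.reshape(m // x, x, n // x, x, 3).mean(axis=(1, 3))
```

For the 5 × 5 chip of fig. 23, `sliding_imaging_units` with x = 3 yields the 3 × 3 array of imaging units; for the 4 × 4 chip of fig. 24, `block_imaging_units` with x = 2 yields the 2 × 2 array.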
Accordingly, as shown in fig. 19, the step S3 further includes the steps of:
S31: obtaining the color information of the measured target from the first image information;
Further, as shown in fig. 19, the step S31 further includes the steps of:
S311: setting each group of adjacent X × X pixels 1130, 1130A (X being a positive integer greater than or equal to 2) of the color photosensitive chips 110, 110A as one imaging unit, so as to obtain an (M-X+1) × (N-X+1) array of imaging units, where M × N denotes the array pixels of the first photosensitive chip 11, 11A; and
S312: obtaining the RGB color information of each imaging unit, wherein the RGB color information of each imaging unit is the average of the R, G and B component information collected by the pixels 1130, 1130A included in that imaging unit.
Alternatively, the step S31 further includes the steps of:
S311A: setting each group of adjacent X × X pixels 1130, 1130A (X being a positive integer greater than or equal to 2) of the color photosensitive chips 110, 110A as one imaging unit, the pixels included in different imaging units not overlapping with each other, so as to obtain an (M/X) × (N/X) array of imaging units, where M × N denotes the array pixels of the first photosensitive chip 11, 11A; and
S312A: obtaining the color information of each imaging unit, wherein the color information of each imaging unit is the average of the R, G and B component information collected by the pixels 1130, 1130A included in that imaging unit.
It should be appreciated that after steps S311 and S312 or steps S311A and S312A are performed, the first image information carries the RGB color information of the measured target, in the form of the RGB color information of each imaging unit of the (M-X+1) × (N-X+1) array of imaging units or of the (M/X) × (N/X) array of imaging units.
Further, in the present invention, the second photosensitive chip 21, 21A of the second camera module 20, 20A is a black-and-white photosensitive chip 210, 210A. Those skilled in the art will appreciate that, put simply, the black-and-white photosensitive chip 210, 210A differs from the color photosensitive chip in that it is not provided with the color filter layer 112, 112A, so that the black-and-white photosensitive chip 210, 210A only collects brightness information (gray-scale information) of the measured target. Based on this imaging characteristic, the size of the unit pixels 2130, 2130A of the black-and-white photosensitive chips 210, 210A can be reduced relative to the size of the unit pixels of the color photosensitive chip while still ensuring imaging quality, so as to meet the size requirement of the array camera module.
Similarly, for convenience of describing and explaining the technical features of the subsequent operation steps of the step S3, the black-and-white photosensitive chip 210, 210A is set to have a Q × P array of pixels, wherein Q ≥ M and P ≥ N. That is, in the preferred embodiment of the present invention, the total number of pixels of the black-and-white photosensitive chip 210, 210A is not lower than the total number of pixels of the color photosensitive chips 110, 110A. Accordingly, the second image information includes the gray-scale information (luminance information) collected by each pixel 2130, 2130A of the Q × P array pixels.
As described above, in the preferred embodiment of the present invention, the field angle FOV1 of the first camera module is larger than the field angle FOV2 of the second camera module, and therefore, from the viewpoint of the imaged content, the first image information includes the imaged content of the second image information. Therefore, in the step of performing the subsequent image fusion, the overlapped region in the first image information and the second image information needs to be identified and extracted for the image fusion.
Accordingly, in the preferred embodiment of the present invention, as shown in fig. 18, the step S3 further includes the steps of:
S32: identifying and extracting the region of the first image information that overlaps with the second image information, wherein the first image information includes the RGB color information of each imaging unit of the (M-X+1) × (N-X+1) array of imaging units or of the (M/X) × (N/X) array of imaging units, and the second image information includes the gray-scale information (luminance information) collected by each pixel 2130, 2130A of the Q × P array pixels.
Those skilled in the art will appreciate that the overlapping region of the first image information and the second image information can be identified and extracted by calibrating the overlapping orientation, fixing the interpolation ratio, or comparing the feature points. Of course, in the present invention, the algorithm for identifying and extracting the region of the first image information overlapping with the second image information may be implemented as any other suitable algorithm, and is not limited by the present invention.
For convenience of subsequent description and understanding, if the ratio of the area of the region of the first image information overlapping with the second image information to the first image information is set as α, the region of the first image information overlapping with the second image information can be represented as: the RGB color information of each imaging unit of an α × (M-X+1) × (N-X+1) array of imaging units or an α × (M/X) × (N/X) array of imaging units.
Since the color information of the region of the first image information overlapping with the second image information is in RGB format, and, as those skilled in the art recognize, color information in RGB format represents the brightness information, chromaticity information, saturation information, etc. of the measured target in a mixed manner, the RGB-format color information of that region needs to be converted into YUV-format color information before it can be fused with the second image information, where Y represents luminance information and UV represents chrominance information.
More specifically, in the present invention, the luminance information of each imaging unit may be set to Y and the chrominance information of each imaging unit to UV, and the conversion formulas may be expressed as:
Y=aR+bG+cB+d;…………………(1)
U=eR+fG+gB+h;…………………(2)
V=iR+jG+kB+l;…………………(3)
The conversion formulas satisfy condition (1): a + b + c = 1, and condition (2): the three parameter vectors (a, b, c), (e, f, g), (i, j, k) are linearly independent of each other.
Based on the above condition (1) and condition (2), preferably, the following conversion formulas may be obtained:
Y=0.299*R+0.587*G+0.114*B;…………………(4)
U=-0.169*R-0.331*G+0.5*B+128;…………………(5)
V=0.5*R-0.419*G-0.081*B+128;…………………(6)
Accordingly, according to the above formulas, the chrominance information of each imaging unit in the region of the first image information overlapping with the second image information can be converted as UVij = (Uij, Vij) = (-0.169 × Rij - 0.331 × Gij + 0.5 × Bij + 128, 0.5 × Rij - 0.419 × Gij - 0.081 × Bij + 128), where UVij represents the chrominance information of the ij-th imaging unit of the array, Uij represents its U chrominance information and Vij represents its V chrominance information; by analogy, the RGB color information of each imaging unit in the region of the first image information overlapping with the second image information can be converted into color information in YUV format.
Note that, at this time, the region of the first image information overlapping with the second image information may be expressed as: the YUV color information of each imaging unit of an α × (M-X+1) × (N-X+1) array of imaging units or an α × (M/X) × (N/X) array of imaging units.
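As a sketch of this conversion (Python/NumPy; function and variable names are illustrative, not from the patent), the preferred formulas (4)-(6) can be applied vectorially to the RGB color information of every imaging unit at once, and the two conditions on the coefficients can be checked numerically:

```python
import numpy as np

# Rows give the Y, U, V coefficients of formulas (4)-(6).
A = np.array([[0.299, 0.587, 0.114],
              [-0.169, -0.331, 0.5],
              [0.5, -0.419, -0.081]])
OFFSET = np.array([0.0, 128.0, 128.0])  # U and V carry a +128 offset

# Condition (1): a + b + c = 1; condition (2): the three coefficient
# vectors are linearly independent (non-zero determinant).
assert abs(A[0].sum() - 1.0) < 1e-9
assert abs(np.linalg.det(A)) > 1e-6

def rgb_to_yuv(units_rgb):
    """Convert an array of imaging units of shape (..., 3) from RGB to YUV."""
    return np.asarray(units_rgb, dtype=float) @ A.T + OFFSET
```

A neutral gray unit with R = G = B = 100 maps to (Y, U, V) = (100, 128, 128), since condition (1) makes Y reproduce the gray level and the U, V coefficient rows each sum to zero.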
Accordingly, in the present invention, as shown in fig. 18, the step S3 further includes the steps of:
S33: fusing the color information of the region of the first image information overlapping with the second image information with the second image to obtain a first fused image.
Accordingly, as shown in fig. 20, the step S33 further includes the steps of:
S331: converting the RGB color information of each imaging unit in the region of the first image information overlapping with the second image information into color information in YUV format, so that the image information of that region is converted into: the YUV color information of each imaging unit of an α × (M-X+1) × (N-X+1) array of imaging units or an α × (M/X) × (N/X) array of imaging units.
Further, in the preferred embodiment of the present invention, the second photosensitive chip 21, 21A of the second camera module 20, 20A is implemented as a black-and-white photosensitive chip, and the second image information includes the gray-scale information (brightness information) collected by each pixel 2130, 2130A of the Q × P array pixels. It should be noted that the Q × P array pixels of the second image information are more numerous than the imaging units of the region of the first image information overlapping with the second image information, i.e., the α × (M-X+1) × (N-X+1) or α × (M/X) × (N/X) array of imaging units. Therefore, in the process of fusing the color information of that region into the second image, an interpolation operation at a certain ratio is required. For ease of understanding, in the present invention, the interpolation ratio is set to β, where β = Q × P / [α × (M-X+1) × (N-X+1)] or β = Q × P / [α × (M/X) × (N/X)].
Accordingly, the step S33 further includes the steps of:
S332: interpolating the UV chrominance information of each imaging unit in the region of the first image information overlapping with the second image information with the luminance information Y of each pixel 2130, 2130A of the second image information according to the interpolation ratio β.
It is worth noting that when β is calculated to be an integer value, the Q × P array pixels included in the second image information match exactly the α × (M-X+1) × (N-X+1) or α × (M/X) × (N/X) array of imaging units of the region of the first image information overlapping with the second image information, with no partially covered pixels between them; in that case the density of the unit pixels 2130, 2130A of the Q × P array is an integral multiple of the density of the imaging units of that region. Conversely, when β is calculated to be a non-integer value, some array pixels of the Q × P array straddle the boundaries between adjacent imaging units of the overlapping region. Preferably, such straddling pixels are interpolated by taking a weighted average over the imaging units they cover, weighted by the proportion of the pixel falling within each imaging unit.
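The interpolation of step S332 can be sketched as follows (Python/NumPy; names are illustrative), restricted to the integer-β case in which the Q × P pixel grid is an exact per-axis multiple of the imaging-unit grid; the non-integer case would add the weighted averaging just described:

```python
import numpy as np

def fuse_uv_with_y(units_yuv, y_pixels):
    """Combine the UV chrominance of the overlap-region imaging units
    with the per-pixel Y luminance of the black-and-white chip.
    units_yuv: (m, n, 3) YUV values per imaging unit; y_pixels: (q, p)
    gray levels, with q, p integer multiples of m, n respectively."""
    m, n, _ = units_yuv.shape
    q, p = y_pixels.shape
    ry, rx = q // m, p // n
    # Replicate each unit's UV over the ry*rx pixels it covers.
    uv = np.repeat(np.repeat(units_yuv[..., 1:], ry, axis=0), rx, axis=1)
    fused = np.empty((q, p, 3))
    fused[..., 0] = y_pixels   # Y taken from the monochrome pixels
    fused[..., 1:] = uv        # UV taken from the imaging units
    return fused
```

The result corresponds to the image obtained in step S332: Q × P pixels whose luminance comes from the second image information and whose chrominance comes from the first.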
For ease of understanding, the image obtained in step S332 is referred to as a first fused image, wherein the first fused image includes, for each pixel 2130, 2130A of the Q × P array pixels, the UV chrominance information provided by the imaging units of the first image information and the Y luminance information provided by the pixels of the second image information. Further, the first fused image with YUV-format color information may optionally be converted into a first fused image in RGB format.
Accordingly, according to the formula
Y=0.299*R+0.587*G+0.114*B;…………………(4)
U=-0.169*R-0.331*G+0.5*B+128;…………………(5) and
V=0.5*R-0.419*G-0.081*B+128;…………………(6)
taking the inverses of formulas (4) to (6) yields:
R=Y+1.4075*(V-128);
G=Y-0.3455*(U-128)-0.7169*(V-128); and
B=Y+1.779*(U-128);
Accordingly, the first fused image having color information in YUV format may be converted into first fused image information in RGB format by the above YUV-to-RGB conversion formulas.
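A sketch of this inverse conversion (Python/NumPy; names are illustrative), applying the YUV-to-RGB formulas above elementwise:

```python
import numpy as np

def yuv_to_rgb(yuv):
    """Invert formulas (4)-(6): recover R, G, B from Y and the
    128-offset U, V chrominance of the first fused image."""
    yuv = np.asarray(yuv, dtype=float)
    y = yuv[..., 0]
    u = yuv[..., 1] - 128.0
    v = yuv[..., 2] - 128.0
    r = y + 1.4075 * v
    g = y - 0.3455 * u - 0.7169 * v
    b = y + 1.779 * u
    return np.stack([r, g, b], axis=-1)
```

Because the three forward coefficient vectors are linearly independent, this inverse exists; a neutral pixel (Y, U, V) = (100, 128, 128) recovers R = G = B = 100.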
Further, in some embodiments of the present invention, as shown in fig. 18, the step S3 further includes the steps of:
S34: fusing the first fused image having RGB-format color information into the overlapping region of the first image information and the second image information to obtain a second fused image, wherein the second fused image is the third image information.
Specifically, as shown in fig. 25, the third image information (the second fused image) fuses the luminance information (gray-scale information) of the second image information, so that the overlapping region of the first image information and the second image information has a relatively high detail-expression capability while the other regions of the first image information retain the original image features; in this way, imaging functions similar to "optical zoom" and "background blurring" are realized from the viewpoint of image effect.
It should be noted that when the first camera module 10, 10A and the second camera module 20, 20A are actually used to collect the first image information and the second image information of the target, the shooting distance between the array camera module and the target, i.e. the object distance, is limited, and the imaged content of the second image information may not fall completely within the first image information. In particular, as shown in fig. 26, when the distance between the measured object and the array camera module is short (macro shooting), because of the certain distance between the first camera module 10, 10A and the second camera module 20, 20A, the second image information is not completely contained in the first image information, that is, a part of the second image information falls outside the first image information.
In this case, the steps S32, S33 and S34 are adjusted accordingly to adapt to the change of the overlapping region between the first image information and the second image information. For convenience of description and understanding, if the ratio of the area of the region of the first image information overlapping with the second image information to the first image information is set to α', that region can be represented as an α' × (M-X+1) × (N-X+1) array of imaging units or an α' × (M/X) × (N/X) array of imaging units, each imaging unit having RGB color information. Accordingly, if the ratio of the area of the region of the second image information overlapping with the first image information to the second image information is set to γ, that region can be represented as the gray-scale information (luminance information) collected by each pixel 2130, 2130A of γ × (Q × P) array pixels.
Further, the number γ × (Q × P) of array pixels included in the region of the second image information overlapping with the first image information differs from the number α' × (M-X+1) × (N-X+1) or α' × (M/X) × (N/X) of imaging units in the region of the first image information overlapping with the second image information; usually, the former is larger than the latter. Therefore, in the process of fusing the region of the first image information overlapping with the second image information into the region of the second image overlapping with the first image, an interpolation operation is performed at a certain ratio, wherein, for convenience of understanding and explanation, the interpolation ratio is set to β, where β = γ × (Q × P) / [α' × (M-X+1) × (N-X+1)] or β = γ × (Q × P) / [α' × (M/X) × (N/X)].
That is, in this case, the area of the overlapping region of the first image information and the second image information changes, so that the interpolation ratio causing the fusion between the overlapping regions of the first image information and the second image information is adjusted accordingly.
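A purely numerical illustration of the adjusted ratio (all figures hypothetical, chosen only to show the arithmetic, not taken from the patent):

```python
# Hypothetical macro-shot figures:
Q, P = 4000, 3000        # black-and-white chip array pixels
M, N, X = 4000, 3000, 2  # color chip array pixels and block size X
alpha_prime = 0.25       # share of the first image overlapping the second
gamma = 0.81             # share of the second image overlapping the first

units = alpha_prime * (M / X) * (N / X)  # imaging units in the overlap
pixels = gamma * Q * P                   # monochrome pixels in the overlap
beta = pixels / units                    # interpolation ratio, here ~12.96
```

With these figures β is non-integer, so the weighted-average interpolation of the straddling pixels described below would apply.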
Accordingly, in this case, as shown in fig. 27, the step S32 is adjusted to:
S32': identifying and extracting the region of the first image information overlapping with the second image information and the region of the second image information overlapping with the first image information.
Accordingly, the step S33 is adjusted to:
S33': fusing the color information of the region of the first image information overlapping with the second image information into the region of the second image overlapping with the first image.
Further, as shown in fig. 28, the step S33' further includes the steps of:
S331': converting the RGB color information of each imaging unit in the region of the first image information overlapping with the second image information into color information in YUV format, so that the image information of that region is converted into: the YUV color information of each imaging unit of an α' × (M-X+1) × (N-X+1) array of imaging units or an α' × (M/X) × (N/X) array of imaging units; and
S332': interpolating the UV chrominance information of each imaging unit in the region of the first image information overlapping with the second image information with the luminance information Y of each pixel 2130, 2130A in the region of the second image information overlapping with the first image information, according to the interpolation ratio β.
More specifically, in step S332', when β is calculated to be an integer value, the γ × (Q × P) array pixels included in the region of the second image information overlapping with the first image information match exactly the α' × (M-X+1) × (N-X+1) or α' × (M/X) × (N/X) array of imaging units of the region of the first image information overlapping with the second image information, with no partially covered pixels between them; in other words, the density of the γ × (Q × P) array pixels is an integral multiple of the density of the imaging units of that region. Conversely, when β is calculated to be a non-integer value, some of the γ × (Q × P) array pixels straddle the boundaries between adjacent imaging units of the overlapping region; preferably, such straddling pixels are interpolated by taking a weighted average over the imaging units they cover, weighted by the proportion of the pixel falling within each imaging unit.
Accordingly, the step S34 is adjusted to:
S34': fusing the first fused image having RGB-format color information into the position of the overlapping region of the first image information and the second image information to obtain a second fused image, wherein the second fused image is the third image information.
It should be noted that, as shown in fig. 29, the third image information fuses the luminance information in the second image information so that the overlapping area of the first image information and the second image information has a relatively high detail representation capability, thereby achieving visual effects similar to "optical zoom" and "background blurring".
It is to be noted that, as described above, in the present invention, the image processing method may select to output the first fused image or the second fused image (the third image information); that is, the image processing method may output the fused image information of the overlapping region between the first image information and the second image information, or output the second fused image information (the third image information) in which the first fused image information is fused at the position of the overlapping region, neither choice being intended to limit the scope of the present invention.
As shown in fig. 30, the present invention further provides an electronic device 80, wherein the electronic device 80 includes an electronic device body 81 and an array camera module 82. The array camera module 82 is assembled to the electronic device body 81 and provides an image capturing function for the electronic device 80. It should be appreciated that the array camera module 82 provided by the invention has a dual-camera zoom function, so that the electronic device 80 has distinctive imaging performance and the visual experience of the user is improved.
In particular, in the specific embodiment of the electronic apparatus 80 provided by the present invention, the array camera module 82 can be assembled on the front side of the electronic apparatus body 81, that is, the array camera module 82 is a front camera module of the electronic apparatus 80, as shown in fig. 30. Alternatively, the array camera module 82 may be assembled on the rear side of the electronic device body 81, that is, the array camera module 82 is a rear camera module of the electronic device 80, as shown in fig. 31. Of course, in another embodiment of the present invention, the array camera module 82 may be assembled at other positions of the electronic device body 81, which is not limited by the present invention.
It can thus be seen that the objects of the present invention are fully and effectively attained. The embodiments illustrated and described above fully disclose the functional and structural principles of the present invention, and the present invention is not limited by changes based on the principles of these embodiments. Accordingly, the present invention includes all modifications encompassed within the spirit and scope of the following claims.

Claims (10)

1. An array camera module, characterized in that it comprises:
a first camera module;
a second camera module; and
a circuit board, wherein the first camera module and the second camera module are electrically connected to the circuit board respectively, wherein the first camera module comprises a first photosensitive chip and a first optical lens corresponding to the first photosensitive chip, and the second camera module comprises a second photosensitive chip and a second optical lens corresponding to the second photosensitive chip, wherein an equivalent focal length f1 of the first optical lens is smaller than an equivalent focal length f2 of the second optical lens, and the field angle FOV1 of the first optical lens is greater than the field angle FOV2 of the second optical lens, wherein the first photosensitive chip is a color photosensitive chip and the second photosensitive chip is a black-and-white photosensitive chip, and wherein the second photosensitive chip is smaller in size than the first photosensitive chip, so that the second optical lens can be mounted closer to the top side of the second photosensitive chip.
2. The array camera module of claim 1, wherein the unit pixel area of the black-and-white photosensitive chip is set smaller than the unit pixel area of the color photosensitive chip.
3. The array camera module of claim 2, wherein the unit pixel size of the black-and-white photosensitive chip is set to be less than 0.1 μm.
4. The array camera module of claim 3, wherein the equivalent focal length f2 of the second optical lens corresponding to the black-and-white photosensitive chip exceeds twice the equivalent focal length f1 of the first optical lens corresponding to the color photosensitive chip.
5. The array camera module of any of claims 1-4, wherein the circuit board comprises a first circuit board and a second circuit board, wherein the first camera module is communicably mounted to the first circuit board and the second camera module is communicably mounted to the second circuit board.
6. The array camera module of claim 5, further comprising a bracket, wherein the bracket is fixed to the outer peripheries of the first camera module and the second camera module for positioning the first camera module and the second camera module.
7. The array camera module of any of claims 1-4, wherein the first camera module further comprises a first base, the first base is mounted to the circuit board and forms a first light passing hole, the first light passing hole corresponds to the first photo-sensing chip, and the first optical lens is mounted on the top side of the first base and is retained in the photo-sensing path of the first photo-sensing chip; the second camera module also comprises a second base, the second base is arranged on the circuit board, a second light through hole is formed, the second light through hole corresponds to the second photosensitive chip, and the second optical lens is arranged on the top side of the second base and is kept in a photosensitive path of the second photosensitive chip.
8. The array camera module of claim 7, wherein the first base integrally extends from the second base.
9. The array camera module of any of claims 1-4, wherein the second optical lens comprises at least one optical lens, wherein the optical lens at the top side of the second optical lens is a glass lens.
10. The array camera module of any of claims 1-4, wherein the first camera module further comprises a first filter element, wherein the first filter element is mounted between the first optical lens and the first photo-sensing chip and is maintained in the photo-sensing path of the first photo-sensing chip, and wherein the second camera module further comprises a second filter element, wherein the second filter element is mounted between the second optical lens and the second photo-sensing chip and is maintained in the photo-sensing path of the second photo-sensing chip.
CN202210566600.9A 2018-05-08 2018-05-08 Array camera module, electronic equipment with array camera module and image processing method Active CN114928688B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210566600.9A CN114928688B (en) 2018-05-08 2018-05-08 Array camera module, electronic equipment with array camera module and image processing method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210566600.9A CN114928688B (en) 2018-05-08 2018-05-08 Array camera module, electronic equipment with array camera module and image processing method
CN201810429600.8A CN110460747B (en) 2018-05-08 2018-05-08 Image processing method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201810429600.8A Division CN110460747B (en) 2018-05-08 2018-05-08 Image processing method

Publications (2)

Publication Number Publication Date
CN114928688A true CN114928688A (en) 2022-08-19
CN114928688B CN114928688B (en) 2024-06-11

Family

ID=68472086

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201810429600.8A Active CN110460747B (en) 2018-05-08 2018-05-08 Image processing method
CN202210566600.9A Active CN114928688B (en) 2018-05-08 2018-05-08 Array camera module, electronic equipment with array camera module and image processing method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201810429600.8A Active CN110460747B (en) 2018-05-08 2018-05-08 Image processing method

Country Status (1)

Country Link
CN (2) CN110460747B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112991188B (en) * 2019-12-02 2023-06-27 RealMe重庆移动通信有限公司 Image processing method and device, storage medium and electronic equipment
CN111081741B (en) * 2019-12-06 2022-04-29 武汉华星光电半导体显示技术有限公司 Display device
CN114125195A (en) * 2020-08-31 2022-03-01 宁波舜宇光电信息有限公司 Integrated double-zoom periscopic imaging system, camera module and electronic equipment
TWI771811B (en) * 2020-09-18 2022-07-21 大立光電股份有限公司 Electronic device
CN115118945B (en) * 2022-06-27 2023-07-14 南昌华勤电子科技有限公司 Color correction method and device for camera module

Citations (5)

Publication number Priority date Publication date Assignee Title
US20150078678A1 (en) * 2013-09-18 2015-03-19 Blackberry Limited Using narrow field of view monochrome camera for producing a zoomed image
US20170186163A1 (en) * 2015-12-24 2017-06-29 Samsung Electro-Mechanics Co., Ltd. Image sensor and camera module
CN107295225A (en) * 2016-04-12 2017-10-24 三星电机株式会社 Camera model
CN107645624A (en) * 2016-07-20 2018-01-30 宁波舜宇光电信息有限公司 Camera module and its antidote and optical anti-vibration method and the electronic equipment with camera module
WO2018028585A1 (en) * 2016-08-10 2018-02-15 宁波舜宇光电信息有限公司 Multi-camera module with diaphragms of different sizes, and application thereof

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
WO2001017236A1 (en) * 1999-09-01 2001-03-08 Florida Atlantic University Color video camera system and method
JP2009246620A (en) * 2008-03-31 2009-10-22 Brother Ind Ltd Image data generating device
CN103546730A (en) * 2012-07-11 2014-01-29 北京博雅华录视听技术研究院有限公司 Method for enhancing light sensitivities of images on basis of multiple cameras
JP2015197745A (en) * 2014-03-31 2015-11-09 キヤノン株式会社 Image processing apparatus, imaging apparatus, image processing method, and program
CN106226976B (en) * 2016-09-22 2019-09-03 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Dual-camera image capturing method, system and terminal
CN106500852B (en) * 2016-09-28 2018-11-13 North Night Vision Technology Co., Ltd. System and method for infrared and visible light image registration and fusion
CN106960414B (en) * 2016-12-12 2020-01-03 天津大学 Method for generating high-resolution HDR image from multi-view LDR image

Also Published As

Publication number Publication date
CN110460747A (en) 2019-11-15
CN114928688B (en) 2024-06-11
CN110460747B (en) 2022-10-14

Similar Documents

Publication Publication Date Title
CN110460747B (en) Image processing method
US7453510B2 (en) Imaging device
JP6122184B2 (en) Image capture and processing using monolithic camera arrays with different types of imagers
US7783185B2 (en) Image sensor, imaging device and imaging method
US20050128509A1 (en) Image creating method and imaging device
US6801719B1 (en) Camera using beam splitter with micro-lens image amplification
EP4161061A1 (en) Image acquisition method, camera assembly, and mobile terminal
US9581436B2 (en) Image processing device, image capturing apparatus, and image processing method
WO2021000592A1 (en) Image capturing device and method
CN108712608A (en) Image capturing method and device for terminal device
CN107820066A (en) Low-luminance color video camera
CN112019734B (en) Image acquisition method and device, electronic equipment and computer readable storage medium
CN208489908U (en) Array camera module and electronic equipment with array camera module
CN113132597B (en) Image acquisition system and terminal
CN110460783B (en) Array camera module, image processing system, image processing method and electronic equipment
US9743007B2 (en) Lens module array, image sensing device and fusing method for digital zoomed images
KR20060053147A (en) Imaging apparatus and imaging method
CN114650359A (en) Camera module and electronic equipment
WO2022073364A1 (en) Image obtaining method and apparatus, terminal, and computer readable storage medium
CN214381044U (en) Electronic device
WO2021149503A1 (en) Electronic device
CN111258166B (en) Camera module, periscopic camera module, image acquisition method and working method
CN112104796B (en) Image processing method and device, electronic equipment and computer readable storage medium
KR101739736B1 (en) Camera module and method for controlling thereof
CN115589517A (en) Image sensor, camera module, electronic equipment and camera shooting method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant