CN111093018A - Imaging module and terminal - Google Patents

Imaging module and terminal

Info

Publication number
CN111093018A
Authority
CN
China
Prior art keywords
area
imaging
image output
image
photosensitive
Prior art date
Legal status
Granted
Application number
CN201911379607.4A
Other languages
Chinese (zh)
Other versions
CN111093018B (en)
Inventor
徐青
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201911379607.4A
Publication of CN111093018A
Application granted
Publication of CN111093018B
Status: Active

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 - Constructional details
    • H04N23/54 - Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M1/00 - Substation equipment, e.g. for use by subscribers
    • H04M1/02 - Constructional features of telephone sets
    • H04M1/0202 - Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026 - Details of the structure or mounting of specific components
    • H04M1/0264 - Details of the structure or mounting of specific components for a camera module assembly
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 - Constructional details
    • H04N23/55 - Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57 - Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices

Abstract

The application discloses an imaging module. The imaging module includes a lens group and an image sensor. The image sensor includes a photosensitive area, and the photosensitive area includes an image output area. The image output area is a part of the photosensitive area, the area of the image output area is smaller than that of the photosensitive area, and the image output area is configured to receive light to output an image. The projection range of the light passing through the lens group on the image sensor is an imaging area; the image output area is located within the imaging area, and at least part of the photosensitive area is located outside the imaging area. The application also discloses a terminal. Because the imaging area covers the image output area, an image can be output from the signal values of the image output area, and the imaging area does not need to cover the entire photosensitive area. The lens group can therefore be made small while still meeting the requirement of outputting images through the image output area, which facilitates miniaturization of the imaging module and its installation and layout in the terminal.

Description

Imaging module and terminal
Technical Field
The application relates to the technical field of consumer electronics, in particular to an imaging module and a terminal.
Background
A camera can be installed in an electronic device such as a mobile phone for imaging. The overall width of the camera is determined by the larger of the width of the lens group and the width of the image sensor. Generally, with current manufacturing processes, when the imaging range of the lens group on the image sensor must cover the entire photosensitive area of the image sensor, a larger lens group is often required. This increases the size of the camera and is not conducive to miniaturizing the camera or to installing and laying out the camera in the electronic device.
Disclosure of Invention
The embodiment of the application provides an imaging module and a terminal.
The imaging module comprises a lens group and an image sensor, wherein the image sensor comprises a photosensitive area, the photosensitive area comprises an image output area, the image output area is a part of the photosensitive area, the area of the image output area is smaller than that of the photosensitive area, and the image output area is used for receiving light to output an image; the projection range of the light rays passing through the lens group on the image sensor is an imaging area, the image output area is located in the imaging area, and at least part of the photosensitive area is located outside the imaging area.
In some embodiments, the lens group and the image sensor have equal length dimensions in a first direction perpendicular to an optical axis of the lens group.
In some embodiments, the imaging module further includes a prism, the prism is configured to change a propagation direction of light, the light whose propagation direction is changed by the prism enters the lens group, and a length dimension of the prism in the first direction is smaller than or equal to a length dimension of the lens group in the first direction.
In some embodiments, a length dimension of a second direction of the image sensor is greater than a length dimension of a first direction of the image sensor, the second direction being perpendicular to the first direction, the second direction being perpendicular to the optical axis.
In some embodiments, the image output region is rectangular, the image output region being inscribed at the edges of the imaging region.
In some embodiments, the imaging module further includes a micro-actuator connected to the image sensor and/or the lens assembly, and the micro-actuator is configured to drive the image sensor and/or the lens assembly to move in a direction perpendicular to the optical axis.
The terminal of the embodiments of the present application includes a housing and the imaging module of any of the above embodiments. The imaging module is mounted on the housing, and the thickness direction of the housing is perpendicular to the optical axis of the lens group.
In some embodiments, the terminal further comprises a processor configured to: acquiring a shot image according to light received by the image output area; detecting a dead pixel located at the edge of the shot image, wherein the dead pixel corresponds to a dead-pixel photosensitive unit located at the edge of the image output area; and reading a signal value of a compensation photosensitive unit to replace the signal value of the dead-pixel photosensitive unit, wherein the compensation photosensitive unit is located outside the image output area and within the imaging area, and the compensation photosensitive unit is adjacent to the dead-pixel photosensitive unit.
In some embodiments, the terminal further comprises a processor configured to: acquiring an aspect ratio according to user input; selecting a rectangular area in the imaging area that matches the aspect ratio as the image output area; and acquiring an image according to the signal value of the rectangular area.
In some embodiments, the photosensitive area is rectangular, the long side of the image output area is parallel to the long side of the photosensitive area, and the wide side of the image output area is parallel to the wide side of the photosensitive area.
In some embodiments, the photosensitive region further comprises an auxiliary imaging region, the auxiliary imaging region being located outside the image output region, the auxiliary imaging region being located within the imaging region; the terminal further comprises a processor configured to: and receiving signal values of the image output area and the auxiliary imaging area to acquire an image.
In the imaging module and the terminal of the embodiments of the present application, the imaging area covers the image output area, an image can be output from the signal values of the image output area, and the imaging area does not need to cover the entire photosensitive area. The lens group can therefore be made small while still meeting the requirement of outputting images through the image output area, which facilitates miniaturization of the imaging module and its installation and layout in the terminal.
Additional aspects and advantages of embodiments of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of embodiments of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic plan view of a terminal according to an embodiment of the present application from one angle;
FIG. 2 is a schematic plan view of the terminal according to an embodiment of the present application from another angle;
FIG. 3 is a schematic structural diagram of an imaging module according to an embodiment of the present application;
FIG. 4 and FIG. 5 are schematic structural views of a photosensitive area and an imaging area according to an embodiment of the present application;
FIG. 6a is a schematic view of a scene photographed by the terminal according to an embodiment of the present application;
FIG. 6b and FIG. 6c are schematic views of images captured by the terminal according to an embodiment of the present application;
FIG. 7a is a schematic view of an image captured by the terminal according to an embodiment of the present application;
FIG. 7b is a schematic diagram of a portion of a photosensitive area and an imaging area according to an embodiment of the present application;
FIG. 8 is a schematic structural diagram of an imaging module according to an embodiment of the present application;
FIG. 9 is a schematic structural view of a photosensitive area and an imaging area according to an embodiment of the present application;
FIG. 10 is a schematic structural diagram of an imaging module according to an embodiment of the present application.
Description of the main element symbols:
the terminal 100, the imaging module 10, the lens group 11, the imaging area 111, the image sensor 12, the photosensitive area 121, the image output area 122, the auxiliary imaging area 123, the dead-pixel photosensitive unit 124, the compensation photosensitive unit 125, the prism 13, the micro-driver 14, the housing 20, the front surface 21, the back surface 22, the display screen 30, the main camera 40, the sub-camera 50, the front camera 60, the processor 70, the optical axis Z, the first direction Y, and the second direction X.
Detailed Description
Embodiments of the present application will be further described below with reference to the accompanying drawings. The same or similar reference numbers in the drawings identify the same or similar elements or elements having the same or similar functionality throughout.
In addition, the embodiments of the present application described below in conjunction with the accompanying drawings are exemplary and are only for the purpose of explaining the embodiments of the present application, and are not to be construed as limiting the present application.
In this application, unless expressly stated or limited otherwise, a first feature being "on" or "under" a second feature may mean that the first and second features are in direct contact, or that they are in indirect contact through an intermediate medium. Moreover, a first feature being "on," "over," or "above" a second feature may mean that the first feature is directly or obliquely above the second feature, or may simply mean that the first feature is at a higher level than the second feature. A first feature being "under," "below," or "beneath" a second feature may mean that the first feature is directly or obliquely below the second feature, or may simply mean that the first feature is at a lower level than the second feature.
Referring to fig. 1 and 2, a terminal 100 according to an embodiment of the present application includes a housing 20 and an imaging module 10. Referring to fig. 3 and 4, the imaging module 10 includes a lens group 11 and an image sensor 12. The image sensor 12 includes a photosensitive area 121, and the photosensitive area 121 includes an image output area 122. The image output area 122 is a part of the photosensitive area 121, the area of the image output area 122 is smaller than that of the photosensitive area 121, and the image output area 122 is configured to receive light to output an image. The projection range of the light passing through the lens group 11 on the image sensor 12 is an imaging area 111. The image output area 122 is located within the imaging area 111, and at least part of the photosensitive area 121 is located outside the imaging area 111. The imaging module 10 is mounted on the housing 20, and the thickness direction of the housing 20 is perpendicular to the optical axis Z of the lens group 11.
In the imaging module 10 and the terminal 100 of the embodiments of the present application, the imaging area 111 covers the image output area 122, an image can be output from the signal values of the image output area 122, and the imaging area 111 does not need to cover the entire photosensitive area 121. The lens group 11 can therefore be made small while still meeting the requirement of outputting images through the image output area 122, which facilitates miniaturization of the imaging module 10 and its installation and layout in the terminal 100.
Specifically, referring to fig. 1 and fig. 2, the terminal 100 may be a mobile phone, a tablet computer, a game machine, a smart watch, a head-mounted display device, etc., and the present application takes the terminal 100 as a mobile phone for illustrative purposes, and it is understood that the specific type of the terminal 100 is not limited to a mobile phone. In the example shown in fig. 1 and 2, the terminal 100 further includes elements such as a display 30, a main camera 40, a sub-camera 50, a front camera 60, and a processor 70, but the terminal 100 may also include functional elements such as a battery and a fingerprint recognition module, which is not limited herein.
The housing 20 may serve as the outer shell of the terminal 100. Functional elements such as the imaging module 10, the display screen 30, the main camera 40, the sub-camera 50, the front camera 60, and the processor 70 can be mounted on the housing 20, and the housing 20 can protect these functional elements against dropping, dust, water, and the like. The housing 20 may be made of a non-metal material such as plastic or glass, of a metal material such as an aluminum alloy, or of a combination of metal and non-metal materials. The housing 20 may include a front surface 21 and a back surface 22 opposite to each other; the front surface 21 may be a plane or a curved surface, and the back surface 22 may also be a plane or a curved surface.
The display screen 30 can be mounted on the front surface 21 of the housing 20. In one example, the display screen 30 can cover substantially the entire front surface 21 to give the terminal 100 a high screen-to-body ratio. The display screen 30 can be a liquid crystal display, an OLED display, a micro-LED display, or the like. The main camera 40 and the sub-camera 50 can be exposed from the back surface 22; the main camera 40 can be a telephoto camera and the sub-camera 50 a wide-angle camera. According to different shooting requirements, a user can switch between the main camera 40 and the sub-camera 50, and the two cameras can also work together for imaging. The front camera 60 can receive light entering the housing 20 from the front surface 21 for imaging, for example for self-portraits or video calls. The front camera 60 may be disposed under the display screen 30 or located in a slot formed in the display screen 30, which is not limited herein.
The imaging module 10 is mounted on the housing 20. Specifically, the imaging module 10 may be exposed from the front surface 21 as a front imaging module, exposed from the back surface 22 as a rear imaging module, or disposed below the display screen 30 as an under-screen imaging module. The imaging module 10 may also be disposed on a movable module; the movable module can extend or rotate out of the housing 20 when receiving a trigger instruction from the user, and can be hidden in the housing 20 when no trigger instruction is received. In the embodiments of the present application, the imaging module 10 is exposed from the back surface 22 for illustrative purposes. The imaging module 10 can be used in cooperation with the main camera 40 and the sub-camera 50 to meet different shooting requirements of the user, for example focusing on target objects at different distances. In one example, the imaging module 10 images with visible light; in another example, it images with infrared light. The imaging module 10 may be a fixed-focus imaging module or a zoom imaging module, which is not limited herein.
Referring to fig. 3 and 4, the imaging module 10 includes a lens assembly 11 and an image sensor 12. Light (shown by a dotted line in fig. 3) passes through the lens assembly 11 and reaches the image sensor 12, the image sensor 12 obtains a signal value according to the received light, and the processor 70 can obtain a captured image according to the signal value.
The lens group 11 may include one or more lenses and a lens barrel for accommodating the lenses; for example, the lens group 11 may include one or more convex lenses and one or more concave lenses, which is not limited herein. The lenses may be fixed in the lens barrel, in which case the imaging module 10 is a fixed-focus imaging module. The lenses may also be movable in the lens barrel, for example along the optical axis Z of the lens group 11 to achieve zooming, in which case the imaging module 10 is a zoom imaging module; the lens barrel may then further be provided with driving elements such as a driving motor, a coil, and balls for driving the lenses to move.
The image sensor 12 may be a CCD image sensor or a CMOS image sensor. The image sensor 12 includes a photosensitive member and a package; the photosensitive member is packaged in the package, and a connection circuit can be laid in the package and electrically connected with the photosensitive member to transmit the electrical signals generated by the photosensitive member. The photosensitive member includes a plurality of photosensitive units, which may be photodiodes. Each photosensitive unit converts the received optical signal into an electrical signal (that is, a signal value). The number of photosensitive units may be large, for example forty million, and the photosensitive units may be arranged in an array.
The image sensor 12 includes a photosensitive area 121, which may be the area in which all of the photosensitive units are arranged. The photosensitive area 121 includes an image output area 122 (shown as the shaded area in fig. 4). The image output area 122 is a part of the photosensitive area 121, the area of the image output area 122 is smaller than that of the photosensitive area 121, and the image output area 122 is configured to receive light to output an image. For example, the photosensitive area 121 may be formed by forty million photosensitive units while the image output area 122 is formed by thirty-two million photosensitive units, so light does not need to reach the entire photosensitive area 121; an image can be output normally as long as light reaches the image output area 122. The center of the image output area 122 may or may not coincide with the center of the photosensitive area 121. It is understood that, in addition to the image output area 122, the photosensitive area 121 includes an area outside the image output area 122, part of which can be irradiated with light and part of which cannot. In one example, the processor 70 may output an image using only the electrical signals generated by the image output area 122; in another example, the processor 70 may output an image using the electrical signals generated by the image output area 122 together with those of part of the area outside the image output area 122.
In one example, when the image sensor 12 is used in the imaging module 10 of the embodiments of the present application, not all of the photosensitive area 121 is used for outputting an image, but the whole photosensitive area 121 retains the ability to convert received optical signals into electrical signals. The image sensor 12 can therefore also be used in other imaging modules in which the entire photosensitive area 121 is used for outputting an image, giving the image sensor 12 good versatility.
The projection range of the light passing through the lens group 11 on the image sensor 12 is the imaging area 111. In the example of fig. 4, the imaging area 111 is the area enclosed by the dashed outline, and the imaging area 111 as a whole may be circular; in other examples, the imaging area 111 may have any shape, such as rectangular or elliptical. The imaging area 111 covers part of the photosensitive area 121, so that this part of the photosensitive area 121 can receive the light passing through the lens group 11 and convert it into electrical signals, while the rest of the photosensitive area 121 cannot. The image output area 122 is located within the imaging area 111; the imaging area 111 only needs to cover the image output area 122 to output an image normally, and does not need to cover the entire photosensitive area 121, so the lens group 11 can be made small. In one example, the length of the lens group 11 and the length of the image sensor 12 in the first direction Y are equal, so the length of the imaging module 10 in the first direction Y is not enlarged by an oversized lens group 11. The first direction Y is perpendicular to the optical axis Z of the lens group 11; the length of the lens group 11 in the first direction Y is the length of the outer contour of the assembled lenses, lens barrel, and driving element in the first direction Y, and the length of the image sensor 12 in the first direction Y is the length of the outer contour of the assembled photosensitive member and package in the first direction Y.
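As an illustration of the sizing relationship above, the minimal sketch below uses hypothetical dimensions (the numbers and function name are assumptions, not values from the patent): the smallest circular imaging area 111 that fully covers a rectangle has a diameter equal to that rectangle's diagonal, so covering only the image output area 122 allows a noticeably smaller image circle, and hence a smaller lens group 11, than covering the whole photosensitive area 121.

```python
import math

def required_image_circle_diameter(width_mm: float, height_mm: float) -> float:
    """Smallest circular imaging area that fully covers a rectangular region:
    its diameter equals the rectangle's diagonal."""
    return math.hypot(width_mm, height_mm)

# Hypothetical dimensions chosen only for illustration (not taken from the patent).
sensor_w, sensor_h = 8.0, 4.0   # full photosensitive area 121, in mm
output_w, output_h = 5.3, 4.0   # image output area 122, in mm

d_full = required_image_circle_diameter(sensor_w, sensor_h)      # ~8.94 mm
d_output = required_image_circle_diameter(output_w, output_h)    # ~6.64 mm
print(f"image circle covering the whole photosensitive area: {d_full:.2f} mm")
print(f"image circle covering only the image output area:   {d_output:.2f} mm")
```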
In the embodiments of the present application, when the imaging module 10 is mounted on the housing 20, the thickness direction of the housing 20 is parallel to the first direction Y. Since the dimension of the imaging module 10 in the first direction Y is reduced, the thickness of the housing 20 is also easier to reduce after the imaging module 10 is mounted, which helps make the terminal 100 thinner and lighter.
In summary, in the imaging module 10 and the terminal 100 of the embodiments of the present application, the imaging area 111 covers the image output area 122, an image can be output from the signal values of the image output area 122, and the imaging area 111 does not need to cover the entire photosensitive area 121. The lengths of the lens group 11 and the image sensor 12 in the first direction Y can therefore be made equal, reducing the overall width of the imaging module 10, which facilitates miniaturization of the imaging module 10 and its installation and layout in the terminal 100.
Referring to fig. 3, in some embodiments the imaging module 10 further includes a prism 13 for changing the propagation direction of light. The light whose propagation direction is changed by the prism 13 enters the lens group 11, and the length dimension of the prism 13 in the first direction Y is smaller than or equal to the length dimension of the lens group 11 in the first direction Y. The imaging module 10 can thus be a periscope imaging module: the prism 13 changes the propagation direction of light so that the light incident surface of the imaging module 10 is perpendicular to the image sensor 12. Although the length of the imaging module 10 along the optical axis Z is large, this does not affect the thickness of the terminal 100, which helps keep the terminal 100 thin. Because the length dimension of the prism 13 in the first direction Y is less than or equal to that of the lens group 11, adding the prism 13 does not increase the overall dimension of the imaging module 10 in the first direction Y. Specifically, the prism 13 may include a reflective surface, and light entering the imaging module 10 from the outside changes its propagation direction after being reflected by this surface; in one example, the angle between the reflective surface and the optical axis Z may be forty-five degrees. In one example, the projection of the lens group 11 on a plane perpendicular to the optical axis Z lies within the projection of the prism 13 on that plane, so that the light reaching the lens group 11 is light reflected by the prism 13.
Referring to fig. 3 to 5, in some embodiments the length dimension of the image sensor 12 in the second direction X is greater than its length dimension in the first direction Y, where the second direction X is perpendicular to the first direction Y and perpendicular to the optical axis Z. Making the image sensor 12 longer in the second direction X than in the first direction Y allows the overall photosensitive area of the image sensor 12 to be larger without increasing the dimension of the imaging module 10 in the first direction Y, that is, without increasing the thickness of the terminal 100. In one example, the overall shape of the image sensor 12 may be a rectangular parallelepiped, and the photosensitive area 121 may be a rectangle whose long side extends along the second direction X and whose wide side extends along the first direction Y. The image output area 122 may also be rectangular and inscribed at the edge of the imaging area 111. A rectangular image output area 122 outputs a rectangular image, which better matches the rectangular display screen 30, and inscribing it in the imaging area 111 makes maximum use of the area of the imaging area 111, improving the clarity of the captured image.
Referring to fig. 1, in some embodiments, the processor 70 is configured to: acquiring an aspect ratio according to user input; selecting a rectangular area in the imaging area 111 according to the aspect ratio as an image output area 122; and acquiring an image according to the signal value of the rectangular area.
In this way, captured images with different aspect ratios can be output according to the aspect ratio input by the user, reflecting the user's personalized choices when shooting and making shooting more engaging. Referring to fig. 6a, fig. 6b and fig. 6c, the scene shown in fig. 6a includes feature A, feature B and feature C; when photographing this scene, a single captured image may not be able to include feature A, feature B and feature C at the same time. The user can then enter an aspect ratio to choose the proportions of the captured image; for example, the user can type in a value for the aspect ratio, or drag a rectangular selection box to pick a suitable aspect ratio, thereby selecting the combination of features the user wants to capture. In one example, if the user wants the captured image P1 shown in fig. 6b, which captures feature A and feature C, one aspect ratio can be selected; the processor 70 selects a rectangular area corresponding to that aspect ratio in the imaging area 111 as the image output area 122 (for example, the image output area 122 shown in fig. 4) and acquires an image from the signal values of the selected rectangular area. In another example, if the user wants the captured image P2 shown in fig. 6c, which captures feature A and feature B, another aspect ratio can be selected; the processor 70 selects a rectangular area corresponding to that aspect ratio in the imaging area 111 as the image output area 122 (for example, the image output area 122 shown in fig. 5) and acquires an image from the signal values of the selected rectangular area. Of course, in other examples the user can select other aspect ratios to capture images with different aspect ratios, which is not limited herein.
When the processor 70 selects the rectangular area, the long side of the rectangular area may be parallel to the long side of the photosensitive area 121, and the wide side of the rectangular area may be parallel to the wide side of the photosensitive area 121.
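A minimal sketch of this selection flow, assuming the signal values are available as a 2D array, the crop is centered on the imaging area 111, and its sides are parallel to the sides of the photosensitive area 121; the array sizes, centering policy, and function names are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def select_output_region(signal: np.ndarray, circle_diam_px: int,
                         aspect_w: int, aspect_h: int) -> np.ndarray:
    """Pick the largest aspect_w:aspect_h rectangle that fits inside the circular
    imaging area (diameter circle_diam_px, assumed centered on the sensor array),
    with sides parallel to the photosensitive area, and return only that
    rectangle's signal values."""
    diag = (aspect_w ** 2 + aspect_h ** 2) ** 0.5
    out_w = int(circle_diam_px * aspect_w / diag)
    out_h = int(circle_diam_px * aspect_h / diag)
    cy, cx = signal.shape[0] // 2, signal.shape[1] // 2   # assumed imaging-area center
    top, left = cy - out_h // 2, cx - out_w // 2
    return signal[top:top + out_h, left:left + out_w]

# Hypothetical readout of a 3000 x 6000 array of photosensitive units.
raw = np.random.randint(0, 1024, size=(3000, 6000), dtype=np.uint16)
image_16_9 = select_output_region(raw, circle_diam_px=2800, aspect_w=16, aspect_h=9)
print(image_16_9.shape)   # (1372, 2440)
```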
Referring to fig. 1, 7a and 7b, in some embodiments, the processor 70 is configured to: acquiring a captured image according to light received by the image output area 122; detecting a dead pixel 200 located at the edge of the captured image, the dead pixel 200 corresponding to a dead-pixel photosensitive unit 124 located at the edge of the image output area 122; and reading the signal value of a compensation photosensitive unit 125 to replace the signal value of the dead-pixel photosensitive unit 124, wherein the compensation photosensitive unit 125 is located outside the image output area 122 and inside the imaging area 111, and is adjacent to the dead-pixel photosensitive unit 124.
In this way, the signal value of the dead-pixel photosensitive unit 124 is replaced with the signal value of the adjacent compensation photosensitive unit 125, so the dead pixel 200 located at the edge of the captured image can be removed. Because the compensation photosensitive unit 125 is adjacent to the dead-pixel photosensitive unit 124, the captured image still reflects the current scene accurately after the dead pixel 200 is replaced.
Specifically, in the example shown in fig. 7a and 7b, the processor 70 can detect a dead pixel 200 located at the edge of the captured image P3. The dead pixel 200 may be an overexposed pixel, or a pixel whose brightness is significantly lower or significantly higher than that of the surrounding pixels; it may be caused by local damage to the image sensor 12 or by overexposure. The photosensitive unit in the image output area 122 corresponding to the dead pixel 200 is the dead-pixel photosensitive unit 124, which is located at the edge of the image output area 122 (shown by the thick solid line in fig. 7b). The compensation photosensitive unit 125 is located outside the image output area 122 but inside the imaging area 111, so it can also receive light and generate an electrical signal. Because the compensation photosensitive unit 125 is adjacent to the dead-pixel photosensitive unit 124, the electrical signal it generates can replace the electrical signal generated by the dead-pixel photosensitive unit 124, thereby replacing the dead pixel 200 and keeping the captured image complete.
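A minimal sketch of this edge dead-pixel compensation, assuming the signal values form a 2D array and the imaging area 111 is described by a membership test passed in as a callable; the outlier threshold, the neighbor order, and all names are illustrative assumptions, not the patent's method:

```python
import numpy as np

def compensate_edge_dead_pixels(signal: np.ndarray,
                                output_box: tuple[int, int, int, int],
                                in_imaging_area,
                                threshold: float = 4.0) -> np.ndarray:
    """Replace the signal value of a dead-pixel photosensitive unit on the edge of
    the image output area with the value of an adjacent compensation unit that lies
    outside the output area but inside the imaging area."""
    top, left, bottom, right = output_box   # image output area 122, half-open bounds
    out = signal.copy()
    region = signal[top:bottom, left:right].astype(np.float64)
    mean, std = region.mean(), region.std()
    # Walk the four edges of the image output area.
    edge = [(top, c) for c in range(left, right)] + [(bottom - 1, c) for c in range(left, right)]
    edge += [(r, left) for r in range(top, bottom)] + [(r, right - 1) for r in range(top, bottom)]
    for r, c in edge:
        if abs(float(signal[r, c]) - mean) <= threshold * std:
            continue                                            # crude dead-pixel test (an assumption)
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):       # adjacent candidate units
            nr, nc = r + dr, c + dc
            if not (0 <= nr < signal.shape[0] and 0 <= nc < signal.shape[1]):
                continue
            inside_output = top <= nr < bottom and left <= nc < right
            if not inside_output and in_imaging_area(nr, nc):
                out[r, c] = signal[nr, nc]   # compensation unit 125 replaces unit 124
                break
    return out
```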
Referring to fig. 1 and 4, in some embodiments the photosensitive area 121 further includes an auxiliary imaging area 123, which is located outside the image output area 122 and inside the imaging area 111. The processor 70 is configured to receive the signal values of the image output area 122 and the auxiliary imaging area 123 to acquire an image.
The auxiliary imaging area 123 is located in the imaging area 111, so the auxiliary imaging area 123 can also receive light and generate signal values, that is, it can receive light and convert it into electrical signals. If desired, the processor 70 can combine the signal values of the image output area 122 and the auxiliary imaging area 123 to obtain an image whose shape differs from that of the image output area 122. For example, the processor 70 can stitch a rectangular image obtained from the signal values of the image output area 122 with an irregularly shaped image obtained from the signal values of the auxiliary imaging area 123. The processor 70 may output an image from the signal values of the image output area 122 and all of the auxiliary imaging area 123, or from the signal values of the image output area 122 and part of the auxiliary imaging area 123, which is not limited herein. By using the signal values of the auxiliary imaging area 123 for auxiliary imaging, images of more shapes can be acquired, satisfying more diverse user needs.
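A minimal sketch of combining the two regions, under the assumption that the imaging area 111 is circular and the signal values form a 2D array; the zero fill outside the imaging area and all names are illustrative assumptions:

```python
import numpy as np

def extended_image(signal: np.ndarray,
                   output_box: tuple[int, int, int, int],
                   circle_center: tuple[int, int],
                   circle_radius: int) -> np.ndarray:
    """Stitch the rectangular image from the image output area 122 with the
    irregularly shaped image from the auxiliary imaging area 123: keep every
    photosensitive unit inside the circular imaging area 111, zero the rest."""
    rows, cols = np.ogrid[:signal.shape[0], :signal.shape[1]]
    in_circle = (rows - circle_center[0]) ** 2 + (cols - circle_center[1]) ** 2 <= circle_radius ** 2
    top, left, bottom, right = output_box
    in_output = np.zeros(signal.shape, dtype=bool)
    in_output[top:bottom, left:right] = True
    auxiliary = in_circle & ~in_output        # auxiliary imaging area 123
    return np.where(in_output | auxiliary, signal, 0)
```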
Referring to fig. 8, in some embodiments, the imaging module 10 further includes a micro driver 14, the micro driver 14 is connected to the image sensor 12, and the micro driver 14 is configured to drive the image sensor 12 to move along a direction perpendicular to the optical axis Z.
By driving the image sensor 12 to move in a direction perpendicular to the optical axis Z with the micro-driver 14, the relative position of the imaging area 111 and the photosensitive area 121 can be changed, and so can the position of the image output area 122. Therefore, when some photosensitive units of the photosensitive area 121 are damaged, the image sensor 12 can be moved so that the image output area 122 avoids the damaged photosensitive units. Referring to fig. 4 and 9: suppose the center of the imaging area 111 in fig. 4 substantially coincides with the center of the photosensitive area 121, and some photosensitive units near the left edge of the image output area 122 shown in fig. 4 are damaged. The image sensor 12 can then be moved to the state shown in fig. 9, so that the imaging area 111 and the image output area 122 move to the right relative to the photosensitive area 121. The image output area 122 in fig. 9 avoids the damaged photosensitive units near the left edge of the image output area 122 in fig. 4 and can still output a complete captured image, without having to replace the image sensor 12.
Specifically, the micro-driver 14 may be made of an electrostrictive material or may be a Micro-Electro-Mechanical System (MEMS) device.
In addition, referring to fig. 10, in some embodiments the micro-driver 14 may instead be connected to the lens group 11 and drive the lens group 11 to move in a direction perpendicular to the optical axis Z; driving the lens group 11 likewise changes the relative position of the imaging area 111 and the photosensitive area 121 and the position of the image output area 122, which is not repeated here. In some embodiments, the micro-driver 14 may drive the lens group 11, the image sensor 12, or both to move, which is not limited herein.
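A minimal sketch of choosing such a shift, assuming the damaged photosensitive units are known by their coordinates and that only a shift along the second direction X is considered; the search strategy, step size, and names are illustrative assumptions, not the patent's method:

```python
def find_relative_shift(damaged: set[tuple[int, int]],
                        output_box: tuple[int, int, int, int],
                        sensor_width: int, max_shift: int):
    """Search for a shift of the image output area 122 relative to the photosensitive
    area 121 (produced by moving the image sensor 12 or the lens group 11 along the
    second direction X) that avoids every damaged photosensitive unit while keeping
    the output area inside the photosensitive area. Returns None if no shift works."""
    top, left, bottom, right = output_box
    for shift in sorted(range(-max_shift, max_shift + 1), key=abs):   # prefer small moves
        new_left, new_right = left + shift, right + shift
        if new_left < 0 or new_right > sensor_width:
            continue
        hit = any(top <= r < bottom and new_left <= c < new_right for r, c in damaged)
        if not hit:
            return shift
    return None

# Hypothetical example: one damaged unit near the left edge of the original output area.
print(find_relative_shift(damaged={(1500, 1790)},
                          output_box=(800, 1780, 2200, 4220),
                          sensor_width=6000, max_shift=400))   # -> 11
```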
In the description herein, reference to the description of the terms "certain embodiments," "one embodiment," "some embodiments," "illustrative embodiments," "examples," "specific examples," or "some examples" means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "a plurality" means at least two, e.g., two or three, unless specifically limited otherwise.
Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations of the above embodiments may be made by those of ordinary skill in the art within the scope of the present application, which is defined by the claims and their equivalents.

Claims (11)

1. An imaging module, comprising:
a lens group; and
an image sensor, the image sensor comprising a photosensitive area, the photosensitive area comprising an image output area, wherein the image output area is a part of the photosensitive area, the area of the image output area is smaller than that of the photosensitive area, and the image output area is configured to receive light to output an image;
wherein the projection range of the light passing through the lens group on the image sensor is an imaging area, the image output area is located in the imaging area, and at least part of the photosensitive area is located outside the imaging area.
2. The imaging module of claim 1, wherein the lens group and the image sensor have equal length dimensions in a first direction perpendicular to an optical axis of the lens group.
3. The imaging module of claim 2, further comprising a prism for redirecting light, wherein the light redirected by the prism enters the lens assembly, and wherein a length dimension of the prism in the first direction is less than or equal to a length dimension of the lens assembly in the first direction.
4. The imaging module of claim 2, wherein a length dimension of the image sensor in a second direction is greater than a length dimension of the image sensor in the first direction, the second direction being perpendicular to the optical axis.
5. The imaging module of claim 1, wherein the image output region is rectangular and the image output region is inscribed at an edge of the imaging region.
6. The imaging module of any of claims 1 to 5, further comprising a micro-actuator coupled to the image sensor and/or the lens assembly, the micro-actuator configured to move the image sensor and/or the lens assembly in a direction perpendicular to the optical axis.
7. A terminal, comprising:
a housing; and
the imaging module of any of claims 1 to 6, mounted on the housing, the thickness direction of the housing being perpendicular to the optical axis of the lens group.
8. The terminal of claim 7, further comprising a processor configured to:
acquiring a shot image according to light received by the image output area;
detecting a dead pixel located at the edge of the shot image, wherein the dead pixel corresponds to a dead-pixel photosensitive unit located at the edge of the image output area; and
reading a signal value of a compensation photosensitive unit to replace the signal value of the dead-pixel photosensitive unit, wherein the compensation photosensitive unit is located outside the image output area and within the imaging area, and the compensation photosensitive unit is adjacent to the dead-pixel photosensitive unit.
9. The terminal of claim 7, further comprising a processor configured to:
acquiring an aspect ratio according to user input;
selecting a rectangular area in the imaging area that matches the aspect ratio as the image output area; and
acquiring an image according to the signal value of the rectangular area.
10. The terminal of claim 9, wherein the photosensitive area is rectangular, the long side of the image output area is parallel to the long side of the photosensitive area, and the wide side of the image output area is parallel to the wide side of the photosensitive area.
11. The terminal of claim 7, wherein the photosensitive area further comprises an auxiliary imaging area, the auxiliary imaging area being located outside the image output area, the auxiliary imaging area being located within the imaging area; the terminal further comprises a processor configured to:
receiving signal values of the image output area and the auxiliary imaging area to acquire an image.
CN201911379607.4A 2019-12-27 2019-12-27 Imaging module and terminal Active CN111093018B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911379607.4A CN111093018B (en) 2019-12-27 2019-12-27 Imaging module and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911379607.4A CN111093018B (en) 2019-12-27 2019-12-27 Imaging module and terminal

Publications (2)

Publication Number Publication Date
CN111093018A true CN111093018A (en) 2020-05-01
CN111093018B CN111093018B (en) 2021-04-30

Family

ID=70396819

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911379607.4A Active CN111093018B (en) 2019-12-27 2019-12-27 Imaging module and terminal

Country Status (1)

Country Link
CN (1) CN111093018B (en)

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN2657029Y (en) * 2003-11-01 2004-11-17 鸿富锦精密工业(深圳)有限公司 Ultrathin digital camera lens module
CN101165586A (en) * 2006-10-17 2008-04-23 三星Techwin株式会社 Dual lens optical system and dual lens camera having the same
CN102547080A (en) * 2010-12-31 2012-07-04 联想(北京)有限公司 Image pick-up module and information processing equipment comprising same
CN102842592A (en) * 2012-08-29 2012-12-26 格科微电子(上海)有限公司 CMOS (Complementary Metal Oxide Semiconductor) image sensor module and manufacturing method thereof
CN105229787A (en) * 2014-04-15 2016-01-06 索尼公司 Focus detection device and electronic equipment
US20160044250A1 (en) * 2014-08-10 2016-02-11 Corephotonics Ltd. Zoom dual-aperture camera with folded lens
CN104820271A (en) * 2014-11-12 2015-08-05 台湾东电化股份有限公司 Thin lens module
CN105160298A (en) * 2015-07-31 2015-12-16 瑞声光电科技(常州)有限公司 Iris recognition system and iris recognition method
CN205643995U (en) * 2016-04-25 2016-10-12 浙江大华技术股份有限公司 Image capture device
CN106713710A (en) * 2016-11-29 2017-05-24 深圳众思科技有限公司 Camera
CN108427166A (en) * 2017-02-15 2018-08-21 三星电子株式会社 Camera model and double camera module
CN209345242U (en) * 2018-11-22 2019-09-03 纮华电子科技(上海)有限公司 Image acquiring module and portable electronic device
CN209072595U (en) * 2018-11-29 2019-07-05 厦门地平线征程智能科技有限公司 Lens module and electronic equipment including it
CN110324540A (en) * 2019-06-10 2019-10-11 芯盟科技有限公司 The forming method and electronic equipment of a kind of imaging sensor, imaging sensor

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114518642A (en) * 2020-11-20 2022-05-20 余姚舜宇智能光学技术有限公司 Linear TOF camera module, manufacturing method thereof and electronic equipment
CN114518642B (en) * 2020-11-20 2024-03-22 余姚舜宇智能光学技术有限公司 Linear TOF camera module, manufacturing method thereof and electronic equipment
CN113099092A (en) * 2021-04-09 2021-07-09 凌云光技术股份有限公司 Design method of high-performance imaging system
CN113099092B (en) * 2021-04-09 2022-07-08 凌云光技术股份有限公司 Design method of high-performance imaging system
CN115022491A (en) * 2021-09-26 2022-09-06 荣耀终端有限公司 Camera module and electronic equipment
WO2023045679A1 (en) * 2021-09-26 2023-03-30 荣耀终端有限公司 Camera module and electronic device
CN115022491B (en) * 2021-09-26 2023-10-10 荣耀终端有限公司 Camera module and electronic equipment

Also Published As

Publication number Publication date
CN111093018B (en) 2021-04-30

Similar Documents

Publication Publication Date Title
US10830928B2 (en) Optical lens assembly, imaging lens module and electronic apparatus
CN111093018B (en) Imaging module and terminal
CN110888216B (en) Optical lens, lens module and terminal
CN210572980U (en) Camera module and electronic device
CN102192724B (en) Distance measurement and photometry device, and imaging apparatus
US20070052833A1 (en) Variable magnification optical system and image-taking apparatus
CN111835953B (en) Camera module and electronic equipment
US7355154B2 (en) Image sensing apparatus with movable light flux splitter and control method thereof
CN117310960A (en) Light turning element for camera module, camera module and electronic device
CN111491085B (en) Image sensor, imaging device, and electronic apparatus
US20220373715A1 (en) Plastic light-folding element, imaging lens assembly module and electronic device
US20240103345A1 (en) Image capturing unit, camera module and electronic device
CN110753145A (en) Mobile terminal
CN111147724B (en) Image sensor, imaging module and terminal
CN114859538B (en) Plastic light turning element, imaging lens module and electronic device
CN215526212U (en) Camera module and electronic device
CN214675328U (en) Camera module and electronic equipment
CN210807353U (en) Mobile terminal
US20190253590A1 (en) Camera Module
CN219392360U (en) Imaging lens, camera module and electronic device
CN220671792U (en) Light path turning element, camera module and electronic device
CN213783355U (en) Camera shooting assembly and terminal
CN220730514U (en) Imaging lens module, camera module and electronic device
CN210781025U (en) Camera module and electronic equipment
KR20110099983A (en) Compact lens optical system and digital camera module comprising the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant