CN115278007A - Image acquisition method, terminal and readable storage medium - Google Patents

Image acquisition method, terminal and readable storage medium

Info

Publication number
CN115278007A
Authority
CN
China
Prior art keywords
image
camera
lens
terminal
macro
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210814363.3A
Other languages
Chinese (zh)
Inventor
王文涛
韦怡
陈嘉伟
李响
于盼
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2022-07-11
Filing date
2022-07-11
Publication date
2022-11-01
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202210814363.3A
Publication of CN115278007A
Legal status: Pending


Abstract

The application discloses an image acquisition method, a terminal and a non-volatile computer-readable storage medium. The image acquisition method comprises the following steps: acquiring the distance between an object and a camera of a terminal; determining a shooting mode of the terminal according to the distance; under the condition that the terminal is in a macro mode, the camera executes macro imaging to output a macro image; and under the condition that the terminal is in a microscopic mode, the camera performs microscopic imaging to output an original image, and the processor performs image restoration processing on the original image to output a microscopic image. According to the image acquisition method, the terminal and the non-volatile computer-readable storage medium, the distance between the object and the camera is acquired to control the shooting mode of the camera, so that the terminal can provide both the macro shooting function and the microscopic shooting function with a single camera, which reduces production cost and saves internal space in the terminal.

Description

Image acquisition method, terminal and readable storage medium
Technical Field
The present application relates to the field of imaging technologies, and in particular, to an image acquisition method, a terminal, and a non-volatile computer-readable storage medium.
Background
In recent years, with the development of terminals such as mobile phones, tablet computers and notebook computers, users expect terminals to provide more and more functions. For example, a user may expect a terminal to provide both a microscopic shooting function and a macro shooting function. However, realizing both on a terminal conventionally requires at least two lens modules, one for macro shooting and one for microscopic shooting, which increases cost and occupies more space inside the terminal.
Disclosure of Invention
Embodiments of the present application provide an image acquisition method, a terminal, and a non-volatile computer-readable storage medium, which at least address the problem of implementing both a macro shooting function and a microscopic shooting function on the same device.
The image acquisition method of the embodiment of the application comprises the following steps: acquiring the distance between an object and a camera of a terminal; determining a shooting mode of the terminal according to the distance; under the condition that the terminal is in a macro mode, the camera executes macro imaging to output a macro image; and under the condition that the terminal is in a microscopic mode, the camera performs microscopic imaging to output an original image, and the processor performs image restoration processing on the original image to output a microscopic image.
The terminal of the embodiment of the application comprises a body, a camera and a processor. The camera is arranged on the body; the processor is used for acquiring the distance between an object and the camera and determining the shooting mode of the terminal according to the distance; under the condition that the terminal is in a macro mode, the camera executes macro imaging to output a macro image; under the condition that the terminal is in a microscopic mode, the camera performs microscopic imaging to output an original image; and the processor is also used for executing image restoration processing on the original image to output a microscopic image.
The non-volatile computer-readable storage medium of the embodiments of the present application stores a computer program that, when executed by one or more processors, implements an image acquisition method as follows: acquiring the distance between an object and a camera of a terminal; determining a shooting mode of the terminal according to the distance; under the condition that the terminal is in a macro mode, the camera executes macro imaging to output a macro image; and under the condition that the terminal is in a microscopic mode, the camera performs microscopic imaging to output an original image, and the processor performs image restoration processing on the original image to output a microscopic image.
According to the image acquisition method, the terminal and the non-volatile computer-readable storage medium, when it is detected that the distance between the object and the camera satisfies the macro mode, the camera is controlled to execute macro imaging to output a macro image. When it is detected that the distance between the object and the camera satisfies the microscopic mode, the camera is controlled to perform microscopic imaging to output an original image, and the processor performs image restoration processing on the original image to output a microscopic image. Two lenses are therefore not needed: a single camera module provides both the macro shooting function and the microscopic shooting function, which reduces production cost and saves internal space in the terminal.
Additional aspects and advantages of embodiments of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of embodiments of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic flow chart of an image acquisition method according to some embodiments of the present application;
FIG. 2 is a block diagram of a terminal according to some embodiments of the present application;
FIG. 3 is a schematic flow chart diagram of an image acquisition method according to some embodiments of the present application;
FIG. 4 is a schematic diagram illustrating the principle of obtaining the distance between the object and the camera in the image obtaining method according to some embodiments of the present disclosure;
FIG. 5 is a schematic flow chart diagram of an image acquisition method according to some embodiments of the present application;
FIG. 6 is a schematic structural diagram of a camera in an image acquisition method according to some embodiments of the present application;
FIG. 7 is a schematic flow chart diagram of an image acquisition method according to some embodiments of the present application;
FIG. 8 is a schematic structural diagram of a camera in an image acquisition method according to some embodiments of the present disclosure;
FIG. 9 is a schematic structural diagram of a camera in an image acquisition method according to some embodiments of the present disclosure;
FIG. 10 is a schematic view of a camera in an image acquisition method according to some embodiments of the present application;
FIG. 11 is a schematic flow chart diagram of an image acquisition method according to some embodiments of the present application;
FIG. 12 is a schematic flow chart diagram of an image acquisition method according to some embodiments of the present application;
FIG. 13 is a schematic structural diagram of a camera in an image acquisition method according to some embodiments of the present disclosure;
FIG. 14 is a schematic structural diagram of a phase plate in an image acquisition method according to some embodiments of the present application;
FIG. 15 is a schematic flow chart diagram of an image acquisition method according to some embodiments of the present application;
FIG. 16 is a schematic view of a scene of an image acquisition method according to some embodiments of the present application;
FIG. 17 is a schematic illustration of the acquisition of a microscopic image in an image acquisition method according to certain embodiments of the present application;
FIG. 18 is a schematic representation of the connection state of a non-volatile computer readable storage medium and a processor of some embodiments of the present application.
Detailed Description
Reference will now be made in detail to the embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar functions throughout. The embodiments described below by referring to the drawings are exemplary only for the purpose of explaining the embodiments of the present application, and are not to be construed as limiting the embodiments of the present application.
In recent years, with the development of terminals, users expect terminals to provide more and more functions. For example, a user may expect a terminal to provide both a microscopic shooting function and a macro shooting function. However, implementing both functions on a terminal conventionally requires at least two lens modules, which increases cost and occupies more space inside the terminal. To solve this problem, the present application provides an image acquisition method, a terminal 100 (shown in fig. 2) and a non-volatile computer-readable storage medium 200 (shown in fig. 18).
Referring to fig. 1 and fig. 2, an image obtaining method according to an embodiment of the present disclosure includes:
01: acquiring a distance between an object and the camera 10 of the terminal 100;
03: determining a photographing mode of the terminal 100 according to the distance;
05: in the case where the terminal 100 is in the macro mode, the camera 10 performs macro imaging to output a macro image; and
07: in the case where the terminal 100 is in the microscopic mode, the camera 10 performs microscopic imaging to output an original image, and the processor 30 performs image restoration processing on the original image to output a microscopic image.
Referring to fig. 2 and fig. 6, the terminal 100 includes a camera 10 and one or more processors 30. The one or more processors 30 are used to perform the image acquisition methods in 01, 03, 05, and 07. That is, the one or more processors 30 are configured to acquire the distance between the object 14 and the camera 10, and determine the shooting mode of the terminal 100 according to the distance; in the case where the terminal 100 is in the macro mode, the camera 10 performs macro imaging to output a macro image; in the case where the terminal 100 is in the microscopic mode, the camera 10 performs microscopic imaging to output an original image; and the processor 30 is further configured to perform image restoration processing on the original image to output a microscopic image.
In some embodiments, the terminal 100 may be a mobile phone, a tablet computer, a notebook computer, a personal computer, a smart watch, an automobile, an unmanned aerial vehicle, a robot, or other devices having a photographing function. The embodiment of the present application is described by taking an example in which the terminal 100 is a mobile phone, and it should be noted that the specific form of the terminal 100 is not limited to the mobile phone.
In some embodiments, the terminal 100 may further include a body 20, and the camera 10 is disposed on the body 20. It should be noted that the camera 10 may be disposed on the back of the terminal 100, or at other positions: for example, inside the terminal 100, popping up when the user needs to shoot; or below the screen of the terminal 100, in which case light passes through the screen and is incident on the camera 10 for shooting.
Specifically, after acquiring the distance between the object 14 and the camera 10, the one or more processors 30 determine the shooting mode of the camera 10 according to that distance and switch modes accordingly. For example, when the distance between the object 14 and the camera 10 is 5 mm to 30 mm, the one or more processors 30 control the camera 10 to perform macro imaging to output a macro image. When the distance between the object 14 and the camera 10 is 0 to 5 mm, the one or more processors 30 control the camera 10 to perform microscopic imaging to output an original image, and then perform image restoration processing on the original image to output a microscopic image.
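By way of illustration only (not part of the patent disclosure), the mode-selection step can be sketched as follows. The 0-5 mm and 5-30 mm thresholds come from the example above; the Python names and the fallback mode for larger distances are assumptions.

```python
from enum import Enum

class ShootingMode(Enum):
    MICROSCOPIC = "microscopic"  # 0-5 mm: raw image plus restoration
    MACRO = "macro"              # 5-30 mm: direct macro imaging
    NORMAL = "normal"            # assumed fallback for larger distances

def select_shooting_mode(distance_mm: float) -> ShootingMode:
    """Map the measured object-to-camera distance to a shooting mode,
    using the example thresholds from the embodiment above."""
    if distance_mm < 5.0:
        return ShootingMode.MICROSCOPIC
    if distance_mm <= 30.0:
        return ShootingMode.MACRO
    return ShootingMode.NORMAL
```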
Referring to fig. 3 and 6, in some embodiments, 01: acquiring the distance between the object 14 and the camera of the terminal, including:
011: photographing the object 14 to acquire a captured image; and
013: obtaining the distance according to the actual shape and the actual contrast of the object 14 in the captured image and a preset distance database, where the preset distance database stores calibration images and calibration distance values corresponding to the calibration images, and each calibration image includes a calibration shape and a calibration contrast.
Referring to fig. 2, the one or more processors 30 are also configured to perform the image acquisition methods in 011 and 013. That is, the one or more processors 30 are also configured to control the camera 10 to photograph the object 14 to obtain a captured image, and to obtain the distance according to the actual shape and the actual contrast of the object 14 in the captured image and a preset distance database, where the preset distance database stores calibration images and calibration distance values corresponding to the calibration images, and each calibration image includes a calibration shape and a calibration contrast.
In some embodiments, the camera 10 of the terminal 100 further includes a lens 11 and an image sensor 13. The camera 10 photographs the object 14 at different distances to obtain captured images. Specifically, the light reflected by the object 14 passes through the lens 11 and falls on the image sensor 13, which converts the optical signal into an electrical signal to form the captured image; the processor 30 then analyzes the actual shape and the actual contrast of the captured image. In one embodiment, a single captured image is used: the processor 30 analyzes its actual shape and actual contrast and compares them with the calibration shapes and calibration contrasts in the preset distance database to obtain the distance between the object 14 and the camera 10. The embodiment of the present application takes a single captured image as an example; in other embodiments, a plurality of captured images may be used. In that case, the processor 30 averages the actual shapes and actual contrasts over the plurality of captured images and compares the averages with the calibration shapes and calibration contrasts in the preset distance database to obtain the distance between the object 14 and the camera 10. Averaging reduces the influence of external environmental factors and improves the accuracy of the acquired data.
Specifically, the preset distance database is built from known distance information: the camera 10 photographs the object 14 at each calibrated distance in turn to obtain a plurality of captured images. Images captured at different distances have different shapes and contrasts, and the shape and contrast of the image captured at each calibrated distance are taken as the calibration shape and calibration contrast. The calibrated distances, calibration shapes and calibration contrasts are then sorted and recorded to form the preset distance database.
Referring to fig. 4, in some embodiments, the actual shape and actual contrast form a one-to-one mapping with the calibration shape and calibration contrast in the preset distance database. After the actual shape and actual contrast of the captured image are acquired in 011, since the calibrated distance information is known, the distance between the object 14 and the camera 10 can be obtained from this mapping.
In actual use, the camera 10 photographs the object 14 to obtain a captured image, the processor 30 analyzes the captured image to obtain its actual shape and actual contrast, and the distance between the object 14 and the camera 10 is finally obtained by comparing the actual shape and actual contrast with the calibration shapes and calibration contrasts in the preset distance database.
In some embodiments, the preset distance database may be pre-stored in the memory and read by the processor 30. With the preset distance database, when the processor 30 needs the distance between the object 14 and the camera 10, it can directly compare the actual shape and actual contrast against the calibration shapes and calibration contrasts in the database to obtain the distance. This makes the acquired distance information more accurate and ensures that the shooting mode of the camera 10 is switched more quickly.
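A minimal sketch of such a database lookup, assuming the shape and contrast cues are each reduced to one scalar feature; the feature definitions, the equal weighting, and the nearest-match rule are illustrative assumptions rather than details given by this application.

```python
from dataclasses import dataclass

@dataclass
class CalibrationEntry:
    shape_feature: float  # calibration shape, e.g. blur-spot diameter in pixels
    contrast: float       # calibration contrast measured at the known distance
    distance_mm: float    # calibrated object-to-camera distance

def lookup_distance(shape: float, contrast: float,
                    database: list[CalibrationEntry]) -> float:
    """Return the calibration distance whose (shape, contrast) pair best
    matches the values measured from the captured image."""
    def mismatch(entry: CalibrationEntry) -> float:
        # Equal weighting of the two cues is an illustrative choice.
        return abs(entry.shape_feature - shape) + abs(entry.contrast - contrast)
    return min(database, key=mismatch).distance_mm

# Example: a tiny three-entry database built at calibration time.
db = [CalibrationEntry(12.0, 0.20, 2.0),
      CalibrationEntry(8.0, 0.35, 4.0),
      CalibrationEntry(5.0, 0.55, 8.0)]
print(lookup_distance(shape=7.6, contrast=0.33, database=db))  # -> 4.0
```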
Referring to fig. 2, 5 and 6, in some embodiments, 05: the camera 10 performs macro imaging, including:
051: the first driver 12 drives the lens 11 to move integrally to perform macro focusing; and
053: after the macro focusing is completed, the camera 10 performs photographing.
The one or more processors 30 are also used to perform the image acquisition methods in 051 and 053. That is, the one or more processors 30 are further configured to control the first driver 12 to drive the lens 11 to move integrally for macro focusing; and after the macro focusing is completed, the camera 10 performs photographing.
In certain embodiments, the camera 10 further includes a first driver 12. The first driver 12 drives the lens 11 to move integrally to perform macro focusing; after the macro focusing is completed, the camera 10 performs shooting. Specifically, the camera 10 includes, from the object side to the image side, a lens 11, a first driver 12, and an image sensor 13. It should be noted that the components in fig. 6 are only exemplary, and the actual shape, size and position of the components are not limited to those shown in fig. 6. The object side refers to the side where the object 14 is located, and the image side refers to the side where the image sensor 13 forms the image. The first driver 12 may be a voice coil motor, a piezoelectric ceramic, or the like, without limitation. The first driver 12 is used for driving the lens 11 to move along the optical axis for focusing.
Referring to fig. 2, 7 and 8, in some embodiments, 05: the camera 10 performs macro imaging, and further includes:
055: the first driver 12 drives the lens 11 to move integrally to perform macro coarse focusing; and/or
057: the first driver 12 drives a part of the lens groups in the lens 11 to move to perform macro fine focusing; and
059: after the focusing is completed, the camera 10 performs shooting.
Referring to fig. 2, the one or more processors 30 are also used to perform the image acquisition methods in 055, 057, and 059. That is, the one or more processors 30 are further configured to control the first driver 12 to drive the lens 11 to move integrally to perform macro coarse focusing; and/or to control the first driver 12 to drive a part of the lens groups in the lens 11 to move to perform macro fine focusing; and after the focusing is completed, the camera 10 performs shooting.
Referring to fig. 8, in some embodiments, the lens 11 in the terminal 100 includes a plurality of lens groups. The first driver 12 is used for driving the lens 11 to move integrally to perform macro coarse focusing, and for driving a part of the lens groups in the lens 11 to move to perform macro fine focusing; after the focusing is completed, the camera 10 performs shooting. In some embodiments, the lens 11 includes a plurality of lenses disposed sequentially along the optical axis; the lenses can form one or more lens groups, and each lens group may include one or more lenses. The first driver 12 can move one or more lens groups in the lens 11 for focusing.
Specifically, in one embodiment, as shown in fig. 6, in the case where the one or more processors 30 control the camera 10 to perform macro imaging, the first driver 12 drives the lens 11 to move integrally (in fig. 6, all the lenses of the lens 11 in the right drawing have moved toward the object side along the optical axis compared with the left drawing) to perform macro coarse focusing, and after the focusing is completed, the camera 10 performs shooting. Driving the lens 11 to move integrally adjusts the object distance and the image distance, making it easy to determine the position of the lens 11 at which the object 14 can be clearly imaged.
In another embodiment, as shown in fig. 8, in the case where the one or more processors 30 control the camera 10 to perform macro imaging, the first driver 12 drives a part of the lens groups in the lens 11 to move (in fig. 8, the lens group G2 in the right drawing has moved toward the image side along the optical axis compared with the left drawing) to perform macro fine focusing, and after the focusing is completed, the camera 10 performs shooting. Moving a part of the lens groups inside the lens 11 changes the spacing between the lenses and thereby adjusts the focal length of the lens 11, so that the image captured by the camera 10 is sharper.
In another embodiment, in the case where the one or more processors 30 control the camera 10 to perform macro imaging, the first driver 12 first drives the lens 11 to move integrally for macro coarse focusing, as shown in fig. 6, and then drives a part of the lens groups in the lens 11 to move for macro fine focusing, as shown in fig. 8; after the focusing is completed, the camera 10 performs shooting. Moving the whole lens 11 first adjusts the object distance and the image distance, and then moving a part of the lens groups inside the lens 11 changes the spacing between the lenses and fine-tunes the focal length, so that the image captured by the camera 10 is sharper and more accurate, improving the shooting effect of the camera 10.
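Since the application does not specify a focusing algorithm, the coarse-then-fine sequence can be pictured with a generic contrast-maximizing hill climb; the actuator and focus-metric callables, step sizes, and step counts below are all hypothetical.

```python
from typing import Callable

def hill_climb(move: Callable[[float], None],
               sharpness: Callable[[], float],
               step_um: float, max_steps: int) -> None:
    """Step one actuator in whichever direction increases the focus metric."""
    best, direction = sharpness(), 1.0
    for _ in range(max_steps):
        move(direction * step_um)
        now = sharpness()
        if now < best:
            move(-direction * step_um)  # undo the bad step and reverse
            direction = -direction
        else:
            best = now

def macro_focus(move_whole_lens: Callable[[float], None],
                move_lens_group: Callable[[float], None],
                sharpness: Callable[[], float]) -> None:
    """Coarse focus with the whole lens, then fine focus with a lens group."""
    hill_climb(move_whole_lens, sharpness, step_um=50.0, max_steps=30)  # coarse
    hill_climb(move_lens_group, sharpness, step_um=5.0, max_steps=30)   # fine
```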
Referring to fig. 9 and 10, in some embodiments, the lens 11 includes 5 lenses, and the camera 10 includes, in sequence from the object side to the image side, a cover plate 16, a stop STO, a first lens L1, a second lens L2, a third lens L3, a fourth lens L4, a fifth lens L5, a filter 15, and an image sensor 13. The first lens L1 has negative refractive power, the second lens L2 has positive refractive power, the third lens L3 has negative refractive power, the fourth lens L4 has positive refractive power, and the fifth lens L5 has negative refractive power. The camera 10 satisfies the conditional expression 0.8 < |TTL/Diag| < 1.2, where TTL is the on-axis distance from the object-side surface of the first lens L1 to the imaging surface of the camera 10 (the surface on which the image sensor 13 is located), and Diag is the diagonal length of the image sensor 13. It should be noted that in some embodiments the lens 11 may include a different number of lenses or lenses with other structures; for example, the lens 11 may consist of 6, 7 or 8 lenses, which is not limited herein.
In some embodiments, stop STO may be an aperture stop or a field stop. In the embodiment of the present application, the stop STO is an aperture stop as an example. The stop STO may be provided between the first lens L1 and the subject 14, on the surface of either lens, or between either two lenses. In the embodiment of the present application, the stop STO is provided between the first lens L1 and the subject 14 to control the amount of light entering and improve the imaging effect.
In some embodiments, the optical filter 15 is disposed between the fifth lens L5 and the image sensor 13 for filtering light with a specific wavelength. In some embodiments, the filter 15 is an infrared filter 15. When the camera 10 is used for imaging, light emitted or reflected by the object 14 enters the camera 10 from the object side direction, and finally converges on an imaging surface of the image sensor 13 after sequentially passing through the first lens L1, the second lens L2, the third lens L3, the fourth lens L4, the fifth lens L5, and the optical filter 15.
In some embodiments, by providing 5 lenses with a reasonable thickness configuration, the ratio between the total optical length of the camera 10 and the diagonal length of the image sensor 13 falls within a reasonable range, yielding a shorter closest focusing distance: even when the camera 10 is close to the object 14, focusing succeeds and high-quality, high-definition imaging is obtained. Moreover, the macro shooting function and the microscopic shooting function can both be realized by the same camera 10 without a second camera, which reduces cost and saves internal space in the terminal 100 (shown in fig. 2).
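For a concrete feel of the conditional expression 0.8 < |TTL/Diag| < 1.2, a quick check with made-up module dimensions (the TTL and sensor sizes are illustrative values, not taken from this application):

```python
import math

ttl_mm = 5.6                          # hypothetical on-axis length, L1 front surface to sensor
sensor_w_mm, sensor_h_mm = 5.2, 3.9   # hypothetical active-area dimensions
diag_mm = math.hypot(sensor_w_mm, sensor_h_mm)  # 6.5 mm diagonal

ratio = abs(ttl_mm / diag_mm)
print(f"|TTL/Diag| = {ratio:.3f}")    # 0.862, inside the 0.8-1.2 window
assert 0.8 < ratio < 1.2
```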
In some embodiments, the material of the lens is plastic or glass, or a mixture of plastic and glass, which is not limited herein.
Referring to fig. 2, 11 and 13, in some embodiments, 07: the camera 10 performs microscopic imaging to output an original image, including:
071: the first driver 12 drives the lens 11 to move integrally to perform overall microscopic focusing; and/or
073: the first driver 12 drives a part of the lens groups in the lens 11 to move to perform microscopic internal focusing; and
075: after the focusing is completed, the camera 10 performs shooting.
Referring to fig. 2, the one or more processors 30 are also used to perform the image acquisition methods in 071, 073 and 075. That is, the one or more processors 30 are further configured to control the first driver 12 to drive the lens 11 to move integrally for overall microscopic focusing; and/or to control the first driver 12 to drive a part of the lens groups in the lens 11 to move for microscopic internal focusing; and after the focusing is completed, the camera 10 performs shooting.
Specifically, in some embodiments, the camera 10 further includes a lens 11 and a first driver 12. In the case where the one or more processors 30 control the camera 10 to perform microscopic imaging, the first driver 12 drives the lens 11 to move integrally for overall microscopic focusing, then drives a part of the lens groups in the lens 11 to move for microscopic internal focusing; after the focusing is completed, the camera 10 performs shooting.
The first driver 12 drives the lens 11 to move integrally along the optical axis, adjusting the image distance and the object distance and locating the position of the lens 11 at which the object 14 can be clearly imaged. The first driver 12 then drives a part of the lens groups in the lens 11 to move, changing the spacing between the lenses and thereby adjusting the focal length of the lens 11 to complete focusing on the object 14, after which the camera 10 performs shooting. In this way the object 14 forms a sharper image on the image sensor 13.
In some embodiments, when the first driver 12 drives a part of the lens groups to move, the relative positions of the lenses within a moved group do not change along the optical axis; that is, the first driver 12 drives each lens group as a whole along the optical axis. For example, if the first driver 12 drives the first lens of a lens group to move 50 μm along the optical axis toward the object 14, it correspondingly drives the second lens of the same group to move 50 μm along the optical axis toward the object 14. In some embodiments, different lens groups may move in different directions along the optical axis. For example, referring to fig. 9, the first driver 12 can drive L2 to move toward the object 14 along the optical axis while synchronously driving L4 to move toward the image side. In other embodiments, different lens groups may also move different distances along the optical axis.
In some embodiments, the first driver 12 may be a voice coil motor, a piezoelectric ceramic, or another driving device, which is not limited herein. It should be noted that in some embodiments a plurality of first drivers 12 may be provided, i.e., each first driver 12 is connected to one lens group, or each first driver 12 is connected to one lens, so that the lens groups in the lens 11 can be driven to move along the optical axis simultaneously, saving shooting time and improving shooting responsiveness.
In some embodiments, the lens and the first driver 12 may be connected by a clamp or by an adhesive, without limitation. Specifically, referring to fig. 8 and 9, in some embodiments, the first driver 12 may be adhered to the lens L1, the lens L2 and the lens L4 to drive the lens L1, the lens L2 and the lens L4 to move along the optical axis. In another embodiment, the first driver 12 may further be connected to the lens groups G1 and G2 by a clamp to drive the lens groups G1 and G2 to move along the optical axis direction, so as to adjust the focal length of the lens 11.
Referring to fig. 12 and 13, in some embodiments, 07: the camera 10 performs microscopic imaging to output an original image, and further includes:
074: the second driver 17 drives the phase plate 18 to move onto the imaging optical path.
Referring to fig. 2, the one or more processors 30 are also configured to perform the image acquisition method in 074. That is, the one or more processors 30 are also configured to control the second driver 17 to drive the phase plate 18 to move onto the imaging optical path.
Referring to fig. 13, in some embodiments, the camera 10 of the terminal 100 may further include a second driver 17 and a phase plate 18 connected to the second driver 17. Specifically, in the case where the one or more processors 30 control the camera 10 to perform microscopic imaging, the first driver 12 drives the lenses in the lens 11 to move integrally along the optical axis for overall microscopic focusing, then drives a part of the lens groups in the lens 11 to move along the optical axis for microscopic internal focusing; the second driver 17 drives the phase plate 18 onto the imaging optical path, and the camera 10 performs shooting, so that light is imaged on the image sensor 13 after passing through the phase plate 18 and the lens 11 to obtain an original image.
The terminal 100 obtains an original image by providing the phase plate 18 in the camera 10 and using the second driver 17 to rotate the phase plate 18, so that the phase plate 18 selectively blocks or opens the imaging optical path, and light passes through the phase plate 18 and the lens 11 before being imaged on the image sensor 13. In addition, the light entering the camera 10 is phase-coded by the phase plate 18, so that the camera 10 performs depth-of-field-extended wavefront-coded imaging of the object 14. This not only greatly increases the depth of field of the camera 10, but also corrects defocus aberration caused by installation error, temperature change and the like, improving the imaging performance of the camera 10.
Referring to fig. 14, in some embodiments, the phase plate 18 may include a mirror 181 and a micro-structure layer 183, where the micro-structure layer 183 is disposed on one side of the mirror 181 to form a phase plane 185. The mirror 181 may be planar, spherical, aspherical, or free-form, depending on the actual design requirements of the camera 10. Specifically, the side of the mirror 181 close to the phase plane 185 may be a flat surface or a gentle arc surface, which is easier to manufacture, and the side far from the phase plane 185 may be flat, spherical or aspherical. In addition, when the lens 11 is close to the object 14, the focusing distance is short and the depth of field of the captured image is shallow, so the background is strongly blurred and the microscopic photograph taken by the terminal tends to be blurred; the depth-of-field extension provided by the phase plate 18 addresses this.
It should be noted that the phase plate 18 may be disposed between the object 14 and the imaging plane. When the light reflected by the object 14 enters the camera 10 through the phase plate 18, an intermediate blurred image is formed on the imaging plane, and the degree of blur is consistent for images formed anywhere within the large depth of field. This consistency of the intermediate blur then allows the image to be restored with frequency-domain or spatial-domain algorithms to obtain a final sharp image.
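The forward model behind this consistency can be sketched as a convolution of the scene with a single depth-invariant point spread function; the Gaussian kernel below is only a stand-in for the PSF of a real wavefront-coding phase element.

```python
import numpy as np

def gaussian_psf(size: int = 15, sigma: float = 2.5) -> np.ndarray:
    """Normalized stand-in PSF; a real cubic-phase element has its own shape."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return psf / psf.sum()

def encode(scene: np.ndarray, psf: np.ndarray) -> np.ndarray:
    """Blur the scene with one depth-invariant PSF (circular FFT convolution)."""
    kernel = np.zeros(scene.shape)
    k = psf.shape[0]
    kernel[:k, :k] = psf
    kernel = np.roll(kernel, (-(k // 2), -(k // 2)), axis=(0, 1))  # center at origin
    return np.real(np.fft.ifft2(np.fft.fft2(scene) * np.fft.fft2(kernel)))
```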
Referring to fig. 2 and 15, in some embodiments, 07: the processor 30 performs an image restoration process on the original image to output a microscopic image, including:
077: performing encoded-image processing on the original image to obtain an encoded image with the same point spread function; and
079: inputting the encoded image into a preset neural network model to perform convolution and deconvolution operations so as to output a microscopic image.
The one or more processors 30 are also used to perform the image acquisition methods in 077 and 079. That is, the one or more processors 30 are further configured to perform encoded-image processing on the original image to obtain an encoded image with the same point spread function, and to input the encoded image into a preset neural network model to perform convolution and deconvolution operations so as to output a microscopic image.
Referring to fig. 16, in the case where the one or more processors 30 control the camera 10 to perform microscopic imaging, the camera 10 captures an original image (shown in (b) of fig. 16). However, the original image deviates considerably from an ideal image (shown in (a) of fig. 16): the edges of the original image obtained by the camera 10 are blurred and the off-axis field performance is poor. Therefore, the processor 30 in the terminal 100 is further configured to perform encoded-image processing on the original image to obtain an encoded image with the same point spread function, and to input the encoded image into a preset neural network model for convolution and deconvolution operations, so as to output a microscopic image with better field performance.
Specifically, referring to fig. 17, an original image (shown in (a) of fig. 17) is first subjected to encoded-image processing to obtain an encoded image (shown in (b) of fig. 17) with the same point spread function; a mapping relationship is established between the point spread function and the neural network; and the encoded image is subjected to image signal processing (shown in (c) of fig. 17) to output a microscopic image (shown in (d) of fig. 17). It should be noted that in some embodiments the encoding may be performed by the phase plate 18, whose surface is ground and polished so that the captured image has the same point spread function across the field. In other embodiments, the encoding may instead be performed by an independent external device placed in front of the lens of the camera 10, likewise yielding an encoded image with the same point spread function.
In some embodiments, the image signal processing includes a neural network model, i.e., the data is processed by the neural network model. In a specific application, the encoded image is output as a microscopic image after the image signal processing, that is, after a restoration algorithm is applied. The neural network model may be a restoration algorithm, a convolution and deconvolution algorithm, or an AI algorithm, which is not limited herein.
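As one concrete instance of the restoration-algorithm option mentioned above (not the specific model used by this application), a Wiener-deconvolution sketch using the known, field-constant point spread function; the noise-to-signal constant is an assumed tuning value, and in the neural-network variant a learned convolution/deconvolution model would take this step's place.

```python
import numpy as np

def wiener_restore(blurred: np.ndarray, psf: np.ndarray, nsr: float = 1e-3) -> np.ndarray:
    """Deconvolve the consistently blurred image with the known PSF."""
    kernel = np.zeros(blurred.shape)
    k = psf.shape[0]
    kernel[:k, :k] = psf
    kernel = np.roll(kernel, (-(k // 2), -(k // 2)), axis=(0, 1))  # center at origin
    H = np.fft.fft2(kernel)
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)  # Wiener filter with assumed NSR
    return np.real(np.fft.ifft2(np.fft.fft2(blurred) * W))
```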
Referring to fig. 1, 2 and 18, the present application further provides a non-volatile computer-readable storage medium 200 storing a computer program 202. When the computer program 202 is executed by one or more processors 30, the image acquisition method of any of the above embodiments is implemented.
For example, when the program 202 is executed by the processor 30, the following image acquisition method is implemented:
01: acquiring a distance between an object and the camera 10 of the terminal 100;
03: determining a photographing mode of the terminal 100 according to the distance;
05: in the case where the terminal 100 is in the macro mode, the camera 10 performs macro imaging to output a macro image; and
07: in the case where the terminal 100 is in the microscopic mode, the camera 10 performs microscopic imaging to output an original image, and the processor 30 performs image restoration processing on the original image to output a microscopic image.
For another example, when the program 202 is executed by the processor 30, the image capturing methods of 011, 013, 051, 053, 055, 057, 059, 071, 073, 074, 075, 077, and 079 can also be implemented.
Referring to fig. 6, with the non-volatile computer-readable storage medium 200 of the present application, the distance between the object 14 and the camera 10 of the terminal 100 is acquired; the shooting mode of the terminal 100 is determined according to the distance; in the case where the terminal 100 is in the macro mode, the camera 10 performs macro imaging to output a macro image; and in the case where the terminal 100 is in the microscopic mode, the camera 10 performs microscopic imaging to output an original image, and the processor 30 performs image restoration processing on the original image to output a microscopic image. When the terminal 100 detects that the distance between the object 14 and the camera 10 satisfies the microscopic condition, the camera 10 is controlled to perform microscopic imaging and output an original image, and the processor 30 performs image restoration processing on the original image to output a microscopic image. The macro shooting function and the microscopic shooting function can thus be completed in one camera module without providing two lenses 11, reducing production cost and saving the internal space of the terminal 100.
In the description of the present specification, reference to the description of "one embodiment," "some embodiments," "an example," "a specific example," or "some examples" or the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "plurality" means at least two, e.g., two or three, unless specifically limited otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
While embodiments of the present application have been shown and described above, it will be understood that the above embodiments are exemplary and should not be construed as limiting the present application and that changes, modifications, substitutions and alterations in the above embodiments may be made by those of ordinary skill in the art within the scope of the present application.

Claims (13)

1. An image acquisition method, comprising:
acquiring the distance between an object and a camera of a terminal;
determining a shooting mode of the terminal according to the distance;
under the condition that the terminal is in a macro mode, the camera executes macro imaging to output a macro image; and
under the condition that the terminal is in a microscopic mode, the camera performs microscopic imaging to output an original image, and the processor performs image restoration processing on the original image to output a microscopic image.
2. The image acquisition method according to claim 1, wherein the acquiring a distance between the object and a camera of the terminal comprises:
shooting the object to obtain a captured image; and
acquiring the distance according to the actual shape and the actual contrast of the object in the captured image and a preset distance database, wherein the preset distance database stores calibration images and calibration distance values corresponding to the calibration images, and each calibration image comprises a calibration shape and a calibration contrast.
3. The image acquisition method according to claim 1, wherein the camera comprises a lens and a first driver; the camera performs macro imaging, including:
the first driver drives the lens to integrally move so as to perform macro focusing; and
after the macro focusing is completed, the camera performs shooting.
4. The image acquisition method of claim 1, wherein the camera comprises a lens and a first driver, the lens comprising a plurality of lens groups; the camera performs macro imaging, including:
the first driver drives the lens to integrally move so as to perform macro coarse focusing; and/or
the first driver drives a part of the lens groups in the lens to move so as to perform macro fine focusing; and
after focusing is completed, the camera performs shooting.
5. The image acquisition method according to claim 1, wherein the camera comprises a lens and a first driver; the camera performs microscopic imaging to output an original image, including:
the first driver drives the lens to integrally move so as to perform overall microscopic focusing; and/or
the first driver drives a part of the lens groups in the lens to move so as to perform microscopic internal focusing; and
after focusing is completed, the camera performs shooting.
6. The image acquisition method according to claim 1, wherein the processor performs an image restoration process on the original image to output a microscopic image, including:
performing encoded-image processing on the original image to obtain an encoded image with the same point spread function; and
inputting the encoded image into a preset neural network model to perform convolution and deconvolution operations so as to output the microscopic image.
7. A terminal, comprising:
a body;
a camera disposed on the body; and
a processor configured to acquire the distance between an object and the camera and to determine the shooting mode of the terminal according to the distance, wherein:
under the condition that the terminal is in a macro mode, the camera executes macro imaging to output a macro image;
under the condition that the terminal is in a microscopic mode, the camera performs microscopic imaging to output an original image; and
the processor is further configured to perform an image restoration process on the original image to output a microscopic image.
8. The terminal of claim 7,
the camera is also used for shooting the object to obtain a captured image;
the processor is further configured to obtain the distance according to an actual shape and an actual contrast of the object in the captured image and a preset distance database, where the preset distance database stores calibration images and calibration distance values corresponding to the calibration images, and each calibration image includes a calibration shape and a calibration contrast.
9. The terminal of claim 7, wherein the camera comprises a lens and a first driver; the first driver is used for driving the lens to integrally move so as to perform macro focusing; and after the macro focusing is completed, the camera performs shooting.
10. The terminal of claim 7, wherein the camera comprises a lens and a first driver, the lens comprising a plurality of lens groups; the first driver is used for driving the lens to integrally move so as to perform macro coarse focusing; the first driver is also used for driving a part of the lens groups in the lens to move so as to perform macro fine focusing; after focusing is completed, the camera performs shooting.
11. The terminal of claim 7, wherein the camera comprises a lens and a first driver; the first driver is used for driving the lens to integrally move so as to perform overall microscopic focusing; the first driver is also used for driving a part of the lens groups in the lens to move so as to perform microscopic internal focusing; after focusing is completed, the camera performs shooting.
12. The terminal according to claim 7, wherein the processor is further configured to perform encoded-image processing on the original image to obtain an encoded image with the same point spread function, and to input the encoded image into a preset neural network model to perform convolution and deconvolution operations so as to output the microscopic image.
13. A non-volatile computer-readable storage medium storing a computer program which, when executed by one or more processors, implements the image acquisition method of any one of claims 1 to 6.
Application CN202210814363.3A, filed 2022-07-11 (priority date 2022-07-11): Image acquisition method, terminal and readable storage medium — Pending — published as CN115278007A

Priority Applications (1)

CN202210814363.3A — priority date 2022-07-11, filing date 2022-07-11 — Image acquisition method, terminal and readable storage medium

Publications (1)

CN115278007A — published 2022-11-01

Family ID: 83765366

Family Applications (1)

CN202210814363.3A (pending) — priority date 2022-07-11, filing date 2022-07-11 — Image acquisition method, terminal and readable storage medium

Country Status (1)

CN (1): CN115278007A


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination