CN113709366A - Information processing method and device - Google Patents

Information processing method and device

Info

Publication number
CN113709366A
CN113709366A
Authority
CN
China
Prior art keywords
camera
preview image
area
blurring
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110972977.XA
Other languages
Chinese (zh)
Other versions
CN113709366B (en)
Inventor
马彬强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd
Priority to CN202110972977.XA
Publication of CN113709366A
Application granted
Publication of CN113709366B
Legal status: Active
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • H04N23/675 Focus control based on electronic image sensor signals comprising setting of focusing regions

Abstract

The application discloses an information processing method and device, wherein the method comprises the following steps: obtaining, in real time through a camera, a preview image of an acquisition space of the camera, wherein the preview image comprises a blurring area and a clear area, and the blurring area of the preview image is an area generated according to the focal length of the camera and the current aperture; determining an adjustment range based on the maximum aperture of the camera and the minimum aperture of the camera; displaying the preview image in real time through a display screen; obtaining an adjustment operation, the adjustment operation being used for changing the range of the blurring area; and determining a target aperture value based on the adjustment operation and the adjustment range, wherein a blurring area of the preview image obtained by the camera under the action of the target aperture value and the focal length is different from a blurring area of the preview image obtained by the camera under the action of the current aperture and the focal length.

Description

Information processing method and device
Technical Field
The present application relates to the field of optical technologies, and in particular, to an information processing method and apparatus.
Background
"depth of field" is a particular concept in the field of photography. The depth of field refers to a certain distance range from a camera lens to a shot object; within this distance range, the subject can be imaged clearly. The depth of field has a very important influence on the shooting effect, so that a photographer needs to adjust and set the depth of field in the shooting process, thereby achieving an ideal shooting effect.
When shooting, a photographer can realize the purpose of controlling the depth of field by adjusting and setting a series of shooting parameters. In addition, in the conventional camera, the photographer can also observe through the viewfinder of the camera to confirm whether the adjustment of the depth effect is ideal.
The above method for adjusting the depth of field has the disadvantage that the depth of field can be accurately controlled only by mastering certain professional skills, which is often not achieved by ordinary users. And the current partial photographing apparatus can only perform framing through the display screen, so that there may be deviation in the framing preview for the depth of field effect, which is likely to cause erroneous judgment by the photographer.
Disclosure of Invention
The application provides an information processing method and device.
In a first aspect, the present application provides an information processing method, including:
the method comprises the steps that a preview image in an acquisition space of a camera is obtained in real time through the camera, wherein the preview image comprises a blurring area and a clear area; the blurring area of the preview image is an area generated according to the focal length of the camera and the current aperture;
determining an adjustment range based on the maximum aperture of the camera and the minimum aperture of the camera;
displaying the preview image in real time through a display screen;
obtaining an adjustment operation; the adjusting operation is used for changing the range of the blurring region;
and determining a target aperture value based on the adjustment operation and the adjustment range, wherein a blurring area of the preview image obtained by the camera under the action of the target aperture value and the focal length is different from a blurring area of the preview image obtained by the camera under the action of the current aperture and the focal length.
Preferably, the method further comprises:
acquiring a photographing instruction;
and responding to the photographing instruction by generating and storing the photographed image based on the target aperture value and the focal length, wherein the blurring area of the photographed image is consistent with the blurring area of the preview image obtained by the camera under the action of the target aperture value and the focal length.
Preferably, the method further comprises:
and displaying the mark of the adjustment range in the process of displaying the preview image in real time through a display screen, wherein the mark is used for prompting the adjustable boundary of the blurring area.
Preferably, when the subject moves, the method further includes:
determining a shot object positioned in the acquisition space of the camera, and determining the distance from the shot object to the camera;
or, searching the shot object in the acquisition space of the camera, and detecting to obtain the distance from the shot object to the camera;
and determining the focal length of the camera according to the distance from the shot object to the camera.
Preferably, the method further comprises:
determining the distance range of the clear area according to the shape of the shot object after moving;
determining a target aperture value based on a distance range of the clear region;
and the target aperture value and the focal length re-determine a blurring area and a clear area of a preview image of the camera.
Preferably, the determining a target aperture value based on the adjustment operation and the adjustment range further includes:
and determining the shutter time value and the sensitivity value of the camera.
Preferably, the method further comprises the following steps:
determining the variation range of the clear area according to the adjustment range of the camera;
marking a maximum clear area and/or a minimum clear area in the preview image according to the variation range of the clear area.
In a second aspect, the present application provides an information processing apparatus comprising:
the preview image determining module is used for obtaining a preview image in an acquisition space of a camera in real time through the camera, wherein the preview image comprises a blurring area and a clear area; the blurring area of the preview image is an area generated according to the focal length of the camera and the current aperture;
an adjustment range determining module for determining an adjustment range based on the maximum aperture of the camera and the minimum aperture of the camera;
the display module is used for displaying the preview image in real time through a display screen;
the operation acquisition module is used for acquiring adjustment operation; the adjusting operation is used for changing the range of the blurring region;
and the blurring module is used for determining a target aperture value based on the adjustment operation and the adjustment range, wherein the blurring area of the preview image obtained by the camera under the action of the target aperture value and the focal length is different from the blurring area of the preview image obtained by the camera under the action of the current aperture and the focal length.
In a third aspect, the present application provides a computer-readable storage medium storing a computer program for executing the information processing method described in the present application.
In a fourth aspect, the present application provides an electronic device comprising:
a processor;
a memory for storing the processor-executable instructions;
the processor is used for reading the executable instructions from the memory and executing the instructions to realize the information processing method.
Compared with the prior art, the information processing method and device provided by the application can control the depth-of-field distance of shooting in an intuitive and simple manner, thereby determining the range of the clear area and the blurring area, so that a non-professional user can complete an accurate shooting operation.
Drawings
Fig. 1 is a schematic flowchart of an information processing method according to an embodiment of the present application;
fig. 2 is a light path diagram in an information processing method according to an embodiment of the present application;
fig. 3 is a schematic flowchart of another information processing method according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an information processing apparatus according to an embodiment of the present application.
Detailed Description
In order to make the objects, features and advantages of the present application more obvious and understandable, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are only a part of the embodiments of the present application, and not all the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
When shooting, a photographer can control the depth of field by adjusting and setting a series of shooting parameters. In addition, with a conventional camera, the photographer can also look through the viewfinder to confirm whether the adjusted depth-of-field effect is satisfactory.
The drawback of this way of adjusting the depth of field is that accurate control requires a certain level of professional skill, which ordinary users often lack. Moreover, some current photographing devices can only frame the scene through a display screen, so the depth-of-field effect shown in the framing preview may deviate from the actual result, which easily leads the photographer to a wrong judgment.
Therefore, embodiments of the present application will provide an information processing method to solve at least the above technical problems in the prior art. As shown in fig. 1, the method in this embodiment includes the following steps:
step 101, obtaining a preview image in an acquisition space of a camera in real time through the camera.
That is, during normal framing and focusing, the camera obtains a picture of the acquisition space as a preview image. The preview image comprises a blurring region and a clear region; the blurring region of the preview image is a region generated according to the focal length of the camera and the current aperture. In other words, the clear region is what photography calls the depth-of-field region, and the blurring region is the region outside it.
The extent of the clear area, i.e. the depth-of-field area, depends mainly on certain camera parameters, the most important of which is the current aperture value; the shutter time value and the sensitivity value may also be involved. The determination of the depth of field is illustrated with reference to fig. 2. L1 represents the distance range of the depth-of-field area from the camera (i.e. the lens), also called the depth-of-field distance. The length of the depth-of-field distance is mainly related to the current aperture value; the exact mathematical relationship is well known in the art and is not repeated here.
L2 represents the corresponding depth-of-focus range after refraction of the depth-of-field region through the lens. L3 is the blur spread corresponding to the depth of focus, i.e. the circle-of-confusion diameter. Generally, when the circle-of-confusion diameter stays within 1-2 pixels of the image sensor, the imaging sharpness meets the requirement of the "clear region". Conversely, the space outside the depth-of-field distance can be considered to belong to the blurring region.
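For orientation, the following is a minimal sketch (not taken from the patent) of the standard thin-lens depth-of-field approximation that relates the clear region to the current aperture value; the function names and the 0.03 mm circle-of-confusion value are illustrative assumptions.

def hyperfocal_distance(focal_length_mm: float, f_number: float, coc_mm: float) -> float:
    """Hyperfocal distance H = f^2 / (N * c) + f, in millimetres."""
    return focal_length_mm ** 2 / (f_number * coc_mm) + focal_length_mm

def depth_of_field(focal_length_mm: float, f_number: float,
                   subject_distance_mm: float, coc_mm: float = 0.03):
    """Return (near_limit_mm, far_limit_mm) of the clear region for a focus
    distance measured from the lens.

    coc_mm is the acceptable circle-of-confusion diameter; 0.03 mm is a common
    full-frame assumption standing in for the "1-2 pixel" criterion above.
    """
    h = hyperfocal_distance(focal_length_mm, f_number, coc_mm)
    f, s = focal_length_mm, subject_distance_mm
    near = s * (h - f) / (h + s - 2 * f)
    far = float("inf") if s >= h else s * (h - f) / (h - s)
    return near, far

# Example: a 50 mm lens focused at 2 m. Opening the aperture (smaller f-number)
# shortens the depth-of-field distance, i.e. shrinks the clear region.
print(depth_of_field(50.0, 1.8, 2000.0))   # roughly 1.92 m .. 2.09 m
print(depth_of_field(50.0, 16.0, 2000.0))  # roughly 1.46 m .. 3.20 m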
Step 102, determining an adjustment range based on the maximum aperture of the camera and the minimum aperture of the camera.
As noted above, the length of the depth-of-field distance is mainly related to the current aperture value, and on a given camera the adjustable range of the aperture value is fixed. That is, an adjustment range can be determined from the maximum aperture and the minimum aperture of the camera. This adjustment range also corresponds to the adjustment range of the depth-of-field distance; in other words, the division of the preview image into a blurring region and a clear region can only be adjusted within this range.
In addition, a mark of the adjustment range can be displayed while the preview image is displayed in real time through the display screen; the mark prompts the adjustable boundary of the blurring area. That is, the adjustable range is shown in the preview image through the UI interface, so that the user can perform an adjustment operation.
And 103, displaying the preview image in real time through a display screen.
After the camera collects the preview image, the display screen displays it. In addition, on the basis of showing the adjustable range in the preview image through the UI interface, the variation range of the clear area can be determined according to the adjustment range of the camera, and the maximum clear area and/or the minimum clear area can be marked in the preview image according to that variation range.
That is, the maximum and minimum depth-of-field distances, and the maximum and minimum extents of the clear region corresponding to them, are determined according to the adjustment range of the camera. Further, the maximum clear area and/or the minimum clear area can be marked in the preview image with a marker symbol, so that the user can choose intuitively while framing.
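Continuing the depth_of_field() helper from the sketch above, the following illustrates how the fixed aperture range could bound the adjustable clear region (step 102) and yield the maximum/minimum clear areas marked in the preview (step 103); the f/1.8 to f/16 range is an assumed example, not a value from the patent.

def clear_region_bounds(focal_length_mm, subject_distance_mm,
                        widest_f_number, narrowest_f_number, coc_mm=0.03):
    """Return (minimum clear area, maximum clear area) as (near, far) pairs.

    The widest aperture (smallest f-number) gives the minimum clear area,
    the narrowest aperture (largest f-number) gives the maximum clear area.
    """
    minimum_clear = depth_of_field(focal_length_mm, widest_f_number,
                                   subject_distance_mm, coc_mm)
    maximum_clear = depth_of_field(focal_length_mm, narrowest_f_number,
                                   subject_distance_mm, coc_mm)
    return minimum_clear, maximum_clear

# e.g. a 50 mm lens whose aperture runs from f/1.8 to f/16, focused at 2 m:
min_clear, max_clear = clear_region_bounds(50.0, 2000.0, 1.8, 16.0)
# min_clear and max_clear are the adjustable boundaries that the UI can mark
# in the preview image as the smallest and largest possible clear areas.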
Step 104, obtaining an adjustment operation; the adjusting operation is to change the extent of the blurring region.
During framing and shooting, the user may not be satisfied with the current state of the clear area and blurring area. Once the UI interface of the preview image shows the adjustable range, the user can adjust it, thereby changing the range of the clear area and blurring area. The camera acquires this adjustment operation from the user.
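The patent does not specify the concrete form of the adjustment control, so as a hedged illustration the adjustment operation is modelled below as a single normalized position t over the adjustment range; min_clear and max_clear come from the previous sketch.

def requested_far_limit(t: float, min_clear: tuple, max_clear: tuple) -> float:
    """Map a normalized control position t in [0, 1] to a requested far
    boundary of the clear region, between the minimum clear area (t = 0)
    and the maximum clear area (t = 1)."""
    t = min(max(t, 0.0), 1.0)
    return min_clear[1] + t * (max_clear[1] - min_clear[1])

# e.g. the user drags the control to 40 % of the marked adjustment range:
wanted_far_mm = requested_far_limit(0.4, min_clear, max_clear)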
And 105, determining a target aperture value based on the adjusting operation and the adjusting range.
According to the adjustment operation, the intended range of the clear region and blurring region can be determined intuitively in the preview image. From that range, the adjusted target aperture value can be obtained by inverse calculation. The blurring area of the preview image obtained by the camera under the action of the target aperture value and the focal length is different from the blurring area of the preview image obtained by the camera under the action of the current aperture and the focal length; applying the target aperture value makes the adjusted range of the clear area and blurring area actually appear in the preview image. At the same time, the shutter time value and the sensitivity value of the camera can also be obtained by inverse calculation.
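A sketch of the inverse calculation in step 105, under the same assumptions as the earlier sketches: the requested far limit of the clear region is matched against a list of assumed aperture stops, and the shutter time is rescaled so the overall exposure stays constant (one possible way to obtain the shutter time and sensitivity values mentioned above, not necessarily the patent's).

# Assumed, illustrative list of selectable aperture stops for the lens.
APERTURE_STOPS = [1.8, 2.0, 2.8, 4.0, 5.6, 8.0, 11.0, 16.0]

def target_aperture_for_far_limit(requested_far_mm, focal_length_mm,
                                  subject_distance_mm, coc_mm=0.03):
    """Pick the stop whose far clear-region limit is closest to the request
    (the inverse calculation from region extent back to an aperture value)."""
    def far_limit(n):
        return depth_of_field(focal_length_mm, n, subject_distance_mm, coc_mm)[1]
    return min(APERTURE_STOPS, key=lambda n: abs(far_limit(n) - requested_far_mm))

def compensate_exposure(current_n, target_n, shutter_s, iso):
    """Keep overall exposure constant: light through the lens scales with 1/N^2,
    so the shutter time is scaled by (target_n / current_n)^2 here, leaving the
    sensitivity unchanged. A real camera could split the change between both."""
    return shutter_s * (target_n / current_n) ** 2, iso

target_n = target_aperture_for_far_limit(wanted_far_mm, 50.0, 2000.0)
new_shutter, new_iso = compensate_exposure(1.8, target_n, 1 / 250, 100)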
In addition, after framing and adjusting in the above manner, the user may issue a photographing instruction. The camera then obtains the photographing instruction and, in response to it, generates and stores the photographed image based on the target aperture value and the focal length, where the blurring area of the photographed image is consistent with the blurring area of the preview image obtained by the camera under the action of the target aperture value and the focal length.
According to the above technical solution, the beneficial effect of this embodiment is that the depth-of-field distance of shooting can be controlled in an intuitive and simple manner, thereby determining the range of the clear area and the blurring area, so that a non-professional user can complete an accurate shooting operation.
Fig. 1 shows only a basic embodiment of the method described in the present application, and based on this, certain optimization and expansion can be performed, and other preferred embodiments of the method can also be obtained.
Fig. 3 shows another embodiment of an information processing method according to the present application. The present embodiment is further developed on the basis of the foregoing embodiments. The method specifically comprises the following steps:
301, obtaining a preview image in an acquisition space of a camera in real time through the camera.
That is, during normal framing and focusing, the camera obtains a picture of the acquisition space as a preview image. The preview image comprises a blurring region and a clear region; the blurring region of the preview image is a region generated according to the focal length of the camera and the current aperture. In other words, the clear region is what photography calls the depth-of-field region, and the blurring region is the region outside it.
Step 302, determining an adjustment range based on the maximum aperture of the camera and the minimum aperture of the camera.
As noted above, the length of the depth-of-field distance is mainly related to the current aperture value, and on a given camera the adjustable range of the aperture value is fixed. That is, an adjustment range can be determined from the maximum aperture and the minimum aperture of the camera. This adjustment range also corresponds to the adjustment range of the depth-of-field distance; in other words, the division of the preview image into a blurring region and a clear region can only be adjusted within this range.
In addition, a mark of the adjustment range can be displayed while the preview image is displayed in real time through the display screen; the mark prompts the adjustable boundary of the blurring area. That is, the adjustable range is shown in the preview image through the UI interface, so that the user can perform an adjustment operation.
And 303, displaying the preview image in real time through a display screen.
After the camera collects the preview image, the display screen displays it. In addition, on the basis of showing the adjustable range in the preview image through the UI interface, the variation range of the clear area can be determined according to the adjustment range of the camera, and the maximum clear area and/or the minimum clear area can be marked in the preview image according to that variation range.
That is, the maximum and minimum depth-of-field distances, and the maximum and minimum extents of the clear region corresponding to them, are determined according to the adjustment range of the camera. Further, the maximum clear area and/or the minimum clear area can be marked in the preview image with a marker symbol, so that the user can choose intuitively while framing.
Step 304, obtaining an adjustment operation; the adjusting operation is to change the extent of the blurring region.
During framing and shooting, the user may not be satisfied with the current state of the clear area and blurring area. Once the UI interface of the preview image shows the adjustable range, the user can adjust it, thereby changing the range of the clear area and blurring area. The camera acquires this adjustment operation from the user.
Step 305, determining a target aperture value based on the adjustment operation and the adjustment range.
According to the adjustment operation, the intended range of the clear region and blurring region can be determined intuitively in the preview image. From that range, the adjusted target aperture value can be obtained by inverse calculation. The blurring area of the preview image obtained by the camera under the action of the target aperture value and the focal length is different from the blurring area of the preview image obtained by the camera under the action of the current aperture and the focal length; applying the target aperture value makes the adjusted range of the clear area and blurring area actually appear in the preview image.
In the case of the subject moving, the method in the embodiment further includes:
step 306, determining a shot object located in the acquisition space of the camera, and determining the distance from the shot object to the camera.
In an optional case, the subject moves only a short distance, so the same subject can be captured again in the acquisition space, and the distance from the subject to the camera can be determined again. Specifically, the distance can be measured by a ranging sensor carried on the camera.
Step 307, searching the shot object in the acquisition space of the camera, and detecting to obtain the distance between the shot object and the camera.
In another optional case, the subject moves a long distance and the camera cannot recapture it. The camera can then search the acquisition space from near to far to determine a subject again. In this case, the newly determined subject may or may not be the same as the original subject. After the subject is redetermined, the distance from the subject to the camera can be determined in the same way.
And 308, determining the focal length of the camera according to the distance from the shot object to the camera.
After the distance from the subject to the camera is determined, focusing can be performed again, that is, the focal length of the camera is determined. In terms of the optical path shown in fig. 2, the focal length f is thereby fixed first.
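A sketch of the refocusing in steps 306 to 308 under the ideal thin-lens model; the patent only states that the focal length is determined from the measured distance, so the equation and the example numbers here are assumptions for illustration.

def image_distance_for_focus(focal_length_mm: float, subject_distance_mm: float) -> float:
    """Solve the thin-lens equation 1/f = 1/u + 1/v for the image distance v,
    i.e. the lens-to-sensor distance that brings the subject into focus."""
    f, u = focal_length_mm, subject_distance_mm
    if u <= f:
        raise ValueError("subject lies inside the focal length; it cannot be focused")
    return f * u / (u - f)

# e.g. the ranging sensor reports the re-detected subject at 1.5 m with a 50 mm lens:
v_mm = image_distance_for_focus(50.0, 1500.0)   # about 51.7 mm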
Step 309, determining the distance range of the clear region according to the shape of the subject after the movement.
After the focal length is determined, the depth-of-field distance under this focal length, that is, the corresponding clear area, can be further determined. The depth-of-field distance, i.e. the division into clear area and blurring area, can be determined anew according to the newly determined form of the subject, that is, the newly determined content of the preview image; refer to steps 303 to 304.
Step 310, determining a target aperture value based on the distance range of the clear area.
The target aperture value and the focal length then re-determine the blurring area and the clear area of the preview image of the camera. After the clear region is re-determined, the target aperture value can be re-determined accordingly, as in step 305.
Through the method in the embodiment, refocusing after the movement of the shot object can be further realized, and a clear area/blurring area can be re-determined.
Fig. 4 shows an embodiment of an information processing apparatus according to the present application. The apparatus of this embodiment is a physical apparatus for performing the method described in FIGS. 1-3. The technical solution is essentially the same as that in the above embodiment, and the corresponding description in the above embodiment is also applicable to this embodiment. The device in the embodiment comprises:
the preview image determining module 401 is configured to obtain a preview image in an acquisition space of a camera in real time through the camera, where the preview image includes a blurring region and a clear region; the blurring area of the preview image is an area generated according to the focal length of the camera and the current aperture function.
An adjustment range determining module 402, configured to determine an adjustment range based on the maximum aperture of the camera and the minimum aperture of the camera.
And a display module 403, configured to display the preview image in real time through a display screen.
An operation obtaining module 404, configured to obtain an adjustment operation; the adjusting operation is to change the extent of the blurring region.
A blurring module 405, configured to determine a target aperture value based on the adjustment operation and the adjustment range, where the blurring area of the preview image obtained by the camera under the action of the target aperture value and the focal length is different from the blurring area of the preview image obtained by the camera under the action of the current aperture and the focal length.
In addition, on the basis of the embodiment shown in fig. 4, it is preferable that:
a shooting module 406, configured to obtain a shooting instruction; and responding to the photographing instruction, generating and storing the photographed image based on the target aperture value and the focal length, wherein the virtual area of the photographed image is consistent with the virtual area of the preview image obtained by the camera through the action of the target aperture value and the focal length.
An adjusting module 407, configured to display a mark of the adjustment range in a process of displaying the preview image in real time through a display screen, where the mark is used to prompt an adjustable boundary of the blurring region.
The object capturing module 408 is configured to determine a subject located in the acquisition space of the camera when the subject moves, and determine a distance from the subject to the camera; or, searching the shot object in the acquisition space of the camera, and detecting to obtain the distance from the shot object to the camera; and determining the focal length of the camera according to the distance from the shot object to the camera.
The auxiliary display module is used for determining the change range of the clear area according to the adjustment range of the camera; marking a maximum clear area and/or a minimum clear area in the preview image according to the variation range of the clear area.
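As a structural illustration only, the following sketch wires the five main modules of fig. 4 into one object; the callable signatures and the placeholder aperture mapping are assumptions, not the patent's implementation.

from dataclasses import dataclass
from typing import Callable, Tuple

@dataclass
class InformationProcessingApparatus:
    get_preview: Callable[[], object]          # preview image determining module 401
    aperture_range: Tuple[float, float]        # adjustment range determining module 402
    show: Callable[[object], None]             # display module 403
    read_adjustment: Callable[[], float]       # operation acquisition module 404
    apply_aperture: Callable[[float], None]    # blurring module 405

    def run_once(self) -> None:
        preview = self.get_preview()
        self.show(preview)
        t = self.read_adjustment()             # normalized adjustment operation in [0, 1]
        widest_n, narrowest_n = self.aperture_range
        # Placeholder mapping; the real blurring module solves for the aperture
        # whose depth of field matches the requested clear/blurring split.
        target_n = widest_n + t * (narrowest_n - widest_n)
        self.apply_aperture(target_n)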
In addition to the above-described methods and apparatus, embodiments of the present application may also be a computer program product comprising computer program instructions that, when executed by a processor, cause the processor to perform the steps in the methods according to the various embodiments of the present application described in the "exemplary methods" section of this specification, above.
The computer program product may be written with program code for performing the operations of embodiments of the present application in any combination of one or more programming languages, including object-oriented programming languages such as Java or C++ and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present application may also be a computer-readable storage medium having stored thereon computer program instructions that, when executed by a processor, cause the processor to perform steps in a method according to various embodiments of the present application described in the "exemplary methods" section above of this specification.
The computer-readable storage medium may take any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may include, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing describes the general principles of the present application in conjunction with specific embodiments, however, it is noted that the advantages, effects, etc. mentioned in the present application are merely examples and are not limiting, and they should not be considered essential to the various embodiments of the present application. Furthermore, the foregoing disclosure of specific details is for the purpose of illustration and description and is not intended to be limiting, since the foregoing disclosure is not intended to be exhaustive or to limit the disclosure to the precise details disclosed.
The block diagrams of devices, apparatuses, and systems referred to in this application are given only as illustrative examples and are not intended to require or imply that the connections, arrangements, and configurations must be made in the manner shown in the block diagrams. These devices, apparatuses, and systems may be connected, arranged, and configured in any manner, as will be appreciated by those skilled in the art. Words such as "including," "comprising," "having," and the like are open-ended words that mean "including, but not limited to," and are used interchangeably therewith. The word "or" as used herein means, and is used interchangeably with, the word "and/or," unless the context clearly dictates otherwise. The phrase "such as" is used herein to mean, and is used interchangeably with, the phrase "such as but not limited to."
It should also be noted that in the devices, apparatuses, and methods of the present application, the components or steps may be decomposed and/or recombined. These decompositions and/or recombinations are to be considered as equivalents of the present application.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present application. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the application. Thus, the present application is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, the description is not intended to limit embodiments of the application to the form disclosed herein. While a number of example aspects and embodiments have been discussed above, those of skill in the art will recognize certain variations, modifications, alterations, additions and sub-combinations thereof.

Claims (10)

1. An information processing method, the method comprising:
the method comprises the steps that a preview image in an acquisition space of a camera is obtained in real time through the camera, wherein the preview image comprises a blurring area and a clear area; the blurring area of the preview image is an area generated according to the focal length of the camera and the current aperture;
determining an adjustment range based on the maximum aperture of the camera and the minimum aperture of the camera;
displaying the preview image in real time through a display screen;
obtaining an adjustment operation; the adjusting operation is used for changing the range of the blurring region;
and determining a target aperture value based on the adjustment operation and the adjustment range, wherein a blurring area of the preview image obtained by the camera under the action of the target aperture value and the focal length is different from a blurring area of the preview image obtained by the camera under the action of the current aperture and the focal length.
2. The method of claim 1, further comprising:
acquiring a photographing instruction;
and responding to the photographing instruction by generating and storing the photographed image based on the target aperture value and the focal length, wherein the blurring area of the photographed image is consistent with the blurring area of the preview image obtained by the camera under the action of the target aperture value and the focal length.
3. The method of claim 1, further comprising:
and displaying the mark of the adjustment range in the process of displaying the preview image in real time through a display screen, wherein the mark is used for prompting the adjustable boundary of the blurring area.
4. The method of claim 1, when the subject moves, further comprising:
determining a shot object positioned in the acquisition space of the camera, and determining the distance from the shot object to the camera;
or, searching the shot object in the acquisition space of the camera, and detecting to obtain the distance from the shot object to the camera;
and determining the focal length of the camera according to the distance from the shot object to the camera.
5. The method of claim 4, further comprising:
determining the distance range of the clear area according to the shape of the shot object after moving;
determining a target aperture value based on a distance range of the clear region;
and the target aperture value and the focal length re-determine a blurring area and a clear area of a preview image of the camera.
6. The method according to any one of claims 1 to 5, wherein the determining a target aperture value based on the adjustment operation and the adjustment range further comprises:
and determining the shutter time value and the sensitivity value of the camera.
7. The method of any of claims 1 to 5, further comprising:
determining the variation range of the clear area according to the adjustment range of the camera;
marking a maximum clear area and/or a minimum clear area in the preview image according to the variation range of the clear area.
8. An information processing apparatus, the apparatus comprising:
the preview image determining module is used for obtaining a preview image in an acquisition space of a camera in real time through the camera, wherein the preview image comprises a blurring area and a clear area; the blurring area of the preview image is an area generated according to the focal length of the camera and the current aperture;
an adjustment range determining module for determining an adjustment range based on the maximum aperture of the camera and the minimum aperture of the camera;
the display module is used for displaying the preview image in real time through a display screen;
the operation acquisition module is used for acquiring adjustment operation; the adjusting operation is used for changing the range of the blurring region;
and the blurring module is used for determining a target aperture value based on the adjustment operation and the adjustment range, wherein the blurring area of the preview image obtained by the camera under the action of the target aperture value and the focal length is different from the blurring area of the preview image obtained by the camera under the action of the current aperture and the focal length.
9. A computer-readable storage medium storing a computer program for executing the information processing method according to any one of claims 1 to 7.
10. An electronic device, comprising:
a processor;
a memory for storing the processor-executable instructions;
the processor is used for reading the executable instructions from the memory and executing the instructions to implement the information processing method of any one of claims 1 to 7.
CN202110972977.XA 2021-08-24 2021-08-24 Information processing method and device Active CN113709366B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110972977.XA CN113709366B (en) 2021-08-24 2021-08-24 Information processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110972977.XA CN113709366B (en) 2021-08-24 2021-08-24 Information processing method and device

Publications (2)

Publication Number Publication Date
CN113709366A (en) 2021-11-26
CN113709366B CN113709366B (en) 2022-11-22

Family

ID=78654320

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110972977.XA Active CN113709366B (en) 2021-08-24 2021-08-24 Information processing method and device

Country Status (1)

Country Link
CN (1) CN113709366B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007249000A (en) * 2006-03-17 2007-09-27 Casio Comput Co Ltd Imaging apparatus, imaging method, and imaging program
CN101656817A (en) * 2008-08-19 2010-02-24 株式会社理光 Image processing apparatus, image processing process and image processing procedures
CN104219445A (en) * 2014-08-26 2014-12-17 小米科技有限责任公司 Method and device for adjusting shooting modes
CN112654956A (en) * 2018-09-11 2021-04-13 苹果公司 User interface for simulating depth effects
CN112887606A (en) * 2021-01-26 2021-06-01 维沃移动通信有限公司 Shooting method and device and electronic equipment

Also Published As

Publication number Publication date
CN113709366B (en) 2022-11-22

Similar Documents

Publication Publication Date Title
US9998650B2 (en) Image processing apparatus and image pickup apparatus for adding blur in an image according to depth map
CN103197491B (en) The method of fast automatic focusing and image collecting device
JP5589548B2 (en) Imaging apparatus, image processing method, and program storage medium
JP6887960B2 (en) Systems and methods for autofocus triggers
JP5054063B2 (en) Electronic camera, image processing apparatus, and image processing method
US10212347B2 (en) Image stabilizing apparatus and its control method, image pickup apparatus, and storage medium
KR20080027443A (en) Imaging apparatus, control method of imaging apparatus, and computer program
CN110546943B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
JP5278564B2 (en) Imaging device
JP2003046844A (en) Highlighting method, camera, and focus highlighting system
US20170318224A1 (en) Image stabilization apparatus, its control method, image capturing apparatus, and storage medium
JP2001272591A (en) Electronic still camera
JP2009141475A (en) Camera
JP2019083364A (en) Image processing device, imaging device, and control method
CN106922181B (en) Direction-aware autofocus
CN113709366B (en) Information processing method and device
JP6346484B2 (en) Image processing apparatus and control method thereof
US20160198084A1 (en) Image pickup apparatus, operation support method, and medium recording operation support program
JP2015167308A (en) Photometry method suitable for face detection autofocus control
WO2022266915A1 (en) Focusing control method and device for lens and photographing device
JP2019197295A (en) Image processing device, image processing method, and program
US10681274B2 (en) Imaging apparatus and control method thereof
US20220292692A1 (en) Image processing apparatus and method of processing image
JP2011071671A (en) Image recognition device and imaging apparatus
JP2018191205A (en) Control apparatus, imaging apparatus, control method, program, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant