CN116686299A - Focusing method, photographing apparatus, photographing system, and readable storage medium - Google Patents

Focusing method, photographing apparatus, photographing system, and readable storage medium

Info

Publication number
CN116686299A
CN116686299A
Authority
CN
China
Prior art keywords
lens position
target
focusing
photographing
shooting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202180087692.6A
Other languages
Chinese (zh)
Inventor
滕文猛
胡涛
朱张豪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Publication of CN116686299A publication Critical patent/CN116686299A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Lens Barrels (AREA)
  • Studio Devices (AREA)

Abstract

A focusing method, a photographing apparatus, a photographing system, and a readable storage medium. The method comprises the following steps: acquiring, in real time during focusing of a photographing apparatus, a current lens position of the photographing apparatus and a target lens position of the photographing apparatus, wherein, when the photographing apparatus is at the target lens position, a shooting target is in an in-focus state in the shooting picture of the photographing apparatus; and displaying the current lens position and focusing prompt information through a display device, so as to prompt a user to focus the photographing apparatus according to the current lens position and the focusing prompt information and to adjust the current lens position to the target lens position through a user operation, wherein the focusing prompt information comprises an estimated lens position range formed based on the target lens position. Because the method provides focusing prompt information, manual focusing can be carried out more accurately; moreover, the prompt information does not depend on the image sensor, so the method has a wider application range.

Description

Focusing method, photographing apparatus, photographing system, and readable storage medium

Technical Field
The present application relates generally to the field of manual focusing, and more particularly, to a focusing method, a photographing apparatus, a photographing system, and a readable storage medium.
Background
Manual focusing is a common focusing mode in the field of cinema cameras (video cameras): during focus pulling, a focus follower motor is controlled by rotating a focus follower wheel, and the movement of the focus follower motor changes the focus of the picture. However, current manual focusing relies heavily on the experience of the focus puller to control the focusing position and speed.
Disclosure of Invention
In the summary, a series of concepts in a simplified form are introduced, which will be further described in detail in the detailed description. The summary of the application is not intended to define the key features and essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
An embodiment of the present application provides a focusing method, where the method includes:
acquiring a current lens position of shooting equipment and a target lens position of the shooting equipment in real time in a focusing process of the shooting equipment, wherein when the shooting equipment is positioned at the target lens position, a shooting target is in a focusing state in a shooting picture of the shooting equipment;
and displaying the current lens position and focusing prompt information through a display device, so as to prompt a user to focus the shooting equipment according to the current lens position and the focusing prompt information and to adjust the current lens position to the target lens position through a user operation, wherein the focusing prompt information comprises an estimated lens position range formed based on the target lens position.
The second aspect of the embodiment of the application provides a focusing method, which comprises the following steps:
acquiring a first lens position and a second lens position for enabling a shooting target to be in a focusing state in a shooting picture of shooting equipment, wherein the first lens position is determined based on a first focusing method, the second lens position is determined based on a second focusing method, and the first focusing method is different from the second focusing method;
and determining a target lens position according to the first lens position and the second lens position, so that the shooting equipment can be adjusted to the target lens position from the current lens position.
A third aspect of an embodiment of the present application provides a photographing apparatus, including:
At least one processor and a memory, the memory storing computer-executable instructions, the at least one processor executing the computer-executable instructions stored by the memory such that executing the computer-executable instructions performs the steps of: acquiring a current lens position of shooting equipment and a target lens position of the shooting equipment in real time in a focusing process of the shooting equipment, wherein when the shooting equipment is positioned at the target lens position, a shooting target is in a focusing state in a shooting picture of the shooting equipment;
and displaying the current lens position and focusing prompt information through a display device, so as to prompt a user to focus the shooting equipment according to the current lens position and the focusing prompt information and to adjust the current lens position to the target lens position through a user operation, wherein the focusing prompt information comprises an estimated lens position range formed based on the target lens position.
A fourth aspect of the present application provides a photographing apparatus, including:
at least one processor and a memory, the memory storing computer-executable instructions, the at least one processor executing the computer-executable instructions stored by the memory such that executing the computer-executable instructions performs the steps of:
Acquiring a first lens position and a second lens position for enabling a shooting target to be in a focusing state in a shooting picture of shooting equipment, wherein the first lens position is determined based on a first focusing method, the second lens position is determined based on a second focusing method, and the first focusing method is different from the second focusing method;
and determining a target lens position according to the first lens position and the second lens position, so that the shooting equipment can be adjusted to the target lens position from the current lens position.
A fifth aspect of an embodiment of the present application provides a photographing system, including: a cradle head and the photographing apparatus of the third aspect; the cradle head is used for bearing the shooting equipment.
A sixth aspect of an embodiment of the present application provides a photographing system, including: a cradle head and the photographing apparatus of the fourth aspect; the cradle head is used for bearing the shooting equipment.
A seventh aspect of the embodiments of the present application provides a readable storage medium having a computer program stored thereon; the computer program, when executed, implements the focusing method as described above.
The application provides a focusing method, a photographing apparatus, a photographing system, and a readable storage medium. In the focusing method, a display device displays a current lens position and focusing prompt information, and a user can focus the photographing apparatus according to the current lens position and the focusing prompt information, so that the current lens position is adjusted to the target lens position through a user operation and the shooting target is in an in-focus state in the shooting picture of the photographing apparatus; the focusing prompt information includes an estimated lens position range formed based on the target lens position. Because the method provides focusing prompt information, manual focusing can be carried out more accurately, without the user having to judge focus by himself from the output image of the image sensor, so the problem of poor focusing caused by poor performance of the image sensor is avoided.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort to a person skilled in the art.
In the drawings:
fig. 1 shows a schematic architecture diagram of a photographing system provided by an embodiment of the present application;
fig. 2 is a schematic diagram illustrating a focusing process in a second focusing method according to an embodiment of the present application;
FIG. 3 is a schematic diagram showing a current lens position and an estimated lens position range displayed in a display device according to an embodiment of the present application;
fig. 4 is a schematic diagram illustrating a focusing state in a display device according to an embodiment of the present application;
FIG. 5 is a flow chart of a focusing method according to an embodiment of the application;
FIG. 6 is a flow chart of a focusing method according to another embodiment of the application;
fig. 7 shows a schematic configuration of a photographing apparatus in another embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, exemplary embodiments according to the present application will be described in detail with reference to the accompanying drawings. It should be apparent that the described embodiments are only some embodiments of the present application and not all embodiments of the present application, and it should be understood that the present application is not limited by the example embodiments described herein. Based on the embodiments of the application described in the present application, all other embodiments that a person skilled in the art would have without inventive effort shall fall within the scope of the application.
In the following description, numerous specific details are set forth in order to provide a more thorough understanding of the present application. It will be apparent, however, to one skilled in the art that the application may be practiced without one or more of these details. In other instances, well-known features have not been described in detail in order to avoid obscuring the application.
It should be understood that the present application may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the application to those skilled in the art.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term "and/or" includes any and all combinations of the associated listed items.
In order to provide a thorough understanding of the present application, detailed structures will be presented in the following description in order to illustrate the technical solutions presented by the present application. Alternative embodiments of the application are described in detail below, however, the application may have other implementations in addition to these detailed descriptions.
A first aspect of the present application provides a focusing method, as shown in fig. 5, including:
step 110: acquiring a current lens position of shooting equipment and a target lens position of the shooting equipment in real time in a focusing process of the shooting equipment, wherein when the shooting equipment is positioned at the target lens position, a shooting target is in a focusing state in a shooting picture of the shooting equipment;
step 120: displaying the current lens position and focusing prompt information through a display device, so as to prompt a user to focus the shooting equipment according to the current lens position and the focusing prompt information and to adjust the current lens position to the target lens position through a user operation, wherein the focusing prompt information comprises an estimated lens position range formed based on the target lens position.
In order to better explain the above focusing method, the photographing apparatus is briefly described below with reference to the drawings. The photographing apparatus may operate independently as the execution body, or it may be mounted on a cradle head that serves as the carrier for shooting. The overall photographing system is described in detail below by taking the case where the photographing apparatus is carried on the cradle head as an example.
Wherein fig. 1 is a schematic architecture diagram of a photographing system provided according to an embodiment of the present application. As shown in fig. 1, the photographing system may include a distance sensor 110 and a photographing apparatus 130.
The distance sensor 110 includes a transmitting device 111, a receiving device 112, and a control device 113, and is configured to control the transmitting device 111 to transmit an optical signal by the control device 113, and receive a reflected optical signal by the receiving device 112, thereby acquiring distance sensing data, so as to obtain a real-time distance between a shooting target and the shooting device 130 according to the distance sensing data. Alternatively, the receiving device 112 may be a receiving array. The sensing data collected by the distance sensor 110 may include measured distances M between the distance sensor 110 and the photographed target corresponding to different sampling moments. The distance difference between the distance sensor 110 and the photographing apparatus 130 is a fixed value L, and the real-time distance between the photographing object and the photographing apparatus 130 can be determined according to M and L. For example, if the reference plane of the distance sensor is located behind the focal plane of the lens, i.e., on the side of the lens facing away from the photographing object, the real-time distance between the photographing apparatus 130 and the photographing object is a value M-L obtained by subtracting the distance difference L from the measured distance M. The calculation process of the real-time distance may be performed by the distance sensor 110, the photographing device 130, or the pan/tilt head, which is not limited in this embodiment.
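As a minimal sketch of the distance calculation just described (the relation d = M - L comes from the text; the function and argument names are hypothetical):

```python
def real_time_distance(measured_distance_m: float, sensor_offset_l: float) -> float:
    """Real-time distance between the photographing apparatus 130 and the shooting target.

    Assumes the ranging reference plane of the distance sensor 110 sits a fixed
    distance L behind the focal plane of the lens, so d = M - L as described above.
    """
    return measured_distance_m - sensor_offset_l


# Example: the sensor reports 3.20 m and its reference plane is 0.05 m behind
# the focal plane, so the target is 3.15 m from the focal plane.
print(real_time_distance(3.20, 0.05))  # 3.15
```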
The photographing system may further include a focus follower motor 120 serving as an actuator for driving the lens 131 of the photographing apparatus 130 to rotate. The focus follower motor 120 includes a position controller 121 and a gear, wherein the position controller 121 is configured to output a torque according to a control command given by a focus follower wheel 140 connected to the focus follower motor 120, or according to a control command given by a controller connected to the distance sensor 110, and to drive, through the gear transmission, the lens 131 engaged with the focus follower motor 120 to focus, so that the shooting target is in an in-focus state in the shooting picture of the photographing apparatus 130. Optionally, the focus follower motor 120 is engaged with the focus ring of the lens 131 of the photographing apparatus 130. The focus follower wheel 140 may be a separate control device, or may be integrated on the cradle head.
In one embodiment, the focus follower motor 120 is communicatively connected to the cradle head, which is configured to receive a control command from a user for the focus follower motor 120. Specifically, an operating member of the focus follower motor 120 may be disposed on a handheld portion of the cradle head, where the operating member is configured to receive a control instruction of the user for the focus follower motor and to drive the focus follower motor according to the control instruction, so as to focus the photographing apparatus 130. The operating member may include, but is not limited to, the focus follower wheel 140.
The photographing apparatus 130 includes a lens 131 and a camera body 132, and the lens 131 is disposed on the camera body 132. After the focus follower motor 120 drives the lens 131 to focus, that is, when the shooting target is in an in-focus state in the shooting picture of the photographing apparatus 130, the photographing apparatus 130 can photograph the shooting target to obtain an image. The camera body 132 may be the body of a hand-held single-lens reflex camera or a mirrorless camera. The lens 131 in this embodiment supports manual focusing.
Optionally, the photographing system may further include a cradle head, with the photographing apparatus 130 and the focus follower motor 120 carried on the cradle head. Using an accessory of the cradle head, the focus follower motor 120 can be fixedly connected to the bottom of the camera body 132; after the lens 131 is mounted on the camera body 132 through the bayonet, the focus follower motor 120 can drive the lens 131 to rotate.
The photographing system may further include a focus follower wheel 140. The focus follower wheel 140 may push its current rotational position and rotational speed data to the focus follower motor 120, thereby rotating the focus follower motor 120 to the designated position corresponding to the focus follower wheel 140. The mapping relationship between the real-time distance from the photographing apparatus 130 to the shooting target and the rotational position of the focus follower motor 120 may be established by parameter calibration, for example by using the focus follower wheel 140 to drive the focus follower motor 120 to reach at least two focusing calibration points.
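One way to realize the calibration described above is piecewise-linear interpolation between the calibration points; this is only an illustrative sketch under that assumption (the patent does not prescribe an interpolation scheme, and all names here are hypothetical):

```python
from bisect import bisect_left


class FocusCalibration:
    """Maps a real-time distance (m) to a focus follower motor rotational position.

    Built from calibration points captured by driving the focus follower motor 120
    with the focus follower wheel 140 to at least two in-focus positions.
    """

    def __init__(self, points):
        # points: list of (distance_m, motor_position) pairs, at least two
        self.points = sorted(points)

    def motor_position_for(self, distance_m: float) -> float:
        distances = [d for d, _ in self.points]
        i = min(max(bisect_left(distances, distance_m), 1), len(distances) - 1)
        (d0, p0), (d1, p1) = self.points[i - 1], self.points[i]
        t = (distance_m - d0) / (d1 - d0)
        # linear interpolation inside the calibrated span, extrapolation outside it
        return p0 + t * (p1 - p0)


calib = FocusCalibration([(1.0, 120.0), (3.0, 300.0), (10.0, 410.0)])
print(calib.motor_position_for(2.0))  # halfway between the first two calibration points
```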
In addition, the display device 150 is used for displaying the current lens position and focusing prompt information to prompt the user how to focus.
In an example, the display device 150 may be disposed on the photographing apparatus 130, for example, an interactive interface inherent to the photographing apparatus 130 for displaying various information, and directly display focusing prompt information to the user through the interactive interface during photographing.
In another example, the display device may be disposed outside the photographing apparatus 130, independent of the photographing apparatus 130, but in communication connection with the photographing apparatus 130, transmit the focus prompt information determined by the photographing apparatus 130 to the display device through the communication connection, and display the focus prompt information to the user.
For example, the display device 150 may also be disposed on a cradle head, such as a handheld portion integrated with the cradle head. Or the display device 150 is detachably arranged on a mounting seat (such as being connected with a handle) of the cradle head, the display device 150 is in communication connection with the cradle head, the shooting equipment 130 acquires focusing prompt information and then transmits the focusing prompt information to the cradle head, and the cradle head transmits the focusing prompt information to the display device 150 and displays the focusing prompt information to a user through the display device 150 for focusing. It should be noted that the display device 150 is not limited to the illustrated mounting positions, and the display device 150 may be disposed at other positions of the cradle head.
In the embodiment of the present application, the photographing apparatus 130 may perform automatic focusing in addition to manual focusing. Based on this, the user can also switch the automatic focus following mode to the manual focus following mode by inputting a selection instruction to perform mode selection on the display device 150, thereby realizing flexible switching between the manual focus following mode and the automatic focus following mode.
In the manual focusing mode, the position of the focus follower motor 120 can be controlled with the focus follower wheel 140 to focus, or the lens can be rotated directly by hand to focus. For example, in one embodiment of the present application, the focus follower motor 120 may obtain a control command for the focus follower motor 120 from a user, drive the focus follower motor 120 according to the control command to output a torque, and drive, through the gear transmission, the lens engaged with the focus follower motor to focus, so that the shooting target is in an in-focus state in the shooting picture of the photographing apparatus. In another embodiment of the present application, the photographing system may not include the focus follower motor 120; instead, the user manually rotates the lens to adjust the current lens position to the target lens position, so as to achieve focusing.
It should be understood that the above designations of the components of the camera system are for identification purposes only and should not be construed as limiting embodiments of the present application.
It should be noted that the photographing apparatus 130 need not be disposed on a cradle head, and the focusing method of the present application may be implemented by the photographing apparatus 130 alone. In that case, the foregoing configuration and operation of the photographing apparatus 130 can be carried over to the stand-alone photographing apparatus 130 wherever they do not depend on the cradle head and are not contradictory, and the details are not repeated here.
In the present application, the execution body of the focusing method may be any processing device with independent computing capability, such as the focus follower motor 120, the distance sensor 110, the photographing apparatus 130, or the cradle head in the embodiment shown in fig. 1. Of course, some steps of the method of this embodiment may be executed by one device and the remaining steps by another device, that is, by any combination of the focus follower motor 120, the distance sensor 110, the photographing apparatus 130, and the cradle head. Taking the combination of the focus follower motor 120 and the cradle head as an example, part of the steps are executed by the focus follower motor 120 and the rest by the cradle head, so that the focus follower motor 120 and the cradle head together execute the focusing method provided in this embodiment.
In step 110, in an embodiment of the present application, as described above, the focus follower motor 120 may be engaged with the focus ring of the lens 131 of the photographing apparatus 130, and the current lens position of the photographing apparatus 130 may then be determined from the current position of the focus follower motor 120. The current position of the focus follower motor 120 can be detected by an angle sensor. Of course, the current lens position of the photographing apparatus may also be determined by other methods, for example calculated from the current degree of focus, depending on the specific focusing method.
The target lens position of the photographing apparatus 130 refers to a focusing position of a lens of the photographing apparatus 130, that is, when the photographing apparatus 130 is at the target lens position, the photographing target is in a focusing state in a photographing screen of the photographing apparatus 130.
In the present application, the focusing prompt information is an estimated lens position range that includes the target lens position; the estimated lens position range is a range of values, not a single point value. For example, in one example of the present application, the estimated lens position range is a numerical range with the target lens position as its midpoint. The method for determining the target lens position and outputting the estimated lens position range formed based on the target lens position includes the following steps: the first lens position and the second lens position are obtained by different methods, respectively, wherein the shooting target can be in an in-focus state in the shooting picture of the photographing apparatus 130 when the photographing apparatus 130 is at the first lens position and at the second lens position, respectively.
Specifically, the first lens position is determined based on a first focusing method, and the second lens position is determined based on a second focusing method, the first focusing method being different from the second focusing method. The estimated lens position ranges respectively determined by the first lens position and the second lens position may be the same or may be partially overlapped or completely different based on the same or different first lens position and second lens position.
Wherein, since the first focusing method is different from the second focusing method and the influence factors of the confidence coefficient are different, the first lens position and the second lens position can have different confidence coefficients, namely, the first lens position has a first confidence coefficient and the second lens position has a second confidence coefficient.
As such, the target lens position may be determined based on the confidence to focus the photographing device 130. Wherein determining the target lens position based on the acquired first lens position and second lens position specifically includes: and comparing the first confidence coefficient with the second confidence coefficient, and taking the one with the higher confidence coefficient in the first lens position and the second lens position as the target lens position.
In the present application, the confidence level may be used to characterize the accuracy of the determined lens position, i.e. the first confidence level may be used to characterize the accuracy of the determined first lens position and the second confidence level may be used to characterize the accuracy of the determined second lens position. The confidence level may be inversely related to the size of the corresponding estimated lens position range, that is, the higher the confidence level is, the smaller the corresponding estimated lens position range is, the lower the confidence level is, and the larger the corresponding estimated lens position range is. Therefore, the problem that jump exists in the estimated lens position range when the accuracy of the lens position determined based on the corresponding focusing method is low can be avoided.
For example, if the estimated lens position range is small, then when the target lens position is adjusted along with the adjustment of the current lens position (e.g., due to real-time switching of the focusing method), the estimated lens position range is adjusted with it and may shift as a whole, for example from the range B-D to the range C-E. If the estimated lens position range is larger, then as the current lens position gets closer to the target lens position and the estimated lens position range is adjusted along with the target lens position, the range may change from A-E to C-E; that is, the range is only adjusted within the original range and becomes more accurate, rather than the whole original range shifting and causing a jump.
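A sketch of one possible update rule consistent with the behaviour described above, where the displayed range only contracts within the previous one when the two overlap; the clamping rule itself is an assumption for illustration, not prescribed by the text:

```python
def update_estimated_range(prev_range, new_range):
    """Return the estimated lens position range to display next.

    prev_range / new_range: (low, high) tuples of lens positions.
    If the ranges overlap, keep only the overlapping part, so a displayed range
    A-E can narrow to C-E instead of jumping; otherwise fall back to the
    freshly estimated range.
    """
    lo = max(prev_range[0], new_range[0])
    hi = min(prev_range[1], new_range[1])
    if lo <= hi:
        return (lo, hi)
    return new_range


# A-E narrows to C-E instead of the whole range shifting.
print(update_estimated_range((0.1, 0.9), (0.5, 0.9)))  # (0.5, 0.9)
```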
In the application, since the first focusing method and the second focusing method are different methods, the types of the obtained first confidence coefficient and the second confidence coefficient are different, the method further comprises the step of normalizing the first confidence coefficient and the second confidence coefficient, and comparing the first confidence coefficient and the second confidence coefficient under the same unit measurement through normalization processing to determine the higher confidence coefficient of the first lens position and the second lens position.
In the present application, the target lens position may be determined according to other methods in addition to determining the target lens position by comparing the level of confidence. For example, a weighting method may be adopted, that is, the first lens position and the second lens position are respectively provided with corresponding weights, and the weighted result thereof may be used as the target lens position. The weights of the first lens position and the second lens position can be dynamically adjusted according to the confidence level.
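The selection and weighting rules described above could look like the following sketch; the min-max normalization to [0, 1] and the particular weighting are assumptions chosen only for illustration:

```python
def normalize(value, lo, hi):
    """Map a raw confidence onto [0, 1] so the two focusing methods are comparable."""
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))


def target_lens_position(pos1, conf1, pos2, conf2, weighted=False):
    """Fuse the first and second lens positions into the target lens position.

    conf1 / conf2 are assumed to be normalized to [0, 1].
    weighted=False: take the position with the higher confidence.
    weighted=True : confidence-weighted average of the two positions.
    """
    if weighted:
        total = conf1 + conf2
        return (pos1 * conf1 + pos2 * conf2) / total if total else (pos1 + pos2) / 2
    return pos1 if conf1 >= conf2 else pos2


c1 = normalize(18.0, 0.0, 30.0)   # e.g. an SNR-based first confidence
c2 = normalize(0.8, 0.0, 1.0)     # e.g. a curvature-based second confidence
print(target_lens_position(0.42, c1, 0.47, c2))                  # higher-confidence pick
print(target_lens_position(0.42, c1, 0.47, c2, weighted=True))   # weighted fusion
```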
The first focusing method includes using a distance sensor 110 capable of providing multi-point ranging, having a higher resolution, capable of providing depth map information, calculating an image distance when knowing an object distance according to a gaussian imaging formula, and knowing the image distance can know a current focusing state.
Specifically, in an embodiment of the present application, a shooting target refers to an object to be imaged, such as a person, an object, a scene, or the like. The camera body will generally mark the position of the focal plane, and the real-time distance between the photographing apparatus 130 and the photographing target may be the distance between the focal plane of the camera body of the photographing apparatus 130 and the photographing target, denoted by d, so the relationship between d and the object distance u and the image distance v may be expressed as:
d=u+v (1)
When the photographing target is in an in-focus state in the shooting picture of the photographing apparatus 130, the image distance and the object distance satisfy the Gaussian imaging formula:
1/f = 1/u + 1/v (2)
where f is the focal length of the lens. From equation (1) and equation (2), the image distance can be obtained as:
v = (d - √(d² - 4fd)) / 2 (3)
in an embodiment of the present application, determining the first lens position based on the first focusing method includes: determining a distance d between the photographing object and the photographing apparatus 130 by the distance sensor 110; calculating a focusing object distance according to the distance d; then, a target image distance is calculated according to the focusing object distance according to formulas (2) and (3), and a first lens position is determined according to the target image distance.
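A numerical sketch of the step just described, assuming the focal length f is known; the image distance is the smaller root of v² - dv + fd = 0 implied by equations (1) and (2):

```python
import math


def image_distance(d: float, f: float) -> float:
    """Solve d = u + v and 1/f = 1/u + 1/v for the image distance v.

    d: real-time distance between the focal plane and the shooting target (m)
    f: focal length of the lens (m)
    Returns the smaller root, the physically meaningful image distance when u > v.
    """
    disc = d * d - 4.0 * f * d
    if disc < 0:
        raise ValueError("target closer than the minimum focusing distance (d < 4f)")
    return (d - math.sqrt(disc)) / 2.0


# 50 mm lens, target 3 m from the focal plane -> image distance just over 50 mm,
# which is then mapped to the first lens position via the lens calibration.
print(image_distance(3.0, 0.050))
```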
In this embodiment, there may be various mounting positions of the distance sensor 110, as long as the real-time distance between the photographing apparatus 130 and the photographing target can be determined according to the sensing data collected by the distance sensor 110:
in one implementation, the distance sensor 110 is detachably disposed on the photographing apparatus 130. For example, the distance sensor 110 may be provided on a hot shoe of the photographing device 130, in which case the ranging reference plane of the distance sensor 110 and the focal plane of the lens of the photographing device 130 may be substantially coincident within a certain error range. Of course, the distance measurement reference surface of the distance sensor 110 may be installed at a fixed distance from the focal plane of the lens of the photographing apparatus 130, as long as the fixed distance is taken into consideration in the subsequent calculation. For example, the reference plane of the distance sensor 110 is located behind the focal plane of the lens, i.e. on the side facing away from the shooting target, and the distance difference between the reference plane of the distance sensor 110 and the focal plane of the lens is a fixed value L, then the real-time distance between the shooting device 130 and the shooting target is a value M-L obtained by subtracting L from the measured distance M of the distance sensor 110. This embodiment is not limited thereto.
In another implementation, the photographing apparatus 130 is detachably carried on the carrying seat of the cradle head, and the distance sensor 110 is detachably carried on the carrying seat of the cradle head. In this way, by setting the cradle head, the photographing device 130 and the distance sensor 110 are both carried by the cradle head, so that on one hand, the balance of the photographing device 130 can be ensured by utilizing the balance adjusting function of the cradle head, and on the other hand, the photographing device 130 and the distance sensor 110 can be ensured to be set relatively fixedly, thereby realizing accurate calculation of the real-time distance between the photographing device 130 and the photographing target.
Alternatively, the shooting target in the present embodiment may be selected by the user. In some embodiments, the shooting device 130 is detachably supported on a supporting seat of a cradle head, the cradle head is in communication connection with the shooting device 130, a display device 150 is disposed on a handheld portion of the cradle head, and the shooting target is determined by detecting a selection operation of a user on the display device 150 for displaying a shot image of the shooting device 130.
In some embodiments, the distance sensor 110 includes a transmitting device for transmitting an optical signal and a receiving device for receiving an optical signal reflected by a shooting target, and correspondingly, acquiring a real-time distance between the shooting device 130 and the shooting target according to the sensed data acquired by the distance sensor 110 includes: the distance between the photographing target and the photographing apparatus 130 is determined according to the optical signal received by the receiving device.
In some other embodiments, the distance sensor 110 may include a transmitting device for transmitting an acoustic signal and a receiving device for receiving the acoustic signal reflected by the shooting target; accordingly, acquiring the real-time distance between the photographing apparatus 130 and the shooting target according to the sensed data acquired by the distance sensor 110 includes: determining the distance between the shooting target and the photographing apparatus 130 according to the acoustic signal received by the receiving device.
In this embodiment, the distance sensor 110 may be a single-point distance sensor or a 3D distance sensor, and the distance sensor 110 may be a TOF (Time Of Flight) distance sensor. This embodiment is not limited thereto.
In the first focusing method, focusing or focus following generally keeps the position of the shooting target and the position of the focal plane of the camera body of the photographing apparatus 130 unchanged: rotating the focus ring of the lens of the photographing apparatus 130 causes the screw thread on the inner wall of the lens to convert the rotation angle into a front-back translation of the lens group. In other words, with the real-time distance between the photographing apparatus 130 and the shooting target unchanged, the lens is driven to rotate manually or by the focus follower motor 120, and the object distance u and the image distance v are adjusted to bring the shooting picture into focus. The effect of the first focusing method is not affected in dark-light or low-contrast environments. Further, since it does not rely on a CMOS sensor with phase-detection pixels or phase ranging (more focusing pixels and better focusing performance, but poorer image quality), no image quality is sacrificed on the image-sensor side, and the method remains usable even with a high degree of background blur.
The first confidence is positively correlated with the signal-to-noise ratio of the collected data of the distance sensor 110, that is, the smaller the distortion caused by the noise of the distance sensor 110, the greater the intensity of the reflected light of the distance sensor 110, and the higher the signal-to-noise ratio of the distance sensor 110, the higher the first confidence.
The second focusing method is a focusing method based on ambiguity, and the method focuses by using a Point Spread Function (PSF), wherein the Point Spread Function (PSF) is a function describing the resolving power of an optical system to a point source. Since the point source forms an enlarged image point by diffraction after passing through any optical system, the image information can be extracted more accurately by measuring the point spread function of the system.
Focusing based on blur degree calculates a focusing curve from images shot at different focusing positions and their respective blur degrees (for example, represented by the PSF), and then obtains the in-focus position. With
Picture1 = Picture0 ⊗ PSF1 and Picture2 = Picture0 ⊗ PSF2 (where ⊗ denotes convolution),
it follows that, when PSF1 equals PSF1(d) and PSF2 equals PSF2(d),
Picture1 ⊗ PSF2(d) = Picture2 ⊗ PSF1(d).
Here Picture0 is the ideal picture in the in-focus state; Picture1 is the image taken at the first position and PSF1 is the first blur degree of the lens when the shooting target is shot at the first position; Picture2 is the image taken at the second position and PSF2 is the second blur degree of the lens when the shooting target is shot at the second position; PSF1(d) and PSF2(d) are the blur degrees calibrated for the lens of the photographing apparatus 130 over the focusing stroke, that is, the blur degrees in a PSF database. In the manual focusing process, two images Picture1 and Picture2 at adjacent positions are taken, all PSFs in the database are traversed to find the PSF1(d) and PSF2(d) that satisfy the above equality, the focusing curve corresponding to these PSF1(d) and PSF2(d) is determined, and the in-focus point can then be found, as shown in fig. 2.
In an embodiment of the present application, as shown in fig. 2, in the second focusing method, the shooting position of the shooting device 130 is calibrated on the focusing stroke, and different shooting positions correspond to different image ambiguities. Therefore, when focusing, the photographing apparatus 130 may be controlled to photograph the photographing object at a first position and a second position, respectively, the first position being different from the second position; the first ambiguity of the lens when the shooting target is shot at the first position and the second ambiguity of the lens when the shooting target is shot at the second position can be obtained; then, a second lens position is determined based on the acquired first and second ambiguities.
In one embodiment of the application, determining the second lens position based on the first ambiguity and the second ambiguity comprises: determining a curve for focusing according to the first ambiguity and the second ambiguity; and then determining a second lens position according to the curve. Wherein the corresponding position at the vertex in the curve is the target lens position.
The first ambiguity and the second ambiguity may be obtained by using contents corresponding to the above formula, and a method for determining a curve for focusing according to the ambiguity may refer to the prior art, which is not described herein.
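The traversal described above might be sketched as follows. This is an illustrative 1-D version with a hypothetical PSF database keyed by calibrated lens positions; the real method operates on 2-D images and fits a focusing curve rather than taking a plain arg-min:

```python
import numpy as np


def second_lens_position(picture1, picture2, psf_db):
    """Find the calibrated lens position whose PSF pair best explains the two images.

    picture1 / picture2: 1-D image signals taken at two nearby lens positions.
    psf_db: {lens_position d: (psf1_d, psf2_d)} calibrated blur kernels.
    For the correct d, picture1 (*) psf2(d) equals picture2 (*) psf1(d), where (*)
    denotes convolution, so we pick the d minimizing the residual between the sides.
    """
    best_d, best_err = None, float("inf")
    for d, (psf1_d, psf2_d) in psf_db.items():
        lhs = np.convolve(picture1, psf2_d, mode="same")
        rhs = np.convolve(picture2, psf1_d, mode="same")
        err = float(np.sum((lhs - rhs) ** 2))
        if err < best_err:
            best_d, best_err = d, err
    return best_d
```

In the method described in this application, the matching blur pair defines a focusing curve whose vertex gives the second lens position; the arg-min above simplifies that final step.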
It should be noted that, in the present application, besides determining the target lens position through a curve, the target lens position may also be obtained by acquiring a focusing data table and by looking up a table, and the focusing method is merely exemplary.
The second focusing method has a second confidence coefficient, the second confidence coefficient is positively correlated with the curvature of the curve, namely, the larger the curvature of the focusing curve is, the smaller the opening degree is, the higher the second confidence coefficient is, and conversely, the smaller the curvature of the focusing curve is, the larger the opening degree is, and the lower the second confidence coefficient is.
Optionally, the focusing prompt information further includes, but is not limited to, at least one of a moving direction and a moving distance of the lens of the photographing apparatus 130. The current lens position of the photographing apparatus 130 and the estimated lens position range may be displayed on the display device 150, and specific values of the current lens position and of the estimated lens position range may also be displayed, so that the user can determine the direction in which the lens needs to be moved and the distance it needs to be moved according to those specific values and the positional relationship between the two.
In an embodiment of the present application, as shown in fig. 3, the display device 150 displays that the photographing apparatus 130 is currently located at the position of 0.3, and the value of the estimated lens position range is 0.4-0.7, and the position relationship between the two can be used to know how the lens of the photographing apparatus 130 needs to be adjusted to enter the estimated lens position range. In addition, for further presentation, the direction of movement may also be indicated in the display device 150.
When the current lens position is overlapped with the target lens position, the display mark corresponding to the current lens position can be changed to prompt the user that the current lens position reaches the target lens position, and the rotation of the lens can be stopped.
The related information such as the current lens position, the estimated lens position range, the moving direction and the like can have respective display marks, and different display marks can have corresponding display modes such as color, shape, dynamic and static changes and the like.
Specifically, the display device may display a position adjustment progress bar, and distance data may be displayed above the progress bar, where the distance data characterizes the current lens position and the estimated lens position. The progress bar carries a mark for the current lens position and a mark for the estimated lens position range; during focusing, the mark of the current lens position moves closer to or farther from the mark of the estimated lens position range, and the two marks may differ in color and size. Meanwhile, a moving-direction mark is arranged on one side of the current lens position mark and indicates the direction in which the current lens position should move, so as to approach the estimated lens position range and coincide with the target lens position. In this way, the user can conveniently accomplish manual focusing with the help of the display.
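As a rough sketch of the kind of display just described; this text rendering is purely illustrative, since the application does not define a concrete rendering format:

```python
def render_focus_bar(current, range_lo, range_hi, width=40):
    """Render a one-line text version of the position adjustment progress bar.

    current, range_lo, range_hi: lens positions normalized to [0, 1].
    The current lens position is marked '|', the estimated lens position range '=',
    and an arrow hints which way to turn the lens or focus follower wheel.
    """
    cells = [" "] * width
    lo_i, hi_i = int(range_lo * (width - 1)), int(range_hi * (width - 1))
    for i in range(lo_i, hi_i + 1):
        cells[i] = "="
    cells[int(current * (width - 1))] = "|"
    if current < range_lo:
        hint = "-> increase lens position"
    elif current > range_hi:
        hint = "<- decrease lens position"
    else:
        hint = "within estimated range"
    return "[" + "".join(cells) + f"] cur={current:.2f} range={range_lo:.2f}-{range_hi:.2f} {hint}"


print(render_focus_bar(0.3, 0.4, 0.7))  # matches the example of fig. 3
```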
The change in the display identifier when the current lens position coincides with the target lens position includes at least one of a color change, a shape change, and a size change. The color change may further include a color change or a dynamic blinking of a display mark, and is not limited to a certain one.
In the first focusing method, because the target lens position converges, the mark of the estimated lens position range corresponding to the target lens position keeps converging during the focusing process. As shown in fig. 4, when focus is achieved, the current lens position and the target lens position coincide, and both can be highlighted, for example, with a bright green area.
In the second focusing method, the lens positions of two adjacent frames are sampled, and a focusing position range is displayed according to the result of the current frame (an estimated lens position range calculated with the target lens position as its center, the size of which is determined by the second confidence). The current lens position and the estimated lens position range are updated every frame, so as the target lens position converges, the mark corresponding to the estimated lens position range also keeps converging during the focusing process. When focus is achieved, as shown in fig. 4, the current lens position and the target lens position coincide and can likewise be highlighted with a bright green area.
In the present application, the focusing methods include the first focusing method and the second focusing method, each of which has its own advantages; the present application combines the advantages of the two to further improve focusing accuracy. For example, the first focusing method based on the distance sensor 110 determines the first lens position with sufficient resolution and accuracy as long as the distance to be measured lies within a certain range, so manual focusing assistance can be satisfied within the specification of the first focusing method. When the first focusing method is out of specification, the second focusing method is used to estimate the in-focus position, thereby improving the effect of manual focusing assistance.
For example, the maximum ranging range in the first focusing method does not exceed 10 meters, but part of lenses need to satisfy manual focusing assistance of 10 meters or more; the accuracy of determining the first lens position by the general first focusing method is about 5%, but the accuracy requirement is very high and may be about 1% when part of the lenses are at a relatively short distance. Therefore, in the case where the first focusing method cannot satisfy the in-focus estimation, the in-focus position can be estimated using the second focusing method.
Under static conditions, the second focusing method determines the second lens position with an accuracy within [-1Fδ, 1Fδ] for a high-contrast scene, and within [-2Fδ, 2Fδ] for a low-contrast scene, where Fδ is the size of the circle of confusion (diffuse spot); this meets the requirements of manual focusing assistance. However, for moving scenes, especially with vigorous motion, the accuracy with which the second focusing method determines the second lens position is reduced, and focusing can then be performed by the first focusing method. A high-contrast scene is one in which the shooting target and the environmental background have a large contrast and/or the illumination conditions are good, for example a black background with a white shooting target under good lighting; conversely, if the colors of the shooting target and the background are similar and the illumination conditions are poor, the scene is a low-contrast scene.
It should be noted that, the relation between the accuracy of determining the first lens position by the first focusing method and the accuracy of determining the second lens position by the second focusing method and the application scene can be finally reflected on the confidence coefficient of the description, so that the target lens position selected based on the confidence coefficient can be better used for focusing the application scenes with different dynamic and static states, different contrast ratios and different distances.
Therefore, the application combines the advantages of the second focusing method and the first focusing method to assist manual focusing, thereby improving the accuracy.
As described above, the target lens position lies within the estimated lens position range; in an embodiment of the present application, the target lens position is the midpoint of the estimated lens position range. When the first focusing method is out of specification and cannot meet the accuracy requirement, the second focusing method is used to estimate the in-focus position. Because the accuracy requirements differ between scenes of different contrast, the size of the estimated lens position range also differs accordingly, which satisfies the manual focusing requirement and improves focusing accuracy.
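A sketch consistent with the description above, building the estimated lens position range symmetrically around the target lens position with a half-width taken from the scene-dependent accuracy figures quoted earlier; the exact mapping is an illustrative assumption:

```python
def estimated_range(target_pos, f_delta, high_contrast=True):
    """Estimated lens position range centered on the target lens position.

    f_delta: size of the circle of confusion (Fδ), expressed in lens-position units.
    High-contrast static scenes get a +/-1 Fδ half-width, low-contrast scenes +/-2 Fδ,
    matching the accuracy figures quoted for the second focusing method.
    """
    half_width = (1 if high_contrast else 2) * f_delta
    return (target_pos - half_width, target_pos + half_width)


print(estimated_range(0.55, 0.02))                       # (0.53, 0.57)
print(estimated_range(0.55, 0.02, high_contrast=False))  # (0.51, 0.59)
```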
In the focusing method, the display device 150 displays the current lens position and focusing prompt information, wherein the focusing prompt information includes a predicted lens position range formed based on the target lens position, and the user can focus the photographing apparatus 130 according to the current lens position and the focusing prompt information, so that the current lens position is adjusted to the target lens position through the user operation, and the photographing target is in a focusing state in the photographing picture of the photographing apparatus 130. The method provides the focusing prompt information, can more accurately perform manual focusing through the focusing prompt information, and has wider application range.
A second aspect of the embodiment of the present application provides a focusing method, as shown in FIG. 6, including:
step S210: acquiring a first lens position and a second lens position for enabling a shooting target to be in a focusing state in a shooting picture of shooting equipment, wherein the first lens position is determined based on a first focusing method, the second lens position is determined based on a second focusing method, and the first focusing method is different from the second focusing method;
step S220: and determining a target lens position according to the first lens position and the second lens position, so that the shooting equipment can be adjusted to the target lens position from the current lens position.
The focusing method may be either automatic focusing or manual focusing, and is not limited herein.
For example, in one embodiment of the present application, a focus follower motor is coupled to a lens of a photographing apparatus, and the focus follower motor is used to drive the lens of the photographing apparatus to rotate, so that the photographing apparatus is adjusted from a current lens position to the target lens position.
Optionally, the photographing device and the focus follower motor are carried on the cradle head. For example, the camera device is in communication connection with a cradle head, wherein: the display device is arranged on the handheld part of the cradle head; or (b)
The display device is detachably arranged on the mounting seat of the cradle head and is in communication connection with the cradle head.
Optionally, the focus follower motor is in communication connection with the cradle head, an operating part of the focus follower motor is arranged on a handheld part of the cradle head, and the operating part is used for receiving a control instruction of a user aiming at the focus follower motor.
Optionally, the current lens position and the target lens position are displayed by a display device.
Optionally, the photographing device may not be disposed on the pan-tilt, and the photographing device is independently used as an execution body to perform focusing and photographing.
The specific configuration of the photographing apparatus, the execution body, and the setting manner of the display device may refer to the explanation and the explanation related to the first aspect of the present application, which are not described herein.
Optionally, the method further comprises: the first confidence and the second confidence are normalized for comparison.
Optionally, determining the first lens position based on the first focusing method includes:
determining distance data between a shooting target and shooting equipment through a distance sensor;
Calculating a focusing object distance according to the distance data;
and calculating a target image distance according to the focusing object distance, and determining a first lens position according to the target image distance.
Optionally, the signal-to-noise ratio of the acquired data of the distance sensor is positively correlated with the first confidence.
Optionally, determining the second lens position based on the second focusing method includes:
controlling shooting equipment to respectively shoot a shooting target at a first position and a second position, wherein the first position is different from the second position;
when shooting a shooting target at a first position, acquiring a first ambiguity of an image shot by shooting equipment;
when shooting a shooting target at a second position, acquiring a second ambiguity of an image shot by shooting equipment;
and determining a second lens position according to the first ambiguity and the second ambiguity.
Optionally, determining the second lens position according to the first ambiguity and the second ambiguity includes:
determining a curve for focusing according to the first ambiguity and the second ambiguity;
and determining the second lens position according to the curve.
Optionally, the second confidence level is positively correlated with the curvature of the curve.
Optionally, the size of the estimated lens position range is inversely related to the first confidence level or the second confidence level.
Optionally, the target lens position is the midpoint position of the estimated lens position range.
Optionally, the focusing prompt information further includes a moving direction and/or a moving distance of a lens of the photographing device.
Optionally, the moving distance is characterized by a set distance between the photographic target and the photographic device.
Optionally, when the current lens position is overlapped with the target lens position, the display identifier corresponding to the current lens position is changed.
Optionally, the change comprises at least one of a color change, a shape change, a size change.
In the focusing method, the first focusing method and the second focusing method may refer to the description and explanation related to the first aspect of the present application, and will not be described herein.
The application combines the advantages of the second focusing method and the first focusing method to assist manual focusing and improve the accuracy. The method can more accurately perform manual focusing, is independent of an image sensor, and has a wider application range.
A third aspect of embodiments of the present application provides a photographing apparatus for performing the focusing methods provided in the first and second aspects, respectively.
In one embodiment, as shown in FIG. 7, the photographing apparatus 700 of the present embodiment may include at least one processor 701 and a memory 702, where the processor 701 and the memory 702 are connected by a bus 703.
In a specific implementation process, the memory 702 stores computer-executable instructions, and the at least one processor 701 executes the computer-executable instructions stored in the memory 702, so that the following steps are implemented when the computer-executable instructions are executed:
acquiring the current lens position of the shooting equipment and the target lens position of the shooting equipment in real time in the focusing process of the shooting equipment, wherein when the shooting equipment is positioned at the target lens position, the shooting target is in a focusing state in a shooting picture of the shooting equipment;
the current lens position and focusing prompt information are displayed through the display device so as to prompt a user to focus the shooting equipment according to the current lens position and the focusing prompt information, and the current lens position is adjusted to the target lens position through user operation, wherein the focusing prompt information comprises an estimated lens position range formed based on the target lens position.
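As a minimal sketch of how such an apparatus might run the two steps above during focusing, the loop below repeatedly reads the current lens position, refreshes the target lens position, and pushes both to the display together with the prompt information. The camera and display accessors are hypothetical placeholders (they are not APIs of any particular device), and build_focus_prompt refers to the earlier illustrative sketch:

```python
import time

def focus_assist_loop(camera, display, stop_event, period_s=0.05):
    """Display the current lens position and focusing prompt information in real time
    so the user can manually drive the lens toward the target lens position."""
    while not stop_event.is_set():
        current = camera.read_current_lens_position()                 # hypothetical accessor
        target, confidence = camera.estimate_target_lens_position()   # hypothetical accessor
        display.show(current, build_focus_prompt(current, target, confidence))
        time.sleep(period_s)
```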
In another embodiment, a photographing apparatus includes:
at least one processor and a memory, the memory storing computer-executable instructions, the at least one processor executing the computer-executable instructions stored in the memory such that executing the computer-executable instructions performs the steps of:
acquiring a first lens position and a second lens position for enabling a shooting target to be in a focusing state in a shooting picture of shooting equipment, wherein the first lens position is determined based on a first focusing method, the second lens position is determined based on a second focusing method, and the first focusing method is different from the second focusing method;
And determining a target lens position according to the first lens position and the second lens position, so that the shooting equipment can be adjusted to the target lens position from the current lens position.
Specific steps are not described here again, and reference may be made to the relevant content of the first aspect and the second aspect.
A fourth aspect of the present application provides a photographing system, comprising: a cradle head and the photographing apparatus of the third aspect; the cradle head is used for carrying the photographing apparatus.
The photographing system of the present embodiment may be used to execute the technical solutions of the above method embodiments of the present application; its implementation principle and technical effects are similar and are not repeated here.
It should be noted that the technical solutions of the above method embodiments of the present application may also be implemented by a combination of devices. For example, some steps of the focusing method provided in the method embodiments may be performed by the cradle head, while other steps are performed by the focus follower motor or manually by the user. The embodiments of the present application are not limited in this respect.
A fifth aspect of the present application provides a computer storage medium having a computer program stored thereon. The computer program comprises at least one piece of code executable by a computer to control the computer to perform the focusing methods described above.
In an embodiment of the present application, a computer storage medium is further provided, in which program instructions are stored; when the program is executed, it may perform part or all of the steps of the focusing method of the first aspect or the second aspect. The readable storage medium may be implemented by any type of volatile or non-volatile memory device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk. A readable storage medium may be any available medium that can be accessed by a general-purpose or special-purpose computer.
An exemplary readable storage medium is coupled to the processor such that the processor can read information from, and write information to, the readable storage medium. In the alternative, the readable storage medium may be integral to the processor. The processor and the readable storage medium may reside in an application-specific integrated circuit (ASIC), or may reside as discrete components in a device.
When the computer program is run, it may implement the functions of the embodiments of the present application (as implemented by the processor) and/or other desired functions, for example, performing the corresponding steps of the focusing methods according to the embodiments of the present application. The computer-readable storage medium may also store various application programs and various data, such as data used and/or generated by those application programs.
Although the illustrative embodiments have been described herein with reference to the accompanying drawings, it is to be understood that the above illustrative embodiments are merely illustrative and are not intended to limit the scope of the present application thereto. Various changes and modifications may be made therein by one of ordinary skill in the art without departing from the scope and spirit of the application. All such changes and modifications are intended to be included within the scope of the present application as set forth in the appended claims.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the technology. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, e.g., the division of the elements is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple elements or components may be combined or integrated into another device, or some features may be omitted or not performed.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the application may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in order to streamline the application and aid in understanding one or more of the various inventive aspects, various features of the application are sometimes grouped together in a single embodiment, figure, or description thereof in the description of exemplary embodiments of the application. However, the method of the present application should not be construed as reflecting an intention that the claimed application requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this application.
It will be understood by those skilled in the art that all of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or units of any method or apparatus so disclosed, may be combined in any combination, except combinations where the features are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the application and form different embodiments. For example, in the claims, any of the claimed embodiments may be used in any combination.
Various component embodiments of the application may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that some or all of the functions of some of the modules according to embodiments of the present application may be implemented in practice using a microprocessor or Digital Signal Processor (DSP). The present application can also be implemented as an apparatus program (e.g., a computer program and a computer program product) for performing a portion or all of the methods described herein. Such a program embodying the present application may be stored on a computer readable medium, or may have the form of one or more signals. Such signals may be downloaded from an internet website, provided on a carrier signal, or provided in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the application, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not denote any order; these words may be interpreted as names.
The foregoing description is merely illustrative of specific embodiments of the present application, and the protection scope of the present application is not limited thereto; any person skilled in the art can readily conceive of variations or substitutions within the technical scope disclosed by the present application. The protection scope of the present application shall be subject to the protection scope of the claims.

Claims (83)

  1. A focusing method, the method comprising:
    Acquiring a current lens position of shooting equipment and a target lens position of the shooting equipment in real time in a focusing process of the shooting equipment, wherein when the shooting equipment is positioned at the target lens position, a shooting target is in a focusing state in a shooting picture of the shooting equipment;
    and displaying the current lens position and focusing prompt information through a display device so as to prompt a user to focus the shooting equipment according to the current lens position and the focusing prompt information, and adjusting the current lens position to the target lens position through user operation, wherein the focusing prompt information comprises a predicted lens position range formed based on the target lens position.
  2. The method of claim 1, wherein a focus follower motor is coupled to a lens of the photographing apparatus, the focus follower motor for driving the lens of the photographing apparatus to rotate according to a user instruction to focus the photographing apparatus.
  3. The method of claim 2, wherein the photographing device and the focus follower motor are carried on a cradle head.
  4. The method of claim 3, wherein the photographing device is communicatively coupled to the cradle head, wherein:
    the display device is arranged on the handheld part of the cradle head; or
    the display device is detachably arranged on the mounting seat of the cradle head, and the display device is in communication connection with the cradle head.
  5. The method of claim 3, wherein the focus follower motor is communicatively connected to the cradle head, and an operating member of the focus follower motor is disposed on a hand-held portion of the cradle head, and the operating member is configured to receive a control command of a user for the focus follower motor.
  6. The method according to claim 1 or 2, wherein the display device is an interactive interface of the photographing apparatus or the display device is communicatively connected to the photographing apparatus.
  7. The method according to claim 1, wherein the method further comprises:
    acquiring a first lens position and a second lens position for enabling the shooting target to be in an in-focus state in a shooting picture of the shooting device, wherein the first lens position is determined based on a first focusing method, the second lens position is determined based on a second focusing method, and the first focusing method is different from the second focusing method;
    and determining the target lens position according to the first lens position and the second lens position.
  8. The method of claim 7, wherein the first lens position has a first confidence level and the second lens position has a second confidence level;
    the determining the target lens position according to the first lens position and the second lens position includes:
    and determining, from the first lens position and the second lens position, the lens position with the higher confidence level as the target lens position.
  9. The method of claim 8, wherein the method further comprises:
    and normalizing the first confidence coefficient and the second confidence coefficient.
  10. The method of claim 8, wherein determining the first lens position based on the first focusing method comprises:
    determining distance data between the shooting target and the shooting equipment through a distance sensor;
    calculating a focusing object distance according to the distance data;
    and calculating the target image distance according to the focusing object distance, and determining the first lens position according to the target image distance.
  11. The method of claim 10, wherein a signal-to-noise ratio of the acquired data of the distance sensor is positively correlated with the first confidence.
  12. The method of claim 7, wherein determining the second lens position based on the second focusing method comprises:
    controlling the shooting equipment to respectively shoot the shooting target at a first position and a second position, wherein the first position is different from the second position;
    acquiring a first ambiguity of an image captured by the photographing device when the photographing target is photographed at the first position;
    acquiring a second ambiguity of an image captured by the photographing device when photographing the photographing target at the second position;
    and determining the second lens position according to the first ambiguity and the second ambiguity.
  13. The method of claim 12, wherein the determining the second lens position from the first ambiguity and the second ambiguity comprises:
    determining a curve for focusing according to the first ambiguity and the second ambiguity;
    and determining the second lens position according to the curve.
  14. The method of claim 13, wherein the second confidence level is positively correlated with a curvature of the curve.
  15. The method of claim 8, wherein the magnitude of the predicted lens position range is inversely related to the first confidence level or the second confidence level.
  16. The method of claim 1, wherein the target lens position is a midpoint position of the range of predicted lens positions.
  17. The method of claim 1, wherein the focusing prompt information further comprises a direction and/or distance of movement of a lens of the photographing device.
  18. The method of claim 17, wherein the distance of movement is characterized by a set distance between the photographic target and the photographic device.
  19. The method of claim 1, wherein the display identifier corresponding to the current lens position changes when the current lens position coincides with the target lens position.
  20. The method of claim 19, wherein the change comprises at least one of a color change, a shape change, a size change.
  21. A focusing method, the method comprising:
    acquiring a first lens position and a second lens position for enabling a shooting target to be in a focusing state in a shooting picture of shooting equipment, wherein the first lens position is determined based on a first focusing method, the second lens position is determined based on a second focusing method, and the first focusing method is different from the second focusing method;
    And determining a target lens position according to the first lens position and the second lens position, so that the shooting equipment can be adjusted to the target lens position from the current lens position.
  22. The method of claim 21, wherein a lens of the photographing device is coupled with a focus follower motor, the focus follower motor for driving the lens of the photographing device to rotate to adjust the photographing device from a current lens position to the target lens position.
  23. The method of claim 22, wherein the photographing device and the focus follower motor are carried on a cradle head.
  24. The method of claim 23, wherein the photographing device is communicatively coupled to the cradle head, wherein:
    the display device is arranged on the handheld part of the cradle head; or
    the display device is detachably arranged on the mounting seat of the cradle head, and the display device is in communication connection with the cradle head.
  25. The method of claim 23, wherein the focus follower motor is communicatively coupled to the cradle head, and an operating member of the focus follower motor is disposed on a hand-held portion of the cradle head, the operating member being configured to receive a control command from a user for the focus follower motor.
  26. The method of claim 21 or 22, wherein the current lens position is displayed by a display device together with focusing prompt information, and the focusing prompt information comprises a predicted lens position range formed based on the target lens position.
  27. The method of claim 26, wherein the display device is an interactive interface of the photographing apparatus or is communicatively coupled to the photographing apparatus.
  28. The method of claim 21, wherein the first lens position has a first confidence level and the second lens position has a second confidence level;
    the determining the target lens position according to the first lens position and the second lens position includes:
    and determining, from the first lens position and the second lens position, the lens position with the higher confidence level as the target lens position.
  29. The method of claim 28, wherein the method further comprises:
    and normalizing the first confidence coefficient and the second confidence coefficient.
  30. The method of claim 28, wherein determining the first lens position based on the first focusing method comprises:
    Determining distance data between the shooting target and the shooting equipment through a distance sensor;
    calculating a focusing object distance according to the distance data;
    and calculating the target image distance according to the focusing object distance, and determining the first lens position according to the target image distance.
  31. The method of claim 30, wherein a signal-to-noise ratio of the acquired data of the distance sensor is positively correlated with the first confidence level.
  32. The method of claim 28, wherein determining the second lens position based on the second focusing method comprises:
    controlling the shooting equipment to respectively shoot the shooting target at a first position and a second position, wherein the first position is different from the second position;
    acquiring a first ambiguity of an image captured by the photographing device when the photographing target is photographed at the first position;
    acquiring a second ambiguity of an image captured by the photographing device when photographing the photographing target at the second position;
    and determining the second lens position according to the first ambiguity and the second ambiguity.
  33. The method of claim 32, wherein the determining the second lens position from the first ambiguity and the second ambiguity comprises:
    Determining a curve for focusing according to the first ambiguity and the second ambiguity;
    and determining the second lens position according to the curve.
  34. The method of claim 33, wherein the second confidence level is positively correlated with a curvature of the curve.
  35. The method of claim 28, wherein the current lens position is displayed via a display device together with focusing prompt information, the focusing prompt information including a predicted lens position range formed based on the target lens position;
    the size of the predicted lens position range is inversely related to the first confidence level or the second confidence level.
  36. The method of claim 26, wherein the target lens position is a midpoint position of the range of predicted lens positions.
  37. The method of claim 26, wherein the focusing prompt information further comprises a direction and/or distance of movement of a lens of the photographing device.
  38. The method of claim 37, wherein the distance of movement is characterized by a set distance between the photographic target and the photographic device.
  39. The method of claim 21, wherein the display identifier corresponding to the current lens position changes when the current lens position coincides with the target lens position.
  40. The method of claim 39, wherein the change comprises at least one of a color change, a shape change, a size change.
  41. A photographing apparatus, characterized in that the photographing apparatus comprises:
    at least one processor and a memory, the memory storing computer-executable instructions, the at least one processor executing the computer-executable instructions stored by the memory such that executing the computer-executable instructions performs the steps of:
    acquiring a current lens position of the shooting equipment and a target lens position of the shooting equipment in real time in a focusing process of the shooting equipment, wherein when the shooting equipment is positioned at the target lens position, a shooting target is in a focusing state in a shooting picture of the shooting equipment;
    and displaying the current lens position and focusing prompt information through a display device so as to prompt a user to focus the shooting equipment according to the current lens position and the focusing prompt information, and adjusting the current lens position to the target lens position through user operation, wherein the focusing prompt information comprises a predicted lens position range formed based on the target lens position.
  42. The photographing apparatus of claim 41, wherein a lens of the photographing apparatus is coupled with a focus follower motor, the focus follower motor for driving the lens of the photographing apparatus to rotate according to a user instruction to focus the photographing apparatus.
  43. The photographing apparatus of claim 42, wherein the photographing apparatus and the focus follower motor are carried on a cradle head.
  44. The photographing apparatus of claim 43, wherein the photographing apparatus is communicatively coupled to the cradle head, wherein:
    the display device is arranged on the handheld part of the cradle head; or
    the display device is detachably arranged on the mounting seat of the cradle head, and the display device is in communication connection with the cradle head.
  45. The photographing apparatus of claim 43, wherein the focus follower motor is communicatively connected to the cradle head, and an operating member of the focus follower motor is disposed on a hand-held portion of the cradle head, and the operating member is configured to receive a control command of a user for the focus follower motor.
  46. The photographing apparatus of claim 41 or 42, wherein the display device is an interactive interface of the photographing apparatus or is communicatively connected to the photographing apparatus.
  47. The photographing apparatus of claim 41, wherein said processor when executing said computer-executable instructions further performs the steps of:
    acquiring a first lens position and a second lens position for enabling the shooting target to be in an in-focus state in a shooting picture of the shooting device, wherein the first lens position is determined based on a first focusing method, the second lens position is determined based on a second focusing method, and the first focusing method is different from the second focusing method;
    and determining the target lens position according to the first lens position and the second lens position.
  48. The photographing device of claim 47, wherein the first lens position has a first confidence level and the second lens position has a second confidence level;
    the determining the target lens position according to the first lens position and the second lens position includes:
    and determining, from the first lens position and the second lens position, the lens position with the higher confidence level as the target lens position.
  49. The photographing apparatus of claim 48, wherein said processor when executing said computer-executable instructions further performs the steps of:
    And normalizing the first confidence coefficient and the second confidence coefficient.
  50. The photographing device of claim 47, wherein determining the first lens position based on the first focusing method comprises:
    determining distance data between the shooting target and the shooting equipment through a distance sensor;
    calculating a focusing object distance according to the distance data;
    and calculating the target image distance according to the focusing object distance, and determining the first lens position according to the target image distance.
  51. The photographing apparatus of claim 50, wherein a signal-to-noise ratio of the acquired data of the distance sensor is positively correlated with the first confidence level.
  52. The photographing device of claim 47, wherein determining the second lens position based on the second focusing method comprises:
    controlling the shooting equipment to respectively shoot the shooting target at a first position and a second position, wherein the first position is different from the second position;
    acquiring a first ambiguity of an image captured by the photographing device when the photographing target is photographed at the first position;
    acquiring a second ambiguity of an image captured by the photographing device when photographing the photographing target at the second position;
    And determining the second lens position according to the first ambiguity and the second ambiguity.
  53. The photographing device of claim 52, wherein the determining the second lens position from the first and second ambiguities comprises:
    determining a curve for focusing according to the first ambiguity and the second ambiguity;
    and determining the second lens position according to the curve.
  54. The photographing apparatus of claim 53, wherein the second confidence level is positively correlated with a curvature of the curve.
  55. The photographing apparatus of claim 48, wherein the size of the predicted lens position range is inversely related to the first confidence level or the second confidence level.
  56. The photographing apparatus of claim 41, wherein the target lens position is a midpoint position of the predicted lens position range.
  57. The photographing apparatus of claim 41, wherein the focusing prompt information further includes a moving direction and/or a moving distance of a lens of the photographing apparatus.
  58. The photographing device of claim 57, wherein the distance of movement is characterized by a set distance between the photographing objective and the photographing device.
  59. The photographing apparatus of claim 41, wherein the display identifier corresponding to the current lens position changes when the current lens position coincides with the target lens position.
  60. The photographing apparatus of claim 59, wherein the change comprises at least one of a color change, a shape change, a size change.
  61. A photographing apparatus, characterized in that the photographing apparatus comprises:
    at least one processor and a memory, the memory storing computer-executable instructions, the at least one processor executing the computer-executable instructions stored by the memory such that executing the computer-executable instructions performs the steps of:
    acquiring a first lens position and a second lens position for enabling a shooting target to be in a focusing state in a shooting picture of shooting equipment, wherein the first lens position is determined based on a first focusing method, the second lens position is determined based on a second focusing method, and the first focusing method is different from the second focusing method;
    and determining a target lens position according to the first lens position and the second lens position, so that the shooting equipment can be adjusted to the target lens position from the current lens position.
  62. The photographing device of claim 61, wherein a lens of the photographing device is coupled with a focus follower motor, the focus follower motor for driving the lens of the photographing device to rotate to adjust the photographing device from a current lens position to the target lens position.
  63. The photographing apparatus of claim 62, wherein the photographing apparatus and the focus follower motor are carried on a cradle head.
  64. The shooting device of claim 63, wherein the shooting device is communicatively coupled to the cradle head, the shooting device further comprising a display device, wherein:
    the display device is arranged on the handheld part of the cradle head; or
    the display device is detachably arranged on the mounting seat of the cradle head, and the display device is in communication connection with the cradle head.
  65. The photographing device of claim 63, wherein the focus follower motor is communicatively connected to the cradle head, and an operating member of the focus follower motor is disposed on a hand-held portion of the cradle head, and the operating member is configured to receive a control command of a user for the focus follower motor.
  66. The photographing apparatus of claim 61 or 62, wherein the current lens position and focusing prompt information are displayed via a display device, the focusing prompt information including a predicted lens position range formed based on the target lens position.
  67. The photographing apparatus of claim 66, wherein the display device is an interactive interface of the photographing apparatus or is communicatively coupled with the photographing apparatus.
  68. The photographing device of claim 61, wherein the first lens position has a first confidence level and the second lens position has a second confidence level;
    the determining the target lens position according to the first lens position and the second lens position includes:
    and determining, from the first lens position and the second lens position, the lens position with the higher confidence level as the target lens position.
  69. The photographing apparatus of claim 68, wherein said processor when executing said computer-executable instructions further performs the steps of:
    and normalizing the first confidence coefficient and the second confidence coefficient.
  70. The photographing device of claim 68, wherein determining the first lens position based on the first focusing method comprises:
    determining distance data between the shooting target and the shooting equipment through a distance sensor;
    calculating a focusing object distance according to the distance data;
    And calculating the target image distance according to the focusing object distance, and determining the first lens position according to the target image distance.
  71. The photographing apparatus of claim 70, wherein a signal-to-noise ratio of the acquired data of the distance sensor is positively correlated with the first confidence level.
  72. The photographing device of claim 68, wherein determining the second lens position based on the second focusing method comprises:
    controlling the shooting equipment to respectively shoot the shooting target at a first position and a second position, wherein the first position is different from the second position;
    acquiring a first ambiguity of an image shot by the shooting equipment when the shooting target is shot at the first position;
    acquiring a second ambiguity of an image shot by the shooting equipment when the shooting target is shot at the second position;
    and determining the second lens position according to the first ambiguity and the second ambiguity.
  73. The photographing device of claim 72, wherein the determining the second lens position from the first and second ambiguities comprises:
    Determining a curve for focusing according to the first ambiguity and the second ambiguity;
    and determining the second lens position according to the curve.
  74. The photographing apparatus of claim 73, wherein the second confidence level is positively correlated with a curvature of the curve.
  75. The photographing apparatus of claim 68, wherein the current lens position and focusing prompt information are displayed by a display device, the focusing prompt information including a predicted lens position range formed based on the target lens position;
    the size of the predicted lens position range is inversely related to the first confidence level or the second confidence level.
  76. The photographing apparatus of claim 66, wherein the target lens position is a midpoint position of the predicted lens position range.
  77. The photographing apparatus of claim 66, wherein the focusing prompt information further comprises a moving direction and/or a moving distance of a lens of the photographing apparatus.
  78. The photographing apparatus of claim 77, wherein the moving distance is characterized by a set distance between the photographing objective and the photographing apparatus.
  79. The photographing device of claim 61, wherein the display identifier corresponding to the current lens position changes when the current lens position coincides with the target lens position.
  80. The photographing device of claim 79, wherein the change comprises at least one of a color change, a shape change, and a size change.
  81. A photographing system, comprising: a cradle head and a camera device according to any one of claims 41-60;
    the cradle head is used for bearing the shooting equipment.
  82. A photographing system, comprising: a cradle head and a shooting apparatus of any one of claims 61-80;
    the cradle head is used for bearing the shooting equipment.
  83. A readable storage medium having a computer program stored thereon, wherein the computer program, when executed, implements the focusing method according to any one of claims 1-40.
CN202180087692.6A 2021-04-09 2021-04-09 Focusing method, photographing apparatus, photographing system, and readable storage medium Pending CN116686299A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/086073 WO2022213340A1 (en) 2021-04-09 2021-04-09 Focusing method, photographic device, photographic system and readable storage medium

Publications (1)

Publication Number Publication Date
CN116686299A true CN116686299A (en) 2023-09-01

Family

ID=83544995

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180087692.6A Pending CN116686299A (en) 2021-04-09 2021-04-09 Focusing method, photographing apparatus, photographing system, and readable storage medium

Country Status (2)

Country Link
CN (1) CN116686299A (en)
WO (1) WO2022213340A1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008046351A (en) * 2006-08-16 2008-02-28 Canon Inc Automatic focusing device and imaging apparatus
KR20140091959A (en) * 2013-01-14 2014-07-23 삼성전자주식회사 A focus Aid System
WO2019037087A1 (en) * 2017-08-25 2019-02-28 深圳市大疆创新科技有限公司 Method and device for assisting manual focusing, and unmanned aerial vehicle
JP6543880B1 (en) * 2018-04-26 2019-07-17 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd CONTROL DEVICE, IMAGING DEVICE, CONTROL METHOD, AND PROGRAM
JP6874251B2 (en) * 2019-07-23 2021-05-19 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Devices, imaging devices, moving objects, methods, and programs

Also Published As

Publication number Publication date
WO2022213340A1 (en) 2022-10-13

Similar Documents

Publication Publication Date Title
JP7145208B2 (en) Method and Apparatus and Storage Medium for Dual Camera Based Imaging
CN109451244B (en) Automatic focusing method and system based on liquid lens
US8335393B2 (en) Image processing apparatus and image processing method
US8055097B2 (en) Image pick-up apparatus, image pick-up program, and image processing program
CN109521547B (en) Variable-step-length automatic focusing method and system
US7702231B2 (en) Autofocus control apparatus and method of controlling the same
US7231143B2 (en) Manual focus device and autofocus camera
US9781334B2 (en) Control method, camera device and electronic equipment
US20190164256A1 (en) Method and device for image processing
JP2003131121A (en) Device and method for automatic focus adjustment
JP2010197551A (en) Imaging apparatus and image synthesis method
JP3937678B2 (en) Electronic still camera
JP2010213038A (en) Imaging apparatus
WO2022061519A1 (en) Photographing control method and apparatus, gimbal, and follow focus motor
JP2002325199A (en) Electronic imaging device
JP4081806B2 (en) Autofocus camera and photographing method
JP2002296494A (en) Image forming position detecting program and camera
TWI393981B (en) Use the flash to assist in detecting focal lengths
JP2007052072A (en) Focus detecting device, optical equipment, and focusing method
CN106878604B (en) Image generation method based on electronic equipment and electronic equipment
CN116686299A (en) Focusing method, photographing apparatus, photographing system, and readable storage medium
WO2022213339A1 (en) Focusing method, photographing device, photographing system, and readable storage medium
JP2009219085A (en) Imaging apparatus
JP6778340B2 (en) Imaging device, imaging method, and program
CN104683694A (en) Terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination