WO2019129077A1 - Focusing Method and Electronic Device - Google Patents

Focusing Method and Electronic Device

Info

Publication number
WO2019129077A1
WO2019129077A1 · PCT/CN2018/123942 · CN2018123942W
Authority
WO
WIPO (PCT)
Prior art keywords
camera
focus
phase
contrast
focus position
Prior art date
Application number
PCT/CN2018/123942
Other languages
English (en)
French (fr)
Inventor
谢琼
蔡西蕾
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2019129077A1 publication Critical patent/WO2019129077A1/zh

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 - Control of cameras or camera modules
    • H04N 23/67 - Focus control based on electronic image sensor signals

Definitions

  • the present application relates to the field of electronic devices, and more particularly to a method and an electronic device for focusing.
  • phase detection auto focus (PDAF).
  • A focus mode of existing electronic devices is: the main camera first uses PDAF to determine its in-focus position; after the main camera has focused, the auxiliary camera determines and adjusts its lens position according to the main camera's in-focus position; finally, the auxiliary camera uses contrast detection auto focus (CDAF) to fine-tune the lens and determine its own in-focus position.
  • Another focus mode of existing electronic devices is: the main camera and the auxiliary camera focus simultaneously, where the main camera uses PDAF to focus and the auxiliary camera focuses using CDAF or by laser-ranging the distance to the subject (which may be referred to as laser focus).
  • However, these focusing modes used by the auxiliary camera have some disadvantages.
  • CDAF suffers from severe focus hunting (the lens repeatedly sweeps back and forth past the focus point), the focusing time is long, and the preview image is blurred during focusing; similarly, laser focusing takes a long time and blurs the preview image during focusing, which degrades the user experience.
  • the present application provides a method and an electronic device for focusing, which can achieve fast focusing and improve user experience.
  • In a first aspect, a method of focusing is provided for use in an electronic device including at least two cameras, where a first camera of the at least two cameras is configured to detect phase information of an image and a second camera of the at least two cameras is configured to detect contrast information of the image, and the method includes:
  • the contrast focus position includes a contrast focus movement direction of the second camera.
  • The process in which the first camera performs phase detection autofocus may refer to all or part of the period from when the first camera starts phase detection focusing until phase detection focusing ends.
  • Alternatively, it may refer to a period before the first camera starts CDAF; the embodiments of the present application are not limited thereto.
  • The processor may determine, according to one or more phase focus positions of the first camera during the first camera's phase detection autofocus, at least one phase focus position of the second camera, and control the second camera to move to the at least one phase focus position. In this way the processor controls the focusing of the two cameras in parallel, so that the second camera, which does not support PDAF, synchronizes with the focusing behavior of the first camera, which does. This achieves fast focusing for the camera without PDAF support, reduces the overall focusing time, and improves the user experience.
  • During the phase detection autofocus of the first camera, the processor controls the second camera to move to the at least one phase focus position; this can also be described as the processor controlling the first camera and the second camera in parallel.
  • Controlling the two cameras in parallel can be understood as controlling them at the same time. It should be understood that "in parallel" or "simultaneously" in the embodiments of the present application does not require the two cameras to be strictly synchronized in time; for example, a certain time interval may be permitted between the moments at which the two cameras are moved. The embodiments of the present application are not limited thereto.
  • For example, the first camera may be moved first, and then, while the first camera is moving, the focus position of the second camera is determined from the focus position of the first camera, after which the second camera is moved.
  • Alternatively, the focus position of the second camera may be determined from the focus position of the first camera first, after which the processor controls the first camera and the second camera to move to their respective focus positions in parallel.
  • the processor can move the camera to a corresponding focus position by controlling the lens motor driver.
  • The processor can control at least two cameras to focus in parallel, so that a camera that does not support PDAF synchronizes with the focusing behavior of a camera that does, achieving fast focusing for the non-PDAF camera, reducing the overall focusing time, and improving the user experience.
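The parallel control described above can be sketched as follows: each phase focus position computed for the first camera is mapped, through a hypothetical calibration between the two lens travels, to a target position for the second camera, and both motors are driven together. All names and the linear mapping are illustrative assumptions, not details from the patent.

```python
def map_to_second_camera(pos_cam1, scale=0.9, offset=5):
    """Hypothetical calibration mapping between the two lens travels."""
    return scale * pos_cam1 + offset

def parallel_focus(phase_positions_cam1):
    """For each phase focus position of camera 1, compute the synchronized
    target for camera 2; each pair would be issued to both motors together."""
    moves = []  # (camera1_target, camera2_target)
    for pos1 in phase_positions_cam1:
        pos2 = map_to_second_camera(pos1)
        moves.append((pos1, pos2))
    return moves

# three successive phase focus positions of the first camera
moves = parallel_focus([120, 105, 101])
```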
  • The one or more phase focus positions of the first camera may include one or more of the phase focus positions reached by the first camera during the phase detection autofocus process.
  • For example, they may include the first phase focus position, one or more intermediate phase focus positions, or the last phase focus position; the embodiments of the present application are not limited thereto.
  • the phase focus position is a focus position calculated based on phase information of an image.
  • the contrast focus position is a focus position calculated based on contrast information of an image.
  • The "focus position" in the embodiments of the present application may include a movement direction and/or a movement distance. The movement direction indicates the direction in which a movable lens, lens group, or lens element in the camera needs to move to obtain a sharp image; the movement distance indicates how far it needs to move in that direction to obtain a sharp image.
  • In other words, the focus position indicates where the movable lens, lens group, or lens element in the camera needs to be in order to obtain a sharp image.
  • The embodiments of the present application may determine the contrast focus movement direction from the recorded contrast information of the image. For example, if the recorded contrast values are trending upward (the contrast curve is improving), the current movement direction may be taken as the contrast focus movement direction.
  • The contrast focus position of the second camera may be determined from the contrast information of the images detected by the second camera at one or more phase focus positions of the at least one phase focus position. That is, in the embodiments of the present application, while the second camera is synchronizing with the first camera, the contrast information of the images detected by the second camera may be recorded, and the contrast focus position of the second camera determined from that contrast information. Merely synchronizing with the first camera may leave the second camera's focus movement direction inaccurate, or may require the second camera to focus by contrast detection after the first camera has finished focusing.
  • By using the recorded contrast information, the embodiments of the present application can ensure the accuracy of the focus movement direction, or determine the contrast focus position directly, avoiding the need to determine the second camera's focus position again from the first camera's focus position or by CDAF; this reduces the focusing time and improves the user experience.
  • After the two cameras have moved to their respective focus positions, the image signal processor can acquire images through the first camera and the second camera and combine them into a final image.
  • The electronic device can also display the final image on the display screen.
  • The image processing after the two cameras have moved to their focus positions can follow existing multi-camera image synthesis algorithms, which are not limited in the embodiments of the present application.
  • the method further includes:
  • controlling the second camera to detect contrast information of the image at the one or more phase focus positions of the at least one phase focus position.
  • Specifically, the processor controls the second camera to detect contrast information of the image when the moving state of the electronic device and/or the image stabilization state of the second camera satisfies a detection condition; when the moving state and/or the image stabilization state does not satisfy the detection condition, the processor does not control the second camera to detect contrast information of the image.
  • The detection condition is satisfied when the electronic device is relatively steady or moving slowly, for example when the gyroscope or accelerometer of the electronic device detects that the device's movement is below a preset movement threshold, and/or when the image is stable, i.e. the image contrast changes little (for example, the contrast change between the image at the second camera's current focus position and that at the previous focus position is below a preset contrast-change threshold). Otherwise, the detection condition is considered not satisfied.
  • In these cases the embodiments of the present application may also regard the captured scene as stationary; otherwise, the captured scene may be regarded as changing, and the embodiments of the present application are not limited thereto.
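The detection condition above can be sketched in a few lines: record contrast only when the gyroscope reports little device motion and the frame contrast has changed little since the previous focus position. The thresholds and parameter names here are illustrative assumptions, not values from the patent.

```python
MOVE_THRESHOLD = 0.05      # hypothetical gyro magnitude limit
CONTRAST_DELTA_MAX = 0.10  # hypothetical relative contrast-change limit

def should_record_contrast(gyro_magnitude, contrast_now, contrast_prev):
    """Detection condition: device steady AND scene (contrast) stable."""
    scene_still = gyro_magnitude < MOVE_THRESHOLD
    if contrast_prev is None:  # first sample: only the gyro is available
        return scene_still
    delta = abs(contrast_now - contrast_prev) / max(contrast_prev, 1e-9)
    return scene_still and delta < CONTRAST_DELTA_MAX
```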
  • The processor thus controls, according to the moving state of the electronic device and/or the image stabilization state of the camera, whether the second camera detects contrast information of the image; for example, the processor determines the contrast information of the images detected by the second camera at the one or more phase focus positions of the at least one phase focus position, and determines the contrast focus position of the second camera from that contrast information.
  • Because contrast information acquired while the scene is still is relatively reliable, the processor can accurately determine the contrast focus position from it. The embodiments of the present application can thereby ensure the accuracy of the focus movement direction, or determine the contrast focus position directly from the recorded contrast information, avoiding determining the second camera's focus position again from the first camera's focus position or by CDAF, which reduces the focusing time and improves the user experience.
  • In other words, the processor controls the second camera to acquire contrast information, and determines its contrast focus position, only while the scene is still.
  • Alternatively, the embodiments of the present application may detect contrast information of the image at all phase focus positions of the second camera, and then extract the valid information from the detected contrast information.
  • the method further includes:
  • determining that the contrast information of the image detected by the second camera at one or more phase focus positions of the at least one phase focus position is valid.
  • When the electronic device is relatively steady or moving slowly, for example when the gyroscope or accelerometer of the electronic device detects that the device's movement is below the preset movement threshold, and/or the image is stable, i.e. the image contrast changes little,
  • for example when the contrast change between the second camera's images at the current focus position and the previous focus position is below the preset contrast-change threshold (i.e., when the captured scene is stationary),
  • the contrast information of the image detected by the second camera is considered valid. Otherwise (i.e., when the captured scene changes), the contrast information of the image detected by the second camera can be considered invalid.
  • When the second camera is stationary at the one or more phase focus positions of the at least one phase focus position, the contrast information recorded at those phase focus positions constitutes continuous valid information; that is, the scene remained still while the second camera was at those phase focus positions.
  • The embodiments of the present application detect the contrast information of the image and treat as valid the contrast information obtained while the scene is still. Such contrast information is relatively reliable, so the processor can accurately determine the contrast focus position from it.
  • The accuracy of the focus movement direction can be ensured by the recorded contrast information, or the contrast focus position can be determined directly from it, avoiding determining the second camera's focus position from the first camera's focus position or by CDAF, which reduces the focusing time and improves the user experience.
  • the contrast focus position of the second camera further includes a contrast focus moving distance of the second camera
  • the method further includes:
  • When the recorded contrast information already covers the contrast peak, the in-focus position of the second camera can be determined from the recorded contrast information (i.e., both the contrast focus movement direction and the movement distance); for example, the in-focus position may be the position corresponding to the peak, or a position obtained by fitting a curve to the recorded contrast samples.
  • When the lens of the second camera must instead be moved by CDAF to determine the in-focus position, the recorded contrast information can still guide the search: if the recorded contrast curve is improving, CDAF can continue moving the lens in the current direction to determine the final in-focus position; if the recorded contrast curve is worsening, CDAF can move the lens in the opposite direction to determine the final in-focus position.
  • When the contrast focus movement direction and movement distance of the second camera can be determined from the recorded contrast information of the image, then regardless of whether the first camera has finished focusing, the second camera can be directly controlled to move the movement distance in the movement direction to complete its focusing.
  • That is, the contrast focus movement direction and contrast focus movement distance of the second camera may be determined from the contrast information of the images detected by the second camera at one or more phase focus positions of the at least one phase focus position. Since the in-focus position of the second camera has already been determined, the processor can directly control the second camera to move the contrast focus movement distance in the contrast focus movement direction to complete focusing, without further making the second camera synchronize with the first camera and without using CDAF to find the in-focus position, which reduces the focusing time and improves the user experience.
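Determining the in-focus position from the recorded (lens position, contrast) samples, either as the peak sample itself or refined by curve fitting, can be sketched as follows. The parabolic interpolation around the discrete peak is one common choice of fit; the patent does not specify a particular fitting method, so treat this as an illustrative assumption.

```python
def contrast_peak_position(samples):
    """samples: list of (lens_position, contrast) recorded while the
    second camera tracked the first camera's phase focus positions.
    Returns the estimated in-focus lens position."""
    positions = [p for p, _ in samples]
    values = [c for _, c in samples]
    i = values.index(max(values))
    if 0 < i < len(values) - 1:
        # refine with a parabola through the peak and its two neighbours
        y0, y1, y2 = values[i - 1], values[i], values[i + 1]
        denom = y0 - 2 * y1 + y2
        if denom != 0:
            shift = 0.5 * (y0 - y2) / denom  # vertex offset in steps
            step = positions[i + 1] - positions[i]
            return positions[i] + shift * step
    return positions[i]
```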
  • the method further includes:
  • determining a next phase focus position of the second camera, where the next phase focus position includes a phase focus movement direction and a phase focus movement distance of the second camera;
  • controlling the second camera to move to the next phase focus position.
  • While the first camera is performing phase detection autofocus (i.e., before the first camera has finished phase focusing) and the first camera has moved to its current phase focus position, as described in step 230, if the distance is still greater than the first threshold, the recorded contrast information can determine the contrast focus movement direction and movement distance of the second camera.
  • the method further includes:
  • In this case, the embodiments of the present application can stop the second camera from synchronizing with the first camera and directly control the second camera to move a preset distance in the contrast focus movement direction using CDAF, for example moving in the contrast focus movement direction with a fixed CDAF step size to perform CDAF focusing.
  • The embodiments of the present application determine the contrast focus movement direction of the second camera from the contrast information of the images while the second camera synchronizes with the first camera. When the phase focusing direction of the second camera is wrong, that is, when the second camera's phase focusing direction is opposite to its contrast focusing direction,
  • the second camera stops synchronizing with the first camera and CDAF focusing is used directly in the contrast focusing direction, which avoids unnecessary movement of the second camera and ensures accurate, fast focusing of the second camera.
  • The embodiments of the present application determine the contrast focus movement direction of the second camera from the contrast information of the images while the second camera synchronizes with the first camera, and then, after the first camera finishes focusing, directly use CDAF focusing in the contrast focusing direction. This avoids the second camera starting CDAF in an arbitrary direction and ensures accurate, fast focusing of the second camera.
  • In a second aspect, a processor is provided, comprising a processing unit and a storage unit, where
  • the storage unit is configured to store code, and
  • the processing unit is configured to execute the code in the storage unit to implement the method in the first aspect or any feasible implementation of the first aspect.
  • In a third aspect, an electronic device is provided, comprising a processor and at least two cameras;
  • the first one of the at least two cameras is configured to detect phase information of the image
  • a second one of the at least two cameras is configured to detect contrast information of the image
  • the processor is used to:
  • the contrast focus position includes a contrast focus movement direction of the second camera.
  • During the phase detection focusing of the first camera, at least one phase focus position of the second camera may be determined according to one or more phase focus positions of the first camera, and the second camera may be controlled to move to the at least one phase focus position.
  • The camera that does not support PDAF thereby synchronizes with the focusing behavior of the camera that does, achieving fast focusing for the non-PDAF camera, reducing the overall focusing time, and improving the user experience.
  • While the second camera synchronizes with the first camera, the contrast focus position of the second camera may also be determined from the contrast information of the images detected by the second camera at one or more phase focus positions of the at least one phase focus position. That is, the contrast information of the images detected by the second camera may be recorded during synchronization, and the contrast focus position of the second camera determined from it. Merely synchronizing with the first camera may leave the second camera's focus movement direction inaccurate, or may require the second camera to focus by contrast detection after the first camera has finished focusing.
  • The embodiments of the present application can ensure the accuracy of the focus movement direction by the recorded contrast information, or determine the contrast focus position directly from it, avoiding determining the second camera's focus position from the first camera's focus position or by CDAF, which reduces the focusing time and improves the user experience.
  • The third aspect corresponds to the first aspect; the processor can implement the method in the first aspect and its feasible implementations, and the detailed description is omitted here where appropriate to avoid repetition.
  • the processor is further configured to:
  • the processor controls the second camera to detect contrast information of an image at the one or more phase focus positions of the at least one phase focus position.
  • the processor is further configured to:
  • the contrast information of the image detected by the second camera at one or more of the at least one phase in-focus position is valid.
  • the contrast focus position of the second camera further includes a contrast focus moving distance of the second camera
  • the processor is further configured to:
  • the processor is further configured to:
  • determining a next phase focus position of the second camera, where the next phase focus position includes a phase focus movement direction and a phase focus movement distance of the second camera;
  • controlling the second camera to move to the next phase focus position.
  • the processor is further configured to:
  • the solution implemented by the above processor may be implemented by a chip.
  • In another aspect, a computer program product is provided, comprising a computer program (which may also be referred to as code or instructions) that, when executed, causes a computer to perform the method in the first aspect or any of its possible implementations.
  • In another aspect, a computer-readable medium is provided, storing a computer program (which may also be referred to as code or instructions) that, when run on a computer, causes the computer to perform the method in the first aspect or any of its possible implementations.
  • FIG. 1 is a schematic diagram of an applicable scenario of an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a focus flow according to an embodiment of the present application.
  • FIG. 3 is a schematic diagram of a focusing method in accordance with an embodiment of the present application.
  • FIG. 4 is a schematic diagram of the principle of lens imaging.
  • FIG. 5 is a schematic diagram of a focusing process in accordance with an embodiment of the present application.
  • FIG. 6 is a schematic diagram of a focusing process in accordance with one embodiment of the present application.
  • FIG. 7 is a schematic diagram of a focusing process in accordance with one embodiment of the present application.
  • FIG. 8 is a schematic diagram of a focus flow according to another embodiment of the present application.
  • FIG. 9 is a schematic block diagram of an image signal processor in accordance with one embodiment of the present application.
  • FIG. 10 is a schematic block diagram of an electronic device in accordance with an embodiment of the present application.
  • FIG. 11 is a block diagram showing the structure of an electronic device according to an embodiment of the present application.
  • FIG. 1 is a schematic diagram of an applicable scenario of an embodiment of the present application.
  • The electronic device 100 may include at least two cameras, for example a first camera 110 and a second camera 120, and the electronic device 100 may control, through a processor (not shown), the first camera 110 and the second camera 120 to focus on the object 130 and acquire an image of the object 130.
  • The electronic device in the embodiments of the present application may be any device that includes at least two cameras, for example a mobile phone, a tablet computer, a personal digital assistant (PDA), a notebook computer, a desktop computer, a point-of-sale (POS) terminal, or a monitoring device.
  • the processor in the embodiment of the present application may also be referred to as an image signal processor, an image processing unit, a processing unit, or a processing module.
  • the processor may be a CPU of the electronic device, and the processor may also be a separate device from the CPU.
  • the embodiment of the present application is not limited thereto.
  • The implementation principle of PDAF is as follows: some masked (shielded) pixels are reserved on the image sensor specifically for phase detection of the image, and the defocus offset is determined from quantities such as the separation between these pixel pairs, achieving accurate focusing.
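The masked-pixel principle can be illustrated with a toy model: the "left" and "right" phase-detection pixels see the scene shifted relative to each other when the lens is defocused, and the shift that best aligns the two one-dimensional signals indicates the defocus offset. The brute-force integer-shift search below is an illustrative assumption, not the patent's algorithm.

```python
def phase_offset(left, right, max_shift=4):
    """Return the integer shift s that best aligns left[i] with
    right[i + s], by minimizing mean squared difference over the overlap."""
    best_shift, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        err, n = 0.0, 0
        for i, v in enumerate(left):
            j = i + s
            if 0 <= j < len(right):
                err += (v - right[j]) ** 2
                n += 1
        err /= n
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift

# a simple intensity profile, and the same profile shifted by two pixels
left = [0, 0, 1, 5, 9, 5, 1, 0, 0, 0]
right = [0, 0] + left[:-2]
```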
  • For an electronic device in which only one camera supports PDAF, the focusing process when capturing an object takes a long time, the preview image is blurred during focusing, and the user experience is poor.
  • The embodiments of the present application therefore propose a focusing method in which a camera that does not support PDAF (the second camera) synchronizes with the focusing behavior of a camera that does (the first camera), achieving fast focusing for the non-PDAF camera, reducing the focusing time, reducing or avoiding blurred previews during focusing, and improving the user experience.
  • In the embodiments of the present application, the second camera, which does not support PDAF, needs to synchronize with the focusing behavior of the first camera, which supports PDAF. To make the solution easy to understand, the following first describes, with reference to FIG. 2, the specific focusing process of a camera that supports the PDAF function.
  • the method as shown in FIG. 2 may be performed by a processor (eg, an image signal processor), and the method illustrated in FIG. 2 includes:
  • In 210, a first focus position is determined according to the PD (phase detection) information of the image acquired by the camera, and the camera is controlled to move to the first focus position.
  • In 220, after the camera moves to the first focus position, the image signal processor reacquires the PD information of the image, determines a second focus position from the newly acquired PD information, and then determines the distance between the current position (i.e., the first focus position) and the second focus position.
  • The photographed object may move during focusing, or the electronic device may move; also, before the camera is moved in 210, when the lens is far from the in-focus position, PDAF accuracy may be poor. As a result there may be a gap between the current position in 220 (i.e., the lens position after performing step 210) and the second focus position (i.e., the in-focus position determined from the PD information of the image acquired at the current position), so the current position may still be some distance from the second focus position.
  • In 230, if the distance is greater than a first threshold, the camera is controlled to move to the second focus position; after the camera moves, it is determined whether the newly acquired distance is still greater than the first threshold, and the process of 210 to 230 is repeated until the last acquired distance is less than or equal to the first threshold.
  • step 240 is performed.
  • the first threshold may be a preset value, and the first threshold may be determined according to an actual situation.
  • the embodiment of the present application does not limit the value of the first threshold.
• when the distance is less than or equal to the first threshold, the PD self-convergence condition is considered to be satisfied.
• alternatively, step 210 may still be performed and the process repeated until the distance is less than or equal to the first threshold for n consecutive iterations (e.g., 2 or 3 times), after which step 240 is performed.
• in that case, step 240 may be modified as follows: the PD self-convergence condition is determined to be satisfied if the distance is less than or equal to the first threshold in n consecutive repetitions of the process.
• optionally, the camera position may be left unmoved while the above process is repeated; the current focus position is determined directly from the PD information of the image acquired at the current camera position, and the subsequent comparison and judgment are then performed.
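The repetition of steps 210 to 230 described above can be sketched as a simple convergence loop. The following Python sketch is illustrative only: `pd_estimate` is a hypothetical stand-in for deriving a quasi-focus position from the PD information at the current lens position, and the toy model and threshold are assumptions, not values from the embodiment.

```python
def pdaf_converge(pd_estimate, start, threshold, max_steps=10):
    """Repeat steps 210-230: move to the PD-estimated in-focus position
    until the newly estimated position is within `threshold` (step 230)."""
    pos = start
    for _ in range(max_steps):
        target = pd_estimate(pos)           # steps 210/220: new in-focus position
        if abs(target - pos) <= threshold:  # PD self-convergence condition
            return pos, True
        pos = target                        # move the lens and re-check
    return pos, False

# toy PD model: each estimate halves the error to a true focus at 100
estimate = lambda p: p + 0.5 * (100 - p)
pos, converged = pdaf_converge(estimate, start=0, threshold=2)
```

With this toy model, each iteration halves the remaining error, so the loop stops once the estimated correction falls below the threshold.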
• in one case, step 260 is performed and the focusing ends.
• in the other case, step 250 is performed:
• the CDAF small-step search is used to determine the focus position, and the camera is controlled to move to that focus position.
• it should be noted that step 240 may be an optional step.
• that is, after the repetition of the process of 210 to 230 completes,
• step 260 or 250 may be executed directly without performing step 240.
• the implementation of the present application is not limited thereto.
• for example, a CDAF large-step search for the target position may be performed first, and step 250 may then be performed once the difference between the focus position determined by the large-step search and the current position is smaller than a second threshold.
• the embodiment of the present application is not limited to this. It should be understood that the step amplitude of the CDAF large-step search for the focus position is greater than that of the small-step search.
  • the second threshold may be a preset value, and the second threshold may be determined according to an actual situation.
• the embodiment of the present application does not limit the value of the second threshold.
  • FIG. 3 is a schematic flow chart showing a method of focusing in an embodiment of the present application.
  • the method shown in FIG. 3 can be applied to the above electronic device including at least two cameras, the first camera of the at least two cameras supports a phase detection autofocus PDAF function, and the first camera is configured to detect phase information of the image;
  • the second camera does not support the PDAF function, and the second camera is used to detect the contrast information of the image.
  • the method 300 shown in Figure 3 can be performed by a processor of the electronic device.
• the processor may control the first camera to perform focusing according to a method similar to that of FIG. 2, while controlling the second camera in parallel to synchronize the focusing behavior of the first camera, thereby implementing
• fast focusing of the second camera, which does not support the PDAF function, and reducing the overall focusing time.
  • the camera can acquire an image, and the camera can include a lens group and a photosensitive element.
  • each camera in the embodiment of the present application may further include a corresponding image signal processing module.
• alternatively, the camera may not include an image signal processing module, and the image signal processing is performed by the processor.
  • the embodiment of the present application is not limited thereto.
  • the method 300 shown in FIG. 3 includes:
• during the phase detection autofocus of the first camera, at least one phase focus position of the second camera is determined according to one or more phase focus positions of the first camera, and the second camera is controlled to move to the at least one phase focus position.
  • the phase focus position is a focus position calculated based on phase information of an image.
  • At least one phase in-focus position of the first camera may be determined according to the phase detection PD information of the image acquired by the first camera.
  • at least one phase in-focus position of the second camera may be determined according to one or more phase in-focus positions of the at least one phase in-focus position of the first camera.
• the “focus position” in the embodiment of the present application may include a moving direction and/or a moving distance. The moving direction indicates the direction in which a movable lens, lens group or lens in the camera needs to move in order to obtain a clear image;
• the moving distance indicates the distance that the movable lens, lens group or lens needs to move in that direction in order to obtain a clear image.
  • the focus position can indicate where the lens, lens set or lens that is movable in the camera needs to be in order to obtain a sharp image.
  • a method for determining a phase in-focus position of a second camera according to a phase in-focus position of the first camera in the embodiment of the present application is described below.
  • the phase focus position of the first camera may be converted into depth information or object distance of the photographed object, and then the phase focus position of the second camera is calculated according to the depth information or the object distance.
• according to the thin-lens imaging formula 1/f = 1/u + 1/v: u is called the image distance (corresponding to the distance between the imaging surface and the optical center of the lens group in the camera);
• v is called the object distance, that is, the distance from the object to the optical center;
• f is the focal length of the lens group; for a given lens group, f is a fixed constant.
  • the focusing process of the embodiment of the present application is a process of adjusting the image distance u.
  • the current object distance v can be calculated according to the lens imaging principle and the focal length f1 of the first camera.
• this object distance is the distance at which the first camera can currently focus clearly. The second camera synchronizing the position of the first camera means that the second camera focuses at the same object distance as the first camera. Therefore, according to the object distance v and
• the focal length f2 of the second camera, the image distance u2 of the second camera can be calculated; u2 is the position to which the second camera synchronizes (corresponding to the phase focus position of the second camera).
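The computation just described can be sketched as follows, using the patent's convention that u is the image distance and v the object distance, so 1/f = 1/u + 1/v. The focal lengths and image distance below are hypothetical millimetre values, not parameters of any real camera module.

```python
def object_distance(f, u):
    # thin-lens formula, patent convention: 1/f = 1/u + 1/v
    # f: focal length, u: image distance -> returns object distance v
    return 1.0 / (1.0 / f - 1.0 / u)

def synced_image_distance(f1, u1, f2):
    # object distance at which camera 1 is currently in sharp focus
    v = object_distance(f1, u1)
    # image distance camera 2 needs in order to focus at that same object distance
    return 1.0 / (1.0 / f2 - 1.0 / v)

# hypothetical values: f1 = 4 mm, current image distance u1 = 4.1 mm, f2 = 3 mm
u2 = synced_image_distance(4.0, 4.1, 3.0)   # camera 2's phase focus position
```

The same object distance v is shared by both cameras; only the focal lengths differ, which is why a fixed mapping between the two lens positions can also work in practice.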
• alternatively, a mapping relationship may be used to directly determine the phase in-focus position of the second camera based on the phase in-focus position of the first camera.
• the process of the first camera performing phase detection autofocus may indicate all or part of the period after the first camera turns on phase detection focusing until phase detection focusing ends.
  • the process in which the first camera performs phase detection autofocus may indicate a period of time before the first camera turns on the CDAF, and the embodiment of the present application is not limited thereto.
• the processor may determine, during the phase detection autofocus of the first camera, at least one phase focus position of the second camera according to one or more phase focus positions of the first camera, and control the second camera to move to the at least one phase focus position. This allows the processor to control the focusing of the two cameras in parallel, so that the second camera, which does not support the PDAF function, synchronizes the focusing behavior of the first camera, which supports the PDAF function. Fast focusing of the second camera is thereby achieved, which can reduce the overall focusing time and improve the user experience.
  • the processor controls the second camera to move to the at least one phase focus position during the phase detection autofocus of the first camera. It can also be stated that the processor controls the first camera and the second camera in parallel.
• the parallel control of the first camera and the second camera can be understood as controlling the first camera and the second camera at the same time. It should be understood that controlling two cameras in parallel, or simultaneously, in the embodiment of the present application is not limited to controlling the two
• cameras at strictly identical times. For example, a certain time interval may be permitted between the times at which the two cameras are controlled to move.
  • the embodiment of the present application is not limited thereto.
• for example, the first camera may be controlled to move first; then, during the movement of the first camera, the focus position of the second camera is determined according to the focus position of the first camera, and the
• second camera is then controlled to move.
  • the focus position of the second camera may be determined according to the focus position of the first camera, and then the processor controls the first camera and the second camera to move to the corresponding positions in parallel. Focus position.
  • the processor can move the camera to a corresponding focus position by controlling the lens motor driver.
• the processor can control at least two cameras to perform focusing in parallel, so that the camera that does not support the PDAF function synchronizes the focusing behavior of the camera that supports the PDAF function. This achieves fast focusing of the camera that does not support PDAF, reducing the overall focusing time and improving the user experience.
  • one or more phase in-focus positions of the first camera may include one or a plurality of phase in-focus positions of the first camera during the phase detection auto-focusing process.
  • the one or more phase in-focus positions may include the first phase in-focus position or one or more phase in-focus positions in the middle of the phase focusing process, or the last phase in-focus position, and embodiments of the present application are not limited thereto.
  • the contrast focus position is a focus position calculated based on contrast information of an image.
• the embodiment of the present application may determine the contrast focus moving direction according to the contrast information of the recorded images. For example, when the recorded contrast values keep improving along the current direction, the contrast focus moving direction may be determined to be the direction in which the contrast values improve.
• the contrast focus position of the second camera may be determined according to the contrast information of the images detected by the second camera at one or more of the at least one phase in-focus position. That is to say, in the embodiment of the present application, the contrast information of the images detected by the second camera may be recorded while the second camera synchronizes the first camera, and the contrast focus position of the second camera is determined according to that contrast information. Synchronizing the second camera with the first camera may cause the focus movement direction of the second camera to be inaccurate, or the second camera may otherwise need to focus in the contrast focus mode after the first camera finishes focusing.
• the embodiment of the present application can ensure the accuracy of the focus moving direction through the recorded contrast information, or directly determine the contrast focus position from it, avoiding having to re-determine the focus position of the second camera according to the focus position of the first camera or in the CDAF manner,
• which can reduce the focusing time and improve the user experience.
  • the method further includes:
• the second camera is controlled to detect contrast information of the image at the one or more of the at least one phase in-focus position.
• the processor controls the second camera to detect contrast information of the image when the moving state of the electronic device and/or the image stabilization state of the second camera satisfies a detection condition; conversely, when the moving state and/or the image
• stabilization state does not satisfy the detection condition, the processor does not control the second camera to detect contrast information of the image.
• the moving state of the electronic device being relatively stable or slow means, for example,
• that the gyroscope or accelerometer of the electronic device detects that the movement of the electronic device is less than a preset movement threshold; and/or the image being in a steady state means that the image contrast change is small (for example, when the contrast change of the second camera between the current focus position and the previous focus position is small, e.g., less than a preset contrast-change threshold). In these cases the detection condition is considered satisfied; otherwise, it may be considered that the detection condition is not satisfied.
• when the detection condition is satisfied, the embodiment of the present application may also consider that the captured picture is still;
• otherwise, the captured picture may be considered to have changed, and the embodiment of the present application is not limited thereto.
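The detection condition described above can be sketched as follows. The threshold values and the relative-change formulation are assumptions for illustration, not values from the embodiment.

```python
def detection_condition(gyro_motion, prev_contrast, cur_contrast,
                        motion_thresh=0.05, contrast_thresh=0.1):
    # Scene treated as "still": the device moves less than the preset
    # movement threshold AND the frame-to-frame contrast change is small.
    device_steady = gyro_motion < motion_thresh
    if prev_contrast is None:               # no earlier frame to compare against
        contrast_steady = True
    else:
        change = abs(cur_contrast - prev_contrast) / max(prev_contrast, 1e-9)
        contrast_steady = change < contrast_thresh
    return device_steady and contrast_steady
```

Only when this condition holds would the processor record the second camera's contrast value at the current phase focus position.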
• the moving state of the device and/or the image stabilization state of the camera thus controls whether the second camera detects contrast information of the image. For example, the processor controls the second camera to detect contrast information of the image at the one or more of the at least one phase in-focus position only when the detection condition is satisfied, and determines the contrast focus position of the second camera according to
• the detected contrast information. This enables the
• processor to accurately determine the contrast focus position according to the contrast information of the acquired images. The embodiment of the present application can ensure the accuracy of the focus moving direction through the recorded contrast information, or directly determine the contrast focus position from it, thereby avoiding re-determining the focus position of the second camera according to the focus position of the first camera or in the CDAF manner, which can reduce the focusing time and improve the user experience.
• in other words, the processor controls the second camera to acquire the contrast information of the image, and determines the contrast focus position of the second camera, only when the picture is still.
  • the embodiment of the present application may also detect contrast information of an image at all phase in-focus positions of the second camera, and then obtain valid information from the contrast information of the detected image.
  • the method of the embodiment of the present application further includes:
• determining whether the contrast information of the images detected by the second camera at one or more of the at least one phase in-focus position is valid.
• when the moving state of the electronic device is relatively stable or the movement is slow (for example,
• the gyroscope or accelerometer of the electronic device detects that the movement of the electronic device is less than the preset movement threshold), and/or the image is in a steady state in which the image contrast change is small (for example,
• the contrast change of the second camera between the current focus position and the previous focus position is small, e.g., less than the preset contrast-change threshold, i.e., when the captured picture is still),
• the contrast
• information of the image detected by the second camera is considered valid. Otherwise (i.e., when the captured picture changes), the contrast information of the image detected by the second camera may be considered invalid.
• optionally, the second camera being still at one or more of the at least one phase focus position means that the contrast information recorded at a plurality of the at least one phase focus positions is
• continuously valid, that is, the picture remains still while the second camera is at those phase focus positions.
• the embodiment of the present application detects the contrast information of the image and determines that contrast information captured while the picture is still is valid; the contrast
• information acquired from a still picture is relatively reliable, and the processor can accurately determine the contrast focus position from it.
• the accuracy of the focus movement direction can thus be ensured through the recorded contrast information, or the contrast focus position can be determined directly from the recorded contrast information, which avoids determining the focus position of the second camera according to the focus position of the first camera or in the CDAF manner, and can reduce the focusing time and improve the user experience.
• optionally, the contrast focus position of the second camera further includes a contrast focus moving distance of the second camera.
• optionally, the method further includes:
• determining the quasi-focus position of the second camera (i.e., the contrast focus moving direction and the moving distance) according to the recorded contrast information. For example, the quasi-focus position may be the position corresponding to the peak of the contrast values, or a quasi-focus position obtained by curve fitting of the contrast information.
• when the lens of the second camera is controlled to move to determine the quasi-focus position in the CDAF manner, the movement may likewise be guided by the recorded contrast information. For example, when the recorded contrast values are improving along the current direction, the lens can continue moving in that direction in the CDAF manner to determine the final quasi-focus position; or, when the recorded contrast values are worsening, the CDAF manner can be used to move the lens in the opposite direction to determine the final quasi-focus position, and so on.
• since the contrast focus moving direction and moving distance of the second camera can be determined according to the contrast information of the recorded images, regardless of whether the first camera has finished focusing, the second camera can be directly controlled to move the moving distance in the moving direction to complete its focusing.
  • the principle of CDAF focusing is to determine the position of the quasi-focal position by the change of the sharpness of the focusing object.
• the CDAF algorithm obtains the most suitable quasi-focus position after the image of the object goes through an "up and down the slope" process of sharpness.
• the initial picture is in a defocused state; then the lens moves, and the coins on the screen can be seen gradually becoming clear, until at a certain position (the in-focus state) the coin is the clearest. However, the camera itself is not aware that focusing has been completed at this point, so the lens continues to move, and the coin becomes blurred again.
• the camera module then realizes that the lens has "passed the station", and moves back to the clear focus position, completing the focusing.
• focusing by the CDAF method therefore takes longer, the picture blurs during the focusing process, and the user experience is poor.
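The "walk past the peak, then step back" behavior of CDAF described above can be sketched as a hill climb over a toy contrast curve. The curve and step size below are illustrative assumptions only.

```python
def cdaf_hill_climb(contrast_at, start, step):
    # Move the lens in fixed steps while contrast keeps rising; once it
    # drops, the peak has been passed ("passed the station"), so return
    # to the previous, sharpest position.
    pos, best = start, contrast_at(start)
    while True:
        cand = pos + step
        c = contrast_at(cand)
        if c <= best:
            return pos            # retract to the clear focus position
        pos, best = cand, c

# toy contrast curve with its peak at lens position 50
contrast = lambda p: -(p - 50) ** 2
in_focus = cdaf_hill_climb(contrast, start=0, step=10)
```

The extra step past the peak, plus the retraction, is exactly the overhead that makes pure CDAF slower than PD-assisted focusing.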
• the contrast focus moving direction of the second camera and the contrast focus moving distance of the second camera may be determined according to the contrast information of the images detected by the second camera at one or more of the at least one phase in-focus position. Since the quasi-focus position of the second camera has then been determined, the processor can directly control the second camera to move the contrast focus moving distance in the contrast focus moving direction to complete the focusing of the second camera, without further controlling
• the second camera to synchronize the behavior of the first camera, and without controlling the second camera to determine the quasi-focus position in the CDAF manner, which can reduce the focusing time and improve the user experience.
• optionally, the method further includes:
• determining a next phase in-focus position of the second camera, which includes a phase focus moving direction and a phase
• focus moving distance of the second camera; the second camera is controlled to move to the next phase in-focus position when the contrast focus moving direction coincides with the phase focus moving direction.
• while the first camera performs phase detection autofocus (i.e., before the first camera ends phase focusing),
• after the first camera moves to the current phase focus position, as described in step 230 above, when the distance
• is greater than the first threshold, the focusing continues; during this process,
• the recorded contrast information can determine the contrast focus moving direction and moving distance of the second camera.
• when the phase detection autofocus of the first camera ends, or the contrast focus moving direction does not coincide with the phase focus moving direction, the second camera is controlled
• to move a preset distance in the contrast focus moving direction.
• that is, the embodiment of the present application can stop the second camera from synchronizing the behavior of the first camera, and directly control the second camera to move a preset
• distance in the contrast focus moving direction in the CDAF manner;
• controlling the second camera to move the preset distance in the contrast focus moving direction in the CDAF manner means, for example, moving in the contrast focus moving direction with a fixed CDAF step to perform CDAF focusing.
• the embodiment of the present application determines the contrast focus moving direction of the second camera from the contrast information of the images recorded while the second camera synchronizes the first camera. Thus, when the phase focusing direction of the second camera is wrong, that is, when the second camera's
• phase focusing direction is opposite to the contrast focusing direction,
• the second camera stops synchronizing the behavior of the first camera, and CDAF focusing is used directly in the contrast focusing direction. This avoids unnecessary movement of the second camera and ensures that the second camera
• focuses accurately and quickly.
• similarly, by determining the contrast focus moving direction of the second camera during synchronization and then using CDAF directly in that direction after the first camera finishes focusing,
• the embodiment avoids having the second camera start CDAF in an arbitrary direction, ensuring accurate and fast focusing of the second camera.
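The direction-consistency check described above can be sketched as follows. This is a simplification that assumes the recorded (position, contrast) pairs were visited in order, so the sign of Δcontrast × Δposition gives the contrast focus direction; the data values are hypothetical.

```python
def sync_decision(pd_value, history):
    # pd_value: signed PD output of camera 1 (positive = move right).
    # history: (position, contrast) pairs recorded by camera 2 so far.
    if len(history) < 2:
        return "sync"                      # not enough data: keep syncing
    (p0, c0), (p1, c1) = history[-2], history[-1]
    contrast_dir = 1 if (c1 - c0) * (p1 - p0) > 0 else -1
    pd_dir = 1 if pd_value > 0 else -1
    # agree -> keep following camera 1; conflict -> switch to CDAF
    return "sync" if contrast_dir == pd_dir else "cdaf"

# contrast rose moving 0 -> 80 and PD still says right: keep synchronizing
decision = sync_decision(20, [(0, 300), (80, 450)])
```

With the FIG. 5-style values, contrast falling between positions 80 and 100 while PD still points right would flip the decision to CDAF.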
• the position of the first camera may represent the position of the movable lens or lens group of the first camera
• in the direction of its movement (for example, when the electronic device is a mobile phone, the direction may be perpendicular to the display screen of the mobile phone).
• similarly, the position of the second camera may represent the position of the movable lens or lens group of the second camera in
• the direction of its movement. It should be understood that although the positions of the first camera and the second camera are both denoted 0, the actual spatial positions of the two cameras are different.
• here the initial positions of the first camera and the second camera are both set to 0, but the embodiment of the present application is not limited thereto, and the initial positions of the first camera and the second camera in practical applications may vary.
• it is assumed that the position ratio is 1:1 when the position of the first camera is synchronized to the second camera. It should be understood that in actual use the ratio is fixed, but its value differs with the differences between the first camera and the second camera; for example, the ratio may be greater than or less than 1:1.
• the embodiment of the present application is not limited thereto. Assume that the first camera is at position 0 and the acquired contrast value is 100, as shown in FIG. 5, written [0, 100]. (It should be understood that the contrast value of the first camera need not be considered in the embodiment of the present application; it is given here only for uniformity of description, and the embodiment of the present application is not limited thereto.)
  • the second camera position and contrast are [0,300].
• the PD of the first camera gives a quasi-focus position value of 80. (The value given by PD is the distance and direction of the quasi-focus position from the current position:
• a positive value indicates the positive direction, i.e., to the right; a negative value indicates the negative direction, i.e., to the left.) Synchronizing to the second camera, the quasi-focus position of the second camera is also 80, so the first camera and the second camera are both pushed to 80.
• at position 80, the first camera acquires a contrast value of 200,
• and the second camera acquires a contrast value of 450. (The contrast direction can be determined once two or more position codes and contrast values have been saved.)
• when the contrast values are rising, the direction they point to (the contrast focus moving direction) is positive, and the current direction should be kept to continue the search; and vice versa.
  • the value given by the PD is positive, that is, the phase focus moving direction coincides with the contrast focus moving direction, so the position of the first camera is continuously synchronized, and the first camera and the second camera are pushed to the position 100.
  • the second camera has a contrast of 400, which is decreasing relative to position 80, so the contrast focus movement direction is negative.
• curve fitting can be used to obtain the quasi-focus position (that is, the moving distance in the moving direction); alternatively, starting from the current position, CDAF can be used toward position 80 to obtain the focus position.
• the CDAF manner in the embodiment of the present application refers to obtaining contrast information with a fixed step size, and obtaining the quasi-focus position by curve fitting once the curve shows a peak. If CDAF is used to focus from position 100 toward position 80, suppose the step size is set to 15 (the step size varies from algorithm to algorithm, but is always a fixed value). The contrast value at position 100, [100, 400], is saved; the lens is pushed 15 to position 85, obtaining the contrast value [85, 445]; it is then pushed 15 further to position 70, obtaining the contrast value [70, x]. If x is less than 445, the peak lies around position 85, and curve fitting can be used to obtain the quasi-focus position.
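When the three saved samples bracket a peak, the quasi-focus position can be estimated by fitting a parabola through them. The sketch below uses the sample positions from the example; the value 430 for position 70 is a hypothetical stand-in for the unspecified x < 445.

```python
def parabola_peak(x_mid, h, y_left, y_mid, y_right):
    # Vertex of the parabola through three equally spaced samples
    # (x_mid - h, y_left), (x_mid, y_mid), (x_mid + h, y_right);
    # valid when y_mid is the largest of the three (a bracketed peak).
    denom = y_left - 2.0 * y_mid + y_right
    return x_mid + 0.5 * h * (y_left - y_right) / denom

# samples [70, 430] (hypothetical x), [85, 445], [100, 400] -> fitted peak
quasi_focus = parabola_peak(x_mid=85, h=15, y_left=430, y_mid=445, y_right=400)
```

The fitted vertex generally lands between the sampled positions, which is why curve fitting can reach a finer quasi-focus position than the fixed step size alone.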
• since the quasi-focus position of the second camera (i.e., the contrast focus moving direction and moving distance) can be determined according to the recorded contrast information, the second camera can be directly controlled to move to the quasi-focus position in the moving direction to complete the focusing of the second camera.
• in FIG. 6, when the first camera is at position 0, the value given by PD is 80, so the first camera needs to be pushed to position 80 in the next step. The second camera has a contrast value of 300 at position 0, and
• in the next step it synchronizes the position of the first camera, also pushing its motor to 80. When pushed to position 80, the PD value of the first camera is 20, and the contrast value obtained by the second camera at position 80 is 450. At this time, because the contrast
• value of the second camera is rising, the contrast direction is judged positive (to the right), and the PD of the first camera gives a value of 20, also positive (to the right), so the second camera synchronizes the direction of the first camera and saves the contrast direction.
• when the directions later cease to coincide, the second camera stops synchronizing the behavior of the first camera and needs to continue focusing with CDAF.
• the example in FIG. 6 above shows the PD direction failing to coincide with the contrast direction at the third step.
• if instead the contrast of the second camera at position 80 were less than 300, that is, if the PD direction did not coincide with the contrast direction already at the second step, the second camera would start CDAF focusing to the left at the second step.
  • the first three steps are consistent with FIG. 6.
• at the third step, if the value given by PD is 5 and the contrast of the second camera is rising, then the contrast direction is consistent with the direction given by PD (both positive, to the right), so the second camera still synchronizes the behavior of the first camera and pushes its motor to 105 at the third step. At this time, the PD value of the first camera is small, indicating that the current position is very close to the quasi-focus position (assuming the PD convergence threshold is greater than 5), so the focusing of the first camera can end. The first camera can be pushed to 105, or the current position can be maintained; this differs from algorithm to algorithm.
• when the second camera is pushed to 105, since the first camera has finished focusing, the behavior of the first camera can no longer be synchronized, and the subsequent process must be determined by the second camera itself. If the contrast value x obtained at 105 is less than 500, the contrast peak lies around position 100, and curve fitting can be performed directly to find the peak point of the second camera, or CDAF can proceed with a fixed step from 105 toward 100. If x is greater than 500, the peak point is still further to the right in the positive direction; in this case the second camera starts from the current position and performs CDAF focusing with a fixed step.
• if the second camera goes directly to CDAF, the step size is fixed, and the direction may be left or right, depending on the specific implementation.
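The synchronize-check-handover flow walked through in FIGS. 5 to 7 can be sketched as one loop. The PD traces and contrast values below are hypothetical, chosen to reproduce the two outcomes discussed above (a direction conflict, and the first camera converging first).

```python
def follow_first_camera(pd_trace, contrast_at, pd_threshold=5):
    # pd_trace: successive signed PD values reported by camera 1.
    # contrast_at(pos): contrast camera 2 measures at a lens position.
    pos = 0
    hist = [(pos, contrast_at(pos))]
    for pd in pd_trace:
        if abs(pd) <= pd_threshold:          # camera 1 has converged
            return pos, "camera1_done"
        if len(hist) >= 2:
            (p0, c0), (p1, c1) = hist[-2], hist[-1]
            contrast_dir = 1 if (c1 - c0) * (p1 - p0) > 0 else -1
            pd_dir = 1 if pd > 0 else -1
            if contrast_dir != pd_dir:       # directions conflict:
                return pos, "cdaf"           # hand over to CDAF
        pos += pd                            # synchronize camera 1's move
        hist.append((pos, contrast_at(pos)))
    return pos, "trace_exhausted"

# FIG. 5/6-style data: contrast rises to 80 then falls at 100, PD stays positive
curve = {0: 300, 80: 450, 100: 400}
result = follow_first_camera([80, 20, 10], curve.__getitem__)  # -> (100, "cdaf")
```

Either exit of the loop leaves the second camera with recorded contrast history, so the subsequent CDAF or curve fit never has to start from a blind position.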
  • the two cameras may be considered to have completed focusing, and then the processor may acquire images through the first camera and the second camera, respectively.
  • the images acquired by the two cameras are combined to form a final image.
  • the electronic device can also display the final image through the display screen.
  • the image processing process after the two cameras respectively perform the focusing can refer to the image synthesis algorithm of the existing multi-camera, which is not limited by the embodiment of the present application.
• the specific focusing process of the embodiment of the present application has been described above with reference to FIGS. 2 through 7.
  • the method of focusing of the embodiment of the present application will be described in detail below with reference to a specific example of FIG. 8.
• the method of FIG. 8 can be applied to the above-described electronic device including at least two cameras, where the first camera of the at least two cameras supports the phase detection autofocus PDAF function and the second camera does not support the PDAF function.
  • the method illustrated in Figure 8 can be performed by a processor of the electronic device. It should be understood that only two examples of camera (dual camera) focusing are described in FIG. 8, but the embodiment of the present application is not limited thereto, and the process of focusing is similar when the electronic device includes three or more cameras. Avoid repetition, no more details here.
  • the method 800 shown in Figure 8 includes:
• For the description of determining whether the picture changes (i.e., whether the picture is still), refer to the description above. To avoid repetition, details are not repeated here.
  • the first camera performs phase detection autofocus.
  • the second camera synchronizes the position of the first camera.
  • the second camera synchronizes the position of the first camera and does not record the image contrast information of the second camera.
  • step 814 is performed, otherwise step 810 is performed.
  • step 815 and step 816 are performed, and in the case where the first camera is not in focus, step 820 is performed.
  • the first camera finishes focusing.
  • the second camera performs CDAF focusing.
• Steps 810 to 816 describe the focusing process in the case of a picture change; specifically, in the case of a picture change, the second camera synchronizes the behavior of the first camera and does not record the image contrast information of the second camera.
• In this way, the camera that does not support the PDAF function synchronizes the focusing behavior of the camera that supports the PDAF function, achieving fast focusing for the camera without PDAF, thereby reducing the overall focusing time and improving the user experience.
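Under stated assumptions (the step numbering follows FIG. 8 as described above; the function and list names are hypothetical placeholders, not driver APIs), the picture-change branch can be sketched as: the second camera simply mirrors each PDAF position of the first camera without recording contrast, and only runs CDAF afterwards.

```python
def sync_positions(first_cam_positions):
    """Sketch of steps 810-816: while the picture is changing, the second
    camera mirrors every PDAF position of the first camera and records no
    contrast information; CDAF runs only after synchronization ends."""
    second_cam_log = []
    for pos in first_cam_positions:    # step 811: first camera PDAF moves
        second_cam_log.append(pos)     # steps 812-813: synchronize position,
                                       # without recording contrast
    second_cam_log.append("CDAF")      # step 816: contrast fine-tuning
    return second_cam_log
```

For a first camera that visits positions 100, 110, and 105, the second camera's log would be those same positions followed by a CDAF pass.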
• For the description of determining whether the picture changes (i.e., whether the picture is still), refer to the description above. To avoid repetition, details are not repeated here.
  • the first camera performs phase detection autofocus.
  • the second camera synchronizes the position of the first camera.
  • the second camera synchronizes the position of the first camera and records image contrast information of the second camera.
• If the contrast focus moving direction can be calculated, step 824 is performed; if it cannot be calculated, step 822 is performed.
• If the contrast focus moving distance can be calculated, step 825 is performed; if it cannot be calculated, step 826 is performed.
• The second camera is pushed to the quasi-focus position (i.e., moved the contrast focus moving distance in the contrast focus moving direction).
• It is determined whether the phase focus moving direction of the second camera is consistent with the contrast focus moving direction.
• If they are consistent, step 822 is performed; if they are inconsistent, step 829 is performed.
• Step 828 is performed; if the first camera does not satisfy phase focus convergence, step 821 and step 826 are performed.
• Optionally, if the number of PD convergence attempts does not meet the preset repetition threshold, the embodiments of the present application may stop the first camera's phase focusing process and then use CDAF to determine the quasi-focus position of the first camera.
  • the first camera finishes focusing.
• Steps 820 to 829 describe the focusing process in the case where the picture is still; specifically, the second camera synchronizes the behavior of the first camera while recording the image contrast information of the second camera, and determines the contrast focus position of the second camera according to the recorded contrast information (the contrast focus position includes the contrast focus moving direction, or both the contrast focus moving direction and the contrast focus moving distance).
• Because contrast information acquired from a still picture is relatively reliable, the processor can accurately determine the contrast focus position from the contrast information of the acquired images. The embodiments of the present application can thus ensure the accuracy of the focus moving direction through the recorded contrast information, or directly determine the contrast focus position from it, avoiding determining the quasi-focus position of the second camera again from the focus position of the first camera or by CDAF, which reduces the focusing time and improves the user experience.
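The still-picture branch described above can be sketched as follows, assuming hypothetical function names: contrast values recorded while the second camera synchronizes the first camera yield a moving direction, which is then checked against the PD-derived direction to decide between continued synchronization (step 822) and a switch to CDAF (step 829).

```python
def contrast_direction(recorded):
    """Infer the contrast focus moving direction from contrast values
    recorded while the second camera synchronized the first camera.
    Returns +1 if contrast rises along the motion, -1 if it falls,
    and None if no direction can be calculated yet."""
    if len(recorded) < 2:
        return None
    if recorded[-1] > recorded[-2]:
        return +1          # keep moving the same way
    if recorded[-1] < recorded[-2]:
        return -1          # quasi-focus lies behind; reverse via CDAF
    return None

def next_step(pd_direction, recorded):
    """If the contrast direction agrees with the PD direction (or is still
    unknown), keep synchronizing (step 822); otherwise switch to CDAF in
    the contrast direction (step 829)."""
    cdir = contrast_direction(recorded)
    if cdir is None or cdir == pd_direction:
        return "synchronize"
    return "cdaf_in_contrast_direction"
```

For example, with the PD pointing in the positive direction, falling recorded contrast (400 then 300) triggers the CDAF switch, while rising contrast keeps the cameras synchronized.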
• It should be understood that the examples of FIG. 2 to FIG. 8 are merely intended to help those skilled in the art understand the embodiments of the present invention, and the embodiments are not limited to the specific numerical values or specific examples illustrated. A person skilled in the art can make various modifications or changes based on the examples of FIG. 2 to FIG. 8, which fall within the scope of the embodiments of the present invention.
  • FIG. 9 is a schematic block diagram of a processor in accordance with one embodiment of the present application.
  • the processor 900 shown in FIG. 9 includes a processing unit 910 and a storage unit 920.
  • the storage unit 920 is configured to store code
  • the processing unit 910 is configured to execute the code in the storage unit 920 to perform the method illustrated in FIG. 2 to FIG. 8 above.
• For details of the method implemented by the processor, refer to the descriptions in FIG. 2 to FIG. 5 above. To avoid repetition, details are not repeated here.
  • processor 900 may also be referred to as an image signal processor, an image processing unit, a processing unit, or a processing module, and the like.
• The processor 900 may be the CPU of the electronic device, or the image signal processor may be a separate device different from the CPU.
  • the embodiment of the present application is not limited thereto.
  • the processor in the embodiment of the present invention may be an integrated circuit chip with signal processing capability.
  • each step of the foregoing method embodiment may be completed by an integrated logic circuit of hardware in a processor or an instruction in a form of software.
• The above processor may be a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
  • the methods, steps, and logical block diagrams disclosed in the embodiments of the present invention may be implemented or carried out.
• The general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
  • the steps of the method disclosed in the embodiments of the present invention may be directly implemented by the hardware decoding processor, or may be performed by a combination of hardware and software modules in the decoding processor.
  • the software module can be located in a conventional storage medium such as random access memory, flash memory, read only memory, programmable read only memory or electrically erasable programmable memory, registers, and the like.
  • the storage medium is located in the memory, and the processor reads the information in the memory and combines the hardware to perform the steps of the above method.
  • the storage unit in the embodiment of the present invention may also be referred to as a memory, which may be a volatile memory or a non-volatile memory, or may include both volatile and non-volatile memory.
• The non-volatile memory may be a read-only memory (ROM), a programmable read only memory (PROM), an erasable programmable read only memory (erasable PROM, EPROM), an electrically erasable programmable read only memory (EEPROM), or a flash memory.
  • the volatile memory can be a random access memory (RAM) that acts as an external cache.
• By way of example and not limitation, many forms of RAM are available, such as static random access memory (SRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous DRAM (DDR SDRAM), enhanced synchronous dynamic random access memory (ESDRAM), synchlink dynamic random access memory (SLDRAM), and direct rambus random access memory (DR RAM).
  • FIG. 10 is a schematic block diagram of an electronic device in accordance with an embodiment of the present application.
  • the electronic device 1000 shown in FIG. 10 includes a processor 1010 and at least two cameras 1020, wherein a first camera 1021 of the at least two cameras supports a phase detection autofocus PDAF function, and a second camera 1022 does not support a PDAF function.
• The processor 1010 is similar to the processor 900 shown in FIG. 9 and can implement the functions of the methods shown in FIG. 2 to FIG. 8; the processor 1010 can include all the modules or units implementing the above methods. To avoid repetition, details are not repeated here.
  • the electronic device in the embodiment of the present application may further include other modules.
  • An example in which the electronic device is a mobile phone in the embodiment of the present application is described below with reference to FIG.
  • FIG. 11 is a block diagram showing a partial structure of a mobile phone 1100 related to an embodiment of the present invention.
• The mobile phone 1100 includes components such as a radio frequency (RF) circuit 1110, a memory 1120, other input devices 1130, a display 1140, a sensor 1150, an audio circuit 1160, an I/O subsystem 1170, a processor 1180, a power supply 1190, and at least two cameras 11100.
• It will be understood by those skilled in the art that the structure of the mobile phone shown in FIG. 11 does not constitute a limitation on the mobile phone, which may include more or fewer components than illustrated, combine some components, split some components, or have a different arrangement of components.
• Those skilled in the art will understand that the display 1140 belongs to the user interface (UI), and the handset 1100 may include fewer user interfaces than illustrated.
  • the components of the mobile phone 1100 are specifically described below with reference to FIG. 11:
• The RF circuit 1110 can be used for receiving and transmitting signals during information transmission and reception or during a call; in particular, after receiving downlink information from a base station, it delivers the information to the processor 1180 for processing, and it also transmits designed uplink data to the base station.
  • RF circuits include, but are not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like.
  • RF circuitry 1110 can also communicate with the network and other devices via wireless communication.
• The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), e-mail, Short Messaging Service (SMS), and the like.
  • the memory 1120 can be used to store software programs and modules, and the processor 1180 executes various functional applications and data processing of the handset 1100 by running software programs and modules stored in the memory 1120.
• The memory 1120 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data created according to the use of the mobile phone 1100 (such as audio data, a phone book, etc.).
• The memory 1120 can include high speed random access memory, and can also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid state storage devices.
  • Other input devices 1130 can be used to receive input numeric or character information, as well as generate key signal inputs related to user settings and function controls of handset 1100.
• Specifically, other input devices 1130 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control buttons and a power switch button), a trackball, a mouse, a joystick, and a light mouse (a light mouse is a touch-sensitive surface that does not display visual output, or an extension of a touch-sensitive surface formed by a touch screen).
  • Other input devices 1130 are coupled to other input device controllers 1171 of I/O subsystem 1170 for signal interaction with processor 1180 under the control of other device input controllers 1171.
  • the display 1140 can be used to display information entered by the user or information provided to the user as well as various menus of the handset 1100, and can also accept user input.
• Specifically, the display 1140 can include a display panel 1141 and a touch panel 1142.
  • the display panel 1141 can be configured by using a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
• The touch panel 1142, also referred to as a touch screen or touch sensitive screen, can collect touch or non-touch operations performed by the user on or near it (for example, operations performed by the user on or near the touch panel 1142 using a finger, a stylus, or any other suitable object or accessory; somatosensory operations may also be included); the operations include single-point control operations, multi-point control operations, and the like, and the touch panel drives the corresponding connection device according to a preset program.
• The touch panel 1142 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch orientation and posture, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into information that the processor can process, sends it to the processor 1180, and can receive and execute commands sent by the processor 1180.
• In addition, the touch panel 1142 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave, or by any technology developed in the future.
• Further, the touch panel 1142 can cover the display panel 1141, and the user can operate on or near the touch panel 1142 covering the display panel 1141 according to the content displayed by the display panel 1141 (the displayed content includes, but is not limited to, a soft keyboard, a virtual mouse, virtual buttons, icons, etc.). After detecting an operation on or near it, the touch panel 1142 transmits the operation to the processor 1180 through the I/O subsystem 1170 to determine the user input, and the processor 1180 then provides a corresponding visual output on the display panel 1141 through the I/O subsystem 1170 according to the user input. Although in FIG. 11 the touch panel 1142 and the display panel 1141 are two independent components implementing the input and output functions of the mobile phone 1100, in some embodiments the touch panel 1142 can be integrated with the display panel 1141 to implement the input and output functions of the mobile phone 1100.
  • the handset 1100 can also include at least one sensor 1150, such as a light sensor, motion sensor, and other sensors.
• The light sensor may include an ambient light sensor and a proximity sensor, where the ambient light sensor may adjust the brightness of the display panel 1141 according to the brightness of the ambient light, and the proximity sensor may turn off the display panel 1141 and/or the backlight when the mobile phone 1100 moves close to the ear.
• As one kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in all directions (usually three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications that identify the posture of the mobile phone (such as horizontal/vertical screen switching, related games, and magnetometer attitude calibration) and for vibration-recognition related functions (such as a pedometer or tapping).
• The mobile phone 1100 can also be configured with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which are not described here.
  • An audio circuit 1160, a speaker 1161, and a microphone 1162 can provide an audio interface between the user and the handset 1100.
• The audio circuit 1160 can convert received audio data into an electrical signal and transmit it to the speaker 1161, which converts it into a sound signal for output; on the other hand, the microphone 1162 converts a collected sound signal into an electrical signal, which the audio circuit 1160 receives and converts into audio data; the audio data is then output to the RF circuit 1110 for transmission to, for example, another mobile phone, or output to the memory 1120 for further processing.
  • the I/O subsystem 1170 is used to control external devices for input and output, and may include other device input controllers 1171, sensor controllers 1172, display controllers 1173, and image signal processors 1174.
• The image signal processor 1174 is configured to control the at least two cameras 11100 to shoot a subject and to perform the focusing methods shown in FIG. 2 to FIG. 5 above; one or more other input control device controllers 1171 receive signals from other input devices 1130 and/or send signals to other input devices 1130.
• Other input devices 1130 may include physical buttons (press buttons, rocker buttons, etc.), dials, slide switches, joysticks, click wheels, and light mice (a light mouse is a touch-sensitive surface that does not display visual output, or an extension of a touch-sensitive surface formed by a touch screen). It is worth noting that the other input control device controllers 1171 can be connected to any one or more of the above devices.
• The display controller 1173 in the I/O subsystem 1170 receives signals from the display 1140 and/or sends signals to the display 1140. After the display 1140 detects a user input, the display controller 1173 converts the detected user input into interaction with the user interface objects displayed on the display 1140, i.e., implements human-computer interaction.
  • Sensor controller 1172 can receive signals from one or more sensors 1150 and/or send signals to one or more sensors 1150.
• The processor 1180 is the control center of the handset 1100; it connects the various parts of the entire handset using various interfaces and lines, and performs the various functions of the mobile phone 1100 and processes data by running or executing software programs and/or modules stored in the memory 1120 and invoking data stored in the memory 1120, thereby monitoring the mobile phone as a whole.
  • the processor 1180 may include one or more processing units; preferably, the processor 1180 may integrate an application processor and a modem processor, where the application processor mainly processes an operating system, a user interface, an application, and the like.
  • the modem processor primarily handles wireless communications. It will be appreciated that the above described modem processor may also not be integrated into the processor 1180.
• In some embodiments, the image signal processor may also be integrated in the processor 1180.
  • the embodiment of the present application is not limited thereto.
  • the handset 1100 also includes a power source 1190 (such as a battery) that powers the various components.
  • the power source can be logically coupled to the processor 1180 via a power management system to manage functions such as charging, discharging, and power consumption through the power management system.
  • the mobile phone 1100 may further include a Bluetooth module and the like, and details are not described herein again.
  • the embodiment of the present invention further provides a computer readable medium having stored thereon a computer program, the computer program being executed by a computer to implement the method of any of the foregoing method embodiments.
  • the embodiment of the present invention further provides a computer program product, which is implemented by a computer to implement the method of any of the foregoing method embodiments.
  • the computer program product includes one or more computer instructions.
  • the computer can be a general purpose computer, a special purpose computer, a computer network, or other programmable device.
• The computer instructions can be stored in a computer readable storage medium or transferred from one computer readable storage medium to another; for example, the computer instructions can be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or wirelessly (for example, infrared, radio, or microwave).
  • the computer readable storage medium can be any available media that can be accessed by a computer or a data storage device such as a server, data center, or the like that includes one or more available media.
• The usable medium can be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a high-density digital video disc (DVD)), a semiconductor medium (e.g., a solid state disk (SSD)), or the like.
  • the image signal processor may be a chip, and the processor may be implemented by hardware or by software.
  • the processor may be a logic circuit, an integrated circuit, etc.;
• The processor can be a general purpose processor implemented by reading software code stored in a memory, and the memory can be integrated in the processor or can exist independently of the processor.
  • the disclosed systems, devices, and methods may be implemented in other manners.
• The device embodiments described above are merely illustrative. For example, the division of the units is only a logical function division, and there may be other division manners in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, device or unit, and may be in an electrical, mechanical or other form.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the functions may be stored in a computer readable storage medium if implemented in the form of a software functional unit and sold or used as a standalone product.
• The technical solution of the present application, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions used to cause a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods described in the various embodiments of the present application.
• The foregoing storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Abstract

The present application provides a focusing method and an electronic device. The electronic device includes a processor and at least two cameras; a first camera of the at least two cameras is used to detect phase information of an image, and a second camera of the at least two cameras is used to detect contrast information of an image. The processor is configured to: during phase detection autofocus of the first camera, determine at least one phase focus position of the second camera according to one or more phase focus positions of the first camera, and control the second camera to move to the at least one phase focus position; and determine a contrast focus position of the second camera according to contrast information of images detected by the second camera at one or more of the at least one phase focus position, where the contrast focus position of the second camera includes a contrast focus moving direction of the second camera. The embodiments of the present application can achieve fast focusing.

Description

Focusing method and electronic device

This application claims priority to Chinese Patent Application No. 201711447201.6, filed with the Chinese Patent Office on December 27, 2017 and entitled "Focusing Method and Electronic Device", which is incorporated herein by reference in its entirety.

Technical Field

The present application relates to the field of electronic devices, and more particularly, to a focusing method and an electronic device.

Background

With the development of technology, to meet users' diverse shooting needs, more and more electronic devices support dual or multiple cameras; for example, existing smartphones commonly have dual rear cameras or dual front cameras.

For cost reasons, in existing dual-camera electronic devices usually only one camera (which may be called the main camera, the other being called the auxiliary camera) supports phase detection auto focus (PDAF).

When shooting, one focusing mode of existing electronic devices is as follows: the main camera first uses PDAF to determine its quasi-focus position; after the main camera finishes focusing, the auxiliary camera determines and adjusts its lens position according to the quasi-focus position of the main camera; finally, the auxiliary camera uses contrast detection auto focus (CDAF) to fine-tune the lens and determine the quasi-focus position of the auxiliary camera. However, this mode requires focusing the main camera before the auxiliary camera, which makes the overall focusing time long and degrades the user experience.

Another focusing mode of existing electronic devices is as follows: the main camera and the auxiliary camera focus simultaneously, where the main camera uses PDAF and the auxiliary camera uses CDAF, or detects the distance of the subject by laser for focusing (referred to as laser focusing for short). However, the focusing modes used by the auxiliary camera also have drawbacks; for example, CDAF exhibits a pronounced focus-hunting ("bellows") phenomenon and takes a long time, and the picture is blurry during focusing; likewise, laser focusing is time-consuming and the picture is blurry during focusing, degrading the user experience.

Therefore, how to provide a fast focusing method has become a problem to be urgently solved.

Summary

The present application provides a focusing method and an electronic device, which can achieve fast focusing and improve the user experience.
According to a first aspect, a focusing method is provided, applied to an electronic device including at least two cameras, where a first camera of the at least two cameras is used to detect phase information of images, and a second camera of the at least two cameras is used to detect contrast information of images. The method includes:

during phase detection autofocus of the first camera, determining at least one phase focus position of the second camera according to one or more phase focus positions of the first camera, and controlling the second camera to move to the at least one phase focus position;

determining a contrast focus position of the second camera according to contrast information of images detected by the second camera at one or more of the at least one phase focus position, where the contrast focus position of the second camera includes a contrast focus moving direction of the second camera.

It should be understood that in the embodiments of the present application, "during phase detection autofocus of the first camera" may mean all or part of the period from when the first camera starts phase detection focusing until phase detection focusing ends. In other words, it may mean a period before the first camera starts CDAF focusing; the embodiments of the present application are not limited thereto.

Specifically, during phase detection autofocus of the first camera, the processor may determine at least one phase focus position of the second camera according to one or more phase focus positions of the first camera and control the second camera to move to the at least one phase focus position. The processor thus controls the focusing of the two cameras in parallel, so that the second camera, which does not support PDAF, synchronizes the focusing behavior of the first camera, which does support PDAF. This achieves fast focusing for the second camera, reduces the overall focusing time, and improves the user experience.

It should be understood that, in the embodiments of the present application, the processor controlling the second camera to move to the at least one phase focus position during phase detection autofocus of the first camera may also be expressed as the processor controlling the first camera and the second camera in parallel. Controlling the two cameras in parallel can be understood as controlling them simultaneously; note that controlling two cameras "in parallel" or "simultaneously" in the embodiments of the present application does not require strict temporal coincidence — for example, a certain time interval between the movements of the two cameras is permissible. The embodiments of the present application are not limited in this respect.

For example, after the focus position of the first camera is determined, the first camera may be moved first; then, while the first camera is moving, the focus position of the second camera may be determined according to the focus position of the first camera, and the second camera moved accordingly. As another example, after the focus position of the first camera is determined, the focus position of the second camera may first be determined from it, and the processor may then control the first camera and the second camera in parallel to move to their respective focus positions.

Specifically, the processor may move a camera to the corresponding focus position by controlling a lens motor driver.

Therefore, in the embodiments of the present application, the processor can control at least two cameras to focus in parallel, so that a camera that does not support PDAF synchronizes the focusing behavior of a camera that does, achieving fast focusing for the non-PDAF camera, reducing the overall focusing time, and improving the user experience.

It should be understood that, in the embodiments of the present application, the one or more phase focus positions of the first camera may include one, or several consecutive, phase focus positions of the first camera during phase detection autofocus. They may include the first phase focus position, one or more intermediate phase focus positions, or the last phase focus position; the embodiments of the present application are not limited thereto.

It should be understood that, in the embodiments of the present application, a phase focus position is a focus position calculated from phase information of an image, and a contrast focus position is a focus position calculated from contrast information of an image.

It should be understood that a "focus position" in the embodiments of the present application may include a moving direction and/or a moving distance. The moving direction is the direction in which the movable lens, lens group, or lens assembly of a camera needs to move to obtain a clear image; the moving distance is the distance it needs to move in that direction to obtain a clear image. In other words, the focus position is the position at which the movable lens, lens group, or lens assembly of a camera needs to be to obtain a clear image.

It should be understood that the embodiments of the present application may determine the contrast focus moving direction from the recorded contrast information of the images; for example, when the curve of the recorded contrast information keeps improving, the contrast focus moving direction may be determined as the direction along which the curve improves.

Specifically, in the embodiments of the present application, while the second camera synchronizes the first camera, the contrast focus position of the second camera may be determined according to the contrast information of images detected by the second camera at one or more of the at least one phase focus position. In other words, while the second camera synchronizes the first camera, the contrast information of the images detected by the second camera may be recorded, and the contrast focus position of the second camera determined from it. Because synchronizing the first camera's behavior may make the second camera's focus moving direction inaccurate, or because the second camera would otherwise need to focus again by contrast focusing after the first camera finishes, the embodiments of the present application can use the recorded contrast information to ensure the accuracy of the focus moving direction, or to determine the contrast focus position directly, avoiding determining the quasi-focus position of the second camera again from the first camera's focus position or by CDAF, thereby reducing the focusing time and improving the user experience.

It should be understood that after the first camera and the second camera are controlled to move to the first and second quasi-focus positions respectively, the two cameras can be considered to have completed focusing; the image signal processor can then acquire images through the first camera and the second camera respectively and combine the images acquired by the two cameras into the final image. Optionally, when the electronic device has a display screen, it may also display the final image on the display screen.

It should be understood that the image processing after the two cameras have moved to their respective quasi-focus positions can follow existing multi-camera image synthesis algorithms, which the embodiments of the present application do not limit.
Optionally, in a feasible implementation, the method further includes:

controlling, according to the movement state of the electronic device and/or the image stability state of the second camera, whether the second camera detects contrast information of images;

where the second camera is controlled to detect contrast information of images at one or more of the at least one phase focus position.

For example, when the movement state of the electronic device and/or the image stability state of the second camera satisfies a detection condition, the processor controls the second camera to detect contrast information of images; when the movement state and/or the image stability state does not satisfy the detection condition, the processor does not control the second camera to detect contrast information of images.

For example, the detection condition is considered satisfied when the electronic device is relatively stable or moving slowly (for example, when its gyroscope or accelerometer detects that the mobility of the electronic device is below a preset movement threshold), and/or when the image is stable in the sense that its contrast changes little (for example, the contrast change of the second camera's image between the current focus position and the previous focus position is small, e.g., below a preset contrast-change threshold). Otherwise, the detection condition can be considered not satisfied.
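The detection condition described above can be sketched as a simple predicate. This is a minimal sketch under stated assumptions: the threshold values and parameter names are illustrative, not values taken from the application.

```python
def picture_is_still(gyro_motion, contrast_change,
                     motion_threshold=0.05, contrast_threshold=0.1):
    """Sketch of the detection condition: treat the scene as still when
    the device's measured motion is below a preset movement threshold
    AND the frame-to-frame contrast change of the second camera is
    below a preset contrast-change threshold. All values illustrative."""
    return (gyro_motion < motion_threshold
            and contrast_change < contrast_threshold)
```

When this predicate holds, the recorded contrast information is treated as reliable; when it fails, the picture is considered to be changing.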
It should be understood that when the detection condition is satisfied, the embodiments of the present application may also consider the captured picture to be still; when it is not satisfied, the captured picture may be considered to be changing. The embodiments of the present application are not limited thereto.

Specifically, in the embodiments of the present application, while the second camera synchronizes the first camera, whether the second camera detects contrast information of images at the at least one phase focus position may also be controlled according to the movement state of the electronic device and/or the image stability state of the second camera. For example, the processor determines that the second camera detects contrast information of images at one or more of the at least one phase focus position, and determines the contrast focus position of the second camera according to the detected contrast information.

In other words, in the embodiments of the present application, contrast information of images is detected only while the second camera synchronizes the first camera and the picture is determined to be still. Since contrast information acquired from a still picture is relatively reliable, the processor can accurately determine the contrast focus position from it; the embodiments of the present application can thus ensure the accuracy of the focus moving direction through the recorded contrast information, or determine the contrast focus position directly from it, avoiding determining the quasi-focus position of the second camera again from the first camera's focus position or by CDAF, reducing the focusing time and improving the user experience.

The above describes the case where the processor controls the second camera to acquire contrast information of images, and determines the second camera's contrast focus position, only when the picture is still. Alternatively, the embodiments of the present application may detect contrast information of images at all phase focus positions of the second camera and then extract the valid information from the detected contrast information.

Optionally, in a feasible implementation, the method further includes:

determining, according to the movement state of the electronic device and/or the image stability state of the second camera, whether the contrast information of images detected by the second camera is valid;

where the contrast information of images detected by the second camera at one or more of the at least one phase focus position is valid.

For example, when the electronic device is relatively stable or moving slowly (for example, its gyroscope or accelerometer detects that the mobility of the electronic device is below a preset movement threshold), and/or the image is stable in the sense that its contrast changes little (for example, the contrast change of the second camera's image between the current and previous focus positions is below a preset contrast-change threshold, i.e., the captured picture is still), the contrast information of images detected by the second camera is considered valid. Otherwise (i.e., when the captured picture is changing), the contrast information detected by the second camera may be considered invalid.

It should be understood that, in the embodiments of the present application, the second camera is still at one or more of the at least one phase focus position, and multiple such phase focus positions yield consecutive valid information, i.e., the captured picture remains still while the second camera is at those phase focus positions.

In other words, the embodiments of the present application detect contrast information of images while the second camera synchronizes the first camera and treat the contrast information as valid when the picture is determined to be still. Since contrast information acquired from a still picture is relatively reliable, the processor can accurately determine the contrast focus position from it; the accuracy of the focus moving direction can be ensured through the recorded contrast information, or the contrast focus position can be determined directly from it, avoiding determining the quasi-focus position of the second camera again from the first camera's focus position or by CDAF, reducing the focusing time and improving the user experience.
Optionally, in a feasible implementation, the contrast focus position of the second camera further includes a contrast focus moving distance of the second camera, and the method further includes:

controlling the second camera to move to the contrast focus position.

It should be understood that when the recorded curve of contrast information of images acquired by the second camera has a peak, it can be determined that the quasi-focus position of the second camera (i.e., the contrast focus moving direction and moving distance) can be determined from the recorded contrast information; for example, the quasi-focus position may be the position corresponding to the peak, or a quasi-focus position obtained by curve fitting the contrast information.

Moreover, when it is determined that the quasi-focus position of the second camera cannot be calculated from the recorded contrast information of the images acquired by the second camera, and the lens of the second camera is to be moved to a quasi-focus position determined by CDAF, the recorded contrast information can be used to decide how to apply CDAF: for example, when the curve of the recorded contrast information keeps improving, CDAF may continue moving the lens in the previous direction to determine the final quasi-focus position; or, when the curve keeps worsening, CDAF may move the lens in the opposite direction to determine the final quasi-focus position, and so on.
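The curve-fitting option mentioned above can be illustrated with a quadratic fit: a parabola through three recorded (lens position, contrast) samples places its vertex at the estimated contrast peak. This is only a sketch — real autofocus drivers typically use more samples and calibrated models, and the sample values below are illustrative.

```python
def peak_position(samples):
    """Estimate the quasi-focus (contrast peak) position by fitting a
    parabola y = a*x^2 + b*x + c through three recorded
    (lens position, contrast) samples; the vertex is at -b / (2a)."""
    (x1, y1), (x2, y2), (x3, y3) = samples
    denom = (x1 - x2) * (x1 - x3) * (x2 - x3)
    a = (x3 * (y2 - y1) + x2 * (y1 - y3) + x1 * (y3 - y2)) / denom
    b = (x3**2 * (y1 - y2) + x2**2 * (y3 - y1) + x1**2 * (y2 - y3)) / denom
    return -b / (2 * a)
```

For samples (95, 475), (100, 500), (105, 475), the fitted peak lies at lens position 100, so the lens would be driven there directly instead of stepping through a full CDAF sweep.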
In other words, in the embodiments of the present application, while the second camera synchronizes the phase detection autofocus of the first camera, as long as the contrast focus moving direction and moving distance of the second camera can be determined from the recorded contrast information of the images, the second camera can be controlled directly to move that moving distance in that moving direction to complete its focusing, regardless of whether the first camera has finished focusing.

Specifically, in the embodiments of the present application, while the second camera synchronizes the first camera, the contrast focus moving direction and contrast focus moving distance of the second camera may be determined from the contrast information of images detected by the second camera at one or more of the at least one phase focus position. Since the quasi-focus position of the second camera has already been determined, the processor can directly control the second camera to move the contrast focus moving distance in the contrast focus moving direction to complete its focusing, without further synchronizing the first camera's behavior and without using CDAF to determine the quasi-focus position, thereby reducing the focusing time and improving the user experience.

Optionally, in a feasible implementation, the method further includes:

determining a next phase focus position of the second camera according to a next phase focus position of the first camera, where the next phase focus position of the second camera includes a phase focus moving direction and a phase focus moving distance of the second camera;

when the contrast focus moving direction is consistent with the phase focus moving direction, controlling the second camera to move to the next phase focus position.

Specifically, during phase detection autofocus of the first camera (i.e., before the first camera finishes phase focusing), after the first camera moves to the current phase focus position, as described in step 230 above, when the distance is greater than the first threshold the first camera is controlled to move to the next phase focus position, and the next phase focus position of the second camera is determined according to the next phase focus position of the first camera. When the contrast focus moving direction is consistent with the phase focus moving direction, the second camera is controlled to move to the next phase focus position. This process is repeated until the first camera's phase focusing ends, or until the contrast focus moving direction and moving distance of the second camera can be determined from the recorded contrast information.

Optionally, in a feasible implementation, the method further includes:

after the phase detection autofocus of the first camera ends, or when the contrast focus moving direction is inconsistent with the phase focus moving direction, controlling the second camera to move a preset distance in the contrast focus moving direction.

Specifically, when the above phase focus moving direction is inconsistent with the contrast focus moving direction, the embodiments of the present application may stop the second camera from synchronizing the first camera's behavior and directly control the second camera to move a preset distance in the contrast focus moving direction using CDAF — for example, performing CDAF focusing by moving with a fixed CDAF step length in the contrast focus moving direction.

Alternatively, after the phase detection autofocus of the first camera ends, the second camera is controlled to move a preset distance in the contrast focus moving direction using CDAF, for example by moving with a fixed CDAF step length in the contrast focus moving direction.

Therefore, by determining the contrast focus moving direction of the second camera from the contrast information of images recorded while the second camera synchronizes the first camera, the embodiments of the present application can, when the phase focus direction of the second camera is wrong (i.e., opposite to the contrast focus direction), stop the second camera from synchronizing the first camera and apply CDAF directly in the contrast focus direction, avoiding unnecessary movements of the second camera and ensuring accurate and fast focusing of the second camera.

Alternatively, by determining the contrast focus moving direction of the second camera from the contrast information of images recorded while synchronizing the first camera, the embodiments of the present application can apply CDAF directly in that direction after the first camera finishes focusing, avoiding having the second camera move in a random CDAF direction and ensuring accurate and fast focusing of the second camera.
第二方面,提供了一种处理器,该处理器包括:处理单元和存储单元,
所述存储单元用于存储代码,所述处理单元用于执行所述存储单元中的代码实现第一方面或第一方面的任意可行实现方式中的方法。
第三方面,提供了一种电子设备,该电子设备包括:处理器和至少两个摄像机;
所述至少两个摄像机中的第一摄像机用于检测图像的相位信息;
所述至少两个摄像机中的第二摄像机用于检测图像的对比度信息;
所述处理器用于:
在所述第一摄像机进行相位检测自动对焦的过程中,根据所述第一摄像机的一个或多个相位对焦位置,确定所述第二摄像机的至少一个相位对焦位置,并控制所述第二摄像机移动至所述至少一个相位对焦位置;
根据所述第二摄像机在所述至少一个相位对焦位置中的一个或多个相位对焦位置检测到的图像的对比度信息,确定所述第二摄像机的对比度对焦位置,其中,所述第二摄像机的对比度对焦位置包括所述第二摄像机的对比度对焦移动方向。
因此,本申请实施例中,可以在第一摄像机进行相位检测对焦的过程中,根据第一摄像机的一个或多个相位对焦位置,确定所述第二摄像机的至少一个相位对焦位置,并控制所述第二摄像机移动至所述至少一个相位对焦位置。实现了不支持PDAF功能的摄像机同步支持PDAF功能的摄像机的对焦行为,实现不支持PDAF摄像机的快速对焦,能够降低整体对焦时间,提升用户体验。
并且,进一步地,本申请实施例中,在第二摄像机同步第一摄像机的过程中,还可以根据所述第二摄像机在所述至少一个相位对焦位置中的一个或多个相位对焦位置检测到的图像的对比度信息,确定所述第二摄像机的对比度对焦位置。也就是说,本申请实施例中,第二摄像机同步第一摄像机的过程中可以记录第二摄像机检测的图像的对比度信息,并根据该对比度信息确定第二摄像机的对比度对焦位置。由于在第二摄像机同步第一摄像机的行为可能导致第二摄像机的对焦移动方向不准确,或者,在第一摄像机对焦结束后第二摄像机需要再采用对比度对焦方式对焦。鉴于此,本申请实施例可以通过记录的对比度信息保证对焦移动方向的准确性,或者通过记录的对比度信息直接确定对比度对焦位置,避免了再次根据第一摄像机的对焦位置或者CDAF的方式确定该第二摄像机的准焦位置,能够降低对焦时间,提升用户体验。
应理解,第三方面与第一方面对应,处理器能够实现第一方面及其可行实现方式中的方法,为了避免重复,此处适当省略详细描述。
可选地,在一种可行的实现方式中,处理器还用于:
根据所述电子设备的移动状态,和/或,所述第二摄像机的图像稳定状态,控制所述第二摄像机是否检测图像的对比度信息;
其中,所述处理器控制所述第二摄像机在所述至少一个相位对焦位置中的一个或多个相位对焦位置时检测图像的对比度信息。
可选地,在一种可行的实现方式中,所述处理器还用于:
根据所述电子设备的移动状态,和/或,所述第二摄像机的图像稳定状态,确定所述第二摄像机检测到的图像的对比度信息是否有效;
其中,所述第二摄像机在所述至少一个相位对焦位置中的一个或多个相位对焦位置检测到的图像的对比度信息是有效的。
可选地,在一种可行的实现方式中,所述第二摄像机的对比度对焦位置还包括所述第二摄像机的对比度对焦移动距离,
所述处理器还用于:
控制所述第二摄像机移动至所述对比度对焦位置。
可选地,在一种可行的实现方式中,所述处理器还用于:
根据所述第一摄像机的下一相位对焦位置,确定所述第二摄像机的下一相位对焦位置,所述第二摄像机的下一相位对焦位置包括所述第二摄像机的相位对焦移动方向和相位对焦移动距离;
当所述对比度对焦移动方向与所述相位对焦移动方向一致时,控制所述第二摄像机移动至所述下一相位对焦位置。
可选地,在一种可行的实现方式中,所述处理器还用于:
当所述第一摄像机的相位检测自动对焦结束后,或者,所述对比度对焦移动方向与所述相位对焦移动方向不一致时,控制所述第二摄像机在所述对比度对焦移动方向上移动预设距离。
可选地,在一种可能的设计中,上述处理器实现的方案可以由芯片实现。
第四方面,提供了一种计算机程序产品,所述计算机程序产品包括:计算机程序(也可以称为代码,或指令),当所述计算机程序被运行时,使得计算机执行上述第一方面及第一方面中任一种可能实现方式中的方法。
第五方面,提供了一种计算机可读介质,所述计算机可读介质存储有计算机程序(也可以称为代码,或指令),当其在计算机上运行时,使得计算机执行上述第一方面及第一方面中任一种可能实现方式中的方法。
附图说明
图1是本申请实施例的一种可应用的场景示意图。
图2是根据本申请一个实施例的对焦流程示意图。
图3是根据本申请一个实施例的对焦方法的示意图。
图4是透镜成像原理示意图。
图5是根据本申请一个实施例的对焦过程示意图。
图6是根据本申请一个实施例的对焦过程示意图。
图7是根据本申请一个实施例的对焦过程示意图。
图8是根据本申请另一实施例的对焦流程示意图。
图9是根据本申请一个实施例的图像信号处理器的示意框图。
图10是根据本申请一个实施例的电子设备的示意框图。
图11是根据本申请一个实施例的电子设备的结构框图。
具体实施方式
下面将结合附图,对本申请中的技术方案进行描述。
图1示出了本申请实施例的一种可应用的场景示意图。如图1所示,电子设备100可以包括至少两个摄像机,例如,第一摄像机110和第二摄像机120,该电子设备100可以通过处理器控制器(图中未示出)控制第一摄像机110和第二摄像机120对焦物体130,并获取物体130的图像。
本申请实施例中的电子设备可以包括手机、平板电脑、个人数字助理(personal digital assistant,PDA)、笔记本电脑、台式电脑、销售终端(point of sales,POS)、监控设备等包括至少两个摄像机的设备。
应理解,本申请实施例中处理器也可以称为图像信号处理器、图像处理单元、处理单元或处理模块等。该处理器可以是该电子设备的CPU,该处理器也可以是与CPU不同的单独的器件,本申请实施例并不限于此。
前文已说明,现有的具有至少两个摄像机的电子设备中基于成本的考虑,通常仅有一个摄像机支持相位检测自动对焦PDAF功能。
应理解,PDAF的实现原理如下:在感光元件上预留出一些遮蔽像素点,专门用来进行图像的相位检测,通过像素之间的距离及其变化等来决定对焦的偏移值从而实现准确对焦。
然而,目前,针对仅有一个摄像机支持PDAF的上述电子设备拍摄物体的对焦过程耗时较长,且在对焦过程中拍摄画面较模糊,用户体验较差。
鉴于上述问题,本申请实施例提出了一种对焦的方法,通过让不支持PDAF功能的摄像机(第二摄像机)同步支持PDAF功能的摄像机(第一摄像机)的对焦行为,实现不支持PDAF摄像机的快速对焦,能够降低对焦时间,减小或避免对焦过程中模糊画面的问题,提升用户体验。
具体而言,在本申请实施例中,可以在第一摄像机进行相位检测对焦的过程中,根据第一摄像机的一个或多个相位对焦位置,确定所述第二摄像机的至少一个相位对焦位置,并控制所述第二摄像机移动至所述至少一个相位对焦位置。实现了不支持PDAF功能的摄像机同步支持PDAF功能的摄像机的对焦行为,实现不支持PDAF摄像机的快速对焦,能够降低整体对焦时间,提升用户体验。
并且,进一步地,本申请实施例中,在第二摄像机同步第一摄像机的过程中,还可以根据所述第二摄像机在所述至少一个相位对焦位置中的一个或多个相位对焦位置检测到的图像的对比度信息,确定所述第二摄像机的对比度对焦位置。也就是说,本申请实施例中,第二摄像机同步第一摄像机的过程中可以记录第二摄像机检测的图像的对比度信息,并根据该对比度信息确定第二摄像机的对比度对焦位置。由于在第二摄像机同步第一摄像机的行为可能导致第二摄像机的对焦移动方向不准确,或者,在第一摄像机对焦结束后第二摄像机需要再采用对比度对焦方式对焦。鉴于此,本申请实施例可以通过记录的对比度信息保证对焦移动方向的准确性,或者通过记录的对比度信息直接确定对比度对焦位置,避免了再次根据第一摄像机的对焦位置或者CDAF的方式确定该第二摄像机的准焦位置,能够降低对焦时间,提升用户体验。
以下,作为示例而非限定,对本申请实施例的对焦的方法结合具体的例子进行详细说明。
由于本申请实施例中,不支持PDAF功能的第二摄像机需要同步支持PDAF第一摄像机的对焦行为,因此,以下为了使得本申请实施例的方案容易理解,下面首先结合图2描述支持PDAF功能的摄像机的一种对焦的具体过程。
具体地,如图2所示的方法可以由处理器(例如,图像信号处理器)执行,图2所示的方法包括:
210,控制摄像机移动至第一对焦位置。
具体地,首先根据摄像机获取图像的PD信息确定该第一对焦位置,并控制该摄像机移动至该第一对焦位置。
220,确定当前位置与第二对焦位置间的距离。
具体地,在摄像机移动至第一对焦位置后,图像信号处理器重新获取图像的PD信息,并根据新获取的图像的PD信息确定第二对焦位置,之后确定该当前位置(即第一对焦位置)与该第二对焦位置的差距。
应理解,在实际应用中,在对焦的过程中该摄像机拍摄的物体可能会移动,或者该电子设备也会移动,或者,在210中移动摄像机之前,当摄像机所在的位置与准焦位置较远时,PDAF对焦精度可能会较差,导致在220中的当前位置(即执行完步骤210后的镜头的位置)与第二对焦位置(即当前位置获取的图像的PD信息确定的准焦位置)可能有一定的差距,因此,该当前位置与第二对焦位置可能具有一定的距离。
230,确定该距离是否大于第一阈值。
具体的,在该距离大于该第一阈值的情况下,控制摄像机移动至第二对焦位置,之后再判断摄像机移动至第二对焦位置后按照上述方法获取的新的距离是否大于第一阈值,重复210至230的过程,直到确定获取的最后一次距离小于或等于该第一阈值。
在距离小于或等于该第一阈值的情况下,或者上述重复过程达到重复阈值的情况下,执行步骤240。
应理解,本申请实施例中该第一阈值可以是预设的值,该第一阈值可以根据实际情况而定,本申请实施例并不对第一阈值的取值做限定。
240,判断是否满足PD自收敛条件。
具体的,本申请实施例中,在该距离小于或等于该第一阈值的情况下,即可认为满足PD自收敛条件。
可替代地,作为另一实施例,本申请实施例中可以进行如下变形,在230中当该距离小于或等于该第一阈值的情况下,仍然执行步骤210,重复上述过程,直到连续的n次(例如,2次或3次等)重复过程中该距离均小于或等于该第一阈值,或者重复次数达到重复阈值,之后执行步骤240。这种情况下,步骤240可以进行如下变形:判断存在连续的n次重复过程中该距离均小于或等于该第一阈值,则认为满足PD自收敛条件。
应理解,在230中该距离小于或等于该第一阈值的情况下,在重复上述过程时可以不移动摄像机的位置,直接再次根据摄像机在当前位置获取的图像的PD信息确定当前的对焦位置,然后进行后续的比较判断过程。
在满足PD自收敛条件的情况下,执行步骤260,对焦结束。
在不满足PD自收敛条件的情况下,执行步骤250。
250,使用小步长搜索准焦位置。
例如,采用CDAF小步长搜索的方式确定准焦位置,并控制摄像机移动至该准焦位置。
260,对焦结束。
应理解,上述步骤240可以为可选步骤,在230中当该距离小于或等于该第一阈值的情况下,或者上述230中距离大于该第一阈值的情况下,重复210至230的过程达到重复阈值的情况下,可以不执行步骤240,直接执行步骤260或者250,本申请实施例并不限于此。
还应理解,方法200中在执行250之前,可以先进行CDAF大步长搜索对焦位置,然后在大步长确定的对焦位置与当前位置差距小于第二阈值的情况下,再执行步骤250,本申请实施例并不限于此。应理解,CDAF大步长搜索对焦位置的幅度大于小步长搜索确定对焦位置的幅度。
应理解,本申请实施例中该第二阈值可以是预设的值,该第二阈值可以根据实际情况而定,本申请实施例并不对第二阈值的取值做限定。
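作为帮助理解的示意(并非实施例的限定),上述图2中210至240的相位对焦收敛循环可以用如下Python草图概括,其中 get_pd_target、move_to 等接口名与返回约定均为本文假设:

```python
def pdaf_converge(get_pd_target, move_to, threshold, max_rounds=10):
    """示意图2的PDAF收敛循环: 反复"取PD准焦位置->移动->比较距离"。

    get_pd_target(pos): 根据当前位置获取的图像PD信息返回准焦位置(假设接口)
    move_to(pos):       控制镜头马达移动至pos(假设接口)
    threshold:          第一阈值
    返回 (最终位置, 是否满足PD自收敛条件)
    """
    pos = get_pd_target(None)            # 210: 确定第一对焦位置并移动
    move_to(pos)
    for _ in range(max_rounds):
        target = get_pd_target(pos)      # 220: 重新获取PD信息确定下一对焦位置
        if abs(target - pos) <= threshold:
            return pos, True             # 230/240: 距离<=第一阈值, 满足PD自收敛
        pos = target                     # 距离>第一阈值, 移动后重复
        move_to(pos)
    return pos, False                    # 达到重复阈值仍未收敛, 转250小步长搜索
```

返回 False 时,可如步骤250所述改用CDAF小步长搜索确定准焦位置。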
下面结合图3描述本申请实施例的对焦的方法。图3示出了本申请实施例的对焦的方法示意性流程图。图3所示的方法可以应用于上述包括至少两个摄像机的电子设备中,该至少两个摄像机中的第一摄像机支持相位检测自动对焦PDAF功能,第一摄像机用于检测图像的相位信息;第二摄像机不支持PDAF功能,第二摄像机用于检测图像的对比度信息。图3所示的方法300可以由该电子设备的处理器执行。具体的,图3所示的方法中处理器可以按照类似图2的方法控制第一摄像机进行对焦,并且,该处理器并行地控制第二摄像机同步该第一摄像机的对焦行为,实现不支持PDAF功能的第二摄像机的快速对焦,能够降低整体对焦时间。
应理解,本申请实施例中,摄像机能够获取图像,摄像机可以包括镜片组及感光元件,可选地,本申请实施例中各个摄像机还可以包括各自的图像信号处理模块,可选地,摄像机可以不包括图像信号处理模块,由处理器统一进行图像信号处理,本申请实施例并不限于此。
如图3所示的方法300包括:
310,在第一摄像机进行相位检测自动对焦的过程中,根据该第一摄像机的一个或多个相位对焦位置,确定第二摄像机的至少一个相位对焦位置,并控制该第二摄像机移动至该至少一个相位对焦位置。
应理解,本申请实施例中,所述相位对焦位置是基于图像的相位信息计算得到的对焦位置。
具体而言,按照图2描述的方法,在第一摄像机进行相位检测自动对焦过程中,可以根据该第一摄像机采集的图像的相位检测PD信息,确定第一摄像机的至少一个相位对焦位置。本申请实施例中,可以根据该第一摄像机的至少一个相位对焦位置中的一个或多个相位对焦位置,确定所述第二摄像机的至少一个相位对焦位置。
应理解,本申请实施例中“对焦位置”可以包括移动方向和/或移动距离,该移动方向表示为了获得清晰图像,摄像机中可移动的镜片、镜片组或镜头,需要移动的方向;该移动距离表示为了获得清晰图像,摄像机中可移动的镜片、镜片组或镜头,需要在移动方向上移动的距离。换句话说,该对焦位置可以表示为了获得清晰图像,摄像机中可移动的镜片、镜片组或镜头,需要处于的位置。
下面介绍本申请实施例中根据第一摄像机的相位对焦位置确定第二摄像机的相位对焦位置的方法。
例如,一种可行的实现方式中:可以将第一摄像机的相位对焦位置换算成拍摄物体的深度信息或者物距,然后根据该深度信息或者物距计算出该第二摄像机的相位对焦位置。
例如,如图4所示,
透镜成像原理为:1/f=1/u+1/v。
其中u称为像距(对应到摄像机中为成像面与镜头组光心间距离);v称为物距,即物体距光心距离;f为镜头组焦距,对应某一镜头组时,f为固定常数。
本申请实施例的对焦过程即为调节像距u的过程。
在第一摄像机位置确定时,即第一摄像机的像距u1确定(对应第一摄像机的相位对焦位置),此时可根据透镜成像原理及第一摄像机的焦距f1计算出当前物距v(该物距为第一摄像机当前像距所能对焦清晰的物距);第二摄像机同步第一摄像机的位置意味着第二摄像机对焦到与第一摄像机相同的物距位置,因此根据物距v及第二摄像机的焦距f2,可计算出第二摄像机的像距u2,该u2即第二摄像机同步的位置(对应第二摄像机的相位对焦位置)。
再例如,另一种可行的实现方式中:由于电子设备中第一摄像机和第二摄像机的位置是固定的,一般在实际应用中,u1和u2存在一定的映射关系,本申请实施例可以通过该映射关系直接根据第一摄像机的相位对焦位置确定第二摄像机的相位对焦位置。
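上述两种换算方式中,基于透镜成像公式 1/f=1/u+1/v 的换算可以写成如下Python草图(仅为示意,各量单位需统一,函数名为本文假设):

```python
def sync_position(u1, f1, f2):
    """由第一摄像机当前像距u1换算第二摄像机应同步到的像距u2。

    先由(u1, f1)按 1/v = 1/f1 - 1/u1 解出当前对焦清晰的物距v,
    再由(v, f2)按 1/u2 = 1/f2 - 1/v 解出第二摄像机的像距u2。
    """
    v = 1.0 / (1.0 / f1 - 1.0 / u1)    # 物距v
    return 1.0 / (1.0 / f2 - 1.0 / v)  # 第二摄像机像距u2
```

例如当 f1 与 f2 相等时换算结果即为 u1 本身;实际产品中也可按文中所述直接使用 u1 到 u2 的标定映射关系。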
应理解,本申请实施例中,“第一摄像机进行相位检测自动对焦的过程中”可以表示第一摄像机在开启相位检测对焦之后至相位检测对焦结束之前的时间段中的全部或部分时间。换句话说,“第一摄像机进行相位检测自动对焦的过程中”可以表示第一摄像机在开启CDAF对焦之前的一个时间段,本申请实施例并不限于此。
具体而言,处理器可以在所述第一摄像机进行相位检测自动对焦的过程中,根据所述第一摄像机的一个或多个相位对焦位置,确定所述第二摄像机的至少一个相位对焦位置,并控制所述第二摄像机移动至所述至少一个相位对焦位置;实现处理器并行控制两个摄像机的对焦,使得不支持PDAF功能的第二摄像机同步支持PDAF功能的第一摄像机的对焦行为,实现不支持PDAF第二摄像机的快速对焦,能够降低整体对焦时间,提升用户体验。
应理解,本申请实施例中处理器在所述第一摄像机进行相位检测自动对焦的过程中,控制所述第二摄像机移动至所述至少一个相位对焦位置,也可以表述为处理器并行控制第一摄像机和第二摄像机。这里并行控制第一摄像机和第二摄像机可以理解为同时控制第一摄像机和第二摄像机,应理解,本申请实施例中的并行控制两个摄像机或者同时控制两个摄像机并不局限于控制两个摄像机在时间上的严格一致,例如,控制两个摄像机移动的时间上可以允许一定的时间间隔,本申请实施例并不限于此。
例如,在确定出第一摄像机的对焦位置后,可以先控制第一摄像机移动,然后在第一摄像机移动的过程中,根据第一摄像机的对焦位置确定出第二摄像机的对焦位置,再控制第二摄像机移动。再例如,在确定出第一摄像机的对焦位置后,可以先根据第一摄像机的对焦位置确定出第二摄像机的对焦位置,然后处理器并行控制该第一摄像机和该第二摄像机分别移动至相应的对焦位置。
具体地,处理器可以通过控制镜头马达驱动器移动摄像机至对应的对焦位置。
因此,本申请实施例中,处理器能够并行控制至少两个摄像机进行对焦,使得不支持PDAF功能的摄像机同步支持PDAF功能的摄像机的对焦行为,实现不支持PDAF摄像机的快速对焦,能够降低整体对焦时间,提升用户体验。
应理解,本申请实施例中,第一摄像机的一个或多个相位对焦位置可以包括第一摄像机在进行相位检测自动对焦过程中的一个或者连续的多个相位对焦位置。该一个或者多个相位对焦位置可以包括第一个相位对焦位置或者相位对焦过程中的中间的一个或多个相位对焦位置,或者最后一个相位对焦位置,本申请实施例并不限于此。
320,根据该第二摄像机在该至少一个相位对焦位置中的一个或多个相位对焦位置检测到的图像的对比度信息,确定该第二摄像机的对比度对焦位置,其中,该第二摄像机的对比度对焦位置包括该第二摄像机的对比度对焦移动方向。
应理解,本申请实施例中,所述对比度对焦位置是基于图像的对比度信息计算得到的对焦位置。
应理解,本申请实施例可以根据该记录的图像的对比度信息来确定对比度对焦移动方向,例如,当记录的对比度信息的曲线信息越来越好的情况下,则可以确定对比度对焦移动方向为朝曲线信息越来越好的走势方向。
具体而言,本申请实施例中,在第二摄像机同步第一摄像机的过程中,可以根据所述第二摄像机在所述至少一个相位对焦位置中的一个或多个相位对焦位置检测到的图像的对比度信息,确定所述第二摄像机的对比度对焦位置。也就是说,本申请实施例中,第二摄像机同步第一摄像机的过程中可以记录第二摄像机检测的图像的对比度信息,并根据该对比度信息确定第二摄像机的对比度对焦位置。由于在第二摄像机同步第一摄像机的行为可能导致第二摄像机的对焦移动方向不准确,或者,在第一摄像机对焦结束后第二摄像机需要再采用对比度对焦方式对焦。本申请实施例可以通过记录的对比度信息保证对焦移动方向的准确性,或者通过记录的对比度信息直接确定对比度对焦位置,避免了再次根据第一摄像机的对焦位置或者CDAF的方式确定该第二摄像机的准焦位置,能够降低对焦时间,提升用户体验。
可选地,作为另一实施例,所述方法还包括:
根据所述电子设备的移动状态,和/或,所述第二摄像机的图像稳定状态,控制所述第二摄像机是否检测图像的对比度信息;
其中,所述第二摄像机在所述至少一个相位对焦位置中的一个或多个相位对焦位置时被控制检测图像的对比度信息。
例如,在电子设备的移动状态,和/或第二摄像机的图像稳定状态满足检测条件时,处理器控制所述第二摄像机检测图像的对比度信息;或者,在该移动状态,和/或,图像稳定状态不满足检测条件时处理器不控制第二摄像机检测图像的对比度信息。
例如,在电子设备的移动状态为较稳定或移动较慢,例如,电子设备的陀螺仪或加速器检测到电子设备的移动性小于预设移动阈值时,和/或,图像的稳定状态为图像对比度变化较小(例如第二摄像机在当前对焦位置与前一对焦位置图像的对比度变化较小,例如小于预设对比度变化阈值)时,则认为满足检测条件。反之则可以认为不满足检测条件。
应理解,在满足检测条件时,本申请实施例也可以认为拍摄的画面是静止的。在不满足检测条件时,则可以认为拍摄的画面是变化的,本申请实施例并不限于此。
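上述检测条件可以概括为如下Python判断函数草图(阈值取值与归一化方式均为本文假设,实际依设备与算法而定;文中为"和/或"关系,此处以"和"为例):

```python
def scene_is_static(gyro_motion, contrast_now, contrast_prev,
                    motion_threshold=0.05, contrast_delta_threshold=0.1):
    """示意: 由设备移动性与对比度变化判断是否满足检测条件(画面静止)。

    gyro_motion: 陀螺仪/加速度计给出的移动量(假设已归一化)
    contrast_now/contrast_prev: 第二摄像机当前与前一对焦位置的图像对比度值
    """
    moving_slow = gyro_motion < motion_threshold             # 设备移动性小
    change = abs(contrast_now - contrast_prev) / max(contrast_prev, 1e-9)
    return moving_slow and change < contrast_delta_threshold  # 对比度变化小
```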
具体而言,本申请实施例中,在第二摄像机同步第一摄像机的过程中,还可以在所述第二摄像机处于所述至少一个相位对焦位置时,根据电子设备的移动状态和/或第二摄像机的图像稳定状态控制第二摄像机是否检测图像的对比度信息,例如,处理器确定第二摄像机在该至少一个相位对焦位置中的一个或多个相位对焦位置检测图像的对比度信息,并根据检测到的图像的对比度信息,确定所述第二摄像机的对比度对焦位置。
也就是说,本申请实施例在第二摄像机同步第一摄像机的过程中,且确定画面静止的情况下,才检测图像的对比度信息,由于静止画面获取的图像的对比度信息比较可靠,能够使得处理器根据获取的图像的对比度信息准确地确定出对比度对焦位置,进而本申请实施例可以通过记录的对比度信息保证对焦移动方向的准确性,或者通过记录的对比度信息直接确定对比度对焦位置,避免了再次根据第一摄像机的对焦位置或者CDAF的方式确定该第二摄像机的准焦位置,能够降低对焦时间,提升用户体验。
上面介绍了只有在画面静止的情况下,处理器才控制第二摄像机获取图像的对比度信息,并确定第二摄像机的对比度对焦位置。可替代地,本申请实施例也可以在第二摄像机的所有相位对焦位置均检测图像的对比度信息,然后,从检测的图像的对比度信息中获取有效的信息。
相应地,作为另一实施例,本申请实施例方法还包括:
根据所述电子设备的移动状态,和/或,所述第二摄像机的图像稳定状态,确定所述第二摄像机检测到的图像的对比度信息是否有效;
其中,所述第二摄像机在所述至少一个相位对焦位置中的一个或多个相位对焦位置检测到的图像的对比度信息是有效的。
例如,在电子设备的移动状态为较稳定或移动较慢,例如,电子设备的陀螺仪或加速器检测到电子设备的移动性小于预设移动阈值时,和/或,图像的稳定状态为图像对比度变化较小(例如第二摄像机在当前对焦位置与前一对焦位置图像的对比度变化较小,例如小于预设对比度变化阈值)时(即拍摄画面静止时),则认为第二摄像机检测到的图像的对比度信息是有效的。反之(即拍摄画面变化时)则可以认为第二摄像机检测到的图像的对比度信息是无效的。
应理解,本申请实施例中所述第二摄像机在所述至少一个相位对焦位置中的一个或多个相位对焦位置均是静止的,且该至少一个相位对焦位置中的多个相位对焦位置为连续的有效信息,即第二摄像机在该多个相位对焦位置时拍摄画面一直是静止的。
也就是说,本申请实施例在第二摄像机同步第一摄像机的过程中,检测图像的对比度信息,并确定画面静止的情况下的图像的对比度信息为有效的,由于静止画面获取的图像的对比度信息比较可靠,能够使得处理器根据获取的图像的对比度信息准确地确定出对比度对焦位置,进而本申请实施例可以通过记录的对比度信息保证对焦移动方向的准确性,或者通过记录的对比度信息直接确定对比度对焦位置,避免了再次根据第一摄像机的对焦位置或者CDAF的方式确定该第二摄像机的准焦位置,能够降低对焦时间,提升用户体验。
可选地,作为另一实施例,所述第二摄像机的对比度对焦位置还包括所述第二摄像机的对比度对焦移动距离,所述方法还包括:
控制所述第二摄像机移动至所述对比度对焦位置。
应理解,当记录的第二摄像机采集的图像的对比度信息的曲线有峰值的情况下,则根据该记录的对比度信息能够确定出该第二摄像机的准焦位置(即对比度对焦移动方向和移动距离),例如,该准焦位置可以为该峰值对应的位置,或者该准焦位置为通过对比度信息曲线拟合获取的准焦位置。
并且,在确定根据记录的所述第二摄像机采集的图像的对比度信息不能计算出所述第二摄像机的准焦位置的情况下,控制所述第二摄像机的镜头移动至采用CDAF方式确定的第二摄像机的准焦位置时,可以根据该记录的对比度信息来确定如何采用CDAF对焦,例如,当记录的对比度信息的曲线信息越来越好的情况下,则可以采用CDAF方式沿之前的方向继续移动镜头确定最终的准焦位置;或者,当记录的对比度信息的曲线信息越来越差的情况下,则可以采用CDAF方式反方向移动镜头确定最终的准焦位置等。
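根据记录的(位置, 对比度)序列判断对比度对焦移动方向、并在存在峰值时拟合准焦位置的过程,可以用如下Python草图表达(抛物线三点拟合只是曲线拟合的一种常见做法,并非文中限定的算法):

```python
def analyze_contrast(samples):
    """samples: 按记录顺序排列的(位置, 对比度)列表。

    返回(方向, 准焦位置或None): 方向+1表示对比度在上升(沿当前走势继续),
    -1表示在下降; 若存在内部峰值点, 用其相邻三点做抛物线拟合估计准焦位置。
    """
    if len(samples) < 2:
        return None, None
    direction = 1 if samples[-1][1] > samples[-2][1] else -1
    for i in range(1, len(samples) - 1):
        (x0, y0), (x1, y1), (x2, y2) = samples[i - 1:i + 2]
        if y1 >= y0 and y1 >= y2:                    # 内部峰值点
            denom = (x0 - x1) * (x0 - x2) * (x1 - x2)
            a = (x2 * (y1 - y0) + x1 * (y0 - y2) + x0 * (y2 - y1)) / denom
            b = (x2 ** 2 * (y0 - y1) + x1 ** 2 * (y2 - y0)
                 + x0 ** 2 * (y1 - y2)) / denom
            return direction, x1 if a == 0 else -b / (2 * a)  # 抛物线顶点
    return direction, None                           # 无峰值: 只能给出方向
```

以上文图5的例子为输入,记录[(0,300),(80,450),(100,400)]时方向为负、拟合峰值落在80附近。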
也就是说,本申请实施例中在第二摄像机同步第一摄像机的相位检测自动对焦过程中,只要能够根据记录的图像的对比度信息能够确定出第二摄像机的对比度对焦移动方向和移动距离,不管第一摄像机是否完成对焦,可以直接控制第二摄像机在该移动方向上移动该移动距离,完成第二摄像机的对焦。
应理解,CDAF对焦的原理为:通过对焦物体清晰度的变化确定准焦位置,具体的,当拍摄物体的画面经历一次清晰度的“上下坡”过程后,CDAF算法才能获得最合适的准焦位置。以采用CDAF方式拍摄一枚硬币为例,最开始画面是虚焦的状态,随后镜头移动,人们可以看到屏幕中的硬币逐渐清晰起来。直到某一个位置(合焦状态)硬币最为清晰,但摄像机自身是意识不到此时已经合焦完毕的,镜头会继续移动,此时人们会看到硬币又变得模糊。这时摄像机模组才意识到镜头“走过站了”,于是回退至刚才清晰的焦点位置,这样一次对焦就完成了。通过CDAF方式对焦时间较长,对焦过程中画面较模糊,用户体验较差。
具体而言,本申请实施例中,在第二摄像机同步第一摄像机的过程中,可以根据所述第二摄像机在所述至少一个相位对焦位置中的一个或多个相位对焦位置检测到的图像的对比度信息,确定所述第二摄像机的对比度对焦移动方向和所述第二摄像机的对比度对焦移动距离。由于已确定出第二摄像机的准焦位置,因此,处理器可以直接控制第二摄像机在该第二摄像机的对比度对焦移动方向上移动该对比度对焦移动距离,完成第二摄像机的对焦,无需再控制第二摄像机同步第一摄像机的行为,也无需控制第二摄像机再采用CDAF方式确定准焦位置,能够降低对焦时间,提升用户体验。
可选地,作为另一实施例,所述方法还包括:
根据所述第一摄像机的下一相位对焦位置,确定所述第二摄像机的下一相位对焦位置,所述第二摄像机的下一相位对焦位置包括所述第二摄像机的相位对焦移动方向和相位对焦移动距离;
当所述对比度对焦移动方向与所述相位对焦移动方向一致时,控制所述第二摄像机移 动至所述下一相位对焦位置。
具体的,在第一摄像机进行相位检测自动对焦的过程中(即第一摄像机未结束相位对焦时),在第一摄像机移动至当前的相位对焦位置后,如上文中步骤230所描述的,在距离大于该第一阈值的情况下,控制第一摄像机移动至下一相位对焦位置,并根据该第一摄像机的下一相位对焦位置,确定所述第二摄像机的下一相位对焦位置,当所述对比度对焦移动方向与所述相位对焦移动方向一致时,控制所述第二摄像机移动至所述下一相位对焦位置,重复该过程,直到第一摄像机的相位对焦结束,或者,根据记录的对比度信息能够确定出第二摄像机的对比度对焦移动方向和移动距离。
可选地,作为另一实施例,当所述第一摄像机的相位检测自动对焦结束后,或者,所述对比度对焦移动方向与所述相位对焦移动方向不一致时,控制所述第二摄像机在所述对比度对焦移动方向上移动预设距离。
具体的,上述相位对焦移动方向与对比度对焦移动方向不一致时,本申请实施例可以停止第二摄像机同步第一摄像机的行为,直接控制第二摄像机在该对比度对焦移动方向采用CDAF方式移动预设距离,例如,采用CDAF的固定步长在该对比度对焦移动方向上移动进行CDAF对焦。
或者,在当所述第一摄像机的相位检测自动对焦结束后控制第二摄像机在该对比度对焦移动方向采用CDAF方式移动预设距离,例如,采用CDAF的固定步长在该对比度对焦移动方向上移动进行CDAF对焦。
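结合上文,第二摄像机每一步"继续同步还是转CDAF"的判断可以概括为如下Python草图(返回的动作名仅为本文假设的标记,并非实施例的限定):

```python
def next_action(pdaf_finished, contrast_dir, phase_dir):
    """示意: 第二摄像机的同步/转CDAF决策。

    contrast_dir/phase_dir: +1或-1, 分别为对比度对焦移动方向与
    第二摄像机的相位对焦移动方向; contrast_dir为None表示记录的
    对比度信息尚不足以判断方向。
    """
    if pdaf_finished:
        return "cdaf"      # 第一摄像机相位对焦结束: 沿对比度方向接CDAF
    if contrast_dir is None or contrast_dir == phase_dir:
        return "sync"      # 方向一致(或未知): 继续同步第一摄像机
    return "cdaf"          # 方向相反: 停止同步, 沿对比度方向接CDAF
```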
因此,本申请实施例通过在第二摄像机同步第一摄像机的过程中记录的图像的对比度信息,确定出第二摄像机的对比度对焦移动方向,进而可以在第二摄像机的相位对焦方向错误,即第二摄像机的相位对焦方向与该对比度对焦方向相反时,停止第二摄像机同步第一摄像机的行为,直接在该对比度对焦方向上采用CDAF对焦,能够避免第二摄像机不必要的移动行为,保证第二摄像机准确快速地对焦。
或者,本申请实施例通过在第二摄像机同步第一摄像机的过程中记录的图像的对比度信息,确定出第二摄像机的对比度对焦移动方向,进而可以在第一摄像机对焦结束后,直接在该对比度对焦方向上采用CDAF对焦,能够避免第二摄像机以随机方向采用CDAF方式对焦,保证第二摄像机准确快速地对焦。
下面结合图5至图7的具体例子,描述在画面静止时本申请实施例对焦的过程。
例如,如图5所示,在第一摄像机开始自动对焦时,假设初始第一摄像机和第二摄像机位置为0,应理解,第一摄像机的位置可以表示第一摄像机的可移动的镜片或镜片组在其移动方向(例如,该电子设备为手机,该方向可以为垂直于手机显示屏的方向)上的一个位置,第二摄像机的位置可以表示第二摄像机的可移动的镜片或镜片组在其移动方向(例如,该电子设备为手机,该方向可以为垂直于手机显示屏的方向)上的一个位置。应理解,虽然第一摄像机和第二摄像机的位置均为0,但该两个摄像机的实际空间位置不同。还应理解,本申请实施例中为了便于描述,设置第一摄像机和第二摄像机位置均为0处,但本申请实施例并不限于此,在实际应用中第一摄像机和第二摄像机的初始位置可能不同。
假设由第一摄像机位置同步至第二摄像机时位置比例为1:1,应理解,实际使用中,该比例为固定比例,但该比例值会因第一摄像机和第二摄像机间的差异而不同,例如,该比例可以大于1:1或者小于1:1,本申请实施例并不限于此。假设第一摄像机在位置0时,获取的对比度值为100,如图5所示,写作[0,100](应理解,本申请实施例中也可不关注第一摄像机的对比度值,此处为了上下对齐,描述的统一,设置了第一摄像机的对比度值,但本申请实施例并不限于此)。第二摄像机位置及对比度为[0,300]。设在0位置时第一摄像机的PD给出的准焦位置值为80(PD所给出的值为准焦位置与当前位置的距离及方向,在此我们规定值为正表示正方向,即向右,值为负表示负方向,即向左),同步至第二摄像机,第二摄像机的准焦位置也为80,则将第一摄像机和第二摄像机推至80。在位置80时,第一摄像机获取的对比度值为200,第二摄像机获取的对比度值为450(当保存的位置编号(code)和对比度值有两组及以上时,便可以判断对比度(contrast)对焦移动方向),由于第二摄像机的对比度值在上升,因此对比度值所指向的方向(对比度对焦移动方向)为正向,需要保持当前的方向继续向下搜索,反之则需反向。此时,PD给出的值为正,即相位对焦移动方向与对比度对焦移动方向一致,因此继续同步第一摄像机的位置,第一摄像机和第二摄像机推至位置100处。在位置100时,第二摄像机对比度为400,相对于位置80时是下降的,因此对比度对焦移动方向为负。而此时对比度曲线存在峰值(位置80处),因此可采用曲线拟合获取准焦位置(即获取在移动方向上的移动距离),或由当前位置开始,朝着80的位置采用CDAF的方式获取准焦位置。
应理解,本申请实施例中所谓的CDAF的方式是指采用固定步长,推一步获取一个对比度信息,在曲线存在峰值时采用曲线拟合获取准焦位置的方法。如从位置100向位置80采用CDAF的方式对焦,设步长为15(步长因各算法不同而不同,但都是固定值),则在位置100处保存对比度值[100,400],推15至位置85,获取对比度值[85,445],再推15至位置70获取对比度值[70,x]。若x小于445,则位置85处为峰值,采用曲线拟合可获取准焦位置,将马达推至准焦位置即可结束本次对焦;若x大于445,则需继续按照15的步长向左推动,直至在某一个位置的对比度值比上一次的对比度值小为止,此时再采用曲线拟合获取准焦位置。
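上述固定步长回搜并在对比度回落后做曲线拟合的CDAF过程,可以用如下Python草图模拟(对比度函数与步长取值仅沿用上文例子,均为本文假设;拟合采用三点抛物线顶点):

```python
def cdaf_search(start, step_dir, step, get_contrast, max_steps=20):
    """示意CDAF: 固定步长推一步测一次对比度, 对比度回落后拟合峰值。

    step_dir: +1向右/-1向左; get_contrast(pos): 该位置图像的对比度(假设接口)
    """
    history = [(start, get_contrast(start))]
    pos = start
    for _ in range(max_steps):
        pos += step_dir * step
        c = get_contrast(pos)
        history.append((pos, c))
        if c < history[-2][1]:             # 对比度开始下降: 镜头已"走过站"
            if len(history) < 3:
                return history[-2][0]      # 样本不足时退回上一位置
            (x0, y0), (x1, y1), (x2, y2) = history[-3:]
            denom = (x0 - x1) * (x0 - x2) * (x1 - x2)
            a = (x2 * (y1 - y0) + x1 * (y0 - y2) + x0 * (y2 - y1)) / denom
            b = (x2 ** 2 * (y0 - y1) + x1 ** 2 * (y2 - y0)
                 + x0 ** 2 * (y1 - y2)) / denom
            return -b / (2 * a)            # 抛物线顶点作为准焦位置
    return pos
```

以峰值在85处的假设对比度曲线为例,从位置100以步长15向左搜索即可拟合出85附近的准焦位置。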
由于根据记录的对比度信息能够确定出该第二摄像机的准焦位置(即对比度对焦移动方向和移动距离),可以直接控制第二摄像机在该移动方向上移动至该准焦位置,完成第二摄像机的对焦。
具体而言,本申请实施例中,在第二摄像机同步第一摄像机的过程中,可以根据所述第二摄像机在所述至少一个相位对焦位置中的一个或多个相位对焦位置检测到的图像的对比度信息,确定所述第二摄像机的对比度对焦移动方向和所述第二摄像机的对比度对焦移动距离。由于已确定出第二摄像机的准焦位置,因此,处理器可以直接控制第二摄像机在该第二摄像机的对比度对焦移动方向上移动该对比度对焦移动距离,完成第二摄像机的对焦,无需再控制第二摄像机同步第一摄像机的行为,也无需控制第二摄像机再采用CDAF方式确定准焦位置,能够降低对焦时间,提升用户体验。
再例如,如图6所示,第一摄像机在位置0时,PD给出的值为80,因此第一摄像机下一步需要推至位置80;设第二摄像机在位置0时的对比度为300,下一步同步第一摄像机的位置也将马达推至80;设在位置80时,第一摄像机的PD值为20,第二摄像机在位置80时获取的对比度值为450,此时由于第二摄像机的对比度值在上升,因此对比度判断的方向为正向向右,而第一摄像机的PD给出的值为20,值为正向右,因此同步第一摄像机的方向与第二摄像机保存的对比度方向一致,因此继续同步第一摄像机的位置。在位置100时,若第一摄像机获取的PD值为-5,第二摄像机获取的对比度值为500,此时同步第一摄像机时的方向(即第二摄像机的相位对焦移动方向)为负向左,而对比度值依然在上升,对比度获取的方向(即第二摄像机的对比度对焦移动方向)为正向右。由于二者方向不一致,因此第二摄像机停止同步第一摄像机的行为,需继续向右接CDAF对焦。
应理解,上述图6中的例子是在第三步的时候PD方向与对比度方向不一致,可选地,若在第二步,位置80处,第二摄像机的对比度若小于300,即在第二步PD的方向与对比度方向不一致时,第二摄像机则将在第二步处开始向左接CDAF对焦。
再例如,如图7所示,前面三步与图6一致,在第三步时,若PD给出的值是5,第二摄像机对比度在上升,因此对比度方向与PD给出的方向一致,为正向向右,因此第三步时第二摄像机依然同步第一摄像机的行为将马达推至105;此时第一摄像机因PD值很小,表示当前位置离准焦位置非常近(假设PD收敛阈值大于5),可结束对焦,此时第一摄像机可推至105,也可保持当前位置,具体根据各个算法不同而不同。当第二摄像机推至105时,由于第一摄像机已结束对焦,因此无法再同步第一摄像机行为,需要自行判断后续流程,若在105处获取的对比度值x小于500,则在位置100处为对比度峰值处,可直接进行曲线拟合找到第二摄像机的峰值点或由105向100的方向按照固定步长接CDAF。若x大于500,则表明峰值点依然在正方向右侧,此时第二摄像机将从当前位置开始,以固定步长向右接CDAF对焦。
应理解,作为另一实施例,若第一步中第一摄像机马达已位于准焦位置,则第二摄像机直接接CDAF,步长固定,方向可能向左也可能向右,根据具体实现确定。
应理解,本申请实施例中在控制第一摄像机和该第二摄像机分别完成对焦后,即可以认为该两个摄像机已完成对焦,之后处理器可以分别通过第一摄像机和第二摄像机获取图像,并将该两个摄像机获取的图像进行合成形成最终的图像。可选地,在该电子设备具有显示屏的情况下,该电子设备还可以通过显示屏显示该最终的图像。
应理解,在两个摄像机分别完成对焦后的图像处理过程可以参照现有多摄像机的图像合成算法,本申请实施例并不对此做限定。
上面结合图2至图7描述了本申请实施例的对焦的具体过程。下面结合图8具体的例子详细描述本申请实施例的对焦的方法。图8的方法可以应用于上述包括至少两个摄像机的电子设备中,该至少两个摄像机中的第一摄像机支持相位检测自动对焦PDAF功能,第二摄像机不支持PDAF功能。图8所示的方法可以由该电子设备的处理器执行。应理解,图8中仅描述了两个摄像机(双摄)对焦的例子,但本申请实施例并不限于此,当电子设备包括三个或更多个摄像头时对焦的过程与此类似,为避免重复,此处不再赘述。
图8所示的方法800包括:
801,开始双摄对焦。
810,确定画面变化。
具体的,确定画面变化以及画面静止的描述可以参考上文中的描述,为避免重复,此处不再赘述。
811,第一摄像机进行相位检测自动对焦。
应理解,第一摄像机进行相位检测自动对焦的过程可以参考图2中的描述,为避免重复,此处不再赘述。
812,第二摄像机同步第一摄像机的位置。
具体的,第二摄像机同步第一摄像机的位置,且不记录第二摄像机的图像对比度信息。
813,判断画面是否静止。
在画面静止的情况下,执行步骤814,否则执行步骤810。
814,判断第一摄像机是否已准焦。
在第一摄像机已准焦的情况下,执行步骤815和步骤816,在第一摄像机未准焦的情况下,执行步骤820。
815,第一摄像机完成对焦。
816,第二摄像机进行CDAF对焦。
应理解,步骤810至步骤816描述了在画面变化的情况下的对焦过程,具体的,在画面变化的情况下,第二摄像机同步第一摄像机的行为,不记录第二摄像机的图像对比度信息。
因此,本申请实施例中,使得不支持PDAF功能的摄像机同步支持PDAF功能的摄像机的对焦行为,实现不支持PDAF摄像机的快速对焦,能够降低整体对焦时间,提升用户体验。
820,确定画面静止。
具体的,确定画面变化以及画面静止的描述可以参考上文中的描述,为避免重复,此处不再赘述。
821,第一摄像机进行相位检测自动对焦。
应理解,第一摄像机进行相位检测自动对焦的过程可以参考图2中的描述,为避免重复,此处不再赘述。
822,第二摄像机同步第一摄像机的位置。
具体的,第二摄像机同步第一摄像机的位置,且记录第二摄像机的图像对比度信息。
823,判断根据记录的图像对比度信息能否计算出对比度对焦移动方向。
在能够计算出对比度对焦移动方向的情况下,执行步骤824,在不能计算出对比度对焦移动方向的情况下,执行步骤822。
824,判断根据记录的图像对比度信息能否计算出对比度对焦移动距离。
在能够计算出对比度对焦移动距离的情况下,执行步骤825,在不能计算出对比度对焦移动距离的情况下,执行步骤826。
825,第二摄像机推至准焦位置(即在对比度对焦移动方向上移动对比度对焦移动距离)。
826,第二摄像机的相位对焦移动方向与对比度对焦方向是否一致。
在一致的情况下执行步骤822,在不一致的情况下,执行步骤829。
827,判断第一摄像机是否满足相位对焦自收敛。
在第一摄像机相位对焦自收敛的情况下,执行步骤828,在第一摄像机不满足相位对焦自收敛的情况下,执行步骤821和步骤826。
可选地,作为另一实施例,在图8所示的方法中,与图2类似,本申请实施例还可以在不满足PD自收敛的次数达到预设重复阈值的情况下,可以停止第一摄像机的相位对焦过程,然后使用CDAF确定第一摄像机的准焦位置。
828,第一摄像机完成对焦。
829,控制第二摄像机在所述对比度对焦移动方向上进行CDAF对焦。
应理解,步骤820至步骤829描述了在画面静止的情况下的对焦过程,具体的,在画面静止的情况下,第二摄像机同步第一摄像机的行为,记录第二摄像机的图像对比度信息。并根据记录的对比度信息确定第二摄像机的对比度对焦位置(该对比度对焦位置包括对比度对焦移动方向,或者,该对比度对焦位置包括对比度对焦移动方向和对比度对焦移动距离)。
也就是说,本申请实施例在第二摄像机同步第一摄像机的过程中,且确定画面静止的情况下,才检测图像的对比度信息,由于静止画面获取的图像的对比度信息比较可靠,能够使得处理器根据获取的图像的对比度信息准确地确定出对比度对焦位置,进而本申请实施例可以通过记录的对比度信息保证对焦移动方向的准确性,或者通过记录的对比度信息直接确定对比度对焦位置,避免了再次根据第一摄像机的对焦位置或者CDAF的方式确定该第二摄像机的准焦位置,能够降低对焦时间,提升用户体验。
应理解,上文中图2至图8的例子,仅仅是为了帮助本领域技术人员理解本发明实施例,而非要将本发明实施例限于所例示的具体数值或具体场景。本领域技术人员根据所给出的图2至图8的例子,显然可以进行各种等价的修改或变化,这样的修改或变化也落入本发明实施例的范围内。
应理解,上述各过程的序号的大小并不意味着执行顺序的先后,各过程的执行顺序应以其功能和内在逻辑确定,而不应对本发明实施例的实施过程构成任何限定。
上文中,结合图2至图8详细描述了本发明实施例的对焦的方法,下面结合图9描述本申请实施例的处理器,结合图10描述本申请实施例的电子设备,结合图11描述本申请实施例的手机。
图9是根据本申请一个实施例的处理器的示意框图。图9所示处理器900包括:处理单元910和存储单元920。所述存储单元920用于存储代码,所述处理单元910用于执行所述存储单元920中的代码,执行上述图2至图8所示的方法。具体的,该处理器实现的方法可以参见上文中图2至图8的描述,为避免重复,此处不再赘述。
应理解,处理器900也可以称为图像信号处理器、图像处理单元、处理单元或处理模块等。该处理器900可以是该电子设备的CPU,该图像信号处理器也可以是与CPU不同的单独的器件,本申请实施例并不限于此。
应注意,本发明实施例中的处理器可以是一种集成电路芯片,具有信号的处理能力。在实现过程中,上述方法实施例的各步骤可以通过处理器中的硬件的集成逻辑电路或者软件形式的指令完成。上述的处理器可以是通用处理器、数字信号处理器(digital signal processor,DSP)、专用集成电路(application specific integrated circuit,ASIC)、现成可编程门阵列(field programmable gate array,FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件。可以实现或者执行本发明实施例中的公开的各方法、步骤及逻辑框图。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。结合本发明实施例所公开的方法的步骤可以直接体现为硬件译码处理器执行完成,或者用译码处理器中的硬件及软件模块组合执行完成。软件模块可以位于随机存储器、闪存、只读存储器、可编程只读存储器或者电可擦写可编程存储器、寄存器等本领域成熟的存储介质中。该存储介质位于存储器,处理器读取存储器中的信息,结合其硬件完成上述方法的步骤。
可以理解,本发明实施例中的存储单元也可称为存储器,该存储器可以是易失性存储器或非易失性存储器,或可包括易失性和非易失性存储器两者。其中,非易失性存储器可以是只读存储器(read-only memory,ROM)、可编程只读存储器(programmable ROM,PROM)、可擦除可编程只读存储器(erasable PROM,EPROM)、电可擦除可编程只读存储器(electrically EPROM,EEPROM)或闪存。易失性存储器可以是随机存取存储器(random access memory,RAM),其用作外部高速缓存。通过示例性但不是限制性说明,许多形式的RAM可用,例如静态随机存取存储器(static RAM,SRAM)、动态随机存取存储器(dynamic RAM,DRAM)、同步动态随机存取存储器(synchronous DRAM,SDRAM)、双倍数据速率同步动态随机存取存储器(double data rate SDRAM,DDR SDRAM)、增强型同步动态随机存取存储器(enhanced SDRAM,ESDRAM)、同步连接动态随机存取存储器(synchlink DRAM,SLDRAM)和直接内存总线随机存取存储器(direct rambus RAM,DR RAM)。
图10是根据本申请一个实施例的电子设备的示意框图。图10所示的电子设备1000包括处理器1010和至少两个摄像机1020,其中,该至少两个摄像机中的第一摄像机1021支持相位检测自动对焦PDAF功能,第二摄像机1022不支持PDAF功能。
应理解,该处理器1010与图9所示的处理器900类似,能够实现图2至图8所示的方法的功能,该处理器1010可以包括实现上述方法的所有模块或单元,为避免重复,此处不再赘述。
应理解,本申请实施例的电子设备还可以包括其他模块,下面结合图11描述本申请实施例中电子设备为手机的例子。
具体的,图11示出的是与本发明实施例相关的手机1100的部分结构的框图。参考图11,手机1100包括:射频(radio frequency,RF)电路1110、存储器1120、其他输入设备1130、显示屏1140、传感器1150、音频电路1160、I/O子系统1170、处理器1180、电源1190以及至少两个摄像机11100等部件。
本领域技术人员可以理解,图11中示出的手机结构并不构成对手机的限定,可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。本领域技术人员可以理解显示屏1140属于用户界面(user interface,UI),且手机1100可以包括比图示更多或者更少的用户界面。
下面结合图11对手机1100的各个构成部件进行具体的介绍:
RF电路1110可用于收发信息或通话过程中信号的接收和发送,特别地,将基站的下行信息接收后,给处理器1180处理;另外,将涉及上行的数据发送给基站。通常,RF电路包括但不限于天线、至少一个放大器、收发信机、耦合器、低噪声放大器(Low Noise Amplifier,LNA)、双工器等。此外,RF电路1110还可以通过无线通信与网络和其他设备通信。所述无线通信可以使用任一通信标准或协议,包括但不限于全球移动通讯系统(Global System of Mobile communication,GSM)、通用分组无线服务(General Packet Radio Service,GPRS)、码分多址(Code Division Multiple Access,CDMA)、宽带码分多址(Wideband Code Division Multiple Access,WCDMA)、长期演进(Long Term Evolution,LTE)、电子邮件、短消息服务(Short Messaging Service,SMS)等。
存储器1120可用于存储软件程序以及模块,处理器1180通过运行存储在存储器1120的软件程序以及模块,从而执行手机1100的各种功能应用以及数据处理。存储器1120可主要包括存储程序区和存储数据区,其中,存储程序区可存储操作系统、至少一个功能所需的应用程序(比如声音播放功能、图象播放功能等)等;存储数据区可存储根据手机1100的使用所创建的数据(比如音频数据、电话本等)等。此外,存储器1120可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、或其他易失性固态存储器件。
其他输入设备1130可用于接收输入的数字或字符信息,以及产生与手机1100的用户设置以及功能控制有关的键信号输入。具体地,其他输入设备1130可包括但不限于物理键盘、功能键(比如音量控制按键、开关按键等)、轨迹球、鼠标、操作杆、光鼠(光鼠是不显示可视输出的触摸敏感表面,或者是由触摸屏形成的触摸敏感表面的延伸)等中的一种或多种。其他输入设备1130与I/O子系统1170的其他输入设备控制器1171相连接,在其他设备输入控制器1171的控制下与处理器1180进行信号交互。
显示屏1140可用于显示由用户输入的信息或提供给用户的信息以及手机1100的各种菜单,还可以接受用户输入。具体的,显示屏1140可包括显示面板1141,以及触控面板1142。其中,显示面板1141可以采用液晶显示器(Liquid Crystal Display,LCD)、有机发光二极管(Organic Light-Emitting Diode,OLED)等形式来配置。触控面板1142,也称为触摸屏、触敏屏等,可收集用户在其上或附近的接触或者非接触操作(比如用户使用手指、触笔等任何适合的物体或附件在触控面板1142上或在触控面板1142附近的操作,也可以包括体感操作;该操作包括单点控制操作、多点控制操作等操作类型。),并根据预先设定的程式驱动相应的连接装置。可选的,触控面板1142可包括触摸检测装置和触摸控制器两个部分。其中,触摸检测装置检测用户的触摸方位、姿势,并检测触摸操作带来的信号,将信号传送给触摸控制器;触摸控制器从触摸检测装置上接收触摸信息,并将它转换成处理器能够处理的信息,再送给处理器1180,并能接收处理器1180发来的命令并加以执行。此外,可以采用电阻式、电容式、红外线以及表面声波等多种类型实现触控面板1142,也可以采用未来发展的任何技术实现触控面板1142。进一步的,触控面板1142可覆盖显示面板1141,用户可以根据显示面板1141显示的内容(该显示内容包括但不限于,软键盘、虚拟鼠标、虚拟按键、图标等等),在显示面板1141上覆盖的触控面板1142上或者附近进行操作,触控面板1142检测到在其上或附近的操作后,通过I/O子系统1170传送给处理器1180以确定用户输入,随后处理器1180根据用户输入通过I/O子系统1170在显示面板1141上提供相应的视觉输出。虽然在图11中,触控面板1142与显示面板1141是作为两个独立的部件来实现手机1100的输入和输出功能,但是在某些实施例中,可以将触控面板1142与显示面板1141集成而实现手机1100的输入和输出功能。
手机1100还可包括至少一种传感器1150,比如光传感器、运动传感器以及其他传感器。具体地,光传感器可包括环境光传感器及接近传感器,其中,环境光传感器可根据环境光线的明暗来调节显示面板1141的亮度,接近传感器可在手机1100移动到耳边时,关闭显示面板1141和/或背光。作为运动传感器的一种,加速计传感器可检测各个方向上(一般为三轴)加速度的大小,静止时可检测出重力的大小及方向,可用于识别手机姿态的应用(比如横竖屏切换、相关游戏、磁力计姿态校准)、振动识别相关功能(比如计步器、 敲击)等;至于手机1100还可配置的陀螺仪、气压计、湿度计、温度计、红外线传感器等其他传感器,在此不再赘述。
音频电路1160、扬声器1161,麦克风1162可提供用户与手机1100之间的音频接口。音频电路1160可将接收到的音频数据转换后的信号,传输到扬声器1161,由扬声器1161转换为声音信号输出;另一方面,麦克风1162将收集的声音信号转换为信号,由音频电路1160接收后转换为音频数据,再将音频数据输出至RF电路1110以发送给比如另一手机,或者将音频数据输出至存储器1120以便进一步处理。
I/O子系统1170用来控制输入输出的外部设备,可以包括其他设备输入控制器1171、传感器控制器1172、显示控制器1173、图像信号处理器1174。可选的,图像信号处理器1174用于控制至少两个摄像机11100拍摄物体,执行上述图2至图8所示的对焦的方法;一个或多个其他输入控制设备控制器1171从其他输入设备1130接收信号和/或者向其他输入设备1130发送信号,其他输入设备1130可以包括物理按钮(按压按钮、摇臂按钮等)、拨号盘、滑动开关、操纵杆、点击滚轮、光鼠(光鼠是不显示可视输出的触摸敏感表面,或者是由触摸屏形成的触摸敏感表面的延伸)。值得说明的是,其他输入控制设备控制器1171可以与任一个或者多个上述设备连接。所述I/O子系统1170中的显示控制器1173从显示屏1140接收信号和/或者向显示屏1140发送信号。显示屏1140检测到用户输入后,显示控制器1173将检测到的用户输入转换为与显示在显示屏1140上的用户界面对象的交互,即实现人机交互。传感器控制器1172可以从一个或者多个传感器1150接收信号和/或者向一个或者多个传感器1150发送信号。
处理器1180是手机1100的控制中心,利用各种接口和线路连接整个手机的各个部分,通过运行或执行存储在存储器1120内的软件程序和/或模块,以及调用存储在存储器1120内的数据,执行手机1100的各种功能和处理数据,从而对手机进行整体监控。可选的,处理器1180可包括一个或多个处理单元;优选的,处理器1180可集成应用处理器和调制解调处理器,其中,应用处理器主要处理操作系统、用户界面和应用程序等,调制解调处理器主要处理无线通信。可以理解的是,上述调制解调处理器也可以不集成到处理器1180中。可选地,上述图像信号控制器也可以集成在处理器1180中,本申请实施例并不限于此。
手机1100还包括给各个部件供电的电源1190(比如电池),优选的,电源可以通过电源管理系统与处理器1180逻辑相连,从而通过电源管理系统实现管理充电、放电、以及功耗等功能。
尽管未示出,手机1100还可以包括蓝牙模块等,在此不再赘述。
本发明实施例还提供了一种计算机可读介质,其上存储有计算机程序,该计算机程序被计算机执行时实现上述任一方法实施例的方法。
本发明实施例还提供了一种计算机程序产品,该计算机程序产品被计算机执行时实现上述任一方法实施例的方法。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。该计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行该计算机指令时,全部或部分地产生按照本发明实施例所述的流程或功能。该计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。该计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,该计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线(digital subscriber line,DSL))或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。该计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。该可用介质可以是磁性介质(例如,软盘、硬盘、磁带)、光介质(例如,高密度数字视频光盘(digital video disc,DVD))、或者半导体介质(例如,固态硬盘(solid state disk,SSD))等。
应理解,上述图像信号处理器可以是一个芯片,该处理器可以通过硬件来实现也可以通过软件来实现,当通过硬件实现时,该处理器可以是逻辑电路、集成电路等;当通过软件来实现时,该处理器可以是一个通用处理器,通过读取存储器中存储的软件代码来实现,该存储器可以集成在处理器中,也可以位于该处理器之外,独立存在。
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述的系统、装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请所提供的几个实施例中,应该理解到,所揭露的系统、装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。
所述功能如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(read-only memory,ROM)、随机存取存储器(random access memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (14)

  1. 一种电子设备,其特征在于,包括:
    处理器和至少两个摄像机;
    所述至少两个摄像机中的第一摄像机用于检测图像的相位信息;
    所述至少两个摄像机中的第二摄像机用于检测图像的对比度信息;
    所述处理器用于:
    在所述第一摄像机进行相位检测自动对焦的过程中,根据所述第一摄像机的一个或多个相位对焦位置,确定所述第二摄像机的至少一个相位对焦位置,并控制所述第二摄像机移动至所述至少一个相位对焦位置;
    根据所述第二摄像机在所述至少一个相位对焦位置中的一个或多个相位对焦位置检测到的图像的对比度信息,确定所述第二摄像机的对比度对焦位置,其中,所述第二摄像机的对比度对焦位置包括所述第二摄像机的对比度对焦移动方向。
  2. 根据权利要求1所述的电子设备,其特征在于,
    所述处理器还用于:
    根据所述电子设备的移动状态,和/或,所述第二摄像机的图像稳定状态,控制所述第二摄像机是否检测图像的对比度信息;
    其中,所述处理器控制所述第二摄像机在所述至少一个相位对焦位置中的一个或多个相位对焦位置时检测图像的对比度信息。
  3. 根据权利要求1所述的电子设备,其特征在于,
    所述处理器还用于:
    根据所述电子设备的移动状态,和/或,所述第二摄像机的图像稳定状态,确定所述第二摄像机检测到的图像的对比度信息是否有效;
    其中,所述第二摄像机在所述至少一个相位对焦位置中的一个或多个相位对焦位置检测到的图像的对比度信息是有效的。
  4. 根据权利要求1所述的电子设备,其特征在于,
    所述第二摄像机的对比度对焦位置还包括所述第二摄像机的对比度对焦移动距离,
    所述处理器还用于:
    控制所述第二摄像机移动至所述对比度对焦位置。
  5. 根据权利要求1所述的电子设备,其特征在于,
    所述处理器还用于:
    根据所述第一摄像机的下一相位对焦位置,确定所述第二摄像机的下一相位对焦位置,所述第二摄像机的下一相位对焦位置包括所述第二摄像机的相位对焦移动方向和相位对焦移动距离;
    当所述对比度对焦移动方向与所述相位对焦移动方向一致时,控制所述第二摄像机移动至所述下一相位对焦位置。
  6. 根据权利要求5所述的电子设备,其特征在于,
    所述处理器还用于:
    当所述第一摄像机的相位检测自动对焦结束后,或者,所述对比度对焦移动方向与所述相位对焦移动方向不一致时,控制所述第二摄像机在所述对比度对焦移动方向上移动预设距离。
  7. 一种对焦的方法,其特征在于,应用于包括至少两个摄像机的电子设备中,所述至少两个摄像机中的第一摄像机用于检测图像的相位信息,所述至少两个摄像机中的第二摄像机用于检测图像的对比度信息,所述方法包括:
    在所述第一摄像机进行相位检测自动对焦的过程中,根据所述第一摄像机的一个或多个相位对焦位置,确定所述第二摄像机的至少一个相位对焦位置,并控制所述第二摄像机移动至所述至少一个相位对焦位置;
    根据所述第二摄像机在所述至少一个相位对焦位置中的一个或多个相位对焦位置检测到的图像的对比度信息,确定所述第二摄像机的对比度对焦位置,其中,所述第二摄像机的对比度对焦位置包括所述第二摄像机的对比度对焦移动方向。
  8. 根据权利要求7所述的方法,其特征在于,所述方法还包括:
    根据所述电子设备的移动状态,和/或,所述第二摄像机的图像稳定状态,控制所述第二摄像机是否检测图像的对比度信息;
    其中,所述第二摄像机在所述至少一个相位对焦位置中的一个或多个相位对焦位置时被控制检测图像的对比度信息。
  9. 根据权利要求7所述的方法,其特征在于,所述方法还包括:
    根据所述电子设备的移动状态,和/或,所述第二摄像机的图像稳定状态,确定所述第二摄像机检测到的图像的对比度信息是否有效;
    其中,所述第二摄像机在所述至少一个相位对焦位置中的一个或多个相位对焦位置检测到的图像的对比度信息是有效的。
  10. 根据权利要求7所述的方法,其特征在于,
    所述第二摄像机的对比度对焦位置还包括所述第二摄像机的对比度对焦移动距离,
    所述方法还包括:
    控制所述第二摄像机移动至所述对比度对焦位置。
  11. 根据权利要求7所述的方法,其特征在于,
    所述方法还包括:
    根据所述第一摄像机的下一相位对焦位置,确定所述第二摄像机的下一相位对焦位置,所述第二摄像机的下一相位对焦位置包括所述第二摄像机的相位对焦移动方向和相位对焦移动距离;
    当所述对比度对焦移动方向与所述相位对焦移动方向一致时,控制所述第二摄像机移动至所述下一相位对焦位置。
  12. 根据权利要求11所述的方法,其特征在于,
    所述方法还包括:
    当所述第一摄像机的相位检测自动对焦结束后,或者,所述对比度对焦移动方向与所述相位对焦移动方向不一致时,控制所述第二摄像机在所述对比度对焦移动方向上移动预设距离。
  13. 一种处理器,其特征在于,包括:
    处理单元和存储单元,
    所述存储单元用于存储代码,所述处理单元用于执行所述存储单元中的代码实现权利要求7至12中任一项所述方法。
  14. 一种计算机可读存储介质,其特征在于,包括计算机程序,当所述计算机程序在计算机上运行时,使得所述计算机执行如权利要求7至12中任一项所述的方法。
PCT/CN2018/123942 2017-12-27 2018-12-26 对焦的方法和电子设备 WO2019129077A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201711447201.6 2017-12-27
CN201711447201.6A CN109981965B (zh) 2017-12-27 2017-12-27 对焦的方法和电子设备

Publications (1)

Publication Number Publication Date
WO2019129077A1 true WO2019129077A1 (zh) 2019-07-04

Family

ID=67063184

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/123942 WO2019129077A1 (zh) 2017-12-27 2018-12-26 对焦的方法和电子设备

Country Status (2)

Country Link
CN (1) CN109981965B (zh)
WO (1) WO2019129077A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112261398A (zh) * 2020-11-17 2021-01-22 广东未来科技有限公司 一种基于移动设备的双目摄像头的对焦方法

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110475071B (zh) * 2019-09-19 2021-06-04 厦门美图之家科技有限公司 相位对焦方法、装置、电子设备和机器可读存储介质
CN110881103B (zh) * 2019-09-19 2022-01-28 Oppo广东移动通信有限公司 对焦控制方法和装置、电子设备、计算机可读存储介质
WO2021081909A1 (zh) * 2019-10-31 2021-05-06 深圳市大疆创新科技有限公司 拍摄设备的对焦方法、拍摄设备、系统及存储介质
CN112866551B (zh) * 2019-11-12 2022-06-14 Oppo广东移动通信有限公司 对焦方法和装置、电子设备、计算机可读存储介质
CN110933305B (zh) * 2019-11-28 2021-07-20 维沃移动通信有限公司 电子设备及对焦方法
CN113438407B (zh) * 2020-03-23 2022-10-04 华为技术有限公司 多摄像头模组对焦方法和装置
CN111787231B (zh) * 2020-07-31 2022-05-27 广东小天才科技有限公司 一种对焦方法、终端设备以及计算机可读存储介质
CN113556472B (zh) * 2021-09-22 2021-12-14 上海豪承信息技术有限公司 图像补偿方法、装置、介质及前置摄像头
CN116233605B (zh) * 2023-05-08 2023-07-25 此芯科技(武汉)有限公司 一种对焦实现方法、装置、存储介质及摄像设备

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070269197A1 (en) * 2006-05-16 2007-11-22 Masataka Ide Digital camera and camera system
CN105376474A (zh) * 2014-09-01 2016-03-02 光宝电子(广州)有限公司 图像采集装置及其自动对焦方法
CN107172410A (zh) * 2017-07-14 2017-09-15 闻泰通讯股份有限公司 双摄像头对焦方法及装置
CN107465881A (zh) * 2017-09-30 2017-12-12 努比亚技术有限公司 一种双摄像头对焦方法、移动终端以及计算机可读存储介质

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003295047A (ja) * 2002-04-05 2003-10-15 Canon Inc 撮像装置および撮像システム
CN106331484B (zh) * 2016-08-24 2020-02-14 维沃移动通信有限公司 一种对焦方法及移动终端


Also Published As

Publication number Publication date
CN109981965A (zh) 2019-07-05
CN109981965B (zh) 2021-01-01

Similar Documents

Publication Publication Date Title
WO2019129077A1 (zh) 对焦的方法和电子设备
JP6561141B2 (ja) タッチパッドを用いて携帯端末の撮影焦点距離を調整する方法および携帯端末
US10038844B2 (en) User interface for wide angle photography
EP2975838B1 (en) Image shooting parameter adjustment method and device
KR101983725B1 (ko) 전자 기기 및 전자 기기의 제어 방법
KR102010955B1 (ko) 프리뷰 제어 방법 및 이를 구현하는 휴대 단말
US20210409582A1 (en) Focusing Lighting Module
US9569065B2 (en) Electronic device including projector and method for controlling the electronic device
EP3709147B1 (en) Method and apparatus for determining fingerprint collection region
AU2017440899B2 (en) Photographing method and terminal
US20120268373A1 (en) Method for recognizing user's gesture in electronic device
KR20130034765A (ko) 휴대 단말기의 펜 입력 방법 및 장치
CN108476339B (zh) 一种遥控方法和终端
KR20180133743A (ko) 이동 단말기 및 그 제어 방법
AU2017433305B2 (en) Task switching method and terminal
TW201516844A (zh) 一種物件選擇的方法和裝置
WO2018219275A1 (zh) 对焦方法、装置、计算机可读存储介质和移动终端
US9538086B2 (en) Method of performing previewing and electronic device for implementing the same
JP6371485B2 (ja) エアマウスリモコンの最適化方法、装置、端末機器、プログラム、及び記録媒体
US20210405773A1 (en) Method and apparatus for detecting orientation of electronic device, and storage medium
WO2019047129A1 (zh) 一种移动应用图标的方法及终端
WO2017084180A1 (zh) 空鼠遥控器的优化方法、装置和空鼠遥控器
EP3979619A1 (en) Video recording method and terminal
WO2017035794A1 (zh) 显示器操作的方法、装置、用户界面及存储介质
US11611693B1 (en) Updating lens focus calibration values

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18896383

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18896383

Country of ref document: EP

Kind code of ref document: A1