WO2021104265A1 - Electronic device and focusing method - Google Patents

Electronic device and focusing method

Info

Publication number
WO2021104265A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
cameras
electronic device
phase difference
point pair
Prior art date
Application number
PCT/CN2020/131165
Other languages
English (en)
French (fr)
Inventor
王乔明 (Wang Qiaoming)
谢佳涛 (Xie Jiatao)
Original Assignee
维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.)
Priority to KR1020227018221A (published as KR20220085834A)
Priority to JP2022527786A (published as JP7472281B2)
Priority to EP20893880.3A (published as EP4057616A4)
Publication of WO2021104265A1
Priority to US17/746,424 (published as US11856294B2)

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/45: Generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H04N 23/57: Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/67: Focus control based on electronic image sensor signals
    • H04N 23/672: Focus control based on the phase difference signals
    • H04N 23/90: Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums

Definitions

  • the embodiment of the present invention relates to the field of communication technology, and in particular to an electronic device and a focusing method.
  • the electronic device can use a Phase Difference Auto Focus (PDAF) method to focus to obtain a clearer picture.
  • the electronic device can calculate the phase difference through the PD point pairs set on the sensor, and convert the phase difference into the moving distance of the motor in the lens module, so that the electronic device can determine the focus position according to the moving distance to achieve focusing.
  • the arrangement of the PD point pairs in the electronic device is fixed.
  • the electronic device can accurately obtain the phase difference of the image in one direction, while for other directions it may not be able to obtain the phase difference accurately; as a result, the accuracy of focusing is low.
  • the embodiments of the present invention provide an electronic device and a focusing method, which can solve the problem of low accuracy of focusing by the electronic device.
  • an electronic device in a first aspect of the embodiments of the present invention includes at least two cameras, each of which is provided with a PD point pair set, wherein the angle between the line directions in which the PD point pair sets on every two of the at least two cameras are arranged is within a first preset angle range.
  • a focusing method is provided, which is applied to an electronic device.
  • the electronic device includes at least two cameras, each of which is provided with a PD point pair set, and the focusing method includes: acquiring at least two target parameters, where each target parameter is a phase parameter obtained through the PD point pair set on one camera, and the angle between the line directions of the PD point pair sets on every two of the at least two cameras is within the first preset angle range; determining at least two target phase differences according to the at least two target parameters, where each target phase difference is the phase difference in the direction corresponding to one camera; and controlling the at least two cameras to focus separately according to the at least two target phase differences.
  • an electronic device in a third aspect of the embodiments of the present invention, includes at least two cameras, each of which is provided with a PD point pair set.
  • the electronic device includes: an acquisition module, a determination module, and a control module.
  • the acquisition module is used to acquire at least two target parameters, where each target parameter is a phase parameter acquired through the PD point pair set on one camera, and the angle between the line directions in which the PD point pair sets on every two of the at least two cameras are arranged is within the first preset angle range.
  • the determining module is configured to determine at least two target phase differences according to the at least two target parameters acquired by the acquiring module, and each target phase difference is a phase difference in a direction corresponding to a camera.
  • the control module is configured to control the at least two cameras to focus separately according to the at least two target phase differences determined by the determining module.
  • an electronic device in a fourth aspect of the embodiments of the present invention, includes a processor, a memory, and a computer program stored on the memory and running on the processor.
  • when the computer program is executed by the processor, the steps of the focusing method described in the second aspect are implemented.
  • a computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, the steps of the focusing method as described in the second aspect are implemented.
  • the electronic device may include at least two cameras, each of which is provided with a PD point pair set, and the angle between the line directions in which the PD point pair sets on every two of the at least two cameras are arranged is within the first preset angle range. Since this angle is within the first preset angle range, the electronic device can obtain phase parameters in multiple different directions, so it can accurately determine the in-focus position in the corresponding direction, which can improve the focusing accuracy of the electronic device.
  • FIG. 1 is a schematic diagram of the architecture of an Android operating system provided by an embodiment of the present invention
  • FIG. 2 is one of the schematic diagrams of a focusing method provided by an embodiment of the present invention.
  • FIG. 3 is a second schematic diagram of a focusing method provided by an embodiment of the present invention.
  • FIG. 4 is the third schematic diagram of a focusing method provided by an embodiment of the present invention.
  • FIG. 5 is a schematic structural diagram of an electronic device provided by an embodiment of the present invention.
  • FIG. 6 is a schematic diagram of hardware of an electronic device provided by an embodiment of the present invention.
  • the terms "first" and "second" in the description and claims of the embodiments of the present invention are used to distinguish different objects, rather than to describe a specific order of objects.
  • the first camera and the second camera are used to distinguish different cameras, rather than to describe a specific order of the cameras.
  • plural means two or more.
  • a plurality of elements refers to two elements or more than two elements.
  • words such as "exemplary" or "for example" are used to represent examples, illustrations, or descriptions. Any embodiment or design solution described as "exemplary" or "for example" in the embodiments of the present invention should not be construed as being more preferable or advantageous than other embodiments or design solutions. To be precise, words such as "exemplary" or "for example" are used to present related concepts in a specific manner.
  • the embodiment of the present invention provides an electronic device and a focusing method.
  • the electronic device may include at least two cameras, each of which is provided with a PD point pair set, and the angle between the line directions in which the PD point pair sets on every two of the at least two cameras are arranged is within the first preset angle range. Since this angle is within the first preset angle range, the electronic device can obtain phase parameters in multiple different directions, so it can accurately determine the in-focus position in the corresponding direction, which can improve the focusing accuracy of the electronic device.
  • the electronic device and the focusing method provided by the embodiments of the present invention can be applied to a process in which the electronic device focuses on a camera.
  • the electronic device in the embodiment of the present invention may be an electronic device with an operating system.
  • the operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiment of the present invention.
  • the following takes the Android operating system as an example to introduce the software environment to which the focusing method provided in the embodiment of the present invention is applied.
  • FIG. 1 it is a schematic structural diagram of a possible Android operating system provided by an embodiment of the present invention.
  • the architecture of the Android operating system includes 4 layers, which are: application layer, application framework layer, system runtime library layer, and kernel layer (specifically, it may be the Linux kernel layer).
  • the application layer includes various applications (including system applications and third-party applications) in the Android operating system.
  • the application framework layer is the framework of the application. Developers can develop some applications based on the application framework layer while complying with the development principles of the application framework.
  • the system runtime layer includes a library (also called a system library) and an Android operating system runtime environment.
  • the library mainly provides various resources needed by the Android operating system.
  • the Android operating system operating environment is used to provide a software environment for the Android operating system.
  • the kernel layer is the operating system layer of the Android operating system and belongs to the lowest level of the Android operating system software level.
  • the kernel layer is based on the Linux kernel to provide core system services and hardware-related drivers for the Android operating system.
  • developers can develop a software program that implements the focusing method provided by the embodiment of the present invention based on the system architecture of the Android operating system shown in FIG. 1, and the software program can run on the Android operating system shown in FIG. 1. That is, the processor or the electronic device can implement the focusing method provided by the embodiment of the present invention by running the software program in the Android operating system.
  • the electronic device in the embodiment of the present invention may be a mobile electronic device or a non-mobile electronic device.
  • the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a handheld computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), etc.
  • the non-mobile electronic device may be a personal computer (PC), television (television, TV), teller machine, or self-service machine, etc., which is not specifically limited in the embodiment of the present invention.
  • An embodiment of the present invention provides an electronic device, which includes at least two cameras, and a PD point pair set is set on each of the at least two cameras.
  • the angle between the line directions in which the PD point pair sets on every two of the at least two cameras are arranged is within the first preset angle range.
  • the above-mentioned PD point pair set may include at least two PD point pairs.
  • a PD point pair can be understood as a pair of special pixels on the camera's image sensor used to detect phase; such pixels generally appear in pairs, and a PD sensor is an image sensor provided with PD point pairs.
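  • As an illustrative sketch only (the class and function names below are assumptions for illustration, not from the patent), a PD point pair set can be modeled as a collection of paired phase-detection pixels that share an arrangement direction, and the angle between two sets follows from their directions:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class PDPointPair:
    # Sensor coordinates of the two paired phase-detection pixels.
    left: Tuple[int, int]
    right: Tuple[int, int]

@dataclass
class PDPointPairSet:
    # The PD point pairs provided on one camera's sensor.
    pairs: List[PDPointPair]
    # Direction of the line along which the pairs are arranged, in degrees.
    angle_deg: float

def angle_between(set_a: PDPointPairSet, set_b: PDPointPairSet) -> float:
    """Angle between the line directions of two PD point pair sets (0 to 90 degrees)."""
    diff = abs(set_a.angle_deg - set_b.angle_deg) % 180.0
    return min(diff, 180.0 - diff)

# Two cameras with orthogonal PD point pair arrangements, matching the
# 90-degree example of the second preset angle range.
horizontal = PDPointPairSet(pairs=[], angle_deg=0.0)
vertical = PDPointPairSet(pairs=[], angle_deg=90.0)
```

With these two sets, `angle_between(horizontal, vertical)` is 90 degrees, i.e. the orthogonal case described below.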
  • the above-mentioned at least two cameras include a first camera and a second camera, the first camera is provided with a first PD point pair set, and the second camera is provided with a second PD point pair set.
  • the angle value between the linear direction where the first PD point pair set is located and the linear direction where the second PD point pair set is located is within the second preset angle range.
  • An embodiment of the present invention provides an electronic device. The electronic device may include at least two cameras, each of which is provided with a PD point pair set, and the angle between the line directions in which the PD point pair sets on every two of the at least two cameras are arranged is within the first preset angle range. Since this angle is within the first preset angle range, the electronic device can obtain phase parameters in multiple different directions, so it can accurately determine the in-focus position in the corresponding direction, which can improve the focusing accuracy of the electronic device.
  • FIG. 2 shows a flowchart of a focusing method provided by an embodiment of the present invention.
  • the method can be applied to an electronic device having an Android operating system as shown in FIG. 1.
  • the focusing method provided by the embodiment of the present invention may include the following steps 201 to 203.
  • Step 201 The electronic device obtains at least two target parameters.
  • the electronic device may include at least two cameras, and PD point pair sets are set on each of the at least two cameras.
  • each of the above-mentioned at least two target parameters is a phase parameter obtained through the PD point pair set on one camera, and the angle between the line directions in which the PD point pair sets on every two of the at least two cameras are arranged is within the first preset angle range.
  • the above-mentioned at least two cameras may include a first camera and a second camera.
  • the first camera is provided with a first PD point pair set
  • the second camera is provided with a second PD point pair set.
  • the angle value between the linear direction where the first PD point pair set is located and the linear direction where the second PD point pair set is located is within the second preset angle range.
  • the angle between the line directions of the PD point pair sets on every two cameras may be 180°/N (which may, for example, serve as the first preset angle range), where N may denote the number of cameras.
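  • The 180°/N spacing above can be sketched as follows (a hypothetical helper for illustration; the patent only states the 180°/N relationship, and the assumption that N is the camera count comes from the context):

```python
def pd_arrangement_angles(n_cameras: int) -> list:
    """Line directions for the PD point pair sets of N cameras, spread evenly
    so that every two cameras differ by a multiple of 180/N degrees."""
    step = 180.0 / n_cameras
    return [i * step for i in range(n_cameras)]
```

Two cameras give directions 0° and 90° (the orthogonal case discussed below); three cameras give 0°, 60°, and 120°.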
  • each of the above-mentioned at least two target parameters includes a phase difference and a credibility value.
  • the aforementioned at least two cameras include a first camera and a second camera
  • the aforementioned at least two target parameters may include the first parameter and the second parameter
  • the above-mentioned first parameter is a phase parameter when an image is collected by a first camera
  • the above-mentioned second parameter is a phase parameter when an image is collected by a second camera.
  • the electronic device may obtain the phase parameter of the image to be collected through the first camera, and obtain the phase parameter of the image to be collected through the second camera.
  • the electronic device may obtain the first parameter through the first PD point pair set, and obtain the second parameter through the second PD point pair set.
  • a plurality of first PD point pairs (that is, a first PD point pair set) are provided on the first camera and arranged along one direction; a plurality of second PD point pairs (that is, a second PD point pair set) are provided on the second camera and arranged along another direction; and the angle between the two directions is within the second preset angle range.
  • the relationship between the position of the first camera and the position of the second camera can make the angle between the line direction of the first PD point pair set and the line direction of the second PD point pair set fall within the second preset angle range.
  • the foregoing second preset angle range may be a preset angle value; for example, the preset angle value may be 90°.
  • the arrangement direction of the first PD point pair set is the first direction (for example, horizontal), and the arrangement direction of the second PD point pair set is the second direction (for example, the vertical direction).
  • the first direction is perpendicular to the second direction.
  • the above-mentioned first parameter may include a first phase difference and a first credibility value corresponding to the first phase difference, and the above-mentioned second parameter may include a second phase difference and a second credibility value corresponding to the second phase difference.
  • the first phase difference is the actual output phase difference of the first camera
  • the second phase difference is the actual output phase difference of the second camera
  • the first credibility value is the actual output credibility of the first camera
  • the first credibility value is used to indicate the credibility of the first phase difference
  • the second credibility value is the actual output credibility value of the second camera
  • the second credibility value is used to indicate the credibility of the second phase difference.
  • the electronic device can acquire phase parameters in multiple different directions, so that it can determine the in-focus position in the corresponding direction, which can improve the accuracy of the electronic device in determining the in-focus position (or focus position).
  • the electronic device can obtain phase parameters in two different directions and determine the focus position in the corresponding direction from them, and therefore can improve the accuracy of the electronic device in determining the focus position (or in-focus position).
  • Step 202 The electronic device determines at least two target phase differences according to the at least two target parameters.
  • each of the above-mentioned at least two target phase differences is a phase difference in a direction corresponding to a camera.
  • step 202 may be specifically implemented by the following step 202a.
  • Step 202a If the inherent phase parameters of the at least two cameras are the same, the electronic device uses the first algorithm to determine the at least two target phase differences according to the at least two target parameters.
  • each inherent phase parameter may include an inherent phase difference and an inherent reliability value.
  • the same inherent phase parameters of at least two cameras can be understood as: the value range of the inherent phase difference of each camera is the same, and the value range of the inherent credibility value of each camera is the same.
  • the electronic device may determine, according to the first parameter and the second parameter, the target phase difference corresponding to the first camera and the target phase difference corresponding to the second camera.
  • the electronic device can calculate the phase difference in the direction corresponding to each camera through the PD point pair according to the first parameter and the second parameter.
  • each target phase difference is the in-focus phase difference corresponding to a camera.
  • in this case, the electronic device may adopt the first algorithm and determine, according to the first parameter and the second parameter, the target phase difference corresponding to the first camera and the target phase difference corresponding to the second camera.
  • the first inherent phase parameter may include a first inherent phase difference and a first inherent credibility value, and the second inherent phase parameter may include a second inherent phase difference and a second inherent credibility value.
  • the first inherent phase parameter of the first camera being the same as the second inherent phase parameter of the second camera can be understood as: the value range of the first inherent phase difference is the same as the value range of the second inherent phase difference, and the value range of the first inherent credibility value is the same as the value range of the second inherent credibility value.
  • the phase difference in the direction corresponding to the first camera calculated by the electronic device is the same as the phase difference in the direction corresponding to the second camera.
  • the target phase difference F_a may be calculated as F_a = (F1 × C1 + F2 × C2)/(C1 + C2), where F1 is the first phase difference, C1 is the first credibility value, F2 is the second phase difference, and C2 is the second credibility value.
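  • Assuming the fusion is the credibility-weighted average implied by the symbols above (a sketch; the patent text does not spell out the first algorithm beyond these symbols), it can be written as:

```python
def fuse_phase_difference(f1: float, c1: float, f2: float, c2: float) -> float:
    """Credibility-weighted fusion of two phase differences, for the case
    where both cameras share the same inherent parameter ranges."""
    return (f1 * c1 + f2 * c2) / (c1 + c2)
```

With equal credibility values the result is the plain average of the two phase differences; a more credible camera pulls the fused value toward its own output.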
  • step 202 may be specifically implemented by the following step 202b.
  • Step 202b If the inherent phase parameters of the at least two cameras are different, the electronic device uses the second algorithm to determine the at least two target phase differences based on the at least two target parameters and the inherent phase parameter of each camera.
  • the inherent phase parameters of the above at least two cameras being different can be understood as: the value range of the inherent phase difference of each camera is different, and the value range of the inherent credibility value of each camera is different.
  • when the first inherent phase parameter of the first camera is different from the second inherent phase parameter of the second camera, the electronic device may adopt the second algorithm and determine the target phase differences according to the first inherent phase parameter, the second inherent phase parameter, the first parameter, and the second parameter.
  • the difference between the first inherent phase parameter of the first camera and the second inherent phase parameter of the second camera can be understood as: the value range of the first inherent phase difference is different from the value range of the second inherent phase difference , And the value range of the first inherent credibility value is different from the value range of the second inherent credibility value.
  • the electronic device may map the first parameter to the second camera, or map the second parameter to the first camera, to calculate the phase difference corresponding to the first camera or the phase difference corresponding to the second camera.
  • the value range of the first inherent phase difference may be [F10, F11], the value range of the first inherent credibility value may be [C10, C11], the value range of the second inherent phase difference may be [F20, F21], and the value range of the second inherent credibility value may be [C20, C21].
  • the first parameter may be mapped to the second camera.
  • F1″ is the target phase difference corresponding to the first camera, and F_b is the target phase difference corresponding to the second camera, where F_b = (F1′ × C1′ + F2 × C2)/(C1′ + C2)
  • F1′ is the phase difference obtained by mapping the first phase difference to the second camera
  • C1′ is the credibility value obtained by mapping the first credibility value to the second camera
  • F2 is the second phase difference
  • C2 is the second credibility value.
  • F1′ = F1 × (F21 - F20)/(F11 - F10)
  • C1′ = C1 × (C21 - C20)/(C11 - C10)
  • F1 is the first phase difference
  • C1 is the first credibility value
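  • The mapping and fusion of the second algorithm can be sketched directly from the formulas above (helper names and the concrete ranges in the example are illustrative assumptions, not from the patent):

```python
def map_first_to_second(f1, c1, f_range1, f_range2, c_range1, c_range2):
    """Map the first camera's output into the second camera's inherent ranges:
    F1' = F1 x (F21 - F20)/(F11 - F10), C1' = C1 x (C21 - C20)/(C11 - C10)."""
    f10, f11 = f_range1
    f20, f21 = f_range2
    c10, c11 = c_range1
    c20, c21 = c_range2
    f1_mapped = f1 * (f21 - f20) / (f11 - f10)
    c1_mapped = c1 * (c21 - c20) / (c11 - c10)
    return f1_mapped, c1_mapped

def target_phase_difference_b(f1_mapped, c1_mapped, f2, c2):
    """F_b = (F1' x C1' + F2 x C2) / (C1' + C2)."""
    return (f1_mapped * c1_mapped + f2 * c2) / (c1_mapped + c2)
```

For example, if the first camera's phase range is [0, 10] and the second's is [0, 20] (with identical credibility ranges), a first phase difference of 5 maps to 10 before the credibility-weighted fusion is applied.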
  • Step 203 The electronic device controls the at least two cameras to focus separately according to the at least two target phase differences.
  • the electronic device may first determine the focus position (i.e., in-focus position) of each of the at least two cameras according to the at least two target phase differences, and then control the corresponding camera to focus according to each focus position, that is, to focus on a position in the image to be shot or in the preview interface.
  • the electronic device may calculate the focus position of the first camera or the second camera based on the target phase difference F_a.
  • the electronic device may calculate the focus position of the first camera based on the target phase difference F1″, and calculate the focus position of the second camera based on the target phase difference F_b.
  • the electronic device may calculate the phase difference through the PD point pairs, and then convert the phase difference into the distance moved by the motor to determine the focus position.
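  • A minimal sketch of that conversion follows; the linear model and the `gain` calibration constant are assumptions for illustration (the patent only states that the phase difference is converted into motor travel, not how):

```python
def motor_travel(phase_difference: float, gain: float) -> float:
    """Convert a phase difference into motor travel; `gain` is a hypothetical
    per-module calibration constant (phase-difference units to motor steps)."""
    return phase_difference * gain

def focus_position(current: float, phase_difference: float, gain: float) -> float:
    """New motor (focus) position for one camera after applying the travel."""
    return current + motor_travel(phase_difference, gain)
```

Each camera would apply this with its own target phase difference (F1″ or F_b) to reach its focus position.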
  • two cameras are used for focusing, and it is only required that the arrangement directions of the PD point pairs of the two cameras be close to completely orthogonal; this can improve the accuracy of the phase difference output, thereby improving the focus success rate without loss of image quality.
  • the embodiment of the present invention provides a focusing method.
  • An electronic device can determine at least two target phase differences (each target phase difference is the phase difference in the direction corresponding to one camera) according to the acquired at least two target parameters, and control the at least two cameras to focus separately according to the at least two target phase differences. Since the angle between the line directions of the PD point pair sets on every two cameras is within the first preset angle range, the electronic device can obtain phase parameters in multiple different directions, so it can accurately determine the in-focus position in the corresponding direction, which can improve the focusing accuracy of the electronic device.
  • FIG. 5 shows a schematic diagram of a possible structure of an electronic device involved in an embodiment of the present invention.
  • the electronic device includes at least two cameras, and a PD point pair set is set on each of the at least two cameras.
  • the electronic device 50 may include: an acquisition module 51, a determination module 52, and a control module 53.
  • the acquiring module 51 is configured to acquire at least two target parameters, where each target parameter is a phase parameter acquired through the PD point pair set on one camera, and the angle between the line directions in which the PD point pair sets on every two of the at least two cameras are arranged is within the first preset angle range.
  • the determining module 52 is configured to determine at least two target phase differences according to the at least two target parameters acquired by the acquiring module 51, and each target phase difference is a phase difference in a direction corresponding to a camera.
  • the control module 53 is configured to control the at least two cameras to focus separately according to the at least two target phase differences determined by the determining module 52.
  • the aforementioned at least two cameras include a first camera and a second camera, the first camera is provided with a first PD point pair set, and the second camera is provided with a second PD point pair set;
  • the angle between the linear direction where the first PD point pair set is located and the linear direction where the second PD point pair set is located is within a second preset angle range.
  • Each of the above at least two target parameters includes a phase difference and a credibility value.
  • the above determining module 52 is specifically configured to: if the inherent phase parameters of the at least two cameras are the same, use the first algorithm to determine the at least two target phase differences according to the at least two target parameters; or, if the inherent phase parameters of the at least two cameras are different, use the second algorithm to determine the at least two target phase differences based on the at least two target parameters and the inherent phase parameter of each camera.
  • the electronic device provided in the embodiment of the present invention can implement each process implemented by the electronic device in the foregoing method embodiment. To avoid repetition, the detailed description will not be repeated here.
  • An embodiment of the present invention provides an electronic device.
  • the electronic device may include at least two cameras, each of which is provided with a PD point pair set, and the angle between the line directions in which the PD point pair sets on every two of the at least two cameras are arranged is within the first preset angle range. Since this angle is within the first preset angle range, the electronic device can obtain phase parameters in multiple different directions, so it can accurately determine the in-focus position in the corresponding direction, which can improve the focusing accuracy of the electronic device.
  • Fig. 6 is a schematic diagram of hardware of an electronic device that implements various embodiments of the present invention.
  • the electronic device 100 includes but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, a power supply 111, and other components.
  • electronic devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminals, wearable devices, and pedometers.
  • the processor 110 is configured to: obtain at least two target parameters, where each target parameter is a phase parameter obtained through the PD point pair set on one camera, and the angle between the line directions in which the PD point pair sets on every two of the at least two cameras are arranged is within the first preset angle range; determine at least two target phase differences according to the at least two target parameters, where each target phase difference is the phase difference in the direction corresponding to one camera; and control the at least two cameras to focus separately according to the at least two target phase differences.
  • An embodiment of the present invention provides an electronic device.
  • the electronic device can determine at least two target phase differences (each being the phase difference in the direction corresponding to one camera) according to the at least two acquired target parameters, and control the at least two cameras to focus separately according to the at least two target phase differences. Because the angle between the straight-line directions of the PD point pair sets on every two cameras is within the first preset angle range, the electronic device can obtain phase parameters in multiple different directions, and can therefore accurately determine the in-focus position in the corresponding direction, which improves the focusing accuracy of the electronic device.
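To make the described flow concrete, here is a minimal illustrative sketch (not part of the patent disclosure); the `PhaseParam` type, the function name, and the assumption that both cameras share the same intrinsic parameter ranges are ours, made only for illustration:

```python
from dataclasses import dataclass

@dataclass
class PhaseParam:
    phase_diff: float   # phase difference read from one camera's PD point pairs
    confidence: float   # confidence value reported with that phase difference

def fuse_target_phase_diffs(phase_params):
    """Fuse per-camera phase parameters into one target phase difference
    per camera direction; with identical intrinsic parameter ranges this
    reduces to a confidence-weighted average of the reported values."""
    total_c = sum(p.confidence for p in phase_params)
    fused = sum(p.phase_diff * p.confidence for p in phase_params) / total_c
    # each camera would then convert `fused` into its own motor movement
    return [fused] * len(phase_params)
```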
  • the radio frequency unit 101 can be used to receive and send signals during information transmission and reception or during a call; specifically, it receives downlink data from the base station and passes it to the processor 110 for processing, and sends uplink data to the base station.
  • the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 101 can also communicate with the network and other devices through a wireless communication system.
  • the electronic device provides users with wireless broadband Internet access through the network module 102, such as helping users to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 103 can convert the audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output it as sound. Moreover, the audio output unit 103 may also provide audio output related to a specific function performed by the electronic device 100 (for example, call signal reception sound, message reception sound, etc.).
  • the audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 104 is used to receive audio or video signals.
  • the input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042.
  • the graphics processor 1041 processes image data of still pictures or videos obtained by an image capture device (such as a camera) in video capture mode or image capture mode.
  • the processed image frame can be displayed on the display unit 106.
  • the image frame processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or sent via the radio frequency unit 101 or the network module 102.
  • the microphone 1042 can receive sound, and can process such sound into audio data.
  • in telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 and output.
  • the electronic device 100 further includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light.
  • the proximity sensor can turn off the display panel 1061 and/or the backlight when the electronic device 100 is moved close to the ear.
  • the accelerometer sensor can detect the magnitude of acceleration in each direction (usually three axes), and can detect the magnitude and direction of gravity when stationary; it can be used to identify the posture of the electronic device (such as landscape/portrait switching, related games, and magnetometer attitude calibration) and for vibration-recognition functions (such as a pedometer or tap detection). The sensor 105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not repeated here.
  • the display unit 106 is used to display information input by the user or information provided to the user.
  • the display unit 106 may include a display panel 1061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like.
  • the user input unit 107 may be used to receive inputted numeric or character information, and generate key signal input related to user settings and function control of the electronic device.
  • the user input unit 107 includes a touch panel 1071 and other input devices 1072.
  • the touch panel 1071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 1071 may include two parts: a touch detection device and a touch controller.
  • the touch detection device detects the user's touch position, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 110, and receives and executes commands sent by the processor 110.
  • the touch panel 1071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 107 may also include other input devices 1072.
  • other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
  • the touch panel 1071 can be overlaid on the display panel 1061.
  • when the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides corresponding visual output on the display panel 1061 according to the type of the touch event.
  • although the touch panel 1071 and the display panel 1061 are described as two independent components implementing the input and output functions of the electronic device, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the electronic device; this is not specifically limited here.
  • the interface unit 108 is an interface for connecting an external device with the electronic device 100.
  • the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
  • the interface unit 108 can be used to receive input (for example, data information or power) from an external device and transmit the received input to one or more elements in the electronic device 100, or to transfer data between the electronic device 100 and the external device.
  • the memory 109 can be used to store software programs and various data.
  • the memory 109 may mainly include a storage program area and a storage data area.
  • the storage program area may store the operating system and application programs required by at least one function (such as a sound playback function or an image playback function); the storage data area may store data created according to the use of the mobile phone (such as audio data and a phone book).
  • the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other volatile solid-state storage devices.
  • the processor 110 is the control center of the electronic device; it connects all parts of the entire electronic device using various interfaces and lines, and performs the various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby monitoring the electronic device as a whole.
  • the processor 110 may include one or more processing units; optionally, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, and application programs, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 110.
  • the electronic device 100 may also include a power source 111 (such as a battery) for supplying power to various components.
  • the power source 111 may be logically connected to the processor 110 through a power management system, so that functions such as charging management, discharging management, and power consumption management are implemented through the power management system.
  • the electronic device 100 includes some functional modules not shown, which will not be repeated here.
  • an embodiment of the present invention further provides an electronic device, including a processor 110 as shown in FIG. 6, a memory 109, and a computer program stored in the memory 109 and executable on the processor 110; when the computer program is executed by the processor 110, each process of the foregoing method embodiment is implemented and the same technical effect can be achieved. To avoid repetition, details are not described here again.
  • the embodiment of the present invention also provides a computer-readable storage medium, and a computer program is stored on the computer-readable storage medium.
  • when the computer program is executed by a processor, each process of the foregoing method embodiment is implemented and the same technical effect can be achieved. To avoid repetition, details are not described here again.
  • the computer-readable storage medium such as read-only memory (Read-Only Memory, ROM), random access memory (Random Access Memory, RAM), magnetic disk, or optical disk, etc.
  • the technical solution of the present invention, in essence, or the part contributing to the existing technology, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes a number of instructions that cause an electronic device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the embodiments of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)
  • Focusing (AREA)
  • Cameras In General (AREA)
  • Automatic Focus Adjustment (AREA)

Abstract

Embodiments of the present invention disclose an electronic device and a focusing method. The electronic device includes at least two cameras, each of which is provided with a set of PD point pairs; the angle between the straight-line directions in which the PD point pair sets on every two of the at least two cameras are arranged is within a first preset angle range.

Description

Electronic Device and Focusing Method
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims priority to Chinese Patent Application No. 201911194022.5, filed in China on November 28, 2019, the entire contents of which are incorporated herein by reference.
TECHNICAL FIELD
Embodiments of the present invention relate to the field of communications technology, and in particular to an electronic device and a focusing method.
BACKGROUND
Generally, when a user takes a photo with an electronic device, the electronic device may focus by phase difference auto focus (PDAF) in order to obtain a relatively sharp picture. Specifically, the electronic device can calculate a phase difference through PD point pairs arranged on the sensor, and convert that phase difference into a movement distance of the motor in the lens module, so that the electronic device can determine the focus position according to the movement distance and thereby achieve focusing.
However, the arrangement of the PD point pairs in the electronic device is fixed. With the above method, the electronic device can obtain the phase difference of the image in one direction, but may be unable to obtain the phase difference accurately in other directions, so the focusing accuracy of the electronic device is low.
SUMMARY
Embodiments of the present invention provide an electronic device and a focusing method, which can solve the problem of the low focusing accuracy of an electronic device.
To solve the above technical problem, the embodiments of the present invention adopt the following technical solutions:
According to a first aspect of the embodiments of the present invention, an electronic device is provided. The electronic device includes at least two cameras, each of which is provided with a set of PD point pairs. The angle between the straight-line directions in which the PD point pair sets on every two of the at least two cameras are arranged is within a first preset angle range.
According to a second aspect of the embodiments of the present invention, a focusing method applied to an electronic device is provided. The electronic device includes at least two cameras, each of which is provided with a set of PD point pairs. The focusing method includes: obtaining at least two target parameters, where each target parameter is a phase parameter obtained through the PD point pair set on one camera, and the angle between the straight-line directions of the PD point pair sets on every two of the at least two cameras is within a first preset angle range; determining at least two target phase differences according to the at least two target parameters, where each target phase difference is the phase difference in the direction corresponding to one camera; and controlling the at least two cameras to focus separately according to the at least two target phase differences.
According to a third aspect of the embodiments of the present invention, an electronic device is provided. The electronic device includes at least two cameras, each of which is provided with a set of PD point pairs, and the electronic device includes an obtaining module, a determining module, and a control module. The obtaining module is configured to obtain at least two target parameters, where each target parameter is a phase parameter obtained through the PD point pair set on one camera, and the angle between the straight-line directions of the PD point pair sets on every two of the at least two cameras is within a first preset angle range. The determining module is configured to determine at least two target phase differences according to the at least two target parameters obtained by the obtaining module, where each target phase difference is the phase difference in the direction corresponding to one camera. The control module is configured to control the at least two cameras to focus separately according to the at least two target phase differences determined by the determining module.
According to a fourth aspect of the embodiments of the present invention, an electronic device is provided, including a processor, a memory, and a computer program stored in the memory and executable on the processor. When the computer program is executed by the processor, the steps of the focusing method according to the second aspect are implemented.
According to a fifth aspect of the embodiments of the present invention, a computer-readable storage medium is provided, storing a computer program. When the computer program is executed by a processor, the steps of the focusing method according to the second aspect are implemented.
In the embodiments of the present invention, the electronic device may include at least two cameras, each of which is provided with a set of PD point pairs, and the angle between the straight-line directions in which the PD point pair sets on every two of the at least two cameras are arranged is within a first preset angle range. Because this angle is within the first preset angle range, the electronic device can obtain phase parameters in multiple different directions, and can therefore accurately determine the in-focus position in the corresponding direction, which improves the focusing accuracy of the electronic device.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 is a schematic architectural diagram of an Android operating system according to an embodiment of the present invention;
Fig. 2 is a first schematic diagram of a focusing method according to an embodiment of the present invention;
Fig. 3 is a second schematic diagram of a focusing method according to an embodiment of the present invention;
Fig. 4 is a third schematic diagram of a focusing method according to an embodiment of the present invention;
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present invention;
Fig. 6 is a schematic hardware diagram of an electronic device according to an embodiment of the present invention.
DETAILED DESCRIPTION
The technical solutions in the embodiments of the present invention will be described below clearly and completely with reference to the accompanying drawings of the embodiments of the present invention. Obviously, the described embodiments are some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
The terms "first" and "second" in the specification and claims of the embodiments of the present invention are used to distinguish different objects, rather than to describe a particular order of the objects. For example, the first camera and the second camera are used to distinguish different cameras, not to describe a particular order of the cameras.
In the description of the embodiments of the present invention, unless otherwise stated, "a plurality of" means two or more. For example, a plurality of elements means two or more elements.
The term "and/or" herein describes an association relationship between associated objects and indicates that three relationships may exist. For example, "display panel and/or backlight" may mean: the display panel alone, both the display panel and the backlight, or the backlight alone. The symbol "/" herein indicates an "or" relationship between associated objects; for example, "input/output" means input or output.
In the embodiments of the present invention, words such as "exemplary" or "for example" are used to give examples, illustrations, or explanations. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present invention should not be construed as preferred or advantageous over other embodiments or designs. Rather, such words are intended to present related concepts in a concrete manner.
Embodiments of the present invention provide an electronic device and a focusing method. The electronic device may include at least two cameras, each of which is provided with a set of PD point pairs, and the angle between the straight-line directions in which the PD point pair sets on every two of the at least two cameras are arranged is within a first preset angle range. Because this angle is within the first preset angle range, the electronic device can obtain phase parameters in multiple different directions, and can therefore accurately determine the in-focus position in the corresponding direction, which improves the focusing accuracy of the electronic device.
The electronic device and the focusing method provided by the embodiments of the present invention may be applied to the process in which an electronic device focuses a camera.
The electronic device in the embodiments of the present invention may be an electronic device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present invention.
Taking the Android operating system as an example, the software environment to which the focusing method provided by the embodiments of the present invention is applied is described below.
Fig. 1 is a schematic architectural diagram of a possible Android operating system according to an embodiment of the present invention. In Fig. 1, the architecture of the Android operating system includes four layers: an application layer, an application framework layer, a system runtime library layer, and a kernel layer (which may specifically be a Linux kernel layer).
The application layer includes the applications in the Android operating system (including system applications and third-party applications).
The application framework layer is the framework of the applications; developers can develop applications based on the application framework layer while complying with the development principles of the framework.
The system runtime library layer includes libraries (also called system libraries) and the Android operating system runtime environment. The libraries mainly provide the various resources required by the Android operating system. The Android operating system runtime environment provides the software environment for the Android operating system.
The kernel layer is the operating system layer of the Android operating system and is the lowest layer of the Android operating system software hierarchy. Based on the Linux kernel, the kernel layer provides core system services and hardware-related drivers for the Android operating system.
Taking the Android operating system as an example, in the embodiments of the present invention, developers can develop, based on the system architecture of the Android operating system shown in Fig. 1, a software program implementing the focusing method provided by the embodiments of the present invention, so that the focusing method can run on the Android operating system shown in Fig. 1. That is, a processor or an electronic device can implement the focusing method provided by the embodiments of the present invention by running the software program in the Android operating system.
The electronic device in the embodiments of the present invention may be a mobile electronic device or a non-mobile electronic device. Exemplarily, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, an in-vehicle electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA); the non-mobile electronic device may be a personal computer (PC), a television (TV), a teller machine, or a self-service machine. This is not specifically limited in the embodiments of the present invention.
The electronic device and the focusing method provided by the embodiments of the present invention are described in detail below through specific embodiments and application scenarios with reference to the accompanying drawings.
An embodiment of the present invention provides an electronic device. The electronic device includes at least two cameras, each of which is provided with a set of PD point pairs.
In this embodiment of the present invention, the angle between the straight-line directions in which the PD point pair sets on every two of the at least two cameras are arranged is within a first preset angle range.
In this embodiment of the present invention, the PD point pair set may include at least two PD point pairs.
It should be noted that a PD point pair can be understood as special pixels on the sensor of a camera that are used to detect phase and generally appear in pairs; a PD sensor is a camera sensor provided with PD point pairs.
Optionally, in this embodiment of the present invention, the at least two cameras include a first camera and a second camera; the first camera is provided with a first PD point pair set, and the second camera is provided with a second PD point pair set. The angle between the straight-line direction of the first PD point pair set and the straight-line direction of the second PD point pair set is within a second preset angle range.
It should be noted that a specific description of the PD point pair sets will be given in the following embodiments and is not repeated here.
An embodiment of the present invention provides an electronic device. The electronic device may include at least two cameras, each of which is provided with a set of PD point pairs, and the angle between the straight-line directions in which the PD point pair sets on every two of the at least two cameras are arranged is within a first preset angle range. Because this angle is within the first preset angle range, the electronic device can obtain phase parameters in multiple different directions, and can therefore accurately determine the in-focus position in the corresponding direction, which improves the focusing accuracy of the electronic device.
An embodiment of the present invention provides a focusing method. Fig. 2 shows a flowchart of a focusing method according to an embodiment of the present invention; the method may be applied to an electronic device having the Android operating system shown in Fig. 1. As shown in Fig. 2, the focusing method provided by this embodiment of the present invention may include the following steps 201 to 203.
Step 201: The electronic device obtains at least two target parameters.
In this embodiment of the present invention, the electronic device may include at least two cameras, each of which is provided with a set of PD point pairs.
In this embodiment of the present invention, each of the at least two target parameters is a phase parameter obtained through the PD point pair set on one camera, and the angle between the straight-line directions of the PD point pair sets on every two of the at least two cameras is within a first preset angle range.
Optionally, in this embodiment of the present invention, the at least two cameras may include a first camera and a second camera; the first camera is provided with a first PD point pair set, the second camera is provided with a second PD point pair set, and the angle between the straight-line direction of the first PD point pair set and that of the second PD point pair set is within a second preset angle range.
Optionally, in this embodiment of the present invention, for the case where the electronic device includes N (N ≥ 3) cameras, the angle between the straight-line directions of the PD point pair sets on every two cameras may be 180°/N (for example, within the first preset angle range).
Optionally, in this embodiment of the present invention, each of the at least two target parameters includes a phase difference and a confidence value.
Optionally, in this embodiment of the present invention, the at least two cameras include a first camera and a second camera, and the at least two target parameters may include a first parameter and a second parameter.
Optionally, in this embodiment of the present invention, the first parameter is a phase parameter obtained when an image is captured by the first camera, and the second parameter is a phase parameter obtained when an image is captured by the second camera.
It can be understood that, when the user triggers the electronic device into shooting mode, the electronic device can obtain the phase parameter of the image to be captured through the first camera, and obtain the phase parameter of the image to be captured through the second camera.
Optionally, in this embodiment of the present invention, the electronic device may obtain the first parameter through the first PD point pair set and obtain the second parameter through the second PD point pair set.
It can be understood that the first camera is provided with a plurality of first PD point pairs (that is, the first PD point pair set) arranged in one direction, and the second camera is provided with a plurality of second PD point pairs (that is, the second PD point pair set) arranged in one direction; the angle between these two directions is within the second preset angle range.
It should be noted that the positional relationship between the first camera and the second camera can be such that the angle between the straight-line direction of the first PD point pair set and the straight-line direction of the second PD point pair set is within the second preset angle range.
Optionally, in this embodiment of the present invention, the second preset angle range may be a preset angle value, and the preset angle value is 90°.
Optionally, in this embodiment of the present invention, the first PD point pair set is arranged in a first direction (for example, horizontal), the second PD point pair set is arranged in a second direction (for example, vertical), and the first direction is perpendicular to the second direction.
Optionally, in this embodiment of the present invention, the first parameter may include a first phase difference and a first confidence value corresponding to the first phase difference, and the second parameter may include a second phase difference and a second confidence value corresponding to the second phase difference.
It can be understood that the first phase difference is the actual output phase difference of the first camera, and the second phase difference is the actual output phase difference of the second camera; the first confidence value is the actual output confidence value of the first camera and indicates the credibility of the first phase difference, and the second confidence value is the actual output confidence value of the second camera and indicates the credibility of the second phase difference.
In this embodiment of the present invention, since the angle between the straight-line directions of the PD point pair sets on every two cameras is within the first preset angle range, the electronic device can obtain phase parameters in multiple different directions and thus determine the in-focus position in the corresponding direction, which improves the accuracy with which the electronic device determines the in-focus position (or focus position).
In this embodiment of the present invention, since the angle between the straight-line direction of the first PD point pair set and that of the second PD point pair set is within the second preset angle range, the electronic device can obtain phase parameters in two different directions and thus determine the in-focus position in the corresponding direction, which improves the accuracy with which the electronic device determines the in-focus position (or focus position).
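As a rough sketch of the geometric condition described above (illustrative only; the function names and the sample tolerance band around 90° are our assumptions, not part of the disclosure):

```python
def angle_between_directions(theta1_deg: float, theta2_deg: float) -> float:
    """Acute angle (in degrees) between two undirected line directions,
    folded into [0, 90] because a PD point-pair line has no orientation."""
    diff = abs(theta1_deg - theta2_deg) % 180.0
    return min(diff, 180.0 - diff)

def within_preset_range(theta1_deg, theta2_deg, lo=85.0, hi=90.0):
    """Check that the angle between two PD point-pair line directions lies
    in a preset angle range; [85, 90] degrees is an assumed example range."""
    return lo <= angle_between_directions(theta1_deg, theta2_deg) <= hi
```

With the 90° preset angle value named in the text, a horizontal array (0°) and a vertical array (90°) satisfy the condition.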
Step 202: The electronic device determines at least two target phase differences according to the at least two target parameters.
In this embodiment of the present invention, each of the at least two target phase differences is the phase difference in the direction corresponding to one camera.
Optionally, in this embodiment of the present invention, with reference to Fig. 2, as shown in Fig. 3, step 202 may be specifically implemented by the following step 202a.
Step 202a: If the intrinsic phase parameters of the at least two cameras are the same, the electronic device uses a first algorithm to determine the at least two target phase differences according to the at least two target parameters.
Optionally, in this embodiment of the present invention, each intrinsic phase parameter may include an intrinsic phase difference and an intrinsic confidence value.
It should be noted that the intrinsic phase parameters of the at least two cameras being the same can be understood as: the value range of the intrinsic phase difference of each camera is the same, and the value range of the intrinsic confidence value of each camera is the same.
Optionally, in this embodiment of the present invention, if the at least two cameras include a first camera and a second camera, the electronic device may determine, according to the first parameter and the second parameter, the target phase difference corresponding to the first camera and the target phase difference corresponding to the second camera.
It can be understood that the electronic device can calculate, through the PD point pairs, the phase difference in the direction corresponding to each camera according to the first parameter and the second parameter.
It should be noted that each target phase difference is the in-focus phase difference corresponding to one camera.
Optionally, in this embodiment of the present invention, in the case where the first intrinsic phase parameter of the first camera is the same as the second intrinsic phase parameter of the second camera, the electronic device may use the first algorithm to determine the target phase difference corresponding to the first camera and the target phase difference corresponding to the second camera according to the first parameter and the second parameter.
Optionally, in this embodiment of the present invention, the first intrinsic phase parameter may include a first intrinsic phase difference and a first intrinsic confidence value, and the second intrinsic phase parameter may include a second intrinsic phase difference and a second intrinsic confidence value.
It should be noted that the first intrinsic phase parameter of the first camera being the same as the second intrinsic phase parameter of the second camera can be understood as: the value range of the first intrinsic phase difference is the same as that of the second intrinsic phase difference, and the value range of the first intrinsic confidence value is the same as that of the second intrinsic confidence value.
It can be understood that, when the first intrinsic phase parameter is the same as the second intrinsic phase parameter, the phase difference calculated by the electronic device in the direction corresponding to the first camera is the same as that in the direction corresponding to the second camera.
Optionally, in this embodiment of the present invention, the first algorithm may be F_a = (F1 × C1 + F2 × C2) / (C1 + C2), where F_a is the target phase difference, F1 is the first phase difference, C1 is the first confidence value, F2 is the second phase difference, and C2 is the second confidence value.
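The first algorithm above can be sketched directly (illustrative only; the function name and the zero-confidence guard are our additions):

```python
def first_algorithm(f1: float, c1: float, f2: float, c2: float) -> float:
    """Fa = (F1*C1 + F2*C2) / (C1 + C2): confidence-weighted fusion of the
    two cameras' phase differences when their intrinsic ranges match."""
    if c1 + c2 == 0:
        raise ValueError("both confidence values are zero")
    return (f1 * c1 + f2 * c2) / (c1 + c2)
```

A camera whose direction matches the scene texture reports a higher confidence, so its phase difference dominates the weighted result.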
Optionally, in this embodiment of the present invention, with reference to Fig. 2, as shown in Fig. 4, step 202 may be specifically implemented by the following step 202b.
Step 202b: If the intrinsic phase parameters of the at least two cameras are different, the electronic device uses a second algorithm to determine the at least two target phase differences according to the at least two target parameters and the intrinsic phase parameter of each camera.
It should be noted that the intrinsic phase parameters of the at least two cameras being different can be understood as: the value range of the intrinsic phase difference of each camera is different, and the value range of the intrinsic confidence value of each camera is different.
Optionally, in this embodiment of the present invention, in the case where the first intrinsic phase parameter of the first camera is different from the second intrinsic phase parameter of the second camera, the electronic device may use the second algorithm to determine the target phase differences according to the first intrinsic phase parameter, the second intrinsic phase parameter, the first parameter, and the second parameter.
It should be noted that the first intrinsic phase parameter of the first camera being different from the second intrinsic phase parameter of the second camera can be understood as: the value range of the first intrinsic phase difference is different from that of the second intrinsic phase difference, and the value range of the first intrinsic confidence value is different from that of the second intrinsic confidence value.
Optionally, in this embodiment of the present invention, the electronic device may map the first parameter onto the second camera, or map the second parameter onto the first camera, to calculate the phase difference corresponding to the first camera or to the second camera.
Optionally, in this embodiment of the present invention, assume that the value range of the first intrinsic phase difference is [F10, F11], the value range of the first intrinsic confidence value is [C10, C11], the value range of the second intrinsic phase difference is [F20, F21], and the value range of the second intrinsic confidence value is [C20, C21].
Exemplarily, taking the first camera as an example, the first parameter may be mapped onto the second camera. The second algorithm may be F1″ = F_b × (F11 − F10) / (F21 − F20), where F1″ is the target phase difference corresponding to the first camera and F_b is the target phase difference corresponding to the second camera.
Here, F_b = (F1′ × C1′ + F2 × C2) / (C1′ + C2), where F1′ is the phase difference obtained by mapping the first phase difference onto the second camera, C1′ is the confidence value obtained by mapping the first confidence value onto the second camera, F2 is the second phase difference, and C2 is the second confidence value.
Here, F1′ = F1 × (F21 − F20) / (F11 − F10) and C1′ = C1 × (C21 − C20) / (C11 − C10), where F1 is the first phase difference and C1 is the first confidence value.
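Putting the mapping formulas above together, a sketch of the second algorithm might look as follows (illustrative only; the function name and the tuple-based range representation are our assumptions):

```python
def second_algorithm(f1, c1, f2, c2, f1_range, c1_range, f2_range, c2_range):
    """Map camera 1's phase parameters onto camera 2's scale, fuse by
    confidence, then map the fused result back to camera 1's scale.

    Each *_range is a (low, high) tuple for the corresponding intrinsic
    phase-difference or confidence range. Returns (F1'', Fb): the target
    phase differences for camera 1 and camera 2, respectively.
    """
    f_scale = (f2_range[1] - f2_range[0]) / (f1_range[1] - f1_range[0])
    c_scale = (c2_range[1] - c2_range[0]) / (c1_range[1] - c1_range[0])
    f1_mapped = f1 * f_scale          # F1' = F1 * (F21 - F20) / (F11 - F10)
    c1_mapped = c1 * c_scale          # C1' = C1 * (C21 - C20) / (C11 - C10)
    fb = (f1_mapped * c1_mapped + f2 * c2) / (c1_mapped + c2)
    f1_target = fb / f_scale          # F1'' = Fb * (F11 - F10) / (F21 - F20)
    return f1_target, fb
```

The rescaling makes the two cameras' phase differences and confidence values commensurable before the same confidence-weighted fusion as in the first algorithm is applied.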
Step 203: The electronic device controls the at least two cameras to focus separately according to the at least two target phase differences.
In this embodiment of the present invention, the electronic device may first determine, according to the at least two target phase differences, the in-focus position (that is, the focus position) of each of the at least two cameras, and then control the corresponding camera to focus according to each in-focus position (that is, focus on one position in the image to be captured or in the preview interface).
Optionally, in this embodiment of the present invention, the electronic device may calculate the in-focus position of the first camera or of the second camera according to the target phase difference F_a.
Optionally, in this embodiment of the present invention, the electronic device may calculate the in-focus position of the first camera according to the target phase difference F1″, and calculate the in-focus position of the second camera according to the target phase difference F_b.
Optionally, in this embodiment of the present invention, the electronic device may calculate the phase difference through the PD point pairs and then convert the phase difference into the distance by which the motor moves, so as to determine the in-focus position.
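In practice the conversion from a phase difference to a motor movement is driven by a per-module calibration constant; a minimal sketch under that assumption (the gain value, clamping range, and function name are illustrative, not from the disclosure):

```python
def phase_diff_to_motor_position(phase_diff, current_pos, gain=10.0,
                                 min_pos=0, max_pos=1023):
    """Convert a target phase difference into a motor target position:
    displacement = phase difference * calibration gain, with the result
    clamped to the motor's travel range."""
    target = current_pos + phase_diff * gain
    return max(min_pos, min(max_pos, round(target)))
```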
It should be noted that, when the two cameras (that is, the first camera and the second camera) are arranged orthogonally (that is, the PD point pairs on the first camera and those on the second camera are arranged orthogonally), there cannot exist a gradient direction (that is, a direction in which the object's gray level changes) perpendicular to both cameras; in other words, the confidence value of at least one camera is always relatively high, so this solution can detect the phase difference for any gradient direction.
In this embodiment of the present invention, using two cameras for focusing only requires that the arrangement directions of the PD point pairs of the two cameras be close to fully orthogonal; this improves the accuracy of the phase difference output and thus the focusing success rate, without any loss of image quality.
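The claim that one camera's confidence always stays relatively high under an orthogonal arrangement can be illustrated with a simple model; modeling the horizontal and vertical PD arrays' sensitivity as |cos| and |sin| of the gradient angle is our assumption for illustration only:

```python
import math

def best_direction_sensitivity(gradient_deg: float) -> float:
    """Model the horizontal and vertical PD arrays' sensitivity to an edge
    with the given gradient direction as |cos| and |sin| of that angle;
    with orthogonal arrays the better of the two never drops below
    sqrt(2)/2, so one direction always remains usable."""
    rad = math.radians(gradient_deg)
    return max(abs(math.cos(rad)), abs(math.sin(rad)))
```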
An embodiment of the present invention provides a focusing method. The electronic device can determine at least two target phase differences (each being the phase difference in the direction corresponding to one camera) according to the at least two obtained target parameters, and control the at least two cameras to focus separately according to the at least two target phase differences. Because the angle between the straight-line directions of the PD point pair sets on every two cameras is within the first preset angle range, the electronic device can obtain phase parameters in multiple different directions, and can therefore accurately determine the in-focus position in the corresponding direction, which improves the focusing accuracy of the electronic device.
Fig. 5 shows a possible schematic structural diagram of the electronic device involved in the embodiments of the present invention. The electronic device includes at least two cameras, each of which is provided with a set of PD point pairs. As shown in Fig. 5, the electronic device 50 may include an obtaining module 51, a determining module 52, and a control module 53.
The obtaining module 51 is configured to obtain at least two target parameters, where each target parameter is a phase parameter obtained through the PD point pair set on one camera, and the angle between the straight-line directions of the PD point pair sets on every two of the at least two cameras is within a first preset angle range. The determining module 52 is configured to determine at least two target phase differences according to the at least two target parameters obtained by the obtaining module 51, where each target phase difference is the phase difference in the direction corresponding to one camera. The control module 53 is configured to control the at least two cameras to focus separately according to the at least two target phase differences determined by the determining module 52.
In a possible implementation, the at least two cameras include a first camera and a second camera; the first camera is provided with a first PD point pair set, and the second camera is provided with a second PD point pair set; the angle between the straight-line direction of the first PD point pair set and that of the second PD point pair set is within a second preset angle range. Each of the at least two target parameters includes a phase difference and a confidence value.
In a possible implementation, the determining module 52 is specifically configured to: if the intrinsic phase parameters of the at least two cameras are the same, use a first algorithm to determine the at least two target phase differences according to the at least two target parameters; or, if the intrinsic phase parameters of the at least two cameras are different, use a second algorithm to determine the at least two target phase differences according to the at least two target parameters and the intrinsic phase parameter of each camera.
The electronic device provided by this embodiment of the present invention can implement each process implemented by the electronic device in the foregoing method embodiments; to avoid repetition, details are not described here again.
An embodiment of the present invention provides an electronic device. The electronic device may include at least two cameras, each of which is provided with a set of PD point pairs, and the angle between the straight-line directions in which the PD point pair sets on every two of the at least two cameras are arranged is within a first preset angle range. Because this angle is within the first preset angle range, the electronic device can obtain phase parameters in multiple different directions, and can therefore accurately determine the in-focus position in the corresponding direction, which improves the focusing accuracy of the electronic device.
Fig. 6 is a schematic hardware diagram of an electronic device implementing the embodiments of the present invention. As shown in Fig. 6, the electronic device 100 includes but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, a power supply 111, and other components.
It should be noted that a person skilled in the art can understand that the electronic device structure shown in Fig. 6 does not constitute a limitation on the electronic device; the electronic device may include more or fewer components than shown in Fig. 6, a combination of certain components, or a different arrangement of components. In the embodiments of the present invention, electronic devices include but are not limited to mobile phones, tablet computers, notebook computers, palmtop computers, in-vehicle terminals, wearable devices, and pedometers.
The processor 110 is configured to: obtain at least two target parameters, where each target parameter is a phase parameter obtained through the PD point pair set on one camera, and the angle between the straight-line directions of the PD point pair sets on every two of the at least two cameras is within a first preset angle range; determine at least two target phase differences according to the at least two target parameters, where each target phase difference is the phase difference in the direction corresponding to one camera; and control the at least two cameras to focus separately according to the at least two target phase differences.
An embodiment of the present invention provides an electronic device. The electronic device can determine at least two target phase differences (each being the phase difference in the direction corresponding to one camera) according to the at least two obtained target parameters, and control the at least two cameras to focus separately according to the at least two target phase differences. Because the angle between the straight-line directions of the PD point pair sets on every two cameras is within the first preset angle range, the electronic device can obtain phase parameters in multiple different directions, and can therefore accurately determine the in-focus position in the corresponding direction, which improves the focusing accuracy of the electronic device.
It should be understood that, in this embodiment of the present invention, the radio frequency unit 101 can be used to receive and send signals during information transmission and reception or during a call; specifically, it receives downlink data from the base station and passes it to the processor 110 for processing, and sends uplink data to the base station. Generally, the radio frequency unit 101 includes but is not limited to an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, and a duplexer. In addition, the radio frequency unit 101 can also communicate with the network and other devices through a wireless communication system.
The electronic device provides users with wireless broadband Internet access through the network module 102, for example helping users send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 103 can convert audio data received by the radio frequency unit 101 or the network module 102, or stored in the memory 109, into an audio signal and output it as sound. Moreover, the audio output unit 103 can also provide audio output related to a specific function performed by the electronic device 100 (for example, a call signal reception sound or a message reception sound). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive audio or video signals. The input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042; the graphics processor 1041 processes image data of still pictures or videos obtained by an image capture device (such as a camera) in video capture mode or image capture mode. The processed image frames can be displayed on the display unit 106. The image frames processed by the graphics processor 1041 can be stored in the memory 109 (or another storage medium) or sent via the radio frequency unit 101 or the network module 102. The microphone 1042 can receive sound and can process such sound into audio data. In telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 and output.
The electronic device 100 further includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor; the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the electronic device 100 is moved close to the ear. As a kind of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in each direction (usually three axes), can detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the electronic device (such as landscape/portrait switching, related games, and magnetometer attitude calibration) and for vibration-recognition functions (such as a pedometer or tap detection). The sensor 105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not repeated here.
The display unit 106 is used to display information input by the user or information provided to the user. The display unit 106 may include a display panel 1061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like.
The user input unit 107 can be used to receive input numeric or character information and to generate key signal input related to user settings and function control of the electronic device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory). The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave types. Besides the touch panel 1071, the user input unit 107 may also include other input devices 1072. Specifically, the other input devices 1072 may include but are not limited to a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not repeated here.
Further, the touch panel 1071 can be overlaid on the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides corresponding visual output on the display panel 1061 according to the type of the touch event. Although in Fig. 6 the touch panel 1071 and the display panel 1061 are two independent components implementing the input and output functions of the electronic device, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the electronic device; this is not specifically limited here.
The interface unit 108 is an interface for connecting an external device to the electronic device 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, and an earphone port. The interface unit 108 can be used to receive input (for example, data information or power) from an external device and transmit the received input to one or more elements in the electronic device 100, or to transfer data between the electronic device 100 and the external device.
The memory 109 can be used to store software programs and various data. The memory 109 may mainly include a storage program area and a storage data area, where the storage program area may store the operating system and application programs required by at least one function (such as a sound playback function or an image playback function), and the storage data area may store data created according to the use of the mobile phone (such as audio data and a phone book). In addition, the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another solid-state storage device.
The processor 110 is the control center of the electronic device; it connects all parts of the entire electronic device using various interfaces and lines, and performs the various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby monitoring the electronic device as a whole. The processor 110 may include one or more processing units; optionally, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, and application programs, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 110.
The electronic device 100 may further include a power supply 111 (such as a battery) supplying power to each component; optionally, the power supply 111 may be logically connected to the processor 110 through a power management system, so that functions such as charging management, discharging management, and power consumption management are implemented through the power management system.
In addition, the electronic device 100 includes some functional modules that are not shown, which are not repeated here.
Optionally, an embodiment of the present invention further provides an electronic device, including a processor 110 as shown in Fig. 6, a memory 109, and a computer program stored in the memory 109 and executable on the processor 110; when the computer program is executed by the processor 110, each process of the foregoing method embodiment is implemented and the same technical effect can be achieved. To avoid repetition, details are not described here again.
An embodiment of the present invention further provides a computer-readable storage medium storing a computer program; when the computer program is executed by a processor, each process of the foregoing method embodiment is implemented and the same technical effect can be achieved. To avoid repetition, details are not described here again. The computer-readable storage medium is, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
It should be noted that, herein, the terms "comprise" and "include" and any variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or apparatus that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or apparatus that includes the element.
Through the description of the above embodiments, a person skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and of course also by hardware, but in many cases the former is the better implementation. Based on such an understanding, the technical solution of the present invention, in essence, or the part contributing to the existing technology, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes a number of instructions that cause an electronic device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the embodiments of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to the above specific implementations. The above specific implementations are merely illustrative rather than restrictive. Under the inspiration of the present invention, a person of ordinary skill in the art can make many other forms without departing from the spirit of the present invention and the scope protected by the claims, all of which fall within the protection of the present invention.

Claims (10)

  1. An electronic device, the electronic device comprising at least two cameras, wherein each of the at least two cameras is provided with a set of phase difference (PD) point pairs;
    wherein the angle between the straight-line directions in which the PD point pair sets on every two of the at least two cameras are arranged is within a first preset angle range.
  2. The electronic device according to claim 1, wherein the at least two cameras comprise a first camera and a second camera, the first camera is provided with a first PD point pair set, and the second camera is provided with a second PD point pair set;
    wherein the angle between the straight-line direction of the first PD point pair set and the straight-line direction of the second PD point pair set is within a second preset angle range.
  3. A focusing method, applied to an electronic device, the electronic device comprising at least two cameras, wherein each of the at least two cameras is provided with a set of phase difference (PD) point pairs, and the method comprises:
    obtaining at least two target parameters, wherein each target parameter is a phase parameter obtained through the PD point pair set on one camera, and the angle between the straight-line directions of the PD point pair sets on every two of the at least two cameras is within a first preset angle range;
    determining at least two target phase differences according to the at least two target parameters, wherein each target phase difference is the phase difference in the direction corresponding to one camera; and
    controlling the at least two cameras to focus separately according to the at least two target phase differences.
  4. The method according to claim 3, wherein the at least two cameras comprise a first camera and a second camera, the first camera is provided with a first PD point pair set, and the second camera is provided with a second PD point pair set; the angle between the straight-line direction of the first PD point pair set and the straight-line direction of the second PD point pair set is within a second preset angle range; and
    each of the at least two target parameters comprises a phase difference and a confidence value.
  5. The method according to claim 3 or 4, wherein the determining at least two target phase differences according to the at least two target parameters comprises:
    if the intrinsic phase parameters of the at least two cameras are the same, using a first algorithm to determine the at least two target phase differences according to the at least two target parameters;
    if the intrinsic phase parameters of the at least two cameras are different, using a second algorithm to determine the at least two target phase differences according to the at least two target parameters and the intrinsic phase parameter of each camera.
  6. An electronic device, the electronic device comprising at least two cameras, wherein each of the at least two cameras is provided with a set of phase difference (PD) point pairs, and the electronic device comprises an obtaining module, a determining module, and a control module;
    the obtaining module is configured to obtain at least two target parameters, wherein each target parameter is a phase parameter obtained through the PD point pair set on one camera, and the angle between the straight-line directions of the PD point pair sets on every two of the at least two cameras is within a first preset angle range;
    the determining module is configured to determine at least two target phase differences according to the at least two target parameters obtained by the obtaining module, wherein each target phase difference is the phase difference in the direction corresponding to one camera; and
    the control module is configured to control the at least two cameras to focus separately according to the at least two target phase differences determined by the determining module.
  7. The electronic device according to claim 6, wherein the at least two cameras comprise a first camera and a second camera, the first camera is provided with a first PD point pair set, and the second camera is provided with a second PD point pair set; the angle between the straight-line direction of the first PD point pair set and the straight-line direction of the second PD point pair set is within a second preset angle range; and
    each of the at least two target parameters comprises a phase difference and a confidence value.
  8. The electronic device according to claim 6 or 7, wherein the determining module is specifically configured to: if the intrinsic phase parameters of the at least two cameras are the same, use a first algorithm to determine the at least two target phase differences according to the at least two target parameters; or, if the intrinsic phase parameters of the at least two cameras are different, use a second algorithm to determine the at least two target phase differences according to the at least two target parameters and the intrinsic phase parameter of each camera.
  9. An electronic device, comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein when the computer program is executed by the processor, the steps of the focusing method according to any one of claims 3 to 5 are implemented.
  10. A computer-readable storage medium, storing a computer program, wherein when the computer program is executed by a processor, the steps of the focusing method according to any one of claims 3 to 5 are implemented.
PCT/CN2020/131165 2019-11-28 2020-11-24 电子设备及对焦方法 WO2021104265A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
KR1020227018221A KR20220085834A (ko) 2019-11-28 2020-11-24 전자 기기 및 포커싱 방법
JP2022527786A JP7472281B2 (ja) 2019-11-28 2020-11-24 電子機器及びフォーカス方法
EP20893880.3A EP4057616A4 (en) 2019-11-28 2020-11-24 ELECTRONIC DEVICE AND METHOD FOR DEVELOPING
US17/746,424 US11856294B2 (en) 2019-11-28 2022-05-17 Electronic device and focusing method for electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911194022.5 2019-11-28
CN201911194022.5A CN110769162B (zh) 2019-11-28 2019-11-28 电子设备及对焦方法

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/746,424 Continuation US11856294B2 (en) 2019-11-28 2022-05-17 Electronic device and focusing method for electronic device

Publications (1)

Publication Number Publication Date
WO2021104265A1 true WO2021104265A1 (zh) 2021-06-03

Family

ID=69339972

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/131165 WO2021104265A1 (zh) 2019-11-28 2020-11-24 电子设备及对焦方法

Country Status (6)

Country Link
US (1) US11856294B2 (zh)
EP (1) EP4057616A4 (zh)
JP (1) JP7472281B2 (zh)
KR (1) KR20220085834A (zh)
CN (1) CN110769162B (zh)
WO (1) WO2021104265A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110769162B (zh) 2019-11-28 2021-07-13 维沃移动通信有限公司 电子设备及对焦方法
CN117499549B (zh) * 2023-12-25 2024-05-14 荣耀终端有限公司 一种扫描方法及电子设备

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105376474A (zh) * 2014-09-01 2016-03-02 光宝电子(广州)有限公司 图像采集装置及其自动对焦方法
CN106331499A (zh) * 2016-09-13 2017-01-11 努比亚技术有限公司 对焦方法及拍照设备
CN107395990A (zh) * 2017-08-31 2017-11-24 珠海市魅族科技有限公司 相位对焦方法及装置、终端、计算机装置及可读存储介质
JP2019087880A (ja) * 2017-11-07 2019-06-06 キヤノン株式会社 信号出力装置、カメラ雲台、カメラ雲台連動システム、およびプログラム
CN110769162A (zh) * 2019-11-28 2020-02-07 维沃移动通信有限公司 电子设备及对焦方法

Family Cites Families (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0572470A (ja) * 1991-06-26 1993-03-26 Fuji Photo Film Co Ltd 位相差検出型焦点検出装置
JP5317562B2 (ja) * 2008-07-17 2013-10-16 キヤノン株式会社 位相差検出装置、撮像装置、位相差検出方法、位相差検出プログラム
JP5230388B2 (ja) * 2008-12-10 2013-07-10 キヤノン株式会社 焦点検出装置及びその制御方法
US8755036B2 (en) * 2010-03-11 2014-06-17 Optical Physics Company Active imaging system and method
JP2011211381A (ja) * 2010-03-29 2011-10-20 Fujifilm Corp 立体撮像装置
WO2013047160A1 (ja) * 2011-09-28 2013-04-04 富士フイルム株式会社 固体撮像素子、撮像装置、及び合焦制御方法
CN104272161B (zh) * 2012-05-01 2016-05-18 富士胶片株式会社 摄像装置和对焦控制方法
JP6045362B2 (ja) * 2013-01-17 2016-12-14 オリンパス株式会社 撮像装置および焦点検出方法
JP6186900B2 (ja) * 2013-06-04 2017-08-30 ソニー株式会社 固体撮像装置、電子機器、レンズ制御方法、および撮像モジュール
JP6053652B2 (ja) * 2013-09-20 2016-12-27 富士フイルム株式会社 撮像装置及び合焦制御方法
KR102294316B1 (ko) * 2014-08-04 2021-08-26 엘지이노텍 주식회사 이미지 센서 및 이를 포함하는 촬상 장치
US9906772B2 (en) 2014-11-24 2018-02-27 Mediatek Inc. Method for performing multi-camera capturing control of an electronic device, and associated apparatus
JP6536126B2 (ja) * 2015-03-31 2019-07-03 株式会社ニコン 撮像素子および撮像装置
US10122998B2 (en) * 2015-04-30 2018-11-06 Seiko Epson Corporation Real time sensor and method for synchronizing real time sensor data streams
US9807294B2 (en) 2015-08-05 2017-10-31 Omnivision Technologies, Inc. Image sensor with symmetric multi-pixel phase-difference detectors, and associated methods
US9910247B2 (en) * 2016-01-21 2018-03-06 Qualcomm Incorporated Focus hunting prevention for phase detection auto focus (AF)
KR20170139408A (ko) * 2016-06-09 2017-12-19 엘지전자 주식회사 듀얼 카메라가 장착된 동영상 촬영 장치
CN106331484B (zh) * 2016-08-24 2020-02-14 维沃移动通信有限公司 一种对焦方法及移动终端
JP2018050267A (ja) * 2016-09-23 2018-03-29 キヤノン株式会社 撮像装置及び撮像素子の制御方法
CN106506969B (zh) * 2016-11-29 2019-07-19 Oppo广东移动通信有限公司 摄像模组、通过其进行人像追踪的方法以及电子设备
CN106791373B (zh) 2016-11-29 2020-03-13 Oppo广东移动通信有限公司 对焦处理方法、装置及终端设备
CN106506922A (zh) * 2016-11-29 2017-03-15 广东欧珀移动通信有限公司 摄像模组以及电子设备
JP2018169517A (ja) 2017-03-30 2018-11-01 ソニーセミコンダクタソリューションズ株式会社 撮像装置、撮像モジュールおよび撮像装置の制御方法
CN107124547B (zh) * 2017-04-19 2020-10-16 宇龙计算机通信科技(深圳)有限公司 双摄像头拍照方法及装置
CN106973206B (zh) * 2017-04-28 2020-06-05 Oppo广东移动通信有限公司 摄像模组摄像处理方法、装置和终端设备
CN109639974A (zh) * 2018-12-20 2019-04-16 Oppo广东移动通信有限公司 控制方法、控制装置、电子装置及介质
CN109862271B (zh) * 2019-02-27 2021-03-16 上海创功通讯技术有限公司 相位差对焦方法、电子设备及计算机存储介质

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105376474A (zh) * 2014-09-01 2016-03-02 光宝电子(广州)有限公司 图像采集装置及其自动对焦方法
CN106331499A (zh) * 2016-09-13 2017-01-11 努比亚技术有限公司 对焦方法及拍照设备
CN107395990A (zh) * 2017-08-31 2017-11-24 珠海市魅族科技有限公司 相位对焦方法及装置、终端、计算机装置及可读存储介质
JP2019087880A (ja) * 2017-11-07 2019-06-06 キヤノン株式会社 信号出力装置、カメラ雲台、カメラ雲台連動システム、およびプログラム
CN110769162A (zh) * 2019-11-28 2020-02-07 维沃移动通信有限公司 电子设备及对焦方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4057616A4 *

Also Published As

Publication number Publication date
JP7472281B2 (ja) 2024-04-22
EP4057616A4 (en) 2022-12-28
EP4057616A1 (en) 2022-09-14
US11856294B2 (en) 2023-12-26
JP2023501608A (ja) 2023-01-18
CN110769162B (zh) 2021-07-13
KR20220085834A (ko) 2022-06-22
CN110769162A (zh) 2020-02-07
US20220279129A1 (en) 2022-09-01

Similar Documents

Publication Publication Date Title
CN108513070B (zh) 一种图像处理方法、移动终端及计算机可读存储介质
WO2021057267A1 (zh) 图像处理方法及终端设备
WO2021104197A1 (zh) 对象跟踪方法及电子设备
WO2021104195A1 (zh) 图像显示方法及电子设备
CN109743498B (zh) 一种拍摄参数调整方法及终端设备
WO2021098603A1 (zh) 预览画面显示方法及电子设备
WO2021098697A1 (zh) 屏幕显示的控制方法及电子设备
WO2021013009A1 (zh) 拍照方法和终端设备
WO2021036623A1 (zh) 显示方法及电子设备
WO2021082744A1 (zh) 视频查看方法及电子设备
WO2021082772A1 (zh) 截屏方法及电子设备
US12022190B2 (en) Photographing method and electronic device
WO2021104266A1 (zh) 对象显示方法及电子设备
CN110784575B (zh) 一种电子设备和拍摄方法
CN110830713A (zh) 一种变焦方法及电子设备
WO2019137535A1 (zh) 物距测量方法及终端设备
WO2021147911A1 (zh) 移动终端、拍摄模式的检测方法及存储介质
WO2021017785A1 (zh) 数据传输方法及终端设备
WO2021104265A1 (zh) 电子设备及对焦方法
WO2020156119A1 (zh) 应用程序界面调整方法及移动终端
CN110602390B (zh) 一种图像处理方法及电子设备
CN109104573B (zh) 一种确定对焦点的方法及终端设备
CN109005337B (zh) 一种拍照方法及终端
WO2021136181A1 (zh) 图像处理方法及电子设备
US11877057B2 (en) Electronic device and focusing method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20893880

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022527786

Country of ref document: JP

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 20227018221

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2020893880

Country of ref document: EP

Effective date: 20220607

NENP Non-entry into the national phase

Ref country code: DE