WO2021104265A1 - Electronic device and focusing method - Google Patents
Electronic device and focusing method
- Publication number
- WO2021104265A1 (international application PCT/CN2020/131165)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- cameras
- electronic device
- phase difference
- point pair
- Prior art date
Classifications
- H—ELECTRICITY
  - H04—ELECTRIC COMMUNICATION TECHNIQUE
    - H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
      - H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
        - H04N23/45—for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
        - H04N23/57—Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
        - H04N23/60—Control of cameras or camera modules
          - H04N23/67—Focus control based on electronic image sensor signals
            - H04N23/672—Focus control based on the phase difference signals
        - H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
Definitions
- the embodiment of the present invention relates to the field of communication technology, and in particular to an electronic device and a focusing method.
- the electronic device can use a Phase Difference Auto Focus (PDAF) method to focus to obtain a clearer picture.
- Specifically, the electronic device can calculate a phase difference through the PD point pairs set on the sensor and convert the phase difference into a moving distance of the motor in the lens module, so that the electronic device can determine the focus point according to the moving distance to achieve focusing.
- However, the arrangement of the PD point pairs in the electronic device is fixed. As a result, the electronic device can accurately obtain the phase difference of an image in only one direction, and may not be able to accurately obtain the phase difference in other directions, so the accuracy of focusing is low.
- the embodiments of the present invention provide an electronic device and a focusing method, which can solve the problem of low accuracy of focusing by the electronic device.
- In a first aspect of the embodiments of the present invention, an electronic device is provided that includes at least two cameras, each of which is provided with a phase difference (PD) point pair set. The angle between the straight-line directions in which the PD point pair sets on every two of the at least two cameras are located is within a first preset angle range.
- In a second aspect, a focusing method is provided, which is applied to an electronic device. The electronic device includes at least two cameras, each of which is provided with a PD point pair set. The focusing method includes: acquiring at least two target parameters, where each target parameter is a phase parameter obtained through the PD point pair set on one camera, and the angle between the straight-line directions in which the PD point pair sets on every two of the at least two cameras are located is within the first preset angle range; determining at least two target phase differences according to the at least two target parameters, where each target phase difference is the phase difference in the direction corresponding to one camera; and controlling the at least two cameras to focus separately according to the at least two target phase differences.
- In a third aspect of the embodiments of the present invention, an electronic device is provided that includes at least two cameras, each of which is provided with a PD point pair set. The electronic device includes an acquisition module, a determination module, and a control module.
- The acquisition module is used to acquire at least two target parameters, where each target parameter is a phase parameter acquired through the PD point pair set on one camera, and the angle between the straight-line directions in which the PD point pair sets on every two of the at least two cameras are located is within the first preset angle range.
- the determining module is configured to determine at least two target phase differences according to the at least two target parameters acquired by the acquiring module, and each target phase difference is a phase difference in a direction corresponding to a camera.
- The control module is configured to control the at least two cameras to focus separately according to the at least two target phase differences determined by the determining module.
- In a fourth aspect of the embodiments of the present invention, an electronic device is provided that includes a processor, a memory, and a computer program stored in the memory and executable on the processor. When the computer program is executed by the processor, the steps of the focusing method described in the second aspect are implemented.
- In a fifth aspect, a computer-readable storage medium is provided that stores a computer program. When the computer program is executed by a processor, the steps of the focusing method described in the second aspect are implemented.
- In the embodiments of the present invention, the electronic device may include at least two cameras, each of which is provided with a PD point pair set, and the angle between the straight-line directions in which the PD point pair sets on every two of the at least two cameras are located is within the first preset angle range. Because this angle is within the first preset angle range, the electronic device can obtain phase parameters in multiple different directions, so it can accurately determine the focus position in the corresponding direction, which improves the focusing accuracy of the electronic device.
- FIG. 1 is a schematic diagram of the architecture of an Android operating system provided by an embodiment of the present invention
- FIG. 2 is one of the schematic diagrams of a focusing method provided by an embodiment of the present invention.
- FIG. 3 is a second schematic diagram of a focusing method provided by an embodiment of the present invention.
- FIG. 4 is the third schematic diagram of a focusing method provided by an embodiment of the present invention.
- FIG. 5 is a schematic structural diagram of an electronic device provided by an embodiment of the present invention.
- FIG. 6 is a schematic diagram of hardware of an electronic device provided by an embodiment of the present invention.
- The terms “first” and “second” in the description and claims of the embodiments of the present invention are used to distinguish different objects, rather than to describe a specific order of objects. For example, the first camera and the second camera are used to distinguish different cameras, rather than to describe a specific order of the cameras.
- plural means two or more.
- a plurality of elements refers to two elements or more than two elements.
- Words such as “exemplary” or “for example” are used to represent examples, illustrations, or explanations. Any embodiment or design solution described as “exemplary” or “for example” in the embodiments of the present invention should not be construed as more preferable or advantageous than other embodiments or design solutions. Rather, words such as “exemplary” or “for example” are used to present related concepts in a specific manner.
- the embodiment of the present invention provides an electronic device and a focusing method.
- The electronic device may include at least two cameras, each of which is provided with a PD point pair set, and the angle between the straight-line directions in which the PD point pair sets on every two of the at least two cameras are located is within the first preset angle range. Because this angle is within the first preset angle range, the electronic device can obtain phase parameters in multiple different directions, so it can accurately determine the focus position in the corresponding direction, which improves the focusing accuracy of the electronic device.
- the electronic device and the focusing method provided by the embodiments of the present invention can be applied to a process in which the electronic device focuses on a camera.
- the electronic device in the embodiment of the present invention may be an electronic device with an operating system.
- the operating system may be an Android operating system, an ios operating system, or other possible operating systems, which is not specifically limited in the embodiment of the present invention.
- the following takes the Android operating system as an example to introduce the software environment to which the focusing method provided in the embodiment of the present invention is applied.
- FIG. 1 is a schematic structural diagram of a possible Android operating system provided by an embodiment of the present invention.
- the architecture of the Android operating system includes 4 layers, which are: application layer, application framework layer, system runtime library layer, and kernel layer (specifically, it may be the Linux kernel layer).
- the application layer includes various applications (including system applications and third-party applications) in the Android operating system.
- the application framework layer is the framework of the application. Developers can develop some applications based on the application framework layer while complying with the development principles of the application framework.
- the system runtime layer includes a library (also called a system library) and an Android operating system runtime environment.
- the library mainly provides various resources needed by the Android operating system.
- the Android operating system operating environment is used to provide a software environment for the Android operating system.
- the kernel layer is the operating system layer of the Android operating system and belongs to the lowest level of the Android operating system software level.
- the kernel layer is based on the Linux kernel to provide core system services and hardware-related drivers for the Android operating system.
- Developers can develop a software program that implements the focusing method provided by the embodiments of the present invention based on the system architecture of the Android operating system shown in FIG. 1, so that the software program can run on the Android operating system shown in FIG. 1. That is, the processor or the electronic device can implement the focusing method provided by the embodiments of the present invention by running the software program in the Android operating system.
- the electronic device in the embodiment of the present invention may be a mobile electronic device or a non-mobile electronic device.
- The mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a handheld computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), etc.
- the non-mobile electronic device may be a personal computer (PC), television (television, TV), teller machine, or self-service machine, etc., which is not specifically limited in the embodiment of the present invention.
- An embodiment of the present invention provides an electronic device, which includes at least two cameras, and a PD point pair set is set on each of the at least two cameras.
- The angle between the straight-line directions in which the PD point pair sets on every two of the at least two cameras are located is within the first preset angle range.
- the above-mentioned PD point pair set may include at least two PD point pairs.
- A PD point pair can be understood as a pair of special pixel points on the camera's sensor that are used to detect phase; such pixel points generally appear in pairs. A PD sensor is a sensor (camera) provided with PD point pairs.
- the above-mentioned at least two cameras include a first camera and a second camera, the first camera is provided with a first PD point pair set, and the second camera is provided with a second PD point pair set.
- the angle value between the linear direction where the first PD point pair set is located and the linear direction where the second PD point pair set is located is within the second preset angle range.
- An embodiment of the present invention provides an electronic device. The electronic device may include at least two cameras, each of which is provided with a PD point pair set, and the angle between the straight-line directions in which the PD point pair sets on every two of the at least two cameras are located is within the first preset angle range. Because this angle is within the first preset angle range, the electronic device can obtain phase parameters in multiple different directions, so it can accurately determine the focus position in the corresponding direction, which improves the focusing accuracy of the electronic device.
- FIG. 2 shows a flowchart of a focusing method provided by an embodiment of the present invention.
- the method can be applied to an electronic device having an Android operating system as shown in FIG. 1.
- the focusing method provided by the embodiment of the present invention may include the following steps 201 to 203.
- Step 201 The electronic device obtains at least two target parameters.
- the electronic device may include at least two cameras, and PD point pair sets are set on each of the at least two cameras.
- Each of the at least two target parameters is a phase parameter obtained through the PD point pair set on one camera, and the angle between the straight-line directions in which the PD point pair sets on every two of the at least two cameras are located is within the first preset angle range.
- The at least two cameras may include a first camera and a second camera. The first camera is provided with a first PD point pair set, and the second camera is provided with a second PD point pair set. The angle between the straight-line direction in which the first PD point pair set is located and the straight-line direction in which the second PD point pair set is located is within a second preset angle range.
- The angle between the straight-line directions of the PD point pair sets on every two cameras may be, for example, 180°/N, a value within the first preset angle range.
- each of the above-mentioned at least two target parameters includes a phase difference and a credibility value.
- the aforementioned at least two cameras include a first camera and a second camera
- the aforementioned at least two target parameters may include the first parameter and the second parameter
- the above-mentioned first parameter is a phase parameter when an image is collected by a first camera
- the above-mentioned second parameter is a phase parameter when an image is collected by a second camera.
- the electronic device may obtain the phase parameter of the image to be collected through the first camera, and obtain the phase parameter of the image to be collected through the second camera.
- the electronic device may obtain the first parameter through the first PD point pair set, and obtain the second parameter through the second PD point pair set.
- A plurality of first PD point pairs (that is, the first PD point pair set) are provided on the first camera and are arranged in one direction; a plurality of second PD point pairs (that is, the second PD point pair set) are provided on the second camera and are arranged in another direction; and the angle between the two directions is within the second preset angle range.
- The relationship between the position of the first camera and the position of the second camera can make the angle between the straight-line direction of the first PD point pair set and the straight-line direction of the second PD point pair set fall within the second preset angle range.
- the foregoing second preset angle range may be a preset angle value, and the preset angle value is 90°.
- The arrangement direction of the first PD point pair set is a first direction (for example, the horizontal direction), and the arrangement direction of the second PD point pair set is a second direction (for example, the vertical direction).
- The first direction is perpendicular to the second direction.
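- As an illustrative aside (not part of the claimed embodiment), the geometric condition described above, for example near-perpendicular PD point pair directions, can be checked numerically. In the following Python sketch, the direction angles, the 90° target, and the tolerance are assumptions introduced here for illustration only:

```python
def included_angle_deg(direction_a_deg: float, direction_b_deg: float) -> float:
    """Included angle between two line directions, folded into [0, 90] degrees."""
    diff = abs(direction_a_deg - direction_b_deg) % 180.0
    return min(diff, 180.0 - diff)

def within_preset_range(angle_deg: float, target_deg: float = 90.0, tolerance_deg: float = 5.0) -> bool:
    """True if the included angle lies within [target - tolerance, target + tolerance]."""
    return abs(angle_deg - target_deg) <= tolerance_deg

# Example: first camera's PD point pairs arranged horizontally (0 degrees),
# second camera's PD point pairs arranged vertically (90 degrees).
angle = included_angle_deg(0.0, 90.0)
print(angle, within_preset_range(angle))  # 90.0 True
```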
- The first parameter may include a first phase difference and a first credibility value corresponding to the first phase difference, and the second parameter may include a second phase difference and a second credibility value corresponding to the second phase difference.
- the first phase difference is the actual output phase difference of the first camera
- the second phase difference is the actual output phase difference of the second camera
- the first credibility value is the actual output credibility value of the first camera and is used to indicate the credibility of the first phase difference
- the second credibility value is the actual output credibility value of the second camera and is used to indicate the credibility of the second phase difference.
- In this way, the electronic device can acquire phase parameters in multiple (for example, two) different directions and determine the focus position in the corresponding direction from those phase parameters, which can improve the accuracy with which the electronic device determines the focus position (that is, the in-focus position).
- Step 202 The electronic device determines at least two target phase differences according to the at least two target parameters.
- each of the above-mentioned at least two target phase differences is a phase difference in a direction corresponding to a camera.
- step 202 may be specifically implemented by the following step 202a.
- Step 202a If the inherent phase parameters of the at least two cameras are the same, the electronic device uses the first algorithm to determine the at least two target phase differences according to the at least two target parameters.
- each inherent phase parameter may include an inherent phase difference and an inherent reliability value.
- the same inherent phase parameters of at least two cameras can be understood as: the value range of the inherent phase difference of each camera is the same, and the value range of the inherent credibility value of each camera is the same.
- Specifically, the electronic device may determine, according to the first parameter and the second parameter, the target phase difference corresponding to the first camera and the target phase difference corresponding to the second camera.
- the electronic device can calculate the phase difference in the direction corresponding to each camera through the PD point pair according to the first parameter and the second parameter.
- each target phase difference is the in-focus phase difference corresponding to a camera.
- When the first inherent phase parameter of the first camera is the same as the second inherent phase parameter of the second camera, the electronic device may adopt the first algorithm and determine, according to the first parameter and the second parameter, the target phase difference corresponding to the first camera and the target phase difference corresponding to the second camera.
- The first inherent phase parameter may include a first inherent phase difference and a first inherent credibility value, and the second inherent phase parameter may include a second inherent phase difference and a second inherent credibility value.
- That the first inherent phase parameter of the first camera is the same as the second inherent phase parameter of the second camera can be understood as follows: the value range of the first inherent phase difference is the same as the value range of the second inherent phase difference, and the value range of the first inherent credibility value is the same as the value range of the second inherent credibility value.
- In this case, the phase difference calculated by the electronic device in the direction corresponding to the first camera is the same as the phase difference calculated in the direction corresponding to the second camera.
- The target phase difference Fa = (F1 × C1 + F2 × C2)/(C1 + C2), where F1 is the first phase difference, C1 is the first credibility value, F2 is the second phase difference, and C2 is the second credibility value.
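- A minimal sketch of the first algorithm as read from the formula above (the function name and the example numbers are assumptions, not the patent's definitive implementation):

```python
def target_phase_difference(f1: float, c1: float, f2: float, c2: float) -> float:
    """Credibility-weighted combination of two phase differences (first-algorithm sketch)."""
    if c1 + c2 == 0:
        raise ValueError("both credibility values are zero; the phase difference is unreliable")
    return (f1 * c1 + f2 * c2) / (c1 + c2)

# Example: the first camera reports phase difference 1.2 with credibility 0.8,
# the second camera reports 0.9 with credibility 0.4.
print(target_phase_difference(1.2, 0.8, 0.9, 0.4))  # 1.1
```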
- step 202 may be specifically implemented by the following step 202b.
- Step 202b If the inherent phase parameters of the at least two cameras are different, the electronic device uses the second algorithm to determine the at least two target phase differences based on the at least two target parameters and the inherent phase parameter of each camera.
- That the inherent phase parameters of the at least two cameras are different can be understood as follows: the value range of the inherent phase difference of each camera is different, and the value range of the inherent credibility value of each camera is different.
- When the first inherent phase parameter of the first camera is different from the second inherent phase parameter of the second camera, the electronic device may adopt the second algorithm and determine the target phase differences according to the first inherent phase parameter, the second inherent phase parameter, the first parameter, and the second parameter.
- the difference between the first inherent phase parameter of the first camera and the second inherent phase parameter of the second camera can be understood as: the value range of the first inherent phase difference is different from the value range of the second inherent phase difference , And the value range of the first inherent credibility value is different from the value range of the second inherent credibility value.
- For example, the electronic device may map the first parameter to the second camera, or map the second parameter to the first camera, in order to calculate the phase difference corresponding to the first camera or to the second camera.
- the value range of the first inherent phase difference can be [F10, F11]
- the value range of the first inherent credibility value can be [C10, C11]
- the value range of the second inherent phase difference can be [F20, F21]
- the value range of the second inherent credibility value can be [C20, C21].
- the first parameter may be mapped to the second camera.
- F1″ is the target phase difference corresponding to the first camera
- Fb is the target phase difference corresponding to the second camera.
- Fb = (F1′ × C1′ + F2 × C2)/(C1′ + C2)
- F1′ is the phase difference obtained by mapping the first phase difference to the second camera
- C1′ is the credibility value obtained by mapping the first credibility value to the second camera
- F2 is the second phase difference
- C2 is the second credibility value.
- F1′ = F1 × (F21 − F20)/(F11 − F10)
- C1′ = C1 × (C21 − C20)/(C11 − C10)
- F1 is the first phase difference
- C1 is the first credibility value.
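- The second algorithm can be sketched as follows, assuming a range-proportional mapping followed by the same credibility-weighted combination; the helper names and the example ranges are assumptions introduced here, not values from the embodiment:

```python
def map_to_second_camera(value: float, src_range: tuple, dst_range: tuple) -> float:
    """Scale a value by the ratio of the destination range width to the source range width."""
    (src_lo, src_hi), (dst_lo, dst_hi) = src_range, dst_range
    return value * (dst_hi - dst_lo) / (src_hi - src_lo)

def target_phase_difference_fb(f1: float, c1: float, f2: float, c2: float,
                               f1_range: tuple, f2_range: tuple,
                               c1_range: tuple, c2_range: tuple) -> float:
    """Second-algorithm sketch: map the first camera's parameters onto the second camera,
    then combine the two phase differences weighted by their credibility values."""
    f1_mapped = map_to_second_camera(f1, f1_range, f2_range)  # F1' = F1 * (F21 - F20)/(F11 - F10)
    c1_mapped = map_to_second_camera(c1, c1_range, c2_range)  # C1' = C1 * (C21 - C20)/(C11 - C10)
    return (f1_mapped * c1_mapped + f2 * c2) / (c1_mapped + c2)  # Fb

# Example with assumed ranges [F10, F11] = [0, 16], [F20, F21] = [0, 32],
# [C10, C11] = [0, 100], [C20, C21] = [0, 1000].
fb = target_phase_difference_fb(1.2, 80.0, 2.0, 500.0, (0, 16), (0, 32), (0, 100), (0, 1000))
print(round(fb, 3))  # 2.246
```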
- Step 203 The electronic device controls the at least two cameras to focus separately according to the phase difference of the at least two targets.
- The electronic device may first determine the focus position (that is, the in-focus position) of each of the at least two cameras according to the at least two target phase differences, and then control the corresponding camera to focus according to each focus position, that is, to focus on a position in the image to be shot or in the preview interface.
- When the inherent phase parameters of the two cameras are the same, the electronic device may calculate the focus position of the first camera or of the second camera based on the target phase difference Fa.
- When the inherent phase parameters are different, the electronic device may calculate the focus position of the first camera according to the target phase difference F1″, and calculate the focus position of the second camera according to the target phase difference Fb.
- the electronic device may calculate the phase difference through the PD point pair, and then convert the phase difference into the distance moved by the motor to determine the focus position.
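- As a hedged illustration of this conversion (the linear gain model and the numeric values below are assumptions introduced here, not parameters from the embodiment), a phase difference could be turned into a lens-motor displacement roughly as follows:

```python
def phase_difference_to_motor_steps(phase_diff: float, defocus_coefficient: float = 40.0) -> int:
    """Convert a phase difference into a lens-motor displacement (in motor steps),
    assuming an approximately linear phase-difference-to-defocus relationship."""
    return round(phase_diff * defocus_coefficient)

def focus_position(current_position: int, phase_diff: float) -> int:
    """Return the motor position expected to bring the target into focus."""
    return current_position + phase_difference_to_motor_steps(phase_diff)

print(focus_position(current_position=300, phase_diff=1.1))  # 344
```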
- In the embodiments of the present invention, two cameras are used for focusing, and it is only required that the arrangement directions of the PD point pairs of the two cameras be close to fully orthogonal. This improves the accuracy of the output phase difference and therefore the focusing success rate, without any loss of image quality.
- the embodiment of the present invention provides a focusing method.
- The electronic device can determine at least two target phase differences (each being the phase difference in the direction corresponding to one camera) according to the acquired at least two target parameters, and control the at least two cameras to focus separately according to the at least two target phase differences. Because the angle between the straight-line directions of the PD point pair sets on every two cameras is within the first preset angle range, the electronic device can obtain phase parameters in multiple different directions, so it can accurately determine the focus position in the corresponding direction, which improves the focusing accuracy of the electronic device.
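- Putting steps 201 to 203 together, a simplified end-to-end sketch (self-contained, with an assumed camera interface and an assumed conversion coefficient; it is not the patent's definitive implementation) might look like this:

```python
from dataclasses import dataclass

@dataclass
class TargetParameter:
    """Phase parameter reported by one camera's PD point pair set (step 201)."""
    phase_difference: float
    credibility: float

def focusing_method(params: list, motor_positions: list, defocus_coefficient: float = 40.0) -> list:
    """Steps 202-203 sketch: derive an in-focus phase difference from all cameras'
    parameters, then move each camera's lens motor accordingly."""
    # Step 202 (assuming identical inherent phase parameters): credibility-weighted combination.
    weight_sum = sum(p.credibility for p in params)
    fa = sum(p.phase_difference * p.credibility for p in params) / weight_sum
    # Step 203: convert the phase difference into a motor displacement for each camera.
    return [pos + round(fa * defocus_coefficient) for pos in motor_positions]

print(focusing_method([TargetParameter(1.2, 0.8), TargetParameter(0.9, 0.4)], [300, 280]))  # [344, 324]
```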
- FIG. 5 shows a schematic diagram of a possible structure of an electronic device involved in an embodiment of the present invention.
- the electronic device includes at least two cameras, and a PD point pair set is set on each of the at least two cameras.
- the electronic device 50 may include: an acquisition module 51, a determination module 52, and a control module 53.
- The acquiring module 51 is configured to acquire at least two target parameters, where each target parameter is a phase parameter acquired through the PD point pair set on one camera, and the angle between the straight-line directions in which the PD point pair sets on every two of the at least two cameras are located is within the first preset angle range.
- the determining module 52 is configured to determine at least two target phase differences according to the at least two target parameters acquired by the acquiring module 51, and each target phase difference is a phase difference in a direction corresponding to a camera.
- The control module 53 is configured to control the at least two cameras to focus separately according to the at least two target phase differences determined by the determining module 52.
- The at least two cameras include a first camera and a second camera, the first camera is provided with a first PD point pair set, and the second camera is provided with a second PD point pair set; the angle between the straight-line direction in which the first PD point pair set is located and the straight-line direction in which the second PD point pair set is located is within a second preset angle range.
- Each of the above at least two target parameters includes a phase difference and a credibility value.
- the above determining module 52 is specifically configured to use the first algorithm to determine at least two target phase differences according to the at least two target parameters if the inherent phase parameters of the at least two cameras are the same; or If the inherent phase parameters of the at least two cameras are different, the second algorithm is used to determine the at least two target phase differences based on the at least two target parameters and the inherent phase parameter of each camera.
- the electronic device provided in the embodiment of the present invention can implement each process implemented by the electronic device in the foregoing method embodiment. To avoid repetition, the detailed description will not be repeated here.
- An embodiment of the present invention provides an electronic device.
- The electronic device may include at least two cameras, each of which is provided with a PD point pair set, and the angle between the straight-line directions in which the PD point pair sets on every two of the at least two cameras are located is within the first preset angle range. Because this angle is within the first preset angle range, the electronic device can obtain phase parameters in multiple different directions, so it can accurately determine the focus position in the corresponding direction, which improves the focusing accuracy of the electronic device.
- Fig. 6 is a schematic diagram of hardware of an electronic device that implements various embodiments of the present invention.
- The electronic device 100 includes, but is not limited to, a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, a power supply 111, and other components.
- electronic devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminals, wearable devices, and pedometers.
- The processor 110 is configured to: obtain at least two target parameters, where each target parameter is a phase parameter obtained through the PD point pair set on one camera, and the angle between the straight-line directions in which the PD point pair sets on every two of the at least two cameras are located is within the first preset angle range; determine at least two target phase differences according to the at least two target parameters, where each target phase difference is the phase difference in the direction corresponding to one camera; and control the at least two cameras to focus separately according to the at least two target phase differences.
- An embodiment of the present invention provides an electronic device.
- The electronic device can determine at least two target phase differences (each being the phase difference in the direction corresponding to one camera) according to the acquired at least two target parameters, and control the at least two cameras to focus separately according to the at least two target phase differences. Because the angle between the straight-line directions of the PD point pair sets on every two cameras is within the first preset angle range, the electronic device can obtain phase parameters in multiple different directions, so it can accurately determine the focus position in the corresponding direction, which improves the focusing accuracy of the electronic device.
- The radio frequency unit 101 can be used for receiving and sending signals in the process of sending and receiving information or during a call. Specifically, downlink data from the base station is received and then handed to the processor 110 for processing; in addition, uplink data is sent to the base station.
- the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
- the radio frequency unit 101 can also communicate with the network and other devices through a wireless communication system.
- the electronic device provides users with wireless broadband Internet access through the network module 102, such as helping users to send and receive emails, browse web pages, and access streaming media.
- the audio output unit 103 can convert the audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output it as sound. Moreover, the audio output unit 103 may also provide audio output related to a specific function performed by the electronic device 100 (for example, call signal reception sound, message reception sound, etc.).
- the audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
- the input unit 104 is used to receive audio or video signals.
- the input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042.
- The graphics processor 1041 processes image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
- the processed image frame can be displayed on the display unit 106.
- the image frame processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or sent via the radio frequency unit 101 or the network module 102.
- the microphone 1042 can receive sound, and can process such sound into audio data.
- the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 for output in the case of a telephone call mode.
- the electronic device 100 further includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors.
- the light sensor includes an ambient light sensor and a proximity sensor.
- the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light.
- The proximity sensor can turn off the display panel 1061 and/or the backlight when the electronic device 100 is moved close to the ear.
- As a kind of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in various directions (usually along three axes), can detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the electronic device (such as horizontal/vertical screen switching, related games, and magnetometer attitude calibration) and for vibration-recognition-related functions (such as a pedometer or tapping detection); the sensor 105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which will not be repeated here.
- the display unit 106 is used to display information input by the user or information provided to the user.
- the display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), etc.
- the user input unit 107 may be used to receive inputted numeric or character information, and generate key signal input related to user settings and function control of the electronic device.
- the user input unit 107 includes a touch panel 1071 and other input devices 1072.
- The touch panel 1071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 1071 with a finger, a stylus, or any other suitable object or accessory).
- the touch panel 1071 may include two parts: a touch detection device and a touch controller.
- The touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 110, and receives and executes the commands sent by the processor 110.
- the touch panel 1071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
- the user input unit 107 may also include other input devices 1072.
- other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
- the touch panel 1071 can be overlaid on the display panel 1061.
- When the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event.
- Although the touch panel 1071 and the display panel 1061 are described here as two independent components implementing the input and output functions of the electronic device, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the electronic device; this is not specifically limited here.
- the interface unit 108 is an interface for connecting an external device with the electronic device 100.
- the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
- The interface unit 108 can be used to receive input (for example, data information or power) from an external device and transmit the received input to one or more elements in the electronic device 100, or can be used to transfer data between the electronic device 100 and the external device.
- the memory 109 can be used to store software programs and various data.
- the memory 109 may mainly include a storage program area and a storage data area.
- The storage program area may store the operating system and the application programs required by at least one function (such as a sound playback function or an image playback function); the storage data area may store data created according to the use of the mobile phone (such as audio data or a phone book).
- The memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
- The processor 110 is the control center of the electronic device. It uses various interfaces and lines to connect the various parts of the entire electronic device, and performs the various functions of the electronic device and processes data by running or executing the software programs and/or modules stored in the memory 109 and calling the data stored in the memory 109, so as to monitor the electronic device as a whole.
- The processor 110 may include one or more processing units; optionally, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 110.
- The electronic device 100 may also include a power source 111 (such as a battery) for supplying power to the various components. Optionally, the power source 111 may be logically connected to the processor 110 through a power management system, so as to implement functions such as charging management, discharging management, and power consumption management through the power management system.
- the electronic device 100 includes some functional modules not shown, which will not be repeated here.
- Optionally, an embodiment of the present invention further provides an electronic device, including the processor 110 shown in FIG. 6, the memory 109, and a computer program stored in the memory 109 and executable on the processor 110. When the computer program is executed by the processor 110, each process of the foregoing method embodiment is implemented and the same technical effect can be achieved; to avoid repetition, details are not described herein again.
- An embodiment of the present invention also provides a computer-readable storage medium on which a computer program is stored. When the computer program is executed by a processor, each process of the foregoing method embodiment is implemented and the same technical effect can be achieved; to avoid repetition, details are not described herein again.
- The computer-readable storage medium is, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
- The technical solution of the present invention, in essence or in the part that contributes to the existing technology, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes a number of instructions that enable an electronic device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the embodiments of the present invention.
Claims (10)
- 1. An electronic device, comprising at least two cameras, each of which is provided with a phase difference (PD) point pair set, wherein the angle between the straight-line directions in which the PD point pair sets on every two of the at least two cameras are located is within a first preset angle range.
- 2. The electronic device according to claim 1, wherein the at least two cameras comprise a first camera and a second camera, the first camera is provided with a first PD point pair set, and the second camera is provided with a second PD point pair set; wherein the angle between the straight-line direction in which the first PD point pair set is located and the straight-line direction in which the second PD point pair set is located is within a second preset angle range.
- 3. A focusing method, applied to an electronic device, wherein the electronic device comprises at least two cameras, each of which is provided with a phase difference (PD) point pair set, and the method comprises: acquiring at least two target parameters, wherein each target parameter is a phase parameter acquired through the PD point pair set on one camera, and the angle between the straight-line directions in which the PD point pair sets on every two of the at least two cameras are located is within a first preset angle range; determining at least two target phase differences according to the at least two target parameters, wherein each target phase difference is a phase difference in the direction corresponding to one camera; and controlling the at least two cameras to focus separately according to the at least two target phase differences.
- 4. The method according to claim 3, wherein the at least two cameras comprise a first camera and a second camera, the first camera is provided with a first PD point pair set, and the second camera is provided with a second PD point pair set; the angle between the straight-line direction in which the first PD point pair set is located and the straight-line direction in which the second PD point pair set is located is within a second preset angle range; and each of the at least two target parameters comprises a phase difference and a credibility value.
- 5. The method according to claim 3 or 4, wherein the determining at least two target phase differences according to the at least two target parameters comprises: if the inherent phase parameters of the at least two cameras are the same, using a first algorithm to determine the at least two target phase differences according to the at least two target parameters; or, if the inherent phase parameters of the at least two cameras are different, using a second algorithm to determine the at least two target phase differences according to the at least two target parameters and the inherent phase parameter of each camera.
- 6. An electronic device, comprising at least two cameras, each of which is provided with a phase difference (PD) point pair set, the electronic device comprising an acquisition module, a determination module, and a control module, wherein: the acquisition module is configured to acquire at least two target parameters, wherein each target parameter is a phase parameter acquired through the PD point pair set on one camera, and the angle between the straight-line directions in which the PD point pair sets on every two of the at least two cameras are located is within a first preset angle range; the determination module is configured to determine at least two target phase differences according to the at least two target parameters acquired by the acquisition module, wherein each target phase difference is a phase difference in the direction corresponding to one camera; and the control module is configured to control the at least two cameras to focus separately according to the at least two target phase differences determined by the determination module.
- 7. The electronic device according to claim 6, wherein the at least two cameras comprise a first camera and a second camera, the first camera is provided with a first PD point pair set, and the second camera is provided with a second PD point pair set; the angle between the straight-line direction in which the first PD point pair set is located and the straight-line direction in which the second PD point pair set is located is within a second preset angle range; and each of the at least two target parameters comprises a phase difference and a credibility value.
- 8. The electronic device according to claim 6 or 7, wherein the determination module is specifically configured to: if the inherent phase parameters of the at least two cameras are the same, use a first algorithm to determine the at least two target phase differences according to the at least two target parameters; or, if the inherent phase parameters of the at least two cameras are different, use a second algorithm to determine the at least two target phase differences according to the at least two target parameters and the inherent phase parameter of each camera.
- 9. An electronic device, comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein when the computer program is executed by the processor, the steps of the focusing method according to any one of claims 3 to 5 are implemented.
- 10. A computer-readable storage medium, storing a computer program, wherein when the computer program is executed by a processor, the steps of the focusing method according to any one of claims 3 to 5 are implemented.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020227018221A KR20220085834A (ko) | 2019-11-28 | 2020-11-24 | 전자 기기 및 포커싱 방법 |
JP2022527786A JP7472281B2 (ja) | 2019-11-28 | 2020-11-24 | 電子機器及びフォーカス方法 |
EP20893880.3A EP4057616A4 (en) | 2019-11-28 | 2020-11-24 | ELECTRONIC DEVICE AND METHOD FOR DEVELOPING |
US17/746,424 US11856294B2 (en) | 2019-11-28 | 2022-05-17 | Electronic device and focusing method for electronic device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911194022.5 | 2019-11-28 | ||
CN201911194022.5A CN110769162B (zh) | 2019-11-28 | 2019-11-28 | Electronic device and focusing method
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/746,424 Continuation US11856294B2 (en) | 2019-11-28 | 2022-05-17 | Electronic device and focusing method for electronic device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021104265A1 true WO2021104265A1 (zh) | 2021-06-03 |
Family
ID=69339972
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/131165 WO2021104265A1 (zh) | 2019-11-28 | 2020-11-24 | Electronic device and focusing method |
Country Status (6)
Country | Link |
---|---|
US (1) | US11856294B2 (zh) |
EP (1) | EP4057616A4 (zh) |
JP (1) | JP7472281B2 (zh) |
KR (1) | KR20220085834A (zh) |
CN (1) | CN110769162B (zh) |
WO (1) | WO2021104265A1 (zh) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110769162B (zh) | 2019-11-28 | 2021-07-13 | Vivo Mobile Communication Co., Ltd. | Electronic device and focusing method |
CN117499549B (zh) * | 2023-12-25 | 2024-05-14 | Honor Device Co., Ltd. | Scanning method and electronic device |
Family Cites Families (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0572470A (ja) * | 1991-06-26 | 1993-03-26 | Fuji Photo Film Co Ltd | Phase-difference-detection-type focus detection device |
JP5317562B2 (ja) * | 2008-07-17 | 2013-10-16 | Canon Inc. | Phase difference detection device, imaging device, phase difference detection method, and phase difference detection program |
JP5230388B2 (ja) * | 2008-12-10 | 2013-07-10 | Canon Inc. | Focus detection device and control method therefor |
US8755036B2 (en) * | 2010-03-11 | 2014-06-17 | Optical Physics Company | Active imaging system and method |
JP2011211381A (ja) * | 2010-03-29 | 2011-10-20 | Fujifilm Corporation | Stereoscopic imaging device |
WO2013047160A1 (ja) * | 2011-09-28 | 2013-04-04 | Fujifilm Corporation | Solid-state imaging element, imaging device, and focus control method |
CN104272161B (zh) * | 2012-05-01 | 2016-05-18 | Fujifilm Corporation | Imaging device and focus control method |
JP6045362B2 (ja) * | 2013-01-17 | 2016-12-14 | Olympus Corporation | Imaging device and focus detection method |
JP6186900B2 (ja) * | 2013-06-04 | 2017-08-30 | Sony Corporation | Solid-state imaging device, electronic apparatus, lens control method, and imaging module |
JP6053652B2 (ja) * | 2013-09-20 | 2016-12-27 | Fujifilm Corporation | Imaging device and focus control method |
KR102294316B1 (ko) * | 2014-08-04 | 2021-08-26 | LG Innotek Co., Ltd. | Image sensor and imaging device including the same |
US9906772B2 (en) | 2014-11-24 | 2018-02-27 | Mediatek Inc. | Method for performing multi-camera capturing control of an electronic device, and associated apparatus |
JP6536126B2 (ja) * | 2015-03-31 | 2019-07-03 | Nikon Corporation | Imaging element and imaging device |
US10122998B2 (en) * | 2015-04-30 | 2018-11-06 | Seiko Epson Corporation | Real time sensor and method for synchronizing real time sensor data streams |
US9807294B2 (en) | 2015-08-05 | 2017-10-31 | Omnivision Technologies, Inc. | Image sensor with symmetric multi-pixel phase-difference detectors, and associated methods |
US9910247B2 (en) * | 2016-01-21 | 2018-03-06 | Qualcomm Incorporated | Focus hunting prevention for phase detection auto focus (AF) |
KR20170139408A (ko) * | 2016-06-09 | 2017-12-19 | LG Electronics Inc. | Video recording device equipped with dual cameras |
CN106331484B (zh) * | 2016-08-24 | 2020-02-14 | Vivo Mobile Communication Co., Ltd. | Focusing method and mobile terminal |
JP2018050267A (ja) * | 2016-09-23 | 2018-03-29 | Canon Inc. | Imaging device and method for controlling an imaging element |
CN106506969B (zh) * | 2016-11-29 | 2019-07-19 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Camera module, method for portrait tracking using the same, and electronic device |
CN106791373B (zh) | 2016-11-29 | 2020-03-13 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Focus processing method and apparatus, and terminal device |
CN106506922A (zh) * | 2016-11-29 | 2017-03-15 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Camera module and electronic device |
JP2018169517A (ja) | 2017-03-30 | 2018-11-01 | Sony Semiconductor Solutions Corporation | Imaging device, imaging module, and method for controlling an imaging device |
CN107124547B (zh) * | 2017-04-19 | 2020-10-16 | Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. | Dual-camera photographing method and apparatus |
CN106973206B (zh) * | 2017-04-28 | 2020-06-05 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Camera module imaging processing method and apparatus, and terminal device |
CN109639974A (zh) * | 2018-12-20 | 2019-04-16 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Control method, control apparatus, electronic apparatus, and medium |
CN109862271B (zh) * | 2019-02-27 | 2021-03-16 | Shanghai Chuanggong Communication Technology Co., Ltd. | Phase difference focusing method, electronic device, and computer storage medium |
2019
- 2019-11-28 CN CN201911194022.5A patent/CN110769162B/zh active Active
2020
- 2020-11-24 EP EP20893880.3A patent/EP4057616A4/en active Pending
- 2020-11-24 KR KR1020227018221A patent/KR20220085834A/ko active Search and Examination
- 2020-11-24 JP JP2022527786A patent/JP7472281B2/ja active Active
- 2020-11-24 WO PCT/CN2020/131165 patent/WO2021104265A1/zh unknown
2022
- 2022-05-17 US US17/746,424 patent/US11856294B2/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105376474A (zh) * | 2014-09-01 | 2016-03-02 | Lite-On Electronics (Guangzhou) Co., Ltd. | Image acquisition device and auto-focusing method thereof |
CN106331499A (zh) * | 2016-09-13 | 2017-01-11 | Nubia Technology Co., Ltd. | Focusing method and photographing device |
CN107395990A (zh) * | 2017-08-31 | 2017-11-24 | Zhuhai Meizu Technology Co., Ltd. | Phase focusing method and apparatus, terminal, computer apparatus, and readable storage medium |
JP2019087880A (ja) * | 2017-11-07 | 2019-06-06 | Canon Inc. | Signal output device, camera pan head, camera pan head interlocking system, and program |
CN110769162A (zh) * | 2019-11-28 | 2020-02-07 | Vivo Mobile Communication Co., Ltd. | Electronic device and focusing method |
Non-Patent Citations (1)
Title |
---|
See also references of EP4057616A4 * |
Also Published As
Publication number | Publication date |
---|---|
JP7472281B2 (ja) | 2024-04-22 |
EP4057616A4 (en) | 2022-12-28 |
EP4057616A1 (en) | 2022-09-14 |
US11856294B2 (en) | 2023-12-26 |
JP2023501608A (ja) | 2023-01-18 |
CN110769162B (zh) | 2021-07-13 |
KR20220085834A (ko) | 2022-06-22 |
CN110769162A (zh) | 2020-02-07 |
US20220279129A1 (en) | 2022-09-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108513070B (zh) | | Image processing method, mobile terminal, and computer-readable storage medium |
WO2021057267A1 (zh) | | Image processing method and terminal device |
WO2021104197A1 (zh) | | Object tracking method and electronic device |
WO2021104195A1 (zh) | | Image display method and electronic device |
CN109743498B (zh) | | Shooting parameter adjustment method and terminal device |
WO2021098603A1 (zh) | | Preview image display method and electronic device |
WO2021098697A1 (zh) | | Screen display control method and electronic device |
WO2021013009A1 (zh) | | Photographing method and terminal device |
WO2021036623A1 (zh) | | Display method and electronic device |
WO2021082744A1 (zh) | | Video viewing method and electronic device |
WO2021082772A1 (zh) | | Screenshot method and electronic device |
US12022190B2 (en) | | Photographing method and electronic device |
WO2021104266A1 (zh) | | Object display method and electronic device |
CN110784575B (zh) | | Electronic device and shooting method |
CN110830713A (zh) | | Zoom method and electronic device |
WO2019137535A1 (zh) | | Object distance measurement method and terminal device |
WO2021147911A1 (zh) | | Mobile terminal, shooting mode detection method, and storage medium |
WO2021017785A1 (zh) | | Data transmission method and terminal device |
WO2021104265A1 (zh) | | Electronic device and focusing method |
WO2020156119A1 (zh) | | Application program interface adjustment method and mobile terminal |
CN110602390B (zh) | | Image processing method and electronic device |
CN109104573B (zh) | | Method for determining focus point and terminal device |
CN109005337B (zh) | | Photographing method and terminal |
WO2021136181A1 (zh) | | Image processing method and electronic device |
US11877057B2 (en) | | Electronic device and focusing method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20893880; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2022527786; Country of ref document: JP; Kind code of ref document: A |
| | ENP | Entry into the national phase | Ref document number: 20227018221; Country of ref document: KR; Kind code of ref document: A |
| | ENP | Entry into the national phase | Ref document number: 2020893880; Country of ref document: EP; Effective date: 20220607 |
| | NENP | Non-entry into the national phase | Ref country code: DE |