WO2021231482A1 - Dynamic phase detection autofocus calibration - Google Patents

Dynamic phase detection autofocus calibration

Info

Publication number
WO2021231482A1
Authority
WO
WIPO (PCT)
Prior art keywords
phase detection
image
autofocus
gain map
pixels
Prior art date
Application number
PCT/US2021/031851
Other languages
English (en)
Inventor
Ravi Shankar Kadambala
Tony Lijo JOSE
Bapineedu Chowdary GUMMADI
Original Assignee
Qualcomm Incorporated
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qualcomm Incorporated filed Critical Qualcomm Incorporated
Publication of WO2021231482A1


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • H04N23/672 Focus control based on electronic image sensor signals based on the phase difference signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • H04N23/673 Focus control based on electronic image sensor signals based on contrast or high frequency components of image signals, e.g. hill climbing method
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/71 Circuitry for evaluating the brightness variation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/76 Circuitry for compensating brightness variation in the scene by influencing the image signals

Definitions

  • the following relates generally to image processing, and more specifically to dynamic phase detection autofocus calibration.
  • Autofocus may refer to a field of image processing for detecting an object in a field of view of a camera and using motors in a camera to focus on the detected object. With some cameras, autofocus may fail to focus on an object under certain circumstances.
  • portable computing devices such as portable wireless telephones, personal digital assistants (PDAs), laptop computers, tablet personal computers, and the like, may include digital imaging sensors for taking photos (and video) as well as components for communicating information over wired or wireless networks (e.g., for downloading videos and images). Such devices may benefit from improved autofocus techniques.
  • the described techniques relate to improved methods, systems, devices, and apparatuses that support dynamic autofocus for different conditions, including but not limited to different light conditions.
  • the described techniques provide for improving exposure settings for autofocus in different light conditions by providing two or more dynamically selected gain map settings when sensing image pixels or autofocus pixels.
  • in some examples, a measured light level (e.g., an illuminance measurement, a lux measurement) may be determined, and the determined light level may be used to select a gain map. The gain map may be applied to one or more autofocus pixels (e.g., phase detection autofocus (PDAF) pixels) to modify one or more parameters of a captured image.
  • a method of dynamic autofocus calibration may include capturing an image using an image sensor of the device, performing an automatic exposure control operation on the captured image, determining a light level associated with the image based on an output of the automatic exposure control operation, selecting a phase detection autofocus gain map from a group of phase detection autofocus gain maps based on the light level associated with the image, and using the selected phase detection autofocus gain map to perform at least a portion of a phase detection autofocus process.
  • the apparatus may include a processor, memory coupled with the processor, and instructions stored in the memory.
  • the instructions may be executable by the processor to cause the apparatus to capture an image using an image sensor of the device, perform an automatic exposure control operation on the captured image, determine a light level associated with the image based on an output of the automatic exposure control operation, select a phase detection autofocus gain map from a group of phase detection autofocus gain maps based on the light level associated with the image, and use the selected phase detection autofocus gain map to perform at least a portion of a phase detection autofocus process.
  • the apparatus may include means for capturing an image using an image sensor of the device, performing an automatic exposure control operation on the captured image, determining a light level associated with the image based on an output of the automatic exposure control operation, selecting a phase detection autofocus gain map from a group of phase detection autofocus gain maps based on the light level associated with the image, and using the selected phase detection autofocus gain map to perform at least a portion of a phase detection autofocus process.
  • a non-transitory computer-readable medium storing code for dynamic autofocus calibration is described.
  • the code may include instructions executable by a processor to capture an image using an image sensor of the device, perform an automatic exposure control operation on the captured image, determine a light level associated with the image based on an output of the automatic exposure control operation, select a phase detection autofocus gain map from a group of phase detection autofocus gain maps based on the light level associated with the image, and use the selected phase detection autofocus gain map to perform at least a portion of a phase detection autofocus process.
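  • As a minimal sketch of this claimed flow (capture, automatic exposure control, light-level determination, gain map selection, and PDAF), consider the Python example below. Every helper, threshold, and conversion in it is an illustrative assumption, not an implementation from the specification:

```python
import numpy as np

def dynamic_autofocus_calibration(image, pd_pixels, gain_maps, thresholds):
    # 1) `image` stands in for the frame captured by the image sensor
    # 2) automatic exposure control operation (toy: scaled mean intensity)
    aec_output = float(image.mean()) * 4.0
    # 3) light level determined from the AEC output (treated as a lux estimate)
    light_level = aec_output
    # 4) select a PDAF gain map from the group based on the light level
    index = int(np.searchsorted(thresholds, light_level))
    gain_map = gain_maps[index]
    # 5) use the selected gain map for part of the PDAF process:
    #    here, gain-correct the phase detection pixels
    return pd_pixels * gain_map

# usage: three maps for low/medium/high light, split at 500 and 1000 lux
maps = [np.full((13, 17), g) for g in (1.8, 1.4, 1.1)]
corrected = dynamic_autofocus_calibration(
    np.random.randint(0, 256, (480, 640)).astype(float),
    np.ones((13, 17)), maps, thresholds=[500, 1000])
```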
  • Some examples of the method, apparatuses, and non-transitory computer-readable medium described herein may further include operations, features, means, or instructions for identifying pixels of a phase detection pixel block from the image, where determining the light level may be based on one or more light levels of the pixels of the phase detection pixel block.
  • Some examples of the method, apparatuses, and non-transitory computer-readable medium described herein may further include operations, features, means, or instructions for applying the phase detection autofocus gain map to the pixels of the phase detection pixel block based on the one or more light levels of the pixels of the phase detection pixel block.
  • Some examples of the method, apparatuses, and non-transitory computer-readable medium described herein may further include operations, features, means, or instructions for determining a phase disparity based on applying the phase detection autofocus gain map to the pixels of the phase detection pixel block.
  • Some examples of the method, apparatuses, and non-transitory computer-readable medium described herein may further include operations, features, means, or instructions for determining a lens defocus based on the phase disparity, and adjusting a lens of the device based on the lens defocus.
  • a value of the phase detection autofocus gain map includes a gain map width, or a gain map height, or a left gain map, or a right gain map, or any combination thereof.
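  • One possible container for these gain map values, sketched as a Python dataclass whose field names are assumptions for illustration:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class PdafGainMap:
    width: int         # gain map width
    height: int        # gain map height
    left: np.ndarray   # left-channel gain map (height x width)
    right: np.ndarray  # right-channel gain map (height x width)

gain_map = PdafGainMap(width=17, height=13,
                       left=np.ones((13, 17)),
                       right=np.ones((13, 17)))
```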
  • Some examples of the method, apparatuses, and non-transitory computer-readable medium described herein may further include operations, features, means, or instructions for determining the light level based on one or more values of an intensity measurement map associated with the captured image.
  • the intensity measurement map includes a luma histogram.
  • the luma histogram includes a weighted sum of a red intensity value, a green intensity value, and a blue intensity value of a pixel of the image.
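  • A luma histogram of this kind might be computed as below; the BT.601 weights are a common choice and an assumption here, since the text only states that the luma is a weighted sum of red, green, and blue intensities:

```python
import numpy as np

def luma_histogram(rgb_image, bins=256):
    r, g, b = rgb_image[..., 0], rgb_image[..., 1], rgb_image[..., 2]
    luma = 0.299 * r + 0.587 * g + 0.114 * b  # weighted sum per pixel
    hist, _ = np.histogram(luma, bins=bins, range=(0, 255))
    return hist
```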
  • FIG. 1 illustrates an example of a system for dynamic autofocus calibration that supports dynamic phase detection autofocus calibration in accordance with aspects of the present disclosure.
  • FIG. 2 illustrates an example of a digital image system that supports dynamic phase detection autofocus calibration in accordance with aspects of the present disclosure.
  • FIG. 3 illustrates an example of a digital image system that supports dynamic phase detection autofocus calibration in accordance with aspects of the present disclosure.
  • FIGs. 4 and 5 show block diagrams of devices that support dynamic phase detection autofocus calibration in accordance with aspects of the present disclosure.
  • FIG. 6 shows a block diagram of an autofocus manager that supports dynamic phase detection autofocus calibration in accordance with aspects of the present disclosure.
  • FIG. 7 shows a diagram of a system including a device that supports dynamic phase detection autofocus calibration in accordance with aspects of the present disclosure.
  • FIGs. 8 and 9 show flowcharts illustrating methods that support dynamic phase detection autofocus calibration in accordance with aspects of the present disclosure.
  • the sensitivities of left channel phase detection pixels and right channel phase detection pixels used in phase detection autofocus (PDAF) may differ. Accordingly, calibration may be performed so that the sensitivities correspond and map to each other more accurately, improving the output of associated operations.
  • the calibration may include using parameters (gain map and defocus conversion coefficient (DCC)) from a phase detection library (e.g., dynamic link library (DLL), a DLL associated with an application program interface (API), etc.).
  • one or more inputs may be used in conjunction with the phase detection library. The one or more inputs may include at least one of image width, or image height, or black level, or pixel bit depth, or any combination thereof.
  • the parameters may be generated in one or more light conditions, such as relatively good light conditions (e.g., known or controlled light conditions), which may not reflect actual use scenarios (e.g., low light conditions, etc.).
  • actual use scenarios may reduce the PDAF accuracy when, for example, a gain map generated using relatively good light conditions is used in different light conditions, such as low light conditions.
  • the accuracy of a PDAF defocus value determined by a PDAF system may decrease when a single gain map is used for all light conditions.
  • applying the single gain map to PD pixels under low light conditions may decrease the accuracy of the PDAF defocus value.
  • an object may be situated 70 cm away from a camera. While in sufficient light conditions (e.g., at least 1000 lux), movement of the lens based on the PDAF defocus value determined from a singular gain map may focus the lens on an object located 68 cm away from the camera (e.g., 1 - 2/70 ≈ 97% autofocus accuracy). However, in low light the movement of the lens based on the PDAF defocus value determined from a singular gain map may focus the lens on an object located 50 cm away from the camera (e.g., 1 - 20/70 ≈ 71% autofocus accuracy), resulting in the object being out of focus.
  • the described techniques include implementing one or more stored PDAF parameters for various light conditions.
  • the PDAF parameters for the various light conditions may be stored, for example, in a lookup table.
  • the described techniques may include determining light conditions (e.g., light conditions while capturing an image) and selecting a gain map based on the determined light conditions.
  • the described techniques may include selecting a first gain map stored in the lookup table for low light conditions (e.g., 100 lux to 499 lux), selecting a second gain map stored in the lookup table for medium light conditions (e.g., 500 lux to 999 lux), or selecting a third gain map stored in the lookup table for relatively good light conditions (e.g., 1000 lux or more), and applying the selected gain map to PDAF pixels to modify one or more parameters of a captured image.
  • in some examples, a measured light level (e.g., a lux measurement) may be determined, and the determined light level may be used to select the gain map.
  • aspects of the disclosure are initially described in the context of a multimedia system. Aspects of the disclosure are further described in the context of an image processing system. Aspects of the disclosure are further illustrated by and described with reference to apparatus diagrams, system diagrams, and flowcharts that relate to dynamic phase detection autofocus calibration.
  • FIG. 1 illustrates a multimedia system 100 for a device that supports dynamic phase detection autofocus calibration in accordance with aspects of the present disclosure.
  • the multimedia system 100 may include devices 105, a server 110, and a database 115. Although the multimedia system 100 illustrates two devices 105, a single server 110, a single database 115, and a single network 120, the present disclosure applies to any multimedia system architecture having one or more devices 105, servers 110, databases 115, and networks 120.
  • the devices 105, the server 110, and the database 115 may communicate with each other and exchange information that supports dynamic phase detection autofocus calibration, such as multimedia packets, multimedia data, or multimedia control information, via network 120 using communications links 125. In some cases, a portion or all of the techniques described herein supporting dynamic phase detection autofocus calibration may be performed by the devices 105 or the server 110, or both.
  • a device 105 may be a cellular phone, a smartphone, a personal digital assistant (PDA), a wireless communication device, a handheld device, a tablet computer, a laptop computer, a cordless phone, a display device (e.g., monitors), and/or the like that supports various types of communication and functional features related to multimedia (e.g., transmitting, receiving, broadcasting, streaming, sinking, capturing, storing, and recording multimedia data).
  • a device 105 may, additionally or alternatively, be referred to by those skilled in the art as a user equipment (UE), a user device, a smartphone, a Bluetooth device, a Wi-Fi device, a mobile station, a subscriber station, a mobile unit, a subscriber unit, a wireless unit, a remote unit, a mobile device, a wireless device, a wireless communications device, a remote device, an access terminal, a mobile terminal, a wireless terminal, a remote terminal, a handset, a user agent, a mobile client, a client, and/or some other suitable terminology.
  • the devices 105 may also be able to communicate directly with another device (e.g., using a peer-to-peer (P2P) or device-to-device (D2D) protocol).
  • a device 105 may be able to receive from or transmit to another device 105 a variety of information, such as instructions or commands (e.g., multimedia-related information).
  • the devices 105 may include an application 130 and a multimedia manager 135. While the multimedia system 100 illustrates the devices 105 including both the application 130 and the multimedia manager 135, the application 130 and the multimedia manager 135 may be optional features for the devices 105.
  • the application 130 may be a multimedia-based application that can receive (e.g., download, stream, broadcast) multimedia data from the server 110, the database 115, or another device 105, or transmit (e.g., upload) multimedia data to the server 110, the database 115, or another device 105 using communications links 125.
  • the multimedia manager 135 may be part of a general-purpose processor, a digital signal processor (DSP), an image signal processor (ISP), a central processing unit (CPU), a graphics processing unit (GPU), a microcontroller, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic component, a discrete hardware component, or any combination thereof designed to perform the functions described in the present disclosure, and/or the like.
  • the multimedia manager 135 may process multimedia (e.g., image data, video data, audio data) from and/or write multimedia data to a local memory of the device 105 or to the database 115.
  • the multimedia manager 135 may also be configured to provide multimedia enhancements, multimedia restoration, multimedia analysis, multimedia compression, multimedia streaming, and multimedia synthesis, among other functionality.
  • the multimedia manager 135 may perform white balancing, cropping, scaling (e.g., multimedia compression), adjusting a resolution, multimedia stitching, color processing, multimedia filtering, spatial multimedia filtering, artifact removal, frame rate adjustments, multimedia encoding, multimedia decoding, and multimedia filtering.
  • the multimedia manager 135 may process multimedia data to support dynamic phase detection autofocus calibration, according to the techniques described herein.
  • the server 110 may be a data server, a cloud server, a server associated with a multimedia subscription provider, a proxy server, a web server, an application server, a communications server, a home server, a mobile server, or any combination thereof.
  • the server 110 may in some cases include a multimedia distribution platform 140.
  • the multimedia distribution platform 140 may allow the devices 105 to discover, browse, share, and download multimedia via network 120 using communications links 125, and therefore provide a digital distribution of the multimedia from the multimedia distribution platform 140.
  • a digital distribution may be a form of delivering media content such as audio, video, and images, without the use of physical media, over an online delivery medium such as the Internet.
  • the devices 105 may upload or download multimedia-related applications for streaming, downloading, uploading, processing, enhancing, etc. multimedia (e.g., images, audio, video).
  • the server 110 may also transmit to the devices 105 a variety of information, such as instructions or commands (e.g., multimedia-related information) to download multimedia-related applications on the device 105.
  • the database 115 may store a variety of information, such as instructions or commands (e.g., multimedia-related information).
  • the database 115 may store multimedia 145.
  • the device may support dynamic phase detection autofocus calibration associated with the multimedia 145.
  • the device 105 may retrieve the stored data from the database 115 via the network 120 using communication links 125.
  • the database 115 may be a relational database (e.g., a relational database management system (RDBMS) or a Structured Query Language (SQL) database), a non-relational database, a network database, an object-oriented database, or other type of database, that stores the variety of information, such as instructions or commands (e.g., multimedia-related information).
  • the network 120 may provide encryption, access authorization, tracking, Internet Protocol (IP) connectivity, and other access, computation, modification, and/or functions.
  • Examples of network 120 may include any combination of cloud networks, local area networks (LAN), wide area networks (WAN), virtual private networks (VPN), wireless networks (using 802.11, for example), cellular networks (using third generation (3G), fourth generation (4G), long-term evolution (LTE), or new radio (NR) systems (e.g., fifth generation (5G))), etc.
  • Network 120 may include the Internet.
  • the communications links 125 shown in the multimedia system 100 may include uplink transmissions from the device 105 to the server 110 and the database 115, and/or downlink transmissions, from the server 110 and the database 115 to the device 105.
  • the communications links 125 may carry bidirectional communications and/or unidirectional communications.
  • the communication links 125 may be a wired connection or a wireless connection, or both.
  • the communications links 125 may include one or more connections, including but not limited to, Wi-Fi, Bluetooth, Bluetooth low- energy (BLE), cellular, Z-WAVE, 802.11, peer-to-peer, LAN, wireless local area network (WLAN), Ethernet, FireWire, fiber optic, and/or other connection types related to multimedia systems and digital image systems.
  • device 105 may include a display 155.
  • device 105 may include a camera 150 for capturing still images and/or video images.
  • camera 150 may include a front-facing camera as shown.
  • device 105 may also include a rear-facing camera.
  • device 105 may capture one or more images via an image sensor of camera 150 that is interoperable with a processor of device 105 capable of implementing aspects of the present disclosure.
  • one or more images may be obtained by a device (e.g., device 105, a wireless device) via a transmission received from another device (e.g., over a wireless link, a wired link, a portable memory, etc.).
  • display 155 may display pictures captured by camera 150 on device 105, and/or a camera wirelessly connected to device 105.
  • display 155 may display images captured by camera 150 (e.g., one or more image frames displayed on display 155).
  • camera 150 may include one or more autofocus (AF) motors to adjust the focus of images captured by the one or more cameras.
  • camera 150 may include one or more adjustable lens elements.
  • camera 150 may include one or more adjustable image sensors.
  • camera 150 may include an AF motor to adjust at least one adjustable lens element of camera 150. Additionally or alternatively, camera 150 may include an AF motor to adjust at least one adjustable image sensor.
  • camera 150 may operate in conjunction with one or more processors (e.g., image processors), or autofocus managers (e.g., phase detection autofocus (PDAF) manager, contrast AF manager), or gain correction managers, or phase disparity managers, or phase to defocus converters, or automatic exposure control managers, or any combination thereof.
  • device 105 may include an autofocus manager 135.
  • aspects of the present disclosure may relate to the autofocus manager enabling improved techniques for autofocus when camera 150 is used in one or more conditions, such as well-lit conditions (e.g., light conditions sufficient for using a single exposure configuration for image processing and PDAF processing) and low light conditions (e.g., light conditions where the light is insufficient for using a single exposure configuration for image processing and PDAF processing).
  • autofocus manager 135 may determine a light level of an environment of a sensor of camera 150 and/or a confidence level associated with a set of one or more autofocus pixels of the sensor of camera 150.
  • autofocus manager 135 may use a first gain map and perform lens defocus (e.g., generate a digital to analog code by which a motor of a lens is moved) based on the first gain map.
  • autofocus manager 135 may select the first gain map or a second gain map based at least in part on a determined light level. For example, when the determined light level is within a first light level range, autofocus manager 135 may use the first gain map to perform lens defocus, and when the determined light level is within a second light level range, autofocus manager 135 may use the second gain map to perform lens defocus. In some examples, autofocus manager 135 may perform an autofocus operation for the sensor of camera 150 based at least in part on an output of the autofocus pixels and the selected gain map (e.g., the first gain map, or the second gain map, or a third gain map, etc.). In some examples, autofocus manager 135 may output image data from the image pixels of the sensor based at least in part on the selected gain map.
  • the described techniques increase autofocus accuracy (e.g., PDAF accuracy) in various light conditions.
  • the described techniques include calibrating gain map values for multiple light conditions, among other aspects.
  • values for a first gain map may be determined based on a calibration process performed under low-range light conditions (e.g., a first range of lux values).
  • values for a second gain map may be determined based on the calibration process performed under medium-range light conditions (e.g., a second range of lux values higher than the first range of lux values).
  • values for a third gain map may be determined based on the calibration process performed under high-range light conditions (e.g., a third range of lux values higher than both the second range of lux values and the first range of lux values).
  • the described techniques may determine a light condition (e.g., a light condition associated with capturing an image) and apply a gain map (e.g., first gain map, second gain map, etc.) based on the determined light condition.
  • the described techniques improve autofocus accuracy by increasing an accuracy of lens defocus associated with an autofocus process.
  • applying a gain map based on a determined light condition results in an improved digital to analog code used to move a lens (e.g., lens defocus) in an autofocus process. Accordingly, the described techniques increase autofocus accuracy (e.g., PDAF accuracy) in various light conditions.
  • FIG. 2 illustrates an example of a digital image system 200 that supports dynamic phase detection autofocus calibration in accordance with aspects of the present disclosure.
  • digital image system 200 may implement aspects of multimedia system 100.
  • the digital image system 200 includes an autofocus manager 205.
  • the autofocus manager 205 may be an example of an autofocus manager of FIG. 1.
  • autofocus manager 205 may include phase detection library 210, contrast AF manager 240, and PDAF manager 245.
  • phase detection library 210 may include gain correction manager 225, phase disparity manager 230, and defocus converter 235.
  • phase detection library 210 may receive PD pixels 250.
  • PD pixels 250 may include a subset of pixels captured by an image sensor (e.g., a 13x17 grid of pixels from the total number of pixels captured by an image sensor).
  • the PD pixels 250 may include any combination of one or more left pixels (e.g., pixels sensed by a left image sensor) and one or more right pixels (e.g., pixels sensed by a right image sensor).
  • a left gain map may be associated with the left pixels and a right gain map may be associated with the right pixels.
  • gain map 215 may include the left gain map and the right gain map.
  • gain correction manager 225 may use different gain maps for different light conditions.
  • gain map 215 may be one of several gain maps selected by gain correction manager 225.
  • gain map 215 may include a first gain map for a first light condition (e.g., 100 lux to 499 lux), or a second gain map for a second light condition (e.g., 500 lux to 999 lux), or a third gain map for a third light condition (e.g., 1000 lux or more), etc.
  • gain correction manager 225 may select gain map 215 based on a determined light condition associated with PD pixels 250.
  • phase disparity manager 230 may calculate a phase disparity value based on the gain map 215 used by gain correction manager 225. In some examples, phase disparity manager 230 may calculate the phase disparity value based on a DCC map calibration process. In some examples, the DCC map calibration process may include a lens sweep operation. During the lens sweep, a number of images may be captured (e.g., 10 images captured in relation to 9 equal movements of the lens). In some examples, 10 disparity values and 10 focus values may be determined based on the lens sweep, in conjunction with phase disparity manager 230 and autofocus manager 205.
  • defocus converter 235 may receive a phase disparity value from phase disparity manager 230. In some examples, defocus converter 235 may receive a defocus conversion coefficient (DCC) map 220. In some examples, the DCC map 220 may be determined based on the DCC map calibration process. In some examples, the DCC map 220 may be determined based on a lens moving over a range (e.g., moving the lens in N equal number of steps) during the DCC map calibration process. In some examples, a linear regression may be performed by autofocus manager 205 to determine a DCC value.
  • an image may be divided into X by Y regions (e.g., 6x8 regions), resulting in an X by Y DCC map (e.g., DCC map 220).
  • defocus converter 235 may convert the phase disparity calculated by phase disparity manager 230 to a defocus value.
  • the defocus value determined by defocus converter 235 may be based on a digital to analog (DAC) code.
  • the DCC value may be calculated based on a change in lens position (e.g., DAC code) divided by a change in phase disparity.
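  • As an illustration of this calibration, the sketch below fits lens position (DAC code) against phase disparity over a hypothetical 10-sample lens sweep (nine equal movements); the slope of the fit is the DCC value. The sample values and helper names are made up for illustration:

```python
import numpy as np

def calibrate_dcc(dac_codes, disparities):
    """Fit lens position (DAC code) vs. phase disparity with a linear
    regression; the slope is the DCC value (change in lens position
    divided by change in phase disparity)."""
    slope, _intercept = np.polyfit(disparities, dac_codes, deg=1)
    return slope

# usage: 10 samples from 9 equal lens movements (illustrative values)
dac = np.linspace(100, 550, 10)      # lens positions during the sweep
pd = np.linspace(-4.5, 4.5, 10)      # measured phase disparities
dcc = calibrate_dcc(dac, pd)         # 50.0 DAC codes per unit of disparity
```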
  • contrast AF manager 240 may determine a contrast autofocus value.
  • contrast AF manager 240 may capture several images at several focal points (e.g., over a range of lens incremental movements) and analyze the pixels of each captured image using contrast calculations to determine which image includes the most contrast.
  • the contrast AF manager 240 may determine the lens position of the image with the most contrast, and provide a contrast autofocus value (e.g., the position of the lens associated with the image that includes the most contrast) to PDAF manager 245.
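  • A contrast AF search of this kind might look like the sketch below, where camera.capture_at and the gradient-energy contrast metric are assumed interfaces and choices for illustration, not details from the specification:

```python
import numpy as np

def contrast_metric(image):
    gy, gx = np.gradient(image.astype(float))
    return float((gx ** 2 + gy ** 2).sum())  # higher value = more contrast

def contrast_autofocus(camera, lens_positions):
    best_pos, best_score = lens_positions[0], -1.0
    for pos in lens_positions:
        image = camera.capture_at(pos)  # capture an image at this focal point
        score = contrast_metric(image)
        if score > best_score:
            best_pos, best_score = pos, score
    return best_pos  # lens position of the image with the most contrast
```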
  • PDAF manager 245 may receive the defocus value from defocus converter 235.
  • PDAF manager 245 may receive the contrast autofocus value from contrast AF manager 240.
  • PDAF manager 245 may determine a PDAF defocus value used to move a lens to an autofocus position.
  • the PDAF defocus value is based on the gain map 215 selected by gain correction manager 225 among multiple gain maps based on the light condition associated with PD pixels 250.
  • the operations of digital image system 200 may determine a light condition (e.g., a light condition associated with capturing an image) and apply a gain map (e.g., gain map 215) based on the determined light condition.
  • the operations of digital image system 200 improve autofocus accuracy by increasing an accuracy of lens defocus associated with an autofocus process of digital image system 200.
  • digital image system 200 applying gain map 215 based on a determined light condition results in an improved digital to analog code used to move a lens (e.g., lens defocus) in an autofocus process. Accordingly, the operations of digital image system 200 increase autofocus accuracy (e.g., PDAF accuracy) under multiple light conditions.
  • FIG. 3 illustrates an example of a digital image system 300 that supports dynamic phase detection autofocus calibration in accordance with aspects of the present disclosure.
  • digital image system 300 may implement aspects of multimedia system 100.
  • digital image system 300 may include autofocus manager 305.
  • the autofocus manager 305 may be an example of an autofocus manager of FIG. 1 or FIG. 2.
  • autofocus manager 305 may include a switch 310, N PDAF gain maps 315 based on N light conditions, a gain correction manager 325, and an automatic exposure control (AEC) manager 330.
  • autofocus manager 305 may use switch 310 to select a PDAF gain map from the N PDAF gain maps 315. In some examples, autofocus manager 305 selects the PDAF gain map based on a light condition (e.g., a light condition associated with capturing one or more images, PD pixels, etc.).
  • AEC manager 330 may determine a light condition associated with capturing one or more images or PD pixels. In some examples, AEC manager 330 may use an AEC algorithm to determine the light condition. In some examples, the AEC algorithm may be based on a determined luma value (e.g., a brightness value of an image, a brightness value of a black and white portion of an image, a brightness value of an achromatic portion of an image) that is determined by AEC manager 330. In some examples, AEC manager 330 may analyze values of pixels of an image sensor detecting photons.
  • AEC manager 330 may determine one or more luma values of the analyzed pixel values (e.g., a luma value of an individual pixel, a luma value for a group of pixels, etc.). In some examples, AEC manager 330 may determine a light condition based on the one or more luma values determined by AEC manager 330.
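  • A decision of this kind might be sketched as below, assuming pixel values have already been reduced to luma; the luma-to-lux conversion and the bucket boundaries (which reuse the example lux ranges given earlier) are placeholder assumptions:

```python
import numpy as np

def light_condition(luma_pixels):
    lux_estimate = float(np.mean(luma_pixels)) * 8.0  # placeholder conversion
    if lux_estimate < 500:
        return "low"     # e.g., 100 lux to 499 lux
    if lux_estimate < 1000:
        return "medium"  # e.g., 500 lux to 999 lux
    return "high"        # e.g., 1000 lux or more
```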
  • autofocus manager 305 may determine, via AEC manager 330, a first light condition associated with one or more images or PD pixels, or both. In some examples, autofocus manager 305 may select first PDAF gain map 320 that corresponds to the determined first light condition. In some examples, autofocus manager 305 may provide first PDAF gain map 320 to gain correction manager 325. In some examples, providing first PDAF gain map 320 to gain correction manager 325 may result in an increase in autofocus accuracy, based on autofocus manager 305 selecting PDAF gain maps according to determined light conditions.
  • FIG. 4 shows a block diagram 400 of a device 405 that supports dynamic phase detection autofocus calibration in accordance with aspects of the present disclosure.
  • the device 405 may be an example of aspects of a device as described herein.
  • the device 405 may include a sensor 410, an autofocus manager 415, and a memory 420.
  • the device 405 may also include a processor. Each of these components may be in communication with one another (e.g., via one or more buses).
  • Sensor 410 may include or be an example of a digital imaging sensor for taking photos and video.
  • sensor 410 may receive information such as packets, user data, or control information associated with various information channels. Information may be passed on from sensor 410 to other components of the device 405. Additionally or alternatively, components of device 405 used to communicate data over a wireless (e.g., or wired) link may be in communication with autofocus manager 415 (e.g., via one or more buses) without passing information through sensor 410.
  • the autofocus manager 415 may capture an image using an image sensor of the device, perform an automatic exposure control operation on the captured image, determine a light level associated with the image based on an output of the automatic exposure control operation, select a phase detection autofocus gain map from a group of phase detection autofocus gain maps based on the light level associated with the image, and use the selected phase detection autofocus gain map to perform at least a portion of a phase detection autofocus process.
  • the autofocus manager 415 may be an example of aspects of the autofocus manager 710 described herein.
  • the autofocus manager 415 may be implemented in hardware, code (e.g., software or firmware) executed by a processor, or any combination thereof. If implemented in code executed by a processor, the functions of the autofocus manager 415, or its sub-components may be executed by a general-purpose processor, a DSP, an application-specific integrated circuit (ASIC), an FPGA or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described in the present disclosure.
  • the autofocus manager 415 may be physically located at various positions, including being distributed such that portions of functions are implemented at different physical locations by one or more physical components.
  • the autofocus manager 415, or its sub-components may be a separate and distinct component in accordance with various aspects of the present disclosure.
  • the autofocus manager 415, or its sub-components may be combined with one or more other hardware components, including but not limited to an input/output (I/O) component, a transceiver, a network server, another computing device, one or more other components described in the present disclosure, or a combination thereof in accordance with various aspects of the present disclosure.
  • Memory 420 may store information (e.g., facial feature information) generated by other components of the device such as autofocus manager 415.
  • memory 420 may store facial feature information with which to compare an output of autofocus manager 415.
  • Memory 420 may comprise one or more computer-readable storage media.
  • Examples of memory 420 include, but are not limited to, random access memory (RAM), static RAM (SRAM), dynamic RAM (DRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), compact disc read-only memory (CD-ROM) or other optical disc storage, magnetic disc storage, or other magnetic storage devices, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer or a processor (e.g., autofocus manager 415).
  • FIG. 5 shows a block diagram 500 of a device 505 that supports dynamic phase detection autofocus calibration in accordance with aspects of the present disclosure.
  • the device 505 may be an example of aspects of a device 405 or a device 105 as described herein.
  • the device 505 may include a sensor 510, an autofocus manager 515, and a memory 540.
  • the device 505 may also include a processor. Each of these components may be in communication with one another (e.g., via one or more buses).
  • Sensor 510 may include or be an example of a digital imaging sensor for taking photos and video.
  • sensor 510 may receive information such as packets, user data, or control information associated with various information channels. Information may be passed on to other components of the device. Additionally or alternatively, components of device 505 used to communicate data over a wireless (e.g., or wired) link may be in communication with autofocus manager 515 (e.g., via one or more buses) without passing information through sensor 510.
  • the autofocus manager 515 may be an example of aspects of the autofocus manager 415 as described herein.
  • the autofocus manager 515 may include an image manager 520, a light level manager 525, a gain map manager 530, and a PDAF manager 535.
  • the autofocus manager 515 may be an example of aspects of the autofocus manager 710 described herein.
  • the image manager 520 may capture an image using an image sensor of the device.
  • the light level manager 525 may perform an automatic exposure control operation on the captured image and determine a light level associated with the image based on an output of the automatic exposure control operation.
  • the gain map manager 530 may select a phase detection autofocus gain map from a group of phase detection autofocus gain maps based on the light level associated with the image.
  • the PDAF manager 535 may use the selected phase detection autofocus gain map to perform at least a portion of a phase detection autofocus process.
  • Memory 540 may store information (e.g., facial feature information) generated by other components of the device such as autofocus manager 515.
  • memory 540 may store facial feature information with which to compare an output of autofocus manager 515.
  • Memory 540 may comprise one or more computer-readable storage media.
  • Examples of memory 540 include, but are not limited to, random access memory (RAM), static RAM (SRAM), dynamic RAM (DRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), compact disc read-only memory (CD-ROM) or other optical disc storage, magnetic disc storage, or other magnetic storage devices, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer or a processor (e.g., autofocus manager 515).
  • FIG. 6 shows a block diagram 600 of an autofocus manager 605 that supports dynamic phase detection autofocus calibration in accordance with aspects of the present disclosure.
  • the autofocus manager 605 may be an example of aspects of an autofocus manager 415, an autofocus manager 515, or an autofocus manager 710 described herein.
  • the autofocus manager 605 may include an image manager 610, a light level manager 615, a gain map manager 620, a PDAF manager 625, and a defocus manager 630. Each of these modules may communicate, directly or indirectly, with one another (e.g., via one or more buses).
  • the image manager 610 may capture an image using an image sensor of the device.
  • the light level manager 615 may perform an automatic exposure control operation on the captured image.
  • the light level manager 615 may determine a light level associated with the image based on an output of the automatic exposure control operation.
  • the light level manager 615 may identify pixels of a phase detection pixel block from the image, where determining the light level is based on one or more light levels of the pixels of the phase detection pixel block. In some examples, the light level manager 615 may determine the light level based on one or more values of an intensity measurement map associated with the captured image. In some cases, the intensity measurement map includes a luma histogram. In some cases, the luma histogram includes a weighted sum of a red intensity value, a green intensity value, and a blue intensity value of a pixel of the image.
  • the gain map manager 620 may select a phase detection autofocus gain map from a group of phase detection autofocus gain maps based on the light level associated with the image.
  • a value of the phase detection autofocus gain map includes a gain map width, or a gain map height, or a left gain map, or a right gain map, or any combination thereof.
  • the PDAF manager 625 may use the selected phase detection autofocus gain map to perform at least a portion of a phase detection autofocus process.
  • the PDAF manager 625 may apply the phase detection autofocus gain map to the pixels of the phase detection pixel block based on the one or more light levels of the pixels of the phase detection pixel block.
  • the PDAF manager 625 may determine a phase disparity based on applying the phase detection autofocus gain map to the pixels of the phase detection pixel block.
  • the defocus manager 630 may determine a lens defocus based on the phase disparity. In some examples, the defocus manager 630 may adjust a lens of the device based on the lens defocus.
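  • This final step might be sketched as below, assuming a hypothetical lens.move_to interface and a DCC value obtained as in the calibration sketch earlier:

```python
def apply_defocus(lens, current_dac, phase_disparity, dcc):
    # convert the phase disparity into a lens-position change (DAC codes)
    defocus_dac = dcc * phase_disparity
    target = int(round(current_dac + defocus_dac))
    lens.move_to(target)  # adjust the lens based on the lens defocus
    return target
```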
  • FIG. 7 shows a diagram of a system 700 including a device 705 that supports dynamic phase detection autofocus calibration in accordance with aspects of the present disclosure.
  • the device 705 may be an example of or include the components of device 405, device 505, or a device as described herein.
  • the device 705 may include components for bidirectional voice and data communications including components for transmitting and receiving communications, including an autofocus manager 710, an I/O controller 715, a transceiver 720, an antenna 725, memory 730, a processor 740, and a coding manager 750. These components may be in electronic communication via one or more buses (e.g., bus 745).
  • the autofocus manager 710 may capture an image using an image sensor of the device, perform an automatic exposure control operation on the captured image, determine a light level associated with the image based on an output of the automatic exposure control operation, select a phase detection autofocus gain map from a group of phase detection autofocus gain maps based on the light level associated with the image, and use the selected phase detection autofocus gain map to perform at least a portion of a phase detection autofocus process.
  • the I/O controller 715 may manage input and output signals for the device 705.
  • the I/O controller 715 may also manage peripherals not integrated into the device 705. In some cases, the I/O controller 715 may represent a physical connection or port to an external peripheral.
  • the I/O controller 715 may utilize an operating system such as iOS®, ANDROID®, MS-DOS®, MS-WINDOWS®, OS/2®, UNIX®, LINUX®, or another known operating system. In other cases, the I/O controller 715 may represent or interact with a modem, a keyboard, a mouse, a touchscreen, or a similar device. In some cases, the I/O controller 715 may be implemented as part of a processor. In some cases, a user may interact with the device 705 via the I/O controller 715 or via hardware components controlled by the I/O controller 715.
  • the transceiver 720 may communicate bi-directionally, via one or more antennas, wired, or wireless links as described herein.
  • the transceiver 720 may represent a wireless transceiver and may communicate bi-directionally with another wireless transceiver.
  • the transceiver 720 may also include a modem to modulate the packets and provide the modulated packets to the antennas for transmission, and to demodulate packets received from the antennas.
  • in some cases, the device 705 may include a single antenna 725. However, in some cases the device may have more than one antenna 725, which may be capable of concurrently transmitting or receiving multiple wireless transmissions.
  • the memory 730 may include RAM and ROM.
  • the memory 730 may store computer-readable, computer-executable code 735 including instructions that, when executed, cause the processor to perform various functions described herein.
  • the memory 730 may contain, among other things, a BIOS which may control basic hardware or software operation such as the interaction with peripheral components or devices.
  • the processor 740 may include an intelligent hardware device (e.g., a general-purpose processor, a DSP, a CPU, a microcontroller, an ASIC, an FPGA, a programmable logic device, a discrete gate or transistor logic component, a discrete hardware component, or any combination thereof).
  • the processor 740 may be configured to operate a memory array using a memory controller.
  • a memory controller may be integrated into the processor 740.
  • the processor 740 may be configured to execute computer-readable instructions stored in a memory (e.g., the memory 730) to cause the device 705 to perform various functions (e.g., functions or tasks supporting dynamic phase detection autofocus calibration).
  • the code 735 may include instructions to implement aspects of the present disclosure, including instructions to support dynamic autofocus calibration.
  • the code 735 may be stored in a non-transitory computer-readable medium such as system memory or other type of memory.
  • the code 735 may not be directly executable by the processor 740 but may cause a computer (e.g., when compiled and executed) to perform functions described herein.
  • FIG. 8 shows a flowchart illustrating a method 800 that supports dynamic phase detection autofocus calibration in accordance with aspects of the present disclosure.
  • the operations of method 800 may be implemented by a device or its components as described herein.
  • the operations of method 800 may be performed by an autofocus manager as described with reference to FIGs. 4 through 7.
  • a device may execute a set of instructions to control the functional elements of the device to perform the functions described herein. Additionally or alternatively, a device may perform aspects of the functions described herein using special-purpose hardware.
  • the device may capture an image using an image sensor of the device.
  • the operations of 805 may be performed according to the methods described herein. In some examples, aspects of the operations of 805 may be performed by an image manager as described with reference to FIGs. 4 through 7.
  • the device may perform an automatic exposure control operation on the captured image.
  • the operations of 810 may be performed according to the methods described herein. In some examples, aspects of the operations of 810 may be performed by a light level manager as described with reference to FIGs. 4 through 7.
  • the device may determine a light level associated with the image based on an output of the automatic exposure control operation.
  • the operations of 815 may be performed according to the methods described herein. In some examples, aspects of the operations of 815 may be performed by a light level manager as described with reference to FIGs. 4 through 7.
  • the device may select a phase detection autofocus gain map from a group of phase detection autofocus gain maps based on the light level associated with the image.
  • the operations of 820 may be performed according to the methods described herein. In some examples, aspects of the operations of 820 may be performed by a gain map manager as described with reference to FIGs. 4 through 7.
  • the device may use the selected phase detection autofocus gain map to perform at least a portion of a phase detection autofocus process.
  • the operations of 825 may be performed according to the methods described herein. In some examples, aspects of the operations of 825 may be performed by a PDAF manager as described with reference to FIGs. 4 through 7.
  • FIG. 9 shows a flowchart illustrating a method 900 that supports dynamic phase detection autofocus calibration in accordance with aspects of the present disclosure.
  • the operations of method 900 may be implemented by a device or its components as described herein.
  • the operations of method 900 may be performed by an autofocus manager as described with reference to FIGs. 4 through 7.
  • a device may execute a set of instructions to control the functional elements of the device to perform the functions described herein. Additionally or alternatively, a device may perform aspects of the functions described herein using special-purpose hardware.
  • the device may determine a light level associated with the image based on an output of the automatic exposure control operation.
  • the operations of 905 may be performed according to the methods described herein. In some examples, aspects of the operations of 905 may be performed by a light level manager as described with reference to FIGs. 4 through 7.
  • the device may select a phase detection autofocus gain map from a group of phase detection autofocus gain maps based on the light level associated with the image.
  • the operations of 910 may be performed according to the methods described herein. In some examples, aspects of the operations of 910 may be performed by a gain map manager as described with reference to FIGs. 4 through 7.
  • the device may use the selected phase detection autofocus gain map to perform at least a portion of a phase detection autofocus process.
  • the operations of 915 may be performed according to the methods described herein. In some examples, aspects of the operations of 915 may be performed by a PDAF manager as described with reference to FIGs. 4 through 7.
  • the device may identify pixels of a phase detection pixel block from the image, where determining the light level is based on one or more light levels of the pixels of the phase detection pixel block.
  • the operations of 920 may be performed according to the methods described herein. In some examples, aspects of the operations of 920 may be performed by a light level manager as described with reference to FIGs. 4 through 7.
  • the device may apply the phase detection autofocus gain map to the pixels of the phase detection pixel block based on the one or more light levels of the pixels of the phase detection pixel block.
  • the operations of 925 may be performed according to the methods described herein. In some examples, aspects of the operations of 925 may be performed by a PDAF manager as described with reference to FIGs. 4 through 7.
  • the device may determine a phase disparity based on applying the phase detection autofocus gain map to the pixels of the phase detection pixel block.
  • the operations of 930 may be performed according to the methods described herein. In some examples, aspects of the operations of 930 may be performed by a PDAF manager as described with reference to FIGs. 4 through 7.
  • the device may determine a lens defocus based on the phase disparity.
  • the operations of 935 may be performed according to the methods described herein. In some examples, aspects of the operations of 935 may be performed by a defocus manager as described with reference to FIGs. 4 through 7.
  • the device may adjust a lens of the device based on the lens defocus.
  • the operations of 940 may be performed according to the methods described herein. In some examples, aspects of the operations of 940 may be performed by a defocus manager as described with reference to FIGs. 4 through 7.
  • Aspect 1: A method for dynamic autofocus calibration by a device, comprising: capturing an image using an image sensor of the device; performing an automatic exposure control operation on the captured image; determining a light level associated with the image based at least in part on an output of the automatic exposure control operation; selecting a phase detection autofocus gain map from a group of phase detection autofocus gain maps based at least in part on the light level associated with the image; and using the selected phase detection autofocus gain map to perform at least a portion of a phase detection autofocus process.
  • Aspect 2: The method of aspect 1, further comprising: identifying pixels of a phase detection pixel block from the image, wherein determining the light level is based at least in part on one or more light levels of the pixels of the phase detection pixel block.
  • Aspect 3: The method of aspect 2, further comprising: applying the phase detection autofocus gain map to the pixels of the phase detection pixel block based at least in part on the one or more light levels of the pixels of the phase detection pixel block.
  • Aspect 4: The method of aspect 3, further comprising: determining a phase disparity based at least in part on applying the phase detection autofocus gain map to the pixels of the phase detection pixel block.
  • Aspect 5: The method of aspect 4, further comprising: determining a lens defocus based at least in part on the phase disparity; and adjusting a lens of the device based at least in part on the lens defocus.
  • Aspect 6: The method of any of aspects 1 through 5, wherein a value of the phase detection autofocus gain map includes a gain map width, or a gain map height, or a left gain map, or a right gain map, or any combination thereof.
  • Aspect 7: The method of any of aspects 1 through 6, wherein determining the light level is based at least in part on one or more values of an intensity measurement map associated with the captured image.
  • Aspect 8: The method of aspect 7, wherein the intensity measurement map comprises a luma histogram.
  • Aspect 9: The method of aspect 8, wherein the luma histogram comprises a weighted sum of a red intensity value, a green intensity value, and a blue intensity value of a pixel of the image (a short sketch of this weighting follows this list of aspects).
  • Aspect 10: An apparatus for dynamic autofocus calibration by a device, comprising a processor; memory coupled with the processor; and instructions stored in the memory and executable by the processor to cause the apparatus to perform a method of any of aspects 1 through 9.
  • Aspect 11: An apparatus for dynamic autofocus calibration by a device, comprising at least one means for performing a method of any of aspects 1 through 9.
  • Aspect 12: A non-transitory computer-readable medium storing code for dynamic autofocus calibration by a device, the code comprising instructions executable by a processor to perform a method of any of aspects 1 through 9.
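As a concrete illustration of the weighted sum named in Aspect 9, the sketch below uses the familiar ITU-R BT.601 luma weights; the disclosure does not specify which weights are used, so the 0.299/0.587/0.114 values are an assumption made for illustration.

```python
def luma(red, green, blue):
    # Weighted sum of a pixel's red, green, and blue intensity values
    # (Aspect 9); the BT.601 weights are assumed for illustration only.
    return 0.299 * red + 0.587 * green + 0.114 * blue

# Example: a mid-gray pixel maps to mid-gray luma.
assert round(luma(128, 128, 128)) == 128
```

A luma histogram (Aspect 8) would then bin these per-pixel values across the image, and the light level of Aspect 7 could be read off that intensity measurement map.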
  • Information and signals described herein may be represented using any of a variety of different technologies and techniques.
  • Data, instructions, commands, information, signals, bits, symbols, and chips that may be referenced throughout the description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or particles, or any combination thereof.
  • A general-purpose processor may be a microprocessor, but in the alternative, the processor may be any processor, controller, microcontroller, or state machine.
  • A processor may also be implemented as a combination of computing devices (e.g., a combination of a DSP and a microprocessor, multiple microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration).
  • The functions described herein may be implemented in hardware, software executed by a processor, firmware, or any combination thereof. If implemented in software executed by a processor, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Other examples and implementations are within the scope of the disclosure and appended claims. For example, due to the nature of software, functions described herein can be implemented using software executed by a processor, hardware, firmware, hardwiring, or combinations of any of these. Features implementing functions may also be physically located at various positions, including being distributed such that portions of functions are implemented at different physical locations.
  • Computer-readable media includes both non-transitory computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another.
  • A non-transitory storage medium may be any available medium that can be accessed by a general-purpose or special-purpose computer.
  • Non-transitory computer-readable media may include random-access memory (RAM), read-only memory (ROM), electrically erasable programmable ROM (EEPROM), flash memory, compact disk (CD) ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other non-transitory medium that can be used to carry or store desired program code means in the form of instructions or data structures and that can be accessed by a general-purpose or special-purpose computer, or a general-purpose or special-purpose processor. Also, any connection is properly termed a computer-readable medium.
  • Disk and disc include CD, laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above are also included within the scope of computer-readable media.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

Methods, systems, and devices for dynamic phase detection autofocus calibration are described. The methods include capturing an image using an image sensor of the device, performing an automatic exposure control operation on the captured image, determining a light level associated with the image based on an output of the automatic exposure control operation, selecting a phase detection autofocus gain map from a group of phase detection autofocus gain maps based on the light level associated with the image, and using the selected phase detection autofocus gain map to perform at least a portion of a phase detection autofocus process.
PCT/US2021/031851 2020-05-15 2021-05-11 Étalonnage de mise au point automatique de détection de phase dynamique WO2021231482A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN202041020509 2020-05-15
IN202041020509 2020-05-15

Publications (1)

Publication Number Publication Date
WO2021231482A1 (fr)

Family

ID=76197659

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2021/031851 WO2021231482A1 (fr) 2020-05-15 2021-05-11 Étalonnage de mise au point automatique de détection de phase dynamique

Country Status (1)

Country Link
WO (1) WO2021231482A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120300116A1 (en) * 2011-05-25 2012-11-29 Canon Kabushiki Kaisha Image pickup apparatus and control method therefor
US20180349378A1 (en) * 2017-05-30 2018-12-06 Qualcomm Incorporated Calibration for phase detection auto focus (pdaf) camera systems
US20190156516A1 (en) * 2018-12-28 2019-05-23 Intel Corporation Method and system of generating multi-exposure camera statistics for image processing


Similar Documents

Publication Publication Date Title
US10291321B2 (en) Self-identifying one-way authentication method using optical signals
JP5666703B2 (ja) Identification of visual media content captured by a camera-enabled mobile device
US20200336684A1 (en) Pattern configurable pixel correction
US8965121B2 (en) Image color matching and equalization devices and related methods
CN109345485B (zh) Image enhancement method and apparatus, electronic device, and storage medium
US10594921B2 (en) Dual phase detection power optimizations
US20150304685A1 (en) Perceptual preprocessing filter for viewing-conditions-aware video coding
US9407829B1 (en) Method for mobile device to improve camera image quality by detecting whether the mobile device is indoors or outdoors
US9363445B2 (en) Flash collision detection, compensation, and prevention
US11435973B2 (en) Communication apparatus, communication method, and storage medium
KR20180080713A (ko) 가상 현실 이미지들의 이미지 개선을 위한 방법 및 장치
US20190313005A1 (en) Tone mapping for high-dynamic-range images
US20190230253A1 (en) Face tone color enhancement
WO2018195720A1 (fr) Method and apparatus for determining a modulation and coding mode
WO2021231482A1 (fr) Dynamic phase detection autofocus calibration
US11070738B2 (en) Infrared-assisted pre-flash
US20200236269A1 (en) Dynamic exposure for autofocus in low light
US20190373167A1 (en) Spotlight detection for improved image quality
US20220239814A1 (en) In-display camera activation
US11363213B1 (en) Minimizing ghosting in high dynamic range image processing
US11373281B1 (en) Techniques for anchor frame switching
KR101980629B1 (ko) Apparatus for automatic video playback, method therefor, and computer-readable recording medium storing a program for performing the method
US20220327718A1 (en) Techniques for enhancing slow motion recording
US20210203825A1 (en) Techniques for correcting video rolling shutter jello effect for open loop voice-coil motor cameras
CN107592342B (zh) Information push method and apparatus

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application
    Ref document number: 21729184; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: PCT application non-entry in European phase
    Ref document number: 21729184; Country of ref document: EP; Kind code of ref document: A1