CN116848851A - Electronic device including image sensor and method of operating the same


Info

Publication number
CN116848851A
Authority
CN
China
Prior art keywords
adc
group
electronic device
phase difference
reading out
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202280013971.2A
Other languages
Chinese (zh)
Inventor
文仁儿
朴宰亨
下川修一
尹汝倬
姜家王
金东秀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020210072286A (KR20220115493A)
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Priority claimed from PCT/KR2022/002018 (WO2022173236A1)
Publication of CN116848851A

Classifications

  • Transforming Light Signals Into Electric Signals

Abstract

A method and an electronic device are provided in which, in a first unit pixel of an image sensor, a first analog-to-digital conversion (ADC) is performed by reading out a first photodiode (PD) group and a second PD group adjacent to the first PD group in a first direction. A second ADC is performed by reading out a third PD group adjacent to the first PD group in a second direction, which is perpendicular to the first direction. A third ADC is performed by reading out a fourth PD group adjacent to the second PD group in the second direction. A first phase difference in the second direction is detected based on the first ADC, the second ADC, and the third ADC, and a second phase difference in the first direction is detected based on the second ADC and the third ADC.

Description

Electronic device including image sensor and method of operating the same
Technical Field
The present disclosure relates to an electronic device having an image sensor, and more particularly, to a technique for performing an autofocus function in an electronic device including an image sensor.
Background
With the demand for high-resolution modes, techniques for improving focusing performance by using an image sensor have been developed. Because the camera mounting structure limits the sensor size, image sensors have been developed that reduce the pixel size while increasing the pixel count, and methods of reducing the pixel size have accordingly been proposed. In particular, driven by the recent demand for image sensors with high focusing performance rather than simply high pixel counts, pixel structures capable of phase difference detection have been developed.
Meanwhile, in a related-art auto focus (AF) method, two photodiodes (PDs) in a 2×1 pixel structure are disposed under a single microlens, and their outputs are separated into left and right data so that focusing is performed with reference to the phase difference between them. However, as PD size decreases, there is an increasing demand for structures capable of high resolution, and it has been proposed that four PDs in a 2×2 pixel structure, instead of a 2×1 structure, share a single microlens. Recent image sensors call for this approach because, in addition to the advantage of high resolution, phase differences can be obtained both between left and right and between up and down.
Disclosure of Invention
Technical problem
In a 4PD structure in which four PDs share one floating diffusion (FD) node, each of the four PDs must be read out once, and a PD value that has already been read out cannot be read out again.
To overcome these limitations, the number of analog-to-digital conversions (ADCs) may be increased so that the disparity in each direction is detected by using left, right, up, and down data. In this case, however, the frame rate is affected, making rapid image data transmission difficult.
Conversely, if only the phase difference in one direction (e.g., left-right or up-down) is used in order to prevent such frame rate loss, the accuracy may be lower than when the phase differences between left and right and between up and down are all used.
Solution to the problem
An electronic device according to an embodiment may include an image sensor having a plurality of unit pixels. Each unit pixel includes at least four PDs. The PDs are disposed adjacent to each other in a first direction and a second direction different from the first direction. The electronic device also includes at least one processor electrically connected to the image sensor. The at least one processor is configured to perform a first ADC in a first unit pixel of the plurality of unit pixels by reading out a first PD group included in the first unit pixel and a second PD group in the first unit pixel that is adjacent to the first PD group in the first direction. The at least one processor is further configured to perform a second ADC by reading out a third PD group in the first unit pixel that is adjacent to the first PD group in the second direction, and to perform a third ADC by reading out a fourth PD group in the first unit pixel that is adjacent to the second PD group in the second direction. The at least one processor is further configured to detect a first phase difference in the second direction based on the results of the first ADC, the second ADC, and the third ADC, and to detect a second phase difference in the first direction based on the results of the second ADC and the third ADC.
A method for operating an electronic device according to an embodiment may include: in a first unit pixel among a plurality of unit pixels included in an image sensor of an electronic device, a first ADC is performed by reading out a first PD group included in the first unit pixel and a second PD group in the first unit pixel adjacent to the first PD group in a first direction, a second ADC is performed by reading out a third PD group in the first unit pixel adjacent to the first PD group in a second direction, the second direction being perpendicular to the first direction, a third ADC is performed by reading out a fourth PD group in the first unit pixel adjacent to the second PD group in the second direction, a first phase difference in the second direction is detected based on a result of performing the first ADC, the second ADC, and the third ADC, and a second phase difference in the first direction is detected based on a result of performing the second ADC and the third ADC.
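For illustration only (this sketch is not part of the disclosure), the following Python code models the readout just described for a 2×2 unit pixel: three conversions per pixel, a horizontal phase-difference pair built from all three conversions, and a vertical pair built from only the second and third. The array layout, helper names, and the crude 1-D correlation measure are assumptions made for the example.

```python
import numpy as np

def read_three_adc(pd):
    """Three-conversion readout of one 2x2 unit pixel.

    pd[0, 0] = TL, pd[0, 1] = TR, pd[1, 0] = BL, pd[1, 1] = BR.
    The vertically adjacent pair TL+BL is summed on the FD node,
    so three conversions replace four.
    """
    adc1 = pd[0, 0] + pd[1, 0]   # first PD group + second PD group
    adc2 = pd[0, 1]              # third PD group (TR)
    adc3 = pd[1, 1]              # fourth PD group (BR)
    return adc1, adc2, adc3

def af_channels(pixels):
    """pixels: H x W x 2 x 2 array of PD values, one 2x2 block per unit pixel."""
    adc1 = pixels[:, :, 0, 0] + pixels[:, :, 1, 0]
    adc2 = pixels[:, :, 0, 1]
    adc3 = pixels[:, :, 1, 1]
    left, right = adc1, adc2 + adc3   # horizontal pair: uses all three ADCs
    top, bottom = adc2, adc3          # vertical pair: right half of the pixel
    return (left, right), (top, bottom)

def disparity(a, b, axis):
    """Crude phase difference: the shift that maximizes 1-D cross-correlation
    of the two channels, averaged across the other axis."""
    sig_a = a.mean(axis=1 - axis)
    sig_b = b.mean(axis=1 - axis)
    sig_a = sig_a - sig_a.mean()
    sig_b = sig_b - sig_b.mean()
    corr = np.correlate(sig_a, sig_b, mode="full")
    return int(np.argmax(corr)) - (len(sig_a) - 1)
```

Calling disparity(left, right, axis=1) then uses data from all three conversions, while disparity(top, bottom, axis=0) uses only the second and third, mirroring the two detections described above.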
Advantageous effects of the invention
The electronic apparatus and method according to various embodiments of the present disclosure may reduce frame rate loss while still obtaining phase differences in multiple directions, thereby improving AF performance.
The advantages obtainable from the present disclosure are not limited to the above-described advantages, and other advantages not mentioned herein will be clearly understood from the following description by those skilled in the art to which the present disclosure pertains.
Drawings
The foregoing and other aspects, features, and advantages of certain embodiments of the present disclosure will become more apparent from the following description, taken in conjunction with the accompanying drawings, in which:
FIG. 1 illustrates a structure of a camera module and an electronic device according to an embodiment;
FIG. 2 illustrates the main hardware elements of an electronic device according to an embodiment;
FIG. 3 is a cross-sectional view of a pixel array of an image sensor according to an embodiment;
FIG. 4 is a circuit diagram of a unit pixel of an image sensor according to an embodiment;
FIG. 5 is a flowchart illustrating operations performed by a processor for auto-focusing in an electronic device, according to an embodiment;
FIG. 6A illustrates performing an ADC when an image sensor of an electronic device includes a unit pixel having four photodiodes (PDs) in a 2×2 array, according to an embodiment;
FIG. 6B illustrates performing an ADC when 16 photodiodes (PDs) in a 4×4 array of an image sensor share the same color filter in an electronic device, according to an embodiment;
FIG. 7 illustrates performing an ADC in an electronic device according to various embodiments;
FIG. 8 illustrates reading out pixels in different orders for different unit pixel groups in an electronic device according to an embodiment;
FIG. 9 is a flowchart illustrating determination of an ADC mode by comparing reliability in a first direction and reliability in a second direction in an electronic device according to an embodiment;
FIG. 10 is a graph showing parallax based on the position of an actuator in an electronic device according to an embodiment;
FIG. 11 is a block diagram of an electronic device in a network environment, according to various embodiments; and
FIG. 12 is a block diagram illustrating a camera module according to various embodiments.
Detailed Description
Fig. 1 illustrates a structure of a camera module and an electronic device according to an embodiment.
Fig. 1 schematically illustrates a camera module 180 and the exterior of an electronic device 100 on which the camera module 180 is mounted, according to an embodiment. The embodiment of fig. 1 is shown and described using a mobile device such as a smartphone, but those skilled in the art will appreciate that the embodiment may be applied to any of a variety of electronic devices or camera-equipped mobile devices.
Referring to fig. 1, according to an embodiment, a display 110 may be disposed on a front surface of an electronic device 100. According to an embodiment, the display 110 may occupy a majority of the front surface of the electronic device 100. The display 110, and an area of a bezel 190 at least partially surrounding the edge of the display 110, may be disposed on the front surface of the electronic device 100. The display 110 may include a flat region and a curved region extending from the flat region to a side surface of the electronic device 100. The electronic device 100 shown in fig. 1 is one example, and various embodiments are possible. For example, the display 110 of the electronic device 100 may include only a flat region without a curved region, or may include a curved region at only one edge instead of at both edges. Further, according to an embodiment, the curved region may extend to the rear surface of the electronic device 100, and thus the electronic device 100 may include an additional flat region.
According to an embodiment, the electronic device 100 may further include a speaker, a receiver, a front camera 161, a proximity sensor, and a home key. The electronic device 100 according to an embodiment may be provided with a rear cover 150 integrated with the body of the electronic device. In another embodiment, the rear cover 150 may have a shape that is detachable from the body of the electronic device 100 so as to allow replacement of the battery. The rear cover 150 may be referred to as a battery cover or a rear surface cover.
According to an embodiment, a fingerprint sensor 171 for identifying a user fingerprint may be provided in the first area 170 of the display 110. The fingerprint sensor 171 may be disposed on a layer below the display 110 and thus not visible to the user. Alternatively, the fingerprint sensor 171 may be arranged to be difficult to see. Further, in addition to the fingerprint sensor 171, an additional sensor for user/biometric authentication may be provided in a partial region of the display 110. In another embodiment, the sensor for user/biometric authentication may be disposed in one region of the bezel 190. For example, an Infrared (IR) sensor for iris authentication may be provided to be exposed through one region of the display 110 or through one region of the bezel 190.
According to an embodiment, the front camera 161 is arranged in a second region 160 on the front surface of the electronic device 100. In the embodiment of fig. 1, the front camera 161 is shown exposed through an area of the display 110. However, in another embodiment, the front camera 161 may be exposed through the bezel 190. In another embodiment (not shown), at least one of an audio module, a sensor module (e.g., sensor 163), a camera module (e.g., front camera 161), and a light emitting element (not shown) may be disposed on the rear surface of the second region 160 of the display 110. For example, a camera module may be disposed in the front surface and/or the side surface of the electronic device 100 so as to face the front surface and/or the side surface. For example, the front camera 161 may be an under-display camera (UDC) that is not visually exposed through the second region 160.
According to an embodiment, the electronic device 100 may comprise at least one front camera 161. For example, the electronic device 100 may have two front facing cameras, including a first front facing camera and a second front facing camera. According to an embodiment, the first front camera and the second front camera may be the same type of camera with the same specification (e.g., pixels). However, in another embodiment, the first front camera and the second front camera may be implemented as different cameras having different specifications. The electronic device 100 may support dual camera related functions (e.g., 3D image capture, AF, etc.) through two front-facing cameras. The description of the front camera may be equally or similarly applicable to the rear camera of the electronic device 100.
According to an embodiment, the electronic device 100 may also include a sensor 163 or various types of hardware for assisting image capture, such as a flash. For example, the electronic device 100 may include a distance sensor (e.g., a time of flight (TOF) sensor) for sensing a distance between an object (subject) and the electronic device 100. The distance sensor may be provided separately from the front camera 161 and/or the rear camera, or may be provided to be included in the front camera 161 and/or the rear camera.
According to an embodiment, at least one physical key may be provided at a side of the electronic device 100. For example, the first function key 151 for turning on/off the display 110 or turning on/off the power of the electronic device 100 may be disposed at the right edge with respect to the front surface of the electronic device 100. According to an embodiment, the second function key 152 for controlling the volume of the electronic device 100 or controlling the brightness of the screen may be disposed at the left edge with respect to the front surface of the electronic device 100. In addition, additional buttons or keys may also be provided on the front or rear surface of the electronic device 100. For example, physical buttons or touch buttons mapped to specific functions may be provided in a lower end region of the bezel 190 of the front surface.
The electronic device 100 shown in fig. 1 corresponds to an example, and the example does not limit the types of devices to which the technical spirit disclosed in the present disclosure is applied. For example, by employing a flexible display or a hinge structure, the technical spirit of the present disclosure can be applied to a foldable electronic device that can be folded in a lateral or longitudinal direction, or a rollable electronic device that can be rolled, or a tablet or notebook PC.
Referring to fig. 1, an electronic device 100 according to an embodiment may include a camera module 180. The camera module 180 may include a lens assembly 111, a housing 113, an IR cut filter 115, an image sensor 120, and an Image Signal Processor (ISP) 130.
According to an embodiment, in the lens assembly 111, the number, arrangement, type, and the like of lenses may vary between the front camera 161 and the rear camera. Depending on the type of lens assembly 111, the front camera 161 and the rear camera may have different characteristics (e.g., focal length, maximum magnification). A lens may be moved forward or backward along the optical axis so that, by changing the focal length, a clear image of the target object, i.e., the subject, can be captured.
According to an embodiment, the camera module 180 may include a barrel for at least one lens aligned on an optical axis, and a housing 113 for mounting a magnet and/or at least one coil around the optical axis at a circumference of the barrel. According to an embodiment, the camera module 180 may perform a stabilization function (e.g., optical Image Stabilization (OIS)) of the image acquired by the image sensor 120 using at least one coil and/or magnet included in the housing 113. For example, the camera module 180 may control the direction and/or intensity of the current passing through the at least one coil under the control of the processor to control the electromagnetic force, and may move (or rotate) the lens assembly 111 and at least a portion of the lens carrier including the lens assembly 111 in a direction substantially perpendicular to the optical axis using the lorentz force by the electromagnetic force.
According to an embodiment, the camera module 180 may use another method for the image stabilization function. For example, the camera module 180 may use digital stabilization (video digital image stabilization (VDIS)). According to an embodiment, the camera module 180 may include a method of processing the data output values of the image sensor 120 in software to perform image stabilization. For example, through VDIS, the camera module 180 may extract a motion vector based on the differences between video frames and may increase sharpness through image processing. Further, because VDIS extracts the motion vector from the video itself, motion of the subject, and not only shaking of the electronic device 100, may be recognized as shake.
According to an embodiment, the IR cut filter 115 may be disposed on the top surface of the image sensor 120. The image of the object passing through the lens may be partially filtered by the IR cut filter 115 and then sensed by the image sensor 120.
According to an embodiment, the image sensor 120 may be disposed on a top surface of a printed circuit board (PCB) 140, a printed board assembly (PBA), a flexible PCB (FPCB), or a rigid-flexible PCB (RFPCB). The image sensor 120 may be electrically connected, through a connector, to the ISP 130 connected to the PCB 140. An FPCB or a cable may be used as the connector.
According to an embodiment, the image sensor 120 may be a complementary metal oxide semiconductor (CMOS) sensor. A plurality of individual pixels are integrated in the image sensor 120, and each individual pixel may include a microlens, a color filter, and a PD. Each individual pixel is a kind of photodetector and can convert incident light into an electrical signal. The photodetector may include a PD. For example, the image sensor 120 may amplify the current that the light received through the lens assembly 111 generates in the light-receiving element via the photoelectric effect. For example, each individual pixel may include a photoelectric conversion element (or a position-sensitive photosensitive element (position sensitive detector (PSD))) and a plurality of transistors.
According to an embodiment, optical information of an object incident through the lens assembly 111 may be converted into an electrical signal by the image sensor 120 and input into the ISP 130.
According to an embodiment, when the ISP 130 and the image sensor 120 are physically separated from each other, the image sensor 120 may be electrically connected to the ISP 130 based on a sensor interface of an appropriate standard.
According to an embodiment, ISP 130 may perform image processing on the electrically converted image data. The processing in ISP 130 may be classified into a pre-ISP stage (hereinafter referred to as "preprocessing") and an ISP chain (hereinafter referred to as "post-processing"). Image processing before the demosaicing process is referred to as preprocessing, and image processing after the demosaicing process is referred to as post-processing. The preprocessing procedure may include 3A processing, lens shading correction, edge enhancement, bad pixel correction, and knee correction. 3A may include at least one of auto white balance (AWB), auto exposure (AE), and AF. The post-processing procedure may include at least one of a sensor index change, a tuning parameter change, and a screen scaling adjustment. The post-processing procedure may include processing image data output from the image sensor 120 or image data output from a scaler. The ISP 130 may adjust at least one of the contrast, sharpness, saturation, and dithering of the image through the post-processing procedure. The procedures for adjusting contrast, sharpness, or saturation may be performed in the YUV color space, and the dithering procedure may be performed in the red, green, and blue (RGB) color space. Part of the preprocessing procedure may be performed during the post-processing procedure, or part of the post-processing procedure may be performed during the preprocessing procedure. Furthermore, a portion of the preprocessing procedure may be repeated as part of the post-processing procedure.
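For illustration only, the following sketch summarizes the pre-/post-processing split described above. All stage functions are placeholder stubs with assumed names (real ISP stages are vendor specific); only the ordering around the demosaicing boundary follows the paragraph above.

```python
import numpy as np

# Placeholder stage stubs; real implementations are vendor specific.
def apply_3a(x): return x                      # AWB / AE / AF statistics
def lens_shading_correction(x): return x
def edge_enhancement(x): return x
def bad_pixel_correction(x): return x
def knee_correction(x): return x
def demosaic(bayer): return np.repeat(bayer[..., None], 3, axis=-1)
def rgb_to_yuv(rgb): return rgb                # stub color-space conversion
def adjust_contrast_sharpness_saturation(yuv): return yuv

def isp_pipeline(raw_bayer):
    # Pre-ISP ("preprocessing"): everything before demosaicing.
    x = apply_3a(raw_bayer)
    x = lens_shading_correction(x)
    x = edge_enhancement(x)
    x = bad_pixel_correction(x)
    x = knee_correction(x)
    rgb = demosaic(x)                          # the pre/post boundary
    # ISP chain ("post-processing"): contrast etc. run in the YUV space.
    yuv = rgb_to_yuv(rgb)
    return adjust_contrast_sharpness_saturation(yuv)
```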
According to an embodiment, the camera module 180 may be disposed on the front surface in addition to the rear surface of the electronic device 100. In addition, in order to improve the performance of the camera, the electronic device 100 may include not only one camera module 180 but also a plurality of camera modules 180. For example, the electronic device 100 may also include a front-facing camera 161 for video calls or self-timer shooting. The front camera 161 may support a relatively small number of pixels compared to the rear camera module. The front camera 161 may be relatively small compared to the camera module 180 of the rear camera.
Fig. 2 shows the main hardware elements of an electronic device according to an embodiment. In describing fig. 2, elements described above with reference to fig. 1 may be briefly described, or a description thereof may be omitted.
Referring to fig. 2, the electronic device 100 according to an embodiment may include a lens assembly 111, an image sensor 120, an ISP 130, a processor 210, a display 110, and a memory 220.
According to an embodiment, in the lens assembly 111, the number, arrangement, type, and the like of lenses may vary according to front cameras and rear cameras. The front camera and the rear camera may have different characteristics (e.g., focal length, maximum magnification, etc.) based on the type of lens assembly.
According to an embodiment, when ISP 130 is physically separated from image sensor 120, a standards-based sensor interface may exist.
According to an embodiment, ISP 130 may perform image processing on the electrically converted image data. The processing in ISP 130 may be classified into a pre-ISP stage (hereinafter referred to as "preprocessing") and an ISP chain (hereinafter referred to as "post-processing"). Image processing before the demosaicing process is referred to as preprocessing, and image processing after the demosaicing process is referred to as post-processing. The preprocessing procedure may include 3A processing, lens shading correction, edge enhancement, bad pixel correction, and knee correction. 3A may include at least one of AWB, AE, and AF. The post-processing procedure may include at least one of a sensor index change, a tuning parameter change, and screen scaling. The post-processing procedure may include processing image data output from the image sensor 120 or image data output from a scaler. The ISP 130 may adjust the contrast, sharpness, saturation, dithering, and the like of the image through post-processing. The procedures for adjusting contrast, sharpness, or saturation may be performed in the YUV color space, and the dithering procedure may be performed in the red, green, and blue (RGB) color space. The ISP 130 may transmit the image data obtained after the post-processing procedure to the memory 220 (e.g., a display buffer). The display 110 may display the image data stored in the memory 220 on a display screen under the control of the processor 210.
According to an embodiment, the processor 210 may perform/control various functions supported by the electronic device 100. For example, the processor 210 may execute code written in a programming language and stored in the memory 220 to execute applications and control various types of hardware. For example, the processor 210 may execute an application that supports the photographing function and is stored in the memory 220. Further, the processor 210 may operate the camera module (e.g., the camera module 180 of fig. 1), and may configure and support an appropriate photographing mode so that the camera module 180 can perform the operations intended by the user.
According to an embodiment, the memory 220 may store instructions that may be executed by the processor 210. The memory 220 may be understood as a concept that includes elements that temporarily store data, such as random access memory (RAM), and/or elements that permanently store data, such as a solid state drive (SSD). For example, the processor 210 may load instructions stored in the SSD into RAM space to implement a software module. In various embodiments, various types of memory may be included, and an appropriate type may be employed depending on the use of the device.
According to an embodiment, applications associated with the camera module 180 may be stored in the memory 220. For example, the camera application may be stored in the memory 220. Camera applications may support various shooting functions such as photography, moving image shooting, panoramic photography, and slow motion recording.
According to an embodiment, the processor 210 may display an execution screen of an application executed by the processor 210 or contents such as images and/or moving images stored in the memory 220 on the display 110. In addition, the processor 210 may display image data acquired through the camera module 180 on the display 110 in real time.
Fig. 3 is a cross-sectional view of a pixel array of an image sensor according to an embodiment.
Referring to fig. 3, the image sensor 120 may include a plurality of unit pixels 310. According to an embodiment, each of the unit pixels 310 may include at least four PDs 313. According to an embodiment, the plurality of unit pixels 310 may be located on a plane perpendicular to a Z-axis corresponding to a light incident direction. According to an embodiment, a first direction (e.g., an X-axis direction) of the plurality of unit pixels 310 may be perpendicular to a second direction (e.g., a Y-axis direction) of the unit pixels 310. According to an embodiment, the first direction (e.g., the X-axis direction) and the second direction (e.g., the Y-axis direction) may be perpendicular to the Z-axis direction.
According to an embodiment, each of the unit pixels 310 may include a micro lens 311, a color filter 312, a plurality of PDs 313, or a combination thereof. According to an embodiment, each of the plurality of PDs 313 may also be referred to as a light receiving element. According to an embodiment, the plurality of PDs 313 may also be referred to as multi-PDs.
According to an embodiment, the micro lens 311 may focus light incident on the micro lens 311. According to the embodiment, the micro lens 311 may adjust the path of light incident on the micro lens 311 so that the light reaches each of the plurality of PDs 313.
According to an embodiment, the color filter 312 may allow light having a predetermined color (or color channel) to pass therethrough. According to an embodiment, the color filter 312 of each of the plurality of PDs 313 may allow light having one color (e.g., red) of a pre-specified color (e.g., red, blue, or green) to pass through according to a pre-specified pattern (e.g., bayer pattern). According to an embodiment, the color filter 312 may block light having a color different from a pre-specified color (or color channel).
According to an embodiment, the number of the plurality of PDs 313 may be greater than or equal to 4. According to an embodiment, each of the plurality of PDs 313 may output a value corresponding to incident light. According to an embodiment, each of the plurality of PDs 313 may output a value corresponding to incident light based on the photoelectric effect. According to an embodiment, each of the plurality of PDs 313 may output a value corresponding to the intensity (or illuminance) of incident light based on the photoelectric effect.
According to an embodiment, each of the plurality of PDs 313 may generate a charge based on the intensity (or illuminance) of incident light based on the photoelectric effect. According to an embodiment, each of the plurality of PDs 313 may output a current based on the generated charge amount.
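As a numeric illustration of this idealized model (ignoring dark current and saturation; the parameter names are assumptions, not values from the disclosure):

```python
ELECTRON_CHARGE = 1.602e-19  # coulombs per electron

def pd_charge(photon_rate, quantum_efficiency, exposure_s):
    """Charge accumulated by one PD: photons/s x QE x exposure time."""
    electrons = photon_rate * quantum_efficiency * exposure_s
    return electrons * ELECTRON_CHARGE

def pd_current(charge_coulombs, readout_s):
    """Average output current while the accumulated charge is drained."""
    return charge_coulombs / readout_s
```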
Fig. 4 is a circuit diagram of a unit pixel of an image sensor according to an embodiment.
According to an embodiment, the unit pixel 310 may include a plurality of PDs 410, transfer gates (TGs) 420 corresponding to the PDs 410, an FD node 430, a source follower (SF) 450, a row selector (hereinafter referred to as "SEL") 460, and a reset gate (RST) 470.
According to an embodiment, the unit pixel 310 may include four PDs 410. For example, the unit pixel 310 may include four PDs 410 (e.g., a first PD 411, a second PD 412, a third PD 413, and a fourth PD 414) having a 2×2 array and at least one FD node 430 connected to the four PDs 410. For example, the unit pixel 310 may mean a micro lens unit or a color filter unit. Herein, the description has been made based on the unit pixel 310 including four PDs 410 having a 2×2 array. However, this is an example, and various embodiments are possible that can be implemented by those skilled in the art.
According to an embodiment, when the TG 420 is turned on, the charge accumulated in the PD 410 during the exposure time may move to the FD node 430. For example, when the first TG 421 is turned on, the charge accumulated in the first PD 411 may move to the FD node 430. According to an embodiment, the image sensor 120 may acquire analog data corresponding to the charge moved to the FD node 430. For example, the analog data may include information about the amount of charge accumulated in the PD 410 during the exposure time.
According to an embodiment, the image sensor 120 may acquire analog data through the unit pixel 310. For example, the image sensor 120 may control the TG 420 to acquire analog data corresponding to the light amount data acquired by the at least one PD 410. For example, the image sensor 120 may acquire light amount data through the first PD 411, the second PD 412, the third PD 413, and the fourth PD 414 during the exposure time. When the image sensor 120 turns on the first TG 421, the image sensor 120 may acquire analog data based on the light amount data acquired by the first PD 411. When the image sensor 120 turns on the first TG 421, the second TG 422, the third TG 423, and the fourth TG 424, the image sensor 120 may acquire analog data based on the light amount data acquired through the first PD 411, the second PD 412, the third PD 413, and the fourth PD 414.
According to an embodiment, the image sensor 120 may acquire analog data based on light amount data acquired through one of the four PDs 410. In another embodiment, the image sensor 120 may also acquire analog data based on light amount data acquired by at least two PDs of the four PDs 410. For example, it is also understood that the image sensor 120 acquires analog data through the unit pixel 310.
According to an embodiment, the charge stored in the FD node 430 may be read out through the SF 450 and output as an electrical signal. According to an embodiment, the image sensor 120 may digitally convert the analog data through an ADC to obtain digital data. The digital data may be understood as image data.
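The transfer-gate behavior above, together with the constraint noted earlier that a PD's charge cannot be read out twice, can be modeled with a small sketch (class and method names are illustrative assumptions, and transfers are idealized as lossless):

```python
class UnitPixel:
    """Idealized 4PD/1FD unit pixel. Once a PD's charge is transferred
    to the FD node it is gone, so it cannot be read out a second time."""

    def __init__(self, pd_charges):
        self.pd = list(pd_charges)   # accumulated charge per PD
        self.fd = 0.0                # floating diffusion node

    def transfer(self, indices):
        """Turn on the TGs at `indices`; their charges sum on the FD node."""
        for i in indices:
            self.fd += self.pd[i]
            self.pd[i] = 0.0

    def convert(self):
        """One ADC: digitize the FD charge, then reset the node."""
        value, self.fd = self.fd, 0.0
        return value

# Reading TL(0) and BL(1) together, then TR(2), then BR(3): three ADCs.
px = UnitPixel([10.0, 12.0, 11.0, 9.0])
px.transfer([0, 1]); adc1 = px.convert()   # 22.0
px.transfer([2]);    adc2 = px.convert()   # 11.0
px.transfer([3]);    adc3 = px.convert()   # 9.0
```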
According to an embodiment, the image sensor 120 may switch the SEL 460 from the off state to the on state in order to output image data of a specific line.
According to an embodiment, the image sensor 120 may perform a Correlated Double Sampling (CDS) operation in order to reduce noise. For example, the image sensor 120 may turn on the RST 470 to reset data accumulated in the FD node 430, and may read out reset data remaining after the reset. The image sensor 120 may turn off the RST 470 and then move the charges accumulated in the PD 410 to the FD node 430, and may read out the charges moved to the FD node 430 to acquire read-out data.
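A minimal sketch of the CDS arithmetic described above (the digital-number values are made up for the example):

```python
def correlated_double_sampling(reset_sample, signal_sample):
    """CDS: subtract the read-out reset level from the read-out signal
    level so that reset (kTC) noise and fixed offset cancel."""
    return signal_sample - reset_sample

# e.g. a reset level of 3 read after RST, a level of 25 read after transfer
pixel_value = correlated_double_sampling(3, 25)   # -> 22
```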
Fig. 5 is a flowchart showing operations performed by a processor for AF in an electronic device according to an embodiment.
Referring to fig. 5, in operation 510, the processor 210 according to an embodiment may perform a first ADC in a first unit pixel among a plurality of unit pixels by reading out a first PD group included in the first unit pixel and a second PD group adjacent to the first PD group in a first direction (e.g., a vertical direction).
According to an embodiment, among a plurality of unit pixels included in the image sensor 120, the first PD group may be placed at a first position, and the second PD group may be disposed adjacent to the first PD group in a first direction, the first direction being a vertical direction downward from the first PD group. According to an embodiment, the processor 210 may convert analog data acquired through a plurality of PDs included in the first PD group and the second PD group into digital data.
According to an embodiment, the processor 210 may perform the second ADC by reading out a third PD group adjacent to the first PD group in the second direction in operation 520. According to an embodiment, among a plurality of unit pixels included in the image sensor 120, the first PD group may be disposed at a first position, and the third PD group may be disposed adjacent to the first PD group in a second direction, the second direction being a horizontal direction from the first PD group to the right. According to an embodiment, the processor 210 may convert analog data acquired through the plurality of PDs included in the third PD group into digital data.
According to an embodiment, the processor 210 may perform the third ADC by reading out a fourth PD group adjacent to the second PD group in the second direction in operation 530. According to an embodiment, among the plurality of unit pixels included in the image sensor 120, the fourth PD group may be disposed adjacent to the second PD group in the second direction, which is a horizontal direction from the second PD group to the right. According to an embodiment, the processor 210 may convert analog data acquired through the plurality of PDs included in the fourth PD group into digital data.
According to an embodiment, in operation 540, the processor 210 may detect a first phase difference in a second direction (e.g., a horizontal direction) based on a result of executing the first ADC, the second ADC, and the third ADC. According to an embodiment, the processor 210 may detect the phase difference in the second direction by using first ADC data obtained by reading out the first PD group and a second PD group adjacent to the first PD group in the first direction, second ADC data obtained by reading out a third PD group adjacent to the first PD group in the second direction (e.g., horizontal direction), and third ADC data obtained by reading out a fourth PD group adjacent to the second PD group in the second direction.
According to an embodiment, the processor 210 may detect a second phase difference in a first direction (e.g., a vertical direction) based on a result of executing the second ADC and the third ADC in operation 550. The phase difference in the first direction may be detected by using second ADC data obtained by reading out a third PD group adjacent to the first PD group in the second direction, and by using third ADC data obtained by reading out a fourth PD group adjacent to the second PD group in the second direction.
According to an embodiment, the processor 210 may perform an AF function based on the first phase difference and the second phase difference. According to an embodiment, when it is determined that there is a first phase difference in the first direction and/or a second phase difference in the second direction, the processor 210 may perform an AF function based on the first phase difference and the second phase difference. According to an embodiment, the processor 210 may acquire information about a position of the focus, a direction of the focus, or a distance between the object and the image sensor 120 based on the first phase difference and/or the second phase difference. According to an embodiment, the processor 210 may output a control signal for moving the lens position based on the first phase difference and/or the second phase difference.
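For illustration, a hedged sketch of how the two detected phase differences might drive the AF actuator; the reliability weighting and the defocus-per-pixel calibration constant are assumptions, not parameters given in the disclosure:

```python
def combine_phase_differences(pd_h, pd_v, rel_h=1.0, rel_v=1.0):
    """Reliability-weighted average of the two directional detections."""
    total = rel_h + rel_v
    return (pd_h * rel_h + pd_v * rel_v) / total if total else 0.0

def lens_move(phase_diff_px, defocus_per_px):
    """Map a phase difference (pixels of disparity) to an actuator step;
    defocus_per_px is a hypothetical per-module calibration constant."""
    return -defocus_per_px * phase_diff_px

step = lens_move(combine_phase_differences(1.8, -0.4), defocus_per_px=12.5)
```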
Fig. 6A illustrates performing an ADC when the image sensor of an electronic device includes a unit pixel having four PDs in a 2×2 array, according to an embodiment. Fig. 6B illustrates performing an ADC when 16 PDs in a 4×4 array share the same color filter in the image sensor of an electronic device, according to an embodiment.
Referring to fig. 6A, the image sensor 120 according to the embodiment may acquire analog data through the unit pixel 310. For example, the image sensor 120 may acquire analog data through PDs (e.g., the first PD 315TL, the second PD 315BL, the third PD 315TR, and the fourth PD 315 BR) included in the unit pixel 310. According to the embodiment, the image sensor 120 may acquire analog data based on light amount data acquired by at least one PD of four PDs (the first PD 315TL, the second PD 315BL, the third PD 315TR, and the fourth PD 315 BR) included in the unit pixel 310 and having a 2×2 array.
According to an embodiment, in the unit pixel 310, the processor 210 may perform the first ADC by reading out the first PD 315TL and the second PD 315BL adjacent to the first PD 315TL in the first direction (e.g., the vertical direction). According to an embodiment, the processor 210 may perform the second ADC by reading out the third PD 315TR adjacent to the first PD 315TL in the second direction (e.g., the horizontal direction). According to an embodiment, the processor 210 may perform the third ADC by reading out the fourth PD 315BR adjacent to the second PD 315BL in the second direction. According to the embodiment, when there are a plurality of unit pixels, the operations of performing the first ADC, the second ADC, and the third ADC may be applied to all the unit pixels.
Referring to fig. 6B, the image sensor 120 according to the embodiment may acquire analog data through the unit pixel 310. For example, the image sensor 120 may acquire analog data through the PD 612, the PD 612 being included in the four unit pixels 310 and sharing the same color filter 610. According to an embodiment, the image sensor 120 may acquire analog data based on light amount data acquired through at least one PD of the 16 PDs 612 included in the four unit pixels 310.
According to an embodiment, among the PDs 612 sharing one color filter 610, the processor 210 may perform the first ADC by reading out a first PD group and a second PD group adjacent to the first PD group in the first direction (e.g., the vertical direction). According to an embodiment, the processor 210 may perform the second ADC by reading out a third PD group adjacent to the first PD group in the second direction (e.g., the horizontal direction). According to an embodiment, the processor 210 may perform the third ADC by reading out a fourth PD group adjacent to the second PD group in the second direction. According to an embodiment, when there are a plurality of unit pixels, the operations of performing the first ADC, the second ADC, and the third ADC may be applied to all the unit pixels.
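The same three-conversion pattern extends from single PDs to PD groups. Below is a sketch for the shared-color-filter case of fig. 6B, where each "group" is a list of PD coordinates; the quadrant grouping shown is one plausible assumption, not the only arrangement the disclosure permits:

```python
import numpy as np

def read_groups_three_adc(pds, group1, group2, group3, group4):
    """Each group is a list of (row, col) PD positions within one
    shared-color-filter block; groups 1 and 2 are summed in one ADC."""
    total = lambda g: float(sum(pds[r, c] for r, c in g))
    return (total(group1) + total(group2),  # first ADC
            total(group3),                  # second ADC
            total(group4))                  # third ADC

block = np.arange(16.0).reshape(4, 4)       # 4x4 PDs under one color filter
tl = [(r, c) for r in (0, 1) for c in (0, 1)]
bl = [(r, c) for r in (2, 3) for c in (0, 1)]
tr = [(r, c) for r in (0, 1) for c in (2, 3)]
br = [(r, c) for r in (2, 3) for c in (2, 3)]
adc1, adc2, adc3 = read_groups_three_adc(block, tl, bl, tr, br)
```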
Fig. 7 illustrates performing an ADC in an electronic device, according to various embodiments.
Referring to fig. 7, the image sensor 120 of the electronic device 100 according to the embodiment may include a plurality of unit pixels 310. According to an embodiment, four unit pixels 310 may share the same color filter (e.g., color filter 610 in fig. 6B). According to an embodiment, each unit pixel 310 may include at least four PDs. According to an embodiment, the unit pixel 310 may include a first PD 315TL, a second PD 315BL adjacent to the first PD 315TL in a first direction (e.g., a vertical direction), a third PD 315TR adjacent to the first PD 315TL in a second direction (e.g., a horizontal direction), and a fourth PD 315BR adjacent to the second PD 315BL in the second direction.
The processor 210 according to various embodiments may determine the order in which the PDs (e.g., the first PD 315TL, the second PD 315BL, the third PD 315TR, and the fourth PD 315BR) included in the unit pixel 310 are read out. According to various embodiments, depending on that order, the processor 210 may read out the PDs (315TL, 315BL, 315TR, 315BR) included in the unit pixel 310 in one of a first mode, a second mode, a third mode, and a fourth mode.
In the first mode, the processor 210 according to the embodiment may perform the first ADC by reading out the third PD 315TR and the fourth PD 315BR, may perform the second ADC by reading out the first PD 315TL, and may perform the third ADC by reading out the second PD 315 BL.
In the second mode, the processor 210 according to the embodiment may perform the first ADC by reading out the first PD 315TL and the second PD 315BL, may perform the second ADC by reading out the third PD 315TR, and may perform the third ADC by reading out the fourth PD 315 BR.
In the third mode, the processor 210 according to the embodiment may perform the first ADC by reading out the second PD 315BL and the fourth PD 315BR, may perform the second ADC by reading out the first PD 315TL, and may perform the third ADC by reading out the third PD 315 TR.
In the fourth mode, the processor 210 according to the embodiment may perform the first ADC by reading out the first PD 315TL and the third PD 315TR, may perform the second ADC by reading out the second PD 315BL, and may perform the third ADC by reading out the fourth PD 315 BR.
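The four orders above can be tabulated in code; a sketch, with the mode numbering following the four paragraphs above:

```python
# Which PDs feed each of the three conversions in the four modes of fig. 7.
# TL=top-left, BL=bottom-left, TR=top-right, BR=bottom-right.
READOUT_MODES = {
    1: (("TR", "BR"), ("TL",), ("BL",)),   # right column summed first
    2: (("TL", "BL"), ("TR",), ("BR",)),   # left column summed first
    3: (("BL", "BR"), ("TL",), ("TR",)),   # bottom row summed first
    4: (("TL", "TR"), ("BL",), ("BR",)),   # top row summed first
}

def run_mode(mode, pd):
    """pd maps a PD name to its accumulated value; returns the three ADCs."""
    return tuple(sum(pd[name] for name in group)
                 for group in READOUT_MODES[mode])

adcs = run_mode(2, {"TL": 10.0, "BL": 12.0, "TR": 11.0, "BR": 9.0})
```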
Fig. 8 illustrates reading out pixels in different orders for different unit pixel groups in an electronic device according to an embodiment.
Referring to fig. 8, in the electronic device 100 according to an embodiment, the image sensor 120 may include a plurality of unit pixels. According to an embodiment, for the PDs included in a first unit pixel 810 among the plurality of unit pixels, a first ADC may be performed by reading out a first PD group 315TL of the first unit pixel 810 and a second PD group 315BL of the first unit pixel 810, the second PD group 315BL being adjacent to the first PD group 315TL in the first direction (e.g., the vertical direction); a second ADC may be performed by reading out a third PD group 315TR of the first unit pixel 810, the third PD group 315TR being adjacent to the first PD group 315TL in the second direction (e.g., the horizontal direction); and a third ADC may be performed by reading out a fourth PD group 315BR of the first unit pixel 810, the fourth PD group 315BR being adjacent to the second PD group 315BL in the second direction.
According to an embodiment, for the PDs included in a second unit pixel 820 among the plurality of unit pixels, a fourth ADC may be performed by reading out a first PD group 325TL of the second unit pixel 820 and a third PD group 325TR of the second unit pixel 820, the third PD group 325TR being adjacent to the first PD group 325TL in the second direction; a fifth ADC may be performed by reading out a second PD group 325BL of the second unit pixel 820, the second PD group 325BL being adjacent to the first PD group 325TL in the first direction; and a sixth ADC may be performed by reading out a fourth PD group 325BR of the second unit pixel 820, the fourth PD group 325BR being adjacent to the second PD group 325BL in the second direction.
According to an embodiment, in the first unit pixel 810, the processor 210 may detect the first phase difference in the second direction based on the results of performing the first ADC, the second ADC, and the third ADC, and may detect the second phase difference in the first direction based on the results of performing the second ADC and the third ADC.
According to an embodiment, in the second unit pixel 820, the processor 210 may detect a third phase difference in the first direction based on the results of performing the fourth ADC, the fifth ADC, and the sixth ADC, and may detect a fourth phase difference in the second direction based on the results of performing the fifth ADC and the sixth ADC.
According to an embodiment, the processor 210 may perform the AF function based on the first phase difference, the second phase difference, the third phase difference, and the fourth phase difference.
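One plausible way to assign the orders of fig. 8 across the array is a checkerboard of the modes of fig. 7; the checkerboard rule itself is an assumption for illustration:

```python
def mode_for_unit_pixel(row, col):
    """Alternate modes 2 and 4 of fig. 7 between neighboring unit pixels
    so that both full-aperture directions are covered across the array."""
    return 2 if (row + col) % 2 == 0 else 4
```

Unit pixels read in mode 2 contribute a full-aperture horizontal phase difference (and a half-aperture vertical one); mode-4 pixels contribute the converse, so the four phase differences of fig. 8 are all available from neighboring pixels.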
Fig. 9 is a flowchart illustrating a method of determining an ADC mode by comparing reliability in a first direction and reliability in a second direction in an electronic device, according to an embodiment. With respect to fig. 9, descriptions identical or similar to those given above may be abbreviated or omitted.
Referring to fig. 9, in operation 901, the processor 210 according to an embodiment may acquire a previous frame from the image sensor 120. According to an embodiment, the processor 210 may receive a frame previously acquired by the image sensor 120.
According to an embodiment, in operation 903, the processor 210 may analyze the previous image frame to determine the reliability in the first direction and in the second direction. According to an embodiment, the processor 210 may analyze a frame previously acquired by the image sensor 120 to determine the reliability in the first direction (e.g., the vertical direction) and in the second direction (e.g., the horizontal direction). According to an embodiment, the processor 210 may determine the reliability in the first direction and the reliability in the second direction based on the contrast information in the first direction and the contrast information in the second direction, respectively. According to an embodiment, based on the contrast information, the processor 210 may determine that the greater the contrast, the higher the reliability.
According to an embodiment, in operation 905, the processor 210 may determine whether the reliability in the first direction is higher than the reliability in the second direction. According to an embodiment, the processor 210 may compare the reliability in the first direction and the reliability in the second direction based on the contrast information in the first direction and the contrast information in the second direction.
According to an embodiment, when it is determined that the reliability in the first direction is higher than the reliability in the second direction, the processor 210 may perform the readout in the first mode in operation 907. According to an embodiment, the processor 210 may perform the readout in the first mode when it is determined that the contrast in the first direction (e.g., the vertical direction) is greater than the contrast in the second direction (e.g., the horizontal direction). For example, the first mode may include the processor 210 executing the first ADC by reading out the first PD group and a third PD group adjacent to the first PD group in the second direction, executing the second ADC by reading out the second PD group adjacent to the first PD group in the first direction, executing the third ADC by reading out a fourth PD group adjacent to the second PD group in the second direction, detecting a phase difference in the first direction based on a result of executing the first ADC, the second ADC, and the third ADC, and detecting a phase difference in the second direction based on a result of executing the second ADC and the third ADC.
According to an embodiment, when it is not determined that the reliability in the first direction is higher than the reliability in the second direction, the processor 210 may perform the readout in the second mode in operation 909. According to an embodiment, the processor 210 may perform the readout in the second mode when it is determined that the contrast in the first direction is smaller than the contrast in the second direction. For example, in the second mode, the processor 210 may perform the first ADC by reading out the first PD group and the second PD group adjacent to the first PD group in the first direction, perform the second ADC by reading out the third PD group adjacent to the first PD group in the second direction, perform the third ADC by reading out the fourth PD group adjacent to the second PD group in the second direction, detect a phase difference in the second direction based on a result of performing the first ADC, the second ADC, and the third ADC, and detect a phase difference in the first direction based on a result of performing the second ADC and the third ADC.
According to an embodiment, the processor 210 may output a current image frame in operation 911. According to an embodiment, the processor 210 may output the current image frame based on a result of performing the readout in the first mode or the second mode.
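A sketch of the decision in operations 903 to 909; using the mean gradient magnitude of the previous frame as the contrast (reliability) measure is an assumption made for the example:

```python
import numpy as np

def choose_readout_mode(prev_frame):
    """prev_frame: 2-D luminance array of the previously acquired frame.
    Pick the mode whose summed pair keeps full aperture along the more
    reliable (higher-contrast) axis."""
    gy, gx = np.gradient(prev_frame.astype(float))
    rel_first = np.abs(gy).mean()    # contrast along the first (vertical) axis
    rel_second = np.abs(gx).mean()   # contrast along the second (horizontal) axis
    # The first mode sums the horizontally adjacent pair, keeping the
    # first-direction phase difference at full aperture; the second mode
    # does the converse.
    return "first" if rel_first > rel_second else "second"
```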
Fig. 10 is a graph showing parallax based on the position of an actuator in an electronic device according to an embodiment.
Referring to fig. 10, the X-axis represents the position of the actuator, and the Y-axis represents the parallax at that actuator position.
According to an embodiment, ISP 130 may store image data (RDATA) in memory 220 and may use the position information (COOR) of the phase detection pixels to extract phase detection pixel data from the image data (RDATA) stored in memory 220. According to an embodiment, the ISP 130 may calculate the parallax of each phase detection pixel from the phase detection pixel data.
Fig. 10 shows parallax as a function of actuator position when, in a specific environment (e.g., an indoor environment), the ISP 130 adjusts the focus position with respect to a first direction (e.g., a vertical direction) or a second direction (e.g., a horizontal direction) by using the actuator, and a phase difference is detected by using the PDs included in a unit pixel of the image sensor 120. For example, fig. 10 compares the parallax based on actuator position when the phase difference is detected by using all the PDs included in the unit pixel with the parallax when the phase difference is detected by using half of the PDs. According to an embodiment, the parallax obtained when the ISP 130 detects the phase difference by using all the PDs included in the unit pixel may be the same as the parallax obtained by using half of the PDs, or the difference between the two may be less than a threshold. Accordingly, there may be no large difference in AF performance between detecting the left, right, up, and down phase differences by performing the ADC three times and detecting them by performing the ADC four times.
Fig. 11 is a block diagram illustrating an electronic device 1101 in a network environment 1100 in accordance with various embodiments. Referring to fig. 11, an electronic device 1101 in a network environment 1100 may communicate with the electronic device 1102 via a first network 1198 (e.g., a short-range wireless communication network) or with the electronic device 1104 or server 1108 via a second network 1199 (e.g., a long-range wireless communication network). According to an embodiment, electronic device 1101 may communicate with electronic device 1104 via server 1108. According to an embodiment, the electronic device 1101 may include a processor 1120, a memory 1130, an input device 1150, a sound output device 1155, a display device 1160, an audio module 1170, a sensor module 1176, an interface 1177, a haptic module 1179, a camera module 1180, a power management module 1188, a battery 1189, a communication module 1190, a Subscriber Identity Module (SIM) 1196, or an antenna module 1197. In some embodiments, at least one of the components (e.g., display device 1160 or camera module 1180) may be omitted from electronic device 1101, or one or more other components may be added to electronic device 1101. In some embodiments, some of the components may be implemented as a single integrated circuit. For example, the sensor module 1176 (e.g., a fingerprint sensor, iris sensor, or illuminance sensor) may be implemented embedded in the display device 1160 (e.g., a display).
The processor 1120 may run, for example, software (e.g., program 1140) to control at least one other component (e.g., hardware component or software component) of the electronic device 1101 that is connected to the processor 1120, and may perform various data processing or calculations. According to one embodiment, as at least part of the data processing or calculation, the processor 1120 may load commands or data received from another component (e.g., the sensor module 1176 or the communication module 1190) into the volatile memory 1132, process the commands or data stored in the volatile memory 1132, and store the resulting data in the non-volatile memory 1134. According to an embodiment, the processor 1120 may include a main processor 1121 (e.g., a Central Processing Unit (CPU) or an Application Processor (AP)) and an auxiliary processor 1123 (e.g., a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a sensor hub processor or a Communication Processor (CP)) that is operatively independent or combined with the main processor 1121. Additionally or alternatively, the secondary processor 1123 may be adapted to consume less power than the primary processor 1121, or to be specifically adapted for a specified function. The auxiliary processor 1123 may be implemented separately from the main processor 1121 or as part of the main processor 1121.
The auxiliary processor 1123 may control at least some of the functions or states associated with at least one of the components of the electronic device 1101 (rather than the main processor 1121) (e.g., the display device 1160, the sensor module 1176, or the communication module 1190) while the main processor 1121 is in an inactive (e.g., sleep) state, or the auxiliary processor 1123 may control at least some of the functions or states associated with at least one of the components of the electronic device 1101 (e.g., the display device 1160, the sensor module 1176, or the communication module 1190) with the main processor 1121 while the main processor 1121 is in an active state (e.g., running an application). According to an embodiment, the auxiliary processor 1123 (e.g., an image signal processor or a communication processor) may be implemented as part of another component (e.g., the camera module 1180 or the communication module 1190) functionally related to the auxiliary processor 1123.
The memory 1130 may store various data used by at least one component of the electronic device 1101 (e.g., the processor 1120 or the sensor module 1176). The various data may include, for example, software (e.g., program 1140) and input data or output data for commands associated therewith. Memory 1130 may include volatile memory 1132 or nonvolatile memory 1134.
Program 1140 may be stored as software in memory 1130 and program 1140 may include, for example, an Operating System (OS) 1142, middleware 1144, or applications 1146.
The input device 1150 may receive commands or data from outside the electronic device 1101 (e.g., a user) to be used by other components of the electronic device 1101 (e.g., the processor 1120). The input device 1150 may include, for example, a microphone, a mouse, a keyboard, or a digital pen (e.g., a stylus).
The sound output device 1155 may output a sound signal to the outside of the electronic device 1101. The sound output device 1155 may include, for example, a speaker or a receiver. The speaker may be used for general purposes, such as playing multimedia or playing a recording, and the receiver may be used for incoming calls. Depending on the embodiment, the receiver may be implemented separately from the speaker or as part of the speaker.
The display device 1160 may visually provide information to an outside (e.g., a user) of the electronic device 1101. The display device 1160 may comprise, for example, a display, holographic device, or projector, and control circuitry for controlling a corresponding one of the display, holographic device, and projector. According to an embodiment, the display device 1160 may comprise touch circuitry adapted to detect touches or sensor circuitry (e.g., a pressure sensor) adapted to measure the strength of the force caused by touches.
The audio module 1170 may convert sound into an electrical signal and vice versa. According to an embodiment, the audio module 1170 may obtain sound via the input device 1150, or output sound via the sound output device 1155 or headphones of an external electronic device (e.g., the electronic device 1102) that is directly (e.g., wired) or wirelessly connected to the electronic device 1101.
The sensor module 1176 may detect an operational state (e.g., power or temperature) of the electronic device 1101 or an environmental state (e.g., a state of a user) external to the electronic device 1101 and then generate an electrical signal or data value corresponding to the detected state. According to an embodiment, the sensor module 1176 may include, for example, a gesture sensor, a gyroscope sensor, an atmospheric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an Infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
The interface 1177 may support one or more specific protocols that will be used to connect the electronic device 1101 with an external electronic device (e.g., the electronic device 1102), either directly (e.g., wired) or wirelessly. According to an embodiment, interface 1177 may include, for example, a High Definition Multimedia Interface (HDMI), a Universal Serial Bus (USB) interface, a Secure Digital (SD) card interface, or an audio interface.
The connecting terminal 1178 may include a connector via which the electronic device 1101 may be physically connected with an external electronic device (e.g., the electronic device 1102). According to an embodiment, the connecting terminal 1178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (e.g., a headphone connector).
The haptic module 1179 may convert an electrical signal into a mechanical stimulus (e.g., vibration or motion) or an electrical stimulus that can be recognized by the user via his sense of touch or kinesthetic sense. According to an embodiment, the haptic module 1179 may include, for example, a motor, a piezoelectric element, or an electrical stimulator.
The camera module 1180 may capture still images or moving images. According to an embodiment, the camera module 1180 may include one or more lenses, image sensors, image signal processors, or flashes.
The power management module 1188 may manage power to the electronic device 1101. According to an embodiment, the power management module 1188 may be implemented as at least part of, for example, a Power Management Integrated Circuit (PMIC).
A battery 1189 may power at least one component of the electronic device 1101. According to an embodiment, the battery 1189 may include, for example, a non-rechargeable primary battery, a rechargeable secondary battery, or a fuel cell.
The communication module 1190 may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 1101 and an external electronic device (e.g., the electronic device 1102, the electronic device 1104, or the server 1108) and perform communication via the established communication channel. The communication module 1190 may include one or more communication processors capable of operating independently of the processor 1120 (e.g., an Application Processor (AP)) and supporting direct (e.g., wired) or wireless communication. According to an embodiment, the communication module 1190 may include a wireless communication module 1192 (e.g., a cellular communication module, a short-range wireless communication module, or a Global Navigation Satellite System (GNSS) communication module) or a wired communication module 1194 (e.g., a Local Area Network (LAN) communication module or a Power Line Communication (PLC) module). A respective one of these communication modules may communicate with external electronic devices via a first network 1198 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (Wi-Fi) direct, or infrared data association (IrDA)) or a second network 1199 (e.g., a long-range communication network such as a cellular network, the internet, or a computer network (e.g., a LAN or Wide Area Network (WAN))). These various types of communication modules may be implemented as a single component (e.g., a single chip), or may be implemented as multiple components (e.g., multiple chips) separate from each other. The wireless communication module 1192 may use user information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identity module 1196 to identify and authenticate the electronic device 1101 in a communication network, such as the first network 1198 or the second network 1199.
The wireless communication module 1192 may support a 5G network after a fourth generation (4G) network and next-generation communication technologies such as New Radio (NR) access technology. The NR access technology may support enhanced mobile broadband (eMBB), massive machine type communication (mMTC), or Ultra Reliable Low Latency Communication (URLLC). The wireless communication module 1192 may support a high frequency band (e.g., the millimeter wave band) to achieve, for example, high data transmission rates. The wireless communication module 1192 may support various techniques for ensuring performance over high frequency bands, such as beamforming, massive multiple-input multiple-output (massive MIMO), full-dimensional MIMO (FD-MIMO), array antennas, analog beamforming, or large-scale antennas. The wireless communication module 1192 may support various requirements specified in the electronic device 1101, an external electronic device (e.g., the electronic device 1104), or a network system (e.g., the second network 1199). According to one embodiment, the wireless communication module 1192 may support a peak data rate (e.g., 20 Gbps or higher) for implementing eMBB, coverage (e.g., 164 dB or lower) for implementing mMTC, or U-plane latency (e.g., 0.5 ms or less for each of the Downlink (DL) and Uplink (UL), or a round trip of 1 ms or less) for implementing URLLC.
The antenna module 1197 may transmit signals or power to or receive signals or power from outside of the electronic device 1101 (e.g., an external electronic device). According to an embodiment, the antenna module 1197 may include an antenna that includes a radiating element composed of a conductive material or conductive pattern formed in or on a substrate (e.g., PCB). According to an embodiment, the antenna module 1197 may include multiple antennas. In this case, at least one antenna of the plurality of antennas suitable for a communication scheme used in a communication network, such as the first network 1198 or the second network 1199, may be selected by, for example, the communication module 1190 (e.g., the wireless communication module 1192). Signals or power may then be transmitted or received between the communication module 1190 and the external electronic device via the selected at least one antenna. According to embodiments, further components (e.g., a Radio Frequency Integrated Circuit (RFIC)) other than radiating elements may additionally be formed as part of the antenna module 1197.
According to various embodiments, antenna module 1197 may form a millimeter wave antenna module. According to one embodiment, a millimeter wave antenna module may include a PCB, an RFIC disposed on or adjacent to a first surface (e.g., a bottom surface) of the PCB and capable of supporting a specified high frequency band (e.g., millimeter wave band), and a plurality of antennas (e.g., array antennas) disposed on or adjacent to a second surface (e.g., a top surface or a side surface) of the PCB and capable of transmitting or receiving signals of the specified high frequency band.
At least some of the above components may be interconnected via an inter-peripheral communication scheme (e.g., a bus, General Purpose Input Output (GPIO), Serial Peripheral Interface (SPI), or Mobile Industry Processor Interface (MIPI)) and communicate signals (e.g., commands or data) between them.
According to an embodiment, commands or data may be transmitted or received between the electronic device 1101 and the external electronic device 1104 via the server 1108 connected to the second network 1199. Each of the electronic device 1102 and the electronic device 1104 may be a device of the same type as, or a different type from, the electronic device 1101. According to embodiments, all or some of the operations to be performed at the electronic device 1101 may be performed at one or more of the external electronic device 1102, the external electronic device 1104, or the server 1108. For example, if the electronic device 1101 should perform a function or a service automatically, or in response to a request from a user or another device, the electronic device 1101, instead of or in addition to executing the function or the service itself, may request the one or more external electronic devices to perform at least part of the function or the service. The one or more external electronic devices that received the request may perform the requested at least part of the function or the service, or an additional function or an additional service related to the request, and transmit a result of the performing to the electronic device 1101. The electronic device 1101 may provide the result, with or without further processing of the result, as at least part of a reply to the request. To this end, cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology, for example, may be used. The electronic device 1101 may provide ultra-low-latency services using, for example, distributed computing or mobile edge computing. In another embodiment, the external electronic device 1104 may include an internet of things (IoT) device. The server 1108 may be an intelligent server using machine learning and/or a neural network. According to an embodiment, the external electronic device 1104 or the server 1108 may be included in the second network 1199. The electronic device 1101 may be applied to intelligent services (e.g., smart home, smart city, smart car, or healthcare) based on 5G communication technology or IoT-related technology.
The electronic device according to various embodiments may be one of various types of electronic devices. The electronic device may include, for example, a portable communication device (e.g., a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a household appliance. According to the embodiments of the present disclosure, the electronic devices are not limited to those described above.
It should be understood that the various embodiments of the disclosure and the terminology used therein are not intended to limit the technical features set forth herein to the particular embodiments, but rather include various modifications, equivalents, or alternatives of the respective embodiments. For the description of the drawings, like reference numerals may be used to refer to like or related elements. It will be understood that a noun in the singular corresponding to a term may include one or more things unless the context clearly indicates otherwise. As used herein, each of the phrases such as "A or B", "at least one of A and B", "at least one of A or B", "A, B, or C", "at least one of A, B, and C", and "at least one of A, B, or C" may include any or all possible combinations of the items enumerated together in a corresponding one of the phrases. As used herein, terms such as "1st" and "2nd" or "first" and "second" may be used to simply distinguish one element from another element, and do not limit the elements in other respects (e.g., importance or order). It will be understood that if an element (e.g., a first element) is referred to, with or without the term "operatively" or "communicatively", as "coupled with", "coupled to", "connected with", or "connected to" another element (e.g., a second element), the element may be coupled with the other element directly (e.g., wired), wirelessly, or via a third element.
As used herein, the term "module" may include units implemented in hardware, software, or firmware, and may be used interchangeably with other terms (e.g., "logic," "logic block," "portion" or "circuitry"). A module may be a single integrated component adapted to perform one or more functions or a minimal unit or portion of the single integrated component. For example, according to an embodiment, a module may be implemented in the form of an Application Specific Integrated Circuit (ASIC).
The various embodiments set forth herein may be implemented as software (e.g., program 1140) comprising one or more instructions stored in a storage medium (e.g., internal memory 1136 or external memory 1138) readable by a machine (e.g., electronic device 1101). For example, a processor (e.g., processor 1120) of the machine (e.g., electronic device 1101) may invoke at least one of the one or more instructions stored in the storage medium and execute it, with or without using one or more other components, under control of the processor. This enables the machine to operate to perform at least one function in accordance with the at least one instruction invoked. The one or more instructions may include code generated by a compiler or code capable of being executed by an interpreter. The machine-readable storage medium may be provided in the form of a non-transitory storage medium. Here, the term "non-transitory" simply means that the storage medium is a tangible device and does not include signals (e.g., electromagnetic waves), but the term does not distinguish between data being semi-permanently stored in the storage medium and data being temporarily stored in the storage medium.
According to embodiments, methods according to various embodiments of the present disclosure may be included and provided in a computer program product. The computer program product may be traded as a product between a seller and a buyer. The computer program product may be distributed in the form of a machine-readable storage medium (e.g., a compact disc read-only memory (CD-ROM)), or may be distributed (e.g., downloaded or uploaded) online via an application store (e.g., PlayStore™), or distributed directly between two user devices (e.g., smartphones). If distributed online, at least part of the computer program product (e.g., a downloadable application) may be temporarily generated, or at least temporarily stored, in a machine-readable storage medium, such as a memory of the manufacturer's server, a server of the application store, or a relay server.
According to various embodiments, each of the above-described components (e.g., a module or program) may include a single entity or multiple entities. According to various embodiments, one or more of the above components may be omitted, or one or more other components may be added. Alternatively or additionally, multiple components (e.g., modules or programs) may be integrated into a single component. In this case, according to various embodiments, the integrated component may still perform the one or more functions of each of the plurality of components in the same or similar manner as the corresponding one of the plurality of components performed the one or more functions prior to integration. According to various embodiments, operations performed by a module, a program, or another component may be performed sequentially, in parallel, repeatedly, or in a heuristic manner, or one or more of the operations may be performed in a different order or omitted, or one or more other operations may be added.
Fig. 12 is a block diagram 1200 illustrating a camera module 1180 in accordance with various embodiments. Referring to fig. 12, a camera module 1180 may include a lens assembly 1210, a flash 1220, an image sensor 1230, an image stabilizer 1240, a memory 1250 (e.g., a buffer memory), or an ISP 1260. The lens assembly 1210 may collect light emitted or reflected from an object to be photographed. The lens assembly 1210 may include one or more lenses. According to an embodiment, the camera module 1180 may include a plurality of lens assemblies 1210. In this case, the camera module 1180 may form, for example, a dual camera, a 360 degree camera, or a spherical camera. Some of the plurality of lens assemblies 1210 may have the same lens properties (e.g., viewing angle, focal length, auto-focus, f-number, or optical zoom), or at least one lens assembly may have one or more lens properties that are different from the lens properties of the other lens assemblies. The lens assembly 1210 may include, for example, a wide angle lens or a telephoto lens.
The flash 1220 may emit light, wherein the emitted light is used to enhance the light reflected from the object. According to an embodiment, flash 1220 may include one or more Light Emitting Diodes (LEDs) (e.g., red Green Blue (RGB) LEDs, white LEDs, infrared (IR) LEDs, or Ultraviolet (UV) LEDs) or xenon lamps. The image sensor 1230 may acquire an image corresponding to an object by converting light emitted or reflected from the object and transmitted through the lens assembly 1210 into an electrical signal. According to an embodiment, the image sensor 1230 may include one image sensor (e.g., an RGB sensor, a black-and-white (BW) sensor, an IR sensor, or a UV sensor) selected from image sensors having different properties, a plurality of image sensors having the same properties, or a plurality of image sensors having different properties. Each of the image sensors included in the image sensor 1230 may be implemented using, for example, a Charge Coupled Device (CCD) sensor or a Complementary Metal Oxide Semiconductor (CMOS) sensor.
The image stabilizer 1240 may move the image sensor 1230 or at least one lens included in the lens assembly 1210 in a particular direction, or control an operational attribute (e.g., adjust the readout timing) of the image sensor 1230, in response to movement of the camera module 1180 or the electronic device 1101 including the camera module 1180. This makes it possible to compensate for at least part of a negative effect (e.g., image blur) of the movement on an image being captured. According to an embodiment, the image stabilizer 1240 may sense such movement of the camera module 1180 or the electronic device 1101 using a gyro sensor (not shown) or an acceleration sensor (not shown) disposed inside or outside the camera module 1180. According to an embodiment, the image stabilizer 1240 may be implemented as, for example, an optical image stabilizer.
Memory 1250 may at least temporarily store at least a portion of an image acquired via image sensor 1230 for a subsequent image processing task. For example, if image capturing is delayed due to shutter lag, or multiple images are quickly captured, an acquired raw image (e.g., a Bayer-patterned image or a high-resolution image) may be stored in memory 1250, and its corresponding copy image (e.g., a low-resolution image) may be previewed via display device 1160. Thereafter, if a specified condition is met (e.g., by user input or a system command), at least a portion of the raw image stored in memory 1250 may be acquired and processed by, for example, ISP 1260. According to an embodiment, memory 1250 may be configured as at least a portion of memory 1130, or as a separate memory that operates independently of memory 1130.
ISP 1260 may perform one or more image processes on an image acquired via image sensor 1230 or an image stored in memory 1250. The one or more image processes may include, for example, depth map generation, three-dimensional (3D) modeling, panorama generation, feature point extraction, image synthesis, or image compensation (e.g., noise reduction, resolution adjustment, brightness adjustment, blurring, sharpening, or softening). Additionally or alternatively, ISP 1260 may perform control (e.g., exposure time control or readout timing control) of at least one of the components included in camera module 1180 (e.g., image sensor 1230). An image processed by ISP 1260 may be stored back in memory 1250 for further processing, or may be provided to a component outside camera module 1180 (e.g., memory 1130, display device 1160, electronic device 1102, electronic device 1104, or server 1108). According to an embodiment, ISP 1260 may be configured as at least a portion of processor 1120, or as a separate processor that operates independently of processor 1120. If ISP 1260 is configured as a separate processor from processor 1120, at least one image processed by ISP 1260 may be displayed by processor 1120 via display device 1160 as it is, or after being further processed.
According to an embodiment, the electronic device 1101 may include a plurality of camera modules 1180 having different attributes or functions. In this case, at least one camera module 1180 of the plurality of camera modules 1180 may form, for example, a wide-angle camera, and at least another camera module 1180 of the plurality of camera modules 1180 may form a telephoto camera. Similarly, at least one camera module 1180 of the plurality of camera modules 1180 may form, for example, a front camera, and at least another camera module 1180 of the plurality of camera modules 1180 may form a rear camera.
As described above, the electronic apparatus (e.g., the electronic apparatus 100 in fig. 1) according to an embodiment may include an image sensor (e.g., the image sensor 120 in fig. 2) including a plurality of unit pixels, each including at least four Photodiodes (PDs) disposed adjacent to each other in a first direction and in a second direction different from the first direction, and at least one processor electrically connected to the image sensor. The at least one processor may perform a first analog-to-digital conversion (ADC) by reading out a first PD group included in a first unit pixel and a second PD group adjacent to the first PD group in the first direction, perform a second ADC by reading out a third PD group adjacent to the first PD group in the second direction, perform a third ADC by reading out a fourth PD group adjacent to the second PD group in the second direction, detect a first phase difference in the second direction based on results of the first ADC, the second ADC, and the third ADC, and detect a second phase difference in the first direction based on results of the second ADC and the third ADC.
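The readout sequence above can be illustrated with a minimal sketch. The following Python fragment is illustrative only and not part of the claimed implementation: all names (read_out, detect_phase_difference, unit_pixel_readout) are hypothetical, plain floats stand in for accumulated photocharge, and the simple subtraction stands in for whatever correlation-based phase detection the sensor actually performs.

# Minimal sketch of the three-ADC readout described above, assuming a 2x2
# unit pixel whose PD groups are laid out as
#     pd1 | pd2      (pd2 adjacent to pd1 in the first direction)
#     ----+----
#     pd3 | pd4      (pd3 adjacent to pd1 in the second direction)
# All function names are hypothetical.

def read_out(*pd_groups: float) -> float:
    """Stand-in for one ADC operation: converts the summed signal of the
    PD groups read out together (e.g., via a shared FD node)."""
    return sum(pd_groups)

def detect_phase_difference(signal_a: float, signal_b: float) -> float:
    """Stand-in for phase detection; a real sensor would correlate the two
    signals across many unit pixels rather than subtract single values."""
    return signal_a - signal_b

def unit_pixel_readout(pd1: float, pd2: float, pd3: float, pd4: float):
    adc1 = read_out(pd1, pd2)  # first ADC: first and second PD groups together
    adc2 = read_out(pd3)       # second ADC: third PD group
    adc3 = read_out(pd4)       # third ADC: fourth PD group

    # First phase difference (second direction): upper half (adc1) versus
    # lower half (adc2 + adc3) -- this uses all three ADC results.
    phase_second_dir = detect_phase_difference(adc1, adc2 + adc3)

    # Second phase difference (first direction): the lower pair was read
    # out separately, so adc2 and adc3 alone suffice.
    phase_first_dir = detect_phase_difference(adc2, adc3)
    return phase_second_dir, phase_first_dir

On this reading, three conversions per unit pixel are enough to obtain phase information in both directions, one conversion fewer than reading out all four PD groups individually.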
According to an embodiment, the at least one processor may perform the AF function based on the first phase difference and the second phase difference.
According to an embodiment, each of the unit pixels may include at least one color filter formed on at least four PDs included in the respective unit pixels, and may include at least one micro lens formed on the at least one color filter.
According to an embodiment, at least four PDs may share the same color filter and the same micro lens.
According to an embodiment, each of the unit pixels may include four PDs having a 2×2 array and at least one floating diffusion (FD) node connected to the four PDs.
According to an embodiment, four unit pixels may share the same color filter.
According to an embodiment, the at least one processor may analyze a frame acquired by the image sensor to compare the reliability in the first direction with the reliability in the second direction, and may perform the first ADC, the second ADC, and the third ADC when it is determined, based on a result of the comparison, that the reliability in the second direction is higher than the reliability in the first direction.
According to an embodiment, when it is determined, based on the result of the comparison, that the reliability in the first direction is higher, the at least one processor may perform a fourth ADC by reading out the first PD group included in the first unit pixel and the third PD group adjacent to the first PD group in the second direction, may perform a fifth ADC by reading out the second PD group adjacent to the first PD group in the first direction, may perform a sixth ADC by reading out the fourth PD group adjacent to the second PD group in the second direction, may detect a third phase difference in the first direction based on results of the fourth ADC, the fifth ADC, and the sixth ADC, and may detect a fourth phase difference in the second direction based on results of the fifth ADC and the sixth ADC.
According to an embodiment, the reliability in the first direction and the reliability in the second direction may be determined based on the contrast in the first direction and the contrast in the second direction, respectively, and the at least one processor may determine that the greater the contrast, the higher the reliability.
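The document equates reliability with contrast but does not specify the metric. The sketch below uses the mean absolute difference between neighbouring pixels along each axis as one plausible placeholder; the function name, the NumPy dependency, and the choice of metric are all assumptions.

import numpy as np

def directional_contrast(frame: np.ndarray) -> tuple[float, float]:
    """Toy reliability estimate: mean absolute difference between
    neighbouring pixels along each axis of a 2-D luminance frame."""
    contrast_first = float(np.mean(np.abs(np.diff(frame, axis=1))))   # across columns
    contrast_second = float(np.mean(np.abs(np.diff(frame, axis=0))))  # across rows
    return contrast_first, contrast_second

# Per the embodiment, the direction with the greater contrast would be
# treated as the more reliable one when choosing the readout sequence.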
According to an embodiment, in the unit pixel, the first PD group may be disposed at a first position, and the first direction may be a vertical direction downward from the first PD group.
According to an embodiment, for the PDs in a second unit pixel included in the plurality of unit pixels, the at least one processor may perform a fourth ADC by reading out a first PD group of the second unit pixel and a third PD group of the second unit pixel adjacent to the first PD group of the second unit pixel in the second direction, may perform a fifth ADC by reading out a second PD group of the second unit pixel adjacent to the first PD group of the second unit pixel in the first direction, may perform a sixth ADC by reading out a fourth PD group of the second unit pixel adjacent to the second PD group of the second unit pixel in the second direction, may detect a third phase difference in the first direction based on results of the fourth ADC, the fifth ADC, and the sixth ADC, may detect a fourth phase difference in the second direction based on results of the fifth ADC and the sixth ADC, and may perform the AF function based on the first phase difference, the second phase difference, the third phase difference, and the fourth phase difference.
According to an embodiment, the first direction may be perpendicular to the second direction.
As described above, a method for operating an electronic device is provided. In a first unit pixel among a plurality of unit pixels included in an image sensor of an electronic device, a first ADC is performed by reading out a first PD group included in the first unit pixel and a second PD group in the first unit pixel adjacent to the first PD group in a first direction. The second ADC is performed by reading out a third PD group in the first unit pixel adjacent to the first PD group in the first unit pixel in the second direction. The second direction is perpendicular to the first direction. The third ADC is performed by reading out a fourth PD group in the first unit pixel adjacent to the second PD group in the first unit pixel in the second direction. The first phase difference in the second direction is detected based on the results of the first ADC, the second ADC, and the third ADC. A second phase difference in the first direction is detected based on the results of the second ADC and the third ADC.
The method for operating an electronic device according to an embodiment may include an operation of performing an AF function based on the first phase difference and the second phase difference.
In the method for operating an electronic device according to the embodiment, each of the unit pixels may include at least one color filter formed on at least four PDs included in the respective unit pixels, and may include at least one micro lens formed on the at least one color filter.
In the method for operating an electronic device according to an embodiment, at least four PDs may share the same color filter and the same micro lens.
In the method for operating an electronic device according to the embodiment, each of the unit pixels may include four PDs having a 2×2 array and at least one FD node connected to the four PDs.
In the method for operating the electronic device according to the embodiment, the unit pixels may share the same color filter.
The method for operating an electronic device according to an embodiment may include an operation of analyzing a frame acquired by an image sensor to compare reliability in a first direction with reliability in a second direction, and an operation of performing the first ADC, the second ADC, and the third ADC when it is determined that the reliability in the second direction is higher than the reliability in the first direction based on a comparison result.
The method for operating an electronic device according to an embodiment may include an operation of analyzing a frame acquired by the image sensor to compare the reliability in the first direction with the reliability in the second direction and, when it is determined based on a result of the comparison that the reliability in the first direction is higher than the reliability in the second direction, operations of performing a fourth ADC by reading out the first PD group included in the first unit pixel and the third PD group adjacent to the first PD group in the second direction, performing a fifth ADC by reading out the second PD group adjacent to the first PD group in the first direction, performing a sixth ADC by reading out the fourth PD group adjacent to the second PD group in the second direction, detecting a third phase difference in the first direction based on results of the fourth ADC, the fifth ADC, and the sixth ADC, and detecting a fourth phase difference in the second direction based on results of the fifth ADC and the sixth ADC. The two readout sequences are contrasted in the sketch below.
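Putting the two branches together, the following sketch (reusing the hypothetical read_out and detect_phase_difference helpers from the earlier fragment) shows how the readout order might be switched on the reliability comparison: whichever axis is judged more reliable receives the phase difference computed from complete pixel halves. This is one interpretation of the text, not the claimed implementation.

def adaptive_readout(pd1, pd2, pd3, pd4, contrast_first, contrast_second):
    """Hypothetical selection between the two three-ADC sequences, driven
    by the contrast-based reliability comparison described above."""
    if contrast_second > contrast_first:
        # Second direction more reliable: first/second/third ADC sequence.
        adc_a = read_out(pd1, pd2)   # upper pair in one conversion
        adc_b = read_out(pd3)
        adc_c = read_out(pd4)
        phase_second = detect_phase_difference(adc_a, adc_b + adc_c)
        phase_first = detect_phase_difference(adc_b, adc_c)
    else:
        # First direction more reliable: fourth/fifth/sixth ADC sequence.
        adc_a = read_out(pd1, pd3)   # left pair in one conversion
        adc_b = read_out(pd2)
        adc_c = read_out(pd4)
        phase_first = detect_phase_difference(adc_a, adc_b + adc_c)
        phase_second = detect_phase_difference(adc_b, adc_c)
    return phase_first, phase_second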
While the disclosure has been shown and described with reference to certain embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the disclosure. Accordingly, the scope of the present disclosure should not be defined as limited to the embodiments, but should be defined by the appended claims and equivalents thereof.

Claims (15)

1. An electronic device, comprising:
an image sensor including a plurality of unit pixels, wherein each unit pixel includes at least four Photodiodes (PDs), the PDs being disposed adjacent to each other in a first direction and a second direction different from the first direction; and
at least one processor electrically connected to the image sensor,
wherein the at least one processor is configured to:
in a first unit pixel among the plurality of unit pixels, perform a first analog-to-digital conversion (ADC) by reading out a first PD group included in the first unit pixel and a second PD group adjacent to the first PD group in the first direction;
perform a second ADC by reading out a third PD group adjacent to the first PD group in the second direction;
perform a third ADC by reading out a fourth PD group in the first unit pixel adjacent to the second PD group in the second direction;
detect a first phase difference in the second direction based on results of the first ADC, the second ADC, and the third ADC; and
detect a second phase difference in the first direction based on results of the second ADC and the third ADC.
2. The electronic device of claim 1, wherein the at least one processor is configured to perform an autofocus function based on the first phase difference and the second phase difference.
3. The electronic device of claim 1, wherein each of the plurality of unit pixels includes at least one color filter formed on at least four PDs included in the respective unit pixels, and at least one micro lens formed on the at least one color filter.
4. The electronic device of claim 3, wherein the at least four PDs are configured to share a same color filter and a same micro-lens.
5. The electronic device of claim 1, wherein each of the plurality of unit pixels includes four PDs having a 2 x 2 array, and at least one Floating Diffusion (FD) node connected to the four PDs.
6. The electronic device of claim 5, wherein the plurality of unit pixels are configured to share the same color filter.
7. The electronic device of claim 1, wherein the at least one processor is configured to:
analyze a frame acquired by the image sensor to compare reliability in the first direction and reliability in the second direction; and
perform the first ADC, the second ADC, and the third ADC when it is determined, based on a result of the comparison, that the reliability in the second direction is higher than the reliability in the first direction.
8. The electronic device of claim 7, wherein the at least one processor is configured to:
when it is determined, based on the result of the comparison, that the reliability in the first direction is higher:
perform a fourth ADC by reading out the first PD group included in the first unit pixel and the third PD group adjacent to the first PD group in the second direction,
perform a fifth ADC by reading out the second PD group adjacent to the first PD group in the first direction,
perform a sixth ADC by reading out the fourth PD group adjacent to the second PD group in the second direction,
detect a third phase difference in the first direction based on results of the fourth ADC, the fifth ADC, and the sixth ADC, and
detect a fourth phase difference in the second direction based on results of the fifth ADC and the sixth ADC.
9. The electronic device of claim 7, wherein the reliability in the first direction and the reliability in the second direction are determined based on a contrast in the first direction and a contrast in the second direction, respectively, and
The at least one processor is configured to determine that the greater the contrast, the higher the reliability.
10. The electronic device according to claim 1, wherein, in the first unit pixel among the plurality of unit pixels, the first PD group is disposed at a first position, and the first direction is a vertical direction downward from the first PD group.
11. The electronic device of claim 1, wherein, for the PDs included in a second unit pixel of the plurality of unit pixels, the at least one processor is configured to:
perform a fourth ADC by reading out a first PD group of the second unit pixel and a third PD group of the second unit pixel adjacent to the first PD group of the second unit pixel in the second direction,
perform a fifth ADC by reading out a second PD group of the second unit pixel adjacent to the first PD group of the second unit pixel in the first direction,
perform a sixth ADC by reading out a fourth PD group of the second unit pixel adjacent to the second PD group of the second unit pixel in the second direction,
detect a third phase difference in the first direction based on results of the fourth ADC, the fifth ADC, and the sixth ADC,
detect a fourth phase difference in the second direction based on results of the fifth ADC and the sixth ADC, and
perform an auto-focusing function based on the first phase difference, the second phase difference, the third phase difference, and the fourth phase difference.
12. The electronic device of claim 1, wherein the first direction is perpendicular to the second direction.
13. A method for operating an electronic device, comprising:
in a first unit pixel among a plurality of unit pixels included in an image sensor of the electronic device, performing a first analog-to-digital conversion (ADC) by reading out a first photodiode (PD) group included in the first unit pixel and a second PD group adjacent to the first PD group in a first direction;
performing a second ADC by reading out a third PD group adjacent to the first PD group in a second direction, the second direction being perpendicular to the first direction;
performing a third ADC by reading out a fourth PD group adjacent to the second PD group in the second direction;
detecting a first phase difference in the second direction based on results of the first ADC, the second ADC, and the third ADC; and
detecting a second phase difference in the first direction based on results of the second ADC and the third ADC.
14. The method of claim 13, comprising:
analyzing a frame acquired by the image sensor to compare reliability in the first direction and reliability in the second direction; and
performing the first ADC, the second ADC, and the third ADC when it is determined, based on a result of the comparison, that the reliability in the second direction is higher than the reliability in the first direction.
15. The method of claim 14, comprising:
analyzing a frame acquired by the image sensor to compare reliability in the first direction and reliability in the second direction; and
when it is determined, based on a result of the comparison, that the reliability in the first direction is higher:
performing a fourth ADC by reading out a first PD group included in the first unit pixel and a third PD group adjacent to the first PD group in the second direction;
performing a fifth ADC by reading out a second PD group adjacent to the first PD group in the first direction;
performing a sixth ADC by reading out a fourth PD group adjacent to the second PD group in the second direction;
detecting a third phase difference in the first direction based on the results of the fourth ADC, the fifth ADC, and the sixth ADC; and
detecting a fourth phase difference in the second direction based on the results of the fifth ADC and the sixth ADC.
CN202280013971.2A 2021-02-10 2022-02-10 Electronic device including image sensor and method of operating the same Pending CN116848851A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2021-0019463 2021-02-10
KR10-2021-0072286 2021-06-03
KR1020210072286A KR20220115493A (en) 2021-02-10 2021-06-03 Electronic device including image sensor and operating method thereof
PCT/KR2022/002018 WO2022173236A1 (en) 2021-02-10 2022-02-10 Electronic device comprising image sensor and method of operating same

Publications (1)

Publication Number Publication Date
CN116848851A true CN116848851A (en) 2023-10-03

Family

ID=88165679

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280013971.2A Pending CN116848851A (en) 2021-02-10 2022-02-10 Electronic device including image sensor and method of operating the same

Country Status (1)

Country Link
CN (1) CN116848851A (en)

Similar Documents

Publication Publication Date Title
CN112840634B (en) Electronic device and method for obtaining image
US20230283920A1 (en) Electronic device comprising image sensor and method of operating same
US20230209202A1 (en) Electronic device for performing image stabilization, and operating method therefor
CN115989680A (en) Image stabilizing method and electronic device thereof
CN116530090A (en) Method for taking pictures by using multiple cameras and device thereof
US20230101888A1 (en) Electronic device performing image stabilization and operating method thereof
US11743585B2 (en) Electronic apparatus including camera and control method thereof
CN117256157A (en) Electronic device including camera
CN116530088A (en) Camera module and electronic device including the same
US20230101860A1 (en) Electronic device including image sensor and operating method thereof
CN116848851A (en) Electronic device including image sensor and method of operating the same
US20230113058A1 (en) Electronic device including image sensor and operating method thereof
US20230199330A1 (en) Electronic device for processing continuous shooting input and method thereof
US11877072B2 (en) Image capturing method using plurality of cameras, and electronic device
US20230388677A1 (en) Electronic device including image sensor and operating method thereof
US20240078685A1 (en) Method for generating file including image data and motion data, and electronic device therefor
EP4254977A1 (en) Electronic device comprising image sensor and operating method thereof
KR20220115493A (en) Electronic device including image sensor and operating method thereof
EP4228246A1 (en) Electronic device capable of auto-focusing and method for operating same
US20240098347A1 (en) Electronic device comprising image sensor and dynamic vision sensor, and operating method therefor
KR20220153331A (en) Electronic device including image sensor and operating method thereof
KR20230010461A (en) Electronic device including image sensor and operating method thereof
KR20230113109A (en) Electronic device including image sensor and operating method thereof
CN117178559A (en) Electronic device including camera module and electronic device operating method
CN117546476A (en) Electronic device for applying foreground effect to image and operation method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination