WO2021251631A1 - Electronic device including a focus adjustment function, and method therefor

Electronic device including a focus adjustment function, and method therefor

Info

Publication number
WO2021251631A1
Authority
WO
WIPO (PCT)
Prior art keywords
target object
electronic device
image
processor
camera
Prior art date
Application number
PCT/KR2021/005965
Other languages
English (en)
Korean (ko)
Inventor
송인선
이승한
김표재
최지환
Original Assignee
Samsung Electronics Co., Ltd. (삼성전자 주식회사)
Priority date
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd. (삼성전자 주식회사)
Publication of WO2021251631A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50: Constructional details
    • H04N23/54: Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N23/55: Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H04N23/57: Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H04N23/60: Control of cameras or camera modules
    • H04N23/95: Computational photography systems, e.g. light-field imaging systems
    • H04N23/958: Computational photography systems for extended depth of field imaging
    • H04N23/959: Computational photography systems for extended depth of field imaging by adjusting depth of field during image capture, e.g. maximising or setting range based on scene characteristics

Definitions

  • Embodiments disclosed in this document relate to an electronic device including a focus control function of a camera.
  • Depth of field refers to the range between the closest and farthest positions at which a target object appears sharp in an image captured by a camera lens. With a shallow depth of field, a clear image of the whole scene cannot be obtained because focus is limited to a portion at a close distance.
  • Conventionally, an image for a focus stack is acquired through a plurality of lens modules, or by simply dividing the image into a plurality of regions and focusing on the center of each region.
  • In such cases, the size of the lens assembly increases, which may limit the size of camera module that can be placed in a fixed space.
  • Various embodiments of the present disclosure may provide an electronic device capable of finely focusing on a desired area by identifying a nearby planar object and acquiring text information.
  • An electronic device according to an embodiment may include an image sensor, a depth sensor, and at least one processor. The at least one processor may determine whether a target object corresponds to a flat object using the depth sensor, activate a text recognition function, determine a plurality of focal points for different regions of the target object when the target object corresponds to the flat object and in response to the activation of the text recognition function, acquire a plurality of images by driving the image sensor based on the determined plurality of focal points, and acquire a final image of the target object based on at least some of the acquired images.
  • A method of operating an electronic device according to an embodiment may include determining whether a target object corresponds to a flat object using the depth sensor, activating a text recognition function, determining a plurality of focal points for different regions of the target object in response to the target object corresponding to a planar object and the text recognition function being activated, acquiring a plurality of images by driving the image sensor based on the determined plurality of focal points, and acquiring a final image of the target object based on at least a part of the plurality of acquired images.
  • According to various embodiments, the time required for a focus stack may be reduced by determining the lens movement and the number of shots optimized for the current shooting environment.
  • FIG. 1 is a diagram illustrating a structure of an electronic device and a camera module according to an embodiment.
  • FIG. 2 illustrates a hardware configuration of an electronic device according to an embodiment.
  • FIG. 3 illustrates a process for performing a focus stack in an electronic device according to an exemplary embodiment.
  • FIG. 4 illustrates a state in which an electronic device captures a planar object according to an exemplary embodiment.
  • FIG. 5 is a diagram illustrating a state in which a camera of an electronic device captures a flat object while forming a predetermined angle with the flat object, according to an exemplary embodiment.
  • FIG. 6 is a diagram illustrating an electronic device photographing a non-planar object according to an exemplary embodiment.
  • FIG. 7 is a flowchart illustrating a process in which an electronic device acquires a plurality of images based on a plurality of focal points, according to an exemplary embodiment.
  • FIG. 8 is a flowchart illustrating a process of dividing a region of a target object according to an angle formed by a camera of an electronic device with the target object according to an exemplary embodiment.
  • FIG. 9 is a flowchart illustrating a process of acquiring an image based on movement of a lens according to an exemplary embodiment.
  • FIG. 10 is a block diagram of an electronic device in a network environment according to various embodiments of the present disclosure.
  • FIG. 11 is a block diagram illustrating a camera module according to various embodiments of the present disclosure.
  • FIG. 1 schematically illustrates the exterior of an electronic device 100 on which a camera module 180 is mounted, and the camera module 180, according to an embodiment.
  • Although FIG. 1 is illustrated and described on the premise of a mobile device, in particular a smartphone, it will be clearly understood by those skilled in the art that the present disclosure can be applied to various electronic devices equipped with a camera, including mobile devices.
  • the display 110 may be disposed on the front surface of the electronic device 100 according to an embodiment.
  • the display 110 may occupy most of the front surface of the electronic device 100 .
  • a display 110 and a bezel 190 region surrounding at least some edges of the display 110 may be disposed on the front surface of the electronic device 100 .
  • the display 110 may include a flat area and a curved area extending from the flat area toward the side of the electronic device 100 .
  • the electronic device 100 illustrated in FIG. 1 is an example, and various embodiments are possible.
  • the display 110 of the electronic device 100 may include only a flat area without a curved area, or may include a curved area only at one edge instead of both sides.
  • the curved area may extend toward the rear surface of the electronic device, so that the electronic device 100 may include an additional planar area.
  • the electronic device 100 may additionally include a speaker, a receiver, a front camera, a proximity sensor, a home key, and the like.
  • In an embodiment, the rear cover 150 may be integrated with the main body of the electronic device 100.
  • Alternatively, the rear cover 150 may be separable from the main body of the electronic device 100 so that the battery can be replaced.
  • the back cover 150 may be referred to as a battery cover or a back cover.
  • a fingerprint sensor 171 for recognizing a user's fingerprint may be included in the first area 170 of the display 110 . Since the fingerprint sensor 171 is disposed on a lower layer of the display 110 , the fingerprint sensor 171 may not be recognized by the user or may be difficult to recognize. Also, in addition to the fingerprint sensor 171 , a sensor for additional user/biometric authentication may be disposed in a portion of the display 110 . In another embodiment, a sensor for user/biometric authentication may be disposed on one area of the bezel 190 . For example, the IR sensor for iris authentication may be exposed through one area of the display 110 or may be exposed through one area of the bezel 190 .
  • the front camera 161 may be disposed in the second area 160 on the front side of the electronic device 100 .
  • the front camera 161 is shown to be exposed through one area of the display 110 , but in another embodiment, the front camera 161 may be exposed through the bezel 190 .
  • the electronic device 100 may include one or more front cameras 161 .
  • the electronic device 100 may include two front cameras, such as a first front camera and a second front camera.
  • The first front camera and the second front camera may be cameras of the same type having the same specifications (e.g., pixel count), or they may be implemented as cameras with different specifications.
  • the electronic device 100 may support a function (eg, 3D imaging, auto focus, etc.) related to a dual camera through two front cameras. The above-mentioned description of the front camera may be equally or similarly applied to the rear camera of the electronic device 100 .
  • Various hardware or sensors 163 that assist photographing, such as a flash, may be additionally disposed.
  • A distance sensor (e.g., a TOF sensor) may be included. The distance sensor may be applied to the front camera and/or the rear camera, and may be disposed separately from, or included in, the front camera and/or the rear camera.
  • At least one physical key may be disposed on a side portion of the electronic device 100 .
  • the first function key 151 for turning on/off the display 110 or turning on/off the power of the electronic device 100 may be disposed on the right edge with respect to the front surface of the electronic device 100 .
  • the second function key 152 for controlling the volume or screen brightness of the electronic device 100 may be disposed on the left edge with respect to the front surface of the electronic device 100 .
  • additional buttons or keys may be disposed on the front or rear of the electronic device 100 .
  • a physical button or a touch button mapped to a specific function may be disposed in a lower region of the front bezel 190 .
  • the electronic device 100 illustrated in FIG. 1 corresponds to one example, and the shape of the device to which the technical idea disclosed in the present disclosure is applied is not limited.
  • For example, the technical idea of the present disclosure may be applied to a foldable electronic device that can be folded horizontally or vertically, a rollable electronic device that can be rolled, a tablet, or a notebook computer.
  • The present technical idea can also be applied when a first camera and a second camera facing the same direction can be made to face different directions through rotation, folding, or deformation of the device.
  • an electronic device 100 may include a camera module 180 .
  • The camera module 180 may include a lens assembly 111, a housing 113, an infrared cut filter 115, an image sensor 120, and an image signal processor 130.
  • the lens assembly 111 may have a different number, arrangement, type, etc. of lenses depending on the front camera and the rear camera.
  • the front camera and the rear camera may have different characteristics (eg, focal length, maximum magnification, etc.).
  • The lens may move forward and backward along the optical axis, changing the focal length so that the target object, which is the subject, can be captured clearly.
  • The camera module 180 may include a barrel that mounts at least one lens aligned on the optical axis, and a housing 113 that mounts at least one coil surrounding the periphery of the barrel about the optical axis.
  • the infrared cut filter 115 may be disposed on the upper surface of the image sensor 120 .
  • the image of the subject passing through the lens may be partially filtered by the infrared cut filter 115 and then detected by the image sensor 120 .
  • the image sensor 120 may be disposed on the upper surface of the printed circuit board.
  • the image sensor 120 may be electrically connected to the image signal processor 130 connected to the printed circuit board 140 by a connector.
  • a flexible printed circuit board (FPCB) or a cable may be used as the connector.
  • the image sensor 120 may be a complementary metal oxide semiconductor (CMOS) sensor or a charged coupled device (CCD) sensor.
  • a plurality of individual pixels are integrated in the image sensor 120 , and each individual pixel may include a micro lens, a color filter, and a photodiode.
  • Each individual pixel is a kind of photodetector that can convert incoming light into an electrical signal. Photodetectors generally cannot detect the wavelength of the captured light by themselves and cannot determine color information.
  • the photodetector may include a photodiode.
  • light information of a subject incident through the lens assembly 111 may be converted into an electrical signal by the image sensor 120 and input to the image signal processor 130 .
  • the camera module 180 may be disposed on the front side as well as the rear side of the electronic device 100 .
  • the electronic device 100 may include a plurality of camera modules 180 as well as one camera module 180 to improve camera performance.
  • the electronic device 100 may further include a front camera 161 for video call or self-camera photography.
  • the front camera 161 may support a relatively low number of pixels compared to the rear camera module.
  • the front camera may be relatively smaller than the rear camera module.
  • FIG. 2 illustrates a hardware configuration of an electronic device according to an embodiment.
  • Descriptions of the configuration already illustrated in FIG. 1 may be brief or omitted.
  • the electronic device 100 may include a camera module 180 , a processor 220 , a display 110 , and a memory 230 .
  • descriptions of the same reference numerals as those of FIG. 1 may be omitted.
  • the camera module 180 may include a lens assembly 111 , an image sensor 120 , a depth detection sensor 210 , and an image signal processor 130 .
  • the electronic device 100 may further include additional components.
  • the electronic device 100 may further include at least one microphone for recording audio data.
  • the electronic device 100 may include at least one sensor for determining a direction in which the front or rear of the electronic device 100 faces and/or posture information of the electronic device 100 .
  • the at least one sensor may include an acceleration sensor, a gyro sensor, and the like. A detailed description of hardware included or may be included in the electronic device 100 of FIG. 2 is provided with reference to FIG. 10 .
  • the image sensor 120 may include a complementary metal oxide semiconductor (CMOS) sensor or a charged coupled device (CCD) sensor.
  • the light information of the subject incident through the lens assembly 111 may be converted into an electrical signal by the image sensor 120 and input to the image signal processor 130 .
  • An infrared cut filter (hereinafter, IR cut filter) may be disposed on the upper surface of the image sensor 120, and the image of the subject passing through the lens may be partially filtered by the IR cut filter and then detected by the image sensor 120.
  • a light emitter may generate output light and emit it to the outside.
  • the light emitter may be configured and operated separately from the depth detection sensor 210 , or may be included in the depth detection sensor 210 to operate.
  • the depth sensor 210 may be a sensor capable of calculating a depth value of each pixel of an image.
  • the depth sensor 210 may be implemented, for example, in at least a stereo method, a time of flight (TOF) method, or a structured pattern method.
  • the depth sensor 210 (eg, a TOF sensor) may receive an input light corresponding to the output light emitted from the light emitter.
  • the output light and the input light may be infrared or near infrared.
  • The depth sensor 210 may receive the input light reflected from the subject.
  • The depth sensor 210 may obtain distance information by analyzing the input light.
  • output light means light output from a light emitter and incident on an object
  • input light is light input to the depth sensor 210 after the output light reaches the object and is reflected from the object.
  • output light may be referred to as an output signal
  • input light may be referred to as an input signal.
  • In an embodiment, the object may be irradiated with the output light during a predetermined exposure period.
  • the exposure period may mean one frame period.
  • the exposure cycle may be repeated. For example, when the camera module 180 captures an object at 20 FPS, the exposure period may be 1/20 [sec]. And when 100 frames are generated, the exposure cycle may be repeated 100 times.
  • When the image signal processor 130 and the image sensor 120 are physically separated, there may be a sensor interface conforming to a standard between them.
  • the image signal processor 130 may perform image processing on the electrically converted image data.
  • a process in the image signal processor 130 may be divided into a pre-ISP (hereinafter, pre-processing) and an ISP chain (hereinafter, post-processing).
  • Image processing before the demosaicing process may mean pre-processing, and image processing after the demosaicing process may mean post-processing.
  • the preprocessing process may include 3A processing, lens shading correction, edge enhancement, dead pixel correction, knee correction, and the like.
  • 3A may include at least one of auto white balance (AWB), auto exposure (AE), and auto focusing (AF).
  • the post-processing process may include at least one of changing a sensor index value, changing a tuning parameter, and adjusting an aspect ratio.
  • the post-processing process may include processing the image data output from the image sensor 120 or image data output from the scaler.
  • the image signal processor 130 may adjust contrast, sharpness, saturation, dithering, etc. of the image through a post-processing process.
  • the contrast, sharpness, and saturation adjustment procedures are performed in the YUV color space, and the dithering procedure may be performed in the RGB (Red Green Blue) color space.
  • a part of the pre-processing process may be performed in the post-processing process, or a part of the post-processing process may be performed in the pre-processing process.
  • a part of the pre-processing process may be overlapped with a part of the post-processing process.
  • The display 110 may display contents such as an execution screen of an application executed by the processor 220, or images and/or videos stored in the memory 230.
  • the processor 220 may display the image data acquired through the camera module 180 on the display 110 in real time.
  • the display 110 may be implemented integrally with the touch panel.
  • the display 110 may support a touch function, detect a user input such as a touch using a finger, and transmit it to the processor 220 .
  • the display 110 may be connected to a display driver integrated circuit (DDIC) for driving the display 110 , and the touch panel may be connected to a touch IC that detects touch coordinates and processes a touch-related algorithm.
  • the display driving circuit and the touch IC may be integrally formed, and in another embodiment, the display driving circuit and the touch IC may be formed separately.
  • the display driving circuit and/or the touch IC may be electrically connected to the processor 220 .
  • the processor 220 may execute/control various functions supported by the electronic device 100 .
  • the processor 220 may execute an application by executing a code written in a programming language stored in the memory 230 , and may control various hardware.
  • the processor 220 may execute an application supporting a photographing function stored in the memory 230 .
  • the processor 220 may execute the camera module 180 and set and support an appropriate shooting mode so that the camera module 180 may perform an operation intended by the user.
  • the memory 230 may store instructions executable by the processor 220 .
  • the memory 230 may be understood as a concept including a component in which data is temporarily stored, such as a random access memory (RAM), and/or a component in which data is permanently stored, such as a solid state drive (SSD).
  • the processor 220 may implement a software module in the RAM space by calling instructions stored in the SSD.
  • the memory 230 may include various types, and an appropriate type may be adopted according to the purpose of the device.
  • an application related to the camera module 180 may be stored in the memory 230 .
  • a camera application may be stored in the memory 230 .
  • the camera application may support various shooting functions, such as photo shooting, video shooting, panoramic shooting, and slow motion shooting.
  • an application associated with the camera module 180 may correspond to various types of applications.
  • a chatting application may also use the camera module 180 to support a video call, photo/video attachment, streaming service, product image, or product-related virtual reality (VR) shooting function.
  • FIG. 3 illustrates a process for performing a focus stack in an electronic device according to an exemplary embodiment.
  • An operating entity of the block diagram illustrated in FIG. 3 may be understood as a processor (eg, the processor 220 of FIG. 2 ) or an image signal processor (eg, the image signal processor 130 of FIG. 1 ).
  • the processor 220 may determine whether the distance between the camera of the electronic device 100 and the target object is a short distance using the depth information 301 and the image information 302 .
  • the depth information 301 may be obtained through the depth sensor 210
  • the image information 302 may be obtained through the image sensor 120 .
  • the image information 302 may be referred to as image data acquired through the image sensor 120 .
  • The standard for the short distance can be seen in <Table 1> below.
  • In <Table 1>, the horizontal axis represents the position from the center of the lens to the edge of the lens as a field, and the vertical axis represents the distance between the camera of the electronic device 100 and the target object.
  • <Table 1> shows how far the lens needs to move to achieve maximum resolution in each field, assuming that the lens center is field 0, the lens edge is field 1, and the lens center is in focus. This value can be expressed as a movement amount (mm) or a step amount (steps) of the lens. The lens movement information in <Table 1> may be obtained based on design data, or based on calibration data for correcting per-module deviation.
  • In an embodiment, the processor 220 may distinguish between a long distance and a short distance based on about 30 cm.
  • For example, the processor 220 may determine whether the distance between the camera of the electronic device 100 and the target object is less than or equal to about 30 cm using depth information obtained through the depth sensor 210.
  • the processor 220 may determine whether the target object is a flat object.
  • the processor 220 may calculate plane information to determine whether the target object is a plane object.
  • the processor 220 may change depth information into a point cloud format on the camera coordinate system for plane determination and calculation of plane information.
  • The processor 220 may calculate a plane equation that minimizes the distance between the point cloud and a three-dimensional plane, obtain a reliability for the plane using the square or absolute value of the distance, and determine from the result whether the target object is a plane.
  • the processor 220 may determine a plane having a predetermined range of curvature as a plane object by using the depth sensor 210 .
  • the processor 220 may determine that the object is a flat object if the error value when the target object is matched to time-of-flight (ToF) depth information is within 1 cm.
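  • As an illustration of the plane determination described above, the following sketch (with hypothetical helper names, not from the patent) unprojects a depth map into a camera-coordinate point cloud, fits a least-squares plane, and treats the object as flat when the RMS point-to-plane error is within the 1 cm bound mentioned above.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Unproject a depth map (meters) into 3D points in the camera coordinate system."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1).reshape(-1, 3)

def fit_plane(points):
    """Least-squares plane fit: returns a unit normal n and offset d with n.p + d = 0."""
    centroid = points.mean(axis=0)
    # The right singular vector with the smallest singular value is the plane normal.
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    normal = vt[-1]
    return normal, -normal.dot(centroid)

def is_planar(points, tol_m=0.01):
    """Flat-object test: RMS point-to-plane distance within 1 cm."""
    normal, d = fit_plane(points)
    dist = np.abs(points @ normal + d)
    return np.sqrt(np.mean(dist ** 2)) <= tol_m, normal
```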
  • the processor 220 may analyze the type of the target object.
  • The processor 220 may analyze the type of the target object to compensate for the fact that, at a short distance, it may be difficult to determine whether a paper or book is bent.
  • the processor 220 may classify the target object through a function (eg, a scene optimizer) of the electronic device 100 .
  • The function (e.g., a scene optimizer) may be capable of discriminating objects based on data. For example, when photographing a natural landscape, the processor 220 may identify the target object (e.g., a tree) in consideration of its average shape and average colors (e.g., green and brown). The processor 220 may determine whether the target object is a paper or a book through this function of the electronic device 100.
  • the processor 220 may use the lens information 303 to calculate a lens movement amount and the required number of shots for the focus stack.
  • The lens information 303 can be seen in <Table 1>.
  • the processor 220 may determine whether a focus stack operation is necessary by combining conditions such as the lens information 303, the near-plane object, and/or text information. When it is determined that the focus stack operation is necessary, the processor 220 may determine the number of shots in consideration of a degree of blur for each field, whether text is included, a distance from a target object, and the like.
  • images having the maximum resolution for each field may be obtained by photographing using a lens movement amount between 0.1 and 0.5 fields.
  • the processor 220 may determine six images corresponding to the 0 field and the 0.1 field to the 0.5 field as images required for the focus stack.
  • In another example, the lens shift amounts between the 0.7 field and the 0.9 field may be used.
  • In this case, the processor 220 may determine the four images corresponding to the 0 field and the 0.7 to 0.9 fields as images necessary for the focus stack. Referring to <Table 1>, since the field pairs 0.1 & 0.9, 0.2 & 0.8, 0.3 & 0.7, and 0.4 & 0.6 have similar lens shift amounts for maximum resolution, the processor 220 can cover two fields with a single lens shift by using the average of each pair's shift amounts, as sketched below. When two fields are covered with one lens movement, the electronic device 100 reduces the number of shots and thus the focus stack processing time for a high-resolution image.
  • When the camera is tilted with respect to the target object, the distance to the camera becomes shorter than at the image center in one direction and longer than at the center in the opposite direction. In this case, since there are no field pairs that reach maximum resolution with similar lens movements, the number of shots may increase.
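  • The shot-count planning just described can be sketched as follows; the per-field lens-shift values are hypothetical stand-ins for the design/calibration data of <Table 1>. Mirror field pairs with similar shifts are merged and replaced by their average, so one lens position covers two fields; when the camera is tilted, the pairs diverge, nothing merges, and the shot count grows, matching the behavior described above.

```python
# Hypothetical lens-shift values (in steps) per field, standing in for <Table 1>.
shifts = {0.0: 0, 0.1: 2, 0.2: 5, 0.3: 9, 0.4: 14,
          0.5: 20, 0.6: 15, 0.7: 10, 0.8: 6, 0.9: 3}

def plan_lens_positions(shifts, tolerance=2):
    """Merge mirror field pairs (0.1 & 0.9, 0.2 & 0.8, ...) whose lens
    shifts differ by at most `tolerance`, halving those shots."""
    positions = [shifts[0.0]]            # reference shot focused at the lens center
    used = set()
    for f in sorted(k for k in shifts if k > 0):
        if f in used:
            continue
        mirror = round(1.0 - f, 1)       # the field on the opposite side of the center
        if mirror != f and mirror in shifts and abs(shifts[f] - shifts[mirror]) <= tolerance:
            positions.append((shifts[f] + shifts[mirror]) / 2)  # one shot covers both fields
            used.update({f, mirror})
        else:
            positions.append(shifts[f])
            used.add(f)
    return positions

print(len(plan_lens_positions(shifts)))  # 6 positions instead of 10 with these values
```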
  • FIG. 4 illustrates a state in which an electronic device captures a planar object according to an exemplary embodiment.
  • the target object 405 may be an object having a generally planar shape.
  • the target object 405 may be a plane having a certain range of curvature.
  • the predetermined range may mean a case in which an error value when a target object is matched to time-of-flight (ToF) depth information is within 1 cm.
  • ToF time-of-flight
  • the processor 220 may determine a plurality of regions with respect to the target object 405 based on a point 410 to which the center of the lens is directed.
  • the processor 220 may divide the target object 405 into an A region 430 , a B region 440 , or a C region 450 .
  • Area B 440 may be an area that is farther away from the camera of the electronic device 100 than area A 430 .
  • Region C 450 may be a region further away from the camera of the electronic device 100 than region B 440 .
  • The plurality of regions is not limited to region A 430, region B 440, and region C 450; the number may be smaller or larger depending on the area of the target object or the distance between the camera of the electronic device 100 and the target object. The same applies to the description below.
  • the processor 220 may obtain a first image by setting a first focal length to focus on the area A 430 .
  • the processor 220 may obtain a second image by setting a second focal length to focus on the region B 440 .
  • the processor 220 may obtain a third image by setting a third focal length to focus on the C region 450 .
  • the first focal length may be a focal length set based on an auto focus function.
  • the second focal length may be a longer focal length than the first focal length.
  • the third focal length may be a longer focal length than the second focal length.
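  • A minimal sketch of the region-by-region capture in FIG. 4, assuming hypothetical camera hooks `set_focus` and `capture` and illustrative focus distances: region A is nearest, so the focus distance increases from A to C.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class FocusShot:
    region: str               # region of the target object this shot should cover
    focus_distance_m: float   # lens focus distance for that region

def bracket_capture(shots: List[FocusShot],
                    set_focus: Callable[[float], None],
                    capture: Callable[[], object]) -> Dict[str, object]:
    """Drive the image sensor once per determined focal point and collect frames."""
    frames = {}
    for shot in shots:
        set_focus(shot.focus_distance_m)  # move the lens along the optical axis
        frames[shot.region] = capture()   # one frame focused on this region
    return frames

# Illustrative plan for FIG. 4: A (nearest) -> C (farthest).
plan = [FocusShot("A", 0.15), FocusShot("B", 0.20), FocusShot("C", 0.25)]
```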
  • FIG. 5 is a diagram illustrating a state in which a camera of an electronic device captures a flat object while forming a predetermined angle with the flat object, according to an exemplary embodiment.
  • FIG. 5 illustrates the electronic device 100 capturing a target object 505 while forming a predetermined angle 530 with the target object, instead of being parallel to it.
  • the predetermined angle may be an angle between the electronic device 100 and the target object.
  • the processor 220 may calculate an angle 530 between the direction of the optical axis 520 of the camera of the electronic device 100 and the normal of the target object. The calculated angle may be referred to as angle information between the lens and the plane.
  • the processor 220 may determine a plurality of regions with respect to the target object 505 based on a point 510 to which the center of the lens is directed.
  • the processor 220 may divide the target object 505 into at least an A region 540 , a B region 550 , a C region 560 , a D region 570 , and an E region 580 .
  • The plurality of regions is not limited to region A 540, region B 550, region C 560, region D 570, and region E 580; the number may be smaller or larger depending on the area of the target object, the distance between the camera of the electronic device 100 and the target object, or the angle between the electronic device and the target object. The same applies to the description below.
  • the processor 220 may obtain a first image by setting a first focal length in order to focus on the area A 540 .
  • the processor 220 may obtain a second image by setting a second focal length to focus on the region B 550 .
  • the processor 220 may obtain a third image by setting a third focal length to focus on the C region 560 .
  • the processor 220 may obtain a fourth image by setting a fourth focal length to focus on the D region 570 .
  • the processor 220 may obtain a fifth image by setting a fifth focal length to focus on the E region 580 .
  • the first focal length may be a focal length set based on an auto focus function.
  • the second focal length may be a longer focal length than the first focal length.
  • the third focal length may be a longer focal length than the second focal length.
  • the fourth focal length may be a shorter focal length than the first focal length.
  • the fifth focal length may be a shorter focal length than the fourth focal length.
  • FIG. 6 is a diagram illustrating an electronic device photographing a non-planar object according to an exemplary embodiment.
  • the processor 220 may determine that the target object located in the short distance is a curved surface rather than a flat surface.
  • The processor 220 may treat the target object determined to be a curved surface as a plurality of flat areas.
  • the plurality of planar areas may be planar areas having curvatures within a range that may be determined to be planar.
  • For example, the processor 220 may divide the curved target object into planar region A 630, planar region B 640, planar region C 650, planar region D 660, and planar region E 670.
  • The plurality of planar regions is not limited to region A 630, region B 640, region C 650, region D 660, and region E 670; the number may be smaller or larger depending on the area of the target object, the distance between the camera of the electronic device 100 and the target object, and the degree of curvature of the target object.
  • the description of the target object mentioned in FIG. 5 may be equally applied to the individual planar regions (eg, 630 , 640 , 650 , 660 and 670 ) in FIG. 6 .
  • FIG. 7 is a flowchart illustrating a process in which an electronic device acquires a plurality of images based on a plurality of focal points, according to an exemplary embodiment.
  • the operating subject of the flowchart illustrated in FIG. 7 may be understood as a processor (eg, the processor 220 of FIG. 2 ) or an image signal processor (eg, the image signal processor 130 of FIG. 1 ).
  • the processor 220 may acquire image data through the image sensor 120 .
  • The image data is data before image processing is performed in the image signal processor 130; each pixel may have a color value of R (red), G (green), or B (blue) corresponding to the color filter array.
  • the processor 220 may determine whether the distance between the camera of the electronic device 100 and the target object is less than or equal to a predetermined distance using the depth sensor 210 .
  • the camera of the electronic device 100 may be a depth camera or a camera including a depth detection sensor 210 .
  • the processor 220 may determine whether the target object corresponds to a planar object using the depth sensor.
  • the processor 220 may activate a text recognition function.
  • the text recognition function may be the same as or a lower function of one function (eg, scene optimizer) of the electronic device 100 described with reference to FIG. 3 .
  • the text recognition function may be activated in response to a user input regardless of whether the target object is near or far. Alternatively, the text recognition function may be automatically activated in response to the execution of the camera application.
  • the processor 220 may activate a text recognition function to determine whether a target object (eg, paper or a wall) includes text.
  • The processor 220 may determine the type of the target object, or whether it includes text, to compensate for the fact that it may be difficult to determine whether a paper or book is bent at a short distance. For example, when the processor 220 detects a curved target object at a short distance and determines that the target object is a paper or book, i.e., a flat object including text, it may judge the target object to be a flat object based on that determination.
  • the processor 220 may determine the weight of the text in the target object using a text recognition function.
  • The processor 220 may prioritize the plurality of regions of the target object described below in consideration of the weight of the text in the target object. For example, the processor 220 may not give priority to an area of the target object (e.g., paper) where no text exists, since it is unnecessary to focus there.
  • The processor 220 may give priority to a region where text is present and adjust the focal length accordingly.
  • the processor 220 may determine a plurality of focal points on different regions of the target object.
  • the processor 220 may set a different focus for each frame of the image data obtained from the image sensor 120 .
  • a first focus may be set in an nth frame
  • a second focus may be set in an n+1th frame
  • a third focus may be set in an n+2th frame.
  • the processor 220 may set the first focal length so that the first area of the target object is in focus.
  • the processor 220 may set the second focal length so that the second area of the target object is in focus.
  • The processor 220 may set the third focal length so that the third area of the target object is in focus. The above is only an example; more focal lengths may be set by dividing the target object into more regions.
  • the processor 220 may acquire a plurality of images based on the determined plurality of focal points.
  • The processor 220 may control movement of the lens to adjust the focal length so that a predetermined area is in focus.
  • the processor 220 may acquire a first image based on the first focal length, and acquire a second image based on a second focal length that is longer than the first focal length.
  • For example, the processor 220 may obtain a first image based on a first focal length, a second image based on a second focal length longer than the first focal length, and a third image based on a third focal length shorter than the first focal length.
  • The first image may be focused on a first area of the target object, the second image on a second area, and the third image on a third area.
  • the processor 220 may acquire a final image of the target object based on at least a part of the plurality of acquired images.
  • For example, the processor 220 may determine the first image, acquired based on the first focal length, among the plurality of acquired images as a main image, and determine the second image, acquired based on the second focal length, and the third image, acquired based on the third focal length, as sub-images.
  • the processor 220 may synthesize the acquired images.
  • For example, the processor 220 may synthesize the in-focus first area of the target object from the obtained first image (or main image), the in-focus second area from the obtained second image (or first sub-image), and the in-focus third area from the obtained third image (or second sub-image).
  • the processor 220 may generate a final image by synthesizing the images.
  • the processor 220 may output the generated final image as a preview on the display 110 .
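  • One plausible way to realize the synthesis step, sketched under the assumption that per-pixel sharpness can stand in for the per-region selection described above: each output pixel is taken from whichever captured frame has the strongest local Laplacian response.

```python
import numpy as np
from scipy.ndimage import laplace, uniform_filter

def focus_stack(frames):
    """Composite grayscale frames by picking, per pixel, the frame whose
    local Laplacian energy (a sharpness proxy) is highest."""
    stack = np.stack([f.astype(np.float32) for f in frames])
    # Smooth the absolute Laplacian so the per-pixel choice is stable over regions.
    sharpness = np.stack([uniform_filter(np.abs(laplace(f)), size=9) for f in stack])
    best = np.argmax(sharpness, axis=0)  # index of the sharpest frame at each pixel
    return np.take_along_axis(stack, best[None], axis=0)[0]
```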
  • FIG. 8 is a flowchart illustrating a process of dividing a region of a target object according to an angle formed by a camera of an electronic device with the target object according to an exemplary embodiment.
  • the operating subject of the flowchart illustrated in FIG. 8 may be understood as a processor (eg, the processor 220 of FIG. 2 ) or an image signal processor (eg, the image signal processor 130 of FIG. 1 ).
  • the processor 220 may determine an angle between the camera of the electronic device and the target object.
  • the processor 220 may calculate a normal vector in a direction perpendicular to the plane and may calculate an angle between the normal vector and the Z-axis of the camera coordinate system.
  • the angle between the normal and the Z-axis of the camera coordinate system may be referred to as an angle between a lens and a plane or an angle between the electronic device and the target object.
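  • The angle computation reduces to the arccosine of a normalized dot product between the fitted plane normal and the camera Z axis; a short sketch (the normal from the earlier plane fit is assumed):

```python
import numpy as np

def lens_plane_angle_deg(normal):
    """Angle in degrees between the fitted plane normal and the camera Z axis."""
    z_axis = np.array([0.0, 0.0, 1.0])
    n = normal / np.linalg.norm(normal)
    # The sign of a fitted normal is arbitrary, so take the acute angle.
    return np.degrees(np.arccos(min(1.0, abs(float(n.dot(z_axis))))))
```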
  • the processor 220 may determine whether an angle between the depth camera of the electronic device and the target object is within the first range.
  • the processor 220 may determine whether the angle is an angle within the second range.
  • For example, the first range may be about 0 degrees to about 20 degrees, and the second range, larger than the first range, may be about 20 degrees to about 40 degrees.
  • When the angle between the electronic device and the target object is within the second range, with respect to the center point of the target object at which the lens center is aimed, the distance to the camera becomes shorter than at the center in one direction and longer than at the center in the opposite direction. Since the processor 220 must adjust the focal length by moving the lens both forward and backward from the position of the reference focus, the number of shots may increase.
  • When the angle is within the first range, the distance from the camera may only become greater than at the center, so the processor 220 can adjust the focal length by moving the lens in one direction from the reference focus position, and the number of shots may be reduced compared to when the angle is large.
  • If the angle between the electronic device and the target object falls within the first range, operation 820 may be performed; if it falls within the second range instead of the first range, operation 830 may be performed.
  • In the first case, the processor 220 may divide the target object (e.g., paper) into a first number of regions.
  • In the second case, the processor 220 may divide the target object (e.g., paper) into a second number of regions. The second number may be greater than the first number.
  • the processor 220 may determine a plurality of focal points for a plurality of regions of the target object.
  • the processor 220 may determine a plurality of focal points such that each of the plurality of regions is in focus. For example, the processor 220 may determine the first focal length so that the first area of the target object is in focus.
  • the processor 220 may determine the second focal length so that the second area of the target object is in focus.
  • the processor 220 may determine the third focal length so that the third area of the target object is in focus.
  • the processor 220 may acquire a plurality of images based on the plurality of determined focal points, and may acquire a final image of the target object based on at least some of the plurality of acquired images. For example, when the angle corresponds to the first range, the processor 220 may acquire six images and synthesize three of them to obtain a final image. As another example, when the angle corresponds to the second range, the processor 220 may acquire 9 images and synthesize 4 of them to obtain a final image.
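  • A compact sketch of the branching in FIG. 8. The thresholds are the approximate range bounds given above; the region counts follow the examples of FIGS. 4 and 5, and the capture/synthesis counts follow the examples in the preceding paragraph, so all numbers here are illustrative rather than prescribed.

```python
def plan_regions(angle_deg):
    """Choose segmentation and shot counts from the lens-plane angle (FIG. 8)."""
    if angle_deg <= 20:                 # first range: lens moves in one direction only
        return {"regions": 3, "captures": 6, "synthesized": 3}
    if angle_deg <= 40:                 # second range: lens moves in both directions
        return {"regions": 5, "captures": 9, "synthesized": 4}
    raise ValueError("angle outside the ranges handled in this sketch")
```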
  • FIG. 9 is a flowchart illustrating a process of acquiring an image based on movement of a lens according to an exemplary embodiment.
  • the operating subject of the flowchart illustrated in FIG. 9 may be understood as a processor (eg, the processor 220 of FIG. 2 ) or an image signal processor (eg, the image signal processor 130 of FIG. 1 ).
  • The processor 220 may determine whether the angle between the camera of the electronic device 100 and the target object is within the first range. Operation 905 may correspond to operation 810 of FIG. 8.
  • the processor 220 may acquire an image based on the auto focus function.
  • The processor 220 may acquire a reference image focused on the first area by the auto focus function.
  • The processor 220 may acquire an image while increasing the focal length from the reference focal point.
  • For example, the processor 220 may acquire an image focused on the second area by increasing the focal length beyond the focal length used in operation 910.
  • the processor 220 may determine whether the acquired images are necessary for the focus stack.
  • The processor 220 may determine whether the regions requiring sharpness in the target object are in focus, and thereby determine whether sufficient images have been obtained for the focus stack in the direction of increasing focal length. If sufficient images have been obtained, operation 760 of FIG. 7 is performed; otherwise, an additional image may be obtained by further increasing the focal length.
  • the processor 220 may acquire an image based on the auto focus function.
  • The processor 220 may acquire a reference image focused on the first area by the auto focus function.
  • The processor 220 may acquire an image while increasing the focal length from the reference focal point.
  • For example, the processor 220 may acquire an image focused on the second area by increasing the focal length beyond the focal length used in operation 930.
  • the processor 220 may determine whether the acquired images are sufficient for the focus stack.
  • the processor 220 may determine whether regions requiring sharpness in the target object are in focus, and determine whether sufficient images are obtained in the focus stack in the direction of increasing the focal length. If sufficient images are obtained in the focus stack, operation 945 is performed. Otherwise, an additional image may be obtained by further increasing the focal length.
  • The processor 220 may acquire an image while decreasing the focal length from the reference focal point.
  • For example, the processor 220 may decrease the focal length below the focal length used in operation 930 to obtain an image focused on the third area.
  • the processor 220 may determine whether the acquired images are sufficient for the focus stack.
  • the processor 220 may determine whether a sufficient image is obtained in a focus stack in a direction of reducing a focal length by determining whether areas requiring sharpness in the target object are in focus. If sufficient images are obtained in the focus stack, operation 760 of FIG. 7 is performed. Otherwise, an additional image may be obtained by further reducing the focal length.
  • operations 945 and 950 may precede operations 935 and 940 .
  • For example, the processor 220 may acquire images while increasing the focal length from the reference focal point after it has finished acquiring images while decreasing the focal length from the reference focal point.
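  • The capture loop of FIG. 9 can be sketched as a sweep around the auto-focus reference position: increase the focal length until the stack is judged sufficient and, when the angle falls within the second range, also decrease it. `set_focus`, `capture`, and `stack_sufficient` are hypothetical hooks, and sufficiency is judged per sweep direction.

```python
def sweep_focus(reference_focus, step, set_focus, capture, stack_sufficient,
                bidirectional=False, max_shots=10):
    """Acquire images around the auto-focus reference until the focus stack
    covers every region of the target object that needs to be sharp."""
    set_focus(reference_focus)
    images = [capture()]                      # reference image from auto focus
    directions = (+1, -1) if bidirectional else (+1,)
    for sign in directions:                   # increase first, then (optionally) decrease
        focus = reference_focus
        while not stack_sufficient(images, sign) and len(images) < max_shots:
            focus += sign * step              # move the lens one step further
            set_focus(focus)
            images.append(capture())
    return images
```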
  • FIG. 10 is a block diagram of an electronic device in a network environment according to various embodiments of the present disclosure.
  • Referring to FIG. 10, the electronic device 1001 (e.g., the electronic device 100 of FIG. 1) may communicate with the electronic device 1002 through a first network 1098 (e.g., a short-range wireless communication network), or may communicate with the electronic device 1004 or the server 1008 through a second network 1099 (e.g., a long-range wireless communication network). According to an embodiment, the electronic device 1001 may communicate with the electronic device 1004 through the server 1008.
  • the electronic device 1001 includes a processor 1020 , a memory 1030 , an input module 1050 , a sound output module 1055 , a display module 1060 , an audio module 1070 , and a sensor module ( 1076), interface 1077, connection terminal 1078, haptic module 1079, camera module 1080, power management module 1088, battery 1089, communication module 1090, subscriber identification module 1096 , or an antenna module 1097 .
  • In some embodiments, at least one of these components (e.g., the connection terminal 1078) may be omitted, or one or more other components may be added to the electronic device 1001.
  • In some embodiments, some of these components may be integrated into one component (e.g., the display module 1060).
  • The processor 1020 (e.g., the processor 220 of FIG. 2) may, for example, execute software (e.g., the program 1040) to control at least one other component (e.g., a hardware or software component) of the electronic device 1001 connected to the processor 1020, and may perform various data processing or operations. According to an embodiment, as at least part of the data processing or operations, the processor 1020 may store a command or data received from another component (e.g., the sensor module 1076 or the communication module 1090) in the volatile memory 1032, process the command or data stored in the volatile memory 1032, and store the resulting data in the non-volatile memory 1034.
  • According to an embodiment, the processor 1020 may include a main processor 1021 (e.g., a central processing unit or an application processor) or an auxiliary processor 1023 that can operate independently of, or together with, the main processor (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor (e.g., the image signal processor 130 of FIG. 1), a sensor hub processor, or a communication processor).
  • For example, when the electronic device 1001 includes the main processor 1021 and the auxiliary processor 1023, the auxiliary processor 1023 may be set to use less power than the main processor 1021 or to be specialized for a specified function.
  • the auxiliary processor 1023 may be implemented separately from or as a part of the main processor 1021 .
  • The auxiliary processor 1023 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 1001 (e.g., the display module 1060, the sensor module 1076, or the communication module 1090), on behalf of the main processor 1021 while the main processor 1021 is in an inactive (e.g., sleep) state, or together with the main processor 1021 while the main processor 1021 is in an active (e.g., application execution) state.
  • According to an embodiment, the auxiliary processor 1023 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 1080 or the communication module 1090).
  • the auxiliary processor 1023 may include a hardware structure specialized for processing an artificial intelligence model.
  • Artificial intelligence models can be created through machine learning. Such learning may be performed, for example, in the electronic device 1001 itself on which artificial intelligence is performed, or may be performed through a separate server (eg, server 1008).
  • The learning algorithm may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but is not limited to the above examples.
  • the artificial intelligence model may include a plurality of artificial neural network layers.
  • The artificial neural network may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), a deep Q-network, or a combination of two or more of the above, but is not limited to these examples.
  • the artificial intelligence model may include, in addition to, or alternatively, a software structure in addition to the hardware structure.
  • the memory 1030 may store various data used by at least one component of the electronic device 1001 (eg, the processor 1020 or the sensor module 1076 ).
  • the data may include, for example, input data or output data for software (eg, the program 1040 ) and instructions related thereto.
  • the memory 1030 may include a volatile memory 1032 or a non-volatile memory 1034 .
  • the program 1040 may be stored as software in the memory 1030 , and may include, for example, an operating system 1042 , middleware 1044 , or an application 1046 .
  • the input module 1050 may receive a command or data to be used in a component (eg, the processor 1020 ) of the electronic device 1001 from the outside (eg, a user) of the electronic device 1001 .
  • the input module 1050 may include, for example, a microphone, a mouse, a keyboard, a key (eg, a button), or a digital pen (eg, a stylus pen).
  • the sound output module 1055 may output a sound signal to the outside of the electronic device 1001 .
  • the sound output module 1055 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback.
  • the receiver may be used to receive an incoming call. According to an embodiment, the receiver may be implemented separately from or as a part of the speaker.
  • the display module 1060 may visually provide information to the outside (eg, a user) of the electronic device 1001 .
  • the display module 1060 may include, for example, a control circuit for controlling a display, a hologram device, or a projector and a corresponding device.
  • the display module 1060 may include a touch sensor configured to sense a touch or a pressure sensor configured to measure the intensity of a force generated by the touch.
  • The audio module 1070 may convert a sound into an electric signal or, conversely, convert an electric signal into a sound. According to an embodiment, the audio module 1070 may acquire a sound through the input module 1050, or output a sound through the sound output module 1055 or an external electronic device (e.g., the electronic device 1002, such as a speaker or headphones) directly or wirelessly connected to the electronic device 1001.
  • The sensor module 1076 may detect an operating state (e.g., power or temperature) of the electronic device 1001 or an external environmental state (e.g., a user state), and generate an electrical signal or data value corresponding to the detected state.
  • According to an embodiment, the sensor module 1076 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 1077 may support one or more specified protocols that may be used for the electronic device 1001 to directly or wirelessly connect with an external electronic device (eg, the electronic device 1002 ).
  • the interface 1077 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • connection terminal 1078 may include a connector through which the electronic device 1001 can be physically connected to an external electronic device (eg, the electronic device 1002 ).
  • the connection terminal 1078 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 1079 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that the user can perceive through tactile or kinesthetic sense.
  • the haptic module 1079 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 1080 may capture still images and moving images. According to an embodiment, the camera module 1080 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 1088 may manage power supplied to the electronic device 1001 .
  • the power management module 1088 may be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • the battery 1089 may supply power to at least one component of the electronic device 1001 .
  • the battery 1089 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • the communication module 1090 may support establishment of a direct (eg, wired) communication channel or a wireless communication channel between the electronic device 1001 and an external electronic device (eg, the electronic device 1002, the electronic device 1004, or the server 1008), and communication through the established communication channel.
  • the communication module 1090 may include one or more communication processors that operate independently of the processor 1020 (eg, an application processor) and support direct (eg, wired) communication or wireless communication.
  • the communication module 1090 may include a wireless communication module 1092 (eg, a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 1094 (eg, a local area network (LAN) communication module or a power line communication module).
  • a corresponding communication module among these communication modules may communicate with the external electronic device 1004 through a first network 1098 (eg, a short-range communication network such as Bluetooth, wireless fidelity (WiFi) direct, or infrared data association (IrDA)) or a second network 1099 (eg, a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (eg, a LAN or a WAN)).
  • these various types of communication modules may be integrated into one component (eg, a single chip) or implemented as a plurality of separate components (eg, multiple chips).
  • the wireless communication module 1092 may identify or authenticate the electronic device 1001 within a communication network, such as the first network 1098 or the second network 1099, using subscriber information (eg, an International Mobile Subscriber Identifier (IMSI)) stored in the subscriber identification module 1096.
  • the wireless communication module 1092 may support a 5G network following a 4G network, and a next-generation communication technology, for example, new radio (NR) access technology.
  • NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband, eMBB), minimization of terminal power and access by multiple terminals (massive machine type communications, mMTC), or high reliability and low latency (ultra-reliable and low-latency communications, URLLC).
  • the wireless communication module 1092 may support a high-frequency band (eg, the mmWave band), for example, to achieve a high data rate.
  • the wireless communication module 1092 may support various technologies for securing performance in a high-frequency band, for example, beamforming, massive multiple-input and multiple-output (MIMO), full-dimensional MIMO (FD-MIMO), an array antenna, analog beamforming, or a large-scale antenna.
  • the wireless communication module 1092 may support various requirements defined in the electronic device 1001 , an external electronic device (eg, the electronic device 1004 ), or a network system (eg, the second network 1099 ).
  • the wireless communication module 1092 may support a peak data rate for realizing eMBB (eg, 20 Gbps or more), loss coverage for realizing mMTC (eg, 164 dB or less), or U-plane latency for realizing URLLC (eg, 0.5 ms or less for each of downlink (DL) and uplink (UL), or a round trip of 1 ms or less).
  • the antenna module 1097 may transmit or receive a signal or power to the outside (eg, an external electronic device).
  • the antenna module 1097 may include an antenna including a radiator formed of a conductor or a conductive pattern formed on a substrate (eg, a PCB).
  • the antenna module 1097 may include a plurality of antennas (eg, an array antenna). In this case, at least one antenna suitable for a communication method used in a communication network, such as the first network 1098 or the second network 1099, may be selected from the plurality of antennas by, for example, the communication module 1090. A signal or power may be transmitted or received between the communication module 1090 and an external electronic device through the selected at least one antenna.
  • according to an embodiment, components other than the radiator (eg, a radio frequency integrated circuit (RFIC)) may be additionally formed as a part of the antenna module 1097.
  • the antenna module 1097 may form a mmWave antenna module.
  • the mmWave antenna module may include a printed circuit board, an RFIC disposed on or adjacent to a first surface (eg, the bottom surface) of the printed circuit board and capable of supporting a designated high-frequency band (eg, the mmWave band), and a plurality of antennas (eg, an array antenna) disposed on or adjacent to a second surface (eg, the top or side surface) of the printed circuit board and capable of transmitting or receiving signals in the designated high-frequency band.
  • at least some of the above-described components may be connected to each other through a communication method between peripheral devices (eg, a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (eg, commands or data) with each other.
  • a command or data may be transmitted or received between the electronic device 1001 and the external electronic device 1004 through the server 1008 connected to the second network 1099 .
  • Each of the external electronic devices 1002 and 1004 may be the same as or different from the electronic device 1001 .
  • all or a part of operations executed by the electronic device 1001 may be executed by one or more external electronic devices 1002 , 1004 , or 1008 .
  • when the electronic device 1001 needs to perform a function or a service automatically or in response to a request from a user or another device, the electronic device 1001 may, instead of or in addition to executing the function or the service by itself, request one or more external electronic devices to perform at least a part of the function or the service.
  • One or more external electronic devices that have received the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit a result of the execution to the electronic device 1001 .
  • the electronic device 1001 may process the result as it is or process it additionally, and provide it as at least a part of a response to the request.
  • cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used.
  • the electronic device 1001 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 1004 may include an Internet of things (IoT) device.
  • Server 1008 may be an intelligent server using machine learning and/or neural networks.
  • the external electronic device 1004 or the server 1008 may be included in the second network 1099 .
  • the electronic device 1001 may be applied to an intelligent service (eg, smart home, smart city, smart car, or health care) based on 5G communication technology and IoT-related technology.
  • the electronic device may be a device of various types.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance device.
  • terms such as “first” or “second” may simply be used to distinguish a component from other components in question, and do not limit the components in other aspects (eg, importance or order). When one (eg, first) component is referred to as being “coupled” or “connected” to another (eg, second) component, with or without the terms “functionally” or “communicatively”, it means that the one component can be connected to the other component directly (eg, by wire), wirelessly, or through a third component.
  • the term “module” used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
  • a module may be an integrally formed part, or a minimum unit or a part thereof, that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • various embodiments of this document may be implemented as software (eg, the program 1040) including one or more instructions stored in a storage medium readable by a device (eg, the electronic device 1001). For example, a processor (eg, the processor 1020) of the device may invoke at least one of the one or more instructions stored in the storage medium and execute it.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
  • 'non-transitory' only means that the storage medium is a tangible device and does not include a signal (eg, an electromagnetic wave); this term does not distinguish between a case where data is semi-permanently stored in the storage medium and a case where data is temporarily stored therein.
  • the methods according to various embodiments disclosed in this document may be provided by being included in a computer program product.
  • Computer program products may be traded between sellers and buyers as commodities.
  • the computer program product may be distributed in the form of a machine-readable storage medium (eg, a compact disc read only memory (CD-ROM)), or distributed online (eg, downloaded or uploaded) via an application store (eg, Play Store™) or directly between two user devices (eg, smartphones).
  • in the case of online distribution, at least a part of the computer program product may be temporarily stored or temporarily generated in a machine-readable storage medium, such as a memory of a manufacturer's server, an application store's server, or a relay server.
  • each component (eg, a module or a program) of the above-described components may include a singular entity or a plurality of entities, and some of the plurality of entities may be separately disposed in another component.
  • according to various embodiments, one or more of the above-described components or operations may be omitted, or one or more other components or operations may be added.
  • alternatively or additionally, a plurality of components (eg, a module or a program) may be integrated into one component. In this case, the integrated component may perform one or more functions of each of the plurality of components identically or similarly to those performed by the corresponding component among the plurality of components prior to the integration.
  • operations performed by a module, a program, or another component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted; or one or more other operations may be added.
  • FIG. 11 is a block diagram illustrating a camera module (eg, the camera module 180 of FIG. 1 ) according to various embodiments of the present disclosure.
  • the camera module 1080 may include a lens assembly 1110, a flash 1120, an image sensor 1130, an image stabilizer 1140, a memory 1150 (eg, a buffer memory), or an image signal processor 1160.
  • the lens assembly 1110 may collect light emitted from a subject, which is an image capturing object.
  • the lens assembly 1110 may include one or more lenses.
  • the camera module 1080 may include a plurality of lens assemblies 1110 . In this case, the camera module 1080 may form, for example, a dual camera, a 360 degree camera, or a spherical camera.
  • some of the plurality of lens assemblies 1110 may have the same lens properties (eg, angle of view, focal length, auto focus, f-number, or optical zoom), or at least one lens assembly may have one or more lens properties different from those of the other lens assemblies.
  • the lens assembly 1110 may include, for example, a wide-angle lens or a telephoto lens.
  • the flash 1120 may emit light used to enhance light emitted or reflected from the subject.
  • the flash 1120 may include one or more light emitting diodes (eg, a red-green-blue (RGB) LED, a white LED, an infrared LED, or an ultraviolet LED), or a xenon lamp.
  • the image sensor 1130 may acquire an image corresponding to the subject by converting light emitted or reflected from the subject and transmitted through the lens assembly 1110 into an electrical signal.
  • the image sensor 1130 may include, for example, one image sensor selected from among image sensors having different properties, such as an RGB sensor, a black-and-white (BW) sensor, an IR sensor, or a UV sensor; a plurality of image sensors having the same property; or a plurality of image sensors having different properties.
  • Each image sensor included in the image sensor 1130 may be implemented using, for example, a charged coupled device (CCD) sensor or a complementary metal oxide semiconductor (CMOS) sensor.
  • in response to a movement of the camera module 1080 or the electronic device 1001 including it, the image stabilizer 1140 may move at least one lens included in the lens assembly 1110 or the image sensor 1130 in a specific direction, or control operation characteristics of the image sensor 1130 (eg, adjust the read-out timing). This makes it possible to compensate for at least some of the negative effects of the movement on the image being captured.
  • according to an embodiment, the image stabilizer 1140 may detect such a movement of the camera module 1080 or the electronic device 1001 using a gyro sensor (not shown) or an acceleration sensor (not shown) disposed inside or outside the camera module 1080.
  • the image stabilizer 1140 may be implemented as, for example, an optical image stabilizer.
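By way of illustration only, the compensation just described can be sketched as integrating a gyro reading into a pixel offset and shifting the frame the opposite way. The function name, the `pixels_per_radian` calibration constant, and the use of `np.roll` as a stand-in for an optical lens or sensor shift are all illustrative assumptions, not the disclosed mechanism.

```python
import numpy as np

def stabilize_frame(frame: np.ndarray, gyro_rate_xy: tuple,
                    dt: float, pixels_per_radian: float) -> np.ndarray:
    """Shift a frame opposite to the measured camera rotation.

    gyro_rate_xy: angular velocity (rad/s) around the x and y axes, as a
    gyro sensor might report it. pixels_per_radian is a hypothetical
    calibration constant mapping rotation to pixel motion.
    """
    # Integrate angular velocity over the frame interval to get an angle,
    # then convert it to a pixel displacement.
    dx = -gyro_rate_xy[1] * dt * pixels_per_radian  # yaw -> horizontal shift
    dy = -gyro_rate_xy[0] * dt * pixels_per_radian  # pitch -> vertical shift
    # np.roll stands in for the optical/mechanical shift of the lens or sensor.
    return np.roll(frame, shift=(round(dy), round(dx)), axis=(0, 1))
```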
  • the memory 1150 may temporarily store at least a portion of the image acquired through the image sensor 1130 for the next image processing operation. For example, when image acquisition is delayed due to the shutter, or when a plurality of images are acquired at high speed, the acquired original image (eg, a Bayer-patterned image or a high-resolution image) may be stored in the memory 1150, and a copy image corresponding to it (eg, a low-resolution image) may be previewed through the display device 1060.
  • the memory 1150 may be configured as at least a part of the memory 1030 or as a separate memory operated independently of the memory 1030 .
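A minimal sketch of that buffer-and-preview pattern follows, assuming grayscale numpy frames. The class name, buffer capacity, and subsampling factor are illustrative; a real device would derive previews in the ISP rather than by naive subsampling.

```python
import numpy as np
from collections import deque

class FrameBuffer:
    """Toy buffer: keep full-resolution frames, hand out preview copies."""

    def __init__(self, capacity: int = 8, preview_step: int = 4):
        self.frames = deque(maxlen=capacity)  # original (e.g. high-res) frames
        self.preview_step = preview_step      # subsampling factor for previews

    def push(self, frame: np.ndarray) -> np.ndarray:
        self.frames.append(frame)  # keep the original for later processing
        # Return a low-resolution copy suitable for on-screen preview.
        return frame[::self.preview_step, ::self.preview_step]
```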
  • the image signal processor 1160 may perform one or more image processing on an image acquired through the image sensor 1130 or an image stored in the memory 1150 .
  • the one or more image processes may include, for example, depth map generation, 3D modeling, panorama generation, feature point extraction, image synthesis, or image compensation (eg, noise reduction, resolution adjustment, brightness adjustment, blurring, sharpening, or softening).
  • additionally or alternatively, the image signal processor 1160 may perform control (eg, exposure time control or read-out timing control) on at least one (eg, the image sensor 1130) of the components included in the camera module 1080.
  • the image processed by the image signal processor 1160 may be stored back in the memory 1150 for further processing.
  • the image signal processor 1160 may be configured as at least a part of the processor 1020 or as a separate processor operated independently of the processor 1020.
  • when the image signal processor 1160 is configured as a separate processor from the processor 1020, the at least one image processed by the image signal processor 1160 may be displayed through the display device 1060 as it is, or after additional image processing is performed by the processor 1020.
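To make the compensation steps listed above concrete, here is a toy two-stage pipeline, Gaussian noise reduction followed by unsharp-mask sharpening. It is a sketch under assumed parameters, not the device's actual ISP.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def simple_isp(img: np.ndarray, denoise_sigma: float = 1.0,
               sharpen_amount: float = 0.5) -> np.ndarray:
    """Toy pipeline in the spirit of the image compensation steps above:
    noise reduction, then sharpening. Parameter values are illustrative."""
    denoised = gaussian_filter(img.astype(float), sigma=denoise_sigma)
    blurred = gaussian_filter(denoised, sigma=2.0)
    # Unsharp mask: add back a scaled copy of the high-frequency detail.
    sharpened = denoised + sharpen_amount * (denoised - blurred)
    return np.clip(sharpened, 0, 255)
```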
  • the electronic device 1001 may include a plurality of camera modules 1080 each having different properties or functions.
  • at least one of the plurality of camera modules 1080 may be a wide-angle camera, and at least another may be a telephoto camera.
  • at least one of the plurality of camera modules 1080 may be a front camera, and at least another may be a rear camera.
  • the electronic device 100 may include an image sensor 120, a depth sensor 210, and at least one processor (eg, the processor 220 of FIG. 2) electrically connected to the image sensor 120 and the depth sensor 210.
  • the at least one processor 220 may determine whether the target object corresponds to a flat object using the depth sensor 210 and activate a text recognition function.
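By way of illustration only, one plausible realization of the planarity determination is to fit a plane to the depth map and threshold the fit residual. The sketch below assumes a dense HxW depth map in meters from the depth sensor; the function name and tolerance value are illustrative, not taken from the disclosure.

```python
import numpy as np

def is_planar(depth: np.ndarray, tol: float = 0.01) -> bool:
    """Fit a plane z = a*x + b*y + c to a depth map by least squares and
    treat the target object as planar when the RMS residual is small.
    `tol` (meters) is a hypothetical flatness threshold."""
    h, w = depth.shape
    ys, xs = np.mgrid[0:h, 0:w]
    A = np.column_stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    coeffs, *_ = np.linalg.lstsq(A, depth.ravel(), rcond=None)
    residual = depth.ravel() - A @ coeffs
    return float(np.sqrt(np.mean(residual ** 2))) < tol
```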
  • in response to the target object corresponding to the planar object and the activation of the text recognition function, the processor 220 may determine a plurality of focal points for different regions of the target object, and may acquire a plurality of images by driving the image sensor 120 based on the determined plurality of focal points.
  • the processor 220 may acquire a final image of the target object based on at least a part of the plurality of acquired images.
  • when the angle between the camera of the electronic device 100 and the target object is within a first range, the processor 220 may determine a plurality of focal points by dividing the area of the target object into a first number of regions; when the angle is within a second range greater than the first range, the processor 220 may determine a plurality of focal points by dividing the area of the target object into a second number of regions, as sketched below.
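A minimal sketch of that angle-to-division rule. The concrete range boundaries and region counts are placeholders, since the document only states that the larger angle range maps to a different number of regions.

```python
def region_count_for_angle(angle_deg: float,
                           first_range: tuple = (0.0, 15.0),
                           second_range: tuple = (15.0, 45.0),
                           first_number: int = 2,
                           second_number: int = 4) -> int:
    """Pick how many focus regions to use from the camera-object angle.
    Ranges and counts are illustrative placeholders."""
    if first_range[0] <= angle_deg < first_range[1]:
        return first_number
    if second_range[0] <= angle_deg < second_range[1]:
        return second_number
    return second_number  # fall back to the finer division for steep angles
```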
  • the processor 220 may acquire the image while increasing the focal length from the reference focal length.
  • the processor 220 may acquire images while increasing the focal length from the reference focal length, and may also acquire images while decreasing the focal length from the reference focal length.
  • the angle between the electronic device 100 and the target object may be the angle between a normal perpendicular to the target object and the z-axis of the camera coordinate system, as in the sketch below.
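Assuming the plane normal comes from a fit such as the `is_planar` sketch above (a plane z = a*x + b*y + c has a normal proportional to (a, b, -1)), the angle can be computed as follows; the absolute value accounts for the normal's arbitrary sign.

```python
import numpy as np

def camera_object_angle(plane_normal: np.ndarray) -> float:
    """Angle (degrees) between the target plane's normal and the camera
    z-axis, taken here as the unit vector [0, 0, 1] in camera coordinates."""
    n = plane_normal / np.linalg.norm(plane_normal)
    z_axis = np.array([0.0, 0.0, 1.0])
    # abs(): the fitted normal's sign is arbitrary.
    cos_angle = np.clip(abs(float(n @ z_axis)), 0.0, 1.0)
    return float(np.degrees(np.arccos(cos_angle)))
```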
  • the processor 220 may determine whether the target object includes a text area greater than or equal to a certain ratio using the image sensor 120 or the depth sensor 210 .
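The document does not specify how the text-area ratio is measured; as a crude stand-in, the sketch below uses edge density, since text regions tend to be edge-dense. A real implementation would use a text detector or an OCR engine, so this heuristic and its threshold are purely illustrative.

```python
import numpy as np

def text_area_ratio(gray: np.ndarray, edge_thresh: float = 30.0) -> float:
    """Fraction of strongly varying pixels, a toy proxy for text coverage
    in a grayscale image. `edge_thresh` is an assumed gradient threshold."""
    gx = np.abs(np.diff(gray.astype(float), axis=1))  # horizontal gradient
    gy = np.abs(np.diff(gray.astype(float), axis=0))  # vertical gradient
    edges = (gx[:-1, :] + gy[:, :-1]) > edge_thresh   # crop to common shape
    return float(edges.mean())
```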
  • the text recognition function may be activated by a user input or may be activated automatically in response to execution of a camera application.
  • the processor 220 may acquire a first image based on a first focal length so that a first area of the target object is in focus, and may acquire a second image based on a second focal length so that a second area of the target object is in focus.
  • the processor 220 may determine at least one image among the plurality of acquired images as a main image and determine the remaining images as sub images.
  • the processor 220 may obtain a final image of the target object by synthesizing a portion of the sub images with the main image.
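One common way to realize this synthesis is per-pixel sharpness selection across the focus-bracketed frames; the sketch below uses Laplacian energy as the sharpness measure on grayscale images. This is an assumed technique, not necessarily the synthesis the disclosure performs, and real pipelines would also align the frames before merging.

```python
import numpy as np
from scipy.ndimage import laplace, uniform_filter

def focus_stack(images: list) -> np.ndarray:
    """Merge focus-bracketed grayscale frames by keeping, per pixel, the
    frame with the highest local sharpness (smoothed Laplacian energy)."""
    stack = np.stack([img.astype(float) for img in images])
    # Local sharpness map per frame: squared Laplacian, locally averaged.
    sharpness = np.stack(
        [uniform_filter(laplace(img) ** 2, size=9) for img in stack]
    )
    best = np.argmax(sharpness, axis=0)  # index of sharpest frame per pixel
    return np.take_along_axis(stack, best[None], axis=0)[0]
```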
  • the different areas may be prioritized for the focus stack based on the size of the different areas, the degree of blur, and the distance from the camera of the electronic device.
  • the processor 220 may determine a focus for the high-priority regions, and acquire a plurality of images based on the determined focus.
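A hedged sketch of that prioritization: the document names the three factors (region size, degree of blur, distance from the camera) but not how they combine, so the linear weighting below is purely illustrative.

```python
def region_priority(size_px: float, blur: float, distance_m: float,
                    w_size: float = 1.0, w_blur: float = 1.0,
                    w_dist: float = 1.0) -> float:
    """Score a region for the focus stack: larger, blurrier, and closer
    regions are assumed to rank higher. Weights are placeholders."""
    return w_size * size_px + w_blur * blur - w_dist * distance_m
```

Regions would then be sorted by this score in descending order, and only the top-ranked regions focus-bracketed, matching the high-priority selection described above.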
  • the operating method of the electronic device 100 may include an operation of determining whether the target object corresponds to a planar object using the depth sensor 210.
  • the operating method of the electronic device 100 may include an operation of activating a text recognition function.
  • the operating method of the electronic device 100 may include an operation of determining a plurality of focal points for different regions of the target object in response to the target object corresponding to the planar object and the activation of the text recognition function.
  • the operating method of the electronic device 100 may include an operation of acquiring a plurality of images by driving the image sensor 120 based on the determined plurality of focal points.
  • the operating method of the electronic device 100 may include an operation of acquiring a final image of the target object based on at least a part of the plurality of acquired images.
  • the determining of the plurality of focal points may include, when the angle between the camera of the electronic device 100 and the target object is within a first range, an operation of determining the plurality of focal points by dividing the area of the target object into a first number of regions, and, when the angle is within a second range greater than the first range, an operation of determining the plurality of focal points by dividing the area of the target object into a second number of regions.
  • the acquiring of the plurality of images may include an operation of acquiring images while increasing the focal length from a reference focal length when the angle between the camera of the electronic device 100 and the target object is within a first range, and an operation of acquiring images while decreasing the focal length from the reference focal length.
  • the operating method of the electronic device 100 may include an operation of determining whether the target object includes a text area of a certain ratio or more using the image sensor 120 or the depth sensor 210.
  • the acquiring of the plurality of images may include an operation of acquiring a first image based on a first focal length so that a first region of the target object is in focus, and an operation of acquiring a second image based on a second focal length so that a second region of the target object is in focus.
  • the obtaining of the final image may include an operation of determining at least one image from among the plurality of acquired images as a main image, an operation of determining the remaining images as sub images, and an operation of obtaining a final image of the target object by synthesizing a portion of the sub images with the main image.
  • the operating method of the electronic device 100 may include an operation of determining a priority for the focus stack among the different regions based on the size of the different regions, the degree of blur, and the distance from the camera of the electronic device.
  • the operating method of the electronic device 100 may include an operation of determining a focus for the high-priority regions and an operation of acquiring a plurality of images based on the determined focus.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

At least one processor included in an electronic device may: determine whether a target object corresponds to a planar object using a depth sensor; activate a text recognition function; determine a plurality of focal points for different regions of the target object in response to the target object corresponding to the planar object and the text recognition function being activated; acquire a plurality of images by driving an image sensor based on the determined plurality of focal points; and acquire a final image of the target object based on at least a part of the acquired plurality of images. Various other embodiments identified from the description are possible.
PCT/KR2021/005965 2020-06-12 2021-05-12 Dispositif électronique comprenant une fonction de réglage de mise au point, et procédé associé WO2021251631A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020200071755A KR20210154594A (ko) 2020-06-12 2020-06-12 초점 조절 기능을 포함하는 전자 장치 및 방법
KR10-2020-0071755 2020-06-12

Publications (1)

Publication Number Publication Date
WO2021251631A1 true WO2021251631A1 (fr) 2021-12-16

Family

ID=78846236

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/005965 WO2021251631A1 (fr) 2020-06-12 2021-05-12 Dispositif électronique comprenant une fonction de réglage de mise au point, et procédé associé

Country Status (2)

Country Link
KR (1) KR20210154594A (fr)
WO (1) WO2021251631A1 (fr)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004347858A (ja) * 2003-05-22 2004-12-09 Nec Corp カバー付き撮像装置
KR20060105930A (ko) * 2005-04-01 2006-10-12 엘지전자 주식회사 카메라 폰에서의 문자 인식 장치 및 방법
JP2018107593A (ja) * 2016-12-26 2018-07-05 キヤノン株式会社 画像処理装置、画像処理方法およびプログラム
JP2019032370A (ja) * 2017-08-04 2019-02-28 キヤノン株式会社 撮像装置、及びその制御方法
JP2019032443A (ja) * 2017-08-08 2019-02-28 キヤノン株式会社 焦点調節装置、撮像装置、焦点調節方法、及びプログラム

Also Published As

Publication number Publication date
KR20210154594A (ko) 2021-12-21

Similar Documents

Publication Publication Date Title
WO2022039424A1 (fr) Procédé de stabilisation d'images et dispositif électronique associé
WO2022030838A1 (fr) Dispositif électronique et procédé de commande d'image de prévisualisation
WO2022108235A1 (fr) Procédé, appareil et support de stockage pour obtenir un obturateur lent
WO2022149654A1 (fr) Dispositif électronique pour réaliser une stabilisation d'image, et son procédé de fonctionnement
WO2022092706A1 (fr) Procédé de prise de photographie à l'aide d'une pluralité de caméras, et dispositif associé
WO2022196993A1 (fr) Dispositif électronique et procédé de capture d'image au moyen d'un angle de vue d'un module d'appareil de prise de vues
WO2022149812A1 (fr) Dispositif électronique comprenant un module de caméra et procédé de fonctionnement de dispositif électronique
WO2021251631A1 (fr) Dispositif électronique comprenant une fonction de réglage de mise au point, et procédé associé
WO2022080737A1 (fr) Procédé de correction de distorsion d'image, et dispositif électronique associé
WO2021230567A1 (fr) Procédé de capture d'image faisant intervenir une pluralité d'appareils de prise de vues et dispositif associé
WO2022240186A1 (fr) Procédé de correction de distorsion d'image et dispositif électronique associé
WO2022154164A1 (fr) Dispositif électronique apte à régler un angle de vue et procédé de fonctionnement associé
WO2022092607A1 (fr) Dispositif électronique comportant un capteur d'image et procédé de fonctionnement de celui-ci
WO2022235043A1 (fr) Dispositif électronique comprenant une pluralité de caméras et son procédé de fonctionnement
WO2022220621A1 (fr) Dispositif électronique comprenant un réflecteur et un ensemble objectif
WO2021230507A1 (fr) Procédé et dispositif pour fournir un guidage en imagerie
WO2022025574A1 (fr) Dispositif électronique comprenant un capteur d'image et un processeur de signal d'image, et son procédé
WO2022220444A1 (fr) Procédé de balayage lors d'une prise de vue avec un appareil photo et appareil électronique associé
WO2022231270A1 (fr) Dispositif électronique et son procédé de traitement d'image
WO2023033396A1 (fr) Dispositif électronique pour traiter une entrée de prise de vue continue, et son procédé de fonctionnement
WO2022186495A1 (fr) Dispositif électronique comprenant une pluralité d'objectifs et procédé de commande dudit dispositif
WO2022173236A1 (fr) Dispositif électronique comprenant un capteur d'image et son procédé de fonctionnement
WO2022231251A1 (fr) Procédé de traitement d'images et dispositif électronique pour le prendre en charge
WO2023146236A1 (fr) Dispositif électronique comprenant un module de caméra
WO2023043132A1 (fr) Dispositif électronique d'application d'effet bokeh sur une image et son procédé de fonctionnement

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21822079

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21822079

Country of ref document: EP

Kind code of ref document: A1