WO2022080737A1 - Image distortion correction method and associated electronic device - Google Patents

Image distortion correction method and associated electronic device

Info

Publication number
WO2022080737A1
Authority
WO
WIPO (PCT)
Prior art keywords
region
parameter value
image
processor
face
Prior art date
Application number
PCT/KR2021/013658
Other languages
English (en)
Korean (ko)
Inventor
신대규
히링크유리
문승민
Original Assignee
삼성전자 주식회사 (Samsung Electronics Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자 주식회사 (Samsung Electronics Co., Ltd.)
Publication of WO2022080737A1

Links

Images

Classifications

    • G06T 5/80 Geometric correction (G PHYSICS › G06 COMPUTING; CALCULATING OR COUNTING › G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL › G06T 5/00 Image enhancement or restoration)
    • G06F 18/00 Pattern recognition (G06F ELECTRIC DIGITAL DATA PROCESSING)
    • G06T 3/18 Image warping, e.g. rearranging pixels individually (G06T 3/00 Geometric image transformations in the plane of the image)
    • G06T 7/11 Region-based segmentation (G06T 7/00 Image analysis › G06T 7/10 Segmentation; Edge detection)
    • G06V 10/25 Determination of region of interest [ROI] or a volume of interest [VOI] (G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING › G06V 10/20 Image preprocessing)
    • G06V 40/16 Human faces, e.g. facial parts, sketches or expressions (G06V 40/00 Recognition of biometric, human-related or animal-related patterns › G06V 40/10 Human or animal bodies)
    • G06T 2207/30201 Face (G06T 2207/00 Indexing scheme for image analysis or image enhancement › G06T 2207/30 Subject of image › G06T 2207/30196 Human being; Person)

Definitions

  • Embodiments disclosed in this document relate to an electronic device and method including a function of correcting image distortion.
  • The camera of a mobile terminal is now also required to provide a field of view (FOV) comparable to that of a digital camera.
  • Cameras with various angles of view are being applied to mobile phones; among them, a wide-angle camera such as an ultra-wide camera can capture a wide area for display.
  • Stereographic projection is a method used to correct an object by changing the angle of view in an image distorted by lens characteristics.
  • When stereographic projection is used to correct a distorted image, a face at the edge of the image is corrected, but since the angle of view is changed for the entire image, distortion instead occurs around the face.
  • This document provides a method for efficiently processing distortion correction of an object in an image (e.g., a human face) by setting parameters related to distortion compensation, such as the field of view (FOV), to different values in a region of interest and in the region outside it, both determined based on the face region of the image.
  • An electronic device may include an image sensor, a display, and at least one processor electrically connected to the image sensor and the display.
  • The at least one processor may detect a face in an image acquired through the image sensor, divide the acquired image into a plurality of grid regions, set at least one grid region corresponding to the detected face among the plurality of grid regions as a region of interest, apply a parameter related to distortion compensation to the at least one grid region set as the region of interest as a first parameter value, and apply, to the grid regions other than that at least one grid region, a second parameter value lower than the first parameter value.
  • the at least one processor may perform distortion correction on the image based on the first parameter value and the second parameter value, and display the distortion-corrected image on the display.
  • According to an embodiment, a method of operating an electronic device may include an operation of detecting a face in an image obtained through an image sensor, an operation of dividing the obtained image into a plurality of grid regions, an operation of setting at least one grid region corresponding to the detected face among the plurality of grid regions as a region of interest, an operation of applying a parameter related to distortion compensation to the region of interest as a first parameter value and applying a lower second parameter value to the remaining grid regions, an operation of performing distortion correction on the image based on the first and second parameter values, and an operation of displaying the distortion-corrected image on a display.
  • According to an embodiment, a storage medium may store computer-readable instructions that, when executed by at least one processor of an electronic device, cause the at least one processor to: detect a face in an image obtained through an image sensor; divide the obtained image into a plurality of grid regions; set at least one grid region corresponding to the detected face among the plurality of grid regions as a region of interest; apply a parameter related to distortion compensation to the at least one grid region set as the region of interest as a first parameter value; apply, to the grid regions other than that at least one grid region, a second parameter value lower than the first parameter value; perform distortion correction on the image based on the first parameter value and the second parameter value; and display the distortion-corrected image on a display.
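  • As an illustration only, the grid/ROI parameter assignment described above could be sketched in Python as follows; the grid size and the example parameter values (a first value of 120 for the region of interest, a lower second value of 90 elsewhere) are assumptions, not values fixed by this document.

        import numpy as np

        def build_parameter_map(image_shape, face_boxes, grid=(8, 6),
                                first_value=120.0, second_value=90.0):
            # Divide the image into grid regions and assign a distortion-
            # compensation parameter per region: first_value for grid cells
            # overlapping a detected face (the region of interest), and the
            # lower second_value for all remaining cells.
            h, w = image_shape[:2]
            rows, cols = grid
            params = np.full((rows, cols), second_value, dtype=np.float32)
            cell_h, cell_w = h / rows, w / cols
            for (x, y, bw, bh) in face_boxes:  # face boxes in pixel units
                r0, r1 = int(y // cell_h), int((y + bh) // cell_h)
                c0, c1 = int(x // cell_w), int((x + bw) // cell_w)
                params[max(r0, 0):min(r1 + 1, rows),
                       max(c0, 0):min(c1 + 1, cols)] = first_value
            return params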
  • According to the embodiments disclosed in this document, distortion can be effectively corrected in the face region, and line distortion in the periphery outside the face region can be minimized.
  • In addition, power consumption may be minimized by reducing the amount of computation related to the correction.
  • Moreover, distortion is minimized even in a wide-angle photographing environment, so that a high-quality picture can be provided to the user.
  • FIG. 1 is a diagram illustrating a structure of an electronic device and a camera module according to an embodiment.
  • FIG. 2 illustrates a hardware configuration of an electronic device according to an embodiment.
  • FIG. 3 illustrates a process for correcting face distortion in an electronic device according to an exemplary embodiment.
  • FIG. 4 illustrates division of an image into a plurality of grid regions according to an exemplary embodiment.
  • FIG. 5 is a flowchart illustrating a process of processing an image by dividing an image into a plurality of regions and then applying different parameter values related to distortion compensation to each region in an electronic device according to an exemplary embodiment.
  • FIG. 6 is a flowchart of performing distortion correction by dividing an image into a plurality of grid regions according to an exemplary embodiment.
  • FIG. 7 is a flowchart illustrating setting of a plurality of regions based on face detection in an image acquired through an image sensor according to an exemplary embodiment.
  • FIG. 8 is a flowchart for correcting distortion of an image when a partial region of an image is displayed on a display in an electronic device according to an exemplary embodiment.
  • FIG. 9 illustrates displaying a partial region of an image on a display in an electronic device according to an exemplary embodiment.
  • FIG. 10 is a flowchart illustrating image processing in a case in which a change range of a parameter value related to distortion compensation according to a frame is limited in an electronic device according to an exemplary embodiment.
  • FIG. 11 is a block diagram of an electronic device in a network environment according to various embodiments of the present disclosure.
  • FIG. 12 is a block diagram illustrating a camera module according to various embodiments.
  • FIG. 1 is a diagram schematically illustrating the exterior of an electronic device 100 on which a camera module 180 is mounted, and the camera module 180, according to an embodiment.
  • Although FIG. 1 is illustrated and described on the premise of a mobile device, in particular a smartphone, it will be clearly understood by those skilled in the art that the disclosure can be applied to various electronic devices, including mobile devices equipped with a camera.
  • the display 110 may be disposed on the front surface of the electronic device 100 according to an embodiment.
  • the display 110 may occupy most of the front surface of the electronic device 100 .
  • a display 110 and a bezel 190 region surrounding at least some edges of the display 110 may be disposed on the front surface of the electronic device 100 .
  • the display 110 may include a flat area and a curved area extending from the flat area toward the side of the electronic device 100 .
  • the electronic device 100 illustrated in FIG. 1 is an example, and various embodiments are possible.
  • the display 110 of the electronic device 100 may include only a flat area without a curved area, or may include a curved area only at one edge instead of both sides.
  • the curved area may extend toward the rear surface of the electronic device, so that the electronic device 100 may include an additional planar area.
  • the electronic device 100 may additionally include a speaker, a receiver, a front camera, a proximity sensor, a home key, and the like.
  • In an embodiment, the electronic device 100 may be provided with the rear cover 150 integrated with the main body of the device.
  • Alternatively, the rear cover 150 may be separable from the main body of the electronic device 100 so that the battery can be replaced.
  • the back cover 150 may be referred to as a battery cover or a back cover.
  • a fingerprint sensor 171 for recognizing a user's fingerprint may be included in the first area 170 of the display 110 . Since the fingerprint sensor 171 is disposed on a lower layer of the display 110 , the fingerprint sensor 171 may not be recognized by the user or may be difficult to recognize. Also, in addition to the fingerprint sensor 171 , a sensor for additional user/biometric authentication may be disposed in a portion of the display 110 . In another embodiment, a sensor for user/biometric authentication may be disposed on one area of the bezel 190 . For example, the IR sensor for iris authentication may be exposed through one area of the display 110 or may be exposed through one area of the bezel 190 .
  • the front camera 161 may be disposed in the second area 160 on the front side of the electronic device 100 .
  • the front camera 161 is shown to be exposed through one area of the display 110 , but in another embodiment, the front camera 161 may be exposed through the bezel 190 .
  • the electronic device 100 may include one or more front cameras 161 .
  • the electronic device 100 may include two front cameras, such as a first front camera and a second front camera.
  • The first front camera and the second front camera may be cameras of the same type with the same specifications (e.g., pixel count), or they may be implemented as cameras with different specifications.
  • the electronic device 100 may support a function (eg, 3D imaging, auto focus, etc.) related to a dual camera through two front cameras. The above-mentioned description of the front camera may be equally or similarly applied to the rear camera of the electronic device 100 .
  • various kinds of hardware or sensors 163 to assist photographing may be additionally disposed in the electronic device 100 .
  • For example, a distance sensor (e.g., a TOF sensor) for detecting the distance to the subject may be included.
  • The distance sensor may be applied to the front camera and/or the rear camera.
  • The distance sensor may be disposed separately from, or included in, the front camera and/or the rear camera.
  • At least one physical key may be disposed on a side portion of the electronic device 100 .
  • the first function key 151 for turning on/off the display 110 or turning on/off the power of the electronic device 100 may be disposed on the right edge with respect to the front surface of the electronic device 100 .
  • the second function key 152 for controlling the volume or screen brightness of the electronic device 100 may be disposed on the left edge with respect to the front surface of the electronic device 100 .
  • additional buttons or keys may be disposed on the front or rear of the electronic device 100 .
  • a physical button or a touch button mapped to a specific function may be disposed in a lower region of the bezel 190 of the front of the electronic device 100 .
  • the electronic device 100 illustrated in FIG. 1 corresponds to one example, and the shape of the device to which the technical idea disclosed in the present disclosure is applied is not limited.
  • For example, the technical idea of the present disclosure may be applied to a foldable electronic device that can be folded horizontally or vertically, a rollable electronic device that can be rolled, a tablet, or a notebook computer.
  • The present technical idea can also be applied when a first camera and a second camera facing the same direction can be made to face different directions through rotation, folding, or deformation of the device.
  • The camera module 180 may include a lens assembly 111 (e.g., the lens assembly 1210 of FIG. 12), a housing 113, an infrared cut filter 115, an image sensor 120 (e.g., the image sensor 1230 of FIG. 12), and an image signal processor 130 (e.g., the image signal processor 1260 of FIG. 12).
  • the lens assembly 111 may have a different number, arrangement, type, etc. of lenses depending on the front camera and the rear camera.
  • the front camera and the rear camera may have different characteristics (eg, focal length, maximum magnification, etc.).
  • the lens may be moved forward and backward along the optical axis, and may operate so that a target object, which is a subject, can be clearly captured by changing a focal length.
  • The camera module 180 may include a barrel for mounting at least one lens aligned on the optical axis, and a housing 113 for mounting at least one coil surrounding the periphery of the barrel around the optical axis.
  • the infrared cut filter 115 may be disposed on the upper surface of the image sensor 120 .
  • the image of the subject passing through the lens may be partially filtered by the infrared cut filter 115 and then detected by the image sensor 120 .
  • the image sensor 120 may be disposed on the upper surface of the printed circuit board.
  • the image sensor 120 may be electrically connected to the image signal processor 130 connected to the printed circuit board 140 by a connector.
  • a flexible printed circuit board (FPCB) or a cable may be used as the connector.
  • the image sensor 120 may be a complementary metal oxide semiconductor (CMOS) sensor or a charged coupled device (CCD) sensor.
  • a plurality of individual pixels are integrated in the image sensor 120 , and each individual pixel may include a micro lens, a color filter, and a photodiode.
  • Each individual pixel is a kind of photodetector that can convert incoming light into an electrical signal. Photodetectors generally cannot detect the wavelength of the captured light by themselves and cannot determine color information.
  • the photodetector may include a photodiode.
  • light information of a subject incident through the lens assembly 111 may be converted into an electrical signal by the image sensor 120 and input to the image signal processor 130 .
  • the camera module 180 may be disposed on the front side as well as the rear side of the electronic device 100 .
  • the electronic device 100 may include a plurality of camera modules 180 as well as one camera module 180 to improve camera performance.
  • the electronic device 100 may further include a front camera 161 for video call or self-camera photography.
  • the front camera 161 may support a relatively low number of pixels compared to the rear camera module.
  • the front camera may be relatively smaller than the rear camera module.
  • FIG. 2 illustrates a hardware configuration of an electronic device according to an embodiment.
  • A description of the configuration already illustrated in FIG. 1 may be brief or omitted.
  • The electronic device 100 may include a camera module 180, a processor 220 (e.g., the processor 1120 of FIG. 11), a display 110 (e.g., the display module 1160 of FIG. 11), and a memory 230 (e.g., the memory 1130 of FIG. 11).
  • the camera module 180 may include a lens assembly 111 , an image sensor 120 , a distance detection sensor 210 , and an image signal processor 130 .
  • the electronic device 100 may further include additional components.
  • the electronic device 100 may further include at least one microphone for recording audio data.
  • the electronic device 100 may further include at least one sensor for determining a direction in which the front or rear of the electronic device 100 faces and/or posture information of the electronic device 100 .
  • the at least one sensor may include an acceleration sensor, a gyro sensor, and the like. A detailed description of hardware included or may be included in the electronic device 100 of FIG. 2 is provided with reference to FIG. 11 .
  • the image sensor 120 may include a complementary metal oxide semiconductor (CMOS) sensor or a charged coupled device (CCD) sensor.
  • the light information of the subject incident through the lens assembly 111 may be converted into an electrical signal by the image sensor 120 and input to the image signal processor 130 .
  • An infrared cut filter (hereinafter, IR cut filter) may be disposed on the upper surface of the image sensor 120, and the image of the subject passing through the lens may be partially filtered by the IR cut filter and then detected by the image sensor 120.
  • a light emitter may generate output light and emit it to the outside.
  • the light emitter may be configured and operated separately from the distance detection sensor 210 , or may be operated while being included in the distance detection sensor 210 .
  • the distance sensor 210 may be a sensor capable of calculating a depth value of each pixel of an image.
  • the depth value of the pixel may be understood as a depth value of an area (eg, a subject and/or a background) corresponding to the pixel.
  • the distance sensor 210 may be implemented, for example, in at least a stereo method, a time of flight (TOF) method, or a structured pattern method.
  • the distance detection sensor 210 may receive an input light corresponding to the output light emitted from the light emitter.
  • the output light and the input light may be infrared or near infrared.
  • The distance sensor 210 may obtain the input light that returns after the output light irradiates the subject.
  • the distance sensor 210 may obtain distance information by analyzing the input light.
  • output light means light output from a light emitter and incident on an object
  • input light is light input to the distance sensor 210 after the output light reaches the object and is reflected from the object.
  • the output light may be referred to as an output signal
  • the input light may be referred to as an input signal.
  • The distance sensor 210 may acquire the input light reflected from the object during a predetermined exposure period.
  • the exposure period may mean one frame period.
  • the exposure cycle may be repeated. For example, when the camera module 180 captures an object at 20 FPS, the exposure period may be 1/20 [sec]. And when the camera module 180 generates 100 frames, the exposure cycle may be repeated 100 times.
  • In an embodiment, a sensor interface conforming to an appropriate standard may be disposed between the image sensor 120 and the image signal processor 130.
  • the image signal processor 130 may perform image processing on the electrically converted image data.
  • a process in the image signal processor 130 may be divided into a pre-ISP (hereinafter, pre-processing) and an ISP chain (hereinafter, post-processing).
  • Image processing before the demosaicing process may mean pre-processing, and image processing after the demosaicing process may mean post-processing.
  • the preprocessing process may include 3A processing, lens shading correction, edge enhancement, dead pixel correction, knee correction, and the like.
  • 3A may include at least one of auto white balance (AWB), auto exposure (AE), and auto focusing (AF).
  • the post-processing process may include at least one of changing a sensor index value, changing a tuning parameter, and adjusting an aspect ratio.
  • the post-processing process may include processing the image data output from the image sensor 120 or image data output from the scaler.
  • the image signal processor 130 may adjust contrast, sharpness, saturation, dithering, etc. of the image through a post-processing process.
  • the contrast, sharpness, and saturation adjustment procedures are performed in the YUV color space, and the dithering procedure may be performed in the RGB (Red Green Blue) color space.
  • a part of the pre-processing process may be performed in the post-processing process, or a part of the post-processing process may be performed in the pre-processing process.
  • a part of the pre-processing process may be overlapped with a part of the post-processing process.
  • The display 110 may display contents such as an execution screen of an application executed by the processor 220, or images and/or videos stored in the memory 230.
  • the processor 220 may display the image data acquired through the camera module 180 on the display 110 in real time.
  • the display 110 may be implemented integrally with the touch panel.
  • the display 110 may support a touch function, detect a user input such as a touch using a finger, and transmit it to the processor 220 .
  • the display 110 may be connected to a display driver integrated circuit (DDIC) for driving the display 110 , and the touch panel may be connected to a touch IC that detects touch coordinates and processes a touch-related algorithm.
  • DDIC display driver integrated circuit
  • the display driving circuit and the touch IC may be integrally formed, and in another embodiment, the display driving circuit and the touch IC may be formed separately.
  • the display driving circuit and/or the touch IC may be electrically connected to the processor 220 .
  • the processor 220 may execute/control various functions supported by the electronic device 100 .
  • the processor 220 may execute an application by executing a code written in a programming language stored in the memory 230 , and may control various hardware.
  • the processor 220 may execute an application supporting a photographing function stored in the memory 230 .
  • the processor 220 may execute the camera module 180 and set and support an appropriate shooting mode so that the camera module 180 may perform an operation intended by the user.
  • the memory 230 may store instructions executable by the processor 220 .
  • the memory 230 may be understood as a concept including a component in which data is temporarily stored, such as a random access memory (RAM), and/or a component in which data is permanently stored, such as a solid state drive (SSD).
  • the processor 220 may implement a software module in the RAM space by calling instructions stored in the SSD.
  • the memory 230 may include various types, and an appropriate type may be adopted according to the purpose of the device.
  • an application related to the camera module 180 may be stored in the memory 230 .
  • a camera application may be stored in the memory 230 .
  • the camera application may support various shooting functions, such as photo shooting, video shooting, panoramic shooting, and slow motion shooting.
  • an application associated with the camera module 180 may correspond to various types of applications.
  • a chatting application may also use the camera module 180 to support a video call, photo/video attachment, streaming service, product image, or product-related virtual reality (VR) shooting function.
  • VR virtual reality
  • FIG. 3 illustrates a process for correcting face distortion in an electronic device according to an exemplary embodiment.
  • the operating subject of the flowchart illustrated in FIG. 3 may be understood as a processor (eg, the processor 220 of FIG. 2 ) or an image signal processor (eg, the image signal processor 130 of FIG. 1 ).
  • the processor 220 may acquire an image.
  • the image may refer to raw image data acquired through the image sensor 120 .
  • the image may mean a preview image. That is, the processor 220 may perform distortion correction through raw image data or real-time distortion correction through a preview image.
  • the processor 220 may divide the obtained image into a plurality of grid regions.
  • A grid region may mean a region defined by four grid points.
  • The shape of the grid region may be at least a square, a rectangle, a trapezoid, and/or a rhombus.
  • the processor 220 may detect a face from the image acquired through the image sensor 120 .
  • the processor 220 may detect at least two or more faces from the image.
  • the processor 220 may detect at least a human face or an animal face from the image. It will be clearly understood by those skilled in the art that the face detection may be performed not only by the processor 220 but also by the image signal processor 130 and other hardware and software modules not shown.
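  • Since the detection method is left open here, one conventional possibility is OpenCV's bundled Haar-cascade detector, sketched below; the function name and the detector parameters are illustrative assumptions.

        import cv2

        def detect_faces(image_bgr):
            # Conventional Haar-cascade face detector shipped with OpenCV.
            cascade = cv2.CascadeClassifier(
                cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
            gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
            # Returns zero or more (x, y, w, h) face bounding boxes.
            return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)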
  • the processor 220 may set a value of a parameter related to distortion compensation.
  • the parameter related to the distortion compensation may be a field of view (FOV) value, which is a characteristic of a lens, or a value related to the field of view.
  • For example, as the value of the parameter related to distortion compensation increases, the degree of distortion compensation may increase.
  • However, depending on the type of the parameter, the degree of distortion compensation may increase as the value of the parameter decreases.
  • the processor 220 may adjust a parameter value related to the distortion compensation for each pixel. In another embodiment, the processor 220 may adjust a parameter value related to the distortion compensation for each grid region. In another embodiment, the processor 220 may adjust the parameter value related to the distortion compensation for each grid node constituting the grid region.
  • the processor 220 may reset a parameter related to distortion compensation.
  • the processor 220 may reset a parameter related to distortion compensation based on a grid node included in the grid region, face detection information, and an ROI.
  • the processor 220 may reset the value of the parameter based on the distance between the grid points.
  • the processor 220 may reset the value of the parameter based on the size of the face, the position of the face, and the like. For example, as the position of the face is closer to the outer region of the image, the distortion of the face becomes more severe, so the value of the parameter may be increased.
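  • One way to realize the position-dependent reset described above is sketched below; the linear ramp and the base/maximum values are purely illustrative assumptions.

        def parameter_for_face(face_center, image_shape, base=90.0, maximum=120.0):
            # Raise the compensation parameter as the face center moves from
            # the image center (r = 0) toward a corner (r = 1).
            h, w = image_shape[:2]
            cx, cy = face_center
            r = (((cx - w / 2) / (w / 2)) ** 2
                 + ((cy - h / 2) / (h / 2)) ** 2) ** 0.5
            r = min(r / (2 ** 0.5), 1.0)
            return base + (maximum - base) * r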
  • the processor 220 may set a region of interest based on detection of a face in an image acquired through the image sensor 120 .
  • the processor 220 may set the ROI to include a region corresponding to the face in the image.
  • the region of interest may include a region corresponding to the face and/or a region overlapping the region corresponding to the face and larger than the size of the face.
  • a region corresponding to the face may be referred to as a face region.
  • An area larger than the size of the face may be referred to as a margin area.
  • The processor 220 may perform stereographic projection.
  • The processor 220 may perform the stereographic projection based on a parameter value related to distortion compensation.
  • The processor 220 may perform a warping operation on the image on which the stereographic projection has been performed.
  • The processor 220 may perform the warping operation so that the image transformed by the stereographic projection can be displayed on the display.
  • the processor 220 may obtain a corrected image, and store the obtained corrected image in the memory 230 or display it on the display 110 .
  • FIG. 4 illustrates division of an image into a plurality of grid regions according to an exemplary embodiment.
  • the processor 220 may divide the image 450 acquired through the image sensor 120 into at least two grid regions.
  • The processor 220 may divide the image into at least two regions, each including one or more grid regions.
  • the processor 220 may divide the acquired image 450 into a region of interest 430 including at least one grid region and a region 440 excluding the region of interest.
  • The processor 220 may set a plurality of ROIs based on the detected faces. For example, when three faces are detected in an image acquired through the image sensor 120, the processor 220 may set three ROIs.
  • The processor 220 may set an ROI based on a face detected in a predetermined field. For example, when the processor 220 detects a face in the 0.6 to 1.0 field, it may set an ROI based on the detected face and perform distortion compensation processing based on the set ROI. As another example, when the processor 220 detects a face in the 0 to 0.6 field, an ROI may not be set and distortion compensation may not be performed.
  • The field may be understood as a kind of normalized distance from the center of the lens toward the edge of the lens.
  • a 0 field may mean the center of a lens or image
  • a 1.0 field may mean an outer part of the lens or image.
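  • A minimal sketch of this field test follows; treating the field as the normalized radial distance from the image center, and the exact normalization, are assumptions for illustration.

        def face_in_correction_field(face_center, image_shape, threshold=0.6):
            # Field: 0.0 at the image center, 1.0 at a corner (assumed mapping).
            h, w = image_shape[:2]
            cx, cy = face_center
            field = ((((cx - w / 2) / (w / 2)) ** 2
                      + ((cy - h / 2) / (h / 2)) ** 2) ** 0.5) / (2 ** 0.5)
            # Correct (set an ROI) only for faces in the 0.6-1.0 field.
            return min(field, 1.0) >= threshold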
  • the region of interest 430 may include a face region 410 and a margin region 420 .
  • FIG. 5 is a flowchart illustrating a process of processing an image by dividing an image into a plurality of regions and then applying different parameter values related to distortion compensation to each region in an electronic device according to an exemplary embodiment.
  • the operating subject of the flowchart illustrated in FIG. 5 may be understood as a processor (eg, the processor 220 of FIG. 2 ) or an image signal processor (eg, the image signal processor 130 of FIG. 1 ).
  • the processor 220 may detect a face from an image acquired through the image sensor 120 .
  • the processor 220 may detect a face in the entire area of the image acquired through the image sensor 120 .
  • The face detection is not limited to a specific method and may be performed by conventionally well-known methods.
  • the face may include a human face or an animal face.
  • the face detection may be performed not only by the processor 220 , but also by the image signal processor 130 , other hardware and software modules not shown.
  • the processor 220 may divide the obtained image into a plurality of grid regions.
  • the shape and size of the grid regions are not limited.
  • the processor 220 may divide the entire image area into a plurality of grid areas. For example, when an x by y image is acquired, the processor 220 may divide the x by y image into a plurality of grid regions.
  • the processor 220 may divide an area in which a face is detected and an area within a predetermined distance, not the entire image acquired through the image sensor 120 , into a plurality of grid areas. For example, in order to reduce the amount of calculation of distortion correction, the processor 220 may divide a region within a first pixel distance from a region where a face is detected in the image into a plurality of grid regions.
  • The processor 220 may set at least one grid region corresponding to the detected face among the plurality of grid regions as the ROI. As illustrated in FIG. 4, the processor 220 may set two or more ROIs based on the detected face information. The processor 220 may set the ROI based on the detected position of the face.
  • The processor 220 may apply a parameter related to distortion compensation to the at least one grid region set as the region of interest as a first parameter value, and may apply the parameter to the grid regions other than the at least one grid region among the plurality of grid regions as a second parameter value.
  • the processor 220 may perform distortion correction on the entire area of the image based on the first parameter value and the second parameter value.
  • the processor 220 may perform distortion correction on the ROI based on a first parameter value, and may perform distortion correction on a region excluding the ROI based on a second parameter value.
  • the processor 220 may display an image on which distortion correction has been performed on the display 110 .
  • the processor 220 may store or temporarily store the image on which the distortion correction has been performed in the memory 230 while displaying the image on the display 110 .
  • FIG. 6 is a flowchart of performing distortion correction by dividing an image into a plurality of grid regions according to an exemplary embodiment.
  • the at least one processor illustrated in FIG. 6 may be understood as a processor (eg, the processor 220 of FIG. 2 ) or an image signal processor (eg, the image signal processor 130 of FIG. 1 ).
  • the processor 220 may acquire an image through the image sensor 120 .
  • the processor 220 may acquire an image having at least R, G, and B pixel values through the image sensor 120 .
  • the processor 220 may divide the entire area of the image into a grid.
  • the processor 220 may divide the image acquired through the image sensor 120 into at least one grid area. At least one or more grid regions corresponding to the detected face may be referred to as a face region.
  • the processor 220 may perform stereographic projection with respect to the grid points using reset parameters.
  • The stereographic projection may refer to a method of projecting a spherical surface onto a plane.
  • the processor 220 may change a parameter value related to distortion compensation of the grid region itself or a grid point constituting the grid region. For example, the processor 220 may apply the parameter value reset to 60 to the first grid region to which the parameter value of 50 is applied. Alternatively, the processor 220 may apply the parameter value reset to 60 to the first grid point and apply the parameter value reset to 65 to the second grid point.
  • The processor 220 may change the position or coordinates of each grid point in the grid region by performing the stereographic projection based on the reset parameter value, as in the sketch below.
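  • A minimal numeric sketch, assuming a rectilinear source image whose FOV-related parameter determines the focal length used for the mapping; the inversion r = f·tan(θ) and the stereographic radius r′ = 2f·tan(θ/2) follow from the projection definitions.

        import numpy as np

        def stereographic_shift(points, center, fov_deg, width):
            # Focal length in pixels for a rectilinear image of this width/FOV.
            f = (width / 2) / np.tan(np.radians(fov_deg) / 2)
            vec = points - center
            r = np.linalg.norm(vec, axis=-1, keepdims=True)
            theta = np.arctan(r / f)           # incidence angle from r = f*tan(theta)
            r_new = 2 * f * np.tan(theta / 2)  # stereographic radius
            safe_r = np.where(r > 0, r, 1.0)
            return center + vec * np.where(r > 0, r_new / safe_r, 1.0)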
  • the processor 220 may perform image warping based on the changed lattice points.
  • the processor 220 may perform image warping by comparing the projected image with the raw image data obtained through the image sensor 120 .
  • The processor 220 may perform the image warping so that the image on which the stereographic projection has been performed corresponds to the position or coordinate information of the original grid points included in the raw image data, as sketched below.
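  • A sketch of the warping step using OpenCV's remapping, assuming dense per-pixel sampling maps have already been interpolated from the shifted grid points (e.g., by bilinear upsampling); the helper name is an assumption.

        import cv2
        import numpy as np

        def warp_with_maps(raw_image, map_x, map_y):
            # map_x/map_y hold, for every output pixel, the raw-image
            # coordinates to sample, interpolated from the grid displacements.
            return cv2.remap(raw_image,
                             map_x.astype(np.float32), map_y.astype(np.float32),
                             interpolation=cv2.INTER_LINEAR)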
  • FIG. 7 is a flowchart illustrating setting of a plurality of regions based on face detection in an image acquired through an image sensor according to an exemplary embodiment.
  • the operating subject of the flowchart illustrated in FIG. 7 may be understood as a processor (eg, the processor 220 of FIG. 2 ) or an image signal processor (eg, the image signal processor 130 of FIG. 1 ).
  • the processor 220 may detect a face from an image acquired through the image sensor 120 .
  • the processor 220 may determine at least one object related to the face and detect the face based thereon.
  • the processor 220 may determine an object such as a neck and hair, and detect a face based thereon. It will be clearly understood by those skilled in the art that the face detection may be performed not only by the processor 220 but also by the image signal processor 130 and other hardware and software modules not shown.
  • the processor 220 may determine whether a face is continuously detected. When the face is not detected, the processor 220 may not perform distortion correction. When the situation in which the face is detected and not detected is repeated, the processor 220 may limit a change range of a parameter value related to distortion compensation to prevent the output screen from shaking. As for limiting the change range of the parameter value, the contents mentioned in FIG. 10 below may be applied.
  • the processor 220 may determine a center point of the face through face detection and set a face region.
  • the processor 220 may determine the center point of the face by using the detected face feature.
  • the processor 220 may extract feature information indicating the feature points of the face and determine the center point of the face based on the extracted feature information.
  • the feature points of the face may include eyes, nose, mouth, and ears, and the feature information may include location and/or size information of the eyes, nose, mouth, and ears, respectively.
  • the processor 220 may set a margin area in addition to the set face area.
  • the processor 220 may set an area as much as a predetermined grid distance from the set face area as the margin area.
  • the margin area may be an area for reducing a difference in distortion correction degree between the face area and other areas.
  • the processor 220 may set a margin region to reduce a sharp correction difference occurring between the face region on which distortion correction is intensively performed and other regions (eg, an external region of the ROI).
  • The processor 220 may apply, to the grid region corresponding to the margin region, a third parameter value between the first parameter value and the second parameter value. For example, when the processor 220 applies a distortion-compensation parameter value of 120 to a grid region or grid point corresponding to the face region and a value of 90 to the grid regions outside the face region, a value of 105 may be applied to the margin region located between them, as in the sketch below.
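  • A one-line blend reproduces this example; the midpoint weight is just one illustrative choice.

        def margin_parameter(face_value=120.0, outside_value=90.0, blend=0.5):
            # Midpoint blend: 90 + (120 - 90) * 0.5 = 105, matching the example.
            return outside_value + (face_value - outside_value) * blend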
  • the processor 220 may set the margin area to further enlarge the distortion correction area.
  • the processor 220 may apply a parameter related to distortion compensation including not only the face region but also the margin region as the first parameter value. For example, when the face region is small because the size of the detected face is small, the processor 220 may apply the first parameter value to a region including at least one of the face region and the margin region.
  • The region including at least one of the face region and the margin region may be the face region, the margin region, or a region including both the face region and the margin region.
  • the processor 220 may set a region including both the face region and the margin region as the region of interest.
  • the processor 220 may set the ROI as a distortion correction region.
  • FIG. 8 is a flowchart illustrating image distortion correction when a partial region of an image is displayed on a display in an electronic device according to an exemplary embodiment.
  • the operating subject of the flowchart illustrated in FIG. 8 may be understood as a processor (eg, the processor 220 of FIG. 2 ) or an image signal processor (eg, the image signal processor 130 of FIG. 1 ).
  • the flowchart of FIG. 8 may be described in connection with FIG. 9 .
  • the processor 220 may set a partial area of the entire image area as an area to be output as a preview image.
  • the processor 220 may crop a partial region of the entire image region and output it as a preview image.
  • the processor 220 may output a partial region of the image acquired through the image sensor 120 as a preview image through a separate application.
  • the processor 220 may set the partial region to be output as a preview image as a region of interest. In addition to setting the partial region to be output as a preview image as the ROI, the processor 220 may set a wider range than the partial region as the margin region.
  • the processor 220 may perform correction by setting the ROI as a distortion correction region.
  • the processor 220 may perform correction by setting the region of interest and the margin region as a distortion correction region.
  • Since the processor 220 performs the correction by setting, as the distortion correction region, a region with a margin that is wider than the region displayed through the display 110, effective distortion correction can be performed.
  • the processor 220 may output the region in which the distortion is corrected as a preview image.
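  • The crop-with-margin flow described above could look like the following sketch; the margin size and the corrector callable are illustrative assumptions.

        def preview_with_margin(image, crop, correct_distortion, margin=32):
            # crop = (x, y, w, h) of the preview region; correction runs on the
            # crop expanded by the margin, and only the crop is displayed.
            x, y, w, h = crop
            H, W = image.shape[:2]
            mx0, my0 = max(x - margin, 0), max(y - margin, 0)
            mx1, my1 = min(x + w + margin, W), min(y + h + margin, H)
            corrected = correct_distortion(image[my0:my1, mx0:mx1])
            return corrected[y - my0:y - my0 + h, x - mx0:x - mx0 + w]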
  • FIG. 9 illustrates displaying a partial region of an image on a display in an electronic device according to an exemplary embodiment.
  • FIG. 9 shows an image of an embodiment corresponding to the flowchart of FIG. 8.
  • the first image 910 may be an image acquired through the image sensor 120 .
  • the second image 920 may be an image obtained by cropping a partial region from the first image 910 .
  • the processor 220 may set the periphery of the second image 920 as the margin area 930 for distortion compensation.
  • the processor 220 may display the second image 920 as a preview image on the display 110 .
  • each operation may be performed sequentially, but is not necessarily performed sequentially.
  • the order of each operation may be changed, and at least two operations may be performed in parallel.
  • the processor 220 may limit the maximum change range of the parameter value according to the frame to the first change range.
  • the parameter value may be a parameter value related to distortion compensation. For example, as the parameter value related to the distortion compensation increases, the degree of distortion compensation may increase. However, depending on the type of the parameter, the degree of distortion compensation may increase as the value of the parameter decreases.
  • the processor 220 may limit the maximum change range of the parameter value related to the distortion compensation applied to the ROI to the first change range.
  • the maximum change range may be a value set to be applied to all frames.
  • the processor 220 may naturally perform image correction by gradually changing parameter values according to frames by limiting a change range of the parameter values related to the distortion compensation.
  • the processor 220 may assign a parameter value of 115 to the first grid point instead of 100.
  • the processor 220 may determine a parameter value related to distortion compensation to be applied to the M-th frame.
  • the processor 220 may determine a parameter value related to distortion compensation to be applied to at least the face region of the M-th frame as the first value.
  • The processor 220 may determine whether the difference between the parameter value applied to the N-th frame and the parameter value applied to the M-th frame, which is a frame preceding the N-th frame, is within the first change range.
  • the processor 220 may determine a parameter value to be applied to the N-th frame as the second value.
  • the processor 220 may calculate a difference between a first value that is a parameter value applied to the M-th frame and a second value that is a parameter value applied to the N-th frame.
  • the N-th frame may be a frame after the M-th frame.
  • the N-th frame may be an M+1-th frame.
  • The processor 220 may perform operation 1040 when the calculated difference between the parameter values related to the distortion compensation is within the first change range. For example, when the parameter value related to distortion compensation applied to the M-th frame is 120, the parameter value applied to the N-th (e.g., M+1-th) frame is 118, and the first change range is set to 5, the difference between the parameter values of the N-th and M-th frames is 2, so the processor 220 may perform distortion correction by applying that parameter value to the N-th frame.
  • When the calculated difference between the parameter values related to the distortion compensation is outside the first change range, the processor 220 may return to operation 1020 and re-determine the parameter value within the first change range.
  • For example, when the parameter value applied to the M-th frame is 120, the parameter value applied to the N-th (e.g., M+1-th) frame is 100, and the first change range is set to 5, the difference between the parameter values applied to the N-th and M-th frames is 20, so the processor 220 returns to operation 1020, and the parameter value applied to the N-th frame may be re-determined as 115.
  • The processor 220 may perform distortion correction based on the parameter value related to the distortion compensation. For example, when the parameter value related to distortion compensation applied to the M-th frame is 120, the parameter value applied to the N-th (e.g., M+1-th) frame is 118, and the first change range is set to 5, distortion correction may be performed by applying the parameter value 118 to the N-th frame. As another example, when the parameter value applied to the first frame in which distortion correction is started is 4 and the parameter value in the frame before distortion correction starts is 0, then since the first change range is 5, distortion correction may be performed by applying the parameter value 4 to that first frame. A sketch of this per-frame clamping follows.
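  • A minimal sketch of limiting the per-frame change to the first change range, matching the numeric examples above; the function name is an assumption.

        def clamp_frame_parameter(previous, target, change_range=5.0):
            # Apply the target value if the jump fits within the change range;
            # otherwise step by at most change_range toward it.
            delta = target - previous
            if abs(delta) <= change_range:
                return target
            return previous + change_range * (1.0 if delta > 0 else -1.0)

        assert clamp_frame_parameter(120, 100) == 115  # 120 -> 100 re-determined as 115
        assert clamp_frame_parameter(0, 4) == 4        # 0 -> 4 is within the range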
  • the processor 220 may determine whether distortion correction is completed.
  • the processor 220 may determine whether distortion correction is performed to a desired degree over the first number of frames.
  • the processor 220 may repeatedly perform operations 1020 to 1040 .
  • the processor 220 may perform distortion correction based on a parameter value related to distortion compensation determined as a first value in a first frame in which distortion correction is started.
  • the processor 220 may perform distortion correction based on the parameter value related to the distortion compensation determined as the second value by satisfying the operation 1030 in the second frame that is the next frame of the first frame.
  • The processor 220 may perform operation 1020 for the third frame to determine the parameter value related to the distortion compensation as a third value, and may determine through operation 1030 whether the difference between the second value applied to the second frame and the third value is within the first change range. This process may be repeated until the image distortion correction is completed.
  • FIG. 11 is a block diagram of an electronic device 1101 in a network environment 1100 according to various embodiments of the present disclosure.
  • In the network environment 1100, the electronic device 1101 may communicate with the electronic device 1102 through a first network 1198 (e.g., a short-range wireless communication network), or may communicate with the electronic device 1104 or the server 1108 through a second network 1199 (e.g., a long-distance wireless communication network).
  • According to an embodiment, the electronic device 1101 may communicate with the electronic device 1104 through the server 1108.
  • The electronic device 1101 may include a processor 1120, a memory 1130, an input module 1150, a sound output module 1155, a display module 1160, an audio module 1170, a sensor module 1176, an interface 1177, a connection terminal 1178, a haptic module 1179, a camera module 1180, a power management module 1188, a battery 1189, a communication module 1190, a subscriber identification module 1196, or an antenna module 1197.
  • In some embodiments, at least one of these components (e.g., the connection terminal 1178) may be omitted from the electronic device 1101, or one or more other components may be added. In some embodiments, some of these components may be integrated into one component (e.g., the display module 1160).
  • The processor 1120 may, for example, execute software (e.g., a program 1140) to control at least one other component (e.g., a hardware or software component) of the electronic device 1101 connected to the processor 1120, and may perform various data processing or operations. According to an embodiment, as at least part of the data processing or operations, the processor 1120 may store a command or data received from another component (e.g., the sensor module 1176 or the communication module 1190) in the volatile memory 1132, process the command or data stored in the volatile memory 1132, and store the result data in the non-volatile memory 1134.
  • According to an embodiment, the processor 1120 may include a main processor 1121 (e.g., a central processing unit or an application processor) or an auxiliary processor 1123 (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor) that can be operated independently of, or together with, the main processor.
  • For example, when the electronic device 1101 includes the main processor 1121 and the auxiliary processor 1123, the auxiliary processor 1123 may be set to use less power than the main processor 1121 or to be specialized for a specified function. The auxiliary processor 1123 may be implemented separately from, or as a part of, the main processor 1121.
  • The auxiliary processor 1123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 1101 (e.g., the display module 1160, the sensor module 1176, or the communication module 1190) on behalf of the main processor 1121 while the main processor 1121 is in an inactive (e.g., sleep) state, or together with the main processor 1121 while the main processor 1121 is in an active (e.g., application-executing) state.
  • the auxiliary processor 1123 may include a hardware structure specialized in processing an artificial intelligence model.
  • Artificial intelligence models can be created through machine learning. Such learning may be performed, for example, in the electronic device 1101 itself on which artificial intelligence is performed, or may be performed through a separate server (eg, the server 1108).
  • The learning algorithm may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but is not limited to the above examples.
  • the artificial intelligence model may include a plurality of artificial neural network layers.
  • The artificial neural network may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), or a deep Q-network, or a combination of two or more of these, but is not limited to the above examples.
  • The artificial intelligence model may, additionally or alternatively, include a software structure in addition to the hardware structure.
  • the memory 1130 may store various data used by at least one component (eg, the processor 1120 or the sensor module 1176 ) of the electronic device 1101 .
  • the data may include, for example, input data or output data for software (eg, the program 1140 ) and instructions related thereto.
  • the memory 1130 may include a volatile memory 1132 or a non-volatile memory 1134 .
  • the program 1140 may be stored as software in the memory 1130 , and may include, for example, an operating system 1142 , middleware 1144 , or an application 1146 .
  • the input module 1150 may receive a command or data to be used in a component (eg, the processor 1120 ) of the electronic device 1101 from the outside (eg, a user) of the electronic device 1101 .
  • the input module 1150 may include, for example, a microphone, a mouse, a keyboard, a key (eg, a button), or a digital pen (eg, a stylus pen).
  • the sound output module 1155 may output a sound signal to the outside of the electronic device 1101 .
  • the sound output module 1155 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback.
  • the receiver may be used to receive an incoming call. According to an embodiment, the receiver may be implemented separately from or as a part of the speaker.
  • the display module 1160 may visually provide information to the outside (eg, a user) of the electronic device 1101 .
  • the display module 1160 may include, for example, a display, a hologram device, or a projector and a control circuit for controlling the corresponding device.
  • the display module 1160 may include a touch sensor configured to sense a touch or a pressure sensor configured to measure the intensity of a force generated by the touch.
  • The audio module 1170 may convert a sound into an electric signal or, conversely, convert an electric signal into a sound. According to an embodiment, the audio module 1170 may acquire a sound through the input module 1150, or output a sound through the sound output module 1155 or an external electronic device (e.g., the electronic device 1102, such as a speaker or headphones) directly or wirelessly connected to the electronic device 1101.
  • the sensor module 1176 detects an operating state (eg, power or temperature) of the electronic device 1101 or an external environmental state (eg, a user state), and generates an electrical signal or data value corresponding to the sensed state. can do.
  • the sensor module 1176 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an IR (infrared) sensor, a biometric sensor, It may include a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 1177 may support one or more specified protocols that may be used for the electronic device 1101 to directly or wirelessly connect with an external electronic device (eg, the electronic device 1102).
  • the interface 1177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • connection terminal 1178 may include a connector through which the electronic device 1101 can be physically connected to an external electronic device (eg, the electronic device 1102 ).
  • the connection terminal 1178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 1179 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that the user can perceive through tactile or kinesthetic sense.
  • the haptic module 1179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 1180 may capture still images and moving images. According to an embodiment, the camera module 1180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 1188 may manage power supplied to the electronic device 1101 .
  • the power management module 1188 may be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • the battery 1189 may supply power to at least one component of the electronic device 1101 .
  • the battery 1189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • the communication module 1190 is a direct (eg, wired) communication channel or a wireless communication channel between the electronic device 1101 and an external electronic device (eg, the electronic device 1102, the electronic device 1104, or the server 1108). It can support establishment and communication performance through the established communication channel.
  • the communication module 1190 operates independently of the processor 1120 (eg, an application processor) and may include one or more communication processors supporting direct (eg, wired) communication or wireless communication.
  • the communication module 1190 may include a wireless communication module 1192 (eg, a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 1194 (eg, a local area network (LAN) communication module or a power line communication module).
  • a corresponding communication module among these communication modules may communicate with the external electronic device 1104 through a first network 1198 (eg, a short-range communication network such as Bluetooth, wireless fidelity (WiFi) direct, or infrared data association (IrDA)) or a second network 1199 (eg, a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (eg, a LAN or WAN)).
  • the wireless communication module 1192 may identify or authenticate the electronic device 1101 within a communication network, such as the first network 1198 or the second network 1199, using subscriber information (eg, an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 1196.
  • the wireless communication module 1192 may support a 5G network after a 4G network, and a next-generation communication technology, for example, new radio (NR) access technology.
  • NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband (eMBB)), minimization of terminal power and access by multiple terminals (massive machine type communications (mMTC)), or high reliability and low latency (ultra-reliable and low-latency communications (URLLC)).
  • the wireless communication module 1192 may support a high frequency band (eg, the mmWave band), for example, to achieve a high data rate.
  • the wireless communication module 1192 may support various technologies for securing performance in a high-frequency band, for example, beamforming, massive multiple-input and multiple-output (MIMO), full dimensional MIMO (FD-MIMO), an array antenna, analog beam-forming, or a large scale antenna.
  • the wireless communication module 1192 may support various requirements specified in the electronic device 1101 , an external electronic device (eg, the electronic device 1104 ), or a network system (eg, the second network 1199 ).
  • the wireless communication module 1192 may support a peak data rate for eMBB realization (eg, 20 Gbps or more), loss coverage for mMTC realization (eg, 164 dB or less), or U-plane latency for URLLC realization (eg, 0.5 ms or less for each of downlink (DL) and uplink (UL), or a round trip of 1 ms or less).
  • the antenna module 1197 may transmit or receive a signal or power to the outside (eg, an external electronic device).
  • the antenna module 1197 may include an antenna including a conductor formed on a substrate (eg, a PCB) or a radiator formed of a conductive pattern.
  • the antenna module 1197 may include a plurality of antennas (eg, an array antenna). In this case, at least one antenna suitable for a communication method used in a communication network, such as the first network 1198 or the second network 1199, may be selected from the plurality of antennas by, for example, the communication module 1190. A signal or power may be transmitted or received between the communication module 1190 and an external electronic device through the selected at least one antenna.
  • according to an embodiment, a component other than the radiator (eg, a radio frequency integrated circuit (RFIC)) may be additionally formed as a part of the antenna module 1197.
  • the antenna module 1197 may form a mmWave antenna module.
  • the mmWave antenna module may include a printed circuit board, an RFIC disposed on or adjacent to a first side (eg, the bottom side) of the printed circuit board and capable of supporting a designated high frequency band (eg, the mmWave band), and a plurality of antennas (eg, an array antenna) disposed on or adjacent to a second side (eg, the top or a side) of the printed circuit board and capable of transmitting or receiving signals of the designated high frequency band.
  • at least some of the above-described components may be connected to each other through a communication method between peripheral devices (eg, a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (eg, commands or data) with each other.
  • a command or data may be transmitted or received between the electronic device 1101 and the external electronic device 1104 through the server 1108 connected to the second network 1199 .
  • each of the external electronic devices 1102 and 1104 may be a device of the same type as or a different type from the electronic device 1101.
  • all or a part of operations executed by the electronic device 1101 may be executed by one or more external electronic devices 1102 , 1104 , or 1108 .
  • for example, when the electronic device 1101 needs to perform a function or a service, the electronic device 1101 may, instead of executing the function or service itself, request one or more external electronic devices to perform at least a part of the function or the service.
  • One or more external electronic devices that have received the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit a result of the execution to the electronic device 1101 .
  • the electronic device 1101 may process the result as it is or additionally process it, and provide the processed result as at least a part of a response to the request.
  • to this end, for example, cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used.
  • the electronic device 1101 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 1104 may include an Internet of things (IoT) device.
  • Server 1108 may be an intelligent server using machine learning and/or neural networks.
  • the external electronic device 1104 or the server 1108 may be included in the second network 1199 .
  • the electronic device 1101 may be applied to an intelligent service (eg, smart home, smart city, smart car, or health care) based on 5G communication technology and IoT-related technology.
  • the electronic device may be a device of various types.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance device.
  • terms such as “first” and “second” may simply be used to distinguish a component from other components in question, and do not limit the components in other aspects (eg, importance or order). When it is said that one (eg, first) component is “coupled” or “connected” to another (eg, second) component, with or without the terms “functionally” or “communicatively”, it means that the one component can be connected to the other component directly (eg, by wire), wirelessly, or through a third component.
  • the term “module” used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
  • a module may be an integrally formed part or a minimum unit or a part of the part that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • various embodiments of this document may be implemented as software (eg, the program 1140) including one or more instructions stored in a storage medium readable by a machine (eg, the electronic device 1101).
  • a processor (eg, the processor 1120) of a device (eg, the electronic device 1101) may call at least one instruction among the one or more instructions stored in the storage medium and execute it. This makes it possible for the device to be operated to perform at least one function according to the called at least one instruction.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
  • 'non-transitory' only means that the storage medium is a tangible device and does not include a signal (eg, an electromagnetic wave); this term does not distinguish between a case where data is semi-permanently stored in the storage medium and a case where data is temporarily stored.
  • the methods according to various embodiments disclosed in this document may be provided by being included in a computer program product.
  • Computer program products may be traded between sellers and buyers as commodities.
  • the computer program product may be distributed in the form of a machine-readable storage medium (eg, compact disc read only memory (CD-ROM)), or distributed online (eg, downloaded or uploaded) via an application store (eg, Play Store™) or directly between two user devices (eg, smartphones).
  • in the case of online distribution, at least a part of the computer program product may be temporarily stored or temporarily generated in a machine-readable storage medium, such as a memory of a manufacturer's server, a server of an application store, or a relay server.
  • each component (eg, a module or a program) of the above-described components may include a singular entity or a plurality of entities, and some of the plurality of entities may be separately disposed in another component.
  • one or more components or operations among the above-described corresponding components may be omitted, or one or more other components or operations may be added.
  • according to various embodiments, a plurality of components (eg, modules or programs) may be integrated into one component. In this case, the integrated component may perform one or more functions of each of the plurality of components identically or similarly to the way the corresponding component among the plurality of components performed them prior to the integration.
  • operations performed by a module, a program, or another component may be executed sequentially, in parallel, repetitively, or heuristically, or one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.
  • FIG. 12 is a block diagram 1200 illustrating a camera module 1180, according to various embodiments.
  • the camera module 1180 may include a lens assembly 1210, a flash 1220, an image sensor 1230, an image stabilizer 1240, a memory 1250 (eg, a buffer memory), or an image signal processor 1260.
  • the lens assembly 1210 may collect light emitted from a subject, which is an image capturing object.
  • Lens assembly 1210 may include one or more lenses.
  • the camera module 1180 may include a plurality of lens assemblies 1210 . In this case, the camera module 1180 may form, for example, a dual camera, a 360 degree camera, or a spherical camera.
  • some of the plurality of lens assemblies 1210 may have the same lens properties (eg, angle of view, focal length, auto focus, f number, or optical zoom), or at least one lens assembly may have one or more lens properties different from those of the other lens assemblies.
  • the lens assembly 1210 may include, for example, a wide-angle lens or a telephoto lens.
  • the flash 1220 may emit light used to enhance light emitted or reflected from the subject.
  • the flash 1220 may include one or more light emitting diodes (eg, a red-green-blue (RGB) LED, a white LED, an infrared LED, or an ultraviolet LED), or a xenon lamp.
  • the image sensor 1230 may acquire an image corresponding to the subject by converting light emitted or reflected from the subject and transmitted through the lens assembly 1210 into an electrical signal.
  • the image sensor 1230 may include, for example, one image sensor selected from among image sensors having different properties, such as an RGB sensor, a black and white (BW) sensor, an IR sensor, or a UV sensor, a plurality of image sensors having the same properties, or a plurality of image sensors having different properties.
  • Each image sensor included in the image sensor 1230 may be implemented using, for example, a charged coupled device (CCD) sensor or a complementary metal oxide semiconductor (CMOS) sensor.
  • in response to a movement of the camera module 1180 or the electronic device 1101, the image stabilizer 1240 may move at least one lens included in the lens assembly 1210 or the image sensor 1230 in a specific direction, or control operation characteristics of the image sensor 1230 (eg, adjust the read-out timing). This makes it possible to compensate for at least a part of the negative effects of the movement on the image being captured.
  • according to an embodiment, the image stabilizer 1240 may detect such a movement of the camera module 1180 or the electronic device 1101 using a gyro sensor (not shown) or an acceleration sensor (not shown) disposed inside or outside the camera module 1180.
  • the image stabilizer 1240 may be implemented as, for example, an optical image stabilizer.
  • the memory 1250 may temporarily store at least a portion of an image acquired through the image sensor 1230 for a subsequent image processing operation. For example, when image acquisition is delayed by the shutter, or when a plurality of images are acquired at high speed, the acquired original image (eg, a Bayer-patterned image or a high-resolution image) may be stored in the memory 1250, and a corresponding copy image (eg, a low-resolution image) may be previewed through the display module 1160.
  • the memory 1250 may be configured as at least a part of the memory 1130 or as a separate memory operated independently of the memory 1130 .
  • the image signal processor 1260 may perform one or more image processing on an image acquired through the image sensor 1230 or an image stored in the memory 1250 .
  • the one or more image processes may include, for example, depth map generation, 3D modeling, panorama generation, feature point extraction, image synthesis, or image compensation (eg, noise reduction, resolution adjustment, brightness adjustment, blurring, sharpening, or softening).
  • additionally or alternatively, the image signal processor 1260 may perform control (eg, exposure time control or read-out timing control) on at least one of the components included in the camera module 1180 (eg, the image sensor 1230).
  • the image processed by the image signal processor 1260 may be stored back in the memory 1250 for further processing.
  • the image signal processor 1260 may be configured as at least a part of the processor 1120 or as a separate processor operated independently of the processor 1120.
  • when the image signal processor 1260 is configured as a separate processor from the processor 1120, at least one image processed by the image signal processor 1260 may be displayed through the display module 1160 by the processor 1120 as it is or after additional image processing.
  • the electronic device 1101 may include a plurality of camera modules 1180 each having different properties or functions.
  • at least one of the plurality of camera modules 1180 may be a wide-angle camera, and at least the other may be a telephoto camera.
  • at least one of the plurality of camera modules 1180 may be a front camera, and at least the other may be a rear camera.
  • according to an embodiment, the electronic device 100 may include the image sensor 120, the display 110, and at least one processor electrically connected to the image sensor 120 and the display 110 (eg, the image signal processor 130 of FIG. 1 and/or the processor 220 of FIG. 2).
  • the at least one processor may detect a face from an image acquired through the image sensor 120 .
  • the at least one processor may divide the obtained image into a plurality of grid regions.
  • the at least one processor may set at least one grid area corresponding to the detected face among the plurality of grid areas as the ROI.
  • the at least one processor may apply a parameter related to distortion compensation as a first parameter value to the at least one grid region set as the region of interest, and apply, for the parameter, a second parameter value lower than the first parameter value to the grid regions other than the at least one grid region among the plurality of grid regions.
  • the at least one processor may perform distortion correction on the image based on the first parameter value and the second parameter value.
  • the at least one processor may display the image on which the distortion correction has been performed on the display 110 .
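  • As an illustration of this pipeline, the following minimal Python sketch (not taken from the disclosure; the grid size, the strength values, the input file name, and the use of OpenCV's Haar-cascade face detector are all assumptions) marks the grid cells overlapping a detected face as the region of interest and gives them a higher compensation strength than the remaining cells. The later sketches below reuse the names defined here (image, strength_map, ROI_STRENGTH, BACKGROUND_STRENGTH).

    import cv2
    import numpy as np

    GRID_ROWS, GRID_COLS = 12, 16     # assumed grid resolution
    ROI_STRENGTH = 1.0                # "first parameter value" (full compensation)
    BACKGROUND_STRENGTH = 0.2         # "second parameter value" (lower)

    def build_strength_map(image, faces):
        """Assign a per-grid-cell compensation strength: ROI cells get the
        first (higher) value, all other cells the second (lower) value."""
        h, w = image.shape[:2]
        cell_h, cell_w = h / GRID_ROWS, w / GRID_COLS
        strength = np.full((GRID_ROWS, GRID_COLS), BACKGROUND_STRENGTH, np.float32)
        for (x, y, fw, fh) in faces:
            r0, r1 = int(y // cell_h), int((y + fh) // cell_h)
            c0, c1 = int(x // cell_w), int((x + fw) // cell_w)
            strength[r0:r1 + 1, c0:c1 + 1] = ROI_STRENGTH   # region of interest
        return strength

    image = cv2.imread("wide_angle_shot.jpg")   # hypothetical input file
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = detector.detectMultiScale(cv2.cvtColor(image, cv2.COLOR_BGR2GRAY))
    strength_map = build_strength_map(image, faces)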
  • the region of interest may include a face region and a margin region located outside the face region.
  • the at least one processor may apply a third parameter value that is between the first parameter value and the second parameter value to the grid region corresponding to the margin region.
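  • Continuing the sketch above, such a margin could, for example, be derived by dilating the ROI cells by one cell and giving that ring an intermediate ("third") value; the 0.6 figure and the one-cell dilation are assumptions:

    import cv2
    import numpy as np

    MARGIN_STRENGTH = 0.6   # assumed "third parameter value", between 1.0 and 0.2

    def add_margin_ring(strength):
        """Give the cells immediately surrounding the ROI an intermediate value."""
        roi = strength >= ROI_STRENGTH
        ring = cv2.dilate(roi.astype(np.uint8), np.ones((3, 3), np.uint8)) > 0
        ring &= ~roi                  # keep only the one-cell ring around the ROI
        out = strength.copy()
        out[ring] = MARGIN_STRENGTH
        return out

    strength_map = add_margin_ring(strength_map)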
  • the at least one processor may reset the parameter value related to the distortion compensation in response to a change in at least one of a size, a shape, and a position of the detected face.
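  • One possible trigger for such a reset, continuing the sketch (the pixel tolerance is an assumption, and face boxes are the (x, y, w, h) tuples from the detector above):

    import numpy as np

    def face_changed(prev_box, cur_box, tol=8):
        """Report a change in the position or size of the detected face
        larger than the tolerance, in pixels."""
        if prev_box is None or cur_box is None:
            return prev_box is not cur_box
        return bool(np.any(np.abs(np.asarray(prev_box) - np.asarray(cur_box)) > tol))

    # if face_changed(last_box, box): strength_map = build_strength_map(image, [box])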
  • the at least one processor may perform stereographic projection based on the first parameter value and the second parameter value, and perform image warping based on the grid nodes whose positions are changed by the stereographic projection.
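  • One way this could look in code, continuing the sketch above: each grid node is moved toward its stereographic re-projection in proportion to its cell's strength, the sparse node map is upsampled to a dense sampling map, and the image is warped with it. The focal length and the bilinear upsampling are assumptions, not taken from the disclosure.

    import cv2
    import numpy as np

    FOCAL_PX = 1200.0   # assumed focal length of the wide-angle lens, in pixels

    def warp_with_stereographic(image, strength):
        """Blend each grid node between the perspective image (strength 0) and
        its stereographic re-projection (strength 1), then warp via remap."""
        h, w = image.shape[:2]
        cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
        ys = np.linspace(0, h - 1, strength.shape[0]).astype(np.float32)
        xs = np.linspace(0, w - 1, strength.shape[1]).astype(np.float32)
        gx, gy = np.meshgrid(xs, ys)
        r = np.hypot(gx - cx, gy - cy)                 # node radius from center
        theta = 2.0 * np.arctan(r / (2.0 * FOCAL_PX))  # stereographic: r = 2f*tan(theta/2)
        r_src = FOCAL_PX * np.tan(theta)               # perspective: r = f*tan(theta)
        scale = np.where(r > 1e-6, r_src / np.maximum(r, 1e-6), 1.0)
        blend = 1.0 + strength * (scale - 1.0)         # per-node displacement factor
        map_x = cv2.resize((cx + (gx - cx) * blend).astype(np.float32), (w, h),
                           interpolation=cv2.INTER_LINEAR)
        map_y = cv2.resize((cy + (gy - cy) * blend).astype(np.float32), (w, h),
                           interpolation=cv2.INTER_LINEAR)
        return cv2.remap(image, map_x, map_y, cv2.INTER_LINEAR)

    corrected = warp_with_stereographic(image, strength_map)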
  • the at least one processor may limit, to a first change range, the range of change between a parameter value applied to the region of interest in an M-th frame and a parameter value applied to the region of interest in an N-th frame subsequent to the M-th frame.
  • when the at least one processor detects a face in the M-th frame, the at least one processor may apply a first correction value to the ROI in the M-th frame and apply a second correction value to the ROI in the N-th frame.
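  • A minimal sketch of this temporal limit, continuing the example above (the per-frame bound of 0.05 is an assumption):

    import numpy as np

    MAX_DELTA_PER_FRAME = 0.05   # assumed "first change range" per frame

    def smooth_strength(prev, target):
        """Clamp the per-cell change so the correction strength applied to the
        ROI cannot jump abruptly between the M-th and N-th frames."""
        delta = np.clip(target - prev, -MAX_DELTA_PER_FRAME, MAX_DELTA_PER_FRAME)
        return prev + delta

    # In a preview loop, per frame:
    # strength_map = smooth_strength(strength_map, build_strength_map(frame, faces))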
  • the at least one processor may detect at least one of a human face and an animal face through the face detection.
  • the at least one processor may determine the parameter value related to the compensation based on a grid node, face detection information, and an ROI.
  • the at least one processor may apply the second parameter value to a first grid region adjacent to the ROI outside the ROI.
  • the at least one processor may apply a fourth parameter value smaller than the second parameter value to a second grid area that is more distant from the ROI than the first grid area and is adjacent to an outer area of the image.
  • the at least one processor may not perform the distortion compensation on a grid region corresponding to the outermost region of the image.
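  • A sketch of this outward falloff, continuing the example above: the outermost cells are left uncorrected, and the ring just inside the border gets the smaller "fourth" value. The tier values and the ring-based approximation of "more distant from the ROI" are assumptions.

    import numpy as np

    EDGE_STRENGTH = 0.1   # assumed "fourth parameter value", below the second (0.2)

    def apply_border_falloff(strength):
        """Taper compensation toward the image border: no compensation in the
        outermost cells, a reduced value one ring inside the border."""
        out = strength.copy()
        border = np.zeros(out.shape, dtype=bool)
        border[0, :] = border[-1, :] = border[:, 0] = border[:, -1] = True
        out[border] = 0.0             # outermost region: left untouched
        inner = np.zeros_like(border)
        inner[1, 1:-1] = inner[-2, 1:-1] = True
        inner[1:-1, 1] = inner[1:-1, -2] = True
        weak = inner & (out <= BACKGROUND_STRENGTH)   # skip ROI/margin cells
        out[weak] = EDGE_STRENGTH
        return out

    strength_map = apply_border_falloff(strength_map)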
  • the method of operating the electronic device 100 may include detecting a face in an image acquired through the image sensor 120, dividing the acquired image into a plurality of grid regions, setting at least one grid region corresponding to the detected face among the plurality of grid regions as a region of interest, applying a parameter related to distortion compensation as a first parameter value to the at least one grid region set as the region of interest, applying, for the parameter, a second parameter value lower than the first parameter value to the grid regions other than the at least one grid region among the plurality of grid regions, performing distortion correction on the image based on the first parameter value and the second parameter value, and displaying the image on which the distortion correction has been performed on the display 110.
  • the method of operating the electronic device 100 may include applying, to the grid region corresponding to the margin region, a third parameter value that is between the first parameter value and the second parameter value.
  • the method of operating the electronic device 100 may include performing stereographic projection based on the first parameter value and the second parameter value, and performing image warping based on the grid points whose positions are changed by the stereographic projection.
  • the method of operating the electronic device 100 may include limiting, to a first change range, the range of change between a parameter value applied to the region of interest in an M-th frame and a parameter value applied to the region of interest in an N-th frame subsequent to the M-th frame.
  • the method of operating the electronic device 100 may include applying the second parameter value to a first grid region adjacent to the region of interest outside the region of interest, and applying a fourth parameter value smaller than the second parameter value to a second grid region spaced farther from the region of interest than the first grid region and adjacent to the outer region of the image.
  • in a non-transitory recording medium storing computer-readable instructions, the instructions, when executed by at least one processor of an electronic device, may cause the processor to perform an operation of detecting a face in an image acquired through the image sensor 120, an operation of dividing the acquired image into a plurality of grid regions, an operation of setting at least one grid region corresponding to the detected face among the plurality of grid regions as a region of interest, an operation of applying a parameter related to distortion compensation as a first parameter value to the at least one grid region set as the region of interest, and an operation of applying, for the parameter, a second parameter value lower than the first parameter value to the grid regions other than the at least one grid region among the plurality of grid regions.
  • in a non-transitory recording medium storing computer-readable instructions, the instructions, when executed by at least one processor of an electronic device, may cause the processor to perform an operation of applying, to the grid region corresponding to the margin region, a third parameter value that is between the first parameter value and the second parameter value.
  • in a non-transitory recording medium storing computer-readable instructions, the instructions, when executed by at least one processor of an electronic device, may cause the processor to perform an operation of performing stereographic projection based on the first parameter value and the second parameter value, and an operation of performing image warping based on the grid points whose positions are changed by the stereographic projection.
  • an operation of limiting, to a first change range, the range of change between a parameter value applied to the region of interest in an M-th frame and a parameter value applied to the region of interest in an N-th frame subsequent to the M-th frame may be performed.
  • an operation of applying the second parameter value to a first grid region adjacent to the region of interest outside the region of interest, and an operation of applying a fourth parameter value smaller than the second parameter value to a second grid region spaced farther from the region of interest than the first grid region and adjacent to the outer region of the image may be performed.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Studio Devices (AREA)
  • Geometry (AREA)
  • Image Processing (AREA)

Abstract

According to the present invention, at least one processor included in an electronic device may perform distortion correction on an image acquired by means of an image sensor, the distortion correction being performed by: detecting a face in the image; dividing the acquired image into a plurality of grid regions; setting, as a region of interest, at least one grid region corresponding to the detected face among the plurality of grid regions; applying a parameter related to distortion compensation as a first parameter value to the grid region(s) set as the region of interest; and applying, as the parameter, a second parameter value lower than the first parameter value to grid regions other than the grid region(s) among the plurality of grid regions. Various other embodiments derived from the description are also possible.
PCT/KR2021/013658 2020-10-14 2021-10-06 Procédé de correction de distorsion d'image, et dispositif électronique associé WO2022080737A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2020-0132891 2020-10-14
KR1020200132891A KR20220049354A (ko) 2020-10-14 2020-10-14 이미지의 왜곡을 보정하는 방법 및 그 전자 장치

Publications (1)

Publication Number Publication Date
WO2022080737A1 true WO2022080737A1 (fr) 2022-04-21

Family

ID=81208390

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/013658 WO2022080737A1 (fr) 2020-10-14 2021-10-06 Procédé de correction de distorsion d'image, et dispositif électronique associé

Country Status (2)

Country Link
KR (1) KR20220049354A (fr)
WO (1) WO2022080737A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007174060A (ja) * 2005-12-20 2007-07-05 Fuji Xerox Co Ltd 画像形成装置
JP2008140401A (ja) * 2007-12-14 2008-06-19 Sanyo Electric Co Ltd 運転支援装置
KR20110116387A (ko) * 2010-04-19 2011-10-26 클레어픽셀 주식회사 촬상 장치 및 영상 왜곡 보정 방법
WO2018074520A1 (fr) * 2016-10-21 2018-04-26 パナソニックIpマネジメント株式会社 Système d'interphone, dispositif principal d'interphone, sous-dispositif d'interphone, et programme
JP2018160809A (ja) * 2017-03-23 2018-10-11 カシオ計算機株式会社 画像処理装置、撮像システム、画像処理方法及びプログラム


Also Published As

Publication number Publication date
KR20220049354A (ko) 2022-04-21

Similar Documents

Publication Publication Date Title
WO2022030855A1 (fr) Dispositif électronique et procédé permettant de générer une image par application d'un effet sur un sujet et un arrière-plan
WO2022030838A1 (fr) Dispositif électronique et procédé de commande d'image de prévisualisation
WO2022039424A1 (fr) Procédé de stabilisation d'images et dispositif électronique associé
WO2022092706A1 (fr) Procédé de prise de photographie à l'aide d'une pluralité de caméras, et dispositif associé
WO2022108235A1 (fr) Procédé, appareil et support de stockage pour obtenir un obturateur lent
WO2022149654A1 (fr) Dispositif électronique pour réaliser une stabilisation d'image, et son procédé de fonctionnement
WO2022235043A1 (fr) Dispositif électronique comprenant une pluralité de caméras et son procédé de fonctionnement
WO2023033333A1 (fr) Dispositif électronique comprenant une pluralité de caméras et son procédé de fonctionnement
WO2022196993A1 (fr) Dispositif électronique et procédé de capture d'image au moyen d'un angle de vue d'un module d'appareil de prise de vues
WO2021194161A1 (fr) Procédé de correction de tremblement au niveau d'un agrandissement élevé et dispositif électronique associé
WO2022080737A1 (fr) Procédé de correction de distorsion d'image, et dispositif électronique associé
WO2022240186A1 (fr) Procédé de correction de distorsion d'image et dispositif électronique associé
WO2021251631A1 (fr) Dispositif électronique comprenant une fonction de réglage de mise au point, et procédé associé
WO2021230567A1 (fr) Procédé de capture d'image faisant intervenir une pluralité d'appareils de prise de vues et dispositif associé
WO2022025574A1 (fr) Dispositif électronique comprenant un capteur d'image et un processeur de signal d'image, et son procédé
WO2022092607A1 (fr) Dispositif électronique comportant un capteur d'image et procédé de fonctionnement de celui-ci
WO2022154164A1 (fr) Dispositif électronique apte à régler un angle de vue et procédé de fonctionnement associé
WO2022173236A1 (fr) Dispositif électronique comprenant un capteur d'image et son procédé de fonctionnement
WO2021230507A1 (fr) Procédé et dispositif pour fournir un guidage en imagerie
WO2022231270A1 (fr) Dispositif électronique et son procédé de traitement d'image
WO2023033396A1 (fr) Dispositif électronique pour traiter une entrée de prise de vue continue, et son procédé de fonctionnement
WO2022245148A1 (fr) Procédé de traitement d'image et dispositif électronique pour celui-ci
WO2024106746A1 (fr) Dispositif électronique et procédé d'augmentation de la résolution d'une image de bokeh numérique
WO2023058878A1 (fr) Dispositif électronique comprenant un capteur d'image et procédé de fonctionnement associé
WO2023146236A1 (fr) Dispositif électronique comprenant un module de caméra

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21880384

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21880384

Country of ref document: EP

Kind code of ref document: A1