WO2022010193A1 - Electronic device for image improvement and method for operating a camera of an electronic device - Google Patents


Info

Publication number
WO2022010193A1
WO2022010193A1, PCT/KR2021/008443, KR2021008443W
Authority
WO
WIPO (PCT)
Prior art keywords
image
electronic device
subject
camera
processor
Prior art date
Application number
PCT/KR2021/008443
Other languages
English (en)
Korean (ko)
Inventor
김상헌
임연욱
Original Assignee
Samsung Electronics Co., Ltd. (삼성전자 주식회사)
Priority date
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd. (삼성전자 주식회사)
Publication of WO2022010193A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G06T5/94 Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums

Definitions

  • Embodiments of the present document relate to an electronic device that generates an improved image when capturing an image using a camera of the electronic device, and a method of operating the electronic device.
  • When an image is captured using an electronic device, the captured image may need to be improved according to properties of the electronic device's configuration or the surrounding environment. For example, the electronic device may need to improve the angle of view or correct shading according to the positions of light sources and objects.
  • Image capturing may be performed based on a plurality of attributes.
  • The electronic device may be equipped with a wide camera, an ultra-wide camera, a telephoto camera, a time-of-flight (TOF) camera, and the like.
  • Equipped with the aforementioned cameras, the electronic device may capture an image based on various angles of view or distance information.
  • the electronic device may estimate or determine the shadow based on distance information about the object, and may perform image correction on the determined shadow. Also, the electronic device may determine whether an event in the image has occurred by using shooting information such as auto focusing.
  • Otherwise, the user has to arrange the composition so that no shadow is generated when taking an image, and even after trying various compositions, the user may not find a position that is both suitable for the shot and free of shadows.
  • An electronic device according to an embodiment includes a display, a first camera module including a first camera having a first viewing angle, a second camera module including a second camera having a second viewing angle different from the first viewing angle, and a processor operatively connected to the first camera module and the second camera module.
  • The processor is configured to: display, through the display, a first preview screen showing a first image that is obtained using the first camera module and includes a subject image; determine whether the first image satisfies a condition for determining whether image improvement is required; and obtain a second image including the subject image by using the second camera module.
  • When the condition is satisfied, the processor displays, through the display, a second preview screen showing a composite image in which first object coordinates of the first image and second object coordinates of the second image are matched and the first image and the second image are synthesized.
  • A method of operating an electronic device according to an embodiment may include: displaying, through the display, a first preview screen showing a first image that is obtained using the first camera module and includes a subject image; determining whether the first image satisfies a condition for determining whether image improvement is necessary; obtaining a second image including the subject image by using the second camera module; and, when the condition is satisfied, displaying, through the display, a second preview screen showing a composite image in which first object coordinates of the first image and second object coordinates of the second image are matched and the first image and the second image are synthesized.
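The claimed flow above (first-camera preview, condition check, second-camera capture, coordinate matching, synthesis) can be sketched as follows. This is a hypothetical illustration, not the patent's method: the function names, the dark-pixel heuristic standing in for the improvement condition, and the per-pixel max blend standing in for the synthesis are all assumptions.

```python
# Hypothetical sketch of the two-camera enhancement flow described in the
# claims; names, thresholds, and the blend rule are illustrative only.
import numpy as np

def needs_enhancement(frame: np.ndarray, dark_ratio_threshold: float = 0.2) -> bool:
    """Flag a frame as needing improvement when a large fraction of its
    pixels is dark (a crude stand-in for the claimed shadow condition)."""
    dark = frame < 0.25 * frame.max()
    return bool(dark.mean() > dark_ratio_threshold)

def composite(first: np.ndarray, second: np.ndarray, offset: tuple) -> np.ndarray:
    """Shift the second frame so its object coordinates line up with the
    first (a stand-in for coordinate matching), then take the per-pixel
    maximum, which lifts shadowed regions."""
    shifted = np.roll(second, shift=offset, axis=(0, 1))
    return np.maximum(first, shifted)

# Toy frames: the first has a dark (shadowed) band, the second does not.
first = np.full((4, 4), 200.0)
first[1:3, :] = 10.0                  # shadow band
second = np.full((4, 4), 180.0)

result = composite(first, second, offset=(0, 0)) if needs_enhancement(first) else first
print(result.min())                   # shadow band lifted from 10.0 to 180.0
```

In a real pipeline the offset would come from matching the subject's coordinates across the two views, and the blend would be confined to the detected shadow area rather than applied globally.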
  • image improvement may be performed on a shadow generated when an image of a subject is captured by using the camera of the electronic device.
  • image improvement may be performed with respect to light reflection by a subject that occurs when an image is captured by using a camera of an electronic device.
  • the user experience may be improved by improving the image of the subject photographed by the electronic device.
  • FIG. 1 is a block diagram of an electronic device in a network environment, according to various embodiments of the present disclosure.
  • FIG. 2 is a block diagram illustrating a camera module according to various embodiments of the present disclosure.
  • FIG. 3 is a diagram illustrating a configuration of a part of an electronic device and timing for explaining an example in which the electronic device operates, according to various embodiments of the present disclosure.
  • FIG. 4 is a diagram illustrating an image before and after image enhancement is performed in an electronic device according to an exemplary embodiment.
  • FIG. 5 is a diagram illustrating a configuration of an electronic device according to an embodiment.
  • FIG. 6 is a flowchart illustrating a flow of performing image enhancement in an electronic device according to an exemplary embodiment.
  • FIG. 7 is a flowchart illustrating a part of a detailed flow of performing image enhancement in an electronic device according to an exemplary embodiment.
  • FIG. 8 is a diagram illustrating an embodiment of dividing a subject image area of a preview screen while image enhancement is being performed in an electronic device according to an exemplary embodiment.
  • FIG. 9 is a diagram illustrating an embodiment of determining whether an electronic device satisfies a threshold value condition for determining whether image improvement is necessary while performing image improvement, according to an embodiment.
  • FIG. 10 is a diagram illustrating an embodiment of obtaining distance values between a subject and a light source using a TOF camera while image enhancement is being performed in an electronic device, according to an embodiment.
  • FIG. 11 is a diagram illustrating an embodiment of generating object coordinates and switching to a second camera while performing image enhancement in an electronic device, according to an embodiment.
  • FIG. 12 is a flowchart illustrating a part of a detailed flow of performing image enhancement in an electronic device according to an exemplary embodiment.
  • FIG. 13 is a diagram illustrating an embodiment in which image synthesis is performed by matching object coordinates in an electronic device according to an embodiment.
  • FIG. 14 is a diagram illustrating an embodiment of generating a guide based on a subject for image improvement in an electronic device according to an embodiment.
  • FIG. 15 is a diagram illustrating an embodiment of generating a shadow area for a subject image in an electronic device according to an embodiment.
  • FIG. 16 is a diagram illustrating an example of generating a light reflection area for a subject image in an electronic device according to an exemplary embodiment.
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100 according to various embodiments of the present disclosure.
  • Referring to FIG. 1, the electronic device 101 may communicate with the electronic device 102 through a first network 198 (eg, a short-range wireless communication network), or may communicate with the electronic device 104 or the server 108 through a second network 199 (eg, a long-range wireless communication network). According to an embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • The electronic device 101 may include a processor 120, a memory 130, an input device 150, a sound output device 155, a display device 160, an audio module 170, a sensor module 176, an interface 177, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197. In some embodiments, at least one of these components (eg, the display device 160 or the camera module 180) may be omitted, or one or more other components may be added to the electronic device 101. In some embodiments, some of these components may be implemented as a single integrated circuit. For example, the sensor module 176 (eg, a fingerprint sensor, an iris sensor, or an illuminance sensor) may be embedded in the display device 160 (eg, a display).
  • The processor 120 may execute software (eg, the program 140) to control at least one other component (eg, a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or operations. According to an embodiment, as at least part of the data processing or operations, the processor 120 may load a command or data received from another component (eg, the sensor module 176 or the communication module 190) into the volatile memory 132, process the command or data stored in the volatile memory 132, and store the resulting data in the non-volatile memory 134.
  • The processor 120 may include a main processor 121 (eg, a central processing unit or an application processor) and an auxiliary processor 123 (eg, a graphics processing unit, an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of or together with the main processor 121. Additionally or alternatively, the auxiliary processor 123 may be configured to use less power than the main processor 121 or to be specialized for a specified function. The auxiliary processor 123 may be implemented separately from, or as a part of, the main processor 121.
  • The auxiliary processor 123 may control at least some of the functions or states related to at least one of the components of the electronic device 101 (eg, the display device 160, the sensor module 176, or the communication module 190), for example, on behalf of the main processor 121 while the main processor 121 is in an inactive (eg, sleep) state, or together with the main processor 121 while the main processor 121 is in an active (eg, application execution) state.
  • the memory 130 may store various data used by at least one component of the electronic device 101 (eg, the processor 120 or the sensor module 176 ).
  • the data may include, for example, input data or output data for software (eg, the program 140 ) and instructions related thereto.
  • the memory 130 may include a volatile memory 132 or a non-volatile memory 134 .
  • the program 140 may be stored as software in the memory 130 , and may include, for example, an operating system 142 , middleware 144 , or an application 146 .
  • the input device 150 may receive a command or data to be used in a component (eg, the processor 120 ) of the electronic device 101 from the outside (eg, a user) of the electronic device 101 .
  • the input device 150 may include, for example, a microphone, a mouse, a keyboard, or a digital pen (eg, a stylus pen).
  • the sound output device 155 may output a sound signal to the outside of the electronic device 101 .
  • the sound output device 155 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback, and the receiver can be used to receive an incoming call.
  • the receiver may be implemented separately from or as a part of the speaker.
  • the display device 160 may visually provide information to the outside of the electronic device 101 (eg, a user).
  • the display device 160 may include, for example, a display, a hologram device, or a projector and a control circuit for controlling the corresponding device.
  • The display device 160 may include touch circuitry configured to sense a touch, or a sensor circuit (eg, a pressure sensor) configured to measure the intensity of a force generated by a touch.
  • The audio module 170 may convert a sound into an electrical signal or, conversely, convert an electrical signal into a sound. According to an embodiment, the audio module 170 may acquire a sound through the input device 150, or may output a sound through the sound output device 155 or through an external electronic device 102 (eg, a speaker or headphones) directly or wirelessly connected to the electronic device 101.
  • The sensor module 176 may detect an operating state (eg, power or temperature) of the electronic device 101 or an external environmental state (eg, a user state), and may generate an electrical signal or data value corresponding to the detected state.
  • The sensor module 176 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an IR (infrared) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more specified protocols that may be used by the electronic device 101 to directly or wirelessly connect with an external electronic device (eg, the electronic device 102 ).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • the connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102 ).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that the user can perceive through tactile or kinesthetic sense.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture still images and moving images. According to an embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 may manage power supplied to the electronic device 101 .
  • The power management module 188 may be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101 .
  • the battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • The communication module 190 may support establishment of a direct (eg, wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (eg, the electronic device 102, the electronic device 104, or the server 108), and communication through the established communication channel.
  • the communication module 190 may include one or more communication processors that operate independently of the processor 120 (eg, an application processor) and support direct (eg, wired) communication or wireless communication.
  • The communication module 190 may include a wireless communication module 192 (eg, a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (eg, a local area network (LAN) communication module or a power line communication module).
  • A corresponding one of these communication modules may communicate with an external electronic device through the first network 198 (eg, a short-range communication network such as Bluetooth, WiFi Direct, or IrDA (infrared data association)) or the second network 199 (eg, a long-range communication network such as a cellular network, the Internet, or a computer network such as a LAN or WAN).
  • These various types of communication modules may be integrated into one component (eg, a single chip) or may be implemented as a plurality of components (eg, multiple chips) separate from each other.
  • The wireless communication module 192 may identify and authenticate the electronic device 101 within a communication network such as the first network 198 or the second network 199 using subscriber information (eg, an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196.
  • the antenna module 197 may transmit or receive a signal or power to the outside (eg, an external electronic device).
  • the antenna module may include one antenna including a conductor formed on a substrate (eg, a PCB) or a radiator formed of a conductive pattern.
  • The antenna module 197 may include a plurality of antennas. In this case, at least one antenna suitable for a communication method used in a communication network such as the first network 198 or the second network 199 may be selected from the plurality of antennas by, for example, the communication module 190. A signal or power may be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna.
  • According to some embodiments, other components (eg, an RFIC) in addition to the radiator may be additionally formed as a part of the antenna module 197.
  • At least some of the above components may be connected to each other through a communication method between peripheral devices (eg, a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (eg, commands or data) with each other.
  • the command or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199 .
  • Each of the electronic devices 102 and 104 may be the same or a different type of device from the electronic device 101 .
  • all or some of the operations performed by the electronic device 101 may be executed by one or more of the external electronic devices 102 , 104 , or 108 .
  • For example, when the electronic device 101 needs to perform a function or service, the electronic device 101 may, instead of or in addition to executing the function or service itself, request one or more external electronic devices to perform at least a part of the function or service.
  • One or more external electronic devices that have received the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit a result of the execution to the electronic device 101 .
  • the electronic device 101 may process the result as it is or additionally and provide it as at least a part of a response to the request.
  • cloud computing, distributed computing, or client-server computing technology may be used.
  • FIG. 2 is a block diagram 200 illustrating a camera module 180, according to various embodiments.
  • The camera module 180 may include a lens assembly 210, a flash 220, an image sensor 230, an image stabilizer 240, a memory 250 (eg, a buffer memory), or an image signal processor 260.
  • the lens assembly 210 may collect light emitted from a subject, which is an image to be captured.
  • the lens assembly 210 may include one or more lenses.
  • the camera module 180 may include a plurality of lens assemblies 210 . In this case, the camera module 180 may form, for example, a dual camera, a 360 degree camera, or a spherical camera.
  • Some of the plurality of lens assemblies 210 may have the same lens properties (eg, angle of view, focal length, auto focus, f-number, or optical zoom), or at least one lens assembly may have one or more lens properties different from those of another lens assembly.
  • the lens assembly 210 may include, for example, a wide-angle lens or a telephoto lens.
  • the flash 220 may emit light used to enhance light emitted or reflected from the subject.
  • the flash 220 may include one or more light emitting diodes (eg, a red-green-blue (RGB) LED, a white LED, an infrared LED, or an ultraviolet LED), or a xenon lamp.
  • the image sensor 230 may acquire an image corresponding to the subject by converting light emitted or reflected from the subject and transmitted through the lens assembly 210 into an electrical signal.
  • The image sensor 230 may include, for example, one image sensor selected from among image sensors having different properties (eg, an RGB sensor, a black and white (BW) sensor, an IR sensor, or a UV sensor), a plurality of image sensors having the same properties, or a plurality of image sensors having different properties.
  • Each image sensor included in the image sensor 230 may be implemented using, for example, a charged coupled device (CCD) sensor or a complementary metal oxide semiconductor (CMOS) sensor.
  • In response to a movement of the camera module 180 or the electronic device 101 including it, the image stabilizer 240 may move at least one lens included in the lens assembly 210 or the image sensor 230 in a specific direction, or may control operational characteristics of the image sensor 230 (eg, adjust read-out timing). This makes it possible to compensate for at least some of the negative effects of the movement on the image being captured.
  • According to an embodiment, the image stabilizer 240 may detect such a movement of the camera module 180 or the electronic device 101 using a gyro sensor (not shown) or an acceleration sensor (not shown) disposed inside or outside the camera module 180.
  • the image stabilizer 240 may be implemented as, for example, an optical image stabilizer.
  • The memory 250 may temporarily store at least a portion of the image acquired through the image sensor 230 for the next image processing operation. For example, when image acquisition is delayed by the shutter, or when a plurality of images is acquired at high speed, the acquired original image (eg, a Bayer-patterned image or a high-resolution image) may be stored in the memory 250, and a corresponding copy image (eg, a low-resolution image) may be previewed through the display device 160.
  • the memory 250 may be configured as at least a part of the memory 130 or as a separate memory operated independently of the memory 130 .
  • the image signal processor 260 may perform one or more image processing on an image acquired through the image sensor 230 or an image stored in the memory 250 .
  • The one or more image processes may include, for example, depth map generation, three-dimensional modeling, panorama generation, feature point extraction, image synthesis, or image compensation (eg, noise reduction, resolution adjustment, brightness adjustment, blurring, sharpening, or softening).
  • Additionally or alternatively, the image signal processor 260 may perform control (eg, exposure time control or read-out timing control) of at least one of the components included in the camera module 180 (eg, the image sensor 230).
  • The image processed by the image signal processor 260 may be stored back in the memory 250 for further processing.
  • the image signal processor 260 may be configured as at least a part of the processor 120 or as a separate processor operated independently of the processor 120.
  • When the image signal processor 260 is configured as a processor separate from the processor 120, at least one image processed by the image signal processor 260 may be displayed through the display device 160 as it is, or after additional image processing by the processor 120.
  • the electronic device 101 may include a plurality of camera modules 180 each having different properties or functions.
  • at least one of the plurality of camera modules 180 may be a wide-angle camera, and at least the other may be a telephoto camera.
  • at least one of the plurality of camera modules 180 may be a front camera, and at least the other may be a rear camera.
  • FIG. 3 is a diagram illustrating a configuration 300 of a part of the electronic device 301 and timing for explaining an example 390 in which the electronic device 301 operates, according to various embodiments of the present disclosure.
  • the electronic device 301 may correspond to the electronic device 101 of FIG. 1 .
  • the electronic device 301 may include a processor 310 , a memory 330 , a camera module 340 , a TOF camera 350 , and a display 360 .
  • the camera module 340 may capture still images and moving images.
  • the camera module 340 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the camera module 340 may include at least a first camera module 342 and a second camera module 344 .
  • the first camera module 342 and the second camera module 344 may perform the same or similar functions to the functions performed by the camera module 340 .
  • the TOF camera 350 may acquire distance information (or depth information) between the electronic device 301 and the object (eg, subject) 302 .
  • the TOF camera 350 may include a time of flight (ToF) module.
  • the TOF camera 350 may include a light emitting module 352 and a receiving module 354 , or a combination thereof.
  • The light emitting module 352 may include one or more vertical-cavity surface-emitting lasers (VCSELs), light emitting diodes (eg, red-green-blue (RGB) LEDs, white LEDs, infrared LEDs, or ultraviolet LEDs), or a xenon lamp.
  • the reception module 354 may be configured as an RGB sensor, a black and white (BW) sensor, an infrared sensor, an ultraviolet sensor, or a combination thereof.
  • the image sensor included in the reception module 354 may be implemented using a charged coupled device (CCD) sensor, a complementary metal oxide semiconductor (CMOS) sensor, or a combination thereof.
  • As shown in FIG. 3, the TOF camera 350 may irradiate the object 302 (eg, a subject or an external object) with irradiation light 371 of a specified frequency, using the light emitting module 352, during the irradiation time (T p) of the specified period (T c).
  • the TOF camera 350 may not irradiate the irradiation light 371 during the non-irradiation time T np of the specified period T c .
  • the TOF camera 350 may receive the reflected light 375 from the object 302 for a time interval corresponding to the irradiation time T p using the receiving module 354 .
  • the reception module 354 may include at least two capacitors T1 and T2.
  • The first capacitor T1 may be switched from the off state to the on state during the irradiation time T p of the specified period T c, and may be switched from the on state to the off state during the non-irradiation time T np.
  • The second capacitor T2 may be switched from the on state to the off state during the irradiation time T p of the specified period T c, and may be switched from the off state to the on state during the non-irradiation time T np.
  • the at least two capacitors T1 and T2 may accumulate charge amounts Q1 and Q2 corresponding to the light amount of the incident reflected light 375 while they are in the on state.
  • the charges accumulated in the two capacitors T1 and T2 may be charges generated in response to light received by the reception module.
  • The TOF camera 350 may transmit, to the processor 310, information indicating the distance D between the electronic device 301 and the object 302, or may transmit information indicating the time difference Δt so that the processor 310 can obtain the distance D.
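The two-capacitor scheme described above can be sketched numerically. This is a hedged illustration, not the patent's Equation 1: it assumes the standard pulsed-ToF relation in which the round-trip delay is estimated from the charge split between the two capacitors.

```python
# Hedged sketch: estimates distance from the charges Q1 (accumulated while the
# pulse is emitted) and Q2 (accumulated while it is not). Assumes the standard
# relation Δt ≈ T_p * Q2 / (Q1 + Q2), then D = c * Δt / 2 for the out-and-back path.
C = 299_792_458.0  # speed of light, m/s

def tof_distance(q1: float, q2: float, pulse_width_s: float) -> float:
    """Distance in meters from accumulated charges Q1, Q2 and pulse width T_p."""
    delay = pulse_width_s * q2 / (q1 + q2)  # estimated round-trip time Δt
    return C * delay / 2.0                  # halve: light travels to the object and back

# Example: a 30 ns pulse with the reflected charge split equally between T1 and T2.
d = tof_distance(q1=1.0, q2=1.0, pulse_width_s=30e-9)
print(round(d, 3))  # 2.248 (meters)
```

When the reflection arrives with no delay, all charge lands in Q1 and the estimated distance is zero; as the delay grows toward the full pulse width, the estimate grows toward c·T p/2.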
  • the TOF camera 350 may be configured as a part of the camera module 340 .
  • the camera module 340 and the TOF camera 350 may be configured as one sensor, for example, an image sensor. However, this is only an example, and the embodiment of the present invention is not limited thereto.
  • the processor 310 may control at least one other component of the electronic device 301 connected to the processor 310 (eg, the memory 330 , the camera module 340 , the TOF camera 350 , or the display 360 ). Also, the processor 310 may correspond to the processor 120 of FIG. 1 .
  • the processor 310 may activate the TOF camera 350 in response to an input to measure the distance to the object 302 , and may acquire distance information between the electronic device 301 and the object 302 using the activated TOF camera 350 .
  • an input for measuring a distance to the object 302 may be received through a screen provided by an arbitrary application.
  • an input for measuring a distance to the object 302 may be an input for executing an arbitrary application (eg, a camera application).
  • an input for measuring a distance to the object 302 may be received through at least one of a voice command, a button input, and a gesture input.
  • the processor 310 may calculate the distance D between the electronic device 301 and the object 302 based on Equation 1 below.
  • the processor 310 may provide a service based on distance information measured using the TOF camera 350 .
  • the service may be at least one function capable of distinguishing a background from an object in an image captured by a camera.
  • the processor 310 may distinguish at least a part (eg, a background) included in the image from another part based on the measured distance information, and may apply a different effect (eg, a blur effect) to the divided part and the other part.
  • the processor 310 may classify at least one object included in the image based on the measured distance information, and may provide a service (eg, an AR Ruler service, a map service, a Mixed Reality service, etc.) for measuring at least one of the distance between the electronic device and the object 302 , the size of the object 302 , and the thickness of the object 302 .
  • in response to determining that the distance measured using the TOF camera 350 deviates from the threshold distance, the processor 310 may obtain distance information between the electronic device 301 and the object 302 using the camera module 340 .
  • the threshold distance may be a distance between the electronic device 301 and the object 302 at which an error in a distance measured using the TOF camera 350 may be guaranteed to be less than or equal to a certain level.
  • the processor 310 may stop the operation of the TOF camera 350 and acquire distance information using the camera module 340 .
  • the processor 310 may stop the operation of the TOF camera 350 to prevent unnecessary power consumption, and may use the camera module 340 , which consumes relatively less power than the TOF camera 350 , to check a time point at which the operation of the TOF camera 350 should be resumed (eg, when the object 302 is confirmed to exist within the threshold distance).
  • the processor 310 may stop the operation of the TOF camera 350 for a specified time and temporarily operate after the specified time to confirm the timing of resuming the operation of the TOF camera 350 .
  • the processor 310 may stop providing a service based on the distance information while acquiring the distance information using the camera module 340 .
  • the processor 310 may notify a message indicating that the operation of the TOF camera 350 is stopped.
  • the processor 310 may process the stopped operation of the TOF camera 350 to resume. For example, the processor 310 may measure distance information using the TOF camera 350 and may provide a service based on the measured distance information.
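The stop/resume policy described above can be sketched as a small state machine. The threshold value and the source labels are assumptions for illustration; the document does not specify them.

```python
THRESHOLD_M = 5.0  # hypothetical threshold distance in meters

class DepthSource:
    """Tracks whether depth comes from the TOF camera or the camera module."""

    def __init__(self) -> None:
        self.tof_active = True  # TOF camera runs initially

    def update(self, distance_m: float) -> str:
        if self.tof_active and distance_m > THRESHOLD_M:
            # Object left the reliable range: stop the TOF camera to save power
            # and fall back to the camera module (eg, auto-focus distance).
            self.tof_active = False
        elif not self.tof_active and distance_m <= THRESHOLD_M:
            # Object is back within the threshold distance: resume the TOF camera.
            self.tof_active = True
        return "tof" if self.tof_active else "camera_module"
```

While in the `camera_module` state, the sketch mirrors the document's behavior of suspending the distance-based service until the TOF camera resumes.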
  • the processor 310 may change the measurement parameter while the TOF camera 350 is operating.
  • the measurement parameters may include an irradiation frequency (eg, frequency) of the light emitting module 352 , an irradiation period (eg, exposure time), a frame rate, and the like.
  • the processor 310 may change the measurement parameter based on the position of the object 302 .
  • the processor 310 may decrease the irradiation period and increase the irradiation frequency as the distance of the object 302 approaches the electronic device 301 .
  • the processor 310 may increase the irradiation period and decrease the irradiation frequency as the distance of the object 302 increases from the electronic device 301 .
  • the processor 310 may reduce the frame rate to reduce power consumption while the TOF camera 350 is operating.
  • the processor 310 may change the measurement parameter based on the photographing environment.
  • the photographing environment may be related to the type of the object 302 .
  • the processor 310 may obtain distance information from the object 302 using a preset reference parameter. Also, when the object 302 has relatively many movements, the processor 310 may change a measurement parameter to enlarge the distance measurement range, for example, the threshold distance. For example, the processor 310 may expand the range of the distance measurement by increasing the irradiation period and decreasing the irradiation frequency.
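The parameter policy above (shorter irradiation period and higher frequency for near objects, the opposite for far or fast-moving objects) can be sketched as follows. All numeric values and the function name are illustrative assumptions, not values from the document.

```python
def choose_params(distance_m: float, many_movements: bool = False):
    """Return an assumed (irradiation_frequency_hz, irradiation_period_s) pair."""
    if many_movements:
        # Enlarge the measurement range: longer period, lower frequency.
        return (10e6, 2e-8)
    if distance_m < 1.0:
        # Near object: increase the irradiation frequency, decrease the period.
        return (100e6, 5e-9)
    if distance_m < 3.0:
        # Preset reference parameters.
        return (50e6, 1e-8)
    # Far object: decrease the irradiation frequency, increase the period.
    return (20e6, 2e-8)
```

A frame-rate reduction for power saving could be layered on top of this in the same way.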
  • the memory 330 may include at least one piece of data required to measure the distance from the object 302 .
  • the memory may include data related to a measurement parameter and a threshold distance.
  • the measurement parameter may include an irradiation frequency (eg, frequency) of the light emitting module, an irradiation period (eg, exposure time), a frame rate, and the like.
  • the measurement parameter may include various parameters such as irradiation intensity of the light emitting module.
  • the measurement parameter may be defined according to a measurement environment.
  • a measurement parameter suitable for the object 302 having relatively few movements and a measurement parameter suitable for the object 302 having relatively many movements may be distinguished and stored.
  • the measurement parameter may be defined according to a distance from the object 302 .
  • the measurement parameters may include parameters that must be changed in order to enlarge the threshold distance.
  • an electronic device (eg, the electronic device 301 ) according to various embodiments may include a camera module (eg, the camera module 340 ), a TOF camera (eg, the TOF camera 350 ), at least one processor, at least one memory (eg, the memory 330 ), and a display (eg, the display 360 ).
  • the memory may store instructions that, when executed, cause the processor to: while an image is acquired using the camera module, calculate a first distance between at least one object and the electronic device based on depth data generated using the TOF camera; stop the operation of the TOF camera if the calculated first distance is greater than a specified distance; calculate, while the TOF camera is stopped, a second distance between the at least one object and the electronic device using the camera module; and resume the operation of the TOF camera when the calculated second distance is within the specified distance.
  • the instruction may calculate the second distance based on an auto-focus function of the camera module.
  • the instruction may provide notification information indicating that the calculated first distance is greater than a specified distance.
  • the instruction may provide notification information including information guiding the user so that the first distance falls within the specified distance.
  • when receiving an input instructing the operation of the TOF camera while the calculated first distance is greater than the specified distance, the instruction may enlarge the specified distance, generate depth data for an object existing within the enlarged specified distance, and provide a service based on the generated depth data.
  • the instruction may change at least one of an irradiation frequency (eg, frequency), an irradiation period (eg, exposure time), and a frame rate of the light emitting module (eg, the light emitting module 352 ) related to the TOF camera in order to enlarge the specified distance.
  • when the second distance cannot be calculated using the camera module, the instruction may calculate the second distance using the TOF camera.
  • the instruction may terminate a function related to the TOF camera when the second distance cannot be calculated using the TOF camera.
  • the instruction may generate depth data while changing a measurement parameter of the resumed TOF camera, and provide a service based on the generated depth data.
  • the measurement parameter may include at least one of an irradiation frequency, an irradiation period, and a frame rate of the light emitting module related to the TOF camera.
  • the instruction may change the measurement parameter based on at least one of a shooting mode and environment information.
  • the shooting mode may include at least one of a moving image mode and a still image mode.
  • the environment information may include at least one of a movement of the object, the number of the objects, and the type of the object.
  • the TOF camera may generate the depth data using a time when light arrives after being reflected by an object.
  • FIG. 4 is a diagram illustrating an image before and after image enhancement is performed in an electronic device according to an exemplary embodiment.
  • the electronic device 301 may acquire the image 400 using a camera module (eg, the camera module 340 ), and may display the acquired image 400 on a preview screen (eg, the first preview screen) through the display 360 .
  • the image 400 acquired by the electronic device 301 may include regions requiring image improvement. For example, as shown in FIG. 4 , there may be an area where light is reflected by a fluorescent lamp, an area where a shadow is generated by a subject, and an area where a shadow is generated by a photographer.
  • the electronic device 301 may acquire the image 410 by performing an image enhancement operation.
  • the image 410 on which the electronic device 301 has performed image improvement may be an image in which the regions in the image 400 requiring image improvement (eg, the region where light is reflected by a fluorescent lamp, the region where a shadow is generated by the subject, and the region where a shadow is generated by the photographer) have been removed.
  • the electronic device 301 may display the improved image 410 on a preview screen (eg, a second preview screen) through the display 360 .
  • FIG. 5 is a diagram illustrating a configuration of an electronic device according to an embodiment.
  • an electronic device 500 may be configured as follows.
  • the illustrated configurations are examples, and at least some of the illustrated configurations may be changed according to a platform included in the electronic device 500 .
  • the electronic device 500 may correspond to the electronic device 101 of FIG. 1 and the electronic device 301 of FIG. 3 .
  • the electronic device 500 may include an application layer 510 , a framework layer 520 , a hardware abstraction layer (HAL) 530 , a kernel driver layer 540 , and a hardware layer 550 .
  • the application layer 510 may include at least one application 511 stored in a memory (eg, the memory 130 of FIG. 1 ) and executable by a processor (eg, the processor 120 of FIG. 1 ), and a system UI 515 .
  • the type of the application 511 , such as an Internet browser, a video application, or a game, may not be limited.
  • the system UI 515 may mean an application constituting various GUI (graphical user interface) screens implemented on a system of an electronic device (eg, the electronic device 101 of FIG. 1 ), such as a status bar and a quick view. Also, the system UI 515 may be an application for displaying a screen related to the system.
  • the framework layer 520 may provide various functions to the application 511 so that a function or information provided from one or more resources of the electronic device (eg, the electronic device 101 of FIG. 1 ) can be used by the application 511 .
  • the framework layer 520 may include an activity manager 512 , a window manager 522 , a view system 523 , a power manager 524 , an input manager 525 , a display manager 526 , and a sensor manager 527 .
  • the activity manager 512 may control a life cycle and an activity stack of the application 511 .
  • the window manager 522 may manage one or more GUI resources used on a screen of a display (eg, the display 160 of FIG. 1 ).
  • the view system 523 may be a set of extensible views used to generate a user interface for the application 511 .
  • the power manager 524 may manage capacity, temperature, or power of a battery (eg, the battery 189 of FIG. 1 ). Also, the power manager 524 may determine or provide related information necessary for the operation of the electronic device (eg, the electronic device 101 of FIG. 1 ) by using the information acquired during the management.
  • the input manager 525 may provide information on the input device provided by the electronic device (eg, the electronic device 101 of FIG. 1 ).
  • the display manager 526 may manage a life cycle (eg, connection/property change/removal) of a display (eg, the display 160 of FIG. 1 ).
  • the display manager 526 may manage the output of GUI elements on the screen of the display (eg, the display 160 of FIG. 1 ), and may change the display (eg, the display 160 of FIG. 1 ) to be output according to an event such as a change in the folded state.
  • the sensor manager 527 may control a sensor (eg, the sensor module 176 of FIG. 1 ) based on the usability of the application 511 .
  • components included in the framework layer 520 may be components included in an intelligent shadow perspective prediction manager (ISPPM).
  • the hardware abstraction layer 530 may mean an abstraction layer between a plurality of hardware components included in the hardware layer 550 and software components of an electronic device (eg, the electronic device 101 of FIG. 1 ).
  • the hardware abstraction layer 530 may include an input dispatcher 531 , an event hub 532 , and a surface flinger 533 .
  • the input dispatcher 531 may determine whether to transmit the input event to the input target window, process, or application 511 , and may transmit the input event according to the determination.
  • the event hub 532 may standardize an input event generated by a sensor (eg, the sensor module 176 of FIG. 1 ).
  • the surface flinger 533 may perform a function of providing an execution screen to be displayed on a display (eg, the display 160 of FIG. 1 ) among execution screens generated by the application 511 . Also, when the configuration of the display (eg, the display 160 of FIG. 1 ) is changed, the surface flinger 533 may request the application 511 to process the change in resolution and density according to the changed display configuration.
  • the kernel driver layer 540 may include a plurality of drivers to control hardware included in the electronic device (eg, the electronic device 101 of FIG. 1 ). Also, the kernel driver layer 540 according to an embodiment may include a touch driver 541 , a sensor driver 542 , a general display 543 , and a DDI controller 544 .
  • the touch driver 541 may include an interface for controlling the touch controller 551 and may control the touch controller 551 .
  • the sensor driver 542 may include an interface for controlling the sensor controller 552 and may control the sensor controller 552 .
  • the DDI controller 544 may drive the general display 543 .
  • the hardware layer 550 may include a touch controller 551 , a sensor controller 552 , a display controller 553 , and an LCD panel 554 .
  • components included in the electronic device 500 may be controlled by a processor (eg, the processor 120 of FIG. 1 or the processor 310 of FIG. 3 ).
  • FIG. 6 is a flowchart illustrating a flow of performing image enhancement in an electronic device according to an exemplary embodiment.
  • operations 610 to 650 may be controlled by the processor 310 of the electronic device 301 , and data processed by the processor 310 may be at least temporarily stored in the memory 330 .
  • the electronic device 301 may display, through the display 360 , a first preview screen displaying a first image including a subject image acquired using the first camera module 342 .
  • the camera module 340 of the electronic device 301 may include a plurality of camera modules (eg, a first camera module 342 and a second camera module 344 ).
  • the first camera module 342 may be a camera module including a wide-angle camera
  • the second camera module 344 may be a camera module including a telephoto camera.
  • the electronic device 301 may acquire images using a plurality of camera modules (eg, the first camera module 342 and the second camera module 344 ) under the control of the processor 310 .
  • the electronic device 301 may acquire the first image including the subject image by photographing the subject using the wide-angle camera included in the first camera module 342 under the control of the processor 310 .
  • the electronic device 301 may display, under the control of the processor 310 , a preview screen (eg, the first preview screen) that displays an image (eg, the first image) acquired using a camera module (eg, the first camera module 342 ) through a display (eg, the display 360 ).
  • the electronic device 301 may obtain a first image (eg, the image 400 ) using the first camera module 342 including a wide-angle camera under the control of the processor 310 , and the screen of the display 360 on which the first image is displayed may be the first preview screen.
  • the first image (eg, the image 400 ) displayed on the first preview screen of the electronic device 301 may be an image including a region requiring image improvement (eg, a region where light is reflected by a fluorescent lamp, a region where a shadow is generated by the subject, or a region where a shadow is generated by the photographer).
  • the electronic device 301 may determine whether the first image satisfies a condition for determining whether image improvement is required.
  • the electronic device 301 may determine, under the control of the processor 310 , whether the first image displayed on the first preview screen needs image improvement based on a condition (eg, a threshold condition).
  • the electronic device 301 may determine that the value of the shadow region or the light reflection region is equal to or greater than a threshold value under the control of the processor 310 . In this case, the electronic device 301 may determine that the first image needs image improvement under the control of the processor 310 .
  • the electronic device 301 may acquire a second image including a subject image using the second camera module 344 .
  • the second camera module 344 of the electronic device 301 may be a camera module including a telephoto camera.
  • when the electronic device 301 uses the telephoto camera included in the second camera module 344 , it may be easier to photograph a distant subject than when the wide-angle camera included in the first camera module 342 is used.
  • the electronic device 301 may photograph a subject using a telephoto camera included in the second camera module 344 under the control of the processor 310 .
  • the electronic device 301 may acquire the second image including the subject image by photographing the subject using the telephoto camera.
  • the distance value between the subject and the electronic device 301 may be greater when the electronic device 301 acquires the second image using the telephoto camera included in the second camera module 344 than when it acquires the first image using the wide-angle camera included in the first camera module 342 .
  • the electronic device 301 may match the first object coordinates of the first image and the second object coordinates of the second image.
  • the electronic device 301 may generate object coordinates (eg, first object coordinates) while acquiring the first image including the subject image under the control of the processor 310 .
  • for example, the electronic device 301 may have generated object coordinates (eg, first object coordinates) for a subject (eg, a watch) included in the image 400 while acquiring the image 400 .
  • the electronic device 301 may generate object coordinates (eg, second object coordinates) while acquiring the second image including the subject image under the control of the processor 310 .
  • since the distance value between the subject and the electronic device 301 may be greater when the second image is acquired using the second camera module 344 than when the first image is acquired using the first camera module 342 , the second object coordinates may be different from the first object coordinates.
  • the electronic device 301 may match the first object coordinates of the first image with the second object coordinates of the second image under the control of the processor 310 .
  • the electronic device 301 may display a second preview screen displaying a composite image obtained by synthesizing the first image and the second image through the display.
  • the electronic device 301 may, under the control of the processor 310 , synthesize (or register) the images (eg, the first image and the second image) by corresponding object coordinates (eg, the first object coordinates and the second object coordinates) to generate a composite image.
  • the electronic device 301 may generate a composite image (eg, an image 410 ) under the control of the processor 310 .
  • the image 410 may be an image in which regions requiring image improvement (eg, a region where light is reflected by a fluorescent lamp, a region where a shadow is generated by the subject, and a region where a shadow is generated by the photographer) have been removed.
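The final compositing step of the flow above can be illustrated with a toy sketch. This is an assumption-laden simplification, not the patented pipeline: images are 2D lists of gray values, the degraded regions (reflection/shadow) are given as a boolean mask on the first image, and the second image is assumed to be already aligned via the matched object coordinates.

```python
def composite(first, second, mask):
    """Keep the first image, except where the mask flags a degraded pixel,
    which is taken from the aligned second image instead."""
    height, width = len(first), len(first[0])
    return [
        [second[y][x] if mask[y][x] else first[y][x] for x in range(width)]
        for y in range(height)
    ]
```

In this toy form, the "image improvement" amounts to replacing the flagged pixels of the first preview image with the corresponding pixels of the second image.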
  • FIG. 7 is a flowchart illustrating a part of a detailed flow of performing image enhancement in an electronic device according to an exemplary embodiment. Operations 710 to 750 of FIG. 7 are specific flows related to operation 620 of FIG. 6 .
  • operations 710 to 750 of FIG. 7 may be controlled by the processor 310 of the electronic device 301 , and data processed by the processor 310 may be at least temporarily stored in the memory 330 .
  • Operations 710 to 750 may be described with reference to FIGS. 8 to 11 , and various embodiments may not be limited as described below.
  • the electronic device 301 may classify the subject image area of the first preview screen.
  • Operation 710 according to an embodiment will be described with reference to FIG. 8 showing an embodiment of dividing a subject image area of a preview screen while image enhancement is being performed in the electronic device according to an embodiment.
  • the electronic device 301 may distinguish the subject image area from the image area excluding the subject image on the first preview screen under the control of the processor 310 .
  • the electronic device 301 may display a subject image 800 including text or a subject image 810 having a shape characteristic on the first preview screen under the control of the processor 310 .
  • the electronic device 301 may divide the subject image area (eg, the subject image area 802 including text and the subject image area 812 having a shape characteristic) under the control of the processor 310 .
  • the electronic device 301 may display a subject 801 including text or a subject 811 having a shape characteristic as an image (eg, the first image) through the first preview screen under the control of the processor 310 .
  • the electronic device 301 may generate a guide 802 for classifying the subject 801 including text under the control of the processor 310 , and may generate a guide 812 for classifying the subject 811 having a shape characteristic.
  • the electronic device 301 may divide the subject image area (eg, the subject image area 802 including text and the subject image area 812 having a shape characteristic) by generating the guides 802 and 812 under the control of the processor 310 .
  • the electronic device 301 may determine whether a threshold condition for determining whether image improvement is required is satisfied.
  • Operation 720 according to an embodiment will be described with reference to FIG. 9 showing an embodiment of determining whether the threshold value condition for determining whether image improvement is necessary is satisfied while image improvement is being performed in the electronic device, according to an embodiment.
  • the electronic device 301 may determine whether image improvement is necessary based on the first image area 910 .
  • the first image area 910 may include a subject and a shadow of the subject.
  • the electronic device 301 may determine whether image improvement is required under the control of the processor 310 based on the first image area 910 including the subject and the subject's shadow.
  • the electronic device 301 may distinguish the background area 920 from the first image area 910 under the control of the processor 310 .
  • the electronic device 301 may classify the remaining areas excluding the subject from the first image area 910 including the subject and the subject's shadow under the control of the processor 310 .
  • the electronic device 301 may obtain, under the control of the processor 310 , a color difference value for the color difference at the subject boundary in the first image 910 (eg, the color difference between the subject and the area excluding the subject).
  • the electronic device 301 may determine whether the color difference value is equal to or greater than a threshold value (eg, determine whether the threshold value condition is satisfied) under the control of the processor 310 . For example, when the color difference value is not equal to or greater than the threshold value, the electronic device 301 may determine under the control of the processor 310 that the threshold value condition is not satisfied. Also, when the color difference value is equal to or greater than the threshold value, the electronic device 301 may determine under the control of the processor 310 that the threshold value condition is satisfied.
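The threshold check above can be sketched briefly. The document does not specify a color-difference metric or a threshold value; a plain Euclidean distance between RGB triples and the threshold `60.0` are assumed here purely for illustration.

```python
import math

THRESHOLD = 60.0  # hypothetical threshold value

def color_difference(rgb_a, rgb_b) -> float:
    """Assumed metric: Euclidean distance between two RGB colors."""
    return math.dist(rgb_a, rgb_b)

def needs_improvement(subject_rgb, boundary_rgb) -> bool:
    """Threshold condition satisfied (difference >= threshold) means the
    first image is judged to need image improvement."""
    return color_difference(subject_rgb, boundary_rgb) >= THRESHOLD
```

A strong shadow or reflection at the subject boundary produces a large color difference, satisfying the condition and triggering the second-camera capture.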
  • the electronic device 301 may obtain a distance value between the subject and the light source by using the TOF camera.
  • Operation 730 according to an embodiment will be described with reference to FIG. 10 showing an embodiment of obtaining distance values between a subject and a light source using a TOF camera while image enhancement is being performed in an electronic device, according to an embodiment.
  • referring to FIG. 10 , the relationship between the length of the shadow 1000 according to the light source and the incident angle and reflection angle 1010 according to the surface of the object can be seen.
  • the electronic device 301 may obtain a distance value between the subject and the light source using the TOF camera 350 .
  • the electronic device 301 may extract key points of the subject and generate first object coordinates based on the extracted key points.
  • Operation 740 according to an embodiment will be described with reference to FIG. 11 showing an embodiment of generating object coordinates and switching to a second camera while image enhancement is being performed in the electronic device, according to an embodiment.
  • the electronic device 301 may extract a feature point of the subject under the control of the processor 310 using the distance values between the subject and the light source obtained in operation 730 .
  • the electronic device 301 may acquire a first image of the subject based on the first camera module 1100 (eg, corresponding to the first camera module 342 of FIG. 3 ).
  • the electronic device 301 may have a distance value of h 1 from the subject.
  • the electronic device 301 may extract a feature point of the subject by using the obtained distance value of the subject and the distance value of the light source.
  • the electronic device 301 may generate object coordinates (eg, first object coordinates) under the control of the processor 310 based on the extracted feature points of the object.
  • the electronic device 301 may generate object coordinates (eg, first object coordinates) under the control of the processor 310 based on the feature points of the object extracted while acquiring the image (eg, the first image).
  • the object coordinates (eg, first object coordinates) may be 3D spatial coordinates (eg, first object coordinates (X, Y, Z)).
  • the electronic device 301 may switch to the second camera module 1110 .
  • the second camera module 1110 may include a telephoto camera and may correspond to the second camera module 344 of FIG. 3 .
  • the electronic device 301 may switch from the first camera module 1100 to the second camera module 1110 under the control of the processor 310 .
  • when detecting that the distance value between the electronic device 301 and the subject increases, the electronic device 301 may switch to the second camera module under the control of the processor 310 . For example, when the distance value between the electronic device 301 and the subject increases from h 1 to h 2 , the electronic device 301 may switch to the second camera module 1110 to capture the image with the telephoto camera under the control of the processor 310 .
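The distance-based switch from the wide-angle module to the telephoto module can be sketched in a couple of lines. The switch point is a hypothetical value somewhere between h 1 and h 2; the document does not give one.

```python
H_SWITCH = 1.5  # hypothetical switch distance (meters), between h1 and h2

def select_camera(subject_distance_m: float) -> str:
    """Pick the wide-angle module for near subjects and the telephoto
    module once the subject distance grows past the switch point."""
    return "telephoto" if subject_distance_m > H_SWITCH else "wide_angle"
```

For example, a subject at h 1 = 0.8 m would be captured with the wide-angle camera, and one at h 2 = 2.0 m with the telephoto camera.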
  • FIG. 12 is a flowchart illustrating a part of a detailed flow of performing image enhancement in an electronic device according to an exemplary embodiment.
  • the operations of FIG. 12 may be described with reference to FIG. 13 showing an embodiment in which image synthesis is performed by matching object coordinates in an electronic device.
  • the electronic device 301 may acquire a second image including a subject image using the second camera module 344 .
  • the electronic device 301 may photograph a subject using a telephoto camera included in the second camera module 344 under the control of the processor 310 .
  • the electronic device 301 may acquire the second image including the subject image by photographing it using a telephoto camera.
  • the distance value between the subject and the electronic device 301 may be greater in the case where the second image is acquired using the telephoto camera included in the second camera module 344 than in the case where the electronic device 301 acquires the first image using the wide-angle camera included in the first camera module 342 .
  • the electronic device 301 may obtain distance values to the subject and to the light source using the TOF camera 350 .
  • the electronic device 301 may obtain a distance value to the subject included in the second image using the TOF camera 350 under the control of the processor 310 .
  • the electronic device 301 may acquire the distance value of the light source under the control of the processor 310 using the TOF camera 350 in the same or similar manner as described with reference to FIG. 10 .
  • the electronic device 301 may extract feature points of the subject and generate second object coordinates based on the extracted feature points.
  • the electronic device 301 may extract a feature point of the subject under the control of the processor 310 using the obtained distance values of the subject and the light source. For example, the electronic device 301 may acquire a second image of the subject based on the second camera module 344 . The electronic device 301 may extract a feature point of the subject by using the obtained distance value of the subject and the distance value of the light source.
  • the electronic device 301 may generate object coordinates (eg, second object coordinates) under the control of the processor 310 based on the extracted feature points of the object.
  • the electronic device 301 may generate object coordinates (eg, second object coordinates) based on the feature points of the subject extracted while acquiring the image (eg, the second image), and the object coordinates (eg, second object coordinates) may be 3D spatial coordinates (eg, second object coordinates (X, Y, Z)).
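One way to read "feature points plus TOF distance values become 3D object coordinates (X, Y, Z)" is a pinhole-camera back-projection. The sketch below assumes that reading; the intrinsic parameters (fx, fy, cx, cy) are made-up values for illustration only:

```python
# Sketch: back-projecting 2D feature points (u, v) with TOF depth Z into
# 3D object coordinates (X, Y, Z). Camera intrinsics are assumed values.

def to_object_coords(points_2d, depths, fx=1000.0, fy=1000.0, cx=640.0, cy=360.0):
    """Convert pixel feature points and per-point depths into 3D coordinates."""
    coords = []
    for (u, v), z in zip(points_2d, depths):
        x = (u - cx) * z / fx  # horizontal offset scaled by depth
        y = (v - cy) * z / fy  # vertical offset scaled by depth
        coords.append((x, y, z))
    return coords

# A feature point at the optical center maps to (0, 0, Z).
assert to_object_coords([(640, 360)], [2.0]) == [(0.0, 0.0, 2.0)]
```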
  • the electronic device 301 may match the first object coordinates of the first image and the second object coordinates of the second image.
  • the electronic device 301 may have acquired the first object coordinates from the first image.
  • the electronic device 301 may have obtained, as the first object coordinates, (X1, Y1, Z), (X2, Y2, Z), (X3, Y3, Z), (X4, Y4, Z).
  • the electronic device 301 may have obtained, as the second object coordinates, (X1, Y1, Z+H), (X2, Y2, Z+H), (X3, Y3, Z+H), (X4, Y4, Z+H).
  • the electronic device 301 may map the first object coordinates ((X1, Y1, Z), (X2, Y2, Z), (X3, Y3, Z), (X4, Y4, Z)) of the first image to the second object coordinates ((X1, Y1, Z+H), (X2, Y2, Z+H), (X3, Y3, Z+H), (X4, Y4, Z+H)) of the second image.
  • the electronic device 301 may display a second preview screen displaying a composite image obtained by synthesizing the first image and the second image through the display.
  • under the control of the processor 310 , the electronic device 301 may generate a composite image by synthesizing (or registering) the images (eg, the first image and the second image) in which the first object coordinates ((X1, Y1, Z), (X2, Y2, Z), (X3, Y3, Z), (X4, Y4, Z)) and the second object coordinates ((X1, Y1, Z+H), (X2, Y2, Z+H), (X3, Y3, Z+H), (X4, Y4, Z+H)) correspond to each other.
  • the electronic device 301 may generate a synthesized image under the control of the processor 310 .
  • the synthesized image may be an image from which regions requiring image improvement (eg, a region with reflection from a fluorescent lamp, a region in which a shadow is generated by the subject, a region in which a shadow is generated by the photographer) have been removed.
  • although the first object coordinates ((X1, Y1, Z), (X2, Y2, Z), (X3, Y3, Z), (X4, Y4, Z)) and the second object coordinates ((X1, Y1, Z+H), (X2, Y2, Z+H), (X3, Y3, Z+H), (X4, Y4, Z+H)) have been described as an example, it may not be limited to the above-described embodiment.
  • the X and Y coordinates may be different, and H may also be different.
  • X, Y, Z, and H of each of the above-described four coordinates may be different.
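The coordinate matching described above — pairing each (X, Y, Z) of the first image with the (X, Y, Z+H) of the second image before registering the images — can be sketched as below. The pairing rule (shared (X, Y)) and the offset recovery are illustrative assumptions, not the patent's exact procedure:

```python
# Sketch: matching first object coordinates (X, Y, Z) to second object
# coordinates (X, Y, Z+H) and recovering the per-pair Z offset H.

def match_object_coords(first, second):
    """Pair coordinates that share (X, Y) and return the pairs with their
    Z offsets; the pairs then drive image registration/synthesis."""
    pairs = []
    for (x1, y1, z1) in first:
        for (x2, y2, z2) in second:
            if (x1, y1) == (x2, y2):
                pairs.append(((x1, y1, z1), (x2, y2, z2), z2 - z1))
    return pairs

first = [(1, 1, 5), (2, 1, 5), (1, 2, 5), (2, 2, 5)]
second = [(1, 1, 7), (2, 1, 7), (1, 2, 7), (2, 2, 7)]
pairs = match_object_coords(first, second)
assert len(pairs) == 4 and all(h == 2 for (_, _, h) in pairs)
```

In practice the matched pairs would feed a registration step (eg, estimating a warp between the wide-angle and telephoto frames) before compositing.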
  • FIG. 14 is a diagram illustrating an embodiment of generating a guide based on a subject for image improvement in an electronic device according to an embodiment.
  • the electronic device 301 may acquire an image (eg, a first image) including a subject and a shadow of the subject by using the first camera module 1410 . Also, the electronic device 301 may display the subject and the image (eg, the first image) including the shadow of the subject on the screen of the display 360 (eg, on the first preview screen). In this case, the distance value between the electronic device 301 and the subject may be h 1 .
  • the electronic device 301 may display the guide 1420 on the screen of the display 360 (eg, the first preview screen) to improve the image.
  • the electronic device 301 may display the guide 1420 on the screen of the display 360 (eg, the first preview screen) to allow the user to adjust the distance value between the electronic device 301 and the subject.
  • the user may move the electronic device 301 further away from the subject according to the guide 1420 .
  • the electronic device 301 may have a distance value of h 2 from the subject.
  • the distance (eg, h 2 ) between the electronic device 301 and the subject according to the guide 1420 may be a distance value capable of photographing without a shadow.
  • the electronic device 301 may obtain a distance value capable of photographing without a shadow based on at least the TOF camera 350 .
  • the electronic device 301 may switch the first camera module 1410 to the second camera module 1430 while the distance value h 1 is changed to the distance value h 2 .
  • the electronic device 301 may automatically perform the conversion, or may perform the conversion manually in response to obtaining a user's input.
  • the electronic device 301 may acquire an image that does not require image improvement (eg, a second image) in a state in which it is switched to the second camera module 1430 . Also, the electronic device 301 may display an image (eg, a second image) that does not require image improvement on a screen (eg, a second preview screen) of the display 360 .
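The guide flow above — measure the current TOF distance, prompt the user until a shadow-free distance is reached, then switch modules — can be sketched as a simple state check. The message strings and the target-distance value are assumptions for illustration:

```python
# Sketch of the on-screen guide: compare the current TOF distance with a
# shadow-free target distance and tell the user what to do next.

def guide_state(current_mm: float, shadow_free_mm: float):
    """Return (message, switch_to_tele) for the preview-screen guide."""
    if current_mm < shadow_free_mm:
        return ("Move farther from the subject", False)
    return ("Distance OK - capturing without shadow", True)

msg, switch = guide_state(current_mm=300.0, shadow_free_mm=700.0)  # at h1
assert not switch
msg, switch = guide_state(current_mm=800.0, shadow_free_mm=700.0)  # at h2
assert switch
```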
  • FIG. 15 is a diagram illustrating an embodiment of generating a shadow area for a subject image in an electronic device according to an embodiment.
  • the electronic device 301 may generate a shadow based on an image 1510 on which image enhancement has been performed (eg, an image from which the shadow has been removed). For example, the image 1520 including a shadow may be created from the image 1510 on which image enhancement has been performed.
  • FIG. 16 is a diagram illustrating an example of generating a light reflection area for a subject image in an electronic device according to an exemplary embodiment.
  • the electronic device 301 may generate a light reflection effect based on an image 1610 on which image enhancement has been performed (eg, an image from which the light reflection effect has been removed). For example, an image including a light reflection effect may be created from the image 1610 on which image enhancement has been performed, using image data (eg, image data including a light reflection effect) stored in the memory 330 during the image enhancement process.
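Re-applying a stored effect layer (a shadow or a light reflection kept in the memory 330 during enhancement) onto the enhanced image can be sketched as simple alpha blending. The nested-list image representation and the alpha value are illustrative assumptions:

```python
# Sketch: blending a stored effect layer (eg, light reflection data) back
# into an enhanced image. Images are nested lists of gray-level values.

def blend_effect(base, effect, alpha=0.3):
    """Return base*(1-alpha) + effect*alpha, pixel by pixel."""
    return [[(1 - alpha) * b + alpha * e for b, e in zip(row_b, row_e)]
            for row_b, row_e in zip(base, effect)]

assert blend_effect([[100.0]], [[200.0]], alpha=0.5) == [[150.0]]
```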
  • the electronic device may be a device of various types.
  • the electronic device may include, for example, a portable communication device (eg, a smart phone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance device.
  • terms such as "first" and "second" may be used simply to distinguish one element from other elements, and do not limit the elements in other aspects (eg, importance or order). When one (eg, first) component is referred to as being "coupled" or "connected" to another (eg, second) component, with or without the terms "functionally" or "communicatively", it means that the one component can be connected to the other component directly (eg, by wire), wirelessly, or through a third component.
  • module may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as, for example, logic, logic block, component, or circuit.
  • a module may be an integrally formed part or a minimum unit or a part of the part that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • various embodiments disclosed in this document may be implemented as software (eg, the program 140) including one or more instructions stored in a storage medium readable by a device (eg, the electronic device 101). For example, a processor (eg, the processor 120) of the device (eg, the electronic device 101) may call at least one of the one or more stored instructions from the storage medium and execute it, which enables the device to be operated to perform at least one function according to the called instruction.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
  • 'non-transitory' only means that the storage medium is a tangible device and does not contain a signal (eg, an electromagnetic wave); this term does not distinguish between a case where data is semi-permanently stored in the storage medium and a case where data is temporarily stored.
  • the method according to various embodiments disclosed in this document may be provided as included in a computer program product.
  • Computer program products may be traded between sellers and buyers as commodities.
  • the computer program product may be distributed in the form of a machine-readable storage medium (eg, compact disc read only memory (CD-ROM)), or may be distributed online (eg, downloaded or uploaded) via an application store (eg, Play Store™) or directly between two user devices (eg, smartphones).
  • a part of the computer program product may be temporarily stored or temporarily created in a machine-readable storage medium such as a memory of a server of a manufacturer, a server of an application store, or a relay server.
  • each component (eg, a module or a program) of the above-described components may include a singular entity or a plurality of entities.
  • one or more components or operations among the above-described corresponding components may be omitted, or one or more other components or operations may be added.
  • a plurality of components (eg, modules or programs) may be integrated into one component.
  • the integrated component may perform one or more functions of each component of the plurality of components identically or similarly to those performed by the corresponding component among the plurality of components prior to the integration. .
  • operations performed by a module, program, or other component may be executed sequentially, in parallel, repetitively, or heuristically; one or more of the operations may be executed in a different order or omitted; or one or more other operations may be added.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

An embodiment of the present invention relates to an electronic device comprising: a display; a first camera module comprising a first camera having a first angle of view; a second camera module comprising a second camera having a second angle of view different from the first angle of view; and a processor operatively connected to the first camera module and the second camera module. The processor: displays, through the display, a first preview screen showing a first image that is acquired using the first camera module and includes a subject image; determines whether or not the first image satisfies a condition for determining whether image enhancement is required; acquires, using the second camera module, a second image including the subject image; matches first subject coordinates of the first image with second subject coordinates of the second image if it is determined that the first image satisfies the condition; and displays, through the display, a second preview screen showing a composite image obtained by combining the first image and the second image. According to various embodiments of this document, image enhancement for a shadow generated while capturing an image of a subject using a camera of the electronic device, and image enhancement for light reflection caused by the subject and generated during image capture, can be performed. In addition, since the electronic device enhances the captured image of a subject, the user experience can be improved.
PCT/KR2021/008443 2020-07-06 2021-07-02 Dispositif électronique d'amélioration d'image et procédé de fonctionnement de caméra de dispositif électronique WO2022010193A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2020-0082944 2020-07-06
KR1020200082944A KR20220005283A (ko) 2020-07-06 2020-07-06 이미지 개선을 위한 전자장치 및 그 전자장치의 카메라 운용 방법

Publications (1)

Publication Number Publication Date
WO2022010193A1 true WO2022010193A1 (fr) 2022-01-13

Family

ID=79342009

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/008443 WO2022010193A1 (fr) 2020-07-06 2021-07-02 Dispositif électronique d'amélioration d'image et procédé de fonctionnement de caméra de dispositif électronique

Country Status (2)

Country Link
KR (1) KR20220005283A (fr)
WO (1) WO2022010193A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023219349A1 (fr) * 2022-05-12 2023-11-16 Samsung Electronics Co., Ltd. Dispositif à caméras multiples et procédés d'élimination d'ombres dans des images

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010072813A (ja) * 2008-09-17 2010-04-02 Fujitsu Ltd 画像処理装置および画像処理プログラム
JP2010268019A (ja) * 2009-05-12 2010-11-25 Nikon Corp 撮影装置
JP2018098738A (ja) * 2016-12-16 2018-06-21 キヤノン株式会社 画像処理装置、撮像装置、画像処理方法および画像処理プログラム
JP2020088810A (ja) * 2018-11-30 2020-06-04 キヤノン株式会社 撮像装置及び撮像装置の制御方法
KR20200073694A (ko) * 2018-12-14 2020-06-24 삼성전자주식회사 멀티 카메라를 포함하는 장치 및 이의 이미지 처리방법


Also Published As

Publication number Publication date
KR20220005283A (ko) 2022-01-13

Similar Documents

Publication Publication Date Title
WO2020032555A1 (fr) Dispositif électronique et procédé pour fournir une notification liée à une image affichée par l'intermédiaire d'un affichage et à une image stockée en mémoire sur la base d'une analyse d'image
WO2020171583A1 (fr) Dispositif électronique pour stabiliser une image et son procédé de fonctionnement
WO2020032473A2 (fr) Dispositif électronique de floutage d'image obtenue par combinaison de plusieurs images sur la base d'informations de profondeur et procédé de pilotage du dispositif électronique
WO2020171512A1 (fr) Dispositif électronique de recommandation de composition et son procédé de fonctionnement
WO2019039771A1 (fr) Dispositif électronique pour mémoriser des informations de profondeur en relation avec une image en fonction des propriétés d'informations de profondeur obtenues à l'aide d'une image, et son procédé de commande
WO2019164185A1 (fr) Dispositif électronique et procédé de correction d'une image corrigée selon un premier programme de traitement d'image, selon un second programme de traitement d'image dans un dispositif électronique externe
WO2019143050A1 (fr) Dispositif électronique et procédé de commande de mise au point automatique de caméra
WO2020032497A1 (fr) Procédé et appareil permettant d'incorporer un motif de bruit dans une image sur laquelle un traitement par flou a été effectué
WO2021112525A1 (fr) Dispositif électronique et procédé de commande de mouvement de caméra
WO2021157954A1 (fr) Procédé d'enregistrement vidéo mettant en oeuvre une pluralité de caméras, et dispositif associé
WO2020071823A1 (fr) Dispositif électronique et son procédé de reconnaissance de geste
WO2020171333A1 (fr) Dispositif électronique et procédé pour fournir un service correspondant à une sélection d'objet dans une image
WO2021133025A1 (fr) Dispositif électronique comprenant un capteur d'image et son procédé de fonctionnement
WO2019045517A1 (fr) Procédé de commande de synchronisation d'une pluralité de capteurs d'image et dispositif électronique destiné à sa mise en œuvre
WO2019208915A1 (fr) Dispositif électronique pour acquérir une image au moyen d'une pluralité de caméras par ajustage de la position d'un dispositif extérieur, et procédé associé
WO2021025509A1 (fr) Appareil et procédé d'affichage d'éléments graphiques selon un objet
WO2020231156A1 (fr) Dispositif électronique et procédé d'acquisition d'informations biométriques en utilisant une lumière d'affichage
WO2022092706A1 (fr) Procédé de prise de photographie à l'aide d'une pluralité de caméras, et dispositif associé
WO2021145667A1 (fr) Procédé et dispositif de commande de mouvement de caméra
WO2019103420A1 (fr) Dispositif électronique et procédé de partage d'image comprenant un dispositif externe, à l'aide d'informations de lien d'image
WO2022010193A1 (fr) Dispositif électronique d'amélioration d'image et procédé de fonctionnement de caméra de dispositif électronique
WO2021112500A1 (fr) Dispositif électronique et procédé pour corriger une image dans une commutation de caméra
WO2019059635A1 (fr) Dispositif électronique pour fournir une fonction en utilisant une image rvb et une image ir acquises par l'intermédiaire d'un capteur d'image
WO2020145482A1 (fr) Dispositif électronique de commande de fréquence de trames de capteur d'images et procédé associé
WO2019190250A1 (fr) Procédé de synthèse d'image sur un objet réfléchissant en fonction d'un attribut d'objet réfléchissant inclus dans une image différente et dispositif électronique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21837144

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21837144

Country of ref document: EP

Kind code of ref document: A1