WO2021162353A1 - Electronic device including a camera and operating method thereof - Google Patents

Electronic device including a camera and operating method thereof

Info

Publication number
WO2021162353A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
image
information
region
effect
Prior art date
Application number
PCT/KR2021/001526
Other languages
English (en)
Korean (ko)
Inventor
서동환
황인성
황지나
신대규
Original Assignee
삼성전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 삼성전자 주식회사
Publication of WO2021162353A1

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2621Cameras specially adapted for the electronic generation of special effects during image pickup, e.g. digital cameras, camcorders, video cameras having integrated special effects capability
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects

Definitions

  • Various embodiments disclosed in this document relate to an electronic device including a camera and an operating method thereof.
  • the bokeh effect is an image expression technique that, on the premise that the lens is focused on the main subject, blurs areas other than the main subject.
  • the prior art determines how to apply the bokeh effect to a region in an image based on depth information.
  • the depth information may refer to an image representing, as in 3D computer graphics, information related to the distance from the observation point of an image to the surfaces of objects included in the image.
  • in the prior art, the accuracy of determining the main subject intended by the photographer when shooting the image may be reduced. Accordingly, a sub-subject or the background may be mistakenly determined to be the main subject, and the bokeh effect may be applied even to the actual main subject.
  • the present invention relates to an apparatus that provides a method of applying a natural background blur effect to an image by determining the main subject intended by the photographer with higher accuracy, and to an operating method thereof.
  • An electronic device includes a camera, a memory, and a processor connected to the camera and the memory, wherein the memory stores one or more instructions that, when executed, cause the processor to acquire opacity information of an image acquired from the camera, acquire depth information of the image, determine a region of interest of the image based on the acquired depth information and the acquired opacity information, and apply a first effect to a background region other than the region of interest in the image.
  • A method of operating an electronic device including a camera includes acquiring opacity information of an image acquired from the camera, acquiring depth information of the image, determining a region of interest of the image based on the acquired depth information and the acquired opacity information, and applying a first effect to a background region other than the region of interest in the image.
  • the electronic device may apply a natural background blurring effect to the image by determining the main subject intended by the photographer in the image with higher accuracy.
  • the electronic device may more precisely separate fine boundary regions, such as hair, fur, and the like, from an image to apply a natural background blurring effect to the image.
  • the electronic device may apply a natural background blurring effect to the image by determining the main subject intended by the photographer with higher accuracy even in an image captured by a single camera.
  • FIG. 1 is a block diagram of an electronic device in a network environment according to various embodiments of the present disclosure
  • FIG. 2 is a diagram illustrating a configuration of an electronic device according to an exemplary embodiment.
  • FIG. 3 is a flowchart illustrating an operation of an electronic device according to an exemplary embodiment.
  • FIG. 4 is a diagram illustrating an original image and subject information obtained by an electronic device according to an exemplary embodiment.
  • FIG. 5 is a diagram for describing a method for an electronic device to obtain opacity information according to an exemplary embodiment.
  • FIG. 6 is a diagram illustrating a result of separating a subject and a background from an image by an electronic device according to an exemplary embodiment
  • FIG. 7 is a diagram illustrating a result of applying an effect to a background separated from a subject in an image by an electronic device according to an exemplary embodiment.
  • FIG. 8 is a diagram illustrating depth information of the original image of FIG. 4 acquired by an electronic device according to an exemplary embodiment.
  • FIG. 9 is a diagram illustrating an image formed by applying an effect to the original image of FIG. 4 by an electronic device according to an exemplary embodiment
  • FIG. 10 is a diagram for describing an operation of an electronic device according to an exemplary embodiment.
  • FIG. 11 is a diagram illustrating an image formed by an electronic device according to an exemplary embodiment.
  • FIG. 12 is a flowchart illustrating an operation of an electronic device according to an exemplary embodiment.
  • FIG. 13 is a diagram for explaining an operation of an electronic device according to an exemplary embodiment.
  • FIG. 14 is a diagram illustrating a histogram formed by an electronic device according to an exemplary embodiment.
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100 according to various embodiments of the present disclosure.
  • an electronic device 101 may communicate with an electronic device 102 through a first network 198 (eg, a short-range wireless communication network), or may communicate with an electronic device 104 or a server 108 through a second network 199 (eg, a long-distance wireless communication network). According to an embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • the electronic device 101 may include a processor 120, a memory 130, an input device 150, a sound output device 155, a display device 160, an audio module 170, a sensor module 176, an interface 177, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197. In some embodiments, at least one of these components (eg, the display device 160 or the camera module 180) may be omitted, or one or more other components may be added to the electronic device 101. In some embodiments, some of these components may be implemented as one integrated circuit. For example, the sensor module 176 (eg, a fingerprint sensor, an iris sensor, or an illuminance sensor) may be implemented while being embedded in the display device 160 (eg, a display).
  • the processor 120 may execute, for example, software (eg, the program 140) to control at least one other component (eg, a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or operations. According to an embodiment, as at least part of the data processing or operations, the processor 120 may load commands or data received from another component (eg, the sensor module 176 or the communication module 190) into the volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in the non-volatile memory 134.
  • the processor 120 may include a main processor 121 (eg, a central processing unit or an application processor) and an auxiliary processor 123 (eg, a graphics processing unit, an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of or together with the main processor 121. Additionally or alternatively, the auxiliary processor 123 may be configured to use less power than the main processor 121 or to be specialized for a designated function. The auxiliary processor 123 may be implemented separately from, or as a part of, the main processor 121.
  • the auxiliary processor 123 may control at least some of the functions or states related to at least one of the components of the electronic device 101 (eg, the display device 160, the sensor module 176, or the communication module 190), for example, on behalf of the main processor 121 while the main processor 121 is in an inactive (eg, sleep) state, or together with the main processor 121 while the main processor 121 is in an active (eg, application execution) state.
  • According to an embodiment, the auxiliary processor 123 (eg, an image signal processor or a communication processor) may be implemented as a part of another functionally related component (eg, the camera module 180 or the communication module 190).
  • the memory 130 may store various data used by at least one component (eg, the processor 120 or the sensor module 176 ) of the electronic device 101 .
  • the data may include, for example, input data or output data for software (eg, the program 140 ) and instructions related thereto.
  • the memory 130 may include a volatile memory 132 or a non-volatile memory 134 .
  • the program 140 may be stored as software in the memory 130 , and may include, for example, an operating system 142 , middleware 144 , or an application 146 .
  • the input device 150 may receive a command or data to be used by a component (eg, the processor 120 ) of the electronic device 101 from the outside (eg, a user) of the electronic device 101 .
  • the input device 150 may include, for example, a microphone, a mouse, a keyboard, or a digital pen (eg, a stylus pen).
  • the sound output device 155 may output a sound signal to the outside of the electronic device 101 .
  • the sound output device 155 may include, for example, a speaker or a receiver.
  • the speaker can be used for general purposes such as multimedia playback or recording playback, and the receiver can be used to receive incoming calls. According to one embodiment, the receiver may be implemented separately from or as part of the speaker.
  • the display device 160 may visually provide information to the outside (eg, a user) of the electronic device 101 .
  • the display device 160 may include, for example, a display, a hologram device, or a projector and a control circuit for controlling the corresponding device.
  • According to an embodiment, the display device 160 may include a touch circuitry configured to sense a touch, or a sensor circuit (eg, a pressure sensor) configured to measure the intensity of a force generated by the touch.
  • the audio module 170 may convert a sound into an electric signal or, conversely, convert an electric signal into a sound. According to an embodiment, the audio module 170 may acquire a sound through the input device 150, or may output a sound through the sound output device 155 or an external electronic device (eg, the electronic device 102, such as a speaker or headphones) connected directly or wirelessly with the electronic device 101.
  • the sensor module 176 may detect an operating state (eg, power or temperature) of the electronic device 101 or an external environmental state (eg, a user state), and may generate an electrical signal or data value corresponding to the detected state.
  • the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
  • the interface 177 may support one or more specified protocols that may be used by the electronic device 101 to directly or wirelessly connect with an external electronic device (eg, the electronic device 102 ).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • the connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102 ).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that the user can perceive through tactile or kinesthetic sense.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 may capture still images and moving images. According to an embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 may manage power supplied to the electronic device 101 .
  • the power management module 188 may be implemented as, for example, at least a part of a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101 .
  • battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell, or a fuel cell.
  • the communication module 190 may support establishment of a direct (eg, wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (eg, the electronic device 102, the electronic device 104, or the server 108), and communication through the established communication channel.
  • the communication module 190 may include one or more communication processors that operate independently of the processor 120 (eg, an application processor) and support direct (eg, wired) communication or wireless communication.
  • the communication module 190 may include a wireless communication module 192 (eg, a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (eg, a local area network (LAN) communication module or a power line communication module).
  • Among these communication modules, the corresponding communication module may communicate with the external electronic device 104 through the first network 198 (eg, a short-range communication network such as Bluetooth, WiFi Direct, or infrared data association (IrDA)) or the second network 199 (eg, a long-range communication network such as a cellular network, the Internet, or a computer network (eg, a LAN or a WAN)).
  • These various types of communication modules may be integrated into one component (eg, a single chip) or may be implemented as a plurality of components (eg, multiple chips) separate from each other.
  • the wireless communication module 192 may identify and authenticate the electronic device 101 within a communication network, such as the first network 198 or the second network 199, using subscriber information (eg, an International Mobile Subscriber Identifier (IMSI)) stored in the subscriber identification module 196.
  • the antenna module 197 may transmit or receive a signal or power to the outside (eg, an external electronic device).
  • the antenna module 197 may include one antenna including a radiator formed of a conductor or a conductive pattern formed on a substrate (eg, a PCB).
  • the antenna module 197 may include a plurality of antennas. In this case, at least one antenna suitable for a communication method used in a communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190. A signal or power may be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna.
  • According to some embodiments, components other than the radiator (eg, an RFIC) may be additionally formed as a part of the antenna module 197.
  • At least some of the above components may be connected to each other through a communication method between peripheral devices (eg, a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (eg, commands or data) with each other.
  • the command or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199 .
  • Each of the external electronic devices 102 and 104 may be the same as or different from the electronic device 101 .
  • all or a part of operations executed in the electronic device 101 may be executed in one or more of the external electronic devices 102 , 104 , or 108 .
  • For example, instead of executing the function or the service itself, the electronic device 101 may request one or more external electronic devices to perform at least a part of the function or the service.
  • the one or more external electronic devices that have received the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and transmit a result of the execution to the electronic device 101 .
  • the electronic device 101 may process the result as it is or additionally and provide it as at least a part of a response to the request.
  • To this end, for example, cloud computing, distributed computing, or client-server computing technology may be used.
  • FIG. 2 is a diagram 200 illustrating a configuration of an electronic device according to an exemplary embodiment. Operations of the components of the electronic device described below may be performed by a processor (eg, the processor 120 of FIG. 1 ) of the electronic device (eg, the electronic device 101 of FIG. 1 ).
  • the electronic device 101 may include an image capturing module 201, a depth information obtaining module 202, an opacity obtaining module 203, a subject separation module 204, a representative object determination module 205, a region of interest determination module 206, a depth information use determination module 207, a region of interest and background separation module 208, and an effect application module 209.
  • the image capturing module 201 may receive a captured image from a camera module (eg, the camera module 180 of FIG. 1 ) of the electronic device 101 .
  • the image may include at least one of a still image (eg, a photo, an image, etc.) and a moving image.
  • the depth information acquisition module 202 may acquire depth information for an image received by the image capturing module 201 .
  • According to an embodiment, the depth information acquisition module 202 may acquire depth information of an image through deep learning, a time-of-flight (TOF) sensor, a stereo camera, an image sensor including a multi-photodiode (multi-PD), a structured-light image sensor, or the like.
  • the opacity obtaining module 203 may obtain opacity information with respect to the image received by the image capturing module 201 .
  • the opacity obtaining module 203 may obtain opacity information of an image through deep learning or the like.
  • the opacity value may be a value corresponding to a specific range (eg, 0 to 255).
  • the opacity information may include an opacity value of at least a portion of an image.
  • the subject separation module 204 may separate the subject from the image received by the image capturing module 201 based on the opacity information obtained by the opacity obtaining module 203 . According to an embodiment, the subject separation module 204 may identify a region having an opacity value greater than or equal to a threshold value in the image as the subject. According to an embodiment, the subject separation module 204 may separate the subject from the image based on opacity information on the identified boundary area of the subject. Separating the subject according to an embodiment may be dividing the boundary between the subject and the background in the image.
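  • As an illustration of the thresholding described above, the following is a minimal Python sketch (the function name, the use of OpenCV connected components, and the threshold value of 128 are illustrative assumptions, not taken from this document):

```python
import cv2
import numpy as np

def separate_subjects(alpha_map: np.ndarray, threshold: int = 128):
    """Identify candidate subjects in an 8-bit alpha map (0-255).

    A pixel is treated as foreground when its opacity value is at or
    above `threshold`; connected foreground pixels form one subject.
    The threshold of 128 is an illustrative choice.
    """
    fg = (alpha_map >= threshold).astype(np.uint8)
    # Label connected foreground regions; label 0 is the background.
    num_labels, labels = cv2.connectedComponents(fg)
    return [labels == i for i in range(1, num_labels)]  # boolean masks
```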
  • the representative object determination module 205 may determine a representative object from among the at least one subject separated from the image by the subject separation module 204, based on the depth information obtained by the depth information obtaining module 202. According to an embodiment, the representative object determination module 205 may determine the representative object based on how close the subject is to the camera, the size of the subject in the image, and how close the subject is to the center of the image. According to an embodiment, the representative object determination module 205 may determine the subject closest to the camera as the representative object. According to an embodiment, the representative object determination module 205 may determine the subject having the largest size in the image as the representative object.
  • the representative object determination module 205 may determine a subject closest to the center of the image as the representative object. According to an embodiment, the representative object determination module 205 may determine the representative object by further considering the opacity information obtained by the opacity obtaining module 203 .
  • the region of interest determination module 206 may determine the region of interest based on the representative object determined by the representative object determination module 205. According to an embodiment, the region of interest determination module 206 may determine, as the region of interest, a region including objects whose depth-value difference from the determined representative object is within a specific threshold value.
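  • A minimal sketch of how such a representative object and region of interest might be computed from per-subject statistics; the equal scoring weights and the depth threshold are hypothetical choices, not values from this document:

```python
import numpy as np

def pick_representative(subjects, depth_map, image_shape):
    """Score each subject by closeness to the camera (small depth),
    size in the image, and closeness to the image center; return the
    index of the best-scoring subject."""
    h, w = image_shape[:2]
    center = np.array([h / 2.0, w / 2.0])
    scores = []
    for mask in subjects:
        ys, xs = np.nonzero(mask)
        closeness = 1.0 - depth_map[mask].mean() / 255.0    # nearer is higher
        size = mask.sum() / float(h * w)                    # larger is higher
        centroid = np.array([ys.mean(), xs.mean()])
        centrality = 1.0 - np.linalg.norm(centroid - center) / np.hypot(h, w)
        scores.append(closeness + size + centrality)        # assumed weights
    return int(np.argmax(scores))

def region_of_interest(subjects, depth_map, rep_idx, depth_thresh=20):
    """ROI = union of subjects whose mean depth differs from the
    representative object's mean depth by less than `depth_thresh`
    (a hypothetical threshold on a 0-255 depth scale)."""
    rep_depth = depth_map[subjects[rep_idx]].mean()
    roi = np.zeros(depth_map.shape, dtype=bool)
    for mask in subjects:
        if abs(depth_map[mask].mean() - rep_depth) < depth_thresh:
            roi |= mask
    return roi
```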
  • the depth information use determination module 207 may determine whether to use the depth information acquired by the depth information acquisition module 202 to determine a representative object in the image. According to an embodiment, the depth information use determination module 207 may determine not to use the acquired depth information to determine a representative object in the image when the depth values in the image are polarized. An operation of the depth information use determination module 207 according to an exemplary embodiment will be described later with reference to FIGS. 12 to 14.
  • the region of interest and background separation module 208 may separate the region of interest determined by the region of interest determination module 206 from the background, which is the region of the image other than the region of interest.
  • the effect application module 209 may apply the effect to the image by adjusting the degree of the effect based on depth information on the region of interest and the background region separated by the region of interest and background separation module 208.
  • the effect may be, for example, a bokeh effect, a blur effect, or the like.
  • the effect may further include at least one of a spin effect, a zoom effect, a motion blur effect, a path blur effect, and a tilt shift effect.
  • the spin effect may refer to a bokeh effect in which a pattern of concentric circles is seen around a specific subject (region of interest).
  • the zoom effect may refer to a photo effect that appears in a captured image when the camera captures an image while performing a zoom-in or zoom-out operation around a specific subject (or region of interest).
  • the motion blur effect may refer to an effect of applying blur according to a movement direction when a specific subject moves at the time of photographing.
  • the effect application module 209 may apply the effect to the background region by adjusting the intensity of blur to increase as the depth value increases, based on the depth information on the region of interest and the background region separated by the region of interest and background separation module 208. That is, the electronic device may apply a higher-intensity blur effect to portions of the separated background region that are farther from the camera.
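  • The depth-dependent blur described above can be sketched by blending between the sharp image and progressively stronger Gaussian blurs; the number of levels and the kernel sizes below are illustrative assumptions:

```python
import cv2
import numpy as np

def depth_scaled_bokeh(image, depth_map, roi_mask, levels=4):
    """Blur background pixels more strongly the farther they are, in
    depth, from the region of interest; the ROI itself stays sharp."""
    rep_depth = depth_map[roi_mask].mean()
    diff = np.abs(depth_map.astype(np.float32) - rep_depth)
    strength = np.clip(diff / max(diff.max(), 1e-6), 0.0, 1.0)
    strength[roi_mask] = 0.0  # keep the region of interest sharp

    # Precompute blur levels, then pick one per pixel by strength.
    layers = [image.astype(np.float32)]
    for i in range(1, levels):
        k = 8 * i + 1  # odd Gaussian kernel sizes: 9, 17, 25, ...
        layers.append(cv2.GaussianBlur(image, (k, k), 0).astype(np.float32))
    idx = np.minimum((strength * levels).astype(int), levels - 1)
    out = np.zeros_like(layers[0])
    for i in range(levels):
        out[idx == i] = layers[i][idx == i]
    return out.astype(np.uint8)
```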
  • FIG. 3 is a flowchart 300 illustrating an operation of an electronic device according to an exemplary embodiment. An operation of the electronic device described below may be performed by a processor (eg, the processor 120 of FIG. 1 ) of the electronic device (eg, the electronic device 101 of FIG. 1 ).
  • the electronic device (eg, the electronic device 101 of FIG. 1) may acquire a photographed image from a camera module (eg, the camera module 180 of FIG. 1).
  • the image may include at least one of a still image (eg, a photo, an image, etc.) and a moving image.
  • an electronic device may acquire depth information with respect to the acquired image.
  • According to an embodiment, an electronic device may acquire depth information of an image through deep learning, a time-of-flight (TOF) sensor, a stereo camera, an image sensor including a multi-photodiode (multi-PD), a structured-light image sensor, or the like.
  • the electronic device may obtain subject information from the image.
  • subject information according to an exemplary embodiment will be described with reference to FIG. 4 .
  • FIG. 4 is a diagram illustrating an original image 410 and subject information 420 acquired by an electronic device according to an exemplary embodiment.
  • the electronic device may obtain the subject information 420 by learning based on information (eg, pixel positions) of a subject area included in the original image 410 and RGB information of the corresponding area in the image.
  • the subject information 420 may include information on at least one of a subject area included in the image and a subject type. For example, when the subject is a person, location information on regions such as the face or body in the image, and the color and brightness information located therein, may be learned, and the subject information may be obtained through inference that the subject is a person.
  • According to an embodiment, the electronic device may obtain the subject information 420 by learning with the original image 410 and a subject information image (eg, a mask map image including the characteristics of the subject). Although FIG. 4 shows a case in which the subject is a person, the subject may vary according to an embodiment, such as a person, an animal, or a plant.
  • the electronic device may acquire opacity information for the acquired image.
  • the opacity information may correspond to an alpha map in which the original image is expressed as a value corresponding to a specific range (eg, 0 to 255).
  • a range of a value representing the opacity information may be different depending on the embodiment.
  • one pixel in the foreground may have an opacity of 255, and one pixel in the background may have an opacity of 0.
  • the electronic device may acquire opacity information of an image through deep learning or the like.
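  • The document obtains the opacity information through deep learning without specifying a model; the sketch below shows only the post-processing step of converting a matting network's floating-point output into the 0-to-255 alpha map described here (`matting_model` is a hypothetical stand-in):

```python
import numpy as np

def to_alpha_map(matte: np.ndarray) -> np.ndarray:
    """Convert a per-pixel foreground probability in [0, 1], assumed
    to come from some unspecified matting network, into the 8-bit
    alpha map (0-255) described in this document."""
    return np.clip(matte * 255.0, 0, 255).astype(np.uint8)

# Usage sketch: alpha = to_alpha_map(matting_model(image))
```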
  • FIG. 5 is a diagram 500 for explaining a method for an electronic device to obtain opacity information according to an exemplary embodiment.
  • the electronic device may acquire the boundary area 502 of the subject 501 from the acquired subject information.
  • the electronic device may obtain opacity values for the boundary area 502 of the subject 501 .
  • the electronic device may acquire opacity information on the boundary region 502 through deep learning.
  • the electronic device may acquire opacity information (eg, an alpha map) representing the original image as a value corresponding to a specific range (eg, 0 to 255).
  • a range of a value representing the opacity information may be different depending on the embodiment.
  • According to an embodiment, the electronic device may obtain the opacity information (eg, an alpha map) by learning with the original image and an image for the opacity information (eg, an image including the features of the boundary region).
  • the opacity value may be a value corresponding to a specific range (eg, 0 to 255), so the boundary region 502 of the subject 501 can be expressed more precisely than with binary information expressed as a value of 0 or 1.
  • the electronic device may separate the subject from the background in the image based on the acquired opacity information.
  • the electronic device may separate the subject and the background from the image based on the acquired opacity information. Separating the subject according to an embodiment may be dividing the boundary between the subject and the background in the image.
  • FIG. 6 is a diagram illustrating maps 610 and 620 showing a result of an electronic device separating a subject and a background from an image according to an exemplary embodiment.
  • FIG. 7 is a diagram illustrating images 710 and 720 showing a result of an electronic device applying an effect to a background separated from a subject in an image according to an exemplary embodiment.
  • the electronic device may separate the subject and the background from the image based on the acquired opacity information.
  • the first map 610 may be a map in which the subject 611 and the background 612 are separated based on binary information.
  • the binary information may be information in which the boundary between the subject 611 and the background 612 is expressed as a value of 0 or 1. Accordingly, with separation based on binary information, it may be difficult to precisely separate a fine boundary portion such as the hair portion 613 of the subject 611.
  • the second map 620 may be a map in which the subject 621 and the background 622 are separated based on opacity.
  • the opacity information may be information in which the boundary between the subject 621 and the background 622 is expressed as a value in a specific range (eg, 0 to 255). Accordingly, separation based on opacity can separate fine boundary portions, such as the hair portion 623 of the subject 621, more precisely and naturally than when binary information is used.
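  • The benefit of graded opacity over a 0/1 mask is visible in the standard compositing formula, out = alpha * sharp + (1 - alpha) * blurred, where fractional alpha lets strands of hair blend smoothly; a minimal sketch:

```python
import numpy as np

def composite(image, blurred, alpha_map):
    """Blend the sharp image over the blurred background per pixel.
    With a binary mask, alpha is only 0 or 1 and hair-like boundaries
    look cut out; with 0-255 opacity they blend naturally."""
    a = (alpha_map.astype(np.float32) / 255.0)[..., None]  # H x W x 1
    out = a * image.astype(np.float32) + (1.0 - a) * blurred.astype(np.float32)
    return out.astype(np.uint8)
```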
  • the electronic device may apply an effect (eg, a bokeh effect) to a background area separated from the subject in the image.
  • the electronic device may determine a representative object of the image based on the acquired depth information and the acquired opacity information. According to an embodiment, the electronic device may determine the representative object based on how close the subject is to the camera, the size of the subject in the image, and the like. According to an embodiment, the electronic device may determine the subject closest to the camera (the subject having the smallest depth value) as the representative object. According to an embodiment, the electronic device may determine the subject having the largest size in the image as the representative object.
  • since the electronic device determines the representative object of the image in consideration of the acquired opacity information as well as the acquired depth information, the boundary of the representative object region may be determined more clearly. According to an embodiment, since the electronic device identifies a subject in consideration of opacity information obtained using deep learning, the background, sub-subjects, and the main subject can be identified with high accuracy, and since the representative object of the image is determined based on this, the representative object intended by the photographer can be identified with higher accuracy.
  • the electronic device may determine an ROI of the image.
  • the electronic device may determine the ROI of the image based on the acquired depth information and the acquired opacity information.
  • the electronic device may determine the ROI based on the determined representative object.
  • the electronic device may determine, as the region of interest, a region including objects whose depth-value difference from the determined representative object is within a specific threshold value.
  • FIG. 8 is a diagram illustrating depth information 800 for the original image 410 of FIG. 4 .
  • It can be seen that the depth information 800 expresses the object 801 having the smallest depth value among the five people in the original image most vividly, and expresses the other objects 802 as gradually more blurred according to their depth values. Accordingly, when the region of interest is determined using only the depth information 800, the object 801 having the smallest depth value is determined as the region of interest, and all of the other objects 802 may be excluded from the region of interest. In this case, although the photographer's intention is that all five people are the region of interest, all but the object 801 having the smallest depth value are treated as background regions, so that an effect (eg, a bokeh effect) may be applied to them. Accordingly, a picture in which at least some of the objects of interest (the five people) are blurred may be formed, contrary to the intention of the photographer.
  • Furthermore, when the region of interest is determined using only the depth information 800, the boundary between objects is ambiguous, and a portion of one object may be excluded from the region of interest. For example, a part of a person, such as a hand, may be treated as a background area, and an effect (eg, a bokeh effect) may be applied to it. That is, when the electronic device determines the region of interest, using only the depth information may be insufficient to determine the actual region of interest intended by the photographer.
  • the electronic device may determine the region of interest of the image in consideration of both the acquired depth information and the acquired opacity information in operation 307 .
  • the electronic device may separate the identified region of interest from a background region that is a region other than the region of interest. Separating the region of interest and the background region according to an embodiment may be dividing the boundary between the region of interest and the background region in an image.
  • the electronic device may apply an effect to the separated background area.
  • the effect may be at least one of a bokeh effect, a blur effect, and a mosaic effect.
  • the effect may further include at least one of a spin effect, a zoom effect, a motion blur effect, a path blur effect, and a tilt shift effect.
  • the electronic device may apply the effect to the background region by adjusting the degree of the effect based on depth information on the separated background region.
  • the electronic device may apply the bokeh effect to the background region by adjusting the intensity of blur to increase as the depth value increases based on the depth information on the separated background region.
  • the electronic device may apply the bokeh effect to the background region by adjusting the intensity of blur to increase as the difference between the depth value of the background region and the depth value of the region of interest increases, based on the depth information on the separated background region.
  • FIG. 9 is a diagram illustrating images 910 and 920 formed by applying an effect to the original image 410 of FIG. 4 by the electronic device.
  • a first image 910 may be an image formed by the electronic device according to the operation of FIG. 3 described above.
  • According to an embodiment, the electronic device may determine the representative object 911 based on the opacity information and the depth information in the original image 410 of FIG. 4, determine a region of interest including object 2 912, object 3 913, object 4 914, and object 5 915 based on the representative object 911, and form the first image 910 in which a bokeh effect is applied to the background region excluding the region of interest.
  • since the electronic device applies a blur effect to the image in consideration of not only the depth information but also the opacity information, the representative object 911, object 2 912, object 3 913, object 4 914, and object 5 915 may be maintained in clear image quality, and only the other regions may be blurred to form the image.
  • According to an embodiment, the electronic device may substitute the depth values of object 2 912, object 3 913, object 4 914, and object 5 915 included in the region of interest with the average depth value of the representative object 911, or with a value within a certain range of the average depth value of the representative object 911.
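  • One way to realize the substitution described above, as a hedged sketch (the clamping margin is a hypothetical choice):

```python
import numpy as np

def flatten_roi_depth(depth_map, roi_mask, rep_mask, margin=5):
    """Replace depth values inside the region of interest with the
    representative object's average depth, or clamp them to within
    `margin` of it, so all subjects in the ROI are treated as lying
    at the same distance."""
    rep_avg = depth_map[rep_mask].mean()
    out = depth_map.astype(np.float32).copy()
    out[roi_mask] = np.clip(out[roi_mask], rep_avg - margin, rep_avg + margin)
    return out
```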
  • the intensity of the effect applied to the representative object 911, object 2 912, object 3 913, object 4 914, and object 5 915 of the first image 910 may be the same. For example, an effect of intensity 0 may be applied to the representative object 911, object 2 912, object 3 913, object 4 914, and object 5 915 of the first image 910.
  • the second image 920 may be an image formed by a conventional electronic device.
  • In contrast, the conventional electronic device determines the representative object 911 based only on the depth information in the original image 410 of FIG. 4, so that the second image 920 may be formed with a bokeh effect applied even to at least a portion of object 2 912, object 3 913, object 4 914, and object 5 915, which are actually main subjects. That is, because the electronic device fails to recognize the main subjects actually intended by the photographer, at least a portion of object 2 912, object 3 913, object 4 914, and object 5 915 may be blurred together with the background area.
  • the flowchart of FIG. 3 corresponds to an example, and some orders may be omitted, changed, or merged according to embodiments.
  • the electronic device may capture an image with a single camera.
  • This may be a case in which the electronic device does not include a TOF sensor or a plurality of cameras, or a case in which the electronic device includes a plurality of cameras but simply captures an image with a single camera.
  • In this case, the accuracy of classifying objects, object by object, in a captured image using only the depth information obtained by a single camera may be low, so correction may be required.
  • the electronic device may further acquire opacity information of the image and use it to determine a region of interest that more closely matches the photographer's intention. That is, in the case of the above-described embodiment, by using both depth information and opacity information, it is possible to determine the main subject intended by the photographer with high accuracy even when an image is captured with only a single camera.
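  • Putting the pieces together, a single-camera pipeline along these lines might look like the following sketch; all helpers are the hypothetical ones sketched in this document (`should_use_depth` is sketched later alongside FIGS. 12 to 14), and `estimate_depth` and `estimate_alpha` stand in for unspecified deep-learning models:

```python
def apply_background_effect(image, estimate_depth, estimate_alpha):
    """End-to-end sketch: opacity plus (optional) depth -> region of
    interest -> depth-scaled bokeh -> alpha compositing."""
    depth_map = estimate_depth(image)   # hypothetical model call
    alpha_map = estimate_alpha(image)   # hypothetical model call
    subjects = separate_subjects(alpha_map)
    if should_use_depth(depth_map):
        rep = pick_representative(subjects, depth_map, image.shape)
        roi = region_of_interest(subjects, depth_map, rep)
    else:
        # Depth histogram is polarized: fall back to opacity alone and
        # take the largest subject as the region of interest.
        roi = max(subjects, key=lambda m: m.sum())
    blurred = depth_scaled_bokeh(image, depth_map, roi)
    return composite(image, blurred, alpha_map)
```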
  • FIG. 10 is a diagram illustrating images 1000, 1010, and 1020 for explaining an operation of an electronic device according to an exemplary embodiment.
  • FIG. 11 is a diagram illustrating an image 1100 formed by an electronic device according to an exemplary embodiment.
  • An operation of the electronic device described below may be performed by a processor (eg, the processor 120 of FIG. 1 ) of the electronic device (eg, the electronic device 101 of FIG. 1 ).
  • According to an embodiment, the electronic device may acquire an original image 1000 captured from a camera module (eg, the camera module 180 of FIG. 1) of the electronic device.
  • the image may include at least one of a still image (eg, a photo, an image, etc.) and a moving image.
  • the original image 1000 may include an object 1 1001 , an object 2 1002 , and a background (road, tree, etc.).
  • the electronic device may obtain the opacity information 1010 from the original image 1000 .
  • the electronic device may acquire the opacity information 1010 of the image through deep learning or the like.
  • the opacity information 1010 may correspond to a map in which the original image 1000 is expressed as a value corresponding to a specific range (eg, 0 to 255).
  • the electronic device may identify the object 1 1011 and the object 2 1012, which are regions having an opacity value greater than or equal to a threshold value in the opacity information 1010 , as subjects.
  • the object 1 1011 and the object 2 1012 may have an opacity value of 255, and other regions may have an opacity value of 0.
  • the electronic device may recognize object 1 1011 and object 2 1012 having an opacity value of 255 as subjects.
  • the electronic device may separate the subjects 1011 and 1012 from the background in the image based on the acquired opacity information 1010 . According to an embodiment, the electronic device may separate the subjects 1011 and 1012 and the background from the image based on the acquired opacity information. Separating the subject according to an embodiment may be dividing the boundary between the subject and the background in the image.
  • the electronic device may acquire depth information 1020 from the original image 1000 .
  • the depth information 1020 may correspond to a map in which the original image 1000 is expressed as a depth value.
  • In the depth information 1020, the electronic device may express object 1 1021, which has the smallest depth value, most clearly, and may express object 2 1022, which is located farther from the camera than object 1 1021, as more blurred than object 1 1021.
  • the electronic device may determine the representative object of the image based on the acquired depth information 1020 and the acquired opacity information 1010. According to an embodiment, the electronic device may determine the representative object based on how close the subject is to the camera, the size of the subject in the image, and the like. According to an embodiment, the electronic device may determine object 1 1001, the subject closest to the camera, as the representative object. According to an embodiment, the electronic device may determine object 1 1001, the subject having the largest size in the image, as the representative object.
  • the electronic device may determine the region of interest of the original image 1000. According to an embodiment, the electronic device may determine the region of interest of the original image 1000 based on the acquired depth information 1020 and the acquired opacity information 1010. According to an embodiment, the electronic device may determine the region of interest based on the determined representative object, object 1 1001. According to an embodiment, the electronic device may determine, as the region of interest, a region including objects whose depth-value difference from the determined representative object, object 1 1001, is within a specific threshold value. According to an embodiment, since the difference between the depth values of object 1 1001 and object 2 1002 is equal to or greater than the specific threshold, the electronic device may determine that object 2 1002 does not correspond to the region of interest. According to an embodiment, the electronic device may separate the identified region of interest, object 1 1001, from the background region, which is the remaining region.
  • the electronic device may apply an effect to the separated background area.
  • the effect may be at least one of a bokeh effect, a blur effect, and a mosaic effect.
  • the effect may further include at least one of a spin effect, a zoom effect, a motion blur effect, a path blur effect, and a tilt shift effect.
  • the electronic device may apply the effect to the background region by adjusting the degree of the effect based on depth information on the separated background region.
  • the electronic device may apply the bokeh effect to the background region by adjusting the intensity of blur to increase as the depth value increases based on the depth information on the separated background region.
  • the electronic device may form an image 1100 to which a bokeh effect is applied to a region other than the object 1 1101 determined as the ROI.
  • Since the difference in depth between object 2 1102 and object 1 1101 is determined to be equal to or greater than a threshold value, object 2 1102 may be blurred together with the background area. Accordingly, it is possible to identify the main subject actually intended by the photographer with higher accuracy and to form an image in which the area excluding the main subject is blurred.
  • FIG. 12 is a flowchart 1200 illustrating an operation of an electronic device according to an exemplary embodiment.
  • FIG. 13 is a diagram illustrating images 1310 and 1320 for explaining an operation of an electronic device according to an exemplary embodiment.
  • FIG. 14 is a diagram illustrating a histogram 1400 formed by an electronic device according to an exemplary embodiment.
  • An operation of the electronic device described below may be performed by a processor (eg, the processor 120 of FIG. 1 ) of the electronic device (eg, the electronic device 101 of FIG. 1 ).
  • According to an embodiment, the electronic device (eg, the electronic device 101 of FIG. 1) may acquire an original image 1310 from a camera module (eg, the camera module 180 of FIG. 1).
  • the electronic device may acquire depth information 1320 with respect to the acquired original image 1310 .
  • a method for the electronic device to obtain depth information according to an embodiment may be the same as the method described in the above-described embodiment.
  • the electronic device may obtain subject information from the image.
  • a method for the electronic device to obtain subject information according to an embodiment may be the same as the method described in the above-described embodiment.
  • the electronic device may acquire opacity information for the acquired image.
  • the electronic device may acquire opacity information of an image through deep learning or the like.
  • the opacity information may correspond to a map representing the original image 1310 as a value corresponding to a specific range (eg, 0 to 255).
  • the electronic device may separate the object 1 1311 , which is the subject, from the background area 1312 in the original image 1310 based on the acquired opacity information. Separating the subject according to an embodiment may be dividing the boundary between the subject and the background in the image.
  • the electronic device may determine whether to use the acquired depth information 1320 to determine a representative object in the original image 1310 .
  • According to an embodiment, the electronic device may form a histogram 1400 as shown in FIG. 14, and may determine, based on the histogram 1400, whether to use the depth information 1320 to determine a representative object in the original image 1310.
  • the electronic device may form a histogram 1400 based on the acquired depth information.
  • the x-axis of the histogram 1400 may indicate a depth value (eg, 0 to 255), and the y-axis may indicate the number of pixels having the corresponding depth value. Since the proportion of object 1 1311 in the original image 1310 is large and the depth values of the background region 1312 do not differ significantly from each other, the histogram 1400 may indicate that the depth values in the image are polarized.
  • the histogram 1400 may be divided into three regions, (1) a region of interest, (2) a middle region, and (3) a background region, based on two axes 1401 and 1402 .
  • the electronic device may determine that the use of the depth information is meaningless because the middle region (2) of the histogram 1400 hardly exists and only a foreground and a background exist. Accordingly, the electronic device may determine not to use the depth information 1320 to determine the representative object in the image. Through this, the electronic device can prevent side effects caused by the use of unnecessary depth information.
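  • A hedged sketch of the decision illustrated by FIG. 14: build a depth histogram, split it at two axis positions, and skip depth information when the middle region is nearly empty (all threshold values below are illustrative assumptions):

```python
import numpy as np

def should_use_depth(depth_map, low=85, high=170, min_middle_ratio=0.05):
    """Return False when the depth histogram is polarized, i.e. when
    almost no pixels fall between the two axes (`low`, `high`), so
    only a foreground and a background exist."""
    hist, _ = np.histogram(depth_map, bins=256, range=(0, 256))
    middle = hist[low:high].sum()
    return middle / hist.sum() >= min_middle_ratio
```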
  • According to an embodiment, the electronic device may determine a representative object of the image based on the acquired opacity information, determine a region of interest based on the representative object, and separate the identified region of interest from the background region, which is the region other than the region of interest.
  • the method of determining the representative object, the method of determining the region of interest, and the method of separating the region of interest and the background region may be the same as those described in the above-described embodiments. In this case, however, the electronic device may use the opacity information without the depth information.
  • the electronic device may apply an effect to the separated background area 1312 .
  • the effect may be at least one of a bokeh effect, a blur effect, and a mosaic effect.
  • the electronic device may identify the main subject intended by the photographer with higher accuracy and apply an effect, such as an actual optical effect, based on the identified main subject.
  • the electronic device may exclude the use of the depth information to prevent side effects caused by the unnecessary use of depth information.
  • since the electronic device separates the boundary region of the subject by acquiring the opacity information, it is possible to apply the effect by precisely separating fine boundary regions, such as hair and fur, as with an actual optical effect.
  • An electronic device includes a camera, a memory, and a processor connected to the camera and the memory, wherein the memory stores one or more instructions that, when executed, cause the processor to acquire opacity information of an image acquired from the camera, acquire depth information of the image, determine a region of interest of the image based on the acquired depth information and the acquired opacity information, and apply a first effect to a background region other than the region of interest in the image.
  • an area having an opacity value greater than or equal to a threshold value in the image may be identified as a subject based on the acquired opacity information.
  • the identified subject may be separated from the image based on opacity information on the boundary area of the identified subject.
  • the opacity information may be acquired using deep learning.
  • a representative object of the image may be determined based on the acquired depth information and the acquired opacity information.
  • the representative object may be determined in consideration of a depth value or a size.
  • the region of interest of the image may be determined in consideration of at least one of a depth value, an object type, or a size.
  • depth values of pixels included in the ROI may be replaced with the determined average depth value of the representative object, respectively.
  • the region of interest and the background region may be separated from the image, and the first effect may be applied to the background region by adjusting the intensity of the first effect based on depth information of the background region.
  • the first effect may be at least one of a bokeh effect, a blur effect, and a mosaic effect.
  • A method of operating an electronic device including a camera includes acquiring opacity information of an image acquired from the camera, acquiring depth information of the image, determining a region of interest of the image based on the acquired depth information and the acquired opacity information, and applying a first effect to a background region other than the region of interest in the image.
  • an area having an opacity value greater than or equal to a threshold value in the image may be identified as a subject based on the acquired opacity information.
  • the identified subject may be separated from the image based on opacity information on the boundary area of the identified subject.
  • the opacity information may be acquired using deep learning.
  • a representative object of the image may be determined based on the acquired depth information and the acquired opacity information.
  • the representative object may be determined in consideration of a depth value or a size.
  • the region of interest of the image may be determined in consideration of a depth value and a type or size of the object.
  • the average depth value of the ROI may be replaced with the determined average depth value of the representative object.
  • the region of interest and the background region may be separated from the image, and the first effect may be applied to the background region by adjusting the intensity of the first effect based on depth information of the background region.
  • the first effect may be at least one of a bokeh effect, a blur effect, or a mosaic effect.
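
Putting the claimed operations in order, an end-to-end driver reusing the helpers sketched above could look like this; `estimate_opacity` and `estimate_depth` are hypothetical stand-ins for the deep-learning matting model and the device's depth source, and treating the thresholded subject region as the region of interest is an assumption.

    def apply_first_effect(image, estimate_opacity, estimate_depth):
        """Sketch of the claimed method: opacity -> depth -> ROI -> background effect."""
        alpha = estimate_opacity(image)          # acquire opacity information
        depth = estimate_depth(image)            # acquire depth information
        subject_mask = identify_subject(alpha)   # threshold on opacity
        rep_mask = pick_representative_object(subject_mask, depth)
        roi_mask = subject_mask                  # assumed: ROI = subject region
        flat_depth = flatten_roi_depth(depth, roi_mask, rep_mask)
        return blur_background(image, flat_depth, roi_mask)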
  • the electronic device may be one of various types of devices.
  • the electronic device may include, for example, a portable communication device (e.g., a smartphone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance.
  • each of such phrases as “A or B”, “at least one of A and B”, “at least one of A or B”, “A, B, or C”, “at least one of A, B, and C”, and “at least one of A, B, or C” may include any one of, or all possible combinations of, the items enumerated together in the corresponding one of the phrases.
  • Terms such as “first” and “second” may simply be used to distinguish a component from other such components, and do not limit the components in other aspects (e.g., importance or order).
  • if one (e.g., a first) component is referred to as “coupled” or “connected” to another (e.g., a second) component, with or without the terms “functionally” or “communicatively”, it means that the one component can be connected to the other component directly (e.g., by wire), wirelessly, or through a third component.
  • the term “module” may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as, for example, logic, logic block, component, or circuit.
  • a module may be a single integral component, or a minimum unit or part thereof, that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • Various embodiments of this document may be implemented as software including one or more instructions stored in a storage medium (e.g., internal memory 136 or external memory 138) readable by a machine (e.g., the electronic device 101). For example, a processor (e.g., the processor 120) of the machine (e.g., the electronic device 101) may invoke at least one of the one or more instructions stored in the storage medium and execute it, allowing the machine to perform at least one function according to the invoked instruction.
  • the one or more instructions may include code generated by a compiler or code executable by an interpreter.
  • the device-readable storage medium may be provided in the form of a non-transitory storage medium.
  • here, 'non-transitory storage medium' only means that the storage medium is a tangible device and does not contain a signal (e.g., an electromagnetic wave); the term does not distinguish between the case where data is stored semi-permanently in the storage medium and the case where data is stored temporarily therein.
  • the 'non-transitory storage medium' may include a buffer in which data is temporarily stored.
  • the method according to various embodiments disclosed in this document may be provided as included in a computer program product.
  • Computer program products may be traded between sellers and buyers as commodities.
  • the computer program product may be distributed in the form of a machine-readable storage medium (e.g., a compact disc read-only memory (CD-ROM)), or distributed (e.g., downloaded or uploaded) online via an application store (e.g., Play Store™) or directly between two user devices (e.g., smartphones).
  • in the case of online distribution, at least a portion of the computer program product (e.g., a downloadable app) may be temporarily stored, or temporarily created, in a machine-readable storage medium such as a memory of the manufacturer's server, a server of the application store, or a relay server.
  • each component (e.g., a module or a program) of the above-described components may include a single entity or multiple entities.
  • one or more of the above-described components or operations may be omitted, or one or more other components or operations may be added.
  • a plurality of components (e.g., modules or programs) may be integrated into a single component; in such a case, the integrated component may perform one or more functions of each of the plurality of components in the same or a similar manner as they were performed by the corresponding component among the plurality of components before the integration.
  • operations performed by a module, a program, or another component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Multimedia (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Signal Processing (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Studio Devices (AREA)

Abstract

According to an embodiment of the present disclosure, an electronic device comprises: a camera; a memory; and a processor connected to the camera and the memory, wherein the memory may store therein one or more instructions which, when executed, cause the processor to obtain opacity information of an image obtained by the camera, obtain depth information of the image, determine a region of interest in the image on the basis of the obtained depth information and the obtained opacity information, and apply a first effect to a background region outside the region of interest in the image.
PCT/KR2021/001526 2020-02-10 2021-02-05 Electronic device including a camera and operating method thereof WO2021162353A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2020-0015985 2020-02-10
KR1020200015985A KR20210101713A (ko) 2020-02-10 2020-02-10 Electronic device including a camera and operating method thereof

Publications (1)

Publication Number Publication Date
WO2021162353A1 true WO2021162353A1 (fr) 2021-08-19

Family

ID=77291583

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/001526 WO2021162353A1 (fr) 2020-02-10 2021-02-05 Electronic device including a camera and operating method thereof

Country Status (2)

Country Link
KR (1) KR20210101713A (fr)
WO (1) WO2021162353A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009021989A (ja) * 2007-06-15 2009-01-29 Fujifilm Corp Image display device and image display method
KR20150049122A (ko) * 2013-10-29 2015-05-08 Samsung Electronics Co., Ltd. Image capturing apparatus and method of generating a bokeh image thereof
KR20160044203A (ko) * 2014-10-15 2016-04-25 POSTECH Academy-Industry Foundation Matting method for extracting a foreground object and apparatus for performing the same
KR20170060498A (ko) * 2015-11-24 2017-06-01 Samsung Electronics Co., Ltd. Image photographing apparatus and method of controlling the same
WO2019103912A2 * 2017-11-22 2019-05-31 Arterys Inc. Content-based image retrieval for lesion analysis

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020057294A1 (fr) 2018-09-21 2020-03-26 Telefonaktiebolaget Lm Ericsson (Publ) Method and apparatus for mobility optimization
US20230091780A1 (en) * 2020-08-04 2023-03-23 Samsung Electronics Co., Ltd. Electronic device and method for generating an image
US12022189B2 (en) * 2020-08-04 2024-06-25 Samsung Electronics Co., Ltd Electronic device and method for generating an image

Also Published As

Publication number Publication date
KR20210101713A (ko) 2021-08-19

Similar Documents

Publication Publication Date Title
WO2020171583A1 (fr) Dispositif électronique pour stabiliser une image et son procédé de fonctionnement
WO2020171553A1 (fr) Dispositif électronique appliquant un effet bokeh à une image, et procédé de commande associé
WO2020204659A1 (fr) Dispositif électronique, procédé et support lisible par ordinateur pour fournir un effet de flou dans une vidéo
WO2019156308A1 (fr) Appareil et procédé d'estimation de mouvement de stabilisation d'image optique
WO2019164185A1 (fr) Dispositif électronique et procédé de correction d'une image corrigée selon un premier programme de traitement d'image, selon un second programme de traitement d'image dans un dispositif électronique externe
WO2020080845A1 (fr) Dispositif électronique et procédé pour obtenir des images
WO2022030855A1 (fr) Dispositif électronique et procédé permettant de générer une image par application d'un effet sur un sujet et un arrière-plan
WO2019066373A1 (fr) Procédé de correction d'image sur la base de catégorie et de taux de reconnaissance d'objet inclus dans l'image et dispositif électronique mettant en œuvre celui-ci
WO2020032497A1 (fr) Procédé et appareil permettant d'incorporer un motif de bruit dans une image sur laquelle un traitement par flou a été effectué
WO2019164288A1 (fr) Procédé de fourniture de données de gestion de traduction de texte associées à une application, et dispositif électronique associé
WO2019142997A1 (fr) Appareil et procédé pour compenser un changement d'image provoqué par un mouvement de stabilisation d'image optique (sio)
WO2019039870A1 (fr) Dispositif électronique capable de commander un effet d'affichage d'image, et procédé d'affichage d'image
WO2020116844A1 (fr) Dispositif électronique et procédé d'acquisition d'informations de profondeur à l'aide au moins de caméras ou d'un capteur de profondeur
WO2020209492A1 (fr) Caméra pliée et dispositif électronique le comprenant
WO2020032383A1 (fr) Dispositif électronique permettant de fournir un résultat de reconnaissance d'un objet externe à l'aide des informations de reconnaissance concernant une image, des informations de reconnaissance similaires associées à des informations de reconnaissance, et des informations de hiérarchie, et son procédé d'utilisation
WO2019139404A1 (fr) Dispositif électronique et procédé de traitement d'image correspondante
WO2021162353A1 (fr) Dispositif électronique incluant un appareil photographique et son procédé de fonctionnement
WO2020153738A1 (fr) Dispositif électronique et procédé de connexion d'un nœud de masse à un module de caméra
WO2020085718A1 (fr) Procédé et dispositif de génération d'avatar sur la base d'une image corrigée
WO2021157996A1 (fr) Dispositif électronique et son procédé de traitement d'image
WO2021125875A1 (fr) Dispositif électronique pour fournir un service de traitement d'image à travers un réseau
WO2020190008A1 (fr) Dispositif électronique pour fonction de focalisation auto, et procédé de commande correspondant
WO2020204404A1 (fr) Dispositif électronique et procédé de commande de sortie de sources lumineuses du dispositif électronique
WO2020130579A1 (fr) Procédé de traitement d'image, et dispositif électronique associé
WO2019182359A1 (fr) Dispositif électronique de notification de mise à jour de traitement de signal d'image et procédé de fonctionnement de celui-ci

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21753341

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21753341

Country of ref document: EP

Kind code of ref document: A1