WO2024085673A1 - Electronic device for acquiring multiple exposure images and operating method thereof - Google Patents
- Publication number: WO2024085673A1 (PCT application PCT/KR2023/016224)
- Authority: WIPO (PCT)
Classifications
- H04N23/84 — Camera processing pipelines; components thereof for processing colour signals
- H04N23/88 — Camera processing pipelines for colour balance, e.g. white-balance circuits or colour temperature control
- H04N23/951 — Computational photography systems, e.g. light-field imaging systems, using two or more images to influence resolution, frame rate or aspect ratio
- H04N5/265 — Studio circuits: mixing
- H04N5/77 — Interface circuits between a recording apparatus and a television camera
- H04N5/92 — Transformation of the television signal for recording, e.g. modulation, frequency changing; inverse transformation for playback
Definitions
- This disclosure relates to an electronic device for acquiring multiple exposure images and a method of operating the same.
- With multiple exposure photography, a photo in which several images appear overlapped can be obtained.
- In an electronic device including a digital camera, a plurality of images can be acquired through an image sensor, and a multiple exposure image can be obtained by combining the plurality of images.
- the electronic device may store an image file containing data about the acquired multiple exposure image in a storage medium.
- an electronic device may perform image processing on the image data. For example, an electronic device may apply white balance to image data.
- White balance may mean correcting the color of an object shown in an image by considering the effect that the color of lighting in the shooting environment has on the image. For example, a white object may be photographed as yellow around an incandescent light, and a white object may be photographed as blue in a sunlight environment.
- the electronic device can apply white balance to the captured image to correct the color included in the image data so that white objects appear white.
- the electronic device can also apply white balance to multiple exposure images.
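The channel-gain correction described above can be illustrated with a minimal sketch; the gain values and function names are illustrative assumptions, not the disclosure's implementation:

```python
import numpy as np

def apply_white_balance(image: np.ndarray, gains: tuple) -> np.ndarray:
    """Scale each RGB channel by a per-channel gain and clip to the valid range.

    `image` is an H x W x 3 float array in [0, 1]; `gains` holds the
    per-channel multipliers derived from the estimated illuminant.
    """
    balanced = image * np.asarray(gains, dtype=image.dtype)
    return np.clip(balanced, 0.0, 1.0)

# Example: a scene lit by an incandescent lamp looks yellowish, so the
# correction suppresses red and boosts blue so white objects appear white.
incandescent_gains = (0.8, 1.0, 1.4)      # illustrative values only
pixel = np.array([[[0.9, 0.8, 0.5]]])     # a "white" object tinted yellow
corrected = apply_white_balance(pixel, incandescent_gains)
```

Any gain above 1.0 is clipped at the top of the range, which is why the helper clamps the result rather than returning raw products.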
- An electronic device may include a camera including an image sensor, a memory for storing instructions, and at least one processor.
- the at least one processor may execute the instructions stored in the memory.
- the electronic device may be configured such that the at least one processor acquires a first base image corresponding to first image sensor data obtained from the image sensor, and first metadata related to parameters for performing white balancing on the first base image.
- the electronic device may be configured such that the at least one processor acquires a second base image corresponding to second image sensor data obtained from the image sensor at a different point in time than the first image sensor data, and second metadata related to parameters for performing white balancing on the second base image.
- the electronic device may be configured to have the at least one processor determine reference metadata based on at least some of the first metadata or the second metadata.
- the electronic device may be configured so that the at least one processor obtains a third base image by combining the first base image and the second base image based on parameters corresponding to the reference metadata.
- the electronic device may be configured such that the at least one processor stores an image file generated based on the third base image and the reference metadata.
- a method of operating an electronic device including a camera with an image sensor may include acquiring a first base image and first metadata corresponding to first image sensor data obtained from the image sensor.
- the method may include acquiring a second base image and second metadata corresponding to second image sensor data obtained from the image sensor at a different point in time than the first image sensor data.
- the method may include determining reference metadata based on at least some of the first metadata or the second metadata.
- the method may include an operation of obtaining a third base image by combining the first base image and the second base image based on parameters corresponding to the reference metadata.
- the method may include storing an image file created based on the third base image and the reference metadata.
- a computer-readable recording medium may record a program that, when executed, causes an electronic device to perform a method including: an operation of acquiring a first base image and first metadata corresponding to first image sensor data obtained from the image sensor; an operation of acquiring a second base image and second metadata corresponding to second image sensor data acquired from the image sensor at a different point in time from the first image sensor data; an operation of determining reference metadata based on at least some of the first metadata or the second metadata; an operation of obtaining a third base image by combining the first base image and the second base image based on parameters corresponding to the reference metadata; and an operation of storing an image file created based on the third base image and the reference metadata.
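The claimed flow (two base images with per-capture metadata, a reference determined from that metadata, and a combined third image stored with the reference) can be sketched as follows. The averaging choice and the dictionary layout are illustrative assumptions; the claims only require the reference to be based on at least some of the first or second metadata:

```python
import numpy as np

def determine_reference_metadata(meta1: dict, meta2: dict) -> dict:
    """Derive reference white-balance parameters from the two capture metadata.

    Averaging the per-channel gains is one illustrative choice; picking the
    first capture's gains outright would also satisfy the claim language.
    """
    gains = [(g1 + g2) / 2 for g1, g2 in zip(meta1["wb_gains"], meta2["wb_gains"])]
    return {"wb_gains": tuple(gains)}

def combine_base_images(img1: np.ndarray, img2: np.ndarray, ref_meta: dict) -> np.ndarray:
    """Blend two base images into a third using gains from the reference metadata."""
    gains = np.asarray(ref_meta["wb_gains"])
    blended = (img1 + img2) / 2.0          # simple multiple-exposure blend
    return np.clip(blended * gains, 0.0, 1.0)

meta1 = {"wb_gains": (0.8, 1.0, 1.4)}   # e.g., an incandescent-lit capture
meta2 = {"wb_gains": (1.2, 1.0, 0.8)}   # e.g., a daylight capture
ref = determine_reference_metadata(meta1, meta2)
third = combine_base_images(np.zeros((2, 2, 3)), np.ones((2, 2, 3)), ref)
# The image file would store `third` together with `ref` as its metadata.
```

Storing the reference metadata alongside the combined image lets later post-processing reproduce or undo the white balancing applied to the composite.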
- FIG. 1 is a block diagram of an electronic device in a network environment, according to various embodiments.
- FIG. 2 is a block diagram illustrating a camera module, according to various embodiments.
- FIG. 3 is a block diagram of an electronic device according to one embodiment.
- FIG. 4 is a flowchart illustrating a process in which an electronic device stores an acquired image as an image file, according to an embodiment.
- FIG. 5 is a flowchart illustrating a process in which an electronic device stores an image using inverse white balancing conversion, according to an embodiment.
- FIG. 6 is a flowchart illustrating a process in which an electronic device stores an image by compensating for white balancing, according to an embodiment.
- FIG. 7 is a diagram illustrating an example in which an electronic device acquires a multiple exposure image and renders it, according to an embodiment.
- FIG. 8 is a block diagram illustrating an example in which an electronic device acquires a basic image and a post-processed image from image sensor data, according to an embodiment.
- FIG. 9 is a block diagram illustrating a configuration included in an image file stored by an electronic device according to an embodiment.
- FIG. 10 is a diagram illustrating in module units the configuration of an electronic device that stores an image using inverse white balancing conversion, according to an embodiment.
- FIG. 11 is a diagram illustrating in module units the configuration of an electronic device that stores an image by compensating for white balancing, according to an embodiment.
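The inverse white-balancing conversion mentioned for FIG. 5 can be pictured with a small sketch: each base image's own gains are divided out so both images return to a common color state, and the reference gains are then applied before blending. The function names and gain handling here are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def invert_white_balance(image: np.ndarray, gains: tuple) -> np.ndarray:
    """Undo a previously applied per-channel white-balance gain."""
    return image / np.asarray(gains)

def blend_with_reference(img1, gains1, img2, gains2, ref_gains):
    """Blend two already-balanced images consistently.

    Each input was balanced with its own gains; dividing those gains out
    and multiplying by the reference gains puts both images on the same
    color footing before they are averaged into the composite.
    """
    a = invert_white_balance(img1, gains1) * np.asarray(ref_gains)
    b = invert_white_balance(img2, gains2) * np.asarray(ref_gains)
    return np.clip((a + b) / 2.0, 0.0, 1.0)
```

With identity gains on every input, the helper reduces to a plain average, which is a quick sanity check on the conversion.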
- FIG. 1 is a block diagram of an electronic device 101 in a network environment 100, according to various embodiments.
- In the network environment 100, the electronic device 101 may communicate with the electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or with at least one of the electronic device 104 or the server 108 through a second network 199 (e.g., a long-range wireless communication network). According to one embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
- the electronic device 101 may include a processor 120, memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
- in some embodiments, at least one of these components (e.g., the connection terminal 178) may be omitted from the electronic device 101, or one or more other components may be added.
- in some embodiments, some of these components (e.g., the sensor module 176, the camera module 180, or the antenna module 197) may be integrated into one component (e.g., the display module 160).
- the processor 120 may, for example, execute software (e.g., the program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or computations. According to one embodiment, as at least part of the data processing or computation, the processor 120 may store commands or data received from another component (e.g., the sensor module 176 or the communication module 190) in volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in non-volatile memory 134.
- the processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor), or an auxiliary processor 123 (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor) that can operate independently from or together with the main processor.
- for example, when the electronic device 101 includes the main processor 121 and the auxiliary processor 123, the auxiliary processor 123 may be set to use less power than the main processor 121 or to be specialized for a designated function.
- the auxiliary processor 123 may be implemented separately from the main processor 121 or as part of it.
- the auxiliary processor 123 may control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display module 160, the sensor module 176, or the communication module 190), for example, on behalf of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application execution) state. According to one embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 180 or the communication module 190).
- the auxiliary processor 123 may include a hardware structure specialized for processing artificial intelligence models.
- Artificial intelligence models can be created through machine learning. For example, such learning may be performed in the electronic device 101 itself, on which the artificial intelligence model runs, or may be performed through a separate server (e.g., the server 108).
- Learning algorithms may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but are not limited to these examples.
- An artificial intelligence model may include multiple artificial neural network layers.
- An artificial neural network may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), or a deep Q-network, or a combination of two or more of the above, but is not limited to the examples described above.
- artificial intelligence models may additionally or alternatively include software structures.
- the memory 130 may store various data used by at least one component (e.g., the processor 120 or the sensor module 176) of the electronic device 101. The data may include, for example, input data or output data for software (e.g., the program 140) and instructions related thereto.
- Memory 130 may include volatile memory 132 or non-volatile memory 134.
- the program 140 may be stored as software in the memory 130 and may include, for example, an operating system 142, middleware 144, or application 146.
- the input module 150 may receive commands or data to be used in a component of the electronic device 101 (e.g., the processor 120) from outside the electronic device 101 (e.g., a user).
- the input module 150 may include, for example, a microphone, a mouse, a keyboard, keys (e.g., buttons), or a digital pen (e.g., a stylus pen).
- the sound output module 155 may output sound signals to the outside of the electronic device 101.
- the sound output module 155 may include, for example, a speaker or a receiver. Speakers can be used for general purposes such as multimedia playback or recording playback.
- the receiver can be used to receive incoming calls. According to one embodiment, the receiver may be implemented separately from the speaker or as part of it.
- the display module 160 can visually provide information to the outside of the electronic device 101 (e.g., to a user).
- the display module 160 may include, for example, a display, a hologram device, or a projector, and a control circuit for controlling the device.
- the display module 160 may include a touch sensor configured to detect a touch, or a pressure sensor configured to measure the intensity of force generated by the touch.
- the audio module 170 can convert sound into an electrical signal or, conversely, convert an electrical signal into sound. According to one embodiment, the audio module 170 may acquire sound through the input module 150, or output sound through the sound output module 155 or an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) connected directly or wirelessly to the electronic device 101.
- the sensor module 176 may detect an operating state (e.g., power or temperature) of the electronic device 101 or an external environmental state (e.g., a user state) and generate an electrical signal or data value corresponding to the detected state.
- the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
- the interface 177 may support one or more designated protocols that can be used to connect the electronic device 101 directly or wirelessly with an external electronic device (e.g., the electronic device 102).
- the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
- the connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (e.g., the electronic device 102).
- the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (e.g., a headphone connector).
- the haptic module 179 can convert electrical signals into mechanical stimulation (e.g., vibration or movement) or electrical stimulation that the user can perceive through tactile or kinesthetic senses.
- the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
- the camera module 180 can capture still images and moving images.
- the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
- the power management module 188 can manage power supplied to the electronic device 101.
- the power management module 188 may be implemented as at least a part of, for example, a power management integrated circuit (PMIC).
- the battery 189 may supply power to at least one component of the electronic device 101.
- the battery 189 may include, for example, a non-rechargeable primary battery, a rechargeable secondary battery, or a fuel cell.
- the communication module 190 may support establishment of a direct (e.g., wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (e.g., the electronic device 102, the electronic device 104, or the server 108), and communication through the established channel. The communication module 190 may operate independently of the processor 120 (e.g., an application processor) and may include one or more communication processors that support direct (e.g., wired) communication or wireless communication.
- the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
- among these modules, the corresponding communication module may communicate with the external electronic device 104 through a first network 198 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (WiFi) direct, or infrared data association (IrDA)) or a second network 199 (e.g., a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (e.g., a LAN or WAN)).
- the wireless communication module 192 may use subscriber information (e.g., an International Mobile Subscriber Identity (IMSI)) stored in the subscriber identification module 196 to identify or authenticate the electronic device 101 within a communication network such as the first network 198 or the second network 199.
- the wireless communication module 192 may support a 5G network beyond 4G networks, as well as next-generation communication technology, for example, new radio (NR) access technology.
- NR access technology can support high-speed transmission of high-capacity data (enhanced mobile broadband (eMBB)), minimization of terminal power and access by multiple terminals (massive machine type communications (mMTC)), or high reliability and low latency (ultra-reliable and low-latency communications (URLLC)).
- the wireless communication module 192 may support high frequency bands (e.g., mmWave bands), for example, to achieve high data rates.
- the wireless communication module 192 may use various technologies to secure performance in high frequency bands, for example, beamforming, massive multiple-input and multiple-output (massive MIMO), full-dimensional MIMO (FD-MIMO), array antennas, analog beamforming, or large-scale antennas.
- the wireless communication module 192 may support various requirements specified in the electronic device 101, an external electronic device (e.g., electronic device 104), or a network system (e.g., second network 199).
- the wireless communication module 192 may support a peak data rate (e.g., 20 Gbps or more) for realizing eMBB, loss coverage (e.g., 164 dB or less) for realizing mMTC, or U-plane latency for realizing URLLC.
- the antenna module 197 may transmit signals or power to, or receive them from, the outside (e.g., an external electronic device).
- the antenna module 197 may include an antenna including a radiator made of a conductor or a conductive pattern formed on a substrate (e.g., a printed circuit board (PCB)).
- the antenna module 197 may include a plurality of antennas (e.g., an array antenna). In this case, at least one antenna suitable for the communication method used in a communication network such as the first network 198 or the second network 199 may be selected from the plurality of antennas by, for example, the communication module 190. Signals or power may then be transmitted or received between the communication module 190 and an external electronic device through the selected antenna.
- other components (e.g., a radio frequency integrated circuit (RFIC)) may additionally be formed as part of the antenna module 197.
- the antenna module 197 may form a mmWave antenna module.
- a mmWave antenna module may include a printed circuit board; an RFIC disposed on or adjacent to a first side (e.g., the bottom side) of the printed circuit board and capable of supporting a designated high-frequency band (e.g., a mmWave band); and a plurality of antennas (e.g., an array antenna) disposed on or adjacent to a second side (e.g., the top or a side) of the printed circuit board and capable of transmitting or receiving signals in the designated high-frequency band.
- at least some of the above components may be connected to each other through a communication method between peripheral devices (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (e.g., commands or data) with each other.
- commands or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199.
- each of the external electronic devices 102 and 104 may be the same type as, or a different type from, the electronic device 101.
- all or part of the operations performed in the electronic device 101 may be executed in one or more of the external electronic devices 102, 104, or 108.
- instead of executing a function or service on its own, or in addition to doing so, the electronic device 101 may request one or more external electronic devices to perform at least part of the function or service.
- One or more external electronic devices that have received the request may execute at least part of the requested function or service, or an additional function or service related to the request, and transmit the result of the execution to the electronic device 101.
- the electronic device 101 may process the result as is or additionally and provide it as at least part of a response to the request.
- for this purpose, cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used, for example.
- the electronic device 101 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
- the external electronic device 104 may include an Internet of Things (IoT) device.
- Server 108 may be an intelligent server using machine learning and/or neural networks.
- the external electronic device 104 or server 108 may be included in the second network 199.
- the electronic device 101 may be applied to intelligent services (e.g., smart home, smart city, smart car, or healthcare) based on 5G communication technology and IoT-related technology.
- Electronic devices may be of various types.
- Electronic devices may include, for example, portable communication devices (e.g., smartphones), computer devices, portable multimedia devices, portable medical devices, cameras, wearable devices, or home appliances.
- Electronic devices according to embodiments of this document are not limited to the above-described devices.
- terms such as "first" and "second" may be used simply to distinguish one component from another, and do not limit those components in other respects (e.g., importance or order).
- when one (e.g., first) component is said to be "coupled" or "connected" to another (e.g., second) component, with or without the terms "functionally" or "communicatively," it means that the one component can be connected to the other directly (e.g., by wire), wirelessly, or through a third component.
- the term "module" used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit. A module may be an integrally formed part, or a minimum unit of the part or a portion thereof that performs one or more functions. For example, according to one embodiment, the module may be implemented in the form of an application-specific integrated circuit (ASIC).
- Various embodiments of the present document may be implemented as software (e.g., program 140) including one or more instructions that are stored in a storage medium (e.g., built-in memory 136 or external memory 138) readable by a machine (e.g., electronic device 101).
- the one or more instructions may include code generated by a compiler or code that can be executed by an interpreter.
- a storage medium that can be read by a device may be provided in the form of a non-transitory storage medium.
- 'non-transitory' only means that the storage medium is a tangible device and does not contain signals (e.g., electromagnetic waves); this term does not distinguish between cases where data is stored semi-permanently in the storage medium and cases where it is stored temporarily.
- Computer program products are commodities and can be traded between sellers and buyers.
- the computer program product may be distributed in the form of a machine-readable storage medium (e.g., compact disc read only memory (CD-ROM)), distributed through an application store (e.g., Play Store™), or distributed (e.g., downloaded or uploaded) directly or online between two user devices (e.g., smartphones).
- at least a portion of the computer program product may be at least temporarily stored or temporarily created in a machine-readable storage medium, such as the memory of a manufacturer's server, an application store's server, or a relay server.
- each component (e.g., module or program) of the above-described components may include a singular or plural entity, and some of the plurality of entities may be separately disposed in other components.
- one or more of the components or operations described above may be omitted, or one or more other components or operations may be added.
- in the case of integration, the integrated component may perform one or more functions of each of the plurality of components identically or similarly to those performed by the corresponding component prior to the integration.
- operations performed by a module, program, or other component may be executed sequentially, in parallel, iteratively, or heuristically; one or more of the operations may be executed in a different order or omitted; or one or more other operations may be added.
- FIG. 2 is a block diagram 200 illustrating a camera module 180, according to various embodiments.
- the camera module 180 may include a lens assembly 210, a flash 220, an image sensor 230, an image stabilizer 240, a memory 250 (e.g., buffer memory), or an image signal processor 260.
- the lens assembly 210 may collect light emitted from a subject that is the target of image capture.
- Lens assembly 210 may include one or more lenses.
- the camera module 180 may include a plurality of lens assemblies 210. In this case, the camera module 180 may form, for example, a dual camera, a 360-degree camera, or a spherical camera.
- Some of the plurality of lens assemblies 210 may have the same lens properties (e.g., angle of view, focal length, autofocus, f number, or optical zoom), or at least one lens assembly may have one or more lens properties that are different from those of another lens assembly.
- the lens assembly 210 may include, for example, a wide-angle lens or a telephoto lens.
- the flash 220 may emit light used to enhance light emitted or reflected from a subject.
- the flash 220 may include one or more light emitting diodes (eg, red-green-blue (RGB) LED, white LED, infrared LED, or ultraviolet LED), or a xenon lamp.
- the image sensor 230 may acquire an image corresponding to the subject by converting light emitted or reflected from the subject and transmitted through the lens assembly 210 into an electrical signal.
- the image sensor 230 may include one image sensor selected from among image sensors with different properties, such as an RGB sensor, a black-and-white (BW) sensor, an IR sensor, or a UV sensor; a plurality of image sensors having the same properties; or a plurality of image sensors having different properties.
- Each image sensor included in the image sensor 230 may be implemented using, for example, a charged coupled device (CCD) sensor or a complementary metal oxide semiconductor (CMOS) sensor.
- in response to the movement of the camera module 180 or the electronic device 101 including it, the image stabilizer 240 may move at least one lens included in the lens assembly 210 or the image sensor 230 in a specific direction, or may control the operating characteristics of the image sensor 230 (e.g., adjusting read-out timing). This makes it possible to compensate for at least some of the negative effects of the movement on the image being captured.
- the image stabilizer 240 may detect such movement of the camera module 180 or the electronic device 101 using a gyro sensor (not shown) or an acceleration sensor (not shown) disposed inside or outside the camera module 180.
- the image stabilizer 240 may be implemented as, for example, an optical image stabilizer.
- the memory 250 may at least temporarily store at least a portion of the image acquired through the image sensor 230 for a subsequent image processing task. For example, when image acquisition is delayed due to the shutter, or when multiple images are acquired at high speed, the acquired original image (e.g., a Bayer-patterned image or a high-resolution image) may be stored in the memory 250, and a corresponding copy image (e.g., a low-resolution image) may be previewed through the display module 160. Thereafter, when a specified condition is satisfied (e.g., user input or system command), at least a portion of the original image stored in the memory 250 may be acquired and processed, for example, by the image signal processor 260. According to one embodiment, the memory 250 may be configured as at least part of the memory 130 or as a separate memory that operates independently.
- the image signal processor 260 may perform one or more image processes on an image acquired through the image sensor 230 or an image stored in the memory 250.
- the one or more image processes may include, for example, depth map creation, three-dimensional modeling, panorama creation, feature point extraction, image compositing, or image compensation (e.g., noise reduction, resolution adjustment, brightness adjustment, blurring, sharpening, or softening). Additionally or alternatively, the image signal processor 260 may perform control (e.g., exposure time control or read-out timing control) of at least one of the components included in the camera module 180 (e.g., the image sensor 230).
- the image processed by the image signal processor 260 may be stored back in the memory 250 for further processing.
- the image signal processor 260 may be configured as at least a part of the processor 120, or as a separate processor that operates independently of the processor 120. When configured as a separate processor, at least one image processed by the image signal processor 260 may be displayed through the display module 160 as is, or after additional image processing by the processor 120.
- the electronic device 101 may include a plurality of camera modules 180, each with different properties or functions.
- at least one of the plurality of camera modules 180 may be a wide-angle camera, and at least another one may be a telephoto camera.
- at least one of the plurality of camera modules 180 may be a front camera, and at least another one may be a rear camera.
- Figure 3 is a block diagram of an electronic device 101 according to an embodiment.
- the electronic device 101 may include a display 310 (e.g., the display module 160 of FIG. 1), a camera module 320 (e.g., the camera module 180 of FIG. 2), a memory 330 (e.g., the memory 130 of FIG. 1 and the memory 250 of FIG. 2), and at least one processor 340 (e.g., the processor 120 of FIG. 1).
- the camera module 320 may include an image sensor 230 and an image signal processor 260.
- FIG. 3 is for explaining one embodiment, and some components shown in FIG. 3 may be replaced with other components or omitted.
- At least one processor 340 may execute a camera application for capturing images. At least one processor 340 may execute a multi-capturing mode in which images captured based on a camera application are composited with other images and provided. For example, at least one processor 340 may execute a multi-capture mode in response to a user input for selecting a multi-capture mode menu included in a user interface provided by a camera application.
- At least one processor 340 may acquire a first base image.
- 'basic image' may refer to image data before image processing is performed to represent brightness and color perceived by human vision.
- the basic image may include image data acquired while, or as a result of, the image signal processor 260 performing a pre-processing process and a demosaic process on the image sensor data output from the image sensor 230.
- the base image may be composed in a raw file format.
- the base image may be configured as a raw image in the digital negative (DNG) format defined by Adobe™.
- the electronic device 101 may render the basic image by performing an additional operation using the basic image and metadata for the basic image. For example, the electronic device 101 may perform demosaicing or white balancing on the basic image based on metadata.
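As a hedged illustration of this rendering step, the sketch below models white balancing as per-channel gains carried in hypothetical metadata; the field name `wb_gains` and the gain model are illustrative assumptions, not the actual DNG tag layout.

```python
# Illustrative sketch only: the metadata layout and gain model are assumed,
# not taken from the actual DNG specification.

def apply_white_balance(pixel, metadata):
    """Scale each channel of an (R, G, B) pixel by the per-channel
    white-balance gains carried in the image's metadata."""
    gains = metadata["wb_gains"]  # hypothetical field derived from color temperature
    return tuple(min(channel * gain, 1.0) for channel, gain in zip(pixel, gains))

# A raw gray pixel captured under a warm light source: the red channel is
# attenuated and the blue channel boosted to neutralize the color cast.
metadata = {"wb_gains": (0.5, 1.0, 1.5)}
print(apply_white_balance((0.5, 0.5, 0.5), metadata))  # (0.25, 0.5, 0.75)
```

In a real pipeline this step would operate on demosaiced sensor data; the scalar per-channel model is kept only to make the later white-balancing arithmetic easy to follow.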
- the image signal processor 260 may generate metadata including image sensor data transmitted from the image sensor 230 or information acquired in the process of processing the image sensor data.
- metadata may include a white balancing parameter value corresponding to a color temperature for performing white balancing on a basic image obtained based on image sensor data.
- for example, the image signal processor 260 may obtain first metadata, for the first basic image acquired based on the first image sensor data, containing values for performing white balancing based on a color temperature related to a red light source.
- methods for acquiring a basic image may be implemented in various ways.
- at least one processor 340 may receive a user input for selecting a shooting button on a user interface displayed through the display 310 while the multiple exposure shooting mode is being executed. Based on the received user input, the image signal processor 260 may generate a first base image and first metadata from the first image sensor data output from the image sensor 230.
- at least one processor 340 may display an image list including the first basic image stored in the memory 330 through the display 310 .
- the stored first basic image may be stored in the memory 330 in the form of an image file in raw format. An image file saved in raw format may include a base image and metadata.
- At least one processor 340 may obtain the first basic image and first metadata from the memory 330 based on a user input for selecting the first basic image displayed through the display 310.
- the first basic image may be generated based on first image sensor data output from the image sensor 230 included in the electronic device 101, or may be generated based on image sensor data output from an image sensor (not shown) included in an external electronic device (not shown).
- At least one processor 340 may obtain a second base image and second metadata corresponding to the second base image.
- the second basic image and the second metadata may be obtained based on the second image sensor data output from the image sensor 230 at a time different from the time when the first image sensor data is output from the image sensor 230.
- At least one processor 340 may obtain reference metadata based on at least some of the first metadata or the second metadata. For example, at least one processor 340 may determine the base image acquired first among the base images that are the targets of synthesis (e.g., the first base image and the second base image) as the reference image, and determine the metadata corresponding to the reference image as the reference metadata. As another example, at least one processor 340 may determine the metadata corresponding to any one of the base images that are the targets of synthesis (e.g., the first base image and the second base image) as the reference metadata. As yet another example, at least one processor 340 may determine reference metadata including a value (e.g., an average value) calculated from the parameters included in the first metadata and the parameters included in the second metadata.
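The reference-metadata strategies described above can be sketched as follows; the single scalar `wb_param` is a hypothetical stand-in for the actual white balancing parameters.

```python
# Sketch of reference-metadata selection; "wb_param" is an assumed,
# simplified metadata field used only for illustration.

def reference_from_first(metadata_list):
    # Strategy 1: use the metadata of the first-acquired (reference) image.
    return metadata_list[0]

def reference_from_average(metadata_list):
    # Strategy 3: combine the parameters, here by averaging them.
    params = [m["wb_param"] for m in metadata_list]
    return {"wb_param": sum(params) / len(params)}

first_metadata = {"wb_param": 5}
second_metadata = {"wb_param": 20}
print(reference_from_first([first_metadata, second_metadata]))    # {'wb_param': 5}
print(reference_from_average([first_metadata, second_metadata]))  # {'wb_param': 12.5}
```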
- At least one processor 340 may synthesize the first base image and the second base image based on parameters corresponding to reference metadata. At least one processor 340 may obtain a third basic image by combining the first basic image and the second basic image. When no further images are synthesized in addition to the third base image, at least one processor 340 may store an image file generated based on the third base image and reference metadata. For example, an image file created based on the third basic image and reference metadata may be stored in the memory 330.
- when the at least one processor 340 renders the third base image based on the stored image file, it may perform an operation (e.g., white balancing) for rendering the image file based on the reference metadata.
- when the reference metadata is the first metadata of the first base image and white balancing is applied based on the reference metadata, appropriate white balancing may not be applied to the pixel values based on the second base image. Accordingly, at least one processor 340 may correct the first base image and the second base image, or may correct the third base image, in consideration of the reference metadata used during rendering.
- the at least one processor 340 may obtain a first corrected image in which white balancing is applied to the first base image based on the first metadata, and a second corrected image in which white balancing is applied to the second base image based on the second metadata.
- At least one processor 340 may obtain a third correction image that is a composite of the first correction image and the second correction image.
- the at least one processor 340 may store, in the memory 330, an image file generated based on an image obtained by applying an inverse transformation of white balancing, based on the reference metadata, to the third corrected image.
- when the image file is rendered, the inverse-transformation component is offset by the white balancing applied based on the reference metadata, so the third image may be rendered based on components to which white balancing has been applied based on the metadata corresponding to each basic image.
- for example, assume that the white balancing parameter value corresponding to the first metadata is 5, the white balancing parameter value corresponding to the second metadata is 20, and the white balancing parameter value corresponding to the reference metadata is 5. White balancing based on the value 5 may be applied to the first basic image, and white balancing based on the value 20 may be applied to the second basic image. When the inverse transformation of white balancing based on the reference metadata (the value 5) is applied to the composite of the corrected images, the component based on the first basic image corresponds to a value to which no white balancing has been applied, and the component based on the second basic image corresponds to a value based on 4 (20/5). A composite image corresponding to these white-balancing-applied values can be stored in the image file. When the stored image file is rendered based on the reference metadata, white balancing based on the value 5 is applied to the component based on the first base image, and white balancing based on the value 20 is applied to the component based on the second base image.
- however, this example is provided for ease of explanation of the operating principle according to one embodiment, and the disclosure is not limited thereto.
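The numbers in the example above can be checked with a minimal sketch in which white balancing is modeled as a single multiplicative gain, an assumed simplification of the real per-channel transform.

```python
# Scalar-gain model of the example: gains 5 and 20, reference gain 5.
# This is an illustrative simplification, not the actual pipeline.

def apply_wb(value, gain):
    return value * gain

def inverse_wb(value, gain):
    return value / gain

first_component = apply_wb(1.0, 5)    # white balanced with its own metadata
second_component = apply_wb(1.0, 20)  # white balanced with its own metadata
composite = first_component + second_component

# Before storing, the reference gain (5) is inverted, so the first
# component becomes 1 (no white balancing) and the second becomes 4.
stored = inverse_wb(composite, 5)

# Rendering the file re-applies the reference gain, restoring each
# component's own white balancing (5 and 20 respectively).
rendered = apply_wb(stored, 5)
print(stored, rendered == composite)  # 5.0 True
```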
- At least one processor 340 may obtain a fourth corrected image by applying white balancing to the first base image based on the first metadata and the reference metadata, and may obtain a fifth corrected image by applying white balancing to the second base image based on the second metadata and the reference metadata. In the process of applying white balancing to each base image, at least one processor 340 may take into account the white balancing that will be performed, at rendering time, based on the metadata to be included in the image file. For example, the operation of acquiring the corrected images may be described for the case where the white balancing parameter value corresponding to the first metadata is 5, the value corresponding to the second metadata is 20, and the value corresponding to the reference metadata is 5.
- at least one processor 340 may omit the operation of applying white balancing to the basic image paired with metadata corresponding to the reference metadata.
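Equivalently, the compensation described above can be applied up front as the ratio of each image's gain to the reference gain, which also shows why the white balancing operation may be omitted for the image whose metadata matches the reference. The scalar gains are illustrative assumptions.

```python
# Compensated white balancing: apply (own gain / reference gain) directly
# instead of applying the full gain and inverting later. Illustrative only.

def compensated_wb(value, own_gain, reference_gain):
    return value * (own_gain / reference_gain)

reference_gain = 5  # reference metadata (here, the first image's metadata)
fourth_corrected = compensated_wb(1.0, 5, reference_gain)   # ratio 1: step can be skipped
fifth_corrected = compensated_wb(1.0, 20, reference_gain)   # ratio 4
print(fourth_corrected, fifth_corrected)  # 1.0 4.0
```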
- At least one processor 340 may display a screen including a preview image through the display 310 .
- the preview image may be displayed by overlaying an image streaming from the image sensor 230 on a recently acquired image (single image or composite image).
- for example, after the first basic image is acquired while the multiple exposure shooting mode is being executed, the at least one processor 340 may output a continuously updated image by overlaying image frames obtained through the image sensor 230 on the image corresponding to the first basic image. If the multiple exposure shooting mode is still being executed after the first basic image and the second basic image are acquired and the third basic image is synthesized, the at least one processor 340 may output a continuously updated image by overlaying image frames obtained through the image sensor 230 on the image corresponding to the third basic image.
- At least one processor 340 may further obtain a post-processed image that is used to provide a preview image.
- a post-processed image may refer to an image obtained by the image signal processor 260 or at least one processor 340 performing image processing on a basic image.
- the post-processed image may include image data obtained by performing tone mapping, noise removal, color correction, or gamma correction on the base image.
- the post-processed image may be composed of a format (e.g., JPG (joint photographic experts group) format) containing information with the same or similar luminance and color as perceived by human vision.
- the post-processed image may have a lower bit depth than the base image.
- At least one processor 340 may synthesize a plurality of post-processed images. For example, the at least one processor 340 may combine a first post-processed image corresponding to the first basic image and a second post-processed image corresponding to the second basic image, thereby generating a third post-processed image corresponding to the third basic image in which the first basic image and the second basic image are synthesized. When generating an image file, at least one processor 340 may configure the image file to include the finally obtained post-processed image.
- the post-processed image included in the image file can be used as a thumbnail image or cover image for the image file.
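A minimal sketch of combining two post-processed (display-referred) images into the composite used for the preview or thumbnail might look like the following; averaging is just one possible blend and is an assumption here, not the claimed compositing method.

```python
# Blend two equally sized post-processed images by averaging; the blend
# mode is an illustrative assumption.

def blend(image_a, image_b):
    return [[(a + b) / 2 for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(image_a, image_b)]

first_post = [[0.25, 0.5], [0.75, 1.0]]
second_post = [[0.75, 0.5], [0.25, 0.0]]
print(blend(first_post, second_post))  # [[0.5, 0.5], [0.5, 0.5]]
```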
- FIG. 4 is a flowchart 400 illustrating a process in which an electronic device (eg, the electronic device 101 of FIGS. 1 and 3 ) stores an acquired image as an image file, according to an embodiment.
- the operation of the electronic device may be understood as being performed by at least one processor of the electronic device (e.g., the processor 120 of FIG. 1, the image signal processor 260 of FIG. 2, or the at least one processor 340 of FIG. 3) performing an operation or controlling other components of the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3).
- an electronic device may acquire a plurality of basic images.
- the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may execute a camera application and receive a user input for selecting a menu, included in the displayed user interface, for executing multiple exposure mode shooting.
- the electronic device may enter the multiple exposure mode and determine a multiple exposure mode setting value for performing multiple exposure mode photography.
- the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may receive at least one user input for selecting the number of shots, which defines the quantity of basic images to be used to generate a synthesized image, and for selecting a compositing mode, which defines a method for synthesizing the images.
- the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may determine a shooting setting value for capturing an image based on the set multiple exposure mode setting value.
- the shooting setting value may include at least one of sensitivity, white balance, or exposure time. Shooting settings can be changed based on user input for each shot while executing the multiple exposure mode. Alternatively, the shooting setting value may be automatically set by an electronic device (eg, the electronic device 101 in FIGS. 1 and 3).
- an electronic device may acquire a first basic image and acquire first metadata corresponding to the first basic image.
- the electronic device may acquire the second basic image at a time different from the time when the first basic image was captured, and obtain second metadata corresponding to the second basic image.
- an electronic device may acquire reference metadata.
- an electronic device according to an embodiment may select a reference image from a plurality of basic images.
- the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may determine the metadata of the reference image as the reference metadata.
- the reference image may be, for example, a base image first acquired while executing multiple exposure mode. However, it is not limited to this.
- an electronic device may obtain an image obtained by combining a plurality of basic images based on reference metadata.
- an electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may apply white balancing to an acquired basic image based on the metadata corresponding to that basic image, and composite it with another basic image.
- the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may obtain the synthesized image by applying an inverse transformation based on the reference metadata to the image obtained by synthesizing the basic images.
- an electronic device may store an image file generated based on the synthesized image and reference metadata.
- the generated image file may include a base image having a native format and reference metadata.
- an electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may store an image file in a digital negative (DNG) format.
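The stored image file described above bundles the synthesized base image, the reference metadata, and optionally a post-processed cover image. A hypothetical container, a stand-in for a real DNG writer whose tag layout is not reproduced here, can be sketched as:

```python
# Hypothetical image-file container; field names are illustrative and do
# not correspond to actual DNG tags.

def build_image_file(base_image, reference_metadata, post_processed):
    return {
        "format": "raw (DNG-like, illustrative)",
        "base_image": base_image,        # synthesized raw data
        "metadata": reference_metadata,  # applied when the file is rendered
        "thumbnail": post_processed,     # cover/preview image
    }

image_file = build_image_file([1, 2, 3], {"wb_param": 5}, [[0.5, 0.5]])
print(sorted(image_file))  # ['base_image', 'format', 'metadata', 'thumbnail']
```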
- FIG. 5 is a flowchart 500 illustrating a process in which an electronic device (e.g., the electronic device 101 of FIGS. 1 and 3 ) according to an embodiment stores an image using inverse white balancing conversion.
- an electronic device may acquire image sensor data from an image sensor.
- the electronic device may acquire a basic image, metadata, and a post-processed image based on the acquired image sensor data.
- an electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may obtain a first basic image, first metadata, and a first post-processed image from the image sensor data output from the image sensor, based on a user input of selecting a shooting icon on a displayed user interface while the multiple exposure shooting mode is set.
- operation 510 may also be replaced with the operation of selecting a file that includes the basic image.
- an electronic device may apply white balancing, based on the metadata, to the obtained basic image. For example, when the first basic image is captured around a white light source, the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may correct the color information of the first basic image based on the color temperature of the white light source.
- an electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may combine the basic image corrected in operation 530 with an existing basic image. If there is no existing basic image to be the target of compositing, operation 540 may be omitted. For example, if only the first basic image has been acquired in operations 510 to 530, operation 540 may be omitted.
- an electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may combine the post-processed image acquired in operation 520 with an existing post-processed image. If there is no existing post-processed image to be the target of compositing, operation 545 may be omitted. For example, if only the first post-processed image has been acquired in operations 510 and 520, operation 545 may be omitted.
- the electronic device may determine whether multiple exposure mode shooting has ended. For example, when the number of shots included in the multiple exposure mode setting value is set to N, the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may determine that multiple exposure mode shooting has ended when the number of times the image has been synthesized through repetitions of operations 510 to 545 reaches N. As another example, the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may determine that multiple exposure mode shooting has ended based on receiving a user input corresponding to a command to end multiple exposure mode shooting.
- when it is determined that multiple exposure mode shooting has not ended, the electronic device may perform operations 510 to 545 again. For example, when the number of shots is set to 3, the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may perform operations 510 to 530 after acquiring the first basic image, and may apply white balancing to the second basic image based on the second metadata. For example, when the second basic image is captured around a red light source, the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may correct the color information of the second basic image based on the color temperature of the red light source.
- the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may obtain a third basic image (a synthesized image) by synthesizing the first basic image (first corrected image) and the second basic image (second corrected image), to each of which white balancing has been applied.
- the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may obtain a third post-processing image by combining the first post-processing image and the second post-processing image.
- the electronic device may determine that multiple exposure mode shooting has ended.
- the electronic device may apply inverse white balancing transformation based on reference metadata to the finally generated basic image.
- Reference metadata may be determined as described in operation 420.
- the electronic device may store an image file including the inverse image data to which the inverse transformation has been applied (e.g., data obtained by applying the inverse transformation of white balancing to the third base image), the reference metadata, and a post-processed image (e.g., the third post-processed image).
- the electronic device may determine that multiple exposure mode shooting has not ended.
- the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may apply white balancing to the fourth basic image obtained in operations 510 and 520, based on the metadata corresponding to the fourth basic image.
- the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may obtain a fifth basic image by combining the fourth basic image, to which white balancing has been applied, with the third basic image.
- the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may obtain a fifth post-processed image by combining the fourth post-processed image obtained in operation 520 with the third post-processed image.
- the electronic device may determine that multiple exposure mode shooting has ended.
- the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may apply an inverse white balancing transformation, based on the reference metadata, to the finally synthesized basic image. For example, following the above-described example, when the reference metadata is the first metadata, the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may apply the inverse transformation of white balancing to the fifth basic image based on the first metadata.
- the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may store an image file including the inverse image to which inverse white balancing has been applied (e.g., an image obtained by applying inverse white balancing to the fifth basic image).
- the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may store the image file to further include reference metadata and a post-processed image (e.g., a fifth post-processed image).
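Modeling white balancing as a single multiplicative gain (an illustrative assumption), the overall FIG. 5 flow can be sketched as: white balance each capture with its own metadata, accumulate the composite, then apply the inverse of the reference gain before saving.

```python
# End-to-end sketch of the FIG. 5 flow with assumed scalar gains; each
# capture is a (value, gain) pair standing in for sensor data + metadata.

def multiple_exposure_capture(captures):
    composite = 0.0
    reference_gain = None
    for value, gain in captures:
        if reference_gain is None:
            reference_gain = gain      # first image's metadata as reference
        composite += value * gain      # white balance, then composite
    return composite / reference_gain  # inverse transform before saving

stored = multiple_exposure_capture([(1.0, 5), (1.0, 20), (1.0, 10)])
print(stored)  # 7.0; rendering with the reference gain 5 yields 35.0
```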
- FIG. 6 is a flowchart 600 illustrating a process in which an electronic device (e.g., the electronic device 101 of FIGS. 1 and 3 ) compensates for white balancing and stores an image according to an embodiment.
- an electronic device may acquire image sensor data from an image sensor.
- the electronic device may acquire a basic image, metadata, and a post-processed image based on the acquired image sensor data.
- an electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may obtain a first basic image, first metadata, and a first post-processed image from the image sensor data output from the image sensor, based on a user input of selecting a shooting icon of a displayed user interface while the multi-photography mode is set.
- operation 610 may also be replaced with the operation of selecting a file that includes the basic image.
- an electronic device may determine reference metadata.
- the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may determine reference metadata based on the metadata obtained in operation 620.
- the reference metadata may include metadata corresponding to the basic image initially acquired while performing multiple exposure mode shooting.
- the reference metadata may be determined in various ways, and the method of determining the reference metadata is not limited.
- the electronic device may apply white balancing to the basic image based on the reference metadata and the metadata corresponding to the basic image acquired in operation 620. For example, when the first basic image and first metadata are acquired in operation 620, the value of the white balancing parameter corresponding to the first metadata may be compensated by considering the reference metadata included in the image file. An electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may perform a white balancing operation on the first basic image based on the compensated white balancing parameter. If the reference metadata is the first metadata, operation 640 may be omitted for the first basic image.
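As an illustration of this compensation step, the sketch below uses a simplified per-channel multiplicative model of white balancing; the gain values and the division-based compensation rule are assumptions for illustration, not the device's actual parameters or transform:

```python
def compensate_wb_gains(image_gains, reference_gains):
    # Hypothetical compensation: applying the compensated gains now, and the
    # reference gains at render time, equals applying the image's own gains once.
    return tuple(g / r for g, r in zip(image_gains, reference_gains))

def apply_gains(pixel, gains):
    # Per-channel multiplicative white balancing (simplified model).
    return tuple(c * g for c, g in zip(pixel, gains))

first_gains = (2.0, 1.0, 1.5)       # assumed R/G/B gains from the first metadata
reference_gains = (1.8, 1.0, 1.6)   # assumed gains from the reference metadata

compensated = compensate_wb_gains(first_gains, reference_gains)
pixel = (0.2, 0.5, 0.3)
# compensate at capture time, then render with the reference gains:
rendered = apply_gains(apply_gains(pixel, compensated), reference_gains)
# equals applying the image's own gains directly:
direct = apply_gains(pixel, first_gains)
```

Under this model, the raw composite stored with the reference metadata still renders with each image's intended colors.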
- an electronic device may combine the basic image to which white balancing has been applied in operation 640 with an existing basic image. If there is no existing basic image to be the subject of compositing, operation 650 may be omitted. For example, if only the first basic image has been acquired through operations 610 to 640, operation 650 may be omitted.
- an electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may combine the post-processed image acquired in operation 620 with an existing post-processed image. If there is no existing post-processed image to be the subject of compositing, operation 655 may be omitted.
- the electronic device may determine whether multiple exposure mode shooting has ended. For example, the electronic device (e.g., the electronic device 101 in FIGS. 1 and 3) may compare the set number of shots with the number of times images have been synthesized to determine whether multiple exposure mode shooting has ended. As another example, after capturing and compositing images, the electronic device (e.g., the electronic device 101 in FIGS. 1 and 3) may receive a user input indicating whether to perform additional shooting or to end multiple exposure mode shooting and save the file.
- the electronic device may perform operations 610 to 655. For example, in operations 610 and 620, the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may acquire a second basic image, second metadata, and a second post-processed image. If reference metadata has already been determined, operation 630 may be omitted. In operation 640, the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may compensate the white balancing parameter corresponding to the second metadata by considering that white balancing will be performed based on the reference metadata to be included in the image file.
- the electronic device may perform a white balancing operation on the second basic image based on the compensated white balancing parameter.
- operations 610 to 655 may be performed to further synthesize additional images into the base image and the post-processed image.
- In operation 670, the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may store an image file including the finally acquired basic image (e.g., the third basic image) and the reference metadata.
- the image file may further include a finally acquired post-processed image (e.g., a third post-processed image).
- FIG. 7 is a diagram illustrating an example in which an electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) acquires and renders a multiple exposure image according to an embodiment.
- An electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may perform the first capture 701 to acquire the first basic image 711, the first metadata 721, and the first post-processed image 731.
- the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may perform the second capture 702 at a time different from the time when the first capture 701 is performed, in a different capturing environment (e.g., a different color temperature of the ambient light source).
- the electronic device (e.g., the electronic device 101 in FIGS. 1 and 3) may obtain a second basic image 712, second metadata 722, and a second post-processed image 732 as a result of the second capture 702.
- an electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may perform an operation 755 of synthesizing the images acquired in the first capture 701 and the second capture 702 based on metadata.
- the first basic image 711 and the second basic image 712 may be synthesized based on the process shown in the flowchart 400 of FIG. 4, the flowchart 500 of FIG. 5, or the flowchart 600 of FIG. 6.
- an electronic device may generate an image file 760 including a synthesized image.
- the image file 760 may include a third basic image 713 that is a composite of the first basic image 711 and the second basic image 712.
- the third basic image 713 may be generated from the first basic image 711 and the second basic image 712 based on the first metadata 721, the second metadata 722, and the reference metadata 723.
- the image file 760 may include, as metadata for the third basic image 713, reference metadata 723 determined based on at least part of the first metadata 721 or the second metadata 722.
- the reference metadata 723 may include first metadata 721.
- the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may render the third basic image 713 of the image file 760 based on the reference metadata 723.
- An electronic device (e.g., the electronic device 101 in FIGS. 1 and 3) may display the rendered multiple exposure image 714.
- the third basic image 713 may include data generated by considering the metadata 721 and 722 for each of the basic images 711 and 712 and the reference metadata 723. Accordingly, the multiple exposure image 714 rendered from the third basic image 713 based on the reference metadata 723 may be expressed with the same or similar color as the thumbnail or cover image 743 displaying the post-processed image 733.
- FIG. 8 is a block diagram illustrating an example in which an electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) according to an embodiment acquires a basic image and a post-processed image from image sensor data.
- an image signal processor (e.g., the image signal processor 260 of FIG. 2) of an electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) may process image sensor data 800 through the Bayer/raw processing engine 810 and the image processing engine 820.
- the Bayer/raw processing engine 810 may include a Bayer preprocessing module 811 and a Bayer demosaic module 812.
- Image sensor data 800 may first be processed in the Bayer preprocessing module 811.
- the Bayer demosaic module 812 can convert the data (Bayer image sensor data) preprocessed in the Bayer preprocessing module 811 into RGB format.
- a basic image 830 (e.g., the first basic image 711 and the second basic image 712 in FIG. 7) may be obtained from the processing process or processing result of the Bayer/raw processing engine 810.
- Metadata (e.g., the first metadata 721 and the second metadata 722 in FIG. 7) may also be obtained from the processing process or processing result of the Bayer/raw processing engine 810.
- the result of the Bayer/raw processing engine 810 processing the image sensor data 800 may be transmitted to the image processing engine 820.
- the image processing engine 820 may include, for example, at least one of a tone mapping module 821, a noise reduction module 822, a color correction module 823, a gamma correction module 824, a spatial filter module 825, or a grain add module 826.
- a post-processed image 840 having the same or similar brightness and color as perceived by sight can be obtained.
- the configuration of the Bayer/raw processing engine 810 and the image processing engine 820 shown in FIG. 8 is for illustrative purposes only and is not limited thereto.
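Purely as a sketch of how such stages compose, the snippet below chains two of the named stages on an RGB pixel; the specific tone curve (a Reinhard-style compression) and gamma value are illustrative assumptions, since the document does not specify the internals of the image processing engine 820:

```python
def tone_map(v):
    # Simple Reinhard-style global tone curve (illustrative only).
    return v / (1.0 + v)

def gamma_correct(v, gamma=2.2):
    # Standard power-law gamma encoding.
    return v ** (1.0 / gamma)

def process(pixel):
    # Chain a subset of the image processing engine's stages on each channel.
    return tuple(gamma_correct(tone_map(c)) for c in pixel)

out = process((0.1, 0.4, 0.9))
```

Each stage is a per-channel mapping here; real engines also include spatial operations such as noise reduction and spatial filtering.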
- FIG. 9 is a block diagram illustrating a configuration included in an image file 900 stored by an electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) according to an embodiment.
- image file 900 may include base image 910 and metadata 920.
- the image file 900 may further include a post-processed image 930 corresponding to the base image 910.
- the post-processed image 930 may be omitted.
- the image file 900 may be configured, for example, in the digital negative (DNG) format defined by Adobe™.
- the base image 910 may be a raw image defined in DNG.
- Metadata 920 may be raw image metadata defined in DNG.
- the post-processed image 930 may be a processed image defined in DNG.
- the post-processed image 930 may be provided in the form of a thumbnail image so that the image stored in the image file 900 can be easily browsed.
- the electronic device (e.g., the electronic device 101 in FIGS. 1 and 3) may execute dedicated software to render the basic image 910 using the basic image 910 and metadata 920 in the image file 900.
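A minimal sketch of such a container is shown below; the field names are hypothetical and this in-memory mirror of FIG. 9 is far simpler than the actual TIFF-based DNG byte layout:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ImageFile:
    # Hypothetical in-memory mirror of FIG. 9, not the actual DNG layout.
    base_image: bytes                        # raw image 910
    metadata: dict                           # raw image metadata 920
    post_processed: Optional[bytes] = None   # optional processed image 930

# The post-processed image may be omitted, as noted above.
f = ImageFile(base_image=b"\x00" * 4, metadata={"wb_gains": (1.8, 1.0, 1.6)})
```

The optional third field reflects that the post-processed image 930 may be omitted from the file.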
- FIG. 10 is a diagram illustrating in module units the configuration of an electronic device (e.g., the electronic device 101 of FIGS. 1 and 3 ) that stores an image using inverse white balancing transformation, according to an embodiment.
- image sensor data 1001 output from the image sensor module 230 may be input to the image signal processing module 1010.
- the image signal processing module 1010 may process the image sensor data 1001 and output a basic image and metadata 1002.
- the reference image determination module 1020 may determine a reference image from among a plurality of basic images output from the image signal processing module 1010.
- Reference metadata 1004 corresponding to the reference image may be included in the image file 1007 generated by the image file encoding module 1070.
- the white balance application module 1030 may correct color information of the base image by applying white balancing to the base image based on metadata corresponding to the base image.
- the white balancing application module 1030 may extract white balancing parameters from metadata and correct the basic image to have a value considering white balancing. This process can be applied to all basic images taken based on multiple exposure mode.
- the base image to which white balancing has been applied may be transmitted to the base image blending module 1040.
- the basic image blending module 1040 can synthesize basic images to which white balancing is applied according to a set synthesis mode.
- the inverse white balance applying module 1050 may extract white balancing parameters from the reference metadata 1004.
- the finally synthesized basic image can be inversely converted using the white balancing parameter extracted from the reference metadata 1004.
- the inverse conversion process may be performed based on the conversion used in the white balance application module 1030.
- the inverse transformation process can also be configured as an approximation.
- the synthesized base image 1003 output by the inverse white balance application module 1050 may be transmitted to the image file encoding module 1070.
- the image file encoding module 1070 may encode an image file 1007 that includes a composite base image 1003 and reference metadata 1004.
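Under a simplified multiplicative model of white balancing (an assumption; the document does not specify the transform or gain values), the apply → blend → inverse-apply path of modules 1030, 1040, and 1050 can be sketched as:

```python
def apply_wb(pixel, gains):
    # Module 1030: per-channel white balancing (simplified model).
    return tuple(c * g for c, g in zip(pixel, gains))

def inverse_wb(pixel, gains):
    # Module 1050: exact inverse of the multiplicative transform above.
    return tuple(c / g for c, g in zip(pixel, gains))

def blend(p, q):
    # Module 1040: simple averaging, one possible synthesis mode.
    return tuple((a + b) / 2.0 for a, b in zip(p, q))

gains_1 = (2.0, 1.0, 1.4)   # assumed gains from the first image's metadata
gains_2 = (1.6, 1.0, 1.9)   # assumed gains from the second image's metadata
ref_gains = gains_1          # reference metadata 1004 taken from the first image

p1, p2 = (0.2, 0.5, 0.3), (0.4, 0.3, 0.6)
corrected = blend(apply_wb(p1, gains_1), apply_wb(p2, gains_2))
stored = inverse_wb(corrected, ref_gains)    # synthesized base image 1003
rendered = apply_wb(stored, ref_gains)       # rendering re-applies the reference gains
```

Because the inverse exactly undoes the reference gains, rendering the stored image with the reference metadata reproduces the fully white-balanced composite.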
- the image signal processing module 1010 may further output a post-processed image 1005 obtained by performing image processing on the image sensor data 1001.
- Each post-processed image 1005 output from the image signal processing module 1010 may be synthesized by the post-processed image blending module 1055.
- the synthesized post-processed image 1006 may be included in the image file 1007 by the image file encoding module 1070.
- the electronic device may further include a preview overlay module 1080 and a preview rendering module 1090.
- the post-processed image 1006 composited by the post-processed image blending module 1055 may be passed to the preview overlay module 1080.
- the preview overlay module 1080 may overlay the post-processed image 1005 output from the image signal processing module 1010 on the synthesized post-processed image 1006.
- the image signal processing module 1010 may stream the post-processed image 1005 to the preview overlay module 1080 based on the rendering cycle of the preview image 1008.
- the preview rendering module 1090 may output a preview image 1008 by rendering an image in which the post-processed image 1005, which is updated every rendering cycle, is overlaid on the synthesized post-processed image 1006.
- FIG. 11 is a diagram showing, in module units, the configuration of an electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) that stores an image by compensating for white balance, according to an embodiment.
- image sensor data 1001 output from the image sensor module 230 may be input to the image signal processing module 1010.
- the image signal processing module 1010 may process the image sensor data 1001 and output a basic image and metadata 1002.
- the reference image determination module 1020 may determine a reference image from among a plurality of basic images output from the image signal processing module 1010.
- Reference metadata 1004 corresponding to the reference image may be included in the image file 1007 generated by the image file encoding module 1070.
- the determined reference metadata 1004 may be passed to a white balance compensation module 1130.
- the white balance compensation module 1130 may compensate the value of the white balancing parameter to be applied to the basic image based on the white balancing parameter extracted from the metadata 1002 for the basic image and the white balancing parameter extracted from the determined reference metadata 1004.
- the white balance compensation module 1130 may apply white balancing to the basic image based on the compensated value.
- the basic image to which white balancing has been applied may be transmitted to the basic image blending module 1040.
- the basic image blending module 1040 may synthesize basic images to which white balancing is applied according to a set synthesis mode.
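The compensation path of FIG. 11 can be sketched under the same simplified multiplicative model (the gain values are assumptions): module 1130 white-balances each image with compensated gains, module 1040 blends them, and no explicit inverse step is needed before encoding:

```python
def apply_wb(pixel, gains):
    # Per-channel multiplicative white balancing (simplified model).
    return tuple(c * g for c, g in zip(pixel, gains))

def compensate(gains, ref_gains):
    # Module 1130: fold the later reference-based rendering into each image's gains.
    return tuple(g / r for g, r in zip(gains, ref_gains))

gains_1, gains_2 = (2.0, 1.0, 1.4), (1.6, 1.0, 1.9)  # assumed per-image gains
ref = gains_1                                         # reference metadata 1004
p1, p2 = (0.2, 0.5, 0.3), (0.4, 0.3, 0.6)

c1 = apply_wb(p1, compensate(gains_1, ref))
c2 = apply_wb(p2, compensate(gains_2, ref))
stored = tuple((a + b) / 2.0 for a, b in zip(c1, c2))  # module 1040: average blend
rendered = apply_wb(stored, ref)                       # rendering with reference metadata
```

Under this model, the rendered result equals the blend of the fully white-balanced images, i.e. the same output as the inverse-transform path of FIG. 10, without a separate inverse module.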
- post-processing image blending module 1055, image file encoding module 1070, preview overlay module 1080, and preview rendering module 1090 may operate as described above with respect to FIG. 10.
- the image file may include metadata containing information that can be used to render an image included in the image file. Metadata may be obtained from image sensor data output from an image sensor or may include information obtained in the process of processing image sensor data. It is difficult to determine the brightness and color recognized by human vision using only the image data of an image file saved in raw format. Accordingly, the electronic device can perform additional operations (e.g., demosaic or white balance) based on the metadata.
- Multiple exposure images can be obtained by combining images acquired using an image sensor at different points in time. Since the images used to obtain a multiple exposure image may be captured at different times, the metadata corresponding to each image may differ. For example, when the color temperature of the ambient light source for an image captured at a first time differs from that for an image captured at a second time, different parameters related to white balancing may be required. However, since the parameters stored in the metadata correspond to a single condition, the result of applying white balancing to the synthesized image may differ from the desired color.
- An electronic device (e.g., the electronic device 101 in FIGS. 1 and 3) according to an embodiment includes a camera including an image sensor (e.g., the image sensor 230 in FIGS. 2 and 3), a memory, and at least one processor (e.g., the processor 220 of FIG. 1 and the processor 330 of FIG. 3).
- In the electronic device, the at least one processor may be configured to acquire a first basic image and first metadata corresponding to first image sensor data obtained using the image sensor.
- In the electronic device, the at least one processor may be configured to acquire a second basic image and second metadata corresponding to second image sensor data obtained using the image sensor.
- In the electronic device (e.g., the electronic device 101 in FIGS. 1 and 3), the at least one processor (e.g., the processor 220 in FIG. 1 and the processor 330 in FIG. 3) may be configured to determine reference metadata based on at least some of the first metadata or the second metadata.
- In the electronic device (e.g., the electronic device 101 in FIGS. 1 and 3), the at least one processor (e.g., the processor 220 in FIG. 1 and the processor 330 in FIG. 3) may be configured to obtain a third basic image by combining the first basic image and the second basic image based on parameters corresponding to the reference metadata.
- In the electronic device (e.g., the electronic device 101 in FIGS. 1 and 3), the at least one processor (e.g., the processor 220 in FIG. 1 and the processor 330 in FIG. 3) may be configured to store an image file generated based on the third basic image and the reference metadata in the memory (e.g., the memory 130 in FIG. 1, the memory 250 in FIG. 2, and the memory 330 in FIG. 3).
- In the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) according to an embodiment, the at least one processor (e.g., the processor 220 of FIG. 1 and the processor 330 of FIG. 3) may be configured to select the first basic image among the first basic image and the second basic image as a reference image.
- In the electronic device, the at least one processor may be configured to obtain, based on the selection, reference metadata including at least some of the first metadata.
- An electronic device may further include a display.
- the first basic image may be an image stored in the memory (e.g., the memory 130 in FIG. 1, the memory 250 in FIG. 2, and the memory 330 in FIG. 3).
- In the electronic device (e.g., the electronic device 101 in FIGS. 1 and 3), the at least one processor (e.g., the processor 220 in FIG. 1 and the processor 330 in FIG. 3) may be configured to control the display to display an image list including the first basic image stored in the memory (e.g., the memory 130 of FIG. 1, the memory 250 of FIG. 2, and the memory 330 of FIG. 3).
- In the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3), the at least one processor (e.g., the processor 220 of FIG. 1 and the processor 330 of FIG. 3) may be configured to receive a user input for selecting the first basic image.
- In the electronic device, the at least one processor may be configured to select the first basic image as the reference image based on the user input.
- In the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3), the at least one processor (e.g., the processor 220 of FIG. 1 and the processor 330 of FIG. 3) may be configured to obtain a first post-processed image to which image processing including white balancing has been applied to the first basic image.
- In the electronic device, the at least one processor may be configured to obtain a second post-processed image to which image processing including white balancing has been applied to the second basic image.
- In the electronic device (e.g., the electronic device 101 in FIGS. 1 and 3), the at least one processor (e.g., the processor 220 in FIG. 1 and the processor 330 in FIG. 3) may be configured to obtain a third post-processed image by combining the first post-processed image and the second post-processed image.
- In the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3), the at least one processor (e.g., the processor 220 of FIG. 1 and the processor 330 of FIG. 3) may be configured to control the display to display the third post-processed image as a thumbnail image or preview image for the image file.
- the first base image and the second base image may be configured in a raw file format.
- In the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3), the at least one processor (e.g., the processor 220 of FIG. 1 and the processor 330 of FIG. 3) may be configured to obtain a third corrected image by combining a first corrected image in which white balancing is applied to the first basic image based on the first metadata and a second corrected image in which white balancing is applied to the second basic image based on the second metadata.
- In the electronic device (e.g., the electronic device 101 in FIGS. 1 and 3), the at least one processor (e.g., the processor 220 in FIG. 1 and the processor 330 in FIG. 3) may be configured to obtain the third basic image by applying an inverse white balancing transformation to the third corrected image based on the reference metadata.
- In an electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) according to an embodiment, the at least one processor (e.g., the processor 220 of FIG. 1 and the processor 330 of FIG. 3) may be configured to obtain a fourth corrected image by applying white balancing to the first basic image based on the first metadata and the reference metadata.
- In the electronic device (e.g., the electronic device 101 in FIGS. 1 and 3), the at least one processor (e.g., the processor 220 in FIG. 1 and the processor 330 in FIG. 3) may be configured to obtain a fifth corrected image by applying white balancing to the second basic image based on the second metadata and the reference metadata.
- In the electronic device (e.g., the electronic device 101 in FIGS. 1 and 3), the at least one processor (e.g., the processor 220 in FIG. 1 and the processor 330 in FIG. 3) may be configured to obtain the third basic image by synthesizing the fourth corrected image and the fifth corrected image.
- In an electronic device (e.g., the electronic device 101 of FIGS. 1 and 3) according to an embodiment, the at least one processor (e.g., the processor 220 of FIG. 1 and the processor 330 of FIG. 3) may be configured to render the third basic image by applying white balancing to the third basic image based on the reference metadata.
- A method of operating an electronic device including a camera with an image sensor (e.g., the image sensor 230 of FIGS. 2 and 3) according to an embodiment may include an operation of acquiring a first basic image and first metadata corresponding to first image sensor data obtained from the image sensor. The method may include an operation of acquiring a second basic image and second metadata corresponding to second image sensor data acquired from the image sensor (e.g., the image sensor 230 in FIGS. 2 and 3).
- the method may include determining reference metadata based on at least some of the first metadata or the second metadata.
- the method may include an operation of obtaining a third basic image obtained by combining the first basic image and the second basic image based on parameters corresponding to the reference metadata.
- the method may include storing an image file created based on the third base image and the reference metadata.
- the operation of acquiring the reference metadata includes selecting the first base image among the first base image and the second base image as a reference image, and based on the selection, the first base image It may include an operation of obtaining reference metadata including at least some of the metadata.
- the first basic image may be an image stored in a memory (e.g., the memory 130 of FIG. 1, the memory 250 of FIG. 2, or the memory 330 of FIG. 3) included in the electronic device (e.g., the electronic device 101 of FIGS. 1 and 3).
- the operation of selecting the first basic image as the reference image may include an operation of displaying an image list including the first basic image through a display, an operation of receiving a user input for selecting the first basic image with respect to the image list, and an operation of selecting the first basic image as the reference image based on the user input.
- a method according to an embodiment may further include an operation of acquiring a first post-processed image to which image processing including white balancing has been applied to the first basic image, an operation of acquiring a second post-processed image to which image processing including white balancing has been applied to the second basic image, and an operation of obtaining a third post-processed image by combining the first post-processed image and the second post-processed image.
- the method according to one embodiment may further include displaying the third post-processed image as a thumbnail image or preview image for the image file through a display.
- the first base image and the second base image may be configured in a raw file format.
- the method according to one embodiment may further include an operation of displaying, through a display, a screen in which an image frame, updated by being included in an image stream output from the image sensor (e.g., the image sensor 230 in FIGS. 2 and 3), is overlaid on the third post-processed image.
- the operation of acquiring the third basic image may include an operation of obtaining a third corrected image by combining a first corrected image in which white balancing is applied to the first basic image based on the first metadata and a second corrected image in which white balancing is applied to the second basic image based on the second metadata, and an operation of obtaining the third basic image by applying an inverse white balancing transformation to the third corrected image based on the reference metadata.
- the operation of acquiring the third basic image may include an operation of obtaining a fourth corrected image by applying white balancing to the first basic image based on the first metadata and the reference metadata, an operation of obtaining a fifth corrected image by applying white balancing to the second basic image based on the second metadata and the reference metadata, and an operation of obtaining the third basic image by synthesizing the fourth corrected image and the fifth corrected image.
- the method according to an embodiment may further include rendering the third basic image by applying white balancing to the third basic image based on the reference metadata.
- an electronic device that can improve the quality of a multiple exposure image and a method of operating the same can be provided.
- an electronic device that renders a multiple exposure image stored in raw format so that the displayed result has the same or similar color as a thumbnail image, cover image, or preview image for the multiple exposure image, and a method of operating the same, may be provided.
- an electronic device that provides a preview image expected to be generated when a shooting command is input and a method of operating the same may be provided.
- a computer-readable storage medium that stores one or more programs (software modules) may be provided.
- One or more programs stored in a computer-readable storage medium are configured to be executable by one or more processors in an electronic device (configured for execution).
- One or more programs include instructions that cause the electronic device to execute methods according to embodiments described in the claims or specification of the present disclosure.
- These programs may be stored in random access memory (RAM), non-volatile memory including flash memory, read only memory (ROM), electrically erasable programmable ROM (EEPROM), magnetic disc storage devices, compact disc-ROM (CD-ROM), digital versatile discs (DVDs), other types of optical storage devices, or magnetic cassettes. Alternatively, they may be stored in a memory consisting of a combination of some or all of these. Additionally, multiple such memories may be included.
- Additionally, the program may be stored on an attachable storage device that is accessible through a communication network such as the Internet, an intranet, a local area network (LAN), a wide LAN (WLAN), or a storage area network (SAN), or a combination thereof. Such a storage device can be connected to a device performing an embodiment of the present disclosure through an external port. Additionally, a separate storage device on the communication network may be connected to the device performing an embodiment of the present disclosure.
- terms such as “unit”, “module”, etc. may refer to a hardware component such as a processor or circuit, and/or a software component executed by a hardware component such as a processor.
- a “part” or “module” is stored in an addressable storage medium and may be implemented by a program that can be executed by a processor.
- “part” and “module” may be implemented by components such as software components, object-oriented software components, class components, and task components, as well as by processes, functions, properties, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables.
- “comprises at least one of a, b, or c” may mean including only a, only b, only c, both a and b, both b and c, both a and c, or all of a, b, and c.
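The seven cases enumerated above are exactly the non-empty subsets of {a, b, c}. A short sketch (the element names are placeholders, not terms from the claims) can enumerate them:

```python
from itertools import combinations

elements = ["a", "b", "c"]
# Every non-empty subset of {a, b, c} satisfies
# "comprises at least one of a, b, or c".
subsets = [set(combo) for r in range(1, 4) for combo in combinations(elements, r)]
print(len(subsets))  # 7 combinations, matching the list in the text
```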
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computing Systems (AREA)
- Theoretical Computer Science (AREA)
- Studio Devices (AREA)
Abstract
An electronic device according to various embodiments may be configured to: correct a first basic image and a second basic image based on reference metadata to be included in an image file; obtain a third basic image by combining the corrected first image and the corrected second image, or correct, based on the reference metadata, an image obtained by combining the first basic image and the second basic image; and store an image file generated based on the third basic image and the reference metadata.
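The abstract describes correcting two differently exposed base images against shared reference metadata, combining them into a third base image, and storing the result together with that metadata. A minimal sketch of that flow is below; the gain-matching correction, the `target_mean` metadata field, and the dict-based "image file" are illustrative assumptions, not the patented method.

```python
import numpy as np

def correct(image: np.ndarray, ref_meta: dict) -> np.ndarray:
    """Apply a gain correction derived from the reference metadata (assumed field)."""
    gain = ref_meta["target_mean"] / max(image.mean(), 1e-6)
    return np.clip(image * gain, 0.0, 1.0)

def combine(first: np.ndarray, second: np.ndarray) -> np.ndarray:
    """Blend the two corrected exposures into a third base image."""
    return 0.5 * first + 0.5 * second

def make_image_file(first: np.ndarray, second: np.ndarray, ref_meta: dict) -> dict:
    corrected_first = correct(first, ref_meta)
    corrected_second = correct(second, ref_meta)
    third = combine(corrected_first, corrected_second)
    # The "image file" here is just a dict pairing pixels with the metadata
    # used for correction, so the same metadata travels with the result.
    return {"pixels": third, "metadata": ref_meta}

short_exposure = np.full((2, 2), 0.2)  # underexposed frame
long_exposure = np.full((2, 2), 0.8)   # overexposed frame
file_ = make_image_file(short_exposure, long_exposure, {"target_mean": 0.5})
```

Because both frames are corrected toward the same reference before blending, the combined image sits at the metadata's target brightness regardless of which exposure dominated.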
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IN202211060353 | 2022-10-21 | ||
IN202211060353 | 2022-10-21 | ||
KR1020220163039A KR20240056374A (ko) | 2022-10-21 | 2022-11-29 | Electronic device for acquiring multiple exposure images and operating method thereof |
KR10-2022-0163039 | 2022-11-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024085673A1 true WO2024085673A1 (fr) | 2024-04-25 |
Family
ID=90737961
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2023/016224 WO2024085673A1 (fr) | 2023-10-19 | Electronic device for obtaining multiple exposure images and operation method thereof |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2024085673A1 (fr) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100285907B1 (ko) * | 1998-12-31 | 2001-04-16 | Yun Jong-yong | Digital still camera with multiple exposure function |
JP2013058869A (ja) * | 2011-09-07 | 2013-03-28 | Canon Inc | Image processing apparatus, image processing method, and program |
KR20140106221A (ko) * | 2013-02-26 | 2014-09-03 | Samsung Electronics Co., Ltd. | Method and apparatus for photographing using multiple image sensors |
JP2015122569A (ja) * | 2013-12-20 | 2015-07-02 | Canon Inc | Imaging apparatus, control method of imaging apparatus, and program |
JP2016063361A (ja) * | 2014-09-17 | 2016-04-25 | Ricoh Imaging Co., Ltd. | Imaging device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2022039424A1 (fr) | Procédé de stabilisation d'images et dispositif électronique associé | |
WO2022108235A1 (fr) | Procédé, appareil et support de stockage pour obtenir un obturateur lent | |
WO2022149654A1 (fr) | Dispositif électronique pour réaliser une stabilisation d'image, et son procédé de fonctionnement | |
WO2022092706A1 (fr) | Procédé de prise de photographie à l'aide d'une pluralité de caméras, et dispositif associé | |
WO2020190030A1 (fr) | Dispositif électronique de génération d'image composite et procédé associé | |
WO2022149812A1 (fr) | Dispositif électronique comprenant un module de caméra et procédé de fonctionnement de dispositif électronique | |
WO2022244970A1 (fr) | Procédé de capture d'image de dispositif électronique, et dispositif électronique associé | |
WO2021251631A1 (fr) | Dispositif électronique comprenant une fonction de réglage de mise au point, et procédé associé | |
WO2021261737A1 (fr) | Dispositif électronique comprenant un capteur d'image, et procédé de commande de celui-ci | |
WO2024085673A1 (fr) | Dispositif électronique pour obtenir de multiples images d'exposition et son procédé de fonctionnement | |
WO2024122913A1 (fr) | Dispositif électronique pour acquérir une image à l'aide d'un modèle d'apprentissage automatique, et son procédé de fonctionnement | |
WO2022092607A1 (fr) | Dispositif électronique comportant un capteur d'image et procédé de fonctionnement de celui-ci | |
WO2024111924A1 (fr) | Procédé de fourniture d'image et dispositif électronique le prenant en charge | |
WO2024080767A1 (fr) | Dispositif électronique d'acquisition d'image à l'aide d'une caméra, et son procédé de fonctionnement | |
WO2024117587A1 (fr) | Dispositif électronique fournissant une fonction de filtre et son procédé de fonctionnement | |
WO2024085487A1 (fr) | Dispositif électronique, procédé et support de stockage lisible par ordinateur non transitoire destinés au changement de réglage de caméra | |
WO2023018262A1 (fr) | Procédé de fourniture d'image et dispositif électronique le prenant en charge | |
WO2022231270A1 (fr) | Dispositif électronique et son procédé de traitement d'image | |
WO2024106746A1 (fr) | Dispositif électronique et procédé d'augmentation de la résolution d'une image de bokeh numérique | |
WO2023234705A1 (fr) | Dispositif électronique comprenant un capteur d'image et procédé de fonctionnement associé | |
WO2024158139A1 (fr) | Procédé de commande d'équilibrage des blancs, et dispositif électronique le prenant en charge | |
WO2024076101A1 (fr) | Procédé de traitement d'images sur la base de l'intelligence artificielle et dispositif électronique conçu pour prendre en charge le procédé | |
WO2023033396A1 (fr) | Dispositif électronique pour traiter une entrée de prise de vue continue, et son procédé de fonctionnement | |
WO2024205028A1 (fr) | Dispositif électronique comprenant un capteur d'image et procédé de fonctionnement de celui-ci | |
WO2024053824A1 (fr) | Dispositif électronique pour fournir une image et son procédé de fonctionnement |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 23880249; Country of ref document: EP; Kind code of ref document: A1 |