WO2024111924A1 - Method for providing image and electronic device supporting same - Google Patents

Method for providing image and electronic device supporting same

Info

Publication number
WO2024111924A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
camera
processor
electronic device
information
Prior art date
Application number
PCT/KR2023/017143
Other languages
English (en)
Korean (ko)
Inventor
곽성신
김성오
임광용
송인선
이다솜
최지환
Original Assignee
삼성전자주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020220176829A (published as KR20240078250A)
Application filed by 삼성전자주식회사
Publication of WO2024111924A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/40 Image enhancement or restoration using histogram techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/69 Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N23/951 Computational photography systems, e.g. light-field imaging systems, by using two or more images to influence resolution, frame rate or aspect ratio
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265 Mixing

Definitions

  • Various embodiments of the present disclosure relate to a method of providing an image and an electronic device supporting the same.
  • An electronic device may include a plurality of cameras to provide improved camera functions.
  • the electronic device may include an ultra-wide-angle camera, a wide-angle camera, and a telephoto camera, each having different basic zoom magnifications.
  • An electronic device can provide images with higher quality by combining a plurality of images acquired through each of a plurality of cameras.
  • the electronic device may use calibration (also referred to as “multi calibration”) information of a plurality of cameras to synthesize the plurality of images.
  • Calibration information of the plurality of cameras may include physical position differences between the plurality of cameras (and differences between the optical axes of the plurality of cameras).
  • Calibration information for a plurality of cameras may be obtained using specialized equipment during the process of manufacturing a camera module including a plurality of cameras or during the process of replacing the camera module with another camera module.
  • Calibration information of a plurality of cameras included in an electronic device may change while the electronic device is being used. For example, when the electronic device collides with an external object (e.g., when the electronic device falls to the floor and collides with the floor), or when the electronic device deteriorates (e.g., when the adhesive force by which the plurality of cameras are attached to the electronic device is weakened), the physical position differences between the plurality of cameras may change, and differences between the optical axes of the plurality of cameras may occur (or change).
  • One embodiment of the present disclosure relates to a method of providing an image that obtains a high-pixel image by combining a plurality of images acquired through a plurality of cameras and updates calibration information based on information acquired while obtaining the high-pixel image, and to an electronic device supporting the same.
  • An electronic device according to one embodiment may include a display module, a plurality of cameras including a first camera and a second camera having a narrower field of view than the first camera, a memory that stores calibration information including a difference between the positions of the first camera and the second camera and/or a difference between the optical axes of the first camera and the second camera, and at least one processor.
  • the at least one processor may acquire a first image through the first camera and a second image through the second camera.
  • the at least one processor may determine a first area corresponding to a second image within the first image based on the calibration information.
  • the at least one processor may align the first image and the second image by performing matching on the first area and the second image.
  • the at least one processor may stitch the aligned first image and the second image.
  • the at least one processor may obtain an image to be displayed through the display module by performing an image enhancement operation using the stitched first image and the second image.
  • the at least one processor may update the calibration information based on information for aligning the first image and the second image.
  • a method of providing an image in an electronic device according to one embodiment may include acquiring a first image through a first camera and acquiring a second image through a second camera having a narrower field of view than the first camera. The method may include determining a first area corresponding to the second image within the first image, based on calibration information including a difference between the positions of the first camera and the second camera and/or a difference between the optical axes of the first camera and the second camera. The method may include aligning the first image and the second image by performing matching on the first area and the second image. The method may include stitching the aligned first image and the second image. The method may include obtaining an image to be displayed through a display module of the electronic device by performing an image enhancement operation using the stitched first image and the second image. The method may include updating the calibration information based on information for aligning the first image and the second image.
  • in a non-transitory computer-readable medium having computer-executable instructions recorded thereon according to one embodiment, the computer-executable instructions, when executed, may cause an electronic device including at least one processor to acquire a first image through a first camera and acquire a second image through a second camera having a narrower angle of view than the first camera.
  • the computer-executable instructions, when executed, may cause the electronic device including the at least one processor to determine a first area corresponding to the second image within the first image, based on calibration information including a difference between the positions of the first camera and the second camera and/or a difference between the optical axes of the first camera and the second camera.
  • the computer-executable instructions, when executed, may cause the electronic device including the at least one processor to align the first image and the second image by performing matching on the first area and the second image.
  • the computer-executable instructions, when executed, may cause the electronic device including the at least one processor to stitch the aligned first image and the second image.
  • the computer-executable instructions, when executed, may cause the electronic device including the at least one processor to obtain an image to be displayed through a display module of the electronic device by performing an image enhancement operation using the stitched first image and the second image.
  • the computer-executable instructions, when executed, may cause the electronic device including the at least one processor to update the calibration information based on information for aligning the first image and the second image.
  • a method for providing an image according to an embodiment and an electronic device supporting the same may obtain a high-pixel image by combining a plurality of images acquired through a plurality of cameras, and may update calibration information based on information acquired while obtaining the high-pixel image.
  • FIG. 1 is a block diagram of an electronic device in a network environment, according to one embodiment.
  • FIG. 2 is a block diagram illustrating a camera module, according to one embodiment.
  • Figure 3 is a block diagram of an electronic device, according to one embodiment.
  • Figure 4 is a diagram for explaining calibration information according to one embodiment.
  • Figure 5 is a block diagram of a processor, according to one embodiment.
  • Figure 6 is a flowchart for explaining a method of providing an image, according to one embodiment.
  • Figure 7 is a diagram for explaining a method of providing an image, according to an embodiment.
  • FIG. 8 is a flowchart illustrating a method of correcting image properties according to an embodiment.
  • FIG. 9 is a diagram for explaining a method of correcting image properties according to an embodiment.
  • FIG. 10 is a diagram for explaining a method of providing an image according to an embodiment.
  • FIG. 11 is a diagram for explaining a method of providing an image, according to an embodiment.
  • FIG. 1 is a block diagram of an electronic device 101 in a network environment 100, according to one embodiment.
  • the electronic device 101 may communicate with the electronic device 102 through a first network 198 (e.g., a short-range wireless communication network), or may communicate with at least one of the electronic device 104 or the server 108 through a second network 199 (e.g., a long-range wireless communication network). According to one embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
  • the electronic device 101 may include a processor 120, a memory 130, an input module 150, a sound output module 155, a display module 160, an audio module 170, a sensor module 176, an interface 177, a connection terminal 178, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
  • in some embodiments, at least one of these components (e.g., the connection terminal 178) may be omitted, or one or more other components may be added to the electronic device 101.
  • in some embodiments, some of these components (e.g., the sensor module 176, the camera module 180, or the antenna module 197) may be integrated into one component (e.g., the display module 160).
  • the processor 120 may, for example, execute software (e.g., the program 140) to control at least one other component (e.g., a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or computations. According to one embodiment, as at least part of the data processing or computation, the processor 120 may store commands or data received from another component (e.g., the sensor module 176 or the communication module 190) in the volatile memory 132, process the commands or data stored in the volatile memory 132, and store the resulting data in the non-volatile memory 134.
  • the processor 120 may include a main processor 121 (e.g., a central processing unit or an application processor) or an auxiliary processor 123 (e.g., a graphics processing unit, a neural processing unit (NPU), an image signal processor, a sensor hub processor, or a communication processor) that can operate independently of, or together with, the main processor 121.
  • for example, when the electronic device 101 includes the main processor 121 and the auxiliary processor 123, the auxiliary processor 123 may be set to use lower power than the main processor 121 or to be specialized for a designated function.
  • the auxiliary processor 123 may be implemented separately from the main processor 121 or as part of it.
  • the auxiliary processor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (e.g., the display module 160, the sensor module 176, or the communication module 190) on behalf of the main processor 121 while the main processor 121 is in an inactive (e.g., sleep) state, or together with the main processor 121 while the main processor 121 is in an active (e.g., application execution) state.
  • according to one embodiment, the auxiliary processor 123 (e.g., an image signal processor or a communication processor) may be implemented as part of another functionally related component (e.g., the camera module 180 or the communication module 190).
  • the auxiliary processor 123 may include a hardware structure specialized for processing artificial intelligence models.
  • Artificial intelligence models can be created through machine learning. For example, such learning may be performed in the electronic device 101 itself on which the artificial intelligence model is performed, or may be performed through a separate server (e.g., server 108).
  • Learning algorithms may include, for example, supervised learning, unsupervised learning, semi-supervised learning, or reinforcement learning, but are not limited thereto.
  • An artificial intelligence model may include multiple artificial neural network layers.
  • An artificial neural network may be one of a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), a restricted Boltzmann machine (RBM), a deep belief network (DBN), a bidirectional recurrent deep neural network (BRDNN), or a deep Q-network, or a combination of two or more of the above, but is not limited to the examples described above.
  • artificial intelligence models may additionally or alternatively include software structures.
  • the memory 130 may store various data used by at least one component (eg, the processor 120 or the sensor module 176) of the electronic device 101. Data may include, for example, input data or output data for software (e.g., program 140) and instructions related thereto.
  • Memory 130 may include volatile memory 132 or non-volatile memory 134.
  • the program 140 may be stored as software in the memory 130 and may include, for example, an operating system 142, middleware 144, or application 146.
  • the input module 150 may receive commands or data to be used in a component of the electronic device 101 (e.g., the processor 120) from outside the electronic device 101 (e.g., a user).
  • the input module 150 may include, for example, a microphone, mouse, keyboard, keys (eg, buttons), or digital pen (eg, stylus pen).
  • the sound output module 155 may output sound signals to the outside of the electronic device 101.
  • the sound output module 155 may include, for example, a speaker or a receiver. Speakers can be used for general purposes such as multimedia playback or recording playback.
  • the receiver can be used to receive incoming calls. According to one embodiment, the receiver may be implemented separately from the speaker or as part of it.
  • the display module 160 can visually provide information to the outside of the electronic device 101 (eg, a user).
  • the display module 160 may include, for example, a display, a hologram device, or a projector, and a control circuit for controlling the device.
  • the display module 160 may include a touch sensor configured to detect a touch, or a pressure sensor configured to measure the intensity of force generated by the touch.
  • the audio module 170 can convert sound into an electrical signal or, conversely, convert an electrical signal into sound. According to one embodiment, the audio module 170 may acquire sound through the input module 150, or may output sound through the sound output module 155 or an external electronic device (e.g., the electronic device 102, such as a speaker or headphones) directly or wirelessly connected to the electronic device 101.
  • the sensor module 176 may detect an operating state (e.g., power or temperature) of the electronic device 101 or an external environmental state (e.g., a user state), and generate an electrical signal or data value corresponding to the detected state.
  • the sensor module 176 includes, for example, a gesture sensor, a gyro sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an IR (infrared) sensor, a biometric sensor, It may include a temperature sensor, humidity sensor, or light sensor.
  • the interface 177 may support one or more designated protocols that can be used to connect the electronic device 101 directly or wirelessly with an external electronic device (eg, the electronic device 102).
  • the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
  • connection terminal 178 may include a connector through which the electronic device 101 can be physically connected to an external electronic device (eg, the electronic device 102).
  • the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
  • the haptic module 179 can convert electrical signals into mechanical stimulation (e.g., vibration or movement) or electrical stimulation that the user can perceive through tactile or kinesthetic senses.
  • the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
  • the camera module 180 can capture still images and moving images.
  • the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
  • the power management module 188 can manage power supplied to the electronic device 101.
  • the power management module 188 may be implemented as at least a part of, for example, a power management integrated circuit (PMIC).
  • the battery 189 may supply power to at least one component of the electronic device 101.
  • the battery 189 may include, for example, a non-rechargeable primary battery, a rechargeable secondary battery, or a fuel cell.
  • the communication module 190 may include a wireless communication module 192 (e.g., a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (e.g., a local area network (LAN) communication module or a power line communication module).
  • among these communication modules, the corresponding communication module may communicate with the external electronic device 104 through the first network 198 (e.g., a short-range communication network such as Bluetooth, wireless fidelity (WiFi) direct, or infrared data association (IrDA)) or the second network 199 (e.g., a long-range communication network such as a legacy cellular network, a 5G network, a next-generation communication network, the Internet, or a computer network (e.g., LAN or WAN)).
  • the wireless communication module 192 may identify or authenticate the electronic device 101 within a communication network, such as the first network 198 or the second network 199, using subscriber information (e.g., an international mobile subscriber identity (IMSI)) stored in the subscriber identification module 196.
  • the wireless communication module 192 may support 5G networks after 4G networks and next-generation communication technologies, for example, NR access technology (new radio access technology).
  • NR access technology may support high-speed transmission of high-capacity data (enhanced mobile broadband (eMBB)), minimization of terminal power and access by multiple terminals (massive machine type communications (mMTC)), or ultra-reliable and low-latency communications (URLLC).
  • the wireless communication module 192 may support high frequency bands (eg, mmWave bands), for example, to achieve high data rates.
  • the wireless communication module 192 may use various technologies to secure performance in high frequency bands, for example, beamforming, massive array multiple-input and multiple-output (massive MIMO), and full-dimensional MIMO (FD-MIMO).
  • the wireless communication module 192 may support various requirements specified in the electronic device 101, an external electronic device (e.g., the electronic device 104), or a network system (e.g., the second network 199). According to one embodiment, the wireless communication module 192 may support a peak data rate (e.g., 20 Gbps or more) for realizing eMBB, loss coverage (e.g., 164 dB or less) for realizing mMTC, or U-plane latency (e.g., 0.5 ms or less for each of downlink and uplink, or a round trip of 1 ms or less) for realizing URLLC.
  • the antenna module 197 may transmit or receive signals or power to or from the outside (eg, an external electronic device).
  • the antenna module 197 may include an antenna including a radiator made of a conductor or a conductive pattern formed on a substrate (eg, PCB).
  • the antenna module 197 may include a plurality of antennas (e.g., an array antenna). In this case, at least one antenna suitable for a communication method used in a communication network, such as the first network 198 or the second network 199, may be selected from the plurality of antennas by, for example, the communication module 190. Signals or power may be transmitted or received between the communication module 190 and an external electronic device through the selected at least one antenna.
  • other components (e.g., a radio frequency integrated circuit (RFIC)) may additionally be formed as part of the antenna module 197.
  • the antenna module 197 may form a mmWave antenna module.
  • a mmWave antenna module may include a printed circuit board, an RFIC disposed on or adjacent to a first side (e.g., the bottom side) of the printed circuit board and capable of supporting a designated high frequency band (e.g., the mmWave band), and a plurality of antennas (e.g., an array antenna) disposed on or adjacent to a second side (e.g., the top or a side) of the printed circuit board and capable of transmitting or receiving signals in the designated high frequency band.
  • At least some of the above-described components may be connected to each other through a communication method between peripheral devices (e.g., a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (e.g., commands or data) with each other.
  • commands or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199.
  • Each of the external electronic devices 102 or 104 may be of the same or different type as the electronic device 101.
  • all or part of the operations performed in the electronic device 101 may be executed in one or more of the external electronic devices 102, 104, or 108.
  • for example, instead of executing a function or service on its own, or in addition thereto, the electronic device 101 may request one or more external electronic devices to perform at least part of the function or service.
  • One or more external electronic devices that have received the request may execute at least part of the requested function or service, or an additional function or service related to the request, and transmit the result of the execution to the electronic device 101.
  • the electronic device 101 may process the result as is or additionally and provide it as at least part of a response to the request.
  • to this end, cloud computing, distributed computing, mobile edge computing (MEC), or client-server computing technology may be used, for example.
  • the electronic device 101 may provide an ultra-low latency service using, for example, distributed computing or mobile edge computing.
  • the external electronic device 104 may include an Internet of Things (IoT) device.
  • Server 108 may be an intelligent server using machine learning and/or neural networks.
  • the external electronic device 104 or server 108 may be included in the second network 199.
  • the electronic device 101 may be applied to intelligent services (e.g., smart home, smart city, smart car, or healthcare) based on 5G communication technology and IoT-related technology.
  • An electronic device may be of various types.
  • Electronic devices may include, for example, portable communication devices (e.g., smartphones), computer devices, portable multimedia devices, portable medical devices, cameras, wearable devices, or home appliances.
  • Electronic devices according to embodiments of this document are not limited to the above-described devices.
  • terms such as "first", "second", or "first or second" may be used simply to distinguish one component from another, and do not limit the components in other respects (e.g., importance or order).
  • When one (e.g., first) component is referred to as being "coupled" or "connected" to another (e.g., second) component, with or without the terms "functionally" or "communicatively," it means that the one component can be connected to the other component directly (e.g., by wire), wirelessly, or through a third component.
  • the term "module" used in one embodiment of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
  • a module may be an integrated part or a minimum unit of the parts or a part thereof that performs one or more functions.
  • the module may be implemented in the form of an application-specific integrated circuit (ASIC).
  • One embodiment of the present document may be implemented as software (e.g., the program 140) including one or more instructions stored in a storage medium (e.g., the built-in memory 136 or the external memory 138) that can be read by a machine (e.g., the electronic device 101).
  • for example, a processor (e.g., the processor 120) of the machine (e.g., the electronic device 101) may call at least one of the one or more instructions stored in the storage medium and execute it.
  • the one or more instructions may include code generated by a compiler or code that can be executed by an interpreter.
  • a storage medium that can be read by a device may be provided in the form of a non-transitory storage medium.
  • 'non-transitory' only means that the storage medium is a tangible device and does not contain signals (e.g., electromagnetic waves); this term does not distinguish between a case in which data is stored semi-permanently in the storage medium and a case in which data is stored temporarily.
  • a method according to an embodiment disclosed in this document may be provided as included in a computer program product.
  • Computer program products are commodities and can be traded between sellers and buyers.
  • the computer program product may be distributed in the form of a machine-readable storage medium (e.g., a compact disc read only memory (CD-ROM)), or may be distributed (e.g., downloaded or uploaded) online through an application store (e.g., Play Store TM ) or directly between two user devices (e.g., smartphones).
  • a portion of the computer program product may be at least temporarily stored or temporarily created in a machine-readable storage medium, such as the memory of a manufacturer's server, an application store server, or a relay server.
  • each component (e.g., module or program) of the above-described components may include a single or multiple entities, and some of the multiple entities may be separately placed in other components.
  • one or more of the above-described corresponding components or operations may be omitted, or one or more other components or operations may be added.
  • according to one embodiment, multiple components (e.g., modules or programs) may be integrated into one component.
  • the integrated component may perform one or more functions of each component of the plurality of components identically or similarly to those performed by the corresponding component of the plurality of components prior to the integration. .
  • operations performed by a module, program, or other component may be executed sequentially, in parallel, repeatedly, or heuristically, one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.
  • Figure 2 is a block diagram 200 illustrating a camera module 180, according to one embodiment.
  • the camera module 180 may include a lens assembly 210, a flash 220, an image sensor 230, an image stabilizer 240, a memory 250 (e.g., a buffer memory), or an image signal processor 260.
  • the lens assembly 210 may collect light emitted from a subject that is the target of image capture.
  • Lens assembly 210 may include one or more lenses.
  • the camera module 180 may include a plurality of lens assemblies 210. In this case, the camera module 180 may form, for example, a dual camera, a 360-degree camera, or a spherical camera.
  • Some of the plurality of lens assemblies 210 may have the same lens properties (e.g., angle of view, focal length, autofocus, f number, or optical zoom), or at least one lens assembly may have one or more lens properties that are different from the lens properties of another lens assembly.
  • the lens assembly 210 may include, for example, a wide-angle lens or a telephoto lens.
  • the flash 220 may emit light used to enhance light emitted or reflected from a subject.
  • the flash 220 may include one or more light emitting diodes (eg, red-green-blue (RGB) LED, white LED, infrared LED, or ultraviolet LED), or a xenon lamp.
  • the image sensor 230 may acquire an image corresponding to the subject by converting light emitted or reflected from the subject and transmitted through the lens assembly 210 into an electrical signal.
  • the image sensor 230 may include one image sensor selected from among image sensors with different properties, such as an RGB sensor, a black-and-white (BW) sensor, an IR sensor, or a UV sensor, a plurality of image sensors having the same properties, or a plurality of image sensors having different properties.
  • Each image sensor included in the image sensor 230 may be implemented using, for example, a charged coupled device (CCD) sensor or a complementary metal oxide semiconductor (CMOS) sensor.
  • in response to the movement of the camera module 180 or the electronic device 101 including it, the image stabilizer 240 may move at least one lens included in the lens assembly 210 or the image sensor 230 in a specific direction, or may control the operating characteristics of the image sensor 230 (e.g., adjust the read-out timing). This makes it possible to compensate for at least some of the negative effects of the movement on the image being captured.
  • the image stabilizer 240 may detect such movement of the camera module 180 or the electronic device 101 using a gyro sensor (not shown) or an acceleration sensor (not shown) disposed inside or outside the camera module 180.
  • the image stabilizer 240 may be implemented as, for example, an optical image stabilizer.
  • the memory 250 may at least temporarily store at least a portion of an image acquired through the image sensor 230 for a subsequent image processing task. For example, when image acquisition is delayed due to the shutter, or when multiple images are acquired at high speed, the acquired original image (e.g., a Bayer-patterned image or a high-resolution image) may be stored in the memory 250, and a corresponding copy image (e.g., a low-resolution image) may be previewed through the display module 160. Thereafter, when a specified condition is satisfied (e.g., a user input or a system command), at least a portion of the original image stored in the memory 250 may be obtained and processed, for example, by the image signal processor 260. According to one embodiment, the memory 250 may be configured as at least part of the memory 130 or as a separate memory that operates independently of it.
  • the image signal processor 260 may perform one or more image processes on an image acquired through the image sensor 230 or an image stored in the memory 250.
  • the one or more image processes may include, for example, depth map creation, three-dimensional modeling, panorama creation, feature point extraction, image compositing, or image compensation (e.g., noise reduction, resolution adjustment, brightness adjustment, blurring, sharpening, or softening). Additionally or alternatively, the image signal processor 260 may perform control (e.g., exposure time control or read-out timing control) of at least one of the components included in the camera module 180 (e.g., the image sensor 230).
  • the image processed by the image signal processor 260 may be stored back in the memory 250 for further processing.
  • the image signal processor 260 may be configured as at least a part of the processor 120, or the image signal processor 260 may be configured as a separate processor that operates independently of the processor 120. When configured as a separate processor, at least one image processed by the image signal processor 260 may be displayed through the display module 160 as is or after additional image processing by the processor 120.
  • the electronic device 101 may include a plurality of camera modules 180, each having different properties or functions.
  • at least one of the plurality of camera modules 180 may be a wide-angle camera, and at least another one may be a telephoto camera.
  • at least one of the plurality of camera modules 180 may be a front camera, and at least another one may be a rear camera.
  • Figure 3 is a block diagram of an electronic device 301, according to one embodiment.
  • the electronic device 301 may include a display module 310, a camera module 320, a memory 330, and/or a processor 340.
  • display module 310 may be display module 160 of FIG. 1 .
  • the display module 310 may display an image acquired by the camera module 320.
  • the display module 310 may display dynamic images (e.g., preview images or video images) and/or still images (e.g., captured images) acquired by the camera module 320.
  • camera module 320 may be camera module 180 of FIG. 1 or FIG. 2 .
  • the camera module 320 may include a plurality of cameras (hereinafter referred to as “plurality of cameras”) having different fields of view.
  • the camera module 320 may include a first camera 321 (also referred to as an "ultra wide camera") having an angle of view ranging from about 100 degrees to about 120 degrees, a second camera 322 (also referred to as a "wide camera") having an angle of view of about 80 degrees, and a third camera 323 (also referred to as a "telephoto camera") having an angle of view ranging from about 10 degrees to about 40 degrees.
  • the focal lengths of the lens of the first camera 321, the lens of the second camera 322, and the lens of the third camera 323 may be different.
  • for example, the focal length of the lens of the first camera 321 may be about 7 mm to about 15 mm, the focal length of the lens of the second camera 322 may be about 15 mm to about 35 mm, and the focal length of the lens of the third camera 323 may be about 70 mm to about 250 mm.
  • the angles of view and focal lengths of the plurality of cameras described above may be examples.
  • the camera module 320 in FIG. 3 is illustrated as including a first camera 321, a second camera 322, and a third camera 323, but is not limited thereto.
  • for example, the camera module 320 may not include the third camera 323, or may include, in addition to the first camera 321, the second camera 322, and the third camera 323, an additional camera having an angle of view different from those of the first camera 321, the second camera 322, and the third camera 323.
  • memory 330 may be memory 130 of FIG. 1 or memory 250 of FIG. 2 .
  • the memory 330 may store information for performing an operation to provide an image.
  • memory 330 may further include calibration information 331.
  • the calibration information 331 may include physical position differences between the plurality of cameras 321, 322, and 323 (and/or differences between the optical axes of the plurality of cameras 321, 322, and 323).
  • hereinafter, the calibration information 331 will be described in more detail with reference to FIG. 4.
  • FIG. 4 is a diagram for explaining calibration information 331 according to an embodiment.
  • reference numeral 401 may indicate the angle of view areas 410, 420, and 430 of each of the plurality of cameras 321, 322, and 323 after calibration (e.g., calibration performed during the manufacturing process of the camera module 320) has been performed.
  • the angle of view area 410 may represent a scene of a subject captured by the first camera 321.
  • the angle of view area 420 may represent a scene of a subject captured by the second camera 322.
  • the angle of view area 430 may represent a scene of a subject captured by the third camera 323.
  • the angle of view area 410 may include the angle of view area 420 and the angle of view area 430
  • the angle of view area 420 may include the angle of view area 430 .
  • there may be differences between the positions where the plurality of cameras 321, 322, and 323 are disposed in the electronic device 301 (e.g., distances between the plurality of cameras 321, 322, and 323).
  • in addition, the optical axes of the plurality of cameras 321, 322, and 323 (e.g., the axes passing through the centers of the angles of view of the plurality of cameras 321, 322, and 323) may be different from one another.
  • the optical axis of the camera may refer to the axis passing through the center of the camera lens and the center of the image sensor (or the axis passing through the center of the camera's angle of view).
  • during the calibration process, the arrangement of the plurality of cameras 321, 322, and 323 may be adjusted (e.g., the optical axes may be adjusted) so that the optical axes of the plurality of cameras 321, 322, and 323 are parallel to each other, and calibration information 331 including the differences between the positions at which the plurality of cameras 321, 322, and 323 are disposed (and/or the differences between their optical axes) may be stored in the memory 330 of the electronic device 301.
  • while the electronic device 301 is being used, the positions of the plurality of cameras 321, 322, and 323 may change. For example, the position of one of the plurality of cameras 321, 322, and 323 may change along the X-axis, Y-axis, and/or Z-axis. When the positions of the plurality of cameras 321, 322, and 323 change, the axes passing through the centers of the angles of view of the plurality of cameras 321, 322, and 323 may change, and differences may occur between the optical axes of the plurality of cameras 321, 322, and 323.
  • the optical axes of the plurality of cameras 321, 322, and 323, which were parallel to each other, may become non-parallel.
  • one of the optical axes of the plurality of cameras 321, 322, and 323 may be changed (eg, rotated) based on the X-axis, Y-axis, and/or Z-axis.
  • the optical axis of the second camera 322 may be rotated based on the Z-axis.
  • in this case, the angle of view area (e.g., the angle of view area 420) before the optical axis of the second camera 322 is rotated and the angle of view area (e.g., the angle of view area 421) after the optical axis of the second camera 322 is rotated may be different from each other.
  • hereinafter, the differences between the optical axes of the plurality of cameras 321, 322, and 323 (e.g., the angles formed between the optical axes of the plurality of cameras 321, 322, and 323) that occur when one of the optical axes of the plurality of cameras 321, 322, and 323 is changed about the X-axis, Y-axis, and/or Z-axis will be referred to as "differences between the optical axes of the plurality of cameras" or "distortion between the optical axes of the plurality of cameras".
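  • For illustration only, the calibration information 331 described above (position differences and optical-axis differences between camera pairs) could be held in a structure such as the following sketch; the field names, units, and pairing scheme are assumptions of this illustration, not the format used by the electronic device 301.

```python
# Hypothetical sketch of calibration information 331; field names and units are assumptions.
from dataclasses import dataclass, field

@dataclass
class PairCalibration:
    # Physical position difference between the two cameras (e.g., millimetres along X/Y/Z).
    position_offset_mm: tuple = (0.0, 0.0, 0.0)
    # Difference between the optical axes (e.g., rotation angles in degrees about X/Y/Z).
    optical_axis_offset_deg: tuple = (0.0, 0.0, 0.0)

@dataclass
class CalibrationInfo:
    # One entry per camera pair, e.g., ("ultra_wide", "wide") -> PairCalibration.
    pairs: dict = field(default_factory=dict)

# Example: calibration between the first (ultra-wide) and second (wide) cameras.
calibration_331 = CalibrationInfo(pairs={
    ("ultra_wide", "wide"): PairCalibration(
        position_offset_mm=(12.0, 0.5, 0.0),
        optical_axis_offset_deg=(0.1, -0.05, 0.02),
    ),
})
```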
  • the electronic device 301 (e.g., the processor 340) may synthesize a plurality of images acquired through each of the plurality of cameras 321, 322, and 323 based on the calibration information 331.
  • the electronic device 301 may update the calibration information 331, which includes the difference between the positions of the plurality of cameras 321, 322, and 323 and/or the difference between the optical axes of the plurality of cameras 321, 322, and 323, based on information acquired while synthesizing the plurality of images (e.g., information acquired while aligning the plurality of images), and may store the updated calibration information 331 in the memory 330.
  • the operation of the electronic device 301 to synthesize a plurality of images acquired through each of the plurality of cameras 321, 322, and 323 and the operation of updating the calibration information 331 will be described in detail later.
  • processor 340 may be processor 120 of Figure 1.
  • processor 340 may control overall operations for providing images. In one embodiment, processor 340 may include one or more processors to provide images.
  • the processor 340 may include a plurality of components to perform an operation of providing an image.
  • a plurality of components included in the processor 340 will be described in detail with reference to FIG. 5.
  • the electronic device 301 is illustrated as including a display module 310, a camera module 320, a memory 330, and/or a processor 340, but is not limited thereto.
  • the electronic device 301 may further include at least one component shown in FIG. 1 .
  • Figure 5 is a block diagram of the processor 340, according to one embodiment.
  • the processor 340 may include a correction module 510, an alignment module 520, a stitching module 530, an image enhancement module 540, and/or a calibration information management module 550.
  • the correction module 510 may perform correction on a plurality of images acquired from each of the plurality of cameras 321, 322, and 323, based on the calibration information 331 obtained from the calibration information management module 550. For example, when a first image and a second image are acquired through the first camera 321 and the second camera 322, respectively, the correction module 510 may determine (e.g., detect) a first area corresponding to the second image within the first image, based on the calibration information related to the first camera 321 and the second camera 322. The correction module 510 may obtain information for changing the properties of the first image so that the properties of the first image become the same as the properties of the second image, based on the pixel values of the first area and the pixel values of the second image. The correction module 510 may correct the first image and/or the second image based on the information for changing the properties of the first image.
  • the correction module 510 may further perform an operation to correct the brightness of the first image and/or the brightness of the second image.
  • for example, the correction module 510 may control (e.g., set or adjust) parameters related to the settings of the first camera 321 and/or the second camera 322 so that the first image and the second image have the same exposure and/or white balance.
  • a more detailed description of the operations performed by the correction module 510 will be provided later with reference to FIGS. 8 and 9.
  • the alignment module 520 may determine (e.g., detect) a first area within the first image corresponding to the second image, based on the calibration information associated with the first camera 321 and the second camera 322.
  • the alignment module 520 may align the first image and the second image by performing matching on the first area and the second image.
  • the alignment module 520 may acquire information for aligning the first image and the second image by performing matching on the first area and the second image.
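  • As a non-authoritative illustration of the matching performed by the alignment module 520, the sketch below detects feature points in the first area and the second image, matches them, and estimates a warp and a shift from the point correspondences; ORB, brute-force matching, and RANSAC are stand-in choices only, and the commented-out input file names are hypothetical.

```python
# Hedged sketch: align a region of the wide image with the narrow image via feature matching.
# ORB/RANSAC are illustrative choices; the sketch assumes enough matches are found.
import cv2
import numpy as np

def align(first_area: np.ndarray, second_image: np.ndarray):
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(first_area, None)    # "first feature points"
    kp2, des2 = orb.detectAndCompute(second_image, None)  # "second feature points"

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Homography mapping the second image onto the first area; this warp plus the
    # average translation between matched points is the "information for aligning".
    warp, inliers = cv2.findHomography(pts2, pts1, cv2.RANSAC, 3.0)
    good = inliers.ravel() == 1
    shift = (pts1[good] - pts2[good]).mean(axis=0)
    return warp, shift

# first_area = cv2.imread("wide_region.png", cv2.IMREAD_GRAYSCALE)   # hypothetical inputs
# second_image = cv2.imread("tele.png", cv2.IMREAD_GRAYSCALE)
# warp, shift = align(first_area, second_image)
```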
  • the stitching module 530 may stitch (eg, combine) the aligned first and second images.
  • the image enhancement module 540 may obtain an image to be displayed through the display module 310 by performing an image enhancement operation using the first image and the second image.
  • the image enhancement module 540 may obtain an image to be displayed through the display module 310 by performing a super resolution (SR) operation on the first image and the second image.
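  • A minimal, illustrative stand-in for such an image enhancement operation is sketched below: the wider first image is upscaled and the sharper, aligned second image is feathered into the region it covers; the blending strategy and names are assumptions, and an actual SR operation (e.g., a learned model) could be substituted.

```python
# Illustrative stand-in for SR-style enhancement: upscale the wide image and feather
# the aligned narrow-field image into the region it covers. Names are assumptions.
import cv2
import numpy as np

def enhance(first_image, aligned_second_image, region_xywh, scale=2.0):
    x, y, w, h = [int(round(v * scale)) for v in region_xywh]
    up = cv2.resize(first_image, None, fx=scale, fy=scale, interpolation=cv2.INTER_CUBIC)

    roi = up[y:y + h, x:x + w]
    rh, rw = roi.shape[:2]
    patch = cv2.resize(aligned_second_image, (rw, rh), interpolation=cv2.INTER_CUBIC)

    # Feathered mask so the stitching seam is not visible in the output image.
    mask = np.zeros((rh, rw), np.float32)
    cv2.rectangle(mask, (8, 8), (rw - 9, rh - 9), 1.0, thickness=-1)
    mask = cv2.GaussianBlur(mask, (17, 17), 0)
    if patch.ndim == 3:
        mask = mask[:, :, None]

    up[y:y + h, x:x + w] = (mask * patch + (1.0 - mask) * roi).astype(up.dtype)
    return up
```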
  • the calibration information management module 550 may manage the calibration information 331 stored in the memory 330. For example, the calibration information management module 550 may update the calibration information 331 stored in the memory 330 based on the information for aligning the first image and the second image obtained by the alignment module 520.
  • the processor 340 is illustrated as including the correction module 510, the alignment module 520, the stitching module 530, the image enhancement module 540, and/or the calibration information management module 550, but is not limited thereto.
  • for example, at least two of the correction module 510, the alignment module 520, the stitching module 530, the image enhancement module 540, and/or the calibration information management module 550 may be integrated and implemented as one module.
  • as another example, the processor 340 may not include at least one of the correction module 510, the alignment module 520, the stitching module 530, the image enhancement module 540, and/or the calibration information management module 550, or may further include at least one additional module.
  • the electronic device 301 according to one embodiment may include the display module 310, a plurality of cameras including the first camera 321 and the second camera 322 having a narrower field of view than the first camera 321, the memory 330 that stores the calibration information 331 comprising the difference between the positions of the first camera 321 and the second camera 322 and/or the difference between the optical axes of the first camera 321 and the second camera 322, and at least one processor 340. The at least one processor 340 may acquire a first image through the first camera 321 and a second image through the second camera 322. The at least one processor 340 may determine a first area corresponding to the second image within the first image based on the calibration information 331.
  • the at least one processor 340 may align the first image and the second image by performing matching on the first area and the second image.
  • the at least one processor 340 may stitch the aligned first image and the second image.
  • the at least one processor 340 may obtain an image to be displayed through the display module 310 by performing an image enhancement operation using the stitched first image and the second image.
  • the at least one processor 340 may update the calibration information 331 based on information for aligning the first image and the second image.
  • the at least one processor 340 may be configured to acquire first feature points from the first area, acquire second feature points corresponding to the first feature points from the second image, and obtain information for aligning the first image and the second image based on the positions of the first feature points and the positions of the second feature points.
  • the at least one processor 340 may be configured to align the second image within the first image based on differences between the positions of the first feature points and the positions of the second feature points.
  • the information for aligning the first image and the second image may include information for positioning the second image within the first image and/or information for warping the second image.
  • the at least one processor 340 may be configured to update the difference between the positions of the first camera 321 and the second camera 322 based on the information for positioning the second image within the first image, and to update the difference between the optical axes of the first camera 321 and the second camera 322 based on the information for warping the second image.
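  • A hedged sketch of that update step is shown below: the measured placement shift nudges the stored position difference, and a rotation estimated from the measured warp nudges the stored optical-axis difference. The smoothing factor, the pixel-to-millimetre conversion, and the use of cv2.decomposeHomographyMat are illustrative assumptions, not the method of the disclosure; pair_calib refers to the hypothetical PairCalibration structure sketched earlier.

```python
# Illustrative feedback of alignment information into the stored calibration; constants are assumptions.
import cv2
import numpy as np

def update_calibration(pair_calib, shift_px, warp, camera_matrix,
                       px_per_mm=50.0, alpha=0.1):
    # Positioning information: the measured pixel shift of the second image inside
    # the first image nudges the stored position difference (exponential smoothing).
    measured_mm = np.array([shift_px[0], shift_px[1], 0.0]) / px_per_mm
    stored_pos = np.array(pair_calib.position_offset_mm)
    pair_calib.position_offset_mm = tuple((1 - alpha) * stored_pos + alpha * measured_mm)

    # Warping information: decompose the homography into candidate rotations and use
    # one rotation vector (converted to degrees) as an optical-axis difference estimate.
    _, rotations, _, _ = cv2.decomposeHomographyMat(warp, camera_matrix)
    rvec_deg = np.degrees(cv2.Rodrigues(rotations[0])[0].ravel())
    stored_axis = np.array(pair_calib.optical_axis_offset_deg)
    pair_calib.optical_axis_offset_deg = tuple((1 - alpha) * stored_axis + alpha * rvec_deg)
    return pair_calib
```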
  • the at least one processor 340 may be configured to obtain information for converting an attribute of the first image based on the first area and the second image, and to correct the first image based on the obtained information so that the attribute of the first image becomes the same as the attribute of the second image.
  • the at least one processor 340 may be configured to obtain an average of the pixel values of the first area and an average of the pixel values of the second image using histograms of the first image and the second image, obtain data for transforming the pixel values of the first area such that the average of the pixel values of the first area and the average of the pixel values of the second image become the same, and apply the obtained data to the first image.
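  • A minimal sketch of such a pixel-value correction, assuming a simple gain derived from the two averages (a full histogram-based transform could be used instead); all names are illustrative.

```python
# Minimal sketch: derive a gain from the two mean pixel values and apply it to the first image.
import numpy as np

def match_means(first_image, first_area, second_image):
    mean_first = first_area.astype(np.float32).mean()
    mean_second = second_image.astype(np.float32).mean()
    gain = mean_second / max(mean_first, 1e-6)  # "data for transforming" the pixel values
    corrected = np.clip(first_image.astype(np.float32) * gain, 0, 255)
    return corrected.astype(first_image.dtype)
```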
  • the at least one processor 340 may be configured to perform upscaling on the first image such that the size of the first area becomes the same as the size of the second image.
  • the at least one processor 340 may be configured to perform an SR operation on the stitched first image and the second image using a learned artificial intelligence model.
  • the at least one processor 340 may be configured to, after obtaining the image to be displayed through the display module 310, enlarge or reduce the obtained image with respect to a point of the obtained image based on a user input for changing the zoom magnification.
  • FIG. 6 is a flowchart 600 for explaining a method of providing an image, according to an embodiment.
  • Figure 7 is a diagram for explaining a method of providing an image, according to an embodiment.
  • the processor 340 may acquire a first image through the first camera 321 (hereinafter, the image acquired through the first camera 321 is referred to as the "first image") and acquire a second image through the second camera 322 (hereinafter, the image acquired through the second camera 322 is referred to as the "second image").
  • the processor 340 may acquire the first image and the second image substantially simultaneously through the first camera 321 (the ultra-wide-angle camera) and the second camera 322 (the wide-angle camera).
  • the processor 340 may acquire a first image through the first camera 321 and then acquire a second image through the second camera 322 within a specified time.
  • the scene represented by the first image may include the scene represented by the second image.
  • the scene represented by the first image may further include a scene existing within the field of view area of the first camera 321 in addition to the scene represented by the second image.
  • the first image and the second image may be preview images acquired in real time.
  • the present invention is not limited thereto, and the first image and the second image may be captured images obtained based on user input.
  • as another example, the first image and the second image may be images that were acquired through the first camera 321 and the second camera 322, respectively, and then stored in the memory 330 (e.g., in a gallery application).
  • although FIG. 6 illustrates that a first image is acquired through the first camera 321 and a second image is acquired through the second camera 322, the present disclosure is not limited thereto.
  • the processor 340 acquires a first image through the first camera 321, a second image through the second camera 322, and a third image through the third camera 323. Images can be acquired substantially simultaneously (or sequentially within a specified time).
  • hereinafter, a case in which the processor 340 acquires a first image through the first camera 321 and a second image through the second camera 322 will be described.
  • the processor 340 may perform an operation of correcting at least one attribute of the first image and/or the second image after the first image and the second image are acquired (and before performing operation 603). The operation of correcting at least one attribute of the first image and the second image will be described later with reference to FIGS. 8 and 9.
  • the processor 340 may determine, based on the calibration information 331, a first area within the first image corresponding to the second image (hereinafter, the area within the first image corresponding to the second image is referred to as "the first area").
  • the calibration information 331 may include differences between the positions of a plurality of cameras and/or differences between the optical axes of the plurality of cameras. In one embodiment, the calibration information 331 may be information for minimizing the disparity between the first image and the second image that occurs due to differences between the positions of the plurality of cameras (e.g., the difference between the positions of the first camera 321 and the second camera 322, the difference between the positions of the first camera 321 and the third camera 323, and/or the difference between the positions of the second camera 322 and the third camera 323) and/or differences between the optical axes of the plurality of cameras (e.g., the differences between the optical axes of the first camera 321, the second camera 322, and/or the third camera 323).
  • the first area corresponding to the second image within the first image may be an area containing substantially the same scene as the scene represented by the second image. In one embodiment, the first area corresponding to the second image within the first image may be an area containing feature points that are substantially the same as the feature points included in the second image. In one embodiment, the first area corresponding to the second image within the first image may be an area including a region of interest that is substantially the same as a region of interest included in the second image.
  • the processor 340 may determine (e.g., detect) the first area corresponding to the second image within the first image based on the difference between the positions of the plurality of cameras and/or the difference between the optical axes of the plurality of cameras included in the calibration information 331.
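  • As an illustration only (not part of this disclosure), the following Python sketch assumes the calibration information 331 has already been reduced offline to a field-of-view ratio and a pixel offset between the two cameras; the function name and the fov_ratio/offset_px parameters are hypothetical.

      import numpy as np

      def first_area_from_calibration(first_img, fov_ratio, offset_px=(0, 0)):
          """Return the (x, y, w, h) rectangle inside the ultra-wide image that
          roughly corresponds to the wide camera's field of view."""
          h, w = first_img.shape[:2]
          # The corresponding region scales with the field-of-view ratio
          # (second camera FoV / first camera FoV).
          rw, rh = int(w * fov_ratio), int(h * fov_ratio)
          # Centered by default, then shifted by the calibrated pixel offset
          # that accounts for the baseline between the two cameras.
          x = (w - rw) // 2 + offset_px[0]
          y = (h - rh) // 2 + offset_px[1]
          # Clamp to the image boundaries.
          x = max(0, min(x, w - rw))
          y = max(0, min(y, h - rh))
          return x, y, rw, rh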
  • the processor 340 may compensate for lens distortion occurring within the first image and the second image before determining the first area within the first image corresponding to the second image. For example, the processor 340 may check the lens distortion value of the lens of the first camera 321 and the lens distortion value of the lens of the second camera 322 stored in the memory 330. The processor 340 may compensate for the lens distortion (e.g., radial distortion) occurring in the first image and the second image based on the lens distortion value of the lens of the first camera 321 and the lens distortion value of the lens of the second camera 322.
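  • A minimal sketch of such compensation using OpenCV's cv2.undistort; the intrinsic matrix and distortion coefficients shown are placeholder values standing in for the lens distortion values stored in the memory 330, not values from this disclosure.

      import cv2
      import numpy as np

      # Hypothetical per-lens calibration values; K is the 3x3 intrinsic matrix
      # and dist holds the radial/tangential coefficients (k1, k2, p1, p2, k3).
      K_first = np.array([[1500.0, 0.0, 2000.0],
                          [0.0, 1500.0, 1500.0],
                          [0.0, 0.0, 1.0]])
      dist_first = np.array([-0.12, 0.03, 0.0, 0.0, 0.0])

      def compensate_distortion(image, K, dist):
          # cv2.undistort remaps the image so that straight lines stay straight.
          return cv2.undistort(image, K, dist)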
  • the processor 340 may perform upscaling on the first image before determining the first area within the first image corresponding to the second image.
  • the size of the first image acquired through the first camera 321 and the size of the second image acquired through the second camera 322 may be the same.
  • both the first image and the second image may have a resolution of 12M (e.g., 4000*3000 (4000 pixel values horizontally and 3000 pixel values vertically)).
  • the scene represented by the first image may include the scene represented by the second image.
  • the processor 340 may enlarge the first image by performing upscaling on the first image such that the size of the first area corresponding to the second image within the first image is substantially the same as (e.g., corresponds to) the size of the second image.
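  • A minimal sketch of the upscaling step, assuming a single uniform scale factor derived from the width of the first area and the width of the second image; the helper name and tuple layout are illustrative only.

      import cv2

      def upscale_to_match(first_img, first_area, second_img):
          """Upscale the ultra-wide image so that its region corresponding to
          the wide image ends up with the same pixel size as the wide image."""
          x, y, rw, rh = first_area
          sh, sw = second_img.shape[:2]
          scale = sw / float(rw)  # assume a single uniform scale factor
          up = cv2.resize(first_img, None, fx=scale, fy=scale,
                          interpolation=cv2.INTER_CUBIC)
          # The first area's coordinates scale together with the image.
          return up, (int(x * scale), int(y * scale), sw, sh)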
  • the processor 340 may align the first image and the second image by performing matching on the first area and the second image.
  • the first image and the second image may not be aligned.
  • the border of the first image and the border of the second image that adjoin each other may not be aligned (e.g., may not match).
  • reference numeral 701 in FIG. 7 may represent a screen in which a first image 710 acquired by the first camera 321 (e.g., an image acquired by the first camera 321 and then upscaled), a second image 720 acquired by the second camera 322 (e.g., an image acquired by the second camera 322 and then upscaled), and a third image 730 acquired by the third camera 323 are displayed overlapping one another.
  • Reference numeral 702 in FIG. 7 may represent an enlarged image of the area 740 within the screen of reference numeral 701.
  • area 711 may be part of the first image 710, area 721 may be part of the second image 720, and area 731 may be part of the third image 730.
  • the boundary of the area 711 and the boundary of the area 721 may not coincide with each other. For example, while a single number "5" is printed on the calendar being photographed, two "5"s may be displayed, as in the area 750 at reference numeral 702 in FIG. 7.
  • the processor 340 may perform matching on the first area and the second image. For example, the processor 340 may obtain (e.g., extract) feature points from the first area (hereinafter referred to as "first feature points") and feature points corresponding to the first feature points from the second image (hereinafter referred to as "second feature points"). The processor 340 may check the positions of the first feature points within the first area and the positions of the second feature points within the second image. The processor 340 may obtain information for aligning the first image and the second image based on the positions of the first feature points and the positions of the second feature points (e.g., the differences between the positions of the first feature points and the positions of the second feature points). The processor 340 may align the first image and the second image based on the positions of the first feature points and the positions of the second feature points (e.g., the differences between the positions of the first feature points and the positions of the second feature points).
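  • One possible instantiation of the matching step is sketched below using ORB features and a RANSAC homography from OpenCV; the disclosure does not mandate a particular feature detector or transform model, so these choices are assumptions.

      import cv2
      import numpy as np

      def alignment_from_features(first_area_img, second_img):
          """Match feature points between the cropped first area and the second
          image and estimate a homography that aligns the second image."""
          orb = cv2.ORB_create(2000)
          kp1, des1 = orb.detectAndCompute(first_area_img, None)
          kp2, des2 = orb.detectAndCompute(second_img, None)
          matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
          matches = sorted(matcher.match(des2, des1), key=lambda m: m.distance)[:500]
          src = np.float32([kp2[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
          dst = np.float32([kp1[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
          # Homography mapping second-image coordinates into first-area
          # coordinates; RANSAC rejects mismatched feature pairs.
          H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
          return H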
  • the processor 340 may align the first image and the second image based on the information for aligning the first image and the second image. For example, the processor 340 may align the second image within the first image based on the information for aligning the first image and the second image. For example, based on the information for aligning the first image and the second image, the processor 340 may perform an operation of positioning the second image within the first image so that the scene represented by the area including the boundary with the second image within the first image and the scene represented by the edge area of the second image continue seamlessly, and/or an operation of warping (e.g., rotation transform and/or perspective transform) the second image (and/or the first image).
  • the information for aligning the first image and the second image may include information for positioning the second image within the first image (e.g., information about where the second image will be stitched within the first image) so that the scene represented by the area including the border with the second image within the first image and the scene represented by the edge area of the second image continue without being disconnected, and/or information for warping the second image (and/or the first image).
  • the information for positioning the second image within the first image may include the difference between the position of the first area corresponding to the second image within the first image and the position of the second image within the first image after the first image and the second image are aligned.
  • the information for warping the second image (and/or the first image) may include the angle (or the amount of change in angle) used to rotate and/or perspective-transform the second image with respect to the first image so that the first image and the second image are aligned.
  • processor 340 may stitch the first image and the second image. For example, the processor 340 may stitch (eg, combine) the aligned first and second images.
  • the processor 340 may perform overlapping and/or blending (e.g., alpha blending) on the first image and the second image after aligning them. For example, the processor 340 may perform overlapping by replacing a part of the second area (e.g., an area including the boundary between the first image and the second image within the first image), which excludes the second image within the first image, with a part of the second image, or by replacing a part of the second image (e.g., an area including the boundary between the first image and the second image within the second image) with a part of the second area. For example, the processor 340 may perform blending to adjust the transparency of the first image and the transparency of the second image.
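  • A minimal sketch of the blending step, assuming 3-channel images and a simple horizontal alpha ramp across the overlap region; the ramp direction and width are illustrative choices, not requirements of this disclosure.

      import numpy as np

      def blend_seam(base, warped, overlap_mask, ramp_width=64):
          """Alpha-blend the warped second image into the first image inside
          the overlap region so the stitching boundary is not visible.
          Assumes (H, W, 3) images and a boolean (H, W) overlap mask."""
          out = base.copy()
          ys, xs = np.where(overlap_mask)
          if xs.size == 0:
              return out
          # Alpha ramps from 0 (keep the first image) at the left edge of the
          # overlap to 1 (use the second image) after ramp_width pixels.
          alpha = np.clip((xs - xs.min()) / float(ramp_width), 0.0, 1.0)[:, None]
          blended = (1.0 - alpha) * base[ys, xs] + alpha * warped[ys, xs]
          out[ys, xs] = blended.astype(base.dtype)
          return out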
  • the processor 340 may obtain an image to be displayed through the display module 310 by performing an image enhancement operation using the first image and the second image.
  • the processor 340 may obtain an image to be displayed through the display module 310 by performing an image enhancement operation on the first image and the second image stitched through operation 607.
  • an image enhancement operation may include an operation that can improve the quality (or picture quality) of an image.
  • an image enhancement operation may include an SR operation. However, it is not limited to this.
  • the SR operation may refer to an operation of acquiring an image with high resolution from one image or multiple images with low resolution.
  • the SR operation may include a single image super resolution (SISR) method (also referred to as "single frame super resolution" (SFSR)) of acquiring one high-resolution image from one low-resolution image, and/or a multi image super resolution (MISR) method of acquiring one high-resolution image from multiple low-resolution images.
  • the processor 340 may perform, using a learned artificial intelligence model, an SR operation that enhances, within the stitched first image and second image, the details of the first image (e.g., the area within the first image excluding the second image) based on the details and patterns of the second image.
  • the processor 340 may perform an SR operation on the stitched first image and second image using, as the learned artificial intelligence model, an internal patch-based artificial intelligence model or a supervised CNN-based artificial intelligence model.
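  • A minimal sketch of applying a learned SR model to the stitched image, using OpenCV's dnn_superres module (available in opencv-contrib-python) with a pre-trained ESPCN network as a stand-in for "a learned artificial intelligence model"; the model file path and the choice of network are assumptions.

      import cv2

      # Requires opencv-contrib-python and a pre-trained SR network file.
      sr = cv2.dnn_superres.DnnSuperResImpl_create()
      sr.readModel("ESPCN_x2.pb")   # hypothetical local path to the weights
      sr.setModel("espcn", 2)       # network name and upscale factor

      def enhance(stitched_bgr):
          # Run the learned model over the stitched first and second images.
          return sr.upsample(stitched_bgr)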
  • the learned artificial intelligence model used for SR operation is not limited to the examples described above.
  • the processor 340 may display an image obtained by performing an image enhancement operation using the first image and the second image through the display module 310.
  • the processor 340 may update the calibration information 331 stored in the memory 330 based on the information for aligning the first image and the second image.
  • the processor 340 may obtain the information for aligning the first image and the second image by performing matching on the first area and the second image in operation 605.
  • the information for aligning the first image and the second image may include information for positioning the second image within the first image so that the scene represented by the area including the border with the second image within the first image and the scene represented by the edge area of the second image continue without being disconnected, and/or information for warping the second image (and/or the first image).
  • the information for positioning the second image within the first image may include the difference between the position of the first area corresponding to the second image within the first image and the position of the second image within the first image after the first image and the second image are aligned.
  • the processor 340 may update the difference between the positions of the first camera 321 and the second camera 322 included in the calibration information 331 stored in the memory 330, based on the information for positioning the second image within the first image.
  • the processor 340 may obtain (e.g., calculate) the amount of change in the position difference between the first camera 321 and the second camera 322, based on the difference between the position of the first area corresponding to the second image within the first image and the position of the second image within the first image after the first image and the second image are aligned.
  • the processor 340 may update the position difference between the first camera 321 and the second camera 322 based on the obtained amount of change.
  • the information for warping the second image (and/or the first image) may include the angle (or the amount of change in angle) used to rotate and/or perspective-transform the second image with respect to the first image.
  • the processor 340 may update the difference between the optical axes of the first camera 321 and the second camera 322 included in the calibration information 331 stored in the memory 330, based on the information for warping the second image (and/or the first image).
  • the processor 340 may obtain (e.g., calculate) the amount of change in the difference between the optical axes of the first camera 321 and the second camera 322 based on the information for warping the second image (and/or the first image).
  • the processor 340 may update the difference between the optical axes of the first camera 321 and the second camera 322 based on the obtained amount of change in the difference between the optical axes of the first camera 321 and the second camera 322.
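  • A minimal sketch of folding the measured alignment shift and warp angle back into the stored calibration information; the CalibrationInfo fields and the damping gain are illustrative assumptions, not part of this disclosure.

      from dataclasses import dataclass

      @dataclass
      class CalibrationInfo:
          # Stored pixel offset between the two cameras and the residual angle
          # between their optical axes (field names are illustrative only).
          position_offset_px: tuple
          optical_axis_angle_deg: float

      def update_calibration(cal, shift_px, angle_deg, gain=0.2):
          """Fold the shift and rotation measured during alignment back into
          the stored calibration; the damping gain is an added assumption to
          avoid jitter from a single noisy measurement."""
          dx = cal.position_offset_px[0] + gain * shift_px[0]
          dy = cal.position_offset_px[1] + gain * shift_px[1]
          return CalibrationInfo((dx, dy),
                                 cal.optical_axis_angle_deg + gain * angle_deg)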
  • FIG. 8 is a flowchart 800 illustrating a method of correcting image properties according to an embodiment.
  • FIG. 9 is a diagram 900 for explaining a method of correcting image properties according to an embodiment.
  • FIG. 8 may be a diagram for explaining an operation of correcting the first image and/or the second image after performing operation 601 of FIG. 6 and before performing operation 603.
  • the processor 340 may determine a first area corresponding to the second image within the first image based on calibration information. For example, as shown in FIG. 9, after acquiring the first image 910 through the first camera 321 and acquiring the second image 920 through the second camera 322, the processor 340 may determine, based on the calibration information 331, the first area 930 corresponding to the second image within the first image.
  • since operation 801 is at least partially the same as or similar to operation 603 of FIG. 6, overlapping descriptions will be omitted.
  • operation 603 in FIG. 6 may not be performed.
  • the processor 340 may obtain information for converting attributes of the first image and/or the second image based on the first area and the second image.
  • the properties of the first image and/or the second image may include at least one of the color, saturation, exposure, or white balance (e.g., color temperature) of the first image and/or the second image. However, the properties of the first image and/or the second image are not limited to the above-described examples.
  • the information for converting the properties of the first image and/or the second image is based on the pixel value of the first area and the pixel value of the second image, such that the properties of the first image are those of the second image. It may include information for changing the properties of the first image so that they become the same as the properties.
  • the processor 340 may obtain (e.g., create) data (e.g., a look-up table, function, equation, or constant values) for converting the properties of the first area into the properties of the second image, based on the relationship between the pixel values of the first area and the pixel values of the second image (e.g., the difference between the pixel values of the first area and the pixel values of the second image at each corresponding pixel location). For example, in FIG. 9, the processor 340 may obtain data 940 for converting the properties (e.g., property values) of the first area into the properties (e.g., property values) of the second image, based on the pixel values of the first area 930 and the pixel values of the second image 920.
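  • Per-channel histogram matching is one concrete way to build such a look-up table; the sketch below assumes 8-bit, 3-channel images, and the helper names are hypothetical.

      import numpy as np

      def matching_luts(first_area, second_img):
          """Build one 256-entry look-up table per channel that maps the first
          area's intensity distribution onto the second image's distribution."""
          luts = []
          for c in range(first_area.shape[2]):
              src_hist, _ = np.histogram(first_area[..., c], 256, (0, 256))
              ref_hist, _ = np.histogram(second_img[..., c], 256, (0, 256))
              src_cdf = np.cumsum(src_hist) / src_hist.sum()
              ref_cdf = np.cumsum(ref_hist) / ref_hist.sum()
              # For each source level, pick the reference level with the
              # closest cumulative distribution value.
              lut = np.searchsorted(ref_cdf, src_cdf).clip(0, 255).astype(np.uint8)
              luts.append(lut)
          return luts

      def apply_luts(image, luts):
          """Apply the per-channel LUTs to the whole first image."""
          out = np.empty_like(image)
          for c, lut in enumerate(luts):
              out[..., c] = lut[image[..., c]]
          return out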
  • the processor 340 may correct the first image and/or the second image based on information to transform properties of the first image and/or the second image.
  • the processor 340 may convert the properties of the first image so that they become the same as the properties of the second image, based on the data (e.g., data 940) for converting the properties of the first area into the properties of the second image.
  • the processor 340 may apply the data for converting the properties of the first area into the properties of the second image to the first area (e.g., first area 930) and to the area other than the first area within the first image (e.g., area 931).
  • the processor 340 may obtain a first image (e.g., image 950) having the same properties as the second image by applying the data for converting the properties of the first area into the properties of the second image to the first image. Area 930 may be an area obtained by applying the data for converting the properties of the first area into the properties of the second image to the first area 930, and area 960 may be an area obtained by applying the data for converting the properties of the first area into the properties of the second image to the area 931.
  • the processor 340 may convert the properties of the second image into the properties of the first image so that the properties of the second image and the properties of the first image are the same.
  • the processor 340 may further perform an operation to correct the brightness of the first image and/or the brightness of the second image. For example, the processor 340 may obtain (e.g., calculate) the average of the pixel values of the first area and the average of the pixel values of the second image using histograms for the first image and the second image. The processor 340 may obtain (e.g., calculate) data for converting the pixel values of the first area so that the average of the pixel values of the first area and the average of the pixel values of the second image are the same. The processor 340 may correct the brightness of the first image by applying the obtained data to the first image. However, it is not limited to this.
  • the processor 340 may obtain data for converting the pixel values of the second image so that the average of the pixel values of the first area and the average of the pixel values of the second image are the same.
  • the processor 340 may correct the brightness of the second image by applying the obtained data to the second image.
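  • A minimal sketch of the brightness correction, assuming a single global gain that equalizes the mean pixel value of the first area with that of the second image; clipping to the 8-bit range is an added assumption.

      import numpy as np

      def match_mean_brightness(first_img, first_area, second_img):
          """Scale the first image so that the mean pixel value of the first
          area equals the mean pixel value of the second image."""
          gain = second_img.mean() / max(first_area.mean(), 1e-6)
          corrected = first_img.astype(np.float32) * gain
          return np.clip(corrected, 0, 255).astype(np.uint8)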
  • the processor 340 may control (e.g., set, adjust) parameters related to the settings of the first camera 321 and/or the second camera 322 such that the first image and the second image have the same exposure and/or white balance.
  • the processor 340 may selectively perform the operation of correcting the properties of the image of FIG. 8. For example, when the first image and the second image are images being acquired through the first camera 321 and the second camera 322, the processor 340 may perform the operation of correcting the properties of the image of FIG. 8 (and the operation of correcting brightness and/or the operation of controlling parameters related to the settings of the first camera 321 and/or the second camera 322). When the first image and the second image are images previously acquired through the first camera 321 and the second camera 322 and stored in the memory 330, the processor 340 may not perform the operation of correcting the properties of the image of FIG. 8.
  • FIG. 10 is a diagram 1000 for explaining a method of providing an image according to an embodiment.
  • the processor 340 may perform an operation of providing an image using three or more images acquired through each of three or more cameras.
  • the processor 340 may acquire a first image 1010 through the first camera 321, a second image 1020 through the second camera 322, and a third image 1030 through the third camera 323.
  • the processor 340 may acquire the first image 1010, the second image 1020, and the third image 1030 substantially simultaneously (or sequentially within a designated time).
  • the processor 340 may obtain a final image (e.g., an image to be displayed through the display module 310) by performing, two or more times, operations that are at least partially the same as or similar to the operations described with reference to FIGS. 6 to 9 (hereinafter referred to as "image synthesis operations") for the first image 1010, the second image 1020, and the third image 1030.
  • the processor 340 may obtain the first composite image 1011 by performing an image synthesis operation on the first image 1010 and the second image 1020.
  • the processor 340 may obtain the second composite image 1021 by performing an image synthesis operation on the second image 1020 and the third image 1030.
  • the processor 340 may obtain the final composite image 1012 by performing an image synthesis operation on the first composite image 1011 and the second composite image 1021. However, it is not limited to this. For example, the processor 340 may obtain the first composite image 1011 by performing an image synthesis operation on the first image 1010 and the second image 1020. The processor 340 may obtain the final composite image 1012 by performing an image synthesis operation on the first composite image 1011 and the third image 1030.
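  • A minimal sketch of the two-stage composition described above; compose() is only a placeholder for one whole image synthesis operation (align, stitch, enhance) and is not an API defined by this disclosure.

      def final_composite(img1, img2, img3, compose):
          """compose() stands for one image synthesis operation applied to a
          pair of images (FIGS. 6 to 9)."""
          first_composite = compose(img1, img2)    # ultra-wide + wide
          second_composite = compose(img2, img3)   # wide + tele
          return compose(first_composite, second_composite)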
  • FIG. 11 is a diagram for explaining a method of providing an image, according to an embodiment.
  • the processor 340 performs an image compositing operation on a plurality of images acquired through each of the plurality of cameras 321, 322, and 323 while the camera application is running. By performing this, the final composite image can be obtained. For example, while the camera application is running, the processor 340 performs an image synthesis operation for a plurality of images acquired through each of the plurality of cameras 321, 322, and 323 in a designated mode for acquiring images. By performing , the final composite image to be displayed through the display module 310 can be obtained.
  • the processor 340 may acquire a plurality of images through each of the plurality of cameras 321, 322, and 323, regardless of the zoom factor input by the user.
  • the processor 340 may obtain a final composite image to be displayed through the display module 310 by performing an image synthesis operation on the plurality of acquired images.
  • the processor 340 may enlarge or reduce the final composite image based on the zoom factor input by the user and display it through the display module 310.
  • the processor 340 may display the final composite image by enlarging or reducing it based on a point within the final composite image through the display module 310.
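  • A minimal sketch of enlarging around a fixed point by cropping a window whose size shrinks with the zoom factor and resizing it back to the display resolution; this is a generic technique, not the disclosed implementation.

      import cv2

      def zoom_about_point(image, zoom, point):
          """Crop a window around `point` whose size shrinks as the zoom factor
          grows, then resize it back to the full display resolution."""
          h, w = image.shape[:2]
          cw, ch = max(int(w / zoom), 1), max(int(h / zoom), 1)
          cx, cy = point
          x0 = min(max(cx - cw // 2, 0), w - cw)
          y0 = min(max(cy - ch // 2, 0), h - ch)
          crop = image[y0:y0 + ch, x0:x0 + cw]
          return cv2.resize(crop, (w, h), interpolation=cv2.INTER_LINEAR)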
  • as shown by reference numeral 1101, the processor 340 may display, through the display module 310, the final composite image 1110 including the same scene as the scene captured in the first image acquired through the first camera 321.
  • as shown by reference numeral 1102, the processor 340 may display, through the display module 310, an image 1120 obtained by enlarging the final composite image 1110 with respect to a point 1151 of the final composite image 1110.
  • as shown by reference numeral 1103, the processor 340 may display, through the display module 310, an image 1130 obtained by further enlarging the image 1120 with respect to the point 1151 of the final composite image 1110.
  • a point 1152 in the image 1120 at reference numeral 1102 and a point 1153 in the image 1130 at reference numeral 1103 may both be the same point as the point 1151 within the final composite image 1110 at reference numeral 1101.
  • when displaying the final composite image 1110 in an enlarged form, the processor 340 may display, through the display module 310, in a partial area of the screen, a reduced final composite image and an object representing the area within the reduced final composite image that corresponds to the image displayed on the screen.
  • the processor 340 may display, through the display module 310, in a partial area of the screen, the reduced final composite image 1150 and an object 1151 representing the area within the reduced final composite image 1150 that corresponds to the image 1120 displayed on the screen.
  • as shown by reference numeral 1103, the processor 340 may display, through the display module 310, in a partial area of the screen, the reduced final composite image 1160 and an object 1161 representing the area within the reduced final composite image 1160 that corresponds to the image 1130 displayed on the screen.
  • the processor 340 may display an object for selecting a zoom factor through the display module 310.
  • the processor 340 may display, through the display module 310, an object 1140 including a first object 1141 for selecting a zoom factor of 0.5x, a second object 1142 for selecting a zoom factor of 1.0x, and a third object 1143 for selecting a zoom factor of 3.0x.
  • by enlarging or reducing the final composite image with respect to a point within the final composite image and displaying it through the display module 310, the processor 340 can smoothly convert and display, through the display module 310, the images corresponding to each zoom factor when the zoom factor is changed.
  • a method of providing an image in the electronic device 301 may include an operation of acquiring a first image through a first camera 321 and an operation of acquiring a second image through a second camera 322 having a narrower field of view than the first camera 321. The method may include an operation of determining a first area corresponding to the second image within the first image based on calibration information 331 including a difference between the positions of the first camera 321 and the second camera 322 and/or a difference between the optical axes of the first camera 321 and the second camera 322. The method may include an operation of aligning the first image and the second image by performing matching on the first area and the second image. The method may include an operation of stitching the aligned first image and second image.
  • the method may include obtaining an image to be displayed through the display module 310 of the electronic device by performing an image enhancement operation using the stitched first image and the second image.
  • the method may include updating the calibration information 331 based on information for aligning the first image and the second image.
  • the operation of aligning the first image and the second image may include an operation of obtaining first feature points from the first area, an operation of obtaining, from the second image, second feature points corresponding to the first feature points, and an operation of obtaining information for aligning the first image and the second image based on the positions of the first feature points and the positions of the second feature points.
  • the operation of aligning the first image and the second image includes aligning the second image within the first image based on differences between the positions of the first feature points and the positions of the second feature points. It may include the operation of sorting.
  • the information for aligning the first image and the second image may include information for positioning the second image within the first image and/or information for warping the second image.
  • the operation of updating the calibration information 331 may include an operation of updating the difference between the positions of the first camera 321 and the second camera 322 based on the information for positioning the second image within the first image, and an operation of updating the difference between the optical axes of the first camera 321 and the second camera 322 based on the information for warping the second image.
  • the method may further include an operation of obtaining, based on the first area and the second image, information for converting the properties of the first image, and an operation of correcting the first image, based on the obtained information, so that the properties of the first image are the same as the properties of the second image.
  • the method may further include an operation of obtaining an average of the pixel values of the first area and an average of the pixel values of the second image using histograms for the first image and the second image, an operation of obtaining data for converting the pixel values of the first area so that the average of the pixel values of the first area and the average of the pixel values of the second image are the same, and an operation of applying the obtained data to the first image.
  • the method may further include performing upscaling on the first image so that the size of the first area is the same as the size of the second image.
  • the operation of performing an image enhancement operation using the stitched first image and second image may include an operation of performing an SR operation on the stitched first image and second image using a learned artificial intelligence model.
  • the method may further include an operation of, after acquiring an image to be displayed through the display module 310, enlarging or reducing the acquired image with respect to a point of the acquired image based on a user input that changes the zoom factor.
  • the computer-readable recording media include storage media such as magnetic storage media (e.g., ROM, floppy disks, hard disks, etc.) and optical read media (e.g., CD-ROM, DVD, etc.).

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Studio Devices (AREA)

Abstract

An electronic device according to an embodiment may comprise: a display module; a plurality of cameras including a first camera and a second camera having a field of view narrower than the field of view of the first camera; a memory that stores calibration information including a difference between the positions of the first camera and the second camera and/or a difference between the optical axes of the first camera and the second camera; and at least one processor. The at least one processor may obtain a first image through the first camera and obtain a second image through the second camera. The at least one processor may determine a first area corresponding to the second image within the first image based on the calibration information. The at least one processor may align the first image and the second image by performing matching on the first area and the second image. The at least one processor may stitch the aligned first image and second image. The at least one processor may obtain an image to be displayed through the display module by performing an image enhancement operation using the stitched first image and second image. The at least one processor may update the calibration information based on information for aligning the first image and the second image.
PCT/KR2023/017143 2022-11-25 2023-10-31 Procédé de fourniture d'image et dispositif électronique le prenant en charge WO2024111924A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2022-0160866 2022-11-25
KR20220160866 2022-11-25
KR1020220176829A KR20240078250A (ko) 2022-11-25 2022-12-16 이미지를 제공하는 방법 및 이를 지원하는 전자 장치
KR10-2022-0176829 2022-12-16

Publications (1)

Publication Number Publication Date
WO2024111924A1 true WO2024111924A1 (fr) 2024-05-30

Family

ID=91195868

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2023/017143 WO2024111924A1 (fr) 2022-11-25 2023-10-31 Procédé de fourniture d'image et dispositif électronique le prenant en charge

Country Status (1)

Country Link
WO (1) WO2024111924A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160115466A (ko) * 2015-03-27 2016-10-06 한국전자통신연구원 파노라믹 비디오를 스티칭하는 장치 및 이를 위한 스티칭 방법
KR20180022539A (ko) * 2016-08-24 2018-03-06 한국전자통신연구원 중첩영역의 제어점들을 이용한 시차 최소화 스티칭 장치 및 방법
KR20180068022A (ko) * 2016-12-13 2018-06-21 현대자동차주식회사 스테레오 카메라 영상 실시간 자동 캘리브레이션 장치, 그를 포함한 시스템 및 그 방법
KR20220017697A (ko) * 2020-08-05 2022-02-14 한국기술교육대학교 산학협력단 복수의 센서간 캘리브레이션 방법 및 장치
KR20220115223A (ko) * 2021-02-10 2022-08-17 한국전자통신연구원 다중 카메라 캘리브레이션 방법 및 장치

Similar Documents

Publication Publication Date Title
WO2022039424A1 (fr) Procédé de stabilisation d'images et dispositif électronique associé
WO2022114801A1 (fr) Dispositif électronique comprenant une pluralité de dispositifs de prise de vues, et procédé de commande de dispositif électronique
WO2022149654A1 (fr) Dispositif électronique pour réaliser une stabilisation d'image, et son procédé de fonctionnement
WO2022235043A1 (fr) Dispositif électronique comprenant une pluralité de caméras et son procédé de fonctionnement
WO2022092706A1 (fr) Procédé de prise de photographie à l'aide d'une pluralité de caméras, et dispositif associé
WO2020190030A1 (fr) Dispositif électronique de génération d'image composite et procédé associé
WO2023277298A1 (fr) Procédé de stabilisation d'image et dispositif électronique associé
WO2022260252A1 (fr) Dispositif électronique à module de dispositif de prise de vues et procédé opératoire associé
WO2022220444A1 (fr) Procédé de balayage lors d'une prise de vue avec un appareil photo et appareil électronique associé
WO2022149812A1 (fr) Dispositif électronique comprenant un module de caméra et procédé de fonctionnement de dispositif électronique
WO2022097930A1 (fr) Dispositif électronique et procédé d'affichage pour celui-ci
WO2024111924A1 (fr) Procédé de fourniture d'image et dispositif électronique le prenant en charge
WO2024085673A1 (fr) Dispositif électronique pour obtenir de multiples images d'exposition et son procédé de fonctionnement
WO2024080730A1 (fr) Dispositif électronique et procédé offrant une fonction d'obturateur lent
WO2022220621A1 (fr) Dispositif électronique comprenant un réflecteur et un ensemble objectif
WO2024053824A1 (fr) Dispositif électronique pour fournir une image et son procédé de fonctionnement
WO2024122913A1 (fr) Dispositif électronique pour acquérir une image à l'aide d'un modèle d'apprentissage automatique, et son procédé de fonctionnement
WO2022231270A1 (fr) Dispositif électronique et son procédé de traitement d'image
WO2022154168A1 (fr) Dispositif électronique apte à réaliser une mise au point automatique et son procédé de fonctionnement
WO2024117587A1 (fr) Dispositif électronique fournissant une fonction de filtre et son procédé de fonctionnement
WO2022154164A1 (fr) Dispositif électronique apte à régler un angle de vue et procédé de fonctionnement associé
WO2024076176A1 (fr) Procédé de commande de caméra et dispositif électronique
WO2024080767A1 (fr) Dispositif électronique d'acquisition d'image à l'aide d'une caméra, et son procédé de fonctionnement
WO2024014761A1 (fr) Procédé de correction de tremblement de dispositif de prise de vues et dispositif électronique le prenant en charge
WO2021251631A1 (fr) Dispositif électronique comprenant une fonction de réglage de mise au point, et procédé associé

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23894853

Country of ref document: EP

Kind code of ref document: A1