WO2021101037A1 - System and method for dynamic selection of a reference image frame - Google Patents

System and method for dynamic selection of a reference image frame

Info

Publication number
WO2021101037A1
Authority
WO
WIPO (PCT)
Prior art keywords
image frame
short
reference image
long
frame
Prior art date
Application number
PCT/KR2020/012038
Other languages
English (en)
Inventor
Long N LE
Ruiwen ZHEN
John William Glotzbach
Hamid Rahim Sheikh
Original Assignee
Samsung Electronics Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US16/719,633 external-priority patent/US10911691B1/en
Application filed by Samsung Electronics Co., Ltd. filed Critical Samsung Electronics Co., Ltd.
Publication of WO2021101037A1 publication Critical patent/WO2021101037A1/fr

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70: Circuitry for compensating brightness variation in the scene
    • H04N23/741: Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/20: Special algorithmic details
    • G06T2207/20212: Image combination
    • G06T2207/20221: Image fusion; Image merging

Definitions

  • This disclosure relates generally to image capturing systems. More specifically, this disclosure relates to a system and method for dynamic selection of a reference image frame.
  • Many mobile electronic devices include cameras that can be used to capture still and video images. While convenient, cameras on mobile electronic devices typically suffer from a number of shortcomings. For example, cameras on mobile electronic devices often capture images with ghost artifacts or limited sharpness, such as when capturing images with motion and high saturation. This is typically because image sensors in the cameras have limited dynamic range. It is possible to capture multiple image frames of a scene and then combine the "best" parts of the image frames to produce a blended image. However, producing a blended image from a set of image frames with different exposures is a challenging process, especially for dynamic scenes.
  • a method includes obtaining, using at least one image sensor of an electronic device, multiple image frames of a scene.
  • the multiple image frames include a plurality of short image frames at a first exposure level and a plurality of long image frames at a second exposure level longer than the first exposure level.
  • the method also includes generating a short reference image frame and a long reference image frame using the multiple image frames.
  • the method further includes selecting, using a processor of the electronic device, the short reference image frame or the long reference image frame as a reference frame, where the selection is based on an amount of saturated motion in the long image frame and an amount of a shadow region in the short image frame.
  • the method includes generating a final image of the scene using the reference frame.
  • FIGURE 1 illustrates an example network configuration including an electronic device in accordance with this disclosure.
  • FIGURES 2A, 2B, 2C, 2D, 2E, 2F, 2G, 2H, and 2I illustrate an example process for dynamic selection of a reference image frame in accordance with this disclosure.
  • FIGURES 3A and 3B illustrate examples of benefits that can be realized using dynamic selection of a reference image frame in accordance with this disclosure.
  • FIGURE 4 illustrates an example method for dynamic selection of a reference image frame in accordance with this disclosure.
  • This disclosure provides a system and method for dynamic selection of a reference image frame.
  • In a first embodiment, a method includes obtaining, using at least one image sensor of an electronic device, multiple image frames of a scene.
  • the multiple image frames include a plurality of short image frames at a first exposure level and a plurality of long image frames at a second exposure level longer than the first exposure level.
  • the method also includes generating a short reference image frame and a long reference image frame using the multiple image frames.
  • the method further includes selecting, using a processor of the electronic device, the short reference image frame or the long reference image frame as a reference frame, where the selection is based on an amount of saturated motion in the long image frame and an amount of a shadow region in the short image frame.
  • the method includes generating a final image of the scene using the reference frame.
  • In a second embodiment, an electronic device includes at least one image sensor and at least one processing device.
  • the at least one processing device is configured to obtain, using the at least one image sensor, multiple image frames of a scene.
  • the multiple image frames include a plurality of short image frames at a first exposure level and a plurality of long image frames at a second exposure level longer than the first exposure level.
  • the at least one processing device is also configured to generate a short reference image frame and a long reference image frame using the multiple image frames.
  • the at least one processing device is further configured to select the short reference image frame or the long reference image frame as a reference frame, where the selection is based on an amount of saturated motion in the long image frame and an amount of a shadow region in the short image frame.
  • the at least one processing device is configured to generate a final image of the scene using the reference frame.
  • In a third embodiment, a non-transitory machine-readable medium contains instructions that when executed cause at least one processor of an electronic device to obtain multiple image frames of a scene that are captured using at least one image sensor of the electronic device.
  • the multiple image frames include a plurality of short image frames at a first exposure level and a plurality of long image frames at a second exposure level longer than the first exposure level.
  • the medium also contains instructions that when executed cause the at least one processor to generate a short reference image frame and a long reference image frame using the multiple image frames.
  • the medium further contains instructions that when executed cause the at least one processor to select the short reference image frame or the long reference image frame as a reference frame, where the selection is based on an amount of saturated motion in the long image frame and an amount of a shadow region in the short image frame.
  • the medium contains instructions that when executed cause the at least one processor to generate a final image of the scene using the reference frame.
  • various functions described below can be implemented or supported by one or more computer programs, each of which is formed from computer readable program code and embodied in a computer readable medium.
  • The terms "application" and "program" refer to one or more computer programs, software components, sets of instructions, procedures, functions, objects, classes, instances, related data, or a portion thereof adapted for implementation in a suitable computer readable program code.
  • computer readable program code includes any type of computer code, including source code, object code, and executable code.
  • computer readable medium includes any type of medium capable of being accessed by a computer, such as read only memory (ROM), random access memory (RAM), a hard disk drive, a compact disc (CD), a digital video disc (DVD), or any other type of memory.
  • a "non-transitory” computer readable medium excludes wired, wireless, optical, or other communication links that transport transitory electrical or other signals.
  • a non-transitory computer readable medium includes media where data can be permanently stored and media where data can be stored and later overwritten, such as a rewritable optical disc or an erasable memory device.
  • phrases such as “have,” “may have,” “include,” or “may include” a feature indicate the existence of the feature and do not exclude the existence of other features.
  • the phrases “A or B,” “at least one of A and/or B,” or “one or more of A and/or B” may include all possible combinations of A and B.
  • “A or B,” “at least one of A and B,” and “at least one of A or B” may indicate all of (1) including at least one A, (2) including at least one B, or (3) including at least one A and at least one B.
  • first and second may modify various components regardless of importance and do not limit the components. These terms are only used to distinguish one component from another.
  • a first user device and a second user device may indicate different user devices from each other, regardless of the order or importance of the devices.
  • a first component may be denoted a second component and vice versa without departing from the scope of this disclosure.
  • the phrase “configured (or set) to” may be interchangeably used with the phrases “suitable for,” “having the capacity to,” “designed to,” “adapted to,” “made to,” or “capable of” depending on the circumstances.
  • the phrase “configured (or set) to” does not essentially mean “specifically designed in hardware to.” Rather, the phrase “configured to” may mean that a device can perform an operation together with another device or parts.
  • the phrase “processor configured (or set) to perform A, B, and C” may mean a generic-purpose processor (such as a CPU or application processor) that may perform the operations by executing one or more software programs stored in a memory device or a dedicated processor (such as an embedded processor) for performing the operations.
  • Examples of an "electronic device” may include at least one of a smartphone, a tablet personal computer (PC), a mobile phone, a video phone, an e-book reader, a desktop PC, a laptop computer, a netbook computer, a workstation, a personal digital assistant (PDA), a portable multimedia player (PMP), an MP3 player, a mobile medical device, a camera, or a wearable device (such as smart glasses, a head-mounted device (HMD), electronic clothes, an electronic bracelet, an electronic necklace, an electronic accessory, an electronic tattoo, a smart mirror, or a smart watch).
  • Other examples of an electronic device include a smart home appliance.
  • Examples of the smart home appliance may include at least one of a television, a digital video disc (DVD) player, an audio player, a refrigerator, an air conditioner, a cleaner, an oven, a microwave oven, a washer, a drier, an air cleaner, a set-top box, a home automation control panel, a security control panel, a TV box (such as SAMSUNG HOMESYNC, APPLETV, or GOOGLE TV), a smart speaker or speaker with an integrated digital assistant (such as SAMSUNG GALAXY HOME, APPLE HOMEPOD, or AMAZON ECHO), a gaming console (such as an XBOX, PLAYSTATION, or NINTENDO), an electronic dictionary, an electronic key, a camcorder, or an electronic picture frame.
  • Other examples of an electronic device include at least one of various medical devices (such as diverse portable medical measuring devices (like a blood sugar measuring device, a heartbeat measuring device, or a body temperature measuring device), a magnetic resonance angiography (MRA) device, a magnetic resonance imaging (MRI) device, a computed tomography (CT) device, an imaging device, or an ultrasonic device), a navigation device, a global positioning system (GPS) receiver, an event data recorder (EDR), a flight data recorder (FDR), an automotive infotainment device, a sailing electronic device (such as a sailing navigation device or a gyro compass), avionics, security devices, vehicular head units, industrial or home robots, automatic teller machines (ATMs), point of sales (POS) devices, or Internet of Things (IoT) devices (such as a bulb, various sensors, electric or gas meter, sprinkler, fire alarm, thermostat, street light, toaster, fitness equipment, hot water tank, heater, or boiler).
  • Other examples of an electronic device include at least one part of a piece of furniture or building/structure, an electronic board, an electronic signature receiving device, a projector, or various measurement devices (such as devices for measuring water, electricity, gas, or electromagnetic waves).
  • an electronic device may be one or a combination of the above-listed devices.
  • the electronic device may be a flexible electronic device.
  • the electronic device disclosed here is not limited to the above-listed devices and may include new electronic devices depending on the development of technology.
  • the term "user” may denote a human or another device (such as an artificial intelligent electronic device) using the electronic device.
  • FIGURES 1 through 4, discussed below, and the various embodiments of this disclosure are described with reference to the accompanying drawings. However, it should be appreciated that this disclosure is not limited to these embodiments, and all changes and/or equivalents or replacements thereto also belong to the scope of this disclosure.
  • the same or similar reference denotations may be used to refer to the same or similar elements throughout the specification and the drawings.
  • This disclosure provides techniques for dynamic selection of a reference image frame that can be used in various image processing applications, such as high dynamic range (HDR) imaging.
  • a robust framework is provided for dynamically switching between "short” and “long” reference image frames, where “short” and “long” here refer to exposures.
  • Use of a short reference image frame can eliminate saturated motion ghosts.
  • When the short reference image frame contains a large shadow region, a long reference image frame can be used instead.
  • the disclosed embodiments can reduce motion noise by increasing the exposure and reducing the gain of the short reference image frame as compared to a regular short image frame. Increasing the exposure time and reducing the gain for the short reference image frame help lower noise while keeping the exposure level the same.
  • FIGURE 1 illustrates an example network configuration 100 including an electronic device in accordance with this disclosure.
  • the embodiment of the network configuration 100 shown in FIGURE 1 is for illustration only. Other embodiments of the network configuration 100 could be used without departing from the scope of this disclosure.
  • an electronic device 101 is included in the network configuration 100.
  • the electronic device 101 can include at least one of a bus 110, a processor 120, a memory 130, an input/output (I/O) interface 150, a display 160, a communication interface 170, or a sensor 180.
  • the electronic device 101 may exclude at least one of these components or may add at least one other component.
  • the bus 110 includes a circuit for connecting the components 120-180 with one another and for transferring communications (such as control messages and/or data) between the components.
  • the processor 120 includes one or more of a central processing unit (CPU), an application processor (AP), or a communication processor (CP).
  • the processor 120 is able to perform control on at least one of the other components of the electronic device 101 and/or perform an operation or data processing relating to communication.
  • the processor 120 can be a graphics processing unit (GPU).
  • the processor 120 can receive image data and can process the image data (as discussed in more detail below) to dynamically select a reference image frame that can be used for further image processing.
  • the memory 130 can include a volatile and/or non-volatile memory.
  • the memory 130 can store commands or data related to at least one other component of the electronic device 101.
  • the memory 130 can store software and/or a program 140.
  • the program 140 includes, for example, a kernel 141, middleware 143, an application programming interface (API) 145, and/or an application program (or "application”) 147.
  • At least a portion of the kernel 141, middleware 143, or API 145 may be denoted an operating system (OS).
  • the kernel 141 can control or manage system resources (such as the bus 110, processor 120, or memory 130) used to perform operations or functions implemented in other programs (such as the middleware 143, API 145, or application program 147).
  • the kernel 141 provides an interface that allows the middleware 143, the API 145, or the application 147 to access the individual components of the electronic device 101 to control or manage the system resources.
  • the application 147 includes one or more applications for image capture and image processing as discussed below. These functions can be performed by a single application or by multiple applications that each carries out one or more of these functions.
  • the middleware 143 can function as a relay to allow the API 145 or the application 147 to communicate data with the kernel 141, for instance.
  • a plurality of applications 147 can be provided.
  • the middleware 143 is able to control work requests received from the applications 147, such as by allocating the priority of using the system resources of the electronic device 101 (like the bus 110, the processor 120, or the memory 130) to at least one of the plurality of applications 147.
  • the API 145 is an interface allowing the application 147 to control functions provided from the kernel 141 or the middleware 143.
  • the API 145 includes at least one interface or function (such as a command) for filing control, window control, image processing, or text control.
  • the I/O interface 150 serves as an interface that can, for example, transfer commands or data input from a user or other external devices to other component(s) of the electronic device 101.
  • the I/O interface 150 can also output commands or data received from other component(s) of the electronic device 101 to the user or the other external device.
  • the display 160 includes, for example, a liquid crystal display (LCD), a light emitting diode (LED) display, an organic light emitting diode (OLED) display, a quantum-dot light emitting diode (QLED) display, a microelectromechanical systems (MEMS) display, or an electronic paper display.
  • the display 160 can also be a depth-aware display, such as a multi-focal display.
  • the display 160 is able to display, for example, various contents (such as text, images, videos, icons, or symbols) to the user.
  • the display 160 can include a touchscreen and may receive, for example, a touch, gesture, proximity, or hovering input using an electronic pen or a body portion of the user.
  • the communication interface 170 is able to set up communication between the electronic device 101 and an external electronic device (such as a first electronic device 102, a second electronic device 104, or a server 106).
  • the communication interface 170 can be connected with a network 162 or 164 through wireless or wired communication to communicate with the external electronic device.
  • the communication interface 170 can be a wired or wireless transceiver or any other component for transmitting and receiving signals, such as images.
  • the wireless communication is able to use at least one of, for example, long term evolution (LTE), long term evolution-advanced (LTE-A), 5th generation wireless system (5G), millimeter-wave or 60 GHz wireless communication, Wireless USB, code division multiple access (CDMA), wideband code division multiple access (WCDMA), universal mobile telecommunication system (UMTS), wireless broadband (WiBro), or global system for mobile communication (GSM), as a cellular communication protocol.
  • the wired connection can include, for example, at least one of a universal serial bus (USB), high definition multimedia interface (HDMI), recommended standard 232 (RS-232), or plain old telephone service (POTS).
  • the network 162 or 164 includes at least one communication network, such as a computer network (like a local area network (LAN) or wide area network (WAN)), Internet, or a telephone network.
  • the electronic device 101 further includes one or more sensors 180 that can meter a physical quantity or detect an activation state of the electronic device 101 and convert metered or detected information into an electrical signal.
  • one or more sensors 180 can include one or more cameras or other imaging sensors for capturing images of scenes.
  • the sensor(s) 180 can also include one or more buttons for touch input, a gesture sensor, a gyroscope or gyro sensor, an air pressure sensor, a magnetic sensor or magnetometer, an acceleration sensor or accelerometer, a grip sensor, a proximity sensor, a color sensor (such as a red green blue (RGB) sensor), a bio-physical sensor, a temperature sensor, a humidity sensor, an illumination sensor, an ultraviolet (UV) sensor, an electromyography (EMG) sensor, an electroencephalogram (EEG) sensor, an electrocardiogram (ECG) sensor, an infrared (IR) sensor, an ultrasound sensor, an iris sensor, or a fingerprint sensor.
  • the sensor(s) 180 can further include an inertial measurement unit, which can include one or more accelerometers, gyroscopes, and other components.
  • the sensor(s) 180 can include a control circuit for controlling at least one of the sensors included here. Any of these sensor(s) 180 can be located within the electronic device 101.
  • the first external electronic device 102 or the second external electronic device 104 can be a wearable device or an electronic device-mountable wearable device (such as an HMD).
  • the electronic device 101 can communicate with the electronic device 102 through the communication interface 170.
  • the electronic device 101 can be directly connected with the electronic device 102 to communicate with the electronic device 102 without involving a separate network.
  • the electronic device 101 can also be an augmented reality wearable device, such as eyeglasses, that include one or more cameras.
  • the first and second external electronic devices 102 and 104 and the server 106 each can be a device of the same or a different type from the electronic device 101.
  • the server 106 includes a group of one or more servers.
  • all or some of the operations executed on the electronic device 101 can be executed on another or multiple other electronic devices (such as the electronic devices 102 and 104 or server 106).
  • When the electronic device 101 should perform some function or service automatically or at a request, the electronic device 101, instead of executing the function or service on its own or additionally, can request another device (such as electronic devices 102 and 104 or server 106) to perform at least some functions associated therewith.
  • the other electronic device (such as electronic devices 102 and 104 or server 106) is able to execute the requested functions or additional functions and transfer a result of the execution to the electronic device 101.
  • the electronic device 101 can provide a requested function or service by processing the received result as it is or additionally.
  • a cloud computing, distributed computing, or client-server computing technique may be used, for example. While FIGURE 1 shows that the electronic device 101 includes the communication interface 170 to communicate with the external electronic device 104 or server 106 via the network 162 or 164, the electronic device 101 may be independently operated without a separate communication function according to some embodiments of this disclosure.
  • the server 106 can include the same or similar components 110-180 as the electronic device 101 (or a suitable subset thereof).
  • the server 106 can support driving the electronic device 101 by performing at least one of the operations (or functions) implemented on the electronic device 101.
  • the server 106 can include a processing module or processor that may support the processor 120 implemented in the electronic device 101.
  • the server 106 processes data using a multi-task fusion neural network architecture to perform multiple tasks using the data as described below.
  • the server 106 processes image data using the multi-task fusion neural network architecture to perform multiple tasks using the image data and generate images of scenes.
  • Although FIGURE 1 illustrates one example of a network configuration 100 including an electronic device 101, various changes may be made to FIGURE 1.
  • For example, the network configuration 100 could include any number of each component in any suitable arrangement.
  • computing and communication systems come in a wide variety of configurations, and FIGURE 1 does not limit the scope of this disclosure to any particular configuration.
  • FIGURE 1 illustrates one operational environment in which various features disclosed in this patent document can be used, these features could be used in any other suitable system.
  • FIGURES 2A, 2B, 2C, 2D, 2E, 2F, 2G, 2H, and 2I illustrate an example process for dynamic selection of a reference image frame in accordance with this disclosure.
  • the process 200 is described as being performed using the electronic device 101 shown in FIGURE 1. However, the process 200 could be used with any other suitable electronic device and in any suitable system.
  • the electronic device 101 captures multiple image frames of a scene in a burst capture operation 202.
  • the capture operation 202 may be performed in response to an event, such as a user actuating a shutter control.
  • multiple image frames are captured using one image sensor 180 of the electronic device 101, such as a camera.
  • multiple image frames are captured using multiple image sensors 180 of the electronic device 101, such as multiple cameras, in which case each image sensor 180 may capture one image or multiple images.
  • the captured image frames represent Bayer image frames, rather than other types of image frames (such as YUV image frames).
  • Bayer image frames may retain linearity properties that can be lost in YUV image frames after the image frames are processed through an image signal processor (ISP), and this linearity can be useful in subsequent processing.
  • the captured image frames include frames captured using different exposures. For example, some of the image frames may be captured at a relatively short exposure (such as EV-3 or EV-2) and are referred to as short image frames. Other image frames may be captured at a relatively long exposure (such as EV+0) and are referred to as long image frames.
  • Some of the short image frames may exhibit areas with an undesirable amount of noise.
  • a common strategy to reduce noise is to blend multiple short image frames together.
  • blending is generally less advantageous (even with the assistance of local alignment) due to the risk of creating "ghost" artifacts.
  • Aggressive spatial filtering can reduce the noise but also inherently hides dark details.
  • the electronic device 101 can select one of the short image frames and increase the exposure time of the selected short image frame by a factor of A while reducing the ISO gain of the selected short image frame by a factor of 1/A. This helps to maintain the short exposure level.
  • A may represent a power of two (such as 2, 4, 8, etc.). This functionality is summarized in Table 1.
  • the selected short image frame with improved quality can later be used as a short reference image frame for deghosting and blending as discussed in greater detail below.
  • the electronic device 101 After the image capture operation 202, the electronic device 101 performs a registration operation 205 to align the captured image frames.
  • Registration generally refers to aligning different image frames so that common points in the image frames are aligned. This may be useful or desirable since later blending operations may be most successful when the image frames are well-aligned.
  • one objective of the registration operation 205 is to ensure that the image frames are aligned even in the presence of camera motion. To achieve this, the electronic device 101 examines the captured image frames for static elements that are common among the image frames and aligns the image frames according to features of those static elements.
  • the registration operation 205 may be performed as follows.
  • the electronic device 101 can equalize the multi-exposure image frames to the same exposure level, such as EV+0. This can be done to increase the number of matched key points between image frames and thereby provide a more reliable homography estimate in later steps. Since Bayer image frames are linear, exposure equalization here may simply involve performing a scaling operation, where the scaling factor is determined by the number of EV stops. Appropriate capping after scaling can be used to reduce the number of keypoints with no matches or with wrong matches.
  • the electronic device 101 then extracts luminance components from the image frames and performs keypoint detection and descriptor extraction on the luminance components.
  • the electronic device 101 estimates a homography to warp other image frames to a selected base image frame. Any image frame can be selected here as the base image frame since the goal of the registration operation 205 is to align the frames that are potentially misaligned, such as due to camera motion.
  • the electronic device 101 separates the aligned image frames into two groups (namely short and long) and performs multiple subsequent operations in separate paths for the two image groups as shown in FIGURE 2A.
  • the short image frames should exhibit better clarity in saturated areas of a scene, while the long image frames should have better details in dark areas of the scene.
  • the electronic device 101 performs an equalization operation 210a on the short image frames and performs an equalization operation 210b on the long image frames.
  • the equalization operations 210a-210b are performed to bring the short image frames and the long image frames to the same brightness level.
  • the equalization operation 210a includes bringing the short image frames to the brightness level of the long image frames. For example, if the long image frames have an exposure level of EV+0 and the short image frames have an exposure level of EV-3, the exposure difference can be defined as delta_EV = 0 - (-3) = 3 stops.
  • the brightness level of each short image frame can be multiplied by eight to bring the brightness level of the short image frames closer to the brightness level of the long image frames.
  • the equalization operation 210b for the long image frames can be trivial (such as "multiply by one") or omitted.
  • the equalization operations 210a-210b could be performed to bring the short and long image frames to an intermediate brightness level.
  • the electronic device 101 After the equalization operations 210a-210b, the electronic device 101 performs deghosting operations 220a-220b.
  • the deghosting operations 220a-220b are performed to identify motion regions among the image frames so that blending in those regions can be suppressed. This is useful since blending in motion regions can lead to ghost artifacts.
  • the deghosting operations 220a-220b can be performed in any suitable manner.
  • FIGURE 2B illustrates one example implementation of the deghosting operations 220a-220b in greater detail. Although the process shown in FIGURE 2B represents one example process for deghosting, other deghosting processes can be used and are within the scope of this disclosure.
  • the inputs into each deghosting operation 220a-220b include two equalized image frames .
  • the equalized image frames include the short image frames equalized in the equalization operation 210a.
  • the equalized image frames include the long image frames equalized in the equalization operation 210b.
  • one of the image frames is selected as a reference image frame.
  • the other image frame or frames are designated as non-reference image frames.
  • any image frame can be chosen as the reference image frame without loss of generality.
  • the outputs of the deghosting operations 220a-220b are weighted motion maps, which in some embodiments have values in the range [0, 1].
  • the motion map for the reference image frame may be trivial, such as a map equal to 1 for all pixels p.
  • the motion map may be determined as follows in some embodiments.
  • the electronic device 101 performs an image difference function 221 to compute the differences between a non-reference image frame and the reference image frame, ignoring pixels where both frames are saturated since there is no reliable information there for motion analysis.
  • the difference signal can be computed according to the following: D(p) = |I_nonref(p) - I_ref(p)|_ch, with D(p) set to zero at pixels where both frames are saturated. (1)
  • reference and non-reference saturated regions S_r and S_n are declared pixel-wise when the maximum value across multiple channels (such as the Bayer channels Gr, Gb, R, B) exceeds a predetermined saturation threshold T_sat. This can be expressed as follows: S(p) = 1 if the maximum channel value at p exceeds T_sat, and S(p) = 0 otherwise.
  • the operator |.|_ch represents the total absolute difference across the multiple channels. This can be expressed as follows: |I_nonref(p) - I_ref(p)|_ch = sum over channels c of |I_nonref,c(p) - I_ref,c(p)|.
  • the electronic device 101 may apply a bilateral filter function 223 to generate a filtered difference image.
  • the filtering can be guided by the reference image frame and applied on the difference signal in Equation (1).
  • the filtered difference signal can be expressed by: D_f(p) = (1/N) * sum over neighboring pixels q of exp(-||x_p - x_q||^2 / (2 * sigma_d^2)) * exp(-(I_ref(p) - I_ref(q))^2 / (2 * sigma_r^2)) * D(q).
  • Here, N represents a normalization factor used to ensure that the filter weights sum to one, x_p represents the image coordinate of pixel p, q represents a neighboring pixel of p, and sigma_d and sigma_r represent the domain and range standard deviations.
  • the reference image frame is used for guidance since motion pixels are pulled from the reference image frame .
  • the electronic device 101 After the filtered difference image is determined, the electronic device 101 performs a threshold estimation function 225 to compute a constant false-alarm rate (CFAR) difference threshold for classifying motion and non-motion.
  • the threshold can be automatically set based on a percentile of all difference signals in a scene, excluding pixels with zero difference.
  • the standard deviation on the threshold can be set to the difference value that is a few percentiles below the threshold for the subsequent conversion to a soft motion map.
  • the electronic device 101 can also perform a threshold suppression function 227 to further reduce the threshold in shadow regions of both reference and non-reference image frames to improve motion detection.
  • the electronic device 101 may average multiple channel values (such as Gr , Gb , R , and B values) for both the reference and non-reference image frames, generate a luminance map, and multiply the luminance map by the difference threshold to determine the threshold suppression.
  • the threshold suppression can be expressed as follows: T(p) = T * Lum(p), where T is the CFAR difference threshold and Lum(p) is the luminance map obtained by averaging the channel values of the reference and non-reference image frames.
  • the electronic device 101 then performs a motion map determination function 229 to determine a non-reference motion map for each non-reference image frame. In some embodiments, this occurs by converting the difference signal to a soft decision on motion/non-motion, such as based on a sigmoid model. In some cases, the motion maps can be determined according to the following: W(p) = 1 / (1 + exp((D_f(p) - T(p)) / sigma)), where sigma is the standard deviation associated with the threshold.
  • small pixel values (such as close to 0) indicate motion
  • large pixel values (such as close to 1) indicate static areas.
  • smaller values could indicate static areas
  • larger values could indicate motion.
  • the electronic device 101 can perform same exposure (SE) blending operations 230a-230b to generate same-exposure blended image frames.
  • SE same exposure
  • the electronic device 101 can blend the multiple short image frames using the SE blending operation 230a to generate a single blended short image frame.
  • the electronic device 101 can blend the multiple long image frames using the SE blending operation 230b to generate a single blended long image frame.
  • the electronic device 101 respectively uses the motion maps from the deghosting operations 220a-220b to guide the blending.
  • the blending can be expressed according to the following: B(p) = (I_ref(p) + sum over non-reference frames n of W_n(p) * I_n(p)) / (1 + sum over n of W_n(p)).
  • the electronic device 101 can perform deghosting operations 240a-240b to generate blended motion maps for the blended short image frame and the blended long image frame, respectively.
  • the deghosting operations 240a-240b may be the same as or similar to the deghosting operations 220a-220b, and reference can be made to FIGURE 2B for further details of the process in some embodiments.
  • the deghosting operations 240a-240b differ from the deghosting operations 220a-220b in their inputs. That is, the deghosting operations 220a-220b receive image frames having the same exposure (either short image frames or long image frames) as inputs.
  • the deghosting operations 240a-240b receive image frames having different exposures (long image frames and short image frames) as inputs.
  • the deghosting operations 240a-240b receive as inputs both the blended short image frame from the SE blending operation 230a and the blended long image frame from the SE blending operation 230b.
  • the output of the deghosting operation 240a is the short blended motion map
  • the output of the deghosting operation 240b is the long blended motion map .
  • the electronic device 101 can perform multi-exposure (ME) blending operations 250a-250b.
  • the blending policy for ME blending may consider not only the motion maps but also saturation areas. Depending on whether the same-exposure blended short image frame or the same-exposure blended long image frame from the SE blending operations 230a-230b is chosen as a primary image (or reference image), the blending policy may be different.
  • the electronic device 101 can perform the ME blending operation 250a in which the same-exposure blended short image frame is the primary image and the same-exposure blended long image frame is the secondary image.
  • the output of the ME blending operation 250a is a ME blended short reference image frame.
  • the ME blending operation 250a can be performed according to the following.
  • the blend weight applied to the blended long image frame can be computed according to: W_L(p) = M_L(p) * (1 - S_L(p)) (10)
  • In Equation (10), M_L represents the motion map for the blended long image frame, and S_L represents the saturation map for the blended long image frame. Note that the weight applied to the blended short image frame is 1 - W_L in a short reference system. According to Equation (10), W_L is small in both motion and saturation regions of the blended long image frame, resulting in content from the blended short image frame in those regions.
  • the electronic device 101 can perform the ME blending operation 250b in which the same-exposure blended long image frame is the primary image and the same-exposure blended short image frame is the secondary image.
  • the output of the ME blending operation 250b is a ME blended long reference image frame.
  • the ME blending operation 250b can be performed according to the following.
  • the blend weight can be computed similarly, in this case giving weight to the blended short image frame primarily in regions of the blended long image frame that are saturated but free of motion.
  • the electronic device 101 performs the ME blending operations 250a-250b to generate the ME blended short reference image frame and the ME blended long reference image frame, respectively.
  • the electronic device 101 then performs a reference selection operation 260 to select one of the two reference image frames to be used as a reference frame for further image processing.
  • Various techniques may be used to perform the reference selection operation 260.
  • FIGURES 2C and 2D illustrate two example processes 260a and 260b that may be performed as the reference selection operation 260 in greater detail.
  • the electronic device 101 starts the process 260a by obtaining the short and long reference image frames.
  • the electronic device 101 performs a large shadow region test 261 to check for the existence of large shadow regions in the short reference image frame.
  • Various techniques may be used to perform the large shadow region test 261.
  • FIGURE 2E illustrates one example implementation of the large shadow region test 261 in greater detail.
  • the electronic device 101 takes the short reference image frame and, at step 2611, computes the luminance values from the entire frame. In some cases, this can be performed according to the following:
  • Luma = 0.213 * R + 0.715 * G + 0.072 * B (14)
  • R, G, and B represent red, green, and blue channels, respectively, of each pixel.
  • this provides luminance values in a range of 0-255.
  • this is merely one example, and other luminance equations could be used to generate luminance values in different ranges.
  • the electronic device 101 determines the median luminance value of the luminance values across the image frame.
  • the median luminance value corresponds to the value which 50% of the luminance values are above and 50% of the luminance values are below.
  • the electronic device 101 determines if the median luminance value is less than a predetermined threshold value (such as 50). If so, the short reference image frame is considered to be too dark. If not, the short reference image frame is considered to be not too dark.
  • the electronic device 101 takes the ISO value of the short reference image frame and determines whether the ISO value is greater than a predetermined threshold value (such as 4000). If the ISO value is greater than the threshold, the short reference image frame is considered to be too noisy. If not, the short reference image frame is considered to be not too noisy.
  • the electronic device 101 takes a logical OR of the determinations from steps 2613 and 2614 to determine whether the short reference image frame is considered to be too dark or too noisy.
  • If the short reference image frame is not too dark and not too noisy, it is considered that a large shadow region does not exist in the short reference image frame, and thus the short reference image frame can be used as the reference image frame. Otherwise, if the short reference image frame is too dark, too noisy, or both, it is considered that a large shadow region exists in the short reference image frame, and the short reference image frame is not suitable for use as the reference image frame.
  • the next operation depends on whether the process 260a or the process 260b is being performed for the reference selection operation 260.
  • FIGURE 2F illustrates one example implementation of the residual saturated motion test 262 in greater detail. As shown in FIGURE 2F, the electronic device 101 takes the long reference image frame and, at step 2621, performs saturation analysis on the long reference image frame to generate a saturation map 2622. The saturation map 2622 indicates which portions of the long reference image frame are too bright. Various techniques may be used to perform the saturation analysis.
  • FIGURE 2G illustrates one example implementation of the saturation analysis of step 2621 in greater detail.
  • the electronic device 101 takes the maximum value among various channels (such as R, G, or B) for each pixel in the long reference image frame .
  • For a saturated pixel, the R, G, and B values are all high, and step 26211 selects the maximum value.
  • the electronic device 101 applies a nonlinear mapping function to the maximum value for each pixel.
  • the mapping function can be a soft threshold that maps maximum channel values near the saturation level toward 1 and lower values toward 0.
  • the electronic device 101 generates the saturation map 2622 by applying the mapping function to each pixel.
  • the electronic device 101 also obtains the long blended motion map from the deghosting operation 240b and inverts the motion map , such as by subtracting each value from a value of one. This helps to align the motion map with the numerical convention of the saturation map 2622. For example, 0 may indicate motion and 1 may indicate no motion in the motion map , so inverting the motion map results in 1 indicating motion and 0 indicating no motion. In the saturation map 2622, 1 may indicate saturation and 0 may indicate no saturation.
  • the electronic device 101 multiplies the motion map and the saturation map 2622 pixel-wise to obtain a saturated motion map 2624. In the saturated motion map, values close to 1 may indicate that there is motion in a saturated area, and values close to 0 may indicate that there is no motion in a saturated area.
  • the electronic device 101 performs a morphological open function on the saturated motion map 2624, which is a filtering operation to reduce noise.
  • the electronic device 101 binarizes the saturated motion map 2624 based on a predetermined threshold value. In binarization, values in the saturated motion map 2624 that are below the threshold are assigned a new value of 0, and values in the saturated motion map 2624 that are above the threshold are assigned a new value of 1.
  • the values of the binarized saturated motion map 2624 are added together to obtain a saturated motion size value 2628 that indicates how much area of the long reference image frame includes saturated motion.
  • the electronic device 101 determines if the saturated motion size value 2628 is greater than a predetermined threshold value (such as 20). If the saturated motion size value 2628 is not greater than the threshold value, it is considered that there is not a large amount of residual saturated motion in the long reference image frame , and thus the long reference image frame can be used as the reference image frame. Otherwise, if the saturated motion size value 2628 is greater than the threshold value, it is considered that there is a large amount of residual saturated motion in the long reference image frame , and thus the long reference image frame is not suitable for use as the reference image frame. The next operation depends on whether the process 260a or the process 260b is being performed for the reference selection operation 260.
  • the electronic device 101 may select the single-exposure blended long image frame that is output after the SE blending operation 230b as the reference image frame, where the single-exposure blended long image frame is a blended image frame with an EV+0 exposure level that is suitable for use as the reference image frame.
  • FIGURE 2D illustrates another example process 260b of the reference selection operation 260 in greater detail.
  • the process 260b is similar to the process 260a in that both processes 260a-260b include a large shadow region test 261 and a residual saturated motion test 262.
  • the residual saturated motion test 262 is performed before the large shadow region test 261, instead of after as in FIGURE 2C.
  • the result of the process 260b can be the same as the result of the process 260a: the reference image frame is selected from the short reference image frame, the long reference image frame, or the single-exposure blended long image frame.
  • Edge noise filtering is a post-processing operation in which the electronic device 101 performs spatial denoising and edge enhancement to remove noise and improve the appearances of edges in the reference image frame.
  • Various techniques for edge noise filtering are known in the art.
  • the electronic device 101 can also perform one or more image signal processing (ISP) operations 280 using the reference image frame.
  • ISP operations can include a variety of image processing functions, such as lens shading correction, white balancing, demosaicing, RGB matrix correction, gamma correction, YUV conversion, RGB conversion, and the like.
  • the ISP operations 280 can include a tone mapping operation.
  • the tone mapping operation can be used to convert higher-bit Bayer data (such as a radiance map) to standard lower-bit integer Bayer data (such as 10-bit data) in such a way that the visual impressions and details of the original real scene are faithfully reproduced.
  • FIGURE 2H illustrates one example implementation of a tone mapping operation 282 in greater detail. As shown in FIGURE 2H, the tone mapping operation 282 includes a luminance compression step 284 and a contrast enhancement step 286.
  • the electronic device 101 may use a compression function, referred to here as Equation (18), to compress the luminance data and preserve HDR details.
  • In Equation (18), the values of k1 and k2 may be empirically determined, and the key value k may vary (such as between 0.2 and 0.8). Once the key value is known, the remaining parameter can be determined, such as by solving a nonlinear function. With the luminance compression step 284, the overall brightness is compressed to a 10-bit or other lower-bit range.
  • linear quantization groups pixels based on actual pixel values without taking into account an image's pixel distribution.
  • a traditional technique that considers pixel distribution is histogram equalization that clusters pixels based on equally distributing pixel populations.
  • histogram equalization may result in exaggerated contrast in plain areas.
  • the contrast enhancement operation 286 can be performed to interpolate cutting points in quantization using cutting points from linear quantization and histogram matching.
  • the contrast enhancement operation 286 uses a highly-efficient recursive binary cut approach. The contrast enhancement operation 286 strikes a balance between linear quantization and histogram equalization.
  • the process 200 is performed using image frames captured at two exposure levels, namely short and long.
  • the number of exposure levels represented in the captured image frames could be more than two exposure levels.
  • an iterative process may be used for the operations 210a-210b, 220a-220b, 230a-230b, 240a-240b, 250a-250b, and 260.
  • FIGURE 2I illustrates one example implementation in which the operations 210-260 are performed iteratively for image frames captured at multiple exposure levels.
  • the image frames are initially captured at four exposure levels (such as EV-3, EV-2, EV+0, and EV+1).
  • Pairs of exposure levels are selected as short and long exposure levels, and the operations 210-260 are performed in one iteration on the image frames at that pair of exposure levels.
  • the results of the iteration are paired with image frames at another exposure level, and the operations 210-260 are performed in another iteration. This process is repeated until a final iteration results in the final selection of the reference image frame.
  • FIGURES 2A, 2B, 2C, 2D, 2E, 2F, 2G, 2H, and 2I illustrate one example of a process 200 for dynamic selection of a reference image frame
  • various changes may be made to these figures.
  • the process 200 is shown as selecting one reference image frame, other embodiments could result in the selection of more than one reference image frame.
  • the operations of the process 200 can be performed by any suitable component(s) of an electronic device 101 or other device, including the processor 120 of the electronic device 101 or by an image sensor 180 of the electronic device 101.
  • the process 200 has been described above in conjunction with captured image frames in the Bayer ("raw format") domain.
  • the same or similar processing can be performed for image frames in other domains, such as in the YUV ("visual format") domain.
  • brightness matching using the equalization operation 210a cannot be achieved using a trivial multiplication operation due to the lack of linearity in the YUV domain. Instead, a histogram matching (with a long frame) operation can be used.
  • FIGURES 3A and 3B illustrate examples of benefits that can be realized using dynamic selection of a reference image frame in accordance with this disclosure.
  • FIGURES 3A and 3B depict a comparison between an image 301 of a scene captured using conventional image processing techniques and an image 302 of the same scene captured using one of the embodiments disclosed above.
  • the image 301 was captured and processed using a conventional HDR operation.
  • the image 301 includes significant ghost artifacts resulting from a moving hand in a saturated area.
  • the image 302 in FIGURE 3B was captured and processed using a short reference image frame that was selected using the process 200 as described above. The resulting image 302 provides superior HDR results and does not exhibit any ghost artifacts in the moving hand.
  • FIGURES 3A and 3B illustrate one example of benefits that can be realized using dynamic selection of a reference image frame
  • various changes may be made to FIGURES 3A and 3B.
  • FIGURES 3A and 3B are merely meant to illustrate one example of the type of benefits that may be obtained using dynamic selection of a reference image frame. Images of scenes vary widely, and other results may be obtained depending on the scene and the implementation.
  • FIGURE 4 illustrates an example method 400 for dynamic selection of a reference image frame in accordance with this disclosure.
  • the method 400 shown in FIGURE 4 is described as involving the performance of the process 200 using the electronic device 101 shown in FIGURE 1.
  • the method 400 shown in FIGURE 4 could be used with any other suitable electronic device and in any suitable system.
  • multiple image frames of a scene are obtained using at least one image sensor of an electronic device at step 402.
  • the multiple image frames include a plurality of short image frames at a first exposure level and a plurality of long image frames at a second exposure level longer than the first exposure level.
  • a short reference image frame and a long reference image frame are generated using the multiple image frames at step 404.
  • the short reference image frame or the long reference image frame is selected as a reference frame at step 406.
  • a final image of the scene is generated using the reference frame at step 408.
  • This could include, for example, the processor 120 of the electronic device 101 performing one or more ISP operations 280, which can include a tone mapping operation 282. Note that any other desired image processing operations may also occur here to produce the final image of the scene.
  • the final image of the scene can be stored, output, or used in some manner at step 410.
  • This could include, for example, the processor 120 of the electronic device 101 displaying the final image of the scene on the display 160 of the electronic device 101.
  • This could also include the processor 120 of the electronic device 101 saving the final image of the scene to a camera roll stored in a memory 130 of the electronic device 101.
  • This could further include the processor 120 of the electronic device 101 attaching the final image of the scene to a text message, email, or other communication to be transmitted from the electronic device 101.
  • the final image of the scene could be used in any other or additional manner.
  • although FIGURE 4 illustrates one example of a method 400 for dynamic selection of a reference image frame, various changes may be made to FIGURE 4. For example, various steps in FIGURE 4 could overlap, occur in parallel, occur in a different order, or occur any number of times.
  • each of the functions in the electronic device 101 or server 106 can be implemented or supported using one or more software applications or other software instructions that are executed by at least one processor 120 of the electronic device 101 or server 106.
  • at least some of the functions in the electronic device 101 or server 106 can be implemented or supported using dedicated hardware components.
  • the operations of each device can be performed using any suitable hardware or any suitable combination of hardware and software/firmware instructions.
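To make the equalization step above concrete, the following is a minimal sketch of luma histogram matching, assuming 8-bit Y-channel inputs; the function name match_histogram_y and the 256-bin layout are illustrative assumptions and are not taken from the disclosure.

```python
import numpy as np

def match_histogram_y(short_y: np.ndarray, long_y: np.ndarray) -> np.ndarray:
    """Match the luma histogram of a short-exposure frame to a long-exposure frame.

    A single multiplicative gain cannot equalize brightness in the YUV domain
    because the encoding is non-linear, so the cumulative histograms are matched
    instead. Both inputs are assumed to be 8-bit Y-channel arrays.
    """
    # Normalized cumulative histogram of the short frame's luma values.
    short_hist, _ = np.histogram(short_y.ravel(), bins=256, range=(0, 256))
    short_cdf = np.cumsum(short_hist).astype(np.float64)
    short_cdf /= short_cdf[-1]

    # Normalized cumulative histogram of the long frame's luma values.
    long_hist, _ = np.histogram(long_y.ravel(), bins=256, range=(0, 256))
    long_cdf = np.cumsum(long_hist).astype(np.float64)
    long_cdf /= long_cdf[-1]

    # For each short-frame level, find the long-frame level with the closest
    # cumulative probability; this builds a 256-entry lookup table.
    lut = np.interp(short_cdf, long_cdf, np.arange(256))

    # Apply the lookup table to the short frame's luma channel only.
    return lut[short_y].astype(np.uint8)
```

In practice the matching would be applied to the aligned short frame before blending, leaving the chroma channels untouched.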
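Similarly, the tone-mapping step mentioned for the ISP chain can be illustrated with a simple global operator. The sketch below is only a generic Reinhard-style curve, not the tone mapping operation 282 itself; the function name and constants are assumptions.

```python
import numpy as np

def global_tone_map(hdr: np.ndarray, exposure: float = 1.0) -> np.ndarray:
    """Compress a linear HDR blend into an 8-bit displayable image.

    Applies a global Reinhard-style curve x / (1 + x) followed by a simple
    display gamma. `hdr` is assumed to hold non-negative linear radiance values.
    """
    scaled = np.maximum(hdr, 0.0) * exposure      # optional exposure adjustment
    compressed = scaled / (1.0 + scaled)          # global Reinhard operator
    encoded = np.power(compressed, 1.0 / 2.2)     # gamma-encode for display
    return np.clip(encoded * 255.0, 0.0, 255.0).astype(np.uint8)
```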

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

A method includes obtaining, using at least one image sensor of an electronic device, multiple image frames of a scene. The multiple image frames include a plurality of short image frames at a first exposure level and a plurality of long image frames at a second exposure level longer than the first exposure level. The method also includes generating a short reference image frame and a long reference image frame using the multiple image frames. The method further includes selecting, using a processor of the electronic device, the short reference image frame or the long reference image frame as a reference frame, where the selection is based on an amount of saturated motion in the long image frame and an amount of a shadow region in the short image frame. In addition, the method includes generating a final image of the scene using the reference frame.
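As a rough illustration of the selection criterion stated in this abstract, the sketch below chooses between the short and long reference frames by comparing the fraction of long-frame pixels that are both saturated and moving against the fraction of short-frame pixels that fall in deep shadow. The thresholds, the externally supplied motion mask, and the function name are illustrative assumptions rather than values from the disclosure.

```python
import numpy as np

def select_reference(short_y: np.ndarray,
                     long_y: np.ndarray,
                     motion_mask: np.ndarray,
                     sat_thresh: int = 250,
                     shadow_thresh: int = 16) -> str:
    """Return 'short' or 'long' to indicate which reference frame to use.

    short_y, long_y : 8-bit luma planes of the short and long reference frames.
    motion_mask     : boolean mask of pixels flagged as moving between frames.
    """
    total = float(long_y.size)

    # Saturated motion: pixels that are both clipped and moving in the long frame,
    # which would produce ghost artifacts if the long frame were the reference.
    saturated_motion = np.logical_and(long_y >= sat_thresh, motion_mask)
    saturated_motion_ratio = saturated_motion.sum() / total

    # Shadow region: very dark pixels in the short frame, which would be noisy
    # if the short frame were the reference.
    shadow_ratio = (short_y <= shadow_thresh).sum() / total

    # Prefer the short reference when saturated motion in the long frame
    # outweighs the shadow penalty of the short frame.
    return 'short' if saturated_motion_ratio > shadow_ratio else 'long'
```

A weighted comparison or per-region decision could replace the direct ratio test; the point is only to show how the two quantities named in the abstract could drive the choice.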
PCT/KR2020/012038 2019-11-19 2020-09-07 System and method for dynamic selection of reference image frame WO2021101037A1 (fr)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US201962937321P 2019-11-19 2019-11-19
US62/937,321 2019-11-19
US16/719,633 2019-12-18
US16/719,633 US10911691B1 (en) 2019-11-19 2019-12-18 System and method for dynamic selection of reference image frame
KR10-2020-0113210 2020-09-04
KR1020200113210A KR20210061258A (ko) System and method for dynamic selection of reference image frame

Publications (1)

Publication Number Publication Date
WO2021101037A1 (fr)

Family

ID=75980153

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2020/012038 WO2021101037A1 (fr) 2019-11-19 2020-09-07 System and method for dynamic selection of reference image frame

Country Status (1)

Country Link
WO (1) WO2021101037A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110013848A1 (en) * 2009-07-15 2011-01-20 Canon Kabushiki Kaisha Image processing apparatus and method for controlling the same
US20130070965A1 (en) * 2011-09-21 2013-03-21 Industry-University Cooperation Foundation Sogang University Image processing method and apparatus
WO2014172059A2 * 2013-04-15 2014-10-23 Qualcomm Incorporated Reference image selection for motion ghost filtering
KR20150045877A * 2013-10-21 2015-04-29 삼성테크윈 주식회사 Image processing apparatus and image processing method
JP2017208604A * 2016-05-16 2017-11-24 ソニー株式会社 Image processing apparatus, image processing method, imaging apparatus, and program

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113327215A * 2021-05-28 2021-08-31 浙江大华技术股份有限公司 Wide dynamic range image synthesis method and apparatus, electronic device, and storage medium
CN113327215B * 2021-05-28 2022-10-21 浙江大华技术股份有限公司 Wide dynamic range image synthesis method and apparatus, electronic device, and storage medium

Similar Documents

Publication Publication Date Title
WO2021201438A1 System and method for motion warping using multi-exposure frames
US10911691B1 System and method for dynamic selection of reference image frame
WO2020251285A1 Apparatus and method for creating high dynamic range (HDR) images of dynamic scenes using graph cut-based labeling
EP4066216A1 System and method for generating multi-exposure frames from a single input
WO2021177784A1 Super-resolution depth map generation for multi-camera or other environments
WO2019216632A1 Electronic device and method for foveated domain storage and processing
JP6924901B2 Photographing method and electronic device
WO2020171305A1 Apparatus and method for capturing and blending multiple images for high-quality flash photography using a mobile electronic device
WO2022146023A1 System and method for rendering a synthetic depth-of-field effect for videos
WO2022086237A1 Kernel-aware super resolution
WO2021101097A1 Multi-task fusion neural network architecture
WO2021101037A1 System and method for dynamic selection of reference image frame
US11200653B2 Local histogram matching with global regularization and motion exclusion for multi-exposure image fusion
WO2022014791A1 Multi-frame depth-based multi-camera relighting of images
WO2021025375A1 Apparatus and method for efficient regularized image alignment for multi-frame fusion
WO2021112550A1 System and method for generating multi-exposure frames from a single input
WO2021107592A1 System and method for precise image inpainting to remove unwanted content from digital images
CN111602390A Terminal white balance processing method, terminal, and computer-readable storage medium
WO2023038307A1 Image generation using non-linear scaling and tone mapping based on cubic spline curves
WO2021025445A1 Local histogram matching with global regularization and motion exclusion for multi-exposure image fusion
WO2023287162A1 System and method for multi-frame, multi-exposure blending of red green blue white (RGBW) images
WO2022154508A1 Dynamic calibration correction in multi-frame, multi-exposure capture
WO2022197066A1 Pixel blending for synthesizing video frames with occlusion and watermark handling
WO2023149786A1 Method and electronic device for synthesizing image training data and image processing using artificial intelligence
WO2023017977A1 Method for improving quality in an under-display camera system with radially increasing distortion

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20889973

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20889973

Country of ref document: EP

Kind code of ref document: A1