US20180338124A1 - Method of decaying chrominance in images - Google Patents
Method of decaying chrominance in images
- Publication number
- US20180338124A1 (application US16/048,132)
- Authority
- US
- United States
- Prior art keywords
- root
- pixel
- pixels
- processor
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/64—Circuits for processing colour signals
- H04N9/646—Circuits for processing colour signals for image enhancement, e.g. vertical detail restoration, cross-colour elimination, contour correction, chrominance trapping filters
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/70—Denoising; Smoothing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/64—Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/14—Picture signal circuitry for video frequency region
- H04N5/21—Circuitry for suppressing or minimising disturbance, e.g. moiré or halo
- H04N5/213—Circuitry for suppressing or minimising impulsive noise
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/77—Circuits for processing the brightness signal and the chrominance signal relative to each other, e.g. adjusting the phase of the brightness signal relative to the colour signal, correcting differential gain or differential phase
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Human Computer Interaction (AREA)
- Image Processing (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- Studio Devices (AREA)
- Image Analysis (AREA)
Abstract
Description
- This application claims the benefit of U.S. Provisional Application No. 62/468,063, filed on Mar. 7, 2017, and U.S. Provisional Application No. 62/468,874, filed on Mar. 8, 2017, both of which are incorporated herein by reference in their entireties.
- The present invention is directed generally to methods of reducing or removing chromatic noise in images and digital video.
- Luminance noise refers to fluctuations in brightness. Luminance noise may appear as light and dark specks (e.g., within a region of an image in which pixels should have the same or similar brightness). Chromatic or chroma noise refers to fluctuations in color. Chroma noise may appear as specks or blotches of unexpected color(s) (e.g., within a region of an image in which pixels should have the same or similar colors). Chroma noise is often more apparent in very dark or very light areas of an image and may give the image an unnatural appearance.
- Image editing software often includes a user input (e.g., slider) that may be used to remove chroma noise manually. Software may also automatically remove chroma noise by decolorizing any pixels that have an unexpected color when compared to their neighboring pixels. Decolorized pixels are set to black, which essentially converts the chroma noise to luminance noise. Then, other image processing techniques may be applied to the image to remove the luminance noise and improve the overall appearance of the image.
- FIG. 1 is a functional block diagram of a video capture system.
- FIG. 2 is a flow diagram of a method of generating a denoised image performable by the video capture system.
- FIG. 3 is a functional block diagram illustrating an exemplary mobile communication device that may be used to implement the video capture system.
- FIG. 1 illustrates a video capture system 200 configured to capture digital video 203, which may be referred to as an image stream. For example, the digital video 203 may be captured and/or processed as a Real-Time Messaging Protocol (“RTMP”) video stream. By way of a non-limiting example, the video capture system 200 may be implemented as a mobile communication device 140 (described below and illustrated in FIG. 3). The video capture system 200 includes a housing 202, a camera 204, one or more processors 206, memory 208, a display 210, and one or more manual controls 220. The camera 204, the processor(s) 206, the memory 208, and the display 210 may be connected together by a bus 212 (e.g., like a bus system 186 illustrated in FIG. 3).
- The camera 204 is mounted on the housing 202. The camera 204 is configured to capture the digital video 203 and store that digital video 203 in the memory 208. The captured digital video 203 includes a series of root images (e.g., including a root image 240) of a scene. By way of a non-limiting example, the camera 204 may be implemented as a camera or video capture device 158 (see FIG. 3).
- The processor(s) 206 is/are configured to execute software instructions stored in the memory 208. By way of a non-limiting example, the processor(s) 206 may be implemented as a central processing unit (“CPU”) 150 (see FIG. 3) and the memory 208 may be implemented as memory 152 (see FIG. 3).
- The display 210 is positioned to be viewed by the user while the user operates the video capture system 200. The display 210 is configured to display a preview of the digital video 203 being captured by the camera 204. By way of a non-limiting example, the display 210 may be implemented as a conventional display device, such as a touch screen. The display 210 may be mounted on the housing 202. For example, the display 210 may be implemented as a display 154 (see FIG. 3). Alternatively, the display 210 may be implemented as an electronic viewfinder, an auxiliary monitor connected to the video capture system 200, and the like.
- The manual control(s) 220 is/are configured to be operated by the user and may affect properties (e.g., focus, exposure, and the like) of the digital video 203 being captured. The manual control(s) 220 may be implemented as software controls that generate virtual controls displayed by the display 210. In such embodiments, the display 210 may be implemented as a touch screen configured to receive user input that manually manipulates the manual control(s) 220. Alternatively, the manual control(s) 220 may be implemented as physical controls (e.g., buttons, knobs, and the like) disposed on the housing 202 and configured to be manually manipulated by the user. In such embodiments, the manual control(s) 220 may be connected to the processor(s) 206 and the memory 208 by the bus 212.
- By way of non-limiting examples, the manual control(s) 220 may include a focus control 220A, an exposure control 220B, and the like. The focus control 220A may be used to change the focus of the digital video being captured by the camera 204. The exposure control 220B may change an ISO value, shutter speed, aperture, or an exposure value (“EV”) of the digital video being captured by the camera 204.
- The memory 208 stores a noise decay module 230 implemented by the processor(s) 206. In some embodiments, the noise decay module 230 may generate and display the virtual controls implementing the manual control(s) 220. Alternatively, the manual control(s) 220 may be implemented by other software instructions stored in the memory 208.
- FIG. 2 is a flow diagram of a method 280 performed by the noise decay module 230 (see FIG. 1). Referring to FIG. 1, the method 280 (see FIG. 2) generates the denoised image 250 from one of the series of root images of the digital video 203. For ease of illustration, the method 280 (see FIG. 2) will be described as generating the denoised image 250 from the root image 240.
- In first block 282 (see FIG. 2), the noise decay module 230 obtains the root image 240 as a raw bitmap (e.g., directly from the camera 204) before the root image 240 is encoded. The root image 240 includes a plurality of root pixels, each associated with one or more color values within a color space (e.g., a standard Red Green Blue (“sRGB”) color space). In this example, the RGB color values of each root pixel include separate values for red (“Rsrgb”), green (“Gsrgb”), and blue (“Bsrgb”). However, through application of ordinary skill in the art to the present teachings, the method 280 may be adapted for use with other color spaces, such as HSL (Hue, Saturation, Lightness), HSV (Hue, Saturation, Value), and the like.
- In decision block 284 (see FIG. 2), the noise decay module 230 determines whether the root image 240 has linearized gamma values. In other words, has the root image 240 not yet been gamma corrected? The decision in decision block 284 (see FIG. 2) is “YES” when the root image 240 has linearized gamma values, meaning the root image 240 has not yet been gamma corrected. Otherwise, the decision in decision block 284 (see FIG. 2) is “NO.”
- When the decision in decision block 284 (see FIG. 2) is “YES,” the noise decay module 230 advances to block 288 (see FIG. 2). On the other hand, when the decision in decision block 284 (see FIG. 2) is “NO,” in block 286 (see FIG. 2), the noise decay module 230 remaps the root image 240 to linear gamma (e.g., using a shader or a lookup table). For example, if the root pixels are in the sRGB color space and the RGB values (Rsrgb, Gsrgb, and Bsrgb) are scaled to range from 0 to 1, the following formulas may be used to obtain the linear RGB values (Rlinear, Glinear, and Blinear) for each root pixel in the root image 240:
- Clinear = Csrgb/12.92 if Csrgb ≤ 0.04045, and Clinear = ((Csrgb + 0.055)/1.055)^2.4 otherwise, for each channel C ∈ {R, G, B} Eq. 1
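- For illustration only (this code is not part of the patent disclosure), the sRGB-to-linear remapping of Eq. 1 might be sketched in Python with NumPy as follows; the helper name srgb_to_linear is hypothetical:

```python
import numpy as np

def srgb_to_linear(srgb):
    """Remap sRGB values scaled to [0, 1] onto linear gamma (Eq. 1).

    Works element-wise, so it accepts a scalar, a single RGB vector,
    or a whole H x W x 3 bitmap.
    """
    srgb = np.asarray(srgb, dtype=np.float64)
    return np.where(srgb <= 0.04045,
                    srgb / 12.92,
                    ((srgb + 0.055) / 1.055) ** 2.4)
```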
noise decay module 230 advances to block 288 (seeFIG. 2 ). - At this point, the
noise decay module 230 processes each root pixel of theroot image 240 one at a time. Thus, in block 288 (seeFIG. 2 ), thenoise decay module 230 selects one of the root pixels. - Then, in block 290 (see
FIG. 2 ), thenoise decay module 230 calculates a perceptual luminance (“p”) for the selected root pixel. The perceptual luminance (“p”) may be calculated by first calculating a relative luminance (“Y”) for the selected root pixel. The relative luminance (“Y”) refers to the brightness of the selected root pixel. - The relative luminance (“Y”) of a particular pixel may be calculated using the following function in which a variable “s” represents the three linearized RGB color values (Rlinear, Glinear, and Blinear) of the particular pixel expressed as an RGB vector:
-
- Y = s · (0.2126, 0.7152, 0.0722) = 0.2126 Rlinear + 0.7152 Glinear + 0.0722 Blinear Eq. 2
root image 240 centered at the selected root pixel. For example, the region may be three pixels by three pixels. In this example, the selected root pixel may be characterized as being an origin of the region (which includes the root pixel and its eight surrounding neighbors) and assigned a coordinate value of (0, 0). Thus, a separate relative luminance value may be calculated for each of the eight root pixels neighboring the selected root pixel as well as for the selected root pixel. In this example, the following set of nine relative luminance values would be calculated: Y(−1 ,−1), Y(−1,0), Y(−1,1), Y(0,−1), Y(0,0), Y(0,1), Y(1,−1), Y(1,0), and Y(1,1). Then, these relative luminance values may be combined to determine the relative luminance (“Y”) of the selected root pixel. For example, an average or a median of the relative luminance values may be calculated and used as the relative luminance (“Y”) of the selected root pixel. - If the color values of the selected root pixel (represented by the RGB vector “s”) are linear, the perceptual luminance (“p”) of the selected root pixel equals the relative luminance (“Y”) of the selected root pixel. Otherwise, the relative luminance (“Y”) may be linearized to obtain the perceptual luminance (“p”) using the following formula:
- p = Y/12.92 if Y ≤ 0.04045, and p = ((Y + 0.055)/1.055)^2.4 otherwise (i.e., the mapping of Eq. 1 applied to Y) Eq. 3
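- As a non-authoritative sketch of block 290 (Eqs. 2-3), assuming the input bitmap is already linear so that p equals the neighborhood relative luminance, the median-of-neighborhood calculation might look like this; neighborhood_luminance is a hypothetical helper, and clipping the region at the image borders is an assumption the patent does not specify:

```python
import numpy as np

# Rec. 709 / sRGB relative luminance coefficients (Eq. 2).
LUMA_WEIGHTS = np.array([0.2126, 0.7152, 0.0722])

def neighborhood_luminance(linear_rgb, x, y):
    """Median relative luminance of the 3x3 region centered at (x, y).

    linear_rgb: H x W x 3 array of linearized RGB values in [0, 1].
    The region is clipped at the image borders (an assumption; the
    patent does not describe edge handling).
    """
    h, w, _ = linear_rgb.shape
    y0, y1 = max(y - 1, 0), min(y + 2, h)
    x0, x1 = max(x - 1, 0), min(x + 2, w)
    region = linear_rgb[y0:y1, x0:x1]      # up to nine root pixels
    luminances = region @ LUMA_WEIGHTS     # Y = s . (0.2126, 0.7152, 0.0722)
    return float(np.median(luminances))    # used as p when s is linear
```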
- The perceptual luminance (“p”) in the RGB color space may be used by the method 280 (see FIG. 2) for two reasons. First, the human eye is vastly more sensitive to green than to any other color, and the RGB perceptual luminance easily accounts for this sensitivity. Second, digital image sensors (e.g., included in the camera 204) that include an RGB color filter array (“CFA”) configuration produce green channels that are lower in noise than their red and blue counterparts. By using the perceptual luminance (“p”) to determine chrominance decay (or desaturate the root image 240), the method 280 (see FIG. 2) spares (or causes less decay in) higher-quality green-dominant colors in the root image 240.
- Next, in block 292 (see FIG. 2), the noise decay module 230 creates a linear monochromatic RGB vector by setting the value of each of the R, G, and B elements of the linear monochromatic RGB vector equal to the perceptual luminance (“p”).
- linear monochromatic RGB vector = [p, p, p] Eq. 4
- In block 294 (see FIG. 2), the noise decay module 230 multiplies the linear monochromatic RGB vector by a relative-luminance weighted saturation bias (“o”) to obtain a biased monochromatic RGB vector.
- biased monochromatic RGB vector = [o*p, o*p, o*p] Eq. 5
- The relative-luminance weighted saturation bias (“o”) may be calculated using the following formula:
- o = 0.16667 × ln(p) + 1.0 Eq. 6
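- A minimal sketch of blocks 292 and 294 (Eqs. 4-6) follows; it is illustrative only, and the guard against ln(0) and the clamping of o to [0, 1] are assumptions not stated in the patent:

```python
import numpy as np

def biased_monochromatic_vector(p, eps=1e-6):
    """Return the biased monochromatic RGB vector [o*p, o*p, o*p].

    p is the perceptual luminance of the selected root pixel, in [0, 1].
    """
    p = max(p, eps)                   # assumption: avoid ln(0) for pure black
    o = 0.16667 * np.log(p) + 1.0     # relative-luminance weighted saturation bias (Eq. 6)
    o = min(max(o, 0.0), 1.0)         # assumption: clamp the bias to [0, 1]
    return np.array([o * p, o * p, o * p])   # Eqs. 4-5
```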
- In block 296 (see FIG. 2), the noise decay module 230 generates a new pixel of the denoised image 250 with new (desaturated) color values by blending the biased monochromatic RGB vector ([o*p, o*p, o*p]) with the RGB vector ([Rlinear, Glinear, Blinear]) of the selected root pixel. In other words, the biased monochromatic RGB vector is multiplied by a first weight and the RGB vector is multiplied by a second weight, wherein the first and second weights total one. The new color values are less saturated than the original color values associated with the selected root pixel. In particular, dim or less bright areas are more desaturated than brighter areas. Thus, the method 280 may be characterized as desaturating the selected root pixel and/or applying a weighted saturation to the selected root pixel.
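- The blend of block 296 might be sketched as below; the even split between the two weights is an illustrative assumption, since the patent states only that the two weights total one:

```python
import numpy as np

def desaturate_pixel(rgb_linear, mono_biased, first_weight=0.5):
    """Blend the biased monochromatic vector with the root pixel's RGB vector.

    first_weight scales the biased monochromatic vector; the second
    weight, (1 - first_weight), scales the original RGB vector, so the
    two weights total one.
    """
    rgb_linear = np.asarray(rgb_linear, dtype=np.float64)
    return first_weight * mono_biased + (1.0 - first_weight) * rgb_linear
```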
- Next, in decision block 298 (see FIG. 2), the noise decay module 230 determines whether all of the root pixels of the root image 240 have been selected in block 288 (see FIG. 2). The decision in decision block 298 (see FIG. 2) is “YES” when the noise decay module 230 has not yet selected all of the root pixels. When the decision in decision block 298 (see FIG. 2) is “YES,” the noise decay module 230 returns to block 288 and selects a next root pixel from the root image 240.
- On the other hand, the decision in decision block 298 (see FIG. 2) is “NO” when the noise decay module 230 has selected all of the root pixels. When the decision in decision block 298 (see FIG. 2) is “NO,” the method 280 (see FIG. 2) terminates.
- At this point, a new pixel has been generated for each of the root pixels. Combined, the new pixels define the denoised image 250. Optionally, the denoised image 250 may be remapped to a different color space. For example, the linear RGB values may be remapped to the sRGB color space. The denoised image 250 may be subject to one or more additional operations, such as gamma curve remapping, luma curve augmentation (shadow/highlight repair), histogram equalization, additional spatial denoising, RGB mixing, and lookup table application. Optionally, the denoised image 250 may be displayed to the user using the display 210.
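- If the denoised image 250 is remapped back to the sRGB color space, the inverse of Eq. 1 applies; a sketch (again illustrative only, with a hypothetical helper name):

```python
import numpy as np

def linear_to_srgb(linear):
    """Inverse of Eq. 1: remap linear-gamma values in [0, 1] back to sRGB."""
    linear = np.asarray(linear, dtype=np.float64)
    return np.where(linear <= 0.0031308,
                    linear * 12.92,
                    1.055 * np.power(linear, 1.0 / 2.4) - 0.055)
```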
- The method 280 (see FIG. 2) desaturates the root image 240 (or linear bitmap) using the perceptual luminance (“p”) assigned to each root pixel to reduce or minimize chroma noise in critically underexposed (or dark) areas of the root image 240. Darker regions are desaturated more than lighter areas, which may be characterized as progressively desaturating the very darkest pixels (where chrominance typically decomposes in low bit-depth images).
- Referring to FIG. 2, the method 280 does not evaluate high-frequency chrominance of either the root pixel selected in block 288 or its neighborhood. Instead, the method 280 assumes that the occurrence of chrominance anomalies (or chroma noise) progressively increases as the perceptual luminance (“p”) of the selected root pixel (or its neighborhood) approaches zero. Therefore, the method 280 evaluates only the perceptual luminance (“p”) of the selected root pixel (which may be the median relative luminance of its spatial neighborhood). The visual reduction of chrominance noise in darker sectors of the root image 240 is an incidental byproduct of the progressive desaturation process.
- The method 280 decays the chrominance of the root image 240 and generates the denoised image 250 within the gamut of the original color space (e.g., the sRGB color space) of the root image 240.
- FIG. 3 is a functional block diagram illustrating a mobile communication device 140. The mobile communication device 140 may be implemented as a cellular telephone, smart phone, a tablet computing device, a self-contained camera module (e.g., a wired web camera or an Action Camera module), and the like. By way of a non-limiting example, the mobile communication device 140 may be implemented as a smartphone executing iOS or Android OS. The mobile communication device 140 may be configured to capture the digital video 203 (see FIG. 1) and process the digital video 203 as an RTMP video stream.
- The mobile communication device 140 includes the CPU 150. Those skilled in the art will appreciate that the CPU 150 may be implemented as a conventional microprocessor, application specific integrated circuit (ASIC), digital signal processor (DSP), programmable gate array (PGA), or the like. The mobile communication device 140 is not limited by the specific form of the CPU 150.
- The mobile communication device 140 also contains the memory 152. The memory 152 may store instructions and data to control operation of the CPU 150. The memory 152 may include random access memory, read-only memory, programmable memory, flash memory, and the like. The mobile communication device 140 is not limited by any specific form of hardware used to implement the memory 152. The memory 152 may also be integrally formed in whole or in part with the CPU 150.
- The mobile communication device 140 also includes conventional components, such as a display 154 (e.g., operable to display the denoised image 250), the camera or video capture device 158, and a keypad or keyboard 156. These are conventional components that operate in a known manner and need not be described in greater detail. Other conventional components found in wireless communication devices, such as a USB interface, Bluetooth interface, infrared device, and the like, may also be included in the mobile communication device 140. For the sake of clarity, these conventional elements are not illustrated in the functional block diagram of FIG. 3.
- The mobile communication device 140 also includes a network transmitter 162 such as may be used by the mobile communication device 140 for normal network wireless communication with a base station (not shown). FIG. 3 also illustrates a network receiver 164 that operates in conjunction with the network transmitter 162 to communicate with the base station (not shown). In a typical embodiment, the network transmitter 162 and network receiver 164 are implemented as a network transceiver 166. The network transceiver 166 is connected to an antenna 168. Operation of the network transceiver 166 and the antenna 168 for communication with a wireless network (not shown) is well-known in the art and need not be described in greater detail herein.
- The mobile communication device 140 may also include a conventional geolocation module (not shown) operable to determine the current location of the mobile communication device 140.
- The various components illustrated in FIG. 3 are coupled together by the bus system 186. The bus system 186 may include an address bus, data bus, power bus, control bus, and the like. For the sake of convenience, the various busses in FIG. 3 are illustrated as the bus system 186.
- The memory 152 may store instructions executable by the CPU 150. The instructions may implement portions of one or more of the methods described above (e.g., the method 280 illustrated in FIG. 2). Such instructions may be stored on one or more non-transitory computer or processor readable media.
- The foregoing described embodiments depict different components contained within, or connected with, different other components. It is to be understood that such depicted architectures are merely exemplary, and that in fact many other architectures can be implemented which achieve the same functionality. In a conceptual sense, any arrangement of components to achieve the same functionality is effectively “associated” such that the desired functionality is achieved. Hence, any two components herein combined to achieve a particular functionality can be seen as “associated with” each other such that the desired functionality is achieved, irrespective of architectures or intermedial components. Likewise, any two components so associated can also be viewed as being “operably connected,” or “operably coupled,” to each other to achieve the desired functionality.
- While particular embodiments of the present invention have been shown and described, it will be obvious to those skilled in the art that, based upon the teachings herein, changes and modifications may be made without departing from this invention and its broader aspects and, therefore, the appended claims are to encompass within their scope all such changes and modifications as are within the true spirit and scope of this invention. Furthermore, it is to be understood that the invention is solely defined by the appended claims. It will be understood by those within the art that, in general, terms used herein, and especially in the appended claims (e.g., bodies of the appended claims) are generally intended as “open” terms (e.g., the term “including” should be interpreted as “including but not limited to,” the term “having” should be interpreted as “having at least,” the term “includes” should be interpreted as “includes but is not limited to,” etc.). It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases “at least one” and “one or more” to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles “a” or “an” limits any particular claim containing such introduced claim recitation to inventions containing only one such recitation, even when the same claim includes the introductory phrases “one or more” or “at least one” and indefinite articles such as “a” or “an” (e.g., “a” and/or “an” should typically be interpreted to mean “at least one” or “one or more”); the same holds true for the use of definite articles used to introduce claim recitations. In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (e.g., the bare recitation of “two recitations,” without other modifiers, typically means at least two recitations, or two or more recitations).
- Accordingly, the invention is not limited except as by the appended claims.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/048,132 US20180338124A1 (en) | 2017-03-07 | 2018-07-27 | Method of decaying chrominance in images |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201762468063P | 2017-03-07 | 2017-03-07 | |
US201762468874P | 2017-03-08 | 2017-03-08 | |
US15/910,993 US10051252B1 (en) | 2017-03-07 | 2018-03-02 | Method of decaying chrominance in images |
US16/048,132 US20180338124A1 (en) | 2017-03-07 | 2018-07-27 | Method of decaying chrominance in images |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/910,993 Continuation US10051252B1 (en) | 2017-03-07 | 2018-03-02 | Method of decaying chrominance in images |
Publications (1)
Publication Number | Publication Date |
---|---|
US20180338124A1 (en) | 2018-11-22
Family
ID=63078962
Family Applications (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/910,993 Active US10051252B1 (en) | 2017-03-07 | 2018-03-02 | Method of decaying chrominance in images |
US15/910,989 Active US10187637B2 (en) | 2017-03-07 | 2018-03-02 | Inductive micro-contrast evaluation method |
US15/914,673 Active 2038-12-07 US10778947B2 (en) | 2017-03-07 | 2018-03-07 | Sympathetic assistive mutation of live camera preview/display image stream |
US16/048,132 Abandoned US20180338124A1 (en) | 2017-03-07 | 2018-07-27 | Method of decaying chrominance in images |
US16/213,198 Active US10547819B2 (en) | 2017-03-07 | 2018-12-07 | Inductive micro-contrast evaluation method |
Family Applications Before (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/910,993 Active US10051252B1 (en) | 2017-03-07 | 2018-03-02 | Method of decaying chrominance in images |
US15/910,989 Active US10187637B2 (en) | 2017-03-07 | 2018-03-02 | Inductive micro-contrast evaluation method |
US15/914,673 Active 2038-12-07 US10778947B2 (en) | 2017-03-07 | 2018-03-07 | Sympathetic assistive mutation of live camera preview/display image stream |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/213,198 Active US10547819B2 (en) | 2017-03-07 | 2018-12-07 | Inductive micro-contrast evaluation method |
Country Status (2)
Country | Link |
---|---|
US (5) | US10051252B1 (en) |
WO (1) | WO2018165023A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10051252B1 (en) * | 2017-03-07 | 2018-08-14 | Filmic Inc. | Method of decaying chrominance in images |
US10645357B2 (en) * | 2018-03-01 | 2020-05-05 | Motorola Mobility Llc | Selectively applying color to an image |
CN109743473A (en) * | 2019-01-11 | 2019-05-10 | 珠海全志科技股份有限公司 | Video image 3 D noise-reduction method, computer installation and computer readable storage medium |
US11423588B2 (en) * | 2019-11-05 | 2022-08-23 | Adobe Inc. | Color transforms using static shaders compiled at initialization |
Family Cites Families (54)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5496106A (en) | 1994-12-13 | 1996-03-05 | Apple Computer, Inc. | System and method for generating a contrast overlay as a focus assist for an imaging device |
US6075875A (en) | 1996-09-30 | 2000-06-13 | Microsoft Corporation | Segmentation of image features using hierarchical analysis of multi-valued image data and weighted averaging of segmentation results |
US6667738B2 (en) | 1998-01-07 | 2003-12-23 | Vtech Communications, Ltd. | Touch screen overlay apparatus |
US6504575B1 (en) | 1998-02-27 | 2003-01-07 | Flashpoint Technology, Inc. | Method and system for displaying overlay bars in a digital imaging device |
US6996549B2 (en) | 1998-05-01 | 2006-02-07 | Health Discovery Corporation | Computer-aided image analysis |
US6326935B1 (en) | 1999-09-28 | 2001-12-04 | Gateway, Inc. | Method and apparatus for changing the mode of a display apparatus |
KR100372092B1 (en) | 2001-02-06 | 2003-02-14 | 주식회사 우리기술 | Medium Player for playing moving picture on the background of the screen and The Processing Method for moving picture for using it and A computer-readable Storage Medium for executing the above Medium Player or Method |
US20030103062A1 (en) | 2001-11-30 | 2003-06-05 | Ruen-Rone Lee | Apparatus and method for controlling a stereo 3D display using overlay mechanism |
US20030202015A1 (en) | 2002-04-30 | 2003-10-30 | Battles Amy E. | Imaging device user interface method and apparatus |
TW200424920A (en) | 2003-05-07 | 2004-11-16 | Acer Inc | Display apparatus having customized on-screen-display and method thereof |
US7269295B2 (en) * | 2003-07-31 | 2007-09-11 | Hewlett-Packard Development Company, L.P. | Digital image processing methods, digital image devices, and articles of manufacture |
US20060044328A1 (en) | 2004-08-26 | 2006-03-02 | Rai Barinder S | Overlay control circuit and method |
US8014034B2 (en) | 2005-04-13 | 2011-09-06 | Acd Systems International Inc. | Image contrast enhancement |
WO2007058895A2 (en) | 2005-11-11 | 2007-05-24 | Visualsonics Inc. | Overlay image contrast enhancement |
US7492938B2 (en) | 2006-02-14 | 2009-02-17 | Intelliscience Corporation | Methods and systems for creating data samples for data analysis |
US20100092082A1 (en) | 2006-11-29 | 2010-04-15 | Keigo Hirakawa | framework for wavelet-based analysis and processing of color filter array images with applications to denoising and demosaicing |
US8570426B2 (en) | 2008-11-25 | 2013-10-29 | Lytro, Inc. | System of and method for video refocusing |
CN101316321B (en) * | 2007-05-30 | 2010-04-07 | 展讯通信(上海)有限公司 | Pattern noise removal method and device based on median filter |
US7796829B2 (en) | 2008-12-10 | 2010-09-14 | The United States Of America As Represented By The Secretary Of The Army | Method and system for forming an image with enhanced contrast and/or reduced noise |
US8284271B2 (en) | 2009-06-05 | 2012-10-09 | Apple Inc. | Chroma noise reduction for cameras |
US8416262B2 (en) | 2009-09-16 | 2013-04-09 | Research In Motion Limited | Methods and devices for displaying an overlay on a device display screen |
US8508624B1 (en) | 2010-03-19 | 2013-08-13 | Ambarella, Inc. | Camera with color correction after luminance and chrominance separation |
US8922704B2 (en) | 2010-09-01 | 2014-12-30 | Apple Inc. | Techniques for collection of auto-focus statistics |
US8531542B2 (en) | 2010-09-01 | 2013-09-10 | Apple Inc. | Techniques for acquiring and processing statistics data in an image signal processor |
US8786625B2 (en) | 2010-09-30 | 2014-07-22 | Apple Inc. | System and method for processing image data using an image signal processor having back-end processing logic |
US8508612B2 (en) | 2010-09-30 | 2013-08-13 | Apple Inc. | Image signal processor line buffer configuration for processing ram image data |
US8699813B2 (en) | 2010-11-19 | 2014-04-15 | Analog Devices, Inc | Adaptive filter for low-light noise reduction |
US8878950B2 (en) * | 2010-12-14 | 2014-11-04 | Pelican Imaging Corporation | Systems and methods for synthesizing high resolution images using super-resolution processes |
US8701020B1 (en) | 2011-02-01 | 2014-04-15 | Google Inc. | Text chat overlay for video chat |
WO2012144817A2 (en) | 2011-04-19 | 2012-10-26 | Samsung Electronics Co., Ltd. | Method and apparatus for defining overlay region of user interface control |
US9448619B1 (en) | 2011-11-30 | 2016-09-20 | Google Inc. | Video advertisement overlay system and method |
US9743057B2 (en) | 2012-05-31 | 2017-08-22 | Apple Inc. | Systems and methods for lens shading correction |
US9332239B2 (en) | 2012-05-31 | 2016-05-03 | Apple Inc. | Systems and methods for RGB image processing |
US9105078B2 (en) | 2012-05-31 | 2015-08-11 | Apple Inc. | Systems and methods for local tone mapping |
US9025867B2 (en) | 2012-05-31 | 2015-05-05 | Apple Inc. | Systems and methods for YCC image processing |
US9014504B2 (en) | 2012-05-31 | 2015-04-21 | Apple Inc. | Systems and methods for highlight recovery in an image signal processor |
US9077943B2 (en) | 2012-05-31 | 2015-07-07 | Apple Inc. | Local image statistics collection |
US20130321675A1 (en) | 2012-05-31 | 2013-12-05 | Apple Inc. | Raw scaler with chromatic aberration correction |
WO2014070273A1 (en) | 2012-11-02 | 2014-05-08 | Board Of Regents, The University Of Texas System | Recursive conditional means image denoising |
WO2014174919A1 (en) | 2013-04-26 | 2014-10-30 | 浜松ホトニクス株式会社 | Image acquisition device and focusing method for image acquisition device |
EP2804378A1 (en) | 2013-05-14 | 2014-11-19 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Chroma subsampling |
EP2887634B1 (en) | 2013-12-23 | 2018-09-19 | Thomson Licensing | Method of mapping source colors from a source color gamut into a target color gamut |
ES2690371T3 (en) | 2013-12-27 | 2018-11-20 | Thomson Licensing | Method for reverse tonal mapping of an image |
US9619862B2 (en) | 2014-05-30 | 2017-04-11 | Apple Inc. | Raw camera noise reduction using alignment mapping |
US10147017B2 (en) | 2014-06-20 | 2018-12-04 | Qualcomm Incorporated | Systems and methods for obtaining structural information from a digital image |
US20150378558A1 (en) | 2014-06-30 | 2015-12-31 | Reliance Jio Infocomm Usa, Inc. | System and method for providing a user-controlled overlay for user interface |
US9525804B2 (en) | 2014-08-30 | 2016-12-20 | Apple Inc. | Multi-band YCbCr noise modeling and noise reduction based on scene metadata |
US9583035B2 (en) | 2014-10-22 | 2017-02-28 | Snaptrack, Inc. | Display incorporating lossy dynamic saturation compensating gamut mapping |
CN104486607B (en) | 2014-12-31 | 2016-08-24 | 上海富瀚微电子股份有限公司 | A kind of method and device of image chroma noise reduction |
US9659349B2 (en) | 2015-06-12 | 2017-05-23 | Gopro, Inc. | Color filter array scaler |
KR101785027B1 (en) | 2016-01-14 | 2017-11-06 | 주식회사 라온텍 | Image distortion compensation display device and image distortion compensation method using the same |
EP4273799A3 (en) | 2016-02-08 | 2023-12-06 | Imago Systems, Inc. | System and method for the visualization and characterization of objects in images |
WO2017175231A1 (en) | 2016-04-07 | 2017-10-12 | Carmel Haifa University Economic Corporation Ltd. | Image dehazing and restoration |
US10051252B1 (en) * | 2017-03-07 | 2018-08-14 | Filmic Inc. | Method of decaying chrominance in images |
-
2018
- 2018-03-02 US US15/910,993 patent/US10051252B1/en active Active
- 2018-03-02 US US15/910,989 patent/US10187637B2/en active Active
- 2018-03-05 WO PCT/US2018/020921 patent/WO2018165023A1/en active Application Filing
- 2018-03-07 US US15/914,673 patent/US10778947B2/en active Active
- 2018-07-27 US US16/048,132 patent/US20180338124A1/en not_active Abandoned
- 2018-12-07 US US16/213,198 patent/US10547819B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
US10778947B2 (en) | 2020-09-15 |
US10187637B2 (en) | 2019-01-22 |
US20180262752A1 (en) | 2018-09-13 |
WO2018165023A1 (en) | 2018-09-13 |
US20190110036A1 (en) | 2019-04-11 |
US10051252B1 (en) | 2018-08-14 |
US20180262688A1 (en) | 2018-09-13 |
US10547819B2 (en) | 2020-01-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20240137658A1 (en) | Global tone mapping | |
US10051252B1 (en) | Method of decaying chrominance in images | |
US9538093B2 (en) | Forming high-dynamic-range (HDR) images using single-channel data | |
US6792160B2 (en) | General purpose image enhancement algorithm which augments the visual perception of detail in digital images | |
EP1302898B1 (en) | System and method for digital image tone mapping using an adaptive sigmoidal function based on perceptual preference guidelines | |
US8417064B2 (en) | Image processing device and method, program and recording medium | |
US8391598B2 (en) | Methods for performing local tone mapping | |
US10469760B2 (en) | High dynamic range imaging | |
CN107635103B (en) | Image processing method, mobile terminal and medium product | |
US8797427B2 (en) | Image processing apparatus | |
US20100278423A1 (en) | Methods and systems for contrast enhancement | |
US9177396B2 (en) | Image processing apparatus and image processing method | |
US20120093433A1 (en) | Dynamic Adjustment of Noise Filter Strengths for use with Dynamic Range Enhancement of Images | |
JP2007094742A (en) | Image signal processor and image signal processing program | |
CN105960658B (en) | Image processing apparatus, image capturing apparatus, image processing method, and non-transitory storage medium that can be processed by computer | |
EP1911267A1 (en) | Compensating for improperly exposed areas in digital images | |
US20110187891A1 (en) | Methods and Systems for Automatic White Balance | |
WO2013187133A1 (en) | Image processing device and image processing method | |
CN108629738B (en) | Image processing method and device | |
JP6335614B2 (en) | Image processing apparatus, control method thereof, and program | |
US20210297558A1 (en) | Cubiform method | |
US9055232B2 (en) | Image processing apparatus capable of adding soft focus effects, image processing method, and storage medium | |
Adams et al. | Perceptually based image processing algorithm design | |
Narasimha et al. | A real-time high dynamic range HD video camera |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: EVERNOTE CORPORATION, CALIFORNIA Free format text: MERGER;ASSIGNOR:FILMIC, INC.;REEL/FRAME:065613/0721 Effective date: 20231020 |