US9691138B2 - System and method for adjusting pixel saturation - Google Patents
System and method for adjusting pixel saturation
- Publication number
- US9691138B2 (application US14/014,592, US201314014592A)
- Authority
- US
- United States
- Prior art keywords
- target
- input
- aerial image
- pixel
- saturation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Links
Images
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/90—Dynamic range modification of images or parts thereof
- G06T5/94—Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
- G06T5/008—
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
- H04N1/56—Processing of colour picture signals
- H04N1/60—Colour correction or control
- H04N1/6002—Corrections within particular colour systems
- H04N1/6005—Corrections within particular colour systems with luminance or chrominance signals, e.g. LC1C2, HSL or YUV
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
- H04N1/56—Processing of colour picture signals
- H04N1/60—Colour correction or control
- H04N1/6027—Correction or control of colour gradation or colour contrast
Definitions
- the pixel saturation within an image generally relates to the vividness of the colors contained within the image. Specifically, each pixel within an image may have a saturation value associated with the dominance of the hue within the pixel's color. In many instances, people prefer images with more saturated, vivid colors. Unfortunately, satellite and aerial images are often substantially under-saturated and, thus, lack vivid colors. This is particularly true for satellite and aerial images taken at higher elevations, as the effects of atmospheric conditions tend to produce flat, non-vivid colors. As a result, it is often desirable to adjust the pixel saturation within such under-saturated images to levels at or above the saturation levels of the colors found in nature.
- the present subject matter is directed to a tangible, non-transitory computer-readable medium storing computer-executable instructions that, when executed by one or more processors, cause the processor(s) to perform specific operations.
- the operations may generally include accessing a target distribution function associated with at least one target image and an input distribution function associated with at least one input image.
- the target distribution function may define a target probability for a pixel saturation of each pixel within the target image(s).
- the input distribution function may define an input probability for an initial saturation value of each pixel within the input image(s), with the input image(s) differing from the target image(s).
- exemplary aspects of the present disclosure are directed to other methods, systems, apparatus, non-transitory computer-readable media, user interfaces and devices for adjusting the pixel saturation of a plurality of images.
- FIG. 2 illustrates a flow diagram of one embodiment of a method for adjusting pixel saturation in accordance with aspects of the present subject matter;
- FIG. 3 illustrates a histogram charting saturation values for an example set of target images;
- FIG. 4 illustrates a histogram charting saturation values for an example set of input images;
- FIG. 5 illustrates cumulative distribution functions associated with the saturation values shown in FIGS. 3 and 4 , particularly illustrating a cumulative distribution function of the pixel saturations within the example set of target images and a cumulative distribution function of the pixel saturations within the example set of input images;
- FIG. 6 illustrates the cumulative distribution function shown in FIG. 5 for the pixel saturations within the target images before and after modification using a saturation modifier.
- the present subject matter is directed to a system and method for adjusting the pixel saturation of a plurality of input images.
- the saturation values of the pixels contained within the input images may be adjusted based on the saturation values of pixels contained within a plurality of reference or target images having one or more desired saturation characteristics, such as a desired saturation distribution and/or desired saturation values.
- the saturation value for each pixel contained within the input images may be transformed or converted to a target saturation value, thereby allowing the cumulative distribution of the pixel saturations of the converted input images to match the cumulative distribution of the pixel saturations of the target images without otherwise altering the appearance of the input images.
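The transformation described in the bullets above can be sketched in a few lines (an illustration only, not the patent's implementation; the 0-to-1 saturation range, the bin count, and the empirical-CDF representation are assumptions):

```python
import numpy as np

def match_saturation(input_sat, target_sat, bins=256):
    """Map input saturation values so their distribution matches target_sat.

    input_sat, target_sat: 1-D arrays of per-pixel saturations in [0, 1].
    Returns the transformed input saturations.
    """
    grid = np.linspace(0.0, 1.0, bins)
    # Empirical CDFs: fraction of pixels with saturation <= each grid value.
    icdf = np.searchsorted(np.sort(input_sat), grid, side="right") / input_sat.size
    tcdf = np.searchsorted(np.sort(target_sat), grid, side="right") / target_sat.size
    # For each input pixel: its probability under the input CDF, then the
    # saturation at which the target CDF reaches that same probability.
    probs = np.interp(input_sat, grid, icdf)
    return np.interp(probs, tcdf, grid)
```

Because only saturation values are remapped (hue and value are untouched), the overall appearance of the input images is preserved while their saturation distribution is pulled toward the target's.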
- the disclosed system and method may be advantageously applied to satellite and/or aerial imagery.
- aerial images are often under-saturated (especially images taken at higher altitudes) due to the effects of atmospheric conditions and, thus, typically contain flat, non-vivid colors.
- the present subject matter may be used to normalize the pixel saturation of such aerial images based on a different set of aerial images (e.g., target images) having desired saturation characteristics.
- the target images selected may correspond to aerial images taken at low altitudes such that the effect of atmospheric conditions on pixel saturation is minimized.
- the saturation values of the pixels contained within such target images may then be analyzed to define a target CDF for pixel saturation, which may be used for transforming the initial saturation values of the pixels contained within the under-saturated images to new, enhanced saturation values.
- for similar sets of aerial images (e.g., sets of aerial images depicting the same landscape), the system and method disclosed herein may be utilized to normalize the saturations of the differing sets of aerial images.
- the content contained within the target images may be generally representative of the content contained within the input images. For instance, if the input images correspond to a set of aerial images within which 30% of the pixels are associated with the depiction of open fields and 20% of the pixels are associated with the depiction of buildings, it may be desirable for the set of target images to contain similar pixel percentages.
- random images of the world are independent and identically distributed (IID). Specifically, for a large set of aerial images, it may be assumed that the images provide a sufficient approximation of IID pixels. In such instances, for any set of target aerial images, a sufficient variation in saturation may be provided to allow the pixel saturation for a set of input images to be properly transformed using the disclosed system and method.
- the disclosed system and method may generally be utilized to adjust the pixel saturation of any suitable images.
- the pixel saturation of images captured using conventional point and shoot cameras may also be adjusted using the disclosed system and method.
- FIG. 1 illustrates one embodiment of a system 100 for adjusting pixel saturation in accordance with aspects of the present subject matter.
- the system 100 may allow for the pixel saturation of a plurality of input images to be normalized or otherwise transformed based on the pixel saturation of a plurality of target images.
- the system 100 may include a client-server architecture where a server 110 communicates with one or more clients, such as a local client device 140 , over a network 160 .
- the server 110 may generally be any suitable computing device, such as a remote web server(s) or a local server(s), and/or any suitable combination of computing devices.
- the server 110 may be implemented as a parallel or distributed system in which two or more computing devices act together as a single server.
- the client device 140 may generally be any suitable computing device(s), such as a laptop(s), desktop(s), smartphone(s), tablet(s), mobile device(s), wearable computing device(s), a display with one or more processors coupled thereto and/or embedded therein and/or any other computing device(s). Although a single client device 140 is shown in FIG. 1 , it should be appreciated that any number of clients may be connected to the server 110 over the network 160 .
- the server 110 may host a GIS 124 , such as a mapping application (e.g. the Google Maps mapping services provided by Google Inc.), a virtual globe application (e.g. the Google Earth virtual globe application provided by Google Inc.), or any other suitable geographic information system.
- the client device 140 may present a user interface that allows a user to interact with the GIS 124 .
- the user interface may be served through a network or web-based application executed on the client device 140 , such as a web browser, a thin client application or any other suitable network or web-based application, or the user interface may be served locally on the client device 140 .
- the server 110 may transmit geospatial data, such as satellite and/or aerial imagery and other data (e.g., terrain and vector data), over the network 160 to the client device 140 .
- the client device 140 may render the geospatial data, via the user interface, in a display device associated with the client device 140 .
- a user may then access and/or interact with the data presented in the user interface.
- the server 110 may include a processor(s) 112 and a memory 114 .
- the processor(s) 112 may be any suitable processing device, such as a microprocessor, microcontroller, integrated circuit, or other suitable processing device.
- the memory 114 may include any suitable computer-readable medium or media, including, but not limited to, non-transitory computer-readable media, RAM, ROM, hard drives, flash drives, or other memory devices.
- the memory 114 may store information accessible by processor(s) 112 , including instructions 116 that can be executed by processor(s) 112 and data 118 that can be retrieved, manipulated, created, or stored by processor(s) 112 .
- the data 118 may be stored in one or more databases.
- the memory 114 may include an image database 120 for storing image files associated with a plurality of images.
- images may correspond to a plurality of target images having one or more desired saturation characteristics.
- the target images may be selected based on the presence of a desired saturation distribution across the pixels contained within the images.
- the images stored within the image database 120 may also correspond to a plurality of input images having saturation characteristics that are desired to be and/or that have already been transformed based on the saturation characteristics of the target images.
- images taken by satellite and/or aerial imaging equipment may be stored within the image database 120 as input images.
- satellite and aerial images are often under-saturated, especially those taken at higher altitudes.
- the under-saturated images may be stored within the image database 120 and the pixel saturation of such input images may be subsequently transformed based on the pixel saturation of the target images.
- the transformed input images may then be made available to one or more client devices 140 , such as by serving the transformed input images as part of a geospatial asset (e.g., a 3-D globe or 2-D map) made available via the GIS 124 hosted by the server 110 .
- the memory 114 may also include a saturation database 122 for storing data associated with the pixel saturation of any suitable images, including any images stored within the image database 120 .
- saturation values for each pixel contained within the target images and/or the input images may be stored within the saturation database 122 .
- various other types of data associated with the saturation characteristics of the images may also be stored within the saturation database 122 , such as histogram data, probability density functions and/or cumulative distribution functions (CDFs) related to the pixel saturation of the target images and/or the input images.
- one or more reference or look-up tables may also be stored within the saturation database 122 for converting the initial saturation values of the pixels contained within the input images to new, target saturation values.
- the saturation value for each pixel contained within the target images and/or the input images may be initially determined using any suitable means and/or method known in the art. For instance, if the images stored within the image database 120 are represented using an RGB color model, the saturation values for each pixel may be obtained by converting the color model to an HSV color model (or any other suitable color model) using known conversion algorithms and techniques.
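As a minimal sketch of such a conversion, Python's standard colorsys module implements the standard RGB-to-HSV algorithm (the 8-bit 0-255 RGB range is an assumption):

```python
import colorsys

def pixel_saturation(r, g, b):
    """Return the HSV saturation (0.0-1.0) of an 8-bit RGB pixel."""
    _, s, _ = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return s

# A flat, grayish pixel has low saturation; a pure red pixel is fully saturated.
print(pixel_saturation(120, 110, 100))  # low
print(pixel_saturation(255, 0, 0))      # 1.0
```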
- the instructions 116 stored within the memory 114 may be executed by the processor(s) 112 to implement a saturation conversion module 126 .
- the saturation conversion module 126 may be configured to adjust the pixel saturation of each pixel contained within the input images to a target pixel saturation based on the saturation distribution of the target images. For instance, a given input pixel within the input images may have an initial saturation value (e.g., a value ranging from 0 to 1 or from 0 to 255).
- the saturation conversion module 126 may be configured to determine a target saturation value for such input pixel. The module 126 may then adjust the initial saturation value to the corresponding target saturation value.
- the saturation characteristics of the input images may be transformed to those of the target images.
- the server 110 may, in several embodiments, be configured to host a GIS 124 that allows the server to communicate with a corresponding GIS client(s) 150 running on the client device 140 .
- geospatial data including satellite and/or aerial imagery, may be transmitted to and rendered by the client device 140 .
- the imagery rendered by the client device 140 may correspond to one or more input images for which the pixel saturation has been adjusted based on the saturation distribution of the corresponding target images.
- module refers to computer logic utilized to provide desired functionality.
- a module may be implemented in hardware, application specific circuits, firmware and/or software controlling a general purpose processor.
- the modules may be program code files stored on a storage device, loaded into memory, and executed by a processor, or may be provided from computer program products (e.g., computer-executable instructions) stored in a tangible computer-readable storage medium, such as RAM, ROM, a hard disk, or optical or magnetic media.
- the server 110 may also include a network interface 128 for providing communications over the network 160 .
- the network interface 128 may be any device/medium that allows the server 110 to interface with the network 160 .
- the client device 140 may also include one or more processors 142 and associated memory 144 .
- the processor(s) 142 may be any suitable processing device known in the art, such as a microprocessor, microcontroller, integrated circuit, or other suitable processing device.
- the memory 144 may be any suitable computer-readable medium or media, including, but not limited to, non-transitory computer-readable media, RAM, ROM, hard drives, flash drives, or other memory devices.
- the memory 144 may be configured to store various types of information, such as data 146 that may be accessed by the processor(s) 142 and instructions 148 that may be executed by the processor(s) 142 .
- the data 146 may generally correspond to any suitable files or other data that may be retrieved, manipulated, created, or stored by processor(s) 142 . In several embodiments, the data 146 may be stored in one or more databases.
- the instructions 148 stored within the memory 144 of the client device 140 may generally be any set of instructions that, when executed by the processor(s) 142 , cause the processor(s) 142 to provide desired functionality.
- the instructions 148 may be software instructions rendered in a computer readable form or the instructions may be implemented using hard-wired logic or other circuitry.
- suitable instructions may be stored within the memory 144 for implementing one or more GIS clients 150 , such as one or more earth-browsing clients and/or mapping clients, designed to render the geospatial data (including satellite and/or aerial imagery) associated with the geospatial assets available via the GIS 124 .
- the GIS client(s) 150 may be configured to retrieve imagery data from the server 110 and render such images for display/use by the user.
- the client device 140 may also include a network interface 152 for providing communications over the network 160 . Similar to the interface 128 for the server 110 , the network interface 152 may generally be any device/medium that allows the client device 140 to interface with the network 160 .
- the network 160 may be any type of communications network, such as a local area network (e.g. intranet), wide area network (e.g. Internet), or some combination thereof.
- the network can also include a direct connection between the client device 140 and the server 110 .
- communication between the server 110 and the client device 140 may be carried via a network interface using any type of wired and/or wireless connection, using a variety of communication protocols (e.g. TCP/IP, HTTP, SMTP, FTP), encodings or formats (e.g. HTML, XML), and/or protection schemes (e.g. VPN, secure HTTP, SSL).
- referring now to FIG. 2 , a flow diagram of one embodiment of a method for adjusting the pixel saturation of a plurality of images is illustrated in accordance with aspects of the present subject matter.
- the method 200 will generally be discussed herein with reference to the system 100 shown in FIG. 1 .
- although the method elements 202 - 208 are shown in FIG. 2 in a specific order, the various elements of the disclosed method 200 may generally be performed in any suitable order that is consistent with the disclosure provided herein.
- the method 200 includes accessing a target distribution function associated with the pixel saturation of a plurality of target images.
- the server 110 may include an image database 120 for storing data associated with a plurality of images, including a plurality of target images having one or more desired saturation characteristics. For instance, target images may be selected that have desired pixel saturation distributions and may be subsequently stored on the image database 120 . In several embodiments, different sets of target images may be stored and/or defined within the image database 120 . For example, target images may be grouped based on content so that a specific target image group may be selected that provides a sufficient representation of the content contained within any input images being transformed using the disclosed method 200 .
- the saturation values of the pixels contained within the target images may be initially analyzed to define a probability density function for such images.
- FIG. 3 illustrates a histogram of the saturation values for pixels contained within an example set of target images.
- for each saturation value, the histogram charts the number of pixels having such saturation value or having a saturation value falling within an individual range of values.
- the histogram data may then be used to define the probability density function for the target images.
- a best-fit curve 302 may be defined that serves as an approximation or representation of the probability density function of the distribution of the pixel saturation for the target images.
- the probability density function may, in turn, be used to define the target CDF for the images (e.g., by taking the integral of the probability density function).
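The discrete analogue of that integral is a cumulative sum over the normalized histogram. A sketch (the bin count and the 0-to-1 saturation range are assumptions):

```python
import numpy as np

def histogram_to_cdf(saturations, bins=256):
    """Build an empirical CDF from per-pixel saturation values in [0, 1]."""
    counts, edges = np.histogram(saturations, bins=bins, range=(0.0, 1.0))
    pdf = counts / counts.sum()   # probability density (mass per bin)
    cdf = np.cumsum(pdf)          # running integral of the density
    return edges[1:], cdf         # saturation value -> P(S <= value)
```

By construction the CDF is non-decreasing and reaches 1.0 at the maximum saturation value.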
- FIG. 5 illustrates the target CDF (curve 502 ) obtained using the histogram data of FIG. 3 .
- the target CDF 502 defines the percentile or probability (y-axis) that a particular pixel within the target images will have a saturation value (x-axis) that is less than or equal to a given saturation value. For instance, using the example data shown in FIG. 5 , 80% of the pixels contained within the target images have a saturation value that is less than or equal to about 0.65 while 40% of the pixels have a saturation value that is less than or equal to about 0.475.
- the method 200 includes accessing an input distribution function associated with the pixel saturation for a plurality of input images.
- the image database 120 of the server 110 may be configured to store a plurality of input images.
- the input images may, in several embodiments, correspond to images having saturation characteristics that are less desirable than the saturation characteristics of the target images.
- the input images may correspond to under-saturated satellite or aerial imagery (e.g., images taken at high altitudes). In such instance, it may be desirable to normalize or otherwise transform the pixel saturation of the input images based on the pixel saturation of the target images.
- the saturation values of the pixels contained within the input images may be analyzed to define an input distribution function for such images (hereinafter referred to as the input CDF).
- FIG. 4 illustrates a histogram of the saturation values for pixels contained within an example set of input images.
- the histogram data corresponds to input images having under-saturated pixels and, thus, the data differs significantly from that of the target images.
- a probability density function may be defined for the input images (e.g., such as by defining a best-fit curve 402 based on the data), which may then be used to define the input CDF.
- FIG. 5 illustrates the input CDF (curve 504 ) obtained based on the histogram data of FIG. 4 .
- the input CDF 504 is shifted to the left compared to the target CDF 502 .
- unlike the target images, in which 80% of the pixels have a saturation value that is less than or equal to about 0.65, 80% of the pixels contained within the input images have a saturation value that is less than or equal to about 0.4.
- 40% of the pixels contained within the input images have a saturation value that is less than or equal to about 0.3.
- although the present subject matter is generally described herein as using a plurality of target and input images, the techniques disclosed herein may also be applicable when only a single target image and/or a single input image exists.
- the target CDF may be defined based on the saturation values contained within a single target image.
- the input CDF may be defined based on the saturation values contained within a single input image.
- the method 200 includes associating the initial saturation values of the pixels within the plurality of input images with target saturation values based on the input and target distribution functions.
- each input pixel of the input images may have an initial saturation value.
- based on the input CDF, an input probability (e.g., the probability that the pixel saturation of an input pixel is less than or equal to a given saturation value) may be determined for each input pixel. For instance, using the example data shown in FIG. 5 , for an input pixel having an initial saturation value of 0.3, the input probability associated with the pixel is equal to about 0.4.
- a target saturation value for the input pixel may be determined based on the target CDF and the corresponding input probability.
- the target saturation value may be selected based on the saturation value at which the target probability (e.g., the probability that the pixel saturation of a pixel of the target images is less than or equal to a given saturation value) is equal to the input probability for the input pixel. For instance, referring again to the example data shown in FIG. 5 , for a probability value of about 0.4, the corresponding target saturation value is about 0.475.
- the method 200 includes adjusting the pixel saturation for each input pixel from its initial saturation value to the target saturation value. For instance, in the example described above with reference to FIG. 5 , for an input pixel having an initial saturation value of 0.3, the saturation value may be adjusted to a target saturation value of 0.475.
- the distribution of pixel saturations for the input images may be transformed to the same distribution of pixel saturations for the target images without otherwise altering the appearance of the input images.
- a function may be defined for the target CDF (e.g., tcdf(x)) that maps target pixel saturation values to the portion of pixels contained within the target images having a saturation value less than or equal to a given saturation value (e.g., the target probability).
- a function may be defined for the input CDF (e.g., icdf(x)) that maps input pixel saturation values to the portion of pixels contained within the input images having a saturation value less than or equal to a given saturation value (e.g., the input probability).
- an inverse function may be defined for the target CDF (e.g., tcdf_inv(y)) that maps the target probabilities to the corresponding target saturation values.
- the function tcdf_inv(icdf(x)) may be utilized, wherein x is the initial saturation value.
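Assuming both CDFs are sampled as monotone arrays on a shared saturation grid (an assumed representation; the patent does not prescribe one), the composition tcdf_inv(icdf(x)) reduces to two interpolations:

```python
import numpy as np

def transform(x, grid, icdf, tcdf):
    """tcdf_inv(icdf(x)): initial saturation -> matched target saturation.

    grid: increasing saturation values; icdf/tcdf: the CDFs sampled on grid.
    """
    y = np.interp(x, grid, icdf)      # icdf(x): input probability
    return np.interp(y, tcdf, grid)   # tcdf_inv(y): target saturation
```

The second interpolation inverts the target CDF by swapping its axes, which is valid because a CDF is non-decreasing.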
- one or more reference or look-up tables may be created that correlate the initial saturation values of the input images to target saturation values.
- the server 110 may be configured to iterate over all the potential input saturation values to define corresponding target saturation values.
- the look-up table(s) may be referenced to quickly determine the corresponding target saturation value.
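A sketch of building such a table for 8-bit saturation values (the 0-255 range and the sampled-CDF representation are assumptions carried over from the sketch above):

```python
import numpy as np

def build_lut(grid, icdf, tcdf, levels=256):
    """Precompute the target saturation for every possible 8-bit input value."""
    lut = np.empty(levels)
    for v in range(levels):               # iterate over all potential inputs
        x = v / (levels - 1)              # 8-bit value -> saturation in [0, 1]
        y = np.interp(x, grid, icdf)      # input probability
        lut[v] = np.interp(y, tcdf, grid) # corresponding target saturation
    return lut

# Transforming an image is then a single indexed lookup per pixel:
# new_sat = lut[sat_8bit]
```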
- a saturation modifier may be utilized to modify the target saturation values represented by the target CDF. For instance, it may be desirable to increase or decrease the pixel saturation of the target CDF to alter the corresponding distribution achieved within the transformed input images (e.g., to create a synthetic saturation distribution).
- the saturation modifier may generally correspond to any suitable correction factor or formula used to modify the target saturation values.
- the modifier may correspond to a specific number used as a multiplier or divider to adjust the target saturation values, such as by using a multiplier of two to double the target saturation values.
- a minimum function may be utilized so that the resulting target saturation value is equal to min(tcdf(x)*m, z), wherein m corresponds to the multiplier and z corresponds to the maximum saturation value (e.g., 1 or 255).
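Interpreting the formula as scaling a target saturation value and clamping it to the maximum saturation z (one reading of the text, stated here as an assumption), the modifier is a one-liner:

```python
def apply_multiplier(target_sat, m=2.0, z=1.0):
    """Scale a target saturation value by m, clamped to the maximum z."""
    return min(target_sat * m, z)

print(apply_multiplier(0.3))   # 0.6
print(apply_multiplier(0.75))  # 1.0 (capped at the maximum)
```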
- the saturation modifier may be associated with fitting the target saturation values along a gamma curve.
- FIG. 6 illustrates both the target CDF (curve 502 ) shown in FIG. 5 and an enhanced target CDF (curve 602 ) having a gamma correction of 2 (e.g., by raising the target saturation values within the target CDF 502 to the 1/2 power) such that the modified target saturation values are equal to tcdf(x)^0.5.
- the largest increase in saturation value is at the middle of the saturation range (e.g., at a saturation value of 0.5 for saturation values ranging from 0 to 1.0).
- the saturation modifier may be utilized to modify the target saturation values contained within such range(s).
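A sketch of the gamma-curve modifier described above (the value ** (1/gamma) form, with gamma = 2 giving the square-root curve of FIG. 6, is an assumption based on the text):

```python
def gamma_modify(target_sat, gamma=2.0):
    """Fit a target saturation value along a gamma curve: value ** (1 / gamma)."""
    return target_sat ** (1.0 / gamma)

# Endpoints 0 and 1 are unchanged; mid-range saturations are lifted.
for s in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(s, "->", round(gamma_modify(s), 3))
```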
- the target CDF may, in alternative embodiments, be defined based on any other suitable data and/or process.
- the target CDF may be entirely synthetic, such as by being created to achieve some ideal or preferred look rather than to comply with (or even be a function of) any real-world imagery.
- the target CDF may, for example, be defined based on saturation values obtained via experimentation using a suitable image editing software.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Processing (AREA)
Abstract
Description
Claims (19)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US14/014,592 US9691138B2 (en) | 2013-08-30 | 2013-08-30 | System and method for adjusting pixel saturation |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US14/014,592 US9691138B2 (en) | 2013-08-30 | 2013-08-30 | System and method for adjusting pixel saturation |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20150062151A1 US20150062151A1 (en) | 2015-03-05 |
| US9691138B2 true US9691138B2 (en) | 2017-06-27 |
Family
ID=52582569
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/014,592 Active 2034-06-19 US9691138B2 (en) | 2013-08-30 | 2013-08-30 | System and method for adjusting pixel saturation |
Country Status (1)
| Country | Link |
|---|---|
| US (1) | US9691138B2 (en) |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111833398B (en) * | 2019-04-16 | 2023-09-08 | 杭州海康威视数字技术股份有限公司 | A method and device for marking pixels in images |
| CN114365211B (en) * | 2020-01-21 | 2024-12-20 | 谷歌有限责任公司 | Gamma lookup table compression based on dimensionality reduction |
Citations (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20010014182A1 (en) * | 1997-06-20 | 2001-08-16 | Ryuji Funayama | Image processing apparatus |
| US6351558B1 (en) | 1996-11-13 | 2002-02-26 | Seiko Epson Corporation | Image processing system, image processing method, and medium having an image processing control program recorded thereon |
| US20030025835A1 (en) | 2001-08-06 | 2003-02-06 | Oplus Technologies Ltd. | Method for independently controlling hue or saturation of individual colors in a real time digital video image |
| US6731794B2 (en) | 2001-04-12 | 2004-05-04 | Hewlett-Packard Development Company, L.P. | Method and apparatus for estimating true color values for saturated color values in digitally captured image data |
| US20050074180A1 (en) * | 2003-10-03 | 2005-04-07 | Wilensky Gregg D. | Determining parameters for adjusting images |
| US20060188153A1 (en) | 2005-02-22 | 2006-08-24 | Texas Instruments Incorporated | System and method for local saturation adjustment |
| WO2008053408A2 (en) | 2006-10-30 | 2008-05-08 | Koninklijke Philips Electronics N.V. | Color saturation enhancement |
| US20090087092A1 (en) * | 2007-09-27 | 2009-04-02 | Samsung Electro-Mechanics Co., Ltd. | Histogram stretching apparatus and histogram stretching method for enhancing contrast of image |
| US20100302347A1 (en) * | 2009-05-27 | 2010-12-02 | Sony Corporation | Image pickup apparatus, electronic device, panoramic image recording method, and program |
| EP2320378A1 (en) | 2009-11-06 | 2011-05-11 | Nxp B.V. | Colour image enhancement |
| US8284316B2 (en) | 2010-03-17 | 2012-10-09 | Ili Technology Corporation | Real-time image processing circuit capable of enhancing brightness contrast and color saturation |
| US20130114894A1 (en) * | 2010-02-26 | 2013-05-09 | Vikas Yadav | Blending of Exposure-Bracketed Images Using Weight Distribution Functions |
| US20150156415A1 (en) * | 2011-12-30 | 2015-06-04 | Google Inc. | Multiplane Panoramas of Long Scenes |
Patent Citations (15)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6351558B1 (en) | 1996-11-13 | 2002-02-26 | Seiko Epson Corporation | Image processing system, image processing method, and medium having an image processing control program recorded thereon |
| EP1587300A2 (en) | 1996-11-13 | 2005-10-19 | Seiko Epson Corporation | Image processing system and method and medium having the corresponding program recorded thereon |
| US20010014182A1 (en) * | 1997-06-20 | 2001-08-16 | Ryuji Funayama | Image processing apparatus |
| US6731794B2 (en) | 2001-04-12 | 2004-05-04 | Hewlett-Packard Development Company, L.P. | Method and apparatus for estimating true color values for saturated color values in digitally captured image data |
| US20030025835A1 (en) | 2001-08-06 | 2003-02-06 | Oplus Technologies Ltd. | Method for independently controlling hue or saturation of individual colors in a real time digital video image |
| US20050074180A1 (en) * | 2003-10-03 | 2005-04-07 | Wilensky Gregg D. | Determining parameters for adjusting images |
| US20060188153A1 (en) | 2005-02-22 | 2006-08-24 | Texas Instruments Incorporated | System and method for local saturation adjustment |
| WO2008053408A2 (en) | 2006-10-30 | 2008-05-08 | Koninklijke Philips Electronics N.V. | Color saturation enhancement |
| US20090087092A1 (en) * | 2007-09-27 | 2009-04-02 | Samsung Electro-Mechanics Co., Ltd. | Histogram stretching apparatus and histogram stretching method for enhancing contrast of image |
| US20100302347A1 (en) * | 2009-05-27 | 2010-12-02 | Sony Corporation | Image pickup apparatus, electronic device, panoramic image recording method, and program |
| EP2320378A1 (en) | 2009-11-06 | 2011-05-11 | Nxp B.V. | Colour image enhancement |
| US20110110588A1 (en) | 2009-11-06 | 2011-05-12 | Nxp B.V. | Colour image enhancement |
| US20130114894A1 (en) * | 2010-02-26 | 2013-05-09 | Vikas Yadav | Blending of Exposure-Bracketed Images Using Weight Distribution Functions |
| US8284316B2 (en) | 2010-03-17 | 2012-10-09 | Ili Technology Corporation | Real-time image processing circuit capable of enhancing brightness contrast and color saturation |
| US20150156415A1 (en) * | 2011-12-30 | 2015-06-04 | Google Inc. | Multiplane Panoramas of Long Scenes |
Also Published As
| Publication number | Publication date |
|---|---|
| US20150062151A1 (en) | 2015-03-05 |
Similar Documents
| Publication | Title |
|---|---|
| US12100074B2 (en) | View synthesis robust to unconstrained image data |
| US8761457B1 (en) | Aligning ground based images and aerial imagery |
| US11037278B2 (en) | Systems and methods for transforming raw sensor data captured in low-light conditions to well-exposed images using neural network architectures |
| US9594774B2 (en) | Estimating depth from a single image |
| US11323676B2 (en) | Image white balance processing system and method |
| US10186023B2 (en) | Unified multi-image fusion approach |
| US11004179B2 (en) | Image blurring methods and apparatuses, storage media, and electronic devices |
| CN113688907B (en) | Model training and video processing method, apparatus, device, and storage medium |
| CN105118027B (en) | Image defogging method |
| WO2023273536A1 (en) | Method and apparatus for generating relighting image, and electronic device |
| US10567777B2 (en) | Contrast optimization and local adaptation approach for high dynamic range compression |
| US12148186B2 (en) | System and method for learning tone curves for local image enhancement |
| WO2019101005A1 (en) | Pixel compensation method and apparatus, and terminal device |
| CN102446347B (en) | Image white balance method and device |
| CN109493296A (en) | Image enhancement method and apparatus, electronic device, and computer-readable medium |
| CN116993616A (en) | Single low-light scene image enhancement method and system |
| US9691138B2 (en) | System and method for adjusting pixel saturation |
| US20240404011A1 (en) | Tone mapping via dynamic histogram matching |
| US9613294B2 (en) | Control of computer vision pre-processing based on image matching using structural similarity |
| US20170148177A1 (en) | Image processing apparatus, image processing method, and program |
| US20250078469A1 (en) | Deformable convolution-based detail restoration for single-image high dynamic range (HDR) reconstruction |
| US12518364B2 (en) | Machine learning segmentation-based tone mapping in high noise and high dynamic range environments or other environments |
| US12340487B2 (en) | Learning based discrete cosine noise filter |
| US20180158194A1 (en) | Determining Optical Flow |
| US20160379402A1 (en) | Apparatus and Method for Rendering a Source Pixel Mesh Image |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 2013-08-29 | AS | Assignment | Owner name: GOOGLE INC., CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: MARWOOD, DAVID; REEL/FRAME: 031116/0238 |
| | STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| 2017-09-29 | AS | Assignment | Owner name: GOOGLE LLC, CALIFORNIA. Free format text: CHANGE OF NAME; ASSIGNOR: GOOGLE INC.; REEL/FRAME: 044097/0658 |
| | MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 4 |
| | MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 8 |