WO2019152499A1 - Systems and methods for image signal processor tuning using a reference image - Google Patents


Info

Publication number
WO2019152499A1
Authority
WO
WIPO (PCT)
Prior art keywords
isp
parameter
image
parameter values
input image
Prior art date
Application number
PCT/US2019/015823
Other languages
French (fr)
Inventor
Shreyas Hampali Shivakumar
Pawan Kumar Baheti
Naveen Srinivasamurthy
Original Assignee
Qualcomm Incorporated
Priority date
Filing date
Publication date
Application filed by Qualcomm Incorporated filed Critical Qualcomm Incorporated
Priority to CN201980010542.8A (publication CN111656781A)
Publication of WO2019152499A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00Diagnosis, testing or measuring for television systems or their details
    • H04N17/002Diagnosis, testing or measuring for television systems or their details for television cameras

Definitions

  • This disclosure relates generally to systems and methods for tuning an image signal processor, and specifically to determining one or more parameters used by an image signal processor to process an image.
  • a raw image captured by a camera sensor is processed by an image signal processor
  • Processing may include a plurality of filters or processing blocks being applied to the captured image, such as denoising or noise filtering, edge enhancement, color balancing, contrast, intensity adjustment (such as darkening or lightening), tone adjustment, and so on.
  • Image processing blocks or modules may include lens/sensor noise correction, Bayer filters, de-mosaicing, color conversion, correction or enhancement/suppression of image attributes, denoising filters, and sharpening filters.
  • Each module may include a large number of tunable parameters (such as hundreds or thousands of parameters per module). Additionally, modules may be co-dependent as different modules may affect similar aspects of an image. For example, denoising and texture correction or enhancement may both affect high frequency aspects of an image. As a result, a large number of parameters are determined or adjusted for an ISP to generate a final image from a captured raw image.
  • the parameters for an ISP conventionally are tuned manually by an expert with experience in how to process input images for desirable output images.
  • the expert may require 3-4 weeks to determine or adjust device settings for the parameters based on a combination of a specific camera sensor and ISP. Since the camera sensor or other camera features (such as lens characteristics or imperfections, aperture size, shutter speed and movement, flash brightness and color, and so on) may impact the captured image and therefore at least some of the tunable parameters for the ISP, each combination of camera sensor and ISP would need to be tuned by an expert.
  • An example device may include one or more processors.
  • the one or more processors may be configured to receive an input image to be processed, receive a reference image that is a processed image of the input image generated by a second image signal processor, and determine one or more parameter values to be used by the image signal processor in processing the input image based on one or more differences between the input image and the reference image.
  • a method for tuning an image signal processor includes receiving, by a device, an input image to be processed.
  • the method also includes receiving, by the device, a reference image that is a processed image of the input image by a second image signal processor.
  • the method further includes determining one or more parameter values to be used by the image signal processor in processing the input image based on one or more differences between the input image and the reference image.
  • a non-transitory computer-readable medium may store instructions that, when executed by a processor, cause a device to tune an image signal processor.
  • the instructions may cause the device to receive an input image to be processed.
  • the instructions also may cause the device to receive a reference image that is a processed image of the input image by a second image signal processor.
  • the instructions further may cause the device to determine one or more parameter values to be used by the image signal processor in processing the input image based on one or more differences between the input image and the reference image.
  • a device in another example, includes means for receiving an input image to be processed, means for receiving a reference image that is a processed image of the input image from a second image signal processor, and means for determining one or more parameter values to be used by the ISP in processing the input image based on one or more differences between the input image and the reference image.
  • FIG. 1 is a block diagram of an example device for tuning an ISP.
  • FIG. 2 is an illustrative flow chart depicting a conventional operation for tuning an ISP for a scene type.
  • FIG. 3 is an illustrative flow chart depicting an example operation for automatically tuning an ISP.
  • FIG. 4 is an illustrative flow chart depicting an example operation for adjusting the parameter database.
  • FIG. 5 is a depiction of a relationship between texture and sharpness IQ metrics.
  • FIG. 6 is an illustrative flow chart depicting an example operation for determining new sets of parameter values for adjusting the parameter database.
  • FIG. 7 is an illustrative flow chart depicting an example operation for adjusting one or more IQ metrics in a sequential fashion in adjusting the parameters for personal preference.
  • FIG. 8 is a depiction of an example clustering of parameter sets as illustrated by a relationship of noise versus texture.
  • FIG. 9 is a depiction of an example tree branch illustration for sequentially adjusting IQ metrics.
  • FIG. 10 is a snapshot of an example GUI for adjusting an edge IQ metric.
  • FIG. 11 is a snapshot of an example GUI for adjusting a high contrast texture IQ metric.
  • FIG. 12 is a snapshot of an example GUI for adjusting a low contrast texture IQ metric.
  • FIG. 13 is a snapshot of an example GUI for adjusting a noise IQ metric.
  • FIG. 14 is a snapshot of an example GUI indicating the concatenation of selections for the different IQ metrics.
  • FIG. 15 is an illustrative flow chart depicting an example operation for using a reference image in automatically tuning an ISP.
  • FIG. 16 is an illustrative flow chart depicting an example operation for determining the closest parameter sets and adjusting the parameter database.
  • FIG. 17 is a depiction of an example feedback tuning flow using a reference image.
  • FIG. 18 is a depiction of an example non-recursive tuning flow using a reference image.
  • FIG. 19 is an illustrative flow chart depicting an example operation for using a trained parameter estimator to determine the parameters for an ISP to process images.
  • FIG. 20 is a block diagram of an example ISP.
  • FIG. 21 is a depiction of an example flow for using different patches in training different modules for determining the parameter values for the ISP.
  • Aspects of the present disclosure relate to tuning an ISP, such as determining or adjusting the parameters used by the ISP for processing an input image.
  • an expert may require weeks of testing and adjusting to determine the parameters to be used by the ISP.
  • a user may have different preferences than what an expert may consider a desirable processed image. For example, a user may prefer more color saturation, a softer image, or other characteristics different from those preferred by the expert tuning the ISP.
  • aspects of the present disclosure may be used in tuning an ISP so that less time may be required to tune the ISP and/or a person without expertise (such as a device user) may assist in tuning the ISP with his or her preferences.
  • a database of ISP parameters may be populated, adapted or updated based on user preferences. The final or updated database may then be used to provide the parameters to the ISP in processing an incoming image.
  • a single block may be described as performing a function or functions; however, in actual practice, the function or functions performed by that block may be performed in a single component or across multiple components, and/or may be performed using hardware, using software, or using a combination of hardware and software.
  • various illustrative components, blocks, modules, circuits, and steps are described below generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
  • the example devices may include components other than those shown, including well-known components such as a processor, memory and the like.
  • aspects of the present disclosure are applicable to any suitable electronic device configured to or capable of tuning an ISP (such as a security system with one or more cameras, smartphones, tablets, laptop computers, digital video and/or still cameras, web cameras, cloud computing networks, testing equipment for ISPs, fabrication facilities, testing devices to interface with ISPs, and so on). While described below with respect to a device having or coupled to one camera, aspects of the present disclosure are applicable to devices having any number of cameras (including no cameras, where images or video are provided to the device, or multiple cameras), and are therefore not limited to devices having one camera. Aspects of the present disclosure are applicable for devices capturing still images as well as for capturing video, and may be implemented in devices having or coupled to cameras of different capabilities (such as a video camera or a still image camera).
  • aspects of the present disclosure are applicable to devices coupled to or interfacing an ISP (such as manufacturing or testing equipment and test devices), and are therefore not limited to devices having an ISP.
  • the term "device" is not limited to one or a specific number of physical objects (such as one smartphone, one camera controller, one processing system, and so on). As used herein, a device may be any electronic device with one or more parts that may implement at least some portions of this disclosure. While the below description and examples use the term "device" to describe various aspects of this disclosure, the term "device" is not limited to a specific configuration, type, or number of objects.
  • FIG. 1 is a block diagram of an example device 100 for tuning an ISP.
  • the example device 100 may include or be coupled to a camera 102, a processor 104, a memory 106 storing instructions 108, and a camera controller 110.
  • the device 100 may optionally include (or be coupled to) a display 114 and a number of input/output (I/O) components 116.
  • the device 100 may include additional features or components not shown.
  • a wireless interface, which may include a number of transceivers and a baseband processor, may be included for a wireless communication device.
  • the device 100 may include or be coupled to additional cameras other than the camera 102.
  • the camera 102 may be capable of capturing individual image frames (such as still images) and/or capturing video (such as a succession of captured image frames).
  • the camera 102 may include a single camera sensor and camera lens, or be a dual camera module or any other suitable module with multiple camera sensors and lenses.
  • the memory 106 may be a non-transient or non-transitory computer-readable medium storing computer-executable instructions 108 to perform all or a portion of one or more operations described in this disclosure.
  • the memory 106 may also store a parameter database 109 or a look-up table (LUT) to be used for storing and looking up the parameters for an ISP (such as ISP 112).
  • the device 100 may also include a power supply 118, which may be coupled to or integrated into the device 100.
  • the processor 104 may be one or more suitable processors capable of executing scripts or instructions of one or more software programs (such as instructions 108) stored within the memory 106.
  • the processor 104 may be one or more general purpose processors that execute instructions 108 to cause the device 100 to perform any number of functions or operations.
  • the processor 104 may include integrated circuits or other hardware to perform functions or operations without the use of software. While shown to be coupled to each other via the processor 104 in the example of FIG. 1, the processor 104, the memory 106, the camera controller 110, the optional display 114, and the optional I/O components 116 may be coupled to one another in various arrangements. For example, the processor 104, the memory 106, the camera controller 110, the optional display 114, and/or the optional I/O components 116 may be coupled to each other via one or more local buses (not shown for simplicity).
  • the display 114 may be any suitable display or screen allowing for user interaction and/or to present items (such as captured images, video, or a preview image) for viewing by a user.
  • the display 114 may be a touch-sensitive display.
  • the I/O components 116 may be or include any suitable mechanism, interface, or device to receive input (such as commands) from the user and to provide output to the user.
  • the I/O components 116 may include (but are not limited to) a graphical user interface, keyboard, mouse, microphone and speakers, and so on.
  • the display 114 and/or the I/O components 116 may provide a preview image to a user and/or receive a user input for adjusting one or more settings of the camera 102 (such as selecting and/or deselecting a region of interest of a displayed preview image for an AF operation).
  • the camera controller 110 may include an ISP 112, which may be one or more image signal processors to process captured image frames or video provided by the camera 102. In some example implementations, the camera controller 110 (such as the ISP 112) may also control operation of the camera 102. In some aspects, the ISP 112 may process received images using parameters provided from the parameter database 109. The processor 104 may determine the parameters from the parameter database 109 to be used by the ISP 112. The ISP 112 may execute instructions from a memory to process image frames or video, may include specific hardware to process image frames or video, or additionally or alternatively may include a combination of specific hardware and the ability to execute software instructions for processing image frames or video.
  • images may be received by the device 100 from sources other than a camera, such as other devices, equipment, network attached storage, and so on.
  • the device 100 may be a testing device where the ISP 112 is removable so that another ISP may be coupled to the device 100 (such as a test device, testing equipment, and so on). While the following examples are described regarding device 100 and ISP 112, the present disclosure should not be limited to a specific type of device or hardware configuration for tuning an ISP.
  • IQ metrics are measurements of perceivable attributes of an image (with each perceivable attribute called a "ness").
  • Example nesses include the luminance of an image, the sharpness of an image, the graininess of an image, the tone of an image, the color saturation of an image, and so on; a change to a ness is perceivable by a person viewing the image.
  • the number of IQ metrics may be 10-20, with each IQ metric corresponding to a plurality of tunable parameters. Additionally, two different IQ metrics may affect some of the same tunable parameters for the ISP 112.
  • the parameter database 109 may correlate different values of IQ metrics to different values for the parameters. For example, an input vector of IQ metrics may be associated with an output vector of tunable parameters so that an ISP 112 may be tuned for the corresponding IQ metrics. Since the number of parameters may be large, the parameter database 109 may not store all combinations of IQ metrics, but instead include a portion of the number of combinations.
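  • A rough illustrative sketch (not the disclosed implementation) of how such a database might be modeled in code: each record pairs an IQ-metric vector with a parameter vector, and lookup returns the record whose IQ metrics are nearest to a query. All names and values below are hypothetical.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class ParameterSet:
    iq_metrics: Dict[str, float]   # e.g. {"texture": 0.7, "noise": 0.2, "edge": 0.8}
    parameters: Dict[str, float]   # tunable ISP parameter values keyed by name

class ParameterDatabase:
    def __init__(self) -> None:
        self.sets: List[ParameterSet] = []

    def add(self, entry: ParameterSet) -> None:
        self.sets.append(entry)

    def closest(self, target_iq: Dict[str, float]) -> ParameterSet:
        """Return the stored set whose IQ-metric vector is nearest to target_iq."""
        def distance(entry: ParameterSet) -> float:
            return sum(abs(entry.iq_metrics.get(k, 0.0) - v) for k, v in target_iq.items())
        return min(self.sets, key=distance)

# Example: two stored sets, query for a high-texture preference.
db = ParameterDatabase()
db.add(ParameterSet({"texture": 0.3, "noise": 0.1}, {"denoise_strength": 0.8, "sharpen": 0.2}))
db.add(ParameterSet({"texture": 0.8, "noise": 0.4}, {"denoise_strength": 0.3, "sharpen": 0.7}))
print(db.closest({"texture": 0.9, "noise": 0.3}).parameters)
```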
  • the memory 106 and parameter database 109 are shown to be included in device 100, the database may be stored outside of the device 100 (such as in a network attached storage, cloud storage, testing equipment coupled to device 100, and so on).
  • the present disclosure should not be limited to device 100 or a specific implementation of parameter database 109 or memory 106.
  • the parameters may also impact components outside of the ISP 112 (such as the camera 102), and the present disclosure should not be limited to specific described parameters or parameters specific only to the ISP.
  • the parameters may be for a specific ISP and camera (or camera sensor) combination.
  • An IQ model may be used to map the IQ metrics to the tunable parameters. Any type of IQ model may be used, and the present disclosure is not limited to a specific IQ model for correlating IQ metrics to ISP parameters.
  • the IQ model may include one or more modulation transfer functions (MTFs) to determine changes in the ISP parameters associated with a change in an IQ metric.
  • changing a luminance IQ metric may correspond to parameters associated with adjusting a camera sensor sensitivity, shutter speed, flash, the ISP determining an intensity for each pixel of an incoming image, the ISP adjusting the tone or color balance of each pixel for compensation, and so on.
  • a luminance MTF may be used to indicate that a change in the luminance IQ metric corresponds to specific changes in the correlating parameters.
  • the IQ model or MTFs may vary between different ISPs or vary between different combinations of ISPs and cameras (or camera sensors).
  • tuning the ISP may comprise determining the differences in MTFs or the IQ model so that the IQ metric values are correlated to preferred tunable parameter values for the ISP (in the parameter database 109).
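  • As a hedged illustration of the MTF idea described above, the sketch below maps a change in a single IQ metric (luminance) to proportional changes in a few correlated parameters. The linear sensitivities and parameter names are assumptions; a real MTF would be characterized per ISP and camera combination.

```python
# Hypothetical luminance MTF: maps a change in the luminance IQ metric to
# proportional changes in the correlated ISP/camera parameters.
LUMINANCE_SENSITIVITY = {
    "sensor_gain": 0.50,      # assumed sensitivities, for illustration only
    "shutter_speed": -0.20,
    "tone_curve_gain": 0.30,
}

def apply_luminance_mtf(params: dict, delta_luminance: float) -> dict:
    """Return a copy of params adjusted for a change in the luminance IQ metric."""
    adjusted = dict(params)
    for name, sensitivity in LUMINANCE_SENSITIVITY.items():
        adjusted[name] = adjusted.get(name, 0.0) + sensitivity * delta_luminance
    return adjusted

print(apply_luminance_mtf({"sensor_gain": 1.0, "shutter_speed": 0.5}, delta_luminance=0.1))
```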
  • Since an "optimally" processed image may be based on user preference or may be subjective for one or more experts, the optimization of an IQ model may be open ended and subject to differences between users or persons assisting with the tuning.
  • Using an IQ scale (such as from 0 to 100, with 100 being the best), the IQ for a processed image may be quantified, and an expert may use such quantification to tune an ISP (such as adjusting or determining the parameters for the ISP or the combination of the ISP and camera).
  • Some IQ metrics may be opposed to one another, such as noisiness and texture, where reducing or increasing the noise may correspondingly reduce or increase the high frequency texture information in an image.
  • trade-offs are determined between IQ metrics to attempt to optimize processing of an image (such as by generating the highest quantified IQ score from an IQ scale).
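  • A minimal sketch of how a quantified IQ score with opposing metrics could look, assuming a simple weighted sum clipped to a 0-100 scale; the weights and metric names are illustrative only, not the disclosed IQ model.

```python
def iq_score(metrics: dict, weights: dict) -> float:
    """Weighted IQ score on a 0-100 scale (illustrative, not the patent's IQ model)."""
    score = sum(weights[k] * metrics.get(k, 0.0) for k in weights)
    return max(0.0, min(100.0, score))

# Texture and noise suppression oppose each other: preserving more texture
# tends to leave more noise, so candidates trade one metric for the other.
candidate_a = {"texture": 70.0, "noise_suppression": 40.0, "edge": 60.0}
candidate_b = {"texture": 45.0, "noise_suppression": 75.0, "edge": 60.0}
weights = {"texture": 0.4, "noise_suppression": 0.4, "edge": 0.2}
print(iq_score(candidate_a, weights), iq_score(candidate_b, weights))
```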
  • Optimizing the IQ metrics or otherwise tuning an ISP may differ for different scene types. For example, indoor scenes illuminated by incandescent lighting may correspond to different “optimal” IQ metrics (and corresponding parameters) than outdoor scenes with bright natural lighting. In another example, a scene with large flat fields of color and luminance may correspond to different “optimal” IQ metrics than a scene with large numbers of colors and variances in color within a field.
  • FIG. 2 is an illustrative flow chart depicting a conventional operation 200 for tuning an ISP for a scene type.
  • An initial set of parameter values for the ISP is used in processing one or more received images (202).
  • An expert then inspects the original and processed images to determine how the parameters should be adjusted (204). Through inspection of the images, the expert determines the parameters to be adjusted and the amount of adjustment (206). For example, the expert may determine the IQ metrics to be adjusted and the amount of adjustment, and one or more MTFs for the adjusted IQ metrics may be used to determine the amount of adjustment for corresponding ISP parameters.
  • the parameters are adjusted (208), and the adjusted parameters are used for the ISP to again process one or more images (210).
  • the process reverts to 204, with the expert repeatedly inspecting the images, adjusting the parameters, and the ISP processing images with the adjusted parameters until the expert is satisfied with the processed images.
  • the parameter values may be stored in a parameter database (such as database 209) for the scene type. Multiple sets of parameter values may be stored for a scene type, and/or the stored sets of parameter values may correspond to discrete differences in one or more IQ metrics.
  • At least a portion of the ISP is automatically tuned by a device.
  • the time for tuning an ISP may be reduced.
  • Automatically tuning the ISP may also take into account user preferences to tune an ISP for a user’s preferences instead of an expert (therefore providing images more preferable to the user).
  • the automatic tuning of an ISP may be performed during device or ISP design, manufacture or testing, which may include assisting an expert in tuning the ISP.
  • the automatic tuning of an ISP may be performed by an end user’s device, such as a smartphone, tablet, or other device including and/or in communication with one or more ISPs (such as device 100 including ISP 112).
  • an ISP 112 may have been tuned previously by an expert, with the parameter database 109 populated with parameter values to be used for different scene types.
  • Automatically tuning with user input may update the ISP tuning so that the parameter database 109 may be updated to include parameter values preferred by the user (such as by densifying the parameter database 109 with additional vectors of parameter values or adjusting existing vectors of parameter values).
  • the MTFs may be updated through the automatic tune procedure to better correlate parameters with IQ metrics.
  • the automatic tuning may include software, special hardware, or a combination of both.
  • automatically tuning may include an application or software to be executed by processor 104 for populating or updating the parameter database 109 of device 100.
  • a person (such as a tuning expert and/or a user of a given device) may be presented with different possible processed images to determine which images the person prefers and therefore which IQ metrics may be of more importance to the person in tuning the ISP. Additionally, or alternatively, a person may select the IQ metrics of importance to him or her, and the device may present possible processed images for different values of the IQ metrics to determine the person’s preference and therefore improve the tuning of the ISP for the person.
  • FIG. 3 is an illustrative flow chart depicting an example operation 300 for automatically tuning an ISP.
  • one or more images may be received.
  • values for parameters that are fixed for an ISP may optionally be determined (304).
  • sensor or module specific parameter values such as some parameters for black level, lens roll-off, gamma, color, etc., may not change for different scene types.
  • the parameter values may therefore be determined separate from automatically tuning the ISP (such as determining values for non-fixed parameters).
  • step 304 may not be performed.
  • the ISP may then be automatically tuned using the received images (306).
  • the parameter database and/or the MTFs for an IQ model may be populated or adjusted using the received images (308).
  • relationships and trade-offs between IQ metrics or parameters may be determined or defined for the received images.
  • One example relationship is texture vs. edge sharpness for an image. Preserving edges in an image may also preserve texture or other high frequency information in an image.
  • Another example relationship is noise vs. texture. Preserving texture or high frequency information may also result in more noise being present within an image.
  • a further example relationship is color vs. tone.
  • tone adjustment may impact the color values for the pixels of the image (such as skewing one or more red, green, or blue values of a pixel when adjusting the tone of the image).
  • the IQ model to quantify IQ may be used to determine different example values for the parameter set (based on the determined trade-offs) for producing processed images with high IQ scores (such as greater than a predetermined or adjustable threshold, greater than an IQ score for a previous processed image, etc.).
  • parameter values for the ISP for different scene types may be determined based on personal preference (310). For example, a person may be provided (e.g., presented for selection) choices with perceptible differences in processed images of the received images in order to assist in determining the person's preferences. The preferences selected by the person may then be used to densify the parameter database (e.g., populate additional data points), adjust the parameter database (e.g., adjust existing data points), set (e.g., configure or determine) the parameter values for the ISP for processing images, or perform a combination of two or more of these operations.
  • the parameter database 109 may include sets of parameter values previously determined to cause an ISP to generate a "high-quality" image (e.g., as designated or determined based on an IQ score equaling or exceeding a threshold score). Each set of parameter values may be associated with IQ metric values.
  • the database 109 may be organized or have multiple organization structures so that vectors with similar IQ metrics may be grouped together. For example, the database 109 may be indexed or organized so that sets with similar texture ness values may be identified. As described in FIG. 3, the parameter database 109 may be adjusted or updated for automatically tuning the ISP.
  • FIG. 4 is an illustrative flow chart depicting an example operation 400 for adjusting the parameter database.
  • one or more images for processing by an ISP are received or otherwise made available.
  • the images may be raw images captured by a camera sensor with noise and luminance characteristics that may impact processing.
  • one or more personal preferences (such as preferences of the expert and/or a user for a final processed image) may optionally be received (404).
  • Example preferences may include preferences regarding color saturation, tone, noisiness, etc. of the person for the processed images.
  • a device may then determine whether an existing parameter database (with one or more previously determined sets of parameter values) is to be adjusted based on the characteristics for the camera sensor and/or the personal preferences (406).
  • an insufficient number of sets of parameter values may be determined to exist in the parameter database.
  • the existing sets may be determined to insufficiently correlate to the camera sensor used for capturing the received images.
  • a scene type of a received image may not be covered by the existing parameter database.
  • If the parameter database is determined not to need adjustment, the existing parameter database may be used without adjustment (410).
  • the received images may be evaluated using the existing sets of parameter values in the parameter database (412).
  • one or more relationships among IQ metrics may be analyzed using the received images (414). For example, the scatter of IQ metric relationships for texture versus edge sharpness (based on the existing sets of parameter values and the received images for processing) may be analyzed.
  • One or more new sets of parameter values may then be determined based on the analyzed relationships (416).
  • the relationship between edge sharpness and texture IQ metrics may be used to determine new sets of parameter values for different sharpness and texture IQ metrics that still provide a sufficient IQ score for a processed image.
  • the new sets of parameter values may also be used to better define tradeoffs for IQ metrics for the IQ model. For example, new sets of parameter values may indicate tradeoffs between a noisiness IQ metric and a texture IQ metric.
  • the one or more new sets of parameter values may then be determined to be added to the parameter database (418), thus densifying the parameter database.
  • an existing set of parameter values may be amended based on a new set of parameter values determined.
  • FIG. 5 is a depiction of a relationship 500 between texture and sharpness IQ metrics.
  • Existing points 502 indicating the relationship between the nesses may be from the existing sets of parameter values corresponding to different texture and sharpness IQ metrics.
  • a plurality of new parameter value sets for different texture and sharpness IQ metrics may be determined using the received images (so as to have a sufficient IQ score for a processed image).
  • the new sets may correspond to new points 504 on the relationship 500 between texture and sharpness IQ metrics, which may better indicate tradeoffs between IQ metrics.
  • the relationship 500 is depicted as a graph of two nesses, the relationship may be between any number of nesses and therefore any number of dimensions.
  • Determining new sets of parameter values may be based on existing sets of parameter values in the parameter database. For example, an existing set of parameter values (a parent set) may be adjusted in order to create one or more new sets of parameter values (children sets).
  • FIG. 6 is an illustrative flow chart depicting an example operation 600 for determining new sets of parameter values for adjusting the parameter database.
  • a space of near IQ metrics for an existing parent set is determined. For example, a determined distance away from a parent set may be a determined space. Triangulation or sum of differences are example methods for determining a distance, but the space may be determined in any suitable way. Graphically for 3 nesses, a cube may be determined around a parent set, where potential children sets may exist within the cube (space). In another example, a sphere or other suitable shape may be determined around a parent set.
  • a child set may be determined by interpolating parameters values between the parent set and an existing set (such as described regarding 604-608). In some other example implementations, a child set may be determined by perturbing or adjusting parameters of the parent set within the space (such as described regarding 610). In some further example implementations, a combination of interpolating and perturbing may be performed. For example, some child sets may be created through perturbation, then additional child sets may be created through interpolating between the previous child sets and the parent set. In another example, an interpolated child set’s parameters may be perturbed within a space to adjust the child set or create new child sets.
  • the furthest neighbor from the parent set in the space is used for interpolation.
  • any neighbor may be used for interpolation in other examples.
  • the distances between the parent set and existing sets in the space may be determined.
  • the furthest set from the parent set may then be determined based on the distances (606).
  • the space may be defined in dimensions of nesses, and a distance may be the combined difference in nesses between the sets.
  • the differences in parameter values between the furthest set and the parent set may be considered the maximum adjustments to the parameter values for the parent set in creating children sets.
  • any resulting child set may be configured to be within the space.
  • one or more parameter values from the parent set may be adjusted with an interpolated difference between the furthest neighbor and the parent set (608).
  • only a subset of the IQ metrics may be determined to be adjusted.
  • the corresponding parameters for the subset of IQ metrics may be adjusted through interpolation.
  • a is a value between 0 and 1.
  • a may be constant for all parameters to be adjusted.
  • the factor of adjustment for the parameters being adjusted is the same. For example, based on all parameters being adjusted, the child set is as depicted in equation (2) below:
  • child set = parent set + a × (neighbor set − parent set)   (2)
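  • A brief sketch of the interpolation in equation (2), treating the parameter sets as vectors; the parameter values and the choice of a below are placeholders.

```python
import numpy as np

def interpolate_child(parent: np.ndarray, neighbor: np.ndarray, a: float) -> np.ndarray:
    """Child set per equation (2): parent + a * (neighbor - parent), with 0 <= a <= 1."""
    assert 0.0 <= a <= 1.0
    return parent + a * (neighbor - parent)

parent_set = np.array([0.8, 0.3, 0.5])        # hypothetical parameter values of the parent set
furthest_neighbor = np.array([0.2, 0.6, 0.9]) # furthest existing set within the space
print(interpolate_child(parent_set, furthest_neighbor, a=0.5))
```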
  • a new set may be determined by adjusting or perturbing one or more parameters of the parent set (610).
  • the sparsity of sets around the parent set may be determined, with the sparsity used to determine the factor by which to adjust one or more parameters.
  • a sparsity cost for a parent set may be a distance between the parent set and a distribution of existing sets in the space or across the group. For example, the Mahalanobis distance between the parent set and its existing neighbors in the space may be determined as the sparsity cost. The distance may also be determined for each existing set and an average distance determined for the existing sets across the entire group (which may be an average cost for the group).
  • the factor for adjusting parameters may be as depicted in equation (3), where x is the parent set sparsity cost and c is the average sparsity cost for the entire group. If the sparsity around the parent set is greater than the average sparsity (fewer neighbors surround the parent set than typical), then adjustments to the parameters may be smaller so that the corresponding IQ metrics remain within the space. Conversely, if the sparsity around the parent set is less than the average sparsity (more neighbors surround the parent set than typical), then adjustments to the parameters may be greater, since the greater number of neighbors indicates that the corresponding IQ metrics for greater adjustments should still be within the space.
  • the size of the window for adjusting a parameter may be a standard deviation of the parameter for the entire group times the factor, and the window may be centered at the parameter value for the parent set. If the sparsity around the parent set is greater than or equal to the average sparsity (fewer or the same number of neighbors surround the parent set than typical), the window size may be approximately one standard deviation. Conversely, if the sparsity around the parent set is less than the average sparsity, the window size may be multiple standard deviations.
  • a parameter value is randomly or pseudo- randomly selected from the window.
  • related parameters (such as parameters associated with an IQ metric) may be adjusted by a similar factor, where a same position in the window is used for each related parameter.
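  • The perturbation step could be sketched as follows, under stated assumptions: because equation (3) is not reproduced in this text, the adjustment factor is assumed to be the ratio of the average sparsity cost to the parent's sparsity cost (so a sparser neighborhood yields smaller adjustments, consistent with the description), the window is that factor times the group standard deviation centered on the parent value, and related parameters share one random position within their windows.

```python
import numpy as np

rng = np.random.default_rng(0)

def perturb_parent(parent: np.ndarray, group: np.ndarray,
                   parent_sparsity: float, avg_sparsity: float) -> np.ndarray:
    """Perturb a parent parameter vector within a sparsity-scaled window.

    Assumption: factor = avg_sparsity / parent_sparsity, so a sparser neighborhood
    (parent_sparsity > avg_sparsity) yields smaller adjustments, as the text describes.
    """
    factor = avg_sparsity / parent_sparsity
    window = factor * group.std(axis=0)   # per-parameter window size
    position = rng.uniform(-0.5, 0.5)     # shared position for related parameters
    return parent + position * window

group_sets = np.array([[0.2, 0.4, 0.6],
                       [0.3, 0.5, 0.7],
                       [0.8, 0.1, 0.9]])  # hypothetical existing parameter sets
parent = group_sets[0]
print(perturb_parent(parent, group_sets, parent_sparsity=1.4, avg_sparsity=1.0))
```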
  • the IQ metrics for each potential child set may be determined (612).
  • the received image(s) may be processed by the ISP using the child parameter values, and IQ metrics may be calculated from the processed image(s).
  • a determination may then be made whether the IQ metrics are valid (614).
  • the IQ metrics are compared to the IQ metrics for existing sets in the parameters database to determine if they are consistent. If a portion of the IQ metrics are outliers (e.g., not included among the IQ metrics of the existing sets in the parameter database), the IQ metrics may be considered invalid.
  • an IQ score may be computed for a processed image. If the image score is sufficient, such as greater than a threshold, the IQ metrics are considered valid.
  • Other suitable processes for determining the validity of the IQ metrics may be used, and the present disclosure should not be limited to specific examples.
  • the child set may be added to the parameter database (616). If the new IQ metrics are considered invalid (614), the child set may be rejected and not added to the parameter database (618).
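  • The validity check at 614 could be sketched as an outlier test against the IQ metrics of existing sets combined with an IQ-score threshold; both the z-score limit and the threshold below are illustrative assumptions rather than disclosed values.

```python
import numpy as np

def child_is_valid(child_iq: np.ndarray, existing_iq: np.ndarray,
                   iq_score: float, score_threshold: float = 70.0,
                   z_limit: float = 3.0) -> bool:
    """Accept a child set if its IQ metrics are not outliers and its IQ score is sufficient."""
    mean, std = existing_iq.mean(axis=0), existing_iq.std(axis=0) + 1e-9
    z = np.abs((child_iq - mean) / std)
    return bool(np.all(z < z_limit) and iq_score >= score_threshold)

existing = np.array([[60.0, 55.0], [65.0, 50.0], [58.0, 62.0]])  # e.g. texture, noise metrics
print(child_is_valid(np.array([62.0, 57.0]), existing, iq_score=78.0))
```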
  • a display may provide (e.g., display) different processed images for a varying IQ metric, and a mechanism for receiving user input (e.g., a GUI or a camera or microphone) may allow a user to select the preferred processed images to indicate the preferences for the IQ metric.
  • FIG. 7 is an illustrative flow chart depicting an example operation 700 for adjusting one or more IQ metrics in a sequential fashion in adjusting the parameters for personal preference. The process may be used to indicate which parameter sets from the parameter database are preferred by the user for the ISP (or ISP and camera combination).
  • the IQ metrics to be adjusted for a user are determined.
  • a user may indicate which IQ metrics are of particular importance to that particular user.
  • the IQ metrics may be for a particular scene or generally for all scenes.
  • the parameter sets of the parameter database may then be clustered or grouped for each of the IQ metrics to be adjusted (704).
  • FIG. 8 is a depiction of an example clustering of parameter sets as illustrated by a relationship of noise versus texture. As shown, the parameter sets are clustered into three groups: low noise and texture 802, medium noise and texture 804, and high noise and texture 806. While three groups are shown, any number of clusterings may exist.
  • the groupings or clusterings indicate the sets with close IQ metrics (such as IQ metrics within a determined distance of one another). For example, the three clusterings indicate that the noise IQ metric and the texture IQ metric are similar for the parameter sets in a cluster. While not shown, one or more parameter sets may not be clustered and may be removed from consideration for the final parameter set to be used by the ISP.
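  • Grouping the stored parameter sets by their noise and texture IQ metrics could be done with an off-the-shelf clustering routine such as k-means; the three-cluster choice below mirrors FIG. 8 and is otherwise arbitrary.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical (noise, texture) IQ metrics for stored parameter sets.
iq_points = np.array([[0.1, 0.2], [0.15, 0.25], [0.5, 0.5],
                      [0.55, 0.45], [0.9, 0.85], [0.95, 0.9]])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(iq_points)
for cluster in range(3):
    print(f"cluster {cluster}:", np.where(labels == cluster)[0].tolist())
```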
  • a received image is processed for each of the parameter sets in a clustering for the IQ metric to first be adjusted (706).
  • the image may also be processed with a varying IQ metric corresponding to differences in the corresponding parameters for each of the parameter sets (with each parameter set possibly being used multiple times to process the image).
  • the number of times that the image is processed may correspond to the number of parameter sets clustered for the IQ metric.
  • the processed images are then displayed or otherwise presented to a user (708) so that the user may indicate which processed image(s) are preferred.
  • the user may then indicate (such as through a GUI or other user input) which processed images are preferred (710).
  • an IQ score may be determined for each of the processed images, and the highest IQ scores or scores greater than a threshold may be used to select the processed images.
  • the corresponding parameter values for the IQ metric being adjusted may be determined (712).
  • the user selections may share a subset of parameters corresponding to the IQ metric being adjusted, with similar or the same parameter values across the selections.
  • the parameter values associated with the IQ metric are preserved when processing an image for the next varying IQ metric.
  • the image is then again processed for a next varying IQ metric (714).
  • the process may continue until all indicated metrics are adjusted.
  • the parameter database may be searched to determine whether the parameters for the preferred IQ metrics are similar to the parameters for one or more stored parameter sets.
  • Such parameter sets may be considered the preferred sets of parameter values to be used by the ISP for processing an image.
  • the determined parameter values may be added to the parameter database as one or more new parameter sets.
  • FIG. 9 is a depiction of an example tree branch illustration 900 for sequentially adjusting IQ metrics.
  • the clusterings 902 are used as starting points, and an edge MTF 904 may first be used to adjust an edge IQ metric.
  • a high contrast texture MTF 906 may then be used to next adjust a high contrast texture IQ metric.
  • a low contrast texture MTF 908 may next be used to adjust a low contrast texture IQ metric.
  • a noise MTF 910 may then be used to adjust a noise IQ metric.
  • Fine tuning adjustments may then be performed to finalize one or more parameters that may change the perception of the processed image.
  • the end point of each of the arrows may indicate a different processed image.
  • the continuing arrows may indicate that the user selected those images for the respective IQ metric.
  • the darkened solid arrows, the dashed arrows, and the gray solid arrow may indicate images selected by the user as preferred over other selected images. The user may select the image corresponding to the final darkened solid arrow during overshoot 912 as the preferred image with respect to the other preferred images.
  • a GUI may be used in adjusting one or more IQ metrics.
  • a GUI may allow a user to inspect the trade-off between IQ metrics and determine the preferred metrics.
  • the GUI may allow a user to determine the preferred IQ metric for the selected metrics to be adjusted.
  • FIGS. 10-14 depict an example GUI for adjusting IQ metrics corresponding to the example tree branch illustration in FIG. 9.
  • FIG. 10 is a snapshot 1000 of an example GUI for adjusting an edge IQ metric.
  • a user may select one or more of the defined edge IQ metric values or relationships and press next to go to the next IQ metric.
  • FIG. 11 is a snapshot 1100 of an example GUI for adjusting a high contrast texture IQ metric.
  • FIG. 12 is a snapshot 1200 of an example GUI for adjusting a low contrast texture IQ metric.
  • a user may select one or more of the defined low contrast texture IQ metric values or relationships and press next to go to the next IQ metric.
  • FIG. 13 is a snapshot 1300 of an example GUI for adjusting a noise IQ metric.
  • a user may select one or more of the defined noise IQ metric values or relationships and press add to cart to end.
  • the potential noise IQ metrics (N in FIG. 13) are based on the previously selected IQ metrics (E selected for edge tuning (FIG. 10), H selected for high contrast tuning (FIG. 11), and L selected for low contrast tuning (FIG. 12) under each of the images on the left of the snapshot 1300).
  • the GUI may show the groupings of selected IQ metrics (with respective parameter sets).
  • FIG. 14 is a snapshot 1400 of an example GUI indicating the concatenation of selections for the different IQ metrics.
  • a user may select one or more final concatenations to be used (such as by checking the box to the left illustrated in snapshot 1400).
  • the parameter set used by the ISP is thus dependent on the selected IQ metric values or relationships (such as through the different MTFs for determining the parameter values for a selected grouping of IQ metrics).
  • one or more sets of parameter values from the parameter database may be identified based on the selected concatenation of IQ metrics (such as from FIG. 14). Such identified sets of parameter values may therefore be used by the ISP in processing received images.
  • the optimization of an IQ model may be open ended and subject to different preferences between users or persons. There may be no "correct" set of parameter values, since different processed images using different parameter values may be considered to be of similar IQ by a person. As a result, determining the parameter values to be used or otherwise tuning an ISP may be long or tedious, since the parameter values may not converge to one specific set of parameter values. Determining initial parameter values or how to adjust parameter values may be difficult since there may not be one preferred setting for the IQ metrics.
  • a reference image processed by a different ISP or device may be introduced into the automatic tuning process.
  • the reference image may provide some guidance or indication as to one or more preferred IQ metrics and their associated parameter values.
  • a reference image may be used to determine one or more closest sets of parameter values in the parameter database.
  • the closest sets may be used to densify or otherwise adjust the parameter database.
  • FIG. 15 is an illustrative flow chart depicting an example operation 1500 for using a reference image in automatically tuning an ISP. Beginning at 1502, a reference image may be received.
  • the reference image may be previously processed.
  • the reference image may have been provided by a different ISP or device after completing processing.
  • the reference image is different than the input image for processing by the ISP.
  • one or more preferred IQ metrics may be determined from the reference image (1504). For example, a texture IQ metric, a noise IQ metric, and an edge IQ metric may be determined from the reference image. Other example IQ metrics may include a tone IQ metric, a color IQ metric, a high frequency contrast IQ metric, a low frequency contrast IQ metric, and so on. While the example processes are described regarding texture, noise, and edge IQ metrics, any number of other IQ metrics may be used. Therefore, the present disclosure should not be limited to specific IQ metrics or examples.
  • One or more parameter sets with the parameter values for the sets corresponding to IQ metrics closest to the preferred IQ metrics may then be identified (1506).
  • a parameter database may store a vector of IQ metrics for each set of parameter values.
  • the MTFs for an IQ model may be used to determine the IQ metrics for each set of parameter values in the parameter database.
  • Parameter sets with the closest IQ metrics to the preferred IQ metrics may be considered the closest parameter sets.
  • a distance function may be used to determine the closest parameter sets.
  • An example distance function is depicted in equation (4) below:

    distance(j) = Σ_i W_i · |X_i − M_j,i|,  for j from 1 to D   (4)

    where i indexes a specific IQ metric, X_i is the preferred value for that IQ metric from the group or vector of preferred IQ metric values X, M_j is the group or vector of IQ metric values for the jth parameter set in the parameter database, W_i is the weight for the ith IQ metric from weight vector W (where each IQ metric may be associated with a different weight), and D is the number of parameter sets in the parameter database.
  • the distance function may be an unweighted summation, where the difference between a parameter set IQ metric value and the preferred IQ metric value is not multiplied by a weight factor.
  • If the preferred IQ metrics determined are the texture IQ metric, edge IQ metric, and noise IQ metric, i may range from 1 to 3 for the three IQ metrics.
  • the distance for a parameter set j may then be a sum of three values: the weighted differences between the parameter set's IQ metric values and the preferred IQ metric values for the texture, edge, and noise IQ metrics.
  • the closest parameter set j may be the parameter set with the smallest or minimum distance across the parameter sets.
  • a parameter set may be selected if the distance is less than a threshold. In this manner, a parameter set may be identified without searching the entire parameter database.
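  • A hedged reading of equation (4) as a weighted sum of absolute IQ-metric differences, with both the full search for the minimum distance and the early-exit threshold variant described above; the absolute-difference form and all names are assumptions consistent with the surrounding description.

```python
import numpy as np

def weighted_distance(preferred: np.ndarray, candidate: np.ndarray, weights: np.ndarray) -> float:
    """Distance per equation (4): sum over IQ metrics i of W_i * |X_i - M_j,i| (assumed form)."""
    return float(np.sum(weights * np.abs(preferred - candidate)))

def closest_set(preferred, database, weights, threshold=None):
    """Return the index of the closest parameter set, or the first one under threshold."""
    best_j, best_d = None, float("inf")
    for j, metrics in enumerate(database):
        d = weighted_distance(preferred, metrics, weights)
        if threshold is not None and d < threshold:
            return j          # early exit without searching the entire database
        if d < best_d:
            best_j, best_d = j, d
    return best_j

preferred = np.array([0.7, 0.3, 0.8])   # texture, noise, edge IQ metrics from the reference image
database = [np.array([0.6, 0.4, 0.7]), np.array([0.2, 0.1, 0.3])]
print(closest_set(preferred, database, weights=np.array([1.0, 1.0, 1.0])))
```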
  • the ISP may then process a received image (1508). For example, a raw image may be input into or received by the device or ISP and processed using the identified parameter set(s). The received image may be the raw image (pre processing) of the reference image. One or more personal or user preferences also may be determined or received (1510). Then, the parameter database may be adjusted based on the one or more personal preferences and the one or more identified parameter sets (1512).
  • variations to an identified parameter set may be used to process the input image, and the variations are analyzed to determine if the child set is to be added to the parameter database.
  • example operation 600 in FIG. 6 may be used to densify the parameter database, where the parent set is from the one or more identified parameter sets in 1506.
  • the process of identifying one or more parameter sets and using such to adjust the parameter database (1506-1512) may be recursively performed until determined that the parameter database is to not be further adjusted.
  • the parameter database may reach a critical number of parameter sets being stored.
  • the parameter database may stop being updated if no new child sets with valid IQ metrics (such as from example operation 600 in FIG. 6) are identified or determined.
  • the parameter database may stop being updated if the new child sets do not sufficiently improve the IQ (such as increasing the IQ score by a threshold amount or differences between the parent set and child set cannot be perceived by a user when processing an image).
  • FIG. 16 is an illustrative flow chart depicting an example operation 1600 for determining the closest parameter sets and adjusting the parameter database.
  • Example operation 1600 in FIG. 16 may be an example implementation of steps 1506-1512 of FIG. 15. While FIG. 16 is described regarding texture, noise, and edge IQ metrics, any IQ metrics and number of IQ metrics may be used.
  • a closest parameter set for the preferred IQ metrics (such as the texture, noise, and edge IQ metrics) may be determined from the parameter database (1602).
  • the distance function depicted in equation (4) may be used to determine the closest parameter set.
  • a different parameter set other than the closest parameter set may be better suited in processing an image.
  • one or more of the IQ metrics may be relaxed in determining a closest parameter set. While operation 1600 describes relaxing one IQ metric in determining a closest parameter set, more than one IQ metric may be relaxed.
  • a closest parameter set with a relaxed texture IQ metric may be determined.
  • the weight vector in determining a distance may be adjusted to reduce the weight for the texture IQ metric.
  • the weight may be adjusted to zero (to remove consideration of the IQ metric from determining the distance) or a portion of the previous weight (to reduce consideration of the IQ metric in determining the distance).
  • a closest parameter set with a relaxed noise IQ metric may be determined (1606), and a closest parameter set with a relaxed edge IQ metric may be determined (1608).
  • one or more of the determined parameter sets may be the same.
  • previously determined closest parameter sets may be removed from consideration in determining a closest parameter set so that the number of determined parameter sets corresponds to the number of preferred IQ metrics.
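  • The relaxation described for 1602-1608 can be pictured as zeroing (or scaling down) the weight of one IQ metric at a time and excluding already-chosen sets, so that the number of identified sets matches the number of preferred IQ metrics; this is an illustrative sketch, not the disclosed procedure, and all names are hypothetical.

```python
import numpy as np

def _closest(preferred, candidates, weights):
    """Index of the candidate with the smallest weighted absolute-difference distance."""
    dists = [float(np.sum(weights * np.abs(preferred - m))) for m in candidates]
    return int(np.argmin(dists))

def relaxed_closest_sets(preferred, database, base_weights, relax_scale=0.0):
    """Overall closest set (1602), then one closest set per relaxed IQ metric (1604-1608)."""
    chosen = [_closest(preferred, database, base_weights)]
    for i in range(len(base_weights)):
        weights = base_weights.copy()
        weights[i] = weights[i] * relax_scale       # relax (zero or reduce) the i-th IQ metric
        remaining_idx = [j for j in range(len(database)) if j not in chosen]
        if not remaining_idx:
            break
        local = _closest(preferred, [database[j] for j in remaining_idx], weights)
        chosen.append(remaining_idx[local])
    return chosen

preferred = np.array([0.7, 0.3, 0.8])               # texture, noise, edge
database = [np.array([0.6, 0.4, 0.7]), np.array([0.2, 0.1, 0.3]),
            np.array([0.7, 0.9, 0.8]), np.array([0.1, 0.3, 0.8])]
print(relaxed_closest_sets(preferred, database, np.array([1.0, 1.0, 1.0])))
```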
  • a received image may then be processed using the determined/identified parameter sets
  • the parameter set to be used may be determined to be the closest parameter set (such as determined in 1602) or somewhere between the closest parameter set and one of the parameter sets with a relaxed IQ metric (such as determined in 1604 through 1608) (1612).
  • the determined parameter set may be one of the parameter sets determined in 1604 through 1608 (instead of between one of the parameter sets and the closest parameter set).
  • the processed images may be presented to a user.
  • the user may then select the preferred processed image(s).
  • the user input or selection may indicate which parameter set to be used. For example, if the user selects the processed image for the closest parameter set, the closest parameter set is determined to be the parameter set to be used by the ISP. In this manner, the parameter database is not updated since the closest parameter set is selected. If the user selects one of the processed images for the parameter sets for relaxed IQ metrics, a parameter set between the closest parameter set and the corresponding relaxed parameter set may be determined to be used. As a result, a child set from the closest parameter set and the relaxed IQ metric parameter set may be created.
  • the child set from the closest parameter set and the relaxed IQ metric parameter set may be determined through interpolation between the two existing parameter sets. For example, steps 604-608 of example operation 600 in FIG. 6 may be used to determine or create a child set. In another example, one or more IQ metric values between the values for the closest parameter set and the parameter set of the relaxed IQ metric may be determined. One or more MTFs of the IQ model may then be used to determine the parameter values for a child set.
  • the child set is used in processing the received image and compared to the processed images for the two parent sets. If a user prefers the processed image for the child set (or alternatively, an IQ score or other evaluation of the processed images indicate that the processed image for the child set is greater than for the other processed images), the child set may be added to the parameter database. The process may be repeated as long as the parameter database is to be adjusted (such as being densified with additional child sets). If a user prefers the processed image for the closest parameter set (or alternatively, an IQ score or other evaluation of the processed images indicate that the processed image for the child set is less than for the other processed images), the child set may be rejected and the parameter database not further updated.
  • Using IQ metrics and user preferences for the entirety of tuning an ISP may require a significant amount of time.
  • an expert manually adjusting IQ metrics may take weeks to tune an ISP.
  • automatically tuning with recursively updating the parameter sets or adjusting the parameter database based on repeated user inputs may take, e.g., 6-8 hours.
  • the recursion in adjusting the parameter database or tuning the parameters may be removed. For example, a one-shot or non-recursive process may be used to initially determine the parameter values for a parameter set to be used by the ISP (which may be called "coarse" tuning).
  • the initially determined parameter values may then be tuned or adjusted using, e.g., user preferences, a scene type, luminance, and/or characteristics of the target ISP, to improve or optimize the parameter set (which may be called "fine" tuning). While coarse tuning and fine tuning are described, coarse tuning may be used exclusively to determine a parameter set to be used by the ISP. Therefore, the present disclosure should not be limited to including both coarse tuning and fine tuning.
  • FIG. 17 is a depiction of an example feedback tuning flow 1700 using a reference image.
  • Example feedback tuning flows using a reference image include processes 1500 and 1600 in FIG. 15 and FIG. 16, respectively.
  • a reference or target image from a separate ISP may be used to adapt a tuning tool.
  • the parameter database may be adjusted using the reference image.
  • the tuning tool may be recursively updated (such as continuing to densify or otherwise adjust the parameter database), and the updated tool (such as the adjusted parameter database) may recursively be used to process an input image (such as the raw image) by the ISP to determine more feedback, such as user preferences and/or IQ metrics, for updating the tuning tool.
  • the feedback loop of updating the tool and determining feedback to again update the tool may continue until the tool is sufficiently updated.
  • the parameter database is adjusted until one or more parameter sets are determined to be sufficient for use by the ISP in processing images.
  • the feedback loop for updating the tuning tool (such as adjusting the parameter database) is replaced with a non-recursive or one-shot process for determining initial values for the parameters using a reference image.
  • the time for determining the parameter values may be reduced or expedited.
  • FIG. 18 is a depiction of an example non-recursive (“feed-forward”) tuning flow 1800 using a reference image.
  • a previously trained parameter estimator for the ISP may be used to determine initial parameter values based on differences between the input image (such as the raw image) and the reference image (such as a target image from a different ISP). Since the input image is not processed and evaluated multiple times as a result of a feedback or recursive process, the time in determining the parameter values and processing the input image by the ISP using the parameters values may be reduced.
  • a parameter estimator may be previously trained before being used to determine parameter values in the flow 1800 in FIG. 18.
  • the parameter estimator is a neural network or other fuzzy logic decision maker that is trained using a plurality of other inputs and corresponding reference images for the ISP.
  • the parameter estimator may be a deep layer neural network (“DeepNet”).
  • the error or loss between a processed image from the ISP and the reference image is analyzed to determine how to adjust or update the DeepNet. For example, differences in IQ metrics may be compared between the reference image and the processed image in further training the DeepNet.
  • IQ metrics or subjective measures for processed images may not be required or used.
  • the parameter estimator, once trained, may be sufficient on its own in determining the parameter values.
  • the trained parameter estimator may receive a reference (output) image from a different ISP and the corresponding input image. The images may be used to estimate parameters for processing the input image by the present ISP so that the processed image from the ISP approximates the reference image from the different ISP. A minimal code sketch of such an estimator is given after this list.
  • FIG. 19 is an illustrative flow chart depicting an example operation 1900 for using a trained parameter estimator (such as a DeepNet) to determine the parameters for an ISP to process images.
  • an input image to be processed by the ISP may be received.
  • a reference image corresponding to the input image processed by a different ISP may also be received (1904).
  • the input image and the reference image may then be input into the trained parameter estimator for determining or estimating the parameters to be used by the ISP for processing the input image (1906).
  • the trained parameter estimator may estimate the parameters for the ISP so that the ISP may approximate the reference image in processing the input image (1908). For example, the processed image from the ISP is to be as close as possible to the reference image.
  • preferred processed images from another ISP may be selected in providing reference images for input images in training the parameter estimator.
  • the parameter estimator may estimate parameter values so as to “imitate” (e.g., track, mirror, closely correspond to, substantially replicate, etc.) processing of images by the other ISP.
  • the parameter estimator may determine parameter values that, when used to tune an ISP, generate a processed image that would be perceived by a user/person to substantially replicate the processing performed by the other, distinct ISP.
  • FIG. 20 is a block diagram of an example ISP 2000.
  • ISP 2000 may be an example implementation of the ISP 112 of device 100 in FIG. 1.
  • the illustrated ISP 2000 may be a single thread (or single core) processor with a sequence of filters 2002A-2002N.
  • the ISP may be (or included in) a multi-thread or multiple core processor.
  • filter 1 (2002A) may be a noise reduction filter
  • filter 2 (2002B) may be an edge enhancement filter
  • filter N (2002N) may be a final filter to complete processing the captured image frame.
  • the filters may process the image, with the filters corresponding to different IQ metrics (such as a noise IQ metric for filter 1 2002A, an edge IQ metric for filter 2 2002B, and so on).
  • Each filter may use a plurality of parameters to process the corresponding aspect of the input image (such as sharpening edges, denoising, and so on).
  • a different parameter estimator may be trained for each of the filters or modules of the ISP. For example, a first parameter estimator may be trained for filter 1 2002A, a second parameter estimator may be trained for filter 2 2002B, and so on. In this manner, a parameter estimator may estimate or determine the parameter values for the corresponding filter or module of the ISP to be used in processing the input image.
  • one or more same parameters may be used by multiple filters or modules of an ISP in processing an input image.
  • different parameter estimators may determine different values for a parameter.
  • the ISP may use the different values corresponding to the respective filters or modules.
  • the ISP or device may select and use one of the determined values for the parameter across the multiple filters or modules.
  • the initially determined parameter values may be adjusted or fine-tuned.
  • user preferences or ISP specific characteristics may be used to fine-tune the determined parameter values.
  • fine-tuning may include any of the previously described processes of adjusting parameter values using personal preferences or other user input.
  • the determined parameter values need not be adjusted or fine-tuned, and fine-tuning is not required.
  • portions of an image sensitive to changes in the module’s parameter values may be used to train a corresponding parameter estimator.
  • a TE42 chart may be an input image for training a parameter estimator, and a patch of the TE42 chart may be sensitive to changes in parameter values for the corresponding ISP module.
  • a Siamese convolutional DeepNet may be used to fuse information from multiple patches (for multiple modules) in order to estimate the parameters. In this manner, the outputs may be stacked and convolutionally processed in training the parameter estimators.
  • each of patch pairs 1-N may correspond to the patch of the input image and the corresponding patch of the reference image.
  • Each pair of patches may be compared to determine example module parameters 2102.
  • the example module parameters may then be compared or otherwise combined to provide the final module parameters 2104 (such as described above regarding having differences in some module parameter values).
  • the techniques described herein may be implemented in hardware, software, firmware, or any combination thereof, unless specifically described as being implemented in a specific manner. Any features described as modules or components may also be implemented together in an integrated logic device or separately as discrete but interoperable logic devices. If implemented in software, the techniques may be realized at least in part by a non-transitory processor-readable storage medium (such as the memory 106 in the example device 100 of FIG. 1) comprising instructions 108 that, when executed by the processor 104, cause the device 100 to perform one or more of the methods described above.
  • the non-transitory processor-readable data storage medium may form part of a computer program product, which may include packaging materials.
  • the non-transitory processor-readable storage medium may comprise random access memory (RAM) such as synchronous dynamic random access memory (SDRAM), read only memory (ROM), non-volatile random access memory (NVRAM), electrically erasable programmable read-only memory (EEPROM), FLASH memory, other known storage media, and the like.
  • the techniques additionally, or alternatively, may be realized at least in part by a processor-readable communication medium that carries or communicates code in the form of instructions or data structures and that can be accessed, read, and/or executed by a computer or other processor.
  • processors such as the processor 104 of FIG. 1.
  • processors may include but are not limited to one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), application specific instruction set processors (ASIPs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.
  • processors may refer to any of the foregoing structures or any other structure suitable for implementation of the techniques described herein.
  • the functionality described herein may be provided within dedicated software modules or hardware modules configured as described
  • a general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine.
  • a processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
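The bullets above describe a feed-forward ("one-shot") parameter estimator that takes the raw input image and a reference image from a different ISP and outputs ISP parameter values. Below is a minimal sketch of such an estimator, not the implementation described in this document: it assumes PyTorch, the names ParamEstimator and NUM_PARAMS are hypothetical, and for brevity the training step supervises directly on known-good parameter vectors rather than on the loss between the ISP's processed output and the reference image discussed above.

```python
# Minimal sketch (assumptions noted above): a small CNN ingests a raw input image
# and a reference image processed by a different ISP, and regresses a vector of
# normalized ISP parameter values.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_PARAMS = 32  # hypothetical number of tunable parameters for one ISP module

class ParamEstimator(nn.Module):
    def __init__(self, num_params=NUM_PARAMS):
        super().__init__()
        # Raw image and reference image are stacked along the channel axis (3 + 3 = 6).
        self.features = nn.Sequential(
            nn.Conv2d(6, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, num_params)

    def forward(self, raw_img, ref_img):
        x = torch.cat([raw_img, ref_img], dim=1)   # (N, 6, H, W)
        x = self.features(x).flatten(1)            # (N, 64)
        return torch.sigmoid(self.head(x))         # normalized parameter values

def train_step(model, optimizer, raw_img, ref_img, target_params):
    """One supervised update toward known-good parameter vectors (a simplification)."""
    optimizer.zero_grad()
    pred = model(raw_img, ref_img)
    loss = F.mse_loss(pred, target_params)
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    model = ParamEstimator()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    raw = torch.rand(4, 3, 128, 128)   # stand-in raw images
    ref = torch.rand(4, 3, 128, 128)   # stand-in reference images
    tgt = torch.rand(4, NUM_PARAMS)    # stand-in normalized parameter targets
    print("loss:", train_step(model, opt, raw, ref, tgt))
```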

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

Aspects of the present disclosure relate to systems and methods for tuning an image signal processor. An example device may include one or more processors. The one or more processors may be configured to receive an input image to be processed, receive a reference image that is a processed image of the input image by a second image signal processor, and determine one or more parameter values to be used by the image signal processor in processing the input image based on one or more differences between the input image and the reference image.

Description

SYSTEMS AND METHODS FOR IMAGE SIGNAL PROCESSOR TUNING USING A
REFERENCE IMAGE
TECHNICAL FIELD
[0001] This disclosure relates generally to systems and methods for tuning an image signal processor, and specifically to determining one or more parameters used by an image signal processor to process an image.
BACKGROUND
[0002] A raw image captured by a camera sensor is processed by an image signal processor
(ISP) to generate a final image. Processing may include a plurality of filters or processing blocks being applied to the captured image, such as denoising or noise filtering, edge enhancement, color balancing, contrast, intensity adjustment (such as darkening or lightening), tone adjustment, and so on. Image processing blocks or modules may include lens/sensor noise correction, Bayer filters, de-mosaicing, color conversion, correction or enhancement/suppression of image attributes, denoising filters, and sharpening filters. Each module may include a large number of tunable parameters (such as hundreds or thousands of parameters per module). Additionally, modules may be co-dependent as different modules may affect similar aspects of an image. For example, denoising and texture correction or enhancement may both affect high frequency aspects of an image. As a result, a large number of parameters are determined or adjusted for an ISP to generate a final image from a captured raw image.
[0003] The parameters for an ISP conventionally are tuned manually by an expert with experience in how to process input images for desirable output images. As a result of the correlations between ISP filters / modules and the sheer number of tunable parameters, the expert may require 3-4 weeks to determine or adjust device settings for the parameters based on a combination of a specific camera sensor and ISP. Since the camera sensor or other camera features (such as lens characteristics or imperfections, aperture size, shutter speed and movement, flash brightness and color, and so on) may impact the captured image and therefore at least some of the tunable parameters for the ISP, each combination of camera sensor and ISP would need to be tuned by an expert.
SUMMARY
[0004] This Summary is provided to introduce in a simplified form a selection of concepts that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to limit the scope of the claimed subject matter. [0005] Aspects of the present disclosure relate to systems and methods for tuning an image signal processor. An example device may include one or more processors. The one or more processors may be configured to receive an input image to be processed, receive a reference image that is a processed image of the input image by a second image signal processor, and determine one or more parameter values to be used by the image signal processor in processing the input image based on one or more differences between the input image and the reference image.
[0006] In another example, a method for tuning an image signal processor is disclosed. The example method includes receiving, by a device, an input image to be processed. The method also includes receiving, by the device, a reference image that is a processed image of the input image by a second image signal processor. The method further includes determining one or more parameter values to be used by the image signal processor in processing the input image based on one or more differences between the input image and the reference image.
[0007] In a further example, a non-transitory computer-readable medium is disclosed. The non-transitory computer-readable medium may store instructions that, when executed by a processor, cause a device to tune an image signal processor. The instructions may cause the device to receive an input image to be processed. The instructions also may cause the device to receive a reference image that is a processed image of the input image by a second image signal processor. The instructions further may cause the device to determine one or more parameter values to be used by the image signal processor in processing the input image based on one or more differences between the input image and the reference image.
[0008] In another example, a device is disclosed. The device includes means for receiving an input image to be processed, means for receiving a reference image that is a processed image of the input image from a second image signal processor, and means for determining one or more parameter values to be used by the ISP in processing the input image based on one or more differences between the input image and the reference image.
BRIEF DESCRIPTION OF THE DRAWINGS
[0009] Aspects of the present disclosure are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements.
[0010] FIG. 1 is a block diagram of an example device for tuning an ISP.
[0011] FIG. 2 is an illustrative flow chart depicting a conventional operation for tuning an ISP for a scene type. [0012] FIG. 3 is an illustrative flow chart depicting an example operation for automatically tuning an ISP.
[0013] FIG. 4 is an illustrative flow chart depicting an example operation for adjusting the parameter database.
[0014] FIG. 5 is a depiction of a relationship between texture and sharpness IQ metrics.
[0015] FIG. 6 is an illustrative flow chart depicting an example operation for determining new sets of parameter values for adjusting the parameter database.
[0016] FIG. 7 is an illustrative flow chart depicting an example operation for adjusting one or more IQ metrics in a sequential fashion in adjusting the parameters for personal preference.
[0017] FIG. 8 is a depiction of an example clustering of parameter sets as illustrated by a relationship of noise versus texture.
[0018] FIG. 9 is a depiction of an example tree branch illustration for sequentially adjusting IQ metrics.
[0019] FIG. 10 is a snapshot of an example GUI for adjusting an edge IQ metric.
[0020] FIG. 11 is a snapshot of an example GUI for adjusting a high contrast texture IQ metric.
[0021] FIG. 12 is a snapshot of an example GUI for adjusting a low contrast texture IQ metric.
[0022] FIG. 13 is a snapshot of an example GUI for adjusting a noise IQ metric.
[0023] FIG. 14 is a snapshot of an example GUI indicating the concatenation of selections for the different IQ metrics.
[0024] FIG. 15 is an illustrative flow chart depicting an example operation for using a reference image in automatically tuning an ISP.
[0025] FIG. 16 is an illustrative flow chart depicting an example operation for determining the closest parameter sets and adjusting the parameter database.
[0026] FIG. 17 is a depiction of an example feedback tuning flow using a reference image.
[0027] FIG. 18 is a depiction of an example non-recursive tuning flow using a reference image.
[0028] FIG. 19 is an illustrative flow chart depicting an example operation for using a trained parameter estimator to determine the parameters for an ISP to process images.
[0029] FIG. 20 is a block diagram of an example ISP.
[0030] FIG. 21 is a depiction of an example flow for using different patches in training different modules for determining the parameter values for the ISP.
DETAILED DESCRIPTION
[0031] Aspects of the present disclosure may be used for tuning an image signal processor
(ISP), such as determining or adjusting the parameters used by an ISP for processing an input image. In conventionally tuning an ISP, an expert may require weeks of testing and adjusting to determine the parameters to be used by the ISP. Additionally, a user may have different preferences than what an expert may consider a desirable processed image. For example, a user may prefer more color saturation, a softer image, or otherwise than an expert tuning the ISP. Aspects of the present disclosure may be used in tuning an ISP so that less time may be required to tune the ISP and/or a person without expertise (such as a device user) may assist in tuning the ISP with his or her preferences. In some aspects, a database of ISP parameters may be populated, adapted or updated based on user preferences. The final or updated database may then be used to provide the parameters to the ISP in processing an incoming image.
[0032] In the following description, numerous specific details are set forth, such as examples of specific components, circuits, and processes to provide a thorough understanding of the present disclosure. The term“coupled” as used herein means connected directly to or connected through one or more intervening components or circuits. Also, in the following description and for purposes of explanation, specific nomenclature is set forth to provide a thorough understanding of the present disclosure. However, it will be apparent to one skilled in the art that these specific details may not be required to practice the teachings disclosed herein. In other instances, well-known circuits and devices are shown in block diagram form to avoid obscuring teachings of the present disclosure. Some portions of the detailed descriptions which follow are presented in terms of procedures, logic blocks, processing and other symbolic representations of operations on data bits within a computer memory. In the present disclosure, a procedure, logic block, process, or the like, is conceived to be a self-consistent sequence of steps or instructions leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, although not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated in a computer system.
[0033] It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussions, it is appreciated that throughout the present application, discussions utilizing the terms such as“accessing,” “receiving,”“sending,”“using,”“selecting,”“determining,”“normalizing,”“multiplying,”“averaging,” “monitoring,”“comparing,”“applying,”“updating,”“measuring,”“deriving,”“settling” or the like, refer to the actions and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system’s registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
[0034] In the figures, a single block may be described as performing a function or functions; however, in actual practice, the function or functions performed by that block may be performed in a single component or across multiple components, and/or may be performed using hardware, using software, or using a combination of hardware and software. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps are described below generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure. Also, the example devices may include components other than those shown, including well-known components such as a processor, memory and the like.
[0035] Aspects of the present disclosure are applicable to any suitable electronic device configured to or capable of tuning an ISP (such as a security system with one or more cameras, smartphones, tablets, laptop computers, digital video and/or still cameras, web cameras, cloud computing networks, testing equipment for ISPs, fabrication facilities, testing devices to interface with ISPs, and so on). While described below with respect to a device having or coupled to one camera, aspects of the present disclosure are applicable to devices having any number of cameras (including no cameras, where images or video are provided to the device, or multiple cameras), and are therefore not limited to devices having one camera. Aspects of the present disclosure are applicable for devices capturing still images as well as for capturing video, and may be implemented in devices having or coupled to cameras of different capabilities (such as a video camera or a still image camera).
Additionally, while described below with respect to a device having one or more ISPs, aspects of the present disclosure are applicable to devices coupled to or interfacing an ISP (such as manufacturing or testing equipment and test devices), and are therefore not limited to devices having an ISP.
[0036] The term “device” is not limited to one or a specific number of physical objects (such as one smartphone, one camera controller, one processing system and so on). As used herein, a device may be any electronic device with one or more parts that may implement at least some portions of this disclosure. While the below description and examples use the term “device” to describe various aspects of this disclosure, the term “device” is not limited to a specific configuration, type, or number of objects.
[0037] FIG. 1 is a block diagram of an example device 100 for tuning an ISP. The example device 100 may include or be coupled to a camera 102, a processor 104, a memory 106 storing instructions 108, and a camera controller 110. The device 100 may optionally include (or be coupled to) a display 114 and a number of input/output (I/O) components 116. The device 100 may include additional features or components not shown. For example, a wireless interface, which may include a number of transceivers and a baseband processor, may be included for a wireless communication device. The device 100 may include or be coupled to additional cameras other than the camera 102.
The disclosure should not be limited to any specific examples or illustrations, including the example device 100.
[0038] The camera 102 may be capable of capturing individual image frames (such as still images) and/or capturing video (such as a succession of captured image frames). The camera 102 may include a single camera sensor and camera lens, or be a dual camera module or any other suitable module with multiple camera sensors and lenses. The memory 106 may be a non-transient or non-transitory computer readable medium storing computer-executable instructions 108 to perform all or a portion of one or more operations described in this disclosure. The memory 106 may also store a parameter database 109 or a look-up table (LUT) to be used for storing and looking up the parameters for an ISP (such as ISP 112). The device 100 may also include a power supply 118, which may be coupled to or integrated into the device 100.
[0039] The processor 104 may be one or more suitable processors capable of executing scripts or instructions of one or more software programs (such as instructions 108) stored within the memory 106. In some aspects, the processor 104 may be one or more general purpose processors that execute instructions 108 to cause the device 100 to perform any number of functions or operations. In additional or alternative aspects, the processor 104 may include integrated circuits or other hardware to perform functions or operations without the use of software. While shown to be coupled to each other via the processor 104 in the example of FIG. 1, the processor 104, the memory 106, the camera controller 110, the optional display 114, and the optional I/O components 116 may be coupled to one another in various arrangements. For example, the processor 104, the memory 106, the camera controller 110, the optional display 114, and/or the optional I/O components 116 may be coupled to each other via one or more local buses (not shown for simplicity).
[0040] The display 114 may be any suitable display or screen allowing for user interaction and/or to present items (such as captured images, video, or a preview image) for viewing by a user. In some aspects, the display 114 may be a touch-sensitive display. The I/O components 116 may be or include any suitable mechanism, interface, or device to receive input (such as commands) from the user and to provide output to the user. For example, the I/O components 116 may include (but are not limited to) a graphical user interface, keyboard, mouse, microphone and speakers, and so on. The display 114 and/or the I/O components 116 may provide a preview image to a user and/or receive a user input for adjusting one or more settings of the camera 102 (such as selecting and/or deselecting a region of interest of a displayed preview image for an AF operation).
[0041] The camera controller 110 may include an ISP 112, which may be one or more image signal processors to process captured image frames or video provided by the camera 102. In some example implementations, the camera controller 110 (such as the ISP 112) may also control operation of the camera 102. In some aspects, the ISP 112 may process received images using parameters provided from the parameter database 109. The processor 104 may determine the parameters from the parameter database 109 to be used by the ISP 112. The ISP 112 may execute instructions from a memory to process image frames or video, may include specific hardware to process image frames or video, or additionally or alternatively may include a combination of specific hardware and the ability to execute software instructions for processing image frames or video.
[0042] Alternatively, images may be received by the device 100 from sources other than a camera, such as other devices, equipment, network attached storage, and so on. In some other aspects, the device 100 may be a testing device where the ISP 112 is removable so that another ISP may be coupled to the device 100 (such as a test device, testing equipment, and so on). While the following examples are described regarding device 100 and ISP 112, the present disclosure should not be limited to a specific type of device or hardware configuration for tuning an ISP.
[0043] With the number of tunable parameters for an ISP possibly reaching hundreds or thousands, a reduced number of metrics (called“image quality” (IQ) metrics) may be mapped to the tunable parameters so that a person assisting in tuning an ISP 112 may focus on the fewer IQ metrics than the large number of tunable parameters. IQ metrics are measurements of perceivable attributes of an image (with each perceivable attribute called a“ness”). Example nesses are the luminance of an image, the sharpness of an image, the graininess of an image, the tone of an image, the color saturation of an image, and so on, and are perceived by a person if changed for an image. For example, if a luminance of an image is decreased, a person perceives the image to be darker. In some examples, the number of IQ metrics may be 10-20, with each IQ metric corresponding to a plurality of tunable parameters. Additionally, two different IQ metrics may affect some of the same tunable parameters for the ISP 112. In some example implementations, the parameter database 109 may correlate different values of IQ metrics to different values for the parameters. For example, an input vector of IQ metrics may be associated with an output vector of tunable parameters so that an ISP 112 may be tuned for the corresponding IQ metrics. Since the number of parameters may be large, the parameter database 109 may not store all combinations of IQ metrics, but instead include a portion of the number of combinations. While the memory 106 and parameter database 109 are shown to be included in device 100, the database may be stored outside of the device 100 (such as in a network attached storage, cloud storage, testing equipment coupled to device 100, and so on). The present disclosure should not be limited to device 100 or a specific implementation of parameter database 109 or memory 106. Further, the parameters may also impact components outside of the ISP 112 (such as the camera 102), and the present disclosure should not be limited to specific described parameters or parameters specific only to the ISP. For example, the parameters may be for a specific ISP and camera (or camera sensor) combination.
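As one way to picture the parameter database described above, the following minimal sketch (a hypothetical structure, not Qualcomm's implementation) stores an IQ-metric vector alongside each vector of tunable parameter values and answers a query by returning the stored set whose IQ metrics are closest; the lookup method and the values shown are assumptions made for illustration.

```python
# Minimal sketch of a parameter database mapping IQ-metric vectors to parameter vectors,
# with nearest-neighbor lookup for IQ combinations that are not stored explicitly.
import numpy as np

class ParameterDatabase:
    def __init__(self):
        self.iq_vectors = []      # one IQ-metric vector per stored set
        self.param_vectors = []   # corresponding tunable-parameter vector

    def add(self, iq_metrics, parameters):
        self.iq_vectors.append(np.asarray(iq_metrics, dtype=float))
        self.param_vectors.append(np.asarray(parameters, dtype=float))

    def lookup(self, iq_metrics):
        """Return the parameter set whose IQ metrics are closest to the request."""
        query = np.asarray(iq_metrics, dtype=float)
        dists = [np.linalg.norm(query - v) for v in self.iq_vectors]
        return self.param_vectors[int(np.argmin(dists))]

# Example: two stored sets indexed by (sharpness, texture, noise) IQ metrics.
db = ParameterDatabase()
db.add([0.8, 0.6, 0.2], [10, 3, 0.5, 7])   # hypothetical parameter values
db.add([0.5, 0.9, 0.4], [6, 8, 0.9, 2])
print(db.lookup([0.75, 0.65, 0.25]))       # -> closest stored parameter set
```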
[0044] An IQ model may be used to map the IQ metrics to the tunable parameters. Any type of
IQ model may be used, and the present disclosure is not limited to a specific IQ model for correlating IQ metrics to ISP parameters. In some example implementations, the IQ model may include one or more modulation transfer functions (MTFs) to determine changes in the ISP parameters associated with a change in an IQ metric. For example, changing a luminance IQ metric may correspond to parameters associated with adjusting a camera sensor sensitivity, shutter speed, flash, the ISP determining an intensity for each pixel of an incoming image, the ISP adjusting the tone or color balance of each pixel for compensation, and so on. A luminance MTF may be used to indicate that a change in the luminance IQ metric corresponds to specific changes in the correlating parameters.
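The following sketch illustrates one simplified reading of an MTF: the MTF is approximated as a linear sensitivity (parameter change per unit change in an IQ metric). That linearity, the parameter names, and the numbers are assumptions made for illustration, not values taken from this document.

```python
# Minimal sketch: a hypothetical luminance MTF expressed as a per-parameter sensitivity,
# used to translate a requested change in the luminance IQ metric into parameter changes.
params = {
    "sensor_sensitivity": 0.40,
    "shutter_speed_ms": 16.0,
    "pixel_intensity_gain": 1.00,
    "tone_compensation": 0.10,
}

# Parameter change per unit change in the luminance IQ metric (illustrative values).
luminance_mtf = {
    "sensor_sensitivity": 0.05,
    "shutter_speed_ms": 4.0,
    "pixel_intensity_gain": 0.08,
    "tone_compensation": -0.02,
}

def apply_mtf(params, mtf, iq_delta):
    """Adjust the correlated parameters for a requested change in one IQ metric."""
    return {name: value + mtf.get(name, 0.0) * iq_delta
            for name, value in params.items()}

print(apply_mtf(params, luminance_mtf, iq_delta=+0.5))  # brighten by 0.5 luminance units
```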
[0045] The IQ model or MTFs may vary between different ISPs or vary between different combinations of ISPs and cameras (or camera sensors). As a result, tuning the ISP may comprise determining the differences in MTFs or the IQ model so that the IQ metric values are correlated to preferred tunable parameter values for the ISP (in the parameter database 109). Since an “optimally” processed image may be based on user preference or subjective for one or more experts, the optimization of an IQ model may be open ended and subject to differences between users or persons assisting with the tuning. However, there are attempts to quantify an IQ, such as by using an IQ scale (such as from 0 to 100, with 100 being the best) to indicate the IQ performance for an ISP and/or a camera. In this manner, the IQ for a processed image is quantified, and an expert may use such quantification to tune an ISP (such as by adjusting or determining the parameters for the ISP or the combination of the ISP and camera). Some IQ metrics may be opposed to one another, such as noisiness and texture, where reducing or increasing the noise may correspondingly reduce or increase the high frequency texture information in an image. When tuning an ISP, trade-offs are determined between IQ metrics to attempt to optimize processing of an image (such as by generating the highest quantified IQ score from an IQ scale).
[0046] Optimizing the IQ metrics or otherwise tuning an ISP may differ for different scene types. For example, indoor scenes illuminated by incandescent lighting may correspond to different “optimal” IQ metrics (and corresponding parameters) than outdoor scenes with bright natural lighting. In another example, a scene with large flat fields of color and luminance may correspond to different “optimal” IQ metrics than a scene with large numbers of colors and variances in color within a field.
As a result, an ISP may be tuned for a plurality of different scene types. [0047] FIG. 2 is an illustrative flow chart depicting a conventional operation 200 for tuning an
ISP for a scene type. An initial set of parameter values for the ISP is used in processing one or more received images (202). An expert then inspects the original and processed images to determine how the parameters should be adjusted (204). Through inspection of the images, the expert determines the parameters to be adjusted and the amount of adjustment (206). For example, the expert may determine the IQ metrics to be adjusted and the amount of adjustment, and one or more MTFs for the adjusted IQ metrics may be used to determine the amount of adjustment for corresponding ISP parameters. The parameters are adjusted (208), and the adjusted parameters are used for the ISP to again process one or more images (210). The process reverts to 204, with the expert repeatedly inspecting the images, adjusting the parameters, and the ISP processing images with the adjusted parameters until the expert is satisfied with the processed images. Once the parameters are“optimized,” the parameter values may be stored in a parameter database (such as database 209) for the scene type. Multiple sets of parameter values may be stored for a scene type, and/or the stored sets of parameter values may correspond to discrete differences in one or more IQ metrics.
[0048] In some example implementations, at least a portion of the ISP is automatically tuned by a device. As a result, the time for tuning an ISP may be reduced. Automatically tuning the ISP may also take into account user preferences to tune an ISP for a user’s preferences instead of an expert (therefore providing images more preferable to the user). The automatic tuning of an ISP may be performed during device or ISP design, manufacture or testing, which may include assisting an expert in tuning the ISP. Alternatively or additionally, the automatic tuning of an ISP may be performed by an end user’s device, such as a smartphone, tablet, or other device including and/or in communication with one or more ISPs (such as device 100 including ISP 112). For example, an ISP 112 may have been tuned previously by an expert, with the parameter database 109 populated with parameter values to be used for different scene types. Automatically tuning with user input may update the ISP tuning so that the parameter database 109 may be updated to include parameter values preferred by the user (such as by densifying the parameter database 109 with additional vectors of parameter values or adjusting existing vectors of parameter values). In another example, the MTFs may be updated through the automatic tune procedure to better correlate parameters with IQ metrics. The automatic tuning may include software, special hardware, or a combination of both. For example, automatically tuning may include an application or software to be executed by processor 104 for populating or updating the parameter database 109 of device 100.
[0049] In automatically tuning, a person (such as a tuning expert and/or a user of a given device) may be presented with different possible processed images to determine which images the person prefers and therefore which IQ metrics may be of more importance to the person in tuning the ISP. Additionally, or alternatively, a person may select the IQ metrics of importance to him or her, and the device may present possible processed images for different values of the IQ metrics to determine the person’s preference and therefore improve the tuning of the ISP for the person.
[0050] FIG. 3 is an illustrative flow chart depicting an example operation 300 for automatically tuning an ISP. Beginning at 302, one or more images (such as raw images captured by a camera) may be received. In some implementations, values for parameters that are fixed for an ISP optionally are determined (304). For example, sensor or module specific parameter values, such as some parameters for black level, lens roll-off, gamma, color, etc., may not change for different scene types. The parameter values may therefore be determined separate from automatically tuning the ISP (such as determining values for non-fixed parameters). Alternatively, step 304 may not be performed.
[0051] The ISP may then be automatically tuned using the received images (306). As one option, the parameter database and/or the MTFs for an IQ model may be populated or adjusted using the received images (308). For example, relationships and trade-offs between IQ metrics or parameters may be determined or defined for the received images. One example relationship is texture vs. edge sharpness for an image. Preserving edges in an image may also preserve texture or other high frequency information in an image. Another example relationship is noise vs. texture. Preserving texture or high frequency information may also result in more noise being present within an image. A further example relationship is color vs. tone. If the tone of an image is adjusted, tone adjustment may impact the color values for the pixels of the image (such as skewing one or more red, green, or blue values of a pixel when adjusting the tone of the image). The IQ model to quantify IQ may be used to determine different example values for the parameter set (based on the determined trade-offs) for producing processed images with high IQ scores (such as greater than a predetermined or adjustable threshold, greater than an IQ score for a previous processed image, etc.).
[0052] In an additional or an alternative option in automatically tuning the ISP (306), parameter values for the ISP for different scene types may be determined based on personal preference (310). For example, a person may be provided (e.g., presented for selection) choices with perceptible differences in processed images of the received images in order to assist in determining a person’s preferences. The preferences selected by the person may then be used to densify the parameter database (e.g., populate additional data points), adjust the parameter database (e.g., adjust existing data points), set (e.g., configure or determine) the parameter values for the ISP for processing images, or perform a combination of two or more of the operations.
[0053] The following examples describe automatically tuning with respect to noise vs. image sharpness IQ metrics / nesses and related parameters. However, the same or similar processes may be used for automatically tuning the ISP for other relationships, including color vs. tone or others.
Additionally, while a relationship between two nesses is described, relationships between three or more nesses may be determined or used to determine parameter values, with the number of calculations scaling non-linearly with the number of nesses to be related. Therefore, the following examples are for illustrative purposes only, and should not limit the scope of the present disclosure.
[0054] The parameter database 109 may include sets of parameter values previously determined to cause an ISP to generate a “high-quality” image (e.g., as designated or determined based on an IQ score equaling or exceeding a threshold score). Each set of parameter values may be associated with IQ metric values. The database 109 may be organized or have multiple organization structures so that vectors with similar IQ metrics may be grouped together. For example, the database 109 may be indexed or organized so that sets with similar texture ness values may be identified. As described in FIG. 3, the parameter database 109 may be adjusted or updated for automatically tuning the ISP.
[0055] FIG. 4 is an illustrative flow chart depicting an example operation 400 for adjusting the parameter database. Beginning at 402, one or more images for processing by an ISP are received or otherwise made available. The images may be raw images captured by a camera sensor with noise and luminance characteristics that may impact processing. Further, one or more personal preferences (such as preferences of the expert and/or a user for a final processed image) may optionally be received (404). Example preferences may include preferences regarding color saturation, tone, noisiness, etc. of the person for the processed images. A device may then determine whether an existing parameter database (with one or more previously determined sets of parameter values) is to be adjusted based on the characteristics for the camera sensor and/or the personal preferences (406). For example, an insufficient number of sets of parameter values may be determined to exist in the parameter database. In another example, the existing sets may be determined to insufficiently correlate to the camera sensor used for capturing the received images. In a further example, a scene type of a received image may not be covered by the existing parameter database.
[0056] Based on a determination that the parameter database is not to be adjusted (408), such that the sets of parameter values are determined to be sufficient for the received images, the existing parameter database may be used without adjustment (410). Based on a determination that the parameter database is to be adjusted (408), the received images may be evaluated using the existing sets of parameter values in the parameter database (412). In evaluating the received images (412), one or more relationships among IQ metrics may be analyzed using the received images (414). For example, the scatter of IQ metric relationships for texture versus edge sharpness (based on the existing sets of parameter values and the received images for processing) may be analyzed. One or more new sets of parameter values may then be determined based on the analyzed relationships (416). For example, the relationship between edge sharpness and texture IQ metrics may be used to determine new sets of parameter values for different sharpness and texture IQ metrics that still provide a sufficient IQ score for a processed image. The new sets of parameter values may also be used to better define tradeoffs for IQ metrics for the IQ model. For example, new sets of parameter values may indicate tradeoffs between a noisiness IQ metric and a texture IQ metric. The one or more new sets of parameter values may then be determined to be added to the parameter database (418), thus densifying the parameter database. Alternatively or additionally, an existing set of parameter values may be amended based on a new set of parameter values determined.
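A minimal sketch of the control flow of example operation 400 follows, with stand-in helper functions (all hypothetical) so the decision and densification steps are runnable; the criteria shown are placeholders, not the document's actual tests.

```python
# Minimal sketch of the adjust-or-not decision (406-418) with placeholder helpers.
import random

def needs_adjustment(database, sensor_profile, preferences, min_sets=10):
    # Placeholder criterion: e.g., too few stored sets for this sensor or scene type.
    return len(database) < min_sets

def propose_new_sets(database, images):
    # Stand-in for analyzing IQ-metric relationships (e.g., texture vs. edge sharpness)
    # across the received images and deriving candidate parameter sets from them.
    return [{"sharpness": random.random(), "texture": random.random()} for _ in images]

def adjust_parameter_database(database, images, sensor_profile=None, preferences=None):
    if not needs_adjustment(database, sensor_profile, preferences):
        return database                                  # 410: use existing sets as-is
    database = list(database)
    database.extend(propose_new_sets(database, images))  # 416-418: add new sets (densify)
    return database

print(len(adjust_parameter_database([], images=["raw_0", "raw_1"])))   # -> 2
```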
[0057] In determining new sets of parameter values, one or more IQ metrics may remain fixed while one or more other IQ metrics are adjusted. FIG. 5 is a depiction of a relationship 500 between texture and sharpness IQ metrics. Existing points 502 indicating the relationship between the nesses may be from the existing sets of parameter values corresponding to different texture and sharpness IQ metrics. With only two existing points 502 in the example, a plurality of new parameter value sets for different texture and sharpness IQ metrics may be determined using the received images (so as to have a sufficient IQ score for a processed image). The new sets may correspond to new points 504 on the relationship 500 between texture and sharpness IQ metrics, which may better indicate tradeoffs between IQ metrics. While the relationship 500 is depicted as a graph of two nesses, the relationship may be between any number of nesses and therefore any number of dimensions.
[0058] Determining new sets of parameter values may be based on existing sets of parameter values in the parameter database. For example, an existing set of parameter values (a parent set) may be adjusted in order to create one or more new sets of parameter values (children sets). FIG. 6 is an illustrative flow chart depicting an example operation 600 for determining new sets of parameter values for adjusting the parameter database.
[0059] Beginning at 602, a space of near IQ metrics for an existing parent set is determined. For example, a determined distance away from a parent set may be a determined space. Triangulation or sum of differences are example methods for determining a distance, but the space may be determined in any suitable way. Graphically for 3 nesses, a cube may be determined around a parent set, where potential children sets may exist within the cube (space). In another example, a sphere or other suitable shape may be determined around a parent set.
[0060] In some example implementations, a child set may be determined by interpolating parameter values between the parent set and an existing set (such as described regarding 604-608). In some other example implementations, a child set may be determined by perturbing or adjusting parameters of the parent set within the space (such as described regarding 610). In some further example implementations, a combination of interpolating and perturbing may be performed. For example, some child sets may be created through perturbation, then additional child sets may be created through interpolating between the previous child sets and the parent set. In another example, an interpolated child set’s parameters may be perturbed within a space to adjust the child set or create new child sets. [0061] In the example of 604-608, the furthest neighbor from the parent set in the space is used for interpolation. However, any neighbor may be used for interpolation in other examples. Referring back to 604, the distances between the parent set and existing sets in the space may be determined. The furthest set from the parent set may then be determined based on the distances (606). For example, the space may be defined in dimensions of nesses, and a distance may be the combined difference in nesses between the sets. In this manner, the differences in parameter values between the furthest set and the parent set may be considered the maximum adjustments to the parameter values for the parent set in creating children sets. As a result, any resulting child set may be configured to be within the space.
[0062] After determining the furthest neighbor in the space (606), one or more parameter values from the parent set may be adjusted with an interpolated difference between the furthest neighbor and the parent set (608). In some example implementations, only a subset of the IQ metrics may be determined to be adjusted. In this manner, the corresponding parameters for the subset of IQ metrics may be adjusted through interpolation. In some other example implementations, all of the parameters may be adjusted through interpolation. Adjusting a parameter may be performed as depicted in equation (1) below:
child parameter = parent parameter + α(neighbor parameter − parent parameter)     (1)
where α is a value between 0 and 1. In some example implementations, α may be constant for all parameters to be adjusted. As a result, the factor of adjustment for the parameters being adjusted is the same. For example, based on all parameters being adjusted, the child set is as depicted in equation (2) below:
child set = parent set + α(neighbor set − parent set)     (2)
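A short worked sketch of equations (1) and (2): the furthest neighbor within the space of nearby IQ metrics is selected, and the parent's parameter values are moved a fraction α toward that neighbor. The IQ-metric and parameter values shown are illustrative only.

```python
# Minimal sketch of child-set creation by interpolation per equations (1) and (2).
import numpy as np

def furthest_neighbor(parent_iq, neighbor_iqs, radius):
    """Among sets whose IQ metrics lie within `radius` of the parent, pick the furthest."""
    dists = [np.linalg.norm(np.asarray(iq) - parent_iq) for iq in neighbor_iqs]
    in_space = [i for i, d in enumerate(dists) if d <= radius]
    return max(in_space, key=lambda i: dists[i]) if in_space else None

def interpolate_child(parent_params, neighbor_params, alpha):
    parent = np.asarray(parent_params, dtype=float)
    neighbor = np.asarray(neighbor_params, dtype=float)
    return parent + alpha * (neighbor - parent)   # equation (2)

parent_iq = np.array([0.6, 0.5])                  # e.g., (texture, sharpness) IQ metrics
neighbor_iqs = [[0.62, 0.55], [0.7, 0.52], [0.9, 0.9]]
neighbor_params = [[5.0, 1.2, 0.3], [6.0, 1.5, 0.4], [9.0, 3.0, 0.9]]

idx = furthest_neighbor(parent_iq, neighbor_iqs, radius=0.15)   # third set is outside the space
child = interpolate_child([4.0, 1.0, 0.2], neighbor_params[idx], alpha=0.5)
print(child)                                                    # -> [5.   1.25 0.3 ]
```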
[0063] Alternative or additional to a child set being determined through interpolation, a new set may be determined by adjusting or perturbing one or more parameters of the parent set (610). In some example implementations of adjusting one or more parameters of the parent set, the sparsity of sets around the parent set may be determined, with the sparsity used to determine the factor by which to adjust one or more parameters. A sparsity cost for a parent set may be a distance between the parent set and a distribution of existing sets in the space or across the group. For example, the Mahalanobis distance between the parent set and its existing neighbors in the space may be determined as the sparsity cost. The distance may also be determined for each existing set and an average distance determined for the existing sets across the entire group (which may be an average cost for the group). The factor for adjusting parameters may be as depicted in equation (3) below:
[Equation (3), which defines the adjustment factor in terms of x and c, appears only as an image in the original document.]
where x is the parent set sparsity cost and c is the average sparsity cost for the entire group. If the sparsity around the parent set is greater than the average sparsity (fewer neighbors surround the parent set than typical), then adjustments to the parameters may be smaller so that the corresponding IQ metrics are within the space. Conversely, if the sparsity around the parent set is less than the average sparsity (more neighbors surround the parent set than typical), then adjustments to the parameters may be greater since the greater number of neighbors indicates that the corresponding IQ metrics for greater adjustments should still be within the space. The size of the window for adjusting a parameter may be a standard deviation of the parameter for the entire group times the factor, and the window may be centered at the parameter value for the parent set. If the sparsity around the parent set is greater than or equal to the average sparsity (fewer neighbors, or the same number of neighbors, surround the parent set than typical), the window size may be approximately one standard deviation. Conversely, if the sparsity around the parent set is less than the average sparsity, the window size may be multiple standard deviations. In some example implementations, a parameter value is randomly or pseudo-randomly selected from the window. In some further example implementations, related parameters (such as parameters associated with an IQ metric) may be adjusted by a similar factor, where a same position in the window is used for each related parameter.
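The following sketch illustrates the perturbation approach. Because equation (3) is not reproduced in the text above, the adjustment factor here is an assumed stand-in chosen to match the described behavior (roughly a one-standard-deviation window when the parent's sparsity cost x is at or above the group average c, and a wider window when x is below c); all values are illustrative.

```python
# Minimal sketch of child-set creation by perturbation within a sparsity-scaled window.
import numpy as np

rng = np.random.default_rng(0)

def perturb_child(parent_params, group_params, parent_cost, avg_cost):
    group = np.asarray(group_params, dtype=float)
    std = group.std(axis=0)                      # per-parameter spread over the group
    factor = max(1.0, avg_cost / parent_cost)    # assumed stand-in for equation (3)
    window = factor * std                        # window width per parameter
    low = np.asarray(parent_params) - window / 2 # window centered on the parent value
    high = np.asarray(parent_params) + window / 2
    return rng.uniform(low, high)                # random draw inside the window

group = [[4.0, 1.0, 0.2], [5.0, 1.2, 0.3], [6.0, 1.5, 0.4], [9.0, 3.0, 0.9]]
print(perturb_child([5.0, 1.2, 0.3], group, parent_cost=1.4, avg_cost=1.0))
```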
[0064] After generating one or more potential child sets, the IQ metrics for each potential child set may be determined (612). For example, the received image(s) may be processed by the ISP using the child parameter values, and IQ metrics may be calculated from the processed image(s). A determination may then be made whether the IQ metrics are valid (614). In one example, the IQ metrics are compared to the IQ metrics for existing sets in the parameters database to determine if they are consistent. If a portion of the IQ metrics are outliers (e.g., not included among the IQ metrics of the existing sets in the parameter database), the IQ metrics may be considered invalid. In another example, an IQ score may be computed for a processed image. If the image score is sufficient, such as greater than a threshold, the IQ metrics are considered valid. Other suitable processes for determining the validity of the IQ metrics may be used, and the present disclosure should not be limited to specific examples.
[0065] If the new IQ metrics are considered valid (614), the child set may be added to the parameter database (616). If the new IQ metrics are considered invalid (614), the child set may be rejected and not added to the parameter database (618).
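A minimal sketch of the validity check and the add-or-reject decision (612-618) follows, using the consistency-with-existing-sets test described above; the helper names and the margin value are hypothetical.

```python
# Minimal sketch: keep a candidate child set only if its IQ metrics are not outliers
# relative to the existing sets (an IQ-score threshold is the described alternative).
import numpy as np

def metrics_are_valid(child_iq, existing_iqs, margin=0.05):
    existing = np.asarray(existing_iqs, dtype=float)
    child = np.asarray(child_iq, dtype=float)
    low = existing.min(axis=0) - margin
    high = existing.max(axis=0) + margin
    return bool(np.all((child >= low) & (child <= high)))   # no outlier metrics

def maybe_add_child(database, child_params, child_iq, existing_iqs):
    if metrics_are_valid(child_iq, existing_iqs):
        database.append((tuple(child_iq), tuple(child_params)))   # 616: add to database
        return True
    return False                                                  # 618: reject child set

db = []
existing = [[0.5, 0.4], [0.7, 0.6], [0.65, 0.55]]
print(maybe_add_child(db, (5.0, 1.25, 0.3), [0.6, 0.5], existing))    # True (consistent)
print(maybe_add_child(db, (9.9, 9.9, 9.9), [0.95, 0.1], existing))    # False (outlier IQ metrics)
```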
[0066] With the database of sets of parameter values to be used for an ISP, personal preferences through user input(s) may be collected to focus the parameter database for personal preferences. In some example implementations, a display may provide (e.g., display) different processed images for a varying IQ metric, and a mechanism for receiving user input (e.g., a GUI or a camera or microphone) may allow a user to select the preferred processed images to indicate the preferences for the IQ metric. FIG. 7 is an illustrative flow chart depicting an example operation 700 for adjusting one or more IQ metrics in a sequential fashion in adjusting the parameters for personal preference. The process may be used to indicate which parameter sets from the parameter database are preferred by the user for the ISP (or ISP and camera combination).
[0067] Beginning at 702, the IQ metrics to be adjusted for a user are determined. In one example, a user may indicate which IQ metrics are of particular importance to that particular user. The IQ metrics may be for a particular scene or generally for all scenes. The parameter sets of the parameter database may then be clustered or grouped for each of the IQ metrics to be adjusted (704). FIG. 8 is a depiction of an example clustering of parameter sets as illustrated by a relationship of noise versus texture. As shown, the parameter sets are clustered into three groups: low noise and texture 802, medium noise and texture 804, and high noise and texture 806. While three groups are shown, any number of clusterings may exist. Additionally, while the relationship is illustrated as between two nesses, the relationship may be any number of dimensions corresponding to the number of nesses being related. The groupings or clusterings indicate the sets with close IQ metrics (such as IQ metrics within a determined distance of one another). For example, the three clusterings indicate that the noise IQ metric and the texture IQ metric are similar for the parameter sets in a cluster. While not shown, one or more parameter sets may not be clustered and may be removed from consideration for the final parameter set to be used by the ISP.
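A minimal sketch of clustering parameter sets by their noise and texture IQ metrics into three groups, as in FIG. 8, follows. k-means is used here only as one possible grouping method; the document does not mandate a particular clustering algorithm, and the values are illustrative.

```python
# Minimal sketch: group stored parameter sets by their (noise, texture) IQ metrics.
import numpy as np
from sklearn.cluster import KMeans

iq_metrics = np.array([
    [0.10, 0.15], [0.12, 0.20],   # low noise and texture
    [0.45, 0.50], [0.50, 0.55],   # medium noise and texture
    [0.85, 0.80], [0.90, 0.90],   # high noise and texture
])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(iq_metrics)
for cluster in range(3):
    members = np.where(labels == cluster)[0]
    print(f"cluster {cluster}: parameter sets {members.tolist()}")
```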
[0068] Referring back to FIG. 7, a received image is processed for each of the parameter sets in a clustering for the IQ metric to first be adjusted (706). The image may also be processed with a varying IQ metric corresponding to differences in the corresponding parameters for each of the parameter sets (with each parameter set possibly being used multiple times to process the image). The number of times that the image is processed may correspond to the number of parameter sets clustered for the IQ metric. The processed images are then displayed or otherwise presented to a user (708) so that the user may indicate which processed image(s) are preferred. The user may then indicate (such as through a GUI or other user input) which processed images are preferred (710). Alternatively, an IQ score may be determined for each of the processed images, and the highest IQ scores or scores greater than a threshold may be used to select the processed images.
[0069] For the user selections, the corresponding parameter values for the IQ metric being adjusted may be determined (712). For example, the user selections may have a subset of parameters corresponding to the IQ metric with similar or the same parameter values across the user selections. In another example, for each selection, the parameter values associated with the IQ metric are preserved when processing an image for a next varying IQ metric. The image is then again processed for a next varying IQ metric (714). The process may continue until all indicated metrics are adjusted. Afterwards, the parameter database may be searched to determine whether the parameters for the preferred IQ metrics are similar to the parameters for one or more stored parameter sets. Such parameter sets may be considered the preferred sets of parameter values to be used by the ISP for processing an image.
Additionally or alternatively, the determined parameter values may be added to the parameter database as one or more new parameter sets.
[0070] In sequentially adjusting IQ metrics, the adjustments may be depicted in a tree branch structure. FIG. 9 is a depiction of an example tree branch illustration 900 for sequentially adjusting IQ metrics. As shown, the clusterings 902 are used as starting points, and an edge MTF 904 may first be used to adjust an edge IQ metric. A high contrast texture MTF 906 may then be used to next adjust a high contrast texture IQ metric. A low contrast texture MTF 908 may next be used to adjust a low contrast texture IQ metric. A noise MTF 910 may then be used to adjust a noise IQ metric. Fine tuning adjustments (indicated as overshoot 912) may then be performed to finalize one or more parameters that may change the perception of the processed image. The end point of each of the arrows may indicate a different processed image. The continuing arrows may indicate that the user selected those images for the respective IQ metric. In some example implementations, the darkened solid arrows, the dashed solid arrows and the gray solid arrow may indicate images selected by the user as preferred over other selected images. The user may select the image corresponding to the final darkened solid arrow during overshoot 912 as the preferred image with respect to the other preferred images.
[0071] A GUI may be used in adjusting one or more IQ metrics. For example, a GUI may allow a user to inspect the trade-off between IQ metrics and determine the preferred metrics. In another example, the GUI may allow a user to determine the preferred IQ metric for the selected metrics to be adjusted. FIGS. 10-14 depict an example GUI for adjusting IQ metrics corresponding to the example tree branch illustration in FIG. 9. FIG. 10 is a snapshot 1000 of an example GUI for adjusting an edge IQ metric. A user may select one or more of the defined edge IQ metric values or relationships and press next to go to the next IQ metric. FIG. 11 is a snapshot 1100 of an example GUI for adjusting a high contrast texture IQ metric. With the selections for the edge IQ metric, a user may select one or more of the defined high contrast texture IQ metric values or relationships and press next to go to the next IQ metric. FIG. 12 is a snapshot 1200 of an example GUI for adjusting a low contrast texture IQ metric. With the selections for the edge IQ metric and the high contrast IQ metric, a user may select one or more of the defined low contrast texture IQ metric values or relationships and press next to go to the next IQ metric. FIG. 13 is a snapshot 1300 of an example GUI for adjusting a noise IQ metric.
With the selections for the edge IQ metric and the high and low contrast IQ metrics, a user may select one or more of the defined noise IQ metric values or relationships and press add to cart to end. As shown, the potential noise IQ metrics (N in FIG. 13) are based on the previously selected IQ metrics (E selected for edge tuning (FIG. 10), H selected for high contrast tuning (FIG. 11), and L selected for low contrast tuning (FIG. 12) under each of the images on the left of the snapshot 1300). [0072] The GUI may show the groupings of selected IQ metrics (with respective parameter sets). FIG. 14 is a snapshot 1400 of an example GUI indicating the concatenation of selections for the different IQ metrics. In some example implementations, a user may select one or more final concatenations to be used (such as by checking the box to the left illustrated in snapshot 1400). The parameter set used by the ISP is thus dependent on the selected IQ metric values or relationships (such as through the different MTFs for determining the parameter values for a selected grouping of IQ metrics). For example, one or more sets of parameter values from the parameter database may be identified based on the selected concatenation of IQ metrics (such as from FIG. 14). Such identified sets of parameter values may therefore be used by the ISP in processing received images.
[0073] As previously stated, the optimization of an IQ model may be open ended and subject to different preferences between users or persons. There may be no “correct” set of parameter values, since different processed images using different parameter values may be considered to be of similar IQ by a person. As a result, determining the parameter values to be used or otherwise tuning an ISP may be long or tedious, since the parameter values may not converge to one specific set of parameter values. Determining initial parameter values or how to adjust parameter values may be difficult since there may not be one preferred setting for the IQ metrics.
[0074] In some aspects of the present disclosure, a reference image processed by a different ISP or device may be introduced into the automatic tuning process. The reference image may provide some guidance or indication as to one or more preferred IQ metrics and their associated parameter values.
For example, a reference image may be used to determine one or more closest sets of parameter values in the parameter database. The closest sets may be used to densify or otherwise adjust the parameter database. The below example processes of using a reference image for automatically tuning may be combined with one or more of the previously described example processes for automatically tuning or updating the parameter database.
[0075] FIG. 15 is an illustrative flow chart depicting an example operation 1500 for using a reference image in automatically tuning an ISP. Beginning at 1502, a reference image may be received. The reference image may be previously processed. For example, the reference image may have been provided by a different ISP or device after completing processing. In some example implementations, the reference image is different than the input image for processing by the ISP.
[0076] After receiving the reference image, one or more preferred IQ metrics may be determined from the reference image (1504). For example, a texture IQ metric, a noise IQ metric, and an edge IQ metric may be determined from the reference image. Other example IQ metrics may include a tone IQ metric, color IQ metric, high frequency contrast IQ metric, low frequency contrast IQ metric, and so on. While the example processes are described regarding texture, noise, and edge IQ metrics, any other IQ metrics and any number of IQ metrics may be used. Therefore, the present disclosure should not be limited to specific IQ metrics or examples.
[0077] One or more parameter sets with the parameter values for the sets corresponding to IQ metrics closest to the preferred IQ metrics may then be identified (1506). In some example implementations, a parameter database may store a vector of IQ metrics for each set of parameter values. In some other example implementations, the MTFs for an IQ model may be used to determine the IQ metrics for each set of parameter values in the parameter database. Parameter sets with the closest IQ metrics to the preferred IQ metrics may be considered the closest parameter sets.
[0078] In some example implementations, a distance function may be used to determine the closest parameter sets. An example distance function is depicted in equation (4) below:

distance(j) = Σ_i W_i · |X_i − M_j,i|, for j from 1 to D (4)

where i indexes a specific IQ metric, X_i is the preferred IQ metric value for the specific IQ metric from the group or vector of preferred IQ metric values X, M_j,i is the corresponding IQ metric value from the group or vector M_j of IQ metric values for the jth parameter set in the parameter database, W_i is a weight for the ith IQ metric from weight vector W (where each IQ metric may be associated with a different weight), and D is the number of parameter sets in the parameter database. In some other example implementations, the distance function may be an unweighted summation, where the difference between a parameter set IQ metric value and the preferred IQ metric value is not multiplied by a weight factor.
[0079] In one example, if the preferred IQ metrics determined are the texture IQ metric, edge IQ metric, and noise IQ metric, i may range from 1 to 3 for the three IQ metrics, and the distance for a parameter set j may be a sum of three values: the weighted differences between the parameter set's IQ metric values and the preferred IQ metric values for the texture, edge, and noise IQ metrics. In some example implementations, the closest parameter set j may be the parameter set with the smallest or minimum distance across the parameter sets. In some other example implementations, a parameter set may be selected if its distance is less than a threshold. In this manner, a parameter set may be identified without searching the entire parameter database.
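The distance computation of equation (4) and the threshold-based early exit described above can be sketched as follows. The weighted absolute-difference form, the metric values, and the weights are assumptions for illustration; a squared-difference or unweighted variant would follow the same pattern.

```python
import numpy as np

def weighted_distance(preferred, candidate, weights):
    """Weighted distance between the preferred IQ-metric vector X and the
    IQ-metric vector M_j of one stored parameter set (equation (4))."""
    return float(np.sum(weights * np.abs(preferred - candidate)))

def closest_parameter_set(preferred, database_metrics, weights, threshold=None):
    """Return the index of the closest parameter set, or the first set whose
    distance falls below an optional threshold (early exit)."""
    best_j, best_d = None, np.inf
    for j, metrics_j in enumerate(database_metrics):   # j from 1 to D
        d = weighted_distance(preferred, metrics_j, weights)
        if threshold is not None and d < threshold:
            return j                                   # stop without scanning all D sets
        if d < best_d:
            best_j, best_d = j, d
    return best_j

# Illustrative values: texture, edge, and noise IQ metrics (i = 1..3).
X = np.array([0.80, 0.70, 0.20])           # preferred IQ metrics from the reference image
W = np.array([1.0, 1.0, 0.5])              # per-metric weights (assumed)
M = np.array([[0.75, 0.65, 0.25],
              [0.40, 0.90, 0.10],
              [0.82, 0.71, 0.18]])          # IQ metrics for each stored parameter set
print(closest_parameter_set(X, M, W))       # -> 2
```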
[0080] Using the one or more identified parameter sets, the ISP may then process a received image (1508). For example, a raw image may be input into or received by the device or ISP and processed using the identified parameter set(s). The received image may be the raw (pre-processing) image corresponding to the reference image. One or more personal or user preferences also may be determined or received (1510). Then, the parameter database may be adjusted based on the one or more personal preferences and the one or more identified parameter sets (1512).
[0081] In some example implementations, variations to an identified parameter set may be used to process the input image, and the variations are analyzed to determine whether a resulting child set is to be added to the parameter database. For example, example operation 600 in FIG. 6 may be used to densify the parameter database, where the parent set is from the one or more identified parameter sets in 1506. Referring back to FIG. 15, the process of identifying one or more parameter sets and using them to adjust the parameter database (1506-1512) may be performed recursively until it is determined that the parameter database is not to be further adjusted. For example, the parameter database may reach a critical number of parameter sets being stored. In another example, the parameter database may stop being updated if no new child sets with valid IQ metrics (such as from example operation 600 in FIG. 6) are identified or determined. In a further example, the parameter database may stop being updated if the new child sets do not sufficiently improve the IQ (such as increasing the IQ score by a threshold amount, or where differences between the parent set and child set cannot be perceived by a user when processing an image).
[0082] In some example implementations for identifying one or more parameter sets from the parameter database (1506), a number of parameter sets equal to or greater than the number of preferred IQ metrics may be identified. FIG. 16 is an illustrative flow chart depicting an example operation 1600 for determining the closest parameter sets and adjusting the parameter database. Example operation 1600 in FIG. 16 may be an example implementation of steps 1506-1512 of FIG. 15. While FIG. 16 is described regarding texture, noise, and edge IQ metrics, any IQ metrics and number of IQ metrics may be used.
[0083] A closest parameter set for the preferred IQ metrics (such as the texture, noise, and edge
IQ metrics) may be determined from the parameter database (1602). In some example implementations, the distance function depicted in equation (4) may be used to determine the closest parameter set. However, while the closest parameter set takes into account all preferred IQ metrics, a different parameter set may be better suited for processing an image. For example, one or more of the IQ metrics may be relaxed in determining a closest parameter set. While operation 1600 describes relaxing one IQ metric in determining a closest parameter set, more than one IQ metric may be relaxed.
[0084] Referring to 1604, a closest parameter set with a relaxed texture IQ metric may be determined. In some example implementations, the weight vector in determining a distance may be adjusted to reduce the weight for the texture IQ metric. For example, the weight may be adjusted to zero (to remove consideration of the IQ metric from determining the distance) or a portion of the previous weight (to reduce consideration of the IQ metric in determining the distance). Similarly, a closest parameter set with a relaxed noise IQ metric may be determined (1606), and a closest parameter set with a relaxed edge IQ metric may be determined (1608). In some example implementations, one or more of the determined parameter sets may be the same. In some other example implementations, previously determined closest parameter sets may be removed from consideration in determining a closest parameter set so that the number of determined parameter sets corresponds to the number of preferred IQ metrics.
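A minimal sketch of steps 1602-1608, assuming three IQ metrics (texture, noise, edge) and the weighted absolute-difference distance of equation (4): zeroing one entry of the weight vector removes that metric from consideration, and the closest parameter set is recomputed. All values are illustrative.

```python
import numpy as np

def closest_index(preferred, database_metrics, weights):
    # Weighted absolute-difference distance to every stored parameter set.
    d = np.sum(weights * np.abs(database_metrics - preferred), axis=1)
    return int(np.argmin(d))

X = np.array([0.80, 0.70, 0.20])                    # texture, noise, edge (illustrative)
M = np.array([[0.75, 0.65, 0.25],
              [0.20, 0.72, 0.21],
              [0.82, 0.30, 0.60]])
W = np.array([1.0, 1.0, 1.0])

overall         = closest_index(X, M, W)                              # 1602: all metrics considered
relaxed_texture = closest_index(X, M, W * np.array([0.0, 1.0, 1.0]))  # 1604: texture weight zeroed
relaxed_noise   = closest_index(X, M, W * np.array([1.0, 0.0, 1.0]))  # 1606: noise weight zeroed
relaxed_edge    = closest_index(X, M, W * np.array([1.0, 1.0, 0.0]))  # 1608: edge weight zeroed
print(overall, relaxed_texture, relaxed_noise, relaxed_edge)          # -> 0 1 0 0
```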
[0085] A received image may then be processed using the determined/identified parameter sets
(1610), which may be similar to 1508 in FIG. 15. Using the processed images, the parameter set to be used may be determined to be the closest parameter set (such as determined in 1602) or somewhere between the closest parameter set and one of the parameter sets with a relaxed IQ metric (such as determined in 1604 through 1608) (1612). In some example implementations, the determined parameter set may be one of the parameter sets determined in 1604 through 1608 (instead of between one of the parameter sets and the closest parameter set).
[0086] In some aspects of determining the parameter set to be used (1612), the processed images may be presented to a user. The user may then select the preferred processed image(s). The user input or selection may indicate which parameter set to be used. For example, if the user selects the processed image for the closest parameter set, the closest parameter set is determined to be the parameter set to be used by the ISP. In this manner, the parameter database is not updated since the closest parameter set is selected. If the user selects one of the processed images for the parameter sets for relaxed IQ metrics, a parameter set between the closest parameter set and the corresponding relaxed parameter set may be determined to be used. As a result, a child set from the closest parameter set and the relaxed IQ metric parameter set may be created.
[0087] In some example implementations, the child set from the closest parameter set and the relaxed IQ metric parameter set may be determined through interpolation between the two existing parameter sets. For example, steps 604-608 of example operation 600 in FIG. 6 may be used to determine or create a child set. In another example, one or more IQ metric values between the values for the closest parameter set and the parameter set of the relaxed IQ metric may be determined. One or more MTFs of the IQ model may then be used to determine the parameter values for a child set.
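As one possible realization of the interpolation, a child set may be formed by linearly blending the parameter values of the two parent sets; this is a simplification, since the disclosure also allows interpolating IQ metric values and mapping them back to parameter values through the MTFs of the IQ model. The blend factor and parameter values below are illustrative.

```python
import numpy as np

def interpolate_child(parent_a, parent_b, alpha=0.5):
    """Create a candidate child parameter set between the closest parameter
    set (parent_a) and a relaxed-IQ-metric parameter set (parent_b).
    alpha = 0 returns parent_a, alpha = 1 returns parent_b."""
    return (1.0 - alpha) * parent_a + alpha * parent_b

closest_set = np.array([0.40, 12.0, 3.5])   # illustrative parameter values
relaxed_set = np.array([0.60, 10.0, 5.0])
child_set = interpolate_child(closest_set, relaxed_set, alpha=0.5)
print(child_set)   # [ 0.5  11.    4.25]
```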
[0088] In some aspects, the child set is used in processing the received image, and the result is compared to the processed images for the two parent sets. If a user prefers the processed image for the child set (or alternatively, an IQ score or other evaluation indicates that the processed image for the child set is better than the other processed images), the child set may be added to the parameter database. The process may be repeated as long as the parameter database is to be adjusted (such as being densified with additional child sets). If a user prefers the processed image for the closest parameter set (or alternatively, an IQ score or other evaluation indicates that the processed image for the child set is worse than the other processed images), the child set may be rejected and the parameter database not further updated.
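A compact sketch of this accept/reject step, assuming an IQ score is used in place of direct user selection (the scoring function itself is outside this sketch):

```python
def maybe_add_child(database, parent_score, relaxed_score, child_set, child_score):
    """Keep the child set only if its processed image scores better than the
    images from both parent sets; otherwise leave the database unchanged."""
    if child_score > parent_score and child_score > relaxed_score:
        database.append(child_set)     # densify the parameter database
        return True
    return False
```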
[0089] Using IQ metrics and user preferences for the entirety of tuning an ISP may require a significant amount of time. As previously stated, an expert manually adjusting IQ metrics may take weeks to tune an ISP. Additionally, automatic tuning that recursively updates the parameter sets or adjusts the parameter database based on repeated user inputs may take, e.g., 6-8 hours. In some aspects, for at least an initial portion of automatically tuning an ISP, recursively adjusting the parameter database or tuning the parameters may be removed. For example, a one-shot or non-recursive process may be used to initially determine the parameter values for a parameter set to be used by the ISP (which may be called “coarse” tuning). The initially determined parameter values may then be tuned or adjusted using, e.g., user preferences, a scene type, luminance, and/or characteristics of the target ISP, to improve or optimize the parameter set (which may be called “fine” tuning). While coarse tuning and fine tuning are described, coarse tuning may be used exclusively to determine a parameter set to be used by the ISP. Therefore, the present disclosure should not be limited to including both coarse tuning and fine tuning.
[0090] FIG. 17 is a depiction of an example feedback tuning flow 1700 using a reference image.
Example feedback tuning flows using a reference image include processes 1500 and 1600 in FIG. 15 and FIG. 16, respectively. As shown, a reference or target image from a separate ISP may be used to adapt a tuning tool. For example, the parameter database may be adjusted using the reference image.
As shown, the tuning tool may be recursively updated (such as continuing to densify or otherwise adjust the parameter database), and the updated tool (such as the adjusted parameter database) may recursively be used to process an input image (such as the raw image) by the ISP to determine more feedback, such as user preferences and/or IQ metrics, for updating the tuning tool. The feedback loop of updating the tool and determining feedback to again update the tool may continue until the tool is sufficiently updated. For example, the parameter database is adjusted until one or more parameter sets are determined to be sufficient for use by the ISP in processing images.
[0091] In some example implementations, the feedback loop for updating the tuning tool (such as adjusting the parameter database) is replaced with a non-recursive or one-shot process for determining initial values for the parameters using a reference image. In this manner, the time for determining the parameter values may be reduced. FIG. 18 is a depiction of an example non-recursive (“feed-forward”) tuning flow 1800 using a reference image. As shown, a previously trained parameter estimator for the ISP may be used to determine initial parameter values based on differences between the input image (such as the raw image) and the reference image (such as a target image from a different ISP). Since the input image is not processed and evaluated multiple times as part of a feedback or recursive process, the time for determining the parameter values and processing the input image by the ISP using the parameter values may be reduced.
[0092] A parameter estimator may be previously trained before being used to determine parameter values in the flow 1800 in FIG. 18. In some example implementations, the parameter estimator is a neural network or other fuzzy logic decision maker that is trained using a plurality of other inputs and corresponding reference images for the ISP. In one example, the parameter estimator may be a deep layer neural network (“DeepNet”). In training a DeepNet using a plurality of input images and corresponding reference images from a different ISP, parameter relationships between an input image and a reference image may be determined and refined. As a result, differences between an input image and a corresponding reference image may be an input into the relationships defined by the DeepNet to determine and output parameter values to be used by the ISP in processing images. In training the DeepNet, the error or loss between a processed image from the ISP and the reference image is analyzed to determine how to adjust or update the DeepNet. For example, differences in IQ metrics may be compared between the reference image and the processed image in further training the DeepNet.
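A minimal sketch of such a parameter estimator and its training loop is shown below (PyTorch is used purely for illustration). The network layout, the use of target parameter vectors as supervision, and all tensor shapes are assumptions; the disclosure describes training against the error between the ISP-processed image and the reference image, which would require a differentiable ISP or a proxy for it.

```python
import torch
import torch.nn as nn

class ParameterEstimator(nn.Module):
    """Minimal stand-in for the 'DeepNet' parameter estimator: it consumes the
    raw input image and the reference image (stacked along the channel axis)
    and regresses one value per tunable ISP parameter."""
    def __init__(self, num_params: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(6, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, num_params)

    def forward(self, raw_img, ref_img):
        x = torch.cat([raw_img, ref_img], dim=1)   # differences are learned implicitly
        return self.head(self.features(x).flatten(1))

# Training sketch (assumption: target parameter vectors that reproduce each
# reference image are available for supervision; an image-space loss through a
# differentiable ISP proxy could be substituted, as described above).
model = ParameterEstimator(num_params=16)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

raw_batch = torch.rand(4, 3, 128, 128)       # placeholder raw inputs
ref_batch = torch.rand(4, 3, 128, 128)       # placeholder reference outputs from another ISP
target_params = torch.rand(4, 16)            # placeholder supervision

for _ in range(10):                          # a few illustrative steps
    optimizer.zero_grad()
    loss = loss_fn(model(raw_batch, ref_batch), target_params)
    loss.backward()
    optimizer.step()
```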
[0093] In some example implementations of using a trained parameter estimator (such as a
DeepNet) to determine the parameter values, IQ metrics or subjective measures for processed images may not be required or used. For example, the parameter estimator, once trained, may be sufficient on its own in determining the parameter values. In determining the parameter values, the trained parameter estimator may receive a reference (output) image from a different ISP and the corresponding input image. The images may be used to estimate parameters for processing the input image by the present ISP so that the processed image from the ISP approximates the reference image from the different ISP.
[0094] FIG. 19 is an illustrative flow chart depicting an example operation 1900 for using a trained parameter estimator (such as a DeepNet) to determine the parameters for an ISP to process images. Beginning at 1902, an input image to be processed by the ISP may be received. A reference image corresponding to the input image processed by a different ISP may also be received (1904). The input image and the reference image may then be input into the trained parameter estimator for determining or estimating the parameters to be used by the ISP for processing the input image (1906). With the input image and the reference image, the trained parameter estimator may estimate the parameters for the ISP so that the ISP may approximate the reference image in processing the input image (1908). For example, the processed image from the ISP is to be as close as possible to the reference image.
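Assuming a trained estimator such as the sketch above and some callable that applies the ISP with a given parameter vector (both hypothetical names), operation 1900 reduces to a single feed-forward pass:

```python
import torch

def tune_isp_from_reference(raw_image, reference_image, estimator, isp_apply):
    """One-shot ('feed-forward') tuning per operation 1900: estimate parameter
    values from the (input image, reference image) pair, then process the input
    image with those values. `estimator` and `isp_apply` are assumed callables,
    e.g. the trained model above and a vendor ISP simulation."""
    with torch.no_grad():
        params = estimator(raw_image, reference_image)   # 1906/1908
    return params, isp_apply(raw_image, params)          # output should approximate the reference
```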
[0095] In training a parameter estimator, preferred processed images from another ISP may be selected to provide reference images for the input images used in training the parameter estimator. In this manner, the parameter estimator may estimate parameter values so as to “imitate” (e.g., track, mirror, closely correspond to, substantially replicate, etc.) the processing of images by the other ISP. In other words, the parameter estimator may determine parameter values that, when used to tune an ISP, generate a processed image that would be perceived by a user/person to substantially replicate the processing performed by the other, distinct ISP.
[0096] In some example implementations, multiple parameter estimators may be used in tuning an ISP. An ISP may include a plurality of filters or modules for processing different aspects of an input image. FIG. 20 is a block diagram of an example ISP 2000. ISP 2000 may be an example implementation of the ISP 112 of device 100 in FIG. 1. The illustrated ISP 2000 may be a single thread (or single core) processor with a sequence of filters 2002A-2002N. In alternative aspects, the ISP may be (or be included in) a multi-thread or multiple-core processor. In one example implementation, filter 1 (2002A) may be a noise reduction filter, filter 2 (2002B) may be an edge enhancement filter, and filter N (2002N) may be a final filter to complete processing the captured image frame. The filters may process the image, with different filters corresponding to different IQ metrics (such as a noise IQ metric for filter 1 2002A, an edge IQ metric for filter 2 2002B, and so on). Each filter may use a plurality of parameters to process the corresponding aspect of the input image (such as sharpening edges, denoising, and so on).
[0097] In some example implementations, a different parameter estimator may be trained for each of the filters or modules of the ISP. For example, a first parameter estimator may be trained for filter 1 2002A, a second parameter estimator may be trained for filter 2 2002B, and so on. In this manner, a parameter estimator may estimate or determine the parameter values for the corresponding filter or module of the ISP to be used in processing the input image.
[0098] Since some of the IQ metrics or filters may be related, one or more same parameters may be used by multiple filters or modules of an ISP in processing an input image. In some aspects, different parameter estimators may determine different values for a parameter. The ISP may use the different values corresponding to the respective filters or modules. Alternatively, the ISP or device may select and use one of the determined values for the parameter across the multiple filters or modules.
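One hypothetical way to organize per-module estimators and reconcile a parameter shared by several modules is sketched below; the module names, the dictionary-based parameter representation, and the averaging rule are illustrative only, since the disclosure also permits keeping distinct per-module values.

```python
def estimate_all_modules(raw_image, ref_image, module_estimators):
    """Run one parameter estimator per ISP module (e.g. {"denoise": est1,
    "edge": est2, ...}), then merge parameters that appear in several modules."""
    per_module = {name: est(raw_image, ref_image)          # each returns {param_name: value}
                  for name, est in module_estimators.items()}

    merged = {}
    for params in per_module.values():
        for key, value in params.items():
            merged.setdefault(key, []).append(value)

    # Resolve a parameter estimated by several modules by averaging; the ISP
    # could instead keep the distinct per-module values unchanged.
    resolved = {key: sum(vals) / len(vals) for key, vals in merged.items()}
    return per_module, resolved
```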
[0099] As previously described, the initially determined parameter values may be adjusted or fine-tuned. For example, user preferences or ISP specific characteristics may be used to fine-tune the determined parameter values. Examples of fine-tuning may include any of the previously described processes of adjusting parameter values using personal preferences or other user input. However, the determined parameter values need not be adjusted or fine-tuned, and fine-tuning is not required.
[00100] In some example implementations for training parameter estimators for the modules or filters of the ISP, portions of an image sensitive to changes in the module’s parameter values may be used to train a corresponding parameter estimator. For example, a TE42 chart may be an input image for training a parameter estimator, and a patch of the TE42 chart may be sensitive to changes in parameter values for the corresponding ISP module. In some example implementations for determining parameters (with some parameters corresponding to multiple different filters or modules), a Siamese convolutional DeepNet may be used to fuse information from multiple patches (for multiple modules) in order to estimate the parameters. In this manner, the outputs may be stacked and convolutionally processed in training the parameter estimators. FIG. 21 is a depiction of an example flow 2100 for using different patches in training different modules and thus determining the parameter values for the ISP. As shown, each of patch pairs 1-N may correspond to the patch of the input image and the corresponding patch of the reference image. Each pair of patches may be compared to determine example module parameters 2102. The example module parameters may then be compared or otherwise combined to provide the final module parameters 2104 (such as described above regarding having differences in some module parameter values).
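A minimal sketch of such a Siamese arrangement is shown below: a single convolutional branch with shared weights encodes each (input patch, reference patch) pair, and the stacked encodings are fused to estimate module parameters. Layer widths, patch count, and parameter count are assumptions for illustration.

```python
import torch
import torch.nn as nn

class SiamesePatchFusion(nn.Module):
    """One shared convolutional branch encodes every (input patch, reference
    patch) pair; the per-pair encodings are stacked and a fusion head predicts
    the module parameters."""
    def __init__(self, num_patches: int, num_params: int):
        super().__init__()
        self.branch = nn.Sequential(                      # weights shared across all patch pairs
            nn.Conv2d(6, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.fusion = nn.Linear(32 * num_patches, num_params)

    def forward(self, input_patches, reference_patches):
        # input_patches, reference_patches: (batch, num_patches, 3, H, W)
        encodings = []
        for p in range(input_patches.shape[1]):
            pair = torch.cat([input_patches[:, p], reference_patches[:, p]], dim=1)
            encodings.append(self.branch(pair))
        return self.fusion(torch.cat(encodings, dim=1))   # fused estimate of module parameters

model = SiamesePatchFusion(num_patches=4, num_params=8)
out = model(torch.rand(2, 4, 3, 64, 64), torch.rand(2, 4, 3, 64, 64))
print(out.shape)   # torch.Size([2, 8])
```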
[00101] The techniques described herein may be implemented in hardware, software, firmware, or any combination thereof, unless specifically described as being implemented in a specific manner. Any features described as modules or components may also be implemented together in an integrated logic device or separately as discrete but interoperable logic devices. If implemented in software, the techniques may be realized at least in part by a non-transitory processor-readable storage medium (such as the memory 106 in the example device 100 of FIG. 1) comprising instructions 108 that, when executed by the processor 104, cause the device 100 to perform one or more of the methods described above. The non-transitory processor-readable data storage medium may form part of a computer program product, which may include packaging materials.
[00102] The non-transitory processor-readable storage medium may comprise random access memory (RAM) such as synchronous dynamic random access memory (SDRAM), read only memory (ROM), non-volatile random access memory (NVRAM), electrically erasable programmable read-only memory (EEPROM), FLASH memory, other known storage media, and the like. The techniques additionally, or alternatively, may be realized at least in part by a processor-readable communication medium that carries or communicates code in the form of instructions or data structures and that can be accessed, read, and/or executed by a computer or other processor.
[00103] The various illustrative logical blocks, modules, circuits and instructions described in connection with the embodiments disclosed herein may be executed by one or more processors, such as the processor 104 of FIG. 1. Such processor(s) may include but are not limited to one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), application specific instruction set processors (ASIPs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. The term “processor,” as used herein may refer to any of the foregoing structures or any other structure suitable for implementation of the techniques described herein. In addition, in some aspects, the functionality described herein may be provided within dedicated software modules or hardware modules configured as described
herein. Also, the techniques could be fully implemented in one or more circuits or logic elements. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
[00104] While the present disclosure shows illustrative aspects, it should be noted that various changes and modifications could be made herein without departing from the scope of the appended claims. Additionally, the functions, steps or actions of the method claims in accordance with aspects described herein need not be performed in any particular order unless expressly stated otherwise. For example, the steps of the described example operations may be performed in any order and at any frequency. Furthermore, although elements may be described or claimed in the singular, the plural is contemplated unless limitation to the singular is explicitly stated. Accordingly, the disclosure is not limited to the illustrated examples and any means for performing the functionality described herein are included in aspects of the disclosure.

Claims

CLAIMS What is claimed is:
1. A method for tuning an image signal processor (ISP), comprising:
receiving, by a device, an input image to be processed;
receiving, by the device, a reference image that is a processed image of the input image by a second ISP; and
determining one or more parameter values to be used by the ISP in processing the input image based on one or more differences between the input image and the reference image.
2. The method of claim 1, wherein determining the one or more parameter values comprises determining a set of parameter values to reduce differences between a processed input image not yet provided by the ISP and the reference image.
3. The method of claim 2, wherein determining the set of parameter values comprises: providing the input image to a parameter estimator;
providing the reference image to the parameter estimator; and
receiving the set of parameter values from the parameter estimator in response to providing the input image and the reference image to the parameter estimator, wherein the parameter estimator is previously trained using a plurality of previous input images and a plurality of previous processed images corresponding to the plurality of previous input images.
4. The method of claim 3, wherein determining the set of parameter values further comprises estimating the set of parameter values by the parameter estimator, wherein the parameter estimator is a neural network trained to determine relationships between parameters for the ISP.
5. The method of claim 4, wherein estimating the set of parameter values is non-recursive.
6. The method of claim 5, further comprising configuring the ISP with the set of parameter values for processing images received by the ISP.
7. The method of claim 6, further comprising capturing the input image by a camera sensor of the device, wherein:
the device includes the ISP coupled to the camera sensor and is configured to process images from the camera sensor; and
the set of parameter values corresponds to a pairing of the camera sensor and the ISP.
8. The method of claim 7, further comprising:
processing, by the ISP, the input image using the set of parameter values received from the parameter estimator; and
displaying the processed input image on a display of the device.
9. The method of claim 8, further comprising:
storing the parameter estimator in a memory of the device; and
executing, by one or more applications processors of the device coupled to the ISP, the parameter estimator to estimate the set of parameter values.
10. The method of claim 9, further comprising performing wireless communications via one or more wireless transceivers and a baseband processor of the device.
11. A device configured to tune an image signal processor (ISP), comprising:
one or more processors configured to:
receive an input image to be processed;
receive a reference image that is a processed image of the input image by a second ISP; and
determine one or more parameter values to be used by the ISP in processing the input image based on one or more differences between the input image and the reference image.
12. The device of claim 11, wherein the one or more processors, in determining the one or more parameter values, are configured to determine a set of parameter values to reduce differences between a processed input image not yet provided by the ISP and the reference image.
13. The device of claim 12, wherein the one or more processors, in determining the set of parameter values, are configured to:
provide the input image to a parameter estimator;
provide the reference image to the parameter estimator; and
receive the set of parameter values from the parameter estimator in response to providing the input image and the reference image to the parameter estimator, wherein the parameter estimator is previously trained using a plurality of previous input images and a plurality of previous processed images corresponding to the plurality of previous input images.
14. The device of claim 13, further comprising the parameter estimator configured to estimate the set of parameter values, wherein the parameter estimator is a neural network trained to determine relationships between parameters for the ISP.
15. The device of claim 14, wherein the parameter estimator is configured to estimate the set of parameter values via a non-recursive operation.
16. The device of claim 15, wherein the one or more processors are further configured to configure the ISP with the set of parameter values for processing images received by the ISP.
17. The device of claim 16, further comprising:
a camera sensor configured to capture the input image; and
the ISP coupled to the camera sensor and configured to process images from the camera sensor, wherein the set of parameter values corresponds to a pairing of the camera sensor and the ISP.
18. The device of claim 17, further comprising a display, wherein the ISP is configured to process the input image using the set of parameter values received from the parameter estimator and the display is configured to display the processed input image.
19. The device of claim 18, further comprising a memory configured to store the parameter estimator, wherein the one or more processors are one or more applications processors coupled to the ISP and configured to execute the parameter estimator to estimate the set of parameter values.
20. The device of claim 19, further comprising one or more wireless transceivers and a baseband processor configured to perform wireless communications.
21. A non-transitory computer-readable medium storing one or more programs containing instructions that, when executed by one or more processors of a device, cause the device to tune an image signal processor (ISP), comprising:
receiving an input image to be processed;
receiving a reference image that is a processed image of the input image by a second ISP; and determining one or more parameter values to be used by the ISP in processing the input image based on one or more differences between the input image and the reference image.
22. The computer-readable medium of claim 21, wherein the instructions further cause the device to:
provide the input image to a parameter estimator;
provide the reference image to the parameter estimator; and
receive a set of parameter values from the parameter estimator in response to providing the input image and the reference image to the parameter estimator, wherein the parameter estimator is previously trained using a plurality of previous input images and a plurality of previous processed images corresponding to the plurality of previous input images to estimate the set of parameter values to reduce differences between a processed input image not yet provided by the ISP and the reference image.
23. The computer-readable medium of claim 22, wherein the instructions further cause the device to estimate the set of parameter values using the parameter estimator in a non-recursive manner, wherein the parameter estimator is a neural network trained to determine relationships between parameters for the ISP.
24. The computer-readable medium of claim 23, wherein the instructions further cause the device to configure the ISP with the set of parameter values for processing images received by the ISP.
25. The computer-readable medium of claim 24, wherein the instructions further cause the device to capture the input image by a camera sensor of the device, wherein the device includes the ISP coupled to the camera sensor and is configured to process images from the camera sensor, and further wherein the set of parameter values corresponds to a pairing of the camera sensor and the ISP.
26. A device configured to tune an image signal processor (ISP), comprising:
means for receiving an input image to be processed;
means for receiving a reference image that is a processed image of the input image by a second ISP; and
means for determining one or more parameter values to be used by the ISP in processing the input image based on one or more differences between the input image and the reference image.
27. The device of claim 26, further comprising:
means for providing the input image to a parameter estimator;
means for providing the reference image to the parameter estimator; and
means for receiving a set of parameter values from the parameter estimator in response to providing the input image and the reference image to the parameter estimator, wherein the parameter estimator is previously trained using a plurality of previous input images and a plurality of previous processed images corresponding to the plurality of previous input images to estimate the set of parameter values to reduce differences between a processed input image not yet provided by the ISP and the reference image.
28. The device of claim 27, wherein the parameter estimator is configured to perform a non-recursive operation in estimating the set of parameter values, wherein the parameter estimator is a neural network trained to determine relationships between parameters for the ISP.
29. The device of claim 28, further comprising means for configuring the ISP with the set of parameter values for processing images received by the ISP.
PCT/US2019/015823 2018-01-30 2019-01-30 Systems and methods for image signal processor tuning using a reference image WO2019152499A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201980010542.8A CN111656781A (en) 2018-01-30 2019-01-30 System and method for image signal processor tuning using reference images

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN201841003400 2018-01-30
IN201841003400 2018-01-30

Publications (1)

Publication Number Publication Date
WO2019152499A1 true WO2019152499A1 (en) 2019-08-08

Family

ID=65516760

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2019/015823 WO2019152499A1 (en) 2018-01-30 2019-01-30 Systems and methods for image signal processor tuning using a reference image

Country Status (2)

Country Link
CN (1) CN111656781A (en)
WO (1) WO2019152499A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021201993A1 (en) * 2020-03-30 2021-10-07 Qualcomm Incorporated Automated camera tuning
WO2023025063A1 (en) * 2021-08-23 2023-03-02 索尼集团公司 Image signal processor optimization method and device
WO2024081761A1 (en) * 2022-10-14 2024-04-18 Motional Ad Llc Cascade camera tuning
WO2024082183A1 (en) * 2022-10-19 2024-04-25 华为技术有限公司 Parameter adjustment method and apparatus, and intelligent terminal
US12062151B2 (en) * 2020-03-11 2024-08-13 Mediatek Inc. Image-guided adjustment to super-resolution operations

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4057141A1 (en) * 2021-03-08 2022-09-14 Beijing Xiaomi Mobile Software Co., Ltd. A method for optimizing image signal processing

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010129893A2 (en) * 2009-05-08 2010-11-11 Qualcomm Incorporated Systems, methods, and apparatus for camera tuning and systems, methods, and apparatus for reference pattern generation
WO2012059618A1 (en) * 2010-11-01 2012-05-10 Nokia Corporation Tuning of digital image quality
US20150036018A1 (en) * 2013-08-01 2015-02-05 Mediatek Inc. Method and apparatus for tuning camera correction setting for camera module

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170029185A (en) * 2015-09-07 2017-03-15 삼성전자주식회사 Auto-tuning method for operation parameters of image signal processor
US9916525B2 (en) * 2015-10-13 2018-03-13 Siemens Healthcare Gmbh Learning-based framework for personalized image quality evaluation and optimization

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010129893A2 (en) * 2009-05-08 2010-11-11 Qualcomm Incorporated Systems, methods, and apparatus for camera tuning and systems, methods, and apparatus for reference pattern generation
WO2012059618A1 (en) * 2010-11-01 2012-05-10 Nokia Corporation Tuning of digital image quality
US20150036018A1 (en) * 2013-08-01 2015-02-05 Mediatek Inc. Method and apparatus for tuning camera correction setting for camera module

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HAOMIAO JIANG ET AL: "Learning the image processing pipeline", ARXIV.ORG, CORNELL UNIVERSITY LIBRARY, 201 OLIN LIBRARY CORNELL UNIVERSITY ITHACA, NY 14853, 30 May 2016 (2016-05-30), XP080704560 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12062151B2 (en) * 2020-03-11 2024-08-13 Mediatek Inc. Image-guided adjustment to super-resolution operations
WO2021201993A1 (en) * 2020-03-30 2021-10-07 Qualcomm Incorporated Automated camera tuning
CN115362502A (en) * 2020-03-30 2022-11-18 高通股份有限公司 Automatic camera commissioning
WO2023025063A1 (en) * 2021-08-23 2023-03-02 索尼集团公司 Image signal processor optimization method and device
WO2024081761A1 (en) * 2022-10-14 2024-04-18 Motional Ad Llc Cascade camera tuning
WO2024082183A1 (en) * 2022-10-19 2024-04-25 华为技术有限公司 Parameter adjustment method and apparatus, and intelligent terminal

Also Published As

Publication number Publication date
CN111656781A (en) 2020-09-11

Similar Documents

Publication Publication Date Title
WO2019152499A1 (en) Systems and methods for image signal processor tuning using a reference image
WO2019152534A1 (en) Systems and methods for image signal processor tuning
US10237527B2 (en) Convolutional color correction in digital images
EP3542347B1 (en) Fast fourier color constancy
US7343040B2 (en) Method and system for modifying a digital image taking into account it's noise
US9852499B2 (en) Automatic selection of optimum algorithms for high dynamic range image processing based on scene classification
US8730329B2 (en) Automatic adaptive image sharpening
US7587085B2 (en) Method and apparatus for red-eye detection in an acquired digital image
US9741117B2 (en) Multiple camera apparatus and method for synchronized auto white balance
KR100983037B1 (en) Method for controlling auto white balance
CN112703509A (en) Artificial intelligence techniques for image enhancement
CN102867295B (en) A kind of color correction method for color image
CN110248105A (en) A kind of image processing method, video camera and computer storage medium
CN105187728B (en) Photographic method and device
CN106506946B (en) A kind of camera automatic focusing method and video camera
US9628727B2 (en) Information processing apparatus and method, and image capturing system determining or acquiring target noise amount
CN110248170A (en) Image color method of adjustment and device
CN114331907A (en) Color shading correction method and device
WO2019152481A1 (en) Systems and methods for image signal processor tuning
KR101349968B1 (en) Image processing apparatus and method for automatically adjustment of image
CN113596422A (en) Method for adjusting color correction matrix CCM and monitoring equipment
Peltoketo SNR and Visual Noise of Mobile Phone Cameras
CN111372008B (en) Automatic brightness gain adjustment method based on video content and camera
CN111782845A (en) Image adjusting method, image adjusting device and mobile terminal
Zhang et al. Image exposure assessment: a benchmark and a deep convolutional neural networks based model

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19706823

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19706823

Country of ref document: EP

Kind code of ref document: A1