WO2019152534A1 - Systems and methods for image signal processor tuning - Google Patents

Systems and methods for image signal processor tuning Download PDF

Info

Publication number
WO2019152534A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
parameter
metrics
parameter set
determining
Prior art date
Application number
PCT/US2019/015872
Other languages
English (en)
Inventor
Pawan Kumar Baheti
Shilpi Sahu
Naveen Srinivasamurthy
Yogesh Gupta
Uday Kiran Pudipeddi
Shreyas Hampali Shivakumar
Original Assignee
Qualcomm Incorporated
Priority date
Filing date
Publication date
Application filed by Qualcomm Incorporated
Priority to CN201980009535.6A (CN111630854A)
Publication of WO2019152534A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 17/00 Diagnosis, testing or measuring for television systems or their details
    • H04N 17/002 Diagnosis, testing or measuring for television systems or their details for television cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00 General purpose image data processing
    • G06T 1/20 Processor architectures; Processor configuration, e.g. pipelining
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 Camera processing pipelines; Components thereof

Definitions

  • This disclosure relates generally to systems and methods for tuning an image signal processor, and specifically to determining one or more parameters used by an image signal processor to process an image.
  • A raw image captured by a camera sensor is processed by an image signal processor (ISP).
  • Processing may include a plurality of filters or processing blocks being applied to the captured image, such as denoising or noise filtering, edge enhancement, color balancing, contrast, intensity adjustment (such as darkening or lightening), tone adjustment, and so on.
  • Image processing blocks or modules may include lens/sensor noise correction, Bayer filters, de-mosaicing, color conversion, correction or enhancement/suppression of image attributes, denoising filters, and sharpening filters.
  • Each module may include a large number of tunable parameters (such as hundreds or thousands of parameters per module). Additionally, modules may be co-dependent as different modules may affect similar aspects of an image. For example, denoising and texture correction or enhancement may both affect high frequency aspects of an image. As a result, a large number of parameters are determined or adjusted for an ISP to generate a final image from a captured raw image.
  • the parameters for an ISP conventionally are tuned manually by an expert with experience in how to process input images for desirable output images.
  • the expert may require 3-4 weeks to determine or adjust device settings for the parameters based on a combination of a specific camera sensor and ISP. Since the camera sensor or other camera features (such as lens characteristics or imperfections, aperture size, shutter speed and movement, flash brightness and color, and so on) may impact the captured image and therefore at least some of the tunable parameters for the ISP, each combination of camera sensor and ISP would need to be tuned by an expert.
  • a device for tuning an image signal processor may include one or more processors configured to receive a reference image, determine a plurality of image quality (IQ) metrics based on the reference image, determine a value for each of the plurality of IQ metrics for the reference image, identify one or more existing parameter sets in a parameter database based on the values of the plurality of IQ metrics, and determine whether the parameter database is to be adjusted based on the one or more existing parameter sets.
  • a method for tuning an ISP includes receiving, by a processor, a reference image.
  • the method also includes determining, by the processor, a plurality of image quality (IQ) metrics based on the reference image.
  • the method further includes determining, by the processor, a value for each of the plurality of IQ metrics for the reference image.
  • the method also includes identifying, by the processor, one or more existing parameter sets in a parameter database based on the values of the plurality of IQ metrics.
  • the method further includes determining, by the processor, whether the parameter database is to be adjusted based on the one or more existing parameter sets.
  • a non-transitory computer-readable medium may store instructions that, when executed by a processor of a device configured to tune an ISP, cause the device to receive a reference image, determine a plurality of image quality (IQ) metrics based on the reference image, determine a value for each of the plurality of IQ metrics for the reference image, identify one or more existing parameter sets in a parameter database based on the values of the plurality of IQ metrics, and determine whether the parameter database is to be adjusted based on the one or more existing parameter sets.
  • a device configured to tune an ISP is disclosed.
  • the device includes means for receiving a reference image, means for determining a plurality of image quality (IQ) metrics based on the reference image, means for determining a value for each of the plurality of IQ metrics for the reference image, means for identifying one or more existing parameter sets in a parameter database based on the values of the plurality of IQ metrics, and means for determining whether the parameter database is to be adjusted based on the one or more existing parameter sets.
  • FIG. 1 is a block diagram of an example device for tuning an ISP.
  • FIG. 2 is an illustrative flow chart depicting a conventional operation for tuning an ISP for a scene type.
  • FIG. 3 is an illustrative flow chart depicting an example operation for automatically tuning an ISP.
  • FIG. 4 is an illustrative flow chart depicting an example operation for adjusting the parameter database.
  • FIG. 5 is a depiction of a relationship between texture and sharpness IQ metrics.
  • FIG. 6 is an illustrative flow chart depicting an example operation for determining new sets of parameter values for adjusting the parameter database.
  • FIG. 7 is an illustrative flow chart depicting an example operation for adjusting one or more IQ metrics in a sequential fashion in adjusting the parameters for personal preference.
  • FIG. 8 is a depiction of an example clustering of parameter sets as illustrated by a relationship of noise versus texture.
  • FIG. 9 is a depiction of an example tree branch illustration for sequentially adjusting IQ metrics.
  • FIG. 10 is a snapshot of an example GUI for adjusting an edge IQ metric.
  • FIG. 11 is a snapshot of an example GUI for adjusting a high contrast texture IQ metric.
  • FIG. 12 is a snapshot of an example GUI for adjusting a low contrast texture IQ metric.
  • FIG. 13 is a snapshot of an example GUI for adjusting a noise IQ metric.
  • FIG. 14 is a snapshot of an example GUI indicating the concatenation of selections for the different IQ metrics.
  • FIG. 15 is an illustrative flow chart depicting an example operation for using a reference image in automatically tuning an ISP.
  • FIG. 16 is an illustrative flow chart depicting an example operation for determining the closest parameter sets and adjusting the parameter database.
  • FIG. 17 is an illustrative flow chart depicting another example operation for determining new sets of parameter values for adjusting the parameter database.
  • FIG. 18 is an illustrative flow chart depicting an example operation for adding new parameter sets to the parameter database a pre-defined number of times.
  • FIG. 19 is an illustrative flow chart depicting an example operation for determining if any of the parameter sets are acceptable for processing an image.
  • FIG. 20 is an illustrative flow chart depicting an example operation for determining a different parameter set when none of the used parameter sets are deemed acceptable.
  • FIG. 21 is an illustrative flow chart depicting another example operation for determining a different parameter set when none of the used parameter sets are deemed acceptable.
  • Tuning an ISP may include determining or adjusting the parameters used by the ISP for processing an input image.
  • an expert may require weeks of testing and adjusting to determine the parameters to be used by the ISP.
  • a user may have different preferences than what an expert may consider a desirable processed image. For example, a user may prefer more color saturation, a softer image, or other characteristics different from what an expert tuning the ISP might prefer.
  • aspects of the present disclosure may be used in tuning an ISP so that less time may be required to tune the ISP and/or a person without expertise (such as a device user) may assist in tuning the ISP with his or her preferences.
  • a database of ISP parameters may be populated, adapted or updated based on user preferences. The final or updated database may then be used to provide the parameters to the ISP in processing an incoming image.
  • a single block may be described as performing a function or functions; however, in actual practice, the function or functions performed by that block may be performed in a single component or across multiple components, and/or may be performed using hardware, using software, or using a combination of hardware and software.
  • various illustrative components, blocks, modules, circuits, and steps are described below generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
  • the example devices may include components other than those shown, including well-known components such as a processor, memory and the like.
  • aspects of the present disclosure are applicable to any suitable electronic device configured to or capable of tuning an ISP (such as a security system with one or more cameras, smartphones, tablets, laptop computers, digital video and/or still cameras, web cameras, cloud computing networks, testing equipment for ISPs, fabrication facilities, testing devices to interface with ISPs, and so on). While described below with respect to a device having or coupled to one camera, aspects of the present disclosure are applicable to devices having any number of cameras (including no cameras, where images or video are provided to the device, or multiple cameras), and are therefore not limited to devices having one camera. Aspects of the present disclosure are applicable for devices capturing still images as well as for capturing video, and may be implemented in devices having or coupled to cameras of different capabilities (such as a video camera or a still image camera).
  • aspects of the present disclosure are applicable to devices coupled to or interfacing an ISP (such as manufacturing or testing equipment and test devices), and are therefore not limited to devices having an ISP.
  • the term "device" is not limited to one or a specific number of physical objects (such as one smartphone, one camera controller, one processing system and so on). As used herein, a device may be any electronic device with one or more parts that may implement at least some portions of this disclosure. While the below description and examples use the term "device" to describe various aspects of this disclosure, the term "device" is not limited to a specific configuration, type, or number of objects.
  • FIG. 1 is a block diagram of an example device 100 for tuning an ISP.
  • the example device 100 may include or be coupled to a camera 102, a processor 104, a memory 106 storing instructions 108, and a camera controller 110.
  • the device 100 may optionally include (or be coupled to) a display 114 and a number of input/output (I/O) components 116.
  • the device 100 may include additional features or components not shown.
  • a wireless interface, which may include a number of transceivers and a baseband processor, may be included for a wireless communication device.
  • the device 100 may include or be coupled to additional cameras other than the camera 102.
  • the camera 102 may be capable of capturing individual image frames (such as still images) and/or capturing video (such as a succession of captured image frames).
  • the camera 102 may include a single camera sensor and camera lens, or be a dual camera module or any other suitable module with multiple camera sensors and lenses.
  • the memory 106 may be a non-transient or non-transitory computer-readable medium storing computer-executable instructions 108 to perform all or a portion of one or more operations described in this disclosure.
  • the memory 106 may also store a parameter database 109 or a look-up table (LUT) to be used for storing and looking up the parameters for an ISP (such as ISP 112).
  • the device 100 may also include a power supply 118, which may be coupled to or integrated into the device 100.
  • the processor 104 may be one or more suitable processors capable of executing scripts or instructions of one or more software programs (such as instructions 108) stored within the memory 106.
  • the processor 104 may be one or more general purpose processors that execute instructions 108 to cause the device 100 to perform any number of functions or operations.
  • the processor 104 may include integrated circuits or other hardware to perform functions or operations without the use of software. While shown to be coupled to each other via the processor 104 in the example of FIG. 1, the processor 104, the memory 106, the camera controller 110, the optional display 114, and the optional I/O components 116 may be coupled to one another in various arrangements. For example, the processor 104, the memory 106, the camera controller 110, the optional display 114, and/or the optional I/O components 116 may be coupled to each other via one or more local buses (not shown for simplicity).
  • the display 114 may be any suitable display or screen allowing for user interaction and/or to present items (such as captured images, video, or a preview image) for viewing by a user.
  • the display 114 may be a touch-sensitive display.
  • the I/O components 116 may be or include any suitable mechanism, interface, or device to receive input (such as commands) from the user and to provide output to the user.
  • the I/O components 116 may include (but are not limited to) a graphical user interface, keyboard, mouse, microphone and speakers, and so on.
  • the display 114 and/or the I/O components 116 may provide a preview image to a user and/or receive a user input for adjusting one or more settings of the camera 102 (such as selecting and/or deselecting a region of interest of a displayed preview image for an AF operation).
  • the camera controller 110 may include an ISP 112, which may be one or more image signal processors to process captured image frames or video provided by the camera 102.
  • the camera controller 110 (such as the ISP 112) may also control operation of the camera 102.
  • the ISP 112 may process received images using parameters provided from the parameter database 109.
  • the processor 104 may determine the parameters from the parameter database 109 to be used by the ISP 112.
  • the ISP 112 may execute instructions from a memory to process image frames or video, may include specific hardware to process image frames or video, or additionally or alternatively may include a combination of specific hardware and the ability to execute software instructions for processing image frames or video.
  • images may be received by the device 100 from sources other than a camera, such as other devices, equipment, network attached storage, and so on.
  • the device 100 may be a testing device where the ISP 112 is removable so that another ISP may be coupled to the device 100 (such as a test device, testing equipment, and so on). While the following examples are described regarding device 100 and ISP 112, the present disclosure should not be limited to a specific type of device or hardware configuration for tuning an ISP.
  • IQ metrics are measurements of perceivable attributes of an image (with each perceivable attribute called a "ness").
  • Example nesses are the luminance of an image, the sharpness of an image, the graininess of an image, the tone of an image, the color saturation of an image, and so on, and are perceived by a person if changed for an image.
  • the number of IQ metrics may be 10-20, with each IQ metric corresponding to a plurality of tunable parameters. Additionally, two different IQ metrics may affect some of the same tunable parameters for the ISP 112.
  • the parameter database 109 may correlate different values of IQ metrics to different values for the parameters. For example, an input vector of IQ metrics may be associated with an output vector of tunable parameters so that an ISP 112 may be tuned for the corresponding IQ metrics. Since the number of parameters may be large, the parameter database 109 may not store all combinations of IQ metrics, but instead include a portion of the number of combinations.
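As an illustration only (the names, types, and values below are assumptions, not the database layout described in this disclosure), a parameter database entry can be pictured as a vector of IQ metric values paired with a vector of tunable ISP parameter values:

```python
# Hypothetical sketch of a parameter database entry: a vector of IQ metric
# values associated with a vector of tunable ISP parameter values. All names
# and values are illustrative assumptions.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class ParameterSet:
    iq_metrics: Dict[str, float]   # e.g., {"texture": 0.62, "edge": 0.71, "noise": 0.18}
    params: Dict[str, float]       # tunable ISP parameter values keyed by name

# The database holds many such sets but, as noted above, only a portion of all
# possible IQ metric combinations.
parameter_database: List[ParameterSet] = [
    ParameterSet(iq_metrics={"texture": 0.62, "edge": 0.71, "noise": 0.18},
                 params={"denoise_strength": 0.35, "sharpen_gain": 1.4, "tone_gamma": 2.2}),
    ParameterSet(iq_metrics={"texture": 0.48, "edge": 0.55, "noise": 0.09},
                 params={"denoise_strength": 0.60, "sharpen_gain": 1.1, "tone_gamma": 2.2}),
]
```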
  • While the memory 106 and parameter database 109 are shown to be included in device 100, the database may be stored outside of the device 100 (such as in a network attached storage, cloud storage, testing equipment coupled to device 100, and so on).
  • the present disclosure should not be limited to device 100 or a specific implementation of parameter database 109 or memory 106.
  • the parameters may also impact components outside of the ISP 112 (such as the camera 102), and the present disclosure should not be limited to specific described parameters or parameters specific only to the ISP.
  • the parameters may be for a specific ISP and camera (or camera sensor) combination.
  • An IQ model may be used to map the IQ metrics to the tunable parameters. Any type of IQ model may be used, and the present disclosure is not limited to a specific IQ model for correlating IQ metrics to ISP parameters.
  • the IQ model may include one or more modulation transfer functions (MTFs) to determine changes in the ISP parameters associated with a change in an IQ metric.
  • changing a luminance IQ metric may correspond to parameters associated with adjusting a camera sensor sensitivity, shutter speed, flash, the ISP determining an intensity for each pixel of an incoming image, the ISP adjusting the tone or color balance of each pixel for compensation, and so on.
  • a luminance MTF may be used to indicate that a change in the luminance IQ metric corresponds to specific changes in the correlating parameters.
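As a rough sketch of the idea (the locally linear form, the sensitivities, and the parameter names below are assumptions; the disclosure does not specify how an MTF is represented), a luminance MTF might translate a change in the luminance IQ metric into changes in its correlated parameters:

```python
# Minimal sketch of an MTF-style mapping, assuming a locally linear model:
# a change in one IQ metric is translated into changes in the parameters that
# correlate with that metric. Sensitivities and parameter names are invented
# for illustration only.
from typing import Dict

LUMINANCE_SENSITIVITIES: Dict[str, float] = {
    "sensor_gain": 0.8,        # change in parameter per unit change in the luminance metric
    "tone_gamma": -0.3,
    "flash_strength": 0.5,
}

def apply_luminance_mtf(params: Dict[str, float], delta_luminance: float) -> Dict[str, float]:
    """Return a copy of params adjusted for a change in the luminance IQ metric."""
    adjusted = dict(params)
    for name, sensitivity in LUMINANCE_SENSITIVITIES.items():
        if name in adjusted:
            adjusted[name] += sensitivity * delta_luminance
    return adjusted
```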
  • the IQ model or MTFs may vary between different ISPs or vary between different combinations of ISPs and cameras (or camera sensors).
  • tuning the ISP may comprise determining the differences in MTFs or the IQ model so that the IQ metric values are correlated to preferred tunable parameter values for the ISP (in the parameter database 109). Since an "optimally" processed image may be based on user preference or may be subjective for one or more experts, the optimization of an IQ model may be open ended and subject to differences between users or persons assisting with the tuning. However, there are attempts to quantify an IQ, such as by using an IQ scale (such as from 0 to 100, with 100 being the best) to indicate the IQ performance for an ISP and/or a camera.
  • the IQ for a processed image is quantified, and an expert may use such quantification to tune an ISP (such as by adjusting or determining the parameters for the ISP or the combination of the ISP and camera).
  • Some IQ metrics may be opposed to one another, such as noisiness and texture, where reducing or increasing the noise may correspondingly reduce or increase the high frequency texture information in an image.
  • trade-offs are determined between IQ metrics to attempt to optimize processing of an image (such as by generating the highest quantified IQ score from an IQ scale).
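One purely illustrative way to picture such a quantified IQ score (the weighted linear form, the weights, and the normalization are assumptions; the disclosure does not define the scoring function) is a weighted combination of normalized IQ metric values scaled to a 0-100 range:

```python
# Illustrative IQ score on a 0-100 scale as a weighted combination of IQ metric
# values assumed to be normalized to [0, 1] with higher meaning better. The
# linear form and the weights are assumptions used only to show how trade-offs
# between metrics could be scored.
from typing import Dict

def iq_score(metrics: Dict[str, float], weights: Dict[str, float]) -> float:
    total_weight = sum(weights.values())
    weighted = sum(weights[name] * metrics.get(name, 0.0) for name in weights)
    return 100.0 * weighted / total_weight

# Example: weighting the noise-related metric more heavily than texture or edge.
print(iq_score({"texture": 0.7, "edge": 0.8, "noise": 0.6},
               {"texture": 1.0, "edge": 1.0, "noise": 2.0}))  # ~67.5
```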
  • Optimizing the IQ metrics or otherwise tuning an ISP may differ for different scene types. For example, indoor scenes illuminated by incandescent lighting may correspond to different “optimal” IQ metrics (and corresponding parameters) than outdoor scenes with bright natural lighting.
  • a scene with large flat fields of color and luminance may correspond to different “optimal” IQ metrics than a scene with large numbers of colors and variances in color within a field.
  • an ISP may be tuned for a plurality of different scene types.
  • FIG. 2 is an illustrative flow chart depicting a conventional operation 200 for tuning an ISP for a scene type.
  • An initial set of parameter values for the ISP is used in processing one or more received images (202).
  • An expert then inspects the original and processed images to determine how the parameters should be adjusted (204). Through inspection of the images (204), the expert determines the parameters to be adjusted and the amount of adjustment (206). For example, the expert may determine the IQ metrics to be adjusted and the amount of adjustment, and one or more MTFs for the adjusted IQ metrics may be used to determine the amount of adjustment for corresponding ISP parameters.
  • the parameters are adjusted (208), and the adjusted parameters are used for the ISP to again process one or more images (210).
  • the process reverts to 204, with the expert repeatedly inspecting the images, adjusting the parameters, and the ISP processing images with the adjusted parameters until the expert is satisfied with the processed images.
  • the parameter values may be stored in a parameter database (such as database 209) for the scene type. Multiple sets of parameter values may be stored for a scene type, and/or the stored sets of parameter values may correspond to discrete differences in one or more IQ metrics.
  • At least a portion of the ISP is automatically tuned by a device.
  • the time for tuning an ISP may be reduced.
  • Automatically tuning the ISP may also take into account user preferences to tune an ISP for a user’s preferences instead of an expert (therefore providing images more preferable to the user).
  • the automatic tuning of an ISP may be performed during device or ISP design, manufacture or testing, which may include assisting an expert in tuning the ISP.
  • the automatic tuning of an ISP may be performed by an end user’s device, such as a smartphone, tablet, or other device including and/or in communication with one or more ISPs (such as device 100 including ISP 112).
  • an ISP 112 may have been tuned previously by an expert, with the parameter database 109 populated with parameter values to be used for different scene types.
  • Automatically tuning with user input may update the ISP tuning so that the parameter database 109 may be updated to include parameter values preferred by the user (such as by densifying the parameter database 109 with additional vectors of parameter values or adjusting existing vectors of parameter values).
  • the MTFs may be updated through the automatic tune procedure to better correlate parameters with IQ metrics.
  • the automatic tuning may include software, special hardware, or a combination of both.
  • automatically tuning may include an application or software to be executed by processor 104 for populating or updating the parameter database 109 of device 100.
  • a person (such as a tuning expert and/or a user of a given device) may be presented with different possible processed images to determine which images the person prefers and therefore which IQ metrics may be of more importance to the person in tuning the ISP. Additionally, or alternatively, a person may select the IQ metrics of importance to him or her, and the device may present possible processed images for different values of the IQ metrics to determine the person's preference and therefore improve the tuning of the ISP for the person.
  • FIG. 3 is an illustrative flow chart depicting an example operation 300 for automatically tuning an ISP.
  • one or more images may be received.
  • values for parameters that are fixed for an ISP optionally are determined (304).
  • sensor or module specific parameter values such as some parameters for black level, lens roll-off, gamma, color, etc., may not change for different scene types.
  • the parameter values may therefore be determined separate from automatically tuning the ISP (such as determining values for non-fixed parameters).
  • step 304 may not be performed.
  • the ISP may then be automatically tuned using the received images (306).
  • the parameter database and/or the MTFs for an IQ model may be populated or adjusted using the received images (308).
  • relationships and trade-offs between IQ metrics or parameters may be determined or defined for the received images.
  • One example relationship is texture vs. edge sharpness for an image. Preserving edges in an image may also preserve texture or other high frequency information in an image.
  • Another example relationship is noise vs. texture. Preserving texture or high frequency information may also result in more noise being present within an image.
  • a further example relationship is color vs. tone.
  • tone adjustment may impact the color values for the pixels of the image (such as skewing one or more red, green, or blue values of a pixel when adjusting the tone of the image).
  • the IQ model to quantify IQ may be used to determine different example values for the parameter set (based on the determined trade-offs) for producing processed images with high IQ scores (such as greater than a predetermined or adjustable threshold, greater than an IQ score for a previous processed image, etc.).
  • parameter values for the ISP for different scene types may be determined based on personal preference (310). For example, a person may be provided (e.g., presented for selection) choices with perceptible differences in processed images of the received images in order to assist in determining a person's preferences. The preferences selected by the person may then be used to densify the parameter database (e.g., populate additional data points), adjust the parameter database (e.g., adjust existing data points), set (e.g., configure or determine) the parameter values for the ISP for processing images, or perform a combination of two or more of the operations.
  • the parameter database 109 may include sets of parameter values previously determined to cause an ISP to generate a "high-quality" image (e.g., as designated or determined based on an IQ score equaling or exceeding a threshold score). Each set of parameter values may be associated with IQ metric values.
  • the database 109 may be organized or have multiple organization structures so that vectors with similar IQ metrics may be grouped together. For example, the database 109 may be indexed or organized so that sets with similar texture ness values may be identified. As described in FIG. 3, the parameter database 109 may be adjusted or updated for automatically tuning the ISP.
  • FIG. 4 is an illustrative flow chart depicting an example operation 400 for adjusting the parameter database.
  • one or more images for processing by an ISP are received or otherwise made available.
  • the images may be raw images captured by a camera sensor with noise and luminance characteristics that may impact processing.
  • one or more personal preferences (such as preferences of the expert and/or a user for a final processed image) may optionally be received (404).
  • Example preferences may include preferences regarding color saturation, tone, noisiness, etc. of the person for the processed images.
  • a device may then determine whether an existing parameter database (with one or more previously determined sets of parameter values) is to be adjusted based on the characteristics for the camera sensor and/or the personal preferences (406).
  • an insufficient number of sets of parameter values may be determined to exist in the parameter database.
  • the existing sets may be determined to insufficiently correlate to the camera sensor used for capturing the received images.
  • a scene type of a received image may not be covered by the existing parameter database.
  • the existing parameter database may be used without adjustment (410).
  • the received images may be evaluated using the existing sets of parameter values in the parameter database (412).
  • one or more relationships among IQ metrics may be analyzed using the received images (414). For example, the scatter of IQ metric relationships for texture versus edge sharpness (based on the existing sets of parameter values and the received images for processing) may be analyzed.
  • One or more new sets of parameter values may then be determined based on the analyzed relationships (416).
  • the relationship between edge sharpness and texture IQ metrics may be used to determine new sets of parameter values for different sharpness and texture IQ metrics that still provide a sufficient IQ score for a processed image.
  • the new sets of parameter values may also be used to better define tradeoffs for IQ metrics for the IQ model. For example, new sets of parameter values may indicate tradeoffs between a noisiness IQ metric and a texture IQ metric.
  • the one or more new sets of parameter values may then be determined to be added to the parameter database (418), thus densifying the parameter database.
  • an existing set of parameter values may be amended based on a new set of parameter values determined.
  • FIG. 5 is a depiction of a relationship 500 between texture and sharpness IQ metrics.
  • Existing points 502 indicating the relationship between the nesses may be from the existing sets of parameter values corresponding to different texture and sharpness IQ metrics.
  • a plurality of new parameter value sets for different texture and sharpness IQ metrics may be determined using the received images (so as to have a sufficient IQ score for a processed image).
  • the new sets may correspond to new points 504 on the relationship 500 between texture and sharpness IQ metrics, which may better indicate tradeoffs between IQ metrics.
  • While the relationship 500 is depicted as a graph of two nesses, the relationship may be between any number of nesses and therefore any number of dimensions.
  • Determining new sets of parameter values may be based on existing sets of parameter values in the parameter database. For example, an existing set of parameter values (a parent set) may be adjusted in order to create one or more new sets of parameter values (called child sets or children sets).
  • FIG. 6 is an illustrative flow chart depicting an example operation 600 for determining new sets of parameter values for adjusting the parameter database.
  • a space of near IQ metrics for an existing parent set is determined. For example, a determined distance away from a parent set may be a determined space. Triangulation or sum of differences are example methods for determining a distance, but the space may be determined in any suitable way. Graphically for 3 nesses, a cube or sphere may be determined around a parent set, where potential child sets may exist within the cube or sphere (space). In another example, a sphere or other suitable shape may be determined around a parent set.
  • a child set may be determined by interpolating parameters values between the parent set and an existing set (such as described regarding 604-608). In some other example implementations, a child set may be determined by perturbing or adjusting parameters of the parent set within the space (such as described regarding 610). In some further example implementations, a combination of interpolating and perturbing may be performed. For example, some child sets may be created through perturbation, then additional child sets may be created through interpolating between the previous child sets and the parent set. In another example, an interpolated child set’s parameters may be perturbed within a space to adjust the child set or create new child sets.
  • the furthest neighbor from the parent set in the space is used for interpolation.
  • any neighbor may be used for interpolation in other examples.
  • the distances between the parent set and existing sets in the space may be determined.
  • the furthest set from the parent set may then be determined based on the distances (606).
  • the space may be defined in dimensions of nesses, and a distance may be the combined difference in nesses between the sets.
  • the differences in parameter values between the furthest set and the parent set may be considered the maximum adjustments to the parameter values for the parent set in creating children sets.
  • any resulting child set may be configured to be within the space.
  • one or more parameter values from the parent set may be adjusted with an interpolated difference between the furthest neighbor and the parent set (608).
  • only a subset of the IQ metrics may be determined to be adjusted.
  • the corresponding parameters for the subset of IQ metrics may be adjusted through interpolation.
  • child parameter = parent parameter + α · (neighbor parameter − parent parameter)    (1)
  • α is a value between 0 and 1.
  • α may be constant for all parameters to be adjusted.
  • In this manner, the factor of adjustment for the parameters being adjusted is the same. For example, based on all parameters being adjusted, the child set is as depicted in equation (2) below:
  • child set = parent set + α · (neighbor set − parent set)    (2)
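A minimal sketch of the interpolation in equations (1) and (2), assuming the dictionary-based parameter sets from the earlier sketch (the parameter names and the example α are illustrative):

```python
# Sketch of equations (1)/(2): interpolate between a parent parameter set and
# its furthest neighbor within the space to create a child set. The dict-based
# representation and the example values are assumptions for illustration.
from typing import Dict

def interpolate_child(parent: Dict[str, float],
                      neighbor: Dict[str, float],
                      alpha: float) -> Dict[str, float]:
    """child = parent + alpha * (neighbor - parent), with 0 <= alpha <= 1."""
    if not 0.0 <= alpha <= 1.0:
        raise ValueError("alpha must be between 0 and 1")
    return {name: parent[name] + alpha * (neighbor[name] - parent[name])
            for name in parent}

parent = {"denoise_strength": 0.35, "sharpen_gain": 1.4}
furthest_neighbor = {"denoise_strength": 0.60, "sharpen_gain": 1.1}
print(interpolate_child(parent, furthest_neighbor, alpha=0.5))
# approximately {'denoise_strength': 0.475, 'sharpen_gain': 1.25}
```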
  • a new set may be determined by adjusting or perturbing one or more parameters of the parent set (610).
  • the sparsity of sets around the parent set may be determined, with the sparsity used to determine the factor by which to adjust one or more parameters.
  • a sparsity cost for a parent set may be a distance between the parent set and a distribution of existing sets in the space or across the group. For example, the Mahalanobis distance between the parent set and its existing neighbors in the space may be determined as the sparsity cost. The distance may also be determined for each existing set and an average distance determined for the existing sets across the entire group (which may be an average cost for the group).
  • the factor for adjusting parameters may be as depicted in equation (3) below:
  • where x is the parent set sparsity cost and c is the average sparsity cost for the entire group. If the sparsity around the parent set is greater than the average sparsity (fewer neighbors surround the parent set than is typical), then adjustments to the parameters may be smaller so that the corresponding IQ metrics are within the space. Conversely, if the sparsity around the parent set is less than the average sparsity (more neighbors surround the parent set than is typical), then adjustments to the parameters may be greater since the greater number of neighbors indicates that the corresponding IQ metrics for greater adjustments should still be within the space.
  • the size of the window for adjusting a parameter may be a standard deviation of the parameter for the entire group times the factor, and the window may be centered at the parameter value for the parent set. If the sparsity around the parent set is greater than or equal to the average sparsity (fewer or the same number of neighbors surround the parent set as is typical), the window size may be approximately one standard deviation. Conversely, if the sparsity around the parent set is less than the average sparsity, the window size may be multiple standard deviations.
  • a parameter value is randomly or pseudo-randomly selected from the window.
  • related parameters (such as parameters associated with an IQ metric) may be adjusted by a similar factor, where a same position in the window is used for each related parameter.
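A rough sketch of the perturbation step follows. Because equation (3) is not reproduced above, the adjustment factor used here (average sparsity cost divided by the parent's sparsity cost, floored at 1) is only an assumed stand-in that matches the described behavior, and for simplicity each parameter is drawn independently rather than reusing the same window position for related parameters:

```python
# Sketch of creating a child set by perturbing a parent set. The factor below
# is an assumed stand-in for equation (3): roughly one standard deviation of
# adjustment when the parent's neighborhood is sparse, several standard
# deviations when it is dense. All names and values are illustrative.
import random
import statistics
from typing import Dict, List

def perturb_child(parent: Dict[str, float],
                  group: List[Dict[str, float]],
                  parent_sparsity_cost: float,
                  average_sparsity_cost: float,
                  rng: random.Random) -> Dict[str, float]:
    factor = max(1.0, average_sparsity_cost / parent_sparsity_cost)
    child = {}
    for name, value in parent.items():
        stdev = statistics.pstdev([s[name] for s in group])  # spread across the whole group
        window = factor * stdev                               # window centered on the parent value
        child[name] = value + rng.uniform(-window / 2.0, window / 2.0)
    return child

group = [{"denoise_strength": 0.35, "sharpen_gain": 1.4},
         {"denoise_strength": 0.60, "sharpen_gain": 1.1},
         {"denoise_strength": 0.50, "sharpen_gain": 1.3}]
print(perturb_child(group[0], group, parent_sparsity_cost=0.8,
                    average_sparsity_cost=0.5, rng=random.Random(0)))
```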
  • the IQ metrics for each potential child set may be determined (612).
  • the received image(s) may be processed by the ISP using the child parameter values, and IQ metrics may be calculated from the processed image(s).
  • a determination may then be made whether the IQ metrics are valid (614).
  • the IQ metrics are compared to the IQ metrics for existing sets in the parameter database to determine if they are consistent. If a portion of the IQ metrics are outliers (e.g., not included among the IQ metrics of the existing sets in the parameter database), the IQ metrics may be considered invalid.
  • an IQ score may be computed for a processed image. If the image score is sufficient, such as greater than a threshold, the IQ metrics are considered valid.
  • Other suitable processes for determining the validity of the IQ metrics may be used, and the present disclosure should not be limited to specific examples.
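As one purely illustrative form of such a check (the range-based outlier test and the threshold value are assumptions about how "consistent" metrics and a "sufficient" score might be operationalized):

```python
# Illustrative validity check for a child set's IQ metrics: reject any metric
# that falls outside the range observed across the existing sets, then require
# an IQ score above a threshold. Both criteria are assumptions used only to
# illustrate the idea, not the disclosed tests.
from typing import Dict, List

def metrics_valid(child_metrics: Dict[str, float],
                  existing_metrics: List[Dict[str, float]],
                  child_score: float,
                  score_threshold: float = 80.0) -> bool:
    for name, value in child_metrics.items():
        observed = [m[name] for m in existing_metrics if name in m]
        if observed and not (min(observed) <= value <= max(observed)):
            return False               # outlier relative to the existing sets
    return child_score >= score_threshold
```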
  • a display may provide (e.g., display) different processed images for a varying IQ metric, and a mechanism for receiving user input (e.g., a GUI, or a camera or microphone) may allow a user to select the preferred processed images to indicate the preferences for the IQ metric.
  • FIG. 7 is an illustrative flow chart depicting an example operation 700 for adjusting one or more IQ metrics in a sequential fashion in adjusting the parameters for personal preference.
  • the process may be used to indicate which parameter sets from the parameter database are preferred by the user for the ISP (or ISP and camera combination).
  • the IQ metrics to be adjusted for a user are determined.
  • a user may indicate which IQ metrics are of particular importance to that particular user.
  • the IQ metrics may be for a particular scene or generally for all scenes.
  • the parameter sets of the parameter database may then be clustered or grouped for each of the IQ metrics to be adjusted (704).
  • FIG. 8 is a depiction of an example clustering of parameter sets as illustrated by a relationship of noise versus texture. As shown, the parameter sets are clustered into three groups: low noise and texture 802, medium noise and texture 804, and high noise and texture 806. While three groups are shown, any number of clusterings may exist.
  • the groupings or clusterings indicate the sets with close IQ metrics (such as IQ metrics within a determined distance of one another). For example, the three clusterings indicate that the noise IQ metric and the texture IQ metric are similar for the parameter sets in a cluster. While not shown, one or more parameter sets may not be clustered and may be removed from consideration for the final parameter set to be used by the ISP.
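A small sketch of such a grouping (a greedy, threshold-based clustering; the distance threshold and the greedy strategy are assumptions, since the disclosure only requires that sets with close IQ metrics be grouped):

```python
# Sketch of grouping parameter sets whose IQ metric vectors lie within a set
# distance of one another. Sets that end up alone could be dropped from
# consideration, as noted above. The greedy strategy and threshold are
# illustrative assumptions.
import math
from typing import Dict, List

def cluster_sets(metric_vectors: List[Dict[str, float]],
                 max_distance: float) -> List[List[int]]:
    clusters: List[List[int]] = []
    for i, metrics in enumerate(metric_vectors):
        for cluster in clusters:
            rep = metric_vectors[cluster[0]]  # compare against the cluster's first member
            dist = math.sqrt(sum((metrics[k] - rep[k]) ** 2 for k in metrics))
            if dist <= max_distance:
                cluster.append(i)
                break
        else:
            clusters.append([i])
    return clusters

vectors = [{"noise": 0.10, "texture": 0.20}, {"noise": 0.12, "texture": 0.25},
           {"noise": 0.50, "texture": 0.55}, {"noise": 0.90, "texture": 0.85}]
print(cluster_sets(vectors, max_distance=0.15))  # [[0, 1], [2], [3]]
```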
  • a received image is processed for each of the parameter sets in a clustering for the IQ metric to first be adjusted (706).
  • the image may also be processed with a varying IQ metric corresponding to differences in the corresponding parameters for each of the parameter sets (with each parameter set possibly being used multiple times to process the image).
  • the number of times that the image is processed may correspond to the number of parameter sets clustered for the IQ metric.
  • the processed images are then displayed or otherwise presented to a user (708) so that the user may indicate which processed image(s) are preferred.
  • the user may then indicate (such as through a GUI or other user input) which processed images are preferred (710).
  • an IQ score may be determined for each of the processed images, and the highest IQ scores or scores greater than a threshold may be used to select the processed images.
  • the corresponding parameter values for the IQ metric being adjusted may be determined (712).
  • the user selections may have a subset of parameters corresponding to the IQ metric with similar or the same parameter values across the user selections.
  • the parameter values associated with the IQ metric are preserved when processing an image for a next varying IQ metric.
  • the image is then again processed for a next varying IQ metric (714).
  • the process may continue until all indicated metrics are adjusted.
  • the parameter database may be searched to determine whether the parameters for the preferred IQ metrics are similar to the parameters for one or more stored parameter sets.
  • Such parameter sets may be considered the preferred sets of parameter values to be used by the ISP for processing an image.
  • the determined parameter values may be added to the parameter database as one or more new parameter sets.
  • FIG. 9 is a depiction of an example tree branch illustration 900 for sequentially adjusting IQ metrics.
  • the clusterings 902 are used as starting points, and an edge MTF 904 may first be used to adjust an edge IQ metric.
  • a high contrast texture MTF 906 may then be used to next adjust a high contrast texture IQ metric.
  • a low contrast texture MTF 908 may next be used to adjust a low contrast texture IQ metric.
  • a noise MTF 910 may then be used to adjust a noise IQ metric.
  • Fine tuning adjustments may then be performed to finalize one or more parameters that may change the perception of the processed image.
  • the end point of each of the arrows may indicate a different processed image.
  • the continuing arrows may indicate that the user selected those images for the respective IQ metric.
  • the darkened solid arrows, the dashed solid arrows and the gray solid arrow may indicate images selected by the user as preferred over other selected images. The user may select the image corresponding to the final darkened solid arrow during overshoot 912 as the preferred image with respect to the other preferred images.
  • a GUI may be used in adjusting one or more IQ metrics.
  • a GUI may allow a user to inspect the trade-off between IQ metrics and determine the preferred metrics.
  • the GUI may allow a user to determine the preferred IQ metric for the selected metrics to be adjusted.
  • FIGS. 10-14 depict an example GUI for adjusting IQ metrics corresponding to the example tree branch illustration in FIG. 9.
  • FIG. 10 is a snapshot 1000 of an example GUI for adjusting an edge IQ metric.
  • a user may select one or more of the defined edge IQ metric values or relationships and press next to go to the next IQ metric.
  • FIG. 11 is a snapshot 1100 of an example GUI for adjusting a high contrast texture IQ metric.
  • FIG. 12 is a snapshot 1200 of an example GUI for adjusting a low contrast texture IQ metric.
  • a user may select one or more of the defined low contrast texture IQ metric values or relationships and press next to go to the next IQ metric.
  • FIG. 13 is a snapshot 1300 of an example GUI for adjusting a noise IQ metric.
  • the potential noise IQ metrics are based on the previously selected IQ metrics (E selected for edge tuning (FIG. 10), H selected for high contrast tuning (FIG. 11), and L selected for low contrast tuning (FIG. 12) under each of the images on the left of the snapshot 1300).
  • the GUI may show the groupings of selected IQ metrics (with respective parameter sets).
  • FIG. 14 is a snapshot 1400 of an example GUI indicating the concatenation of selections for the different IQ metrics.
  • a user may select one or more final concatenations to be used (such as by checking the box to the left illustrated in snapshot 1400).
  • the parameter set used by the ISP is thus dependent on the selected IQ metric values or relationships (such as through the different MTFs for determining the parameter values for a selected grouping of IQ metrics).
  • one or more sets of parameter values from the parameter database may be identified based on the selected concatenation of IQ metrics (such as from FIG. 14). Such identified sets of parameter values may therefore be used by the ISP in processing received images.
  • the optimization of an IQ model may be open ended and subject to different preferences between users or persons. There may be no "correct" set of parameter values since different processed images using different parameter values may be considered to be of similar IQ by a person. As a result, determining the parameter values to be used or otherwise tuning an ISP may be long or tedious since the parameter values may not converge to one specific set of parameter values. Determining initial parameter values or how to adjust parameter values may be difficult since there may not be one preferred setting for the IQ metrics.
  • a reference image processed by a different ISP or device may be introduced into the automatic tuning process.
  • the reference image may provide some guidance or indication as to one or more preferred IQ metrics and their associated parameter values.
  • a reference image may be used to determine one or more closest sets of parameter values in the parameter database.
  • the closest sets may be used to densify or otherwise adjust the parameter database.
  • FIG. 15 is an illustrative flow chart depicting an example operation 1500 for using a reference image in automatically tuning an ISP.
  • a reference image may be received.
  • the reference image may be previously processed.
  • the reference image may have been provided by a different ISP or device after completing processing.
  • the reference image is different than the input image for processing by the ISP.
  • one or more preferred IQ metrics may be determined from the reference image (1504). For example, a texture IQ metric, a noise IQ metric, and an edge IQ metric may be determined from the reference image.
  • IQ metrics may include a tone IQ metric, color IQ metric, high frequency contrast IQ metric, low frequency contrast IQ metric, and so on. While the example processes are described regarding texture, noise, and edge IQ metrics, other IQ metrics and any number of IQ metrics may be used. Therefore, the present disclosure should not be limited to specific IQ metrics or examples.
  • One or more parameter sets with the parameter values for the sets corresponding to IQ metrics closest to the preferred IQ metrics may then be identified (1506).
  • a parameter database may store a vector of IQ metrics for each set of parameter values.
  • the MTFs for an IQ model may be used to determine the IQ metrics for each set of parameter values in the parameter database.
  • Parameter sets with the closest IQ metrics to the preferred IQ metrics may be considered the closest parameter sets.
  • a distance function may be used to determine the closest parameter sets.
  • An example distance function is depicted in equation (4) below:

    distance(j) = Σ_i W_i · |X_i − M_j,i|,  for j from 1 to D    (4)

    where i indexes a specific IQ metric, X_i is the preferred IQ metric value for the specific IQ metric from the group or vector of preferred IQ metric values X, M_j is the group or vector of IQ metric values for the jth parameter set in the parameter database, W_i is a weight for the ith IQ metric from weight vector W (where each IQ metric may be associated with a different weight), and D is the number of parameter sets in the parameter database.
  • the distance function may be an unweighted summation, where the difference between a parameter set IQ metric value and the preferred IQ metric value is not multiplied by a weight factor.
  • For example, if the preferred IQ metrics determined are the texture IQ metric, edge IQ metric, and noise IQ metric, i may range from 1 to 3 for the three IQ metrics, and the distance for a parameter set j may be a sum of three values: the weighted differences between the parameter set's IQ metric values and the preferred IQ metric values for the texture, edge, and noise IQ metrics.
  • the closest parameter set j may be the parameter set with the smallest or minimum distance across the parameter sets.
  • a parameter set may be selected if the distance is less than a threshold. In this manner, a parameter set may be identified without searching the entire parameter database.
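A minimal sketch of this search, assuming the absolute-difference reading of equation (4) and dictionary-based metric vectors (the early-exit threshold and all names are illustrative):

```python
# Sketch of equation (4): a weighted distance between the preferred IQ metric
# values X and the IQ metric values associated with each parameter set, used
# to find the closest set. The absolute-difference form and the optional
# early-exit threshold are assumptions for illustration.
from typing import Dict, List, Optional, Tuple

def distance(preferred: Dict[str, float],
             candidate: Dict[str, float],
             weights: Dict[str, float]) -> float:
    return sum(weights[name] * abs(preferred[name] - candidate[name])
               for name in preferred)

def closest_set(preferred: Dict[str, float],
                database_metrics: List[Dict[str, float]],
                weights: Dict[str, float],
                stop_threshold: Optional[float] = None) -> Tuple[int, float]:
    best_index, best_dist = -1, float("inf")
    for j, metrics in enumerate(database_metrics):
        d = distance(preferred, metrics, weights)
        if d < best_dist:
            best_index, best_dist = j, d
        if stop_threshold is not None and d < stop_threshold:
            break  # good enough; avoid searching the entire parameter database
    return best_index, best_dist
```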
  • the ISP may then process a received image (1508). For example, a raw image may be input into or received by the device or ISP and processed using the identified parameter set(s). The received image may be the raw (pre-processing) version of the reference image. One or more personal or user preferences also may be determined or received (1510). Then, the parameter database may be adjusted based on the one or more personal preferences and the one or more identified parameter sets (1512).
  • variations to an identified parameter set may be used to process the input image, and the variations are analyzed to determine if the child set is to be added to the parameter database.
  • example operation 600 in FIG. 6 may be used to densify the parameter database, where the parent set is from the one or more identified parameter sets in 1506.
  • the process of identifying one or more parameter sets and using them to adjust the parameter database (1506-1512) may be recursively performed until it is determined that the parameter database is not to be further adjusted.
  • the parameter database may reach a critical number of parameter sets being stored.
  • the parameter database may stop being updated if no new child sets with valid IQ metrics (such as from example operation 600 in FIG. 6) are identified or determined.
  • the parameter database may stop being updated if the new child sets do not sufficiently improve the IQ (such as increasing the IQ score by a threshold amount or differences between the parent set and child set cannot be perceived by a user when processing an image).
  • FIG. 16 is an illustrative flow chart depicting an example operation 1600 for determining the closest parameter sets and adjusting the parameter database.
  • Example operation 1600 in FIG. 16 may be an example implementation of steps 1506-1512 of FIG. 15. While FIG. 16 is described regarding texture, noise, and edge IQ metrics, any IQ metrics and number of IQ metrics may be used.
  • a closest parameter set for the preferred IQ metrics (such as the texture, noise, and edge IQ metrics) may be determined from the parameter database (1602).
  • the distance function depicted in equation (4) may be used to determine the closest parameter set.
  • a different parameter set other than the closest parameter set may be better suited in processing an image.
  • one or more of the IQ metrics may be relaxed in determining a closest parameter set. While operation 1600 describes relaxing one IQ metric in determining a closest parameter set, more than one IQ metric may be relaxed.
  • a closest parameter set with a relaxed texture IQ metric may be determined.
  • the weight vector in determining a distance may be adjusted to reduce the weight for the texture IQ metric.
  • the weight may be adjusted to zero (to remove consideration of the IQ metric from determining the distance) or a portion of the previous weight (to reduce consideration of the IQ metric in determining the distance).
  • a closest parameter set with a relaxed noise IQ metric may be determined (1606), and a closest parameter set with a relaxed edge IQ metric may be determined (1608).
  • one or more of the determined parameter sets may be the same.
  • previously determined closest parameter sets may be removed from consideration in determining a closest parameter set so that the number of determined parameter sets corresponds to the number of preferred IQ metrics.
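Building on the distance sketch above, relaxing one IQ metric can be pictured as scaling its weight down (to zero or to a fraction of its previous value) and skipping sets that were already chosen; the scaling value and the exclusion mechanism are illustrative assumptions:

```python
# Sketch of finding the closest parameter set with one IQ metric relaxed: the
# relaxed metric's weight is reduced (to zero or a fraction of its previous
# value), and previously determined closest sets are excluded so that a
# distinct set is returned. Names and defaults are illustrative.
from typing import Dict, List, Optional, Set

def closest_with_relaxed_metric(preferred: Dict[str, float],
                                database_metrics: List[Dict[str, float]],
                                weights: Dict[str, float],
                                relaxed_metric: str,
                                relax_scale: float = 0.0,
                                exclude: Optional[Set[int]] = None) -> int:
    exclude = exclude or set()
    relaxed_weights = dict(weights)
    relaxed_weights[relaxed_metric] *= relax_scale   # 0.0 removes the metric from the distance
    best_index, best_dist = -1, float("inf")
    for j, metrics in enumerate(database_metrics):
        if j in exclude:
            continue
        d = sum(relaxed_weights[name] * abs(preferred[name] - metrics[name])
                for name in preferred)
        if d < best_dist:
            best_index, best_dist = j, d
    return best_index
```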
  • a received image may then be processed using the determined/identified parameter sets.
  • the parameter set to be used may be determined to be the closest parameter set (such as determined in 1602) or somewhere between the closest parameter set and one of the parameter sets with a relaxed IQ metric (such as determined in 1604 through 1608) (1612).
  • the determined parameter set may be one of the parameter sets determined in 1604 through 1608 (instead of between one of the parameter sets and the closest parameter set).
  • the processed images may be presented to a user.
  • the user may then select the preferred processed image(s).
  • the user input or selection may indicate which parameter set to be used. For example, if the user selects the processed image for the closest parameter set, the closest parameter set is determined to be the parameter set to be used by the ISP. In this manner, the parameter database is not updated since the closest parameter set is selected. If the user selects one of the processed images for the parameter sets for relaxed IQ metrics, a parameter set between the closest parameter set and the corresponding relaxed parameter set may be determined to be used. As a result, a child set from the closest parameter set and the relaxed IQ metric parameter set may be created.
  • the child set from the closest parameter set and the relaxed IQ metric parameter set may be determined through interpolation between the two existing parameter sets. For example, steps 604-608 of example operation 600 in FIG. 6 may be used to determine or create a child set. In another example, one or more IQ metric values between the values for the closest parameter set and the parameter set of the relaxed IQ metric may be determined. One or more MTFs of the IQ model may then be used to determine the parameter values for a child set.
  • the child set is used in processing the received image and compared to the processed images for the two parent sets. If a user prefers the processed image for the child set (or, alternatively, an IQ score or other evaluation of the processed images indicates that the IQ of the processed image for the child set is greater than that of the other processed images), the child set may be added to the parameter database. The process may be repeated as long as the parameter database is to be adjusted (such as being densified with additional child sets). If a user prefers the processed image for the closest parameter set (or, alternatively, an IQ score or other evaluation of the processed images indicates that the IQ of the processed image for the child set is less than that of the other processed images), the child set may be rejected and the parameter database not further updated.
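A high-level sketch of that accept/reject loop, with an IQ-score comparison standing in for the user's selection (the scoring stand-in, the 0.5 interpolation midpoint, and the iteration cap are assumptions):

```python
# High-level sketch of densifying the parameter database with child sets
# created between the closest parameter set and a relaxed-metric parameter
# set. An IQ-score comparison stands in for the user's selection; the 0.5
# interpolation midpoint and the iteration cap are illustrative assumptions.
from typing import Callable, Dict, List

def densify(closest: Dict[str, float],
            relaxed: Dict[str, float],
            database: List[Dict[str, float]],
            score_of: Callable[[Dict[str, float]], float],
            max_rounds: int = 5) -> None:
    parent = closest
    for _ in range(max_rounds):
        child = {k: parent[k] + 0.5 * (relaxed[k] - parent[k]) for k in parent}
        if score_of(child) > max(score_of(parent), score_of(relaxed)):
            database.append(child)   # child preferred: keep it and continue densifying
            parent = child
        else:
            break                    # parent preferred: reject the child, stop updating
```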
  • the device 100 automatically may densify the parameter database 109 with further parameter sets without a user input or personal preferences.
  • a number of iterations in densifying the parameter database 109 may be performed before using the parameter database 109 for processing an image.
  • FIG. 17 is an illustrative flow chart depicting an example operation 1700 for determining new sets of parameter values for adjusting the parameter database 109. While the example operation 1700 describes use of the three IQ metrics of texture, edge sharpness, and noise, greater or fewer IQ metrics may be used, and other IQ metrics may be used in automatically adjusting the parameter database 109. Further, while the example operation 1700 includes receiving one image, any number of images may be received and used to adjust the parameter database 109. Additionally, determining closest sets of parameter values (parameter sets) and determining new parameter sets for adjusting the parameter database 109 may be performed as described above or may be performed in any suitable manner for the example operation 1700.
  • adjusting the parameter database 109 is described regarding densifying the parameter database 109.
  • adjusting the parameter database 109 may include removing or adjusting specific parameter sets in the parameter database 109.
  • the example operation 1700 is for illustrative purposes, and the present disclosure should not be limited to the provided examples.
  • the device 100 may receive an image.
  • the device 100 then may determine one or more IQ metric values for the received image (1704).
  • the device 100 may determine a texture metric value, an edge sharpness metric value, and a noise metric value for the received image (1706).
  • the IQ metric values may identify the reference image in an IQ metric space.
  • each analyzed image may include an associated texture metric value, edge sharpness metric value, and noise metric value.
  • the IQ metric space may be at least three-dimensional if visualized, with each dimension associated with one of the IQ metrics.
  • the set of IQ metrics for an image may represent coordinates for the image in the IQ metric space. In some other examples, other IQ metrics may exist, and the IQ metric space may be larger (or smaller) than three dimensions.
  • the device 100 may determine a plurality of parameter sets in the parameter database 109 for the determined IQ metrics (1708).
  • one of the determined parameter sets is the closest existing parameter set for the determined IQ metrics in the parameter database 109 (1710).
  • the distance function in equation (4) above may be used in determining the closest parameter set.
  • each parameter set may be associated with a preferred IQ metric value for each IQ metric.
  • the device 100 may determine the existing parameter set whose associated preferred IQ metric values cause the smallest distance value from the IQ metric values for the received image.
  • the weights in the weight vector W may be uniform, or the distance function may be unweighted.
  • the device 100 may determine a closest parameter set where one of the IQ metrics is relaxed. For example, the device 100 may determine the closest existing parameter set for the determined IQ metric values where the texture metric is relaxed (1712), may determine the closest existing parameter set for the determined IQ metric values where the edge sharpness metric is relaxed (1714), and may determine the closest existing parameter set for the determined IQ metric values where the noise metric is relaxed (1716). Steps 1710 - 1716 may be similar to steps 1602 - 1608 in FIG. 16.
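  • As a rough sketch (not the disclosed implementation), the closest-parameter-set search of steps 1710 - 1716 could be written in Python as a weighted distance over the IQ metric values; equation (4) is not reproduced in this excerpt, so the weighted squared-difference form below is only one plausible choice, and relaxing a metric is modeled here by reducing its weight to zero (the database entry layout and metric names are assumptions):

```python
import math

def weighted_distance(target_metrics, preferred_metrics, weights):
    """One plausible weighted distance between a target IQ metric vector and a
    parameter set's preferred IQ metric vector (in the spirit of equation (4))."""
    return math.sqrt(sum(
        weights[name] * (target_metrics[name] - preferred_metrics[name]) ** 2
        for name in target_metrics))

def closest_parameter_set(target_metrics, database, weights, relaxed=None):
    """Return the existing parameter set whose preferred IQ metric values are
    closest to target_metrics.  A relaxed metric no longer constrains the
    search, modeled here by zeroing its weight."""
    w = dict(weights)
    if relaxed is not None:
        w[relaxed] = 0.0
    return min(database,
               key=lambda entry: weighted_distance(
                   target_metrics, entry["preferred_metrics"], w))

# Usage, assuming image_metrics and parameter_database exist:
# weights = {"texture": 1.0, "edge_sharpness": 1.0, "noise": 1.0}
# closest = closest_parameter_set(image_metrics, parameter_database, weights)
# closest_tex = closest_parameter_set(image_metrics, parameter_database,
#                                     weights, relaxed="texture")
```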
  • the device 100 may densify the parameter database 109 by adding one or more new parameter sets (1720), such as new parameter sets determined using one or more relaxed weights or other new parameter sets determined closest to an existing parameter set. The process may revert to decision 1718, and the device 100 may determine if the parameter database 109 is to be further adjusted by adding one or more new parameter sets.
  • the device 100 may use each of the last determined parameter sets to process the received image and thus generate a plurality of processed images from the received image (1722). For example, if the device 100 added parameter sets to the parameter database 109 (in 1720), the device 100 may use the last added parameter sets to process the received image. If the device 100 did not perform step 1720 (the parameter database 109 was not to be adjusted in 1718), the device 100 may use the plurality of parameter sets determined in step 1708 to process the received image.
  • the device 100 may adjust the parameter database a pre-defined number of times. If the device 100 has not yet reached the pre-defined number of times, the device 100 may continue to add new parameter sets to the parameter database 109. In another example, the device 100 may adjust the parameter database 109 until a threshold number of parameter sets are included in the parameter database 109. In a further example, the device 100 may prevent adjustment of the parameter database 109 if an available storage for the parameter database 109 is below a storage threshold.
  • the device 100 may adjust the parameter database 109 based on a sparsity of parameter sets in the space (such as the number of parameter sets being greater than or less than a threshold of number of parameter sets per portion of the space). In this manner, the parameter database 109 may be densified with additional parameter sets to fill holes in the space of the parameter sets.
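  • A minimal sketch of how these stopping conditions could be combined into a single check (every threshold and argument name here is an illustrative assumption):

```python
def should_densify(database, iteration, max_iterations=10, max_sets=10_000,
                   free_storage_bytes=None, min_free_bytes=1_000_000):
    """Return True if more parameter sets should be added to the database."""
    if iteration >= max_iterations:
        return False          # pre-defined number of iterations reached
    if len(database) >= max_sets:
        return False          # threshold number of parameter sets reached
    if free_storage_bytes is not None and free_storage_bytes < min_free_bytes:
        return False          # available storage below the storage threshold
    # A sparsity check per region of the IQ metric space could also be added here.
    return True
```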
  • FIG. 18 is an illustrative flow chart depicting an example operation 1800 for adding new parameter sets to the parameter database 109 a pre-defined number of times.
  • the operation 1800 in FIG. 18 is an example implementation of 1718 and 1720 in FIG. 17. While the example operation 1800 includes adding new parameter sets to the parameter database 109 a pre-defined number of times, any suitable means for updating the parameter database may be performed, and the present disclosure should not be limited to the operation 1800 in FIG. 18.
  • the device 100 may determine a new parameter set not in the parameter database 109 based on each determined parameter set. Referring back to step 1708 in FIG.
  • the device 100 may determine a plurality of parameter sets in the parameter database 109 for the determined IQ metric values for a received image. For example, the device 100 may determine the closest existing parameter set for the determined IQ metric values (in step 1710), and the device 100 may determine the closest existing parameter sets with one (or more) of the IQ metrics relaxed (such as in steps 1712 - 1716). If the IQ metric values of a texture, edge sharpness, and noise are determined for the received image, and the device 100 determines an existing parameter set for each of the IQ metrics relaxed and an existing parameter set closest to the determined IQ metric values, the device 100 may determine a new parameter set not in the parameter database 109 for each of the determined parameter sets.
  • the device 100 may determine a new parameter set based on the closest existing parameter set (1804).
  • the closest existing parameter set may be as determined, e.g., in step 1710 of FIG. 17.
  • the device 100 also may determine a new parameter set based on the closest existing parameter set where the texture metric is relaxed (1806).
  • the closest existing parameter set where the texture metric is relaxed may be as determined, e.g., in step 1712 of FIG. 17.
  • the device 100 further may determine a new parameter set based on the closest existing parameter set where the edge sharpness metric is relaxed (1808).
  • the closest existing parameter set where the edge sharpness metric is relaxed may be as determined, e.g., in step 1714 of FIG. 17.
  • the device 100 also may determine a new parameter set based on the closest existing parameter set where the noise metric is relaxed (1810).
  • the closest existing parameter set where the noise metric is relaxed may be as determined, e.g., in step 1716 of FIG. 17.
  • the device 100 may determine parameter values that cause the processed image to have IQ metric values closer to the determined IQ metric values for the reference image than the parameter values of the existing parameter set do. For example, the device 100 may determine that the preferred IQ metric values for an existing parameter set are a distance q from the determined IQ metric values for the reference image. The device 100 then may determine one or more new IQ metric values that result in a distance from the determined IQ metric values smaller than distance q. In some example implementations, the new IQ metric value may be based on whether that IQ metric is relaxed for the existing parameter set.
  • the device 100 may adjust each IQ metric value for which the IQ metric is not relaxed while not adjusting the IQ metric for which the IQ metric is relaxed.
  • the device 100 may adjust the texture metric value and the noise metric value without adjusting the edge sharpness metric value.
  • Determining new parameter sets based on newly determined IQ metric values may include the device 100 iteratively selecting a set of parameters values, processing the reference image to determine if the set of parameter values corresponds to the newly determined IQ metric values, and adjusting the parameter values (based on the determination) for again processing the reference image until a suitable parameter set for adding to the parameter database 109 is determined.
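  • The iterative select / process / measure / adjust loop described above might be sketched as follows, assuming hypothetical helpers process_image (runs the ISP filters with a candidate parameter set) and measure_iq_metrics (computes IQ metric values of a processed image), and reusing the weighted_distance sketch above; the proportional nudge used to adjust the parameters is only a placeholder for however the device actually updates them:

```python
def search_parameter_set(reference_image, target_metrics, initial_params,
                         process_image, measure_iq_metrics, weights,
                         tolerance=0.05, max_iters=50, step=0.1):
    """Iteratively adjust a candidate parameter set until the processed
    reference image lands close enough to the target IQ metric values."""
    params = dict(initial_params)
    for _ in range(max_iters):
        processed = process_image(reference_image, params)
        metrics = measure_iq_metrics(processed)
        if weighted_distance(target_metrics, metrics, weights) <= tolerance:
            return params                 # suitable parameter set found
        # Placeholder update: nudge every parameter by a small fraction of its
        # value; a trained or model-based tuner would replace this step.
        params = {name: value * (1.0 + step) for name, value in params.items()}
    return params                         # best effort after max_iters
```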
  • the device 100 may be trained to determine how to adjust values of the parameter sets in densifying the parameter database 109.
  • training the device 100 to densify the parameter database 109 may be based on the distribution of the preferred IQ metric values for the existing parameter sets.
  • the device 100 may use a parameter generator, which may be embodied in software, hardware, or a combination of both, to determine new parameter sets to be added to the parameter database 109.
  • the parameter generator may include a neural network to learn dependencies between the IQ metric values and the parameter values.
  • Another suitable system may be a fuzzy system.
  • any suitable training method or system may be used in determining the relationships between IQ metric values and parameter values.
  • the parameter generator may be used to determine the new parameter values based on the distribution D of the existing parameter sets and corresponding preferred IQ metric values, and the neural network may be used to reinforce correctly learned relationships in the distribution D and to correct errors in incorrectly learned relationships in the distribution D. Reinforcement and correction may be performed when new information is received regarding the distribution D, such as each time a parameter set is added to the parameter database 109 (thus adjusting the distribution D), each time a set of IQ metric values is determined for an image processed using an existing parameter set, or in other suitable instances when new information regarding the parameter database 109 is received and may be used in further determining relationships in the distribution D.
  • the device 100 may determine a representative group or vector of parameter values (such as vector R) for the distribution D.
  • the device 100 may define the distribution D for the parameter sets and preferred IQ metric values as a multivariate Gaussian distribution. In this manner, the device 100 may determine the mean vector of parameter values for the multivariate Gaussian distribution (D) as the representative vector R.
  • Suitable indicators for the representative vector R include a median vector of parameter values, a combination of a mean vector and a median vector of parameter values, a logical center of a distribution D not defined as a multivariate Gaussian distribution (such as multi-modal distributions, skewed distributions, or other distributions without a Gaussian curve), and so on.
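  • If the distribution D is treated as a multivariate Gaussian, the representative vector R reduces to the per-parameter mean of the existing parameter sets, and a median is one robust alternative; a small numpy sketch (the input layout is an assumption):

```python
import numpy as np

def representative_vector(parameter_sets, method="mean"):
    """Compute the representative vector R over the existing parameter sets.

    parameter_sets: iterable of equal-length parameter value vectors.
    method: "mean" for a Gaussian-style centroid, "median" for a robust center.
    """
    matrix = np.asarray(list(parameter_sets), dtype=float)  # (num_sets, num_params)
    if method == "mean":
        return matrix.mean(axis=0)
    if method == "median":
        return np.median(matrix, axis=0)
    raise ValueError(f"unknown method: {method}")
```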
  • the distribution D and the representative vector R are used to train the neural network to determine a function d to be used in determining new sets of parameter values.
  • C may equal P plus an adjustment based on the function d and the vector R, as represented in equation (5) below:

    C = P + d(R)    (5)
  • function d may represent a function of the vector R to be used in varying the existing parameter set P.
  • function d may include separate functions for adjusting each parameter value of vector R to generate an adjusted vector R'.
  • the adjusted vector R’ then may be combined (such as added, subtracted, averaged, or other suitable combination) with the determined parameter set P to determine a new parameter set C for adding to the parameter database 109.
  • the device 100 may use the set of IQ metric values for each parameter set in the parameter database 109, may use the target set of IQ metric values (such as determined from the reference image), and may use one or more selected parameter sets P (such as in steps 1710 - 1716 in FIG. 17) as inputs to train the neural network or other suitable system of the parameter generator in determining a function d for determined vector R.
  • the function d may be based on the relationships between different IQ metrics and different parameters.
  • function d for adjusting P based on R may be a function dependent on a determined P, M_P (the preferred IQ metric values for the parameter set P), and m (the target IQ metric values), as represented in equation (6) below:

    d = g(P, M_P, m)    (6)

where g denotes the mapping learned by the parameter generator.
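  • Combining equations (5) and (6), one minimal way to express the parameter generator is a callable d, conditioned on the chosen parameter set P, its preferred IQ metric values M_P, and the target IQ metric values m, that perturbs the representative vector R. The toy form below, which pulls P toward R in proportion to the metric gap, is only a stand-in for whatever a trained network would produce:

```python
import numpy as np

def make_d(P, M_P, m, gain=0.5):
    """Build a toy adjustment function d conditioned on (P, M_P, m).

    A trained parameter generator (e.g., a neural network) would replace this;
    here d moves the representative vector R relative to P in proportion to
    how far the target metrics m sit from P's preferred metrics M_P.
    """
    P = np.asarray(P, dtype=float)
    metric_gap = float(np.linalg.norm(np.asarray(m, dtype=float) -
                                      np.asarray(M_P, dtype=float)))

    def d(R):
        return gain * metric_gap * (np.asarray(R, dtype=float) - P)

    return d

def new_parameter_set(P, R, d):
    """Equation (5): candidate parameter set C = P + d(R)."""
    return np.asarray(P, dtype=float) + d(R)
```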
  • the device 100 may determine, for each new parameter set, whether the parameter set is to be added to the parameter database 109 (1812). For example, for each new parameter set determined in steps 1804 - 1810, the device 100 may determine whether each of the four new parameter sets is to be added to the parameter database 109. Determining whether a parameter set is to be added to the parameter database 109 may be similar to decision 614 in FIG. 6.
  • a cost function estimator is used in determining whether a new parameter set is to be added to the parameter database 109.
  • the cost function estimator may be embodied in software, hardware, or a combination of both.
  • the device 100 may determine the IQ metric values for the new parameter set C.
  • the determined IQ metric values together may comprise the IQ metric vector M_C for the parameter set C.
  • the cost function estimator may be trained to provide a cost function f_w based on M_C and m.
  • the inputs to train the cost function estimator may include the determined parameter sets P, the change to the parameter set P to create the new parameter set C (such as d(R) in equation (5)), and any updates to the parameter database 109 (such as new parameter sets added).
  • the device 100 may determine to add the parameter set C to the parameter database 109 if the cost function value for the new parameter set is less than the cost function value for the existing parameter set P. For example, given M_C, M_P (which is the set of IQ metric values for the existing parameter set P), and m, if f_w(M_C, m) < f_w(M_P, m), the device 100 may determine to add the parameter set C to the parameter database 109. Otherwise, the device 100 may determine not to add the parameter set C to the parameter database 109. While the examples provide some suitable methods for determining whether to add parameter set C to the parameter database 109, other suitable methods may be used, and the present disclosure should not be limited to the provided examples.
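  • The accept-or-discard decision then follows directly from that comparison; in the sketch below, f_w stands in for whatever the trained cost function estimator produces, and the commented placeholder simply reuses the uniform-weight distance from the earlier sketch:

```python
def should_add(candidate_metrics, existing_metrics, target_metrics, f_w):
    """Add candidate set C only if it costs less than the existing set P."""
    return f_w(candidate_metrics, target_metrics) < f_w(existing_metrics, target_metrics)

# Placeholder cost function and usage (names are illustrative assumptions):
# f_w = lambda M, m: weighted_distance(m, M, {name: 1.0 for name in m})
# if should_add(M_C, M_P, m, f_w):
#     parameter_database.append({"params": C, "preferred_metrics": M_C})
```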
  • the device 100 may discard the new parameter set (1814). For example, the device 100 may determine to continue using a parameter set P instead of a new parameter set C determined from the parameter set P, which may be discarded. If the device 100 determines to add the new parameter set to the parameter database 109, the device 100 may add the new parameter set to the parameter database 109 (1816), and the device 100 may begin using the new parameter set C.
  • C1 may be a new parameter set determined for a closest parameter set P1
  • C2 may be a new parameter set determined for a closest parameter set P2 when the texture metric is relaxed
  • C3 may be a new parameter set determined for a closest parameter set P3 when the edge sharpness metric is relaxed
  • C4 may be a new parameter set determined for a closest parameter set P4 when the noise metric is relaxed.
  • the device 100 may determine that C1 is closer than P1, P2 is closer than C2, C3 is closer than P3, and C4 is closer than P4 for the IQ metrics for the received image.
  • the device 100 may determine to add parameter sets C1, C3, and C4, and the device 100 may determine to discard parameter set C2 (and continue to use parameter set P2 for when the texture metric is relaxed).
  • the device 100 may determine if a number of iterations in densifying the parameter database is reached (1818). If the number of iterations is reached, the device 100 may use the last valid parameter set in processing the reference image (1820). For example, if a new parameter set C is determined from a parameter set P, and the parameter set C is added to the parameter database 109, the device 100 may process the image using the parameter set C. If the parameter set C is discarded, the device 100 may process the image using the parameter set P. Step 1820 may be similar to step 1722 in FIG. 17. If the number of iterations is not reached, the process may revert to step 1802, and the device 100 may determine one or more new parameter sets based on the closest existing parameter set.
  • the number of iterations may be the number of times that a closest existing parameter set is determined (such as parameter set P) and one or more new parameter sets are determined from the existing parameter set (such as parameter set C). For example, if one or more new parameter sets are added to the parameter database 109 in a first iteration, the device 100 may determine, from the IQ metric values for the new parameter sets and the IQ metric values for the parameter set P previously determined to be closest, which parameter set is now the closest parameter set for the determined IQ metric values for the reference image. As before for steps 1804 - 1810, one or more IQ metrics may be relaxed in determining a closest existing parameter set.
  • the device 100 may determine a new parameter set and determine whether the new parameter set is to be added to the parameter database 109.
  • the process of determining and adding new parameter sets may be performed for any number of iterations.
  • the number of iterations in determining and adding parameter sets to the parameter database 109 may be any suitable number defined before determining parameter sets for the parameter database 109.
  • the number of iterations may be, e.g., fixed, user adjustable, dependent on the number of parameter sets existing in the parameter database 109, or any other suitable number. In this manner, the device 100 may continue to add parameter sets to the parameter database 109 without user input.
  • the device 100 may add a number of new parameter sets up to, e.g., one more than the number of IQ metrics used.
  • if the IQ metrics used are a texture metric, an edge sharpness metric, and a noise metric (totaling three IQ metrics), the device 100 may determine a new parameter set for each of the IQ metrics relaxed (thus equaling three new parameter sets), and the device 100 may determine a new parameter set with no IQ metrics relaxed, thus totaling four new parameter sets for the three IQ metrics which may be added to the parameter database 109.
  • the device 100 may add up to all four parameter sets C1 - C4 to the parameter database 109.
  • additional parameter sets may be determined by using multiple weights for each IQ metric.
  • the device 100 may use each of the parameter sets added in the previous iteration to determine another plurality of new parameter sets (such as four new parameter sets for three IQ metrics). In this manner, the number of parameter sets in the parameter database 109 may grow exponentially in relation to the number of iterations.
  • four parameter sets may be added during the first iteration, sixteen parameter sets may be added during the second iteration (four new parameter sets determined during the second iteration from each new parameter set determined during the first iteration), 64 parameter sets may be added during the third iteration, and so on.
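  • A quick check of that arithmetic: with three IQ metrics, each parameter set used as a parent yields four new sets (one per relaxed metric plus one with nothing relaxed), so the count added at iteration i is 4^i when every new set is kept (modeling the first iteration as a single seed set; the loop below only reproduces the numbers quoted above):

```python
num_iq_metrics = 3
children_per_parent = num_iq_metrics + 1   # one per relaxed metric, plus none relaxed

parents = 1                                 # single seed for the first iteration
total_added = 0
for iteration in range(1, 4):
    added = parents * children_per_parent   # 4, 16, 64, ...
    total_added += added
    parents = added                         # last-added sets seed the next iteration
    print(f"iteration {iteration}: {added} new parameter sets ({total_added} total)")
```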
  • the device 100 may determine if a parameter set for processing the image is acceptable to a user.
  • the device 100 may use one or more parameter sets from the densified parameter database 109 to generate one or more processed images from the received image (such as in step 1722 in FIG. 17 or in step 1820 in FIG. 18). The user then may determine if any of the processed images are acceptable.
  • FIG. 19 is an illustrative flow chart depicting an example operation 1900 for determining if any of the parameter sets are acceptable for processing an image (such as the received image in step 1702 in FIG. 17). While the example operation 1900 illustrates use of the IQ metrics of texture, edge sharpness, and noise, any number of IQ metrics and different IQ metrics may be used. Further, while the example operation 1900 illustrates generating a number of processed images equal to the number of IQ metrics plus one (e.g., four processed images for three IQ metrics of texture, edge sharpness, and noise), any number of processed images may be generated. The present disclosure should not be limited to the provided examples in FIG. 19.
  • the device 100 may process the received image to generate one or more processed images.
  • the device 100 may use one or more of the last determined parameter sets for the determined IQ metrics of the received image to process the received image.
  • the device 100 may process the received image using the last closest parameter set to generate a first processed image (1904). Referring back to FIG. 18, the device 100 may add a new parameter set C1 (in step 1816), or the device 100 may discard the new parameter set C1 (in step 1814) for an existing parameter set P1 that was determined closest for the IQ metrics for the received image. If the parameter set C1 is discarded, the device 100 may use the parameter set P1 in generating the first processed image (in step 1904 in FIG. 19). If the parameter set C1 is added to the parameter database 109, the device 100 may use the parameter set C1 in generating the first processed image (in step 1904 in FIG. 19).
  • the device 100 also may process the received image using the last closest parameter set with the texture metric relaxed to generate a second processed image (1906). Referring back to FIG. 18, the device 100 may add a new parameter set C2 (in step 1816), or the device 100 may discard the new parameter set C2 (in step 1814) for an existing parameter set P2 that was determined closest for the IQ metrics (with the texture metric relaxed) for the received image. If the parameter set C2 is discarded, the device 100 may use the parameter set P2 in generating the second processed image (in step 1906 in FIG. 19). If the parameter set C2 is added to the parameter database 109, the device 100 may use the parameter set C2 in generating the second processed image (in step 1906 in FIG. 19).
  • the device 100 also may process the received image using the last closest parameter set with the edge sharpness metric relaxed to generate a third processed image (1908).
  • the device 100 may add a new parameter set C3 (in step 1816), or the device 100 may discard the new parameter set C3 (in step 1814) for an existing parameter set P3 that was determined closest for the IQ metrics (with the edge sharpness metric relaxed) for the received image. If the parameter set C3 is discarded, the device 100 may use the parameter set P3 in generating the third processed image (in step 1908 in FIG. 19). If the parameter set C3 is added to the parameter database 109, the device 100 may use the parameter set C3 in generating the third processed image (in step 1908 in FIG. 19).
  • the device 100 also may process the received image using the last closest parameter set with the noise metric relaxed to generate a fourth processed image (1910).
  • the device 100 may add a new parameter set C4 (in step 1816), or the device 100 may discard the new parameter set C4 (in step 1814) for an existing parameter set P4 that was determined closest for the IQ metrics (with the noise metric relaxed) for the received image. If the parameter set C4 is discarded, the device 100 may use the parameter set P4 in generating the fourth processed image (in step 1910 in FIG. 19). If the parameter set C4 is added to the parameter database 109, the device 100 may use the parameter set C4 in generating the fourth processed image (in step 1910 in FIG. 19).
  • the device 100 may provide the one or more processed images for display to a user (1912).
  • the device 100 may display the processed images on the display 114.
  • the processed images may be output for display by a separate device.
  • the processed images may be displayed in any suitable manner. In one example, two or more processed images may be displayed concurrently so that a user may compare images side by side. In addition, or alternatively, the processed images may be displayed one at a time.
  • the display 114 may be configured to switch between images based on a user input (such as a user swiping through the images or providing another suitable user gesture to change the displayed image).
  • the device 100 may receive a user input indicating whether one or more of the processed images are acceptable by the user (1914). For example, a user may examine the displayed processed images, and the user may decide if any or which images are acceptable to the user. The user then may indicate via a user input which (if any) processed image is acceptable. In indicating which processed image (if any) is acceptable, the user indicates which parameter set is acceptable for processing the received image.
  • the device 100 may determine from the user input whether a parameter set for the received image is deemed acceptable (1916). For example, if an image is deemed acceptable via a user input, the device 100 may determine that there exists a parameter set that is acceptable for processing the received image. If a parameter set is deemed acceptable (1918), the device 100 may determine which parameter set is deemed acceptable for the received image (1920). For example, if the user selects the processed image corresponding to the parameter set C2 determined with the texture metric relaxed, the device 100 may determine the parameter set C2 for processing the received image.
  • the user may select multiple images as acceptable.
  • the device 100 may determine multiple parameter sets in the parameter database 109 that are acceptable for processing the received image.
  • the device 100 may use the multiple parameter sets to generate multiple corresponding processed images. The user then may determine whether one of the multiple processed images is preferred.
  • the user may select one image, and one parameter set in the parameter database 109 is used for processing a similar image that is received.
  • the device 100 may determine a different parameter set for processing the received image (1922). For example, the user may indicate that none of the processed images are acceptable. In this manner, the device 100 may determine that none of the previously determined parameter sets are acceptable for processing the received image. In some example implementations, the device 100 may determine a different parameter set not previously used that is in the parameter database 109 and is acceptable to the user for processing the received image. In some other example implementations, the device 100 may determine a new parameter set not in the parameter database 109.
  • FIG. 20 is an illustrative flow chart depicting an example operation 2000 for determining a different parameter set.
  • the operation 2000 in FIG. 20 may be an example implementation of step 1922 in FIG. 19.
  • the device 100 may receive a user input indicating the most acceptable processed image. For example, although the user may not find any of the processed images acceptable, the user may indicate that the processed image corresponding to the parameter set with the texture metric relaxed is the most acceptable processed image.
  • the device 100 may determine the existing parameter set corresponding to the most acceptable processed image (2004). The device 100 then may determine a new parameter set from the existing parameter set (2006). In some example implementations, the device 100 may compare the IQ metric values from the received image and the IQ metric values from the most acceptable processed image to determine changes in the IQ metric values that may correspond to making a processed image acceptable. For example, the parameter values of the existing parameter set may be adjusted to converge the IQ metric values for a processed image toward preferred IQ metric values.
  • the device 100 may process the received image to generate a new processed image (2008), and the device 100 may output the new processed image for display to the user (2010). The device 100 then may receive a user input as to whether the new processed image is acceptable (2012). If the new processed image is acceptable (2014), the device 100 may use the new parameter set for processing the received image and similar images (2016). For example, the device 100 may note or store an indication that the parameter set is to be used in the future for similar images. If the new processed image is not acceptable, the process may revert to step 2006, and the device 100 may determine another parameter set to attempt to determine an acceptable parameter set for processing the received image.
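  • Operation 2000 is essentially a propose / preview / confirm loop; a compressed sketch, with hypothetical helpers derive_new_set, process_image, and show_and_ask standing in for steps 2006 - 2014, could read:

```python
def refine_until_accepted(image, most_acceptable_set, target_metrics,
                          derive_new_set, process_image, show_and_ask,
                          max_attempts=5):
    """Derive new parameter sets from the most acceptable existing set until
    the user accepts a processed image (or the attempt budget runs out)."""
    current = most_acceptable_set
    for _ in range(max_attempts):
        candidate = derive_new_set(current, target_metrics)   # step 2006
        preview = process_image(image, candidate)             # step 2008
        if show_and_ask(preview):                             # steps 2010 - 2014
            return candidate                                   # step 2016
        current = candidate                                    # try again from the new set
    return None                                                # no acceptable set found
```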
  • the device 100 may determine a new parameter set in the parameter database 109 without relying on previously used parameter sets.
  • FIG. 21 is an illustrative flow chart depicting another example operation 2100 for determining a different parameter set when none of the used parameter sets are deemed acceptable. In the example operation 2100 in FIG. 21, the new parameter set exists in the parameter database 109.
  • the device 100 may adjust one or more weights for the IQ metrics used for determining the closest parameter set.
  • the weights may be included in a weight vector W for the different IQ metrics used in determining a parameter set (such as for equation (4) above).
  • the weights may be user adjustable. For example, a user may determine that edge sharpness is more important than other IQ metrics. As a result, the user may increase w(edge sharpness) so that a different parameter set may be determined. Additionally or alternatively, the user may reduce the weight for the texture metric or reduce the weight for the noise metric. Any number of weights for the IQ metrics may be adjusted by the user.
  • the device 100 may determine a new parameter set from the parameter database 109 (2104). For example, the adjusted weights may cause the device 100 to determine a different parameter set that is closest.
  • the device 100 may process the received image using the new parameter set to generate a new processed image (2106), and the device 100 may output the new processed image for display to the user (2108).
  • the device 100 may receive a user input indicating the new processed image is acceptable (2110). If the new processed image is acceptable (2112), the device 100 may use the new parameter set to process the received image and similar images (2114). For example, the device 100 may note or store an indication that the parameter set is to be used in the future for similar images. If the new processed image is not acceptable, the process may revert to step 2102, and the weights again may be adjusted.
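  • Operation 2100 amounts to re-running the nearest-set lookup under a user-edited weight vector; a sketch reusing the closest_parameter_set helper from the earlier sketch, with hypothetical adjust_weights, process_image, and show_and_ask callables, could read:

```python
def reselect_until_accepted(image, image_metrics, database, weights,
                            adjust_weights, process_image, show_and_ask,
                            max_attempts=5):
    """Re-pick an existing parameter set under user-adjusted IQ metric weights
    until the resulting processed image is accepted."""
    for _ in range(max_attempts):
        weights = adjust_weights(weights)                                 # step 2102
        chosen = closest_parameter_set(image_metrics, database, weights)  # step 2104
        preview = process_image(image, chosen["params"])                  # step 2106
        if show_and_ask(preview):                                         # steps 2108 - 2112
            return chosen                                                 # step 2114
    return None

# Example weight edit favoring edge sharpness over texture and noise:
# weights = {"texture": 0.5, "edge_sharpness": 2.0, "noise": 0.5}
```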
  • the user may adjust the IQ metric values directly, such as described above. For example, the user may sequentially adjust the IQ metric values as described above regarding FIG. 7 - FIG. 14. The new parameter set then may be selected from the parameter database 109 based on the adjusted IQ metric values.
  • the techniques described herein may be implemented in hardware, software, firmware, or any combination thereof, unless specifically described as being implemented in a specific manner. Any features described as modules or components may also be implemented together in an integrated logic device or separately as discrete but interoperable logic devices. If implemented in software, the techniques may be realized at least in part by a non-transitory processor-readable storage medium (such as the memory 106 in the example device 100 of FIG. 1) comprising instructions 108 that, when executed by the processor 104, cause the device 100 to perform one or more of the methods described above.
  • the non-transitory processor-readable data storage medium may form part of a computer program product, which may include packaging materials.
  • the non-transitory processor-readable storage medium may comprise random access memory (RAM) such as synchronous dynamic random access memory (SDRAM), read only memory (ROM), non-volatile random access memory (NVRAM), electrically erasable programmable read-only memory (EEPROM), FLASH memory, other known storage media, and the like.
  • the techniques additionally, or alternatively, may be realized at least in part by a processor-readable communication medium that carries or communicates code in the form of instructions or data structures and that can be accessed, read, and/or executed by a computer or other processor.
  • the instructions or code may be executed by one or more processors, such as the processor 104 of FIG. 1.
  • processors may include but are not limited to one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), application specific instruction set processors (ASIPs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.
  • processors may refer to any of the foregoing structures or any other structure suitable for implementation of the techniques described herein.
  • the functionality described herein may be provided within dedicated software modules or hardware modules configured as described herein.
  • a general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine.
  • a processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

Aspects of the present disclosure relate to systems and methods for tuning an image signal processor (ISP). An example device may include one or more processors configured to receive a reference image, determine a plurality of image quality (IQ) metrics based on the reference image, determine a value for each of the plurality of IQ metrics for the reference image, identify one or more existing parameter sets in a parameter database based on the values of the plurality of IQ metrics, and determine whether the parameter database is to be adjusted based on the one or more existing parameter sets.
PCT/US2019/015872 2018-01-30 2019-01-30 Systèmes et procédés de réglage de processeur de signal d'image WO2019152534A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201980009535.6A CN111630854A (zh) 2018-01-30 2019-01-30 用于图像信号处理器调谐的系统和方法

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
IN201841003373 2018-01-30
IN201841003373 2018-01-30
IN201841033902 2018-09-10
IN201841033902 2018-09-10

Publications (1)

Publication Number Publication Date
WO2019152534A1 true WO2019152534A1 (fr) 2019-08-08

Family

ID=65433749

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2019/015872 WO2019152534A1 (fr) 2018-01-30 2019-01-30 Systèmes et procédés de réglage de processeur de signal d'image

Country Status (2)

Country Link
CN (1) CN111630854A (fr)
WO (1) WO2019152534A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020176007A1 (fr) * 2019-02-27 2020-09-03 Huawei Technologies Co., Ltd. Appareil et procédé de traitement d'image
CN113099100A (zh) * 2019-12-23 2021-07-09 神讯电脑(昆山)有限公司 取像参数的调整方法
WO2021201993A1 (fr) * 2020-03-30 2021-10-07 Qualcomm Incorporated Syntonisation automatisée de caméra
WO2024082183A1 (fr) * 2022-10-19 2024-04-25 华为技术有限公司 Procédé et appareil de réglage de paramètre, et terminal intelligent

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117940951A (zh) * 2021-09-13 2024-04-26 华为技术有限公司 确定图像信号处理参数的方法、装置和感知系统

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160379352A1 (en) * 2015-06-24 2016-12-29 Samsung Electronics Co., Ltd. Label-free non-reference image quality assessment via deep neural network
US20170070671A1 (en) * 2015-09-07 2017-03-09 Samsung Electronics Co., Ltd. Systems, methods, apparatuses, and non-transitory computer readable media for automatically tuning operation parameters of image signal processors
US20170103512A1 (en) * 2015-10-13 2017-04-13 Siemens Healthcare Gmbh Learning-based framework for personalized image quality evaluation and optimization

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8379130B2 (en) * 2009-08-07 2013-02-19 Qualcomm Incorporated Apparatus and method of processing images based on an adjusted value of an image processing parameter

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160379352A1 (en) * 2015-06-24 2016-12-29 Samsung Electronics Co., Ltd. Label-free non-reference image quality assessment via deep neural network
US20170070671A1 (en) * 2015-09-07 2017-03-09 Samsung Electronics Co., Ltd. Systems, methods, apparatuses, and non-transitory computer readable media for automatically tuning operation parameters of image signal processors
US20170103512A1 (en) * 2015-10-13 2017-04-13 Siemens Healthcare Gmbh Learning-based framework for personalized image quality evaluation and optimization

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
DONG J: "Learning adaptive parameter tuning for image processing", IS AND T INTERNATIONAL SYMPOSIUM ON ELECTRONIC IMAGING SCIENCE AND TECHNOLOGY - IMAGE PROCESSING: ALGORITHMS AND SYSTEMS XVI 2018 SOCIETY FOR IMAGING SCIENCE AND TECHNOLOGY USA, vol. Part F138652, 2018, XP002790353, DOI: 10.2352/ISSN.2470-1173.2018.13.IPAS-196 *
LUCAS YVES ET AL: "Modeling, Evaluation and Control of a Road Image Processing Chain", 19 June 2005, IMAGE ANALYSIS AND RECOGNITION, 11th International Conference, ICIAR 2014, Vilamoura, Portugal, ISBN: 978-3-642-17318-9, XP047466459 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020176007A1 (fr) * 2019-02-27 2020-09-03 Huawei Technologies Co., Ltd. Appareil et procédé de traitement d'image
CN113099100A (zh) * 2019-12-23 2021-07-09 神讯电脑(昆山)有限公司 取像参数的调整方法
WO2021201993A1 (fr) * 2020-03-30 2021-10-07 Qualcomm Incorporated Syntonisation automatisée de caméra
CN115362502A (zh) * 2020-03-30 2022-11-18 高通股份有限公司 自动相机调试
US20230054572A1 (en) * 2020-03-30 2023-02-23 Qualcomm Incorporated Automated camera tuning
WO2024082183A1 (fr) * 2022-10-19 2024-04-25 华为技术有限公司 Procédé et appareil de réglage de paramètre, et terminal intelligent

Also Published As

Publication number Publication date
CN111630854A (zh) 2020-09-04

Similar Documents

Publication Publication Date Title
WO2019152534A1 (fr) Systèmes et procédés de réglage de processeur de signal d'image
US10237527B2 (en) Convolutional color correction in digital images
WO2019152499A1 (fr) Systèmes et procédé pour réglage de processeur de signal d'image utilisant une image de référence
US10949958B2 (en) Fast fourier color constancy
US8730329B2 (en) Automatic adaptive image sharpening
US20150170389A1 (en) Automatic selection of optimum algorithms for high dynamic range image processing based on scene classification
CN112232476A (zh) 更新测试样本集的方法及装置
WO2012170462A2 (fr) Correction d'exposition automatique d'images
US20220076385A1 (en) Methods and systems for denoising media using contextual information of the media
CN114066857A (zh) 红外图像质量评价方法、装置、电子设备及可读存储介质
CN111861937A (zh) 一种基于msr改进的图像增强方法及系统
WO2019152481A1 (fr) Systèmes et procédés de réglage de processeur de signal d'image
US9940543B2 (en) Control of computer vision pre-processing based on image matching using structural similarity
US11443414B2 (en) Image signal processing
CN105574844B (zh) 辐射响应函数估计方法和装置
Fry et al. Bridging the gap between imaging performance and image quality measures
CN112651945A (zh) 一种基于多特征的多曝光图像感知质量评价方法
CN109035178B (zh) 一种应用于图像去噪的多参数取值调优方法
Peltoketo SNR and Visual Noise of Mobile Phone Cameras
CN111782845A (zh) 一种图像调整方法、图像调整装置及移动终端
CN110111286A (zh) 图像优化方式的确定方法和装置
US20240135587A1 (en) Method of learning parameter of sensor filter and apparatus for performing the same
CN117173642B (zh) 一种基于大数据的建筑施工视频实时监测预警方法
van Zwanenberg et al. Camera system performance derived from natural scenes
CN117939307B (zh) 一种适用于融合相机的自适应亮度调节方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19705440

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19705440

Country of ref document: EP

Kind code of ref document: A1