US20200372682A1 - Predicting optimal values for parameters used in an operation of an image signal processor using machine learning - Google Patents
- Publication number
- US20200372682A1 (application US16/724,626)
- Authority
- US
- United States
- Prior art keywords
- sample
- parameters
- signal processor
- image signal
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/97—Determining parameters from multiple pictures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/084—Backpropagation, e.g. using gradient descent
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/20—Processor architectures; Processor configuration, e.g. pipelining
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/044—Recurrent networks, e.g. Hopfield networks
-
- G06N3/0445—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/001—Image restoration
- G06T5/002—Denoising; Smoothing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/007—Dynamic range modification
- G06T5/009—Global, i.e. based on properties of the image as a whole
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
-
- G06T5/60—
-
- G06T5/70—
-
- G06T5/92—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/71—Charge-coupled device [CCD] sensors; Charge-transfer registers specially adapted for CCD sensors
- H04N25/75—Circuitry for providing, modifying or processing image signals from the pixel array
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
Definitions
- the present disclosure relates to a method of training a machine learning model to predict optimal values for parameters used in an operation of an image signal processor, and to an electronic device configured to perform the method.
- An image sensor is a semiconductor-based sensor configured to receive light and generate an electrical signal.
- Raw data, output by the image sensor may be processed by an image signal processor (ISP).
- the image signal processor may generate an image using the raw data output by the image sensor.
- the image signal processor may generate an image from the raw data based on various parameters. However, quality and characteristics of the generated image may vary depending on values of the parameters applied to the image signal processor.
- At least one exemplary embodiment of the inventive concept provides a method of predicting the performance of an image signal processor, or the quality of images generated by the image signal processor, using machine learning. The resulting predictions may be used to tune the image signal processor to improve the quality of the images it generates.
- a method of training a machine learning model to predict optimal values for a plurality of parameters used in an operation of an image signal processor includes: capturing an image of a sample subject to obtain sample data; generating a plurality of sets of sample values for the plurality of parameters; emulating the image signal processor (ISP) processing the sample data according to each of the sets to generate a plurality of sample images; evaluating each of the plurality of sample images for a plurality of evaluation items to generate respective sample scores; and training the machine learning model to predict the optimal values using the sample values and the sample scores.
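The claimed training flow (capture sample data, generate sample value sets, emulate the ISP, score the sample images, fit a model) can be sketched end to end. The sketch below is a minimal stand-in, not the disclosed implementation: a toy ISP emulator, a two-item evaluation step, and a linear least-squares fit in place of the machine learning model; all function names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def emulate_isp(raw, params):
    # Hypothetical stand-in for the ISP emulator: the "sample image" is
    # the raw data scaled by the first parameter and offset by the second.
    return raw * params[0] + params[1]

def evaluate(image):
    # Hypothetical evaluation step scoring two evaluation items
    # (brightness and contrast proxies).
    return np.array([image.mean(), image.std()])

raw = rng.random((8, 8))                       # sample data from a captured subject
sample_sets = rng.uniform(0.5, 1.5, (100, 2))  # 100 sets of sample values, 2 parameters

# Emulate the ISP for each sample set and score the resulting sample images.
scores = np.array([evaluate(emulate_isp(raw, p)) for p in sample_sets])

# Fit a minimal model (linear least squares) from sample values to sample
# scores, standing in for training the machine learning model.
X = np.hstack([sample_sets, np.ones((len(sample_sets), 1))])
W, *_ = np.linalg.lstsq(X, scores, rcond=None)
predicted = X @ W
```

Because the toy emulator is linear in the parameters, the fit recovers the score mapping exactly; a real ISP would need the nonlinear model the claim describes.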
- a method of predicting optimal values for a plurality of parameters used in an operation of an image signal processor includes: inputting initial values for the plurality of parameters to a machine learning model including an input layer having a plurality of input nodes, corresponding to the plurality of parameters, and an output layer having a plurality of output nodes, corresponding to a plurality of evaluation items extracted from a result image generated by the image signal processor; obtaining evaluation scores for the plurality of evaluation items using an output of the machine learning model; adjusting weights, applied to the plurality of parameters, based on the evaluation scores; and determining the optimal values using the adjusted weights.
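The weight-adjustment loop of this claim can be sketched as gradient-based optimization through a trained predictor. In the sketch below, a fixed linear predictor stands in for the trained machine learning model, and finite differences stand in for backpropagating the evaluation score to the parameters; the weights and ranges are illustrative assumptions.

```python
import numpy as np

# Hypothetical trained predictor mapping 2 parameter values to one
# combined evaluation score; the model weights are illustrative only.
MODEL_W = np.array([0.8, -0.5])

def predict_score(params):
    return float(MODEL_W @ params)

def optimize_parameters(params, lr=0.1, steps=50, eps=1e-4):
    params = params.astype(float)
    for _ in range(steps):
        # Finite-difference gradient of the predicted score with respect
        # to each parameter (a stand-in for backpropagation).
        grad = np.zeros_like(params)
        for i in range(len(params)):
            step = np.zeros_like(params)
            step[i] = eps
            grad[i] = (predict_score(params + step)
                       - predict_score(params - step)) / (2 * eps)
        # Adjust toward a higher predicted score, keeping each value
        # inside an assumed valid range of [0, 1].
        params = np.clip(params + lr * grad, 0.0, 1.0)
    return params

optimal = optimize_parameters(np.array([0.5, 0.5]))
```

With the illustrative weights, the loop pushes the positively weighted parameter to its upper bound and the negatively weighted one to its lower bound.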
- an electronic device includes an image signal processor and a parameter optimization module.
- the image signal processor is configured to process raw data, output by an image sensor, depending on a plurality of parameters to generate a result image.
- the parameter optimization module includes a machine learning model, receiving sample values for the plurality of parameters and outputting a plurality of sample scores indicating quality of sample images, the sample images being generated by the image signal processor processing the raw data based on the sample values, the parameter optimization module being configured to determine weights, respectively applied to the plurality of parameters, using the machine learning model.
- the image signal processor applies the weights to the plurality of parameters to generate a plurality of weighted parameters and generates the result image by processing the raw data using the weighted parameters.
- FIG. 1 is a block diagram of an image sensor according to an example embodiment
- FIGS. 2 and 3 are schematic diagrams of image sensors according to an example embodiment, respectively;
- FIG. 4 illustrates a pixel array of an image sensor according to an example embodiment
- FIG. 5 is a block diagram of an electronic device according to an exemplary embodiment of the inventive concept
- FIG. 6 is a flowchart illustrating a method of generating data that may be used to train a machine learning model for an image signal processor according to an exemplary embodiment of the inventive concept
- FIG. 7 illustrates a system that may use the machine learning model according to an exemplary embodiment of the inventive concept
- FIGS. 8 to 10 illustrate a method of training the machine learning model according to an exemplary embodiment of the inventive concept
- FIG. 11 illustrates a system for training the machine learning model according to an exemplary embodiment of the inventive concept
- FIG. 12 illustrates the machine learning model according to an exemplary embodiment of the inventive concept
- FIG. 13 is a flowchart illustrating a method of operating the machine learning model according to an exemplary embodiment of the inventive concept
- FIG. 14 illustrates a system providing a method of operating the machine learning model according to an exemplary embodiment of the inventive concept
- FIGS. 15 to 17 illustrate a method of operating the machine learning model according to an exemplary embodiment of the inventive concept
- FIG. 18 is a block diagram of an electronic device according to an exemplary embodiment of the inventive concept.
- FIGS. 19A and 19B illustrate an electronic device according to an exemplary embodiment of the inventive concept.
- FIGS. 20 and 21 illustrate an operation of an electronic device according to an exemplary embodiment of the inventive concept.
- FIG. 1 is a block diagram of an image sensor according to an exemplary embodiment of the inventive concept.
- an image sensor 10 includes a pixel array 11 , a row driver 12 (e.g., a row driving circuit), a readout circuit 13 , and a column driver 14 (e.g., a column driving circuit), control logic 15 (e.g., a logic or control circuit).
- the row driver 12 , the readout circuit 13 , the column driver 14 , and the control logic 15 may be circuits configured to control the pixel array 11 to generate image data, and may be incorporated into a controller.
- the image sensor 10 may convert light, transferred from an object 30 , into an electrical signal to generate raw data for generating an image.
- the raw data may be output to a processor 20 .
- the processor 20 may include an image signal processor (ISP) configured to generate an image using the raw data.
- the image signal processor is mounted in the image sensor 10 .
- the pixel array 11 may include a plurality of pixels PX.
- Each of the plurality of pixels PX may include an optoelectronic component configured to receive light and generate charges based on the received light, for example, a photodiode (PD).
- each of the plurality of pixels PX may include two or more optoelectronic components, such that each of the pixels PX generates a pixel signal corresponding to light of various wavelength bands or provides an autofocusing function.
- Each of the plurality of pixels PX may include a pixel circuit configured to generate a pixel signal from charges generated by one or more photodiodes.
- the pixel circuit includes a transmission transistor, a drive transistor, a select transistor, and a reset transistor.
- the pixel circuit may output a reset voltage and a pixel voltage using charges generated by the photodiodes.
- the pixel voltage may be a voltage reflecting charges generated by photodiodes included in each of the plurality of pixels PX.
- two or more adjacent pixels PX may constitute a single pixel group, and two or more pixels, belonging to a pixel group, may share at least some of a transmission transistor, a drive transistor, a select transistor, and a reset transistor with each other.
- the row driver 12 may drive the pixel array 11 in units of rows.
- the row driver 12 may generate a transmission control signal controlling a transmission transistor of a pixel circuit, a reset control signal controlling a reset transistor of the pixel circuit, and a select control signal controlling a select transistor of the pixel circuit.
- the readout circuit 13 may include at least one of a correlated double sampler (CDS) and an analog-to-digital converter (ADC).
- the correlated double sampler may be connected to pixels, included in a row line selected by a row select signal provided by the row driver 12 , through column lines and may perform correlated double sampling to detect a reset voltage and a pixel voltage.
- the analog-to-digital converter may output a digital signal after converting the reset voltage and the pixel voltage, detected by the correlated double sampler, into the digital signal.
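The reset-voltage/pixel-voltage subtraction performed by the correlated double sampler can be illustrated numerically. The signal model below is a simplifying assumption made for illustration: both samples of each pixel share a random reset offset, the CDS difference cancels it, and a 10-bit quantization stands in for the ADC.

```python
import numpy as np

rng = np.random.default_rng(1)

# Per-pixel reset offsets (e.g., from reset noise) shared by both samples.
offsets = rng.normal(0.0, 0.05, 16)
signal = np.linspace(0.1, 0.9, 16)   # light-dependent component per pixel

reset_voltage = offsets              # sampled just after reset
pixel_voltage = offsets + signal     # sampled after charge integration

# Correlated double sampling: subtracting the two samples removes the
# offset common to both, leaving only the light-dependent part.
cds_output = pixel_voltage - reset_voltage

# A simple 10-bit ADC model quantizing the CDS output to digital codes.
adc_codes = np.round(np.clip(cds_output, 0.0, 1.0) * 1023).astype(int)
```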
- the column driver 14 may include a latch circuit, a buffer, an amplifier circuit, and may temporarily store or amplify the digital signal, received from the readout circuit 13 , to generate image data. Operating timings of the row driver 12 , the readout circuit 13 , and the column driver 14 may be determined by the control logic 15 . As an example, the control logic 15 may be operated by a control instruction transmitted by the processor 20 . The processor 20 may signal-process the raw data, output by the column driver 14 and the control logic 15 , to generate an image and may output the image to a display device, or store the image in a storage device such as a memory.
- FIGS. 2 and 3 are schematic diagrams of image sensors according to exemplary embodiments, respectively.
- an image sensor 40 includes a first layer 41 , a second layer 42 provided below the first layer 41 , and a third layer 43 provided below the second layer 42 .
- the first layer 41 , the second layer 42 , and the third layer 43 may be stacked on one another in a vertical direction.
- the first layer 41 and the second layer 42 are stacked at a wafer level, and the third layer 43 is attached to a portion below the second layer 42 .
- the first to third layers 41 , 42 , and 43 may be provided as a single semiconductor package.
- the first layer 41 includes a sensing area SA, in which a plurality of pixels are provided, and a first pad area PA 1 provided around the sensing area SA.
- a plurality of upper pads PAD are included in the first pad area PA 1 .
- the plurality of upper pads PAD may be connected to pads and a logic circuit LC of the second layer 42 through a via or a wire.
- the pads of the second layer 42 may be provided in a second pad area PA 2 of the second layer 42 .
- Each of the plurality of pixels PX may include a photodiode configured to receive light and generate charges and a pixel circuit configured to process the charges generated by the photodiode.
- the pixel circuit may include a plurality of transistors configured to output a voltage corresponding to the charges generated by a photodiode.
- the second layer 42 may include a plurality of components configured to implement the control logic LC.
- the plurality of components implementing the logic circuit LC may include circuits configured to drive a pixel circuit provided on the first layer 41 , such as a row driver, a readout circuit, a column driver, and control logic.
- the plurality of components implementing the logic circuit LC may be connected to a pixel circuit through the first and second pad areas PA 1 and PA 2 .
- the logic circuit LC may obtain the reset voltage and the pixel voltage from the plurality of pixels PX to generate a pixel signal.
- At least one of the plurality of pixels PX includes a plurality of photodiodes disposed on the same level. Pixel signals, generated from charges of each of the plurality of photodiodes, may have a phase difference from each other.
- the logic circuit LC may provide an autofocusing function based on a phase difference of pixel signals generated from a plurality of photodiodes included in a single pixel PX.
- the third layer 43 may include a memory chip MC, a dummy chip DC and an encapsulation layer EN encapsulating the memory chip MC and the dummy chip DC.
- the memory chip MC may be a dynamic random access memory (DRAM) or a static random access memory (SRAM).
- the dummy chip DC does not store data.
- the dummy chip DC may be omitted.
- the memory chip MC may be electrically connected to at least some of the components, included in the logic circuit LC of the second layer 42 , by a bump, a via, or a wire, and may store data required to provide an autofocusing function.
- the bump is a microbump.
- an image sensor 50 includes a first layer 51 and a second layer 52 .
- the first layer 51 includes a sensing area SA in which a plurality of pixels PX are provided, a logic circuit LC in which components for driving the plurality of pixels PX are provided, and a first pad area PA 1 provided around the sensing area SA and the logic circuit LC.
- a plurality of upper pads PAD are included in the first pad area PA 1 .
- the plurality of upper pads PAD may be connected to a memory chip MC, provided on the second layer 52 , through a via or a wire.
- the second layer 52 may include a memory chip MC and a dummy chip DC, and an encapsulation layer EN encapsulating the memory chip MC and the dummy chip DC.
- the dummy chip DC may be omitted.
- FIG. 4 illustrates a pixel array of an image sensor according to an exemplary embodiment of the inventive concept.
- a pixel array PA of an image sensor includes a plurality of pixels PX.
- the plurality of pixels may be connected to a plurality of row lines ROW 1 to ROWm (ROW) and a plurality of column lines COL 1 to COLn (COL).
- a given pixel of the pixels PX may be connected to a given row line of the row lines ROW 1 to ROWm and to a given column line of the column lines COL 1 to COLn.
- the image sensor may drive the plurality of pixels PX in units of the plurality of row lines ROW.
- time required for driving a selected driving line among the plurality of row lines ROW and reading the reset voltage and the pixel voltage from pixels PX connected to the selected driving line may be defined as one horizontal cycle.
- the image sensor may operate in a rolling shutter manner, in which a plurality of the pixels PX are sequentially exposed to light, or a global shutter manner in which a plurality of the pixels PX are simultaneously exposed to light.
- a reset voltage and a pixel voltage, output from each of the plurality of pixels PX, may be converted into digital data and may be processed as raw data through predetermined signal processing.
- An image signal processor mounted in the image sensor, or an additional processor communicating with the image sensor, may generate a result image displayed on a display or stored in a memory. Accordingly, different result images may be generated from the same raw data depending on the performance or tuning method of the image signal processor. Thus, a user may be provided with an optimal result image by improving the performance of the image signal processor or by precisely tuning it.
- the performance of the image signal processor may be improved by providing a method of modeling the image signal processor that leaves no room for a person's subjective judgment to intervene.
- a user may be provided with an optimal result image by tuning the image signal processor in consideration of the user's desire.
- FIG. 5 is a block diagram of an electronic device according to an exemplary embodiment of the inventive concept.
- an electronic device 100 includes an image sensor 110 , a processor 120 , a memory 130 , and a display 140 .
- the processor 120 may control overall operation of the electronic device 100 , and may be implemented by a central processing unit (CPU), an application processor (AP), or a system on chip (SoC).
- the image sensor 110 and the image signal processor are mounted on a single integrated circuit chip.
- the image sensor 110 may generate raw data in response to external light and may transmit the raw data to the processor 120 .
- the processor 120 may include an image signal processor 121 configured to signal-process the raw data to generate a result image.
- the image signal processor 121 may adjust a plurality of parameters associated with the raw data and signal-process the raw data according to the adjusted parameters to generate a result image.
- the parameters may include two or more of color, blurring, sharpness, noise, a contrast ratio, resolution, and a size. In an alternate embodiment, the parameters may include only one of color, blurring, sharpness, noise, a contrast ratio, resolution, and a size.
- the result image, output by the image signal processor 121 may be stored in the memory 130 or may be displayed on the display 140 .
- the processor 120 may include a parameter optimization module 122 .
- the parameter optimization module 122 and the image signal processor are mounted in a single integrated circuit.
- the parameter optimization module 122 may adjust weights given to the plurality of parameters, and characteristics of the result image, output by the image signal processor 121 , can be changed depending on the adjusted weights.
- the parameter optimization module 122 adjusts a color, one of the plurality of parameters, to output a warm-tone result image or a cold-tone result image from the same raw data. For example, when a first weight is applied to the color parameter to generate a first weighted parameter, the image signal processor 121 processing the raw data using the first weighted parameter outputs a warm-tone result image. For example, when a second weight different from the first weight is applied to the color parameter to generate a second weighted parameter, the image signal processor 121 processing the raw data using the second weighted parameter outputs a cold-tone result image.
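As a toy illustration of the warm-tone/cold-tone example above, a weighted color parameter can be mapped to channel gains. The mapping from weight to tone below is an assumption for illustration only, not the disclosed tuning method.

```python
# Illustrative sketch: a scalar "color" parameter scaled by a weight
# before a stubbed ISP turns it into red/blue channel gains.
def weighted_color_gains(color_param, weight):
    w = color_param * weight                    # weighted parameter
    return {"r_gain": 1.0 + 0.2 * (w - 1.0),    # larger weight boosts red (warm)
            "b_gain": 1.0 - 0.2 * (w - 1.0)}    # ... and cuts blue

warm = weighted_color_gains(1.0, 1.5)   # first weight  -> warm-tone result
cold = weighted_color_gains(1.0, 0.5)   # second weight -> cold-tone result
```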
- the weight, applied to the plurality of parameters by the parameter optimization module 122 is determined by a modeling method performed in advance.
- the weight, applied to the plurality of parameters by the parameter optimization module 122 may be adaptively adjusted based on user feedback.
- a weight may be determined by a modeling method using a machine learning model, significantly reducing the possibility of a person's subjective evaluation intervening and improving the performance of the image signal processor 121 while the image signal processor 121 is tuned accurately and objectively.
- FIG. 6 is a flowchart illustrating a method of modeling an image signal processor according to an exemplary embodiment of the inventive concept
- FIG. 7 illustrates a system providing a method of modeling an image signal processor according to an exemplary embodiment of the inventive concept.
- a method of modeling an image signal processor includes capturing an image of a sample subject to obtain sample data (S 100 ).
- a system 200 for modeling an image signal processor may include an electronic device 210 including an image sensor, a sample subject 220 , and a computer device 230 in which a modeling method is executed.
- Although the electronic device 210 is illustrated as a camera, it may be replaced with another device including an image sensor.
- Although the computer device 230 is illustrated as a desktop computer, it may be replaced with another device capable of executing the modeling method.
- the electronic device 210 and the computer device 230 are implemented as a single device.
- the sample subject 220 may be a test chart.
- the sample subject 220 may include a plurality of capturing regions 221 to 223 (regions of interest), which may be different from each other.
- a first capturing region 221 may be a region in which people are displayed
- a second capturing region 222 may be a region in which a black-and-white pattern is displayed
- a third capturing region 223 may be a region in which a color pattern is displayed.
- the sample data, obtained by the electronic device 210 capturing the sample subject 220 may be raw data.
- the raw data may be transferred to the computer device 230 including an image signal processor (ISP) simulator.
- the ISP simulator is capable of simulating different types of image signal processors.
- the ISP simulator could emulate one or more of the image signal processors processing the raw data to generate an image.
- Emulating a given image signal processor processing the raw data may include the given image signal processor processing the raw data using one or more parameters.
- a given parameter may be settable to only certain values, where each setting has a different effect. For example, if the given parameter is settable to only a first value or a second, different value, emulating an image signal processor processing the raw data with the given parameter set to the first value could result in a first image, while emulating the image signal processor processing the same raw data with the given parameter set to the second value could result in a second image different from the first image.
- the computer device 230 sets parameters used in an operation of an image signal processor to respective sample values (S 110 ).
- the computer device 230 signal-processes the raw data using the image signal processor simulator to emulate the image signal processor processing the sample data using the sample values to generate a plurality of sample images (S 120 ).
- a plurality of sample scores of evaluation items are obtained for each of the plurality of sample images (S 130 ).
- the plurality of sample scores may be scores calculated from a plurality of evaluation items selected to evaluate each of the plurality of sample images.
- the plurality of evaluation items may include at least one of an image color, resolution, a dynamic range, shading, sharpness, texture loss, and noise. For example, if only resolution is considered and a first sample image has a lower resolution than a second sample image, the first sample image could receive a lower score than the second sample image.
- the computer device 230 stores the sample values for the parameters, the sample images, and sample scores, in a database (DB) (S 140 ).
- the database may include a mapping of each parameter to a respective sample value.
- the sample values, sample images, and sample scores stored in the database (DB) may be used to train a machine learning model to infer the performance of the image signal processor or to predict the quality of an image that will be produced by the image signal processor when parameters having certain values are used during processing of raw data.
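The database of step S 140 can be sketched as a simple relational store holding, per sample, the sample values, a reference to the sample image, and the sample scores. The table layout and column names below are illustrative assumptions; the third score uses the 1979.25 value from the example later in the description.

```python
import sqlite3
import json

# Hypothetical layout for the database (DB) of step S140.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE samples (
    id INTEGER PRIMARY KEY,
    param_values TEXT,   -- JSON list: sample values for the parameters
    image_path TEXT,     -- where the emulated sample image is stored
    scores TEXT          -- JSON list: one sample score per evaluation item
)""")

record = ([0.7, 1.2, 0.3], "sample_0001.png", [70.37, 62.29, 1979.25])
con.execute(
    "INSERT INTO samples (param_values, image_path, scores) VALUES (?, ?, ?)",
    (json.dumps(record[0]), record[1], json.dumps(record[2])),
)

# A training job would later read these rows back as (values, scores) pairs.
row = con.execute("SELECT param_values, scores FROM samples").fetchone()
values, scores = json.loads(row[0]), json.loads(row[1])
```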
- FIGS. 8 to 10 illustrate a method of modeling an image signal processor according to an exemplary embodiment of the inventive concept.
- FIG. 8 may be a schematic diagram of a system for modeling an image signal processor.
- the system 300 includes a simulator 310 , an evaluation framework 320 , and a database 330 .
- the simulator 310 receives sample data 301 , which is raw data obtained by capturing an image of a sample subject such as a test chart.
- the simulator 310 may include a parameter generator 311 , configured to determine a plurality of sample values for a plurality of parameters 232 used in an operation of an image signal processor, and an ISP simulator 312 configured to simulate (or emulate) the image signal processor operating on the sample data 301 using the sample values of the parameters 232 .
- the parameter generator 311 may determine sample values of the parameters 232 such as image color, blurring, noise, a contrast ratio, resolution, and size.
- At least one of the parameters may be classified into a plurality of detailed parameters according to an embodiment. For example, there may be a plurality of detailed parameters for noise and a plurality of detailed parameters for color.
- the ISP simulator 312 may signal-process the sample data 301 using the plurality of sample values 332 , determined for the plurality of parameters by the parameter generator 311 , to generate sample images 331 .
- the operation of the simulator 310 will be described in more detail with reference to FIG. 9 .
- the parameter generator 311 generates a plurality of sample values for first through sixth parameters.
- the parameter generator 311 may generate first through sixth sample sets having different sample values for the first through sixth parameters.
- the first to sixth parameters are parameters used in an operation of the image signal processor.
- the number and types of the parameters may be variously changed.
- the number of sample sets, generated by setting the sample values for the parameters by the parameter generator 311 may also be variously changed.
- When the sample sets are determined, the ISP simulator 312 generates the first through sixth sample images 410 to 460 ( 400 ) by setting the parameters to each of the sample sets and simulating (or emulating) the operation of the image signal processor on the sample data 301 using each of the sample sets. For example, the ISP simulator 312 may emulate the image signal processor processing raw data of the sample data 301 using the six sample parameters set to their respective values in the first sample set to generate the first sample image 410 , emulate the image signal processor processing the raw data using the six sample parameters set to their respective values in the second sample set to generate the second sample image 420 , and so on.
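The first through sixth sample sets can be pictured as a grid with one row per sample set and one column per parameter, each row then driving one emulation run. The sketch below makes that concrete under stated assumptions: random sample values and a one-line stand-in for the ISP simulator.

```python
import numpy as np

rng = np.random.default_rng(42)

NUM_PARAMS = 6   # first through sixth parameters (types vary by embodiment)
NUM_SETS = 6     # first through sixth sample sets

# Each row is one sample set: one sample value per parameter.
sample_sets = rng.uniform(0.0, 1.0, (NUM_SETS, NUM_PARAMS))

def emulate_isp(raw, params):
    # Hypothetical stand-in for the ISP simulator 312: one emulation
    # run per sample set produces one sample image.
    return raw * params.mean()

raw = np.full((4, 4), 0.5)   # toy raw data from the sample subject
sample_images = [emulate_isp(raw, p) for p in sample_sets]
```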
- the sample images 400 are images generated from the sample data 301 obtained by capturing an image of the same sample subject. Since the sample images 400 are generated by the ISP simulator 312 using different sample sets, they may have different quality and/or characteristics.
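- The flow above (sample sets in, sample images out) can be sketched as follows. The parameter ranges and the toy `simulate_isp` function are illustrative stand-ins assumed for this sketch, not the actual parameter generator 311 or ISP simulator 312:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical ranges for six ISP parameters (e.g., color gain, blur,
# noise-reduction strength, contrast ratio, resolution scale, size scale).
PARAM_RANGES = [(0.5, 2.0), (0.0, 3.0), (0.0, 1.0),
                (0.5, 1.5), (0.5, 1.0), (0.5, 1.0)]

def generate_sample_sets(n_sets):
    """Stand-in for the parameter generator 311: draw n_sets sample sets,
    each holding one sample value per parameter."""
    lows = np.array([lo for lo, _ in PARAM_RANGES])
    highs = np.array([hi for _, hi in PARAM_RANGES])
    return rng.uniform(lows, highs, size=(n_sets, len(PARAM_RANGES)))

def simulate_isp(raw, params):
    """Stand-in for the ISP simulator 312: produce one sample image from
    the raw data, here using only two of the six parameter values."""
    contrast, noise = params[3], params[2]
    return np.clip(raw * contrast + rng.normal(0.0, 0.01 * noise, raw.shape),
                   0.0, 1.0)

raw_data = rng.uniform(0.0, 1.0, size=(8, 8))   # one capture of the sample subject
sample_sets = generate_sample_sets(6)            # first through sixth sample sets
sample_images = [simulate_isp(raw_data, p) for p in sample_sets]
```

- Because every sample image comes from the same raw capture, any difference between the six images is attributable to the sample sets alone, which is what makes the pairs usable as training data later.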
- the evaluation framework 320 receives sample images, generated by the simulator 310 , to evaluate the quality of the sample images.
- the evaluation framework 320 may evaluate each of the sample images for a plurality of evaluation items and may express a result of the evaluation as sample scores 333 .
- the evaluation framework 320 may obtain sample scores 333 of evaluation items such as resolution 321 , texture loss 322 , sharpness 323 , noise 324 , a dynamic range 325 , shading 326 , and a color 327 , for each of the sample images.
- this will be described in more detail with reference to FIG. 10 .
- the evaluation framework 320 receives the sample images 400 to obtain sample scores 333 for a plurality of evaluation items.
- sample scores for the first through third evaluation items may be obtained by evaluating each of the sample images 400 .
- the first evaluation item could be resolution 321
- the second evaluation item could be sharpness 323
- the third evaluation item could be noise 324 .
- the evaluation framework 320 may classify and store the sample scores depending on the sample images 400 .
- a lowest point and a highest point of each of the sample scores may vary depending on the evaluation items.
- the first sample image 410 includes a first score of 70.37 for the first evaluation item, a second score of 62.29 for the second evaluation item, and a third score of 1979.25 for the third evaluation item.
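- A toy sketch of such an evaluation framework follows; the metrics below are simple proxies assumed for illustration (gradient magnitude for sharpness, variance for noise), not the framework's actual algorithms. It shows why each evaluation item can use its own scale, as with the noise score of 1979.25 above:

```python
import numpy as np

def evaluate_image(image):
    """Toy evaluation framework: return one sample score per evaluation
    item. Each item uses its own scale, so a noise score can be in the
    thousands while other scores stay below 100 (proxy metrics only)."""
    gy, gx = np.gradient(image.astype(float))
    return {
        "resolution": float(image.shape[0] * image.shape[1]) / 100.0,
        "sharpness": float(np.hypot(gx, gy).mean()) * 100.0,
        "noise": float(image.var()) * 1e4,
    }

sample_image = np.linspace(0.0, 1.0, 64).reshape(8, 8)
scores = evaluate_image(sample_image)
```

- Classifying and storing such per-image score dictionaries is then enough to reproduce the table-like structure the framework keeps for the sample images 400.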
- the database 330 may be established.
- the database 330 includes sample images 331 and sample values 332 of the plurality of parameters, generated by the simulator 310 , and sample scores 333 obtained by evaluating the sample images 331 for the plurality of evaluation items 321 to 327 by the evaluation framework.
- the sample images 331 , the sample values 332 of the plurality of parameters, and the sample scores 333 , stored in the database 330 may be used to train the machine learning model.
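- The contents of the database 330 can be pictured as rows pairing each sample set with its scores. The field names and parameter values below are hypothetical; the three scores are the first sample image's example scores from the description above:

```python
# Hypothetical row layout for the database 330: each row links one sample
# set of parameter values to the sample scores its sample image received.
database = [
    {"sample_values": [0.9, 1.1, 0.4, 1.0, 0.8, 0.7],        # illustrative
     "sample_scores": {"item1": 70.37, "item2": 62.29, "item3": 1979.25}},
]

def training_pairs(db):
    """Yield (input vector, target vector) pairs for model training:
    parameter sample values in, evaluation sample scores out."""
    for row in db:
        yield row["sample_values"], list(row["sample_scores"].values())

pairs = list(training_pairs(database))
```

- Each yielded pair is exactly the (input, target) shape the machine learning model needs, which is why the database can feed training directly.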
- the machine learning model, trained by data stored in the database 330 may be a model for predicting the quality of a result image output by the image signal processor.
- this will be described in more detail with reference to FIGS. 11 and 12 .
- FIG. 11 illustrates a system providing a method of modeling an image signal processor according to an exemplary embodiment of the inventive concept
- FIG. 12 illustrates a machine learning model employed in a method of modeling an image signal processor according to an exemplary embodiment of the inventive concept.
- a system 500 may operate in cooperation with a database 600 .
- the database 600 may be a database established by the modeling method described with reference to FIGS. 8 to 10 , and may include sample images, sample values for a plurality of parameters, and sample scores for a plurality of evaluation items.
- the machine learning model trainer 510 trains a machine learning model 700 to predict the quality of an image produced by a given image signal processor whose parameters have certain values, using sample values 501 of the parameters and sample scores 502 stored in the database 600 .
- the sample values 501 of the parameters may be at least one of first to sixth sample sets set in the same manner as described in the example embodiment with reference to FIG. 9 .
- a first sample score set in the example embodiment, illustrated in FIG. 10 may be selected as the sample scores 502 .
- the machine learning model trainer 510 may input sample values, included in the first sample set, to the machine learning model 700 .
- the machine learning model trainer 510 trains the machine learning model 700 until the output of the machine learning model 700 matches the evaluation scores of the first sample score set, or until a difference between the output and the evaluation scores of the first sample score set becomes less than or equal to a reference difference.
- the machine learning model 700 may be implemented by an artificial neural network (ANN).
- the machine learning model 700 includes an input layer 710 , a hidden layer 720 , and an output layer 730 .
- a plurality of nodes, included in the input layer 710 , the hidden layer 720 , and the output layer 730 may be connected to each other in a fully connected manner.
- the input layer 710 includes a plurality of input nodes x 1 to x i .
- the number of the input nodes x 1 to x i corresponds to the number of parameters.
- the output layer 730 includes a plurality of output nodes y 1 to y j .
- the number of the output nodes y 1 to y j corresponds to the number of evaluation items.
- the hidden layer 720 includes first to third hidden layers 721 to 723 , and the number of the hidden layers 721 to 723 may be variously changed.
- the machine learning model 700 may be trained by adjusting weights of the hidden nodes included in the hidden layer 720 .
- the first to sixth sample sets are input to the input layer 710 and the weights of the hidden nodes, included in the hidden layer 720 , may be adjusted until the values output to the output layer 730 correspond to the first to sixth sample score sets. Accordingly, after the training has been completed, the quality of a result image output by the image signal processor may be inferred using the machine learning model 700 when the parameters have predetermined values.
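- A minimal numeric sketch of this training loop, using a one-hidden-layer network in plain NumPy (the real model may use more hidden layers, as three are shown in FIG. 12). The six parameter rows and three target scores are random placeholders standing in for the sample sets and sample score sets:

```python
import numpy as np

rng = np.random.default_rng(1)

# Placeholder training data: six sample sets (rows of six parameter
# values) and their scores for three evaluation items, scaled to [0, 1].
X = rng.uniform(0.0, 1.0, size=(6, 6))
Y = rng.uniform(0.0, 1.0, size=(6, 3))

# One hidden layer: input width = number of parameters (6),
# output width = number of evaluation items (3).
W1 = rng.normal(0.0, 0.5, size=(6, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, size=(16, 3)); b2 = np.zeros(3)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

initial_mse = float(((forward(X)[1] - Y) ** 2).mean())
lr = 0.05
for _ in range(2000):                 # adjust hidden-node weights until the
    h, pred = forward(X)              # outputs approach the sample scores
    err = pred - Y
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    gh = (err @ W2.T) * (1.0 - h ** 2)
    gW1 = X.T @ gh / len(X); gb1 = gh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

final_mse = float(((forward(X)[1] - Y) ** 2).mean())
```

- Once `final_mse` is below the reference difference, `forward` plays the role of the trained model 700: feed it any parameter values and read off predicted evaluation scores without running the image signal processor.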
- FIG. 13 is a flowchart illustrating a method of modeling an image signal processor according to an exemplary embodiment of the inventive concept.
- a method of modeling an image signal processor includes setting initial values for a plurality of parameters applied to an image signal processor (S 200 ).
- the plurality of parameters applied to the image signal processor may include a color, blurring, noise, a contrast ratio, a resolution, and a size of an image as parameters used in an operation of the image signal processor.
- the machine learning model may be a model trained to predict the quality of the resulting image output by the image signal processor.
- An output of the machine learning model may vary depending on the values of the parameters applied to the image signal processor.
- a training process of the machine learning model may be understood based on the example embodiment described above with reference to FIGS. 11 and 12 .
- Evaluation scores for a plurality of evaluation items are obtained using the output of the machine learning model (S 220 ).
- the machine learning model is a model trained to predict the quality of a result image generated by the image signal processor signal-processing raw data, and the output of the machine learning model corresponds to the evaluation scores of a plurality of evaluation items.
- the plurality of evaluation items may include a color, sharpness, noise, resolution, a dynamic range, shading, and texture loss of an image.
- weights applied to the parameters are adjusted based on the obtained evaluation scores for the plurality of evaluation items (S 230 ).
- each of the evaluation scores may be compared with predetermined reference scores and, when there is an evaluation score which does not reach a reference score, the weight applied to at least one of the parameters may be increased or decreased such that the corresponding evaluation score may be increased.
- the evaluation score, output by the machine learning model may be compared with a reference score while changing a weight by a predetermined number of times.
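- The S200–S230 loop can be sketched as follows. Here `predict_scores` is a closed-form stand-in assumed in place of the trained machine learning model, and the targets, learning rate, and finite-difference scheme are all illustrative choices, not the patent's method:

```python
import numpy as np

def predict_scores(weighted_params):
    """Closed-form stand-in for the trained machine learning model: maps
    weighted parameter values to three evaluation scores."""
    return np.array([weighted_params.sum(),
                     2.0 * weighted_params[0],
                     1.0 - weighted_params[1]])

params = np.full(6, 0.5)             # initial parameter values (S200)
weights = np.ones(6)                 # one weight per parameter
targets = np.array([3.5, 1.4, 0.8])  # reference/target scores
reference_diff = 1e-3
eps, lr = 1e-4, 0.05

for _ in range(500):                 # bounded number of adjustments (S230)
    diff = predict_scores(weights * params) - targets
    if np.abs(diff).max() <= reference_diff:
        break                        # every score reached its reference
    # finite-difference gradient of the squared error w.r.t. each weight
    grad = np.zeros_like(weights)
    base = float((diff ** 2).sum())
    for i in range(len(weights)):
        w = weights.copy(); w[i] += eps
        d = predict_scores(w * params) - targets
        grad[i] = (float((d ** 2).sum()) - base) / eps
    weights -= lr * grad

final_error = float(np.abs(predict_scores(weights * params) - targets).max())
initial_error = float(np.abs(predict_scores(np.ones(6) * params) - targets).max())
```

- Treating the model as a black box (hence the finite differences) mirrors the described flow: only the weights in front of the input layer change, never the model itself.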
- FIGS. 14 to 17 illustrate a method of modeling an image signal processor according to an exemplary embodiment of the inventive concept.
- FIG. 14 is a schematic diagram of a system for providing a method of modeling an image signal processor according to an exemplary embodiment of the inventive concept.
- a system 800 according to an exemplary embodiment includes a parameter adjusting module 810 , a machine learning model 820 , and a feedback module 830 .
- the parameter adjusting module 810 may adjust values input to a machine learning model 820 .
- the machine learning model 820 may receive parameters used in an operation of an image signal processor, and may output evaluation scores indicating quality and/or characteristics of a resulting image generated by the image signal processor operating depending on values of the parameters. Accordingly, the parameter adjusting module 810 may adjust the values of the parameters used in the operation of the image signal processor. For example, the parameter adjusting module 810 may adjust weights applied to the parameters. When initial values of parameters 801 are input, the parameter adjusting module 810 may apply predetermined weights to the initial values of the parameters to generate weighted values of the parameters, and input the weighted values to the machine learning model 820 .
- the machine learning model 820 may be a model trained to predict the quality of the resulting image generated by the image signal processor.
- An output of the machine learning model 820 may correspond to an evaluation score of the evaluation items indicating quality of a result image.
- the feedback module 830 compares an output of the machine learning model 820 with a target score of the evaluation items and transmits a result of the comparison to the parameter adjusting module 810 .
- the parameter adjusting module 810 adjusts weights applied to the parameters, with reference to a comparison result transmitted by the feedback module 830 .
- the parameter adjusting module 810 may adjust the weights applied to the parameters a predetermined number of times, or until the difference between the evaluation scores output by the machine learning model 820 and the target scores is reduced to be less than or equal to a reference difference.
- when the adjustment is complete, optimized ISP parameters 802 (e.g., parameters set to optimal values) may be output.
- the parameters set to the optimal values may be used to tune an image signal processor.
- the parameters set to the optimal values may be output to the image signal processor for storage on the image signal processor and then the image signal processor can use the parameters set to these values when performing a subsequent operation (e.g., process raw data to generate an image).
- the system 800 may adjust the weights applied to the parameters, by considering feedback from a user of an electronic device in which an image signal processor is mounted.
- the system 800 may be mounted in the electronic device together with the image signal processor and may adaptively adjust the weights with reference to the feedback from the user.
- FIGS. 15 to 17 are provided to illustrate a method of modeling an image signal processor according to an exemplary embodiment of the inventive concept.
- initial values may be set for first to sixth parameters.
- the initial values may be any values generated at random.
- a predetermined weight may be applied to the initial values at an input layer IL before they are input to the machine learning model.
- the input layer IL may receive a plurality of input values, and the plurality of input values may correspond to parameters used in an operation of the image signal processor.
- a weight may be applied to the plurality of input values, and the plurality of input values and the weight may be connected in a fully connected manner or a partially connected manner. When the plurality of input values and the weight correspond to each other in the partially connected manner, the weight is not connected to at least one of the plurality of input values.
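- The fully versus partially connected weighting can be sketched with a binary mask; the particular values and the choice of which inputs are left unweighted are hypothetical:

```python
import numpy as np

params = np.array([0.8, 1.2, 0.5, 1.0, 0.9, 0.7])   # six input values
weights = np.array([1.1, 0.9, 1.3, 1.0, 0.8, 1.2])  # one weight each

# Fully connected: every input value receives its weight.
fully_weighted = weights * params

# Partially connected: a mask leaves some inputs unweighted; here the
# weight is not connected to the fourth and sixth inputs (hypothetical).
mask = np.array([True, True, True, False, True, False])
partially_weighted = np.where(mask, weights * params, params)
```

- Under the mask, the unconnected inputs pass through unchanged, which is exactly the "weight is not connected to at least one of the plurality of input values" case.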
- the machine learning model may output at least one output value to an output layer OL using the plurality of weighted input values.
- the output value may correspond to an evaluation score of an evaluation item which may indicate the quality of the image generated by the image signal processor.
- the number of input values included in the input layer IL, and the number of output values included in the output layer OL, may be variously changed according to exemplary embodiments.
- FIG. 17 is a graph illustrating variation of evaluation scores y 1 to y 4 depending on the number of times of training of the machine learning model.
- the output layer OL outputs the evaluation scores y1 to y4 for four evaluation items.
- this assumption is merely an example, as the shape of the layer is not limited thereto.
- When the machine learning model outputs first to fourth evaluation scores y 1 to y 4 , the first to fourth evaluation scores y 1 to y 4 are compared with the first to fourth target scores, respectively. At least one of the weights, applied to the plurality of input values, may be changed depending on a result of the comparison. In the exemplary embodiment illustrated in FIG. 17 , weights applied to hidden nodes of a hidden layer included in the machine learning model are not adjusted, while weights applied to the plurality of input values in the input layer IL of the machine learning model are adjusted.
- the first to fourth evaluation scores y 1 to y 4 output by the machine learning model may be approximated to each of the first to fourth target scores.
- At least one of the weights may be adjusted until a predetermined number of times of training completes or until a difference between the first to fourth evaluation scores y 1 to y 4 and the first to fourth target scores is reduced to be less than a reference difference.
- weights may be determined. The determined weights may be assigned to input values of the input layer IL, corresponding to parameters used in an operation of the image signal processor, in the fully connected manner or the partially connected manner.
- Using raw data obtained by capturing an image of a sample subject, the image signal processor may be tuned such that images output by the image signal processor satisfy predetermined evaluation conditions.
- since the image signal processor is tuned using the raw data obtained by capturing an image of the sample subject, a relatively long time may be required.
- since the tuning depends on a person's subjective evaluation, it may be difficult to objectively and precisely tune the image signal processor.
- image data obtained by capturing an image of at least one sample subject is processed by the image signal processor simulator according to sample values of various parameters to generate sample images.
- Sample scores, obtained by evaluating the sample images, and sample values of the parameters may be stored in a database. Since the sample scores and sample values of the parameters stored in the database are numerical items, an effect of a person's subjective evaluation may be significantly reduced.
- a machine learning model trained to receive the sample values of the parameters and to output sample scores may be prepared. Weights, applied to the parameters, may be adjusted such that evaluation scores output by the machine learning model receiving initial values of the parameters, reach target scores.
- an image signal processor is tuned by adjusting weights applied to parameters used in an operation of the image signal processor using numerical items, so the effect of a person's subjective evaluation may be significantly reduced and the image signal processor may be objectively and precisely tuned.
- the image signal processor may be adaptively tuned for a user by considering an end-user's preferences in the processes of comparing the evaluation scores output by the machine learning model with target scores and adjusting the weights of the parameters.
- FIG. 18 is a block diagram of an electronic device according to an exemplary embodiment of the inventive concept.
- An electronic device 900 includes a display 910 , an image sensor 920 , a memory 930 , a processor 940 , and a port 950 .
- the electronic device 900 may further include a wired/wireless communications device and a power supply.
- the port 950 may be provided for the electronic device 900 to communicate with a video card, a sound card, a memory card, and a universal serial bus (USB) device.
- the electronic device 900 may conceptually include all devices, which employ the image sensor 920 , in addition to a smartphone, a tablet personal computer (PC), and a digital camera.
- the processor 940 may perform a specific operation, command, or task.
- the processor 940 may be a central processing unit (CPU) or a system on chip (SoC), and may communicate with the display 910 , the image sensor 920 , and the memory 930 as well as other devices connected to the port 950 through a bus 960 .
- CPU central processing unit
- SoC system on chip
- the processor 940 may include an image signal processor 941 .
- the image signal processor 941 generates a result image using raw data generated by the image sensor 920 capturing an image of a subject.
- the processor 940 may display the result image generated by image signal processor 941 on the display 910 and may store the result image in memory 930 .
- the memory 930 may be a storage medium configured to store data necessary for an operation of the electronic device 900 or multimedia data.
- the memory 930 may include a volatile memory such as random access memory (RAM) or a nonvolatile memory such as a flash memory.
- RAM random access memory
- the memory 930 may also include at least one of a solid state drive (SSD), a hard disk drive (HDD), and an optical drive (ODD) as a storage device.
- SSD solid state drive
- HDD hard disk drive
- ODD optical drive
- the memory 930 may include a machine learning model 931 such as the machine learning model 700 .
- the machine learning model 931 may receive parameters used in an operation of the image signal processor 941 , and may output evaluation scores of evaluation items indicating a quality of the result image generated by the image signal processor 941 using the parameters.
- the parameters input to the machine learning model 931 may include a color, blurring, noise, a contrast ratio, a resolution, and a size of an image.
- the evaluation scores, output by the machine learning model 931 may correspond to evaluation items such as a color, sharpness, noise, a resolution, a dynamic range, shading, and texture loss of the image.
- the electronic device 900 may adaptively adjust weights applied to the parameters used in the operation of the image signal processor 941 , using the machine learning model 931 .
- the electronic device 900 does not train the machine learning model 931 itself and merely adjusts the weights applied to the parameters in a front end of an input layer of the machine learning model 931 .
- the image signal processor 941 may be tuned for a user without a great computational burden.
- FIGS. 19A and 19B illustrate examples of electronic devices that may include the electronic device 900 .
- an electronic device 1000 is a mobile device such as a smartphone.
- the electronic device 1000 is not limited to a mobile device such as a smartphone.
- the electronic device 1000 may be any device including a camera which captures an image.
- the electronic device 1000 includes a housing 1001 , a display 1002 , and cameras 1005 and 1006 .
- the display 1002 substantially covers an entire front surface of the housing 1001 and includes a first region 1003 and a second region 1004 , depending on an operating mode of the electronic device 1000 or an application which is being executed.
- the display 1002 may be provided integrally with a touch sensor configured to sense a user's touch input.
- the cameras 1005 and 1006 may include a general camera 1005 and a time-of-flight (ToF) camera 1006 .
- the general camera 1005 may include a first camera 1005 A and a second camera 1005 B.
- the first camera 1005 A and the second camera 1005 B may be implemented with image sensors having different angles of view, different aperture values, or a different number of pixels. Due to a thickness of the housing 1001 , it may be difficult to employ a zoom lens for adjusting an angle of view and an aperture value in the general camera 1005 .
- the first camera 1005 A and the second camera 1005 B may provide an image capturing function satisfying user's various needs.
- the ToF camera 1006 may be combined with an additional light source to generate a depth map.
- the ToF camera 1006 may provide a face recognition function.
- the ToF camera 1006 may operate in combination with an infrared light source.
- a camera 1007 and a light emitting unit 1008 may be disposed on the rear surface. Similar to the camera 1005 disposed on a front surface of the electronic device 1000 , the camera 1007 includes a plurality of cameras 1007 A to 1007 C having at least one of different aperture values, different angles of view, and a different number of pixels of the image sensor.
- the light emitting unit 1008 may employ a light emitting diode (LED) as a light source and may operate as a flash when capturing images using the camera 1007 .
- LED light emitting diode
- an electronic device 1000 having two or more cameras 1005 to 1007 mounted therein, may provide various image capturing functions.
- An image signal processor, mounted in the electronic device 1000 needs to be appropriately tuned to improve the quality of a result image captured by the cameras 1005 to 1007 .
- the image signal processor mounted in the electronic device 1000 , may process raw data generated by the cameras 1005 to 1007 depending on values of a plurality of parameters to generate a result image. Quality or characteristics of the result image may depend on the values of the parameters, applied to the image signal processor, in addition to the raw data.
- weights are applied to the parameters used in an operation of the image signal processor to generate weighted parameters, and the quality and characteristics of the result image are improved by adjusting the weights.
- weights are applied to the parameters used in an operation of the image signal processor to generate weighted parameters, and a user of the electronic device 1000 adjusts the weights to generate a preferred result image.
- the electronic device 1000 may directly receive feedback from the user to adjust the weights applied to the parameters.
- a color, sharpness, and a contrast ratio of the user's preferred image may be accumulated depending on a capturing site (e.g., the location where an image of the subject was captured), a capturing time (e.g., a time when the image of the subject was captured), and type of a captured subject, and thus, weights of the parameters, applied to the image signal processor, may be changed.
- the electronic device 1000 may adjust the weights applied to the parameters in a front end of an input layer of a machine learning model, such that among the evaluation scores output by an embedded machine learning model, sharpness and a color are adjusted toward a user's preference.
- the adjusted weights may be stored in a memory, and may be applied to the parameters of the image signal processor when a capturing environment, in which a person is selected as a subject outdoors on a sunny day, is recognized.
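- Such per-environment storage of adapted weights can be pictured as a lookup table keyed by the recognized capturing environment. All names and values below are illustrative assumptions, not from the disclosure:

```python
# Hypothetical store of user-adapted weights, keyed by the recognized
# capturing environment (site, time of day, subject type).
stored_weights = {
    ("outdoor", "day", "person"): [1.2, 0.9, 1.0, 1.1, 1.0, 0.95],
}

DEFAULT_WEIGHTS = [1.0] * 6

def weights_for(site, time_of_day, subject):
    """Return adapted weights for a recognized capturing environment,
    falling back to the default weights when none have been stored."""
    return stored_weights.get((site, time_of_day, subject), DEFAULT_WEIGHTS)
```

- On recognizing "a person outdoors on a sunny day", the device would retrieve the stored entry and apply those weights to the image signal processor's parameters; unrecognized environments keep the defaults.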
- FIGS. 20 and 21 illustrate an operation of an electronic device according to an exemplary embodiment of the inventive concept.
- FIG. 20 is a raw image corresponding to an image before an image signal processor signal-processes raw data
- FIG. 21 is a result image generated by signal-processing raw data by an image signal processor.
- the raw image may exhibit poorer noise characteristics than the result image.
- certain weights applied to parameters of an image signal processor may be set to values which improve noise characteristics.
- other weights applied to parameters of an image signal processor may be determined to be values deteriorating noise characteristics depending on user setting or a capturing environment.
- a plurality of parameters that determine operating characteristics of an image signal processor may be tuned using a machine learning model. Weights for the plurality of parameters, applied to the image signal processor, may be determined using the machine learning model such that the image signal processor achieves optimal performance. Accordingly, the image signal processor may be objectively and precisely tuned, as compared with a conventional manner in which a person manually tunes the image signal processor. In addition, weights applied to parameters may be adjusted by considering feedback received from a user of an electronic device in which an image signal processor is mounted. Thus, an image signal processor optimized for the user may be implemented.
Description
- This U.S. non-provisional patent application claims the benefit of priority under 35 U.S.C. § 119 to Korean Patent Application No. 10-2019-0059573 filed on May 21, 2019 in the Korean Intellectual Property Office, the disclosure of which is incorporated by reference in its entirety herein.
- The present disclosure relates to a method of training a machine learning model to predict optimal values for parameters used in an operation of an image signal processor, and an electronic device configured to perform the method.
- An image sensor is a semiconductor-based sensor configured to receive light and generate an electrical signal. Raw data, output by the image sensor, may be processed by an image signal processor (ISP). The image signal processor may generate an image using the raw data output by the image sensor. The image signal processor may generate an image from the raw data based on various parameters. However, quality and characteristics of the generated image may vary depending on values of the parameters applied to the image signal processor.
- At least one exemplary embodiment of the inventive concept provides a method of predicting performance of an image signal processor, or quality of images generated by the image signal processor, using machine learning. The resulting predictions may be used to tune the image signal processor to improve the quality of images generated by the image signal processor.
- According to an exemplary embodiment of the inventive concept, a method of training a machine learning model to predict optimal values for a plurality of parameters used in an operation of an image signal processor includes: capturing an image of a sample subject to obtain sample data; generating a plurality of sets of sample values for the plurality of parameters; emulating the image signal processor (ISP) processing the sample data according to each of the sets to generate a plurality of sample images; evaluating each of the plurality of sample images for a plurality of evaluation items to generate respective sample scores; and training the machine learning model to predict the optimal values using the sample values and the sample scores.
- According to an exemplary embodiment of the inventive concept, a method of predicting optimal values for a plurality of parameters used in an operation of an image signal processor includes: inputting initial values for the plurality of parameters to a machine learning model including an input layer having a plurality of input nodes, corresponding to the plurality of parameters, and an output layer having a plurality of output nodes, corresponding to a plurality of evaluation items extracted from a result image generated by the image signal processor; obtaining evaluation scores for the plurality of evaluation items using an output of the machine learning model; adjusting weights, applied to the plurality of parameters, based on the evaluation scores; and determining the optimal values using the adjusted weights.
- According to an exemplary embodiment of the inventive concept, an electronic device includes an image signal processor and a parameter optimization module. The image signal processor is configured to process raw data, output by an image sensor, depending on a plurality of parameters to generate a result image. The parameter optimization module includes a machine learning model, receiving sample values for the plurality of parameters and outputting a plurality of sample scores indicating quality of sample images, the sample images being generated by the image signal processor processing the raw data based on the sample values, the parameter optimization module being configured to determine weights, respectively applied to the plurality of parameters, using the machine learning model. The image signal processor applies the weights to the plurality of parameters to generate a plurality of weighted parameters and generates the result image by processing the raw data using the weighted parameters.
- Embodiments of the present disclosure will be more clearly understood from the following detailed description, taken in conjunction with the accompanying drawings, in which:
- FIG. 1 is a block diagram of an image sensor according to an example embodiment;
- FIGS. 2 and 3 are schematic diagrams of image sensors according to an example embodiment, respectively;
- FIG. 4 illustrates a pixel array of an image sensor according to an example embodiment;
- FIG. 5 is a block diagram of an electronic device according to an exemplary embodiment of the inventive concept;
- FIG. 6 is a flowchart illustrating a method of generating data that may be used to train a machine learning model for an image signal processor according to an exemplary embodiment of the inventive concept;
- FIG. 7 illustrates a system that may use the machine learning model according to an exemplary embodiment of the inventive concept;
- FIGS. 8 to 10 illustrate a method of training the machine learning model according to an exemplary embodiment of the inventive concept;
- FIG. 11 illustrates a system for training the machine learning model according to an exemplary embodiment of the inventive concept;
- FIG. 12 illustrates the machine learning model according to an exemplary embodiment of the inventive concept;
- FIG. 13 is a flowchart illustrating a method of operating the machine learning model according to an exemplary embodiment of the inventive concept;
- FIG. 14 illustrates a system providing a method of operating the machine learning model according to an exemplary embodiment of the inventive concept;
- FIGS. 15 to 17 illustrate a method of operating the machine learning model according to an exemplary embodiment of the inventive concept;
- FIG. 18 is a block diagram of an electronic device according to an exemplary embodiment of the inventive concept;
- FIGS. 19A and 19B illustrate an electronic device according to an exemplary embodiment of the inventive concept; and
- FIGS. 20 and 21 illustrate an operation of an electronic device according to an exemplary embodiment of the inventive concept.
- Hereinafter, exemplary embodiments of the inventive concept will be described with reference to the accompanying drawings.
-
FIG. 1 is a block diagram of an image sensor according to an exemplary embodiment of the inventive concept. - Referring to
FIG. 1 , an image sensor 10 according to an exemplary embodiment includes a pixel array 11 , a row driver 12 (e.g., a row driving circuit), a readout circuit 13 , a column driver 14 (e.g., a column driving circuit), and control logic 15 (e.g., a logic or control circuit). The row driver 12 , the readout circuit 13 , the column driver 14 , and the control logic 15 may be circuits configured to generate image data for controlling the pixel array 11 , and may be incorporated into a controller. - The
image sensor 10 may convert light, transferred from an object 30, into an electrical signal to generate raw data for generating an image. The raw data may be output to a processor 20. The processor 20 may include an image signal processor (ISP) configured to generate an image using the raw data. According to an exemplary embodiment of the inventive concept, the image signal processor is mounted in the image sensor 10.

The
pixel array 11, incorporated in the image sensor 10, may include a plurality of pixels PX. Each of the plurality of pixels PX may include an optoelectronic component configured to receive light and generate charges based on the received light, for example, a photodiode (PD). In an exemplary embodiment, each of the plurality of pixels PX includes two or more optoelectronic components. Two or more optoelectronic components may be included in each of the plurality of pixels PX such that each of the pixels PX generates a pixel signal corresponding to light of various wavelength bands or provides an autofocusing function.

Each of the plurality of pixels PX may include a pixel circuit configured to generate a pixel signal from charges generated by one or more photodiodes. In an exemplary embodiment, the pixel circuit includes a transmission transistor, a drive transistor, a select transistor, and a reset transistor. As an example, the pixel circuit may output a reset voltage and a pixel voltage using charges generated by the photodiodes. The pixel voltage may be a voltage reflecting charges generated by photodiodes included in each of the plurality of pixels PX. In an exemplary embodiment, two or more adjacent pixels PX may constitute a single pixel group, and two or more pixels, belonging to a pixel group, may share at least some of a transmission transistor, a drive transistor, a select transistor, and a reset transistor with each other.
- The
row driver 12 may drive the pixel array 11 in units of rows. For example, the row driver 12 may generate a transmission control signal controlling a transmission transistor of a pixel circuit, a reset control signal controlling a reset transistor of the pixel circuit, and a select control signal controlling a select transistor of the pixel circuit.

The
readout circuit 13 may include at least one of a correlated double sampler (CDS) and an analog-to-digital converter (ADC). The correlated double sampler may be connected to pixels, included in a row line selected by a row select signal provided by the row driver 12, through column lines, and may perform correlated double sampling to detect a reset voltage and a pixel voltage. The analog-to-digital converter may convert the reset voltage and the pixel voltage, detected by the correlated double sampler, into a digital signal, and output the digital signal.

The
column driver 14 may include a latch circuit, a buffer, and an amplifier circuit, and may temporarily store or amplify the digital signal, received from the readout circuit 13, to generate image data. Operating timings of the row driver 12, the readout circuit 13, and the column driver 14 may be determined by the control logic 15. As an example, the control logic 15 may be operated by a control instruction transmitted by the processor 20. The processor 20 may signal-process the raw data, output by the column driver 14 and the control logic 15, to generate an image, and may output the image to a display device or store the image in a storage device such as a memory.
FIGS. 2 and 3 are schematic diagrams of image sensors according to exemplary embodiments, respectively. - Referring to
FIG. 2, an image sensor 40 according to an exemplary embodiment includes a first layer 41, a second layer 42 provided below the first layer 41, and a third layer 43 provided below the second layer 42. The first layer 41, the second layer 42, and the third layer 43 may be stacked in a direction perpendicular to each other. In an exemplary embodiment, the first layer 41 and the second layer 42 are stacked at a wafer level, and the third layer 43 is attached to a portion below the second layer 42. The first to third layers

The
first layer 41 includes a sensing area SA, in which a plurality of pixels are provided, and a first pad area PA1 provided around the sensing area SA. A plurality of upper pads PAD are included in the first pad area PA1. The plurality of upper pads PAD may be connected to pads and a logic circuit LC of the second layer 42 through a via or a wire. The pads of the second layer 42 may be provided in a second pad area PA2 of the second layer 42.

Each of the plurality of pixels PX may include a photodiode configured to receive light and generate charges, and a pixel circuit configured to process the charges generated by the photodiode. The pixel circuit may include a plurality of transistors configured to output a voltage corresponding to the charges generated by a photodiode.
- The
second layer 42 may include a plurality of components configured to implement the logic circuit LC. The plurality of components implementing the logic circuit LC may include circuits configured to drive a pixel circuit provided on the first layer 41, such as a row driver, a readout circuit, a column driver, and control logic. The plurality of components implementing the logic circuit LC may be connected to the pixel circuit through the first and second pad areas PA1 and PA2. The logic circuit LC may obtain the reset voltage and the pixel voltage from the plurality of pixels PX to generate a pixel signal.

In an exemplary embodiment, at least one of the plurality of pixels PX includes a plurality of photodiodes disposed on the same level. Pixel signals, generated from charges of each of the plurality of photodiodes, may have a phase difference from each other. The logic circuit LC may provide an autofocusing function based on the phase difference of pixel signals generated from the plurality of photodiodes included in a single pixel PX.
- The
third layer 43, provided below the second layer 42, may include a memory chip MC, a dummy chip DC, and an encapsulation layer EN encapsulating the memory chip MC and the dummy chip DC. The memory chip MC may be a dynamic random access memory (DRAM) or a static random access memory (SRAM). In an embodiment, the dummy chip DC does not store data. The dummy chip DC may be omitted. The memory chip MC may be electrically connected to at least some of the components, included in the logic circuit LC of the second layer 42, by a bump, a via, or a wire, and may store data required to provide an autofocusing function. In an exemplary embodiment, the bump is a microbump.

Referring to
FIG. 3, an image sensor 50 according to an exemplary embodiment includes a first layer 51 and a second layer 52. The first layer 51 includes a sensing area SA in which a plurality of pixels PX are provided, a logic circuit LC in which components for driving the plurality of pixels PX are provided, and a first pad area PA1 provided around the sensing area SA and the logic circuit LC. A plurality of upper pads PAD are included in the first pad area PA1. The plurality of upper pads PAD may be connected to a memory chip MC, provided on the second layer 52, through a via or a wire. The second layer 52 may include a memory chip MC, a dummy chip DC, and an encapsulation layer EN encapsulating the memory chip MC and the dummy chip DC. The dummy chip DC may be omitted.
FIG. 4 illustrates a pixel array of an image sensor according to an exemplary embodiment of the inventive concept. - Referring to
FIG. 4, a pixel array PA of an image sensor according to an exemplary embodiment includes a plurality of pixels PX. The plurality of pixels may be connected to a plurality of row lines ROW1 to ROWm (ROW) and a plurality of column lines COL1 to COLn (COL). For example, a given pixel of the pixels PX may be connected to a given row line of the row lines ROW1 to ROWm and to a given column line of the column lines COL1 to COLn. The image sensor may drive the plurality of pixels PX in units of the plurality of row lines ROW. As an example, the time required for driving a selected driving line among the plurality of row lines ROW and reading the reset voltage and the pixel voltage from pixels PX connected to the selected driving line may be defined as one horizontal cycle. The image sensor may operate in a rolling shutter manner, in which a plurality of the pixels PX are sequentially exposed to light, or in a global shutter manner, in which a plurality of the pixels PX are simultaneously exposed to light.

A reset voltage and a pixel voltage, output from each of the plurality of pixels PX, may be converted into digital data and may be processed as raw data through predetermined signal processing. An image signal processor, mounted in the image sensor or in an additional processor communicating with the image sensor, may generate a result image displayed on a display or stored in a memory. Accordingly, different result images may be generated from the same raw data depending on the performance or the tuning method of the image signal processor. Thus, a user may be provided with an optimal result image by improving the performance of the image signal processor or by precisely tuning the image signal processor.
- When tuning of the image signal processor relies on a person's evaluation, it may be difficult to tune the image signal processor objectively and precisely. In an exemplary embodiment, the performance of the image signal processor may be improved by providing a method of modeling the image signal processor that leaves no room for intervention of a person's subjective judgment. In addition, a user may be provided with an optimal result image by tuning the image signal processor in consideration of the user's preferences.
-
FIG. 5 is a block diagram of an electronic device according to an exemplary embodiment of the inventive concept. - Referring to
FIG. 5, an electronic device 100 according to an exemplary embodiment of the inventive concept includes an image sensor 110, a processor 120, a memory 130, and a display 140. The processor 120 may control overall operation of the electronic device 100, and may be implemented by a central processing unit (CPU), an application processor (AP), or a system on chip (SoC). In an exemplary embodiment, the image sensor 110 and the image signal processor are mounted on a single integrated circuit chip.

The
image sensor 110 may generate raw data in response to external light and may transmit the raw data to the processor 120. The processor 120 may include an image signal processor 121 configured to signal-process the raw data to generate a result image. The image signal processor 121 may adjust a plurality of parameters associated with the raw data and signal-process the raw data according to the adjusted parameters to generate a result image. The parameters may include two or more of color, blurring, sharpness, noise, a contrast ratio, resolution, and a size. In an alternate embodiment, the parameters may include only one of color, blurring, sharpness, noise, a contrast ratio, resolution, and a size. The result image, output by the image signal processor 121, may be stored in the memory 130 or may be displayed on the display 140.

The
processor 120 may include a parameter optimization module 122. In an exemplary embodiment, the parameter optimization module 122 and the image signal processor are mounted in a single integrated circuit. As an example, the parameter optimization module 122 may adjust weights given to the plurality of parameters, and characteristics of the result image, output by the image signal processor 121, can be changed depending on the adjusted weights. As an example, the parameter optimization module 122 adjusts a color, one of the plurality of parameters, to output a warm-tone result image or a cold-tone result image from the same raw data. For example, when a first weight is applied to the color parameter to generate a first weighted parameter, the image signal processor 121 processing the raw data using the first weighted parameter outputs a warm-tone result image. When a second weight different from the first weight is applied to the color parameter to generate a second weighted parameter, the image signal processor 121 processing the raw data using the second weighted parameter outputs a cold-tone result image.

In an exemplary embodiment, the weight, applied to the plurality of parameters by the
parameter optimization module 122, is determined by a modeling method performed in advance. The weight, applied to the plurality of parameters by the parameter optimization module 122, may be adaptively adjusted based on user feedback. As an example, a weight may be determined by a modeling method using a machine learning model to significantly reduce the possibility of intervention of a person's subjective evaluation and to improve the performance of the image signal processor 121 while accurately and objectively tuning the image signal processor 121.
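The weighted-parameter idea above (the same raw data producing a warm-tone or cold-tone result depending on the weight applied to the color parameter) can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function names, the parameter dictionary, and the warm/cold threshold rule are all hypothetical:

```python
def apply_weights(params, weights):
    """Scale each parameter by its weight; parameters without a weight
    pass through unchanged. (Hypothetical helper, for illustration only.)"""
    return {name: value * weights.get(name, 1.0) for name, value in params.items()}

def tone_of(color_value, neutral=1.0):
    """Toy rule: a color parameter boosted above neutral reads as warm,
    one reduced below neutral reads as cold."""
    return "warm" if color_value > neutral else "cold"

base_params = {"color": 1.0, "sharpness": 0.5}
first = apply_weights(base_params, {"color": 1.3})   # first weight -> warm tone
second = apply_weights(base_params, {"color": 0.7})  # second weight -> cold tone
```

Note that only the color parameter is reweighted here; the sharpness parameter is left at its original value in both cases, mirroring the description in which a single parameter's weight changes the character of the result image.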
FIG. 6 is a flowchart illustrating a method of modeling an image signal processor according to an exemplary embodiment of the inventive concept, and FIG. 7 illustrates a system providing a method of modeling an image signal processor according to an exemplary embodiment of the inventive concept.

Referring to
FIG. 6, a method of modeling an image signal processor according to an exemplary embodiment includes capturing an image of a sample subject to obtain sample data (S100). Referring to FIG. 7, a system 200 for modeling an image signal processor may include an electronic device 210 including an image sensor, a sample subject 220, and a computer device 230 in which a modeling method is executed. Although the electronic device 210 is illustrated as being a camera, it may be replaced with another device including an image sensor. In addition, although the computer device 230 is illustrated as being a desktop computer, it may be replaced with another device executing the modeling method. According to an exemplary embodiment, the electronic device 210 and the computer device 230 are implemented as a single device.

The
sample subject 220 may be a test chart. The sample subject 220 may include a plurality of capturing regions 221 to 223 (regions of interest), which may be different from each other. As an example, a first capturing region 221 may be a region in which people are displayed, a second capturing region 222 may be a region in which a black-and-white pattern is displayed, and a third capturing region 223 may be a region in which a color pattern is displayed. The sample data, obtained by the electronic device 210 capturing the sample subject 220, may be raw data. The raw data may be transferred to the computer device 230 including an image signal processor (ISP) simulator. In an embodiment, the ISP simulator is capable of simulating different types of image signal processors. For example, the ISP simulator could emulate one or more of the image signal processors processing the raw data to generate an image. Emulating a given image signal processor processing the raw data may include the given image signal processor processing the raw data using one or more parameters. For example, a given parameter may be settable to only certain values, where each setting has a different effect. For example, if the given parameter is settable to only a first value or a second other value, emulating an image signal processor processing the raw data using the given parameter set to the first value could result in a first image, while emulating the image signal processor processing the same raw data using the given parameter set to the second value could result in a second image different from the first image.

The
computer device 230 sets parameters used in an operation of an image signal processor to respective sample values (S110). The computer device 230 signal-processes the raw data using the image signal processor simulator to emulate the image signal processor processing the sample data using the sample values, to generate a plurality of sample images (S120).

In the modeling method executed in the
computer device 230, a plurality of sample scores of evaluation items are obtained for each of the plurality of sample images (S130). The plurality of sample scores may be scores calculated for a plurality of evaluation items selected to evaluate each of the plurality of sample images. As an example, the plurality of evaluation items may include at least one of an image color, resolution, a dynamic range, shading, sharpness, texture loss, and noise. For example, if only resolution is considered, and the first sample image has a low resolution while the second sample image has a high resolution, the first sample image could receive a lower score than the second sample image.

The
computer device 230 stores the sample values for the parameters, the sample images, and the sample scores in a database (DB) (S140). The database may include a mapping of each parameter to a respective sample value. For example, the sample values, sample images, and sample scores stored in the database (DB) may be used to train a machine learning model to infer the performance of the image signal processor, or to predict the quality of an image that will be produced by the image signal processor when parameters having certain values are used during processing of raw data.
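The S110 to S140 flow above can be sketched end to end. This is a toy pipeline under stated assumptions: the gain/offset "simulator", the two scored items, and all parameter names are stand-ins for illustration, not the actual ISP simulator or evaluation framework of the embodiment:

```python
import random
import statistics

def generate_sample_sets(param_ranges, n_sets, seed=0):
    """S110: draw sample values for each parameter (ranges are hypothetical)."""
    rng = random.Random(seed)
    return [{name: rng.uniform(lo, hi) for name, (lo, hi) in param_ranges.items()}
            for _ in range(n_sets)]

def emulate_isp(raw_data, values):
    """S120: stand-in for the ISP simulator -- a simple gain/offset transform."""
    return [pixel * values["gain"] + values["offset"] for pixel in raw_data]

def score_image(image):
    """S130: toy evaluation framework scoring two hypothetical items."""
    return {"contrast": max(image) - min(image),
            "noise": statistics.pstdev(image)}

def build_database(raw_data, sample_sets):
    """S140: one record per sample set, pairing values, image, and scores
    so a machine learning model can later be trained on the pairs."""
    records = []
    for values in sample_sets:
        image = emulate_isp(raw_data, values)
        records.append({"values": values, "image": image,
                        "scores": score_image(image)})
    return records

raw = [10.0, 20.0, 30.0]  # hypothetical raw data from capturing the test chart
db = build_database(raw, generate_sample_sets(
    {"gain": (0.5, 2.0), "offset": (0.0, 10.0)}, n_sets=6))
```

Each database record keeps the sample values together with the image and scores they produced, which is the pairing the trainer needs later.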
FIGS. 8 to 10 illustrate a method of modeling an image signal processor according to an exemplary embodiment of the inventive concept. -
FIG. 8 is a schematic diagram of a system for modeling an image signal processor. Referring to FIG. 8, the system 300 includes a simulator 310, an evaluation framework 320, and a database 330.

The
simulator 310 receives sample data 301, which is raw data obtained by capturing an image of a sample subject such as a test chart. The simulator 310 may include a parameter generator 311, configured to determine a plurality of sample values for a plurality of parameters 232 used in an operation of an image signal processor, and an ISP simulator 312 configured to simulate (or emulate) the image signal processor operating on the sample data 301 using the sample values of the parameters 232. For example, the parameter generator 311 may determine sample values of the parameters 232 such as image color, blurring, noise, a contrast ratio, resolution, and size. At least one of the parameters may be classified into a plurality of detailed parameters according to an embodiment. For example, there may be a plurality of detailed parameters for noise and a plurality of detailed parameters for color.

The
ISP simulator 312 may signal-process the sample data 301 using the plurality of sample values 332, determined for the plurality of parameters by the parameter generator 311, to generate sample images 331. Hereinafter, the operation of the simulator 310 will be described in more detail with reference to FIG. 9.

Referring to
FIG. 9, the parameter generator 311 generates a plurality of sample values for the first through sixth parameters. As an example, the parameter generator 311 may generate first through sixth sample sets having different sample values for the first through sixth parameters. The first to sixth parameters are parameters used in an operation of the image signal processor. The number and types of the parameters may be variously changed. Similarly, the number of sample sets, generated by setting the sample values for the parameters by the parameter generator 311, may also be variously changed.

When the sample sets are determined, the
ISP simulator 312 generates the first through sixth sample images 410 to 460 (400) by setting the parameters according to each of the sample sets and simulating (or emulating) the operation of the image signal processor on the sample data 301 using each of the sample sets. For example, the ISP simulator 312 may emulate the image signal processor processing raw data of the sample data 301 using the six parameters set to their respective values in the first sample set to generate the first sample image 410, emulate the image signal processor processing the raw data using the six parameters set to their respective values in the second sample set to generate the second sample image 420, and so on. In an exemplary embodiment, the sample images 400 are images generated from the sample data 301 obtained by capturing an image of the same sample subject. Since the sample images 400 are generated by the ISP simulator 312 using different sample sets, they may have different quality and/or characteristics.

Returning to
FIG. 8, the evaluation framework 320 receives the sample images, generated by the simulator 310, to evaluate the quality of the sample images. As an example, the evaluation framework 320 may evaluate each of the sample images for a plurality of evaluation items and may express a result of the evaluation as sample scores 333. In the example embodiment illustrated in FIG. 8, the evaluation framework 320 may obtain sample scores 333 of evaluation items such as resolution 321, texture loss 322, sharpness 323, noise 324, a dynamic range 325, shading 326, and a color 327, for each of the sample images. Hereinafter, this will be described in more detail with reference to FIG. 10.

Referring to
FIG. 10, the evaluation framework 320 receives the sample images 400 to obtain sample scores 333 for a plurality of evaluation items. As an example, sample scores for the first through third evaluation items may be obtained by evaluating each of the sample images 400. For example, the first evaluation item could be resolution 321, the second evaluation item could be sharpness 323, and the third evaluation item could be noise 324. The evaluation framework 320 may classify and store the sample scores according to the sample images 400. A lowest point and a highest point of each of the sample scores may vary depending on the evaluation items. For example, as shown in FIG. 10, the first sample image 410 has a first score of 70.37 for the first evaluation item, a second score of 62.29 for the second evaluation item, and a third score of 1979.25 for the third evaluation item.

When the evaluation by the
evaluation framework 320 is complete, the database 330 may be established. The database 330 includes the sample images 331 and the sample values 332 of the plurality of parameters, generated by the simulator 310, and the sample scores 333 obtained by evaluating the sample images 331 for the plurality of evaluation items 321 to 327 by the evaluation framework 320.

The
sample images 331, the sample values 332 of the plurality of parameters, and the sample scores 333, stored in the database 330, may be used to train the machine learning model. The machine learning model, trained with the data stored in the database 330, may be a model for predicting the quality of a result image output by the image signal processor. Hereinafter, this will be described in more detail with reference to FIGS. 11 and 12.
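Training on the stored (sample values, sample scores) pairs can be sketched with a deliberately simple surrogate. This uses one linear predictor per evaluation item in place of the neural network, trained by gradient descent until every prediction is within a tolerance of its stored sample score; the data, learning rate, and stopping tolerance are hypothetical:

```python
def train_surrogate(sample_values, sample_scores, lr=0.01, epochs=5000, tol=1e-3):
    """Fit one linear predictor per evaluation item so predicted scores
    approach the stored sample scores (a linear stand-in for the network)."""
    n_params, n_items = len(sample_values[0]), len(sample_scores[0])
    w = [[0.0] * n_params for _ in range(n_items)]
    b = [0.0] * n_items
    for _ in range(epochs):
        worst = 0.0
        for x, y in zip(sample_values, sample_scores):
            for j in range(n_items):
                err = sum(wi * xi for wi, xi in zip(w[j], x)) + b[j] - y[j]
                worst = max(worst, abs(err))
                for k in range(n_params):
                    w[j][k] -= lr * err * x[k]
                b[j] -= lr * err
        if worst <= tol:   # stop once within the reference difference
            break
    return w, b

def predict(w, b, x):
    """Predicted evaluation scores for one set of parameter values."""
    return [sum(wi * xi for wi, xi in zip(row, x)) + bi for row, bi in zip(w, b)]

# Hypothetical database pairs: parameter sample values -> sample scores.
values = [[0.0, 1.0], [1.0, 0.0], [1.0, 1.0]]
scores = [[1.0], [2.0], [3.0]]
w, b = train_surrogate(values, scores)
```

The stopping test mirrors the reference-difference criterion described in the embodiment: training ends when the model's outputs match the sample scores closely enough.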
FIG. 11 illustrates a system providing a method of modeling an image signal processor according to an exemplary embodiment of the inventive concept, and FIG. 12 illustrates a machine learning model employed in a method of modeling an image signal processor according to an exemplary embodiment of the inventive concept.

Referring to
FIG. 11, a system 500 according to an exemplary embodiment may operate in cooperation with a database 600. The database 600 may be a database established by the modeling method described with reference to FIGS. 8 to 10, and may include sample images, sample values for a plurality of parameters, and sample scores for a plurality of evaluation items.

In an exemplary embodiment, the machine
learning model trainer 510 trains a machine learning model 700, using sample values 501 of the parameters and sample scores 502 stored in the database 600, to predict the quality of an image produced by a given image signal processor using parameters having certain values. As an example, the sample values 501 of the parameters may be at least one of the first to sixth sample sets set in the same manner as described in the example embodiment with reference to FIG. 9. As an example, when the first sample set is selected, the first sample score set in the example embodiment illustrated in FIG. 10 may be selected as the sample scores 502.

The machine
learning model trainer 510 may input the sample values, included in the first sample set, to the machine learning model 700. In an exemplary embodiment, the machine learning model trainer 510 trains the machine learning model 700 until the output of the machine learning model 700 matches the evaluation scores of the first sample score set, or until a difference between the output and the evaluation scores of the first sample score set becomes less than or equal to a reference difference.

Referring to
FIG. 12, the machine learning model 700 may be implemented by an artificial neural network (ANN). The machine learning model 700 includes an input layer 710, a hidden layer 720, and an output layer 730. As an example, a plurality of nodes, included in the input layer 710, the hidden layer 720, and the output layer 730, may be connected to each other in a fully connected manner. The input layer 710 includes a plurality of input nodes x1 to xi. In an exemplary embodiment, the number of the input nodes x1 to xi corresponds to the number of parameters. The output layer 730 includes a plurality of output nodes y1 to yj. In an exemplary embodiment, the number of the output nodes y1 to yj corresponds to the number of evaluation items.

The
hidden layer 720 includes first to third hidden layers 721 to 723, and the number of the hidden layers 721 to 723 may be variously changed. As an example, the machine learning model 700 may be trained by adjusting weights of the hidden nodes included in the hidden layer 720. For example, the first to sixth sample sets are input to the input layer 710, and the weights of the hidden nodes, included in the hidden layer 720, may be adjusted until values, output to the output layer 730, correspond to the first to sixth sample score sets. Accordingly, after the training has completed, the quality of a result image, output by the image signal processor when the parameters have predetermined values, may be inferred using the machine learning model 700.
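The network shape described above (input width equal to the number of parameters, output width equal to the number of evaluation items, with three hidden layers between them) can be sketched as a forward pass. The layer sizes, random initialization, and tanh activation are illustrative assumptions, not values given in the embodiment:

```python
import math
import random

def make_network(n_params, hidden_sizes, n_items, seed=42):
    """Build random fully connected weight matrices: one row of weights per
    node of the next layer, one column per node of the previous layer."""
    rng = random.Random(seed)
    sizes = [n_params, *hidden_sizes, n_items]
    return [[[rng.uniform(-0.5, 0.5) for _ in range(sizes[i])]
             for _ in range(sizes[i + 1])]
            for i in range(len(sizes) - 1)]

def forward(layers, x):
    """Forward pass: tanh on the hidden layers, linear output layer."""
    for depth, layer in enumerate(layers):
        x = [sum(w * xi for w, xi in zip(row, x)) for row in layer]
        if depth < len(layers) - 1:
            x = [math.tanh(v) for v in x]
    return x

# Six input nodes (parameters), three hidden layers, three output nodes
# (evaluation items) -- all counts chosen for illustration.
net = make_network(n_params=6, hidden_sizes=[8, 8, 8], n_items=3)
scores = forward(net, [0.5, 1.0, 0.2, 0.8, 0.1, 0.9])
```

After training, evaluating `forward` on a new set of parameter values plays the role of inferring the quality of the result image those values would produce.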
FIG. 13 is a flowchart illustrating a method of modeling an image signal processor according to an exemplary embodiment of the inventive concept. - Referring to
FIG. 13, a method of modeling an image signal processor according to an exemplary embodiment of the inventive concept includes setting initial values for a plurality of parameters applied to an image signal processor (S200). The plurality of parameters applied to the image signal processor may include a color, blurring, noise, a contrast ratio, a resolution, and a size of an image as parameters used in an operation of the image signal processor.

Next, the initial values for the parameters are input to the machine learning model (S210). The machine learning model may be a model trained to predict the quality of the resulting image output by the image signal processor. An output of the machine learning model may vary depending on the values of the parameters applied to the image signal processor. A training process of the machine learning model may be understood based on the example embodiment described above with reference to
FIGS. 11 and 12.

Evaluation scores for a plurality of evaluation items are obtained using the output of the machine learning model (S220). As described above, the machine learning model is a model trained to predict the quality of a result image generated by the image signal processor signal-processing raw data, and the output of the machine learning model corresponds to the evaluation scores of the plurality of evaluation items. In an exemplary embodiment, the plurality of evaluation items may include a color, sharpness, noise, resolution, a dynamic range, shading, and texture loss of an image.
- In the modeling method according to an exemplary embodiment, weights applied to the parameters are adjusted based on the obtained evaluation scores for the plurality of evaluation items (S230). As an example, each of the evaluation scores may be compared with predetermined reference scores and, when there is an evaluation score which does not reach a reference score, the weight applied to at least one of the parameters may be increased or decreased such that the corresponding evaluation score is increased. Alternatively, the evaluation score, output by the machine learning model, may be compared with a reference score while changing a weight a predetermined number of times.
-
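The S230 comparison rule can be sketched as a single adjustment step: any evaluation item whose predicted score falls short of its reference score causes a nudge to a weight assumed to be correlated with that item. The item names, scores, and step size are hypothetical, and the one-weight-per-item pairing is a simplification for illustration:

```python
def adjust_weights(weights, eval_scores, ref_scores, step=0.1):
    """Nudge the (hypothetically correlated) weight of every item whose
    evaluation score does not reach its reference score; weights for items
    that already meet their reference are left unchanged."""
    adjusted = dict(weights)
    for item, score in eval_scores.items():
        if score < ref_scores[item]:
            adjusted[item] = weights[item] + step
    return adjusted

weights = {"sharpness": 1.0, "noise": 1.0}
evals = {"sharpness": 55.0, "noise": 80.0}   # model outputs (hypothetical)
refs = {"sharpness": 60.0, "noise": 70.0}    # reference scores (hypothetical)
new_weights = adjust_weights(weights, evals, refs)
```

Repeating this step a predetermined number of times, or until all scores reach their references, gives the alternative loop the paragraph describes.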
FIGS. 14 to 17 illustrate a method of modeling an image signal processor according to an exemplary embodiment of the inventive concept. -
FIG. 14 is a schematic diagram of a system for providing a method of modeling an image signal processor according to an exemplary embodiment of the inventive concept. Referring to FIG. 14, a system 800 according to an exemplary embodiment includes a parameter adjusting module 810, a machine learning model 820, and a feedback module 830.

The
parameter adjusting module 810 may adjust values input to the machine learning model 820. The machine learning model 820 may receive parameters used in an operation of an image signal processor, and may output evaluation scores indicating the quality and/or characteristics of a result image generated by the image signal processor operating according to the values of the parameters. Accordingly, the parameter adjusting module 810 may adjust the values of the parameters used in the operation of the image signal processor. For example, the parameter adjusting module 810 may adjust weights applied to the parameters. When initial values of parameters 801 are input, the parameter adjusting module 810 may apply predetermined weights to the initial values of the parameters to generate weighted values of the parameters, and input the weighted values to the machine learning model 820.

The
machine learning model 820 may be a model trained to predict the quality of the result image generated by the image signal processor. An output of the machine learning model 820 may correspond to evaluation scores of the evaluation items indicating the quality of a result image. In an exemplary embodiment, the feedback module 830 compares an output of the machine learning model 820 with target scores of the evaluation items and transmits a result of the comparison to the parameter adjusting module 810. In an exemplary embodiment, the parameter adjusting module 810 adjusts the weights applied to the parameters with reference to the comparison result transmitted by the feedback module 830. The parameter adjusting module 810 may adjust the weights applied to the parameters a predetermined number of times, or until the difference between the evaluation scores output by the machine learning model 820 and the target scores is reduced to be less than or equal to the reference difference. When adjusting the weights has finished, optimized ISP parameters 802 (e.g., parameters set to optimal values) may be output from the system 800. The parameters set to the optimal values may be used to tune an image signal processor. The parameters set to the optimal values may be output to the image signal processor for storage, and the image signal processor can then use the parameters set to these values when performing a subsequent operation (e.g., processing raw data to generate an image).

The
system 800 may adjust the weights applied to the parameters by considering feedback from a user of an electronic device in which an image signal processor is mounted. In this case, the system 800 may be mounted in the electronic device together with the image signal processor and may adaptively adjust the weights with reference to the feedback from the user.
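The adjust-predict-compare loop formed by the parameter adjusting module, the machine learning model, and the feedback module can be sketched as a small derivative-free search. The surrogate model, step size, and target scores below are hypothetical; a coordinate-wise hill climb stands in for whatever adjustment rule the parameter adjusting module actually uses:

```python
def optimize(predict, weights, targets, step=0.05, max_iters=200, ref_diff=1e-3):
    """Nudge each weight up or down and keep any change that moves the
    predicted evaluation scores closer to the target scores, stopping once
    the squared difference drops to the reference difference."""
    def loss(ws):
        return sum((s - t) ** 2 for s, t in zip(predict(ws), targets))

    ws = list(weights)
    for _ in range(max_iters):
        if loss(ws) <= ref_diff:
            break                      # feedback module: close enough
        for i in range(len(ws)):
            for delta in (step, -step):
                trial = ws[:i] + [ws[i] + delta] + ws[i + 1:]
                if loss(trial) < loss(ws):
                    ws = trial         # parameter adjusting module keeps it
                    break
    return ws

# Hypothetical surrogate standing in for the trained model 820.
surrogate = lambda ws: [2.0 * ws[0], ws[1] + 0.5]
optimized = optimize(surrogate, [0.0, 0.0], targets=[1.0, 1.0])
```

The returned weights play the role of the optimized ISP parameters 802: values for which the predicted evaluation scores sit within the reference difference of the targets.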
FIGS. 15 to 17 are provided to illustrate a method of modeling an image signal processor according to an exemplary embodiment of the inventive concept. Referring to FIG. 15, initial values may be set for the first to sixth parameters. The initial values may be any values generated at random.

Referring to
FIG. 16 , a predetermined weight may be reflected on an input layer IL to be input to the machine learning model. For example, the input layer IL may receive a plurality of input values, and the plurality of input values may correspond to parameters used in an operation of the image signal processor. A weight may be applied to the plurality of input values, and the plurality of input values and the weight may be connected in a fully connected manner or a partially connected manner. When the plurality of input values and the weight correspond to each other in the partially connected manner, the weight is not connected to at least one of the plurality of input values. - The machine learning model may output at least one output value to an output layer OL using a plurality of weight-given input values. The output value may correspond to an evaluation score of an evaluation item which may indicate the quality of the image generated by the image signal processor. The number of input values included in the input layer IL, and the number of output values included in the output layer OL, may be variously changed according to exemplary embodiments.
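- The fully and partially connected weighting of the input layer IL can be illustrated with a short Python sketch. The function name and the boolean mask marking unweighted inputs are assumptions made for this example, not terminology from the patent.

```python
def weight_inputs(values, weights, mask=None):
    """Apply per-input weights to the input layer. In the fully
    connected case every input value receives a weight; in the
    partially connected case entries where mask is False are
    passed through unweighted."""
    if mask is None:          # fully connected: weight every input
        mask = [True] * len(values)
    return [v * w if m else v for v, w, m in zip(values, weights, mask)]

params = [0.2, 0.4, 0.6]      # input values of layer IL
fully = weight_inputs(params, [2.0, 2.0, 2.0])
partially = weight_inputs(params, [2.0, 2.0, 2.0], mask=[True, False, True])
```

In the partially connected case the second input reaches the model unchanged, matching the statement that the weight is not connected to at least one of the input values.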
-
FIG. 17 is a graph illustrating variation of the evaluation scores y1 to y4 depending on the number of training iterations of the machine learning model. In the exemplary embodiment illustrated in FIG. 17 , it is assumed that the output layer OL outputs the evaluation scores y1 to y4 for four evaluation items. However, this assumption is merely an example, as the shape of the layer is not limited thereto.
- When the machine learning model outputs the first to fourth evaluation scores y1 to y4, the first to fourth evaluation scores y1 to y4 are compared with the first to fourth target scores, respectively. At least one of the weights applied to the plurality of input values may be changed depending on a result of the comparison. In the exemplary embodiment illustrated in FIG. 17 , weights applied to hidden nodes of a hidden layer included in the machine learning model are not adjusted, while weights applied to the plurality of input values in the input layer IL of the machine learning model are adjusted.
- As training is repeated while changing at least one of the weights, the first to fourth evaluation scores y1 to y4 output by the machine learning model may approach the first to fourth target scores. At least one of the weights may be adjusted until a predetermined number of training iterations completes or until a difference between the first to fourth evaluation scores y1 to y4 and the first to fourth target scores is reduced to be less than a reference difference. When training completes or the difference falls below the reference difference, the weights are determined. The determined weights may be assigned to the input values of the input layer IL, corresponding to parameters used in an operation of the image signal processor, in the fully connected manner or the partially connected manner.
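- The training scheme of FIG. 17 — frozen hidden-layer weights, adjustable input-layer weights — can be sketched as follows. The tiny two-input network, the learning rate, and all names are illustrative assumptions; they only demonstrate the idea of updating the front-end weights toward a target score.

```python
FROZEN_HIDDEN = [[0.6, 0.4], [0.3, 0.7]]   # trained weights, never adjusted
FROZEN_OUTPUT = [0.5, 0.5]

def predict(params, input_weights):
    """One evaluation score from weighted inputs through the frozen layers."""
    x = [p * w for p, w in zip(params, input_weights)]        # input layer IL
    hidden = [sum(xi * hw for xi, hw in zip(x, row)) for row in FROZEN_HIDDEN]
    return sum(h * o for h, o in zip(hidden, FROZEN_OUTPUT))  # output layer OL

def adjust_input_weights(params, target, iters=200, lr=0.5):
    # sensitivity of the score to each weighted input; because the
    # hidden weights are frozen this is constant and computed once
    sens = [sum(row[i] * o for row, o in zip(FROZEN_HIDDEN, FROZEN_OUTPUT))
            for i in range(len(params))]
    w = [1.0] * len(params)
    for _ in range(iters):
        err = target - predict(params, w)   # compare with the target score
        w = [wi + lr * err * p * s for wi, p, s in zip(w, params, sens)]
    return w

tuned_w = adjust_input_weights([1.0, 1.0], target=0.9)
```

Only `w` changes during the loop; `FROZEN_HIDDEN` and `FROZEN_OUTPUT` are left untouched, as the text requires.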
- Raw data, obtained by capturing an image of a sample subject, may be input to the image signal processor to tune the image signal processor, and the image signal processor may be tuned so that its output satisfies predetermined evaluation conditions. In this case, since the image signal processor is tuned using the raw data obtained by capturing an image of the sample subject, a relatively long time may be required. In addition, when the tuning depends on a person's subjective evaluation, it may be difficult to objectively and precisely tune the image signal processor.
- Meanwhile, in at least one exemplary embodiment of the inventive concept, image data obtained by capturing an image of at least one sample subject is processed by the image signal processor simulator according to sample values of various parameters to generate sample images. Sample scores, obtained by evaluating the sample images, and the sample values of the parameters may be stored in a database. Since the sample scores and sample values of the parameters stored in the database are numerical items, the effect of a person's subjective evaluation may be significantly reduced. In addition, a machine learning model trained to receive the sample values of the parameters and to output the sample scores may be prepared. Weights applied to the parameters may be adjusted such that the evaluation scores output by the machine learning model receiving initial values of the parameters reach target scores.
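- The database-driven pipeline in the paragraph above can be sketched end to end. The stand-in simulator, the single `noise_reduction` parameter, and the nearest-neighbour surrogate are all assumptions chosen to keep the example self-contained; the patent's simulator and model are more elaborate.

```python
def isp_simulator_score(noise_reduction):
    # stand-in for "run the ISP simulator on sample raw data, then
    # evaluate the sample image": quality peaks at a moderate strength
    return 1.0 - (noise_reduction - 0.6) ** 2

# database of (sample parameter value, sample score) pairs
database = [(p / 10.0, isp_simulator_score(p / 10.0)) for p in range(11)]

def surrogate(param):
    """Play the role of the trained model: map a parameter value to a
    sample score, here by nearest-neighbour lookup in the database."""
    value, score = min(database, key=lambda row: abs(row[0] - param))
    return score

best_value = max(database, key=lambda row: row[1])[0]   # best sampled value
```

Because both the sample values and the sample scores are plain numbers, the whole loop runs without any subjective judgment, which is the point the paragraph makes.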
- In at least one exemplary embodiment, an image signal processor is tuned by adjusting weights applied to parameters used in an operation of the image signal processor using numerical items, so the effect of a person's subjective evaluation may be significantly reduced and the image signal processor may be objectively and precisely tuned. Additionally, the image signal processor may be adaptively tuned for a user by considering the end-user's preferences in the processes of comparing the evaluation scores output by the machine learning model with the target scores and adjusting the weights of the parameters.
-
FIG. 18 is a block diagram of an electronic device according to an exemplary embodiment of the inventive concept. - An
electronic device 900 according to an exemplary embodiment illustrated in FIG. 18 includes a display 910, an image sensor 920, a memory 930, a processor 940, and a port 950. The electronic device 900 may further include a wired/wireless communications device and a power supply. Among the components illustrated in FIG. 18 , the port 950 may be provided for the electronic device 900 to communicate with a video card, a sound card, a memory card, and a universal serial bus (USB) device. The electronic device 900 may conceptually include any device that employs the image sensor 920, in addition to a smartphone, a tablet personal computer (PC), and a digital camera. - The
processor 940 may perform a specific operation, command, or task. The processor 940 may be a central processing unit (CPU) or a system on chip (SoC), and may communicate with the display 910, the image sensor 920, and the memory 930, as well as other devices connected to the port 950, through a bus 960. - The
processor 940 may include an image signal processor 941. The image signal processor 941 generates a result image using raw data generated by the image sensor 920 capturing an image of a subject. The processor 940 may display the result image generated by the image signal processor 941 on the display 910 and may store the result image in the memory 930. - The
memory 930 may be a storage medium configured to store data necessary for an operation of the electronic device 900 or multimedia data. The memory 930 may include a volatile memory such as random access memory (RAM) or a nonvolatile memory such as a flash memory. The memory 930 may also include at least one of a solid state drive (SSD), a hard disk drive (HDD), and an optical disk drive (ODD) as a storage device. - The
memory 930 may include a machine learning model 931 such as the machine learning model 700. The machine learning model 931 may receive parameters used in an operation of the image signal processor 941, and may output evaluation scores of evaluation items indicating a quality of the result image generated by the image signal processor 941 using the parameters. As an example, the parameters input to the machine learning model 931 may include a color, blurring, noise, a contrast ratio, a resolution, and a size of an image. The evaluation scores output by the machine learning model 931 may correspond to evaluation items such as a color, sharpness, noise, a resolution, a dynamic range, shading, and texture loss of the image. - The
electronic device 900 may adaptively adjust the weights applied to the parameters used in the operation of the image signal processor 941, using the machine learning model 931. In an exemplary embodiment, the electronic device 900 does not train the machine learning model 931 itself and merely adjusts the weights applied to the parameters in a front end of an input layer of the machine learning model 931. Thus, the image signal processor 941 may be tuned for a user without a great computational burden. -
FIGS. 19A and 19B illustrate examples of electronic devices that may include the electronic device 900. - Referring to
FIGS. 19A and 19B , an electronic device 1000 according to an exemplary embodiment is a mobile device such as a smartphone. However, the electronic device 1000 is not limited to a mobile device such as a smartphone. For example, the electronic device 1000 may be any device including a camera that captures an image. - The
electronic device 1000 includes a housing 1001, a display 1002, and cameras 1005 and 1006. The display 1002 substantially covers an entire front surface of the housing 1001 and includes a first region 1003 and a second region 1002, depending on an operating mode of the electronic device 1000 or an application being executed. The display 1002 may be provided integrally with a touch sensor configured to sense a user's touch input. - The
cameras 1005 and 1006 may include a general camera 1005 and a time-of-flight (ToF) camera 1006. The general camera 1005 may include a first camera 1005A and a second camera 1005B. The first camera 1005A and the second camera 1005B may be implemented with image sensors having different angles of view, different aperture values, or different numbers of pixels. Due to a thickness of the housing 1001, it may be difficult to employ a zoom lens for adjusting an angle of view and an aperture value in the general camera 1005. - Accordingly, the
first camera 1005A and the second camera 1005B, having different angles of view and/or different aperture values, may provide an image capturing function satisfying a user's various needs. - The
ToF camera 1006 may be combined with an additional light source to generate a depth map. The ToF camera 1006 may provide a face recognition function. As an example, the ToF camera 1006 may operate in combination with an infrared light source. - Referring to
FIG. 19B , which illustrates a rear surface of the electronic device 1000, a camera 1007 and a light emitting unit 1008 may be disposed on the rear surface. Similar to the camera 1005 disposed on a front surface of the electronic device 1000, the camera 1007 includes a plurality of cameras 1007A to 1007C having at least one of different aperture values, different angles of view, and different numbers of pixels of the image sensor. The light emitting unit 1008 may employ a light emitting diode (LED) as a light source and may operate as a flash when capturing images using the camera 1007. - As described with reference to
FIGS. 19A and 19B , an electronic device 1000, having two or more cameras 1005 to 1007 mounted therein, may provide various image capturing functions. An image signal processor, mounted in the electronic device 1000, needs to be appropriately tuned to improve the quality of a result image captured by the cameras 1005 to 1007. - The image signal processor, mounted in the
electronic device 1000, may process raw data generated by the cameras 1005 to 1007 depending on values of a plurality of parameters to generate a result image. The quality or characteristics of the result image may depend on the values of the parameters applied to the image signal processor, in addition to the raw data. In an exemplary embodiment, weights are applied to the parameters used in an operation of the image signal processor to generate weighted parameters, and the quality and characteristics of the result image are improved by adjusting the weights. - Alternatively, weights are applied to the parameters used in an operation of the image signal processor to generate weighted parameters, and a user of the
electronic device 1000 adjusts the weights to generate a preferred result image. For example, the electronic device 1000 may directly receive feedback from the user to adjust the weights applied to the parameters. Alternatively, the color, sharpness, and contrast ratio of the user's preferred images may be accumulated depending on a capturing site (e.g., the location where an image of the subject was captured), a capturing time (e.g., a time when the image of the subject was captured), and a type of captured subject, and thus the weights of the parameters applied to the image signal processor may be changed. - As an example, when the user prefers low sharpness and warm colors for images of people captured outdoors on a sunny day, the
electronic device 1000 may adjust the weights applied to the parameters in a front end of an input layer of a machine learning model, such that, among the evaluation scores output by the embedded machine learning model, sharpness and color are adjusted toward the user's preference. The adjusted weights may be stored in a memory and applied to the parameters of the image signal processor when a capturing environment in which a person is captured outdoors on a sunny day is recognized. -
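A per-environment preference store of this kind can be illustrated with a small Python sketch. The environment key (site, time, subject type) and every name here are hypothetical, added only to make the accumulation idea concrete.

```python
preferred_weights = {}   # capturing environment -> weights for the ISP parameters

def remember_preference(site, time_of_day, subject, weights):
    """Accumulate the user's preferred weights for one capturing environment."""
    preferred_weights[(site, time_of_day, subject)] = weights

def weights_for(site, time_of_day, subject, default=(1.0, 1.0)):
    """Return stored weights when the capturing environment is
    recognized, or neutral weights otherwise."""
    return preferred_weights.get((site, time_of_day, subject), default)

# user prefers low sharpness and warm color for people outdoors on sunny days
remember_preference("outdoors", "sunny_day", "person", (0.7, 1.3))
```

When the same environment is recognized again, the stored weights are looked up and applied to the parameters of the image signal processor.
-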
FIGS. 20 and 21 illustrate an operation of an electronic device according to an exemplary embodiment of the inventive concept. -
FIG. 20 is a raw image corresponding to an image before an image signal processor signal-processes raw data, and FIG. 21 is a result image generated by an image signal processor signal-processing the raw data. In the exemplary embodiments illustrated in FIGS. 20 and 21 , the raw image may exhibit poorer noise characteristics than the result image. For example, certain weights applied to the parameters of an image signal processor may be set to values that improve noise characteristics. Alternatively, other weights applied to the parameters of an image signal processor may be set to values that deteriorate noise characteristics, depending on a user setting or a capturing environment. - As described above, according to an exemplary embodiment, a plurality of parameters that determine operating characteristics of an image signal processor may be tuned using a machine learning model. Weights for the plurality of parameters applied to the image signal processor may be determined using the machine learning model such that the image signal processor achieves optimal performance. Accordingly, the image signal processor may be objectively and precisely tuned, as compared with a conventional manner in which a person manually tunes the image signal processor. In addition, the weights applied to the parameters may be adjusted by considering feedback received from a user of an electronic device in which the image signal processor is mounted. Thus, an image signal processor optimized for the user may be implemented.
- While exemplary embodiments of the inventive concept have been shown and described above, it will be apparent to those skilled in the art that modifications and variations can be made without departing from the scope of the present inventive concept.
Claims (21)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/067,298 US20230117343A1 (en) | 2019-05-21 | 2022-12-16 | Predicting optimal values for parameters used in an operation of an image signal processor using machine learning |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020190059573A KR20200134374A (en) | 2019-05-21 | 2019-05-21 | Modeling method for image signal processor, and electronic device |
KR10-2019-0059573 | 2019-05-21 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/067,298 Continuation US20230117343A1 (en) | 2019-05-21 | 2022-12-16 | Predicting optimal values for parameters used in an operation of an image signal processor using machine learning |
Publications (1)
Publication Number | Publication Date |
---|---|
US20200372682A1 true US20200372682A1 (en) | 2020-11-26 |
Family
ID=69770536
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/724,626 Abandoned US20200372682A1 (en) | 2019-05-21 | 2019-12-23 | Predicting optimal values for parameters used in an operation of an image signal processor using machine learning |
US18/067,298 Pending US20230117343A1 (en) | 2019-05-21 | 2022-12-16 | Predicting optimal values for parameters used in an operation of an image signal processor using machine learning |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/067,298 Pending US20230117343A1 (en) | 2019-05-21 | 2022-12-16 | Predicting optimal values for parameters used in an operation of an image signal processor using machine learning |
Country Status (4)
Country | Link |
---|---|
US (2) | US20200372682A1 (en) |
EP (1) | EP3742389A1 (en) |
KR (1) | KR20200134374A (en) |
CN (1) | CN111988544A (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210031367A1 (en) * | 2019-07-31 | 2021-02-04 | Brain Corporation | Systems, apparatuses, and methods for rapid machine learning for floor segmentation for robotic devices |
US20220269507A1 (en) * | 2021-02-24 | 2022-08-25 | Northrop Grumman Systems Corporation | Systems and methods for emulating a processor |
US20220343770A1 (en) * | 2021-04-27 | 2022-10-27 | Rockwell Collins, Inc. | Machine-learned operating system and processor |
US11532077B2 (en) | 2020-08-17 | 2022-12-20 | Netflix, Inc. | Techniques for computing perceptual video quality based on brightness and color components |
US11557025B2 (en) * | 2020-08-17 | 2023-01-17 | Netflix, Inc. | Techniques for training a perceptual quality model to account for brightness and color distortions in reconstructed videos |
US20230093199A1 (en) * | 2021-06-09 | 2023-03-23 | Oracle International Corporation | Tuning external invocations utilizing weight-based parameter resampling |
US20230156348A1 (en) * | 2021-01-21 | 2023-05-18 | Nec Corporation | Parameter optimization system, parameter optimization method, and computer program |
WO2024044474A1 (en) * | 2022-08-22 | 2024-02-29 | Qualcomm Incorporated | Systems and methods for multi-context image capture |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112488962A (en) * | 2020-12-17 | 2021-03-12 | 成都极米科技股份有限公司 | Method, device, equipment and medium for adjusting picture color based on deep learning |
CN115719440A (en) * | 2021-08-23 | 2023-02-28 | 索尼集团公司 | Image signal processor optimization method and device |
WO2023035263A1 (en) * | 2021-09-13 | 2023-03-16 | 华为技术有限公司 | Method and device for determining image signal processing parameters, and perception system |
CN116091900A (en) * | 2021-10-29 | 2023-05-09 | 华为技术有限公司 | Image processing method, device and storage medium |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9826149B2 (en) * | 2015-03-27 | 2017-11-21 | Intel Corporation | Machine learning of real-time image capture parameters |
US9916525B2 (en) * | 2015-10-13 | 2018-03-13 | Siemens Healthcare Gmbh | Learning-based framework for personalized image quality evaluation and optimization |
US11030722B2 (en) * | 2017-10-04 | 2021-06-08 | Fotonation Limited | System and method for estimating optimal parameters |
US10755425B2 (en) * | 2018-02-05 | 2020-08-25 | Intel Corporation | Automatic tuning of image signal processors using reference images in image processing environments |
US10796200B2 (en) * | 2018-04-27 | 2020-10-06 | Intel Corporation | Training image signal processors using intermediate loss functions |
-
2019
- 2019-05-21 KR KR1020190059573A patent/KR20200134374A/en unknown
- 2019-12-23 US US16/724,626 patent/US20200372682A1/en not_active Abandoned
-
2020
- 2020-03-03 EP EP20160775.1A patent/EP3742389A1/en active Pending
- 2020-04-21 CN CN202010315993.7A patent/CN111988544A/en active Pending
-
2022
- 2022-12-16 US US18/067,298 patent/US20230117343A1/en active Pending
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210031367A1 (en) * | 2019-07-31 | 2021-02-04 | Brain Corporation | Systems, apparatuses, and methods for rapid machine learning for floor segmentation for robotic devices |
US11613016B2 (en) * | 2019-07-31 | 2023-03-28 | Brain Corporation | Systems, apparatuses, and methods for rapid machine learning for floor segmentation for robotic devices |
US11532077B2 (en) | 2020-08-17 | 2022-12-20 | Netflix, Inc. | Techniques for computing perceptual video quality based on brightness and color components |
US11557025B2 (en) * | 2020-08-17 | 2023-01-17 | Netflix, Inc. | Techniques for training a perceptual quality model to account for brightness and color distortions in reconstructed videos |
US20230156348A1 (en) * | 2021-01-21 | 2023-05-18 | Nec Corporation | Parameter optimization system, parameter optimization method, and computer program |
US20220269507A1 (en) * | 2021-02-24 | 2022-08-25 | Northrop Grumman Systems Corporation | Systems and methods for emulating a processor |
US11550580B2 (en) * | 2021-02-24 | 2023-01-10 | Northrop Grumman Systems Corporation | Systems and methods for emulating a processor |
US20220343770A1 (en) * | 2021-04-27 | 2022-10-27 | Rockwell Collins, Inc. | Machine-learned operating system and processor |
US20230093199A1 (en) * | 2021-06-09 | 2023-03-23 | Oracle International Corporation | Tuning external invocations utilizing weight-based parameter resampling |
US11860839B2 (en) * | 2021-06-09 | 2024-01-02 | Oracle International Corporation | Tuning external invocations utilizing weight-based parameter resampling |
WO2024044474A1 (en) * | 2022-08-22 | 2024-02-29 | Qualcomm Incorporated | Systems and methods for multi-context image capture |
Also Published As
Publication number | Publication date |
---|---|
CN111988544A (en) | 2020-11-24 |
KR20200134374A (en) | 2020-12-02 |
US20230117343A1 (en) | 2023-04-20 |
EP3742389A1 (en) | 2020-11-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20230117343A1 (en) | Predicting optimal values for parameters used in an operation of an image signal processor using machine learning | |
US10931897B2 (en) | Systems and methods for a digital image sensor | |
JP7179461B2 (en) | Image sensor apparatus and method for simultaneously capturing flash and ambient illumination images | |
US11803296B2 (en) | Electronic device displaying interface for editing video data and method for controlling same | |
CN103236433B (en) | Broadband imager | |
JP2021158674A (en) | Image sensor apparatus and method for simultaneously capturing multiple images | |
CN104038702A (en) | Image capture apparatus and control method thereof | |
US10785387B2 (en) | Electronic device for taking moving picture by adjusting threshold associated with movement of object in region of interest according to movement of electronic device and method for operating same | |
CN103685920A (en) | Image processing apparatus and method and an imaging apparatus having image processing apparatus | |
JP2017536786A (en) | Image sensor apparatus and method for obtaining low noise and high speed capture of a photographic scene | |
TW201841493A (en) | Image Sensing Method and Image Sensor with Rolling Exposure Time Compensation | |
US20160019681A1 (en) | Image processing method and electronic device using the same | |
CN109474781A (en) | Photographic device, the control method of photographic device, recording medium | |
CN106375655A (en) | Digital photographing apparatus and digital photographing method | |
US20230105329A1 (en) | Image signal processor and image sensor including the image signal processor | |
WO2017175802A1 (en) | Image processing device, electronic apparatus, playback device, playback program, and playback method | |
KR100921817B1 (en) | Image sensor for motion detection and optic pointing device using it | |
US20240070835A1 (en) | System and platform for automatic optimization of image quality of image sensor and operating method thereof | |
CN104639842A (en) | Image processing device and exposure control method | |
US20210390671A1 (en) | Image processing system for performing image quality tuning and method of performing image quality tuning | |
JP6794649B2 (en) | Electronics and recording programs | |
JP2022108329A (en) | Information processing device, information processing system, information processing method and program | |
US9380226B2 (en) | System and method for extraction of a dynamic range zone image | |
CN102387301A (en) | Imaging apparatus and imaging method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KIM, YOUNGHOON;KIM, SUNGSU;LEE, JUNGMIN;SIGNING DATES FROM 20190925 TO 20190927;REEL/FRAME:051353/0387 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |