US20150264329A1 - Portable terminal device and image correction method - Google Patents
- Publication number: US20150264329A1 (application US 14/592,795)
- Authority: US (United States)
- Prior art keywords: image, information, color temperature, terminal device, portable terminal
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- H04N 9/735
- H04N 23/88 — Camera processing pipelines; components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
- H04N 1/6077 — Colour balance, e.g. colour cast correction
- G06T 5/005
- H04N 1/6086 — Colour correction or control controlled by factors external to the apparatus, by scene illuminant, i.e. conditions at the time of picture capture, e.g. flash, optical filter used, evening, cloud, daylight, artificial lighting, white point measurement, colour temperature
- H04N 23/71 — Circuitry for evaluating the brightness variation
- H04N 9/646 — Circuits for processing colour signals for image enhancement, e.g. vertical detail restoration, cross-colour elimination, contour correction, chrominance trapping filters
- G06T 2207/10024 — Color image
- G06T 2207/20172 — Image enhancement details
- G06T 2207/30216 — Redeye defect
Definitions
- FIG. 1 illustrates a configuration example of a portable terminal device 100 in the first embodiment.
- the portable terminal device 100 may include information processing devices such as feature phones, smart phones, tablet terminals, personal digital assistants (PDA), and portable game consoles, for example.
- the portable terminal device 100 includes a camera module 150 , a first color temperature sensor 151 , a second color temperature sensor 152 , and an image processor 153 .
- the camera module 150 captures an image.
- the camera module 150 outputs image data of the captured image to the image processor 153 .
- the image processor 153 compares first information from the first color temperature sensor 151 that is provided on a first side 161 of the portable terminal device 100 with second information from the second color temperature sensor 152 that is provided on a second side 162 of the portable terminal device 100 .
- the image processor 153 performs image correction to the image that is captured by the camera module 150 in accordance with a comparison result between the first information and the second information. In this case, the image processor 153 may not perform the image correction in accordance with the comparison result.
- the first and second color temperature sensors 151 and 152 are provided on the first and second sides 161 and 162 of the portable terminal device 100 . Accordingly, the portable terminal device 100 may obtain information that corresponds to a case where the direction of light that is received by the object is different from the direction of light that is received by the color sensor when an image is captured.
- the image processor 153 performs the image correction to the image that is captured by the camera module 150 based on each of the first information and the second information from the first and second color temperature sensors 151 and 152 and may thereby perform appropriate image correction in accordance with a surrounding environment.
- In this case, the second side 162 is preferably provided on a back side with respect to the first side 161 of the portable terminal device 100. This is because the first and second color temperature sensors 151 and 152 that are respectively provided on the first and second sides 161 and 162 allow the image correction to cope with the case where the direction of light that is received by the object is different from the direction of light that is received by the color sensor, compared to a case where the first and second color temperature sensors 151 and 152 are provided on other sides.
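As a loose illustration of this comparison-driven decision, the sketch below checks whether the readings from the two sides diverge enough to warrant correction. It is not taken from the patent: the function name, the kelvin-valued inputs, and the 500 K margin are all assumptions made only for the example.

```python
def needs_correction(ct_first_side_k: float, ct_second_side_k: float,
                     margin_k: float = 500.0) -> bool:
    """Return True when the two sides of the device see noticeably different light.

    ct_first_side_k / ct_second_side_k: color temperatures (kelvin) derived from
    the first-side and second-side sensors. The 500 K margin is an arbitrary
    placeholder, not a value given in the patent.
    """
    return abs(ct_first_side_k - ct_second_side_k) > margin_k
```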
- a second embodiment will next be described.
- An example of an external appearance of the portable terminal device and so forth will first be described, and a configuration example of the portable terminal device will next be described.
- FIGS. 2A to 2C illustrate an example of an external appearance of the portable terminal device (which may hereinafter be referred to as terminal) 100 in the second embodiment.
- FIGS. 2A, 2B, and 2C illustrate the example of the external appearance in an oblique direction, from a front side, and from a back side, respectively.
- the terminal 100 may be a feature phone, a smart phone, a tablet, a personal computer, or the like, for example.
- the example in FIGS. 2A to 2C illustrates an example of a smart phone.
- the terminal 100 performs radio communication with a radio base station device and is provided with various services such as call services and video streaming services via the radio base station device.
- the terminal 100 has a camera function of capturing an image in addition to a call function.
- a camera lens 117-1 and an RGB sensor 118 are provided on the back side of the terminal 100.
- the RGB sensor 118 is provided on the same side as the side on which the camera lens 117-1 is installed, for example.
- the RGB sensor 118 is used to perform image correction to the image that is captured by the camera lens 117-1, for example.
- the RGB sensor 118 may hereinafter be referred to as camera module side RGB sensor 118, for example.
- a camera module is provided in an internal portion of the terminal 100.
- the camera function in the terminal 100 may be realized by the camera module, and an image of the object or the like may be captured.
- the camera lens 117-1 and the camera module side RGB sensor 118 are portions of the camera module, for example. The camera module will be described later in detail.
- a liquid crystal display (LCD) 120 and an RGB sensor 115 are provided on a face side of the terminal 100 .
- the RGB sensor 115 is provided on the same side as the side on which the LCD 120 is installed, for example.
- the RGB sensor 115 is used for adjustment of colors of a screen displayed on the LCD 120 and as a proximity sensor, for example.
- the RGB sensor 115 is used as an RGB sensor for the image correction. Details will be described later.
- the RGB sensor 115 may hereinafter be referred to as LCD side RGB sensor 115 .
- FIG. 3 illustrates a configuration example of the terminal 100 .
- the terminal 100 includes a main processor 101 , a communication control section 102 , a radio frequency (RF) section 103 , an antenna 104 , a microphone 105 , an input sound conversion section 106 , a sound processing section 107 , an output sound conversion section 108 , and a speaker 109 .
- the terminal 100 includes an application control processing unit (ACPU) 110 , a human centric engine (HCE) 111 , an acceleration sensor 112 , a gyro sensor 113 , a geomagnetism sensor 114 , and the LCD side RGB sensor 115 .
- the terminal 100 includes an image processor 116 , a camera module 117 , the LCD 120 , an input control section 121 , and a memory 122 .
- the camera module 117 includes the camera module side RGB sensor 118 .
- the main processor 101 is connected with the communication control section 102 , the input sound conversion section 106 , the sound processing section 107 , the output sound conversion section 108 , the ACPU 110 , the HCE 111 , the image processor 116 , the camera module 117 , the LCD 120 , the input control section 121 , and the memory 122 via a bus 125 .
- the main processor 101 controls each section of the terminal 100 .
- the main processor 101 performs control related to radio communication and control related to input and output of sounds.
- the communication control section 102 performs various kinds of control related to radio communication. For example, the communication control section 102 applies a demodulation process or a decoding process to a baseband signal that is output from the RF section 103 , thereby extracts user data (sound data, image data, and so forth) or the like, and outputs those to the sound processing section 107 , the image processor 116 , or the like. Further, the communication control section 102 receives the user data or the like from the sound processing section 107 , the image processor 116 , or the like, applies a coding process, a modulation process, or the like to the user data or the like, and outputs those to the RF section 103 .
- the RF section 103 performs frequency conversion of a radio signal in a radio band that is received by the antenna 104 into a baseband signal in the baseband (down-conversion) or performs frequency conversion of a signal output from the communication control section 102 into the radio signal in the radio band (up-conversion).
- the antenna 104 transmits the radio signal output from the RF section 103 to the radio base station device or receives the radio signal transmitted from the radio base station device and outputs the signal to the RF section 103.
- the input sound conversion section 106 converts a sound input from the microphone 105 into digital sound data and outputs the converted sound data to the sound processing section 107 , for example.
- the sound processing section 107 performs a process of canceling noises in the sound data output from the input sound conversion section 106, or a compression process on the sound data, and outputs the sound data that result from the sound processing to the communication control section 102 or the like, for example. Further, the sound processing section 107 expands the compressed sound data output from the communication control section 102 or the like, or cancels noises in them, and outputs the sound data that result from the sound processing to the output sound conversion section 108 or the like, for example.
- the output sound conversion section 108 receives the sound data from the sound processing section 107 , converts the digital sound data into analog sound data, and allows the sound to be output from the speaker 109 , for example.
- the ACPU 110 is a CPU for applications and controls the image processor 116 , the camera module 117 , and so forth, for example.
- the HCE 111 functions as a sub-processor of the ACPU 110 and controls the acceleration sensor 112 , the gyro sensor 113 , the geomagnetism sensor 114 , and the LCD side RGB sensor 115 , for example.
- the acceleration sensor 112 detects an acceleration of the terminal 100 by a capacitance detection method, a piezoresistance method, or the like, for example.
- the gyro sensor 113 detects an angular velocity of the terminal 100 by applying a mechanical rotation method or the like to an internal object, for example.
- the geomagnetism sensor 114 detects a fluctuation of an external magnetic field and thereby detects the direction of the terminal 100 , for example.
- the image processor 116 performs the image correction to the image data output from the camera module 117 or performs image processing such as compression of the image data and outputs the image data that result from the image processing to the communication control section 102 , the memory 122 , or the like, for example.
- the image processor 116 performs image processing such as an expansion process or noise canceling to the compressed image data output from the communication control section 102 or the like and outputs the image data that result from the image processing to the LCD 120 , the memory 122 , or the like.
- the image processor 116 will be described later in detail.
- the camera module 117 is a section that realizes the camera function in the terminal 100 and also includes the camera module side RGB sensor 118, the camera lens 117-1, an image capturing element, and so forth.
- the camera module 117 will be described later in detail.
- the LCD 120 is a liquid crystal display screen in the terminal 100 and may display, in the screen, operation keys or the like that enable an operation. For example, the operation keys that are displayed on the LCD 120 are operated, thereby starting the camera module 117 (or a camera).
- the input control section 121 includes a button for turning on or off a power supply of the terminal 100, a button for turning on or off a power supply of the LCD 120, and the like, for example.
- the input control section 121 outputs a signal that corresponds to a button operation to the main processor 101 or the like and may thereby turn on the power supply of the LCD 120 or the power supply of the terminal 100 by control of the main processor 101 or the like.
- the memory 122 stores the sound data, the image data, other data, and so forth.
- the terminal 100 may have a hardware configuration that includes a first CPU 131 , a second CPU 132 , and a third CPU 133 .
- the terminal 100 includes the first to third CPUs 131 to 133 , the RF section 103 , the antenna 104 , the microphone 105 , the speaker 109 , the LCD 120 , and the memory 122 .
- the first CPU 131 corresponds to the main processor 101 , the communication control section 102 , the input sound conversion section 106 , the sound processing section 107 , the output sound conversion section 108 , and the input control section 121 , for example.
- the second CPU 132 corresponds to the ACPU 110 , the image processor 116 , and the camera module 117 .
- the third CPU 133 corresponds to the HCE 111 , the acceleration sensor 112 , the gyro sensor 113 , the geomagnetism sensor 114 , and the LCD side RGB sensor 115 .
- the first to third CPUs 131 to 133 may be processors, controllers, or the like such as micro processing units (MPU) and field-programmable gate arrays (FPGA), for example.
- the first to third CPUs 131 to 133 read out and execute programs stored in the memory 122 and may thereby realize the functions of the main processor 101 , the communication control section 102 , the RGB sensors 115 and 118 , the image processor 116 , and so forth.
- FIG. 4 illustrates a configuration example of the RGB sensors 115 and 118 and the image processor 116 .
- the LCD side RGB sensor 115 includes an RGB light receiving element 1151 and an analog-to-digital (A/D) converter 1152 .
- the RGB light receiving element 1151 is a photodiode, for example, and converts received light into an electric signal by photoelectric conversion.
- the RGB light receiving element 1151 includes three light receiving elements of R, G, and B, thereby generates RGB signals (which may hereinafter be referred to as RGB data (or image data)) as electric signals, and outputs the RGB signals to the A/D converter 1152 .
- the RGB light receiving element 1151 is formed as plural RGB light receiving elements 1151 in the LCD side RGB sensor 115 , and the plural RGB light receiving elements 1151 configure the image capturing element.
- the A/D converter 1152 converts analog RGB data output from the RGB light receiving elements 1151 into digital RGB data and outputs the converted RGB data to the image processor 116 .
- the camera module 117 includes camera components 1171 and the camera module side RGB sensor 118 .
- the camera components 1171 include the camera lens 117-1 illustrated in FIG. 2C, a diaphragm device, and so forth.
- the camera module side RGB sensor 118 includes an RGB light receiving element 1181 and an A/D converter 1182 .
- the RGB light receiving element 1181 is also a photodiode, for example, converts received light into an electric signal by photoelectric conversion, and generates RGB data as electric signals.
- the RGB light receiving element 1181 is also formed as plural RGB light receiving elements 1181 in the camera module 117 .
- the A/D converter 1182 converts analog RGB data output from the RGB light receiving elements 1181 into digital RGB data and outputs the converted RGB data to the image processor 116 .
- the image processor 116 includes a memory 1161 , a color temperature processing section 1162 , and an image correction section 1163 .
- the memory 1161 stores the RGB data output from the two RGB sensors 115 and 118 (or values of R, G, and B contained in the RGB data; the values of R, G, and B may hereinafter be referred to as RGB value). Further, the memory 1161 stores a standard value of the color temperature of the sunlight. The standard value will be described later in detail. In addition, the memory 1161 also stores color temperature data that correspond to the RGB data that are detected by the LCD side RGB sensor 115 when the camera is started. The color temperature data are data (or values) that result from conversion of the concerned RGB data stored in the memory 1161 by the color temperature processing section 1162 .
- the memory 1161 may be provided in the image processor 116 or may correspond to the memory 122 provided outside of the image processor 116 .
- the color temperature processing section 1162 converts the RGB data stored in the memory 1161 into the color temperature data (or color temperature values, which may hereinafter be referred to as color temperature data) and stores the converted color temperature data in the memory 1161 .
- the color temperature data may be obtained by the following calculation, for example.
- the color temperature processing section 1162 converts the RGB values in an RGB color coordinate system into xy chromaticity values in an XYZ color coordinate system and then converts those xy chromaticity values into uv chromaticity values in an Luv color coordinate system.
- the color temperature processing section 1162 then calculates, as the color temperature, the absolute temperature that corresponds to the uv chromaticity values. The color temperature processing section 1162 performs such conversions and calculations between color coordinate systems by basic arithmetic operations, by looking up values in a correspondence table retained in the memory 1161, and so forth, for example.
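For reference, the textbook relations that such a conversion typically uses are given below; the patent does not spell out its exact formulas or the contents of its table, so these standard CIE 1931 and CIE 1960 definitions are offered only as the usual form this calculation takes.

$$x = \frac{X}{X + Y + Z}, \qquad y = \frac{Y}{X + Y + Z}$$

$$u = \frac{4x}{-2x + 12y + 3}, \qquad v = \frac{6y}{-2x + 12y + 3}$$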
- the color temperature is a criterion that represents, as a quantitative value, the color of light that is emitted by a certain light source; kelvin (K) is used as its unit, while lux (lx), which often appears alongside it, is a unit of illuminance rather than of color temperature.
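A minimal sketch of the whole RGB-to-color-temperature step might look like the following. It assumes linear RGB input in the range [0, 1], uses the standard sRGB (D65) matrix, and substitutes McCamy's approximation for the table lookup described above; none of these specifics come from the patent.

```python
def rgb_to_cct(r: float, g: float, b: float) -> float:
    """Estimate a correlated color temperature in kelvin from linear RGB in [0, 1].

    The sRGB (D65) matrix and McCamy's 1992 approximation are stand-ins for
    whatever conversion table the color temperature processing section uses.
    """
    # Linear RGB -> CIE XYZ (sRGB / D65 primaries)
    big_x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    big_y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    big_z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    total = big_x + big_y + big_z
    if total == 0.0:
        raise ValueError("a black reading has no defined color temperature")
    x, y = big_x / total, big_y / total   # CIE 1931 chromaticity
    n = (x - 0.3320) / (0.1858 - y)       # McCamy's intermediate term
    return 449.0 * n ** 3 + 3525.0 * n ** 2 + 6823.3 * n + 5520.33
```

For example, a roughly neutral reading such as rgb_to_cct(0.9, 0.9, 0.9) lands near daylight values, while a reading that is strongly biased toward red yields a much lower estimate.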
- the color temperature processing section 1162 performs the following process, for example. That is, the color temperature processing section 1162 calculates the color temperature data that correspond to the RGB data generated by the LCD side RGB sensor 115 . Further, the color temperature processing section 1162 reads out the standard value of a color temperature related to the sunlight that is stored in the memory 1161 from the memory 1161 . The color temperature processing section 1162 then compares the color temperature that corresponds to the LCD side RGB sensor 115 with the standard value and outputs a comparison result to the image correction section 1163 . Details will be described in an operation example.
- the image correction section 1163 performs (or does not perform) the image correction to the image that is captured by the camera module 117 in accordance with the comparison result from the color temperature processing section 1162 .
- the image correction section 1163 reads out the RGB data of the image captured by the camera module 117 from the memory 1161 and performs the image correction to the RGB data.
- As the image correction, automatic exposure (AE) correction or the like is performed by adjusting the RGB values, for example. Details of the image correction will also be described in the operation example.
- the image correction section 1163 outputs the image data to which the image correction is performed or the image data to which the image correction is not performed to the LCD 120 or the memory 122 .
- the first color temperature sensor 151 in the first embodiment corresponds to the LCD side RGB sensor 115 , the memory 1161 , and the color temperature processing section 1162 in the second embodiment, for example.
- the second color temperature sensor 152 in the first embodiment corresponds to the camera module side RGB sensor 118 , the memory 1161 , and the color temperature processing section 1162 in the second embodiment, for example.
- the image processor 153 in the first embodiment corresponds to the image processor 116 in the second embodiment, for example.
- FIG. 5 is a flowchart that illustrates an operation example in the second embodiment.
- the terminal 100 starts the camera, measures the color temperature by the LCD side RGB sensor 115, and saves the color temperature in the image processor 116 (S11).
- the side of the LCD 120 of the terminal 100 may be directed toward the sky when the camera is started.
- FIG. 6 illustrates such a state.
- the side of the LCD 120 is directed toward the sky.
- the LCD side RGB sensor 115 measures the RGB values when the camera is started and then obtains the color temperature value that corresponds to the RGB values, and the brightness around the object of the image capturing may thereby be obtained.
- the image processor 116 calculates the color temperature in the color temperature processing section 1162 with respect to the RGB values that are obtained by the LCD side RGB sensor 115 when the camera is started and saves the color temperature value in the memory 1161 .
- the terminal 100 next determines whether or not the color temperature at the time when the camera is started is lower than a standard color temperature of the sunlight (S12).
- the color temperature processing section 1162 reads out the color temperature value at the time when the camera is started and the standard color temperature value of the sunlight from the memory 1161 and compares the two color temperature values, thereby making a determination.
- the terminal 100 performs the AE correction (or white balance adjustment, which may hereinafter be referred to as "AE correction") (S13) in a case where the color temperature at the time when the camera is started is lower than the standard color temperature of the sunlight (YES in S12).
- the case where the color temperature at the time when the camera is started is lower than the standard color temperature of the sunlight is a case where the environment at the time when the camera is started is darker than a case where an image is captured in the sunlight, for example. In a case where the environment is dark, the captured image becomes dark as a whole. In such a case, the terminal 100 corrects the RGB data of the image captured by the camera module 117 and performs the AE correction such that the image becomes brighter as a whole.
- when the color temperature processing section 1162 of the image processor 116 detects that the color temperature value at the time when the camera is started is lower than the standard color temperature value of the sunlight, it outputs the comparison result to the image correction section 1163.
- when the image correction section 1163 obtains the comparison result, it obtains the RGB data of the image captured by the camera module 117 from the memory 1161, multiplies all the RGB values by a prescribed ratio, and changes the obtained RGB values to larger RGB values, thereby performing the AE correction.
- the image correction section 1163 may perform the image correction by correcting values that are offset by a certain value or larger among the RGB values obtained from the memory 1161 to a reference value or lower. Accordingly, the image correction section 1163 may correct a dark image to a bright image, for example.
- the terminal 100 then saves the image data that result from the AE correction in the memory 1161 (S14) and finishes the series of processes (S15).
- in a case where the color temperature at the time when the camera is started is not lower than the standard color temperature of the sunlight (NO in S12), the environment at the time when the camera is started is as bright as or brighter than an environment in the sunlight, so the terminal 100 saves the image data with no particular image correction (S14) and finishes the series of processes (S15) without performing the AE correction.
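Putting S12 through S15 together, the flow could be sketched roughly as below. The 5,500 K stand-in for the stored sunlight standard, the 1.25 brightening ratio, and the function name are assumptions made only for this illustration.

```python
SUNLIGHT_STANDARD_K = 5500.0  # assumed placeholder for the stored sunlight standard
AE_GAIN = 1.25                # assumed "prescribed ratio" used to brighten the image

def process_on_camera_start(ct_at_start_k, captured_pixels):
    """S12-S15: brighten the captured image when the color temperature measured
    by the LCD side RGB sensor at camera start (S11) is below the sunlight standard.

    ct_at_start_k: color temperature in kelvin saved when the camera was started.
    captured_pixels: list of [r, g, b] values (0-255) from the camera module.
    """
    if ct_at_start_k < SUNLIGHT_STANDARD_K:                  # S12
        return [[min(255, round(v * AE_GAIN)) for v in px]   # S13: AE correction
                for px in captured_pixels]                   # S14: save corrected data
    return captured_pixels                                   # S14: save as captured
```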
- the terminal 100 detects the color temperature by using a state where the side of the LCD 120 of the terminal 100 is directed toward the sky when the camera is started.
- the color temperature around the object may be detected with high accuracy.
- the terminal 100 uses the color temperature for a determination of whether or not the image correction is performed and may thus perform appropriate image correction in accordance with the color temperature around the object, that is, in accordance with a surrounding environment.
- the third embodiment is an example in which whether or not to perform the image correction is determined by using the two RGB sensors 115 and 118, that is, the LCD side RGB sensor 115 and the camera module side RGB sensor 118.
- a configuration example of the terminal 100 in the third embodiment is similar to the second embodiment and is illustrated in FIGS. 2A to 4 , for example.
- FIG. 7 is a flowchart that illustrates an operation example in the third embodiment.
- the terminal 100 starts the process (S30).
- the terminal 100 measures the color temperatures in the two RGB sensors 115 and 118 when the shutter of the camera is pressed (S31).
- the measurement is performed as follows. That is, when the shutter in the camera components 1171 is pressed, a signal that indicates that the shutter is pressed is output to the camera module side RGB sensor 118 . Further, the signal is also output to the image processor 116 , and the image processor 116 outputs the signal to the LCD side RGB sensor 115 .
- the two RGB sensors 115 and 118 obtain the RGB data on receiving the signal.
- the camera module side RGB sensor 118 obtains the RGB data of the object side (for example, the first side)
- the LCD side RGB sensor 115 obtains the RGB data of the opposite side to the object side (for example, the second side provided on the back side of the first side).
- the RGB data obtained by the two RGB sensors 115 and 118 are sent to the memory 1161 of the image processor 116 .
- the color temperature processing section 1162 obtains the RGB data obtained by the two RGB sensors 115 and 118 from the memory 1161 and calculates the color temperature data with respect to the two sets of RGB data.
- a calculation method of the color temperature data is similar to the second embodiment, for example.
- the terminal 100 next determines whether or not the color temperature from the LCD side RGB sensor 115 is lower than the color temperature from the camera module side RGB sensor 118 (S32).
- in a case where the color temperature from the LCD side RGB sensor 115 is lower (YES in S32), the terminal 100 performs the image correction by the AE correction to make the captured image brighter (S33).
- FIG. 8A illustrates an example of a case where the direction of light that is received by the object is different from the direction of light that is received by the RGB sensors.
- the camera module side RGB sensor 118 directly receives the sunlight, but the LCD side RGB sensor 115 does not directly receive the sunlight. Further, the object is directed toward the camera with the sunlight on the back.
- the color temperature from the LCD side RGB sensor 115 becomes lower than the color temperature from the camera module side RGB sensor 118 .
- when the terminal 100 captures an image of the object in this state, the image becomes dark compared to a case where the object directly receives the sunlight.
- the terminal 100 performs the AE correction to correct the dark image to a bright image.
- Backlight correction or the like may be used as an image correction method of making an image that is darkly captured brighter.
- the color temperature processing section 1162 compares the two color temperatures that are calculated in S 31 .
- the color temperature processing section 1162 detects that the color temperature from the LCD side RGB sensor 115 is lower than the color temperature of the camera module side RGB sensor 118 .
- the color temperature processing section 1162 outputs the detection result (or the comparison result) to the image correction section 1163 .
- the image correction section 1163 reads out the RGB data of the image captured by the camera module 117 from the memory 1161 and applies the AE correction to the RGB data.
- a method of the AE correction is performed in a similar manner to the method described in the second embodiment, for example.
- the image correction section 1163 stores the RGB data that result from the image correction in the memory 1161 .
- in a case where the color temperature from the LCD side RGB sensor 115 is not lower than the color temperature from the camera module side RGB sensor 118 (NO in S32), the terminal 100 performs red-eye correction, which darkens the eye portions detected by facial recognition in the image processor 116 (S35).
- FIG. 8B illustrates an example of a case where the direction of light that is received by the object is different from the direction of light that is received by the RGB sensors.
- the LCD side RGB sensor 115 directly receives the sunlight, but the camera module side RGB sensor 118 does not directly receive the sunlight.
- the object directly receives the sunlight and is directed toward the camera.
- the color temperature from the LCD side RGB sensor 115 becomes equivalent to or higher than the color temperature from the camera module side RGB sensor 118 .
- when the terminal 100 captures an image of the object in this state, a bright image is captured because the object directly receives the sunlight.
- in a case where the object is an animal such as a human, the eyes of the object in the image may be captured redder than normal. In such a case, the terminal 100 performs the red-eye correction.
- the color temperature processing section 1162 compares the two color temperatures that are calculated in S 31 .
- the color temperature processing section 1162 detects that the color temperature from the LCD side RGB sensor 115 is equivalent to or higher than the color temperature from the camera module side RGB sensor 118 .
- the color temperature processing section 1162 outputs the detection result (or the comparison result) to the image correction section 1163 .
- the image correction section 1163 reads out the RGB data of the image captured by the camera module 117 from the memory 1161 and obtains the RGB data of the eye portions of the object based on the RGB data, difference, and so forth of the image in a certain area.
- the image correction section 1163 then corrects data values that correspond to R among the RGB data related to the eye portions to lower values than the obtained values. Alternatively, the image correction section 1163 corrects the data values that correspond to R among the obtained RGB data related to the eye portions (or pixel values related to red) to lower values than a reference value. The image correction section 1163 stores the corrected RGB data in the memory 1161 .
- the terminal 100 performs the AE correction (S33) or the red-eye correction (S35) and then finishes a series of processes (S34).
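A compact sketch of the S32-to-S35 branch follows. The eye-pixel input stands in for whatever facial recognition the image processor performs, and the gain and the way R is pulled down are illustrative choices; the patent only states that a backlit image is brightened and that the R values of the eye portions are lowered.

```python
def correct_captured_image(ct_lcd_k, ct_cam_k, pixels, eye_pixel_indices):
    """S32-S35: branch on the two color temperatures measured at shutter press (S31).

    ct_lcd_k / ct_cam_k: color temperatures (kelvin) from the LCD side and
    camera module side RGB sensors.
    pixels: mutable list of [r, g, b] values (0-255) for the captured image.
    eye_pixel_indices: indices of eye pixels, assumed to come from some face and
    eye detector that the patent does not specify.
    """
    if ct_lcd_k < ct_cam_k:                      # S32: the object is lit from behind
        for px in pixels:                        # S33: AE / backlight correction
            for i in range(3):
                px[i] = min(255, round(px[i] * 1.25))   # assumed brightening ratio
    else:                                        # S35: red-eye correction
        for idx in eye_pixel_indices:
            r, g, b = pixels[idx]
            pixels[idx][0] = min(r, max(g, b))   # pull R down toward the other channels
    return pixels
```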
- the terminal 100 in the third embodiment compares the color temperatures from the two RGB sensors 115 and 118 that are provided on different sides and corrects the captured image in accordance with the comparison result. Accordingly, the two RGB sensors 115 and 118 allow the terminal 100 to take into account the case where the direction of light that is received by the object is different from the direction of light that is received by the RGB sensor 118 when an image is captured. Thus, the terminal 100 may perform appropriate image correction in accordance with the surrounding environment even in such a case.
- in the third embodiment, an example of the image correction is described with the example of the red-eye correction (S35).
- such image correction is only one example, and the image correction may be performed on a specified portion of an image that is affected by direct reception of the sunlight.
- for example, the image correction section 1163 determines that the image has noise in a case where differences greater than a reference value are present between the RGB values of pixels and the RGB values of adjacent pixels, and it may correct the RGB values of the concerned pixels to the reference value or lower while defining the concerned pixels as the specified portion.
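The neighbor-difference check described here might be sketched as follows, scanning one row at a time; the reference value of 200 and the one-dimensional neighborhood are simplifying assumptions made only for the example.

```python
def clamp_noisy_pixels(row, reference=200):
    """Treat pixels that jump by more than `reference` from the previous pixel
    as the 'specified portion' and clamp their channels to the reference value.

    row: list of mutable [r, g, b] values (0-255) along one image row.
    reference: assumed threshold; the patent does not give a concrete value.
    """
    for prev, cur in zip(row, row[1:]):
        if any(abs(c - p) > reference for c, p in zip(cur, prev)):
            for i in range(3):
                cur[i] = min(cur[i], reference)
    return row
```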
- in the above example, the object is an animal such as a human; however, the object may be a landscape or the like. In that case as well, a noise in the image that is caused by reception of the sunlight may be corrected.
- the AE correction (S33) may be performed by a method that reduces values that are offset by a certain value or larger among the RGB values to a reference value or lower.
- in the second and third embodiments, the side on which one of the LCD side RGB sensor 115 and the camera module side RGB sensor 118 is provided is defined as the first side, and the other sensor is installed on the second side, which is provided on the back side with respect to the first side.
- however, the two sensors 115 and 118 do not have to be provided on mutually opposite sides; they may be provided on any two different sides.
- FIG. 9 illustrates an example of a case where the LCD side RGB sensor 115 is installed on an upper side of the terminal 100 .
- the LCD side RGB sensor 115 is installed on the upper side, thereby enabling appropriate calculation of the color temperature around the object based on the sunlight or the like.
- the terminal 100 may perform appropriate image correction in accordance with a surrounding environment even in a case where the direction of light that is received by the object is different from the direction of light that is received by the RGB sensor 118 when an image is captured.
- An upper side (yz plane) with respect to the side (xy plane) on which the LCD 120 is provided in the terminal 100 is defined as “upper side”, for example.
- the portable terminal device may include information processing devices such as tablet terminals, personal digital assistants (PDA), and portable game consoles, for example.
- Such an information processing device may carry out the image correction that is described in the second and third embodiments in a similar manner to the terminal 100 .
Description
- This application is based upon and claims the benefit of priority of the prior Japanese Patent Application No. 2014-051726, filed on Mar. 14, 2014, the entire contents of which are incorporated herein by reference.
- The embodiments discussed herein are related to a portable terminal device, an image correction method, and an image correction program, for example.
- Recently, portable terminal devices such as feature phones and smart phones have various functions in addition to a telephone function. One of such functions is a camera function. The portable terminal device may capture a high resolution image by a sophisticated camera function.
- For example, the portable terminal device includes a camera lens, an image capturing element, a red green blue (RGB) sensor, and so forth and may perform image correction such as white balance adjustment to an image captured by the camera lens and the image capturing element by using the RGB sensor. This allows the portable terminal device to capture a natural image regardless of whether the image capturing position is outdoors or indoors. The RGB sensor that is used for such image correction is provided on a side on which the camera lens is installed, for example.
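White balance adjustment of the kind mentioned here is conventionally implemented as per-channel gains derived from an estimate of the ambient light; the gray-world-style sketch below shows that general idea and is not the patent's own procedure.

```python
def white_balance(pixels, sensor_rgb):
    """Scale each channel so the light sensed by the RGB sensor maps to neutral gray.

    pixels: list of [r, g, b] values (0-255) for the captured image.
    sensor_rgb: the RGB sensor's reading of the ambient light (0-255),
    used here as the illuminant estimate.
    """
    target = sum(sensor_rgb) / 3.0
    gains = [target / max(channel, 1e-6) for channel in sensor_rgb]
    return [[min(255, round(v * g)) for v, g in zip(px, gains)] for px in pixels]
```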
- Differently, there is a case where the portable terminal device includes a display screen such as an LCD and the RGB sensor is installed on a side on which the display screen is provided. The RGB sensor provided on the display screen side is used for adjustment of colors on an image displayed on the display screen and as a proximity sensor, for example. In a case where the RGB sensor is used as the proximity sensor, when a face approaches a microphone or a speaker provided in the portable terminal device on a call, the RGB sensor provided on the display screen side detects the face, and a power supply to the display screen is temporarily turned off. Reduction in power consumption of the portable terminal device may thereby be expected, for example.
- As for such image correction, the following technologies are disclosed, for example. For example, Japanese Laid-open Patent Publication No. 2011-203437 discloses a technology in which a first photometric sensor is provided on a display side of a display section in an electronic camera, a second photometric sensor is provided on a back side of the display side of the display section, and the display section thereby displays a corrected image to which adjustment is applied in accordance with a lighting environment by using a first lighting condition from the first photometric sensor and a second lighting condition from the second photometric sensor.
- It is considered that this technology may provide a unit that properly adjusts appearances of colors of the image displayed on the display section in accordance with brightness of an environment.
- Further, Japanese Laid-open Patent Publication No. 2003-102018 discloses a technology in which in an electronic still camera that includes a white balance adjustment unit that performs white balance adjustment based on a captured image, reliability of a stored white control signal is assessed based on the number of image capturing frames and an elapsed time from a prescribed time point, and a user is notified of low reliability in a case where the reliability is low.
- It is considered that this technology may minimize incorrect white balance adjustment that occurs at an actual image capturing time point in a case where the light source conditions of the lighting are different.
- Further, Japanese Laid-open Patent Publication No. 2008-219128 discloses an image capturing device that changes image capturing modes by controlling the white balance of an image signal obtained by a charge coupled device (CCD) in accordance with the difference between an average first color of an atmosphere that is measured by a color sensor and an average second color of an object that is measured with the image signal.
- It is considered that this technology may provide an image capturing device that may perform adjustment to an appropriate white balance in a short time.
- In accordance with an aspect of the embodiments, a portable terminal device having a first color temperature sensor that is provided on a first side of the portable terminal device, and having a second color temperature sensor that is provided on a second side of the portable terminal device, the portable terminal device includes, at least one camera module configured to capture an image; and an image processor configured to perform image correction to the image that is captured by the camera module based on at least one of first information from the first color temperature sensor and second information from the second color temperature sensor.
- The object and advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the claims. It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention, as claimed.
- These and/or other aspects and advantages will become apparent and more readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawing of which:
- FIG. 1 illustrates a configuration example of a portable terminal device;
- FIGS. 2A to 2C illustrate an example of an external appearance of the portable terminal device;
- FIG. 3 illustrates a configuration example of the portable terminal device;
- FIG. 4 illustrates a configuration example of RGB sensors and an image processor;
- FIG. 5 is a flowchart that illustrates an operation example of the portable terminal device;
- FIG. 6 illustrates a use example of the portable terminal device;
- FIG. 7 is a flowchart that illustrates an operation example of the portable terminal device;
- FIGS. 8A and 8B illustrate examples of cases where a direction of light that is received by an object is different from a direction of light that is received by the RGB sensors;
- FIG. 9 illustrates an external appearance of the portable terminal device; and
- FIG. 10 illustrates an example of a case where the direction of light that is received by the object is different from the direction of light that is received by the RGB sensor.
- In a portable terminal device, an RGB sensor that is used to perform image correction may be provided only on a side on which a camera lens is provided, for example. In such a case, a direction of light that is received by the RGB sensor is different from a direction of light that is received by an object, and thus the image correction may not appropriately be performed. FIG. 10 is a diagram for illustrating an example of such a case.
- As illustrated in FIG. 10, a portable terminal device 10 includes a camera lens 12 and an RGB sensor (or color sensor) 14. The RGB sensor 14 is installed on a side on which the camera lens 12 is installed. In the example of FIG. 10, in a case where sunlight travels in A direction, the RGB sensor 14 directly receives the sunlight. On the one hand, an object 16 receives the sunlight from a back side, and the side of the object that is directed toward the portable terminal device 10 does not directly receive the sunlight. On the other hand, in a case where the sunlight travels in B direction, the RGB sensor 14 does not directly receive the sunlight, but the side of the object 16 that is directed toward the camera lens 12 directly receives the sunlight. In such a case, even if an image of the object is corrected based on the signal detected by the RGB sensor 14, appropriate correction may not be performed because light reception is different between the object and the RGB sensor 14. Such a phenomenon may occur in a case of a light source other than the sunlight, for example, a fluorescent light or the like.
- The above-described technology in which the first and second photometric sensors are respectively provided on the display side and the back side of the display side of the display section is to adjust a display on the display section in accordance with the brightness of the environment by using the first and second photometric sensors but is not a technology to capture an image. Therefore, this technology may not perform appropriate image correction in a case where the direction of light that is received by the object is different from the direction of light that is received by the RGB sensor when an image is captured.
- Further, the above-described technology in which the number of image capturing frames and the elapsed time from a prescribed time point are measured and the reliability of the stored white control signal is assessed is to assess the reliability of the white control signal. Thus, appropriate white balance may not be performed for an image captured with low reliability. Further, this technology makes no suggestion on how the image correction is performed in a case where the direction of light that is received by the object is different from the direction of light that is received by the RGB sensor when an image is captured.
- In addition, the above-described technology that controls the white balance in accordance with the average first color or the like of the atmosphere that is measured by the color sensor makes no suggestion on how the image correction is performed in a case where the direction of light that is received by the object is different from the direction of light that is received by the RGB sensor when an image is captured.
- In a case where appropriate image correction may not be performed because the direction of light that is received by the object is different from the direction of light that is received by the color sensor when an image is captured, appropriate image correction in accordance with a surrounding environment may not be performed either.
- Embodiments of this disclosure will hereinafter be described based on the above-described facts that are newly observed by the inventors.
- A first embodiment will be described.
FIG. 1 illustrates a configuration example of a portable terminal device 100 in the first embodiment. The portable terminal device 100 may include information processing devices such as feature phones, smart phones, tablet terminals, personal digital assistants (PDA), and portable game consoles, for example. - The
portable terminal device 100 includes a camera module 150, a first color temperature sensor 151, a second color temperature sensor 152, and an image processor 153. - The
camera module 150 captures an image. The camera module 150 outputs image data of the captured image to the image processor 153. - The
image processor 153 compares first information from the first color temperature sensor 151 that is provided on a first side 161 of the portable terminal device 100 with second information from the second color temperature sensor 152 that is provided on a second side 162 of the portable terminal device 100. The image processor 153 performs image correction to the image that is captured by the camera module 150 in accordance with a comparison result between the first information and the second information. Depending on the comparison result, the image processor 153 may also refrain from performing the image correction. - As described above, in the first embodiment, the first and second
color temperature sensors 151 and 152 are respectively provided on the first and second sides 161 and 162 of the portable terminal device 100. Accordingly, the portable terminal device 100 may obtain information that corresponds to a case where the direction of light that is received by the object is different from the direction of light that is received by the color sensor when an image is captured. Thus, the image processor 153 performs the image correction to the image that is captured by the camera module 150 based on each of the first information and the second information from the first and second color temperature sensors 151 and 152. - In this case, the
second side 162 is preferably provided on a back side with respect to the first side 161 of the portable terminal device 100. This is because the first and second color temperature sensors 151 and 152 that are respectively provided on such first and second sides 161 and 162 may receive light from mutually different directions, which corresponds to the case in which the direction of light received by the object differs from the direction of light received by the color sensor. - A second embodiment will next be described. An example of an external appearance of the portable terminal device and so forth will first be described, and a configuration example of the portable terminal device will next be described.
- <Example of External Appearance of Portable Terminal Device>
-
FIGS. 2A to 2C illustrate an example of an external appearance of the portable terminal device (which may hereinafter be referred to as terminal) 100 in the second embodiment.FIGS. 2A , 2B, and 2C illustrate the example of the external appearance in an oblique direction, from a front side, and from a back side, respectively. - The terminal 100 may be a feature phone, a smart phone, a tablet, a personal computer, or the like, for example. The example in
FIGS. 2A to 2C illustrates an example of a smart phone. The terminal 100 performs radio communication with a radio base station device and is provided with various services such as call services and video streaming services via the radio base station device. The terminal 100 has a camera function of capturing an image in addition to a call function. - As illustrated in
FIG. 2C, a camera lens 117-1 and an RGB sensor 118 are provided on the back side of the terminal 100. The RGB sensor 118 is provided on the same side as the side on which the camera lens 117-1 is installed, for example. The RGB sensor 118 is used to perform image correction to the image that is captured by the camera lens 117-1, for example. The RGB sensor 118 may hereinafter be referred to as camera module side RGB sensor 118, for example. - A camera module is provided in an internal portion of the terminal 100. The camera function in the terminal 100 may be realized by the camera module, and an image of the object or the like may be captured. The camera lens 117-1 and the camera module
side RGB sensor 118 are portions of the camera module, for example. The camera module will be described later in detail. - Further, as illustrated in
FIG. 2B and so forth, a liquid crystal display (LCD) 120 and an RGB sensor 115 are provided on a front side of the terminal 100. The RGB sensor 115 is provided on the same side as the side on which the LCD 120 is installed, for example. The RGB sensor 115 is used for adjustment of colors of a screen displayed on the LCD 120 and as a proximity sensor, for example. - In the second embodiment, the
RGB sensor 115 is used as an RGB sensor for the image correction. Details will be described later. TheRGB sensor 115 may hereinafter be referred to as LCDside RGB sensor 115. - <Configuration Example of
Portable Terminal Device 100> -
FIG. 3 illustrates a configuration example of the terminal 100. The terminal 100 includes a main processor 101, a communication control section 102, a radio frequency (RF) section 103, an antenna 104, a microphone 105, an input sound conversion section 106, a sound processing section 107, an output sound conversion section 108, and a speaker 109. Further, the terminal 100 includes an application control processing unit (ACPU) 110, a human centric engine (HCE) 111, an acceleration sensor 112, a gyro sensor 113, a geomagnetism sensor 114, and the LCD side RGB sensor 115. In addition, the terminal 100 includes an image processor 116, a camera module 117, the LCD 120, an input control section 121, and a memory 122. The camera module 117 includes the camera module side RGB sensor 118. - The
main processor 101 is connected with the communication control section 102, the input sound conversion section 106, the sound processing section 107, the output sound conversion section 108, the ACPU 110, the HCE 111, the image processor 116, the camera module 117, the LCD 120, the input control section 121, and the memory 122 via a bus 125. - The
main processor 101 controls each section of the terminal 100. For example, the main processor 101 performs control related to radio communication and control related to input and output of sounds. - The communication control section 102 performs various kinds of control related to radio communication. For example, the communication control section 102 applies a demodulation process or a decoding process to a baseband signal that is output from the
RF section 103, thereby extracting user data (sound data, image data, and so forth) or the like, and outputs the extracted data to the sound processing section 107, the image processor 116, or the like. Further, the communication control section 102 receives the user data or the like from the sound processing section 107, the image processor 116, or the like, applies a coding process, a modulation process, or the like to the user data or the like, and outputs the result to the RF section 103. - The
RF section 103 performs frequency conversion of a radio signal in a radio band that is received by the antenna 104 into a baseband signal in the baseband (down-conversion) or performs frequency conversion of a signal output from the communication control section 102 into the radio signal in the radio band (up-conversion). - The antenna 104 transmits the radio signal output from the
RF section 103 to the radio base station device or receives the radio signal transmitted from the radio base station device and outputs the signal to the RF section 103. - The input
sound conversion section 106 converts a sound input from the microphone 105 into digital sound data and outputs the converted sound data to the sound processing section 107, for example. - The
sound processing section 107 performs a process of canceling noise in the sound data output from the input sound conversion section 106 or the like, or performs a compression process on the sound data, and outputs the sound data that result from the sound processing to the communication control section 102 or the like, for example. Further, the sound processing section 107 performs a process of expanding the compressed sound data or canceling noise for the sound data output from the communication control section 102 or the like and outputs the sound data that result from the sound processing to the output sound conversion section 108 or the like, for example. - The output
sound conversion section 108 receives the sound data from the sound processing section 107, converts the digital sound data into analog sound data, and allows the sound to be output from the speaker 109, for example. - The
ACPU 110 is a CPU for applications and controls the image processor 116, the camera module 117, and so forth, for example. - The
HCE 111 functions as a sub-processor of the ACPU 110 and controls the acceleration sensor 112, the gyro sensor 113, the geomagnetism sensor 114, and the LCD side RGB sensor 115, for example. - The
acceleration sensor 112 detects an acceleration of the terminal 100 by a capacitance detection method, a piezoresistance method, or the like, for example. The gyro sensor 113 detects an angular velocity of the terminal 100 by applying a mechanical rotation method or the like to an internal object, for example. The geomagnetism sensor 114 detects a fluctuation of an external magnetic field and thereby detects the direction of the terminal 100, for example. - The
image processor 116 performs the image correction to the image data output from the camera module 117 or performs image processing such as compression of the image data and outputs the image data that result from the image processing to the communication control section 102, the memory 122, or the like, for example. The image processor 116 performs image processing such as an expansion process or noise canceling to the compressed image data output from the communication control section 102 or the like and outputs the image data that result from the image processing to the LCD 120, the memory 122, or the like. The image processor 116 will be described later in detail. - The
camera module 117 is a section that realizes the camera function in the terminal 100 and also includes the camera module side RGB sensor 118, the camera lens 117-1, an image capturing element, and so forth. The camera module 117 will be described later in detail. - The
LCD 120 is a liquid crystal display screen in the terminal 100 and may display, in the screen, operation keys or the like that enable an operation. For example, operating the operation keys that are displayed on the LCD 120 starts the camera module 117 (or a camera). - The
input control section 121 is a button for turning on or off a power supply of the terminal 100, a button for turning on or off a power supply of the LCD 120, or the like, for example. The input control section 121 outputs a signal that corresponds to a button operation to the main processor 101 or the like and may thereby turn on the power supply of the LCD 120 or the power supply of the terminal 100 by control of the main processor 101 or the like. - The
memory 122 stores the sound data, the image data, other data, and so forth. - As illustrated in
FIG. 3, the terminal 100 may have a hardware configuration that includes a first CPU 131, a second CPU 132, and a third CPU 133. In this case, the terminal 100 includes the first to third CPUs 131 to 133, the RF section 103, the antenna 104, the microphone 105, the speaker 109, the LCD 120, and the memory 122. - The
first CPU 131 corresponds to the main processor 101, the communication control section 102, the input sound conversion section 106, the sound processing section 107, the output sound conversion section 108, and the input control section 121, for example. - Further, the
second CPU 132 corresponds to the ACPU 110, the image processor 116, and the camera module 117. - In addition, the
third CPU 133 corresponds to the HCE 111, the acceleration sensor 112, the gyro sensor 113, the geomagnetism sensor 114, and the LCD side RGB sensor 115. - The first to
third CPUs 131 to 133 may be processors, controllers, or the like such as micro processing units (MPU) and field-programmable gate arrays (FPGA), for example. The first tothird CPUs 131 to 133 read out and execute programs stored in thememory 122 and may thereby realize the functions of themain processor 101, the communication control section 102, theRGB sensors image processor 116, and so forth. - <Configuration Example of
RGB Sensors Image Processor 116> - A configuration example of the
RGB sensors image processor 116 will next be described. -
FIG. 4 illustrates a configuration example of the RGB sensors 115 and 118 and the image processor 116. The LCD side RGB sensor 115 includes an RGB light receiving element 1151 and an analog-to-digital (A/D) converter 1152. - The RGB
light receiving element 1151 is a photodiode, for example, and converts received light into an electric signal by photoelectric conversion. In this case, the RGB light receiving element 1151 includes three light receiving elements of R, G, and B, thereby generating RGB signals (which may hereinafter be referred to as RGB data (or image data)) as electric signals, and outputs the RGB signals to the A/D converter 1152. The RGB light receiving element 1151 is formed as plural RGB light receiving elements 1151 in the LCD side RGB sensor 115, and the plural RGB light receiving elements 1151 configure the image capturing element. - The A/
D converter 1152 converts analog RGB data output from the RGB light receiving elements 1151 into digital RGB data and outputs the converted RGB data to the image processor 116. - Meanwhile, the
camera module 117 includes camera components 1171 and the camera module side RGB sensor 118. The camera components 1171 include the camera lens 117-1 illustrated in FIG. 2C, a diaphragm device, and so forth. - The camera module
side RGB sensor 118 includes an RGB light receiving element 1181 and an A/D converter 1182. - The RGB
light receiving element 1181 is also a photodiode, for example, converts received light into an electric signal by photoelectric conversion, and generates RGB data as electric signals. The RGB light receiving element 1181 is also formed as plural RGB light receiving elements 1181 in the camera module 117. - The A/
D converter 1182 converts analog RGB data output from the RGB light receiving elements 1181 into digital RGB data and outputs the converted RGB data to the image processor 116. - The
image processor 116 includes a memory 1161, a color temperature processing section 1162, and an image correction section 1163. - The
memory 1161 stores the RGB data output from the two RGB sensors 115 and 118 (or values of R, G, and B contained in the RGB data; the values of R, G, and B may hereinafter be referred to as RGB value). Further, the memory 1161 stores a standard value of the color temperature of the sunlight. The standard value will be described later in detail. In addition, the memory 1161 also stores color temperature data that correspond to the RGB data that are detected by the LCD side RGB sensor 115 when the camera is started. The color temperature data are data (or values) that result from conversion of the concerned RGB data stored in the memory 1161 by the color temperature processing section 1162. - The
memory 1161 may be provided in the image processor 116 or may correspond to the memory 122 provided outside of the image processor 116. - The color
temperature processing section 1162 converts the RGB data stored in the memory 1161 into the color temperature data (or color temperature values, which may hereinafter be referred to as color temperature data) and stores the converted color temperature data in the memory 1161. The color temperature data may be obtained by the following calculation, for example. - That is, the color
temperature processing section 1162 converts the RGB values in an RGB color coordinate system into XY chromaticity values in an XYZ color coordinate system and converts the converted XY chromaticity values into uv chromaticity values in an Luv color coordinate system. In addition, the color temperature processing section 1162 calculates an absolute temperature that corresponds to the uv chromaticity values as the color temperature. Such conversions and calculations between the color coordinate systems are performed by the color temperature processing section 1162 through basic arithmetic operations, through obtainment of values from a table that represents a correspondence relationship and is retained in the memory 1161, and so forth, for example.
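As a concrete illustration of this kind of conversion, the following Python sketch turns normalized RGB values into an approximate correlated color temperature. It is not the patent's implementation: it assumes sRGB-like primaries for the RGB-to-XYZ step and uses McCamy's xy-based approximation instead of the uv chromaticity and table lookup described above.

```python
def rgb_to_color_temperature(r, g, b):
    """Approximate correlated color temperature (kelvin) from normalized RGB.

    Illustrative sketch only: assumes sRGB-like primaries (CIE matrix) and
    McCamy's approximation, not the uv-chromaticity table lookup that the
    color temperature processing section 1162 is described as using.
    """
    # RGB -> XYZ (sRGB primaries, D65 white point)
    x_cap = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y_cap = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z_cap = 0.0193 * r + 0.1192 * g + 0.9505 * b

    total = x_cap + y_cap + z_cap
    if total == 0.0:
        return None  # no light received; color temperature is undefined

    # XYZ -> xy chromaticity
    x = x_cap / total
    y = y_cap / total

    # McCamy's approximation for the correlated color temperature
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n ** 3 + 3525.0 * n ** 2 + 6823.3 * n + 5520.33


# A bluish reading yields a higher color temperature than a reddish reading.
print(rgb_to_color_temperature(0.6, 0.7, 1.0))  # roughly 9900 K
print(rgb_to_color_temperature(1.0, 0.8, 0.6))  # roughly 4900 K
```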
- In the second embodiment, the color
temperature processing section 1162 performs the following process, for example. That is, the colortemperature processing section 1162 calculates the color temperature data that correspond to the RGB data generated by the LCDside RGB sensor 115. Further, the colortemperature processing section 1162 reads out the standard value of a color temperature related to the sunlight that is stored in thememory 1161 from thememory 1161. The colortemperature processing section 1162 then compares the color temperature that corresponds to the LCDside RGB sensor 115 with the standard value and outputs a comparison result to theimage correction section 1163. Details will be described in an operation example. - The
image correction section 1163 performs (or does not perform) the image correction to the image that is captured by the camera module 117 in accordance with the comparison result from the color temperature processing section 1162. In a case where the image correction is performed, the image correction section 1163 reads out the RGB data of the image captured by the camera module 117 from the memory 1161 and performs the image correction to the RGB data. As the image correction, automatic exposure (AE) correction or the like is performed by adjustment of the RGB values or the like, for example. Details of the image correction will also be described in the operation example. The image correction section 1163 outputs the image data, whether or not the image correction has been performed, to the LCD 120 or the memory 122. - The first
color temperature sensor 151 in the first embodiment corresponds to the LCDside RGB sensor 115, thememory 1161, and the colortemperature processing section 1162 in the second embodiment, for example. Further, the secondcolor temperature sensor 152 in the first embodiment corresponds to the camera moduleside RGB sensor 118, thememory 1161, and the colortemperature processing section 1162 in the second embodiment, for example. In addition, theimage processor 153 in the first embodiment corresponds to theimage processor 116 in the second embodiment, for example. - An operation example in the terminal 100 will next be described.
FIG. 5 is a flowchart that illustrates an operation example in the second embodiment. - When the terminal 100 starts the process (S10), the terminal 100 starts the camera, measures the color temperature by the LCD
side RGB sensor 115, and saves the color temperature in the image processor 116 (S11). - For example, the side of the
LCD 120 of the terminal 100 may be directed toward the sky when the camera is started.FIG. 6 illustrates such a state. In a case where the camera is started by operating the operation keys or the like that are displayed on theLCD 120, the side of theLCD 120 is directed toward the sky. - The LCD
side RGB sensor 115 measures the RGB values when the camera is started and then obtains the color temperature value that corresponds to the RGB values, and the brightness around the object of the image capturing may thereby be obtained. Theimage processor 116 calculates the color temperature in the colortemperature processing section 1162 with respect to the RGB values that are obtained by the LCDside RGB sensor 115 when the camera is started and saves the color temperature value in thememory 1161. - Returning to
FIG. 5 , the terminal 100 next determines whether or not the color temperature at the time when the camera is started is lower than a standard color temperature of the sunlight (S12). For example, the colortemperature processing section 1162 reads out the color temperature value at the time when the camera is started and the standard color temperature value of the sunlight from thememory 1161 and compares the two color temperature values, thereby making a determination. - The terminal 100 performs the AE correction (or white balance adjustment, which may hereinafter be referred to as “AE correction”) (S13) in a case where the color temperature at the time when the camera is started is lower than the standard color temperature of the sunlight (YES in S12). The case where the color temperature at the time when the camera is started is lower than the standard color temperature of the sunlight is a case where the environment at the time when the camera is started is darker than a case where an image is captured in the sunlight, for example. In a case where the environment is dark, the captured image becomes dark as a whole. In such a case, the terminal 100 corrects the RGB data of the image captured by the
camera module 117 and performs the AE correction such that the image becomes brighter as a whole. - It is assumed that an image of the object is captured at an arbitrary time point after the camera is started (S11) and before the AE correction is performed (S13), for example.
- Specifically, the following process is performed, for example. That is, when the color
temperature processing section 1162 of theimage processor 116 detects that the color temperature value at the time when the camera is started is lower than the standard color temperature value of the sunlight, the colortemperature processing section 1162 outputs the comparison result to theimage correction section 1163. When theimage correction section 1163 obtains the comparison result, theimage correction section 1163 obtains the RGB data of the image captured by thecamera module 117 from thememory 1161, multiplies all the RGB values by a prescribed ratio, and changes the RGB values at the obtainment to larger RGB values, thereby performing the AE correction. Alternatively, theimage correction section 1163 may perform the image correction by correcting values that are offset by a certain value or larger among the RGB values obtained from thememory 1161 to a reference value or lower. Accordingly, theimage correction section 1163 may correct a dark image to a bright image, for example. - The terminal 100 then saves the image data that result from the AE correction in the memory 1161 (S14) and finishes a series of processes (S15).
- On the other hand, in a case where the color temperature at the time when the camera is started is equivalent to or higher than the color temperature of the sunlight (NO in S12), the terminal 100 saves the image data with no particular image correction (S14) and finishes a series of processes (S15).
- In this case, the environment at the time when the camera is started is as bright as or brighter than an environment in the sunlight. In the second embodiment, in such a case, the process is finished without performing the AE correction.
- As described above, in the second embodiment, the terminal 100 detects the color temperature by using a state where the side of the
LCD 120 of the terminal 100 is directed toward the sky when the camera is started. Thus, the color temperature around the object may be detected with high accuracy. Further, the terminal 100 uses the color temperature for a determination of whether or not the image correction is performed and may thus perform appropriate image correction in accordance with the color temperature around the object, that is, in accordance with a surrounding environment. - A third embodiment will next be described. The third embodiment is an example where a determination is made whether or not the image correction is performed by using the two
RGB sensors side RGB sensor 115 and the camera moduleside RGB sensor 118. A configuration example of the terminal 100 in the third embodiment is similar to the second embodiment and is illustrated inFIGS. 2A to 4 , for example. -
FIG. 7 is a flowchart that illustrates an operation example in the third embodiment. When the terminal 100 starts the process (S30), the terminal 100 measures the color temperatures in the twoRGB sensors - For example, the measurement is performed as follows. That is, when the shutter in the
camera components 1171 is pressed, a signal that indicates that the shutter is pressed is output to the camera moduleside RGB sensor 118. Further, the signal is also output to theimage processor 116, and theimage processor 116 outputs the signal to the LCDside RGB sensor 115. The twoRGB sensors side RGB sensor 118 obtains the RGB data of the object side (for example, the first side), and the LCDside RGB sensor 115 obtains the RGB data of the opposite side to the object side (for example, the second side provided on the back side of the first side). The RGB data obtained by the twoRGB sensors memory 1161 of theimage processor 116. The colortemperature processing section 1162 obtains the RGB data obtained by the twoRGB sensors memory 1161 and calculates the color temperature data with respect to the two sets of RGB data. A calculation method of the color temperature data is similar to the second embodiment, for example. - The terminal 100 next determines whether or not the color temperature from the LCD
side RGB sensor 115 is lower than the color temperature of the camera module side RGB sensor 118 (S32). - In a case where the color temperature from the LCD
side RGB sensor 115 is lower than the color temperature of the camera module side RGB sensor 118 (YES in S32), the terminal 100 performs the image correction by the AE correction to make the captured image a brighter image (S33). -
FIG. 8A illustrates an example of a case where the direction of light that is received by the object is different from the direction of light that is received by the RGB sensors. In a case illustrated inFIG. 8A , the camera moduleside RGB sensor 118 directly receives the sunlight, but the LCDside RGB sensor 115 does not directly receive the sunlight. Further, the object is directed toward the camera with the sunlight on the back. In such a case, the color temperature from the LCDside RGB sensor 115 becomes lower than the color temperature from the camera moduleside RGB sensor 118. In such a case, when the terminal 100 captures an image of the object, the image becomes dark compared to a case where the object directly receives the sunlight. In such a case, the terminal 100 performs the AE correction to correct the dark image to a bright image. Backlight correction or the like may be used as an image correction method of making an image that is darkly captured brighter. - Specifically, the following process is performed, for example. That is, the color
temperature processing section 1162 compares the two color temperatures that are calculated in S31. In a case where the colortemperature processing section 1162 detects that the color temperature from the LCDside RGB sensor 115 is lower than the color temperature of the camera moduleside RGB sensor 118, the colortemperature processing section 1162 outputs the detection result (or the comparison result) to theimage correction section 1163. When theimage correction section 1163 obtains the comparison result, theimage correction section 1163 reads out the RGB data of the image captured by thecamera module 117 from thememory 1161 and applies the AE correction to the RGB data. A method of the AE correction is performed in a similar manner to the method described in the second embodiment, for example. Theimage correction section 1163 stores the RGB data that result from the image correction in thememory 1161. - Returning to
FIG. 7 , on the other hand, in a case where the color temperature from the LCDside RGB sensor 115 is equivalent to or higher than the color temperature from the camera module side RGB sensor 118 (NO in S32), the terminal 100 performs red-eye correction that corrects eye portions darker by using facial recognition in the image processor 116 (S35). -
FIG. 8B illustrates an example of a case where the direction of light that is received by the object is different from the direction of light that is received by the RGB sensors. In a case illustrated inFIG. 8B , the LCDside RGB sensor 115 directly receives the sunlight, but the camera moduleside RGB sensor 118 does not directly receive the sunlight. The object directly receives the sunlight and is directed toward the camera. In such a case, the color temperature from the LCDside RGB sensor 115 becomes equivalent to or higher than the color temperature from the camera moduleside RGB sensor 118. Accordingly, when the terminal 100 captures an image of the object, a bright image is captured because the object directly receives the sunlight. In this case, in a case where the object is an animal such as a human, the eyes of the object in an image may be captured redder than normal. In such a case, the terminal 100 performs the red-eye correction. - Specifically, the following process is performed, for example. That is, the color
temperature processing section 1162 compares the two color temperatures that are calculated in S31. In a case where the colortemperature processing section 1162 detects that the color temperature from the LCDside RGB sensor 115 is equivalent to or higher than the color temperature from the camera moduleside RGB sensor 118, the colortemperature processing section 1162 outputs the detection result (or the comparison result) to theimage correction section 1163. When theimage correction section 1163 obtains the comparison result, theimage correction section 1163 reads out the RGB data of the image captured by thecamera module 117 from thememory 1161 and obtains the RGB data of the eye portions of the object based on the RGB data, difference, and so forth of the image in a certain area. Theimage correction section 1163 then corrects data values that correspond to R among the RGB data related to the eye portions to lower values than the obtained values. Alternatively, theimage correction section 1163 corrects the data values that correspond to R among the obtained RGB data related to the eye portions (or pixel values related to red) to lower values than a reference value. Theimage correction section 1163 stores the corrected RGB data in thememory 1161. - Returning to
FIG. 7 , the terminal 100 performs the AE correction (S33) or the red-eye correction (S35) and then finishes a series of processes (S34). - As described above, the terminal 100 in the third embodiment compares the color temperatures from the two
RGB sensors RGB sensors RGB sensor 118 when an image is captured. Thus, the terminal 100 may perform appropriate image correction even in a case where the direction of light that is received by the object is different from the direction of light that is received by theRGB sensor 118 when an image is captured, and appropriate image correction in accordance with a surrounding environment may thus be performed. - In the third embodiment, an example of the image correction is described with an example of the red-eye correction (S35). Such image correction is only one example, and the image correction may be performed to a specified portion of an image that is caused by direct reception of the sunlight. In such a case, for example, the
image correction section 1163 determines that the image has a noise in a case where differences greater than a reference value are present between the RGB values of the pixels and the RGB values of adjacent pixels and may correct the RGB values of the concerned pixels to the reference value or lower while defining the concerned pixels as the specified portion. - Further, a description is made about an example where the object is an animal such as a human. However, the object may be a landscape or the like. In such a case also, a noise in the image that is caused by reception of the sunlight may be corrected.
- In addition, the AE correction (S33) may be performed by a method that reduces values that are offset by a certain value or larger among the RGB values to a reference value or lower.
- In the second and third embodiments, a description is made about an example where the side on which one sensor of the LCD
side RGB sensor 115 and the camera moduleside RGB sensor 118 is provided is defined as the first side and the other sensor is installed on the second side that is provided on the back side with respect to the first side. For example, the twosensors -
FIG. 9 illustrates an example of a case where the LCDside RGB sensor 115 is installed on an upper side of the terminal 100. The LCDside RGB sensor 115 is installed on the upper side, thereby enabling appropriate calculation of the color temperature around the object based on the sunlight or the like. Thus, the terminal 100 may perform appropriate image correction in accordance with a surrounding environment even in a case where the direction of light that is received by the object is different from the direction of light that is received by theRGB sensor 118 when an image is captured. - An upper side (yz plane) with respect to the side (xy plane) on which the
LCD 120 is provided in the terminal 100 is defined as “upper side”, for example. - The portable terminal device that is described in the second and third embodiments may include information processing devices such as tablet terminals, personal digital assistants (PDA), and portable game consoles, for example. Such an information processing device may carry out the image correction that is described in the second and third embodiments in a similar manner to the terminal 100.
- All examples and conditional language recited herein are intended for pedagogical purposes to aid the reader in understanding the invention and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions, nor does the organization of such examples in the specification relate to a showing of the superiority and inferiority of the invention. Although the embodiments of the present invention have been described in detail, it should be understood that the various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the invention.
Claims (15)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014051726A JP2015177308A (en) | 2014-03-14 | 2014-03-14 | Portable terminal device, image correction method, and image correction program |
JP2014-051726 | 2014-03-14 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20150264329A1 true US20150264329A1 (en) | 2015-09-17 |
US9549160B2 US9549160B2 (en) | 2017-01-17 |
Family
ID=54070420
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/592,795 Expired - Fee Related US9549160B2 (en) | 2014-03-14 | 2015-01-08 | Portable terminal device and image correction method |
Country Status (2)
Country | Link |
---|---|
US (1) | US9549160B2 (en) |
JP (1) | JP2015177308A (en) |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003102018A (en) | 2001-09-21 | 2003-04-04 | Canon Inc | Electronic still camera, imaging method, computer- readable storage medium, and computer program |
JP4260568B2 (en) * | 2003-07-31 | 2009-04-30 | 京セラ株式会社 | Camera apparatus and image processing control method |
US20080231726A1 (en) * | 2007-03-23 | 2008-09-25 | Motorola, Inc. | Apparatus and method for image color correction in a portable device |
JP2010056883A (en) * | 2008-08-28 | 2010-03-11 | Nikon Corp | Optical device and photographing device |
JP5794804B2 (en) * | 2011-03-29 | 2015-10-14 | 京セラ株式会社 | Electronics |
JP5681589B2 (en) * | 2011-08-18 | 2015-03-11 | 富士フイルム株式会社 | Imaging apparatus and image processing method |
- 2014-03-14 JP JP2014051726A patent/JP2015177308A/en active Pending
- 2015-01-08 US US14/592,795 patent/US9549160B2/en not_active Expired - Fee Related
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5583397A (en) * | 1993-10-20 | 1996-12-10 | Asahi Kogaku Kogyo Kabushiki Kaisha | Strobe apparatus with color temperature control |
US20030169348A1 (en) * | 2002-03-06 | 2003-09-11 | Eiichiro Ikeda | White balance adjustment method, image sensing apparatus, program, and storage medium |
US20050264701A1 (en) * | 2002-07-04 | 2005-12-01 | Huh Young-Sik | Method and system for color temperature conversion of compressed video image |
JP2008219128A (en) * | 2007-02-28 | 2008-09-18 | Fujifilm Corp | Photographing device |
US20110150451A1 (en) * | 2009-12-21 | 2011-06-23 | Canon Kabushiki Kaisha | Image pickup apparatus and controlling method therefor |
JP2011203437A (en) * | 2010-03-25 | 2011-10-13 | Nikon Corp | Image display device and program |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160330368A1 (en) * | 2015-05-06 | 2016-11-10 | Xiaomi Inc. | Method and Device for Setting Shooting Parameter |
US10079971B2 (en) * | 2015-05-06 | 2018-09-18 | Xiaomi Inc. | Method and device for setting shooting parameter |
US9762878B2 (en) * | 2015-10-16 | 2017-09-12 | Google Inc. | Auto white balance using infrared and/or ultraviolet signals |
US20190068938A1 (en) * | 2017-08-23 | 2019-02-28 | Motorola Mobility Llc | Using a light color sensor to improve a representation of colors in captured image data |
US10567721B2 (en) * | 2017-08-23 | 2020-02-18 | Motorola Mobility Llc | Using a light color sensor to improve a representation of colors in captured image data |
CN107959810A (en) * | 2017-12-28 | 2018-04-24 | 上海传英信息技术有限公司 | A kind of data burning method and data recording system for CCD camera assembly |
CN111918047A (en) * | 2020-07-27 | 2020-11-10 | Oppo广东移动通信有限公司 | Photographing control method and device, storage medium and electronic equipment |
CN114584752A (en) * | 2020-11-30 | 2022-06-03 | 华为技术有限公司 | Image color restoration method and related equipment |
Also Published As
Publication number | Publication date |
---|---|
US9549160B2 (en) | 2017-01-17 |
JP2015177308A (en) | 2015-10-05 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: FUJITSU LIMITED, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TAKANASHI, TAKAKO;REEL/FRAME:034668/0730 Effective date: 20141211 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20210117 |