WO2006103881A1 - Imaging Apparatus - Google Patents
Imaging Apparatus
- Publication number
- WO2006103881A1 (PCT/JP2006/304302)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- inflection point
- subject
- signal
- output signal
- unit
- Prior art date
Classifications
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/635—Region indicators; Field of view indicators
- H04N23/741—Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/573—Control of the dynamic range involving a non-linear response, the logarithmic type
- H04N23/843—Demosaicing, e.g. interpolating colour pixel values
Definitions
- the present invention relates to an imaging apparatus, and more particularly to an imaging apparatus having an imaging element capable of switching between a logarithmic conversion operation and a linear conversion operation.
- an imaging device that photoelectrically converts incident light into an electrical signal is provided in an imaging device such as a digital camera or a camera unit incorporated in a vehicle-mounted camera.
- in Patent Document 1 and Patent Document 2, an image sensor (linear-log sensor) has been proposed that can switch between a linear conversion operation and a logarithmic conversion operation on the electric signal according to the amount of incident light.
- the dynamic range of such a sensor is wider than that of an image sensor (linear sensor) that performs only a linear conversion operation.
- the linear-log sensor is preferably given a photoelectric conversion characteristic that exploits the advantages of the linear conversion operation or the logarithmic conversion operation for the main subject.
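The linear-log photoelectric conversion characteristic described above can be sketched as a simple numerical model. This is an illustrative approximation, not the patent's actual circuit behavior; the function name, the inflection point `L_th`, and the gain `k` are hypothetical:

```python
import math

def linlog_response(L, L_th=100.0, k=0.01):
    """Hypothetical model of a linear-log sensor's photoelectric response.

    Below the inflection point L_th the output grows linearly with the
    incident light amount L; above it, the output grows logarithmically.
    The two branches are matched at L_th so the curve is continuous.
    """
    if L < L_th:
        return k * L                              # linear region
    # logarithmic region, continuous (and smooth) at the inflection point
    return k * L_th + k * L_th * math.log(L / L_th)
```

Lowering `L_th` widens the logarithmic (wide dynamic range) region; raising it widens the linear (high contrast) region, which is the trade-off the inflection point changing unit exploits.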
- Patent Document 1 Japanese Patent Laid-Open No. 2002-223392
- Patent Document 2 Japanese Unexamined Patent Application Publication No. 2004-088312
- the main subject depends on the user's preference, and may exist near the center of the shooting screen or only near the edge of the screen.
- when the imaging device uses a system that automatically determines the main subject, a subject different from the one intended by the user may be recognized as the main subject, and shooting is performed under conditions suited to that subject. In this case, even if a linear-log sensor is used, the captured image intended by the user may not be obtained.
- An object of the present invention is to provide an imaging apparatus, having an imaging element capable of switching between a logarithmic conversion operation and a linear conversion operation, that can obtain the captured image intended by the user.
- the invention described in claim 1 is an imaging apparatus comprising: an image sensor having a plurality of pixels each of which can switch, according to the amount of incident light, between a linear conversion operation for linearly converting incident light into an electric signal and a logarithmic conversion operation for logarithmically converting it; a display unit that displays an image obtained by the image sensor; an operation unit for designating an arbitrary region of the image displayed on the display unit; and an inflection point changing unit that evaluates the output signal of the image sensor in the region designated by the operation unit and, based on the evaluation result, changes the inflection point that forms the boundary between the linear region and the logarithmic region in the output signal of the image sensor.
- the user can designate a desired subject range on the operation unit after confirming the preview screen of the subject displayed on the display unit.
- since the inflection point is changed based on the evaluation result of the image data of the subject designated by the user, the photoelectric conversion characteristics of the image sensor can be changed according to the user's needs.
- the invention described in claim 2 is the imaging apparatus according to claim 1, wherein the inflection point changing unit changes the inflection point so that at least the output signal of the designated region is logarithmically converted.
- since the image sensor performs a logarithmic conversion operation when the subject designated by the user is imaged, an image that takes advantage of the logarithmic conversion operation can be obtained for that subject. In other words, because the dynamic range is wide, all luminance information can be expressed as an electric signal even when a subject with a wide luminance range is photographed.
- the invention described in claim 3 is the imaging apparatus according to claim 2, wherein the inflection point changing unit changes the inflection point when the output signal of the designated region is equal to or greater than a predetermined value.
- when the output signal of the subject image designated by the user becomes equal to or greater than the predetermined value, the designated subject image is prone to overexposure due to saturation of the output signal, particularly when the image sensor performs a linear conversion operation. By causing the image sensor to perform a logarithmic conversion operation instead, the dynamic range is secured and overexposure of the subject designated by the user is prevented.
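The claim-3 behavior, lowering the inflection point when the designated region's output reaches a saturation level, can be sketched as follows. All names, thresholds, and the adjustment step here are hypothetical illustrations, not values taken from the patent:

```python
def choose_inflection(region_signal, current_th, sat_level=0.9, step=0.8):
    """Hypothetical sketch of the inflection-point changing unit's decision.

    region_signal: iterable of normalized pixel outputs (0.0-1.0) from the
    window the user designated on the display unit.
    If the region's peak output reaches the saturation level, the inflection
    point is lowered so those pixels fall in the logarithmic region and
    overexposure (signal saturation) is avoided.
    """
    peak = max(region_signal)
    if peak >= sat_level:
        return current_th * step   # lower inflection point: log-convert region
    return current_th              # leave the characteristics unchanged
```

For example, a bright designated region triggers a lowered inflection point, while a dim one leaves the photoelectric conversion characteristic as it was.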
- the invention described in claim 4 is the imaging apparatus according to claim 1, wherein the inflection point changing unit changes the inflection point so that at least the output signal of the designated region is linearly converted.
- since the image sensor performs a linear conversion operation when the subject designated by the user is imaged, a captured image that takes advantage of the linear conversion operation can be obtained for that subject. That is, sufficient data is obtained within a predetermined luminance range, so sufficient contrast of the subject is obtained.
- the invention according to claim 5 is the imaging apparatus according to claim 4, wherein the inflection point changing unit changes the inflection point so that a linear conversion output is obtained even where the output signal of the designated region would fall in the logarithmic conversion region.
- the invention described in claim 6 is the imaging apparatus according to any one of claims 1 to 5, wherein a window for designating an arbitrary area is displayed on the display unit.
- the range of the subject designated by the user can be easily visually confirmed by the window displayed on the display unit.
- the invention described in claim 7 is the imaging apparatus according to any one of claims 1 to 6, wherein the operation unit can move the window displayed on the display unit and change the range designated by the window.
- by operating the operation unit to move the window on the display unit or to change the window's designated range, the user can easily designate the desired subject.
- the invention described in claim 8 is the imaging apparatus according to any one of claims 1 to 7, wherein the inflection point changing unit changes the inflection point by changing a voltage value set in the pixels of the imaging element.
- since the inflection point is changed based on the evaluation result of the image data of the subject in the region designated by the user, the photoelectric conversion characteristics of the image sensor can be changed according to the user's needs, and a photographed image intended by the user can be obtained.
- by causing the image sensor to perform a logarithmic conversion operation when the subject designated by the user is imaged, the dynamic range is secured and overexposure of that subject is prevented.
- by causing the image sensor to perform a linear conversion operation when capturing an image of the subject designated by the user, sufficient data is obtained within a predetermined luminance range, so sufficient contrast of the subject is obtained.
- the user can move the window on the display unit or change the window's designated range by operating the operation unit, so the desired subject can be easily designated.
- FIG. 1 is a front view showing a configuration of an imaging apparatus according to an embodiment of the present invention.
- FIG. 2 is a rear view showing the configuration of the imaging apparatus according to the embodiment of the present invention.
- FIG. 3 is a block diagram showing a functional configuration of the imaging apparatus according to the embodiment of the present invention.
- FIG. 4 is a block diagram showing a configuration of an image sensor according to an embodiment of the present invention.
- FIG. 5 is a circuit diagram showing a configuration of a pixel included in the image sensor according to the embodiment of the present invention.
- FIG. 6 is a time chart showing the operation of the pixels provided in the image sensor according to the embodiment of the present invention.
- FIG. 7 is a graph showing an output with respect to an incident light amount of the image sensor according to the embodiment of the present invention.
- FIG. 8 is an example of a display screen in the display unit according to the embodiment of the present invention.
- FIG. 9 is another example of the display screen in the display unit according to the embodiment of the present invention.
- FIG. 10 is a graph showing changes in inflection points of the image sensor according to the embodiment of the present invention.
- FIG. 11 is a graph showing changes in inflection points of the image sensor according to the embodiment of the present invention.
- FIG. 12 is a flowchart showing an imaging method according to an embodiment of the present invention.
- FIG. 13 is a diagram showing a display screen before the inflection point is changed in the display unit according to the embodiment of the present invention.
- FIG. 14 is a diagram showing a display screen after an inflection point is changed in the display unit according to the embodiment of the present invention.
- the imaging apparatus 1 is a compact digital camera.
- the imaging apparatus according to the present invention also encompasses electronic devices having a photographing function, such as a single-lens reflex digital camera, a camera-equipped mobile phone, and an in-vehicle camera, as well as camera units incorporated into such electronic devices.
- a lens unit 3 that collects the image light of a subject at a predetermined focal point is provided so that its optical axis is orthogonal to the front surface of the housing 2.
- An imaging element 4 that photoelectrically converts the reflected light of the subject incident through the lens unit 3 into an electrical signal is provided inside the housing 2 and behind the lens unit 3.
- an irradiating unit 5 that irradiates light at the time of photographing is provided near the upper end of the front surface of the housing 2.
- the irradiation unit 5 of the present embodiment is configured by a strobe device built in the imaging apparatus 1, it may be configured by an external strobe or a high brightness LED.
- a light control sensor 6 is provided on the front surface of the housing 2, near the upper part of the lens unit 3. The light control sensor 6 receives the light emitted from the irradiation unit 5 and reflected by the subject.
- a circuit board (not shown) including circuits such as a system control unit 7 and a signal processing unit 8 (see FIG. 3) is provided inside the housing 2. In addition, a battery 9 is built into the housing 2, and a recording unit 10 such as a memory card is loaded.
- a monitor 11 for image display is provided on the back surface of the housing 2.
- the monitor 11 is composed of an LCD (Liquid Crystal Display), a CRT (Cathode Ray Tube), etc., and can display a preview screen of a subject and a photographed image.
- a zoom button W12 (wide angle) and a zoom button T13 (telephoto) are provided.
- an optical viewfinder 14 for confirming the subject from the rear side of the housing 2 is disposed on the back surface of the imaging device 1, above the position where the lens unit 3 is provided.
- a selection cross key 15 is provided, with a cross key for moving the cursor or window displayed on the screen of the monitor 11 and for changing the designated range of the window.
- the center key of the selection cross key 15 serves as a confirmation key for confirming the contents designated by the cursor or window.
- a release switch 16 for performing shutter release is provided between the battery 9 and the lens unit 3 on the upper surface of the imaging device 1.
- the release switch 16 can be operated in a “half-pressed state” that is pushed in halfway and in a “full-pressed state” that is pushed in further.
- the power of the imaging device 1 is turned on (started) by pressing the power switch 17.
- a USB terminal 18 for connecting a USB cable for connecting the imaging device 1 to a personal computer or the like is provided near the upper end of one side surface of the housing 2.
- FIG. 3 shows a functional configuration of the imaging apparatus 1.
- the imaging device 1 includes the system control unit 7 on the circuit board inside the housing 2.
- the system control unit 7 includes a CPU (Central Processing Unit), a RAM (Random Access Memory) composed of rewritable semiconductor memory, and a ROM (Read Only Memory) composed of nonvolatile semiconductor memory.
- each component of the imaging device 1 is connected to the system control unit 7, and the system control unit 7 expands the processing program recorded in the ROM into the RAM and executes the processing program by the CPU. By doing so, these components are driven and controlled.
- the lens unit 3, an aperture shutter control unit 19, the image sensor 4, the signal processing unit 8, a timing generation unit 20, the recording unit 10, the irradiation unit 5, the light control sensor 6, the monitor 11, an operation unit 21, and an inflection point changing unit 22 are connected to the system control unit 7.
- the lens unit 3 is configured with a plurality of lenses that form the subject light image on the imaging surface of the image sensor 4, and an aperture shutter unit that adjusts the amount of light collected by the lenses.
- the aperture shutter control unit 19 drives and controls the aperture shutter unit in the lens unit 3. That is, the aperture shutter control unit 19 sets the aperture to a predetermined aperture value based on the control value input from the system control unit 7, opens the shutter immediately before the imaging operation of the image sensor 4 is started, closes the shutter after the prescribed exposure time has elapsed, and blocks the light incident on the image sensor 4 when not imaging.
- the image sensor 4 photoelectrically converts the incident light of each color component R, G, and B of the subject light image into an electric signal and captures it.
- the image sensor 4 includes a plurality of pixels G11 to Gmn arranged in a matrix, where n and m are integers of 1 or more.
- each of the pixels G11 to Gmn photoelectrically converts incident light and outputs an electric signal, and can switch the conversion operation according to the amount of incident light.
- specifically, each pixel switches between a linear conversion operation for linearly converting incident light into an electric signal and a logarithmic conversion operation for logarithmically converting it.
- here, linearly or logarithmically converting incident light into an electric signal means converting the time integral value of the light amount into an electric signal that changes linearly, or logarithmically, with respect to it.
- a filter (not shown) of one color of red (Red), green (Green), or blue (Blue) is arranged on the lens unit 3 side of each of the pixels G11 to Gmn.
- the pixels G11 to Gmn are connected to a power supply line 23 and to signal application lines, and are also connected to a clock line, a bias supply line, and other lines.
- a vertical scanning circuit 24 is connected to the signal application lines. Based on the signal from the timing generator 20 (see FIG. 3), the vertical scanning circuit 24 sequentially switches the signal application lines in the X direction.
- the electric signals generated by the pixels G11 to Gmn are led out through signal readout lines to selection circuits S1 to Sm.
- the selection circuits S1 to Sm sample and hold the noise signal and the electric signal at the time of imaging obtained from the pixels through the signal readout lines.
- a horizontal scanning circuit 25 and a correction circuit 26 are connected to the selection circuits S1 to Sm. The horizontal scanning circuit 25 sequentially switches, in the Y direction, the selection circuit that samples and holds an electric signal and transmits it to the correction circuit 26.
- the correction circuit 26 removes the noise signal from the electric signal at the time of imaging, based on the noise signal transmitted from the selection circuits S1 to Sm. One correction circuit 26 may be provided for each of the selection circuits S1 to Sm.
- each of the pixels G11 to Gmn includes a photodiode P and transistors T1 to T6.
- the transistors T1 to T6 are N-channel MOS transistors.
- a signal φ is input to the gate of the transistor T1, and the source of T1 is connected to the gate and drain of the transistor T2.
- a signal application line L (corresponding to the signal application lines in Fig. 4) is connected to the source of the transistor T2 and supplies a signal φ to it.
- this signal φ is a binary voltage signal; when the light intensity exceeds a predetermined incident light intensity th, the transistor T2 operates in the subthreshold region.
- the gate of the transistor T3 is connected to the source of the transistor T2, and a DC voltage V is applied to the drain of T3.
- the source of the transistor T3 is connected to one end of a capacitor C and to the drain of the transistor T4.
- a signal application line L (corresponding to the signal application lines in FIG. 4) is connected to the other end of the capacitor C and supplies a signal φ to it.
- this signal φ is a ternary voltage signal: it takes a voltage value Vh during the integrating operation of the capacitor C, a voltage value Vm when the photoelectrically converted electric signal is read out, and a voltage value VL when the noise signal is read out.
- a DC voltage V is input to the source of the transistor T4, and a signal φ is input to its gate.
- the DC voltage V is applied to the drain of the transistor T5, and a signal readout line L (corresponding to the signal readout lines in FIG. 4) is connected to the source of the transistor T6, to whose gate a signal φ is input.
- each of the pixels G11 to Gmn performs the following reset operation, driven by the vertical scanning circuit 24.
- starting from the state in which the signal φ is Low, the signal φ is Hi, and the signal φ is at the voltage value VL, the vertical scanning circuit 24 applies a pulse signal φ and a pulse signal φ with the voltage value Vm to the pixels G11 to Gmn to read out the electric signal.
- next, the vertical scanning circuit 24 sets the signal φ to Low and sets the signal φ to VL, so that the potential of the transistor T2 is reset; the signal φ is then set to Hi, the transistor is turned OFF, and the capacitor C performs integration.
- the voltage at the connection node between the capacitor C and the gate of the transistor T3 is reset through the reset transistor T4: the vertical scanning circuit 24 applies a pulse signal φ to the gate of the transistor T4, turning it ON, and a pulse signal φ with the voltage value VL is applied to the capacitor C.
- the transistor T5 is a source-follower MOS transistor; the vertical scanning circuit 24 supplies a pulse signal φ to the gate of the transistor T6 to read out the signal.
- each of the pixels G11 to Gmn performs the following imaging operation.
- when the incident light is weak, the transistor T2 is in the cutoff state, so a voltage corresponding to the amount of photocharge accumulated at its gate appears there; the pixel is designed so that a voltage obtained by linearly converting the incident light appears at the gate.
- when the incident light exceeds the predetermined intensity, the transistor T2 operates in the subthreshold region, and a voltage obtained by logarithmically converting the incident light (in natural logarithm) appears at the gate.
- the predetermined intensity is not necessarily equal among the pixels G11 to Gmn.
- the current flowing through the drain of the transistor is amplified; for that reason, a voltage obtained by linearly or logarithmically converting the incident light on the photodiode P appears at the gate of the transistor.
- the vertical scanning circuit 24 sets the voltage value of the signal φ to Vm and sets the signal φ to Low, so that the electric signal at the time of imaging appears as a voltage signal.
- the signal value of the electric signal output through the transistors T4 and T6 is proportional to the gate voltage; that is, it is a value obtained by linear or logarithmic conversion of the incident light on the photodiode P. The vertical scanning circuit 24 then sets the voltage value of the signal φ to Vh and sets the signal φ to Hi.
- the transistor then operates in the cutoff state due to the large potential difference.
- the output signal of the imaging element 4 according to the present embodiment changes continuously from the linear region to the logarithmic region according to the amount of incident light.
- when the luminance range of the subject is narrow, the voltage value VL is lowered to widen the luminance range that is linearly converted; when the subject's luminance range is wide, the voltage value VL is raised to widen the luminance range that is logarithmically converted. Photoelectric conversion characteristics that match the characteristics of the subject can thus be achieved.
- when the voltage value VL is minimized, the sensor can always operate in the linear conversion state, and when the voltage value VH is maximized, it can always operate in the logarithmic conversion state.
- the dynamic range can therefore be switched by having the system control unit switch the voltage value VL of the signal φ.
- the imaging element 4 may be an image sensor having pixels with a configuration different from that shown in FIG. 5, as long as the linear conversion operation and the logarithmic conversion operation are automatically switched at each pixel.
- a linear or logarithmic output is obtained by changing the voltage value VL of the signal φ at the time of imaging; the inflection point between the linear conversion operation and the logarithmic conversion operation may also be changed by changing the VPS voltage value VH, or by changing the reset time.
- although each pixel of the imaging element 4 of the present embodiment is provided with an RGB filter, other color filters such as cyan (Cyan), magenta (Magenta), and yellow (Yellow) can be used.
- the signal processing unit 8 includes an amplifier 27, an A/D converter 28, a black reference correction unit 29, an AE evaluation value calculation unit 30, a WB processing unit 31, a color interpolation unit 32, a color correction unit 33, a gradation conversion unit 34, and a color space conversion unit 35.
- the amplifier 27 amplifies the electrical signal output from the image sensor 4 to a predetermined specified level to compensate for a lack of level in the captured image.
- the A/D converter 28 converts the electric signal amplified by the amplifier 27 from an analog signal to a digital signal.
- the black reference correction unit 29 corrects the black level, the lowest luminance value, to a reference value. Since the black level differs depending on the dynamic range of the image sensor 4, black reference correction is performed by subtracting the signal level corresponding to black from the signal level of each RGB signal output from the A/D converter 28.
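A minimal sketch of this black reference correction, assuming the signals are plain integer sample lists and the black level has already been determined:

```python
def black_reference_correction(signal, black_level):
    """Minimal sketch: subtract the black-level signal from each RGB sample
    (the black level depends on the sensor's dynamic range setting) and
    clip negative results to zero."""
    return [max(s - black_level, 0) for s in signal]
```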
- the AE evaluation value calculation unit 30 detects an evaluation value necessary for AE (automatic exposure) from the electric signal after black reference correction. That is, by checking the luminance values of the electric signal composed of the RGB primary color components, it calculates the average value and distribution range representing the luminance range of the subject, and outputs them to the system control unit 7 as AE evaluation values for setting the incident light quantity.
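An illustrative sketch of such an AE evaluation value, assuming Rec.601-style luminance weights (the patent does not specify how luminance is computed):

```python
def ae_evaluation(pixels):
    """Hypothetical sketch of the AE evaluation value: the average luminance
    of the RGB samples, which the system control unit compares against a
    target to set the incident light quantity (exposure)."""
    luma = [0.299 * r + 0.587 * g + 0.114 * b for r, g, b in pixels]
    return sum(luma) / len(luma)
```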
- the WB processing unit 31 calculates a correction coefficient from the electric signal after black reference correction and adjusts the level ratios (R/G, B/G) of the color components of the captured image so that white is reproduced correctly.
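The level-ratio adjustment can be sketched as follows, assuming a gray-world style estimate of the correction coefficients (the patent does not specify the estimation method, and these function names are hypothetical):

```python
def white_balance_gains(avg_r, avg_g, avg_b):
    """Sketch of white-balance correction: compute gains that equalize the
    R/G and B/G level ratios so that white is reproduced as white."""
    return avg_g / avg_r, 1.0, avg_g / avg_b   # gains for R, G, B

def apply_wb(pixel, gains):
    """Apply the per-channel gains to one (R, G, B) pixel."""
    return tuple(c * k for c, k in zip(pixel, gains))
```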
- the color interpolation unit 32 obtains R, G, and B color component values for each pixel when the signals obtained in the pixels of the image sensor 4 are only one or two of the primary colors. Color interpolation processing is performed to interpolate missing color components for each pixel.
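A toy illustration of filling in the missing color components, reduced to a single 2x2 RGGB tile; real demosaicing interpolates across neighboring tiles, so this only shows the idea:

```python
def demosaic_block(bayer2x2):
    """Toy color interpolation for one 2x2 RGGB Bayer tile: the output pixel
    takes R from the red site, B from the blue site, and G as the mean of
    the two green sites. This merely illustrates filling in the color
    components that each pixel's single-color filter does not capture."""
    r = bayer2x2[0][0]
    g = (bayer2x2[0][1] + bayer2x2[1][0]) / 2
    b = bayer2x2[1][1]
    return (r, g, b)
```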
- the color correction unit 33 corrects the color component values of each pixel of the image data input from the color interpolation unit 32, generating an image in which the color tone of each pixel is emphasized.
- the gradation conversion unit 34 performs gamma correction processing, which corrects the tone response characteristics of the image to an optimal curve according to the gamma value of the imaging device 1, so that the image is faithfully reproduced from input to final output.
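The gamma correction process can be sketched as a power-curve mapping; the gamma value 2.2 is a typical display value assumed here, not one stated in the patent:

```python
def gamma_correct(value, gamma=2.2, max_level=255):
    """Sketch of gamma correction: map a linear signal (0..max_level)
    through a power curve so the tone response matches the output device."""
    normalized = value / max_level
    return round((normalized ** (1.0 / gamma)) * max_level)
```

Note that mid-tones are brightened (the curve lies above the identity line), which is the usual effect of display-oriented gamma correction.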
- the color space conversion unit 35 converts the RGB color space into YUV.
- YUV is a color space that expresses colors with a luminance (Y) signal and two chromaticities, the blue color difference (U, Cb) and the red color difference (V, Cr); converting the color space to YUV makes the color difference signals easier to handle and compress.
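A sketch of the RGB-to-YUV (YCbCr) conversion, using the common BT.601 coefficients, which the patent does not specify:

```python
def rgb_to_ycbcr(r, g, b):
    """Sketch of the color space conversion to YUV (YCbCr): a luminance
    signal Y plus blue (Cb) and red (Cr) color-difference signals."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 0.564 * (b - y)   # blue color difference (U, Cb)
    cr = 0.713 * (r - y)   # red color difference (V, Cr)
    return y, cb, cr
```

For a neutral (white or gray) pixel both color-difference signals are zero, which is what makes them cheap to subsample or compress.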
- the timing generation unit 20 controls a photographing operation (charge accumulation based on exposure, reading of accumulated charge, etc.) by the image sensor 4. That is, a predetermined timing pulse (pixel drive signal, horizontal synchronization signal, vertical synchronization signal, horizontal scanning circuit drive signal, vertical scanning circuit drive signal, etc.) is generated based on the imaging control signal from the system control unit 7. And output to the image sensor 4.
- the timing generation unit 20 also generates an A/D conversion clock used in the A/D converter 28.
- the recording unit 10 is a recording memory such as a semiconductor memory, and has an image data recording area for recording the image data input from the signal processing unit 8.
- the recording unit 10 may be, for example, a built-in memory such as a flash memory, a removable memory card or memory stick, or a magnetic recording medium such as a hard disk.
- the strobe serving as the irradiation unit 5 irradiates the subject with stroboscopic light at a predetermined irradiation timing and dose, under the control of the system control unit 7, when the brightness of the surrounding environment detected at the time of photographing the subject is insufficient.
- the light control sensor 6 detects the amount of light emitted from the irradiation unit 5 and reflected by the subject, and outputs the detection result to the system control unit 7 in order to control the irradiation amount of the irradiation unit 5.
- the monitor 11 functions as a display unit, displays a preview image of a subject, and displays a captured image that has been subjected to image processing by the signal processing unit 8 under the control of the system control unit 7.
- a text screen such as a menu screen for the user to select functions is also displayed. That is, the monitor 11 displays a shooting mode selection screen for selecting the still image shooting mode or the moving image shooting mode, and a flash mode selection screen for selecting the auto mode, off mode, or on mode.
- the operation unit 21 includes a zoom button W12, a zoom button T13, a selection cross key 15, a release switch 16, and a power switch 17. When the user operates any of these buttons or switches, an instruction signal corresponding to the operated function is transmitted to the system control unit 7, and each component of the imaging device 1 is driven and controlled in accordance with this instruction signal.
- the selection cross key 15 moves the cursor or window on the screen of the monitor 11 when the cross key is pressed, and confirms the selection made by the cursor or window when the center key is pressed.
- for example, the cursor displayed on the monitor 11 is moved on the menu screen to open the shooting mode selection screen; on the shooting mode selection screen, the cursor is then moved to the desired shooting mode button and the confirm key is pressed to determine the shooting mode.
- by operating the cross key of the selection cross key 15, the window a can be moved up, down, left, and right on the screen of the monitor 11.
- the range specified by the window a can be changed by a range specifying operation using the selection cross key 15. In this way, the user can specify the subject and its range by changing the position or size of the window a.
- for example, it is also possible to divide the monitor 11 into two display screens, use one as a preview screen, and display on the other screen an enlarged view of the range specified by the window on the preview screen.
- a plurality of windows a and windows b can be displayed.
- the user can specify a plurality of subjects individually.
- the zoom button W12 has a function of adjusting the zoom when pressed to make the subject smaller
- the zoom button T13 has a function of adjusting the zoom when pressed to make the subject larger.
- the release switch 16 starts a shooting preparation operation when "half-pressed" in the still image shooting mode; when "fully pressed", the image sensor 4 is exposed, predetermined signal processing is performed on the electrical signal obtained by the exposure, and the result is recorded in the recording unit 10, whereby a series of imaging operations is executed.
- the power switch 17 alternately turns the imaging device 1 ON and OFF each time it is pressed.
- the inflection point changing unit 22 evaluates the output signal from the image sensor for the subject in the area specified by the user on the screen of the monitor 11, and, based on the evaluation result, determines the inflection point of the image sensor 4 that is optimal for displaying and photographing the subject in the specified area.
- specifically, the distribution of output signal values from the image sensor corresponding to the designated range is calculated from the image data within the range designated by the user, the most frequently occurring output signal value within the designated range is taken as the output signal value of the designated subject, and the inflection point is determined based on that output signal value.
- the evaluation of the output signal of the subject in the designated area is not limited to the above; for example, the average value of the output signal values within the range designated by the user may be obtained.
- the output signal evaluation in this embodiment also includes determining an inflection point by directly using the output signal value data of a specific pixel in the area designated by the user.
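The region-evaluation step described above (taking the most frequently occurring output signal value within the user-designated range as the subject's output signal value) can be sketched as follows. The function and variable names are illustrative only and do not appear in the patent.

```python
import numpy as np

def evaluate_region(image, window):
    """Return the most frequent sensor output value inside the
    user-designated window (the mode of the distribution), as in the
    evaluation step. `image` is a 2-D array of raw sensor output
    values; `window` is (top, left, height, width)."""
    top, left, h, w = window
    region = image[top:top + h, left:left + w]
    # histogram of output signal values within the designated range
    values, counts = np.unique(region, return_counts=True)
    # the value with the largest distribution becomes the subject's value
    return int(values[np.argmax(counts)])

img = np.array([[10, 10, 10, 200],
                [10, 250, 250, 250],
                [250, 250, 250, 250],
                [10, 10, 250, 250]])
print(evaluate_region(img, (1, 1, 3, 3)))  # mode of the 3x3 window: 250
```

The average-value evaluation mentioned as an alternative would simply replace the mode with `region.mean()`.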
- by lowering the inflection point, the output signal value at point A1 at the saturation level can become the output signal value indicated by point B1 in the logarithmic conversion region.
- the inflection point changing unit 22 determines whether the output signal value of the designated subject is in the saturation level region. If the output signal value of the designated subject is at the saturation level, the inflection point is lowered so that it is determined using the logarithmic conversion operation of the image sensor 4, thereby preventing overexposure.
- conversely, by raising the inflection point, the output signal value at point A2 above can become the output signal value indicated by point B2 in the linear conversion region on graph (d). Since the output signal value of the subject specified by the user changes from the logarithmic region of graph (c) to the linear region of graph (d), the contrast can be improved.
- when the output signal value of the designated subject is within the logarithmic conversion region, the inflection point changing unit 22 raises the inflection point and determines the optimal inflection point so that the electrical signal is obtained by the linear conversion operation of the image sensor 4, in order to improve contrast.
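The inflection point decision described above reduces to a simple rule: lower the inflection point for a saturated subject (use logarithmic conversion), raise it for a subject in the logarithmic region (use linear conversion). The sketch below is an illustrative reading of that rule with hypothetical level names; the patent gives no code or concrete thresholds.

```python
def choose_inflection_adjustment(subject_value, saturation_level, inflection_level):
    """Decide how to move the inflection point for the designated subject.
    `saturation_level` and `inflection_level` are hypothetical output
    signal levels for the saturation region and the current inflection
    point, respectively."""
    if subject_value >= saturation_level:
        # saturated: lower the inflection point so the subject falls in
        # the logarithmic conversion region (prevents overexposure)
        return "lower"
    if subject_value >= inflection_level:
        # in the logarithmic region: raise the inflection point so the
        # subject falls in the linear conversion region (improves contrast)
        return "raise"
    # already in the linear region: no change needed
    return "keep"

print(choose_inflection_adjustment(255, 255, 180))  # saturated subject -> "lower"
```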
- next, the calculation of the voltage value VL that the inflection point changing unit 22 sets in the image sensor 4 as the inflection point, based on the output signal value of the subject specified by the user, will be described.
- the image sensor 4 of the present embodiment can change the inflection point at which the linear conversion operation switches to the logarithmic conversion operation by switching the voltage value VL of the signal applied to the pixels G11 to Gmn shown in FIG.
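For intuition, the linear/logarithmic switching characteristic can be modeled as a response that is linear up to the inflection point and logarithmically compressed above it. The slope and gain parameters below are illustrative only; in the device the characteristic is actually set by the voltage value VL.

```python
import math

def sensor_response(light, inflection, slope=1.0, log_gain=40.0):
    """Toy model of a linear/logarithmic photoelectric conversion
    characteristic: linear below the inflection point, logarithmic
    above it. Parameters are illustrative, not from the patent."""
    if light <= inflection:
        return slope * light  # linear conversion region
    # continuous at the inflection point, then logarithmic compression
    return slope * inflection + log_gain * math.log(light / inflection)

# lowering the inflection point moves a bright value into the
# logarithmic region, compressing it and avoiding saturation
print(sensor_response(300, inflection=400))  # linear: 300.0
print(sensor_response(300, inflection=200))  # logarithmic: compressed below 300
```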
- the inflection point changing unit 22 calculates the voltage value VL of the signal given to the pixels G11 to Gmn in order to set the inflection point of the image sensor 4 to the optimal inflection point.
- alternatively, an LUT created in advance by associating the image sensor output signal value of the subject specified by the user with the voltage value VL may be stored in the inflection point changing unit 22, and the voltage value VL may be calculated using this LUT.
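A minimal sketch of such a LUT-based calculation, with hypothetical table values and linear interpolation between entries (the patent does not specify the table's contents or size):

```python
import bisect

# Hypothetical LUT associating a subject output signal value with the
# voltage value VL to set; a real table would be calibrated per sensor.
SIGNAL_TO_VL = [(0, 1.40), (64, 1.45), (128, 1.50), (192, 1.55), (255, 1.60)]

def lookup_vl(signal_value):
    """Return VL for a subject output signal value, linearly
    interpolating between LUT entries."""
    signals = [s for s, _ in SIGNAL_TO_VL]
    i = bisect.bisect_right(signals, signal_value) - 1
    if i >= len(SIGNAL_TO_VL) - 1:
        return SIGNAL_TO_VL[-1][1]  # clamp at the top entry
    (s0, v0), (s1, v1) = SIGNAL_TO_VL[i], SIGNAL_TO_VL[i + 1]
    # linear interpolation between neighbouring table entries
    return v0 + (v1 - v0) * (signal_value - s0) / (s1 - s0)

print(lookup_vl(96))  # halfway between the 64 and 128 entries: 1.475
```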
- the inflection point changing unit 22 includes a DA converter 36, which converts the calculated voltage value VL into analog form and inputs it to the pixels G11 to Gmn of the image sensor 4, whereby the inflection point of the image sensor 4 is changed to the optimal inflection point.
- in continuous shooting, the inflection points are changed to the optimal inflection points in sequence.
- the user can also adjust the zoom by pressing the zoom button W12 or the zoom button T13 provided on the back of the imaging device 1, thereby adjusting the size of the subject displayed on the monitor 11.
- the photographing mode selection screen is displayed on the monitor 11.
- on the shooting mode selection screen, the user operates the cross key of the selection cross key 15 to select "inflection point selection shooting mode" and presses the center key, whereby the inflection point selection shooting mode is set (step S1). The process then proceeds to the image display process, and a subject selection window is displayed on the preview screen of the monitor 11 (step S2).
- in step S3, the user operates the cross key of the selection cross key 15 to move the window on the preview screen (step S3); in step S4, the window size is adjusted by the range specification operation of the selection cross key 15, whereby the subject and its range are specified (step S4). Note that the order of step S3 and step S4 may be reversed.
- in step S5, when the center key of the selection cross key 15 is pressed while the window c is displayed on the subject that the user wants to designate, the subject designated by the user is confirmed (step S5).
- the inflection point changing unit 22 evaluates the image data of the designated subject (step S6).
- specifically, the distribution of output signal values from the image sensor corresponding to the specified range, i.e., the image data within the range specified by the user, is calculated, and the evaluation is made such that the most frequently occurring output signal value in the specified range becomes the output signal value of the designated subject.
- the inflection point changing unit 22 proceeds to the inflection point changing process, and determines the optimum inflection point based on the output signal value of the designated subject (step S7).
- the inflection point changing unit 22 calculates and sets the voltage value VL based on the output signal value of the specified subject as the inflection point of the image sensor 4 (step S8). For example, in FIG. 13, when the output signal value of the designated subject is at the saturation level, the designated subject becomes an overexposed photographed image due to saturation of the output signal. Therefore, as shown in FIG. 10, the inflection point that is the boundary between the linear region and the logarithmic region in the output signal of the image sensor 4 is lowered. That is, the inflection point is changed so that the output signal of the specified subject is located in the logarithmic region. This prevents the output signal from being saturated, secures the dynamic range of the image sensor, and expresses all image data within the specified output signal range, thereby preventing overexposure of the specified subject.
- the distribution of output signal values from the image sensor corresponding to the designated range is calculated from the image data within the range designated by the user, the most frequently occurring output signal value within the designated range is taken as the output signal value of the designated subject, and the optimal inflection point is determined based on that value.
- the inflection point changing unit 22 calculates the voltage value VL of the signal given to the pixels G11 to Gmn.
- the voltage value VL may be calculated by using a LUT created in advance by associating the output signal value of the designated subject with the voltage value VL (steps S7 and S8).
- the voltage value VL is calculated for each subject.
- the DA converter 36 included in the inflection point changing unit 22 converts the calculated voltage value VL into analog data and inputs it to the pixels G11 to Gmn of the image sensor 4, thereby changing the inflection point of the image sensor 4.
- the monitor 11 displays a subject preview screen after changing the inflection point as shown in FIG. 14 (step S9).
- as shown in FIG. 14, by changing the inflection point, overexposure is avoided for the subject specified by the user.
- the user confirms whether or not the desired photographed image has been obtained by visually checking the photographed image displayed on the preview screen (step S10). If the captured image still contains a subject for which overexposure should be prevented or gradation improved (step S10; No), the process returns to step S3, the window displayed on the preview screen of the monitor 11 is moved again (step S4), and the subject is specified (step S5).
- when it is confirmed on the preview screen of the monitor 11 that the desired captured image can be obtained by changing the inflection point (step S10; Yes), the release switch 16 is pressed halfway to perform the shooting preparation operation; as the AF operation is performed, the AE evaluation value is calculated. If the release switch 16 is not pressed, the preview image after the change of the inflection point is displayed on the monitor 11.
- the aperture shutter control unit 19 drives and controls the aperture shutter unit based on the AE evaluation value calculated by the AE evaluation value calculation unit 32.
- the pixels G11 to Gmn of the image sensor 4 photoelectrically convert incident light by switching between the linear conversion operation and the logarithmic conversion operation at the inflection point determined by the inflection point changing unit 22, and output the electrical signal obtained by the photoelectric conversion to the signal processing unit 8.
- the signal processing unit 8 performs predetermined image processing on the electrical signal obtained by the photoelectric conversion. That is, the amplifier 27 amplifies the electrical signal output from the image sensor 4 to a predetermined specified level, and the A/D converter 28 converts the amplified electrical signal into a digital signal.
- the black reference correction unit 29 corrects the black level, at which the minimum luminance value is obtained, to the reference value. Further, the AE evaluation value calculation unit 30 detects an evaluation value necessary for AE (automatic exposure) from the electrical signal after the black reference correction and sends it to the system control unit 7. Meanwhile, the WB processing unit 31 calculates a correction coefficient from the electrical signal after the black reference correction and adjusts the level ratios of the R, G, and B color components (R/G, B/G) of the captured image so that white is displayed correctly.
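The white-balance adjustment just described (adjusting the R/G and B/G component ratios so that white displays correctly) can be illustrated with a simple gray-world style calculation. The exact formula used by the WB processing unit 31 is not given in the patent, so this is only a sketch.

```python
def white_balance_gains(r_avg, g_avg, b_avg):
    """Compute per-channel correction coefficients so that the R/G and
    B/G ratios of a neutral area become 1 (white comes out white).
    Green is the reference channel, as is common practice."""
    return {"r_gain": g_avg / r_avg, "g_gain": 1.0, "b_gain": g_avg / b_avg}

# a warm-tinted neutral patch: red too strong, blue too weak
gains = white_balance_gains(r_avg=120.0, g_avg=100.0, b_avg=80.0)
print(gains)  # r is attenuated, b is boosted
```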
- the color interpolation unit 32 performs a color interpolation process for interpolating the missing color components for each pixel. Then, the color correction unit 33 corrects the color component value for each pixel to generate an image in which the hue of each pixel is emphasized.
- the tone conversion unit 34 performs gamma correction processing that corrects the tone response characteristics of the image to an optimal curve according to the gamma value of the imaging device 1.
- the color space conversion unit 35 converts the color space from RGB to YUV.
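A conversion of this kind is commonly done with the ITU-R BT.601 coefficients. The patent does not state which matrix the color space conversion unit 35 uses, so the following is only an illustrative sketch.

```python
def rgb_to_yuv(r, g, b):
    """RGB -> YUV with the common ITU-R BT.601 coefficients (assumed
    here; the patent does not specify the matrix)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luma
    u = 0.492 * (b - y)                    # scaled blue-difference chroma
    v = 0.877 * (r - y)                    # scaled red-difference chroma
    return y, u, v

print(rgb_to_yuv(255, 255, 255))  # pure white: full luma, zero chroma
```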
- when a plurality of subjects are designated, the preview is performed with the inflection point changed to each of the respective inflection points, and multiple shots are taken.
- the plurality of captured images may be recorded together, one of the images may be selected and recorded, or the plurality of captured images may be combined and recorded.
- a USB cable connected to the USB terminal 18 can be connected to a personal computer.
- the subject range is designated by the operation unit 21, so that the user himself/herself can designate the subject.
- the inflection point is changed based on the evaluation result of the output signal value of the subject specified by the user, the photoelectric conversion characteristics of the image sensor 4 can be changed according to the user's needs.
- the window displayed on the monitor 11 makes it possible to easily view the subject range designated by the user in the captured image.
- the user can easily specify a desired subject by moving the window on the monitor 11 by operating the operation unit 21, or by changing the window designation range.
- since the inflection point is changed based on the evaluation result of the output signal value of the subject specified by the user, a photographed image intended by the user can be obtained by changing the photoelectric conversion characteristics of the image sensor.
- the subject range designated by the user can be easily visually confirmed by the window displayed on the display unit.
- the user can easily designate a desired subject by moving the window of the display unit by operating the operation unit or changing the designated range of the window.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- Nonlinear Science (AREA)
- Studio Devices (AREA)
- Transforming Light Signals Into Electric Signals (AREA)
Abstract
Description
Claims
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020077021925A KR101230200B1 (ko) | 2005-03-29 | 2006-03-07 | 촬상 장치 |
US11/887,191 US7948525B2 (en) | 2005-03-29 | 2006-03-07 | Imaging device having a linear/logarithmic imaging sensor |
JP2007510355A JP4114707B2 (ja) | 2005-03-29 | 2006-03-07 | 撮像装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2005-094400 | 2005-03-29 | ||
JP2005094400 | 2005-03-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006103881A1 true WO2006103881A1 (ja) | 2006-10-05 |
Family
ID=37053148
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2006/304302 WO2006103881A1 (ja) | 2005-03-29 | 2006-03-07 | 撮像装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US7948525B2 (ja) |
JP (1) | JP4114707B2 (ja) |
KR (1) | KR101230200B1 (ja) |
WO (1) | WO2006103881A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009126391A (ja) * | 2007-11-26 | 2009-06-11 | Honda Motor Co Ltd | 車載撮像装置 |
WO2014181743A1 (ja) * | 2013-05-07 | 2014-11-13 | 株式会社デンソー | 画像処理装置及び画像処理方法 |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8976257B2 (en) * | 2012-07-31 | 2015-03-10 | Jadak, Llc | Automatic exposure calibration and compensation for machine vision |
US9380229B2 (en) * | 2014-02-28 | 2016-06-28 | Samsung Electronics Co., Ltd. | Digital imaging systems including image sensors having logarithmic response ranges and methods of determining motion |
US9978798B2 (en) * | 2015-08-03 | 2018-05-22 | Sony Corporation | Sensors with variable sensitivity to maximize data use |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001218107A (ja) * | 2000-02-01 | 2001-08-10 | Olympus Optical Co Ltd | 電子カメラ |
JP2004088312A (ja) * | 2002-08-26 | 2004-03-18 | Minolta Co Ltd | 撮像装置 |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR0185909B1 (ko) * | 1994-07-11 | 1999-05-01 | 김광호 | 비데오 카메라의 노출 조절장치 |
US6545710B1 (en) * | 1995-08-11 | 2003-04-08 | Minolta Co., Ltd. | Image pick-up apparatus |
US6850249B1 (en) * | 1998-04-03 | 2005-02-01 | Da Vinci Systems, Inc. | Automatic region of interest tracking for a color correction system |
JPH11298799A (ja) * | 1998-04-15 | 1999-10-29 | Honda Motor Co Ltd | 光センサ信号処理装置 |
JP2001008110A (ja) | 1999-06-24 | 2001-01-12 | Minolta Co Ltd | 固体撮像装置 |
US8379126B2 (en) * | 1999-06-24 | 2013-02-19 | Konica Minolta Holdings, Inc. | Image-sensing apparatus |
JP4374745B2 (ja) * | 2000-07-19 | 2009-12-02 | コニカミノルタホールディングス株式会社 | 固体撮像装置 |
US7369160B2 (en) * | 2001-06-15 | 2008-05-06 | Yokogawa Electric Corporation | Camera system for transferring both image data and an image processing program to transfer the image data to an external device |
KR20040033967A (ko) * | 2002-10-16 | 2004-04-28 | 삼성테크윈 주식회사 | 지정 검출 영역의 위치가 변하는 디지털 카메라의 제어 방법 |
JP2004282282A (ja) * | 2003-03-14 | 2004-10-07 | Yokogawa Electric Corp | カメラシステム及びカメラ制御方法 |
JP3948433B2 (ja) * | 2003-05-21 | 2007-07-25 | コニカミノルタホールディングス株式会社 | 固体撮像装置 |
US7012238B2 (en) * | 2003-07-02 | 2006-03-14 | Sharp Kabushiki Kaisha | Amplification-type solid-state image pickup device incorporating plurality of arrayed pixels with amplification function |
US7545412B2 (en) * | 2003-09-09 | 2009-06-09 | Konica Minolta Holdings, Inc. | Image-sensing apparatus with a solid-state image sensor switchable between linear and logarithmic conversion |
US7714928B2 (en) * | 2004-05-28 | 2010-05-11 | Konica Minolta Holdings, Inc. | Image sensing apparatus and an image sensing method comprising a logarithmic characteristic area and a linear characteristic area |
US7656561B2 (en) * | 2004-05-31 | 2010-02-02 | Phase One A/S | Image compression for rapid high-quality imaging |
JP2006020055A (ja) * | 2004-07-01 | 2006-01-19 | Konica Minolta Holdings Inc | 撮像装置 |
US20060065811A1 (en) * | 2004-09-27 | 2006-03-30 | Hongil Yoon | Wide dynamic range CMOS image sensor having controllabale photo-response characteristic and control method thereof |
US20090128650A1 (en) * | 2005-03-29 | 2009-05-21 | Kazusei Takahashi | Imaging Device |
JP2007006453A (ja) * | 2005-05-24 | 2007-01-11 | Konica Minolta Holdings Inc | 固体撮像装置 |
JP4735051B2 (ja) * | 2005-05-25 | 2011-07-27 | コニカミノルタオプト株式会社 | 撮像装置 |
-
2006
- 2006-03-07 WO PCT/JP2006/304302 patent/WO2006103881A1/ja active Application Filing
- 2006-03-07 JP JP2007510355A patent/JP4114707B2/ja not_active Expired - Fee Related
- 2006-03-07 KR KR1020077021925A patent/KR101230200B1/ko not_active IP Right Cessation
- 2006-03-07 US US11/887,191 patent/US7948525B2/en not_active Expired - Fee Related
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009126391A (ja) * | 2007-11-26 | 2009-06-11 | Honda Motor Co Ltd | 車載撮像装置 |
WO2014181743A1 (ja) * | 2013-05-07 | 2014-11-13 | 株式会社デンソー | 画像処理装置及び画像処理方法 |
JP2014219790A (ja) * | 2013-05-07 | 2014-11-20 | 株式会社デンソー | 画像処理装置及び画像処理方法 |
US9800881B2 (en) | 2013-05-07 | 2017-10-24 | Denso Corporation | Image processing apparatus and image processing method |
Also Published As
Publication number | Publication date |
---|---|
KR101230200B1 (ko) | 2013-02-05 |
JPWO2006103881A1 (ja) | 2008-09-04 |
US20090141139A1 (en) | 2009-06-04 |
US7948525B2 (en) | 2011-05-24 |
JP4114707B2 (ja) | 2008-07-09 |
KR20080002773A (ko) | 2008-01-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8212890B2 (en) | Imaging device and imaging method | |
JP2008124994A (ja) | 撮像装置及び露出制御方法 | |
JP2007259237A (ja) | 撮像装置および撮像方法 | |
JP2001346069A (ja) | 映像信号処理装置及び輪郭強調補正装置 | |
US20170318208A1 (en) | Imaging device, imaging method, and image display device | |
JP4735051B2 (ja) | 撮像装置 | |
JP2004157417A (ja) | デジタルカメラ及びそのaf制御時の露出設定方法 | |
US20060055991A1 (en) | Image capture apparatus and image capture method | |
JP4114707B2 (ja) | 撮像装置 | |
JP2014161079A (ja) | 画像生成装置及び方法、並びに撮像装置 | |
WO2006103880A1 (ja) | 撮像装置 | |
JP2004328460A (ja) | ホワイトバランス調整方法および電子カメラ | |
US20060268154A1 (en) | Image pickup apparatus | |
JP2008160190A (ja) | 撮像装置、およびその方法 | |
JP2009118052A (ja) | 画像信号処理方法及び装置 | |
JP4335648B2 (ja) | デジタルカメラ及びデジタルカメラの撮像方法 | |
JP2006261928A (ja) | 撮像装置及びデジタルカメラ | |
JP2006303755A (ja) | 撮像装置及び撮像方法 | |
JP2006279714A (ja) | 撮像装置及び撮像方法 | |
JP4274207B2 (ja) | 撮像装置及びその露出制御方法並びに記録媒体 | |
JP2020182179A (ja) | 画像処理装置およびその制御方法 | |
JP2006345301A (ja) | 撮像装置 | |
JP2007067491A (ja) | 撮像装置 | |
JP2006303756A (ja) | 撮像装置及び撮像方法 | |
JP2007110492A (ja) | 撮像装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2007510355 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020077021925 Country of ref document: KR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 11887191 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
NENP | Non-entry into the national phase |
Ref country code: RU |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 06728671 Country of ref document: EP Kind code of ref document: A1 |