CN104272721A - Imaging device - Google Patents

Imaging device

Info

Publication number
CN104272721A
CN104272721A · CN201380022701.9A · CN201380022701A
Authority
CN
China
Prior art keywords
pixel
signal
block
group
pixels
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201380022701.9A
Other languages
Chinese (zh)
Inventor
栗山孝司
村田宽信
纲井史郎
小西哲也
铃木政央
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nikon Corp
Original Assignee
Nikon Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corp filed Critical Nikon Corp
Priority to CN201910825470.4A priority Critical patent/CN110572586A/en
Publication of CN104272721A publication Critical patent/CN104272721A/en
Pending legal-status Critical Current


Classifications

    • H04N 25/75 — Circuitry for providing, modifying or processing image signals from the pixel array (solid-state image sensors)
    • H01L 27/14634 — Imager structures: assemblies, i.e. hybrid structures
    • H01L 27/1464 — Back-illuminated imager structures
    • H04N 23/73 — Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N 25/443 — Extracting pixel data by partially reading an SSIS array: reading pixels from selected 2D regions, e.g. for windowing or digital zooming
    • H04N 25/533 — Control of the integration time by using differing integration times for different sensor regions
    • H04N 25/583 — Control of the dynamic range involving two or more exposures acquired simultaneously with different integration times
    • H04N 25/77 — Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components
    • H04N 25/771 — Pixel circuitry comprising storage means other than floating diffusion
    • H04N 25/772 — Pixel circuitry comprising A/D, V/T, V/F, I/T or I/F converters
    • H04N 25/79 — Arrangements of circuitry divided between different or multiple substrates, chips or circuit boards, e.g. stacked image sensors

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Power Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Solid State Image Pick-Up Elements (AREA)
  • Studio Devices (AREA)
  • Color Television Image Signal Generators (AREA)

Abstract

Provided is an imaging element equipped with: an imaging unit which has a plurality of groups each comprising at least one pixel, and a plurality of signal-reading units that are provided one per group and read signals from the pixels; and a control unit which controls the signal-reading unit of at least one group from among the plurality of groups. Each of the plurality of groups may include a plurality of pixels. The control unit may select at least one group from among the plurality of groups and control its signal-reading unit with a control parameter that differs from that of the other groups among the plurality of groups.

Description

Imaging device
Technical field
The present invention relates to an imaging device.
Background technology
A known image-capturing unit is configured such that a back-side illuminated imaging chip and a signal processing chip are connected via micro-bumps, in units each of which groups a plurality of pixels.
Prior art document
Patent documentation
Patent Document 1: Japanese Unexamined Patent Application Publication No. 2006-49361
Summary of the invention
In the above image-capturing unit, a control line is provided for each unit. However, the charge accumulation time and the reading of pixel signals cannot be finely controlled between the units.
According to a first aspect of the present invention, an image sensor is provided that has: an imaging unit having a plurality of groups each made up of at least one pixel, and a plurality of signal reading units provided one per group to read signals from the pixels; and a control unit that controls the signal reading unit of at least one group among the plurality of groups.
According to a second aspect of the present invention, an image sensor is provided that has: an imaging unit having a plurality of groups each made up of at least one pixel, and a plurality of signal reading units provided one per group to read signals from the pixels; and a plurality of control units, provided one per group, each of which controls the corresponding signal reading unit based on the signals from the pixels.
According to a third aspect of the present invention, an image sensor is provided that has: an imaging unit having an imaging region in which a first pixel and a second pixel are provided, a first reading circuit that reads a first pixel signal output from the first pixel, and a second reading circuit that reads a second pixel signal output from the second pixel; a first arithmetic unit that computes a first evaluation value based on the first pixel signal; a second arithmetic unit that computes a second evaluation value based on the second pixel signal; a first control unit that performs control relating to exposure or reading of the first pixel based on the first evaluation value; and a second control unit that performs control relating to exposure or reading of the second pixel based on the second evaluation value.
According to a fourth aspect of the present invention, an image sensor is provided that has: an imaging unit having a plurality of groups each made up of at least one pixel, and a plurality of signal reading units provided one per group to read signals from the pixels; and arithmetic units, provided one per group, each of which sends information relating to the control of the signal reading unit to an image processing unit that performs image processing on the signals.
According to a fifth aspect of the present invention, an image sensor is provided that has: an imaging unit having an imaging region in which a first pixel and a second pixel are arranged, a first reading circuit that reads a first pixel signal output from the first pixel, and a second reading circuit that reads a second pixel signal output from the second pixel; a first arithmetic unit that computes a first evaluation value based on the first pixel signal and sends the computed first evaluation value to a downstream image processing unit that performs image processing on first pixel data corresponding to the first pixel signal; and a second arithmetic unit that computes a second evaluation value based on the second pixel signal and sends the computed second evaluation value to a downstream image processing unit that performs image processing on second pixel data corresponding to the second pixel signal.
According to a sixth aspect of the present invention, an image sensor is provided that has: an imaging unit having a plurality of groups each made up of at least one pixel; and a storage unit having a plurality of memory blocks, which are provided in correspondence with the plurality of groups and each of which stores both the signals from the pixels of the corresponding group and the signals from pixels outside the corresponding group.
The above summary of the invention does not enumerate all the essential features of the present invention. Sub-combinations of these feature groups may also constitute the invention.
Brief description of the drawings
Fig. 1 is a sectional view of the back-side illuminated MOS image sensor of the present embodiment.
Fig. 2 is an explanatory diagram of the pixel arrangement and unit groups of the imaging chip.
Fig. 3 is a circuit diagram corresponding to a unit group of the imaging chip.
Fig. 4 is a block diagram showing the functional configuration of the image sensor.
Fig. 5 is a block diagram showing the configuration of the imaging device of the present embodiment.
Fig. 6 is a functional block diagram of the image processing unit.
Fig. 7 is a flow chart showing an operation in which the imaging device generates and records a moving image.
Fig. 8 shows an example of an image captured by the image sensor.
Fig. 9 shows an example of an image captured by the image sensor.
Fig. 10 shows the relation between each frame rate and the output timing of the image signals.
Fig. 11 schematically shows the attention-area moving image and the peripheral-area moving image generated by the moving image generating unit.
Fig. 12 shows an example of the header information attached by the moving image generating unit.
Fig. 13 is a flow chart showing an operation in which the imaging device reproduces and displays a moving image.
Fig. 14 is a flow chart showing an example of another operation in which the imaging device generates and stores a moving image.
Fig. 15 shows an example of the pixels read out from a unit group at a thinning rate of 0.5.
Fig. 16 is a flow chart showing an operation in which the imaging device reproduces and displays a moving image.
Fig. 17 is an explanatory diagram of a scene example and region division.
Fig. 18 is an explanatory diagram of charge accumulation control performed for each region divided according to the example of Fig. 17.
Fig. 19 is a diagram showing the relation between the accumulation count and the dynamic range.
Fig. 20 is a flow chart showing the processing of an image-capturing operation.
Fig. 21 is a block diagram showing a concrete configuration example of the signal processing chip.
Fig. 22 is a sectional view of another back-side illuminated MOS image sensor of the present embodiment.
Fig. 23 is an explanatory diagram of the pixel arrangement and unit groups of the imaging chip.
Fig. 24 is a circuit diagram corresponding to a unit group of the imaging chip.
Fig. 25 is a block diagram showing the configuration of the imaging device of the present embodiment.
Fig. 26 is a block diagram showing a concrete configuration example of the signal processing chip.
Fig. 27 shows an example of the functional blocks of the arithmetic circuit 1415.
Fig. 28 shows an example of the correspondence between the inter-frame difference d and the frame rate f.
Fig. 29 shows an example of an image captured by the image sensor.
Fig. 30 shows an example of an image captured by the image sensor.
Fig. 31 shows an example of the functional blocks of another arithmetic circuit.
Fig. 32 shows an example of the pixels 1188 read out from one unit group at a thinning rate of 0.5.
Fig. 33 shows an example of the functional blocks of yet another arithmetic circuit.
Fig. 34 schematically shows the relation between gain and pixel signal.
Fig. 35 is a sectional view of the back-side illuminated MOS image sensor of the present embodiment.
Fig. 36 is an explanatory diagram of the pixel arrangement and pixel blocks of the imaging chip.
Fig. 37 is a circuit diagram corresponding to a pixel block of the imaging chip.
Fig. 38 is a diagram showing part of the configuration of the image sensor and an operation example thereof.
Fig. 39 is a block diagram showing the configuration of the imaging device of the present embodiment.
Fig. 40 is a functional block diagram of the image processing unit.
Fig. 41 is a flow chart showing an operation in which the imaging device generates and records a moving image.
Fig. 42 shows an example of an image captured by the image sensor.
Fig. 43 shows an example of an image captured by the image sensor.
Fig. 44 shows the relation between each frame rate and the output timing of the image signals.
Fig. 45 schematically shows the attention-area moving image and the peripheral-area moving image generated by the moving image generating unit.
Fig. 46 shows an example of the header information attached by the moving image generating unit.
Fig. 47 is a flow chart showing an operation in which the imaging device reproduces and displays a moving image.
Fig. 48 is a flow chart showing an example of another operation in which the imaging device generates and records a moving image.
Fig. 49 shows an example of the pixels read out from one pixel block at a thinning rate of 0.5.
Fig. 50 is a flow chart showing an operation in which the imaging device reproduces and displays a moving image.
Fig. 51A is an explanatory diagram of a scene example.
Fig. 51B is an explanatory diagram of region division.
Fig. 52 is an explanatory diagram of charge accumulation control performed for each region divided according to the example of Fig. 51B.
Fig. 53 is a diagram showing the relation between the accumulation count and the dynamic range.
Fig. 54 is a flow chart showing the processing of an image-capturing operation.
Fig. 55 is a block diagram showing a concrete configuration example of the signal processing chip.
Fig. 56 is a block diagram showing the configuration of the peripheral pixel data processing unit.
Fig. 57 is a block diagram showing an example of the configuration of the arithmetic circuit.
Fig. 58 is a flow chart showing an operation example of the arithmetic circuit.
Fig. 59 shows the structure of the data array generated by the output circuit.
Fig. 60 shows the content of the data array shown in Fig. 59.
Fig. 61 is a sectional view of the back-side illuminated MOS image sensor of the present embodiment.
Fig. 62 is an explanatory diagram of the pixel arrangement and pixel blocks of the imaging chip.
Fig. 63 is a circuit diagram corresponding to a pixel block of the imaging chip.
Fig. 64A is a diagram showing part of the configuration of the image sensor and an operation example thereof.
Fig. 64B is a diagram showing another operation example of the image sensor.
Fig. 64C is a diagram showing another operation example of the image sensor.
Fig. 65 is a block diagram showing the configuration of the imaging device of the present embodiment.
Fig. 66 is a functional block diagram of the image processing unit.
Fig. 67 is a flow chart showing an operation in which the imaging device generates and records a moving image.
Fig. 68 shows an example of an image captured by the image sensor.
Fig. 69 shows an example of an image captured by the image sensor.
Fig. 70 shows the relation between each frame rate and the output timing of the image signals.
Fig. 71 schematically shows the attention-area moving image and the peripheral-area moving image generated by the moving image generating unit.
Fig. 72 shows an example of the header information attached by the moving image generating unit.
Fig. 73 is a flow chart showing an operation in which the imaging device reproduces and displays a moving image.
Fig. 74 is a top view of the pixel region of the image sensor and an operation example thereof.
Fig. 75 is a top view of another configuration of the pixel region of the image sensor and an operation example thereof.
Fig. 76 is a top view of another configuration of the pixel region of the image sensor and an operation example thereof.
Fig. 77 is a top view of another configuration of the pixel region of the image sensor and an operation example thereof.
Fig. 78 is a diagram showing another configuration of the pixel region of the image sensor and an operation example thereof.
Fig. 79 is a flow chart showing another operation example in which the imaging device generates and records a moving image.
Fig. 80 shows an example of pixels read out at a thinning rate of 0.5.
Fig. 81 is a flow chart showing an operation in which the imaging device reproduces and displays a moving image.
Fig. 82A is an explanatory diagram of a scene example.
Fig. 82B is an explanatory diagram of region division.
Fig. 83 is an explanatory diagram of charge accumulation control performed for each region divided according to the example of Fig. 82B.
Fig. 84 is a diagram showing the relation between the accumulation count and the dynamic range.
Fig. 85 is a flow chart showing the processing of an image-capturing operation.
Fig. 86 is a block diagram showing a concrete configuration example of the signal processing chip.
Embodiment
Hereinafter, the present invention will be described through embodiments, but the following embodiments do not limit the invention according to the claims. Moreover, not all combinations of features described in the embodiments are necessarily essential to the means for solving the problem of the invention.
Fig. 1 is a sectional view of the back-side illuminated image sensor 100 of the present embodiment. The image sensor 100 includes: an imaging chip 113 that outputs pixel signals corresponding to incident light; a signal processing chip 111 that processes the pixel signals; and a memory chip 112 that stores the pixel signals. The imaging chip 113, signal processing chip 111, and memory chip 112 are stacked, and are electrically connected to one another by conductive bumps 109 of Cu or the like.
As illustrated, incident light enters mainly in the Z-axis positive direction indicated by the white hollow arrow. In the present embodiment, the face of the imaging chip 113 on the side struck by the incident light is called the back side. As the coordinate axes show, the leftward direction of the page orthogonal to the Z axis is the X-axis positive direction, and the direction out of the page orthogonal to the Z and X axes is the Y-axis positive direction. In several subsequent figures, coordinate axes are displayed with those of Fig. 1 as reference so that the orientation of each figure can be distinguished.
One example of the imaging chip 113 is a back-side illuminated MOS image sensor. A PD layer 106 is arranged on the back side of a wiring layer 108. The PD layer 106 has a plurality of photodiodes (PDs) 104, arranged two-dimensionally, that accumulate charge corresponding to incident light, and transistors 105 provided in correspondence with the PDs 104.
On the incident-light side of the PD layer 106, color filters 102 are provided with a passivation film 103 in between. The color filters 102 are of a plurality of kinds that transmit mutually different wavelength regions, and have a specific arrangement corresponding to the PDs 104. The arrangement of the color filters 102 will be described later. A set of a color filter 102, a PD 104, and transistors 105 forms one pixel.
On the incident-light side of the color filters 102, a microlens 101 is provided for each pixel. Each microlens 101 condenses the incident light toward the corresponding PD 104.
The wiring layer 108 has wiring 107 that transmits the pixel signals from the PD layer 106 to the signal processing chip 111. The wiring 107 may be multilayered, and passive elements and active elements may also be provided.
A plurality of bumps 109 are arranged on the surface of the wiring layer 108. These bumps 109 are aligned with a plurality of bumps 109 provided on the opposing face of the signal processing chip 111; by pressing the imaging chip 113 and the signal processing chip 111 together, the aligned bumps 109 are joined and electrically connected.
Similarly, a plurality of bumps 109 are arranged on the mutually opposing faces of the signal processing chip 111 and the memory chip 112. These bumps 109 are aligned with each other; by pressing the signal processing chip 111 and the memory chip 112 together, the aligned bumps 109 are joined and electrically connected.
The joining of the bumps 109 is not limited to Cu bump bonding by solid-phase diffusion; micro-bump bonding by solder melting may also be adopted. Moreover, roughly one bump 109 may be provided, for example, for each unit group described later. Accordingly, the size of the bumps 109 may be larger than the pitch of the PDs 104. In the peripheral region outside the pixel region in which the pixels are arranged, bumps larger than the bumps 109 corresponding to the pixel region may also be provided.
The signal processing chip 111 has TSVs (through-silicon vias) 110 that interconnect circuits provided on its front and back faces. The TSVs 110 are preferably provided in the peripheral region. TSVs 110 may also be provided in the peripheral regions of the imaging chip 113 and the memory chip 112.
Fig. 2 is an explanatory diagram of the pixel arrangement and unit groups 131 of the imaging chip 113. In particular, the imaging chip 113 is shown as observed from the back side. More than 20 million pixels are arranged in a matrix in the pixel region. In the present embodiment, 16 adjacent pixels, 4 pixels x 4 pixels, form one unit group 131. The grid lines in the figure illustrate the concept of adjacent pixels being grouped to form a unit group 131. The number of pixels forming a unit group 131 is not limited to this; it may be on the order of 1000, for example 32 pixels x 64 pixels, and may be more or fewer.
As shown in the partial enlarged view of the pixel region, the unit group 131 contains four so-called Bayer arrays, arranged vertically and horizontally, each made up of the four pixels: green pixels Gb and Gr, blue pixel B, and red pixel R. The green pixels have a green filter as the color filter 102 and receive light in the green wavelength band of the incident light. Similarly, the blue pixel has a blue filter as the color filter 102 and receives light in the blue wavelength band, and the red pixel has a red filter as the color filter 102 and receives light in the red wavelength band.
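The 2x2 Bayer tiling described above can be pictured as a simple coordinate-to-color mapping. The Python snippet below is purely illustrative and not part of the patent; in particular, the tile orientation (R and Gr on one row, Gb and B on the other) is an assumption made for the sketch, since the exact orientation follows from Fig. 2 rather than the text.

```python
# Assumed 2x2 Bayer tile orientation (illustrative only).
BAYER_TILE = [["R", "Gr"],
              ["Gb", "B"]]

def bayer_color(row, col):
    """Return the assumed filter color at pixel position (row, col)."""
    return BAYER_TILE[row % 2][col % 2]

def unit_group_colors(size=4):
    """A 4x4-pixel unit group contains four complete Bayer tiles."""
    return [[bayer_color(r, c) for c in range(size)] for r in range(size)]
```

Counting the colors of a 4 x 4 unit group confirms the expected ratio: four red, four blue, and eight green pixels.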
In the present embodiment, at least one unit group among the plurality of unit groups 131 is selected, and the pixels included in it are controlled with control parameters different from those of the other unit groups. Examples of control parameters are the frame rate, the thinning rate, the number of added rows or added columns over which pixel signals are summed, the charge accumulation time or accumulation count, and the number of digitisation bits. A control parameter may also be a parameter in the image processing performed after the image signals are obtained from the pixels.
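The per-group control parameters listed above can be pictured as a small record attached to each unit group. The following Python sketch is illustrative only: the names (GroupControlParams, thinning_rate, and so on) are assumptions, not identifiers from the patent, and the default values are arbitrary.

```python
from dataclasses import dataclass

@dataclass
class GroupControlParams:
    """Assumed per-unit-group control parameters (illustrative names)."""
    frame_rate_hz: float = 60.0       # frame rate of this group
    thinning_rate: float = 0.0        # fraction of pixels skipped on read-out
    rows_summed: int = 1              # number of rows whose signals are added
    accumulation_time_s: float = 1 / 60  # charge accumulation time
    accumulation_count: int = 1       # number of repeated accumulations
    adc_bits: int = 12                # digitisation bit depth

# Select one unit group and give it parameters different from the others.
params = {group_id: GroupControlParams() for group_id in range(4)}
params[2] = GroupControlParams(frame_rate_hz=240.0, accumulation_time_s=1 / 240)
```

In this sketch group 2 runs at 240 Hz while the remaining groups keep the 60 Hz defaults, mirroring the selective control the embodiment describes.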
Fig. 3 is a circuit diagram corresponding to a unit group 131 of the imaging chip 113. In the figure, the rectangle surrounded by a dotted line representatively shows the circuit corresponding to one pixel. At least part of each transistor described below corresponds to the transistors 105 of Fig. 1.
As described above, a unit group 131 is formed of 16 pixels. The 16 PDs 104 corresponding to the pixels are each connected to a transfer transistor 302, and the gate of each transfer transistor 302 is connected to a TX wiring 307 that supplies transfer pulses. In the present embodiment, the TX wiring 307 is connected in common to the 16 transfer transistors 302.
The drain of each transfer transistor 302 is connected to the source of the corresponding reset transistor 303, and the so-called floating diffusion FD between the drain of the transfer transistor 302 and the source of the reset transistor 303 is connected to the gate of an amplifier transistor 304. The drain of the reset transistor 303 is connected to a Vdd wiring 310 that supplies the power-supply voltage, and its gate is connected to a reset wiring 306 that supplies reset pulses. In the present embodiment, the reset wiring 306 is connected in common to the 16 reset transistors 303.
The drain of each amplifier transistor 304 is connected to the Vdd wiring 310 that supplies the power-supply voltage. The source of each amplifier transistor 304 is connected to the drain of the corresponding selection transistor 305. The gate of each selection transistor is connected to a decoder wiring 308 that supplies selection pulses. In the present embodiment, the decoder wirings 308 are provided independently for the 16 selection transistors 305. The source of each selection transistor 305 is connected to a common output wiring 309. A load current source 311 supplies current to the output wiring 309; that is, the output wiring 309 for the selection transistors 305 is formed by a source follower. The load current source 311 may be provided on the imaging chip 113 side or on the signal processing chip 111 side.
Here, the flow from the start of charge accumulation to the pixel output after accumulation ends is described. When a reset pulse is applied to the reset transistors 303 through the reset wiring 306 and, at the same time, a transfer pulse is applied to the transfer transistors 302 through the TX wiring 307, the potentials of the PD 104 and the floating diffusion FD are reset.
When the application of the transfer pulse is released, the PD 104 converts the received incident light into charge and accumulates it. Thereafter, when a transfer pulse is applied again with no reset pulse applied, the accumulated charge is transferred to the floating diffusion FD, and the potential of the floating diffusion FD changes from the reset potential to the signal potential after charge accumulation. When a selection pulse is then applied to the selection transistor 305 through the decoder wiring 308, the variation of the signal potential of the floating diffusion FD is conveyed to the output wiring 309 via the amplifier transistor 304 and the selection transistor 305. In this way, pixel signals corresponding to the reset potential and the signal potential are output from the unit pixel to the output wiring 309.
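The pulse sequence just described can be mirrored in a toy model. The following Python class is an illustrative sketch, not the patent's circuit: potentials and charge are treated as plain numbers, and the method names (reset, accumulate, transfer, select) are labels for the pulse phases rather than terms from the source.

```python
class PixelModel:
    """Toy model of one pixel's reset / accumulate / transfer / select cycle."""

    def __init__(self, reset_potential=1000):
        self.reset_potential = reset_potential
        self.pd_charge = 0                      # charge accumulated in the PD
        self.fd_potential = reset_potential     # floating diffusion potential

    def reset(self):
        # Reset pulse + transfer pulse: PD and FD potentials are reset.
        self.pd_charge = 0
        self.fd_potential = self.reset_potential

    def accumulate(self, photons):
        # Transfer pulse released: incident light is converted to charge.
        self.pd_charge += photons

    def transfer(self):
        # Transfer pulse without reset pulse: charge moves to the FD,
        # lowering its potential from the reset level to the signal level.
        self.fd_potential = self.reset_potential - self.pd_charge
        self.pd_charge = 0

    def select(self):
        # Selection pulse: the source follower conveys both levels out.
        return self.reset_potential, self.fd_potential
```

Reading out both the reset level and the signal level, as the last step does, is what makes correlated double sampling in the downstream signal processing circuit possible.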
As illustrated, in the present embodiment the reset wiring 306 and the TX wiring 307 are shared by the 16 pixels forming the unit group 131. That is, the reset pulse and the transfer pulse are each applied simultaneously to all 16 pixels. Accordingly, all the pixels forming the unit group 131 start charge accumulation at the same timing and end charge accumulation at the same timing. However, the pixel signals corresponding to the accumulated charges are selectively output to the output wiring 309 by applying selection pulses to the respective selection transistors 305 in turn. The reset wiring 306, TX wiring 307, and output wiring 309 are provided individually for each unit group 131.
By like this with unit group 131 for benchmark carrys out forming circuit, electric charge can be controlled by each unit group 131 and store the time.In other words, adjacent unit group 131 each other in, can export respectively and store the time based on different electric charges and the picture element signal obtained.Again in other words, carry out during an electric charge stores making a unit group 131, make another unit group 131 repeat repeatedly electric charge store and export each picture element signal, thus, also can these unit groups 131 each other in export each frame of dynamic image with different frame frequency.
Fig. 4 is a block diagram showing the functional configuration of the image sensor 100. An analog multiplexer 411 sequentially selects the 16 PDs 104 forming a unit group 131 and outputs their respective pixel signals to the output wiring 309 provided for that unit group 131. The multiplexer 411 is formed on the imaging chip 113 together with the PDs 104.

The pixel signals output via the multiplexer 411 undergo correlated double sampling (CDS) and analog/digital (A/D) conversion in a signal processing circuit 412 formed on the signal processing chip 111. The A/D-converted pixel signals are passed to a demultiplexer 413 and stored in pixel memories 414 corresponding to the respective pixels. Each pixel memory 414 has a capacity capable of storing the pixel signals corresponding to the maximum accumulation count described later. The demultiplexer 413 and the pixel memories 414 are formed on the memory chip 112.

An arithmetic circuit 415 processes the pixel signals stored in the pixel memories 414 and passes them to the image processing unit at the subsequent stage. The arithmetic circuit 415 may be provided on the signal processing chip 111 or on the memory chip 112. The figure shows the connections for one unit group 131, but in practice these connections exist for each unit group 131 and operate in parallel. The arithmetic circuit 415, however, need not exist for each unit group 131; for example, a single arithmetic circuit 415 may process sequentially while referring in order to the values of the pixel memories 414 corresponding to the respective unit groups 131.
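As a rough sketch of the readout chain just described (multiplexer → CDS and A/D conversion → demultiplexer → pixel memory), correlated double sampling can be modeled as subtracting the signal sample from the reset sample for each sequentially selected pixel. All values and names here are hypothetical illustrations, not the disclosed circuit behavior.

```python
def cds_ad(reset_sample, signal_sample, lsb=4):
    """Correlated double sampling followed by a crude A/D quantization."""
    return (reset_sample - signal_sample) // lsb

# 16 pixels of one unit group, selected in order by the multiplexer;
# each entry is (reset potential, signal potential) in arbitrary units.
samples = [(1000, 1000 - 4 * v) for v in range(16)]

pixel_memory = {}                        # demultiplexer target, one slot per pixel
for i, (r, s) in enumerate(samples):     # multiplexer selects pixels in order
    pixel_memory[i] = cds_ad(r, s)

print(pixel_memory[3], pixel_memory[15])   # -> 3 15
```

Because the reset level is subtracted per pixel, fixed offsets in the source follower cancel out, which is the usual motivation for CDS.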
As described above, an output wiring 309 is provided for each unit group 131. Since the image sensor 100 is formed by stacking the imaging chip 113, the signal processing chip 111, and the memory chip 112, these output wirings 309 can use the inter-chip electrical connections employing the bumps 109, so that the wiring can be routed without enlarging the chips in the plane direction.
Fig. 5 is a block diagram showing the configuration of the imaging apparatus of the present embodiment. The imaging apparatus 500 has a photographic lens 520 as a photographic optical system, and the photographic lens 520 guides a subject light flux incident along the optical axis OA to the image sensor 100. The photographic lens 520 may be an interchangeable lens that can be attached to and detached from the imaging apparatus 500. The imaging apparatus 500 mainly comprises the image sensor 100, a system control unit 501, a drive unit 502, a photometry unit 503, a work memory 504, a recording unit 505, and a display unit 506.

The photographic lens 520 is composed of a plurality of optical lens groups and forms an image of the subject light flux from the scene near its focal plane. In Fig. 1, the photographic lens 520 is represented by a single imaginary lens arranged near the pupil. The drive unit 502 is a control circuit that executes charge accumulation control of the image sensor 100, such as timing control and region control, in accordance with instructions from the system control unit 501. In that respect, the drive unit 502 serves as an image sensor control unit that causes the image sensor 100 to execute charge accumulation and to output pixel signals.

The image sensor 100 passes the pixel signals to an image processing unit 511 of the system control unit 501. The image processing unit 511 performs various kinds of image processing using the work memory 504 as a work area and generates image data. For example, when generating image data in the JPEG file format, a color image signal is generated from the signal obtained in the Bayer array, and compression processing is then executed. The generated image data is recorded in the recording unit 505 and is also converted into a display signal and displayed on the display unit 506 for a preset time.

The photometry unit 503 detects the luminance distribution of the scene prior to the series of photographing steps for generating image data. The photometry unit 503 includes an AE sensor of, for example, about one million pixels. An arithmetic unit 512 of the system control unit 501 receives the output of the photometry unit 503 and calculates the luminance of each region of the scene. The arithmetic unit 512 determines the shutter speed, aperture value, and ISO sensitivity in accordance with the calculated luminance distribution. The image sensor 100 may also double as the photometry unit 503. The arithmetic unit 512 also executes various computations for operating the imaging apparatus 500.

The drive unit 502 may be partly or entirely mounted on the imaging chip 113, or partly or entirely mounted on the signal processing chip 111. A part of the system control unit 501 may be mounted on the imaging chip 113 or on the signal processing chip 111.
Fig. 6 is a functional block diagram of the image processing unit. In addition to the functions described above, the image processing unit 511 has a subject estimation unit 150, a group selection unit 152, a moving image generation unit 154, and a moving image synthesis unit 156. These functions will be described later.

Fig. 7 is a flowchart showing the operation by which the imaging apparatus generates and records a moving image. Figs. 8 and 9 show examples of images captured by the image sensor. Fig. 10 shows the relation between the frame rates and the output timings of the image signals.
The operation of Fig. 7 starts when the user presses a record button or otherwise instructs the imaging apparatus 500 to generate a moving image. First, the subject estimation unit 150 causes the drive unit 502 to acquire image data based on the image signal from the image sensor 100, and estimates the main subject included in the image represented by this image data (S100).

In this case, the drive unit 502 preferably causes image signals to be output from unit groups 131 covering the whole imaging area, for example from all the unit groups 131. The drive unit 502 may cause image signals to be output from all the pixels included in each unit group 131, or from the pixels remaining after thinning at a predetermined thinning rate. The subject estimation unit 150 compares a plurality of images obtained in time series from the image sensor 100 and identifies a moving subject as the main subject. Other methods may also be used for estimating the main subject.

For example, when the subject estimation unit 150 obtains the image 170 of Fig. 8 and the image 178 of Fig. 9 from the image sensor 100 as temporally preceding and succeeding images, it identifies the child as the main subject 171 from their difference. The ruled lines in the images 170 and 178 represent the boundaries of the unit groups 131, but the number of unit groups 131 is merely illustrative and is not limited to the number shown in these figures.
The group selection unit 152 selects at least one unit group 131 on which the image light of the main subject 171 estimated by the subject estimation unit 150 is incident (S102). For example, in the image 170, the unit groups 131 each including at least a part of the main subject 171 are selected. Furthermore, considering that the main subject 171 may move within the imaging area, the group selection unit 152 preferably also selects the unit groups 131 surrounding the unit groups 131 that include at least a part of the main subject 171.

The group selection unit 152 sets the set of these selected unit groups 131 as an attention area 172. It further sets the set composed of the unit groups 131 of the whole imaging area that are not included in the attention area 172 as a peripheral area 176. The group selection unit 152 specifies area information 174 representing the range of the attention area 172 relative to the whole imaging area.

In the example shown in Fig. 8, the attention area 172 is a rectangular area composed of a total of 28 unit groups 131, 7 horizontally by 4 vertically. In contrast, the peripheral area 176 is composed of the 98 unit groups 131 that remain after removing the attention area 172 from the 126 unit groups 131 of the imaging area, which is 21 horizontally by 6 vertically. As the area information 174, the position (9, 2) of the unit group 131 at the upper left end of the attention area 172 in the figure is specified, counted from the left end and the upper end of the imaging area. As the size information, the horizontal-by-vertical count of the attention area 172, 7 × 4, is specified.
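The counts in this example can be verified with a short calculation. The 1-based, column-first coordinate convention below matches the position (9, 2) given above; the function and variable names are invented for this sketch.

```python
# Whole imaging area: 21 x 6 unit groups; attention area: 7 x 4 groups
# with its upper-left group at position (9, 2), counted from 1.
GRID_W, GRID_H = 21, 6
AREA_X, AREA_Y = 9, 2          # area information 174 (1-based)
AREA_W, AREA_H = 7, 4          # size information

def in_attention(x, y):
    return AREA_X <= x < AREA_X + AREA_W and AREA_Y <= y < AREA_Y + AREA_H

attention = [(x, y) for y in range(1, GRID_H + 1)
                     for x in range(1, GRID_W + 1) if in_attention(x, y)]
peripheral_count = GRID_W * GRID_H - len(attention)
print(len(attention), peripheral_count)   # -> 28 98
```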
The group selection unit 152 passes to the drive unit 502 the information specifying the unit groups 131 included in the attention area 172 and the information specifying the peripheral area 176. Together with this, information on the frame rates to be applied to the attention area 172 and to the peripheral area 176, respectively, is transmitted. Here, the frame rate applied to the attention area 172 is preferably higher than the frame rate applied to the peripheral area 176. For example, when the frame rate applied to the peripheral area 176 is 60 fps, the frame rate applied to the attention area 172 is set to 180 fps. The values of these frame rates are preferably set in advance and stored so as to be referable by the group selection unit 152, but the values may also be modified later by the user.
The drive unit 502 drives the image sensor 100 to capture images at the respective frame rates (S104). That is, the drive unit 502 causes the unit groups 131 included in the attention area 172 to execute charge accumulation and image signal output at the high frame rate, and causes the unit groups 131 included in the peripheral area 176 to execute charge accumulation and image signal output at the low frame rate. In other words, during the period in which the drive unit 502 acquires the image signal corresponding to one frame from the unit groups 131 included in the peripheral area 176, it acquires from the unit groups 131 included in the attention area 172 the image signals corresponding to a plurality of frames arranged in time series.

For example, when the frame rate of the peripheral area 176 is 60 fps and the frame rate of the attention area 172 is set to 180 fps, as shown in Fig. 10, during the 1/60 s in which the image signal of one frame B1 is acquired from the peripheral area 176, the drive unit 502 acquires the image signals of three frames A1, A2, A3 from the attention area 172 (1/60 s = 3 × 1/180 s). In this case, the drive unit 502 acquires the image signals at the different frame rates by separately driving the set of reset transistors 303, transfer transistors 302, and select transistors 305 of the unit groups 131 included in the peripheral area 176 and the corresponding set of the unit groups 131 included in the attention area 172.
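The 3:1 relation between the two frame streams can be made concrete with exact rational arithmetic. This is an illustrative timing sketch under the 60 fps / 180 fps example above, not the disclosed drive logic.

```python
from fractions import Fraction

# Peripheral area at 60 fps, attention area at 180 fps: during one
# peripheral frame period (B1), three attention frames (A1, A2, A3)
# are output.
def frame_times(fps, duration):
    period = Fraction(1, fps)
    t, times = Fraction(0), []
    while t < duration:
        times.append(t)
        t += period
    return times

one_b_period = Fraction(1, 60)
a_times = frame_times(180, one_b_period)   # attention frame start times
b_times = frame_times(60, one_b_period)    # peripheral frame start times
print(len(a_times), len(b_times))          # -> 3 1
```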
Fig. 10 shows the output timing of the image signals but does not show the length of the exposure time. The drive unit 502 drives the above-mentioned sets of transistors for the peripheral area 176 and the attention area 172 so as to achieve the exposure times calculated in advance by the arithmetic unit 512.

The length of the exposure time may also be changed according to the frame rate. For example, in the example shown in Fig. 10, the exposure time of one frame of the peripheral area 176 may be set in advance to 1/3 so that it becomes substantially the same as the exposure time of the attention area 172. Alternatively, after the image signal is output, the image signal may be corrected by the frame-rate ratio. Furthermore, the output timings of the image signals of the peripheral area 176 and the attention area 172 need not be synchronous as in Fig. 10 and may be asynchronous.

The image processing unit 511 sequentially stores the image signals from the attention area 172 into a predetermined storage area of the work memory 504 frame by frame (S106). Similarly, the image processing unit 511 sequentially stores the image signals from the peripheral area 176 into a predetermined storage area of the work memory 504 frame by frame (same step).
The moving image generation unit 154 reads the image signals of the attention area 172 stored in the work memory 504 (S108) and generates data of an attention area moving image including a plurality of frames of the attention area 172 (S110). Similarly, the moving image generation unit 154 reads the image signals of the peripheral area 176 stored in the work memory 504 and generates data of a peripheral area moving image including a plurality of frames of the peripheral area 176 (same step). Here, the attention area moving image and the peripheral area moving image may each be generated in a general-purpose format such as MPEG so as to be individually reproducible, or may be generated in a dedicated format that cannot be reproduced without the synthesis processing described later.

Fig. 11 schematically shows the attention area moving image and the peripheral area moving image generated by the moving image generation unit. The moving image generation unit 154 generates the attention area moving image at a frame rate corresponding to the frame rate at which the drive unit 502 drove the attention area 172. In the example shown in Fig. 11, the attention area moving image is generated at 180 fps, the same frame rate at which the drive unit 502 drove the attention area 172.

Similarly, the moving image generation unit 154 generates the peripheral area moving image at a frame rate corresponding to the frame rate at which the drive unit 502 drove the peripheral area 176. In the example shown in Fig. 11, the peripheral area moving image is generated at 60 fps, the same frame rate at which the drive unit 502 drove the peripheral area 176. In the peripheral area moving image, the region corresponding to the attention area 172 has no valid values and is indicated by hatching in the figure.

The moving image generation unit 154 then adds header information to the attention area moving image and the peripheral area moving image and records their data in the recording unit 505 (S112). The header information includes: area information representing the position of the attention area 172 relative to the whole imaging area, size information representing the size of the attention area 172, and timing information representing the relation between the output timing of the image signal of the attention area 172 and the output timing of the image signal of the peripheral area 176.
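The header fields listed above can be pictured as a simple record. The field names and types below are hypothetical and merely mirror the enumerated contents; the actual recorded format is not specified here.

```python
from dataclasses import dataclass

# Illustrative container for the header information of Fig. 12.
@dataclass
class MovieHeader:
    attention_movie_id: str      # ID specifying the attention area movie
    attention_fps: int
    peripheral_movie_id: str     # ID of the corresponding peripheral movie
    peripheral_fps: int
    timing_offset_s: float       # relation between the two output timings
    area_info: tuple             # (x, y) of the upper-left unit group
    size_info: tuple             # (columns, rows) of unit groups

hdr = MovieHeader("A-0001", 180, "B-0001", 60, 0.0, (9, 2), (7, 4))
print(hdr.attention_fps // hdr.peripheral_fps)   # frames of A per frame of B -> 3
```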
The system control unit 501 judges whether to perform photographing for the next unit time (S114). Whether to perform photographing for the next unit time is judged by whether the user is still pressing the moving image record button at that moment. When photographing for the next unit time is to be performed (S114: YES), the process returns to the above step S102; when it is not (S114: NO), this operation ends.

Here, the "unit time" is a time preset in the system control unit 501 and is on the order of several seconds. The storage capacity used in step S106 is determined from this unit time, the frame rate and number of unit groups of the attention area 172, and the frame rate and number of unit groups of the peripheral area 176. Based on these pieces of information, the area for storing the data of the attention area 172 and the area for storing the data of the peripheral area 176 within this storage capacity are also determined.
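The buffer sizing just described can be sketched as a product of the listed factors. The pixels-per-group and bytes-per-pixel figures are assumptions for the example, not values from the disclosure.

```python
# Rough buffer sizing for one unit time: frames x unit groups x pixels
# per group x bytes per pixel. All constants are illustrative.
PIXELS_PER_GROUP = 16
BYTES_PER_PIXEL = 2

def buffer_bytes(unit_time_s, fps, group_count):
    frames = unit_time_s * fps
    return frames * group_count * PIXELS_PER_GROUP * BYTES_PER_PIXEL

attention = buffer_bytes(2, 180, 28)    # 2 s unit time, 180 fps, 28 groups
peripheral = buffer_bytes(2, 60, 98)    # 2 s unit time, 60 fps, 98 groups
print(attention, peripheral)   # -> 322560 376320
```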
According to the above, image signals can be acquired at a high frame rate from the attention area 172 including the main subject 171 while the peripheral area 176 is kept at a low frame rate, so that the data amount can be reduced. As a result, compared with high-speed readout from all the pixels, the load of driving and image processing can be reduced, and power consumption and heat generation can be suppressed.

When the next unit time starts in the example shown in Fig. 7, the unit groups 131 are reselected in step S102, and the area information and the size information are updated. Thereby, the attention area 172 can be updated successively so as to follow the main subject 171. In the example shown in Fig. 11, for the first frame A7 of the unit time in the attention area moving image, an attention area 182 composed of unit groups 131 different from those of the last frame A6 of the previous unit time is selected, and the area information 184 and the peripheral area 186 are updated accordingly.

Fig. 12 shows an example of the header information added by the moving image generation unit. The header information of Fig. 12 includes: an attention area moving image ID specifying the attention area moving image, the frame rate of the attention area moving image, a peripheral area moving image ID specifying the peripheral area moving image corresponding to this attention area moving image, the frame rate of the peripheral area moving image, the timing information, the area information, and the size information. These pieces of header information may be added to either one of the attention area moving image and the peripheral area moving image, or to both.
Fig. 13 is a flowchart showing the operation by which the imaging apparatus reproduces and displays the moving image. This operation starts when the user specifies one of the attention area moving images displayed as thumbnails on the display unit 506 and presses a reproduction button.

The moving image synthesis unit 156 reads the data of the attention area moving image specified by the user from the recording unit 505 (S150). The moving image synthesis unit 156 then reads the data of the peripheral area moving image corresponding to this attention area moving image from the recording unit 505 (S152).

In this case, the moving image synthesis unit 156 identifies the peripheral area moving image by the peripheral area moving image ID indicated in the header information of the attention area moving image read in step S150. Alternatively, the peripheral area moving image whose header information includes the same timing information as that indicated in the header information may be retrieved and identified.

In the above example, the header information is included in the attention area moving image. Conversely, when the header information is not included in the attention area moving image but is included in the peripheral area moving image, the user may first specify and read the peripheral area moving image in step S150, and the attention area moving image may be identified from its header information and read in step S152.
The moving image synthesis unit 156 synthesizes a frame of the display moving image using a frame of the attention area moving image and a frame of the peripheral area moving image (S154). Specifically, the first frame A1 of the attention area moving image is first embedded at the position indicated by the area information 174 in the first frame B1 of the peripheral area moving image, thereby synthesizing the first frame C1 of the display moving image. As shown in Fig. 11, the moving image synthesis unit 156 causes the first frame C1 of the display moving image to be displayed on the display unit 506 (S156).

The moving image synthesis unit 156 judges whether a next frame of the attention area moving image exists before the next frame B2 of the peripheral area moving image (S158). When a next frame of the attention area moving image exists (S158: YES), the moving image synthesis unit 156 updates the attention area 172 with the subsequent frames A2, A3 while keeping the peripheral area 176 at the previous frame B1, thereby synthesizing the subsequent frames C2, C3 of the display moving image (S162) and displaying them successively (S156).

On the other hand, when no next frame of the attention area moving image exists before the next frame B2 of the peripheral area moving image in step S158, the moving image synthesis unit 156 updates the attention area 172 with the next frame A4 and also updates the peripheral area 176 with the next frame B2 (S164), thereby synthesizing the next frame C4 of the display moving image (S162) and displaying it (S156).

As long as a next frame of the peripheral area 176 exists in the peripheral area moving image (S160: YES), steps S154 to S160 are repeated. When no next frame of the peripheral area 176 exists in the peripheral area moving image (S160: NO), the moving image synthesis unit 156 retrieves whether a pair of an attention area moving image and a peripheral area moving image for the next unit time exists (S166). For example, the moving image synthesis unit 156 retrieves, within the same folder of the recording unit 505, whether there exists an attention area moving image whose header information includes timing information representing the timing immediately following the timing indicated in the timing information of this attention area moving image.

As long as a pair of an attention area moving image and a peripheral area moving image for the next unit time exists (S166: YES), steps S150 to S166 are repeated. When no such pair exists (S166: NO), this operation ends.
According to the above, the overall data amount can be reduced while a smooth moving image is displayed for the attention area 172 including the main subject 171. In step S162, the attention area 172 is directly updated to the next frame within the frame of the synthesized display image, but the synthesis method is not limited to this. As another example, the outline of the main subject 171 within the attention area 172 may be identified by image processing, only the main subject 171 surrounded by that outline may be updated to the next frame, and the region within the attention area 172 but outside the outline of the main subject 171 may be kept at the previous frame and synthesized with the frame of the peripheral area 176. That is, the region within the attention area 172 but outside the outline may be matched to the frame rate of the peripheral area 176. Thereby, the boundary between different degrees of smoothness in the displayed moving image can be prevented from appearing unnatural. Furthermore, the reproduction frame rates need not be the same as the frame rates at the time of photographing (180 fps for the attention area, 60 fps for the peripheral area); for example, the attention area may be reproduced at 60 fps and the peripheral area at 20 fps. In that case, slow-motion reproduction results.
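The frame pairing of steps S154 to S164 can be sketched as a simple index mapping: each peripheral frame B is reused for three successive display frames C while the attention frame A advances every time. This is an illustrative sketch under the 180 fps / 60 fps example; the function name and labels are invented.

```python
# With attention frames at 180 fps and peripheral frames at 60 fps,
# display frame C_k combines attention frame A_k with peripheral frame
# B_(k // 3), so B1 backs C1..C3, B2 backs C4..C6, and so on.
def display_pairs(n_attention, ratio=3):
    return [(f"A{i + 1}", f"B{i // ratio + 1}") for i in range(n_attention)]

pairs = display_pairs(6)
print(pairs[:4])   # -> [('A1', 'B1'), ('A2', 'B1'), ('A3', 'B1'), ('A4', 'B2')]
```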
Fig. 14 is a flowchart showing another example of the operation by which the imaging apparatus generates and records a moving image. In Fig. 14, operations identical to those of Fig. 7 are given the same reference numerals and their description is omitted.

In the operation of Fig. 14, the thinning rates of the attention area 172 and the peripheral area 176 are made different instead of, or in addition to, the frame rates of Fig. 7. More specifically, in step S120, the drive unit 502 causes the unit groups 131 included in the attention area 172 to execute charge accumulation and image signal output with the pixels thinned at a low thinning rate, and causes the unit groups 131 included in the peripheral area 176 to execute charge accumulation and image signal output with the pixels thinned at a high thinning rate. For example, the thinning rate of the unit groups 131 included in the attention area 172 is set to 0, that is, all the pixels are read, and the thinning rate of the unit groups 131 included in the peripheral area 176 is set to 0.5, that is, half the pixels are read.

In this case, the drive unit 502 separately drives the set of reset transistors 303, transfer transistors 302, and select transistors 305 of the unit groups 131 included in the peripheral area 176 and the corresponding set of the unit groups 131 included in the attention area 172, thereby acquiring image signals at the different thinning rates.

In step S110, the moving image generation unit 154 generates the attention area moving image corresponding to the attention area 172 based on the image signals of the attention area 172 output at the low thinning rate. The moving image generation unit 154 similarly generates the peripheral area moving image corresponding to the peripheral area 176 based on the image signals of the peripheral area 176 output at the high thinning rate. In step S112, the moving image generation unit 154 adds the respective thinning rate information and records the attention area moving image and the peripheral area moving image in the recording unit 505.
Fig. 15 shows an example of the pixels 188 read from one unit group at a thinning rate of 0.5. In the example shown in Fig. 15, when the unit group 132 of the peripheral area 176 is a Bayer array, the pixels 188 to be read and the pixels not to be read are set alternately every other Bayer-array unit in the vertical direction, that is, every two rows when viewed in pixel units. Thereby, thinned readout can be performed without disturbing the color balance.
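The Bayer-safe vertical thinning of Fig. 15 amounts to keeping rows in pairs, so that each kept pair still contains one complete Bayer unit. A minimal sketch, assuming 0-based row indices:

```python
# Keep two pixel rows (one full Bayer unit), skip the next two, and so
# on: this realizes a 0.5 vertical thinning rate without breaking up
# any Bayer unit, so the color balance of the read rows is preserved.
def rows_to_read(n_rows):
    return [r for r in range(n_rows) if (r // 2) % 2 == 0]

print(rows_to_read(8))   # -> [0, 1, 4, 5]
```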
Fig. 16 corresponds to Fig. 14 and is a flowchart showing the operation by which the imaging apparatus reproduces and displays the moving image. In Fig. 16, operations identical to those of Fig. 13 are given the same reference numerals and their description is omitted.

In step S170 of Fig. 16, the moving image synthesis unit 156 interpolates the pixels of the frame of the peripheral area moving image so that its resolution matches the resolution of the frame of the attention area moving image, and then embeds the frame of the attention area moving image into the frame of the peripheral area moving image, thereby synthesizing the frame of the display image. Thereby, image signals can be acquired at high resolution from the attention area 172 including the main subject 171 while the peripheral area 176 is kept at low resolution, so that the data amount can be reduced. As a result, compared with high-speed readout from all the pixels, the load of driving and image processing can be reduced, and power consumption and heat generation can be suppressed.
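Step S170 can be sketched with the simplest possible interpolation: duplicating the thinned peripheral rows to restore the row count before embedding the full-resolution attention frame. Nearest-neighbor duplication is an assumption for the sketch; the disclosure only requires that the resolutions be matched by interpolation. Frames are plain lists of rows for illustration.

```python
# Restore the row count of a 0.5-thinned peripheral frame by row
# duplication, then embed the attention frame at its area position.
def upscale_rows(frame, factor=2):
    return [row for row in frame for _ in range(factor)]

def embed(peripheral, attention, top):
    out = list(peripheral)
    for i, row in enumerate(attention):
        out[top + i] = row
    return out

peri = [["p"] * 4, ["q"] * 4]    # 2 thinned peripheral rows
att = [["A"] * 4]                # 1 full-resolution attention row
frame = embed(upscale_rows(peri), att, top=1)
print([row[0] for row in frame])   # -> ['p', 'A', 'q', 'q']
```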
In the examples shown in Figs. 1 to 16, the attention area 172 is rectangular, but the shape of the attention area 172 is not limited to this. As long as the attention area 172 follows the boundary lines of the unit groups 131, it may be a convex polygon, a concave polygon, or an annular shape with the peripheral area 176 embedded in its center. A plurality of attention areas 172 may also be provided spaced apart from each other. In that case, different frame rates may be set for the respective attention areas 172.

The frame rates of the attention area 172 and the peripheral area 176 may be variable. For example, the movement amount of the main subject 171 may be detected each unit time, and a higher frame rate may be set for the attention area 172 as the movement amount of the main subject 171 becomes larger. Furthermore, within the unit time, the selection of the unit groups 131 to be included in the attention area 172 may be updated at any time so as to follow the main subject 171.

The generation of the moving image in Figs. 7 and 14 is started by the user pressing the record button, and the reproduction of the moving image in Figs. 13 and 16 is started by the user pressing the reproduction button, but the start timings are not limited to these. As another example, the moving image generation operation and reproduction operation may be executed continuously in response to the user's button operation, so that the display unit 506 performs live view image display (also called live view display). In this case, a display allowing the user to recognize the attention area 172 may be superimposed. For example, a frame may be displayed at the boundary of the attention area 172 on the display unit 506, or the luminance of the peripheral area 176 may be lowered, or the luminance of the attention area 172 may be raised.
In the operation of Fig. 14, the thinning rates are made different between the attention area 172 and the peripheral area 176. Instead of making the thinning rates different, the number of rows whose pixel signals are added together may be made different. For example, in the attention area 172 the row count is 1, that is, pixel signals are output without adding adjacent rows, while in the peripheral area 176 the row count is larger than in the attention area 172, for example 2, so that the pixel signals of the same-column pixels of two adjacent rows are added and output. Thereby, as in Fig. 14, the resolution of the attention area 172 can be kept higher than that of the peripheral area 176 while the overall signal amount is reduced. Instead of adding the pixel signals of adjacent rows, the pixel signals of adjacent columns may be added; in that case, the numbers of columns whose pixel signals are added are made different between the attention area 172 and the peripheral area 176. The above addition may also include processing that calculates an average by dividing the added value by the number of added rows or columns.

Instead of providing the moving image synthesis unit 156 in the image processing unit 511 of the imaging apparatus 500, it may be provided in an external display device, for example a PC. The above embodiment is not limited to generating a moving image and is also applicable to generating a still image.

In the above embodiment, the plurality of unit groups 131 are all divided into two areas, the attention area 172 and the peripheral area 176, but the division is not limited to this, and division into three or more areas is also possible. In that case, the unit groups 131 corresponding to the boundary between the attention area 172 and the peripheral area 176 may be set as a boundary area, and this boundary area may be controlled with values intermediate between the values of the control parameters used for the attention area 172 and the values of the control parameters used for the peripheral area 176. Thereby, the boundary between the attention area 172 and the peripheral area 176 can be prevented from appearing unnatural.

The charge accumulation time and the accumulation count may also be made different between the attention area 172 and the peripheral area 176. In that case, the attention area 172 and the peripheral area 176 may be divided based on luminance, and an intermediate area may be provided.
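The row-addition variant just described can be sketched as column-wise binning of adjacent rows, with an optional averaging step. Row counts and sample values are assumed for the example.

```python
# Add (or average) the pixel signals of same-column pixels across n
# adjacent rows: n = 1 in the attention area (no addition), n = 2 in
# the peripheral area, halving its row count and signal amount.
def bin_rows(frame, n, average=False):
    out = []
    for top in range(0, len(frame), n):
        summed = [sum(col) for col in zip(*frame[top:top + n])]
        out.append([s // n for s in summed] if average else summed)
    return out

peripheral = [[10, 20], [30, 40], [50, 60], [70, 80]]
print(bin_rows(peripheral, 2))                 # -> [[40, 60], [120, 140]]
print(bin_rows(peripheral, 2, average=True))   # -> [[20, 30], [60, 70]]
```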
Figure 17 is the key diagram of scene example and Region dividing.(a) of Figure 17 illustrates the scene that the pixel region of shooting chip 113 captures.Specifically, be the scene of the high light subject 603 simultaneously mirroring shade (shadow) subject 601 and middle subject 602 that environment within doors comprises and the room external environment observed in the inner side of window frame 604.When photographing to the larger scene of such light and shade from high light portion to shadow part difference, if capturing element in the past, then when to perform electric charge with high light portion for benchmark and store, produce in shadow part complete black (?collapse れ), when with shadow part be benchmark perform electric charge store time, in high light portion, produce complete white (Bai Fly び).That is, can say to make high light portion and shadow part store output image signal by an electric charge without exception, the dynamic range of larger scene poor relative to light and shade and photodiode is not enough.Therefore, in the present embodiment, be the subregions such as high light portion, shadow part, and it is different from each other to make the electric charge of the photodiode corresponding with regional store number of times by scene partitioning, the essence of seeking dynamic range thus expands.
Part (b) of Fig. 17 shows the region division in the pixel region of the imaging chip 113. The arithmetic unit 512 analyzes the scene of part (a) of Fig. 17 captured by the photometer 503 and divides the pixel region using luminance as the criterion. For example, the system control unit 501 causes the photometer 503 to acquire the scene multiple times while changing the exposure time, and the arithmetic unit 512 determines the dividing lines of the pixel region with reference to the changes in the distributions of the blown-out white regions and the blocked-up black regions. In the example of part (b) of Fig. 17, the arithmetic unit 512 divides the pixel region into three regions: a shadow region 611, an intermediate region 612 and a highlight region 613.
The dividing lines are defined along the boundaries of the unit groups 131. That is, each divided region contains an integer number of groups. The pixels contained in the groups of the same region perform the same number of charge accumulations and pixel signal outputs within the period corresponding to the shutter speed determined by the arithmetic unit 512. Pixels belonging to different regions perform different numbers of charge accumulations and pixel signal outputs.
Fig. 18 is an explanatory diagram of the charge accumulation control performed for each region in the division example of Fig. 17. On receiving a photography preparation instruction from the user, the arithmetic unit 512 determines the shutter speed T0 from the output of the photometer 503. It then divides the pixel region into the shadow region 611, the intermediate region 612 and the highlight region 613 as described above, and decides the number of charge accumulations for each region from its luminance information. The number of charge accumulations is decided such that no pixel saturates in any single charge accumulation. For example, the number of charge accumulations may be decided with the criterion that each accumulation operation accumulates 80% to 90% of the charge that can be accumulated.
Here, the number of charge accumulations of the shadow region 611 is set to one. That is, its single charge accumulation time is made equal to the determined shutter speed T0. The number of charge accumulations of the intermediate region 612 is set to two. That is, its single charge accumulation time is T0/2, and the charge accumulation is repeated twice during the shutter speed T0. The number of charge accumulations of the highlight region 613 is set to four. That is, its single charge accumulation time is T0/4, and the charge accumulation is repeated four times during the shutter speed T0.
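The per-region schedule above (one accumulation for the shadow region, two for the intermediate region, four for the highlight region, all within T0) can be sketched in code. This is an illustrative model only, assuming the readout time between accumulations is negligible; the function and region names are not part of the embodiment:

```python
from fractions import Fraction

def accumulation_schedule(shutter_speed, counts):
    """For each region, split the shutter speed T0 into `count` equal
    accumulation intervals (start, end), ignoring readout time."""
    schedule = {}
    for region, count in counts.items():
        single_time = Fraction(shutter_speed) / count
        # Accumulation k runs from k*single_time to (k+1)*single_time.
        schedule[region] = [(k * single_time, (k + 1) * single_time)
                            for k in range(count)]
    return schedule

T0 = 1  # normalized shutter speed
sched = accumulation_schedule(T0, {"shadow": 1, "intermediate": 2, "highlight": 4})
# The highlight region's first accumulation ends at T0/4, when its
# transfer pulse is applied and the second accumulation starts.
```

A region with a larger accumulation count simply subdivides the same shutter period more finely, which is why all regions finish together at t = T0.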
When a photography instruction is received from the user at time t = 0, the drive unit 502 applies a reset pulse and a transfer pulse to the pixels of the groups belonging to every region. Triggered by these applications, all pixels start charge accumulation.
At time t = T0/4, the drive unit 502 applies a transfer pulse to the pixels of the groups belonging to the highlight region 613. It then applies selection pulses in sequence to the pixels within each group and outputs the respective pixel signals to the output wiring 309. After the pixel signals of all the pixels within the groups have been output, the drive unit 502 again applies a reset pulse and a transfer pulse to the pixels of the groups belonging to the highlight region 613, starting the second charge accumulation.
Since the selective output of the pixel signals takes time, a time difference arises between the end of the first charge accumulation and the start of the second. If this time difference is substantially negligible, then, as described above, the single charge accumulation time is the shutter speed T0 divided by the number of charge accumulations. If it cannot be ignored, the shutter speed T0 is adjusted in consideration of this time, or the single charge accumulation time is made shorter than the shutter speed T0 divided by the number of charge accumulations.
At time t = T0/2, the drive unit 502 applies a transfer pulse to the pixels of the groups belonging to the intermediate region 612 and of the groups belonging to the highlight region 613. It then applies selection pulses in sequence to the pixels within each group and outputs the respective pixel signals to the output wiring 309. After the pixel signals of all the pixels within the groups have been output, the drive unit 502 again applies a reset pulse and a transfer pulse to the pixels of the groups belonging to the intermediate region 612 and of the groups belonging to the highlight region 613, starting the second charge accumulation for the intermediate region 612 and the third charge accumulation for the highlight region 613.
At time t = 3T0/4, the drive unit 502 applies a transfer pulse to the pixels of the groups belonging to the highlight region 613. It then applies selection pulses in sequence to the pixels within each group and outputs the respective pixel signals to the output wiring 309. After the pixel signals of all the pixels within the groups have been output, the drive unit 502 again applies a reset pulse and a transfer pulse to the pixels of the groups belonging to the highlight region 613, starting the fourth charge accumulation.
At time t = T0, the drive unit 502 applies a transfer pulse to the pixels of every region. It then applies selection pulses in sequence to the pixels within each group and outputs the respective pixel signals to the output wiring 309. By the above control, one pixel signal is stored in each pixel memory 414 corresponding to the shadow region 611, two pixel signals are stored in each pixel memory 414 corresponding to the intermediate region 612, and four pixel signals are stored in each pixel memory 414 corresponding to the highlight region 613.
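The pulse sequence of Fig. 18 can be summarized as an event loop over the quarter-period time steps: a region is read out (transfer plus selection pulses) whenever its own accumulation interval has elapsed. The sketch below is a toy simulation under the stated assumptions (four equal time steps, power-of-two accumulation counts); it is not the drive unit's actual implementation:

```python
def drive_sequence(counts, steps=4):
    """Simulate the Fig. 18 control: at each quarter of T0, regions whose
    accumulation interval has just elapsed are read out, yielding one
    stored pixel signal per readout. Returns signals stored per region."""
    stored = {region: 0 for region in counts}
    for step in range(1, steps + 1):           # t = T0/4, T0/2, 3T0/4, T0
        for region, count in counts.items():
            # A region with `count` accumulations is read every steps/count steps.
            if step % (steps // count) == 0:
                stored[region] += 1            # transfer + selection -> one signal
    return stored

signals = drive_sequence({"shadow": 1, "intermediate": 2, "highlight": 4})
```

Running this reproduces the tally stated above: one signal for the shadow region, two for the intermediate region, four for the highlight region.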
These pixel signals are transferred in sequence to the image processing unit 511. The image processing unit 511 generates high-dynamic-range image data from these pixel signals. The concrete processing will be described later.
Fig. 19 is a diagram showing the relation between the accumulation count and the dynamic range. The image processing unit 511 performs integration processing on the pixel signals corresponding to the repeated charge accumulations, and the result forms part of the high-dynamic-range image data.
Taking as reference the dynamic range of a region whose accumulation count is 1, i.e. a region in which one charge accumulation was performed, the dynamic range of a region whose accumulation count is 2, i.e. in which two charge accumulations were performed and their output signals integrated, is expanded by one stop. Similarly, an accumulation count of 4 gives an expansion of two stops, and an accumulation count of 128 gives seven stops. That is, to seek a dynamic range expansion of n stops, it suffices to integrate 2^n output signals.
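The stated relation between accumulation count and dynamic-range expansion is simply a base-2 logarithm, which can be written out as a small sketch (function names are illustrative only):

```python
import math

def stops_gained(accumulation_count):
    """Dynamic-range expansion in stops for an accumulation count of 2**n."""
    return int(math.log2(accumulation_count))

def count_for_stops(n):
    """Accumulation count needed to seek an n-stop expansion."""
    return 2 ** n
```

For example, the 2x, 4x and 128x accumulation counts of the text map to one, two and seven stops respectively.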
Here, so that the image processing unit 511 can identify how many charge accumulations were performed in which divided region, a 3-bit exponent index representing the accumulation count is attached to the image signal. As shown in the figure, the exponent index is assigned in order: 000 for an accumulation count of 1, 001 for 2, ..., and 111 for 128.
The image processing unit 511 refers to the exponent index of each pixel signal received from the arithmetic circuit 415 and, when the referenced value indicates an accumulation count of 2 or more, performs the pixel-signal integration processing. For example, when the accumulation count is 2 (one stop), the high-order 11 bits of the two 12-bit pixel signals, each corresponding to one charge accumulation, are added together to generate one 12-bit pixel signal. Similarly, when the accumulation count is 128 (seven stops), the high-order 5 bits of the 128 12-bit pixel signals, each corresponding to one charge accumulation, are added together to generate one 12-bit pixel signal. That is, the high-order bits obtained by subtracting from 12 the number of stops corresponding to the accumulation count are added together to generate one 12-bit pixel signal. The low-order bits that are not subject to the addition are discarded.
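The high-order-bit addition described here amounts to right-shifting each 12-bit signal by the number of stops before summing, so that the sum again fits in 12 bits. A minimal sketch, assuming power-of-two accumulation counts as in the text:

```python
def integrate(signals):
    """Add together the high-order (12 - stops) bits of each 12-bit pixel
    signal, where stops = log2(len(signals)); the low-order bits that are
    not subject to the addition are discarded. The result fits in 12 bits."""
    stops = len(signals).bit_length() - 1
    assert len(signals) == 1 << stops, "accumulation count must be a power of two"
    total = sum(s >> stops for s in signals)   # keep high-order bits only
    assert total < 1 << 12                     # always a 12-bit result
    return total

# Two 12-bit signals (one stop): the high-order 11 bits are added.
merged = integrate([0b101010101010, 0b010101010101])
```

Note that discarding `stops` low-order bits per signal is exactly what keeps the integrated value within the original 12-bit width regardless of the accumulation count.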
By such processing, the luminance range to which gradation is given shifts toward the high-luminance side in accordance with the accumulation count. That is, the 12 bits are allocated to a limited range on the high-luminance side. Gradation can therefore be given to image regions that would conventionally have been entirely blown out white.
However, since the 12 bits are allocated to different luminance ranges in the different divided regions, the image data cannot be generated simply by connecting and combining the regions as they are. The image processing unit 511 therefore performs requantization processing, with the maximum-luminance pixel and the minimum-luminance pixel as references, so as to preserve the obtained gradation as far as possible while making the whole region 12-bit image data. Specifically, quantization is performed with gamma conversion applied, so that the gradation is preserved more smoothly. By such processing, high-dynamic-range image data can be obtained.
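One way to picture the requantization step is a normalize-then-gamma mapping anchored at the minimum- and maximum-luminance pixels. The sketch below is only an illustration of that idea; the gamma exponent, rounding and function name are assumptions, not taken from the embodiment:

```python
def requantize(values, out_bits=12, gamma=1 / 2.2):
    """Requantize integrated pixel values onto a common out_bits scale,
    with the min/max-luminance pixels as references and a gamma curve
    applied so dark-range gradation is preserved more smoothly."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1                      # avoid division by zero
    full = (1 << out_bits) - 1
    return [round(((v - lo) / span) ** gamma * full) for v in values]

# Integrated values spanning regions with different luminance ranges.
out = requantize([0, 512, 4095 * 4])
```

The anchoring guarantees the darkest pixel maps to 0 and the brightest to 4095, while the gamma curve spends proportionally more of the 12-bit code space on shadows.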
The accumulation count is not limited to being attached to the pixel signal as a 3-bit exponent index as described above; it may also be described as additional information separate from the pixel signal. The exponent index may also be omitted from the pixel signal, in which case the accumulation count is obtained at the time of the addition processing by counting the number of pixel signals stored in the pixel memory 414.
Further, the above image processing performs requantization processing that fits the whole region into 12-bit image data, but the number of output bits may instead be increased relative to the number of bits of the pixel signal in accordance with the upper-limit accumulation count. For example, if the upper-limit accumulation count is set to 16 (four stops), the whole region may be made 16-bit image data for 12-bit pixel signals. Processing in this way allows the image data to be generated without dropping bits.
Next, a series of photographing operation processes is described. Fig. 20 is a flowchart showing the photographing operation processing. The flow starts when the power supply of the imaging apparatus 500 is turned on.
In step S201, the system control unit 501 stands by until the switch SW1 is pressed, giving the photography preparation instruction. When the press of the switch SW1 is detected, the flow proceeds to step S202.
In step S202, the system control unit 501 executes photometry processing. Specifically, it obtains the output of the photometer 503, and the arithmetic unit 512 calculates the luminance distribution of the scene. The flow then proceeds to step S203, where the shutter speed, the region division, the accumulation counts and the like are determined as described above.
When the photography preparation operation has been completed, the flow proceeds to step S204 and stands by until the switch SW2 is pressed, giving the photography instruction. At this point, if the elapsed time exceeds a preset time Tw (YES in step S205), the flow returns to step S201. If the press of the switch SW2 is detected before Tw is exceeded (NO in step S205), the flow proceeds to step S206.
In step S206, the drive unit 502, receiving the instruction of the system control unit 501, executes the charge accumulation processing and the signal readout processing described with Fig. 18. When readout of all the signals has been completed, the flow proceeds to step S207, where the image processing described with Fig. 19 is executed and recording processing is performed to record the generated image data in the recording unit.
When the recording processing has been completed, the flow proceeds to step S208, where it is judged whether the power supply of the imaging apparatus 500 has been turned off. If the power has not been turned off, the flow returns to step S201; if it has been turned off, the series of photographing operation processes ends.
Fig. 21 is a block diagram showing a concrete structure as an example of the signal processing chip 111. In the description of Fig. 4 above, an example was shown in which the demultiplexer 413 and the pixel memory 414 are formed on the memory chip 112; here, an example in which they are formed on the signal processing chip 111 is described.
The signal processing chip 111 bears the function of the drive unit 502. The signal processing chip 111 includes a sensor control unit 441, a block control unit 442, a synchronization control unit 443 and a signal control unit 444 as shared control functions, and a drive control unit 420 that collectively controls these control units. The drive control unit 420 converts instructions from the system control unit 501 into control signals executable by each control unit and delivers them to the respective control units.
The sensor control unit 441 bears the output control of the control pulses, output to the imaging chip 113, relating to the charge accumulation and charge readout of each pixel. Specifically, the sensor control unit 441 controls the start and end of charge accumulation by outputting the reset pulse and the transfer pulse to the target pixels, and causes the pixel signals to be output to the output wiring 309 by outputting the selection pulse to the readout pixels.
The block control unit 442 executes the output of a specifying pulse, output to the imaging chip 113, that specifies the unit groups 131 to be controlled. As described with Fig. 17 and elsewhere, each divided region contains a plurality of mutually adjacent unit groups 131. The unit groups 131 belonging to the same region form one block. The pixels contained in the same block start charge accumulation at the same timing and end charge accumulation at the same timing. The block control unit 442 therefore bears the role of forming the unit groups 131 into blocks by outputting the specifying pulse, based on the designation from the drive control unit 420, to the target unit groups 131. The transfer pulse and the reset pulse that each pixel receives via the TX wiring 307 and the reset wiring 306 are the logical products of the respective pulses output by the sensor control unit 441 and the specifying pulse output by the block control unit 442. By controlling each region as an independent block in this way, the charge accumulation control described with Fig. 18 is achieved. The block-forming designation from the drive control unit will be described in detail later.
The synchronization control unit 443 outputs a synchronizing signal to the imaging chip 113. Each pulse becomes active in the imaging chip 113 in synchronization with this synchronizing signal. For example, by adjusting the synchronizing signal, random control, thinning control and the like, in which only specific pixels among the pixels belonging to the same unit group 131 are made the control targets, are achieved.
The signal control unit 444 mainly bears the timing control for the A/D converter 412b. The pixel signals output via the output wiring 309 are input to the A/D converter 412b via the CDS circuit 412a and the multiplexer 411. The A/D converter 412b is controlled by the signal control unit 444 and converts the input pixel signals into digital signals. The pixel signals converted into digital signals are delivered to the demultiplexer 413 and then stored, as digital-data pixel values, in the pixel memories 414 corresponding to the respective pixels.
The signal processing chip 111 has a timing memory 430 as an accumulation control memory, which stores block classification information about which unit groups 131 are combined to form each block, and accumulation count information about how many times the charge accumulation is repeated for each formed block. The timing memory 430 is constituted by, for example, a flash RAM.
As described above, which unit groups are combined to form a block is determined by the system control unit 501 based on the detection result of the luminance distribution of the scene, detected before the series of photographic processes. The determined blocks are classified as, for example, a first block, a second block and so on, and which unit groups 131 each block contains is specified. The drive control unit 420 receives this block classification information from the system control unit 501 and stores it in the timing memory 430.
The system control unit 501 also determines, based on the detection result of the luminance distribution, how many times each block repeats the charge accumulation. The drive control unit 420 receives this accumulation count information from the system control unit 501 and stores it in the timing memory 430 as a pair with the corresponding block classification information. By storing the block classification information and the accumulation count information in the timing memory 430 in this way, the drive control unit 420 can execute the series of charge accumulation controls independently, referring to the timing memory 430 by itself. That is, once the drive control unit 420 has received the photography instruction signal from the system control unit 501 in the acquisition control of one image, it can complete the accumulation control for each subsequent pixel without receiving an instruction from the system control unit 501 each time.
The drive control unit 420 receives from the system control unit 501 the block classification information and the accumulation count information updated based on the photometry result (the detection result of the luminance distribution) executed in synchronization with the photography preparation instruction, and appropriately updates the stored contents of the timing memory 430. For example, the drive control unit 420 updates the timing memory 430 in synchronization with the photography preparation instruction or the photography instruction. Configured in this way, faster charge accumulation control can be achieved, and while the drive control unit 420 executes the charge accumulation control, the system control unit 501 can execute other processing in parallel.
The drive control unit 420 refers to the timing memory 430 not only in executing the charge accumulation control for the imaging chip 113 but also in executing the readout control. For example, the drive control unit 420 refers to the accumulation count information of each block and stores the pixel signals output from the demultiplexer 413 at the corresponding addresses in the pixel memory 414.
The drive control unit 420 reads the target pixel signals from the pixel memory 414 in accordance with a delivery request from the system control unit 501 and delivers them to the image processing unit 511. As described above, the pixel memory 414 has, for each pixel, a memory space capable of storing the pixel signals corresponding to the maximum accumulation count, and stores, as pixel values, the pixel signals corresponding to each accumulation performed. For example, when the charge accumulation is repeated four times in a certain block, the pixels contained in that block output four pixel signals, so four pixel values are stored in the memory space of each of those pixels in the pixel memory 414. On receiving from the system control unit 501 a delivery request requesting the pixel signal of a specific pixel, the drive control unit 420 designates the address of that specific pixel in the pixel memory 414, reads all the stored pixel signals and delivers them to the image processing unit 511. For example, when four pixel values are stored, all four are delivered in sequence; when only one pixel value is stored, that pixel value is delivered.
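The per-pixel storage and delivery behavior described here can be modeled as an address-keyed store sized for the maximum accumulation count. The class below is a toy sketch only; the class name, addressing scheme and capacity check are assumptions for illustration:

```python
class PixelMemory:
    """Toy model of the pixel memory 414: each pixel address has space for
    up to `max_count` pixel signals; a delivery request returns all stored
    values for the requested pixel in sequence."""

    def __init__(self, max_count=128):
        self.max_count = max_count
        self.cells = {}                        # pixel address -> list of values

    def store(self, address, value):
        values = self.cells.setdefault(address, [])
        if len(values) >= self.max_count:
            raise OverflowError("exceeds maximum accumulation count")
        values.append(value)

    def deliver(self, address):
        return list(self.cells.get(address, []))

mem = PixelMemory()
for v in (101, 102, 103, 104):                 # four accumulations in one block
    mem.store((0, 0), v)
```

A delivery request for pixel (0, 0) then hands over all four stored values in sequence, while a pixel with a single accumulation would yield a one-element list.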
The drive control unit 420 can read the pixel signals stored in the pixel memory 414 out to the arithmetic circuit 415 and cause the arithmetic circuit 415 to execute the above-described integration processing. The pixel signals after the integration processing are stored at the target pixel addresses in the pixel memory 414. The target pixel addresses may be arranged adjacent to the address space before the integration processing, or may be the same addresses, overwriting the pixel signals before the integration processing. A dedicated space that collectively stores the post-integration pixel values of each pixel may also be provided. On receiving from the system control unit 501 a delivery request requesting the pixel signal of a specific pixel, the drive control unit 420 can deliver the post-integration pixel signal to the image processing unit 511 in accordance with the form of that delivery request. Of course, the pixel signals before and after the integration processing may also be delivered together.
The pixel memory 414 is provided with a data transfer interface that transfers the pixel signals in accordance with the delivery request. The data transfer interface is connected to a data line connected to the image processing unit 511. The data line is constituted by, for example, the data bus of a bus. In this case, the delivery request from the system control unit 501 to the drive control unit 420 is executed by address designation utilizing the address bus.
The transfer of the pixel signals by the data transfer interface is not limited to the address-designation scheme; various schemes can be adopted. For example, at the time of data transfer, a double data rate scheme can be adopted, in which both the rising and falling edges of the clock signal with which each circuit is synchronized are used for processing. A burst transfer scheme can also be adopted, which seeks higher speed by omitting part of the steps such as address designation and transferring the data all at once. A bus scheme using circuits in which the control unit, the memory unit and the input/output unit are connected in parallel, a serial scheme that transfers the data serially bit by bit, and the like can also be adopted in combination.
Configured in this way, the image processing unit 511 can receive only the necessary pixel signals, so image processing can be completed at high speed, particularly when forming a low-resolution image. When the arithmetic circuit 415 is made to execute the integration processing, the image processing unit 511 need not execute the integration processing itself, so the image processing can be sped up by function sharing and parallel processing.
In the examples of Figs. 17 to 21 above, the number of bits obtained when digitizing the pixel signals of the attention area 172 is made larger than that of the peripheral area 176 by making the number of charge accumulations and the like different between the attention area 172 and the peripheral area 176. As another method, the number of digitization bits of the attention area 172 and the peripheral area 176 may itself be changed. For example, in accordance with the instruction from the drive unit 502, the A/D circuit of the signal processing circuit 412 may digitize the attention area 172 with a higher number of bits than the peripheral area 176 even when the accumulation count is one for both.
The signal processing chip 111 of Fig. 21 may also be used to perform image processing after pixel signals are obtained in the attention area 172 and the peripheral area 176 using different control parameters. For example, in Figs. 7 to 10, a moving image is generated from the images obtained at different frame rates in the attention area 172 and the peripheral area 176; instead of this, image processing that averages the images obtained at the high frame rate may be performed to improve the S/N ratio. In this case, for example, while the drive control unit 420 obtains one pixel signal from the peripheral area 176, it obtains pixel signals a plurality of times, for example four times, from the attention area 172 and stores them in the pixel memory 414. The arithmetic circuit 415 reads from the pixel memory 414 the plurality of pixel signals obtained for each pixel of the attention area 172 and averages them pixel by pixel. The random noise in each pixel of the attention area 172 is thereby reduced, and the S/N ratio of the attention area 172 can be improved.
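The pixel-by-pixel averaging of the high-frame-rate captures can be sketched as follows. This is an illustrative model under the assumption of frames represented as nested lists; the function name and frame layout are not taken from the embodiment:

```python
def average_attention_frames(frames):
    """Average, pixel by pixel, the frames captured from the attention
    area at the high frame rate; uncorrelated random noise falls roughly
    as the square root of the number of averaged frames."""
    n = len(frames)
    height, width = len(frames[0]), len(frames[0][0])
    return [[sum(f[y][x] for f in frames) / n for x in range(width)]
            for y in range(height)]

# Four high-frame-rate captures of a 1x2 attention area.
avg = average_attention_frames([[[8, 100]], [[12, 98]], [[10, 101]], [[10, 101]]])
```

With four captures per peripheral-area frame, the random noise in each attention-area pixel is reduced by roughly a factor of two relative to a single capture.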
Further, in Figs. 7 to 10, a moving image is generated from the images obtained at different frame rates in the attention area 172 and the peripheral area 176, but the frame rates may also be made different based on the movement speed of the subject. In this case, the subject estimation unit 150 estimates the vertical and horizontal speed of the subject from its change in position between frames. The subject estimation unit 150 also estimates the front-to-back speed of the subject from its change in size between frames. Based on this estimation, the group selection unit 152 identifies the unit groups 131 receiving light from slow or stationary subjects, the unit groups 131 receiving light from medium-speed subjects, and the unit groups 131 receiving light from high-speed subjects.
The drive unit 502 drives the image sensor 100 so that the unit groups 131 receiving light from slow or stationary subjects shoot at a low frame rate, the unit groups 131 receiving light from medium-speed subjects shoot at a medium frame rate, and the unit groups 131 receiving light from high-speed subjects shoot at a high frame rate. An example of the respective frame rates is 60 fps, 120 fps and 240 fps.
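The mapping from estimated subject speed to a per-unit-group frame rate is a simple threshold classification. The sketch below uses the 60/120/240 fps example rates from the text; the speed thresholds and units (e.g. pixels of motion per frame) are assumptions for illustration:

```python
def frame_rate_for_speed(speed, thresholds=(1.0, 5.0), rates=(60, 120, 240)):
    """Pick a frame rate for a unit group from the estimated subject speed:
    slow or stationary -> low rate, medium -> medium rate, fast -> high rate."""
    low, high = thresholds
    if speed < low:
        return rates[0]
    if speed < high:
        return rates[1]
    return rates[2]
```

Each unit group 131 can then be driven at the rate returned for the subject it receives light from, rather than driving the whole sensor at one rate.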
Fig. 22 is a sectional view of another back-side illuminated image sensor 1100 of the present embodiment. The image sensor 1100 has an imaging chip 1113 that outputs pixel signals corresponding to incident light, a signal processing chip 1111 that processes the pixel signals, and a memory chip 1112 that stores the pixel signals. The imaging chip 1113, the signal processing chip 1111 and the memory chip 1112 are stacked together and electrically connected to one another by conductive bumps 1109 of Cu or the like.
As shown in the figure, the incident light is incident mainly in the Z-axis positive direction indicated by the white hollow arrow. In the present embodiment, the surface of the imaging chip 1113 on the side on which the incident light is incident is called the back surface. As shown by the coordinate axes, the leftward direction of the page orthogonal to the Z axis is the X-axis positive direction, and the direction toward the viewer orthogonal to the Z and X axes is the Y-axis positive direction. In some of the subsequent figures, coordinate axes are displayed with the axes of Fig. 22 as the reference so that the orientation of each figure can be understood.
An example of the imaging chip 1113 is a back-side illuminated MOS image sensor. A PD layer 1106 is arranged on the back side of a wiring layer 1108. The PD layer 1106 has a plurality of PDs (photodiodes) 1104 arranged two-dimensionally, which accumulate charge corresponding to the incident light, and transistors 1105 provided corresponding to the PDs 1104.
On the incident-light side of the PD layer 1106, color filters 1102 are provided with a passivation film 1103 interposed. The color filters 1102 are of a plurality of kinds that transmit mutually different wavelength ranges, and have a specific arrangement corresponding to the respective PDs 1104. The arrangement of the color filters 1102 will be described later. One set of a color filter 1102, a PD 1104 and a transistor 1105 forms one pixel.
On the incident-light side of the color filters 1102, microlenses 1101 are provided corresponding to the respective pixels. Each microlens 1101 condenses the incident light toward the corresponding PD 1104.
The wiring layer 1108 has wiring 1107 that transfers the pixel signals from the PD layer 1106 to the signal processing chip 1111. The wiring 1107 may be multilayer, and passive elements and active elements may also be provided.
A plurality of bumps 1109 are arranged on the surface of the wiring layer 1108. The plurality of bumps 1109 are aligned with a plurality of bumps 1109 provided on the opposing surface of the signal processing chip 1111, and by pressing the imaging chip 1113 and the signal processing chip 1111 together, the aligned bumps 1109 are joined to each other and electrically connected.
Similarly, a plurality of bumps 1109 are arranged on the mutually opposing surfaces of the signal processing chip 1111 and the memory chip 1112. These bumps 1109 are aligned with each other, and by pressing the signal processing chip 1111 and the memory chip 1112 together, the aligned bumps 1109 are joined to each other and electrically connected.
The joining between the bumps 1109 is not limited to Cu bump bonding by solid-state diffusion; microbump bonding by solder melting may also be adopted. Further, about one bump 1109, for example, is provided per unit group described later. The size of the bumps 1109 may therefore be larger than the pitch of the PDs 1104. In the peripheral region outside the pixel region in which the pixels are arranged, bumps larger than the bumps 1109 corresponding to the pixel region may also be provided together.
The signal processing chip 1111 has TSVs (through-silicon vias) 1110 that interconnect the circuits provided on its front and back surfaces. The TSVs 1110 are preferably provided in the peripheral region. The TSVs 1110 may also be provided in the peripheral regions of the imaging chip 1113 and the memory chip 1112.
Fig. 23 is an explanatory diagram of the pixel arrangement of the imaging chip 1113 and the unit groups 1131. In particular, it shows the imaging chip 1113 observed from the back side. Twenty million or more pixels are arranged in a matrix in the pixel region. In the example of Fig. 23, 16 adjacent pixels of 4 pixels x 4 pixels form one unit group 1131. The grid lines in the figure illustrate the concept of adjacent pixels being grouped to form the unit groups 1131. The number of pixels forming a unit group 1131 is not limited to this; it may be about 1000, for example 32 pixels x 64 pixels, and it may be more or fewer than this.
As shown in the partial enlarged view of the pixel region, a unit group 1131 contains four so-called Bayer arrays arranged two by two, each consisting of four pixels: green pixels Gb and Gr, a blue pixel B and a red pixel R. The green pixels are pixels having a green filter as the color filter 1102 and receive light in the green wavelength band of the incident light. Similarly, the blue pixel is a pixel having a blue filter as the color filter 1102 and receives light in the blue wavelength band, and the red pixel is a pixel having a red filter as the color filter 1102 and receives light in the red wavelength band.
In the present embodiment, an evaluation value is calculated for each of the plurality of unit groups 1131, and the exposure or readout of the pixels contained in the unit group is controlled with control parameters based on that evaluation value. Examples of the evaluation value are the average of the pixel signals within the unit group 1131, the weighted average of the pixel signals inside and outside the unit group 1131, the contrast within the unit group 1131, the weighted average of the contrast inside and outside the unit group 1131, the luminance within the unit group 1131, the weighted average of the luminance inside and outside the unit group 1131, and the like. Examples of the control parameters are the frame rate, the thinning rate, the number of added rows or added columns over which the pixel signals are added, the charge accumulation time or accumulation count, the number of digitization bits, and the like. The control parameters may also be parameters of the image processing performed after the image signals are obtained from the pixels.
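Two of the example evaluation values named above, the in-group mean and the inside/outside weighted average, can be sketched as follows. The weight value and function name are assumptions for illustration, not taken from the embodiment:

```python
def unit_group_evaluation(pixels, outside_mean, weight=0.7):
    """Two example evaluation values for a unit group 1131: the mean of
    its own pixel signals, and a weighted average of that mean with the
    mean of the pixel signals outside the group."""
    inside_mean = sum(pixels) / len(pixels)
    weighted = weight * inside_mean + (1 - weight) * outside_mean
    return inside_mean, weighted

inside, weighted = unit_group_evaluation([100, 110, 90, 100], outside_mean=200)
```

The weighted form lets the control of one unit group take its surroundings into account, so that, for example, a group bordering a much brighter region is not controlled purely on its own statistics.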
Fig. 24 is a circuit diagram corresponding to a unit group 1131 of the imaging chip 1113. In the figure, the rectangle surrounded by the dotted line representatively shows the circuit corresponding to one pixel. At least part of each transistor described below corresponds to the transistors 1105 of Fig. 22.
As mentioned above, unit group 1131 is formed by 16 pixels.16 PD1104s corresponding with each pixel are connected with transmission transistor 1302 respectively, and the TX of each grid of each transmission transistor 1302 and supply transmission pulse connects up and 1307 to be connected.In the present embodiment, TX wiring 1307 is share to connect relative to 16 transmission transistors 302.
The drain electrode of each transmission transistor 1302 connects with the source electrode of corresponding each reset transistor 1303, and the floating FD that spreads of the what is called between the drain electrode of transmission transistor 1302 with the source electrode of reset transistor 1303 is connected with the grid of amplifier transistor 1304.The drain electrode of reset transistor 1303 and the Vdd of supply line voltage connect up and 1310 to be connected, and the replacement that its grid and supply reset pulse is connected up and 1306 to be connected.In the present embodiment, resetting wiring 1306 relative to 16 reset transistors 1303 is share to connect.
The drain electrode of each amplifier transistor 1304 and the Vdd of supply line voltage connect up and 1310 to be connected.In addition, the source electrode of each amplifier transistor 1304 selects the drain electrode of transistor 1305 to connect with corresponding each.Select each grid of transistor and supply the decoding of strobe pulse and connect up and 1308 to be connected.In the present embodiment, wiring 1308 of decoding selects transistor 1305 independently to arrange relative to 16.And each selects the source electrode of transistor 1305 and shared output to connect up 1309 to be connected.Load current source 1311 supplies electric current to output wiring 1309.That is, for selecting the output of transistor 1305 wiring 1309 to be formed by source follower.In addition, load current source 1311 can be arranged on shooting chip 1113 side, also can be arranged on signal processing chip 1111 side.
Here, the flow from the start of charge accumulation to the pixel output after accumulation ends is described. When a reset pulse is applied to the reset transistors 1303 through the reset wiring 1306 and a transfer pulse is simultaneously applied to the transfer transistors 1302 through the TX wiring 1307, the potentials of the PDs 1104 and the floating diffusions FD are reset.
When the application of the transfer pulse is released, each PD 1104 converts the received incident light into charge and accumulates it. Then, when a transfer pulse is applied again without a reset pulse being applied, the accumulated charge is transferred to the floating diffusion FD, and the potential of the floating diffusion FD changes from the reset potential to the signal potential after charge accumulation. When a select pulse is applied to a select transistor 1305 through the decoder wiring 1308, the change in the signal potential of the floating diffusion FD is conveyed to the output wiring 1309 via the amplifier transistor 1304 and the select transistor 1305. In this way, pixel signals corresponding to the reset potential and the signal potential are output from the unit pixel to the output wiring 1309.
As shown in the figure, in the present embodiment the reset wiring 1306 and the TX wiring 1307 are common to the 16 pixels forming the unit group 1131. That is, the reset pulse and the transfer pulse are each applied simultaneously to all 16 pixels. Accordingly, all the pixels forming the unit group 1131 start charge accumulation at the same timing and end charge accumulation at the same timing. However, the pixel signals corresponding to the accumulated charges are selectively output to the output wiring 1309 by applying select pulses to the respective select transistors 1305 in sequence. The reset wiring 1306, the TX wiring 1307, and the output wiring 1309 are provided separately for each unit group 1131.
By building the circuit on a per-unit-group basis in this way, the charge accumulation time can be controlled for each unit group 1131. In other words, adjacent unit groups 1131 can each output pixel signals obtained with different charge accumulation times. Put yet another way, while one unit group 1131 performs a single charge accumulation, another unit group 1131 can repeat charge accumulation many times and output a pixel signal each time, so that these unit groups 1131 can output the frames of a moving image at mutually different frame rates.
Figure 25 is a block diagram showing the structure of the imaging device of the present embodiment. The imaging device 1500 has a photographic lens 1520 as a photographic optical system, and the photographic lens 1520 guides a subject light beam incident along the optical axis OA to the image sensor 1100. The photographic lens 1520 may be an interchangeable lens that can be attached to and detached from the imaging device 1500. The imaging device 1500 mainly comprises the image sensor 1100, a system control unit 1501, a drive unit 1502, a photometry unit 1503, a working memory 1504, a recording unit 1505, and a display unit 1506.
The photographic lens 1520 is composed of multiple groups of optical lenses and forms an image of the subject light beam from the scene near its focal plane. In Figure 25 this photographic lens 1520 is represented by a single imaginary lens arranged near the pupil. The drive unit 1502 is a control circuit that executes charge accumulation control of the image sensor 1100, such as timing control and region control, according to instructions from the system control unit 1501.
The image sensor 1100 delivers pixel signals to the image processing unit 1511 of the system control unit 1501. The image processing unit 1511 applies various kinds of image processing, using the working memory 1504 as a work area, to generate image data. For example, when generating image data in JPEG file format, a color image signal is generated from the signals obtained with the Bayer array, and compression processing is then performed. The generated image data is recorded in the recording unit 1505, and is also converted into a display signal and displayed on the display unit 1506 for a preset time.
The photometry unit 1503 detects the luminance distribution of the scene before the series of photographic steps that generate the image data. The photometry unit 1503 includes an AE sensor of, for example, about one million pixels. The arithmetic unit 1512 of the system control unit 1501 receives the output of the photometry unit 1503 and calculates the luminance of each region of the scene. The arithmetic unit 1512 decides the shutter speed, aperture value, and ISO sensitivity according to the calculated luminance distribution. The image sensor 1100 may also double as the photometry unit 1503. The arithmetic unit 1512 additionally executes the various computations needed to operate the imaging device 1500.
Part or all of the drive unit 1502 may be mounted on the imaging chip 1113, or part or all of it may be mounted on the signal processing chip 1111. Part of the system control unit 1501 may likewise be mounted on the imaging chip 1113 or the signal processing chip 1111.
Figure 26 is a block diagram showing a concrete structure as one example of the signal processing chip 1111. The signal processing chip 1111 bears the functions of the drive unit 1502.
The signal processing chip 1111 includes, as shared control functions, a sensor control unit 1441, a block control unit 1442, a synchronization control unit 1443, a signal control unit 1444, and individual circuit units 1450A etc., together with a drive control unit 1420 that exercises overall control over these control units. The signal processing chip 1111 also includes an I/F circuit 1418 between the drive control unit 1420 and the system control unit 1501 of the imaging device 1500 body. One each of the sensor control unit 1441, the block control unit 1442, the synchronization control unit 1443, the signal control unit 1444, and the drive control unit 1420 is provided per signal processing chip 1111.
On the other hand, individual circuit units 1450A, 1450B, 1450C, 1450D, and 1450E are provided for the respective unit groups 1131A, 1131B, 1131C, 1131D, and 1131E. Since the individual circuit units 1450A, 1450B, 1450C, 1450D, and 1450E have the same structure, the individual circuit unit 1450A is described below. The individual circuit unit 1450A includes a CDS circuit 1410, a multiplexer 1411, an A/D conversion circuit 1412, a demultiplexer 1413, a pixel memory 1414, and an arithmetic circuit 1415. The arithmetic circuit 1415 exchanges signals with the system control unit 1501 via the I/F circuit 1418.
The individual circuit unit 1450A is preferably arranged in a region overlapping the region in which the pixels of the corresponding unit group 1131A are arranged. This allows an individual circuit unit 1450A to be provided separately for each of the multiple unit groups 1131A etc. without enlarging the chips in the planar direction.
The drive control unit 1420 converts instructions from the system control unit 1501 into control signals executable by each control unit, referring to the timing memory 1430, and delivers them to each control unit. In particular, when the unit groups 1131A etc. are each controlled with individual control parameters, the drive control unit 1420 delivers those control parameters to each control unit together with information identifying the unit group 1131A etc. Once the drive control unit 1420 receives a photography-instruction signal from the system control unit 1501, it can carry out the subsequent accumulation control of each pixel through to the acquisition of one image without receiving an instruction from the system control unit 1501 each time.
The sensor control unit 1441 bears the output control of the control pulses output to the imaging chip 1113 that relate to the charge accumulation and charge readout of each pixel. Specifically, the sensor control unit 1441 controls the start and end of charge accumulation by outputting reset pulses and transfer pulses to the target pixels, and causes pixel signals to be output to the output wiring 1309 by outputting select pulses to the readout pixels.
The block control unit 1442 executes the output, to the imaging chip 1113, of specifying pulses that identify the unit group 1131 to be controlled. The transfer pulse and reset pulse that each pixel receives via the TX wiring 1307 and the reset wiring 1306 are the logical AND of the respective pulses output by the sensor control unit 1441 and the specifying pulse output by the block control unit 1442. In this way, each region can be controlled as an independent block.
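The logical AND described above can be sketched in a few lines of Python. This is a behavioral model only, under the assumption that pulses are modeled as booleans; the function names are hypothetical.

```python
def effective_pulse(sensor_pulse: bool, block_select: bool) -> bool:
    """A pixel sees a reset/transfer pulse only when the sensor control
    unit asserts it AND the block control unit selects that pixel's
    unit group (logical AND of the two pulse trains)."""
    return sensor_pulse and block_select

def pulses_for_blocks(sensor_pulse, selected_blocks, all_blocks):
    """Which blocks actually receive the pulse this cycle."""
    return {b: effective_pulse(sensor_pulse, b in selected_blocks)
            for b in all_blocks}
```

With a sensor pulse asserted and only blocks "A" and "C" selected, only those blocks' pixels would start or end accumulation, which is the per-block independence the text describes.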
The synchronization control unit 1443 outputs a synchronizing signal to the imaging chip 1113. Each pulse becomes effective in the imaging chip 1113 in synchronization with this synchronizing signal. For example, by adjusting the synchronizing signal, random control, thinning control, and the like can be achieved that take as their control target only specific pixels among the pixels belonging to the same unit group 1131A etc.
The signal control unit 1444 mainly bears the timing control of the A/D converter 1412. Pixel signals output via the output wiring 1309 are input to the A/D converter 1412 via the CDS circuit 1410 and the multiplexer 1411. The CDS circuit 1410 removes noise from the pixel signals.
The A/D converter 1412 is controlled by the signal control unit 1444 and converts the input pixel signals into digital signals. The pixel signals converted into digital signals are delivered to the demultiplexer 1413 and then stored as digital pixel values in the pixel memory 1414 locations corresponding to the respective pixels.
The pixel memory 1414 is provided with a data transfer interface that transmits pixel signals in response to delivery requests. The data transfer interface is connected to a data line that connects to the image processing unit 1511. The data line is constituted by, for example, the data bus within a bus. In this case, a delivery request from the system control unit 1501 to the drive control unit 1420 is executed by address designation using the address bus.
The transmission of pixel signals by the data transfer interface is not limited to an address-designation scheme; various schemes can be adopted. For example, a double data rate scheme can be adopted in which, during data transfer, processing uses both the rising and falling edges of the clock signal with which the circuits synchronize. A burst transfer scheme can also be adopted that seeks higher speed by omitting part of the procedure, such as address designation, and transferring the data all at once. Further, a bus scheme using circuits that connect the control unit, storage unit, and input/output unit in parallel, and a serial scheme that transfers data serially one bit at a time, can be adopted in combination.
With this configuration, the image processing unit 1511 can receive only the necessary pixel signals, so image processing can be completed at high speed, particularly when forming a low-resolution image. Moreover, when the arithmetic circuit 1415 is made to execute accumulation processing, the image processing unit 1511 need not execute that processing, so image processing can be sped up through function sharing and parallel processing.
The signal processing chip 1111 has a timing memory 1430 formed of flash RAM or the like. The timing memory 1430 stores control parameters, such as accumulation count information on how many times charge accumulation is to be repeated for each unit group 1131A etc., in association with information identifying the unit group 1131A etc. The control parameters are calculated by the arithmetic circuits 1415 of the individual circuit units 1450A etc. and stored in the timing memory 1430.
The drive control unit 1420 refers to the timing memory 1430 not only when executing charge accumulation control for the imaging chip 1113 but also when executing readout control. For example, referring to the accumulation count information of each unit group 1131, the drive control unit 1420 stores the pixel signals output from the demultiplexer 1413 at the corresponding addresses of the pixel memory 1414.
In response to a delivery request from the system control unit 1501, the drive control unit 1420 reads the target pixel signals from the pixel memory 1414 and delivers them to the image processing unit 1511. The pixel memory 1414 has memory space capable of storing, for each pixel, pixel signals corresponding to the maximum accumulation count, and stores as pixel values the pixel signals corresponding to the accumulation count actually executed. For example, when charge accumulation is repeated four times in a certain block, the pixels included in that block output pixel signals for four passes, so four pixel values are stored in the memory space of each pixel in the pixel memory 1414. On receiving a delivery request from the system control unit 1501 for the pixel signals of a specific pixel, the drive control unit 1420 designates the address of that specific pixel in the pixel memory 1414, reads out all the stored pixel signals, and delivers them to the image processing unit 1511. For example, when four pixel values are stored, all four are delivered in sequence; when only one pixel value is stored, that single pixel value is delivered.
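A minimal behavioral sketch of this per-pixel storage, assuming a maximum accumulation count of 4 and a dictionary keyed by pixel address (both assumptions for illustration; the real pixel memory is hardware):

```python
class PixelMemory:
    """Stores, per pixel address, one value per charge-accumulation pass,
    up to a maximum accumulation count."""
    def __init__(self, max_count=4):
        self.max_count = max_count
        self.store = {}                 # pixel address -> list of values

    def write(self, addr, value):
        values = self.store.setdefault(addr, [])
        if len(values) >= self.max_count:
            raise ValueError("exceeds maximum accumulation count")
        values.append(value)

    def read_all(self, addr):
        """Deliver every value stored for a pixel, as for a transfer
        request that reads out all stored pixel signals."""
        return list(self.store.get(addr, []))
```

A pixel accumulated four times yields four stored values; a pixel accumulated once yields one, matching the delivery behavior described above.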
The drive control unit 1420 can read the pixel signals stored in the pixel memory 1414 into the arithmetic circuit 1415 and have it execute the accumulation processing described above. The accumulated pixel signals are stored at target pixel addresses in the pixel memory 1414. The target pixel addresses may be arranged adjacent to the address space used before accumulation processing, or may be the same addresses, overwriting the pre-accumulation pixel signals. A dedicated space that collectively stores the post-accumulation pixel values of each pixel may also be provided. On receiving a delivery request from the system control unit 1501 for the pixel signals of a specific pixel, the drive control unit 1420 can deliver the accumulated pixel signals to the image processing unit 1511 according to the form of that delivery request. Of course, the pixel signals before and after accumulation processing may also be delivered together.
As described above, output wiring 1309 is provided for each unit group 1131. Since the image sensor 1100 is formed by stacking the imaging chip 1113, the signal processing chip 1111, and the memory chip 1112, using inter-chip electrical connections with the bumps 1109 for this output wiring 1309 allows the wiring to be routed without enlarging the chips in the planar direction. Similarly, the signal lines from each control unit to the unit groups can also be routed without enlarging the chips in the planar direction by using inter-chip electrical connections with the bumps 1109.
Figure 27 is one example of the functional blocks of the arithmetic circuit 1415. The arithmetic circuit 1415 computes an evaluation value using the pixel signals stored in the pixel memory 1414 of the individual circuit unit 1450A and, based on that evaluation value, outputs a control parameter for controlling the exposure or readout of the corresponding unit group 1131A. In the example shown in Figure 27, the arithmetic circuit 1415 calculates the frame rate suited to the unit group 1131A based on the difference over time of the mean of its pixel signals.
The arithmetic circuit 1415 of Figure 27 has an average calculation unit 1452, an average storage unit 1454, a difference calculation unit 1456, and a frame rate calculation unit 1458. The average calculation unit 1452 takes the simple average of the G pixel signals of the pixels of the unit group 1131A stored in the pixel memory 1414 to calculate a mean value Ag. In this case, the average calculation unit 1452 calculates the mean value Ag of the current frame at time intervals corresponding to a predetermined frame rate.
In the above example, one mean value Ag is calculated per unit group 1131A and stored in the average storage unit 1454. Since the difference between the mean values Ag of the preceding and following frames is taken, the average storage unit 1454 is provided with memory space that stores at least two values.
The difference calculation unit 1456 calculates the difference d between the mean value Ag of the latest frame stored in the average storage unit 1454 and the mean value Ag of the frame immediately preceding it in time. This difference may be output as an absolute value.
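The mean-and-difference pipeline of units 1452 through 1456 can be sketched as follows. The two-slot store mirrors the requirement that the average storage unit hold at least two values; class and function names are hypothetical.

```python
def frame_mean(g_signals):
    """Simple average Ag of the G pixel signals of one unit group
    for one frame (average calculation unit)."""
    return sum(g_signals) / len(g_signals)

class MeanStore:
    """Holds the two most recent means Ag, as the average storage unit
    must in order to difference consecutive frames."""
    def __init__(self):
        self.prev = None
        self.curr = None

    def push(self, mean):
        self.prev, self.curr = self.curr, mean

    def difference(self):
        """Inter-frame difference d as an absolute value; None until
        two frames have been seen."""
        if self.prev is None:
            return None
        return abs(self.curr - self.prev)
```

Pushing the mean of each new frame and reading `difference()` yields the d that the frame rate calculation unit consumes.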
The frame rate calculation unit 1458 compares the difference d calculated by the difference calculation unit 1456 with predetermined reference values d0 etc. to calculate the frame rate f. For example, a correspondence in which a larger inter-frame difference d gives a higher frame rate f is stored as a table in the frame rate calculation unit 1458.
The frame rate calculation unit 1458 outputs the calculated frame rate f to the drive control unit 1420. Instead of this, or in addition, the frame rate calculation unit 1458 may write the frame rate f directly into the timing memory 1430.
Figure 28 shows an example of the correspondence between the inter-frame difference d and the frame rate f. In Figure 28, the inter-frame difference has two reference values d0 and d1, and correspondingly the frame rate f is provided in three levels f0, f1, and f2.
When the inter-frame difference d is at or below the lower reference value d0, the frame rate calculation unit 1458 outputs the lowest frame rate f0 as the frame rate f suited to the unit group 1131A. When the inter-frame difference d lies between the reference value d0 and the higher reference value d1, the frame rate calculation unit 1458 outputs the intermediate frame rate f1. When the inter-frame difference d is greater than the reference value d1, the frame rate calculation unit 1458 outputs the highest frame rate f2.
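The three-level mapping of Figure 28 reduces to a small threshold function. The numeric values of d0, d1, f0, f1, and f2 below are placeholders chosen for illustration; the embodiment does not specify them.

```python
def frame_rate(d, d0=10.0, d1=30.0, rates=(15.0, 30.0, 60.0)):
    """Map the inter-frame difference d to one of three frame rates,
    per the Figure 28-style table: f0 for d <= d0, f1 for
    d0 < d <= d1, f2 for d > d1. All thresholds/rates are placeholders."""
    f0, f1, f2 = rates
    if d <= d0:
        return f0
    if d <= d1:
        return f1
    return f2
```

More reference values simply add branches (or a sorted lookup), which is why the text notes the counts are not limited to two and three.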
Here, the time interval at which the arithmetic circuit 1415 performs the above series of computations is preferably set to (1/f0) seconds, corresponding to the frame interval of the lowest frame rate f0. In this way, regardless of the frame rate at which each group is currently being driven, the next frame rate can be calculated at the same timing across the multiple unit groups 1131A, 1131B, etc. Moreover, even when a group is being driven at this lowest frame rate f0, a new frame rate f can be calculated based on a frame different from the one used in the previous calculation.
Figures 29 and 30 show an example of images captured by the image sensor. The ruled lines in image 1170 and image 1178 represent the boundaries of the unit groups 1131, but the number of unit groups is merely illustrative and is not limited to that shown in these figures. The unit groups 1131A etc. are denoted simply by "A" etc., and the unit groups containing the main subject 1171 are shown with thick lines.
The image sensor 1100 obtains, for example, the image 1170 of Figure 29 and the image 1178 of Figure 30 as temporally preceding and following images. Focusing on the unit group 1131A in the figures, in the image 1170 of the preceding frame this unit group 1131A does not contain the main subject 1171, but in the image 1178 of the following frame it does. Accordingly, the difference d between the mean values Ag of the unit group 1131A in image 1170 and image 1178, as calculated by the average calculation unit 1452, appears large.
The frame rate calculation unit 1458 therefore calculates a higher frame rate f for the unit group 1131A from image 1178 onward, based on the correspondence of Figure 28. The drive control unit 1420 thus drives each pixel of the unit group 1131A from image 1178 onward at the high frame rate f2 or the like. In this way, the drive control unit 1420 can obtain pixel signals at the high frame rate f2 etc. for a subject with large motion between temporally preceding and following frames.
A unit group 1131 driven at the high frame rate f2 can perform charge accumulation multiple times during a single charge accumulation of a unit group 1131 driven at the low frame rate f0. The number of bits used when digitizing the pixel signals of the unit groups 1131 driven at the high frame rate f2 etc. can thereby be made greater than for the unit groups 1131 driven at the low frame rate f0, so that an image of high gradation can be generated from the unit groups 1131 driven at the high frame rate f2 etc.
Instead of increasing the number of digitization bits, image processing that averages the images obtained at the high frame rate f2 etc. may be performed to improve the S/N ratio. In this case, during a single charge accumulation of a unit group 1131 driven at the low frame rate f0, image signals are obtained multiple times, for example four times, from a unit group 1131 driven at the high frame rate f2, and are stored in the pixel memory 1414. The arithmetic circuit 1415 reads from the pixel memory 1414 the multiple pixel signals obtained for each pixel of the unit group 1131 controlled at the high frame rate f2, and averages them pixel by pixel. The random noise in each pixel of that unit group 1131 can thereby be reduced to improve the S/N ratio.
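The per-pixel averaging step can be written directly; here each pixel carries a list of the signals obtained over the repeated high-frame-rate accumulations (four in the text's example). The representation as a list of lists is an assumption for illustration.

```python
def average_signals(per_pixel_signals):
    """Average, pixel by pixel, the multiple signals obtained for each
    pixel of a block driven at the high frame rate. Averaging N
    independent readings reduces random noise (by about sqrt(N))."""
    return [sum(vals) / len(vals) for vals in per_pixel_signals]
```

For a block accumulated four times, each inner list holds four readings and the output holds one averaged value per pixel.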
According to the above, compared with a configuration in which the downstream image processing unit 1511 obtains the pixel signals of the entire image 1170 etc., estimates the main subject, and then calculates the frame rate f of each unit group 1131A etc., the frame rate f can be calculated quickly and with low power consumption. Moreover, even if a fault occurs in the pixels, wiring, or processing circuitry of a certain unit group 1131, the frame rate f can still be calculated quickly and with low power for the other unit groups 1131.
The average calculation unit 1452 of Figure 27 averages the pixel signals of the G pixels of the corresponding unit group 1131A. Instead, the average calculation unit 1452 may calculate an average of pixel signals that includes the R pixels and B pixels. The average calculation unit 1452 may also calculate the average of the G pixels, the average of the R pixels, and the average of the B pixels separately. In that case, the frame rate calculation unit 1458 may calculate the frame rate f based on conditions such as whether any of the difference of the G pixel averages, the difference of the R pixel averages, and the difference of the B pixel averages exceeds a threshold. The judgment may also be made on the result of adding the G pixel average, the R pixel average, and the B pixel average in a prescribed ratio. Further, the mean value may be calculated over a partial region within the unit group.
The average calculation unit 1452 may also obtain, from the arithmetic circuits 1415 of the other individual circuit units 1450B etc., the mean values Ag of the unit groups 1131B, 1131C, 1131D, 1131E, etc. surrounding this unit group 1131A, as shown in Figure 29 etc., and add them to the mean value Ag of this unit group 1131A; for example, these mean values may be weighted and averaged. Instead of obtaining the mean values Ag of the surrounding unit groups 1131B, 1131C, 1131D, 1131E, etc. from the other arithmetic circuits 1415, the average calculation unit 1452 may read the pixel signals from the pixel memories 1414 of the other individual circuit units 1450B etc. and calculate the mean values Ag itself.
In the example of Figure 28, there are two reference values for the difference and three levels of frame rate, but the number of reference values for the difference and the number of frame rate levels are not limited to these.
Figure 31 is one example of the functional blocks of another arithmetic circuit 1416. In the example shown in Figure 31, the arithmetic circuit 1416 calculates the thinning rate suited to the unit group 1131A based on the contrast of its pixel signals.
The arithmetic circuit 1416 of Figure 31 has a high-frequency component calculation unit 1460, a sum calculation unit 1462, and a thinning rate calculation unit 1464. The high-frequency component calculation unit 1460 reads the G pixel signals of the pixels of the unit group 1131A stored in the pixel memory 1414 and applies high-pass filtering based on their two-dimensional arrangement, thereby extracting the spatial high-frequency component Gh. Similarly, the high-frequency component calculation unit 1460 calculates the high-frequency component Rh of the R pixels and the high-frequency component Bh of the B pixels.
The sum calculation unit 1462 calculates the sum of the absolute values of the high-frequency components Gh, Rh, and Bh. Based on this sum, the thinning rate calculation unit 1464 calculates the thinning rate for the thinned readout of the pixels included in the unit group 1131A. In this case, a table expressing a correspondence in which a larger sum gives a lower thinning rate is preferably stored in advance in the thinning rate calculation unit 1464, for example a correspondence, analogous to Figure 28, that associates reference values of the sum with thinning rates.
For example, one reference value is set for the sum; when the sum is higher than the reference value, the thinning rate is set to 0, that is, all pixels are read without thinning, and when the sum is lower than the reference value, the thinning rate is calculated as 0.5. The thinning rate calculation unit 1464 outputs the calculated thinning rate to the drive control unit 1420. Instead of this, or in addition, the thinning rate calculation unit 1464 may write the thinning rate directly into the timing memory 1430.
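The contrast-to-thinning decision can be sketched as follows. The first-difference filter here is only a crude stand-in for the high-pass filtering the text describes, and the single reference value is the example just given; both are assumptions.

```python
def highpass_sum(block):
    """Sum of absolute horizontal and vertical first differences over a
    2D block of pixel signals: a simple proxy for the sum of |high-
    frequency components| computed by the sum calculation unit."""
    total = 0
    rows, cols = len(block), len(block[0])
    for r in range(rows):
        for c in range(cols):
            if c + 1 < cols:
                total += abs(block[r][c + 1] - block[r][c])
            if r + 1 < rows:
                total += abs(block[r + 1][c] - block[r][c])
    return total

def thinning_rate(hf_sum, reference):
    """Above the reference (high contrast): read all pixels (rate 0).
    Below it (low contrast): thin out half the readout (rate 0.5)."""
    return 0.0 if hf_sum >= reference else 0.5
```

A flat block yields sum 0 and is thinned; a high-contrast block keeps full readout, preserving resolution where detail exists.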
The drive control unit 1420 causes the pixels included in the corresponding unit group 1131 to output image signals thinned at the thinning rate calculated by the thinning rate calculation unit 1464. In this case, the drive control unit 1420 individually drives the group of reset transistors 1303, transfer transistors 1302, and select transistors 1305 of the unit groups 1131 with thinning rate 0.5 and the corresponding group of transistors of the unit groups 1131 with thinning rate 0, thereby obtaining pixel signals at the different thinning rates.
The resolution of the unit groups 1131 corresponding to high-contrast regions can thereby be kept high, while the signal volume of the unit groups 1131 corresponding to low-contrast regions can be reduced. Moreover, in this case, compared with having the downstream image processing unit 1511 calculate the thinning rate, the thinning rate can be calculated quickly and with low power. And even if a fault occurs in the pixels, wiring, or processing circuitry of a certain unit group 1131, the thinning rate can still be calculated quickly and with low power for the other unit groups 1131.
Figure 32 shows an example of the pixels 1188 read from one unit group at thinning rate 0.5. In the example shown in Figure 32, when the unit group 1132 is a Bayer array, pixels 1188 to be read and pixels not to be read are set alternately in the vertical direction in units of one Bayer array unit, that is, every two rows in pixel terms. Thinned readout can thereby be performed without upsetting the color balance.
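The two-row alternation of Figure 32 can be expressed as a row-selection rule; a minimal sketch, assuming rows are numbered from 0 and each Bayer unit spans two pixel rows:

```python
def rows_to_read(num_rows, rate):
    """Select pixel rows under thinning rate 0.5 by alternating
    two-row (one Bayer unit) blocks, so each kept block still
    contains all four Bayer colors and color balance is preserved."""
    if rate == 0.0:
        return list(range(num_rows))
    # rate 0.5: keep Bayer units 0, 2, 4, ... (each spans two pixel rows)
    return [r for r in range(num_rows) if (r // 2) % 2 == 0]
```

For an 8-row block at rate 0.5 this keeps rows 0, 1, 4, and 5: two full Bayer units out of four.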
In the example of Figure 32, thinning is performed by rows, but thinning may instead be performed by columns. Further, the high-frequency component calculation unit 1460 may extract high-frequency components in the column direction and the row direction separately, and the thinning rate calculation unit 1464 may calculate a column-direction thinning rate and a row-direction thinning rate separately.
In the structure of Figures 31 and 32, the thinning rate calculation unit 1464 calculates the thinning rate of the corresponding pixel group. Instead, the number of pixels to be added when adding the pixel signals of same-color adjacent pixels may be calculated. For example, when the sum calculated by the sum calculation unit 1462 is at or above the reference value, the number of rows is set to 1, that is, pixel signals are output without adding same-color adjacent rows; when the sum is below the reference value, a larger number of rows is used, for example the number of rows is set to 2 and the pixel signals of the same-column pixels of two same-color adjacent rows are added and output.
In this way, as with Figure 32, the resolution of high-contrast regions can be kept high while the overall signal volume is reduced. Instead of adding the pixel signals of same-color adjacent rows, the pixel signals of same-color adjacent columns may be added. The above addition may include an averaging step that divides the added value by the number of added rows or columns. The pixel signals of same-color adjacent rows and columns may also both be added.
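The row-addition alternative can be sketched for the two-row case. In a Bayer array, same-color rows sit two pixel rows apart; the averaging divide-by-2 shown here is the optional step the text mentions, and the function name is hypothetical.

```python
def add_rows(row_a, row_b):
    """Column-wise addition of two same-color rows (two pixel rows
    apart in a Bayer array), with the optional averaging step
    (divide by the number of added rows)."""
    return [(a + b) / 2 for a, b in zip(row_a, row_b)]
```

The output replaces two same-color rows with one, halving that portion of the signal volume while keeping color alignment.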
In addition, in above-mentioned radio-frequency component calculating part 1460 grade, the high wavelength components Rh of each R pixel, G pixel and B pixel, Gh, Bh is employed.Also can replace, use the luminance components calculated from R pixel, G pixel and B pixel to obtain radio-frequency component.In this situation, after gain can being adjusted between the luminance components of R pixel, G pixel and B pixel, obtain radio-frequency component.
In the sum calculation unit 1462, as shown in Figure 29 and elsewhere, the high-frequency components of the unit groups 1131B, 1131C, 1131D, 1131E, etc. surrounding the unit group 1131A of interest may be obtained from the arithmetic circuits 1416 of the other individual circuit units 1450B etc. and added to the high-frequency component of the unit group 1131A; for example, these values may be combined by weighted averaging. Instead of obtaining the average values Ag etc. of the surrounding unit groups 1131B, 1131C, 1131D, 1131E from the other arithmetic circuits 1416 etc., the sum calculation unit 1462 may itself read the pixel signals from the pixel memories 1414 of the other individual circuit units 1450B etc. and calculate the high-frequency components.
For a unit group whose value exceeds the threshold in the frame rate calculation unit 1458 or the thinning rate calculation unit 1464, the number of bits used to digitize the pixel signal may be made larger than for unit groups at or below the threshold. For example, the A/D conversion circuit 1412 may, in response to an instruction from the drive unit 1502, digitize the signal of every accumulation with a higher bit count.
Figure 33 shows an example of the functional blocks of yet another arithmetic circuit 1417. The arithmetic circuit 1417 has a self-average calculation unit 1472, an adjacent-average calculation unit 1470, a gain calculation unit 1474 and a correction unit 1476.
The self-average calculation unit 1472 calculates an average value Ag by simply averaging the G pixel signals of the pixels of the unit group 1131A stored in the pixel memory 1414. Similarly, it calculates average values Ar and Ab by simply averaging the R pixel signals and the B pixel signals, respectively, of the pixels of the unit group 1131A stored in the pixel memory 1414. The self-average calculation unit 1472 then outputs the average values Ag, Ar, Ab of the unit group 1131A to the adjacent-average calculation units 1470 of the surrounding unit groups 1131B etc.
The adjacent-average calculation unit 1470 obtains the average values Ag, Ar, Ab from the self-average calculation units 1472 corresponding to the other unit groups 1131B, 1131C, 1131D, 1131E adjacent to the unit group 1131A of interest, and calculates a weighted average of these values. The gain calculation unit 1474 takes, for each of R, G and B, a weighted average of the values Ag, Ar, Ab calculated by the self-average calculation unit 1472 and those calculated by the adjacent-average calculation unit 1470, and from their ratios calculates the gains of the R pixel signal and the B pixel signal relative to the G pixel signal. In this case, for example, a weighted average is used in which the average value of the unit group 1131A of interest is weighted 4/8 and the average value of each adjacent unit group 1131B etc. is weighted 1/8.
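The weighted-average and gain computation can be sketched as below. The 4/8 own-block and 1/8-per-neighbor weights follow the text; expressing the gains as G-to-R and G-to-B ratios is an assumption, since the patent only says the gains are derived from the ratios of the weighted averages.

```python
def block_gains(self_avg, neighbor_avgs):
    """self_avg: {'R': Ar, 'G': Ag, 'B': Ab} for the unit group of interest.
    neighbor_avgs: list of four such dicts for the adjacent unit groups.
    Returns assumed gains of the R and B signals relative to G."""
    w = {}
    for c in 'RGB':
        # weight 4/8 for the own block, 1/8 for each of the four neighbors
        w[c] = 0.5 * self_avg[c] + sum(a[c] for a in neighbor_avgs) / 8.0
    return {'R': w['G'] / w['R'], 'B': w['G'] / w['B']}
```

Multiplying each R pixel signal by `gains['R']` and each B pixel signal by `gains['B']`, as the correction unit 1476 does, then balances the three channels.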
The gains of the R and B pixel signals are sent to the system control unit 1501 as additional information via the I/F circuit 1418. Instead of obtaining the average values Ag etc. of the surrounding unit groups 1131B, 1131C, 1131D, 1131E from the arithmetic circuits 1417 of the other individual circuit units 1450B etc., the adjacent-average calculation unit 1470 may itself read the pixel signals from the pixel memories 1414 of the other individual circuit units 1450B etc. and calculate the average values Ag etc.
The correction unit 1476 corrects the R and B pixel signals with the gains calculated by the gain calculation unit 1474 and writes the results into the pixel memory 1414. In this case, the correction unit 1476 multiplies each R pixel signal by the R gain and each B pixel signal by the B gain. The correction unit 1476 may also obtain feedback information from the system control unit 1501 and further revise the gains.
Figure 34 schematically illustrates the relationship between the gains and the pixel signals. The operation of calculating the gains and correcting the pixel signals is preferably performed once per frame at the frame rate f0, that is, every (1/f0) seconds. As shown in Figure 34, the gains of the R and B pixel signals are calculated every (1/f0) seconds, and the output values of the R and B pixel signals are corrected accordingly. Compared with calculating the gains and correcting the pixel signals in the downstream image processing unit 1511, this allows the gains to be calculated and the pixel signals corrected quickly and with low power consumption.
In the embodiment described above, one each of the sensor control unit 1441, block control unit 1442, synchronization control unit 1443, signal control unit 1444 and drive control unit 1420 is provided on the signal processing chip 1111, while the individual circuit units 1450A, 1450B, 1450C, 1450D, 1450E are provided for the respective unit groups 1131A, 1131B, 1131C, 1131D, 1131E. Alternatively, a plurality of sensor control units 1441, block control units 1442, synchronization control units 1443, signal control units 1444 and drive control units 1420 may be provided on the signal processing chip 1111, each controlling a share of the multiple unit groups 1131.
Also, one individual circuit unit 1450A etc. may be provided for every several unit groups 1131 and shared by them. Conversely, the individual circuit units 1450A etc. may be provided per pixel; that is, in the above embodiment a unit group 1131 may consist of a single pixel.
Figure 35 is a cross-sectional view of another image sensor 2100 of the present embodiment. The image sensor 2100 has an imaging chip 2113 that outputs pixel signals corresponding to incident light, a signal processing chip 2111 that processes the pixel signals, and a memory chip 2112 that stores the pixel signals. The imaging chip 2113, signal processing chip 2111 and memory chip 2112 are stacked and electrically connected to one another by conductive bumps 2109 of Cu or the like.
As illustrated, incident light enters mainly in the positive Z-axis direction indicated by the white arrow. In this specification, the face of the imaging chip 2113 on which incident light strikes is called the back surface. As the coordinate axes show, the leftward direction of the page orthogonal to the Z axis is the positive X direction, and the direction toward the viewer orthogonal to the Z and X axes is the positive Y direction. In several subsequent figures, coordinate axes are displayed with reference to those of Figure 35 so that the orientation of each figure can be understood.
One example of the imaging chip 2113 is a back-illuminated MOS image sensor. A PD layer 2106 is arranged on the back-surface side of a wiring layer 2108. The PD layer 2106 has a plurality of PDs (photodiodes) 2104 arranged two-dimensionally, which accumulate charge corresponding to incident light and generate pixel signals corresponding to the accumulated charge, and transistors 2105 provided corresponding to the PDs 2104.
On the incident-light side of the PD layer 2106, color filters 2102 are provided via a passivation film 2103. The color filters 2102 are of multiple kinds that transmit mutually different wavelength bands, and have a specific arrangement corresponding to the PDs 2104. The arrangement of the color filters 2102 will be described later. A set of a color filter 2102, a PD 2104 and a transistor 2105 forms one pixel.
On the incident-light side of the color filters 2102, microlenses 2101 are provided corresponding to the respective pixels. Each microlens 2101 condenses incident light toward the corresponding PD 2104.
The wiring layer 2108 has wiring 2107 that transmits the pixel signals from the PD layer 2106 to the signal processing chip 2111. The wiring 2107 may be multilayer, and passive and active elements may also be provided.
A plurality of bumps 2109 are arranged on the surface of the wiring layer 2108. These bumps 2109 are aligned with a plurality of bumps 2109 provided on the opposing face of the signal processing chip 2111; by pressurizing the imaging chip 2113 and the signal processing chip 2111 together, the aligned bumps 2109 are joined and electrically connected.
Similarly, a plurality of bumps 2109 are arranged on the mutually opposing faces of the signal processing chip 2111 and the memory chip 2112. These bumps 2109 are aligned with each other; by pressurizing the signal processing chip 2111 and the memory chip 2112 together, the aligned bumps 2109 are joined and electrically connected.
The joining of the bumps 2109 is not limited to Cu bump bonding by solid-state diffusion; micro-bump bonding by solder melting may also be adopted. Furthermore, roughly one bump 2109 may be provided per pixel block described later, so the size of the bumps 2109 may be larger than the pitch of the PDs 2104. In the peripheral region outside the imaging area where the pixels are arranged, bumps larger than the bumps 2109 corresponding to the imaging area may also be provided.
The signal processing chip 2111 has TSVs (through-silicon vias) 2110 that interconnect circuits provided on its front and back surfaces. The TSVs 2110 are preferably provided in the peripheral region. TSVs 2110 may also be provided in the peripheral regions of the imaging chip 2113 and the memory chip 2112.
Figure 36 is an explanatory diagram of the pixel arrangement of the imaging chip 2113 and the pixel blocks 2131. Figure 36 shows the imaging chip 2113 as observed from the back-surface side. A plurality of pixels are arranged in a matrix in the imaging area 2700. The imaging area 2700 has a plurality of pixel blocks 2131 obtained by dividing the pixels in the row direction and the column direction. Each pixel block 2131 has m × n pixels in the row and column directions, where m and n are integers of 2 or more. The row direction and column direction are two different directions in the plane of the imaging area 2700 and may be mutually orthogonal. In Figure 36, 16 adjacent pixels of 4 pixels × 4 pixels form one pixel block 2131. The ruled lines in the figure illustrate the concept of adjacent pixels being grouped to form pixel blocks 2131. The number of pixels forming a pixel block 2131 is not limited to this; it may be on the order of 1000, for example 32 pixels × 64 pixels, or more or fewer.
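The division of the imaging area into m × n blocks amounts to simple integer arithmetic; a minimal sketch under the 4 × 4 example of Figure 36, with assumed zero-based pixel coordinates:

```python
def block_of(px, py, m=4, n=4):
    """Return the (block_row, block_col) index of the pixel block containing
    the pixel at column px, row py, for blocks of m pixels (row direction)
    by n pixels (column direction). Coordinates are zero-based (assumed)."""
    return (py // n, px // m)
```

For example, pixel (5, 2) falls in block (0, 1): the second block of the first block row.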
As the partial enlarged view of the imaging area 2700 shows, a pixel block 2131 contains four so-called Bayer arrays arranged vertically and horizontally, each made up of the 4 pixels: green pixels Gb and Gr, blue pixel B and red pixel R. A green pixel has a green filter as its color filter 2102 and receives light of the green wavelength band of the incident light. Similarly, a blue pixel has a blue filter as its color filter 2102 and receives light of the blue wavelength band, and a red pixel has a red filter as its color filter 2102 and receives light of the red wavelength band.
In the present embodiment, at least one pixel block among the plurality of pixel blocks 2131 is selected, and the pixels contained in each such pixel block are controlled with control parameters different from those of the other pixel blocks. Examples of control parameters are the frame rate, the thinning rate, the number of rows whose pixel signals are added, the charge accumulation time or number of accumulations, and the number of digitization bits. A control parameter may also be a parameter of the image processing performed after the image signal is obtained from the pixels. The frame rate is the cycle at which pixel signals are generated. In this specification, the frame rate may refer to the frame rate of each pixel block 2131; for example, the reference frame rate and the high-speed frame rate refer to the frame rates of individual pixel blocks 2131.
Figure 37 is a circuit diagram corresponding to a pixel block 2131 of the imaging chip 2113. In the figure, the rectangle enclosed by the dotted line representatively shows the circuit corresponding to one pixel. At least parts of the transistors described below correspond to the transistors 2105 of Figure 35.
Figure 37 shows a pixel block 2131 formed of 16 pixels, but the pixel count of the pixel block 2131 is not limited to this. The 16 PDs 2104 corresponding to the respective pixels are each connected to a transfer transistor 2302, and the gate of each transfer transistor 2302 is connected to a TX wiring 2307 that supplies transfer pulses. In the example shown in Figure 37, the TX wiring 2307 is connected in common to the 16 transfer transistors 2302.
The drain of each transfer transistor 2302 is connected to the source of the corresponding reset transistor 2303, and the so-called floating diffusion FD between the drain of the transfer transistor 2302 and the source of the reset transistor 2303 is connected to the gate of an amplifier transistor 2304. The drain of the reset transistor 2303 is connected to a Vdd wiring 2310 that supplies the power-supply voltage, and its gate is connected to a reset wiring 2306 that supplies reset pulses. In the example shown in Figure 37, the reset wiring 2306 is connected in common to the 16 reset transistors 2303.
The drain of each amplifier transistor 2304 is connected to the Vdd wiring 2310 that supplies the power-supply voltage. The source of each amplifier transistor 2304 is connected to the drain of the corresponding selection transistor 2305. The gate of each selection transistor is connected to a decoder wiring 2308 that supplies selection pulses. In the example shown in Figure 37, the decoder wirings 2308 are provided independently for the 16 selection transistors 2305. The source of each selection transistor 2305 is connected to a common output wiring 2309. A load current source 2311 supplies current to the output wiring 2309; that is, the output wiring 2309 for the selection transistors 2305 is formed as a source follower. The load current source 2311 may be provided on the imaging chip 2113 side or on the signal processing chip 2111 side.
The flow from the start of charge accumulation to the pixel output after accumulation ends will now be described. When a reset pulse is applied to the reset transistors 2303 through the reset wiring 2306 and, simultaneously, a transfer pulse is applied to the transfer transistors 2302 through the TX wiring 2307, the potentials of the PDs 2104 and the floating diffusions FD are reset.
When the application of the transfer pulse is released, each PD 2104 converts the received incident light into charge and accumulates it. Then, when a transfer pulse is applied again without a reset pulse being applied, the accumulated charge is transferred to the floating diffusion FD, and the potential of the floating diffusion FD changes from the reset potential to the signal potential after charge accumulation. When a selection pulse is applied to the selection transistor 2305 through the decoder wiring 2308, the variation of the signal potential of the floating diffusion FD is conveyed to the output wiring 2309 via the amplifier transistor 2304 and the selection transistor 2305. Thereby, pixel signals corresponding to the reset potential and the signal potential are output from the unit pixel to the output wiring 2309.
In the example shown in Figure 37, the reset wiring 2306 and the TX wiring 2307 are common to the 16 pixels forming the pixel block 2131. That is, the reset pulse and the transfer pulse are each applied to all 16 pixels simultaneously. Accordingly, all pixels forming the pixel block 2131 start charge accumulation at the same timing and end charge accumulation at the same timing. However, the pixel signals corresponding to the accumulated charge are selectively output to the output wiring 2309 by applying selection pulses to the respective selection transistors 2305 in sequence. The reset wiring 2306, TX wiring 2307 and output wiring 2309 are provided individually for each pixel block 2131.
By constructing the circuit on the basis of pixel blocks 2131 in this way, the charge accumulation time can be controlled per pixel block 2131. In other words, adjacent pixel blocks 2131 can output pixel signals based on different charge accumulation times. Put yet another way, while one pixel block 2131 performs a single charge accumulation, another pixel block 2131 can repeat charge accumulation multiple times and output a pixel signal each time, so that these pixel blocks can output the frames of a moving image at different frame rates. At least parts of the transistors and wirings shown in Figure 37 function as a readout circuit that reads the pixel signal output from each pixel. A readout circuit is provided for each pixel, although parts of its structure, such as the wiring, may be shared between pixels.
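The relationship between per-block accumulation counts and frame rates can be sketched numerically. This assumes, for illustration only, that the high-speed rate is an integer multiple of the reference rate, as in the later 60 fps / 180 fps example:

```python
def accumulation_counts(base_fps, fast_fps, duration_s):
    """Frames output by a reference-rate block and a high-speed block over
    the same interval: while the reference block completes one accumulation,
    the fast block repeats fast_fps // base_fps accumulations."""
    ratio = fast_fps // base_fps        # assumed integral ratio
    base_frames = int(duration_s * base_fps)
    return base_frames, base_frames * ratio
```

Over one second at 60 fps and 180 fps this yields 60 and 180 frames respectively, i.e. three accumulations of the fast block per accumulation of the reference block.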
Figure 38 shows part of the structure of the image sensor 2100 and an example of its operation. The image sensor 2100 of this example has, in addition to the structure shown in Figure 35, a memory unit 2114. The memory unit 2114 may be provided on the signal processing chip 2111, in which case the image sensor 2100 need not have the memory chip 2112. Alternatively, the memory unit 2114 may be provided on the memory chip 2112.
The imaging chip 2113 has the imaging area 2700, in which a plurality of pixels that each generate a pixel signal corresponding to incident light are arranged. For ease of illustration, Figure 38 shows three pixel blocks 2131 in each of the row and column directions. The number of pixels contained in each pixel block 2131 is preferably equal; furthermore, the number of pixels contained in each pixel block 2131 of the imaging area 2700 is preferably fixed.
The signal processing chip 2111 of this example has, for each pixel block 2131, a multiplexer 2411, an A/D converter 2412, a demultiplexer 2413, a control unit 2740 and an arithmetic circuit 2415. The multiplexer 2411 selects the pixels contained in the corresponding pixel block 2131 in sequence and inputs the pixel signal corresponding to the selected pixel to the A/D converter 2412. The A/D converter 2412 converts the analog pixel signal into digital pixel data and inputs it to the demultiplexer 2413. The demultiplexer 2413 stores the pixel data in the storage area corresponding to that pixel within the corresponding memory block 2730. Each memory block 2730 passes the stored pixel data to the downstream arithmetic circuit 2415.
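The per-block path just described can be sketched as a small function. The 12-bit range, the reference voltage and the dictionary-as-memory-block are assumptions for illustration, not details from the patent:

```python
def read_block(analog_signals, adc_bits=12, vref=1.0):
    """Sketch of one block's readout path: the multiplexer visits each pixel
    in turn, the A/D converter quantizes its signal, and the demultiplexer
    files each code into that pixel's slot of the memory block."""
    full_scale = (1 << adc_bits) - 1
    memory_block = {}
    for i, v in enumerate(analog_signals):          # multiplexer: pixel by pixel
        code = min(full_scale, int(v / vref * full_scale))  # A/D conversion
        memory_block[i] = code                      # demultiplexer: per-pixel slot
    return memory_block
```

Because one such path exists per pixel block, all blocks digitize and store in parallel, which is what allows different blocks to sustain different frame rates.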
The memory unit 2114 is provided corresponding to the plurality of pixel blocks 2131 and has a plurality of memory blocks 2730 each capable of storing the pixel data of the corresponding pixel block 2131. The memory blocks 2730 correspond one-to-one with the pixel blocks 2131. A memory block 2730 may be connected to the corresponding pixel block 2131 via a bus 2720 and may be a buffer memory.
At least some of the memory blocks 2730 may also store pixel data of pixel blocks other than the corresponding pixel block 2131. That is, one memory block 2730 may be shared by multiple pixel blocks 2131; in other words, the control unit 2740 may store the pixel data of one pixel block 2131 in multiple memory blocks 2730. By sharing the memory blocks 2730 in this way, the plurality of memory blocks 2730 can be used efficiently as described later, so the total memory capacity of the memory unit 2114 can be kept down.
For all pixel blocks 2131, it is preferable that the pixel data can be written not only to the corresponding memory block 2730 but also to at least one other memory block 2730. These other memory blocks 2730 may be predetermined for each pixel block 2131 or may change dynamically. Likewise, for all memory blocks 2730, it is preferable that, in addition to the pixel data of the corresponding pixel block 2131, the pixel data of at least one other pixel block 2131 can be read and written. These other pixel blocks 2131 may be predetermined for each memory block 2730 or may change dynamically.
Each memory block 2730 may be a memory provided per pixel block 2131 in the region of the signal processing chip 2111 overlapping the corresponding pixel block 2131. That is, a memory block 2730 may be provided in the region of the signal processing chip 2111 directly below the corresponding pixel block 2131, in which case the pixel block 2131 and the memory block 2730 may be electrically connected by a TSV. In the region of the signal processing chip 2111 overlapping each pixel block 2131, the corresponding memory block 2730, A/D converter 2412, arithmetic circuit 2415 and so on are provided. Alternatively, each memory block 2730 may be a memory provided on the signal processing chip 2111 outside the region overlapping the imaging area 2700.
When each memory block 2730, A/D converter 2412 and arithmetic circuit 2415 are provided in the region overlapping the corresponding pixel block 2131, and a memory block 2730 stores pixel data of a pixel block 2131 other than its corresponding one, either the analog pixel signal or the digital pixel data may be sent to the area where that memory block 2730 is provided. In the former case, the A/D converter 2412 corresponding to that memory block 2730 converts the pixel signal into pixel data before it is input to the memory block 2730. In the latter case, the A/D converter 2412 in the region overlapping the originating pixel block 2131 converts the pixel signal into pixel data, and the pixel data is then sent to the memory block 2730 that stores it. The signal processing chip 2111 is provided with wiring for transmitting these pixel signals or pixel data.
The arithmetic circuit 2415 described later processes the pixel data stored in the memory block 2730 and passes the result to the downstream image processing unit 2511. The arithmetic circuit 2415 may be provided on the signal processing chip 2111. Although the figure shows the connections for one pixel block 2131, these connections in fact exist for each pixel block 2131 and operate in parallel. The arithmetic circuit 2415 is preferably provided per pixel block 2131.
As described above, an output wiring 2309 is provided corresponding to each pixel block 2131. Since the image sensor 2100 stacks the imaging chip 2113, the signal processing chip 2111 and the memory unit 2114, by using the chip-to-chip electrical connections with the bumps 2109 for these output wirings 2309, the wiring can be routed without enlarging each chip in the planar direction.
The control unit 2740 is provided with frequency information on the frame rate of each pixel block 2131. Based on this frequency information, the control unit 2740 selects the memory blocks 2730 that will store the pixel data of the pixel blocks 2131 operating at the high-speed frame rate. For example, the control unit 2740 designates memory blocks 2730 corresponding to pixel blocks 2131 at the reference frame rate as memory blocks 2730 for storing this pixel data.
The examples shown in the figures describe the case where an arithmetic circuit 2415 is provided for each pixel block 2131 containing multiple pixels. However, the arithmetic circuit 2415 may also be provided per pixel, and need not be provided for all pixels. That is, at least a 1st pixel and a 2nd pixel are arranged in the imaging area 2700, and the image sensor 2100 has at least a 1st arithmetic circuit 2415 corresponding to the 1st pixel and a 2nd arithmetic circuit 2415 corresponding to the 2nd pixel.
The 1st pixel signal output by the 1st pixel is read out by a 1st readout circuit, and the 2nd pixel signal output by the 2nd pixel is read out by a 2nd readout circuit. The 1st arithmetic circuit 2415 computes a 1st evaluation value based on the 1st pixel signal output from the 1st pixel and sends it to the downstream image processing unit 2511. The 2nd arithmetic circuit 2415 computes a 2nd evaluation value based on the 2nd pixel signal output from the 2nd pixel and sends it to the downstream image processing unit 2511. Here, an evaluation value is a value obtained by performing a prescribed computation using the values of pixel signals. For example, it may be the difference or average between the value of the pixel signal output by a given pixel and the values of the signals output by the pixels adjacent to it, or the difference or average of the values of multiple pixel signals output by the given pixel in different frames. Various parameters can be used in this computation.
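Two of the example evaluation values mentioned above, one spatial and one temporal, can be sketched as follows; both the function names and the particular formulas (difference from the neighbor mean, mean frame-to-frame difference) are illustrative choices consistent with, but not mandated by, the text:

```python
def spatial_eval(center, neighbors):
    """Evaluation value between a pixel and its adjacent pixels:
    difference from the mean of the neighbor signals."""
    return center - sum(neighbors) / len(neighbors)

def temporal_eval(values):
    """Evaluation value over the same pixel across frames:
    mean difference between consecutive frame values."""
    diffs = [b - a for a, b in zip(values, values[1:])]
    return sum(diffs) / len(diffs)
```

A downstream image processing unit could, for instance, treat a large `spatial_eval` as an edge cue and a large `temporal_eval` as a motion cue when choosing how to process that pixel's data.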
Figure 39 is a block diagram showing the structure of the imaging apparatus of the present embodiment. The imaging apparatus 2500 has a photographic lens 2520 as a photographic optical system, and the photographic lens 2520 guides a subject light beam incident along the optical axis OA to the image sensor 2100. The photographic lens 2520 may be an interchangeable lens that can be attached to and detached from the imaging apparatus 2500. The imaging apparatus 2500 mainly has the image sensor 2100, a system control unit 2501, a drive unit 2502, a photometry unit 2503, a working memory 2504, a recording unit 2505 and a display unit 2506.
The photographic lens 2520 is composed of multiple optical lens groups and forms an image of the subject light beam from the scene near its focal plane. In Figure 35 the photographic lens 2520 is representatively shown as a single imaginary lens arranged near the pupil. The drive unit 2502 is a control circuit that performs charge accumulation control of the image sensor 2100, such as timing control and region control, in accordance with instructions from the system control unit 2501. In that respect, the drive unit 2502 serves as an image-sensor control unit that causes the image sensor 2100 to perform charge accumulation and output pixel signals.
The image sensor 2100 passes the pixel signals to the image processing unit 2511 of the system control unit 2501. The image processing unit 2511 performs various kinds of image processing using the working memory 2504 as a work area and generates image data. The image processing unit 2511 downstream of the 1st and 2nd arithmetic circuits 2415 processes the 1st pixel data of the image corresponding to the 1st pixel signal based on the 1st evaluation value received from the 1st arithmetic circuit 2415, and processes the 2nd pixel data of the image corresponding to the 2nd pixel signal based on the 2nd evaluation value received from the 2nd arithmetic circuit 2415. For example, when generating image data in JPEG file format, a color image signal is generated from the signals obtained with the Bayer array and then compression processing is performed. The generated image data is recorded in the recording unit 2505 and, converted for display, is shown on the display unit 2506 for a preset time. The image processing unit 2511 may be provided on the image sensor 2100 or on the system control unit 2501 outside the image sensor 2100, and may be provided per pixel or per pixel block 2131 containing multiple pixels.
The photometry unit 2503 detects the luminance distribution of the scene before the series of photographic processes that generate the image data. The photometry unit 2503 contains an AE sensor of, for example, about one million pixels. The computation unit 2512 of the system control unit 2501 receives the output of the photometry unit 2503 and calculates the luminance of each region of the scene. The computation unit 2512 determines the shutter speed, aperture value and ISO sensitivity according to the calculated luminance distribution. The image sensor 2100 may double as the photometry unit 2503. The computation unit 2512 also performs various computations for operating the imaging apparatus 2500.
Part or all of the drive unit 2502 may be mounted on the imaging chip 2113, or part or all of it may be mounted on the signal processing chip 2111. Part of the system control unit 2501 may be mounted on the imaging chip 2113 or the signal processing chip 2111. In the imaging apparatus 2500 of this example, at least part of the image processing function of the image processing unit 2511 is provided in the image sensor 2100.
Figure 40 is a functional block diagram of the image processing unit. The image processing unit 2511 of this example distinguishes the pixel blocks 2131 operating at the reference frame rate (the peripheral region 2176 described later) from the pixel blocks 2131 operating at the high-speed frame rate (the attention region 2172 described later). In addition to the functions described above, the image processing unit 2511 has a subject estimation unit 2150, a group selection unit 2152, a moving-image generation unit 2154 and a moving-image synthesis unit 2156. These functions will be described later.
Figure 41 is a flowchart showing the operation by which the imaging apparatus generates and records a moving image. Figures 42 and 43 show examples of images captured by the image sensor. Figure 44 shows the relationship between the frame rates and the output timing of the image signals.
The operation of Figure 41 starts when the user presses a record button or the like and instructs the imaging apparatus 2500 to generate a moving image. First, the subject estimation unit 2150 drives the drive unit 2502 to obtain image data based on the image signals from the image sensor 2100, and estimates the main subject included in the image represented by this image data (S100).
In this case, the drive unit 2502 preferably outputs image signals from the pixel blocks 2131 of the whole imaging area, for example from all pixel blocks 2131. The drive unit 2502 may output image signals from all the pixels contained in each pixel block 2131, or from the pixels remaining after thinning at a predetermined thinning rate. The subject estimation unit 2150 compares multiple images obtained chronologically from the image sensor 2100 and identifies a moving subject as the main subject. Other methods may also be used to estimate the main subject.
For example, when the subject estimation unit 2150 obtains the image 2170 of Figure 42 and the image 2178 of Figure 43 from the image sensor 2100 as temporally preceding and following images, it identifies the child as the main subject 2171 from their difference. The ruled lines in the images 2170 and 2178 represent the boundaries of the pixel blocks 2131, but the number of pixel blocks 2131 is merely illustrative and is not limited to that shown in these figures.
The group selection unit 2152 selects at least one pixel block 2131 on which the image light of the main subject 2171 estimated by the subject estimation unit 2150 is incident (S2102). For example, in the image 2170, pixel blocks 2131 that contain at least part of the main subject 2171 are selected. Considering that the main subject 2171 may move within the shooting area, the group selection unit 2152 preferably also selects the pixel blocks 2131 surrounding those that contain part of the main subject 2171.
The group selection unit 2152 takes the set of these selected pixel blocks 2131 as the attention area 2172, and takes the set of the pixel blocks 2131 of the entire shooting area not contained in the attention area 2172 as the peripheral area 2176. The group selection unit 2152 specifies area information 2174 indicating the extent of the attention area 2172 relative to the entire shooting area.
In the example of Figure 42, the attention area 2172 is a rectangular area of 28 pixel blocks 2131, seven horizontally by four vertically. The peripheral area 2176 consists of the 98 pixel blocks 2131 remaining after removing the attention area 2172 from the 126 pixel blocks 2131 of the shooting area, 21 horizontally by six vertically. As the area information 2174, the position (9, 2) of the upper-left pixel block 2131 of the attention area 2172 in the figure is specified, counted from the left and top edges of the shooting area. As the size information, the attention-area dimensions of 7 × 4 blocks are specified.
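The block-level bookkeeping of Figure 42 can be sketched as follows. This is an illustrative model only, not the patent's data format; the names `AREA_INFO`, `SIZE_INFO` and `in_attention_area` are hypothetical.

```python
# Hypothetical sketch of the area/size bookkeeping of Figure 42.
# The shooting area is a 21 x 6 grid of pixel blocks 2131; the attention
# area 2172 is the 7 x 4 rectangle whose upper-left block is at (9, 2),
# counted 1-indexed from the left and top edges as in the figure.

GRID_W, GRID_H = 21, 6     # pixel blocks across the whole shooting area
AREA_INFO = (9, 2)         # upper-left block of attention area 2172
SIZE_INFO = (7, 4)         # attention-area size in blocks (width x height)

def in_attention_area(x, y):
    """True if block (x, y), 1-indexed, lies inside attention area 2172."""
    ax, ay = AREA_INFO
    w, h = SIZE_INFO
    return ax <= x < ax + w and ay <= y < ay + h

attention_blocks = [(x, y)
                    for y in range(1, GRID_H + 1)
                    for x in range(1, GRID_W + 1)
                    if in_attention_area(x, y)]
peripheral_count = GRID_W * GRID_H - len(attention_blocks)
```

With these numbers the attention area contains 28 blocks and the peripheral area the remaining 98, matching the counts stated above.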
The group selection unit 2152 passes to the drive unit 2502 the information specifying the pixel blocks 2131 contained in the attention area 2172 and the information specifying the peripheral area 2176, together with the frame rates to be applied to the attention area 2172 and the peripheral area 2176 respectively. Here the frame rate applied to the attention area 2172 is preferably higher than that applied to the peripheral area 2176. For example, when the frame rate applied to the peripheral area 2176 is 60 fps, the frame rate applied to the attention area 2172 is set to 180 fps. These frame-rate values are preferably set in advance and stored so that the group selection unit 2152 can reference them, but the user may also modify them later.
The drive unit 2502 drives the image sensor 2100 to capture images at the respective frame rates (S2104). That is, the drive unit 2502 performs charge accumulation and image-signal output at the high frame rate for the pixel blocks 2131 of the attention area 2172, and at the low frame rate for the pixel blocks 2131 of the peripheral area 2176. In other words, while the drive unit 2502 obtains the image signal corresponding to one frame from the pixel blocks 2131 of the peripheral area 2176, it obtains the image signals corresponding to multiple successive frames from the pixel blocks 2131 of the attention area 2172.
For example, when the frame rate of the peripheral area 2176 is 60 fps and that of the attention area 2172 is set to 180 fps, then, as shown in Figure 44, during the 1/60 s in which the drive unit 2502 obtains the image signal of one frame B1 from the peripheral area 2176, it obtains the image signals of three frames A1, A2, A3 from the attention area 2172 (1/60 s = 3 × 1/180 s). The drive unit 2502 obtains the image signals at the different frame rates by separately driving the group of reset transistors 2303, transfer transistors 2302 and selection transistors 2305 of the pixel blocks 2131 of the peripheral area 2176 and the corresponding group of the pixel blocks 2131 of the attention area 2172.
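The timing relation of Figure 44 can be sketched numerically; the variable names here are illustrative, not from the patent.

```python
# Sketch of the output-timing relation of Figure 44: while the peripheral
# area 2176 outputs one frame at 60 fps, the attention area 2172 outputs
# three frames at 180 fps within the same 1/60 s period.

PERIPHERAL_FPS = 60
ATTENTION_FPS = 180

# number of attention-area frames per peripheral-area frame
frames_per_peripheral = ATTENTION_FPS // PERIPHERAL_FPS   # 3

# output times (seconds) of the attention frames A1, A2, A3 that fall
# within the first peripheral frame period [0, 1/60)
t_attention = [i / ATTENTION_FPS for i in range(frames_per_peripheral)]
```

The last attention frame of the period starts at 2/180 s, i.e. still inside the 1/60 s peripheral frame period, which is the relation the text expresses as 1/60 s = 3 × 1/180 s.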
Figure 44 shows the output timing of the image signals; it does not show the lengths of the exposure times. The drive unit 2502 drives the above transistor groups for the peripheral area 2176 and the attention area 2172 so as to realize the exposure times calculated in advance by the arithmetic unit 2512.

The length of the exposure time may also be changed according to the frame rate. For example, in Figure 44 the exposure time of one frame of the peripheral area 2176 may be set in advance to 1/3, making it substantially equal to the exposure time of the attention area 2172. The image signal may also be corrected by the frame-rate ratio after it is output. Moreover, the output timings of the image signals of the peripheral area 2176 and the attention area 2172 need not be synchronous as in Figure 44; they may be asynchronous.
The image processing unit 2511 successively stores the image signals from the attention area 2172, frame by frame, in a predetermined storage area of the work memory 2504 (S2106). It likewise stores the image signals from the peripheral area 2176, frame by frame, in a predetermined storage area of the work memory 2504 (same step). As described with reference to Figure 38, the work memory 2504 has multiple memory blocks 2730; it may be a memory composed of memory groups corresponding to the individual pixel blocks 2131.
The moving-image generation unit 2154 reads the image signals of the attention area 2172 stored in the work memory 2504 (S2108) and generates attention-area moving-image data comprising the multiple frames of the attention area 2172 (S2110). It likewise reads the image signals of the peripheral area 2176 stored in the work memory 2504 and generates peripheral-area moving-image data comprising the multiple frames of the peripheral area 2176 (same step). The attention-area moving image and the peripheral-area moving image may each be generated in a general-purpose format such as MPEG so that they can be reproduced individually, or they may be generated in a dedicated format that cannot be reproduced without the synthesis processing described later.
Figure 45 schematically shows the attention-area moving image and the peripheral-area moving image generated by the moving-image generation unit. The moving-image generation unit 2154 generates the attention-area moving image at a frame rate corresponding to the frame rate at which the drive unit 2502 drives the attention area 2172. In the example of Figure 45, the attention-area moving image is generated at 180 fps, the same frame rate at which the drive unit 2502 drives the attention area 2172.
Likewise, the moving-image generation unit 2154 generates the peripheral-area moving image at a frame rate corresponding to the frame rate at which the drive unit 2502 drives the peripheral area 2176. In the example of Figure 45, the peripheral-area moving image is generated at 60 fps, the same frame rate at which the drive unit 2502 drives the peripheral area 2176. In the peripheral-area moving image, the region corresponding to the attention area 2172 has no valid values; it is hatched in the figure.
The moving-image generation unit 2154 further attaches header information to the attention-area moving image and the peripheral-area moving image and records these data in the recording unit 2505 (S2112). The header information includes area information indicating the position of the attention area 2172 relative to the entire shooting area, size information indicating the size of the attention area 2172, and timing information indicating the relation between the output timing of the image signals of the attention area 2172 and the output timing of the image signals of the peripheral area 2176.
The system control unit 2501 judges whether to capture the next unit time (S2114). Whether to capture the next unit time is judged by whether the user is still pressing the moving-image record button at that moment. When the next unit time is to be captured (S2114: YES), the process returns to step S2102; when it is not (S2114: NO), this operation ends.
Here, the "unit time" is a time preset in the system control unit 2501, on the order of a few seconds. The storage capacity used in step S2106 is determined from this unit time, the frame rate and pixel-block count of the attention area 2172, and the frame rate and pixel-block count of the peripheral area 2176. Based on the same information, the region of that capacity that stores the data of the attention area 2172 and the region that stores the data of the peripheral area 2176 are also determined.
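Under assumed numbers, the capacity calculation just described might look like the following sketch; `BYTES_PER_BLOCK_FRAME` and the unit time of 2 s are hypothetical values chosen only to make the arithmetic concrete.

```python
# Rough sketch of how the buffer capacity for step S2106 follows from the
# unit time, the two frame rates and the two pixel-block counts.
# All constants are illustrative assumptions, not values from the patent.

UNIT_TIME_S = 2                        # "a few seconds"; example value
ATTENTION_FPS, ATTENTION_BLOCKS = 180, 28
PERIPHERAL_FPS, PERIPHERAL_BLOCKS = 60, 98
BYTES_PER_BLOCK_FRAME = 4096           # assumed payload per block per frame

attention_bytes = (UNIT_TIME_S * ATTENTION_FPS
                   * ATTENTION_BLOCKS * BYTES_PER_BLOCK_FRAME)
peripheral_bytes = (UNIT_TIME_S * PERIPHERAL_FPS
                    * PERIPHERAL_BLOCKS * BYTES_PER_BLOCK_FRAME)
total_bytes = attention_bytes + peripheral_bytes
```

The two partial sums also give the split of the capacity between the attention-area region and the peripheral-area region of the work memory.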
As described above, image signals can be obtained at a high frame rate from the attention area 2172 containing the main subject 2171 while the peripheral area 2176 is kept at a low frame rate, so the amount of data can be reduced. Compared with high-speed readout from all pixels, the drive and image-processing load can thus be reduced, suppressing power consumption and heat generation.
When the next unit time starts in the example of Figure 41, the pixel blocks 2131 are reselected in step S2102 and the area information and size information are updated. The attention area 2172 can thus be updated successively so as to follow the main subject 2171. In the example of Figure 45, in the first frame A7 of the next unit time of the attention-area moving image, an attention area 2182 composed of pixel blocks 2131 different from those of the last frame A6 of the previous unit time is selected, and the area information 2184 and the peripheral area 2186 are updated accordingly.
Figure 46 shows an example of the header information attached by the moving-image generation unit. The header information of Figure 46 includes an attention-area moving-image ID identifying the attention-area moving image, the frame rate of the attention-area moving image, a peripheral-area moving-image ID identifying the corresponding peripheral-area moving image, the frame rate of the peripheral-area moving image, the timing information, the area information and the size information. This header information may be attached to either one of the attention-area moving image and the peripheral-area moving image, or to both.
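A minimal in-memory representation of the header fields of Figure 46 could look like this sketch; the field names and types are illustrative assumptions, not the patent's recorded format.

```python
# Hypothetical representation of the header information of Figure 46.
from dataclasses import dataclass

@dataclass
class MovieHeader:
    attention_movie_id: str    # identifies the attention-area moving image
    attention_fps: int         # frame rate of the attention-area movie
    peripheral_movie_id: str   # identifies the matching peripheral movie
    peripheral_fps: int        # frame rate of the peripheral-area movie
    timing_info: float         # relative output timing of the two streams
    area_info: tuple           # upper-left block of attention area 2172
    size_info: tuple           # attention-area size in blocks

# example header for the pair of Figures 42-45
header = MovieHeader("movie-A", 180, "movie-B", 60, 0.0, (9, 2), (7, 4))
```

Either stream (or both) could carry such a header, which is what lets playback locate the matching stream by ID or by timing information.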
Figure 47 is a flowchart of the operation by which the imaging device reproduces and displays a moving image. This operation starts when the user selects one of the attention-area moving images displayed as thumbnails on the display unit 2506 and presses the playback button.
The moving-image synthesis unit 2156 reads the data of the attention-area moving image selected by the user from the recording unit 2505 (S2150). The moving-image synthesis unit 2156 then reads the data of the peripheral-area moving image corresponding to that attention-area moving image from the recording unit 2505 (S2152).
In this case, the moving-image synthesis unit 2156 identifies the peripheral-area moving image from the peripheral-area moving-image ID shown in the header information of the attention-area moving image read in step S2150. Alternatively, it may identify the peripheral-area moving image by searching for one that contains, as header information, the same timing information as that shown in the header.
In the above example the header information is contained in the attention-area moving image. When, conversely, the header information is contained not in the attention-area moving image but in the peripheral-area moving image, the user may first select and read the peripheral-area moving image in step S2150, and the attention-area moving image to be read in step S2152 may be identified from its header information.
The moving-image synthesis unit 2156 synthesizes a frame of the display moving image from a frame of the attention-area moving image and a frame of the peripheral-area moving image (S2154). In this case, first, the first frame A1 of the attention-area moving image is embedded at the position indicated by the area information 2174 in the first frame B1 of the peripheral-area moving image, producing the first frame C1 of the display moving image. As shown in Figure 45, the moving-image synthesis unit 2156 displays the first frame C1 of the display moving image on the display unit 2506 (S2156).
The moving-image synthesis unit 2156 judges whether a next frame of the attention-area moving image exists before the next frame B2 of the peripheral-area moving image (S2158). When such a frame exists (S2158: YES), the moving-image synthesis unit 2156 updates the attention area 2172 with the subsequent frames A2, A3 while holding the peripheral area 2176 at the previous frame B1, synthesizing the subsequent frames C2, C3 of the display moving image (S2162) and displaying them in sequence (S2156).
On the other hand, when in step S2158 no next frame of the attention-area moving image exists before the next frame B2 of the peripheral-area moving image (S2158: NO), the moving-image synthesis unit 2156 updates the attention area 2172 with the next frame A4 and also updates the peripheral area 2176 with the next frame B2 (S2164), synthesizing the next frame C4 of the display moving image (S2162) and displaying it (S2156).
As long as a next frame of the peripheral area 2176 exists in the peripheral-area moving image (S2160: YES), steps S2154 to S2160 are repeated. When no next frame of the peripheral area 2176 exists in the peripheral-area moving image (S2160: NO), the moving-image synthesis unit 2156 searches for a pair of an attention-area moving image and a peripheral-area moving image for the unit time following the unit time of the current pair (S2166). For example, the moving-image synthesis unit 2156 searches the same folder of the recording unit 2505 for an attention-area moving image whose header information contains timing information indicating the timing immediately after that shown in the timing information of the current attention-area moving image.
As long as a pair of an attention-area moving image and a peripheral-area moving image exists for the next unit time (S2166: YES), steps S2150 to S2166 are repeated. When no such pair exists for the next unit time (S2166: NO), this operation ends.
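The frame pairing of the synthesis loop above (steps S2154 to S2164, with three attention frames per peripheral frame as in Figure 45) can be sketched as follows. The frame labels and the `embed` placeholder are illustrative only; `embed` stands in for pasting the attention frame at the position given by the area information 2174.

```python
# Sketch of the playback synthesis of Figure 47: each peripheral frame B
# is held while the three attention frames A that fall within its period
# are embedded in turn, yielding display frames C1, C2, ...

def embed(peripheral_frame, attention_frame):
    """Placeholder for embedding the attention frame into the peripheral frame."""
    return peripheral_frame + "+" + attention_frame

attention = ["A1", "A2", "A3", "A4", "A5", "A6"]   # 180 fps stream
peripheral = ["B1", "B2"]                          # 60 fps stream
RATIO = 3   # attention frames per peripheral frame (180 / 60)

display = []
for i, a in enumerate(attention):
    b = peripheral[i // RATIO]    # hold the current peripheral frame
    display.append(embed(b, a))   # C1, C2, C3 use B1; C4, C5, C6 use B2
```

The index arithmetic `i // RATIO` captures the branch of steps S2158/S2164: the peripheral frame advances only once every `RATIO` attention frames.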
As described above, the overall data amount can be reduced while a smooth moving image is displayed for the attention area 2172 containing the main subject 2171. In step S2162 the frame of the display image is synthesized by directly updating the attention area 2172 with its next frame, but the synthesis method is not limited to this. As another example, the outline of the main subject 2171 within the attention area 2172 may be identified by image processing; only the main subject 2171 enclosed by that outline is updated to the next frame, while the region inside the attention area 2172 but outside the outline of the main subject 2171 is held at the previous frame and synthesized with the frame of the peripheral area 2176. That is, the region inside the attention area 2172 but outside the outline may be dropped to the frame rate of the peripheral area 2176. This prevents the boundary between portions of differing smoothness in the displayed moving image from looking unnatural. The playback frame rates need not equal the frame rates at capture (180 fps for the attention area, 60 fps for the peripheral area); for example, the attention area may be played at 60 fps and the peripheral area at 20 fps, in which case the result is slow-motion playback.
Figure 48 is a flowchart of another example of the operation by which the imaging device generates and records a moving image. In Figure 48, operations identical to those of Figure 41 carry the same reference numerals and their description is omitted.
In the operation of Figure 48, the thinning rates of the attention area 2172 and the peripheral area 2176 are made different instead of, or in addition to, the frame rates of Figure 41. More specifically, in step S2120 the drive unit 2502 performs charge accumulation and image-signal output for the pixels remaining after thinning at a low thinning rate in the pixel blocks 2131 of the attention area 2172, and for the pixels remaining after thinning at a high thinning rate in the pixel blocks 2131 of the peripheral area 2176. For example, the thinning rate of the pixel blocks 2131 of the attention area 2172 is set to 0, i.e. all pixels are read, while the thinning rate of the pixel blocks 2131 of the peripheral area 2176 is set to 0.5, i.e. half the pixels are read.
In this case, the drive unit 2502 obtains image signals at the different thinning rates by separately driving the group of reset transistors 2303, transfer transistors 2302 and selection transistors 2305 of the pixel blocks 2131 of the peripheral area 2176 and the corresponding group of the pixel blocks 2131 of the attention area 2172.
In step S2110, the moving-image generation unit 2154 generates the attention-area moving image corresponding to the attention area 2172 based on the image signals output from the attention area 2172 at the low thinning rate. It likewise generates the peripheral-area moving image corresponding to the peripheral area 2176 based on the image signals output from the peripheral area 2176 at the high thinning rate. In step S2112, the moving-image generation unit 2154 attaches the respective thinning-rate information and records the attention-area moving image and the peripheral-area moving image in the recording unit 2505.
Figure 49 shows an example of the pixels 2188 of one pixel block read at a thinning rate of 0.5. In the example of Figure 49, when the pixel blocks 2131 of the peripheral area 2176 use a Bayer array, rows are thinned in the vertical direction per Bayer-array unit, i.e. in units of two pixel rows, so that pixels 2188 to be read and pixels not to be read alternate every two rows. Thinned readout can thus be performed without upsetting the color balance.
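The Bayer-aware row selection just described can be sketched as follows; the function name is illustrative. Keeping or skipping rows in pairs means every kept pair still carries a complete R/G/B set of the Bayer pattern.

```python
# Sketch of the vertical thinning of Figure 49 (thinning rate 0.5):
# rows are kept or skipped per Bayer unit of two pixel rows, so the
# color balance of the read-out pixels is preserved.

def rows_to_read(n_rows):
    """Rows kept when every other two-row Bayer unit is skipped."""
    return [r for r in range(n_rows) if (r // 2) % 2 == 0]

kept = rows_to_read(8)   # rows 0, 1 read; 2, 3 skipped; 4, 5 read; ...
```

For eight rows this keeps rows 0, 1, 4 and 5, i.e. exactly half the rows, in alternating two-row groups.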
Figure 50 is a flowchart, corresponding to Figure 48, of the operation by which the imaging device reproduces and displays the moving image. In Figure 50, operations identical to those of Figure 47 carry the same reference numerals and their description is omitted.
In step S2170 of Figure 50, the moving-image synthesis unit 2156 interpolates the pixels of the frame of the peripheral-area moving image to match its resolution to that of the frame of the attention-area moving image, and then embeds the frame of the attention-area moving image in the frame of the peripheral-area moving image to synthesize the frame of the display image. Image signals can thus be obtained at high resolution from the attention area 2172 containing the main subject 2171 while the peripheral area 2176 is kept at low resolution, so the amount of data can be reduced. Compared with high-speed readout from all pixels, the drive and image-processing load can be reduced, suppressing power consumption and heat generation.
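A minimal sketch of the resolution-matching step, assuming the peripheral frame was thinned vertically at rate 0.5 and using simple row duplication (nearest-neighbour) to stand in for whatever interpolation the device actually applies:

```python
# Sketch of step S2170: a half-height peripheral frame is interpolated
# back to full height before the attention frame is embedded.  Row
# duplication is used here as the simplest stand-in for interpolation.

def upsample_rows(frame):
    """Duplicate each row so a half-height frame matches full height."""
    out = []
    for row in frame:
        out.append(list(row))
        out.append(list(row))   # nearest-neighbour vertical interpolation
    return out

half = [[10, 20], [30, 40]]     # 2 rows read after 0.5 thinning
full = upsample_rows(half)      # 4 rows, matching the attention area
```

A real implementation would more likely use bilinear or similar interpolation, but the shape of the operation, doubling the thinned dimension, is the same.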
In the examples of Figures 35 to 50 the attention area 2172 is rectangular, but its shape is not limited to this. As long as the attention area 2172 follows the boundary lines of the pixel blocks 2131, it may be a convex polygon, a concave polygon, an annular shape with the peripheral area 2176 embedded inside it, and so on. Multiple attention areas 2172 may also be provided apart from one another; in that case, different frame rates may be set for the individual attention areas 2172.
The frame rates of the attention area 2172 and the peripheral area 2176 may be variable. For example, the amount of movement of the main subject 2171 may be detected every unit time, and a higher frame rate may be set for the attention area 2172 the larger the amount of movement. The selection of the pixel blocks 2131 to be contained in the attention area 2172 may also be updated at any time within the unit time so as to follow the main subject 2171.
Moving-image generation in Figures 41 and 48 starts when the user presses the record button, and moving-image playback in Figures 47 and 50 starts when the user presses the playback button, but the start timings are not limited to these. As another example, the generation and playback operations may be executed continuously in response to the user's button operation so that the display unit 2506 shows a live view image (also called live-view display). In this case, a display that lets the user identify the attention area 2172 may be superimposed: for example, a frame may be displayed along the boundary of the attention area 2172 on the display unit 2506, or the brightness of the peripheral area 2176 may be lowered or the brightness of the attention area 2172 raised.
In the operation of Figure 48, the thinning rates of the attention area 2172 and the peripheral area 2176 are made different. Instead of making the thinning rates different, the number of rows whose pixel signals are added together may be made different. For example, in the attention area 2172 the row count is 1, i.e. pixel signals are output without adding adjacent rows, while in the peripheral area 2176 the row count is larger than in the attention area 2172, e.g. 2, so that the pixel signals of same-column pixels of two adjacent rows are added and output. As in Figure 48, the resolution of the attention area 2172 can thus be kept higher than that of the peripheral area 2176 while the overall signal amount is reduced.
The moving-image synthesis unit 2156 need not be provided in the image processing unit 2511 of the imaging device 2500; it may instead be provided in an external display device such as a PC. Moreover, the above embodiment is not limited to generating moving images and is also applicable to generating still images.
In the above embodiment the multiple pixel blocks 2131 are all divided into two regions, the attention area 2172 and the peripheral area 2176, but the division is not limited to this; three or more regions may be used. In that case, the pixel blocks 2131 at the boundary between the attention area 2172 and the peripheral area 2176 may be treated as a boundary region, and this boundary region may be controlled with values intermediate between the control-parameter values used for the attention area 2172 and those used for the peripheral area 2176. This prevents the boundary between the attention area 2172 and the peripheral area 2176 from looking unnatural.
The charge-accumulation times, accumulation counts and the like may also be made different between the attention area 2172 and the peripheral area 2176. In this case the attention area 2172 and the peripheral area 2176 may be divided based on brightness, and an intermediate region may be provided.
Figures 51A and 51B illustrate an example scene and its division into regions. Figure 51A shows a scene captured by the shooting area of the imaging chip 2113. Specifically, it contains a shadow subject 2601 and an intermediate subject 2602 belonging to an indoor environment, together with a highlight subject 2603 of the outdoor environment seen inside the window frame 2604. With a conventional image sensor, when photographing a scene such as this with a large difference in brightness from the highlight portion to the shadow portion, the shadow portion is crushed to black if charge accumulation is performed with the highlight portion as the reference, and the highlight portion is blown out to white if it is performed with the shadow portion as the reference. That is, for a scene with a large difference in brightness, the dynamic range of the photodiode is insufficient to output image signals for both the highlight portion and the shadow portion with a single uniform charge accumulation. In the present embodiment, therefore, the scene is divided into partial regions such as a highlight portion and a shadow portion, and the charge-accumulation counts of the photodiodes corresponding to the respective regions are made different from one another, thereby effectively expanding the dynamic range.
Figure 51B shows the division into regions of the pixel region of the imaging chip 2113. The arithmetic unit 2512 analyzes the scene of Figure 51A captured by the photometer 2503 and divides the pixel region using brightness as the criterion. For example, the system control unit 2501 causes the photometer 2503 to acquire the scene multiple times while changing the exposure time, and the arithmetic unit 2512 determines the dividing lines of the pixel region by referring to how the distributions of the blown-out and crushed regions change. In the example of Figure 51B, the arithmetic unit 2512 divides the pixel region into three regions: a shadow region 2611, an intermediate region 2612 and a highlight region 2613.
The dividing lines are defined along the boundaries of the pixel blocks 2131; that is, each divided region contains an integer number of groups. The pixels of the groups contained in the same region perform the same number of charge accumulations and pixel-signal outputs within the period corresponding to the shutter speed determined by the arithmetic unit 2512. Pixels belonging to different regions perform different numbers of charge accumulations and pixel-signal outputs.
Figure 52 illustrates the charge-accumulation control performed for each region divided as in the example of Figures 51A and 51B. On receiving a shooting-preparation instruction from the user, the arithmetic unit 2512 determines the shutter speed T0 from the output of the photometer 2503. It divides the pixel region into the shadow region 2611, the intermediate region 2612 and the highlight region 2613 as described above, and determines the charge-accumulation count of each from its respective brightness information. The charge-accumulation count is determined so that the pixels do not saturate in any single accumulation; for example, it is determined so that each accumulation stores roughly 80 to 90 percent of the storable charge.
Here, the shadow region 2611 accumulates once; that is, its single charge-accumulation time is made to coincide with the determined shutter speed T0. The charge-accumulation count of the intermediate region 2612 is set to two; that is, a single accumulation time of T0/2 is repeated twice within the shutter period T0. The charge-accumulation count of the highlight region 2613 is set to four; that is, a single accumulation time of T0/4 is repeated four times within the shutter period T0.
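The per-region schedule just described can be sketched as follows; the function name is illustrative, and the shutter speed is normalized to 1.

```python
# Sketch of the per-region charge-accumulation schedule of Figure 52:
# the shadow region accumulates once for T0, the intermediate region
# twice for T0/2, and the highlight region four times for T0/4.

T0 = 1.0   # normalized shutter speed

def schedule(n_accumulations):
    """Start times of each accumulation within the shutter period T0."""
    dt = T0 / n_accumulations
    return [i * dt for i in range(n_accumulations)]

shadow = schedule(1)      # starts at t = 0 only
middle = schedule(2)      # starts at t = 0 and T0/2
highlight = schedule(4)   # starts at t = 0, T0/4, T0/2, 3T0/4
```

These start times are exactly the instants t = 0, T0/4, T0/2 and 3T0/4 at which the drive unit applies transfer and reset pulses in the description that follows.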
When a shooting instruction is received from the user at time t = 0, the drive unit 2502 applies a reset pulse and a transfer pulse to the pixels of the groups belonging to every region. Triggered by this application, all pixels start charge accumulation.
At time t = T0/4, the drive unit 2502 applies a transfer pulse to the pixels of the groups belonging to the highlight region 2613. It then applies selection pulses in sequence to the pixels within each group and outputs their pixel signals to the output wiring 2309. After the pixel signals of all pixels in the groups have been output, the drive unit 2502 again applies a reset pulse and a transfer pulse to the pixels of the groups belonging to the highlight region 2613, starting the second charge accumulation.
Since the selective output of the pixel signals takes time, a time difference can arise between the end of the first charge accumulation and the start of the second. If this time difference is substantially negligible, the single charge-accumulation time is, as described above, the shutter speed T0 divided by the charge-accumulation count. If it cannot be neglected, the shutter speed T0 is adjusted to take it into account, or the single charge-accumulation time is made shorter than the shutter speed T0 divided by the charge-accumulation count.
At time t = T0/2, the drive unit 2502 applies a transfer pulse to the pixels of the groups belonging to the intermediate region 2612 and of the groups belonging to the highlight region 2613. It then applies selection pulses in sequence to the pixels within each group and outputs their pixel signals to the output wiring 2309. After the pixel signals of all pixels in the groups have been output, the drive unit 2502 again applies a reset pulse and a transfer pulse to the pixels of the groups belonging to the intermediate region 2612 and to the highlight region 2613, starting the second charge accumulation for the intermediate region 2612 and the third charge accumulation for the highlight region 2613.
At time t = 3T0/4, the drive unit 2502 applies a transfer pulse to the pixels of the groups belonging to the highlight region 2613. It then applies selection pulses in sequence to the pixels within each group and outputs their pixel signals to the output wiring 2309. After the pixel signals of all pixels in the groups have been output, the drive unit 2502 again applies a reset pulse and a transfer pulse to the pixels of the groups belonging to the highlight region 2613, starting the fourth charge accumulation.
At time t = T0, the drive unit 2502 applies a transfer pulse to the pixels of all regions. It then applies selection pulses in sequence to the pixels within each group and outputs their pixel signals to the output wiring 2309. Through the above control, the pixel memories 2414 corresponding to the shadow region 2611 each store the pixel signal of one accumulation, the pixel memories 2414 corresponding to the intermediate region 2612 each store the pixel signals of two accumulations, and the pixel memories 2414 corresponding to the highlight region 2613 each store the pixel signals of four accumulations.
Alternatively, the drive unit 2502 may apply reset pulses in sequence to the pixels of the groups belonging to the respective regions, resetting them in sequence, and may then apply transfer pulses in sequence to the reset pixels of the groups. Triggered by these applications, the pixels of the groups start charge accumulation in sequence. After charge accumulation has finished for the pixels of the groups belonging to all regions, the drive unit 2502 may apply a transfer pulse to the pixels of all regions; then, after applying selection pulses in sequence to the pixels within each group, it may output their pixel signals to the output wiring 2309.
These pixel signals are transmitted in sequence to the image processing unit 2511. The image processing unit 2511 generates high-dynamic-range image data from these pixel signals. The specific processing will be described later.
Figure 53 shows the relation between the accumulation count and the dynamic range. The image processing unit 2511 integrates the pixel data corresponding to the repeated charge accumulations, and uses the result to form part of the high-dynamic-range image data.
Taking as the reference the dynamic range of a region with an accumulation count of 1, that is, a region that performed a single charge accumulation, a region with an accumulation count of 2 — two charge accumulations whose output signals are integrated — gains one stop of dynamic range. Similarly, an accumulation count of 4 gains two stops, and an accumulation count of 128 gains seven stops. In other words, to expand the dynamic range by n stops, it suffices to integrate 2^n output signals.
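The relation between the accumulation count and the dynamic-range gain described above can be sketched as follows; this is an illustrative helper with a hypothetical name, not part of the patent:

```python
def accumulations_for_stops(n_stops: int) -> int:
    """Number of charge accumulations whose output signals must be
    integrated to expand dynamic range by n_stops (one stop per
    doubling of the integrated signal)."""
    return 2 ** n_stops

# Matches the figures in the text: 1 stop -> 2, 2 stops -> 4, 7 stops -> 128.
```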
Here, so that the image processing unit 2511 can identify how many charge accumulations each divided region underwent, a 3-bit exponent indicating the accumulation count is attached to the image signal. As shown in the figure, the exponent bits are assigned in order: 000 for an accumulation count of 1, 001 for 2, and so on up to 111 for 128.
The image processing unit 2511 refers to the exponent bits of each pixel datum received from the arithmetic circuit 2415, and performs integration of the pixel data when the referenced accumulation count is 2 or more. For example, when the accumulation count is 2 (one stop), the high-order 11 bits of the two 12-bit pixel data corresponding to the charge accumulations are added together to generate one 12-bit pixel datum. Similarly, when the accumulation count is 128 (seven stops), the high-order 5 bits of the 128 corresponding 12-bit pixel data are added together to generate one 12-bit pixel datum. That is, the high-order bits obtained by subtracting from 12 the number of stops corresponding to the accumulation count are added together to generate one 12-bit pixel datum; the low-order bits excluded from the addition are discarded.
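The high-order-bit addition described above can be modeled as a right shift followed by a sum. The sketch below assumes 12-bit pixel data; the function name is hypothetical:

```python
def integrate_12bit(pixels, n_stops):
    """Add the high-order (12 - n_stops) bits of 2**n_stops pixel
    data.  The right shift drops the low-order bits excluded from
    the addition, so the sum fits back into one 12-bit datum."""
    assert len(pixels) == 2 ** n_stops
    total = sum(p >> n_stops for p in pixels)
    assert total < 2 ** 12  # result is one 12-bit pixel datum
    return total
```

For example, two saturated 12-bit values (4095 each) integrate at one stop to 2047 + 2047 = 4094, which still fits in 12 bits.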
By processing in this way, the luminance range to which gradation is assigned can be shifted toward the high-luminance side in accordance with the accumulation count. That is, the 12 bits are allocated to a limited range on the high-luminance side. Gradation can therefore be given to image regions that would previously have been completely white.
However, since the 12 bits are allocated to different luminance ranges in the different divided regions, the image data cannot be generated simply by stitching the regions together. The image processing unit 2511 therefore performs requantization, taking the highest-luminance pixel and the lowest-luminance pixel as references, so that the entire region becomes 12-bit image data while preserving as much of the obtained gradation as possible. Specifically, quantization is performed with a gamma conversion so that the gradation is maintained smoothly. By such processing, high-dynamic-range image data can be obtained.
The accumulation count is not limited to being attached to the pixel data as 3-bit exponent bits as described above; it may instead be described as additional information separate from the pixel data. Alternatively, the exponent bits may be omitted from the pixel data, and the accumulation count obtained at the time of the addition processing by counting the number of pixel data stored in the pixel memory 2414.
In the image processing described above, requantization is performed so that the entire region fits into 12-bit image data, but the number of output bits may instead be increased relative to the bit count of the pixel data in accordance with the upper-limit accumulation count. For example, if the upper-limit accumulation count is set to 16 (four stops), the entire region may be made into 16-bit image data for the 12-bit pixel data. By processing in this way, image data can be generated without dropping any bits.
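With a capped accumulation count, dropping of low-order bits can be avoided by widening the output word instead, as the following sketch illustrates (hypothetical names; 12-bit input assumed):

```python
def integrate_lossless(pixels, max_stops=4):
    """With an upper-limit accumulation count of 2**max_stops
    (16 when max_stops is 4), full 12-bit values can be summed
    without discarding bits; the result needs at most
    12 + max_stops output bits."""
    assert len(pixels) <= 2 ** max_stops
    total = sum(pixels)
    assert total < 2 ** (12 + max_stops)
    return total
```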
Next, a series of imaging-operation processing will be described. Figure 54 is a flowchart of the imaging-operation processing. The flow starts when the power of the imaging device 2500 is turned on.
In step S2201, the system control unit 2501 waits until switch SW1 is pressed to instruct imaging preparation. When pressing of switch SW1 is detected, the flow proceeds to step S2202.
In step S2202, the system control unit 2501 performs photometry. Specifically, it obtains the output of the photometer 2503, and the arithmetic unit 2512 calculates the luminance distribution of the scene. The flow then proceeds to step S2203, where the shutter speed, the region division, the accumulation counts, and so on are determined as described above.
When the imaging preparation is complete, the flow proceeds to step S2204 and waits until switch SW2 is pressed to instruct imaging. If the elapsed time exceeds a preset time Tw (YES in step S2205), the flow returns to step S2201. If pressing of switch SW2 is detected before Tw is exceeded (NO in step S2205), the flow proceeds to step S2206.
In step S2206, the drive unit 2502, receiving an instruction from the system control unit 2501, performs the charge accumulation processing and signal readout processing described with Figure 52. When all the signals have been read out, the flow proceeds to step S2207, where the image processing described with Figure 53 is performed, followed by recording processing that records the generated image data in the recording unit.
When the recording processing is complete, the flow proceeds to step S2208, where it is judged whether the power of the imaging device 2500 has been turned off. If the power has not been turned off, the flow returns to step S2201; if it has, the series of imaging-operation processing ends.
Figure 55 is a block diagram showing a specific configuration as an example of the signal processing chip 2111. The pixel data processing unit 2910 shown in Figure 55 is provided for each pixel block 2131. However, as with the arithmetic circuit 2415 described in relation to Figure 38, the pixel data processing unit 2910 may instead be provided for each pixel, or for each set of two or more pixels. Within the pixel data processing unit 2910, the components other than the arithmetic circuit 2415 may also be provided for each pixel block 2131.
The control unit 2740 in the signal processing chip 2111 of this example takes on part or all of the functions of the drive unit 2502. The control unit 2740 includes, as shared control functions, a sensor control unit 2441, a block control unit 2442, a synchronization control unit 2443, and a signal control unit 2444, together with a drive control unit 2420 that supervises these control units. The drive control unit 2420 converts instructions from the system control unit 2501 into control signals executable by each control unit and delivers them to each control unit.
The sensor control unit 2441 is responsible for outputting, to the imaging chip 2113, the control pulses related to the charge accumulation and charge readout of each pixel. Specifically, it controls the start and end of charge accumulation by outputting reset pulses and transfer pulses to the target pixels, and causes the pixel signals to be output to the output wiring 2309 by outputting selection pulses to the pixels being read.
The block control unit 2442 outputs, to the imaging chip 2113, specifying pulses that identify the pixel blocks 2131 to be controlled. As described with Figure 51B and elsewhere, each of the regions divided into the attention region 2172 and the peripheral region 2176 may include a plurality of mutually adjacent pixel blocks 2131. The pixel blocks 2131 belonging to the same region form one block group. The pixels included in the same group start charge accumulation at the same timing and end it at the same timing. The block control unit 2442 therefore plays the role of grouping the pixel blocks 2131 by outputting specifying pulses to the target pixel blocks 2131 based on designations from the drive control unit 2420. The transfer pulse and reset pulse that each pixel receives via the TX wiring 2307 and the reset wiring 2306 are the logical AND of the respective pulses output by the sensor control unit 2441 and the specifying pulse output by the block control unit 2442.
In this way, by controlling each region as an independent block group, the charge accumulation control described with Figure 52 is achieved. The drive control unit 2420 may apply reset pulses and transfer pulses at different timings to the pixels included in the same group. It may also, after the pixels included in the same group have ended charge accumulation at the same timing, apply selection pulses sequentially to the pixels in the block group and read out the respective pixel signals in turn.
The synchronization control unit 2443 outputs a synchronizing signal to the imaging chip 2113. Each pulse becomes effective in the imaging chip 2113 in synchronization with this signal. For example, by adjusting the synchronizing signal, random control, thinning control, and the like, in which only specific pixels among those belonging to the same pixel block 2131 become control targets, can be achieved.
The signal control unit 2444 is mainly responsible for timing control of the A/D converter 2412. A pixel signal output via the output wiring 2309 is input to the A/D converter 2412 through the CDS circuit 2410 and the multiplexer 2411. Controlled by the signal control unit 2444, the A/D converter 2412 converts the input pixel signal into digital pixel data. The pixel data converted into a digital signal is delivered to the demultiplexer 2413 and then stored, as a digital pixel value, in the pixel memory 2414 corresponding to each pixel. The pixel memory 2414 is an example of the memory block 2730.
The signal processing chip 2111 has a timing memory 2430 serving as an accumulation control memory. The timing memory 2430 stores block discrimination information about which pixel blocks 2131 are combined to form the block groups of the attention region 2172 and the peripheral region 2176, and accumulation count information about how many times each formed block group repeats charge accumulation. The timing memory 2430 is composed of, for example, flash RAM.
As described above, which pixel blocks 2131 are combined to form a block group is determined by the system control unit 2501 based on the detection result of the luminance distribution of the scene, executed before the series of imaging processing. The determined block groups are distinguished as, for example, a first block group, a second block group, and so on, and each block group is specified by which pixel blocks 2131 it includes. The drive control unit 2420 receives this block discrimination information from the system control unit 2501 and stores it in the timing memory 2430.
The system control unit 2501 also determines, based on the detection result of the luminance distribution, how many times each block group repeats charge accumulation. The drive control unit 2420 receives this accumulation count information from the system control unit 2501 and stores it in the timing memory 2430 paired with the corresponding block discrimination information. By storing the block discrimination information and the accumulation count information in the timing memory 2430 in this way, the drive control unit 2420 can refer to the timing memory 2430 and execute the series of charge accumulation control independently and in sequence. That is, once the drive control unit 2420 receives the imaging-instruction signal from the system control unit 2501 in the acquisition control of a single image, it can thereafter complete the accumulation control for each pixel without receiving an instruction from the system control unit 2501 each time.
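One way to picture the paired contents of the timing memory 2430 is a small table keyed by block group. The structure, group names, and values below are illustrative assumptions for exposition only, not the patent's storage format:

```python
# Illustrative model: each block group pairs block discrimination
# information (which pixel blocks 2131 it contains, as (row, col)
# coordinates) with accumulation count information.
timing_memory = {
    "group_1": {"blocks": [(0, 0), (0, 1)], "accumulations": 1},  # shadow
    "group_2": {"blocks": [(1, 0)], "accumulations": 2},          # intermediate
    "group_3": {"blocks": [(1, 1)], "accumulations": 4},          # highlight
}

def accumulation_count(block):
    """Look up how many charge accumulations the group containing
    `block` repeats."""
    for group in timing_memory.values():
        if block in group["blocks"]:
            return group["accumulations"]
    raise KeyError(block)
```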
The drive control unit 2420 receives, from the system control unit 2501, block discrimination information and accumulation count information updated based on the photometry results (luminance distribution detection results) executed in synchronization with the imaging-preparation instruction, and updates the stored contents of the timing memory 2430 as appropriate. For example, the drive control unit 2420 updates the timing memory 2430 in synchronization with an imaging-preparation instruction or an imaging instruction. With this configuration, faster charge accumulation control can be achieved, and while the drive control unit 2420 is executing the charge accumulation control, the system control unit 2501 can execute other processing in parallel.
The drive control unit 2420 refers to the timing memory 2430 not only when executing the charge accumulation control for the imaging chip 2113 but also when executing the readout control. For example, the drive control unit 2420 refers to the accumulation count information of each block group and stores the pixel data output from the demultiplexer 2413 at the corresponding addresses of the pixel memory 2414.
The drive control unit 2420 reads the target pixel data for each pixel block from the pixel memory 2414 in response to a delivery request from the system control unit 2501, and delivers it to the image processing unit 2511. At that time, the drive control unit 2420 also delivers the additional data corresponding to each target pixel datum to the image processing unit 2511.
The arithmetic circuit 2415 performs predetermined operations, for each pixel block 2131, on the pixel data corresponding to the pixel signals generated by the corresponding pixel block 2131. That is, the arithmetic circuits 2415 are provided in correspondence with the pixel blocks 2131 and perform arithmetic processing for each pixel block 2131. In this example, the arithmetic circuits 2415 and the pixel blocks 2131 are provided in a one-to-one relation: each arithmetic circuit 2415 is a circuit on the signal processing chip 2111 directly beneath its pixel block 2131. The drive control unit 2420 reads the pixel data stored in the pixel memory 2414 out to the arithmetic circuit 2415, which executes the predetermined arithmetic processing.
The pixel memory 2414 is provided with a data transfer interface that transmits pixel data, or the difference data described later, in response to a delivery request. The data transfer interface is connected to a data line 2920 that leads to the image processing unit 2511. The data line 2920 is composed of, for example, a serial bus. In this case, the delivery request from the system control unit 2501 to the drive control unit 2420 is executed by address designation using an address bus.
Using the signal processing chip 2111 of Figure 55, predetermined operations can be performed after pixel data is acquired with different control parameters in the attention region 2172 and the peripheral region 2176. For example, in Figures 41 to 44, a moving image is generated from images acquired at different frame rates in the attention region 2172 and the peripheral region 2176; instead of this, image processing that averages the images acquired at the high frame rate may be performed to improve the S/N ratio. In that case, for example, while the drive control unit 2420 acquires a pixel signal once from the peripheral region 2176, it acquires pixel signals multiple times, for example four times, from the attention region 2172, and stores the pixel data in the pixel memory 2414. The arithmetic circuit 2415 reads, from the pixel memory 2414, the plurality of pixel data obtained for each pixel of the attention region 2172, and averages them pixel by pixel. The random noise in each pixel of the attention region 2172 is thereby reduced, and the S/N ratio of the attention region 2172 can be improved.
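The pixel-by-pixel averaging of the multiple attention-region captures might look like the following sketch (pure-Python nested lists for clarity; the function name is hypothetical):

```python
def average_frames(frames):
    """Average several captures of the attention region pixel by
    pixel.  Random noise falls roughly as the square root of the
    number of frames averaged, improving the S/N ratio."""
    n = len(frames)
    height, width = len(frames[0]), len(frames[0][0])
    return [[sum(f[y][x] for f in frames) / n for x in range(width)]
            for y in range(height)]
```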
The data line 2920 is also connected to a memory 2940. The memory 2940 may be a volatile memory that sequentially stores pixel data from the pixel memory 2414 at designated addresses; for example, the memory 2940 is a DRAM. The memory 2940 stores one frame's worth of RGB data made up of the received pixel data of each pixel block 2131.
The control unit 2740 exchanges data between the arithmetic circuit 2415 corresponding to a given pixel block 2131 and the arithmetic circuits 2415 corresponding to the surrounding pixel blocks 2131. In the example of Figure 55, the drive control unit 2420 transfers data among the plurality of arithmetic circuits 2415. Each arithmetic circuit 2415 receives at least part of the operation results of the other arithmetic circuits 2415 corresponding to other pixel blocks 2131, and may generate its own operation result based on those received results.
The operation result of each pixel block 2131 obtained by the arithmetic processing of the arithmetic circuit 2415 is output to the output circuit 2922. The output circuit 2922 associates the operation result in the arithmetic circuit 2415 with the pixel data and outputs them to the system control unit 2501. Here, outputting in association means outputting the pixel data of a pixel block 2131, the operation result that the arithmetic circuit 2415 computed on that pixel data, and information indicating which pixel block the pixel data subjected to the operation belongs to, linked together.
The data transferred to the system control unit 2501 via the output circuit 2922 are the operation results for each pixel block 2131; as long as the system control unit 2501 can tell what kind of operation produced the received data in each pixel block 2131, it can make use of those data. In this example, the output circuit 2922 attaches a data code indicating the operation content in each arithmetic circuit 2415 to the operation result and outputs it. The data code may be predetermined for each arithmetic circuit 2415. When an arithmetic circuit 2415 can perform a plurality of operations, it preferably notifies the output circuit 2922 of information indicating which operation was performed. That is, the output circuit 2922 generates and outputs, as one data group, the operation content, the operation result, and the control information for each pixel block 2131. A specific example of the data group output by the output circuit 2922 will be described later.
Figure 56 shows a plurality of arithmetic circuits 2415 delivering operation results to one another. For example, the first arithmetic circuit 2415 receives the second evaluation value from the second arithmetic circuit 2415, or receives an intermediate operation result produced in the course of calculating the second evaluation value. In that case, the first arithmetic circuit 2415 computes the first evaluation value based on the second evaluation value or that operation result. Alternatively, each arithmetic circuit 2415 may itself read, from the pixel memory 2414, the pixel signal corresponding to another arithmetic circuit 2415 and perform the operation on that pixel signal. For example, the first arithmetic circuit 2415 reads the second pixel signal corresponding to the second arithmetic circuit 2415; in that case, the first arithmetic circuit 2415 computes the first evaluation value based on the second pixel signal thus read.
In this example, the pixel blocks 2131 corresponding to arithmetic circuit 2415-1, arithmetic circuit 2415-2, and arithmetic circuit 2415-4 are adjacent in the column direction, and the pixel blocks 2131 corresponding to arithmetic circuit 2415-1, arithmetic circuit 2415-3, and arithmetic circuit 2415-5 are adjacent in the row direction. Each arithmetic circuit 2415 receives at least part of the operation results of the other arithmetic circuits 2415 corresponding to the pixel blocks 2131 adjacent to its own pixel block 2131. Here, adjacency is not limited to the row and column directions; pixel blocks 2131 adjacent in a diagonal direction may also be included. This example illustrates adjacency in the row and column directions.
Each pair of adjacent arithmetic circuits 2415 is connected via an output bus that outputs the operation result to the arithmetic circuit 2415 corresponding to the adjacent pixel block 2131, and an input bus that inputs the operation result from the arithmetic circuit 2415 corresponding to the adjacent pixel block 2131. The control unit 2740 causes the arithmetic circuit 2415 corresponding to a pixel block 2131 to generate that pixel block's own operation result based on the operation results from the arithmetic circuits 2415 corresponding to the adjacent pixel blocks 2131.
Figure 57 is a block diagram showing an example of the configuration of the arithmetic circuit 2415. Each arithmetic circuit 2415 has an own-block calculation unit 2912, an average calculation unit 2913, an average-to-average calculation unit 2914, a peripheral-block calculation unit 2911, and a pixel-to-average calculation unit 2915. The input of the own-block calculation unit 2912 is connected to the output of the pixel memory 2414 corresponding to its pixel block 2131, and its output is connected to the input of the average calculation unit 2913, the input of the average-to-average calculation unit 2914, the input of the output circuit 2922, and the arithmetic circuits 2415 corresponding to the adjacent pixel blocks 2131. For example, the own-block calculation unit 2912 outputs the average of the pixel values of each color in the corresponding pixel block 2131.
The peripheral-block calculation unit 2911 has a plurality of inputs, each connected to the output of the arithmetic circuit 2415 corresponding to one of the plurality of pixel blocks 2131 adjacent to its own pixel block 2131. The output of the peripheral-block calculation unit 2911 is connected to the input of the average calculation unit 2913. For example, the peripheral-block calculation unit 2911 may calculate the average of the per-color pixel-value averages received from the other arithmetic circuits 2415, or it may output those received averages directly.
The average calculation unit 2913 has two inputs: one is connected to the output of the own-block calculation unit 2912, and the other is connected to the output of the peripheral-block calculation unit 2911. For example, based on the mean value output by the own-block calculation unit 2912 and the mean value output by the peripheral-block calculation unit 2911, the average calculation unit 2913 outputs the average of the pixel values of each color over the corresponding pixel block 2131 and the adjacent pixel blocks 2131.
The average-to-average calculation unit 2914 has two inputs: one is connected to the output of the average calculation unit 2913, and the other is connected to the output of the own-block calculation unit 2912. The output of the average-to-average calculation unit 2914 is connected to the input of the output circuit 2922. For example, the average-to-average calculation unit 2914 calculates, for each color, the difference between the pixel-value average calculated by the average calculation unit 2913 and the pixel-value average calculated by the own-block calculation unit 2912.
The pixel-to-average calculation unit 2915 has two inputs: one is connected to the output of the average calculation unit 2913, and the other is connected to the output of the pixel memory 2414 corresponding to its pixel block 2131. The output of the pixel-to-average calculation unit 2915 is connected to the input of the pixel memory 2414 corresponding to its pixel block 2131. For example, the pixel-to-average calculation unit 2915 outputs, for each pixel value in its pixel block 2131, the difference between that pixel value and the mean value of the corresponding color among the per-color averages calculated by the average calculation unit 2913.
The control unit 2740 sends the operation result of the own-block calculation unit 2912 to the other arithmetic circuits 2415 and the output circuit 2922. It also sends the operation result of the average-to-average calculation unit 2914 to the output circuit 2922, and feeds the operation result of the pixel-to-average calculation unit 2915 back to the pixel memory 2414 of its pixel block 2131.
Each calculation unit of the arithmetic circuit 2415 can be composed of an adder circuit, a subtractor circuit, and a divider circuit. By simplifying the circuit configuration of the arithmetic circuit 2415 in this way, an arithmetic circuit 2415 can be installed for each pixel block 2131.
Figure 58 is a flowchart explaining an example of the operation of the arithmetic circuit 2415. After the arithmetic circuit 2415 starts operating, in step S2300 the control unit 2740 reads, from the pixel memory 2414 corresponding to its pixel block 2131, the RGB pixel data of that pixel block 2131 captured at its frame rate, and inputs it to the own-block calculation unit 2912. In step S2310, in synchronization with step S2300, the control unit 2740 inputs at least part of the operation results in the adjacent pixel blocks 2131 from the adjacent arithmetic circuits 2415 to the peripheral-block calculation unit 2911. In this example, each arithmetic circuit 2415 calculates the pixel-value average for each of the R, G, and B pixels, and the peripheral-block calculation unit 2911 receives the per-RGB-pixel averages calculated by the adjacent arithmetic circuits 2415.
In step S2320, the control unit 2740 causes the own-block calculation unit 2912 to perform the predetermined operation on the pixel data of its own pixel block 2131. For example, the own-block calculation unit 2912 calculates the addition averages (Ar, Ag, Ab) for each of the R, G, and B pixels of its pixel block 2131. The addition average is calculated by Ai = Σ(i pixels in the pixel block) / (number of i pixels in the pixel block) (i = r, g, b). In step S2322, the control unit 2740 causes the own-block calculation unit 2912 to input the averages (Ar, Ag, Ab) to the input of the output circuit 2922 and to the corresponding arithmetic circuits 2415 of the four adjacent pixel blocks 2131.
In step S2340, the control unit 2740 causes the peripheral-block calculation unit 2911 to calculate, based on the per-RGB-pixel addition averages of the adjacent pixel blocks 2131, the averages (Br, Bg, Bb) over the plurality of adjacent pixel blocks 2131 (the adjacent-pixel-block average). For example, the adjacent-pixel-block average is calculated by Bi = ΣAi / 4 (i = r, g, b), where the number of adjacent pixel blocks 2131 is 4. In step S2350, the control unit 2740 causes the average calculation unit 2913 to perform the predetermined operation on the results received from the other arithmetic circuits 2415 and the operation result of the own-block calculation unit 2912. For example, the average calculation unit 2913 calculates the overall averages (Cr, Cg, Cb) of the four-adjacent-pixel-block averages (Br, Bg, Bb) calculated in step S2340 and the addition averages (Ar, Ag, Ab) of its pixel block 2131 calculated in step S2320. The overall average is calculated by Ci = (Bi + Ai) / 2 (i = r, g, b).
In step S2360, the control unit 2740 causes the average-to-average calculation unit 2914 to calculate the difference values (ΔAr, ΔAg, ΔAb) between the own-block addition averages (Ar, Ag, Ab) calculated by the own-block calculation unit 2912 in step S2320 and the overall averages (Cr, Cg, Cb) calculated by the average calculation unit 2913 in step S2350. The difference value is calculated by ΔAi = Ai − Ci (i = r, g, b). In step S2370, the control unit 2740 causes the average-to-average calculation unit 2914 to input the difference values (ΔAr, ΔAg, ΔAb) to the output circuit 2922. Alternatively, the arithmetic circuit 2415 may omit the average-to-average calculation unit 2914; in that case, the operation result of the average calculation unit 2913 is input to the output circuit 2922 instead.
In step S2380, the control unit 2740 causes the pixel-to-average calculation unit 2915 to calculate the difference values (ΔCr, ΔCg, ΔCb) between the RGB pixel data of its pixel block obtained in step S2300 and the overall averages (Cr, Cg, Cb) calculated by the average calculation unit 2913 in step S2350. The difference value is calculated by ΔCi = Ci − (i pixel in the pixel block) (i = r, g, b). The information of the original pixel data can thereby be preserved using mean values and difference values whose magnitudes are smaller. That is, based on the operation result of the average calculation unit 2913, the pixel data of its own pixel block 2131 can be compressed.
In step S2390, the control unit 2740 feeds the difference values (ΔCr, ΔCg, ΔCb) back to the pixel memory 2414 of its pixel block 2131. In step S2392, the control unit 2740 judges whether to continue the operation; if so, the flow returns to step S2300, and if not, the arithmetic processing ends.
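The chain of formulas in steps S2320 through S2380 can be sketched for a single color channel as follows (hypothetical function name; four adjacent blocks assumed, as in the text):

```python
def block_statistics(own_pixels, neighbor_means):
    """A: own-block addition average (S2320); B: average of the
    adjacent blocks' averages (S2340); C: overall average
    C = (A + B) / 2 (S2350); dA = A - C (S2360); per-pixel
    residuals C - p (S2380)."""
    A = sum(own_pixels) / len(own_pixels)
    B = sum(neighbor_means) / len(neighbor_means)
    C = (A + B) / 2
    dA = A - C
    residuals = [C - p for p in own_pixels]
    return A, B, C, dA, residuals
```

The residuals are what is fed back to the pixel memory 2414 in step S2390; being differences from a local mean, they tend to be smaller in magnitude than the raw pixel values.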
The control unit 2740 executes the above operation of the arithmetic circuit 2415 for each pixel block 2131. The arithmetic circuit 2415 may also perform the predetermined operation on the pixel data of the current frame using the pixel data of the preceding frame. In that case, instead of the per-RGB-pixel mean values of the adjacent pixel blocks 2131, the control unit 2740 supplies the arithmetic circuit 2415 with, for example, the per-RGB-pixel addition averages (Dr, Dg, Db) of its own pixel block 2131 in the preceding frame. The addition average of the preceding frame is obtained by Di = Σ(i pixels in the pixel block of the preceding frame) / (number of i pixels in the pixel block of the preceding frame) (i = r, g, b). The control unit 2740 reads the RGB pixel data of the preceding frame from the memory 2940 and calculates the addition averages (Dr, Dg, Db) in the fourth calculation unit. The rest of the operation is the same as in Figure 58, and its description is therefore omitted.
In this way, according to this example, the operation result and operation content of each pixel block 2131, together with the control information generated for each pixel block 2131 by the control unit 2740, can be conveyed from the pixel block 2131 to the system control unit 2501 via the output circuit 2922. As a result, the load of image processing in the system control unit 2501 can be greatly reduced. In addition, since the arithmetic circuit 2415 need only output the correlation with the pixel data of the surrounding pixel blocks 2131 as the evaluation value of its pixel block 2131, the amount of data to be sent to the system control unit 2501 can be reduced. Furthermore, since the arithmetic circuit 2415 of this example can feed the difference values (ΔCr, ΔCg, ΔCb) back to the pixel memory 2414 corresponding to its pixel block 2131, the amount of data sent to the system control unit 2501 can be reduced accordingly. Moreover, the image processing unit 2511 included in the system control unit 2501 can generate one frame of image data based on the operation results received from each output circuit 2922; compared with storing the RGB pixel data of all the pixel blocks 2131 in the memory 2940 at once and then reading them out again to form one image, the speed of image processing can be improved. The signal processing chip 2111 of this example also has at least part of the image processing functions of the image processing unit 2511. For example, the arithmetic circuit 2415 also functions as an image processing unit that performs, based on each evaluation value, image processing on the image data of the image corresponding to the relevant pixel signals. As one example, this image processing function may be the function of feeding the difference values (ΔCr, ΔCg, ΔCb) back to the pixel memory 2414. Examples of the evaluation value include the average of the pixel signals in the pixel block 2131, the weighted average of the pixel signals inside and outside the pixel block 2131, the contrast in the pixel block 2131, the weighted average of the contrast inside and outside the pixel block 2131, the luminance in the pixel block 2131, and the weighted average of the luminance inside and outside the pixel block 2131. The evaluation value may also be a value obtained by adding the average of the G pixels, the average of the R pixels, and the average of the B pixels in a prescribed ratio. The mean value may also be calculated over a partial region arranged within the unit group.
Figure 59 illustrates an example of the data group 2950 that output circuit 2922 generates based on the input from computing circuit 2415. Data group 2950 has a data code region 2952 and a data region 2954. Four bits are allocated to the data code in data code region 2952; in this example, bits D12 to D15 are allocated to the data code. Twelve bits are allocated in data region 2954 to the additional data corresponding to each data code; in this example, bits D0 to D11 are allocated to the additional data. The data group 2950 is not limited to 16 bits, and the numbers of bits allocated to the data code and to the additional data may be set arbitrarily.
Control part 2740 may output the operation result data from computing circuit 2415 and the pixel data from pixel memory 2414 over different routes. For example, control part 2740 may send the operation result of computing circuit 2415 to systems control division 2501 through output circuit 2922, while storing the pixel data of pixel memory 2414 into memory 2940 via data line 2920. In another example, control part 2740 may append the operation result for a block of pixels 2131 to the pixel data of that block of pixels 2131 and send them together from output circuit 2922 to systems control division 2501.
An example in which the mean of pixel values is calculated has been described above, but the operation content of computing circuit 2415 is not limited to this. The parameters used by computing circuit 2415 may include information other than pixel values. For example, computing circuit 2415 may perform a prescribed operation using parameters such as the position of a pixel in the XY plane, distance information to the subject, the f-number, the charge storage time of PD 2104, the charge-voltage conversion gain of the block of pixels 2131 in question, and the driving frame rate (frame frequency) of that block of pixels 2131.
Figure 60 illustrates an example of the content of the data group 2950 shown in Figure 59. Sixteen data codes (0 to 9, a to f) are stored in data code region 2952. Data code 0 is assigned to the R pixel addition mean (Ar) of the block of pixels 2131 in question, output as 12 bits of additional data. Data code 1 is assigned to the G pixel addition mean (Ag) of the block, output as 12 bits of additional data. Data code 2 is assigned to the B pixel addition mean (Ab) of the block, output as 12 bits of additional data. Data code 3 is assigned to the difference ΔAr between the overall mean Cr and Ar, data code 4 to the difference ΔAg between the overall mean Cg and Ag, and data code 5 to the difference ΔAb between the overall mean Cb and Ab, each output as 12 bits of additional data. The above is one example of the operation content and operation result data output by computing circuit 2415.
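The 16-bit layout of data group 2950 — a 4-bit data code in bits D15 to D12 and 12 bits of additional data in bits D11 to D0 — can be sketched with simple bit operations. This is a hedged illustration of the format described for Figures 59 and 60, not an implementation from the patent; the code table below lists only the operation-result codes named above.

```python
# Illustrative data-code table for the operation results of Figures 59/60.
DATA_CODES = {
    0x0: "R addition mean Ar",
    0x1: "G addition mean Ag",
    0x2: "B addition mean Ab",
    0x3: "difference of Cr and Ar (dAr)",
    0x4: "difference of Cg and Ag (dAg)",
    0x5: "difference of Cb and Ab (dAb)",
    # codes d, e, f carry the control information (gain, frame rate,
    # storage time) described in the text
}

def pack(code, data):
    """Pack a 4-bit data code and 12-bit additional data into one 16-bit word."""
    assert 0 <= code < 16 and 0 <= data < 4096
    return (code << 12) | data

def unpack(word):
    """Split a 16-bit word into (data code, additional data)."""
    return word >> 12, word & 0x0FFF

w = pack(0x1, 0x7BC)          # G addition mean with value 0x7BC
print(hex(w))                 # 0x17bc
assert unpack(w) == (0x1, 0x7BC)
```

Since the patent notes that the bit allocation may be set arbitrarily, the shift amount and mask would change accordingly for other layouts.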
Data group 2950 also includes the control information of control part 2740. In this example, data code d is assigned to the charge-voltage conversion gain of the block of pixels 2131 in question, data code e to its driving frame rate, and data code f to its charge storage time, each output as 12 bits of additional data. By adding the control information (control log) of control part 2740 to data group 2950, control information expressing how control part 2740 has controlled each block of pixels 2131 can be transmitted from the block-of-pixels side to systems control division 2501.
That is, since systems control division 2501 can receive the data group 2950 illustrated in Figure 59 for each block of pixels 2131, it can easily perform image processing per block of pixels 2131 by accessing, based on the data codes of data group 2950, the per-RGB-pixel difference data of that block of pixels 2131 stored in memory 2940. In other words, because part of the processing of systems control division 2501 is carried out by computing circuit 2415, the pixel-data processing load on systems control division 2501 during moving-image generation can be reduced significantly. Systems control division 2501 can also exploit the content of the data group 2950 output by output circuit 2922 to reduce its own load; for example, when generating a moving image, systems control division 2501 may change the compression ratio for each block of pixels 2131 based on the content of data group 2950.
Figure 61 is a cross-sectional view of another capturing element 3100 of the present embodiment. Capturing element 3100 has a shooting chip 3113 that outputs pixel signals corresponding to incident light, a signal processing chip 3111 that processes the pixel signals, and a storage chip 3112 that stores the pixel signals. Shooting chip 3113, signal processing chip 3111 and storage chip 3112 are stacked together and electrically connected to one another by conductive bumps 3109 of Cu or the like.
As illustrated, incident light enters mainly in the Z-axis positive direction indicated by the white hollow arrow. In this specification, the face of shooting chip 3113 on which the incident light impinges is called the back surface. Further, as indicated by the coordinate axes, the leftward direction of the page orthogonal to the Z axis is the X-axis positive direction, and the direction toward the viewer orthogonal to the Z and X axes is the Y-axis positive direction. In several subsequent figures, coordinate axes are displayed with the axes of Figure 61 as reference so that the orientation of each figure can be understood.
One example of shooting chip 3113 is a back-illuminated MOS image sensor. A PD layer 3106 is arranged on the back-surface side of a wiring layer 3108. PD layer 3106 has a plurality of PDs (photodiodes) 3104 arranged two-dimensionally, which store charge corresponding to the incident light and generate pixel signals corresponding to the stored charge, and transistors 3105 provided in correspondence with the PDs 3104.
On the incident-light side of PD layer 3106, color filters 3102 are provided with a passivation film 3103 interposed. The color filters 3102 are of multiple kinds that transmit mutually different wavelength regions, and have a specific arrangement in correspondence with the PDs 3104. The arrangement of the color filters 3102 will be described later. A set of a color filter 3102, a PD 3104 and a transistor 3105 forms one pixel.
On the incident-light side of the color filters 3102, microlenses 3101 are provided in correspondence with the respective pixels. Each microlens 3101 condenses the incident light toward the corresponding PD 3104.
Wiring layer 3108 has wiring 3107 that transmits the pixel signals from PD layer 3106 to signal processing chip 3111. The wiring 3107 may be multilayered, and passive and active elements may also be provided.
A plurality of bumps 3109 are arranged on the surface of wiring layer 3108. These bumps 3109 are aligned with a plurality of bumps 3109 provided on the opposing face of signal processing chip 3111; by pressing shooting chip 3113 and signal processing chip 3111 together, the aligned bumps 3109 are joined and electrically connected.
Similarly, pluralities of bumps 3109 are arranged on the mutually opposing faces of signal processing chip 3111 and storage chip 3112. These bumps 3109 are aligned with one another; by pressing signal processing chip 3111 and storage chip 3112 together, the aligned bumps 3109 are joined and electrically connected.
The joining between the bumps 3109 is not limited to Cu bump bonding by solid-state diffusion; microbump bonding by solder melting may also be adopted. Roughly one bump 3109 suffices, for example, for each block of pixels described later; the size of the bumps 3109 may therefore be larger than the pitch of the PDs 3104. Bumps larger than the bumps 3109 corresponding to the pixel region may also be provided together in the peripheral region outside the pixel region in which the pixels are arranged.
Signal processing chip 3111 has TSVs (through-silicon vias) 3110 that interconnect circuits provided on its front and back surfaces. The TSVs 3110 are preferably provided in the peripheral region. TSVs 3110 may also be provided in the peripheral regions of shooting chip 3113 and storage chip 3112.
Figure 62 is an explanatory diagram of the pixel arrangement of shooting chip 3113 and of a block of pixels 3131. Figure 62 shows shooting chip 3113 as observed from the back-surface side. A plurality of pixels are arranged in a matrix in pixel region 3700. In Figure 62, 16 adjacent pixels of 4 pixels × 4 pixels form one block of pixels 3131. The ruled lines in the figure illustrate the concept of adjacent pixels being grouped to form a block of pixels 3131. The number of pixels forming a block of pixels 3131 is not limited to this; it may be on the order of 1000, for example 32 pixels × 64 pixels, and may be more or fewer.
As shown in the partial enlarged view of pixel region 3700, a block of pixels 3131 contains four so-called Bayer arrays arranged vertically and horizontally, each consisting of the four pixels: green pixels Gb and Gr, blue pixel B and red pixel R. A green pixel has a green filter as its color filter 3102 and receives light of the green wavelength band of the incident light. Similarly, a blue pixel has a blue filter as its color filter 3102 and receives light of the blue wavelength band, and a red pixel has a red filter as its color filter 3102 and receives light of the red wavelength band.
In the present embodiment, at least one block among the plurality of blocks of pixels 3131 is selected, and the pixels included in each block are controlled with control parameters different from those of the other blocks of pixels. Examples of the control parameters are the frame frequency, the thinning rate, the number of addition rows over which pixel signals are added, the charge storage time or the number of charge storages, and the number of digitization bits. A control parameter may also be a parameter used in the image processing after the image signals are acquired from the pixels. The frame frequency refers to the cycle with which pixel signals are generated. In this specification, the frame frequency may refer to the frame frequency of each block of pixels 3131; for example, the reference frame frequency and the high-speed frame frequency refer to frame frequencies per block of pixels 3131.
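The per-block control parameters listed above can be grouped as a simple record, one instance per block of pixels. This is an illustrative sketch only; the field names, units and example values are assumptions, not taken from the patent.

```python
# Hypothetical per-block control parameter record for a block of pixels 3131.
from dataclasses import dataclass

@dataclass
class BlockControl:
    frame_rate: float        # frame frequency of this block (frames/s)
    thinning_rate: int       # read every n-th row (1 = no thinning)
    addition_rows: int       # number of rows whose pixel signals are added
    storage_time_us: float   # charge storage (exposure) time
    storage_count: int       # number of charge storages per frame
    adc_bits: int            # digitization bit depth

# A block at the reference frame frequency and one at a high-speed
# frame frequency five times as fast (values are illustrative).
base = BlockControl(30.0, 1, 1, 8000.0, 1, 12)
fast = BlockControl(150.0, 1, 1, 1600.0, 1, 12)
print(fast.frame_rate / base.frame_rate)  # 5.0
```

A control part holding one such record per block could then drive each block's reset/transfer timing independently, which is what the per-block wiring of Figure 63 makes possible.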
Figure 63 is a circuit diagram corresponding to a block of pixels 3131 of shooting chip 3113. In the figure, the rectangle enclosed by the dotted line representatively illustrates the circuit corresponding to one pixel. At least some of the transistors described below correspond to the transistors 3105 of Figure 61.
Figure 63 shows a block of pixels 3131 formed of 16 pixels, but the pixel count of a block of pixels 3131 is not limited to this. The 16 PDs 3104 corresponding to the respective pixels are each connected to a transfer transistor 3302, and the gate of each transfer transistor 3302 is connected to TX wiring 3307, which supplies transfer pulses. In the example shown in Figure 63, the TX wiring 3307 is connected in common to the 16 transfer transistors 3302.
The drain of each transfer transistor 3302 is connected to the source of the corresponding reset transistor 3303, and the so-called floating diffusion FD between the drain of transfer transistor 3302 and the source of reset transistor 3303 is connected to the gate of an amplifier transistor 3304. The drain of reset transistor 3303 is connected to Vdd wiring 3310, which supplies the power-supply voltage, and its gate is connected to reset wiring 3306, which supplies reset pulses. In the example shown in Figure 63, the reset wiring 3306 is connected in common to the 16 reset transistors 3303.
The drain of each amplifier transistor 3304 is connected to Vdd wiring 3310, which supplies the power-supply voltage. The source of each amplifier transistor 3304 is connected to the drain of the corresponding selection transistor 3305. The gate of each selection transistor is connected to decoder wiring 3308, which supplies selection pulses. In the example shown in Figure 63, the decoder wiring 3308 is provided independently for each of the 16 selection transistors 3305. The source of each selection transistor 3305 is connected to common output wiring 3309. A load current source 3311 supplies current to the output wiring 3309; that is, the output wiring 3309 for the selection transistors 3305 is formed as a source follower. The load current source 3311 may be provided on the shooting chip 3113 side or on the signal processing chip 3111 side.
The flow from the start of charge storage to the pixel output after the end of storage will now be described. When a reset pulse is applied to reset transistor 3303 through reset wiring 3306 and, at the same time, a transfer pulse is applied to transfer transistor 3302 through TX wiring 3307, the potentials of PD 3104 and of the floating diffusion FD are reset.
When the application of the transfer pulse is released, PD 3104 converts the received incident light into charge and stores it. Thereafter, when a transfer pulse is applied again with no reset pulse being applied, the stored charge is transferred to the floating diffusion FD, and the potential of the floating diffusion FD changes from the reset potential to the signal potential after charge storage. When a selection pulse is then applied to selection transistor 3305 through decoder wiring 3308, the variation of the signal potential of the floating diffusion FD is conveyed to output wiring 3309 via amplifier transistor 3304 and selection transistor 3305. Pixel signals corresponding to the reset potential and the signal potential are thereby output from the unit pixel to output wiring 3309.
In the example shown in Figure 63, the reset wiring 3306 and the TX wiring 3307 are common to the 16 pixels forming the block of pixels 3131. That is, the reset pulse and the transfer pulse are each applied simultaneously to all 16 pixels. All pixels forming the block of pixels 3131 therefore start charge storage at the same timing and end charge storage at the same timing. However, the pixel signals corresponding to the stored charges are selectively output to output wiring 3309 as selection pulses are applied in turn to the respective selection transistors 3305. The reset wiring 3306, the TX wiring 3307 and the output wiring 3309 are provided separately for each block of pixels 3131.
By composing the circuit with the block of pixels 3131 as the unit in this way, the charge storage time can be controlled for each block of pixels 3131. In other words, adjacent blocks of pixels 3131 can output pixel signals obtained with mutually different charge storage times. Put yet another way, while one block of pixels 3131 performs a single charge storage, another block of pixels 3131 can repeat charge storage a number of times and output a pixel signal each time, so that these blocks of pixels 3131 can output frames of a moving image at mutually different frame frequencies.
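The timing relation just described — one block completing several charge storages while another completes one — can be expressed as a small calculation, assuming (as the later examples do) that the high-speed frame frequency is an integer multiple of the reference frame frequency. The function name is illustrative.

```python
# Sketch: how many charge storages a high-speed block completes per
# reference-rate frame, assuming an integer rate ratio.
def storages_per_base_frame(base_fps, fast_fps):
    ratio = fast_fps / base_fps
    assert ratio == int(ratio), "high-speed rate assumed an integer multiple"
    return int(ratio)

# e.g. a 150 fps block alongside 30 fps reference-rate blocks:
print(storages_per_base_frame(30, 150))  # 5
```

This ratio is exactly the figure used in the buffer-allocation examples of Figures 64B and 64C, where a block at five times the reference rate produces four extra frames between reference-rate readouts.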
Figure 64A illustrates part of the structure of capturing element 3100 and an example of its operation. The capturing element 3100 of this example has a storage part 3114 in addition to the structure shown in Figure 61. The storage part 3114 may be provided on signal processing chip 3111, in which case capturing element 3100 need not have storage chip 3112. The storage part 3114 may also be provided on storage chip 3112.
Shooting chip 3113 has a pixel region 3700 in which a plurality of pixels, each generating a pixel signal corresponding to incident light, are arranged. The plurality of pixels may be arranged two-dimensionally in pixel region 3700. Each block of pixels 3131 has m × n pixels in the row and column directions, where m and n are integers of 2 or more. Pixel region 3700 is divided into a plurality of blocks of pixels 3131 in the row and column directions. As shown in Figure 62, a block of pixels 3131 is a pixel set in which a plurality of pixels are arranged in a matrix. The row direction and the column direction are two different directions in the plane of pixel region 3700 and may be mutually orthogonal.
For ease of illustration, Figures 64A to 64C each show three blocks of pixels 3131 in the row direction and three in the column direction, but the number of blocks of pixels 3131 included in pixel region 3700 may be larger. The number of pixels included in each block of pixels 3131 is preferably equal, and preferably fixed across pixel region 3700. A block of pixels 3131 consists of, for example, 32 × 64 pixels.
The signal processing chip 3111 of this example has, for each block of pixels 3131, a multiplexer 3411, an A/D converter 3412, a demultiplexer 3413 and a control part 3740. Multiplexer 3411 selects in turn the pixels included in the corresponding block of pixels 3131 and inputs the pixel signal of the selected pixel to A/D converter 3412. A/D converter 3412 converts the analog pixel signal into digital pixel data and inputs it to demultiplexer 3413. Demultiplexer 3413 stores the pixel data into the storage area corresponding to that pixel in the corresponding memory block 3730. Each memory block 3730 passes the stored pixel data to a computing circuit of a later stage.
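The per-block signal path just described can be sketched as a sequential loop: the multiplexer presents one pixel at a time, the A/D converter digitizes it, and the demultiplexer files the result into the per-pixel slot of the memory block. This is a behavioural sketch under stated assumptions, not the chip's circuit; the 12-bit quantization and full-scale level are assumptions.

```python
# Minimal behavioural sketch of the multiplexer -> A/D converter ->
# demultiplexer chain of Figure 64A for one block of pixels.
def convert_block(analog_signals, full_scale=1.0, bits=12):
    """analog_signals: dict pixel_index -> analog level in [0, full_scale].
    Returns the memory-block contents: pixel_index -> digital code."""
    memory_block = {}
    for idx in sorted(analog_signals):        # multiplexer: one pixel at a time
        level = analog_signals[idx]
        code = min(int(level / full_scale * (2 ** bits - 1)), 2 ** bits - 1)
        memory_block[idx] = code              # demultiplexer: per-pixel slot
    return memory_block

mem = convert_block({0: 0.0, 1: 0.5, 2: 1.0})
print(mem)  # {0: 0, 1: 2047, 2: 4095}
```

In the actual element every block has its own such chain operating in parallel, so the loop here stands in for what is concurrent hardware.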
Storage part 3114 is provided in correspondence with the plurality of blocks of pixels 3131 and has a plurality of memory blocks 3730, each capable of storing the pixel data of the corresponding block of pixels 3131. The memory blocks 3730 correspond one-to-one to the blocks of pixels 3131. A memory block 3730 may be connected to the corresponding block of pixels 3131 via a bus 3720, and may be a buffer memory.
At least some of the memory blocks 3730 may also store pixel data of blocks of pixels other than the corresponding block of pixels 3131. That is, one memory block 3730 may be shared by a plurality of blocks of pixels 3131. In other words, control part 3740 can store the pixel data of one block of pixels 3131 into a plurality of memory blocks 3730. By sharing the memory blocks 3730, the plurality of memory blocks 3730 can be used efficiently as described later, so that the overall memory capacity of storage part 3114 can be kept down.
Preferably, the pixel data of every block of pixels 3131 can be written to and read from at least one memory block 3730 other than the corresponding one; these other memory blocks 3730 may be predetermined for each block of pixels 3131 or may change dynamically. Likewise, every memory block 3730 can preferably write and read the pixel data of at least one block of pixels 3131 other than the corresponding one; these other blocks of pixels 3131 may be predetermined for each memory block 3730 or may change dynamically.
Each memory block 3730 may be a memory provided per block of pixels 3131 in the region of signal processing chip 3111 that overlaps the corresponding block of pixels 3131. That is, a memory block 3730 may be provided in the region of signal processing chip 3111 directly below the corresponding block of pixels 3131, in which case the block of pixels 3131 and the memory block 3730 can be electrically connected by TSVs. The corresponding memory block 3730, A/D converter 3412 and so on are provided in the region of signal processing chip 3111 that overlaps each block of pixels 3131. Alternatively, each memory block 3730 may be a memory provided on signal processing chip 3111 outside the region overlapping pixel region 3700.
When each memory block 3730 and A/D converter 3412 are provided in the region overlapping the corresponding block of pixels 3131, and a memory block 3730 stores pixel data of a block of pixels 3131 other than the corresponding one, either the analog pixel signal or the digital pixel data may be transmitted to the area where that memory block 3730 is located. In the former case, the A/D converter 3412 corresponding to that memory block 3730 converts the pixel signal into pixel data and then inputs it into the memory block 3730. In the latter case, the A/D converter 3412 in the region overlapping the originating block of pixels 3131 converts the pixel signal into pixel data, and the pixel data is then transmitted to the memory block 3730 that is to store it. Signal processing chip 3111 is provided with wiring for transmitting these pixel signals or pixel data.
Figure 64B illustrates another operation example of capturing element 3100. In Figure 64B, the structure of the signal processing chip 3111 shown in Figure 64A is omitted. In this example, among the plurality of blocks of pixels 3131, the pixel data of a block of pixels 3712 is stored in any of the other memory blocks 3731, 3732 and 3733 besides the corresponding memory block 3734. The analog pixel signals generated by block of pixels 3712 are converted into digital pixel data by the A/D converters 3412 corresponding to the other memory blocks 3731 to 3733. Since the pixel data of an arbitrary block of pixels 3712 can thus be stored in the plurality of memory blocks 3731 to 3734, the usage efficiency of the memory can be improved.
For example, the plurality of blocks of pixels 3131 may generate pixel signals of a subject photographed at frame frequencies that differ per block of pixels 3131, at timings corresponding to those frame frequencies. As described later, control part 3740 selects the corresponding memory block 3730 for each block of pixels 3131 according to at least both a reference frame frequency and a high-speed frame frequency whose period is shorter than that of the reference frame frequency. The period of the high-speed frame frequency may be 1/n of the period of the reference frame frequency, where n is an integer. Each block of pixels 3131 can output one frame's worth of pixel signals per period of its frame frequency.
This example illustrates the case where the frame frequency of block of pixels 3712 is five times the reference frame frequency. Roughly simultaneously with a block of pixels 3131 outputting a pixel signal at the reference frame frequency, the block of pixels 3712 at the high-speed frame frequency also outputs a pixel signal. In this case, block of pixels 3712 outputs four further pixel signals before block of pixels 3131 outputs its next pixel signal.
While the blocks of pixels 3131 at the reference frame frequency are not outputting pixel signals, control part 3740 stores the pixel data corresponding to the four pixel signals output by the block of pixels 3712 at the high-speed frame frequency into the plurality of memory blocks 3731 to 3734, respectively. The one frame's worth of pixel data corresponding to the pixel signals that each block outputs in synchronization with the reference frame frequency may be stored in a memory different from the plurality of memory blocks 3730, or may be temporarily stored in the plurality of memory blocks 3730 and then, before the next pixel data of the block of pixels 3712 operating at the high-speed frame frequency is input, passed to a memory or circuit of the stage following the memory blocks 3730. The plurality of memory blocks can thereby be used efficiently.
When pixel data has already been stored in the memory block 3734 corresponding to the block of pixels 3712 at the high-speed frame frequency, control part 3740 stores the pixel data corresponding to that block of pixels 3712 in whichever of the memory blocks 3731, 3732 or 3733 holds no pixel data. That is, control part 3740 allocates the pixel data of the block of pixels 3712 at the high-speed frame frequency to memory blocks 3731, 3732 and 3733 that hold no pixel data, besides the corresponding memory block 3734. The allocated pixel data may carry, as additional data, position data indicating the place of its block of pixels 3712 within pixel region 3700 and frame data indicating the frame to which it belongs. The locations of the memory blocks 3730 to which pixel data is allocated may be fixed for each block of pixels 3712 or may change dynamically. When the locations of the memory blocks 3730 used for allocation are fixed for each block of pixels 3131, the position data can be omitted from the additional data.
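The allocation of Figure 64B can be sketched as a small scheduling function: the four frames a 5×-rate block produces between reference-rate readouts go first to its own memory block, then to neighbouring memory blocks that currently hold no data, each tagged with the position and frame additional data described above. A hedged sketch; the identifiers and tuple layout are illustrative.

```python
# Sketch of the Figure 64B buffer allocation for one high-speed block.
def allocate(extra_frames, own_block, free_blocks):
    """extra_frames: frames produced between reference-rate readouts.
    Returns a list of (memory_block_id, frame, additional_data)."""
    assignments = []
    targets = [own_block] + list(free_blocks)
    for frame_no, frame in enumerate(extra_frames):
        if frame_no >= len(targets):
            raise RuntimeError("not enough free memory blocks")
        # position data identifies the originating block of pixels; frame
        # data identifies which high-speed frame the pixel data belongs to
        additional = {"block": own_block, "frame": frame_no}
        assignments.append((targets[frame_no], frame, additional))
    return assignments

plan = allocate(["f0", "f1", "f2", "f3"], 3734, [3731, 3732, 3733])
print([m for m, _, _ in plan])  # [3734, 3731, 3732, 3733]
```

If the allocation targets are fixed per block, the position entry of the additional data could be dropped, matching the note above.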
Figure 64C illustrates another operation example of capturing element 3100. In Figure 64C, the structure of the signal processing chip 3111 shown in Figure 64A is omitted. In this example too, as in the example of Figure 64B, the pixel data of block of pixels 3712 is stored in any of the other memory blocks 3735 to 3737 besides the corresponding memory block 3734. In this example, however, the pixel signal is converted into pixel data by the A/D converter 3412 in the region overlapping block of pixels 3712, and the pixel data is then transmitted to the memory block that is to store it. That is, in this example the pixel data moves between memory blocks.
This example differs from the embodiment shown in Figure 64B in that, when pixel data has already been stored in the memory block 3734 corresponding to the block of pixels 3712 at the high-speed frame frequency, control part 3740 moves the pixel data of memory block 3734 to memory blocks 3735, 3736, 3737 and 3738 that hold no pixel data, where it is stored. That is, in this example the memory blocks of storage part 3114 are interconnected by wiring so that data can be exchanged between memory blocks.
Control part 3740 moves the pixel data of memory block 3734 to any of the memory blocks 3735, 3736, 3737 and 3738 that hold no pixel data, where it is stored. Preferably, control part 3740 moves and stores the pixel data of memory block 3734 toward the memory blocks corresponding to the blocks of pixels 3131 at the outermost periphery of pixel region 3700. Since the frame frequency of the blocks of pixels 3131 often decreases with distance from the high-frame-frequency block of pixels 3712 toward the periphery of pixel region 3700, control part 3740 preferably disperses the pixel data two-dimensionally toward the periphery. By using the plurality of memory blocks 3730 evenly in this way, the overall memory capacity of storage part 3114 can be kept down without increasing the capacity of the buffer memories. Control part 3740 may also, based on frame frequency information for each block of pixels 3131, select a memory block 3730 corresponding to a block of pixels 3131 other than the outermost periphery and write the pixel data there.
In this example, the locations of the memory blocks over which the pixel data is dispersed may be fixed or may change dynamically. When the locations of the memory blocks used for dispersal are fixed, the position data can be omitted from the additional data attached to the moved pixel data. In that case, the memory blocks used for dispersal are more preferably those corresponding to the blocks of pixels 3131 at the outermost periphery of pixel region 3700. The pixel data stored in each memory block 3730 may also be moved in turn in synchronization with the high-speed frame frequency; pixel data can thereby be conveyed between separated memory blocks 3730. By repeating this movement of pixel data, the pixel data can be moved to any memory block 3730.
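One way to realize the outward dispersal described for Figure 64C is a hop-by-hop rule: at each synchronization step, move the data to a free neighbouring memory block that lies farther from the high-speed block than the current one. This is a speculative illustration of the strategy, not the patent's control logic; the grid coordinates and distance metric are assumptions.

```python
# Speculative sketch of hop-by-hop dispersal toward the periphery.
def next_hop(current, source, free_neighbors):
    """Pick the free neighbouring memory block farthest from the source
    (high-speed) block. Positions are (row, col) grid coordinates of memory
    blocks; distance is squared Euclidean distance from the source."""
    def dist2(p):
        return (p[0] - source[0]) ** 2 + (p[1] - source[1]) ** 2
    candidates = [n for n in free_neighbors if dist2(n) > dist2(current)]
    if not candidates:
        return None            # nowhere farther to move; stay put
    return max(candidates, key=dist2)

# Data sitting under the high-speed block at (1, 1) moves to the free
# neighbour farthest from it:
print(next_hop((1, 1), (1, 1), [(0, 1), (1, 0), (2, 2)]))  # (2, 2)
```

Repeating this rule at each high-speed frame period walks the data outward step by step, consistent with the text's note that repeated moves can reach any memory block.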
A computing circuit 3415 described later processes the pixel data stored in a memory block 3730 and then passes it to an image processing part of a later stage. Computing circuit 3415 may be provided on signal processing chip 3111 or in storage part 3114. The figure shows the connections for one block of pixels 3131, but in practice these connections exist for each block of pixels 3131 and operate in parallel. Computing circuit 3415 need not, however, exist for each block of pixels 3131; for example, one computing circuit 3415 may process sequentially while referring in order to the values of the memory blocks 3730 corresponding to the respective blocks of pixels 3131.
As described above, output wiring 3309 is provided in correspondence with each block of pixels 3131. Since capturing element 3100 stacks shooting chip 3113, signal processing chip 3111 and storage part 3114, using the chip-to-chip electrical connections with bumps 3109 for these output wirings 3309 allows the wiring to be routed without enlarging the chips in the planar direction.
Control part 3740 is provided with frequency information relating to the frame frequency of each block of pixels 3131. Based on this frequency information, control part 3740 selects the memory block 3730 into which the pixel data of a block of pixels 3131 at the high-speed frame frequency should be stored; for example, control part 3740 takes a memory block 3730 corresponding to a block of pixels 3131 at the reference frame frequency as the memory block 3730 that stores this pixel data. Control part 3740 may also, based on this frequency information, determine the route along which the pixel data moves in the manner shown in Figure 64C. For example, when moving the pixel data of a memory block 3730, control part 3740 selects, among the memory blocks 3730 adjacent to it that correspond to the reference frame frequency, a memory block 3730 whose distance from the memory block 3730 corresponding to the high-speed frame frequency is increased.
Figure 65 is a block diagram showing the structure of the filming apparatus of the present embodiment. Filming apparatus 3500 has a photographic lens 3520 as a photographic optical system, and photographic lens 3520 guides the subject light beam incident along optical axis OA to capturing element 3100. Photographic lens 3520 may be an interchangeable lens that can be attached to and detached from filming apparatus 3500. Filming apparatus 3500 mainly has capturing element 3100, a systems control division 3501, a drive division 3502, a photometric measurer 3503, a working memory 3504, a recording unit 3505 and a display part 3506.
Photographic lens 3520 is made up of multiple optical lens group, and makes the subject light beam imaging near its focus face from scene.In addition, represent this photographic lens 3520 with the imaginary one piece of lens be configured near pupil in Figure 61 and represent.Drive division 3502 is the control circuits storing control according to the electric charge performing the timing controlled, Region control etc. of capturing element 3100 from the instruction of systems control division 3501.On that point, drive division 3502 assume responsibility for and makes capturing element 3100 perform electric charge to store and the function of capturing element control part of output pixel signal.
The capturing element 3100 passes the pixel signals to the image processing part 3511 of the systems control division 3501. The image processing part 3511 applies various kinds of image processing using the working memory 3504 as a work area and generates image data. For example, when generating image data in JPEG file format, a color image signal is generated from the signal obtained with the Bayer array, and compression processing is then performed. The generated image data is recorded in the recording unit 3505, and is also converted into a display signal and displayed on the display part 3506 for a preset time.
The photometric measurer 3503 detects the luminance distribution of the scene before the series of photographic processes for generating image data. The photometric measurer 3503 includes, for example, an AE sensor of about one million pixels. The operational part 3512 of the systems control division 3501 receives the output of the photometric measurer 3503 and calculates the luminance of each region of the scene. The operational part 3512 determines the shutter speed, aperture value, and ISO sensitivity in accordance with the calculated luminance distribution. The capturing element 3100 may also double as the photometric measurer 3503. In addition, the operational part 3512 executes the various computations needed to operate the imaging device 3500.
Part or all of the drive division 3502 may be mounted on the shooting chip 3113, or part or all of it may be mounted on the signal processing chip 3111. Part of the systems control division 3501 may also be mounted on the shooting chip 3113 or the signal processing chip 3111.
Figure 66 is a functional block diagram of the image processing part. The image processing part 3511 of this example distinguishes the pixel blocks 3131 operating at the reference frame rate (the neighboring area 3176 described later) from the pixel blocks 3131 operating at the high frame rate (the watching area 3172 described later). In addition to the functions described above, the image processing part 3511 has a subject estimation unit 3150, a group selection portion 3152, a dynamic image generating unit 3154, and a dynamic image combining unit 3156. Each of these functions will be described later.
Figure 67 is a flowchart showing the operation in which the imaging device generates and records a dynamic image. Figures 68 and 69 show examples of images captured by the capturing element. Figure 70 shows the relation between each frame rate and the output timing of the image signals.
The operation of Figure 67 is started when the user instructs the imaging device 3500 to generate a dynamic image by pressing a record button or the like. First, the subject estimation unit 3150 causes the drive division 3502 to acquire image data based on the image signals from the capturing element 3100, and estimates the main subject included in the image represented by this image data (S3100).
In this case, the drive division 3502 preferably outputs image signals from the pixel blocks 3131 included in the whole shooting area, for example from all the pixel blocks 3131. The drive division 3502 may output image signals from all the pixels included in each pixel block 3131, or may output image signals from the pixels remaining after thinning out pixels at a predetermined thinning rate. The subject estimation unit 3150 compares a plurality of images obtained in time series from the capturing element 3100 and identifies a moving subject as the main subject. Other methods may also be used to estimate the main subject.
For example, when the subject estimation unit 3150 obtains the image 3170 of Figure 68 and the image 3178 of Figure 69 from the capturing element 3100 as temporally preceding and succeeding images, it identifies the child as the main subject 3171 from their difference. The ruled lines in the image 3170 and the image 3178 represent the boundaries of the pixel blocks 3131, but the number of pixel blocks 3131 is merely an illustration and is not limited to the number shown in these figures.
The group selection portion 3152 selects at least one pixel block 3131 on which the image light of the main subject 3171 estimated by the subject estimation unit 3150 is incident (S3102). For example, in the image 3170, the pixel blocks 3131 that include at least part of the main subject 3171 are selected. Furthermore, considering that the main subject 3171 may move within the shooting area, the group selection portion 3152 preferably also selects the pixel blocks 3131 surrounding the pixel blocks 3131 that include at least part of the main subject 3171.
The group selection portion 3152 sets the set of these selected pixel blocks 3131 as the watching area 3172. Further, the group selection portion 3152 sets the set made up of the pixel blocks 3131 of the whole shooting area that are not included in the watching area 3172 as the neighboring area 3176. The group selection portion 3152 specifies area information 3174 representing the range of the watching area 3172 relative to the whole shooting area.
In the example shown in Figure 68, the watching area 3172 is a rectangular area made up of 28 pixel blocks 3131 in total, 7 horizontally by 4 vertically. In contrast, the neighboring area 3176 is made up of the 98 pixel blocks 3131 that remain after removing the watching area 3172 from the 126 pixel blocks 3131, 21 horizontally by 6 vertically, that make up the shooting area. In addition, as the area information 3174, the position (9, 2), counted from the left end and the upper end of the shooting area, of the pixel block 3131 at the upper left end of the watching area 3172 in the figure is specified. Further, as the size information, the horizontal and vertical counts 7 × 4 of the watching area 3172 are specified.
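The bookkeeping of the area information and size information above can be sketched as follows. This is a minimal illustration only; the grid size, coordinate convention, and function name are taken from the Figure 68 example, not from any actual implementation.

```python
# Hypothetical sketch of the watching/neighboring area bookkeeping.
# Block coordinates are 1-indexed from the left/top, as in Figure 68.

GRID_W, GRID_H = 21, 6           # pixel blocks in the whole shooting area
ORIGIN = (9, 2)                  # area information 3174: upper-left block
SIZE = (7, 4)                    # size information: horizontal x vertical

def partition_blocks(origin, size):
    """Split all pixel blocks into the watching area and the neighboring area."""
    ox, oy = origin
    w, h = size
    watching = {(x, y) for x in range(ox, ox + w) for y in range(oy, oy + h)}
    neighboring = {(x, y) for x in range(1, GRID_W + 1)
                   for y in range(1, GRID_H + 1)} - watching
    return watching, neighboring

watching, neighboring = partition_blocks(ORIGIN, SIZE)
print(len(watching), len(neighboring))   # 28 98
```

The 28/98 split reproduces the block counts given for Figure 68.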
The group selection portion 3152 passes to the drive division 3502 the information that specifies the pixel blocks 3131 included in the watching area 3172 and the information that specifies the neighboring area 3176. At this time, information on the frame rates to be applied to the watching area 3172 and the neighboring area 3176 respectively is also transmitted together. Here, the frame rate applied to the watching area 3172 is preferably higher than the frame rate applied to the neighboring area 3176. For example, when the frame rate applied to the neighboring area 3176 is 60 fps, the frame rate applied to the watching area 3172 is set to 180 fps. The values of these frame rates are preferably set in advance and stored so that they can be referred to by the group selection portion 3152, but the values may also be revised by the user afterwards.
The drive division 3502 drives the capturing element 3100 to capture images at the respective frame rates (S3104). That is, for the pixel blocks 3131 included in the watching area 3172, the drive division 3502 executes charge storage and image signal output at the high frame rate, and for the pixel blocks 3131 included in the neighboring area 3176, it executes charge storage and image signal output at the low frame rate. In other words, while the drive division 3502 obtains the image signal corresponding to one frame from the pixel blocks 3131 included in the neighboring area 3176, it obtains the image signals corresponding to a plurality of frames arranged in time series from the pixel blocks 3131 included in the watching area 3172.
For example, when the frame rate of the neighboring area 3176 is 60 fps and the frame rate of the watching area 3172 is set to 180 fps, as shown in Figure 70, during the 1/60 s in which the drive division 3502 obtains the image signal of one frame B1 from the neighboring area 3176, it obtains the image signals of three frames A1, A2, A3 from the watching area 3172 (1/60 s = 3 × 1/180 s). In this case, the drive division 3502 obtains the image signals at different frame rates by individually driving the group of reset transistors 3303, transmission transistors 3302, and selection transistors 3305 of the pixel blocks 3131 included in the neighboring area 3176 and the group of reset transistors 3303, transmission transistors 3302, and selection transistors 3305 of the pixel blocks 3131 included in the watching area 3172.
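The timing relation of Figure 70 can be checked with a small sketch. This assumes, as the example above does, that the high frame rate is an integer multiple of the reference frame rate; the function name is illustrative.

```python
from fractions import Fraction

# Hypothetical sketch of the Figure 70 timing relation: at 180 fps the
# watching area outputs a whole number of frames per 60 fps frame.

def frames_per_peripheral_frame(f_watch, f_peripheral):
    ratio = Fraction(f_watch, f_peripheral)
    assert ratio.denominator == 1, "rates assumed to be integer multiples"
    return ratio.numerator

print(frames_per_peripheral_frame(180, 60))  # 3 frames A1, A2, A3 per frame B1
```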
In addition, Figure 70 shows the output timing of the image signals, but it does not also illustrate the length of the exposure time. The drive division 3502 drives the above-described groups of transistors for the neighboring area 3176 and the watching area 3172 so as to achieve the exposure time calculated in advance by the operational part 3512.
The length of the exposure time may also be changed in accordance with the frame rate. For example, in the example shown in Figure 70, the exposure time of one frame of the neighboring area 3176 may be set to 1/3, making it substantially equal to the exposure time of the watching area 3172. Alternatively, the image signal may be corrected by the frame rate ratio after the image signal has been output. In addition, the output timing of the image signals of the neighboring area 3176 and the watching area 3172 need not be synchronous as in Figure 70, and may be asynchronous.
The image processing part 3511 successively stores the image signals from the watching area 3172, frame by frame, into a predetermined storage area of the working memory 3504 (S3106). Similarly, the image processing part 3511 successively stores the image signals from the neighboring area 3176, frame by frame, into a predetermined storage area of the working memory 3504 (same step). As illustrated in Figures 64A to 64C, the working memory 3504 has a plurality of memory blocks 3730. The working memory 3504 may be a memory made up of memory groups corresponding to the respective pixel blocks 3131.
The dynamic image generating unit 3154 reads the image signals of the watching area 3172 stored in the working memory 3504 (S3108) and generates the data of a watching area dynamic image comprising a plurality of frames of the watching area 3172 (S3110). Similarly, the dynamic image generating unit 3154 reads the image signals of the neighboring area 3176 stored in the working memory 3504 and generates the data of a neighboring area dynamic image comprising a plurality of frames of the neighboring area 3176 (same step). Here, the watching area dynamic image and the neighboring area dynamic image may each be generated in a general-purpose format such as MPEG so that they can be reproduced individually, or may be generated in a dedicated format that cannot be reproduced without the combining process described later.
Figure 71 schematically shows the watching area dynamic image and the neighboring area dynamic image generated by the dynamic image generating unit. The dynamic image generating unit 3154 generates the watching area dynamic image at a frame rate corresponding to the frame rate at which the drive division 3502 drives the watching area 3172. In the example shown in Figure 71, the watching area dynamic image is generated at 180 fps, the same as the frame rate of 180 fps at which the drive division 3502 drives the watching area 3172.
Similarly, the dynamic image generating unit 3154 generates the neighboring area dynamic image at a frame rate corresponding to the frame rate at which the drive division 3502 drives the neighboring area 3176. In the example shown in Figure 71, the neighboring area dynamic image is generated at 60 fps, the same as the frame rate of 60 fps at which the drive division 3502 drives the neighboring area 3176. In the neighboring area dynamic image, the region corresponding to the watching area 3172 has no valid values and is indicated by hatching in the figure.
Further, the dynamic image generating unit 3154 adds header information to the watching area dynamic image and the neighboring area dynamic image and records their data in the recording unit 3505 (S3112). The header information includes: area information representing the position of the watching area 3172 relative to the whole shooting area, size information representing the size of the watching area 3172, and timing information representing the relation between the output timing of the image signals of the watching area 3172 and the output timing of the image signals of the neighboring area 3176.
The systems control division 3501 judges whether to perform the shooting of the next unit time (S3114). Whether to perform the shooting of the next unit time is judged by whether the user is still pressing the dynamic image record button at that moment. When the shooting of the next unit time is performed (S3114: YES), the process returns to the above step S3102, and when the shooting of the next unit time is not performed (S3114: NO), this operation ends.
Here, the "unit time" is a time preset in the systems control division 3501 and is on the order of several seconds. The memory capacity used for storage in step S3106 is determined according to this unit time, the frame rate and number of pixel blocks of the watching area 3172, and the frame rate and number of pixel blocks of the neighboring area 3176. Based on this information, the region within this memory capacity for storing the data of the watching area 3172 and the region for storing the data of the neighboring area 3176 are also determined.
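The capacity calculation described above can be illustrated numerically. The unit time of 2 s, the 32 × 64-pixel block size, and 2 bytes per pixel below are all assumptions for illustration, not values from the embodiment; only the block counts and frame rates come from the Figure 68/Figure 70 example.

```python
# Hypothetical estimate of the per-region buffer size implied above.
BYTES_PER_BLOCK_FRAME = 32 * 64 * 2   # assumed block size and pixel depth

def region_bytes(n_blocks, fps, unit_time_s):
    """Bytes needed to buffer one region for one unit time."""
    return n_blocks * fps * unit_time_s * BYTES_PER_BLOCK_FRAME

watch = region_bytes(28, 180, 2)      # watching area 3172 at 180 fps
periph = region_bytes(98, 60, 2)      # neighboring area 3176 at 60 fps
print(watch, periph, watch + periph)
```

Even with the neighboring area holding 3.5 times as many blocks, its lower frame rate keeps its buffer on the same order as the watching area's.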
As described above, image signals can be obtained at a high frame rate from the watching area 3172 that includes the main subject 3171, while the neighboring area 3176 can be kept at a low frame rate, thereby reducing the amount of data. Thus, compared with reading out all pixels at high speed, the load of driving and image processing can be reduced, and power consumption and heat generation can be suppressed.
In addition, when the next unit time starts in the example shown in Figure 67, the pixel blocks 3131 are reselected in step S3102, and the area information and the size information are updated. Thereby, the watching area 3172 can be successively updated so as to follow the main subject 3171. In the example shown in Figure 71, in the first frame A7 of the unit time in the watching area dynamic image, a watching area 3182 made up of pixel blocks 3131 different from those of the last frame A6 of the previous unit time is selected, and the area information 3184 and the neighboring area 3186 are also updated accordingly.
Figure 72 shows an example of the header information added by the dynamic image generating unit. The header information of Figure 72 includes: a watching area dynamic image ID that specifies the watching area dynamic image, the frame rate of the watching area dynamic image, a neighboring area dynamic image ID that specifies the neighboring area dynamic image corresponding to this watching area dynamic image, the frame rate of the neighboring area dynamic image, the timing information, the area information, and the size information. This header information may be added as header information to either one of the watching area dynamic image and the neighboring area dynamic image, or may be added to both.
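The fields of the Figure 72 header can be sketched as a simple record. The field names and the example IDs below are illustrative assumptions, not the actual recorded format.

```python
from dataclasses import dataclass

# Hypothetical layout of the header information of Figure 72.
@dataclass
class MovieHeader:
    watching_movie_id: str      # specifies the watching area dynamic image
    watching_fps: int
    neighboring_movie_id: str   # specifies the corresponding neighboring movie
    neighboring_fps: int
    timing_offset_s: float      # timing information
    area_origin: tuple          # area information, e.g. (9, 2)
    area_size: tuple            # size information, e.g. (7, 4)

hdr = MovieHeader("A-0001", 180, "B-0001", 60, 0.0, (9, 2), (7, 4))
print(hdr.watching_fps // hdr.neighboring_fps)  # 3 watching frames per frame
```

A player holding only one of the two movies can use the paired ID to locate the other, as the reproduction flow of Figure 73 does.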
Figure 73 is a flowchart showing the operation in which the imaging device reproduces and displays a dynamic image. This operation is started when the user specifies one of the watching area dynamic images displayed as thumbnails on the display part 3506 and presses the reproduction button.
The dynamic image combining unit 3156 reads the data of the watching area dynamic image specified by the user from the recording unit 3505 (S3150). The dynamic image combining unit 3156 then reads the data of the neighboring area dynamic image corresponding to this watching area dynamic image from the recording unit 3505 (S3152).
In this case, the dynamic image combining unit 3156 specifies the neighboring area dynamic image by the neighboring area dynamic image ID shown in the header information of the watching area dynamic image read in step S3150. Alternatively, the neighboring area dynamic image whose header information includes the same timing information as the timing information shown in that header information may be retrieved and specified.
In the above example, the header information is included in the watching area dynamic image. On the other hand, when the header information is not included in the watching area dynamic image but is included in the neighboring area dynamic image, the user may first specify and read the neighboring area dynamic image in step S3150, and the watching area dynamic image may then be specified from its header information and read in step S3152.
The dynamic image combining unit 3156 combines the frames of the display dynamic image using the frames of the watching area dynamic image and the frames of the neighboring area dynamic image (S3154). Specifically, first, the first frame A1 of the watching area dynamic image is embedded at the position indicated by the area information 3174 in the first frame B1 of the neighboring area dynamic image, thereby combining the first frame C1 of the display dynamic image. As shown in Figure 71, the dynamic image combining unit 3156 causes the first frame C1 of the display dynamic image to be displayed on the display part 3506 (S3156).
The dynamic image combining unit 3156 judges whether there is a next frame of the watching area dynamic image before the next frame B2 of the neighboring area dynamic image (S3158). When there is a next frame of the watching area dynamic image (S3158: YES), the dynamic image combining unit 3156 updates the watching area 3172 with the subsequent frames A2, A3 while keeping the neighboring area 3176 at the previous frame B1, thereby combining the subsequent frames C2, C3 of the display dynamic image (S3162) and displaying them in sequence (S3156).
On the other hand, when in step S3158 there is no next frame of the watching area dynamic image before the next frame B2 of the neighboring area dynamic image (S3158: NO), the dynamic image combining unit 3156 updates the watching area 3172 with the next frame A4 and also updates the neighboring area 3176 with the next frame B2 (S3164), thereby combining the next frame C4 of the display dynamic image (S3162) and displaying it (S3156).
As long as there is a next frame of the neighboring area 3176 in the neighboring area dynamic image (S3160: YES), steps S3154 to S3160 are repeated. When there is no next frame of the neighboring area 3176 in the neighboring area dynamic image (S3160: NO), the dynamic image combining unit 3156 retrieves whether there is a group of a watching area dynamic image and a neighboring area dynamic image for the unit time following the unit time of the present group (S3166). For example, the dynamic image combining unit 3156 retrieves whether, in the same folder of the recording unit 3505, there is a watching area dynamic image whose header information includes timing information representing a timing immediately after the timing shown in the timing information of the present watching area dynamic image.
As long as there is a group of a watching area dynamic image and a neighboring area dynamic image for the next unit time (S3166: YES), steps S3150 to S3166 are repeated. When there is no group of a watching area dynamic image and a neighboring area dynamic image for the next unit time (S3166: NO), this operation ends.
As described above, the overall amount of data can be reduced, while a smooth dynamic image can be displayed for the watching area 3172 that includes the main subject 3171. In addition, in step S3162, the frame of the display image is combined by directly updating the watching area 3172 with its next frame, but the combining method is not limited to this. As another example, the outline of the main subject 3171 within the watching area 3172 may be specified by image processing, the main subject 3171 surrounded by this outline may be updated to the next frame, while the region inside the watching area 3172 but outside the outline of the main subject 3171 is kept at the previous frame and combined with the frame of the neighboring area 3176. That is, the region inside the watching area 3172 but outside the outline may be treated at the frame rate of the neighboring area 3176. This prevents the boundary between the smoothly and less smoothly displayed parts of the dynamic image from appearing unnatural. Moreover, the reproduction frame rates need not be the same as the frame rates at the time of photography (180 fps for the watching area and 60 fps for the neighboring area); for example, the watching area may be reproduced at 60 fps and the neighboring area at 20 fps. In this case, slow-motion reproduction results.
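The combining loop of steps S3154 to S3164 can be sketched as follows, for the synchronous case where the high rate is an integer multiple of the low rate. Frame labels stand in for real image buffers, and the function name is illustrative.

```python
# Hypothetical sketch of the Figure 73 combining loop: each 60 fps
# neighboring frame B is reused under three consecutive 180 fps
# watching frames A, producing display frames C.

def combine(watch_frames, periph_frames, ratio=3):
    display = []
    for i, a in enumerate(watch_frames):
        b = periph_frames[i // ratio]   # hold the previous B while A advances
        display.append((a, b))          # embed A into B at the area info 3174
    return display

out = combine(["A1", "A2", "A3", "A4", "A5", "A6"], ["B1", "B2"])
print(out[0], out[3])  # ('A1', 'B1') ('A4', 'B2')
```

Frames C1 to C3 share B1, and C4 is the first frame in which both areas are updated, matching the B2 update of step S3164.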
Figure 74 is a top view showing the structure of the pixel region 3700 of the capturing element 3100 and an operation example thereof. In Figures 74 to 77, each pixel block 3131 in the pixel region 3700 and each memory block 3730 in the storage part 3114 are shown projected onto the same plane. The pixel blocks 3131 are arranged over the entire pixel region 3700 at fixed intervals along the row and column directions. A pixel block 3131 has m × n pixels, where n and m are 2 or more. A pixel block 3131 may be made up of 32 × 64 pixels arranged in a matrix. In this example, each memory block 3730 is a memory provided for each pixel block 3131. That is, each pixel block 3131 has a one-to-one corresponding memory block 3730. Each memory block 3730 is provided in the region of the signal processing chip 3111 that overlaps the corresponding pixel block 3131.
The pixel blocks 3131 are grouped, each group consisting of a plurality of pixel blocks 3131 distributed at fixed intervals in the pixel region 3700. The memory blocks 3730 corresponding to the pixel blocks 3131 in a group are shared by the pixel blocks 3131 in that group. Shared here means that the pixel data of the plurality of pixel blocks 3131 can be written to and read from those memory blocks 3730 directly or indirectly. The grouping is preferably such that the distance between the pixel blocks 3131 within one group is maximized over all the pixel blocks 3131 included in the pixel region 3700. More preferably, a group of pixel blocks 3131 includes a plurality of pixel blocks 3131 located at the outermost periphery of the pixel region 3700 on the shooting chip 3113. In this case, the control part 3740 regularly controls the plurality of pixel blocks 3131 located at the outermost periphery at a frame rate lower than the high frame rate (in this example, the reference frame rate).
Here, the position of a pixel block 3131 is represented by coordinates (x, y). In this example, the four pixel blocks 3131 at the positions (4, 4), (4, 1), (1, 4), and (1, 1) are grouped. The other pixel blocks 3131 are similarly grouped, each group consisting of pixel blocks 3131 separated by fixed intervals.
Each memory block 3730 corresponding to the pixel blocks 3131 in a group is shared by all the pixel blocks 3131 in the group. Thereby, the pixel data of a pixel block 3131 at the high frame rate can be stored in the memory blocks 3730 corresponding to the pixel blocks 3131 at the reference frame rate in the group. In this example, the pixel data of the pixel block 3131 at the high frame rate at the position (4, 4), shown hatched, is stored in order in the memory blocks 3730 at the reference frame rate among the memory blocks 3730 corresponding to the pixel blocks 3131 at the positions (4, 4), (4, 1), (1, 4), and (1, 1).
That is, when pixel data has already been stored in the memory block 3730 corresponding to a pixel block 3131 at the high frame rate, the control part 3740 stores the pixel data corresponding to that pixel block 3131 in any of the memory blocks 3730 of the same group as that pixel block 3131. Here, as shown in Figure 68, the watching area 3172 is formed by continuously arranged pixel blocks 3131. Therefore, by grouping pixel blocks 3131 distributed at fixed intervals in the pixel region 3700, the probability that pixel blocks 3131 at the high frame rate and pixel blocks 3131 at the reference frame rate are mixed in one group can be increased. Thereby, the utilization efficiency of the memory can be improved without increasing the memory capacity of the memory blocks 3730. In addition, since the groups sharing the memory blocks 3730 are fixed, the additional data indicating which pixel block 3131 the pixel data stored in each memory block 3730 corresponds to can be reduced or omitted.
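The fixed-interval grouping above can be sketched for a small grid. The 6 × 6 grid and interval of 3 are assumptions chosen so that the blocks (4, 4), (4, 1), (1, 4), (1, 1) of the example land in one group; the function name is illustrative.

```python
# Hypothetical sketch of fixed-interval grouping: blocks whose coordinates
# are congruent modulo the interval share the same memory blocks 3730, so a
# contiguous high-rate watching area is likely to be grouped with
# reference-rate blocks elsewhere in the pixel region.

INTERVAL = 3   # assumed block spacing within a group, for a 6x6 grid

def group_of(x, y):
    """All pixel blocks sharing the memory blocks of block (x, y) (1-indexed)."""
    xs = [((x - 1) % INTERVAL) + 1 + k * INTERVAL for k in range(2)]
    ys = [((y - 1) % INTERVAL) + 1 + k * INTERVAL for k in range(2)]
    return {(gx, gy) for gx in xs for gy in ys}

print(sorted(group_of(4, 4)))  # [(1, 1), (1, 4), (4, 1), (4, 4)]
```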
Figure 75 is a top view of another example of the structure of the capturing element 3100 shown in Figure 74. The capturing element 3100 of this example differs from the embodiment described in Figure 74 in that, instead of the storage part 3114, it has a storage part 3810 that is located outside the pixel region 3700 and is provided separately in the row direction and the column direction. Apart from its physical location, the storage part 3810 may otherwise be identical to the storage part 3114.
The storage part 3810 of this example is made up of a plurality of storage areas 3812 arranged in the row direction and the column direction so as to face the regions overlapping the pixel blocks 3131 at the outermost periphery of the pixel region 3700. Each storage area 3812 may be made up of 2 × 2 memory blocks 3730. Each memory block 3730 is a memory provided for each group within a storage area 3812. The control part 3740 calculates address information based on information relating to the respective positions, frame rates, and timings of the grouped pixel blocks 3131, and writes the pixel data into the memory blocks 3730 in sequence.
In this example, the memory blocks 3730 corresponding to the grouped pixel blocks 3131 constitute a 2 × 2 storage area 3812. That is, since the memory blocks 3730 corresponding to the grouped pixel blocks 3131 are adjacently concentrated in one place, there is no need to connect separated memory blocks 3730 to each other via wiring, as there is in the case where the memory blocks 3730 are provided in the respective regions overlapping the pixel blocks 3131. Therefore, the time required for writing/reading pixel data due to RC delay can be reduced. In addition, when pixel data is input to a downstream computing circuit, one bus is provided for each storage area 3812. Thus, compared with the case where a memory block is provided for each region overlapping a pixel block 3131, the circuit structure required for writing/reading pixel data can be simplified.
Figure 76 is a top view showing another operation example of the capturing element 3100 shown in Figure 74. This example differs from the embodiment shown in Figure 74 in that it further has transfer paths 3710 that transfer pixel data between the memory blocks 3730 corresponding to adjacent pixel blocks 3131. A transfer path 3710 may be a wiring that connects the memory blocks 3730 to each other. The control part 3740 is connected to all the memory blocks 3730 through the transfer paths 3710. The control part 3740 moves the pixel data corresponding to a pixel block 3131 at the high frame rate to the adjacent memory blocks 3730 in sequence, in synchronization with the high frame rate. Here, synchronizing with the high frame rate means storing the pixel data in the adjacent memory blocks 3730 in sequence at the same timing as the timing at which pixel data is read from the pixel block 3131 with the high frame rate.
Here, the case where the frame rate of the pixel block 3131 at the position (4, 4) is 5 times the reference frame rate is described as an example. If the reference frame rate is 60 fps, the high frame rate is 300 fps. For the shooting timings at the high frame rate, the timing at time t = 0 is denoted T0, the timing at time t = 1/300 s is denoted T1, the timing at time t = 2/300 s is denoted T2, the timing at time t = 3/300 s is denoted T3, the timing at time t = 4/300 s is denoted T4, and the timing at time t = 5/300 s is denoted T5.
At the timing T0, the control part 3740 stores the pixel data of the captured subject in the memory blocks 3730 corresponding respectively to all the pixel blocks 3131. Then, at the timing T1, the control part 3740 moves the lower-frame-rate pixel data stored in the memory block 3730 at the adjacent position (3, 4) to the memory block 3730 at the position (2, 4) in the peripheral direction, and moves the pixel data stored in the memory block 3730 corresponding to the pixel block 3131 at the position (4, 4) into the now-empty memory block 3730 at the position (3, 4). At the same time, the control part 3740 stores the pixel data of the pixel block 3131 at the position (4, 4) acquired at the timing T1 in the corresponding memory block 3730 at the position (4, 4).
At the timing T2, the control part 3740 moves the pixel data stored in the memory block 3730 at the position (4, 3) into the memory block 3730 at the position (4, 2) in the peripheral direction, and moves the pixel data of the memory block 3730 corresponding to the pixel block 3131 at the position (4, 4) into the now-empty memory block 3730 at the position (4, 3). At the same time, the control part 3740 stores the pixel data of the pixel block 3131 at the position (4, 4) acquired at the timing T2 in the corresponding memory block 3730 at the position (4, 4).
At the timing T3, the control part 3740 moves the pixel data stored in the memory block 3730 at the position (5, 4) into the memory block 3730 at the position (6, 4) in the peripheral direction, and moves the pixel data of the memory block 3730 corresponding to the pixel block 3131 at the position (4, 4) into the now-empty memory block 3730 at the position (5, 4). At the same time, the control part 3740 stores the pixel data of the pixel block 3131 at the position (4, 4) acquired at the timing T3 in the corresponding memory block 3730 at the position (4, 4).
At the timing T4, the control part 3740 moves the pixel data stored in the memory block 3730 at the position (4, 5) into the memory block 3730 at the position (4, 6) in the peripheral direction, and moves the pixel data of the memory block 3730 corresponding to the pixel block 3131 at the position (4, 4) into the now-empty memory block 3730 at the position (4, 5). At the same time, the control part 3740 stores the pixel data of the pixel block 3131 at the position (4, 4) acquired at the timing T4 in the corresponding memory block 3730 at the position (4, 4). At this point, the memory block 3730 at the position (4, 4) corresponding to the pixel block 3131 at the position (4, 4) and the memory blocks 3730 at the positions (3, 4), (4, 3), (5, 4), (4, 5) surrounding this memory block 3730 two-dimensionally store the pixel data of the timings T0 to T4.
The control part 3740 may move each piece of pixel data stored in the memory blocks 3730 at the positions (3, 4), (4, 3), (5, 4), (4, 5) to, among the adjacent memory blocks 3730, the memory block 3730 nearest to the edge of the pixel region 3700. That is, the control part 3740 may move and store each piece of pixel data stored in the memory blocks 3730 at the positions (3, 4), (4, 3), (5, 4), (4, 5) into the memory blocks 3730 at the positions (1, 4), (4, 1), (6, 4), (4, 6) located at the edge of the pixel region 3700.
At the timing T5, the control part 3740 transfers all the pixel data of the pixel region 3700 stored in the memory blocks 3730 to a subsequent-stage memory or computing circuit via a bus. The control part 3740 then updates the frame cycle and repeats the operation of the above timings T0 to T4.
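The T0 to T4 cycle above can be simulated with frame labels standing in for pixel data. The step table below encodes the eviction directions of the example (one per timing); it is an illustrative sketch, not the control part's actual logic.

```python
# Hypothetical simulation of the T0-T4 cycle: block (4,4) runs at 5x the
# reference rate, and each new frame pushes older data one step toward the
# periphery along a different direction, so the hot block and its four
# neighbors end up holding the frames T0..T4.

HOT = (4, 4)
STEPS = [((3, 4), (2, 4)), ((4, 3), (4, 2)),   # T1, T2
         ((5, 4), (6, 4)), ((4, 5), (4, 6))]   # T3, T4

# At T0 every memory block holds its own block's frame.
mem = {pos: "T0" for pos in [HOT, (3, 4), (4, 3), (5, 4), (4, 5)]}
for t, (near, far) in enumerate(STEPS, start=1):
    mem[far] = mem[near]           # evict the neighbor's data outward
    mem[near] = mem[HOT]           # previous hot-block frame moves out
    mem[HOT] = f"T{t}"             # newly captured frame
print(mem[HOT], mem[(3, 4)], mem[(4, 3)], mem[(5, 4)], mem[(4, 5)])
# T4 T0 T1 T2 T3
```

After T4 the five central memory blocks together hold one frame per timing, ready to be flushed over the bus at T5.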
Among the plurality of pixel blocks 3131, the control unit 3740 fixes the frame rate of the pixel blocks 3131 along the outermost periphery of the pixel region 3700 at the reference frame rate. When a high-frame-rate pixel block 3131 is located at the edge of the pixel region 3700, the number of adjacent memory blocks 3730 is limited, so it is difficult to disperse the pixel data two-dimensionally. The control unit 3740 therefore does not place high-frame-rate pixel blocks 3131 on the outermost periphery of the pixel region 3700; for example, the control unit 3740 fixes the frame rate of the outermost pixel blocks 3131 of the pixel region 3700 at the reference frame rate.
The control unit 3740 writes new pixel data simultaneously into the memory blocks 3730 corresponding to all of the pixel blocks 3131, and sends the pixel data to the subsequent-stage arithmetic processing circuit collected per pixel block 3131. By moving the pixel data of a high-frame-rate pixel block 3131 successively into the memory blocks 3730 corresponding to adjacent pixel blocks 3131, in the direction approaching the edge of the pixel region 3700, the control unit 3740 allows a memory block 3730 to be shared by a plurality of pixel blocks 3131, and the memory capacity can therefore be reduced. The pixel data distributed over a plurality of adjacent memory blocks 3730 may carry, as header information, the position data within the pixel region 3700 of the pixel block 3131 to which it belongs, and, as additional data, frame data indicating the frame to which it belongs.
In this example, the control unit 3740 moves the pixel data of a high-frame-rate pixel block 3131 successively into the memory blocks 3730 corresponding to adjacent pixel blocks 3131, but the pixel data may instead be moved into a memory block 3730 one position removed, or into a memory block 3730 in the diagonal direction rather than the row or column direction. The control unit 3740 may also select the memory block 3730 to which the pixel data is moved based on the frame-rate information of each pixel block 3131.
Figure 77 is a plan view showing another structural example of the capturing element 3100. In this example, as in the capturing element 3100 shown in Figure 76, pixel data is transferred between the memory blocks 3730 corresponding to adjacent pixel blocks 3131. However, in the capturing element 3100 of this example, as in the capturing element 3100 shown in Figure 75, the signal processing chip 3111 has a storage unit 3810 provided outside the region overlapping the pixel region 3700. The storage unit 3810 has storage areas 3820 divided by the number of pixel blocks 3131 in the row direction (six in this example) and storage areas 3822 divided by the number of pixel blocks 3131 in the column direction (six in this example). The control unit 3740 stores the pixel data corresponding to a high-frame-rate pixel block 3131 into the prescribed storage areas 3820, 3822 in synchronization with the high frame rate.
The control unit 3740 may write the pixel data of the high-frame-rate pixel block 3131 at position (4, 4) into the storage areas 3820, 3822 associated with the outermost pixel blocks 3131 operating at the low frame rate, in synchronization with the high frame rate. The control unit 3740 may also select storage areas 3820, 3822 associated with pixel blocks 3131 other than those on the outermost periphery, based on the frame-rate information of each pixel block 3131, and write the pixel data there. The storage areas 3820, 3822 are thus shared between the pixel data of high-frame-rate pixel blocks 3131 and the pixel data of low-frame-rate pixel blocks 3131. In this example, it suffices to perform writing and reading per storage area 3820, 3822; since there is no need to perform writing and reading per memory block 3730 arranged for each pixel block 3131, the circuit structure can be simplified. In addition, the storage areas 3820, 3822 in the storage unit 3810 of this example have memory spaces of equal size. The positions of the memory spaces of the storage areas 3820, 3822 within the storage unit 3810 may be fixed or may change dynamically.
Figure 78 shows part of the structure, and the operation, of a capturing element 3100 of another embodiment. This example differs from the above embodiments in that the storage unit 3114 is configured as a buffer memory of multilayer structure. The storage unit 3114 of this example comprises a temporary memory 3850 and a transfer memory 3860. The temporary memory 3850 has memory blocks 3830 corresponding to the respective pixel blocks 3131, and is the memory used to handle the pixel data of the pixel block 3712 at the high data rate. The transfer memory 3860 receives the pixel data input from the temporary memory 3850 and transfers the pixel data to a subsequent-stage memory or arithmetic circuit. The transfer memory 3860 has a storage area at least as large as the total storage area of the plurality of memory blocks 3730; here, the total storage area means the size of the memory space of the temporary memory 3850. The temporary memory 3850 of this example has the same function and structure as the memory blocks 3730 shown in Figure 76.
Here, the case where the frame rate of the pixel block 3712 is five times the reference frame rate will be described. If the reference frame rate is 60 fps, the high frame rate is 300 fps. For image capturing at the high frame rate, the timing at time t = 0 is designated T0, the timing at t = 1/300 s is designated T1, the timing at t = 2/300 s is designated T2, the timing at t = 3/300 s is designated T3, the timing at t = 4/300 s is designated T4, and the timing at t = 5/300 s is designated T5.
The control unit 3740 stores all of the pixel data of the subject captured at timing T0 into the memory blocks 3830 corresponding to the respective pixel blocks 3131. The control unit 3740 transfers the stored pixel data to the transfer memory 3860 before timing T1. That is, before the next pixel data is input from the pixel block 3712 operating at the high frame rate, the control unit 3740 copies all of the pixel data of the subject captured at T0 into the corresponding storage area 3870 of the transfer memory 3860.
At timing T1, in synchronization with the high frame rate, the control unit 3740 stores pixel data from the high-frame-rate pixel block 3712 into the corresponding memory block 3853 of the temporary memory 3850 via the bus 3720. At or before timing T2, the control unit 3740 moves the pixel data stored in the memory block 3853 into the adjacent memory block 3854. At timing T2, in synchronization with the high frame rate, the control unit 3740 stores pixel data from the pixel block 3712 into the corresponding memory block 3853 of the temporary memory 3850 via the bus 3720.
At or before timing T3, the control unit 3740 moves the pixel data stored in the memory block 3853 into the adjacent memory block 3855. At timing T3, in synchronization with the high frame rate, the control unit 3740 stores pixel data from the pixel block 3712 into the corresponding memory block 3853 of the temporary memory 3850 via the bus 3720. At or before timing T4, the control unit 3740 moves the pixel data stored in the memory block 3853 into the adjacent memory block 3856. At timing T4, in synchronization with the high frame rate, the control unit 3740 stores pixel data from the pixel block 3712 into the corresponding memory block 3853 of the temporary memory 3850 via the bus 3720.
At or before timing T5, the control unit 3740 moves the pixel data stored in the memory blocks 3854, 3855, 3856 and 3857 of the temporary memory 3850 into the corresponding storage areas 3864, 3865, 3866 and 3867 of the transfer memory 3860 via the bus 3840. That is, after receiving the last high-frame-rate pixel data preceding a reference timing, and while the pixel data of the next reference timing is being received, the temporary memory 3850 transfers its pixel data to the transfer memory 3860.
In addition, the control unit 3740 may further move the pixel data stored in the memory blocks 3854, 3855, 3856 and 3857 adjacent to the memory block 3853 into other adjacent memory blocks, in synchronization with the high frame rate. The control unit 3740 transfers all of the pixel data stored in the transfer memory 3860 to a subsequent-stage memory or arithmetic circuit.
According to the present embodiment, it suffices to connect, via the transfer paths 3710, only the memory block 3853 corresponding to the high-frame-rate pixel block 3712 to each of the adjacent memory blocks 3854, 3855, 3856 and 3857; there is no need to connect all of the memory blocks via transfer paths 3710. The pixel data can therefore be moved at high speed. Moreover, since a cache memory such as an SRAM can be used as the temporary memory 3850, reading and writing can be performed at high speed. Furthermore, because the memory blocks 3830 of the temporary memory 3850 are not shared, the circuit structure required for writing and reading can be simplified. In the transfer memory 3860, the shared storage areas are only those adjacent to the storage area 3863 corresponding to the high-frame-rate pixel block 3712; the transfer memory 3860 therefore needs no wiring interconnecting the storage areas 3863. Although the case where the temporary memory 3850 has the structure of the memory blocks 3730 shown in Figure 76 has been described, the temporary memory 3850 may instead have the structure of any of the memory blocks 3730 of Figures 74 to 77.
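The two-level buffering just described — a temporary memory absorbing frames at the high rate and a transfer memory refreshed once per reference period — can be sketched behaviorally. This is a minimal illustration under the stated 60 fps / 300 fps assumption; the class and attribute names are invented, not taken from the patent.

```python
# Sketch of the temporary-memory / transfer-memory arrangement of
# Figure 78: high-rate frames accumulate in a temporary buffer, and
# the whole batch is copied to the transfer memory once every
# reference frame period (i.e. every RATIO high-rate frames).

REF_FPS = 60
FAST_FPS = 300
RATIO = FAST_FPS // REF_FPS       # 5 high-rate frames per reference frame

class TwoLevelBuffer:
    def __init__(self):
        self.temporary = []       # holds up to RATIO high-rate frames
        self.transfer = []        # snapshot handed to downstream circuits

    def store_fast_frame(self, frame):
        """Store one high-rate frame; returns True on the timings at
        which the transfer memory is refreshed."""
        self.temporary.append(frame)
        if len(self.temporary) == RATIO:
            # copy to the transfer memory before the next reference timing
            self.transfer = list(self.temporary)
            self.temporary.clear()
            return True
        return False

buf = TwoLevelBuffer()
updated = [buf.store_fast_frame(t) for t in range(RATIO)]
# the transfer memory is refreshed only on the 5th high-rate frame
```

Because only the temporary buffer is written at the high rate, the fast memory (e.g. SRAM) can stay small while the larger transfer memory is accessed once per reference period, matching the simplification argued above.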
Figure 79 is a flowchart showing an example of another operation in which the image-capturing apparatus generates and records a moving image. In Figure 79, operations identical to those of Figure 67 are given the same reference numerals and their description is omitted.
In the operation of Figure 79, the thinning rate, instead of the frame rate of Figure 67, is made different between the attention region 3172 and the peripheral region 3176, or the thinning rate is made different in addition to the frame rate of Figure 67. More specifically, in step S3120, for the pixel blocks 3131 included in the attention region 3172, the drive unit 3502 performs charge accumulation and image-signal output on the pixels remaining after thinning at a low thinning rate, and for the pixel blocks 3131 included in the peripheral region 3176, it performs charge accumulation and image-signal output on the pixels remaining after thinning at a high thinning rate. For example, the thinning rate of the pixel blocks 3131 included in the attention region 3172 is set to 0, that is, all pixels are read, and the thinning rate of the pixel blocks 3131 included in the peripheral region 3176 is set to 0.5, that is, half the pixels are read.
In this case, the drive unit 3502 individually drives the group consisting of the reset transistors 3303, transfer transistors 3302 and selection transistors 3305 of the pixel blocks 3131 included in the peripheral region 3176 and the group consisting of the reset transistors 3303, transfer transistors 3302 and selection transistors 3305 of the pixel blocks 3131 included in the attention region 3172, thereby acquiring image signals at different thinning rates.
In step S3110, the moving-image generating unit 3154 generates the attention-region moving image corresponding to the attention region 3172 based on the image signal of the attention region 3172 output at the low thinning rate. Likewise, the moving-image generating unit 3154 generates the peripheral-region moving image corresponding to the peripheral region 3176 based on the image signal of the peripheral region 3176 output at the high thinning rate. In step S3112, the moving-image generating unit 3154 adds the respective thinning-rate information to the attention-region moving image and the peripheral-region moving image and records them in the recording unit 3505.
Figure 80 shows an example of the pixels 3188 read in one pixel block at a thinning rate of 0.5. In the example shown in Figure 80, when the pixel blocks 3132 of the peripheral region 3176 have a Bayer array, rows are alternately designated as pixels 3188 to be read and pixels not to be read, one Bayer-array unit at a time in the vertical direction, that is, every two rows in pixel units. Thinned reading can thereby be performed without upsetting the color balance.
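The Bayer-unit row thinning of Figure 80 can be expressed as a small helper: at a thinning rate of 0.5, rows are kept or skipped two at a time, so every retained pair still spans a full Bayer period and color balance is preserved. A hypothetical sketch, not from the patent; the starting phase of the pattern is an assumption.

```python
# Row-thinning pattern at rate 0.5 for a Bayer array: rows are grouped
# into two-row Bayer units, and alternate units are read or skipped
# (rows 0-1 read, rows 2-3 skipped, rows 4-5 read, ...).

def rows_to_read(num_rows):
    """Return the pixel-row indices that are read at thinning rate 0.5,
    keeping whole two-row Bayer units."""
    return [r for r in range(num_rows) if (r // 2) % 2 == 0]

kept = rows_to_read(8)
# every kept pair (0-1, 4-5, ...) contains a complete R/G and G/B row,
# so each color channel is sampled in the thinned output
```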
Figure 81 is a flowchart, corresponding to Figure 79, showing the operation in which the image-capturing apparatus reproduces and displays the moving image. In Figure 81, operations identical to those of Figure 73 are given the same reference numerals and their description is omitted.
In step S3170 of Figure 81, the moving-image combining unit 3156 interpolates the pixels of the frame of the peripheral-region moving image so that its resolution matches the resolution of the frame of the attention-region moving image, and then embeds the frame of the attention-region moving image into the frame of the peripheral-region moving image, thereby combining the frames of the display image. An image signal can thus be obtained at high resolution from the attention region 3172 containing the main subject 3171, while the peripheral region 3176 is kept at low resolution, so the data amount can be reduced. Compared with high-speed readout from all pixels, the load of driving and image processing can thereby be reduced, and power consumption and heat generation can be suppressed.
In the examples shown in Figures 61 to 81, the attention region 3172 is rectangular, but the shape of the attention region 3172 is not limited to this. As long as the attention region 3172 lies along the boundary lines of the pixel blocks 3131, it may be a convex polygon, a concave polygon, an annular shape with part of the peripheral region 3176 embedded in its middle, or the like. A plurality of attention regions 3172 may also be provided spaced apart from one another; in that case, different frame rates may be set for the respective attention regions 3172.
The frame rates of the attention region 3172 and the peripheral region 3176 may be variable. For example, the amount of movement of the main subject 3171 may be detected every unit time, and the larger the amount of movement of the main subject 3171, the higher the frame rate set for the attention region 3172. The selection of the pixel blocks 3131 to be included in the attention region 3172 may also be updated at any time within the unit time by tracking the main subject 3171.
The generation of the moving images of Figures 67 and 79 is started by the user pressing the record button, and the reproduction of the moving images of Figures 73 and 81 is started by the user pressing the playback button, but the start timing is not limited to this. As another example, the moving-image generating operation and reproducing operation may be performed continuously in response to a button operation by the user, so that the display unit 3506 performs a live-view image display (also called live-view display). In this case, a display allowing the user to identify the attention region 3172 may be superimposed; for example, the display unit 3506 may display a frame at the boundary line of the attention region 3172, lower the brightness of the peripheral region 3176, or raise the brightness of the attention region 3172.
In the operation of Figure 79, the thinning rate is made different between the attention region 3172 and the peripheral region 3176. Instead of differing the thinning rate, the number of rows whose pixel signals are added together across adjacent rows may be made different. For example, in the attention region 3172 the row count is 1, that is, pixel signals are output without adding adjacent rows, while in the peripheral region 3176 the row count is larger than in the attention region 3172, for example 2, so that the pixel signals of the same-column pixels of two adjacent rows are added and output. In this way, as in Figure 79, the resolution of the attention region 3172 can be kept higher than that of the peripheral region 3176 while the overall signal amount is reduced.
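The adjacent-row addition just described can be illustrated with a short sketch. This is an illustration only, with invented names; in the attention region the row count is 1 (no addition), while in the peripheral region same-column pixels of adjacent rows are summed before output.

```python
# Adjacent-row addition: sum the same-column pixel signals of `lines`
# adjacent rows into one output row. lines=1 reproduces the input
# (attention region); lines=2 halves the row count (peripheral region).

def add_rows(signal, lines):
    """signal: list of rows (lists of pixel values);
    lines: number of adjacent rows summed per output row."""
    return [
        [sum(col) for col in zip(*signal[i:i + lines])]
        for i in range(0, len(signal), lines)
    ]

raw = [[1, 2], [3, 4], [5, 6], [7, 8]]
attention = add_rows(raw, lines=1)    # unchanged, full resolution
peripheral = add_rows(raw, lines=2)   # two rows summed -> half the rows
```

With `lines=2` the peripheral output has half as many rows as the input, which is how the overall signal amount is reduced while the attention region keeps full vertical resolution.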
The moving-image combining unit 3156 may also be provided not in the image processing unit 3511 of the image-capturing apparatus 3500 but in an external display device, for example a PC. Furthermore, the above embodiments are not limited to the case of generating a moving image, and are also applicable to the case of generating a still image.
In the above embodiments, the plurality of pixel blocks 3131 are divided into the two regions of the attention region 3172 and the peripheral region 3176, but the division is not limited to this, and three or more regions may be provided. In this case, the pixel blocks 3131 corresponding to the boundary between the attention region 3172 and the peripheral region 3176 may be treated as a boundary region, and the boundary region may be controlled with values intermediate between the control-parameter values used for the attention region 3172 and the control-parameter values used for the peripheral region 3176. This prevents the boundary between the attention region 3172 and the peripheral region 3176 from appearing unnatural.
The charge accumulation time, the number of accumulations and the like may also be made different between the attention region 3172 and the peripheral region 3176. In this case, the attention region 3172 and the peripheral region 3176 may be divided based on brightness, and an intermediate region may be provided.
Figures 82A and 82B are explanatory diagrams of an example scene and its region division. Figure 82A shows the scene captured in the pixel region of the shooting chip 3113. Specifically, it is a scene in which a shadow subject 3601 and an intermediate subject 3602 contained in an indoor environment and a highlight subject 3603 of the outdoor environment seen inside the window frame 3604 are captured simultaneously. With a conventional capturing element, when photographing a scene with such a large contrast between highlight portion and shadow portion, performing charge accumulation with the highlight portion as the reference produces blocked-up blacks in the shadow portion, while performing charge accumulation with the shadow portion as the reference produces blown-out whites in the highlight portion. That is, to output an image signal of both the highlight portion and the shadow portion with a single uniform charge accumulation, the dynamic range of the photodiode is insufficient for a scene with such a large contrast. In the present embodiment, therefore, the scene is divided into partial regions such as a highlight portion and a shadow portion, and the number of charge accumulations of the photodiodes corresponding to each region is made different from region to region, thereby seeking a substantial expansion of the dynamic range.
Figure 82B shows the region division in the pixel region of the shooting chip 3113. The arithmetic unit 3512 analyzes the scene of Figure 82A captured by the photometric sensor 3503 and divides the pixel region using brightness as the criterion. For example, the system control unit 3501 causes the photometric sensor 3503 to perform scene acquisition a plurality of times while changing the exposure time, and the arithmetic unit 3512 determines the dividing lines of the pixel region with reference to the changes in distribution of the blown-out-white regions and the blocked-up-black regions. In the example of Figure 82B, the arithmetic unit 3512 divides the pixel region into three regions: a shadow region 3611, an intermediate region 3612 and a highlight region 3613.
The dividing lines are defined along the boundaries of the pixel blocks 3131. That is, each divided region contains an integral number of groups. The pixels of the groups contained in the same region perform the same number of charge accumulations and pixel-signal outputs within the period corresponding to the shutter speed determined by the arithmetic unit 3512. Pixels belonging to different regions perform different numbers of charge accumulations and pixel-signal outputs.
Figure 83 is an explanatory diagram of the charge-accumulation control performed for each region divided as in the example of Figures 82A and 82B. On receiving a photography preparation instruction from the user, the arithmetic unit 3512 determines the shutter speed T0 from the output of the photometric sensor 3503. It further divides the pixel region into the shadow region 3611, intermediate region 3612 and highlight region 3613 as described above, and determines the number of charge accumulations from the brightness information of each. The number of charge accumulations is determined so that the pixels do not saturate in any single accumulation; for example, it is determined on the basis that each accumulation operation accumulates about eighty to ninety percent of the charge that can be accumulated.
Here, the shadow region 3611 is set to one accumulation; that is, the charge accumulation time is made to coincide with the determined shutter speed T0. The number of charge accumulations of the intermediate region 3612 is set to two; that is, a single charge accumulation time is T0/2, and charge accumulation is repeated twice during the shutter-speed period T0. The number of charge accumulations of the highlight region 3613 is set to four; that is, a single charge accumulation time is T0/4, and charge accumulation is repeated four times during the shutter-speed period T0.
When a photography instruction is received from the user at time t = 0, the drive unit 3502 applies a reset pulse and a transfer pulse to the pixels of the groups belonging to all of the regions. Triggered by this application, all pixels start charge accumulation.
At time t = T0/4, the drive unit 3502 applies a transfer pulse to the pixels of the groups belonging to the highlight region 3613. It then applies selection pulses successively to the pixels within each group and outputs the respective pixel signals to the output wiring 3309. After the pixel signals of all the pixels in the groups have been output, the drive unit 3502 again applies a reset pulse and a transfer pulse to the pixels of the groups belonging to the highlight region 3613, starting the second charge accumulation.
Since the selective output of the pixel signals takes time, a time difference arises between the end of the first charge accumulation and the start of the second. If this time difference is practically negligible, then, as described above, the time obtained by dividing the shutter speed T0 by the number of charge accumulations is taken as the single charge accumulation time. If it cannot be ignored, the shutter speed T0 is adjusted in consideration of that time, or the single charge accumulation time is made shorter than the time obtained by dividing the shutter speed T0 by the number of charge accumulations.
At time t = T0/2, the drive unit 3502 applies a transfer pulse to the pixels of the groups belonging to the intermediate region 3612 and of the groups belonging to the highlight region 3613. It then applies selection pulses successively to the pixels within each group and outputs the respective pixel signals to the output wiring 3309. After the pixel signals of all the pixels in the groups have been output, the drive unit 3502 again applies a reset pulse and a transfer pulse to the pixels of the groups belonging to the intermediate region 3612 and the groups belonging to the highlight region 3613, starting the second charge accumulation for the intermediate region 3612 and the third charge accumulation for the highlight region 3613.
At time t = 3T0/4, the drive unit 3502 applies a transfer pulse to the pixels of the groups belonging to the highlight region 3613. It then applies selection pulses successively to the pixels within each group and outputs the respective pixel signals to the output wiring 3309. After the pixel signals of all the pixels in the groups have been output, the drive unit 3502 again applies a reset pulse and a transfer pulse to the pixels of the groups belonging to the highlight region 3613, starting the fourth charge accumulation.
At time t = T0, the drive unit 3502 applies a transfer pulse to the pixels of all the regions. It then applies selection pulses successively to the pixels within each group and outputs the respective pixel signals to the output wiring 3309. By the above control, pixel signals for one accumulation are stored in each of the pixel memories 3414 corresponding to the shadow region 3611, pixel signals for two accumulations are stored in each of the pixel memories 3414 corresponding to the intermediate region 3612, and pixel signals for four accumulations are stored in each of the pixel memories 3414 corresponding to the highlight region 3613.
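The per-region timing just described — transfer pulses at T0/4, T0/2, 3T0/4 and T0 for the highlight region, at T0/2 and T0 for the intermediate region, and at T0 for the shadow region — can be summarized numerically. A sketch under an assumed example shutter speed; the function and key names are illustrative and ignore the readout-time correction discussed above.

```python
# Per-region accumulation schedule of Figure 83: within one shutter
# period T0, a region with accumulation count n performs n equal
# accumulations of length T0/n, with transfer pulses at k*T0/n.

T0 = 1 / 60                       # assumed shutter speed in seconds

def accumulation_schedule(counts):
    """counts: {region: accumulation count}. Returns, per region, the
    single-accumulation time and the transfer-pulse timings."""
    return {
        region: {
            "single_time": T0 / n,
            "transfer_at": [T0 * k / n for k in range(1, n + 1)],
        }
        for region, n in counts.items()
    }

sched = accumulation_schedule({"shadow": 1, "middle": 2, "highlight": 4})
# highlight region: transfer pulses at T0/4, T0/2, 3*T0/4 and T0
```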
Alternatively, the drive unit 3502 may apply reset pulses and transfer pulses successively to the pixels of the groups belonging to the respective regions, resetting the pixels of each region in succession; triggered by this application, the pixels of each group start charge accumulation in succession. The drive unit 3502 may then apply a transfer pulse to the pixels of all the regions after the charge accumulation of the pixels of the groups belonging to all the regions has ended, and apply selection pulses successively to the pixels within each group so that the respective pixel signals are output to the output wiring 3309.
These pixel signals are transmitted successively to the image processing unit 3511. The image processing unit 3511 generates high-dynamic-range image data from these pixel signals; the specific processing will be described later.
Figure 84 shows the relationship between the number of accumulations and the dynamic range. The image processing unit 3511 applies integration processing to the pixel signals corresponding to the repeated charge accumulations, thereby forming part of the high-dynamic-range image data.
Taking as the reference the dynamic range of a region with an accumulation count of 1, that is, a region in which one charge accumulation was performed, an accumulation count of 2, that is, a region in which two charge accumulations were performed and the output signals were integrated, expands the dynamic range by one stop. Similarly, an accumulation count of 4 gives an expansion of two stops, and a count of 128 gives seven stops. That is, to expand the dynamic range by n stops, it suffices to integrate 2^n output signals.
Here, so that the image processing unit 3511 can identify how many charge accumulations were performed for each divided region, a 3-bit exponent digit indicating the accumulation count is attached to the image signal. As shown in the figure, the exponent digits are assigned in order: 000 for an accumulation count of 1, 001 for a count of 2, and so on up to 111 for a count of 128.
The image processing unit 3511 refers to the exponent digit of each item of pixel data received from the arithmetic circuit 3415, and performs the integration processing of the pixel data when the referenced result indicates an accumulation count of 2 or more. For example, when the accumulation count is 2 (one stop), the high-order 11 bits of each of the two items of 12-bit pixel data corresponding to the charge accumulations are added together to generate one item of 12-bit pixel data. Similarly, when the accumulation count is 128 (seven stops), the high-order 5 bits of each of the 128 items of 12-bit pixel data corresponding to the charge accumulations are added together to generate one item of 12-bit pixel data. That is, the high-order bits obtained by subtracting from 12 the number of stops corresponding to the accumulation count are added together to generate one item of 12-bit pixel data; the low-order bits not subject to the addition are discarded.
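The high-order-bit addition just described can be written out as a short sketch, which is not part of the patent text: for 2^n accumulations, keep the high-order (12 - n) bits of each 12-bit sample (a right shift by n) and sum, producing one 12-bit result.

```python
# Integration of repeated 12-bit accumulations: for 2**n samples,
# drop the low-order n bits of each (keeping the high-order 12 - n
# bits) and sum. The sum of 2**n values, each at most 2**(12-n) - 1,
# always fits back into 12 bits.

def accumulate(pixels_12bit):
    """pixels_12bit: list of 2**n 12-bit samples of the same pixel
    from repeated charge accumulations; returns one 12-bit value."""
    count = len(pixels_12bit)
    n = count.bit_length() - 1    # number of stops; count must be 2**n
    assert count == 1 << n, "accumulation count must be a power of two"
    return sum(p >> n for p in pixels_12bit)

# two accumulations (one stop): high-order 11 bits of each are added
result = accumulate([0x800, 0x801])
```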
By such processing, the brightness range to which gradation is given can be shifted toward the high-brightness side in accordance with the accumulation count. That is, the 12 bits are allocated to a limited range on the high-brightness side, so gradation can be given to image regions that would conventionally have been blown-out white.
However, since the 12 bits are allocated to different brightness ranges in the different divided regions, image data cannot be generated simply by joining the regions together. The image processing unit 3511 therefore performs requantization processing with the highest-brightness pixel and the lowest-brightness pixel as references, so as to preserve the obtained gradation as far as possible while making the whole region 12-bit image data. Specifically, it performs the quantization with gamma conversion applied, so that the gradation is maintained more smoothly. By such processing, high-dynamic-range image data can be obtained.
The accumulation count is not limited to being given to the pixel data as a 3-bit exponent digit as described above; it may instead be described as additional information separate from the pixel data. The exponent digit may also be omitted from the pixel data, in which case the accumulation count is obtained at the time of the addition processing by counting the number of items of pixel data stored in the pixel memory 3414.
In the above image processing, requantization processing is performed to fit the whole region into 12-bit image data, but the number of output bits may instead be increased relative to the bit count of the pixel data in accordance with the upper-limit accumulation count. For example, if the upper-limit accumulation count is set to 16 (four stops), the whole region becomes 16-bit image data for 12-bit pixel data. With such processing, image data can be generated without dropping any bits.
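As an illustrative aside, the lossless alternative just mentioned is simply a full-precision sum into a wider word. A minimal sketch, assuming the stated 12-bit input and upper-limit count of 16; names are invented.

```python
# Lossless accumulation into a wider output word: with an upper-limit
# accumulation count of 16 (four stops), the sum of up to 16 samples
# of 12-bit data (max 4095 each) always fits in 16 bits, since
# 16 * 4095 = 65520 < 65536. No low-order bits are discarded.

def accumulate_wide(pixels_12bit, max_count=16):
    assert len(pixels_12bit) <= max_count
    total = sum(pixels_12bit)     # full-precision sum, no bits dropped
    assert total < 1 << 16        # guaranteed by the upper-limit count
    return total

wide = accumulate_wide([4095] * 16)   # worst case still fits in 16 bits
```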
Next, a series of photographing-operation processes will be described. Figure 85 is a flowchart showing the photographing-operation processes. The flow starts when the power of the image-capturing apparatus 3500 is turned on.
In step S3201, the system control unit 3501 stands by until the switch SW1 is pressed as the photography preparation instruction. When the pressing of the switch SW1 is detected, the flow proceeds to step S3202.
In step S3202, the system control unit 3501 performs photometry processing. Specifically, it obtains the output of the photometric sensor 3503, and the arithmetic unit 3512 calculates the luminance distribution of the scene. The flow then proceeds to step S3203, where the shutter speed, region division, accumulation counts and so on are determined as described above.
When the photography preparation operations are completed, the flow proceeds to step S3204 and stands by until the switch SW2 is pressed as the photography instruction. If the elapsed time here exceeds a preset time Tw (YES in step S3205), the flow returns to step S3201. If the pressing of the switch SW2 is detected before Tw is exceeded (NO in step S3205), the flow proceeds to step S3206.
In step S3206, the drive unit 3502, receiving the instruction of the system control unit 3501, performs the charge accumulation processing and signal readout processing described with reference to Figure 83. When the readout of all the signals is completed, the flow proceeds to step S3207, the image processing described with reference to Figure 84 is performed, and recording processing that records the generated image data in the recording unit is performed.
When the recording processing is completed, the flow proceeds to step S3208, and it is judged whether the power of the image-capturing apparatus 3500 has been turned off. If the power has not been turned off, the flow returns to step S3201; if it has been turned off, the series of photographing-operation processes ends.
Figure 86 is a block diagram showing a specific configuration as one example of the signal processing chip 3111. In the figure, the region enclosed by the dotted line shows the pixel data processing unit 3910 provided for each pixel block 3131.
The signal processing chip 3111 serves the function of the drive unit 3502. The signal processing chip 3111 includes a sensor control unit 3441, a block control unit 3442, a synchronization control unit 3443, and a signal control unit 3444 as shared control functions, as well as a drive control unit 3420 that supervises these control units. The drive control unit 3420 converts instructions from the system control unit 3501 into control signals executable by each control unit and delivers them to the respective control units.
The sensor control unit 3441 is responsible for outputting, to the image capturing chip 3113, the control pulses related to the charge accumulation and charge readout of each pixel. Specifically, it controls the start and end of charge accumulation by outputting reset pulses and transfer pulses to the target pixels, and causes pixel signals to be output to the output wiring 3309 by outputting selection pulses to the pixels to be read.
The block control unit 3442 outputs, to the image capturing chip 3113, specifying pulses that designate the pixel blocks 3131 to be controlled. As described with reference to Figure 82B and elsewhere, a divided region contains a plurality of mutually adjacent pixel blocks 3131. The pixel blocks 3131 belonging to the same region form one block. The pixels included in the same block start charge accumulation at the same timing and end charge accumulation at the same timing. The block control unit 3442 thus plays the role of grouping the pixel blocks 3131 into blocks by outputting specifying pulses to the target pixel blocks 3131 based on designations from the drive control unit 3420. The transfer pulse and the reset pulse that each pixel receives via the TX wiring 3307 and the reset wiring 3306 are the logical AND of the respective pulse output by the sensor control unit 3441 and the specifying pulse output by the block control unit 3442.
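The gating described here, where a pixel sees a reset or transfer pulse only when both the sensor control unit's pulse and the block control unit's specifying pulse are active, can be modeled as a bitwise AND over time slots (an illustrative sketch, not the actual circuit):

```python
def gated_pulse(sensor_pulse: int, block_select: int) -> int:
    """Pulse train reaching a pixel: per-slot logical AND of the sensor
    control unit's pulse and the block control unit's specifying pulse."""
    return sensor_pulse & block_select

# The sensor controller emits pulses in every slot; only pixel blocks
# whose specifying pulse is high actually receive them (MSB = first slot):
sensor = 0b11111111           # pulses in all eight time slots
block_a = 0b11110000          # block A is selected for the first four slots
print(bin(gated_pulse(sensor, block_a)))  # 0b11110000
```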
In this way, by controlling each region as an independent block, the charge accumulation control described with reference to Figure 83 is realized. The block designation from the drive control unit will be described in detail later. Note that the pixels included in the same block need not start charge accumulation at the same timing. That is, the drive control unit 3420 may apply the reset pulses and the transfer pulses to the pixels included in the same block at different timings. Also, having made the charge accumulation times of the pixels included in the same block identical, the drive control unit 3420 may, after accumulation ends, apply selection pulses to the pixels in the block in sequence and read out the respective pixel signals in sequence.
The synchronization control unit 3443 outputs a synchronization signal to the image capturing chip 3113. Each pulse becomes effective in the image capturing chip 3113 in synchronization with the synchronization signal. For example, by adjusting the synchronization signal, random control, thinning control, and the like that target only specific pixels among the pixels belonging to the same pixel block 3131 can be realized.
The signal control unit 3444 is mainly responsible for the timing control of the A/D converter 3412. The pixel signals output via the output wiring 3309 are input to the A/D converter 3412 via the CDS circuit 3410 and the multiplexer 3411. The A/D converter 3412 is controlled by the signal control unit 3444 and converts the input pixel signals into digital pixel data. The pixel data converted into digital signals is delivered to the demultiplexer 3413 and then stored, as digital pixel values, in the pixel memory 3414 corresponding to each pixel. The pixel memory 3414 is an example of the memory block 3730.
The signal processing chip 3111 has a timing memory 3430 as an accumulation-control memory. The timing memory 3430 stores block classification information indicating which pixel blocks 3131 are combined to form each block, and accumulation count information indicating how many times each formed block repeats charge accumulation. The timing memory 3430 is composed of, for example, flash RAM.
As described above, which pixel blocks 3131 are combined to form a block is determined by the system control unit 3501 based on the detection result of the luminance distribution of the scene, performed before the series of imaging processing. The determined blocks are classified as, for example, the 1st block, the 2nd block, and so on, and each block is specified by which pixel blocks 3131 it contains. The drive control unit 3420 receives this block classification information from the system control unit 3501 and stores it in the timing memory 3430.
In addition, the system control unit 3501 determines, based on the detection result of the luminance distribution, how many times each block repeats charge accumulation. The drive control unit 3420 receives this accumulation count information from the system control unit 3501 and stores it in the timing memory 3430 paired with the corresponding block classification information. By storing the block classification information and the accumulation count information in the timing memory 3430 in this way, the drive control unit 3420 can execute the series of charge accumulation controls on its own while referring to the timing memory 3430. That is, once the drive control unit 3420 receives an image-capture instruction signal from the system control unit 3501 when acquiring one image, it can thereafter complete the accumulation control for each pixel without receiving an instruction from the system control unit 3501 each time.
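The pairing of block classification information with accumulation count information can be pictured as a small table keyed by block number (a hypothetical structure for illustration; the patent does not specify a memory layout):

```python
# Timing memory contents, sketched: each entry pairs the pixel blocks a
# block combines with the number of charge accumulations it repeats.
timing_memory = {
    1: {"pixel_blocks": [(0, 0), (0, 1), (1, 0), (1, 1)], "accumulations": 4},
    2: {"pixel_blocks": [(2, 0), (2, 1)],                 "accumulations": 1},
}

def accumulations_for(block_id):
    """The drive control unit can answer 'how many accumulations?' by
    itself, without querying the system control unit each time."""
    return timing_memory[block_id]["accumulations"]

print(accumulations_for(1))  # 4
```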
The drive control unit 3420 receives from the system control unit 3501 the block classification information and the accumulation count information updated based on the photometry result (the detection result of the luminance distribution) executed in synchronization with the image-capture preparation instruction, and updates the stored contents of the timing memory 3430 as appropriate. For example, the drive control unit 3420 updates the timing memory 3430 in synchronization with the image-capture preparation instruction or the image-capture instruction. With this configuration, faster charge accumulation control can be realized, and while the drive control unit 3420 executes the charge accumulation control, the system control unit 3501 can execute other processing in parallel.
The drive control unit 3420 refers to the timing memory 3430 not only when executing the charge accumulation control for the image capturing chip 3113 but also when executing the readout control. For example, the drive control unit 3420 refers to the accumulation count information of each block and stores the pixel data output from the demultiplexer 3413 at the corresponding address of the pixel memory 3414.
The drive control unit 3420 reads the target pixel data from the pixel memory 3414 for each pixel block in accordance with a delivery request from the system control unit 3501, and delivers it to the image processing unit 3511. At this time, the drive control unit 3420 also delivers to the image processing unit 3511 the additional data corresponding to each piece of target pixel data. As described above, the pixel memory 3414 has, for each pixel block, a memory space capable of storing the pixel signals corresponding to the maximum accumulation count, and stores, as pixel values, the pieces of pixel data corresponding to the number of accumulations actually performed. For example, when charge accumulation is repeated four times in a certain block, the pixels included in that block output pixel signals for four accumulations, so four pixel values are stored in the memory space of each pixel in the pixel memory 3414. When the drive control unit 3420 receives from the system control unit 3501 a delivery request for the pixel data of a specific pixel, it specifies the address of that specific pixel on the pixel memory 3414, reads out all the stored pixel data, and delivers them to the image processing unit 3511. For example, when four pixel values are stored, all four pixel values are delivered in sequence; when only one pixel value is stored, that pixel value is delivered.
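The per-pixel storage and delivery behavior might be sketched like this (a hypothetical data layout, not the patent's): each pixel's slot holds up to the maximum accumulation count of values, and a delivery request returns everything stored for that pixel.

```python
from collections import defaultdict

class PixelMemory:
    """Sketch of pixel memory 3414: each pixel address can hold up to
    max_accumulations pixel values (one per charge accumulation)."""
    def __init__(self, max_accumulations=16):
        self.max_accumulations = max_accumulations
        self.slots = defaultdict(list)

    def store(self, address, value):
        if len(self.slots[address]) >= self.max_accumulations:
            raise OverflowError("slot full for this pixel")
        self.slots[address].append(value)

    def deliver(self, address):
        """A delivery request returns all values stored for the pixel."""
        return list(self.slots[address])

mem = PixelMemory()
for v in [1000, 1010, 990, 1005]:       # a block accumulated 4 times
    mem.store((3, 7), v)
print(mem.deliver((3, 7)))              # [1000, 1010, 990, 1005]
```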
The drive control unit 3420 can read the pixel data stored in the pixel memory 3414 out to the arithmetic circuit 3415 and cause the arithmetic circuit 3415 to execute the integration processing described above. The integrated pixel data is stored at the target pixel address of the pixel memory 3414. The target pixel address may be provided adjacent to the address space used before integration, or may be the same address so that the pre-integration pixel data is overwritten. A dedicated space may also be provided in which the post-integration pixel values of the pixels are stored collectively. When the drive control unit 3420 receives from the system control unit 3501 a delivery request for the pixel data of a specific pixel, it can deliver the post-integration pixel data to the image processing unit 3511 depending on the form of the delivery request. Of course, the pixel data before and after the integration processing may also be delivered together.
The pixel memory 3414 is provided with a data transfer interface that transfers pixel data in accordance with delivery requests. The data transfer interface is connected to the data line 3920, which in turn is connected to the image processing unit 3511. The data line 3920 is composed of, for example, a serial bus. In this case, delivery requests from the system control unit 3501 to the drive control unit 3420 are executed by address specification using an address bus.
The transfer of pixel data by the data transfer interface is not limited to an address-specification method; various methods can be adopted. For example, a double data rate method may be adopted in which processing uses both the rising edge and the falling edge of the clock signal with which the circuits are synchronized. A burst transfer method that seeks higher speed by omitting some steps such as address specification and transferring the data in one burst may also be adopted. Further, a bus method using circuits in which the control unit, the storage unit, and the input/output unit are connected in parallel, a serial method that transfers data serially one bit at a time, and the like may be adopted in combination.
With this configuration, the image processing unit 3511 can receive only the necessary pixel data; therefore, particularly when forming a low-resolution image, the image processing can be completed at high speed. In addition, when the arithmetic circuit 3415 is caused to execute the integration processing, the image processing unit 3511 need not execute the integration processing, so the image processing can be sped up through function sharing and parallel processing.
The signal processing chip 3111 of Figure 86 can also be used to perform image processing after acquiring pixel data in the attention region 3172 and the peripheral region 3176 with different control parameters. For example, in Figure 67 to Figure 70, a moving image is generated from images acquired at different frame rates in the attention region 3172 and the peripheral region 3176; alternatively, image processing that averages the images acquired at the high frame rate to improve the S/N ratio may be performed. In this case, while the drive control unit 3420 acquires a pixel signal once from the peripheral region 3176, it acquires image signals from the attention region 3172 a plurality of times, for example four times, and stores the pixel data in the pixel memory 3414. The arithmetic circuit 3415 reads out from the pixel memory 3414 the plurality of pieces of pixel data acquired for each pixel of the attention region 3172 and averages them for each pixel. Thereby, the random noise in each pixel of the attention region 3172 is reduced, and the S/N ratio of the attention region 3172 can be improved.
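The S/N gain from this averaging can be checked numerically: averaging n frames reduces uncorrelated random noise by a factor of √n, so four attention-region frames per peripheral frame roughly doubles the S/N ratio (an illustrative simulation, not data from the patent):

```python
import random
import statistics

random.seed(0)
true_signal = 100.0
noise_sigma = 10.0

# Four high-frame-rate captures of the same 10000-pixel attention region,
# each pixel carrying independent Gaussian random noise:
frames = [[true_signal + random.gauss(0, noise_sigma) for _ in range(10000)]
          for _ in range(4)]

single_noise = statistics.pstdev(frames[0])
averaged = [sum(px) / 4 for px in zip(*frames)]
avg_noise = statistics.pstdev(averaged)

# Ratio is close to sqrt(4) = 2: averaging 4 frames halves the noise.
print(round(single_noise / avg_noise, 2))
```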
In addition, a memory 930 may be connected to the data line 3920. The memory 930 may be a volatile memory that sequentially stores pixel data from the pixel memory 3414 at specified addresses; for example, the memory 930 is a DRAM. The transfer rate of pixel data from the pixel memory 3414 to the memory 930 may be equal to the reference frame rate or lower than it. The memory 930 functions as a buffer for data transfer from the pixel memory 3414 to the image processing unit 3511. That is, when the data transfer rate from the plurality of pixel memories 3414 is faster than the data processing rate of the image processing unit 3511, the memory 930 buffers at least part of the pixel data output by the pixel memories 3414. For example, the memory 930 stores the pixel data delivered from the pixel memory 3414 at each reference frame interval together with the pixel data of the pixel blocks 3131 operating at the high-speed frame rate.
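The buffering role of the memory 930 can be sketched as a FIFO between a fast producer (the pixel memories) and a slower consumer (the image processing unit); this is an illustrative model only, with hypothetical class and method names:

```python
from collections import deque

class FrameBuffer:
    """Sketch of memory 930: absorbs bursts when the pixel memories
    produce data faster than the image processing unit consumes it."""
    def __init__(self):
        self.fifo = deque()

    def push(self, pixel_data):        # arriving from pixel memory 3414
        self.fifo.append(pixel_data)

    def pop(self):                     # drained by image processing unit 3511
        return self.fifo.popleft() if self.fifo else None

buf = FrameBuffer()
for frame in ["hi-rate-0", "hi-rate-1", "hi-rate-2", "hi-rate-3"]:
    buf.push(frame)                    # high-speed region: 4 frames arrive at once
print(buf.pop())                       # consumer drains at its own pace: hi-rate-0
```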
The present invention has been described above using embodiments, but the technical scope of the present invention is not limited to the scope described in the above embodiments. It is apparent to those skilled in the art that numerous changes or improvements can be made to the above embodiments. It is also apparent from the description of the claims that modes to which such changes or improvements are made can be included in the technical scope of the present invention.
It should be noted that the execution order of the processes, such as the operations, procedures, steps, and stages in the devices, systems, programs, and methods shown in the claims, the specification, and the drawings, can be realized in any order unless expressly indicated by "before", "prior to", or the like, and as long as the output of a preceding process is not used in a subsequent process. Even if the operation flows in the claims, the specification, and the drawings are described using "first", "next", and the like for convenience of explanation, this does not mean that they must be implemented in that order.

Claims (61)

1. An image sensor, characterized by having:
an imaging unit having a plurality of groups each composed of at least one pixel, and a plurality of signal readout units provided for each of the groups to read out signals from the pixels; and
a control unit that controls the signal readout unit of at least one group among the plurality of groups.
2. The image sensor according to claim 1, characterized in that
each of the plurality of groups includes a plurality of the pixels.
3. The image sensor according to claim 1 or 2, characterized in that
the control unit selects at least one group among the plurality of groups and controls its signal readout unit with a control parameter different from that of the other groups among the plurality of groups.
4. The image sensor according to claim 3, characterized in that
the control parameter includes a frame rate, and
the control unit controls the signal readout unit corresponding to the at least one group at a first frame rate and controls the signal readout units corresponding to the other groups at a second frame rate different from the first frame rate.
5. An imaging device, characterized by
having the image sensor according to claim 4,
and having a moving-image generation unit that generates a moving image of a first partial region corresponding to the at least one group based on the signals of the at least one group output at the first frame rate, and generates a moving image of a second partial region corresponding to the other groups based on the signals of the other groups output at the second frame rate.
6. The imaging device according to claim 5, characterized in that
the moving-image generation unit stores, in a storage unit, region information indicating the range of the first partial region relative to the entire region captured by the imaging unit and timing information indicating the relation between the output timing of the signals output at the first frame rate and the output timing of the signals output at the second frame rate, in association with the moving image of the first partial region and the moving image of the second partial region.
7. The image sensor according to claim 3, characterized in that
the control parameter includes a thinning rate, and
the control unit controls the signal readout unit corresponding to the at least one group at a first thinning rate and controls the signal readout units corresponding to the other groups at a second thinning rate different from the first thinning rate.
8. An imaging device, characterized by
having the image sensor according to claim 7,
and having a moving-image generation unit that generates a moving image of a first partial region corresponding to the at least one group based on the signals of the at least one group output at the first thinning rate, and generates a moving image of a second partial region corresponding to the other groups based on the signals of the other groups output at the second thinning rate.
9. The imaging device according to claim 8, characterized in that
the moving-image generation unit stores, in a storage unit, region information indicating the range of the first partial region relative to the entire region captured by the imaging unit and thinning information indicating the relation between the first thinning rate and the second thinning rate, in association with the moving image of the first partial region and the moving image of the second partial region.
10. The image sensor according to claim 3, characterized in that
the control parameter includes the number of rows or columns over which signals from pixels are added, and
the control unit controls the signal readout unit corresponding to the at least one group so as to add signals over a first number of rows or columns, and controls the signal readout units corresponding to the other groups so as to add signals over a second number of rows or columns different from the first number.
11. The image sensor according to claim 3, characterized in that
the control parameter includes a charge accumulation time, and
while the plurality of groups perform one charge accumulation, the control unit causes the at least one group to perform charge accumulation a plurality of times and to output each resulting signal.
12. The image sensor according to claim 3, characterized in that
the control parameter includes the number of bits with which the pixel signal is digitized, and
the control unit digitizes the signals of the at least one group with a larger number of bits than the other groups among the plurality of groups.
13. The image sensor according to any one of claims 1 to 4, 7, and 10 to 12, characterized by further having:
a subject estimation unit that estimates a main subject based on an image captured by the imaging unit; and
a group selection unit that selects, as the at least one group, the group on which light from the main subject estimated by the subject estimation unit is incident.
14. The image sensor according to any one of claims 3, 4, 7, and 10 to 12, characterized in that
for a group located on the boundary between the at least one group and the other groups among the plurality of groups, the control unit uses, as the control parameter, an intermediate value between the value of the control parameter of the at least one group and the value of the control parameter of the other groups.
15. The image sensor according to any one of claims 1 to 4, 7, and 10 to 14, characterized in that
an image capturing chip in which the plurality of groups are arranged two-dimensionally and a signal processing chip in which at least part of the control unit is arranged are stacked.
16. The image sensor according to claim 15, characterized in that
the image capturing chip is formed as a backside-illumination CMOS chip.
17. An image sensor, characterized by having:
an imaging unit having a plurality of groups each composed of at least one pixel, and a plurality of signal readout units provided for each of the groups to read out signals from the pixels; and
a plurality of control units provided for each of the groups, which control the signal readout units based on the signals from the pixels.
18. The image sensor according to claim 17, characterized in that
each of the plurality of groups includes a plurality of the pixels.
19. An image sensor, characterized by having:
an imaging unit having an imaging region in which a first pixel and a second pixel are provided, a first readout circuit that reads out a first pixel signal output from the first pixel, and a second readout circuit that reads out a second pixel signal output from the second pixel;
a first arithmetic unit that computes a first evaluation value based on the first pixel signal;
a second arithmetic unit that computes a second evaluation value based on the second pixel signal;
a first control unit that performs control related to exposure or readout of the first pixel based on the first evaluation value; and
a second control unit that performs control related to exposure or readout of the second pixel based on the second evaluation value.
20. The image sensor according to claim 19, characterized in that
the imaging region has a first region in which a plurality of the first pixels are provided and a second region in which a plurality of the second pixels are provided,
the first control unit performs control related to exposure or readout of the plurality of first pixels provided in the first region, and
the second control unit performs control related to exposure or readout of the plurality of second pixels provided in the second region.
21. The image sensor according to claim 19 or 20, characterized in that
the first control unit controls the frame rate at which the first pixel is read out, based on the first evaluation value, and
the second control unit controls the frame rate at which the second pixel is read out, based on the second evaluation value.
22. The image sensor according to claim 20, characterized in that
the first control unit controls, based on the first evaluation value, the thinning rate at which the plurality of first pixels provided in the first region are thinned out and read out, and
the second control unit controls, based on the second evaluation value, the thinning rate at which the plurality of second pixels provided in the second region are thinned out and read out.
23. The image sensor according to claim 20, characterized in that
the first control unit controls, based on the first evaluation value, the number of pixels among the plurality of first pixels provided in the first region whose signals are added and read out, and
the second control unit controls, based on the second evaluation value, the number of pixels among the plurality of second pixels provided in the second region whose signals are added and read out.
24. The image sensor according to any one of claims 19 to 23, characterized in that
the first arithmetic unit computes the first evaluation value based on the second pixel signal.
25. The image sensor according to any one of claims 19 to 23, characterized in that
the first arithmetic unit computes the first evaluation value based on the second evaluation value.
26. The image sensor according to any one of claims 19 to 25, characterized by being formed of:
an image capturing chip having the imaging unit; and
a signal processing chip having the first arithmetic unit and the second arithmetic unit, stacked on and joined to the image capturing chip.
27. The image sensor according to claim 26, characterized in that
the image capturing chip is formed as a backside-illumination CMOS chip.
28. An imaging device, characterized by having the image sensor according to any one of claims 19 to 27.
29. An image sensor, characterized by having:
an imaging unit having a plurality of groups each composed of at least one pixel, and a plurality of signal readout units provided for each of the groups to read out signals from the pixels; and
a plurality of arithmetic units provided for each of the groups, which transmit information related to the control of the signal readout units to an image processing unit that performs image processing on the signals.
30. The image sensor according to claim 1, characterized in that
each of the plurality of groups is composed of a plurality of pixels.
31. An image sensor, characterized by having:
an imaging unit having an imaging region in which a first pixel and a second pixel are arranged, a first readout circuit that reads out a first pixel signal output from the first pixel, and a second readout circuit that reads out a second pixel signal output from the second pixel;
a first arithmetic unit that computes a first evaluation value based on the first pixel signal and transmits the computed first evaluation value to a subsequent-stage image processing unit that performs image processing on first pixel data corresponding to the first pixel signal; and
a second arithmetic unit that computes a second evaluation value based on the second pixel signal and transmits the computed second evaluation value to the subsequent-stage image processing unit that performs image processing on second pixel data corresponding to the second pixel signal.
32. The image sensor according to claim 31, characterized in that
the imaging region has a first pixel block in which a plurality of the first pixels are arranged and a second pixel block in which a plurality of the second pixels are arranged,
the first arithmetic unit calculates the first evaluation value based on the plurality of first pixel signals output by the plurality of first pixels included in the first pixel block, and
the second arithmetic unit calculates the second evaluation value based on the plurality of second pixel signals output by the plurality of second pixels included in the second pixel block.
33. The image sensor according to claim 31 or 32, characterized in that
the first arithmetic unit associates the first evaluation value with the first pixel data, and
the second arithmetic unit associates the second evaluation value with the second pixel data.
34. The image sensor according to any one of claims 31 to 33, characterized in that
the first arithmetic unit attaches to the first evaluation value a data code indicating the computation content of the first evaluation value, and
the second arithmetic unit attaches to the second evaluation value a data code indicating the computation content of the second evaluation value.
35. The image sensor according to any one of claims 31 to 34, characterized in that
the first arithmetic unit computes the first evaluation value also based on the second pixel signal.
36. The image sensor according to any one of claims 31 to 35, characterized in that
the first arithmetic unit computes the first evaluation value also based on the second evaluation value of the second arithmetic unit or an intermediate result obtained by the second arithmetic unit in the process of computing the second evaluation value.
37. The image sensor according to claim 36, characterized in that
the first arithmetic unit has:
an own-block calculation unit that performs a predetermined computation on the first pixel data; and
an averaging calculation unit that performs a predetermined computation on the second evaluation value or intermediate result of the second arithmetic unit and on the computation result of the own-block calculation unit,
the first arithmetic unit transmits the computation result of the own-block calculation unit to the second arithmetic unit, and
transmits the computation result of the averaging calculation unit to the image processing unit as the first evaluation value.
38. The image sensor according to claim 37, characterized in that
the first arithmetic unit further has an averaging-compression calculation unit that compresses the first pixel data based on the computation result of the averaging calculation unit.
39. The image sensor according to any one of claims 31 to 34, characterized in that
at least one of the first arithmetic unit and the second arithmetic unit performs a predetermined computation on the pixel data in the current frame using the pixel data in the preceding frame.
40. The image sensor according to claim 32, characterized by further having:
a first A/D converter provided corresponding to the first pixel block, which converts each first pixel signal into the first pixel data; and
a second A/D converter provided corresponding to the second pixel block, which converts each second pixel signal into the second pixel data.
41. The image sensor according to any one of claims 31 to 40, characterized in that
the imaging unit is formed on an image capturing chip, and
the first arithmetic unit and the second arithmetic unit are formed on a signal processing chip stacked with the image capturing chip.
42. The image sensor according to claim 41, characterized in that
the image capturing chip is a backside-illumination CMOS chip.
43. The image sensor according to any one of claims 31 to 42, characterized in that
the image processing unit is provided on the image sensor and performs image processing on the first pixel data and the second pixel data based on the first evaluation value and the second evaluation value.
44. An imaging device having the image sensor according to any one of claims 31 to 42.
45. The imaging device according to claim 44, characterized in that
the image processing unit is provided outside the image sensor and performs image processing on the first pixel data and the second pixel data based on the first evaluation value and the second evaluation value.
46. An image sensor, comprising:
an imaging unit having a plurality of groups each made up of at least one pixel; and
a storage unit having a plurality of memory blocks provided in correspondence with the plurality of groups, each memory block storing signals of pixels from the corresponding group as well as signals from pixels outside the corresponding group.
47. The image sensor according to claim 46, wherein
each of the plurality of groups is made up of a plurality of pixels.
48. The image sensor according to claim 46 or 47, wherein
for each of the groups, the frame rate that defines the cycle at which the signal is generated is selectable at least from a reference frame rate and a high-speed frame rate whose cycle is shorter than that of the reference frame rate,
the image sensor further comprising a control unit that stores the signal of a group at the high-speed frame rate in the memory block corresponding to a group at the reference frame rate.
49. The image sensor according to claim 48, wherein
the groups are organized into blocks each made up of a plurality of the groups distributed at fixed intervals along the row and column directions over the entire pixel region, the memory blocks corresponding to the groups in a block being shared by all the groups in that block, and
when the memory block corresponding to a group at the high-speed frame rate already holds a signal, the control unit stores the signal of that group in any of the memory blocks within the same block.
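The buffer-sharing idea in claims 48 and 49 can be sketched in Python. This is an illustrative model only, not the patented implementation, and the names (`BlockBuffer`, `store`) are hypothetical: each block owns one memory slot per group, and a group read out at the high-speed frame rate spills into any free slot of the same block when its own slot is already occupied.

```python
class BlockBuffer:
    """Illustrative model of the shared memory blocks within one block
    of pixel groups (claims 48-49); all names are assumptions."""

    def __init__(self, group_ids):
        # One memory slot per group in this block; None means free.
        self.slots = {gid: None for gid in group_ids}

    def store(self, group_id, signal):
        # Prefer the group's own memory slot.
        if self.slots[group_id] is None:
            self.slots[group_id] = (group_id, signal)
            return group_id
        # Own slot occupied (high-speed group): use any free slot in
        # the same block, as the memory blocks are shared block-wide.
        for gid, content in self.slots.items():
            if content is None:
                self.slots[gid] = (group_id, signal)
                return gid
        raise BufferError("no free memory slot in this block")
```

For example, a second store from the same high-speed group lands in a neighboring group's free slot instead of overwriting the first signal.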
50. The image sensor according to claim 46 or 47, wherein
for each of the groups, the frame rate that defines the cycle at which the signal is generated is selectable at least from a reference frame rate and a high-speed frame rate whose cycle is shorter than that of the reference frame rate,
the image sensor further comprising a control unit that, when the memory block corresponding to a group at the high-speed frame rate already holds a signal, moves the signal of that memory block to a memory block corresponding to a group at the reference frame rate and then stores the signal of the high-speed group in its corresponding memory block.
51. The image sensor according to claim 50, further comprising
a transfer path that conveys the signals between adjacent memory blocks corresponding to the groups, wherein
the control unit, in synchronization with the high-speed frame rate, moves the signals corresponding to the groups at the high-speed frame rate to the adjacent memory blocks in sequence.
52. The image sensor according to claim 51, wherein
the control unit moves the signals between adjacent memory blocks starting from the memory block nearest to the edge of the pixel region.
53. The image sensor according to claim 52, wherein
the control unit fixes the frame rate of the groups located along the outermost periphery of the pixel region, among the plurality of groups, at the reference frame rate.
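Claims 51 and 52 describe moving signals step by step between adjacent memory blocks, starting from the block nearest the edge of the pixel region. A minimal sketch, assuming the memory blocks are modeled as a list ordered from the edge inward (the function name `shift_toward_edge` is hypothetical):

```python
def shift_toward_edge(memory_blocks):
    """One tick synchronized with the high-speed frame rate (illustrative
    model of claims 51-52). Index 0 is the memory block nearest the edge
    of the pixel region: its signal leaves first, every other signal then
    moves one block toward the edge, and the innermost block is freed."""
    if not memory_blocks:
        return None
    leaving = memory_blocks[0]
    for i in range(len(memory_blocks) - 1):
        memory_blocks[i] = memory_blocks[i + 1]
    memory_blocks[-1] = None
    return leaving
```

Starting from the edge avoids overwriting a neighbor's signal before it has itself moved, which is why claim 52 fixes the move order.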
54. The image sensor according to any one of claims 48 to 53, wherein
each of the memory blocks is a memory provided for its corresponding group.
55. The image sensor according to any one of claims 46 to 54, wherein
the storage unit further comprises a transfer memory having a storage area at least equal in size to the total storage area of the memory blocks, to which the signals stored in the memory blocks are transferred at predetermined intervals.
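Claim 55's transfer memory, which is at least as large as all the memory blocks combined and receives their contents at predetermined intervals, can be modeled as follows. This is an illustrative sketch; `TransferMemory` and `transfer` are assumed names, not from the claims.

```python
class TransferMemory:
    """Illustrative model of claim 55: a buffer whose storage area is at
    least the combined size of the memory blocks, snapshotted each cycle."""

    def __init__(self, total_blocks):
        # Storage area at least equal to the combined memory blocks.
        self.area = [None] * total_blocks

    def transfer(self, memory_blocks):
        # Called once per predetermined cycle: copy every memory block.
        if len(memory_blocks) > len(self.area):
            raise ValueError("transfer memory smaller than memory blocks")
        for i, signal in enumerate(memory_blocks):
            self.area[i] = signal
        return list(self.area)
```

Sizing the transfer memory to the full set of memory blocks lets one transfer drain every block at once, regardless of which groups ran at the high-speed frame rate.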
56. An imaging apparatus, comprising the image sensor according to any one of claims 46 to 55.
57. The image sensor according to claim 1, further comprising
a plurality of A/D converters provided for each of the plurality of groups, each converting the signals from the pixels into pixel data.
58. The image sensor according to claim 17, further comprising
a plurality of A/D converters provided for each of the plurality of groups, each converting the signals from the pixels into pixel data.
59. The image sensor according to claim 20, further comprising:
a first A/D converter provided in correspondence with the first region, converting each first pixel signal into first pixel data; and
a second A/D converter provided in correspondence with the second region, converting each second pixel signal into second pixel data.
60. The image sensor according to claim 29, further comprising
a plurality of A/D converters provided for each of the plurality of groups, each converting the signals from the pixels into pixel data.
61. The image sensor according to claim 46, further comprising
a plurality of A/D converters provided for each of the plurality of groups, each converting the signals from the pixels into pixel data.
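Claims 57 to 61 add one A/D converter per pixel group, so the groups can be digitized in parallel. A hedged sketch of what each per-group converter does; `adc_convert`, the normalized 0.0-1.0 input range, and the 12-bit depth are assumptions for illustration, not taken from the claims:

```python
def adc_convert(analog_signals, n_bits=12):
    """Quantize normalized analog pixel signals (0.0-1.0) into n-bit
    pixel data, as one per-group A/D converter would (illustrative)."""
    full_scale = (1 << n_bits) - 1
    # Clamp out-of-range inputs, then round to the nearest code.
    return [max(0, min(full_scale, round(v * full_scale)))
            for v in analog_signals]
```

With one such converter per group, conversion time no longer scales with the number of groups, which suits the per-group frame rates of the earlier claims.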
CN201380022701.9A 2012-05-02 2013-05-02 Imaging device Pending CN104272721A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910825470.4A CN110572586A (en) 2012-05-02 2013-05-02 Imaging element and electronic device

Applications Claiming Priority (11)

Application Number Priority Date Filing Date Title
JP2012105316 2012-05-02
JP2012-105316 2012-05-02
JP2012-139026 2012-06-20
JP2012139026 2012-06-20
JP2012-142126 2012-06-25
JP2012142126 2012-06-25
JP2012149844 2012-07-03
JP2012-149844 2012-07-03
JP2012149946 2012-07-03
JP2012-149946 2012-07-03
PCT/JP2013/002927 WO2013164915A1 (en) 2012-05-02 2013-05-02 Imaging device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN201910825470.4A Division CN110572586A (en) 2012-05-02 2013-05-02 Imaging element and electronic device

Publications (1)

Publication Number Publication Date
CN104272721A 2015-01-07

Family

ID=49514319

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201380022701.9A Pending CN104272721A (en) 2012-05-02 2013-05-02 Imaging device
CN201910825470.4A Pending CN110572586A (en) 2012-05-02 2013-05-02 Imaging element and electronic device

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201910825470.4A Pending CN110572586A (en) 2012-05-02 2013-05-02 Imaging element and electronic device

Country Status (7)

Country Link
US (3) US20150077590A1 (en)
EP (1) EP2846538B1 (en)
JP (6) JPWO2013164915A1 (en)
CN (2) CN104272721A (en)
BR (1) BR112014027066A2 (en)
RU (2) RU2018109081A (en)
WO (1) WO2013164915A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106973196A (en) * 2015-09-25 2017-07-21 佳能株式会社 Imaging sensor, image capture method and picture pick-up device
CN107925730A (en) * 2015-07-24 2018-04-17 索尼半导体解决方案公司 Imaging sensor and electronic equipment
CN110225269A (en) * 2015-08-26 2019-09-10 意法半导体国际有限公司 Image sensor apparatus and relevant apparatus and method with macro processes pixel
US10535687B2 (en) 2015-01-22 2020-01-14 Sony Semiconductor Solutions Corporation Solid-state imaging device and electronic apparatus
CN111194547A (en) * 2017-09-29 2020-05-22 株式会社尼康 Video compression device, electronic apparatus, and video compression program
CN111902761A (en) * 2018-04-09 2020-11-06 浜松光子学株式会社 Sample observation device and sample observation method
US11962920B2 (en) 2019-08-20 2024-04-16 Sony Semiconductor Solutions Corporation Imaging device, method of driving imaging device, and electronic equipment

Families Citing this family (66)

Publication number Priority date Publication date Assignee Title
US10924697B2 (en) 2013-02-27 2021-02-16 Nikon Corporation Image sensor and electronic device having imaging regions for focus detection extending in first and second different directions
US11290652B2 (en) 2013-05-31 2022-03-29 Nikon Corporation Electronic apparatus and control program
CN109256404B (en) 2013-07-04 2023-08-15 株式会社尼康 Image pickup device and electronic apparatus
US10321083B2 (en) 2013-08-12 2019-06-11 Nikon Corporation Electronic apparatus, method for controlling electronic apparatus, and control program
JP6375607B2 (en) 2013-09-30 2018-08-22 株式会社ニコン Electronic device, electronic device control method, and control program
CN110365899B (en) * 2013-09-30 2022-03-22 株式会社尼康 Electronic device
EP3076662A4 (en) 2013-11-26 2017-08-09 Nikon Corporation Electronic device, imaging device, and imaging element
JP6916418B2 (en) * 2013-11-26 2021-08-11 株式会社ニコン Imaging device
US9386220B2 (en) * 2013-12-06 2016-07-05 Raytheon Company Electro-optical (EO)/infrared (IR) staring focal planes with high rate region of interest processing and event driven forensic look-back capability
JP2015111761A (en) * 2013-12-06 2015-06-18 株式会社ニコン Electronic apparatus
JP6405646B2 (en) * 2014-02-26 2018-10-17 株式会社ニコン Imaging device
JP6303689B2 (en) * 2014-03-25 2018-04-04 株式会社ニコン Electronics
JP6409301B2 (en) * 2014-03-31 2018-10-24 株式会社ニコン Electronics
JP6258480B2 (en) * 2014-05-21 2018-01-10 株式会社日立製作所 Image processing apparatus and positioning system
CN106537892B (en) 2014-05-29 2021-01-05 株式会社尼康 Image pickup device and vehicle
JP6583268B2 (en) * 2014-06-10 2019-10-02 ソニー株式会社 Imaging control apparatus, imaging apparatus, imaging system, and imaging control method
US10104388B2 (en) * 2014-06-30 2018-10-16 Sony Corporation Video processing system with high dynamic range sensor mechanism and method of operation thereof
JP6520035B2 (en) 2014-09-30 2019-05-29 株式会社ニコン Electronics
JP2016072863A (en) 2014-09-30 2016-05-09 株式会社ニコン Electronic apparatus, reproduction apparatus, recording medium, recording program, reproduction program, recording method, and reproduction method
US10284771B2 (en) 2014-12-03 2019-05-07 Nikon Corporation Image-capturing apparatus, electronic device, and program
JP6617428B2 (en) * 2015-03-30 2019-12-11 株式会社ニコン Electronics
JP6700673B2 (en) * 2015-05-15 2020-05-27 キヤノン株式会社 Imaging device, imaging system
JP7029961B2 (en) * 2015-05-19 2022-03-04 マジック リープ, インコーポレイテッド Semi-global shutter imager
GB2541713A (en) * 2015-08-27 2017-03-01 Rowan Graham Processing of high frame rate video data
JP6369433B2 (en) * 2015-09-18 2018-08-08 株式会社ニコン Imaging device
JP6451575B2 (en) * 2015-09-18 2019-01-16 株式会社ニコン Imaging device
JP6369432B2 (en) * 2015-09-18 2018-08-08 株式会社ニコン Imaging device
JP6451576B2 (en) * 2015-09-18 2019-01-16 株式会社ニコン Imaging device
CN113099138A (en) 2015-09-30 2021-07-09 株式会社尼康 Imaging element
WO2017057279A1 (en) 2015-09-30 2017-04-06 株式会社ニコン Imaging device, image processing device and display device
JP6643871B2 (en) * 2015-11-13 2020-02-12 キヤノン株式会社 Radiation imaging apparatus and photon counting method
WO2017086442A1 (en) 2015-11-18 2017-05-26 株式会社ニコン Imaging capture element, measurement device, and measurement method
JP6764880B2 (en) 2015-12-11 2020-10-07 株式会社ニコン Polarization characteristic image measuring device, polarization characteristic image measurement method
WO2017104765A1 (en) 2015-12-16 2017-06-22 株式会社ニコン Image-capturing device and method for detecting motion
JP6674255B2 (en) * 2015-12-28 2020-04-01 キヤノン株式会社 Solid-state imaging device and imaging device
WO2017149962A1 (en) * 2016-03-02 2017-09-08 ソニー株式会社 Imaging control device, imaging control method, and program
EP3435152B1 (en) * 2016-03-23 2020-12-02 Hitachi Automotive Systems, Ltd. Vehicle-mounted image processing device
CN113225497A (en) * 2016-03-24 2021-08-06 株式会社尼康 Image pickup element and image pickup apparatus
US10720465B2 (en) * 2016-03-31 2020-07-21 Nikon Corporation Image sensor and image capture device
JP6794649B2 (en) * 2016-04-06 2020-12-02 株式会社ニコン Electronics and recording programs
JP6668975B2 (en) * 2016-07-01 2020-03-18 株式会社ニコン Electronics and vehicles
JP7057635B2 (en) * 2017-08-15 2022-04-20 キヤノン株式会社 Imaging equipment, cameras and transportation equipment
WO2019065917A1 (en) * 2017-09-29 2019-04-04 株式会社ニコン Moving-image compression device, electronic apparatus, and moving-image compression program
WO2019065919A1 (en) * 2017-09-29 2019-04-04 株式会社ニコン Imaging device, image-processing device, moving-image compression device, setting program, image-processing program, and moving-image compression program
WO2019065918A1 (en) * 2017-09-29 2019-04-04 株式会社ニコン Image-processing device, moving-image compression device, image-processing program, and moving-image compression program
KR102430496B1 (en) 2017-09-29 2022-08-08 삼성전자주식회사 Image sensing apparatus and manufacturing method thereof
JP7167928B2 2022-11-09 Movie compression device, electronic apparatus, and movie compression program
KR102499033B1 (en) * 2018-01-31 2023-02-13 삼성전자주식회사 Image sensor and electronic device including the image sensor
CN110413805B (en) * 2018-04-25 2022-02-01 杭州海康威视数字技术股份有限公司 Image storage method and device, electronic equipment and storage medium
JP6642641B2 (en) * 2018-07-11 2020-02-05 株式会社ニコン Imaging device
JP6635221B1 (en) * 2018-08-31 2020-01-22 ソニー株式会社 Imaging device, imaging system, imaging method, and imaging program
JP2019004520A (en) * 2018-09-27 2019-01-10 株式会社ニコン Electronic apparatus and electronic apparatus system
JP6683280B1 (en) 2018-10-19 2020-04-15 ソニー株式会社 Sensor device and signal processing method
CN109660698B (en) * 2018-12-25 2021-08-03 苏州佳世达电通有限公司 Image processing system and image processing method
RU2707714C1 (en) * 2019-01-28 2019-11-28 федеральное государственное бюджетное образовательное учреждение высшего образования "Южно-Российский государственный политехнический университет (НПИ) имени М.И. Платова" Device for automatic acquisition and processing of images
US11037968B2 (en) * 2019-04-05 2021-06-15 Waymo Llc Image sensor architecture
JPWO2020246181A1 (en) * 2019-06-07 2020-12-10
JP2021005846A (en) * 2019-06-27 2021-01-14 オリンパス株式会社 Stacked imaging device, imaging device, imaging method, learning method, and image readout circuit
JP7366635B2 (en) * 2019-08-07 2023-10-23 キヤノン株式会社 Imaging device, computer program and storage medium
US20210136274A1 (en) * 2019-11-01 2021-05-06 Semiconductor Components Industries, Llc Systems and methods for performing high dynamic range imaging with partial transfer gate pulsing and digital accumulation
JP7362441B2 (en) 2019-11-18 2023-10-17 キヤノン株式会社 Imaging device and method of controlling the imaging device
TWI741429B (en) * 2019-12-04 2021-10-01 晶睿通訊股份有限公司 Image analyzing method of increasing analysis accuracy and related image monitoring apparatus
JP7028279B2 (en) * 2020-06-08 2022-03-02 株式会社ニコン Electronics
JP7248163B2 (en) * 2020-06-08 2023-03-29 株式会社ニコン Imaging device
JP2021040325A (en) * 2020-11-10 2021-03-11 株式会社ニコン Electronic apparatus and program
JP7131595B2 (en) * 2020-11-12 2022-09-06 株式会社ニコン playback equipment and electronic equipment

Citations (2)

Publication number Priority date Publication date Assignee Title
US20100231738A1 (en) * 2009-03-11 2010-09-16 Border John N Capture of video with motion
WO2012016374A1 (en) * 2010-08-03 2012-02-09 Empire Technology Development Llc Method for identifying objects in video

Family Cites Families (33)

Publication number Priority date Publication date Assignee Title
JPH06113195A (en) * 1992-09-29 1994-04-22 Canon Inc Video camera device
JP3882828B2 (en) * 1993-08-30 2007-02-21 ソニー株式会社 Electronic zoom device and electronic zoom method
US5406334A (en) 1993-08-30 1995-04-11 Sony Corporation Apparatus and method for producing a zoomed image signal
JP3448091B2 (en) * 1994-02-18 2003-09-16 富士通株式会社 Encoding / decoding device and encoding / decoding method
JP2002290908A (en) 2001-03-28 2002-10-04 Minolta Co Ltd Photographing device, method for controlling recording of animation and still picture and picture editing device
US20030049925A1 (en) 2001-09-10 2003-03-13 Layman Paul Arthur High-density inter-die interconnect structure
JP2004214985A (en) * 2002-12-27 2004-07-29 Canon Inc Image processor and image reproducing device
JP4311181B2 (en) * 2003-12-05 2009-08-12 ソニー株式会社 Semiconductor device control method, signal processing method, semiconductor device, and electronic apparatus
JP4371797B2 (en) * 2003-12-12 2009-11-25 コニカミノルタホールディングス株式会社 Solid-state imaging device
JP4380439B2 (en) * 2004-07-16 2009-12-09 ソニー株式会社 Data processing method, data processing apparatus, semiconductor device for detecting physical quantity distribution, and electronic apparatus
JP4349232B2 (en) 2004-07-30 2009-10-21 ソニー株式会社 Semiconductor module and MOS solid-state imaging device
JP2006109103A (en) * 2004-10-05 2006-04-20 Victor Co Of Japan Ltd Focusing processing circuit for imaging apparatus
JP4277216B2 (en) 2005-01-13 2009-06-10 ソニー株式会社 Imaging apparatus and imaging result processing method
US20060219862A1 (en) * 2005-03-31 2006-10-05 Kai-Kuang Ho Compact camera module with reduced thickness
JP2006324834A (en) * 2005-05-18 2006-11-30 Hitachi Ltd Device and method for imaging
TW201101476A (en) 2005-06-02 2011-01-01 Sony Corp Semiconductor image sensor module and method of manufacturing the same
JP4687404B2 (en) 2005-11-10 2011-05-25 ソニー株式会社 Image signal processing apparatus, imaging apparatus, and image signal processing method
JP4816336B2 (en) 2006-02-07 2011-11-16 日本ビクター株式会社 Imaging method and imaging apparatus
JP2007228019A (en) * 2006-02-21 2007-09-06 Olympus Corp Imaging apparatus
JP3996618B1 (en) * 2006-05-11 2007-10-24 総吉 廣津 Semiconductor image sensor
JP2009302946A (en) * 2008-06-13 2009-12-24 Fujifilm Corp Solid-state image pickup element, driving method for solid-state image pickup element and image pickup device
JP5266916B2 (en) 2008-07-09 2013-08-21 ソニー株式会社 Image sensor, camera, image sensor control method, and program
JP5132497B2 (en) * 2008-09-16 2013-01-30 キヤノン株式会社 IMAGING DEVICE AND IMAGING DEVICE CONTROL METHOD
JP5251412B2 (en) * 2008-10-09 2013-07-31 ソニー株式会社 Solid-state imaging device, driving method thereof, and camera system
JP5178458B2 (en) 2008-10-31 2013-04-10 キヤノン株式会社 Solid-state imaging device, imaging system, and driving method of solid-state imaging device
JP5215262B2 (en) * 2009-02-03 2013-06-19 オリンパスイメージング株式会社 Imaging device
JP2010183357A (en) 2009-02-05 2010-08-19 Panasonic Corp Solid state imaging element, camera system, and method of driving solid state imaging element
JP5589284B2 (en) * 2009-02-19 2014-09-17 株式会社ニコン Imaging apparatus and correction amount calculation program
JP4835710B2 (en) 2009-03-17 2011-12-14 ソニー株式会社 Solid-state imaging device, method for manufacturing solid-state imaging device, driving method for solid-state imaging device, and electronic apparatus
JP5423187B2 (en) * 2009-07-08 2014-02-19 カシオ計算機株式会社 Imaging apparatus, image selection method, and program
JP5500007B2 (en) 2010-09-03 2014-05-21 ソニー株式会社 Solid-state imaging device and camera system
JP2013121027A (en) * 2011-12-07 2013-06-17 Sony Corp Solid-state imaging element, method for driving the same, and camera system
EP3490247B1 (en) * 2012-03-30 2022-09-07 Nikon Corporation Imaging unit and imaging apparatus


Cited By (18)

Publication number Priority date Publication date Assignee Title
US10535687B2 (en) 2015-01-22 2020-01-14 Sony Semiconductor Solutions Corporation Solid-state imaging device and electronic apparatus
CN107925730B (en) * 2015-07-24 2021-07-20 索尼半导体解决方案公司 Image sensor and electronic device
CN113660430B (en) * 2015-07-24 2023-06-20 索尼半导体解决方案公司 Light detection device
US10645313B2 (en) 2015-07-24 2020-05-05 Sony Semiconductor Solutions Corporation Image sensor and electronic apparatus
CN113660430A (en) * 2015-07-24 2021-11-16 索尼半导体解决方案公司 Light detection equipment
CN107925730A (en) * 2015-07-24 2018-04-17 索尼半导体解决方案公司 Imaging sensor and electronic equipment
CN110225269A (en) * 2015-08-26 2019-09-10 意法半导体国际有限公司 Image sensor apparatus and relevant apparatus and method with macro processes pixel
CN110225269B (en) * 2015-08-26 2022-03-25 意法半导体国际有限公司 Image sensor device with macropixel processing and related devices and methods
US10735678B2 (en) 2015-09-25 2020-08-04 Canon Kabushiki Kaisha Image sensor, imaging method, and imaging apparatus
CN106973196A (en) * 2015-09-25 2017-07-21 佳能株式会社 Imaging sensor, image capture method and picture pick-up device
CN111194547A (en) * 2017-09-29 2020-05-22 株式会社尼康 Video compression device, electronic apparatus, and video compression program
CN111194547B (en) * 2017-09-29 2022-04-29 株式会社尼康 Video compression device, electronic apparatus, and storage medium
US11589059B2 (en) 2017-09-29 2023-02-21 Nikon Corporation Video compression apparatus, electronic apparatus, and video compression program
US11178406B2 (en) 2017-09-29 2021-11-16 Nikon Corporation Video compression apparatus, electronic apparatus, and video compression program
CN111902761A (en) * 2018-04-09 2020-11-06 浜松光子学株式会社 Sample observation device and sample observation method
CN111902761B (en) * 2018-04-09 2022-06-03 浜松光子学株式会社 Sample observation device and sample observation method
US11709350B2 (en) 2018-04-09 2023-07-25 Hamamatsu Photonics K.K. Sample observation device and sample observation method
US11962920B2 (en) 2019-08-20 2024-04-16 Sony Semiconductor Solutions Corporation Imaging device, method of driving imaging device, and electronic equipment

Also Published As

Publication number Publication date
RU2018109081A (en) 2019-02-26
JPWO2013164915A1 (en) 2015-12-24
JP7363983B2 (en) 2023-10-18
US20180295309A1 (en) 2018-10-11
US20150077590A1 (en) 2015-03-19
JP2023168553A (en) 2023-11-24
JP2022132541A (en) 2022-09-08
EP2846538A1 (en) 2015-03-11
EP2846538A4 (en) 2016-01-13
JP2018186577A (en) 2018-11-22
JP2018186576A (en) 2018-11-22
US11825225B2 (en) 2023-11-21
US20240031701A1 (en) 2024-01-25
RU2014148323A (en) 2016-06-27
JP2020205644A (en) 2020-12-24
CN110572586A (en) 2019-12-13
EP2846538B1 (en) 2021-06-23
BR112014027066A2 (en) 2017-06-27
RU2649967C2 (en) 2018-04-06
JP7111141B2 (en) 2022-08-02
WO2013164915A1 (en) 2013-11-07

Similar Documents

Publication Publication Date Title
CN104272721A (en) Imaging device
US11082646B2 (en) Imaging unit, imaging apparatus, and computer readable medium storing thereon an imaging control program
CN105324985A (en) Electronic apparatus, electronic apparatus control method, and control program
CN105103537A (en) Imaging element, and electronic device
CN104937924A (en) Electronic device and control program
CN105684421A (en) Electronic apparatus
CN111194547B (en) Video compression device, electronic apparatus, and storage medium
US20240089402A1 (en) Electronic apparatus, reproduction device, reproduction method, recording medium, and recording method
JP6613554B2 (en) Image processing apparatus and program
WO2019065919A1 (en) Imaging device, image-processing device, moving-image compression device, setting program, image-processing program, and moving-image compression program
US10686987B2 (en) Electronic apparatus with image capturing unit having first and second imaging regions that capture an image of a subject under differing imaging conditions
US20170324911A1 (en) Electronic apparatus, reproduction device, reproduction method, recording medium, and recording method
JP7131595B2 (en) playback equipment and electronic equipment
WO2019065918A1 (en) Image-processing device, moving-image compression device, image-processing program, and moving-image compression program

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20150107
