US20210021779A1 - Image sensing device, camera, and transportation equipment - Google Patents
- Publication number: US20210021779A1
- Application number: US17/061,710
- Authority
- US
- United States
- Prior art keywords
- image sensing
- pixel
- pixels
- type pixels
- sensing operation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- H04N5/378
- H04N25/75—Circuitry for providing, modifying or processing image signals from the pixel array
- H04N25/445—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by partially reading an SSIS array by skipping some contiguous pixels within the read portion of the array
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- H04N25/44—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by partially reading an SSIS array
- H04N25/531—Control of the integration time by controlling rolling shutters in CMOS SSIS
- H04N25/533—Control of the integration time by using differing integration times for different sensor regions
- H04N25/702—SSIS architectures characterised by non-identical, non-equidistant or non-planar pixel layout
- H04N25/78—Readout circuits for addressed sensors, e.g. output amplifiers or A/D converters
- H04N5/345
- H04N5/3456
- H04N5/3532
- H04N5/3535
- H04N5/3696
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle, characterised by the type of camera system used
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T7/20—Analysis of motion
- G06T7/50—Depth or shape recovery
- H04N25/53—Control of the integration time
Definitions
- the present invention relates to an image sensing device, a camera, and transportation equipment.
- Japanese Patent Laid-Open No. 2002-320235 discloses a CMOS image sensor that has, in addition to a mode for reading out signals from all of the pixels arranged in a pixel array, a mode for thinned-out reading of pixel signals when reduced image signals are to be output.
- Japanese Patent Laid-Open No. 2005-86245 discloses a solid-state image sensing device that reduces the number of pixels which are read out for each frame to improve the frame rate and alternately reads out, for each frame, an image sensing signal such as that for a moving image and an image-sensing target recognition signal such as that for autofocus.
- in Japanese Patent Laid-Open Nos. 2002-320235 and 2005-86245, signal readout is performed row by row even when only signals from some of the pixels arranged in the pixel array are to be read out, so the readout operation time can be long if the pixels whose signals are to be read out are arranged over a plurality of rows.
- the present invention provides a technique advantageous in reducing the readout time when signals are to be read out from some of the pixels which are arranged in a pixel array.
- an image sensing device that comprises a pixel array in which a plurality of pixels are arranged in a matrix and a plurality of readout circuits configured to read out signals from the pixel array, the plurality of pixels comprising a first pixel which belongs to a first pixel row of the pixel array and a first pixel column of the pixel array, a second pixel which belongs to a second pixel row of the pixel array and the first pixel column of the pixel array, and a third pixel which belongs to the second pixel row of the pixel array and a second pixel column of the pixel array, and the plurality of readout circuits comprising a first readout circuit connected to the first pixel and the second pixel and a second readout circuit connected to the third pixel, wherein the image sensing device performs a first image sensing operation and performs a second image sensing operation after the first image sensing operation, wherein in the first image sensing operation, signal readout from the first pixel by the first readout circuit
- FIG. 1 is a view showing an example of the arrangement of an image sensing device according to an embodiment of the present invention
- FIGS. 2A to 2C are views each showing an example of the arrangement of a pixel array of the image sensing device of FIG. 1 ;
- FIG. 3 is a timing chart of a full pixel readout operation of the image sensing device of FIG. 1 ;
- FIG. 4 is a timing chart of a thinned-out reading operation of the image sensing device of FIG. 1 ;
- FIGS. 5A and 5B are a view showing an example of the arrangement of the pixel array and a timing chart of the thinned-out reading operation, respectively, of the image sensing device of FIG. 1 ;
- FIG. 6 is a timing chart of an operation of the image sensing device of FIG. 1 ;
- FIG. 7 is a view showing an example of the arrangement of a pixel array of the image sensing device of FIG. 1 ;
- FIG. 8 is a timing chart of an operation of the image sensing device which includes the pixel array of FIG. 7 ;
- FIGS. 9A to 9D are views showing examples of the arrangement of a camera incorporating the image sensing device.
- FIGS. 10A and 10B are views showing examples of transportation equipment equipped with the image sensing device of FIG. 1.
- FIG. 1 is a view showing an arrangement of an image sensing device 100 according to an embodiment of the present invention.
- the image sensing device 100 includes a pixel array 101 , a vertical scanning circuit 102 , readout circuits 103 , a horizontal scanning circuit 104 , a controller 105 , and a control parameter line 106 .
- a lateral direction is called a row direction (horizontal direction)
- a longitudinal direction is called a column direction.
- pixels are arranged in 16 columns in the pixel array 101, with the 0th column at the rightmost end and the 15th column at the leftmost end.
- the vertical scanning circuit 102 selects pixels arranged in the row direction.
- the readout circuit 103 is arranged for each column and reads out a signal, via the column signal line of each column, from each pixel in a row that has been selected by the vertical scanning circuit 102.
- the controller 105 processes the signals from the readout circuits 103 which are scanned by the horizontal scanning circuit 104 and feeds back the generated control parameters to the vertical scanning circuit 102 and the readout circuits 103 by using the control parameter line 106 .
- the controller 105 may control the components of the image sensing device 100 , such as the vertical scanning circuit 102 , the readout circuits 103 , and the horizontal scanning circuit 104 .
- a pixel color can be associated with each pixel of the pixel array 101 by using a color filter array.
- a color filter array can, for example, employ a Bayer array in which green pixels are assigned to a diagonal pixel pair of 2 ⁇ 2 pixels and a red pixel and a blue pixel are assigned to the remaining two pixels.
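As a short illustrative sketch (not part of the patent), the Bayer assignment described above, with green on one diagonal of each 2×2 cell and red and blue on the remaining two pixels, can be expressed as a function of pixel coordinates; the RGGB phase chosen here is an assumption:

```python
# Sketch: Bayer color filter assignment by pixel coordinate.
# The RGGB phase (red at even row/even column) is assumed for illustration.
def bayer_color(row, col):
    """Return the color filter letter at (row, col) for an RGGB Bayer array."""
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

# Each 2x2 cell carries two greens on a diagonal plus one red and one blue.
cell = [bayer_color(r, c) for r in (0, 1) for c in (0, 1)]
print(cell)  # ['R', 'G', 'G', 'B']
```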
- pixels arranged in the pixel array 101 include a plurality of first-type pixels 110 which are used in the thinned-out reading operation (to be described later) and a plurality of second-type pixels 120 which are not used in the thinned-out reading operation but are used for image generation.
- the first-type pixels 110 can be referred to as thinned-out reading pixels and the second-type pixels 120 can be referred to as non-thinned-out reading pixels or normal readout pixels. Note that when reading out the second-type pixels 120 , the first-type pixels 110 can also be read out without executing a thinning operation in the same manner as the second-type pixels 120 .
- although the first-type pixels 110 and the second-type pixels 120 are distinguished from each other by their different readout methods, they may have the same pixel structure.
- a row in which the first-type pixel 110 and the second-type pixel 120 are arranged in the row direction will be called a first-type pixel row.
- the pixel array 101 includes a plurality of first-type pixel rows each including at least one first-type pixel of the plurality of first-type pixels 110 and one of the plurality of second-type pixels 120 .
- the pixel array 101 also includes a plurality of second-type pixel rows in which only the second-type pixels 120 , other than the first-type pixels, are arranged in the row direction.
- in each first-type pixel row, at least one second-type pixel 120 is arranged between adjacent first-type pixels 110. Also, at least one pixel row other than the first-type pixel rows, more specifically, a pixel row formed by only the second-type pixels 120, is arranged between adjacent first-type pixel rows.
- the first-type pixel 110 is arranged for every 4 pixels (4 rows) in the column direction and for every 4 pixels (4 columns) in the row direction. Furthermore, in each first-type pixel row, there are a plurality of types of positions where the first-type pixels 110 are to be arranged in the row direction.
- consider, in the pixel array 101, the first-type pixel row at the 1st row (to be referred to as the 1st row hereinafter) and the adjacent first-type pixel row at the 5th row (to be referred to as the 2nd row hereinafter).
- the columns where the first-type pixels 110 , of the plurality of first-type pixels 110 , which are arranged in the 1st row are positioned are different from the columns where the first-type pixels 110 , of the plurality of first-type pixels 110 , which are arranged in the 2nd row are positioned.
- the first-type pixels 110 may be arranged in different columns in adjacent first-type pixel rows.
- the first-type pixels 110 are arranged so as to be positioned at different columns from each other in the pixel array 101 in which pixels are arranged in 16 rows ⁇ 16 columns.
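The placement rule above can be sketched in a few lines of Python; the specific per-row column offsets below are assumptions chosen to match the described regularity (first-type pixels every 4 rows and 4 columns, with a different offset per first-type pixel row so all 16 first-type pixels of the 16×16 array occupy distinct columns):

```python
# Sketch: first-type pixel placement in a 16x16 array, every N = 4 rows
# and every M = 4 columns. The per-row column offsets are assumed for
# illustration (e.g. row 1 offset 2, row 5 offset 1, as in FIG. 2A).
N, M, SIZE = 4, 4, 16
row_offsets = {1: 2, 5: 1, 9: 0, 13: 3}  # assumed offsets per first-type row

positions = [(r, c)
             for r, off in row_offsets.items()
             for c in range(off, SIZE, M)]

# Because the offsets differ modulo M, no two first-type pixels share a
# column, so one readout circuit per column can serve all of them at once.
cols = [c for _, c in positions]
assert len(cols) == len(set(cols))
print(len(positions))  # 16
```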
- FIG. 2A shows the connection relation of the plurality of readout circuits 103 arranged for the respective columns in correspondence with the first-type pixels 110 , second-type pixels 120 , the vertical scanning circuit 102 , and the pixel array 101 .
- FIG. 2A shows the 1st, 2nd, and 5th rows and 0th to 3rd columns of the pixel array 101 .
- the first-type pixels 110 and the second-type pixels 120 each include a photoelectric conversion element PD, a floating diffusion region FD, and transistors M 1 to M 4 .
- the transistor M 1 is a transfer transistor that transfers, to the floating diffusion region FD, charges converted from light and accumulated by the photoelectric conversion element PD.
- the transistor M 2 is a reset transistor for resetting the photoelectric conversion element PD and the floating diffusion region FD.
- the transistor M 3 is a source-follower transistor that converts the charges transferred to the floating diffusion region FD into a voltage signal and outputs the converted signal.
- the transistor M 4 is a selection transistor for outputting a signal generated from light incident on each pixel to a corresponding column signal line 107 arranged along the column direction.
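As a rough behavioral sketch (not the patent's circuit; charge handling and the source-follower gain are idealized), the roles of transistors M1 to M4 described above can be modeled as state transitions on the photoelectric conversion element PD and the floating diffusion region FD:

```python
# Behavioral sketch of a four-transistor pixel (idealized, for illustration).
class FourTPixel:
    def __init__(self):
        self.pd = 0.0        # charge accumulated in photoelectric conversion element PD
        self.fd = 0.0        # charge held in floating diffusion region FD
        self.selected = False

    def expose(self, charge):   # light accumulates charge in PD
        self.pd += charge

    def reset(self):            # M2: reset PD and FD
        self.pd = 0.0
        self.fd = 0.0

    def transfer(self):         # M1: transfer accumulated charge from PD to FD
        self.fd, self.pd = self.pd, 0.0

    def select(self, on):       # M4: connect/disconnect the column signal line
        self.selected = on

    def read(self, gain=1.0):   # M3: source follower converts FD charge to a voltage
        return self.fd * gain if self.selected else None

px = FourTPixel()
px.reset()
px.expose(100.0)
px.select(True)
px.transfer()
print(px.read())  # 100.0
```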
- a signal line group 130 for controlling the first-type pixels 110 and a signal line group 140 for controlling the second-type pixels 120 are arranged in the first-type pixel rows (the 1st row and the 5th row) from the vertical scanning circuit 102 .
- a signal line group 141 for controlling the second-type pixels 120 is arranged in each pixel row (the 2nd row) in which only the second-type pixels 120 are arranged.
- Each of the signal line groups 130 , 140 , and 141 includes a signal line PTX (transfer control signal line) for controlling the transistor M 1 , a signal line PRES (reset control signal line) for controlling the transistor M 2 , and a signal line PSEL (row selection signal line) for controlling the transistor M 4 .
- Each of the signal lines PTX, PRES, and PSEL can extend in the row direction crossing the column direction in which each column signal line extends.
- “1” is added to the reference symbol of each of the signal lines PTX, PRES, and PSEL that is connected to the first-type pixels 110
- “2” is added to the reference symbol of each of the signal lines PTX, PRES, and PSEL that is connected to the second-type pixels 120 .
- the number in brackets following the reference symbol of each of the signal lines PTX, PRES, and PSEL indicates the row number.
- a total of three signal lines are arranged as the signal line group 141 in the 2nd pixel row in which the first-type pixels 110 are not arranged and only the second-type pixels 120 are arranged.
- the present invention is not limited to this arrangement.
- wiring lines may be added so that the row has six signal lines, the same number of lines as each first-type pixel row.
- the total number of signal lines of the signal line group 130 and the signal line group 140 may be the same as the number of signal lines of the signal line group 141 .
- the wiring lines to be added to the signal line group 141 may be, as shown in FIG. 2B , dummy signal lines PTXD, PRESD, and PSELD which are not connected to any of the second-type pixels 120 .
- alternatively, as shown in FIG. 2C, the signal lines to be added to the signal line group 141 may form a second signal line group connected to as many of the plurality of second-type pixels 120 as there are first-type pixels 110 arranged in a first-type pixel row.
- by adding these wiring lines, the output wiring line load from the vertical scanning circuit 102 can be made equal between the first-type pixel rows and the second-type pixel rows. In this case, for example, as shown in FIG. 2C, the second-type pixel 120 at the 2nd column of each of the 0th, 2nd, and 3rd pixel rows may be connected to signal lines PTX2B, PRES2B, and PSEL2B of the signal line group 141. That is, the connection relation between the signal line group 141 and the second-type pixel 120 in the 2nd column of each of the 0th, 2nd, and 3rd pixel rows may be the same as the connection relation between the signal line groups 130 and 140 and the first-type pixels 110 and second-type pixels 120 of the 1st first-type pixel row.
- similarly, the connection relation between the 5th row and the 4th, 6th, and 7th rows, the connection relation between the 9th row and the 8th, 10th, and 11th rows, and the connection relation between the 13th row and the 12th, 14th, and 15th rows may each be the same.
- FIG. 3 is a timing chart of a readout operation performed to read out signals from all of the pixels arranged in the pixel array 101 .
- FIG. 3 shows the timings at which signals are read out from pixels belonging to the 0th row to the 5th row of the pixel array 101 .
- a Hi signal is supplied to a signal line PSEL2[0] and a signal line PRES2[0], and the transistor M2 resets the floating diffusion region FD.
- the transistor M4 is turned on (changes to a conductive state) simultaneously with the resetting of the floating diffusion region FD, so the 0th row changes to the selected state and a reset level is output from the transistor M3 via the transistor M4 to the corresponding column signal line 107.
- the signal line PRES2[0] then changes to a Lo signal, and the reset level of the 0th row is read out by the readout circuit 103 of each column.
- accumulated charges are transferred from each photoelectric conversion element PD to the corresponding floating diffusion region FD when a Hi signal is supplied to a signal line PTX 2 [ 0 ].
- the signal line PTX2[0] then changes to a Lo signal, and the signal level of the 0th row is read out by the readout circuits 103.
- Correlated double sampling processing can be performed on the readout reset level and signal level in each readout circuit 103 or in the controller 105 .
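Correlated double sampling reduces to a subtraction of the two samples taken on the same column line; the sketch below (values illustrative) shows how an offset common to both samples cancels:

```python
# Sketch: correlated double sampling (CDS). The reset level and signal
# level are read on the same column line; subtracting them cancels any
# offset common to both samples.
def cds(reset_level, signal_level):
    return signal_level - reset_level

offset = 12.5  # illustrative fixed offset shared by both samples
print(cds(0.0 + offset, 80.0 + offset))  # 80.0 -- the offset cancels
```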
- the 0th row is set to an unselected state when the transistor M4 is turned off (changes to a non-conductive state) by the signal line PSEL2[0] changing to a Lo signal.
- the time from time t 1 to time t 3 is the readout time of one row.
- the readout operation of all of the pixels in the 1st row is started when a Hi signal is supplied simultaneously to each of signal lines PRES 1 [ 1 ], PRES 2 [ 1 ], PSEL 1 [ 1 ], and PSEL 2 [ 1 ], and the readout operation ends at time t 4 . Subsequently, each row is sequentially scanned in the same manner, and signals are read out from the pixels belonging to the row.
- FIG. 4 is a timing chart of the thinned-out reading operation.
- a Hi signal is supplied to only signal lines PSEL 1 and PRES 1 of the 1st, 5th, 9th, and 13th rows which are the first-type pixel rows, and the first-type pixels 110 of each of the first-type pixel rows are reset. Subsequently, each signal line PRES 1 changes to a Lo signal, and the reset level is read out.
- a Hi signal is then supplied to only a signal line PTX1 of each first-type pixel row, and the signal level of each first-type pixel row is read out when the signal line PTX1 changes to a Lo signal.
- the signal line PSEL 1 of each first-type pixel row changes to a Lo signal and the readout of each first-type pixel row ends.
- the first-type pixels 110 of the 1st, 5th, 9th, and 13th rows are arranged in different columns from each other.
- signals from the first-type pixels 110 arranged in the manner described in FIG. 4 can be simultaneously read out by the readout circuits 103 arranged in corresponding columns within the readout time of one row from time t 11 to time t 13 .
- the speed of the thinned-out reading operation can be increased by simultaneously reading out the signals from the first-type pixels 110 arranged in different columns.
- the controller 105 connects the first-type pixels 110, which are arranged in different columns, to the corresponding different column signal lines 107 among the plurality of column signal lines 107.
- signals from first-type pixels 110 arranged in at least two first-type pixel rows among the plurality of first-type pixel rows can be read out simultaneously by the readout circuits 103 arranged in the corresponding columns. More specifically, in the arrangement shown in FIG. 2A,
- the plurality of pixels include a first-type pixel 110 a which is arranged in the 1st pixel row and the 2nd pixel column of the pixel array and a first-type pixel 110 b which is arranged in the 5th pixel row and the 1st pixel column of the pixel array.
- the plurality of pixels also include a second-type pixel 120 a arranged in the 5th pixel row and the 2nd pixel column of the pixel array.
- the plurality of readout circuits 103 include a readout circuit 103 a which is connected to the first-type pixel 110 a and the second-type pixel 120 a and a readout circuit 103 b which is connected to the first-type pixel 110 b .
- This arrangement allows the readout circuit 103 a to read out a signal from the first-type pixel 110 a and the readout circuit 103 b to read out a signal from the first-type pixel 110 b simultaneously.
- during this simultaneous readout, a signal is not read out from the second-type pixel 120a, which is connected to the same column signal line 107a as the first-type pixel 110a; its signal is read out at a separate timing.
- similarly, a signal is not read out from another second-type pixel 120, which is connected to the same column signal line 107b as the first-type pixel 110b; its signal is read out at another timing.
- FIG. 5B is a timing chart of the thinned-out reading operation performed in the pixel array 101 that includes 64 rows ⁇ 64 columns of pixels in which the first-type pixels 110 are arranged at the same regularity as in FIG. 1 as shown in FIG. 5A .
- of the first-type pixels 110 arranged in the first-type pixel rows among the 64 rows, those in the 1st, 5th, 9th, and 13th rows are read out in the readout time of one row.
- the readout of the 17th, 21st, 25th, and 29th rows, the readout of the 33rd, 37th, 41st, and 45th rows, and the readout of the 49th, 53rd, 57th, and 61st rows can then be performed in turn, so that signals from all of the first-type pixels 110 are read out in the readout time of four rows from time t21 to time t22.
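The batching described above can be sketched as grouping the first-type pixel rows of the 64×64 array into sets of L = 4 consecutive first-type pixel rows, each set read in a single row-readout time (a sketch of the described scheme, not an implementation of the device):

```python
# Sketch: thinned-out readout batching for a 64x64 pixel array with
# first-type pixel rows every N = 4 rows, read L = 4 rows at a time.
L, N, ROWS = 4, 4, 64
first_type_rows = list(range(1, ROWS, N))   # rows 1, 5, 9, ..., 61
batches = [first_type_rows[i:i + L]
           for i in range(0, len(first_type_rows), L)]

print(batches[0])    # [1, 5, 9, 13] -- read simultaneously in one row time
print(len(batches))  # 4 row-readout times, versus 64 for full readout
```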
- each of the plurality of first-type pixels 110 is arranged for every M pixels (M columns), and the plurality of first-type pixel rows are arranged for every N pixels (N rows) in the pixel array 101 .
- signals are read out simultaneously from the first-type pixels 110 belonging to continuous L first-type pixel rows of the plurality of first-type pixel rows. This can increase the speed of the thinned-out reading operation.
- L, M, and N each are a positive integer not less than 2 and may be a positive integer not less than 3.
- if M and N are each not less than 3, a sufficient range of pixels can be read out at high speed by executing thinned-out reading.
- L, M, and N may all be different from each other, two of the three may be the same, or all three may be the same.
- in this embodiment, L, M, and N are all equal to 4.
- the relation between L, M, and N may satisfy at least one of the following: at least one of L, M, and N is not less than 3, or at least two of L, M, and N are equal to each other.
- L may be not less than half of M and not more than double M (M/2 ≤ L ≤ 2×M).
- L may be not less than half of N and not more than double N (N/2 ≤ L ≤ 2×N).
- This embodiment has described how, in a case in which signals are to be read out from only some of the pixels arranged in the pixel array 101 , the speed of the thinned-out reading operation can be increased by arranging the first-type pixels 110 at suitable positions.
- this embodiment will describe a processing operation that determines a suitable image sensing condition, that is, a control parameter for the signals of the second-type pixels 120 in the next readout operation, based on information from an image sensing operation by the first-type pixels 110 whose reading operation speed has been increased, and thereby tracks a high-speed moving object.
- FIG. 6 is a timing chart of a case in which an image sensing operation by using the second-type pixels 120 is performed by using a control parameter based on signals obtained from performing an image sensing operation by using the first-type pixels 110 which perform the thinned-out reading operation.
- the arrangement of the image sensing device 100 is the same as those shown in FIGS. 1 and 2A .
- a shutter operation 1000 (broken line) is a shutter operation of performing an image sensing operation by the first-type pixels 110 .
- the longitudinal direction of the broken line indicating the shutter operation 1000 represents the column direction (or the pixel row position at which the shutter operation is to be performed).
- the shutter operation 1000 indicates that the vertical scanning circuit 102 performs scanning from the upper end to the lower end or from the lower end to the upper end of the pixel array 101 , and that an exposure operation of the first-type pixels 110 of each first-type pixel row is to be started.
- the exposure operation is started after a Hi signal is supplied to the signal lines PTX 1 and PRES 1 of each selected first-type pixel row, the photoelectric conversion element PD of each first-type pixel 110 is reset, and the signal line PTX 1 subsequently changes to a Lo signal.
- a readout operation 1100 (solid line) is an operation of reading out signals from the first-type pixels 110 .
- the signals of the first-type pixels 110 of the 1st row selected by the vertical scanning circuit 102 are read out simultaneously.
- the thinned-out reading operation of reading out signals from the first-type pixels 110 is performed at the same timings as those described above in FIGS. 4 and 5B .
- the period between the shutter operation 1000 and the readout operation 1100 is the maximum width of a period (to be referred to as a first image-sensing period hereinafter) of image-sensing by the first-type pixels 110 , and the interval between the shutter operation 1000 and the readout operation 1100 may be shortened as necessary.
- each control parameter to be used in the subsequent image sensing operation by the second-type pixels 120 is determined by the controller 105 based on the signals of the first-type pixels 110 that have undergone readout by the readout circuits 103 .
- the control parameter includes, for example, an exposure time of accumulating charges in the image sensing operation by the second-type pixels 120 , the gain of each readout circuit 103 , a conversion resolution to be used when performing AD conversion, a region (ROI: Region of Interest) where the signal readout is to be performed in the pixel array 101 , or the like.
- the control parameter may also be used for the shutter speed setting, the ISO sensitivity setting, the f-number setting, the focusing of the lens, the signal processing level (for example, the intensity of the noise removal), and the like which are to be made in the camera for the image sensing operation by the second-type pixels 120 .
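One of the control parameters listed above, the exposure time, lends itself to a small sketch. The scaling policy below is an illustrative assumption, not the patent's algorithm: the mean signal of the first-type pixels read in the first image-sensing operation is compared against a target level, and the exposure for the second operation is scaled inversely, within limits:

```python
# Sketch (assumed policy): derive the second operation's exposure time
# from first-type pixel signals. TARGET, BASE_EXPOSURE, and the clamp
# limits are illustrative assumptions.
TARGET = 128.0        # desired mean signal level
BASE_EXPOSURE = 1.0   # exposure time used by the first operation

def next_exposure(first_type_signals):
    mean = sum(first_type_signals) / len(first_type_signals)
    scale = TARGET / max(mean, 1e-6)           # brighter scene -> shorter exposure
    return max(0.1, min(10.0, BASE_EXPOSURE * scale))

print(next_exposure([64.0, 64.0]))    # dark scene  -> 2.0 (longer)
print(next_exposure([256.0, 256.0]))  # bright scene -> 0.5 (shorter)
```

In the same spirit, the gain, AD conversion resolution, or the ROI could each be derived from the first operation's signals with an analogous mapping.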
- a shutter operation 1200 (broken line) is the shutter operation of the second-type pixels 120 .
- a readout operation 1300 is the readout operation of reading out signals from the second-type pixels 120 .
- the period between the shutter operation 1200 and the readout operation 1300 is the maximum width of a period (to be referred to as a second image-sensing period hereinafter) of image-sensing by the second-type pixels 120 .
- the controller 105 performs control so that an image sensing operation is performed by the first-type pixels 110 in the first image-sensing operation.
- scanning for the shutter operation 1000 is started from the first-type pixel row on the upper end of the pixel array 101 , and the shutter operation 1000 is completed at time t 102 .
- the readout operation 1100 is started.
- the controller 105 causes the readout circuits 103 , arranged in corresponding columns, to read out signals generated by the first-type pixels 110 by the image sensing operation, sequentially from the first-type pixel row on the upper-end side of the pixel array 101 .
- the controller 105 starts the signal processing operation 1110 by using these signals.
- the controller 105 determines the length of the exposure time of charge accumulation in the second image-sensing operation by the second-type pixels 120 in each row.
- although the exposure time is determined for each row in this case, the exposure time may be determined for every plurality of rows. If the image sensing device 100 also includes an exposure control mechanism for each arbitrary number of pixels in the row direction, the length of the exposure time for each arbitrary column may also be determined in addition to the exposure time for each row.
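The derivation of a per-row exposure time from the thinned-out first-type pixel signals can be sketched as follows. This is a minimal illustrative model, not the patent's method: each row takes its brightness estimate from the nearest sampled (first-type) row, and the exposure is scaled toward a target signal level. The function name, units, and scaling rule are assumptions.

```python
def exposure_per_row(preview, n_rows, row_step, base_exposure_us=1000.0, target=0.5):
    """Derive an exposure time for every row of the full array from a
    thinned-out preview.

    preview  -- per-row mean signal levels in [0, 1], one entry per sampled row
    n_rows   -- number of rows in the full pixel array
    row_step -- thinning interval (one sampled row every `row_step` rows)
    """
    exposures = []
    for row in range(n_rows):
        # Use the nearest sampled row as the brightness estimate for this row.
        sample = min(row // row_step, len(preview) - 1)
        level = max(preview[sample], 1e-3)  # guard against division by zero
        # Scale the exposure so the expected signal approaches the target level.
        exposures.append(base_exposure_us * target / level)
    return exposures
```

For example, with two sampled rows measuring 0.25 and 0.5, the darker half of an 8-row array would receive twice the base exposure.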
- the control parameter is not limited to the exposure time of the second-type pixels 120 ; it may also be the gain of each readout circuit 103 , the conversion resolution of AD conversion in the readout operation 1300 of reading out the signals generated in the image sensing operation by the second-type pixels 120 , or the readout region in the pixel array 101 from which the signals are to be read out.
- the controller 105 can determine the control parameter for at least one of not less than one row in the pixel array 101 and not less than one column in the pixel array 101 .
- the control parameters determined by the controller 105 are fed back to the vertical scanning circuit 102 and the readout circuits 103 via the control parameter line 106 .
- the shutter operation 1000 of the first image-sensing operation of the second frame can be started.
- the thinned-out reading operation of reading out signals from the first-type pixels 110 at time t 103 to time t 104 is performed at the same timing as described above in FIGS. 4 and 5B .
- the controller 105 performs control so that an image sensing operation will be performed by the second-type pixels 120 in accordance with each determined control parameter. More specifically, when each control parameter has been determined by the controller 105 , the shutter operation 1200 of the second image-sensing operation is started at time t 105 after every shutter operation 1000 of the first image-sensing operation has been completed. Since the shutter operation 1200 of each row is performed based on the exposure time determined for each row by the signal processing operation 1110 , for example, the shutter operation for an nth row is performed at time t 106 and the exposure of the nth row is started. The exposure of each subsequent row is started in the same manner.
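The per-row timing just described — each row's shutter (reset) is issued so that its row-specific exposure ends when that row's readout slot arrives — can be sketched as below. This is an illustrative model only; the function name and time units are assumptions, not from the patent.

```python
def shutter_schedule(readout_start_us, row_period_us, exposures_us):
    """Compute the shutter (reset) start time for each row so that each
    row's exposure ends exactly when that row's readout slot begins.

    readout_start_us -- time at which row 0 is read out
    row_period_us    -- interval between readouts of consecutive rows
    exposures_us     -- per-row exposure times determined beforehand
    """
    starts = []
    for row, exposure in enumerate(exposures_us):
        readout_time = readout_start_us + row * row_period_us
        # Reset this row early enough that it accumulates charge for
        # exactly `exposure` microseconds before its readout.
        starts.append(readout_time - exposure)
    return starts
```

With a rolling readout every 10 µs starting at t = 1000 µs, a row needing 100 µs of exposure is reset at 900 µs, while the next row, needing only 50 µs, is reset at 960 µs.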
- the readout operation 1100 of the second frame in the first image-sensing operation is performed.
- the readout operation 1300 of the first frame of the second image-sensing operation is started, and the readout of the signals generated in all of the second-type pixels 120 is completed at time t 111 .
- since the second-type pixels 120 are arranged in all of the rows of the pixel array 101 and are present in the same column in adjacent rows in most of the rows, the second-type pixels 120 need to be scanned and subjected to readout row by row.
- the scanning time in the readout operation 1100 of reading out signals from the first-type pixels 110 can be shorter than the scanning time of the readout operation 1300 of reading out signals from the second-type pixels 120 .
- although a detailed timing chart of the readout operation 1300 is not illustrated, it is the same as that obtained when the entire signal line group 130 is changed to a Lo signal in FIG. 3 .
- the subsequent operation is the same as that of the previous frame, and thus a description will be omitted.
- the controller 105 may control, as a control parameter, the gain of each readout circuit 103 , the conversion resolution of AD conversion in the readout operation 1300 of reading out signals from the image sensing operation by the second-type pixels 120 , or the readout region where the signals are to be read out in the pixel array 101 .
- the exposure time of the second-type pixels 120 need not be controlled as a control parameter or a plurality of parameters including the exposure time may be combined and controlled.
- the control parameter may be used to control the external operation of the image sensing device.
- the control parameter can be used for the shutter speed setting, the ISO sensitivity setting, the f-number setting, the focusing of the lens, the signal processing level (for example, the intensity of the noise removal), and the like which are to be made in the camera for the image sensing operation by the second-type pixels 120 .
- in a case in which the image sensing device 100 repeats one first image-sensing operation and one second image-sensing operation in each image sensing operation, the exposure time of accumulating charges in the second image-sensing operation need not be used as the control parameter.
- the controller 105 may perform the second image-sensing operation by using a first control parameter determined by the first image-sensing operation in the same image-sensing operation period.
- the controller 105 may feed back, to the readout operation 1300 of the immediately following second image-sensing operation (time t 109 to time t 110 ), the control parameter determined based on the signals of the first-type pixels 110 obtained in the readout operation 1100 of the first image-sensing operation performed at time t 108 to time t 109 .
- the control parameter for the signals of the second-type pixels 120 to be read out next is determined. As a result, it is possible to determine a suitable image-sensing condition by tracking a high-speed moving object.
- a processing operation will be described next in which a suitable region corresponding to an even higher speed moving object is cut out by reducing the next signal readout region in a stepwise manner based on the information of the first-type pixels 110 obtained in the high-speed first image-sensing operation (thinned-out reading operation).
- an operation in which the first-type pixels 110 are divided into two pixel groups, first preliminary image sensing pixels 111 and second preliminary image sensing pixels 112 , will be described.
- FIG. 7 is a view showing the arrangement of the pixels of a pixel array 101 ′ according to this embodiment.
- the pixel array 101 ′ includes 64 rows × 64 columns of pixels.
- a first preliminary image sensing operation and a second preliminary image sensing operation performed after the first preliminary image sensing operation are performed as the first image-sensing operation in which the thinned-out reading operation is performed.
- the first-type pixels 110 are classified into the first preliminary image sensing pixels 111 to be used for the first preliminary image sensing operation of the first-type pixels 110 and the second preliminary image sensing pixels 112 different from the first preliminary image sensing pixels 111 to be used for the second preliminary image sensing operation.
- the first preliminary image sensing pixels 111 are arranged from the 5th row of the pixel array 101 ′ at an interval of 8 rows.
- the second preliminary image sensing pixels 112 are arranged from the 1st row at an interval of 8 rows.
- Components other than the pixel array 101 ′ may be the same as those in the arrangement shown in FIG. 1 , and thus a description of components other than the pixel array 101 ′ will be omitted.
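Under the row numbering used in this description (the uppermost row is the 0th row), the two preliminary pixel groups occupy interleaved row sets: rows 5, 13, 21, … and rows 1, 9, 17, …. A short sketch, with hypothetical function and parameter names, that generates these row indices for the 64-row array:

```python
def preliminary_rows(n_rows, first_offset=5, second_offset=1, step=8):
    """Row indices of the first and second preliminary image sensing pixel
    groups, with 0-based row numbering (uppermost row = 0th row).

    The described arrangement starts the first group at the 5th row and the
    second group at the 1st row, each repeating every 8 rows.
    """
    first = list(range(first_offset, n_rows, step))    # 5, 13, 21, ...
    second = list(range(second_offset, n_rows, step))  # 1, 9, 17, ...
    return first, second
```

For a 64-row array each group covers 8 rows and the two sets never overlap, so the two preliminary image sensing operations read disjoint pixels.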
- FIG. 8 is a timing chart for explaining the operation of the image sensing device 100 that includes the pixel array 101 ′.
- a shutter operation 8000 is the shutter operation of the first preliminary image sensing pixels 111 .
- the readout operation 8100 is the readout operation of the first preliminary image sensing pixels 111 .
- the period between the shutter operation 8000 and the readout operation 8100 is the maximum width of a first preliminary image sensing period in the first preliminary image sensing pixels 111 .
- a shutter operation 8200 is the shutter operation of the second preliminary image sensing pixels 112 .
- a readout operation 8300 is the readout operation of the second preliminary image sensing pixels 112 .
- the period between the shutter operation 8200 and the readout operation 8300 is the maximum width of a second preliminary image sensing period in the second preliminary image sensing pixels 112 .
- a preliminary image sensing parameter of the second preliminary image sensing operation by the second preliminary image sensing pixels 112 is determined by the controller 105 based on the signals of the first preliminary image sensing pixels 111 read out by the readout circuits 103 .
- the control parameter of an image sensing operation by the second-type pixels 120 is determined by the controller 105 based on the signals of the second preliminary image sensing pixels 112 read out by the readout circuits 103 .
- the preliminary image sensing parameter and the control parameter determined by the signal processing operation 8110 and the signal processing operation 8310 are the same as the control parameter described with reference to FIG. 6 , and thus a description will be omitted.
- the operation of the image sensing device 100 which includes the pixel array 101 ′ will be described next.
- the shutter operation 8000 of the first preliminary image sensing operation is performed.
- the readout operation 8100 of the first preliminary image sensing pixel is performed.
- the signal processing operation 8110 of the first preliminary image sensing operation is started.
- the controller 105 determines, based on the signals generated by the first preliminary image sensing pixels 111 of an arbitrary region that has at least already undergone readout, a signal readout region 700 in the pixel array 101 ′ in the image sensing operation using the second preliminary image sensing pixels 112 .
- the region 700 that includes a specific image-sensing target may be determined from the signals obtained in the first preliminary image sensing operation.
- the controller 105 selects, from the pixel array 101 ′ on which 64 rows × 64 columns of pixels are arranged, the region 700 on which 32 rows × 32 columns of pixels are arranged.
- the controller 105 feeds back, via the control parameter line 106 , the determined signal readout region 700 of the image sensing operation using the second preliminary image sensing pixels 112 to the vertical scanning circuit 102 and the horizontal scanning circuit 104 .
- the shutter operation 8000 of the second frame in the first preliminary image sensing operation can be started.
- the region where the first preliminary image sensing pixels 111 , which are the first-type pixels 110 whose signals are to be read out in the first preliminary image sensing operation, are arranged in the pixel array 101 (that is, the entire region of the pixel array 101 ) includes the region 700 where the second preliminary image sensing pixels 112 , which are the first-type pixels 110 whose signals are to be read out in the second preliminary image sensing operation, are arranged. Also, although this embodiment has set the region 700 as a region where 32 rows × 32 columns of pixels are arranged, the present invention is not limited to this, and the region may be set appropriately.
- the readout operation 8100 of the second frame in the first preliminary image sensing operation is performed in the period of time t 137 to time t 138 .
- the readout operation 8300 of the second preliminary image sensing operation is performed in the period of time t 138 to time t 139 .
- the signal processing operation 8310 of the second preliminary image sensing operation is started.
- the controller 105 determines, based on the signals generated by the second preliminary image sensing pixels 112 of an arbitrary region that has at least already undergone readout, a signal readout region 701 in the pixel array 101 ′ in the second image sensing operation using the second-type pixels 120 .
- the region 701 which includes a specific image sensing target may be determined from signals obtained in the second preliminary image sensing operation.
- the controller 105 selects, from the region 700 where 32 rows × 32 columns of pixels are arranged, the region 701 where 16 rows × 16 columns of pixels are arranged.
- the controller 105 feeds back, via the control parameter line 106 , the determined region 701 from which signals are to be read out in the second image-sensing operation using the second-type pixels 120 to the vertical scanning circuit 102 and the horizontal scanning circuit 104 .
- the shutter operation 8200 of the second frame can be started.
- the region 700 where the second preliminary image sensing pixels 112 which are the first-type pixels 110 whose signals are to be read in the second preliminary image sensing operation in the pixel array 101 are arranged, includes the region 701 where the second-type pixels 120 whose signals are to be read in the second image-sensing operation in the pixel array 101 are arranged.
- the region 701 is a region in which 16 rows × 16 columns of pixels are arranged.
- the present invention is not limited to this, and the region may be set appropriately.
- the shutter operation 1200 is performed in the period from time t 140 to time t 141 , and the readout operation 1300 is performed in the period from time t 142 to time t 143 .
- the readout of signals generated by the second-type pixels 120 arranged in the region 701 is completed.
- the controller 105 need not control, as the second preliminary image sensing parameter and the control parameter, the exposure time in the image sensing operation by the second preliminary image sensing pixels 112 and the second-type pixels 120 . If the exposure time is not to be controlled, the controller 105 may feed back, to the immediately following readout operation 8300 , the preliminary image sensing parameter determined based on the signals of the first preliminary image sensing pixels 111 obtained in the readout operation 8100 . In the same manner, the controller 105 may feed back, to the immediately following readout operation 1300 , the control parameter determined based on the signals of the second preliminary image sensing pixels 112 obtained in the readout operation 8300 .
- the preliminary image sensing parameter and the control parameter are not limited to the signal readout regions (the regions 700 and 701 ) in the pixel array 101 .
- the preliminary image sensing parameter may be the exposure time during which charge accumulation is performed in the second preliminary image sensing operation by using the second preliminary image sensing pixels 112 .
- the control parameter may be the exposure time during which the charge accumulation is performed in the second image-sensing operation by using the second-type pixels 120 .
- the preliminary image sensing parameter and the control parameter may also be the conversion resolution of the AD conversion or the gain of the readout circuits 103 when the readout operations 8300 and 1300 of reading out the signals obtained in the image sensing operations by the second preliminary image sensing pixels 112 and the second-type pixels 120 are performed.
- the regions 700 and 701 may be selected by dividing the pixel array 101 into appropriate sizes in advance or an arbitrary region may be selected from the pixel array 101 based on the signals obtained in the first preliminary image sensing operation and the second preliminary image sensing operation.
- the next signal readout region is determined stepwise based on the information of the thinned-out pixels whose readout operation speed has been increased. As a result, it is possible to perform a more suitable image sensing operation by tracking an even higher speed moving object.
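The stepwise narrowing of the readout region (the full 64 × 64 array, then the 32 × 32 region 700, then the 16 × 16 region 701) can be modeled as repeatedly centring a smaller square on the detected target. The sketch below stands in for target detection with a simple brightest-pixel search; everything here, including the function name and the detection rule, is an illustrative assumption rather than the patent's algorithm.

```python
def refine_roi(signal, roi, new_size):
    """One refinement step: within the current readout region, centre a
    smaller square region on the brightest pixel (a stand-in for the
    detected image-sensing target).

    signal   -- 2-D list of pixel levels for the full (square) array
    roi      -- (top, left, size) of the current readout region
    new_size -- side length of the next, smaller region
    """
    top, left, size = roi
    # Locate the brightest pixel inside the current region.
    best = max(
        ((r, c) for r in range(top, top + size) for c in range(left, left + size)),
        key=lambda rc: signal[rc[0]][rc[1]],
    )
    half = new_size // 2
    n = len(signal)
    # Centre the new region on the target, clamped to the array bounds.
    new_top = min(max(best[0] - half, 0), n - new_size)
    new_left = min(max(best[1] - half, 0), n - new_size)
    return (new_top, new_left, new_size)
```

Calling this twice — once on the full array to obtain a 32 × 32 region and once more to obtain a 16 × 16 region — mirrors the two preliminary image sensing operations followed by the second image-sensing operation.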
- a camera incorporating the image sensing device 100 will be exemplified hereinafter.
- the concept of a camera includes not only a device whose main purpose is image capturing but also a device (for example, a personal computer or a mobile terminal) that has an image capturing function as an auxiliary function.
- the image sensing device 100 may include, in one semiconductor chip 910 , the pixel array 101 , a signal processor 902 , and a control circuit 901 that includes the vertical scanning circuit 102 , the readout circuits 103 , the horizontal scanning circuit 104 , and the controller 105 .
- the image sensing device 100 may be formed from a plurality of semiconductor chips.
- the image sensing device 100 includes a semiconductor chip 910 a and a semiconductor chip 910 b which are stacked in the manner shown in FIG. 9B .
- the control circuit 901 and the pixel array 101 may be included in the semiconductor chip 910 a
- the signal processor 902 may be included in the semiconductor chip 910 b .
- the image sensing device 100 may include the pixel array 101 in the semiconductor chip 910 a and the control circuit 901 and the signal processor 902 in the semiconductor chip 910 b as shown in FIG. 9C .
- the semiconductor chip 910 a and the semiconductor chip 910 b are electrically connected to each other by direct connection of wiring lines, through-silicon vias, or bumps.
- the signal processor 902 can include an A/D conversion circuit and a processor (ISP: Image Signal Processor) that processes digital data of the A/D-converted image data.
- FIG. 9D is a schematic view of an equipment EQP incorporating the image sensing device 100 .
- An electronic equipment such as a camera, an information equipment such as a smartphone, a transportation equipment such as an automobile or an airplane, or the like is an example of the equipment EQP.
- the image sensing device 100 can include, other than a semiconductor device IC which includes the semiconductor chip on which the pixel array 101 is arranged, a package PKG that contains the semiconductor device IC.
- the package PKG can include a base on which the semiconductor device IC is fixed and a lid member made of glass or the like which faces the semiconductor device IC, and connection members such as a bump and a bonding wire that connect a terminal arranged in the base and a terminal arranged in the semiconductor device IC to each other.
- the equipment EQP can further include at least one of an optical system OPT, a control device CTRL, a processing device PRCS, a display device DSPL, and a memory device MMRY.
- the optical system OPT forms images in the image sensing device 100 and is formed from, for example, a lens, a shutter, and a mirror.
- the control device CTRL controls the operation of the image sensing device 100 and is a semiconductor device such as an ASIC.
- the processing device PRCS processes signals output from the image sensing device 100 and is a semiconductor device such as a CPU or an ASIC for forming an AFE (Analog Front End) or a DFE (Digital Front End).
- the display device DSPL is an EL display device or a liquid crystal display device that displays information (image) acquired by the image sensing device 100 .
- the memory device MMRY is a magnetic device or a semiconductor device for storing information (image) acquired by the image sensing device 100 .
- the memory device MMRY is a volatile memory such as an SRAM, DRAM, or the like or a nonvolatile memory such as a flash memory, a hard disk drive, or the like.
- a mechanical device MCHN includes a driving unit or propulsion unit such as a motor, an engine, or the like. The mechanical device MCHN in the camera can drive the components of the optical system OPT for zooming, focusing, and shutter operations.
- in the equipment EQP, signals output from the image sensing device 100 are displayed on the display device DSPL and are transmitted externally by a communication device (not shown) included in the equipment EQP.
- it is preferable for the equipment EQP to further include the memory device MMRY and the processing device PRCS that are separate from the storage circuit unit and the calculation circuit unit included in the control circuit 901 and the signal processor 902 in the image sensing device 100 .
- the image sensing device 100 can track a high speed moving object.
- a camera incorporating the image sensing device 100 is applicable as a monitoring camera, an onboard camera mounted in a transportation equipment such as an automobile or an airplane, or the like.
- a transportation equipment 2100 is, for example, an automobile including an onboard camera 2101 shown in FIGS. 10A and 10B .
- FIG. 10A schematically shows the outer appearance and the main internal structure of the transportation equipment 2100 .
- the transportation equipment 2100 includes an image sensing device 2102 , an image sensing system ASIC (Application Specific Integrated Circuit) 2103 , a warning device 2112 , and a control device 2113 .
- the above-described image sensing device 100 is used for the image sensing device 2102 .
- the warning device 2112 warns a driver when it receives an abnormality signal from an image-sensing system, a vehicle sensor, a control unit, or the like.
- the control device 2113 comprehensively controls the operations of the image sensing system, the vehicle sensor, the control unit, and the like.
- the transportation equipment 2100 need not include the control device 2113 .
- the image sensing system, the vehicle sensor, and the control unit each can individually include a communication interface and exchange control signals via a communication network (for example, CAN standard).
- FIG. 10B is a block diagram showing the system arrangement of the transportation equipment 2100 .
- the transportation equipment 2100 includes a first image sensing device 2102 and a second image sensing device 2102 . That is, the onboard camera according to this embodiment is a stereo camera.
- An object image is formed by an optical unit 2114 on each image sensing device 2102 .
- An image signal output from each image sensing device 2102 is processed by an image pre-processor 2115 and transmitted to the image sensing system ASIC 2103 .
- the image pre-processor 2115 performs processing such as S-N calculation and synchronization signal addition.
- the above-described signal processor 902 corresponds to at least a part of the image pre-processor 2115 and the image sensing system ASIC 2103 .
- the image sensing system ASIC 2103 includes an image processor 2104 , a memory 2105 , an optical distance measuring unit 2106 , a parallax calculator 2107 , an object recognition unit 2108 , an abnormality detection unit 2109 , and an external interface (I/F) unit 2116 .
- the image processor 2104 generates an image signal by processing signals output from the pixels of each image sensing device 2102 .
- the image processor 2104 also performs correction of image signals and interpolation of abnormal pixels.
- the memory 2105 temporarily holds the image signal.
- the memory 2105 may also store the position of a known abnormal pixel in the image sensing device 2102 .
- the optical distance measuring unit 2106 uses the image signal to perform focusing or distance measurement of an object.
- the parallax calculator 2107 performs object collation (stereo matching) of a parallax image.
- the object recognition unit 2108 analyzes image signals to recognize objects such as transportation equipment, a person, a road sign, a road, and the like.
- the abnormality detection unit 2109 detects the fault or an error operation of the image sensing device 2102 . When detecting a fault or an error operation, the abnormality detection unit 2109 transmits a signal indicating the detection of an abnormality to the control device 2113 .
- the external I/F unit 2116 mediates the exchange of information between the units of the image sensing system ASIC 2103 and the control device 2113 or the various kinds of control units.
- the transportation equipment 2100 includes a vehicle information acquisition unit 2110 and a driving support unit 2111 .
- the vehicle information acquisition unit 2110 includes vehicle sensors such as a speed/acceleration sensor, an angular velocity sensor, a steering angle sensor, a ranging radar, and a pressure sensor.
- the driving support unit 2111 includes a collision determination unit.
- the collision determination unit determines whether there is a possibility of collision with an object based on the pieces of information from the optical distance measuring unit 2106 , the parallax calculator 2107 , and the object recognition unit 2108 .
- the optical distance measuring unit 2106 and the parallax calculator 2107 are examples of distance information acquisition units that acquire distance information of a target object. That is, the distance information is information related to the parallax, the defocus amount, the distance to the target object, and the like.
- the collision determination unit may use one of these pieces of distance information to determine the possibility of a collision.
- Each distance information acquisition unit may be implemented by dedicated hardware or a software module.
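One common way to realize such a collision determination — not specified by this document — is a time-to-collision (TTC) test on the acquired distance information: flag a possible collision when the distance divided by the closing speed falls below a safety threshold. The function name, parameters, and threshold below are all hypothetical.

```python
def collision_possible(distance_m, closing_speed_mps, ttc_threshold_s=2.0):
    """Return True when the time to collision drops below a threshold.

    distance_m        -- distance to the object, e.g. from stereo parallax
                         or the ranging radar
    closing_speed_mps -- rate at which the distance is decreasing; a value
                         <= 0 means the object is not approaching
    ttc_threshold_s   -- assumed safety margin in seconds
    """
    if closing_speed_mps <= 0:
        return False  # object is stationary relative to us, or receding
    return distance_m / closing_speed_mps < ttc_threshold_s
```

An object 10 m ahead closing at 10 m/s (TTC = 1 s) would trigger the determination, while the same object 100 m ahead (TTC = 10 s) would not; the result could then drive the warning device 2112 or the brake control unit.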
- an example in which the driving support unit 2111 controls the transportation equipment 2100 so that it does not collide with another object has been described. However, the embodiment is also applicable to control of automatic driving to follow another vehicle or control of automatic driving so as not to drift out of a lane.
- the transportation equipment 2100 also includes driving devices, which are used for movement or supporting a movement, such as an air bag, an accelerator, a brake, a steering, a transmission, an engine, a motor, wheels, propellers, and the like.
- the transportation equipment 2100 also includes control units for these devices. Each control unit controls a corresponding driving device based on a control signal of the control device 2113 .
- the image sensing system used in the embodiment is applicable not only to an automobile and a railway vehicle but also to, for example, transportation equipment such as a ship, an airplane, or an industrial robot.
- the image sensing system is also applicable not only to the transportation equipment but also widely to equipment using object recognition such as an ITS (Intelligent Transportation System).
Abstract
An image sensing device is provided. The device comprises pixels including a first pixel which belongs to a first row and a first column, a second pixel which belongs to a second row and the first column, and a third pixel which belongs to the second row and a second column, and readout units including a first readout circuit connected to the first and second pixels and a second readout circuit connected to the third pixel. The device performs a first operation and a second operation after the first operation. In the first operation, signal readout from the first and third pixels is performed. In the second operation, signal readout from the second pixel is performed. A controller determines, based on the signal generated by the first operation, a control parameter to be used to control the second operation.
Description
- The present invention relates to an image sensing device, a camera, and a transportation equipment.
- An image sensing device using a CMOS circuit is widely used in digital cameras, digital camcorders, monitoring cameras, and the like. Japanese Patent Laid-Open No. 2002-320235 discloses a CMOS image sensor that has, in addition to a mode for reading out signals from all of the pixels arranged in a pixel array, a mode for thinned-out reading of pixel signals when reduced image signals are to be output. Japanese Patent Laid-Open No. 2005-86245 discloses a solid-state image sensing device that reduces the number of pixels which are read out for each frame to improve the frame rate and alternately reads out, for each frame, an image sensing signal such as that for a moving image and an image-sensing target recognition signal such as that for autofocus.
- Since Japanese Patent Laid-Open Nos. 2002-320235 and 2005-86245 each have an arrangement in which signal readout is performed for each row when only signals from some of the pixels which are arranged in a pixel array are to be read out, the readout operation time can be long if the pixels whose signals are to be read out are arranged over a plurality of rows.
- The present invention provides a technique advantageous in reducing the readout time when signals are to be read out from some of the pixels which are arranged in a pixel array.
- According to some embodiments, an image sensing device that comprises a pixel array in which a plurality of pixels are arranged in a matrix and a plurality of readout circuits configured to read out signals from the pixel array, the plurality of pixels comprising a first pixel which belongs to a first pixel row of the pixel array and a first pixel column of the pixel array, a second pixel which belongs to a second pixel row of the pixel array and the first pixel column of the pixel array, and a third pixel which belongs to the second pixel row of the pixel array and a second pixel column of the pixel array, and the plurality of readout circuits comprising a first readout circuit connected to the first pixel and the second pixel and a second readout circuit connected to the third pixel, wherein the image sensing device performs a first image sensing operation and performs a second image sensing operation after the first image sensing operation, wherein in the first image sensing operation, signal readout from the first pixel by the first readout circuit and signal readout from the third pixel by the second readout circuit are performed simultaneously, wherein in the second image sensing operation, signal readout from the second pixel by the first readout circuit is performed, and wherein a controller determines, based on the signal generated by the first image sensing operation, a control parameter which is to be used to control the second image sensing operation, is provided.
- Further features of the present invention will become apparent from the following description of exemplary embodiments (with reference to the attached drawings).
-
FIG. 1 is a view showing an example of the arrangement of an image sensing device according to an embodiment of the present invention; -
FIGS. 2A to 2C are views each showing an example of the arrangement of a pixel array of the image sensing device of FIG. 1; -
FIG. 3 is a timing chart of a full pixel readout operation of the image sensing device of FIG. 1; -
FIG. 4 is a timing chart of a thinned-out reading operation of the image sensing device of FIG. 1; -
FIGS. 5A and 5B are a view showing an example of the arrangement of the pixel array and a timing chart of the thinned-out reading operation, respectively, of the image sensing device of FIG. 1; -
FIG. 6 is a timing chart of an operation of the image sensing device of FIG. 1; -
FIG. 7 is a view showing an example of the arrangement of a pixel array of the image sensing device of FIG. 1; -
FIG. 8 is a timing chart of an operation of the image sensing device which includes the pixel array of FIG. 7; -
FIGS. 9A to 9D are views showing examples of the arrangement of a camera incorporating the image sensing device; and -
FIGS. 10A and 10B are views showing examples of transportation equipment mounted with the image sensing device of FIG. 1. - A detailed embodiment of an image sensing device according to the present invention will now be described with reference to the accompanying drawings. Note that in the following description and drawings, common reference numerals denote common components throughout a plurality of drawings. Hence, the common components will be described by cross-reference to the plurality of drawings, and a description of components denoted by common reference numerals will be appropriately omitted.
- An arrangement and an operation of the image sensing device according to the embodiment of the present invention will be described with reference to
FIGS. 1 to 10B. FIG. 1 is a view showing an arrangement of an image sensing device 100 according to an embodiment of the present invention. The image sensing device 100 includes a pixel array 101, a vertical scanning circuit 102, readout circuits 103, a horizontal scanning circuit 104, a controller 105, and a control parameter line 106. - A plurality of pixels, on which photoelectric conversion elements are arranged, are arranged in a matrix in the
pixel array 101. Here, in FIG. 1, a lateral direction is called a row direction (horizontal direction), and a longitudinal direction is called a column direction. In the arrangement shown in FIG. 1, 16 rows, of which the uppermost end is the 0th row and the lowermost end is the 15th row, and 16 columns, of which the rightmost end is the 0th column and the leftmost end is the 15th column, of pixels are arranged in the pixel array 101. The vertical scanning circuit 102 selects pixels arranged in the row direction. The readout circuit 103 is arranged for each column and reads out a signal, via a column signal line of each column, from each pixel in a row that has been selected by the vertical scanning circuit 102. The controller 105 processes the signals from the readout circuits 103, which are scanned by the horizontal scanning circuit 104, and feeds back the generated control parameters to the vertical scanning circuit 102 and the readout circuits 103 by using the control parameter line 106. The controller 105 may control the components of the image sensing device 100, such as the vertical scanning circuit 102, the readout circuits 103, and the horizontal scanning circuit 104. Note that a pixel color can be associated with each pixel of the pixel array 101 by using a color filter array. A color filter array can, for example, employ a Bayer array in which green pixels are assigned to a diagonal pixel pair of 2×2 pixels and a red pixel and a blue pixel are assigned to the remaining two pixels. - Pixels arranged in the
pixel array 101 include a plurality of first-type pixels 110, which are used in the thinned-out reading operation (to be described later), and a plurality of second-type pixels 120, which are not used in the thinned-out reading operation but are used for image generation. The first-type pixels 110 can be referred to as thinned-out reading pixels, and the second-type pixels 120 can be referred to as non-thinned-out reading pixels or normal readout pixels. Note that when the second-type pixels 120 are read out, the first-type pixels 110 can also be read out without executing a thinning operation, in the same manner as the second-type pixels 120. Although the first-type pixels 110 and the second-type pixels 120 can be distinguished from each other in that their respective readout methods are different, they may have the same pixel structure. Here, a row in which the first-type pixels 110 and the second-type pixels 120 are arranged in the row direction will be called a first-type pixel row. In other words, the pixel array 101 includes a plurality of first-type pixel rows each including at least one first-type pixel of the plurality of first-type pixels 110 and one of the plurality of second-type pixels 120. The pixel array 101 also includes a plurality of second-type pixel rows in which only the second-type pixels 120, other than the first-type pixels, are arranged in the row direction. In each first-type pixel row, at least one second-type pixel 120 is arranged between adjacent first-type pixels 110. Also, at least one row of pixels other than the first-type pixels, more specifically, a pixel row formed by only the second-type pixels 120, is arranged between adjacent first-type pixel rows. In the arrangement shown in FIG. 1, the first-type pixels 110 are arranged for every 4 pixels (4 rows) in the column direction and for every 4 pixels (4 columns) in the row direction.
Furthermore, in each first-type pixel row, there are a plurality of types of positions where the first-type pixels 110 are to be arranged in the row direction. For example, among the plurality of first-type pixel rows, note the 1st row of the first-type pixel rows (to be referred to as the 1st row hereinafter) in the pixel array 101 and the 5th row of the first-type pixel rows (to be referred to as the 2nd row hereinafter), which is adjacent to the 1st row of the first-type pixel rows in the pixel array 101. The columns where the first-type pixels 110, of the plurality of first-type pixels 110, which are arranged in the 1st row are positioned are different from the columns where the first-type pixels 110, of the plurality of first-type pixels 110, which are arranged in the 2nd row are positioned. In this manner, the first-type pixels 110 may be arranged in different columns in adjacent first-type pixel rows. In the arrangement shown in FIG. 1, the first-type pixels 110 are arranged so as to be positioned at columns different from each other in the pixel array 101, in which pixels are arranged in 16 rows×16 columns. -
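The placement just described can be illustrated with a short sketch. The per-row column offsets below are assumptions chosen only to match the FIG. 2A excerpt (a first-type pixel at row 1, column 2 and at row 5, column 1); they are not values given by the specification. The sketch checks the stated property: one first-type pixel every 4 rows and every 4 columns, with all 16 first-type pixels of the 16×16 array occupying 16 distinct columns.

```python
COLS = 16
FIRST_TYPE_ROWS = [1, 5, 9, 13]        # a first-type pixel row every 4 rows
OFFSETS = {1: 2, 5: 1, 9: 0, 13: 3}    # hypothetical per-row column offsets

def first_type_positions():
    """Return the (row, column) positions of the first-type pixels 110."""
    positions = []
    for r in FIRST_TYPE_ROWS:
        for k in range(COLS // 4):     # one first-type pixel every 4 columns
            positions.append((r, OFFSETS[r] + 4 * k))
    return positions

positions = first_type_positions()
columns = [c for _, c in positions]
# Every first-type pixel sits in its own column, so each can drive its own
# per-column readout circuit, and all of them can be read out simultaneously.
assert len(columns) == len(set(columns)) == 16
assert (1, 2) in positions and (5, 1) in positions   # matches the FIG. 2A excerpt
```

Because no column is shared, no two first-type pixels ever contend for the same column signal line, which is what makes the simultaneous readout described below possible.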
FIG. 2A shows the connection relation of the plurality of readout circuits 103 arranged for the respective columns in correspondence with the first-type pixels 110, the second-type pixels 120, the vertical scanning circuit 102, and the pixel array 101. FIG. 2A shows the 1st, 2nd, and 5th rows and the 0th to 3rd columns of the pixel array 101. The first-type pixels 110 and the second-type pixels 120 each include a photoelectric conversion element PD, a floating diffusion region FD, and transistors M1 to M4. The transistor M1 is a transfer transistor that transfers, to the floating diffusion region FD, charges converted from light and accumulated by the photoelectric conversion element PD. The transistor M2 is a reset transistor for resetting the photoelectric conversion element PD and the floating diffusion region FD. The transistor M3 is a source-follower transistor that converts the charges transferred to the floating diffusion region FD into a voltage signal and outputs the converted signal. The transistor M4 is a selection transistor for outputting a signal generated from light incident on each pixel to a corresponding column signal line 107 arranged along the column direction. - A
signal line group 130 for controlling the first-type pixels 110 and a signal line group 140 for controlling the second-type pixels 120 are arranged in each of the first-type pixel rows (the 1st row and the 5th row) from the vertical scanning circuit 102. A signal line group 141 for controlling the second-type pixels 120 is arranged in each pixel row (the 2nd row) in which only the second-type pixels 120 are arranged. Each of the signal line groups includes signal lines PTX, PRES, and PSEL. In FIG. 2A, "1" is added to the reference symbol of each of the signal lines PTX, PRES, and PSEL that is connected to the first-type pixels 110, and "2" is added to the reference symbol of each of the signal lines PTX, PRES, and PSEL that is connected to the second-type pixels 120. The number in brackets following the reference symbol of each of the signal lines PTX, PRES, and PSEL indicates the row number. - In the arrangement shown in
FIG. 2A, a total of three signal lines are arranged as the signal line group 141 in the 2nd pixel row, in which the first-type pixels 110 are not arranged and only the second-type pixels 120 are arranged. However, the present invention is not limited to this arrangement. For example, to ensure the opening of the photoelectric conversion element PD and a uniform parasitic capacitance of the floating diffusion region FD, wiring lines may be added so that the row will have six signal lines, which is the same number of lines as that of each first-type pixel row. In other words, the total number of signal lines of the signal line group 130 and the signal line group 140 may be the same as the number of signal lines of the signal line group 141. The wiring lines to be added to the signal line group 141 may be, as shown in FIG. 2B, dummy signal lines PTXD, PRESD, and PSELD, which are not connected to any of the second-type pixels 120. Alternatively, as shown in FIG. 2C, the signal lines to be added to the signal line group 141 may be a second signal line group which is connected to some of the second-type pixels 120, equal in number to the first-type pixels 110 arranged in the first-type pixel row. By adding a second signal line group, the output wiring line load from the vertical scanning circuit 102 can be made equal in the first-type pixel rows and the second-type pixel rows. In this case, for example, as shown in FIG. 2C, the second-type pixel 120 at the 2nd column of each of the 0th, 2nd, and 3rd pixel rows may be connected to signal lines PTX2B, PRES2B, and PSEL2B of the signal line group 141. That is, the connection relation between the signal line group 141 and the second-type pixel 120 in the 2nd column of each of the 0th, 2nd, and 3rd pixel rows may be the same as the connection relation between the signal line groups 130 and 140 and the first-type pixels 110 and second-type pixels 120 of the 1st first-type pixel row.
In the same manner, the connection relation between the 5th row and the 4th, 6th, and 7th rows, the connection relation between the 9th row and the 8th, 10th, and 11th rows, and the connection relation between the 13th row and the 12th, 14th, and 15th rows may be the same. - The operation of the
image sensing device 100 will be described next. FIG. 3 is a timing chart of a readout operation performed to read out signals from all of the pixels arranged in the pixel array 101. FIG. 3 shows the timings at which signals are read out from pixels belonging to the 0th row to the 5th row of the pixel array 101. - At time t1, the transistor M2 resets the floating diffusion region FD by supplying a Hi signal to a signal line PSEL2 [0] and a signal line PRES2 [0]. When the transistor M4 executes an ON operation (changes to a conductive state) simultaneously with the resetting of the floating diffusion region FD, the 0th row changes to the selected state, and a reset level is output from the transistor M3 via the transistor M4 to the corresponding
column signal line 107. Subsequently, when the signal line PRES2 [0] changes to a Lo signal, the reset level of the 0th row is read out by the readout circuit 103 of each column. - Next, at time t2, accumulated charges are transferred from each photoelectric conversion element PD to the corresponding floating diffusion region FD when a Hi signal is supplied to a signal line PTX2 [0]. When the signal line PTX2 [0] changes to a Lo signal, the signal level of the 0th row is read out by the
readout circuits 103. Correlated double sampling processing can be performed on the readout reset level and signal level in each readout circuit 103 or in the controller 105. - At time t3, the 0th row is set to an unselected state when the transistor M4 executes an OFF operation (changes to a non-conductive state) upon the signal line PSEL2 [0] changing to a Lo signal. The time from time t1 to time t3 is the readout time of one row. At time t3, the readout operation of all of the pixels in the 1st row is started when a Hi signal is supplied simultaneously to each of the signal lines PRES1 [1], PRES2 [1], PSEL1 [1], and PSEL2 [1], and the readout operation ends at time t4. Subsequently, each row is sequentially scanned in the same manner, and signals are read out from the pixels belonging to each row.
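The correlated double sampling step mentioned above can be sketched numerically. The ADC codes below are invented for illustration only; the point is that subtracting the per-pixel reset level from the signal level cancels the reset (kTC) noise and the pixel offset.

```python
# Correlated double sampling (CDS): subtract the reset level read at time t1
# from the signal level read at time t2, pixel by pixel.
# The levels below are hypothetical ADC codes for one row of four pixels.
reset_level  = [512, 515, 509, 513]   # read while FD holds the reset charge
signal_level = [812, 515, 709, 913]   # read after charge transfer via M1

cds_output = [s - r for s, r in zip(signal_level, reset_level)]
# A pixel that received no light (index 1) yields 0 after CDS, even though
# its raw reset and signal codes both carry a nonzero offset.
assert cds_output == [300, 0, 200, 400]
```

As the text notes, this subtraction can be done either in each readout circuit 103 or later in the controller 105; the arithmetic is the same in both cases.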
- A thinned-out reading operation of reading out signals from only the first-type pixels 110 among the pixels arranged in the pixel array 101 will be described next. FIG. 4 is a timing chart of the thinned-out reading operation. - At time t11, a Hi signal is supplied to only the signal lines PSEL1 and PRES1 of the 1st, 5th, 9th, and 13th rows, which are the first-type pixel rows, and the first-type pixels 110 of each of the first-type pixel rows are reset. Subsequently, each signal line PRES1 changes to a Lo signal, and the reset level is read out. Next, at time t12, a Hi signal is supplied to only the signal line PTX1 of each first-type pixel row, and the signal level of each first-type pixel row is read out when the signal line PTX1 changes to a Lo signal. Next, at time t13, the signal line PSEL1 of each first-type pixel row changes to a Lo signal, and the readout of each first-type pixel row ends. - In this embodiment, as shown in
FIG. 1, the first-type pixels 110 of the 1st, 5th, 9th, and 13th rows are arranged in different columns from each other. Hence, signals from the first-type pixels 110 arranged in this manner can, as described in FIG. 4, be simultaneously read out by the readout circuits 103 arranged in the corresponding columns within the readout time of one row, from time t11 to time t13. In this manner, when signals are to be read out from some of the pixels arranged in the pixel array 101, the speed of the thinned-out reading operation can be increased by simultaneously reading out the signals from the first-type pixels 110 arranged in different columns. - In the readout operation in which the
readout circuits 103 read out signals from the plurality of first-type pixels 110, the controller 105 causes the first-type pixels 110 which are arranged in different columns, among the plurality of first-type pixels 110, to connect to corresponding different column signal lines 107 among the plurality of column signal lines 107. As a result, in the image sensing device 100, signals from at least two first-type pixels 110 arranged in two first-type pixel rows among the plurality of first-type pixel rows can be read out simultaneously by the readout circuits 103 arranged in the corresponding columns. More specifically, in the arrangement shown in FIG. 2A, for example, the plurality of pixels include a first-type pixel 110a which is arranged in the 1st pixel row and the 2nd pixel column of the pixel array and a first-type pixel 110b which is arranged in the 5th pixel row and the 1st pixel column of the pixel array. The plurality of pixels also include a second-type pixel 120a arranged in the 5th pixel row and the 2nd pixel column of the pixel array. The plurality of readout circuits 103 include a readout circuit 103a which is connected to the first-type pixel 110a and the second-type pixel 120a and a readout circuit 103b which is connected to the first-type pixel 110b. This arrangement allows the readout circuit 103a to read out a signal from the first-type pixel 110a and the readout circuit 103b to read out a signal from the first-type pixel 110b simultaneously. A signal is not read out from the second-type pixel 120a, which is connected to the same column signal line 107a as the first-type pixel 110a; a signal is read out from the second-type pixel 120a at a separate timing. In the same manner, a signal is not read out from another second-type pixel 120 which is connected to the same column signal line 107b as the first-type pixel 110b, and a signal is read out from that second-type pixel 120 at another timing.
That is, when signals are to be read out simultaneously from the first-type pixels 110a and 110b, signals are not read out from the pixel rows which are arranged between the first-type pixel rows (the 1st pixel row and the 5th pixel row in the arrangement of FIG. 2A) in which the first-type pixels 110a and 110b are respectively arranged. - The timing chart of
FIG. 4 described an example using the pixel array 101 that includes 16 rows×16 columns of pixels. However, the arrangement of the pixel array 101 is not limited to this. For example, FIG. 5B is a timing chart of the thinned-out reading operation performed in a pixel array 101 that includes 64 rows×64 columns of pixels, in which the first-type pixels 110 are arranged with the same regularity as in FIG. 1, as shown in FIG. 5A. In this case, of the first-type pixels 110 which are present in the 64 rows and are arranged in the first-type pixel rows, the first-type pixels 110 of the 1st, 5th, 9th, and 13th rows are read out in the readout time of one row. Subsequently, the readout of the 17th, 21st, 25th, and 29th rows, the readout of the 33rd, 37th, 41st, and 45th rows, and the readout of the 49th, 53rd, 57th, and 61st rows can be performed, so that signals from all of the first-type pixels 110 can be read out in the readout time of four rows, from time t21 to time t22. - In this manner, in each of the plurality of first-type pixel rows, each of the plurality of first-
type pixels 110 is arranged for every M pixels (M columns), and the plurality of first-type pixel rows are arranged for every N pixels (N rows) in the pixel array 101. In the readout operation of reading out signals from the first-type pixels 110, signals are read out simultaneously from the first-type pixels 110 belonging to L continuous first-type pixel rows of the plurality of first-type pixel rows. This can increase the speed of the thinned-out reading operation. In this case, L, M, and N each are a positive integer not less than 2 and may each be a positive integer not less than 3. If M and N each are not less than 3, a sufficient range of pixels can be subjected to readout at high speed by executing thinned-out reading. L, M, and N may be different from each other, two of the integers may be different from each other, two of the integers may be the same, or all may be the same. To reduce the distortion of an image that is obtained by thinned-out reading, M and N may be equal (M=N). In this example, L, M, and N are all the same integer, 4. In this manner, the relation between L, M, and N may satisfy at least one of the following: at least one of L, M, and N is not less than 3, and at least two of L, M, and N are equal to each other. Also, in consideration of the balance between the readout speed and the image quality, L may be set to be not less than ½ of M and not more than double M (M/2≤L≤2×M), and not less than ½ of N and not more than double N (N/2≤L≤2×N). - This embodiment has described how, in a case in which signals are to be read out from only some of the pixels arranged in the
pixel array 101, the speed of the thinned-out reading operation can be increased by arranging the first-type pixels 110 at suitable positions. Next, the embodiment will describe a processing operation in which a suitable image sensing condition is determined, and a high-speed moving object is tracked, by determining, based on the information of an image sensing operation by the first-type pixels 110 whose reading operation speed has been increased, a control parameter for the signals of the second-type pixels 120 of the next readout operation. -
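The batched scan schedule described for the 64-row array of FIGS. 5A and 5B can be sketched as follows. This is a schedule sketch only, assuming the row pitch N = 4 and batch size L = 4 used in this example; it shows why 16 first-type pixel rows are read in the time of only 4 row readouts.

```python
# Thinned-out scan schedule for a 64-row array: a first-type pixel row every
# N = 4 rows, and L = 4 consecutive first-type rows per simultaneous batch
# (their first-type pixels occupy disjoint columns, so they share no
# readout circuit).
N, L, ROWS = 4, 4, 64
first_type_rows = list(range(1, ROWS, N))            # rows 1, 5, ..., 61

# Each batch of L first-type rows is read in a single row-readout time.
batches = [first_type_rows[i:i + L] for i in range(0, len(first_type_rows), L)]
assert batches[0] == [1, 5, 9, 13]                   # read from t11 to t13
# 16 first-type rows are covered in only 4 row-times instead of 16,
# matching the "readout time of four rows from time t21 to time t22".
assert len(batches) == 4
```

Scanning the second-type pixels, by contrast, must proceed one row per row-time, which is why the full readout of FIG. 3 is slower than this thinned-out schedule.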
FIG. 6 is a timing chart of a case in which an image sensing operation by using the second-type pixels 120 is performed by using a control parameter based on signals obtained from performing an image sensing operation by using the first-type pixels 110, which perform the thinned-out reading operation. The arrangement of the image sensing device 100 is the same as those shown in FIGS. 1 and 2A. - A shutter operation 1000 (broken line) is a shutter operation of performing an image sensing operation by the first-type pixels 110. The longitudinal direction of the broken line indicating the shutter operation 1000 represents the column direction (or the pixel row position at which the shutter operation is to be performed). The shutter operation 1000 indicates that the vertical scanning circuit 102 performs scanning from the upper end to the lower end or from the lower end to the upper end of the pixel array 101, and that an exposure operation of the first-type pixels 110 of each first-type pixel row is to be started. More specifically, in the shutter operation 1000, the exposure operation is started after a Hi signal is supplied to the signal lines PTX1 and PRES1 of each selected first-type pixel row, the photoelectric conversion element PD of each first-type pixel 110 is reset, and the signal line PTX1 subsequently changes to a Lo signal. - A readout operation 1100 (solid line) is an operation of reading out signals from the first-type pixels 110. The signals of the first-type pixels 110 of the 1st row selected by the vertical scanning circuit 102 are read out simultaneously. At this time, the thinned-out reading operation of reading out signals from the first-type pixels 110 is performed at the same timings as those described above in FIGS. 4 and 5B. The period between the shutter operation 1000 and the readout operation 1100 is the maximum width of a period (to be referred to as a first image-sensing period hereinafter) of image-sensing by the first-type pixels 110, and the interval between the shutter operation 1000 and the readout operation 1100 may be shortened as necessary. - In a
signal processing operation 1110, each control parameter to be used in the subsequent image sensing operation by the second-type pixels 120 (to be described later) is determined by the controller 105 based on the signals of the first-type pixels 110 that have undergone readout by the readout circuits 103. The control parameter includes, for example, an exposure time of accumulating charges in the image sensing operation by the second-type pixels 120, the gain of each readout circuit 103, a conversion resolution to be used when performing AD conversion, a region (ROI: Region of Interest) where the signal readout is to be performed in the pixel array 101, or the like. The control parameter may also be used for the shutter speed setting, the ISO sensitivity setting, the f-number setting, the focusing of the lens, the signal processing level (for example, the intensity of the noise removal), and the like, which are to be made in the camera for the image sensing operation by the second-type pixels 120. A shutter operation 1200 (broken line) is the shutter operation of the second-type pixels 120. A readout operation 1300 is the readout operation of reading out signals from the second-type pixels 120. The period between the shutter operation 1200 and the readout operation 1300 is the maximum width of a period (to be referred to as a second image-sensing period hereinafter) of image-sensing by the second-type pixels 120. - The operation of the
image sensing device 100 will be described next. First, the controller 105 performs control so that an image sensing operation is performed by the first-type pixels 110 in the first image-sensing operation. At time t101, scanning for the shutter operation 1000 is started from the first-type pixel row on the upper end of the pixel array 101, and the shutter operation 1000 is completed at time t102. Next, at time t103, the readout operation 1100 is started. The controller 105 causes the readout circuits 103, arranged in the corresponding columns, to read out the signals generated by the first-type pixels 110 in the image sensing operation, sequentially from the first-type pixel row on the upper-end side of the pixel array 101. The controller 105 starts the signal processing operation 1110 by using these signals. In the signal processing operation 1110, the controller 105 determines, as a control parameter and based on the signals generated by the first-type pixels 110 of an arbitrary region which have already undergone readout, the length of the exposure time of charge accumulation in the second image-sensing operation by the second-type pixels 120 in each row. Although the exposure time is determined for each row in this case, the exposure time may be determined for each plurality of rows. If the image sensing device 100 also includes an exposure control mechanism for each arbitrary number of pixels in the row direction, the length of the exposure time for each arbitrary column may also be determined in addition to the exposure time for each row. The control parameter may be not only the exposure time of the second-type pixels 120 but also the gain of each readout circuit 103, the conversion resolution of AD conversion in the readout operation 1300 of reading out signals from the image sensing operation by the second-type pixels 120, or the readout region where the signals are to be read out in the pixel array 101.
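The per-row exposure determination described above can be sketched as a small feedback function. The function name, the target level, and the clamp bounds below are all assumptions made for illustration; the specification only states that an exposure time is derived per row (or per group of rows) from the thinned-out first-type pixel signals.

```python
# Hypothetical sketch: derive a per-row exposure time for the second
# image-sensing operation from the mean first-type pixel signal near
# each row. Brighter regions get shorter exposures, darker regions longer.
def exposure_per_row(thinned_rows, base_exposure_us=1000, target_level=128):
    """thinned_rows maps a pixel-row index to the mean first-type signal
    (0-255) measured near that row in the first image-sensing operation."""
    params = {}
    for row, mean_level in thinned_rows.items():
        scale = target_level / max(mean_level, 1)   # avoid divide-by-zero
        # Clamp the feedback so one frame cannot swing the exposure too far.
        scale = min(max(scale, 0.25), 4.0)
        params[row] = int(base_exposure_us * scale)
    return params

params = exposure_per_row({1: 255, 5: 128, 9: 32})
assert params[5] == 1000                  # already at target: unchanged
assert params[1] < params[5] < params[9]  # bright row exposed for less time
```

The same shape of feedback could set the readout gain or AD conversion resolution per row instead of, or in addition to, the exposure time, as the text goes on to note.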
For example, since the exposure time and the gain can be suitably set for each arbitrary row or for each region, the dynamic range of the image sensing device 100 can be increased. In this manner, the controller 105 can determine the control parameter for at least one of not less than one row in the pixel array 101 and not less than one column in the pixel array 101. - The control parameters determined by the
controller 105 are fed back to the vertical scanning circuit 102 and the readout circuits 103 via the control parameter line 106. At time t104, after the signals of the first-type pixels 110 of every first-type pixel row have been read out, the shutter operation 1000 of the first image-sensing operation of the second frame can be started. The thinned-out reading operation of reading out signals from the first-type pixels 110 at time t103 to time t104 is performed at the same timing as described above in FIGS. 4 and 5B. - Next, in the second image-sensing operation performed after the first image-sensing operation, the
controller 105 performs control so that an image sensing operation will be performed by the second-type pixels 120 in accordance with each determined control parameter. More specifically, when each control parameter has been determined by the controller 105, the shutter operation 1200 of the second image-sensing operation is started at time t105, after every shutter operation 1000 of the first image-sensing operation has been completed. Since the shutter operation 1200 of each row is performed based on the exposure time determined for each row by the signal processing operation 1110, for example, the shutter operation for an nth row is performed at time t106 and the exposure of the nth row is started. The exposure of each subsequent row is started in the same manner. In the period from time t108 to time t109, the readout operation 1100 of the second frame in the first image-sensing operation is performed. When the signals generated from all of the first-type pixels 110 have been read out, the readout operation 1300 of the first frame of the second image-sensing operation is started, and the readout of the signals generated in all of the second-type pixels 120 is completed at time t111. - As the second-type pixels 120 are arranged in all of the rows of the pixel array 101 and are present in the same column for adjacent rows in most of the rows, the second-type pixels 120 need to be scanned and subjected to readout for each row. Hence, the scanning time in the readout operation 1100 of reading out signals from the first-type pixels 110 can be shorter than the scanning time of the readout operation 1300 of reading out signals from the second-type pixels 120. Although a detailed timing chart of the readout operation 1300 will not be illustrated, it is the same as that of FIG. 3 with the entire signal line group 130 kept at a Lo signal. The subsequent operation is the same as that of the previous frame, and thus a description will be omitted. - The above-described embodiment showed a case in which the exposure time of accumulating charges in the second image-sensing operation is controlled as a control parameter. However, the
controller 105 may control, as a control parameter, the gain of each readout circuit 103, the conversion resolution of AD conversion in the readout operation 1300 of reading out signals from the image sensing operation by the second-type pixels 120, or the readout region where the signals are to be read out in the pixel array 101. In this case, the exposure time of the second-type pixels 120 need not be controlled as a control parameter, or a plurality of parameters including the exposure time may be combined and controlled. The control parameter may also be used to control an operation external to the image sensing device. For example, the control parameter can be used for the shutter speed setting, the ISO sensitivity setting, the f-number setting, the focusing of the lens, the signal processing level (for example, the intensity of the noise removal), and the like, which are to be made in the camera for the image sensing operation by the second-type pixels 120. Here, consider a case in which the exposure time of accumulating charges in the second image-sensing operation is not used as the control parameter in each of the image sensing operations in which the image sensing device 100 repeats one first image-sensing operation and one second image-sensing operation. In other words, consider a case in which the control parameter is the gain of each readout circuit 103, the conversion resolution of each readout circuit 103, or the readout region where the signals are to be read out in the pixel array 101. In this case, the controller 105 may perform the second image-sensing operation by using a first control parameter determined by the first image-sensing operation in the same image-sensing operation period.
For example, the controller 105 may feed back, to the readout operation 1300 of the immediately following second image-sensing operation (time t109 to time t110), the control parameter determined based on the signals of the first-type pixels 110 obtained in the readout operation 1100 of the first image-sensing operation performed at time t108 to time t109. - As described above, based on the information of the first-type pixels 110, for which the speed of the readout operation 1100 has been increased, the control parameter for the signals of the second-type pixels 120 to be read out next is determined. As a result, it is possible to determine a suitable image-sensing condition by tracking a high-speed moving object. - A processing operation of cutting out a suitable region corresponding to an even higher speed moving object, by making a determination to reduce the next signal readout region in a stepwise manner based on the information of the first-type pixels 110 obtained in the high-speed first image-sensing operation (thinned-out reading operation), will be described next. Here, an example in which the above-described first-type pixels 110 are operated by dividing them into two pixel groups, first preliminary image sensing pixels 111 and second preliminary image sensing pixels 112, will be described. -
FIG. 7 is a view showing the arrangement of the pixels of a pixel array 101′ according to this embodiment. The pixel array 101′ includes 64 rows×64 columns of pixels. In this embodiment, a first preliminary image sensing operation and a second preliminary image sensing operation performed after the first preliminary image sensing operation are performed as the first image-sensing operation in which the thinned-out reading operation is performed. Hence, the first-type pixels 110 are classified into the first preliminary image sensing pixels 111, which are used for the first preliminary image sensing operation, and the second preliminary image sensing pixels 112, different from the first preliminary image sensing pixels 111, which are used for the second preliminary image sensing operation. In this embodiment, the first preliminary image sensing pixels 111 are arranged from the 5th row of the pixel array 101′ at an interval of 8 rows. The second preliminary image sensing pixels 112 are arranged from the 1st row at an interval of 8 rows. Components other than the pixel array 101′ may be the same as those in the arrangement shown in FIG. 1, and thus a description of components other than the pixel array 101′ will be omitted. -
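The two preliminary pixel groups just described can be sketched directly from the stated row positions (first preliminary pixels from the 5th row at an interval of 8 rows, second preliminary pixels from the 1st row at an interval of 8 rows):

```python
# Row positions of the two preliminary pixel groups in the 64-row
# pixel array 101'.
first_preliminary_rows  = list(range(5, 64, 8))   # rows 5, 13, ..., 61
second_preliminary_rows = list(range(1, 64, 8))   # rows 1, 9, ..., 57

assert first_preliminary_rows[:2] == [5, 13]
assert second_preliminary_rows[:2] == [1, 9]
# The groups share no row, so each preliminary image sensing operation
# reads rows that the other operation leaves untouched.
assert not set(first_preliminary_rows) & set(second_preliminary_rows)
```

Keeping the two groups on disjoint rows is what lets the second preliminary image sensing operation begin while the signals of the first group are still being processed.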
FIG. 8 is a timing chart for explaining the operation of the image sensing device 100 that includes the pixel array 101′. A shutter operation 8000 is the shutter operation of the first preliminary image sensing pixels 111. The readout operation 8100 is the readout operation of the first preliminary image sensing pixels 111. The period between the shutter operation 8000 and the readout operation 8100 is the maximum width of a first preliminary image sensing period in the first preliminary image sensing pixels 111. A shutter operation 8200 is the shutter operation of the second preliminary image sensing pixels 112. A readout operation 8300 is the readout operation of the second preliminary image sensing pixels 112. The period between the shutter operation 8200 and the readout operation 8300 is the maximum width of a second preliminary image sensing period in the second preliminary image sensing pixels 112. - In a
signal processing operation 8110, after the first preliminary image sensing operation, a preliminary image sensing parameter of the second preliminary image sensing operation by the second preliminary image sensing pixels 112 is determined by the controller 105 based on the signals of the first preliminary image sensing pixels 111 read out by the readout circuits 103. In a signal processing operation 8310, after the second preliminary image sensing operation, the control parameter of an image sensing operation by the second-type pixels 120 is determined by the controller 105 based on the signals of the second preliminary image sensing pixels 112 read out by the readout circuits 103. The preliminary image sensing parameter and the control parameter determined by the signal processing operation 8110 and the signal processing operation 8310, respectively, are the same as the control parameter described with reference to FIG. 6, and thus their description will be omitted. - The operation of the
image sensing device 100 which includes the pixel array 101′ will be described next. First, in the period of time t131 to time t132, the shutter operation 8000 of the first preliminary image sensing operation is performed. Next, in the period of time t133 to time t134, the readout operation 8100 of the first preliminary image sensing pixels is performed. After the start of the readout operation 8100, the signal processing operation 8110 of the first preliminary image sensing operation is started. In the signal processing operation 8110, the controller 105 determines, based on the signals generated by the first preliminary image sensing pixels 111 of an arbitrary region that has at least already undergone readout, a signal readout region 700 in the pixel array 101′ for the image sensing operation using the second preliminary image sensing pixels 112. For example, the region 700 that includes a specific image sensing target may be determined from the signals obtained in the first preliminary image sensing operation. In the arrangement shown in FIG. 7, the controller 105 selects, from the pixel array 101′ on which 64 rows×64 columns of pixels are arranged, the region 700 in which 32 rows×32 columns of pixels are arranged. The controller 105 feeds back, to the vertical scanning circuit 102 and the horizontal scanning circuit 104 via the control parameter line 106, the signal readout region 700 determined for the image sensing operation using the second preliminary image sensing pixels 112. At time t134, after all of the first preliminary image sensing pixels 111 have undergone readout, the shutter operation 8000 of the second frame in the first preliminary image sensing operation can be started.
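For reference, the interleaved row layout described with reference to FIG. 7 can be sketched as follows. This is only a minimal illustration; the function and variable names are not from the patent.

```python
# Row layout of the 64-row pixel array 101' (1-indexed rows):
# first preliminary image sensing pixels 111 occupy rows 5, 13, 21, ...,
# second preliminary image sensing pixels 112 occupy rows 1, 9, 17, ...
ROWS = 64

def pixel_rows(start_row, interval=8, total_rows=ROWS):
    """Rows occupied by one group of preliminary image sensing pixels."""
    return set(range(start_row, total_rows + 1, interval))

first_prelim_rows = pixel_rows(5)    # read in the first preliminary operation
second_prelim_rows = pixel_rows(1)   # read in the second preliminary operation

# Each group covers only 8 of the 64 rows (a 1/8 thinned-out readout,
# hence the higher scanning speed), and the two groups share no row.
assert len(first_prelim_rows) == len(second_prelim_rows) == 8
assert first_prelim_rows.isdisjoint(second_prelim_rows)
```

Because each preliminary readout touches an eighth of the rows, its scanning time is correspondingly shorter than a full-resolution readout of the second-type pixels 120.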
Here, the region where the first preliminary image sensing pixels 111, which are the first-type pixels 110 whose signals are read out in the first preliminary image sensing operation, are arranged in the pixel array 101 (that is, the entire region of the pixel array 101) includes the region 700 where the second preliminary image sensing pixels 112, which are the first-type pixels 110 whose signals are read out in the second preliminary image sensing operation, are arranged. Also, although this embodiment has set the region 700 as a region in which 32 rows×32 columns of pixels are arranged, the present invention is not limited to this, and the region may be set appropriately. - After the
shutter operation 8200 of the second preliminary image sensing operation has been performed in the period of time t135 to time t136, the readout operation 8100 of the second frame in the first preliminary image sensing operation is performed in the period of time t137 to time t138. After all of the readout operations 8100 have been completed, the readout operation 8300 of the second preliminary image sensing operation is performed in the period of time t138 to time t139. After the start of the readout operation 8300, the signal processing operation 8310 of the second preliminary image sensing operation is started. In the signal processing operation 8310, the controller 105 determines, based on the signals generated by the second preliminary image sensing pixels 112 of an arbitrary region that has at least already undergone readout, a signal readout region 701 in the pixel array 101′ for the second image sensing operation using the second-type pixels 120. For example, the region 701 which includes a specific image sensing target may be determined from the signals obtained in the second preliminary image sensing operation. In the arrangement shown in FIG. 7, the controller 105 selects, from the region 700 in which 32 rows×32 columns of pixels are arranged, the region 701 in which 16 rows×16 columns of pixels are arranged. The controller 105 feeds back, to the vertical scanning circuit 102 and the horizontal scanning circuit 104 via the control parameter line 106, the region 701 from which signals are to be read out in the second image sensing operation using the second-type pixels 120. At time t139, after all of the second preliminary image sensing pixels 112 have undergone readout, the shutter operation 8200 of the second frame can be started.
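The stepwise narrowing described above (64×64 → region 700 of 32×32 → region 701 of 16×16) can be sketched by centring each successive readout window on the detected target. The brightest-pixel criterion below is only an illustrative stand-in; the patent leaves the actual target-detection criterion open, and all names here are assumptions.

```python
import numpy as np

def readout_region(frame, size):
    """Top-left corner of a size x size readout window centred on the
    brightest pixel, clamped so the window stays inside the frame."""
    r, c = np.unravel_index(int(np.argmax(frame)), frame.shape)
    r0 = min(max(r - size // 2, 0), frame.shape[0] - size)
    c0 = min(max(c - size // 2, 0), frame.shape[1] - size)
    return r0, c0

# Hypothetical preliminary frame with one bright moving target.
frame = np.zeros((64, 64))
frame[40, 20] = 1.0

r0, c0 = readout_region(frame, 32)            # first step: region 700
region_700 = frame[r0:r0 + 32, c0:c0 + 32]
r1, c1 = readout_region(region_700, 16)       # second step: region 701
region_701 = region_700[r1:r1 + 16, c1:c1 + 16]
```

Each step halves the window while keeping the target inside it, mirroring how the region 701 is chosen from within the region 700.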
Here, the region 700, where the second preliminary image sensing pixels 112, which are the first-type pixels 110 whose signals are read out in the second preliminary image sensing operation, are arranged in the pixel array 101, includes the region 701 where the second-type pixels 120 whose signals are read out in the second image sensing operation are arranged. In this embodiment, the region 701 is a region in which 16 rows×16 columns of pixels are arranged. However, the present invention is not limited to this, and the region may be set appropriately. - After the
shutter operation 8200 has been performed, the shutter operation 1200 is performed in the period from time t140 to time t141, and the readout operation 1300 is performed in the period from time t142 to time t143. At time t143, the readout of signals generated by the second-type pixels 120 arranged in the region 701 is completed. - In the operation of the
image sensing device 100 shown in FIGS. 7 and 8, the controller 105 need not control, as the second preliminary image sensing parameter and the control parameter, the exposure time in the image sensing operations by the second preliminary image sensing pixels 112 and the second-type pixels 120. If the exposure time is not to be controlled, the controller 105 may feed back, to the immediately following readout operation 8300, the preliminary image sensing parameter determined based on the signals of the first preliminary image sensing pixels 111 obtained in the readout operation 8100. In the same manner, the controller 105 may feed back, to the immediately following readout operation 1300, the control parameter determined based on the signals of the second preliminary image sensing pixels 112 obtained in the readout operation 8300. - In the operation of the
image sensing device 100 shown in FIGS. 7 and 8, the preliminary image sensing parameter and the control parameter need not be limited to the signal readout regions (the regions 700 and 701) in the pixel array 101. The preliminary image sensing parameter may be the exposure time during which charge accumulation is performed in the second preliminary image sensing operation by using the second preliminary image sensing pixels 112. The control parameter may be the exposure time during which charge accumulation is performed in the second image sensing operation by using the second-type pixels 120. The preliminary image sensing parameter and the control parameter may also be the conversion resolution of the AD conversion or the gain of the readout circuits 103 when the readout operations are performed on the second preliminary image sensing pixels 112 and the second-type pixels 120. The regions 700 and 701 may be selected by dividing the pixel array 101 into appropriate sizes in advance, or an arbitrary region may be selected from the pixel array 101 based on the signals obtained in the first preliminary image sensing operation and the second preliminary image sensing operation. - As described above, the next signal readout region is determined stepwise based on the information of the thinned-out pixels whose readout operation speed has been increased. As a result, it is possible to perform a more suitable image sensing operation while tracking a higher speed moving object.
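As one example of such a parameter, the exposure time could be updated from the mean preliminary-pixel signal in the style below. The target level, clamp range, and all names are assumptions for illustration only; the patent does not specify a particular update rule.

```python
FULL_SCALE = 4095            # assuming 12-bit AD conversion
TARGET = FULL_SCALE // 2     # aim the mean signal at mid-scale

def next_exposure_us(current_us, mean_signal, lo=10.0, hi=33000.0):
    """Scale the exposure so the mean preliminary signal approaches the
    target level, clamped to an assumed 10 us .. 33 ms range."""
    if mean_signal <= 0:
        return hi            # no measurable signal: use the longest exposure
    return min(max(current_us * TARGET / mean_signal, lo), hi)

shorter = next_exposure_us(1000.0, 4000)  # near saturation -> shorten
longer = next_exposure_us(1000.0, 500)    # underexposed -> lengthen
```

The same multiplicative-update idea would apply to the readout-circuit gain or the AD conversion resolution; only the clamped quantity changes.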
- As an application example of the
image sensing device 100 according to the above-described embodiment, a camera incorporating the image sensing device 100 will be exemplified hereinafter. Here, the concept of a camera includes not only a device whose main purpose is image capturing but also a device (for example, a personal computer or a mobile terminal) that has an auxiliary image capturing function. - As shown in
FIG. 9A, the image sensing device 100 may include, in one semiconductor chip 910, the pixel array 101, a signal processor 902, and a control circuit 901 that includes the vertical scanning circuit 102, the readout circuits 103, the horizontal scanning circuit 104, and the controller 105. The image sensing device 100 may also be formed from a plurality of semiconductor chips. For example, the image sensing device 100 includes a semiconductor chip 910a and a semiconductor chip 910b which are stacked in the manner shown in FIG. 9B. In this case, the control circuit 901 and the pixel array 101 may be included in the semiconductor chip 910a, and the signal processor 902 may be included in the semiconductor chip 910b. Alternatively, the image sensing device 100 may include the pixel array 101 in the semiconductor chip 910a and the control circuit 901 and the signal processor 902 in the semiconductor chip 910b, as shown in FIG. 9C. In a case in which the image sensing device 100 has a structure in which the semiconductor chip 910a and the semiconductor chip 910b are stacked in the manner shown in FIGS. 9B and 9C, the semiconductor chip 910a and the semiconductor chip 910b are electrically connected to each other by direct connection of wiring lines, through-silicon vias, or bumps. The signal processor 902 can include an A/D conversion circuit and a processor (ISP: Image Signal Processor) that processes the A/D-converted digital image data. -
FIG. 9D is a schematic view of an equipment EQP incorporating the image sensing device 100. Electronic equipment such as a camera, information equipment such as a smartphone, and transportation equipment such as an automobile or an airplane are examples of the equipment EQP. The image sensing device 100 can include, in addition to a semiconductor device IC which includes the semiconductor chip on which the pixel array 101 is arranged, a package PKG that contains the semiconductor device IC. The package PKG can include a base on which the semiconductor device IC is fixed, a lid member made of glass or the like which faces the semiconductor device IC, and connection members such as bumps and bonding wires that connect terminals arranged in the base and terminals arranged in the semiconductor device IC to each other. The equipment EQP can further include at least one of an optical system OPT, a control device CTRL, a processing device PRCS, a display device DSPL, and a memory device MMRY. The optical system OPT forms an image on the image sensing device 100 and is formed from, for example, a lens, a shutter, and a mirror. The control device CTRL controls the operation of the image sensing device 100 and is a semiconductor device such as an ASIC. The processing device PRCS processes signals output from the image sensing device 100 and is a semiconductor device such as a CPU or an ASIC forming an AFE (Analog Front End) or a DFE (Digital Front End). The display device DSPL is an EL display device or a liquid crystal display device that displays information (images) acquired by the image sensing device 100. The memory device MMRY is a magnetic device or a semiconductor device for storing information (images) acquired by the image sensing device 100. The memory device MMRY is a volatile memory such as an SRAM or a DRAM, or a nonvolatile memory such as a flash memory or a hard disk drive.
A mechanical device MCHN includes a driving unit or propulsion unit such as a motor or an engine. The mechanical device MCHN in the camera can drive the components of the optical system OPT for zooming, focusing, and shutter operations. In the equipment EQP, signals output from the image sensing device 100 are displayed on the display device DSPL and are transmitted externally by a communication device (not shown) included in the equipment EQP. Hence, it is preferable for the equipment EQP to further include the memory device MMRY and the processing device PRCS separately from the storage circuit unit and the calculation circuit unit included in the control circuit 901 and the signal processor 902 of the image sensing device 100. - As described above, the
image sensing device 100 according to this embodiment can track a high-speed moving object. Hence, a camera incorporating the image sensing device 100 is applicable as a monitoring camera, an onboard camera mounted in transportation equipment such as an automobile or an airplane, or the like. A case in which the camera incorporating the image sensing device 100 is applied to transportation equipment will be exemplified here. A transportation equipment 2100 is, for example, an automobile including an onboard camera 2101 shown in FIGS. 10A and 10B. FIG. 10A schematically shows the outer appearance and the main internal structure of the transportation equipment 2100. The transportation equipment 2100 includes an image sensing device 2102, an image sensing system ASIC (Application Specific Integrated Circuit) 2103, a warning device 2112, and a control device 2113. - The above-described
image sensing device 100 is used for the image sensing device 2102. The warning device 2112 warns a driver when it receives an abnormality signal from an image sensing system, a vehicle sensor, a control unit, or the like. The control device 2113 comprehensively controls the operations of the image sensing system, the vehicle sensor, the control unit, and the like. Note that the transportation equipment 2100 need not include the control device 2113. In this case, the image sensing system, the vehicle sensor, and the control unit each can individually include a communication interface and exchange control signals via a communication network (for example, the CAN standard). -
FIG. 10B is a block diagram showing the system arrangement of the transportation equipment 2100. The transportation equipment 2100 includes a first image sensing device 2102 and a second image sensing device 2102. That is, the onboard camera according to this embodiment is a stereo camera. An object image is formed by an optical unit 2114 on each image sensing device 2102. An image signal output from each image sensing device 2102 is processed by an image pre-processor 2115 and transmitted to the image sensing system ASIC 2103. The image pre-processor 2115 performs processing such as S-N calculation and synchronization signal addition. The above-described signal processor 902 corresponds to at least a part of the image pre-processor 2115 and the image sensing system ASIC 2103. - The image
sensing system ASIC 2103 includes an image processor 2104, a memory 2105, an optical distance measuring unit 2106, a parallax calculator 2107, an object recognition unit 2108, an abnormality detection unit 2109, and an external interface (I/F) unit 2116. The image processor 2104 generates an image signal by processing signals output from the pixels of each image sensing device 2102. The image processor 2104 also performs correction of image signals and interpolation of abnormal pixels. The memory 2105 temporarily holds the image signal. The memory 2105 may also store the position of a known abnormal pixel in the image sensing device 2102. The optical distance measuring unit 2106 uses the image signal to perform focusing or distance measurement of an object. The parallax calculator 2107 performs object collation (stereo matching) of parallax images. The object recognition unit 2108 analyzes image signals to recognize objects such as transportation equipment, a person, a road sign, a road, and the like. The abnormality detection unit 2109 detects a fault or an erroneous operation of the image sensing device 2102. When detecting a fault or an erroneous operation, the abnormality detection unit 2109 transmits a signal indicating the detection of an abnormality to the control device 2113. The external I/F unit 2116 mediates the exchange of information between the units of the image sensing system ASIC 2103 and the control device 2113 or the various kinds of control units. - The
transportation equipment 2100 includes a vehicle information acquisition unit 2110 and a driving support unit 2111. The vehicle information acquisition unit 2110 includes vehicle sensors such as a speed/acceleration sensor, an angular velocity sensor, a steering angle sensor, a ranging radar, and a pressure sensor. - The driving
support unit 2111 includes a collision determination unit. The collision determination unit determines whether there is a possibility of collision with an object based on the pieces of information from the optical distance measuring unit 2106, the parallax calculator 2107, and the object recognition unit 2108. The optical distance measuring unit 2106 and the parallax calculator 2107 are examples of distance information acquisition units that acquire distance information of a target object. That is, the distance information is information related to the parallax, the defocus amount, the distance to the target object, and the like. The collision determination unit may use any one of these pieces of distance information to determine the possibility of a collision. Each distance information acquisition unit may be implemented by dedicated hardware or a software module. - An example in which the driving
support unit 2111 controls the transportation equipment 2100 so that it does not collide with another object has been described. However, the driving support unit is also applicable to automatic driving control for following another vehicle or for not drifting out of a lane. - The
transportation equipment 2100 also includes driving devices, which are used for movement or for supporting movement, such as an airbag, an accelerator, a brake, a steering system, a transmission, an engine, a motor, wheels, and propellers. The transportation equipment 2100 also includes control units for these devices. Each control unit controls the corresponding driving device based on a control signal from the control device 2113. - The image sensing system used in the embodiment is applicable not only to an automobile or a railway vehicle but also to, for example, transportation equipment such as a ship, an airplane, or an industrial robot. The image sensing system is also applicable not only to transportation equipment but also widely to equipment using object recognition, such as an ITS (Intelligent Transportation System).
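The chain from parallax to collision determination described above can be sketched with the standard pinhole relation Z = f·B/d (distance from disparity) followed by a time-to-contact check. The focal length, baseline, and time margin below are illustrative assumptions, not values from the patent.

```python
def distance_from_parallax(parallax_px, focal_px=1400.0, baseline_m=0.3):
    """Distance of a stereo-matched point from its disparity in pixels."""
    if parallax_px <= 0:
        raise ValueError("disparity must be positive for a finite distance")
    return focal_px * baseline_m / parallax_px

def collision_possible(distance_m, closing_speed_mps, margin_s=2.0):
    """True if the object would be reached within the safety margin."""
    if closing_speed_mps <= 0:       # object is stationary or receding
        return False
    return distance_m / closing_speed_mps < margin_s

d = distance_from_parallax(21.0)     # 1400 * 0.3 / 21 = 20 m
risk = collision_possible(d, 15.0)   # about 1.3 s to contact
```

As the description notes, any one piece of distance information (parallax, defocus amount, or distance) could feed such a determination; only the first function would change.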
- While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
- This application claims the benefit of Japanese Patent Application No. 2017-156884, filed Aug. 15, 2017, which is hereby incorporated by reference herein in its entirety.
Claims (19)
1-20. (canceled)
21. An image sensing device comprising a first semiconductor chip which includes a pixel array in which a plurality of pixels are arranged in a matrix, and a second semiconductor chip which includes a plurality of readout circuits configured to read out signals from the pixel array,
wherein the first semiconductor chip and the second semiconductor chip are stacked on each other,
the plurality of pixels includes a plurality of first-type pixels and a plurality of second-type pixels,
a number of the plurality of first-type pixels is less than a number of the plurality of second-type pixels,
the image sensing device performs a first image sensing operation and a second image sensing operation,
in the first image sensing operation, readout of a first signal from the plurality of first-type pixels by the plurality of readout circuits is performed,
in the second image sensing operation, readout of a second signal from the plurality of second-type pixels by the plurality of readout circuits is performed, and
a control parameter for reading out the second signal from the plurality of second-type pixels in the second image sensing operation is determined based on the first signal read out from the plurality of first-type pixels in the first image sensing operation.
22. The device according to claim 21, wherein the control parameter comprises at least one of an exposure time of accumulating charges, a gain of the plurality of readout circuits, conversion resolution of the plurality of readout circuits, and a signal readout region of the pixel array in the second image sensing operation.
23. The device according to claim 21, wherein the control parameter comprises an exposure time of accumulating charges in the second image sensing operation.
24. The device according to claim 23, wherein at least one of the plurality of first-type pixels is arranged in a first pixel row, and
none of the plurality of first-type pixels is arranged in a second pixel row adjacent to the first pixel row.
25. The device according to claim 23, wherein at least one of the plurality of first-type pixels is arranged in a first pixel column, and
none of the plurality of first-type pixels is arranged in a second pixel column adjacent to the first pixel column.
26. The device according to claim 23, wherein the control parameter is determined for at least one of not less than one row of the pixel array and not less than one column of the pixel array.
27. The device according to claim 23, wherein a signal is read out from the plurality of second-type pixels in the second image sensing operation.
28. The device according to claim 23, wherein
in each of the pixel rows to which the plurality of first-type pixels belong, one of the plurality of first-type pixels is arranged for every M pixels (M is an integer not less than 2),
in the pixel array, a pixel row to which the plurality of first-type pixels belong is arranged for every N rows (N is an integer not less than 2), and
signal readout is performed simultaneously from first-type pixels belonging to L continuous pixel rows (L is an integer not less than 2) among the pixel rows to which the plurality of first-type pixels belong.
29. The device according to claim 28, further comprising, in each of the pixel rows to which the plurality of first-type pixels belong, a first signal line group configured to control the first-type pixels belonging to the pixel row among the plurality of first-type pixels, and a second signal line group configured to control pixels other than the plurality of first-type pixels.
30. The device according to claim 29, further comprising a third signal line group configured to control pixels included in each of a plurality of pixel rows which do not include the plurality of first-type pixels,
wherein a total signal line count of the first signal line group and the second signal line group is equal to a signal line count of the third signal line group.
31. The device according to claim 21, wherein an image sensing operation of performing one first image sensing operation and one second image sensing operation is repeated, and
in the image sensing operation, after signals generated by the first image sensing operation are read out by the plurality of readout circuits, signals generated by the second image sensing operation are read out by the plurality of readout circuits.
32. The device according to claim 31, further comprising a controller configured to determine the control parameter,
wherein the first image sensing operation comprises a first preliminary image sensing operation and a second preliminary image sensing operation which is performed after the first preliminary image sensing operation,
the plurality of first-type pixels comprise a first preliminary image sensing pixel which is used in the first preliminary image sensing operation and a second preliminary image sensing pixel which is different from the first preliminary image sensing pixel and is used in the second preliminary image sensing operation,
the controller determines, based on a signal generated by the first preliminary image sensing pixel in the first preliminary image sensing operation, a preliminary image sensing parameter to be used for accumulating charges in the second preliminary image sensing operation and controlling the second preliminary image sensing operation,
the controller causes the plurality of readout circuits to read out each signal generated by the second preliminary image sensing pixel in the second preliminary image sensing operation, and
the controller determines the control parameter by using the signal generated by the second preliminary image sensing pixel.
33. The device according to claim 32, wherein in the pixel array, a region in which the first preliminary image sensing pixel is arranged comprises a region in which the second preliminary image sensing pixel is arranged.
34. The device according to claim 33, wherein in the pixel array, a region in which the second preliminary image sensing pixel is arranged comprises a region in which a pixel, of the plurality of pixels, whose signal is to be read out in the second image sensing operation is arranged.
35. The device according to claim 34, wherein the preliminary image sensing parameter comprises at least one of an exposure time of accumulating charges, a gain of the plurality of readout circuits, conversion resolution of the plurality of readout circuits, and a signal readout region of the pixel array in the second preliminary image sensing operation.
36. The device according to claim 21, wherein a scanning time of the first image sensing operation is shorter than a scanning time of the second image sensing operation.
37. A camera comprising:
an image sensing device defined in claim 21; and
a control device configured to control an operation of the image sensing device.
38. A transportation equipment that includes a driving device, comprising:
an image sensing device defined in claim 21; and
a control device configured to control the driving device based on information acquired by the image sensing device.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/061,710 US20210021779A1 (en) | 2017-08-15 | 2020-10-02 | Image sensing device, camera, and transportation equipment |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017156884A JP7057635B2 (en) | 2017-08-15 | 2017-08-15 | Imaging equipment, cameras and transportation equipment |
JP2017-156884 | 2017-08-15 | ||
US16/100,510 US10834350B2 (en) | 2017-08-15 | 2018-08-10 | Image sensing device, camera, and transportation equipment |
US17/061,710 US20210021779A1 (en) | 2017-08-15 | 2020-10-02 | Image sensing device, camera, and transportation equipment |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/100,510 Continuation US10834350B2 (en) | 2017-08-15 | 2018-08-10 | Image sensing device, camera, and transportation equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210021779A1 true US20210021779A1 (en) | 2021-01-21 |
Family
ID=65361293
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/100,510 Active 2038-09-25 US10834350B2 (en) | 2017-08-15 | 2018-08-10 | Image sensing device, camera, and transportation equipment |
US17/061,710 Abandoned US20210021779A1 (en) | 2017-08-15 | 2020-10-02 | Image sensing device, camera, and transportation equipment |
Country Status (2)
Country | Link |
---|---|
US (2) | US10834350B2 (en) |
JP (1) | JP7057635B2 (en) |
Family Cites Families (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3524391B2 (en) | 1998-08-05 | 2004-05-10 | キヤノン株式会社 | Imaging device and imaging system using the same |
US6956605B1 (en) | 1998-08-05 | 2005-10-18 | Canon Kabushiki Kaisha | Image pickup apparatus |
JP2002314866A (en) | 2001-04-17 | 2002-10-25 | Fuji Photo Film Co Ltd | Image-processing apparatus and power control method in photograph mode |
JP2002320235A (en) | 2001-04-19 | 2002-10-31 | Fujitsu Ltd | Cmos image sensor for generating reduced image signal by suppressing decrease in space resolution |
JP4084991B2 (en) | 2002-11-29 | 2008-04-30 | 富士通株式会社 | Video input device |
JP2004266369A (en) | 2003-02-21 | 2004-09-24 | Sony Corp | Solid-state image pickup unit and its driving method |
US20050094012A1 (en) | 2003-09-04 | 2005-05-05 | Yuichi Gomi | Solid-state image sensing apparatus |
JP4334950B2 (en) | 2003-09-04 | 2009-09-30 | オリンパス株式会社 | Solid-state imaging device |
JP4385844B2 (en) | 2004-04-23 | 2009-12-16 | ソニー株式会社 | Solid-state imaging device and driving method of solid-state imaging device |
JP4510523B2 (en) | 2004-06-02 | 2010-07-28 | キヤノン株式会社 | Solid-state imaging device and imaging system |
KR20090091646A (en) * | 2006-12-18 | 2009-08-28 | 소니 가부시끼 가이샤 | Imaging device and method, recording device and method, and reproduction device and method |
JP4981623B2 (en) | 2007-11-01 | 2012-07-25 | キヤノン株式会社 | Solid-state imaging device and driving method thereof, camera and copying machine |
JP5224942B2 (en) | 2008-06-30 | 2013-07-03 | キヤノン株式会社 | Solid-state imaging device |
JP5311954B2 (en) | 2008-09-30 | 2013-10-09 | キヤノン株式会社 | Driving method of solid-state imaging device |
JP5225145B2 (en) | 2009-02-23 | 2013-07-03 | キヤノン株式会社 | Solid-state imaging device |
JP5322696B2 (en) | 2009-02-25 | 2013-10-23 | キヤノン株式会社 | Solid-state imaging device and driving method thereof |
JP5233828B2 (en) | 2009-05-11 | 2013-07-10 | ソニー株式会社 | Solid-state imaging device, driving method of solid-state imaging device, and electronic apparatus |
JP5495701B2 (en) | 2009-10-07 | 2014-05-21 | キヤノン株式会社 | Solid-state imaging device |
JP5595014B2 (en) | 2009-11-09 | 2014-09-24 | キヤノン株式会社 | Imaging device |
JP2013120956A (en) | 2011-12-06 | 2013-06-17 | Canon Inc | Imaging apparatus |
JPWO2013164915A1 (en) * | 2012-05-02 | 2015-12-24 | 株式会社ニコン | Imaging device |
JP6166562B2 (en) | 2013-03-21 | 2017-07-19 | キヤノン株式会社 | Imaging element, its driving method, and imaging device |
JP6149572B2 (en) | 2013-07-25 | 2017-06-21 | ソニー株式会社 | Image sensor, control method, and electronic device |
JP2015115637A (en) | 2013-12-09 | 2015-06-22 | 株式会社東芝 | Solid-state imaging apparatus |
JP6274898B2 (en) | 2014-02-17 | 2018-02-07 | キヤノン株式会社 | Solid-state imaging device and camera |
JP6341688B2 (en) | 2014-02-25 | 2018-06-13 | キヤノン株式会社 | Solid-state imaging device and imaging system |
JP6541347B2 (en) | 2014-03-27 | 2019-07-10 | キヤノン株式会社 | Solid-state imaging device and imaging system |
JP6548391B2 (en) | 2014-03-31 | 2019-07-24 | キヤノン株式会社 | Photoelectric conversion device and imaging system |
JP6368128B2 (en) | 2014-04-10 | 2018-08-01 | キヤノン株式会社 | Imaging device and imaging apparatus |
JP6620395B2 (en) | 2014-08-28 | 2019-12-18 | 株式会社ニコン | Imaging device |
JP6393087B2 (en) | 2014-06-16 | 2018-09-19 | キヤノン株式会社 | Imaging device and imaging apparatus |
US20160050377A1 (en) * | 2014-08-12 | 2016-02-18 | Samsung Electronics Co., Ltd. | Active pixel sensors and image devices having stacked pixel structure supporting global shutter |
JP2016208402A (en) | 2015-04-27 | 2016-12-08 | ソニー株式会社 | Solid state imaging device and driving method therefor, and electronic equipment |
US10003761B2 (en) | 2015-09-10 | 2018-06-19 | Canon Kabushiki Kaisha | Imaging device having multiple analog-digital conversion circuits that perform multiple ad conversions for a singular one of a pixel signal |
JP6674224B2 (en) | 2015-10-22 | 2020-04-01 | キヤノン株式会社 | Solid-state imaging device |
CN109155826A (en) * | 2016-06-09 | 2019-01-04 | 索尼公司 | Control device and control method |
JP2018082261A (en) | 2016-11-15 | 2018-05-24 | キヤノン株式会社 | Imaging device |
- 2017-08-15 JP JP2017156884A patent/JP7057635B2/en active Active
- 2018-08-10 US US16/100,510 patent/US10834350B2/en active Active
- 2020-10-02 US US17/061,710 patent/US20210021779A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
US10834350B2 (en) | 2020-11-10 |
US20190058842A1 (en) | 2019-02-21 |
JP7057635B2 (en) | 2022-04-20 |
JP2019036843A (en) | 2019-03-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210021779A1 (en) | Image sensing device, camera, and transportation equipment | |
US10110835B2 (en) | Imaging apparatus, imaging system, and moving object | |
EP3627829B1 (en) | Photoelectric conversion device and image sensing system | |
US11736813B2 (en) | Imaging device and equipment | |
US11800253B2 (en) | Imaging device and imaging system | |
US11503231B2 (en) | Imaging device, imaging system, and moving body | |
US11810930B2 (en) | Imaging device, imaging system, and moving body | |
US10587831B2 (en) | Solid-state imaging device, imaging system, and moving body | |
CN110611777B (en) | Image pickup apparatus, image pickup system, and mobile device | |
US20190174085A1 (en) | Solid-state imaging device and signal processing device | |
US11490041B2 (en) | Photoelectric converter and imaging system | |
US10841519B2 (en) | Photoelectric conversion apparatus, equipment, and driving method of photoelectric conversion apparatus | |
US10965896B2 (en) | Photoelectric conversion device, moving body, and signal processing device | |
JP2024011562A (en) | Photoelectric conversion device and system | |
US11140348B2 (en) | AD conversion device, imaging device, imaging system, and mobile apparatus | |
US11575868B2 (en) | Photoelectric conversion apparatus, method of driving photoelectric conversion apparatus, photoelectric conversion system, and moving body | |
US11394903B2 (en) | Imaging apparatus, imaging system, and moving body | |
US20230276150A1 (en) | Photoelectric conversion device and method of driving photoelectric conversion device | |
EP4057622A2 (en) | Device, system, moving body, and substrate | |
US20220208810A1 (en) | Photoelectric conversion apparatus, photoelectric conversion system, moving body, and semiconductor substrate | |
JP2024078503A (en) | Photoelectric conversion devices and equipment | |
CN116347258A (en) | Photoelectric conversion apparatus, photoelectric conversion system, and moving body |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |