WO2018051809A1 - Imaging device and electronic apparatus - Google Patents
Imaging device and electronic apparatus
- Publication number
- WO2018051809A1 (PCT/JP2017/031539)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- imaging
- signal processing
- output
- unit
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
- G01C3/06—Use of electric means to obtain final indication
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/0464—Convolutional networks [CNN, ConvNet]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01L—SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
- H01L27/00—Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
- H01L27/14—Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
- H01L27/144—Devices controlled by radiation
- H01L27/146—Imager structures
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/617—Upgrading or updating of programs or applications for camera control
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/81—Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/84—Camera processing pipelines; Components thereof for processing colour signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/40—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
- H04N25/44—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by partially reading an SSIS array
- H04N25/443—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by partially reading an SSIS array by reading pixels from selected 2D regions of the array, e.g. for windowing or digital zooming
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/71—Charge-coupled device [CCD] sensors; Charge-transfer registers specially adapted for CCD sensors
- H04N25/75—Circuitry for providing, modifying or processing image signals from the pixel array
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/76—Addressed sensors, e.g. MOS or CMOS sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/79—Arrangements of circuitry being divided between different or multiple substrates, chips or circuit boards, e.g. stacked image sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/91—Television signal processing therefor
- H04N5/917—Television signal processing therefor for bandwidth reduction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
Definitions
- The present technology relates to an imaging device and an electronic apparatus, and more particularly to an imaging device and an electronic apparatus that allow, for example, an imaging device that outputs information required by a user to be configured in a small size.
- As an imaging device that captures images, a device in which a sensor chip, a memory chip, and a DSP (Digital Signal Processor) chip are connected in parallel with a plurality of bumps has been proposed (see, for example, Patent Document 1).
- In addition to cases where the image captured by the imaging device itself is required, the user of the imaging device may instead need information (metadata) obtained from the image.
- the present technology has been made in view of such a situation, and enables an imaging apparatus that outputs information required by a user to be configured in a small size.
- An imaging device or an electronic apparatus of the present technology includes: an imaging unit that captures an image and in which a plurality of pixels are arranged two-dimensionally; a signal processing unit that performs signal processing using a captured image output from the imaging unit; an output I/F that outputs the signal processing result of the signal processing and the captured image to the outside; and an output control unit that performs output control to selectively output the signal processing result of the signal processing and the captured image to the outside from the output I/F.
- The imaging device is configured as a single chip including the output control unit, and the electronic apparatus incorporates such an imaging device.
- In the imaging device and the electronic apparatus, signal processing is performed using the captured image output by the imaging unit, in which a plurality of pixels are arranged two-dimensionally, and the signal processing result of that signal processing and the captured image are selectively output from the output I/F, which outputs the signal processing result and the captured image to the outside.
- the imaging device may be an independent device or an internal block constituting one device.
- the imaging device that outputs information required by the user can be configured in a small size.
- A perspective view illustrating an outline of an external configuration example of the imaging apparatus 2.
- A diagram explaining an outline of an example of processing of the imaging apparatus 2 when recognition processing is performed as signal processing of the DSP 32.
- Timing charts explaining first through fourth examples of processing timing of the imaging apparatus 2 when recognition processing is performed as signal processing of the DSP 32.
- A diagram explaining an outline of an example of processing of the imaging apparatus 2 when fusion processing is performed as signal processing of the DSP 32.
- Timing charts explaining first through third examples of processing timing of the imaging apparatus 2 when SLAM processing is performed as signal processing of the DSP 32.
- A block diagram illustrating another configuration example of the imaging apparatus 2.
- A diagram illustrating usage examples of the imaging apparatus 2.
- A block diagram illustrating a schematic configuration example of a vehicle control system, and an explanatory diagram illustrating an example of installation positions of imaging units.
- FIG. 1 is a block diagram illustrating a configuration example of an embodiment of a digital camera to which the present technology is applied.
- the digital camera can capture both still images and moving images.
- the digital camera includes an optical system 1, an imaging device 2, a memory 3, a signal processing unit 4, an output unit 5, and a control unit 6.
- the optical system 1 has, for example, a zoom lens (not shown), a focus lens, a diaphragm, and the like, and makes external light incident on the imaging device 2.
- The imaging device 2 is, for example, a one-chip CMOS (Complementary Metal Oxide Semiconductor) image sensor; it receives incident light from the optical system 1, performs photoelectric conversion, and outputs image data corresponding to the incident light.
- the imaging device 2 performs, for example, recognition processing for recognizing a predetermined recognition target and other signal processing using the image data or the like, and outputs a signal processing result of the signal processing.
- the memory 3 temporarily stores image data and the like output from the imaging device 2.
- The signal processing unit 4 performs, as camera signal processing using the image data stored in the memory 3, processing such as noise removal and white balance adjustment as necessary, and supplies the processed image data to the output unit 5.
- the output unit 5 outputs the image data from the signal processing unit 4 and the signal processing result stored in the memory 3.
- the output unit 5 has a display (not shown) made of, for example, liquid crystal, and displays an image corresponding to the image data from the signal processing unit 4 as a so-called through image.
- The output unit 5 also includes a driver (not shown) that drives a recording medium such as a semiconductor memory, a magnetic disk, or an optical disc, and records the image data from the signal processing unit 4 and the signal processing result stored in the memory 3 on the recording medium.
- Furthermore, the output unit 5 functions as an I/F (Interface) that exchanges data with external devices, and transmits the image data from the signal processing unit 4, the image data recorded on the recording medium, and the like to external devices.
- the control unit 6 controls each block constituting the digital camera in accordance with a user operation or the like.
- the imaging device 2 captures an image. That is, the imaging device 2 receives incident light from the optical system 1 and performs photoelectric conversion, and acquires and outputs image data corresponding to the incident light.
- the image data output from the imaging device 2 is supplied to and stored in the memory 3.
- the image data stored in the memory 3 is subjected to camera signal processing by the signal processing unit 4, and the resulting image data is supplied to the output unit 5 and output.
- the imaging device 2 performs signal processing using an image (data) obtained by imaging, and outputs a signal processing result of the signal processing.
- the signal processing result output from the imaging device 2 is stored in the memory 3, for example.
- That is, in the imaging device 2, the output of the image itself obtained by imaging and the output of the signal processing result of signal processing using that image or the like are performed selectively.
- FIG. 2 is a block diagram illustrating a configuration example of the imaging device 2 of FIG. 1.
- the imaging apparatus 2 includes an imaging block 20 and a signal processing block 30.
- the imaging block 20 and the signal processing block 30 are electrically connected by connection lines (internal buses) CL1, CL2, and CL3.
- the imaging block 20 includes an imaging unit 21, an imaging processing unit 22, an output control unit 23, an output I / F (Interface) 24, and an imaging control unit 25, and captures an image.
- the imaging unit 21 includes a plurality of pixels arranged in two dimensions.
- the imaging unit 21 is driven by the imaging processing unit 22 to capture an image.
- the imaging unit 21 receives incident light from the optical system 1 in each pixel, performs photoelectric conversion, and outputs an analog image signal corresponding to the incident light.
- The size of the image (signal) output from the imaging unit 21 can be selected from a plurality of sizes such as 12M (3968 × 2976) pixels and VGA (Video Graphics Array) size (640 × 480 pixels).
- For the image output from the imaging unit 21, it is also possible to select, for example, whether it is an RGB (red, green, blue) color image or a luminance-only monochrome image.
- Under the control of the imaging control unit 25, the imaging processing unit 22 performs imaging processing related to the capture of images, such as driving the imaging unit 21, AD (Analog-to-Digital) conversion of the analog image signals output from the imaging unit 21, and imaging signal processing.
- Examples of the imaging signal processing include: processing that obtains the brightness of each predetermined small region of the image output from the imaging unit 21 by calculating the average pixel value of each such region; processing that converts the image output from the imaging unit 21 into an HDR (High Dynamic Range) image; defect correction; and development.
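- As a purely illustrative sketch of the small-region brightness calculation described above (the block size, function name, and NumPy implementation are assumptions; the patent does not specify an algorithm):

```python
import numpy as np

def region_brightness(image: np.ndarray, block: int = 64) -> np.ndarray:
    """Average pixel value of each block x block region of a 2-D image.

    Trailing rows/columns that do not fill a whole block are ignored
    for simplicity (an assumption, not from the patent).
    """
    h, w = image.shape[:2]
    h, w = h - h % block, w - w % block
    tiles = image[:h, :w].reshape(h // block, block, w // block, block)
    return tiles.mean(axis=(1, 3))  # one brightness value per small region

# Example: per-region brightness of a synthetic VGA monochrome frame
frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
print(region_brightness(frame).shape)  # (7, 10): 64x64-pixel regions
```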
- The imaging processing unit 22 outputs, as the captured image, the digital image signal (here, for example, a 12M pixel or VGA size image) obtained by AD conversion of the analog image signal output from the imaging unit 21.
- the captured image output by the imaging processing unit 22 is supplied to the output control unit 23 and is also supplied to the image compression unit 35 of the signal processing block 30 via the connection line CL2.
- the output control unit 23 is supplied with the captured image from the imaging processing unit 22 and the signal processing result of the signal processing using the captured image or the like from the signal processing block 30 via the connection line CL3.
- The output control unit 23 performs output control for selectively outputting the captured image from the imaging processing unit 22 and the signal processing result from the signal processing block 30 to the outside (for example, the memory 3 in FIG. 1) from the single output I/F 24.
- That is, the output control unit 23 selects either the captured image from the imaging processing unit 22 or the signal processing result from the signal processing block 30, and supplies it to the output I/F 24.
- the output I / F 24 is an I / F that outputs the captured image supplied from the output control unit 23 and the signal processing result to the outside.
- As the output I/F 24, for example, a relatively high-speed parallel I/F such as MIPI (Mobile Industry Processor Interface) can be adopted.
- From the output I/F 24, the captured image from the imaging processing unit 22 or the signal processing result from the signal processing block 30 is output to the outside in accordance with the output control of the output control unit 23. Therefore, for example, when only the signal processing result from the signal processing block 30 is required externally and the captured image itself is not needed, only the signal processing result can be output, reducing the amount of data output to the outside from the output I/F 24.
- Furthermore, since the signal processing block 30 performs the signal processing that yields the result required externally and outputs that result from the output I/F 24, signal processing no longer has to be performed externally, and the load on external blocks can be reduced.
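- The selection behavior can be pictured with the following sketch (illustrative only; class and field names are assumptions, and the actual output control unit 23 is on-chip hardware, not software):

```python
from dataclasses import dataclass
from typing import Iterator, Optional

@dataclass
class Frame:    # stands in for a captured image from the imaging processing unit 22
    pixels: bytes

@dataclass
class Result:   # stands in for a signal processing result from the signal processing block 30
    data: dict

class OutputControl:
    """Decides what is driven onto the single output I/F 24."""
    def __init__(self, output_image: bool, output_result: bool):
        self.output_image = output_image    # output control information, e.g.
        self.output_result = output_result  # stored in the register group 27

    def select(self, image: Optional[Frame], result: Optional[Result]) -> Iterator:
        # Emit only what the external side asked for, reducing I/F traffic.
        if self.output_result and result is not None:
            yield result
        if self.output_image and image is not None:
            yield image

# Example: externally only the recognition result is needed, not the image.
ctrl = OutputControl(output_image=False, output_result=True)
for out in ctrl.select(Frame(b"..."), Result({"face": True})):
    print(type(out).__name__)  # only "Result" crosses the output I/F
```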
- the imaging control unit 25 includes a communication I / F 26 and a register group 27.
- The communication I/F 26 is a first communication I/F, for example a serial communication I/F such as I2C (Inter-Integrated Circuit), and exchanges necessary information, such as the information to be read from and written to the register group 27, with the outside (for example, the control unit 6 in FIG. 1).
- the register group 27 has a plurality of registers, and stores imaging information related to imaging of an image by the imaging unit 21 and other various information.
- the register group 27 stores imaging information received from the outside in the communication I / F 26 and results of imaging signal processing in the imaging processing unit 22 (for example, brightness for each small area of the captured image).
- The imaging information stored in the register group 27 includes, for example, information representing ISO sensitivity (analog gain during AD conversion in the imaging processing unit 22), exposure time (shutter speed), frame rate, focus, shooting mode, cutout range, and the like.
- the shooting modes include, for example, a manual mode in which the exposure time and frame rate are manually set, and an automatic mode in which the exposure time and frame rate are automatically set according to the scene.
- the automatic mode includes a mode corresponding to various shooting scenes such as a night view and a human face.
- the cutout range represents a range cut out from an image output by the imaging unit 21 when a part of an image output by the imaging unit 21 is cut out and output as a captured image in the imaging processing unit 22.
- By specifying a cutout range, it is possible, for example, to cut out from the image output by the imaging unit 21 only the region in which a person appears.
- the imaging control unit 25 controls the imaging processing unit 22 according to the imaging information stored in the register group 27, thereby controlling the imaging of the image by the imaging unit 21.
- the register group 27 can store output control information related to output control in the output control unit 23 in addition to imaging information and results of imaging signal processing in the imaging processing unit 22.
- the output control unit 23 can perform output control for selectively outputting the captured image and the signal processing result according to the output control information stored in the register group 27.
- The imaging control unit 25 and the CPU 31 of the signal processing block 30 are connected via the connection line CL1, and the CPU 31 can read and write information from and to the register group 27 via the connection line CL1.
- That is, reading and writing of information from and to the register group 27 can be performed not only by the communication I/F 26 but also by the CPU 31.
- The signal processing block 30 includes a CPU (Central Processing Unit) 31, a DSP (Digital Signal Processor) 32, a memory 33, a communication I/F 34, an image compression unit 35, and an input I/F 36, and performs predetermined signal processing using the captured image and the like.
- The blocks constituting the signal processing block 30, from the CPU 31 through the input I/F 36, are connected to each other via a bus and can exchange information as necessary.
- By executing a program stored in the memory 33, the CPU 31 controls the signal processing block 30, reads and writes information from and to the register group 27 of the imaging control unit 25 via the connection line CL1, and performs various other processing.
- For example, by executing a program, the CPU 31 functions as an imaging information calculation unit that calculates imaging information using the signal processing result obtained by the signal processing in the DSP 32, and feeds the new imaging information calculated using the signal processing result back to the register group 27 of the imaging control unit 25 via the connection line CL1 for storage.
- the CPU 31 can control imaging in the imaging unit 21 and imaging signal processing in the imaging processing unit 22 according to the signal processing result of the captured image.
- the imaging information stored in the register group 27 by the CPU 31 can be provided (output) to the outside from the communication I / F 26.
- focus information of the imaging information stored in the register group 27 can be provided from the communication I / F 26 to a focus driver (not shown) that controls the focus.
- By executing a program stored in the memory 33, the DSP 32 functions as a signal processing unit that performs signal processing using the captured image supplied from the imaging processing unit 22 to the signal processing block 30 via the connection line CL2 and the information received from the outside by the input I/F 36.
- The memory 33 includes SRAM (Static Random Access Memory), DRAM (Dynamic RAM), or the like, and stores data necessary for the processing of the signal processing block 30.
- For example, the memory 33 stores programs received from the outside, captured images compressed by the image compression unit 35 and used in the signal processing of the DSP 32, the signal processing results of the signal processing performed by the DSP 32, and the information received by the input I/F 36.
- The communication I/F 34 is a second communication I/F, for example a serial communication I/F such as SPI (Serial Peripheral Interface), and exchanges necessary information, such as the programs executed by the CPU 31 and the DSP 32, with the outside (for example, the memory 3 and the control unit 6 in FIG. 1).
- the communication I / F 34 downloads a program executed by the CPU 31 and the DSP 32 from the outside, and supplies the program to the memory 33 for storage.
- the communication I / F 34 can exchange arbitrary data in addition to programs with the outside.
- the communication I / F 34 can output a signal processing result obtained by signal processing in the DSP 32 to the outside.
- the communication I / F 34 outputs information in accordance with an instruction from the CPU 31 to an external device, and thereby can control the external device in accordance with the instruction from the CPU 31.
- the signal processing result obtained by the signal processing in the DSP 32 is output to the outside from the communication I / F 34 and can be written into the register group 27 of the imaging control unit 25 by the CPU 31.
- the signal processing result written in the register group 27 can be output to the outside from the communication I / F 26. The same applies to the processing result of the processing performed by the CPU 31.
- the captured image is supplied to the image compression unit 35 from the imaging processing unit 22 via the connection line CL2.
- the image compression unit 35 performs a compression process for compressing the captured image, and generates a compressed image having a data amount smaller than that of the captured image.
- the compressed image generated by the image compression unit 35 is supplied to and stored in the memory 33 via the bus.
- the signal processing in the DSP 32 can be performed by using the compressed image generated from the captured image by the image compression unit 35 in addition to the captured image itself. Since the compressed image has a smaller amount of data than the captured image, it is possible to reduce the load of signal processing in the DSP 32 and save the storage capacity of the memory 33 that stores the compressed image.
- As the compression processing in the image compression unit 35, for example, scale-down that converts a 12M (3968 × 2976) pixel captured image into a VGA size image can be performed.
- As the compression processing, YUV conversion that converts an RGB image into a YUV image can also be performed, for example.
- the image compression unit 35 can be realized by software, or can be realized by dedicated hardware.
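- A minimal software sketch of this compression processing, combining scale-down (12M pixels to VGA) with YUV conversion that keeps only the luminance plane (nearest-neighbour sampling and BT.601 luma weights are illustrative assumptions; the patent does not specify the algorithms):

```python
import numpy as np

def compress(rgb: np.ndarray, out_h: int = 480, out_w: int = 640) -> np.ndarray:
    """Scale a 12M (2976 x 3968) RGB frame down to VGA and keep only luminance."""
    h, w, _ = rgb.shape
    ys = np.arange(out_h) * h // out_h          # scale-down: pick source rows
    xs = np.arange(out_w) * w // out_w          # ... and source columns
    small = rgb[ys[:, None], xs[None, :]]
    # YUV conversion, retaining only the Y (luminance) plane -> monochrome
    y = 0.299 * small[..., 0] + 0.587 * small[..., 1] + 0.114 * small[..., 2]
    return y.astype(np.uint8)

frame_12m = np.zeros((2976, 3968, 3), dtype=np.uint8)  # 12M pixel colour frame
print(compress(frame_12m).shape)                       # (480, 640): VGA, monochrome
```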
- the input I / F 36 is an I / F that receives information from the outside.
- the input I / F 36 receives an output (external sensor output) of the external sensor from an external sensor, and supplies the output to the memory 33 via the bus for storage.
- As the input I/F 36, a parallel I/F such as MIPI (Mobile Industry Processor Interface) can be adopted, as with the output I/F 24.
- As the external sensor, for example, a distance sensor that senses information related to distance can be employed. Alternatively, as the external sensor, an image sensor that senses light and outputs an image corresponding to that light, that is, an image sensor separate from the imaging device 2, can be employed.
- In addition to using the captured image (or a compressed image generated from it), the DSP 32 can perform signal processing using the external sensor output that the input I/F 36 receives from such an external sensor and that is stored in the memory 33.
- In the one-chip imaging device 2 configured as described above, signal processing using the captured image obtained by imaging in the imaging unit 21 (or a compressed image generated from it) is performed in the DSP 32, and the signal processing result of that signal processing and the captured image are selectively output from the output I/F 24. Therefore, an imaging device that outputs information required by the user can be configured in a small size.
- Note that when the imaging device 2 does not perform the signal processing of the DSP 32, and therefore outputs captured images but no signal processing result, that is, when the imaging device 2 is configured as an image sensor that merely captures and outputs images, the imaging device 2 can be configured from the imaging block 20 alone, without the output control unit 23.
- FIG. 3 is a perspective view showing an outline of an external configuration example of the imaging device 2 of FIG. 1.
- The imaging device 2 can be configured, for example, as a one-chip semiconductor device with a stacked structure in which a plurality of dies are stacked, as shown in FIG. 3.
- the imaging device 2 is configured by stacking two dies 51 and 52.
- The imaging unit 21 is mounted on the upper die 51, and the imaging processing unit 22 through the imaging control unit 25 and the CPU 31 through the input I/F 36 are mounted on the lower die 52.
- The upper die 51 and the lower die 52 are electrically connected, for example, by forming a through-hole that penetrates the die 51 and reaches the die 52, or by Cu-Cu bonding that directly connects Cu wiring exposed on the lower surface side of the die 51 with Cu wiring exposed on the upper surface side of the die 52.
- In the imaging processing unit 22, for example, a column-parallel AD method or an area AD method can be adopted as the method of AD conversion of the image signals output from the imaging unit 21.
- In the column-parallel AD method, an ADC (AD Converter) is provided for each column of the pixels constituting the imaging unit 21, and the ADC of each column is in charge of AD conversion of the pixel signals of the pixels in that column, so that AD conversion of the image signals of the pixels in each column of one row is performed in parallel.
- When the column-parallel AD method is adopted, a part of the imaging processing unit 22 that performs this AD conversion may be mounted on the upper die 51.
- In the area AD method, the pixels constituting the imaging unit 21 are divided into a plurality of blocks, and an ADC is provided for each block.
- the ADC of each block is responsible for AD conversion of the pixel signals of the pixels of the block, so that AD conversion of the image signals of the pixels of the plurality of blocks is performed in parallel.
- In the area AD method, AD conversion (readout and AD conversion) of image signals can therefore be performed only for necessary pixels among the pixels constituting the imaging unit 21, with a block as the minimum unit.
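- The idea of reading and converting only the blocks that are needed can be sketched in software as follows (illustrative only; real AD conversion is analog circuitry, and the block size is an assumption):

```python
import numpy as np

BLOCK = 32  # pixels per ADC block (an assumed size)

def read_blocks(sensor: np.ndarray, needed: list) -> dict:
    """Read and AD-convert only the listed (block_row, block_col) blocks."""
    out = {}
    for br, bc in needed:
        tile = sensor[br * BLOCK:(br + 1) * BLOCK, bc * BLOCK:(bc + 1) * BLOCK]
        out[(br, bc)] = tile.copy()  # stands in for per-block AD conversion
    return out

# Example: convert only the two blocks covering a region of interest
analog = np.random.rand(480, 640)          # stand-in for analog pixel signals
digital = read_blocks(analog, [(2, 3), (2, 4)])
print(sorted(digital))                     # [(2, 3), (2, 4)]
```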
- Note that the imaging device 2 can also be configured from a single die.
- In FIG. 3, the one-chip imaging device 2 is formed by stacking the two dies 51 and 52, but the one-chip imaging device 2 can also be configured by stacking three or more dies.
- For example, the memory 33 can be mounted on a separate die.
- In an imaging device in which a sensor chip, a memory chip, and a DSP chip are connected in parallel by a plurality of bumps (hereinafter also referred to as a bump-connected imaging device), the thickness increases greatly and the device becomes large.
- In the bump-connected imaging device, it may also be difficult to ensure a sufficient rate at which captured images are output from the imaging processing unit 22 to the output control unit 23, owing to signal degradation at the bump connection portions.
- The imaging device 2 with the stacked structure can avoid both the increase in device size described above and the inability to ensure a sufficient rate between the imaging processing unit 22 and the output control unit 23.
- Therefore, with the imaging device 2 having the stacked structure, an imaging device that outputs information required by the user can be realized in a small size.
- When the information required by the user is the captured image itself, the imaging device 2 can output the captured image.
- When the information required by the user is obtained by signal processing using the captured image, the imaging device 2 can obtain and output that signal processing result as the information required by the user by performing the signal processing in the DSP 32.
- As the signal processing performed in the imaging device 2, that is, the signal processing of the DSP 32, for example, recognition processing for recognizing a predetermined recognition target from the captured image can be employed.
- For example, the imaging device 2 can receive, at the input I/F 36, the output of a distance sensor such as a ToF (Time of Flight) sensor arranged in a predetermined positional relationship with the imaging device 2. In this case, as the signal processing of the DSP 32, it is possible to employ fusion processing that integrates the output of the distance sensor and the captured image to obtain an accurate distance, for example processing that uses the captured image to remove noise from the distance image obtained from the distance sensor output received at the input I/F 36.
- Further, for example, the imaging device 2 can receive, at the input I/F 36, an image output by an image sensor arranged in a predetermined positional relationship with the imaging device 2. In this case, as the signal processing of the DSP 32, for example, self-position estimation processing (SLAM (Simultaneous Localization and Mapping) processing) using the image received at the input I/F 36 and the captured image as a stereo pair can be employed.
- Note that the order of the processing of the imaging device 2 described below can be changed where possible; that is, the processing order of the imaging device 2 is not limited to the order described below.
- In step S11, when recognition processing is to be performed as the signal processing of the DSP 32, the communication I/F 34 downloads the programs (code) to be executed by the CPU 31 and the DSP 32 from the outside and stores them in the memory 33.
- the program executed by the DSP 32 is a recognition processing program that performs recognition processing as signal processing.
- the CPU 31 starts a predetermined process by executing a program stored in the memory 33.
- step S12 the CPU 31 reads the brightness (information) for each small area of the captured image and other necessary information from the register group 27 via the connection line CL1.
- step S13 the CPU 31 performs control related to the compression process such as determining a reduction ratio indicating the degree to which the captured image is scaled down by the compression process of the image compression unit 35.
- step S14 the imaging unit 21 starts imaging, and the imaging processing unit 22 starts outputting the image from the imaging unit 21 as a captured image. Thereby, supply of the captured image from the imaging processing unit 22 to the output control unit 23 and supply from the imaging processing unit 22 to the image compression unit 35 via the connection line CL2 are started.
- the captured image supplied from the imaging processing unit 22 to the output control unit 23 is selected as necessary by the output control unit 23 and is output to the outside from the output I / F 24.
- step S15 the image compression unit 35 starts compression processing of the captured image supplied from the imaging processing unit 22 via the connection line CL2.
- the image output by the imaging unit 21 is also referred to as a captured image.
- the imaging unit 21 can output, for example, a captured image of 12 M pixels or VGA size. Furthermore, the imaging unit 21 can output, for example, an RGB (red, green, blue) color image or a monochrome image as a captured image.
- When the captured image is a full-size 12M pixel image, the image compression unit 35 performs, as the compression processing, for example, scale-down that converts the 12M pixel captured image into a VGA size captured image; when the captured image is already VGA size, the image compression unit 35 does not perform scale-down.
- Further, when the captured image is a color image, the image compression unit 35 performs YUV conversion as the compression processing, for example to convert the color captured image into a monochrome captured image; when the captured image is already monochrome, the image compression unit 35 does not perform YUV conversion.
- Thus, for example, when the captured image is a 12M pixel color image, the image compression unit 35 performs scale-down and YUV conversion as the compression processing, and when the captured image is a VGA size color image, it performs only YUV conversion.
- The image compression unit 35 stores the VGA size monochrome captured image obtained as a result of the compression processing in the memory 33 as the compressed image.
- Note that the imaging device 2 can also be configured without the image compression unit 35; in that case, however, the load on the DSP 32 and the storage capacity required of the memory 33 increase.
- step S21 of FIG. 5 the DSP 32 reads and executes the recognition processing program stored in the memory 33 in step S11, thereby starting recognition processing as signal processing corresponding to the recognition processing program.
- That is, the DSP 32 sequentially reads regions of the compressed image stored in the memory 33 as processing targets, and performs, as signal processing using the compressed image (and therefore the captured image), recognition processing for recognizing a predetermined recognition target (for example, a human face) from the processing target.
- The recognition processing can be performed using a deep learning technique such as a CNN (Convolutional Neural Network).
- In the recognition processing, in addition to detecting a specific subject such as a human face as the recognition target, the scene appearing in the image can also be treated as the recognition target and detected.
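- The patent names CNNs as one usable technique but does not specify a network; the following PyTorch sketch is therefore only an illustration of a small face/no-face classifier of the kind that could run on a VGA size monochrome compressed image (all layer sizes are assumptions):

```python
import torch
import torch.nn as nn

class TinyRecognizer(nn.Module):
    """Illustrative CNN: monochrome VGA input -> {no face, face} scores."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, 3, stride=2, padding=1), nn.ReLU(),    # 480x640 -> 240x320
            nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU(),   # -> 120x160
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # -> 60x80
            nn.AdaptiveAvgPool2d(1),                               # global average pool
        )
        self.classifier = nn.Linear(32, 2)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

net = TinyRecognizer().eval()
compressed = torch.rand(1, 1, 480, 640)   # VGA monochrome compressed image
with torch.no_grad():
    print(net(compressed).softmax(-1))    # e.g. tensor([[0.49, 0.51]])
```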
- step S22 the DSP 32 supplies the result of the recognition process to the memory 33 as a signal processing result to be stored.
- the result of the recognition process includes, for example, information such as whether or not the recognition target is detected and the detection position where the recognition target is detected.
- Note that, in the recognition processing, gradation conversion of the compressed image, for example such that the average brightness of the compressed image becomes a predetermined fixed value, can be performed. Such gradation conversion can be performed using the brightness of each small region of the captured image that the CPU 31 reads from the register group 27 in step S12.
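- A one-function sketch of such gradation conversion (the gain-based rule and the target value of 128 are assumptions):

```python
import numpy as np

def normalize_brightness(img: np.ndarray, target_mean: float = 128.0) -> np.ndarray:
    """Scale pixel values so the image's average brightness hits a fixed value."""
    gain = target_mean / max(float(img.mean()), 1e-6)  # avoid division by zero
    return np.clip(img.astype(np.float32) * gain, 0, 255).astype(np.uint8)

dark = np.full((480, 640), 40, dtype=np.uint8)
print(normalize_brightness(dark).mean())  # ~128: average brightness now fixed
```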
- In step S31 of FIG. 6, the CPU 31 reads the recognition result stored in the memory 33 as the signal processing result, and uses the recognition result to calculate imaging information, such as an exposure time, suitable for capturing the captured image.
- For example, when the recognition result includes the detection position of a face, the CPU 31 calculates an exposure time suitable for capturing the face appearing at that detection position, according to the brightness at the face detection position on the compressed image (captured image). Further, for example, the CPU 31 calculates imaging information for controlling autofocus so that the face detection position is brought into focus.
- In addition, the CPU 31 uses the recognition result, as necessary, to calculate imaging information such as a frame rate, a shooting mode, and a cutout range suitable for capturing the captured image.
- step S32 the CPU 31 feeds back the imaging information calculated in step S31 to the register group 27 via the connection line CL1.
- the register group 27 newly stores the imaging information fed back from the CPU 31, and thereafter, the imaging control unit 25 controls the imaging processing unit 22 according to the imaging information newly stored in the register group 27.
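- The feedback loop can be sketched as follows (the proportional update rule, target value, and register names are assumptions; the patent states only that an exposure time suited to the face is calculated from the brightness at the face detection position):

```python
import numpy as np

def update_exposure(registers: dict, face_box: tuple, brightness: np.ndarray,
                    target: float = 110.0) -> dict:
    """Compute a new exposure time from the brightness at the face detection
    position and feed it back to the (simulated) register group 27."""
    x0, y0, x1, y1 = face_box                    # detection position from the DSP 32
    face_brightness = brightness[y0:y1, x0:x1].mean()
    # Proportional correction: brighten a dark face, darken a blown-out one.
    registers["exposure_time"] *= target / max(face_brightness, 1.0)
    return registers

regs = {"exposure_time": 1 / 60}                 # seconds (illustrative)
bmap = np.full((480, 640), 55.0)                 # brightness map with a dark face
print(update_exposure(regs, (100, 100, 200, 200), bmap))  # exposure time doubled
```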
- In step S33, the CPU 31 reads the recognition result stored in the memory 33 as the signal processing result and supplies it to the output control unit 23.
- the recognition result as the signal processing result supplied from the memory 33 to the output control unit 23 is selected as necessary by the output control unit 23 and is output to the outside from the output I / F 24.
- FIG. 7 is a timing chart for explaining a first example of processing timing of the imaging apparatus 2 when recognition processing is performed as signal processing of the DSP 32.
- In FIG. 7, with 1/30 second as the frame period T1, the imaging unit 21 captures one frame during the first-half 1/60 second of the frame period T1.
- a captured image obtained by imaging by the imaging unit 21 is supplied from the imaging processing unit 22 to the output control unit 23 and the image compression unit 35.
- the captured image is a 12M pixel color image.
- When the captured image is supplied from the imaging processing unit 22, the output control unit 23 selects that captured image and outputs it to the outside from the output I/F 24.
- In the image compression unit 35, scale-down and YUV conversion are performed as the compression processing of the 12M pixel color captured image, converting it into a VGA size monochrome compressed image. This compressed image is stored in the memory 33.
- In the following, the frame period T1 under consideration is referred to as the frame period of interest T1.
- When the compression processing of the captured image of the frame period of interest T1 (captured in the first-half 1/60 second) ends in the first half of the frame period of interest T1, the DSP 32 starts, at the beginning of the second half of the frame period of interest T1, recognition processing using the compressed image stored in the memory 33, that is, the compressed image obtained from the captured image of the frame period of interest T1.
- Slightly before the end of the frame period of interest T1, the DSP 32 ends the recognition processing using the compressed image obtained from the captured image of the frame period of interest T1, and supplies the recognition result of that recognition processing to the output control unit 23 as the signal processing result.
- the output control unit 23 selects the signal processing result and outputs it to the outside from the output I / F 24.
- The signal processing result for the frame period of interest T1, that is, the signal processing result (recognition result) of the recognition processing using the compressed image obtained from the captured image of the frame period of interest T1, is output from the output I/F 24 during the period from just before the end of the frame period of interest T1 until the frame period of interest T1 ends.
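- The time budget in this first example can be checked with simple arithmetic (a sketch; the compression time is an assumed placeholder, since the patent states only that compression finishes within the first half of the frame period):

```python
FRAME_PERIOD  = 1 / 30   # T1: one frame every ~33.3 ms
IMAGING_TIME  = 1 / 60   # capture occupies the first half of T1
COMPRESS_TIME = 0.010    # assumption: compression also finishes in the first half

# Recognition starts only once compression is done, so it must fit in the
# remainder of the frame period:
recognition_budget = FRAME_PERIOD - max(IMAGING_TIME, COMPRESS_TIME)
print(f"recognition budget: {recognition_budget * 1e3:.1f} ms")  # ~16.7 ms
```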
- FIG. 8 is a timing chart for explaining a second example of processing timing of the imaging apparatus 2 when recognition processing is performed as signal processing of the DSP 32.
- In FIG. 8 as well, with 1/30 second as the frame period T1, the imaging unit 21 captures one frame during the first-half 1/60 second of the frame period T1, acquiring a 12M pixel color captured image.
- the captured image acquired by the imaging unit 21 is supplied from the imaging processing unit 22 to the output control unit 23 and the image compression unit 35.
- the output control unit 23 selects the captured image in response to the supply of the captured image from the imaging processing unit 22 and outputs it to the outside from the output I / F 24 as in FIG. 7.
- In the image compression unit 35, scale-down and YUV conversion are performed as the compression processing of the 12M pixel color captured image, converting it into a VGA size monochrome compressed image. This compressed image is stored in the memory 33.
- However, in FIG. 8, without waiting for the compression processing to end after it has started, the DSP 32 starts recognition processing using (a part of) the compressed image obtained from the captured image of the frame period of interest T1 at the timing when a part of the compressed image generated by the compression processing becomes usable for the recognition processing.
- Slightly before the end of the frame period of interest T1, the DSP 32 ends the recognition processing using the compressed image obtained from the captured image of the frame period of interest T1, and supplies the recognition result of that recognition processing to the output control unit 23 as the signal processing result.
- the output control unit 23 selects the signal processing result according to the supply of the recognition result as the signal processing result, and outputs it to the outside from the output I / F 24.
- Then, the recognition result of the recognition processing using the compressed image obtained from the captured image of the frame period of interest T1 is output from the output I/F 24, as the signal processing result for the frame period of interest T1, during the period from just before the end of the frame period of interest T1 until the frame period of interest T1 ends.
- FIG. 9 is a timing chart illustrating a third example of processing timing of the imaging apparatus 2 when recognition processing is performed as signal processing of the DSP 32.
- In FIG. 9, the imaging unit 21 captures one frame with 1/30 second as the frame period T1.
- the imaging unit 21 captures a VGA size color captured image, not a 12 M pixel color captured image.
- the imaging of one frame is completed in a very short time from the start of the frame period T1.
- the VGA-size captured image captured by the imaging unit 21 is supplied from the imaging processing unit 22 to the output control unit 23 and the image compression unit 35.
- In FIG. 9, it is assumed that the captured image is not used externally; therefore, even when the captured image is supplied from the imaging processing unit 22, the output control unit 23 does not select it, and does not output it to the outside from the output I/F 24.
- the image compression unit 35 compresses the captured image and stores the compressed image obtained as a result in the memory 33.
- In FIG. 9, since the captured image is a VGA size color image, YUV conversion is performed as the compression processing but scale-down is not, so the compression processing of FIG. 9 is completed in a shorter time than in the cases of FIGS. 7 and 8.
- Also in FIG. 9, as in FIG. 8, without waiting for the compression processing to end after it has started, the DSP 32 starts recognition processing using (a part of) the compressed image obtained from the captured image of the frame period of interest T1 at the timing when a part of the compressed image generated by the compression processing becomes usable for the recognition processing.
- Slightly before the end of the frame period of interest T1, the DSP 32 ends the recognition processing using the compressed image obtained from the captured image of the frame period of interest T1, and supplies the recognition result of that recognition processing to the output control unit 23 as the signal processing result.
- the output control unit 23 selects the signal processing result according to the supply of the recognition result as the signal processing result, and outputs it to the outside from the output I / F 24.
- Then, the recognition result of the recognition processing using the compressed image obtained from the captured image of the frame period of interest T1 is output from the output I/F 24, as the signal processing result for the frame period of interest T1, during the period from just before the end of the frame period of interest T1 until the frame period of interest T1 ends.
- In FIG. 9, the recognition processing is started without waiting for the compression processing to end, so more time can be secured for the recognition processing than in the case of FIG. 7, where the recognition processing starts after the compression processing ends.
- Further, since the captured image output by the imaging unit 21 is a VGA size image, scale-down need not be performed in the compression processing, which reduces the load of the compression processing.
- The mode of operation in FIG. 9, in which the imaging unit 21 outputs a VGA size captured image that is not output from the output I/F 24, is useful, for example, when the captured image itself is not needed externally and only the signal processing result (here, the recognition result of the recognition processing) is needed.
- FIG. 10 is a timing chart for explaining a fourth example of processing timing of the imaging apparatus 2 when recognition processing is performed as signal processing of the DSP 32.
- In FIG. 10, as in FIG. 9, the imaging unit 21 captures a VGA size color captured image with 1/30 second as the frame period T1.
- the captured image captured by the imaging unit 21 is supplied from the imaging processing unit 22 to the output control unit 23 and the image compression unit 35.
- In FIG. 10, as in FIG. 9, the output control unit 23 does not select the captured image and does not output it to the outside from the output I/F 24.
- the image compression unit 35 compresses the captured image and stores the compressed image obtained as a result in the memory 33.
- Also in FIG. 10, as in FIG. 9, without waiting for the compression processing to end, the DSP 32 starts recognition processing using (a part of) the compressed image obtained from the captured image of the frame period of interest T1 at the timing when a part of the compressed image generated by the compression processing becomes usable for the recognition processing.
- Slightly before the end of the frame period of interest T1, the DSP 32 ends the recognition processing using the compressed image obtained from the captured image of the frame period of interest T1, and supplies the recognition result of that recognition processing to the output control unit 23 as the signal processing result.
- The output control unit 23 selects, as the signal processing result for the frame period of interest T1, the recognition result using the compressed image obtained from the captured image of the frame period of interest T1, and outputs it from the output I/F 24 just before the end of the frame period of interest T1.
- In FIG. 10, while performing the recognition processing as the signal processing, the DSP 32 outputs, as appropriate, intermediate data obtained in the course of the recognition processing. The intermediate data output from the DSP 32 is supplied to the output control unit 23, and when intermediate data is supplied, the output control unit 23 selects it and outputs it from the output I/F 24, in the same way as the signal processing result.
- When intermediate data obtained partway through the signal processing is output to the outside from the output I/F 24 in this way, the intermediate data can be used to debug the program that performs the signal processing (here, the recognition processing program).
- 11, 12, and 13 are diagrams for explaining an outline of an example of processing of the imaging apparatus 2 when fusion processing is performed as signal processing of the DSP 32.
- In step S41, when fusion processing is to be performed as the signal processing of the DSP 32, the communication I/F 34 downloads the programs to be executed by the CPU 31 and the DSP 32 from the outside and stores them in the memory 33.
- the program executed by the DSP 32 is a fusion processing program for performing fusion processing as signal processing.
- the CPU 31 starts a predetermined process by executing a program stored in the memory 33.
- In step S42, the CPU 31 reads out the brightness (information) of each small area of the captured image and other necessary information from the register group 27 via the connection line CL1.
- In step S43, the CPU 31 performs control related to the compression process, such as determining a reduction ratio indicating the degree to which the captured image is scaled down by the compression process of the image compression unit 35.
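- As a rough illustration of the reduction-ratio determination, the sketch below derives a per-axis scale-down factor from the sensor size and the VGA target; the 4000x3000 geometry for a "12M pixel" sensor is an assumption, not a figure from this disclosure.

```python
# Illustrative only: derive the scale-down ratio used by the compression process.
SENSOR_W, SENSOR_H = 4000, 3000  # ~12M pixels (assumed geometry)
TARGET_W, TARGET_H = 640, 480    # VGA

# Use the larger of the two axis ratios so the result fits within VGA.
reduction_ratio = max(SENSOR_W / TARGET_W, SENSOR_H / TARGET_H)
print(f"scale down by {reduction_ratio:.2f}x per axis")  # 6.25x here
```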
- In step S44, the imaging unit 21 starts capturing the captured image, and the imaging processing unit 22 starts outputting the captured image from the imaging unit 21. Thereby, supply of the captured image from the imaging processing unit 22 to the output control unit 23 and supply from the imaging processing unit 22 to the image compression unit 35 via the connection line CL2 are started.
- the captured image supplied from the imaging processing unit 22 to the output control unit 23 is selected as necessary by the output control unit 23 and is output to the outside from the output I / F 24.
- In step S45, the image compression unit 35 starts compression processing of the captured image supplied from the imaging processing unit 22 via the connection line CL2.
- the image compression unit 35 stores the VGA size monochrome captured image obtained as a result of the compression process in the memory 33 as a compressed image.
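- A minimal sketch of this compression step, assuming OpenCV and a BGR input frame (the interpolation choice is an assumption; the disclosure specifies only scale-down and YUV conversion):

```python
import cv2
import numpy as np

def compress_to_vga_mono(captured_bgr: np.ndarray) -> np.ndarray:
    """Scale a color frame down to VGA and keep only the luminance plane."""
    vga = cv2.resize(captured_bgr, (640, 480), interpolation=cv2.INTER_AREA)
    yuv = cv2.cvtColor(vga, cv2.COLOR_BGR2YUV)  # YUV conversion
    return yuv[:, :, 0]  # Y plane only -> VGA size monochrome compressed image

captured = np.zeros((3000, 4000, 3), dtype=np.uint8)  # stand-in 12M-pixel frame
compressed = compress_to_vga_mono(captured)
assert compressed.shape == (480, 640)
```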
- Meanwhile, the sensor output of a ToF sensor (not shown), which is a distance sensor installed so as to have a predetermined positional relationship with the imaging device 2, is supplied to the input I/F 36.
- The sensor output of the ToF sensor takes, for example, the form of an image whose pixel values are distance sensing results (for example, values corresponding to the time until light emitted from the ToF sensor is reflected by the subject and received again by the ToF sensor).
- this image is also referred to as a ToF image.
- the ToF image is, for example, an image having a QQVGA size or a QVGA size.
- In step S46, the input I/F 36 starts receiving the ToF image as the sensor output of the ToF sensor.
- the ToF image received by the input I / F 36 is supplied to the memory 33 and stored therein.
- In step S51 of FIG. 12, the DSP 32 reads and executes the fusion processing program stored in the memory 33 in step S41, thereby starting fusion processing as signal processing corresponding to the fusion processing program.
- That is, the DSP 32 sequentially reads each area of the compressed image stored in the memory 33 from the memory 33 as a fusion processing target, reads the ToF image from the memory 33, and performs fusion processing using the processing target of the compressed image and the ToF image.
- the sensor output that is the pixel value of the ToF image is converted into a value representing the distance, and a distance image having the value representing the distance as the pixel value is generated.
- one distance image is obtained from four consecutive ToF images.
- Further, in the fusion processing, calibration is performed to align each pixel of the compressed image (of the processing target) with the corresponding pixel of the distance image, based on the positional relationship between the imaging device 2 and the ToF sensor.
- In addition, for example, the values representing distances that serve as the pixel values of the plurality of pixels of the distance image corresponding to a plurality of pixels of the compressed image in which a subject at an equal distance appears are matched to one another, whereby the noise of the distance image is removed, as sketched below.
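- A simplified sketch of these fusion steps follows. The encoding of the ToF pixel value as a round-trip time in nanoseconds is an assumption (real sensors encode distance differently), the pixel-alignment calibration is only noted in a comment, and a plain median filter stands in for the compressed-image-guided noise removal.

```python
import cv2
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def tof_to_distance(tof_ns: np.ndarray) -> np.ndarray:
    """Round-trip time [ns] -> one-way distance [m]."""
    return (tof_ns * 1e-9) * C / 2.0

def fuse(tof_frames: list) -> np.ndarray:
    assert len(tof_frames) == 4  # one distance image per four ToF images
    dist = np.mean([tof_to_distance(f) for f in tof_frames], axis=0)
    # Calibration (aligning distance-image pixels with compressed-image pixels
    # from the known sensor geometry) is omitted in this sketch.
    return cv2.medianBlur(dist.astype(np.float32), 3)  # illustrative denoising

frames = [np.full((240, 320), 40.0) for _ in range(4)]  # QVGA ToF frames, 40 ns
distance_image = fuse(frames)
print(distance_image[0, 0])  # ~6.0 m
```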
- In step S52, the DSP 32 supplies the result of the fusion processing to the memory 33 as a signal processing result and stores it there.
- the result of the fusion processing is, for example, a distance image from which noise has been removed.
- In the fusion processing, in order to prevent the brightness of the compressed image (captured image) from affecting the noise removal, gradation conversion of the compressed image can be performed so that the average brightness of the compressed image becomes a predetermined fixed value. Such gradation conversion can be performed using the brightness of each small area of the captured image read out from the register group 27 by the CPU 31 in step S42 of FIG. 11.
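- A minimal sketch of such gradation conversion, using a single global gain and an assumed target mean of 128 (no concrete fixed value is given here):

```python
import numpy as np

def normalize_brightness(img: np.ndarray, target_mean: float = 128.0) -> np.ndarray:
    """Scale pixel values so the average brightness hits a fixed target."""
    gain = target_mean / max(float(img.mean()), 1e-6)  # avoid division by zero
    return np.clip(img.astype(np.float32) * gain, 0, 255).astype(np.uint8)

rng = np.random.default_rng(0)
compressed = (rng.random((480, 640)) * 60).astype(np.uint8)  # dim test image
print(normalize_brightness(compressed).mean())  # close to 128
```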
- In step S61 of FIG. 13, the CPU 31 reads out the distance image as the signal processing result stored in the memory 33, and performs a calculation using the distance image to obtain imaging information, such as a focus, suitable for capturing the captured image.
- For example, the CPU 31 detects the closest subject, or a subject located near a predetermined distance, from the distance image, and calculates imaging information for controlling autofocus so as to focus on that subject. Also, for example, the CPU 31 detects the closest subject, or a subject located near a predetermined distance, from the distance image, and calculates an exposure time appropriate for photographing that subject according to its luminance.
- In addition, the CPU 31 calculates, using the distance image, imaging information such as a frame rate, a shooting mode, and a cutout range suitable for capturing the captured image, as necessary; a minimal sketch of such a calculation follows.
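- The sketch below finds the nearest measured point in the distance image as an autofocus target; the data layout and the "closest pixel wins" rule are assumptions for illustration.

```python
import numpy as np

def focus_target(distance_img: np.ndarray):
    """Return (distance [m], (row, col)) of the closest measured point."""
    idx = np.unravel_index(np.argmin(distance_img), distance_img.shape)
    return float(distance_img[idx]), (int(idx[0]), int(idx[1]))

dist = np.full((480, 640), 10.0)   # background 10 m away
dist[200:220, 300:330] = 2.5       # a subject 2.5 m away
d, pos = focus_target(dist)
print(f"focus at {d} m, pixel {pos}")  # fed back to the register group via CL1
```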
- In step S62, the CPU 31 feeds back the imaging information calculated in step S61 to the register group 27 via the connection line CL1.
- the register group 27 newly stores the imaging information fed back from the CPU 31, and thereafter, the imaging control unit 25 controls the imaging processing unit 22 according to the imaging information newly stored in the register group 27.
- In step S63, the CPU 31 reads out the distance image as the signal processing result stored in the memory 33 and supplies it to the output control unit 23.
- the distance image as the signal processing result supplied from the memory 33 to the output control unit 23 is selected by the output control unit 23 as necessary, and is output to the outside from the output I / F 24.
- FIG. 14 is a timing chart for explaining a first example of processing timing of the imaging apparatus 2 when fusion processing is performed as signal processing of the DSP 32.
- the imaging unit 21 takes 1/30 seconds as a frame period T1, and captures a 12M pixel color captured image during 1/60 seconds of the first half of the frame period T1.
- a captured image obtained by imaging by the imaging unit 21 is supplied from the imaging processing unit 22 to the output control unit 23 and the image compression unit 35.
- the output control unit 23 selects the captured image in response to the supply of the captured image from the imaging processing unit 22, and outputs the selected image from the output I / F 24 to the outside.
- In the image compression unit 35, scale-down and YUV conversion are performed as compression processing of the 12M pixel color captured image, and the 12M pixel color captured image is converted into a VGA size black-and-white compressed image. This compressed image is stored in the memory 33.
- a ToF sensor is connected to the input I / F 36 of the imaging device 2, and the ToF sensor outputs a QVGA size ToF image as a sensor output.
- the input I / F 36 receives the ToF image as the sensor output of the ToF sensor and stores it in the memory 33.
- the ToF sensor can output a QVGA size ToF image at 240 fps (frame per second).
- Then, in FIG. 14, the ToF sensor outputs four 240 fps QVGA size ToF images during the 1/60 second of the first half of the frame period T1, and the input I/F 36 receives those four ToF images during the 1/60 second of the first half of the frame period T1.
- the compression processing of the captured image (captured in 1/60 second of the first half) of the target frame period T1 ends in the first half of the target frame period T1. Furthermore, in the first half of the frame period of interest T1, reception of four ToF images is completed at the input I / F 36.
- Then, at the timing when the 1/60 second of the second half of the frame period of interest T1 starts, the DSP 32 starts fusion processing using the compressed image stored in the memory 33, that is, the compressed image obtained from the captured image of the frame period of interest T1, and the four ToF images stored in the memory 33.
- In the fusion processing, one distance image of the frame period of interest T1 is generated from the four ToF images of the frame period of interest T1, and calibration for aligning each pixel of the compressed image with the corresponding pixel of the distance image is performed. Then, the noise of the distance image is removed using the compressed image.
- the frame rate of the ToF image is 240 fps, but the frame rate of the distance image is 30 fps corresponding to the frame period T1 (1/30 second).
- The DSP 32 ends the fusion processing using the compressed image obtained from the captured image of the frame period of interest T1 at a timing slightly before the end of the frame period of interest T1, and supplies the distance image from which noise has been removed, obtained as a result of the fusion processing, to the output control unit 23 as a signal processing result.
- the output control unit 23 selects the signal processing result and outputs it to the outside from the output I / F 24.
- The signal processing result for the frame period of interest T1, that is, the signal processing result (distance image) of the fusion processing using the compressed image obtained from the captured image of the frame period of interest T1, is output from the output I/F 24 in the period from the end of that fusion processing until the end of the frame period of interest T1.
- FIG. 15 is a timing chart for explaining a second example of processing timing of the imaging apparatus 2 when fusion processing is performed as signal processing of the DSP 32.
- In FIG. 15, as in FIG. 14, the image compression unit 35 generates a VGA size black-and-white compressed image by scale-down and YUV conversion as compression processing of the 12M pixel color captured image, and stores it in the memory 33.
- Also as in FIG. 14, a ToF sensor is connected to the input I/F 36 of the imaging device 2, and the ToF sensor outputs a QVGA size ToF image as its sensor output.
- the input I / F 36 receives the ToF image as the sensor output of the ToF sensor and stores it in the memory 33.
- In FIG. 15, however, the ToF sensor outputs four 120 fps QVGA size ToF images during the frame period T1, and the input I/F 36 receives those four ToF images during the frame period T1.
- Then, at the start timing of the frame period T1 next to the frame period of interest T1, the DSP 32 starts fusion processing using the compressed image obtained from the captured image of the frame period of interest T1 and the four ToF images received from the ToF sensor during the frame period of interest T1.
- Hereinafter, the compressed image obtained from the captured image of the frame period of interest T1 is also referred to as the compressed image of the frame period of interest T1, and the (four) ToF images received from the ToF sensor during the frame period of interest T1 are also referred to as the (four) ToF images of the frame period of interest T1.
- The DSP 32 starts the fusion processing using the compressed image of the frame period of interest T1 and the four ToF images of the frame period of interest T1 at the start timing of the frame period T1 next to the frame period of interest T1, and ends the fusion processing at a timing slightly before the end of the first half of that next frame period T1.
- the DSP 32 supplies a distance image from which noise is removed, obtained as a result of the fusion processing, to the output control unit 23 as a signal processing result.
- Hereinafter, the signal processing result of the fusion processing using the compressed image of the frame period of interest T1 and the four ToF images of the frame period of interest T1, and the distance image as that signal processing result, are also referred to as the signal processing result of the frame period of interest T1 and the distance image of the frame period of interest T1, respectively.
- The output control unit 23 selects the distance image as the signal processing result of the frame period of interest T1 after the output of the captured image of the frame period T1 next to the frame period of interest T1 from the output I/F 24 ends, and outputs it from the output I/F 24 to the outside.
- Accordingly, although the frame rate of the distance image is 30 fps, corresponding to the frame period T1 (1/30 second), the distance image as the signal processing result of the frame period of interest T1 is not output during the frame period of interest T1 but is output in the next frame period T1.
- That is, whereas in FIG. 14 the distance image as the signal processing result of the frame period of interest T1 is output within the frame period of interest T1, in FIG. 15 it is output in the frame period T1 next to the frame period of interest T1. Therefore, in FIG. 15, a ToF sensor whose ToF image frame rate is slower than in the case of FIG. 14, that is, a lower-cost ToF sensor, can be used as the ToF sensor connected to the input I/F 36.
- As described above, the usage form of the imaging device 2 in which the sensor output of a distance sensor such as a ToF sensor is received from the input I/F 36 and fusion processing is performed can be applied to, for example, automatic driving of a vehicle.
- FIG. 16, FIG. 17, and FIG. 18 are diagrams for explaining an outline of an example of processing of the imaging apparatus 2 when SLAM processing is performed as signal processing of the DSP 32.
- the communication I / F 34 downloads a program to be executed by the CPU 31 and the DSP 32 from the outside and stores the program in the memory 33 when performing the SLAM processing as the signal processing of the DSP 32.
- the program executed by the DSP 32 is a SLAM processing program that performs SLAM processing as signal processing.
- the CPU 31 starts a predetermined process by executing a program stored in the memory 33.
- In step S72, the CPU 31 reads out the brightness (information) of each small area of the captured image and other necessary information from the register group 27 via the connection line CL1.
- In step S73, the CPU 31 performs control related to the compression process, such as determining a reduction ratio indicating the degree to which the captured image is scaled down by the compression process of the image compression unit 35.
- In step S74, the imaging unit 21 starts capturing the captured image, and the imaging processing unit 22 starts outputting the captured image from the imaging unit 21. Thereby, supply of the captured image from the imaging processing unit 22 to the output control unit 23 and supply from the imaging processing unit 22 to the image compression unit 35 via the connection line CL2 are started.
- the captured image supplied from the imaging processing unit 22 to the output control unit 23 is selected as necessary by the output control unit 23 and is output to the outside from the output I / F 24.
- In step S75, the image compression unit 35 starts compression processing of the captured image supplied from the imaging processing unit 22 via the connection line CL2.
- the image compression unit 35 stores the VGA size monochrome captured image obtained as a result of the compression process in the memory 33 as a compressed image.
- Meanwhile, the sensor output of an image sensor (not shown) installed so as to have a predetermined positional relationship with the imaging device 2 is supplied to the input I/F 36.
- an image sensor different from the imaging device 2 installed so as to have a predetermined positional relationship with the imaging device 2 is also referred to as another sensor hereinafter.
- the other sensor senses light and outputs an image corresponding to the light as a sensor output.
- the image as the sensor output of the other sensor is also referred to as an other sensor image.
- the other sensor image is, for example, an image of VGA size.
- In step S76, the input I/F 36 starts receiving the VGA size other sensor image as the sensor output of the other sensor.
- the other sensor image of the VGA size received by the input I / F 36 is supplied to the memory 33 and stored therein.
- In step S81 of FIG. 17, the DSP 32 reads and executes the SLAM processing program stored in the memory 33 in step S71, thereby starting SLAM processing as signal processing corresponding to the SLAM processing program.
- That is, the DSP 32 sequentially reads each area of the compressed image stored in the memory 33 from the memory 33 as the processing target of the SLAM processing, reads the other sensor image from the memory 33, and performs SLAM processing using the processing target of the compressed image and the other sensor image as a stereo image.
- In the SLAM processing, for example, rectification is performed to parallelize the compressed image and the other sensor image as a stereo image (in effect, placing the imaging device 2 and the other sensor in a parallel arrangement), as sketched below.
- In the SLAM processing, for example, self-position estimation and map construction (map growing) are then performed using the rectified compressed image and other sensor image as a stereo image.
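- A sketch of such software rectification for the stereo pair formed by the compressed image and the other sensor image, using OpenCV. The intrinsics, distortion, and the rotation/translation between the two sensors are made-up values; in practice they come from calibrating the imaging device 2 against the other sensor.

```python
import cv2
import numpy as np

size = (640, 480)  # VGA (width, height)
K = np.array([[600.0, 0, 320], [0, 600.0, 240], [0, 0, 1]])  # assumed intrinsics
dist = np.zeros(5)                     # assume negligible lens distortion
R = np.eye(3)                          # assumed relative rotation
T = np.array([[-0.05], [0.0], [0.0]])  # assumed 5 cm baseline

R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K, dist, K, dist, size, R, T)
m1x, m1y = cv2.initUndistortRectifyMap(K, dist, R1, P1, size, cv2.CV_32FC1)
m2x, m2y = cv2.initUndistortRectifyMap(K, dist, R2, P2, size, cv2.CV_32FC1)

compressed = np.zeros((480, 640), np.uint8)  # from the image compression unit
other = np.zeros((480, 640), np.uint8)       # from the input I/F 36
rect_a = cv2.remap(compressed, m1x, m1y, cv2.INTER_LINEAR)
rect_b = cv2.remap(other, m2x, m2y, cv2.INTER_LINEAR)
# rect_a and rect_b now share epipolar lines and can feed the SLAM processing.
```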
- In step S82, the DSP 32 supplies the result of the SLAM processing to the memory 33 as a signal processing result and stores it there.
- The result of the SLAM processing is, for example, the result of self-position estimation (hereinafter also referred to as a position estimation result) and a map constructed along with the self-position estimation.
- In the SLAM processing, in order to prevent the luminance of the compressed image (captured image) and the other sensor image from affecting the self-position estimation and map construction, gradation conversion of the compressed image and the other sensor image can be performed so that their average brightness becomes a predetermined fixed value. Such gradation conversion can be performed using the brightness of each small area of the captured image read out from the register group 27 by the CPU 31 in step S72 of FIG. 16.
- In step S91 of FIG. 18, the CPU 31 reads out the position estimation result and the map as the signal processing result stored in the memory 33, and performs calculations as necessary using them to obtain imaging information such as an exposure time, focus, frame rate, shooting mode, and cutout range appropriate for capturing the captured image.
- In step S92, the CPU 31 feeds back the imaging information calculated in step S91 to the register group 27 via the connection line CL1.
- the register group 27 newly stores the imaging information fed back from the CPU 31, and thereafter, the imaging control unit 25 controls the imaging processing unit 22 according to the imaging information newly stored in the register group 27.
- In step S93, the CPU 31 reads out the position estimation result and the map as the signal processing result stored in the memory 33 and supplies them to the output control unit 23.
- the position estimation result and the map as the signal processing result supplied from the memory 33 to the output control unit 23 are selected by the output control unit 23 and output to the outside from the output I / F 24.
- FIG. 19 is a timing chart for explaining a first example of processing timing of the imaging apparatus 2 when SLAM processing is performed as signal processing of the DSP 32.
- the imaging unit 21 takes 1/30 seconds as a frame period T1, and captures a 12M pixel color captured image during 1/60 seconds of the first half of the frame period T1.
- a captured image obtained by imaging by the imaging unit 21 is supplied from the imaging processing unit 22 to the output control unit 23 and the image compression unit 35.
- the output control unit 23 selects the captured image in response to the supply of the captured image from the imaging processing unit 22, and outputs the selected image from the output I / F 24 to the outside.
- scale-down and YUV conversion are performed as compression processing of a 12M pixel color captured image, and the 12M pixel color captured image is converted into a VGA sized black and white compressed image. This compressed image is stored in the memory 33.
- another sensor is connected to the input I / F 36 of the imaging device 2, and the other sensor outputs another sensor image of VGA size as a sensor output.
- the input I / F 36 receives other sensor images as sensor outputs of other sensors and stores them in the memory 33.
- In FIG. 19, the other sensor outputs a VGA size other sensor image at 30 fps, equal to the frame period T1. That is, in FIG. 19, the other sensor outputs the 30 fps VGA size other sensor image at the start of the frame period T1 in synchronization with the imaging device 2.
- the input I / F 36 receives other sensor images.
- the compression processing of the captured image in the target frame period T1 ends in the first half of the target frame period T1.
- Thereafter, at the timing when the 1/60 second of the second half of the frame period of interest T1 starts, the DSP 32 starts SLAM processing using the compressed image obtained from the captured image of the frame period of interest T1 and the other sensor image of the frame period of interest T1, both stored in the memory 33.
- In the SLAM processing, rectification of the compressed image (of the captured image) of the frame period of interest T1 and the other sensor image of the frame period of interest T1 is performed, and self-position estimation and map construction for the frame period of interest T1 are performed using the rectified compressed image and other sensor image.
- The DSP 32 ends the SLAM processing using the compressed image and the other sensor image of the frame period of interest T1 at a timing slightly before the end of the frame period of interest T1, and supplies the position estimation result and the map obtained as a result of the SLAM processing to the output control unit 23 as a signal processing result.
- the output control unit 23 selects the signal processing result and outputs it to the outside from the output I / F 24.
- The signal processing result for the frame period of interest T1, that is, the signal processing result (position estimation result and map) of the SLAM processing using the compressed image and the other sensor image of the frame period of interest T1, is output from the output I/F 24 in the period from the end of that SLAM processing until the frame period of interest T1 ends.
- FIG. 20 is a timing chart for explaining a second example of processing timing of the imaging apparatus 2 when SLAM processing is performed as signal processing of the DSP 32.
- Also in FIG. 20, as in FIG. 19, a 12M pixel color captured image is captured with 1/30 second as the frame period T1 and output to the outside from the output I/F 24.
- In FIG. 20, similarly to FIG. 19, the image compression unit 35 generates a VGA size black-and-white compressed image by scale-down and YUV conversion as compression processing of the 12M pixel color captured image, and stores it in the memory 33.
- another sensor is connected to the input I / F 36 of the imaging apparatus 2, and the other sensor outputs another sensor image of VGA size as a sensor output.
- the input I / F 36 receives other sensor images as sensor outputs of other sensors and stores them in the memory 33 as in FIG.
- Then, at the timing when the 1/60 second of the second half of the frame period of interest T1 starts, the DSP 32 starts SLAM processing using the compressed image obtained from the captured image of the frame period of interest T1 and the other sensor image of the frame period of interest T1, both stored in the memory 33.
- In the SLAM processing, for example, rectification of the compressed image of the frame period of interest T1 and the other sensor image of the frame period of interest T1 is performed, and self-position estimation and map construction for the frame period of interest T1 are performed using the rectified compressed image and other sensor image.
- The DSP 32 ends the SLAM processing using the compressed image and the other sensor image of the frame period of interest T1 at a timing slightly before the end of the first half of the frame period T1 next to the frame period of interest T1, and supplies the position estimation result and the map obtained as a result of the processing to the output control unit 23 as a signal processing result.
- the output control unit 23 selects the position estimation result and the map as the signal processing result of the frame period of interest T1 after the output from the output I / F 24 of the captured image of the frame period T1 next to the frame period of interest T1 is completed. Then, the data is output from the output I / F 24 to the outside.
- the position estimation result and the map as the signal processing result of the target frame period T1 are not output during the target frame period T1, but are output in the next frame period T1.
- That is, whereas in FIG. 19 the position estimation result and the map as the signal processing result of the frame period of interest T1 are output within the frame period of interest T1, in FIG. 20 they are output in the frame period T1 next to the frame period of interest T1. Therefore, in FIG. 20, a longer time can be allocated to the SLAM processing than in the case of FIG. 19, and as a result, the accuracy of the position estimation result and the map as the signal processing result of the SLAM processing can be improved.
- FIG. 21 is a timing chart for explaining a third example of processing timing of the imaging apparatus 2 when SLAM processing is performed as signal processing of the DSP 32.
- the imaging unit 21 captures one frame with 1/30 seconds as a frame period T1. However, in FIG. 21, the imaging unit 21 captures a VGA size color captured image, not a 12 M pixel color captured image. For this reason, in FIG. 21, the imaging of one frame is completed in a very short time from the start of the frame period T1.
- the captured image captured by the imaging unit 21 is supplied from the imaging processing unit 22 to the output control unit 23 and the image compression unit 35.
- In FIG. 21, the captured image is not used externally. Therefore, even when the captured image is supplied from the imaging processing unit 22, the output control unit 23 does not select the captured image and does not output it from the output I/F 24 to the outside.
- the image compression unit 35 compresses the captured image and stores the compressed image obtained as a result in the memory 33.
- the input I / F 36 receives other sensor images as sensor outputs of other sensors and stores them in the memory 33.
- When both the compressed image and the other sensor image of the frame period of interest T1 are stored in the memory 33, SLAM processing using the compressed image and the other sensor image can be started.
- the DSP 32 starts SLAM processing using the compressed image of the frame period of interest T1 stored in the memory 33 and the other sensor image as a stereo image.
- In the SLAM processing, for example, rectification of the compressed image of the frame period of interest T1 and the other sensor image of the frame period of interest T1 is performed, and self-position estimation and map construction for the frame period of interest T1 are performed using the rectified compressed image and other sensor image.
- The DSP 32 ends the SLAM processing using the compressed image and the other sensor image of the frame period of interest T1 at a timing slightly before the end of the frame period of interest T1, and supplies the position estimation result and the map obtained as a result of the SLAM processing to the output control unit 23 as a signal processing result.
- the output control unit 23 selects the signal processing result and outputs it to the outside from the output I / F 24.
- The signal processing result for the frame period of interest T1, that is, the signal processing result (position estimation result and map) of the SLAM processing using the compressed image and the other sensor image of the frame period of interest T1, is output from the output I/F 24 in the period from the end of that SLAM processing until the frame period of interest T1 ends.
- In FIG. 21, since the captured image output by the imaging unit 21 is a VGA size image, it is not necessary to perform a scale-down in the compression process, and the load of the compression process can be reduced accordingly.
- Such a form, in which the captured image output by the imaging unit 21 is not output from the output I/F 24 even as a VGA size image, is useful when the captured image itself is not needed externally and only the signal processing result (here, the signal processing result of the SLAM processing) is needed.
- As described above, the usage form of the imaging device 2 in which the other sensor image is received from the input I/F 36 and SLAM processing is performed can be applied to, for example, a robot that acts autonomously.
- In the above, rectification is performed as part of the SLAM processing realized by causing the DSP 32 to execute the SLAM processing program; that is, rectification is performed by software. However, when the other sensor image and the captured image are used as a stereo image, the rectification required in that case can be performed not by software but by dedicated hardware provided in the imaging device 2.
- FIG. 22 is a block diagram showing another configuration example of the imaging device 2 of FIG.
- FIG. 22 shows a configuration example of the imaging apparatus 2 provided with dedicated hardware for performing rectification.
- the imaging apparatus 2 includes an imaging unit 21 to an imaging control unit 25, a CPU 31 to an input I / F 36, and a rectification unit 71.
- the image pickup apparatus 2 in FIG. 22 is common to the case in FIG. 2 in that the image pickup unit 21 to the image pickup control unit 25 and the CPU 31 to the input I / F 36 are included.
- the imaging device 2 of FIG. 22 is different from the case of FIG. 2 in that a rectification unit 71 is newly provided.
- the rectification unit 71 is dedicated hardware for performing rectification, and performs rectification on the compressed image and other sensor images stored in the memory 33.
- the DSP 32 performs SLAM processing using the compressed image after rectification by the rectification unit 71 and the other sensor image.
- the rectification speed can be increased by providing the rectification unit 71 as hardware dedicated to rectification.
- FIG. 23 is a diagram illustrating a usage example in which the imaging device 2 of FIG. 1 is used.
- the imaging device 2 can be used for various electronic devices that sense light such as visible light, infrared light, ultraviolet light, and X-ray as follows.
- Electronic devices that capture images for viewing, such as digital cameras and mobile devices with camera functions
- Electronic devices used for traffic, such as in-vehicle sensors that photograph the front, rear, surroundings, and interior of a car for safe driving such as automatic stopping and for recognition of the driver's state, surveillance cameras that monitor traveling vehicles and roads, and ranging sensors that measure distances between vehicles
- Electronic devices used in home appliances such as TVs, refrigerators, and air conditioners, which photograph a user's gesture so that the device can be operated according to that gesture
- Electronic devices used for medicine and healthcare, such as endoscopes, electron microscopes, and devices that perform angiography by receiving infrared light
- Electronic devices used for security, such as surveillance cameras for crime prevention and cameras for personal authentication
- Electronic devices used for beauty, such as skin measuring instruments that photograph the skin and microscopes that photograph the scalp
- Electronic devices used for sports, such as action cameras and wearable cameras for sports applications
- Electronic devices used for agriculture, such as cameras for monitoring the condition of fields and crops
- the technology (this technology) according to the present disclosure can be applied to various products.
- For example, the technology according to the present disclosure may be realized as a device mounted on any type of moving body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
- FIG. 24 is a block diagram illustrating a schematic configuration example of a vehicle control system that is an example of a mobile control system to which the technology according to the present disclosure can be applied.
- the vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
- the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an outside vehicle information detection unit 12030, an in-vehicle information detection unit 12040, and an integrated control unit 12050.
- In FIG. 24, as a functional configuration of the integrated control unit 12050, a microcomputer 12051, an audio image output unit 12052, and an in-vehicle network I/F (interface) 12053 are illustrated.
- the drive system control unit 12010 controls the operation of the device related to the drive system of the vehicle according to various programs.
- For example, the drive system control unit 12010 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism that adjusts the steering angle of the vehicle, a braking device that generates the braking force of the vehicle, and the like.
- the body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs.
- the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as a headlamp, a back lamp, a brake lamp, a blinker, or a fog lamp.
- Radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020. The body system control unit 12020 accepts the input of these radio waves or signals, and controls the door lock device, power window device, lamps, and the like of the vehicle.
- the vehicle outside information detection unit 12030 detects information outside the vehicle on which the vehicle control system 12000 is mounted.
- the imaging unit 12031 is connected to the vehicle exterior information detection unit 12030.
- the vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image outside the vehicle and receives the captured image.
- Based on the received image, the vehicle exterior information detection unit 12030 may perform object detection processing or distance detection processing for persons, cars, obstacles, signs, characters on a road surface, and the like.
- the imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal corresponding to the amount of received light.
- the imaging unit 12031 can output an electrical signal as an image, or can output it as distance measurement information. Further, the light received by the imaging unit 12031 may be visible light or invisible light such as infrared rays.
- the vehicle interior information detection unit 12040 detects vehicle interior information.
- a driver state detection unit 12041 that detects a driver's state is connected to the in-vehicle information detection unit 12040.
- The driver state detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection unit 12041, the vehicle interior information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off.
- The microcomputer 12051 can calculate a control target value of the driving force generation device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and can output a control command to the drive system control unit 12010.
- For example, the microcomputer 12051 can perform cooperative control for the purpose of realizing the functions of an ADAS (Advanced Driver Assistance System), including collision avoidance or impact mitigation of the vehicle, following traveling based on the inter-vehicle distance, vehicle speed maintaining traveling, vehicle collision warning, vehicle lane departure warning, and the like.
- In addition, based on the information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, the microcomputer 12051 can control the driving force generation device, the steering mechanism, the braking device, and the like, thereby performing cooperative control for the purpose of automatic driving in which the vehicle travels autonomously without depending on the driver's operation.
- the microcomputer 12051 can output a control command to the body system control unit 12020 based on information outside the vehicle acquired by the vehicle outside information detection unit 12030.
- For example, the microcomputer 12051 can control the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030, and can perform cooperative control for the purpose of anti-glare, such as switching from high beam to low beam.
- the sound image output unit 12052 transmits an output signal of at least one of sound and image to an output device capable of visually or audibly notifying information to a vehicle occupant or the outside of the vehicle.
- an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as output devices.
- the display unit 12062 may include at least one of an on-board display and a head-up display, for example.
- FIG. 25 is a diagram illustrating an example of an installation position of the imaging unit 12031.
- the vehicle 12100 includes imaging units 12101, 12102, 12103, 12104, and 12105 as the imaging unit 12031.
- the imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at positions such as a front nose, a side mirror, a rear bumper, a back door, and an upper part of a windshield in the vehicle interior of the vehicle 12100.
- the imaging unit 12101 provided in the front nose and the imaging unit 12105 provided in the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 12100.
- the imaging units 12102 and 12103 provided in the side mirror mainly acquire an image of the side of the vehicle 12100.
- the imaging unit 12104 provided in the rear bumper or the back door mainly acquires an image behind the vehicle 12100.
- the forward images acquired by the imaging units 12101 and 12105 are mainly used for detecting a preceding vehicle or a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
- FIG. 25 shows an example of the shooting range of the imaging units 12101 to 12104.
- The imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, an overhead image of the vehicle 12100 viewed from above is obtained.
- At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
- at least one of the imaging units 12101 to 12104 may be a stereo camera including a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
- For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 obtains the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, the nearest three-dimensional object that is on the traveling path of the vehicle 12100 and traveling at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100. Furthermore, the microcomputer 12051 can set an inter-vehicle distance to be secured from the preceding vehicle in advance, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, cooperative control can be performed for the purpose of automatic driving or the like in which the vehicle travels autonomously without depending on the driver's operation.
- For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data on three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extract them, and use them for automatic avoidance of obstacles.
- For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult to see. Then, the microcomputer 12051 determines the collision risk indicating the degree of risk of collision with each obstacle, and in a situation where the collision risk is equal to or higher than a set value and there is a possibility of collision, it can perform driving assistance for collision avoidance by outputting an alarm to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
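- A hedged sketch of this decision logic: a per-obstacle collision risk based on time-to-collision, compared against a set value. The risk model and the threshold are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    distance_m: float        # from the imaging units' distance information
    closing_speed_ms: float  # relative speed toward the vehicle, m/s

def collision_risk(o: Obstacle) -> float:
    """Higher risk for shorter time-to-collision; zero when not closing."""
    if o.closing_speed_ms <= 0:
        return 0.0
    return 1.0 / max(o.distance_m / o.closing_speed_ms, 1e-3)

RISK_THRESHOLD = 0.5  # assumed set value

for obs in (Obstacle(30.0, 1.0), Obstacle(8.0, 6.0)):
    if collision_risk(obs) >= RISK_THRESHOLD:
        print("warn via speaker 12061 / display 12062 and force deceleration")
```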
- At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
- The microcomputer 12051 can also recognize a pedestrian by determining whether a pedestrian exists in the captured images of the imaging units 12101 to 12104. Such pedestrian recognition is performed by, for example, a procedure for extracting feature points in the captured images of the imaging units 12101 to 12104 as infrared cameras, and a procedure for performing pattern matching processing on a series of feature points indicating the contour of an object to determine whether or not it is a pedestrian.
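- The recognition described here runs on contour feature points with pattern matching; as an illustrative stand-in (not the method of this disclosure), OpenCV's HOG-based people detector performs a comparable contour-feature match and returns the rectangles used for emphasis:

```python
import cv2
import numpy as np

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

frame = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in camera image
rects, weights = hog.detectMultiScale(frame, winStride=(8, 8))
for (x, y, w, h) in rects:
    # Superimpose a rectangular contour line to emphasize each pedestrian,
    # mirroring the display control described for the display unit 12062.
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
```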
- When the microcomputer 12051 determines that a pedestrian exists in the captured images of the imaging units 12101 to 12104 and recognizes the pedestrian, the audio image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian. The audio image output unit 12052 may also control the display unit 12062 so that an icon or the like indicating the pedestrian is displayed at a desired position.
- the technology according to the present disclosure may be applied to the imaging unit 12031, for example.
- the imaging device 2 in FIG. 2 can be applied to the imaging unit 12031.
- The imaging unit 12031 can output information required by the user, that is, information necessary for a block that performs subsequent processing (hereinafter also referred to as a subsequent block). Therefore, the subsequent block need not perform processing for generating the necessary information from an image, and the load on the subsequent block can be reduced accordingly.
- Here, the processing performed by the computer (processor) according to the program does not necessarily have to be performed in time series in the order described in the flowcharts. That is, the processing performed by the computer according to the program also includes processing executed in parallel or individually (for example, parallel processing or processing by objects).
- The program may be processed by one computer (processor), or may be processed in a distributed manner by a plurality of computers.
- the present technology can be applied not only to an image sensor that senses visible light but also to an image sensor that senses electromagnetic waves other than visible light such as infrared rays.
- Note that the present technology can take the following configurations.
- <1> A one-chip imaging apparatus including: an imaging unit in which a plurality of pixels are arranged two-dimensionally and which captures an image; a signal processing unit that performs signal processing using a captured image output by the imaging unit; an output I/F that outputs a signal processing result of the signal processing and the captured image to the outside; and an output control unit that performs output control to selectively output the signal processing result of the signal processing and the captured image from the output I/F to the outside.
- <2> The imaging apparatus according to <1>, having a stacked structure in which a plurality of dies are stacked.
- <3> The imaging apparatus according to <1> or <2>, further including an image compression unit that compresses the captured image and generates a compressed image having a smaller data amount than the captured image.
- <4> The imaging apparatus according to any one of <1> to <3>, further including: an imaging control unit that has a register storing imaging information related to imaging of the captured image and controls imaging of the captured image according to the imaging information; and an imaging information calculation unit that calculates the imaging information using the signal processing result, in which the imaging control unit and the imaging information calculation unit are connected via a predetermined connection line, and the imaging information calculation unit feeds back the imaging information to the register of the imaging control unit via the predetermined connection line.
- <5> The imaging apparatus according to <4>, in which the register also stores output control information related to the output control, and the output control unit performs the output control according to the output control information stored in the register.
- <6> The imaging apparatus according to <4> or <5>, further including a first communication I/F that exchanges information to be read from and written to the register with the outside.
- <7> The imaging apparatus according to any one of <1> to <6>, in which the signal processing unit is a processor that executes a program, the imaging apparatus further including a second communication I/F that downloads the program executed by the processor from the outside.
- <8> The imaging apparatus according to any one of <1> to <7>, in which the signal processing unit performs, as the signal processing, recognition processing for recognizing a predetermined recognition target from the captured image.
- <9> The imaging apparatus according to any one of <1> to <7>, further including an input I/F that receives an external sensor output from an external sensor, in which the signal processing unit performs signal processing using the captured image and the external sensor output.
- <10> The imaging apparatus according to <9>, in which the external sensor output is an output of a distance sensor that senses information related to distance, or an output of an image sensor that senses light and outputs an image corresponding to the light.
- <11> The imaging apparatus according to <10>, in which the signal processing unit performs, as the signal processing, fusion processing for obtaining a distance using the captured image and the output of the distance sensor, or self-position estimation processing using an image as the output of the image sensor and the captured image.
- <12> An electronic apparatus including: an optical system that condenses light; and a one-chip imaging device that receives light and outputs an image corresponding to the amount of received light, in which the imaging device includes: an imaging unit in which a plurality of pixels are arranged two-dimensionally and which captures an image; a signal processing unit that performs signal processing using a captured image output by the imaging unit; an output I/F that outputs a signal processing result of the signal processing and the captured image to the outside; and an output control unit that performs output control to selectively output the signal processing result of the signal processing and the captured image from the output I/F to the outside.
Claims (12)
- 1. A one-chip imaging apparatus comprising: an imaging unit in which a plurality of pixels are arranged two-dimensionally and which captures an image; a signal processing unit that performs signal processing using a captured image output by the imaging unit; an output I/F that outputs a signal processing result of the signal processing and the captured image to the outside; and an output control unit that performs output control to selectively output the signal processing result of the signal processing and the captured image from the output I/F to the outside.
- 2. The imaging apparatus according to claim 1, having a stacked structure in which a plurality of dies are stacked.
- 3. The imaging apparatus according to claim 1, further comprising an image compression unit that compresses the captured image and generates a compressed image having a smaller data amount than the captured image.
- 4. The imaging apparatus according to claim 1, further comprising: an imaging control unit that has a register storing imaging information related to imaging of the captured image and controls imaging of the captured image according to the imaging information; and an imaging information calculation unit that calculates the imaging information using the signal processing result, wherein the imaging control unit and the imaging information calculation unit are connected via a predetermined connection line, and the imaging information calculation unit feeds back the imaging information to the register of the imaging control unit via the predetermined connection line.
- 5. The imaging apparatus according to claim 4, wherein the register also stores output control information related to the output control, and the output control unit performs the output control according to the output control information stored in the register.
- 6. The imaging apparatus according to claim 4, further comprising a first communication I/F that exchanges information to be read from and written to the register with the outside.
- 7. The imaging apparatus according to claim 1, wherein the signal processing unit is a processor that executes a program, the imaging apparatus further comprising a second communication I/F that downloads the program executed by the processor from the outside.
- 8. The imaging apparatus according to claim 1, wherein the signal processing unit performs, as the signal processing, recognition processing for recognizing a predetermined recognition target from the captured image.
- 9. The imaging apparatus according to claim 1, further comprising an input I/F that receives an external sensor output from an external sensor, wherein the signal processing unit performs signal processing using the captured image and the external sensor output.
- 10. The imaging apparatus according to claim 9, wherein the external sensor output is an output of a distance sensor that senses information related to distance, or an output of an image sensor that senses light and outputs an image corresponding to the light.
- 11. The imaging apparatus according to claim 10, wherein the signal processing unit performs, as the signal processing, fusion processing for obtaining a distance using the captured image and the output of the distance sensor, or self-position estimation processing using an image as the output of the image sensor and the captured image.
- 12. An electronic apparatus comprising: an optical system that condenses light; and a one-chip imaging device that receives light and outputs an image corresponding to the amount of received light, wherein the imaging device includes: an imaging unit in which a plurality of pixels are arranged two-dimensionally and which captures an image; a signal processing unit that performs signal processing using a captured image output by the imaging unit; an output I/F that outputs a signal processing result of the signal processing and the captured image to the outside; and an output control unit that performs output control to selectively output the signal processing result of the signal processing and the captured image from the output I/F to the outside.
Priority Applications (16)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/330,468 US10795024B2 (en) | 2016-09-16 | 2017-09-01 | Imaging device and electronic device |
EP17850712.5A EP3515057B1 (en) | 2016-09-16 | 2017-09-01 | Image pickup device and electronic apparatus |
KR1020237013656A KR102647268B1 (ko) | 2016-09-16 | 2017-09-01 | 촬상 장치 및 전자 기기 |
KR1020197006833A KR102374013B1 (ko) | 2016-09-16 | 2017-09-01 | 촬상 장치 및 전자 기기 |
EP22172135.0A EP4064680A1 (en) | 2016-09-16 | 2017-09-01 | Imaging device and electronic device |
KR1020227014481A KR20220058975A (ko) | 2016-09-16 | 2017-09-01 | 촬상 장치 및 전자 기기 |
CN201780055363.7A CN109691079B (zh) | 2016-09-16 | 2017-09-01 | 成像装置和电子设备 |
KR1020227015678A KR102526559B1 (ko) | 2016-09-16 | 2017-09-01 | 촬상 장치 및 전자 기기 |
CN202110539020.6A CN113271400B (zh) | 2016-09-16 | 2017-09-01 | 成像装置和电子设备 |
KR1020227002597A KR102490799B1 (ko) | 2016-09-16 | 2017-09-01 | 촬상 장치 및 전자 기기 |
JP2018539628A JP6633216B2 (ja) | 2016-09-16 | 2017-09-01 | 撮像装置、及び、電子機器 |
EP22172106.1A EP4064679A1 (en) | 2016-09-16 | 2017-09-01 | Imaging device and electronic device |
US16/986,049 US12130362B2 (en) | 2016-09-16 | 2020-08-05 | Imaging device and electronic device |
US17/702,172 US20220214458A1 (en) | 2016-09-16 | 2022-03-23 | Imaging device and electronic device |
US17/718,504 US20220244387A1 (en) | 2016-09-16 | 2022-04-12 | Imaging device and electronic device |
US17/722,117 US12061264B2 (en) | 2016-09-16 | 2022-04-15 | Imaging device and electronic device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016181194 | 2016-09-16 | ||
JP2016-181194 | 2016-09-16 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/330,468 A-371-Of-International US10795024B2 (en) | 2016-09-16 | 2017-09-01 | Imaging device and electronic device |
US16/986,049 Continuation US12130362B2 (en) | 2016-09-16 | 2020-08-05 | Imaging device and electronic device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018051809A1 true WO2018051809A1 (ja) | 2018-03-22 |
Family
ID=61619417
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/031539 WO2018051809A1 (ja) | 2016-09-16 | 2017-09-01 | 撮像装置、及び、電子機器 |
Country Status (6)
Country | Link |
---|---|
US (4) | US10795024B2 (ja) |
EP (3) | EP4064679A1 (ja) |
JP (4) | JP6633216B2 (ja) |
KR (5) | KR20220058975A (ja) |
CN (2) | CN109691079B (ja) |
WO (1) | WO2018051809A1 (ja) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109691079B (zh) | 2016-09-16 | 2021-05-14 | Sony Semiconductor Solutions Corporation | Imaging device and electronic device |
JP6996025B2 (ja) * | 2019-02-20 | 2022-02-04 | FUJIFILM Corporation | Imaging element, imaging device, method for operating imaging element, and program |
US20230121905A1 (en) * | 2020-03-26 | 2023-04-20 | Sony Semiconductor Solutions Corporation | Information processing apparatus, information processing method, and program |
US11706546B2 (en) * | 2021-06-01 | 2023-07-18 | Sony Semiconductor Solutions Corporation | Image sensor with integrated single object class detection deep neural network (DNN) |
US11989888B2 (en) * | 2021-08-04 | 2024-05-21 | Sony Semiconductor Solutions Corporation | Image sensor with integrated efficient multiresolution hierarchical deep neural network (DNN) |
US11763417B2 (en) * | 2021-11-19 | 2023-09-19 | Renesas Electronics Corporation | Semiconductor device and image processing system for processing regions of interest images |
Family Cites Families (54)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10336519A (ja) * | 1997-06-02 | 1998-12-18 | Sony Corp | Imaging element and imaging device |
JP2000241131A (ja) * | 1998-12-22 | 2000-09-08 | Matsushita Electric Ind Co Ltd | Rangefinder device and imaging device |
JP2001036793A (ja) * | 1999-07-21 | 2001-02-09 | Sharp Corp | Video camera |
JP5108172B2 (ja) | 2000-09-06 | 2012-12-26 | Nikon Corp | Image data size conversion processing device, electronic still camera, and recording medium for image data size conversion processing |
US6617565B2 (en) * | 2001-11-06 | 2003-09-09 | Omnivision Technologies, Inc. | CMOS image sensor with on-chip pattern recognition |
JP2004048455A (ja) | 2002-07-12 | 2004-02-12 | Niles Co Ltd | Imaging system |
JP2004146816A (ja) * | 2002-09-30 | 2004-05-20 | Matsushita Electric Ind Co Ltd | Solid-state imaging device and equipment using the same |
JP4574249B2 (ja) * | 2004-06-29 | 2010-11-04 | Canon Inc | Image processing device and method, program, and imaging device |
JP4244218B2 (ja) * | 2004-07-09 | 2009-03-25 | Panasonic Corp | Imaging signal processing circuit and camera system |
JP4277216B2 (ja) * | 2005-01-13 | 2009-06-10 | Sony Corp | Imaging device and method for processing imaging results |
WO2007032006A2 (en) * | 2005-09-13 | 2007-03-22 | Ben Gurion University Of The Negev Research And Development Authority | A configurable asic-based sensing circuit |
JP2007082059A (ja) * | 2005-09-16 | 2007-03-29 | Fujifilm Corp | Electronic camera |
JP4827611B2 (ja) * | 2006-05-23 | 2011-11-30 | Rohm Co Ltd | Serial interface device and image forming device |
JP2008124742A (ja) * | 2006-11-10 | 2008-05-29 | Sony Corp | Image processing device, image processing method, and program |
JP4301308B2 (ja) * | 2007-03-02 | 2009-07-22 | Sony Corp | Imaging device and image processing method |
US7676146B2 (en) | 2007-03-09 | 2010-03-09 | Eastman Kodak Company | Camera using multiple lenses and image sensors to provide improved focusing capability |
JP2009081818A (ja) * | 2007-09-27 | 2009-04-16 | Mitsubishi Electric Corp | Wireless communication base station device |
US9451142B2 (en) | 2007-11-30 | 2016-09-20 | Cognex Corporation | Vision sensors, systems, and methods |
JP2009145598A (ja) | 2007-12-13 | 2009-07-02 | Sharp Corp | Solid-state imaging device and electronic equipment including the same |
JP5082973B2 (ja) * | 2008-03-26 | 2012-11-28 | Hitachi Ltd | Video recording system and imaging device |
JP2009246803A (ja) * | 2008-03-31 | 2009-10-22 | Aisin Seiki Co Ltd | Image recognition device and computer program |
JP5266864B2 (ja) * | 2008-05-13 | 2013-08-21 | Sony Corp | Image sensor, data output method, imaging device, and camera |
JP2010283787A (ja) * | 2009-06-08 | 2010-12-16 | Panasonic Corp | Imaging device |
JP5625298B2 (ja) * | 2009-09-28 | 2014-11-19 | Sony Corp | Imaging device |
JP5700963B2 (ja) * | 2010-06-29 | 2015-04-15 | Canon Inc | Information processing device and control method therefor |
JP5631108B2 (ja) * | 2010-08-19 | 2014-11-26 | Canon Inc | Imaging device, control method therefor, and program |
KR101706093B1 (ko) | 2010-11-30 | 2017-02-14 | Samsung Electronics Co Ltd | Three-dimensional coordinate extraction system and method |
US9568985B2 (en) * | 2012-11-23 | 2017-02-14 | Mediatek Inc. | Data processing apparatus with adaptive compression algorithm selection based on visibility of compression artifacts for data communication over camera interface and related data processing method |
JP2014143667A (ja) * | 2012-12-28 | 2014-08-07 | Canon Inc | Imaging element, imaging device, control method therefor, and control program |
JP6091216B2 (ja) * | 2013-01-08 | 2017-03-08 | Canon Inc | Image signal processing device, control method therefor, and imaging device |
US9465484B1 (en) * | 2013-03-11 | 2016-10-11 | Amazon Technologies, Inc. | Forward and backward looking vision system |
JP5900393B2 (ja) | 2013-03-21 | 2016-04-06 | Sony Corp | Information processing device, operation control method, and program |
JP6141084B2 (ja) * | 2013-04-19 | 2017-06-07 | Canon Inc | Imaging device |
US9443167B2 (en) * | 2013-08-02 | 2016-09-13 | Emotient, Inc. | Filter and shutter based on image emotion content |
EP3035667B1 (en) * | 2013-08-12 | 2024-06-19 | Nikon Corporation | Electronic device |
JP2015056700A (ja) * | 2013-09-10 | 2015-03-23 | Toshiba Corp | Imaging element, imaging device, and semiconductor device |
JP2015091005A (ja) * | 2013-11-05 | 2015-05-11 | Olympus Corp | Imaging device |
JP2015185927A (ja) | 2014-03-20 | 2015-10-22 | Sony Corp | Imaging element, control method, and imaging device |
WO2015182751A1 (ja) * | 2014-05-30 | 2015-12-03 | Hitachi Kokusai Electric Inc | Monitoring system and camera device |
JP2016010125A (ja) * | 2014-06-26 | 2016-01-18 | Sony Corp | Signal processing device, control method, imaging element, and electronic device |
US10609862B2 (en) * | 2014-09-23 | 2020-04-07 | Positec Technology (China) Co., Ltd. | Self-moving robot |
US9940533B2 (en) * | 2014-09-30 | 2018-04-10 | Qualcomm Incorporated | Scanning window for isolating pixel values in hardware for computer vision operations |
JP6316976B2 (ja) | 2014-09-30 | 2018-04-25 | Hitachi Automotive Systems Ltd | In-vehicle image recognition device |
EP3213288B8 (en) * | 2014-10-30 | 2019-11-06 | Verizon Patent and Licensing Inc. | Parking and traffic analysis |
JP6440303B2 (ja) * | 2014-12-02 | 2018-12-19 | NTT Comware Corp | Object recognition device, object recognition method, and program |
US20160282626A1 (en) * | 2015-03-27 | 2016-09-29 | Osterhout Group, Inc. | See-through computer display systems |
CN107710731B (zh) | 2015-06-26 | 2021-05-04 | Maxell Ltd | Imaging device and image processing method |
JP2017017624A (ja) | 2015-07-03 | 2017-01-19 | Sony Corp | Imaging element, image processing method, and electronic device |
US20170041540A1 (en) | 2015-08-04 | 2017-02-09 | Ronald B Foster | Energy-efficient secure vision processing applying object detection algorithms |
KR102460838B1 (ko) * | 2015-08-28 | 2022-10-28 | 삼성전자주식회사 | 얼굴 검출을 이용한 카메라의 자동 초점 조절 방법 및 카메라 제어 장치 |
JP6743882B2 (ja) | 2016-03-14 | 2020-08-19 | 株式会社リコー | 画像処理装置、機器制御システム、撮像装置、画像処理方法及びプログラム |
KR102462644B1 (ko) * | 2016-04-01 | 2022-11-03 | 삼성전자주식회사 | 전자 장치 및 그의 동작 방법 |
US11631005B2 (en) * | 2016-05-31 | 2023-04-18 | Nokia Technologies Oy | Method and apparatus for detecting small objects with an enhanced deep neural network |
CN109691079B (zh) | 2016-09-16 | 2021-05-14 | Sony Semiconductor Solutions Corporation | Imaging device and electronic device |
- 2017
- 2017-09-01 CN CN201780055363.7A patent/CN109691079B/zh active Active
- 2017-09-01 EP EP22172106.1A patent/EP4064679A1/en active Pending
- 2017-09-01 KR KR1020227014481A patent/KR20220058975A/ko not_active Application Discontinuation
- 2017-09-01 CN CN202110539020.6A patent/CN113271400B/zh active Active
- 2017-09-01 KR KR1020227015678A patent/KR102526559B1/ko active IP Right Grant
- 2017-09-01 EP EP22172135.0A patent/EP4064680A1/en active Pending
- 2017-09-01 US US16/330,468 patent/US10795024B2/en active Active
- 2017-09-01 WO PCT/JP2017/031539 patent/WO2018051809A1/ja unknown
- 2017-09-01 KR KR1020227002597A patent/KR102490799B1/ko active IP Right Grant
- 2017-09-01 JP JP2018539628A patent/JP6633216B2/ja active Active
- 2017-09-01 KR KR1020237013656A patent/KR102647268B1/ko active IP Right Grant
- 2017-09-01 KR KR1020197006833A patent/KR102374013B1/ko active IP Right Grant
- 2017-09-01 EP EP17850712.5A patent/EP3515057B1/en active Active
- 2019
- 2019-12-11 JP JP2019223950A patent/JP7105754B2/ja active Active
- 2021
- 2021-03-01 JP JP2021031510A patent/JP6937443B2/ja active Active
- 2022
- 2022-03-23 US US17/702,172 patent/US20220214458A1/en active Pending
- 2022-04-12 US US17/718,504 patent/US20220244387A1/en active Pending
- 2022-04-15 US US17/722,117 patent/US12061264B2/en active Active
- 2022-05-25 JP JP2022085048A patent/JP7342197B2/ja active Active
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003283930A (ja) * | 2002-03-27 | 2003-10-03 | Sony Corp | Exposure control method, exposure control circuit, imaging device, program, and storage medium |
JP2006067291A (ja) * | 2004-08-27 | 2006-03-09 | Canon Inc | Imaging device |
JP2007174160A (ja) * | 2005-12-21 | 2007-07-05 | Konica Minolta Photo Imaging Inc | Imaging device |
JP2008048313A (ja) | 2006-08-21 | 2008-02-28 | Sony Corp | Physical quantity detection device, method for driving physical quantity detection device, and imaging device |
JP2008070120A (ja) * | 2006-09-12 | 2008-03-27 | Hitachi Ltd | Distance measuring device |
JP2009081808A (ja) * | 2007-09-27 | 2009-04-16 | Fujifilm Corp | Imaging control device, imaging control method, imaging control program, and imaging device |
JP2011023898A (ja) * | 2009-07-14 | 2011-02-03 | Panasonic Corp | Display device, display method, and integrated circuit |
JP2014082365A (ja) * | 2012-10-17 | 2014-05-08 | Canon Inc | Semiconductor device |
Cited By (73)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112544073B (zh) * | 2018-07-31 | 2024-08-27 | Sony Semiconductor Solutions Corporation | Imaging device and vehicle control system |
JP2020025268A (ja) * | 2018-07-31 | 2020-02-13 | Sony Semiconductor Solutions Corporation | Solid-state imaging device and electronic device |
WO2020027229A1 (ja) * | 2018-07-31 | 2020-02-06 | Sony Semiconductor Solutions Corporation | Solid-state imaging device and electronic device |
WO2020027233A1 (ja) | 2018-07-31 | 2020-02-06 | Sony Semiconductor Solutions Corporation | Imaging device and vehicle control system |
CN112470461B (zh) * | 2018-07-31 | 2024-09-06 | Sony Semiconductor Solutions Corporation | Stacked light-receiving sensor and in-vehicle imaging device |
JP2020025261A (ja) * | 2018-07-31 | 2020-02-13 | Sony Semiconductor Solutions Corporation | Solid-state imaging device and electronic device |
JP2020025265A (ja) * | 2018-07-31 | 2020-02-13 | Sony Semiconductor Solutions Corporation | Solid-state imaging device and imaging device |
JP2020182219A (ja) * | 2018-07-31 | 2020-11-05 | Sony Semiconductor Solutions Corporation | Solid-state imaging device, electronic device, and control method |
JP7414869B2 (ja) | 2018-07-31 | 2024-01-16 | Sony Semiconductor Solutions Corporation | Solid-state imaging device, electronic device, and method for controlling solid-state imaging device |
JP2021013175A (ja) * | 2018-07-31 | 2021-02-04 | Sony Semiconductor Solutions Corporation | Imaging device and vehicle control system |
US11665442B2 (en) | 2018-07-31 | 2023-05-30 | Sony Semiconductor Solutions Corporation | Solid-state imaging device and electronic device |
US11643014B2 (en) | 2018-07-31 | 2023-05-09 | Sony Semiconductor Solutions Corporation | Image capturing device and vehicle control system |
US11735614B2 (en) | 2018-07-31 | 2023-08-22 | Sony Semiconductor Solutions Corporation | Stacked light-receiving sensor and electronic device |
JP7090666B2 (ja) | 2018-07-31 | 2022-06-24 | Sony Semiconductor Solutions Corporation | Solid-state imaging device, electronic device, and control method |
EP3833006A4 (en) * | 2018-07-31 | 2021-09-01 | Sony Semiconductor Solutions Corporation | STRATIFIED LIGHT-RECEIVING SENSOR AND IN-VEHICLE IMAGING DEVICE |
US11350046B2 (en) | 2018-07-31 | 2022-05-31 | Sony Semiconductor Solutions Corporation | Solid-state imaging device and electronic device |
WO2020027230A1 (ja) | 2018-07-31 | 2020-02-06 | Sony Semiconductor Solutions Corporation | Stacked light-receiving sensor and in-vehicle imaging device |
WO2020027074A1 (ja) * | 2018-07-31 | 2020-02-06 | Sony Semiconductor Solutions Corporation | Solid-state imaging device and electronic device |
US20210266488A1 (en) * | 2018-07-31 | 2021-08-26 | Sony Semiconductor Solutions Corporation | Stacked light-receiving sensor and electronic device |
CN112470461A (zh) * | 2018-07-31 | 2021-03-09 | Sony Semiconductor Solutions Corporation | Stacked light-receiving sensor and in-vehicle imaging device |
KR20210029202A (ko) | 2018-07-31 | 2021-03-15 | Sony Semiconductor Solutions Corporation | Stacked light-receiving sensor and vehicle-mounted imaging device |
KR20210030359A (ko) | 2018-07-31 | 2021-03-17 | Sony Semiconductor Solutions Corporation | Imaging device and vehicle control system |
CN112544073A (zh) * | 2018-07-31 | 2021-03-23 | Sony Semiconductor Solutions Corporation | Imaging device and vehicle control system |
JP2022081504A (ja) * | 2018-07-31 | 2022-05-31 | Sony Semiconductor Solutions Corporation | Solid-state imaging device, electronic device, and method for controlling solid-state imaging device |
JP7423491B2 (ja) | 2018-07-31 | 2024-01-29 | Sony Semiconductor Solutions Corporation | Solid-state imaging device and vehicle control system |
US12069386B2 (en) | 2018-07-31 | 2024-08-20 | Sony Semiconductor Solutions Corporation | Solid-state imaging device and electronic device |
EP3833007A4 (en) * | 2018-07-31 | 2021-08-25 | Sony Semiconductor Solutions Corporation | LAYERED LIGHT RECEIVING SENSOR AND ELECTRONIC DEVICE |
US11820289B2 (en) | 2018-07-31 | 2023-11-21 | Sony Semiconductor Solutions Corporation | Solid-state imaging device and electronic device |
WO2020027161A1 (ja) | 2018-07-31 | 2020-02-06 | Sony Semiconductor Solutions Corporation | Stacked light-receiving sensor and electronic device |
US11983931B2 (en) | 2018-07-31 | 2024-05-14 | Sony Semiconductor Solutions Corporation | Image capturing device and vehicle control system |
JP7380180B2 (ja) | 2018-08-31 | 2023-11-15 | Sony Group Corporation | Solid-state imaging element, imaging device, imaging method, and imaging program |
JP2020043615A (ja) * | 2018-08-31 | 2020-03-19 | Sony Corporation | Imaging device, imaging system, imaging method, and imaging program |
JP2020120406A (ja) * | 2018-08-31 | 2020-08-06 | Sony Semiconductor Solutions Corporation | Solid-state imaging device, information processing system, solid-state imaging method, and program |
JP2020038410A (ja) * | 2018-08-31 | 2020-03-12 | Sony Semiconductor Solutions Corporation | Solid-state imaging device, information processing device, information processing system, information processing method, and program |
WO2020059464A1 (ja) * | 2018-09-21 | 2020-03-26 | Sony Semiconductor Solutions Corporation | Solid-state imaging system, solid-state imaging device, information processing device, image processing method, information processing method, and program |
US12079712B2 (en) | 2018-09-21 | 2024-09-03 | Sony Semiconductor Solutions Corporation | Solid state image capturing system, solid state image capturing device, information processing device, image processing method, information processing method |
JP2020047191A (ja) * | 2018-09-21 | 2020-03-26 | Sony Semiconductor Solutions Corporation | Solid-state imaging system, solid-state imaging device, information processing device, image processing method, and program |
US20210385403A1 (en) * | 2018-10-31 | 2021-12-09 | Sony Semiconductor Solutions Corporation | Stacked light receiving sensor and electronic apparatus |
EP3876521A4 (en) * | 2018-10-31 | 2021-12-29 | Sony Semiconductor Solutions Corporation | Stacked light receiving sensor and electronic device |
CN112889267A (zh) * | 2018-10-31 | 2021-06-01 | 索尼半导体解决方案公司 | 堆叠式光接收传感器和电子装置 |
CN112889267B (zh) * | 2018-10-31 | 2024-08-06 | 索尼半导体解决方案公司 | 堆叠式光接收传感器和电子装置 |
US11792551B2 (en) | 2018-10-31 | 2023-10-17 | Sony Semiconductor Solutions Corporation | Stacked light receiving sensor and electronic apparatus |
WO2020090509A1 (ja) | 2018-10-31 | 2020-05-07 | Sony Semiconductor Solutions Corporation | Stacked light-receiving sensor and electronic device |
US12010418B2 (en) | 2019-01-08 | 2024-06-11 | Sony Group Corporation | Solid-state imaging element, signal processing method thereof, and electronic device |
WO2020145142A1 (ja) * | 2019-01-08 | 2020-07-16 | Sony Corporation | Solid-state imaging element, signal processing method therefor, and electronic device |
US11889170B2 (en) | 2019-02-19 | 2024-01-30 | Sony Semiconductor Solutions Corporation | Imaging device, electronic equipment, and imaging method that change processing based on detected temperature |
WO2020170890A1 (ja) | 2019-02-19 | 2020-08-27 | Sony Semiconductor Solutions Corporation | Imaging device, electronic device, and imaging method |
JPWO2020170729A1 (ja) * | 2019-02-20 | 2021-12-02 | 富士フイルム株式会社 | 撮像素子、撮像装置、撮像素子の作動方法、及びプログラム |
WO2020170729A1 (ja) * | 2019-02-20 | 2020-08-27 | FUJIFILM Corporation | Imaging element, imaging device, method for operating imaging element, and program |
JP7022866B2 (ja) | 2019-02-20 | 2022-02-18 | FUJIFILM Corporation | Imaging element, imaging device, method for operating imaging element, and program |
US11812177B2 (en) | 2019-02-20 | 2023-11-07 | Fujifilm Corporation | Imaging element, imaging apparatus, operation method of imaging element, and program |
JP7477543B2 (ja) | 2019-02-20 | 2024-05-01 | FUJIFILM Corporation | Imaging element, imaging device, method for operating imaging element, and program |
US20220385809A1 (en) * | 2019-10-11 | 2022-12-01 | Sony Semiconductor Solutions Corporation | Imaging device, electronic apparatus, and imaging method |
WO2021070894A1 (ja) * | 2019-10-11 | 2021-04-15 | Sony Semiconductor Solutions Corporation | Imaging device, electronic device, and imaging method |
US11962916B2 (en) | 2019-10-11 | 2024-04-16 | Sony Semiconductor Solutions Corporation | Imaging device with two signal processing circuitry partly having a same type of signal processing, electronic apparatus including imaging device, and imaging method |
WO2021075352A1 (ja) * | 2019-10-18 | 2021-04-22 | Sony Corporation | Imaging device and electronic device |
WO2021075292A1 (ja) * | 2019-10-18 | 2021-04-22 | Sony Semiconductor Solutions Corporation | Light-receiving device, electronic device, and light-receiving method |
WO2021084814A1 (ja) | 2019-10-31 | 2021-05-06 | Sony Corporation | Microparticle collection method, microchip for microparticle sorting, microparticle collection device, emulsion production method, and emulsion |
WO2021152877A1 (ja) | 2020-01-30 | 2021-08-05 | Sony Semiconductor Solutions Corporation | Solid-state imaging device, electronic device, and imaging system |
WO2021161893A1 (ja) | 2020-02-14 | 2021-08-19 | Sony Group Corporation | Analysis method, analysis system, and analysis surface |
JP2021140032A (ja) * | 2020-03-05 | 2021-09-16 | Sony Group Corporation | Signal acquisition device, signal acquisition system, and signal acquisition method |
JP7494490B2 (ja) | 2020-03-05 | 2024-06-04 | Sony Group Corporation | Signal acquisition device, signal acquisition system, and signal acquisition method |
WO2021187350A1 (ja) * | 2020-03-19 | 2021-09-23 | Sony Semiconductor Solutions Corporation | Solid-state imaging device |
US12101571B2 (en) | 2020-03-19 | 2024-09-24 | Sony Semiconductor Solutions Corporation | Solid-state imaging apparatus for performing noise reduction processing on image data |
US12108183B2 (en) | 2020-03-19 | 2024-10-01 | Sony Semiconductor Solutions Corporation | Solid-state imaging apparatus |
JP2021176242A (ja) * | 2020-04-23 | 2021-11-04 | Sony Semiconductor Solutions Corporation | Solid-state imaging device, method for operating solid-state imaging device, and program |
JP7105976B2 (ja) | 2020-04-23 | 2022-07-25 | Sony Semiconductor Solutions Corporation | Solid-state imaging device, method for operating solid-state imaging device, and program |
WO2022009642A1 (ja) | 2020-07-06 | 2022-01-13 | Sony Group Corporation | Bioparticle analysis method and bioparticle analysis system |
WO2022014643A1 (en) | 2020-07-14 | 2022-01-20 | Sony Group Corporation | Fine-particle sorting apparatus, fine-particle sorting method, program, and fine particle-sorting system |
WO2023157651A1 (ja) * | 2022-02-17 | 2023-08-24 | Sony Semiconductor Solutions Corporation | Imaging device and signal processing method |
WO2023182049A1 (ja) | 2022-03-23 | 2023-09-28 | Sony Semiconductor Solutions Corporation | Image sensor and data structure |
WO2023218936A1 (ja) * | 2022-05-10 | 2023-11-16 | Sony Semiconductor Solutions Corporation | Image sensor, information processing method, and program |
WO2024014278A1 (ja) * | 2022-07-11 | 2024-01-18 | Sony Semiconductor Solutions Corporation | Imaging device and data output method |
Also Published As
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7342197B2 (ja) | Imaging device and method for controlling imaging device | |
WO2020116185A1 (ja) | Solid-state imaging device, signal processing chip, and electronic device | |
WO2017175492A1 (ja) | Image processing device, image processing method, computer program, and electronic device | |
JP7566761B2 (ja) | Imaging device | |
KR102493027B1 (ko) | Imaging element, driving method therefor, and electronic device | |
JP7144926B2 (ja) | Imaging control device, imaging device, and method for controlling imaging control device | |
WO2018139187A1 (ja) | Solid-state imaging device, driving method therefor, and electronic device | |
US12130362B2 (en) | Imaging device and electronic device | |
WO2019208204A1 (ja) | Imaging device | |
US20200014899A1 (en) | Imaging device, imaging system, and method of controlling imaging device | |
WO2017199487A1 (ja) | Control device, control method, and program | |
US20240078803A1 (en) | Information processing apparatus, information processing method, computer program, and sensor apparatus | |
US11818470B2 (en) | Image generation device, image generation method, and vehicle control system | |
WO2022019025A1 (ja) | Information processing device, information processing system, information processing method, and information processing program | |
JP2022163882A (ja) | Signal processing device and method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 17850712; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2018539628; Country of ref document: JP; Kind code of ref document: A |
| ENP | Entry into the national phase | Ref document number: 20197006833; Country of ref document: KR; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |
| ENP | Entry into the national phase | Ref document number: 2017850712; Country of ref document: EP; Effective date: 20190416 |