WO2017169233A1 - Imaging processing device, imaging processing method, computer program, and electronic device - Google Patents

Imaging processing device, imaging processing method, computer program, and electronic device

Info

Publication number
WO2017169233A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
exposure
imaging processing
processing apparatus
exposure time
Prior art date
Application number
PCT/JP2017/005575
Other languages
English (en)
Japanese (ja)
Inventor
淳 橋爪
真生 全
洋輔 田中
中島 務
Original Assignee
ソニー株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニー株式会社 filed Critical ソニー株式会社
Publication of WO2017169233A1

Links

Images

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B7/00 Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
    • G03B7/08 Control effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/50 Control of the SSIS exposure
    • H04N25/57 Control of the dynamic range
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70 SSIS architectures; Circuits associated therewith
    • H04N25/76 Addressed sensors, e.g. MOS or CMOS sensors

Definitions

  • The present disclosure relates to an imaging processing device, an imaging processing method, a computer program, and an electronic device.
  • Some digital still cameras have a function of setting a plurality of frames in advance and capturing a plurality of consecutive frames while automatically changing the settings one by one, starting from a predetermined imaging trigger.
  • Among such functions, a function of capturing images while changing exposure-related settings is known as AEB (Automatic Exposure Bracketing).
  • Existing AEB captures multiple frames at the same frame rate. When the exposure time is short, a large interval therefore opens between frames. Blur (subject blurring) becomes noticeable when captured images with large inter-frame intervals are combined. Blurring can be suppressed by narrowing the readout interval, but this shortens the upper limit of the maximum exposure time and narrows the settable range of exposure time.
  • The present disclosure therefore proposes a new and improved imaging processing apparatus, imaging processing method, computer program, and electronic device capable of simultaneously suppressing blur in a composite image and widening the settable range of exposure time when imaging a plurality of frames with a single shutter trigger.
  • According to the present disclosure, there is provided an imaging processing apparatus comprising a control unit that generates a first image by exposing pixels by a first exposure with a first exposure time, and generates a second image by exposing the pixels by a second exposure with a second exposure time following the first image, wherein the control unit minimizes an interval between a read start of the first exposure and a shutter start of the second exposure in a predetermined row of the pixels.
  • According to the present disclosure, there is also provided an imaging processing method including generating a first image by exposing pixels by a first exposure with a first exposure time, generating a second image by exposing the pixels by a second exposure with a second exposure time following the first image, and minimizing an interval between the read start of the first exposure and the shutter start of the second exposure in a predetermined row of the pixels.
  • According to the present disclosure, there is further provided a computer program that causes a computer to generate a first image by exposing pixels by a first exposure with a first exposure time, generate a second image by exposing the pixels by a second exposure with a second exposure time following the first image, and minimize the interval between the read start of the first exposure and the shutter start of the second exposure in a predetermined row of the pixels.
  • According to the present disclosure, there is also provided an electronic apparatus including the imaging processing device.
  • As described above, according to the present disclosure, a new and improved imaging processing apparatus, imaging processing method, computer program, and electronic apparatus can be provided.
  • FIG. 5 is an explanatory diagram illustrating a functional configuration example of an electronic device 10 according to an embodiment of the present disclosure. FIG. 6 is an explanatory diagram illustrating a configuration example of a sensor module 100 included in an imaging unit 11. FIG. 7 is an explanatory diagram illustrating a functional configuration example of the sensor module 100 according to the embodiment. FIG. 8 is a flowchart illustrating an operation example of the sensor module 100 according to the embodiment.
  • FIGS. 9 and 12 are explanatory diagrams illustrating examples of the AEB function executed by the sensor module 100 according to the embodiment. FIG. 10 is an explanatory diagram illustrating an effect example of the sensor module 100 according to the embodiment.
  • FIG. 11 is an explanatory diagram illustrating an example of a pixel reset period in the sensor module 100 according to the embodiment.
  • FIG. 13 is an explanatory diagram illustrating an example of frame rate conversion processing by the sensor module 100 according to the embodiment. FIG. 14 is an explanatory diagram illustrating an example in which the sensor module 100 according to the embodiment shortens the time required for the shutter and readout of the pixels.
  • 1. Embodiment of the present disclosure [1.1. Overview] Before describing the embodiment of the present disclosure in detail, an outline of the embodiment will be given.
  • As noted above, some digital still cameras have a function of holding settings for a plurality of frames in advance and capturing a plurality of continuous frames while automatically changing the settings one by one, starting from a predetermined imaging trigger.
  • Among such functions, a function of capturing images while changing exposure-related settings such as exposure time, gain, flash, and high-sensitivity mode is referred to as AEB.
  • FIG. 1 is an explanatory diagram illustrating a conventional AEB function that captures a plurality of frames at the same frame rate.
  • The solid lines in FIG. 1 indicate the shutter timing for the image sensor, and the broken lines indicate the read timing for reading data from the image sensor.
  • In the conventional AEB function, data is read from the pixels at a fixed interval as shown in FIG. 1. The shutter timing of the pixels must therefore be matched to the timing of reading data from the pixels. Consequently, as shown in FIG. 1, depending on the exposure time, a large blank may occur between the data read timing of the previous frame and the shutter timing of the next frame, as the sketch below illustrates.
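  • As a concrete illustration of that blank, the following minimal Python sketch (ours, not from the patent; all timing values are hypothetical) computes the idle gap that fixed-frame-rate AEB leaves between the read of one frame and the shutter of the next for a given pixel row.

```python
# Conventional AEB at a fixed frame rate: for a given pixel row, the read of
# frame n happens at n * frame_period, and the shutter of frame n must open
# exposure[n] before its own read. All times in ms; values are illustrative.
frame_period_ms = 33.3                  # fixed V-sync interval (~30 fps)
exposures_ms = [30.0, 8.0, 2.0, 0.5]    # hypothetical AEB bracket

for n, exp in enumerate(exposures_ms):
    read_time = n * frame_period_ms     # read timing of frame n (fixed)
    shutter_time = read_time - exp      # shutter must open this early
    if n > 0:
        prev_read = (n - 1) * frame_period_ms
        gap = shutter_time - prev_read  # blank between previous read and shutter
        print(f"frame {n}: exposure {exp:5.1f} ms, idle gap {gap:5.1f} ms")

# The shorter the exposure, the larger the idle gap; this gap is what makes
# subject blur noticeable when the bracketed frames are composited.
```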
  • FIG. 2 is an explanatory diagram illustrating a conventional AEB function that generates images in a plurality of continuous frames by reading data from the pixels at the timing of a vertical synchronization signal (V-sync signal).
  • FIG. 2 shows images with different exposure times being generated in four consecutive frames f1 to f4. Note that at least a reset period for resetting the pixels is required between the read and the shutter, so the exposure interval includes this pixel reset period. When images are generated in a plurality of continuous frames by reading data from the pixels at the timing of the vertical synchronization signal in this way, the exposure interval becomes longer depending on the exposure time.
  • FIG. 3 is likewise an explanatory diagram illustrating a conventional AEB function that generates images in a plurality of continuous frames by reading data from the pixels at the timing of the vertical synchronization signal.
  • Like FIG. 2, FIG. 3 shows images with different exposure times being generated in four consecutive frames f1 to f4.
  • When the exposure time is shorter than the interval of the vertical synchronization signal, the exposure interval between the data read of the previous frame and the shutter of the next frame becomes longer.
  • When the exposure interval is long, blurring of the subject may occur when the plurality of images are combined.
  • FIG. 4 is an explanatory diagram showing an example of combining a plurality of images captured with open exposure intervals.
  • FIG. 4 shows an example in which imaging with a long exposure time and imaging with a short exposure time are repeated, and an image captured with the long exposure time is combined with an image captured with the short exposure time.
  • In the following description, imaging with a long exposure time is also referred to as "long accumulation", and imaging with a short exposure time is also referred to as "short accumulation".
  • As described above, when the exposure time is shorter than the interval of the vertical synchronization signal, the exposure interval between the data read of the previous frame and the shutter of the next frame becomes longer. Therefore, when a moving subject is imaged continuously and an image by long accumulation is combined with an image by short accumulation, blurring of the combined image increases as shown in FIG. 4.
  • To shorten the exposure interval, the data reading interval from the pixels, that is, the interval of the vertical synchronization signal, may be narrowed.
  • However, narrowing the interval of the vertical synchronization signal shortens the upper limit of the maximum exposure time and narrows the settable range of the exposure time. If the settable range of the exposure time is narrowed, it becomes impossible to generate an image by long accumulation as shown in FIG. 4.
  • In view of the above, the present inventors intensively studied a technique capable of simultaneously suppressing blur and widening the settable range of exposure time.
  • As a result, the present inventors devised a technique that can simultaneously suppress blur and widen the settable range of exposure time by minimizing the data reading interval from the pixels.
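  • The following sketch (our illustration of this idea, with hypothetical numbers) shows the resulting timeline when the interval of the vertical synchronization signal for each frame is set to its exposure time plus the pixel reset period, so that the read of one frame and the shutter of the next are back to back for a given pixel row.

```python
# Proposed scheme (illustrative model): the next shutter opens one reset
# period after the previous read, so the V-sync interval of each frame is
# exposure + reset instead of a fixed frame period. All times in ms.
RESET_MS = 0.1                        # hypothetical pixel reset period
exposures_ms = [30.0, 8.0, 2.0, 0.5]  # hypothetical AEB bracket

t = 0.0                               # read time of the previous frame
for n, exp in enumerate(exposures_ms):
    shutter = t + RESET_MS            # shutter opens right after previous read
    read = shutter + exp              # read happens when the exposure ends
    print(f"frame {n}: shutter {shutter:6.2f} ms, read {read:6.2f} ms")
    t = read

# The inter-frame gap never exceeds the reset period, which suppresses blur,
# while a long exposure simply stretches its own V-sync interval, so the
# settable exposure range is not capped by a fixed frame period.
```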
  • FIG. 5 is an explanatory diagram illustrating a functional configuration example of the electronic device 10 according to the embodiment of the present disclosure.
  • A functional configuration example of the electronic device 10 according to the embodiment of the present disclosure will be described with reference to FIG. 5.
  • As shown in FIG. 5, the electronic device 10 includes an imaging unit 11, an image processing unit 12, a display unit 13, a control unit 14, a storage unit 15, and an operation unit 16.
  • The imaging unit 11 includes a lens, a sensor module, and the like, and accumulates electrons for a predetermined period according to the image formed on the light-receiving surface of the sensor module through the lens.
  • The imaging unit 11 performs predetermined signal processing on the signal corresponding to the accumulated electrons, and outputs the processed signal to the image processing unit 12.
  • The configuration of the sensor module included in the imaging unit 11 will be described in detail later.
  • As the predetermined signal processing, the imaging unit 11 may perform signal processing such as camera shake correction processing by an electronic image stabilization method, automatic white balance processing, automatic exposure processing, distortion correction processing, defect correction processing, noise reduction processing, and high dynamic range composition processing.
  • The image processing unit 12 is configured by, for example, an application processor (AP), and executes image processing using the signals output from the imaging unit 11.
  • The image processing executed by the image processing unit 12 includes, for example, demosaic processing using the signals output from the imaging unit 11, display processing of the demosaiced image on the display unit 13, and storage processing in the storage unit 15.
  • The display unit 13 is a display device configured by, for example, a liquid crystal display or an organic EL display. The display contents of the display unit 13 are controlled by the control unit 14. For example, based on the control of the control unit 14, the display unit 13 displays images captured by the imaging unit 11 and processed by the image processing unit 12.
  • The control unit 14 includes a processor such as a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and the like, and controls the operation of each unit of the electronic device 10.
  • The storage unit 15 is configured by a storage medium such as a flash memory or another nonvolatile memory.
  • The storage unit 15 stores images captured by the imaging unit 11 and processed by the image processing unit 12.
  • The images stored in the storage unit 15 can be displayed on the display unit 13 in accordance with operations by the user of the electronic device 10.
  • The operation unit 16 is a device for operating the electronic device 10 and includes, for example, buttons and a touch panel.
  • The touch panel is provided on the display surface of the display unit 13.
  • When the user of the electronic device 10 wants to record an image captured by the imaging unit 11 in the electronic device 10, the user generates a shutter trigger by operating a predetermined button of the operation unit 16.
  • When the imaging unit 11 and the image processing unit 12 detect the occurrence of the shutter trigger, they execute processing for recording an image in the electronic device 10 in response to the shutter trigger.
  • The electronic device 10 shown in FIG. 5 is not limited to a specific device, and can take various forms such as a digital camera, a smartphone, a tablet terminal, a portable music player, or a game machine.
  • FIG. 6 is an explanatory diagram illustrating a configuration example of the sensor module 100 included in the imaging unit 11.
  • The sensor module 100 according to the embodiment of the present disclosure is an example of the imaging processing apparatus of the present disclosure, and is configured by stacking three substrates as illustrated in FIG. 6.
  • The sensor module 100 according to the embodiment of the present disclosure has a configuration in which a pixel substrate 110, a memory substrate 120, and a signal processing substrate 130 are stacked in this order.
  • The pixel substrate 110 is a substrate having an image sensor composed of a pixel region in which unit pixels are formed in an array. Each unit pixel receives light from a subject, photoelectrically converts the incident light, accumulates charge, and outputs it as a pixel signal at a predetermined timing. The pixel signals output from the pixel substrate 110 are stored in the memory substrate 120, and signal processing is performed on them in the signal processing substrate 130.
  • The pixel substrate 110 includes an AD converter that converts analog signals into digital signals, so the pixel signals output from the pixel substrate 110 are digital signals.
  • The memory substrate 120 is a substrate having a memory, such as a DRAM (Dynamic Random Access Memory), that temporarily stores the pixel signals output from the pixel substrate 110.
  • The memory substrate 120 has a capacity capable of temporarily storing pixel signals of a plurality of frames.
  • The pixel signals stored in the memory substrate 120 are read based on a read command from the signal processing substrate 130.
  • The signal processing substrate 130 performs various signal processing on the pixel signals stored in the memory substrate 120.
  • The signal processing executed by the signal processing substrate 130 is signal processing related to image quality, performed on the pixel signals stored in the memory substrate 120.
  • For example, the signal processing substrate 130 can execute signal processing such as camera shake correction processing by an electronic image stabilization method, automatic white balance processing, automatic exposure processing, distortion correction processing, defect correction processing, noise reduction processing, and high dynamic range composition processing.
  • The signal processing substrate 130 can also execute composition processing of images captured in a plurality of frames.
  • Note that the present disclosure is not limited to this stacking order.
  • For example, the sensor module 100 may have a configuration in which the pixel substrate 110, the signal processing substrate 130, and the memory substrate 120 are stacked in this order.
  • The configuration example of the sensor module 100 has been described above with reference to FIG. 6. Next, a functional configuration example of the sensor module 100 will be described.
  • FIG. 7 is an explanatory diagram illustrating a functional configuration example of the sensor module 100 according to the embodiment of the present disclosure.
  • A functional configuration example of the sensor module 100 according to the embodiment of the present disclosure will be described with reference to FIG. 7.
  • The pixel substrate 110 includes an image sensor 111 having a pixel region in which unit pixels are formed in an array, and a control unit 112 that supplies predetermined clock signals and timing signals to the image sensor 111.
  • A pixel signal output from the image sensor 111 in response to a signal from the control unit 112 is sent once to the signal processing substrate 130 and then to the memory substrate 120.
  • In the present embodiment, a CMOS image sensor is used as the image sensor 111, and the pixel signals generated by exposure are read out by a rolling shutter method.
  • The control unit 112 supplies a timing signal to the image sensor 111 so as to minimize the interval between the read start of the exposure in one frame (first exposure) and the shutter start of the exposure in the next frame (second exposure). By supplying this timing signal to the image sensor 111, the control unit 112 enables the sensor module 100 according to the embodiment of the present disclosure to simultaneously suppress blur and widen the settable range of exposure time.
  • The memory substrate 120 includes an image storage unit 121 configured by a DRAM (Dynamic Random Access Memory) or the like.
  • The image storage unit 121 temporarily stores the pixel signals output from the image sensor 111.
  • The image storage unit 121 has a capacity capable of temporarily storing pixel signals of a plurality of frames.
  • The pixel signals stored in the image storage unit 121 are read based on a read command from the signal processing substrate 130.
  • The signal processing substrate 130 includes a pre-processing unit 131 and a post-processing unit 132.
  • The pre-processing unit 131 performs signal processing on the pixel signals output from the image sensor 111, and stores the processed pixel signals in the image storage unit 121.
  • The signal processing executed by the pre-processing unit 131 can include, for example, gain adjustment processing, clamping processing, and pixel addition processing.
  • The post-processing unit 132 performs signal processing on the pixel signals stored in the image storage unit 121, and outputs the processed pixel signals to the image processing unit 12.
  • The signal processing executed by the post-processing unit 132 can include, for example, automatic white balance processing, automatic exposure processing, distortion correction processing, defect correction processing, noise reduction processing, and high dynamic range composition processing. The post-processing unit 132 can also execute composition processing of images captured in a plurality of frames.
  • The functional configuration example of the sensor module 100 according to the embodiment of the present disclosure has been described above.
  • With this configuration, the sensor module 100 according to the embodiment of the present disclosure can, like the AEB function, simultaneously suppress blurring of composite images and widen the settable range of exposure time when performing imaging in a plurality of frames with a single shutter trigger.
  • FIG. 8 is a flowchart illustrating an operation example of the sensor module 100 according to the embodiment of the present disclosure.
  • FIG. 8 illustrates the operation of the sensor module 100 when a plurality of images are captured continuously with a single shutter trigger, as in the AEB function.
  • When M images are captured continuously, the sensor module 100 first determines whether the exposure time of the (N + 1)-th image is longer than the read time of the N-th image (N is an integer of 1 or more and less than M) (step S101).
  • The determination in step S101 can be executed by the control unit 112, for example.
  • Here, whether the exposure time of the (N + 1)-th image is longer than the read time of the N-th image means whether the exposure time of the (N + 1)-th image is longer than the time required to read the N-th image from the first line to the last line of the image sensor 111.
  • If the exposure time of the (N + 1)-th image is longer than the read time of the N-th image (step S101, Yes), the sensor module 100 minimizes, for each row of pixels provided in the image sensor 111, the interval between the read start of the N-th image and the shutter start of the (N + 1)-th image (step S102).
  • The control in step S102 can be executed by the control unit 112, for example.
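  • The branch in steps S101 to S103 can be summarized by the following sketch (our reading of the flowchart in FIG. 8; the function and argument names are ours, not from the patent).

```python
def read_to_shutter_interval_ms(read_time_ms: float,
                                next_exposure_ms: float,
                                reset_ms: float) -> float:
    """Interval between the read start of image N and the shutter start of
    image N + 1 for a given pixel row (illustrative model of FIG. 8)."""
    if next_exposure_ms > read_time_ms:       # step S101, Yes
        return reset_ms                       # step S102: bare reset period
    # Step S101, No -> step S103: the read of image N + 1 starts
    # interval + exposure after the read start of image N, and must not
    # begin before the read of image N has finished.
    return max(reset_ms, read_time_ms - next_exposure_ms)

print(read_to_shutter_interval_ms(10.0, 30.0, 0.1))  # long exposure -> 0.1
print(read_to_shutter_interval_ms(10.0, 2.0, 0.1))   # short exposure -> 8.0
```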
  • FIG. 9 is an explanatory diagram illustrating an example of the AEB function executed by the sensor module 100 according to the embodiment of the present disclosure.
  • FIG. 9 shows images with different exposure times being generated in four consecutive frames f1 to f4.
  • In the present embodiment, the sensor module 100 minimizes the interval between the read and the shutter.
  • Specifically, the interval between the read and the shutter is narrowed to an interval corresponding to the reset period for resetting the pixels provided in the image sensor 111. In this embodiment, therefore, as shown in FIG. 9, the interval of the vertical synchronization signal is not constant.
  • For example, the interval between the vertical synchronization signal that reads signals from the pixels in frame f1 and the vertical synchronization signal that reads signals from the pixels in frame f2 corresponds to the exposure time of the image in frame f2 plus the reset period.
  • FIG. 10 is an explanatory diagram showing an effect example of the sensor module 100 according to the present embodiment.
  • By minimizing the interval between the read and the shutter, the sensor module 100 according to the present embodiment can keep blurring of the combined image small when images by long accumulation and images by short accumulation are alternately captured and combined.
  • For example, the post-processing unit 132 executes this combining process. The blur is kept small because minimizing the interval between the read and the shutter minimizes the amount of movement of the subject between frames.
  • FIG. 11 is an explanatory diagram illustrating an example of the pixel reset period in the sensor module 100 according to the present embodiment.
  • A vertical synchronization signal and a horizontal synchronization signal are supplied to the pixels.
  • In the example of FIG. 11, one access unit of the image sensor 111 is 32 lines. One access unit here means that eight lines are read simultaneously, and AD conversion for converting the read analog data into digital data is performed four times.
  • In this case, the interval from the read to the shutter is set using the period required for the four AD conversions as the reset period. Note that the size of one access unit varies depending on the image sensor, and accordingly the reset period also varies depending on the image sensor.
  • When the factors affecting the reset period are organized, there are two: the number of AD conversions during one horizontal period, and the number of lines that can be read simultaneously.
  • The interval between the read and the shutter is thus narrowed to the interval corresponding to the reset period determined by these two factors.
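  • As a rough model of the above (ours; the per-conversion time is hypothetical), the reset period follows from those two factors as shown below.

```python
def reset_period_us(lines_per_access_unit: int,
                    lines_read_simultaneously: int,
                    t_ad_us: float) -> float:
    """Reset period modeled as the time of the AD conversions needed to
    drain one access unit (illustrative, not the patent's formula)."""
    ad_conversions = lines_per_access_unit // lines_read_simultaneously
    return ad_conversions * t_ad_us

# The example in the text: a 32-line access unit read as 8 simultaneous
# lines requires 4 AD conversions, and the reset period spans those 4.
print(reset_period_us(32, 8, 10.0))  # 40.0 µs, with a hypothetical 10 µs/AD
```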
  • On the other hand, if the exposure time of the (N + 1)-th image is not longer than the read time of the N-th image (step S101, No), the sensor module 100 minimizes, for each row of pixels provided in the image sensor 111, the interval between the read start of the N-th image and the shutter start of the (N + 1)-th image under the condition that the read of the N-th image and the read of the (N + 1)-th image do not overlap (step S103).
  • The control in step S103 can be executed by the control unit 112, for example.
  • This condition is needed because the read period of the N-th image and the read period of the (N + 1)-th image would overlap if the interval between the read start of the N-th image and the shutter start of the (N + 1)-th image were too short.
  • FIG. 12 is an explanatory diagram illustrating an example of the AEB function executed by the sensor module 100 according to the embodiment of the present disclosure.
  • FIG. 12 shows images with different exposure times being generated in four consecutive frames f1 to f4.
  • In the example of FIG. 12, the read period of frame f1 (read period 1) and the read period of frame f2 (read period 2) overlap.
  • Similarly, read period 2 overlaps the read period of frame f3 (read period 3), and read period 3 overlaps the read period of frame f4 (read period 4).
  • When imaging is performed under an exposure time condition in which the read periods of successive frames overlap, the sensor module 100 cannot output the image of the later frame to the outside (for example, the image processing unit 12) as it is.
  • In this case, the sensor module 100 may adjust the shutter start timing so that the read periods of successive frames do not overlap. That is, the sensor module 100 may widen the interval between the read of one frame and the shutter of the next beyond the time corresponding to the reset period.
  • By adjusting the shutter start timing in this way, the sensor module 100 can output images to the outside (for example, the image processing unit 12) even when frames with short exposure times occur.
  • However, the sensor module 100 according to the present embodiment includes the image storage unit 121. The sensor module 100 according to the present embodiment can therefore minimize the interval between the read and the shutter of successive frames to the time corresponding to the reset period, store the data read from the image sensor 111 once in the image storage unit 121, and then output it to the outside (for example, the image processing unit 12).
  • FIG. 13 is an explanatory diagram showing an example of frame rate conversion processing by the sensor module 100 according to the present embodiment.
  • FIG. 13 shows the sensor module 100 according to the present embodiment storing images read from the image sensor 111 once in the image storage unit 121 and then outputting them.
  • By storing the images read from the image sensor 111 once in the image storage unit 121 and reading out the stored images sequentially, the sensor module 100 according to the present embodiment can minimize the interval between the read and the shutter of successive frames to the time corresponding to the reset period even when the read periods of successive frames overlap.
  • The sensor module 100 according to the present embodiment can also shorten the time required for the shutter and the read of the pixels.
  • FIG. 14 is an explanatory diagram illustrating an example in which the sensor module 100 according to the present embodiment shortens the time required for the shutter and the read of the pixels. By speeding up the shutter and the read of the pixels in this way, the sensor module 100 can prevent the read periods of successive frames from overlapping.
  • When speeding up the shutter and the read of the pixels, the sensor module 100 may store the images read from the image sensor 111 once in the image storage unit 121, convert the frame rate of the stored images, and output them to the outside (for example, the image processing unit 12).
  • For example, the sensor module 100 may store the images read from the image sensor 111 once in the image storage unit 121, convert the stored images to, for example, 30 fps, and output them to the outside.
  • FIG. 15 is an explanatory diagram showing the frame rate conversion processing by the sensor module 100 via the image storage unit 121 in more detail.
  • When reading from the image sensor 111, the sensor module 100 reads at a speed faster than the frame rate at which it outputs images (for example, a read frame rate of about 120 fps to 240 fps for an output frame rate of 30 fps), and stores the images once in the image storage unit 121.
  • When outputting images to the outside, the sensor module 100 executes predetermined subsequent processing at a rate limited by the interface bandwidth of the external output, and outputs the images.
  • In this way, the sensor module 100 can convert between the read speed from the image sensor 111 and the output speed from the sensor module 100, as sketched below.
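  • The following sketch (ours; the queue and the rates are illustrative, not the patent's implementation) shows this decoupling: frames are pushed into the image storage unit back to back and drained at the external interface rate.

```python
from collections import deque

image_store: deque = deque()  # stands in for the image storage unit 121

def read_frames_fast(num_frames: int) -> None:
    # Frames arrive back to back (gap = reset period), i.e. at a rate
    # equivalent to roughly 120-240 fps in the example above.
    for n in range(num_frames):
        image_store.append(f"frame {n}")

def output_at_interface_rate() -> None:
    # The external output is limited by the interface bandwidth, e.g. 30 fps.
    output_period_ms = 1000.0 / 30.0
    t = 0.0
    while image_store:
        print(f"t={t:6.1f} ms: output {image_store.popleft()}")
        t += output_period_ms

read_frames_fast(4)
output_at_interface_rate()
```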
  • The sensor module 100 may also capture all images with the same exposure time when capturing a plurality of images with a single shutter trigger. The sensor module 100 can then output an image with suppressed noise by performing 3D noise reduction (3DNR) processing, which combines the images captured with the same exposure time.
  • When performing 3DNR, each pixel signal is stored once in the image storage unit 121, and the post-processing unit 132 then reads the pixel signals stored in the image storage unit 121 and performs the combining process.
  • Because minimizing the read-to-shutter period between successive frames reduces subject blur, the sensor module 100 according to the present embodiment can narrow the search range in 3DNR. The sensor module 100 according to the present embodiment can therefore reduce the size of the circuit for performing 3DNR.
  • When combining images by long accumulation and short accumulation as described above, or when performing 3DNR processing, the sensor module 100 may capture a predetermined number of images, for example six, with a single shutter trigger, or may capture more than the predetermined number, for example seven or more. When the sensor module 100 according to the present embodiment captures more than the predetermined number of images, it may select the predetermined number from the captured images and then combine the images by long accumulation and short accumulation or perform 3DNR processing.
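  • A minimal sketch of the 3DNR idea follows (ours; frame alignment is assumed to be already handled, since the minimized read-to-shutter interval keeps subject motion small, and the use of NumPy is our choice, not the patent's).

```python
import numpy as np

def simple_3dnr(frames: list[np.ndarray]) -> np.ndarray:
    """Average N same-exposure frames; temporal noise drops roughly as
    1/sqrt(N) while the static scene content is preserved."""
    stack = np.stack([f.astype(np.float32) for f in frames])
    return stack.mean(axis=0).astype(frames[0].dtype)

# Six frames captured with the same exposure, as in the example above,
# each corrupted with synthetic sensor noise.
rng = np.random.default_rng(0)
clean = np.full((4, 4), 128, dtype=np.uint8)
frames = [np.clip(clean + rng.normal(0, 10, clean.shape), 0, 255)
          .astype(np.uint8) for _ in range(6)]
print(simple_3dnr(frames))  # values cluster tightly around 128
```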
  • The technology according to the present disclosure can be applied to various products.
  • For example, the technology according to the present disclosure may be realized as a device mounted on any type of mobile body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
  • FIG. 16 is a block diagram illustrating a schematic configuration example of a vehicle control system, which is an example of a mobile body control system to which the technology according to the present disclosure can be applied.
  • The vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
  • In the example shown in FIG. 16, the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, a vehicle exterior information detection unit 12030, a vehicle interior information detection unit 12040, and an integrated control unit 12050.
  • As a functional configuration of the integrated control unit 12050, a microcomputer 12051, an audio/image output unit 12052, and an in-vehicle network I/F (interface) 12053 are illustrated.
  • The drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs.
  • For example, the drive system control unit 12010 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism that adjusts the steering angle of the vehicle, and a braking device that generates the braking force of the vehicle.
  • The body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs.
  • For example, the body system control unit 12020 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, back lamps, brake lamps, blinkers, and fog lamps.
  • In this case, radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020.
  • The body system control unit 12020 receives these radio waves or signals and controls the door lock device, power window device, lamps, and the like of the vehicle.
  • The vehicle exterior information detection unit 12030 detects information outside the vehicle on which the vehicle control system 12000 is mounted.
  • For example, an imaging unit 12031 is connected to the vehicle exterior information detection unit 12030.
  • The vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image outside the vehicle and receives the captured image.
  • Based on the received image, the vehicle exterior information detection unit 12030 may perform detection processing for objects such as people, vehicles, obstacles, signs, or characters on the road surface, or distance detection processing.
  • The imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal corresponding to the amount of received light.
  • The imaging unit 12031 can output the electrical signal as an image, or as distance measurement information. The light received by the imaging unit 12031 may be visible light or invisible light such as infrared rays.
  • The vehicle interior information detection unit 12040 detects information inside the vehicle.
  • For example, a driver state detection unit 12041 that detects the state of the driver is connected to the vehicle interior information detection unit 12040.
  • The driver state detection unit 12041 includes, for example, a camera that images the driver, and the vehicle interior information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or determine whether the driver is dozing, based on the detection information input from the driver state detection unit 12041.
  • The microcomputer 12051 can calculate a control target value of the driving force generation device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and output a control command to the drive system control unit 12010.
  • For example, the microcomputer 12051 can perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions, including vehicle collision avoidance or impact mitigation, following traveling based on inter-vehicle distance, vehicle speed maintenance traveling, vehicle collision warning, and vehicle lane departure warning.
  • The microcomputer 12051 can also perform cooperative control for the purpose of automated driving in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, and the like based on the information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040.
  • The microcomputer 12051 can also output a control command to the body system control unit 12020 based on the information outside the vehicle acquired by the vehicle exterior information detection unit 12030.
  • For example, the microcomputer 12051 can control the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030, and perform cooperative control for the purpose of anti-glare, such as switching from high beam to low beam.
  • The audio/image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying information to an occupant of the vehicle or to the outside of the vehicle.
  • In the example of FIG. 16, an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as output devices.
  • The display unit 12062 may include, for example, at least one of an on-board display and a head-up display.
  • FIG. 17 is a diagram illustrating an example of installation positions of the imaging unit 12031.
  • In FIG. 17, the vehicle 12100 includes imaging units 12101, 12102, 12103, 12104, and 12105 as the imaging unit 12031.
  • The imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at positions such as the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of the vehicle 12100.
  • The imaging unit 12101 provided on the front nose and the imaging unit 12105 provided at the upper part of the windshield in the vehicle interior mainly acquire images in front of the vehicle 12100.
  • The imaging units 12102 and 12103 provided on the side mirrors mainly acquire images of the sides of the vehicle 12100.
  • The imaging unit 12104 provided on the rear bumper or the back door mainly acquires images behind the vehicle 12100.
  • The forward images acquired by the imaging units 12101 and 12105 are mainly used for detecting preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
  • FIG. 17 also shows an example of the imaging ranges of the imaging units 12101 to 12104.
  • The imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, an overhead image of the vehicle 12100 viewed from above is obtained.
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • For example, at least one of the imaging units 12101 to 12104 may be a stereo camera including a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
  • For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 obtains the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, the closest three-dimensional object that is on the traveling path of the vehicle 12100 and travels at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100.
  • Furthermore, the microcomputer 12051 can set in advance an inter-vehicle distance to be secured from the preceding vehicle, and perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like.
  • In this way, cooperative control can be performed for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation.
  • For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data on three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extract them, and use them for automatic avoidance of obstacles.
  • For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult for the driver to see.
  • Then, the microcomputer 12051 determines a collision risk indicating the degree of risk of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of collision, the microcomputer 12051 can perform driving assistance for collision avoidance by outputting a warning to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
  • At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • For example, the microcomputer 12051 can recognize a pedestrian by determining whether a pedestrian is present in the images captured by the imaging units 12101 to 12104. Such pedestrian recognition is performed by, for example, a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the outline of an object to determine whether or not the object is a pedestrian.
  • When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio/image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian. The audio/image output unit 12052 may also control the display unit 12062 so that an icon or the like indicating a pedestrian is displayed at a desired position.
  • An example of a vehicle control system to which the technology according to the present disclosure can be applied has been described above. Among the configurations described above, the technology according to the present disclosure can be applied to the imaging unit 12031 and the like.
  • By applying the technology according to the present disclosure to the imaging unit 12031, the read-to-shutter period between successive frames can be minimized when a plurality of images are captured with a single shutter trigger.
  • As described above, the embodiment of the present disclosure provides the sensor module 100, which minimizes the read-to-shutter period between successive frames when a plurality of images are captured with a single shutter trigger.
  • By minimizing the read-to-shutter period between successive frames, the sensor module 100 according to the embodiment of the present disclosure can simultaneously suppress blur and widen the settable range of the exposure time.
  • The steps in the processing executed by each device in this specification do not necessarily have to be processed in chronological order in the order described in the sequence diagrams or flowcharts.
  • For example, the steps in the processing executed by each device may be processed in an order different from the order described in the flowcharts, or may be processed in parallel.
  • In the above embodiment, the sensor module 100 has a configuration in which the pixel substrate 110, the memory substrate 120, and the signal processing substrate 130 are stacked in this order, but the present disclosure is not limited to such an example.
  • For example, the sensor module 100 may have a structure in which two substrates, the pixel substrate 110 and the signal processing substrate 130, are stacked. In this case, the sensor module 100 has a structure in which the pixel substrate 110 is stacked on the signal processing substrate 130.
  • (1) An imaging processing apparatus comprising a control unit that generates a first image by exposing pixels by a first exposure with a first exposure time, and generates a second image by exposing the pixels by a second exposure with a second exposure time following the first image, wherein the control unit minimizes an interval between a read start of the first exposure and a shutter start of the second exposure in a predetermined row of the pixels.
  • (6) The imaging processing apparatus according to any one of (1) to (5), wherein, when the second exposure time is less than the read time of the first exposure, the control unit minimizes the interval between the read start of the first exposure and the shutter start of the second exposure in a predetermined row of the pixels under the condition that the read period of the first exposure and the read period of the second exposure do not overlap.
  • (7) The imaging processing apparatus according to any one of (1) to (6), further comprising an image processing unit that combines the first image and the second image.
  • (9) The imaging processing apparatus according to (8), wherein the speed at which the first image and the second image are stored in the storage unit from the control unit and read from the storage unit is lower than the speed at which the first image and the second image are read directly from the pixels.
  • (10) The imaging processing apparatus according to (8) or (9), wherein the control unit is configured to change the speed at which the first image and the second image are read from the storage unit.
  • (11) The imaging processing apparatus according to any one of (1) to (7), configured by stacking two semiconductor substrates consisting of a first semiconductor substrate and a second semiconductor substrate, wherein the first semiconductor substrate includes at least the pixels and the control unit, and image processing is performed on the first image and the second image on the second semiconductor substrate.
  • (12) The imaging processing apparatus according to any one of (8) to (10), configured by stacking three semiconductor substrates consisting of a first semiconductor substrate, a second semiconductor substrate, and a third semiconductor substrate, wherein the first semiconductor substrate includes at least the pixels and the control unit, the third semiconductor substrate includes at least the storage unit, and image processing is performed on the first image and the second image on the second semiconductor substrate.
  • The imaging processing apparatus according to (12), wherein the third semiconductor substrate is provided between the first semiconductor substrate and the second semiconductor substrate.
  • (15) The imaging processing apparatus according to any one of (1) to (14), wherein the control unit further generates third to sixth images by exposing the pixels by third to sixth exposures with third to sixth exposure times, respectively, following the second image.
  • (16) The imaging processing apparatus according to (15), further comprising a storage unit that stores at least the first to sixth images, wherein the control unit reads the first to sixth images from the storage unit after storing the first to sixth images in the storage unit.
  • (17) The imaging processing apparatus according to (16), wherein the control unit further generates at least a seventh image by exposing the pixels by a seventh exposure with a seventh exposure time following the sixth image, and selects six images from the generated series of images.
  • An imaging processing method comprising: generating a first image by exposing pixels by a first exposure with a first exposure time; generating a second image by exposing the pixels by a second exposure with a second exposure time following the first image; and minimizing the interval between the read start of the first exposure and the shutter start of the second exposure in a predetermined row of the pixels.
  • A computer program that causes a computer to execute the above steps.
  • (23) An electronic apparatus comprising the imaging processing device according to any one of (1) to (20).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • Exposure Control For Cameras (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)

Abstract

[Problem] To provide an imaging processing device capable of simultaneously suppressing blur in a composite image and widening the settable range of exposure time when a plurality of frames are captured with a single shutter trigger. [Solution] An imaging processing device comprising a control unit that generates a first image by exposing pixels by a first exposure with a first exposure time and, following the first image, generates a second image by exposing the pixels by a second exposure with a second exposure time, wherein the control unit minimizes the interval between a read start in the first exposure and a shutter start in the second exposure in a predetermined row of the pixels.
PCT/JP2017/005575 2016-03-29 2017-02-15 Imaging processing device, imaging processing method, computer program, and electronic device WO2017169233A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-065220 2016-03-29
JP2016065220A JP2017183870A (ja) 撮像処理装置、撮像処理方法、コンピュータプログラム及び電子機器 (Imaging processing device, imaging processing method, computer program, and electronic device)

Publications (1)

Publication Number Publication Date
WO2017169233A1 (fr) 2017-10-05

Family

ID=59963931

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/005575 WO2017169233A1 (fr) 2016-03-29 2017-02-15 Imaging processing device, imaging processing method, computer program, and electronic device

Country Status (2)

Country Link
JP (1) JP2017183870A (fr)
WO (1) WO2017169233A1 (fr)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
  • JP6955308B2 (ja) 2018-12-27 2021-10-27 富士フイルム株式会社 Imaging element, imaging device, imaging method, and program

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006135501A (ja) * 2004-11-04 2006-05-25 Konica Minolta Photo Imaging Inc 撮像装置
JP2010078635A (ja) * 2008-09-24 2010-04-08 Sanyo Electric Co Ltd ブレ補正装置及び撮像装置
JP2010182888A (ja) * 2009-02-05 2010-08-19 Sony Corp 固体撮像装置、固体撮像装置の製造方法、固体撮像装置の駆動方法、及び電子機器
JP2013255201A (ja) * 2012-06-08 2013-12-19 Canon Inc 撮像装置、及びその制御方法
JP2015126043A (ja) * 2013-12-26 2015-07-06 ソニー株式会社 電子デバイス


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11575842B2 2018-08-03 2023-02-07 Canon Kabushiki Kaisha Imaging apparatus
CN113906729A (zh) 2019-06-13 2022-01-07 索尼集团公司 Imaging device, imaging control method, and program
JPWO2021010355A1 (fr) 2019-07-16 2021-01-21
WO2021010355A1 (fr) 2019-07-16 2021-01-21 株式会社ニコン Imaging device and imaging method
JP7384204B2 (ja) 2019-07-16 2023-11-21 株式会社ニコン Imaging device and imaging method
CN113395482A (zh) 2020-03-12 2021-09-14 平湖莱顿光学仪器制造有限公司 Color-dependent intelligent two-dimensional video device and two-dimensional video playback method thereof

Also Published As

Publication number Publication date
JP2017183870A (ja) 2017-10-05


Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17773781

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17773781

Country of ref document: EP

Kind code of ref document: A1