WO2018124054A1 - Imaging apparatus and control method thereof

Publication number: WO2018124054A1
Application number: PCT/JP2017/046598
Authority: WIPO (PCT)
Prior art keywords: imaging apparatus, target frame, exposure, image, control unit
Other languages: English (en), Japanese (ja)
Inventors: 弘明 関東, 健夫 南, 國末 勝次, 亨治 石井
Applicant: Panasonic IP Management Co., Ltd. (パナソニックIpマネジメント株式会社)

Classifications

    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B7/00 Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
    • G03B7/08 Control effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
    • G03B7/091 Digital circuits
    • G03B7/093 Digital circuits for control of exposure time
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/50 Control of the SSIS exposure
    • H04N25/53 Control of the integration time

Definitions

  • the present disclosure relates to an imaging apparatus and a control method thereof.
  • A technique described in Patent Document 1 is known as an image sensor using an organic photoelectric conversion element.
  • Further improvement of such an imaging apparatus is desired.
  • an object of the present disclosure is to provide an imaging apparatus or a control method thereof that can realize further improvement.
  • An imaging apparatus includes an imaging element capable of nondestructive readout, and a control unit that controls imaging of the target frame using an auxiliary image obtained by nondestructive readout in the target frame.
  • the present disclosure can provide an imaging apparatus or a control method thereof that can realize further improvement.
  • FIG. 1 is a block diagram of the imaging apparatus according to the first embodiment.
  • FIG. 2A is a diagram illustrating an appearance example of the imaging apparatus according to Embodiment 1.
  • FIG. 2B is a diagram illustrating an appearance example of the imaging apparatus according to Embodiment 1.
  • FIG. 3 is a diagram illustrating a configuration of the image sensor according to the first embodiment.
  • FIG. 4 is a circuit diagram illustrating a configuration of a pixel according to Embodiment 1.
  • FIG. 5 is a flowchart illustrating the operation of the imaging apparatus according to the first embodiment.
  • FIG. 6 is a diagram illustrating the operation of the imaging apparatus according to the first embodiment.
  • FIG. 7 is a flowchart illustrating the operation of the imaging apparatus according to the second embodiment.
  • FIG. 8 is a diagram illustrating the operation of the imaging apparatus according to the second embodiment.
  • FIG. 9 is a diagram illustrating an example of an auxiliary image according to the second embodiment.
  • FIG. 10 is a diagram illustrating an example of a threshold setting screen according to the second embodiment.
  • FIG. 11 is a flowchart illustrating the operation of the imaging apparatus according to the third embodiment.
  • FIG. 12 is a diagram for explaining subject change determination processing according to the third embodiment.
  • FIG. 13 is a flowchart illustrating a modified example of the operation of the imaging apparatus according to the third embodiment.
  • FIG. 14 is a flowchart illustrating a modified example of the operation of the imaging apparatus according to the third embodiment.
  • FIG. 15 is a flowchart illustrating a modified example of the operation of the imaging apparatus according to the third embodiment.
  • FIG. 16 is a flowchart illustrating the operation of the imaging apparatus according to the fourth embodiment.
  • FIG. 17 is a diagram illustrating the operation of the imaging apparatus according to the fourth embodiment.
  • FIG. 18 is a diagram illustrating an example of an image selection screen according to the fourth embodiment.
  • FIG. 19 is a diagram illustrating the operation of the imaging apparatus according to the modification of the fourth embodiment.
  • FIG. 1 is a block diagram illustrating a configuration of an imaging apparatus 100 according to the present embodiment.
  • FIGS. 2A and 2B are diagrams illustrating an example of the appearance of the imaging apparatus 100.
  • the imaging apparatus 100 is a camera such as a digital still camera or a digital video camera.
  • the imaging device 101 is a solid-state image sensor that converts incident light into an electrical signal (an image) and outputs the obtained electrical signal.
  • the imaging device 101 is an organic sensor using an organic photoelectric conversion device.
  • the control unit 102 controls the image sensor 101.
  • the control unit 102 performs various signal processing on the image obtained by the imaging element 101, and displays the obtained image on the display unit 103 or stores it in the storage unit 104.
  • the image output from the control unit 102 may be output to the outside of the imaging apparatus 100 via an input / output interface (not shown).
  • the control unit 102 is a circuit that performs information processing and is a circuit that can access the storage unit 104.
  • the control unit 102 is realized by a processor such as a DSP (Digital Signal Processor) or a GPU (Graphics Processing Unit).
  • the control unit 102 may be a dedicated or general-purpose electronic circuit.
  • the control unit 102 may be an aggregate of a plurality of electronic circuits.
  • the display unit 103 displays an image obtained by the image sensor 101, a user interface, and the like.
  • the display unit 103 is a display panel provided on the back of the camera, and is a liquid crystal panel, an organic EL (electroluminescence) panel, or the like.
  • the display unit 103 may be an electronic viewfinder (EVF) provided on the upper part of the camera. In this case, the image of the liquid crystal panel or the organic EL panel is projected through the lens.
  • the display unit 103 may be an external monitor connected to the imaging apparatus 100 via an interface such as HDMI (registered trademark), USB (registered trademark), or Wi-Fi (registered trademark).
  • the storage unit 104 is a general purpose or dedicated memory in which information is stored.
  • the storage unit 104 may be a magnetic disk or an optical disk, or may be expressed as a storage or a recording medium.
  • the storage unit 104 may be a non-volatile memory or a volatile memory.
  • the storage unit 104 is not limited to a memory built in the imaging apparatus 100, and may be a memory attached to the imaging apparatus 100.
  • the storage unit 104 may be an SD card or the like.
  • the storage unit 104 may be a combination of these plural types of memories.
  • the imaging system 106 includes, for example, one or a plurality of lenses, and condenses light from the outside of the imaging apparatus 100 on the imaging element 101.
  • the light shielding unit 107 is, for example, a mechanical shutter, and shields light from the imaging system 106.
  • FIG. 3 is a block diagram illustrating a configuration of the image sensor 101.
  • the image sensor 101 shown in FIG. 3 includes a plurality of pixels (unit pixel cells) 201 arranged in a matrix, a vertical scanning unit 202, a column signal processing unit 203, and a horizontal readout unit 204.
  • Each of the plurality of pixels 201 outputs a signal corresponding to the incident light to the vertical signal line 207 provided in the corresponding column.
  • the vertical scanning unit 202 resets the plurality of pixels 201 via the plurality of reset control lines 205.
  • the vertical scanning unit 202 sequentially selects the plurality of pixels 201 in units of rows via the plurality of address control lines 206.
  • the column signal processing unit 203 performs signal processing on the signals output to the plurality of vertical signal lines 207, and outputs the plurality of signals obtained by the signal processing to the horizontal reading unit 204.
  • the column signal processing unit 203 performs noise suppression signal processing represented by correlated double sampling, analog / digital conversion processing, and the like.
  • the horizontal readout unit 204 sequentially outputs a plurality of signals after the signal processing by the plurality of column signal processing units 203 to the horizontal output terminal 208.
  • FIG. 4 is a circuit diagram illustrating a configuration of the pixel 201.
  • the pixel 201 includes a photoelectric conversion unit 211, a charge storage unit 212, a reset transistor 213, an amplification transistor 214 (source follower transistor), and a selection transistor 215.
  • the photoelectric conversion unit 211 generates signal charges by photoelectrically converting incident light. A voltage Voe is applied to one end of the photoelectric conversion unit 211.
  • the photoelectric conversion unit 211 includes a photoelectric conversion layer made of an organic material.
  • the photoelectric conversion layer may include a layer made of an organic material and a layer made of an inorganic material.
  • the charge storage unit 212 is connected to the photoelectric conversion unit 211 and stores the signal charge generated by the photoelectric conversion unit 211. Note that the charge storage unit 212 may be configured with a parasitic capacitance such as a wiring capacitance instead of a dedicated capacitance element.
  • the reset transistor 213 is used to reset the potential of the signal charge.
  • the gate of the reset transistor 213 is connected to the reset control line 205, the source is connected to the charge storage unit 212, and the reset voltage Vreset is applied to the drain.
  • which terminal acts as the drain and which as the source generally depends on circuit operation and often cannot be determined from the element structure alone. In this description, one of the two is referred to as the source and the other as the drain, and the two designations may be interchanged.
  • the amplification transistor 214 amplifies the voltage of the charge storage unit 212 and outputs a signal corresponding to the voltage to the vertical signal line 207.
  • the gate of the amplification transistor 214 is connected to the charge storage unit 212, and the power supply voltage Vdd or the ground voltage Vss is applied to the drain.
  • the selection transistor 215 is connected in series with the amplification transistor 214, and switches whether to output the signal amplified by the amplification transistor 214 to the vertical signal line 207.
  • the selection transistor 215 has a gate connected to the address control line 206, a drain connected to the source of the amplification transistor 214, and a source connected to the vertical signal line 207.
  • the voltage Voe, the reset voltage Vreset, and the power supply voltage Vdd are voltages commonly used in all the pixels 201.
  • Non-destructive readout is a process of reading image data during an exposure period while allowing the exposure to continue, in contrast to conventional readout (hereinafter referred to as destructive readout), which resets the pixel when it is read.
  • With nondestructive readout, the image data exposed up to that point can be read out during the exposure period, and the exposure can then continue. A plurality of images with different exposure times can therefore be obtained from a single exposure.
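As an illustration only (not from the patent), the difference between nondestructive and destructive readout can be modeled with a toy pixel that accumulates charge; the class and method names below are hypothetical:

```python
class Pixel:
    """Toy model of a pixel that supports nondestructive readout."""

    def __init__(self):
        self.charge = 0.0

    def expose(self, light, seconds):
        # Charge accumulates in proportion to light intensity and time.
        self.charge += light * seconds

    def read_nondestructive(self):
        # Read the accumulated signal; exposure can continue afterwards.
        return self.charge

    def read_destructive(self):
        # Conventional readout: the signal is read and the pixel is reset.
        value = self.charge
        self.charge = 0.0
        return value


p = Pixel()
images = []
for _ in range(3):                    # one exposure, three intermediate reads
    p.expose(light=2.0, seconds=1.0)
    images.append(p.read_nondestructive())
final = p.read_destructive()
# images == [2.0, 4.0, 6.0]: three images with different exposure times
# from a single exposure; final == 6.0, and the pixel is then reset.
```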
  • the electronic ND control is a process for electrically controlling the transmittance of the image sensor.
  • the transmittance here means the proportion of the incident light that is converted into an electrical signal. By setting the transmittance to 0%, the light can be blocked electrically.
  • the transmittance is controlled by controlling the voltage Voe shown in FIG. 4. Exposure can thereby be terminated electrically without using a mechanical shutter.
  • the imaging apparatus 100 includes a mechanical shutter (the light shielding unit 107); electronic ND control and light shielding by the mechanical shutter may be used together, or light shielding by the mechanical shutter may be used without electronic ND control.
  • the image sensor 101 is an organic sensor.
  • the image sensor 101 only needs to realize nondestructive reading or electronic ND control, and may be other than an organic sensor.
  • the photoelectric conversion layer included in the photoelectric conversion unit 211 may be made of an inorganic material.
  • the photoelectric conversion layer may be made of amorphous silicon or chalcopyrite semiconductor.
  • the imaging apparatus 100 controls photographing of the target frame using an auxiliary image obtained by nondestructive readout in the target frame. Specifically, the imaging apparatus 100 performs automatic exposure (AE: Auto Exposure) of the target frame using the auxiliary image.
  • FIG. 5 is a flowchart showing an operation flow of the imaging apparatus 100.
  • FIG. 6 is a diagram for explaining the operation of the imaging apparatus 100.
  • before time t1 at which still image capturing starts, the imaging apparatus 100 captures a live image, which is a real-time moving image of the subject, and displays it on the display unit 103.
  • the imaging apparatus 100 starts exposure for capturing a still image (S101).
  • the imaging apparatus 100 acquires one or a plurality of auxiliary images by performing non-destructive readout a predetermined number of times during the exposure period (S102). In FIG. 6, two nondestructive readings are performed, but the number of nondestructive readings may be arbitrary.
  • the imaging apparatus 100 performs AE control using the obtained auxiliary image (S103).
  • the imaging apparatus 100 can perform the AE control using the auxiliary image by the same method as the AE control using the live image.
  • the imaging apparatus 100 controls the exposure time T1 of the target frame using the auxiliary image.
  • the imaging apparatus 100 detects the movement of the subject based on the difference between the plurality of auxiliary images, and controls the exposure time T1 based on the obtained movement.
  • the imaging apparatus 100 sets the exposure time T1 to be shorter as the movement is larger.
  • the imaging apparatus 100 controls the exposure time T1 based on the luminance level of one or more auxiliary images. For example, the imaging apparatus 100 sets the exposure time T1 longer as the luminance level is lower.
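A minimal sketch of such exposure-time control, assuming a hypothetical helper and hand-picked constants (the patent does not specify the actual AE algorithm): motion is estimated from the mean absolute difference between two auxiliary images, and the exposure time T1 is shortened for large motion and lengthened for low luminance.

```python
def control_exposure_time(aux_prev, aux_curr, base_time):
    """Return an adjusted exposure time T1 from two auxiliary images.

    aux_prev, aux_curr: luminance samples of the two auxiliary images
    (lists of floats in 0..255); base_time: nominal exposure time.
    All constants below are illustrative, not from the patent.
    """
    n = len(aux_curr)
    # Motion estimate: mean absolute difference between the auxiliary images.
    motion = sum(abs(a - b) for a, b in zip(aux_prev, aux_curr)) / n
    # Luminance estimate of the latest auxiliary image.
    luminance = sum(aux_curr) / n

    t1 = base_time
    t1 /= (1.0 + motion / 32.0)          # larger motion -> shorter exposure
    if luminance < 64.0:                 # low luminance -> longer exposure
        t1 *= 128.0 / (luminance + 64.0)
    return t1
```

For a bright, static scene the function leaves the nominal time unchanged; large inter-image differences or a dark auxiliary image pull T1 down or up respectively.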
  • the imaging apparatus 100 ends the exposure (S104). For example, the imaging apparatus 100 ends the exposure when the above-described electronic ND control or the mechanical shutter performs light shielding.
  • the imaging apparatus 100 acquires an image by performing destructive reading (S105).
  • in FIG. 6, destructive readout is performed immediately after the end of exposure, but it need not be performed immediately after.
  • the imaging apparatus 100 controls shooting of the target frame using the auxiliary image obtained by nondestructive reading in the target frame. Specifically, the imaging apparatus 100 performs automatic exposure of the target frame using the auxiliary image.
  • the imaging apparatus 100 performs automatic exposure of the target frame using the auxiliary image.
  • the accuracy of automatic exposure can be improved.
  • the method of the present embodiment differs from the conventional method in that the image used for AE control is changed from a live image to an auxiliary image obtained by nondestructive readout. More accurate automatic exposure can therefore be realized while reusing part of a conventional automatic exposure algorithm.
  • FIG. 7 is a flowchart showing an operation flow of the imaging apparatus 100 according to the present embodiment.
  • FIG. 8 is a diagram for explaining the operation of the imaging apparatus 100.
  • the imaging apparatus 100 determines a threshold value TH used for determination processing described later (S111). Details of this process will be described later.
  • the imaging apparatus 100 starts exposure at time t1 (S112).
  • the imaging device 100 acquires an auxiliary image by nondestructive reading (S113).
  • the imaging apparatus 100 determines whether the luminance of the obtained auxiliary image exceeds the threshold value TH set in step S111 (S114).
  • when the luminance of the auxiliary image does not exceed the threshold value TH (No in S114), the imaging apparatus 100 performs nondestructive readout again at time t3, after time T2 has elapsed (S113), and determines whether the luminance of the newly obtained auxiliary image exceeds the threshold value TH (S114). The imaging apparatus 100 repeats these processes until the luminance of the auxiliary image exceeds the threshold value TH.
  • when the luminance of the auxiliary image exceeds the threshold value TH (Yes in S114), the imaging apparatus 100 ends the exposure by electronic ND control or the mechanical shutter (S115) and acquires an image by destructive readout (S116, time t6). Note that nondestructive readout may be performed instead of destructive readout.
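The exposure loop of steps S112 to S116 can be sketched as follows; the sensor object and its methods are hypothetical stand-ins for the hardware, and the luminance ramp is simulated:

```python
def expose_until_threshold(sensor, th, max_reads=100):
    """Continue exposure until the auxiliary image's luminance exceeds TH.

    sensor must provide read_nondestructive() -> luminance of the current
    auxiliary image, and read_destructive() -> final image value.
    """
    sensor.start_exposure()                        # S112
    for _ in range(max_reads):
        luminance = sensor.read_nondestructive()   # S113
        if luminance > th:                         # S114
            sensor.end_exposure()                  # S115: electronic ND / shutter
            return sensor.read_destructive()       # S116
    raise RuntimeError("threshold never reached")


class FakeSensor:
    """Simulated sensor whose luminance grows by 10 per readout interval."""

    def __init__(self):
        self.lum = 0

    def start_exposure(self):
        self.lum = 0

    def read_nondestructive(self):
        self.lum += 10
        return self.lum

    def end_exposure(self):
        pass

    def read_destructive(self):
        return self.lum


print(expose_until_threshold(FakeSensor(), th=45))  # exposure stops at 50
```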
  • FIG. 9 is a diagram illustrating an example of an auxiliary image and a luminance histogram obtained at each time.
  • the horizontal axis of the luminance histogram shown in FIG. 9 indicates luminance, and the vertical axis indicates the frequency of each luminance.
  • the auxiliary images obtained at the respective times have different luminance levels because their exposure times differ: the later an auxiliary image is obtained, the higher its luminance level.
  • the imaging apparatus 100 determines, for example, whether the maximum luminance value of the auxiliary image exceeds the threshold value TH. Note that the imaging apparatus 100 may compare the average value or median value of the luminance with the threshold value TH without being limited to the maximum luminance value.
  • the imaging apparatus 100 ends the exposure immediately after it is determined that the luminance of the auxiliary image exceeds the threshold value TH.
  • alternatively, the exposure may end a predetermined time after the determination; in that case, the threshold value TH is set lower by a difference corresponding to that time.
  • the interval T2 at which nondestructive reading is performed is constant, but it may not be constant.
  • the imaging apparatus 100 may set an exposure time in advance by AE control or the like, and set a plurality of nondestructive readout timings so that the readout interval becomes shorter as the end of that exposure time approaches.
  • the imaging apparatus 100 may shorten the nondestructive reading interval as the luminance of the obtained auxiliary image approaches the threshold value TH.
  • the time from the end of exposure to the time when destructive readout is performed may be arbitrary.
  • the imaging apparatus 100 sequentially performs non-destructive reading a plurality of times in the target frame, and stops exposure of the target frame when the luminance of the auxiliary image exceeds a predetermined threshold value TH.
  • with a method in which the exposure period is determined before the image is captured, an image having the desired luminance level is not always obtained.
  • the imaging apparatus 100 determines the threshold value TH according to the currently set shooting mode.
  • the imaging apparatus 100 may display a user interface for setting the threshold value TH on the display unit 103, and determine the threshold value TH based on a user instruction given via the user interface.
  • FIG. 10 is a diagram showing an example of this user interface.
  • a threshold TH corresponding to the sample image is set by the user selecting one of the sample images.
  • the sample image may be an image stored in advance or an auxiliary image obtained by nondestructive reading at the time of past shooting.
  • FIG. 10 shows an example in which the user selects the sample image, but the brightness level may be selected using an operation bar or the like for changing the brightness level. Alternatively, an operation of increasing or decreasing the luminance level with respect to the current luminance level may be used.
  • the user can intuitively and easily perform a brightness level selection operation.
  • threshold value TH need not be variable, and a fixed value may be used as the threshold value TH.
  • the imaging apparatus 100 sequentially performs a plurality of nondestructive readouts in a target frame, and detects a change in the subject using the plurality of auxiliary images obtained by those readouts. For example, during a long exposure of a scene such as a starlit night sky, the imaging apparatus 100 detects an undesired change in the subject caused by a disturbance to the camera or by the tripod falling over.
  • FIG. 11 is a flowchart showing an operation flow of the imaging apparatus 100 according to the present embodiment.
  • FIG. 12 is a diagram illustrating an example of a change in luminance of the auxiliary image when a change in the subject occurs.
  • the imaging apparatus 100 starts exposure (S121). At time t1, after a predetermined time T3 has elapsed, the imaging apparatus 100 acquires an auxiliary image by nondestructive readout (S122). The imaging apparatus 100 then determines whether the subject has changed, using the plurality of auxiliary images obtained since the start of exposure (S123). If it is determined that there is no change in the subject (No in S123) and the exposure period has not ended (No in S125), the imaging apparatus 100 performs nondestructive readout again at time t2, after time T3 has elapsed (S122), and again determines whether the subject has changed using the plurality of auxiliary images obtained so far (S123).
  • if it is determined that the subject has changed (Yes in S123), the imaging apparatus 100 issues a warning (S124). For example, the imaging apparatus 100 displays a message indicating that the subject has changed on the display unit 103. Note that the imaging apparatus 100 may instead notify the user with a warning sound or a voice message.
  • the imaging apparatus 100 repeats these series of processes until the exposure period ends (S125).
  • the imaging apparatus 100 ends the exposure and acquires an image by destructive reading (S126).
  • while the subject does not change, the luminance of the auxiliary image increases linearly as the exposure time increases; that is, the luminance increases with a constant slope.
  • when a change occurs in the subject, the luminance slope changes.
  • FIG. 12 shows a case where an instantaneous disturbance has occurred. It should be noted that the inclination of the luminance changes (increases or decreases) when the shooting scene changes due to the tripod falling or when an undesired subject enters the screen.
  • the imaging apparatus 100 detects the slope of the luminance using a plurality of auxiliary images obtained at a plurality of times. The imaging apparatus 100 calculates the slope from the luminance of the newly obtained auxiliary image and the luminance of the auxiliary image obtained immediately before it, and determines whether the difference between this slope and the past slope exceeds a predetermined threshold. The imaging apparatus 100 determines that the subject has changed when the difference exceeds the threshold, and that the subject has not changed when it does not.
  • the past inclination is calculated based on the luminance at time t3 and time t4.
  • the difference between the slope calculated based on the luminance at time t4 and time t5 and the past slope exceeds the threshold. As a result, a warning is issued.
  • the imaging apparatus 100 performs the above determination using, for example, the average luminance of the auxiliary image. Note that the imaging apparatus 100 may use the maximum luminance, or may use the average luminance or the maximum luminance of a specific area of the auxiliary image. The imaging apparatus 100 may perform the above determination for each region including one or more pixels, and may determine that the subject has changed when it is determined that any region has a change.
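The slope comparison can be sketched as follows, assuming (hypothetically) that average luminances are sampled at a constant nondestructive-readout interval, so that a slope reduces to a simple difference between consecutive samples:

```python
def subject_changed(luminances, slope_th):
    """Return True if the luminance slope changed by more than slope_th.

    luminances: average luminance of each auxiliary image, sampled at a
    constant nondestructive-readout interval (so slope == difference).
    """
    if len(luminances) < 3:
        return False  # need two consecutive slopes to compare
    past_slope = luminances[-2] - luminances[-3]   # e.g. t3 -> t4
    new_slope = luminances[-1] - luminances[-2]    # e.g. t4 -> t5
    return abs(new_slope - past_slope) > slope_th


# Steady exposure: luminance rises with a constant slope of 10 -> no change.
assert subject_changed([10, 20, 30, 40], slope_th=5) is False
# Disturbance between the last two readouts: slope jumps from 10 to 35.
assert subject_changed([10, 20, 30, 65], slope_th=5) is True
```

The same check could be run per region rather than on the whole-image average, as the text notes.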
  • the imaging apparatus 100 sequentially performs a plurality of nondestructive readouts in a target frame, and detects a change in the subject using the plurality of auxiliary images obtained by those readouts. Since a change in the subject can thereby be detected during a long exposure, the user can be notified of a shooting failure at an early stage, and the exposure can be kept from running to completion even though the shot has failed.
  • alternatively, the imaging apparatus 100 may stop shooting (S124A). The imaging apparatus 100 can then automatically stop shooting when, for example, an abnormality occurs during a long exposure, which prevents the exposure from being carried out to the end even though the shot has failed.
  • the imaging apparatus 100 may perform re-imaging (S124B). Thereby, the imaging apparatus 100 can automatically perform re-imaging when an abnormality occurs during, for example, long exposure.
  • when it is determined that there is a change in the subject, the imaging apparatus 100 may perform correction for reducing the influence of the change in the subject on the image of the target frame obtained by shooting. Specifically, when it is determined that there is a change in the subject (Yes in S123), the imaging apparatus 100 stores change information regarding the change (S124C). After acquiring the image of the target frame by destructive readout (S126), the imaging apparatus 100 corrects the image using the change information (S127).
  • the imaging apparatus 100 determines a change for each area including one or a plurality of pixels, and stores information indicating the changed area and the amount of change as change information. Then, the imaging apparatus 100 performs correction by subtracting or adding a luminance value corresponding to the amount of change with respect to the changed area.
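The per-region correction can be sketched as follows; the flat-list image representation and the change-information format are hypothetical simplifications of what the patent describes:

```python
def correct_image(image, change_info):
    """Subtract recorded unwanted luminance changes from affected regions.

    image: accumulated luminance values, one per region (flat list)
    change_info: dict {region_index: change_amount}; positive amounts
    (stray light) are subtracted, negative amounts (lost light) are
    effectively added back.
    """
    corrected = list(image)
    for region, amount in change_info.items():
        corrected[region] -= amount
    return corrected


# Example: a flash of stray light added 30 to region 2 during the exposure.
image = [100, 100, 130, 100]
print(correct_image(image, {2: 30}))  # -> [100, 100, 100, 100]
```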
  • in the above description, the imaging apparatus 100 performs the determination and stores the change information at each nondestructive readout; alternatively, the determinations may be performed and the change information stored all at once after all the nondestructive readouts, or after the destructive readout, have been performed.
  • the imaging apparatus 100 can generate an image with reduced influence even when a change occurs in the subject.
  • Embodiment 4. In the present embodiment, another method for controlling the exposure time of a target frame using an auxiliary image is described. In Embodiments 1 and 2, methods for automatically controlling the exposure time were described; in this embodiment, an example is described in which information for determining the exposure time is presented to the user and the exposure time is controlled based on the user's operation. For example, the method of the present embodiment can be applied during a long exposure of a still image.
  • FIG. 16 is a flowchart showing an operation flow of the imaging apparatus 100 according to the present embodiment.
  • FIG. 17 is a diagram for explaining the operation of the imaging apparatus 100.
  • the imaging apparatus 100 starts exposure (S131). Next, the imaging apparatus 100 acquires an auxiliary image by nondestructive reading (S132). Next, the imaging apparatus 100 displays the obtained auxiliary image on the display unit 103 (S133).
  • the imaging apparatus 100 performs nondestructive readout again after a predetermined time has elapsed (S132), and displays the newly obtained auxiliary image (S133).
  • nondestructive reading is performed at a predetermined cycle T4, and the read auxiliary images are sequentially displayed. Note that the interval at which nondestructive reading is performed may not be constant.
  • when the user instructs the end of exposure, the imaging apparatus 100 stops the exposure (S135) and acquires an image by destructive readout (S137).
  • the display unit 103 displays the auxiliary image during the exposure period of the target frame. Further, the control unit 102 receives an instruction to end the exposure of the target frame during the exposure period of the target frame. As a result, the user can check the luminance level at any time with reference to the auxiliary image displayed during the exposure period. For this reason, the user can stop the exposure at a desired luminance level. Therefore, the user can easily take an image with a desired luminance level.
  • the method for instructing the end of exposure by the user is not particularly limited.
  • an instruction to end the exposure is accepted when the user presses the shutter button or operates the touch panel during the exposure period.
  • the user controls the exposure time T1 during the exposure period.
  • alternatively, the user may select an arbitrary image from the plurality of auxiliary images already obtained, either during or after the exposure period.
  • the imaging apparatus 100 displays an auxiliary image and stores the auxiliary image.
  • the imaging apparatus 100 displays a user interface for selecting one of a plurality of auxiliary images on the display unit 103 as illustrated in FIG.
  • auxiliary images obtained during exposure are sequentially stored in the storage unit 104. If exposure is in progress, a list of a plurality of auxiliary images obtained up to that timing is displayed. If it is after the exposure, a list of all auxiliary images obtained by the exposure is displayed.
  • the user can select an image having an arbitrary luminance level by selecting one auxiliary image via the interface.
  • the displayed image may include an image obtained by destructive readout.
  • the imaging apparatus 100 may delete auxiliary image data that has not been selected after the above selection. Further, the imaging apparatus 100 may end the exposure when a selection is made during the exposure.
  • the imaging apparatus 100 may sequentially perform a plurality of nondestructive readouts in the target frame and store the plurality of images obtained by those readouts in the storage unit 104 as a moving image. Further, the imaging apparatus 100 may reproduce this moving image and display it on the display unit 103. That is, the imaging apparatus 100 stores the plurality of images obtained by the plurality of nondestructive readouts in a moving image file format, for example a publicly known format that includes a moving image encoding method using inter-picture prediction or the like.
  • the stored moving image is played back automatically after the exposure is completed or based on a user operation.
  • storing the plurality of images obtained by nondestructive readout in a moving image format in this way allows the shooting process itself during the exposure period to be kept as a moving image work.
  • with this method, a moving image work can be generated without performing still image composition processing.
  • the amount of data can be reduced compared to storing a plurality of still images.
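As a minimal sketch of this idea (the function name `record_exposure_as_movie` and the toy sensor model are illustrative assumptions, not from the disclosure), the successive nondestructive readouts of one exposure can simply be collected as the frames of a movie:

```python
import numpy as np

def record_exposure_as_movie(sensor_read, n_reads):
    """Collect each nondestructive readout of one long exposure as a frame.

    The returned stack could then be encoded with any standard video codec
    (the disclosure mentions known formats using inter-picture prediction).
    """
    frames = [sensor_read(i) for i in range(1, n_reads + 1)]
    return np.stack(frames)  # shape: (n_reads, height, width)

# Toy sensor model: the accumulated signal grows with each readout,
# so the resulting "movie" shows the image brightening over the exposure.
def toy_sensor(i):
    return np.full((2, 2), float(i))

movie = record_exposure_as_movie(toy_sensor, n_reads=8)
print(movie.shape)  # (8, 2, 2)
```

Because each frame is a snapshot of the same accumulating exposure, no still-image composition step is needed, matching the data-reduction point above.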
  • the imaging apparatus 100 includes an imaging element 101 capable of nondestructive readout, and a control unit 102 that controls shooting of a target frame using an auxiliary image obtained by nondestructive readout in the target frame.
  • since the imaging apparatus 100 can control shooting of the frame using the auxiliary image obtained by nondestructive readout, appropriate control can be performed according to the situation during the exposure period. In this way, the imaging apparatus 100 can realize further improvements.
  • control unit 102 may control the exposure time of the target frame using the auxiliary image.
  • since the imaging apparatus 100 can control the exposure time based on image information obtained during the exposure period, the exposure time can be controlled more appropriately than when it is controlled based on information obtained before the start of exposure.
  • the control unit 102 may sequentially perform nondestructive readout a plurality of times in the target frame, and stop the exposure of the target frame when the luminance of the auxiliary image exceeds a predetermined threshold.
  • the imaging apparatus 100 can appropriately control the exposure time based on the image information obtained during the exposure period.
  • the imaging apparatus 100 may further include a display unit 103, and the control unit 102 may further display a user interface for setting a threshold on the display unit 103.
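The threshold-based exposure stop described above might be sketched as follows. This is a toy simulation; `expose_until_threshold` and the linear sensor model are illustrative assumptions rather than the disclosed implementation:

```python
import numpy as np

def expose_until_threshold(nondestructive_read, threshold, max_reads):
    """Keep exposing; after each nondestructive readout, end the exposure
    as soon as the mean luminance of the auxiliary image exceeds the
    (possibly user-set) threshold."""
    aux = None
    for i in range(1, max_reads + 1):
        aux = nondestructive_read(i)   # i-th readout; exposure continues
        if aux.mean() > threshold:     # luminance check on the auxiliary image
            return aux, i              # stop the exposure here
    return aux, max_reads              # fall back to the maximum exposure

# Toy sensor model: mean luminance grows by 10 per readout interval.
def toy_sensor(i):
    return np.full((4, 4), 10.0 * i)

final_image, reads_used = expose_until_threshold(
    toy_sensor, threshold=55.0, max_reads=100)
```

Because the decision uses data accumulated during the exposure itself, the stop point adapts to the actual scene brightness rather than to a pre-exposure estimate.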
  • control unit 102 may perform automatic exposure of the target frame using the auxiliary image.
  • since the imaging apparatus 100 can perform automatic exposure based on image information obtained during the exposure period, it can perform automatic exposure more appropriately than when automatic exposure is based on information obtained before the start of exposure.
  • the control unit 102 may sequentially perform a plurality of nondestructive readouts in the target frame, and detect a change in the subject in the target frame using the plurality of auxiliary images obtained by those readouts.
  • the imaging apparatus 100 can detect, for example, a change in the subject during the long exposure using the auxiliary image obtained by nondestructive reading.
  • control unit 102 may issue a warning when a change in the subject is detected.
  • the imaging apparatus 100 can notify the user of the occurrence of an abnormality during long exposure, for example.
  • control unit 102 may stop shooting when a change in the subject is detected.
  • the imaging apparatus 100 can automatically stop photographing when, for example, an abnormality occurs during long exposure.
  • control unit 102 may perform re-photographing when a change in the subject is detected.
  • the imaging apparatus 100 can automatically perform re-imaging when, for example, an abnormality occurs during long exposure.
  • control unit 102 may perform correction for reducing the influence of the change in the subject on the image of the target frame obtained by shooting.
  • the imaging apparatus 100 can generate an image with reduced influence.
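The detection step behind these behaviors (warn, stop, re-shoot, correct) could be sketched with simple differencing of successive auxiliary images. All names and thresholds here are illustrative assumptions, not the disclosed implementation:

```python
import numpy as np

def subject_changed(prev_aux, curr_aux, diff_threshold):
    """Compare two successive nondestructive readouts of the same frame.

    During an undisturbed long exposure the signal accumulates smoothly,
    so an unusually large frame-to-frame difference suggests the subject
    (or the scene illumination) changed mid-exposure."""
    delta = np.abs(curr_aux.astype(float) - prev_aux.astype(float))
    return float(delta.mean()) > diff_threshold

# Hypothetical response policies mirroring the options listed above.
RESPONSES = {
    "warn": "issue a warning to the user",
    "stop": "stop shooting",
    "reshoot": "discard the frame and re-photograph",
    "correct": "reduce the change's influence on the final image",
}

prev = np.full((4, 4), 20.0)
curr = np.full((4, 4), 50.0)  # sudden jump, e.g. a light entering the scene
changed = subject_changed(prev, curr, diff_threshold=5.0)
```

A real device would tune the threshold to the expected accumulation rate so that normal signal growth between readouts is not flagged as a subject change.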
  • the imaging apparatus 100 may further include a display unit 103 that displays the auxiliary image during the exposure period of the target frame, and the control unit 102 may further accept an instruction to end the exposure of the target frame during that exposure period.
  • the user can take an image of a desired luminance level with reference to the auxiliary image displayed during the exposure period.
  • the control unit 102 may sequentially perform a plurality of nondestructive readouts in the target frame and store the plurality of images thus obtained as a moving image.
  • the image sensor 101 may be an organic sensor.
  • the control method according to an aspect of the present disclosure is a control method of the imaging apparatus 100 including the imaging element 101 capable of nondestructive readout, and includes a control step of controlling shooting of a target frame using an auxiliary image obtained by nondestructive readout in the target frame.
  • since the control method can control shooting of the frame using the auxiliary image obtained by nondestructive readout, appropriate control can be performed according to the situation during the exposure period.
  • thus, the control method can realize further improvements.
  • the imaging device according to the embodiment of the present disclosure has been described above, but the present disclosure is not limited to this embodiment.
  • each processing unit included in the imaging apparatus is typically realized as an LSI, which is an integrated circuit. These may be formed as individual chips, or a single chip may include some or all of the processing units.
  • circuits are not limited to LSI, and may be realized by a dedicated circuit or a general-purpose processor.
  • an FPGA (Field Programmable Gate Array), or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may be used.
  • each component may be configured by dedicated hardware or may be realized by executing a software program suitable for each component.
  • Each component may be realized by a program execution unit such as a CPU or a processor reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory.
  • the present disclosure may be realized as a control method executed by the imaging apparatus.
  • the circuit configuration shown in the circuit diagram is an example, and the present disclosure is not limited to this circuit configuration. That is, like the circuit configuration described above, any circuit that can realize the characteristic functions of the present disclosure is also included in the present disclosure. Moreover, all the numbers used above are given to explain the present disclosure specifically, and the present disclosure is not limited to the illustrated numbers.
  • the division of functional blocks in the block diagram is an example; a plurality of functional blocks may be realized as one functional block, one functional block may be divided into a plurality, or some functions may be transferred to other functional blocks.
  • functions of a plurality of functional blocks having similar functions may be processed in parallel or in a time-division manner by single hardware or software.
  • the imaging apparatus has been described based on the embodiment, but the present disclosure is not limited to this embodiment. Forms obtained by applying various modifications conceived by those skilled in the art to this embodiment, and forms constructed by combining components of different embodiments, may also be included within the scope of one or more aspects, provided they do not deviate from the gist of the present disclosure.
  • the present disclosure can be applied to an imaging apparatus such as a digital still camera or a digital video camera.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • Exposure Control For Cameras (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)

Abstract

The present invention relates to an imaging device (100) provided with an imaging element (101) capable of nondestructive readout, and a control unit (102) for controlling shooting of a target frame using an auxiliary image obtained by nondestructive readout in the target frame. The control unit (102) may, for example, control the exposure time of the target frame using the auxiliary image. The control unit (102) may, for example, sequentially perform nondestructive readout in the target frame a plurality of times and stop the exposure of the target frame when the luminance of the auxiliary image exceeds a predetermined threshold.
PCT/JP2017/046598 2016-12-27 2017-12-26 Imaging device and control method therefor WO2018124054A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016254540A JP2020031259A (ja) 2016-12-27 2016-12-27 Imaging apparatus and control method therefor
JP2016-254540 2016-12-27

Publications (1)

Publication Number Publication Date
WO2018124054A1 true WO2018124054A1 (fr) 2018-07-05

Family

ID=62709350

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/046598 WO2018124054A1 (fr) 2016-12-27 2017-12-26 Imaging device and control method therefor

Country Status (2)

Country Link
JP (1) JP2020031259A (fr)
WO (1) WO2018124054A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11678081B2 (en) 2020-02-05 2023-06-13 Panasonic Intellectual Property Management Co., Ltd. Imaging device and image processing method

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005204298A (ja) * 2003-12-17 2005-07-28 Hewlett-Packard Development Co Lp System and method for indicating exposure information during image capture
JP2007259428A (ja) * 2006-02-27 2007-10-04 Seiko Epson Corp Imaging element, imaging apparatus, imaging system, imaging method, motion data generation system, motion data generation program, and motion data generation method
JP2010074584A (ja) * 2008-09-19 2010-04-02 Sony Corp Imaging apparatus and imaging method
JP2014168209A (ja) * 2013-02-28 2014-09-11 Canon Marketing Japan Inc Imaging apparatus, imaging method, and program
WO2015045828A1 (fr) * 2013-09-27 2015-04-02 Fujifilm Corporation Imaging device and imaging method
JP2015159353A (ja) * 2014-02-21 2015-09-03 Olympus Corporation Imaging apparatus and imaging method
JP2016161653A (ja) * 2015-02-27 2016-09-05 Fujifilm Corporation Imaging apparatus and method


Also Published As

Publication number Publication date
JP2020031259A (ja) 2020-02-27

Similar Documents

Publication Publication Date Title
US10368025B2 (en) Imaging element, imaging apparatus, its control method, and control program
JP5614993B2 (ja) Imaging apparatus and driving method of solid-state imaging element
US10785423B2 Image sensor, image capturing apparatus, and image capturing method
TW201340708A (zh) Solid-state imaging device and electronic apparatus
CN109997352B (zh) Imaging device, camera, and imaging method
US10277853B2 Image capturing apparatus and control method of the same
JP5222068B2 (ja) Imaging apparatus
WO2018124054A1 (fr) Imaging device and control method therefor
US9282245B2 Image capturing apparatus and method of controlling the same
JP2007013362A (ja) Imaging apparatus and imaging method
JP5043400B2 (ja) Imaging apparatus and control method therefor
US7999871B2 Solid-state imaging apparatus, and video camera and digital still camera using the same
JP6362099B2 (ja) Imaging apparatus, control method therefor, program, and storage medium
JP2013192058A (ja) Imaging apparatus
WO2018124056A1 (fr) Imaging device and associated control method
JP2008072512A (ja) Imaging apparatus, control method therefor, and imaging system
JP6664066B2 (ja) Imaging apparatus and control method therefor
US9794502B2 Image capturing apparatus
US10368020B2 Image capturing apparatus and control method therefor
WO2018124057A1 (fr) Imaging device and control method therefor
JP5737924B2 (ja) Imaging apparatus
JP2022044285A (ja) Detection apparatus and detection method
JP5518025B2 (ja) Photoelectric conversion apparatus and imaging apparatus
JP6470589B2 (ja) Imaging apparatus, control method therefor, program, and storage medium
JP2010011047A (ja) Imaging apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17889262

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17889262

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP