WO2024087183A1 - Imaging device, imaging device control method, and computer program product - Google Patents

Imaging device, imaging device control method, and computer program product

Info

Publication number: WO2024087183A1
Application number: PCT/CN2022/128325
Authority: WO (WIPO (PCT))
Prior art keywords: shooting, imaging device, state, user, image sensor
Other languages: French (fr)
Inventor: Takafumi Kishi
Original Assignee: Guangdong Oppo Mobile Telecommunications Corp., Ltd.

Classifications

    • H ELECTRICITY > H04 ELECTRIC COMMUNICATION TECHNIQUE > H04N PICTORIAL COMMUNICATION, e.g. TELEVISION > H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/667 Camera operation mode switching, e.g. between still and video, sport and normal, or high- and low-resolution modes (under H04N23/60 Control of cameras or camera modules)
    • H04N23/65 Control of camera operation in relation to power supply (under H04N23/60)
    • H04N23/6811 Motion detection based on the image signal (under H04N23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations; H04N23/681 Motion detection)
    • H04N23/6812 Motion detection based on additional sensors, e.g. acceleration sensors (under H04N23/681)
    • H04N23/743 Bracketing, i.e. taking a series of images with varying exposure conditions (under H04N23/70 Circuitry for compensating brightness variation in the scene)
    • H04N23/741 Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors (under H04N23/70)

Definitions

  • Embodiments of the present disclosure relate to an imaging device, an imaging device control method, and a computer program product.
  • For example, in scenes where a shooting subject is moving or the like, artifacts called motion blur may be generated by a motion of the subject during acquisition of the multiple images for the composition.
  • The motion blur can be reduced by acquiring the images for composition as close in time as possible. For this reason, there have been known technologies of acquiring the multiple images at short time intervals by driving the imaging device at a high frame rate in a preview state before shooting.
  • An object of the present disclosure is to acquire images with less motion blur while suppressing power consumption in the preview state.
  • An imaging device is a digital imaging device configured to be able to realize a zero shutter lag (ZSL) operation of generating an image at a time point at which a shooting instruction by a user is performed by using captured images obtained before a time point at which the shooting instruction by the user is detected.
  • the imaging device includes a shooting prediction unit and an imaging control unit.
  • the shooting prediction unit predicts the shooting instruction by the user.
  • the imaging control unit changes control of an image sensor in the ZSL operation in case of a shooting state where it is predicted by the shooting prediction unit that the shooting instruction by the user is able to be performed at a time point before the shooting instruction is detected.
  • a control method is executed by a digital imaging device configured to be able to realize a zero shutter lag (ZSL) operation of generating an image at a time point at which a shooting instruction by a user is performed by using captured images obtained before a time point at which the shooting instruction by the user is detected.
  • the method includes: predicting the shooting instruction by the user; and changing control of an image sensor in the ZSL operation in case of a shooting state where it is predicted that the shooting instruction by the user is able to be performed at a time point before the shooting instruction is detected.
  • a computer program product stores a program to be executed by a computer of a digital imaging device configured to be able to realize a zero shutter lag (ZSL) operation of generating an image at a time point at which a shooting instruction by a user is performed by using captured images obtained before a time point at which the shooting instruction by the user is detected.
  • the program causes the computer to execute: predicting the shooting instruction by the user; and changing control of an image sensor in the ZSL operation in case of a shooting state where it is predicted that the shooting instruction by the user is able to be performed at a time point before the shooting instruction is detected.
  • FIG. 1 is a diagram illustrating an example of a configuration of an imaging device according to a first embodiment
  • FIG. 2 is a diagram illustrating an example of a functional configuration of a controller according to the first embodiment
  • FIG. 3 is a diagram explaining shooting prediction and control change of an image sensor in control processing according to the first embodiment
  • FIG. 4 is a flowchart illustrating an example of a flow of the control processing according to the first embodiment
  • FIG. 5 is a diagram explaining shooting prediction in control processing according to a second embodiment
  • FIG. 6 is a diagram illustrating an example of a configuration of an imaging device according to a third embodiment
  • FIG. 7 is a diagram explaining shooting prediction in control processing according to the third embodiment.
  • FIG. 8 is a diagram explaining control change of an image sensor in control processing according to a fourth embodiment
  • FIG. 9 is a diagram explaining thresholds of motion detection in control processing according to a fifth embodiment.
  • FIG. 10 is a diagram explaining control change of an image sensor in control processing according to a sixth embodiment.
  • FIG. 11 is a diagram explaining control change of an image sensor in control processing according to a seventh embodiment.
  • FIG. 12 is a diagram explaining control change of an image sensor in control processing according to an eighth embodiment.
  • An imaging device, an imaging device control method, a program, and a computer program product, which can obtain images with less motion blur while suppressing power consumption during the preview, will be described in the following embodiments.
  • The following embodiments illustrate a digital imaging device configured to be able to realize ZSL starting from the shooting instruction by the user.
  • Specifically, the embodiments illustrate an imaging device configured to be able to realize high image quality processing, such as HDR (High Dynamic Range) composition and NR (Noise Reduction), which obtains a composite image by using a plurality of captured images obtained during the ZSL operation.
  • FIG. 1 is a diagram illustrating an example of a configuration of an imaging device 1 according to the first embodiment.
  • the imaging device 1 includes an imaging unit 10, a controller 21, a digital signal processor 23, a memory 25, an input interface 27, and a gyro sensor 29.
  • the imaging unit 10 is electrically connected to the DSP (digital signal processor) 23.
  • the controller 21, the digital signal processor 23, the memory 25, the input interface 27, and the gyro sensor 29 are connected to be able to communicate with one another via a signal line such as a bus 31, for example.
  • the imaging unit 10 images a subject field to generate a captured image (image data) .
  • the imaging unit 10 includes an optical system 11, an image sensor 13, and an analog front end (AFE) 15.
  • the optical system 11 includes an optical element configured to form an image of a light beam from a subject on an imaging surface 131 of the image sensor 13.
  • FIG. 1 exemplifies a single lens as the optical element of the optical system 11 but the present embodiment is not limited to the above.
  • The optical system 11 may achieve desired imaging performance with at least one optical element having optical power.
  • the optical system 11 may be composed of a compound lens that includes at least one single lens, or may be composed of a combination of a lens system and a reflection system.
  • the image sensor 13 images a subject field to generate an image signal.
  • the image sensor 13 is arranged on an optical axis of the optical system 11.
  • the image sensor 13 is arranged at a position at which the image of the light beam from the subject is formed by the optical system 11.
  • the image sensor 13 can appropriately employ a solid-state imaging device such as CCD (Charge Coupled Device) and CMOS (Complementary Metal-Oxide Semiconductor) .
  • the AFE 15 is an analog signal processor configured to perform analog processing such as amplification processing on the image signal read from the image sensor 13.
  • the AFE 15 includes a correlated double sampler (CDS) , a gain control amplifier (GCA) , and an A/D converter (ADC) .
  • The CDS is provided downstream of the image sensor 13, and performs noise rejection on the image signal from the image sensor 13.
  • The GCA is provided downstream of the CDS, and performs amplification processing, in accordance with the control of the controller 21, on the image signal whose noise has been removed by the CDS.
  • The ADC is provided downstream of the GCA, and converts the image signal amplified by the GCA into digital-format image data (a captured image). Note that a part or the whole of the function of the AFE 15 may be realized inside the image sensor 13.
  • the imaging unit 10 is configured to be able to change a focus position.
  • "to be able to change a focus position” means that an image formed on the imaging surface 131 can be made smaller than a diameter of a permissible circle of confusion for each of at least two object points that exist at different positions in an optical axis direction of the optical system 11.
  • a diameter of a permissible circle of confusion is defined depending on a pixel pitch of the image sensor 13 or imaging performance of the optical system 11, for example.
  • The imaging unit 10 is configured to be able to focus on or blur (bokeh) an arbitrary subject.
  • The imaging unit 10 is configured to be able to move at least one of an image-side focus position of the optical system 11, an object-side focus position of the optical system 11, and the imaging surface 131 of the image sensor 13, in the optical axis direction of the optical system 11.
  • the controller 21 controls each component of the imaging device 1 in accordance with a program stored in an internal memory or the memory 25.
  • the controller 21 includes a processor and a memory as hardware resources.
  • the processor can appropriately employ various processors such as CPU (Central Processing Unit) , DSP (Digital Signal Processor) , ASIC (Application Specific Integrated Circuit) , and FPGA (Field-Programmable Gate Array) .
  • the memory can appropriately employ various memories such as ROM (Read Only Memory) , a flash memory, and RAM (Random Access Memory) .
  • the controller 21 may employ a microcomputer.
  • The DSP 23 is provided downstream of the AFE 15, and performs, on the images from the AFE 15, various image processing required for displaying and recording images.
  • the image processing includes, for example, an optical black (OB) subtraction process, a white balance (WB) correction process, a demosaic process, a color conversion process, a gamma conversion process, a noise reduction process, an enlargement/reduction process, a compression process, and the like.
  • the memory 25 stores a program required for operations of the imaging device 1. Moreover, the memory 25 stores information required for various processes of the imaging device 1. This information includes, for example, information indicating correspondence between the size of the motion of the imaging device 1 and the frame rate. Moreover, the memory 25 temporarily stores therein various data such as the image output from the DSP 23 and processing data in the controller 21.
  • the memory 25 includes, as hardware resources, a nonvolatile memory such as ROM and a flash memory, and a volatile memory such as DRAM (Dynamic RAM) , SDRAM (Synchronous DRAM) , and SRAM (Static RAM) .
  • the input interface 27 accepts an input operation of the user in the imaging device 1 through various input devices such as a touch panel display, a switch, a button, a keyboard, and a microphone accepting a voice input of the user, for example.
  • the input interface 27 accepts a release operation (shooting instruction) that indicates the shooting of the user.
  • the gyro sensor 29 is configured to output a detection signal with a value according to the size of a motion such as a shake of the imaging device 1.
  • FIG. 2 is a diagram illustrating an example of a functional configuration of the controller 21 according to the first embodiment.
  • the controller 21 realizes functions of an imaging control module 211, a shooting prediction module 213, and a shooting instruction detection module 215 by a processor executing a program developed on the RAM of the internal memory or the memory 25.
  • the controller 21 realizing a function of the imaging control module 211 is an example of an imaging control unit.
  • the controller 21 realizing a function of the shooting prediction module 213 is an example of a shooting prediction unit.
  • the controller 21 realizing a function of the shooting instruction detection module 215 is an example of an instruction detection unit.
  • The modules 211, 213, and 215 may be realized by a single processor, or may be realized by a combination of a plurality of independent processors. Moreover, each of the modules 211, 213, and 215 may be realized in an integrated manner or distributed over a plurality of processors.
  • Based on an AE evaluation value indicating a subject brightness in the captured image, the imaging control module 211 performs automatic exposure (AE) processing for setting imaging conditions that include an aperture value and a shutter speed value.
  • the imaging control module 211 performs the AE processing by using a first release operation of the user as a trigger, for example.
  • Based on focus information, the imaging control module 211 performs automatic focus (AF) adjustment processing for controlling the drive of at least one of the focusing lens included in the optical system 11 and the image sensor 13.
  • the focus information is, for example, an AF evaluation value (contrast value) calculated from the captured image.
  • the focus information may be a defocusing amount calculated from the output of the focus detection pixel.
  • the imaging control module 211 performs shooting processing of controlling the imaging unit 10 to acquire a captured image.
  • the imaging control module 211 performs the shooting processing by using a second release operation (shooting instruction) of the user as a trigger, for example.
  • The first release operation and the second release operation include an operation of tapping an arbitrary subject etc. on a touch panel display (not illustrated) during preview display, for example.
  • the preview display may be referred to as a live view display.
  • the imaging control module 211 performs shooting processing of controlling the imaging unit 10 and the memory 25 to acquire a composite image, that is, shooting processing with the ZSL operation.
  • the imaging control module 211 temporarily stores captured images obtained during the preview in the memory 25.
  • When the shooting instruction by the user is detected, the imaging control module 211 reads from the memory 25 the plurality of captured images obtained within a predetermined time before this detection time point.
  • the imaging control module 211 performs high image quality processing of compositing the read plurality of captured images by the DSP 23 to acquire a composite image.
  • the imaging control module 211 changes the control of the image sensor 13 in the ZSL operation in the case of a shooting state predicted by the shooting prediction module 213 in which the shooting instruction by the user can be performed at a time point before the shooting instruction by the user is detected by the shooting instruction detection module 215.
  • the imaging control module 211 changes a frame rate for imaging pixel signals from the image sensor 13 in accordance with the shooting state predicted by the shooting prediction module 213. The details of the control change of the image sensor 13 will be described below.
  • the shooting prediction module 213 performs shooting prediction of predicting a shooting state.
  • The prediction of the shooting state in the shooting prediction includes determining, directly or indirectly, which shooting state the imaging device is in.
  • The shooting prediction also includes determining, directly or indirectly, the closeness in time to the shooting state.
  • the shooting prediction module 213 performs the shooting prediction based on the size of the motion of the imaging device 1. For example, the shooting prediction module 213 detects the size of the motion of the imaging device 1 based on the detection signal output from the gyro sensor 29. Then, the shooting prediction module 213 predicts the shooting state based on whether the size of the motion of the imaging device 1 is within a predetermined threshold range. The details of the shooting prediction will be described below.
  • the shooting prediction module 213 may perform the shooting prediction by using a threshold with respect to the detection signals from the gyro sensor 29, instead of the threshold with respect to the size of the motion of the imaging device 1.
  • Alternatively, the shooting prediction module 213 may perform the shooting prediction by using a threshold with respect to a convergence rate of the motion, that is, how much the size of the motion has converged relative to the motion of the imaging device 1 in a first period in which the preview display is started.
  • the shooting instruction detection module 215 detects the shooting instruction performed by the user based on the output of the input interface 27 according to the second release operation (shooting instruction) , for example.
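As a rough illustration of how this prediction could be wired up, the following Python sketch maps a gyro-derived motion size onto the three shooting states introduced below (the first, second, and third periods). All names and threshold values are invented for illustration; the disclosure gives no concrete numbers.

```python
from enum import Enum

class ShootingState(Enum):
    FIRST_PERIOD = 1   # composition not yet decided
    SECOND_PERIOD = 2  # composition being decided; shooting instruction is near
    THIRD_PERIOD = 3   # shooting instruction can be performed

# Hypothetical thresholds playing the role of 401 (second period) and 403 (third period).
THRESHOLD_SECOND = 0.20  # example units: rad/s of angular-velocity magnitude
THRESHOLD_THIRD = 0.05

def predict_shooting_state(gyro_xyz):
    """Predict the shooting state from the size of the device motion.

    `gyro_xyz` is one (x, y, z) angular-velocity sample from the gyro sensor;
    the motion size is taken here as its Euclidean norm.
    """
    motion = (gyro_xyz[0] ** 2 + gyro_xyz[1] ** 2 + gyro_xyz[2] ** 2) ** 0.5
    if motion <= THRESHOLD_THIRD:
        return ShootingState.THIRD_PERIOD
    if motion <= THRESHOLD_SECOND:
        return ShootingState.SECOND_PERIOD
    return ShootingState.FIRST_PERIOD
```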
  • FIG. 3 is a diagram explaining the shooting prediction and the control change of the image sensor in control processing according to the first embodiment.
  • In FIG. 3, a period of a state where a composition has not yet been decided is referred to as the first period.
  • In the first period, the user thinks about a composition etc. for shooting the subject while looking at the preview display, for example, in the state where the imaging device 1 is gripped; this is the state where a composition has not yet been decided.
  • In this state, the user moves the imaging device 1 so that the subject fits within an angle of view, or moves the imaging device 1 to try various compositions for the subject, while looking at the preview display. For this reason, as illustrated in FIG. 3, the motion of the imaging device 1 in the first period is large.
  • a period in a state where the composition is being decided after the first period is referred to as a second period.
  • the second period is a period of the shooting state where the shooting instruction of the user is near.
  • the second period is a period of about 1 to 3 seconds.
  • the user adjusts the direction and position of the imaging device 1 while looking at the preview display, for example, in order to fix the composition for the subject. For this reason, as illustrated in FIG. 3, the motion of the imaging device 1 in the second period is smaller than that in the first period.
  • When the size of the motion of the imaging device 1 falls within the predetermined range of the threshold 401 of the second period, the shooting prediction module 213 predicts that the current state is the shooting state of the second period where the shooting instruction is near.
  • the imaging control module 211 changes the control to drive the image sensor 13 at a frame rate 413 of the second period higher than the frame rate 411 of the first period.
  • In this manner, a higher frame rate for imaging pixel signals from the image sensor 13 is set as the current state approaches the shooting state of a third period.
  • Because a higher frame rate is set as the possibility that the shooting instruction by the user will be performed increases, even if the shooting instruction is performed by the user at this stage, it is possible to reduce the motion blur while reducing power consumption, compared to the control of driving the image sensor 13 at a high frame rate from the time point at which the preview display is started, for example.
  • a plurality of the frame rates 413 of the second period may be set in accordance with the size of the motion of the imaging device 1.
  • the frame rate 413 of the second period may have a higher value as the current state approaches the shooting state of the third period.
  • each of the plurality of frame rates 413 of the second period may have a higher value as the current state approaches the shooting state of the third period.
  • In this case, the shooting prediction module 213 predicts, as the shooting state at that time point, the closeness in time to the shooting state of the third period.
  • For example, the imaging control module 211 may select the frame rate 413 of the second period to be set from a table indicating a plurality of predetermined frame rates for the respective thresholds 401 of the second period, as in the sketch below.
  • Alternatively, the imaging control module 211 may seamlessly set the frame rate 413 of the second period in accordance with the closeness to the shooting state, by using a relational expression indicating a relationship between a frame rate and a closeness to the shooting state of the third period, such as the size of the motion of the imaging device 1.
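Both selection styles are easy to picture in code. In this sketch the table contents, motion bounds, and frame-rate values are placeholder assumptions, not values from the disclosure:

```python
# Hypothetical table: (motion upper bound, frame rate in fps), loosest to tightest.
FRAME_RATE_TABLE = [
    (0.15, 20.0),
    (0.10, 24.0),
    (0.05, 30.0),
]

def frame_rate_from_table(motion):
    """Stepwise selection: pick the rate of the tightest bound the motion satisfies."""
    rate = 15.0  # assumed normal (first-period) frame rate
    for bound, fps in FRAME_RATE_TABLE:
        if motion <= bound:
            rate = fps  # later (tighter) entries overwrite earlier ones
    return rate

def frame_rate_seamless(motion, low=15.0, high=60.0, motion_max=0.20):
    """Relational expression: the rate rises linearly as the motion converges to zero."""
    closeness = max(0.0, min(1.0, 1.0 - motion / motion_max))
    return low + (high - low) * closeness
```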
  • the third period is a period of the shooting state where the shooting instruction of the user can be performed.
  • the third period is a period of about 1 second.
  • In the third period, the user times the shooting or fine-tunes the direction and position of the imaging device 1 while looking at the preview display, for example. For this reason, as illustrated in FIG. 3, the motion of the imaging device 1 in the third period is smaller still than that in the second period.
  • When the size of the motion of the imaging device 1 falls within the predetermined range of the threshold 403 of the third period, the shooting prediction module 213 predicts that the current state is the shooting state of the third period where the shooting instruction of the user can be performed.
  • the imaging control module 211 sets a frame rate for imaging pixel signals from the image sensor 13 higher than that at a time point before the shooting state of the third period, that is, in the first period and the second period.
  • a frame rate 415 of the third period that is set in the shooting state where it is predicted that the shooting instruction by the user can be performed is an example of the first frame rate.
  • the imaging control module 211 starts the ZSL operation. In other words, the imaging control module 211 acquires captured images at the frame rate 415 of the third period, and sequentially stores them in the memory 25, for example.
  • the shooting instruction detection module 215 detects the shooting instruction by the user based on the output of the input interface 27 according to the shooting instruction of the user.
  • the imaging control module 211 reads from the memory 25 a plurality of captured images 621, 622, and 623 obtained before a time point at which the shooting instruction by the user is detected. Then, by using the read plurality of captured images 621, 622, and 623, the imaging control module 211 generates an image 631 at the time point at which the shooting instruction by the user is performed.
  • A frame rate higher than that in the first period and the second period is set in the third period. For this reason, a change in the position of the subject with respect to the angle of view is small between the plurality of captured images 621, 622, and 623 obtained at the frame rate 415 of the third period. As a result, the composite image 631 generated by compositing the plurality of captured images 621, 622, and 623 has a small motion blur 503.
  • the imaging control module 211 sets a frame rate 417 lower than the frame rate 415 of the third period when the shooting instruction of the user is detected by the shooting instruction detection module 215, for example.
  • The low frame rate 417 to be set after the shooting state may be the same as the frame rate at a time point before the shooting state of the third period, such as the frame rate 411 of the first period or the frame rate 413 of the second period, or may be a different frame rate.
  • the frame rate 417 is an example of a second frame rate.
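The buffering half of the ZSL operation described above can be pictured as a small ring buffer. This is a minimal sketch, assuming a fixed depth of three frames (matching images 621-623) and mean-stacking as a crude stand-in for the DSP's HDR/NR composition; the class and method names are invented:

```python
import collections
import numpy as np

class ZslBuffer:
    """Minimal zero-shutter-lag buffer: keep the most recent frames during
    preview and composite them when the shooting instruction is detected."""

    def __init__(self, depth=3):
        self.frames = collections.deque(maxlen=depth)  # oldest frames drop out

    def on_frame(self, frame):
        # Called for every frame captured at the current frame rate
        # (e.g., the frame rate 415 in the third period).
        self.frames.append(frame)

    def on_shutter(self):
        # Frames already in the buffer were captured *before* the instruction
        # was detected, which is what hides the system delay between pressing
        # and detecting the shutter.
        stack = np.stack(list(self.frames)).astype(np.float32)
        return np.mean(stack, axis=0)  # stand-in for HDR/NR composition
```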
  • To summarize, in the first period, the control of driving the image sensor 13 is performed at the frame rate 411 of the first period, which is a normal frame rate. Moreover, when the shooting state of the second period where the shooting instruction is near is predicted, the control of the image sensor 13 is changed to drive the image sensor 13 at the frame rate 413 of the second period, higher than the frame rate 411 of the first period. Moreover, when the shooting state of the third period where the shooting instruction can be performed is predicted, the control of the image sensor 13 is changed to drive the image sensor 13 at the frame rate 415 of the third period, higher than the frame rate 413 of the second period. As a result, compared to the control of driving the image sensor 13 at a high frame rate from the time point at which the preview display is started, for example, it is possible to reduce the motion blur while reducing power consumption.
  • When the imaging device 1 is a smartphone, each module according to the present embodiment, such as the shooting instruction detection module 215, is realized by an application for shooting installed in the smartphone.
  • the shooting instruction detection module 215 detects the output of the input interface 27 according to the shooting instruction of the user via OS (operating system) of the smartphone, for example.
  • the imaging device 1 such as the smartphone has a system delay from a time point at which the shooting instruction by the user is performed to a time point at which the shooting instruction is detected by the shooting instruction detection module 215. For this reason, even if the control of changing the frame rate is performed at the time point at which the shooting instruction is detected, a captured image at the time point at which the shooting instruction is performed is acquired at the frame rate before the change.
  • In the imaging device 1 of the present embodiment, when the shooting state of the third period where the shooting instruction can be performed is predicted, the ZSL operation is started and the control of the image sensor 13 is changed to drive the image sensor 13 at the frame rate 415, higher than that in the first period and the second period. Therefore, according to the imaging device 1 of the embodiment, even if there is a system delay, it is possible to change the control of the image sensor 13, based on the shooting prediction, at an appropriate timing that is neither too early nor too late.
  • Moreover, in the imaging device 1 of the embodiment, because the low frame rate control is performed at the time point at which the preview display is started but the high frame rate control is already being performed at the time point at which the shooting instruction is performed, the captured images stored in the ZSL operation can be acquired at the high frame rate after the change while reducing power consumption.
  • the ZSL operation may be started in the first period or the second period. Even in this case, compared to the case where the preview display is started and concurrently the high frame rate is set, it is possible to reduce the motion blur while reducing power consumption.
  • the second period may not exist depending on the time series of the motion of the imaging device 1.
  • FIG. 4 is a flowchart illustrating an example of a flow of control processing according to the first embodiment.
  • the flow illustrated in FIG. 4 is implemented in a scene etc. of shooting a moving subject, for example, but may be implemented in another scene or may be implemented regardless of a scene.
  • the preview display by the imaging control module 211 and the shooting prediction by the shooting prediction module 213 are started (S101) .
  • the imaging control module 211 starts imaging by the image sensor 13 by using imaging conditions such as a shutter speed (exposure time) and the predetermined normal frame rate 411 of the first period, and starts the preview display of displaying captured images on a display (not illustrated) .
  • The captured images for the preview display may be lower-quality images whose exposure time, number of pixels, etc. are reduced compared to those of a captured image used for image generation according to the shooting instruction of the user, or for which a part of image processing is omitted or suppressed.
  • the preview display may be referred to as a live view display.
  • the shooting prediction module 213 starts shooting prediction based on the size of the motion of the imaging device 1.
  • the shooting prediction module 213 according to the present embodiment performs the shooting prediction of predicting a shooting state by comparison between the size of the motion of the imaging device 1 acquired based on the detection signal from the gyro sensor 29 and the predetermined ranges of the thresholds 401 and 403.
  • When it is predicted by the shooting prediction module 213 that the current state is the shooting state where the shooting instruction is near (S102: YES), the imaging control module 211 changes the control of the image sensor 13 (S103).
  • Specifically, when the size of the motion of the imaging device 1 falls within the predetermined range of the threshold 401, the shooting prediction module 213 predicts that the current state is the shooting state where the shooting instruction is near.
  • In this case, the imaging control module 211 sets the frame rate 413 of the second period previously defined in association with the threshold 401 of the second period. In other words, when it is predicted that the current state is the shooting state of the second period where the shooting instruction is near, the imaging control module 211 changes the control of the image sensor 13 using the frame rate 411 of the first period to the control of the image sensor 13 using the frame rate 413 of the second period, higher than the frame rate 411 of the first period.
  • When it is predicted by the shooting prediction module 213 that the current state is the shooting state where the shooting instruction can be performed (S104: YES), the imaging control module 211 starts the ZSL operation and concurrently changes the control of the image sensor 13 (S105).
  • Specifically, when the size of the motion of the imaging device 1 falls within the predetermined range of the threshold 403, the shooting prediction module 213 predicts that the current state is the shooting state where the shooting instruction can be performed.
  • In this case, the imaging control module 211 sets the frame rate 415 of the third period previously defined in association with the threshold 403 of the third period.
  • In other words, the imaging control module 211 changes the control of the image sensor 13 using the frame rate 413 of the second period to the control of the image sensor 13 using the frame rate 415 of the third period, higher than the frame rate 413 of the second period.
  • The imaging control module 211 thereby changes the control of the image sensor 13 in the ZSL operation to the control of the frame rate 415 of the third period.
  • When Step S105 is executed, or when neither the shooting state where the shooting instruction is near nor the shooting state where the shooting instruction can be performed is predicted by the shooting prediction module 213 (S102, S104: NO), the flow illustrated in FIG. 4 proceeds to Step S106.
  • When the output of the input interface 27 according to the shooting instruction of the user is detected by the shooting instruction detection module 215 (S106: YES), the imaging control module 211 generates an image based on the captured images (S107).
  • For example, the imaging control module 211 generates an image by using the captured images obtained at the time point at which the shooting instruction by the user is detected.
  • Alternatively, the imaging control module 211 generates, by the ZSL operation, an image corresponding to the time point at which the shooting instruction is performed by the user, by using the captured images obtained at the time point at which the shooting instruction is performed by the user.
  • When the changed control is to be returned (S108: YES), the imaging control module 211 returns the control from the control of the frame rate 415 of the third period to the control of the frame rate 417, such as the frame rate 411 of the first period or the frame rate 413 of the second period (S109).
  • When the changed control is not to be returned (S108: NO), Step S109 is skipped.
  • Here, the case where the changed control is returned means a case where the shooting instruction by the user is detected in Step S106 after the control of the image sensor 13 is changed in Step S103 and/or Step S105.
  • The flow illustrated in FIG. 4 returns to Step S102 when shooting is not terminated (S110: NO), and is terminated when shooting is terminated (S110: YES).
  • For example, when an operation indicating that the shooting of the user is terminated is received by the input interface 27, it is determined that the shooting is terminated. The whole of Steps S101 to S110 is sketched in code below.
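Pulling the steps together, here is one hedged Python rendering of the FIG. 4 flow. It assumes the `ShootingState` enum and `ZslBuffer` sketched earlier, plus hypothetical `camera` and `predictor` interfaces; the frame-rate values are placeholders:

```python
def control_loop(camera, predictor, zsl_buffer,
                 fps_first=15, fps_second=30, fps_third=60):
    """One pass through the FIG. 4 flow (hypothetical interfaces throughout)."""
    camera.set_frame_rate(fps_first)                 # S101: start preview, normal rate
    changed = False
    while not camera.shooting_terminated():          # S110
        state = predictor.predict(camera.motion_size())
        if state is ShootingState.SECOND_PERIOD:     # S102: YES -> instruction is near
            camera.set_frame_rate(fps_second)        # S103
            changed = True
        elif state is ShootingState.THIRD_PERIOD:    # S104: YES -> instruction possible
            camera.set_frame_rate(fps_third)         # S105: raise the rate and
            camera.start_zsl(zsl_buffer)             #       start the ZSL operation
            changed = True
        if camera.shutter_detected():                # S106
            image = zsl_buffer.on_shutter()          # S107: composite buffered frames
            camera.save(image)
            if changed:                              # S108: YES
                camera.set_frame_rate(fps_first)     # S109: return to a low rate
                changed = False
```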
  • As described above, the imaging device 1 is configured to detect the motion of the imaging device 1 and to predict that the time up to the user's shooting instruction is long when the motion is large and short when the motion is small. Moreover, the imaging device 1 is configured to set the higher frame rate 415 in the shooting state of the third period, when the motion of the imaging device 1 is small, compared to the frame rates 411 and 413 in the shooting state of the first period or the second period, when the motion of the imaging device 1 is large.
  • With this configuration, it is possible to predict a timing of shooting and to change the control of the image sensor 13 from a low frame rate to a high frame rate at an appropriate timing that is neither too early nor too late relative to the predicted timing of shooting. For this reason, according to the technology of the embodiment, it is possible to obtain images with less motion blur while suppressing power consumption during the preview. Moreover, according to the technology of the embodiment, even if shooting is performed by ZSL, which is widely used in smartphones, the frame rate can be changed from a low frame rate to a high frame rate just before shooting.
  • FIG. 5 is a diagram explaining shooting prediction in control processing according to the second embodiment.
  • the imaging device 1 according to the present embodiment may not include the gyro sensor 29.
  • the imaging device 1 is configured to detect the motion of the imaging device 1 based on captured images.
  • the shooting prediction module 213 calculates motion vectors 505, 507, and 509 of a subject P in an image 645 based on at least two captured images 641 and 642, and detects the motion of the imaging device 1 based on the calculated motion vectors 505, 507, and 509.
  • the shooting prediction module 213 calculates motion vectors 511 and 513 of the subject P in an image 647 based on at least two captured images 642 and 643, and detects the motion of the imaging device 1 based on the calculated motion vectors 511 and 513.
  • a motion vector may be calculated based on features such as an outline, eyes, a nose, and a mouth of the subject P.
  • the shooting prediction according to the present embodiment is to detect the motion of the imaging device 1 based on motion vectors of the subject calculated from at least two captured images acquired by using the image sensor 13. With this configuration, the same effects as those of the above embodiment are achieved without mounting the gyro sensor 29.
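One plausible realization of this image-based motion detection uses dense optical flow to obtain the motion vectors; in this sketch the Farneback parameters and the use of the mean flow magnitude as the motion size are assumptions, not the disclosure's method:

```python
import cv2
import numpy as np

def device_motion_from_frames(prev_gray, curr_gray):
    """Estimate the size of the device motion from two consecutive preview frames.

    Dense optical flow stands in for the motion vectors (505, 507, 509, 511, 513
    in FIG. 5); the mean flow magnitude plays the role of the gyro-based motion
    size used in the first embodiment.
    """
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, curr_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    magnitude = np.linalg.norm(flow, axis=2)  # per-pixel motion vector length
    return float(magnitude.mean())            # large value -> large apparent motion
```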
  • FIG. 6 is a diagram illustrating an example of a configuration of the imaging device 1 according to the third embodiment.
  • the imaging device 1 according to the present embodiment may not include the gyro sensor 29.
  • a body 101 of the imaging device 1 is provided with, as the imaging unit 10, a main camera 10a and a front camera 10b that have shooting directions different from each other.
  • the main camera 10a is similar to the imaging unit 10 according to the above embodiment.
  • the image sensor 13 of the front camera 10b is arranged to be able to shoot an opposite side to the image sensor 13 of the main camera 10a in the optical axis direction of the image sensor 13 of the main camera 10a.
  • FIG. 7 is a diagram explaining shooting prediction in control processing according to the third embodiment.
  • The shooting prediction module 213 detects the state of the user (the subject P) as a photographer from captured images acquired by using the image sensor 13 of the front camera 10b, and predicts the shooting state based on the state of the photographer. For example, when the subject P is not shown like a captured image 651, when the subject P is cut off like a captured image 652, when the subject P is inclined like a captured image 653, or when the subject P is located in the periphery like a captured image 654, the shooting prediction module 213 predicts that the current state is the shooting state of the first period or the second period, in which the photographer does not have a posture to shoot by using the imaging device 1 because, for example, the composition is not decided.
  • Conversely, when the photographer has a posture to shoot by using the imaging device 1, the shooting prediction module 213 predicts that the current state is the shooting state of the third period where the shooting instruction can be performed.
  • Note that the shooting prediction module 213 may perform the shooting prediction based on the sight line of the photographer, or may perform the shooting prediction by calculating motion vectors for the photographer or the sight line.
  • the imaging control module 211 changes the control of the image sensor 13 of the main camera 10a in accordance with the predicted shooting state.
  • the shooting prediction according to the present embodiment is to predict the shooting state of whether the current state is a state just before shooting by observing the state of the photographer without detecting the motion of the imaging device 1.
  • The shooting prediction according to the present embodiment is especially useful when a tripod 3 or the like is used, since the motion of the imaging device 1 itself is then not informative; a rough heuristic is sketched below.
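As one concrete, simplified reading of "a posture to shoot", the front-camera frame could be checked for a face that is fully inside the frame and roughly centered. This sketch uses OpenCV's stock face cascade; the centering tolerance and the heuristic itself are assumptions:

```python
import cv2

# Stock Haar cascade shipped with opencv-python.
_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def photographer_ready(front_frame_gray, center_tolerance=0.25):
    """True when a face is detected, not cut off, and near the frame center."""
    h, w = front_frame_gray.shape[:2]
    faces = _cascade.detectMultiScale(front_frame_gray,
                                      scaleFactor=1.1, minNeighbors=5)
    for (x, y, fw, fh) in faces:
        cx, cy = x + fw / 2, y + fh / 2
        centered = (abs(cx - w / 2) < w * center_tolerance and
                    abs(cy - h / 2) < h * center_tolerance)
        inside = x > 0 and y > 0 and x + fw < w and y + fh < h  # cf. image 652
        if centered and inside:
            return True   # predict the shooting state of the third period
    return False          # not shown / cut off / peripheral -> first or second
```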
  • FIG. 8 is a diagram explaining control change of the image sensor 13 in control processing according to the fourth embodiment.
  • The imaging device 1 according to the fourth embodiment is configured to return the frame rate to the frame rate 417, lower than the frame rate 415 of the third period, when the shooting instruction is detected by the system side, as illustrated in (a) of FIG. 8.
  • However, there is also a case where the shooting instruction is not performed and thus is not detected by the system side.
  • In this case, the timing at which the frame rate is returned to a low frame rate may be a time point at which the size of the motion of the imaging device 1 increases from a state 515 to a state 517 and exceeds the predetermined range of the threshold 403, as illustrated in (b) of FIG. 8.
  • Alternatively, the timing at which the frame rate is returned to the low frame rate may be a time point at which a predetermined time has elapsed after the control is changed to the frame rate 415 of the third period.
  • a threshold different from the threshold 403 may be used.
  • a plurality of thresholds to return to the low frame rate in multiple steps may be used.
  • As described above, the imaging device 1 sets the frame rate 417, lower than the frame rate 415 of the third period, when the shooting instruction by the user is detected by the shooting instruction detection module 215, when the motion of the imaging device 1 is greater than the predetermined threshold 403, or when a predetermined time has elapsed after the frame rate 415 of the third period is set; the combined condition is sketched below.
  • Because the control can be returned to the low frame rate at an appropriate timing even if the shooting instruction is not performed, it is possible to reduce power consumption.
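The three return conditions fold naturally into a single predicate; the threshold and timeout values below are illustrative assumptions:

```python
import time

def should_return_to_low_rate(shutter_detected, motion, t_high_set,
                              motion_threshold=0.05, timeout_s=5.0):
    """True when any condition of the fourth embodiment holds: the shooting
    instruction was detected, the motion exceeds the threshold (403 or a
    dedicated one), or a predetermined time has passed since the high frame
    rate was set (`t_high_set` is a time.monotonic() timestamp)."""
    if shutter_detected:
        return True
    if motion > motion_threshold:
        return True
    if time.monotonic() - t_high_set > timeout_s:
        return True
    return False
```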
  • FIG. 9 is a diagram explaining thresholds of motion detection in control processing according to the fifth embodiment.
  • The shooting prediction module 213 records, in the memory 25 for each photographer, a time series of the size of the motion of the imaging device 1 and information indicating the timing of the shooting instruction by the user, and learns from this record. Then, based on the learned information, the shooting prediction module 213 determines thresholds 403a, 403b, and 403c for the respective photographers. As an example, the shooting prediction module 213 may determine, as the threshold for each photographer, a statistical value such as an average value or a median value of the sizes of the motion at the time points of the shooting instructions.
  • Alternatively, the shooting prediction module 213 may determine the threshold as the output of a machine learning model whose parameters are determined by using information indicating the time series of the motion and the shooting instruction timing on the input side and a preset or currently set threshold on the output side.
  • The technology according to the present embodiment may also be applied to the threshold 401 of the second period, to a threshold used when the frame rate is returned, and to a frequency for detecting the motion.
  • In this way, the imaging device 1 learns the detected motion of the imaging device 1 for each photographer, and sets a threshold for predicting the shooting state for each photographer based on the learned information; a statistical version is sketched below.
  • With this configuration, the control of the image sensor 13 can be changed at a timing suitable for each photographer.
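A minimal statistical version of this per-photographer learning might look as follows; the class name, fallback value, and the choice of the median are assumptions (a learned model could be substituted, as the text notes):

```python
import statistics

class PerPhotographerThreshold:
    """Learn a prediction threshold per photographer from the motion sizes
    observed at past shooting instructions."""

    def __init__(self, default_threshold=0.05):
        self.history = {}  # photographer id -> motion sizes at shutter time
        self.default = default_threshold

    def record_shot(self, photographer_id, motion_at_instruction):
        self.history.setdefault(photographer_id, []).append(motion_at_instruction)

    def threshold_for(self, photographer_id):
        samples = self.history.get(photographer_id)
        if not samples:
            return self.default
        # Median is robust to a few unusually shaky or steady shots.
        return statistics.median(samples)
```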
  • FIG. 10 is a diagram explaining control change of the image sensor 13 in control processing according to the sixth embodiment.
  • At a time point before the predicted shooting state, the imaging control module 211 controls the drive of the image sensor 13 in a single sampling mode 421 of performing one sampling for each frame.
  • In the predicted shooting state, the imaging control module 211 controls the drive of the image sensor 13 in a multi sampling mode 425 of performing a plurality of samplings for each frame.
  • In this way, the imaging device 1 according to the sixth embodiment changes the number of sampling operations of sampling pixel signals from the image sensor 13 in accordance with the predicted shooting state. According to this configuration, it is possible to suppress the increase in power consumption due to a multi sampling operation while reducing noise by the multi sampling operation; a toy model of the averaging follows.
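A toy model of the power/noise trade-off: averaging N samplings reduces uncorrelated read noise roughly by 1/sqrt(N) while costing N readouts of power, which is why the multi sampling mode is worth enabling only near shooting. `sensor.sample()` and `sensor.shape` are hypothetical:

```python
import numpy as np

def read_frame(sensor, samples_per_frame):
    """Read one frame, averaging several samplings of the pixel signals.
    samples_per_frame == 1 corresponds to the single sampling mode 421;
    samples_per_frame > 1 to the multi sampling mode 425."""
    acc = np.zeros(sensor.shape, dtype=np.float32)
    for _ in range(samples_per_frame):
        acc += sensor.sample()   # one hypothetical sensor readout
    return acc / samples_per_frame
```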
  • FIG. 11 is a diagram explaining control change of the image sensor 13 in control processing according to the seventh embodiment.
  • At a time point before the predicted shooting state, the imaging control module 211 does not change the exposure time for each frame.
  • When the plurality of captured images obtained without changing the exposure time are composited, appropriate exposure cannot be realized for each of the subjects 521, 523, and 525: like a composite image 661, exposure is appropriate for the subject 523 of a bird, excessive and saturated for the subject 521 of a cloud, and insufficient for the subject 525 of a human.
  • In the predicted shooting state, the imaging control module 211 changes the control of the image sensor 13 to an HDR mode in which different exposure times are applied to respective frames.
  • the imaging control module 211 temporarily stores in the memory 25 a plurality of captured images 671, 672, and 673 that have different exposure times, and realizes the HDR mode in the ZSL operation of generating a composite image 681 by HDR composition.
  • the imaging device 1 executes the HDR mode of changing the exposure time in the image sensor 13 for each frame in accordance with the predicted shooting state.
  • With this configuration, the ZSL operation can be performed while realizing appropriate exposure for each of the subjects 531, 533, and 535 by the HDR mode, like the composite image 681; a toy bracketing-and-fusion sketch follows.
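To make the bracketing concrete, here is a toy exposure cycle and a tiny exposure-fusion stand-in for the HDR composition; the exposure times and the Gaussian well-exposedness weighting are assumptions, not the disclosure's method:

```python
import numpy as np

# Hypothetical exposure times cycled frame by frame in the HDR mode,
# yielding a short/medium/long bracket like frames 671, 672, 673 in FIG. 11.
EXPOSURES_S = [1 / 1000, 1 / 250, 1 / 60]

def hdr_composite(frames):
    """Tiny exposure-fusion stand-in for the HDR composition of image 681:
    each pixel is a weighted average favoring well-exposed (mid-gray) values."""
    frames = [f.astype(np.float32) / 255.0 for f in frames]
    weights = [np.exp(-((f - 0.5) ** 2) / (2 * 0.2 ** 2)) + 1e-6 for f in frames]
    total = sum(weights)
    fused = sum(w * f for w, f in zip(weights, frames)) / total
    return np.clip(fused * 255.0, 0, 255).astype(np.uint8)
```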
  • FIG. 12 is a diagram explaining control change of the image sensor 13 in control processing according to the eighth embodiment.
  • The shooting prediction module 213 may perform shooting prediction based on the recognition of a subject. As an example, in the state of a captured image 691 in which no subject is detected, the shooting prediction module 213 predicts that the current state is the shooting state of the first period. When a target subject is detected, the shooting prediction module 213 predicts that the current state is the shooting state of the third period. A time point at which the target subject is detected is, for example, a time point T1 of a captured image 692 at which a subject 541 (e.g., a human face) is detected.
  • the time at which the target subject is detected may be, for example, a time point T2 of a captured image 694 at which a subject 543 enters the angle of view, a time point T3 of a captured image 695 at which the subject 543 completely enters the angle of view, or a time point T4 of a captured image 696 at which the other subject 541 is out of the angle of view when assuming the subject 543 to be the target subject.
  • When the shooting state of the third period is predicted, the imaging control module 211 raises the frame rate to the frame rate 415 of the third period.
  • the imaging control module 211 returns the frame rate to a low frame rate at a time point T5 (Case 1) at which a predetermined time has elapsed, at the time point T3 (Case 2) of the captured image 695 at which the target subject 541 is out of the angle of view, or at the time point T2 (Case 3) of the captured image 694 at which the other subject 543 (e.g., dog) enters the angle of view.
  • Alternatively, the imaging control module 211 may reset the elapsed-time count when the change to the high frame rate has been performed and another target subject 543 (e.g., a dog) enters the angle of view at the time point T2 of the captured image 694, and return the frame rate to the low frame rate at a time point T6 at which a predetermined time has elapsed from that time point (Case 4).
  • the shooting prediction according to the present embodiment is to predict that the current state is the shooting state of the third period when the target subject is detected from the captured image acquired by using the image sensor 13. Even with this configuration, the same effects as those of the above embodiment are achieved.
  • Note that shooting prediction of predicting that the current state is the shooting state of the second period or the third period may also be performed when the first release operation, such as the selection (tapping) by the user of the target subject on a touch panel display, is detected, for example. The subject-triggered raise/return logic is sketched below.
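A small state machine covering the raise at T1-T4 and the returns of Cases 1, 2, and 4 (Case 3 follows the same pattern with the entering subject treated as a non-target); detection itself is abstracted away, and the class name, `camera` interface, and timeout are assumed values:

```python
import time

class SubjectTrigger:
    """Raise the frame rate when a target subject is detected and drop it
    again per the cases of the eighth embodiment."""

    def __init__(self, camera, high_fps=60, low_fps=15, timeout_s=3.0):
        self.camera, self.high_fps, self.low_fps = camera, high_fps, low_fps
        self.timeout_s = timeout_s
        self.t_raised = None  # monotonic time when the rate was last raised

    def update(self, target_in_view, new_target_entered=False):
        now = time.monotonic()
        if target_in_view and self.t_raised is None:
            self.camera.set_frame_rate(self.high_fps)  # T1/T2/T3: target detected
            self.t_raised = now
        elif self.t_raised is not None:
            if new_target_entered:
                self.t_raised = now                    # Case 4: reset elapsed time
            elif not target_in_view:                   # Case 2: target left the view
                self.camera.set_frame_rate(self.low_fps)
                self.t_raised = None
            elif now - self.t_raised > self.timeout_s: # Case 1: time elapsed
                self.camera.set_frame_rate(self.low_fps)
                self.t_raised = None
```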
  • the motion of the imaging device 1 may be detected by a plurality of means.
  • the shooting prediction may be performed based on at least two of the motion of the imaging device 1, the state of the photographer, and the state of the subject.
  • As the control of the image sensor 13, at least two of a frame rate, the number of samplings, and an exposure time may be changed.
  • at least one shooting prediction described above and at least one control of the image sensor 13 may be combined as appropriate.
  • a part or the whole of processing executed by the imaging device 1 according to the present embodiments may be realized by software.
  • A program executed by a computer of the imaging device 1 according to the present embodiments is recorded and provided, in a file with an installable format or an executable format, in a computer-readable non-transitory recording medium (computer program product) such as a flash memory (semiconductor memory), e.g., a USB (Universal Serial Bus) memory or an SSD (Solid State Drive), or an HDD (Hard Disk Drive).
  • a program executed by the imaging device 1 according to the present embodiments may be configured to be provided by being stored on a computer connected to a network such as the Internet and being downloaded by way of the network. Moreover, a program executed by the imaging device 1 according to the present embodiments may be configured to be provided or distributed by way of a network such as the Internet.
  • a program executed by the imaging device 1 may be configured to be previously incorporated into ROM etc. and be provided.
  • An imaging device configured to operate digitally and to be able to realize a zero shutter lag (ZSL) operation of generating an image at a time point at which a shooting instruction by a user is performed by using captured images obtained before a time point at which the shooting instruction by the user is detected, the device including:
  • a shooting prediction unit configured to predict the shooting instruction by the user; and
  • an imaging control unit configured to change control of an image sensor in the ZSL operation in case of a shooting state where it is predicted by the shooting prediction unit that the shooting instruction by the user is able to be performed at a time point before the shooting instruction is detected.
  • the imaging control unit is configured to, in the predicted shooting state, set a frame rate for imaging pixel signals from the image sensor higher than at a time point before the shooting state.
  • the imaging control unit is configured to set a higher frame rate for imaging pixel signals from the image sensor as a state of the imaging device approaches the shooting state.
  • the imaging control unit is configured to set a first frame rate higher than at the time point before the predicted shooting state in the shooting state and then to set a second frame rate lower than the first frame rate.
  • the imaging control unit is configured to set the second frame rate lower than the first frame rate when the shooting instruction by the user is detected, when a motion of the imaging device is greater than a predetermined threshold, or when a predetermined time has elapsed from setting of the first frame rate.
  • the change of control of the image sensor is a change in a number of sampling operations for sampling pixel signals from the image sensor.
  • the imaging control unit is configured to change an exposure time in the image sensor in the predicted shooting state.
  • the shooting prediction unit is configured to detect a motion of the imaging device and to predict that a state of the imaging device is the shooting state when a size of the detected motion of the imaging device is smaller than a predetermined threshold.
  • the imaging device further comprises a gyro sensor
  • the shooting prediction unit is configured to detect the motion of the imaging device based on an output of the gyro sensor.
  • the shooting prediction unit is configured to calculate a motion vector of a subject from at least two captured images acquired by using the image sensor and to detect the motion of the imaging device based on the motion vector.
  • the shooting prediction unit is configured to learn the detected motion of the imaging device for each photographer and to set, for each photographer, the threshold for predicting that the state of the imaging device is the shooting state, based on the learned information.
  • the imaging device further comprises another image sensor arranged to be able to image a side opposite to the image sensor in an optical axis direction of the image sensor used for shooting, and
  • the shooting prediction unit is configured to detect a state of a photographer from a captured image acquired by using the other image sensor and to predict the shooting state based on the state of the photographer.
  • the shooting prediction unit is configured to predict that a state of the imaging device is the shooting state when a target subject is detected from a captured image acquired by using the image sensor.
  • a control method for a digital imaging device configured to be able to realize a zero shutter lag (ZSL) operation of generating an image at a time point at which a shooting instruction by a user is performed by using captured images obtained before a time point at which the shooting instruction by the user is detected, the method including:
  • a computer program product storing a program to be executed by a computer of an imaging device, the program according to (15) .

Abstract

An imaging device according to an embodiment is a digital imaging device configured to be able to realize a zero shutter lag (ZSL) operation of generating an image at a time point at which a shooting instruction by a user is performed by using captured images obtained before a time point at which the shooting instruction by the user is detected. The imaging device includes a shooting prediction unit and an imaging control unit. The shooting prediction unit predicts the shooting instruction by the user. The imaging control unit changes control of an image sensor in the ZSL operation in case of a shooting state where it is predicted by the shooting prediction unit that the shooting instruction by the user is able to be performed at a time point before the shooting instruction is detected.

Description

IMAGING DEVICE, IMAGING DEVICE CONTROL METHOD, AND COMPUTER PROGRAM PRODUCT
[Technical Field]
Embodiments of the present disclosure relate to an imaging device, an imaging device control method, and a computer program product.
[Background Art]
There have been known technologies of compositing multiple captured images to achieve high image quality, such as HDR (High Dynamic Range) composition and NR (Noise Reduction), in imaging devices such as smartphones.
For example, in scenes where a shooting subject is moving or the like, artifacts called motion blur may be generated by a motion of the subject during acquisition of the multiple images for the composition. The motion blur can be reduced by acquiring the images for composition as close in time as possible. For this reason, there have been known technologies of acquiring the multiple images at short time intervals by driving the imaging device at a high frame rate in a preview state before shooting.
[Prior Art document (s) ]
[Patent literature 1]
U.S. Patent Application Publication No. 2020/0221008
[Disclosure of Invention]
[Problem to be Solved by the Invention]
However, there is a problem that power consumption is increased by driving the imaging device at the high frame rate in advance. Imaging devices with small bodies, such as smartphones, are more strongly affected by the increased power consumption from the viewpoints of heat dissipation and battery capacity.
An object of the present disclosure is to acquire images with less motion blur while suppressing power consumption in the preview state.
[Means for Solving Problem]
An imaging device according to an aspect of an embodiment is a digital imaging device configured to be able to realize a zero shutter lag (ZSL) operation of generating an image corresponding to a time point at which a shooting instruction by a user is performed, by using captured images obtained before a time point at which the shooting instruction by the user is detected. The imaging device includes a shooting prediction unit and an imaging control unit. The shooting prediction unit predicts the shooting instruction by the user. The imaging control unit changes control of an image sensor in the ZSL operation when a shooting state in which the shooting instruction by the user can be performed is predicted by the shooting prediction unit at a time point before the shooting instruction is detected.
A control method according to another aspect of the embodiment is executed by a digital imaging device configured to be able to realize a zero shutter lag (ZSL) operation of generating an image corresponding to a time point at which a shooting instruction by a user is performed, by using captured images obtained before a time point at which the shooting instruction by the user is detected. The method includes: predicting the shooting instruction by the user; and changing control of an image sensor in the ZSL operation when a shooting state in which the shooting instruction by the user can be performed is predicted at a time point before the shooting instruction is detected.
A computer program product according to yet another aspect of the embodiment stores a program to be executed by a computer of a digital imaging device configured to be able to realize a zero shutter lag (ZSL) operation of generating an image corresponding to a time point at which a shooting instruction by a user is performed, by using captured images obtained before a time point at which the shooting instruction by the user is detected. The program causes the computer to execute: predicting the shooting instruction by the user; and changing control of an image sensor in the ZSL operation when a shooting state in which the shooting instruction by the user can be performed is predicted at a time point before the shooting instruction is detected.
[Effect of the Invention]
According to the embodiments of the present disclosure, it is possible to acquire images with less motion blur while suppressing power consumption in the preview state.
[Brief Description of Drawings]
FIG. 1 is a diagram illustrating an example of a configuration of an imaging device according to a first embodiment;
FIG. 2 is a diagram illustrating an example of a functional configuration of a controller according to the first embodiment;
FIG. 3 is a diagram explaining shooting prediction and control change of an image sensor in control processing according to the first embodiment;
FIG. 4 is a flowchart illustrating an example of a flow of the control processing according to the first embodiment;
FIG. 5 is a diagram explaining shooting prediction in control processing according to a second embodiment;
FIG. 6 is a diagram illustrating an example of a configuration of an imaging device according to a third embodiment;
FIG. 7 is a diagram explaining shooting prediction in control processing according to the third embodiment;
FIG. 8 is a diagram explaining control change of an image sensor in control processing according to a fourth embodiment;
FIG. 9 is a diagram explaining thresholds of motion detection in control processing according to a fifth embodiment;
FIG. 10 is a diagram explaining control change of an image sensor in control processing according to a sixth embodiment;
FIG. 11 is a diagram explaining control change of an image sensor in control processing according to a seventh embodiment; and
FIG. 12 is a diagram explaining control change of an image sensor in control processing according to an eighth embodiment.
[Embodiment (s) of Carrying Out the Invention]
Hereinafter, an imaging device, an imaging device control method, a program, and a computer program product according to embodiments will be described in detail with reference to the accompanying drawings. Note that the present invention is not limited to the embodiments.
In the description of the present embodiments, components having the same or substantially the same functions as previously described components are given the same reference numbers, and their descriptions may be omitted as appropriate. Moreover, even when the same or substantially the same part is illustrated, its dimensions and proportions may differ from drawing to drawing. Moreover, to keep the drawings legible, reference numbers may be assigned only to the main components described for each drawing, and components having the same or substantially the same functions as previously described components may be left unnumbered.
Technologies of compositing multiple captured images to achieve high image quality, such as HDR (High Dynamic Range) composition and NR (Noise Reduction) , have been known in imaging devices such as smartphones.
For example, in scenes where a shooting subject is moving, an artifact called motion blur may be generated by the motion of the subject while the multiple images for composition are being acquired. Motion blur can be reduced by acquiring the images for composition as close together in time as possible. For this reason, technologies have been known in which the multiple images are acquired at short time intervals by driving the imaging device at a high frame rate in a preview state before shooting.
However, there is a problem that power consumption is increased by driving the imaging device at the high frame rate in advance. Imaging devices with small bodies, such as smartphones, are more strongly affected by the increased power consumption from the viewpoints of heat dissipation and battery capacity.
For example, power consumption can be reduced if the imaging device is driven at a low frame rate in the preview state before shooting and then driven at a higher frame rate just before shooting. In this situation, many smartphones employ ZSL (Zero Shutter Lag) in order to reduce the delay between the shooting instruction by the user and the detection of the shooting instruction on the controller side. In the ZSL operation, images taken during the preview are temporarily stored, and when the controller side detects the shooting instruction, a composite image corresponding to the time point at which the shooting instruction of the user was performed is generated by using the stored captured images. For this reason, in smartphones employing ZSL, it has been difficult to reduce power consumption by the method of raising the frame rate just before shooting.
Moreover, in smartphones employing ZSL, there is a problem that, even if the frame rate is changed at the time point at which the shooting instruction of the user is detected on the system side, the captured images for the time point at which the shooting instruction was performed have already been taken at the low frame rate, and thus the motion blur in the composite image cannot be suppressed.
Therefore, an imaging device, an imaging device control method, a program, and a computer program product that can obtain images with less motion blur while suppressing power consumption during the preview will be described in the following embodiments.
Note that the following embodiments illustrate a digital imaging device configured to be able to realize ZSL triggered by the shooting instruction by the user. Moreover, the following embodiments illustrate an imaging device configured to be able to realize high-image-quality processing, such as HDR (High Dynamic Range) composition and NR (Noise Reduction) , which obtains a composite image by using a plurality of captured images obtained during the ZSL operation.
First Embodiment
FIG. 1 is a diagram illustrating an example of a configuration of an imaging device 1 according to the first embodiment. As illustrated in FIG. 1, the imaging device 1 includes an imaging unit 10, a controller 21, a digital signal processor 23, a memory 25, an input interface 27, and a gyro sensor 29. The imaging unit 10 is electrically connected to the DSP (digital signal processor) 23. The controller 21, the digital signal processor 23, the memory 25, the input interface 27, and the gyro sensor 29 are connected to be able to communicate with one another via a signal line such as a bus 31, for example.
The imaging unit 10 images a subject field to generate a captured image (image data) . As illustrated in FIG. 1, the imaging unit 10 includes an optical system 11, an image sensor 13, and an analog front end (AFE) 15.
The optical system 11 includes an optical element configured to form an image of a light beam from a subject on an imaging surface 131 of the image sensor 13. Note that FIG. 1 exemplifies a single lens as the optical element of the optical system 11, but the present embodiment is not limited thereto. The optical system 11 may achieve the desired imaging performance with at least one optical element having refractive power. In other words, the optical system 11 may be composed of a compound lens including at least one single lens, or of a combination of a lens system and a reflection system.
The image sensor 13 images a subject field to generate an image signal. The image sensor 13 is arranged on an optical axis of the optical system 11. The image sensor 13 is arranged at a position at which the image of the light beam from the subject is formed by the optical system 11. The image sensor 13 can appropriately employ a solid-state imaging device such as CCD (Charge Coupled Device) and CMOS (Complementary Metal-Oxide Semiconductor) .
The AFE 15 is an analog signal processor configured to perform analog processing such as amplification processing on the image signal read from the image sensor 13.
Optionally, the AFE 15 includes a correlated double sampler (CDS) , a gain control amplifier (GCA) , and an A/D converter (ADC) . The CDS is provided downstream of the image sensor 13 and performs noise rejection on the image signal from the image sensor 13. The GCA is provided downstream of the CDS and amplifies the noise-rejected image signal in accordance with the control of the controller 21. The ADC is provided downstream of the GCA and converts the amplified image signal into digital-format image data (a captured image) . Note that a part or the whole of the function of the AFE 15 may be realized inside the image sensor 13.
Note that the imaging unit 10 is configured to be able to change a focus position. Herein, "to be able to change a focus position" means that, for each of at least two object points at different positions in an optical axis direction of the optical system 11, the image formed on the imaging surface 131 can be made smaller than the diameter of a permissible circle of confusion. The diameter of the permissible circle of confusion is defined by, for example, the pixel pitch of the image sensor 13 or the imaging performance of the optical system 11. In other words, the imaging unit 10 is configured to be able to focus on or blur (bokeh) an arbitrary subject. Optionally, the imaging unit 10 is configured to be able to move at least one of an image-side focus position of the optical system 11, an object-side focus position of the optical system 11, and the imaging surface 131 of the image sensor 13 in the optical axis direction of the optical system 11.
The controller 21 controls each component of the imaging device 1 in accordance with a program stored in an internal memory or the memory 25. The controller 21 includes a processor and a memory as hardware resources. The processor can appropriately employ various processors such as CPU (Central Processing Unit) , DSP (Digital Signal Processor) , ASIC (Application Specific Integrated Circuit) , and FPGA (Field-Programmable Gate Array) . Moreover, the memory can appropriately employ various memories such as ROM (Read Only Memory) , a flash memory, and RAM (Random Access Memory) . Note that the controller 21 may employ a microcomputer.
The DSP 23 is provided downstream of the AFE 15 and performs, on the images from the AFE 15, various image processing required for displaying and recording images. The image processing includes, for example, an optical black (OB) subtraction process, a white balance (WB) correction process, a demosaic process, a color conversion process, a gamma conversion process, a noise reduction process, an enlargement/reduction process, a compression process, and the like.
The memory 25 stores a program required for operations of the imaging device 1. Moreover, the memory 25 stores information required for various processes of the imaging device 1. This information includes, for example, information indicating correspondence between the size of the motion of the imaging device 1 and the frame rate. Moreover, the memory 25 temporarily stores therein various data such as the image output from the DSP  23 and processing data in the controller 21. The memory 25 includes, as hardware resources, a nonvolatile memory such as ROM and a flash memory, and a volatile memory such as DRAM (Dynamic RAM) , SDRAM (Synchronous DRAM) , and SRAM (Static RAM) .
The input interface 27 accepts an input operation of the user in the imaging device 1 through various input devices such as a touch panel display, a switch, a button, a keyboard, and a microphone accepting a voice input of the user, for example. As an example, the input interface 27 accepts a release operation (shooting instruction) that indicates the shooting of the user.
The gyro sensor 29 is configured to output a detection signal with a value according to the size of a motion such as a shake of the imaging device 1.
FIG. 2 is a diagram illustrating an example of a functional configuration of the controller 21 according to the first embodiment. The controller 21 realizes functions of an imaging control module 211, a shooting prediction module 213, and a shooting instruction detection module 215 by a processor executing a program developed on the RAM of the internal memory or the memory 25. Herein, the controller 21 realizing a function of the imaging control module 211 is an example of an imaging control unit. Moreover, the controller 21 realizing a function of the shooting prediction module 213 is an example of a shooting prediction unit. Moreover, the controller 21 realizing a function of the shooting instruction detection module 215 is an example of an instruction detection unit.
Note that the modules 211, 213, and 215 may be realized by a single processor or by a combination of a plurality of independent processors. Moreover, each of the modules 211, 213, and 215 may be realized in an integrated manner or distributed across a plurality of processors.
Based on an AE evaluation value indicating a subject brightness in the captured image, the imaging control module 211 performs automatic exposure (AE) processing for setting imaging conditions that include an aperture value and a shutter speed value. The imaging control module 211 performs the AE processing by using a first release operation of the user as a trigger, for example.
Moreover, based on focus information acquired from an image and the like, the imaging control module 211 performs automatic focus (AF) adjustment processing for controlling the drive of at least one of the focusing lens and the image sensor 13 included in the optical system 11. The focus information is, for example, an AF evaluation value (contrast value) calculated from the captured image. Moreover, when the image sensor 13 is configured to have a focus detection pixel, the focus information may be a defocusing amount calculated from the output of the focus detection pixel.
Moreover, the imaging control module 211 performs shooting processing of controlling the imaging unit 10 to acquire a captured image. The imaging control module 211 performs the shooting processing by using a second release operation (shooting instruction) of the user as a trigger, for example. Herein, the first release operation and the second release operation include an operation of tapping an arbitrary subject or the like on a touch panel display (not illustrated) during the preview display. Note that the preview display may be referred to as a live view display.
Moreover, the imaging control module 211 performs shooting processing of controlling the imaging unit 10 and the memory 25 to acquire a composite image, that is, shooting processing with the ZSL operation. In the ZSL operation, the imaging control module 211 temporarily stores captured images obtained during the preview in the memory 25. When the user's operation (shooting instruction) is detected by the shooting instruction detection module 215, the imaging control module 211 reads from the memory 25 the plurality of captured images obtained during a predetermined period before this detection time point. The imaging control module 211 then performs high image quality processing of compositing the read captured images by the DSP 23 to acquire a composite image.
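As a minimal sketch of this temporary storage (the class name ZslBuffer, the fixed capacity, the monotonic timestamps, and the latency handling are illustrative assumptions, not part of the disclosure), the bookkeeping of the ZSL operation can be modeled as follows:

import collections
import time

class ZslBuffer:
    # Ring buffer holding the most recent preview frames with capture timestamps.
    def __init__(self, capacity=30):
        self.frames = collections.deque(maxlen=capacity)

    def push(self, frame):
        # Called for every preview frame; the oldest frame is discarded
        # automatically once the buffer is full.
        self.frames.append((time.monotonic(), frame))

    def frames_before(self, t_detect, latency, count):
        # Approximate the user's press time by subtracting the system latency
        # from the detection time, then return the `count` frames captured
        # just before that time point.
        t_press = t_detect - latency
        earlier = [f for (ts, f) in self.frames if ts <= t_press]
        return earlier[-count:]

When the shooting instruction is detected, frames_before() would supply the stored captured images that are composited into the output image.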
Moreover, the imaging control module 211 changes the control of the image sensor 13 in the ZSL operation when the shooting prediction module 213 predicts, at a time point before the shooting instruction by the user is detected by the shooting instruction detection module 215, a shooting state in which the shooting instruction by the user can be performed.
As an example, the imaging control module 211 changes a frame rate for imaging pixel signals from the image sensor 13 in accordance with the shooting state predicted by the shooting prediction module 213. The details of the control change of the image sensor 13 will be described below.
In the shooting processing with the ZSL operation, the shooting prediction module 213 performs shooting prediction of predicting a shooting state. Herein, the prediction of the shooting state includes directly or indirectly determining which shooting state the imaging device is in. Moreover, the shooting prediction includes directly or indirectly determining the closeness in time to the shooting state.
As an example, the shooting prediction module 213 performs the shooting prediction based on the size of the motion of the imaging device 1. For example, the shooting prediction module 213 detects the size of the motion of the imaging device 1 based on the detection signal output from the gyro sensor 29. Then, the shooting prediction module 213 predicts the shooting state based on whether the size of the motion of the imaging device 1 is within a predetermined threshold range. The details of the shooting prediction will be described below.
Note that the shooting prediction module 213 may perform the shooting prediction by using a threshold with respect to the detection signals from the gyro sensor 29, instead of the threshold with respect to the size of the motion of the imaging device 1. Alternatively, the shooting prediction module 213 may perform the shooting prediction by using a threshold with respect to a convergence rate of the size of the motion, which indicates how far the motion has converged relative to the motion of the imaging device 1 in a first period in which the preview display is started.
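A minimal sketch of the threshold-based prediction described above follows; the threshold values, units, and function names are illustrative assumptions:

import math

# Hypothetical threshold values (deg/s); the actual values are design choices.
THRESHOLD_SECOND = 5.0  # corresponds to the threshold 401 of the second period
THRESHOLD_THIRD = 1.0   # corresponds to the threshold 403 of the third period

def motion_size(gyro_xyz):
    # Size of the motion: magnitude of the angular-velocity vector reported
    # by the gyro sensor 29.
    return math.sqrt(sum(v * v for v in gyro_xyz))

def predict_shooting_state(gyro_xyz):
    size = motion_size(gyro_xyz)
    if size <= THRESHOLD_THIRD:
        return "third"   # shooting instruction can be performed
    if size <= THRESHOLD_SECOND:
        return "second"  # shooting instruction is near
    return "first"       # composition not yet decided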
The shooting instruction detection module 215 detects the shooting instruction performed by the user based on the output of the input interface 27 according to the second release operation (shooting instruction) , for example.
Herein, the shooting processing with the ZSL operation in the imaging device 1 according to the embodiment will be described. FIG. 3 is a diagram explaining the shooting prediction and the control change of the image sensor in control processing according to the first embodiment.
Herein, a scene in which the user shoots a subject while gripping the imaging device 1 will be described with reference to FIG. 3. Note that it is assumed that the imaging device 1 has already started the preview display by the imaging control module 211 and the shooting prediction by the shooting prediction module 213.
<First Period>
In the present embodiment, a period of a state where a composition has not yet been decided is referred to as the first period. In the first period, the user, gripping the imaging device 1, considers a composition for shooting the subject while looking at the preview display, for example. Because the composition has not yet been decided, the user moves the imaging device 1 so that the subject fits within the angle of view, or moves it to try various compositions for the subject, while looking at the preview display. For this reason, as illustrated in FIG. 3, the motion of the imaging device 1 in the first period is large.
When the motion of the imaging device 1 is large, the position of the subject with respect to the angle of view changes between the plurality of captured images 601, 602, and 603 obtained at the frame rate 411 of the first period. For this reason, when the motion of the imaging device 1 is large, a motion blur 501 occurs in a composite image 611 generated by compositing the plurality of captured images 601, 602, and 603.
<Second Period>
A period in a state where the composition is being decided after the first period is referred to as a second period. The second period is a period of the shooting state where the shooting instruction of the user is near. As an example, the second period is a period of about 1 to 3 seconds. In the second period, the user adjusts the direction and position of the imaging device 1 while looking at the preview display, for example, in order to fix the composition for the subject. For this reason, as illustrated in FIG. 3, the motion of the imaging device 1 in the second period is smaller than that in the first period.
As an example, when the size of the motion of the imaging device 1 is within a predetermined range of a threshold 401 of the second period, the shooting prediction module 213 predicts that the current state is the shooting state of the second period where the shooting instruction is near.
As an example, when it is predicted by the shooting prediction module 213 that the current state is the shooting state of the second period where the shooting instruction is near, the imaging control module 211 changes the control to drive the image sensor 13 at a frame rate 413 of the second period higher than the frame rate 411 of the first period.
As described above, a higher frame rate for imaging pixel signals from the image sensor 13 is set as the current state approaches the shooting state of the third period, described below. As a result, because a higher frame rate is set as the likelihood that the shooting instruction by the user will be performed increases, even if the user performs the shooting instruction at this stage, the motion blur can be reduced while power consumption is kept lower than with control that drives the image sensor 13 at a high frame rate from the time point at which the preview display is started, for example.
Note that a plurality of the frame rates 413 of the second period may be set in accordance with the size of the motion of the imaging device 1. In other words, each of the plurality of frame rates 413 of the second period may have a higher value as the current state approaches the shooting state of the third period. For example, based on a plurality of predetermined thresholds 401 of the second period, the shooting prediction module 213 predicts the closeness in time to the shooting state of the third period as the shooting state at that time point. At this time, the imaging control module 211 may select the frame rate 413 of the second period to be set from a table indicating a predetermined frame rate for each of the thresholds 401 of the second period. Alternatively, the imaging control module 211 may set the frame rate 413 of the second period seamlessly in accordance with the closeness to the shooting state, by using a relational expression between a frame rate and a measure of closeness to the shooting state of the third period, such as the size of the motion of the imaging device 1. A minimal sketch of both options is shown below.
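The following sketch illustrates both the table lookup and the relational expression; all numeric values and names are illustrative assumptions:

# Hypothetical table: thresholds 401 on the size of the motion (deg/s)
# paired with the frame rate 413 (fps) assigned to each.
FRAME_RATE_TABLE = [(5.0, 15), (3.0, 24), (1.5, 30)]

def frame_rate_from_table(motion_size, base_fps=10):
    fps = base_fps  # frame rate 411 of the first period
    for threshold, rate in FRAME_RATE_TABLE:
        if motion_size <= threshold:
            fps = rate  # smaller motion -> closer to shooting -> higher rate
    return fps

def frame_rate_seamless(motion_size, fps_min=10, fps_max=60, motion_max=5.0):
    # Relational expression: the frame rate rises continuously as the motion
    # converges toward the shooting state of the third period.
    closeness = max(0.0, min(1.0, 1.0 - motion_size / motion_max))
    return fps_min + closeness * (fps_max - fps_min)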
<Third Period>
A period of the state where the composition is substantially decided after the second period is referred to as the third period. The third period is a period of the shooting state where the shooting instruction of the user can be performed. As an example, the third period is a period of about 1 second. In the third period, the user times the shooting or fine-tunes the direction and position of the imaging device 1 while looking at the preview display, for example. For this reason, as illustrated in FIG. 3, the motion of the imaging device 1 in the third period is even smaller than that in the second period.
As an example, when the size of the motion of the imaging device 1 is within a predetermined range of a threshold 403 of the third period, the shooting prediction module 213 predicts that the current state is the shooting state of the third period where the shooting instruction of the user can be performed.
As an example, in the shooting state of the third period, in which it is predicted by the shooting prediction module 213 that the shooting instruction by the user can be performed, the imaging control module 211 sets a frame rate for imaging pixel signals from the image sensor 13 higher than that at a time point before the shooting state of the third period, that is, higher than in the first period and the second period. Herein, the frame rate 415 of the third period, which is set in the shooting state where it is predicted that the shooting instruction by the user can be performed, is an example of a first frame rate. Moreover, the imaging control module 211 starts the ZSL operation. In other words, the imaging control module 211 acquires captured images at the frame rate 415 of the third period and sequentially stores them in the memory 25, for example.
<After Third Period>
A case where the shooting instruction by the user is performed after the third period will be described.
As an example, the shooting instruction detection module 215 detects the shooting instruction by the user based on the output of the input interface 27 according to the shooting instruction of the user.
The imaging control module 211 reads from the memory 25 a plurality of captured images 621, 622, and 623 obtained before the time point at which the shooting instruction by the user is detected. Then, by using the read captured images 621, 622, and 623, the imaging control module 211 generates an image 631 corresponding to the time point at which the shooting instruction by the user was performed.
A frame rate higher than those in the first period and the second period is set in the third period. For this reason, the change in the position of the subject with respect to the angle of view is small between the plurality of captured images 621, 622, and 623 obtained at the frame rate 415 of the third period. Consequently, the composite image 631 generated by compositing the captured images 621, 622, and 623 has only a small motion blur 503.
Moreover, after the high frame rate is set in the shooting state of the third period where it is predicted that the shooting instruction by the user can be performed, the imaging control module 211 sets a frame rate 417 lower than the frame rate 415 of the third period when, for example, the shooting instruction of the user is detected by the shooting instruction detection module 215. As an example, the low frame rate 417 set after the shooting state may be the same as the frame rate at a time point before the shooting state of the third period, such as in the first period or the second period, or may be a different frame rate. Herein, the frame rate 417 is an example of a second frame rate.
As described above, according to the imaging device 1 of the present embodiment, in the step where the preview display is started, that is, in the first period, the control of driving the image sensor 13 is performed at the frame rate 411 of the first period that is a normal frame rate. Moreover, when the shooting state of the second period where the shooting instruction is near is predicted, the control of the image sensor 13 is changed to drive the image sensor 13 at the frame rate 413 of the second period higher than the frame rate 411 of the first period. Moreover, when the shooting state of the third period where the shooting instruction can be performed is predicted, the control of the image sensor 13 is changed to drive the image sensor 13 at the frame rate 415 of the third period higher than the frame rate 413 of the second period. As a result, compared to the control of driving the image sensor 13 at a high frame rate at the time point at which the preview display is started, for example, it is possible to reduce the motion blur while reducing power consumption.
Moreover, when the imaging device 1 is a smartphone, for example, each module according to the present embodiment, such as the shooting instruction detection module 215, is realized by a shooting application installed in the smartphone. For this reason, the shooting instruction detection module 215 detects the output of the input interface 27 according to the shooting instruction of the user via the OS (operating system) of the smartphone, for example. As described above, an imaging device 1 such as a smartphone has a system delay from the time point at which the shooting instruction by the user is performed to the time point at which the shooting instruction is detected by the shooting instruction detection module 215. For this reason, even if control to change the frame rate is performed at the time point at which the shooting instruction is detected, the captured image for the time point at which the shooting instruction was performed has already been acquired at the frame rate before the change.
In this situation, according to the imaging device 1 of the present embodiment, when the shooting state of the third period where the shooting instruction can be performed is predicted, the ZSL operation is started and the control of the image sensor 13 is changed to drive the image sensor 13 at the frame rate 415, which is higher than those in the first period and the second period. Therefore, according to the imaging device 1 of the embodiment, even if there is a system delay, the control of the image sensor 13 can be changed, based on the shooting prediction, at an appropriate timing that is neither too early nor too late. Optionally, according to the imaging device 1 of the embodiment, because the low-frame-rate control is performed at the time point at which the preview display is started while the high-frame-rate control is already in effect at the time point at which the shooting instruction is performed, the captured images stored in the ZSL operation can be acquired at the changed, higher frame rate while power consumption is reduced.
Note that the ZSL operation may be started in the first period or the second period. Even in this case, compared to the case where the preview display is started and concurrently the high frame rate is set, it is possible to reduce the motion blur while reducing power consumption.
Note that the second period may not exist depending on the time series of the motion of the imaging device 1.
Herein, an example of operations of the imaging device 1 according to the embodiment will be described. FIG. 4 is a flowchart illustrating an example of a flow of control processing according to the first embodiment. The flow illustrated in FIG. 4 is implemented, for example, in a scene of shooting a moving subject, but may also be implemented in other scenes or regardless of the scene.
First, the preview display by the imaging control module 211 and the shooting prediction by the shooting prediction module 213 are started (S101) .
As an example, the imaging control module 211 starts imaging by the image sensor 13 by using imaging conditions such as a shutter speed (exposure time) and the predetermined normal frame rate 411 of the first period, and starts the preview display of displaying captured images on a display (not illustrated) . Note that the captured images for the preview display may be lower-quality images whose exposure time, number of pixels, and the like are reduced relative to a captured image used for image generation according to the shooting instruction of the user, or for which a part of the image processing is omitted or simplified. Note that the preview display may be referred to as a live view display.
As an example, the shooting prediction module 213 starts the shooting prediction based on the size of the motion of the imaging device 1. The shooting prediction module 213 according to the present embodiment performs the shooting prediction of predicting a shooting state by comparing the size of the motion of the imaging device 1, acquired based on the detection signal from the gyro sensor 29, with the predetermined ranges of the thresholds 401 and 403.
When the shooting state where the shooting instruction is near is predicted by the shooting prediction module 213 (S102: YES) , the imaging control module 211 changes the control of the image sensor 13 (S103) .
As an example, when the size of the motion of the imaging device 1 is within the predetermined range of the threshold 401 of the second period, the shooting prediction module 213 predicts that the current state is the shooting state where the shooting instruction is near. At this time, the imaging control module 211 sets the frame rate 413 of the second period, previously defined in association with the threshold 401 of the second period. In other words, when it is predicted that the current state is the shooting state of the second period where the shooting instruction is near, the imaging control module 211 changes the control of the image sensor 13 from the frame rate 411 of the first period to the frame rate 413 of the second period, which is higher.
When the shooting state where the shooting instruction can be performed is predicted by the shooting prediction module 213 (S104: YES) , the imaging control module 211 starts the ZSL operation and concurrently changes the control of the image sensor 13 (S105) .
As an example, when the size of the motion of the imaging device 1 is within the predetermined range of the threshold 403 of the third period, the shooting prediction module 213 predicts that the current state is the shooting state where the shooting instruction can be performed. At this time, the imaging control module 211 sets the frame rate 415 of the third period, previously defined in association with the threshold 403 of the third period. In other words, when it is predicted that the current state is the shooting state of the third period where the shooting instruction can be performed, the imaging control module 211 changes the control of the image sensor 13 in the ZSL operation from the frame rate 413 of the second period to the frame rate 415 of the third period, which is higher.
After Step S105 is executed, or when neither the shooting state where the shooting instruction is near nor the shooting state where the shooting instruction can be performed is predicted by the shooting prediction module 213 (S102, S104: NO) , the flow illustrated in FIG. 4 proceeds to Step S106.
When the output of the input interface 27 according to the shooting instruction of the user is detected by the shooting instruction detection module 215 (S106: YES) , the imaging control module 211 generates an image based on the captured images (S107) .
As an example, in the case of the first period or the second period, before the shooting state of the third period where the shooting instruction can be performed is predicted, the imaging control module 211 generates an image by using the captured images obtained at the time point at which the shooting instruction by the user is detected.
As an example, in the case of the third period and thereafter, in which the ZSL operation has been started in accordance with the prediction of the shooting state of the third period where the shooting instruction can be performed, the imaging control module 211 generates, by the ZSL operation, an image corresponding to the time point at which the shooting instruction was performed by the user, by using the captured images obtained around that time point.
When the changed control is to be returned (S108: YES) , the imaging control module 211 returns the control from the frame rate 415 of the third period to the frame rate 417, such as the frame rate 411 of the first period or the frame rate 413 of the second period (S109) . On the other hand, when the changed control is not to be returned (S108: NO) , Step S109 is skipped. Note that the changed control is to be returned when the shooting instruction by the user is detected in Step S106 after the control of the image sensor 13 has been changed in Step S103 and/or Step S105.
After that, the flow illustrated in FIG. 4 returns to Step S102 when shooting is not terminated (S110: NO) , and is terminated when shooting is terminated (S110: YES) . For example, when an operation indicating that the user terminates shooting is received by the input interface 27, it is determined that the shooting is terminated.
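The frame-rate decisions of this flow can be summarized in a minimal sketch; the rate values and the function name are illustrative assumptions:

FPS_FIRST, FPS_SECOND, FPS_THIRD = 10, 30, 60  # rates 411, 413, 415 (illustrative)

def next_frame_rate(state, instruction_detected):
    # One decision step of the flow of FIG. 4: raise the rate as the shooting
    # state approaches (S103, S105) and return to a lower rate after the
    # shooting instruction is detected (S108/S109).
    if instruction_detected:
        return FPS_FIRST  # frame rate 417 set after the shooting state
    if state == "third":
        return FPS_THIRD
    if state == "second":
        return FPS_SECOND
    return FPS_FIRST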
As described above, the imaging device 1 according to the present embodiment is configured to detect the motion of the imaging device 1 and to predict that the time until the user's shooting instruction is long when the motion is large and short when the motion is small. Moreover, the imaging device 1 is configured to set the higher frame rate 415 in the shooting state of the third period, when the motion of the imaging device 1 is small, compared with the frame rates 411 and 413 in the shooting states of the first and second periods, when the motion of the imaging device 1 is large.
According to this configuration, it is possible to predict the timing of shooting and to change the control of the image sensor 13 from a low frame rate to a high frame rate at an appropriate timing that is not too early relative to the predicted timing of shooting. For this reason, according to the technology of the embodiment, it is possible to obtain images with less motion blur while suppressing power consumption during the preview. Moreover, according to the technology of the embodiment, the frame rate can be changed from a low frame rate to a high frame rate just before shooting even when shooting is performed with ZSL, which is widely used in smartphones in particular.
Second Embodiment
Herein, the differences from the first embodiment will be mainly described, and the explanation of duplicated contents will be omitted as appropriate. FIG. 5 is a diagram explaining shooting prediction in control processing according to the second embodiment. The imaging device 1 according to the present embodiment may not include the gyro sensor 29. Instead, as illustrated in FIG. 5, the imaging device 1 is configured to detect the motion of the imaging device 1 based on captured images. As an example, as illustrated in FIG. 5, the shooting prediction module 213 calculates motion vectors 505, 507, and 509 of a subject P in an image 645 based on at least two captured images 641 and 642, and detects the motion of the imaging device 1 based on the calculated motion vectors 505, 507, and 509. Similarly, the shooting prediction module 213 calculates motion vectors 511 and 513 of the subject P in an image 647 based on at least two captured images 642 and 643, and detects the motion of the imaging device 1 based on the calculated motion vectors 511 and 513. Note that a motion vector may be calculated based on features such as an outline, eyes, a nose, and a mouth of the subject P.
As described above, the shooting prediction according to the present embodiment is to detect the motion of the imaging device 1 based on motion vectors of the subject calculated from at least two captured images acquired by using the image sensor 13. With this configuration, the same effects as those of the above embodiment are achieved without mounting the gyro sensor 29.
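As an illustrative sketch of such image-based motion detection (a crude exhaustive block-matching estimate of a single global motion vector, under the assumption of grayscale frames held in numpy arrays; the disclosure itself does not prescribe a particular matching algorithm):

import numpy as np

def global_motion_vector(prev, curr, search=8):
    # Estimate one global motion vector between two grayscale frames
    # (2-D numpy arrays of equal shape) by exhaustive block matching of the
    # central region; a crude stand-in for per-feature motion vectors.
    h, w = prev.shape
    block = prev[search:h - search, search:w - search].astype(np.int32)
    best_sad, best_vec = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = curr[search + dy:h - search + dy,
                        search + dx:w - search + dx].astype(np.int32)
            sad = np.abs(block - cand).sum()  # sum of absolute differences
            if best_sad is None or sad < best_sad:
                best_sad, best_vec = sad, (dx, dy)
    return best_vec  # a large vector implies a large motion of the device

The magnitude of the returned vector could then be compared against the thresholds 401 and 403 in the same way as in the first embodiment.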
Third Embodiment
Herein, the differences from the first embodiment will be mainly described, and the explanation of the duplicated contents will be omitted as appropriate. FIG. 6 is a diagram illustrating an example of a configuration of the imaging device 1 according to the third embodiment. The imaging device 1 according to the present embodiment may not include the gyro sensor 29. On the other hand, a body 101 of the imaging device 1 is provided with, as the imaging unit 10, a main camera 10a and a front camera 10b that have shooting directions different from each other. The main camera 10a is similar to the imaging unit 10 according to the above embodiment. The image sensor 13 of the front camera 10b is arranged to be able to shoot an opposite side to the image sensor 13 of the main camera 10a in the optical axis direction of the image sensor 13 of the main camera 10a.
FIG. 7 is a diagram explaining shooting prediction in control processing according to the third embodiment. The shooting prediction module 213 detects the state of the user (the subject P) as a photographer from captured images acquired by using the image sensor 13 of the front camera 10b, and predicts the shooting state based on the state of the photographer. For example, when the subject P is not shown, as in a captured image 651; when the subject P is cut off, as in a captured image 652; when the subject P is inclined, as in a captured image 653; or when the subject P is located in the periphery, as in a captured image 654, the shooting prediction module 213 predicts that the current state is the shooting state of the first period or the second period, in which the photographer does not hold a posture for shooting with the imaging device 1 because, for example, the composition has not been decided. On the other hand, when the photographer faces straight ahead in the central portion, as in a captured image 657, the shooting prediction module 213 predicts that the current state is the shooting state of the third period, in which the shooting instruction can be performed. Note that the shooting prediction module 213 may perform the shooting prediction based on the line of sight of the photographer, or may perform the shooting prediction by calculating motion vectors of the photographer or of the line of sight. Then, the imaging control module 211 changes the control of the image sensor 13 of the main camera 10a in accordance with the predicted shooting state.
As described above, the shooting prediction according to the present embodiment predicts whether the current state is a state just before shooting by observing the state of the photographer, without detecting the motion of the imaging device 1. With this configuration, the same effects as those of the above embodiment are achieved without mounting the gyro sensor 29, and even when an imaging target has not entered the angle of view. The shooting prediction according to the present embodiment is particularly useful when a tripod 3 or the like is used.
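A minimal sketch of such a posture check on the front-camera image follows; the centering and tilt margins are illustrative assumptions, and the face detection itself is assumed to be supplied by an external detector:

def photographer_in_shooting_posture(face_box, frame_size, tilt_deg):
    # Heuristic on the front-camera image: the photographer is considered to
    # be in the shooting posture (third period) when the detected face is
    # roughly centered and upright. face_box = (x, y, w, h); the 15% and
    # 10-degree margins are illustrative.
    frame_w, frame_h = frame_size
    x, y, w, h = face_box
    cx, cy = x + w / 2.0, y + h / 2.0
    centered = (abs(cx - frame_w / 2.0) < frame_w * 0.15
                and abs(cy - frame_h / 2.0) < frame_h * 0.15)
    return centered and abs(tilt_deg) < 10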
Fourth Embodiment
Herein, the differences from the first embodiment will be mainly described, and the explanation of duplicated contents will be omitted as appropriate. FIG. 8 is a diagram explaining control change of the image sensor 13 in control processing according to the fourth embodiment. The imaging device 1 according to the fourth embodiment returns the frame rate to the frame rate 417, which is lower than the frame rate 415 of the third period, when the shooting instruction is detected on the system side, as illustrated in (a) of FIG. 8. On the other hand, there may be a case where the shooting instruction is never detected on the system side. For this reason, the timing at which the frame rate is returned to a low frame rate may be a time at which the size of the motion of the imaging device 1 increases from a state 515 to a state 517 and exceeds the predetermined range of the threshold 403, as illustrated in (b) of FIG. 8. Alternatively, as illustrated in (c) of FIG. 8, the timing at which the frame rate is returned to the low frame rate may be a time at which a predetermined time has elapsed after the control is changed to the frame rate 415 of the third period. Note that a threshold different from the threshold 403 may be used. Moreover, a plurality of thresholds may be used to return to the low frame rate in multiple steps.
As described above, the imaging device 1 according to the embodiment sets the frame rate 417 lower than the frame rate 415 of the third period, when the shooting instruction by the user is detected by the shooting instruction detection module 215, when the motion of the imaging device 1 is greater than the predetermined threshold 403, or when the frame rate 415 of the third period is set and then a predetermined time has elapsed. With this configuration, because the control can be returned to the low frame rate at an appropriate timing even if the shooting instruction is not performed, it is possible to reduce power consumption.
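A minimal sketch of these three return conditions follows; the threshold and timeout values are illustrative assumptions:

import time

def should_return_to_low_rate(instruction_detected, motion_size, t_raised,
                              threshold=1.0, timeout=3.0):
    # Return to the low frame rate 417 when (a) the shooting instruction is
    # detected, (b) the motion exceeds the threshold 403 again, or (c) a
    # predetermined time has elapsed since the rate was raised.
    return (instruction_detected
            or motion_size > threshold
            or time.monotonic() - t_raised > timeout)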
Fifth Embodiment
Herein, the differences from the first embodiment will be mainly described, and the explanation of duplicated contents will be omitted as appropriate. FIG. 9 is a diagram explaining thresholds of motion detection in control processing according to the fifth embodiment. The shooting prediction module 213 records and learns, in the memory 25 for each photographer, a time series of the size of the motion of the imaging device 1 and information indicating the timing of the shooting instruction by the user. Then, based on the learned information, the shooting prediction module 213 determines thresholds 403a, 403b, and 403c for the respective photographers. As an example, the shooting prediction module 213 may determine, as the threshold for each photographer, a statistical value such as an average or median of the sizes of the motion at the time points of the shooting instructions. Alternatively, the shooting prediction module 213 may determine the threshold as the output of a machine learning model whose parameters are determined by using information indicating the time series of the motion and the shooting instruction timings on the input side and the preset and present thresholds on the output side. Note that the technology according to the present embodiment may also be applied to the threshold 401 of the second period, to a threshold for returning the frame rate, and to the frequency of motion detection.
As described above, the imaging device 1 according to the embodiment learns the detected motion of the imaging device 1 for each photographer, and sets a threshold for predicting the shooting state for each photographer based on the learned information. According to this configuration, the control of the image sensor 13 can be changed at a timing suitable for each photographer, in accordance with the sizes of the motion of the imaging device 1, which differ from photographer to photographer.
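A minimal sketch of the statistical variant (using the median) follows; the margin and default values are illustrative assumptions:

import statistics

def learn_threshold(motions_at_instruction, margin=1.2, default=1.0):
    # Per-photographer threshold: a statistic (here the median) of the sizes
    # of the motion observed at past shooting instructions, with a small
    # margin; `default` is used until enough samples have been recorded.
    if not motions_at_instruction:
        return default
    return statistics.median(motions_at_instruction) * margin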
Sixth Embodiment
Herein, the differences from the first embodiment will be mainly described, and the explanation of duplicated contents will be omitted as appropriate. FIG. 10 is a diagram explaining control change of the image sensor 13 in control processing according to the sixth embodiment. In the first period and the second period, the imaging control module 211 drives the image sensor 13 in a single-sampling mode 421 in which one sampling is performed for each frame. On the other hand, in the third period, the imaging control module 211 drives the image sensor 13 in a multi-sampling mode 425 in which a plurality of samplings are performed for each frame.
As described above, the imaging device 1 according to the present embodiment changes the number of sampling operations for sampling pixel signals from the image sensor 13 in accordance with the predicted shooting state. According to this configuration, the noise reduction of the multi-sampling operation can be obtained while the increase in power consumption due to the multi-sampling operation is suppressed.
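A minimal sketch of the averaging performed in the multi-sampling mode follows; read_pixels is a hypothetical stand-in for one sampling of the pixel signals:

import numpy as np

def multi_sample_frame(read_pixels, n_samples):
    # Average n_samples samplings of the pixel signals for one frame
    # (mode 425 when n_samples > 1, mode 421 when n_samples == 1);
    # uncorrelated read noise falls roughly as 1/sqrt(n_samples).
    samples = [read_pixels() for _ in range(n_samples)]
    return np.mean(samples, axis=0)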
Seventh Embodiment
Herein, the differences from the first embodiment will be mainly described, and the explanation of duplicated contents will be omitted as appropriate. FIG. 11 is a diagram explaining control change of the image sensor 13 in control processing according to the seventh embodiment. In the first period and the second period, the imaging control module 211 does not change the exposure time for each frame. In this case, even if the plurality of captured images obtained without changing the exposure time are composited, appropriate exposure cannot be realized for every subject: as in a composite image 661, the exposure is appropriate for a subject 523 (a bird) but excessive and saturated for a subject 521 (a cloud) and insufficient for a subject 525 (a human). On the other hand, power consumption increases when an HDR operation of changing the exposure time for each frame is performed in advance, and switching to the HDR operation at the time point at which the user's shooting instruction is detected is prevented by the system lag. In this situation, in the third period, the imaging control module 211 according to the present embodiment changes the control of the image sensor 13 to an HDR mode in which different exposure times are applied to the respective frames. In other words, the imaging control module 211 temporarily stores in the memory 25 a plurality of captured images 671, 672, and 673 having different exposure times, and realizes the HDR mode in the ZSL operation by generating a composite image 681 by HDR composition.
As described above, the imaging device 1 according to the present embodiment executes the HDR mode of changing the exposure time of the image sensor 13 for each frame in accordance with the predicted shooting state. According to this configuration, the ZSL operation can be performed while appropriate exposure is realized for each of the subjects 531, 533, and 535 by the HDR mode, as in the composite image 681.
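A minimal sketch of how exposure times might be scheduled per frame in the HDR mode follows; the bracket factors are illustrative assumptions:

def hdr_exposure_for_frame(base_exposure_s, frame_index):
    # In the HDR mode of the third period, a repeating exposure bracket
    # (short, base, long) is applied to the frames stored for the ZSL
    # operation, so that the HDR composition can merge an appropriately
    # exposed region from each stored frame.
    bracket = (0.5, 1.0, 2.0)
    return base_exposure_s * bracket[frame_index % len(bracket)]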
Eighth Embodiment
Herein, the differences from the first embodiment will be mainly described, and the explanation of duplicated contents will be omitted as appropriate. FIG. 12 is a diagram explaining control change of the image sensor 13 in control processing according to the eighth embodiment. The shooting prediction module 213 may perform the shooting prediction based on the recognition of a subject. As an example, in the state of a captured image 691 in which no subject is detected, the shooting prediction module 213 predicts that the current state is the shooting state of the first period. When a target subject is detected, the shooting prediction module 213 predicts that the current state is the shooting state of the third period. The time at which the target subject is detected is, for example, a time point T1 of a captured image 692 at which a subject 541 (e.g., a human face) is detected. Note that the time at which the target subject is detected may instead be, for example, a time point T2 of a captured image 694 at which a subject 543 enters the angle of view, a time point T3 of a captured image 695 at which the subject 543 completely enters the angle of view, or, when the subject 543 is assumed to be the target subject, a time point T4 of a captured image 696 at which the other subject 541 leaves the angle of view. At this time, the imaging control module 211 raises the frame rate to the frame rate 415 of the third period.
Moreover, the imaging control module 211 returns the frame rate to a low frame rate at a time point T5 at which a predetermined time has elapsed (Case 1), at the time point T3 of the captured image 695 at which the target subject 541 is out of the angle of view (Case 2), or at the time point T2 of the captured image 694 at which another subject 543 (e.g., a dog) enters the angle of view (Case 3). Alternatively, the imaging control module 211 may reset the elapsed-time count when the change to the high frame rate is performed at the time point T2 of the captured image 694 at which the other target subject 543 (e.g., a dog) enters the angle of view, and return the frame rate to the low frame rate at a time point T6 at which a predetermined time has elapsed from that time point (Case 4).
As described above, the shooting prediction according to the present embodiment is to predict that the current state is the shooting state of the third period when the target subject is detected from the captured image acquired by using the image sensor 13. Even with this configuration, the same effects as those of the above embodiment are achieved.
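A minimal sketch of the detection-based prediction follows; the label set and target are illustrative assumptions, and the subject recognition itself is assumed to be provided by an external recognizer:

def predict_by_subject(detected_labels, target="face"):
    # Subject-recognition-based prediction: the shooting state of the third
    # period is predicted as soon as the target subject is detected in the
    # captured image.
    return "third" if target in detected_labels else "first"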
In the embodiments described above, the shooting prediction of predicting that the current state is the shooting state of the second period or the third period may be performed when the first release operation, such as the user's selection (tapping) of the target subject on a touch panel display, is detected, for example.
Note that the technologies according to the embodiments described above can be combined as appropriate. For example, in the shooting prediction, the motion of the imaging device 1 may be detected by a plurality of means. For example, the shooting prediction may be performed based on at least two of the motion of the imaging device 1, the state of the photographer, and the state of the subject. For example, in the control of the image sensor 13, at least two of a frame rate, the number of samplings, and an exposure time may be changed. In other words, at least one shooting prediction described above and at least one control of the image sensor 13 may be combined as appropriate.
Note that the technologies according to the embodiments described above may also be applied to other processing for obtaining a composite image, such as video shooting and continuous shooting.
Note that a part or the whole of processing executed by the imaging device 1 according to the present embodiments may be realized by software.
A program executed by a computer of the imaging device 1 according to the present embodiments is recorded and provided on a computer-readable non-transitory recording medium (computer program product) , such as a flash memory (semiconductor memory) including a USB (Universal Serial Bus) memory or an SSD (Solid State Drive) , or an HDD (Hard Disk Drive) , as a file in an installable or executable format.
Moreover, a program executed by the imaging device 1 according to the present embodiments may be provided by being stored on a computer connected to a network such as the Internet and downloaded by way of the network. Moreover, a program executed by the imaging device 1 according to the present embodiments may be provided or distributed by way of a network such as the Internet.
Moreover, a program executed by the imaging device 1 according to the present embodiments may be incorporated in advance into a ROM or the like and provided in that form.
According to at least one embodiment described above, it is possible to obtain images with less motion blur while suppressing power consumption during the preview.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
(Supplementary Notes)
(1) An imaging device configured to operate digitally and to be able to realize a zero shutter lag (ZSL) operation of generating an image at a time point at which a shooting instruction by a user is performed by using captured images obtained before a time point at which the shooting instruction by the user is detected, the device including:
a shooting prediction unit configured to predict the shooting instruction by the user; and
an imaging control unit configured to change control of an image sensor in the ZSL operation in case of a shooting state where it is predicted by the shooting prediction unit that the shooting instruction by the user is able to be performed at a time point before the shooting instruction is detected.
(2) In the imaging device according to (1) , the imaging control unit is configured to, in the predicted shooting state, set a frame rate for imaging pixel signals from the image sensor higher than at a time point before the shooting state.
(3) In the imaging device according to (1) or (2) , the imaging control unit is configured to set a higher frame rate for imaging pixel signals from the image sensor as a state of the imaging device approaches the shooting state.
(4) In the imaging device according to (2) or (3) , the imaging control unit is configured to set a first frame rate higher than at the time point before the predicted shooting state in the shooting state and then to set a second frame rate lower than the first frame rate.
(5) In the imaging device according to (4) , the imaging control unit is configured to set the second frame rate lower than the first frame rate when the shooting instruction by the user is detected, when a motion of the imaging device is greater than a predetermined threshold, or when a predetermined time has elapsed from setting of the first frame rate.
(6) In the imaging device according to any one of (1) to (5) , the change of control of the image sensor is a change in a number of sampling operations for sampling pixel signals from the image sensor.
(7) In the imaging device according to any one of (1) to (6) , the imaging control unit is configured to change an exposure time in the image sensor in the predicted shooting state.
(8) In the imaging device according to any one of (1) to (7) , the shooting prediction unit is configured to detect a motion of the imaging device and to predict that a state of the imaging device is the shooting state when a size of the detected motion of the imaging device is smaller than a predetermined threshold.
(9) In the imaging device according to (8) , the imaging device further comprises a gyro sensor, and
the shooting prediction unit is configured to detect the motion of the imaging device based on an output of the gyro sensor.
(10) In the imaging device according to (8) or (9) , the shooting prediction unit is configured to calculate a motion vector of a subject from at least two captured images acquired by using the image sensor and to detect the motion of the imaging device based on the motion vector.
(11) In the imaging device according to any one of (8) to (10) , the shooting prediction unit is configured to learn the detected motion of the imaging device for each photographer and to set, for each photographer, the threshold for predicting that the state of the imaging device is the shooting state based on the learned information.
(12) In the imaging device according to any one of (1) to (11) , the imaging device further comprises another image sensor arranged to be able to image a side opposite to the image sensor in an optical axis direction of the image sensor used for shooting, and
the shooting prediction unit is configured to detect a state of a photographer from a captured image acquired by using the other image sensor and to predict the shooting state based on the state of the photographer.
(13) In the imaging device according to any one of (1) to (12) , the shooting prediction unit is configured to predict that a state of the imaging device is the shooting state when a target subject is detected from a captured image acquired by using the image sensor.
(14) A control method for a digital imaging device configured to be able to realize a zero shutter lag (ZSL) operation of generating an image at a time point at which a shooting instruction by a user is performed by using captured images obtained before a time point at which the shooting instruction by the user is detected, the method including:
predicting the shooting instruction by the user; and
changing control of an image sensor in the ZSL operation in case of a shooting state where it is predicted that the shooting instruction by the user is able to be performed at a time point before the shooting instruction is detected.
(15) A program to be executed by a computer of a digital imaging device configured to be able to realize a zero shutter lag (ZSL) operation of generating an image at a time point at which a shooting instruction by a user is performed by using captured images obtained before a time point at which the shooting instruction by the user is detected, the program causing the computer to execute:
predicting the shooting instruction by the user; and
changing control of an image sensor in the ZSL operation in case of a shooting state where it is predicted that the shooting instruction by the user is able to be performed at a time point before the shooting instruction is detected.
(16) A computer program product storing the program according to (15) to be executed by a computer of an imaging device.
[Explanations of Letters or Numerals]
1: Imaging device
10: Imaging unit
10a: Main camera
10b: Front camera
101: Body
11: Optical system
13: Image sensor
131: Imaging surface
15: Analog front end
21: Controller
211: Imaging control module
213: Shooting prediction module
215: Shooting instruction detection module
23: Digital signal processor
25: Memory
27: Input interface
29: Gyro sensor
31: Bus

Claims (15)

  1. An imaging device configured to operate digitally and to be able to realize a zero shutter lag (ZSL) operation of generating an image at a time point at which a shooting instruction by a user is performed by using captured images obtained before a time point at which the shooting instruction by the user is detected, the device comprising:
    a shooting prediction unit configured to predict the shooting instruction by the user; and
    an imaging control unit configured to change control of an image sensor in the ZSL operation in case of a shooting state where it is predicted by the shooting prediction unit that the shooting instruction by the user is able to be performed at a time point before the shooting instruction is detected.
  2. The imaging device according to claim 1, wherein the imaging control unit is configured to, in the predicted shooting state, set a frame rate for imaging pixel signals from the image sensor higher than at a time point before the shooting state.
  3. The imaging device according to claim 1, wherein the imaging control unit is configured to set a higher frame rate for imaging pixel signals from the image sensor as a state of the imaging device approaches the shooting state.
  4. The imaging device according to claim 2, wherein the imaging control unit is configured to set a first frame rate higher than at the time point before the predicted shooting state in the shooting state and then to set a second frame rate lower than the first frame rate.
  5. The imaging device according to claim 4, wherein the imaging control unit is configured to set the second frame rate lower than the first frame rate when the shooting instruction by the user is detected, when a motion of the imaging device is greater than a predetermined threshold, or when a predetermined time has elapsed from setting of the first frame rate.
  6. The imaging device according to claim 1, wherein the change of control of the image sensor is a change in a number of sampling operations for sampling pixel signals from the image sensor.
  7. The imaging device according to claim 1, wherein the imaging control unit is configured to change an exposure time in the image sensor in the predicted shooting state.
  8. The imaging device according to claim 1, wherein the shooting prediction unit is configured to detect a motion of the imaging device and to predict that a state of the imaging device is the shooting state when a size of the detected motion of the imaging device is smaller than a predetermined threshold.
  9. The imaging device according to claim 8, wherein
    the imaging device further comprises a gyro sensor, and
    the shooting prediction unit is configured to detect the motion of the imaging device based on an output of the gyro sensor.
  10. The imaging device according to claim 8, wherein the shooting prediction unit is configured to calculate a motion vector of a subject from at least two captured images acquired by using the image sensor and to detect the motion of the imaging device based on the motion vector.
  11. The imaging device according to claim 8, wherein the shooting prediction unit is configured to learn the detected motion of the imaging device for each photographer and to set, for each photographer, the threshold for predicting that the state of the imaging device is the shooting state based on the learned information.
  12. The imaging device according to claim 1, wherein
    the imaging device further comprises another image sensor arranged to be able to image a side opposite to the image sensor in an optical axis direction of the image sensor used for shooting, and
    the shooting prediction unit is configured to detect a state of a photographer from a captured image acquired by using the other image sensor and to predict the shooting state based on the state of the photographer.
  13. The imaging device according to claim 1, wherein the shooting prediction unit is configured to predict that a state of the imaging device is the shooting state when a target subject is detected from a captured image acquired by using the image sensor.
  14. A control method for a digital imaging device configured to be able to realize a zero shutter lag (ZSL) operation of generating an image at a time point at which a shooting instruction by a user is performed by using captured images obtained before a time point at which the shooting instruction by the user is detected, the method comprising:
    predicting the shooting instruction by the user; and
    changing control of an image sensor in the ZSL operation in case of a shooting state where it is predicted that the shooting instruction by the user is able to be performed at a time point before the shooting instruction is detected.
  15. A computer program product storing a program to be executed by a computer of a digital imaging device configured to be able to realize a zero shutter lag (ZSL) operation of generating an image at a time point at which a shooting instruction by a user is performed by using captured images obtained before a time point at which the shooting instruction by the user is detected, the program causing the computer to execute:
    predicting the shooting instruction by the user; and
    changing control of an image sensor in the ZSL operation in case of a shooting state where it is predicted that the shooting instruction by the user is able to be performed at a time point before the shooting instruction is detected.