WO2019181579A1 - Camera, imaging method, and program - Google Patents

Camera, imaging method, and program

Info

Publication number
WO2019181579A1
WO2019181579A1 (PCT/JP2019/009477; JP2019009477W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
imaging
exposure
imaging range
captured image
Prior art date
Application number
PCT/JP2019/009477
Other languages
French (fr)
Japanese (ja)
Inventor
一樹 石田 (Kazuki Ishida)
小林 潤 (Jun Kobayashi)
祐樹 杉原 (Yuki Sugihara)
真彦 宮田 (Masahiko Miyata)
Original Assignee
FUJIFILM Corporation (富士フイルム株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by FUJIFILM Corporation (富士フイルム株式会社)
Priority to JP2020508203A priority Critical patent/JP7028960B2/en
Publication of WO2019181579A1 publication Critical patent/WO2019181579A1/en


Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B5/00Adjustment of optical system relative to image or object surface other than for focusing
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B7/00Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
    • G03B7/08Control effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
    • G03B7/091Digital circuits
    • G03B7/093Digital circuits for control of exposure time
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Definitions

  • the present invention relates to a camera, an imaging method, and a program, and more particularly to a technique for performing exposure while moving an imaging range.
  • “long exposure imaging” is a technique in which the exposure time is set longer than in normal exposure.
  • multiple exposure imaging is a technique in which a single image is obtained by performing multiple exposures.
  • Patent Document 1 describes a technique for suppressing large movement of the imaging device during multiple exposure by displaying a notification image on a display device and notifying the user that multiple exposure is in progress.
  • however, since the user cannot check the charge information accumulated in the image sensor, even if the imaging apparatus has moved in an unintended direction immediately after exposure starts and the capture has already failed, the exposure continues to completion.
  • likewise, because the user cannot check the accumulated charge information, when the imaging apparatus moves in an unintended direction, the charge information resulting from that unintended movement is not available to the user.
  • the present invention has been made in view of such circumstances, and an object of the present invention is to provide a camera, an imaging method, and a program capable of notifying the user of the imaging state without reading out the charge information stored in the image sensor during exposure.
  • one aspect of the present invention is a camera capable of acquiring a captured image by performing exposure while moving the imaging range, the camera comprising:
  • a first captured image acquisition unit that acquires a first captured image captured at a wider angle than the planned imaging region of the captured image;
  • a second captured image acquisition unit that acquires a second captured image in which a part of the planned imaging region immediately before the start of exposure is captured;
  • a sensor that detects posture information of the camera;
  • an imaging range detection unit that detects the imaging range per unit time, with reference to the imaging range of the second captured image, based on the detection information of the sensor from the state in which exposure is started;
  • a first reproduction image generation unit that generates a first reproduction image corresponding to the imaging range for each unit time from the first captured image, or from the first captured image and the second captured image;
  • a second reproduction image generation unit that generates a second reproduction image during exposure by integrating the first reproduction images; and
  • a display control unit that displays the second reproduction image on the display unit.
  • the imaging range detection unit detects the imaging range per unit time, with reference to the imaging range of the second captured image, based on the detection information of the sensor from the state in which exposure is started.
  • the first reproduction image generation unit generates a first reproduction image corresponding to the imaging range for each unit time from the first captured image, or from the first captured image and the second captured image, and the second reproduction image generation unit generates a second reproduction image during exposure by integrating the first reproduction images. Thereby, the user can be notified without reading out the charge information accumulated in the image sensor during exposure.
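The mechanism above can be sketched in code: during exposure, nothing is read from the sensor; instead, the pre-captured wide-angle first image is cropped at each detected per-unit-time imaging range, and the crops are integrated into a simulated in-progress exposure. The array sizes, coordinates, and simple averaging below are illustrative assumptions, not details from the publication.

```python
import numpy as np

def first_reproduced_image(wide_image, top, left, h, w):
    # Crop the imaging range for one unit time out of the wide first captured image.
    return wide_image[top:top + h, left:left + w]

def second_reproduced_image(wide_image, ranges, h, w):
    # Integrate the per-unit-time crops into a simulated in-progress exposure,
    # averaged so the preview brightness stays in the original value range.
    acc = np.zeros((h, w), dtype=np.float64)
    for top, left in ranges:
        acc += first_reproduced_image(wide_image, top, left, h, w)
    return acc / len(ranges)

# Hypothetical example: a 100x100 wide image and a 20x20 imaging range
# drifting downward by 2 pixels per unit time.
wide = np.arange(100 * 100, dtype=np.float64).reshape(100, 100)
ranges = [(10, 40), (12, 40), (14, 40)]
preview = second_reproduced_image(wide, ranges, 20, 20)
print(preview.shape)  # (20, 20)
```

Each new crop only extends the accumulation, so the preview can be refreshed at the display frame rate without ever reading charge out of the image sensor.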
  • the camera includes a start position detection unit that detects the exposure start position in the first captured image by comparing the first captured image with the second captured image, and the imaging range detection unit detects the imaging range for each unit time based on that start position.
  • the start position detection unit detects the exposure start position in the first captured image by comparing the first captured image and the second captured image. As a result, an accurate exposure start position can be detected, and the subsequent imaging range for each unit time can be accurately specified.
  • the sensor detects posture information of the camera when the first captured image and the second captured image are acquired, and the imaging range detection unit detects the exposure start position and the imaging range for each unit time based on the detection information obtained when the first and second captured images were acquired.
  • since the exposure start position and the imaging range for each unit time are detected from the posture information recorded at the time the first and second captured images were acquired, an accurate exposure start position can be detected, and the subsequent imaging range for each unit time can be accurately specified.
  • the sensor further detects at least one of camera orientation information and height information from the reference plane.
  • At least one of the orientation information of the camera and the height information from the reference plane is further detected by the sensor.
  • the imaging range can be accurately detected.
  • the imaging range detection unit detects the moving direction of the imaging range per unit time based on the detection information of the sensor.
  • the moving direction of the imaging range per unit time is detected by the imaging range detection unit based on the detection information of the sensor. Therefore, the imaging range can be accurately detected.
  • the camera includes an exposure time control unit that acquires an exposure time set for exposure and controls the exposure.
  • since the exposure time set for the exposure is acquired and the exposure is controlled by the exposure time control unit, exposure can be performed for an accurate period.
  • the display control unit displays the second reproduction image on the display unit.
  • the first reproduction image generation unit generates a first reproduction image including at least one of a notification image and notification characters for notifying the user when the imaging range per unit time detected by the imaging range detection unit is in a range outside the first captured image.
  • similarly, the second reproduction image generation unit generates a second reproduction image including at least one of the notification image and the notification characters when the detected imaging range for each unit time is outside the first captured image. Thereby, the user can be notified that the imaging range is outside the first captured image, that is, that the movement of the imaging range has deviated significantly.
  • the display control unit switches between the first reproduction image and the second reproduction image and causes the display unit to display them.
  • since the display control unit switches the display between the first reproduction image and the second reproduction image, the user can confirm both the image currently being captured and the image integrated up to the present.
  • the sensor is a camera shake sensor.
  • the camera includes a zoom control unit that fixes the zoom function according to the first captured image and the imaging range per unit time.
  • since the zoom function is fixed by the zoom control unit according to the first captured image and the imaging range for each unit time, capturing an area other than the scheduled imaging area can be suppressed.
  • another aspect of the present invention is an imaging method capable of acquiring a captured image by performing exposure while moving the imaging range, the method comprising: a first captured image acquisition step of acquiring a first captured image captured at a wider angle than the planned imaging region of the captured image; a first reproduction image generation step of generating a first reproduction image corresponding to the imaging range from the first captured image, or from the first captured image and the second captured image; a second reproduction image generation step of generating a second reproduction image during exposure by integrating the first reproduction images; and a display control step of displaying the second reproduction image on the display unit.
  • Another aspect of the present invention is a program for causing a computer to execute an imaging process capable of acquiring a captured image by performing exposure while moving an imaging range.
  • the program causes the computer to execute: a first captured image acquisition step of acquiring a first captured image captured at a wider angle than the planned imaging region of the captured image; a second captured image acquisition step of acquiring a second captured image in which a part of the planned imaging region immediately before the start of exposure is captured; a detection step of detecting posture information of the camera with a sensor; an imaging range detection step of detecting the imaging range per unit time, with reference to the imaging range of the second captured image, based on the detection information of the sensor from the state in which exposure is started; and a first reproduction image generation step of generating a first reproduction image corresponding to the imaging range for each unit time from the first captured image, or from the first captured image and the second captured image.
  • the imaging range detection unit detects the imaging range for each unit time, with reference to the imaging range of the second captured image, based on the detection information of the sensor from the state in which exposure is started.
  • the first reproduction image generation unit generates a first reproduction image corresponding to the imaging range for each unit time from the first captured image, or from the first captured image and the second captured image, and the second reproduction image generation unit generates a second reproduction image during exposure by integrating the first reproduction images, so that the user can be notified without reading out the charge information accumulated in the image sensor during the exposure.
  • FIG. 1 is a diagram illustrating imaging in which long exposure is performed while moving the imaging range.
  • FIG. 2 is a perspective view of the camera.
  • FIG. 3 is a rear view of the camera.
  • FIG. 4 is a block diagram showing an embodiment of the internal configuration of the camera.
  • FIG. 5 is a block diagram illustrating a functional configuration example of the image processing unit.
  • FIG. 6 is a diagram illustrating an example of capturing the first captured image.
  • FIG. 7 is a diagram illustrating a first captured image.
  • FIG. 8 is a diagram illustrating a second captured image.
  • FIG. 9 is a diagram illustrating an example of the first reproduced image.
  • FIG. 10 is a diagram illustrating an example of the second reproduced image.
  • FIG. 11 is a diagram illustrating an example of a second reproduced image.
  • FIG. 12 is a flowchart showing the flow of imaging by the camera.
  • FIG. 13 is a diagram illustrating a first reproduced image having a zebra pattern.
  • FIG. 14 is a diagram illustrating an example of the operation mode of the camera when moving the imaging range.
  • FIG. 15 is a diagram for explaining switching of display on the display unit.
  • FIG. 1 is a diagram illustrating imaging in which long exposure is performed while moving the imaging range.
  • FIG. 1 shows a case where a portion of the trunk 126 of the tree 125 (showing the bark pattern 145 of the trunk 126) is imaged as the imaging scheduled region H.
  • the user (photographer) images the imaging scheduled area H while moving the imaging range of the camera 10 (FIG. 2). Specifically, the camera 10 is tilted in the tilt direction (arrow L in the figure), the imaging range is moved, and the entire imaging planned area H is imaged.
  • the exposure time of the camera 10 is set so that the movement of the imaging range is completed within a single exposure; a relatively long exposure time is set.
  • the exposure time must be long enough for the movement across the planned imaging area H to be completed, and is at least 1 second, more preferably 5 seconds or longer.
  • the exposure time may be 10 seconds or 30 seconds.
  • the present invention is suitably used in an imaging mode in which the exposure time is set to a relatively long time and the imaging range is moved during the exposure time.
  • [Camera] FIGS. 2 and 3 are a perspective view and a rear view, respectively, of the camera (digital camera) 10.
  • the camera 10 is a digital camera that receives light passing through a lens with an imaging device, converts the light into a digital signal, and records it on a recording medium as still image or moving image data.
  • the camera 10 is provided with a photographing lens 12, a strobe 1 and the like on the front, and a shutter button 2, a power / mode switch 3, a mode dial 4 and the like on the top.
  • a liquid crystal monitor (LCD: Liquid Crystal Display) 30, a zoom button 5, a cross button 6, a MENU / OK button 7, a playback button 8, a BACK button 9 and the like are disposed on the back of the camera.
  • the photographing lens 12 is a retractable zoom lens, which extends out from the camera body when the operation mode of the camera 10 is set to the shooting mode by the power/mode switch 3.
  • the strobe 1 irradiates a main subject with strobe light.
  • the shutter button 2 is a two-stage stroke switch with so-called “half-press” and “full-press” positions, and functions both as an imaging preparation instruction unit and as an image recording instruction unit.
  • when the still image shooting mode is selected as the shooting mode and the shutter button 2 is “half-pressed”, the camera 10 performs an imaging preparation operation for AF (autofocus) / AE (auto exposure) control; when the button is “fully pressed”, a still image is captured and recorded.
  • when the moving image shooting mode is selected as the shooting mode and the shutter button 2 is “fully pressed”, the camera 10 starts recording a moving image; when the shutter button 2 is “fully pressed” again, recording stops and the camera enters standby mode.
  • the power/mode switch 3 functions both as a power switch for turning the camera 10 on and off and as a mode switch for setting the mode of the camera 10, and is slidable among an “OFF position”, a “playback position”, and an “imaging position”.
  • the camera 10 is turned on by sliding the power/mode switch 3 to the “playback position” or “imaging position”, and turned off by sliding it to the “OFF position”; sliding the switch to the “playback position” sets the “playback mode”, and sliding it to the “imaging position” sets the “shooting mode”.
  • the mode dial 4 functions as a mode switching unit for setting the shooting mode of the camera 10, and the shooting mode is set to one of various modes depending on the position of the mode dial 4. For example, there are a “still image shooting mode” for still image shooting, a “moving image shooting mode” for moving image shooting, and a “long exposure-moving imaging mode” described with reference to FIG. 1.
  • the liquid crystal monitor 30 functions as a display unit: it displays a live view image in the shooting mode, displays still images and moving images in the playback mode, and also displays a menu screen as part of the graphical user interface.
  • the liquid crystal monitor 30 displays a first reproduction image and a second reproduction image, which will be described later.
  • the zoom button 5 functions as zoom instruction means for instructing zooming, and includes a tele button 5T for instructing zooming to the telephoto side and a wide button 5W for instructing zooming to the wide angle side.
  • operating the tele button 5T and the wide button 5W in the shooting mode changes the focal length of the shooting lens 12; operating them in the playback mode enlarges or reduces the image being played back.
  • the cross button 6 is an operation unit for inputting instructions in four directions (up, down, left, and right), and functions as a button for selecting an item from a menu screen or instructing selection of various setting items.
  • the left / right key functions as a frame advance (forward / reverse feed) button in the playback mode.
  • the MENU/OK button 7 is an operation button having both a function as a menu button for instructing display of a menu on the screen of the liquid crystal monitor 30 and a function as an OK button for instructing confirmation and execution of a selection.
  • the playback button 8 is a button for switching to a playback mode in which a captured still image or moving image is displayed on the liquid crystal monitor 30.
  • the BACK button 9 functions as a button for instructing cancellation of an input operation or a return to the previous operation state.
  • instead of providing dedicated members for these buttons and switches, a touch panel may be provided, and the functions of the buttons and switches may be performed by operating the touch panel.
  • FIG. 4 is a block diagram showing an embodiment of the internal configuration of the camera 10.
  • the camera 10 records captured images on a memory card 54, and the operation of the entire apparatus is centrally controlled by a central processing unit (CPU: Central Processing Unit) 40.
  • the camera 10 is provided with an operation unit 38 including the shutter button 2, the power/mode switch 3, the mode dial 4, the tele button 5T, the wide button 5W, the cross button 6, the MENU/OK button 7, the playback button 8, and the BACK button 9.
  • a signal from the operation unit 38 is input to the CPU 40, and the CPU 40 controls each circuit of the camera 10 based on the input signal.
  • based on the input signal, the CPU 40 performs drive control of the image sensor 16 by the sensor driving unit 32, lens drive control by the lens driving unit 36, aperture drive control by the aperture drive unit 34, imaging operation control, image processing control, image data recording/reproduction control, display control of the liquid crystal monitor 30 by the display control unit 61, and the like.
  • the luminous flux that has passed through the photographing lens 12, the diaphragm 14, the mechanical shutter 15, and the like forms an image on the image sensor 16, which is a CMOS (Complementary Metal-Oxide-Semiconductor) color image sensor.
  • the image sensor 16 is not limited to the CMOS type, but may be an XY address type or a CCD (Charge-Coupled Device) type color image sensor.
  • the image sensor 16 has a large number of light receiving elements (photodiodes) arranged two-dimensionally; the subject image formed on the light receiving surface is converted by each photodiode into a signal voltage (or charge) corresponding to the amount of incident light, which is then converted into a digital signal via an A/D (analog/digital) converter in the image sensor 16 and output.
  • the CPU 40 includes an exposure time control unit 65 and a zoom control unit 67.
  • the exposure time control unit 65 acquires the set exposure time and controls the exposure from start to end. Specifically, exposure starts when the user presses the shutter button 2, and the exposure time control unit 65 controls the sensor drive unit 32 so as to end the exposure once the acquired exposure time has elapsed.
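As a minimal sketch of this control flow (the class and method names are hypothetical, not from the publication), the controller records the exposure start time and reports when the acquired exposure time has elapsed:

```python
class ExposureTimeController:
    # Sketch of the exposure time control unit 65: exposure starts on the
    # shutter press and ends once the set exposure time has elapsed.
    def __init__(self, exposure_time_s):
        self.exposure_time_s = exposure_time_s
        self.start = None

    def start_exposure(self, now):
        # Called when the user presses the shutter button.
        self.start = now

    def should_end(self, now):
        # True once the acquired exposure time has elapsed since the start.
        return self.start is not None and (now - self.start) >= self.exposure_time_s

ctrl = ExposureTimeController(exposure_time_s=10.0)
ctrl.start_exposure(now=0.0)
print(ctrl.should_end(now=5.0))   # False: exposure still in progress
print(ctrl.should_end(now=10.0))  # True: end the exposure
```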
  • the zoom control unit 67 controls the lens driving unit 36 to limit the zoom function.
  • the zoom function is fixed in accordance with a first captured image to be described later and an imaging range per unit time. That is, the zoom function is fixed so that the imaging range does not deviate from the scheduled imaging area H.
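A trivial sketch of this zoom limitation (the focal-length bounds are illustrative assumptions): the requested focal length is clamped to a range chosen so that the imaging range cannot leave the scheduled imaging area H.

```python
def clamp_focal_length(requested_mm, min_mm, max_mm):
    # Refuse zoom settings outside the range fixed by the zoom control unit 67.
    return max(min_mm, min(max_mm, requested_mm))

# With hypothetical bounds of 35-70 mm:
print(clamp_focal_length(18.0, min_mm=35.0, max_mm=70.0))  # 35.0: zooming out is limited
print(clamp_focal_length(50.0, min_mm=35.0, max_mm=70.0))  # 50.0: within the fixed range
```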
  • an image signal (image data) read out from the image sensor 16 at the time of shooting a moving image or a still image is temporarily stored in a memory (SDRAM: Synchronous Dynamic Random Access Memory) 48 via the image input controller 22, or is taken into the AF processing unit 42, the AE detection unit 44, and the like.
  • the CPU 40 performs overall control of each unit of the camera 10 based on the operation of the operation unit 38.
  • the AF processing unit 42 always performs the AF operation, and the AE detection unit 44 performs the AE operation.
  • when the shooting mode is the still image shooting mode, the CPU 40 performs the above-described AF control again, and if the shutter button 2 is fully pressed, the brightness of the subject (shooting Ev value) is calculated; based on the calculated shooting Ev value, the F-number of the aperture 14 and the exposure time (shutter speed) of the mechanical shutter 15 are determined according to a program diagram, and still image shooting (exposure control) is performed.
  • when the shooting mode is the moving image shooting mode, the CPU 40 starts shooting and recording of the moving image.
  • the mechanical shutter 15 is opened, image data is continuously read from the image sensor 16 (for example, at a frame rate of 30 or 60 frames/second), and phase difference AF is continuously performed.
  • the brightness of the subject is calculated, and the shutter drive unit 33 controls the shutter speed (charge accumulation time by the rolling shutter) and / or the aperture 14 by the aperture drive unit 34.
  • the CPU 40 moves the zoom lens forward and backward in the optical axis direction via the lens driving unit 36 in accordance with the zoom command from the zoom button 5 to change the focal length.
  • a ROM (Read-Only Memory) 47 stores a camera control program, defect information of the image sensor 16, and various parameters and tables used for image processing; it may be replaced by an EEPROM (Electrically Erasable Programmable Read-Only Memory).
  • Sensor 63 detects posture information of camera 10.
  • the posture information of the camera 10 may be detected by a camera shake sensor.
  • the sensor 63 may also detect direction information of the camera 10 or height information from a reference plane; specific examples include a GPS (Global Positioning System) receiver, a gyro sensor, and a pressure sensor.
  • the image processing unit 24 reads out unprocessed image data (RAW data) acquired through the image input controller 22 when a moving image or a still image is captured and temporarily stored in the memory 48.
  • the image processing unit 24 performs, on the read RAW data, offset processing, pixel interpolation processing (interpolation of phase difference detection pixels, defective pixels, and the like), white balance correction, gain control processing including sensitivity correction, gamma correction processing, synchronization processing (also referred to as “demosaic processing”), luminance and color difference signal generation processing, contour enhancement processing, color correction, and the like.
  • the image data processed for display is written to a VRAM (Video Random Access Memory) 50.
  • the VRAM 50 includes an A area and a B area for recording image data each representing an image for one frame.
  • image data representing an image for one frame is rewritten alternately in the A area and the B area.
  • the written image data is read from an area other than the area where the image data is rewritten.
  • the image data read from the VRAM 50 is encoded by the video encoder 28 and output to the liquid crystal monitor 30 provided on the back of the camera. As a result, a live view image showing the subject image is displayed on the liquid crystal monitor 30.
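The A/B double-buffering scheme above can be sketched as follows (the class is an illustrative model, not the actual VRAM hardware): each frame is written to the area not currently being read, so the display side always reads a complete frame.

```python
class DoubleBufferVRAM:
    def __init__(self):
        self.areas = {"A": None, "B": None}
        self.write_target = "A"  # the area the next frame will be written to

    def write_frame(self, frame):
        # Write one frame, then flip so the next write goes to the other area.
        self.areas[self.write_target] = frame
        self.write_target = "B" if self.write_target == "A" else "A"

    def read_frame(self):
        # Read from the area other than the one that will be written next,
        # i.e. the most recently completed frame.
        read_area = "B" if self.write_target == "A" else "A"
        return self.areas[read_area]

vram = DoubleBufferVRAM()
vram.write_frame("frame-1")
print(vram.read_frame())  # frame-1
vram.write_frame("frame-2")
print(vram.read_frame())  # frame-2
```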
  • the image data processed by the image processing unit 24 into a still image or moving image for recording (luminance data (Y) and color difference data (Cb), (Cr)) is stored in the memory 48 again.
  • when recording a still image or a moving image, the compression/decompression processing unit 26 compresses the luminance data (Y) and color difference data (Cb), (Cr) processed by the image processing unit 24 and stored in the memory 48. A still image is compressed in, for example, JPEG (Joint Photographic Experts Group) format, and a moving image in, for example, H.264 format.
  • the compressed image data compressed by the compression / decompression processing unit 26 is recorded on the memory card 54 via the media controller 52.
  • the compression / decompression processing unit 26 performs decompression processing on the compressed image data obtained from the memory card 54 via the media controller 52 in the playback mode.
  • the media controller 52 performs recording and reading of compressed image data with respect to the memory card 54.
  • FIG. 5 is a block diagram illustrating a functional configuration example of the image processing unit 24.
  • the image processing unit 24 includes a first captured image acquisition unit 101, a second captured image acquisition unit 103, an imaging range detection unit 105, a start position detection unit 107, a first reproduction image generation unit 109, and a second reproduction image generation unit 111. Each process performed by the image processing unit 24 is described below.
  • the first captured image acquisition unit 101 acquires a first captured image (hereinafter referred to as the A image) 121 (FIG. 7), captured at a wider angle than the planned imaging region H of the captured image that is obtained by performing exposure while moving the imaging range.
  • the A image 121 is captured at a preparatory stage before long exposure-moving imaging is performed, at a wider angle than the scheduled imaging region H.
  • FIG. 6 is a diagram illustrating an example of imaging of the A image 121.
  • FIG. 6 illustrates capturing the A image 121 in the case where the long exposure-moving imaging described with reference to FIG. 1 is performed.
  • the A image 121 is imaged by the camera 10 at a wide angle so as to include the imaging scheduled area H.
  • FIG. 7 is a diagram illustrating the A image 121 captured in the imaging mode illustrated in FIG.
  • the A image 121 contains the entire tree 125 within the angle of view, and naturally, the trunk 126 that is the imaging-scheduled area H is also included within the angle of view.
  • the A image 121 shows a scheduled imaging area H, and imaging ranges (127A, 127B, 127C) for each unit time. The imaging range will be described in detail later.
  • the second captured image acquisition unit 103 acquires a second captured image (hereinafter referred to as a B image) 131 (FIG. 8) in which a part of the planned imaging region H immediately before the start of exposure is captured.
  • the B image 131 is a captured image captured immediately before the start of the main exposure of long exposure-moving imaging.
  • FIG. 8 is a diagram showing the B image 131.
  • the B image 131 is captured immediately before the start of the long exposure; in it, a part of the trunk 126 is enlarged, with a narrower, zoomed-in angle of view than the A image 121.
  • for example, a still frame cut out from the live view image (confirmation image) immediately before the main exposure is used as the B image 131.
  • the imaging range detection unit 105 detects the imaging range per unit time, with reference to the imaging range of the B image 131, based on the detection information of the sensor 63 from the state in which exposure is started. For example, the imaging range detection unit 105 uses detection method 1, in which it detects the imaging range of the B image 131 within the A image 121 and, using the detected imaging range as the exposure start position, detects the imaging range for each unit time within the A image 121. Alternatively, for example, it uses detection method 2, in which it detects the imaging range for each unit time from the detection information of the sensor 63 obtained when the B image 131 and the A image 121 are captured.
  • Here, the unit time corresponds to the frame rate (fps: frames per second) at which a first reproduction image or a second reproduction image, described later, is displayed.
  • the imaging range detection unit 105 detects an imaging range for each unit time based on the start position detected by the start position detection unit 107.
  • The start position detection unit 107 detects the exposure start position in the A image 121 by comparing the A image 121 and the B image 131. For example, the start position detection unit 107 detects the position of the B image 131 in the A image 121 by pattern matching between the two images. The imaging range detection unit 105 then acquires the start position detection result from the start position detection unit 107 and, with the acquired start position as the reference, detects the imaging range for each unit time based on the detection information of the sensor 63.
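The pattern matching described above can be pictured as an exhaustive template search of the B image within the A image. The sketch below is illustrative only and is not the patent's implementation: it uses a simple sum-of-squared-differences criterion, and it assumes the B image has already been rescaled to the A image's pixel scale (the function and variable names are hypothetical).

```python
import numpy as np

def find_start_position(a_image: np.ndarray, b_image: np.ndarray) -> tuple:
    """Locate b_image inside a_image by exhaustive SSD template matching.

    Returns the (row, col) of the top-left corner of the best match.
    Assumes b_image has been rescaled to a_image's pixel scale.
    """
    ah, aw = a_image.shape
    bh, bw = b_image.shape
    best_pos, best_ssd = (0, 0), np.inf
    for r in range(ah - bh + 1):
        for c in range(aw - bw + 1):
            patch = a_image[r:r + bh, c:c + bw]
            ssd = np.sum((patch.astype(np.float64) - b_image) ** 2)
            if ssd < best_ssd:
                best_ssd, best_pos = ssd, (r, c)
    return best_pos
```

In practice a normalized correlation measure and a multi-scale search would be more robust to the zoom difference between the two images; the exhaustive SSD form is kept here only to make the matching idea concrete.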
  • the imaging range detection unit 105 detects the start position based on the posture information of the camera 10 when the A image 121 and the B image 131 are captured, and detects the imaging range for each unit time.
  • the sensor 63 detects the posture information of the camera 10 when the A image 121 and the B image 131 are acquired. For example, when the sensor 63 is a camera shake sensor, the imaging range detection unit 105 detects the imaging range by comparing detection information of the camera shake sensor when the A image 121 and the B image 131 are captured. The imaging range detection unit 105 detects an exposure start position, and detects an imaging range for each subsequent unit time based on the detected exposure start position.
  • the imaging range detection unit 105 may detect the moving direction of the imaging range for each unit time of exposure based on the detection information of the sensor 63.
  • the imaging range detection unit 105 can accurately detect the imaging range by acquiring the moving direction of the imaging range.
  • FIG. 7 also shows the imaging ranges detected by the imaging range detection unit 105.
  • The imaging range 127A, the imaging range 127B, and the imaging range 127C are imaging ranges that move sequentially every unit time. With the exposure start position set as the imaging range 127A, tilting the camera 10 moves the imaging range sequentially to 127B and then 127C (see FIG. 1). Note that FIG. 7 shows only three imaging ranges for simplicity; in practice there are many imaging ranges within the scheduled imaging region H.
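As an illustration of how imaging ranges such as 127A to 127C could be derived from the posture sensor, the following sketch converts a tilt angle into a pixel offset within the A image using the pinhole-camera relation offset = f·tan(θ). This is a simplifying assumption for illustration, not the patent's method; the focal length in pixels and all names are hypothetical.

```python
import math

def range_offset_px(tilt_deg: float, focal_px: float) -> int:
    """Pixel offset of the imaging range caused by tilting the camera.

    Pinhole-camera approximation: offset = f * tan(theta).
    """
    return round(focal_px * math.tan(math.radians(tilt_deg)))

def ranges_per_unit_time(start_row: int, tilt_rate_deg_s: float,
                         fps: float, focal_px: float, n: int) -> list:
    """Rows of the imaging range for n successive unit times (1/fps each),
    assuming a constant tilt rate from the exposure start position."""
    dt = 1.0 / fps
    return [start_row + range_offset_px(tilt_rate_deg_s * dt * k, focal_px)
            for k in range(n)]
```

For example, at 10°/s, 50 fps, and a 1000 px focal length, the range drifts by a few pixels per unit time, which is how the successive positions 127A, 127B, 127C arise.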
  • the first reproduction image generation unit 109 generates a first reproduction image (FIG. 9) corresponding to the imaging range for each unit time from the A image 121 or the A image 121 and the B image 131.
  • The first reproduction image generation unit 109 generates each first reproduction image using the image information of the imaging range for that unit time in the first captured image.
  • the first reproduced image generation unit 109 may generate a first reproduced image based on the B image 131 captured immediately before the exposure is started. In particular, when the change in the posture of the camera 10 is small immediately after the start of exposure, the B image 131 and the area corresponding to the imaging range immediately after the start in the A image 121 have many overlapping portions. Therefore, the first reproduction image generation unit 109 may generate a first reproduction image from the B image 131 and the A image 121.
  • FIG. 9 is a diagram showing an example of the first reproduced image.
  • FIG. 9 shows the first reproduction image 133A, the first reproduction image 133B, and the first reproduction image 133C corresponding respectively to the imaging range 127A, the imaging range 127B, and the imaging range 127C described with FIG. 7.
  • The first reproduction image 133A corresponds to the imaging range 127A, which is set as the exposure start position.
  • The first reproduction image 133B corresponds to the imaging range 127B after the next unit time has elapsed.
  • The first reproduction image 133C corresponds to the imaging range 127C after a further unit time has elapsed. That is, each first reproduced image 133 represents the charge information accumulated in the image sensor 16 during one unit time.
  • In the following, the first reproduced image in general is denoted by reference numeral 133, and the first reproduced images at the individual unit times are denoted by reference numerals 133A, 133B, and 133C.
  • The second reproduction image generation unit 111 generates a second reproduction image by accumulating the first reproduction images 133. Specifically, the second reproduction image generation unit 111 generates the second reproduction image by superimposing the first reproduction images 133 obtained for each unit time from the start of exposure up to the present.
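The integration performed by the second reproduction image generation unit 111 can be pictured as summing the per-unit-time images into one accumulator, mimicking how charge accumulates on the image sensor. The sketch below is a schematic illustration under that reading; the array shapes, normalization, and names are assumptions, not the patent's implementation.

```python
import numpy as np

def second_reproduction_image(first_images: list) -> np.ndarray:
    """Integrate per-unit-time first reproduction images, as charge would
    accumulate on the sensor, then normalize back to the display range."""
    acc = np.zeros_like(first_images[0], dtype=np.float64)
    for frame in first_images:
        acc += frame                     # superimpose each unit-time image
    return acc / len(first_images)       # normalize for display
```

Where the frames differ because the imaging range moved, the accumulation produces intermediate values, which is exactly the blur visible along the trunk outline in FIG. 10.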
  • FIG. 10 is a diagram showing an example of the second reproduced image. It should be noted that a portion where blur occurs in the second reproduction image 135 is indicated by a dotted line.
  • The second reproduction image 135 is obtained by superimposing the first reproduction image 133A, the first reproduction image 133B, and the first reproduction image 133C shown in FIG. 9. Because the superimposed first reproduction images (133A, 133B, 133C) each capture a slightly different image in each unit time, the second reproduction image 135 is blurred. Specifically, in the second reproduced image 135, blur appears along the outline of the trunk 126, and blur also appears in the outer skin pattern 145 of the trunk 126.
  • Because the second reproduction image 135 obtained in this way is generated by superimposing the first reproduction images 133 for each unit time, it represents the charge information accumulated in the image sensor 16 from the start of exposure up to the present.
  • FIG. 11 is a diagram showing another example of the second reproduction image 135.
  • In FIG. 11, the camera 10 has moved in an unintended direction, which is a failure example in the sense that the image intended by the user cannot be obtained.
  • The second reproduction image 135 is an image of the trunk 126 of the tree 125, as in FIG. 10.
  • The case shown in FIG. 11 is one in which the posture of the camera 10 shook while the imaging range was being moved.
  • The imaging ranges and the first reproduced images 133 therefore also reflect the shake of the posture of the camera 10. Accordingly, the influence of the camera shake also appears in the second reproduction image 135 generated by integrating the first reproduced images 133.
  • Since the shake of the posture of the camera 10 during exposure is expressed in the second reproduction image 135, the user can, by checking the second reproduction image 135, cancel the imaging before the exposure is completed and retake the image.
  • a new image can be created by using the shake of the posture of the camera 10 during exposure.
  • FIG. 12 is a flowchart showing a flow of imaging by the camera 10.
  • The A image 121 is captured by the camera 10 at a wide angle that includes the region to be imaged with the long exposure (the scheduled imaging region H) (step S10), and the first captured image acquisition unit 101 acquires the A image 121 (first captured image acquisition step). Next, the camera 10 captures live view images and stores them until the long exposure starts (step S11).
  • the camera 10 starts main exposure for moving imaging (step S12). Then, the second captured image acquisition unit 103 acquires the B image 131 captured immediately before the start of the main exposure from the stored live view image (step S13: second captured image acquisition step).
  • the start position detection unit 107 detects the position (exposure start position) of the B image 131 in the A image 121 (step S14: detection step). Then, the start position detection unit 107 determines whether or not the position of the B image 131 can be detected in the A image 121 (step S15). If the start position detection unit 107 cannot detect the position of the B image 131 in the A image 121, the display control unit 61 displays an error message on the liquid crystal monitor 30 (step S23).
  • Next, the start position detection unit 107 determines whether the exposure time acquired by the exposure time control unit 65 is equal to or less than a threshold (step S16). When the acquired exposure time is determined to be equal to or less than the threshold, the display control unit 61 ends the display without causing the liquid crystal monitor 30 to display the second reproduction image 135. On the other hand, when the acquired exposure time is determined to be longer than the threshold, the display control unit 61 performs the following processing to display the second reproduced image 135 on the liquid crystal monitor 30.
  • the imaging range detection unit 105 acquires the current posture information (turning or the like) of the camera 10 and relative position coordinates ( ⁇ x, ⁇ y, ⁇ z) (step S17).
  • the relative position coordinates are coordinates of the center of the imaging range and are relative coordinates in the A image 121.
  • The relative position coordinates can be calculated from the imaging range: the coordinates (Δx, Δy, Δz) can be obtained from the detection information of the sensor 63, with the center of the imaging range at the exposure start position (the center of the B image) as the origin.
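The relative position coordinates of step S17 can be pictured as accumulating per-unit-time displacements reported by the sensor 63, with the exposure start position as the origin. A minimal sketch under that assumption (the sensor interface and names are hypothetical):

```python
def relative_position(deltas):
    """Accumulate per-unit-time displacement vectors (dx, dy, dz) from the
    sensor, with the imaging-range centre at exposure start as the origin.

    Returns the running relative coordinates after each unit time.
    """
    x = y = z = 0.0
    track = []
    for dx, dy, dz in deltas:
        x, y, z = x + dx, y + dy, z + dz
        track.append((x, y, z))
    return track
```

Each entry of the returned track identifies one imaging range within the A image, so the same loop naturally drives the per-unit-time range detection of the flow in FIG. 12.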
  • Next, the first reproduction image generation unit 109 generates the first reproduction image 133 corresponding to the detected imaging range (step S18: first reproduction image generation step).
  • the imaging range detection unit 105 determines whether or not the imaging range is outside the range of the A image 121 (step S19).
  • When the imaging range is outside the range of the A image 121, the first reproduction image generation unit 109 generates a first reproduction image 133 in which the out-of-range portion is displayed as a zebra pattern (step S24).
  • FIG. 13 is a diagram showing a first reproduced image 133 having a zebra pattern.
  • Although the tree trunk 126 is shown in the first reproduced image 133, the angle of view has moved past the left edge as viewed in the figure and thus exceeds the range of the A image 121. The zebra pattern 143 therefore notifies the user that this portion is outside the range of the A image 121; the zebra pattern 143 is an example of the notification image.
  • The notification to the user may be performed not only with the notification image but also with notification characters, or with both the notification image and notification characters.
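The zebra-pattern notification of FIG. 13 can be sketched as overwriting the out-of-range pixels with diagonal stripes. The mask logic below is purely illustrative; the stripe width, the use of the frame's own min/max values for the stripe colors, and all names are assumptions.

```python
import numpy as np

def apply_zebra(frame: np.ndarray, out_of_range: np.ndarray,
                stripe: int = 4) -> np.ndarray:
    """Mark out-of-range pixels with a diagonal zebra pattern.

    frame: 2-D image; out_of_range: boolean mask of the same shape.
    """
    h, w = frame.shape
    rows, cols = np.indices((h, w))
    stripes = ((rows + cols) // stripe) % 2 == 0   # diagonal bands
    marked = frame.copy()
    marked[out_of_range & stripes] = frame.max()   # bright stripe
    marked[out_of_range & ~stripes] = frame.min()  # dark stripe
    return marked
```

The same boolean-mask idea would extend to rendering notification characters into the frame instead of, or in addition to, the stripes.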
  • Next, the second reproduction image generation unit 111 generates an image representing the charge integrated in the image sensor 16 up to the present by integrating the first reproduction images 133 (second reproduction image generation step), and the display control unit 61 displays the generated second reproduction image 135 on the liquid crystal monitor 30 (step S20: display control step).
  • the exposure time control unit 65 determines whether or not the current time is within the acquired exposure time (step S21).
  • While the current time is within the exposure time, the imaging range detection unit 105 again acquires the current posture information and relative position coordinates of the camera 10 and detects the current imaging range.
  • The flow indicated by the alternate long and short dash line F is executed, for example, at an ordinary frame rate (50 to 60 fps).
  • the display control unit 61 ends the display of the reproduced image on the liquid crystal monitor 30 (step S22).
  • The hardware structure of the processing units that execute the various processes described above is realized by the following various processors: a CPU (Central Processing Unit), which is a general-purpose processor that functions as the various processing units by executing software (programs); programmable logic devices (PLDs) such as FPGAs (Field Programmable Gate Arrays), which are processors whose circuit configuration can be changed after manufacture; and dedicated electric circuits such as ASICs (Application Specific Integrated Circuits), which are processors having a circuit configuration designed exclusively to execute specific processing.
  • One processing unit may be configured by one of these various processors, or by two or more processors of the same or different types (for example, a plurality of FPGAs, or a combination of a CPU and an FPGA). A plurality of processing units may also be configured by a single processor. As a first example of configuring a plurality of processing units with one processor, as typified by computers such as clients and servers, one processor may be configured as a combination of one or more CPUs and software, and this processor may function as the plurality of processing units.
  • As a second example, as typified by a system-on-chip (SoC), a processor may be used that realizes the functions of an entire system, including the plurality of processing units, with a single IC (Integrated Circuit) chip.
  • In this way, the various processing units are configured using one or more of the above various processors as their hardware structure.
  • More specifically, the hardware structure of these various processors is electric circuitry in which circuit elements such as semiconductor elements are combined.
  • As described above, the first reproduction image generation unit 109 generates the first reproduction image 133 corresponding to the imaging range per unit time from the A image 121, or from the A image 121 and the B image 131.
  • The second reproduction image generation unit 111 generates the in-exposure second reproduction image 135 by integrating the first reproduction images 133. Thereby, the information accumulated in the image sensor 16 during exposure can be reproduced and reported to the user.
  • ⁇ Others> [Camera movement type]
  • the camera 10 is moved in the tilt direction to move the imaging range.
  • the operation of the camera 10 that moves the imaging range is not limited to this.
  • FIG. 14 is a diagram illustrating an example of the operation mode of the camera 10 when moving the imaging range.
  • Alternatively, the scheduled imaging region H may be imaged by moving the imaging range through a change in camera height while the posture of the camera 10 is maintained.
  • FIG. 15 is a diagram for explaining switching of the display of the liquid crystal monitor 30.
  • the first reproduced image 133 is displayed on the liquid crystal monitor 30.
  • the second reproduced image 135 is displayed on the liquid crystal monitor 30.
  • the user may switch whether to display the first reproduction image 133 or the second reproduction image 135 on the liquid crystal monitor 30 using the operation unit 38 according to necessary information.
  • the user can switch between the currently captured image and the accumulated image for confirmation.


Abstract

The present invention provides a camera, an imaging method, and a program which make it possible to notify a user without reading out the electric charge information accumulated in an imaging element during exposure. A camera (10) acquires a first captured image captured at a wide angle, and acquires a second captured image obtained by capturing an image of a portion of a scheduled imaging region immediately before the start of exposure. On the basis of detection information, from the start of exposure, of a sensor (63) that detects attitude information of the camera (10), the camera detects an imaging range per unit time using the imaging range of the second captured image as a reference, generates a first reproduced image corresponding to the imaging range per unit time from the first captured image, or from the first captured image and the second captured image, and generates a second reproduced image during exposure obtained by integrating the first reproduced images.

Description

Camera, imaging method, and program
The present invention relates to a camera, an imaging method, and a program, and more particularly to a technique for performing exposure while moving the imaging range.
Imaging methods are conventionally known that give the obtained image a special effect by changing the timing of exposure or by moving the imaging range during exposure. Examples of such imaging methods include long-exposure imaging and multiple-exposure imaging.
Long-exposure imaging is a technique in which imaging is performed with an exposure time longer than that of normal exposure, and multiple-exposure imaging is a technique in which a single image is obtained by performing exposure a plurality of times.
When performing long exposure or multiple exposure, if the imaging device shakes against the intention of the user (photographer), the image intended by the user cannot be obtained. The user therefore needs to suppress shaking of the imaging device, and techniques for assisting the user in suppressing such shake have been considered.
Patent Document 1 describes a technique intended to suppress large movements of the imaging device during multiple exposure by displaying a notification image on a display device to inform the user that multiple exposure is in progress.
JP 2009-232227 A
There is also a method of intentionally moving the imaging device in a predetermined direction during a long exposure to obtain an image with a special effect (moving imaging). When moving imaging is performed, the movement of the imaging device appears in the obtained image, so the imaging device must be moved in the direction intended by the user, and it is desirable to be able to check the charge information accumulated in the image sensor during exposure.
For example, even when the imaging device has already moved in an unintended direction immediately after the start of exposure and the imaging has failed, the user cannot check the charge information accumulated in the image sensor and therefore completes the exposure to the end. Also, because the user cannot check the charge information accumulated in the image sensor, when the imaging device moves in an unintended direction, the charge information obtained by that unintended movement cannot be utilized.
However, it is difficult to read out the charge information accumulated in the image sensor during the exposure period, so the user cannot check the charge information accumulated in the image sensor during exposure.
A method is therefore desired for confirming the charge information accumulated in the image sensor without reading it out during the exposure time.
The present invention has been made in view of such circumstances, and its object is to provide a camera, an imaging method, and a program capable of notifying the user without reading out the charge information accumulated in the image sensor during exposure.
To achieve the above object, one aspect of the present invention is a camera capable of acquiring a captured image by performing exposure while moving an imaging range, the camera comprising: a display unit; a first captured image acquisition unit that acquires a first captured image captured at a wider angle than a scheduled imaging region of the captured image; a second captured image acquisition unit that acquires a second captured image in which a part of the scheduled imaging region is captured immediately before the start of exposure; a sensor that detects posture information of the camera; an imaging range detection unit that detects an imaging range per unit time, using the imaging range of the second captured image as a reference, based on detection information of the sensor from the state in which exposure is started; a first reproduction image generation unit that generates a first reproduction image corresponding to the imaging range per unit time from the first captured image, or from the first captured image and the second captured image; a second reproduction image generation unit that generates a second reproduction image during exposure by integrating the first reproduction images; and a display control unit that causes the display unit to display the second reproduction image.
According to this aspect, the imaging range detection unit detects the imaging range per unit time, using the imaging range of the second captured image as a reference, based on the detection information of the sensor from the state in which exposure is started; the first reproduction image generation unit generates the first reproduction image corresponding to the imaging range per unit time from the first captured image, or from the first captured image and the second captured image; and the second reproduction image generation unit generates the second reproduction image during exposure by integrating the first reproduction images. This makes it possible to notify the user without reading out the charge information accumulated in the image sensor during exposure.
Preferably, the camera includes a start position detection unit that detects an exposure start position in the first captured image by comparing the first captured image and the second captured image, and the imaging range detection unit detects the imaging range per unit time based on the start position.
According to this aspect, the start position detection unit detects the exposure start position in the first captured image by comparing the first captured image and the second captured image. This makes it possible to detect an accurate exposure start position and then accurately identify the imaging range for each subsequent unit time.
Preferably, the sensor detects the posture information of the camera at the times when the first captured image and the second captured image are acquired, and the imaging range detection unit detects the exposure start position based on the detection information at those times and detects the imaging range per unit time.
According to this aspect, the sensor detects the posture information of the camera when the first captured image and the second captured image are acquired, and the imaging range detection unit detects the exposure start position based on that detection information and detects the imaging range per unit time. This makes it possible to detect an accurate exposure start position and then accurately identify the imaging range for each subsequent unit time.
Preferably, the sensor further detects at least one of orientation information of the camera and height information from a reference plane.
According to this aspect, the sensor further detects at least one of the orientation information of the camera and height information from the reference plane. This allows the imaging range to be detected accurately.
Preferably, the imaging range detection unit detects the moving direction of the imaging range per unit time based on the detection information of the sensor.
According to this aspect, the imaging range detection unit detects the moving direction of the imaging range per unit time based on the detection information of the sensor. This allows the imaging range to be detected accurately.
Preferably, the camera includes an exposure time control unit that acquires the exposure time set for the exposure and controls the exposure.
According to this aspect, the exposure time control unit acquires the set exposure time and controls the exposure, so exposure can be performed for an accurate period.
Preferably, when the exposure time is longer than a threshold, the display control unit causes the display unit to display the second reproduction image.
According to this aspect, when the exposure time is longer than the threshold, the display control unit causes the display unit to display the second reproduction image. The charge information accumulated in the image sensor is thereby reported efficiently for long exposures whose exposure time exceeds the threshold.
Preferably, when the imaging range per unit time detected by the imaging range detection unit lies outside the first captured image, the first reproduction image generation unit generates a first reproduction image including at least one of a notification image and notification characters to notify the user.
According to this aspect, when the imaging range per unit time detected by the imaging range detection unit lies outside the first captured image, a reproduction image including at least one of a notification image and notification characters is generated. This makes it possible to notify the user that the imaging range lies outside the first captured image, that is, that the movement of the imaging range has deviated greatly.
Preferably, the display control unit switches between the first reproduction image and the second reproduction image on the display unit.
According to this aspect, the display control unit switches between the first reproduction image and the second reproduction image on the display unit, allowing the user to check both the currently captured image and the image integrated up to the present.
Preferably, the sensor is a camera shake sensor.
Preferably, the camera includes a zoom control unit that fixes the zoom function in accordance with the first captured image and the imaging range per unit time.
According to this aspect, the zoom control unit fixes the zoom function in accordance with the first captured image and the imaging range per unit time, which suppresses imaging of regions other than the scheduled imaging region.
Another aspect of the present invention is an imaging method capable of acquiring a captured image by performing exposure while moving an imaging range, the method including: a first captured image acquisition step of acquiring a first captured image captured at a wider angle than a scheduled imaging region of the captured image; a second captured image acquisition step of acquiring a second captured image in which a part of the scheduled imaging region is captured immediately before the start of exposure; a detection step of detecting posture information of a camera with a sensor; an imaging range detection step of detecting an imaging range per unit time, using the imaging range of the second captured image as a reference, based on detection information of the sensor from the state in which exposure is started; a first reproduction image generation step of generating a first reproduction image corresponding to the imaging range per unit time from the first captured image, or from the first captured image and the second captured image; a second reproduction image generation step of generating a second reproduction image during exposure by integrating the first reproduction images; and a display control step of causing a display unit to display the second reproduction image.
Another aspect of the present invention is a program that causes a computer to execute an imaging process capable of acquiring a captured image by performing exposure while moving an imaging range, the process including: a first captured image acquisition step of acquiring a first captured image captured at a wider angle than a scheduled imaging region of the captured image; a second captured image acquisition step of acquiring a second captured image in which a part of the scheduled imaging region is captured immediately before the start of exposure; a detection step of detecting posture information of a camera with a sensor; an imaging range detection step of detecting an imaging range per unit time, using the imaging range of the second captured image as a reference, based on detection information of the sensor from the state in which exposure is started; a first reproduction image generation step of generating a first reproduction image corresponding to the imaging range per unit time from the first captured image, or from the first captured image and the second captured image; a second reproduction image generation step of generating a second reproduction image during exposure by integrating the first reproduction images; and a display control step of causing a display unit to display the second reproduction image.
 本発明によれば、撮像範囲検出部により、露光を開始する状態からのセンサの検出情報に基づいて、第2の撮像画像の撮像範囲を基準にした単位時間毎の撮像範囲が検出され、第1の再現画像生成部により、単位時間毎の撮像範囲に対応する第1の再現画像が、第1の撮像画像、又は第1の撮像画像及び第2の撮像画像から生成され、第2の再現画像生成部により、第1の再現画像を積算した露光中の第2の再現画像が生成されるので、露光中に撮像素子に蓄積された電荷情報を読み出すことをせずにユーザに報知することができる。 According to the present invention, the imaging range detection unit detects the imaging range for each unit time, with the imaging range of the second captured image as a reference, based on detection information of the sensor from the state at the start of exposure; the first reproduction image generation unit generates a first reproduction image corresponding to the imaging range for each unit time from the first captured image, or from the first captured image and the second captured image; and the second reproduction image generation unit generates a second reproduction image during exposure by integrating the first reproduction images. The state of the exposure can therefore be reported to the user without reading out the charge information accumulated in the image sensor during the exposure.
図1は、撮像範囲を移動させながら長時間露光を行う撮像について説明する図である。FIG. 1 is a diagram illustrating imaging in which long exposure is performed while moving the imaging range. 図2は、カメラの斜視図である。FIG. 2 is a perspective view of the camera. 図3は、カメラの背面図である。FIG. 3 is a rear view of the camera. 図4は、カメラの内部構成の実施形態を示すブロック図である。FIG. 4 is a block diagram showing an embodiment of the internal configuration of the camera. 図5は、画像処理部の機能構成例を示すブロック図である。FIG. 5 is a block diagram illustrating a functional configuration example of the image processing unit. 図6は、第1の撮像画像の撮像の一例を示す図である。FIG. 6 is a diagram illustrating an example of capturing the first captured image. 図7は、第1の撮像画像を示す図である。FIG. 7 is a diagram illustrating a first captured image. 図8は、第2の撮像画像を示す図である。FIG. 8 is a diagram illustrating a second captured image. 図9は、第1の再現画像の例を示す図である。FIG. 9 is a diagram illustrating an example of the first reproduced image. 図10は、第2の再現画像の例を示す図である。FIG. 10 is a diagram illustrating an example of the second reproduced image. 図11は、第2の再現画像の例を示す図である。FIG. 11 is a diagram illustrating an example of a second reproduced image. 図12は、カメラの撮像の流れを示したフロー図である。FIG. 12 is a flowchart showing the flow of imaging by the camera. 図13は、ゼブラパターンを有する第1の再現画像を示す図である。FIG. 13 is a diagram illustrating a first reproduced image having a zebra pattern. 図14は、撮像範囲の移動させる場合のカメラの動作形態の一例を示す図である。FIG. 14 is a diagram illustrating an example of the operation mode of the camera when moving the imaging range. 図15は、表示部の表示の切替を説明するための図である。FIG. 15 is a diagram for explaining switching of display on the display unit.
 以下、添付図面に従って本発明にかかるカメラ、撮像方法、及びプログラムの好ましい実施の形態について説明する。 Hereinafter, preferred embodiments of a camera, an imaging method, and a program according to the present invention will be described with reference to the accompanying drawings.
 [長時間露光-移動撮像]
 図1は、撮像範囲を移動させながら長時間露光を行う撮像について説明する図である。図1には、木125の幹126(幹126の外皮の模様145が示されている)の部分を撮像予定領域Hとして撮像する場合である。ユーザ(撮影者)は、カメラ10(図2)の撮像範囲を移動させながら撮像予定領域Hを撮像する。具体的にはカメラ10がチルト方向(図中の矢印L)に傾けられて、撮像範囲が移動し、撮像予定領域Hの全域が撮像される。
[Long exposure-moving imaging]
FIG. 1 is a diagram illustrating imaging in which long exposure is performed while the imaging range is moved. FIG. 1 shows a case where the portion of the trunk 126 of the tree 125 (the bark pattern 145 of the trunk 126 is shown) is imaged as the planned imaging region H. The user (photographer) images the planned imaging region H while moving the imaging range of the camera 10 (FIG. 2). Specifically, the camera 10 is tilted in the tilt direction (arrow L in the figure) so that the imaging range moves and the entire planned imaging region H is imaged.
 カメラ10の露光時間の設定は1回の露光で撮像範囲の移動が完了するように設定され、比較的長い露光時間が設定される。具体的には露光時間は撮像範囲を移動させて撮像予定領域Hの撮像が完了する期間が必要であり、少なくとも1秒より大きく、好ましくは5秒よりも大きい。撮像予定領域Hや被写体によっては、露光時間を10秒、30秒にしてもよい。 The exposure time of the camera 10 is set so that the movement of the imaging range is completed within one exposure, and a relatively long exposure time is therefore set. Specifically, the exposure time must cover the period in which the imaging range is moved until imaging of the planned imaging region H is completed, and is at least longer than 1 second, preferably longer than 5 seconds. Depending on the planned imaging region H and the subject, the exposure time may be set to 10 seconds or 30 seconds.
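As a rough illustration of this constraint, the minimum exposure time follows from how far the imaging range must sweep and how fast the camera is panned or tilted. The helper below is a hypothetical sketch, not part of the patent disclosure; the angle and speed figures are made up.

```python
def min_exposure_time(sweep_angle_deg: float, pan_speed_deg_per_s: float) -> float:
    """Minimum exposure time (s) needed to sweep the planned imaging
    region H at a given camera pan/tilt speed (illustrative helper)."""
    return sweep_angle_deg / pan_speed_deg_per_s

# Sweeping a 30-degree planned region at 5 deg/s needs at least 6 s of
# exposure, consistent with the "longer than 5 seconds" guideline above.
t = min_exposure_time(30.0, 5.0)
print(t)  # 6.0
```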
 このように、露光時間が比較的長時間に設定され、露光時間中に撮像範囲の移動を伴う撮像形態において、本発明は好適に用いられる。 As described above, the present invention is suitably used in an imaging mode in which the exposure time is set to a relatively long time and the imaging range is moved during the exposure time.
 [カメラ]
 図2及び図3は、カメラ(デジタルカメラ)10を示す斜視図及び背面図である。このカメラ10は、レンズを通った光を撮像素子で受け、デジタル信号に変換して静止画又は動画の画像データとして記録メディアに記録するデジタルカメラである。図2に示すようにカメラ10は、その正面に撮影レンズ12、ストロボ1等が配設され、上面にはシャッタボタン2、電源/モードスイッチ3、モードダイヤル4等が配設されている。一方、図3に示すように、カメラ背面には、液晶モニタ(LCD:Liquid Crystal Display)30、ズームボタン5、十字ボタン6、MENU/OKボタン7、再生ボタン8、BACKボタン9等が配設されている。
[Camera]
FIGS. 2 and 3 are a perspective view and a rear view, respectively, of the camera (digital camera) 10. The camera 10 is a digital camera that receives light passing through a lens with an image sensor, converts it into a digital signal, and records it on a recording medium as still-image or moving-image data. As shown in FIG. 2, the camera 10 has a photographing lens 12, a strobe 1, and the like on its front, and a shutter button 2, a power/mode switch 3, a mode dial 4, and the like on its top. On the other hand, as shown in FIG. 3, a liquid crystal monitor (LCD: Liquid Crystal Display) 30, a zoom button 5, a cross button 6, a MENU/OK button 7, a playback button 8, a BACK button 9, and the like are arranged on the back of the camera.
 撮影レンズ12は、沈胴式のズームレンズで構成されており、電源/モードスイッチ3によってカメラ10の作動モードを撮影モードに設定することにより、カメラ本体から繰り出される。ストロボ1は、主要被写体にストロボ光を照射するものである。 The photographing lens 12 is a retractable zoom lens and extends from the camera body when the operation mode of the camera 10 is set to the shooting mode by the power/mode switch 3. The strobe 1 irradiates a main subject with strobe light.
 シャッタボタン2は、いわゆる「半押し」と「全押し」とからなる2段ストローク式のスイッチで構成され、撮像準備指示部として機能するとともに、画像の記録指示部として機能する。 The shutter button 2 is composed of a two-stroke switch composed of so-called “half-press” and “full-press”, and functions as an imaging preparation instruction unit and also functions as an image recording instruction unit.
 カメラ10は、撮影モードとして静止画撮影モードが選択され、シャッタボタン2が「半押し」されると、AF(Autofocus)/AE(Auto Exposure)制御を行う撮像準備動作を行い、シャッタボタン2が「全押し」されると、静止画の撮像及び記録を行う。 When the still image shooting mode is selected as the shooting mode and the shutter button 2 is "half-pressed", the camera 10 performs an imaging preparation operation including AF (Autofocus)/AE (Auto Exposure) control; when the shutter button 2 is "fully pressed", the camera 10 captures and records a still image.
 また、カメラ10は、撮影モードとして動画撮影モードが選択され、シャッタボタン2が「全押し」されると、動画の録画を開始し、シャッタボタン2が再度「全押し」されると、録画を停止して待機状態になる。 Further, when the moving image shooting mode is selected as the shooting mode and the shutter button 2 is "fully pressed", the camera 10 starts recording a moving image; when the shutter button 2 is "fully pressed" again, the camera 10 stops recording and enters a standby state.
 電源/モードスイッチ3は、カメラ10の電源をON/OFFする電源スイッチとしての機能と、カメラ10のモードを設定するモードスイッチとしての機能とを併せ持っており、「OFF位置」と「再生位置」と「撮像位置」との間をスライド自在に配設されている。カメラ10は、電源/モードスイッチ3をスライドさせて、「再生位置」又は「撮像位置」に合わせることにより、電源がONになり、「OFF位置」に合わせることにより、電源がOFFになる。そして、電源/モードスイッチ3をスライドさせて、「再生位置」に合わせることにより、「再生モード」に設定され、「撮像位置」に合わせることにより、「撮影モード」に設定される。 The power/mode switch 3 functions both as a power switch for turning the camera 10 on and off and as a mode switch for setting the mode of the camera 10, and is slidably arranged among an "OFF position", a "playback position", and an "imaging position". The camera 10 is turned on by sliding the power/mode switch 3 to the "playback position" or the "imaging position", and turned off by sliding it to the "OFF position". Sliding the power/mode switch 3 to the "playback position" sets the "playback mode", and sliding it to the "imaging position" sets the "shooting mode".
 モードダイヤル4は、カメラ10の撮影モードを設定するモード切替部として機能し、このモードダイヤル4の設定位置により、カメラ10の撮影モードが様々なモードに設定される。例えば、静止画撮影を行う「静止画撮影モード」、動画撮影を行う「動画撮影モード」、図1で説明を行った「長時間露光-移動撮像モード」等である。 The mode dial 4 functions as a mode switching unit for setting the shooting mode of the camera 10, and the shooting mode of the camera 10 is set to various modes depending on the setting position of the mode dial 4. For example, there are “still image shooting mode” in which still image shooting is performed, “moving image shooting mode” in which moving image shooting is performed, and “long exposure-moving imaging mode” described in FIG.
 液晶モニタ30は、表示部として機能し、撮影モード時のライブビュー画像の表示、再生モード時の静止画又は動画の表示を行うとともに、メニュー画面の表示等を行うことでグラフィカルユーザーインターフェースの一部としても機能する。また、液晶モニタ30は、後で説明をする第1の再現画像及び第2の再現画像を表示する。 The liquid crystal monitor 30 functions as a display unit; it displays a live view image in the shooting mode and a still image or moving image in the playback mode, and also serves as part of the graphical user interface by displaying menu screens and the like. The liquid crystal monitor 30 also displays a first reproduction image and a second reproduction image, which will be described later.
 ズームボタン5は、ズームを指示するズーム指示手段として機能し、望遠側へのズームを指示するテレボタン5Tと、広角側へのズームを指示するワイドボタン5Wとからなる。カメラ10は、撮影モード時に、このテレボタン5Tとワイドボタン5Wとが操作されることにより、撮影レンズ12の焦点距離が変化する。また、再生モード時に、このテレボタン5Tとワイドボタン5Wとが操作されることにより、再生中の画像が拡大、縮小する。 The zoom button 5 functions as zoom instruction means and consists of a tele button 5T for instructing zooming to the telephoto side and a wide button 5W for instructing zooming to the wide-angle side. Operating the tele button 5T and the wide button 5W in the shooting mode changes the focal length of the photographing lens 12; operating them in the playback mode enlarges or reduces the image being played back.
 十字ボタン6は、上下左右の4方向の指示を入力する操作部であり、メニュー画面から項目を選択したり、各メニューから各種設定項目の選択を指示したりするボタン（カーソル移動操作手段）として機能する。左/右キーは再生モード時のコマ送り（順方向/逆方向送り）ボタンとして機能する。 The cross button 6 is an operation unit for inputting instructions in four directions (up, down, left, and right), and functions as buttons (cursor movement operation means) for selecting items from a menu screen and for instructing selection of various setting items from each menu. The left/right keys function as frame advance (forward/reverse) buttons in the playback mode.
 MENU/OKボタン7は、液晶モニタ30の画面上にメニューを表示させる指令を行うためのメニューボタンとしての機能と、選択内容の確定及び実行などを指令するOKボタンとしての機能とを兼備した操作ボタンである。 The MENU/OK button 7 is an operation button that combines a function as a menu button for instructing display of a menu on the screen of the liquid crystal monitor 30 and a function as an OK button for instructing confirmation and execution of the selected content.
 再生ボタン8は、撮像記録した静止画又は動画を液晶モニタ30に表示させる再生モードに切り替えるためのボタンである。 The playback button 8 is a button for switching to a playback mode in which a captured still image or moving image is displayed on the liquid crystal monitor 30.
 BACKボタン9は、入力操作のキャンセルや一つ前の操作状態に戻すことを指示するボタンとして機能する。 The BACK button 9 functions as a button for instructing to cancel the input operation or return to the previous operation state.
 尚、本実施形態に係るカメラ10において、ボタン/スイッチ類に対して固有の部材を設けるのではなく、タッチパネルを設けこれを操作することでそれらボタン/スイッチ類の機能を実現するようにしてもよい。 In the camera 10 according to the present embodiment, instead of providing dedicated members for these buttons and switches, the functions of the buttons and switches may be realized by providing and operating a touch panel.
 [カメラの内部構成]
 図4はカメラ10の内部構成の実施形態を示すブロック図である。このカメラ10は、撮像した画像をメモリカード54に記録するもので、装置全体の動作は、中央処理装置(CPU:Central Processing Unit)40によって統括制御される。
[Internal structure of camera]
FIG. 4 is a block diagram showing an embodiment of the internal configuration of the camera 10. The camera 10 records captured images on a memory card 54, and the operation of the entire apparatus is centrally controlled by a central processing unit (CPU: Central Processing Unit) 40.
 カメラ10には、シャッタボタン2、電源/モードスイッチ3、モードダイヤル4、テレボタン5T、ワイドボタン5W、十字ボタン6、MENU/OKボタン7、再生ボタン8、BACKボタン9等の操作部38が設けられている。この操作部38からの信号はCPU40に入力され、CPU40は入力信号に基づいてカメラ10の各回路を制御し、例えば、センサ駆動部32により撮像素子16の駆動制御、レンズ駆動部36によりレンズ駆動制御、絞り駆動部34により絞り駆動制御、撮像動作制御、画像処理制御、画像データの記録/再生制御、及び、表示制御部61により液晶モニタ30の表示制御などを行う。 The camera 10 is provided with an operation unit 38 including the shutter button 2, the power/mode switch 3, the mode dial 4, the tele button 5T, the wide button 5W, the cross button 6, the MENU/OK button 7, the playback button 8, the BACK button 9, and the like. Signals from the operation unit 38 are input to the CPU 40, and the CPU 40 controls each circuit of the camera 10 based on the input signals: for example, drive control of the image sensor 16 via the sensor drive unit 32, lens drive control via the lens drive unit 36, aperture drive control via the aperture drive unit 34, imaging operation control, image processing control, recording/playback control of image data, and display control of the liquid crystal monitor 30 via the display control unit 61.
 撮影レンズ12、絞り14、メカシャッタ（機械的シャッタ）15等を通過した光束は、CMOS(Complementary Metal-Oxide Semiconductor)型のカラーイメージセンサである撮像素子16に結像される。尚、撮像素子16は、CMOS型に限らず、XYアドレス型、又はCCD(Charge Coupled Device)型のカラーイメージセンサでもよい。 The light flux that has passed through the photographing lens 12, the diaphragm 14, the mechanical shutter 15, and the like forms an image on the image sensor 16, which is a CMOS (Complementary Metal-Oxide Semiconductor) color image sensor. The image sensor 16 is not limited to the CMOS type and may be an XY-address type or CCD (Charge Coupled Device) color image sensor.
 撮像素子16は、多数の受光素子（フォトダイオード）が2次元配列されており、各フォトダイオードの受光面に結像された被写体像は、その入射光量に応じた量の信号電圧（又は電荷）に変換され、撮像素子16内のA/D(Analog/Digital)変換器を介してデジタル信号に変換されて出力される。 The image sensor 16 has a large number of light receiving elements (photodiodes) arranged two-dimensionally. The subject image formed on the light receiving surface of each photodiode is converted into a signal voltage (or charge) of an amount corresponding to the amount of incident light, which is then converted into a digital signal via an A/D (Analog/Digital) converter in the image sensor 16 and output.
 またCPU40は、露光時間制御部65及びズーム制御部67を備える。 The CPU 40 includes an exposure time control unit 65 and a zoom control unit 67.
 露光時間制御部65は、設定された露光時間を取得し露光の開始から露光の終了を制御する。具体的には、ユーザがシャッタボタン2を押下すると露光が開始され、露光時間制御部65は取得した露光時間が経過すると露光を終了させるように、センサ駆動部32を制御する。 The exposure time control unit 65 acquires the set exposure time and controls the exposure from its start to its end. Specifically, exposure starts when the user presses the shutter button 2, and the exposure time control unit 65 controls the sensor drive unit 32 so as to end the exposure when the acquired exposure time has elapsed.
 ズーム制御部67は、レンズ駆動部36を制御して、ズーム機能に制限を加える。例えば、後で説明をする第1の撮像画像と単位時間毎の撮像範囲とに応じて、ズーム機能を固定させる。すなわち、撮像範囲が撮像予定領域Hから外れないように、ズーム機能を固定させる。 The zoom control unit 67 controls the lens driving unit 36 to limit the zoom function. For example, the zoom function is fixed in accordance with a first captured image to be described later and an imaging range per unit time. That is, the zoom function is fixed so that the imaging range does not deviate from the scheduled imaging area H.
 動画又は静止画の撮影時に撮像素子16から読み出された画像信号（画像データ）は、画像入力コントローラ22を介してメモリ(SDRAM(Synchronous Dynamic Random Access Memory))48に一時的に記憶され、あるいはAF処理部42、AE検出部44等に取り込まれる。 Image signals (image data) read out from the image sensor 16 during moving image or still image shooting are temporarily stored in a memory (SDRAM: Synchronous Dynamic Random Access Memory) 48 via the image input controller 22, or taken into the AF processing unit 42, the AE detection unit 44, and the like.
 CPU40は、操作部38での操作に基づいてカメラ10の各部を統括制御するが、ライブビュー画像の撮影（表示）、及び動画の撮影（記録）中には、常時AF処理部42によりAF動作及びAE検出部44によるAE動作を行う。 The CPU 40 performs overall control of each unit of the camera 10 based on operations on the operation unit 38, and constantly performs the AF operation by the AF processing unit 42 and the AE operation by the AE detection unit 44 during live view image shooting (display) and moving image shooting (recording).
 撮影モードが静止画撮影モードの場合には、シャッタボタン2の第1段階の押下（半押し）があると、CPU40は、前述のAF制御を再度行い、シャッタボタン2の全押しがあると、被写体の明るさ（撮影Ev値）を算出し、算出した撮影Ev値に基づいて絞り14のF値及びメカシャッタ15による露光時間（シャッタ速度）をプログラム線図にしたがって決定し、静止画の撮影（露出制御）を行う。 When the shooting mode is the still image shooting mode and the shutter button 2 is pressed in the first stage (half-pressed), the CPU 40 performs the above-described AF control again; when the shutter button 2 is fully pressed, the CPU 40 calculates the brightness of the subject (shooting Ev value), determines the F-number of the diaphragm 14 and the exposure time (shutter speed) of the mechanical shutter 15 according to a program diagram based on the calculated shooting Ev value, and shoots the still image (exposure control).
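The program-diagram step above pairs an F-number with a shutter speed for a given subject brightness. As background, the standard photographic exposure-value relation Ev = log2(N²/t) (at base sensitivity) illustrates the trade-off such a diagram encodes; this sketch is illustrative background only, not taken from the patent.

```python
import math

def exposure_value(f_number: float, exposure_time_s: float) -> float:
    """Shooting Ev from aperture N and shutter speed t: Ev = log2(N^2 / t).
    Standard photographic relation, shown to illustrate how a program
    diagram can trade F-number against exposure time at constant Ev."""
    return math.log2(f_number ** 2 / exposure_time_s)

# f/4 at 1/60 s and f/2.8 at 1/125 s give nearly the same Ev,
# which is exactly the kind of pairing a program diagram encodes.
print(round(exposure_value(4.0, 1 / 60), 2))
print(round(exposure_value(2.8, 1 / 125), 2))
```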
 一方、撮影モードが動画撮影モードの場合には、シャッタボタン2の全押しがあると、CPU40は、動画の撮影及び記録（録画）を開始させる。尚、動画撮影時には、メカシャッタ15を開放し、撮像素子16から画像データを連続的に読み出し（例えば、フレームレートとして30フレーム/秒、60フレーム/秒）、連続的に位相差AFを行うとともに、被写体の明るさを算出し、シャッタ駆動部33によりシャッタ速度（ローリングシャッタによる電荷蓄積時間）及び/又は絞り駆動部34による絞り14を制御する。 On the other hand, when the shooting mode is the moving image shooting mode and the shutter button 2 is fully pressed, the CPU 40 starts shooting and recording the moving image. During moving image shooting, the mechanical shutter 15 is opened, image data is continuously read out from the image sensor 16 (for example, at a frame rate of 30 or 60 frames per second), and phase difference AF is performed continuously; the brightness of the subject is calculated, and the shutter speed (charge accumulation time of the rolling shutter) is controlled via the shutter drive unit 33 and/or the diaphragm 14 is controlled via the aperture drive unit 34.
 CPU40は、ズームボタン5からのズーム指令に応じてレンズ駆動部36を介してズームレンズを光軸方向に進退動作させ、焦点距離を変更させる。 The CPU 40 moves the zoom lens forward and backward in the optical axis direction via the lens driving unit 36 in accordance with the zoom command from the zoom button 5 to change the focal length.
 また、ROM(Read Only Memory)47は、カメラ制御プログラム、撮像素子16の欠陥情報、画像処理等に使用する各種のパラメータやテーブルが記憶されている。EEPROM(Electrically Erasable Programmable Read-Only Memory)により代用可能である。 A ROM (Read Only Memory) 47 stores a camera control program, defect information of the image sensor 16, and various parameters and tables used for image processing and the like. An EEPROM (Electrically Erasable Programmable Read-Only Memory) may be used instead.
 センサ63は、カメラ10の姿勢情報を検出する。例えば、手振れセンサによりカメラ10の姿勢情報を検出してもよい。さらに、センサ63は、カメラ10の方位情報、又は基準面からの高さ情報を検出するものであってもよい。具体的にはGPS(Global Positioning System)、ジャイロセンサ、圧力センサ等があげられる。 The sensor 63 detects posture information of the camera 10. For example, the posture information of the camera 10 may be detected by a camera shake sensor. Further, the sensor 63 may detect direction information of the camera 10 or height information from a reference plane. Specific examples include a GPS (Global Positioning System) receiver, a gyro sensor, and a pressure sensor.
 画像処理部24は、動画又は静止画の撮影時に画像入力コントローラ22を介して取得され、メモリ48に一時的に記憶された未処理の画像データ（RAWデータ）を読み出す。画像処理部24は、読み出したRAWデータに対してオフセット処理、画素補間処理（位相差検出用画素、傷画素等の補間処理）、ホワイトバランス補正、感度補正を含むゲインコントロール処理、ガンマ補正処理、同時化処理（「デモザイク処理」ともいう）、輝度及び色差信号生成処理、輪郭強調処理、及び色補正等を行う。 The image processing unit 24 reads out unprocessed image data (RAW data) acquired via the image input controller 22 during moving image or still image shooting and temporarily stored in the memory 48. The image processing unit 24 performs, on the read RAW data, offset processing, pixel interpolation processing (interpolation of phase difference detection pixels, defective pixels, and the like), white balance correction, gain control processing including sensitivity correction, gamma correction processing, synchronization processing (also called "demosaicing"), luminance and color difference signal generation processing, contour enhancement processing, color correction, and the like.
 画像処理部24により処理された画像データであって、ライブビュー画像として処理された画像データは、VRAM(Video RAM Random access memory)50に入力される。 The image data processed by the image processing unit 24 as a live view image is input to a VRAM (Video Random Access Memory) 50.
 VRAM50には、それぞれが1コマ分の画像を表す画像データを記録するA領域とB領域とが含まれている。VRAM50において1コマ分の画像を表す画像データがA領域とB領域とで交互に書き換えられる。VRAM50のA領域及びB領域のうち、画像データが書き換えられている方の領域以外の領域から、書き込まれている画像データが読み出される。 The VRAM 50 includes an A area and a B area, each of which records image data representing one frame of an image. In the VRAM 50, image data representing one frame is alternately rewritten in the A area and the B area, and the written image data is read from whichever of the two areas is not currently being rewritten.
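The A/B-area arrangement described above is a classic double buffer: the display always reads from the area that is not currently being rewritten. A minimal sketch of that behavior follows; the class and method names are illustrative, not from the patent.

```python
class DoubleBuffer:
    """Sketch of the VRAM A/B areas: one area is being rewritten with
    the newest frame while the other, complete, area is read out."""

    def __init__(self):
        self.areas = [None, None]  # area A, area B
        self.write_index = 0       # area currently being rewritten

    def write_frame(self, frame):
        self.areas[self.write_index] = frame
        # after the write completes, swap roles so readers see this frame
        self.write_index ^= 1

    def read_frame(self):
        # read from the area NOT currently being rewritten
        return self.areas[self.write_index ^ 1]

buf = DoubleBuffer()
buf.write_frame("frame-1")
buf.write_frame("frame-2")
print(buf.read_frame())  # frame-2
```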
 VRAM50から読み出された画像データは、ビデオエンコーダ28においてエンコーディングされ、カメラ背面に設けられている液晶モニタ30に出力される。これにより、被写体像を示すライブビュー画像が液晶モニタ30に表示される。 The image data read from the VRAM 50 is encoded by the video encoder 28 and output to the liquid crystal monitor 30 provided on the back of the camera. As a result, a live view image showing the subject image is displayed on the liquid crystal monitor 30.
 画像処理部24により処理された画像データであって、記録用の静止画又は動画として処理された画像データ（輝度データ（Y）及び色差データ（Cb），（Cr））は、再びメモリ48に記憶される。 The image data processed by the image processing unit 24 as a still image or moving image for recording (luminance data (Y) and color difference data (Cb), (Cr)) is stored in the memory 48 again.
 圧縮伸張処理部26は、静止画又は動画の記録時に、画像処理部24により処理され、メモリ48に格納された輝度データ（Y）及び色差データ（Cb），（Cr）に対して圧縮処理を施す。静止画の場合には、例えばJPEG(Joint Photographic coding Experts Group)形式で圧縮し、動画の場合には、例えばH.264形式で圧縮する。圧縮伸張処理部26により圧縮された圧縮画像データは、メディアコントローラ52を介してメモリカード54に記録される。 When recording a still image or moving image, the compression/decompression processing unit 26 compresses the luminance data (Y) and color difference data (Cb), (Cr) processed by the image processing unit 24 and stored in the memory 48. A still image is compressed in, for example, the JPEG (Joint Photographic coding Experts Group) format, and a moving image is compressed in, for example, the H.264 format. The compressed image data compressed by the compression/decompression processing unit 26 is recorded on the memory card 54 via the media controller 52.
 また、圧縮伸張処理部26は、再生モード時にメディアコントローラ52を介してメモリカード54から得た圧縮画像データに対して伸張処理を施す。メディアコントローラ52は、メモリカード54に対する圧縮画像データの記録及び読み出しなどを行う。 Also, the compression / decompression processing unit 26 performs decompression processing on the compressed image data obtained from the memory card 54 via the media controller 52 in the playback mode. The media controller 52 performs recording and reading of compressed image data with respect to the memory card 54.
 <画像処理部の機能構成>
 図5は、画像処理部24の機能構成例を示すブロック図である。
<Functional configuration of image processing unit>
FIG. 5 is a block diagram illustrating a functional configuration example of the image processing unit 24.
 画像処理部24は、第1の撮像画像取得部101、第2の撮像画像取得部103、撮像範囲検出部105、開始位置検出部107、第1の再現画像生成部109、及び第2の再現画像生成部111を備えている。以下に画像処理部24で行われる各処理について説明をする。 The image processing unit 24 includes a first captured image acquisition unit 101, a second captured image acquisition unit 103, an imaging range detection unit 105, a start position detection unit 107, a first reproduction image generation unit 109, and a second reproduction image generation unit 111. Each process performed by the image processing unit 24 is described below.
 [第1の撮像画像]
 第1の撮像画像取得部101は、撮像範囲を移動させながら露光を行って得る撮像画像の撮像予定領域Hよりも広角で撮像した第1の撮像画像(以下、A画像と記載する)121(図7)を取得する。A画像121は、長時間露光-移動撮像を行う準備段階において撮像され、撮像予定領域Hよりも広角で撮像される。
[First captured image]
The first captured image acquisition unit 101 acquires a first captured image (hereinafter referred to as the A image) 121 (FIG. 7), which is captured at a wider angle than the planned imaging region H of the captured image to be obtained by performing exposure while moving the imaging range. The A image 121 is captured at a wider angle than the planned imaging region H in the preparatory stage of the long-exposure moving imaging.
 図6は、A画像121の撮像の一例を示す図である。なお、図6は、図1で説明をした長時間露光-移動撮像を行う場合のA画像121の撮像である。A画像121は、撮像予定領域Hを含むように広角でカメラ10により撮像される。 FIG. 6 is a diagram illustrating an example of imaging of the A image 121. FIG. 6 is an image of the A image 121 when the long exposure-moving imaging described with reference to FIG. 1 is performed. The A image 121 is imaged by the camera 10 at a wide angle so as to include the imaging scheduled area H.
 図7は、図6で示した撮像形態により撮像されたA画像121を示す図である。A画像121は、木125の全体を画角内に収めており、当然に撮像予定領域Hである幹126も画角に収められている。A画像121には、撮像予定領域Hが示されており、単位時間毎の撮像範囲(127A、127B、127C)が示されている。撮像範囲に関しては後で詳述する。 FIG. 7 is a diagram illustrating the A image 121 captured in the imaging mode illustrated in FIG. The A image 121 contains the entire tree 125 within the angle of view, and naturally, the trunk 126 that is the imaging-scheduled area H is also included within the angle of view. The A image 121 shows a scheduled imaging area H, and imaging ranges (127A, 127B, 127C) for each unit time. The imaging range will be described in detail later.
 [第2の撮像画像]
 第2の撮像画像取得部103は、露光を開始する直前における撮像予定領域Hの一部が撮像された第2の撮像画像(以下、B画像と記載する)131(図8)を取得する。B画像131は、長時間露光-移動撮像の本露光を開始する直前に撮像される撮像画像である。
[Second captured image]
The second captured image acquisition unit 103 acquires a second captured image (hereinafter referred to as the B image) 131 (FIG. 8), in which a part of the planned imaging region H is captured immediately before the start of exposure. The B image 131 is a captured image captured immediately before the main exposure of the long-exposure moving imaging is started.
 図8は、B画像131を示す図である。B画像131は、長時間露光を開始する直前に撮像される画像であり、A画像121よりも画角が狭くズームされた状態で、幹126の一部が大きく写されている。B画像131は例えば、本露光が行われる直前のライブビュー画像（確認画像）から静止画を切り出して、B画像131とする。 FIG. 8 is a diagram showing the B image 131. The B image 131 is an image captured immediately before the start of the long exposure; it has a narrower angle of view than the A image 121 and, in a zoomed state, shows a part of the trunk 126 enlarged. The B image 131 is obtained, for example, by cutting out a still image from the live view image (confirmation image) immediately before the main exposure is performed.
 [撮像範囲検出]
 撮像範囲検出部105は、露光を開始する状態からのセンサ63の検出情報に基づいて、B画像131の撮像範囲を基準にした単位時間毎の撮像範囲を検出する。例えば撮像範囲検出部105は、B画像131の撮像範囲をA画像121において検出し、その検出した撮像範囲を露光開始位置として単位時間毎の撮像範囲をA画像121において検出する検出手法1を使用する。また、例えば撮像範囲検出部105は、B画像131及びA画像121を撮像した際のセンサ63の検出情報から単位時間毎の撮像範囲を検出する検出手法2を使用する。なお単位時間とは、後で説明する第1の再現画像又は第2の再現画像の表示のフレームレート(fps:frames per second)を意味する。
[Imaging range detection]
The imaging range detection unit 105 detects the imaging range for each unit time, with the imaging range of the B image 131 as a reference, based on the detection information of the sensor 63 from the state at the start of exposure. For example, the imaging range detection unit 105 may use detection method 1, in which the imaging range of the B image 131 is detected in the A image 121 and, with the detected imaging range as the exposure start position, the imaging range for each unit time is detected in the A image 121. Alternatively, the imaging range detection unit 105 may use detection method 2, in which the imaging range for each unit time is detected from the detection information of the sensor 63 obtained when the B image 131 and the A image 121 are captured. The unit time corresponds to the display frame rate (fps: frames per second) of the first reproduction image or the second reproduction image, which will be described later.
 ≪検出手法1≫
 撮像範囲検出部105は、開始位置検出部107で検出された開始位置に基づいて単位時間毎の撮像範囲を検出する。
≪Detection method 1≫
The imaging range detection unit 105 detects an imaging range for each unit time based on the start position detected by the start position detection unit 107.
 開始位置検出部107は、A画像121とB画像131とを比較することによって、A画像121における露光の開始位置を検出する。例えば開始位置検出部107は、A画像121とB画像131とのパターンマッチングにより、A画像121におけるB画像131の位置を検出する。そして撮像範囲検出部105は、開始位置検出部107の開始位置の検出結果を取得する。撮像範囲検出部105は、取得した開始位置を基準にして、センサ63の検出情報に基づいて単位時間毎の撮像範囲を検出する。 The start position detection unit 107 detects the exposure start position in the A image 121 by comparing the A image 121 with the B image 131. For example, the start position detection unit 107 detects the position of the B image 131 in the A image 121 by pattern matching between the A image 121 and the B image 131. The imaging range detection unit 105 then acquires the start position detection result of the start position detection unit 107, and detects the imaging range for each unit time based on the detection information of the sensor 63, with the acquired start position as a reference.
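The pattern matching in detection method 1 can be sketched as exhaustive template matching: slide the B image over the A image and keep the position with the smallest difference. This is a minimal stand-in, not the patent's actual algorithm; a real implementation would use a faster correlation-based matcher.

```python
import numpy as np

def find_start_position(a_image: np.ndarray, b_image: np.ndarray):
    """Locate the B image inside the wider-angle A image by exhaustive
    sum-of-squared-differences matching (illustrative sketch of what the
    start position detection unit 107 does by pattern matching)."""
    ah, aw = a_image.shape
    bh, bw = b_image.shape
    best, best_pos = None, None
    for y in range(ah - bh + 1):
        for x in range(aw - bw + 1):
            patch = a_image[y:y + bh, x:x + bw]
            ssd = float(np.sum((patch - b_image) ** 2))
            if best is None or ssd < best:
                best, best_pos = ssd, (y, x)
    return best_pos  # top-left corner of the exposure start range in A

rng = np.random.default_rng(0)
a = rng.random((20, 20))
b = a[5:11, 8:14].copy()  # B image is an exact crop of A at (5, 8)
print(find_start_position(a, b))  # (5, 8)
```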
 ≪検出手法2≫
 撮像範囲検出部105は、A画像121及びB画像131の撮像した際のカメラ10の姿勢情報に基づいて、開始位置を検出し、単位時間毎の撮像範囲を検出する。
≪Detection method 2≫
The imaging range detection unit 105 detects the start position based on the posture information of the camera 10 when the A image 121 and the B image 131 are captured, and detects the imaging range for each unit time.
 センサ63は、A画像121及びB画像131を取得する場合のカメラ10の姿勢情報を検出する。例えば、センサ63は手振れセンサである場合には、撮像範囲検出部105は、A画像121及びB画像131を撮像した場合の手振れセンサの検出情報を比較して撮像範囲を検出する。そして撮像範囲検出部105は、露光の開始位置を検出し、検出した露光の開始位置を基準にして、その後の単位時間毎の撮像範囲を検出する。 The sensor 63 detects the posture information of the camera 10 when the A image 121 and the B image 131 are acquired. For example, when the sensor 63 is a camera shake sensor, the imaging range detection unit 105 detects the imaging range by comparing detection information of the camera shake sensor when the A image 121 and the B image 131 are captured. The imaging range detection unit 105 detects an exposure start position, and detects an imaging range for each subsequent unit time based on the detected exposure start position.
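Detection method 2 turns attitude-sensor readings into image-plane motion. A minimal sketch of that conversion under a pinhole-camera assumption follows; the function and its parameters are illustrative assumptions, not values or formulas taken from the patent.

```python
import math

def pixel_shift(delta_angle_deg: float, focal_length_mm: float,
                pixel_pitch_mm: float) -> float:
    """Convert an angular change reported by the attitude sensor into an
    image-plane shift in pixels (pinhole model: shift = f * tan(angle))."""
    return focal_length_mm * math.tan(math.radians(delta_angle_deg)) / pixel_pitch_mm

# A 1-degree tilt with a 50 mm lens and 0.005 mm pixels shifts the
# imaging range by roughly 175 pixels per unit time.
shift = pixel_shift(1.0, 50.0, 0.005)
print(round(shift))  # 175
```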
 なお、撮像範囲検出部105は、センサ63の検出情報に基づいて、露光の単位時間毎の撮像範囲の移動方向を検出してもよい。撮像範囲検出部105は、撮像範囲の移動方向を取得することにより、正確な撮像範囲の検出を行うことができる。 Note that the imaging range detection unit 105 may detect the moving direction of the imaging range for each unit time of exposure based on the detection information of the sensor 63. The imaging range detection unit 105 can accurately detect the imaging range by acquiring the moving direction of the imaging range.
 先に説明をした図7のA画像121には、撮像範囲検出部105で検出された撮像範囲が示されている。撮像範囲127A、撮像範囲127B、及び撮像範囲127Cは、単位時間毎に順次移動する撮像範囲である。露光の開始位置を撮像範囲127Aとして、カメラ10をチルトさせることによって順次撮像範囲が127B及び127Cと移動する(図1参照)。なお、図7では簡略化して3つの撮像範囲を示しているが、実際では撮像予定領域Hには多数の撮像範囲が存在する。 In the A image 121 of FIG. 7 described above, the imaging range detected by the imaging range detection unit 105 is shown. The imaging range 127A, the imaging range 127B, and the imaging range 127C are imaging ranges that sequentially move every unit time. By setting the exposure start position as the imaging range 127A and tilting the camera 10, the imaging range sequentially moves to 127B and 127C (see FIG. 1). Note that, in FIG. 7, three imaging ranges are shown in a simplified manner, but there are actually a large number of imaging ranges in the imaging scheduled region H.
 [第1の再現画像]
 第1の再現画像生成部109は、単位時間毎の撮像範囲に対応する第1の再現画像(図9)を、A画像121、又は、A画像121及びB画像131から生成する。第1の再現画像生成部109は、第1の再現画像における単位時間毎の撮像範囲の画像情報を用いて、第1の再現画像を生成する。また、第1の再現画像生成部109は、露光を開始する直前に撮像されたB画像131に基づいて、第1の再現画像を生成してもよい。特に、露光開始直後であって、カメラ10の姿勢の変化が微小である場合には、B画像131と、A画像121における開始直後の撮像範囲に対応する領域とは重なる部分が多い。したがって、第1の再現画像生成部109は、B画像131とA画像121とから第1の再現画像を生成してもよい。
[First reproduction image]
The first reproduction image generation unit 109 generates the first reproduction image (FIG. 9) corresponding to the imaging range for each unit time from the A image 121, or from the A image 121 and the B image 131, using the image information of the region corresponding to the imaging range for each unit time. The first reproduction image generation unit 109 may also generate the first reproduction image based on the B image 131 captured immediately before the start of exposure. In particular, immediately after the start of exposure, when the change in the posture of the camera 10 is still small, the B image 131 largely overlaps the region of the A image 121 corresponding to the imaging range immediately after the start; the first reproduction image generation unit 109 may therefore generate the first reproduction image from both the B image 131 and the A image 121.
 図9は、第1の再現画像の例を示す図である。図7で説明をした撮像範囲127A、撮像範囲127B、撮像範囲127Cに対応する第1の再現画像133A、第1の再現画像133B、及び第1の再現画像133Cが示されている。具体的には、撮像範囲127Aを露光の開始位置とし、それに対応する第1の再現画像133Aが示されており、次の単位時間経過後の撮像範囲127Bに対応する第1の再現画像133Bが示されており、次の単位時間経過後の撮像範囲127Cに対応する第1の再現画像133Cが示されている。すなわち、第1の再現画像133は、単位時間毎に撮像素子16に蓄積される電荷情報を表している。なお、第1の再現画像を符号133で表し、各単位時間での第1の再現画像を符号133A、133B、及び133Cで表す。 FIG. 9 is a diagram showing an example of the first reproduced image. The first reproduction image 133A, the first reproduction image 133B, and the first reproduction image 133C corresponding to the imaging range 127A, the imaging range 127B, and the imaging range 127C described in FIG. 7 are shown. Specifically, the imaging range 127A is set as the exposure start position, and a first reproduction image 133A corresponding thereto is shown. The first reproduction image 133B corresponding to the imaging range 127B after the next unit time has elapsed is shown. A first reproduction image 133C corresponding to the imaging range 127C after the next unit time has elapsed is shown. That is, the first reproduced image 133 represents charge information accumulated in the image sensor 16 for each unit time. Note that the first reproduced image is represented by reference numeral 133, and the first reproduced image at each unit time is represented by reference numerals 133A, 133B, and 133C.
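In the simplest case, generating a first reproduction image from the A image reduces to cropping the region of the A image that the detected imaging range covers. The sketch below assumes a (y, x, height, width) encoding of the range for illustration; that encoding is not specified in the patent.

```python
import numpy as np

def first_reproduction_image(a_image: np.ndarray, imaging_range):
    """Generate a first reproduction image for one unit time by cropping
    the region of the A image corresponding to the detected imaging range
    (y, x, height, width).  A sketch; the patent also allows blending in
    the B image near the start of exposure."""
    y, x, h, w = imaging_range
    return a_image[y:y + h, x:x + w].copy()

a = np.arange(100).reshape(10, 10)
crop = first_reproduction_image(a, (2, 3, 4, 5))
print(crop.shape)  # (4, 5)
```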
[Second reproduction image]
The second reproduction image generation unit 111 generates a second reproduction image by integrating the first reproduction images 133. Specifically, the second reproduction image generation unit 111 generates the second reproduction image by superimposing the first reproduction images 133 obtained for each unit time from the start of exposure up to the present.
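The integration described above can be sketched as a pixel-wise accumulation of per-unit-time frames. This is an illustrative example, not the patent's implementation; the function name and the use of NumPy arrays are assumptions.

```python
import numpy as np

def second_reproduction_image(first_images):
    """Integrate per-unit-time reproduction images into one exposure preview.

    `first_images` is a sequence of same-shaped arrays, each standing in for
    the charge the sensor collects during one unit time (the first
    reproduction images 133A, 133B, ...).
    """
    acc = np.zeros_like(first_images[0], dtype=np.float64)
    for frame in first_images:
        acc += frame  # superimpose: accumulated charge adds up over the exposure
    return acc  # represents the charge accumulated so far
```

Because the frames are summed rather than averaged, the result mirrors how charge keeps accumulating on the sensor during a long exposure; a display pipeline would normalize it before showing it.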
FIG. 10 shows an example of the second reproduced image. The portions of the second reproduction image 135 in which blur occurs are indicated by dotted lines.
The second reproduction image 135 is obtained by superimposing the first reproduction images 133A, 133B, and 133C shown in FIG. 9. Because the first reproduction images (133A, 133B, 133C) capture a slightly different view in each unit time, blur appears in the second reproduction image 135. Specifically, blur occurs along the outline of the trunk 126 and also in the bark pattern 145 of the trunk 126.
Because the second reproduction image 135 obtained in this way is generated by superimposing the first reproduction images 133 for each unit time, it represents the charge information accumulated in the image sensor 16 from the start of exposure up to the present.
FIG. 11 shows another example of the second reproduction image 135. In the case of FIG. 11, the camera 10 has moved in an unintended direction; from the standpoint of obtaining the image the user intends, this is a failure case.
As in FIG. 10, the second reproduction image 135 is an image of the trunk 126 of the tree 125. The case shown in FIG. 11 is one in which the posture of the camera 10 shook while the imaging range was being moved. Because the imaging range and the first reproduction images 133 are derived from the detection information of the sensor 63 of the camera 10, they reflect the shake of the camera posture. Consequently, the second reproduction image 135 generated by integrating the first reproduction images 133 also expresses the effect of the camera shake.
Because the shake of the camera 10 during exposure can be expressed in the second reproduction image 135 in this way, the user can check the second reproduction image 135 and, before the exposure completes, cancel the capture and retake the shot. Alternatively, the user can exploit the shake of the camera 10 during exposure to create a new image.
[Imaging method]
Next, the imaging method (imaging process) of the camera 10 will be described. FIG. 12 is a flowchart showing the flow of imaging by the camera 10.
First, the camera 10 captures the A image 121 at a wide angle that includes the region to be imaged with the long exposure (the planned imaging region H) (step S10), and the first captured image acquisition unit 101 acquires the A image 121 (first captured image acquisition step). The camera 10 then captures and stores live view images until the long exposure starts (step S11).
Next, the camera 10 starts the main exposure of the moving capture (step S12). The second captured image acquisition unit 103 then acquires, from the stored live view images, the B image 131 captured immediately before the start of the main exposure (step S13: second captured image acquisition step).
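Retrieving the B image from the stored live view can be sketched as a small ring buffer of recent frames. The class, its capacity, and the method names are assumptions for illustration, not the patent's implementation.

```python
from collections import deque

class LiveViewBuffer:
    """Hold the most recent live-view frames so that the frame captured
    immediately before the main exposure starts (the B image 131) can be
    fetched at step S13."""

    def __init__(self, capacity=8):
        self._frames = deque(maxlen=capacity)  # older frames fall off the end

    def push(self, frame):
        """Store one live-view frame (called while waiting for the exposure)."""
        self._frames.append(frame)

    def before_exposure(self):
        """Return the newest buffered frame as the B image."""
        if not self._frames:
            raise RuntimeError("no live-view frame has been captured yet")
        return self._frames[-1]
```

A bounded `deque` keeps memory constant no matter how long the user waits before pressing the shutter, while the newest entry is always the frame closest to the exposure start.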
After that, the start position detection unit 107 detects the position of the B image 131 within the A image 121 (the exposure start position) (step S14: detection step), and determines whether that position can be detected (step S15). If the start position detection unit 107 cannot detect the position of the B image 131 in the A image 121, the display control unit 61 displays an error message on the liquid crystal monitor 30 (step S23).
If the start position detection unit 107 has detected the position of the B image 131 in the A image 121, the exposure time control unit 65 determines whether the acquired exposure time is equal to or less than a threshold (step S16). If the exposure time control unit 65 determines that the exposure time is equal to or less than the threshold, the display control unit 61 ends the processing without displaying the second reproduction image 135 on the liquid crystal monitor 30. If, on the other hand, the exposure time is greater than the threshold, the display control unit 61 performs the following processing to display the second reproduction image 135 on the liquid crystal monitor 30.
The imaging range detection unit 105 acquires the current posture information of the camera 10 (tilt and the like) and the relative position coordinates (Δx, Δy, Δz) (step S17). The relative position coordinates are the coordinates of the center of the imaging range, expressed relative to the A image 121. They can be calculated from the imaging range: taking the center of the imaging range at the exposure start position (the center of the B image) as the origin, the coordinates (Δx, Δy, Δz) are obtained from the detection information of the sensor 63. The imaging range detection unit 105 then detects the current imaging range from the A image 121, the B image 131, and the relative position coordinates (imaging range detection step), and the first reproduction image generation unit 109 generates the first reproduction image 133 (step S18: first reproduction image generation step).
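The geometry of step S17 can be sketched as shifting the exposure-start range by the sensor-derived offset. For simplicity this example keeps only the in-plane (Δx, Δy) components and ignores Δz; the function name and the box representation are assumptions for illustration.

```python
def current_imaging_range(start_center, range_size, delta):
    """Locate the current imaging range inside the A image.

    start_center: (x, y) center of the B image (exposure start position)
        in A-image pixel coordinates.
    range_size: (w, h) of the imaging range in A-image pixels.
    delta: (dx, dy) displacement relative to the start position, derived
        from the sensor's detection information.
    Returns (left, top, right, bottom) of the current range.
    """
    cx = start_center[0] + delta[0]
    cy = start_center[1] + delta[1]
    w, h = range_size
    return (cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2)
```

Cropping the A image to the returned box would then yield the image information for one first reproduction image 133.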
The imaging range detection unit 105 then determines whether the imaging range lies outside the range of the A image 121 (step S19). If the imaging range detection unit 105 determines that the imaging range is outside the range of the A image 121, the first reproduction image generation unit 109 generates a first reproduction image 133 in which the out-of-range portion is displayed as a zebra pattern (step S24).
FIG. 13 shows a first reproduced image 133 with a zebra pattern. The tree trunk 126 appears in the first reproduced image 133, but the angle of view has drifted too far to the left of the figure, exceeding the range of the A image 121. The zebra pattern 143 therefore notifies the user that this portion is outside the range of the A image 121; the zebra pattern 143 is an example of a notification image. The notification to the user may be made not only with a notification image but also with notification text, or with both a notification image and notification text.
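Marking the out-of-range portion can be sketched as follows: each preview pixel is mapped back into A-image coordinates, and pixels that fall outside the A image get diagonal stripes. This is an illustrative sketch; the stripe width, white stripe value, and single-channel image are assumptions.

```python
import numpy as np

def zebra_out_of_range(preview, a_height, a_width, left, top, stripe=8):
    """Overlay a zebra pattern on preview pixels that map outside the A image.

    `preview` is the first reproduction image as a 2-D array; (left, top) is
    the position of its upper-left corner in A-image coordinates.
    """
    out = preview.astype(np.float64)
    h, w = out.shape
    ys, xs = np.mgrid[0:h, 0:w]
    ax, ay = xs + left, ys + top               # back into A-image coordinates
    outside = (ax < 0) | (ay < 0) | (ax >= a_width) | (ay >= a_height)
    stripes = ((xs + ys) // stripe) % 2 == 0   # diagonal stripe mask
    out[outside & stripes] = 255.0             # paint the zebra stripes white
    return out
```

Striping only every other diagonal band (rather than flood-filling) keeps the underlying image partly visible, so the user can still see what is drifting out of range.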
Returning to FIG. 12, if the imaging range detected by the imaging range detection unit 105 is within the range of the A image 121, the second reproduction image generation unit 111 generates, by integrating the first reproduction images 133, an image representing what has been accumulated on the image sensor 16 up to the present (second reproduction image generation step), and the display control unit 61 displays the generated second reproduction image 135 on the liquid crystal monitor 30 (step S20: display control step).
Next, the exposure time control unit 65 determines whether the current time is within the acquired exposure time (step S21). If the current time is within the exposure time, the imaging range detection unit 105 acquires the current posture information and relative position coordinates of the camera 10 and detects the current imaging range again. The loop indicated by the dash-dot line F is executed, for example, at an ordinary frame rate (50 to 60 fps). When the exposure time ends, the display control unit 61 ends the display of the reproduced image on the liquid crystal monitor 30 (step S22).
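The per-frame loop indicated by the dash-dot line F (steps S17 to S21) can be sketched as below. All the callables are hypothetical stand-ins for the camera's processing units; the frame-counting formulation (instead of wall-clock timing) is an assumption made to keep the sketch deterministic.

```python
def run_exposure_preview(get_motion, detect_range, make_first, integrate,
                         display, exposure_time_s, fps=50):
    """Drive the preview loop during the main exposure.

    One pass per display frame: read the sensor, detect the current imaging
    range, build the first reproduction image, fold it into the running
    second reproduction image, and display the result.
    """
    n_frames = int(exposure_time_s * fps)  # e.g. 50-60 passes per second
    for _ in range(n_frames):
        delta = get_motion()               # posture info + relative coordinates
        imaging_range = detect_range(delta)
        display(integrate(make_first(imaging_range)))
```

Injecting the processing units as callables makes the loop testable without camera hardware, which is why they are parameters here rather than fixed objects.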
In the above embodiment, the hardware structure of the processing units that execute the various processes is any of the following processors: a CPU (Central Processing Unit), which is a general-purpose processor that executes software (a program) to function as various processing units; a programmable logic device (PLD) such as an FPGA (Field Programmable Gate Array), whose circuit configuration can be changed after manufacture; and a dedicated electric circuit, such as an ASIC (Application Specific Integrated Circuit), which is a processor having a circuit configuration designed specifically to execute particular processing.
One processing unit may be configured by one of these various processors, or by two or more processors of the same or different types (for example, a plurality of FPGAs, or a combination of a CPU and an FPGA). A plurality of processing units may also be configured by one processor. As a first example of configuring a plurality of processing units with one processor, one processor may be configured by a combination of one or more CPUs and software, as typified by computers such as clients and servers, and this processor may function as the plurality of processing units. As a second example, as typified by a system on chip (SoC), a processor may be used that realizes the functions of the entire system, including the plurality of processing units, with a single IC (Integrated Circuit) chip. In this way, the various processing units are configured as a hardware structure using one or more of the various processors described above.
More specifically, the hardware structure of these various processors is an electric circuit (circuitry) in which circuit elements such as semiconductor elements are combined.
Each of the configurations and functions described above can be realized as appropriate by arbitrary hardware, software, or a combination of both. For example, the present invention can also be applied to a program that causes a computer to execute the processing steps (processing procedure) described above, to a computer-readable recording medium (non-transitory recording medium) on which such a program is recorded, or to a computer on which such a program can be installed.
As described above, in the present invention, the first reproduction image generation unit 109 generates the first reproduction images 133 corresponding to the imaging range for each unit time from the A image 121, or from the A image 121 and the B image 131, and the second reproduction image generation unit 111 generates the second reproduction image 135 during exposure by integrating the first reproduction images 133. The information being accumulated in the image sensor 16 during exposure is thus reproduced and presented to the user.
<Others>
[Camera movement type]
In the above description, the imaging range is moved by operating the camera 10 in the tilt direction, but the operation of the camera 10 for moving the imaging range is not limited to this.
FIG. 14 shows an example of how the camera 10 may be operated to move the imaging range. As shown in the figure, the planned imaging region H may be imaged by changing the height of the camera 10 while keeping its posture unchanged, thereby moving the imaging range. The camera 10 moves vertically upward with respect to the ground from t = 0, when the main exposure starts, to t = Δt, moving the imaging range to image the planned imaging region H.
[Display switching between first reproduction image and second reproduction image]
In the above description, the second reproduced image 135 is displayed on the liquid crystal monitor 30 of the camera 10, but the present invention is not limited to this. For example, the liquid crystal monitor 30 may switch between displaying the first reproduction image 133 and the second reproduction image 135.
FIG. 15 illustrates switching the display on the liquid crystal monitor 30. In the display form indicated by reference numeral 140, the first reproduced image 133 is displayed on the liquid crystal monitor 30; in the display form indicated by reference numeral 141, the second reproduced image 135 is displayed. For example, the user may use the operation unit 38 to switch between displaying the first reproduction image 133 and the second reproduction image 135 on the liquid crystal monitor 30, depending on the information needed.
By controlling the display on the liquid crystal monitor 30 in this way, the user can switch between the currently captured image and the accumulated image for confirmation.
[Exposure time]
In the above description, the exposure time is set in advance, but the present invention is not limited to this. For example, the present invention also applies to a long exposure in which the exposure ends when the user releases the shutter button 2 of the camera 10 after pressing it.
Although examples of the present invention have been described above, the present invention is not limited to the embodiments described, and it goes without saying that various modifications are possible without departing from the spirit of the present invention.
1: Flash
2: Shutter button
3: Power/mode switch
4: Mode dial
5: Zoom button
6: Cross button
7: MENU/OK button
8: Playback button
9: BACK button
10: Camera
12: Imaging lens
14: Aperture
15: Mechanical shutter
16: Image sensor
22: Image input controller
24: Image processing unit
26: Compression/decompression processing unit
28: Video encoder
30: Liquid crystal monitor
32: Sensor driving unit
33: Shutter driving unit
34: Aperture driving unit
36: Lens driving unit
38: Operation unit
40: CPU
42: AF processing unit
44: AE detection unit
47: ROM
48: Memory
50: URAM
52: Media controller
54: Memory card
61: Display control unit
63: Sensor
65: Exposure time control unit
67: Zoom control unit
101: First captured image acquisition unit
103: Second captured image acquisition unit
105: Imaging range detection unit
107: Start position detection unit
109: First reproduction image generation unit
111: Second reproduction image generation unit
121: First captured image
125: Tree
126: Tree trunk
127A: Imaging range
127B: Imaging range
127C: Imaging range
131: Second captured image
133: First reproduction image
133A: First reproduction image
133B: First reproduction image
133C: First reproduction image
135: Second reproduction image
145: Tree trunk pattern
H: Planned imaging region
Steps S10-S24: Camera operation steps

Claims (14)

  1.  A camera capable of acquiring a captured image by performing exposure while moving an imaging range, the camera comprising:
     a display unit;
     a first captured image acquisition unit that acquires a first captured image captured at a wider angle than a planned imaging region of the captured image;
     a second captured image acquisition unit that acquires a second captured image in which a part of the planned imaging region is captured immediately before the exposure is started;
     a sensor that detects posture information of the camera;
     an imaging range detection unit that detects an imaging range for each unit time, with the imaging range of the second captured image as a reference, based on detection information of the sensor from the state in which the exposure is started;
     a first reproduction image generation unit that generates a first reproduction image corresponding to the imaging range for each unit time from the first captured image, or from the first captured image and the second captured image;
     a second reproduction image generation unit that generates a second reproduction image during the exposure by integrating the first reproduction images; and
     a display control unit that causes the display unit to display the second reproduction image.
  2.  The camera according to claim 1, further comprising a start position detection unit that detects a start position of the exposure in the first captured image by comparing the first captured image and the second captured image,
     wherein the imaging range detection unit detects the imaging range for each unit time based on the start position.
  3.  The camera according to claim 2, wherein the sensor detects posture information of the camera at the time the first captured image and the second captured image are acquired, and
     the imaging range detection unit detects the start position of the exposure based on the detection information at the time the first captured image and the second captured image are acquired, and detects the imaging range for each unit time.
  4.  The camera according to any one of claims 1 to 3, wherein the sensor further detects at least one of orientation information of the camera and height information from a reference plane.
  5.  The camera according to claim 4, wherein the imaging range detection unit detects a moving direction of the imaging range for each unit time based on the detection information of the sensor.
  6.  The camera according to any one of claims 1 to 5, further comprising an exposure time control unit that acquires an exposure time set for the exposure and controls the exposure.
  7.  The camera according to claim 6, wherein, when the exposure time is longer than a threshold, the display control unit causes the display unit to display the second reproduction image.
  8.  The camera according to any one of claims 1 to 7, wherein, when the imaging range for each unit time detected by the imaging range detection unit lies outside the range of the first captured image, the first reproduction image generation unit generates the first reproduction image including at least one of a notification image and notification text that notifies the user.
  9.  The camera according to any one of claims 1 to 8, wherein the display control unit switches between the first reproduction image and the second reproduction image for display on the display unit.
  10.  The camera according to any one of claims 1 to 9, wherein the sensor is a camera shake sensor.
  11.  The camera according to any one of claims 1 to 10, further comprising a zoom control unit that fixes a zoom function according to the first captured image and the imaging range for each unit time.
  12.  An imaging method capable of acquiring a captured image by performing exposure while moving an imaging range, the method comprising:
     a first captured image acquisition step of acquiring a first captured image captured at a wider angle than a planned imaging region of the captured image;
     a second captured image acquisition step of acquiring a second captured image in which a part of the planned imaging region is captured immediately before the exposure is started;
     a detection step of detecting posture information of a camera with a sensor;
     an imaging range detection step of detecting an imaging range for each unit time, with the imaging range of the second captured image as a reference, based on detection information of the sensor from the state in which the exposure is started;
     a first reproduction image generation step of generating a first reproduction image corresponding to the imaging range for each unit time from the first captured image, or from the first captured image and the second captured image;
     a second reproduction image generation step of generating a second reproduction image during the exposure by integrating the first reproduction images; and
     a display control step of causing a display unit to display the second reproduction image.
  13.  A program that causes a computer to execute an imaging process capable of acquiring a captured image by performing exposure while moving an imaging range, the imaging process comprising:
     a first captured image acquisition step of acquiring a first captured image captured at a wider angle than a planned imaging region of the captured image;
     a second captured image acquisition step of acquiring a second captured image in which a part of the planned imaging region is captured immediately before the exposure is started;
     a detection step of detecting posture information of a camera with a sensor;
     an imaging range detection step of detecting an imaging range for each unit time, with the imaging range of the second captured image as a reference, based on detection information of the sensor from the state in which the exposure is started;
     a first reproduction image generation step of generating a first reproduction image corresponding to the imaging range for each unit time from the first captured image, or from the first captured image and the second captured image;
     a second reproduction image generation step of generating a second reproduction image during the exposure by integrating the first reproduction images; and
     a display control step of causing a display unit to display the second reproduction image.
  14.  A non-transitory computer-readable recording medium that, when a command stored in the recording medium is read by a computer, causes the computer to realize an imaging function capable of acquiring a captured image by performing exposure while moving an imaging range, the imaging function comprising:
     a second captured image acquisition step of acquiring a second captured image in which a part of the planned imaging region is captured immediately before the exposure is started;
     a detection step of detecting posture information of a camera with a sensor;
     an imaging range detection step of detecting an imaging range for each unit time, with the imaging range of the second captured image as a reference, based on detection information of the sensor from the state in which the exposure is started;
     a first reproduction image generation step of generating a first reproduction image corresponding to the imaging range for each unit time from the first captured image, or from the first captured image and the second captured image;
     a second reproduction image generation step of generating a second reproduction image during the exposure by integrating the first reproduction images; and
     a display control step of causing a display unit to display the second reproduction image.
PCT/JP2019/009477 2018-03-20 2019-03-08 Camera, imaging method, and program WO2019181579A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020508203A JP7028960B2 (en) 2018-03-20 2019-03-08 Camera, imaging method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018052742 2018-03-20
JP2018-052742 2018-03-20

Publications (1)

Publication Number Publication Date
WO2019181579A1 true WO2019181579A1 (en) 2019-09-26

Family

ID=67987089

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/009477 WO2019181579A1 (en) 2018-03-20 2019-03-08 Camera, imaging method, and program

Country Status (2)

Country Link
JP (1) JP7028960B2 (en)
WO (1) WO2019181579A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08223481A (en) * 1995-02-16 1996-08-30 Sony Corp Panorama still image generating device
WO2009142327A1 (en) * 2008-05-20 2009-11-26 日本電気株式会社 Imaging device, mobile information processing terminal, monitor display method for imaging device, and program
JP2010004113A (en) * 2008-06-18 2010-01-07 Olympus Imaging Corp Imaging apparatus, imaging method, image processing device and image processing method
WO2012132204A1 (en) * 2011-03-30 2012-10-04 NEC CASIO Mobile Communications, Ltd. Imaging device, photographing guide displaying method for imaging device, and non-transitory computer readable medium
JP2016096487A (en) * 2014-11-17 2016-05-26 株式会社クワンズ Imaging system
JP2017055268A (en) * 2015-09-09 2017-03-16 Canon Inc. Imaging device, control method and program for imaging device

Also Published As

Publication number Publication date
JP7028960B2 (en) 2022-03-02
JPWO2019181579A1 (en) 2021-03-11

Similar Documents

Publication Publication Date Title
TWI326553B (en) Imaging apparatus, data extraction method, and data extraction program
JP2006174105A (en) Electronic camera and program
US10234653B2 (en) Imaging device with focus lens stopped at each of focus positions during video capturing
US9912877B2 (en) Imaging apparatus for continuously shooting plural images to generate a still image
US20180007290A1 (en) Imaging apparatus and imaging method
JP2009159559A (en) Image capturing device and program thereof
US20130002899A1 (en) Imaging device
JP4977569B2 (en) Imaging control apparatus, imaging control method, imaging control program, and imaging apparatus
JP2013110754A (en) Camera device, and photographing method and program of the same
US9118829B2 (en) Imaging device
JP4788172B2 (en) Imaging apparatus and program
JP2010193324A (en) Camera apparatus, photographing method thereof and program
JP2006203732A (en) Digital camera, portrait/landscape aspect photographing switching method and program
JP6435527B2 (en) Imaging device
JP7028960B2 (en) Camera, imaging method, and program
JP4522232B2 (en) Imaging device
JP5831492B2 (en) Imaging apparatus, display control method, and program
JP6399120B2 (en) Imaging device
US10038835B2 (en) Image pickup device and still picture generating method
JP6642661B2 (en) Imaging device
JP5338248B2 (en) Image processing apparatus, electronic camera, and image processing program
JP4962597B2 (en) Electronic camera and program
JP2010217808A (en) Apparatus and method for processing image, electronic camera, and image processing program
EP3041219B1 (en) Image processing device and method, and program
JP2007201774A (en) Moving direction detecting apparatus, and imaging apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19772588

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020508203

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19772588

Country of ref document: EP

Kind code of ref document: A1