JP6278636B2 - Imaging apparatus, control method therefor, program, and storage medium - Google Patents


Publication number
JP6278636B2
Authority: JP (Japan)
Prior art keywords: shooting, mode, display, interval, imaging
Application number: JP2013171640A
Other languages: Japanese (ja)
Other versions: JP2015040965A (en)
Inventor: Seiji Ogawa (誠司 小川)
Original Assignee: Canon Inc. (キヤノン株式会社)
Application filed by Canon Inc. (キヤノン株式会社)
Priority to JP2013171640A
Priority claimed from GB1414710.2A (external priority, GB2519416B)
Publication of JP2015040965A
Application granted
Publication of JP6278636B2
Legal status: Active


Description

  The present invention relates to information display in interval shooting.

  Imaging apparatuses capable of interval shooting, in which multiple shots are taken automatically at predetermined intervals, are known. In some such apparatuses, the images captured at the predetermined intervals are recorded as separate image files. In others, multiple still images are shot and combined into a single still image, or movies are shot several times at regular intervals and then joined or combined into a single movie.

  Patent Document 1 proposes an imaging apparatus that combines a plurality of captured still images during interval shooting and displays a combined image on a display unit when a predetermined operation is performed during the interval period.

  Patent Document 2 proposes an observation device for interval imaging, in which a sample is imaged intermittently with an interval period in between, that displays both the remaining time of the imaging series and the remaining time within the current interval period.

JP 2013-62740 A (Patent Document 1)
JP 2008-70259 A (Patent Document 2)

  Interval shooting takes a long time because multiple shots are taken at intervals. For this reason, it is desirable that the user be able to check the progress of shooting on the camera during interval shooting. In interval shooting, not only the elapsed time but also the shooting interval (interval period) is useful information, because if the shooting interval is known, the user can make the necessary preparations before the next shot starts.

  Patent Document 1, however, does not consider displaying time information that indicates the progress of interval shooting. In Patent Document 2, the display of the remaining time of the imaging series does not reveal the shooting interval, and the display of the remaining time within the interval period does not reveal the elapsed time since the series of interval shots began. Displaying both the remaining time of the series and the remaining time of the interval period simultaneously makes the display cluttered and requires display space for both.

  In view of the above problems, an object of the present invention is to provide an imaging apparatus capable of displaying information in interval shooting that is more convenient for the user.

In order to solve the above problems, an imaging apparatus according to the present invention comprises:
shooting control means for controlling interval shooting, in which shooting is performed a plurality of times at intervals in response to a single shooting instruction; and
display control means for controlling the display so that the elapsed time from the start of the interval shooting triggered by the shooting instruction is updated and displayed each time a shot in the interval shooting is taken.

  According to the present invention, the user can grasp both the elapsed time of the interval shooting and the shooting interval from a more space-saving information display.
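For illustration only, the following is a minimal sketch of this display behavior, assuming hypothetical take_shot() and render_osd() helpers that are not part of the patent; the actual implementation is not specified by the text above.

```python
import time

def run_interval_shooting(num_shots, interval_sec, take_shot, render_osd):
    """Perform interval shooting after a single instruction and update the
    displayed elapsed time each time one shot in the series is taken."""
    start = time.monotonic()
    for shot in range(1, num_shots + 1):
        take_shot()                                   # one shot of the series
        elapsed = int(time.monotonic() - start)       # time since the instruction
        render_osd(f"{elapsed // 60:02d}:{elapsed % 60:02d}  ({shot}/{num_shots})")
        if shot < num_shots:
            time.sleep(interval_sec)                  # wait out the interval period
```

Because the displayed elapsed time advances by one shooting interval at each update, the user can read off the interval itself from the same single display field.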

FIG. 1 is an external view of the back surface (a) and the front surface (b) of the imaging apparatus according to the present embodiment. FIG. 2 is a block diagram illustrating the configuration of the imaging apparatus according to the embodiment. FIG. 3 is a flowchart showing the starry sky mode selection process in the present embodiment. FIG. 4 illustrates the starry sky mode selection screen of the present embodiment. FIG. 5 is a flowchart showing the shooting process in the starry sky snap mode of the present embodiment. FIG. 6 shows screen display examples in the starry sky snap mode of the present embodiment. FIG. 7 is a flowchart showing the shooting process in the starry night view mode of the present embodiment. FIG. 8 is a flowchart showing the shooting process in the starry sky trajectory mode of the present embodiment. FIG. 9 shows screen display examples in the starry sky trajectory mode of the present embodiment.

  Embodiments for carrying out the present invention will be described in detail below with reference to the accompanying drawings.

<Device configuration>
With reference to FIG. 1 and FIG. 2, the function and appearance of an imaging apparatus according to an embodiment to which the present invention is applied (in this embodiment, a digital camera is taken as an example) will be described.

  In FIG. 1, which shows the external appearance of the digital camera 100 of the present embodiment, the display unit 101 includes a liquid crystal display panel (LCD) that displays images and various types of information. The shutter button 102 is an operation unit for issuing a shooting instruction. The mode switching button 103 is an operation unit for switching between various modes. The connector 107 is an interface for connecting the connection cable 108 and the digital camera 100. The operation unit 104 comprises operation members such as various switches, buttons, and a touch panel that accept various operations from the user. The controller wheel 106 is a rotatable operation member included in the operation unit 104. The power switch 105 switches between power on and power off. The recording medium 109 is a recording medium such as a memory card or a hard disk. The recording medium slot 110 is a slot for storing the recording medium 109; the recording medium 109 stored in the slot can communicate with the digital camera 100. A lid 111 covers the recording medium slot 110. The light emitting unit 112 includes an LED (light emitting diode) or the like and notifies a subject located in front of the camera by a predetermined light emission / non-light emission pattern (for example, during the countdown of the self-timer, or at the start and end of shooting). The light emitting unit 112 is arranged on the front side of the camera (subject side, imaging surface side) so that it can be seen from the subject side. The strobe 113 is a retractable flash device that fires to illuminate a subject, and moves between the stored (unused) position (FIG. 1A) and the exposed (in-use) position (FIG. 1B) in response to a user operation or under program AE control.

  In FIG. 2 showing the internal configuration of the digital camera 100 of the present embodiment, a photographing lens 203 is a lens group including a zoom lens and a focus lens. The shutter 204 has an aperture function. The image pickup unit 205 is an image pickup element configured by a CCD, a CMOS, or the like that converts an optical image of a subject into an electric signal. The A / D converter 206 converts an analog signal into a digital signal. The A / D converter 206 is used to convert an analog signal output from the imaging unit 205 into a digital signal. The barrier 202 covers the imaging system including the imaging lens 203 of the digital camera 100, thereby preventing the imaging system including the imaging lens 203, the shutter 204, and the imaging unit 205 from becoming dirty or damaged.

  The image processing unit 207 performs resizing processing and color conversion processing such as predetermined pixel interpolation and reduction on the data from the A / D converter 206 or the data from the memory control unit 209. The image processing unit 207 performs predetermined calculation processing using the captured image data, and the system control unit 201 performs exposure control and distance measurement control based on the obtained calculation result. Thereby, AF (autofocus) processing, AE (automatic exposure) processing, and EF (flash pre-emission) processing of the TTL (through-the-lens) method are performed. The image processing unit 207 further performs predetermined calculation processing using the captured image data, and also performs TTL AWB (auto white balance) processing based on the obtained calculation result.

  Output data from the A/D converter 206 is written into the memory 210 via the image processing unit 207 and the memory control unit 209, or directly via the memory control unit 209. The memory 210 stores image data obtained by the imaging unit 205 and converted into digital data by the A/D converter 206, as well as image data to be displayed on the display unit 101. The memory 210 has a storage capacity sufficient to store a predetermined number of still images and a predetermined duration of moving images and audio.

  The memory 210 also serves as an image display memory (video memory). The D/A converter 208 converts the image display data stored in the memory 210 into an analog signal and supplies it to the display unit 101. Thus, the display image data written into the memory 210 is displayed on the display unit 101 via the D/A converter 208. The display unit 101 performs display on an LCD or similar display according to the analog signal from the D/A converter 208. By converting the digital signals that were A/D converted by the A/D converter 206 and accumulated in the memory 210 back into analog signals with the D/A converter 208 and sequentially transferring them to the display unit 101 for display, the display unit can function as an electronic viewfinder and provide a through image display. The through image may also be called a live view image, and the through image display may be called live view display.
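As a rough, non-authoritative illustration of this through-image (live view) path, a sketch assuming hypothetical read_sensor(), a_to_d(), write_memory(), d_to_a(), and show() helpers rather than the actual hardware interfaces:

```python
def live_view(read_sensor, a_to_d, write_memory, d_to_a, show, keep_running):
    """Digitize the sensor output, buffer it in memory, and hand it to the display
    so that the display unit works as an electronic viewfinder (through image)."""
    while keep_running():
        analog = read_sensor()        # output of the imaging unit 205
        digital = a_to_d(analog)      # A/D converter 206
        write_memory(digital)         # memory 210 (video memory)
        show(d_to_a(digital))         # D/A converter 208 -> display unit 101
```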

  The nonvolatile memory 213 is an electrically erasable / recordable memory, and for example, an EEPROM or the like is used. The nonvolatile memory 213 stores constants, programs, and the like for operating the system control unit 201. Here, the program is a program for executing various flowcharts described later in the present embodiment.

  The system control unit 201 controls the entire digital camera 100. By executing the program recorded in the nonvolatile memory 213 described above, each process of the present embodiment described later is realized. The system memory 212 is a RAM, into which constants and variables for the operation of the system control unit 201, programs read from the nonvolatile memory 213, and the like are loaded. The system control unit 201 also performs display control by controlling the memory 210, the D/A converter 208, the display unit 101, and the like.

  The system timer 211 is a time measuring unit that measures the time used for various controls and the time of a built-in clock.

  The mode switching button 103, the first shutter switch 102a, the second shutter switch 102b, and the operation unit 104 are operation means for inputting various operation instructions to the system control unit 201.

  A mode switching button 103 switches the operation mode of the system control unit 201 to any one of a still image recording mode, a moving image recording mode, a reproduction mode, and the like. Modes included in the still image recording mode include an auto shooting mode, an auto scene discrimination mode, a manual mode, various scene modes for shooting settings for each shooting scene, a program AE mode, a custom mode, and the like. A mode switching button 103 is used to directly switch to one of these modes included in the still image recording mode. Alternatively, after switching to the still image recording mode once with the mode switching button 103, switching to any one of these modes included in the still image recording mode may be performed using another operation member. Similarly, the moving image recording mode may include a plurality of modes.

  The first shutter switch 102a is turned on when the shutter button 102 provided in the digital camera 100 is being operated, so-called half-press (shooting preparation instruction), and generates a first shutter switch signal SW1. In response to the first shutter switch signal SW1, operations such as AF processing, AE processing, AWB processing, and EF processing are started.

  The second shutter switch 102b is turned on when the operation of the shutter button 102 is completed, so-called full press (shooting instruction), and generates a second shutter switch signal SW2. In response to the second shutter switch signal SW2, the system control unit 201 starts a series of shooting processing operations from reading a signal from the imaging unit 205 to writing image data on the recording medium 109 (shooting control).

  Each operation member of the operation unit 104 is appropriately assigned a function for each scene by selecting and operating various function icons displayed on the display unit 101, and functions as various function buttons. Examples of the function buttons include an end button, a return button, an image advance button, a jump button, a narrowing button, and an attribute change button. For example, when a menu button is pressed, various setting menu screens are displayed on the display unit 101. The user can make various settings intuitively using the menu screen displayed on the display unit 101 and the four-way buttons and the SET button.

  Note that the operation unit 104 includes a touch panel capable of detecting contact with the display unit 101. The touch panel and the display unit 101 can be configured integrally; for example, the touch panel is configured so that its light transmittance does not interfere with the display of the display unit 101 and is attached over the display surface of the display unit 101. Input coordinates on the touch panel are then associated with display coordinates on the display unit 101. This makes it possible to provide a GUI that lets the user feel as if the screen displayed on the display unit 101 can be operated directly.

  The controller wheel 106 is a rotatable operation member included in the operation unit 104, and is used together with the direction buttons, for example when designating a selection item. When the controller wheel 106 is rotated, an electrical pulse signal is generated according to the amount of operation, and the system control unit 201 controls each part of the digital camera 100 based on this pulse signal. From the pulse signal, the angle through which the controller wheel 106 has been rotated, the number of rotations, and the like can be determined. The controller wheel 106 may be any operation member whose rotation can be detected; for example, it may be a dial operation member that itself rotates in response to the user's rotation operation and generates a pulse signal. It may also be an operation member based on a touch sensor that detects the rotating motion of the user's finger on the controller wheel 106 without the controller wheel 106 itself rotating (a so-called touch wheel).
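As a simple illustration of this pulse-based rotation sensing, a sketch assuming a hypothetical pulses_per_revolution value for the wheel (the actual resolution is not given in the text):

```python
def wheel_rotation_angle(pulse_count, pulses_per_revolution=24):
    """Convert the pulse count from the controller wheel into a total rotation
    angle in degrees and a number of complete revolutions."""
    angle = pulse_count * (360.0 / pulses_per_revolution)
    full_turns = int(angle // 360)
    return angle, full_turns
```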

  The power control unit 214 includes a battery detection circuit, a DC-DC converter, a switch circuit that switches a block to be energized, and the like, and detects whether or not a battery is attached, the type of battery, and the remaining battery level. Further, the power control unit 214 controls the DC-DC converter based on the detection result and an instruction from the system control unit 201, and supplies a necessary voltage to each unit including the recording medium 109 for a necessary period.

  The power supply unit 215 includes a primary battery such as an alkaline battery or a lithium battery, a secondary battery such as a NiCd battery, a NiMH battery, or a lithium ion battery, an AC adapter, or the like. The recording medium I / F 216 is an interface with the recording medium 109 such as a memory card or a hard disk. The recording medium 109 is a recording medium such as a memory card for recording a captured image, and includes a semiconductor memory, a magnetic disk, or the like.

  The sound generation unit 217 includes a speaker, and generates a self-timer countdown sound, a shutter sound in accordance with the opening / closing of the shutter, other operation sounds, a moving image sound during moving image reproduction, and the like.

  In addition to the above configuration, there may be a communication unit that transmits and receives video and audio to and from an external device that is communicably connected by a wireless antenna or a wired cable. In this case, the communication unit can be connected to a wireless LAN or the Internet, and can transmit an image captured by the imaging unit 205 (including a through image) or an image file recorded on the recording medium 109 to an external device. Image data and other various information can be received from an external device.

  In some cases, an attitude detection unit such as an acceleration sensor or a gyro sensor that detects the attitude of the digital camera 100 with respect to the direction of gravity is mounted. In that case, according to the attitude detected by the attitude detection unit, it is possible to determine whether the image captured by the imaging unit 205 was shot with the digital camera 100 held horizontally or vertically. The system control unit 201 can add information on the detected attitude to the image file, or rotate the captured image before recording it.

  The digital camera 100 can be switched between at least a playback mode for playing back an image and a shooting mode for shooting. As shooting modes, an auto mode, a manual mode, and a plurality of scene-specific shooting modes are provided. The auto mode is a mode in which various parameters of the camera are automatically determined by a program incorporated in the digital camera 100 based on the measured exposure value. The manual mode is a mode in which the user can freely change various parameters of the camera. The scene-specific shooting mode is a shooting mode realized by combining a shutter speed and an aperture value, a strobe light emission state, a sensitivity setting, a white balance (WB) setting, and the like suitable for each shooting scene. The digital camera 100 has, for example, the following scene-specific shooting modes (1) to (14). However, it is not limited to these scene-specific shooting modes.

(1) Water shooting mode (beach mode): a mode in which people can be shot without appearing dark even on the sea or sandy beaches where sunlight is strongly reflected.
(2) Night scene shooting mode: a mode specialized for night scenes that illuminates the person with the flash and records the background at a slow shutter speed.
(3) Fireworks shooting mode: a mode for shooting fireworks vividly with optimal exposure.
(4) Underwater shooting mode: a mode that sets the white balance optimal for underwater shooting.
(5) Sunset shooting mode: a mode that emphasizes silhouettes and red tones.
(6) Portrait shooting mode: a mode specialized for portraits, blurring the background so that the person stands out.
(7) Sports shooting mode: a shooting mode specialized for fast-moving subjects.
(8) Snow shooting mode: a mode in which the person does not appear dark and no blue cast remains even against a snow background.
(9) Night & snap mode: a mode suited to shooting beautiful night scenes and people without a tripod.
(10) Spotlight shooting mode: a mode for cleanly shooting a subject lit by a spotlight.
(11) Starry sky snap mode: a mode for shooting a starry sky and a person together.
(12) Starry night view mode: a mode for easily shooting the starry sky.
(13) Starry sky trajectory mode: a mode that records the trajectories of stars in their diurnal motion by compositing the long-exposure images taken in each shot of interval shooting.
(14) Starry sky interval movie mode: a mode that creates a fast-forward movie by combining the images taken in interval shooting into a single movie file.

  Hereinafter, the starry sky snap mode, the starry night view mode, the starry sky trajectory mode, and the starry sky interval movie mode are collectively referred to as the starry sky mode. Alternatively, the starry sky mode may first be set as a higher-level shooting mode, and any of the shooting modes included in it (starry sky snap mode, starry night view mode, starry sky trajectory mode, starry sky interval movie mode) may then be set in a lower layer. The user can shoot after setting the digital camera 100 to a desired shooting mode from the shooting mode selection menu (shooting mode setting). The starry sky mode is a dark place shooting mode intended for shooting in dark places.

<Starry sky mode selection process>
Next, the starry sky mode selection process in the present embodiment will be described with reference to FIG. 3. The process of FIG. 3 is realized by the system control unit 201 reading a program recorded in the nonvolatile memory 213 into the system memory 212 and executing it, and starts when the digital camera 100 is activated and an instruction to select the starry sky mode (to display the selection screen) is input.

  In step S301, the system control unit 201 displays a starry sky mode selection screen (FIG. 4B) on the display unit 101.

  In step S302, the system control unit 201 determines which shooting mode has been selected. When the starry sky snap mode is selected, the process proceeds to S303; when the starry night view mode is selected, to S304; when the starry sky trajectory mode is selected, to S305; and when the starry sky interval movie mode is selected, to S306. The starry sky snap mode of S303 will be described later with reference to FIGS. 5 and 6. The starry night view mode of S304 will be described later with reference to FIG. 7. The starry sky trajectory mode of S305 will be described later with reference to FIGS. 8 and 9. In the starry sky interval movie mode of S306, a moving image file is generated, so no shutter sound is produced for each shot. A shooting interval (for example, one minute) and a required shooting time (for example, two hours) can be set, and recording can be stopped before the required shooting time is reached by pressing the moving image recording button of the operation unit 104. During shooting, a REC review is displayed on the display unit 101 together with the elapsed shooting time; to save power, the review display is turned off after a predetermined time. In the menu settings, it is also possible to record still images simultaneously with the movie.

  The start and stop of shooting of the starry sky interval moving image is performed by the moving image recording button of the operation unit 104. When the shutter button 102 is pressed in the shooting standby state, still image shooting can be performed with the same settings as for moving image shooting, so that exposure adjustment and the like can be easily performed without recording moving images.

  In step S307, the system control unit 201 determines whether the power switch is turned off. If the power switch is turned off, the process proceeds to step S308. If not, the process returns to step S301.

  In step S308, the system control unit 201 performs a termination process. The termination process includes, for example, a process of changing the display on the display unit 101 to the termination state, closing the barrier 202, and protecting the imaging unit 205. The termination process may include a process of recording parameters, setting values, and setting modes including flags, control variables, and the like in the nonvolatile memory 213 and shutting off the power to parts that do not require power supply.

  When the termination process of S308 is completed, the power-off state is entered.
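A compact sketch of this selection flow (S301 to S308), using hypothetical handler functions for each starry sky sub-mode; the actual firmware flow is the one given by the flowchart of FIG. 3:

```python
def starry_sky_mode_selection(show_selection_screen, handlers, power_switch_off, terminate):
    """Selection loop corresponding to S301-S308: show the selection screen, run the
    chosen sub-mode, and perform termination processing when the power is switched off."""
    while True:
        choice = show_selection_screen()   # S301: screen of FIG. 4(b); S302: user's choice
        handlers[choice]()                 # S303-S306: snap / night view / trajectory / interval movie
        if power_switch_off():             # S307
            terminate()                    # S308: termination processing, then power off
            break
```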

  Since the starry sky mode is a mode for wide-angle shooting, the system control unit 201 may fix the shooting lens 203 at the wide-angle end at the start of processing in the starry sky mode. Further, the system control unit 201 may display the following guidance on the display unit 101 when the user selects the starry sky mode or any of the modes within it.

- A note that the mode is a mode for shooting the starry sky.
- Advice to shoot with the camera fixed on a tripod (since the starry sky mode is for shooting in dark places, exposure times are long and shooting with a tripod is recommended; the starry sky mode assumes the use of a tripod).
- A note that the zoom lens is fixed at the wide-angle end for wide-angle shooting.

<Starry sky mode selection screen>
Here, the screens displayed on the display unit 101 in the starry sky mode of this embodiment will be described with reference to FIG. 4.

  FIG. 4A illustrates a screen in the starry sky mode. On the display unit 101, shooting condition (shooting setting) information 404 as an OSD (display items) and a shooting mode icon 402 indicating that the starry sky mode is currently selected are displayed superimposed on the through image 401. In addition, guidance on the operation for selecting among the starry sky snap mode, the starry night view mode, the starry sky trajectory mode, and the starry sky interval movie mode, and a starry sky mode icon 403 indicating the currently selected shooting mode, are displayed superimposed on the through image 401. The shooting information 404 consists of icons indicating the current settings and status of the digital camera 100, showing, from left to right, the remaining battery level, the recording image quality (compression rate and image size), and the remaining number of images that can be shot.

  The shooting mode icon 402, at the upper left corner, indicates the shooting mode set in the digital camera 100. The starry sky mode icon 403 represents the currently selected mode among the starry sky modes set in the digital camera 100, and also serves as guidance indicating that a switching dialog for setting one of the starry sky modes can be displayed by pressing the DISP button of the operation unit 104.

  FIG. 4B is a screen display example for selecting one of the starry sky modes during shooting standby in the starry sky mode; it is displayed when the DISP button of the operation unit 104 is pressed while the screen of FIG. 4A is displayed.

  The starry sky mode selection screen displays guidance 411 for the selected shooting mode, a mode name 412 of the selected shooting mode, an icon 414 of the currently selected shooting mode, and a mode list 413 showing a plurality of shooting modes. In addition, an operation guide 415 is displayed indicating that pressing the SET button of the operation unit 104 returns to the initial screen of FIG. 4A. With this screen displayed, the user can select any of the starry sky modes by operating the left or right button (left/right key) of the four-direction buttons included in the operation unit 104. When the SET button included in the operation unit 104 is pressed while a mode is selected, that mode is set. In this way, the user can set any of the starry sky modes.

  By displaying such a screen, the user can, for example, easily switch between the portrait shooting mode and the starry sky mode, and then select the starry sky snap mode, starry night view mode, starry sky trajectory mode, or starry sky interval movie mode from within the starry sky mode.

<Starry sky snap mode>
Next, the processing in the starry sky snap mode in S303 of FIG. 3 will be described with reference to FIG. 5.

  In S500, the system control unit 201 displays an initial screen (FIG. 6A) in the starry sky snap mode on the display unit 101.

  In step S501, the system control unit 201 determines whether or not the strobe 113 is exposed. If the strobe 113 is exposed, the process proceeds to S502, and the guidance “Please raise the strobe” in FIG. 6A is hidden. If the strobe 113 is not exposed, the process proceeds to S503.

  In step S503, the system control unit 201 determines whether the DISP button of the operation unit 104 has been pressed and a selection operation for another starry sky mode has been performed. If the DISP button is pressed, the process proceeds to S504, and the starry sky mode selection process described with reference to FIG. 3 is performed. If the DISP button has not been pressed, the process proceeds to S505.

  In step S505, the system control unit 201 determines whether the menu screen has been displayed by a user operation and an on/off switching operation has been performed on the dark place display (dark place display mode) menu item. If a switching operation has been performed, the process proceeds to S506, where the dark place display setting is switched. If no switching operation has been performed, the process proceeds to S507. In addition to the starry sky mode, the dark place display can also be set from the menu in the program AE mode, aperture priority mode, shutter speed priority mode, manual mode, night scene shooting mode, and fireworks shooting mode, all of which are expected to be used for shooting in dark places. FIG. 6B shows a screen display example in the shooting standby state when the dark place display setting is on, and FIG. 6C shows a screen display example in the shooting standby state when it is off. Here, an example of switching the dark place display setting on and off by a user operation has been described, but it may also be switched automatically by detecting the surrounding environment. For example, if the through image being captured is dark and the camera is judged to be in a dark place, the dark place display setting may be turned on automatically, and if the through image is bright and the camera is judged not to be in a dark place, the setting may be turned off automatically.

  In step S507, the system control unit 201 determines whether an operation related to white balance correction has been performed by the operation unit 104. When the white balance correction operation is performed, the process proceeds to S508, and white balance correction is set. FIG. 6D illustrates a setting screen for white balance correction. When the white balance correction operation is not performed, the process proceeds to S509.

  In step S508, the system control unit 201 holds the white balance correction value set by the operation unit 104 in the memory 210, and controls the imaging unit 205 and the image processing unit 207 to apply the correction value. As a result, an image reflecting the white balance correction is displayed on the display unit 101.

  In step S509, the system control unit 201 determines whether an operation related to exposure correction has been performed by the operation unit 104. If an exposure correction operation has been performed, the process proceeds to S510 and exposure correction is set. If the exposure correction operation is not performed, the process proceeds to S511.

  In S510, the system control unit 201 holds the exposure correction value set by the operation unit 104 in the memory 210, and controls the imaging unit 205 and the image processing unit 207 so that the correction value is applied and the display unit 101 displays an image reflecting the exposure correction.

  In step S511, the system control unit 201 determines whether a self-timer setting operation has been performed by the operation unit 104. If a self-timer setting operation has been performed, the process proceeds to S512, where the self-timer is set according to the operation. If no self-timer setting operation has been performed, the process proceeds to S513.

  In step S512, the system control unit 201 holds the set time of the self-timer set by the operation unit 104 in the memory 210. The set time held in the memory 210 may be recorded in the nonvolatile memory 213 when the power is turned off, and the setting may be maintained at the next startup.

  There are the following types of self-timer settings.

・A two-second timer (first time), used mainly to prevent camera shake.
・A ten-second timer (second time), used mainly to give the photographer time to move into the shooting range so that he or she can be photographed as a subject.
・A custom timer, with which the shooting interval and the number of shots can be set as desired.

  Note that shooting in the starry sky mode is often performed in dark places, and the shutter speed is often set slow (the exposure time is long), so camera shake is likely to occur. It is therefore assumed that self-timer shooting is frequently used in the starry sky mode to prevent camera shake. For this reason, in the starry sky mode, the self-timer setting state is recorded in the nonvolatile memory 213 as a setting for the starry sky mode, so that the same self-timer setting is retained the next time the starry sky mode is started even after the power has been turned off. In this way, a user who routinely uses self-timer shooting in the starry sky mode is spared the trouble of setting the self-timer every time the starry sky mode is started. The self-timer setting for the starry sky mode is not carried over when a shooting mode other than the starry sky mode is started. Furthermore, in shooting modes other than the starry sky mode, the self-timer setting is not recorded in the nonvolatile memory 213; once the power is turned off, the self-timer is set to off as the initial setting at the next startup, regardless of the previous setting. This prevents situations in which, at the next power-on, when the shooting scene is likely to have changed, self-timer shooting is performed unintentionally merely because it was used last time, causing the user to miss the shot.
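For illustration, a minimal sketch of this mode-dependent persistence rule, using a hypothetical settings dictionary in place of the actual nonvolatile memory 213:

```python
def save_self_timer_on_power_off(mode, timer_setting, nonvolatile):
    """Persist the self-timer setting only for the starry sky modes; other modes
    deliberately start with the timer off at the next power-on."""
    if mode.startswith("starry_sky"):
        nonvolatile["starry_sky_self_timer"] = timer_setting

def restore_self_timer_on_startup(mode, nonvolatile):
    """Return the retained starry-sky self-timer setting, or 'off' elsewhere."""
    if mode.startswith("starry_sky"):
        return nonvolatile.get("starry_sky_self_timer", "off")
    return "off"
```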

  In S513, the system control unit 201 determines whether or not the first shutter switch signal SW1 is turned on by half-pressing the shutter button 102. If the first shutter switch signal SW1 is turned on, the process proceeds to S514, and if not, the process proceeds to S500.

  In step S514, the system control unit 201 determines whether the second shutter switch signal SW2 is turned on by fully pressing the shutter button 102. If the second shutter switch signal SW2 is turned on, the process proceeds to S516, and if not, the process proceeds to S515.

  In step S515, the system control unit 201 determines whether or not the first shutter switch signal SW1 is turned on by half-pressing the shutter button 102. If the first shutter switch signal SW1 is on, the process returns to S514, and if not, the process returns to S500.

  In step S516, the system control unit 201 determines whether the self-timer setting is on. If the self-timer setting is on, the process proceeds to S517, and if not, the process proceeds to S519.

  In step S517, the system control unit 201 starts timing the set time of the self-timer determined in step S516. In addition, the light emitting unit 112 blinks to notify the subject side that the self-timer is counting down. In the starry sky snap mode, since it is assumed that a person and the starry sky are photographed together in a dark place, the light emitting unit 112 blinks during the self-timer countdown to inform the person who is the subject that the countdown is in progress. On the other hand, in modes in which only the starry sky is photographed in a dark place, such as the starry night view mode, the starry sky trajectory mode, and the starry sky interval movie mode described later, the light emitting unit 112 is not lit even while the self-timer is counting down. Note that the blinking / lighting / non-lighting of the LED is also linked to whether the self-timer sound announcing the countdown is produced or muted; that is, the self-timer sound is not produced when the light emitting unit 112 is not lit.

  Note that, in a shooting mode other than the starry sky snap mode, such as a portrait shooting mode, where it is assumed that person shooting is performed, the light emitting unit 112 is turned on during the countdown of the self-timer.

  In shooting modes other than the starry sky modes that are used in dark places and in which people are unlikely to be photographed, such as the fireworks shooting mode and the night scene shooting mode, the light emitting unit 112 is also kept off during the self-timer countdown.

  Alternatively, instead of judging from the shooting mode, the system control unit 201 may determine that shooting is being performed in a dark place when the subject is dark, based on the subject brightness obtained from photometry. When it is determined that shooting is being performed in a dark place, the operation mode for dark places may be applied and the light emitting unit 112 may be kept off.

  Further, the blinking process of the light emitting unit 112 may be performed only when face detection / person detection can be performed in addition to the dark place determination by the shooting mode.

  A combination of the brightness determination and face detection / person detection is also possible.

  Further, the behavior may depend on the set time of the self-timer: when the two-second self-timer, which is assumed to be used for preventing camera shake, is set, the light emitting unit 112 is not lit, and when the ten-second self-timer, which is assumed to be used mostly for shooting people, is set, the light emitting unit 112 may be blinked or lit.
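As a non-authoritative sketch of how these conditions might be combined, assuming a simple Boolean decision function (the text above leaves the exact combination open):

```python
def led_blinks_during_countdown(mode, timer_seconds, dark_place, person_detected):
    """Combine the mode-, brightness-, person- and timer-based rules that govern
    whether the light emitting unit 112 blinks during the self-timer countdown."""
    starry_sky_only = ("starry_night_view", "starry_sky_trajectory",
                       "starry_sky_interval_movie")
    if mode in starry_sky_only:
        return False                       # starry-sky-only modes keep the LED off
    if mode == "starry_sky_snap":
        return person_detected             # blink only when a person is in the frame
    if dark_place and not person_detected:
        return False                       # e.g. fireworks / night scene shooting
    if timer_seconds <= 2:
        return False                       # 2-second timer is for camera-shake prevention
    return True                            # e.g. 10-second timer used for people shots
```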

  In step S518, the system control unit 201 determines whether the set time of the self timer has elapsed. If the set time has elapsed, the process proceeds to S519, and if not, the process returns to S517.

  In step S519, the system control unit 201 determines whether or not the strobe 113 is to be fired. If the strobe 113 is to be fired, the process proceeds to S520; if not, the process proceeds to S717 of the starry night view mode described later with reference to FIG. 7. In this way, when the strobe 113 is in the retracted state (not in use) in the starry sky snap mode, the same processing as in the starry night view mode described later can be performed.

  In step S520, the system control unit 201 causes the sound generation unit 217 to produce a shutter start sound in synchronization with the opening of the shutter 204. This lets the photographer confirm the start timing of the first shot. The shutter sound consists of a start sound (for example, "ka") linked to the start of exposure and an end sound (for example, "sha") linked to the end of exposure. Heard in quick succession, the start and end sounds produce "ka-sha," a sound imitating the operation of a mechanical mirror. From the interval between the start sound and the end sound, the user can tell whether the exposure time is long or short; for example, hearing "ka ... sha" indicates a long exposure.

  In step S521, the system control unit 201 shoots a person by causing the strobe 113 to emit light.

  In step S522, the system control unit 201 causes the sound generation unit 217 to generate a shutter end sound in accordance with the end of exposure. Thus, the photographer can confirm the completion of exposure for the first photographing.

  In step S523, the system control unit 201 causes the sound generation unit 217 to produce a shutter start sound at the timing when the shutter 204 opens. This lets the photographer confirm the start timing of the second shot. At this time, the light emitting unit 112 does not emit light. If light were emitted toward the subject at this timing, its glow might remain during the exposure periods of the second shot (S524) and the third shot (S527) described later and adversely affect the shooting. Since the second and third shots are taken in a dark place without the strobe, their exposure periods are long; if the subject were illuminated by the light emitting unit 112 in the meantime, the subject might be overemphasized and the background lost, or an unnatural color from the light emitting unit 112 might be captured in an emphasized state. To prevent this, no shooting-end notification by light emission of the light emitting unit 112 is given after the shots other than the last one (the first and second shots) among the multiple shots in the starry sky snap mode. Another reason is that if the end of the first or second shot were announced, the subject might misunderstand that the series of shots had ended and that it was safe to move. In the starry sky snap mode, the person is photographed, the background is then photographed with a long exposure, and the images are multiplexed and composited; but since the user gives only a single shooting instruction, the subject may not realize that multiple shots are taken. If the light emitting unit 112 signaled the end of the first shot, the subject might think it was safe to move after that notification, move during the second and third long exposures, and appear blurred. To prevent this, no notification by light emission of the light emitting unit 112 is given after the first and second shots of the series, and the light emitting unit 112 is made to emit light only after the last, third shot of the series.

  In step S524, the system control unit 201 performs background shooting without causing the flash 113 to emit light in order to extract the person shot in step S521.

  In step S525, the system control unit 201 causes the sound generation unit 217 to generate a shutter end sound in accordance with the end of exposure. Thus, the photographer can confirm the completion of the exposure for the second shooting. At this time, the light emitting unit 112 does not emit light.

  In step S526, the system control unit 201 causes the sound generation unit 217 to generate a shutter start sound in accordance with the timing at which the shutter 204 is opened. As a result, the photographer can confirm the timing of the start of photographing for the third photographing.

  In step S527, the system control unit 201 performs long-exposure shooting (for example, shooting with an exposure time of 0.5 seconds or more) to capture a dark subject such as a starry sky or a night view.

  In step S528, the system control unit 201 generates a shutter end sound at the end of exposure. Thus, the photographer can confirm the completion of exposure for the third shooting.

  In step S529, the system control unit 201 blinks the light emitting unit 112 three times to notify the subject that the series of (three) shots has been completed. Because the subject does not move until this blinking is seen, a photograph can be obtained in which both the starry sky and the person are captured without subject blur. Since the completion of the series of shots is signaled by light emission in the dark place, it is easily visible even to a subject standing away from the camera, who can thereby understand that the shooting has ended and that it is now safe to change pose within the shooting range or to move out of it. In addition, because this notification is given after the exposure of the third shot has ended, it does not affect the long-exposure shooting.

  Note that the shooting completion notification can also be given by firing the flash, by voice, by a sound, or the like.

  Note that the flashing of the light emitting unit 112 in this step (S529) may not be performed when the self-timer is not set. In this case, the photographer is close to the camera, the operation status is known from the shutter sound and the screen display, and the photographer can signal the subject without notifying the subject side from the camera. Further, even when the person detection (face detection) cannot be performed from the image taken in the first shooting (S521 shooting), the light emitting unit 112 does not need to be blinked in this step (S529). This is because there is no need to notify the subject if there is no person in the subject.

  In step S530, the system control unit 201 closes the shutter 204 and captures a black image with the same long exposure in order to reduce the noise of the long-exposure shot taken in step S527. The image processing unit 207 generates a noise-reduced image by processing the image obtained in S527 together with the image obtained in S530.
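One common way to use such a black frame is dark-frame subtraction; the sketch below shows that general technique with NumPy as an assumed illustration, not necessarily the exact processing performed by the image processing unit 207:

```python
import numpy as np

def subtract_dark_frame(long_exposure, dark_frame):
    """Dark-frame subtraction: remove hot pixels and thermal noise from a long
    exposure using a shutter-closed frame of the same duration."""
    diff = long_exposure.astype(np.int32) - dark_frame.astype(np.int32)
    return np.clip(diff, 0, 255).astype(np.uint8)
```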

  In step S531, the system control unit 201 refers to the current setting state stored in the non-volatile memory 213 or the system memory 212, and determines whether the star enhancement processing is set to ON. Whether the star enhancement process is performed (ON) or not (OFF) can be set in advance according to a user operation on the menu screen. If it is set to ON, the process proceeds to S532, and if not, the process proceeds to S533.

  In step S532, the system control unit 201 causes the image processing unit 207 to perform enhancement processing, such as detecting bright points such as stars in the noise-reduced image from step S530 and enlarging those bright points.

  In step S533, the system control unit 201 causes the image processing unit 207 to detect bright points such as stars in the noise-reduced image from step S530, but does not perform the enhancement processing.

  In step S534, the system control unit 201 causes the image processing unit 207 to extract the person portion using the strobe-lit image of the person taken in step S521 and the background image taken in step S524. The extracted person portion is then combined with the starry sky image generated in S532 or S533, and the result is recorded on the recording medium 109.

  In step S535, the system control unit 201 determines whether a REC review is set via the operation unit 104. If it is set, the process proceeds to S536, and if not, the process proceeds to S540.

  In step S536, the system control unit 201 performs a REC review display of the image data obtained by the shooting process on the display unit 101. The REC review display is a process of displaying image data on the display unit 101 for a predetermined time (review time) immediately after shooting of a subject in order to check a shot image.

  In step S537, the system control unit 201 determines whether there is a histogram display instruction via the operation unit 104. If the histogram display is to be performed, the process proceeds to S538, and if not, the process proceeds to S539.

  In step S538, the system control unit 201 calculates a luminance histogram for each of the R, G, and B colors from the image data developed in the memory 210 and displays the histogram on the display unit 101. At this time, if the dark place display setting is on, the display data of the histogram is changed so that R, G, and B are drawn in the same color. In normal display, drawing the R data in red, the G data in green, and the B data in blue is important for checking the image immediately after shooting. In dark place display, however, the B component is reduced, which makes the histogram difficult to see. To solve this problem, on a screen such as the histogram display screen that includes display items (OSD) drawn in blue, the R, G, and B data are changed to the same color during dark place display.
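A minimal sketch of this color handling, assuming per-channel histograms computed with NumPy and some hypothetical rendering routine; only the color choice itself is taken from the text above:

```python
import numpy as np

def rgb_histograms(image):
    """Per-channel 256-bin histograms of an HxWx3 uint8 image."""
    return {c: np.bincount(image[..., i].ravel(), minlength=256)
            for i, c in enumerate("RGB")}

def histogram_draw_colors(dark_place_display):
    """Distinct red/green/blue in normal display; one common non-blue color for
    all three channels when the dark place display setting is on (S538)."""
    if dark_place_display:
        same = (255, 255, 255)
        return {"R": same, "G": same, "B": same}
    return {"R": (255, 0, 0), "G": (0, 255, 0), "B": (0, 0, 255)}
```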

  Although details will be described later, FIG. 6E illustrates the REC review display screen with the histogram displayed when the dark place display setting is on, and FIG. 6F illustrates the REC review display screen with the histogram displayed during normal display.

  In step S539, the system control unit 201 determines whether the REC review end condition is satisfied. If the end condition is satisfied, the REC review is ended and the process proceeds to S540. If not, the process returns to S535 and the REC review is continued. The end condition of the REC review is elapse of the review time, half-press of the shutter button 102, or the like.

  In S540, the system control unit 201 determines whether or not the first shutter switch signal SW1 is turned on by half-pressing the shutter button 102. If the first shutter switch signal SW1 is on, the process proceeds to S514, and if not, the process returns to S500.

<Starry sky snap mode screen>
Next, the screens displayed on the display unit 101 in the starry sky snap mode of this embodiment will be described with reference to FIG. 6.

  FIG. 6A illustrates the initial screen in the starry sky snap mode. The display unit 101 displays an OSD (information such as shooting conditions) superimposed on the through image 600. The OSD includes a guidance display 601 recommending that the strobe 113 be raised when it is stowed, and a shooting mode display 602 indicating that the starry sky snap mode is currently selected.

  FIG. 6B shows an example of the display screen when the dark place display setting is on. On the display unit 101, the OSD 611 is superimposed on the through image 610 as a dark place display, that is, with RGB gain applied so as to perform color conversion suitable for a dark place. The RGB gain is applied when the OSD (display items) developed in the RAM is transferred to the display VRAM.

  When shooting in a dark environment, for example for astronomical observation, the photographer's eyes adapt to the darkness (dark adaptation). Because dark adaptation is easily broken by light with wavelengths near blue, a display that reduces the blue-wavelength portion of visible light is effective for maintaining dark adaptation. In other words, it is known that dark adaptation can be made harder to break (interference with dark adaptation can be suppressed) by suppressing the B color component of RGB in the display (a display in which blue is suppressed).

  In this embodiment, when the dark place display setting is turned on, only the B component, or the B and G components, is reduced on the display unit 101, producing a dark place display with a dull orange or brown tone. This provides a display color suitable for dark places, in which the user's visual dark adaptation is easily maintained.
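A minimal illustration of this kind of gain reduction, assuming an 8-bit RGB image array and example gain values (the actual gains used by the camera are not given in the text):

```python
import numpy as np

def apply_dark_place_gain(rgb, g_gain=0.5, b_gain=0.0):
    """Reduce the B (and optionally G) components of an HxWx3 uint8 frame so the
    display takes on a dull orange/brown tone that preserves dark adaptation."""
    out = rgb.astype(np.float32)
    out[..., 1] *= g_gain      # G channel
    out[..., 2] *= b_gain      # B channel
    return np.clip(out, 0, 255).astype(np.uint8)
```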

  FIG. 6C shows an example of the display screen when the dark place display setting is off. On the display unit 101, the shooting information is displayed normally, without dark place color conversion, as the OSD 621 superimposed on the through image 620.

  FIG. 6D illustrates the setting screen for white balance correction in the starry sky snap mode. On the display unit 101, a white balance adjustment indicator 631 is superimposed on the through image 630 as an OSD, showing that the white balance can be shifted between the B (blue) side and the A (amber) side with the operation unit 104.

  FIG. 6E illustrates the REC review screen when the dark place display setting is on and the histogram is displayed. On the display unit 101, the REC review image 640, which is not subjected to the dark place RGB gain, is displayed, and the OSD, to which the dark place display RGB gain (gain that reduces B and G) has been applied, is superimposed on it. In particular, if an RGB gain that reduces B were applied to the B component 643 of the histogram, which is normally drawn in blue, its visibility could drop so far that the component becomes invisible. For example, if a gain that reduces B to zero is applied to a pixel drawn in pure blue (R = 0, G = 0, B = 255), the pixel becomes (R = 0, G = 0, B = 0), losing all contrast with its surroundings, and the histogram can no longer be seen. For this reason, the R component 641, the G component 642, and the B component 643 are first changed to the same color other than blue (preferably other than blue and green), and the RGB gain that reduces B and G is applied afterwards. For example, all of the R, G, and B histogram portions are set to white (R = 255, G = 255, B = 255). Then, when the gain that reduces B to zero is applied, they become (R = 255, G = 255, B = 0), which remains distinguishable from the black (R = 0, G = 0, B = 0) of the areas that are not part of the histogram. In other words, even when B and G of portions that are normally blue or green are reduced (that is, when the dark place display setting is on), those portions remain visible. Note that this processing of changing to the same non-blue color before reducing the blue component is performed, when displaying in a dark place, on screens such as the histogram display screen that include display items (OSD) drawn in blue when not in dark place display. On screens that contain no display items drawn in blue when not in dark place display (such as the shooting standby screen of FIG. 6C), the processing is not performed, since it would only increase the processing load. A display item drawn in blue when not in dark place display is a display item whose B component is at least a predetermined value (for example, 130 or more out of 255 gradations) and whose R and G components are at most a predetermined value (for example, 70 or less out of 255 gradations). Alternatively, it may be defined as a display item composed of pixels whose B component is at least a predetermined value and whose R and G components are at most a predetermined ratio of the B component, or as a display item composed of pixels whose B component is at least a predetermined value and whose R and G components are zero. In any case, display items drawn in blue when not in dark place display include, as described above, display items composed of pure blue pixels (R = 0, G = 0, B = 255), that is, pixels whose blue component has the maximum value and which have no other color components.

  In this embodiment, the R component 641, the G component 642, and the B component 643 are all changed to the same color, but alternatively the color of only the B component 643 (that is, only the pixel portion displayed in blue), whose visibility deteriorates most severely, may be changed.

  Further, visibility during dark place display can also be improved by adjusting the gain of only the B component 643, or of the B component 643 and the G component 642, in the OSD display without changing the color designation. The process of changing content that is displayed in blue during normal display to a non-blue color and then lowering the gain of the B component (and optionally the G component) during dark place display is applicable to devices other than digital cameras. It can be applied to any display control device that has a function of reducing the gain of the B component for dark place display, for example a tablet PC, a smartphone, or another mobile terminal capable of displaying a constellation map as a reference while viewing the starry sky. In this embodiment, an example has been described in which, on a screen containing a pure blue OSD, the blue is replaced with another color before the B component is reduced; the same idea may also be applied to a captured image. For example, in the case of a captured image in which the area ratio of strongly bluish pixels is large (equal to or greater than a predetermined value), the strongly bluish pixels are replaced with non-blue pixels, the gains of G and B are lowered, and the image is then displayed for the dark place. A strongly bluish pixel is a pixel in which the ratio of the B value to the R, G, and B values of that pixel is equal to or greater than a predetermined value.

  In the case of a captured image, unlike an OSD, it is unlikely that pure blue pixels (R = 0, G = 0, B = 255) exist. Therefore, instead of the process of replacing strongly bluish pixels with pixels of another color as described above, processing may be performed that amplifies (raises) the R value of every pixel of the captured image relative to the G and B values, after which the gains of G and B are lowered for dark place display. In this way as well, strongly bluish pixels remain identifiable when displayed in the dark place display. In addition, because amplifying R preserves the differences in density and color between pixels, the result is of higher quality as a captured image than replacing every strongly bluish pixel with the same color. The process of amplifying (raising) the value of R relative to the values of G and B is, for example, a process of applying a coefficient to each of the R, G, and B values of a pixel, with the coefficient for R larger than the coefficients for G and B. However, even when such processing is performed, a pure blue pixel (R = 0, G = 0, B = 255) has R = 0, which remains 0 after multiplication by a coefficient; R is therefore not amplified and the pixel cannot be seen in the dark place display. Accordingly, for an OSD display in which pure blue (R = 0, G = 0, B = 255) exists, it is preferable to perform the process of substituting a non-blue color and then lowering the gains of G and B, rather than the process of weighting R by multiplying each of R, G, and B by a coefficient. Different processes may therefore be used: an OSD that includes blue during normal display is replaced with a non-blue color when displayed in the dark place display, whereas a captured image with a large proportion of bluish pixels is processed by amplifying the R component of all pixels.
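
  A minimal sketch of the alternative for captured images described above, assuming illustrative coefficients; it folds the R weighting and the dark-place gains into a single step, which the patent does not prescribe.

```python
import numpy as np

def dark_display_for_photo(img_rgb, r_boost=1.6, g_gain=0.2, b_gain=0.0):
    """Amplify R relative to G and B, then lower the G and B gains.
    A bluish pixel such as (30, 60, 220) keeps a visible red component,
    whereas a pure blue pixel (0, 0, 255) would still vanish, which is
    why this approach suits captured images rather than blue OSDs."""
    out = img_rgb.astype(np.float32)
    out[..., 0] *= r_boost          # coefficient for R larger than for G and B
    out[..., 1] *= g_gain           # reduce G for dark-place display
    out[..., 2] *= b_gain           # reduce B for dark-place display
    return np.clip(out, 0, 255).astype(np.uint8)

print(dark_display_for_photo(np.array([[[30, 60, 220]]], dtype=np.uint8))[0, 0])
# -> [48 12  0]: still distinguishable from black after the dark-place gains
```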

  FIG. 6F illustrates a REC review screen when the dark place display setting is OFF and the histogram is displayed. The display unit 101 displays a REC review image 650 and an OSD to which the RGB gains for dark place display have not been applied.

  In the histogram, the R component 651 is displayed in red (R = 255, G = 0, B = 0), the G component 652 in green (R = 0, G = 255, B = 0), and the B component 653 in blue (R = 0, G = 0, B = 255), so the RGB histogram can be grasped intuitively.
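
  For reference, the per-channel histogram that such a screen draws can be computed along the following lines; the patent does not specify the computation, so the bin count and the function below are assumptions.

```python
import numpy as np

def rgb_histograms(img_rgb, bins=64):
    """Return one histogram per channel (R, G, B) of an 8-bit RGB image,
    as counts over `bins` equal-width bins."""
    hists = []
    for ch in range(3):
        counts, _ = np.histogram(img_rgb[..., ch], bins=bins, range=(0, 256))
        hists.append(counts)
    return hists  # drawn in red, green, and blue respectively on the review screen

img = np.random.randint(0, 256, size=(120, 160, 3), dtype=np.uint8)
r_hist, g_hist, b_hist = rgb_histograms(img)
print(r_hist.sum(), g_hist.sum(), b_hist.sum())  # each equals the pixel count
```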

  FIG. 6G illustrates another expression method for making it possible to confirm the RGB histogram during dark place display. Instead of displaying the RGB histogram in the same color as shown in FIG. 6E, the difference between R, G, and B can be clearly expressed while ensuring visibility by using a shaded expression.

  A method of distinguishing R, G, and B by varying the direction of the gradation is also possible: for example, a vertical gradation for R, a diagonal gradation for G, a horizontal gradation for B, and so on. That is, when blue is changed to a non-blue color and the gain of B is reduced, each portion of the screen is changed to a display mode with a different pattern corresponding to its color classification in the screen before the change, and the gain of B is then reduced.

  Although display of the RGB histogram has been described in the present embodiment, the present invention can be applied to any display that has a large B component and is therefore hard to see during dark place display. For example, when an OSD display using the B component is shown during white balance correction, the problem can be handled by performing processing similar to that for the RGB histogram.

<Starry night view mode>
Next, the processing in the starry night view mode in S304 of FIG. 3 will be described with reference to FIG.

  In step S700, the system control unit 201 causes the display unit 101 to display an initial screen in the starry night view mode. The display content of this initial screen is the same as the initial screen of the above-described starry sky snap mode (FIG. 6A) except that the guidance display 601 is not displayed.

  Since S701 to S714 and S726 to S732 are the same processes as S503 to S516 and S535 to S541 in FIG. 5, description thereof is omitted. In the starry sky snap mode, flash photography is required for portrait photography, so it is determined in step S501 whether the flash 113 is exposed; in the starry night view mode, however, flash photography is not required, so the determination process corresponding to step S501 is not performed.

  In step S715, the system control unit 201 measures the set time of the self-timer according to the self-timer setting determined in step S714. In the starry night view mode, the light emitting unit 112 is not lit while the self-timer is counting (during the countdown), and the self-timer sound is not generated. Unlike the starry sky snap mode, the starry night view mode does not assume that a person is being photographed, so there is no need to turn on the light emitting unit 112 during the self-timer countdown to notify the subject side. Moreover, for the shooting targets and shooting situations assumed in the starry night view mode, turning on the light emitting unit 112 can even be a disadvantage. For example, when shooting a starry sky or night view in a dark place, if other people nearby are watching the starry sky or night view, turning on the light emitting unit 112 may disturb their viewing. The starry night view mode is also suitable for shooting fireflies, but turning on the light emitting unit 112 would adversely affect the fireflies as well, and when other people nearby are watching fireflies, it can interfere with their viewing of the firefly light and become a nuisance.

  Similarly, there is no need to generate the self-timer sound, and doing so can have disadvantages. Dark places for viewing a starry sky, a night view, or fireflies are often quiet environments, and the self-timer sound can be a nuisance to other people nearby. In addition, when a wild creature such as a firefly is the subject, the sound may frighten it away and make it impossible to photograph. Since the same disadvantages can arise in the fireworks shooting mode and the night scene shooting mode, the light emitting unit 112 is not turned on and the self-timer sound is not generated in those modes either. In these cases, only the light emission (lighting/blinking) of the light emitting unit 112 (an LED) may be prohibited, with only the self-timer sound generated. Even in the starry night view mode, if a person's face can be detected from the captured through image in S715, it can be assumed that a person is being photographed, and the operation state of the self-timer may be notified. Furthermore, in S715, when the set time of the self-timer is 10 seconds or more, it can be assumed that the photographer intends to enter the shooting range as a subject, that is, that a person is being photographed, and the operation state of the self-timer may likewise be notified. When the set time of the self-timer is less than 10 seconds, for example 2 seconds, it is assumed that the self-timer is used not for photographing a person but for eliminating the influence of camera shake caused by pressing the shutter button.
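
  The notification decisions described in the last two paragraphs could be summarized as in the sketch below; the mode names, the face-detection flag, and the function itself are illustrative assumptions, while the 10-second threshold follows the description.

```python
def should_notify_self_timer(mode, face_detected, timer_seconds):
    """Decide whether to light the LED / emit the self-timer sound.
    In the starry sky snap mode a person is assumed, so always notify.
    In the starry night view (and similar dark-scene) modes, notify only
    when a person shot is likely: a face is detected in the through image,
    or the timer is set long enough (10 s or more) for the photographer
    to walk into the frame."""
    if mode == "starry_sky_snap":
        return True
    if mode in ("starry_night_view", "fireworks", "night_scene"):
        return face_detected or timer_seconds >= 10
    return True

print(should_notify_self_timer("starry_night_view", False, 2))    # False
print(should_notify_self_timer("starry_night_view", True, 2))     # True
print(should_notify_self_timer("starry_night_view", False, 10))   # True
```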

  In step S716, the system control unit 201 determines whether the set time of the self timer has elapsed. If the set time has elapsed, the process proceeds to S717, and if not, the process returns to S715.

  In step S717, the system control unit 201 causes the sound generation unit 217 to generate a shutter start sound in synchronization with the opening timing of the shutter 204. As a result, the photographer can confirm the timing at which shooting starts. Note that the shutter sound is not generated when silence has been set in advance on the menu screen or the like.

  In step S718, the system control unit 201 performs shooting with a long-second exposure.

  In step S719, the system control unit 201 causes the sound generation unit 217 to generate a shutter end sound in accordance with the end of the exposure. As a result, the photographer can confirm the completion of the exposure. Note that the shutter sound is not generated when silence has been set in advance on the menu screen or the like. Further, the light emitting unit 112 is not turned on to notify the completion of shooting; as described above, there is no need to notify the subject side, and lighting the light emitting unit 112 in a dark place has disadvantages.

  In step S720, the system control unit 201 closes the shutter 204 and captures a black image with a long-second exposure in order to perform noise reduction processing for the long-second shooting performed in step S718. The image processing unit 207 then generates an image with reduced noise by processing the image obtained in S718 together with the image obtained in S720.
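
  One common way to realize the noise processing of S718 and S720 is dark-frame subtraction; the patent only says the two images are processed together, so the per-pixel subtraction below is an assumption for illustration.

```python
import numpy as np

def reduce_long_exposure_noise(exposure, dark_frame):
    """Subtract the dark frame (shot with the shutter closed for the same
    long exposure) from the light frame to remove hot pixels and
    fixed-pattern noise accumulated during the long exposure."""
    diff = exposure.astype(np.int16) - dark_frame.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)

light = np.random.randint(0, 256, size=(8, 8), dtype=np.uint8)
dark = np.random.randint(0, 8, size=(8, 8), dtype=np.uint8)  # mostly hot-pixel noise
print(reduce_long_exposure_noise(light, dark).shape)
```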

  In step S721, the system control unit 201 refers to the current setting state stored in the non-volatile memory 213 or the system memory 212, and determines whether the star enhancement processing is set to ON. If it is set to ON, the process proceeds to S722, and if not, the process proceeds to S723.

  In step S722, the system control unit 201 causes the image processing unit 207 to perform enhancement processing, such as detecting bright spots such as stars in the image whose noise was reduced in step S720 and enlarging the size of those bright spots.

  In step S723, the system control unit 201 detects bright spots such as stars in the image whose noise was reduced by the image processing unit 207 in step S720, but does not perform the enhancement processing.
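
  A rough sketch of the kind of enhancement S722 describes (detecting bright spots and enlarging them); the brightness threshold and the 3x3 dilation are assumptions, since the patent only states that bright spots such as stars are detected and their size is enlarged.

```python
import numpy as np

def enhance_stars(gray, threshold=200):
    """Detect bright spots (pixels at or above `threshold`) and enlarge them:
    every pixel whose 3x3 neighborhood contains a bright spot takes the
    neighborhood maximum, so each star grows by roughly one pixel."""
    h, w = gray.shape
    padded = np.pad(gray, 1, mode="edge")
    # 3x3 maximum filter built from the nine shifted copies of the image
    neighborhood_max = np.max(
        [padded[dy:dy + h, dx:dx + w] for dy in range(3) for dx in range(3)],
        axis=0)
    return np.where(neighborhood_max >= threshold, neighborhood_max, gray)

sky = np.zeros((5, 5), dtype=np.uint8)
sky[2, 2] = 250                      # a single star
print(enhance_stars(sky))            # the star now covers a 3x3 block
```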

  In step S724, the system control unit 201 records the starry sky image generated in step S722 or S723 on the recording medium 109.

  In step S725, the system control unit 201 determines whether the starry sky snap mode is set via the operation unit 104. If it is set, the process of FIG. 5 is performed, and if not, the process proceeds to S726.

<Starry sky trajectory mode>
Next, the processing in the starry sky trajectory mode in S305 of FIG. 3 will be described with reference to FIG.

  In the starry sky trajectory mode, a still image is captured a plurality of times at predetermined time intervals in response to a single shooting instruction, and a composite image (multiple composite image) is generated from the captured still images and recorded. That is, in this mode, interval shooting is performed in which shooting is performed a plurality of times at intervals in response to one shooting instruction. In the following, for convenience of explanation, the series of processing from the start of shooting to the recording of the composite image is called starry sky trajectory shooting, and each still image shooting performed during starry sky trajectory shooting is simply called shooting (or still image shooting).

  In step S800, the system control unit 201 displays an initial screen (FIG. 9A) of the starry sky trajectory mode on the display unit 101.

  Steps S801 to S810 are the same as steps S503 to S512 in FIG. 5, and description thereof is omitted.

  In step S811, the system control unit 201 determines whether the total shooting time has been set with the controller wheel 106 of the operation unit 104. The total shooting time is the time for which starry sky trajectory shooting is performed continuously, and the user can select and set it from options such as 10 minutes, 30 minutes, 60 minutes, and 120 minutes. In other words, the total shooting time is the time scheduled as the time required for a series of interval shootings. If the total shooting time has been set, the process proceeds to S812; if not, the process proceeds to S813.

  In step S812, the system control unit 201 stores the total shooting time set in step S811 in the memory 210, and also changes the total shooting time displayed on the display unit 101. Note that the total photographing time held in the memory 210 may be recorded in the nonvolatile memory 213 when the power is turned off.

  In step S813, the system control unit 201 determines whether or not the first shutter switch signal SW1 is turned on by pressing the shutter button 102 halfway. If the first shutter switch signal SW1 is turned on, the process proceeds to S814, and if not, the process proceeds to S800.

  In step S814, the system control unit 201 performs AF processing to focus the photographing lens 203 on the subject, and performs AE processing to determine the aperture value of the shutter 204 and the shutter time (exposure time).

  In step S815, the system control unit 201 calculates the shooting interval in starry sky trajectory shooting by adding, to the shutter time determined in step S814, a predetermined time required for the processing of one shooting other than the exposure.

  In S816, the system control unit 201 calculates the total number of shots in starry sky trajectory shooting by dividing the total shooting time set in S812 by the shooting interval calculated in S815.
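
  To make the arithmetic of S815 and S816 concrete, a small worked sketch follows; the numerical values (a 4-second exposure, 2 seconds of other processing, a 60-minute total shooting time) are examples, not values taken from the patent.

```python
def plan_interval_shooting(shutter_time_s, processing_time_s, total_time_s):
    """S815: shooting interval = shutter (exposure) time + the predetermined
    time needed for the per-shot processing other than the exposure.
    S816: total number of shots = total shooting time / shooting interval."""
    interval = shutter_time_s + processing_time_s
    total_shots = int(total_time_s // interval)
    return interval, total_shots

# Example: a 4 s exposure with 2 s of other processing gives a 6 s interval,
# so a 60-minute total shooting time yields 600 shots.
interval, total_shots = plan_interval_shooting(4.0, 2.0, 60 * 60)
print(interval, total_shots)
```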

  S817 to S821 are the same processes as S712 to S716 in the starry night view mode described above, and description thereof is omitted.

  In step S822, the system control unit 201 performs initialization processing, such as preparation for the noise reduction processing for long-second shooting and setting an initial value in the counter for the number of still image shots described later.

  In step S823, the system control unit 201 compares the current number of shots with the total number of shots calculated in step S816. If the number of shots is less than or equal to the total number of shots, the process proceeds to S824. If the number of shots exceeds the total number of shots, the process proceeds to S834.

  In step S824, the system control unit 201 causes the sound generation unit 217 to generate a shutter start sound in synchronization with the timing at which the shutter 204 is opened. As a result, the photographer can confirm the timing of starting the photographing.

  In step S825, the system control unit 201 performs exposure under the shooting conditions determined in step S814 and photographs a subject such as a starry sky or night view.

  In step S826, the system control unit 201 causes the sound generation unit 217 to generate a shutter end sound in accordance with the end of the exposure. As a result, the photographer can confirm the completion of the exposure. Note that the light emitting unit 112 is not turned on to notify the completion of shooting; as described above, there is no need to notify the subject side, and lighting the light emitting unit 112 in a dark place has disadvantages.

  In step S827, the system control unit 201 closes the shutter 204 and captures a black image with a long-second exposure in order to perform noise reduction processing for the long-second shooting performed in step S825. The image processing unit 207 then generates an image with reduced noise by processing the image obtained in S825 together with the image obtained in S827.

  In step S828, the system control unit 201 combines the image generated in step S827 with the composite image generated in the previous iterations of S828 and held in the memory 210, generates a new composite image, and stores it in the memory 210. The composition here is multiple composition in which the image generated in S827 and the composite image held in the memory 210 are superimposed on each other (not panoramic composition).
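
  The multiple composition of S828 is often realized as a comparative "lighten" composite (per-pixel maximum), which is what turns stars into trails; the patent only says the images are superimposed, so the maximum operation below is an assumption.

```python
import numpy as np

def lighten_composite(composite, new_frame):
    """Superimpose the latest noise-reduced frame on the running composite
    by keeping, for every pixel, the brighter of the two values; repeated
    over the interval shooting this accumulates star trails."""
    if composite is None:            # first shot: the frame itself is the composite
        return new_frame.copy()
    return np.maximum(composite, new_frame)

composite = None
for _ in range(3):
    frame = np.random.randint(0, 256, size=(4, 4), dtype=np.uint8)
    composite = lighten_composite(composite, frame)
print(composite.dtype, composite.shape)
```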

  In step S829, the system control unit 201 calculates the elapsed time from the start of shooting by multiplying the current shooting count by the shooting interval calculated in step S815.

  In step S830, the system control unit 201 adds 1 to the current number of shootings.

  In step S831, the system control unit 201 displays the composite image generated in step S828 and the elapsed time calculated in step S829 on the display unit 101 (FIG. 9B).

  In step S832, the system control unit 201 determines whether or not the predetermined time has elapsed for the shooting processing other than the exposure of the still image shooting in step S825. The predetermined time here is the predetermined time used when calculating the shooting interval in S815. If the predetermined time has elapsed, the process proceeds to S833; if not, the process returns to S832 and waits until the predetermined time elapses.

  In step S833, the system control unit 201 determines whether the second shutter switch signal SW2 has been turned on by fully pressing the shutter button 102. If the second shutter switch signal SW2 is OFF, the process proceeds to S823 to perform the next still image shooting; if it is ON, the process proceeds to S834 to perform the end processing of starry sky trajectory shooting. That is, the starry sky trajectory shooting can be terminated midway by fully pressing the shutter button 102.

  In step S834, the system control unit 201 records the composite image generated in step S828 and stored in the memory 210 on the recording medium 109 as a still image file.

  Since S835 to S841 are the same processes as S535 to S541 of FIG. 5, description thereof is omitted.

<Starry sky trajectory mode screen>
Next, screens displayed on the display unit 101 in the starry sky trajectory mode of this embodiment will be described with reference to FIG. 9.

  FIG. 9A illustrates the shooting standby screen in the starry sky trajectory mode. The display unit 101 displays, as an OSD superimposed on the through image 900, the current total shooting time and an icon 901, operable with the controller wheel 106, for changing the total shooting time. In the illustrated example, the total shooting time is set to 60 minutes.

  FIG. 9B illustrates a mid-shooting review screen in the starry sky trajectory mode. On the display unit 101, the elapsed time 911 is displayed as an OSD in a translucent elapsed time display area 912, superimposed on the composite image 910 generated so far.

  The mid-shooting review display may instead be as shown in FIG. 9C. That is, the display unit 101 displays the elapsed time 914 and the total shooting time 913 as an OSD superimposed on the composite image 910. Here the elapsed time and the total shooting time are displayed together, but only one of them may be displayed. Further, the elapsed time 914 is displayed as outlined characters, and no elapsed time display area is shown.

  The display modes illustrated in FIGS. 9B and 9C may be combined as appropriate. The elapsed time 911 from the start of shooting is updated once at the timing of each still image shooting, and is not updated during the interval between still image shootings. In this way, the user can learn not only the elapsed time but also the shooting interval from the difference between successive updates of the elapsed time. For example, suppose the elapsed time 911 is displayed as 12 minutes 34 seconds when the Nth still image is captured, remains at 12 minutes 34 seconds until the (N+1)th still image is captured, and is then updated to 12 minutes 40 seconds. In this case, the user knows that one shooting interval is 6 seconds; that is, the exposure time, the dark image (black image) processing time, and the other processing time for one shooting total about 6 seconds. Accordingly, both the elapsed time and the shooting interval can be conveyed by a single piece of time information, which saves display space and prevents the composite image 910 displayed on the display unit 101 from becoming difficult to see. Also, by displaying the composite image 910 generated up to that point as a review together with the elapsed time 911, the user can associate the elapsed time with the length of the trails and estimate the remaining time needed to photograph trails of the desired length.
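
  The inference described above, in which the user reads the shooting interval off two successive elapsed-time displays, amounts to the following; the 12 min 34 s and 12 min 40 s values repeat the example in the text.

```python
def infer_interval(prev_display, curr_display):
    """Shooting interval in seconds, inferred from two successive values of
    the elapsed-time display, each given as (minutes, seconds)."""
    prev_s = prev_display[0] * 60 + prev_display[1]
    curr_s = curr_display[0] * 60 + curr_display[1]
    return curr_s - prev_s

# Display read 12:34 after the Nth shot and 12:40 after the (N+1)th shot,
# so one shot (exposure + dark frame + other processing) takes about 6 s.
print(infer_interval((12, 34), (12, 40)))   # 6
```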

  In the embodiment described above, an example in which a single composite image is finally recorded in S834 has been described; however, the still images captured in S825 (that is, the material images of the composite image) may also be individually recorded on the recording medium 109 as still image files, not just the one composite image. The intermediate composite images generated in S828 may also be individually recorded. By individually recording the images that serve as the material of the composite image, the composite image taken in the starry sky trajectory mode can, if it is judged unsuccessful, be corrected later using an image processing application on a PC or the like.

  Moreover, the user may be allowed to set the shooting interval arbitrarily. Since the user can set the shooting interval, starry sky shooting can be performed more freely.

  In addition, when the total shooting time that can be selected is limited by the remaining capacity of the recording medium 109, the user may be notified by placing an upper limit on the values that can be selected in S811 and S812 or by changing the color of the icon 901.

  Further, a remaining time (countdown) display may be used instead of the elapsed time (count-up) display. In this case, from a single piece of time information the user can know not only the remaining time but also the shooting interval, from the difference between successive updates of the remaining time, and the elapsed time, from the difference between the remaining time and the total shooting time set by the user. Therefore, an even more space-saving information display can be realized.

  Further, instead of the time information, the remaining number of images that can be shot, the remaining time, or the remaining capacity of the recording medium 109 may be displayed.

  In addition, the user may be allowed to freely set, from a menu screen or the like, the form in which this information is displayed.

  Furthermore, if the shooting interval is short, even when the elapsed time display is updated at each shooting, it is difficult for the user to instantly calculate the difference from the previous update and grasp the shooting interval. Therefore, the elapsed time may be left undisplayed when the shooting interval is short (less than a predetermined time). Alternatively, the elapsed time may be displayed but, when the shooting interval is short (less than the predetermined time), not updated once per shooting; instead it may simply count up with the passage of time regardless of the interval period (for example, be updated once per second). When the shooting interval is equal to or longer than the predetermined time, the display is updated once per shooting as described above.
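
  The update policy described in this paragraph could be expressed as below; the threshold value and the function boundaries are illustrative assumptions.

```python
def elapsed_display_mode(interval_s, min_interval_s=5.0):
    """Choose how to drive the elapsed-time display during interval shooting:
    update once per shot when the interval is long enough for the user to
    read the difference, otherwise fall back to a plain once-per-second
    count-up (or hide the elapsed time entirely)."""
    if interval_s >= min_interval_s:
        return "update_once_per_shot"
    return "count_up_every_second"   # alternatively: "hide_elapsed_time"

print(elapsed_display_mode(6.0))   # update_once_per_shot
print(elapsed_display_mode(1.5))   # count_up_every_second
```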

  Further, the elapsed time display method described above (one update for each shooting in the interval shooting) may also be performed in the starry sky interval moving image mode, which performs interval shooting of moving images.

  Note that the control of the system control unit 201 may be performed by a single piece of hardware, or the entire apparatus may be controlled by a plurality of pieces of hardware sharing the processing.

  Although the present invention has been described in detail based on preferred embodiments thereof, the present invention is not limited to these specific embodiments, and various forms that do not depart from the gist of the present invention are also included in the present invention. Furthermore, each of the embodiments described above merely shows one embodiment of the present invention, and the embodiments may be combined as appropriate.

  Further, in the above-described embodiment, the case where the present invention is applied to an imaging apparatus such as a digital camera has been described as an example. However, the present invention is not limited to this example and is applicable to any apparatus that can be used for shooting in a dark place such as a starry sky or a night view. That is, the present invention can be applied to portable terminals such as tablet PCs and smartphones.

[Other Embodiments]
The present invention is also realized by executing the following processing: software (a program) that realizes the functions of the above-described embodiments is supplied to a system or apparatus via a network or various storage media, and a computer (or CPU, MPU, or the like) of the system or apparatus reads and executes the program code. In this case, the program and the storage medium storing the program constitute the present invention.

Claims (15)

  1. An imaging apparatus comprising:
    shooting control means for controlling to perform interval shooting in which shooting at intervals is performed a plurality of times in response to a single shooting instruction; and
    display control means for controlling to update and display the elapsed time from the start of the interval shooting according to the shooting instruction every time each shooting in the interval shooting is performed.
  2.   The imaging apparatus according to claim 1, wherein the display control unit updates the elapsed time once for each shooting in the interval shooting, and does not update otherwise.
  3.   The imaging apparatus according to claim 1, further comprising a combining unit that multiplex-combines images captured in each series of the interval shootings to generate one image.
  4.   The imaging apparatus according to claim 3, wherein the display control unit displays, in the course of a series of the interval shootings, the composite image generated by the combining unit based on the images captured up to that point in the interval shooting, together with the elapsed time.
  5.   5. The imaging according to claim 1, wherein the display control unit displays a total photographing time required for the interval photographing together with the elapsed time during a series of the interval photographings. apparatus.
  6.   The display control means updates the display of the elapsed time once for each shooting in the interval shooting when the interval of each shooting in the interval shooting is a predetermined time or more, and the interval is less than the predetermined time. The imaging apparatus according to claim 1, wherein the imaging device is controlled so as to be updated regardless of the interval.
  7.   6. The imaging apparatus according to claim 1, wherein the display control unit does not display the elapsed time when the interval of each imaging in the interval imaging is less than a predetermined time.
  8.   The imaging apparatus according to claim 1, further comprising setting means for setting any one of a plurality of shooting modes,
    wherein the display control unit performs the control when a specific shooting mode among the plurality of shooting modes is set.
  9.   The imaging apparatus according to claim 8, wherein the specific shooting mode is a shooting mode for shooting a starry sky.
  10.   The imaging apparatus according to claim 8 or 9, wherein when the specific photographing mode is set by the setting unit, control is performed so as to display a guidance for prompting use of a tripod.
  11.   11. The imaging apparatus according to claim 8, wherein the specific shooting mode is a shooting mode in which shooting is performed with a lens controlled to be fixed at a wide end.
  12.   The imaging apparatus according to any one of claims 8 to 11, wherein the display control unit is capable of performing a dark place display that prevents a user's dark adaptation from being hindered in the specific photographing mode.
  13. A method for controlling an imaging apparatus, comprising:
    a shooting control step of controlling to perform interval shooting in which shooting at intervals is performed a plurality of times in response to a single shooting instruction; and
    a display control step of controlling to update and display the elapsed time from the start of the interval shooting according to the shooting instruction every time each shooting in the interval shooting is performed.
  14.   A program for causing a computer to function as each unit of the imaging apparatus according to any one of claims 1 to 12.
  15.   A computer-readable storage medium storing a program for causing a computer to function as each unit of the imaging apparatus according to any one of claims 1 to 12.
JP2013171640A 2013-08-21 2013-08-21 Imaging apparatus, control method therefor, program, and storage medium Active JP6278636B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2013171640A JP6278636B2 (en) 2013-08-21 2013-08-21 Imaging apparatus, control method therefor, program, and storage medium

Applications Claiming Priority (17)

Application Number Priority Date Filing Date Title
JP2013171640A JP6278636B2 (en) 2013-08-21 2013-08-21 Imaging apparatus, control method therefor, program, and storage medium
GB1414710.2A GB2519416B (en) 2013-08-21 2014-08-19 Image capturing apparatus and control method thereof
GB1521092.5A GB2531440B (en) 2013-08-21 2014-08-19 Image capturing apparatus and control method thereof
US14/463,209 US9712756B2 (en) 2013-08-21 2014-08-19 Image capturing apparatus and control method thereof
GB1521094.1A GB2531441B (en) 2013-08-21 2014-08-19 Image capturing apparatus and control method thereof
GB1521090.9A GB2531439B (en) 2013-08-21 2014-08-19 Image capturing apparatus and control method thereof
GB1606729.0A GB2536818B (en) 2013-08-21 2014-08-19 Image capturing apparatus and control method thereof
RU2014134199/28A RU2604570C2 (en) 2013-08-21 2014-08-20 Image capturing device and control method thereof
CN201410411970.0A CN104427224B (en) 2013-08-21 2014-08-20 Photographic device and its control method and respective media
DE102014216491.5A DE102014216491A1 (en) 2013-08-21 2014-08-20 PICTURE RECORDING DEVICE AND CONTROL PROCESS FOR THEM
CN201710411967.2A CN107181915B (en) 2013-08-21 2014-08-20 Image pickup apparatus and control method thereof
CN201810885519.0A CN108933898B (en) 2013-08-21 2014-08-20 Image pickup apparatus and control method thereof
CN201710411968.7A CN107257431B (en) 2013-08-21 2014-08-20 Image pickup apparatus and control method thereof
KR1020140108776A KR101670222B1 (en) 2013-08-21 2014-08-21 Image capturing apparatus and control method thereof
US15/597,390 US10003753B2 (en) 2013-08-21 2017-05-17 Image capturing apparatus and control method thereof
US15/951,960 US10313604B2 (en) 2013-08-21 2018-04-12 Image capturing apparatus and control method thereof
US16/390,504 US10506173B2 (en) 2013-08-21 2019-04-22 Image capturing apparatus and control method thereof

Publications (2)

Publication Number Publication Date
JP2015040965A JP2015040965A (en) 2015-03-02
JP6278636B2 true JP6278636B2 (en) 2018-02-14

Family

ID=52695161

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2013171640A Active JP6278636B2 (en) 2013-08-21 2013-08-21 Imaging apparatus, control method therefor, program, and storage medium

Country Status (1)

Country Link
JP (1) JP6278636B2 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6520566B2 (en) * 2015-08-25 2019-05-29 リコーイメージング株式会社 Image recording device

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000125165A (en) * 1998-10-15 2000-04-28 Minolta Co Ltd Digital camera
JP2000312307A (en) * 1999-04-28 2000-11-07 Olympus Optical Co Ltd Method for controlling electronic camera, information processing unit and storage medium recording program to realize electronic camera control
JP4307648B2 (en) * 1999-09-01 2009-08-05 オリンパス株式会社 Camera
JP2001358977A (en) * 2000-06-14 2001-12-26 Olympus Optical Co Ltd Electronic camera system
US7397500B2 (en) * 2003-04-30 2008-07-08 Hewlett-Packard Development Company, L.P. Camera shake warning and feedback system that teaches the photographer
JP2005073017A (en) * 2003-08-26 2005-03-17 Casio Comput Co Ltd Time indication method / recording method of imaging unit, imaging unit, and program
JP2005333461A (en) * 2004-05-20 2005-12-02 Olympus Corp Camera, and recording method and recording/reproducing method of the camera
JP2008070259A (en) * 2006-09-14 2008-03-27 Olympus Corp Observation device
JP2009239600A (en) * 2008-03-27 2009-10-15 Olympus Imaging Corp Imaging apparatus, and method of controlling imaging apparatus
JP2012142823A (en) * 2011-01-04 2012-07-26 Nikon Corp Display controller, imaging apparatus and display control program
JP5895409B2 (en) * 2011-09-14 2016-03-30 株式会社リコー Imaging device
JP2013157918A (en) * 2012-01-31 2013-08-15 Canon Inc Electronic apparatus, control method for electronic apparatus, program, and recording medium
JP6034740B2 (en) * 2013-04-18 2016-11-30 オリンパス株式会社 Imaging apparatus and imaging method

Also Published As

Publication number Publication date
JP2015040965A (en) 2015-03-02

Legal Events

Date Code Title Description
20160809 A621 Written request for application examination (Free format text: JAPANESE INTERMEDIATE CODE: A621)
20170428 A977 Report on retrieval (Free format text: JAPANESE INTERMEDIATE CODE: A971007)
20170509 A131 Notification of reasons for refusal (Free format text: JAPANESE INTERMEDIATE CODE: A131)
20170707 A521 Written amendment (Free format text: JAPANESE INTERMEDIATE CODE: A523)
TRDD Decision of grant or rejection written
20171219 A01 Written decision to grant a patent or to grant a registration (utility model) (Free format text: JAPANESE INTERMEDIATE CODE: A01)
20180116 A61 First payment of annual fees (during grant procedure) (Free format text: JAPANESE INTERMEDIATE CODE: A61)
R151 Written notification of patent or utility model registration (Ref document number: 6278636; Country of ref document: JP; Free format text: JAPANESE INTERMEDIATE CODE: R151)