WO2020158069A1 - Imaging device, imaging method, and program - Google Patents

Imaging device, imaging method, and program Download PDF

Info

Publication number
WO2020158069A1
Authority
WO
WIPO (PCT)
Prior art keywords
moving image
frame period
frame
period
flicker
Prior art date
Application number
PCT/JP2019/041616
Other languages
French (fr)
Japanese (ja)
Inventor
Koichi Tanaka (田中 康一)
Tetsu Wada (和田 哲)
Tetsuya Fujikawa (藤川 哲也)
Yukinori Nishiyama (西山 幸徳)
Kenkichi Hayashi (林 健吉)
Original Assignee
FUJIFILM Corporation (富士フイルム株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by FUJIFILM Corporation
Priority to JP2020569370A priority Critical patent/JP7110406B2/en
Publication of WO2020158069A1 publication Critical patent/WO2020158069A1/en

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene

Definitions

  • the present invention relates to an image capturing apparatus, an image capturing method, and a program that capture a moving image, and particularly to an image capturing apparatus, an image capturing method, and a program that take measures against flicker when capturing a moving image.
  • Patent Document 1 describes that a plurality of means for detecting the flicker light amount change characteristic are provided, and processing for suppressing the influence of flicker is performed based on the detection results of these detection means.
  • Patent Document 2 describes that when the subject is under a flicker light source, the frame rate is set to a flicker cycle or an integral multiple thereof.
  • Patent Documents 1 and 2 address taking a flicker-free still image by applying a flicker countermeasure during the live view of a moving image, and do not consider continuously capturing and recording a moving image while the flicker countermeasure is in effect.
  • in order to achieve the above object, an imaging device according to a first aspect of the present invention includes: a cycle detection unit that detects a flicker cycle of a light source in a moving image; an imaging unit that captures the moving image at a first frame period; an image acquisition unit that acquires first moving image data indicating the captured moving image; an image processing unit that processes the first moving image data to generate second moving image data having a second frame period; a recording unit that records the second moving image data in a recording device; and a control unit that controls the first frame period and the second frame period.
  • the control unit has, as moving image capturing modes, a first moving image mode and a second moving image mode in which the exposure time of the frames constituting the moving image is set shorter than in the first moving image mode. In the second moving image mode, the control unit sets the first frame period to an integral multiple of the flicker period according to the detection result; in the first moving image mode, it sets the first frame period regardless of the flicker period.
  • when a moving image is shot under a flickering light source, differences in brightness and tint between frames are likely to occur, and the effect grows as the exposure time shortens. The image pickup apparatus according to the first aspect therefore provides a plurality of moving image modes (a first moving image mode and a second moving image mode) according to the exposure time of the frames forming the moving image. In the second moving image mode, in which the exposure time is set shorter than in the first moving image mode and the influence of flicker is therefore large, the first frame period is set to an integral multiple of the flicker period according to the flicker detection result.
  • in the first moving image mode, the first frame period is set regardless of the flicker period.
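As a rough illustration of this setting rule, the capture (first) frame period can be chosen as the integral multiple of the flicker period nearest a desired target period. This is a minimal sketch, not taken from the embodiment; the function name and the nearest-multiple rounding policy are assumptions.

```python
def capture_frame_period_ms(flicker_period_ms, target_period_ms):
    """Return the integral multiple of the flicker period closest to the
    desired frame period (at least one flicker period long)."""
    n = max(1, round(target_period_ms / flicker_period_ms))
    return n * flicker_period_ms

# Under 50 Hz mains lighting the flicker period is 10 ms (the light
# pulses at twice the mains frequency), so a 30 fps target (~33.3 ms)
# is pulled to 30 ms. Under 60 Hz mains (8.33 ms flicker) the 33.3 ms
# target is kept as-is, being exactly 4 flicker periods.
```

Because every exposure then spans a whole number of flicker cycles at the same phase, each frame integrates the same amount of light, which is the mechanism behind the reduced brightness/tint variation described above.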
  • first frame period and the second frame period may be the same or different.
  • the recording device may be included in the imaging device or may be independent of the imaging device.
  • flicker detection can be performed before moving image recording is started, and may also be performed during recording.
  • the control unit does not change the second frame period in the second moving image mode depending on whether flicker of the light source is detected or not.
  • since the second frame period (the recording frame period) is not changed, the moving image data is not recorded at a period different from the preset frame period.
  • the control unit sets the second frame period to a period longer than the first frame period when flicker of the light source is detected in the second moving image mode.
  • even when the first frame period is shortened (to an integral multiple of the flicker period) as a flicker countermeasure, the second frame period does not have to be shortened accordingly.
  • an image pickup apparatus according to a fourth aspect is the apparatus according to any one of the first to third aspects, wherein, when the second frame period is longer than the first frame period, the image processing unit selects a part of the plurality of frames forming the first moving image data to generate the second moving image data of the second frame period.
  • the moving image captured in the first frame period is adjusted to the second frame period (recording frame period) by “thinning” some frames.
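One way to realize this thinning, sketched here with assumed names and a nearest-frame selection policy (the patent does not specify the selection rule), is to keep the captured frame closest to each recording timestamp:

```python
def thin_frames(frames, capture_period_ms, record_period_ms):
    """Keep the captured frame nearest to each recording timestamp.
    Assumes record_period_ms >= capture_period_ms (thinning case)."""
    out = []
    t = 0.0
    total_ms = len(frames) * capture_period_ms
    while t < total_ms:
        idx = min(round(t / capture_period_ms), len(frames) - 1)
        out.append(frames[idx])
        t += record_period_ms
    return out
```

For example, frames captured every 10 ms (a 50 Hz flicker countermeasure period) can be thinned to an approximately 30 fps recording by keeping roughly every third or fourth frame.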
  • an image pickup apparatus according to a fifth aspect is the device according to any one of the first to fourth aspects, wherein, when the second frame period is longer than the first frame period, the image processing unit generates the frames constituting the second moving image data by synthesizing a plurality of frames constituting the first moving image data.
  • the image processing unit may perform weighting when combining the frames.
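A minimal sketch of such weighted synthesis, treating each frame as a flat list of pixel intensities (the names and the normalized weighted-average formula are assumptions, not the embodiment's actual processing):

```python
def synthesize_frame(frames, weights=None):
    """Weighted average of pixel values across frames.
    Each frame is an equal-length sequence of pixel intensities;
    weights default to a uniform average."""
    if weights is None:
        weights = [1.0] * len(frames)
    total = sum(weights)
    return [
        sum(w * f[i] for f, w in zip(frames, weights)) / total
        for i in range(len(frames[0]))
    ]
```

Weighting the frame nearest the recording timestamp more heavily would reduce motion smearing relative to a plain average, which may be why the weighting option is mentioned.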
  • the control unit sets the second frame period to a period shorter than the first frame period when flicker of the light source is detected in the second moving image mode.
  • the control unit may perform such processing according to a user's instruction.
  • when the first frame period is longer than the second frame period, the image processing unit inserts interpolation frames among the frames constituting the first moving image data to generate the second moving image data of the second frame period.
  • the interpolation frame may be a frame in which no subject appears, a copy of the frame immediately preceding the insertion point, or a frame generated by interpolation or the like from the frames before and/or after the insertion timing.
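The simplest of these options, repeating the most recent captured frame at each recording timestamp, can be sketched as follows (function and parameter names are assumptions for illustration):

```python
def insert_interpolation_frames(frames, capture_period_ms, record_period_ms):
    """Repeat the most recent captured frame at each recording timestamp.
    Assumes capture_period_ms >= record_period_ms (insertion case)."""
    out = []
    n = len(frames)
    t = 0.0
    while t < n * capture_period_ms:
        # Index of the latest frame captured at or before time t.
        idx = min(int(t // capture_period_ms), n - 1)
        out.append(frames[idx])
        t += record_period_ms
    return out
```

For instance, frames captured every 40 ms and recorded every 20 ms yields each captured frame appearing twice in the recorded stream.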
  • An image pickup apparatus according to an eighth aspect is the apparatus according to the seventh aspect, further comprising a display control unit that causes a display device to display the second moving image data as an image, the display control unit displaying the second moving image data excluding the interpolation frames.
  • the user can efficiently check the captured moving image.
  • in a ninth aspect, the control unit sets the second frame period to an integral multiple of the flicker period according to the detection result in the second moving image mode.
  • the control unit may equalize the first frame period and the second frame period.
  • An image pickup apparatus according to a tenth aspect is the apparatus according to any one of the first to ninth aspects, further including a phase detection unit that detects flicker phase information of the light source in the first moving image data; in the second moving image mode, the control unit controls the exposure timing of the frames constituting the first moving image data based on the flicker phase information. According to the tenth aspect, it is possible to capture a moving image with a desired brightness, as well as to reduce variations in brightness and tint between frames.
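Phase-based exposure timing control can be illustrated by aligning the midpoint of each exposure with a flicker brightness peak. This is only a sketch under assumed names; the embodiment's actual timing control may differ.

```python
def next_exposure_start(now_s, flicker_period_s, peak_time_s, exposure_s):
    """Earliest exposure start time at or after now_s such that the
    exposure midpoint coincides with a flicker brightness peak.
    peak_time_s is the time of any one known peak (the phase info)."""
    mid = peak_time_s
    # Advance peak by peak until the exposure fits after now_s.
    while mid - exposure_s / 2 < now_s:
        mid += flicker_period_s
    return mid - exposure_s / 2
```

Centering each (short) exposure on a brightness peak means every frame is captured at the same, brightest point of the flicker cycle, giving both consistent and well-exposed frames.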
  • An image pickup apparatus according to an eleventh aspect is the apparatus according to any one of the first to tenth aspects, wherein, in the second moving image mode, at least one of the autofocus speed, the automatic exposure follow-up speed, and the white balance follow-up speed is set higher than in the first moving image mode.
  • the eleventh aspect defines shooting conditions different from those in the first moving image mode in the second moving image mode.
  • an image pickup apparatus according to a twelfth aspect is the apparatus according to any one of the first to eleventh aspects, further including a still image extraction unit that extracts a frame constituting the second moving image data as a still image. Without flicker countermeasures, a problem such as "the frame at the optimum timing for extraction as a still image has poor image quality" may occur; according to the twelfth aspect, since the above flicker countermeasures are taken, the user can select a frame to extract without worrying about image quality deterioration.
  • in order to achieve the above object, an imaging apparatus according to a thirteenth aspect of the present invention includes: a cycle detection unit that detects a flicker cycle of a light source in a moving image; an imaging unit that captures the moving image at a first frame period; an image acquisition unit that acquires first moving image data indicating the captured moving image; an image processing unit that processes the first moving image data to generate second moving image data having a second frame period; a recording unit that records the second moving image data in a recording device; and a control unit that controls the first frame period and the second frame period.
  • the control unit sets the first frame period to an integral multiple of the flicker period according to the detection result, and sets the second frame period to a previously designated period.
  • by setting the first frame period to an integral multiple of the flicker period, variations in brightness and tint between frames can be reduced.
  • the control unit can set the second frame period according to a user designation operation, or without one, so the moving image captured with the flicker countermeasure can be recorded at the desired frame period.
  • after the control unit sets the second frame period (the recording frame period) in this way, the first frame period (the shooting frame period) may be set according to the second frame period.
  • control unit sets the second frame period to a period longer than the first frame period when the flicker of the light source is detected.
  • the fourteenth aspect defines one aspect of the setting method of the first frame period.
  • an imaging device according to a fifteenth aspect is the device according to the thirteenth or fourteenth aspect, wherein, when the second frame period is longer than the first frame period, the image processing unit selects a part of the plurality of frames forming the first moving image data to generate the second moving image data.
  • the moving image captured in the first frame period can be matched with the second frame period (recording frame period) by “thinning” some frames.
  • an imaging device according to a sixteenth aspect is the device according to any one of the thirteenth to fifteenth aspects, wherein, when the second frame period is longer than the first frame period, the image processing unit generates the frames constituting the second moving image data by synthesizing a plurality of frames constituting the first moving image data. The frames may be weighted during synthesis.
  • the control unit sets the second frame period to a period shorter than the first frame period when the flicker of the light source is detected.
  • the seventeenth aspect defines one aspect of the setting method of the first frame period.
  • in an eighteenth aspect, when the first frame period is longer than the second frame period, the image processing unit inserts interpolation frames among the frames constituting the first moving image data to generate the frames constituting the second moving image data.
  • the interpolation frame may be a frame in which no subject appears, a copy of the frame immediately preceding the insertion point, or a frame generated by interpolating the frames before and after the insertion timing.
  • An imaging apparatus according to a nineteenth aspect further comprises, in the eighteenth aspect, a display control unit that causes a display device to display the second moving image data as an image, the display control unit displaying the second moving image data excluding the interpolation frames.
  • since the interpolation frames are not displayed, the user can efficiently check the recorded moving image.
  • An image pickup apparatus according to a twentieth aspect is the apparatus according to any one of the thirteenth to nineteenth aspects, further including a phase detection unit that detects flicker phase information of the light source in the first moving image data; the control unit controls the exposure timing of the frames constituting the first moving image data based on the flicker phase information. According to the twentieth aspect, it is possible not only to reduce variations in brightness and tint between frames but also to shoot all frames with a desired brightness.
  • an image pickup apparatus according to a twenty-first aspect is the apparatus according to any one of the thirteenth to twentieth aspects, wherein the control unit has, as moving image capturing modes, a first moving image mode and a second moving image mode in which the exposure time of the frames constituting the moving image is set shorter than in the first moving image mode.
  • in the second moving image mode, the control unit sets the first frame period to an integral multiple of the flicker period according to the detection result.
  • since the first frame period is set to an integral multiple of the flicker period in the second moving image mode, in which the exposure time is short and the influence of flicker is large, flicker countermeasures can be taken according to the exposure time of the frames.
  • in a twenty-second aspect, in the second moving image mode, at least one of the autofocus speed, the automatic exposure follow-up speed, and the white balance follow-up speed is set higher than in the first moving image mode.
  • the twenty-second aspect defines shooting conditions different from those in the first moving image mode in the second moving image mode.
  • an image pickup apparatus according to a twenty-third aspect is the apparatus according to any one of the thirteenth to twenty-second aspects, further including a still image extraction unit that extracts a frame constituting the second moving image data as a still image. Without flicker countermeasures, a problem such as "the frame is at the optimal timing, but its image quality is poor" may occur; according to the twenty-third aspect, thanks to the above flicker countermeasures, the user can select a frame to extract without worrying about image quality deterioration.
  • in order to achieve the above object, an imaging method according to a twenty-fourth aspect of the present invention is an imaging method of an imaging apparatus having, as moving image capturing modes, a first moving image mode and a second moving image mode in which the exposure time of the frames constituting the moving image is set shorter than in the first moving image mode. The method comprises: a period detection step of detecting a flicker period of a light source in a moving image; an imaging step of capturing the moving image at a first frame period; an image acquisition step of acquiring first moving image data indicating the captured moving image; an image processing step of processing the first moving image data to generate second moving image data of a second frame period; and a recording step of recording the second moving image data in a recording device.
  • in the second moving image mode, the first frame period is set to an integral multiple of the flicker period according to the detection result; in the first moving image mode, the first frame period is set regardless of the flicker period.
  • according to the twenty-fourth aspect, as in the first aspect, it is possible to take flicker countermeasures according to the exposure time of the frames constituting the moving image, and to record the captured moving image with the flicker countermeasure at the second frame period.
  • the twenty-fourth aspect may further include the same configuration as the second to twelfth aspects.
  • in order to achieve the above object, an imaging method according to a twenty-fifth aspect includes: a period detection step of detecting a flicker period of a light source in a moving image; an imaging step of capturing the moving image at a first frame period; an image acquisition step of acquiring first moving image data indicating the captured moving image; an image processing step of processing the first moving image data to generate second moving image data of a second frame period; and a recording step of recording the second moving image data in a recording device. The first frame period is set to an integral multiple of the flicker period according to the detection result, and the second frame period is set to a previously designated period.
  • the twenty-fifth aspect may further include the same configurations as the fourteenth to twenty-third aspects.
  • a program according to a twenty-sixth aspect of the present invention causes an imaging device to execute the imaging method according to the twenty-fourth aspect. Accordingly, as in the first aspect and the twenty-fourth aspect, it is possible to take measures against flicker according to the exposure time of the frames forming the moving image.
  • the program according to the twenty-sixth aspect may further include the same configuration as the second to twelfth aspects. Further, a non-transitory recording medium in which the computer readable code of the program according to these aspects is recorded can also be mentioned as one aspect of the present invention.
  • a program according to a twenty-seventh aspect of the present invention causes an imaging device to execute the imaging method according to the twenty-fifth aspect.
  • the program according to the twenty-seventh aspect may further include the same configuration as the fourteenth to twenty-third aspects.
  • a non-transitory recording medium in which a computer-readable code of the program of these aspects is recorded can also be cited as one aspect of the present invention.
  • as described above, according to the image pickup apparatus, the image pickup method, and the program of the present invention, it is possible to take flicker countermeasures according to the exposure time of the frames constituting a moving image, and to record a moving image captured with the flicker countermeasures at a desired frame period.
  • FIG. 1 is a diagram showing a configuration of a camera according to the first embodiment.
  • FIG. 2 is a diagram showing a functional configuration of the image processing apparatus.
  • FIG. 3 is a flowchart showing the processing of the imaging method.
  • FIG. 4 is a diagram showing the relationship between the imaging frame period and the influence of flicker.
  • FIG. 5 is a diagram showing how the imaging timing is controlled.
  • FIG. 6 is another flowchart showing the processing of the imaging method.
  • FIG. 7 is a diagram showing how frames are thinned out.
  • FIG. 8 is a diagram showing how a frame is inserted.
  • FIG. 9 is a flowchart showing a reproduction process when a frame is cut out.
  • FIG. 10 is another flowchart showing the processing of the imaging method.
  • FIG. 11 is another flowchart showing the processing of the imaging method.
  • FIG. 12 is still another flowchart showing the process of the imaging method.
  • FIG. 13 is another flowchart showing the processing of the imaging method.
  • FIG. 14 is an external view of a smartphone according to the second embodiment.
  • FIG. 15 is a block diagram showing the configuration of the smartphone according to the second embodiment.
  • FIG. 1 is a diagram showing a configuration of a camera 10 (imaging device) according to the first embodiment.
  • the camera 10 is composed of an interchangeable lens 100 (image pickup unit, image pickup device) and an imaging device main body 200 (image pickup device), and forms a subject image (optical image) on the imaging element 210 through a photographing lens including a zoom lens 110 described later.
  • the interchangeable lens 100 and the imaging device main body 200 can be attached and detached via a mount (not shown).
  • the interchangeable lens 100 includes a zoom lens 110, a focus lens 120, a diaphragm 130, and a lens driving unit 140.
  • the lens driving unit 140 drives the zoom lens 110 and the focus lens 120 forward and backward according to a command from the image processing device 240 (lens drive control unit 240H in FIG. 2) to perform zoom (optical zoom) adjustment and focus adjustment.
  • the zoom adjustment and the focus adjustment may be performed according to a command from the image processing device 240, or according to a zoom operation or focus operation (rotation of a zoom ring or focus ring (not shown), or the like) performed by the user.
  • the lens driving unit 140 controls the diaphragm 130 according to a command from the image processing device 240 to adjust the exposure.
  • information such as the positions of the zoom lens 110 and the focus lens 120 and the opening degree of the diaphragm 130 is input to the image processing device 240.
  • the interchangeable lens 100 has an optical axis L.
  • the image pickup apparatus main body 200 includes an imaging element 210 (image pickup section), an AFE 220 (AFE: Analog Front End, image pickup section), an A/D converter 230 (A/D: Analog to Digital, image pickup section), and an image processing device 240 (image acquisition unit, flicker detection unit, image processing unit, recording unit, control unit, display control unit, still image extraction unit, lens drive control unit). The imaging device main body 200 further includes an operation unit 250, a recording device 260 (recording device), a monitor 270 (display device), and an attitude sensor 280, and may have a shutter (not shown) for blocking light incident on the imaging element 210.
  • the shutter may be a mechanical shutter or an electronic shutter.
  • the exposure time can be adjusted by controlling the charge accumulation period of the image sensor 210 by the image processing device 240.
  • the image pickup unit 201 includes an interchangeable lens 100, an image pickup element 210, an AFE 220, and an A/D converter 230, and is controlled by an image acquisition unit 240A (image acquisition unit).
  • the image sensor 210 has a light receiving surface in which a large number of light receiving elements are arranged in a matrix. Then, the subject light that has passed through the zoom lens 110, the focus lens 120, and the diaphragm 130 is imaged on the light receiving surface of the image pickup element 210, and converted into an electric signal by each light receiving element.
  • An R (red), G (green), or B (blue) color filter is provided on the light-receiving surface of the image sensor 210, and a color image of a subject can be acquired based on signals of each color.
  • various photoelectric conversion elements such as a CMOS (Complementary Metal-Oxide Semiconductor) and a CCD (Charge-Coupled Device) can be used.
  • the AFE 220 performs noise removal and amplification of the analog image signal output from the image sensor 210, and the A/D converter 230 converts the captured analog image signal into a digital image signal having a gradation width.
  • FIG. 2 is a diagram showing a functional configuration of the image processing device 240.
  • the image processing device 240 includes an image acquisition unit 240A (image acquisition unit), a flicker detection unit 240B (cycle detection unit, phase detection unit), an image processing unit 240C (image processing unit), a recording unit 240D (recording unit), a control unit 240E (control unit), a display control unit 240F (display control unit), a still image extraction unit 240G (still image extraction unit), and a lens drive control unit 240H (lens drive control unit).
  • based on the digital image signal input from the A/D converter 230, the image processing device 240 performs processing such as generating a moving image file, generating a still image file, and processing a plurality of frames included in the moving image to extract still images. Details of the processing by the image processing device 240 will be described later.
  • the functions of the image processing device 240 can be realized by using various processors.
  • the various processors include, for example, a CPU (Central Processing Unit) that is a general-purpose processor that executes software (programs) to realize various functions.
  • the various processors described above include a GPU (Graphics Processing Unit) that is a processor specialized for image processing.
  • the various processors described above also include a programmable logic device (PLD) that is a processor whose circuit configuration can be changed after manufacturing such as FPGA (Field Programmable Gate Array).
  • a dedicated electric circuit, which is a processor having a circuit configuration designed specifically to execute a specific process, such as an ASIC (Application Specific Integrated Circuit), is also included in the various processors described above.
  • when the above-described processor or electric circuit executes software (a program), the processor- (computer-) readable code of the software to be executed is stored in a non-transitory recording medium such as a ROM (Read Only Memory), and the processor references the software.
  • the software stored in the non-transitory recording medium includes a program for executing the imaging method according to the present invention (a program for operating the imaging device).
  • the code may be recorded in a non-transitory recording medium such as various magneto-optical recording devices and semiconductor memories instead of the ROM.
  • during processing using the software, for example, a RAM (Random Access Memory) can be used as a temporary storage area, and data stored in an EEPROM (Electrically Erasable and Programmable Read Only Memory) can be referred to.
  • the image processing device 240 includes a ROM 242 (non-transitory recording medium) in addition to the above-described units. The ROM 242 records codes, readable by a computer (for example, the various processors constituting the image processing device 240), of the programs necessary for image pickup, recording, display, and the like, including a program for executing the imaging method according to the present invention.
  • the operation unit 250 has a release button (not shown), operation buttons (for example, a cross button, a Quick button, an OK button, etc.), a dial, a switch, and the like, through which the user can perform various operations such as setting the image capturing mode, instructing moving image capture, setting the recording frame period, and instructing still image extraction.
  • the control unit 240E can also accept these user instructions.
  • the monitor 270 may be used as the operation unit 250.
  • the recording device 260 (recording device) is composed of various magneto-optical recording media or non-transitory recording media such as semiconductor memories, together with their control circuits, and stores moving images, still images, still images extracted from moving images, and the like.
  • the recording medium may be of a type that can be attached to and detached from the image pickup apparatus main body 200.
  • the captured image (moving image, still image) may be transmitted to and stored in an external recording medium (other than the recording device 260) or a recording device by, for example, wired and/or wireless communication.
  • the monitor 270 (display device) is composed of a touch panel type liquid crystal display panel and can display a moving image, a still image, a still image extracting frame, and the like.
  • the monitor 270 can be arranged on the back surface side, the top surface side, or the like of the imaging device body 200. Further, the camera 10 may include a finder.
  • the camera 10 includes an attitude sensor 280 (for example, an acceleration sensor that measures acceleration in three axis directions), and can detect the attitude (direction with respect to gravity) of the camera 10 based on the measurement result of the attitude sensor 280.
  • the display control unit 240F can change the orientation of the image displayed on the monitor 270 according to the attitude (portrait or landscape) of the camera 10.
  • the camera 10 can set either a still image mode, a normal moving image mode (first moving image mode), or a still image extracting moving image mode (second moving image mode) as a shooting mode.
  • the still image mode and the normal moving image mode are similar to those of a normal digital camera.
  • the still image extraction moving image mode is a mode in which a still image can be extracted from a moving image file, and a moving image is captured under image capturing conditions different from the normal moving image mode (conditions that emphasize still image extraction over viewing of the moving image itself).
  • in the still image extraction moving image mode, at least one of the autofocus speed (the focus lens drive speed for moving to a target focusing distance), the automatic exposure follow-up speed, and the white balance follow-up speed is set higher than in the normal moving image mode.
  • in the still image extraction moving image mode, it is preferable to set the resolution to the highest value settable by the camera 10 (for example, 4,000 × 2,000 pixels) and to set the color tone on the premise of still image extraction.
  • the upper limit of the ISO sensitivity (ISO: International Organization for Standardization) is also set higher than in the normal moving image mode. While the shutter speed in the normal moving image mode is set to a value corresponding to the frame rate of the moving image to be recorded (1/30 second when the frame rate is 30 frames/second), in the still image extraction moving image mode it is set faster than the frame interval (for example, less than 1/30 second).
  • in the normal moving image mode, the shutter speed is set to a value corresponding to the frame rate of the moving image so that a smooth moving image is reproduced, but a moving subject may then be blurred. For this reason, the shutter speed in the still image extraction moving image mode is set higher than in the normal moving image mode; that is, the exposure time of the frames constituting the moving image is shorter than in the normal moving image mode.
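As a numeric illustration of this relationship (the speedup factor of 4 is purely an assumption for illustration; the patent only states that the exposure is made shorter than the frame interval):

```python
def exposure_time_s(frame_rate_fps, still_extraction_mode, speedup=4):
    """Normal mode: exposure matches the frame interval (e.g. 1/30 s at
    30 fps). Still image extraction mode: shorter by an assumed factor,
    to reduce motion blur in frames extracted as still images."""
    base = 1.0 / frame_rate_fps
    return base / speedup if still_extraction_mode else base
```

At 30 fps this gives 1/30 s in the normal mode and 1/120 s in the extraction mode, which is exactly the short-exposure regime where the flicker countermeasures of the earlier aspects matter most.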
  • the shutter speed (exposure time) can be set in the P mode (program mode) or the S mode (shutter speed priority mode), as in the still image mode. In the S mode, the shutter speed is set in advance by the user, and the aperture value is automatically controlled by the imaging device.
  • since the frames forming the moving image can be extracted as still images afterwards, the user can easily capture an event whose timing is unpredictable (a natural phenomenon, an accident, a happening, etc.), a subject whose state changes over time, or a momentary state of a moving subject.
  • the still image can be extracted not only at the timing at which the recording of the still image is instructed but also at other timings. Therefore, the user can acquire a still image at a desired timing.
  • a high-quality still image can be extracted by setting shooting conditions (shutter speed, resolution, frame period, etc.) suitable for still image extraction.
  • still images can be captured (recorded) during moving image shooting.
  • FIG. 3 is a flowchart showing processing in the moving image mode (normal moving image mode and still image extracting moving image mode).
  • the control unit 240E sets the camera 10 to the normal moving image mode (first moving image mode) or the still image extracting moving image mode (second moving image mode) based on the user's operation via the operation unit 250 (step S100: Mode setting step).
  • the flicker detector 240B detects the flicker cycle (step S110: cycle detection step).
  • the detection can be performed, for example, by the method described in JP-A-2018-133826.
  • the flicker detection unit 240B can detect flicker from a plurality of frames (which may be frames forming a live view image) acquired at time intervals. For example, when 50 Hz (power supply period: 20 msec) and 60 Hz (power supply period: about 16.7 msec) are considered as power supply frequencies, the imaging unit 201 captures a plurality of images at a frame period that differs from both power supply frequencies (50 Hz, 60 Hz), and the image acquisition unit 240A acquires the captured images.
  • the flicker detection unit 240B can take, as the flicker period, the period of the waveform obtained by autocorrelating the acquired images.
  • the flicker detector 240B can also detect the flicker phase from this waveform.
  • the flicker detector 240B may also detect flicker by another known method; the same applies when power supply frequencies other than 50 Hz and 60 Hz are considered. Further, the flicker detection unit 240B may detect, as flicker, a light source whose light-dark changes are periodic and whose light-dark difference is equal to or greater than a fixed amount.
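The period detection by autocorrelation described above can be sketched as follows (a hypothetical illustration, not the method of JP-A-2018-133826; the 1 ms sampling of a brightness signal, e.g. row averages from a rolling-shutter readout, is an assumption for illustration):

```python
import math

def flicker_period_ms(samples, max_lag=40):
    """Return the lag (in samples = ms) of the autocorrelation maximum
    of a brightness signal sampled at 1 ms intervals."""
    n = len(samples)
    mean = sum(samples) / n
    x = [s - mean for s in samples]          # remove the DC component
    best_lag, best_val = None, -math.inf
    for lag in range(2, max_lag):            # search lags beyond trivial ones
        c = sum(x[i] * x[i + lag] for i in range(n - lag))
        if c > best_val:
            best_lag, best_val = lag, c
    return best_lag

# 100 Hz flicker: a rectified 50 Hz mains waveform has a 10 ms period.
signal = [abs(math.sin(2 * math.pi * 50 * t / 1000)) for t in range(200)]
print(flicker_period_ms(signal))  # 10
```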
  • the flicker detection unit 240B may detect flicker not only before the recording of the moving image but also after recording has started, and may reset the imaging frame period and/or the recording frame period based on the detection result. Further, when a change in the shooting scene is detected by scene recognition or the like, flicker may be detected again and the imaging frame period and/or the recording frame period may be reset based on the detection result.
  • when flicker is detected (Yes in step S120), the control unit 240E determines whether or not the imaging mode is the second moving image mode (still image extracting moving image mode) (step S130). In the case of the second moving image mode (Yes in step S130), the control unit 240E sets the imaging frame period according to the detection result of step S110; specifically, for example, when the flicker period is 1/100 sec, the process proceeds from step S140 to step S150.
  • in this case, the imaging frame period equals the flicker period, so the imaging frame period is one times the flicker period (an aspect of an integral multiple). Further, in this case, the recording frame period and the imaging frame period are equal. Note that "fps (frames per second)" indicates the number of frames per second.
  • in step S140, the control unit 240E determines whether the flicker period is 1/120 sec; if so, the process proceeds from step S140 to step S170.
  • in this case as well, the imaging frame period is one times the flicker period (an aspect of an integral multiple), and the recording frame period and the imaging frame period are equal.
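The branch of steps S140/S150/S170 amounts to choosing the first frame period as an integral multiple of the detected flicker period; a minimal sketch (hypothetical helper, assuming the flicker period is given as an exact fraction of a second):

```python
from fractions import Fraction

def imaging_frame_period(flicker_period: Fraction, multiple: int = 1) -> Fraction:
    """First frame period = positive integer multiple of the flicker period."""
    if multiple < 1:
        raise ValueError("multiple must be a positive integer")
    return flicker_period * multiple

# Flicker at 100 Hz (1/100 s) -> imaging at 100 fps (step S150);
# flicker at 120 Hz (1/120 s) -> imaging at 120 fps (step S170).
print(imaging_frame_period(Fraction(1, 100)))     # 1/100
print(imaging_frame_period(Fraction(1, 120)))     # 1/120
print(imaging_frame_period(Fraction(1, 100), 2))  # 1/50
```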
  • note that the control unit 240E does not have to change the recording frame period (second frame period) depending on whether flicker is detected or not (only the imaging frame period needs to be an integral multiple of the flicker period).
  • the control unit 240E sets the imaging frame period (first frame period) regardless of the flicker period.
  • the control unit 240E may set the imaging frame cycle based on a user operation, or may set the imaging frame cycle without depending on a user operation.
  • the control unit 240E can also set the recording frame period (second frame period) to a value equal to the imaging frame period even in the normal moving image mode (step S200: control step). For example, the control unit 240E can set the imaging frame period to 1/100 sec and the recording frame period to 1/100 sec regardless of the flicker period. It should be noted that the control unit 240E can set the imaging frame period and the recording frame period as in the case of the normal moving image mode even when no flicker is detected (No in step S120).
  • the image acquisition unit 240A and the control unit 240E determine whether or not a moving image recording start instruction has been given (step S210: imaging step, image acquisition step). For example, when a release button (not shown) of the operation unit 250 is pressed down, it can be determined that "a moving image recording start instruction has been issued". If the determination is affirmative, the process proceeds to step S220, where the imaging unit 201 captures a moving image at the imaging frame period (first frame period) set by the above-described processing, and the image acquisition unit 240A acquires the captured moving image data (first moving image data indicating the captured moving image) (step S220: imaging step, image acquisition step).
  • the image processing unit 240C processes the first moving image data to generate recording moving image data (second moving image data) having a recording frame period (second frame period) (step S222: image processing step).
  • the image processing unit 240C may insert or thin out the frames to generate the moving image data for recording (see an aspect described later).
  • the image processing unit 240C may perform image processing other than frame insertion and thinning.
  • the recording unit 240D (recording unit) causes the recording device 260 (recording device) to record the generated second moving image data (step S224: recording step).
  • the recording unit 240D records all the frames of the captured moving image data.
  • the recording unit 240D may record the recording moving image data in an external recording device (such as a recording device connected to the camera 10). It is preferable that the image processing unit 240C and the recording unit 240D record information such as an imaging frame period (imaging frame rate) and a recording frame period (recording frame rate) in a tag of a moving image for recording.
  • the image acquisition unit 240A, the image processing unit 240C, and the recording unit 240D continue the processing of steps S220 to S224 as long as there is no moving image recording end instruction (No in step S230) and the remaining capacity of the recording medium (the recording device 260 or the like) is sufficient (Yes in step S240). Otherwise, they end the processing, and the moving image file being recorded is closed (step S250).
  • FIG. 4 conceptually shows the relationship between the change in brightness due to the flicker light source and the image capturing timing of each frame of a moving image when the image capturing frame period (first frame period) is set to an integral multiple of the flicker period by the above-described processing.
  • in the example of FIG. 4, the imaging frame period is set to one times (an aspect of an integral multiple) the flicker period (T0).
  • Times t1 and t2 indicate image pickup timing.
  • the exposure time of each frame forming the moving image is set shorter than in the normal moving image mode (first moving image mode), so the influence of flicker is large; however, since the imaging frame period is an integral multiple of the flicker period as described above, measures against flicker can be taken in accordance with the exposure time of the frames forming the moving image.
  • with this setting, the difference in lightness and darkness and the difference in tint between frames can be reduced; however, depending on the timing of capturing the moving image (the exposure timing of each frame), each frame may be imaged while the flicker light source is dark.
  • by setting the timing of capturing the moving image as described below, the moving image can be captured with the brightness of the flicker light source in a desired state.
  • the change in the brightness of the flicker light source (flicker phase information) can be detected together with the detection of the flicker cycle in step S110 (phase detecting step).
  • the control unit 240E controls the exposure timing of the frames forming the captured moving image data (first moving image data indicating the captured moving image) based on the flicker phase information. Under this control, the imaging unit 201 captures a moving image, and the image acquisition unit 240A acquires the captured moving image data (step S220).
  • FIG. 5 shows how imaging at the exposure timings set by the above-described control (times t3, t4, t5, t6, t7) captures each frame when the brightness of the flicker light source is at its maximum (brightness L1). Note that the imaging timing (for example, t1 and t2) need not be set to the timing at which the brightness of the flicker light source is maximized as in the example of FIG. 5; the brightness of the moving image may be adjusted by shifting the timing.
  • Such setting of the imaging timing can be similarly performed in other modes and modifications thereof described below.
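Centering each exposure window on a brightness peak, as in FIG. 5, can be sketched as follows (a hypothetical illustration; `t_peak` stands for the time of one observed peak obtained from the flicker phase information, and all times are in milliseconds):

```python
def exposure_starts(t_peak: float, period: float, exposure: float, n_frames: int):
    """Exposure start times such that each exposure window straddles a
    successive brightness peak of the flicker light source."""
    return [t_peak + k * period - exposure / 2 for k in range(n_frames)]

# 100 Hz flicker (10 ms period), 2 ms exposure, first peak at t = 3 ms.
starts = exposure_starts(3.0, 10.0, 2.0, 3)
print(starts)  # [2.0, 12.0, 22.0]
```

Shifting `t_peak` by a constant offset corresponds to the brightness adjustment mentioned above.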
  • FIG. 6 is a flowchart showing another aspect 1 of the image pickup and recording frame period (first and second frame periods) setting. The same steps as those in the flowchart of FIG. 3 are designated by the same step numbers, and detailed description thereof will be omitted.
  • the control unit 240E initializes the recording frame period (second frame period) (step S102: control step).
  • for example, the control unit 240E may set a recording frame period (1/24 sec, 1/25 sec, 1/30 sec, 1/50 sec, 1/60 sec) corresponding to a general moving image frame rate (24 fps, 25 fps, 30 fps, 50 fps, 60 fps, etc.).
  • the control unit 240E may perform the initial setting based on a user's instruction via the operation unit 250, or may perform the initial setting regardless of the user's instruction. In the following description, it is assumed that the recording frame period is set to 1/30 sec (frame rate of 30 fps) in step S102, but the same processing can be performed when another value is set as the recording frame period.
  • in this aspect, the recording frame period is left at its initial setting and, as described later in detail, recording moving image data (second moving image data) having the set frame period is generated by performing frame thinning, insertion, or the like on the captured moving image data (first moving image data) acquired at the imaging frame period. Thus, the imaging frame period and the recording frame period may be different. Since the recording frame period remains at the initial setting, steps S160 and S180 in FIG. 6 may be skipped.
  • when the flicker period is 1/100 sec, the control unit 240E can set a value such as 1/100 sec or 1/50 sec as the imaging frame period (first frame period) (step S150: control step); when the flicker period is 1/120 sec, the control unit 240E can set values such as 1/120 sec and 1/60 sec as the imaging frame period (first frame period) (step S170: control step). These imaging frame periods are one times and two times (an aspect of an integral multiple) the respective flicker periods.
  • the control unit 240E may set an imaging frame cycle that is an integral multiple (three times or more) of the flicker cycle.
  • the image processing unit 240C selects a part of the plurality of frames forming the captured moving image data (first moving image data) to generate recording moving image data (second moving image data) having the recording frame period (second frame period) (step S222: image processing step).
  • FIG. 7 is a diagram conceptually showing a state of such processing (frame thinning) (frames are sequentially acquired along time t).
  • in FIG. 7, the image processing unit 240C selects a part of the plurality of frames (frames 506; part (a) of FIG. 7) forming the captured moving image data (first moving image data) of the imaging frame period (T1) to generate the frames (frames 508; part (b) of the same figure) of the recording frame period (second frame period; T2, where T2 > T1 in FIG. 7) constituting the recording moving image data (second moving image data).
  • alternatively, when the recording frame period (second frame period) is longer than the imaging frame period (first frame period), the image processing unit 240C may combine (by weighted addition or the like) a plurality of frames forming the captured moving image data (first moving image data) to generate the frames forming the recording moving image data (second moving image data) (step S222: image processing step).
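The frame thinning of FIG. 7 (recording frame period T2 longer than imaging frame period T1) can be sketched as follows (a hypothetical illustration; frames are represented by their indices, and periods are in milliseconds):

```python
def thin_frames(frames, t1: float, t2: float):
    """Keep, for each recording instant k*T2, the captured frame
    (period T1) closest in time; assumes t2 >= t1."""
    duration = len(frames) * t1
    out = []
    t = 0.0
    while t < duration:
        idx = min(int(round(t / t1)), len(frames) - 1)
        out.append(frames[idx])
        t += t2
    return out

# 100 fps capture (T1 = 10 ms) recorded at 50 fps (T2 = 20 ms):
# every other frame is kept.
print(thin_frames(list(range(10)), 10.0, 20.0))  # [0, 2, 4, 6, 8]
```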
  • the control unit 240E may also set the recording frame period (second frame period) to a period shorter than the imaging frame period (first frame period). For example, when the recording frame period is initially set to 1/30 sec and the flicker period is 1/100 sec, the control unit 240E may set the imaging frame period to 1/25 sec (step S150: control step). In this case, the imaging frame period is four times the flicker period (an aspect of an integral multiple).
  • FIG. 8 is a diagram conceptually showing a state of such an interpolation frame.
  • in this case, the image processing unit 240C inserts interpolation frames 502A (frames shown in gray) between the frames (frames 500; part (a) of FIG. 8) forming the captured moving image data (first moving image data) to generate frames 502 having the recording frame period T2 (see part (b) of the same figure).
  • the frame 502 is an example in which the ratio of the imaging frame to the interpolation frame is 1:1, but the image processing unit 240C can determine the ratio according to the imaging frame period and the recording frame period.
  • examples of the interpolation frame include a black frame (dummy frame) in which the subject is not shown, a frame that reuses the immediately preceding frame as it is, and a frame generated by interpolating the frames before and/or after the insertion timing; however, the interpolation frame is not limited to these.
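The interpolation-frame insertion of FIG. 8 can be sketched as follows (a hypothetical illustration using the "reuse the immediately preceding frame" option; a black dummy frame or a frame interpolated from its neighbours would also do):

```python
def insert_frames(frames, ratio: int = 2):
    """ratio 2 means one interpolation frame after every captured frame
    (captured : interpolated = 1 : 1, as in FIG. 8)."""
    out = []
    for f in frames:
        out.append(f)                  # captured frame
        out.extend([f] * (ratio - 1))  # repeated as interpolation frames
    return out

# Imaging at 1/25 s, recording at 1/50 s -> one inserted frame per frame.
print(insert_frames(["A", "B", "C"]))  # ['A', 'A', 'B', 'B', 'C', 'C']
```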
  • it is preferable that the image processing unit 240C and the recording unit 240D record, in the tag of the recording moving image, information such as the imaging frame period (imaging frame rate), the recording frame period (recording frame rate), and the positions of the inserted interpolation frames. As described below, this information can be used when extracting a still image.
  • FIG. 9 is a flowchart showing a state of moving image reproduction and still image cutout (still image extraction).
  • the display control unit 240F determines whether or not the dummy frame information (information indicating the position or the like of the frame inserted by interpolation) is recorded in the tag or the like of the moving image data recorded in step S224 (step S300).
  • if the information is recorded, the display control unit 240F sets the information (reading: step S310) and starts reproduction of the recording moving image data (second moving image data).
  • the display control unit 240F may display the image on the monitor 270 (display device), or may display the image on another display device connected to the camera 10. Note that, here, a case will be described in which a moving image is played back frame by frame (displayed one frame at a time in response to a user operation).
  • the display control unit 240F initializes the value of the counter k to 1 (step S330) and determines whether the kth frame is a dummy frame (step S340). If the k-th frame is not a dummy frame (No in step S340), the display control unit 240F displays the frame (step S360).
  • the still image extraction unit 240G (still image extraction unit) determines whether or not there is a cutout instruction for the displayed frame (an instruction by the user to extract a frame forming the recording moving image data (second moving image data) as a still image) (step S370).
  • if there is a cutout instruction, the still image extraction unit 240G (still image extraction unit) cuts out the instructed frame as a still image (step S380) and proceeds to step S390. Since the moving image file is stored in a moving image format such as the MPEG format, the still image extraction unit 240G converts the data of the instructed frame into a still image format (JPEG format or the like). The still image extraction unit 240G may also perform image processing other than the format conversion.
  • if there is no cutout instruction (No in step S370), the display control unit 240F determines whether or not the user has issued a reproduction end instruction; if so (Yes in step S390), the processing of the flowchart of FIG. 9 ends.
  • the display control unit 240F determines whether or not the user has given a frame advance instruction.
  • if there is a frame advance instruction, the display control unit 240F increments the value of the counter k by 1 (step S350), clips the value of k to the range of 1 to N (step S430), and returns to step S340.
  • when there is no frame advance instruction (No in step S400) and there is a frame return instruction (an instruction to display the previous frame) (Yes in step S410), the display control unit 240F decrements the value of the counter k by 1 (step S420), clips the value of k to the range of 1 to N (step S430), and returns to step S340.
  • when there is no frame return instruction (No in step S410), the display control unit 240F clips the value of k to the range of 1 to N (step S430) and returns to step S340.
  • in this way, the user can efficiently perform the still image extraction work.
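The frame advance/return with dummy-frame skipping in FIG. 9 can be sketched as follows (a hypothetical illustration; `dummy` stands for the interpolation-frame positions read from the tag in step S310):

```python
def next_real_frame(k: int, n: int, dummy: set, step: int = 1) -> int:
    """Advance (step=+1) or rewind (step=-1) from frame k within 1..n,
    skipping interpolation (dummy) frames; stay at k if none remain."""
    j = k + step
    while 1 <= j <= n:
        if j not in dummy:
            return j
        j += step
    return k  # no displayable frame in that direction

dummy_frames = {2, 4}  # assumed tag information (hypothetical)
print(next_real_frame(1, 6, dummy_frames))      # 3
print(next_real_frame(3, 6, dummy_frames, -1))  # 1
```

Only frames that were actually captured are presented for cutout, which is the point of recording the dummy-frame positions.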
  • the recording frame period initially set in step S102 may be changed (reset) according to the imaging frame period.
  • the control unit 240E changes (resets) the recording frame period (second frame period) in steps S160 and S180.
  • for example, the control unit 240E can set, as the recording frame period, a frame period that is longer than the imaging frame period (a frame rate lower than the imaging frame rate) and that conforms to a general moving image standard.
  • the control unit 240E sets the imaging frame period (first frame period) to 1/100 sec (step S150).
  • then, the control unit 240E sets the recording frame period (second frame period) to 1/60 sec, 1/50 sec, 1/25 sec, or 1/24 sec corresponding to the imaging frame period (frame rates of 60 fps, 50 fps, 25 fps, and 24 fps, respectively) (step S160).
  • when the flicker period is 1/120 sec, the control unit 240E sets the imaging frame period to 1/120 sec (step S170), and then sets the recording frame period to 1/100 sec, 1/60 sec, 1/50 sec, 1/25 sec, or 1/24 sec corresponding to the imaging frame period (frame rates of 100 fps, 60 fps, 50 fps, 25 fps, and 24 fps, respectively) (step S180).
  • it is preferable that the display control unit 240F notify the user of this change by displaying information on the monitor 270.
  • the control unit 240E may allow the user to select the recording frame period to be changed (reset).
  • in this modification, the recording frame period is longer than the imaging frame period. Therefore, as described above with reference to FIG. 7, the image processing unit 240C thins out frames (selects a part of the frames forming the captured moving image data according to the recording frame period) and/or combines frames to generate the recording moving image data (step S222).
  • in the above example, the recording frame period is changed (reset) to be longer than the imaging frame period, but a recording frame period shorter than the imaging frame period (a recording frame rate higher than the imaging frame rate) may also be set.
  • in this case as well, the recording frame period may be set in accordance with a general moving image standard. Specifically, for example, when the initially set recording frame period is 1/30 sec and the flicker period is 1/100 sec, the control unit 240E sets the imaging frame period (first frame period) to 1/100 sec (step S150) and sets the recording frame period (second frame period) to 1/120 sec (frame rate of 120 fps) (step S160). The frame periods can be set in the same manner when the flicker period is 1/120 sec.
  • the display control unit 240F display the information on the monitor 270 to notify the user of the change (reset) of the recording frame period.
  • the control unit 240E may allow the user to select the recording frame period to be changed.
  • the recording frame cycle is shorter than the imaging frame cycle. Therefore, as described above with reference to FIG. 8, the image processing unit 240C inserts an interpolation frame to generate recording moving image data (step S222).
  • FIG. 10 is a flowchart showing another aspect 2 of setting the imaging frame period and the recording frame period.
  • in this aspect, the image is captured at an imaging frame period (first frame period) that is an integral multiple of the flicker period regardless of the imaging mode (first moving image mode, second moving image mode), and is recorded with a predetermined period as the recording frame period (second frame period).
  • the recording frame period may remain at the initially designated value (step S102).
  • the image processing unit 240C performs the interpolation frame insertion, frame combination, and frame thinning described above with reference to FIGS. 7 and 8 according to the relationship between the imaging frame period and the recording frame period thus set (step S222).
  • the control unit 240E can set the imaging frame period to a value equal to the recording frame period set in step S102 (step S182).
  • FIG. 11 is a flowchart showing a modified example 1 of the other aspect 2.
  • the control unit 240E can change (reset) the recording frame period initialized in step S102 (steps S160 and S180).
  • the recording frame period and the imaging frame period may be equal to each other, or either one may be the longer.
  • the specific value of the frame period can be set in the same manner as in the first embodiment, other aspect 1 and its modification.
  • the image processing unit 240C performs the above-described interpolation frame insertion, frame combination, and frame thinning according to the relationship between the imaging frame period and the recording frame period (step S222).
  • FIG. 12 is a flowchart showing another aspect 3 of setting the imaging frame period and the recording frame period.
  • the control unit 240E sets the imaging frame period in the normal moving image mode (first moving image mode) regardless of the flicker period (as in the case of no flicker) (step S182).
  • the control unit 240E sets the first frame period to an integral multiple of the flicker period according to the detection result of the flicker period.
  • the control unit 240E sets the recording frame period to the previously designated frame period (step S102).
  • when the flicker period is 1/100 sec, the control unit 240E can set the imaging frame period to, for example, 1/100 sec, 1/50 sec, 1/25 sec, or 1/20 sec (frame rates of 100 fps, 50 fps, 25 fps, and 20 fps, respectively). These imaging frame periods are one times, two times, four times, and five times the flicker period, respectively.
  • when the flicker period is 1/120 sec, the control unit 240E can set the imaging frame period to, for example, 1/120 sec, 1/60 sec, 1/30 sec, or 1/24 sec (frame rates of 120 fps, 60 fps, 30 fps, and 24 fps, respectively). These imaging frame periods are likewise one times, two times, four times, and five times the flicker period, respectively.
  • the control unit 240E may set a value that is an integral multiple of the flicker period and that conforms to a general moving image standard as the imaging frame period.
  • FIG. 13 is a flowchart showing Modification 1 of the other aspect 3 described above.
  • the control unit 240E changes (resets) the recording frame period from the value of the initial setting (step S102) (steps S160 and S180).
  • for example, the control unit 240E can change the recording frame period to an integral multiple of the flicker period in accordance with the imaging frame period. The recording frame period and the imaging frame period may be the same or different; when they differ, either one may be the longer.
  • the control unit 240E may set the image capturing frame period and/or the recording frame period according to the standard of a general moving image.
  • the image processing unit 240C inserts interpolation frames, synthesizes frames, and thins out frames in accordance with the difference in frame period, as in the above-described aspect and modification (step S222).
  • the configuration of the imaging device is not limited to this.
  • the imaging device of the present invention may also be, for example, a built-in or external PC camera (PC: Personal Computer), or a portable terminal device having a shooting function as described below.
  • Examples of the mobile terminal device which is an embodiment of the image pickup device of the present invention include a mobile phone, a smartphone, a PDA (Personal Digital Assistants), and a portable game machine.
  • a smartphone will be described as an example in detail with reference to the drawings.
  • FIG. 14 is a diagram showing an appearance of a smartphone 1 (imaging device) which is an embodiment of an imaging device of the present invention, in which (a) is a front view and (b) is a rear view.
  • the smartphone 1 shown in FIG. 14 has a flat housing 2 and includes, on one surface of the housing 2, a display input unit 20 in which a display panel 21 (display device) serving as a display unit and an operation panel 22 (operation unit) serving as an input unit are integrated.
  • the housing 2 includes a speaker 31, a microphone 32, an operation unit 40 (operation unit), camera units 41 and 42 (imaging device, imaging unit), and a strobe 43.
  • the configuration of the housing 2 is not limited to this. For example, a configuration in which the display unit and the input unit are independent may be employed, or a configuration having a folding structure or a slide mechanism may be employed.
  • FIG. 15 is a block diagram showing the configuration of the smartphone 1 shown in FIG.
  • the smartphone 1 includes a wireless communication unit 11, a display input unit 20, a call unit 30, an operation unit 40, camera units 41 and 42, a strobe 43, a storage unit 50, an external input/output unit 60, a GPS receiving unit 70 (GPS: Global Positioning System), a motion sensor unit 80, and a power supply unit 90.
  • the smartphone 1 also includes a main control unit 101 (image acquisition unit, flicker detection unit (cycle detection unit, phase detection unit), control unit, display control unit, still image extraction unit, lens drive control unit).
  • in addition, the smartphone 1 has a wireless communication function for performing mobile wireless communication via a base station device and a mobile communication network.
  • the wireless communication unit 11 performs wireless communication with the base station device accommodated in the mobile communication network according to an instruction from the main control unit 101. Using such wireless communication, various file data such as voice data and image data, electronic mail data and the like are transmitted and received, and Web data and streaming data are received.
  • the display input unit 20 is a so-called touch panel that displays images (still images and/or moving images), character information, and the like to visually convey information to the user, and that detects user operations on the displayed information; it includes the display panel 21 and the operation panel 22.
  • the display panel 21 uses an LCD (Liquid Crystal Display), an OELD (Organic Electro-Luminescence Display), or the like as a display device.
  • the operation panel 22 is a device that is placed so that an image displayed on the display surface of the display panel 21 can be visually recognized, and that detects one or a plurality of coordinates operated by a conductor such as a user's finger or a pen.
  • the operation panel 22 outputs a detection signal generated due to the operation to the main control unit 101.
  • the main control unit 101 detects the operation position (coordinates) on the display panel 21 based on the received detection signal.
  • as described above, the display panel 21 and the operation panel 22 of the smartphone 1 exemplified as an embodiment of the imaging device of the present invention integrally constitute the display input unit 20, with the operation panel 22 arranged so as to completely cover the display panel 21.
  • the operation panel 22 may have a function of detecting a user operation even in the area outside the display panel 21.
  • the operation panel 22 has a detection area for the portion overlapping the display panel 21 (hereinafter referred to as a display area) and a detection area for the other outer edge portion not overlapping the display panel 21 (hereinafter referred to as a non-display area).
  • the call unit 30 includes the speaker 31 and the microphone 32; it converts the user's voice input through the microphone 32 into voice data that can be processed by the main control unit 101 and outputs the voice data to the main control unit 101, and it can also decode audio data received by the wireless communication unit 11 or the external input/output unit 60 and output it from the speaker 31. Further, as shown in FIG. 14, for example, the speaker 31 can be mounted on the same surface as the display input unit 20, and the microphone 32 can be mounted on the side surface of the housing 2.
  • the operation unit 40 is a hardware key that uses a key switch or the like, and is a device that receives an instruction from a user.
  • for example, the operation unit 40 is mounted on the side surface of the housing 2 of the smartphone 1, and is a push-button type switch that is turned on when pressed by a finger or the like and turned off by the restoring force of a spring or the like when the finger is released.
  • the storage unit 50 (recording device) stores the control program and control data of the main control unit 101, application software, address data associated with the names and telephone numbers of communication partners, data of sent and received e-mails, Web data downloaded by Web browsing, and downloaded content data, and also temporarily stores streaming data and the like. Further, the storage unit 50 includes an internal storage unit 51 built into the smartphone and an external storage unit 52 having a detachable external memory slot. The internal storage unit 51 and the external storage unit 52 constituting the storage unit 50 are realized using known storage media.
  • the external input/output unit 60 serves as an interface with all external devices connected to the smartphone 1.
  • the smartphone 1 is directly or indirectly connected to another external device via the external input/output unit 60 by communication or the like.
  • means for communication and the like include Universal Serial Bus (USB), IEEE 1394, network (e.g., Internet, wireless LAN), RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra Wide Band), and ZigBee (registered trademark).
  • Examples of external devices connected to the smartphone 1 include a wired/wireless headset, a wired/wireless external charger, a wired/wireless data port, a memory card, a SIM (Subscriber Identity Module) card, a UIM (User Identity Module) card, external audio/video devices connected via audio/video I/O (Input/Output) terminals, wirelessly connected external audio/video devices, a smartphone connected by wire or wirelessly, a PDA connected by wire or wirelessly, a personal computer connected by wire or wirelessly, and earphones.
  • the external input/output unit 60 can transmit the data transmitted from such an external device to each component inside the smartphone 1, or can transmit the data inside the smartphone 1 to the external device.
  • the motion sensor unit 80 includes, for example, a triaxial acceleration sensor and a tilt sensor, and detects a physical movement of the smartphone 1 according to an instruction from the main control unit 101. By detecting the physical movement of the smartphone 1, the moving direction, acceleration, and posture of the smartphone 1 are detected. The detection result is output to the main control unit 101.
  • the power supply unit 90 supplies power stored in a battery (not shown) to each unit of the smartphone 1 according to an instruction from the main control unit 101.
  • the main control unit 101 includes a microprocessor, operates according to a control program and control data stored in the storage unit 50, and integrally controls each unit of the smartphone 1 including the camera unit 41. Further, the main control unit 101 includes a mobile communication control function for controlling each unit of the communication system and an application processing function for performing voice communication and data communication through the wireless communication unit 11.
  • the main control unit 101 has an image processing function of displaying a video on the display input unit 20 based on image data (still image data or moving image data) such as received data or downloaded streaming data.
  • the image processing function refers to a function of the main control unit 101 decoding image data, performing image processing on the decoding result, and displaying an image on the display input unit 20.
  • the camera units 41 and 42 are digital cameras (imaging devices) that capture images electronically using imaging elements such as CMOS or CCD sensors. Under the control of the main control unit 101, the camera units 41 and 42 can convert the image data (moving images, still images) obtained by imaging into compressed image data such as MPEG or JPEG, record it in the storage unit 50, and output it through the external input/output unit 60 or the wireless communication unit 11. Further, under the control of the main control unit 101, the camera unit 41 can also perform division and combination of moving images, acquisition of high-quality still images (RAW images, etc.), frame replacement and processing, and extraction of still images from moving images. In the smartphone 1 shown in FIGS. 14 and 15, shooting can be performed with either one of the camera units 41 and 42 alone, or with the camera units 41 and 42 simultaneously. When the camera unit 42 is used, the strobe 43 can be used.
  • the camera units 41 and 42 can be used for various functions of the smartphone 1.
  • the smartphone 1 can display the images acquired by the camera units 41 and 42 on the display panel 21. Further, the smartphone 1 can use the images of the camera units 41 and 42 as one of the operation inputs of the operation panel 22.
  • the GPS receiving unit 70 detects the position based on the positioning information from the GPS satellites ST1, ST2, ..., STn.
  • the smartphone 1 can also detect the position by referring to the images from the camera units 41 and 42.
  • by referring to the images from the camera units 41 and 42, the smartphone 1 can also determine the optical axis direction of the camera unit 41 and the current usage environment, either without using the triaxial acceleration sensor or in combination with the triaxial acceleration sensor.
  • the smartphone 1 can also use the images from the camera units 41 and 42 in the application software.
  • the position information obtained by the GPS receiving unit 70, the voice information obtained by the microphone 32 (which may be converted into text information by the main control unit or the like through voice-to-text conversion), and the posture information acquired by the motion sensor unit 80 may be added to the image data of a still image or moving image and recorded in the storage unit 50.
  • the smartphone 1 can also output the image data of these still images or moving images through the external input/output unit 60 or the wireless communication unit 11.
  • the processing of the image capturing method according to the present invention (moving image capturing, flicker cycle and phase detection, frame cycle control, generation and recording of moving image data for recording, reproduction display of the recorded moving image data, still image extraction, and the like) can be performed in the same manner as in the camera 10 according to the first embodiment.
  • the camera units 41 and 42 and the main control unit 101 can perform the processing (including the processing of the above-described flowcharts) executed by the image processing apparatus 240 (each unit illustrated in FIG. 2) in the first embodiment.
  • the functions of the operation unit 250, the recording device 260, and the monitor 270 in the first embodiment can be realized by the operation unit 40, the storage unit 50 and the operation panel 22, and the display panel 21 and the operation panel 22 in the smartphone 1, respectively.
  • the same effects as those of the camera 10 according to the first embodiment can be obtained: flicker countermeasures can be taken according to the exposure time of the frames forming the moving image, and a moving image captured with the flicker countermeasures can be recorded at a desired frame period.

Abstract

The purpose of the present invention is to provide an imaging device, an imaging method, and a program that make it possible to take countermeasures against flicker in accordance with the exposure time of the frames constituting a moving image. It is also a purpose of the present invention to provide an imaging device, an imaging method, and a program that make it possible to record a moving image, captured while taking countermeasures against flicker, at a desired frame period. The imaging device according to a first aspect of the present invention is provided with a plurality of moving image modes corresponding to the exposure time of the frames constituting the moving image. In a second moving image mode, in which the exposure time of the frames constituting the moving image is set shorter than in a first moving image mode, a first frame period is set to an integral multiple of the flicker period in accordance with the flicker detection result, which makes it possible to reduce differences in brightness and color tone among frames. On the other hand, in the first moving image mode, in which the exposure time of the frames is longer than in the second moving image mode, the first frame period is set regardless of the flicker period.

Description

Imaging device, imaging method, and program

The present invention relates to an image capturing apparatus, an image capturing method, and a program that capture a moving image, and particularly to an image capturing apparatus, an image capturing method, and a program that take measures against flicker when capturing a moving image.

When a moving image is shot under an artificial light source, fluctuations and variations in brightness called "flicker" may occur, and techniques for countering this are known. For example, Patent Document 1 describes providing a plurality of means for detecting the flicker light amount change characteristic and performing processing for suppressing the influence of flicker based on the detection results of these detection means. Further, Patent Document 2 describes setting the frame rate to the flicker cycle or an integral multiple thereof when the subject is under a flickering light source.

Japanese Patent Laid-Open No. 2018-133826 (Patent Document 1); Japanese Patent Laid-Open No. 2012-60370 (Patent Document 2)

Flicker is preferably countered according to the exposure time of the frames that make up the moving image, and the moving image is preferably recorded at an appropriate frame period (corresponding to the frame rate) that suits the user's wishes; however, such points were not taken into consideration in Patent Documents 1 and 2. Further, Patent Documents 1 and 2 capture flicker-free still images after taking flicker countermeasures in a moving-image live view, and do not consider continuously capturing and recording a moving image while taking flicker countermeasures.

The present invention has been made in view of such circumstances, and an object of the present invention is to provide an imaging device, an imaging method, and a program capable of taking measures against flicker according to the exposure time of a frame forming a moving image. Another object of the present invention is to provide an image pickup apparatus, an image pickup method, and a program capable of recording a moving image picked up by taking measures against flicker at a desired frame period.

In order to achieve the above object, an imaging device according to a first aspect of the present invention includes: a cycle detection unit that detects a flicker cycle of a light source in a moving image; an imaging unit that captures the moving image at a first frame period; an image acquisition unit that acquires first moving image data representing the captured moving image; an image processing unit that processes the first moving image data to generate second moving image data having a second frame period; a recording unit that records the second moving image data in a recording device; and a control unit that controls the first frame period and the second frame period. The control unit has, as moving image capturing modes, a first moving image mode and a second moving image mode in which the exposure time of the frames forming the moving image is set shorter than in the first moving image mode. In the second moving image mode, the control unit sets the first frame period to an integral multiple of the flicker cycle according to the detection result; in the first moving image mode, it sets the first frame period regardless of the flicker cycle.

When a moving image is shot under the influence of flicker, differences in brightness and tint due to flicker are likely to occur between frames, and the influence becomes larger as the exposure time becomes shorter. Therefore, the imaging device according to the first aspect is provided with a plurality of moving image modes (a first moving image mode and a second moving image mode) according to the exposure time of the frames forming the moving image. In the second moving image mode (in which the influence of flicker is large), where the exposure time of the frames is set shorter than in the first moving image mode, the first frame period is set to an integral multiple of the flicker cycle according to the flicker detection result, thereby reducing the differences in brightness and tint between frames. On the other hand, in the first moving image mode, where the frame exposure time is longer than in the second moving image mode and the influence of flicker is small, the first frame period is set regardless of the flicker cycle.

As described above, according to the first aspect, it is possible to take countermeasures against flicker according to the exposure time of the frames forming the moving image.
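As an illustration of this integral-multiple rule, the sketch below (with hypothetical helper names, not taken from the patent) picks a capture frame period that is a whole-number multiple of the flicker period of an AC light source, assuming the light intensity fluctuates at twice the mains frequency:

```python
def flicker_period_s(mains_hz: float) -> float:
    # Light from an AC source fluctuates at twice the mains frequency,
    # e.g. 50 Hz mains -> 100 Hz flicker -> 10 ms flicker period.
    return 1.0 / (2.0 * mains_hz)

def capture_frame_period(flicker_period: float, target_fps: float) -> float:
    # Choose the integral multiple of the flicker period closest to the
    # desired frame period (1 / target_fps), as in the second movie mode.
    n = max(1, round((1.0 / target_fps) / flicker_period))
    return n * flicker_period
```

With 50 Hz mains and a 30 fps target, this yields a 30 ms capture frame period (3 × 10 ms), so every frame spans the same number of flicker cycles and frame-to-frame brightness differences are suppressed.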

In addition, in the first aspect, it is possible to record a moving image captured with flicker countermeasures at a desired frame period (second frame period). The first frame period and the second frame period may be the same or different. The recording device may be included in the imaging device or may be independent of the imaging device.

In the first aspect, flicker detection can be performed before the start of moving image recording, and may also be performed during recording in addition to before it.

In the imaging device according to a second aspect, in the first aspect, the control unit does not change the second frame period in the second moving image mode between when flicker of the light source is detected and when it is not. According to the second aspect, since the second frame period (the recording frame period) is not changed in the second moving image mode regardless of whether flicker is detected, the moving image data is never recorded at a period different from the frame period designated in advance.

In the imaging device according to the third aspect, in the first aspect, the control unit sets the second frame period to a period longer than the first frame period when flicker of the light source is detected in the second moving image mode. In the third aspect, when the first frame period is shortened (to an integral multiple of the flicker period) as a countermeasure against flicker, the second frame period does not have to be shortened accordingly.

In the imaging device according to a fourth aspect, in any one of the first to third aspects, when the second frame period is longer than the first frame period, the image processing unit selects some of the frames forming the first moving image data to generate the second moving image data of the second frame period. In the fourth aspect, the moving image captured at the first frame period is matched to the second frame period (the recording frame period) by "thinning out" some frames.
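The "thinning" described above can be sketched as follows (an illustrative helper, not from the patent): frames captured at interval t1 are subsampled so that the kept frames are spaced at the longer recording interval t2:

```python
def thin_frames(frames, t1, t2):
    # Keep roughly one captured frame per recording interval t2, where
    # t2 >= t1 (the capture interval); the rest are discarded ("thinned").
    step = t2 / t1
    assert step >= 1.0, "recording period must not be shorter than capture period"
    kept, next_pick = [], 0.0
    for i, frame in enumerate(frames):
        if i >= next_pick:
            kept.append(frame)
            next_pick += step
    return kept
```

For example, frames captured every 10 ms (an integral multiple of a 10 ms flicker period) can be thinned to a 20 ms recording period by keeping every other frame.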

The imaging device according to a fifth aspect is configured such that, in any one of the first to fourth aspects, when the second frame period is longer than the first frame period, the image processing unit combines a plurality of frames of the first moving image data to generate the frames forming the second moving image data. The image processing unit may weight the frames when combining them.
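A minimal sketch of such frame combination (hypothetical, with frames represented as nested lists of pixel values) is a weighted average of several capture frames into one recording frame:

```python
def combine_frames(frames, weights=None):
    # Weighted average of pixel values across several captured frames,
    # producing one frame of the recording-period moving image.
    n = len(frames)
    w = weights if weights is not None else [1.0 / n] * n
    height, width = len(frames[0]), len(frames[0][0])
    return [[sum(w[k] * frames[k][y][x] for k in range(n))
             for x in range(width)] for y in range(height)]
```

Equal weights give a plain average; unequal weights let the frame closest to the recording timestamp dominate, which is one way the weighting mentioned above could be applied.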

In the imaging device according to a sixth aspect, in the first aspect, the control unit sets the second frame period to a period shorter than the first frame period when flicker of the light source is detected in the second moving image mode. The control unit may perform such processing according to a user's instruction.

In the imaging device according to a seventh aspect, in any one of the first, second, and sixth aspects, when the first frame period is longer than the second frame period, the image processing unit inserts interpolation frames into the frames forming the first moving image data to generate the second moving image data of the second frame period. An interpolation frame may be a frame in which no subject appears, a copy of the frame immediately before the insertion point, or a frame generated by interpolation or the like from the frames before and/or after the insertion timing. When inserting interpolation frames, it is desirable to record information such as the capture and recording frame periods and the insertion timings of the interpolation frames in the header or the like of the moving image file.
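The interpolation-frame insertion can be sketched as below (illustrative; `interp` stands in for whichever frame-generation method is chosen, e.g. repeating the preceding frame or blending neighbors):

```python
def insert_interp_frames(frames, t1, t2, interp):
    # When the capture interval t1 is longer than the recording interval
    # t2, insert synthesized frames between each captured pair so the
    # output is spaced at t2.
    n_extra = round(t1 / t2) - 1  # interpolated frames per gap
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        for k in range(1, n_extra + 1):
            out.append(interp(a, b, k / (n_extra + 1)))
    out.append(frames[-1])
    return out
```

With `interp = lambda a, b, t: a` the inserted frame simply repeats the preceding one, matching the "frame immediately before the insertion" option above.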

An imaging device according to an eighth aspect is, in the seventh aspect, further provided with a display control unit that causes a display device to display the second moving image data as images, and the display control unit displays the second moving image data excluding the interpolation frames. In the eighth aspect, since the interpolation frames are not displayed, the user can efficiently check the captured moving image.

In the imaging device according to a ninth aspect, in any one of the first and third to eighth aspects, the control unit sets, in the second moving image mode, the second frame period to an integral multiple of the flicker cycle according to the detection result. In the ninth aspect, the control unit may make the first frame period and the second frame period equal.

An imaging device according to a tenth aspect is, in any one of the first to ninth aspects, further provided with a phase detection unit that detects flicker phase information of the light source in the first moving image data, and in the second moving image mode the control unit controls the exposure timing of the frames forming the first moving image data based on the flicker phase information. According to the tenth aspect, it is possible not only to reduce variations in brightness and tint between frames but also to capture the moving image at a desired brightness.
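One hypothetical way to use the flicker phase information, sketched below, is to delay each exposure so it starts at the next peak of the light-intensity waveform (`phase0` is the timestamp of a detected peak; both names are assumptions for illustration):

```python
import math

def next_exposure_start(now, flicker_period, phase0):
    # Next time >= now that coincides with a peak of the flicker
    # waveform, i.e. phase0 + k * flicker_period for integer k.
    k = math.ceil((now - phase0) / flicker_period)
    return phase0 + k * flicker_period
```

Starting every exposure at the same flicker phase keeps each frame's brightness at the chosen level, rather than merely equal between frames.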

An imaging device according to an eleventh aspect is configured such that, in any one of the first to tenth aspects, in the second moving image mode at least one of the autofocus speed, the automatic exposure tracking speed, and the white balance tracking speed is set higher than in the first moving image mode. The eleventh aspect defines shooting conditions for the second moving image mode that differ from those of the first moving image mode.

The imaging device according to a twelfth aspect is, in any one of the first to eleventh aspects, further provided with a still image extraction unit that extracts a frame forming the second moving image data as a still image. Without flicker countermeasures, problems such as "the frame at the optimal timing for extraction as a still image has poor image quality" may occur; according to the twelfth aspect, the above flicker countermeasures allow the user to select a frame to extract without worrying about image quality degradation.

In order to achieve the above object, an imaging device according to a thirteenth aspect of the present invention includes: a cycle detection unit that detects a flicker cycle of a light source in a moving image; an imaging unit that captures the moving image at a first frame period; an image acquisition unit that acquires first moving image data representing the captured moving image; an image processing unit that processes the first moving image data to generate second moving image data having a second frame period; a recording unit that records the second moving image data in a recording device; and a control unit that controls the first frame period and the second frame period. The control unit sets the first frame period to an integral multiple of the flicker cycle according to the detection result, and sets the second frame period to a period designated in advance. According to the thirteenth aspect, setting the first frame period to an integral multiple of the flicker cycle reduces variations in brightness and tint between frames, and a moving image captured with the flicker countermeasure can be recorded at the frame period (second frame period) designated in advance. The control unit can set the second frame period according to a user's designation operation, or independently of one, whereby a moving image captured with the flicker countermeasure can be recorded at a desired frame period.

In the thirteenth aspect and each of the following aspects, the control unit may set the second frame period (the recording frame period) according to a user's designation operation or independently of one, and then set the first frame period (the capture frame period) according to the set second frame period.

In the imaging device according to a fourteenth aspect, in the thirteenth aspect, the control unit sets the second frame period to a period longer than the first frame period when flicker of the light source is detected. The fourteenth aspect defines one approach to setting the first frame period.

The imaging device according to a fifteenth aspect is configured such that, in the thirteenth or fourteenth aspect, when the second frame period is longer than the first frame period, the image processing unit selects some of the frames forming the first moving image data to generate the second moving image data. In the fifteenth aspect, the moving image captured at the first frame period can be matched to the second frame period (the recording frame period) by "thinning out" some frames.

The imaging device according to a sixteenth aspect is configured such that, in any one of the thirteenth to fifteenth aspects, when the second frame period is longer than the first frame period, the image processing unit combines a plurality of frames of the first moving image data to generate the frames forming the second moving image data. The frames may be weighted during combination.

In the imaging device according to a seventeenth aspect, in the thirteenth aspect, the control unit sets the second frame period to a period shorter than the first frame period when flicker of the light source is detected. The seventeenth aspect defines one approach to setting the first frame period.

In the imaging device according to an eighteenth aspect, in the thirteenth or seventeenth aspect, when the first frame period is longer than the second frame period, the image processing unit inserts interpolation frames into the frames forming the first moving image data to generate the frames forming the second moving image data. An interpolation frame may be a frame in which no subject appears, a copy of the frame immediately before the insertion point, or a frame generated by interpolating the frames before and after the insertion timing. When inserting interpolation frames, it is desirable to record information such as the capture and recording frame periods (frame rates) and the insertion timings of the interpolation frames in the header or the like of the moving image file.

An imaging device according to a nineteenth aspect is, in the eighteenth aspect, further provided with a display control unit that causes a display device to display the second moving image data as images, and the display control unit displays the second moving image data excluding the interpolation frames. In the nineteenth aspect, since the interpolation frames are not displayed, the user can efficiently check the recorded moving image.

An imaging device according to a twentieth aspect is, in any one of the thirteenth to nineteenth aspects, further provided with a phase detection unit that detects flicker phase information of the light source in the first moving image data, and the control unit controls the exposure timing of the frames forming the first moving image data based on the flicker phase information. According to the twentieth aspect, it is possible not only to reduce variations in brightness and tint between frames but also to shoot every frame at a desired brightness.

The imaging device according to a twenty-first aspect is configured such that, in any one of the thirteenth to twentieth aspects, the control unit has, as moving image capturing modes, a first moving image mode and a second moving image mode in which the exposure time of the frames forming the moving image is set shorter than in the first moving image mode, and in the second moving image mode the control unit sets the first frame period to an integral multiple of the flicker cycle according to the detection result. In the twenty-first aspect, as in the first aspect, the first frame period is set to an integral multiple of the flicker cycle in the second moving image mode (where the influence of flicker is large because the frame exposure time is set shorter than in the first moving image mode), so flicker countermeasures can be taken according to the frame exposure time.

The imaging device according to a twenty-second aspect is configured such that, in the twenty-first aspect, in the second moving image mode at least one of the autofocus speed, the automatic exposure tracking speed, and the white balance tracking speed is set higher than in the first moving image mode. The twenty-second aspect defines shooting conditions for the second moving image mode that differ from those of the first moving image mode.

 第23の態様に係る撮像装置は第13から第22の態様のいずれか1つにおいて、第2動画データを構成するフレームを静止画として抽出する静止画抽出部をさらに備える。フリッカー対策がないと「タイミングが最適のフレームが、画質が不良である」等の問題が生じる可能性があるが、第23の態様によれば、上述したフリッカー対策によりユーザは画質低下を気にすることなく抽出するフレームを選択することができる。

The image pickup apparatus according to a twenty-third aspect is the image pickup apparatus according to any one of the thirteenth to twenty-second aspects, further including a still image extraction unit that extracts a frame forming the second moving image data as a still image. Without flicker countermeasures, problems such as "the frame with the best timing has poor image quality" may occur; according to the twenty-third aspect, thanks to the flicker countermeasures described above, the user can select the frame to extract without worrying about image quality degradation.

 上述した目的を達成するため、本発明の第24の態様に係る撮像方法は、動画像の撮像モードとして、第1動画モードと、動画像を構成するフレームの露光時間が第1動画モードより短く設定される第2動画モードとを有する撮像装置の撮像方法であって、動画像における光源のフリッカー周期を検出する周期検出ステップと、第1フレーム周期で動画像を撮像する撮像ステップと、撮像した動画像を示す第1動画データを取得する画像取得ステップと、第1動画データを処理して第2フレーム周期の第2動画データを生成する画像処理ステップと、第2動画データを記録装置に記録させる記録ステップと、第1フレーム周期及び第2フレーム周期を制御する制御ステップと、を有し、制御ステップにおいて、第2動画モードでは検出の結果に応じて第1フレーム周期をフリッカー周期の整数倍に設定し、第1動画モードではフリッカー周期によらずに第1フレーム周期を設定する。第24の態様によれば、第1の態様と同様に、動画像を構成するフレームの露光時間に応じたフリッカー対策を行うことができる。また、フリッカー対策を行って撮像した動画を第2フレーム周期で記録することができる。第24の態様に対し、第2から第12の態様と同様の構成をさらに含めてもよい。

In order to achieve the above-described object, an imaging method according to a twenty-fourth aspect of the present invention is an imaging method for an imaging apparatus having, as moving image capturing modes, a first moving image mode and a second moving image mode in which the exposure time of the frames forming the moving image is set shorter than in the first moving image mode, the method including: a period detection step of detecting a flicker period of a light source in the moving image; an imaging step of capturing the moving image at a first frame period; an image acquisition step of acquiring first moving image data representing the captured moving image; an image processing step of processing the first moving image data to generate second moving image data of a second frame period; a recording step of recording the second moving image data in a recording device; and a control step of controlling the first frame period and the second frame period, wherein in the control step the first frame period is set to an integral multiple of the flicker period according to the detection result in the second moving image mode, and is set regardless of the flicker period in the first moving image mode. According to the twenty-fourth aspect, as in the first aspect, flicker countermeasures can be taken according to the exposure time of the frames forming the moving image, and a moving image captured with the flicker countermeasures can be recorded at the second frame period. The twenty-fourth aspect may further include the same configurations as the second to twelfth aspects.

 上述した目的を達成するため、本発明の第25の態様に係る撮像方法は、動画像における光源のフリッカー周期を検出する検出ステップと、第1フレーム周期で動画像を撮像する撮像ステップと、撮像した動画像を示す第1動画データを取得する画像取得ステップと、第1動画データを処理して第2フレーム周期の第2動画データを生成する画像処理ステップと、第2動画データを記録装置に記録させる記録ステップと、第1フレーム周期及び第2フレーム周期を制御する制御ステップと、を有し、制御ステップにおいて、検出の結果に応じて第1フレーム周期をフリッカー周期の整数倍に設定し、第2フレーム周期をあらかじめ指定された周期に設定する。第25の態様によれば、第13の態様と同様に、フリッカー対策を行って撮像した動画を所望のフレーム周期で記録することができる。第25の態様に対し、第14から第23の態様と同様の構成をさらに含めてもよい。

In order to achieve the above-described object, an imaging method according to a twenty-fifth aspect of the present invention includes: a detection step of detecting a flicker period of a light source in a moving image; an imaging step of capturing the moving image at a first frame period; an image acquisition step of acquiring first moving image data representing the captured moving image; an image processing step of processing the first moving image data to generate second moving image data of a second frame period; a recording step of recording the second moving image data in a recording device; and a control step of controlling the first frame period and the second frame period, wherein in the control step the first frame period is set to an integral multiple of the flicker period according to the detection result, and the second frame period is set to a period designated in advance. According to the twenty-fifth aspect, as in the thirteenth aspect, a moving image captured with flicker countermeasures can be recorded at a desired frame period. The twenty-fifth aspect may further include the same configurations as the fourteenth to twenty-third aspects.

 上述した目的を達成するため、本発明の第26の態様に係るプログラムは、第24の態様に係る撮像方法を撮像装置に実行させる。これにより、第1の態様及び第24の態様と同様に、動画像を構成するフレームの露光時間に応じたフリッカー対策を行うことができる。第26の態様に係るプログラムに、第2から第12の態様と同様の構成をさらに含めてもよい。また、これら態様のプログラムのコンピュータ読み取り可能なコードを記録した非一時的記録媒体も、本発明の一態様として挙げることができる。

In order to achieve the above-described object, a program according to a twenty-sixth aspect of the present invention causes an imaging device to execute the imaging method according to the twenty-fourth aspect. Accordingly, as in the first and twenty-fourth aspects, flicker countermeasures can be taken according to the exposure time of the frames forming the moving image. The program according to the twenty-sixth aspect may further include the same configurations as the second to twelfth aspects. A non-transitory recording medium on which computer-readable code of a program of these aspects is recorded can also be cited as one aspect of the present invention.

 上述した目的を達成するため、本発明の第27の態様に係るプログラムは、第25の態様に係る撮像方法を撮像装置に実行させる。これにより、第13の態様及び第25の態様と同様に、フリッカー対策を行って撮像した動画を所望のフレーム周期で記録することができる。第27の態様に係るプログラムに、第14から第23の態様と同様の構成をさらに含めてもよい。また、これら態様のプログラムのコンピュータで読み取り可能なコードを記録した非一時的記録媒体も、本発明の一態様として挙げることができる。

To achieve the above object, a program according to a twenty-seventh aspect of the present invention causes an imaging device to execute the imaging method according to the twenty-fifth aspect. With this, similarly to the thirteenth aspect and the twenty-fifth aspect, it is possible to record a moving image captured with a countermeasure against flicker at a desired frame period. The program according to the twenty-seventh aspect may further include the same configuration as the fourteenth to twenty-third aspects. In addition, a non-transitory recording medium in which a computer-readable code of the program of these aspects is recorded can also be cited as one aspect of the present invention.

 以上説明したように、本発明の撮像装置、撮像方法、及びプログラムによれば、動画像を構成するフレームの露光時間に応じたフリッカー対策を行うことができる。また、本発明の撮像装置、撮像方法、及びプログラムによれば、フリッカー対策を行って撮像した動画を所望のフレーム周期で記録することができる。

As described above, according to the image pickup apparatus, the image pickup method, and the program of the present invention, it is possible to take measures against flicker according to the exposure time of the frames forming the moving image. Further, according to the image pickup apparatus, the image pickup method, and the program of the present invention, it is possible to record a moving image picked up with flicker countermeasures at a desired frame period.

図1は、第1の実施形態に係るカメラの構成を示す図である。FIG. 1 is a diagram showing the configuration of a camera according to the first embodiment.
図2は、画像処理装置の機能構成を示す図である。FIG. 2 is a diagram showing the functional configuration of the image processing apparatus.
図3は、撮像方法の処理を示すフローチャートである。FIG. 3 is a flowchart showing the processing of the imaging method.
図4は、撮像フレーム周期とフリッカーによる影響との関係を示す図である。FIG. 4 is a diagram showing the relationship between the imaging frame period and the influence of flicker.
図5は、撮像タイミングの制御の様子を示す図である。FIG. 5 is a diagram showing how the imaging timing is controlled.
図6は、撮像方法の処理を示す他のフローチャートである。FIG. 6 is another flowchart showing the processing of the imaging method.
図7は、フレームの間引きの様子を示す図である。FIG. 7 is a diagram showing how frames are thinned out.
図8は、フレームの挿入の様子を示す図である。FIG. 8 is a diagram showing how a frame is inserted.
図9は、フレーム切り出し時の再生処理を示すフローチャートである。FIG. 9 is a flowchart showing the reproduction process when a frame is cut out.
図10は、撮像方法の処理を示す他のフローチャートである。FIG. 10 is another flowchart showing the processing of the imaging method.
図11は、撮像方法の処理を示すさらに他のフローチャートである。FIG. 11 is still another flowchart showing the processing of the imaging method.
図12は、撮像方法の処理を示すさらに他のフローチャートである。FIG. 12 is still another flowchart showing the processing of the imaging method.
図13は、撮像方法の処理を示すさらに他のフローチャートである。FIG. 13 is still another flowchart showing the processing of the imaging method.
図14は、第2の実施形態に係るスマートフォンの外観図である。FIG. 14 is an external view of a smartphone according to the second embodiment.
図15は、第2の実施形態に係るスマートフォンの構成を示すブロック図である。FIG. 15 is a block diagram showing the configuration of the smartphone according to the second embodiment.

 以下、添付図面を参照しつつ、本発明に係る撮像装置、撮像方法、及びプログラムを実施するための形態について詳細に説明する。

Hereinafter, embodiments for implementing an imaging device, an imaging method, and a program according to the present invention will be described in detail with reference to the accompanying drawings.

 <第1の実施形態>

 <撮像装置の全体構成>

 図1は第1の実施形態に係るカメラ10(撮像装置)の構成を示す図である。カメラ10は交換レンズ100(撮像部、撮像装置)及び撮像装置本体200(撮像装置)により構成され、後述するズームレンズ110を含む撮影レンズにより被写体像(光学像)を撮像素子210に結像させる。交換レンズ100と撮像装置本体200とは、図示せぬマウントを介して装着及び取り外しすることができる。

<First Embodiment>

<Overall structure of imaging device>

FIG. 1 is a diagram showing the configuration of a camera 10 (imaging device) according to the first embodiment. The camera 10 is composed of an interchangeable lens 100 (image pickup unit, image pickup device) and an image pickup device main body 200 (image pickup device), and forms a subject image (optical image) on the image pickup element 210 through a photographing lens including a zoom lens 110 described later. The interchangeable lens 100 and the imaging device main body 200 can be attached and detached via a mount (not shown).

 <交換レンズの構成>

 交換レンズ100は、ズームレンズ110と、フォーカスレンズ120と、絞り130と、レンズ駆動部140とを備える。レンズ駆動部140は、画像処理装置240(図2のレンズ駆動制御部240H)からの指令に応じてズームレンズ110、フォーカスレンズ120を進退駆動してズーム(光学ズーム)調整、フォーカス調整を行う。ズーム調整及びフォーカス調整は、画像処理装置240からの指令に応じて行う他に、ユーザが行ったズーム操作、フォーカス操作(図示せぬズームリング、フォーカスリングの回動等)に応じて行ってもよい。また、レンズ駆動部140は画像処理装置240からの指令に応じて絞り130を制御し、露出を調整する。一方、ズームレンズ110及びフォーカスレンズ120の位置、絞り130の開放度等の情報が画像処理装置240に入力される。なお、交換レンズ100は光軸Lを有する。

<Structure of interchangeable lens>

The interchangeable lens 100 includes a zoom lens 110, a focus lens 120, a diaphragm 130, and a lens driving unit 140. The lens driving unit 140 drives the zoom lens 110 and the focus lens 120 forward and backward according to commands from the image processing device 240 (lens drive control unit 240H in FIG. 2) to perform zoom (optical zoom) adjustment and focus adjustment. The zoom adjustment and the focus adjustment may be performed according to a command from the image processing device 240, or according to a zoom operation or focus operation performed by the user (rotation of a zoom ring or focus ring, not shown, etc.). The lens driving unit 140 also controls the diaphragm 130 according to commands from the image processing device 240 to adjust the exposure. Meanwhile, information such as the positions of the zoom lens 110 and the focus lens 120 and the aperture of the diaphragm 130 is input to the image processing device 240. The interchangeable lens 100 has an optical axis L.

 <撮像装置本体の構成>

 撮像装置本体200は、撮像素子210(撮像部)、AFE220(AFE:Analog FrontEnd、撮像部)、A/D変換器230(A/D:Analog to Digital、撮像部)、及び画像処理装置240(画像取得部、フリッカー検出部、画像処理部、記録部、制御部、表示制御部、静止画抽出部、レンズ駆動制御部)を備える。また、撮像装置本体200は、操作部250、記録装置260(記録装置)、モニタ270(表示装置)、及び姿勢センサ280を備える。撮像装置本体200は、撮像素子210に透過させる光を遮光するためのシャッター(不図示)を有していてもよい。尚、シャッターは、メカニカルシャッターでも電子シャッターでもよい。電子シャッターの場合、画像処理装置240によって撮像素子210の電荷蓄積期間を制御することで露光時間(シャッタースピード)を調節することが出来る。また、撮像部201は交換レンズ100、撮像素子210、AFE220、及びA/D変換器230で構成されており、画像取得部240A(画像取得部)により制御される。

<Structure of imaging device body>

The image pickup apparatus main body 200 includes an image pickup element 210 (image pickup section), an AFE 220 (AFE: Analog Front End, image pickup section), an A/D converter 230 (A/D: Analog to Digital, image pickup section), and an image processing device 240 (image acquisition unit, flicker detection unit, image processing unit, recording unit, control unit, display control unit, still image extraction unit, lens drive control unit). The imaging device main body 200 further includes an operation unit 250, a recording device 260 (recording device), a monitor 270 (display device), and an attitude sensor 280. The imaging device main body 200 may have a shutter (not shown) for blocking the light transmitted to the imaging element 210. The shutter may be a mechanical shutter or an electronic shutter. In the case of an electronic shutter, the exposure time (shutter speed) can be adjusted by having the image processing device 240 control the charge accumulation period of the image sensor 210. The imaging unit 201 is composed of the interchangeable lens 100, the image sensor 210, the AFE 220, and the A/D converter 230, and is controlled by an image acquisition unit 240A (image acquisition unit).

 撮像素子210は、多数の受光素子がマトリクス状に配列された受光面を備える。そして、ズームレンズ110、フォーカスレンズ120、及び絞り130を透過した被写体光が撮像素子210の受光面上に結像され、各受光素子によって電気信号に変換される。撮像素子210の受光面上にはR(赤),G(緑),またはB(青)のカラーフィルタが設けられており、各色の信号に基づいて被写体のカラー画像を取得することができる。なお、撮像素子210としては、CMOS(Complementary Metal-Oxide Semiconductor)、CCD(Charge-Coupled Device)等の様々な光電変換素子を用いることができる。AFE220は撮像素子210から出力されるアナログ画像信号のノイズ除去、増幅等を行い、A/D変換器230は、取り込んだアナログ画像信号を階調幅があるデジタル画像信号に変換する。

The image sensor 210 has a light receiving surface in which a large number of light receiving elements are arranged in a matrix. Then, the subject light that has passed through the zoom lens 110, the focus lens 120, and the diaphragm 130 is imaged on the light receiving surface of the image pickup element 210, and converted into an electric signal by each light receiving element. An R (red), G (green), or B (blue) color filter is provided on the light-receiving surface of the image sensor 210, and a color image of a subject can be acquired based on signals of each color. As the image sensor 210, various photoelectric conversion elements such as a CMOS (Complementary Metal-Oxide Semiconductor) and a CCD (Charge-Coupled Device) can be used. The AFE 220 performs noise removal and amplification of the analog image signal output from the image sensor 210, and the A/D converter 230 converts the captured analog image signal into a digital image signal having a gradation width.

 <画像処理装置の構成>

 図2は、画像処理装置240の機能構成を示す図である。画像処理装置240は、画像取得部240A(画像取得部)、フリッカー検出部240B(周期検出部、位相検出部)、画像処理部240C(画像処理部)、記録部240D(記録部)、制御部240E(制御部)、表示制御部240F(表示制御部)、静止画抽出部240G(静止画抽出部)、及びレンズ駆動制御部240H(レンズ駆動制御部)を備える。画像処理装置240は、A/D変換器230から入力されたデジタル画像信号に基づいて動画の撮像(撮影)及びファイル生成、静止画ファイル生成、動画を構成する複数のフレームに対する処理、静止画の抽出等の処理を行う。画像処理装置240を用いた処理の詳細は後述する。

<Structure of image processing device>

FIG. 2 is a diagram showing the functional configuration of the image processing device 240. The image processing device 240 includes an image acquisition unit 240A (image acquisition unit), a flicker detection unit 240B (period detection unit, phase detection unit), an image processing unit 240C (image processing unit), a recording unit 240D (recording unit), a control unit 240E (control unit), a display control unit 240F (display control unit), a still image extraction unit 240G (still image extraction unit), and a lens drive control unit 240H (lens drive control unit). Based on the digital image signal input from the A/D converter 230, the image processing device 240 performs processing such as capturing a moving image and generating its file, generating still image files, processing the plurality of frames forming the moving image, and extracting still images. Details of the processing using the image processing device 240 will be described later.

 画像処理装置240の機能は、各種のプロセッサ(processor)を用いて実現できる。各種のプロセッサには、例えばソフトウェア(プログラム)を実行して各種の機能を実現する汎用的なプロセッサであるCPU(Central Processing Unit)が含まれる。また、上述した各種のプロセッサには、画像処理に特化したプロセッサであるGPU(Graphics Processing Unit)が含まれる。また、上述した各種のプロセッサには、FPGA(Field Programmable Gate Array)などの製造後に回路構成を変更可能なプロセッサであるプログラマブルロジックデバイス(Programmable Logic Device:PLD)も含まれる。さらに、ASIC(Application Specific Integrated Circuit)などの特定の処理を実行させるために専用に設計された回路構成を有するプロセッサである専用電気回路なども上述した各種のプロセッサに含まれる。

The functions of the image processing device 240 can be realized by using various processors. The various processors include, for example, a CPU (Central Processing Unit) that is a general-purpose processor that executes software (programs) to realize various functions. The various processors described above include a GPU (Graphics Processing Unit) that is a processor specialized for image processing. In addition, the various processors described above also include a programmable logic device (PLD) that is a processor whose circuit configuration can be changed after manufacturing such as FPGA (Field Programmable Gate Array). Further, a dedicated electric circuit, which is a processor having a circuit configuration specifically designed to execute a specific process such as an ASIC (Application Specific Integrated Circuit), is also included in the various processors described above.

 上述したプロセッサあるいは電気回路がソフトウェア(プログラム)を実行する際は、実行するソフトウェアのプロセッサ(コンピュータ)読み取り可能なコードをROM(Read Only Memory)等の非一時的記録媒体に記憶しておく。そして、プロセッサがそのソフトウェアを参照する。非一時的記録媒体に記憶しておくソフトウェアは、本発明に係る撮像方法を実行するためのプログラム(撮像装置を動作させるプログラム)を含む。ROMではなく各種光磁気記録装置、半導体メモリ等の非一時的記録媒体にコードを記録してもよい。ソフトウェアを用いた処理の際には例えばRAM(Random Access Memory)が一時的記憶領域として用いられ、また例えば不図示のEEPROM(Electronically Erasable and Programmable Read Only Memory)に記憶されたデータを参照することもできる。

When the above-described processor or electric circuit executes software (a program), processor (computer) readable code of the software to be executed is stored in a non-transitory recording medium such as a ROM (Read Only Memory), and the processor references that software. The software stored in the non-transitory recording medium includes a program for executing the imaging method according to the present invention (a program for operating the imaging device). The code may be recorded in non-transitory recording media such as various magneto-optical recording devices or semiconductor memories instead of the ROM. When processing is performed using the software, for example a RAM (Random Access Memory) is used as a temporary storage area, and data stored in, for example, an EEPROM (Electronically Erasable and Programmable Read Only Memory) (not shown) can also be referenced.

 画像処理装置240は、上述の各部の他にROM242(非一時的記録媒体)を備える。ROM242には、画像の撮像、記録、表示等に必要なプログラム(本発明に係る撮像方法を実行するためのプログラムを含む)のコンピュータ(例えば、画像処理装置240を構成する各種のプロセッサ)で読み取り可能なコードが記録される。

The image processing device 240 includes a ROM 242 (non-transitory recording medium) in addition to the above-described units. The ROM 242 records computer-readable code (readable by, for example, the various processors forming the image processing device 240) of programs necessary for image capture, recording, display, and the like, including a program for executing the imaging method according to the present invention.

 <操作部>

 操作部250は図示せぬレリーズボタン、操作用ボタン(例えば十字ボタン、Quickボタン、OKボタン等)、ダイヤル、スイッチ等を有し、ユーザは撮像モード設定、動画撮像指示、記録フレーム周期の指定、静止画抽出指示等、各種の操作を行うことができる。また、制御部240Eは、これらユーザの指示を受け付けることができる。なお、モニタ270(タッチパネル型)を操作部250として使用してもよい。

<Operation part>

The operation unit 250 has a release button (not shown), operation buttons (for example, a cross button, a Quick button, an OK button, etc.), a dial, a switch, and the like, and the user can perform various operations such as setting the imaging mode, instructing moving image capture, designating the recording frame period, and instructing still image extraction. The control unit 240E can accept these user instructions. The monitor 270 (touch panel type) may also be used as the operation unit 250.

 <記憶部>

 記録装置260(記録装置)は各種の光磁気記録媒体、半導体メモリ等の非一時的記録媒体及びその制御回路により構成され、動画、静止画、動画から抽出した静止画等を保存する。記録媒体は撮像装置本体200に対し着脱できるタイプを用いることができる。撮像した画像(動画、静止画)を、例えば有線及び/または無線通信により外部の(記録装置260以外の)記録媒体や記録装置に送信して保存してもよい。

<Storage unit>

The recording device 260 (recording device) is composed of non-transitory recording media, such as various magneto-optical recording media and semiconductor memories, together with their control circuits, and stores moving images, still images, still images extracted from moving images, and the like. A recording medium of a type that can be attached to and detached from the image pickup apparatus main body 200 can be used. Captured images (moving images, still images) may also be transmitted to and stored in an external recording medium (other than the recording device 260) or recording device by, for example, wired and/or wireless communication.

 <モニタ及びファインダ>

 モニタ270(表示装置)はタッチパネル型の液晶表示パネルにより構成され、動画、静止画、静止画抽出用フレーム等を表示することができる。モニタ270は撮像装置本体200の背面側、天面側等に配置することができる。また、カメラ10はファインダを備えていてもよい。

<Monitor and viewfinder>

The monitor 270 (display device) is composed of a touch panel type liquid crystal display panel and can display a moving image, a still image, a still image extracting frame, and the like. The monitor 270 can be arranged on the back surface side, the top surface side, or the like of the imaging device body 200. Further, the camera 10 may include a finder.

 <姿勢センサ>

 カメラ10は姿勢センサ280(例えば、3軸方向の加速度を測定する加速度センサ)を備えており、姿勢センサ280の測定結果に基づいてカメラ10の姿勢(重力に対する方向)を検出することができる。表示制御部240Fは、モニタ270に表示させる画像の向きを、カメラ10の姿勢(縦か横か)に応じて変更することができる。

<Attitude sensor>

The camera 10 includes an attitude sensor 280 (for example, an acceleration sensor that measures acceleration in three axis directions), and can detect the attitude (direction with respect to gravity) of the camera 10 based on the measurement result of the attitude sensor 280. The display control unit 240F can change the orientation of the image displayed on the monitor 270 according to the attitude (portrait or landscape) of the camera 10.

 <撮影モード>

 カメラ10は撮影モードとして静止画モード、通常動画モード(第1動画モード)、静止画抽出用動画モード(第2動画モード)、のいずれかを設定することができる。静止画モード、通常動画モードは通常のデジタルカメラと同様のモードである。一方、静止画抽出用動画モードは動画のファイルから静止画を抽出可能なモードであり、通常動画モードと撮像条件が異なる動画(動画自体の鑑賞よりも静止画の抽出を重視した撮像条件の動画)を撮像する。具体的には、静止画抽出用動画モードでは、オートフォーカスの速度(目標とする合焦距離に向かうためのフォーカスレンズの駆動速度)、自動露出の追従速度、ホワイトバランスの追従速度のうち少なくとも1つが通常動画モードに対して高速に設定される。なお、静止画抽出用動画モードでは解像度をカメラ10で設定可能な最高値(例えば4,000×2,000画素)に設定し、色調も静止画抽出を前提として設定することが好ましい。また、静止画抽出用動画モードではISO感度(ISO:International Organization for Standardization)の上限を通常動画モードより高くしてシャッタースピードの上限を高速化し、被写体のブレを少なくすることが好ましい。

<Shooting mode>

The camera 10 can be set to one of a still image mode, a normal moving image mode (first moving image mode), and a still image extraction moving image mode (second moving image mode) as its shooting mode. The still image mode and the normal moving image mode are the same as in an ordinary digital camera. The still image extraction moving image mode, on the other hand, is a mode in which still images can be extracted from a moving image file; it captures a moving image under imaging conditions different from the normal moving image mode (conditions that emphasize still image extraction over viewing of the moving image itself). Specifically, in the still image extraction moving image mode, at least one of the autofocus speed (the drive speed of the focus lens toward the target in-focus distance), the automatic exposure follow-up speed, and the white balance follow-up speed is set faster than in the normal moving image mode. In the still image extraction moving image mode, it is preferable to set the resolution to the highest value settable on the camera 10 (for example, 4,000×2,000 pixels) and to set the color tone on the premise of still image extraction. It is also preferable in the still image extraction moving image mode to make the upper limit of the ISO sensitivity (ISO: International Organization for Standardization) higher than in the normal moving image mode, thereby raising the upper limit of the shutter speed and reducing subject blur.

 シャッタースピードに関しては、通常動画モードでは記録する動画のフレームレートに対応した値(フレームレートが30フレーム/秒の場合、1/30秒)に設定される。一方、静止画抽出用動画モードではフレーム間隔よりも高速(例えば、1/30秒未満)に設定される。通常動画モードでは、滑らかな動画が再生されるように、シャッタースピードが動画のフレームレートに対応した値に設定されるが、この場合動く被写体に対してはブレが生じる可能性がある。このため、静止画抽出用動画モードではシャッタースピードを通常動画モードよりも高速に設定している。即ち、静止画抽出用動画モードでは、動画を構成するフレームの露光時間が通常動画モードよりも短い。尚、静止画抽出用動画モードにおいて、シャッタースピード(露光時間)の設定は、静止画モードと同様にPモード(プログラムモード)やSモード(シャッタスピード優先)を用いることができる。Pモード(プログラムモード)では、シャッタースピードと絞り値の両方が撮像装置によって自動制御される。一方、Sモードでは、シャッタースピードはユーザが事前に設定し、絞り値が撮像装置によって自動制御される。

The shutter speed is set, in the normal moving image mode, to a value corresponding to the frame rate of the moving image to be recorded (1/30 second when the frame rate is 30 frames/second). In the still image extraction moving image mode, on the other hand, it is set faster than the frame interval (for example, less than 1/30 second). In the normal moving image mode, the shutter speed is set to a value corresponding to the frame rate so that the moving image plays back smoothly, but in this case a moving subject may be blurred. For this reason, the shutter speed in the still image extraction moving image mode is set faster than in the normal moving image mode; that is, in the still image extraction moving image mode, the exposure time of the frames forming the moving image is shorter than in the normal moving image mode. In the still image extraction moving image mode, the shutter speed (exposure time) can be set using the P mode (program mode) or the S mode (shutter speed priority), as in the still image mode. In the P mode (program mode), both the shutter speed and the aperture value are automatically controlled by the imaging device. In the S mode, the shutter speed is set in advance by the user, and the aperture value is automatically controlled by the imaging device.
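
The shutter-speed rule described above can be expressed as a short sketch (the function name and the 1/250 s extraction-mode default are illustrative assumptions, not values from the embodiment):

```python
def shutter_speed(frame_rate_fps, mode, extraction_shutter=1 / 250):
    """Return the exposure time in seconds for one frame.

    'normal'     : expose for the full frame interval (smooth playback,
                   but moving subjects may blur).
    'extraction' : expose for less than the frame interval to reduce
                   subject blur (the 1/250 s default is an arbitrary
                   example of a fast shutter).
    """
    frame_interval = 1.0 / frame_rate_fps
    if mode == "normal":
        return frame_interval
    # Still-image-extraction mode: exposure shorter than the frame interval.
    return min(extraction_shutter, frame_interval)

print(shutter_speed(30, "normal"))      # full 1/30 s frame interval
print(shutter_speed(30, "extraction"))  # 1/250 s, shorter than 1/30 s
```

In an actual camera the extraction-mode value would come from P-mode or S-mode exposure control rather than a fixed constant; the sketch only captures the "shorter than the frame interval" relationship.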

 なお、静止画抽出用動画モードにおけるフレーム周期(フレームレート)の設定については、詳細を後述する。

Details of the frame period (frame rate) setting in the moving image mode for still image extraction will be described later.

 上述した静止画抽出用動画モードによれば、動画を構成するフレームを事後的に静止画として抽出することができる。よって、ユーザはいつ発生するか分からないイベント(自然現象やアクシデント、ハプニング等)の写真、時間の経過と共に状態が変化する被写体や動きのある被写体の瞬間的な状態の写真等を容易に撮影することができる。この際、詳細を後述するように、静止画の記録を指示したタイミングだけでなくその他のタイミングについても静止画を抽出できる。よって、ユーザは所望のタイミングの静止画を取得することができる。また、静止画抽出に適した撮影条件(上述したシャッタースピード、解像度、フレーム周期等)を設定することにより高画質の静止画を抽出することができる。

According to the above-described still image extraction moving image mode, the frames forming a moving image can be extracted afterwards as still images. The user can therefore easily capture photographs of events whose timing cannot be predicted (natural phenomena, accidents, happenings, etc.), of subjects whose state changes over time, and of momentary states of moving subjects. At this time, as will be described in detail later, still images can be extracted not only at the timing at which recording of a still image was instructed but also at other timings, so the user can obtain a still image at the desired timing. Further, by setting shooting conditions suitable for still image extraction (the shutter speed, resolution, frame period, etc. described above), high-quality still images can be extracted.

 なお、このような静止画の事後的抽出に加えて、静止画抽出用動画モード及び通常動画モードにおいては、動画撮影中に静止画を撮像(記録)することができる。

In addition to such subsequent extraction of still images, in the moving image mode for still image extraction and the normal moving image mode, still images can be captured (recorded) during moving image shooting.

 <動画モードにおける処理>

 上述した構成のカメラ10における撮像方法について説明する。図3は、動画モード(通常動画モード及び静止画抽出用動画モード)における処理を示すフローチャートである。

<Processing in movie mode>

An image pickup method in the camera 10 having the above-described configuration will be described. FIG. 3 is a flowchart showing processing in the moving image mode (normal moving image mode and still image extracting moving image mode).

 <動画モードの設定>

 制御部240Eは、操作部250を介したユーザの操作に基づいて、カメラ10を通常動画モード(第1動画モード)または静止画抽出用動画モード(第2動画モード)に設定する(ステップS100:モード設定ステップ)。

<Video mode settings>

The control unit 240E sets the camera 10 to the normal moving image mode (first moving image mode) or the still image extracting moving image mode (second moving image mode) based on the user's operation via the operation unit 250 (step S100: Mode setting step).

 <フリッカーの検出>

 フリッカー検出部240B(周期検出部)は、フリッカーの周期を検出する(ステップS110:周期検出ステップ)。検出は、例えば特開2018-133826号公報に記載の手法により行うことができる。具体的には、フリッカー検出部240Bは、時間間隔を空けて取得した複数のフレーム(ライブビュー画像を構成するフレームでもよい)からフリッカーを検出することができる。例えば電源周波数として50Hz(電源周期:20msec)と60Hz(電源周期:約16.7msec)を考える場合、撮像部201がいずれの電源周波数(50Hz,60Hz)とも異なるフレーム周期で複数の画像を撮像して、撮像した画像を画像取得部240Aが取得する。そして、取得した画像をフリッカー検出部240Bが自己相関処理した波形の周期をフリッカーの周期とすることができる。フリッカー検出部240Bは、この波形からフリッカーの位相を検出することもできる。フリッカー検出部240Bは、他の公知の手法によりフリッカーを検出してもよい。他の電源周波数(50Hz,60Hz以外)を考慮する場合も同様である。また、フリッカー検出部240Bは、光源の明暗変化が周期的であり、かつ、明暗差が一定の以上の光源をフリッカーとして検出してもよい。

<Flicker detection>

The flicker detection unit 240B (period detection unit) detects the flicker period (step S110: period detection step). The detection can be performed, for example, by the method described in JP-A-2018-133826. Specifically, the flicker detection unit 240B can detect flicker from a plurality of frames acquired at time intervals (these may be frames forming a live view image). For example, when 50 Hz (power supply period: 20 msec) and 60 Hz (power supply period: about 16.7 msec) are considered as power supply frequencies, the imaging unit 201 captures a plurality of images at a frame period different from either power supply frequency (50 Hz, 60 Hz), and the image acquisition unit 240A acquires the captured images. The period of the waveform obtained when the flicker detection unit 240B applies autocorrelation processing to the acquired images can then be taken as the flicker period. The flicker detection unit 240B can also detect the flicker phase from this waveform. The flicker detection unit 240B may detect flicker by other known methods, and the same applies when other power supply frequencies (other than 50 Hz and 60 Hz) are considered. The flicker detection unit 240B may also detect, as flicker, a light source whose brightness changes periodically and whose difference between bright and dark is equal to or greater than a certain level.
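
The autocorrelation idea can be sketched as follows. This is a minimal illustration, not the actual method of JP-A-2018-133826; the per-frame mean-brightness signal and the sampling interval are assumptions:

```python
import numpy as np

def estimate_flicker_period(mean_brightness, sample_interval):
    """Estimate the flicker period (seconds) from per-frame mean brightness.

    mean_brightness: 1-D sequence of average frame brightness, sampled at
    an interval different from the candidate power-supply periods.
    sample_interval: time between samples in seconds.
    """
    signal = np.asarray(mean_brightness, dtype=float)
    signal = signal - signal.mean()  # remove DC so peaks appear at the flicker lag
    # Full autocorrelation; keep non-negative lags only.
    ac = np.correlate(signal, signal, mode="full")[signal.size - 1:]
    # The first local maximum after lag 0 corresponds to one flicker period.
    for lag in range(1, ac.size - 1):
        if ac[lag] >= ac[lag - 1] and ac[lag] >= ac[lag + 1]:
            return lag * sample_interval
    return None  # no periodic component found

# Synthetic example: 100 Hz flicker sampled every 1 ms for 0.2 s.
t = np.arange(0, 0.2, 0.001)
brightness = 1.0 + 0.3 * np.sin(2 * np.pi * 100 * t)
period = estimate_flicker_period(brightness, 0.001)
print(period)  # ~0.01 s, i.e. a 1/100 sec flicker period
```

The flicker phase could likewise be read from the location of the waveform's peaks; a real implementation would also have to cope with noise and with sampling rates close to the flicker frequency.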

 なお、第1の実施形態及び後述する他の態様(変形例を含む)では動画像の記録開始前にフリッカーの検出を行う場合について説明しているが、動画像の記録開始後にフリッカーの光源がオンまたはオフされる場合がある。フリッカー検出部240Bは、このような事情を考慮して、動画像の記録開始前に加えて記録開始後にフリッカーの検出を行い、検出結果に基づいて撮像フレーム周期及び/または記録フレーム周期を再設定してもよい。また、シーン認識等で撮影シーンの変化が検出された場合に、フリッカーの検出を行い、検出結果に基づいて撮像フレーム周期及び/または記録フレーム周期を再設定してもよい。

In the first embodiment and the other modes described later (including the modifications), the case where flicker is detected before the start of moving image recording is described; however, a flickering light source may be turned on or off after recording has started. In consideration of such situations, the flicker detection unit 240B may detect flicker after the start of recording as well as before it, and reset the imaging frame period and/or the recording frame period based on the detection result. Flicker detection may also be performed when a change in the shooting scene is detected by scene recognition or the like, with the imaging frame period and/or the recording frame period reset based on the detection result.

 なお、交流電源の波形は正弦波状であるが、フリッカーによる明るさの変化(図4参照)を考えた場合、その周波数は電源周波数の2倍になる。したがって、フリッカーの周期は電源周期の1/2である。具体的には、電源周波数50Hz(周期=1/50sec)の場合のフリッカー周波数は100Hz(周期=1/100sec)、電源周波数60Hz(周期=1/60sec)の場合のフリッカー周波数は120Hz(周期=1/120sec)である。

Although the waveform of the AC power supply is sinusoidal, the brightness change due to flicker (see FIG. 4) occurs at twice the power supply frequency; the flicker period is therefore half the power supply period. Specifically, the flicker frequency is 100 Hz (period = 1/100 sec) when the power supply frequency is 50 Hz (period = 1/50 sec), and 120 Hz (period = 1/120 sec) when the power supply frequency is 60 Hz (period = 1/60 sec).
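
The relationship can be confirmed with a one-line calculation (a trivial sketch; the function name is an illustrative assumption):

```python
def flicker_period(power_frequency_hz):
    """Brightness varies at twice the mains frequency (the lamp brightens
    on both half-cycles), so the flicker period is half the power-supply
    period."""
    return 1.0 / (2.0 * power_frequency_hz)

print(flicker_period(50))  # 0.01 s      -> 100 Hz flicker
print(flicker_period(60))  # ~0.00833 s  -> 120 Hz flicker
```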

<Setting of the imaging frame period in the moving image mode for still image extraction>

When flicker is detected (Yes in step S120), the control unit 240E determines whether or not the imaging mode is the second moving image mode (moving image mode for still image extraction) (step S130). In the second moving image mode (Yes in step S130), the control unit 240E sets the imaging frame period according to the detection result of step S110. Specifically, when the flicker period is 1/100 sec, for example, the process proceeds from step S140 to step S150. The control unit 240E can set the imaging frame period (first frame period) to 1/100 sec (imaging frame rate = 100 fps) (step S150: control step) and the recording frame period (second frame period) to 1/100 sec (recording frame rate = 100 fps) (step S160: control step). In this case, the imaging frame period equals the flicker period, that is, the imaging frame period is one times the flicker period (one form of an integer multiple), and the recording frame period equals the imaging frame period. Note that "fps (frames per second)" denotes the number of frames per second.

On the other hand, when the flicker period is 1/120 sec, the process proceeds from step S140 to step S170. The control unit 240E can set the imaging frame period (first frame period) to 1/120 sec (imaging frame rate = 120 fps) (step S170: control step) and the recording frame period (second frame period) to 1/120 sec (recording frame rate = 120 fps) (step S180). In this case as well, the imaging frame period equals the flicker period (one times the flicker period, one form of an integer multiple), and the recording frame period equals the imaging frame period.

Note that, in the moving image mode for still image extraction (second moving image mode), the control unit 240E need not change the recording frame period (second frame period) depending on whether flicker is detected; it suffices that the imaging frame period is an integer multiple of the flicker period.
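
As a minimal sketch of the setting performed in steps S150/S170, the first frame period can be derived from the detected flicker period (the names are ours; the embodiment only requires that the result be an integer multiple):

```python
def imaging_frame_period(flicker_period_sec: float, multiple: int = 1) -> float:
    """Return an imaging frame period (first frame period) that is an
    integer multiple of the detected flicker period."""
    if multiple < 1:
        raise ValueError("multiple must be a positive integer")
    return multiple * flicker_period_sec

# Flicker period 1/100 sec, multiple=1 -> imaging at 1/100 sec (100 fps)
```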

<Setting of the frame period in the normal moving image mode>

When the imaging mode is set to the normal moving image mode (first moving image mode) (No in step S130), the control unit 240E sets the imaging frame period (first frame period) regardless of the flicker period (step S190: control step). In this case, the control unit 240E may set the imaging frame period based on a user operation or independently of one. Also in the normal moving image mode, the control unit 240E can set the recording frame period (second frame period) to a value equal to the imaging frame period (step S200: control step). For example, the control unit 240E can set both the imaging frame period and the recording frame period to 1/100 sec regardless of the flicker period. Note that, when no flicker is detected (No in step S120), the control unit 240E can likewise set the imaging frame period and the recording frame period as in the normal moving image mode.

<Recording of the moving image>

The image acquisition unit 240A and the control unit 240E determine whether or not a moving image recording start instruction has been given (step S210: imaging step, image acquisition step). For example, when a release button (not shown) of the operation unit 250 is pressed, it can be determined that "a moving image recording start instruction has been given". If the determination is affirmative, the process proceeds to step S220, where the imaging unit 201 captures a moving image at the imaging frame period (first frame period) set by the processing described above, and the image acquisition unit 240A acquires captured moving image data (first moving image data representing the captured moving image) (step S220: imaging step, image acquisition step). The image processing unit 240C then processes the first moving image data to generate recording moving image data (second moving image data) having the recording frame period (second frame period) (step S222: image processing step). When the imaging frame period and the recording frame period differ, the image processing unit 240C may insert or thin out frames to generate the recording moving image data (see the aspects described later). The image processing unit 240C may also perform image processing other than frame insertion and thinning.

The recording unit 240D (recording unit) causes the recording device 260 (recording device) to record the generated second moving image data (step S224: recording step). When, as in the example described above, the imaging frame period is 1/100 sec and the recording frame period is 1/100 sec (that is, the two periods are equal), the recording unit 240D may simply record all frames of the captured moving image data. The recording unit 240D may also record the recording moving image data in an external recording device (such as a recording device connected to the camera 10). It is preferable that the image processing unit 240C and the recording unit 240D record information such as the imaging frame period (imaging frame rate) and the recording frame period (recording frame rate) in a tag or the like of the recording moving image.

The image acquisition unit 240A, the image processing unit 240C, and the recording unit 240D continue the processing of steps S220 to S224 until a moving image recording end instruction is given (while No in step S230) and while the remaining capacity of the recording medium (the recording device 260 or the like) is sufficient (while Yes in step S240). When a recording end instruction is given (Yes in step S230) or when the remaining capacity of the recording medium is insufficient (No in step S240), the image acquisition unit 240A, the image processing unit 240C, and the recording unit 240D end the processing and close the moving image file being recorded (step S250).

FIG. 4 conceptually shows the relationship between the change in brightness caused by a flickering light source and the imaging timing of each frame of the moving image when the imaging frame period (first frame period) is set to an integer multiple of the flicker period by the processing described above. In the example shown in FIG. 4, the imaging frame period (= |t2 - t1|) is set to one times the flicker period (T0) (one form of an integer multiple). Times t1 and t2 indicate imaging timings. As a result, the moving image is captured while the brightness contribution of the flickering light source is constant (brightness L0), so that differences in brightness and tint between frames can be reduced.
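
This can be illustrated numerically with a toy brightness model (the sinusoidal envelope and all names here are our own assumptions, used only to show that sampling at an interval equal to T0 hits the same phase of the flicker every time):

```python
import math

T0 = 1.0 / 100.0   # flicker period (sec)

def brightness(t: float) -> float:
    """Toy model of light-source brightness varying at the flicker
    frequency 1/T0; the exact waveform does not matter here."""
    return 1.0 + 0.5 * math.sin(2.0 * math.pi * t / T0)

t1 = 0.0025        # first imaging timing
t2 = t1 + T0       # next imaging timing, one flicker period later
# brightness(t1) and brightness(t2) are equal: the light-source
# contribution is the same constant value (L0) for every frame.
```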

In the moving image mode for still image extraction (second moving image mode), the exposure time of each frame of the moving image is set shorter than in the normal moving image mode (first moving image mode), so the influence of flicker is larger. As described above, however, setting the imaging frame period to an integer multiple of the flicker period provides a flicker countermeasure suited to the exposure time of the frames of the moving image.

<Setting of the imaging timing>

As described above, one aspect of the present invention can reduce differences in brightness and tint between frames; however, depending on the imaging timing of the moving image (the exposure timing of each frame), every frame might be captured while the flickering light source is dim. By setting the imaging timing as described below, the moving image can instead be captured while the brightness of the flickering light source is in a desired state. The change in the brightness of the flickering light source (flicker phase information) can be detected together with the flicker period in step S110 (phase detection step). That is, images are acquired a plurality of times under the control of the image acquisition unit 240A, the flicker detection unit 240B (phase detection unit) applies autocorrelation processing to these images, and the relationship between imaging timing and brightness can be grasped from the resulting waveform. Then, in the moving image mode for still image extraction (second moving image mode), the control unit 240E controls the exposure timing of the frames of the captured moving image data (first moving image data representing the captured moving image) based on the flicker phase information. Under this control, the imaging unit 201 captures the moving image and the image acquisition unit 240A acquires the captured moving image data (step S220).

FIG. 5 shows how imaging at the imaging timings set by the control described above (times t3, t4, t5, t6, t7) captures each frame when the brightness of the flickering light source is at its maximum (brightness L1). Although FIG. 5 shows exposure at the brightness maximum, the imaging timings (for example, t1 and t2 in FIG. 4) may instead be shifted away from the maximum to adjust the brightness of the moving image. The imaging timing can be set in the same manner in the other aspects and modifications described below.
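
A sketch of how a brightness peak within one flicker period could be located from repeatedly acquired samples (the sampling scheme, data, and names are illustrative assumptions, not the embodiment's exact processing):

```python
import math

def peak_offset(samples, samples_per_period: int) -> int:
    """Return the index, within one flicker period, at which sampled
    brightness is maximal; exposures scheduled at this phase (plus
    integer multiples of the flicker period) then hit the peak."""
    window = samples[:samples_per_period]
    return max(range(samples_per_period), key=lambda i: window[i])

# Illustrative data: brightness sampled 20 times per flicker period.
N = 20
samples = [1.0 + 0.5 * math.sin(2.0 * math.pi * i / N) for i in range(2 * N)]
offset = peak_offset(samples, N)  # phase index of maximum brightness (L1)
```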

<Other aspects of imaging and recording frame period setting>

(Other aspect 1)

FIG. 6 is a flowchart showing another aspect (other aspect 1) of setting the imaging and recording frame periods (first and second frame periods). Processing steps identical to those in the flowchart of FIG. 3 are given the same step numbers, and detailed description thereof is omitted.

In the flowchart of FIG. 6, the control unit 240E initializes the recording frame period (second frame period) (step S102: control step). The control unit 240E may set a recording frame period (1/24 sec, 1/25 sec, 1/30 sec, 1/50 sec, 1/60 sec, etc.) corresponding to a frame rate commonly used for moving images (24 fps, 25 fps, 30 fps, 50 fps, 60 fps, etc.). The control unit 240E may perform this initial setting based on a user instruction given via the operation unit 250, or independently of a user instruction. The following description assumes that the recording frame period is set to 1/30 sec (frame rate: 30 fps) in step S102, but other initial values can be processed in the same way.

In other aspect 1, the recording frame period is left at its initial setting and, as described later in detail, frames are thinned out of or inserted into the captured moving image data (first moving image data) acquired at the imaging frame period to generate recording moving image data (second moving image data) having the set frame period. The imaging frame period and the recording frame period may therefore differ. Since the recording frame period remains at its initial setting, steps S160 and S180 in FIG. 6 may be skipped.

When the flicker period detected in step S110 is 1/100 sec, the control unit 240E can set the imaging frame period (first frame period) to a value such as 1/100 sec or 1/50 sec (step S150: control step). These imaging frame periods are one and two times the flicker period, respectively (forms of an integer multiple). Similarly, when the flicker period detected in step S110 is 1/120 sec, the control unit 240E can set the imaging frame period (first frame period) to a value such as 1/120 sec or 1/60 sec (step S170: control step); these are likewise one and two times the flicker period. The control unit 240E may also set an imaging frame period that is another integer multiple (three times or more) of the flicker period.

When the recording frame period (second frame period) is longer than the imaging frame period (first frame period) as in the example described above, the image processing unit 240C selects some of the frames of the captured moving image data (first moving image data) to generate recording moving image data (second moving image data) having the recording frame period (second frame period) (step S222: image processing step). FIG. 7 conceptually shows such processing (frame thinning); frames are acquired sequentially along time t. In the example of FIG. 7, the image processing unit 240C selects some of the frames (frames 506; part (a) of FIG. 7) of the captured moving image data (first moving image data) having the imaging frame period T1 to produce the frames (frames 508; part (b) of FIG. 7) of the recording moving image data (second moving image data) having the recording frame period T2 (> T1).
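
A minimal sketch of this thinning (the function name is ours), under the assumption that the recording period is an integer multiple of the imaging period:

```python
def thin_frames(frames, t1: float, t2: float):
    """Keep every (t2/t1)-th frame of data captured at period t1 so
    that the surviving frames follow the longer recording period t2."""
    step = round(t2 / t1)
    if step < 1:
        raise ValueError("t2 must not be shorter than t1")
    return frames[::step]

# t1 = 1/100 sec, t2 = 1/50 sec -> keep every 2nd captured frame
```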

Note that, when the recording frame period (second frame period) is longer than the imaging frame period (first frame period), the image processing unit 240C may instead combine a plurality of frames of the captured moving image data (first moving image data), by weighted addition or the like, to generate the frames of the recording moving image data (second moving image data) (step S222: image processing step).

The control unit 240E may also set the recording frame period (second frame period) shorter than the imaging frame period (first frame period). For example, when the recording frame period is initially set to 1/30 sec, the control unit 240E may set the imaging frame period to 1/25 sec (when the flicker period is 1/100 sec) (step S150: control step). In this case, the imaging frame period is four times the flicker period (one form of an integer multiple).

When the recording frame period is shorter than the imaging frame period (that is, the recording frame rate is higher than the imaging frame rate) as in the example described above, the image processing unit 240C can insert interpolation frames between the frames of the first moving image data to generate second moving image data having the second frame period. FIG. 8 conceptually shows such interpolation frames. When the imaging frame period is T1 and the recording frame period is T2 (< T1), the image processing unit 240C can insert interpolation frames 502A (shown in gray) between the frames of the captured moving image data (first moving image data) (frames 500; see part (a) of FIG. 8) to generate frames 502 having the recording frame period T2 (see part (b) of FIG. 8). The frames 502 illustrate a 1:1 ratio of captured frames to interpolation frames, but the image processing unit 240C can determine the ratio according to the imaging frame period and the recording frame period. Examples of interpolation frames include, but are not limited to, a black frame (dummy frame) in which no subject appears, a copy of the immediately preceding frame, and a frame generated by interpolating the frames before and/or after the insertion timing.
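
A minimal sketch of the insertion shown in FIG. 8 (the names and the `None` dummy marker are illustrative; a real interpolation frame could be a black frame or a copy of the preceding frame):

```python
def insert_interpolation_frames(frames, per_captured: int = 1, dummy=None):
    """Insert `per_captured` interpolation frames after each captured
    frame, shortening the effective frame period; per_captured=1
    corresponds to the 1:1 ratio of frames 502 in FIG. 8."""
    out = []
    for f in frames:
        out.append(f)
        out.extend([dummy] * per_captured)
    return out
```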

When interpolation frames are inserted, it is preferable that the image processing unit 240C and the recording unit 240D record information such as the imaging frame period (imaging frame rate), the recording frame period (recording frame rate), and the positions of the inserted interpolation frames in a tag or the like of the recording moving image. As described below, this information can be used when extracting a still image.

<Moving image playback and still image extraction>

FIG. 9 is a flowchart showing moving image playback and still image cutout (still image extraction). The display control unit 240F determines whether or not dummy frame information (information indicating the positions of frames inserted by interpolation, etc.) is recorded in a tag or the like of the moving image data recorded in step S224 (step S300). When dummy frame information is recorded (Yes in step S300), the display control unit 240F reads and sets the information (step S310) and starts playback of the recording moving image data (second moving image data) (step S320). The display control unit 240F may display the images on the monitor 270 (display device) or on another display device connected to the camera 10. The following describes the case where the moving image is played back frame by frame (one frame displayed at a time in response to user operations).

When playback starts, the display control unit 240F initializes the counter k to 1 (step S330) and determines whether or not the k-th frame is a dummy frame (step S340). If the k-th frame is not a dummy frame (No in step S340), the display control unit 240F displays it (step S360). The still image extraction unit 240G (still image extraction unit) then determines whether or not a cutout instruction has been given for the displayed frame (an instruction from the user to extract a frame of the recording moving image data (second moving image data) as a still image) (step S370). When a cutout instruction is given (Yes in step S370), the still image extraction unit 240G cuts out the designated frame as a still image (step S380) and proceeds to step S390. Since the moving image file is stored in a moving image format such as MPEG, the still image extraction unit 240G converts the data of the designated frame into a still image format (JPEG or the like). The still image extraction unit 240G may also apply image processing other than format conversion.

When no cutout instruction is given (No in step S370), the display control unit 240F determines whether or not the user has given a playback end instruction, and when one is given (Yes in step S390), the processing of the flowchart of FIG. 9 ends. When no playback end instruction is given (No in step S390), the display control unit 240F determines whether or not the user has given a frame advance instruction. When one is given (Yes in step S400), the display control unit 240F increments the counter k by 1 (step S350), clips k to the range 1 to N (step S430), and returns to step S340. When there is no frame advance instruction (No in step S400) but there is a frame return instruction (an instruction to display the previous frame) (Yes in step S410), the display control unit 240F decrements k by 1 (step S420), clips k to the range 1 to N (step S430), and returns to step S340. When there is no frame return instruction either (No in step S410), the display control unit 240F likewise clips k to the range 1 to N (step S430) and returns to step S340.

In this way, dummy frames (one form of interpolation frame) are not displayed when moving image data containing them is played back for still image extraction, so the user can perform still image extraction work efficiently.
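
The frame navigation of steps S340 to S430 could be sketched as follows (a simplification under our own naming; the actual flowchart also handles cutout and end instructions, and a dummy frame at the very edge of the sequence may still be reached when no further movement is possible):

```python
def step_frame(frames, k: int, is_dummy, direction: int = 1) -> int:
    """Advance (direction=+1) or rewind (direction=-1) the 1-based
    frame counter k, clipping it to [1, N] and skipping dummy frames
    where possible so that real frames are displayed."""
    n = len(frames)
    k = min(max(k + direction, 1), n)
    # Keep moving past dummy frames while we can still move in `direction`.
    while is_dummy(frames[k - 1]) and 1 <= k + direction <= n:
        k += direction
    return k

frames = ["a", None, "b", None, "c"]  # None marks a dummy frame

def is_dummy(frame):
    return frame is None
```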

(Modification 1 of other aspect 1)

In the flowchart of FIG. 6, the recording frame period initially set in step S102 may be changed (reset) according to the imaging frame period. In this case, the control unit 240E changes (resets) the recording frame period (second frame period) in steps S160 and S180. Here, a frame period longer than the imaging frame period (a frame rate lower than the imaging frame rate) that conforms to a common moving image standard can be set. Specifically, when the initially set recording frame period is 1/30 sec and the flicker period is 1/100 sec, for example, the control unit 240E sets the imaging frame period (first frame period) to 1/100 sec (step S150) and, corresponding to that imaging frame period, sets the recording frame period (second frame period) to one of 1/60 sec, 1/50 sec, 1/25 sec, and 1/24 sec (frame rates of 60 fps, 50 fps, 25 fps, and 24 fps, respectively) (step S160).

Also, when the initially set recording frame period is 1/30 sec and the flicker period is 1/120 sec, for example, the control unit 240E sets the imaging frame period to 1/120 sec (step S170) and, corresponding to that imaging frame period, sets the recording frame period to one of 1/100 sec, 1/60 sec, 1/50 sec, 1/25 sec, and 1/24 sec (frame rates of 100 fps, 60 fps, 50 fps, 25 fps, and 24 fps, respectively) (step S180).
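
One way to enumerate the candidate second frame periods in this modification could be sketched as follows (the rate list and names are ours, following the values given in the text):

```python
STANDARD_FPS = [24, 25, 50, 60, 100, 120]  # common moving image frame rates

def candidate_recording_fps(imaging_fps: float):
    """Standard frame rates strictly lower than the imaging frame
    rate, i.e. recording frame periods longer than the imaging frame
    period, from which the new second frame period is chosen."""
    return [fps for fps in STANDARD_FPS if fps < imaging_fps]

# imaging at 100 fps -> candidates 24, 25, 50, 60 fps
# imaging at 120 fps -> candidates 24, 25, 50, 60, 100 fps
```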

When the recording frame period is changed (reset), it is preferable to notify the user, for example by the display control unit 240F displaying information on the monitor 270. The control unit 240E may also let the user select the recording frame period to change to. In modification 1, the recording frame period is longer than the imaging frame period, so, as described above with reference to FIG. 7, the image processing unit 240C thins out frames (selects some of the frames of the captured moving image data in accordance with the recording frame period) or combines frames to generate the recording moving image data (step S222).

(Modification 2 of other aspect 1)

Whereas modification 1 described above changes (resets) the recording frame period to one longer than the imaging frame period, a recording frame period shorter than the imaging frame period (a recording frame rate higher than the imaging frame rate) may be set instead. In this case as well, a recording frame period conforming to a common moving image standard may be set. Specifically, when the initially set recording frame period is 1/30 sec and the flicker period is 1/100 sec, for example, the control unit 240E sets the imaging frame period (first frame period) to 1/100 sec (step S150) and the recording frame period (second frame period) to 1/120 sec (frame rate: 120 fps) (step S160). The frame periods can be set in the same way when the flicker period is 1/120 sec.

In modification 2 as well, it is preferable that the display control unit 240F notifies the user of the change (reset) of the recording frame period, for example by displaying information on the monitor 270, and the control unit 240E may let the user select the recording frame period to change to. In modification 2, the recording frame period is shorter than the imaging frame period, so, as described above with reference to FIG. 8, the image processing unit 240C inserts interpolation frames to generate the recording moving image data (step S222).

(Other aspect 2)

FIG. 10 is a flowchart showing another aspect (other aspect 2) of setting the imaging frame period and the recording frame period. In other aspect 2, imaging is performed at an imaging frame period (first frame period) that is an integer multiple of the flicker period regardless of the imaging mode (first moving image mode or second moving image mode), and a period specified in advance is used as the recording frame period (second frame period). In FIG. 10, processing steps identical to those in the flowcharts of FIGS. 3 and 6 are given the same step numbers, and detailed description thereof is omitted.

When the flicker period detected in step S110 is 1/100 sec, the control unit 240E sets the imaging frame period to 1/100 sec (= the flicker period) or an integral multiple thereof (2 times, 3 times, etc.) (step S150). The recording frame period may remain at its initial setting value (step S102). On the other hand, when the flicker period is 1/120 sec, the control unit 240E sets the imaging frame period to 1/120 sec (= the flicker period) or an integral multiple thereof (2 times, 3 times, etc.) (step S170); the recording frame period may remain at the designated value (step S102). In accordance with the relationship between the imaging frame period and the recording frame period thus set, the image processing unit 240C performs the interpolation-frame insertion, frame synthesis, and frame thinning described above with reference to FIGS. 7 and 8 (step S222).
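The branch taken in step S222 depends only on the relation between the two frame periods. The following sketch is illustrative only; the function name and return labels are assumptions, not terminology from the patent:

```python
from fractions import Fraction

def processing_for(imaging_period, recording_period):
    """Select the conversion applied in step S222 for the given frame periods (sec)."""
    if imaging_period > recording_period:
        # Recording runs at a higher rate than imaging: fill in missing frames.
        return "insert interpolation frames"
    if imaging_period < recording_period:
        # Imaging runs at a higher rate than recording: drop or combine frames.
        return "thin out or synthesize frames"
    return "record frames as captured"

# Other Aspect 2 with a 1/100 sec flicker period and a designated 30 fps recording rate:
print(processing_for(Fraction(1, 100), Fraction(1, 30)))  # -> thin out or synthesize frames
```

Exact rational arithmetic (`Fraction`) is used so that comparisons such as 1/100 versus 1/120 are exact rather than subject to floating-point rounding.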

When no flicker is detected (No in step S120), the control unit 240E can set the imaging frame period to a value equal to the recording frame period set in step S102 (step S182).

(Modification 1 of Other Aspect 2)

FIG. 11 is a flowchart showing Modification 1 of Other Aspect 2. In this modification as well, the control unit 240E can change (reset) the recording frame period initially set in step S102 (steps S160 and S180). When the frame period is changed, the recording frame period and the imaging frame period may be equal, or either one may be longer. The specific frame-period values can be set in the same manner as in the first embodiment, Other Aspect 1, and its modifications. The image processing unit 240C performs the above-described interpolation-frame insertion, frame synthesis, and frame thinning in accordance with the relationship between the imaging frame period and the recording frame period (step S222).

(Other Aspect 3)

FIG. 12 is a flowchart showing Other Aspect 3 of setting the imaging frame period and the recording frame period. In Other Aspect 3, the control unit 240E sets the imaging frame period in the normal moving image mode (first moving image mode) without regard to the flicker period, just as when no flicker is present (step S182). In the still image extraction moving image mode (second moving image mode), on the other hand, the control unit 240E sets the first frame period to an integral multiple of the flicker period according to the result of flicker-period detection. In either moving image mode, the control unit 240E sets the recording frame period to the frame period designated in advance (step S102).

When the flicker period is 1/100 sec, the control unit 240E can set the imaging frame period to, for example, 1/100 sec, 1/50 sec, 1/25 sec, or 1/20 sec (frame rates of 100 fps, 50 fps, 25 fps, and 20 fps, respectively); these imaging frame periods are 1, 2, 4, and 5 times the flicker period. When the flicker period is 1/120 sec, the control unit 240E can set the imaging frame period to, for example, 1/120 sec, 1/60 sec, 1/30 sec, or 1/24 sec (frame rates of 120 fps, 60 fps, 30 fps, and 24 fps, respectively); these are likewise 1, 2, 4, and 5 times the flicker period. The control unit 240E may set the imaging frame period to a value that is both an integral multiple of the flicker period and compliant with a general moving-image standard.
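The candidate periods quoted above are exactly the standard rates whose frame period is a whole number of flicker cycles, so each exposure can span complete cycles. A short sketch of that selection follows; the rate table and function name are assumptions for illustration:

```python
from fractions import Fraction

STANDARD_FPS = [20, 24, 25, 30, 50, 60, 100, 120]  # assumed table of common rates

def candidate_imaging_fps(flicker_period):
    """Standard rates whose frame period is an exact integer multiple of the
    flicker period (frame period / flicker period has no fractional part)."""
    return [fps for fps in STANDARD_FPS
            if (Fraction(1, fps) / flicker_period).denominator == 1]

print(candidate_imaging_fps(Fraction(1, 100)))  # -> [20, 25, 50, 100]
print(candidate_imaging_fps(Fraction(1, 120)))  # -> [20, 24, 30, 60, 120]
```

Note that for a 1/120 sec flicker period, 20 fps also qualifies (1/20 sec equals six flicker cycles) in addition to the rates listed in the text, which are given only as examples.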

(Modification 1 of Other Aspect 3)

FIG. 13 is a flowchart showing Modification 1 of Other Aspect 3 described above. In Modification 1, the control unit 240E changes (resets) the recording frame period from the value initially set in step S102 (steps S160 and S180). At this time, the control unit 240E can change the recording frame period to an integral multiple of the flicker period in accordance with the imaging frame period. The recording frame period may be equal to or different from the imaging frame period, and when they differ, either one may be longer. The control unit 240E may set the imaging frame period and/or the recording frame period in accordance with a general moving-image standard. As in the aspects and modifications described above, the image processing unit 240C performs interpolation-frame insertion, frame synthesis, and frame thinning in accordance with the difference between the frame periods (step S222).
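One simple way to realize the frame thinning used when the recording frame period is the longer of the two is to keep, for each recording-frame timestamp, the nearest captured frame. The following sketch illustrates the idea under that assumption; the function name and selection rule are not taken from the patent:

```python
from fractions import Fraction

def thin_frames(n_frames, imaging_period, recording_period):
    """Indices of captured frames kept when converting to a longer recording
    period: for each recording timestamp, keep the nearest captured frame."""
    kept = []
    t = Fraction(0)
    last = (n_frames - 1) * imaging_period  # timestamp of the final captured frame
    while t <= last:
        kept.append(round(t / imaging_period))
        t += recording_period
    return kept

# 10 frames captured at a 1/100 sec period, recorded at a 1/30 sec period:
print(thin_frames(10, Fraction(1, 100), Fraction(1, 30)))  # -> [0, 3, 7]
```

When the two periods are equal, every captured frame is kept unchanged, which corresponds to recording the moving image as captured.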

In Other Aspects 1 to 3 and their modifications described above, as in the first embodiment, flicker countermeasures can be taken in accordance with the exposure time of the frames forming the moving image, and a moving image captured with those flicker countermeasures can be recorded at a desired frame period.

<Second Embodiment>

Although the camera 10, which is a digital camera, has been described in the first embodiment, the configuration of the imaging device is not limited to this. Other imaging devices of the present invention may be, for example, a built-in or external PC camera (PC: personal computer), or a portable terminal device having an imaging function as described below.

Examples of the portable terminal device that is an embodiment of the imaging device of the present invention include a mobile phone, a smartphone, a PDA (Personal Digital Assistant), and a portable game machine. Hereinafter, a smartphone is taken as an example and described in detail with reference to the drawings.

FIG. 14 is a diagram showing the appearance of a smartphone 1 (imaging device), which is an embodiment of the imaging device of the present invention; part (a) of the figure is a front view, and part (b) is a rear view. The smartphone 1 shown in FIG. 14 has a flat housing 2 and includes, on one surface of the housing 2, a display input unit 20 in which a display panel 21 (display device) serving as a display unit and an operation panel 22 (operation unit) serving as an input unit are integrated. The housing 2 also includes a speaker 31, a microphone 32, an operation unit 40 (operation unit), camera units 41 and 42 (imaging device, imaging unit), and a strobe 43. The configuration of the housing 2 is not limited to this; for example, a configuration in which the display unit and the input unit are independent, or a configuration having a folding structure or a slide mechanism, may be adopted.

FIG. 15 is a block diagram showing the configuration of the smartphone 1 shown in FIG. 14. As shown in FIG. 15, the smartphone 1 includes a wireless communication unit 11, the display input unit 20, a call unit 30, the operation unit 40, the camera units 41 and 42, the strobe 43, a storage unit 50, an external input/output unit 60, a GPS receiving unit 70 (GPS: Global Positioning System), a motion sensor unit 80, and a power supply unit 90. The smartphone 1 also includes a main control unit 101 (image acquisition unit, flicker detection unit (period detection unit, phase detection unit), control unit, display control unit, still image extraction unit, lens drive control unit). As its main function, the smartphone 1 has a wireless communication function for performing mobile wireless communication via a base station device and a mobile communication network.

The wireless communication unit 11 performs wireless communication with a base station device accommodated in the mobile communication network in accordance with instructions from the main control unit 101. Using this wireless communication, it transmits and receives various file data such as voice data and image data, e-mail data, and the like, and receives Web data, streaming data, and the like.

The display input unit 20 is a so-called touch panel that, under the control of the main control unit 101, displays images (still images and/or moving images), character information, and the like to visually convey information to the user, and detects user operations on the displayed information; it includes the display panel 21 and the operation panel 22.

In the display panel 21, an LCD (Liquid Crystal Display), an OELD (Organic Electro-Luminescence Display), or the like is used as the display device. The operation panel 22 is a device that is placed so that an image displayed on the display surface of the display panel 21 is visible, and that detects one or more coordinates operated by a conductor such as a user's finger or a pen. When such a device is operated by a user's finger or a conductor such as a pen, the operation panel 22 outputs a detection signal generated by the operation to the main control unit 101. The main control unit 101 then detects the operated position (coordinates) on the display panel 21 based on the received detection signal.

As shown in FIG. 14, the display panel 21 and the operation panel 22 of the smartphone 1 illustrated as an embodiment of the imaging device of the present invention integrally constitute the display input unit 20, with the operation panel 22 arranged so as to completely cover the display panel 21. When this arrangement is adopted, the operation panel 22 may also have a function of detecting user operations in a region outside the display panel 21. In other words, the operation panel 22 may have a detection region for the portion overlapping the display panel 21 (hereinafter referred to as the display region) and a detection region for the outer edge portion not overlapping the display panel 21 (hereinafter referred to as the non-display region).

The call unit 30 includes the speaker 31 and the microphone 32; it can convert the user's voice input through the microphone 32 into voice data processable by the main control unit 101 and output the voice data to the main control unit 101, and can decode voice data received by the wireless communication unit 11 or the external input/output unit 60 and output it from the speaker 31. As shown in FIG. 14, for example, the speaker 31 can be mounted on the same surface as the display input unit 20, and the microphone 32 can be mounted on a side surface of the housing 2.

The operation unit 40 is a hardware key using a key switch or the like, and is a device that receives instructions from the user. For example, as shown in FIG. 14, the operation unit 40 is a push-button switch mounted on a side surface of the housing 2 of the smartphone 1, which is turned on when pressed with a finger or the like and turned off by the restoring force of a spring or the like when the finger is released.

The storage unit 50 (recording device) stores the control program and control data of the main control unit 101, application software, address data associating the names and telephone numbers of communication partners, data of sent and received e-mails, Web data downloaded by Web browsing, and downloaded content data, and also temporarily stores streaming data and the like. The storage unit 50 includes an internal storage unit 51 built into the smartphone and an external storage unit 52 having a detachable external memory slot. The internal storage unit 51 and the external storage unit 52 constituting the storage unit 50 are each realized using a known storage medium.

The external input/output unit 60 serves as an interface with all external devices connected to the smartphone 1. The smartphone 1 is directly or indirectly connected to other external devices via the external input/output unit 60 by communication or the like. Examples of means for such communication include Universal Serial Bus (USB), IEEE 1394, and networks (for example, the Internet or a wireless LAN). Bluetooth (registered trademark), RFID (Radio Frequency Identification), infrared communication (Infrared Data Association: IrDA) (registered trademark), UWB (Ultra Wide Band) (registered trademark), ZigBee (registered trademark), and the like can also be cited as means for communication.

Examples of external devices connected to the smartphone 1 include a wired/wireless headset, a wired/wireless external charger, and a wired/wireless data port. A memory card or a SIM (Subscriber Identity Module) / UIM (User Identity Module) card connected via a card socket can also be cited as an external device. In addition, external audio and video devices connected via audio/video I/O (input/output) terminals, wirelessly connected external audio and video devices, wired/wirelessly connected smartphones, wired/wirelessly connected PDAs, wired/wirelessly connected personal computers, earphones, and other external devices can be connected. The external input/output unit 60 can transmit data received from such external devices to the internal components of the smartphone 1, and can transmit internal data of the smartphone 1 to the external devices.

The motion sensor unit 80 includes, for example, a three-axis acceleration sensor and a tilt sensor, and detects the physical movement of the smartphone 1 in accordance with instructions from the main control unit 101. By detecting the physical movement of the smartphone 1, its moving direction, acceleration, and attitude are detected; the detection results are output to the main control unit 101. The power supply unit 90 supplies power stored in a battery (not shown) to each unit of the smartphone 1 in accordance with instructions from the main control unit 101.

The main control unit 101 includes a microprocessor, operates in accordance with the control program and control data stored in the storage unit 50, and centrally controls each unit of the smartphone 1 including the camera unit 41. The main control unit 101 also has a mobile communication control function for controlling each unit of the communication system in order to perform voice communication and data communication through the wireless communication unit 11, as well as an application processing function.

The main control unit 101 also has an image processing function of, for example, displaying video on the display input unit 20 based on image data (still image or moving image data) such as received data or downloaded streaming data. The image processing function refers to a function in which the main control unit 101 decodes image data, applies image processing to the decoding result, and displays the resulting image on the display input unit 20.

The camera units 41 and 42 are digital cameras (imaging devices) that perform electronic imaging using an imaging element such as a CMOS or CCD. Under the control of the main control unit 101, the camera units 41 and 42 can convert image data (moving images, still images) obtained by imaging into compressed image data such as MPEG or JPEG, record it in the storage unit 50, and output it through the external input/output unit 60 or the wireless communication unit 11. Under the control of the main control unit 101, the camera unit 41 can also divide and combine moving images, acquire high-quality still images (such as RAW images), replace and process frames, and extract still images from moving images. In the smartphone 1 shown in FIGS. 14 and 15, imaging can be performed using either one of the camera units 41 and 42, or using the camera units 41 and 42 simultaneously. When the camera unit 42 is used, the strobe 43 can be used.

The camera units 41 and 42 can also be used for various functions of the smartphone 1. For example, the smartphone 1 can display images acquired by the camera units 41 and 42 on the display panel 21, and can use images from the camera units 41 and 42 as one of the operation inputs of the operation panel 22. When the GPS receiving unit 70 detects a position based on positioning information from the GPS satellites ST1, ST2, ..., STn, the smartphone 1 can also detect the position by referring to images from the camera units 41 and 42. Furthermore, by referring to images from the camera units 41 and 42, the smartphone 1 can determine the optical axis direction of the camera unit 41 and the current usage environment, either without using the three-axis acceleration sensor or in combination with it. Of course, the smartphone 1 can also use images from the camera units 41 and 42 within application software. In addition, the smartphone 1 can record still image or moving image data in the storage unit 50 with position information acquired by the GPS receiving unit 70, voice information acquired by the microphone 32 (which may be converted into text information by voice-to-text conversion performed by the main control unit or the like), attitude information acquired by the motion sensor unit 80, and the like added to it. The smartphone 1 can also output such still image or moving image data through the external input/output unit 60 or the wireless communication unit 11.

In the smartphone 1 configured as described above, the processing of the imaging method according to the present invention (capturing of moving images, detection of the flicker period and phase, control of the frame periods, generation and recording of moving image data for recording, playback and display of the moving image data for recording, still image extraction, and so on) can be executed in the same manner as with the camera 10 according to the first embodiment. Specifically, the processing executed by the image processing apparatus 240 (each unit shown in FIG. 2) in the first embodiment, including the processing of the flowcharts described above, can be executed by the camera units 41 and 42 and the main control unit 101 of the smartphone 1. The functions of the operation unit 250, the recording device 260, and the monitor 270 in the first embodiment can be realized in the smartphone 1 by the operation unit 40, by the storage unit 50 and the operation panel 22, and by the display panel 21 and the operation panel 22, respectively.

As a result, the smartphone 1 according to the second embodiment can also obtain the same effects as the camera 10 according to the first embodiment (flicker countermeasures can be taken in accordance with the exposure time of the frames forming the moving image, a moving image captured with such countermeasures can be recorded at a desired frame period, and so on).

Although embodiments and other aspects (including modifications) of the present invention have been described above, the present invention is not limited to the embodiments and aspects described above, and various modifications are possible without departing from the spirit of the present invention.

1 smartphone

2 housing

10 camera

11 wireless communication unit

20 display input unit

21 display panel

22 operation panel

30 call unit

31 speaker

32 microphone

40 operation unit

41 camera unit

42 camera unit

43 strobe

50 storage unit

51 internal storage unit

52 external storage unit

60 external input/output unit

70 GPS receiving unit

80 motion sensor unit

90 power supply unit

100 interchangeable lens

101 main control unit

110 zoom lens

120 focus lens

130 aperture

140 lens driving unit

200 imaging device body

201 imaging unit

210 imaging element

220 AFE

230 A/D converter

240 image processing device

240A image acquisition unit

240B flicker detection unit

240C image processing unit

240D recording unit

240E control unit

240F display control unit

240G still image extraction unit

240H lens drive control unit

242 ROM

250 operation unit

260 recording device

270 monitor

280 attitude sensor

500 frame

502 frame

502A interpolation frame

506 frame

508 frame

k counter

L optical axis

L0 brightness

L1 brightness

S100 to S430 steps of the imaging method

ST1 GPS satellite

ST2 GPS satellite

STn GPS satellite

T0 flicker period

T1 imaging frame period

T2 recording frame period

t time

t1 time

t2 time

t3 time

t4 time

t5 time

t6 time

t7 time

Claims (27)


1.  An imaging device comprising:

    a period detection unit that detects a flicker period of a light source in a moving image;

    an imaging unit that captures a moving image whose frame period is a first frame period;

    an image acquisition unit that acquires first moving image data representing the captured moving image;

    an image processing unit that processes the first moving image data to generate second moving image data whose frame period is a second frame period;

    a recording unit that causes a recording device to record the second moving image data; and

    a control unit that controls the first frame period and the second frame period,

    wherein the control unit has, as imaging modes for the moving image, a first moving image mode and a second moving image mode in which an exposure time of frames forming the moving image is set shorter than in the first moving image mode, and

    wherein the control unit sets the first frame period to an integral multiple of the flicker period when the period detection unit detects a flicker period in the second moving image mode, and sets the first frame period without regard to the flicker period in the first moving image mode.

2.  The imaging device according to claim 1, wherein, in the second moving image mode, the control unit does not change the second frame period depending on whether or not a flicker period of the light source is detected.

3.  The imaging device according to claim 1, wherein the control unit sets the second frame period to a frame period longer than the first frame period when a flicker period of the light source is detected in the second moving image mode.

4.  The imaging device according to any one of claims 1 to 3, wherein, when the second frame period is longer than the first frame period, the image processing unit selects some of the plurality of frames forming the first moving image data to generate the second moving image data whose frame period is the second frame period.

5.  The imaging device according to any one of claims 1 to 4, wherein, when the second frame period is longer than the first frame period, the image processing unit synthesizes a plurality of frames forming the first moving image data to generate frames forming the second moving image data.

6.  The imaging device according to claim 1, wherein the control unit sets the second frame period to a frame period shorter than the first frame period when a flicker period of the light source is detected in the second moving image mode.

  7.  前記画像処理部は、前記第1フレーム周期が前記第2フレーム周期より長い場合に、前記第1動画データを構成するフレームに補間フレームを挿入してフレーム周期が前記第2フレーム周期の前記第2動画データを生成する請求項1、2、6のいずれか1項に記載の撮像装置。

    When the first frame period is longer than the second frame period, the image processing unit inserts an interpolation frame into a frame forming the first moving image data, and the second frame period is the second frame period. The imaging device according to claim 1, which generates moving image data.

  8.  The imaging device according to claim 7, further comprising a display control unit that causes a display device to display the second moving image data as an image, wherein the display control unit displays the second moving image data excluding the interpolation frames.
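When the second frame period is the shorter one (claims 6 to 8), the captured moving image must be up-converted by inserting interpolation frames, and the display path can then skip those inserted frames. A hypothetical sketch with linear blending over scalar "frames"; the tagging scheme is invented here to show how a claim-8 style display control could exclude the interpolated frames:

```python
def insert_interpolation_frames(frames, factor):
    """Between each pair of captured frames, insert factor-1 linearly
    blended interpolation frames. Output items are (frame, is_interpolated)."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append((a, False))  # captured frame
        for k in range(1, factor):
            w = k / factor
            out.append(((1 - w) * a + w * b, True))  # interpolated frame
    out.append((frames[-1], False))
    return out

def frames_for_display(tagged):
    """Claim-8 style display: show only the captured (non-interpolated) frames."""
    return [f for f, interpolated in tagged if not interpolated]
```

Doubling the frame rate (`factor = 2`) inserts one blended frame between each captured pair; the recorded second moving image contains all frames, while the display shows only the originals.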

  9.  The imaging device according to any one of claims 1 and 3 to 8, wherein the control unit sets the second frame period to an integral multiple of the flicker period in the second moving image mode according to a result of the detection.

  10.  The imaging device according to any one of claims 1 to 9, further comprising a phase detection unit that detects flicker phase information of the light source in the first moving image data, wherein the control unit controls, in the second moving image mode, the exposure timing of the frames constituting the first moving image data based on the flicker phase information.
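Claim 10 adds phase control: once the flicker phase is known, each exposure can be placed so that it straddles a brightness peak of the light source rather than a trough. The patent does not specify this computation; the following is a speculative sketch with invented parameter names:

```python
def next_exposure_start(now, flicker_period, known_peak_time, exposure_time):
    """Return the earliest exposure start time >= now that centers the
    exposure window on a flicker brightness peak.

    known_peak_time is any one instant at which the flicker waveform peaks;
    later peaks recur every flicker_period seconds.
    """
    center = known_peak_time
    while center - exposure_time / 2 < now:
        center += flicker_period  # advance to the next peak
    return center - exposure_time / 2
```

With a 10 ms flicker period, a known peak at t = 5 ms, and a 2 ms exposure, the first valid exposure runs from 4 ms to 6 ms, so short exposures in the second moving image mode consistently sample the bright part of the flicker cycle.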

  11.  The imaging device according to any one of claims 1 to 10, wherein, in the second moving image mode, at least one of an autofocus speed, an automatic exposure tracking speed, and a white balance tracking speed is set higher than in the first moving image mode.

  12.  The imaging device according to any one of claims 1 to 11, further comprising a still image extraction unit that extracts a frame constituting the second moving image data as a still image.

  13.  An imaging device comprising:

     a period detection unit that detects a flicker period of a light source in a moving image;

     an imaging unit that captures a moving image having a first frame period;

     an image acquisition unit that acquires first moving image data representing the captured moving image;

     an image processing unit that processes the first moving image data to generate second moving image data having a second frame period;

     a recording unit that causes a recording device to record the second moving image data; and

     a control unit that controls the first frame period and the second frame period,

     wherein the control unit sets the first frame period to an integral multiple of the flicker period and sets the second frame period to a frame period designated in advance when a flicker period is detected by the period detection unit.
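The period detection unit recited in claim 13 must recover the flicker period from the image signal itself. One common approach, offered here only as an illustrative sketch (the patent does not disclose this algorithm, and all names are invented), is to autocorrelate a one-dimensional brightness signal, such as per-frame or per-row means, and take the first strong autocorrelation peak as the period:

```python
import math

def detect_flicker_period(brightness, sample_rate):
    """Estimate the light-source flicker period in seconds from a 1-D
    brightness signal sampled at sample_rate Hz, via autocorrelation.
    Returns None when no clear periodicity is found."""
    mean = sum(brightness) / len(brightness)
    x = [v - mean for v in brightness]
    n = len(x)
    ac = [sum(x[i] * x[i + lag] for i in range(n - lag)) for lag in range(n)]
    if ac[0] <= 0:
        return None  # flat signal: no flicker energy
    for lag in range(1, n - 1):
        # first local maximum whose strength is comparable to the zero-lag energy
        if ac[lag] >= ac[lag - 1] and ac[lag] >= ac[lag + 1] and ac[lag] > 0.5 * ac[0]:
            return lag / sample_rate
    return None

# 100 Hz flicker (50 Hz mains) sampled at 10 kHz for 0.1 s
signal = [math.sin(2 * math.pi * 100 * i / 10000) for i in range(1000)]
```

For mains lighting, the recovered period would be 10 ms on a 50 Hz grid or about 8.3 ms on a 60 Hz grid, since the brightness peaks twice per mains cycle.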

  14.  The imaging device according to claim 13, wherein the control unit sets the second frame period to a frame period longer than the first frame period when a flicker period of the light source is detected.

  15.  The imaging device according to claim 13 or 14, wherein, when the second frame period is longer than the first frame period, the image processing unit selects some of the plurality of frames constituting the first moving image data to generate the second moving image data.

  16.  The imaging device according to any one of claims 13 to 15, wherein, when the second frame period is longer than the first frame period, the image processing unit combines a plurality of frames constituting the first moving image data to generate frames constituting the second moving image data.

  17.  The imaging device according to claim 13, wherein the control unit sets the second frame period to a period shorter than the first frame period when a flicker period of the light source is detected.

  18.  The imaging device according to claim 13 or 17, wherein, when the first frame period is longer than the second frame period, the image processing unit inserts interpolation frames into the frames constituting the first moving image data to generate frames constituting the second moving image data.

  19.  The imaging device according to claim 18, further comprising a display control unit that causes a display device to display the second moving image data as an image, wherein the display control unit displays the second moving image data excluding the interpolation frames.

  20.  The imaging device according to any one of claims 13 to 19, further comprising a phase detection unit that detects flicker phase information of the light source in the first moving image data, wherein the control unit controls the exposure timing of the frames constituting the first moving image data based on the flicker phase information.

  21.  The imaging device according to any one of claims 13 to 20, wherein the control unit has, as moving image capturing modes, a first moving image mode and a second moving image mode in which the exposure time of the frames constituting the moving image is set shorter than in the first moving image mode, and wherein, in the second moving image mode, the control unit sets the first frame period to an integral multiple of the flicker period according to a result of the detection.

  22.  The imaging device according to claim 21, wherein, in the second moving image mode, at least one of an autofocus speed, an automatic exposure tracking speed, and a white balance tracking speed is set higher than in the first moving image mode.

  23.  The imaging device according to any one of claims 13 to 22, further comprising a still image extraction unit that extracts a frame constituting the second moving image data as a still image.

  24.  An imaging method of an imaging device having, as moving image capturing modes, a first moving image mode and a second moving image mode in which the exposure time of frames constituting a moving image is set shorter than in the first moving image mode, the imaging method comprising:

     a period detection step of detecting a flicker period of a light source in a moving image;

     an imaging step of capturing a moving image having a first frame period;

     an image acquisition step of acquiring first moving image data representing the captured moving image;

     an image processing step of processing the first moving image data to generate second moving image data having a second frame period;

     a recording step of causing a recording device to record the second moving image data; and

     a control step of controlling the first frame period and the second frame period,

     wherein, in the control step, the first frame period is set to an integral multiple of the flicker period according to a result of the detection in the second moving image mode, and the first frame period is set independently of the flicker period in the first moving image mode.

  25.  An imaging method comprising:

     a detection step of detecting a flicker period of a light source in a moving image;

     an imaging step of capturing a moving image having a first frame period;

     an image acquisition step of acquiring first moving image data representing the captured moving image;

     an image processing step of processing the first moving image data to generate second moving image data having a second frame period;

     a recording step of causing a recording device to record the second moving image data; and

     a control step of controlling the first frame period and the second frame period,

     wherein, in the control step, the first frame period is set to an integral multiple of the flicker period according to a result of the detection, and the second frame period is set to a period designated in advance.

  26.  A program that causes an imaging device to execute the imaging method according to claim 24.

  27.  A program that causes an imaging device to execute the imaging method according to claim 25.
PCT/JP2019/041616 2019-01-29 2019-10-24 Imaging device, imaging method, and program WO2020158069A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020569370A JP7110406B2 (en) 2019-01-29 2019-10-24 IMAGING DEVICE, IMAGING METHOD, AND PROGRAM

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019013237 2019-01-29
JP2019-013237 2019-01-29

Publications (1)

Publication Number Publication Date
WO2020158069A1 2020-08-06

Family

ID=71840526

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/041616 WO2020158069A1 (en) 2019-01-29 2019-10-24 Imaging device, imaging method, and program

Country Status (2)

Country Link
JP (1) JP7110406B2 (en)
WO (1) WO2020158069A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022267763A1 (en) * 2021-06-24 2022-12-29 荣耀终端有限公司 Photographing method and related apparatus
WO2023121154A1 (en) * 2021-12-24 2023-06-29 Samsung Electronics Co., Ltd. A method and system for capturing a video in a user equipment
WO2024004519A1 (en) * 2022-06-28 2024-01-04 ソニーセミコンダクタソリューションズ株式会社 Information processing device and information processing method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008283707A (en) * 2008-06-30 2008-11-20 Fujifilm Corp Photographing apparatus and control method therefor
JP2009105693A (en) * 2007-10-24 2009-05-14 Olympus Corp Camera
JP2010103746A (en) * 2008-10-23 2010-05-06 Hoya Corp Imaging apparatus
JP2012160785A (en) * 2011-01-28 2012-08-23 Canon Inc Image-capturing apparatus and control method therefor
JP2014007622A (en) * 2012-06-26 2014-01-16 Casio Comput Co Ltd Image pick-up device and program
JP2016032214A (en) * 2014-07-29 2016-03-07 パナソニックIpマネジメント株式会社 Imaging device

Also Published As

Publication number Publication date
JPWO2020158069A1 (en) 2021-11-25
JP7110406B2 (en) 2022-08-01

Similar Documents

Publication Publication Date Title
US8582012B2 (en) Imaging apparatus and display control method in imaging apparatus
US8638374B2 (en) Image pickup apparatus, image pickup system, and image pickup method
JP5567235B2 (en) Image processing apparatus, photographing apparatus, program, and image processing method
CN113508575B (en) Method and system for high dynamic range processing based on angular rate measurements
US9088721B2 (en) Imaging apparatus and display control method thereof
WO2020158069A1 (en) Imaging device, imaging method, and program
JP2017168982A (en) Imaging apparatus, and control method and program for imaging apparatus
US11438521B2 (en) Image capturing device, image capturing method, and program
US11032483B2 (en) Imaging apparatus, imaging method, and program
JP5105139B2 (en) Imaging apparatus, display method, and program
JP2013062711A (en) Photographing device, photographed image processing method, and program
JP7060703B2 (en) Shooting equipment, shooting method, and program
JP2006339728A (en) Imaging apparatus and program thereof
WO2018181163A1 (en) Image processing device, image processing method and program
JP7191980B2 (en) IMAGING DEVICE, IMAGING METHOD, AND PROGRAM
JP7150053B2 (en) IMAGING DEVICE, IMAGING METHOD, AND PROGRAM
JP7003286B2 (en) Shooting equipment, shooting method, and program
JP6941744B2 (en) Image processing device, photographing device, image processing method and image processing program
JP6810298B2 (en) Image alignment aids, methods and programs and imaging devices
WO2015052959A1 (en) Imaging device, imaging system, imaging method and imaging program
JPWO2020161969A1 (en) Image processing device, photographing device, image processing method and image processing program
JP2008283455A (en) Imaging apparatus and imaging method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19913940

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020569370

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19913940

Country of ref document: EP

Kind code of ref document: A1