WO2015166711A1 - Distance-measurement device, distance-measurement method, and distance-measurement program

Distance-measurement device, distance-measurement method, and distance-measurement program

Info

Publication number
WO2015166711A1
Authority
WO
WIPO (PCT)
Prior art keywords
distance
unit
subject
light
measuring device
Prior art date
Application number
PCT/JP2015/056873
Other languages
French (fr)
Japanese (ja)
Inventor
Tomonori Masuda (増田 智紀)
Hiroshi Tamayama (玉山 宏)
Original Assignee
FUJIFILM Corporation (富士フイルム株式会社)
Priority date
Filing date
Publication date
Application filed by FUJIFILM Corporation (富士フイルム株式会社)
Publication of WO2015166711A1 publication Critical patent/WO2015166711A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 3/00 Measuring distances in line of sight; Optical rangefinders
    • G01C 3/02 Details
    • G01C 3/06 Use of electric means to obtain final indication
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/06 Systems determining position data of a target
    • G01S 17/08 Systems determining position data of a target for measuring distance only
    • G01S 17/10 Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 Lidar systems specially adapted for specific applications
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S 7/483 Details of pulse systems
    • G01S 7/484 Transmitters
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S 7/483 Details of pulse systems
    • G01S 7/486 Receivers
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/28 Systems for automatic generation of focusing signals
    • G02B 7/40 Systems for automatic generation of focusing signals using time delay of the reflected waves, e.g. of ultrasonic waves
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B 13/32 Means for focusing
    • G03B 13/34 Power focusing
    • G03B 13/36 Autofocus systems

Definitions

  • The technology of the present disclosure relates to a distance measuring device, a distance measuring method, and a distance measuring program.
  • A technique is known for measuring the distance to a distance measurement object by emitting laser light toward the object and receiving the reflected light.
  • Japanese Patent Application Laid-Open No. 2002-207163 proposes a television lens system including a distance measuring circuit. Specifically, Japanese Patent Application Laid-Open No. 2002-207163 describes performing autofocus based on a distance measurement result by a distance measurement circuit.
  • Japanese Patent Application Laid-Open No. 2008-96181 describes performing distance measurement, but does not take recording of the measurement results into consideration. In addition, there is room for improvement because photographing of the subject is not taken into consideration.
  • Japanese Patent Application Laid-Open No. 2002-207163 describes using the distance measurement result for autofocusing, but does not notify the user of the distance measurement result, and there is room for improvement in the use of the distance measurement result.
  • An object of the present invention is to provide a distance measuring device, a distance measuring method, and a distance measuring program capable of storing the measurement result of the distance to the subject and the moving image of the subject at the time of measurement in correspondence with each other with high accuracy.
  • A distance measuring apparatus according to a first aspect of the present invention includes: a photographing unit that performs moving image shooting to generate a plurality of consecutive frame images by capturing a subject image formed by an imaging optical system that forms the subject image representing a subject; an emission unit that emits directional light, which is light having directionality, along the optical axis direction of the imaging optical system; a light receiving unit that receives reflected light of the directional light from the subject; a derivation unit that derives the distance to the subject based on the timing at which the directional light is emitted by the emission unit and the timing at which the reflected light is received by the light receiving unit; and a storage unit that, when moving image shooting is performed by the photographing unit, stores distance data indicating the distance derived by the derivation unit in association with at least one frame image.
  • The derivation unit derives the distance at the timing at which a predetermined distance measurement trigger signal is generated during moving image shooting by the photographing unit. This makes it possible to derive the distance to the subject during moving image shooting.
  • The storage unit stores the distance data derived by the derivation unit in association with at least one of the frame images before and after the frame image at the timing when the predetermined distance measurement trigger signal is generated. Thereby, the frame image and the distance measurement data can be associated with each other.
  • When moving image shooting is performed by the photographing unit, the storage unit stores, for every predetermined plurality of frame images, the distance data obtained at the time of shooting the corresponding frame image in association with that frame image. Thereby, the captured moving image and the distance measurement data can be associated with each other.
  • In the fourth aspect of the present invention, the distance measuring apparatus further includes a frame image number setting unit for setting the number of frame images. Thereby, the number of frame images associated with the distance measurement data can be set arbitrarily.
  • The derivation unit derives the distance in synchronization with the frame images, and the storage unit stores individual distance data in association with each frame image. Thereby, each frame image of the captured moving image can be associated with its own distance measurement data.
  • The photographing unit performs moving image shooting while an instruction unit, which instructs the derivation unit to derive the distance to the subject, is instructing derivation of the distance. Thereby, derivation of the distance to the subject and moving image shooting can be performed easily.
  • The photographing unit starts moving image shooting when the distance is derived by the derivation unit. Thereby, the measurement result of the distance to the subject and the moving image of the subject at the time of measurement can be stored in correspondence with each other with high accuracy.
  • In any one of the first to eighth aspects of the present invention, a setting unit is further provided for setting in advance whether or not the photographing unit shoots a moving image when the derivation unit cannot derive the distance. Thereby, whether or not to store a moving image when the distance cannot be derived can be set arbitrarily.
  • The storage unit cancels storage when the derivation unit cannot derive the distance. As a result, storage of incomplete data lacking distance data can be prevented.
  • The derivation unit derives the distance when focus adjustment on the subject of the imaging optical system has been performed by the focus adjustment unit. Thereby, a moving image that is in focus at the time the distance is derived can be captured.
  • The derivation unit obtains the distance a plurality of times and derives the distance to the subject based on the frequency of the distances obtained the plurality of times. When the distance is derived, the distance range, or the time range from emission to reception of the directional light, to be used in obtaining the frequency is determined based on the adjustment result of the focus adjustment unit that performs focus adjustment on the subject of the imaging optical system, and the distance to the subject is derived with a resolution determined according to the determined range. As a result, the distance to the subject can be derived in fine numerical units.
  • The emission unit can adjust the emission intensity of the directional light and, when the distance is derived, emits the directional light with the emission intensity adjusted based on the adjustment result of the focus adjustment unit that performs focus adjustment on the subject of the imaging optical system. Thereby, the distance to the subject can be derived with an appropriate emission intensity that is not affected by ambient light noise.
  • The emission unit decreases the emission intensity as the focal length adjusted by the focus adjustment unit becomes shorter. Thereby, the distance to the subject can be derived with an appropriate emission intensity that is not affected by ambient light noise.
  • When the light receiving unit can adjust the light receiving sensitivity and the distance is derived, the light receiving unit receives the reflected light with the light receiving sensitivity adjusted based on the adjustment result of the focus adjustment unit that performs focus adjustment on the subject of the imaging optical system. As a result, the distance to the subject can be derived with an appropriate light receiving sensitivity that is not affected by ambient light noise.
  • The light receiving unit lowers the light receiving sensitivity as the focal length adjusted by the focus adjustment unit becomes shorter. Thereby, the distance to the subject can be derived with an appropriate light receiving sensitivity that is not affected by ambient light noise.
  • The emission unit can adjust the emission intensity of the directional light and emits the directional light with the emission intensity adjusted based on the subject luminance or the exposure state specifying information. Thereby, the distance to the subject can be derived with an appropriate emission intensity that is not affected by ambient light noise.
  • The emission unit decreases the emission intensity as the subject luminance becomes lower or as the exposure indicated by the exposure state specifying information becomes higher. Thereby, the distance to the subject can be derived with an appropriate emission intensity that is not affected by ambient light noise.
  • A display unit is further provided that displays the moving image obtained by the photographing unit and, in parallel with the display of the moving image, displays information on the distance to the subject derived by the derivation unit. Thereby, compared with the case where the information regarding the distance to the subject is not displayed in parallel with the display of the moving image, the distance measuring device allows the user to accurately grasp the relationship between the state of the subject and the distance to the subject.
  • The distance measurement by the emission unit, the light receiving unit, and the derivation unit is performed a predetermined number of times according to the subject luminance or the exposure state specifying information. Thereby, the distance measuring device according to the nineteenth aspect of the present invention can obtain a distance measurement result in which the influence of ambient light noise is reduced, compared with the case where the number of directional light emissions is fixed regardless of the subject luminance.
  • The distance measurement by the emission unit, the light receiving unit, and the derivation unit is performed a greater number of times as the subject luminance becomes higher or as the exposure indicated by the exposure state specifying information becomes lower. Thereby, the distance measuring apparatus according to the twenty-first aspect of the present invention can obtain a distance measurement result in which the influence of ambient light noise is reduced, compared with the case where the number of directional light emissions is fixed even though the subject luminance is high.
  • A distance measuring method according to the technology of the present disclosure includes: performing moving image shooting that generates frame images by capturing a subject image formed by an imaging optical system that forms the subject image representing a subject; receiving, with a light receiving unit, reflected light from the subject of directional light emitted from an emission unit that emits the directional light, which is light having directionality, along the optical axis direction of the imaging optical system; deriving the distance to the subject based on the timing at which the directional light is emitted and the timing at which the reflected light is received; and storing distance data indicating the derived distance in the storage unit in association with at least one frame image.
  • A distance measuring program according to the technology of the present disclosure causes a computer to execute processing including: performing moving image shooting that generates frame images by continuously capturing a subject image formed by an imaging optical system that forms the subject image representing a subject; receiving, with the light receiving unit, reflected light from the subject of directional light emitted from the emission unit that emits the directional light, which is light having directionality, along the optical axis direction of the imaging optical system; deriving the distance to the subject based on the timing at which the directional light is emitted and the timing at which the reflected light is received; and storing distance data indicating the derived distance in the storage unit in association with at least one frame image. As a result, the measurement result of the distance to the subject and the moving image of the subject at the time of measurement can be stored in correspondence with each other with high accuracy.
  • According to the present invention, it is possible to obtain the effect that the measurement result of the distance to the subject and the moving image of the subject at the time of measurement can be stored in correspondence with each other with high accuracy.
  • A timing chart illustrating an example of the timing from light emission to light reception in one measurement in the distance measuring device of the embodiment. A graph showing an example of a histogram of measured values with the distance to the subject on the horizontal axis and the number of measurements on the vertical axis.
  • A flowchart illustrating an example of the flow of processing performed by the main control unit when performing distance measurement and moving image shooting in the distance measuring apparatus according to the embodiment.
  • A block diagram for explaining adjustment of the laser light emission intensity and the photodiode light receiving sensitivity based on the AF result and the AE result. A conceptual diagram showing an example of the structure of the light emission count determination table. A flowchart showing an example of the flow of the luminance information transmission process. A flowchart showing an example of the flow of the light emission count determination process. A conceptual diagram showing another example of the structure of the light emission count determination table. A flowchart showing another example of the flow of the exposure state specifying information transmission process. A flowchart showing another example of the flow of the light emission count determination process.
  • FIG. 1 is a block diagram illustrating an example of a configuration of a main part of a distance measuring device 10 according to the present embodiment.
  • the distance measuring device 10 of the present embodiment has a function of measuring a distance and a function of generating a photographed image obtained by photographing a subject.
  • the distance measuring apparatus 10 includes a control unit 20, a light emitting lens 30, a laser diode 32, a light receiving lens 34, a photodiode 36, an imaging optical system 40, an image sensor 42, an operation unit 44, a viewfinder 46, And a storage unit 48.
  • the control unit 20 includes a time counter 22, a distance measurement control unit 24, and a main control unit 26.
  • the time counter 22 has a function of generating a count signal at predetermined intervals in accordance with a signal (for example, a clock pulse) input from the main control unit 26 via the distance measurement control unit 24.
  • the distance measurement control unit 24 has a function of measuring a distance according to the control of the main control unit 26.
  • the distance measurement control unit 24 of the present embodiment performs distance measurement by controlling the driving of the laser diode 32 at a timing according to the count signal generated by the time counter 22.
  • the ranging control unit 24 functions as a derivation unit according to the technique of the present disclosure.
  • Specific examples of the distance measurement control unit 24 include an ASIC (Application Specific Integrated Circuit) and an FPGA (Field-Programmable Gate Array).
  • the distance measurement control unit 24 of the present embodiment has a storage unit (not shown).
  • Specific examples of the storage unit included in the distance measurement control unit 24 include a nonvolatile storage unit such as a ROM (Read Only Memory) and a volatile storage unit such as a RAM (Random Access Memory).
  • the main control unit 26 has a function of controlling the entire distance measuring device 10. Further, the main control unit 26 of the present embodiment has a function of controlling the imaging optical system 40 and the image sensor 42 to photograph a subject and generating a photographed image (subject image).
  • the main control unit 26 functions as a control unit, a luminance detection unit, a focus adjustment unit, and an exposure adjustment unit according to the technique of the present disclosure. Specific examples of the main control unit 26 include a CPU (Central Processing Unit).
  • The main control unit 26 of the present embodiment also has a storage unit (not shown). Specific examples of this storage unit include a nonvolatile storage unit such as a ROM and a volatile storage unit such as a RAM. A program such as the moving image distance measurement process described later is stored in the ROM in advance.
  • a program such as a moving image ranging process does not necessarily need to be stored in the main control unit 26 from the beginning.
  • a program such as a moving image ranging process may be stored in an arbitrary portable storage medium such as an SSD (Solid State Drive), a CD-ROM, a DVD, a magneto-optical disk, and an IC card.
  • the distance measuring device 10 may acquire a program such as a moving image distance measuring process from a portable storage medium in which the program is stored, and store it in the main control unit 26 or the like.
  • the distance measuring device 10 may acquire a program such as a moving image distance measuring process from another external device via the Internet or a LAN (Local Area Network), and store the program in the main control unit 26 or the like.
  • the operation unit 44 is a user interface operated by the user when giving various instructions to the distance measuring device 10.
  • the operation unit 44 includes a release button, a distance measurement instruction button, and buttons and keys used when the user gives various instructions (all not shown).
  • Various instructions received by the operation unit 44 are output as operation signals to the main control unit 26, and the main control unit 26 executes processing according to the operation signals input from the operation unit 44.
  • the release button of the operation unit 44 detects a two-stage pressing operation of a shooting preparation instruction state and a shooting instruction state.
  • The shooting preparation instruction state refers to, for example, a state where the release button is pressed from the standby position to the intermediate position (half-pressed position), and the shooting instruction state refers to a state where the release button is pressed to the final pressed position (fully-pressed position) beyond the intermediate position. In the following, the state where the release button is pressed from the standby position to the half-pressed position is referred to as the "half-pressed state", and the state where the release button is pressed from the standby position or the half-pressed position to the fully-pressed position is referred to as the "fully-pressed state".
  • the manual focus mode and the autofocus mode are selectively set according to a user instruction.
  • When the release button of the operation unit 44 is pressed halfway, the shooting conditions are adjusted: the AE (Automatic Exposure) function works to set the exposure state, and the AF (Auto Focus) function works to control focusing. Exposure (shooting) is then performed when the release button is fully pressed.
  • the AE function or the AF function may be activated by a moving image shooting instruction or the like of the operation unit 44.
  • the main control unit 26 transmits exposure state specifying information for specifying the current exposure state, which is a result obtained by performing AE, to the distance measurement control unit 24. Further, the main control unit 26 transmits focus state specifying information for specifying the current focus state, which is a result obtained by performing AF, to the distance measurement control unit 24.
  • An example of the exposure state specifying information includes an F value and a shutter speed derived from a so-called AE evaluation value that is uniquely determined according to the subject brightness. Another example of the exposure state specifying information is an AE evaluation value.
  • An example of the focus state specifying information is a subject distance obtained by AF.
  • A non-volatile memory is used for the storage unit 48, which mainly stores image data obtained by photographing. Specific examples of the storage unit 48 include a flash memory and an HDD (Hard Disk Drive).
  • the viewfinder 46 has a function of displaying image and character information.
  • The viewfinder 46 of the present embodiment is an electronic viewfinder (hereinafter referred to as "EVF") and is used, in the shooting mode, to display a live view image (through image), which is an example of continuous frame images obtained by shooting continuous frames.
  • the viewfinder 46 is also used to display a still image that is an example of a single frame image obtained by shooting a single frame when a still image shooting instruction is given.
  • the viewfinder 46 is also used for displaying a playback image and a menu screen in the playback mode.
  • the imaging optical system 40 includes a photographing lens including a focus lens, a motor, a slide mechanism, and a shutter (all not shown).
  • the slide mechanism moves the focus lens along the optical axis direction (not shown) of the imaging optical system 40.
  • a focus lens is attached to the slide mechanism so as to be slidable along the optical axis direction.
  • a motor is connected to the slide mechanism, and the slide mechanism slides the focus lens along the optical axis direction under the power of the motor.
  • the motor is connected to the main control unit 26 of the control unit 20, and driving is controlled in accordance with a command from the main control unit 26.
  • a stepping motor is applied as a specific example of the motor. Accordingly, the motor operates in synchronization with the pulse power according to a command from the main control unit 26.
  • The main control unit 26 performs focus control by controlling the drive of the motor of the imaging optical system 40 so that the contrast value of the image obtained by imaging with the image sensor 42 is maximized.
  • the main control unit 26 calculates AE information that is a physical quantity indicating the brightness of an image obtained by imaging.
  • the main control unit 26 derives a shutter speed and an F value (aperture value) corresponding to the brightness of the image indicated by the AE information. Then, the main control unit 26 performs exposure adjustment by controlling each related unit so that the derived shutter speed and F value are obtained.
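  • The patent does not specify how the AE evaluation value is mapped to a shutter speed and F value; purely as general photographic background, the conventional exposure-value relation can be sketched as follows (the function names and the implicit ISO 100 assumption are illustrative, not taken from this document).

```python
import math

def exposure_value(f_number: float, shutter_time_s: float) -> float:
    """Exposure value of an aperture/shutter combination (ISO 100 convention)."""
    return math.log2(f_number ** 2 / shutter_time_s)

def shutter_for_target_ev(target_ev: float, f_number: float) -> float:
    """Shutter time (seconds) that realises `target_ev` at a fixed F number."""
    return f_number ** 2 / (2.0 ** target_ev)

print(round(exposure_value(8.0, 1 / 125), 2))  # ~13.0 EV
print(shutter_for_target_ev(13.0, 8.0))        # 0.0078125 s, about 1/128 s
```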
  • the image sensor 42 is an image sensor provided with a color filter (not shown), and functions as an imaging unit according to the technique of the present disclosure.
  • a CMOS image sensor is used as an example of the image sensor 42.
  • the image sensor 42 is not limited to a CMOS image sensor, and may be a CCD image sensor, for example.
  • the color filter includes a G filter corresponding to G (green) that contributes most to obtain a luminance signal, an R filter corresponding to R (red), and a B filter corresponding to B (blue).
  • Each pixel (not shown) of the image sensor 42 is assigned one of the filters “R”, “G”, and “B” included in the color filter.
  • When a subject is photographed, image light representing the subject is formed on the light receiving surface of the image sensor 42 via the imaging optical system 40.
  • the image sensor 42 has a plurality of pixels (not shown) arranged in a matrix in the horizontal direction and the vertical direction, and signal charges corresponding to image light are accumulated in the pixels of the image sensor 42.
  • the signal charge accumulated in the image sensor 42 is sequentially read out as a digital signal corresponding to the signal charge (voltage) based on the control of the main control unit 26.
  • the image sensor 42 has a so-called electronic shutter function, and controls the charge accumulation time (shutter speed) of each photosensor according to the timing based on the control of the main control unit 26 by using the electronic shutter function.
  • the image sensor 42 outputs a digital signal indicating the pixel value of the photographed image from each pixel.
  • the photographed image output from each pixel is a chromatic image, for example, a color image having the same color array as the pixel array.
  • The captured image (frame) output from the image sensor 42 is input via the main control unit 26 and temporarily stored (overwritten) in the main control unit 26 or in a predetermined RAW image storage area (not shown) in the storage unit 48.
  • the main control unit 26 performs various image processing on the frame.
  • The main control unit 26 includes a WB (White Balance) gain unit, a gamma correction unit, and a synchronization processing unit (all not shown), and sequentially performs signal processing in each processing unit on the original digital signal (RAW image) temporarily stored in the main control unit 26 or the like. That is, the WB gain unit performs white balance (WB) adjustment by adjusting the gains of the R, G, and B signals.
  • the gamma correction unit performs gamma correction on the R, G, and B signals that have been subjected to WB adjustment by the WB gain unit.
  • the synchronization processing unit performs color interpolation processing corresponding to the color filter array of the image sensor 42 and generates synchronized R, G, and B signals.
  • the main control unit 26 performs image processing on the RAW image in parallel every time a RAW image for one screen is acquired by the image sensor 42.
  • the main control unit 26 outputs the generated image data of the recorded image for recording to an encoder (not shown) that converts the input signal into a signal of another format.
  • the R, G, and B signals processed by the main control unit 26 are converted (encoded) into recording signals by the encoder and recorded in the storage unit 48.
  • the captured image for display processed by the main control unit 26 is output to the viewfinder 46.
  • In the following, when it is not necessary to distinguish between the above "recorded image for recording" and "captured image for display", the terms "for recording" and "for display" are omitted and both are simply referred to as a "captured image".
  • the main control unit 26 of the present embodiment displays the live view image on the viewfinder 46 by performing control to continuously display the captured image for display as a moving image.
  • the light emitting lens 30 and the laser diode 32 function as an example of an emission unit according to the technique of the present disclosure.
  • The laser diode 32 is driven based on an instruction from the distance measurement control unit 24 and has a function of emitting laser light via the light emitting lens 30 in the optical axis direction of the imaging optical system 40 toward the subject to be measured.
  • a specific example of the light emitting lens 30 of the present embodiment includes an objective lens.
  • the laser light emitted from the laser diode 32 is an example of directional light according to the technique of the present disclosure.
  • the light receiving lens 34 and the photodiode 36 function as an example of a light receiving unit according to the technique of the present disclosure.
  • the photodiode 36 has a function of receiving laser light emitted from the laser diode 32 and reflected by the subject through the light receiving lens 34 and outputting an electric signal corresponding to the amount of received light to the distance measurement control unit 24.
  • the main control unit 26 instructs the distance measurement control unit 24 to perform distance measurement. Specifically, in this embodiment, the main control unit 26 instructs the distance measurement control unit 24 to perform distance measurement by transmitting a distance measurement instruction signal to the distance measurement control unit 24. In addition, the main control unit 26 transmits a synchronization signal for synchronizing the distance measurement operation and the photographing operation to the distance measurement control unit 24 when measuring the distance to the subject and photographing the subject in parallel.
  • When receiving the synchronization signal and the distance measurement instruction signal, the distance measurement control unit 24 causes laser light to be emitted toward the subject by controlling the light emission timing of the laser diode 32 according to the count signal of the time counter 22. In addition, the distance measurement control unit 24 samples the electrical signal corresponding to the amount of received light output from the photodiode 36 at a timing according to the count signal of the time counter 22.
  • The distance measurement control unit 24 derives the distance to the subject based on the light emission timing at which the laser diode 32 emits the laser light and the light reception timing at which the photodiode 36 receives the laser light, and outputs distance data representing the derived distance to the main control unit 26.
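  • As a rough illustration of how a distance follows from the emission and reception timings, a minimal time-of-flight sketch in Python is shown below; the function name, the nanosecond timestamp representation, and the example values are assumptions for illustration, not part of the patent.

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def distance_from_timestamps(emit_time_ns: float, receive_time_ns: float) -> float:
    """One-way distance in metres derived from emission/reception timestamps.

    The laser light travels to the subject and back, so the measured
    round-trip time is halved before converting to distance.
    """
    round_trip_s = (receive_time_ns - emit_time_ns) * 1e-9
    return round_trip_s * SPEED_OF_LIGHT_M_PER_S / 2.0

# Example: a round trip of about 6.67 microseconds corresponds to roughly 1 km.
print(distance_from_timestamps(0.0, 6_670.0))  # ~1000 m
```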
  • the main control unit 26 causes the viewfinder 46 to display information related to the distance to the subject based on the distance data. Further, the main control unit 26 stores the distance data in the storage unit 48.
  • FIG. 2 is a timing chart showing an example of the timing of a distance measuring operation for measuring the distance to the subject in the distance measuring apparatus 10 according to the embodiment.
  • a single distance measurement (measurement) sequence includes a voltage adjustment period, a measurement period, and a pause period.
  • the voltage adjustment period is a period during which the drive voltages of the laser diode 32 and the photodiode 36 are adjusted to appropriate voltage values.
  • In the present embodiment, the voltage adjustment period is set to several hundred msec (milliseconds).
  • The actual measurement period is a period during which the distance to the subject is actually measured.
  • In the actual measurement period, the operation of emitting the laser light and receiving the laser light reflected by the subject is repeated several hundred times, and the distance to the subject is measured by measuring the elapsed time from light emission to light reception. That is, in the distance measuring device 10 of the present embodiment, the distance to the subject is measured several hundred times in one measurement sequence.
  • FIG. 3 shows an example of a timing chart showing the timing from light emission to light reception in one measurement.
  • the distance measurement control unit 24 outputs a laser trigger for causing the laser diode 32 to emit light to the laser diode 32 in accordance with the count signal of the time counter 22.
  • the laser diode 32 emits light in response to the laser trigger.
  • In the present embodiment, the light emission time of the laser diode 32 is set to several tens of nsec (nanoseconds).
  • the emitted laser light is emitted in the optical axis direction of the imaging optical system 40 toward the subject via the light emitting lens 30.
  • the laser light emitted from the distance measuring device 10 is reflected by the subject and reaches the distance measuring device 10.
  • the photodiode 36 of the distance measuring device 10 receives the reflected laser light via the light receiving lens 34.
  • In the present embodiment, a case is described in which distance measurement is performed on a subject whose distance from the distance measuring device 10 is within several km.
  • The time required for the laser light emitted from the laser diode 32 through the light emitting lens 30 toward a subject several km away to return and be received is several km × 2 / speed of light ≈ several μsec (microseconds). Therefore, in order to measure the distance to a subject several km away, as shown in FIG. 2, a time of at least several μsec is required.
  • In consideration of this, the measurement time per measurement is set to several msec in a specific example. Since the round trip time of the laser light varies depending on the distance to the subject, the measurement time per measurement may be varied according to the distance assumed by the distance measuring device 10.
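  • For reference, the timing figures quoted above can be checked with a few lines of Python (the helper name and the example distances are illustrative only).

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def round_trip_time_us(distance_m: float) -> float:
    """Round-trip travel time of the laser light, in microseconds."""
    return distance_m * 2.0 / SPEED_OF_LIGHT_M_PER_S * 1e6

print(round_trip_time_us(1_000))  # ~6.7 us for a subject 1 km away
print(round_trip_time_us(3_000))  # ~20 us for a subject 3 km away

# A per-measurement slot of a few milliseconds is therefore dominated by
# drive/sampling margin rather than by the light's travel time itself.
```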
  • the distance measurement control unit 24 derives the distance to the subject based on the measurement value measured several hundred times as described above.
  • the distance measurement control unit 24 of the present embodiment derives the distance to the subject by analyzing a histogram of measured values for several hundred times.
  • FIG. 4 is a graph showing an example of a histogram of measured values when the distance to the subject is on the horizontal axis and the number of measurements is on the vertical axis.
  • the distance measurement control unit 24 derives the distance to the subject corresponding to the maximum number of times of measurement in the histogram as a measurement result, and outputs distance data indicating the derived measurement result to the main control unit 26.
  • a histogram may be generated based on the round trip time of laser light (elapsed time from light emission to light reception), 1/2 of the round trip time of laser light, or the like.
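  • A minimal sketch of this histogram-based derivation is shown below; the bin width, the function name, and the use of Python's collections.Counter are illustrative assumptions rather than the patent's implementation.

```python
from collections import Counter
from typing import Sequence

def derive_distance_from_histogram(measured_distances_m: Sequence[float],
                                   bin_width_m: float = 1.0) -> float:
    """Return the centre of the most frequent distance bin.

    Each of the several hundred per-sequence measurements is quantized into a
    bin of width `bin_width_m`, and the bin with the largest count is taken
    as the measurement result, mirroring the mode of the histogram in FIG. 4.
    """
    bins = Counter(round(d / bin_width_m) for d in measured_distances_m)
    most_common_bin, _count = bins.most_common(1)[0]
    return most_common_bin * bin_width_m

# Example: noisy measurements clustered around 42 m.
samples = [41.8, 42.1, 42.3, 41.9, 55.0, 42.0, 12.4, 42.2]
print(derive_distance_from_histogram(samples))  # 42.0
```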
  • the pause period is a period for stopping the driving of the laser diode 32 and the photodiode 36.
  • In the present embodiment, the pause period is set to several hundred msec.
  • One measurement period is set to several hundred msec.
  • shooting and distance measurement are performed in synchronization.
  • the main control unit 26 of the distance measuring device 10 of the present embodiment displays the live view image on the viewfinder 46 as described above.
  • the main control unit 26 displays a live view image by displaying a captured image captured at several tens of fps on the viewfinder 46 as a moving image. Therefore, 30 live view images are displayed on the viewfinder 46 during one measurement period. Further, the main control unit 26 displays the measurement result up to the subject on the viewfinder 46 in a superimposed manner on the live view image.
  • FIG. 5 is a flowchart illustrating an example of the flow of processing performed by the main control unit when performing distance measurement and moving image shooting. The process shown in FIG. 5 is started when the main control unit 26 executes the program for the moving image distance measurement process, and is executed, for example, when the power switch is turned on and moving image shooting is selected.
  • In step 100, the main control unit 26 starts the live view operation.
  • Specifically, the main control unit 26 performs control to continuously display, as a moving image, the captured images obtained by imaging with the imaging optical system 40 and the image sensor 42, thereby displaying the live view image on the viewfinder 46.
  • In step 102, the main control unit 26 determines whether or not moving image shooting is to be started. In this determination, the main control unit 26 determines whether or not an operation for starting moving image shooting has been performed on the operation unit 44. If the determination is affirmative, the process proceeds to step 104; if the determination is negative, the process proceeds to step 120.
  • In step 104, the main control unit 26 controls the imaging optical system 40, performs AE and AF, and proceeds to step 106.
  • By performing AE, the exposure state is set, and by performing AF, the focus is adjusted so that the image light representing the subject is focused on the light receiving surface of the image sensor 42.
  • In step 106, the main control unit 26 determines whether or not an AE or AF error has occurred. If the determination is affirmative, the process returns to step 104 to perform AE and AF again; if the determination is negative, the process proceeds to step 108.
  • In step 108, the main control unit 26 controls the image sensor 42 to start moving image shooting, and proceeds to step 110. Thereby, moving image shooting is started, and the main control unit 26 sequentially acquires the frame images obtained from the image sensor 42. That is, since moving image shooting is started when there is no AE or AF error, a moving image without blurring can be shot during distance measurement.
  • In step 110, the main control unit 26 determines whether or not a distance measurement start instruction has been issued. In this determination, the main control unit 26 determines whether or not an operation for starting distance measurement has been performed on the operation unit 44. If the determination is negative, the process proceeds to step 112; if the determination is affirmative, the process proceeds to step 114.
  • In step 112, the main control unit 26 sequentially stores the frame images sequentially obtained from the image sensor 42 in the storage unit 48, and proceeds to step 118.
  • In step 114, the main control unit 26 outputs a distance measurement start instruction to the distance measurement control unit 24, and proceeds to step 116.
  • Thereby, the distance measurement control unit 24 measures the distance to the subject, and distance measurement data indicating the distance to the subject is output to the main control unit 26 as the measurement result. This makes it possible to derive the distance to the subject during moving image shooting.
  • In step 116, the main control unit 26 stores the distance measurement data output from the distance measurement control unit 24 in the storage unit 48 in correspondence with the frame image at the distance measurement timing, and proceeds to step 118.
  • The correspondence between distance measurement data and frame images may be as shown in FIG. 6A. That is, the frame image and the distance measurement data may be associated with each other by storing the distance measurement data in correspondence with at least one of the frame images before and after the frame image at the timing when the distance measurement start instruction (distance measurement trigger signal) is generated.
  • Alternatively, the distance measurement data may be stored in association with every predetermined number of frame images.
  • In this case, the number of frame images for which distance measurement data is stored can be set by the operation unit 44, so the number of frame images corresponding to the distance measurement data can be set arbitrarily.
  • Further, by lowering the frame rate or decreasing the number of laser light samplings during distance measurement, the distance may be measured in synchronization with the frame images as shown in FIG. 6C, and by storing individual distance measurement data in correspondence with each frame image, each frame image of the captured moving image may be associated with its own distance measurement data.
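  • The association schemes described above (a frame adjacent to the trigger as in FIG. 6A, every predetermined number of frames, or every frame as in FIG. 6C) could be recorded with a simple mapping such as the following sketch; the class and method names are hypothetical and only illustrate the bookkeeping, not the patent's storage format.

```python
from typing import Dict, List, Optional

class RangingLog:
    """Associate distance measurement data with frame indices of a moving image."""

    def __init__(self) -> None:
        self.frames: List[bytes] = []                  # encoded frame images
        self.distance_by_frame: Dict[int, float] = {}  # frame index -> distance [m]

    def add_frame(self, frame: bytes, distance_m: Optional[float] = None) -> int:
        """Store a frame; optionally attach the distance measured at that time."""
        self.frames.append(frame)
        index = len(self.frames) - 1
        if distance_m is not None:
            self.distance_by_frame[index] = distance_m
        return index

    def attach_to_frame(self, frame_index: int, distance_m: float) -> None:
        """Attach data to an existing frame, e.g. the frame at the trigger timing."""
        self.distance_by_frame[frame_index] = distance_m
```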
  • In step 118, the main control unit 26 determines whether or not the end of moving image shooting has been instructed. In this determination, the main control unit 26 determines whether or not an operation indicating the end of moving image shooting has been performed on the operation unit 44. If the determination is negative, the process returns to step 110 and the above-described processing is repeated; if the determination is affirmative, the process proceeds to step 120.
  • In step 120, the main control unit 26 determines whether or not the power is turned off. In this determination, the main control unit 26 determines whether or not the power switch included in the operation unit 44 has been operated. If the determination is negative, the process returns to step 102 and the above-described processing is repeated; if the determination is affirmative, the process proceeds to step 122.
  • In step 122, the main control unit 26 stops the live view operation and ends the series of processes.
  • Since the main control unit 26 performs the processing in this way, measurement of the distance to the subject and moving image shooting are performed, and the measurement result of the distance to the subject and the moving image at the time of distance measurement can be stored in correspondence with each other with high accuracy.
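  • As a compact summary of the FIG. 5 flow described above, the following Python sketch mirrors the step structure; every helper (start_live_view, run_ae_af, measure, and so on) is a placeholder for the operations in the flowchart, not an actual API of the distance measuring device 10.

```python
def movie_ranging_loop(camera, rangefinder, storage, ui):
    """Hedged sketch of the FIG. 5 flow: live view, AE/AF, then movie shooting with optional ranging."""
    camera.start_live_view()                      # step 100
    while not ui.power_off_requested():           # step 120
        if not ui.movie_start_requested():        # step 102
            continue
        camera.run_ae_af()                        # step 104
        while not camera.ae_af_ok():              # step 106: retry on AE/AF error
            camera.run_ae_af()
        camera.start_movie()                      # step 108
        while not ui.movie_end_requested():       # step 118
            frame = camera.next_frame()
            if ui.ranging_start_requested():      # step 110
                distance = rangefinder.measure()  # step 114
                storage.save(frame, distance)     # step 116
            else:
                storage.save(frame)               # step 112
    camera.stop_live_view()                       # step 122
```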
  • FIG. 7 is a flowchart illustrating a modified example of processing performed by the main control unit 26 when performing distance measurement and moving image shooting in the distance measuring device 10 according to the embodiment.
  • In the processing shown in FIG. 5, the distance to the subject is measured in response to a distance measurement start instruction issued after moving image shooting is started, and the distance measurement data is stored in correspondence with the frame images of the moving image.
  • In the modified example, by contrast, moving image shooting is performed while the distance measurement instruction is being given.
  • In step 100, the main control unit 26 starts the live view operation.
  • Specifically, the main control unit 26 performs control to continuously display, as a moving image, the captured images obtained by imaging with the imaging optical system 40 and the image sensor 42, thereby displaying the live view image on the viewfinder 46.
  • In step 101, the main control unit 26 determines whether or not a distance measurement start instruction has been issued. In this determination, the main control unit 26 determines whether or not a distance measurement instruction operation has been performed on the operation unit 44. If the determination is negative, the process proceeds to step 120; if the determination is affirmative, the process proceeds to step 104. In the modified example, the distance measurement start operation is performed continuously on the operation unit 44, whereby measurement of the distance to the subject and moving image shooting are performed.
  • In step 104, the main control unit 26 controls the imaging optical system 40, performs AE and AF, and proceeds to step 106.
  • By performing AE, the exposure state is set, and by performing AF, the focus is adjusted so that the image light representing the subject is focused on the light receiving surface of the image sensor 42.
  • In step 106, the main control unit 26 determines whether or not an AE or AF error has occurred. If the determination is affirmative, the process returns to step 104 to perform AE and AF again; if the determination is negative, the process proceeds to step 108.
  • In step 108, the main control unit 26 starts moving image shooting by controlling the image sensor 42, and proceeds to step 114. Thereby, moving image shooting is started, and the main control unit 26 sequentially acquires the frame images obtained from the image sensor 42. That is, since moving image shooting is started when there is no AE or AF error, a moving image that is in focus at the time of distance measurement can be shot.
  • In step 114, the main control unit 26 outputs a distance measurement start instruction to the distance measurement control unit 24, and proceeds to step 116.
  • Thereby, the distance measurement control unit 24 measures the distance to the subject, and distance measurement data indicating the distance to the subject is output to the main control unit 26 as the measurement result.
  • In step 116, the main control unit 26 stores the distance measurement data output from the distance measurement control unit 24 in the storage unit 48 in correspondence with the frame image at the distance measurement timing, and proceeds to step 117.
  • The distance measurement data may be associated with a frame image as shown in FIG. 6A; that is, it may be stored in correspondence with at least one of the frame images before and after the frame image at the timing when the distance measurement start instruction (distance measurement trigger signal) is generated.
  • Alternatively, the distance measurement data may be stored in association with every predetermined number of frame images. In this case, the number of frame images for which distance measurement data is stored may be set by the operation unit 44.
  • Further, the distance measurement data may be stored in correspondence with each frame image by lowering the frame rate or decreasing the number of samplings during distance measurement.
  • In step 117, the main control unit 26 determines whether or not a distance measurement stop instruction has been issued. In this determination, the main control unit 26 determines whether or not a distance measurement stop operation has been performed on the operation unit 44; in the modified example, this corresponds to determining whether or not the distance measurement instruction operation on the operation unit 44 has been released. If the determination is negative, the process returns to step 116 and the above-described processing is repeated; if the determination is affirmative, the process proceeds to step 119.
  • In step 119, the main control unit 26 controls the image sensor 42 to stop moving image shooting, and proceeds to step 120.
  • In step 120, the main control unit 26 determines whether or not the power is turned off. In this determination, the main control unit 26 determines whether or not the power switch included in the operation unit 44 has been operated. If the determination is negative, the process returns to step 101 and the above-described processing is repeated; if the determination is affirmative, the process proceeds to step 122.
  • In step 122, the main control unit 26 stops the live view operation and ends the series of processes.
  • Since the main control unit 26 performs the processing in this manner, moving image shooting is performed while the distance measurement instruction is being given, so that measurement of the distance to the subject and moving image shooting can be performed easily.
  • There may be cases where the distance measurement control unit 24 cannot measure the distance to the subject (a distance measurement error).
  • In such a case, the main control unit 26 may cancel storage in the storage unit 48 to prevent storing incomplete data lacking distance measurement data.
  • The processing performed after the main control unit 26 transmits the distance measurement instruction signal to the distance measurement control unit 24 in step 114 described above and the distance measurement control unit 24 measures the distance to the subject is shown in FIG. 8.
  • FIG. 8 is a flowchart showing a part of the processing when a distance measurement error occurs.
  • In step 115, the main control unit 26 determines whether or not a distance measurement error has occurred. This determination is made by determining whether or not the main control unit 26 has received a signal indicating a distance measurement error from the distance measurement control unit 24. If the determination is affirmative, the process proceeds to step 121; if the determination is negative, the process proceeds to step 116 described above.
  • In the present embodiment, a case is described in which a signal is transmitted from the distance measurement control unit 24 to the main control unit 26 when a distance measurement error occurs, but the user may instead make the determination when no distance measurement result is displayed.
  • In step 121, the main control unit 26 stops storing the frame images and the distance measurement data in the storage unit 48. Note that, since the frame images are obtained, only the storage of the distance measurement data may be stopped.
  • In step 116, on the other hand, the main control unit 26 stores the distance measurement data output from the distance measurement control unit 24 in the storage unit 48 in correspondence with the frame image at the distance measurement timing.
  • As described above, the distance measurement control unit 24 derives the distance to the subject by performing the measurement a plurality of times (for example, several hundred times) by emitting and receiving the laser light.
  • The focus adjustment result may be used in this derivation. For example, when analyzing the histogram (FIG. 4) generated from the plurality of measurement results obtained by laser light emission and reception, the distance range to the subject (the focal length and a nearby range including it) can be known from the AF result. Therefore, as shown in FIG. 9A, the measurement results outside the AF-based subject distance range (the hatched portion in FIG. 9A) are not used, and the distance to the subject is derived using only the measurement results within the AF-based subject distance range.
  • Thereby, the resolution of the distance range used when calculating the frequency can be made higher than when all the measured values are used, and the distance to the subject can be derived in fine numerical units.
  • In FIG. 9A, an example is shown in which measured values both shorter and longer than the AF-based subject distance range are excluded, but only one of them may be excluded. That is, the distance to the subject may be derived without using the measured values shorter than the AF-based subject distance (the hatched portion in FIG. 9B) or without using the measured values longer than the AF-based subject distance (the hatched portion in FIG. 9C). Further, the result of manual focus adjustment in the manual focus mode may be used instead of the AF result.
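  • A hedged sketch of this AF-gated histogram analysis follows; the gating bounds, the bin width, and the function name are illustrative assumptions.

```python
from collections import Counter
from typing import Optional, Sequence

def derive_distance_af_gated(measured_distances_m: Sequence[float],
                             af_min_m: Optional[float],
                             af_max_m: Optional[float],
                             bin_width_m: float = 0.1) -> Optional[float]:
    """Derive the subject distance using only measurements inside the AF-based range.

    Passing None for af_min_m or af_max_m leaves that side open, which
    corresponds to discarding only the shorter (FIG. 9B) or only the longer
    (FIG. 9C) measured values. A narrower range permits a finer bin width.
    """
    gated = [d for d in measured_distances_m
             if (af_min_m is None or d >= af_min_m)
             and (af_max_m is None or d <= af_max_m)]
    if not gated:
        return None  # every measurement was rejected: treat as a ranging error
    bins = Counter(round(d / bin_width_m) for d in gated)
    return bins.most_common(1)[0][0] * bin_width_m
```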
  • In the above description, the main control unit 26 starts the distance measurement by transmitting the distance measurement instruction signal after starting moving image shooting.
  • However, the order may be reversed.
  • In this case, as shown in FIG. 10A, the main control unit 26 outputs the distance measurement instruction signal to the distance measurement control unit 24 in step 200.
  • Thereby, the distance measurement control unit 24 starts distance measurement; if there is no distance measurement error, the distance measurement data is transmitted to the main control unit 26, and in the case of a distance measurement error, a signal indicating the distance measurement error is transmitted to the main control unit 26.
  • The main control unit 26 then determines whether or not a distance measurement error has occurred.
  • In step 204, the main control unit 26 starts moving image shooting, and in step 206, the main control unit 26 stores the distance measurement data in correspondence with the frame image. If a distance measurement error occurs at this time, whether or not to shoot a moving image may be set in advance by the operation unit 44. Thereby, whether or not to store a moving image when distance measurement is impossible can be set arbitrarily.
  • The focus state specifying information for specifying the AF result (or the manual focus adjustment result) and the exposure state specifying information for specifying the AE result may be acquired from the main control unit 26, and the driving of at least one of the laser diode 32 and the photodiode 36 may be adjusted based on the acquired focus state specifying information and exposure state specifying information. That is, since the approximate distance to the subject is known from the focus adjustment result (focal length), the emission intensity of the laser light emitted from the laser diode 32 may be adjusted based on the focus state specifying information for specifying the AF result. For example, the shorter the focal length, the smaller the emission intensity is made.
  • Thereby, the distance to the subject can be derived with an appropriate emission intensity of the laser light that is not affected by ambient light noise.
  • Similarly, the light receiving sensitivity of the photodiode 36 may be adjusted based on the focus state specifying information for specifying the AF result. For example, the light receiving sensitivity is lowered as the focal length becomes shorter. As a result, the distance to the subject can be derived with an appropriate light receiving sensitivity that is not affected by ambient light noise.
  • Further, since the appropriate strength of the laser light can be determined from the exposure adjustment result, the emission intensity of the laser light may be adjusted based on the exposure adjustment result.
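  • One plausible way to realize the monotonic adjustments described above (a shorter focused distance giving a lower emission intensity and a lower light receiving sensitivity) is a clamped linear mapping, sketched below; the numeric ranges and scaling are invented for illustration and are not taken from the patent.

```python
def scale_between(value: float, lo: float, hi: float,
                  out_lo: float, out_hi: float) -> float:
    """Linearly map `value` from [lo, hi] to [out_lo, out_hi], clamping at the ends."""
    if value <= lo:
        return out_lo
    if value >= hi:
        return out_hi
    t = (value - lo) / (hi - lo)
    return out_lo + t * (out_hi - out_lo)

def emission_intensity_from_focus(subject_distance_m: float) -> float:
    """Shorter focused distance -> lower laser drive level (relative, 0.0 to 1.0)."""
    return scale_between(subject_distance_m, 1.0, 1_000.0, 0.1, 1.0)

def receiver_gain_from_focus(subject_distance_m: float) -> float:
    """Shorter focused distance -> lower photodiode gain (relative, 0.0 to 1.0)."""
    return scale_between(subject_distance_m, 1.0, 1_000.0, 0.2, 1.0)
```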
  • In the above description, a case is illustrated in which the information related to the distance to the subject is displayed on the viewfinder 46 superimposed on the live view image, but the technology of the present disclosure is not limited to this.
  • For example, the information regarding the distance to the subject may be displayed in a display area different from the display area of the live view image. In this way, the information regarding the distance to the subject may be displayed on the viewfinder 46 in parallel with the display of the live view image.
  • Further, AE and AF may be started in accordance with a shooting preparation instruction received by a UI (user interface) unit of an external device connected to and used with the distance measuring device 10, and moving image shooting may be started in accordance with a shooting instruction received by the UI unit of the external device.
  • Examples of the external device connected to the distance measuring device 10 include a smart device, a personal computer (PC), and a glasses-type or wristwatch-type wearable terminal device.
  • the live view image and the distance measurement result (information on the distance to the subject) are displayed on the viewfinder 46 is exemplified, but the technology of the present disclosure is not limited to this.
  • at least one of the live view image and the distance measurement result may be displayed on a display unit of an external device that is connected to the distance measuring device 10 and used.
  • Examples of a display unit of an external device connected to the distance measuring device 10 include a smart device display, a PC display, and a wearable terminal device display.
  • The moving image distance measurement processes (see FIGS. 5, 7, and 10A) and the distance measurement error process (see FIG. 8) described in the above embodiment are merely examples. Therefore, it goes without saying that unnecessary steps may be deleted, new steps may be added, and the processing order may be changed within a range not departing from the spirit.
  • Each process included in the moving image distance measurement processes and the distance measurement error process described in the above embodiment may be realized by a software configuration using a computer executing a program, or may be realized by a hardware configuration. It may also be realized by a combination of a hardware configuration and a software configuration.
  • In the above, the distance measuring device has been described as an example, but the technology of the present disclosure may also be applied to a photographing device such as a digital camera.
  • the configuration, operation, and the like of the distance measuring device 10 described in the present embodiment are merely examples, and it is needless to say that they can be changed according to the situation without departing from the gist of the technology of the present disclosure.
  • However, the technology of the present disclosure is not limited to this. Since ambient light becomes noise with respect to the laser light, the number of times the laser light is emitted may be a number of emissions determined according to the subject brightness.
  • In this case, the number of times of laser light emission is derived from the light emission number determination table 300 shown in FIG. 11, as an example.
  • In the light emission number determination table 300, the subject brightness and the laser light emission count are associated with each other so that the higher the subject brightness, the greater the laser light emission count. That is, in the light emission number determination table 300, the magnitude relationship L1 < L2 < ... < Ln holds for the subject brightness values, and the magnitude relationship N1 < N2 < ... < Nn holds for the corresponding emission counts (a minimal lookup sketch in this spirit is given after this list).
  • Here, 100 is exemplified as the number of times of light emission; however, the number of times of light emission is not limited to this, and another number may be used.
  • In this case, the luminance information transmission process (see FIG. 12) is executed by the main control unit 26, and the light emission number determination process (see FIG. 13) is executed by the distance measurement control unit 24.
  • First, the luminance information transmission process executed by the main control unit 26 when the power switch of the distance measuring device 10 is turned on will be described with reference to FIG. 12.
  • In step 400, the main control unit 26 determines whether or not a luminance acquisition start condition, which is a condition for starting acquisition of the subject brightness, is satisfied.
  • An example of the luminance acquisition start condition is a condition that the release button is half-pressed.
  • Another example of the luminance acquisition start condition is a condition that a captured image is output from the image sensor 42.
  • In step 400, if the luminance acquisition start condition is satisfied, the determination is affirmative and the routine proceeds to step 402. If the luminance acquisition start condition is not satisfied in step 400, the determination is negative and the routine proceeds to step 406.
  • In step 402, the main control unit 26 acquires the subject brightness from the photographed image, and then proceeds to step 404.
  • Here, the case where the subject brightness is acquired from the photographed image is illustrated, but the technology of the present disclosure is not limited to this.
  • For example, the main control unit 26 may acquire the subject brightness from a brightness sensor.
  • In step 404, the main control unit 26 transmits the luminance information indicating the subject brightness acquired in step 402 to the distance measurement control unit 24, and then proceeds to step 406.
  • In step 406, the main control unit 26 determines whether or not an end condition, which is a condition for ending the luminance information transmission process, is satisfied.
  • An example of the end condition is a condition that the power switch of the distance measuring device 10 is turned off. If the end condition is not satisfied in step 406, the determination is negative and the routine proceeds to step 400. If the end condition is satisfied in step 406, the determination is affirmative, and the luminance information transmission process ends.
  • In step 410, the distance measurement control unit 24 determines whether or not the luminance information transmitted by executing the process of step 404 has been received. If the luminance information transmitted by executing the process of step 404 has not been received in step 410, the determination is negative and the process proceeds to step 416. If the luminance information transmitted by executing the process of step 404 has been received in step 410, the determination is affirmative and the process proceeds to step 412.
  • In step 412, the distance measurement control unit 24 derives, from the light emission number determination table 300, the number of times of light emission corresponding to the subject brightness indicated by the luminance information received in step 410, and then proceeds to step 414.
  • In step 414, the distance measurement control unit 24 stores the number of times of light emission derived in the process of step 412 in the storage unit 48, and then proceeds to step 416. Note that the number of times of light emission stored in the storage unit 48 by the process of step 414 is employed as the number of times of laser light emission when distance measurement is actually executed.
  • In step 416, the distance measurement control unit 24 determines whether or not an end condition, which is a condition for ending this light emission number determination process, is satisfied.
  • An example of the end condition is a condition that the power switch of the distance measuring device 10 is turned off. If the end condition is not satisfied in step 416, the determination is negative and the routine proceeds to step 410. If the end condition is satisfied in step 416, the determination is affirmative and the light emission number determination process ends.
  • Alternatively, the number of times of laser light emission may be derived in accordance with the light emission number determination table 500 shown in FIG. 14, as an example.
  • In the light emission number determination table 500, the exposure state specifying information E1, E2, ..., En, which is uniquely determined according to the subject brightness, is associated with the numbers of times of laser light emission N1, N2, ..., Nn.
  • the exposure state specifying information uniquely determined according to the subject brightness means, for example, exposure state specifying information indicating exposure that decreases as the subject brightness increases.
  • In this case, the main control unit 26 executes the exposure state specifying information transmission process (see FIG. 15), and the distance measurement control unit 24 executes the light emission number determination process (see FIG. 16).
  • In step 600, the main control unit 26 determines whether or not the release button is half-pressed. If it is determined in step 600 that the release button has not been half-pressed, the determination is negative and the process proceeds to step 606. If the release button is half-pressed in step 600, the determination is affirmative and the routine proceeds to step 602.
  • Here, the case where a release button is provided in the operation unit 44 is described as an example, but the technology of the present disclosure is not limited to this.
  • For example, step 600 may be omitted, and the process of step 602 may be started when the power is turned on.
  • In step 602, the main control unit 26 performs AE based on the subject brightness acquired from the photographed image, and then proceeds to step 604.
  • In step 604, the main control unit 26 transmits the exposure state specifying information to the distance measurement control unit 24, and then proceeds to step 606.
  • In step 606, the main control unit 26 determines whether or not an end condition, which is a condition for ending the exposure state specifying information transmission process, is satisfied.
  • An example of the end condition is a condition that the power switch of the distance measuring device 10 is turned off. If it is determined in step 606 that the end condition is not satisfied, the determination is negative and the process proceeds to step 600. If the end condition is satisfied in step 606, the determination is affirmative, and the exposure state specifying information transmission process ends.
  • In step 610, the distance measurement control unit 24 determines whether or not the exposure state specifying information transmitted by executing the process of step 604 has been received. If it is determined in step 610 that the exposure state specifying information transmitted by executing the process of step 604 has not been received, the determination is negative and the routine proceeds to step 616. If the exposure state specifying information transmitted by executing the process of step 604 has been received in step 610, the determination is affirmative and the process proceeds to step 612.
  • In step 612, the distance measurement control unit 24 derives, from the light emission number determination table 500, the number of times of light emission corresponding to the exposure state specifying information received in step 610, and then proceeds to step 614.
  • In step 614, the distance measurement control unit 24 stores the number of times of light emission derived in the process of step 612 in the storage unit 48, and then proceeds to step 616. Note that the number of times of light emission stored in the storage unit 48 by the process of step 614 is employed as the number of times of laser light emission when distance measurement is actually executed.
  • In step 616, the distance measurement control unit 24 determines whether or not an end condition, which is a condition for ending this light emission number determination process, is satisfied.
  • An example of the end condition is a condition that the power switch of the distance measuring device 10 is turned off. If it is determined in step 616 that the end condition is not satisfied, the determination is negative and the process proceeds to step 610. If the end condition is satisfied in step 616, the determination is affirmative, and the light emission number determination process ends.
  • As described above, since the distance measuring device 10 increases the number of times of laser light emission (distance measurement) as the subject brightness increases, it is possible to obtain a distance measurement result in which the influence of ambient light noise is reduced, compared to the case where distance measurement is performed with the number of times of laser light emission fixed regardless of the subject brightness.
  • In the above, laser light is exemplified as the distance measurement light.
  • However, the technology of the present disclosure is not limited to this, and any directional light, that is, light having directivity, may be used.
  • it may be directional light obtained by a light emitting diode (LED: Light Emitting Diode) or a super luminescent diode (SLD: Super Luminescent Diode).
  • the directivity of the directional light is preferably the same as the directivity of the laser light.
  • For example, the directivity is preferably a degree of directivity that can be used for distance measurement within a range of several meters to several kilometers.
  • A distance measuring device comprising: an emission unit that emits laser light along the optical axis direction of the imaging optical system; a light receiving unit that receives reflected light of the laser light from the subject; a deriving unit that derives the distance to the subject based on the timing at which the laser light is emitted by the emission unit and the timing at which the reflected light is received by the light receiving unit; and a storage unit that, when a moving image is captured by the imaging unit, stores distance data indicating the distance derived by the deriving unit in association with at least one frame image.
  • a distance measurement method including storing distance data indicating a derived distance in association with at least one frame image in a storage unit.
  • a distance measurement program for executing processing including storing distance data indicating a derived distance in association with at least one frame image in a storage unit.
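The light emission number determination tables 300 and 500 described above associate higher subject brightness (or lower exposure) with a larger number of laser emissions. The following is a minimal sketch of that kind of lookup; the threshold values, emission counts, and function names are illustrative assumptions and are not taken from the patent.

```python
# Illustrative sketch (not from the patent): a lookup in the spirit of the
# light emission number determination tables 300/500, where higher subject
# brightness maps to a larger number of laser emissions.

import bisect

# Hypothetical brightness thresholds L1 < L2 < ... < Ln and emission counts
# N1 < N2 < ... < Nn; the concrete values are placeholders, not patent values.
BRIGHTNESS_THRESHOLDS = [50, 100, 200, 400, 800]   # illustrative luminance steps
EMISSION_COUNTS       = [100, 150, 200, 300, 500]  # emissions per measurement


def emission_count_for_brightness(subject_brightness: float) -> int:
    """Return the number of laser emissions for the given subject brightness.

    Brighter scenes contain more ambient-light noise, so more emissions are
    used so that the most frequent (mode) distance stands out more clearly.
    """
    index = bisect.bisect_left(BRIGHTNESS_THRESHOLDS, subject_brightness)
    index = min(index, len(EMISSION_COUNTS) - 1)
    return EMISSION_COUNTS[index]


if __name__ == "__main__":
    for luminance in (30, 120, 900):
        print(luminance, "->", emission_count_for_brightness(luminance))
```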

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

This distance-measurement device contains the following: an imaging unit that captures video, generating a consecutive plurality of frame images by capturing subject images that are formed by image-forming optics and represent a subject; an emission unit that emits directional light, i.e. light that exhibits directionality, along the axis of the image-forming optics; a light-receiving unit that receives reflected light consisting of the directional light reflected by the subject; a derivation unit that derives the distance to the subject on the basis of the timing with which the directional light was emitted by the emission unit and the timing with which the reflected light was received by the light-receiving unit; and a storage unit that, when video is captured by the imaging unit, stores, in association with at least one frame image, distance data indicating the distance derived by the derivation unit.

Description

Distance measuring device, distance measuring method, and distance measuring program
The technology of the present disclosure relates to a distance measuring device, a distance measuring method, and a distance measuring program.
A technique is known for measuring the distance to a distance measurement target by emitting laser light toward the target and receiving the reflected light.
For example, in the technique described in Japanese Patent Application Laid-Open No. 2008-96181, laser light is emitted a plurality of times toward a distance measurement target and the reflected light is detected each time. Of the plurality of distances calculated from the respective emission timings of the laser light and the respective detection timings of the reflected light, the most frequent distance is calculated as the distance to the distance measurement target.
On the other hand, Japanese Patent Application Laid-Open No. 2002-207163 proposes a television lens system including a distance measurement circuit. Specifically, Japanese Patent Application Laid-Open No. 2002-207163 describes performing autofocus based on the distance measurement result of the distance measurement circuit.
However, although Japanese Patent Application Laid-Open No. 2008-96181 describes measuring the distance, it does not consider recording the measurement result. It also does not consider photographing the subject, and there is room for improvement.
On the other hand, Japanese Patent Application Laid-Open No. 2002-207163 describes using the distance measurement result for autofocus, but the distance measurement result is not reported, and there is room for improvement in the use of the distance measurement result.
One embodiment of the present invention has been made in consideration of the above facts, and an object thereof is to provide a distance measuring device, a distance measuring method, and a distance measuring program capable of storing the measurement result of the distance to a subject and the moving image of the subject at the time of measurement in accurate correspondence with each other.
In order to achieve the above object, a distance measuring device according to a first aspect of the present invention includes: an imaging unit that performs moving image shooting in which a subject image formed by an imaging optical system, which forms a subject image representing a subject, is captured to generate a plurality of consecutive frame images; an emission unit that emits directional light, which is light having directivity, along the optical axis direction of the imaging optical system; a light receiving unit that receives reflected light of the directional light from the subject; a deriving unit that derives the distance to the subject based on the timing at which the directional light is emitted by the emission unit and the timing at which the reflected light is received by the light receiving unit; and a storage unit that, when moving image shooting is performed by the imaging unit, stores distance data indicating the distance derived by the deriving unit in association with at least one frame image. This makes it possible to store the measurement result of the distance to the subject and the moving image of the subject at the time of measurement in accurate correspondence with each other.
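As a rough illustration of the association performed by the storage unit of the first aspect, the following sketch records frames of a moving image and attaches the derived distance to the frame captured nearest in time. The data structures and names are assumptions for illustration only, not the patent's implementation.

```python
# Illustrative sketch (names and structure are assumptions, not the patent's
# implementation): recording moving-image frames together with the distance
# data derived closest to each frame's capture time.

from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class FrameRecord:
    frame_index: int
    timestamp_s: float
    image: bytes                        # encoded frame data (placeholder)
    distance_m: Optional[float] = None  # distance data associated with this frame


@dataclass
class MovieWithDistance:
    frames: List[FrameRecord] = field(default_factory=list)

    def add_frame(self, index: int, t: float, image: bytes) -> None:
        self.frames.append(FrameRecord(index, t, image))

    def attach_distance(self, t_measure: float, distance_m: float) -> None:
        """Associate a measurement with the frame captured nearest in time."""
        if not self.frames:
            return
        nearest = min(self.frames, key=lambda f: abs(f.timestamp_s - t_measure))
        nearest.distance_m = distance_m
```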
In the distance measuring device according to a second aspect of the present invention, in the first aspect, the deriving unit derives the distance at the timing at which a predetermined distance measurement trigger signal is generated during moving image shooting by the imaging unit. This makes it possible to derive the distance to the subject during moving image shooting.
In the distance measuring device according to a third aspect of the present invention, in the first or second aspect, the storage unit stores the distance data derived by the deriving unit in association with at least one of the frame images before and after the frame image at the timing at which the predetermined distance measurement trigger signal is generated. This makes it possible to associate each frame image with the distance measurement data.
In the distance measuring device according to a fourth aspect of the present invention, in any one of the first to third aspects, when moving image shooting is performed by the imaging unit, the storage unit stores, for every predetermined plurality of frame images, the distance data obtained at the time of shooting of the corresponding frame images in association with them. This makes it possible to associate the captured moving image with the distance measurement data.
The distance measuring device according to a fifth aspect of the present invention further includes, in the fourth aspect, a frame image number setting unit for setting the number of frame images. This makes it possible to arbitrarily set the number of frame images associated with the distance measurement data.
In the distance measuring device according to a sixth aspect of the present invention, in any one of the first to third aspects, the deriving unit derives the distance in synchronization with the frame images, and the storage unit stores individual distance data in association with each frame image. This makes it possible to associate each frame image of the captured moving image with its distance measurement data.
In the distance measuring device according to a seventh aspect of the present invention, in any one of the first to sixth aspects, the imaging unit performs moving image shooting while derivation of the distance is being instructed by an instruction unit for instructing the deriving unit to derive the distance to the subject. This makes it possible to easily perform derivation of the distance to the subject and moving image shooting.
In the distance measuring device according to an eighth aspect of the present invention, in any one of the first to seventh aspects, the imaging unit starts moving image shooting when the distance is derived by the deriving unit. This makes it possible to store the measurement result of the distance to the subject and the moving image of the subject at the time of measurement in accurate correspondence with each other.
The distance measuring device according to a ninth aspect of the present invention further includes, in any one of the first to eighth aspects, a setting unit for setting in advance whether or not the imaging unit shoots a moving image when the deriving unit cannot derive the distance. This makes it possible to arbitrarily set whether or not to store a moving image when the distance cannot be derived.
In the distance measuring device according to a tenth aspect of the present invention, in any one of the first to ninth aspects, the storage unit cancels storage when the deriving unit cannot derive the distance. This prevents incomplete data lacking distance data from being stored.
In the distance measuring device according to an eleventh aspect of the present invention, in any one of the first to tenth aspects, the deriving unit derives the distance when there is no focus adjustment error by a focus adjustment unit that adjusts the focus of the imaging optical system on the subject. This makes it possible to capture a moving image that is in focus at the time the distance is derived.
In the distance measuring device according to a twelfth aspect of the present invention, in any one of the first to eleventh aspects, when the deriving unit derives the distance a plurality of times and derives, as the final distance, the most frequent distance among the distances obtained by the plurality of derivations, the deriving unit determines, based on the adjustment result of the focus adjustment unit that adjusts the focus of the imaging optical system on the subject, the distance range used for obtaining the frequency or the time range from emission to reception of the directional light, and derives the distance to the subject with a resolution determined according to the determined range. This makes it possible to derive the distance to the subject in fine numerical units.
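The twelfth aspect derives the final distance as the most frequent value among repeated measurements while limiting the range considered based on the focus adjustment result. A minimal sketch of that idea, assuming a simple symmetric range around the AF-derived subject distance and a fixed binning resolution (both assumptions, not values from the patent), might look as follows.

```python
# Illustrative sketch (assumed helper, not the patent's code): deriving the
# final distance as the most frequent value among repeated time-of-flight
# measurements, counting only values inside a distance range suggested by the
# autofocus result, and binning with a chosen resolution.

from collections import Counter
from typing import Iterable, Optional


def most_frequent_distance(measurements_m: Iterable[float],
                           af_distance_m: float,
                           half_range_m: float = 1.0,
                           resolution_m: float = 0.01) -> Optional[float]:
    lower = af_distance_m - half_range_m
    upper = af_distance_m + half_range_m
    bins = Counter()
    for d in measurements_m:
        if lower <= d <= upper:        # ignore values outside the AF-based range
            bins[round(d / resolution_m)] += 1
    if not bins:
        return None                    # treated as a distance measurement error
    best_bin, _ = bins.most_common(1)[0]
    return best_bin * resolution_m
```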
In the distance measuring device according to a thirteenth aspect of the present invention, in any one of the first to twelfth aspects, the emission unit is capable of adjusting the emission intensity of the directional light and, when the distance is derived, emits the directional light with the emission intensity adjusted based on the adjustment result of the focus adjustment unit that adjusts the focus of the imaging optical system on the subject. This makes it possible to derive the distance to the subject with an appropriate emission intensity that is not affected by ambient light noise.
In the distance measuring device according to a fourteenth aspect of the present invention, in the thirteenth aspect, the emission unit decreases the emission intensity as the focal length adjusted by the focus adjustment unit becomes shorter. This makes it possible to derive the distance to the subject with an appropriate emission intensity that is not affected by ambient light noise.
In the distance measuring device according to a fifteenth aspect of the present invention, in any one of the first to fourteenth aspects, the light receiving unit is capable of adjusting the light receiving sensitivity and, when the distance is derived, receives the reflected light with the light receiving sensitivity adjusted based on the adjustment result of the focus adjustment unit that adjusts the focus of the imaging optical system on the subject. This makes it possible to derive the distance to the subject with an appropriate light receiving sensitivity that is not affected by ambient light noise.
In the distance measuring device according to a sixteenth aspect of the present invention, in the fifteenth aspect, the light receiving unit lowers the light receiving sensitivity as the focal length adjusted by the focus adjustment unit becomes shorter. This makes it possible to derive the distance to the subject with an appropriate light receiving sensitivity that is not affected by ambient light noise.
In the distance measuring device according to a seventeenth aspect of the present invention, in any one of the first to sixteenth aspects, the emission unit is capable of adjusting the emission intensity of the directional light and emits the directional light with the emission intensity adjusted based on the subject brightness or the exposure state specifying information. This makes it possible to derive the distance to the subject with an appropriate emission intensity that is not affected by ambient light noise.
In the distance measuring device according to an eighteenth aspect of the present invention, in the seventeenth aspect, the emission unit decreases the emission intensity as the subject brightness becomes lower or as the exposure indicated by the exposure state specifying information becomes higher. This makes it possible to derive the distance to the subject with an appropriate emission intensity that is not affected by ambient light noise.
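Aspects 13 to 18 only state the directions of adjustment: a shorter focused subject distance leads to lower emission intensity and lower light receiving sensitivity, and lower subject brightness (or higher exposure) leads to lower emission intensity. The following sketch encodes those monotone rules with placeholder scaling constants; the constants and clamping limits are illustrative assumptions, not values from the patent.

```python
# Illustrative sketch of monotone adjustment rules in the spirit of aspects
# 13-18. Only the directions follow the text; the numbers are placeholders.

def emission_intensity(focused_subject_distance_m: float,
                       subject_brightness: float,
                       max_intensity: float = 1.0) -> float:
    # Nearer subjects (known from the focus adjustment result) need less power.
    distance_factor = min(focused_subject_distance_m / 100.0, 1.0)
    # Darker scenes contain less ambient-light noise, so less power suffices.
    brightness_factor = min(subject_brightness / 1000.0, 1.0)
    return max_intensity * max(distance_factor, 0.05) * max(brightness_factor, 0.05)


def receiving_sensitivity(focused_subject_distance_m: float,
                          max_sensitivity: float = 1.0) -> float:
    # Lower the photodiode sensitivity for nearer subjects.
    return max_sensitivity * max(min(focused_subject_distance_m / 100.0, 1.0), 0.05)
```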
The distance measuring device according to a nineteenth aspect of the present invention further includes, in any one of the first to eighteenth aspects, a display unit that displays the moving image obtained by the imaging unit and, in parallel with the display of the moving image, displays information on the distance to the subject derived by the deriving unit. This allows the distance measuring device to let the user grasp the relationship between the state of the subject and the distance to the subject more accurately than in the case where information on the distance to the subject is not displayed in parallel with the display of the moving image.
In the distance measuring device according to a twentieth aspect of the present invention, in any one of the first to nineteenth aspects, distance measurement by the emission unit, the light receiving unit, and the deriving unit is performed a number of times determined in advance according to the subject brightness or the exposure state specifying information. As a result, the distance measuring device according to this aspect can obtain a distance measurement result in which the influence of ambient light noise is reduced, compared to the case where the number of emissions of the directional light is fixed regardless of the subject brightness.
In the distance measuring device according to a twenty-first aspect of the present invention, in the twentieth aspect, distance measurement by the emission unit, the light receiving unit, and the deriving unit is performed more times as the subject brightness becomes higher or as the exposure indicated by the exposure state specifying information becomes lower. As a result, the distance measuring device according to the twenty-first aspect can obtain a distance measurement result in which the influence of ambient light noise is reduced, compared to the case where the number of emissions of the directional light is fixed even though the subject brightness is high.
In order to achieve the above object, a distance measuring method according to a twenty-second aspect of the present invention includes: performing moving image shooting in which a subject image formed by an imaging optical system that forms a subject image representing a subject is captured to generate a plurality of consecutive frame images; receiving, with a light receiving unit, reflected light from the subject of directional light emitted, along the optical axis direction of the imaging optical system, from an emission unit that emits directional light, which is light having directivity; deriving the distance to the subject based on the timing at which the directional light is emitted and the timing at which the reflected light is received; and storing distance data indicating the derived distance in a storage unit in association with at least one frame image. This makes it possible to store the measurement result of the distance to the subject and the moving image of the subject at the time of measurement in accurate correspondence with each other.
In order to achieve the above object, a distance measuring program according to a twenty-third aspect of the present invention causes a computer to execute processing including: performing moving image shooting in which a subject image formed by an imaging optical system that forms a subject image representing a subject is captured to generate a plurality of consecutive frame images; receiving, with a light receiving unit, reflected light from the subject of directional light emitted, along the optical axis direction of the imaging optical system, from an emission unit that emits directional light, which is light having directivity; deriving the distance to the subject based on the timing at which the directional light is emitted and the timing at which the reflected light is received; and storing distance data indicating the derived distance in a storage unit in association with at least one frame image. This makes it possible to store the measurement result of the distance to the subject and the moving image of the subject at the time of measurement in accurate correspondence with each other.
According to one embodiment of the present invention, it is possible to obtain the effect that the measurement result of the distance to the subject and the moving image of the subject at the time of measurement can be stored in accurate correspondence with each other.
A block diagram showing an example of the configuration of the main part of the distance measuring device according to the embodiment.
A timing chart showing an example of the timing of the distance measurement operation for measuring the distance to the subject in the distance measuring device according to the embodiment.
A timing chart showing an example of the timing from light emission to light reception in one measurement in the distance measuring device of the embodiment.
A graph showing an example of a histogram of measured values, with the distance to the subject on the horizontal axis and the number of measurements on the vertical axis.
A flowchart showing an example of the flow of processing performed by the main control unit when distance measurement and moving image shooting are performed in the distance measuring device according to the embodiment.
A schematic diagram showing an example in which distance measurement data is stored in association with at least one of the frame images before and after the frame image at the timing at which a distance measurement start instruction (distance measurement trigger signal) is generated.
A schematic diagram showing an example in which distance measurement data is stored for every predetermined number of frame images.
A schematic diagram showing an example in which distance measurement data is stored for each frame image.
A flowchart showing a modification of the processing performed by the main control unit when distance measurement and moving image shooting are performed in the distance measuring device according to the embodiment.
A flowchart showing part of the processing when a distance measurement error occurs.
A diagram for explaining a modification of the histogram obtained in the distance measuring device according to the embodiment, in which the distance to the subject is derived without using measurement results outside the subject distance range based on AF.
A diagram for explaining a modification of the histogram obtained in the distance measuring device according to the embodiment, in which the distance to the subject is derived without using measured values of distances shorter than the subject distance based on AF.
A diagram for explaining a modification of the histogram obtained in the distance measuring device according to the embodiment, in which the distance to the subject is derived without using measured values of distances longer than the subject distance based on AF.
A flowchart showing part of a moving image distance measurement process according to a modification.
A block diagram for explaining adjustment of the laser light emission intensity and the photodiode light receiving sensitivity based on the AF result and the AE result.
A conceptual diagram showing an example of the configuration of the light emission number determination table.
A flowchart showing an example of the flow of the luminance information transmission process.
A flowchart showing an example of the flow of the light emission number determination process.
A conceptual diagram showing another example of the configuration of the light emission number determination table.
A flowchart showing another example of the flow of the exposure state specifying information transmission process.
A flowchart showing another example of the flow of the light emission number determination process.
Hereinafter, an example of an embodiment of a distance measuring device according to the technology of the present disclosure will be described with reference to the accompanying drawings. Note that measuring the distance to a subject to be measured is referred to as distance measurement.
First, the configuration of the distance measuring device according to the present embodiment will be described. FIG. 1 is a block diagram showing an example of the configuration of the main part of the distance measuring device 10 according to the present embodiment.
The distance measuring device 10 of the present embodiment has a function of measuring distance and a function of generating a photographed image of a subject. The distance measuring device 10 of the present embodiment includes a control unit 20, a light emitting lens 30, a laser diode 32, a light receiving lens 34, a photodiode 36, an imaging optical system 40, an image sensor 42, an operation unit 44, a viewfinder 46, and a storage unit 48.
The control unit 20 includes a time counter 22, a distance measurement control unit 24, and a main control unit 26. The time counter 22 has a function of generating a count signal at predetermined fixed intervals in accordance with a signal (for example, a clock pulse) input from the main control unit 26 via the distance measurement control unit 24.
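Although the description does not spell out the formula, deriving the distance from the emission timing and the reception timing corresponds to the standard time-of-flight relation below; the symbol T_c for the period of the count signal generated by the time counter 22 is introduced here only to relate the counter period to the achievable distance resolution.

```latex
% Standard time-of-flight relation (consistent with the description; T_c is
% introduced here for illustration and is not a symbol used in the patent):
\[
  D = \frac{c \,\Delta t}{2},
  \qquad
  \Delta t = t_{\mathrm{receive}} - t_{\mathrm{emit}},
\]
\[
  \Delta D = \frac{c \, T_c}{2}
  \quad\text{(distance resolution for a count period } T_c\text{)}.
\]
```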
The distance measurement control unit 24 has a function of performing distance measurement under the control of the main control unit 26. The distance measurement control unit 24 of the present embodiment performs distance measurement by controlling the driving of the laser diode 32 at timings according to the count signal generated by the time counter 22. The distance measurement control unit 24 functions as the deriving unit according to the technology of the present disclosure. Specific examples of the distance measurement control unit 24 include an ASIC (Application Specific Integrated Circuit) and an FPGA (Field-Programmable Gate Array). The distance measurement control unit 24 of the present embodiment also has a storage unit (not shown). Specific examples of this storage unit include a nonvolatile storage unit such as a ROM (Read Only Memory) and a volatile storage unit such as a RAM (Random Access Memory).
The main control unit 26 has a function of controlling the entire distance measuring device 10. The main control unit 26 of the present embodiment also has a function of controlling the imaging optical system 40 and the image sensor 42 to photograph a subject and generate a photographed image (subject image). The main control unit 26 functions as the control unit, the luminance detection unit, the focus adjustment unit, and the exposure adjustment unit according to the technology of the present disclosure. A specific example of the main control unit 26 is a CPU (Central Processing Unit). The main control unit 26 of the present embodiment also has a storage unit (not shown). Specific examples of this storage unit include a nonvolatile storage unit such as a ROM and a volatile storage unit such as a RAM. A program for the moving image distance measurement process and the like, which will be described later, is stored in advance in the ROM.
Note that the programs such as the moving image distance measurement process do not necessarily have to be stored in the main control unit 26 from the beginning. For example, the programs may first be stored in an arbitrary portable storage medium such as an SSD (Solid State Drive), a CD-ROM, a DVD, a magneto-optical disk, or an IC card. The distance measuring device 10 may then acquire the programs from the portable storage medium in which they are stored and store them in the main control unit 26 or the like. Alternatively, the distance measuring device 10 may acquire the programs from another external device via the Internet or a LAN (Local Area Network) and store them in the main control unit 26 or the like.
The operation unit 44 is a user interface operated by the user when giving various instructions to the distance measuring device 10. The operation unit 44 includes a release button, a distance measurement instruction button, and buttons and keys used when the user gives various instructions (none of which are shown). Various instructions received by the operation unit 44 are output to the main control unit 26 as operation signals, and the main control unit 26 executes processing according to the operation signals input from the operation unit 44.
The release button of the operation unit 44 detects a two-stage pressing operation consisting of a shooting preparation instruction state and a shooting instruction state. The shooting preparation instruction state refers to, for example, a state in which the button is pressed from the standby position to an intermediate position (half-pressed position), and the shooting instruction state refers to a state in which the button is pressed to the final pressed position (fully pressed position) beyond the intermediate position. In the following, the state in which the button is pressed from the standby position to the half-pressed position is referred to as the "half-pressed state", and the state in which the button is pressed from the standby position or the half-pressed position to the fully pressed position is referred to as the "fully pressed state".
In the distance measuring device 10 according to the present embodiment, a manual focus mode and an autofocus mode are selectively set according to a user instruction. In the autofocus mode, the shooting conditions are adjusted by pressing the release button of the operation unit 44 halfway, and exposure (shooting) is then performed when the button is subsequently fully pressed. That is, when the release button of the operation unit 44 is pressed halfway, the AE (Automatic Exposure) function operates to adjust the exposure, then the AF (Auto-Focus) function operates to perform focus control, and shooting is performed when the release button is fully pressed. Note that, when shooting a moving image, the AE function and the AF function may be activated by a moving image shooting instruction or the like from the operation unit 44.
In the present embodiment, the main control unit 26 transmits to the distance measurement control unit 24 exposure state specifying information that specifies the current exposure state, which is the result obtained by performing AE. The main control unit 26 also transmits to the distance measurement control unit 24 focus state specifying information that specifies the current focus state, which is the result obtained by performing AF. An example of the exposure state specifying information is the F value and the shutter speed derived from a so-called AE evaluation value that is uniquely determined according to the subject brightness. Another example of the exposure state specifying information is the AE evaluation value itself. An example of the focus state specifying information is the subject distance obtained by AF.
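For background only, the standard APEX relation below shows how an exposure value ties an F value and a shutter speed together; the patent does not specify how the AE evaluation value is mapped to these settings, so this is context rather than the device's actual rule.

```latex
% Standard APEX exposure relation, given only as background (the patent does
% not define the mapping from the AE evaluation value to the settings):
\[
  \mathrm{Ev} \;=\; \log_2\!\frac{N^2}{t}
  \;=\; \underbrace{\log_2 N^2}_{\mathrm{Av}} \;+\; \underbrace{\log_2 \tfrac{1}{t}}_{\mathrm{Tv}},
\]
where $N$ is the F value and $t$ is the shutter speed in seconds.
```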
A nonvolatile memory is used for the storage unit 48. It mainly stores image data obtained by shooting; specific examples of the storage unit 48 include a flash memory and an HDD (Hard Disk Drive).
The viewfinder 46 has a function of displaying images and character information. The viewfinder 46 of the present embodiment is an electronic viewfinder (hereinafter referred to as "EVF") and is used to display a live view image (through image), which is an example of consecutive frame images obtained by shooting in consecutive frames in the shooting mode. The viewfinder 46 is also used to display a still image, which is an example of a single frame image obtained by shooting a single frame when a still image shooting instruction is given. Furthermore, the viewfinder 46 is also used to display reproduced images and menu screens in the playback mode.
The imaging optical system 40 includes a photographing lens including a focus lens, a motor, a slide mechanism, and a shutter (none of which are shown). The slide mechanism moves the focus lens along the optical axis direction (not shown) of the imaging optical system 40. The focus lens is attached to the slide mechanism so as to be slidable along the optical axis direction. A motor is connected to the slide mechanism, and the slide mechanism slides the focus lens along the optical axis direction under the power of the motor. The motor is connected to the main control unit 26 of the control unit 20, and its driving is controlled in accordance with commands from the main control unit 26. In the distance measuring device 10 of the present embodiment, a stepping motor is used as a specific example of the motor. Accordingly, the motor operates in synchronization with pulse power in accordance with commands from the main control unit 26.
In the distance measuring device 10 according to the present embodiment, in the autofocus mode, the main control unit 26 performs focus control by driving the motor of the imaging optical system 40 so that the contrast value of the image obtained by imaging with the image sensor 42 is maximized. In the autofocus mode, the main control unit 26 also calculates AE information, which is a physical quantity indicating the brightness of the image obtained by imaging. When the release button of the operation unit 44 is pressed halfway, the main control unit 26 derives a shutter speed and an F value (aperture value) according to the brightness of the image indicated by the AE information. The main control unit 26 then performs exposure adjustment by controlling the relevant units so that the derived shutter speed and F value are obtained.
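The description states only that the motor is driven so that the contrast value of the captured image is maximized. One common way to realize such contrast autofocus is a coarse-to-fine scan of the focus positions, sketched below; the scan strategy, step sizes, and function names are assumptions, not taken from the patent.

```python
# Illustrative contrast-AF sketch: step the focus lens and keep the position
# with the highest contrast value. The step scheduling is an assumption; the
# patent only states that the motor is driven so that contrast is maximized.

from typing import Callable


def contrast_autofocus(measure_contrast: Callable[[int], float],
                       min_pos: int, max_pos: int, coarse_step: int = 8) -> int:
    """Return the focus motor position (in steps) with maximal contrast."""
    best_pos, best_val = min_pos, float("-inf")
    # Coarse scan over the full travel of the focus lens.
    for pos in range(min_pos, max_pos + 1, coarse_step):
        val = measure_contrast(pos)
        if val > best_val:
            best_pos, best_val = pos, val
    # Fine scan around the coarse optimum.
    for pos in range(max(min_pos, best_pos - coarse_step),
                     min(max_pos, best_pos + coarse_step) + 1):
        val = measure_contrast(pos)
        if val > best_val:
            best_pos, best_val = pos, val
    return best_pos
```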
The image sensor 42 is an image sensor provided with color filters (not shown) and functions as the imaging unit according to the technology of the present disclosure. In the present embodiment, a CMOS image sensor is used as an example of the image sensor 42. The image sensor 42 is not limited to a CMOS image sensor and may be, for example, a CCD image sensor. The color filters include a G filter corresponding to G (green), which contributes most to obtaining a luminance signal, an R filter corresponding to R (red), and a B filter corresponding to B (blue). Each pixel (not shown) of the image sensor 42 is assigned one of the "R", "G", and "B" filters included in the color filters.
When a subject is photographed, image light representing the subject is formed on the light receiving surface of the image sensor 42 via the imaging optical system 40. In the image sensor 42, a plurality of pixels (not shown) are arranged in a matrix in the horizontal and vertical directions, and signal charges corresponding to the image light are accumulated in the pixels of the image sensor 42. The signal charges accumulated in the image sensor 42 are sequentially read out as digital signals corresponding to the signal charges (voltages) under the control of the main control unit 26. The image sensor 42 has a so-called electronic shutter function and, by using the electronic shutter function, controls the charge accumulation time (shutter speed) of each photosensor at timings based on the control of the main control unit 26.
The image sensor 42 outputs, from each pixel, a digital signal indicating the pixel value of the photographed image. The photographed image output from the pixels is a chromatic image, for example, a color image having the same color array as the pixel array. The photographed image (frame) output from the image sensor 42 is temporarily stored (overwritten) via the main control unit 26 in the storage unit inside the main control unit 26 or in a predetermined RAW image storage area (not shown) of the storage unit 48.
The main control unit 26 performs various kinds of image processing on the frame. The main control unit 26 has a WB (White Balance) gain unit, a gamma correction unit, and a synchronization processing unit (none of which are shown), and sequentially performs signal processing in each processing unit on the original digital signal (RAW image) temporarily stored in the main control unit 26 or the like. That is, the WB gain unit performs white balance (WB) adjustment by adjusting the gains of the R, G, and B signals. The gamma correction unit performs gamma correction on the R, G, and B signals whose WB has been adjusted by the WB gain unit. The synchronization processing unit performs color interpolation processing corresponding to the color filter array of the image sensor 42 and generates synchronized R, G, and B signals. The main control unit 26 performs this image processing in parallel each time a RAW image for one screen is acquired by the image sensor 42.
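The processing order described above (WB gain adjustment, then gamma correction, then synchronization processing) can be illustrated with the following sketch; the gain and gamma values are placeholders, and the synchronization (color interpolation) step is omitted by assuming an already interpolated three-channel input.

```python
# Illustrative sketch of the signal-processing order described above
# (WB gain -> gamma correction). The values are placeholders; a real pipeline
# would also perform the synchronization (demosaicing) of the Bayer RAW data.

import numpy as np


def white_balance(rgb: np.ndarray, gains=(2.0, 1.0, 1.5)) -> np.ndarray:
    """Apply per-channel WB gains to an H x W x 3 linear image in [0, 1]."""
    return np.clip(rgb * np.asarray(gains, dtype=rgb.dtype), 0.0, 1.0)


def gamma_correct(linear_rgb: np.ndarray, gamma: float = 2.2) -> np.ndarray:
    """Encode linear values with a simple power-law gamma."""
    return np.power(linear_rgb, 1.0 / gamma)


def process(rgb: np.ndarray) -> np.ndarray:
    # Synchronization (color interpolation of the Bayer pattern) is assumed
    # to have produced the three-channel input used here.
    return gamma_correct(white_balance(rgb))
```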
 また、主制御部26は、生成した記録用の撮影画像の画像データを、入力された信号を別の形式の信号に変換するエンコーダ（図示省略）に出力する。主制御部26により処理されたR,G,B信号は、エンコーダにより記録用の信号に変換（エンコーディング）され、記憶部48に記録される。また、主制御部26により処理された表示用の撮影画像は、ビューファインダー46に出力される。なお、以下では、説明の便宜上、上記の「記録用の撮影画像」及び「表示用の撮影画像」を区別して説明する必要がない場合は「記録用の」との文言及び「表示用の」との文言を省略して「撮影画像」と称する。 The main control unit 26 also outputs the generated image data of the captured image for recording to an encoder (not shown) that converts an input signal into a signal of another format. The R, G, and B signals processed by the main control unit 26 are converted (encoded) into a recording signal by the encoder and recorded in the storage unit 48. The captured image for display processed by the main control unit 26 is output to the viewfinder 46. In the following, for convenience of explanation, when it is not necessary to distinguish between the above "captured image for recording" and "captured image for display", the words "for recording" and "for display" are omitted and both are simply referred to as a "captured image".
 また、本実施形態の主制御部26は、表示用の撮影画像を動画像として連続して表示させる制御を行うことにより、ビューファインダー46にライブビュー画像を表示する。 In addition, the main control unit 26 of the present embodiment displays the live view image on the viewfinder 46 by performing control to continuously display the captured image for display as a moving image.
 発光用レンズ30及びレーザダイオード32は、本開示の技術に係る射出部の一例として機能する。レーザダイオード32は、測距制御部24からの指示に基づいて駆動され、レーザ光を発光用レンズ30を介して計測対象となる被写体へ向けて、結像光学系40の光軸方向に射出する機能を有する。本実施形態の発光用レンズ30の具体例としては、対物レンズなどが挙げられる。なお、レーザダイオード32により射出されるレーザ光は、本開示の技術に係る指向性光の一例である。 The light emitting lens 30 and the laser diode 32 function as an example of an emission unit according to the technique of the present disclosure. The laser diode 32 is driven based on an instruction from the distance measurement control unit 24 and has a function of emitting laser light through the light emitting lens 30 toward the subject to be measured, in the optical axis direction of the imaging optical system 40. A specific example of the light emitting lens 30 of the present embodiment is an objective lens. The laser light emitted by the laser diode 32 is an example of directional light according to the technique of the present disclosure.
 また、受光用レンズ34及びフォトダイオード36は、本開示の技術に係る受光部の一例として機能する。フォトダイオード36は、レーザダイオード32から射出され、被写体で反射されたレーザ光を受光用レンズ34を介して受光し、受光量に応じた電気信号を測距制御部24に出力する機能を有する。 In addition, the light receiving lens 34 and the photodiode 36 function as an example of a light receiving unit according to the technique of the present disclosure. The photodiode 36 has a function of receiving laser light emitted from the laser diode 32 and reflected by the subject through the light receiving lens 34 and outputting an electric signal corresponding to the amount of received light to the distance measurement control unit 24.
 操作部44の測距指示ボタン等により、測距するようにユーザから指示がなされると、主制御部26は、測距制御部24に、測距を行うように指示する。具体的には、本実施形態では、主制御部26は、測距指示信号を測距制御部24に送信することにより、測距制御部24に対して測距を行うように指示する。また、主制御部26は、被写体までの距離の計測及び被写体の撮影を並行して行う場合は、測距動作と撮影動作とを同期させるための同期信号を測距制御部24に送信する。 When the user gives an instruction to perform distance measurement using a distance measurement instruction button or the like of the operation unit 44, the main control unit 26 instructs the distance measurement control unit 24 to perform distance measurement. Specifically, in this embodiment, the main control unit 26 instructs the distance measurement control unit 24 to perform distance measurement by transmitting a distance measurement instruction signal to the distance measurement control unit 24. In addition, the main control unit 26 transmits a synchronization signal for synchronizing the distance measurement operation and the photographing operation to the distance measurement control unit 24 when measuring the distance to the subject and photographing the subject in parallel.
 同期信号及び測距指示信号を受信すると、測距制御部24は、タイムカウンタ22のカウント信号に応じたタイミングで、レーザダイオード32の発光を制御することにより、被写体に向けてレーザ光を射出するタイミングを制御する。また、測距制御部24は、タイムカウンタ22のカウント信号に応じたタイミングで、フォトダイオード36から出力された受光量に応じた電気信号をサンプリングする。 When receiving the synchronization signal and the distance measurement instruction signal, the distance measurement control unit 24 controls the timing at which laser light is emitted toward the subject by controlling the light emission of the laser diode 32 at a timing corresponding to the count signal of the time counter 22. The distance measurement control unit 24 also samples, at a timing corresponding to the count signal of the time counter 22, the electrical signal output from the photodiode 36 in accordance with the amount of received light.
 測距制御部24は、レーザダイオード32がレーザ光を発光した発光タイミングと、フォトダイオード36がレーザ光を受光した受光タイミングとに基づいて、被写体までの距離を導出し、導出した距離を表す距離データを主制御部26に出力する。主制御部26は、距離データに基づいて、被写体までの距離に関する情報をビューファインダー46に表示させる。また、主制御部26は、距離データを記憶部48に記憶させる。 The distance measurement control unit 24 derives the distance to the subject based on the emission timing at which the laser diode 32 emitted the laser light and the reception timing at which the photodiode 36 received the laser light, and outputs distance data representing the derived distance to the main control unit 26. Based on the distance data, the main control unit 26 causes the viewfinder 46 to display information on the distance to the subject. The main control unit 26 also stores the distance data in the storage unit 48.
 測距制御部24による被写体までの距離の計測についてさらに詳細に説明する。図2は、実施形態に係る測距装置10における被写体までの距離を計測する測距動作のタイミングの一例を示すタイミングチャートである。 The measurement of the distance to the subject by the distance measurement control unit 24 will be described in more detail. FIG. 2 is a timing chart showing an example of the timing of a distance measuring operation for measuring the distance to the subject in the distance measuring apparatus 10 according to the embodiment.
 本実施形態の測距装置10では、1回の測距(計測)シーケンスに、電圧調整期間、計測期間、及び休止期間を含む。電圧調整期間とは、レーザダイオード32及びフォトダイオード36の駆動電圧を、適切な電圧値に調整する期間をいう。具体例として、本実施形態の測距装置10では、図2に示すように、電圧調整期間を数100msec(ミリ秒)としている。 In the distance measuring device 10 of the present embodiment, a single distance measurement (measurement) sequence includes a voltage adjustment period, a measurement period, and a pause period. The voltage adjustment period is a period during which the drive voltages of the laser diode 32 and the photodiode 36 are adjusted to appropriate voltage values. As a specific example, in the distance measuring device 10 of the present embodiment, as shown in FIG. 2, the voltage adjustment period is set to several 100 msec (milliseconds).
 また、実計測期間とは、被写体までの距離を実際に計測する期間をいう。本実施形態の測距装置10では、具体例として、図2に示すように、レーザ光を発光（射出）させ、被写体で反射したレーザ光を受光する動作を数100回繰り返し、発光（射出）から受光までの経過時間を計測することにより被写体までの距離を計測している。すなわち、本実施形態の測距装置10では、1回の計測シーケンスにおいて、被写体までの距離の計測を数100回、行っている。 The actual measurement period is the period during which the distance to the subject is actually measured. In the distance measuring device 10 of the present embodiment, as a specific example, as shown in FIG. 2, the operation of emitting laser light and receiving the laser light reflected by the subject is repeated several hundred times, and the distance to the subject is measured by measuring the elapsed time from emission to reception. That is, in the distance measuring device 10 of the present embodiment, the distance to the subject is measured several hundred times in one measurement sequence.
 図3には、1回の計測における発光から受光までのタイミングを表すタイミングチャートの一例を示す。測距を行う場合、測距制御部24は、タイムカウンタ22のカウント信号に応じて、レーザダイオード32を発光させるためのレーザトリガをレーザダイオード32に出力する。レーザダイオード32は、レーザトリガに応じて、発光する。本実施形態の測距装置10では、具体例として、レーザダイオード32の発光時間を、数10nsec(ナノ秒)としている。発光したレーザ光は、発光用レンズ30を介して被写体に向けて結像光学系40の光軸方向に射出される。測距装置10から射出されたレーザ光は、被写体で反射し、測距装置10に到達する。測距装置10のフォトダイオード36は、受光用レンズ34を介して、反射してきたレーザ光を受光する。 FIG. 3 shows an example of a timing chart showing the timing from light emission to light reception in one measurement. When performing distance measurement, the distance measurement control unit 24 outputs a laser trigger for causing the laser diode 32 to emit light to the laser diode 32 in accordance with the count signal of the time counter 22. The laser diode 32 emits light in response to the laser trigger. In the distance measuring apparatus 10 of the present embodiment, as a specific example, the light emission time of the laser diode 32 is set to several tens of nanoseconds (nanoseconds). The emitted laser light is emitted in the optical axis direction of the imaging optical system 40 toward the subject via the light emitting lens 30. The laser light emitted from the distance measuring device 10 is reflected by the subject and reaches the distance measuring device 10. The photodiode 36 of the distance measuring device 10 receives the reflected laser light via the light receiving lens 34.
 本実施形態の測距装置10では、具体例として、測距装置10からの距離が数km以内の被写体に対して測距を行う測距装置としている。レーザダイオード32から発光用レンズ30を介して数km先の被写体に向けて射出したレーザ光が戻ってくる（受光する）までの時間は、数km×2/光速≒数μsec（マイクロ秒）となる。従って、数km先の被写体までの距離を計測するためには、図2に示すように、少なくとも数μsecの時間を要する。 In the present embodiment, the distance measuring device 10 is, as a specific example, a distance measuring device that measures distances to subjects within several kilometers of the distance measuring device 10. The time until laser light emitted from the laser diode 32 through the light emitting lens 30 toward a subject several kilometers away returns (is received) is several km × 2 / speed of light ≈ several μsec (microseconds). Therefore, as shown in FIG. 2, at least several μsec are required to measure the distance to a subject several kilometers away.
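 The round-trip relation used in this estimate can be written out as a short calculation. The following Python snippet is only a sketch of that arithmetic (distance = speed of light × elapsed time / 2); the function names and the 1 km example are illustrative and not part of the patent text.
```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def round_trip_time_s(distance_m: float) -> float:
    """Expected time from emission to reception for a subject at distance_m."""
    return 2.0 * distance_m / SPEED_OF_LIGHT

def distance_m_from_elapsed(elapsed_s: float) -> float:
    """Distance to the subject from the measured emission-to-reception time."""
    return SPEED_OF_LIGHT * elapsed_s / 2.0

# A subject about 1 km away gives a round trip of roughly 6.7 microseconds,
# consistent with "several km x 2 / speed of light = several microseconds".
print(round_trip_time_s(1_000.0))          # ~6.67e-06 s
print(distance_m_from_elapsed(6.67e-06))   # ~1000 m
```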
 なお、本実施形態の測距装置10では、レーザ光の往復時間等を考慮し、具体例として、1回の計測時間を図2に示すように数msecとしている。なお、被写体までの距離により、レーザ光の往復時間は異なるため、測距装置10が想定する距離に応じて、1回あたりの計測時間を異ならせてもよい。 In the distance measuring device 10 of the present embodiment, taking the round-trip time of the laser light and the like into account, the time for one measurement is set to several msec as a specific example, as shown in FIG. 2. Since the round-trip time of the laser light differs depending on the distance to the subject, the measurement time per measurement may be varied according to the distance assumed by the distance measuring device 10.
 測距装置10では、測距制御部24が、上述のようにして数100回計測した計測値に基づいて、被写体までの距離を導出する。本実施形態の測距制御部24では、具体的一例として、数100回分の計測値のヒストグラムを解析して被写体までの距離を導出している。図4には、被写体までの距離を横軸、計測回数を縦軸とした場合の計測値のヒストグラムの一例を表したグラフを示す。測距制御部24は、上記ヒストグラムにおいて、計測回数の最大値に対応する被写体までの距離を計測結果として導出し、導出した計測結果を示す距離データを主制御部26に出力する。なお、被写体までの距離に代わり、レーザ光の往復時間(発光から受光までの経過時間)や、レーザ光の往復時間の1/2等に基づいてヒストグラムを生成するようにしてもよい。 In the distance measuring apparatus 10, the distance measurement control unit 24 derives the distance to the subject based on the measurement value measured several hundred times as described above. As a specific example, the distance measurement control unit 24 of the present embodiment derives the distance to the subject by analyzing a histogram of measured values for several hundred times. FIG. 4 is a graph showing an example of a histogram of measured values when the distance to the subject is on the horizontal axis and the number of measurements is on the vertical axis. The distance measurement control unit 24 derives the distance to the subject corresponding to the maximum number of times of measurement in the histogram as a measurement result, and outputs distance data indicating the derived measurement result to the main control unit 26. Instead of the distance to the subject, a histogram may be generated based on the round trip time of laser light (elapsed time from light emission to light reception), 1/2 of the round trip time of laser light, or the like.
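 The histogram analysis of FIG. 4 amounts to taking the most frequent distance value among the several hundred per-shot measurements. A minimal sketch is shown below; the bin count and function names are assumptions made for illustration, not values from the patent.
```python
import numpy as np

def derive_distance_from_histogram(per_shot_distances_m, bins=200):
    """Return the center of the most frequent histogram bin (FIG. 4 style).

    per_shot_distances_m: the several hundred distance values obtained from
    the individual emission/reception cycles of one measurement sequence.
    """
    samples = np.asarray(per_shot_distances_m, dtype=float)
    counts, edges = np.histogram(samples, bins=bins)
    peak = int(np.argmax(counts))
    return 0.5 * (edges[peak] + edges[peak + 1])
```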
 また、休止期間とは、レーザダイオード32及びフォトダイオード36の駆動を休止させるための期間をいう。本実施形態の測距装置10では、具体例として、図2に示すように、休止期間を数100msecとしている。 Also, the pause period is a period for stopping the driving of the laser diode 32 and the photodiode 36. In the distance measuring apparatus 10 of the present embodiment, as a specific example, as shown in FIG. 2, the rest period is set to several hundreds msec.
 さらに、本実施形態の測距装置10では、1回の計測期間を数100msecとしている。また、本実施形態では、撮影と測距とを同期して行うようになっている。 Furthermore, in the distance measuring device 10 of the present embodiment, one measurement period is set to several 100 msec. In the present embodiment, shooting and distance measurement are performed in synchronization.
 一方、本実施形態の測距装置10の主制御部26は、上述したようにライブビュー画像をビューファインダー46に表示する。主制御部26は、数10fpsで撮影された撮影画像を動画像としてビューファインダー46に表示することによりライブビュー画像の表示を行う。そのため、1回の計測期間の間に、30個のライブビュー画像がビューファインダー46に表示されることになる。また、主制御部26は、被写体までの計測結果をライブビュー画像に重畳してビューファインダー46に表示する。 Meanwhile, the main control unit 26 of the distance measuring device 10 of the present embodiment displays the live view image on the viewfinder 46 as described above. The main control unit 26 displays the live view image by displaying, as a moving image on the viewfinder 46, captured images taken at several tens of fps. Consequently, 30 live view images are displayed on the viewfinder 46 during one measurement period. The main control unit 26 also superimposes the measurement result of the distance to the subject on the live view image and displays it on the viewfinder 46.
 続いて、上述のように構成された本実施形態に係る測距装置10において、被写体までの距離の計測と動画撮影とを行う場合の主制御部26で行われる具体的な処理について説明する。図5は、測距と動画撮影とを行う場合の主制御部で行われる処理の流れの一例を示すフローチャートである。なお、図5の処理は、動画測距処理のプログラムを主制御部26が実行することによって開始され、例えば、電源スイッチがオン状態とされ動画撮影が選択された際に実行される。 Next, specific processing performed by the main control unit 26 when measuring the distance to the subject and shooting a moving image in the distance measuring device 10 according to the present embodiment configured as described above will be described. FIG. 5 is a flowchart showing an example of the flow of processing performed by the main control unit when performing distance measurement and moving image shooting. The processing in FIG. 5 is started when the main control unit 26 executes a moving image ranging processing program, and is executed, for example, when the power switch is turned on and moving image shooting is selected.
 まず、ステップ100で、主制御部26は、ライブビュー動作を開始する。上述したように、主制御部26は、結像光学系40及び撮像素子42により撮影して得た撮影画像を動画像として連続して表示させる制御を行うことにより、ビューファインダー46にライブビュー画像を表示させる。 First, in step 100, the main control unit 26 starts the live view operation. As described above, the main control unit 26 causes the viewfinder 46 to display a live view image by performing control to continuously display, as a moving image, the captured images obtained by imaging with the imaging optical system 40 and the image sensor 42.
 ステップ102では、主制御部26は動画開始か否かを判定する。該判定は、動画開始の操作が操作部44に対して行われたか否かを主制御部26が判定し、判定が肯定された場合にはステップ104へ移行し、否定された場合にはステップ120へ移行する。 In step 102, the main control unit 26 determines whether moving image shooting is to be started. In this determination, the main control unit 26 determines whether an operation for starting a moving image has been performed on the operation unit 44; if the determination is affirmative, the process proceeds to step 104, and if negative, the process proceeds to step 120.
 ステップ104では、主制御部26は結像光学系40を制御し、AE及びAFを行ってステップ106へ移行する。測距装置10では、AEを行うことにより、露出状態が設定され、AFを行うことにより焦点調整が行われて、被写体を示す画像光が撮像素子42の受光面に合焦状態で結像される。 In step 104, the main control unit 26 controls the imaging optical system 40, performs AE and AF, and proceeds to step 106. In the distance measuring device 10, the exposure state is set by performing AE, the focus is adjusted by performing AF, and the image light representing the subject is formed in a focused state on the light receiving surface of the image sensor 42.
 ステップ106では、主制御部26はAE又はAFのエラーが発生したか否かを判定する。該判定が肯定された場合にはステップ104へ戻ってAE及びAFを行って、判定が否定された場合にはステップ108へ移行する。 In step 106, the main control unit 26 determines whether or not an AE or AF error has occurred. If the determination is affirmative, the process returns to step 104 to perform AE and AF, and if the determination is negative, the process proceeds to step 108.
 ステップ108では、主制御部26は撮像素子42を制御することにより動画撮影を開始させてステップ110へ移行する。これにより、動画撮影が開始されて、主制御部26は、撮像素子42から得られるフレーム画像を順次取得する。すなわち、AEやAFのエラーがない場合に動画撮影を開始するので、測距時にぼけのない動画を撮影することができる。 In step 108, the main control unit 26 controls the image sensor 42 to start moving image shooting, and proceeds to step 110. Thereby, moving image shooting is started, and the main control unit 26 sequentially acquires frame images obtained from the image sensor 42. That is, since moving image shooting is started when there is no AE or AF error, it is possible to shoot a moving image without blurring during distance measurement.
 ステップ110では、主制御部26は測距開始指示が行われたか否かを判定する。該判定は、測距開始の操作が操作部44に対して行われたか否かを主制御部26が判定し、該判定が否定された場合にはステップ112へ移行し、肯定された場合にはステップ114へ移行する。 In step 110, the main control unit 26 determines whether a ranging start instruction has been given. In this determination, the main control unit 26 determines whether an operation for starting distance measurement has been performed on the operation unit 44; if the determination is negative, the process proceeds to step 112, and if affirmative, the process proceeds to step 114.
 ステップ112では、主制御部26は撮像素子42から順次得られるフレーム画像を順次記憶部48に記憶させてステップ118へ移行する。 In step 112, the main control unit 26 sequentially stores the frame images sequentially obtained from the image sensor 42 in the storage unit 48, and proceeds to step 118.
 一方、ステップ114では、主制御部26は測距開始指示を測距制御部24に出力してステップ116へ移行する。これによって測距制御部24では、被写体までの距離が計測され、被写体までの距離を示す測距データが計測結果として主制御部26に出力される。これにより、動画撮影中に被写体までの距離を導出することができる。 On the other hand, in step 114, the main control unit 26 outputs a distance measurement start instruction to the distance measurement control unit 24 and proceeds to step 116. Thus, the distance measurement control unit 24 measures the distance to the subject, and distance measurement data indicating the distance to the subject is output to the main control unit 26 as a measurement result. This makes it possible to derive the distance to the subject during moving image shooting.
 ステップ116では、主制御部26は、測距制御部24から出力された測距データを、測距タイミングのフレーム画像に対応して記憶部48に記憶してステップ118へ移行する。例えば、測距データをフレーム画像に対応して記憶する際には、図6Aに示すようにしてもよい。すなわち、測距開始指示（測距トリガ信号）が発生したタイミングのフレーム画像の前及び後の少なくとも一方のフレーム画像に対応して測距データを記憶することにより、フレーム画像と測距データとを各々対応させるようにしてもよい。また、図6Bに示すように、所定数のフレーム画像毎に測距データを対応させて記憶するようにしてもよい。この場合、操作部44によって測距データを記憶するフレーム画像数を設定可能として、測距データと対応させるフレーム画像数を任意に設定可能としてもよい。或いは、フレームレートを低くしたり、測距時のレーザ光のサンプリング回数を少なくすることにより、図6Cに示すように、フレーム画像に同期して測距を行う。そして、フレーム画像毎に個別の測距データを対応させて記憶することにより、撮影した動画の各フレーム画像と測距データとを各々対応させるようにしてもよい。 In step 116, the main control unit 26 stores the distance measurement data output from the distance measurement control unit 24 in the storage unit 48 in association with the frame image at the ranging timing, and proceeds to step 118. For example, the distance measurement data may be stored in association with frame images as shown in FIG. 6A. That is, frame images and distance measurement data may be associated with each other by storing the distance measurement data in association with at least one of the frame images before and after the frame image at the timing when the ranging start instruction (ranging trigger signal) occurred. Alternatively, as shown in FIG. 6B, distance measurement data may be stored in association with every predetermined number of frame images; in this case, the number of frame images associated with each set of distance measurement data may be made settable via the operation unit 44 so that it can be chosen arbitrarily. Alternatively, by lowering the frame rate or reducing the number of laser light samplings during distance measurement, distance measurement may be performed in synchronization with the frame images as shown in FIG. 6C, and individual distance measurement data may be stored in association with each frame image so that each frame image of the captured moving image corresponds to its own distance measurement data.
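 The three association schemes of FIGS. 6A to 6C described above can be summarized with a small sketch. The list-based representation and the function names below are illustrative assumptions only; the patent does not prescribe a particular data structure.
```python
from typing import List, Optional

def attach_near_trigger(frame_distances: List[Optional[float]],
                        trigger_frame: int, distance_m: float) -> None:
    """FIG. 6A: store one measurement on the frame(s) just before and/or
    after the frame at which the ranging trigger occurred."""
    for idx in (trigger_frame - 1, trigger_frame + 1):
        if 0 <= idx < len(frame_distances):
            frame_distances[idx] = distance_m

def attach_every_n_frames(frame_distances: List[Optional[float]],
                          measurements: List[float], n: int) -> None:
    """FIG. 6B: one measurement per block of n frames (n could be user-set
    via the operation unit)."""
    for block, distance_m in enumerate(measurements):
        for idx in range(block * n, min((block + 1) * n, len(frame_distances))):
            frame_distances[idx] = distance_m

def attach_per_frame(measurements: List[float]) -> List[Optional[float]]:
    """FIG. 6C: with a lower frame rate or fewer laser samplings, one
    measurement is obtained per frame, so the lists align one-to-one."""
    return list(measurements)
```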
 ステップ118では、主制御部26は動画撮影の終了が指示されたか否かを判定する。該判定は、動画撮影終了を表す操作が操作部44に対して行われたか否かを主制御部26が判定する。該判定が否定された場合にはステップ110に戻って上述の処理が繰り返され、肯定された場合にはステップ120へ移行する。 In step 118, the main control unit 26 determines whether or not the end of moving image shooting is instructed. In this determination, the main control unit 26 determines whether or not an operation indicating the end of moving image shooting has been performed on the operation unit 44. If the determination is negative, the process returns to step 110 and the above-described processing is repeated. If the determination is positive, the process proceeds to step 120.
 ステップ120では、主制御部26は、電源がオフされたか否かを判定する。該判定は、操作部44に含まれる電源スイッチが操作されたか否かを主制御部26が判定する。該判定が否定された場合にはステップ102に戻って上述の処理が繰り返され、判定が肯定された場合にはステップ122へ移行する。 In step 120, the main control unit 26 determines whether or not the power is turned off. In this determination, the main control unit 26 determines whether or not a power switch included in the operation unit 44 is operated. If the determination is negative, the process returns to step 102 and the above processing is repeated. If the determination is affirmative, the routine proceeds to step 122.
 ステップ122では、主制御部26は、ライブビュー動作を停止して一連の処理を終了する。 In step 122, the main control unit 26 stops the live view operation and ends the series of processes.
 このように主制御部26が処理を行うことにより、被写体までの距離の計測と動画撮影とを行って、被写体までの距離の計測結果と距離計測時の動画とを精度よく対応して記憶することができる。 With the main control unit 26 performing processing in this way, the measurement of the distance to the subject and the moving image shooting are carried out, and the measurement result of the distance to the subject and the moving image at the time of the distance measurement can be stored in accurate correspondence with each other.
 続いて、測距と動画撮影とを行う場合の主制御部26で行われる処理の変形例について説明する。図7は、実施形態に係る測距装置10において、測距と動画撮影とを行う場合の主制御部26で行われる処理の変形例を示すフローチャートである。 Subsequently, a modified example of processing performed by the main control unit 26 when performing ranging and moving image shooting will be described. FIG. 7 is a flowchart illustrating a modified example of processing performed by the main control unit 26 when performing distance measurement and moving image shooting in the distance measuring device 10 according to the embodiment.
 上記の実施形態では、動画撮影を開始してから測距開始の指示により被写体までの距離を計測して、動画のフレーム画像に対応して測距データを記憶するようにしたが、変形例では、測距の指示が行われている間、動画撮影を行うようにしたものである。 In the above embodiment, the distance to the subject is measured in response to a ranging start instruction after moving image shooting has started, and the distance measurement data is stored in association with the frame images of the moving image. In the modified example, by contrast, moving image shooting is performed while the distance measurement instruction is being given.
 すなわち、ステップ100で、主制御部26は、ライブビュー動作を開始する。上述したように、主制御部26は、結像光学系40及び撮像素子42により撮影して得た撮影画像を動画像として連続して表示させる制御を行うことにより、ビューファインダー46にライブビュー画像を表示させる。 That is, in step 100, the main control unit 26 starts the live view operation. As described above, the main control unit 26 causes the viewfinder 46 to display a live view image by performing control to continuously display, as a moving image, the captured images obtained by imaging with the imaging optical system 40 and the image sensor 42.
 ステップ101では、主制御部26は、測距開始指示が行われたか否かを判定する。該判定は、測距指示の操作が操作部44に対して行われたか否かを主制御部26が判定する。該判定が否定された場合にはステップ120へ移行し、肯定された場合にはステップ104へ移行する。なお、変形例では、測距開始の操作を操作部44に対して継続して行うことにより、被写体までの距離の計測と動画撮影とを行う。 In step 101, the main control unit 26 determines whether a ranging start instruction has been given. In this determination, the main control unit 26 determines whether a ranging instruction operation has been performed on the operation unit 44. If the determination is negative, the process proceeds to step 120, and if affirmative, the process proceeds to step 104. In the modified example, measurement of the distance to the subject and moving image shooting are performed by continuing the ranging start operation on the operation unit 44.
 ステップ104では、主制御部26は結像光学系40を制御し、AE及びAFを行ってステップ106へ移行する。測距装置10では、AEを行うことにより、露出状態が設定され、AFを行うことにより焦点調整が行われて、被写体を示す画像光が撮像素子42の受光面に合焦状態で結像される。 In step 104, the main control unit 26 controls the imaging optical system 40, performs AE and AF, and proceeds to step 106. In the distance measuring device 10, the exposure state is set by performing AE, the focus is adjusted by performing AF, and the image light representing the subject is formed in a focused state on the light receiving surface of the image sensor 42.
 ステップ106では、主制御部26はAE又はAFのエラーが発生したか否かを判定する。該判定が肯定された場合にはステップ104へ戻ってAE及びAFを行って、判定が否定された場合にはステップ108へ移行する。 In step 106, the main control unit 26 determines whether or not an AE or AF error has occurred. If the determination is affirmative, the process returns to step 104 to perform AE and AF, and if the determination is negative, the process proceeds to step 108.
 ステップ108では、主制御部26は撮像素子42を制御することにより動画撮影を開始させてステップ114へ移行する。これにより、動画撮影が開始されて、主制御部26は、撮像素子42から得られるフレーム画像を順次取得する。すなわち、AEやAFのエラーがない場合に動画撮影を開始するので、測距時に合焦された動画を撮影することができる。 In step 108, the main control unit 26 starts moving image shooting by controlling the image sensor 42 and proceeds to step 114. Thereby, moving image shooting is started, and the main control unit 26 sequentially acquires frame images obtained from the image sensor 42. That is, since moving image shooting is started when there is no AE or AF error, a moving image focused at the time of distance measurement can be shot.
 ステップ114では、主制御部26は測距開始指示を測距制御部24に出力してステップ116へ移行する。これによって測距制御部24では、被写体までの距離が計測され、被写体までの距離を示す測距データが計測結果として主制御部26に出力される。 In step 114, the main control unit 26 outputs a ranging start instruction to the ranging control unit 24 and proceeds to step 116. Thus, the distance measurement control unit 24 measures the distance to the subject, and distance measurement data indicating the distance to the subject is output to the main control unit 26 as a measurement result.
 ステップ116では、主制御部26は、測距制御部24から出力された測距データを、測距タイミングのフレーム画像に対応して記憶部48に記憶してステップ117へ移行する。例えば、測距データをフレーム画像に対応して記憶する際には、図6Aに示すように、測距開始指示（測距トリガ信号）が発生したタイミングのフレーム画像の前及び後の少なくとも一方のフレーム画像に対応して記憶するようにしてもよい。また、図6Bに示すように、所定数のフレーム画像毎に測距データを対応させて記憶するようにしてもよい。この場合、操作部44によって測距データを記憶するフレーム画像数を設定するようにしてもよい。或いは、フレームレートを低くしたり、測距時のサンプリング回数を少なくすることにより、図6Cに示すように、フレーム画像毎に測距データを対応させて記憶するようにしてもよい。 In step 116, the main control unit 26 stores the distance measurement data output from the distance measurement control unit 24 in the storage unit 48 in association with the frame image at the ranging timing, and proceeds to step 117. For example, when storing distance measurement data in association with frame images, the data may be stored, as shown in FIG. 6A, in association with at least one of the frame images before and after the frame image at the timing when the ranging start instruction (ranging trigger signal) occurred. Alternatively, as shown in FIG. 6B, distance measurement data may be stored in association with every predetermined number of frame images; in this case, the number of frame images associated with the distance measurement data may be set via the operation unit 44. Alternatively, by lowering the frame rate or reducing the number of samplings during distance measurement, distance measurement data may be stored in association with each frame image as shown in FIG. 6C.
 ステップ117では、主制御部26は、測距停止指示が行われたか否かを判定する。該判定は、測距停止の操作が操作部44に対して行われたか否かを主制御部26が判定する。本実施形態では、操作部44に対して行われていた測距指示の操作がなくなったか否かを主制御部26が判定する。該判定が否定された場合にはステップ116に戻って上述の処理が繰り返され、判定が肯定された場合にはステップ119へ移行する。 In step 117, the main control unit 26 determines whether a ranging stop instruction has been given. In this determination, the main control unit 26 determines whether a ranging stop operation has been performed on the operation unit 44; in the present embodiment, the main control unit 26 determines whether the ranging instruction operation that had been performed on the operation unit 44 has ceased. If the determination is negative, the process returns to step 116 and the above-described processing is repeated; if affirmative, the process proceeds to step 119.
 ステップ119では、主制御部26は撮像素子42を制御することにより動画撮影を停止してステップ120へ移行する。 In step 119, the main control unit 26 controls the image pickup element 42 to stop moving image shooting and proceeds to step 120.
 ステップ120では、主制御部26は電源がオフされたか否かを判定する。該判定は、操作部44に含まれる電源スイッチが操作されたか否かを主制御部26が判定し、該判定が否定された場合にはステップ101に戻って上述の処理が繰り返され、判定が肯定された場合にはステップ122へ移行する。 In step 120, the main control unit 26 determines whether or not the power is turned off. In this determination, the main control unit 26 determines whether or not the power switch included in the operation unit 44 is operated. If the determination is negative, the process returns to step 101 and the above-described processing is repeated. If the determination is affirmative, the routine proceeds to step 122.
 ステップ122では、主制御部26はライブビュー動作を停止させて一連の処理を終了する。 In step 122, the main control unit 26 stops the live view operation and ends the series of processes.
 このように主制御部26が処理を行うことにより、測距の指示が行われている間、動画撮影を行うことができるので、被写体までの距離の計測と動画撮影とを容易に行うことができる。 With the main control unit 26 performing processing in this way, moving image shooting can be carried out while the ranging instruction is being given, so the measurement of the distance to the subject and the moving image shooting can be performed easily.
 ところで、測距制御部24では、被写体までの距離を計測できない場合（測距エラー）が発生することがあり得る。この場合には、主制御部26は記憶部48への記憶を中止して測距データのない不完全なデータの記憶を防止するようにしてもよい。例えば、上記のステップ114において、主制御部26が測距制御部24に対して測距指示信号を送信して、測距制御部24が被写体までの距離を計測する際に、図8に示す処理を行う。図8は、測距エラーが発生した際の処理の一部を示すフローチャートである。 Incidentally, a case where the distance measurement control unit 24 cannot measure the distance to the subject (a ranging error) may occur. In this case, the main control unit 26 may stop storing data in the storage unit 48 to prevent incomplete data without distance measurement data from being stored. For example, when the main control unit 26 transmits a ranging instruction signal to the distance measurement control unit 24 in step 114 above and the distance measurement control unit 24 measures the distance to the subject, the processing shown in FIG. 8 is performed. FIG. 8 is a flowchart showing part of the processing when a ranging error occurs.
 すなわち、主制御部26は、ステップ114で、測距指示信号を送信した後に、ステップ115へ移行して、測距エラーが発生したか否かを判定する。該判定は、測距制御部24から測距エラーであることを表す信号を主制御部26が受信したか否かを判定し、該判定が肯定された場合にはステップ121へ移行し、否定された場合には、上述のステップ116へ移行する。なお、ここでは、測距エラーが発生した場合に、測距制御部24から主制御部26に信号を送信する例を示すが、測距結果が表示されない場合にユーザが判断するようにしてもよい。 That is, after transmitting the ranging instruction signal in step 114, the main control unit 26 proceeds to step 115 and determines whether a ranging error has occurred. In this determination, it is determined whether the main control unit 26 has received a signal indicating a ranging error from the distance measurement control unit 24; if the determination is affirmative, the process proceeds to step 121, and if negative, the process proceeds to step 116 described above. Here, an example is shown in which a signal is transmitted from the distance measurement control unit 24 to the main control unit 26 when a ranging error occurs, but the user may instead make the determination when no ranging result is displayed.
 ステップ117では、主制御部26は記憶部48へのフレーム画像及び測距データの記憶を中止する。なお、フレーム画像は得られるので、測距データの記憶のみを中止するようにしてもよい。 In step 117, the main control unit 26 stops storing frame images and distance measurement data in the storage unit 48. Note that since the frame image is obtained, only the storage of the distance measurement data may be stopped.
 一方、ステップ116では、上述したように、主制御部26は、測距制御部24から出力された測距データを、測距タイミングのフレーム画像に対応して記憶部48に記憶する。 On the other hand, in step 116, as described above, the main control unit 26 stores the distance measurement data output from the distance measurement control unit 24 in the storage unit 48 corresponding to the frame image of the distance measurement timing.
 なお、上記の実施形態では、測距制御部24は、レーザ光の射出及び受光による計測を複数回（例えば、数100回）行って、被写体までの距離を導出する例を説明したが、距離の導出の際に、焦点調整結果を用いるようにしてもよい。例えば、レーザ光の射出及び受光による複数の計測結果から生成されるヒストグラム（図4）を解析する際に、AF結果から被写体までの距離の距離範囲（焦点距離及びそれを含む近傍の範囲）が分かる。そこで、図9Aに示すように、AFに基づく被写体距離範囲以外の計測結果（図9Aの斜線部分）は使用せず、AFに基づく被写体距離範囲の計測結果のみを用いて被写体までの距離を導出するようにしてもよい。これにより、距離範囲が定まれば分解能が一意に定まるので、全ての計測値を用いるよりも頻度を求める際の距離範囲の分解能を高くすることができ、綿密な数値単位で被写体までの距離を導出することができる。図9Aの例では、AFに基づく被写体距離範囲より短い距離及び長い距離の計測値を共に使用しない例を示すが、何れか一方のみを使用しないようにしてもよい。すなわち、AFに基づく被写体距離未満の距離の計測値（図9Bの斜線部分）、またはAFに基づく被写体距離より長い距離の計測値（図9Cの斜線部分）は使用せずに、被写体までの距離を導出するようにしてもよい。また、AFの代わりにマニュアルフォーカスモードにおいて手動で焦点調整した結果を用いるようにしてもよい。 In the above embodiment, an example was described in which the distance measurement control unit 24 performs measurement by laser light emission and reception multiple times (for example, several hundred times) to derive the distance to the subject; however, the focus adjustment result may also be used when deriving the distance. For example, when analyzing the histogram (FIG. 4) generated from the multiple measurement results obtained by laser light emission and reception, the distance range to the subject (the focal length and a neighboring range including it) can be determined from the AF result. Therefore, as shown in FIG. 9A, the distance to the subject may be derived using only the measurement results within the AF-based subject distance range, without using the measurement results outside that range (the hatched portion in FIG. 9A). Since the resolution is uniquely determined once the distance range is fixed, the resolution of the distance range used when obtaining the frequencies can be made higher than when all measured values are used, and the distance to the subject can be derived in finer numerical units. The example in FIG. 9A does not use measured values at distances both shorter and longer than the AF-based subject distance range, but only one of the two may be excluded. That is, the distance to the subject may be derived without using the measured values shorter than the AF-based subject distance (the hatched portion in FIG. 9B), or without using the measured values longer than the AF-based subject distance (the hatched portion in FIG. 9C). Instead of AF, the result of manual focus adjustment in the manual focus mode may also be used.
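 Restricting the histogram to the AF-based subject distance range, as in FIG. 9A, can be sketched as follows. The half-width of the range and the bin count are placeholders chosen only for illustration; dropping only the shorter or only the longer side (FIGS. 9B and 9C) follows the same pattern with one of the two bounds removed.
```python
import numpy as np

def derive_distance_with_af(per_shot_distances_m, af_distance_m,
                            half_width_m=50.0, bins=200):
    """Use only measurements inside the AF-based distance range, then take
    the most frequent histogram bin. Returns None if nothing falls in range."""
    samples = np.asarray(per_shot_distances_m, dtype=float)
    lo = af_distance_m - half_width_m
    hi = af_distance_m + half_width_m
    kept = samples[(samples >= lo) & (samples <= hi)]
    if kept.size == 0:
        return None
    # With the range fixed, the bin width (resolution) is finer than it would
    # be over the full span of all measured values.
    counts, edges = np.histogram(kept, bins=bins, range=(lo, hi))
    peak = int(np.argmax(counts))
    return 0.5 * (edges[peak] + edges[peak + 1])
```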
 また、上記の実施形態及び変形例では、主制御部26が動画撮影を開始してから測距指示信号を送信して測距を開始するようにしたが、順番を逆にしてもよい。例えば、動画撮影の開始指示や測距開始指示が操作部44によって行われた場合に、図10Aに示すように、ステップ200で、主制御部26が測距指示信号を測距制御部24に出力する。これにより、測距制御部24が測距を開始して、測距エラーがない場合に、測距データを主制御部26へ送信し、測距エラーの場合には測距エラーを表す信号を主制御部26へ送信する。次にステップ202で主制御部26が測距エラーか否かを判定し、測距データを受信して判定が否定された場合にはステップ204へ移行する。ステップ204では主制御部26が動画撮影を開始し、ステップ206で主制御部26が測距データをフレーム画像に対応して記憶する。このとき測距エラーが発生した場合には、動画撮影の可否を操作部44によって予め設定可能とするようにしてもよい。これにより、測距が不可能な場合に動画像を記憶するか否かを任意に設定することができる。 In the above embodiment and modified example, the main control unit 26 starts moving image shooting and then transmits the ranging instruction signal to start distance measurement, but the order may be reversed. For example, when an instruction to start moving image shooting or a ranging start instruction is given via the operation unit 44, the main control unit 26 outputs a ranging instruction signal to the distance measurement control unit 24 in step 200, as shown in FIG. 10A. The distance measurement control unit 24 then starts distance measurement, transmits the distance measurement data to the main control unit 26 if there is no ranging error, and transmits a signal indicating a ranging error to the main control unit 26 if a ranging error occurs. Next, in step 202, the main control unit 26 determines whether a ranging error has occurred; if the distance measurement data has been received and the determination is negative, the process proceeds to step 204. In step 204, the main control unit 26 starts moving image shooting, and in step 206, the main control unit 26 stores the distance measurement data in association with the frame images. Whether or not to shoot a moving image when a ranging error occurs at this point may be made settable in advance via the operation unit 44. This makes it possible to set arbitrarily whether to store a moving image when distance measurement is impossible.
 また、上記の実施形態において、測距制御部24が測距を行う場合、図10Bに示すように主制御部26からAF結果(又は手動の焦点調整結果)を特定する合焦状態特定情報やAE結果を特定する露出状態特定情報を取得し、取得した合焦状態特定情報及び露出状態特定情報に基づいてレーザダイオード32及びフォトダイオード36の少なくとも一方を駆動調整してもよい。すなわち、焦点調整結果(焦点距離)から大体の被写体までの距離が分かるので、AF結果を特定する合焦状態特定情報に基づいてレーザダイオード32から射出されるレーザ光の射出強度を調整するようにしてもよい。例えば、焦点距離が短いほど射出強度を小さくする。これにより、環境光がノイズとなるが、環境光のノイズに影響されない適正なレーザ光の射出強度で被写体までの距離を導出することができる。同様に、焦点調整結果から大体の被写体までの距離が分かるので、AF結果を特定する合焦状態特定情報に基づいてフォトダイオード36の受光感度を調整するようにしてもよい。例えば、焦点距離が短いほど受光感度を下げる。これにより、環境光のノイズに影響されない適正な受光感度で被写体までの距離を導出することができる。或いは、露出調整結果からレーザ光の必要な強度が分かるので、AE結果を特定する露出状態特定情報に基づいてレーザ光の射出強度を調整するようにしてもよい。例えば、露出が高いほど射出強度を小さくする。或いは、露出が高くなるということは被写体輝度が低くなることを意味するので、被写体輝度が低いほど射出強度を小さくしてもよい。これにより、環境光のノイズに影響されない適正なレーザ光の射出強度で被写体までの距離を導出することができる。 Further, in the above embodiment, when the distance measurement control unit 24 performs distance measurement, as shown in FIG. 10B, the focus state specifying information for specifying the AF result (or the manual focus adjustment result) from the main control unit 26, Exposure state specification information for specifying the AE result may be acquired, and at least one of the laser diode 32 and the photodiode 36 may be driven and adjusted based on the acquired focus state specification information and exposure state specification information. That is, since the distance from the focus adjustment result (focal length) to the approximate subject is known, the emission intensity of the laser light emitted from the laser diode 32 is adjusted based on the focus state specifying information for specifying the AF result. May be. For example, the shorter the focal length, the smaller the emission intensity. As a result, although the ambient light becomes noise, the distance to the subject can be derived with an appropriate emission intensity of the laser light that is not affected by the ambient light noise. Similarly, since the distance to the approximate subject can be known from the focus adjustment result, the light receiving sensitivity of the photodiode 36 may be adjusted based on the focus state specifying information for specifying the AF result. For example, the light receiving sensitivity is lowered as the focal length is shorter. As a result, the distance to the subject can be derived with an appropriate light receiving sensitivity that is not affected by ambient light noise. Or since the required intensity | strength of a laser beam is known from an exposure adjustment result, you may make it adjust the injection | emission intensity | strength of a laser beam based on the exposure state specific information which specifies an AE result. For example, the higher the exposure, the smaller the injection strength. Alternatively, since higher exposure means lower subject brightness, the lower the subject brightness, the lower the emission intensity. As a result, the distance to the subject can be derived with an appropriate laser beam emission intensity that is not affected by ambient light noise.
 また、上記実施形態では、被写体までの距離に関する情報がライブビュー画像に重畳させてビューファインダー46に表示される場合を例示したが、本開示の技術はこれに限定されるものではない。例えば、ライブビュー画像の表示領域とは別の表示領域に被写体までの距離に関する情報が表示されるようにしてもよい。このように、被写体までの距離に関する情報は、ライブビュー画像の表示と並行してビューファインダー46に表示されるようにすればよい。 In the above-described embodiment, the case where the information related to the distance to the subject is displayed on the viewfinder 46 while being superimposed on the live view image is illustrated, but the technology of the present disclosure is not limited to this. For example, information regarding the distance to the subject may be displayed in a display area different from the display area of the live view image. In this way, information regarding the distance to the subject may be displayed on the viewfinder 46 in parallel with the display of the live view image.
 また、上記実施形態では、測距装置10に設けられているレリーズボタンが操作される場合を例示したが、本開示の技術はこれに限定されるものではない。例えば、測距装置10に接続して使用される外部装置のUI（ユーザ・インタフェース）部によって受け付けられた撮影準備指示に従ってAE及びAFが開始され、外部装置のUI部によって受け付けられた撮影指示に従って動画撮影が開始されるようにしてもよい。測距装置10に接続して使用される外部装置の一例としては、スマートデバイス、パーソナル・コンピュータ（PC）、又は、眼鏡型若しくは腕時計型のウェアラブル端末装置が挙げられる。 In the above embodiment, the case where the release button provided on the distance measuring device 10 is operated was exemplified, but the technique of the present disclosure is not limited to this. For example, AE and AF may be started in accordance with a shooting preparation instruction received by a UI (user interface) unit of an external device connected to and used with the distance measuring device 10, and moving image shooting may be started in accordance with a shooting instruction received by the UI unit of the external device. Examples of an external device connected to and used with the distance measuring device 10 include a smart device, a personal computer (PC), and an eyeglass-type or wristwatch-type wearable terminal device.
 また、上記実施形態では、ビューファインダー46にライブビュー画像及び測距結果(被写体までの距離に関する情報)が表示される場合を例示したが、本開示の技術はこれに限定されるものではない。例えば、測距装置10に接続して使用される外部装置の表示部にライブビュー画像及び測距結果の少なくとも一方が表示されるようにしてもよい。測距装置10に接続して使用される外部装置の表示部の一例としては、スマートデバイスのディスプレイ、PCのディスプレイ、又はウェアラブル端末装置のディスプレイが挙げられる。 In the above embodiment, the case where the live view image and the distance measurement result (information on the distance to the subject) are displayed on the viewfinder 46 is exemplified, but the technology of the present disclosure is not limited to this. For example, at least one of the live view image and the distance measurement result may be displayed on a display unit of an external device that is connected to the distance measuring device 10 and used. As an example of a display unit of an external device connected to the distance measuring device 10, a smart device display, a PC display, or a wearable terminal device display can be cited.
 また、上記実施形態で説明した各動画測距処理（図5、図7、及び図10A参照）及び測距エラー処理（図8参照）はあくまでも一例である。従って、主旨を逸脱しない範囲内において不要なステップを削除したり、新たなステップを追加したり、処理順序を入れ替えたりしてもよいことは言うまでもない。また、上記実施形態で説明した動画測距処理や測距エラー処理に含まれる各処理は、プログラムを実行することにより、コンピュータを利用してソフトウェア構成により実現されてもよいし、その他のハードウェア構成で実現されてもよい。また、ハードウェア構成とソフトウェア構成の組み合わせによって実現してもよい。 The moving image ranging processes (see FIGS. 5, 7, and 10A) and the ranging error process (see FIG. 8) described in the above embodiment are merely examples. It therefore goes without saying that unnecessary steps may be deleted, new steps may be added, and the processing order may be changed without departing from the gist. Each process included in the moving image ranging processes and the ranging error process described in the above embodiment may be realized by a software configuration using a computer through execution of a program, by another hardware configuration, or by a combination of a hardware configuration and a software configuration.
 また、上記実施形態では、測距装置を一例として説明したが、デジタルカメラ等の撮影装置に本開示の技術を適用するようにしてもよい。 In the above embodiment, the distance measuring device has been described as an example. However, the technology of the present disclosure may be applied to a photographing device such as a digital camera.
 その他、本実施形態で説明した測距装置10等の構成、動作等は一例であり、本開示の技術の主旨を逸脱しない範囲内において状況に応じて変更可能であることは言うまでもない。 In addition, the configuration, operation, and the like of the distance measuring device 10 described in the present embodiment are merely examples, and it is needless to say that they can be changed according to the situation without departing from the gist of the technology of the present disclosure.
 また、上記実施の形態では、一例として図2に示すように、レーザ光の発光回数が数100回に固定化されている場合を例示したが、本開示の技術はこれに限定されるものではない。環境光は、レーザ光にとってノイズとなるため、レーザ光の発光回数は、被写体輝度に応じて定められた発光回数であってもよい。 In the above embodiment, as shown in FIG. 2 as an example, the case where the number of laser light emissions is fixed at several hundred was exemplified, but the technique of the present disclosure is not limited to this. Since ambient light becomes noise for the laser light, the number of laser light emissions may be a number determined according to the subject luminance.
 以下、レーザ光の発光回数の決め方の一例について説明する。 Hereinafter, an example of how to determine the number of times of laser light emission will be described.
 レーザ光の発光回数は、一例として図11に示す発光回数決定テーブル300から導出される。発光回数決定テーブル300では、被写体輝度が高くなるほどレーザ光の発光回数が多くなるように、被写体輝度とレーザ光の発光回数とが対応付けられている。すなわち、発光回数決定テーブル300において、被写体輝度は、L1＜L2＜・・・＜Lnの大小関係が成立しており、発光回数は、N1＜N2＜・・・＜Nnの大小関係が成立している。なお、図2に示す例では、100回単位の発光回数が例示されているが、これに限らず、発光回数は発光回数決定テーブル300によって10回単位又は1回単位で定められるようにしてもよい。 The number of laser light emissions is derived, as an example, from the emission count determination table 300 shown in FIG. 11. In the emission count determination table 300, the subject luminance and the number of laser light emissions are associated with each other so that the higher the subject luminance, the greater the number of laser light emissions. That is, in the emission count determination table 300, the subject luminances satisfy the relationship L1 < L2 < ... < Ln, and the emission counts satisfy the relationship N1 < N2 < ... < Nn. In the example shown in FIG. 2, emission counts in units of 100 are exemplified; however, the emission count is not limited to this and may be determined by the emission count determination table 300 in units of 10 or in units of 1.
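 A stand-in for the emission count determination table 300 can be written as a simple threshold lookup. The luminance thresholds and emission counts below are placeholders, not the values L1..Ln and N1..Nn of the patent; only the monotonic relationship (higher luminance gives more emissions) is taken from the description.
```python
import bisect

# Placeholder table: thresholds play the role of L1 < L2 < ...,
# counts the role of N1 < N2 < ... (one more count than thresholds).
LUMINANCE_THRESHOLDS = [50.0, 100.0, 200.0, 400.0]   # arbitrary luminance units
EMISSION_COUNTS = [100, 200, 300, 400, 500]

def emission_count_for_luminance(subject_luminance: float) -> int:
    """Higher subject luminance falls into a later bracket -> more emissions."""
    return EMISSION_COUNTS[bisect.bisect_right(LUMINANCE_THRESHOLDS, subject_luminance)]
```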
 測距装置10では、発光回数決定テーブル300によるレーザ光の発光回数の導出を実現するために、主制御部26によって輝度情報送信処理（図12参照）が実行され、測距制御部24によって発光回数決定処理（図13参照）が実行される。 In the distance measuring device 10, in order to realize derivation of the number of laser light emissions using the emission count determination table 300, the luminance information transmission process (see FIG. 12) is executed by the main control unit 26, and the emission count determination process (see FIG. 13) is executed by the distance measurement control unit 24.
 先ず、測距装置10の電源スイッチがオンされると主制御部26によって実行される輝度情報送信処理について図12を参照して説明する。 First, luminance information transmission processing executed by the main control unit 26 when the power switch of the distance measuring device 10 is turned on will be described with reference to FIG.
 図12に示す輝度情報送信処理では、先ず、ステップ400で、主制御部26は、被写体輝度の取得を開始する条件である輝度取得開始条件を満たしたか否かを判定する。輝度取得開始条件の一例としては、レリーズボタンが半押しされたとの条件が挙げられる。輝度取得開始条件の他の例としては、撮像素子42から撮影画像が出力されたとの条件が挙げられる。 In the luminance information transmission processing shown in FIG. 12, first, in step 400, the main control unit 26 determines whether or not a luminance acquisition start condition that is a condition for starting acquisition of subject luminance is satisfied. An example of the luminance acquisition start condition is a condition that the release button is half-pressed. Another example of the luminance acquisition start condition is a condition that a captured image is output from the image sensor 42.
 ステップ400において、輝度取得開始条件を満たした場合は、判定が肯定されて、ステップ402へ移行する。ステップ400において、輝度取得開始条件を満たしていない場合は、判定が否定されて、ステップ406へ移行する。 In step 400, if the luminance acquisition start condition is satisfied, the determination is affirmed and the routine proceeds to step 402. If the luminance acquisition start condition is not satisfied at step 400, the determination is negative and the routine proceeds to step 406.
 ステップ402で、主制御部26は、撮影画像から被写体輝度を取得し、その後、ステップ404へ移行する。なお、ここでは、撮影画像から被写体輝度が取得される場合を例示しているが、本開示の技術はこれに限定されるものではない。例えば、被写体輝度を検出する輝度センサが測距装置10に搭載されているのであれば、主制御部26は、輝度センサから被写体輝度を取得してもよい。 In step 402, the main control unit 26 acquires the subject brightness from the photographed image, and then proceeds to step 404. Note that, here, the case where the subject brightness is acquired from the captured image is illustrated, but the technology of the present disclosure is not limited to this. For example, if a brightness sensor that detects subject brightness is mounted on the distance measuring device 10, the main control unit 26 may acquire the subject brightness from the brightness sensor.
 ステップ404で、主制御部26は、ステップ402で取得した被写体輝度を示す輝度情報を測距制御部24に送信し、その後、ステップ406へ移行する。 In step 404, the main control unit 26 transmits the luminance information indicating the subject luminance acquired in step 402 to the distance measurement control unit 24, and then proceeds to step 406.
 ステップ406で、主制御部26は、本輝度情報送信処理を終了する条件である終了条件を満たしたか否かを判定する。終了条件の一例としては、測距装置10の電源スイッチがオフされたとの条件が挙げられる。ステップ406において、終了条件を満たしていない場合は、判定が否定されて、ステップ400へ移行する。ステップ406において、終了条件を満たした場合は、判定が肯定されて、本輝度情報送信処理を終了する。 In step 406, the main control unit 26 determines whether or not an end condition that is a condition for ending the luminance information transmission process is satisfied. An example of the end condition is a condition that the power switch of the distance measuring device 10 is turned off. If the termination condition is not satisfied at step 406, the determination is negative and the routine proceeds to step 400. In step 406, if the end condition is satisfied, the determination is affirmed, and the luminance information transmission process ends.
 次に、測距装置10の電源スイッチがオンされると測距制御部24によって実行される発光回数決定処理について図13を参照して説明する。 Next, the light emission number determination process executed by the distance measurement control unit 24 when the power switch of the distance measuring device 10 is turned on will be described with reference to FIG.
 図13に示す発光回数決定処理では、先ず、ステップ410で、測距制御部24は、ステップ404の処理が実行されることによって送信された輝度情報を受信したか否かを判定する。ステップ410において、ステップ404の処理が実行されることによって送信された輝度情報を受信していない場合は、判定が否定されて、ステップ416へ移行する。ステップ410において、ステップ404の処理が実行されることによって送信された輝度情報を受信した場合は、判定が肯定されて、ステップ412へ移行する。 In the light emission number determination process shown in FIG. 13, first, in step 410, the distance measurement control unit 24 determines whether or not the luminance information transmitted by executing the process of step 404 has been received. In step 410, when the luminance information transmitted by executing the process of step 404 is not received, the determination is negative and the process proceeds to step 416. In step 410, when the luminance information transmitted by executing the process of step 404 is received, the determination is affirmed and the process proceeds to step 412.
 ステップ412で、測距制御部24は、発光回数決定テーブル300から、ステップ410で受信した輝度情報により示される被写体輝度に対応する発光回数を導出し、その後、ステップ414へ移行する。 In step 412, the distance measurement control unit 24 derives the number of times of light emission corresponding to the subject luminance indicated by the luminance information received in step 410 from the light emission number determination table 300, and then proceeds to step 414.
 ステップ414で、測距制御部24は、ステップ412の処理で導出した発光回数を記憶部48に記憶し、その後、ステップ416へ移行する。なお、本ステップ416の処理によって記憶部48に記憶された発光回数は、実際に測距が実行される際のレーザ光の発光回数として採用される。 In step 414, the distance measurement control unit 24 stores the number of times of light emission derived in the process of step 412 in the storage unit 48, and then proceeds to step 416. Note that the number of times of light emission stored in the storage unit 48 by the processing in step 416 is employed as the number of times of laser light emission when the distance measurement is actually executed.
 ステップ416で、主制御部26は、本発光回数決定処理を終了する条件である終了条件を満たしたか否かを判定する。終了条件の一例としては、測距装置10の電源スイッチがオフされたとの条件が挙げられる。ステップ416において、終了条件を満たしていない場合は、判定が否定されて、ステップ410へ移行する。ステップ416において、終了条件を満たした場合は、判定が肯定されて、本発光回数決定処理を終了する。 In step 416, the main control unit 26 determines whether or not an end condition that is a condition for ending the main light emission number determination process is satisfied. An example of the end condition is a condition that the power switch of the distance measuring device 10 is turned off. If the termination condition is not satisfied at step 416, the determination is negative and the routine proceeds to step 410. In step 416, if the end condition is satisfied, the determination is affirmed and the main light emission number determination process is ended.
 次に、レーザ光の発光回数の決め方の他の例について説明する。 Next, another example of how to determine the number of times of laser light emission will be described.
 レーザ光の発光回数は、一例として図14に示す発光回数決定テーブル500に従って導出される。発光回数決定テーブル500では、被写体輝度に応じて一意に定まる露出状態特定情報（E1,E2,・・・,En）とレーザ光の発光回数（N1,N2,・・・,Nn）とが対応付けられている。なお、ここで、被写体輝度に応じて一意に定まる露出状態特定情報とは、例えば、被写体輝度が高いほど低くなる露出を示す露出状態特定情報を意味する。 The number of laser light emissions is derived, as an example, in accordance with the emission count determination table 500 shown in FIG. 14. In the emission count determination table 500, exposure state specifying information (E1, E2, ..., En) uniquely determined according to the subject luminance is associated with the number of laser light emissions (N1, N2, ..., Nn). Here, the exposure state specifying information uniquely determined according to the subject luminance means, for example, exposure state specifying information indicating an exposure that becomes lower as the subject luminance becomes higher.
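 Table 500 differs from table 300 only in its key: it is indexed by the exposure state rather than by the luminance itself. A compact stand-in, with placeholder keys and counts, could look like the following sketch.
```python
# Placeholder mapping standing in for table 500: exposure state identifiers
# (E1, E2, ...) map to emission counts (N1, N2, ...). Values are illustrative.
EXPOSURE_STATE_TO_EMISSIONS = {"E1": 100, "E2": 200, "E3": 300, "E4": 400}

def emission_count_for_exposure(exposure_state: str, default: int = 300) -> int:
    """Look up the emission count for the given exposure state identifier."""
    return EXPOSURE_STATE_TO_EMISSIONS.get(exposure_state, default)
```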
 発光回数決定テーブル500を用いてレーザ光の発光回数を導出する場合、主制御部26によって露出状態特定情報送信処理（図15参照）が実行され、測距制御部24によって発光回数決定処理（図16参照）が実行される。 When the number of laser light emissions is derived using the emission count determination table 500, the exposure state specifying information transmission process (see FIG. 15) is executed by the main control unit 26, and the emission count determination process (see FIG. 16) is executed by the distance measurement control unit 24.
 先ず、測距装置10の電源スイッチがオンされると主制御部26によって実行される露出状態特定情報送信処理について図15を参照して説明する。 First, an exposure state specifying information transmission process executed by the main control unit 26 when the power switch of the distance measuring device 10 is turned on will be described with reference to FIG.
 図15に示す露出状態特定情報送信処理では、先ず、ステップ600で、主制御部26は、レリーズボタンが半押しされたか否かを判定する。ステップ600において、レリーズボタンが半押しされていない場合は、判定が否定されて、ステップ606へ移行する。ステップ600において、レリーズボタンが半押しされた場合は、判定が肯定されて、ステップ602へ移行する。なお、図15では、操作部44にレリーズボタンが備えられている場合を例に挙げて説明するが、本開示の技術はこれに限定されるものではない。例えば、操作部44に測距撮影開始ボタンが備えられている場合には、ステップ600を省略して、電源が投入された場合にステップ602の処理が開始されるようにすればよい。 In the exposure state specifying information transmission process shown in FIG. 15, first, at step 600, the main control unit 26 determines whether or not the release button is half-pressed. If it is determined in step 600 that the release button has not been pressed halfway, the determination is negative and the process proceeds to step 606. If the release button is pressed halfway in step 600, the determination is affirmed and the routine proceeds to step 602. In FIG. 15, a case where a release button is provided in the operation unit 44 will be described as an example, but the technology of the present disclosure is not limited to this. For example, when the operation unit 44 is provided with a distance measurement start button, step 600 may be omitted, and the process of step 602 may be started when the power is turned on.
 ステップ602で、主制御部26は、撮影画像から取得した被写体輝度に基づいてAEを行い、その後、ステップ604へ移行する。 In step 602, the main control unit 26 performs AE based on the subject brightness acquired from the captured image, and then proceeds to step 604.
 ステップ604で、主制御部26は、露出状態特定情報を測距制御部24に送信し、その後、ステップ606へ移行する。 In step 604, the main control unit 26 transmits the exposure state specifying information to the distance measurement control unit 24, and then proceeds to step 606.
 ステップ606で、主制御部26は、本露出状態特定情報送信処理を終了する条件である終了条件を満たしたか否かを判定する。終了条件の一例としては、測距装置10の電源スイッチがオフされたとの条件が挙げられる。ステップ606において、終了条件を満たしていない場合は、判定が否定されて、ステップ600へ移行する。ステップ606において、終了条件を満たした場合は、判定が肯定されて、本露出状態特定情報送信処理を終了する。 In step 606, the main control unit 26 determines whether or not an end condition that is a condition for ending the exposure state specifying information transmission process is satisfied. An example of the end condition is a condition that the power switch of the distance measuring device 10 is turned off. If it is determined in step 606 that the termination condition is not satisfied, the determination is negative and the process proceeds to step 600. In step 606, when the end condition is satisfied, the determination is affirmed, and the exposure state specifying information transmission process is ended.
 次に、測距装置10の電源スイッチがオンされると測距制御部24によって実行される発光回数決定処理について図16を参照して説明する。 Next, the light emission number determination process executed by the distance measurement control unit 24 when the power switch of the distance measuring device 10 is turned on will be described with reference to FIG.
 図16に示す発光回数決定処理では、先ず、ステップ610で、測距制御部24は、ステップ604の処理が実行されることによって送信された露出状態特定情報を受信したか否かを判定する。ステップ610において、ステップ604の処理が実行されることによって送信された露出状態特定情報を受信していない場合は、判定が否定されて、ステップ616へ移行する。ステップ610において、ステップ604の処理が実行されることによって送信された露出状態特定情報を受信した場合は、判定が肯定されて、ステップ612へ移行する。 In the light emission number determination process shown in FIG. 16, first, in step 610, the distance measurement control unit 24 determines whether or not the exposure state specifying information transmitted by executing the process of step 604 has been received. If it is determined in step 610 that the exposure state specifying information transmitted by executing the process of step 604 has not been received, the determination is negative and the routine proceeds to step 616. In step 610, when the exposure state specifying information transmitted by executing the process of step 604 is received, the determination is affirmed and the process proceeds to step 612.
 ステップ612で、測距制御部24は、発光回数決定テーブル500から、ステップ610で受信した露出状態特定情報に対応する発光回数を導出し、その後、ステップ614へ移行する。 In step 612, the distance measurement control unit 24 derives the number of times of light emission corresponding to the exposure state specifying information received in step 610 from the light emission number determination table 500, and then proceeds to step 614.
In step 614, the distance measurement control unit 24 stores the light emission count derived in step 612 in the storage unit 48, and then proceeds to step 616. The light emission count stored in the storage unit 48 by the process of step 614 is used as the number of laser light emissions when distance measurement is actually executed.
In step 616, the distance measurement control unit 24 determines whether or not an end condition, that is, a condition for ending this light emission count determination process, is satisfied. One example of the end condition is that the power switch of the distance measuring device 10 has been turned off. If the end condition is not satisfied, the determination in step 616 is negative and the process returns to step 610. If the end condition is satisfied, the determination in step 616 is affirmative and this light emission count determination process ends.
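Assuming the same hypothetical interfaces as in the previous sketch, the FIG. 16 flow can be written as a second polling loop that looks up the light emission count determination table 500, modeled here as a plain dictionary whose keys and values are invented for illustration only.

```python
# Illustrative sketch of the FIG. 16 flow (steps 610 to 616).
# The light emission count determination table 500 is modeled as a dict;
# its keys and values are invented for illustration only.

EMISSION_COUNT_TABLE_500 = {
    "bright": 5,   # higher subject brightness -> more emissions
    "normal": 3,
    "dark": 1,
}

def emission_count_determination_loop(ranging_ctrl, storage_unit_48):
    """Look up and store the emission count for each received exposure state."""
    while True:
        info = ranging_ctrl.poll_exposure_state()          # step 610: None if nothing received
        if info is not None:
            count = EMISSION_COUNT_TABLE_500[info]          # step 612: table lookup
            storage_unit_48["emission_count"] = count       # step 614: store for later ranging
        if ranging_ctrl.end_condition_met():                # step 616: e.g. power switch turned off
            break
```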
As described above, the distance measuring device 10 increases the number of laser light emissions (the number of distance measurements) as the subject brightness increases. Therefore, compared with a case where the number of laser light emissions (the number of distance measurements) is fixed regardless of the subject brightness, a distance measurement result in which the influence of ambient light noise is reduced can be obtained.
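The benefit of repeating the measurement can be illustrated with a generic time-of-flight simulation: each emission yields a distance estimate d = c·Δt/2 corrupted by timing noise, and taking the most frequent binned distance over many emissions, echoing the frequency-based selection recited in claim 12 below, suppresses outliers caused by ambient light. The sketch below is illustrative only, with assumed noise and bin parameters, and is not the device's actual algorithm.

```python
import random
from collections import Counter

C = 299_792_458.0  # speed of light in m/s

def single_tof_distance(true_distance_m, timing_noise_s):
    """One time-of-flight sample: distance = c * (round-trip time) / 2, with timing noise."""
    round_trip_s = 2.0 * true_distance_m / C
    measured_s = round_trip_s + random.gauss(0.0, timing_noise_s)  # illustrative ambient-light noise
    return C * measured_s / 2.0

def ranged_distance(true_distance_m, emission_count, timing_noise_s=2e-9, bin_m=0.5):
    """Repeat the measurement and return the most frequent binned distance."""
    samples = [single_tof_distance(true_distance_m, timing_noise_s)
               for _ in range(emission_count)]
    bins = Counter(round(d / bin_m) * bin_m for d in samples)
    return bins.most_common(1)[0][0]

# With more emissions, the most frequent bin clusters more reliably around the true distance.
print(ranged_distance(100.0, emission_count=1))
print(ranged_distance(100.0, emission_count=50))
```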
In the above embodiment, laser light is exemplified as the light for distance measurement, but the technology of the present disclosure is not limited to this; any directional light, that is, light having directivity, may be used. For example, the light may be directional light obtained from a light emitting diode (LED) or a super luminescent diode (SLD). The directivity of the directional light is preferably comparable to that of laser light; for example, it is preferably a directivity that can be used for distance measurement over a range of several meters to several kilometers.
The disclosures of Japanese Patent Application No. 2014-095538 filed on May 2, 2014 and Japanese Patent Application No. 2014-159805 filed on August 5, 2014 are incorporated herein by reference in their entirety.
All documents, patent applications, and technical standards mentioned in this specification are incorporated herein by reference to the same extent as if each individual document, patent application, and technical standard were specifically and individually indicated to be incorporated by reference.
With respect to the above embodiment, the following appendices are further disclosed.
(Appendix 1)
 A distance measuring device comprising:
 an imaging unit that performs moving image capturing in which a subject image representing a subject and formed by an imaging optical system is captured to generate a plurality of continuous frame images;
 an emitting unit that emits laser light along the optical axis direction of the imaging optical system;
 a light receiving unit that receives reflected light of the laser light from the subject;
 a deriving unit that derives a distance to the subject based on the timing at which the laser light is emitted by the emitting unit and the timing at which the reflected light is received by the light receiving unit; and
 a storage unit that, when the moving image capturing is performed by the imaging unit, stores distance data indicating the distance derived by the deriving unit in association with at least one of the frame images.
(Appendix 2)
 A distance measuring method comprising:
 performing moving image capturing in which a subject image representing a subject and formed by an imaging optical system is captured to generate a plurality of continuous frame images;
 receiving, by a light receiving unit, reflected light from the subject of laser light emitted from an emitting unit along the optical axis direction of the imaging optical system;
 deriving a distance to the subject based on the timing at which the laser light is emitted and the timing at which the reflected light is received; and
 storing, in a storage unit, distance data indicating the derived distance in association with at least one of the frame images.
(Appendix 3)
 A distance measuring program for causing a computer to execute a process comprising:
 performing moving image capturing in which a subject image representing a subject and formed by an imaging optical system is captured to generate a plurality of continuous frame images;
 receiving, by a light receiving unit, reflected light from the subject of laser light emitted from an emitting unit along the optical axis direction of the imaging optical system;
 deriving a distance to the subject based on the timing at which the laser light is emitted and the timing at which the reflected light is received; and
 storing, in a storage unit, distance data indicating the derived distance in association with at least one of the frame images.

Claims (23)

  1.  A distance measuring device comprising:
      an imaging unit that performs moving image capturing in which a subject image representing a subject and formed by an imaging optical system is captured to generate a plurality of continuous frame images;
      an emitting unit that emits directional light, which is light having directivity, along the optical axis direction of the imaging optical system;
      a light receiving unit that receives reflected light of the directional light from the subject;
      a deriving unit that derives a distance to the subject based on the timing at which the directional light is emitted by the emitting unit and the timing at which the reflected light is received by the light receiving unit; and
      a storage unit that, when the moving image capturing is performed by the imaging unit, stores distance data indicating the distance derived by the deriving unit in association with at least one of the frame images.
  2.  The distance measuring device according to claim 1, wherein the deriving unit derives the distance at a timing at which a predetermined ranging trigger signal is generated during the moving image capturing by the imaging unit.
  3.  The distance measuring device according to claim 1 or 2, wherein the storage unit stores the distance data derived by the deriving unit in association with at least one of the frame image preceding and the frame image following the frame image at the timing at which a predetermined ranging trigger signal is generated.
  4.  The distance measuring device according to any one of claims 1 to 3, wherein, when the moving image capturing is performed by the imaging unit, the storage unit stores, for each of a plurality of predetermined frame images, the distance data at the time of capturing of the corresponding frame image in association with that frame image.
  5.  The distance measuring device according to claim 4, further comprising a frame image number setting unit for setting the number of the frame images.
  6.  The distance measuring device according to any one of claims 1 to 3, wherein the deriving unit derives the distance in synchronization with the frame images, and the storage unit stores individual distance data in association with each frame image.
  7.  The distance measuring device according to any one of claims 1 to 6, wherein the imaging unit performs the moving image capturing while derivation of the distance is being instructed by an instruction unit for instructing derivation of the distance to the subject by the deriving unit.
  8.  The distance measuring device according to any one of claims 1 to 7, wherein the imaging unit starts the moving image capturing when the distance is derived by the deriving unit.
  9.  The distance measuring device according to any one of claims 1 to 8, further comprising setting means for setting in advance whether or not moving image capturing by the imaging unit is performed when the deriving unit cannot derive the distance.
  10.  The distance measuring device according to any one of claims 1 to 9, wherein the storage unit stops storing when the deriving unit cannot derive the distance.
  11.  The distance measuring device according to any one of claims 1 to 10, wherein the deriving unit derives the distance when there is no focus adjustment error by a focus adjustment unit that performs focus adjustment of the imaging optical system on the subject.
  12.  The distance measuring device according to any one of claims 1 to 11, wherein, in a case where the deriving unit performs derivation of the distance a plurality of times and derives, as a final distance, a distance having a high frequency among the distances obtained by the plurality of derivations, the deriving unit determines, when deriving the distance, a distance range used for obtaining the frequency or a time range from emission to reception of the directional light, based on an adjustment result of a focus adjustment unit that performs focus adjustment of the imaging optical system on the subject, and derives the distance to the subject at a resolution determined according to the determined range.
  13.  The distance measuring device according to any one of claims 1 to 12, wherein the emitting unit is capable of adjusting an emission intensity of the directional light and, when the distance is derived, emits the directional light with the emission intensity adjusted based on an adjustment result of a focus adjustment unit that performs focus adjustment of the imaging optical system on the subject.
  14.  The distance measuring device according to claim 13, wherein the emitting unit decreases the emission intensity as the focal length adjusted by the focus adjustment unit becomes shorter.
  15.  The distance measuring device according to any one of claims 1 to 14, wherein the light receiving unit is capable of adjusting a light receiving sensitivity and, when the distance is derived, receives the reflected light with the light receiving sensitivity adjusted based on an adjustment result of a focus adjustment unit that performs focus adjustment of the imaging optical system on the subject.
  16.  The distance measuring device according to claim 15, wherein the light receiving unit lowers the light receiving sensitivity as the focal length adjusted by the focus adjustment unit becomes shorter.
  17.  The distance measuring device according to any one of claims 1 to 16, wherein the emitting unit is capable of adjusting an emission intensity of the directional light and emits the directional light with the emission intensity adjusted based on subject brightness or exposure state specifying information.
  18.  The distance measuring device according to claim 17, wherein the emitting unit decreases the emission intensity as the subject brightness becomes lower or as the exposure indicated by the exposure state specifying information becomes higher.
  19.  The distance measuring device according to any one of claims 1 to 18, further comprising a display unit that displays a moving image obtained by capturing by the imaging unit and displays information on the distance to the subject derived by the deriving unit.
  20.  The distance measuring device according to any one of claims 1 to 19, wherein distance measurement by the emitting unit, the light receiving unit, and the deriving unit is performed a number of times predetermined according to subject brightness or exposure state specifying information.
  21.  The distance measuring device according to claim 20, wherein the distance measurement by the emitting unit, the light receiving unit, and the deriving unit is performed more times as the subject brightness becomes higher or as the exposure indicated by the exposure state specifying information becomes lower.
  22.  A distance measuring method comprising:
      performing moving image capturing in which a subject image representing a subject and formed by an imaging optical system is captured to generate a plurality of continuous frame images;
      receiving, by a light receiving unit, reflected light from the subject of directional light, which is light having directivity, emitted from an emitting unit along the optical axis direction of the imaging optical system;
      deriving a distance to the subject based on the timing at which the directional light is emitted and the timing at which the reflected light is received; and
      storing, in a storage unit, distance data indicating the derived distance in association with at least one of the frame images.
  23.  A distance measuring program for causing a computer to execute a process comprising:
      performing moving image capturing in which a subject image representing a subject and formed by an imaging optical system is captured to generate a plurality of continuous frame images;
      receiving, by a light receiving unit, reflected light from the subject of directional light, which is light having directivity, emitted from an emitting unit along the optical axis direction of the imaging optical system;
      deriving a distance to the subject based on the timing at which the directional light is emitted and the timing at which the reflected light is received; and
      storing, in a storage unit, distance data indicating the derived distance in association with at least one of the frame images.
PCT/JP2015/056873 2014-05-02 2015-03-09 Distance-measurement device, distance-measurement method, and distance-measurement program WO2015166711A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2014-095538 2014-05-02
JP2014095538 2014-05-02
JP2014-159805 2014-08-05
JP2014159805 2014-08-05

Publications (1)

Publication Number Publication Date
WO2015166711A1 (en)

Family

ID=54358452

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/056873 WO2015166711A1 (en) 2014-05-02 2015-03-09 Distance-measurement device, distance-measurement method, and distance-measurement program

Country Status (1)

Country Link
WO (1) WO2015166711A1 (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003139534A (en) * 2001-10-30 2003-05-14 Pentax Corp Electronic distance meter
JP2003344046A (en) * 2002-05-22 2003-12-03 Pentax Corp Light wave range finder
JP2006101310A (en) * 2004-09-30 2006-04-13 Casio Comput Co Ltd Photographing apparatus and program therefor
JP2006322834A (en) * 2005-05-19 2006-11-30 Nikon Corp Distance measuring instrument and distance measuring method
JP2008042580A (en) * 2006-08-07 2008-02-21 Chugoku Electric Power Co Inc:The Imaging apparatus
JP2009103463A (en) * 2007-10-19 2009-05-14 Nissan Motor Co Ltd Light detection device, light detection method, and vehicle
JP2011234871A (en) * 2010-05-10 2011-11-24 Olympus Corp Endoscope system
JP2013157946A (en) * 2012-01-31 2013-08-15 Jvc Kenwood Corp Image processing device, image processing method, and image processing program

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2015166712A1 (en) * 2014-05-02 2017-04-20 富士フイルム株式会社 Ranging device, ranging method, and ranging program
CN107505619A (en) * 2017-06-30 2017-12-22 努比亚技术有限公司 A kind of terminal imaging method, camera shooting terminal and computer-readable recording medium
CN114761757A (en) * 2019-11-29 2022-07-15 富士胶片株式会社 Information processing apparatus, information processing method, and program
US11877056B2 (en) 2019-11-29 2024-01-16 Fujifilm Corporation Information processing apparatus, information processing method, and program

Similar Documents

Publication Publication Date Title
JP6185178B2 (en) Ranging device, ranging method, and ranging program
JP6224232B2 (en) Ranging device, ranging method, and ranging program
JP6321145B2 (en) Ranging device, ranging method, and ranging program
US10887520B2 (en) Distance measurement device, distance measurement method, and distance measurement program
US11650315B2 (en) Distance measurement device, distance measurement method, and distance measurement program
US8274598B2 (en) Image capturing apparatus and control method therefor
WO2015166711A1 (en) Distance-measurement device, distance-measurement method, and distance-measurement program
JP2010193498A (en) Photographing device and exposure control method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15786716

Country of ref document: EP

Kind code of ref document: A1

DPE2 Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15786716

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP