WO2022181095A1 - Control device, imaging device, control method, and program - Google Patents


Info

Publication number: WO2022181095A1
Authority: WIPO (PCT)
Application number: PCT/JP2022/000783
Other languages: French (fr); Japanese (ja)
Prior art keywords: mode; imaging; light; image; temperature
Inventors: 哲也 藤川, 智大 島田, 臣一 下津, 敏浩 青井
Applicant: 富士フイルム株式会社 (FUJIFILM Corporation)
Priority application: JP2023502151A (JPWO2022181095A1)
Publication: WO2022181095A1


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00 Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/48 Thermography; Techniques using wholly visual means
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules

Definitions

  • the technology of the present disclosure relates to a control device, an imaging device, a control method, and a program.
  • Japanese Patent Application Laid-Open No. 10-134272 describes a monitoring system for disaster prevention in large spaces, which detects fire outbreaks and intruders from thermal images of the large space and also supports environmental control.
  • The disclosed system comprises an infrared camera; image data storage means for storing thermal image data taken out of the infrared camera and processed; temperature filter selection means having a plurality of temperature filters covering different temperature ranges, from room temperature up to the range required for fire monitoring, and selecting a temperature filter for the infrared camera from among them; and control processing means that, in accordance with preset switching timings of a plurality of monitoring modes, selects a monitoring mode and the corresponding temperature filter, takes thermal image data out of the infrared camera each time, and executes the processing of each monitoring mode.
  • Japanese Patent Application Publication No. 2004-534941 discloses a handheld infrared camera including a lens assembly supported by a housing, the housing containing a source of electrical energy and means for processing information received via the lens assembly. The housing is provided with user control means for manually and visually controlling the device and has a substantially elongated configuration, with the lens assembly mounted at one end and a user handle at the opposite end. A manual control assembly intended to be operated by the user's thumb and a visual control are provided on one side of the housing, the visual control being positioned between the manual control assembly and the lens assembly so that it can be viewed while the infrared camera is held away from the user's eyes and body. The infrared camera is characterized in that it is intended to be operated with one hand.
  • One embodiment according to the technology of the present disclosure provides a control device, an imaging device, a control method, and a program that can make the details of control over a controlled object differ between the first mode and the second mode.
  • A first aspect of the technology of the present disclosure is a control device including a processor and a memory connected to or built into the processor, wherein the processor switches between a first mode for performing imaging based on light received by an image sensor of an imaging device and a second mode for deriving temperature based on near-infrared light received by the image sensor, and a control factor differs between the first mode and the second mode.
  • a second aspect of the technology of the present disclosure is the control device according to the first aspect, wherein the control factor includes a display control factor to be displayed on the display.
  • A third aspect of the technology of the present disclosure is the control device according to the second aspect, wherein in the first mode the processor sets, as the display control factor, a factor for displaying on the display a captured image obtained by light being received by the image sensor, and in the second mode the processor sets, as the display control factor, a temperature information display factor for displaying on the display temperature information indicating temperature.
  • A fourth aspect of the technology of the present disclosure is the control device according to any one of the first to third aspects, wherein the control factor includes a light projection control factor for operating the light projector, and in the second mode the processor sets, as the light projection control factor, a light projection suppression control factor for suppressing light projection by the light projector.
  • a fifth aspect of the technology of the present disclosure is the control device according to any one of the first to fourth aspects, wherein the control factor includes an imaging setting factor related to imaging settings.
  • A sixth aspect of the technology of the present disclosure is the control device according to the fifth aspect, wherein the imaging settings include at least one of a setting related to the projector, a setting related to the shutter speed, a setting related to the aperture, a setting related to photometry, a setting related to the sensitivity of the image sensor, a setting related to high dynamic range, and a setting related to anti-vibration control.
  • A seventh aspect of the technology of the present disclosure is the control device according to any one of the first to sixth aspects, wherein the control factor includes an image processing setting factor related to image processing settings.
  • An eighth aspect of the technology of the present disclosure is the control device according to the seventh aspect, wherein the image processing settings include at least one of noise reduction settings, sharpness settings, contrast settings, and tone settings.
  • A ninth aspect of the technology of the present disclosure is a control device including a processor and a memory connected to or built into the processor, wherein the processor switches between a first mode for performing imaging based on light received by an image sensor of an imaging device and a second mode for deriving temperature based on near-infrared light received by the image sensor, setting the first mode when the light projector state is on and setting the second mode when the light projector state is off.
  • a tenth aspect of the technology of the present disclosure is the control device according to the ninth aspect, wherein the processor switches from the second mode to the first mode in accordance with the temperature in the second mode.
  • An eleventh aspect of the technology of the present disclosure is the control device according to the tenth aspect, wherein the light projector performs pulsed light emission, and the processor repeats, in accordance with the emission timing of the pulsed light emission, an operation of setting the first mode during the emission period of the pulsed light emission and setting the second mode during the stop period.
  • A twelfth aspect of the technology of the present disclosure is the control device according to any one of the first to eleventh aspects, wherein the processor outputs a synthesized image obtained by synthesizing a captured image and temperature information indicating temperature.
  • a thirteenth aspect of the technology of the present disclosure is an imaging device including the control device according to any one of the first to twelfth aspects, and an image sensor.
  • A fourteenth aspect of the technology of the present disclosure is a control method including switching between a first mode of performing imaging based on light received by an image sensor of an imaging device and a second mode of deriving temperature based on near-infrared light received by the image sensor, wherein a control factor differs between the first mode and the second mode.
  • A fifteenth aspect of the technology of the present disclosure is a control method including switching between a first mode of performing imaging based on light received by an image sensor of an imaging device and a second mode of deriving temperature based on near-infrared light received by the image sensor, setting the first mode when the operation of the light projector is on and setting the second mode when the operation of the light projector is off.
  • A sixteenth aspect of the technology of the present disclosure is a program for causing a computer to execute processing including switching between a first mode for capturing an image based on light received by an image sensor of an imaging device and a second mode for deriving temperature based on near-infrared light received by the image sensor, wherein a control factor differs between the first mode and the second mode.
  • A seventeenth aspect of the technology of the present disclosure is a program for causing a computer to execute processing including switching between a first mode for capturing an image based on light received by an image sensor of an imaging device and a second mode for deriving temperature based on near-infrared light received by the image sensor, and setting the first mode when the operation of the light projector is on and setting the second mode when the operation of the light projector is off.
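  • The mode-switching behavior described in the aspects above (different control factors per mode, projector-state-driven mode selection, and per-mode display and light projection factors) can be illustrated with a small sketch. This is not the disclosed implementation; all identifiers (Mode, Controller, display_factor, and so on) are hypothetical, added only to make the relationships concrete.

```python
from enum import Enum, auto


class Mode(Enum):
    IMAGING = auto()      # first mode: imaging based on received light
    TEMPERATURE = auto()  # second mode: derive temperature from near-infrared light


class Controller:
    """Hypothetical controller whose control factors differ between the two modes."""

    def __init__(self) -> None:
        self.mode = Mode.IMAGING
        self.display_factor = "captured_image"
        self.projection_suppressed = False

    def set_mode_from_projector(self, projector_on: bool) -> None:
        # Ninth aspect: first mode while the projector is on, second while off.
        self.mode = Mode.IMAGING if projector_on else Mode.TEMPERATURE
        self._apply_control_factors()

    def _apply_control_factors(self) -> None:
        # Third and fourth aspects: display factor and projection factor per mode.
        if self.mode is Mode.IMAGING:
            self.display_factor = "captured_image"    # show the captured image
            self.projection_suppressed = False
        else:
            self.display_factor = "temperature_info"  # show temperature information
            self.projection_suppressed = True         # suppress light projection


ctrl = Controller()
ctrl.set_mode_from_projector(projector_on=False)
```

Calling `set_mode_from_projector(projector_on=False)` leaves the sketch in the temperature mode with projection suppressed; alternating the argument at each pulse edge would mimic the eleventh aspect's emission-synchronized repetition.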
  • A perspective view showing an example of a camera according to the first embodiment.
  • A block diagram showing an example of the internal configuration of the camera according to the first embodiment.
  • A block diagram showing an example of the electrical configuration of the camera according to the first embodiment.
  • An explanatory diagram showing an example of the configuration and operation of the turret filter according to the first embodiment.
  • A block diagram showing an example of the functional configuration of the CPU according to the first embodiment.
  • A block diagram showing an example of the configuration of the CPU as a mode switching processing unit according to the first embodiment.
  • A block diagram showing an example of the configuration of the CPU as an imaging processing unit according to the first embodiment.
  • A front view showing an example of a captured image obtained by the imaging processing according to the first embodiment.
  • A block diagram showing an example of the configuration of the CPU as a temperature measurement processing unit according to the first embodiment.
  • An explanatory diagram showing an example of the function of the CPU as a wavelength selector according to the first embodiment.
  • A front view showing a first example of a synthesized image obtained by the temperature measurement processing according to the first embodiment.
  • A front view showing a second example of a synthesized image obtained by the temperature measurement processing according to the first embodiment.
  • A front view showing a third example of a synthesized image obtained by the temperature measurement processing according to the first embodiment.
  • A front view showing a fourth example of a synthesized image obtained by the temperature measurement processing according to the first embodiment.
  • A flowchart showing an example of the flow of mode switching processing according to the first embodiment.
  • A flowchart showing an example of the flow of imaging processing according to the first embodiment.
  • A flowchart showing an example of the flow of temperature measurement processing according to the first embodiment.
  • A front view showing a modified example of the captured image obtained by the imaging processing according to the first embodiment.
  • A front view showing a modified example of the synthesized image obtained by the temperature measurement processing according to the first embodiment.
  • A block diagram showing an example of the configuration of the CPU as a mode switching processing unit according to the second embodiment.
  • A flowchart showing an example of the flow of mode switching processing according to the second embodiment.
  • A block diagram showing an example of the configuration of the CPU as a parameter change processing unit according to the third embodiment.
  • A block diagram showing an example of the configuration of the CPU as an imaging processing unit according to the third embodiment.
  • A block diagram showing an example of the configuration of the CPU as a temperature measurement processing unit according to the third embodiment.
  • A flowchart showing an example of the flow of parameter change processing according to the third embodiment.
  • A block diagram showing an example of the configuration of the CPU as a mode switching processing unit according to the fourth embodiment.
  • A flowchart showing an example of the flow of mode switching processing according to the fourth embodiment.
  • A block diagram showing an example of the configuration of the CPU as an integrated display processing unit according to the fifth embodiment.
  • A flowchart showing an example of the flow of integrated display processing according to the fifth embodiment.
  • A block diagram showing an example of the electrical configuration of an imaging device according to a first modified example.
  • A block diagram showing an example of the electrical configuration of an imaging device according to a second modified example.
  • CPU is an abbreviation for "Central Processing Unit".
  • GPU is an abbreviation for "Graphics Processing Unit".
  • NVM is an abbreviation for "Non-Volatile Memory".
  • RAM is an abbreviation for "Random Access Memory".
  • IC is an abbreviation for "Integrated Circuit".
  • ASIC is an abbreviation for "Application Specific Integrated Circuit".
  • PLD is an abbreviation for "Programmable Logic Device".
  • FPGA is an abbreviation for "Field-Programmable Gate Array".
  • SoC is an abbreviation for "System-on-a-Chip".
  • SSD is an abbreviation for "Solid State Drive".
  • HDD is an abbreviation for "Hard Disk Drive".
  • EEPROM is an abbreviation for "Electrically Erasable and Programmable Read Only Memory".
  • SRAM is an abbreviation for "Static Random Access Memory".
  • I/F is an abbreviation for "Interface".
  • USB is an abbreviation for "Universal Serial Bus".
  • CMOS is an abbreviation for "Complementary Metal Oxide Semiconductor".
  • CCD is an abbreviation for "Charge Coupled Device".
  • LAN is an abbreviation for "Local Area Network".
  • WAN is an abbreviation for "Wide Area Network".
  • BPF is an abbreviation for "Band Pass Filter".
  • Ir is an abbreviation for "Infrared Rays".
  • EL is an abbreviation for "Electro Luminescence".
  • In the description of this specification, "perpendicular" means not only perfect perpendicularity but also perpendicularity in a sense that includes an error generally allowed in the technical field to which the technology of the present disclosure belongs, to an extent that does not go against the spirit of the technology of the present disclosure.
  • Similarly, "horizontal" means not only being completely horizontal but also horizontal in a sense that includes such a generally allowed error.
  • "Parallel" means not only complete parallelism but also parallel in a sense that includes such a generally allowed error.
  • "Orthogonal" means not only perfect orthogonality but also orthogonal in a sense that includes such a generally allowed error.
  • "Match" means not only a perfect match but also a match in a sense that includes such a generally allowed error.
  • "Equal interval" means not only perfectly equal intervals but also equal intervals in a sense that includes such a generally allowed error.
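  • The tolerance-inclusive definitions above amount to comparisons with an allowed error margin rather than exact equality. A minimal sketch; the tolerance value is arbitrary and not taken from the disclosure:

```python
import math


def matches(a: float, b: float, rel_tol: float = 1e-3) -> bool:
    """'Match' in the tolerance-inclusive sense: equal within an allowed error."""
    return math.isclose(a, b, rel_tol=rel_tol)


# Exact equality would reject 1000.0 vs 1000.5; a tolerance-aware
# comparison at 0.1% relative tolerance accepts it.
print(matches(1000.0, 1000.5))
```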
  • camera 1 includes camera body 10 and lens unit 20 .
  • The camera 1 has a function of obtaining a visible light image by capturing visible light, a function of obtaining a near-infrared light image by capturing near-infrared light, and a function of measuring the temperature of a subject based on electromagnetic waves emitted from the subject by thermal radiation.
  • the camera 1 is an example of an “imaging device” according to the technology of the present disclosure.
  • a camera-side mount 12 for attaching the lens unit 20 is provided on the front surface 11 of the camera body 10 .
  • An illumination window 13 is provided on the front surface 11 of the camera body 10 for illuminating the subject with the illumination light IL.
  • the camera body 10 includes a projector 14 that generates illumination light IL.
  • the light projector 14 is, for example, an LED that emits near-infrared light with a peak wavelength of 1550 nm as illumination light IL.
  • the light projector 14 may be a halogen light. Illumination light IL generated by the light projector 14 is transmitted through the irradiation window 13 and emitted forward of the camera body 10 .
  • the light projector 14 is an example of a "light projector" according to the technology of the present disclosure.
  • the camera body 10 also includes an image sensor 15 .
  • the image sensor 15 captures an image of the light L incident from the subject through the lens unit 20 .
  • the image sensor 15 has a light receiving surface 15A.
  • the light L incident on the lens unit 20 is imaged on the light receiving surface 15A by the lens unit 20.
  • An image is obtained by forming an image of the light L on the light receiving surface 15A.
  • a plurality of photodiodes are arranged in a matrix on the light receiving surface 15A.
  • the plurality of photodiodes includes a plurality of silicon photodiodes sensitive to visible light and a plurality of indium-gallium-arsenide photodiodes sensitive to near-infrared light.
  • the silicon photodiode will be referred to as a Si diode
  • the indium-gallium-arsenide photodiode will be referred to as an InGaAs diode.
  • a plurality of Si diodes generate and output analog image data according to the received visible light.
  • a plurality of InGaAs diodes generate and output analog image data corresponding to the received near-infrared light.
  • visible light and near-infrared light incident on the image sensor 15 will be referred to as light unless it is necessary to distinguish between them.
  • A CMOS image sensor is exemplified as the image sensor 15, but the technology of the present disclosure is not limited to this; the technology of the present disclosure also holds, for example, when an image sensor of another type, such as a CCD image sensor, is used.
  • the image sensor 15 is an example of an “imaging device image sensor” according to the technology of the present disclosure.
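  • Because the light receiving surface 15A combines Si diodes (sensitive to visible light) and InGaAs diodes (sensitive to near-infrared light), a readout stage can in principle pick the channel relevant to the current mode. A hypothetical sketch; the channel names and the per-frame dict layout are illustrative, not from the disclosure:

```python
def select_channel(frame: dict, temperature_mode: bool) -> list:
    """Pick the photodiode channel matching the current mode.

    frame: hypothetical per-frame readout with an 'si' channel
    (visible light) and an 'ingaas' channel (near-infrared light).
    """
    return frame["ingaas"] if temperature_mode else frame["si"]


# Toy readout: visible-light values from Si diodes, near-infrared
# values from InGaAs diodes.
frame = {"si": [120, 130], "ingaas": [45, 47]}
nir = select_channel(frame, temperature_mode=True)
```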
  • the lens unit 20 includes a lens barrel 21 and a lens side mount 22.
  • the lens side mount 22 is provided at the rear end of the lens barrel 21 .
  • the lens side mount 22 is configured to be connectable to the camera side mount 12 of the camera body 10 .
  • the lens unit 20 is detachably attached to the camera body 10 by a lens side mount 22 . Note that the lens unit 20 may be fixed to the camera body 10 in a non-detachable manner.
  • the lens unit 20 includes an objective lens 30, a focus lens 31, a zoom lens 32, an aperture 33, a blur correction lens 34, a turret filter 35, and an adjustment lens 37.
  • An objective lens 30, a focus lens 31, a zoom lens 32, an aperture 33, a blur correction lens 34, a turret filter 35, and an adjustment lens 37 are arranged in order from the object side to the image side along the optical axis OA of the lens unit 20.
  • the objective lens 30 is fixed to the tip of the lens barrel 21 and is a lens that collects light.
  • the focus lens 31 is a lens for adjusting the focus position of the image.
  • the zoom lens 32 is a lens for adjusting zoom magnification.
  • the diaphragm 33 is an optical element for adjusting the amount of light.
  • the diaphragm 33 has an aperture 33A.
  • Light guided by zoom lens 32 passes through aperture 33A.
  • the diaphragm 33 is a movable diaphragm in which the diameter of the aperture 33A is variable.
  • the amount of light directed by zoom lens 32 is modified by aperture 33 .
  • the blur correction lens 34 is a lens for correcting image blur.
  • the turret filter 35 has a plurality of optical filters.
  • The turret filter 35 is an optical element that selectively transmits light in a plurality of wavelength bands (for example, visible light, and near-infrared light in different wavelength bands within the near-infrared wavelength band) by switching which of the plurality of optical filters is inserted in the optical path of the light in the lens unit 20.
  • the optical path of light within the lens unit 20 is positioned, for example, on the optical axis OA.
  • the optical path of light within the lens unit 20 is simply referred to as an optical path.
  • the configuration of the turret filter 35 will be detailed later with reference to FIG.
  • the adjustment lens 37 is a lens for adjusting the difference in focal length when the plurality of optical filters included in the turret filter 35 are switched.
  • the order of arrangement of the focus lens 31, zoom lens 32, diaphragm 33, blur correction lens 34, turret filter 35, and adjustment lens 37 may be other than the above.
  • Each of the objective lens 30, the focus lens 31, the zoom lens 32, the blur correction lens 34, and the adjusting lens 37 may be a single lens, or may be a lens group having a plurality of lenses.
  • the lens unit 20 may include other lenses.
  • the lens unit 20 may include an optical element such as a half mirror or a polarizing element.
  • the lens unit 20 includes a zoom drive mechanism 42, an aperture drive mechanism 43, a blur correction drive mechanism 44, a turret drive mechanism 45, and an adjustment drive mechanism 47.
  • the zoom drive mechanism 42, the aperture drive mechanism 43, the blur correction drive mechanism 44, the turret drive mechanism 45, and the adjustment drive mechanism 47 are electrically connected to an electrical contact 38 provided at the rear end of the lens barrel 21. .
  • the camera body 10 includes a control circuit 50.
  • the control circuit 50 is electrically connected to electrical contacts 58 provided on the camera-side mount 12 .
  • When the electrical contact 38 is connected to the electrical contact 58, the control circuit 50 is electrically connected to the zoom drive mechanism 42, the aperture drive mechanism 43, the blur correction drive mechanism 44, the turret drive mechanism 45, and the adjustment drive mechanism 47.
  • the zoom drive mechanism 42, the diaphragm drive mechanism 43, the blur correction drive mechanism 44, the turret drive mechanism 45, and the adjustment drive mechanism 47 are all drive mechanisms including actuators such as motors.
  • the control circuit 50 includes a computer 60, a zoom drive circuit 52, an aperture drive circuit 53, a blur correction drive circuit 54, a turret drive circuit 55, and an adjustment drive circuit 57.
  • the zoom drive circuit 52 , the aperture drive circuit 53 , the blur correction drive circuit 54 , the turret drive circuit 55 and the adjustment drive circuit 57 are connected to the computer 60 via the input/output I/F 59 .
  • the computer 60 includes a CPU 61, NVM 62, and RAM 63.
  • the CPU 61 , NVM 62 and RAM 63 are interconnected via a bus 64 , and the bus 64 is connected to the input/output I/F 59 .
  • the NVM 62 is a non-temporary storage medium and stores various parameters and various programs.
  • NVM 62 is an EEPROM.
  • the RAM 63 temporarily stores various information and is used as a work memory.
  • the CPU 61 reads necessary programs from the NVM 62 and executes the read programs in the RAM 63 .
  • the CPU 61 controls the entire camera 1 according to programs executed on the RAM 63 .
  • the CPU 61 is an example of a "processor” according to the technology of the present disclosure
  • the RAM 63 is an example of a “memory” according to the technology of the present disclosure
  • the computer 60 is an example of a “control device” according to the technology of the present disclosure.
  • the zoom drive circuit 52 adjusts the positions of the focus lens 31 and the zoom lens 32 by driving the zoom drive mechanism 42 according to instructions from the computer 60 .
  • the focus lens 31 and the zoom lens 32 move along the optical axis OA of the lens unit 20 by applying power from the zoom drive mechanism 42 .
  • the aperture drive circuit 53 changes the diameter of the aperture 33A (see FIG. 2) provided in the aperture 33 by driving the aperture drive mechanism 43 according to instructions from the computer 60.
  • the blur correction drive circuit 54 adjusts the position of the blur correction lens 34 by driving the blur correction drive mechanism 44 according to instructions from the computer 60 and a feedback signal output from a feedback circuit 75, which will be described later.
  • the blur correction lens 34 moves along a plane perpendicular to the optical axis OA of the lens unit 20 by applying power from the blur correction driving mechanism 44 . Specifically, the blur correction lens 34 moves in a direction in which blurring of an image obtained by forming an image of light on the image sensor 15 is corrected.
  • the turret drive circuit 55 adjusts the position of the turret filter 35 in the rotational direction by driving the turret drive mechanism 45 according to instructions from the computer 60 .
  • the turret filter 35 rotates along a plane perpendicular to the optical axis OA of the lens unit 20 by applying power from the turret driving mechanism 45 .
  • the rotation operation of the turret filter 35 will be detailed later with reference to FIG.
  • the adjustment drive circuit 57 adjusts the position of the adjustment lens 37 by driving the adjustment drive mechanism 47 according to instructions from the computer 60 .
  • the adjustment lens 37 is moved along the optical axis OA of the lens unit 20 by applying power from the adjustment drive mechanism 47 .
  • the camera body 10 includes an image sensor driver 71, a signal processing circuit 72, a light projection control circuit 73, a vibration sensor 74, a feedback circuit 75, a display 76, a display control circuit 77, an input device 78, An input circuit 79 and an external I/F 80 are provided.
  • Image sensor driver 71, signal processing circuit 72, light projection control circuit 73, feedback circuit 75, display control circuit 77, input circuit 79, and external I/F 80 are connected to computer 60 via input/output I/F 59.
  • the image sensor driver 71 causes the image sensor 15 to capture light according to instructions from the computer 60 .
  • the signal processing circuit 72 performs various signal processing on the analog image data output from the image sensor 15 to generate and output digital image data.
  • the light projection control circuit 73 switches the light projector 14 on and off according to instructions from the computer 60 .
  • the light projector 14 outputs illumination light when switched on, and stops outputting illumination light when switched off.
  • the vibration sensor 74 is, for example, a gyro sensor, and detects vibration of the camera 1.
  • a gyro sensor included in the vibration sensor 74 detects vibrations of the camera 1 around the pitch axis and the yaw axis.
  • The vibration sensor 74 converts the vibration about the pitch axis and the vibration about the yaw axis detected by the gyro sensor into vibrations in a two-dimensional plane parallel to the pitch axis and the yaw axis, thereby detecting the vibration acting in the pitch-axis direction and the vibration acting in the yaw-axis direction.
  • the vibration sensor 74 outputs a vibration detection signal corresponding to the detected vibration.
  • the vibration sensor 74 may be an acceleration sensor. Also, instead of the vibration sensor 74, for example, a motion vector obtained by comparing successive captured images stored in the NVM 62 and/or the RAM 63 may be used as vibration. Also, the final used vibration may be derived based on the vibration detected by the physical sensor and the motion vector obtained by image processing.
  • the feedback circuit 75 generates a feedback signal by performing various signal processing on the vibration detection signal output from the vibration sensor 74 .
  • the feedback circuit 75 is connected to the blur correction drive circuit 54 via the input/output I/F 59 and outputs a feedback signal to the blur correction drive circuit 54 according to instructions from the computer 60 .
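  • The chain from vibration detection to blur correction (gyro signal, then feedback circuit 75, then blur correction drive circuit 54) can be sketched as a simple proportional opposition of the detected motion. The gain and signal shapes below are placeholder assumptions, not values or a method from the disclosure:

```python
def feedback_signal(pitch_rate: float, yaw_rate: float, gain: float = 0.8):
    """Hypothetical feedback: command lens motion opposing detected vibration.

    pitch_rate / yaw_rate: detected angular rates from the gyro sensor.
    Returns a (dx, dy) drive command for the blur correction lens,
    moving it against the detected vibration.
    """
    return (-gain * pitch_rate, -gain * yaw_rate)


dx, dy = feedback_signal(pitch_rate=0.5, yaw_rate=-0.25)
```

A real feedback circuit would also filter and integrate the rate signal; a bare proportional term is used here only to show the sign relationship between detected vibration and the correction command.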
  • the display 76 is, for example, a liquid crystal display or an EL display, and displays images and/or character information.
  • the display control circuit 77 causes the display 76 to display an image according to instructions from the computer 60 .
  • the input device 78 is, for example, a device such as a touch panel and/or a switch, and receives instructions given by the user.
  • the input circuit 79 outputs an input signal according to an instruction given to the input device 78 by the user.
  • the external I/F 80 is an interface communicably connected to an external device.
  • the turret filter 35 has a disc 81 .
  • the disc 81 is provided with an Ir cut filter 82, a first BPF 83A, a second BPF 83B, a third BPF 83C, and a fourth BPF 83D as a plurality of optical filters at equal intervals along the circumferential direction of the disc 81.
  • the Ir cut filter 82, the first BPF 83A, the second BPF 83B, the third BPF 83C, and the fourth BPF 83D are referred to as optical filters unless they need to be distinguished and described.
  • the first BPF 83A, the second BPF 83B, the third BPF 83C, and the fourth BPF 83D will be referred to as BPFs 83 unless they need to be distinguished and described.
  • the turret filter 35 selectively inserts and removes the plurality of optical filters into and from the optical path in a turret manner. Specifically, by rotating the turret filter 35 in the direction of the arc arrow R shown in the drawing, one of the optical filters is selectively inserted into the optical path. When an optical filter is inserted into the optical path, the optical axis OA passes through the center of that optical filter, and the center of the optical filter inserted into the optical path coincides with the center of the light-receiving surface of the image sensor 15.
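Since the five optical filters are arranged at equal intervals along the circumferential direction of the disc 81, the rotation target for inserting a particular filter into the optical path can be sketched as follows; the filter ordering around the disc and the zero-degree origin are illustrative assumptions:

```python
# Filter order around the disc (illustrative; see the turret description above).
FILTERS = ["Ir-cut 82", "BPF 83A", "BPF 83B", "BPF 83C", "BPF 83D"]
STEP_DEG = 360.0 / len(FILTERS)  # equal intervals -> 72 degrees apart

def target_angle(filter_name):
    """Disc rotation angle (degrees) that places the named filter on the
    optical axis, assuming filter 0 sits at 0 degrees."""
    return FILTERS.index(filter_name) * STEP_DEG
```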
  • the Ir cut filter 82 is an optical filter that cuts infrared rays and transmits only light other than infrared rays.
  • the BPF 83 is an optical filter that transmits near-infrared light.
  • the first BPF 83A, the second BPF 83B, the third BPF 83C, and the fourth BPF 83D transmit near-infrared light in different wavelength bands.
  • the first BPF 83A is an optical filter that corresponds to a wavelength band near 1000 nm (nanometers). As an example, the first BPF 83A transmits only near-infrared light in the wavelength band from 950 nm to 1100 nm. The near-infrared light transmitted through the first BPF 83A is hereinafter referred to as first near-infrared light.
  • the second BPF 83B is an optical filter corresponding to a wavelength band near 1250 nm.
  • the second BPF 83B transmits only near-infrared light in the wavelength band from 1150 nm to 1350 nm.
  • the near-infrared light transmitted through the second BPF 83B is hereinafter referred to as second near-infrared light.
  • the third BPF 83C is an optical filter that corresponds to a wavelength band near 1550 nm.
  • the third BPF 83C transmits only near-infrared light in the wavelength band from 1500 nm to 1750 nm.
  • the near-infrared light transmitted through the third BPF 83C is hereinafter referred to as third near-infrared light.
  • the fourth BPF 83D is an optical filter corresponding to a wavelength band near 2150 nm.
  • the fourth BPF 83D transmits only near-infrared light in the wavelength band from 2000 nm to 2400 nm.
  • the near-infrared light transmitted through the fourth BPF 83D is hereinafter referred to as fourth near-infrared light.
  • the first near-infrared light, the second near-infrared light, the third near-infrared light, and the fourth near-infrared light are referred to as near-infrared light unless it is necessary to distinguish them.
  • each wavelength band mentioned here includes errors that are generally tolerated in the technical field to which the technology of the present disclosure belongs and that do not depart from the gist of the technology of the present disclosure.
  • each wavelength band mentioned here is merely an example, and different wavelength bands may be used.
  • the camera 1 has a function of obtaining a visible light image by capturing visible light and a function of obtaining a near-infrared light image by capturing near-infrared light.
  • the camera 1 also has a function of measuring the temperature of a subject based on electromagnetic waves emitted by thermal radiation from the subject.
  • two-color thermometry is used to improve measurement accuracy.
  • the turret filter 35 is used as one means for generating light in two different wavelength bands, and as an example, near-infrared light in two wavelength bands is used for temperature measurement.
  • the imaging support processing is realized by executing the imaging support processing program 100 by the CPU 61 .
  • the imaging support processing program 100 is an example of a “program” according to the technology of the present disclosure.
  • the imaging support processing program 100 is stored in the NVM 62 , and the CPU 61 reads the imaging support processing program 100 from the NVM 62 and executes it on the RAM 63 .
  • the CPU 61 performs imaging support processing according to the imaging support processing program 100 executed on the RAM 63 .
  • the CPU 61 functions as a mode switching processing section 110 , an imaging processing section 120 and a temperature measurement processing section 130 by executing the imaging support processing program 100 on the RAM 63 .
  • the CPU 61 has an imaging mode and a temperature measurement mode as operation modes, and switches between the imaging mode and the temperature measurement mode.
  • the imaging processing unit 120 is a processing unit that operates when the operation mode of the CPU 61 is switched to the imaging mode.
  • the imaging processing unit 120 is a processing unit that executes imaging processing for displaying, on the display 76, a visible light image obtained by imaging visible light with the image sensor 15 or a near-infrared light image obtained by imaging near-infrared light with the image sensor 15.
  • the temperature measurement processing unit 130 is a processing unit that operates when the operation mode of the CPU 61 is switched to the temperature measurement mode.
  • the temperature measurement processing unit 130 is a processing unit that executes temperature measurement processing in which the temperature distribution of the subject is calculated based on a near-infrared light image obtained by imaging near-infrared light with the image sensor 15, and temperature information generated based on the temperature distribution of the subject is displayed on the display 76.
  • the mode switching processing unit 110 is a processing unit that executes mode switching processing for switching the operation mode of the CPU 61 between the imaging mode and the temperature measurement mode.
  • the mode switching processing unit 110, the imaging processing unit 120, and the temperature measurement processing unit 130 will be described in order below.
  • the mode switching processing unit 110 has a mode selection information acquisition unit 111 , a mode determination unit 112 , a flag setting unit 113 , a light projection control unit 114 and a mode setting unit 115 .
  • the mode selection information acquisition unit 111 selectively acquires, for example via a reception device, imaging mode selection information for selecting the imaging mode as the operation mode or temperature measurement mode selection information for selecting the temperature measurement mode as the operation mode.
  • the receiving device receives various information and outputs the received various information to the CPU 61 . Examples of receiving devices include the input circuit 79 and the external I/F 80 .
  • the imaging mode selection information and the temperature measurement mode selection information are hereinafter referred to as mode selection information when there is no need to distinguish between them.
  • the input circuit 79 outputs information to the CPU 61 according to the instruction input to the input device 78 .
  • the input circuit 79 outputs mode selection information to the CPU 61 according to a mode selection instruction given to the input device 78 by the user.
  • the mode selection information acquisition unit 111 acquires mode selection information input from the input circuit 79 .
  • the external I/F 80 receives mode selection information output from an external device (not shown) and outputs the received mode selection information to the CPU 61 .
  • Mode selection information acquisition unit 111 acquires mode selection information input from external I/F 80 .
  • the mode determination unit 112 determines whether the operation mode selected by the mode selection information acquired by the mode selection information acquisition unit 111 is the imaging mode or the temperature measurement mode. When the mode selection information acquired by the mode selection information acquisition unit 111 is the imaging mode selection information, the mode determination unit 112 determines that the operation mode selected by the mode selection information is the imaging mode. Further, when the mode selection information acquired by the mode selection information acquisition unit 111 is the temperature measurement mode selection information, the mode determination unit 112 determines that the operation mode selected by the mode selection information is the temperature measurement mode.
  • a display control flag storage area 141 and a light projection control flag storage area 142 are provided in the RAM 63 .
  • the display control flag storage area 141 stores a display control flag 151 that designates an image to be displayed on the display 76 .
  • the light projection control flag storage area 142 stores a light projection control flag 152 that designates the operation of the light projector 14 .
  • when the mode determination unit 112 determines that the operation mode is the imaging mode, the flag setting unit 113 sets the captured image display flag 151A as the display control flag 151 in the display control flag storage area 141, and sets the light projection ON control flag 152A as the light projection control flag 152 in the light projection control flag storage area 142. Further, when the mode determination unit 112 determines that the operation mode is the temperature measurement mode, the flag setting unit 113 sets the composite image display flag 151B as the display control flag 151 in the display control flag storage area 141, and sets the light projection OFF control flag 152B as the light projection control flag 152 in the light projection control flag storage area 142.
  • the display control flag 151 and the light projection control flag 152 will be referred to as control flags unless they need to be distinguished and described.
  • the light projection control unit 114 outputs an ON command to the light projection control circuit 73 when the light projection ON control flag 152A is set by the flag setting unit 113 .
  • the ON command is a command to switch the projector 14 ON.
  • when the flag setting unit 113 sets the light projection OFF control flag 152B, the light projection control unit 114 outputs an OFF command to the light projection control circuit 73.
  • the OFF command is a command to switch the projector 14 off. Note that “on” refers to a setting in which the light projector 14 projects light, and “off” refers to a setting in which the light projector 14 does not perform light projection.
  • the mode setting unit 115 sets the imaging mode as the operation mode of the CPU 61 when the captured image display flag 151A is set as the display control flag 151 . Further, when the composite image display flag 151B is set as the display control flag 151, the mode setting unit 115 sets the temperature measurement mode as the operation mode of the CPU 61.
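The flag-driven switching described above (mode selection sets the control flags, which in turn determine the projector state and the operation mode) can be sketched as follows; the class, attribute names, and string values are illustrative stand-ins for the display control flag 151 (151A/151B) and the light projection control flag 152 (152A/152B), not the actual implementation:

```python
IMAGING_MODE, TEMPERATURE_MODE = "imaging", "temperature measurement"

class ModeSwitcher:
    """Sketch of flag-driven mode switching: selecting a mode stores a
    display flag and a light projection flag, which then determine the
    projector state and the operation mode."""

    def __init__(self):
        self.display_flag = None     # models display control flag 151
        self.projection_flag = None  # models light projection control flag 152
        self.projector_on = False
        self.operation_mode = None

    def select(self, mode):
        # Flag setting (corresponds to the flag setting unit 113).
        if mode == IMAGING_MODE:
            self.display_flag = "captured image"    # models flag 151A
            self.projection_flag = "projection on"  # models flag 152A
        else:
            self.display_flag = "composite image"    # models flag 151B
            self.projection_flag = "projection off"  # models flag 152B
        # Projector control (corresponds to the light projection control unit 114).
        self.projector_on = self.projection_flag == "projection on"
        # Mode setting (corresponds to the mode setting unit 115).
        self.operation_mode = (IMAGING_MODE
                               if self.display_flag == "captured image"
                               else TEMPERATURE_MODE)
```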
  • the imaging mode is an example of the "first mode” according to the technology of the present disclosure
  • the temperature measurement mode is an example of the “second mode” according to the technology of the present disclosure
  • the display control flag 151 and the light projection control flag 152 are examples of the “control factor” according to the technique of the present disclosure.
  • the display 76 and the light projector 14 are examples of the "controlled object" according to the technology of the present disclosure, and the image displayed on the display 76 and the operation of the light projector 14 are examples of the "control content" according to the technology of the present disclosure.
  • the display control flag 151 is an example of the "display control factor” according to the technology of the present disclosure
  • the captured image display flag 151A is an example of the “captured image display factor” according to the technology of the present disclosure
  • the composite image display flag 151B is an example of the "temperature information display factor" according to the technology of the present disclosure.
  • the light projection ON control flag 152A and the light projection OFF control flag 152B are examples of the “light projection control factor” according to the technology of the present disclosure
  • the light projection ON control flag 152A is an example of the “projection control factor” according to the technology of the present disclosure.
  • the light emission OFF control flag 152B is an example of the "light emission suppression control factor” according to the technology of the present disclosure.
  • the imaging processing unit 120 has a wavelength selection unit 121, a turret control unit 122, an imaging control unit 123, and a display control unit 124.
  • the wavelength selection unit 121 selects one wavelength band used for imaging from a plurality of wavelength bands according to the wavelength selection instruction accepted by the input device 78 .
  • the wavelength selection unit 121 selects one wavelength band from among the visible light wavelength band, the first near-infrared wavelength band from 950 nm to 1100 nm, the second near-infrared wavelength band from 1150 nm to 1350 nm, the third near-infrared wavelength band from 1500 nm to 1750 nm, and the fourth near-infrared wavelength band from 2000 nm to 2400 nm.
  • here, the wavelength band is selected according to the wavelength selection instruction received by the input device 78, but a wavelength band may instead be selected according to various conditions (for example, subject temperature and/or imaging conditions).
  • the turret control unit 122 outputs to the turret drive circuit 55 a rotation command for inserting into the optical path an optical filter corresponding to the wavelength band selected by the wavelength selection unit 121 among the plurality of optical filters.
  • the turret drive circuit 55 drives the turret drive mechanism 45 to rotate the turret filter 35 to a position where the optical filter corresponding to the rotation command is inserted into the optical path.
  • the imaging control unit 123 outputs an imaging command to the image sensor driver 71 .
  • the imaging command is a command to cause the image sensor 15 to capture light.
  • the image sensor 15 captures light emitted from a subject and outputs analog image data obtained by capturing the light.
  • the signal processing circuit 72 performs various signal processing on the analog image data output from the image sensor 15 to generate and output digital image data.
  • the display control unit 124 causes the display 76 to display the captured image 161 A based on the digital image data generated by the signal processing circuit 72 .
  • the captured image 161A is displayed as a moving image, but may be displayed as a still image.
  • FIG. 8 shows an example of a captured image 161A displayed on the display 76 as a result of the imaging processing being executed by the imaging processing unit 120.
  • the display 76 displays a captured image 161A in which a fire 164 is burning a curtain 163 provided on the indoor side of a window 162 of a building.
  • the temperature measurement processing unit 130 includes a wavelength selection unit 131, a first turret control unit 132, a first imaging control unit 133, a second turret control unit 134, a second imaging control unit 135, a temperature derivation unit 136, and a display control unit 137.
  • the wavelength selection unit 131 selects two wavelength bands used for two-color thermometry, that is, a first wavelength band and a second wavelength band.
  • as an example, the wavelength selection unit 131 selects the two wavelength bands from among the first near-infrared wavelength band from 950 nm to 1100 nm, the second near-infrared wavelength band from 1150 nm to 1350 nm, the third near-infrared wavelength band from 1500 nm to 1750 nm, and the fourth near-infrared wavelength band from 2000 nm to 2400 nm.
  • as an example, the wavelength selection unit 131 selects, as the first wavelength band and the second wavelength band, two adjacent wavelength bands from among the wavelength bands of the first near-infrared light, the second near-infrared light, the third near-infrared light, and the fourth near-infrared light.
  • as an example, the wavelength selection unit 131 selects, as the first wavelength band and the second wavelength band, two wavelength bands from among the wavelength bands of the first to fourth near-infrared light based on the temperature of the subject. Further, as an example, the wavelength selection unit 131 selects shorter wavelength bands from among the wavelength bands of the first to fourth near-infrared light as the temperature of the subject increases.
  • the temperature of the subject is predicted based on information about the temperature expected from the fire situation and/or information input to the input device 78 by the user.
  • Information about the temperature expected from the fire situation is obtained through the external I/F 80 shown in FIG.
  • the information on the temperature expected from the fire situation may be information on the standard fire temperature with respect to the elapsed time from the occurrence of the fire. The standard fire temperature is defined in ISO 834.
  • the wavelength selection unit 131 may switch the wavelength band according to the wavelength selection instruction accepted by the input device 78 shown in FIG.
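Selecting shorter wavelength bands as the subject temperature rises is consistent with Wien's displacement law, under which the peak of thermal radiation shifts toward shorter wavelengths at higher temperatures. A minimal sketch of such a selection rule over adjacent band pairs follows; the threshold temperatures are illustrative assumptions, not values from the disclosure:

```python
# Adjacent near-infrared band pairs (nm), ordered from shortest to longest.
BAND_PAIRS = [
    ((950, 1100), (1150, 1350)),
    ((1150, 1350), (1500, 1750)),
    ((1500, 1750), (2000, 2400)),
]

def select_band_pair(subject_temp_c, thresholds=(600.0, 900.0)):
    """Pick the adjacent band pair for two-color thermometry, moving to
    shorter wavelengths as the predicted subject temperature rises.
    The threshold temperatures are illustrative assumptions."""
    if subject_temp_c >= thresholds[1]:
        return BAND_PAIRS[0]  # hottest subjects: shortest pair
    if subject_temp_c >= thresholds[0]:
        return BAND_PAIRS[1]
    return BAND_PAIRS[2]      # coolest subjects: longest pair
```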
  • the first turret control unit 132 outputs to the turret drive circuit 55 a first rotation command for inserting into the optical path the BPF 83 corresponding to the first wavelength band selected by the wavelength selection unit 131 from among the plurality of BPFs 83 (see FIG. 4).
  • the turret drive circuit 55 drives the turret drive mechanism 45 to rotate the turret filter 35 to a position where the BPF corresponding to the first rotation command is inserted into the optical path.
  • the first imaging control section 133 outputs a first imaging command to the image sensor driver 71 .
  • the first imaging command is a command to cause the image sensor 15 to capture light.
  • the image sensor 15 captures the first near-infrared light transmitted through the BPF 83 corresponding to the first wavelength band, and outputs first analog image data obtained by capturing the first near-infrared light.
  • the signal processing circuit 72 performs various signal processing on the first analog image data output from the image sensor 15 to generate and output first digital image data.
  • the second turret control unit 134 outputs to the turret drive circuit 55 a second rotation command for inserting into the optical path the BPF 83 corresponding to the second wavelength band selected by the wavelength selection unit 131 from among the plurality of BPFs 83 .
  • the turret drive circuit 55 drives the turret drive mechanism 45 to rotate the turret filter 35 to a position where the BPF corresponding to the second rotation command is inserted into the optical path.
  • the second imaging control section 135 outputs a second imaging command to the image sensor driver 71 .
  • the second imaging command is a command to cause the image sensor 15 to capture light.
  • the image sensor 15 captures the second near-infrared light transmitted through the BPF corresponding to the second wavelength band, and outputs second analog image data obtained by capturing the second near-infrared light.
  • the signal processing circuit 72 performs various signal processing on the second analog image data output from the image sensor 15 to generate and output second digital image data.
  • the temperature derivation unit 136 calculates the temperature distribution of the subject by two-color thermometry based on the first digital image data and the second digital image data. Specifically, for each of the plurality of physical pixels of the image sensor 15, the temperature derivation unit 136 extracts the value of the first signal output by the physical pixel from the first digital image data, and extracts the value of the second signal output by the physical pixel from the second digital image data. Then, for each of the plurality of physical pixels, the temperature derivation unit 136 derives the temperature measured by that physical pixel from the value of the first signal and the value of the second signal in accordance with two-color thermometry.
  • a calculation formula based on a two-color thermometry method may be used to derive the temperature, or a data matching table may be used.
  • the temperature derivation unit 136 then derives the temperature measured by each of the plurality of physical pixels to calculate the temperature distribution of the object.
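A minimal sketch of the per-pixel temperature derivation by two-color (ratio) thermometry, assuming a gray-body subject (equal emissivity in both bands) and the Wien approximation to Planck's law; the band-centre wavelengths and the round-trip example are illustrative, and the actual device may instead use a calculation formula or a data matching table as noted above:

```python
import numpy as np

C2 = 1.4388e-2  # second radiation constant (m*K)

def two_color_temperature(s1, s2, lam1, lam2):
    """Temperature (K) from the ratio of two band signals under the Wien
    approximation, assuming equal emissivity in both bands; works
    element-wise on per-pixel signal arrays."""
    log_ratio = np.log(np.asarray(s1, dtype=float) / np.asarray(s2, dtype=float))
    return C2 * (1.0 / lam2 - 1.0 / lam1) / (log_ratio - 5.0 * np.log(lam2 / lam1))

# Round trip with signals simulated from Wien's law (illustrative band centres).
lam1, lam2 = 1.000e-6, 1.250e-6     # band-centre wavelengths (m)
t_true = 1500.0                     # K
signal = lambda lam: lam ** -5 * np.exp(-C2 / (lam * t_true))
t_est = two_color_temperature(signal(lam1), signal(lam2), lam1, lam2)
```

Because the ratio cancels a common emissivity factor, the result is insensitive to a uniform gray-body emissivity, which is one reason two-color thermometry improves measurement accuracy over single-band radiometry.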
  • the display control unit 137 generates temperature information based on the temperature distribution of the subject obtained by the temperature derivation unit 136. Then, the display control unit 137 generates a synthesized image 161B by synthesizing the temperature information with the captured image obtained based on the first digital image data or the second digital image data, and causes the display 76 to display the synthesized image 161B.
  • examples of the temperature information include information indicating a region where the temperature is equal to or higher than a predetermined threshold, information indicating a specific numerical value of the temperature, information indicating a plurality of sections divided for each predetermined temperature range together with specific temperature values, and information indicating a temperature distribution with a color tone corresponding to the temperature.
  • FIG. 11 shows a first example of a composite image 161B displayed on the display 76 as a result of the temperature measurement processing being executed by the temperature measurement processing section 130.
  • a synthesized image 161B according to the first example is an image obtained by synthesizing a captured image 165 with temperature information 166 indicating an area whose temperature is equal to or higher than a predetermined threshold.
  • the temperature information 166 is, for example, information indicating a rectangular frame, but may be information other than the frame.
  • the frame may have a shape other than a rectangular shape.
  • the frame may be displayed in a color according to the temperature.
  • a scale display indicating the temperature corresponding to the color may be displayed together with the frame.
  • the temperature information 166 may include a character string indicating a specific temperature value. Specific numerical values of temperature may be maximum and/or minimum values of temperature. Also, the temperature information 166 may be displayed still or blinking.
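Deriving the frame-style temperature information 166 from a temperature distribution can be sketched as the bounding box of the region at or above a threshold; representing the distribution as a 2-D array indexed like the physical pixels is an illustrative assumption:

```python
import numpy as np

def threshold_region_box(temp_map, threshold):
    """Bounding box (top, left, bottom, right) of the region whose
    temperature is at or above the threshold, or None if no pixel
    qualifies; a sketch of frame-style temperature information."""
    mask = np.asarray(temp_map) >= threshold
    if not mask.any():
        return None
    rows = np.flatnonzero(mask.any(axis=1))  # rows containing hot pixels
    cols = np.flatnonzero(mask.any(axis=0))  # columns containing hot pixels
    return int(rows[0]), int(cols[0]), int(rows[-1]), int(cols[-1])
```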
  • FIG. 12 shows a second example of the composite image 161B displayed on the display 76 as a result of the temperature measurement processing being executed by the temperature measurement processing section 130.
  • a synthesized image 161B according to the second example is an image obtained by synthesizing a plurality of pieces of temperature information 167A, 167B, and 167C indicating specific numerical values of temperatures with a captured image 165 .
  • Specific numerical values of temperature may be maximum and/or minimum values of temperature.
  • the plurality of pieces of temperature information 167A, 167B, and 167C may be displayed stationary or may be displayed blinking.
  • FIG. 13 shows a third example of the composite image 161B displayed on the display 76 as a result of the temperature measurement processing being executed by the temperature measurement processing section 130.
  • a synthesized image 161B according to the third example is an image obtained by synthesizing, into a captured image 165, temperature information 168 indicating a plurality of sections divided for each predetermined temperature range together with specific temperature values.
  • Specific numerical values of temperature may be maximum and/or minimum values of temperature.
  • the temperature information 168 may be displayed still or blinking.
  • FIG. 14 shows a fourth example of the composite image 161B displayed on the display 76 as a result of the temperature measurement processing being executed by the temperature measurement processing section 130.
  • a synthesized image 161B according to the fourth example is an image obtained by synthesizing, into the captured image 165, temperature information 169 indicating the temperature distribution with a color tone corresponding to the temperature.
  • the temperature distribution of the fourth example shown in FIG. 14 is obtained by displaying the temperature of each section of the third example shown in FIG. 13 in color tones corresponding to the temperature.
  • the color tone is defined at a constant hue angle per unit temperature (e.g., 0.01° per 1°C) around a reference temperature (e.g., 3000°C).
  • Temperature information 169 may include an indicator of the temperature corresponding to the color tone. Also, the temperature information 169 may include a character string indicating a specific numerical value of the temperature. Specific numerical values of temperature may be maximum and/or minimum values of temperature. Also, the temperature information 169 may be displayed still or blinking.
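The constant-hue-angle tone mapping can be sketched as follows; the hue step and reference temperature follow the parenthetical examples above, while the use of full saturation and value and the hue wrap-around handling are assumptions:

```python
import colorsys

HUE_STEP_DEG = 0.01   # hue angle per 1 deg C (example value from the text)
REFERENCE_C = 3000.0  # reference temperature in deg C (example value)

def temperature_to_rgb(temp_c):
    """Map a temperature to an RGB tone by rotating the hue at a constant
    angular rate around the reference temperature."""
    hue_deg = (temp_c - REFERENCE_C) * HUE_STEP_DEG
    hue = (hue_deg / 360.0) % 1.0  # colorsys expects hue in [0, 1)
    return colorsys.hsv_to_rgb(hue, 1.0, 1.0)
```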
  • the temperature information 166, 167A, 167B, 167C, 168, and 169 are examples of "temperature information" according to the technology of the present disclosure.
  • the captured image 161A obtained in the imaging mode is an example of the “captured image” and the “first captured image” according to the technology of the present disclosure.
  • a composite image 161B obtained in the temperature measurement mode is an example of a “composite image” according to the technology of the present disclosure.
  • the captured image 161A obtained in the imaging mode and the composite image 161B obtained in the temperature measurement mode will be referred to as images unless otherwise distinguished.
  • the mode selection information acquisition unit 111 acquires mode selection information.
  • in step S12, the mode determination unit 112 determines, based on the mode selection information acquired by the mode selection information acquisition unit 111, whether the operation mode selected by the mode selection information is the imaging mode or the temperature measurement mode. If it is determined in step S12 that the mode is the imaging mode, the process shown in FIG. 15 proceeds to step S13, and if it is determined that the mode is the temperature measurement mode, the process shown in FIG. 15 proceeds to step S16.
  • in step S13, the flag setting unit 113 sets the captured image display flag 151A as the display control flag 151 in the display control flag storage area 141, and sets the light projection ON control flag 152A as the light projection control flag 152 in the light projection control flag storage area 142.
  • in step S14, the light projection control unit 114 switches the light projector 14 on.
  • the mode setting unit 115 sets the imaging mode as the operation mode of the CPU 61.
  • in step S16, the flag setting unit 113 sets the composite image display flag 151B as the display control flag 151 in the display control flag storage area 141, and sets the light projection OFF control flag 152B as the light projection control flag 152 in the light projection control flag storage area 142.
  • in step S17, the light projection control unit 114 switches the light projector 14 off.
  • the mode setting unit 115 sets the temperature measurement mode as the operation mode of the CPU 61.
  • in step S21, the wavelength selection unit 121 selects one wavelength band used for imaging from the plurality of wavelength bands according to the wavelength selection instruction received by the input device 78.
  • in step S22, the turret control unit 122 rotates the turret filter 35 to a position where the optical filter corresponding to the wavelength band selected by the wavelength selection unit 121 among the plurality of optical filters is inserted into the optical path.
  • the imaging control unit 123 causes the image sensor 15 to capture light.
  • the image sensor 15 outputs analog image data obtained by capturing light, and the signal processing circuit 72 generates and outputs digital image data by performing various signal processing on the analog image data.
  • in step S24, the display control unit 124 causes the display 76 to display the captured image 161A based on the digital image data generated by the signal processing circuit 72.
  • in step S31, the wavelength selection unit 131 selects the two wavelength bands used for two-color thermometry, that is, the first wavelength band and the second wavelength band.
  • in step S32, the first turret control unit 132 rotates the turret filter 35 to a position where the BPF 83 corresponding to the first wavelength band selected by the wavelength selection unit 131 among the plurality of BPFs 83 (see FIG. 4) is inserted into the optical path.
  • in step S33, the first imaging control unit 133 causes the image sensor 15 to image the first near-infrared light transmitted through the BPF 83 corresponding to the first wavelength band.
  • the image sensor 15 outputs first analog image data obtained by imaging the first near-infrared light, and the signal processing circuit 72 performs various signal processing on the first analog image data to generate and output first digital image data.
  • step S34 the second turret control unit 134 rotates the turret filter 35 to a position where the BPF 83 corresponding to the second wavelength band selected by the wavelength selection unit 131 among the plurality of BPFs 83 is inserted into the optical path.
  • step S35 the second imaging control unit 135 causes the image sensor 15 to image the second near-infrared light transmitted through the BPF 83 corresponding to the second wavelength band.
  • the image sensor 15 outputs second analog image data obtained by imaging the second near-infrared light, and the signal processing circuit 72 performs various signal processing on the second analog image data to obtain the second analog image data. 2. Generate and output digital image data.
  • step S36 the temperature derivation unit 136 calculates the temperature distribution of the subject by two-color thermometry based on the first digital image data and the second digital image data.
• In step S37, the display control unit 137 generates temperature information related to temperature based on the temperature distribution of the subject obtained by the temperature derivation unit 136. Then, the display control unit 137 outputs a synthesized image 161B obtained by synthesizing the temperature information with the captured image obtained based on the first digital image data or the second digital image data, and causes the display 76 to display the synthesized image 161B.
• In each of the imaging mode and the temperature measurement mode, the CPU 61 performs control to adjust the focus position by moving the focus lens 31 along the optical axis OA, and performs control on the zoom drive mechanism 42 to adjust the zoom magnification by moving the zoom lens 32. Further, in each of the imaging mode and the temperature measurement mode, the CPU 61 controls the blur correction drive mechanism 44 to correct image blur by moving the blur correction lens 34. In addition, in each of the two modes, the CPU 61 controls the diaphragm drive mechanism 43 to adjust the amount of light passing through the diaphragm 33 by changing the diameter of the aperture 33A provided in the diaphragm 33. The CPU 61 also controls the adjustment drive mechanism 47 to adjust the focus position by moving the adjustment lens 37 in each of the two modes.
• The control method of the camera 1 according to the first embodiment is an example of the "control method" according to the technology of the present disclosure.
• The CPU 61 has an imaging mode for causing the image sensor 15 to image light and a temperature measurement mode for deriving temperature based on the near-infrared light received by the image sensor 15.
• The control flag differs between the imaging mode and the temperature measurement mode. Therefore, the control contents of the controlled object can be changed between the imaging mode and the temperature measurement mode.
• The control flag includes the display control flag 151 relating to display on the display 76. Therefore, the image displayed on the display 76 can be made different as one example of differentiating the control details of the controlled object between the imaging mode and the temperature measurement mode.
• The control flag includes the light projection control flag 152 that operates the light projector 14. Therefore, the operation of the light projector 14 can be made different as another example of differentiating the control details of the controlled object between the imaging mode and the temperature measurement mode.
• The control contents of the controlled object can thus be made different between the imaging mode and the temperature measurement mode, so the operation of the imaging device can be controlled to suit each mode.
• The CPU 61 sets, as the display control flag 151, a captured image display flag 151A that causes the display 76 to display a captured image 161A obtained by light being received by the image sensor 15.
• As a result, the captured image 161A, which does not contain temperature information, is displayed on the display 76.
• Therefore, in the imaging mode, the visibility of the captured image 161A can be improved compared to a case where, for example, the captured image 161A includes temperature information.
• The CPU 61 sets, as the display control flag 151, a composite image display flag 151B that causes the display 76 to display a composite image 161B including temperature information indicating temperature.
• As a result, the synthesized image 161B including the temperature information is displayed on the display 76.
• Therefore, in the temperature measurement mode, the user can accurately grasp the temperature of the subject compared to a case where, for example, the temperature information is not displayed on the display 76.
• In the imaging mode, the CPU 61 sets, as the light projection control flag 152, a light projection ON control flag 152A for switching the light projector 14 on.
• As a result, the light projector 14 is switched on. Therefore, in the imaging mode, the amount of light coming from the subject can be secured. This allows the user to check the indoor environment through the captured image 161A even in a situation where, for example, the room is filled with smoke due to a fire.
• In the temperature measurement mode, the CPU 61 sets, as the light projection control flag 152, a light projection OFF control flag 152B for switching off the light projector 14. This switches off the light projector 14 in the temperature measurement mode. Therefore, illumination light from the light projector 14 can be prevented from being mixed with the near-infrared light emitted from the subject. In addition, the measurement accuracy of temperature measurement can be improved compared to a case where, for example, illumination light from the light projector 14 is mixed with the near-infrared light emitted from the subject.
• The CPU 61 switches the light projector 14 on and off in response to switching between the imaging mode and the temperature measurement mode. Therefore, convenience can be improved compared to a case where, for example, the user has to switch the light projector 14 on and off manually.
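The pairing of mode, display control flag, and light projection control flag described above can be reduced to a small lookup. This is only an illustrative sketch; the identifiers below do not appear in the disclosure:

```python
from dataclasses import dataclass

IMAGING, TEMPERATURE = "imaging", "temperature"

@dataclass
class ControlFlags:
    display: str        # "captured_image" (flag 151A) or "composite_image" (flag 151B)
    projector_on: bool  # light projection ON flag 152A / OFF flag 152B

def flags_for_mode(mode: str) -> ControlFlags:
    if mode == IMAGING:
        # Imaging mode: plain captured image, projector on to secure light.
        return ControlFlags(display="captured_image", projector_on=True)
    # Temperature measurement mode: composite image, projector off so that
    # illumination light is not mixed into the measured near-infrared light.
    return ControlFlags(display="composite_image", projector_on=False)

print(flags_for_mode(IMAGING))      # ControlFlags(display='captured_image', projector_on=True)
print(flags_for_mode(TEMPERATURE))  # ControlFlags(display='composite_image', projector_on=False)
```

Deriving both flags from the mode in one place mirrors the convenience argument above: the user changes only the mode, and the projector follows automatically.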
• The CPU 61 outputs a synthesized image 161B obtained by synthesizing a captured image, obtained by the image sensor 15 receiving light, with temperature information indicating temperature. Therefore, by displaying the synthesized image 161B on the display 76, the user can easily grasp both the situation and the temperature of the subject.
• As an example, a captured image 161A in which a curtain 163 provided on the indoor side of a window 162 of a building is burning with fire 164 is displayed on the display 76.
• However, the captured image 161A may be any image.
• For example, the captured image 161A may be an image obtained by imaging the arm 170 of a person.
• As an example, a synthesized image 161B obtained by synthesizing temperature information with a captured image of a curtain provided on the indoor side of a window of a building is displayed.
• However, the synthesized image 161B may be any image obtained by synthesizing a captured image with temperature information.
• For example, the synthesized image 161B may be an image obtained by synthesizing temperature information 172 indicating a temperature distribution with a captured image 171 obtained by imaging the arm 170 of a person.
• Synthesis may be, for example, superimposition of a plurality of images by alpha blending, or embedding of the temperature information in the captured image.
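Alpha blending, mentioned above as one way of superimposing the temperature information on the captured image, computes each output pixel as out = alpha * overlay + (1 - alpha) * base. A minimal grayscale sketch (the image contents are invented stand-ins):

```python
def alpha_blend(base, overlay, alpha):
    """Blend two equally sized grayscale images given as lists of rows of 0-255 values."""
    if not 0.0 <= alpha <= 1.0:
        raise ValueError("alpha must be in [0, 1]")
    return [
        [round(alpha * o + (1.0 - alpha) * b) for b, o in zip(brow, orow)]
        for brow, orow in zip(base, overlay)
    ]

captured = [[100, 100], [100, 100]]  # stand-in for the captured image
heat_map = [[200, 0], [0, 200]]      # stand-in for the temperature information layer
print(alpha_blend(captured, heat_map, 0.5))  # [[150, 50], [50, 150]]
```

A real implementation would blend per color channel (or apply a color map to the temperature layer first), but the per-pixel formula is the same.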
• Instead of the flag setting unit 113 setting the light projection OFF control flag 152B shown in FIG., the flag setting unit 113 may set a light projection suppression control flag. When the light projection suppression control flag is set by the flag setting unit 113, the light projection control unit 114 outputs a light projection suppression command to the light projection control circuit 73, and the light projection control circuit 73 may suppress the light projection of the light projector 14 in accordance with the light projection suppression command (that is, reduce the amount of light projected from the light projector 14). Suppression refers to, for example, an operation of reducing the amount of light below a reference amount.
• The reference amount may be a fixed value, or may be a variable value that is changed according to instructions input by the user into the input device 78 and/or various conditions (for example, subject temperature and/or imaging conditions).
• In the second embodiment, the configuration of the camera 1 is changed from that of the first embodiment as follows.
• The points of the second embodiment that differ from the first embodiment will be described below.
• The mode switching processing unit 110 has a state signal acquisition unit 181, a state signal determination unit 182, a flag setting unit 113, a light projection control unit 114, and a mode setting unit 115.
• The state signal acquisition unit 181 acquires the state signal output from the light projection control circuit 73 according to the operating state of the light projector 14.
• The light projection control circuit 73 outputs, as the state signal, an ON state signal representing the ON state of the light projector 14 when the light projector 14 is on, and outputs, as the state signal, an OFF state signal representing the OFF state of the light projector 14 when the light projector 14 is off.
• The state signal determination unit 182 determines whether the state signal acquired by the state signal acquisition unit 181 is the ON state signal indicating that the light projector 14 is on.
• When the state signal determination unit 182 determines that the state signal is the ON state signal, the flag setting unit 113 sets the captured image display flag 151A as the display control flag 151 in the display control flag storage area 141. Further, when the state signal determination unit 182 determines that the state signal is not the ON state signal, the flag setting unit 113 sets the composite image display flag 151B as the display control flag 151 in the display control flag storage area 141.
• The mode setting unit 115 sets the imaging mode as the operation mode of the CPU 61 when the captured image display flag 151A is set as the display control flag 151. Further, when the composite image display flag 151B is set as the display control flag 151, the mode setting unit 115 sets the temperature measurement mode as the operation mode of the CPU 61.
• The imaging mode is an example of the "first mode" according to the technology of the present disclosure.
• The temperature measurement mode is an example of the "second mode" according to the technology of the present disclosure.
• The display control flag 151 is an example of the "control factor" according to the technology of the present disclosure.
• The display 76 is an example of the "controlled object" according to the technology of the present disclosure.
• The image displayed on the display 76 is an example of the "control details for the controlled object" according to the technology of the present disclosure.
• The display control flag 151 is an example of the "display control factor" according to the technology of the present disclosure.
• The captured image display flag 151A is an example of the "captured image display factor" according to the technology of the present disclosure.
• The composite image display flag 151B is an example of the "temperature information display factor" according to the technology of the present disclosure.
• The imaging processing performed by the imaging processing unit 120 and the temperature measurement processing performed by the temperature measurement processing unit 130 are the same as in the first embodiment.
• However, the mode switching processing executed by the mode switching processing unit 110 differs from that in the first embodiment. The mode switching processing executed by the mode switching processing unit 110 according to the second embodiment will be described below with reference to FIG. 21.
• In step S41, the state signal acquisition unit 181 acquires the state signal output from the light projection control circuit 73 according to the operating state of the light projector 14.
• In step S42, the state signal determination unit 182 determines whether or not the state signal is the ON state signal. If it is determined in step S42 that the state signal is the ON state signal, the processing shown in FIG. 21 proceeds to step S43; if it is determined that it is not the ON state signal, the processing proceeds to step S45.
• In step S43, the flag setting unit 113 sets the captured image display flag 151A as the display control flag 151 in the display control flag storage area 141.
• Then, the mode setting unit 115 sets the imaging mode as the operation mode of the CPU 61.
• In step S45, the flag setting unit 113 sets the composite image display flag 151B as the display control flag 151 in the display control flag storage area 141.
• Then, the mode setting unit 115 sets the temperature measurement mode as the operation mode of the CPU 61.
• The control method of the camera 1 according to the second embodiment is an example of the "control method" according to the technology of the present disclosure.
• The CPU 61 sets the imaging mode when the light projector 14 is on, and sets the temperature measurement mode when the light projector 14 is off. Therefore, convenience can be improved compared to a case where, for example, the imaging mode and the temperature measurement mode are not switched according to the on/off state of the light projector 14.
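The second embodiment reverses the direction of control: the projector's state signal determines the display flag and the operation mode (steps S41 to S45). That decision can be sketched as a lookup; the string identifiers are illustrative assumptions:

```python
def mode_from_state_signal(state_signal: str) -> dict:
    """Steps S41 to S45 reduced to a lookup: the ON state signal selects the
    imaging mode; any other state signal selects the temperature measurement mode."""
    if state_signal == "on":
        return {"display_flag": "captured_image", "mode": "imaging"}
    return {"display_flag": "composite_image", "mode": "temperature"}

print(mode_from_state_signal("on")["mode"])   # imaging
print(mode_from_state_signal("off")["mode"])  # temperature
```

Compare this with the first embodiment's sketch, where the mode was the input and the projector state was the output; here the user toggles only the projector, and the mode follows.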
• In the third embodiment, the configuration of the camera 1 is changed from that of the first embodiment as follows.
• The points of the third embodiment that differ from the first embodiment will be described below.
• In the third embodiment, the CPU 61 functions as a parameter change processing unit 190 in addition to the mode switching processing unit 110, the imaging processing unit 120, and the temperature measurement processing unit 130 described above.
• The parameter change processing unit 190 is a processing unit that operates both when the operation mode of the CPU 61 is the imaging mode and when it is the temperature measurement mode.
• The parameter change processing unit 190 sets different mode-specific parameters 211 for the imaging mode and the temperature measurement mode.
• The parameter change processing unit 190 has a mode determination unit 191 and a mode-specific parameter setting unit 192.
• The mode determination unit 191 determines whether the operation mode of the CPU 61 is the imaging mode or the temperature measurement mode.
• The RAM 63 is provided with a parameter storage area 201 for storing the mode-specific parameters 211.
• When the mode determination unit 191 determines that the operation mode of the CPU 61 is the imaging mode, the mode-specific parameter setting unit 192 performs various parameter setting processes based on the imaging conditions and the like to derive an imaging mode parameter 211A corresponding to the imaging mode. Then, the mode-specific parameter setting unit 192 sets the imaging mode parameter 211A as the mode-specific parameter 211 in the parameter storage area 201.
• When the mode determination unit 191 determines that the operation mode of the CPU 61 is the temperature measurement mode, the mode-specific parameter setting unit 192 performs various parameter setting processes based on the temperature measurement conditions and the like to derive a temperature measurement mode parameter 211B corresponding to the temperature measurement mode. Then, the mode-specific parameter setting unit 192 sets the temperature measurement mode parameter 211B as the mode-specific parameter 211 in the parameter storage area 201.
• The imaging mode parameter 211A includes a first imaging setting parameter 212A set for imaging and a first image processing setting parameter 213A set for image processing of the captured image.
• The temperature measurement mode parameter 211B includes a second imaging setting parameter 212B set for imaging and a second image processing setting parameter 213B set for image processing of the captured image.
• The first imaging setting parameter 212A and the second imaging setting parameter 212B include a parameter related to the light projector 14, a parameter related to the shutter speed, a parameter related to the diaphragm 33, a parameter related to photometry, a parameter related to the sensitivity of the image sensor 15, a parameter related to the high dynamic range, and a parameter related to anti-vibration control.
• The parameters related to the sensitivity of the image sensor 15 include a parameter related to the gain of the image sensor 15 and/or a parameter related to the conversion efficiency of the image sensor 15.
• The first image processing setting parameter 213A and the second image processing setting parameter 213B include a parameter related to noise reduction, a parameter related to sharpness, a parameter related to contrast, and a parameter related to tone.
• As an example, the first imaging setting parameter 212A included in the imaging mode parameter 211A is set as follows.
• The parameter related to the light projector 14 is set to a parameter that causes the light projector 14 to project light.
• The parameter related to the shutter speed is set to a parameter that makes the shutter speed equal to or higher than the reference speed.
• The shutter speed is defined by the time from when the front curtain starts to open until the rear curtain finishes closing in a mechanical shutter, the time from when an electronic front curtain operates until the rear curtain of a mechanical shutter finishes closing, or the time from when an electronic shutter starts operating until it finishes operating.
• The reference speed may be a fixed value, or may be a variable value that is changed according to instructions input by the user into the input device 78 and/or various conditions (for example, subject temperature and/or imaging conditions).
• The parameter related to the diaphragm 33 is set to a parameter that makes the diaphragm amount equal to or greater than the reference diaphragm amount.
• The diaphragm amount is proportional to the diameter of the aperture 33A provided in the diaphragm 33.
• The reference diaphragm amount may be a fixed value, or may be a variable value that is changed according to instructions input by the user into the input device 78 and/or various conditions (for example, subject temperature and/or imaging conditions).
• The parameter related to photometry is set to a parameter for performing photometry by the average photometry method or the multi-pattern photometry method.
• Photometry is the measurement of the brightness of a subject.
• The parameter related to the gain of the image sensor 15 is set to a parameter that makes the gain of the image sensor 15 equal to or greater than the reference gain.
• The gain of the image sensor 15 refers to, for example, the analog gain of an A/D converter (not shown) connected to the photodiodes of the image sensor 15.
• The reference gain may be a fixed value, or may be a variable value that is changed according to instructions input by the user into the input device 78 and/or various conditions (for example, subject temperature and/or imaging conditions).
• The parameter related to the conversion efficiency of the image sensor 15 is set to a parameter that makes the conversion efficiency of the image sensor 15 equal to or higher than the reference conversion efficiency.
• The conversion efficiency of the image sensor 15 refers to the efficiency of converting the electric charge accumulated in a variable capacitor (not shown) connected to a photodiode included in the image sensor 15 into a voltage.
• The reference conversion efficiency may be a fixed value, or may be a variable value that is changed according to instructions input by the user into the input device 78 and/or various conditions (for example, subject temperature and/or imaging conditions).
• The parameter related to the high dynamic range is set to a parameter that turns on the high dynamic range.
• The high dynamic range is a display technique that improves the contrast (that is, the contrast ratio) between the bright and dark portions of the captured image 161A. Turning on the high dynamic range means expanding the dynamic range beyond the reference range.
• The reference range may be a fixed value, or may be a variable value that is changed according to instructions input by the user into the input device 78 and/or various conditions (for example, subject temperature and/or imaging conditions).
• The parameter related to anti-vibration control is set to a parameter that turns on anti-vibration control.
• Anti-vibration control is control for moving the blur correction lens 34 in a direction in which image blur is corrected. Turning on anti-vibration control means performing the control to move the blur correction lens 34.
• As an example, the first image processing setting parameter 213A included in the imaging mode parameter 211A is set as follows.
• The parameter related to noise reduction is set to a parameter that makes the degree of strength of noise reduction greater than the first reference degree.
• Noise reduction is image processing for reducing noise appearing in a captured image, and the strength of noise reduction refers to the rate at which the noise appearing in the captured image is reduced.
• The first reference degree may be a fixed value, or may be a variable value that is changed according to instructions input by the user into the input device 78 and/or various conditions (for example, subject temperature and/or imaging conditions).
• The parameter related to sharpness is set to a parameter that makes the degree of strength of sharpness greater than the second reference degree. Sharpness refers to enhancement of the contours of the captured image, and the strength of sharpness refers to the rate at which the contours of the captured image are enhanced.
• The second reference degree may be a fixed value, or may be a variable value that is changed according to instructions input by the user into the input device 78 and/or various conditions (for example, subject temperature and/or imaging conditions).
• The parameter related to contrast is set to a parameter that makes the degree of strength of contrast greater than the third reference degree. Contrast refers to the difference in brightness and/or color of a captured image, and the strength of contrast refers to the magnitude of that difference.
• The third reference degree may be a fixed value, or may be a variable value that is changed according to instructions input by the user into the input device 78 and/or various conditions (for example, subject temperature and/or imaging conditions).
• The parameter related to tone is set to a parameter that makes the degree of intensity of the tone greater than the fourth reference degree.
• The tone is the color tone of the captured image, and the intensity of the tone refers to the strength of the color tone of the captured image.
• The fourth reference degree may be a fixed value, or may be a variable value that is changed according to instructions input by the user into the input device 78 and/or various conditions (for example, subject temperature and/or imaging conditions).
• As an example, the second imaging setting parameter 212B included in the temperature measurement mode parameter 211B is set as follows.
• The parameter related to the light projector 14 is set to a parameter that causes the light projector 14 not to project light.
• The parameter related to the shutter speed is set to a parameter that makes the shutter speed less than the reference speed.
• The reference speed may be a fixed value, or may be a variable value that is changed according to instructions input by the user into the input device 78 and/or various conditions (for example, subject temperature and/or imaging conditions).
• The parameter related to the diaphragm 33 is set to a parameter that makes the diaphragm amount smaller than the reference diaphragm amount.
• The reference diaphragm amount may be a fixed value, or may be a variable value that is changed according to instructions input by the user into the input device 78 and/or various conditions (for example, subject temperature and/or imaging conditions).
• The parameter related to photometry is set to a parameter for performing photometry by the highlight-weighted photometry method, the center-weighted photometry method, or the spot photometry method.
• The parameter related to the gain of the image sensor 15 is set to a parameter that makes the gain of the image sensor 15 less than the reference gain.
• The reference gain may be a fixed value, or may be a variable value that is changed according to instructions input by the user into the input device 78 and/or various conditions (for example, subject temperature and/or imaging conditions).
• The parameter related to the conversion efficiency of the image sensor 15 is set to a parameter that makes the conversion efficiency of the image sensor 15 less than the reference conversion efficiency.
• The reference conversion efficiency may be a fixed value, or may be a variable value that is changed according to instructions input by the user into the input device 78 and/or various conditions (for example, subject temperature and/or imaging conditions).
• The parameter related to the high dynamic range is set to a parameter that turns off the high dynamic range. Turning off the high dynamic range means setting the dynamic range to the reference range.
• The reference range may be a fixed value, or may be a variable value that is changed according to instructions input by the user into the input device 78 and/or various conditions (for example, subject temperature and/or imaging conditions).
• The parameter related to anti-vibration control is set to a parameter that turns off anti-vibration control.
• As an example, the second image processing setting parameter 213B included in the temperature measurement mode parameter 211B is set as follows.
• The parameter related to noise reduction is set to a parameter that makes the degree of strength of noise reduction equal to or less than the first reference degree.
• The first reference degree may be a fixed value, or may be a variable value that is changed according to instructions input by the user into the input device 78 and/or various conditions (for example, subject temperature and/or imaging conditions).
• The parameter related to sharpness is set to a parameter that makes the degree of strength of sharpness equal to or less than the second reference degree.
• The second reference degree may be a fixed value, or may be a variable value that is changed according to instructions input by the user into the input device 78 and/or various conditions (for example, subject temperature and/or imaging conditions).
• The parameter related to contrast is set to a parameter that makes the degree of strength of contrast equal to or less than the third reference degree.
• The third reference degree may be a fixed value, or may be a variable value that is changed according to instructions input by the user into the input device 78 and/or various conditions (for example, subject temperature and/or imaging conditions).
• The parameter related to tone is set to a parameter that makes the degree of intensity of the tone equal to or less than the fourth reference degree.
• The fourth reference degree may be a fixed value, or may be a variable value that is changed according to instructions input by the user into the input device 78 and/or various conditions (for example, subject temperature and/or imaging conditions).
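The two parameter tables above amount to selecting one of two records by the current mode. In the sketch below, every concrete value is an invented placeholder standing in for the reference speed, reference gain, and reference degrees; only the direction of each comparison follows the text:

```python
# Invented placeholder reference values (the disclosure leaves them fixed or user-set).
REFERENCE = {"speed": 1 / 60, "gain": 12.0, "noise_reduction": 5}

MODE_PARAMS = {
    # Imaging mode parameter 211A: projector on, settings at or above the references.
    "imaging": {
        "projector_on": True,
        "shutter_speed": REFERENCE["speed"] * 2,              # >= reference speed
        "gain": REFERENCE["gain"] + 6.0,                      # >= reference gain
        "hdr_on": True,
        "anti_vibration_on": True,
        "noise_reduction": REFERENCE["noise_reduction"] + 2,  # > first reference degree
    },
    # Temperature measurement mode parameter 211B: projector off, settings below or at them.
    "temperature": {
        "projector_on": False,
        "shutter_speed": REFERENCE["speed"] / 2,              # < reference speed
        "gain": REFERENCE["gain"] - 6.0,                      # < reference gain
        "hdr_on": False,
        "anti_vibration_on": False,
        "noise_reduction": REFERENCE["noise_reduction"],      # <= first reference degree
    },
}

def params_for(mode: str) -> dict:
    """The mode-specific parameter setting, reduced to a dictionary lookup."""
    return MODE_PARAMS[mode]

print(params_for("temperature")["projector_on"])  # False
```

Keeping both records in one table makes the contrast between the modes explicit and lets the parameter change processing swap the entire configuration atomically when the mode changes.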
• When the CPU 61 is in the imaging mode, the imaging control unit 123 performs imaging settings related to imaging according to the first imaging setting parameter 212A included in the imaging mode parameter 211A stored in the parameter storage area 201. Specifically, the imaging control unit 123 sets the light projector 14 to project light as the setting related to the light projector 14, and outputs an ON command to the light projection control circuit 73. The imaging control unit 123 also adjusts the shutter speed by setting the shutter speed to be equal to or higher than the reference speed as the setting related to the shutter speed.
• The imaging control unit 123 sets the diaphragm amount to be equal to or greater than the reference diaphragm amount as the setting related to the diaphragm 33, and outputs a diaphragm command corresponding to the set diaphragm amount to the diaphragm drive circuit 53.
• The imaging control unit 123 sets the photometry method to the average photometry method or the multi-pattern photometry method as the setting related to photometry.
• The imaging control unit 123 sets the gain of the image sensor 15 to be equal to or greater than the reference gain as the setting related to the gain of the image sensor 15.
• The imaging control unit 123 sets the conversion efficiency of the image sensor 15 to be equal to or higher than the reference conversion efficiency as the setting related to the conversion efficiency of the image sensor 15.
• The imaging control unit 123 outputs a sensitivity command corresponding to the set gain and conversion efficiency to the image sensor driver 71.
• The imaging control unit 123 sets the high dynamic range to ON as the setting related to the high dynamic range.
• The imaging control unit 123 sets anti-vibration control to ON as the setting related to anti-vibration control, and outputs a blur correction command to the blur correction drive circuit 54.
• When the CPU 61 is in the imaging mode, the display control unit 124 performs image processing settings related to image processing according to the first image processing setting parameter 213A included in the imaging mode parameter 211A stored in the parameter storage area 201.
• The display control unit 124 sets the degree of strength of noise reduction to be greater than the first reference degree as the setting related to noise reduction.
• The display control unit 124 sets the degree of strength of sharpness to be greater than the second reference degree as the setting related to sharpness.
• The display control unit 124 sets the degree of strength of contrast to be greater than the third reference degree as the setting related to contrast.
• The display control unit 124 sets the degree of intensity of the tone to be greater than the fourth reference degree as the setting related to tone.
• When the CPU 61 is in the temperature measurement mode, the first imaging control unit 133 and the second imaging control unit 135 perform imaging settings related to imaging according to the second imaging setting parameter 212B included in the temperature measurement mode parameter 211B stored in the parameter storage area 201.
• The first imaging control unit 133 and the second imaging control unit 135 set the light projector 14 not to project light, and output an OFF command to the light projection control circuit 73.
• The first imaging control unit 133 and the second imaging control unit 135 adjust the shutter speed by setting the shutter speed to be less than the reference speed as the setting related to the shutter speed.
• The first imaging control unit 133 and the second imaging control unit 135 set the diaphragm amount to be less than the reference diaphragm amount as the setting related to the diaphragm 33, and output a diaphragm command corresponding to the set diaphragm amount to the diaphragm drive circuit 53.
• The first imaging control unit 133 and the second imaging control unit 135 set the photometry method to the highlight-weighted photometry method, the center-weighted photometry method, or the spot photometry method.
• The first imaging control unit 133 and the second imaging control unit 135 set the gain of the image sensor 15 to be less than the reference gain as the setting related to the gain of the image sensor 15.
• The first imaging control unit 133 and the second imaging control unit 135 set the conversion efficiency of the image sensor 15 to be less than the reference conversion efficiency as the setting related to the conversion efficiency of the image sensor 15.
• The first imaging control unit 133 and the second imaging control unit 135 output sensitivity commands corresponding to the set gain and conversion efficiency to the image sensor driver 71.
• The first imaging control unit 133 and the second imaging control unit 135 set the high dynamic range to OFF as the setting related to the high dynamic range.
• The first imaging control unit 133 and the second imaging control unit 135 set anti-vibration control to OFF as the setting related to anti-vibration control, and output a blur correction stop command to the blur correction drive circuit 54.
• When the CPU 61 is in the temperature measurement mode, the display control unit 137 performs image processing settings according to the second image processing setting parameter 213B included in the temperature measurement mode parameter 211B stored in the parameter storage area 201.
• The display control unit 137 sets the degree of strength of noise reduction to be equal to or less than the first reference degree as the setting related to noise reduction.
• The display control unit 137 sets the degree of strength of sharpness to be equal to or less than the second reference degree as the setting related to sharpness.
• The display control unit 137 sets the degree of strength of contrast to be equal to or less than the third reference degree as the setting related to contrast.
• The display control unit 137 sets the degree of intensity of the tone to be equal to or less than the fourth reference degree as the setting related to tone.
  • the settings related to the projector 14, the shutter speed settings, the settings related to the aperture 33, the photometry settings, the settings related to the sensitivity of the image sensor 15, the high dynamic range settings, and the anti-vibration control settings are examples of the "imaging setting" according to the technology of the present disclosure. Also, the first imaging setting parameter 212A and the second imaging setting parameter 212B are examples of the "imaging setting factor related to imaging setting" and the "control factor" according to the technology of the present disclosure. Also, the settings related to noise reduction, sharpness, contrast, and tone are examples of the "image processing setting" according to the technology of the present disclosure. Also, the first image processing setting parameter 213A and the second image processing setting parameter 213B are examples of the "image processing setting factor related to image processing setting" and the "control factor" according to the technology of the present disclosure.
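To make the relationship between the two parameter sets concrete, the mode-specific settings above can be sketched as plain data. Everything here (type names, field names, and the numeric strength values) is an illustrative assumption, not taken from the disclosure:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ImagingSettings:
    projector_on: bool        # setting related to the projector 14
    shutter_fast: bool        # True: shutter speed >= reference speed
    aperture_large: bool      # True: aperture amount >= reference amount
    photometry: str           # "average" / "multi" / "spot" / ...
    high_gain: bool           # True: gain >= reference gain
    hdr_on: bool              # high dynamic range ON/OFF
    stabilization_on: bool    # anti-vibration control ON/OFF

@dataclass(frozen=True)
class ImageProcessingSettings:
    noise_reduction: int      # strength relative to the reference degrees
    sharpness: int
    contrast: int
    tone: int

# First parameters (imaging mode): favor image quality.
IMAGING_MODE = (
    ImagingSettings(True, True, True, "average", True, True, True),
    ImageProcessingSettings(noise_reduction=2, sharpness=2, contrast=2, tone=2),
)

# Second parameters (temperature measurement mode): favor measurement
# accuracy and a lighter CPU load.
TEMPERATURE_MODE = (
    ImagingSettings(False, False, False, "spot", False, False, False),
    ImageProcessingSettings(noise_reduction=1, sharpness=1, contrast=1, tone=1),
)
```

The point of the sketch is only that the two modes reference two frozen, mutually exclusive bundles of imaging and image-processing settings.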
  • mode switching processing executed by the mode switching processing unit 110, imaging processing executed by the imaging processing unit 120, and temperature measurement processing executed by the temperature measurement processing unit 130 are the same as in the first embodiment.
  • the third embodiment differs from the first embodiment in that a parameter change processing unit 190 executes parameter change processing. Parameter change processing executed by the parameter change processing unit 190 according to the third embodiment will be described below with reference to FIG. 25 .
  • In step S51, the mode determination unit 112 determines whether the operation mode of the CPU 61 is the imaging mode or the temperature measurement mode. If it is determined in step S51 that the operation mode is the imaging mode, the processing shown in FIG. 25 proceeds to step S52; if it is determined that the operation mode is the temperature measurement mode, the processing shown in FIG. 25 proceeds to step S53.
  • In step S52, the mode-specific parameter setting unit 192 derives the imaging mode parameter 211A corresponding to the imaging mode, and sets the imaging mode parameter 211A as the mode-specific parameter 211 in the parameter storage area 201.
  • In step S53, the mode-specific parameter setting unit 192 derives the temperature measurement mode parameter 211B corresponding to the temperature measurement mode, and sets the temperature measurement mode parameter 211B as the mode-specific parameter 211 in the parameter storage area 201.
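The parameter change processing of steps S51 to S53 amounts to selecting one of two parameter objects by mode. A minimal sketch, with the parameter storage area modeled as a dict and the parameters as placeholder strings (both assumptions made for illustration):

```python
def parameter_change_processing(operation_mode, parameter_storage):
    """Sketch of steps S51-S53: choose the mode-specific parameter 211
    according to the current operation mode and store it."""
    if operation_mode == "imaging":          # step S51 -> step S52
        parameter_storage["mode_specific_parameter"] = "imaging_mode_parameter_211A"
    elif operation_mode == "temperature":    # step S51 -> step S53
        parameter_storage["mode_specific_parameter"] = "temperature_mode_parameter_211B"
    else:
        raise ValueError(f"unknown mode: {operation_mode}")
    return parameter_storage
```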
  • the control method of the camera 1 according to the third embodiment is an example of the "control method" according to the technology of the present disclosure.
  • imaging setting parameters regarding imaging settings are different between the imaging mode and the temperature measurement mode. That is, as an example, the first imaging setting parameter 212A is set in the imaging mode, and the second imaging setting parameter 212B is set in the temperature measurement mode. Therefore, compared to the case where the imaging setting parameters are the same in the imaging mode and the temperature measurement mode, it is possible, for example, to obtain good image quality for the captured image in the imaging mode, and to secure measurement accuracy in the temperature measurement mode while reducing, for example, the load on the CPU 61.
  • the imaging settings also include settings related to the projector 14.
  • the CPU 61 sets the light projector 14 to emit light in the imaging mode, and sets the light projector 14 not to emit light in the temperature measurement mode. Therefore, in the imaging mode, by securing the amount of light emitted from the subject, it is possible to obtain better image quality for the captured image than when the light projector 14 does not project light.
  • In the temperature measurement mode, compared to the case where the light projector 14 projects light, measurement accuracy can be ensured by, for example, suppressing the mixing of the illumination light from the light projector 14 with the near-infrared light emitted from the subject.
  • the imaging settings include settings related to shutter speed.
  • the CPU 61 sets the shutter speed to the reference speed or higher in the imaging mode, and sets the shutter speed to less than the reference speed in the temperature measurement mode. That is, the CPU 61 sets the shutter speed in the imaging mode faster than the shutter speed in the temperature measurement mode. Accordingly, in the imaging mode, by securing the amount of light incident on the image sensor 15, good image quality can be obtained for the captured image. On the other hand, in the temperature measurement mode, measurement accuracy can be ensured by suppressing noise in the first analog image data and the second analog image data output from the image sensor 15.
  • the imaging settings include settings related to the aperture 33.
  • the CPU 61 sets the aperture amount to the reference amount or more in the imaging mode, and sets the aperture amount to less than the reference amount in the temperature measurement mode. That is, the CPU 61 sets the aperture amount in the imaging mode to be larger than the aperture amount in the temperature measurement mode. Accordingly, in the imaging mode, by securing the amount of light incident on the image sensor 15, good image quality can be obtained for the captured image. On the other hand, in the temperature measurement mode, measurement accuracy can be ensured by suppressing noise in the first analog image data and the second analog image data output from the image sensor 15.
  • the imaging settings include settings related to photometry.
  • the CPU 61 sets the photometry method to the average photometry method or the multi-pattern photometry method in the imaging mode, and sets the photometry method to the highlight-weighted photometry method, the center-weighted photometry method, or the spot photometry method in the temperature measurement mode. Accordingly, in the imaging mode, by securing the amount of light emitted from the entire subject and incident on the image sensor 15, good image quality can be obtained for the captured image. On the other hand, in the temperature measurement mode, measurement accuracy can be ensured by suppressing saturation of the amount of light emitted from the highest-temperature region of the subject and incident on the image sensor 15.
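The difference between the two photometry policies can be illustrated with a toy metering function over a 2-D brightness array; the function names and the square metering region are assumptions made for illustration:

```python
def average_photometry(image):
    """Average photometry: meter over the whole frame."""
    flat = [v for row in image for v in row]
    return sum(flat) / len(flat)

def spot_photometry(image, center, radius=1):
    """Spot photometry: meter only a small region around `center`
    (e.g. the hottest area of the subject in the temperature mode)."""
    cy, cx = center
    values = [
        image[y][x]
        for y in range(max(0, cy - radius), min(len(image), cy + radius + 1))
        for x in range(max(0, cx - radius), min(len(image[0]), cx + radius + 1))
    ]
    return sum(values) / len(values)
```

With a single bright spot in the frame, average metering barely reacts while spot metering exposes for the spot itself, which is why spot metering avoids saturating the hottest region.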
  • the imaging settings also include settings related to the sensitivity of the image sensor 15.
  • the parameters relating to the sensitivity of the image sensor 15 include parameters relating to the gain of the image sensor 15 and parameters relating to the conversion efficiency of the image sensor 15.
  • the CPU 61 sets the gain of the image sensor 15 to a reference gain or more in the imaging mode, and sets the gain of the image sensor 15 to less than the reference gain in the temperature measurement mode. That is, the CPU 61 sets the gain in the imaging mode higher than the gain in the temperature measurement mode. Accordingly, in the imaging mode, analog image data corresponding to the exposure can be obtained, so that a good image quality can be obtained for the captured image.
  • On the other hand, in the temperature measurement mode, measurement accuracy can be ensured by suppressing noise in the first analog image data and the second analog image data output from the image sensor 15, and by suppressing saturation of the peak values of the first analog image data and the second analog image data caused by the light radiated from the highest-temperature area of the subject.
  • the CPU 61 sets the conversion efficiency of the image sensor 15 to be equal to or higher than the reference conversion efficiency in the imaging mode, and sets the conversion efficiency of the image sensor 15 to be lower than the reference conversion efficiency in the temperature measurement mode. That is, the CPU 61 sets the conversion efficiency in the imaging mode higher than the conversion efficiency in the temperature measurement mode. Accordingly, in the imaging mode, analog image data corresponding to the exposure can be obtained, so that a good image quality can be obtained for the captured image.
  • On the other hand, in the temperature measurement mode, measurement accuracy can be ensured by suppressing noise in the first analog image data and the second analog image data output from the image sensor 15, and by suppressing saturation of the peak values of the first analog image data and the second analog image data caused by the light radiated from the highest-temperature area of the subject.
  • the imaging settings include settings related to high dynamic range.
  • the CPU 61 sets the high dynamic range to ON in the imaging mode, and sets the high dynamic range to OFF in the temperature measurement mode.
  • In the imaging mode, the dynamic range is wider than the reference range, so that good image quality can be obtained for the captured image.
  • In the temperature measurement mode, by setting the dynamic range to the reference range, measurement accuracy can be ensured.
  • the imaging settings include settings related to anti-vibration control.
  • the CPU 61 sets anti-vibration control to ON in the imaging mode, and sets anti-vibration control to OFF in the temperature measurement mode.
  • In the imaging mode, image blurring is suppressed, so that good image quality can be obtained for the captured image.
  • In the temperature measurement mode, the load on the CPU 61 can be reduced by reducing the amount of computation the CPU 61 requires to suppress blurring of the image.
  • image processing setting parameters regarding image processing settings are different between the imaging mode and the temperature measurement mode. That is, as an example, the first image processing setting parameter 213A is set in the imaging mode, and the second image processing setting parameter 213B is set in the temperature measurement mode. Therefore, compared to the case where the image processing setting parameters are the same in the imaging mode and the temperature measurement mode, it is possible, for example, to obtain good image quality for the captured image in the imaging mode, and to reduce, for example, the burden on the CPU 61 in the temperature measurement mode.
  • the image processing settings include settings related to noise reduction.
  • the CPU 61 sets the degree of noise reduction strength to be greater than the first reference degree in the imaging mode, and sets the degree of noise reduction strength to the first reference degree or less in the temperature measurement mode. That is, the CPU 61 sets the noise reduction in the imaging mode to be stronger than the noise reduction in the temperature measurement mode. Accordingly, in the imaging mode, the noise included in the captured image is reduced, so that the captured image can be obtained with good image quality. On the other hand, in the temperature measurement mode, the load on the CPU 61 can be reduced by reducing the amount of computation required by the CPU 61 to reduce noise.
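As a sketch of mode-dependent noise reduction strength, a 1-D moving average whose window grows with the strength value shows both effects described above: stronger smoothing in the imaging mode, and less computation in the temperature measurement mode. The strength values and `FIRST_REFERENCE_DEGREE` are assumed, not taken from the disclosure:

```python
def reduce_noise(signal, strength):
    """Simple moving-average noise reduction; larger `strength` means a
    wider window (stronger smoothing, more computation)."""
    if strength <= 0:
        return list(signal)
    out = []
    n = len(signal)
    for i in range(n):
        lo, hi = max(0, i - strength), min(n, i + strength + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

FIRST_REFERENCE_DEGREE = 1  # assumed value

def mode_noise_reduction(signal, mode):
    """Apply stronger noise reduction in the imaging mode than in the
    temperature measurement mode, per the rule above."""
    strength = FIRST_REFERENCE_DEGREE + 1 if mode == "imaging" else FIRST_REFERENCE_DEGREE
    return reduce_noise(signal, strength)
```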
  • the image processing settings include settings related to sharpness.
  • the CPU 61 sets the degree of sharpness to be greater than the second reference degree in the imaging mode, and sets the degree of sharpness to the second reference degree or less in the temperature measurement mode. That is, the CPU 61 sets the sharpness in the imaging mode stronger than the sharpness in the temperature measurement mode. As a result, in the imaging mode, the sharpness of the captured image is enhanced, so that a good image quality can be obtained for the captured image.
  • On the other hand, in the temperature measurement mode, the burden on the CPU 61 can be reduced by reducing the amount of arithmetic processing the CPU 61 requires for adjusting the sharpness.
  • the image processing settings include settings related to contrast.
  • the CPU 61 sets the degree of contrast strength to be greater than the third reference degree in the imaging mode, and sets the degree of contrast strength to the third reference degree or less in the temperature measurement mode. That is, the CPU 61 sets the contrast in the imaging mode higher than the contrast in the temperature measurement mode.
  • On the other hand, in the temperature measurement mode, the load on the CPU 61 can be reduced by reducing the amount of computation the CPU 61 requires for adjusting the contrast.
  • the image processing settings include settings related to tone.
  • the CPU 61 sets the degree of tone strength to be greater than the fourth reference degree in the imaging mode, and sets the degree of tone strength to the fourth reference degree or less in the temperature measurement mode. That is, the CPU 61 sets the tone in the imaging mode stronger than the tone in the temperature measurement mode.
  • In the imaging mode, the tone of the captured image is enhanced, so that good image quality can be obtained for the captured image.
  • On the other hand, in the temperature measurement mode, the load on the CPU 61 can be reduced by reducing the amount of computation the CPU 61 requires for adjusting the tone.
  • Although in the above embodiments the settings related to the projector 14, the shutter speed, the aperture 33, photometry, the sensitivity of the image sensor 15, high dynamic range, and anti-vibration control are made different between the imaging mode and the temperature measurement mode, combinations of imaging settings that differ between the imaging mode and the temperature measurement mode may be other than the above.
  • the setting for high dynamic range may be set to ON in both the imaging mode and the temperature measurement mode.
  • the setting regarding anti-vibration control may be set to ON in both the imaging mode and the temperature measurement mode.
  • the imaging settings that are made different between the imaging mode and the temperature measurement mode may be other than the above, as long as they include at least one of the settings related to the projector 14, the shutter speed, the aperture 33, photometry, the sensitivity of the image sensor 15, high dynamic range, and anti-vibration control.
  • the imaging settings that are made different between the imaging mode and the temperature measurement mode may also include various imaging settings related to the camera 1 in addition to the settings related to the projector 14, the shutter speed, the aperture 33, photometry, the sensitivity of the image sensor 15, high dynamic range, and anti-vibration control.
  • Although the settings related to noise reduction, sharpness, contrast, and tone are made different between the imaging mode and the temperature measurement mode in the above embodiments, combinations of image processing settings that differ between the two modes may be other than the above.
  • the image processing settings that are differentiated between the imaging mode and the temperature measurement mode may be other than the above, as long as they include at least one of noise reduction settings, sharpness settings, contrast settings, and tone settings.
  • the image processing settings that are made different between the imaging mode and the temperature measurement mode may include various image processing settings related to the camera 1 in addition to the settings related to noise reduction, sharpness, contrast, and tone.
  • Although the CPU 61 sets the light projector 14 not to project light in the temperature measurement mode, the light projector 14 may instead be set to suppress light projection (that is, to reduce the amount of light projected from the light projector 14).
  • In the fourth embodiment, the configuration of the camera 1 is changed as follows from the first embodiment. Differences of the fourth embodiment from the first embodiment will be described below.
  • In the fourth embodiment, the mode switching processing unit 110 includes a mode determination unit 112, a measured temperature acquisition unit 221, a temperature measurement mode end determination unit 222, a flag setting unit 113, a light projection control unit 114, and a mode setting unit 115.
  • the mode determination unit 112 determines whether the operation mode of the CPU 61 is the imaging mode or the temperature measurement mode.
  • the measured temperature acquisition unit 221 acquires the temperature of the subject measured in the temperature measurement mode (hereinafter referred to as the measured temperature).
  • the measured temperature is any of: a value of the temperature distribution of the subject, the maximum value of the temperature distribution of the subject, the mode of the temperature distribution of the subject, the median of the temperature distribution of the subject, and the average value of the temperature distribution of the subject.
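The candidate values for the measured temperature listed above are ordinary statistics over the subject's per-pixel temperature distribution and can be computed directly. A sketch; representing the distribution as a flat list of temperatures is an assumption:

```python
from statistics import mode, median, mean

def measured_temperature_candidates(temperature_distribution):
    """Return the candidate 'measured temperature' values listed above:
    the maximum, mode, median, and average of the subject's temperature
    distribution (a flat list of per-pixel temperatures)."""
    return {
        "max": max(temperature_distribution),
        "mode": mode(temperature_distribution),
        "median": median(temperature_distribution),
        "mean": mean(temperature_distribution),
    }
```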
  • the temperature measurement mode end determination unit 222 determines whether to end the temperature measurement mode based on the measured temperature acquired by the measured temperature acquisition unit 221.
  • Specifically, the temperature measurement mode end determination unit 222 derives a value based on the measured temperature acquired by the measured temperature acquisition unit 221, and determines whether to end the temperature measurement mode by determining whether the derived value is equal to or less than a threshold value.
  • the value based on the measured temperature may be any value derived from the measured temperature. For example, it may be the value of the measured temperature itself, or the value of the amount of radiant heat calculated based on the measured temperature. Also, a calculation formula may be used to derive the value based on the measured temperature, or a data matching table may be used.
  • the temperature measurement mode end determination unit 222 determines to end the temperature measurement mode when the value based on the measured temperature is equal to or less than the threshold.
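The end determination then reduces to a threshold comparison on a value derived from the measured temperature. A minimal sketch, in which the derivation function defaults to the identity but could be replaced by a radiant-heat calculation or a lookup table as described above (all names are assumptions):

```python
def should_end_temperature_mode(measured_temperature, threshold,
                                derive=lambda t: t):
    """Sketch of the temperature measurement mode end determination:
    derive a value from the measured temperature and end the mode when
    the derived value is at or below the threshold."""
    return derive(measured_temperature) <= threshold
```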
  • the flag setting unit 113 sets the captured image display flag 151A as the display control flag 151 in the display control flag storage area 141, and sets the light projection ON control flag 152A as the light projection control flag 152 in the light projection control flag storage area 142.
  • the light projection control unit 114 outputs an ON command to the light projection control circuit 73 when the light projection ON control flag 152A is set by the flag setting unit 113.
  • the ON command is a command to switch the projector 14 ON.
  • the mode setting unit 115 sets the imaging mode as the mode of the CPU 61.
  • the imaging processing executed by the imaging processing unit 120 and the temperature measurement processing executed by the temperature measurement processing unit 130 are the same as in the first embodiment.
  • mode switching processing executed by the mode switching processing unit 110 is different from that in the first embodiment. Mode switching processing executed by the mode switching processing unit 110 according to the fourth embodiment will be described below with reference to FIG. 27 .
  • In step S61, the mode determination unit 112 determines whether the mode of the CPU 61 is the imaging mode or the temperature measurement mode. If it is determined in step S61 that the temperature measurement mode is set, the process shown in FIG. 27 proceeds to step S62; if it is determined that the imaging mode is set, the process shown in FIG. 27 ends.
  • In step S62, the measured temperature acquisition unit 221 acquires the measured temperature measured in the temperature measurement mode.
  • In step S63, the temperature measurement mode end determination unit 222 determines whether to end the temperature measurement mode based on the measured temperature acquired by the measured temperature acquisition unit 221. If it is determined in step S63 that the temperature measurement mode should be ended, the process shown in FIG. 27 proceeds to step S64; if it is determined not to end the temperature measurement mode, the process shown in FIG. 27 ends.
  • In step S64, the flag setting unit 113 sets the captured image display flag 151A as the display control flag 151 in the display control flag storage area 141, and sets the light projection ON control flag 152A as the light projection control flag 152 in the light projection control flag storage area 142.
  • In step S65, the light projection control unit 114 switches the light projector 14 ON.
  • In step S66, the mode setting unit 115 sets the imaging mode as the mode of the CPU 61.
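The mode switching of steps S61 to S66 can be sketched as a single function over a state dict standing in for the flag storage areas and the current mode; the keys and string values are assumptions made for illustration:

```python
def mode_switching(state, measured_temperature, threshold):
    """Sketch of steps S61-S66: while in the temperature measurement
    mode, switch back to the imaging mode once the measured temperature
    falls to or below the threshold."""
    if state["mode"] != "temperature":                 # step S61
        return state
    if measured_temperature > threshold:               # step S63
        return state                                   # keep measuring
    # steps S64-S66: set the flags, switch the projector ON,
    # and enter the imaging mode.
    state["display_control_flag"] = "captured_image_display_flag_151A"
    state["light_projection_flag"] = "light_projection_on_flag_152A"
    state["projector_on"] = True
    state["mode"] = "imaging"
    return state
```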
  • the control method of the camera 1 according to the fourth embodiment is an example of the "control method" according to the technology of the present disclosure.
  • the CPU 61 switches from the temperature measurement mode to the imaging mode according to the temperature of the subject in the temperature measurement mode. Therefore, convenience can be improved compared to the case where the temperature measurement mode is not switched to the imaging mode according to the temperature of the object, for example.
  • In the fifth embodiment, the configuration of the camera 1 is changed as follows from the first embodiment. Differences of the fifth embodiment from the first embodiment will be described below.
  • In the fifth embodiment, the CPU 61 functions as an integrated display processing unit 230.
  • the integrated display processing unit 230 is a processing unit that, while causing the light projector 14 to emit light in accordance with the pulse emission timing, repeats the operation of setting the imaging mode during the emission period of the pulse emission and setting the temperature measurement mode during the emission stop period of the pulse emission. The light projector 14 performs intermittent light projection by performing pulse light emission.
  • the integrated display processing unit 230 has a pulse light emission control unit 241, an imaging processing unit 120, a pulse emission stop control unit 242, a temperature measurement processing unit 130, and an integrated display control unit 243.
  • the pulse light emission control unit 241 outputs a pulse light emission command to the light projection control circuit 73 and controls the light projector 14 to emit pulse light.
  • the imaging processing unit 120 has a wavelength selection unit 121, a turret control unit 122, and an imaging control unit 123.
  • the functions of the wavelength selection unit 121, the turret control unit 122, and the imaging control unit 123 are the same as in the first embodiment.
  • the imaging processing unit 120 performs imaging processing for obtaining a captured image by causing the image sensor 15 to capture visible light or near-infrared light.
  • the pulse emission stop control unit 242 outputs a pulse emission stop command to the light projection control circuit 73 and controls the light projector 14 to stop pulse emission.
  • the temperature measurement processing unit 130 has a wavelength selection unit 131, a first turret control unit 132, a first imaging control unit 133, a second turret control unit 134, a second imaging control unit 135, and a temperature derivation unit 136.
  • the functions of the wavelength selection unit 131, the first turret control unit 132, the first imaging control unit 133, the second turret control unit 134, the second imaging control unit 135, and the temperature derivation unit 136 are the same as in the first embodiment.
  • the temperature measurement processing unit 130 executes temperature measurement processing that calculates the temperature distribution of the subject based on the near-infrared light image obtained by causing the image sensor 15 to capture near-infrared light, and generates temperature information based on the temperature distribution of the subject.
  • the integrated display control unit 243 outputs an integrated image 251 that integrates the captured image obtained by the imaging processing unit 120 and the temperature information obtained by the temperature measurement processing unit 130, and displays the integrated image 251 on the display 76.
  • the integrated image 251 may be an image obtained by synthesizing the temperature information obtained by the temperature measurement processing unit 130 with the captured image obtained by the imaging processing unit 120, or may be an image in which the captured image obtained by the imaging processing unit 120 and the temperature information obtained by the temperature measurement processing unit 130 are displayed side by side.
  • the temperature information includes, for example, information indicating an area where the temperature is equal to or higher than a predetermined threshold, information indicating a specific numerical value of the temperature, information indicating a plurality of sections divided for each predetermined temperature range, or information indicating the temperature distribution in color tones corresponding to the temperature.
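One of the temperature-information forms listed above, the region at or above a threshold, and its use in a composite image can be sketched over nested lists standing in for images (the marker-overlay scheme is an assumption made for illustration):

```python
def temperature_information(temperature_map, threshold):
    """One form of temperature information described above: the set of
    pixel positions whose temperature is at or above the threshold."""
    return [
        (y, x)
        for y, row in enumerate(temperature_map)
        for x, t in enumerate(row)
        if t >= threshold
    ]

def composite_image(captured, temperature_map, threshold, marker=255):
    """Integrated image 251 as a composite: overwrite the hot region of
    the captured image with a marker value (a crude overlay sketch)."""
    hot = set(temperature_information(temperature_map, threshold))
    return [
        [marker if (y, x) in hot else v for x, v in enumerate(row)]
        for y, row in enumerate(captured)
    ]
```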
  • the captured image is an example of the "captured image" and the "first captured image" according to the technology of the present disclosure.
  • the temperature information is an example of the "temperature information" according to the technology of the present disclosure.
  • the integrated image 251 is an example of the "composite image" according to the technology of the present disclosure.
  • the fifth embodiment differs from the first embodiment in that the integrated display processing unit 230 executes integrated display processing.
  • the integrated display processing executed by the integrated display processing unit 230 according to the fifth embodiment will be described below with reference to FIG. 29 .
  • In step S71, the pulse light emission control unit 241 causes the light projector 14 to emit pulse light.
  • In step S72, the imaging processing unit 120 obtains a captured image by causing the image sensor 15 to capture visible light or near-infrared light.
  • In step S73, the pulse emission stop control unit 242 causes the light projector 14 to stop pulse emission.
  • In step S74, the temperature measurement processing unit 130 calculates the temperature distribution of the subject based on the near-infrared light image obtained by causing the image sensor 15 to capture near-infrared light, and generates temperature information based on the temperature distribution of the subject.
  • In step S75, the integrated display control unit 243 outputs the integrated image 251 obtained by integrating the captured image obtained by the imaging processing unit 120 and the temperature information obtained by the temperature measurement processing unit 130, and displays the integrated image 251 on the display 76.
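The cycle of steps S71 to S75 can be sketched with the hardware interactions injected as callables, which makes the emit/capture/stop/measure/integrate ordering explicit (all parameter names are assumptions):

```python
def integrated_display_cycle(projector, capture_visible, capture_thermal,
                             derive_temperature, integrate):
    """One cycle of the integrated display processing (steps S71-S75)."""
    projector.emit_pulse()                   # S71: start pulse light emission
    captured = capture_visible()             # S72: imaging during emission
    projector.stop_pulse()                   # S73: stop pulse emission
    nir_image = capture_thermal()            # S74: NIR capture while stopped
    temperature_info = derive_temperature(nir_image)
    return integrate(captured, temperature_info)   # S75: integrated image 251
```

Injecting the projector and capture functions keeps the ordering testable without any camera hardware.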
  • the control method of the camera 1 according to the fifth embodiment is an example of the "control method" according to the technology of the present disclosure.
  • the light projector 14 performs pulsed light emission, and the CPU 61 repeats, in time with the pulsed light emission, setting the imaging mode during the emission period and setting the temperature measurement mode during the emission stop period. Thereby, an integrated image can be obtained by integrating the captured image obtained in the imaging mode with the temperature information obtained in the temperature measurement mode.
  • the CPU 61 also outputs an integrated image 251 that integrates the captured image obtained by the imaging processing unit 120 and the temperature information obtained by the temperature measurement processing unit 130. Therefore, even without switching between the imaging mode and the temperature measurement mode, the display 76 displays the integrated image 251 in which temperature information is integrated with the captured image, allowing the user to visually grasp the relationship between the subject's condition and its temperature.
  • the CPU 61 has an imaging mode and a temperature measurement mode, but may have modes other than the imaging mode and the temperature measurement mode.
  • the CPU 61 sets the display control flag 151 and the light projection control flag 152 differently between the imaging mode and the temperature measurement mode, but control flags other than the display control flag 151 and the light projection control flag 152 may be varied.
  • the display control flag 151 includes a captured image display flag 151A for displaying the captured image 161A on the display 76, and a composite image display flag 151B for displaying on the display 76 a composite image 161B obtained by synthesizing temperature information with the captured image.
  • a display control flag 151 other than the captured image display flag 151A and the composite image display flag 151B may be set.
  • a temperature information display flag for displaying temperature information on the display 76 may be set instead of the composite image display flag 151B, and the temperature information may be displayed on the display 76.
  • a temperature information display flag is an example of a "temperature information display factor" according to the technology of the present disclosure.
  • In the above embodiments, image blur is corrected by moving the blur correction lens 34, but image blur may instead be corrected by moving the image sensor 15, which is used as an example of the "optical element" according to the technology of the present disclosure. Image blurring may also be corrected by an image processing technique based on a plurality of captured images.
  • In the above embodiments, two wavelength bands are selected from among the wavelength band from 950 nm to 1100 nm, the wavelength band from 1150 nm to 1350 nm, the wavelength band from 1500 nm to 1750 nm, and the wavelength band from 2000 nm to 2400 nm, but two wavelength bands may be selected from other wavelength bands.
  • In the above embodiments, near-infrared light is used in the temperature measurement by the two-color thermometry method, but light other than near-infrared light, such as visible light, may be used.
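For reference, the two-color (ratio) thermometry mentioned above can be sketched under the Wien approximation and a gray-body assumption: the ratio of signals in two wavelength bands determines the temperature independently of the (common) emissivity. The band-center wavelengths used below are assumptions for illustration, not values from the disclosure:

```python
from math import exp, log

C2 = 1.4388e-2  # second radiation constant [m*K]

def wien_radiance(wavelength, temperature):
    """Wien approximation to blackbody spectral radiance (unnormalized);
    wavelength in meters, temperature in kelvin."""
    return wavelength ** -5 * exp(-C2 / (wavelength * temperature))

def two_color_temperature(signal_1, signal_2, wavelength_1, wavelength_2):
    """Two-color (ratio) thermometry sketch: recover the temperature
    from the ratio of signals in two wavelength bands, assuming equal
    emissivity in both bands (gray-body assumption)."""
    ratio = signal_1 / signal_2
    return C2 * (1 / wavelength_1 - 1 / wavelength_2) / (
        5 * log(wavelength_2 / wavelength_1) - log(ratio)
    )
```

Because the emissivity cancels in the ratio, the method tolerates subjects whose emissivity is unknown but roughly equal in the two bands.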
  • In the above embodiments, the camera 1 is given as an example of an imaging device, but the technology of the present disclosure is not limited to this; it may be a digital camera built into a smart device, a wearable terminal, a cell observation device, an ophthalmologic observation device, a surgical microscope, or other various electronic equipment.
  • the functional configuration of the CPU 61 and the order of the processes executed by the CPU 61 are examples, and may be modified in various ways.
  • the imaging support processing may be executed by a computer 314 in an external device 312 communicably connected to the camera 1 via a network 310 such as LAN or WAN.
  • the computer 314 comprises a CPU 316, a storage 318, and a memory 320.
  • the storage 318 stores the imaging support processing program 100 .
  • the camera 1 requests execution of imaging support processing from the external device 312 via the network 310 .
  • the CPU 316 of the external device 312 reads the imaging support processing program 100 from the storage 318 and executes the imaging support processing program 100 on the memory 320 .
  • the CPU 316 performs imaging support processing according to the imaging support processing program 100 executed on the memory 320 .
  • the CPU 316 provides the camera 1 via the network 310 with the processing result obtained by executing the imaging support processing.
  • the camera 1 and the external device 312 may perform the imaging support processing in a distributed manner, or a plurality of devices including the camera 1 and the external device 312 may perform the imaging support processing in a distributed manner.
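As a rough illustration of the offloading arrangement described above (the camera requesting the external device to execute the imaging support processing over the network and receiving only the result), the following sketch models the exchange with plain objects. The class and method names are hypothetical, and a real implementation would use an actual network transport in place of the direct call.

```python
class ExternalDevice:
    """Stand-in for the external device 312: runs processing on request."""

    def run_imaging_support(self, frame):
        # Placeholder for the real imaging support processing;
        # here we just report the mean pixel value of the frame.
        return {"mean_level": sum(frame) / len(frame)}


class Camera:
    """Stand-in for the camera 1: delegates processing to a peer."""

    def __init__(self, network_peer):
        self.peer = network_peer  # models the connection via network 310

    def capture_and_support(self, frame):
        # The camera requests execution of the imaging support
        # processing and receives only the processing result back.
        return self.peer.run_imaging_support(frame)


camera = Camera(ExternalDevice())
result = camera.capture_and_support([10, 20, 30, 40])
print(result["mean_level"])  # → 25.0
```

The same interface also covers the distributed variant: several peers could each run part of the processing, with the camera merging their results.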
  • the camera 1 and the external device 312 are examples of the “imaging device” according to the technology of the present disclosure.
  • the NVM 62 stores the imaging support processing program 100, but the technique of the present disclosure is not limited to this.
  • the imaging support processing program 100 may be stored in a storage medium 330. The storage medium 330 is a non-transitory storage medium.
  • An example of the storage medium 330 includes any portable storage medium such as an SSD or USB memory.
  • the imaging support processing program 100 stored in the storage medium 330 is installed in the computer 60 .
  • the CPU 61 executes imaging support processing according to the imaging support processing program 100 .
  • the imaging support processing program 100 may be stored in a storage unit of another computer or server device connected to the computer 60 via a communication network (not shown), and the imaging support processing program 100 may be downloaded and installed on the computer 60 in response to a request from the camera 1.
  • the entire imaging support processing program 100 need not be stored in a storage unit such as a server device or in the NVM 62; a part of the imaging support processing program 100 may be stored there.
  • although FIG. 31 shows an example in which the computer 60 is built into the camera 1, the technology of the present disclosure is not limited to this; for example, the computer 60 may be provided outside the camera 1.
  • the CPU 61 is a single CPU, but it may be a plurality of CPUs. A GPU may also be applied instead of the CPU 61.
  • although the computer 60 is illustrated in the example described above, the technology of the present disclosure is not limited to this; for example, a device including an ASIC, an FPGA, and/or a PLD may be applied instead of the computer 60. A combination of a hardware configuration and a software configuration may also be used instead of the computer 60.
  • the various processors shown below can be used as hardware resources for executing the imaging support processing described in the first to fifth embodiments.
  • as a processor, for example, there is a CPU, which is a general-purpose processor that functions as a hardware resource executing the imaging support processing by executing software, that is, a program.
  • processors also include dedicated electric circuits such as FPGAs, PLDs, and ASICs, which are processors having a circuit configuration specially designed to execute specific processing.
  • a memory is built in or connected to each processor, and each processor uses the memory to execute imaging support processing.
  • the hardware resource that executes the imaging support processing may be configured with one of these various processors, or with a combination of two or more processors of the same or different types (for example, a combination of multiple FPGAs, or a combination of a CPU and an FPGA). The hardware resource that executes the imaging support processing may also be a single processor.
  • one processor is configured by combining one or more CPUs and software, and this processor functions as a hardware resource for executing imaging support processing.
  • as typified by an SoC, a processor that implements the functions of the entire system, including the plurality of hardware resources for executing the imaging support processing, with a single IC chip may also be used.
  • "A and/or B" is synonymous with "at least one of A and B". That is, "A and/or B" means that it may be only A, only B, or a combination of A and B. In this specification, when three or more items are expressed by connecting them with "and/or", the same idea as "A and/or B" is applied.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Studio Devices (AREA)

Abstract

This control device comprises: a processor; and a memory which is connected to or built into the processor. The processor is provided with: a first mode in which an image is captured on the basis of light received by an image sensor of an imaging device; and a second mode in which the temperature is derived on the basis of near-infrared light received by the image sensor. Different control factors are used in the first mode and in the second mode.

Description

Control device, imaging device, control method, and program
 The technology of the present disclosure relates to a control device, an imaging device, a control method, and a program.
 Japanese Patent Application Laid-Open No. 10-134272 discloses a monitoring system for large-space disaster prevention and the like, which detects fire outbreaks and intruders from thermal images of a large space and performs environmental control. The system comprises: an infrared camera for extracting thermal images of a specific temperature range by means of a temperature filter; temperature filter selection means having a plurality of temperature filters covering different temperature ranges, from room temperature to the temperature range required for fire monitoring, for selecting the temperature filter of the infrared camera from among the plurality of temperature filters; image data storage means for storing the thermal image data extracted and processed from the infrared camera; and control processing means for controlling the temperature filter selection means in accordance with preset switching timings of a plurality of monitoring modes to select the temperature filter of the infrared camera, extracting thermal image data from the infrared camera for each monitoring mode, and executing the processing of each monitoring mode.
 International Publication No. WO 2005/71372 discloses an image processing system comprising an image capturing unit configured to be capable of acquiring a spectral image, the image capturing unit having: a photographing optical system for forming a subject image; an image sensor section for capturing the subject image formed by the photographing optical system and outputting an image signal; and a photographing operation section for performing operations related to image capturing. The image processing system further comprises mode display means for displaying mode-related information corresponding to each of a plurality of modes that the image processing system can take.
 Japanese National Publication of International Patent Application No. 2004-534941 discloses a handheld infrared camera including a lens assembly supported by a housing, the housing being configured to hold an electrical energy source and processing means for recording and processing information received via the lens assembly, and being provided with user control means for controlling the device manually and visually. The housing has a substantially elongated shape, with the lens assembly mounted at one end and the opposite end formed as a user handle. On one side of the housing, a set of manual control means intended to be operated with the user's thumb and a visual control means are provided; the visual control means is arranged between the set of manual control means and the lens assembly and is adapted to be visible when the infrared camera is held away from the user's eyes and body, the infrared camera being intended to be operated with one hand.
 One embodiment according to the technology of the present disclosure provides a control device, an imaging device, a control method, and a program capable of differentiating the details of control on a controlled object between a first mode and a second mode.
 A first aspect of the technology of the present disclosure is a control device comprising: a processor; and a memory connected to or built into the processor, wherein the processor has a first mode of capturing an image based on light received by an image sensor of an imaging device and a second mode of deriving a temperature based on near-infrared light received by the image sensor, and a control factor differs between the first mode and the second mode.
 A second aspect of the technology of the present disclosure is the control device according to the first aspect, wherein the control factor includes a display control factor for displaying on a display.
 A third aspect of the technology of the present disclosure is the control device according to the second aspect, wherein, in the first mode, the processor sets, as the display control factor, a captured image display factor for causing the display to display a captured image obtained by the light being received by the image sensor, and, in the second mode, the processor sets, as the display control factor, a temperature information display factor for causing the display to display temperature information indicating the temperature.
 A fourth aspect of the technology of the present disclosure is the control device according to any one of the first to third aspects, wherein the control factor includes a light projection control factor for operating a light projector, and, in the second mode, the processor sets, as the light projection control factor, a light projection suppression control factor that suppresses light projection by the light projector.
 A fifth aspect of the technology of the present disclosure is the control device according to any one of the first to fourth aspects, wherein the control factor includes an imaging setting factor related to imaging settings.
 A sixth aspect of the technology of the present disclosure is the control device according to the fifth aspect, wherein the imaging settings include at least one of a setting related to the light projector, a setting related to shutter speed, a setting related to the aperture, a setting related to photometry, a setting related to the sensitivity of the image sensor, a setting related to high dynamic range, and a setting related to anti-vibration control.
 A seventh aspect of the technology of the present disclosure is the control device according to any one of the first to sixth aspects, wherein the control factor includes an image processing setting factor related to image processing settings.
 An eighth aspect of the technology of the present disclosure is the control device according to the seventh aspect, wherein the image processing settings include at least one of a setting related to noise reduction, a setting related to sharpness, a setting related to contrast, and a setting related to tone.
 A ninth aspect of the technology of the present disclosure is a control device comprising: a processor; and a memory connected to or built into the processor, wherein the processor has a first mode of capturing an image based on light received by an image sensor of an imaging device and a second mode of deriving a temperature based on near-infrared light received by the image sensor, and the processor sets the first mode when the state of a light projector is on and sets the second mode when the state of the light projector is off.
 A tenth aspect of the technology of the present disclosure is the control device according to the ninth aspect, wherein, in the second mode, the processor switches from the second mode to the first mode in accordance with the temperature.
 An eleventh aspect of the technology of the present disclosure is the control device according to the tenth aspect, wherein the light projector performs pulsed light emission, and the processor repeats, in accordance with the emission timing of the pulsed light emission, an operation of setting the first mode during an emission period of the pulsed light emission and setting the second mode during an emission stop period of the pulsed light emission.
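As an illustrative sketch (not part of the disclosure), the alternation described in the eleventh aspect, where the first mode is set while the projector's pulse is emitting and the second mode while emission is stopped, can be expressed as a schedule derived from the pulse timing. The 50 ms emission and stop periods below are assumed values, not timings from the disclosure.

```python
def mode_for_time(t_ms, emit_ms=50, stop_ms=50):
    """Return the mode at time t_ms for a projector that pulses:
    emitting for emit_ms, then stopped for stop_ms, repeating.

    First mode (imaging) during the emission period; second mode
    (temperature derivation) during the emission stop period.
    """
    phase = t_ms % (emit_ms + stop_ms)
    return "first" if phase < emit_ms else "second"

# One full pulse cycle, sampled every 25 ms:
print([mode_for_time(t) for t in range(0, 100, 25)])
# → ['first', 'first', 'second', 'second']
```

In a real device the schedule would be driven by the projector's actual emission trigger rather than a free-running clock, so the modes stay phase-locked to the pulses.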
 A twelfth aspect of the technology of the present disclosure is the control device according to any one of the first to eleventh aspects, wherein the processor outputs a composite image obtained by combining a first captured image, obtained by the light being received by the image sensor, with temperature information indicating the temperature.
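As an illustrative sketch of the twelfth aspect (outputting a composite of the first captured image and temperature information), the following combines a captured frame with a minimal temperature overlay, here just the location and value of the hottest measured point. The data layout and function name are assumptions for illustration; a real implementation would render the overlay into the displayed image.

```python
def composite(image, temps):
    """Combine a captured image with temperature information.

    image: 2-D list of pixel values (the first captured image)
    temps: 2-D list of measured temperatures, same shape
    Returns the image plus an overlay record marking the hottest spot.
    """
    hottest = max(
        ((r, c) for r in range(len(temps)) for c in range(len(temps[0]))),
        key=lambda rc: temps[rc[0]][rc[1]],
    )
    return {
        "image": image,
        "overlay": {"hot_spot": hottest, "temp": temps[hottest[0]][hottest[1]]},
    }

img = [[10, 12], [11, 13]]
tmp = [[300.0, 450.5], [310.0, 295.0]]
out = composite(img, tmp)
print(out["overlay"])  # → {'hot_spot': (0, 1), 'temp': 450.5}
```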
 A thirteenth aspect of the technology of the present disclosure is an imaging device comprising the control device according to any one of the first to twelfth aspects, and an image sensor.
 A fourteenth aspect of the technology of the present disclosure is a control method comprising: switching between a first mode of capturing an image based on light received by an image sensor of an imaging device and a second mode of deriving a temperature based on near-infrared light received by the image sensor; and making a control factor differ between the first mode and the second mode.
 A fifteenth aspect of the technology of the present disclosure is a control method comprising: switching between a first mode of capturing an image based on light received by an image sensor of an imaging device and a second mode of deriving a temperature based on near-infrared light received by the image sensor; and setting the first mode when the operation of a light projector is on and setting the second mode when the operation of the light projector is off.
 A sixteenth aspect of the technology of the present disclosure is a program for causing a computer to execute processing comprising: switching between a first mode of capturing an image based on light received by an image sensor of an imaging device and a second mode of deriving a temperature based on near-infrared light received by the image sensor; and making a control factor differ between the first mode and the second mode.
 A seventeenth aspect of the technology of the present disclosure is a program for causing a computer to execute processing comprising: switching between a first mode of capturing an image based on light received by an image sensor of an imaging device and a second mode of deriving a temperature based on near-infrared light received by the image sensor; and setting the first mode when the operation of a light projector is on and setting the second mode when the operation of the light projector is off.
FIG. 1 is a perspective view showing an example of a camera according to a first embodiment.
FIG. 2 is a block diagram showing an example of the internal configuration of the camera according to the first embodiment.
FIG. 3 is a block diagram showing an example of the electrical configuration of the camera according to the first embodiment.
FIG. 4 is an explanatory diagram showing an example of the configuration and operation of a turret filter according to the first embodiment.
FIG. 5 is a block diagram showing an example of the functional configuration of a CPU according to the first embodiment.
FIG. 6 is a block diagram showing an example of the configuration of the CPU as a mode switching processing unit according to the first embodiment.
FIG. 7 is a block diagram showing an example of the configuration of the CPU as an imaging processing unit according to the first embodiment.
FIG. 8 is a front view showing an example of a captured image obtained by imaging processing according to the first embodiment.
FIG. 9 is a block diagram showing an example of the configuration of the CPU as a temperature measurement processing unit according to the first embodiment.
FIG. 10 is an explanatory diagram showing an example of the function of the CPU as a wavelength selector according to the first embodiment.
FIG. 11 is a front view showing a first example of a composite image obtained by temperature measurement processing according to the first embodiment.
FIG. 12 is a front view showing a second example of a composite image obtained by the temperature measurement processing according to the first embodiment.
FIG. 13 is a front view showing a third example of a composite image obtained by the temperature measurement processing according to the first embodiment.
FIG. 14 is a front view showing a fourth example of a composite image obtained by the temperature measurement processing according to the first embodiment.
FIG. 15 is a flowchart showing an example of the flow of mode switching processing according to the first embodiment.
FIG. 16 is a flowchart showing an example of the flow of the imaging processing according to the first embodiment.
FIG. 17 is a flowchart showing an example of the flow of the temperature measurement processing according to the first embodiment.
FIG. 18 is a front view showing a modified example of the captured image obtained by the imaging processing according to the first embodiment.
FIG. 19 is a front view showing a modified example of the composite image obtained by the temperature measurement processing according to the first embodiment.
FIG. 20 is a block diagram showing an example of the configuration of a CPU as a mode switching processing unit according to a second embodiment.
FIG. 21 is a flowchart showing an example of the flow of mode switching processing according to the second embodiment.
FIG. 22 is a block diagram showing an example of the configuration of a CPU as a parameter change processing unit according to a third embodiment.
FIG. 23 is a block diagram showing an example of the configuration of the CPU as an imaging processing unit according to the third embodiment.
FIG. 24 is a block diagram showing an example of the configuration of the CPU as a temperature measurement processing unit according to the third embodiment.
FIG. 25 is a flowchart showing an example of the flow of parameter change processing according to the third embodiment.
FIG. 26 is a block diagram showing an example of the configuration of a CPU as a mode switching processing unit according to a fourth embodiment.
FIG. 27 is a flowchart showing an example of the flow of mode switching processing according to the fourth embodiment.
FIG. 28 is a block diagram showing an example of the configuration of a CPU as an integrated display processing unit according to a fifth embodiment.
FIG. 29 is a flowchart showing an example of the flow of integrated display processing according to the fifth embodiment.
FIG. 30 is a block diagram showing an example of the electrical configuration of an imaging device according to a first modified example.
FIG. 31 is a block diagram showing an example of the electrical configuration of an imaging device according to a second modified example.
 An example of embodiments of a control device, an imaging device, a control method, and a program according to the technology of the present disclosure will be described below with reference to the accompanying drawings.
 First, the terms used in the following description will be explained.
 CPU is an abbreviation for "Central Processing Unit". GPU is an abbreviation for "Graphics Processing Unit". NVM is an abbreviation for "Non-Volatile Memory". RAM is an abbreviation for "Random Access Memory". IC is an abbreviation for "Integrated Circuit". ASIC is an abbreviation for "Application Specific Integrated Circuit". PLD is an abbreviation for "Programmable Logic Device". FPGA is an abbreviation for "Field-Programmable Gate Array". SoC is an abbreviation for "System-on-a-Chip". SSD is an abbreviation for "Solid State Drive". HDD is an abbreviation for "Hard Disk Drive". EEPROM is an abbreviation for "Electrically Erasable and Programmable Read Only Memory". SRAM is an abbreviation for "Static Random Access Memory". I/F is an abbreviation for "Interface". USB is an abbreviation for "Universal Serial Bus". CMOS is an abbreviation for "Complementary Metal Oxide Semiconductor". CCD is an abbreviation for "Charge Coupled Device". LAN is an abbreviation for "Local Area Network". WAN is an abbreviation for "Wide Area Network". BPF is an abbreviation for "Band Pass Filter". Ir is an abbreviation for "Infrared Rays". EL is an abbreviation for "Electro Luminescence".
 In the description of this specification, "perpendicular" means not only perfectly perpendicular but also perpendicular within an error generally tolerated in the technical field to which the technology of the present disclosure belongs, to an extent that does not go against the spirit of the technology of the present disclosure. Likewise, "horizontal", "parallel", "orthogonal", "match", and "equal intervals" each mean not only the perfect state but also a state that includes such a generally tolerated error.
[First embodiment]
 First, the first embodiment will be described. As shown in FIG. 1 as an example, the camera 1 includes a camera body 10 and a lens unit 20. The camera 1 has a function of obtaining a visible light image by capturing visible light, a function of obtaining a near-infrared light image by capturing near-infrared light, and a function of measuring the temperature of a subject based on electromagnetic waves emitted from the subject by thermal radiation. The camera 1 is an example of an "imaging device" according to the technology of the present disclosure.
 A camera-side mount 12 for attaching the lens unit 20 is provided on the front surface 11 of the camera body 10. The front surface 11 of the camera body 10 is also provided with an irradiation window 13 for irradiating a subject with illumination light IL.
 The camera body 10 includes a light projector 14 that generates the illumination light IL. The light projector 14 is, for example, an LED that emits near-infrared light with a peak wavelength of 1550 nm as the illumination light IL. The light projector 14 may be a halogen light. The illumination light IL generated by the light projector 14 passes through the irradiation window 13 and is emitted forward of the camera body 10. The light projector 14 is an example of a "light projector" according to the technology of the present disclosure.
 The camera body 10 also includes an image sensor 15. The image sensor 15 captures the light L incident from the subject through the lens unit 20. The image sensor 15 has a light receiving surface 15A. The light L incident on the lens unit 20 is imaged on the light receiving surface 15A by the lens unit 20. An image is obtained by the light L being imaged on the light receiving surface 15A. A plurality of photodiodes are arranged in a matrix on the light receiving surface 15A.
 As an example, the plurality of photodiodes includes a plurality of silicon photodiodes sensitive to visible light and a plurality of indium gallium arsenide photodiodes sensitive to near-infrared light. Hereinafter, the silicon photodiodes are referred to as Si diodes, and the indium gallium arsenide photodiodes are referred to as InGaAs diodes. The plurality of Si diodes generate and output analog image data according to the received visible light. The plurality of InGaAs diodes generate and output analog image data according to the received near-infrared light. Hereinafter, visible light and near-infrared light incident on the image sensor 15 are referred to simply as light unless they need to be distinguished.
 第1実施形態では、イメージセンサ15としてCMOSイメージセンサを例示しているが、本開示の技術はこれに限定されず、例えば、イメージセンサ15がCCDイメージセンサ等の他種類のイメージセンサであっても本開示の技術は成立する。イメージセンサ15は、本開示の技術に係る「撮像装置のイメージセンサ」の一例である。 In the first embodiment, a CMOS image sensor is exemplified as the image sensor 15, but the technology of the present disclosure is not limited to this. The technology of the present disclosure is also established. The image sensor 15 is an example of an “imaging device image sensor” according to the technology of the present disclosure.
 レンズユニット20は、鏡筒21と、レンズ側マウント22とを備える。レンズ側マウント22は、鏡筒21の後端部に設けられている。レンズ側マウント22は、カメラ本体10のカメラ側マウント12に接続可能に構成されている。レンズユニット20は、レンズ側マウント22によってカメラ本体10に着脱可能に装着される。なお、レンズユニット20は、カメラ本体10に着脱不能に固定されていてもよい。 The lens unit 20 includes a lens barrel 21 and a lens side mount 22. The lens side mount 22 is provided at the rear end of the lens barrel 21 . The lens side mount 22 is configured to be connectable to the camera side mount 12 of the camera body 10 . The lens unit 20 is detachably attached to the camera body 10 by a lens side mount 22 . Note that the lens unit 20 may be fixed to the camera body 10 in a non-detachable manner.
As shown by way of example in FIG. 2, the lens unit 20 includes an objective lens 30, a focus lens 31, a zoom lens 32, a diaphragm 33, a blur-correction lens 34, a turret filter 35, and an adjustment lens 37, arranged in this order from the subject side to the image side along the optical axis OA of the lens unit 20.
The objective lens 30 is fixed to the front end of the lens barrel 21 and condenses light. The focus lens 31 is a lens for adjusting the focus position of the image, and the zoom lens 32 is a lens for adjusting the zoom magnification.
The diaphragm 33 is an optical element for adjusting the amount of light and has an aperture 33A through which the light guided by the zoom lens 32 passes. The diaphragm 33 is a movable diaphragm whose aperture 33A has a variable diameter, so the amount of light guided by the zoom lens 32 is changed by the diaphragm 33. The blur-correction lens 34 is a lens for correcting image blur.
The turret filter 35 has a plurality of optical filters. By switching which of the optical filters is inserted into the optical path of the light within the lens unit 20, the turret filter 35 selectively transmits light in a plurality of wavelength bands (for example, visible light, and near-infrared light in different wavelength bands within the near-infrared range). The optical path of the light within the lens unit 20 lies, for example, on the optical axis OA; hereinafter it is referred to simply as the optical path. The configuration of the turret filter 35 is described in detail later with reference to FIG. 4.
The adjustment lens 37 is a lens for compensating for the difference in focal length that arises when the optical filters of the turret filter 35 are switched.
The order of the focus lens 31, zoom lens 32, diaphragm 33, blur-correction lens 34, turret filter 35, and adjustment lens 37 may differ from the above. Each of the objective lens 30, focus lens 31, zoom lens 32, blur-correction lens 34, and adjustment lens 37 may be a single lens or a lens group having a plurality of lenses. The lens unit 20 may also include other lenses in addition to the focus lens 31, zoom lens 32, blur-correction lens 34, and adjustment lens 37, and may further include optical elements such as a half mirror or a polarizing element.
As shown by way of example in FIG. 2, the lens unit 20 includes a zoom drive mechanism 42, a diaphragm drive mechanism 43, a blur-correction drive mechanism 44, a turret drive mechanism 45, and an adjustment drive mechanism 47, all of which are electrically connected to an electrical contact 38 provided at the rear end of the lens barrel 21.
The camera body 10 includes a control circuit 50 that is electrically connected to an electrical contact 58 provided on the camera-side mount 12. When the lens-side mount 22 is connected to the camera-side mount 12 and the lens unit 20 is attached to the camera body 10, the electrical contact 38 connects to the electrical contact 58, and the control circuit 50 is electrically connected to the zoom drive mechanism 42, the diaphragm drive mechanism 43, the blur-correction drive mechanism 44, the turret drive mechanism 45, and the adjustment drive mechanism 47.
The zoom drive mechanism 42, diaphragm drive mechanism 43, blur-correction drive mechanism 44, turret drive mechanism 45, and adjustment drive mechanism 47 are all drive mechanisms that include an actuator such as a motor.
As shown by way of example in FIG. 3, the control circuit 50 includes a computer 60, a zoom drive circuit 52, a diaphragm drive circuit 53, a blur-correction drive circuit 54, a turret drive circuit 55, and an adjustment drive circuit 57. The zoom drive circuit 52, diaphragm drive circuit 53, blur-correction drive circuit 54, turret drive circuit 55, and adjustment drive circuit 57 are connected to the computer 60 via an input/output I/F 59.
The computer 60 includes a CPU 61, an NVM 62, and a RAM 63. The CPU 61, NVM 62, and RAM 63 are interconnected via a bus 64, and the bus 64 is connected to the input/output I/F 59.
The NVM 62 is a non-transitory storage medium that stores various parameters and programs. For example, the NVM 62 is an EEPROM; however, this is merely an example, and an HDD and/or an SSD may be used as the NVM 62 instead of, or together with, the EEPROM. The RAM 63 temporarily stores various information and is used as a work memory. The CPU 61 reads a necessary program from the NVM 62, executes the read program on the RAM 63, and controls the entire camera 1 in accordance with the program executed on the RAM 63.
The CPU 61 is an example of a "processor" according to the technology of the present disclosure, and the RAM 63 is an example of a "memory" according to the technology of the present disclosure. The computer 60 is an example of a "control device" according to the technology of the present disclosure.
The zoom drive circuit 52 drives the zoom drive mechanism 42 in accordance with instructions from the computer 60, thereby adjusting the positions of the focus lens 31 and the zoom lens 32. The focus lens 31 and the zoom lens 32 move along the optical axis OA of the lens unit 20 when power is applied by the zoom drive mechanism 42.
The diaphragm drive circuit 53 drives the diaphragm drive mechanism 43 in accordance with instructions from the computer 60, thereby changing the diameter of the aperture 33A (see FIG. 2) of the diaphragm 33.
The blur-correction drive circuit 54 drives the blur-correction drive mechanism 44 in accordance with instructions from the computer 60 and a feedback signal output from a feedback circuit 75 (described later), thereby adjusting the position of the blur-correction lens 34. The blur-correction lens 34 moves in a plane perpendicular to the optical axis OA of the lens unit 20 when power is applied by the blur-correction drive mechanism 44; specifically, it moves in the direction that corrects the blur of the image obtained when light is focused onto the image sensor 15.
The turret drive circuit 55 drives the turret drive mechanism 45 in accordance with instructions from the computer 60, thereby adjusting the rotational position of the turret filter 35. The turret filter 35 rotates in a plane perpendicular to the optical axis OA of the lens unit 20 when power is applied by the turret drive mechanism 45. The rotation of the turret filter 35 is described in detail later with reference to FIG. 4.
The adjustment drive circuit 57 drives the adjustment drive mechanism 47 in accordance with instructions from the computer 60, thereby adjusting the position of the adjustment lens 37. The adjustment lens 37 moves along the optical axis OA of the lens unit 20 when power is applied by the adjustment drive mechanism 47.
As shown by way of example in FIG. 3, the camera body 10 includes an image sensor driver 71, a signal processing circuit 72, a light projection control circuit 73, a vibration sensor 74, a feedback circuit 75, a display 76, a display control circuit 77, an input device 78, an input circuit 79, and an external I/F 80. The image sensor driver 71, signal processing circuit 72, light projection control circuit 73, feedback circuit 75, display control circuit 77, input circuit 79, and external I/F 80 are connected to the computer 60 via the input/output I/F 59.
The image sensor driver 71 causes the image sensor 15 to capture light in accordance with instructions from the computer 60. The signal processing circuit 72 performs various kinds of signal processing on the analog image data output from the image sensor 15, thereby generating and outputting digital image data.
The light projection control circuit 73 switches the light projector 14 on and off in accordance with instructions from the computer 60. The light projector 14 outputs illumination light when switched on and stops outputting illumination light when switched off.
The vibration sensor 74 is, for example, a gyro sensor and detects vibration of the camera 1. The gyro sensor included in the vibration sensor 74 detects vibration of the camera 1 about each of the pitch axis and the yaw axis. The vibration sensor 74 converts the detected vibration about the pitch axis and about the yaw axis into vibration in a two-dimensional plane parallel to the pitch axis and the yaw axis, thereby detecting vibration acting on the camera 1 in the pitch-axis direction and in the yaw-axis direction. The vibration sensor 74 outputs a vibration detection signal corresponding to the detected vibration.
The vibration sensor 74 may instead be an acceleration sensor. In place of the vibration sensor 74, a motion vector obtained by comparing captured images that are consecutive in time series and stored in the NVM 62 and/or the RAM 63 may, for example, be used as the vibration. The vibration that is ultimately used may also be derived from both the vibration detected by a physical sensor and the motion vector obtained by image processing.
The feedback circuit 75 generates a feedback signal by performing various kinds of signal processing on the vibration detection signal output from the vibration sensor 74. The feedback circuit 75 is connected to the blur-correction drive circuit 54 via the input/output I/F 59 and outputs the feedback signal to the blur-correction drive circuit 54 in accordance with instructions from the computer 60.
The display 76 is, for example, a liquid crystal display or an EL display, and displays images and/or character information. The display control circuit 77 causes the display 76 to display an image in accordance with instructions from the computer 60.
The input device 78 is, for example, a touch panel and/or switches, and receives instructions given by the user. The input circuit 79 outputs an input signal corresponding to an instruction given to the input device 78 by the user. The external I/F 80 is an interface communicably connected to an external device.
As shown by way of example in FIG. 4, the turret filter 35 includes a disc 81. On the disc 81, an Ir cut filter 82, a first BPF 83A, a second BPF 83B, a third BPF 83C, and a fourth BPF 83D are provided as a plurality of optical filters at equal intervals along the circumferential direction of the disc 81. Hereinafter, the Ir cut filter 82, first BPF 83A, second BPF 83B, third BPF 83C, and fourth BPF 83D are referred to as optical filters, and the first BPF 83A, second BPF 83B, third BPF 83C, and fourth BPF 83D are referred to as BPFs 83, unless they need to be distinguished.
The turret filter 35 selectively inserts the optical filters into, and removes them from, the optical path in a turret fashion. Specifically, by rotating the turret filter 35 in the direction of the arc arrow R shown in FIG. 4, the Ir cut filter 82, first BPF 83A, second BPF 83B, third BPF 83C, and fourth BPF 83D are selectively inserted into and removed from the optical path (as an example, the optical path on the optical axis OA). When an optical filter is inserted into the optical path, the optical axis OA passes through the center of that optical filter, and the center of the inserted optical filter coincides with the center of the light-receiving surface of the image sensor 15.
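The positioning described above can be sketched with a few lines of code. The sketch below is an illustrative assumption, not the patent's implementation: it assumes the five filters are indexed in circumferential order, so that each filter sits one 72-degree step (360/5) from the next, and the function name is invented for illustration.

```python
# Hypothetical sketch of turret positioning: five optical filters equally
# spaced around the disc, so adjacent filters are 360/5 = 72 degrees apart.
FILTERS = ["Ir-cut", "BPF1", "BPF2", "BPF3", "BPF4"]
STEP_DEG = 360.0 / len(FILTERS)

def rotation_for(filter_name):
    """Rotation angle (degrees) that places the named filter on the optical path."""
    return FILTERS.index(filter_name) * STEP_DEG

print(rotation_for("BPF3"))  # 216.0
```

A real turret drive would convert this angle into motor steps and home the disc against a reference position before moving.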
The Ir cut filter 82 is an optical filter that cuts infrared light and transmits only light other than infrared light. Each BPF 83 is an optical filter that transmits near-infrared light, and the first BPF 83A, second BPF 83B, third BPF 83C, and fourth BPF 83D transmit near-infrared light in mutually different wavelength bands.
The first BPF 83A is an optical filter corresponding to a wavelength band around 1000 nm (nanometers). As an example, the first BPF 83A transmits only near-infrared light in the wavelength band from 950 nm to 1100 nm. Hereinafter, the near-infrared light transmitted through the first BPF 83A is referred to as first near-infrared light.
The second BPF 83B is an optical filter corresponding to a wavelength band around 1250 nm. As an example, the second BPF 83B transmits only near-infrared light in the wavelength band from 1150 nm to 1350 nm. Hereinafter, the near-infrared light transmitted through the second BPF 83B is referred to as second near-infrared light.
The third BPF 83C is an optical filter corresponding to a wavelength band around 1550 nm. As an example, the third BPF 83C transmits only near-infrared light in the wavelength band from 1500 nm to 1750 nm. Hereinafter, the near-infrared light transmitted through the third BPF 83C is referred to as third near-infrared light.
The fourth BPF 83D is an optical filter corresponding to a wavelength band around 2150 nm. As an example, the fourth BPF 83D transmits only near-infrared light in the wavelength band from 2000 nm to 2400 nm. Hereinafter, the near-infrared light transmitted through the fourth BPF 83D is referred to as fourth near-infrared light.
Hereinafter, the first near-infrared light, second near-infrared light, third near-infrared light, and fourth near-infrared light are referred to as near-infrared light unless they need to be distinguished. Each band mentioned here includes errors that are generally tolerated in the technical field to which the technology of the present disclosure belongs and that do not depart from the gist of the technology of the present disclosure. The wavelength bands listed here are merely examples; any mutually different wavelength bands may be used.
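The four pass bands above can be collected in a small lookup table. The helper below is a sketch only; the table values come from the example bands in the text, while the function and dictionary names are invented for illustration.

```python
# Example pass bands of the four band-pass filters, in nanometres,
# taken from the values given in the text.
PASS_BANDS = {
    "BPF1": (950, 1100),   # first near-infrared light, ~1000 nm
    "BPF2": (1150, 1350),  # second near-infrared light, ~1250 nm
    "BPF3": (1500, 1750),  # third near-infrared light, ~1550 nm
    "BPF4": (2000, 2400),  # fourth near-infrared light, ~2150 nm
}

def filter_for(wavelength_nm):
    """Return the name of the BPF that transmits the wavelength, or None."""
    for name, (lo, hi) in PASS_BANDS.items():
        if lo <= wavelength_nm <= hi:
            return name
    return None

print(filter_for(1550))  # BPF3 -- the band containing the 1550 nm illumination
```

Note that the bands do not overlap, so at most one filter matches any given wavelength.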
When the Ir cut filter 82 is inserted into the optical path and the visible light transmitted through the Ir cut filter 82 is focused onto the light-receiving surface of the image sensor 15, the Si diodes arranged on the light-receiving surface output analog image data obtained by imaging the received visible light. This realizes the function of obtaining a visible-light image by imaging visible light. Likewise, when a BPF 83 is inserted into the optical path and the near-infrared light transmitted through the BPF 83 is focused onto the light-receiving surface of the image sensor 15, the InGaAs diodes arranged on the light-receiving surface output analog image data obtained by imaging the received near-infrared light. This realizes the function of obtaining a near-infrared-light image by imaging near-infrared light.
As described above, the camera 1 has a function of obtaining a visible-light image by imaging visible light and a function of obtaining a near-infrared-light image by imaging near-infrared light. In addition to these functions, the camera 1 has a function of measuring the temperature of a subject based on the electromagnetic waves emitted from the subject by thermal radiation. In the first embodiment, two-color (ratio) thermometry is used to improve measurement accuracy. Realizing temperature measurement by two-color thermometry requires, for example, imaging light in two different wavelength bands. In the first embodiment, the turret filter 35 is used as one means of obtaining light in two different wavelength bands, and as an example, near-infrared light in two of the wavelength bands is used for temperature measurement.
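As a rough illustration of why two bands suffice, the sketch below inverts the Wien approximation of Planck's law for a gray body: the (unknown) emissivity cancels out of the two-band radiance ratio, leaving temperature as the only unknown. The function names and the example bands (chosen near the first and third BPF centers) are illustrative assumptions, not values or code from the patent.

```python
import math

C2 = 1.4388e-2  # second radiation constant, m*K

def wien_radiance_ratio(temp_k, lam1, lam2):
    """Radiance ratio L(lam1)/L(lam2) for a gray body (Wien approximation)."""
    return (lam2 / lam1) ** 5 * math.exp(-(C2 / temp_k) * (1.0 / lam1 - 1.0 / lam2))

def two_color_temperature(ratio, lam1, lam2):
    """Invert the Wien ratio to recover temperature (two-color / ratio pyrometry).

    Assumes equal emissivity at both wavelengths, so emissivity cancels out
    of the ratio -- the main advantage of the two-color method.
    """
    return C2 * (1.0 / lam1 - 1.0 / lam2) / (5.0 * math.log(lam2 / lam1) - math.log(ratio))

# Example: wavelengths near the centers of the first (~1000 nm) and
# third (~1550 nm) band-pass filters.
lam1, lam2 = 1000e-9, 1550e-9
r = wien_radiance_ratio(1200.0, lam1, lam2)
print(round(two_color_temperature(r, lam1, lam2), 1))  # recovers 1200.0 (K)
```

Applied per pixel to a pair of near-infrared images taken through two different BPFs, this kind of inversion yields a temperature distribution over the subject.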
As shown by way of example in FIG. 5, the imaging support processing is realized by the CPU 61 executing an imaging support processing program 100. The imaging support processing program 100 is an example of a "program" according to the technology of the present disclosure. In the example shown in FIG. 5, the imaging support processing program 100 is stored in the NVM 62, and the CPU 61 reads it from the NVM 62 and executes it on the RAM 63.
The CPU 61 performs the imaging support processing in accordance with the imaging support processing program 100 executed on the RAM 63. By executing the imaging support processing program 100 on the RAM 63, the CPU 61 functions as a mode switching processing unit 110, an imaging processing unit 120, and a temperature measurement processing unit 130.
The CPU 61 has an imaging mode and a temperature measurement mode as operation modes and switches between them. The imaging processing unit 120 is a processing unit that operates when the operation mode of the CPU 61 is switched to the imaging mode; it executes imaging processing that causes the display 76 to display a visible-light image obtained by the image sensor 15 imaging visible light, or a near-infrared-light image obtained by the image sensor 15 imaging near-infrared light.
The temperature measurement processing unit 130 is a processing unit that operates when the operation mode of the CPU 61 is switched to the temperature measurement mode; it executes temperature measurement processing that calculates the temperature distribution of the subject based on a near-infrared-light image obtained by the image sensor 15 imaging near-infrared light, and causes the display 76 to display temperature information created based on that temperature distribution.
The mode switching processing unit 110 is a processing unit that executes mode switching processing for switching the operation mode of the CPU 61 between the imaging mode and the temperature measurement mode. The mode switching processing unit 110, the imaging processing unit 120, and the temperature measurement processing unit 130 are described in order below.
The mode switching processing unit 110 has a mode selection information acquisition unit 111, a mode determination unit 112, a flag setting unit 113, a light projection control unit 114, and a mode setting unit 115.
As shown by way of example in FIG. 6, the mode selection information acquisition unit 111 selectively acquires, for example via a reception device, imaging mode selection information that selects the imaging mode as the operation mode and temperature measurement mode selection information that selects the temperature measurement mode as the operation mode. The reception device receives various kinds of information and outputs the received information to the CPU 61; examples of the reception device include the input circuit 79 and the external I/F 80. In the following, for convenience of explanation, the imaging mode selection information and the temperature measurement mode selection information are referred to as mode selection information unless they need to be distinguished.
Various instructions from the user are input to the input device 78. The input circuit 79 outputs information corresponding to the instruction input to the input device 78 to the CPU 61; in particular, it outputs mode selection information to the CPU 61 in response to a mode selection instruction given to the input device 78 by the user, and the mode selection information acquisition unit 111 acquires the mode selection information input from the input circuit 79. The external I/F 80 receives mode selection information output from an external device (not shown) and outputs the received mode selection information to the CPU 61; the mode selection information acquisition unit 111 likewise acquires the mode selection information input from the external I/F 80.
The mode determination unit 112 determines whether the operation mode selected by the mode selection information acquired by the mode selection information acquisition unit 111 is the imaging mode or the temperature measurement mode. When the acquired mode selection information is the imaging mode selection information, the mode determination unit 112 determines that the selected operation mode is the imaging mode; when it is the temperature measurement mode selection information, the mode determination unit 112 determines that the selected operation mode is the temperature measurement mode.
The RAM 63 is provided with a display control flag storage area 141 and a light projection control flag storage area 142. The display control flag storage area 141 stores a display control flag 151 that designates the image to be displayed on the display 76, and the light projection control flag storage area 142 stores a light projection control flag 152 that designates the operation of the light projector 14.
 フラグ設定部113は、モード判定部112によって動作モードが撮像モードであると判定された場合には、表示制御フラグ記憶領域141に表示制御フラグ151として撮像画像表示フラグ151Aを設定し、かつ、投光制御フラグ記憶領域142に投光制御フラグ152として投光オン制御フラグ152Aを設定する。また、フラグ設定部113は、モード判定部112によって動作モードが温度測定モードであると判定された場合には、表示制御フラグ記憶領域141に表示制御フラグ151として合成画像表示フラグ151Bを設定し、投光制御フラグ記憶領域142に投光制御フラグ152として投光オフ制御フラグ152Bを設定する。以下では、特に区別して説明する必要がない場合、表示制御フラグ151及び投光制御フラグ152を制御フラグと称する。 When the mode determination unit 112 determines that the operation mode is the imaging mode, the flag setting unit 113 sets the captured image display flag 151A as the display control flag 151 in the display control flag storage area 141 and A light projection ON control flag 152 A is set as the light projection control flag 152 in the light control flag storage area 142 . Further, when the mode determination unit 112 determines that the operation mode is the temperature measurement mode, the flag setting unit 113 sets the composite image display flag 151B as the display control flag 151 in the display control flag storage area 141, A light projection OFF control flag 152 B is set as the light projection control flag 152 in the light projection control flag storage area 142 . Hereinafter, the display control flag 151 and the light projection control flag 152 will be referred to as control flags unless they need to be distinguished and described.
 投光制御部114は、フラグ設定部113によって投光オン制御フラグ152Aが設定された場合に、投光制御回路73にオン指令を出力する。オン指令は、投光器14をオンに切り替える指令である。また、投光制御部114は、フラグ設定部113によって投光オフ制御フラグ152Bが設定された場合に、投光制御回路73にオフ指令を出力する。オフ指令は、投光器14をオフに切り替える指令である。なお、オンとは、投光器14が投光を行う設定を指し、オフとは、投光器14が投光を行わない設定を指す。 The light projection control unit 114 outputs an ON command to the light projection control circuit 73 when the light projection ON control flag 152A is set by the flag setting unit 113 . The ON command is a command to switch the projector 14 ON. Further, when the flag setting unit 113 sets the light projection OFF control flag 152B, the light projection control unit 114 outputs an OFF command to the light projection control circuit 73 . The OFF command is a command to switch the projector 14 off. Note that "on" refers to a setting in which the light projector 14 projects light, and "off" refers to a setting in which the light projector 14 does not perform light projection.
 モード設定部115は、表示制御フラグ151として撮像画像表示フラグ151Aが設定された場合には、CPU61の動作モードとして撮像モードを設定する。また、モード設定部115は、表示制御フラグ151として合成画像表示フラグ151Bが設定された場合には、CPU61の動作モードとして温度測定モードを設定する。 The mode setting unit 115 sets the imaging mode as the operation mode of the CPU 61 when the captured image display flag 151A is set as the display control flag 151. Further, when the composite image display flag 151B is set as the display control flag 151, the mode setting unit 115 sets the temperature measurement mode as the operation mode of the CPU 61.
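The mode-to-flag correspondence described above can be sketched as follows. This is a minimal illustrative sketch only; the class, attribute, and function names are hypothetical and not part of the disclosure, which describes the flags as values written to storage areas 141 and 142.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Mode(Enum):
    IMAGING = auto()      # imaging mode (an example of the "first mode")
    TEMPERATURE = auto()  # temperature measurement mode (an example of the "second mode")

@dataclass
class ControlFlags:
    display_flag: str  # "151A" (captured image display) or "151B" (composite image display)
    light_on: bool     # True -> 152A (projector on), False -> 152B (projector off)

def set_flags_for_mode(mode: Mode) -> ControlFlags:
    """Mirror of the flag table in the text: imaging mode selects the captured
    image display flag and turns the projector on; temperature measurement mode
    selects the composite image display flag and turns the projector off."""
    if mode is Mode.IMAGING:
        return ControlFlags(display_flag="151A", light_on=True)
    return ControlFlags(display_flag="151B", light_on=False)
```

Because the display flag and the projector flag are always set as a pair, modelling them as one record keeps the two storage areas from drifting out of sync.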
 第1実施形態において、撮像モードは、本開示の技術に係る「第1モード」の一例であり、温度測定モードは、本開示の技術に係る「第2モード」の一例である。また、表示制御フラグ151及び投光制御フラグ152は、本開示の技術に係る「制御因子」の一例である。また、ディスプレイ76及び投光器14は、本開示の技術に係る「制御対象」の一例であり、ディスプレイ76に表示される画像、及び投光器14の動作は、本開示の技術に係る「制御対象に対する制御内容」の一例である。また、表示制御フラグ151は、本開示の技術に係る「表示制御因子」の一例であり、撮像画像表示フラグ151Aは、本開示の技術に係る「撮像画像表示因子」の一例であり、合成画像表示フラグ151Bは、本開示の技術に係る「温度情報表示因子」の一例である。また、投光オン制御フラグ152A及び投光オフ制御フラグ152Bは、本開示の技術に係る「投光制御因子」の一例であり、投光オン制御フラグ152Aは、本開示の技術に係る「投光オン制御因子」の一例であり、投光オフ制御フラグ152Bは、本開示の技術に係る「投光抑制制御因子」の一例である。 In the first embodiment, the imaging mode is an example of the “first mode” according to the technology of the present disclosure, and the temperature measurement mode is an example of the “second mode” according to the technology of the present disclosure. The display control flag 151 and the light projection control flag 152 are examples of the “control factor” according to the technology of the present disclosure. The display 76 and the light projector 14 are examples of the “controlled object” according to the technology of the present disclosure, and the image displayed on the display 76 and the operation of the light projector 14 are examples of the “control content for the controlled object” according to the technology of the present disclosure. The display control flag 151 is an example of the “display control factor” according to the technology of the present disclosure, the captured image display flag 151A is an example of the “captured image display factor” according to the technology of the present disclosure, and the composite image display flag 151B is an example of the “temperature information display factor” according to the technology of the present disclosure. Further, the light projection ON control flag 152A and the light projection OFF control flag 152B are examples of the “light projection control factor” according to the technology of the present disclosure; the light projection ON control flag 152A is an example of the “light projection ON control factor” according to the technology of the present disclosure, and the light projection OFF control flag 152B is an example of the “light projection suppression control factor” according to the technology of the present disclosure.
 一例として図7に示すように、撮像処理部120は、波長選択部121、ターレット制御部122、撮像制御部123、表示制御部124を有する。 As shown in FIG. 7 as an example, the imaging processing unit 120 has a wavelength selection unit 121, a turret control unit 122, an imaging control unit 123, and a display control unit 124.
 波長選択部121は、入力デバイス78によって受け付けられた波長選択指示に従って、複数の波長帯域から撮像に用いる一つの波長帯域を選択する。一例として、波長選択部121は、可視光の波長帯域、950nmから1100nmの第1近赤外光の波長帯域、1150nmから1350nmの第2近赤外光の波長帯域、1500nmから1750nmの第3近赤外光の波長帯域、及び2000nmから2400nmの第4近赤外光の波長帯域から一つの波長帯域を選択する。なお、ここでは、入力デバイス78によって受け付けられた波長選択指示に従って波長帯域が選択される形態例を挙げて説明しているが、各種条件(例えば、被写体の温度及び/又は撮像条件など)に従って波長帯域が選択されてもよい。 The wavelength selection unit 121 selects one wavelength band to be used for imaging from a plurality of wavelength bands in accordance with the wavelength selection instruction received by the input device 78. As an example, the wavelength selection unit 121 selects one wavelength band from the visible light wavelength band, the first near-infrared wavelength band from 950 nm to 1100 nm, the second near-infrared wavelength band from 1150 nm to 1350 nm, the third near-infrared wavelength band from 1500 nm to 1750 nm, and the fourth near-infrared wavelength band from 2000 nm to 2400 nm. Although an example is described here in which the wavelength band is selected in accordance with the wavelength selection instruction received by the input device 78, the wavelength band may instead be selected according to various conditions (for example, the temperature of the subject and/or the imaging conditions).
 ターレット制御部122は、複数の光学フィルタのうち、波長選択部121によって選択された波長帯域に対応する光学フィルタを光路に挿入させる回転指令をターレット駆動回路55に対して出力する。ターレット駆動回路55は、回転指令を受け取ると、ターレット駆動機構45を駆動させることにより、回転指令に対応する光学フィルタが光路に挿入される位置にターレットフィルタ35を回転させる。 The turret control unit 122 outputs to the turret drive circuit 55 a rotation command for inserting into the optical path an optical filter corresponding to the wavelength band selected by the wavelength selection unit 121 among the plurality of optical filters. Upon receiving the rotation command, the turret drive circuit 55 drives the turret drive mechanism 45 to rotate the turret filter 35 to a position where the optical filter corresponding to the rotation command is inserted into the optical path.
 撮像制御部123は、イメージセンサドライバ71に対して撮像指令を出力する。撮像指令は、イメージセンサ15に光を撮像させる指令である。イメージセンサ15は、被写体から発せられた光を撮像し、光を撮像することで得たアナログ画像データを出力する。信号処理回路72は、イメージセンサ15から出力されたアナログ画像データに対して各種の信号処理を施すことによりデジタル画像データを生成して出力する。 The imaging control unit 123 outputs an imaging command to the image sensor driver 71 . The imaging command is a command to cause the image sensor 15 to capture light. The image sensor 15 captures light emitted from a subject and outputs analog image data obtained by capturing the light. The signal processing circuit 72 performs various signal processing on the analog image data output from the image sensor 15 to generate and output digital image data.
 表示制御部124は、信号処理回路72によって生成されたデジタル画像データに基づいて撮像画像161Aをディスプレイ76に表示させる。撮像画像161Aは、例えば、動画像で表示されるが、静止画像で表示されてもよい。 The display control unit 124 causes the display 76 to display the captured image 161A based on the digital image data generated by the signal processing circuit 72. The captured image 161A is displayed as a moving image, for example, but may be displayed as a still image.
 図8には、撮像処理部120によって撮像処理が実行されることによりディスプレイ76に表示された撮像画像161Aの一例が示されている。図8に示す例では、建物の窓162の室内側に設けられたカーテン163に火164がついている撮像画像161Aがディスプレイ76に表示されている。 FIG. 8 shows an example of the captured image 161A displayed on the display 76 as a result of the imaging processing executed by the imaging processing unit 120. In the example shown in FIG. 8, the display 76 displays a captured image 161A in which a curtain 163 provided on the indoor side of a window 162 of a building has caught fire 164.
 一例として図9に示すように、温度測定処理部130は、波長選択部131、第1ターレット制御部132、第1撮像制御部133、第2ターレット制御部134、第2撮像制御部135、温度導出部136、及び表示制御部137を有する。 As an example, as shown in FIG. 9, the temperature measurement processing unit 130 has a wavelength selection unit 131, a first turret control unit 132, a first imaging control unit 133, a second turret control unit 134, a second imaging control unit 135, a temperature derivation unit 136, and a display control unit 137.
 一例として図10に示すように、波長選択部131は、二色温度測定法に用いる二つの波長帯域、すなわち第1波長帯域及び第2波長帯域を選択する。一例として、波長選択部131は、第1波長帯域及び第2波長帯域として、950nmから1100nmの第1近赤外光の波長帯域、1150nmから1350nmの第2近赤外光の波長帯域、1500nmから1750nmの第3近赤外光の波長帯域、及び2000nmから2400nmの第4近赤外光の波長帯域から二つの波長帯域を選択する。また、一例として、波長選択部131は、第1波長帯域及び第2波長帯域として、第1近赤外光の波長帯域、第2近赤外光の波長帯域、第3近赤外光の波長帯域、及び第4近赤外光の波長帯域のうちの隣接する二つの波長帯域を選択する。 As an example, as shown in FIG. 10, the wavelength selection unit 131 selects the two wavelength bands used for two-color thermometry, that is, a first wavelength band and a second wavelength band. As an example, the wavelength selection unit 131 selects, as the first wavelength band and the second wavelength band, two wavelength bands from the first near-infrared wavelength band from 950 nm to 1100 nm, the second near-infrared wavelength band from 1150 nm to 1350 nm, the third near-infrared wavelength band from 1500 nm to 1750 nm, and the fourth near-infrared wavelength band from 2000 nm to 2400 nm. Also, as an example, the wavelength selection unit 131 selects, as the first wavelength band and the second wavelength band, two adjacent wavelength bands from among the first near-infrared wavelength band, the second near-infrared wavelength band, the third near-infrared wavelength band, and the fourth near-infrared wavelength band.
 また、一例として、波長選択部131は、第1波長帯域及び第2波長帯域として、第1近赤外光の波長帯域、第2近赤外光の波長帯域、第3近赤外光の波長帯域、及び第4近赤外光の波長帯域から被写体の温度に基づいて二つの波長帯域を選択する。また、一例として、波長選択部131は、第1波長帯域及び第2波長帯域として、第1近赤外光の波長帯域、第2近赤外光の波長帯域、第3近赤外光の波長帯域、及び第4近赤外光の波長帯域から、被写体の温度が高くなるに従って短い波長帯域を選択する。 Also, as an example, the wavelength selection unit 131 selects, as the first wavelength band and the second wavelength band, two wavelength bands from the first near-infrared wavelength band, the second near-infrared wavelength band, the third near-infrared wavelength band, and the fourth near-infrared wavelength band based on the temperature of the subject. Also, as an example, the wavelength selection unit 131 selects, as the first wavelength band and the second wavelength band, shorter wavelength bands from among the first near-infrared wavelength band, the second near-infrared wavelength band, the third near-infrared wavelength band, and the fourth near-infrared wavelength band as the temperature of the subject becomes higher.
 例えば、火災の場合、被写体の温度は、火災状況から予想される温度に関する情報、及び/又は、ユーザが入力デバイス78に入力した情報に基づいて予測される。火災状況から予想される温度に関する情報は、図9に示す外部I/F80を通じて取得される。火災状況から予想される温度に関する情報は、火災発生からの経過時間に対する標準火災温度に関する情報でもよい。標準火災温度は、ISO834にて定められている。なお、波長選択部131は、図9に示す入力デバイス78によって受け付けられた波長選択指示に従って波長帯域を切り替えてもよい。 For example, in the case of a fire, the temperature of the subject is predicted based on information about the temperature expected from the fire situation and/or information input to the input device 78 by the user. Information about the temperature expected from the fire situation is obtained through the external I/F 80 shown in FIG. The information on the temperature expected from the fire situation may be information on the standard fire temperature with respect to the elapsed time from the occurrence of the fire. Standard fire temperature is defined in ISO834. Note that the wavelength selection unit 131 may switch the wavelength band according to the wavelength selection instruction accepted by the input device 78 shown in FIG.
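The selection of two adjacent bands from a predicted subject temperature can be sketched as follows. The band list comes from the text; the temperature thresholds are illustrative assumptions only, since the text states merely that shorter bands are chosen as the predicted temperature rises (consistent with Wien's displacement law, which shifts the emission peak toward shorter wavelengths at higher temperatures).

```python
# The four near-infrared bands from the text, in nm, shortest first.
NIR_BANDS = [(950, 1100), (1150, 1350), (1500, 1750), (2000, 2400)]

def select_adjacent_band_pair(predicted_temp_c: float):
    """Pick two adjacent NIR bands, moving toward shorter wavelengths as the
    predicted subject temperature rises. Threshold values are hypothetical."""
    if predicted_temp_c >= 1200.0:
        index = 0   # hottest subjects: first and second bands
    elif predicted_temp_c >= 800.0:
        index = 1   # intermediate: second and third bands
    else:
        index = 2   # cooler subjects: third and fourth bands
    return NIR_BANDS[index], NIR_BANDS[index + 1]
```

The predicted temperature fed to such a selector could come from the standard fire temperature curve or from user input, as described above.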
 一例として図9に示すように、第1ターレット制御部132は、複数のBPF83(図4参照)のうち波長選択部131によって選択された第1波長帯域に対応するBPF83を光路に挿入させる第1回転指令をターレット駆動回路55に対して出力する。ターレット駆動回路55は、第1回転指令を受け取ると、ターレット駆動機構45を駆動させることにより、第1回転指令に対応するBPFが光路に挿入される位置にターレットフィルタ35を回転させる。 As an example, as shown in FIG. 9, the first turret control unit 132 outputs to the turret drive circuit 55 a first rotation command for inserting into the optical path the BPF 83 corresponding to the first wavelength band selected by the wavelength selection unit 131 from among the plurality of BPFs 83 (see FIG. 4). Upon receiving the first rotation command, the turret drive circuit 55 drives the turret drive mechanism 45 to rotate the turret filter 35 to a position where the BPF corresponding to the first rotation command is inserted into the optical path.
 第1撮像制御部133は、イメージセンサドライバ71に対して第1撮像指令を出力する。第1撮像指令は、イメージセンサ15に光を撮像させる指令である。イメージセンサ15は、第1波長帯域に対応するBPF83を透過した第1近赤外光を撮像し、第1近赤外光を撮像することで得た第1アナログ画像データを出力する。信号処理回路72は、イメージセンサ15から出力された第1アナログ画像データに対して各種の信号処理を施すことにより第1デジタル画像データを生成して出力する。 The first imaging control section 133 outputs a first imaging command to the image sensor driver 71 . The first imaging command is a command to cause the image sensor 15 to capture light. The image sensor 15 captures the first near-infrared light transmitted through the BPF 83 corresponding to the first wavelength band, and outputs first analog image data obtained by capturing the first near-infrared light. The signal processing circuit 72 performs various signal processing on the first analog image data output from the image sensor 15 to generate and output first digital image data.
 第2ターレット制御部134は、複数のBPF83のうち波長選択部131によって選択された第2波長帯域に対応するBPF83を光路に挿入させる第2回転指令をターレット駆動回路55に対して出力する。ターレット駆動回路55は、第2回転指令を受け取ると、ターレット駆動機構45を駆動させることにより、第2回転指令に対応するBPFが光路に挿入される位置にターレットフィルタ35を回転させる。 The second turret control unit 134 outputs to the turret drive circuit 55 a second rotation command for inserting into the optical path the BPF 83 corresponding to the second wavelength band selected by the wavelength selection unit 131 from among the plurality of BPFs 83 . Upon receiving the second rotation command, the turret drive circuit 55 drives the turret drive mechanism 45 to rotate the turret filter 35 to a position where the BPF corresponding to the second rotation command is inserted into the optical path.
 第2撮像制御部135は、イメージセンサドライバ71に対して第2撮像指令を出力する。第2撮像指令は、イメージセンサ15に光Lを撮像させる指令である。イメージセンサ15は、第2波長帯域に対応するBPFを透過した第2近赤外光を撮像し、第2近赤外光を撮像することで得た第2アナログ画像データを出力する。信号処理回路72は、イメージセンサ15から出力された第2アナログ画像データに対して各種の信号処理を施すことにより第2デジタル画像データを生成して出力する。 The second imaging control unit 135 outputs a second imaging command to the image sensor driver 71. The second imaging command is a command to cause the image sensor 15 to capture the light L. The image sensor 15 captures the second near-infrared light transmitted through the BPF corresponding to the second wavelength band, and outputs second analog image data obtained by capturing the second near-infrared light. The signal processing circuit 72 performs various signal processing on the second analog image data output from the image sensor 15 to generate and output second digital image data.
 温度導出部136は、第1デジタル画像データ及び第2デジタル画像データに基づいて、二色温度測定法により、被写体の温度分布を算出する。具体的には、温度導出部136は、イメージセンサ15が有する複数の物理画素のそれぞれについて、第1デジタル画像データから物理画素が出力した第1信号の値を抽出し、第2デジタル画像データから物理画素が出力した第2信号の値を抽出する。そして、温度導出部136は、複数の物理画素のそれぞれについて、第1信号の値及び第2信号の値に基づいて物理画素によって測定された温度を二色温度測定法に基づいて導出する。温度の導出には、二色温度測定法に基づく計算式が用いられてもよく、また、データマッチングテーブルが用いられてもよい。そして、温度導出部136は、複数の物理画素のそれぞれによって測定された温度を導出することで被写体の温度分布を算出する。 The temperature derivation unit 136 calculates the temperature distribution of the subject by two-color thermometry based on the first digital image data and the second digital image data. Specifically, for each of the plurality of physical pixels of the image sensor 15, the temperature derivation unit 136 extracts the value of the first signal output by the physical pixel from the first digital image data, and extracts the value of the second signal output by the physical pixel from the second digital image data. Then, for each of the plurality of physical pixels, the temperature derivation unit 136 derives the temperature measured by the physical pixel from the value of the first signal and the value of the second signal on the basis of two-color thermometry. A calculation formula based on two-color thermometry may be used to derive the temperature, or a data matching table may be used. The temperature derivation unit 136 then calculates the temperature distribution of the subject by deriving the temperature measured by each of the plurality of physical pixels.
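As a concrete illustration of the calculation-formula option, under the Wien approximation of Planck's law and a grey-body assumption (equal emissivity in both bands), the ratio of the two band signals depends only on temperature and can be inverted per pixel. This is a textbook two-color (ratio) pyrometry sketch, not the specific formula of the disclosure; treating each band by a single representative centre wavelength is an additional simplifying assumption.

```python
import numpy as np

C2 = 1.4388e-2  # second radiation constant c2, in m*K

def wien_signal(lam, t):
    """Unnormalised band signal under the Wien approximation: lam^-5 * exp(-c2/(lam*T))."""
    return lam ** -5 * np.exp(-C2 / (lam * t))

def two_color_temperature(s1, s2, lam1, lam2):
    """Per-pixel temperature from the ratio of two band signals.

    s1, s2     : first/second band signals (scalars or same-shaped arrays)
    lam1, lam2 : representative band-centre wavelengths in metres (lam1 < lam2)

    With a grey body, emissivity cancels in s1/s2, leaving
    s1/s2 = (lam2/lam1)^5 * exp(-(c2/T)(1/lam1 - 1/lam2)),
    which is solved for T below.
    """
    ratio = np.asarray(s1, dtype=float) / np.asarray(s2, dtype=float)
    return C2 * (1.0 / lam1 - 1.0 / lam2) / np.log((lam2 / lam1) ** 5 / ratio)
```

Applied element-wise to the first and second digital image data, this yields a temperature per physical pixel, i.e. the temperature distribution of the subject.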
 表示制御部137は、温度導出部136によって得られた被写体の温度分布に基づいて温度に関連する温度情報を生成する。そして、表示制御部137は、第1デジタル画像データ又は第2デジタル画像データに基づいて得られる撮像画像に温度情報を合成した合成画像161Bを出力し、合成画像161Bをディスプレイ76に表示させる。温度情報の一例としては、例えば、温度が予め定められた閾値以上である領域を示す情報、温度の具体的な数値を示す情報、予め定められた温度の範囲毎に区画された複数の区画を温度の具体的な数値と併せて示す情報、又は温度に応じた色調で温度分布を示す情報等が挙げられる。 The display control unit 137 generates temperature information related to the temperature based on the temperature distribution of the subject obtained by the temperature derivation unit 136. The display control unit 137 then outputs a composite image 161B in which the temperature information is combined with the captured image obtained based on the first digital image data or the second digital image data, and causes the display 76 to display the composite image 161B. Examples of the temperature information include information indicating a region where the temperature is equal to or higher than a predetermined threshold, information indicating a specific numerical value of the temperature, information showing a plurality of sections divided by predetermined temperature ranges together with specific numerical values of the temperature, and information showing the temperature distribution in color tones corresponding to the temperature.
 図11には、温度測定処理部130によって温度測定処理が実行されることによりディスプレイ76に表示された合成画像161Bの第1例が示されている。第1例に係る合成画像161Bは、温度が予め定められた閾値以上である領域を示す温度情報166が撮像画像165に合成された画像である。第1例では、温度情報166は、例えば、四角形状の枠を示す情報であるが、枠以外の情報でもよい。また、枠は、四角形状以外でもよい。また、枠は、温度に応じた色で表示されてもよい。また、枠が温度に応じた色で表示される場合には、色に対応する温度を示すスケール表示が枠と併せて表示されてもよい。また、温度情報166には、温度の具体的な数値を示す文字列が含まれてもよい。温度の具体的な数値は、温度の最大値及び/又は最小値でもよい。また、温度情報166は、静止して表示されてよいし、点滅して表示されてもよい。 FIG. 11 shows a first example of the composite image 161B displayed on the display 76 as a result of the temperature measurement processing executed by the temperature measurement processing unit 130. The composite image 161B according to the first example is an image in which temperature information 166 indicating a region whose temperature is equal to or higher than a predetermined threshold is combined with a captured image 165. In the first example, the temperature information 166 is, for example, information indicating a rectangular frame, but may be information other than a frame. The frame may also have a shape other than a rectangle. The frame may be displayed in a color corresponding to the temperature. When the frame is displayed in a color corresponding to the temperature, a scale indicating the temperature corresponding to each color may be displayed together with the frame. The temperature information 166 may also include a character string indicating a specific numerical value of the temperature. The specific numerical value of the temperature may be the maximum and/or minimum value of the temperature. The temperature information 166 may be displayed statically or may blink.
 図12には、温度測定処理部130によって温度測定処理が実行されることによりディスプレイ76に表示された合成画像161Bの第2例が示されている。第2例に係る合成画像161Bは、温度の具体的な数値を示す複数の温度情報167A、167B及び167Cが撮像画像165に合成された画像である。温度の具体的な数値は、温度の最大値及び/又は最小値でもよい。また、複数の温度情報167A、167B及び167Cは、静止して表示されてよいし、点滅して表示されてもよい。 FIG. 12 shows a second example of the composite image 161B displayed on the display 76 as a result of the temperature measurement processing executed by the temperature measurement processing unit 130. The composite image 161B according to the second example is an image in which a plurality of pieces of temperature information 167A, 167B, and 167C indicating specific numerical values of the temperature are combined with the captured image 165. The specific numerical values of the temperature may be the maximum and/or minimum value of the temperature. The pieces of temperature information 167A, 167B, and 167C may be displayed statically or may blink.
 図13には、温度測定処理部130によって温度測定処理が実行されることによりディスプレイ76に表示された合成画像161Bの第3例が示されている。第3例に係る合成画像161Bは、予め定められた温度の範囲毎に区画された複数の区画を温度の具体的な数値と併せて示す温度情報168が撮像画像165に合成された画像である。温度の具体的な数値は、温度の最大値及び/又は最小値でもよい。また、温度情報168は、静止して表示されてよいし、点滅して表示されてもよい。 FIG. 13 shows a third example of the composite image 161B displayed on the display 76 as a result of the temperature measurement processing executed by the temperature measurement processing unit 130. The composite image 161B according to the third example is an image in which temperature information 168 showing a plurality of sections divided by predetermined temperature ranges together with specific numerical values of the temperature is combined with the captured image 165. The specific numerical value of the temperature may be the maximum and/or minimum value of the temperature. The temperature information 168 may be displayed statically or may blink.
 図14には、温度測定処理部130によって温度測定処理が実行されることによりディスプレイ76に表示された合成画像161Bの第4例が示されている。第4例に係る合成画像161Bは、温度に応じた色調で温度分布を示す温度情報169が撮像画像165に合成された画像である。図14に示す第4例の温度分布は、図13に示す第3例の各区画の温度を温度に対応する色調で表示したものである。色調は、例えば、基準温度(例えば、3000℃)を中心として一定の色相角(例えば、1℃につき0.01°)で規定されている。また、色調に代えて、無彩色の濃淡で温度分布が可視化されていてもよい。温度情報169は、色調に対応する温度を示すインディケータを含んでいてもよい。また、温度情報169には、温度の具体的な数値を示す文字列が含まれてもよい。温度の具体的な数値は、温度の最大値及び/又は最小値でもよい。また、温度情報169は、静止して表示されてよいし、点滅して表示されてもよい。 FIG. 14 shows a fourth example of the composite image 161B displayed on the display 76 as a result of the temperature measurement processing executed by the temperature measurement processing unit 130. The composite image 161B according to the fourth example is an image in which temperature information 169 showing the temperature distribution in color tones corresponding to the temperature is combined with the captured image 165. The temperature distribution of the fourth example shown in FIG. 14 displays the temperature of each section of the third example shown in FIG. 13 in a color tone corresponding to the temperature. The color tone is defined, for example, by a constant hue angle (for example, 0.01° per 1°C) around a reference temperature (for example, 3000°C). The temperature distribution may also be visualized in achromatic shades instead of color tones. The temperature information 169 may include an indicator showing the temperature corresponding to each color tone. The temperature information 169 may also include a character string indicating a specific numerical value of the temperature. The specific numerical value of the temperature may be the maximum and/or minimum value of the temperature. The temperature information 169 may be displayed statically or may blink.
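The hue-angle color mapping in the fourth example can be sketched as follows. The text fixes only the reference temperature (3000°C) and the hue step (0.01° per 1°C); the base hue at the reference temperature and the full saturation/value used here are assumptions.

```python
import colorsys

REFERENCE_TEMP_C = 3000.0  # reference temperature from the text
HUE_DEG_PER_C = 0.01       # hue angle per 1 deg C, from the text

def temp_to_rgb(temp_c: float):
    """Map a temperature to an 8-bit RGB color by rotating the hue around the
    reference temperature. Base hue (red at the reference) is an assumption."""
    hue_deg = (HUE_DEG_PER_C * (temp_c - REFERENCE_TEMP_C)) % 360.0
    r, g, b = colorsys.hsv_to_rgb(hue_deg / 360.0, 1.0, 1.0)
    return int(r * 255), int(g * 255), int(b * 255)
```

Applying such a function to every section (or pixel) of the temperature distribution produces the color-coded overlay of temperature information 169; replacing the HSV conversion with a grayscale ramp would give the achromatic-shade variant mentioned above.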
 温度情報166、167A、167B、167C、168、及び169は、本開示の技術に係る「温度情報」の一例である。撮像モードで得られた撮像画像161Aは、本開示の技術に係る「撮像画像」及び「第1撮像画像」の一例である。温度測定モードで得られた合成画像161Bは、本開示の技術に係る「合成画像」の一例である。なお、以下では、特に区別して説明する必要がない場合、撮像モードで得られた撮像画像161A、及び温度測定モードで得られた合成画像161Bを画像と称する。 The temperature information 166, 167A, 167B, 167C, 168, and 169 are examples of "temperature information" according to the technology of the present disclosure. The captured image 161A obtained in the imaging mode is an example of the “captured image” and the “first captured image” according to the technology of the present disclosure. A composite image 161B obtained in the temperature measurement mode is an example of a “composite image” according to the technology of the present disclosure. In the following description, the captured image 161A obtained in the imaging mode and the composite image 161B obtained in the temperature measurement mode will be referred to as images unless otherwise distinguished.
 次に、第1実施形態の作用として、カメラ1の制御方法について説明する。 Next, a method for controlling the camera 1 will be described as an operation of the first embodiment.
 はじめに、図15を参照しながら、CPU61が実行する撮像支援処理のうち、モード切替処理部110(図6参照)が実行するモード切替処理について説明する。 First, the mode switching processing executed by the mode switching processing unit 110 (see FIG. 6) among the imaging support processing executed by the CPU 61 will be described with reference to FIG.
 ステップS11で、モード選択情報取得部111は、モード選択情報を取得する。 At step S11, the mode selection information acquisition unit 111 acquires mode selection information.
 ステップS12で、モード判定部112は、モード選択情報取得部111によって取得されたモード選択情報に基づいて、モード選択情報によって選択された動作モードが撮像モードであるのか、又は温度測定モードであるのかを判定する。ステップS12において、撮像モードであると判定された場合、図15に示す処理は、ステップS13に移行し、温度測定モードであると判定された場合、図15に示す処理は、ステップS16に移行する。 In step S12, the mode determination unit 112 determines, based on the mode selection information acquired by the mode selection information acquisition unit 111, whether the operation mode selected by the mode selection information is the imaging mode or the temperature measurement mode. If the imaging mode is determined in step S12, the processing shown in FIG. 15 proceeds to step S13; if the temperature measurement mode is determined, the processing shown in FIG. 15 proceeds to step S16.
 ステップS13で、フラグ設定部113は、表示制御フラグ記憶領域141に表示制御フラグ151として撮像画像表示フラグ151Aを設定し、かつ、投光制御フラグ記憶領域142に投光制御フラグ152として投光オン制御フラグ152Aを設定する。 In step S13, the flag setting unit 113 sets the captured image display flag 151A as the display control flag 151 in the display control flag storage area 141, and sets the light projection ON control flag 152A as the light projection control flag 152 in the light projection control flag storage area 142.
 ステップS14で、投光制御部114は、投光器14をオンにする。 In step S14, the light projection control unit 114 turns on the light projector 14.
 ステップS15で、モード設定部115は、CPU61の動作モードとして撮像モードを設定する。 At step S15, the mode setting unit 115 sets the imaging mode as the operation mode of the CPU 61.
 ステップS16で、フラグ設定部113は、表示制御フラグ記憶領域141に表示制御フラグ151として合成画像表示フラグ151Bを設定し、投光制御フラグ記憶領域142に投光制御フラグ152として投光オフ制御フラグ152Bを設定する。 In step S16, the flag setting unit 113 sets the composite image display flag 151B as the display control flag 151 in the display control flag storage area 141, and sets the light projection OFF control flag 152B as the light projection control flag 152 in the light projection control flag storage area 142.
 ステップS17で、投光制御部114は、投光器14をオフにする。 In step S17, the light projection control unit 114 turns off the light projector 14.
 ステップS18で、モード設定部115は、CPU61の動作モードとして温度測定モードを設定する。 At step S18, the mode setting unit 115 sets the temperature measurement mode as the operation mode of the CPU 61.
 続いて、図16を参照しながら、CPU61が実行する撮像支援処理のうち、撮像処理部120(図7参照)が実行する撮像処理について説明する。 Next, the imaging process executed by the imaging processing unit 120 (see FIG. 7) among the imaging support processes executed by the CPU 61 will be described with reference to FIG.
 ステップS21で、波長選択部121は、入力デバイス78によって受け付けられた波長選択指示に従って、複数の波長帯域から撮像に用いる一つの波長帯域を選択する。 In step S21, the wavelength selection unit 121 selects one wavelength band used for imaging from a plurality of wavelength bands according to the wavelength selection instruction received by the input device 78.
 ステップS22で、ターレット制御部122は、複数の光学フィルタのうち、波長選択部121によって選択された波長帯域に対応する光学フィルタが光路に挿入される位置にターレットフィルタ35を回転させる。 In step S22, the turret control unit 122 rotates the turret filter 35 to a position where the optical filter corresponding to the wavelength band selected by the wavelength selection unit 121 among the plurality of optical filters is inserted into the optical path.
 ステップS23で、撮像制御部123は、イメージセンサ15に光を撮像させる。イメージセンサ15は、光を撮像することで得たアナログ画像データを出力し、信号処理回路72は、アナログ画像データに対して各種の信号処理を施すことによりデジタル画像データを生成して出力する。 At step S23, the imaging control unit 123 causes the image sensor 15 to capture light. The image sensor 15 outputs analog image data obtained by capturing light, and the signal processing circuit 72 generates and outputs digital image data by performing various signal processing on the analog image data.
 ステップS24で、表示制御部124は、信号処理回路72によって生成されたデジタル画像データに基づいて撮像画像161Aをディスプレイ76に表示させる。 In step S24, the display control unit 124 causes the display 76 to display the captured image 161A based on the digital image data generated by the signal processing circuit 72.
 続いて、図17を参照しながら、CPU61が実行する撮像支援処理のうち、温度測定処理部130(図9参照)が実行する温度測定処理について説明する。 Subsequently, the temperature measurement processing executed by the temperature measurement processing unit 130 (see FIG. 9) among the imaging support processing executed by the CPU 61 will be described with reference to FIG.
 ステップS31で、波長選択部131は、二色温度測定法に用いる二つの波長帯域、すなわち第1波長帯域及び第2波長帯域を選択する。 In step S31, the wavelength selection unit 131 selects two wavelength bands, that is, the first wavelength band and the second wavelength band, to be used for the two-color thermometry method.
 ステップS32で、第1ターレット制御部132は、複数のBPF83(図4参照)のうち波長選択部131によって選択された第1波長帯域に対応するBPF83が光路に挿入される位置にターレットフィルタ35を回転させる。 In step S32, the first turret control unit 132 rotates the turret filter 35 to a position where the BPF 83 corresponding to the first wavelength band selected by the wavelength selection unit 131 from among the plurality of BPFs 83 (see FIG. 4) is inserted into the optical path.
 ステップS33で、第1撮像制御部133は、第1波長帯域に対応するBPF83を透過した第1近赤外光をイメージセンサ15に撮像させる。イメージセンサ15は、第1近赤外光を撮像することで得た第1アナログ画像データを出力し、信号処理回路72は、第1アナログ画像データに対して各種の信号処理を施すことにより第1デジタル画像データを生成して出力する。 In step S33, the first imaging control unit 133 causes the image sensor 15 to capture the first near-infrared light transmitted through the BPF 83 corresponding to the first wavelength band. The image sensor 15 outputs first analog image data obtained by capturing the first near-infrared light, and the signal processing circuit 72 performs various signal processing on the first analog image data to generate and output first digital image data.
 ステップS34で、第2ターレット制御部134は、複数のBPF83のうち波長選択部131によって選択された第2波長帯域に対応するBPF83が光路に挿入される位置にターレットフィルタ35を回転させる。 In step S34, the second turret control unit 134 rotates the turret filter 35 to a position where the BPF 83 corresponding to the second wavelength band selected by the wavelength selection unit 131 among the plurality of BPFs 83 is inserted into the optical path.
 ステップS35で、第2撮像制御部135は、第2波長帯域に対応するBPF83を透過した第2近赤外光をイメージセンサ15に撮像させる。イメージセンサ15は、第2近赤外光を撮像することで得た第2アナログ画像データを出力し、信号処理回路72は、第2アナログ画像データに対して各種の信号処理を施すことにより第2デジタル画像データを生成して出力する。 In step S35, the second imaging control unit 135 causes the image sensor 15 to capture the second near-infrared light transmitted through the BPF 83 corresponding to the second wavelength band. The image sensor 15 outputs second analog image data obtained by capturing the second near-infrared light, and the signal processing circuit 72 performs various signal processing on the second analog image data to generate and output second digital image data.
 ステップS36で、温度導出部136は、第1デジタル画像データ及び第2デジタル画像データに基づいて、二色温度測定法により、被写体の温度分布を算出する。 In step S36, the temperature derivation unit 136 calculates the temperature distribution of the subject by two-color thermometry based on the first digital image data and the second digital image data.
 ステップS37で、表示制御部137は、温度導出部136によって得られた被写体の温度分布に基づいて温度に関連する温度情報を生成する。そして、表示制御部137は、第1デジタル画像データ又は第2デジタル画像データに基づいて得られる撮像画像に温度情報を合成した合成画像161Bを出力し、合成画像161Bをディスプレイ76に表示させる。 In step S37, the display control unit 137 generates temperature information related to the temperature based on the temperature distribution of the subject obtained by the temperature derivation unit 136. The display control unit 137 then outputs a composite image 161B in which the temperature information is combined with the captured image obtained based on the first digital image data or the second digital image data, and causes the display 76 to display the composite image 161B.
 なお、CPU61は、撮像モード及び温度測定モードの各モードにおいて、フォーカスレンズ31を光軸OAに沿って移動させることでピントの位置を調節する制御、及びズームレンズ32を移動させることでズーム倍率を調節する制御をズーム駆動機構42に対して行う。また、CPU61は、撮像モード及び温度測定モードの各モードにおいて、ぶれ補正レンズ34を移動させることで像のぶれを補正する制御をぶれ補正駆動機構44に対して行う。また、CPU61は、撮像モード及び温度測定モードの各モードにおいて、絞り33に設けられた開口33Aの口径を変更することで絞り33を透過する光の量を調節する制御を絞り駆動機構43に対して行う。また、CPU61は、撮像モード及び温度測定モードの各モードにおいて、調整レンズ37を移動させることで焦点の位置を調整する制御を調整駆動機構47に対して行う。 Note that, in each of the imaging mode and the temperature measurement mode, the CPU 61 performs, on the zoom drive mechanism 42, control for adjusting the focus position by moving the focus lens 31 along the optical axis OA and control for adjusting the zoom magnification by moving the zoom lens 32. In each of the imaging mode and the temperature measurement mode, the CPU 61 also performs, on the blur correction drive mechanism 44, control for correcting image blur by moving the blur correction lens 34. In each of the imaging mode and the temperature measurement mode, the CPU 61 also performs, on the diaphragm drive mechanism 43, control for adjusting the amount of light passing through the diaphragm 33 by changing the diameter of the aperture 33A provided in the diaphragm 33. In each of the imaging mode and the temperature measurement mode, the CPU 61 also performs, on the adjustment drive mechanism 47, control for adjusting the focal position by moving the adjustment lens 37.
 第1実施形態に係るカメラ1の制御方法は、本開示の技術に係る「制御方法」の一例である。 The control method of the camera 1 according to the first embodiment is an example of the "control method" according to the technology of the present disclosure.
 次に、第1実施形態の効果について説明する。 Next, the effects of the first embodiment will be described.
 第1実施形態では、CPU61は、イメージセンサ15に対して光を撮像させる撮像モードと、イメージセンサ15によって受光された近赤外光に基づいて温度を導出する温度測定モードとを備える。そして、撮像モードと温度測定モードとでは、制御フラグが異なる。したがって、撮像モードと温度測定モードとで制御対象における制御内容を異ならせることができる。 In the first embodiment, the CPU 61 has an imaging mode for causing the image sensor 15 to image light and a temperature measurement mode for deriving temperature based on the near-infrared light received by the image sensor 15 . The control flag differs between the imaging mode and the temperature measurement mode. Therefore, it is possible to change the control contents of the controlled object between the imaging mode and the temperature measurement mode.
 例えば、第1実施形態では、制御フラグは、ディスプレイ76に対して表示させる表示制御フラグ151を含む。したがって、撮像モードと温度測定モードとで、制御対象における制御内容を異ならせることの一例として、ディスプレイ76に表示される画像を異ならせることができる。 For example, in the first embodiment, the control flag includes the display control flag 151 for causing the display 76 to perform display. Therefore, as one example of making the control details for the controlled object differ between the imaging mode and the temperature measurement mode, the image displayed on the display 76 can be made different.
 また、第1実施形態では、制御フラグは、投光器14を動作させる投光制御フラグ152を含む。したがって、撮像モードと温度測定モードとで、制御対象における制御内容を異ならせることの一例として、投光器14の動作を異ならせることができる。 Also, in the first embodiment, the control flag includes the light projection control flag 152 that operates the light projector 14 . Therefore, the operation of the light projector 14 can be made different as an example of differentiating the control details in the controlled object between the imaging mode and the temperature measurement mode.
 このように、第1実施形態では、撮像モードと温度測定モードとで制御対象における制御内容を異ならせることができるので、例えば撮像モードと温度測定モードとで制御フラグが同じである場合に比して、撮像装置の動作を各モードに適した動作に制御することができる。 As described above, in the first embodiment, the control details for the controlled object can be made different between the imaging mode and the temperature measurement mode. Therefore, compared with a case where, for example, the control flag is the same in the imaging mode and the temperature measurement mode, the operation of the imaging device can be controlled to an operation suited to each mode.
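 As a rough illustration of the flag-per-mode scheme described above, the mapping from operation mode to control flags could be sketched as follows. This is a hypothetical sketch in Python; the identifiers (mode names, flag values) are illustrative and do not appear in the embodiment:

```python
# Hypothetical sketch: each operation mode selects a display control flag
# and a light-projection control flag, so the same controlled objects
# (display, light projector) behave differently per mode.

IMAGING = "imaging"
TEMPERATURE = "temperature_measurement"

CONTROL_FLAGS = {
    IMAGING:     {"display": "captured_image",  "projector": "on"},
    TEMPERATURE: {"display": "composite_image", "projector": "off"},
}

def flags_for_mode(mode):
    """Return the control flags a controller would set for `mode`."""
    return CONTROL_FLAGS[mode]
```

 Because the projector flag is part of the per-mode table, switching modes automatically toggles the projector without a separate user operation, mirroring the convenience point made above.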
 また、CPU61は、撮像モードでは、表示制御フラグ151として、光がイメージセンサ15によって受光されることで得られた撮像画像161Aをディスプレイ76に対して表示させる撮像画像表示フラグ151Aを設定する。これにより、温度情報を含まない撮像画像161Aがディスプレイ76に表示される。したがって、撮像モードでは、例えば撮像画像161Aが温度情報を含む場合に比して、撮像画像161Aの視認性を向上させることができる。 In addition, in the imaging mode, the CPU 61 sets, as the display control flag 151, a captured image display flag 151A that causes the display 76 to display a captured image 161A obtained by light being received by the image sensor 15. As a result, the captured image 161A, which does not contain temperature information, is displayed on the display 76. Therefore, in the imaging mode, the visibility of the captured image 161A can be improved as compared with a case where, for example, the captured image 161A includes temperature information.
 また、CPU61は、温度測定モードでは、表示制御フラグ151として、温度を示す温度情報を含む合成画像161Bをディスプレイ76に対して表示させる合成画像表示フラグ151Bを設定する。これにより、温度情報を含む合成画像161Bがディスプレイ76に表示される。したがって、温度測定モードでは、例えば温度情報がディスプレイ76に表示されない場合に比して、ユーザが被写体の温度を的確に把握することができる。 In addition, in the temperature measurement mode, the CPU 61 sets, as the display control flag 151, a composite image display flag 151B that causes the display 76 to display a composite image 161B including temperature information indicating temperature. As a result, the composite image 161B including the temperature information is displayed on the display 76. Therefore, in the temperature measurement mode, the user can accurately grasp the temperature of the subject as compared with a case where, for example, the temperature information is not displayed on the display 76.
 また、CPU61は、撮像モードでは、投光制御フラグ152として、投光器14をオンに切り替える投光オン制御フラグ152Aを設定する。これにより、撮像モードでは、投光器14がオンに切り替えられる。したがって、撮像モードでは、被写体から放射される光の量を確保することができる。これにより、例えば、火災で室内に煙が充満している状況下であっても、ユーザが撮像画像161Aを通じて室内環境を確認することができる。 In addition, the CPU 61 sets a light projection ON control flag 152A for switching the light projector 14 ON as the light projection control flag 152 in the imaging mode. Thereby, in imaging mode, the light projector 14 is switched on. Therefore, in the imaging mode, it is possible to ensure the amount of light emitted from the subject. This allows the user to check the indoor environment through the captured image 161A, for example, even in a situation where the room is filled with smoke due to a fire.
 また、CPU61は、温度測定モードでは、投光制御フラグ152として、投光器14をオフに切り替える投光オフ制御フラグ152Bを設定する。これにより、温度測定モードでは、投光器14がオフに切り替えられる。したがって、被写体から放射される近赤外光に投光器14による照明光が混在することを回避することができる。また、例えば被写体から放射される近赤外光に投光器14による照明光が混在する場合に比して、温度測定の測定精度を向上させることができる。 Also, in the temperature measurement mode, the CPU 61 sets a light projection off control flag 152B for switching off the light projector 14 as the light projection control flag 152 . This switches off the light projector 14 in the temperature measurement mode. Therefore, it is possible to prevent the illumination light from the light projector 14 from being mixed with the near-infrared light emitted from the subject. In addition, the measurement accuracy of temperature measurement can be improved as compared with, for example, the case where illumination light from the projector 14 is mixed with near-infrared light emitted from the object.
 また、CPU61は、撮像モードと温度測定モードとに切り替わることに応じて、投光器14をオンとオフに切り替える。したがって、例えばユーザが投光器14をオンとオフに切り替える必要がある場合に比して、利便性を高めることができる。 In addition, the CPU 61 switches the light projector 14 on and off in response to switching between the imaging mode and the temperature measurement mode. Therefore, convenience can be improved compared to, for example, the case where the user has to switch the light projector 14 on and off.
 また、CPU61は、温度測定モードでは、イメージセンサ15によって受光されることで得られた撮像画像と、温度を示す温度情報とを合成した合成画像161Bを出力する。したがって、合成画像161Bがディスプレイ76に表示されることにより、ユーザが被写体の状況と温度を容易に把握できる。 Also, in the temperature measurement mode, the CPU 61 outputs a composite image 161B obtained by combining a captured image, obtained by light being received by the image sensor 15, with temperature information indicating temperature. Therefore, by displaying the composite image 161B on the display 76, the user can easily grasp the state and temperature of the subject.
 次に、第1実施形態の変形例について説明する。 Next, a modified example of the first embodiment will be described.
 第1実施形態では、一例として図8に示すように、建物の窓162の室内側に設けられたカーテン163に火164がついている撮像画像161Aがディスプレイ76に表示される例が示されているが、撮像画像161Aは、どのような画像でもよい。例えば、一例として図18に示すように、撮像画像161Aは、人の腕170が撮像されることで得られた画像でもよい。 In the first embodiment, as shown in FIG. 8 as an example, a captured image 161A in which a curtain 163 provided on the indoor side of a window 162 of a building has caught fire 164 is displayed on the display 76, but the captured image 161A may be any image. For example, as shown in FIG. 18 as an example, the captured image 161A may be an image obtained by imaging a person's arm 170.
 また、第1実施形態では、一例として図11から図14に示すように、建物の窓の室内側に設けられたカーテンに火がついている撮像画像と温度情報とを合成した合成画像161Bがディスプレイ76に表示される例が示されているが、合成画像161Bは、撮像画像と温度情報とが合成された画像であれば、どのような画像でもよい。例えば、一例として図19に示すように、合成画像161Bは、人の腕170が撮像されることで得られた撮像画像171に温度分布を示す温度情報172が合成された画像でもよい。合成は、例えば、アルファブレンドによる複数の画像の重畳であってもよいし、撮像画像に対する温度情報の埋め込みであってもよい。 Further, in the first embodiment, as shown in FIGS. 11 to 14 as an example, a composite image 161B in which a captured image of a burning curtain provided on the indoor side of a window of a building is combined with temperature information is displayed on the display 76. However, the composite image 161B may be any image as long as it is an image in which a captured image and temperature information are combined. For example, as shown in FIG. 19 as an example, the composite image 161B may be an image in which temperature information 172 indicating a temperature distribution is combined with a captured image 171 obtained by imaging a person's arm 170. The combination may be, for example, superimposition of a plurality of images by alpha blending, or embedding of the temperature information in the captured image.
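 The alpha-blend option mentioned above can be sketched as follows. This is a minimal illustration in pure Python; the image sizes, pixel values, and the fixed blending ratio are assumptions for the example, not values from the embodiment:

```python
# Alpha blending of a captured image with a temperature overlay:
# out = alpha * overlay + (1 - alpha) * captured, applied per pixel.

def alpha_blend(captured, overlay, alpha):
    """Blend two equally sized grayscale images given as nested lists."""
    return [
        [alpha * o + (1.0 - alpha) * c for c, o in zip(crow, orow)]
        for crow, orow in zip(captured, overlay)
    ]

captured = [[100, 100], [100, 100]]  # stand-in for the captured image 171
overlay  = [[200,   0], [0,   200]]  # stand-in for the temperature information 172
blended = alpha_blend(captured, overlay, alpha=0.5)
```

 With alpha = 0.5 both sources contribute equally, so the scene remains visible under the temperature overlay; raising alpha emphasizes the temperature information instead.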
 また、第1実施形態では、図6に示すフラグ設定部113によって投光オフ制御フラグ152Bが設定される代わりに、フラグ設定部113によって投光抑制制御フラグが設定されてもよい。また、フラグ設定部113によって投光抑制制御フラグが設定された場合、投光制御部114は、投光制御回路73に投光抑制指令を出力し、投光制御回路73は、投光抑制指令に従って投光器14の投光を抑制(つまり投光器14から投光される光の量を抑制)させてもよい。抑制とは、例えば、光の量を基準量よりも小さくする動作を指す。基準量は、固定値であってもよいし、ユーザが入力デバイス78に入力した指示、及び/又は、各種条件(例えば、被写体の温度及び/又は撮像条件など)に従って変更される可変値であってもよい。 Further, in the first embodiment, instead of the light projection off control flag 152B being set by the flag setting unit 113 shown in FIG. 6, a light projection suppression control flag may be set by the flag setting unit 113. When the light projection suppression control flag is set by the flag setting unit 113, the light projection control unit 114 may output a light projection suppression command to the light projection control circuit 73, and the light projection control circuit 73 may suppress the light projection of the light projector 14 (that is, suppress the amount of light projected from the light projector 14) in accordance with the light projection suppression command. Suppression refers to, for example, an operation of making the amount of light smaller than a reference amount. The reference amount may be a fixed value, or may be a variable value that is changed according to an instruction input by the user into the input device 78 and/or various conditions (for example, the temperature of the subject and/or imaging conditions).
 [第2実施形態]
 次に、第2実施形態について説明する。
[Second embodiment]
Next, a second embodiment will be described.
 第2実施形態では、第1実施形態に対し、カメラ1の構成が次のように変更されている。以下、第2実施形態について第1実施形態と異なる点を説明する。 In the second embodiment, the configuration of the camera 1 is changed as follows from the first embodiment. The points of the second embodiment that are different from the first embodiment will be described below.
 一例として図20に示すように、モード切替処理部110は、状態信号取得部181、状態信号判定部182、フラグ設定部113、投光制御部114、及びモード設定部115を有する。 As shown in FIG. 20 as an example, the mode switching processing unit 110 has a state signal acquisition unit 181, a state signal determination unit 182, a flag setting unit 113, a light projection control unit 114, and a mode setting unit 115.
 状態信号取得部181は、投光器14の動作の状態に応じて投光制御回路73から出力された状態信号を取得する。投光制御回路73は、投光器14がオンの状態では、投光器14がオンの状態を表すオン状態信号を状態信号として出力し、投光器14がオフの状態では、投光器14がオフの状態を表すオフ状態信号を状態信号として出力する。 The state signal acquisition unit 181 acquires the state signal output from the light projection control circuit 73 according to the operating state of the light projector 14. The light projection control circuit 73 outputs an on state signal representing the on state of the light projector 14 as the state signal when the light projector 14 is on, and outputs an off state signal representing the off state of the light projector 14 as the state signal when the light projector 14 is off.
 状態信号判定部182は、状態信号取得部181によって取得された状態信号が、投光器14がオンの状態を表すオン状態信号であるか否かを判定する。 The state signal determination unit 182 determines whether the state signal acquired by the state signal acquisition unit 181 is an ON state signal indicating that the light projector 14 is ON.
 フラグ設定部113は、状態信号判定部182によって状態信号がオン状態信号であると判定された場合には、表示制御フラグ記憶領域141に表示制御フラグ151として撮像画像表示フラグ151Aを設定する。また、フラグ設定部113は、状態信号判定部182によって状態信号がオン状態信号ではないと判定された場合には、表示制御フラグ記憶領域141に表示制御フラグ151として合成画像表示フラグ151Bを設定する。 When the state signal determination unit 182 determines that the state signal is an on state signal, the flag setting unit 113 sets the captured image display flag 151A as the display control flag 151 in the display control flag storage area 141. When the state signal determination unit 182 determines that the state signal is not an on state signal, the flag setting unit 113 sets the composite image display flag 151B as the display control flag 151 in the display control flag storage area 141.
 モード設定部115は、表示制御フラグ151として撮像画像表示フラグ151Aが設定された場合には、CPU61の動作モードとして撮像モードを設定する。また、モード設定部115は、表示制御フラグ151として合成画像表示フラグ151Bが設定された場合には、CPU61のモードとして温度測定モードを設定する。 The mode setting unit 115 sets the imaging mode as the operation mode of the CPU 61 when the captured image display flag 151A is set as the display control flag 151. Further, when the composite image display flag 151B is set as the display control flag 151, the mode setting unit 115 sets the temperature measurement mode as the operation mode of the CPU 61.
 第2実施形態において、撮像モードは、本開示の技術に係る「第1モード」の一例であり、温度測定モードは、本開示の技術に係る「第2モード」の一例である。また、表示制御フラグ151は、本開示の技術に係る「制御因子」の一例である。また、ディスプレイ76は、本開示の技術に係る「制御対象」の一例であり、ディスプレイ76に表示される画像は、本開示の技術に係る「制御対象に対する制御内容」の一例である。また、表示制御フラグ151は、本開示の技術に係る「表示制御因子」の一例であり、撮像画像表示フラグ151Aは、本開示の技術に係る「撮像画像表示因子」の一例であり、合成画像表示フラグ151Bは、本開示の技術に係る「温度情報表示因子」の一例である。 In the second embodiment, the imaging mode is an example of the "first mode" according to the technology of the present disclosure, and the temperature measurement mode is an example of the "second mode" according to the technology of the present disclosure. Also, the display control flag 151 is an example of a “control factor” according to the technology of the present disclosure. In addition, the display 76 is an example of the "controlled object" according to the technique of the present disclosure, and the image displayed on the display 76 is an example of "control details for the controlled object" according to the technique of the present disclosure. Further, the display control flag 151 is an example of the "display control factor" according to the technology of the present disclosure, the captured image display flag 151A is an example of the "captured image display factor" according to the technology of the present disclosure, and the composite image The display flag 151B is an example of a "temperature information display factor" according to the technology of the present disclosure.
 次に、第2実施形態の作用として、カメラ1の制御方法について説明する。 Next, a method for controlling the camera 1 will be described as an operation of the second embodiment.
 第2実施形態では、撮像処理部120が実行する撮像処理、及び温度測定処理部130が実行する温度測定処理は、第1実施形態と同じである。第2実施形態では、モード切替処理部110が実行するモード切替処理が、第1実施形態と異なる。以下、図21を参照しながら、第2実施形態に係るモード切替処理部110が実行するモード切替処理について説明する。 In the second embodiment, the imaging processing performed by the imaging processing unit 120 and the temperature measurement processing performed by the temperature measurement processing unit 130 are the same as in the first embodiment. In the second embodiment, mode switching processing executed by the mode switching processing unit 110 is different from that in the first embodiment. Mode switching processing executed by the mode switching processing unit 110 according to the second embodiment will be described below with reference to FIG. 21 .
 ステップS41で、状態信号取得部181は、投光器14の動作の状態に応じて投光制御回路73から出力された状態信号を取得する。 In step S41, the state signal acquisition unit 181 acquires the state signal output from the light projection control circuit 73 according to the operating state of the light projector 14.
 ステップS42で、状態信号判定部182は、状態信号がオン状態信号であるか否かを判定する。ステップS42において、オン状態信号であると判定された場合、図21に示す処理は、ステップS43に移行し、オン状態信号ではないと判定された場合、図21に示す処理は、ステップS45に移行する。 In step S42, the state signal determination unit 182 determines whether or not the state signal is an on state signal. If it is determined in step S42 that the state signal is an on state signal, the processing shown in FIG. 21 proceeds to step S43, and if it is determined that the state signal is not an on state signal, the processing shown in FIG. 21 proceeds to step S45.
 ステップS43で、フラグ設定部113は、表示制御フラグ記憶領域141に表示制御フラグ151として撮像画像表示フラグ151Aを設定する。 In step S43, the flag setting unit 113 sets the captured image display flag 151A as the display control flag 151 in the display control flag storage area 141.
 ステップS44で、モード設定部115は、CPU61の動作モードとして撮像モードを設定する。 At step S44, the mode setting unit 115 sets the imaging mode as the operation mode of the CPU 61.
 ステップS45で、フラグ設定部113は、表示制御フラグ記憶領域141に表示制御フラグ151として合成画像表示フラグ151Bを設定する。 In step S45, the flag setting unit 113 sets the composite image display flag 151B as the display control flag 151 in the display control flag storage area 141.
 ステップS46で、モード設定部115は、CPU61の動作モードとして温度測定モードを設定する。 At step S46, the mode setting unit 115 sets the temperature measurement mode as the operation mode of the CPU 61.
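 The state-signal-driven mode selection of steps S41 to S46 can be summarized in a short sketch. This is hypothetical Python; the signal values and returned identifiers are illustrative and do not appear in the embodiment:

```python
# Sketch of the second embodiment's mode switching: the operation mode
# follows the projector's state signal rather than a user instruction.

def select_mode(state_signal):
    """Map the projector state signal to an operation mode and display flag."""
    if state_signal == "on":
        # Steps S43-S44: projector on -> imaging mode, show captured image.
        return {"mode": "imaging", "display_flag": "captured_image"}
    # Steps S45-S46: projector off -> temperature measurement mode,
    # show the composite image with temperature information.
    return {"mode": "temperature_measurement", "display_flag": "composite_image"}
```

 The branch mirrors the determination in step S42: any signal other than the on state signal is treated as the off state, which selects the temperature measurement mode.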
 第2実施形態に係るカメラ1の制御方法は、本開示の技術に係る「制御方法」の一例である。 The control method of the camera 1 according to the second embodiment is an example of the "control method" according to the technology of the present disclosure.
 次に、第2実施形態の効果について第1実施形態と異なる点を説明する。 Next, regarding the effects of the second embodiment, the differences from the first embodiment will be described.
 第2実施形態では、CPU61は、投光器14の状態がオンである場合には、撮像モードを設定し、投光器14の状態がオフである場合には、温度測定モードを設定する。したがって、例えば投光器14のオンオフの動作に応じて撮像モードと温度測定モードとに切り替わらない場合に比して、利便性を高めることができる。 In the second embodiment, the CPU 61 sets the imaging mode when the light projector 14 is on, and sets the temperature measurement mode when the light projector 14 is off. Therefore, convenience can be improved as compared with the case where the imaging mode and the temperature measurement mode are not switched according to the ON/OFF operation of the light projector 14, for example.
 [第3実施形態]
 次に、第3実施形態について説明する。
[Third embodiment]
Next, a third embodiment will be described.
 第3実施形態では、第1実施形態に対し、カメラ1の構成が次のように変更されている。以下、第3実施形態について第1実施形態と異なる点を説明する。 In the third embodiment, the configuration of the camera 1 is changed as follows from the first embodiment. The points of the third embodiment that are different from the first embodiment will be described below.
 一例として図22に示すように、CPU61は、上述のモード切替処理部110、撮像処理部120、及び温度測定処理部130に加えて、パラメータ変更処理部190として機能する。パラメータ変更処理部190は、CPU61のモードが撮像モードである場合、及び温度測定モードである場合に動作する処理部である。パラメータ変更処理部190は、撮像モードと温度測定モードとで異なるモード別パラメータ211を設定する処理部である。パラメータ変更処理部190は、モード判定部191及びモード別パラメータ設定部192を有する。 As an example, as shown in FIG. 22, the CPU 61 functions as a parameter change processing unit 190 in addition to the mode switching processing unit 110, imaging processing unit 120, and temperature measurement processing unit 130 described above. The parameter change processing unit 190 is a processing unit that operates when the mode of the CPU 61 is the imaging mode or the temperature measurement mode. The parameter change processing unit 190 sets mode-specific parameters 211 that differ between the imaging mode and the temperature measurement mode. The parameter change processing unit 190 has a mode determination unit 191 and a mode-specific parameter setting unit 192.
 モード判定部191は、CPU61の動作モードが撮像モードであるのか、又は温度測定モードであるのかを判定する。 The mode determination unit 191 determines whether the operation mode of the CPU 61 is the imaging mode or the temperature measurement mode.
 RAM63には、モード別パラメータ211を記憶するパラメータ記憶領域201が設けられている。 The RAM 63 is provided with a parameter storage area 201 for storing mode-specific parameters 211 .
 モード別パラメータ設定部192は、モード判定部191によってCPU61の動作モードが撮像モードであると判定された場合には、撮像条件等に基づいて各種パラメータ設定処理を行うことにより、撮像モードに対応する撮像モード用パラメータ211Aを導出する。そして、モード別パラメータ設定部192は、パラメータ記憶領域201にモード別パラメータ211として撮像モード用パラメータ211Aを設定する。 When the mode determination unit 191 determines that the operation mode of the CPU 61 is the imaging mode, the mode-specific parameter setting unit 192 derives the imaging mode parameters 211A corresponding to the imaging mode by performing various parameter setting processes based on the imaging conditions and the like. Then, the mode-specific parameter setting unit 192 sets the imaging mode parameters 211A as the mode-specific parameters 211 in the parameter storage area 201.
 また、モード別パラメータ設定部192は、モード判定部191によってCPU61の動作モードが温度測定モードであると判定された場合には、温度測定条件等に基づいて各種パラメータ設定処理を行うことにより、温度測定モードに対応する温度測定モード用パラメータ211Bを導出する。そして、モード別パラメータ設定部192は、パラメータ記憶領域201にモード別パラメータ211として温度測定モード用パラメータ211Bを設定する。 Further, when the mode determination unit 191 determines that the operation mode of the CPU 61 is the temperature measurement mode, the mode-specific parameter setting unit 192 derives the temperature measurement mode parameters 211B corresponding to the temperature measurement mode by performing various parameter setting processes based on the temperature measurement conditions and the like. Then, the mode-specific parameter setting unit 192 sets the temperature measurement mode parameters 211B as the mode-specific parameters 211 in the parameter storage area 201.
 撮像モード用パラメータ211Aは、撮像に関して設定された第1撮像設定パラメータ212A、及び撮像画像に対する画像処理に関して設定された第1画像処理設定パラメータ213Aを含む。同様に、温度測定モード用パラメータ211Bは、撮像に関して設定された第2撮像設定パラメータ212B、及び撮像画像に対する画像処理に関して設定された第2画像処理設定パラメータ213Bを含む。 The imaging mode parameters 211A include a first imaging setting parameter 212A set for imaging and a first image processing setting parameter 213A set for image processing of the captured image. Similarly, the temperature measurement mode parameters 211B include a second imaging setting parameter 212B set for imaging and a second image processing setting parameter 213B set for image processing of the captured image.
 第1撮像設定パラメータ212A及び第2撮像設定パラメータ212Bは、投光器14に関するパラメータ、シャッタスピードに関するパラメータ、絞り33に関するパラメータ、測光に関するパラメータ、イメージセンサ15の感度に関するパラメータ、ハイダイナミックレンジに関するパラメータ、及び防振制御に関するパラメータを含む。イメージセンサ15の感度に関するパラメータは、イメージセンサ15のゲインに関するパラメータ及び/又はイメージセンサ15の変換効率に関するパラメータを含む。 The first imaging setting parameters 212A and the second imaging setting parameters 212B include a parameter related to the light projector 14, a parameter related to shutter speed, a parameter related to the diaphragm 33, a parameter related to photometry, a parameter related to the sensitivity of the image sensor 15, a parameter related to high dynamic range, and a parameter related to anti-vibration control. The parameters relating to the sensitivity of the image sensor 15 include a parameter relating to the gain of the image sensor 15 and/or a parameter relating to the conversion efficiency of the image sensor 15.
 第1画像処理設定パラメータ213A及び第2画像処理設定パラメータ213Bは、ノイズリダクションに関するパラメータ、シャープネスに関するパラメータ、コントラストに関するパラメータ、及びトーンに関するパラメータを含む。 The first image processing setting parameter 213A and the second image processing setting parameter 213B include noise reduction parameters, sharpness parameters, contrast parameters, and tone parameters.
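 A minimal sketch of how the mode-specific parameters 211A and 211B might be organized follows. The names and boolean placeholders are assumptions for illustration; the embodiment only fixes each parameter's relation to a reference value (for example, shutter speed at or above the reference in the imaging mode):

```python
# Illustrative mode-specific parameter sets. Booleans stand in for the
# "at or above / below the reference value" relations described in the text.

IMAGING_MODE_PARAMS = {          # corresponds to 211A
    "imaging": {                 # first imaging setting parameters (212A)
        "projector": True,
        "shutter_at_least_reference": True,
        "hdr": True,
        "stabilization": True,
    },
    "image_processing": {        # first image processing setting parameters (213A)
        "noise_reduction_above_reference": True,
    },
}

TEMPERATURE_MODE_PARAMS = {      # corresponds to 211B
    "imaging": {                 # second imaging setting parameters (212B)
        "projector": False,
        "shutter_at_least_reference": False,
        "hdr": False,
        "stabilization": False,
    },
    "image_processing": {        # second image processing setting parameters (213B)
        "noise_reduction_above_reference": False,
    },
}

def params_for(mode):
    """Return the parameter set the sketch associates with `mode`."""
    return IMAGING_MODE_PARAMS if mode == "imaging" else TEMPERATURE_MODE_PARAMS
```

 Keeping the two sets as complete, parallel structures lets a mode switch replace all settings at once, instead of toggling each parameter individually.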
 撮像モード用パラメータ211Aに含まれる第1撮像設定パラメータ212Aは、一例として、次のように設定される。
 投光器14に関するパラメータは、投光器14が投光を行うパラメータに設定される。
 シャッタスピードに関するパラメータは、シャッタスピードを基準スピード以上にするパラメータに設定される。シャッタスピードとは、メカニカルシャッタにおける先幕が開き始めてから後幕が閉じ終わるまでの間の時間、電子先幕シャッタが作動してからメカニカルシャッタの後幕が閉じ終わるまでの間の時間、又は電子シャッタが作動し始めてから作動し終わるまでの間の時間によって規定されている。基準スピードは、固定値であってもよいし、ユーザが入力デバイス78に入力した指示、及び/又は、各種条件(例えば、被写体の温度及び/又は撮像条件など)に従って変更される可変値であってもよい。
 絞り33に関するパラメータは、絞り量を基準絞り量以上にするパラメータに設定される。絞り量とは、絞り33に設けられた開口33Aの口径に比例する。基準絞り量は、固定値であってもよいし、ユーザが入力デバイス78に入力した指示、及び/又は、各種条件(例えば、被写体の温度及び/又は撮像条件など)に従って変更される可変値であってもよい。
 測光に関するパラメータは、測光をアベレージ測光方式又はマルチパターン測光方式で行うパラメータに設定される。測光とは、被写体の明るさを測ることである。
 イメージセンサ15のゲインに関するパラメータは、イメージセンサ15のゲインを基準ゲイン以上にするパラメータに設定される。イメージセンサ15のゲインとは、例えば、イメージセンサ15のフォトダイオードに接続されているA/D変換器(図示省略)のアナログゲインを指す。基準ゲインは、固定値であってもよいし、ユーザが入力デバイス78に入力した指示、及び/又は、各種条件(例えば、被写体の温度及び/又は撮像条件など)に従って変更される可変値であってもよい。
 イメージセンサ15の変換効率に関するパラメータは、イメージセンサ15の変換効率を基準変換効率以上にするパラメータに設定される。イメージセンサ15の変換効率とは、イメージセンサ15に含まれるフォトダイオードに接続されている可変コンデンサ(図示省略)に蓄積された電荷を電圧に変換する効率を指す。基準変換効率は、固定値であってもよいし、ユーザが入力デバイス78に入力した指示、及び/又は、各種条件(例えば、被写体の温度及び/又は撮像条件など)に従って変更される可変値であってもよい。
 ハイダイナミックレンジに関するパラメータは、ハイダイナミックレンジをオンにするパラメータに設定される。ハイダイナミックレンジとは、撮像画像161Aの明部と暗部のコントラスト(つまり明暗比)を向上させる表示技術である。ハイダイナミックレンジをオンにするとは、ダイナミックレンジを基準レンジよりも拡げることである。基準レンジは、固定値であってもよいし、ユーザが入力デバイス78に入力した指示、及び/又は、各種条件(例えば、被写体の温度及び/又は撮像条件など)に従って変更される可変値であってもよい。
 防振制御に関するパラメータは、防振制御をオンにするパラメータに設定される。防振制御とは、像のぶれが補正される方向へぶれ補正レンズ34を移動させる制御のことである。防振制御をオンにするとは、ぶれ補正レンズ34を移動させる制御を行うことである。
As an example, the first imaging setting parameters 212A included in the imaging mode parameters 211A are set as follows.
The parameter related to the light projector 14 is set to a parameter with which the light projector 14 projects light.
The parameter related to shutter speed is set to a parameter that makes the shutter speed equal to or higher than a reference speed. The shutter speed is defined by the time from when the front curtain of a mechanical shutter starts to open until the rear curtain finishes closing, the time from when an electronic front curtain shutter operates until the rear curtain of the mechanical shutter finishes closing, or the time from when an electronic shutter starts operating until it finishes operating. The reference speed may be a fixed value, or may be a variable value that is changed according to an instruction input by the user into the input device 78 and/or various conditions (for example, the temperature of the subject and/or imaging conditions).
The parameter related to the diaphragm 33 is set to a parameter that makes the diaphragm amount equal to or greater than a reference diaphragm amount. The diaphragm amount is proportional to the diameter of the aperture 33A provided in the diaphragm 33. The reference diaphragm amount may be a fixed value, or may be a variable value that is changed according to an instruction input by the user into the input device 78 and/or various conditions (for example, the temperature of the subject and/or imaging conditions).
The parameter related to photometry is set to a parameter with which photometry is performed by an average photometry method or a multi-pattern photometry method. Photometry is the measurement of the brightness of a subject.
The parameter related to the gain of the image sensor 15 is set to a parameter that makes the gain of the image sensor 15 equal to or greater than a reference gain. The gain of the image sensor 15 refers to, for example, the analog gain of an A/D converter (not shown) connected to the photodiodes of the image sensor 15. The reference gain may be a fixed value, or may be a variable value that is changed according to an instruction input by the user into the input device 78 and/or various conditions (for example, the temperature of the subject and/or imaging conditions).
The parameter related to the conversion efficiency of the image sensor 15 is set to a parameter that makes the conversion efficiency of the image sensor 15 equal to or higher than a reference conversion efficiency. The conversion efficiency of the image sensor 15 refers to the efficiency with which electric charge accumulated in a variable capacitor (not shown) connected to a photodiode included in the image sensor 15 is converted into voltage. The reference conversion efficiency may be a fixed value, or may be a variable value that is changed according to an instruction input by the user into the input device 78 and/or various conditions (for example, the temperature of the subject and/or imaging conditions).
The parameter related to high dynamic range is set to a parameter that turns on the high dynamic range. The high dynamic range is a display technique that improves the contrast between the bright and dark portions (that is, the contrast ratio) of the captured image 161A. Turning on the high dynamic range means expanding the dynamic range beyond a reference range. The reference range may be a fixed value, or may be a variable value that is changed according to an instruction input by the user into the input device 78 and/or various conditions (for example, the temperature of the subject and/or imaging conditions).
The parameter related to anti-vibration control is set to a parameter that turns on anti-vibration control. Anti-vibration control is control that moves the blur correction lens 34 in a direction in which image blur is corrected. Turning on anti-vibration control means performing control to move the blur correction lens 34.
 ノイズリダクションに関するパラメータは、ノイズリダクションの強弱の度合を第1基準度合よりも大きくするパラメータに設定される。ノイズリダクションとは、撮像画像に出現するノイズを減少させる画像処理であり、ノイズリダクションの強弱とは、撮像画像に出現するノイズを減少させる割合を増減させることである。第1基準度合いは、固定値であってもよいし、ユーザが入力デバイス78に入力した指示、及び/又は、各種条件(例えば、被写体の温度及び/又は撮像条件など)に従って変更される可変値であってもよい。
 シャープネスに関するパラメータは、シャープネスの強弱の度合を第2基準度合よりも大きくするパラメータに設定される。シャープネスとは、撮像画像の輪郭の強調のことであり、シャープネスの強弱とは、撮像画像の輪郭を強調する割合を増減させることである。第2基準度合は、固定値であってもよいし、ユーザが入力デバイス78に入力した指示、及び/又は、各種条件(例えば、被写体の温度及び/又は撮像条件など)に従って変更される可変値であってもよい。
 コントラストに関するパラメータは、コントラストの強弱の度合を第3基準度合よりも大きくするパラメータに設定される。コントラストとは、撮像画像の明暗及び/又は色彩の差異のことであり、コントラストの強弱とは、撮像画像の明暗及び/又は色彩の差異を増減させることである。第3基準度合は、固定値であってもよいし、ユーザが入力デバイス78に入力した指示、及び/又は、各種条件(例えば、被写体の温度及び/又は撮像条件など)に従って変更される可変値であってもよい。
 トーンに関するパラメータは、トーンの強弱の度合を第4基準度合よりも大きくするパラメータに設定される。トーンとは、撮像画像の色調のことであり、トーンの強弱とは、撮像画像の色調の多少を増減させることである。第4基準度合は、固定値であってもよいし、ユーザが入力デバイス78に入力した指示、及び/又は、各種条件(例えば、被写体の温度及び/又は撮像条件など)に従って変更される可変値であってもよい。
A parameter related to noise reduction is set to a parameter that makes the degree of noise reduction greater than a first reference degree. Noise reduction is image processing that reduces noise appearing in a captured image, and the degree of noise reduction refers to increasing or decreasing the rate at which noise appearing in the captured image is reduced. The first reference degree may be a fixed value, or may be a variable value that is changed according to an instruction input by the user into the input device 78 and/or various conditions (for example, the temperature of the subject and/or imaging conditions).
A parameter related to sharpness is set to a parameter that makes the degree of sharpness greater than a second reference degree. Sharpness refers to enhancement of contours in the captured image, and the degree of sharpness refers to increasing or decreasing the rate at which the contours of the captured image are enhanced. The second reference degree may be a fixed value, or may be a variable value that is changed according to an instruction input by the user into the input device 78 and/or various conditions (for example, the temperature of the subject and/or imaging conditions).
A parameter related to contrast is set to a parameter that makes the degree of contrast greater than a third reference degree. Contrast refers to differences in brightness and/or color in the captured image, and the degree of contrast refers to increasing or decreasing those differences. The third reference degree may be a fixed value, or may be a variable value that is changed according to an instruction input by the user into the input device 78 and/or various conditions (for example, the temperature of the subject and/or imaging conditions).
A parameter related to tone is set to a parameter that makes the degree of tone greater than a fourth reference degree. Tone refers to the color tone of the captured image, and the degree of tone refers to increasing or decreasing the color tone of the captured image. The fourth reference degree may be a fixed value, or may be a variable value that is changed according to an instruction input by the user into the input device 78 and/or various conditions (for example, the temperature of the subject and/or imaging conditions).
 また、温度測定モード用パラメータ211Bに含まれる第2撮像設定パラメータ212Bは、一例として、次のように設定される。
 投光器14に関するパラメータは、投光器14が投光を行わないパラメータに設定される。
 シャッタスピードに関するパラメータは、シャッタスピードを基準スピード未満にするパラメータに設定される。基準スピードは、固定値であってもよいし、ユーザが入力デバイス78に入力した指示、及び/又は、各種条件(例えば、被写体の温度及び/又は撮像条件など)に従って変更される可変値であってもよい。
 絞り33に関するパラメータは、絞り量を基準量未満にするパラメータに設定される。基準絞り量は、固定値であってもよいし、ユーザが入力デバイス78に入力した指示、及び/又は、各種条件(例えば、被写体の温度及び/又は撮像条件など)に従って変更される可変値であってもよい。
 測光に関するパラメータは、測光をハイライト重点測光方式、中央重点測光方式、又はスポット測光方式で行うパラメータに設定される。
 イメージセンサ15のゲインに関するパラメータは、イメージセンサ15のゲインを基準ゲイン未満にするパラメータに設定される。基準ゲインは、固定値であってもよいし、ユーザが入力デバイス78に入力した指示、及び/又は、各種条件(例えば、被写体の温度及び/又は撮像条件など)に従って変更される可変値であってもよい。
 イメージセンサ15の変換効率に関するパラメータは、イメージセンサ15の光電変換効率を基準変換効率未満にするパラメータに設定される。基準変換効率は、固定値であってもよいし、ユーザが入力デバイス78に入力した指示、及び/又は、各種条件(例えば、被写体の温度及び/又は撮像条件など)に従って変更される可変値であってもよい。
 ハイダイナミックレンジに関するパラメータは、ハイダイナミックレンジをオフにするパラメータに設定される。ハイダイナミックレンジをオフにするとは、ダイナミックレンジを基準レンジに設定することである。基準レンジは、固定値であってもよいし、ユーザが入力デバイス78に入力した指示、及び/又は、各種条件(例えば、被写体の温度及び/又は撮像条件など)に従って変更される可変値であってもよい。
 防振制御に関するパラメータは、防振制御をオフにするパラメータに設定される。
Also, as an example, the second imaging setting parameters 212B included in the temperature measurement mode parameters 211B are set as follows.
The parameter related to the light projector 14 is set to a parameter with which the light projector 14 does not project light.
The parameter related to shutter speed is set to a parameter that makes the shutter speed lower than the reference speed. The reference speed may be a fixed value, or may be a variable value that is changed according to an instruction input by the user into the input device 78 and/or various conditions (for example, the temperature of the subject and/or imaging conditions).
The parameter related to the diaphragm 33 is set to a parameter that makes the diaphragm amount smaller than the reference diaphragm amount. The reference diaphragm amount may be a fixed value, or may be a variable value that is changed according to an instruction input by the user into the input device 78 and/or various conditions (for example, the temperature of the subject and/or imaging conditions).
The parameter related to photometry is set to a parameter with which photometry is performed by a highlight-weighted photometry method, a center-weighted photometry method, or a spot photometry method.
The parameter related to the gain of the image sensor 15 is set to a parameter that makes the gain of the image sensor 15 lower than the reference gain. The reference gain may be a fixed value, or may be a variable value that is changed according to an instruction input by the user into the input device 78 and/or various conditions (for example, the temperature of the subject and/or imaging conditions).
The parameter related to the conversion efficiency of the image sensor 15 is set to a parameter that makes the photoelectric conversion efficiency of the image sensor 15 lower than the reference conversion efficiency. The reference conversion efficiency may be a fixed value, or may be a variable value that is changed according to an instruction input by the user into the input device 78 and/or various conditions (for example, the temperature of the subject and/or imaging conditions).
The parameter related to high dynamic range is set to a parameter that turns off the high dynamic range. Turning off the high dynamic range means setting the dynamic range to the reference range. The reference range may be a fixed value, or may be a variable value that is changed according to an instruction input by the user into the input device 78 and/or various conditions (for example, the temperature of the subject and/or imaging conditions).
The parameter related to anti-vibration control is set to a parameter that turns off anti-vibration control.
Further, as an example, the second image processing setting parameters 213B included in the temperature measurement mode parameters 211B are set as follows.
The parameter relating to noise reduction is set so that the degree of noise reduction is equal to or less than a first reference degree. The first reference degree may be a fixed value, or may be a variable value that is changed according to an instruction input by the user via the input device 78 and/or various conditions (for example, the temperature of the subject and/or the imaging conditions).
The parameter relating to sharpness is set so that the degree of sharpness is equal to or less than a second reference degree. The second reference degree may be a fixed value, or may be a variable value that is changed according to an instruction input by the user via the input device 78 and/or various conditions (for example, the temperature of the subject and/or the imaging conditions).
The parameter relating to contrast is set so that the degree of contrast is equal to or less than a third reference degree. The third reference degree may be a fixed value, or may be a variable value that is changed according to an instruction input by the user via the input device 78 and/or various conditions (for example, the temperature of the subject and/or the imaging conditions).
The parameter relating to tone is set so that the degree of tone is equal to or less than a fourth reference degree. The fourth reference degree may be a fixed value, or may be a variable value that is changed according to an instruction input by the user via the input device 78 and/or various conditions (for example, the temperature of the subject and/or the imaging conditions).
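The two mode-specific parameter sets described above can be pictured as a small lookup function. The following is a hypothetical sketch only: the function name, the dictionary keys, and the string encodings of "at or above" versus "below" a reference value are illustrative assumptions and do not appear in the patent.

```python
def build_mode_params(mode):
    """Return a mode-specific parameter set (cf. parameters 211A / 211B)."""
    if mode == "imaging":
        imaging = {            # cf. first imaging setting parameters 212A
            "projector": "on",
            "shutter_speed": "at_or_above_reference",
            "diaphragm_amount": "at_or_above_reference",
            "photometry": "average",          # or "multi_pattern"
            "gain": "at_or_above_reference",
            "conversion_efficiency": "at_or_above_reference",
            "high_dynamic_range": "on",
            "anti_vibration": "on",
        }
        processing = {         # cf. first image processing setting parameters 213A
            "noise_reduction": "above_reference",
            "sharpness": "above_reference",
            "contrast": "above_reference",
            "tone": "above_reference",
        }
    elif mode == "temperature":
        imaging = {            # cf. second imaging setting parameters 212B
            "projector": "off",
            "shutter_speed": "below_reference",
            "diaphragm_amount": "below_reference",
            "photometry": "spot",             # or highlight-/center-weighted
            "gain": "below_reference",
            "conversion_efficiency": "below_reference",
            "high_dynamic_range": "off",
            "anti_vibration": "off",
        }
        processing = {         # cf. second image processing setting parameters 213B
            "noise_reduction": "at_or_below_reference",
            "sharpness": "at_or_below_reference",
            "contrast": "at_or_below_reference",
            "tone": "at_or_below_reference",
        }
    else:
        raise ValueError(f"unknown mode: {mode}")
    return {"imaging": imaging, "image_processing": processing}
```

The point of the sketch is only that every setting listed in the text has a distinct value per mode; the reference values themselves may be fixed or variable, as stated above.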
As an example, as shown in FIG. 23, when the CPU 61 is in the imaging mode, the imaging control unit 123 performs imaging settings relating to imaging in accordance with the first imaging setting parameters 212A included in the imaging mode parameters 211A stored in the parameter storage area 201.
(1) As the setting relating to the light projector 14, the imaging control unit 123 sets the light projector 14 to project light, and outputs an ON command to the light projection control circuit 73.
(2) As the setting relating to the shutter speed, the imaging control unit 123 sets the shutter speed to the reference speed or higher, and adjusts the shutter speed accordingly.
(3) As the setting relating to the diaphragm 33, the imaging control unit 123 sets the diaphragm amount to the reference amount or more, and outputs a diaphragm command corresponding to the set diaphragm amount to the diaphragm drive circuit 53.
(4) As the setting relating to photometry, the imaging control unit 123 sets the photometry method to the average photometry method or the multi-pattern photometry method.
(5) As the setting relating to the gain of the image sensor 15, the imaging control unit 123 sets the gain of the image sensor 15 to the reference gain or higher.
(6) As the setting relating to the photoelectric conversion efficiency of the image sensor 15, the imaging control unit 123 sets the photoelectric conversion efficiency of the image sensor 15 to the reference photoelectric conversion efficiency or higher. The imaging control unit 123 then outputs a sensitivity command corresponding to the set gain and photoelectric conversion efficiency to the image sensor driver 71.
(7) As the setting relating to the high dynamic range, the imaging control unit 123 sets the high dynamic range to ON.
(8) As the setting relating to anti-vibration control, the imaging control unit 123 sets anti-vibration control to ON, and outputs a blur correction command to the blur correction drive circuit 54.
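The sequence (1)–(8) above can be sketched as a command dispatcher. This is a minimal illustration under stated assumptions: the `send` callable standing in for the driver circuits, and all target/command strings, are hypothetical and not specified by the patent.

```python
def apply_imaging_mode_settings(send):
    """Issue the imaging mode commands (1)-(8).

    `send(target, command)` is a stand-in for the per-circuit driver interface.
    """
    commands = [
        ("light_projection_control_circuit", "on"),          # (1) projector projects light
        ("shutter", "speed>=reference"),                     # (2) shutter speed at/above reference
        ("diaphragm_drive_circuit", "amount>=reference"),    # (3) diaphragm amount at/above reference
        ("photometry", "average_or_multi_pattern"),          # (4) photometry method
        ("image_sensor_driver", "gain>=reference"),          # (5) gain at/above reference
        ("image_sensor_driver", "efficiency>=reference"),    # (6) conversion efficiency at/above reference
        ("hdr", "on"),                                       # (7) high dynamic range ON
        ("blur_correction_drive_circuit", "correction_on"),  # (8) anti-vibration control ON
    ]
    for target, command in commands:
        send(target, command)
    return commands
```

In a driver stack like the one described, `send` would be replaced by whatever interface addresses the light projection control circuit 73, the diaphragm drive circuit 53, the image sensor driver 71, and the blur correction drive circuit 54.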
Further, as shown in FIG. 23 as an example, when the CPU 61 is in the imaging mode, the display control unit 124 performs image processing settings relating to image processing in accordance with the first image processing setting parameters 213A included in the imaging mode parameters 211A stored in the parameter storage area 201.
(9) As the setting relating to noise reduction, the display control unit 124 sets the degree of noise reduction to be greater than the first reference degree.
(10) As the setting relating to sharpness, the display control unit 124 sets the degree of sharpness to be greater than the second reference degree.
(11) As the setting relating to contrast, the display control unit 124 sets the degree of contrast to be greater than the third reference degree.
(12) As the setting relating to tone, the display control unit 124 sets the degree of tone to be greater than the fourth reference degree.
As an example, as shown in FIG. 24, when the CPU 61 is in the temperature measurement mode, the first imaging control unit 133 and the second imaging control unit 135 perform imaging settings relating to imaging in accordance with the second imaging setting parameters 212B included in the temperature measurement mode parameters 211B stored in the parameter storage area 201.
(1) As the setting relating to the light projector 14, the first imaging control unit 133 and the second imaging control unit 135 set the light projector 14 not to project light, and output an OFF command to the light projection control circuit 73.
(2) As the setting relating to the shutter speed, the first imaging control unit 133 and the second imaging control unit 135 set the shutter speed to less than the reference speed, and adjust the shutter speed accordingly.
(3) As the setting relating to the diaphragm 33, the first imaging control unit 133 and the second imaging control unit 135 set the diaphragm amount to less than the reference amount, and output a diaphragm command corresponding to the set diaphragm amount to the diaphragm drive circuit 53.
(4) As the setting relating to photometry, the first imaging control unit 133 and the second imaging control unit 135 set the photometry method to the highlight-weighted, center-weighted, or spot photometry method.
(5) As the setting relating to the gain of the image sensor 15, the first imaging control unit 133 and the second imaging control unit 135 set the gain of the image sensor 15 to less than the reference gain.
(6) As the setting relating to the photoelectric conversion efficiency of the image sensor 15, the first imaging control unit 133 and the second imaging control unit 135 set the photoelectric conversion efficiency of the image sensor 15 to less than the reference photoelectric conversion efficiency. The first imaging control unit 133 and the second imaging control unit 135 then output a sensitivity command corresponding to the set gain and photoelectric conversion efficiency to the image sensor driver 71.
(7) As the setting relating to the high dynamic range, the first imaging control unit 133 and the second imaging control unit 135 set the high dynamic range to OFF.
(8) As the setting relating to anti-vibration control, the first imaging control unit 133 and the second imaging control unit 135 set anti-vibration control to OFF, and output a blur correction stop command to the blur correction drive circuit 54.
Further, as shown in FIG. 24 as an example, when the CPU 61 is in the temperature measurement mode, the display control unit 137 performs image processing settings relating to image processing in accordance with the second image processing setting parameters 213B included in the temperature measurement mode parameters 211B stored in the parameter storage area 201.
(9) As the setting relating to noise reduction, the display control unit 137 sets the degree of noise reduction to the first reference degree or less.
(10) As the setting relating to sharpness, the display control unit 137 sets the degree of sharpness to the second reference degree or less.
(11) As the setting relating to contrast, the display control unit 137 sets the degree of contrast to the third reference degree or less.
(12) As the setting relating to tone, the display control unit 137 sets the degree of tone to the fourth reference degree or less.
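The temperature measurement mode counterpart of the command sequence, covering items (1)–(8) issued by the first and second imaging control units 133/135 and items (9)–(12) issued by the display control unit 137, can be sketched the same way. As before, the `send` interface and the target/command strings are illustrative assumptions, not part of the patent.

```python
def apply_temperature_mode_settings(send):
    """Issue the temperature measurement mode commands (1)-(12).

    `send(target, command)` is a stand-in for the driver/processing interfaces.
    """
    commands = [
        ("light_projection_control_circuit", "off"),          # (1) no light projection
        ("shutter", "speed<reference"),                       # (2) shutter speed below reference
        ("diaphragm_drive_circuit", "amount<reference"),      # (3) diaphragm amount below reference
        ("photometry", "highlight_center_or_spot_weighted"),  # (4) photometry method
        ("image_sensor_driver", "gain<reference"),            # (5) gain below reference
        ("image_sensor_driver", "efficiency<reference"),      # (6) conversion efficiency below reference
        ("hdr", "off"),                                       # (7) high dynamic range OFF
        ("blur_correction_drive_circuit", "correction_stop"), # (8) anti-vibration control OFF
        ("image_processing", "noise_reduction<=reference"),   # (9)
        ("image_processing", "sharpness<=reference"),         # (10)
        ("image_processing", "contrast<=reference"),          # (11)
        ("image_processing", "tone<=reference"),              # (12)
    ]
    for target, command in commands:
        send(target, command)
    return commands
```

Contrasting this with the imaging mode sequence makes the design visible at a glance: every light-gathering and enhancement setting is dialed down so that the sensor output stays proportional to the radiated near-infrared light.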
In the third embodiment, the setting relating to the light projector 14, the setting relating to the shutter speed, the setting relating to the diaphragm 33, the setting relating to photometry, the setting relating to the sensitivity of the image sensor 15, the setting relating to the high dynamic range, and the setting relating to anti-vibration control are examples of the "imaging settings" according to the technology of the present disclosure. The first imaging setting parameter 212A and the second imaging setting parameter 212B are examples of the "imaging setting factor relating to imaging settings" and the "control factor" according to the technology of the present disclosure. The setting relating to noise reduction, the setting relating to sharpness, the setting relating to contrast, and the setting relating to tone are examples of the "image processing settings" according to the technology of the present disclosure. The first image processing setting parameter 213A and the second image processing setting parameter 213B are examples of the "image processing setting factor relating to image processing settings" and the "control factor" according to the technology of the present disclosure.
Next, as the operation of the third embodiment, a method for controlling the camera 1 will be described.
In the third embodiment, the mode switching processing executed by the mode switching processing unit 110, the imaging processing executed by the imaging processing unit 120, and the temperature measurement processing executed by the temperature measurement processing unit 130 are the same as in the first embodiment. The third embodiment differs from the first embodiment in that the parameter change processing unit 190 executes parameter change processing. The parameter change processing executed by the parameter change processing unit 190 according to the third embodiment will be described below with reference to FIG. 25.
In step S51, the mode determination unit 112 determines whether the operation mode of the CPU 61 is the imaging mode or the temperature measurement mode. If it is determined in step S51 that the operation mode is the imaging mode, the processing shown in FIG. 25 proceeds to step S52; if it is determined that the operation mode is the temperature measurement mode, the processing proceeds to step S53.
In step S52, the mode-specific parameter setting unit 192 derives the imaging mode parameters 211A corresponding to the imaging mode, and sets the imaging mode parameters 211A in the parameter storage area 201 as the mode-specific parameters 211.
In step S53, the mode-specific parameter setting unit 192 derives the temperature measurement mode parameters 211B corresponding to the temperature measurement mode, and sets the temperature measurement mode parameters 211B in the parameter storage area 201 as the mode-specific parameters 211.
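Steps S51 to S53 amount to a simple dispatch on the operation mode. The following is a minimal sketch under stated assumptions: the class name, the callable-based derivation interface, and the dictionary standing in for the parameter storage area 201 are all hypothetical.

```python
class ParameterChangeProcessor:
    """Sketch of the parameter change processing (steps S51-S53)."""

    def __init__(self, derive_imaging_params, derive_temperature_params):
        # Each callable derives a mode-specific parameter set
        # (cf. imaging mode parameters 211A / temperature measurement mode parameters 211B).
        self._derive = {
            "imaging": derive_imaging_params,
            "temperature": derive_temperature_params,
        }
        self.parameter_storage = {}  # stands in for the parameter storage area 201

    def run(self, operation_mode):
        # Step S51: branch on the current operation mode of the CPU.
        try:
            derive = self._derive[operation_mode]
        except KeyError:
            raise ValueError(f"unknown operation mode: {operation_mode}")
        # Steps S52 / S53: derive the parameters for the determined mode and
        # store them as the mode-specific parameters (cf. 211).
        self.parameter_storage["mode_specific_parameters"] = derive()
```

Passing derivation callables in rather than hard-coding the two parameter sets mirrors the text's point that the reference values may be fixed or variable: whatever derives the parameters can consult user input or imaging conditions at call time.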
The control method of the camera 1 according to the third embodiment is an example of the "control method" according to the technology of the present disclosure.
Next, the effects of the third embodiment will be described, focusing on the points that differ from the first embodiment.
In the third embodiment, the imaging setting parameters relating to the imaging settings differ between the imaging mode and the temperature measurement mode. That is, as an example, the first imaging setting parameters 212A are set in the imaging mode, and the second imaging setting parameters 212B are set in the temperature measurement mode. Therefore, compared with a case where the imaging setting parameters are the same in both modes, good image quality can be obtained for the captured image in the imaging mode, and measurement accuracy can be ensured in the temperature measurement mode while reducing the load on the CPU 61.
The imaging settings include the setting relating to the light projector 14. The CPU 61 sets the light projector 14 to project light in the imaging mode, and not to project light in the temperature measurement mode. Therefore, in the imaging mode, good image quality can be obtained for the captured image by ensuring the amount of light radiated from the subject, compared with a case where the light projector 14 does not project light. On the other hand, in the temperature measurement mode, measurement accuracy can be ensured by preventing the illumination light from the light projector 14 from mixing with the near-infrared light radiated from the subject, compared with a case where the light projector 14 projects light.
The imaging settings also include the setting relating to the shutter speed. The CPU 61 sets the shutter speed to the reference speed or higher in the imaging mode, and to less than the reference speed in the temperature measurement mode. That is, the CPU 61 sets the shutter speed in the imaging mode to be longer than the shutter speed in the temperature measurement mode. As a result, in the imaging mode, good image quality can be obtained for the captured image by ensuring the amount of light incident on the image sensor 15. On the other hand, in the temperature measurement mode, measurement accuracy can be ensured by suppressing noise in the first analog image data and the second analog image data output from the image sensor 15.
The imaging settings also include the setting relating to the diaphragm 33. The CPU 61 sets the diaphragm amount to the reference amount or more in the imaging mode, and to less than the reference amount in the temperature measurement mode. That is, the CPU 61 sets the diaphragm amount in the imaging mode to be larger than the diaphragm amount in the temperature measurement mode. As a result, in the imaging mode, good image quality can be obtained for the captured image by ensuring the amount of light incident on the image sensor 15. On the other hand, in the temperature measurement mode, measurement accuracy can be ensured by suppressing noise in the first analog image data and the second analog image data output from the image sensor 15.
The imaging settings also include the setting relating to photometry. The CPU 61 sets the photometry method to the average photometry method or the multi-pattern photometry method in the imaging mode, and to the highlight-weighted, center-weighted, or spot photometry method in the temperature measurement mode. As a result, in the imaging mode, good image quality can be obtained for the captured image by ensuring the amount of light that is radiated from the entire subject and incident on the image sensor 15. On the other hand, in the temperature measurement mode, measurement accuracy can be ensured by preventing the amount of light that is radiated from the highest-temperature region of the subject and incident on the image sensor 15 from saturating.
The imaging settings also include the setting relating to the sensitivity of the image sensor 15. The parameter relating to the sensitivity of the image sensor 15 includes the parameter relating to the gain of the image sensor 15 and the parameter relating to the conversion efficiency of the image sensor 15. The CPU 61 sets the gain of the image sensor 15 to the reference gain or higher in the imaging mode, and to less than the reference gain in the temperature measurement mode. That is, the CPU 61 sets the gain in the imaging mode to be higher than the gain in the temperature measurement mode. As a result, in the imaging mode, analog image data corresponding to the exposure is obtained, so that good image quality can be obtained for the captured image. On the other hand, in the temperature measurement mode, measurement accuracy can be ensured by suppressing noise in the first analog image data and the second analog image data output from the image sensor 15, and by preventing the peak values of the first analog image data and the second analog image data from saturating with respect to the light radiated from the highest-temperature region of the subject.
The CPU 61 also sets the conversion efficiency of the image sensor 15 to the reference conversion efficiency or higher in the imaging mode, and to less than the reference conversion efficiency in the temperature measurement mode. That is, the CPU 61 sets the conversion efficiency in the imaging mode to be higher than the conversion efficiency in the temperature measurement mode. As a result, in the imaging mode, analog image data corresponding to the exposure is obtained, so that good image quality can be obtained for the captured image. On the other hand, in the temperature measurement mode, measurement accuracy can be ensured by suppressing noise in the first analog image data and the second analog image data output from the image sensor 15, and by preventing the peak values of the first analog image data and the second analog image data from saturating with respect to the light radiated from the highest-temperature region of the subject.
The imaging settings also include the setting relating to the high dynamic range. The CPU 61 sets the high dynamic range to ON in the imaging mode, and to OFF in the temperature measurement mode. As a result, in the imaging mode, the dynamic range is expanded beyond the reference range, so that good image quality can be obtained for the captured image. On the other hand, in the temperature measurement mode, measurement accuracy can be ensured by setting the dynamic range to the reference range.
The imaging settings also include the setting relating to anti-vibration control. The CPU 61 sets anti-vibration control to ON in the imaging mode, and to OFF in the temperature measurement mode. As a result, in the imaging mode, image blur is suppressed, so that good image quality can be obtained for the captured image. On the other hand, in the temperature measurement mode, the load on the CPU 61 can be reduced by reducing the amount of arithmetic processing that the CPU 61 requires to suppress image blur.
In the third embodiment, the image processing setting parameters relating to the image processing settings also differ between the imaging mode and the temperature measurement mode. That is, as an example, the first image processing setting parameters 213A are set in the imaging mode, and the second image processing setting parameters 213B are set in the temperature measurement mode. Therefore, compared with a case where the image processing setting parameters are the same in both modes, good image quality can be obtained for the captured image in the imaging mode, and the load on the CPU 61 can be reduced in the temperature measurement mode.
The image processing settings include the setting relating to noise reduction. The CPU 61 sets the degree of noise reduction to be greater than the first reference degree in the imaging mode, and to the first reference degree or less in the temperature measurement mode. That is, the CPU 61 sets the noise reduction in the imaging mode to be stronger than the noise reduction in the temperature measurement mode. As a result, in the imaging mode, noise contained in the captured image is reduced, so that good image quality can be obtained for the captured image. On the other hand, in the temperature measurement mode, the load on the CPU 61 can be reduced by reducing the amount of arithmetic processing that the CPU 61 requires to reduce noise.
The image processing settings also include the setting relating to sharpness. The CPU 61 sets the degree of sharpness to be greater than the second reference degree in the imaging mode, and to the second reference degree or less in the temperature measurement mode. That is, the CPU 61 sets the sharpness in the imaging mode to be stronger than the sharpness in the temperature measurement mode. As a result, in the imaging mode, the sharpness of the captured image is enhanced, so that good image quality can be obtained for the captured image. On the other hand, in the temperature measurement mode, the load on the CPU 61 can be reduced by reducing the amount of arithmetic processing that the CPU 61 requires to adjust the sharpness.
The image processing settings also include the setting relating to contrast. The CPU 61 sets the degree of contrast to be greater than the third reference degree in the imaging mode, and to the third reference degree or less in the temperature measurement mode. That is, the CPU 61 sets the contrast in the imaging mode to be stronger than the contrast in the temperature measurement mode. As a result, in the imaging mode, the contrast of the captured image is enhanced, so that good image quality can be obtained for the captured image. On the other hand, in the temperature measurement mode, the load on the CPU 61 can be reduced by reducing the amount of arithmetic processing that the CPU 61 requires to adjust the contrast.
The image processing settings also include the setting relating to tone. The CPU 61 sets the degree of tone to be greater than the fourth reference degree in the imaging mode, and to the fourth reference degree or less in the temperature measurement mode. That is, the CPU 61 sets the tone in the imaging mode to be stronger than the tone in the temperature measurement mode. As a result, in the imaging mode, the tone of the captured image is enhanced, so that good image quality can be obtained for the captured image. On the other hand, in the temperature measurement mode, the load on the CPU 61 can be reduced by reducing the amount of arithmetic processing that the CPU 61 requires to adjust the tone.
 次に、第3実施形態の変形例について説明する。 Next, a modified example of the third embodiment will be described.
 第3実施形態では、撮像モードと温度測定モードとで、投光器14に関する設定、シャッタスピードに関する設定、絞り33に関する設定、測光に関する設定、イメージセンサ15の感度に関する設定、ハイダイナミックレンジに関する設定、及び防振制御に関する設定が異なるが、撮像モードと温度測定モードとで異ならせる撮像設定の組み合わせは、上記以外でもよい。例えば、ハイダイナミックレンジに関する設定は、撮像モード及び温度測定モードの両方でオンに設定されてもよい。また、防振制御に関する設定は、撮像モード及び温度測定モードの両方でオンに設定されてもよい。 In the third embodiment, in the imaging mode and the temperature measurement mode, settings related to the projector 14, settings related to the shutter speed, settings related to the aperture 33, settings related to photometry, settings related to the sensitivity of the image sensor 15, settings related to high dynamic range, and settings related to protection Although settings related to vibration control are different, combinations of imaging settings that are different between the imaging mode and the temperature measurement mode may be other than the above. For example, the setting for high dynamic range may be set to on in both imaging mode and temperature measurement mode. In addition, the setting regarding anti-vibration control may be set to ON in both the imaging mode and the temperature measurement mode.
 また、撮像モードと温度測定モードとで異ならせる撮像設定は、投光器14に関する設定、シャッタスピードに関する設定、絞り33に関する設定、測光に関する設定、イメージセンサ15の感度に関する設定、ハイダイナミックレンジに関する設定、及び防振制御に関する設定の少なくとも1つの設定を含んでいれば、上記以外でもよい。また、撮像モードと温度測定モードとで異ならせる撮像設定は、投光器14に関する設定、シャッタスピードに関する設定、絞り33に関する設定、測光に関する設定、イメージセンサ15の感度に関する設定、ハイダイナミックレンジに関する設定、及び防振制御に関する設定以外に、カメラ1に関する各種撮像設定を含んでいてもよい。 The imaging settings made to differ between the imaging mode and the temperature measurement mode may also be other than the above, as long as they include at least one of the settings related to the projector 14, the shutter speed, the aperture 33, photometry, the sensitivity of the image sensor 15, high dynamic range, and anti-vibration control. In addition to these settings, the imaging settings made to differ between the imaging mode and the temperature measurement mode may include various other imaging settings related to the camera 1.
 また、第3実施形態では、撮像モードと温度測定モードとで、ノイズリダクションに関する設定、シャープネスに関する設定、コントラストに関する設定、及びトーンに関する設定が異なるが、撮像モードと温度測定モードとで異ならせる画像処理設定の組み合わせは、上記以外でもよい。 In the third embodiment, the settings related to noise reduction, sharpness, contrast, and tone differ between the imaging mode and the temperature measurement mode, but the combination of image processing settings made to differ between the imaging mode and the temperature measurement mode may be other than the above.
 また、撮像モードと温度測定モードとで異ならせる画像処理設定は、ノイズリダクションに関する設定、シャープネスに関する設定、コントラストに関する設定、及びトーンに関する設定の少なくとも1つの設定を含んでいれば、上記以外でもよい。また、撮像モードと温度測定モードとで異ならせる画像処理設定は、ノイズリダクションに関する設定、シャープネスに関する設定、コントラストに関する設定、及びトーンに関する設定以外に、カメラ1に関する各種画像処理設定を含んでいてもよい。 The image processing settings made to differ between the imaging mode and the temperature measurement mode may also be other than the above, as long as they include at least one of the settings related to noise reduction, sharpness, contrast, and tone. In addition to these settings, the image processing settings made to differ between the imaging mode and the temperature measurement mode may include various other image processing settings related to the camera 1.
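As a purely illustrative sketch of how such per-mode setting combinations might be held and applied, one option is a lookup table per mode that is applied on mode entry. The keys and values below are hypothetical placeholders (only the projection and tone directions follow the text above); further settings such as noise reduction, sharpness, contrast, high dynamic range, and anti-vibration control would be added analogously:

```python
# Hypothetical per-mode setting tables; not the embodiment's actual API.
IMAGING_MODE_SETTINGS = {
    "projection": "on",   # projector 14 projects light in the imaging mode
    "tone": "strong",     # tone set stronger than the fourth reference degree
}
TEMPERATURE_MODE_SETTINGS = {
    "projection": "off",  # no light projection in the temperature measurement mode
    "tone": "weak",       # tone set at or below the fourth reference degree
}

def apply_mode_settings(mode_settings, camera_state):
    # Overwrite the current camera state with the selected mode's table.
    camera_state.update(mode_settings)
    return camera_state
```

A table of this form also makes the variations described above easy to express: differing combinations become differing table contents.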
 また、第3実施形態では、CPU61は、温度測定モードでは、投光器14が投光を行わない設定にするが、投光器14が投光を抑制する設定（つまり投光器14から投光される光の量を抑制する設定）にしてもよい。 In the third embodiment, the CPU 61 sets the projector 14 not to project light in the temperature measurement mode, but it may instead be set so that the projector 14 suppresses light projection (that is, a setting that suppresses the amount of light projected from the projector 14).
 [第4実施形態]
 次に、第4実施形態について説明する。
[Fourth embodiment]
Next, a fourth embodiment will be described.
 第4実施形態では、第1実施形態に対し、カメラ1の構成が次のように変更されている。以下、第4実施形態について第1実施形態と異なる点を説明する。 In the fourth embodiment, the configuration of the camera 1 is changed as follows from the first embodiment. Differences of the fourth embodiment from the first embodiment will be described below.
 一例として図26に示すように、モード切替処理部110は、モード判定部112、測定温度取得部221、温度測定モード終了判定部222、フラグ設定部113、投光制御部114、及びモード設定部115を有する。 As shown in FIG. 26 as an example, the mode switching processing unit 110 has a mode determination unit 112, a measured temperature acquisition unit 221, a temperature measurement mode end determination unit 222, a flag setting unit 113, a light projection control unit 114, and a mode setting unit 115.
 モード判定部112は、CPU61の動作モードが撮像モードであるのか、又は温度測定モードであるのかを判定する。 The mode determination unit 112 determines whether the operation mode of the CPU 61 is the imaging mode or the temperature measurement mode.
 測定温度取得部221は、モード判定部112によってCPU61の動作モードが温度測定モードであると判定された場合には、温度測定モードで測定された被写体の温度（以下、測定温度と称する）を取得する。測定温度は、被写体の温度分布の値、被写体の温度分布の値のうちの最高値、被写体の温度分布の値のうちの最頻値、被写体の温度分布の値のうちの中央値、及び被写体の温度分布の平均値のいずれでもよい。 When the mode determination unit 112 determines that the operation mode of the CPU 61 is the temperature measurement mode, the measured temperature acquisition unit 221 acquires the temperature of the subject measured in the temperature measurement mode (hereinafter referred to as the measured temperature). The measured temperature may be any of the following: the values of the subject's temperature distribution, the maximum of those values, the mode (most frequent value) of those values, the median of those values, or the average of the subject's temperature distribution.
 温度測定モード終了判定部222は、測定温度取得部221で取得された測定温度に基づいて温度測定モードを終了するか否かを判定する。一例として、温度測定モード終了判定部222は、測定温度取得部221で取得された測定温度に基づく値を導出し、導出した値が閾値以下であるか否かを判定することにより、測定モードを終了するか否かを判定する。測定温度に基づく値は、測定温度から導かれる値であれば何でもよく、例えば、測定温度そのものの値でもよく、また、測定温度に基づいて算出される輻射熱量等の値でもよい。また、測定温度に基づく値の導出には、計算式が用いられてもよく、また、データマッチングテーブルが用いられてもよい。温度測定モード終了判定部222は、測定温度に基づく値が閾値以下である場合には、温度測定モードを終了すると判定する。 The temperature measurement mode end determination unit 222 determines whether to end the temperature measurement mode based on the measured temperature acquired by the measured temperature acquisition unit 221. As an example, the temperature measurement mode end determination unit 222 derives a value based on the measured temperature acquired by the measured temperature acquisition unit 221 and determines whether the derived value is equal to or less than a threshold, thereby determining whether to end the temperature measurement mode. The value based on the measured temperature may be any value derived from the measured temperature; for example, it may be the measured temperature itself, or a value such as the amount of radiant heat calculated based on the measured temperature. A calculation formula or a data matching table may be used to derive the value based on the measured temperature. The temperature measurement mode end determination unit 222 determines to end the temperature measurement mode when the value based on the measured temperature is equal to or less than the threshold.
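The end determination described above can be sketched as follows. The function name and the selectable metrics are illustrative assumptions; the embodiment's actual derivation may instead use a calculation formula or a data matching table:

```python
def should_end_temperature_mode(measured_temps, threshold, metric="max"):
    """Derive a representative value from the measured temperatures and
    decide whether to end the temperature measurement mode (illustrative)."""
    if metric == "max":
        value = max(measured_temps)
    elif metric == "mean":
        value = sum(measured_temps) / len(measured_temps)
    elif metric == "median":
        s = sorted(measured_temps)
        n = len(s)
        value = s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2
    else:  # "mode": the most frequent value in the distribution
        value = max(set(measured_temps), key=measured_temps.count)
    # End the mode when the derived value is at or below the threshold.
    return value <= threshold
```

For example, with the maximum metric, the mode ends only once the hottest measured point has cooled to the threshold.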
 フラグ設定部113は、温度測定モード終了判定部222によって温度測定モードを終了すると判定された場合には、表示制御フラグ記憶領域141に表示制御フラグ151として撮像画像表示フラグ151Aを設定し、かつ、投光制御フラグ記憶領域142に投光制御フラグ152として投光オン制御フラグ152Aを設定する。 When the temperature measurement mode end determination unit 222 determines to end the temperature measurement mode, the flag setting unit 113 sets the captured image display flag 151A as the display control flag 151 in the display control flag storage area 141, and sets the light projection ON control flag 152A as the light projection control flag 152 in the light projection control flag storage area 142.
 投光制御部114は、フラグ設定部113によって投光オン制御フラグ152Aが設定された場合に、投光制御回路73にオン指令を出力する。オン指令は、投光器14をオンに切り替える指令である。 When the light projection ON control flag 152A has been set by the flag setting unit 113, the light projection control unit 114 outputs an ON command to the light projection control circuit 73. The ON command is a command to switch the projector 14 on.
 モード設定部115は、CPU61のモードとして撮像モードを設定する。 The mode setting unit 115 sets the imaging mode as the mode of the CPU 61 .
 次に、第4実施形態の作用として、カメラ1の制御方法について説明する。 Next, a method for controlling the camera 1 will be described as an operation of the fourth embodiment.
 第4実施形態では、撮像処理部120が実行する撮像処理、及び温度測定処理部130が実行する温度測定処理は、第1実施形態と同じである。第4実施形態では、モード切替処理部110が実行するモード切替処理が、第1実施形態と異なる。以下、図27を参照しながら、第4実施形態に係るモード切替処理部110が実行するモード切替処理について説明する。 In the fourth embodiment, the imaging processing executed by the imaging processing unit 120 and the temperature measurement processing executed by the temperature measurement processing unit 130 are the same as in the first embodiment. In the fourth embodiment, however, the mode switching processing executed by the mode switching processing unit 110 differs from that in the first embodiment. The mode switching processing executed by the mode switching processing unit 110 according to the fourth embodiment will be described below with reference to FIG. 27.
 ステップS61で、モード判定部112は、CPU61のモードが撮像モードであるのか、又は温度測定モードであるのかを判定する。ステップS61において、温度測定モードであると判定された場合、図27に示す処理は、ステップS62に移行し、撮像モードであると判定された場合、図27に示す処理は終了する。 In step S61, the mode determination unit 112 determines whether the mode of the CPU 61 is the imaging mode or the temperature measurement mode. If it is determined in step S61 that the mode is the temperature measurement mode, the processing shown in FIG. 27 proceeds to step S62; if it is determined that the mode is the imaging mode, the processing shown in FIG. 27 ends.
 ステップS62で、測定温度取得部221は、温度測定モードで測定された測定温度を取得する。 At step S62, the measured temperature acquisition unit 221 acquires the measured temperature measured in the temperature measurement mode.
 ステップS63で、温度測定モード終了判定部222は、測定温度取得部221で取得された測定温度に基づいて温度測定モードを終了するか否かを判定する。ステップS63において、温度測定モードを終了すると判定された場合、図27に示す処理は、ステップS64に移行し、温度測定モードを終了しないと判定された場合、図27に示す処理は終了する。 In step S63, the temperature measurement mode end determination unit 222 determines whether to end the temperature measurement mode based on the measured temperature acquired by the measured temperature acquisition unit 221. If it is determined in step S63 that the temperature measurement mode should be ended, the process shown in FIG. 27 proceeds to step S64, and if it is determined not to end the temperature measurement mode, the process shown in FIG. 27 ends.
 ステップS64で、フラグ設定部113は、表示制御フラグ記憶領域141に表示制御フラグ151として撮像画像表示フラグ151Aを設定し、かつ、投光制御フラグ記憶領域142に投光制御フラグ152として投光オン制御フラグ152Aを設定する。 In step S64, the flag setting unit 113 sets the captured image display flag 151A as the display control flag 151 in the display control flag storage area 141, and sets the light projection ON control flag 152A as the light projection control flag 152 in the light projection control flag storage area 142.
 ステップS65で、投光制御部114は、投光器14をオンに切り替える。 In step S65, the light projection control unit 114 switches the light projector 14 on.
 ステップS66で、モード設定部115は、CPU61のモードとして撮像モードを設定する。 In step S66, the mode setting unit 115 sets the imaging mode as the mode of the CPU 61.
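The control flow of steps S61 to S66 can be sketched as follows. The `Camera` class and its attributes are hypothetical stand-ins for the CPU 61's mode state, the flag storage areas, and the projector 14; this is a sketch of the flow under the fourth embodiment's behavior (switching from the temperature measurement mode to the imaging mode according to the measured temperature), not the patent's implementation:

```python
class Camera:
    """Minimal stand-in for the camera state (illustrative only)."""
    def __init__(self, mode, measured_temperature, threshold):
        self.mode = mode
        self.measured_temperature = measured_temperature
        self.threshold = threshold
        self.display_control_flag = None
        self.light_projection_flag = None
        self.projector_is_on = False

def mode_switching_process(cam):
    # S61: proceed only while the temperature measurement mode is active.
    if cam.mode != "temperature_measurement":
        return
    # S62: acquire the temperature measured in the temperature measurement mode.
    measured = cam.measured_temperature
    # S63: keep measuring while the value based on the measured temperature
    # is still above the threshold.
    if measured > cam.threshold:
        return
    # S64: set the captured-image display flag and the projection-ON flag.
    cam.display_control_flag = "captured_image"
    cam.light_projection_flag = "projection_on"
    # S65: switch the projector on.
    cam.projector_is_on = True
    # S66: set the imaging mode.
    cam.mode = "imaging"
```

Here the measured temperature itself serves as the derived value; a radiant-heat calculation or a data matching table could be substituted at step S63.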
 第4実施形態に係るカメラ1の制御方法は、本開示の技術に係る「制御方法」の一例である。 The control method of the camera 1 according to the fourth embodiment is an example of the "control method" according to the technology of the present disclosure.
 次に、第4実施形態の効果について第1実施形態と異なる点を説明する。 Next, the effects of the fourth embodiment that differ from those of the first embodiment will be described.
 第4実施形態では、CPU61は、温度測定モードにおいて、被写体の温度に応じて温度測定モードから撮像モードに切り替わる。したがって、例えば被写体の温度に応じて温度測定モードから撮像モードに切り替わらない場合に比して、利便性を高めることができる。 In the fourth embodiment, the CPU 61 switches from the temperature measurement mode to the imaging mode according to the temperature of the subject in the temperature measurement mode. Therefore, convenience can be improved compared to the case where the temperature measurement mode is not switched to the imaging mode according to the temperature of the object, for example.
 [第5実施形態]
 次に、第5実施形態について説明する。
[Fifth embodiment]
Next, a fifth embodiment will be described.
 第5実施形態では、第1実施形態に対し、カメラ1の構成が次のように変更されている。以下、第5実施形態について第1実施形態と異なる点を説明する。 In the fifth embodiment, the configuration of the camera 1 is changed as follows from the first embodiment. Differences of the fifth embodiment from the first embodiment will be described below.
 一例として図28に示すように、CPU61は、統合表示処理部230として機能する。統合表示処理部230は、投光器14をパルス発光させながら、パルス発光の発光期間に撮像モードを設定し、パルス発光の発光停止期間に温度測定モードを設定する動作を、パルス発光の発光タイミングに応じて繰り返す処理部である。投光器14は、パルス発光を行うことで、断続的な投光を行う。統合表示処理部230は、パルス発光制御部241、撮像処理部120、パルス発光停止制御部242、温度測定処理部130、及び統合表示制御部243を有する。 As shown in FIG. 28 as an example, the CPU 61 functions as an integrated display processing unit 230. The integrated display processing unit 230 is a processing unit that, while causing the projector 14 to emit pulsed light, repeats the operation of setting the imaging mode during the emission period of the pulsed light and setting the temperature measurement mode during the emission stop period of the pulsed light, in accordance with the emission timing of the pulsed light. By emitting pulsed light, the projector 14 projects light intermittently. The integrated display processing unit 230 has a pulse emission control unit 241, an imaging processing unit 120, a pulse emission stop control unit 242, a temperature measurement processing unit 130, and an integrated display control unit 243.
 パルス発光制御部241は、投光制御回路73にパルス発光指令を出力し、投光器14に対してパルス発光させる制御を行う。 The pulse light emission control unit 241 outputs a pulse light emission command to the light projection control circuit 73 and controls the light projector 14 to emit pulse light.
 撮像処理部120は、波長選択部121、ターレット制御部122、及び撮像制御部123を有する。波長選択部121、ターレット制御部122、及び撮像制御部123の機能は、第1実施形態と同じである。撮像処理部120は、イメージセンサ15に可視光又は近赤外光を撮像させることにより撮像画像を得る撮像処理を実行する。 The imaging processing unit 120 has a wavelength selection unit 121, a turret control unit 122, and an imaging control unit 123. The functions of the wavelength selection unit 121, the turret control unit 122, and the imaging control unit 123 are the same as in the first embodiment. The imaging processing unit 120 performs imaging processing for obtaining a captured image by causing the image sensor 15 to capture visible light or near-infrared light.
 パルス発光停止制御部242は、投光制御回路73にパルス発光停止指令を出力し、投光器14に対してパルス発光を停止させる制御を行う。 The pulse emission stop control unit 242 outputs a pulse emission stop command to the light emission control circuit 73 and controls the light projector 14 to stop pulse emission.
 温度測定処理部130は、波長選択部131、第1ターレット制御部132、第1撮像制御部133、第2ターレット制御部134、第2撮像制御部135、及び温度導出部136を有する。波長選択部131、第1ターレット制御部132、第1撮像制御部133、第2ターレット制御部134、第2撮像制御部135、及び温度導出部136の機能は、第1実施形態と同じである。温度測定処理部130は、イメージセンサ15によって近赤外光が撮像されることにより得られた近赤外光画像に基づいて被写体の温度分布を算出し、被写体の温度分布に基づいて温度情報を生成する温度測定処理を実行する。 The temperature measurement processing unit 130 has a wavelength selection unit 131, a first turret control unit 132, a first imaging control unit 133, a second turret control unit 134, a second imaging control unit 135, and a temperature derivation unit 136. The functions of the wavelength selection unit 131, the first turret control unit 132, the first imaging control unit 133, the second turret control unit 134, the second imaging control unit 135, and the temperature derivation unit 136 are the same as in the first embodiment. The temperature measurement processing unit 130 executes temperature measurement processing in which the temperature distribution of the subject is calculated based on a near-infrared light image obtained by the image sensor 15 capturing near-infrared light, and temperature information is generated based on the temperature distribution of the subject.
 統合表示制御部243は、撮像処理部120によって得られた撮像画像と温度測定処理部130によって得られた温度情報を統合した統合画像251を出力し、統合画像251をディスプレイ76に表示させる。統合画像251は、撮像処理部120によって得られた撮像画像に、温度測定処理部130によって得られた温度情報を合成した画像であってもよく、また、撮像処理部120によって得られた撮像画像と、温度測定処理部130によって得られた温度情報とを並べて表示する画像でもよい。また、温度情報は、例えば、温度が予め定められた閾値以上である領域を示す情報、温度の具体的な数値を示す情報、予め定められた温度の範囲毎に区画された複数の区画を温度の具体的な数値と併せて示す情報、又は温度に応じた色調で温度分布を示す情報等でもよい。 The integrated display control unit 243 outputs an integrated image 251 in which the captured image obtained by the imaging processing unit 120 and the temperature information obtained by the temperature measurement processing unit 130 are integrated, and causes the display 76 to display the integrated image 251. The integrated image 251 may be an image in which the temperature information obtained by the temperature measurement processing unit 130 is combined with the captured image obtained by the imaging processing unit 120, or may be an image in which the captured image obtained by the imaging processing unit 120 and the temperature information obtained by the temperature measurement processing unit 130 are displayed side by side. The temperature information may be, for example, information indicating a region whose temperature is equal to or higher than a predetermined threshold, information indicating specific numerical values of the temperature, information indicating a plurality of sections partitioned by predetermined temperature ranges together with specific numerical values of the temperature, or information indicating the temperature distribution in color tones corresponding to the temperature.
 第5実施形態において、撮像画像は、本開示の技術に係る「撮像画像」及び「第1撮像画像」の一例であり、温度情報は、本開示の技術に係る「温度情報」の一例である。また、統合画像251は、本開示の技術に係る「合成画像」の一例である。 In the fifth embodiment, the captured image is an example of the "captured image" and the "first captured image" according to the technology of the present disclosure, and the temperature information is an example of the "temperature information" according to the technology of the present disclosure. The integrated image 251 is an example of the "composite image" according to the technology of the present disclosure.
 次に、第5実施形態の作用として、カメラ1の制御方法について説明する。 Next, a method for controlling the camera 1 will be described as an operation of the fifth embodiment.
 第5実施形態では、統合表示処理部230が統合表示処理を実行する点が、第1実施形態と異なる。以下、図29を参照しながら、第5実施形態に係る統合表示処理部230が実行する統合表示処理について説明する。 The fifth embodiment differs from the first embodiment in that the integrated display processing unit 230 executes integrated display processing. The integrated display processing executed by the integrated display processing unit 230 according to the fifth embodiment will be described below with reference to FIG. 29 .
 ステップS71で、パルス発光制御部241は、投光器14にパルス発光させる。 In step S71, the pulse light emission control unit 241 causes the light projector 14 to emit pulse light.
 ステップS72で、撮像処理部120は、イメージセンサ15に可視光又は近赤外光を撮像させることにより撮像画像を得る。 In step S72, the imaging processing unit 120 obtains a captured image by causing the image sensor 15 to capture visible light or near-infrared light.
 ステップS73で、パルス発光停止制御部242は、投光器14にパルス発光を停止させる。 In step S73, the pulse emission stop control unit 242 causes the light projector 14 to stop pulse emission.
 ステップS74で、温度測定処理部130は、イメージセンサ15に近赤外光を撮像させることにより得た近赤外光画像に基づいて被写体の温度分布を算出し、被写体の温度分布に基づいて温度情報を生成する。 In step S74, the temperature measurement processing unit 130 calculates the temperature distribution of the subject based on a near-infrared light image obtained by causing the image sensor 15 to capture near-infrared light, and generates temperature information based on the temperature distribution of the subject.
 ステップS75で、統合表示制御部243は、撮像処理部120によって得られた撮像画像と、温度測定処理部130によって得られた温度情報とを統合した統合画像251を出力し、統合画像251をディスプレイ76に表示させる。 In step S75, the integrated display control unit 243 outputs the integrated image 251 in which the captured image obtained by the imaging processing unit 120 and the temperature information obtained by the temperature measurement processing unit 130 are integrated, and causes the display 76 to display the integrated image 251.
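One cycle of steps S71 to S75 can be sketched as follows; the four callables are hypothetical stand-ins for the pulse emission control unit 241 (and its stop counterpart 242), the imaging processing unit 120, the temperature measurement processing unit 130, and the integration performed by the integrated display control unit 243:

```python
def integrated_display_cycle(set_pulse_emission, capture_image,
                             measure_temperature, integrate):
    """One cycle of the integrated display processing (illustrative)."""
    set_pulse_emission(True)    # S71: start pulse emission (imaging-mode period)
    captured = capture_image()  # S72: capture visible or near-infrared light
    set_pulse_emission(False)   # S73: stop pulse emission (measurement period)
    temperature_info = measure_temperature()  # S74: temperature information
    return integrate(captured, temperature_info)  # S75: integrated image 251
```

Repeating this cycle in step with the pulse emission timing yields a continuously updated integrated image, as described in the text.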
 第5実施形態に係るカメラ1の制御方法は、本開示の技術に係る「制御方法」の一例である。 The control method of the camera 1 according to the fifth embodiment is an example of the "control method" according to the technology of the present disclosure.
 次に、第5実施形態の効果について第1実施形態と異なる点を説明する。 Next, the effects of the fifth embodiment that differ from those of the first embodiment will be described.
 第5実施形態では、投光器14は、パルス発光を行い、CPU61は、パルス発光の発光期間に撮像モードを設定し、パルス発光の発光停止期間に温度測定モードを設定する動作を、パルス発光の発光タイミングに応じて繰り返す。これにより、撮像モードで得られた撮像画像に、温度測定モードで得られた温度情報を統合した統合画像を得ることができる。 In the fifth embodiment, the projector 14 emits pulsed light, and the CPU 61 repeats the operation of setting the imaging mode during the emission period of the pulsed light and setting the temperature measurement mode during the emission stop period of the pulsed light, in accordance with the emission timing of the pulsed light. As a result, an integrated image can be obtained in which the temperature information obtained in the temperature measurement mode is integrated with the captured image obtained in the imaging mode.
 また、CPU61は、撮像処理部120によって得られた撮像画像と温度測定処理部130によって得られた温度情報を統合した統合画像251を出力する。したがって、撮像モードと温度測定モードとに切り替わらなくても、撮像画像に温度情報が統合された統合画像251がディスプレイ76に表示されることにより、ユーザが被写体の状況と温度との関係を視覚的に把握できる。 The CPU 61 also outputs the integrated image 251 in which the captured image obtained by the imaging processing unit 120 and the temperature information obtained by the temperature measurement processing unit 130 are integrated. Therefore, even without switching between the imaging mode and the temperature measurement mode, the integrated image 251 in which the temperature information is integrated with the captured image is displayed on the display 76, allowing the user to visually grasp the relationship between the state of the subject and its temperature.
 次に、第1実施形態から第5実施形態に共通の変形例について説明する。 Next, modifications common to the first to fifth embodiments will be described.
 第1実施形態から第5実施形態において、CPU61は、撮像モードと温度測定モードとを備えるが、撮像モード及び温度測定モード以外のモードを備えていてもよい。 In the first to fifth embodiments, the CPU 61 has an imaging mode and a temperature measurement mode, but may also have modes other than the imaging mode and the temperature measurement mode.
 また、第1実施形態から第5実施形態において、CPU61は、撮像モードと温度測定モードとで、表示制御フラグ151及び投光制御フラグ152をそれぞれ異ならせるが、表示制御フラグ151及び投光制御フラグ152以外の制御フラグを異ならせてもよい。 In the first to fifth embodiments, the CPU 61 makes the display control flag 151 and the light projection control flag 152 differ between the imaging mode and the temperature measurement mode, but control flags other than the display control flag 151 and the light projection control flag 152 may be made to differ.
 また、第1実施形態から第5実施形態では、表示制御フラグ151として、撮像画像161Aをディスプレイ76に対して表示させる撮像画像表示フラグ151A、及び、撮像画像に温度情報を合成させた合成画像161Bをディスプレイ76に対して表示させる合成画像表示フラグ151Bが設定される。しかしながら、撮像画像表示フラグ151A及び合成画像表示フラグ151B以外の表示制御フラグ151が設定されてもよい。また、温度測定モードでは、合成画像表示フラグ151Bの代わりに、温度情報をディスプレイ76に対して表示させる温度情報表示フラグが設定され、ディスプレイ76に温度情報が表示されてもよい。温度情報表示フラグは、本開示の技術に係る「温度情報表示因子」の一例である。 In the first to fifth embodiments, the captured image display flag 151A, which causes the display 76 to display the captured image 161A, and the composite image display flag 151B, which causes the display 76 to display the composite image 161B in which temperature information is combined with the captured image, are set as the display control flag 151. However, a display control flag 151 other than the captured image display flag 151A and the composite image display flag 151B may be set. In the temperature measurement mode, a temperature information display flag that causes the display 76 to display the temperature information may be set instead of the composite image display flag 151B, so that the temperature information is displayed on the display 76. The temperature information display flag is an example of the "temperature information display factor" according to the technology of the present disclosure.
 また、第1実施形態から第5実施形態では、ぶれ補正レンズ34を移動させることにより像のぶれが補正されるが、例えば、本開示の技術に係る「光学素子」の一例としてイメージセンサ15を移動させることにより像のぶれが補正されてよい。また、複数の撮像画像に基づく画像処理技術により像のぶれが補正されてもよい。 In the first to fifth embodiments, image blur is corrected by moving the blur correction lens 34, but image blur may instead be corrected by moving the image sensor 15, which is an example of the "optical element" according to the technology of the present disclosure. Image blur may also be corrected by an image processing technique based on a plurality of captured images.
 また、第1実施形態から第5実施形態において、二色温度測定法による温度測定では、950nmから1100nmの波長帯域、1150nmから1350nmの波長帯域、1500nmから1750nmの波長帯域、及び2000nmから2400nmの波長帯域から二つの波長帯域が選択されるが、これら以外の波長帯域から二つの波長帯域が選択されてもよい。 In the first to fifth embodiments, in temperature measurement by the two-color thermometry method, two wavelength bands are selected from the wavelength bands of 950 nm to 1100 nm, 1150 nm to 1350 nm, 1500 nm to 1750 nm, and 2000 nm to 2400 nm, but two wavelength bands may instead be selected from wavelength bands other than these.
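For reference, two-color (ratio) thermometry of the kind referenced above can be illustrated with the standard graybody Wien-approximation form, in which the emissivity cancels out of the intensity ratio at the two wavelengths. This is textbook radiometry rather than code from the embodiments:

```python
import math

C2 = 1.4388e-2  # second radiation constant, m*K

def wien_intensity(wavelength_m, temp_k):
    # Wien approximation of Planck's law (graybody; the constant emissivity
    # and the first radiation constant cancel in the two-wavelength ratio).
    return wavelength_m ** -5 * math.exp(-C2 / (wavelength_m * temp_k))

def two_color_temperature(i1, i2, lambda1_m, lambda2_m):
    """Recover the temperature from the intensity ratio at two wavelengths."""
    ratio = i1 / i2
    return C2 * (1.0 / lambda2_m - 1.0 / lambda1_m) / (
        math.log(ratio) - 5.0 * math.log(lambda2_m / lambda1_m)
    )
```

For example, intensities synthesized at 1000 K for the 1.0 um and 1.6 um wavelengths recover 1000 K from the ratio, which is why the method is insensitive to an unknown (but wavelength-flat) emissivity.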
 また、第1実施形態から第5実施形態において、二色温度測定法による温度測定では、近赤外光が用いられるが、例えば可視光などの近赤外光以外の光が用いられてもよい。 In the first to fifth embodiments, near-infrared light is used in temperature measurement by the two-color thermometry method, but light other than near-infrared light, such as visible light, may be used.
 また、第1実施形態から第5実施形態において、カメラ1が撮像装置の一例として挙げられているが、本開示の技術はこれに限定されず、スマートデバイス、ウェアラブル端末、細胞観察装置、眼科観察装置、又は外科顕微鏡等の各種の電子機器に内蔵されるデジタルカメラであってもよい。 In the first to fifth embodiments, the camera 1 is given as an example of the imaging device, but the technology of the present disclosure is not limited to this; the imaging device may be a digital camera built into various electronic devices such as a smart device, a wearable terminal, a cell observation device, an ophthalmic observation device, or a surgical microscope.
 また、第1実施形態から第5実施形態において、CPU61の機能的な構成、及びCPU61が実行する処理の順序は、一例であり、種々改変されてもよい。 Also, in the first to fifth embodiments, the functional configuration of the CPU 61 and the order of the processes executed by the CPU 61 are examples, and may be modified in various ways.
 また、第1実施形態から第5実施形態における複数の技術のうち組み合わせ可能な技術は、適宜、組み合わされてもよい。 Also, among the multiple technologies in the first to fifth embodiments, technologies that can be combined may be combined as appropriate.
 また、第1実施形態から第5実施形態では、カメラ1内のコンピュータ60によって撮像支援処理が実行される形態例を挙げて説明したが、本開示の技術はこれに限定されない。例えば、図30に示すように、LAN又はWAN等のネットワーク310を介してカメラ1と通信可能に接続された外部装置312内のコンピュータ314によって撮像支援処理が実行されるようにしてもよい。図30に示す例では、コンピュータ314は、CPU316、ストレージ318、及びメモリ320を備えている。ストレージ318には、撮像支援処理プログラム100が記憶されている。 Also, in the first to fifth embodiments, the example of the mode in which the computer 60 in the camera 1 executes the imaging support process has been described, but the technology of the present disclosure is not limited to this. For example, as shown in FIG. 30, the imaging support processing may be executed by a computer 314 in an external device 312 communicably connected to the camera 1 via a network 310 such as LAN or WAN. In the example shown in FIG. 30, computer 314 comprises CPU 316 , storage 318 and memory 320 . The storage 318 stores the imaging support processing program 100 .
 カメラ1は、ネットワーク310を介して外部装置312に対して撮像支援処理の実行を要求する。これに応じて、外部装置312のCPU316は、ストレージ318から撮像支援処理プログラム100を読み出し、撮像支援処理プログラム100をメモリ320上で実行する。CPU316は、メモリ320上で実行する撮像支援処理プログラム100に従って撮像支援処理を行う。そして、CPU316は、撮像支援処理が実行されることで得られた処理結果を、ネットワーク310を介してカメラ1に提供する。 The camera 1 requests execution of imaging support processing from the external device 312 via the network 310 . In response, the CPU 316 of the external device 312 reads the imaging support processing program 100 from the storage 318 and executes the imaging support processing program 100 on the memory 320 . The CPU 316 performs imaging support processing according to the imaging support processing program 100 executed on the memory 320 . Then, the CPU 316 provides the camera 1 via the network 310 with the processing result obtained by executing the imaging support processing.
 また、カメラ1と外部装置312とが撮像支援処理を分散して実行するようにしてもよいし、カメラ1と外部装置312を含む複数の装置とが撮像支援処理を分散して実行するようにしてもよい。なお、図30に示す例では、カメラ1及び外部装置312が本開示の技術に係る「撮像装置」の一例である。 Alternatively, the camera 1 and the external device 312 may execute the imaging support processing in a distributed manner, or a plurality of devices including the camera 1 and the external device 312 may execute the imaging support processing in a distributed manner. Note that in the example shown in FIG. 30, the camera 1 and the external device 312 are an example of the "imaging device" according to the technology of the present disclosure.
 また、第1実施形態から第5実施形態では、NVM62に撮像支援処理プログラム100が記憶されている形態例を挙げて説明したが、本開示の技術はこれに限定されない。例えば、図31に示すように、撮像支援処理プログラム100が記憶媒体330に記憶されていてもよい。記憶媒体330は、非一時的記憶媒体である。記憶媒体330の一例としては、SSD又はUSBメモリなどの任意の可搬型の記憶媒体が挙げられる。 In the first to fifth embodiments, an example was described in which the imaging support processing program 100 is stored in the NVM 62, but the technology of the present disclosure is not limited to this. For example, as shown in FIG. 31, the imaging support processing program 100 may be stored in a storage medium 330. The storage medium 330 is a non-transitory storage medium. Examples of the storage medium 330 include any portable storage medium such as an SSD or a USB memory.
 記憶媒体330に記憶されている撮像支援処理プログラム100は、コンピュータ60にインストールされる。CPU61は、撮像支援処理プログラム100に従って撮像支援処理を実行する。 The imaging support processing program 100 stored in the storage medium 330 is installed in the computer 60 . The CPU 61 executes imaging support processing according to the imaging support processing program 100 .
 また、通信網（図示省略）を介してコンピュータ60に接続される他のコンピュータ又はサーバ装置等の記憶部に撮像支援処理プログラム100を記憶させておき、カメラ1の要求に応じて撮像支援処理プログラム100がダウンロードされ、コンピュータ60にインストールされるようにしてもよい。 Alternatively, the imaging support processing program 100 may be stored in a storage unit of another computer, a server device, or the like connected to the computer 60 via a communication network (not shown), and the imaging support processing program 100 may be downloaded and installed in the computer 60 in response to a request from the camera 1.
 なお、コンピュータ60に接続される他のコンピュータ又はサーバ装置等の記憶部、又はNVM62に撮像支援処理プログラム100の全てを記憶させておく必要はなく、撮像支援処理プログラム100の一部を記憶させておいてもよい。 Note that it is not necessary to store the entire imaging support processing program 100 in the storage unit of another computer or server device connected to the computer 60, or in the NVM 62; only a part of the imaging support processing program 100 may be stored.
 また、図31に示す例では、カメラ1にコンピュータ60が内蔵されている態様例が示されているが、本開示の技術はこれに限定されず、例えば、コンピュータ60がカメラ1の外部に設けられるようにしてもよい。 Although the example shown in FIG. 31 shows a mode in which the computer 60 is built into the camera 1, the technology of the present disclosure is not limited to this; for example, the computer 60 may be provided outside the camera 1.
 また、図31に示す例では、CPU61は、単数のCPUであるが、複数のCPUであってもよい。また、CPU61に代えてGPUを適用してもよい。 Also, in the example shown in FIG. 31, the CPU 61 is a single CPU, but may be a plurality of CPUs. Also, a GPU may be applied instead of the CPU 61 .
 また、図31に示す例では、コンピュータ60が例示されているが、本開示の技術はこれに限定されず、コンピュータ60に代えて、ASIC、FPGA、及び/又はPLDを含むデバイスを適用してもよい。また、コンピュータ60に代えて、ハードウェア構成及びソフトウェア構成の組み合わせを用いてもよい。 Although the computer 60 is illustrated in the example shown in FIG. 31, the technology of the present disclosure is not limited to this, and a device including an ASIC, an FPGA, and/or a PLD may be applied instead of the computer 60. A combination of a hardware configuration and a software configuration may also be used instead of the computer 60.
 第1実施形態から第5実施形態で説明した撮像支援処理を実行するハードウェア資源としては、次に示す各種のプロセッサを用いることができる。プロセッサとしては、例えば、ソフトウェア、すなわち、プログラムを実行することによって、撮像支援処理を実行するハードウェア資源として機能する汎用的なプロセッサであるCPUが挙げられる。また、プロセッサとしては、例えば、FPGA、PLD、又はASICなどの特定の処理を実行させるために専用に設計された回路構成を有するプロセッサである専用電気回路が挙げられる。何れのプロセッサにもメモリが内蔵又は接続されており、何れのプロセッサもメモリを使用することによって撮像支援処理を実行する。 Various processors shown below can be used as hardware resources for executing the imaging support processing described in the first to fifth embodiments. As a processor, for example, there is a CPU, which is a general-purpose processor that functions as a hardware resource that executes imaging support processing by executing software, that is, a program. Also, processors include, for example, FPGAs, PLDs, ASICs, and other dedicated electric circuits that are processors having circuit configurations specially designed to execute specific processing. A memory is built in or connected to each processor, and each processor uses the memory to execute imaging support processing.
 撮像支援処理を実行するハードウェア資源は、これらの各種のプロセッサのうちの1つで構成されてもよいし、同種または異種の2つ以上のプロセッサの組み合わせ（例えば、複数のFPGAの組み合わせ、又はCPUとFPGAとの組み合わせ）で構成されてもよい。また、撮像支援処理を実行するハードウェア資源は1つのプロセッサであってもよい。 The hardware resource that executes the imaging support processing may be configured by one of these various processors, or by a combination of two or more processors of the same type or different types (for example, a combination of a plurality of FPGAs, or a combination of a CPU and an FPGA). The hardware resource that executes the imaging support processing may also be a single processor.
 1つのプロセッサで構成する例としては、第1に、1つ以上のCPUとソフトウェアの組み合わせで1つのプロセッサを構成し、このプロセッサが、撮像支援処理を実行するハードウェア資源として機能する形態がある。第2に、SoCなどに代表されるように、撮像支援処理を実行する複数のハードウェア資源を含むシステム全体の機能を1つのICチップで実現するプロセッサを使用する形態がある。このように、撮像支援処理は、ハードウェア資源として、上記各種のプロセッサの1つ以上を用いて実現される。 As an example of configuration with a single processor, first, there is a form in which one processor is configured by a combination of one or more CPUs and software, and this processor functions as the hardware resource that executes the imaging support processing. Second, as typified by an SoC, there is a form that uses a processor that realizes, with a single IC chip, the functions of an entire system including a plurality of hardware resources that execute the imaging support processing. In this way, the imaging support processing is realized using one or more of the various processors described above as hardware resources.
 Furthermore, as the hardware structure of these various processors, more specifically, an electric circuit combining circuit elements such as semiconductor elements can be used. The imaging support processing described above is merely an example; needless to say, unnecessary steps may be deleted, new steps may be added, and the processing order may be changed without departing from the gist of the invention.
 The descriptions and illustrations given above are a detailed explanation of the parts related to the technology of the present disclosure, and are merely an example of that technology. For example, the above descriptions of configurations, functions, actions, and effects describe one example of the configurations, functions, actions, and effects of the parts related to the technology of the present disclosure. Accordingly, needless to say, unnecessary parts may be deleted from, and new elements may be added to or substituted into, the descriptions and illustrations given above without departing from the gist of the technology of the present disclosure. In addition, to avoid complication and to facilitate understanding of the parts related to the technology of the present disclosure, the descriptions and illustrations given above omit explanations of common technical knowledge and the like that require no particular explanation to enable implementation of the technology of the present disclosure.
 In this specification, "A and/or B" is synonymous with "at least one of A and B." That is, "A and/or B" means that it may be A alone, B alone, or a combination of A and B. The same idea as for "A and/or B" also applies in this specification when three or more matters are joined by "and/or."
 All documents, patent applications, and technical standards described in this specification are incorporated herein by reference to the same extent as if each individual document, patent application, or technical standard were specifically and individually indicated to be incorporated by reference.

Claims (17)

  1.  A control device comprising:
     a processor; and
     a memory connected to or built into the processor,
     wherein the processor has a first mode of imaging based on light received by an image sensor of an imaging device and a second mode of deriving a temperature based on near-infrared light received by the image sensor, and
     a control factor differs between the first mode and the second mode.
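Claims 1 through 8 describe a controller whose control factors differ between the imaging mode and the thermometry mode. The following minimal Python sketch illustrates that arrangement; the mode names, the particular factors, and their values are illustrative assumptions, not part of the claims:

```python
from enum import Enum, auto

class Mode(Enum):
    IMAGING = auto()      # first mode: imaging from received light
    THERMOMETRY = auto()  # second mode: temperature derived from NIR light

# Hypothetical per-mode control factors; the claims only require that the
# factor sets differ between the two modes (display per claim 3, projector
# suppression per claim 4, image-processing settings per claims 7 and 8).
CONTROL_FACTORS = {
    Mode.IMAGING: {
        "display": "captured_image",
        "projector": "enabled",
        "noise_reduction": "standard",
    },
    Mode.THERMOMETRY: {
        "display": "temperature_info",
        "projector": "suppressed",
        "noise_reduction": "strong",
    },
}

def control_factors(mode: Mode) -> dict:
    """Return the control-factor set configured for the given mode."""
    return CONTROL_FACTORS[mode]
```

A caller would simply look up the active mode's factor set and apply each factor to the corresponding subsystem (display, projector, image pipeline).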
  2.  The control device according to claim 1, wherein the control factor includes a display control factor for causing a display to perform display.
  3.  The control device according to claim 2, wherein the processor:
     in the first mode, sets, as the display control factor, a captured-image display factor for causing the display to display a captured image obtained by the image sensor receiving the light; and
     in the second mode, sets, as the display control factor, a temperature-information display factor for causing the display to display temperature information indicating the temperature.
  4.  The control device according to any one of claims 1 to 3, wherein the control factor includes a light projection control factor for operating a light projector, and
     in the second mode, the processor sets, as the light projection control factor, a light projection suppression control factor for suppressing light projection by the light projector.
  5.  The control device according to any one of claims 1 to 4, wherein the control factor includes an imaging setting factor related to imaging settings.
  6.  The control device according to claim 5, wherein the imaging settings include at least one of a setting related to a light projector, a setting related to shutter speed, a setting related to aperture, a setting related to photometry, a setting related to sensitivity of the image sensor, a setting related to high dynamic range, and a setting related to image stabilization control.
  7.  The control device according to any one of claims 1 to 6, wherein the control factor includes an image processing setting factor related to image processing settings.
  8.  The control device according to claim 7, wherein the image processing settings include at least one of a setting related to noise reduction, a setting related to sharpness, a setting related to contrast, and a setting related to tone.
  9.  A control device comprising:
     a processor; and
     a memory connected to or built into the processor,
     wherein the processor has a first mode of imaging based on light received by an image sensor of an imaging device and a second mode of deriving a temperature based on near-infrared light received by the image sensor, and
     the processor sets the first mode when a light projector is on and sets the second mode when the light projector is off.
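The projector-driven mode selection of claims 9 and 10 can be sketched as follows; the mode labels and the temperature threshold are assumptions for illustration only, since the claims leave the switching criterion open:

```python
def select_mode(projector_on: bool) -> str:
    # Claim 9: projector on -> first (imaging) mode,
    #          projector off -> second (thermometry) mode.
    return "first_mode" if projector_on else "second_mode"

def next_mode(current: str, temperature_c: float,
              threshold_c: float = 60.0) -> str:
    # Claim 10: while in the second mode, switch back to the first mode
    # in accordance with the derived temperature. The threshold value
    # is a hypothetical choice, not specified by the claim.
    if current == "second_mode" and temperature_c >= threshold_c:
        return "first_mode"
    return current
```

In a running device, `select_mode` would be evaluated whenever the projector state changes, and `next_mode` on each new temperature reading.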
  10.  The control device according to claim 9, wherein, in the second mode, the processor switches from the second mode to the first mode in accordance with the temperature.
  11.  The control device according to claim 10, wherein the light projector performs pulsed light emission, and
     the processor repeats, in accordance with the emission timing of the pulsed light emission, an operation of setting the first mode during an emission period of the pulsed light emission and setting the second mode during an emission-stop period of the pulsed light emission.
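The pulse-synchronized switching of claim 11 can be illustrated as a function of time; the pulse period and duty cycle below are arbitrary assumptions, as the claim specifies only that the mode alternates with the emission timing:

```python
def mode_at(time_ms: int, period_ms: int = 100, emit_ms: int = 40) -> str:
    """Return the mode set at a given time for a pulsed projector.

    During each pulse's emission window the first (imaging) mode is set;
    during the emission-stop window the second (thermometry) mode is set.
    The 100 ms period and 40 ms emission window are illustrative only.
    """
    phase = time_ms % period_ms
    return "first_mode" if phase < emit_ms else "second_mode"
```

Sampling `mode_at` over successive times shows the repeated first-mode/second-mode alternation the claim describes.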
  12.  The control device according to any one of claims 1 to 11, wherein the processor outputs a composite image obtained by combining a first captured image, obtained by the image sensor receiving light, with the temperature information indicating the temperature.
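The composite output of claim 12 can be sketched with a toy text-based overlay standing in for a real rendering pipeline; the function name, image representation, and overlay position are hypothetical:

```python
def compose(image_rows, temperature_c, position=(0, 0)):
    """Overlay a temperature label onto a captured image.

    `image_rows` is a list of equal-length strings standing in for pixel
    rows; the temperature label is written character-by-character at
    `position`, leaving the rest of the image untouched.
    """
    label = f"{temperature_c:.1f}C"
    r, c = position
    composed = [list(row) for row in image_rows]
    for i, ch in enumerate(label):
        if c + i < len(composed[r]):
            composed[r][c + i] = ch
    return ["".join(row) for row in composed]
```

A real implementation would blend rendered glyphs (or a thermal color map) into the pixel data, but the combination of captured image and temperature information is the same in structure.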
  13.  An imaging device comprising:
     the control device according to any one of claims 1 to 12; and
     the image sensor.
  14.  A control method comprising:
     switching between a first mode of imaging based on light received by an image sensor of an imaging device and a second mode of deriving a temperature based on near-infrared light received by the image sensor; and
     making a control factor differ between the first mode and the second mode.
  15.  A control method comprising:
     switching between a first mode of imaging based on light received by an image sensor of an imaging device and a second mode of deriving a temperature based on near-infrared light received by the image sensor; and
     setting the first mode when a light projector is operating and setting the second mode when the light projector is not operating.
  16.  A program for causing a computer to execute processing comprising:
     switching between a first mode of imaging based on light received by an image sensor of an imaging device and a second mode of deriving a temperature based on near-infrared light received by the image sensor; and
     making a control factor differ between the first mode and the second mode.
  17.  A program for causing a computer to execute processing comprising:
     switching between a first mode of imaging based on light received by an image sensor of an imaging device and a second mode of deriving a temperature based on near-infrared light received by the image sensor; and
     setting the first mode when a light projector is operating and setting the second mode when the light projector is not operating.
PCT/JP2022/000783 2021-02-26 2022-01-12 Control device, imaging device, control method, and program WO2022181095A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023502151A JPWO2022181095A1 (en) 2021-02-26 2022-01-12

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021031215 2021-02-26
JP2021-031215 2021-02-26

Publications (1)

Publication Number Publication Date
WO2022181095A1 true WO2022181095A1 (en) 2022-09-01

Family

ID=83048861

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/000783 WO2022181095A1 (en) 2021-02-26 2022-01-12 Control device, imaging device, control method, and program

Country Status (2)

Country Link
JP (1) JPWO2022181095A1 (en)
WO (1) WO2022181095A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH066652A (en) * 1992-06-23 1994-01-14 Sony Corp Video camera
JP2005191948A (en) * 2003-12-25 2005-07-14 Casio Comput Co Ltd Camera apparatus and program
JP2011047747A (en) * 2009-08-26 2011-03-10 Sanyo Electric Co Ltd Temperature measurement display device and portable information communication terminal
JP2013222980A (en) * 2012-04-12 2013-10-28 Canon Inc Image-pickup device and image-pickup method
CN104867265A (en) * 2015-04-22 2015-08-26 深圳市佳信捷技术股份有限公司 Camera apparatus, and fire detection alarm system and method
JP2016127432A (en) * 2014-12-29 2016-07-11 セコム株式会社 Image sensing device
JP2017017593A (en) * 2015-07-02 2017-01-19 キヤノン株式会社 Imaging device
US20210004970A1 (en) * 2019-07-01 2021-01-07 Snap-On Incorporated Apparatus with component aligner

Also Published As

Publication number Publication date
JPWO2022181095A1 (en) 2022-09-01

Similar Documents

Publication Publication Date Title
JP5907738B2 (en) Imaging apparatus, display method, and program
JP5443916B2 (en) Camera body
CN105579880B (en) The method of work of endoscope-use camera system, endoscope-use camera system
TWI704808B (en) Focusing of a camera monitoring a scene
JP6886037B2 (en) Finder device, image pickup device, and control method of finder device
JP2020057989A (en) Sensor module, electronic equipment, vision sensor calibration method, photogenic subject detection method and program
JP5607210B2 (en) Endoscope system
WO2022181095A1 (en) Control device, imaging device, control method, and program
JP6916891B2 (en) Finder device, image pickup device, and control method of finder device
JP2006171213A (en) Microscope system
JP2013186293A (en) Image generation device and image display method
US10560635B2 (en) Control device, control method, and program
JP6584103B2 (en) Imaging device
JP6403002B2 (en) Projector system and projector system control method
US20240015377A1 (en) Imaging control device, imaging apparatus, imaging control method, and program
JP4640108B2 (en) camera
JP6529214B2 (en) Imaging device
JP2009014495A (en) Measuring device and measuring method using it
JP2012168429A (en) Image pickup apparatus
JP5281494B2 (en) Image processing apparatus and method
WO2022181094A1 (en) Optical device, optical device operation method, and program
JP7334325B2 (en) Imaging device
WO2023188939A1 (en) Image capture method, image capture device, and program
WO2022213340A1 (en) Focusing method, photographic device, photographic system and readable storage medium
JP2013066051A (en) Photographing device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22759138

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2023502151

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22759138

Country of ref document: EP

Kind code of ref document: A1