EP3617795A1 - Image capture apparatus having illumination section, monitoring system including image capture apparatus, method of controlling image capture apparatus, and storage medium


Info

Publication number
EP3617795A1
Authority
EP
European Patent Office
Prior art keywords
image capture
section
eye image
eye
sections
Legal status
Withdrawn
Application number
EP19193874.5A
Other languages
German (de)
English (en)
French (fr)
Inventor
Aihiko Numata
Current Assignee
Canon Inc
Original Assignee
Canon Inc
Application filed by Canon Inc
Publication of EP3617795A1

Classifications

    • H04N23/72: Combination of two or more compensation controls (circuitry for compensating brightness variation in the scene)
    • G03B15/02: Illuminating scene (special procedures for taking photographs; apparatus therefor)
    • H04N23/74: Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • G03B37/02: Panoramic or wide-screen photography with scanning movement of lens or cameras
    • G08B13/19626: Surveillance camera constructional details; optical details, e.g. lenses, mirrors or multiple lenses
    • G08B13/19641: Multiple cameras having overlapping views on a single scene
    • H04N23/45: Generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H04N23/55: Optical parts specially adapted for electronic image sensors; mounting thereof
    • H04N23/56: Cameras or camera modules provided with illuminating means
    • H04N23/661: Transmitting camera control signals through networks, e.g. control via the Internet
    • H04N23/698: Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • H04N23/71: Circuitry for evaluating the brightness variation
    • H04N23/73: Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N23/90: Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • H04N7/181: Closed-circuit television [CCTV] systems for receiving images from a plurality of remote sources
    • H04N7/183: Closed-circuit television [CCTV] systems for receiving images from a single remote source
    • G08B13/1963: Arrangements allowing camera rotation to change view, e.g. pivoting camera, pan-tilt and zoom [PTZ]

Definitions

  • the present invention relates to an image capture apparatus having an illumination section, a monitoring system including the image capture apparatus, a method of controlling the image capture apparatus, and a storage medium, and more particularly to an image capture apparatus having an illumination section, which is used e.g. for surveillance during the night, a monitoring system including the image capture apparatus, a method of controlling the image capture apparatus, and a storage medium.
  • The illumination intensity at an illuminated flat surface is gradually reduced toward the periphery of the surface, compared with the illumination intensity at an area directly under the light source, according to the cosine fourth law, and hence it is difficult to illuminate a subject at a uniform illumination intensity. It is much more difficult to illuminate a subject at a uniform illumination intensity in a case where an LED light source having a light distribution with high directivity is used for illumination, as disclosed in Japanese Laid-Open Patent Publication (Kokai) No. 2013-41282.
  • the present invention provides an image capture apparatus that is capable of improving the quality of an image captured using illumination, a monitoring system including the image capture apparatus, a method of controlling the image capture apparatus, and a storage medium.
  • a non-transitory computer-readable storage medium storing a computer-executable program for executing a method of controlling an image capture apparatus as specified in claim 18.
  • FIG. 1A is a schematic view of the image capture apparatus, denoted by reference numeral 100, as viewed obliquely.
  • FIG. 1B is a view of the image capture apparatus 100 as viewed from above, i.e. from the +Z direction of the XYZ coordinate system indicated therein.
  • FIG. 1C is an internal function block diagram of the image capture apparatus 100.
  • the image capture apparatus 100 includes a multi-eye image capture section 110, a synthesis processor 114, a single-eye image capture section 120, a controller 130, and a transfer section 140.
  • the multi-eye image capture section 110 includes a plurality of image capture sections 111a to 111d that capture images in different image capture ranges, which partially overlap, so as to generate a wide-angle image 101.
  • the single-eye image capture section 120 includes a single image capture section 121 that captures an image in part of the whole image capture range covered by the multi-eye image capture section 110 so as to acquire a detailed image 102.
  • the controller 130 controls the overall operation of the image capture apparatus 100, including the operations of the multi-eye image capture section 110 and the single-eye image capture section 120.
  • the transfer section 140 transfers the wide-angle image 101 and the detailed image 102 to an external client apparatus. More specifically, the transfer section 140 is connected to the external client apparatus e.g. via a wired or wireless network, and switches an image to be transferred to one of the wide-angle image 101 and the detailed image 102 by a switch, not shown, within the transfer section 140. With this, the wide-angle image 101 and the detailed image 102 are sequentially transferred from the transfer section 140 to the external client apparatus via the same network.
  • the external client apparatus transmits a command for controlling the image capture apparatus 100 to the transfer section 140 via the network. After that, upon transfer of the command from the transfer section 140 to the controller 130, the controller 130 performs processing corresponding to the command in the image capture apparatus 100, and transmits a result of the processing to the client apparatus as a response to the command.
  • the client apparatus is an external apparatus, such as a PC, and the network is formed by a wired LAN, a wireless LAN, or the like. Further, power may be supplied to the image capture apparatus 100 via the network.
  • a monitoring system using the image capture apparatus is comprised of the image capture apparatus 100 and an external client apparatus (information processing apparatus) connected to the image capture apparatus 100 via a network.
  • the multi-eye image capture section 110 which is a multi-lens wide-angle camera will be described.
  • the multi-eye image capture section 110 includes the plurality of image capture sections 111a to 111d, which are arranged such that the image capture ranges partially overlap, and generates the wide-angle image 101 using the synthesis processor 114 that synthesizes a plurality of images acquired by the image capture sections 111a to 111d. More specifically, the wide-angle image 101 is generated by applying a technique of so-called pattern matching in which a correlation coefficient is determined while shifting an overlapping part of images acquired by adjacent ones (e.g. the image capture sections 111a and 111b) of a plurality of image capture sections, to thereby determine positional shift amounts between the plurality of images.
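  • As an illustration of the pattern-matching step described above, the following is a minimal sketch (not the apparatus's actual synthesis processing) that estimates the horizontal shift between the overlapping strips of two adjacent images by maximizing a correlation coefficient; the function name, the fixed overlap width, and the search range are assumptions introduced here.

        import numpy as np

        def estimate_shift(left_img, right_img, overlap=64, search=16):
            """Estimate the horizontal positional shift between two adjacent images.

            left_img, right_img: 2-D grayscale arrays whose image capture ranges
            partially overlap; the rightmost `overlap` columns of left_img roughly
            correspond to the leftmost columns of right_img.
            """
            template = left_img[:, -overlap:].astype(np.float64)
            best_shift, best_corr = 0, -1.0
            for s in range(search + 1):
                candidate = right_img[:, s:s + overlap].astype(np.float64)
                if candidate.shape != template.shape:
                    break
                # Correlation coefficient between the two overlapping strips.
                a = template - template.mean()
                b = candidate - candidate.mean()
                corr = float((a * b).sum() / (np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12))
                if corr > best_corr:
                    best_corr, best_shift = corr, s
            return best_shift, best_corr

        # Example with synthetic data: the right image is the left image shifted by 5 columns.
        rng = np.random.default_rng(0)
        scene = rng.integers(0, 256, size=(120, 400)).astype(np.float64)
        left, right = scene[:, :200], scene[:, 131:331]
        print(estimate_shift(left, right))   # -> (5, ~1.0)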
  • Although in the present embodiment the wide-angle image 101 is generated by the synthesis processor 114, this is not limitative. The transfer section 140 (output unit) may instead transfer the images acquired by the image capture sections 111a to 111d as they are, and the external client apparatus may generate the wide-angle image 101 based on the images acquired by the image capture sections 111a to 111d.
  • the plurality of image capture sections 111a to 111d include image forming optical systems 112a to 112d and solid-state image capture devices 113a to 113d, respectively.
  • the image capture sections 111a to 111d acquire images by causing subject images to be formed on the solid-state image capture devices 113a to 113d via the image forming optical systems 112a to 112d, respectively
  • Driving and signal reading of each of the solid-state image capture devices 113a to 113d are controlled by the controller 130. That is, the controller 130 controls the exposure level of each of the image capture sections 111a to 111d of the multi-eye image capture section 110 by controlling a time period over which charges are accumulated in pixels of the solid-state image capture devices 113a to 113d. Details of this exposure level control will be described hereinafter. Note that the exposure level control may be performed by controlling exposure parameters, such as a digital gain, an aperture value, and the density of an ND filter or the like, which are associated with components for adjusting the light amount and exposure, other than the solid-state image capture devices 113a to 113d.
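  • As a worked illustration of this exposure level control, the short sketch below converts an exposure-level change (in EV, following the convention stated later in the description that raising the exposure level by 1 EV doubles the image luminance) into a charge-accumulation time, assuming that gain, aperture value, and ND density are held fixed; the function name and the example values are not taken from the description.

        def accumulation_time_for_ev(base_time_s, delta_ev):
            """Charge-accumulation (shutter) time after changing the exposure level.

            With the convention that raising the exposure level by 1 EV doubles the
            image luminance, a change of delta_ev scales the accumulation time by
            2 ** delta_ev when the gain, aperture value, and ND density are unchanged.
            """
            return base_time_s * (2.0 ** delta_ev)

        # Example: lowering the exposure level of an image capture section by 2 EV
        # shortens a 1/30 s accumulation time to 1/120 s.
        print(accumulation_time_for_ev(1 / 30, -2))   # 0.008333... s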
  • the single-eye image capture section 120 which is a single-lens camera will be described.
  • the single-eye image capture section 120 includes the single image capture section 121 formed by an image forming optical system 122 and a solid-state image capture device 123, a drive mechanism 124 which can change an image capture direction, and an illumination section 125. Similar to the multi-eye image capture section 110, driving and signal reading of the solid-state image capture device 123 are controlled by the controller 130.
  • the drive mechanism 124 includes a motor and a gear, and rotates the single-eye image capture section 120 as a single unit about a specific rotational axis by controlling electric power for driving the motor. This rotation of the single-eye image capture section 120, performed by the drive mechanism 124, is controlled by the controller 130. Note that the drive mechanism 124 may be configured to have a plurality of motors, whereby the single-eye image capture section 120 may be rotated about a plurality of rotational axes.
  • The illumination section 125 is an LED formed of a compound semiconductor material, such as InGaN or AlGaAs, or an organic semiconductor, and provides illumination within the image capture angle of the single-eye image capture section 120. More specifically, the illumination section 125 has a peak of illumination intensity within the image capture range of the single-eye image capture section 120.
  • the illumination light irradiated from the illumination section 125 is set to a wavelength to which the solid-state image capture devices 113a to 113d and 123 of the image capture sections 111a to 111d and 121 have sensitivity.
  • the peak wavelength of the illumination light is only required to be set to a value which is not smaller than 300 nm and not larger than 1100 nm.
  • The image capture apparatus 100 controls the exposure levels of the plurality of image capture sections 111a to 111d of the multi-eye image capture section 110 based on the image capture direction of the single-eye image capture section 120. More specifically, the exposure level of any of the image capture sections 111a to 111d of the multi-eye image capture section 110 whose image capture range at least partially overlaps with the image capture range of the single-eye image capture section 120 is controlled to a value lower than the exposure levels of the image capture sections whose image capture ranges do not overlap with it. For example, when the single-eye image capture section 120 is oriented so that its image capture range overlaps with those of the image capture sections 111b and 111c, the exposure levels of the image capture sections 111b and 111c are controlled to values lower than those of the image capture sections 111a and 111d by this exposure level control.
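  • The following is a minimal sketch of this control under simplifying assumptions: each image capture section of the multi-eye image capture section is modelled only by the azimuth of the centre of its image capture range and a half angle, the single-eye image capture section by its image capture direction and half angle, and a fixed EV reduction is applied to every overlapping section. The function names, the angles, and the 2 EV reduction are illustrative, not values from the description.

        def ranges_overlap(center_a, half_a, center_b, half_b):
            """True if two horizontal image capture ranges (degrees, azimuth) overlap."""
            diff = abs((center_a - center_b + 180.0) % 360.0 - 180.0)  # wrapped angular distance
            return diff < (half_a + half_b)

        def multi_eye_exposure_levels(section_centers, section_half_angle,
                                      single_eye_direction, single_eye_half_angle,
                                      base_ev, reduction_ev):
            """Lower the exposure level of every multi-eye section whose image capture
            range at least partially overlaps the illuminated single-eye range."""
            return [base_ev - reduction_ev
                    if ranges_overlap(c, section_half_angle,
                                      single_eye_direction, single_eye_half_angle)
                    else base_ev
                    for c in section_centers]

        # Example: four sections centred at 45, 135, 225 and 315 degrees, each covering
        # +/-50 degrees, with the single-eye section pointing at 90 degrees (+/-20 degrees).
        print(multi_eye_exposure_levels([45, 135, 225, 315], 50, 90, 20,
                                        base_ev=10, reduction_ev=2))
        # -> [8, 8, 10, 10]: the two sections facing the illuminated area are lowered.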
  • the following description is given of the exposure level control for the multi-eye image capture section 110 in a case where the image capture range of the single-eye image capture section 120 overlaps only with the image capture range of one of the image capture sections of the multi-eye image capture section 110, and does not overlap with the image capture ranges of the other image capture sections, in comparison with a conventional example.
  • FIG. 2A shows a state of the image capture apparatus 100 in which the image capture range of the single-eye image capture section 120 overlaps only with the image capture range of the image capture section 111a of the multi-eye image capture section 110.
  • FIG. 2B shows the illumination intensity distribution produced by the illumination section 125 in the state shown in FIG. 2A.
  • FIG. 2C shows the exposure levels set to the image capture sections 111a to 111d, respectively, for exposure level adjustment to the illumination intensity distribution shown in FIG. 2B.
  • The exposure level refers to a so-called EV (Exposure Value). That is, when the EV is increased by 1, the luminance of the image is made twice as high as before.
  • FIG. 2D shows a shading correction value set in a case where not only the exposure level adjustment shown in FIG. 2C but also shading adjustment is performed within the image capture range of the image capture section 111a so as to flatten the pixel signal levels of the image capture sections 111a to 111d. This shading adjustment will be described hereinafter.
  • the illumination section 125 has a peak of its illumination intensity in a direction in which the single-eye image capture section 120 is oriented. Therefore, in the case of the illumination intensity distribution shown in FIG. 2B , assuming that all of the image capture sections 111a to 111d of the multi-eye image capture section 110 are equal in exposure level, only an image acquired by the image capture section 111a having the image capture range overlapping with the image capture range of the single-eye image capture section 120 becomes brighter than images acquired by the other image capture sections 111b to 111d. As a result, luminance unevenness is caused in the wide-angle image 101 generated by synthesizing the images captured by the image capture sections 111a to 111d.
  • If the exposure level of the image capture section 111a is made lower than that of the image capture sections 111b to 111d, the difference in brightness between the image acquired by the image capture section 111a and the images acquired by the image capture sections 111b to 111d is reduced. As a result, it is possible to reduce the luminance unevenness caused in the wide-angle image 101 generated by synthesizing the images captured by the image capture sections 111a to 111d, and therefore to improve the quality of the wide-angle image 101.
  • Note that the exposure levels of the image capture sections 111a to 111d may also be varied within the image capture angle of the single-eye image capture section 120. That is, in a case where the image capture range of the single-eye image capture section 120 overlaps with the image capture ranges of a plurality of image capture sections of the multi-eye image capture section 110, the exposure level may be controlled to different values between those image capture sections. More specifically, it is preferable that the closer the image capture range of an image capture section is to the image capture direction of the single-eye image capture section 120, the lower the exposure level set to that image capture section is made.
  • the term "image capture direction of the single-eye image capture section 120" refers to the direction of the optical axis of the image forming optical system 122.
  • The following description is given of the exposure level control performed by the image capture apparatus 100 on the multi-eye image capture section 110 in a case where the image capture range of the single-eye image capture section 120 overlaps with the image capture ranges of two of the image capture sections of the multi-eye image capture section 110, and the image capture range of one of the two image capture sections is closer to the image capture direction of the single-eye image capture section 120 than that of the other.
  • FIG. 3A shows a state of the image capture apparatus 100 in which the image capture range of the single-eye image capture section 120 overlaps with the image capture ranges of the image capture sections 111a and 111b of the multi-eye image capture section 110, and the image capture range of the image capture section 111a is closer to the image capture direction of the single-eye image capture section 120 than that of the image capture section 111b is.
  • FIG. 3B shows illumination intensity distribution by the illumination section 125, occurring in the state shown in FIG. 3A
  • FIG. 3C shows the exposure levels set to the image capture sections 111a to 111d in this state.
  • As shown by the illumination intensity distribution in FIG. 3B, when a comparison is made between the illumination intensities associated with the image capture sections 111a and 111b, the illumination intensity is higher in the image capture range of the image capture section 111a, which is closer to the image capture direction of the single-eye image capture section 120, than in the image capture range of the image capture section 111b. Therefore, it is possible to reduce the luminance unevenness of the wide-angle image 101 generated by synthesis by making the exposure level of the image capture section 111a lower than that of the image capture section 111b.
  • Note that the exposure level control may also be performed for an image capture section of the multi-eye image capture section 110 whose image capture range lies outside the image capture range of the single-eye image capture section 120. More specifically, this applies to the image capture section 111c in FIG. 3B: although its image capture range does not overlap with the image capture range of the single-eye image capture section 120, the illumination intensity distribution by the illumination section 125 spreads to part of its image capture range.
  • the above description is given of the method of reducing luminance unevenness of the wide-angle image 101 by adjusting the exposure level of each of the plurality of image capture sections 111a to 111d of the multi-eye image capture section 110.
  • The coefficient of shading correction is determined by measuring the illumination intensity distribution produced by the illumination section 125 in advance. More specifically, the illumination intensity distribution can be measured from images acquired by the image capture sections 111a to 111d of a subject having a uniform reflectance distribution while only the illumination section 125 is lit.
  • Shading correction is performed for any of the image capture sections 111a to 111d of the multi-eye image capture section 110 whose image capture range at least partially overlaps with the image capture range of the single-eye image capture section 120. More specifically, as shown in FIG. 2D, it is only required that shading correction be performed within the image capture ranges of the image capture sections 111a to 111d such that the pixel signal level is raised more, the farther a position is from the image capture direction of the single-eye image capture section 120.
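  • A minimal sketch of such shading correction is given below, assuming the calibration procedure described above (an image of a uniformly reflective subject captured while only the illumination section is lit) and simple per-pixel gain multiplication; the array names and the synthetic fall-off are illustrative only.

        import numpy as np

        def shading_gain_map(calibration_img, eps=1e-6):
            """Per-pixel shading-correction gains derived from a calibration image.

            calibration_img: image of a subject with uniform reflectance captured while
            only the illumination section is lit, so its pixel values follow the
            illumination intensity distribution of the illumination section.
            """
            calibration = calibration_img.astype(np.float64)
            reference = calibration.max()            # brightest (on-axis) region
            return reference / (calibration + eps)   # gain grows away from the intensity peak

        def apply_shading_correction(image, gain_map):
            """Flatten the pixel signal levels by multiplying with the gain map."""
            corrected = image.astype(np.float64) * gain_map
            return np.clip(corrected, 0, 255).astype(np.uint8)

        # Example with synthetic data: a centred illumination fall-off is flattened.
        y, x = np.mgrid[0:64, 0:64]
        falloff = 1.0 - 0.6 * np.hypot(x - 32, y - 32) / 45.0
        calibration = (200 * falloff).astype(np.uint8)
        flattened = apply_shading_correction(calibration, shading_gain_map(calibration))
        print(calibration.min(), calibration.max())   # uneven before correction
        print(flattened.min(), flattened.max())       # approximately flat afterwards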
  • FIG. 1C shows the case where only the single-eye image capture section 120 includes the illumination section 125, but the multi-eye image capture section 110 does not include an illumination section.
  • each of the image capture sections 111a to 111d of the multi-eye image capture section 110 may include an illumination section having a peak of illumination intensity in its image capture range.
  • In that case, the SN ratio of the wide-angle image 101 in a low-illumination environment is improved, and hence this configuration is even more preferable.
  • FIGS. 1B and 1C show the case where the multi-eye image capture section 110 includes the four image capture sections 111a to 111d
  • the number of image capture sections included in the multi-eye image capture section 110 is not limited to four.
  • the multi-eye image capture section 110 may include two or three, or five or more image capture sections.
  • Further, the image capture ranges of the multi-eye image capture section 110 are not necessarily required to be the ranges indicated by the broken lines in FIG. 1B; they may cover the whole circumference of 360 degrees.
  • Although the transfer section 140 sequentially transfers images to the external client apparatus via the same network by switching the image to be transferred between the wide-angle image 101 and the detailed image 102 with the switch, not shown, the transfer method is not limited to this.
  • the image capture apparatus 100 may have a transfer section dedicated to the wide-angle image 101 and a transfer section dedicated to the detailed image 102, and transfer the wide-angle image 101 and the detailed image 102 to the external client apparatus via different networks, respectively.
  • it is more preferable to distribute images via the same network because it is easy to grasp correspondence between the wide-angle image 101 and the detailed image 102.
  • the present embodiment shows an example in which the transfer section 140 transfers the wide-angle image 101 and the detailed image 102 to the external client apparatus, and receives a command from the external client apparatus, and the controller 130 controls the operation of the image capture apparatus 100 according to the received command.
  • the image capture apparatus 100 may include, in place of the transfer section 140, a memory for storing the wide-angle image 101 and the detailed image 102, a viewer for displaying the wide-angle image 101 and the detailed image 102, stored in the memory, and an interface for receiving a command from a user.
  • the image capture apparatus 100 shown in FIG. 1C may additionally include the above-mentioned memory, viewer, and interface.
  • Although, as the method of controlling the exposure levels of the plurality of image capture sections 111a to 111d of the multi-eye image capture section 110, the method of controlling the time period over which charges are accumulated in the pixels of the solid-state image capture devices 113a to 113d is described by way of example, it is not necessarily required to use this method.
  • For example, the exposure level may be controlled by controlling signal amplification coefficients; in that case, a gain controller controls the signal amplification coefficients applied before analog-to-digital conversion (analog gains).
  • Further, in a case where the image forming optical systems 112a to 112d each include an aperture control mechanism, the controller 130 may control the exposure levels of the plurality of image capture sections 111a to 111d by controlling each aperture control mechanism. Further, in a case where the image forming optical systems 112a to 112d each include a light absorbing filter and a mechanism for inserting and removing the light absorbing filter, the controller 130 may control the exposure level by controlling insertion and removal of the light absorbing filter using the mechanism.
  • Similarly, in a case where the image forming optical systems 112a to 112d each include a variable transmittance filter, the controller 130 may control the exposure level by controlling a voltage applied to the variable transmittance filter.
  • the plurality of exposure level control methods described above may be used in combination.
  • a frame rate of each of the image capture sections 111a to 111d of the multi-eye image capture section 110 may be changed.
  • the charge accumulation time period of the solid-state image capture devices 113a to 113d can be set to be longer as the frame rate is lower, and hence it is possible to increase the exposure level of the image capture sections 111a to 111d.
  • More specifically, the controller 130 controls the image capture sections 111a to 111d of the multi-eye image capture section 110 such that the frame rate of any image capture section whose image capture range does not overlap with the image capture range of the single-eye image capture section 120 is made lower than that of an image capture section whose image capture range at least partially overlaps with it. This makes it possible to further reduce the luminance unevenness of the wide-angle image 101.
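  • A minimal sketch of this frame-rate control is shown below; the particular rates (30 fps and 15 fps) and the function name are placeholders chosen for illustration, not values taken from the description.

        def multi_eye_frame_rates(overlaps_single_eye_range,
                                  overlapping_rate_fps=30.0, non_overlapping_rate_fps=15.0):
            """Assign a lower frame rate to sections outside the illuminated range.

            A lower frame rate permits a longer charge-accumulation time per frame,
            which raises the attainable exposure level of the image capture sections
            that receive no light from the illumination section.
            """
            return [overlapping_rate_fps if overlaps else non_overlapping_rate_fps
                    for overlaps in overlaps_single_eye_range]

        # Example: the ranges of 111a and 111b overlap the single-eye range, 111c and 111d do not.
        print(multi_eye_frame_rates([True, True, False, False]))   # [30.0, 30.0, 15.0, 15.0]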
  • An image capture apparatus 200 according to the second embodiment differs from the image capture apparatus 100 according to the first embodiment in the construction of a single-eye image capture section 220.
  • FIG. 4 is an internal function block diagram of the image capture apparatus 200 according to the second embodiment.
  • the single-eye image capture section 220 of the image capture apparatus 200 includes not only the same component elements as those of the single-eye image capture section 120, but also a zoom control mechanism 226 that is capable of changing an image capture angle of an image capture section 221.
  • the zoom control mechanism 226 includes a motor and a gear, and the configuration may be such that a zoom ratio is changed by moving one or some of the lenses of an image forming optical system 222 of the single-eye image capture section 220 in the optical axis direction.
  • an illumination section 225 of the single-eye image capture section 220 includes a narrow-angle illumination section 225a having a narrow light distribution angle and a wide-angle illumination section 225b having a wide light distribution angle, and selectively uses one of the narrow-angle illumination section 225a and the wide-angle illumination section 225b according to an image capture angle of the single-eye image capture section 220. More specifically, in a case where the image capture angle of the single-eye image capture section 220 is a narrow angle smaller than a predetermined angle, the narrow-angle illumination section 225a is used for illumination for image capture. On the other hand, in a case where the image capture angle of the single-eye image capture section 220 is a wide angle not smaller than the predetermined angle, the wide-angle illumination section 225b is used for illumination for image capture.
  • The narrow-angle illumination section 225a or the wide-angle illumination section 225b, which are different in light distribution angle, is thus selectively used according to the image capture angle of the single-eye image capture section 220, whereby it is possible to efficiently illuminate a subject whose image is being captured by the single-eye image capture section 220. As a result, it is possible to improve the image quality of the detailed image 202.
  • the following description is given of this control.
  • When the image capture angle of the single-eye image capture section 220 is narrow, it is preferable to increase the luminous intensity by using the narrow-angle illumination section 225a having the narrow light distribution angle.
  • On the other hand, when the image capture angle of the single-eye image capture section 220 is wide, it is preferable to enlarge the irradiation range by using the wide-angle illumination section 225b having the wide light distribution angle. This makes it possible to efficiently illuminate a subject whose image is to be captured.
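  • The selection between the two illumination sections can be summarized by the small sketch below; the 30-degree threshold stands in for the "predetermined angle" of the description, whose actual value is not given, and the function name is an assumption.

        def select_illumination(image_capture_angle_deg, predetermined_angle_deg=30.0):
            """Choose between the narrow-angle and the wide-angle illumination section.

            Below the (assumed) predetermined angle, the narrow-angle illumination
            section 225a is used to concentrate the luminous intensity; at or above it,
            the wide-angle illumination section 225b is used to cover the wider
            irradiation range.
            """
            if image_capture_angle_deg < predetermined_angle_deg:
                return "narrow-angle illumination section 225a"
            return "wide-angle illumination section 225b"

        print(select_illumination(12.0))   # telephoto setting  -> 225a
        print(select_illumination(60.0))   # wide-angle setting -> 225b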
  • Further, the exposure levels of the plurality of image capture sections 211a to 211d forming the multi-eye image capture section 210 are controlled based on the image capture angle of the single-eye image capture section 220. More specifically, the narrower the image capture angle of the single-eye image capture section 220 is, the larger the difference is made between the exposure level of an image capture section whose image capture range at least partially overlaps with the image capture range of the single-eye image capture section 220 and the exposure level of an image capture section whose image capture range does not overlap with it.
  • FIG. 5A and FIG. 6A are different in the image capture angle of the single-eye image capture section 220, i.e. the image capture angle of the single-eye image capture section 220 is wider in a case of exposure level control described with reference to FIGS. 6A to 6C than in a case of exposure level control described with reference to FIGS. 5A to 5C .
  • As described above, in the image capture apparatus 200 according to the second embodiment, the narrower the image capture angle of the single-eye image capture section 220 is, the narrower the light distribution angle of illumination from the illumination section 225 is. For this reason, as shown in FIGS. 5B and 6B, the narrower the image capture angle of the single-eye image capture section 220 is, the larger the peak value of the illumination intensity from the illumination section 225 is. Therefore, as shown in FIGS. 5C and 6C, the narrower the image capture angle of the single-eye image capture section 220 is, the larger the difference in exposure level between the image capture section 211a and the image capture sections 211b to 211d is made. This makes it possible to reduce the luminance unevenness of the wide-angle image 201 generated by synthesis.
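  • One possible way to realise this relation is sketched below, under the assumption (made only for illustration) that the exposure-level difference grows by 1 EV each time the image capture angle of the single-eye image capture section is halved; the reference angle and reference difference are likewise illustrative, not values from the description.

        import math

        def exposure_level_difference(image_capture_angle_deg,
                                      reference_angle_deg=60.0, reference_diff_ev=1.0):
            """EV difference between overlapping and non-overlapping multi-eye sections.

            The difference grows as the single-eye image capture angle narrows; here it
            is assumed to increase by 1 EV per halving of the angle, which is only one
            possible monotonic relation.
            """
            return reference_diff_ev + math.log2(reference_angle_deg / image_capture_angle_deg)

        print(exposure_level_difference(60.0))   # 1.0 EV at the reference angle
        print(exposure_level_difference(30.0))   # 2.0 EV when the angle is halved
        print(exposure_level_difference(15.0))   # 3.0 EV when it is halved again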
  • Component elements of the present embodiment are denoted by reference numerals in the 300s. The same component elements as those of the first embodiment are denoted by reference numerals changed from the 100s to the 300s, and redundant description thereof is omitted. Further, a functional block diagram of the image capture apparatus 300 according to the third embodiment, which corresponds to FIG. 1C and differs from it mainly only in the reference numerals, is omitted; the description is given using the changed reference numerals. This also applies to the other embodiments described hereinafter.
  • the image capture apparatus 300 differs from the image capture apparatus 100 according to the first embodiment in that image capture sections 311a to 311d of a multi-eye image capture section 310 include photometry sections 315a to 315d, respectively, each of which acquires a photometric value from an image obtained through image capture.
  • the photometry sections 315a to 315d acquire photometric values which are average signal levels of images captured by the image capture sections 311a to 311d of the multi-eye image capture section 310, from pixel signals read from solid-state image capture devices 313a to 313d, respectively.
  • a controller 330 may set a range from which each of the photometry sections 315a to 315d calculates the photometric value (hereinafter referred to as an evaluation frame) only to part of the image, and may acquire a photometric value by performing weighted-averaging depending on an area of the image.
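  • A minimal sketch of such a photometric value computation is given below, covering both a rectangular evaluation frame and weighted averaging over image areas; the function name and the example frame bounds are assumptions made here.

        import numpy as np

        def photometric_value(image, frame=None, weights=None):
            """Average pixel signal level, optionally restricted to an evaluation frame.

            frame: (top, bottom, left, right) bounds of the evaluation frame; None uses
            the whole image.  weights: optional per-pixel weight map with the same shape
            as the evaluated region, for weighted averaging depending on image area.
            """
            region = image if frame is None else image[frame[0]:frame[1], frame[2]:frame[3]]
            region = region.astype(np.float64)
            if weights is None:
                return float(region.mean())
            w = np.asarray(weights, dtype=np.float64)
            return float((region * w).sum() / w.sum())

        # Example: evaluate only the central quarter of a 480 x 640 image.
        img = np.random.default_rng(1).integers(0, 256, size=(480, 640), dtype=np.uint8)
        print(photometric_value(img))                              # whole image
        print(photometric_value(img, frame=(120, 360, 160, 480)))  # central evaluation frame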
  • In the present embodiment, the exposure levels of the image capture sections 311a to 311d are controlled based not only on the image capture direction of the single-eye image capture section 320 but also on the photometric values acquired by the photometry sections 315a to 315d, respectively. This configuration is preferable because it makes it possible to further improve the quality of the wide-angle image 301.
  • the peak of the intensity of illumination from an illumination section 325 is included in the image capture range of the image capture section 311a, which overlaps with the image capture range of the single-eye image capture section 320.
  • the illumination intensity of the image capture range of each of the image capture sections 311a to 311d also varies with distribution of environment light other than illumination light from the illumination section 325.
  • Although the difference between the illumination intensity in the image capture range of the image capture section 311a and the illumination intensities in the image capture ranges of the image capture sections 311b to 311d is mainly determined by the illumination light from the illumination section 325, the difference, to be exact, also depends on the distribution of environment light. Therefore, in the image capture apparatus 300 according to the third embodiment, the photometric values of the image capture sections 311a to 311d are acquired, and the difference in exposure level between the image capture section 311a and the image capture sections 311b to 311d is adjusted so as to further reduce the luminance unevenness of the wide-angle image 301.
  • the image capture apparatus 300 controls the exposure level using both of the information on the image capture direction of the single-eye image capture section 320 and the information on the photometric values acquired by the photometry sections 315a to 315d, respectively. This makes it easier to adjust the exposure level than the control of the exposure level only using the photometric values.
  • the following description is given of the exposure level control.
  • FIG. 8 is a flowchart of a process for setting the exposure levels of the multi-eye image capture section 310.
  • the present process is performed by the controller 330 of the image capture apparatus 300, and the exposure levels of the image capture sections 311a to 311d are set according to the image capture direction of the single-eye image capture section 320.
  • a user instructs the image capture apparatus 300 to change the image capture direction of the single-eye image capture section 320 from the client apparatus.
  • This instruction for changing the image capture direction is transmitted to the image capture apparatus 300 via the network, and the controller 330 controls a drive mechanism 324 to change the image capture direction of the single-eye image capture section 320 (step S32).
  • Next, the exposure levels of the image capture sections 311a to 311d of the multi-eye image capture section 310 are temporarily determined so as to offset the luminance differences caused by the illumination intensity of light emitted from the illumination section 325 in the changed image capture direction of the single-eye image capture section 320 (step S33). That is, the exposure levels are temporarily determined such that the exposure level of any of the image capture sections 311a to 311d of the multi-eye image capture section 310 whose image capture range at least partially overlaps with the image capture range of the single-eye image capture section 320 is lower than the exposure level of any image capture section whose image capture range does not overlap with it.
  • Then, image capture is performed by the image capture sections 311a to 311d at the exposure levels determined in the step S33, and photometric values are acquired by the photometry sections 315a to 315d from the images acquired from the image capture sections 311a to 311d (step S34).
  • In a step S35, the exposure levels of the image capture sections 311a to 311d of the multi-eye image capture section 310 are adjusted using the photometric values acquired by the photometry sections 315a to 315d in the step S34. Then, image capture is performed again by the image capture sections 311a to 311d of the multi-eye image capture section 310, and the wide-angle image 301 is generated based on the images obtained through image capture.
  • Note that the exposure levels may be adjusted, as conventionally performed, by omitting the step S33, i.e. without temporarily determining the exposure levels of the image capture sections 311a to 311d of the multi-eye image capture section 310, based only on the photometric values acquired from the images obtained through the image capture performed in the step S34. In this case, however, a plurality of frames are needed, or a large luminance difference is generated between frames, before the exposure level stabilizes at the optimum exposure level.
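  • The flow of FIG. 8 might be sketched as follows; the Section class, its simulated photometric response, the 2 EV temporary reduction, and the target signal level are all assumptions introduced only to make the example self-contained and runnable.

        import math
        from dataclasses import dataclass

        @dataclass
        class Section:
            """Hypothetical stand-in for one image capture section of the multi-eye unit."""
            center_deg: float          # azimuth of the centre of its image capture range
            half_angle_deg: float      # half of its horizontal image capture angle
            exposure_ev: float = 10.0
            scene_level: float = 40.0  # simulated average signal at exposure_ev = 10, no illumination

            def overlaps(self, direction_deg, single_half_angle_deg=20.0):
                diff = abs((self.center_deg - direction_deg + 180.0) % 360.0 - 180.0)
                return diff < self.half_angle_deg + single_half_angle_deg

            def photometric_value(self, illuminated):
                # Simulated photometric value: an illuminated section sees extra light.
                boost = 4.0 if illuminated else 1.0
                return self.scene_level * boost * 2.0 ** (self.exposure_ev - 10.0)

        def set_multi_eye_exposure(sections, single_eye_direction, target_level=40.0):
            """Sketch of the flow of FIG. 8 (step numbers follow the description)."""
            # Step S33: temporarily lower the exposure levels of the sections whose image
            # capture ranges overlap the (illuminated) single-eye image capture range.
            for s in sections:
                if s.overlaps(single_eye_direction):
                    s.exposure_ev -= 2.0
            # Step S34: capture at the temporary levels and acquire photometric values.
            photometric = [s.photometric_value(s.overlaps(single_eye_direction)) for s in sections]
            # Step S35: refine each exposure level toward a common target signal level
            # (the image luminance doubles for every +1 EV).
            for s, p in zip(sections, photometric):
                s.exposure_ev += math.log2(target_level / p)
            return [round(s.exposure_ev, 2) for s in sections]

        # Step S32 is assumed to have oriented the single-eye section toward 90 degrees.
        sections = [Section(45, 50), Section(135, 50), Section(225, 50), Section(315, 50)]
        print(set_multi_eye_exposure(sections, single_eye_direction=90))   # -> [8.0, 8.0, 10.0, 10.0]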
  • According to the present embodiment, it is possible to reduce the luminance unevenness of the wide-angle image 301 more effectively than in a case where the exposure levels are adjusted based only on the image capture direction of the single-eye image capture section 320, and it is also possible to adjust the exposure levels more easily than in a case where they are adjusted using only the photometric values of the image capture sections 311a to 311d.
  • Further, the controller 330 may set the evaluation frame within which each of the photometry sections 315a to 315d acquires a photometric value, based on the image capture direction of the single-eye image capture section 320. More specifically, for any of the image capture sections 311a to 311d of the multi-eye image capture section 310 whose image capture range at least partially overlaps with the image capture range of the single-eye image capture section 320, it is preferable to set a central portion of the image as the evaluation frame. This makes it possible to more reliably capture the peak of the illumination intensity produced by the illumination section 325.
  • It is further preferable that the controller 330 sets a narrower area including the central portion of the acquired image as the evaluation frame, the closer the image capture range of an image capture section of the multi-eye image capture section 310 is to the image capture direction of the single-eye image capture section 320.
  • An image capture apparatus 400 according to the fourth embodiment differs from the image capture apparatus 100 according to the first embodiment in component elements of image capture sections 411a to 411d of a multi-eye image capture section 410. More specifically, the image capture sections 411a to 411d include focus control mechanisms 415a to 415d, respectively, each of which is capable of controlling a focal length. It is only required that the focus control mechanisms 415a to 415d are each configured to include a motor and a gear and change the focal length by moving the position of a focus lens of an associated one of image forming optical systems 412a to 412d in the optical axis direction.
  • Further, the single-eye image capture section 420 includes a photometry section 426 (environment light-measuring section) which measures the brightness of environment light as a photometric value, and the image capture mode is set to a day mode when the brightness measured by the photometry section 426 is not lower than a predetermined value, and to a night mode when the measured brightness is lower than the predetermined value.
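  • In code, this mode decision reduces to a simple threshold comparison, sketched below; the threshold value stands in for the "predetermined value", which is not specified in the description, and the function name is an assumption.

        def select_image_capture_mode(measured_brightness, predetermined_value):
            """Day/night mode selection from the brightness of environment light
            measured by the photometry section (environment light-measuring section)."""
            return "day mode" if measured_brightness >= predetermined_value else "night mode"

        print(select_image_capture_mode(120, predetermined_value=50))   # bright scene -> day mode
        print(select_image_capture_mode(10, predetermined_value=50))    # dark scene   -> night mode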
  • the image capture apparatus 400 controls the focal lengths of the image capture sections 411a to 411d of the multi-eye image capture section 410 based on the image capture direction of the single-eye image capture section 420. More specifically, the focal length of an image capture section of the image capture sections 411a to 411d of the multi-eye image capture section 410, which has an image capture range at least partially overlapping with the image capture range of the single-eye image capture section 420, is controlled to a different value from the focal length of an image capture section having an image capture range not overlapping with the image capture range of the single-eye image capture section 420. With this configuration, it is possible to further improve the image quality of a wide-angle image 401. The following description will be given of this control.
  • the image capture apparatus 400 is used as a surveillance monitor provided at an entrance of a store or a parking lot. In this case, it is preferable to continuously perform image capture while focusing on a specified image capture area (the entrance in this case). Further, in a case where the image capture apparatus 400 is used e.g. for monitoring an assembly process in a factory, or used as an on-vehicle camera for detecting obstacles, it is also preferable to continuously perform image capture while focusing on a specified image capture area.
  • The image forming optical system has chromatic aberration, and hence the focal length differs depending on the wavelength of light entering the image forming optical system; the larger the wavelength difference is, the larger the difference in focal length is. As a result, a focus lens position set under visible light no longer brings the subject into focus under the near-infrared illumination used at nighttime.
  • For this reason, the position of the focus lens of any of the image capture sections 411a to 411d of the multi-eye image capture section 410 whose image capture range at least partially overlaps with the image capture range of the single-eye image capture section 420 is set again at nighttime.
  • With this, the plurality of image capture sections 411a to 411d can perform image capture in a state in which the same subject is in focus. Further, this enables the image capture sections 411a to 411d of the multi-eye image capture section 410 to perform image capture with the same subject in focus regardless of whether it is daytime or nighttime, and hence is preferable.
  • a controller 430 acquires information on a table shown in FIG. 9 in advance, which shows a relationship between the position of the focus lens and the focal length of the image forming optical systems 412a to 412d, for each wavelength of transmitted light.
  • The controller 430 adjusts the position of the focus lens to a position where the subject is in focus, using focusing processing such as contrast AF or phase difference AF.
  • For example, in daytime, the position of the focus lens is adjusted to PVI_1, which is the focus lens position for the case where visible light enters, indicated in the table shown in FIG. 9.
  • At nighttime, the controller 430 starts to use the illumination section 425. After starting to use the illumination section 425, the controller 430 judges that, in any of the image capture sections 411a to 411d of the multi-eye image capture section 410 whose image capture range at least partially overlaps with the image capture range of the single-eye image capture section 420, the wavelength of light transmitted through its image forming optical system has changed.
  • Then, the position of the focus lens of the image capture section determined as described above is adjusted, based on the table shown in FIG. 9, to the focus lens position for the case where infrared light enters, which is associated with the focal length set as above (lens position adjustment unit). For example, in a case where the position of the focus lens in daytime is set to PVI_1 as in the example described above, the position of the focus lens of that image capture section is changed from PVI_1 to PIR_1.
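  • The table-based focus adjustment might look like the sketch below; only the two position labels PVI_1 and PIR_1 come from the description, while the table structure, the row key, and the function name are assumptions made for illustration.

        # Hypothetical stand-in for the table of FIG. 9: for each focal-length setting,
        # the focus lens positions that bring the same subject into focus under visible
        # light and under near-infrared light.  Only PVI_1 and PIR_1 appear in the text;
        # the row key is invented here.
        FOCUS_TABLE = {
            "focal_length_1": {"visible": "PVI_1", "infrared": "PIR_1"},
            # ... further rows for other focal lengths would go here.
        }

        def focus_lens_position(focal_length_key, illumination_in_use):
            """Focus lens position for the current illumination wavelength.

            When the near-infrared illumination section starts to be used, the wavelength
            of the transmitted light changes, so the lens is moved from the visible-light
            position to the infrared position of the same table row.
            """
            wavelength = "infrared" if illumination_in_use else "visible"
            return FOCUS_TABLE[focal_length_key][wavelength]

        print(focus_lens_position("focal_length_1", illumination_in_use=False))  # PVI_1 (daytime)
        print(focus_lens_position("focal_length_1", illumination_in_use=True))   # PIR_1 (night, IR light)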
  • An image capture apparatus 500 is capable of switching the image capture mode between a day mode in which an image is acquired only with visible light, and a night mode in which an image is acquired with both of visible light and near-infrared light.
  • the image capture mode is set to the day mode in a case where the brightness measured by a photometry section 526 is not lower than the predetermined value, and is set to the night mode in a case where the measured brightness is lower than the predetermined value.
  • a plurality of image capture sections 511a to 511d of a multi-eye image capture section 510 and an image capture section 521 of a single-eye image capture section 520 each include an infrared cut filter which selectively transmits visible light and selectively absorbs near-infrared light, and an insertion/removal mechanism for inserting and removing the infrared cut filter.
  • the visible light refers to light having a wavelength from 380 nm to 750 nm
  • the near-infrared light refers to light having a wavelength from 750 nm to 1100 nm.
  • the wavelength of illumination light irradiated from an illumination section 525 of the single-eye image capture section 520 corresponds to near-infrared light.
  • the image capture apparatus 500 has the day mode in which the infrared cut filters are inserted into the multi-eye image capture section 510 and the single-eye image capture section 520, respectively, and the night mode in which the infrared cut filters are removed from both of the multi-eye image capture section 510 and the single-eye image capture section 520.
  • When the image capture apparatus 500 is in the night mode, image capture is performed using the illumination section 525 of the single-eye image capture section 520.
  • In this state, in which the infrared cut filter of the single-eye image capture section 520 is removed and the illumination section 525 of the single-eye image capture section 520 is used, the exposure levels of the image capture sections 511a to 511d of the multi-eye image capture section 510 are controlled based on the image capture direction of the single-eye image capture section 520.
  • On the other hand, the illumination section 525 is not used in the day mode, and hence it is only required that a fixed exposure level be set irrespective of the image capture direction of the single-eye image capture section 520.
  • the image capture apparatus 500 may be configured such that in addition to the day mode and the night mode, there is provided a hybrid mode in which the infrared cut filter of one of the multi-eye image capture section 510 and the single-eye image capture section 520 is removed, and the infrared cut filter of the other is inserted.
  • As one example of the hybrid mode, the image capture mode may be set to a mode in which the infrared cut filters of the multi-eye image capture section 510 are inserted and the infrared cut filter of the single-eye image capture section 520 is removed. At this time, it is preferable to improve the SN ratio of the detailed image 502 acquired by the single-eye image capture section 520 by using the illumination section 525 of the single-eye image capture section 520.
  • Since the infrared cut filters are inserted into the multi-eye image capture section 510 in this mode, the near-infrared light irradiated by the illumination section 525 does not enter the solid-state image capture devices 513a to 513d of the multi-eye image capture section 510. Therefore, it is only required that a fixed exposure level be set for the image capture sections 511a to 511d of the multi-eye image capture section 510 irrespective of the image capture direction of the single-eye image capture section 520.
  • As the other example, the image capture mode may be set to a mode in which the infrared cut filters of the multi-eye image capture section 510 are removed and the infrared cut filter of the single-eye image capture section 520 is inserted. This is because the multi-eye image capture section 510 cannot acquire an image of sufficient quality only with visible light even at an illumination intensity that enables the single-eye image capture section 520 to acquire an image of sufficient quality only with visible light.
  • In this mode, it is more preferable to perform image capture using the illumination section 525 of the single-eye image capture section 520 because the SN ratio of the wide-angle image 501 acquired by the multi-eye image capture section 510 is improved. Further, by controlling the exposure levels of the image capture sections 511a to 511d of the multi-eye image capture section 510 based on the image capture direction of the single-eye image capture section 520, it is possible to reduce the luminance unevenness caused by the illumination distribution.
  • FIG. 10 shows whether or not to perform the above-described exposure level control based on the image capture direction of the single-eye image capture section 520 in the case where the mechanism for inserting and removing the infrared cut filter is included in each of the multi-eye image capture section 510 and the single-eye image capture section 520. That is, whether or not to perform the exposure level control in the image capture apparatus 500 is determined based on whether the infrared cut filter of each of the multi-eye image capture section 510 and the single-eye image capture section 520 is inserted or removed, whether or not the illumination section 525 is used, and the image capture direction of the single-eye image capture section 520, in each of the above-described modes.
  • the exposure level control is performed only in a case where the illumination section 525 is used and the infrared cut filters are removed from the multi-eye image capture section 510 (see the first code sketch after this list).
  • a solid-state image capture device having both pixels for visible light and pixels for near-infrared light may be used as the solid-state image capture devices 513a to 513d of the multi-eye image capture section 510, as the solid-state image capture device 523 of the single-eye image capture section 520, or as both. More specifically, a solid-state image capture device may be used in which the on-chip color filters of some of the pixels in the RGB Bayer array are replaced by color filters capable of transmitting only near-infrared light; such a device is referred to below as the RGBIR sensor. On the other hand, a solid-state image capture device having only pixels for visible light in the RGB Bayer array is referred to as the RGB sensor.
  • the multi-eye image capture section 510 further includes photometry sections 515a to 515d which measure brightness of the image capture ranges of the image capture sections 511a to 511d, respectively. Insertion/removal of the infrared cut filters into/from the multi-eye image capture section 510 is determined according to the brightness values measured by the photometry sections 515a to 515d.
  • in a case where the single-eye image capture section 520 has an RGB sensor as the solid-state image capture device 523, insertion/removal of the infrared cut filter into/from the single-eye image capture section 520 is determined according to the brightness measured by the photometry section 526.
  • FIGS. 11A to 11C show whether or not to perform the exposure level control for the image capture sections 511a to 511d of the multi-eye image capture section 510 based on the image capture direction of the single-eye image capture section 520, in a case where the RGBIR sensor is used in one or both of the multi-eye image capture section 510 and the single-eye image capture section 520.
  • if the infrared cut filter is inserted into an image capture section having the RGBIR sensor, infrared light is prevented from entering the pixels for near-infrared light. For this reason, no infrared cut filter is used in whichever of the multi-eye image capture section 510 and the single-eye image capture section 520 uses the RGBIR sensor. That is, in the case shown in FIG. 11A, where the RGBIR sensors are used only in the multi-eye image capture section 510, no infrared cut filters are used in the multi-eye image capture section 510. Further, in the case shown in FIG. 11B, where the RGBIR sensor is used only in the single-eye image capture section 520, no infrared cut filter is used in the single-eye image capture section 520.
  • in the case shown in FIG. 11A, the exposure level control is performed as indicated by the table shown in FIG. 11A.
  • a first row of settings in the table shown in FIG. 11A indicates the control in a case where the brightness measured by the photometry section 526 is not lower than a first predetermined value (the image capture range of the single-eye image capture section 520 is bright) and the brightness values measured by the photometry sections 515a to 515d are not lower than a second predetermined value (the image capture range of the multi-eye image capture section 510 is bright).
  • the infrared cut filter is inserted into the single-eye image capture section 520, and the illumination section 525 is not used.
  • a second row of settings in the table shown in FIG. 11A indicates the control in a case where the brightness measured by the photometry section 526 is not lower than the first predetermined value, but the brightness values measured by the photometry sections 515a to 515d are lower than the second predetermined value (the image capture range of the multi-eye image capture section 510 is dark).
  • the infrared cut filter is inserted into the single-eye image capture section 520, but the illumination section 525 is used to brighten the image capture range of the multi-eye image capture section 510.
  • a third row of settings in the table shown in FIG. 11A indicates the control in a case where the brightness measured by the photometry section 526 is lower than the first predetermined value and the brightness values measured by the photometry sections 515a to 515d are also lower than the second predetermined value.
  • the infrared cut filter is removed from the single-eye image capture section 520 and the illumination section 525 is used to brighten the image capture range of the single-eye image capture section 520.
  • in the control based on the second and third rows of settings, the exposure levels of the image capture sections 511a to 511d of the multi-eye image capture section 510 are adjusted to eliminate the influence of the use of the illumination section 525 (see the second code sketch after this list).
  • in the case shown in FIG. 11B, the exposure level control indicated by the table shown in FIG. 11B is performed.
  • a first row of settings in the table shown in FIG. 11B indicates the control in a case where the brightness measured by the photometry section 526 is not lower than the first predetermined value (the image capture range of the single-eye image capture section 520 is bright) and the brightness values measured by the photometry sections 515a to 515d are not lower than the second predetermined value (the image capture range of the multi-eye image capture section 510 is bright).
  • the infrared cut filters are inserted into the multi-eye image capture section 510 and the illumination section 525 is not used.
  • a second row of settings in the table shown in FIG. 11B indicates the control in a case where the brightness measured by the photometry section 526 is lower than the first predetermined value (the image capture range of the single-eye image capture section 520 is dark) but the brightness values measured by the photometry sections 515a to 515d are not lower than the second predetermined value (the image capture range of the multi-eye image capture section 510 is bright).
  • the infrared cut filters are inserted into the multi-eye image capture section 510 and the illumination section 525 is not used.
  • a third row of settings in the table shown in FIG. 11B indicates the control in a case where the brightness measured by the photometry section 526 is not lower than the first predetermined value (the image capture range of the single-eye image capture section 520 is bright) and the brightness values measured by the photometry sections 515a to 515d are lower than the second predetermined value (the image capture range of the multi-eye image capture section 510 is dark).
  • the infrared cut filter is removed from the multi-eye image capture section 510 and the illumination section 525 is not used.
  • a fourth row of settings in the table shown in FIG. 11B indicates the control in a case where the brightness measured by the photometry section 526 is lower than the first predetermined value (the image capture range of the single-eye image capture section 520 is dark) and the brightness values measured by the photometry sections 515a to 515d are lower than the second predetermined value (the image capture range of the multi-eye image capture section 510 is dark).
  • the infrared cut filters are removed from the multi-eye image capture section 510 and the illumination section 525 is used.
  • even when the illumination section 525 is used, near-infrared light emitted by the illumination section 525 does not enter the solid-state image capture devices 513a to 513d of the multi-eye image capture section 510 as long as the infrared cut filters are inserted into the multi-eye image capture section 510. Therefore, the exposure level adjustment is performed only in a state in which the illumination section 525 is used and the infrared cut filters are removed from the multi-eye image capture section 510 (the case indicated by the fourth row of settings in the table shown in FIG. 11B; see the third code sketch after this list).
  • in a case where the RGBIR sensors are used in both the multi-eye image capture section 510 and the single-eye image capture section 520, the exposure level control is performed based only on use/non-use of the illumination section 525, as indicated by the table shown in FIG. 11C.
  • Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a 'non-transitory computer-readable storage medium') to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s).
  • the computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions.
  • the computer executable instructions may be provided to the computer, for example, from a network or the storage medium.
  • the storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)TM), a flash memory device, a memory card, and the like.
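
The rule summarized in FIG. 10 can be illustrated with a short Python sketch. The sketch is not part of the original specification; the function and parameter names are assumptions introduced here for illustration. It encodes only what is described above: the direction-based exposure level control is needed when the illumination section 525 is used while the infrared cut filters of the multi-eye image capture section 510 are removed, because only then can near-infrared light reach the solid-state image capture devices 513a to 513d.

    def direction_based_control_needed(multi_eye_ir_cut_inserted: bool,
                                       illumination_525_used: bool) -> bool:
        """Return True when the exposure levels of the image capture sections
        511a to 511d should be adjusted according to the image capture
        direction of the single-eye image capture section 520."""
        # Near-infrared light from the illumination section 525 reaches the
        # solid-state image capture devices 513a to 513d only when the infrared
        # cut filters of the multi-eye image capture section 510 are removed,
        # so both conditions must hold.
        return illumination_525_used and not multi_eye_ir_cut_inserted

Under this sketch the day mode (filters inserted, illumination unused) yields False, while the night mode with the illumination section 525 active yields True, matching the description above.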
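
For the case of FIG. 11A (RGBIR sensors only in the multi-eye image capture section 510, which therefore carries no infrared cut filters), the three rows of settings described above can be written as a small decision function. This is an illustrative sketch only: first_threshold and second_threshold stand for the first and second predetermined values, the function and field names are assumptions, and the remaining brightness combination (single-eye range dark, multi-eye range bright) is not described in the excerpt and is handled like the third row purely as an assumption.

    from dataclasses import dataclass

    @dataclass
    class Fig11aSettings:
        single_eye_ir_cut_inserted: bool    # infrared cut filter state of section 520
        illumination_525_used: bool         # near-infrared illumination on/off
        adjust_exposure_by_direction: bool  # adjust sections 511a-511d by direction of 520

    def fig11a_control(single_eye_brightness: float, multi_eye_brightness: float,
                       first_threshold: float, second_threshold: float) -> Fig11aSettings:
        """Decision table of FIG. 11A: single_eye_brightness is the value measured by
        the photometry section 526, multi_eye_brightness a representative value from
        the photometry sections 515a to 515d."""
        single_bright = single_eye_brightness >= first_threshold
        multi_bright = multi_eye_brightness >= second_threshold
        if single_bright and multi_bright:
            # First row: both ranges are bright; keep the filter inserted, no illumination.
            return Fig11aSettings(True, False, False)
        if single_bright:
            # Second row: only the multi-eye range is dark; keep the filter inserted and
            # use the illumination section 525 to brighten the multi-eye range.
            return Fig11aSettings(True, True, True)
        # Third row: the single-eye range is dark; remove the filter of section 520 and
        # use the illumination section 525 (the remaining case is treated the same way
        # here as an assumption).
        return Fig11aSettings(False, True, True)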
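
The four rows of settings described for FIG. 11B (RGBIR sensor only in the single-eye image capture section 520, which therefore carries no infrared cut filter) and the simpler rule of FIG. 11C (RGBIR sensors in both sections) can be sketched in the same style. Again, the names and thresholds are assumptions, not the patented implementation.

    from dataclasses import dataclass

    @dataclass
    class Fig11bSettings:
        multi_eye_ir_cut_inserted: bool     # infrared cut filter state of section 510
        illumination_525_used: bool
        adjust_exposure_by_direction: bool

    def fig11b_control(single_eye_brightness: float, multi_eye_brightness: float,
                       first_threshold: float, second_threshold: float) -> Fig11bSettings:
        """Decision table of FIG. 11B."""
        single_bright = single_eye_brightness >= first_threshold
        multi_bright = multi_eye_brightness >= second_threshold
        if multi_bright:
            # First and second rows: the multi-eye range is bright, so the filters of
            # section 510 stay inserted and the illumination section 525 is not used,
            # regardless of the brightness of the single-eye range.
            return Fig11bSettings(True, False, False)
        if single_bright:
            # Third row: only the multi-eye range is dark; remove the filters of
            # section 510 but do not use the illumination section 525.
            return Fig11bSettings(False, False, False)
        # Fourth row: both ranges are dark; remove the filters, use the illumination
        # section 525, and perform the direction-based exposure adjustment.
        return Fig11bSettings(False, True, True)

    def fig11c_adjust_exposure(illumination_525_used: bool) -> bool:
        """FIG. 11C: with RGBIR sensors (and hence no filters) in both sections, the
        adjustment depends only on whether the illumination section 525 is used."""
        return illumination_525_used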
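
Finally, the direction-based exposure adjustment itself is only summarized in the text above (the exposure levels of the image capture sections 511a to 511d are adjusted according to the image capture direction of the single-eye image capture section 520 so as to reduce luminance unevenness caused by the illumination distribution). The following is a minimal sketch under an assumed rule: sections whose viewing directions are close to the illuminated direction are exposed less. The pan directions, field of view, correction magnitude, and all names are hypothetical.

    # Hypothetical pan directions (degrees) of the image capture sections 511a to 511d.
    MULTI_EYE_DIRECTIONS_DEG = {"511a": 0.0, "511b": 90.0, "511c": 180.0, "511d": 270.0}

    def exposure_corrections(single_eye_direction_deg: float, illumination_used: bool,
                             max_correction_ev: float = 1.0,
                             field_of_view_deg: float = 100.0) -> dict:
        """Return an exposure correction in EV (negative = expose less) for each of
        the image capture sections 511a to 511d, based on the image capture
        direction of the single-eye image capture section 520."""
        corrections = {}
        for name, direction in MULTI_EYE_DIRECTIONS_DEG.items():
            if not illumination_used:
                corrections[name] = 0.0
                continue
            # Smallest angular difference between this section's direction and the
            # direction illuminated by the illumination section 525.
            diff = abs((direction - single_eye_direction_deg + 180.0) % 360.0 - 180.0)
            # 1.0 when pointing the same way, 0.0 outside the assumed field of view.
            overlap = max(0.0, 1.0 - diff / field_of_view_deg)
            corrections[name] = -max_correction_ev * overlap
        return corrections

For example, exposure_corrections(45.0, True) lowers the exposure of the sections 511a and 511b, whose assumed fields of view overlap the illuminated direction, and leaves 511c and 511d unchanged.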

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • Human Computer Interaction (AREA)
  • Stereoscopic And Panoramic Photography (AREA)
  • Exposure Control For Cameras (AREA)
  • Focusing (AREA)
  • Stroboscope Apparatuses (AREA)
  • Cameras In General (AREA)
  • Blocking Light For Cameras (AREA)
  • Camera Bodies And Camera Details Or Accessories (AREA)
  • Automatic Focus Adjustment (AREA)
EP19193874.5A 2018-08-30 2019-08-27 Image capture apparatus having illumination section, monitoring system including image capture apparatus, method of controlling image capture apparatus, and storage medium Withdrawn EP3617795A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2018161685A JP7191597B2 (ja) 2018-08-30 2018-08-30 Image capture apparatus, monitoring system including the same, control method, and program

Publications (1)

Publication Number Publication Date
EP3617795A1 true EP3617795A1 (en) 2020-03-04

Family

ID=67809233

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19193874.5A Withdrawn EP3617795A1 (en) 2018-08-30 2019-08-27 Image capture apparatus having illumination section, monitoring system including image capture apparatus, method of controlling image capture apparatus, and storage medium

Country Status (4)

Country Link
US (1) US10887527B2 (ja)
EP (1) EP3617795A1 (ja)
JP (1) JP7191597B2 (ja)
CN (1) CN110876023A (ja)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108377345B (zh) * 2018-04-11 2020-04-03 Zhejiang Dahua Technology Co., Ltd. Exposure parameter value determination method and apparatus, multi-lens camera, and storage medium
CN116034586A (zh) * 2020-05-08 2023-04-28 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Exposure level control method and apparatus, and computer-usable medium storing software for implementing the method
KR102458470B1 (ko) * 2020-05-27 2022-10-25 Beijing Xiaomi Mobile Software Co., Ltd., Nanjing Branch Image processing method and apparatus, camera component, electronic device, and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5657073A (en) * 1995-06-01 1997-08-12 Panoramic Viewing Systems, Inc. Seamless multi-camera panoramic imaging with distortion correction and selectable field of view
US6778207B1 (en) * 2000-08-07 2004-08-17 Koninklijke Philips Electronics N.V. Fast digital pan tilt zoom video
US20040212677A1 (en) * 2003-04-25 2004-10-28 Uebbing John J. Motion detecting camera system
US20090147071A1 (en) * 2007-11-16 2009-06-11 Tenebraex Corporation Systems and methods of creating a virtual window
JP2013041282A (ja) 2011-08-17 2013-02-28 Lg Innotek Co Ltd Network camera and illumination control method therefor

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11205648A (ja) * 1998-01-09 1999-07-30 Olympus Optical Co Ltd Image synthesizing apparatus
JP2001245205A (ja) * 2000-02-29 2001-09-07 Matsushita Electric Ind Co Ltd Imaging apparatus
JP2009017474A (ja) * 2007-07-09 2009-01-22 Fujitsu Ten Ltd Image processing apparatus and image processing method
JP5687289B2 (ja) * 2010-02-01 2015-03-18 Youngkook Electronics, Co., Ltd. Camera apparatus for tracking and monitoring and remote monitoring system employing the same
JP6230265B2 (ja) * 2013-05-17 2017-11-15 Canon Inc Imaging apparatus
JP2015195499A (ja) * 2014-03-31 2015-11-05 Canon Inc Imaging apparatus, control method therefor, and control program
JP6471953B2 (ja) * 2014-05-23 2019-02-20 Panasonic IP Management Co., Ltd. Imaging apparatus, imaging system, and imaging method
JP6388115B2 (ja) * 2014-09-12 2018-09-12 Casio Computer Co., Ltd. Imaging apparatus, imaging control method, and program
JP6361931B2 (ja) * 2015-04-23 2018-07-25 Panasonic IP Management Co., Ltd. Image processing apparatus, imaging system including the same, and image processing method
JP6064151B1 (ja) * 2015-06-12 2017-01-25 Panasonic IP Management Co., Ltd. Illumination apparatus, imaging system, and illumination method
JP2019080226A (ja) 2017-10-26 2019-05-23 Canon Inc Imaging apparatus, method of controlling imaging apparatus, and program
JP7043219B2 (ja) * 2017-10-26 2022-03-29 Canon Inc Imaging apparatus, method of controlling imaging apparatus, and program
JP7025223B2 (ja) * 2018-01-18 2022-02-24 Canon Inc Imaging apparatus, control method therefor, program, and storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5657073A (en) * 1995-06-01 1997-08-12 Panoramic Viewing Systems, Inc. Seamless multi-camera panoramic imaging with distortion correction and selectable field of view
US6778207B1 (en) * 2000-08-07 2004-08-17 Koninklijke Philips Electronics N.V. Fast digital pan tilt zoom video
US20040212677A1 (en) * 2003-04-25 2004-10-28 Uebbing John J. Motion detecting camera system
US20090147071A1 (en) * 2007-11-16 2009-06-11 Tenebraex Corporation Systems and methods of creating a virtual window
JP2013041282A (ja) 2011-08-17 2013-02-28 Lg Innotek Co Ltd Network camera and illumination control method therefor

Also Published As

Publication number Publication date
JP7191597B2 (ja) 2022-12-19
CN110876023A (zh) 2020-03-10
JP2020034758A (ja) 2020-03-05
US20200077007A1 (en) 2020-03-05
US10887527B2 (en) 2021-01-05

Similar Documents

Publication Publication Date Title
JP7271132B2 (ja) Image capture apparatus and monitoring system
US8218069B2 (en) Camera body with which various flash devices can be interchangeably used
US10419685B2 (en) Image processing apparatus, image processing method, and computer-readable recording medium
US9854178B2 (en) Image pickup apparatus with flicker detection and having plurality of unit pixel areas, control method therefor, and storage medium
EP3617795A1 (en) Image capture apparatus having illumination section, monitoring system including image capture apparatus, method of controlling image capture apparatus, and storage medium
US10416026B2 (en) Image processing apparatus for correcting pixel value based on difference between spectral sensitivity characteristic of pixel of interest and reference spectral sensitivity, image processing method, and computer-readable recording medium
US9875423B2 (en) Image pickup apparatus that calculates light amount change characteristic, electronic apparatus, and method of calculating light amount change characteristic
EP2710340B1 (en) Camera arrangement for a vehicle and method for calibrating a camera and for operating a camera arrangement
US8497919B2 (en) Imaging apparatus and control method thereof for controlling a display of an image and an imaging condition
US9921454B2 (en) Imaging apparatus capable of generating an image with a weight of a signal corresponding to each type of spectral sensitivity characteristics and method of controlling the apparatus
US10999523B2 (en) Image pickup apparatus, method for controlling image pickup apparatus, and storage medium for controlling flash photography when a still image is imaged
US10630883B2 (en) Image capturing apparatus and focus detection method
US10212344B2 (en) Image capturing device and control method capable of adjusting exposure timing based on detected light quantity change characteristic
JP7446804B2 (ja) Imaging apparatus, control method therefor, and program
JP2011232615A (ja) Imaging apparatus
JP5436139B2 (ja) Imaging apparatus
JP2020048140A (ja) Photographing apparatus
JP2020072392A (ja) Imaging apparatus, method of controlling imaging apparatus, and program
JP7171265B2 (ja) Imaging apparatus and control method therefor
US10873707B2 (en) Image pickup apparatus and method, for ensuring correct color temperature based on first or second preliminary light emission of a flash device
JP2017044816A (ja) Imaging apparatus, control method therefor, and program
JP2003174581A (ja) Imaging apparatus
JP2016142776A (ja) Imaging apparatus
JP2010183191A (ja) Exposure amount control apparatus and exposure amount control method
JP2003215436A (ja) Camera

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20200904

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

18W Application withdrawn

Effective date: 20210215