WO2023181558A1 - Imaging device - Google Patents

Imaging device Download PDF

Info

Publication number
WO2023181558A1
WO2023181558A1 · PCT/JP2022/047594 · JP2022047594W
Authority
WO
WIPO (PCT)
Prior art keywords
light
image
image sensor
infrared light
visible light
Prior art date
Application number
PCT/JP2022/047594
Other languages
French (fr)
Japanese (ja)
Inventor
拓洋 澁谷
Original Assignee
株式会社日立国際電気
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立国際電気 filed Critical 株式会社日立国際電気
Priority to JP2024509768A priority Critical patent/JPWO2023181558A1/ja
Publication of WO2023181558A1 publication Critical patent/WO2023181558A1/en

Links

Images

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B11/00Filters or other obturators specially adapted for photographic purposes
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B7/00Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
    • G03B7/08Control effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
    • G03B7/091Digital circuits
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/75Circuitry for compensating brightness variation in the scene by influencing optical camera components

Definitions

  • the present invention relates to an imaging device, and particularly to an imaging device that expands the apparent depth of field.
  • There are several means of raising the brightness level: [1] open the aperture to increase the amount of light received; [2] lengthen the time of one frame to increase the exposure time; [3] irradiate the subject with light; [4] apply gain correction to the video signal.
  • Patent Document 1 discloses a technique that determines whether it is daytime, nighttime, early morning, or evening based on the brightness of the image captured by the camera and the brightness of a reference image of the road surface, and controls the camera's aperture according to the result of that determination.
  • an object of the present invention to provide an imaging device that can obtain an image with a deep apparent depth of field even when the aperture is opened.
  • one of the typical imaging devices of the present invention includes an image sensor unit capable of separately receiving visible light and near-infrared light from the incident light and outputting a visible light signal based on the visible light and a near-infrared light signal based on the near-infrared light, and a video signal processing unit that extracts a contour component from the image generated by the near-infrared light signal and superimposes the extracted contour component on the image generated by the visible light signal.
  • FIG. 1 is a block diagram of a computer system for implementing aspects according to embodiments of the present disclosure.
  • FIG. 2 is a block diagram showing an example of the configuration of an imaging device according to the present invention.
  • FIG. 3 is a diagram showing a first example of the image sensor section in the image pickup apparatus of the present invention.
  • FIG. 4 is a diagram showing a second example of the image sensor section in the image pickup apparatus of the present invention.
  • FIG. 5 is a diagram showing a third example of the image sensor section in the image pickup apparatus of the present invention.
  • FIG. 6 is a diagram showing a fourth example of the image sensor section in the image pickup apparatus of the present invention.
  • FIG. 7 is a diagram showing a fifth example of the image sensor section in the image pickup apparatus of the present invention.
  • FIG. 8 is a diagram showing an example of a flowchart of the imaging apparatus of the present invention.
  • FIG. 9 is a diagram illustrating a specific example of superimposing contour components from a visible light image and a near-infrared light image in the imaging apparatus of the present invention.
  • FIG. 1 is a block diagram of a computer system 300 for implementing aspects according to embodiments of the present disclosure.
  • the mechanisms and apparatus of the various embodiments disclosed herein may be applied to any suitable computing system.
  • the main components of computer system 300 include one or more processors 302, memory 304, terminal interface 312, storage interface 314, I/O (input/output) device interface 316, and network interface 318. These components may be interconnected via memory bus 306, I/O bus 308, bus interface unit 309, and I/O bus interface unit 310.
  • Computer system 300 may include one or more processing devices 302A and 302B, collectively referred to as processor 302. Each processor 302 executes instructions stored in memory 304 and may include an onboard cache. In some embodiments, computer system 300 may include multiple processors, and in other embodiments, computer system 300 may be a single processing unit system. A CPU (Central Processing Unit), FPGA (Field-Programmable Gate Array), GPU (Graphics Processing Unit), DSP (Digital Signal Processor), etc. can be applied as the processing device.
  • memory 304 may include random access semiconductor memory, storage devices, or storage media (either volatile or nonvolatile) for storing data and programs.
  • memory 304 represents the entire virtual memory of computer system 300 and may include the virtual memory of other computer systems connected to computer system 300 via a network.
  • memory 304 may be conceptually considered a single entity, but in other embodiments this memory 304 may be a more complex arrangement, such as a hierarchy of caches and other memory devices.
  • memory may exist as multiple levels of caches, and these caches may be divided by function. As a result, one cache may hold instructions while the other cache holds non-instruction data used by the processor.
  • Memory may be distributed and associated with various different processing units, such as in a so-called NUMA (Non-Uniform Memory Access) computer architecture.
  • Memory 304 may store all or a portion of the programs, modules, and data structures that perform the functions described herein.
  • memory 304 may store latent factor identification application 350.
  • latent factor identification application 350 may include instructions or descriptions that perform the functions described below on processor 302, or may include instructions or descriptions that are interpreted by other instructions or descriptions.
  • latent factor identification application 350 may be implemented in hardware via semiconductor devices, chips, logic gates, circuits, circuit cards, and/or other physical hardware devices instead of, or in addition to, a processor-based system.
  • latent factor identification application 350 may include data other than instructions or descriptions.
  • cameras, sensors, or other data input devices may be provided to communicate directly with bus interface unit 309, processor 302, or other hardware of computer system 300. Such a configuration may reduce the need for processor 302 to access memory 304 and the latent factor identification application.
  • Computer system 300 may include a bus interface unit 309 that provides communication between processor 302 , memory 304 , display system 324 , and I/O bus interface unit 310 .
  • I/O bus interface unit 310 may be coupled to I/O bus 308 for transferring data to and from various I/O units.
  • I/O bus interface unit 310 may communicate, via I/O bus 308, with a plurality of I/O interface units 312, 314, 316, and 318, also known as I/O processors (IOPs) or I/O adapters (IOAs).
  • Display system 324 may include a display controller, display memory, or both. A display controller may provide video, audio, or both data to display device 326.
  • Computer system 300 may also include devices, such as one or more sensors, configured to collect data and provide the data to processor 302.
  • computer system 300 may include environmental sensors that collect humidity data, temperature data, pressure data, etc., motion sensors that collect acceleration data, motion data, etc., and the like. Other types of sensors can also be used.
  • the display memory may be a dedicated memory for buffering video data.
  • Display system 324 may be connected to a display device 326, such as a standalone display screen, a television, a tablet, or a handheld device.
  • display device 326 may include speakers to render audio.
  • a speaker for rendering audio may be connected to the I/O interface unit.
  • the functionality provided by display system 324 may be implemented by an integrated circuit that includes processor 302.
  • bus interface unit 309 may be implemented by an integrated circuit that includes processor 302.
  • the I/O interface unit has the ability to communicate with various storage or I/O devices.
  • the terminal interface unit 312 allows the attachment of user I/O devices 320, such as user output devices (a video display device, a speaker television, etc.) and user input devices (a keyboard, mouse, keypad, touch pad, trackball, button, light pen, or other pointing device).
  • using the user interface, a user may operate a user input device to input data and instructions to user I/O device 320 and computer system 300, and to receive output data from computer system 300.
  • the user interface may be displayed on a display device, played through a speaker, or printed through a printer, for example, via the user I/O device 320.
  • Storage interface 314 allows the attachment of one or more disk drives or direct access storage devices 322 (typically magnetic disk drive storage devices, although this may also be an array of disk drives or other storage devices configured to appear as a single disk drive). In some embodiments, storage device 322 may be implemented as any secondary storage device. The contents of memory 304 may be stored in storage device 322 and read from storage device 322 as needed.
  • Network interface 318 may provide a communication pathway so that computer system 300 and other devices can communicate with each other. This communication path may be, for example, network 330.
  • the computer system 300 shown in FIG. 1 includes a bus structure that provides a direct communication path between processor 302, memory 304, bus interface 309, display system 324, and I/O bus interface unit 310; in other embodiments, computer system 300 may include point-to-point links in hierarchical, star, or web configurations, multiple hierarchical buses, or parallel and redundant communication paths.
  • although I/O bus interface unit 310 and I/O bus 308 are shown as a single unit, computer system 300 may actually include multiple I/O bus interface units 310 or multiple I/O buses 308.
  • although multiple I/O interface units are shown to separate I/O bus 308 from the various communication paths leading to the various I/O devices, in other embodiments some or all of the I/O devices may be directly connected to a single system I/O bus.
  • computer system 300 may be a device that receives requests from other computer systems (clients) without a direct user interface, such as a multi-user mainframe computer system, a single-user system, or a server computer. In other embodiments, computer system 300 may be a desktop computer, portable computer, laptop, tablet computer, pocket computer, telephone, smartphone, or any other suitable electronic device.
  • FIG. 2 is a block diagram showing an example of the configuration of an imaging device according to the present invention.
  • the imaging device 1 includes a lens section 2, an image sensor section 3, a video signal processing section 4, and a control section 5.
  • the imaging device 1 captures video at, for example, 3 frames per second (3 fps) or more, and performs processing to be described later.
  • Incident light from a subject is imaged by the lens section 2, and photoelectrically converted into an electrical signal by the image sensor section 3.
  • the lens section 2 includes concave and convex lenses 6 that adjust zoom and focus, an aperture 7 that adjusts the amount of light, and a near-infrared light cut filter 8. These are provided in the following order from the incident light side: the concave and convex lenses 6, the aperture 7, and the near-infrared light cut filter 8. Various configurations of the concave and convex lenses 6 can be applied depending on the purpose.
  • the aperture 7 adjusts the amount of light entering from the concave-convex lens 6 by changing the size of the central hole using a structure such as a plurality of blades. Opening the aperture increases the amount of light, allowing you to capture good images even in dark surroundings. However, in this case, the depth of field becomes shallow, and objects at a certain distance are in focus, while objects at other distances tend to be blurred. This makes it difficult to identify objects at a distance other than a specific distance. On the other hand, if the surroundings are bright, there will be enough light even when the aperture is closed, and the depth of field will be deep. In this case, it is easy to focus over a wide range of different distances, and even at distances that are out of focus, the degree of blur is small.
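As a back-of-the-envelope illustration of this trade-off (standard geometrical optics, not taken from the patent text): for a subject at distance $s$ well inside the hyperfocal distance, the total depth of field is approximately

$$\mathrm{DoF} \approx \frac{2\,N\,c\,s^{2}}{f^{2}},$$

where $N$ is the f-number set by the aperture 7, $c$ is the acceptable circle of confusion, and $f$ is the focal length. Opening the aperture lowers $N$, and the depth of field shrinks in proportion.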
  • the near-infrared light cut filter 8 is a filter that cuts only near-infrared light and allows visible light to pass through.
  • near-infrared light is light with a wavelength of approximately 800 to 2,500 nanometers.
  • visible light is light with a wavelength of about 380 to 770 nanometers.
  • the near-infrared light cut filter 8 is located adjacent to the diaphragm 7 and has an opening only at its center, where there is no filter and light passes straight through.
  • the size of this central filterless opening can be set small; for example, it may be made smaller than the opening of the diaphragm 7 at its most closed position.
  • the aperture may be smaller than the aperture of the diaphragm 7 at the time of the brightness level threshold value in step S101 in FIG. 8, which will be described later. As a result, it is possible to set a deep depth of field for near-infrared light.
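To make the preceding point concrete (an illustrative reading; the symbols are not from the patent): if the central opening of the near-infrared light cut filter 8 has diameter $d$ and the lens has focal length $f$, the near-infrared path sees an effective f-number of

$$N_{\mathrm{IR}} = \frac{f}{d},$$

which is fixed and large because $d$ is fixed and small. However far the diaphragm 7 is opened for visible light, the near-infrared image therefore behaves as if it were always stopped down, which is what keeps its depth of field deep.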
  • the image sensor section 3 converts the incident light that has passed through the lens section 2, the aperture 7, and the near-infrared cut filter 8 into an electrical signal, and transmits the signal to the video signal processing section 4.
  • Examples of the image sensor here include a CCD (Charge-Coupled Device) image sensor, a CMOS (Complementary Metal Oxide Semiconductor) image sensor, and the like.
  • the image sensor section 3 is configured to be able to separately receive visible light and near-infrared light, and the detailed configuration will be described later with reference to FIGS. 3 to 7.
  • the video signal processing unit 4 performs various video signal processing such as gain correction, gamma correction, knee correction, contour correction, and color correction on the signal input from the image sensor unit 3.
  • the video signal processing section 4 includes a contour extraction section 9.
  • the contour extraction section 9 performs contour correction processing on the near-infrared light signal input from the image sensor section 3 and outputs it as a contour signal. Then, the above contour signal is added to the visible light signal inputted from the image sensor section 3 to create a video signal.
  • An HD-SDI (High Definition Serial Digital Interface) signal is generated from this video signal and output to the outside. Note that the video signal output from the video signal processing unit 4 is not limited to the HD-SDI described above, and other types of video signals such as compressed and encrypted video signals may be applied.
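As a minimal sketch of this signal path (the function name, the use of OpenCV, the Laplacian edge operator, and all parameter values are assumptions for illustration, not taken from the patent):

```python
import cv2

def superimpose_contours(visible_bgr, nir_gray, gain=2.0, edge_weight=0.8):
    """Add contour components extracted from a near-infrared frame to a
    visible-light frame, as in the contour extraction section 9."""
    # Gain-correct the dark near-infrared image ...
    nir = cv2.convertScaleAbs(nir_gray, alpha=gain)
    # ... and low-pass filter it to suppress the noise amplified by the gain.
    nir = cv2.GaussianBlur(nir, (3, 3), 0)
    # Extract contour components (here with a Laplacian edge detector).
    edges = cv2.convertScaleAbs(cv2.Laplacian(nir, cv2.CV_16S, ksize=3))
    # Superimpose the contour signal on the visible-light image.
    edges_bgr = cv2.cvtColor(edges, cv2.COLOR_GRAY2BGR)
    return cv2.addWeighted(visible_bgr, 1.0, edges_bgr, edge_weight, 0)
```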
  • the control unit 5 controls the lens unit 2, the image sensor unit 3, and the video signal processing unit 4 of the imaging device 1.
  • the control unit 5 can be configured by, for example, a CPU.
  • the computer system 300 shown in FIG. 1 can be applied to the video signal processing section 4 and the control section 5.
  • FIG. 3 is a diagram showing a first example of the image sensor section in the image pickup device of the present invention.
  • the image sensor section 3 shown in FIG. 3 includes an image sensor 31 having a filter 31a.
  • the filter 31a is a filter installed on the incident light side, in which a plurality of visible light filters "Vis" and a plurality of near-infrared light filters "IR" are arranged alternately in a mosaic pattern (checkerboard pattern).
  • the visible light filter “Vis” is a filter that transmits only visible light.
  • the near-infrared light filter "IR" is a filter that transmits only near-infrared light. Therefore, the image sensor section 3 in FIG. 3 is a monochrome single-plate type and serves as an image sensor for monochrome images.
  • the incident light passes through the filter 31a and reaches the image sensor 31; at this time, the visible light filter "Vis" and the near-infrared light filter "IR" separate visible light and near-infrared light so that each light receiving element receives one or the other. The results are output to the video signal processing unit 4 as a visible light signal and a near-infrared light signal.
  • although FIG. 3 shows an example with 16 filters, the arrangement is not limited to this and can be adapted to the number of light receiving elements.
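A minimal sketch of separating such a checkerboard mosaic into visible and near-infrared planes (the checkerboard phase and the 4-neighbor interpolation are assumptions for illustration):

```python
import numpy as np
from scipy.ndimage import convolve

def split_vis_ir_mosaic(raw):
    """Split an H x W checkerboard Vis/IR readout into two full planes,
    assuming "Vis" sits where (row + col) is even."""
    h, w = raw.shape
    yy, xx = np.mgrid[0:h, 0:w]
    vis_mask = (yy + xx) % 2 == 0
    vis = np.where(vis_mask, raw, 0).astype(np.float32)
    ir = np.where(~vis_mask, raw, 0).astype(np.float32)
    # On a checkerboard, every missing pixel is surrounded by measured
    # ones, so a 4-neighbor average fills each plane (a crude demosaic).
    k = np.array([[0, .25, 0], [.25, 0, .25], [0, .25, 0]], np.float32)
    vis = np.where(vis_mask, vis, convolve(vis, k, mode="nearest"))
    ir = np.where(vis_mask, convolve(ir, k, mode="nearest"), ir)
    return vis, ir
```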
  • FIG. 4 is a diagram showing a second example of the image sensor section in the imaging device of the present invention.
  • the image sensor section 3 shown in FIG. 4 includes a prism 12, an image sensor 32, and an image sensor 33.
  • the prism 12 is a prism that separates incident light into visible light and near-infrared light.
  • the image sensor 32 is an image sensor that receives visible light separated by the prism 12.
  • the image sensor 33 is an image sensor that receives near-infrared light separated by the prism 12.
  • the image sensor section 3 in FIG. 4 is a monochrome two-plate type using two monochrome image sensors that each receive visible light and near-infrared light. The incident light passes through the prism 12 and is separated into visible light and near-infrared light.
  • the image sensor 32 receives only visible light and outputs it to the video signal processing section 4 as a visible light signal.
  • the image sensor 33 receives only near-infrared light and outputs it to the video signal processing section 4 as a near-infrared light signal.
  • the video signal processing unit 4 generates a monochrome image based on the received visible light signal. Accordingly, neither image sensor has a mosaic filter like the one shown in FIG. 3.
  • FIG. 5 is a diagram showing a third example of the image sensor section in the imaging device of the present invention.
  • the image sensor section 3 shown in FIG. 5 includes an image sensor 34 having a filter 34a.
  • the filter 34a is a filter installed on the incident light side, in which a plurality of red light filters "R", a plurality of green light filters "G", a plurality of blue light filters "B", and a plurality of near-infrared light filters "IR" are arranged alternately in a mosaic pattern.
  • the mosaic repeats a unit of four filters: a green light filter "G" in the upper left, a red light filter "R" in the upper right, a blue light filter "B" in the lower left, and a near-infrared light filter "IR" in the lower right.
  • Green light filter “G” is a filter that passes only green light.
  • the red light filter “R” is a filter that passes only red light.
  • Blue light filter “B” is a filter that passes only blue light.
  • the near-infrared light filter “IR” is a filter that transmits only near-infrared light.
  • the image sensor section 3 in FIG. 5 is of a color single-plate type, and serves as an image sensor for color images.
  • the incident light passes through the filter 34a and reaches the image sensor 34; at this time, of the visible light, the light that has passed through the red light filter "R" is received by the image sensor 34 as red light, the light that has passed through the green light filter "G" as green light, and the light that has passed through the blue light filter "B" as blue light. Signals based on these received lights are output to the video signal processing section 4 as visible light signals. Further, the light that has passed through the near-infrared light filter "IR" is received by the image sensor 34 as near-infrared light, and a signal based on this received light is output to the video signal processing section 4 as a near-infrared light signal. This makes it possible to receive visible light and near-infrared light separately for each light receiving element.
  • although FIG. 5 shows an example with 16 filters, the arrangement is not limited to this and can be adapted to the number of light receiving elements.
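A corresponding sketch for this color mosaic (the 2x2 phase below is an assumption for illustration; the actual layout is defined by filter 34a):

```python
import numpy as np

def split_rgb_ir(raw):
    """Split a 2x2 RGB-IR mosaic into quarter-resolution R, G, B, IR
    planes, assuming G upper-left, R upper-right, B lower-left, and
    IR lower-right within each 2x2 cell."""
    g = raw[0::2, 0::2]   # upper-left samples
    r = raw[0::2, 1::2]   # upper-right samples
    b = raw[1::2, 0::2]   # lower-left samples
    ir = raw[1::2, 1::2]  # lower-right samples
    return r, g, b, ir
```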
  • FIG. 6 is a diagram showing a fourth example of the image sensor section in the imaging device of the present invention.
  • the image sensor section 3 shown in FIG. 6 includes a prism 12, an image sensor 35, and an image sensor 36.
  • the image sensor 35 has a filter 35a.
  • the prism 12 is a prism that separates incident light into visible light and near-infrared light, and is similar to the prism 12 in FIG. 4.
  • the filter 35a is a filter in which a plurality of red light filters "R", a plurality of green light filters "G”, and a plurality of blue light filters "B" are alternately arranged in a mosaic pattern.
  • FIG. 6 shows a Bayer array filter 35a that receives visible light.
  • the image sensor 35 is an image sensor that receives visible light separated by the prism 12.
  • the image sensor 36 is an image sensor that receives near-infrared light separated by the prism 12.
  • the image sensor section 3 in FIG. 6 is a color two-plate type using an image sensor that receives visible light and a monochrome image sensor that receives near-infrared light.
  • incident light passes through a prism 12 and is separated into visible light and near-infrared light.
  • the image sensor 35 receives only visible light and outputs it to the video signal processing section 4 as a visible light signal.
  • the image sensor 36 receives only near-infrared light and outputs it to the video signal processing section 4 as a near-infrared light signal.
  • the image sensor 35 can receive light as an RGB color image through the filter 35a.
  • although FIG. 6 shows an example with 16 filters, the arrangement is not limited to this and can be adapted to the number of light receiving elements.
  • FIG. 7 is a diagram showing a fifth example of the image sensor section in the imaging device of the present invention.
  • the image sensor section 3 shown in FIG. 7 includes a prism 15, an image sensor 37, an image sensor 38, an image sensor 39, and an image sensor 40.
  • the prism 15 is a prism that separates incident light into red light, green light, blue light, and near-infrared light, respectively.
  • the image sensor 37 is an image sensor that receives red light separated by the prism 15.
  • the image sensor 38 is an image sensor that receives green light separated by the prism 15.
  • the image sensor 39 is an image sensor that receives blue light separated by the prism 15.
  • the image sensor 40 is an image sensor that receives near-infrared light separated by the prism 15.
  • the image sensor section 3 in FIG. 7 is a color four-plate type using four monochrome image sensors that receive red light, green light, blue light, and near-infrared light, respectively.
  • incident light passes through a prism 15 and is separated into red light, green light, blue light, and near-infrared light.
  • the image sensor 37 receives only red light
  • the image sensor 38 receives only green light
  • the image sensor 39 receives only blue light. Signals based on these received lights are output to the video signal processing section 4 as visible light signals.
  • the image sensor 40 receives only near-infrared light. A signal based on this received light is output to the video signal processing section 4 as a near-infrared light signal.
  • the video signal processing unit 4 generates a color video based on the received red light, green light, and blue light. Accordingly, none of the image sensors has a mosaic filter like the one shown in FIG. 6.
  • FIG. 8 is a diagram showing an example of a flowchart of the imaging apparatus of the present invention. The operation of expanding the apparent depth of field when the aperture 7 of the lens section 2 is opened will be described using FIG. 8. Here, unless otherwise explained, processing in the video signal processing section 4 is shown.
  • in step S101, it is determined whether the brightness level of the video input to the video signal processing unit 4 is smaller than a threshold value. If it is determined that the brightness level of the input video is smaller than the threshold, the process advances to step S102. If the brightness level of the input video is equal to or higher than the threshold, the process ends.
  • the brightness level threshold can be set in advance. As the input video here, a video based on a visible light signal can be used.
  • in step S102, the aperture 7 of the lens section 2 is opened so that a predetermined brightness level is obtained.
  • the predetermined brightness level here may be, for example, the brightness level threshold value in step S101. If the brightness level is low, the image will be dark, so by opening the aperture 7, the brightness of the image can be maintained at a predetermined level.
  • the degree to which the aperture 7 opens may be determined according to the brightness level detected in step S101.
  • the aperture 7 can be controlled and opened under the control of the control section 5 according to the brightness level detected by the video signal processing section 4.
  • although opening the aperture 7 increases the amount of light and raises the brightness level, the depth of field becomes shallower and the degree of blur in the image at distances other than the focused distance increases.
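A minimal sketch of this control flow (steps S101 and S102; `video_proc` and `lens` are hypothetical stand-ins for the video signal processing unit 4 and the control unit 5 driving the lens unit 2):

```python
def brightness_control_step(video_proc, lens, threshold):
    """One pass of steps S101-S102 of FIG. 8 (illustrative only)."""
    level = video_proc.brightness_level()  # brightness of the visible-light video
    if level >= threshold:                 # S101: bright enough, nothing to do
        return False
    # S102: open the aperture 7 until a predetermined brightness level
    # (here, the threshold itself) is reached.
    lens.open_aperture(target_level=threshold)
    return True                            # continue with S103 and S104
```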
  • in step S103, the contour extraction unit 9 extracts contour components from the near-infrared light signal.
  • the near-infrared light signal is a signal of near-infrared light obtained by passing through a near-infrared light cut filter 8 adjacent to the aperture 7.
  • the near-infrared light cut filter 8 has an opening only at its center, so most of the near-infrared light is blocked and only the portion passing through the central opening gets through. That is, the optical path of the near-infrared light is not affected by the diaphragm 7 and is always in the same state as if the diaphragm were stopped down. As a result, although the brightness level of the near-infrared light signal is low, an image with a deep depth of field can be obtained.
  • the video signal processing unit 4 creates a video (near-infrared video) based on the near-infrared light signal, but since the near-infrared video has a low brightness level, the video is dark as a whole. For this reason, the near-infrared light image may be subjected to gain correction as necessary, or may be passed through an LPF (Low-Pass Filter) to remove noise.
  • Extraction of the contour component can be performed by performing image processing on each image of the near-infrared light video and extracting the contour of each subject.
  • in step S104, the video signal processing unit 4 superimposes the contour component extracted from the near-infrared light image onto the visible light image.
  • the visible light image is an image created by a visible light signal that has passed through the aperture 7.
  • the superimposition aligns and overlays the visible light image and the contour components of the near-infrared light image captured at the same time. As a result, the contour components of the near-infrared light image are added to the blurred, shallow-depth-of-field visible light image, so the outline of each subject can be made clear.
  • FIG. 9 is a diagram illustrating a specific example of superimposing contour components from a visible light image and a near-infrared light image in the imaging apparatus of the present invention.
  • the visible light image 201 is an image obtained when the aperture 7 is opened in step S102 of FIG. 8. Although the brightness level can be increased by opening the aperture 7, the depth of field becomes shallower. In the example of FIG. 9, the car in the foreground is in focus and the other subjects are out of focus, resulting in a blurred image (an image with blurred colors).
  • the brightness level of the image is maintained at a predetermined level so as not to become dark.
  • the contour image 202 is an image representing the contour components extracted in step S103 of FIG. 8.
  • the near-infrared light passes through the near-infrared light cut filter 8 that is open only in the center, resulting in an image with a deep depth of field, but initially the entire image is dark. For this reason, gain correction is performed to increase the gain and brighten the image. At this time, since noise also increases, noise correction such as DNR (Digital Noise Reduction) is also applied.
  • contour components are extracted based on the image.
  • the contour image 202 in FIG. 9 has a deep depth of field, so that not only the car in the foreground but also buildings, roads, billboards, and other cars in the background are in focus or slightly out of focus. Therefore, it is possible to capture the contour components of each subject (each object) through image processing.
  • the contour component of the contour image 202 can represent the contour of each subject with a line or the like.
  • the superimposed video 203 is a video that has undergone the superimposition process in step S104 of FIG. 8.
  • a visible light image 201 and an outline image 202 are superimposed. Since the visible light image 201 maintains the brightness level, the image does not become dark as a whole. Furthermore, since the contour image 202 adds contour components over a wide range, the contour can be made clear and the apparent depth of field can be increased. This allows you to clearly capture not only the car in the foreground, but also other subjects.
  • the present invention is not limited to the above-described embodiments, and includes various modifications.
  • the embodiments described above are described in detail to explain the present invention in an easy-to-understand manner, and the present invention is not necessarily limited to having all the configurations described.
  • it is possible to replace a part of the configuration of one embodiment with the configuration of another embodiment and it is also possible to add the configuration of another embodiment to the configuration of one embodiment.
  • the present invention can be applied not only when the amount of light is insufficient, but also when the amount of light is sufficient.
  • a predetermined effect can be obtained by combining it with other operations that affect the brightness level. For example, it is possible to increase the frame rate by shortening the exposure time by the amount of light increased by opening the aperture 7 of the lens, or to improve the image quality by performing gain correction in the compression direction.
  • although FIG. 2 shows an example in which the aperture 7 and the near-infrared light cut filter 8 are close to each other, the present invention is also applicable even if they are separated.
  • if the aperture 7 is narrowed down to the same size as the opening of the near-infrared light cut filter 8, near-infrared light other than the parallel light from the central opening no longer reaches the sensor; in this case, however, the depth of field of the visible light image is already deep, so contour correction is unnecessary.
  • if the zoom magnification differs between the two images, it can be corrected using aberration correction (digital zoom).
  • 300...Computer system, 302...Processor, 302A, 302B...Processing device, 304...Memory, 306...Memory bus, 308...I/O bus, 309...Bus interface unit, 310...I/O bus interface unit, 312...Terminal interface unit, 314...Storage interface, 316...I/O device interface, 318...Network interface, 320...User I/O device, 324...Display system, 326...Display device, 330...Network, 350...Latent factor identification application

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Color Television Image Signal Generators (AREA)

Abstract

The purpose of the present invention is to provide an imaging device capable of obtaining an image with a deep apparent depth of field even when the aperture is open. This imaging device comprises: an imaging element unit that separately receives visible light and near-infrared light and outputs a visible light signal and a near-infrared light signal; and a video signal processing unit that extracts a contour component from the video generated by the near-infrared light signal and superimposes the extracted contour component on the video generated by the visible light signal. The imaging device further comprises a lens, an aperture that adjusts the amount of light entering through the lens, and a control unit that controls the aperture opening, wherein the control unit opens the aperture in accordance with the luminance level of the video generated by the visible light signal.

Description

Imaging device

The present invention relates to an imaging device, and particularly to an imaging device that expands the apparent depth of field.

In recent years, indoor and outdoor fixed-point surveillance cameras have become widely used as a means of monitoring to strengthen security, and vehicle-mounted surveillance cameras as a means of recording evidence in the event of an accident. When these surveillance cameras shoot in environments with little light, such as in the evening or at night, the images become dark and lose their value as surveillance video. To prevent this, an operation or process is performed to raise the brightness level of the video captured by the surveillance camera.

There are several means of raising the brightness level: [1] open the aperture to increase the amount of light received; [2] lengthen the time of one frame to increase the exposure time; [3] irradiate the subject with light; [4] apply gain correction to the video signal.

Patent Document 1 discloses a technique that determines whether it is daytime, nighttime, early morning, or evening based on the brightness of the image captured by the camera and the brightness of a reference image of the road surface, and controls the camera's aperture according to the result of that determination.

Japanese Patent Application Publication No. 2001-21958

Regarding means [1] above, opening the aperture makes the depth of field shallow, and images of subjects other than those at a specific distance become blurred. It is therefore unsuitable for cameras that monitor not just one specific point but many points over a wide range. The same applies to the technique of Patent Document 1; for example, a similar problem occurs when the aperture is opened to increase the amount of light received at night.

Regarding means [2] of increasing the exposure time, the frame rate must be lowered, and the afterimages of moving subjects increase. The resulting video is therefore unsuitable for surveillance cameras that must record momentary events.

Regarding means [3] of irradiating light, a light-emitting device must be installed, which increases cost and power consumption. Furthermore, irradiating visible light may disturb the subject in some cases, and when near-infrared light is used instead, color video cannot be obtained.

Regarding means [4] of applying gain correction to the video signal, video noise also increases during gain correction, resulting in a noisy image, and the gradation of the video becomes rough. The quality of the acquired image is therefore significantly degraded.

In view of the above problems, it is an object of the present invention to provide an imaging device that can obtain an image with a deep apparent depth of field even when the aperture is opened.

To achieve the above object, one of the typical imaging devices of the present invention includes an image sensor unit capable of separately receiving visible light and near-infrared light from the incident light and outputting a visible light signal based on the visible light and a near-infrared light signal based on the near-infrared light, and a video signal processing unit that extracts a contour component from the image generated by the near-infrared light signal and superimposes the extracted contour component on the image generated by the visible light signal.

According to the present invention, even when the lens aperture is opened, the imaging device can obtain an image with a deep apparent depth of field.

Problems, configurations, and effects other than those described above will be clarified by the following embodiments.

FIG. 1 is a block diagram of a computer system for implementing aspects according to embodiments of the present disclosure. FIG. 2 is a block diagram showing an example of the configuration of an imaging device according to the present invention. FIG. 3 is a diagram showing a first example of the image sensor section in the imaging device of the present invention. FIG. 4 is a diagram showing a second example of the image sensor section in the imaging device of the present invention. FIG. 5 is a diagram showing a third example of the image sensor section in the imaging device of the present invention. FIG. 6 is a diagram showing a fourth example of the image sensor section in the imaging device of the present invention. FIG. 7 is a diagram showing a fifth example of the image sensor section in the imaging device of the present invention. FIG. 8 is a diagram showing an example of a flowchart of the imaging device of the present invention. FIG. 9 is a diagram illustrating a specific example of superimposing contour components from a visible light image and a near-infrared light image in the imaging device of the present invention.
A mode for carrying out the present invention will be described below.

<Computer system for implementing aspects according to the embodiments>

FIG. 1 is a block diagram of a computer system 300 for implementing aspects according to embodiments of the present disclosure. The mechanisms and apparatus of the various embodiments disclosed herein may be applied to any suitable computing system. The main components of computer system 300 include one or more processors 302, memory 304, terminal interface 312, storage interface 314, I/O (input/output) device interface 316, and network interface 318. These components may be interconnected via memory bus 306, I/O bus 308, bus interface unit 309, and I/O bus interface unit 310.

Computer system 300 may include one or more processing devices 302A and 302B, collectively referred to as processor 302. Each processor 302 executes instructions stored in memory 304 and may include an onboard cache. In some embodiments, computer system 300 may include multiple processors, and in other embodiments, computer system 300 may be a single processing unit system. A CPU (Central Processing Unit), FPGA (Field-Programmable Gate Array), GPU (Graphics Processing Unit), DSP (Digital Signal Processor), etc. can be applied as the processing device.

In some embodiments, memory 304 may include random access semiconductor memory, storage devices, or storage media (either volatile or nonvolatile) for storing data and programs. In some embodiments, memory 304 represents the entire virtual memory of computer system 300 and may include the virtual memory of other computer systems connected to computer system 300 via a network. Although memory 304 may be conceptually considered a single entity, in other embodiments this memory 304 may be a more complex arrangement, such as a hierarchy of caches and other memory devices. For example, memory may exist as multiple levels of caches, and these caches may be divided by function, so that one cache holds instructions while another holds non-instruction data used by the processor. Memory may be distributed and associated with various different processing units, as in a so-called NUMA (Non-Uniform Memory Access) computer architecture.

Memory 304 may store all or a portion of the programs, modules, and data structures that perform the functions described herein. For example, memory 304 may store a latent factor identification application 350. In some embodiments, latent factor identification application 350 may include instructions or descriptions that perform the functions described below on processor 302, or may include instructions or descriptions that are interpreted by other instructions or descriptions. In some embodiments, latent factor identification application 350 may be implemented in hardware via semiconductor devices, chips, logic gates, circuits, circuit cards, and/or other physical hardware devices instead of, or in addition to, a processor-based system. In some embodiments, latent factor identification application 350 may include data other than instructions or descriptions. In some embodiments, a camera, sensor, or other data input device (not shown) may be provided to communicate directly with bus interface unit 309, processor 302, or other hardware of computer system 300. Such a configuration may reduce the need for processor 302 to access memory 304 and the latent factor identification application.

Computer system 300 may include a bus interface unit 309 that provides communication between processor 302, memory 304, display system 324, and I/O bus interface unit 310. I/O bus interface unit 310 may be coupled to I/O bus 308 for transferring data to and from various I/O units. I/O bus interface unit 310 may communicate, via I/O bus 308, with a plurality of I/O interface units 312, 314, 316, and 318, also known as I/O processors (IOPs) or I/O adapters (IOAs). Display system 324 may include a display controller, display memory, or both. The display controller may provide video, audio, or both data to display device 326. Computer system 300 may also include devices, such as one or more sensors, configured to collect data and provide the data to processor 302. For example, computer system 300 may include environmental sensors that collect humidity data, temperature data, pressure data, etc., and motion sensors that collect acceleration data, motion data, etc. Other types of sensors can also be used. The display memory may be a dedicated memory for buffering video data. Display system 324 may be connected to a display device 326, such as a standalone display screen, a television, a tablet, or a handheld device. In some embodiments, display device 326 may include speakers to render audio. Alternatively, speakers for rendering audio may be connected to the I/O interface unit. In other embodiments, the functionality provided by display system 324 may be implemented by an integrated circuit that includes processor 302. Similarly, the functionality provided by bus interface unit 309 may be implemented by an integrated circuit that includes processor 302.

The I/O interface units have the ability to communicate with various storage or I/O devices. For example, the terminal interface unit 312 allows the attachment of user I/O devices 320, such as user output devices (a video display device, a speaker television, etc.) and user input devices (a keyboard, mouse, keypad, touch pad, trackball, button, light pen, or other pointing device). Using the user interface, a user may operate a user input device to input data and instructions to user I/O device 320 and computer system 300, and to receive output data from computer system 300. The user interface may be displayed on a display device, played through a speaker, or printed through a printer, for example, via the user I/O device 320.

Storage interface 314 allows the attachment of one or more disk drives or direct access storage devices 322 (typically magnetic disk drive storage devices, although this may also be an array of disk drives or other storage devices configured to appear as a single disk drive). In some embodiments, storage device 322 may be implemented as any secondary storage device. The contents of memory 304 may be stored in storage device 322 and read from storage device 322 as needed. Network interface 318 may provide a communication path so that computer system 300 and other devices can communicate with each other. This communication path may be, for example, network 330.

The computer system 300 shown in FIG. 1 includes a bus structure that provides a direct communication path between processor 302, memory 304, bus interface 309, display system 324, and I/O bus interface unit 310; in other embodiments, computer system 300 may include point-to-point links in hierarchical, star, or web configurations, multiple hierarchical buses, or parallel and redundant communication paths. Furthermore, although I/O bus interface unit 310 and I/O bus 308 are shown as a single unit, computer system 300 may actually include multiple I/O bus interface units 310 or multiple I/O buses 308. Also, although multiple I/O interface units are shown to separate I/O bus 308 from the various communication paths leading to the various I/O devices, in other embodiments some or all of the I/O devices may be directly connected to a single system I/O bus.

In some embodiments, computer system 300 may be a device that receives requests from other computer systems (clients) without a direct user interface, such as a multi-user mainframe computer system, a single-user system, or a server computer. In other embodiments, computer system 300 may be a desktop computer, portable computer, laptop, tablet computer, pocket computer, telephone, smartphone, or any other suitable electronic device.
<撮像装置のブロック図>
 図2は本発明の撮像装置の構成例を示すブロック図である。図2において、撮像装置1は、レンズ部2、撮像素子部3、映像信号処理部4、制御部5を備えている。撮像装置1は、映像として、例えば、1秒間に3フレーム(3fps)以上等で撮影して、後述する処理を行う。被写体からの入射光はレンズ部2で結像され、撮像素子部3で電気信号に光電変換される。
<Block diagram of imaging device>
FIG. 2 is a block diagram showing an example of the configuration of an imaging device according to the present invention. In FIG. 2, the imaging device 1 includes a lens section 2, an image sensor section 3, a video signal processing section 4, and a control section 5. The imaging device 1 captures video at, for example, 3 frames per second (3 fps) or more, and performs processing to be described later. Incident light from a subject is imaged by the lens section 2, and photoelectrically converted into an electrical signal by the image sensor section 3.
 The lens section 2 includes concave and convex lenses 6 that adjust zoom and focus, an aperture 7 that adjusts the amount of light, and a near-infrared light cut filter 8, arranged in this order from the incident-light side: lenses 6, aperture 7, near-infrared light cut filter 8. Various configurations of the lenses 6 can be applied depending on the application.
 Here, the aperture 7 is explained. The aperture 7 adjusts the amount of light entering from the lenses 6 by changing the size of a central opening formed by a structure such as multiple blades. Opening the aperture increases the amount of light, so good images can be captured even when the surroundings are dark. In that case, however, the depth of field becomes shallow: subjects at one particular distance are in focus, while subjects at other distances tend to blur and become difficult to identify. Conversely, when the surroundings are bright, there is sufficient light even with the aperture closed down, so the depth of field is deep; a wide range of distances comes into focus, and even out-of-focus distances show only slight blur.
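 This behavior follows from the standard thin-lens depth-of-field formulas. The sketch below is a minimal illustration only, not part of the present disclosure; the focal length, f-numbers, focus distance, and circle of confusion are all assumed values.

def depth_of_field(f_mm, N, s_mm, c_mm=0.005):
    """f_mm: focal length, N: f-number, s_mm: focus distance,
    c_mm: circle of confusion (sensor-dependent assumption)."""
    H = f_mm ** 2 / (N * c_mm) + f_mm                 # hyperfocal distance
    near = s_mm * (H - f_mm) / (H + s_mm - 2 * f_mm)  # near limit of focus
    far = s_mm * (H - f_mm) / (H - s_mm) if s_mm < H else float("inf")
    return near, far

for N in (1.4, 8.0):  # wide open vs. stopped down
    near, far = depth_of_field(f_mm=8.0, N=N, s_mm=5000.0)
    print(f"f/{N}: roughly {near / 1000:.1f} m to {far / 1000:.1f} m in focus")

 With these assumed values, stopping down to f/8 keeps everything from about 1.2 m to infinity acceptably sharp, while opening up to f/1.4 narrows the in-focus range to roughly 3.2-11 m.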
 The near-infrared light cut filter 8 is a filter that cuts only near-infrared light and passes visible light. Here, near-infrared light is light with a wavelength of roughly 800 to 2,500 nanometers, and visible light is light with a wavelength of roughly 380 to 770 nanometers. The near-infrared light cut filter 8 is located adjacent to the aperture 7 and is open only at its central portion, where there is no filter material and light passes straight through. The size of this unfiltered central opening can be set relatively small, for example smaller than the opening of the aperture 7 at its most closed position, or smaller than the opening of the aperture 7 at the brightness-level threshold of step S101 in FIG. 8 described later. This makes it possible to set a deep depth of field for near-infrared light.
 The image sensor section 3 converts the incident light that has passed through the lens section 2, the aperture 7, and the near-infrared light cut filter 8 into an electrical signal at its image sensor and sends the signal to the video signal processing section 4. Examples of the image sensor include a CCD (Charge-Coupled Device) image sensor and a CMOS (Complementary Metal Oxide Semiconductor) image sensor. The image sensor section 3 is configured to receive visible light and near-infrared light separately; its detailed configuration is described later with reference to FIGS. 3 to 7.
 The video signal processing section 4 applies various kinds of video signal processing, such as gain correction, gamma correction, knee correction, contour correction, and color correction, to the signal input from the image sensor section 3. The video signal processing section 4 includes a contour extraction section 9, which performs contour correction processing on the near-infrared light signal input from the image sensor section 3 and outputs the result as a contour signal. This contour signal is then added to the visible light signal input from the image sensor section 3 to create a video signal, from which an HD-SDI (High Definition Serial Digital Interface) signal is generated and output externally. Note that the video signal output from the video signal processing section 4 is not limited to HD-SDI; other kinds of video signals, such as compressed or encrypted signals, may also be applied.
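 As a rough illustration of this signal path, the sketch below assumes OpenCV, 8-bit grayscale frames of equal size, and a Laplacian standing in for the unspecified contour operator; none of these choices are mandated by the present disclosure.

import cv2
import numpy as np

def make_output_frame(visible, nir):
    """visible: visible-light frame; nir: near-infrared frame (same size)."""
    nir = cv2.GaussianBlur(nir, (3, 3), 0)            # mild low-pass for noise
    edges = cv2.Laplacian(nir, cv2.CV_16S, ksize=3)   # contour (edge) component
    edges = np.clip(edges, 0, 255).astype(np.uint8)   # keep positive edges only
    return cv2.add(visible, edges)                    # saturating superposition

 For a color pipeline, the same contour signal would typically be added to the luminance channel only.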
 The control section 5 controls each part of the imaging device 1: the lens section 2, the image sensor section 3, and the video signal processing section 4. The control section 5 can be implemented by, for example, a CPU.
 The computer system 300 shown in FIG. 1 can be applied to the video signal processing section 4 and the control section 5.
 <Specific example of image sensor section>
 Specific examples of the image sensor section 3 are described below with reference to FIGS. 3 to 7.
 FIG. 3 shows a first example of the image sensor section in the imaging device of the present invention. The image sensor section 3 shown in FIG. 3 includes an image sensor 31 having a filter 31a. The filter 31a is installed on the incident-light side and consists of multiple visible-light filters "Vis" and multiple near-infrared-light filters "IR" arranged alternately in a mosaic (checkerboard) pattern. The visible-light filter "Vis" transmits only visible light; the near-infrared-light filter "IR" transmits only near-infrared light. The image sensor section 3 of FIG. 3 is therefore a monochrome single-sensor type, serving as an image sensor for monochrome images. Incident light passes through the filter 31a and reaches the image sensor 31, where the "Vis" and "IR" filters separate visible light and near-infrared light for each light-receiving element. The results are output to the video signal processing section 4 as a visible light signal and a near-infrared light signal. Although FIG. 3 shows an example with 16 filters, the arrangement is not limited to this and can be matched to the number of light-receiving elements.
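 One way such a checkerboard readout could be separated into its two components is sketched below; the parity convention (which sites are "Vis") and the use of NumPy are assumptions for illustration only.

import numpy as np

def split_vis_ir(raw):
    """raw: 2-D array of sensor readings under the Vis/IR mosaic of FIG. 3."""
    rows, cols = np.indices(raw.shape)
    vis_mask = (rows + cols) % 2 == 0   # assumed: even-parity sites are "Vis"
    vis = np.where(vis_mask, raw, 0)    # visible-light samples
    ir = np.where(vis_mask, 0, raw)     # near-infrared samples
    # the zeroed sites would be filled by interpolation (demosaicing) in practice
    return vis, ir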
 FIG. 4 shows a second example of the image sensor section in the imaging device of the present invention. The image sensor section 3 shown in FIG. 4 includes a prism 12, an image sensor 32, and an image sensor 33. The prism 12 separates incident light into visible light and near-infrared light. The image sensor 32 receives the visible light separated by the prism 12, and the image sensor 33 receives the near-infrared light separated by the prism 12. The image sensor section 3 of FIG. 4 is therefore a monochrome two-sensor type using two monochrome image sensors, one for visible light and one for near-infrared light. Incident light passes through the prism 12 and is split into visible light and near-infrared light. The image sensor 32 receives only visible light and outputs it to the video signal processing section 4 as a visible light signal; the image sensor 33 receives only near-infrared light and outputs it as a near-infrared light signal. The video signal processing section 4 generates a monochrome image based on the received visible light signal. Neither image sensor therefore requires a mosaic filter like that of FIG. 3.
 FIG. 5 shows a third example of the image sensor section in the imaging device of the present invention. The image sensor section 3 shown in FIG. 5 includes an image sensor 34 having a filter 34a. The filter 34a is installed on the incident-light side and consists of multiple red filters "R", green filters "G", blue filters "B", and near-infrared filters "IR" arranged alternately in a mosaic pattern. In FIG. 5, multiple four-filter patterns are tiled, each with a green filter "G" at the upper left, a red filter "R" at the upper right, a green filter "G" at the lower left, and a near-infrared filter "IR" at the lower right. The green filter "G" passes only green light, the red filter "R" only red light, and the blue filter "B" only blue light; the near-infrared filter "IR" transmits only near-infrared light. The image sensor section 3 of FIG. 5 is therefore a color single-sensor type, serving as an image sensor for color images.
 In FIG. 5, incident light passes through the filter 34a and reaches the image sensor 34. Of the visible light, light passing through the red filter "R" is received by the image sensor 34 as red light, light passing through the green filter "G" as green light, and light passing through the blue filter "B" as blue light; the resulting signals are output to the video signal processing section 4 as visible light signals. Light passing through the near-infrared filter "IR" is received by the image sensor 34 as near-infrared light, and the resulting signal is output as a near-infrared light signal. Visible light and near-infrared light can thus be received separately for each light-receiving element. Although FIG. 5 shows an example with 16 filters, the arrangement is not limited to this and can be matched to the number of light-receiving elements.
 FIG. 6 shows a fourth example of the image sensor section in the imaging device of the present invention. The image sensor section 3 shown in FIG. 6 includes a prism 12, an image sensor 35, and an image sensor 36. The image sensor 35 has a filter 35a. The prism 12 separates incident light into visible light and near-infrared light, like the prism 12 of FIG. 4. The filter 35a consists of multiple red filters "R", green filters "G", and blue filters "B" arranged alternately in a mosaic; FIG. 6 shows a Bayer-array filter 35a for receiving visible light. The image sensor 35 receives the visible light separated by the prism 12, and the image sensor 36 receives the near-infrared light separated by the prism 12. The image sensor section 3 of FIG. 6 is therefore a color two-sensor type using a color image sensor for visible light and a monochrome image sensor for near-infrared light.
 In FIG. 6, incident light passes through the prism 12 and is split into visible light and near-infrared light. The image sensor 35 receives only visible light and outputs it to the video signal processing section 4 as a visible light signal; through the filter 35a it receives the light as an RGB color image. The image sensor 36 receives only near-infrared light and outputs it as a near-infrared light signal. Although FIG. 6 shows an example with 16 filters, the arrangement is not limited to this and can be matched to the number of light-receiving elements.
 FIG. 7 shows a fifth example of the image sensor section in the imaging device of the present invention. The image sensor section 3 shown in FIG. 7 includes a prism 15 and image sensors 37, 38, 39, and 40. The prism 15 separates incident light into red light, green light, blue light, and near-infrared light. The image sensor 37 receives the red light separated by the prism 15, the image sensor 38 the green light, the image sensor 39 the blue light, and the image sensor 40 the near-infrared light. The image sensor section 3 of FIG. 7 is therefore a color four-sensor type using four monochrome image sensors, one each for red, green, blue, and near-infrared light.
 In FIG. 7, incident light passes through the prism 15 and is split into red, green, blue, and near-infrared light. The image sensor 37 receives only red light, the image sensor 38 only green light, and the image sensor 39 only blue light; the resulting signals are output to the video signal processing section 4 as visible light signals. The image sensor 40 receives only near-infrared light, and the resulting signal is output as a near-infrared light signal. The video signal processing section 4 generates a color video based on the received red, green, and blue light. None of the image sensors therefore requires a mosaic filter like that of FIG. 6.
<Example of flowchart>
 FIG. 8 shows an example flowchart for the imaging device of the present invention. Using FIG. 8, the operation of expanding the apparent depth of field when the aperture 7 of the lens section 2 is opened is described. Unless otherwise noted, the steps below are processing performed in the video signal processing section 4.
 First, in step S101, it is determined whether the brightness level of the video input to the video signal processing section 4 is below a threshold. If so, the process advances to step S102; if the brightness level is at or above the threshold, the process ends. The brightness-level threshold can be set in advance. The video based on the visible light signal can be used as the input video here.
 In step S102, the aperture 7 of the lens section 2 is opened so that a predetermined brightness level is obtained; this level may be, for example, the threshold value of step S101. A low brightness level means a dark image, so opening the aperture 7 keeps the image brightness at the predetermined level. How far the aperture 7 opens may be determined according to the brightness level detected in step S101; here, the control section 5 controls and opens the aperture 7 according to the brightness level detected by the video signal processing section 4. However, although opening the aperture 7 increases the amount of light and raises the brightness level, the depth of field becomes shallower and blur increases for subjects away from the in-focus distance.
 Next, in step S103, the contour extraction section 9 extracts contour components from the near-infrared light signal. The near-infrared light signal is obtained from near-infrared light that has passed through the near-infrared light cut filter 8 adjacent to the aperture 7. Because the filter 8 is open and transparent only at its center, most of the near-infrared light flux does not pass; only the central portion does. In other words, the near-infrared light path is unaffected by the aperture 7 and always behaves as if the aperture were stopped down. As a result, the near-infrared light signal yields an image with a low brightness level but a deep depth of field.
 The video signal processing section 4 creates a video from the near-infrared light signal (a near-infrared video), but because the brightness level is low, the image is dark overall. The near-infrared video may therefore be gain-corrected as necessary, or passed through an LPF (Low-Pass Filter) to remove noise. The LPF may slightly degrade the apparent resolution, but compared to a blurred image with a shallow depth of field, the result is still good enough for extracting contour components. Contour components are extracted by applying image processing to each frame of the near-infrared video and extracting the contour of each subject.
 Next, in step S104, the video signal processing section 4 superimposes the contour components extracted from the near-infrared video onto the visible-light video, which is created from the visible light signal that has passed through the aperture 7. The superposition aligns and overlays the contour components of the near-infrared video onto the visible-light video of the same instant. Because contour components from the near-infrared video are added to the blurred, shallow-depth-of-field visible-light video, the outline of each subject becomes clear. The blur of the shallow-depth-of-field visible-light video (luminance bleeding in a monochrome video; luminance and color bleeding in a color video) is not completely eliminated, but the added contour components make the video appear to have a deep depth of field. At this point, the visible-light video has undergone neither a large gain correction to raise brightness nor any processing that lowers the frame rate, so a video with an appropriate brightness level is obtained without degradation such as noise, loss of gradation, or afterimages.
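 Putting steps S101 to S104 together, a simplified per-frame control flow might look like the sketch below; the threshold value, the helper names, and the SciPy-based edge extraction are illustrative assumptions, not taken from the present disclosure.

import numpy as np
from scipy.ndimage import gaussian_filter, laplace

BRIGHTNESS_THRESHOLD = 64.0   # assumed 8-bit mean-luminance threshold (S101)

def apparent_dof_frame(visible, nir, open_aperture):
    """visible/nir: 2-D uint8 frames; open_aperture: callable standing in for S102."""
    if visible.mean() >= BRIGHTNESS_THRESHOLD:
        return visible                                    # S101: bright enough, done
    open_aperture(BRIGHTNESS_THRESHOLD)                   # S102: open the iris
    nir_f = gaussian_filter(nir.astype(float), sigma=1)   # LPF / noise reduction
    edges = np.clip(np.abs(laplace(nir_f)), 0, 255)       # S103: contour extraction
    out = np.clip(visible.astype(float) + edges, 0, 255)  # S104: superposition
    return out.astype(np.uint8)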
<Specific example of superimposition>
 FIG. 9 illustrates a specific example of superimposing contour components from a near-infrared video onto a visible-light video in the imaging device of the present invention.
 The visible-light video 201 is the video obtained when the aperture 7 is opened in step S102 of FIG. 8. Opening the aperture 7 raises the brightness level, but the depth of field becomes shallow. In the example of FIG. 9, the car in the foreground is in focus while the other subjects are not, producing a blurred image (with color bleeding). The brightness level is maintained at the predetermined level so the image does not darken.
 The contour video 202 represents the contour components extracted in step S103 of FIG. 8. The near-infrared light passes through the near-infrared light cut filter 8, which is open only at the center, so the image has a deep depth of field but is initially dark overall. Gain correction is therefore applied to raise the gain and brighten the image; since this also increases noise, noise correction such as DNR (Digital Noise Reduction) is applied as well, and contour components are then extracted from the corrected image. In the contour video 202 of FIG. 9, the deep depth of field means that not only the foreground car but also the background buildings, road, signboards, and other cars are in focus or only slightly out of focus, so the contour components of each subject (each object) can be captured by image processing. The contour components of the contour video 202 can represent the outline of each subject as lines or the like.
 The superimposed video 203 is the result of the superposition processing of step S104 in FIG. 8; in the example of FIG. 9, the visible-light video 201 and the contour video 202 are superimposed. Thanks to the visible-light video 201, the brightness level is maintained and the overall image is not dark. Furthermore, because the contour video 202 adds contour components over a wide area, outlines become clear and the apparent depth of field is deepened. Not only the foreground car but also every other subject can thus be captured clearly.
<Effect>
 With the embodiments described above, a video with a deep apparent depth of field can be obtained even when the lens aperture is open. This allows, for example, a surveillance camera to capture video that does not darken even in low-light environments such as evening or night. Because the apparent depth of field is deep, a wide range of points can be monitored rather than a single specific point. Moreover, since there is no need to lower the frame rate, afterimages on moving subjects do not increase.
 Note that the present invention is not limited to the embodiments described above and includes various modifications. For example, the embodiments have been described in detail to explain the present invention clearly, and the invention is not necessarily limited to configurations having all of the described elements. Part of the configuration of one embodiment can be replaced with the configuration of another embodiment, the configuration of one embodiment can be added to that of another, and part of the configuration of each embodiment can have other configurations added, deleted, or substituted.
 For example, the present invention is applicable not only when the amount of light is insufficient but also when it is sufficient. In that case, a desired effect can be obtained by combining it with other operations that affect the brightness level: for instance, the extra light gained by opening the aperture 7 of the lens can be traded for a shorter exposure time to raise the frame rate, or gain correction in the compression direction can be applied to improve image quality.
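 The trade is simple stop arithmetic. As a hedged illustration (the f-numbers and base frame rate are assumed values, and exposure time is taken to be the only frame-rate bottleneck):

def max_frame_rate(base_fps, f_old, f_new):
    """Each halving of the f-number quadruples the light, letting the
    exposure time shrink, and the frame rate rise, by the same factor."""
    light_gain = (f_old / f_new) ** 2
    return base_fps * light_gain

print(max_frame_rate(base_fps=30, f_old=4.0, f_new=2.0))  # -> 120.0 fps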
 Although FIG. 2 shows the aperture 7 and the near-infrared light cut filter 8 adjacent to each other, the invention is also applicable when they are apart. For example, if the aperture 7 is stopped down to about the same size as the opening of the near-infrared light cut filter 8, only the near-parallel near-infrared rays through the central opening get through; in that case, however, the visible-light video itself has a deep depth of field, so contour correction is unnecessary. Even if the focal positions differ, no problem arises because near-infrared light has a deep depth of field, and any difference in zoom magnification can be handled by aberration correction (digital zoom).
DESCRIPTION OF SYMBOLS: 1... imaging device, 2... lens section, 3... image sensor section, 4... video signal processing section, 5... control section, 6... concave-convex lenses, 7... aperture, 8... near-infrared light cut filter, 9... contour extraction section, 10... imaging device, 12, 15... prism, 31-40... image sensor, 31a, 34a, 35a... filter, 201... visible-light video, 202... contour video, 203... superimposed video, 300... computer system, 302... processor, 302A, 302B... processing device, 304... memory, 306... memory bus, 308... I/O bus, 309... bus interface unit, 310... I/O bus interface unit, 312... terminal interface unit, 314... storage interface, 316... I/O device interface, 318... network interface, 320... user I/O device, 324... display system, 326... display device, 330... network, 350... latent factor identification application

Claims (3)

  1.  An imaging device comprising: an image sensor section capable of receiving visible light and near-infrared light separately from incident light and outputting a visible light signal based on the visible light and a near-infrared light signal based on the near-infrared light; and a video signal processing section that extracts a contour component from a video based on the near-infrared light signal and superimposes the extracted contour component on a video based on the visible light signal.
  2.  The imaging device according to claim 1, further comprising: a lens; an aperture that adjusts the amount of light entering from the lens; and a control section that controls the opening degree of the aperture, wherein the control section performs control to open the aperture according to the brightness level of the video based on the visible light signal.
  3.  The imaging device according to claim 1, further comprising a near-infrared cut filter having an opening with no filter only at its central portion, wherein the image sensor section receives light that has passed through the near-infrared cut filter.
PCT/JP2022/047594 2022-03-24 2022-12-23 Imaging device WO2023181558A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2024509768A JPWO2023181558A1 (en) 2022-03-24 2022-12-23

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-047858 2022-03-24
JP2022047858 2022-03-24

Publications (1)

Publication Number Publication Date
WO2023181558A1 true WO2023181558A1 (en) 2023-09-28

Family

ID=88100459

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/047594 WO2023181558A1 (en) 2022-03-24 2022-12-23 Imaging device

Country Status (2)

Country Link
JP (1) JPWO2023181558A1 (en)
WO (1) WO2023181558A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006180269A (en) * 2004-12-22 2006-07-06 Sony Corp Image processing apparatus, image processing method, imaging apparatus, program, and recording medium
JP2013152369A (en) * 2012-01-25 2013-08-08 Nippon Seimitsu Sokki Kk Diaphragm device and camera
JP2015029841A (en) * 2013-08-06 2015-02-16 三菱電機エンジニアリング株式会社 Imaging device and imaging method


Also Published As

Publication number Publication date
JPWO2023181558A1 (en) 2023-09-28

Similar Documents

Publication Publication Date Title
US11582400B2 (en) Method of image processing based on plurality of frames of images, electronic device, and storage medium
WO2020207262A1 (en) Image processing method and apparatus based on multiple frames of images, and electronic device
EP2589226B1 (en) Image capture using luminance and chrominance sensors
US9077916B2 (en) Improving the depth of field in an imaging system
KR20210024053A (en) Night view photography method, device, electronic equipment, and storage medium
JP7077395B2 (en) Multiplexed high dynamic range image
WO2020207261A1 (en) Image processing method and apparatus based on multiple frames of images, and electronic device
US20200228781A1 (en) Devices And Methods For An Imaging System With A Dual Camera Architecture
CN109005364A (en) Image formation control method, device, electronic equipment and computer readable storage medium
CN111028190A (en) Image processing method, image processing device, storage medium and electronic equipment
CN107948500A (en) Image processing method and device
CN109089046B (en) Image noise reduction method and device, computer readable storage medium and electronic equipment
JP2020150331A (en) Image processing apparatus, image processing apparatus control method, system, and program
US8368968B2 (en) Imaging apparatus and image correction method
US11184553B1 (en) Image signal processing in multi-camera system
US20080158372A1 (en) Anti-aliasing in an imaging device using an image stabilization system
CN114338958B (en) Image processing method and related equipment
US8836800B2 (en) Image processing method and device interpolating G pixels
CN110213462B (en) Image processing method, image processing device, electronic apparatus, image processing circuit, and storage medium
JP7052811B2 (en) Image processing device, image processing method and image processing system
US8976286B2 (en) Imaging apparatus, lens unit, and imaging unit
CN110930440B (en) Image alignment method, device, storage medium and electronic equipment
CN110276730B (en) Image processing method and device and electronic equipment
WO2023181558A1 (en) Imaging device
EP3780594B1 (en) Imaging device and method, image processing device and method, and imaging element

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22933685

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2024509768

Country of ref document: JP