WO2016104235A1 - Stereo imaging device and moving body - Google Patents

Stereo imaging device and moving body

Info

Publication number
WO2016104235A1
WO2016104235A1 (PCT/JP2015/085013)
Authority
WO
WIPO (PCT)
Prior art keywords
optical system
filter
imaging
imaging device
image
Prior art date
Application number
PCT/JP2015/085013
Other languages
French (fr)
Japanese (ja)
Inventor
Atsushi Yamashita
Shinichi Yamamoto
Akitoshi Nozaki
Osamu Tannai
Original Assignee
Konica Minolta, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konica Minolta, Inc.
Priority to JP2016566133A
Publication of WO2016104235A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00: Measuring distances in line of sight; optical rangefinders
    • G01C3/02: Details
    • G01C3/06: Use of electric means to obtain final indication
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00: Special procedures for taking photographs; apparatus therefor
    • G03B35/00: Stereoscopic photography
    • G03B35/08: Stereoscopic photography by simultaneous recording
    • G03B7/00: Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
    • G03B7/08: Control effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
    • G03B7/091: Digital circuits

Definitions

  • The present invention relates to a stereo imaging device including a plurality of compound-eye imaging devices, and to a moving body equipped with such a stereo imaging device.
  • Obstacle detection devices that detect obstacles ahead in the traveling direction have been developed for unmanned moving bodies and vehicles.
  • The obstacle detection apparatus disclosed in Patent Document 1 detects an obstacle by scanning infrared rays projected in the traveling direction, but the detection range is limited to the infrared scan range and is therefore relatively narrow.
  • The obstacle detection device disclosed in Patent Document 2 detects an obstacle using a millimeter-wave radar, but it is difficult to identify the shape and size of the obstacle.
  • The obstacle detection device disclosed in Patent Document 3 calculates, by template matching, the positional deviation (parallax) of the same object across a plurality of images captured simultaneously by a stereo camera. The position of the object in real space can then be calculated from the obtained parallax using a known conversion formula, which resolves the problems described above.
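The known conversion formula mentioned above can be sketched as follows for a rectified stereo pair; the focal length f (in pixels), baseline B (in metres), and function name are illustrative assumptions, not values from the patent:

```python
def disparity_to_position(u, v, d, f, B):
    """Convert image coordinates (u, v) and disparity d (all in pixels)
    into real-space coordinates (X, Y, Z), given focal length f in
    pixels and stereo baseline B in metres (illustrative parameters)."""
    if d <= 0:
        raise ValueError("disparity must be positive")
    Z = f * B / d   # depth along the optical axis
    X = u * Z / f   # lateral offset from the optical axis
    Y = v * Z / f   # vertical offset from the optical axis
    return X, Y, Z
```

For example, with f = 100 px and B = 0.5 m, a disparity of 5 px corresponds to a depth of 10 m.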
  • The stereo camera of Patent Document 3, however, has difficulty identifying a subject that is a reflecting object or is under backlit conditions. It is conceivable to mount all three types of obstacle detection devices described above and use them selectively according to conditions, but this increases both cost and weight. In particular, when an obstacle detection device is mounted on an unmanned air vehicle or the like, a device that is as small and light as possible is desired in order to secure cruising distance.
  • The present invention has been made in view of these problems, and an object thereof is to provide a stereo imaging device that is small and light yet capable of recognizing various obstacles, and a moving body equipped with the same.
  • A stereo imaging device reflecting one aspect of the present invention comprises:
  • a first imaging device having a first array optical system with a plurality of first imaging optical systems arranged with mutually different optical axes, and a first solid-state imaging element that photoelectrically converts the image formed by each of the first imaging optical systems and outputs an image signal; and
  • a second imaging device having a second array optical system with a plurality of second imaging optical systems arranged with mutually different optical axes, and a second solid-state imaging element that photoelectrically converts the image formed by each of the second imaging optical systems and outputs an image signal.
  • The optical axes of the first imaging optical systems and the second imaging optical systems are parallel, and the first imaging device and the second imaging device are spaced apart in a direction orthogonal to the optical axes.
  • Each of the first imaging optical systems includes one of a filter without optical correction, a red filter, a green filter, a blue filter, an ND filter, a near-infrared filter, and a polarizing filter, and at least two types of filters are present in the first array optical system. Each filter of the second imaging optical systems is of the same type as a filter of the first array optical system. Three-dimensional information of the subject is obtained based on the image signal of the image formed by a first imaging optical system and the image signal of the image formed by a second imaging optical system having the same type of filter.
  • This makes it possible to provide a stereo imaging device that is small and light yet capable of recognizing various obstacles, and a movable body equipped with the same.
  • FIG. 1 is a perspective view of an unmanned air vehicle 100, FIG. 4 is a block diagram showing its configuration, and FIGS. 5 to 10 are block diagrams each showing a processing flow in the image processing unit PR.
  • FIG. 1 is a perspective view of an unmanned air vehicle as a moving body according to the present embodiment.
  • As shown in FIG. 1, the main body 101, supported by legs 102, has four arms 103A to 103D extending horizontally at equal intervals in the circumferential direction.
  • Motors 104A to 104D, which are propulsive-force generators, are attached to the tips of the arms 103A to 103D, and propellers 105A to 105D are rotatably attached to the vertically oriented rotation shafts of the motors 104A to 104D.
  • The first imaging device CA1 and the second imaging device CA2 are installed with their optical axes parallel to each other, spaced apart in the direction perpendicular to the optical axes, and facing the traveling direction (arrow direction).
  • FIG. 2 is a diagram illustrating a schematic configuration of the first imaging device CA1.
  • The first imaging device CA1 has single-eye optical systems IL arranged in three rows and three columns with their optical axes parallel to each other, a solid-state imaging element (first solid-state imaging element) SR having nine photoelectric conversion regions Ia (which may be integrated) that photoelectrically convert the subject image formed by each single-eye optical system IL, and an optical filter CF disposed between the single-eye optical systems IL and the photoelectric conversion regions Ia.
  • The optical filter CF is divided into nine filter elements CFa, one for each single-eye optical system IL.
  • the optical filter CF may be disposed on the subject side with respect to the single-eye optical system IL.
  • The plurality of types of filter elements of the optical filter CF and the plurality of single-eye optical systems IL constitute the first array optical system. In the second imaging device CA2, the same types of optical filter CF and a plurality of single-eye optical systems IL likewise constitute the second array optical system, and the solid-state imaging element SR that photoelectrically converts the subject images formed by it corresponds to the second solid-state imaging element.
  • FIG. 3 is a diagram illustrating an example of the arrangement of filter elements in the optical filter used in the first imaging device CA1.
  • The optical filter CF includes a filter element CFa of transparent glass (hereinafter NO), a filter element CFa that transmits red light (hereinafter R), a filter element CFa that transmits green light (hereinafter G), a filter element CFa that transmits blue light (hereinafter B), filter elements CFa having a first and a second attenuation factor (hereinafter ND1 and ND2), a filter element CFa that transmits near-infrared light (hereinafter IR), and polarizing filter elements CFa (hereinafter PF and SF).
  • One type of filter element CFa and the corresponding single-eye optical system IL constitute an imaging optical system (a first imaging optical system or a second imaging optical system). It is sufficient that at least two types of filter elements are provided.
  • The second imaging device CA2 has the same configuration as the first imaging device CA1 shown in FIGS. 2 and 3.
  • the arrangement of the filter elements CFa may be the same as that of the first imaging device, an arrangement having a mirror image relationship, or a random arrangement.
  • the “filter without optical correction” refers to a filter that is not substantially optically corrected for transmitted light.
  • A transparent resin material may be used instead of glass, or the filter element may be omitted entirely; in either case, it is possible to capture wavelength information over a wider band than with each of the R, G, and B filters.
  • FIG. 4 is a block diagram showing the configuration of the unmanned air vehicle 100.
  • Image signals output from the first imaging device CA1 and the second imaging device CA2 are input to the image processing unit PR, which is mounted on the main body 101 of the unmanned air vehicle 100 and connected to the first imaging device CA1 and the second imaging device CA2.
  • the subject distance information acquisition unit PR1 in the image processing unit PR obtains three-dimensional information of the imaging region, thereby obtaining subject information related to the obstacle.
  • a visible image is formed by the monitoring visible image information acquisition unit PR2 in the image processing unit PR, and the data is transmitted to an external monitor (display device) MT.
  • the visible image captured from the unmanned air vehicle 100 can be observed by the operator through the monitor MT.
  • a color visible light image can be formed by using a light beam transmitted through the filter elements (R, G, B) of the first imaging device CA1 and / or the second imaging device CA2, and this is displayed on the monitor MT.
  • a monitor image may be displayed using a light beam that has passed through another filter element.
  • The subject information output from the image processing unit PR is output to a drive control unit (movement control unit) DR disposed in the main body 101; based on this information, the drive control unit controls the motors 104A to 104D so as not to collide with obstacles, realizing autonomous flight.
  • FIG. 5 is a block diagram illustrating a processing flow in the image processing unit PR.
  • the unmanned air vehicle 100 detects an object ahead in the moving direction as an obstacle, and performs a flight so as to avoid the obstacle.
  • In the first imaging device CA1 and the second imaging device CA2, the light beams transmitted through the filter elements CFa and the single-eye optical systems IL are photoelectrically converted by the photoelectric conversion regions Ia of the solid-state imaging element SR. The resulting signals are output from the solid-state imaging element SR, stored in a memory MR (not shown) provided in the image processing unit PR, and read out as necessary.
  • First processing mode: a target for distance measurement is extracted in advance, and distance measurement is performed on the target to create a distance map. When a reflector is detected, the distance information of the reflector itself is not relied upon.
  • When the exposure time (exposure amount) is to be determined from the light flux that passed through the transparent-glass filter element (NO), by a predetermined program or an operator's operation, the image processing unit PR, in processes P101 and P102, receives independently from each of the first imaging device CA1 and the second imaging device CA2 the signal corresponding to the light beam that passed through the filter element (NO), determines the exposure time from that value, and feeds it back to control the exposure time of each solid-state imaging element SR individually. After imaging with the determined exposure time, the signals output from the photoelectric conversion regions Ia of the first imaging device CA1 and the second imaging device CA2 are stored in the memory MR.
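One plausible form of the exposure feedback described above is a proportional update toward a target mean signal in the NO channel; the control law, target value, and limits below are assumptions, since the patent does not specify them:

```python
def next_exposure(current_exposure_s, mean_signal, target_signal=0.5,
                  min_exp=1e-5, max_exp=1e-1):
    """One iteration of the exposure-time feedback loop: scale the
    current exposure so the mean NO-channel signal (normalized 0..1)
    approaches the target, clamped to the sensor's usable range."""
    if mean_signal <= 0:
        return max_exp  # fully dark frame: open up to the maximum
    proposed = current_exposure_s * (target_signal / mean_signal)
    return min(max(proposed, min_exp), max_exp)
```

In this sketch the function would be applied independently to CA1 and CA2, as in processes P101 and P102.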
  • The image processing unit PR then reads from the memory MR, for each of the first imaging device CA1 and the second imaging device CA2, the signal corresponding to the light beam that passed through the transparent-glass filter element (NO), the signal corresponding to the light beam that passed through the filter element (ND1) having the first attenuation factor, and the signal corresponding to the light beam that passed through the filter element (ND2) having the second attenuation factor; it selects the image with the highest contrast and discriminates the subject from the signal values corresponding to that image.
  • Under imaging conditions such as backlighting, a subject with high brightness such as the sun may appear, and the pixel values of the photoelectric conversion region Ia that receives the light beam through the transparent-glass filter element (NO) may saturate. In that case, a subject image having contrast free of whiteout is obtained from the signal corresponding to the light beam that passed through the filter element (ND1 or ND2) having the first or second attenuation factor, so an image with brightness optimal for obstacle detection can be obtained and erroneous detection can be prevented.
  • any two of the filter elements may be selected, and the same operation as described above may be performed from the obtained signal.
  • any two or three of the filter elements (NO, ND1, ND2) may be selected to create an image with a wide dynamic range.
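The highest-contrast selection among the NO, ND1, and ND2 images might look like the following; RMS contrast (the standard deviation of pixel values) is used as a stand-in metric, since the patent does not fix how contrast is measured:

```python
import numpy as np

def pick_highest_contrast(images):
    """Return the index and data of the image with the highest contrast
    among candidates (e.g. the NO, ND1, and ND2 images). RMS contrast
    (standard deviation of pixel values) is an illustrative choice."""
    contrasts = [float(np.std(img)) for img in images]
    best = int(np.argmax(contrasts))
    return best, images[best]
```

A washed-out (whiteout) NO frame has near-zero standard deviation, so an attenuated ND frame is preferred automatically.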
  • In process P104, the image processing unit PR reads from the memory MR the signal corresponding to the light beam that passed through the filter element (PF) and the signal corresponding to the light beam that passed through the filter element (SF), obtains the difference between the two signal values, and determines from these values whether the subject contains a reflector. That is, if the difference between the two signal values exceeds a threshold value, reflected light has entered, so the image processing unit PR recognizes the region exceeding the threshold as a reflector.
  • the signal used for the determination of the reflector may be only the signal in the first imaging device CA1, only the signal in the second imaging device CA2, or from both the first imaging device CA1 and the second imaging device CA2. This signal may be used.
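The PF/SF difference test can be sketched per pixel as follows; the threshold value is illustrative, as the patent leaves it unspecified:

```python
import numpy as np

def reflector_mask(pf_img, sf_img, threshold=0.5):
    """Mark as reflector every pixel where the difference between the
    PF and SF polarization channels exceeds the threshold, i.e. where
    strongly polarized (reflected) light has entered."""
    diff = np.abs(pf_img.astype(float) - sf_img.astype(float))
    return diff > threshold
```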
  • In process P105, the image processing unit PR reads from the memory MR the signal corresponding to the light beam that passed through the near-infrared filter element (IR) and identifies and extracts subjects whose signal is below a threshold value. For example, overhead lines installed over a forest may be difficult to discern with visible light because their color tone is close to the background, but since plants generally reflect near-infrared light, the signal becomes lower only in the overhead-line portions. An overhead line can therefore be distinguished from the forest by using near-infrared light.
  • The image processing unit PR also reads from the memory MR at least one of the signal corresponding to the red light beam that passed through the filter element (R), the signal corresponding to the green light beam that passed through the filter element (G), and the signal corresponding to the blue light beam that passed through the filter element (B), together with the signal corresponding to the light beam that passed through the near-infrared filter element (IR), and performs a calculation to identify the subject.
  • an NDVI (normalized difference vegetation index) formula can be used to recognize an object.
  • Here, Rs is the luminance value of the signal corresponding to the red light beam that passed through the filter element (R), and IRs is the luminance value of the signal corresponding to the near-infrared light beam that passed through the filter element (IR); a signal corresponding to the green light beam that passed through the filter element (G) or the blue light beam that passed through the filter element (B) may be used as appropriate.
  • NDVI = (IRs - Rs) / (IRs + Rs)  (1)
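Formula (1) evaluated per pixel might look like this; the small eps guarding against a zero denominator is an added assumption, not part of the patent's formula:

```python
import numpy as np

def ndvi(ir_img, r_img, eps=1e-9):
    """Per-pixel NDVI = (IRs - Rs) / (IRs + Rs) from formula (1).
    Vegetation, which reflects near-infrared strongly, yields values
    near +1, while man-made objects tend toward 0 or below."""
    ir = ir_img.astype(float)
    r = r_img.astype(float)
    return (ir - r) / (ir + r + eps)
```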
  • For the regions other than the reflector determined in process P104, the image processing unit PR reads from the memory MR, for each imaging device, the signal corresponding to the red light beam that passed through the filter element (R), the signal corresponding to the green light beam that passed through the filter element (G), and the signal corresponding to the blue light beam that passed through the filter element (B), generates two sets of visible-light image data, forms a stereo image for each color based on the parallax information, and performs distance measurement.
  • the distance measuring technique using a stereo image is disclosed in Japanese Patent Application Laid-Open No. 2009-186228, and thus details are not described.
  • the image processing unit PR creates a visible light distance map for each pixel in process P108, and obtains three-dimensional information of the subject.
  • the “distance map” refers to an object image having distance information.
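A minimal version of forming a stereo correspondence and a per-pixel distance map from two same-filter images could look like the following SAD block matching; the block size, search range, and lack of subpixel refinement are simplifications not taken from the patent:

```python
import numpy as np

def disparity_map(left, right, block=3, max_disp=8):
    """Per-pixel disparity between two images of the same filter type,
    by sum-of-absolute-differences block matching along epipolar rows.
    The distance map then follows as Z = f * B / disparity."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w))
    for y in range(half, h - half):
        for x in range(half, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            best_d, best_cost = 0, np.inf
            for d in range(min(max_disp, x - half) + 1):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1]
                cost = float(np.abs(patch - cand).sum())
                if cost < best_cost:
                    best_d, best_cost = d, cost
            disp[y, x] = best_d
    return disp
```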
  • For the regions other than the reflector determined in process P104, the image processing unit PR, in process P109, reads from the memory MR the signal corresponding to the light beam that passed through the transparent-glass filter element (NO), the signal corresponding to the light beam that passed through the filter element (ND1) having the first attenuation factor, and the signal corresponding to the light beam that passed through the filter element (ND2) having the second attenuation factor; it forms a stereo image from the image having the highest contrast and measures the distance. Thereafter, in process P110, the image processing unit PR creates a backlight distance map for each pixel and obtains three-dimensional information of the subject.
  • any two or three of the filter elements may be used to create an image with a wide dynamic range, form a stereo image, and perform distance measurement.
  • The image processing unit PR obtains the boundary portion of the reflector determined in process P104, reads from the memory MR the signal corresponding to the light beam that passed through the filter element (PF) or the filter element (SF) for this boundary portion, forms a stereo image, and performs distance measurement. Thereafter, in process P112, the image processing unit PR creates a distance map including the reflector, using the obtained distance measurement value of the boundary as the distance measurement value of the reflector, and obtains three-dimensional information of the subject.
  • For the regions other than the reflector determined in process P104, the image processing unit PR, in process P113, reads from the memory MR the signal corresponding to the near-infrared light beam that passed through the filter element (IR), forms a near-infrared stereo image, and performs distance measurement. Thereafter, in process P114, the image processing unit PR creates a near-infrared distance map for each pixel and obtains three-dimensional information of the subject.
  • For the regions other than the reflector determined in process P104, the image processing unit PR, in process P115, reads from the memory MR the signal corresponding to the red light beam that passed through the filter element (R), the signal corresponding to the green light beam that passed through the filter element (G), the signal corresponding to the blue light beam that passed through the filter element (B), and the signal corresponding to the near-infrared light beam that passed through the filter element (IR); it forms visible-light and near-infrared stereo images and performs distance measurement. Thereafter, in process P116, the image processing unit PR creates a visible-light and near-infrared distance map for each pixel and obtains three-dimensional information of the subject.
  • The image processing unit PR superimposes the distance maps obtained in processes P108, P110, P112, P114, and P116, excluding the distance measurement information of the reflector itself, whose accuracy cannot be guaranteed. Subject information is then generated, for example so that an object recognized in two or more distance maps is avoided, and is output to the drive control unit DR.
  • Alternatively, the image processing unit PR may output to the drive control unit DR subject information for avoiding the object closest to the unmanned air vehicle 100, based on the distance maps obtained in processes P108, P110, P112, P114, and P116.
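The superimposition of the per-mode distance maps (processes P108, P110, P112, P114, and P116) could be sketched as follows; keeping the nearest valid distance per pixel and counting how many maps recognize it is an assumed concrete form of the "two or more distance maps" criterion:

```python
import numpy as np

def fuse_distance_maps(maps):
    """Superimpose distance maps, with NaN marking 'no measurement'.
    Returns the nearest valid distance per pixel and the number of maps
    that recognized the pixel (objects seen in >= 2 maps are avoided)."""
    stack = np.stack([np.asarray(m, dtype=float) for m in maps])
    valid = ~np.isnan(stack)
    votes = valid.sum(axis=0)
    filled = np.where(valid, stack, np.inf)   # ignore missing values
    nearest = filled.min(axis=0)
    nearest = np.where(votes > 0, nearest, np.nan)
    return nearest, votes
```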
  • FIG. 6 is a block diagram showing another example of the processing flow in the image processing unit PR.
  • When the exposure time is to be determined from the red, green, and blue light beams that passed through the filter elements (R), (G), and (B), by a predetermined program or an operator's operation, the image processing unit PR, in processes P101A and P102A, receives independently from each of the first imaging device CA1 and the second imaging device CA2 the signal corresponding to the red light beam that passed through the filter element (R), the signal corresponding to the green light beam that passed through the filter element (G), and the signal corresponding to the blue light beam that passed through the filter element (B); it calculates a luminance value from these values, determines the exposure time, and feeds it back to control the exposure time of each solid-state imaging element SR individually. Processes P103 to P117 are the same as in FIG. 5, and description thereof is omitted.
  • FIG. 7 is a block diagram showing another example of the processing flow in the image processing unit PR.
  • In processes P101B and P102B, the image processing unit PR receives independently from the first imaging device CA1 and the second imaging device CA2 the signal corresponding to the near-infrared light beam that passed through the filter element (IR), determines the exposure time from this value, and feeds it back to control the exposure time of each solid-state imaging element SR individually. Processes P103 to P117 are the same as in FIG. 5, and description thereof is omitted.
  • FIG. 8 is a block diagram showing another example of the processing flow in the image processing unit PR.
  • When the exposure time is to be determined from the light beams that passed through the filter element (ND1) having the first attenuation factor and the filter element (ND2) having the second attenuation factor, by a predetermined program or an operator's operation, the image processing unit PR, in processes P101C and P102C, receives independently from the first imaging device CA1 and the second imaging device CA2 the signal corresponding to the light beam that passed through the filter element (ND1) and/or the signal corresponding to the light beam that passed through the filter element (ND2), determines the exposure time from these values, and feeds it back to control the exposure time of each solid-state imaging element SR individually. Processes P103 to P117 are the same as in FIG. 5, and description thereof is omitted.
  • FIG. 9 is a block diagram showing another example of the processing flow in the image processing unit PR.
  • When the exposure time is to be determined from a signal corresponding to the light beam that passed through the filter element (PF) or the filter element (SF), by a predetermined program or an operator's operation, the image processing unit PR receives independently from the first imaging device CA1 and the second imaging device CA2 the signal corresponding to the light beam that passed through the filter element (PF) or the filter element (SF), determines the exposure time from this value, and feeds it back to control the exposure time of each solid-state imaging element SR individually. Processes P103 to P117 are the same as in FIG. 5, and description thereof is omitted.
  • the drive control unit DR temporarily stops the unmanned air vehicle 100.
  • FIG. 10 is a block diagram illustrating a processing flow in the image processing unit PR according to the second processing mode.
  • In processes P201 and P202, the image processing unit PR receives independently from each of the first imaging device CA1 and the second imaging device CA2 the signal corresponding to the light beam that passed through the transparent-glass filter element (NO), determines the exposure time from that value, and feeds it back to control the exposure time of each solid-state imaging element SR individually. After imaging with the determined exposure time, the signals output from the photoelectric conversion regions Ia of the first imaging device CA1 and the second imaging device CA2 are stored in the memory MR.
  • In process P203, the image processing unit PR reads from the memory MR, for each of the first imaging device CA1 and the second imaging device CA2, the signal corresponding to the red light beam that passed through the filter element (R), the signal corresponding to the green light beam that passed through the filter element (G), and the signal corresponding to the blue light beam that passed through the filter element (B); it forms a stereo image for each color, performs distance measurement, and obtains RGB distance measurement data. If the subject includes a reflector, the RGB distance measurement data also includes the distance measurement data of the reflector.
  • In process P204, the image processing unit PR reads from the memory MR, for each of the first imaging device CA1 and the second imaging device CA2, the signal corresponding to the light beam that passed through the transparent-glass filter element (NO), the signal corresponding to the light beam that passed through the filter element (ND1) having the first attenuation factor, and the signal corresponding to the light beam that passed through the filter element (ND2) having the second attenuation factor; from the values corresponding to the same type of filter element, it forms stereo images, performs distance measurement, and obtains ND distance measurement data. If the subject includes a reflector, the ND distance measurement data also includes the distance measurement data of the reflector.
  • any two of the filter elements may be selected, and the same operation as described above may be performed from the obtained signal.
  • any two or three of the filter elements (NO, ND1, ND2) may be used to create an image with a wide dynamic range, form a stereo image, and perform distance measurement.
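Creating a wide-dynamic-range image from the NO, ND1, and ND2 channels might be sketched as below; the gains (inverse attenuation factors) and the saturation threshold are illustrative values, since the patent does not state the attenuation factors:

```python
import numpy as np

def wide_dynamic_range(no_img, nd1_img, nd2_img,
                       nd1_gain=8.0, nd2_gain=64.0, sat=0.95):
    """Compose a wide-dynamic-range image: where the NO pixel is
    saturated, substitute the ND1 pixel rescaled by its attenuation;
    where ND1 is also saturated, fall back to the rescaled ND2 pixel."""
    out = no_img.astype(float).copy()
    use_nd1 = out >= sat
    out[use_nd1] = nd1_img[use_nd1] * nd1_gain
    still_sat = use_nd1 & (nd1_img >= sat)
    out[still_sat] = nd2_img[still_sat] * nd2_gain
    return out
```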
  • In process P205, the image processing unit PR reads from the memory MR, for each of the first imaging device CA1 and the second imaging device CA2, the signal corresponding to the light beam that passed through the filter element (PF) and the signal corresponding to the light beam that passed through the filter element (SF); it forms a stereo image from values corresponding to the same type of filter element, performs distance measurement, and obtains polarization distance measurement data. If the subject includes a reflector, the polarization distance measurement data also includes the distance measurement data of the reflector.
  • In process P206, the image processing unit PR reads from the memory MR, for each of the first imaging device CA1 and the second imaging device CA2, the signal corresponding to the near-infrared light beam that passed through the filter element (IR); it forms a near-infrared stereo image, performs distance measurement, and obtains near-infrared distance measurement data.
  • In process P207, the image processing unit PR reads from the memory MR the signal corresponding to the light beam that passed through the filter element (PF) and the signal corresponding to the light beam that passed through the filter element (SF), obtains the difference between the two signal values, and determines from these values whether the subject contains a reflector. That is, if the difference between the two signal values exceeds a threshold value, reflected light has entered, so the image processing unit PR recognizes the region exceeding the threshold as a reflector.
  • The signal used for the determination of the reflector may be only the signal of the first imaging device CA1, only the signal of the second imaging device CA2, or the signals of both the first imaging device CA1 and the second imaging device CA2.
  • In process P208, the image processing unit PR obtains the boundary portion of the reflector in the subject and takes the distance measurement value of the boundary from the polarization distance measurement data obtained in process P205 as the distance measurement value of the reflector. In process P209, it creates a distance map of the reflector and obtains three-dimensional information of the subject.
  • the image processing unit PR removes the distance measurement data corresponding to the reflector determined in process P207 from the visible light distance measurement data obtained in process P203, and in process P210 creates a distance map for visible light and obtains three-dimensional information of the subject.
  • the image processing unit PR receives from the memory MR, for each of the first imaging device CA1 and the second imaging device CA2, a signal corresponding to the light beam that has passed through the transparent-glass filter element (NO), a signal corresponding to the light beam that has passed through the filter element (ND1) having the first attenuation factor, and a signal corresponding to the light beam that has passed through the filter element (ND2) having the second attenuation factor. The image with the highest contrast is selected, and the ND ranging data corresponding to that image is selected from the ND ranging data obtained in process P204.
  • the image processing unit PR removes the distance measurement data corresponding to the reflector determined in process P207 from the selected ND distance measurement data, creates a distance map for backlit conditions, and obtains three-dimensional information of the subject.
  • alternatively, any two of the filter elements NO, ND1, and ND2 may be selected, and the same operation as described above may be performed on the obtained signals. Further, any two or three of these filter elements may be used to create an image with a wide dynamic range, form a stereo image, and perform distance measurement.
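One plausible way to realize the "highest contrast" selection among the NO/ND1/ND2 images is sketched below; using the standard deviation of pixel values as the contrast measure is an assumption made for illustration, not the patent's stated method.

```python
import numpy as np

def pick_highest_contrast(images):
    """Return the index of the image whose pixel values spread the most,
    using the standard deviation as a simple contrast proxy. A saturated
    (backlit) or nearly black image has almost no spread."""
    return int(np.argmax([np.std(np.asarray(img, dtype=np.float64))
                          for img in images]))

# NO image saturated by backlight, ND2 too dark, ND1 well exposed:
img_no  = np.full((4, 4), 255.0)
img_nd1 = np.array([[10, 200], [90, 160]], dtype=np.float64)
img_nd2 = np.full((4, 4), 8.0)
best = pick_highest_contrast([img_no, img_nd1, img_nd2])
```

The index returned would then select the matching ND ranging data from process P204.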
  • the image processing unit PR removes the distance measurement data corresponding to the reflector determined in process P207 from the near-infrared distance measurement data obtained in process P206, and in process P213 creates a near-infrared distance map and obtains three-dimensional information of the subject.
  • the image processing unit PR receives from the memory MR at least one of: a signal corresponding to the red light beam that has passed through the filter element (R), a signal corresponding to the green light beam that has passed through the filter element (G), a signal corresponding to the blue light beam that has passed through the filter element (B), and a signal corresponding to the light beam that has passed through the near-infrared filter element (IR). The subject is identified by computing on these signals.
  • the image processing unit PR receives from the memory MR a signal corresponding to the red light beam that has passed through the filter element (R), a signal corresponding to the green light beam that has passed through the filter element (G), a signal corresponding to the blue light beam that has passed through the filter element (B), and a signal corresponding to the near-infrared light beam that has passed through the filter element (IR); visible light and near-infrared stereo images are formed, and distance measurement is performed.
  • the image processing unit PR removes the distance measurement data corresponding to the reflector determined in process P207 from the obtained distance measurement data, creates a vegetation distance map, and obtains the three-dimensional information of the subject.
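The publication only says that vegetation is extracted by computing on the visible and near-infrared signals; a common index of this kind, offered purely as an illustration and not as the patent's own formula, is the normalized difference vegetation index (NDVI):

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """Normalized difference vegetation index: (NIR - R) / (NIR + R).
    Healthy vegetation reflects near-infrared strongly, giving values
    close to +1; bare surfaces stay near 0."""
    nir = np.asarray(nir, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    return (nir - red) / (nir + red + eps)

vegetation = ndvi([200.0], [50.0])   # strong NIR reflectance -> ~0.6
asphalt    = ndvi([60.0],  [55.0])   # similar NIR and red -> near 0
```

Thresholding such an index per pixel would yield a vegetation mask to pair with the vegetation distance map.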
  • the image processing unit PR superimposes the distance maps obtained in processes P210, P212, P209, P213, and P215. At this time, the distance measurement information of the reflector that is missing in each distance map is replaced with the distance measurement information of the boundary portion of the reflector, complementing the distance map.
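A minimal sketch of this complementing step, assuming the reflector region is given as a boolean mask and the boundary distance as a single representative value (the publication does not prescribe a data layout):

```python
import numpy as np

def complement_distance_map(dist_map, reflector_mask, boundary_distance):
    """Fill the distance values missing (or bogus) at the reflector with
    the distance measured at the reflector's boundary."""
    out = np.array(dist_map, dtype=np.float64)
    out[reflector_mask] = boundary_distance
    return out

dist = np.array([[4.0, 4.1],
                 [4.2, 9.9]])          # 9.9: "virtual image" depth behind glass
mask = np.array([[False, False],
                 [False, True]])       # reflector detected here
fixed = complement_distance_map(dist, mask, boundary_distance=4.2)
```

This avoids leaving holes in the superimposed map where the reflector suppressed valid ranging data.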
  • when there is an obstacle ahead in the moving direction, the subject information is output to the drive control unit DR so that it can be avoided. Alternatively, the image processing unit PR may output the subject information to the drive control unit DR so as to avoid the object closest to the unmanned air vehicle 100, based on the distance maps obtained in processes P210, P212, P209, P213, and P215.
  • FIG. 11 is a block diagram showing another example of the processing flow in the image processing unit PR.
  • when the image processing unit PR is set, by a predetermined program or an operator's operation, to determine the exposure time based on the signal corresponding to the red light beam that has passed through the filter element (R), the signal corresponding to the green light beam that has passed through the filter element (G), and the signal corresponding to the blue light beam that has passed through the filter element (B), these signals are input independently from each of the first imaging device CA1 and the second imaging device CA2 in processes P201A and P202A. A brightness value is calculated from these values to determine the exposure time, which is fed back to individually control the exposure time of each solid-state image sensor SR. Note that processing P203 to P216 is the same as the processing in FIG.
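The brightness-based exposure feedback could look like the following sketch; the Rec. 601 luma weights and the target value of 118 are conventional choices assumed here, not taken from the publication.

```python
def mean_luma(r_mean, g_mean, b_mean):
    """Brightness value from the R, G, B channel means (Rec. 601 weights)."""
    return 0.299 * r_mean + 0.587 * g_mean + 0.114 * b_mean

def next_exposure(current_s, luma, target=118.0, lo=1e-5, hi=1e-1):
    """Scale the exposure time so the mean luminance approaches the target,
    clamped to the sensor's supported range. Each imaging device would run
    this feedback loop independently, as the flow above describes."""
    if luma <= 0.0:
        return hi
    return min(max(current_s * (target / luma), lo), hi)

# A scene at half the target brightness doubles the exposure time:
new_t = next_exposure(0.01, mean_luma(59.0, 59.0, 59.0))
```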
  • FIG. 12 is a block diagram showing another example of the processing flow in the image processing unit PR.
  • when the image processing unit PR is set, by a predetermined program or an operator's operation, to determine the exposure time based on a signal corresponding to the near-infrared light beam that has passed through the filter element (IR), that signal is input independently from each of the first imaging device CA1 and the second imaging device CA2 in processes P201B and P202B, the exposure time is determined from this value, and it is fed back to individually control the exposure time of each solid-state image sensor SR. Note that processing P203 to P216 is the same as the processing in FIG.
  • FIG. 13 is a block diagram illustrating another example of the processing flow in the image processing unit PR.
  • when the image processing unit PR is set, by a predetermined program or an operator's operation, to determine the exposure time based on at least one of a signal corresponding to the light beam that has passed through the filter element (ND1) having the first attenuation factor and a signal corresponding to the light beam that has passed through the filter element (ND2) having the second attenuation factor, the exposure time is determined independently for each of the first imaging device CA1 and the second imaging device CA2 in processes P201C and P202C. Note that processing P203 to P216 is the same as the processing in FIG.
  • FIG. 14 is a block diagram showing another example of the processing flow in the image processing unit PR.
  • when the image processing unit PR is set, by a predetermined program or an operator's operation, to determine the exposure time based on a signal corresponding to the light beam that has passed through the filter element (PF) or a signal corresponding to the light beam that has passed through the filter element (SF), the corresponding signal is input independently from each of the first imaging device CA1 and the second imaging device CA2 in processes P201D and P202D, and the exposure time is determined from it. Note that processing P203 to P216 is the same as the processing in FIG.
  • according to the present embodiment described above, providing both the first array optical system and the second array optical system makes the device smaller and lighter than a combination of separate units, and increases its functionality. In addition, a longer base line length can be secured than in an imaging device having a single array optical system, so distance measurement accuracy at long range can be ensured.
  • the single-eye optical system IL having the transparent-glass filter element (NO) can collect light in all wavelength regions to which the solid-state imaging element SR is sensitive, and the filter elements (ND1, ND2) having different attenuation factors allow the amount of light from backlit or high-brightness subjects to be suppressed.
  • the red, green, and blue filter elements make it possible to measure an object in the visible light region and to obtain a visible image to be displayed to the operator of the unmanned air vehicle 100. The near-infrared filter element (IR) makes it possible to extract vegetation such as forests and to discriminate overhead lines and the like, and the subject can be identified from the intensity difference or ratio between the visible and near-infrared signals.
  • a reflector can be detected by detecting a light-amount difference using two types of polarizing filter elements (SF, PF) having different transmission axes.
  • in the above description, the visible light stereo image is formed using the signals transmitted through the red, green, and blue filter elements (R, G, B), but the invention is not limited to this; a single-color or two-color stereo image may be formed using at least one of the color filter elements R, G, and B.
  • the image processing unit PR may first detect whether or not a reflector exists in the subject, and when the reflector is detected, distance measurement of the corresponding area may not be performed.
  • FIG. 15 is a perspective view of an unmanned air vehicle as a moving body according to another embodiment.
  • the unmanned aerial vehicle 100' of the present embodiment can capture images of a structure such as a bridge while flying autonomously.
  • compared with the embodiment shown in FIG. 1, a shaft 106 that can rotate with respect to the main body 101 is provided, and a frame 107 is attached to the upper end of the shaft 106. The frame 107 mounts the high-pixel camera HCA, with the third imaging device CA3 and the fourth imaging device CA4 arranged on both sides of it so that their optical axes are parallel. The third imaging device CA3 and the fourth imaging device CA4 have the same configuration as the first imaging device CA1 and the second imaging device CA2, but are preferably directed in a different direction (for example, rotated by 90 degrees).
  • FIG. 16 is a block diagram showing the configuration of the unmanned air vehicle 100 '.
  • the image signals output from the third imaging device CA3 and the fourth imaging device CA4 are input to the sub-image processing unit PR' arranged in the main body 101 of the unmanned air vehicle 100'; in the inspection object distance information acquisition unit PR3 of the sub-image processing unit PR', the distance to the structure as the inspection object can be acquired. In addition, a self-location information acquisition unit PR4, which can acquire self-location information of the unmanned air vehicle 100' (GPS signals may be used) by pattern-matching the images captured by the third imaging device CA3 and the fourth imaging device CA4, is provided in the sub-image processing unit PR'. These outputs are output to the drive control unit DR.
  • the unmanned air vehicle 100 ′ flies so as to avoid an obstacle ahead in the moving direction based on signals from the first imaging device CA1 and the second imaging device CA2, as in the above-described embodiment.
  • the inspection object distance information acquisition unit PR3 can acquire the distance to the structure as the inspection object, the drive control unit DR can perform feedback control so that this distance remains constant, and flight along a predetermined route is possible using the signal from the self-location information acquisition unit PR4. Further, during the flight, the structure is photographed by the high-pixel camera HCA, and the image is transmitted to the monitor MT so that the operator can observe and record it at the destination. Since the other configuration is the same as that of the above-described embodiment, its description is omitted.
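The constant-distance feedback mentioned here could be as simple as a proportional controller; the setpoint, gain, and speed limit below are illustrative assumptions, not values from the publication.

```python
def distance_hold_command(measured_m, setpoint_m=2.0, gain=0.8,
                          max_speed=1.0):
    """Proportional control toward the inspection object: a positive
    output commands the vehicle to approach, a negative output to back
    away, clamped to the vehicle's speed limit."""
    error = measured_m - setpoint_m   # too far -> positive -> approach
    cmd = gain * error
    return min(max(cmd, -max_speed), max_speed)

cmd = distance_hold_command(2.5)   # 0.5 m too far -> gentle approach
```

In practice the command would feed the drive control unit DR each time PR3 updates the measured distance.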
  • when the unmanned air vehicle 100' is brought close to an infrastructure structure such as a bridge for inspection and the inspection image is obtained by imaging the bridge with the high-pixel camera HCA, it is necessary to keep a certain distance so as not to collide with the bridge that is the inspection object. Further, there are cases where the moving direction of the unmanned air vehicle 100' and the direction of the inspection object differ, for example by being perpendicular to each other.
  • therefore, the first imaging device CA1 and the second imaging device CA2 are directed forward in the traveling direction of the unmanned air vehicle 100', and another set, the third imaging device CA3 and the fourth imaging device CA4, is directed toward the inspection object. In this way, an unmanned air vehicle 100' capable of autonomous control can be realized, appropriately maintaining the distance from the inspection object while detecting obstacles in the traveling direction.
  • when the exposure time (exposure amount) is determined by the light flux that has passed through the transparent-glass filter element, a relatively low-brightness subject can be captured at a high shutter speed, and blurring of moving objects and subjects can be suppressed.
  • in addition, an image formed with the light beam that has passed through the transparent-glass filter element can be free of crushed blacks, while an image formed with a light beam transmitted through another optical filter element is slightly underexposed, so an image free of blown-out highlights can be obtained. As a result, ranging accuracy is improved for both low-luminance and high-luminance subjects.
  • when the exposure time (exposure amount) is determined by at least one of the light beams transmitted through the red, green, and blue filter elements, ranging performance in the visible light region improves, and clear visible image information can be transmitted to the operator of the unmanned air vehicle.
  • the exposure time (exposure amount) is determined by the light beam transmitted through the near-infrared filter element, for example, the light reflected by vegetation or the like can be accurately imaged, and the extraction ability of vegetation or the like is improved.
  • the exposure time (exposure amount) is determined by the light beam that has passed through the polarizing filter element, it is possible to accurately capture the object light from the reflector and the like, and the extraction ability of the reflector and the like is improved.
  • when the exposure time (exposure amount) is determined by the light beam that has passed through a filter element that attenuates the incident light, the amount of light from a backlit or high-intensity subject can be suppressed, and as a result blown-out highlights in the image can be avoided.
  • when an image reflected in a reflector such as a water surface, transparent acrylic, or glass is captured, the distance of a virtual object that appears to lie farther away than the reflector is detected, and the position of the reflector itself cannot be detected.
  • however, if the reflector is photographed through two or more types of polarizing filter elements having different transmission axes, the reflected light is polarized, so the brightness of the images obtained through the polarizing filter elements with different transmission axes differs at the portion of the reflector. If such a difference in image luminance between polarizing filter elements is detected, the region can be identified as a reflector.
  • the combination of polarizing filter elements having different transmission axes may be in the same array optical system or in different array optical systems.
  • since the reflector can be detected in this way, accurate three-dimensional information of the subject can be obtained by not adopting distance information based on the light flux from the reflector, or by not performing distance measurement processing on the light flux from the reflector at all.
  • however, the distance to the reflector would then remain unknown, so for the non-reflector region at the boundary of the reflector, distance measurement processing is performed based on signals from an imaging optical system having any one of the transparent-glass filter element, the filter elements that transmit red, green, or blue light, the filter element that transmits near-infrared light, the filter elements having different attenuation factors, and the polarizing filter elements, or based on a combination of signals from imaging optical systems having different types of filter elements. The distance information obtained there is used as the distance of the reflector, thereby avoiding a situation in which no distance information exists.
  • if distance measurement is limited in this way, the calculation load becomes smaller than when distance measurement is performed for all pixel areas, and the processing can be completed in a short time. On the other hand, if distance measurement is performed for all regions at once, the processing configuration of the image processing algorithm can be simplified. Further, by first measuring all subjects at once and then identifying the subject or obtaining subject information based on the characteristics of the light flux that has passed through each filter element, an optimum distance image can be obtained for each subject.
  • by displaying a color visible image obtained through the filter elements that transmit red, green, and blue light, effective information necessary for the operation of the unmanned air vehicle operator can be transmitted. Furthermore, images obtained through the transparent-glass filter element or a filter element that attenuates incident light can be superimposed and displayed, so that even in backlight an image with a wide dynamic range, free of blown-out highlights and crushed blacks, can be provided. In addition, if the image obtained through the polarizing filter element is displayed, the position of a reflector requiring caution in obstacle detection can be indicated. These images may be displayed in combination as appropriate.
  • the present invention is not limited to the embodiments described in this specification; it is obvious to those skilled in the art that other embodiments and modifications based on the embodiments and technical ideas described herein are also included. For example, the filters of each of the second imaging optical systems of the second imaging device are of the same type as the filters of the first array optical system of the first imaging device, but this is not limited to the case where the filters of the second imaging optical systems are completely identical to those of the first imaging optical systems; in addition to having the same filters as the first imaging optical systems, the second imaging optical systems of the second imaging device may have an extra filter (a filter that is not in any first imaging optical system but only in a second imaging optical system).
  • the stereo imaging device of the present invention can be mounted not only on an unmanned air vehicle but also on a vehicle.
  • for example, the first imaging device CA1 and the second imaging device CA2 can be arranged on both sides of the room mirror RM of the vehicle VH, with imaging performed with the optical axes directed toward the front of the vehicle; obstacles can thus be detected and driving assistance realized.
  • the first imaging device CA1 and the second imaging device CA2 may be provided in a headlamp or a bumper.

Abstract

The present invention provides a stereo imaging device that is compact, light, and capable of recognizing a variety of obstacles, and a moving body having the same. Configuring a stereo imaging device from a first imaging device having a first array optical system provided with various optical filters and a second imaging device having a second array optical system results in a longer baseline than that of an imaging device having a single array optical system and thus makes it possible to detect a variety of obstacles from images having passed through the various optical filters while maintaining distance measurement accuracy at a long distance.

Description

Stereo imaging device and moving body
The present invention relates to a stereo imaging device including a plurality of compound-eye imaging devices, and to a moving body equipped with such a stereo imaging device.
In recent years, obstacle detection devices that detect an obstacle ahead in the traveling direction have been developed for unmanned moving bodies, vehicles, and the like. For example, the obstacle detection device disclosed in Patent Document 1 detects an obstacle by scanning infrared rays projected in the traveling direction, but the detection range is limited to the infrared scan range and is relatively narrow.
In contrast, the obstacle detection device disclosed in Patent Document 2 detects an obstacle using millimeter-wave radar, but it is difficult to identify the shape and size of the obstacle. Meanwhile, the obstacle detection device disclosed in Patent Document 3 uses a stereo camera: the positional deviation (parallax) of the same object across a plurality of images captured at the same time is calculated by template matching, and the position of the object in real space is then calculated from the obtained parallax by a well-known conversion formula, which solves the above problems.
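The "well-known conversion formula" referred to here is the pinhole stereo relation Z = f·B/d (depth = focal length × baseline ÷ disparity). A minimal sketch with illustrative numbers:

```python
def distance_from_disparity(focal_px, baseline_m, disparity_px):
    """Pinhole stereo model: depth Z = f * B / d, with the focal length f
    in pixels, the baseline B in metres, and the disparity d in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# f = 1000 px, B = 0.1 m, d = 20 px  ->  Z = 5 m
z = distance_from_disparity(1000.0, 0.1, 20.0)
```

Note that for a fixed focal length, a longer baseline B yields larger disparities and hence better long-range accuracy, which is the advantage of the two-array arrangement described later.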
JP 2014-24431 A; JP 2013-86560 A; JP 2009-146217 A
However, the technique disclosed in Patent Document 3 has the problem that it becomes difficult to identify a subject in the presence of reflectors or under backlit conditions. Mounting all three types of obstacle detection devices described above and using them selectively according to conditions is conceivable, but this increases both cost and weight. In particular, when an obstacle detection device is mounted on an unmanned air vehicle or the like, an obstacle detection device that is as small and light as possible is desired in order to secure the cruising distance per flight.
The present invention has been made in view of these problems, and an object of the present invention is to provide a stereo imaging device that is small and light yet capable of recognizing a variety of obstacles, and a moving body having the same.
In order to realize at least one of the objects described above, a stereo imaging device reflecting one aspect of the present invention comprises:
 a first imaging device having a first array optical system including a plurality of first imaging optical systems arranged with differing optical axes, and a first solid-state imaging element that photoelectrically converts the image formed by each of the plurality of first imaging optical systems and outputs an image signal; and
 a second imaging device having a second array optical system including a plurality of second imaging optical systems arranged with differing optical axes, and a second solid-state imaging element that photoelectrically converts the image formed by each of the plurality of second imaging optical systems and outputs an image signal,
 wherein the optical axes of the first imaging optical systems and the second imaging optical systems are parallel, and the first imaging device and the second imaging device are spaced apart in a direction orthogonal to the optical axes;
 each of the first imaging optical systems includes any one of a filter without optical correction, a red filter, a green filter, a blue filter, an ND filter, a near-infrared filter, and a polarizing filter, and the first array optical system has at least two types of filters;
 the filter of each of the second imaging optical systems is of the same type as a filter of the first array optical system; and
 three-dimensional information of the subject is obtained based on the image signal of the image formed by a first imaging optical system and the image signal of the image formed by a second imaging optical system having the same type of filter.
According to the present invention, it is possible to provide a stereo imaging device that is small and light yet capable of recognizing a variety of obstacles, and a moving body having the same.
FIG. 1 is a perspective view of an unmanned air vehicle as a moving body according to the present embodiment. FIG. 2 shows the schematic configuration of the first imaging device CA1. FIG. 3 shows an example of the arrangement of filter elements in the color filter used in the first imaging device CA1. FIG. 4 is a block diagram showing the configuration of the unmanned air vehicle 100. FIG. 5 is a block diagram showing a processing flow in the image processing unit PR. FIGS. 6 to 9 are block diagrams showing other processing flows in the image processing unit PR. FIG. 10 is a block diagram showing the processing flow in the image processing unit PR according to a second processing aspect. FIGS. 11 to 14 are block diagrams showing further processing flows in the image processing unit PR. FIG. 15 is a perspective view of an unmanned air vehicle as a moving body according to another embodiment. FIG. 16 is a block diagram showing the configuration of the unmanned air vehicle 100'. FIG. 17 shows an example of a vehicle equipped with the stereo imaging device.
Embodiments of the present invention will be described below. FIG. 1 is a perspective view of an unmanned air vehicle as a moving body according to the present embodiment. In the unmanned air vehicle 100 shown in FIG. 1, a main body 101 supported by legs 102 has four arms 103A to 103D implanted horizontally at equal intervals in the circumferential direction. Motors 104A to 104D, which are propulsive force generators, are attached to the tips of the arms 103A to 103D, and propellers 105A to 105D are rotatably attached to the vertically oriented rotation shafts of the motors 104A to 104D.
On the upper surface of the main body 101, the first imaging device CA1 and the second imaging device CA2 are installed facing the traveling direction (arrow direction), with their optical axes parallel to each other and spaced apart in the direction orthogonal to the optical axes.
FIG. 2 shows the schematic configuration of the first imaging device CA1. In FIG. 2, the first imaging device CA1 has: single-eye optical systems IL arranged in three rows and three columns with their optical axes parallel to each other; a solid-state imaging element (first solid-state imaging element) SR having nine photoelectric conversion regions Ia (which may be integrated) that photoelectrically convert the subject image formed by each of the single-eye optical systems IL; and an optical filter CF arranged between the single-eye optical systems IL and the photoelectric conversion regions Ia. The optical filter CF is divided into nine filter elements CFa corresponding to the individual single-eye optical systems IL. The optical filter CF may instead be arranged on the subject side of the single-eye optical systems IL. The plurality of types of optical filters CF and the plurality of single-eye optical systems IL constitute the first array optical system. Although not shown, in the second imaging device CA2, which has the same configuration, the same types of optical filters CF and a plurality of single-eye optical systems IL constitute the second array optical system, and the solid-state imaging element SR that photoelectrically converts the subject image formed thereby constitutes the second solid-state imaging element.
FIG. 3 shows an example of the arrangement of filter elements in the optical filter used in the first imaging device CA1. The optical filter CF has: a filter element CFa of transparent glass (hereinafter NO); a filter element CFa that transmits red light (hereinafter R); a filter element CFa that transmits green light (hereinafter G); a filter element CFa that transmits blue light (hereinafter B); a filter element CFa that transmits near-infrared light (hereinafter IR); a filter element (ND filter) CFa having a first attenuation factor (hereinafter ND1); a filter element (ND filter) CFa having a second attenuation factor higher than the first (hereinafter ND2); a filter element (polarizing filter) CFa whose transmission axis is 0 degrees (hereinafter PF); and a filter element (polarizing filter) CFa whose transmission axis is 90 degrees (hereinafter SF). One type of filter element CFa and the corresponding single-eye optical system IL constitute an imaging optical system (first imaging optical system or second imaging optical system). It is sufficient that at least two types of filter elements are provided according to the application.
The second imaging device CA2 has the same configuration as the first imaging device CA1 shown in FIGS. 2 and 3. However, the arrangement of the filter elements CFa may be identical to that of the first imaging device, may be a mirror-image arrangement, or may be a random arrangement. A "filter without optical correction" refers to a filter that applies substantially no optical correction to the transmitted light; besides the plain glass shown in FIG. 3, it may be a plain resin material, and the term also covers the case where no filter element is provided at all. Such a filter can capture wavelength information over a wider band than any of the R, G, and B filters.
FIG. 4 is a block diagram showing the configuration of the unmanned air vehicle 100. The image signals output from the first imaging device CA1 and the second imaging device CA2 are input to an image processing unit PR that is mounted on the main body 101 of the unmanned air vehicle 100 and connected to the first imaging device CA1 and the second imaging device CA2. A subject distance information acquisition unit PR1 in the image processing unit PR obtains three-dimensional information of the imaged region, thereby determining subject information concerning obstacles. In addition, based on the input image signals, a monitoring visible image information acquisition unit PR2 in the image processing unit PR forms a visible image and transmits its data to an external monitor (display device) MT, so that an operator can observe, through the monitor MT, the visible image captured from the unmanned air vehicle 100. In particular, a color visible-light image can be formed using the light fluxes transmitted through the filter elements (R, G, B) of the first imaging device CA1 and/or the second imaging device CA2; displaying this image on the monitor MT supports the operator's control of the vehicle. A monitor image may also be displayed using a light flux that has passed through another filter element.
Further, the subject information output from the image processing unit PR is output to a drive control unit (movement control unit) DR disposed in the main body 101, and based on this information the motors 104A to 104D are controlled so as not to collide with obstacles, thereby realizing autonomous flight.
Next, processing including autonomous flight control in the image processing unit PR will be described with reference to the drawings. FIG. 5 is a block diagram showing a processing flow in the image processing unit PR. Here, it is assumed that the unmanned air vehicle 100 detects an object ahead in its direction of movement as an obstacle and flies so as to avoid it. During flight, the first imaging device CA1 and the second imaging device CA2 each receive the light fluxes transmitted through the respective filter elements CFa and the respective single-eye optical systems IL at the photoelectric conversion regions Ia in the solid-state imaging element SR and perform photoelectric conversion. After photoelectric conversion, each signal is output from the solid-state imaging element SR, stored in a memory MR (not shown) provided in the image processing unit PR, and read out as necessary.
(First processing mode)
In the first processing mode, targets for distance measurement are extracted in advance, and distance measurement is performed on those targets to create distance maps. When a reflector is detected, the distance information of the reflector itself is not detected. First, when the exposure time (exposure amount) is to be determined from the light flux that has passed through the plain-glass filter element (NO) according to a predetermined program or an operator's operation, the image processing unit PR, in processes P101 and P102, inputs from the first imaging device CA1 and the second imaging device CA2, independently of each other, the signals corresponding to the light flux that has passed through the plain-glass filter element (NO), determines the exposure time from those values, and feeds the result back to control the exposure times of the solid-state imaging elements SR individually. After imaging with the determined exposure times, the signals output from the photoelectric conversion regions Ia of the first imaging device CA1 and the second imaging device CA2 are stored in the memory MR.
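The publication does not specify the control law used in processes P101/P102. As a minimal sketch only, the exposure-time feedback can be pictured as a proportional adjustment of the current exposure time toward a target mean level of the NO-channel signal; the function name, the target level, and the clamping bounds below are all illustrative assumptions:

```python
def choose_exposure_time(mean_signal, target_level=0.5, t_current=1 / 500,
                         t_min=1 / 8000, t_max=1 / 30):
    """Scale the exposure time so the mean NO-channel signal (normalized
    to 0..1) approaches the target level. All parameters are assumed
    values for illustration, not taken from the publication."""
    if mean_signal <= 0.0:
        return t_max  # no measurable light: use the longest allowed exposure
    t_next = t_current * (target_level / mean_signal)
    # clamp to the sensor's usable shutter range
    return min(max(t_next, t_min), t_max)
```

Each imaging device would run this loop independently, as the text requires the two exposure times to be controlled individually.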
Next, in process P103, the image processing unit PR inputs from the memory MR, for each of the first imaging device CA1 and the second imaging device CA2, the signal corresponding to the light flux that has passed through the plain-glass filter element (NO), the signal corresponding to the light flux that has passed through the filter element (ND1) having the first attenuation factor, and the signal corresponding to the light flux that has passed through the filter element (ND2) having the second attenuation factor; it then selects the image with the greatest contrast and identifies the subject from the signal values corresponding to that image. Under imaging conditions such as backlighting, a high-luminance subject such as the sun may be captured, and the pixel values of the photoelectric conversion region Ia receiving the light flux through the plain-glass filter element (NO) may saturate. Even in such an imaging state, a subject image with contrast and without blown-out highlights can be obtained from the signal corresponding to the light flux that has passed through the filter element (ND1 or ND2) having the first or second attenuation factor, so that an image with luminance optimal for obstacle detection is obtained. This makes it possible to prevent erroneous detection. Note that any two of the filter elements (NO, ND1, ND2) may be selected and the same processing performed on the resulting signals. Alternatively, any two or all three of the filter elements (NO, ND1, ND2) may be selected to create an image with a wide dynamic range.
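The selection in process P103 can be sketched as follows. The publication does not name a contrast measure, so RMS contrast (the standard deviation of the pixel values) is assumed here for illustration; a fully saturated channel then scores zero and is never chosen:

```python
def rms_contrast(pixels):
    """RMS contrast (population standard deviation) of a flat pixel list."""
    mean = sum(pixels) / len(pixels)
    return (sum((p - mean) ** 2 for p in pixels) / len(pixels)) ** 0.5

def select_best_exposure(images):
    """images: dict mapping a filter name ('NO', 'ND1', 'ND2') to a flat
    list of pixel values; returns the name of the image with the
    greatest contrast, as in process P103."""
    return max(images, key=lambda name: rms_contrast(images[name]))
```

A saturated NO channel (all pixels at full scale) has zero contrast, so one of the ND channels is selected automatically under backlighting.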
In parallel with the above, in process P104 the image processing unit PR inputs from the memory MR the signal corresponding to the light flux that has passed through the filter element (PF) and the signal corresponding to the light flux that has passed through the filter element (SF), obtains the difference between the two signal values, and determines from these values whether the subject contains a reflector. That is, if the difference between the two signal values exceeds a threshold, it is known that reflected light has entered, so the image processing unit PR recognizes the region exceeding the threshold as a reflector. The signals used for reflector determination may be only those of the first imaging device CA1, only those of the second imaging device CA2, or those of both the first imaging device CA1 and the second imaging device CA2.
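The per-pixel threshold test of process P104 can be sketched directly; the only assumption is that the PF and SF images are registered to the same pixel grid:

```python
def detect_reflector(pf, sf, threshold):
    """pf, sf: equally sized 2-D lists of signal values from the 0-degree
    (PF) and 90-degree (SF) polarizing-filter channels. Returns a boolean
    mask of the same shape marking pixels where |PF - SF| exceeds the
    threshold, i.e. pixels receiving strongly polarized (reflected) light."""
    return [[abs(p - s) > threshold for p, s in zip(prow, srow)]
            for prow, srow in zip(pf, sf)]
```

Specularly reflected light is strongly polarized, so a large PF/SF difference flags water surfaces, glass, and similar reflectors.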
In parallel with the above, in process P105 the image processing unit PR inputs from the memory MR the signal corresponding to the light flux that has passed through the filter element (IR), which transmits near-infrared light, and identifies and extracts subjects whose signal falls below a threshold. For example, an overhead wire strung above a forest has a color tone close to its background and can be difficult to distinguish in visible light; however, since plants generally reflect near-infrared light, only the signal of the overhead-wire portion becomes low. Accordingly, by examining the near-infrared light, the overhead wire can be distinguished from the forest.
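Process P105 reduces to a per-pixel low-signal test on the IR channel; the threshold value would in practice depend on the exposure and is assumed given here:

```python
def extract_low_ir_regions(ir_image, threshold):
    """ir_image: 2-D list of near-infrared signal values. Marks pixels
    whose NIR signal falls below the threshold. Vegetation reflects NIR
    strongly, so an overhead wire in front of a forest appears as the
    low-IR pixels."""
    return [[value < threshold for value in row] for row in ir_image]
```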
In parallel with the above, in process P106 the image processing unit PR inputs from the memory MR at least one of the signal corresponding to the red light flux that has passed through the filter element (R), the signal corresponding to the green light flux that has passed through the filter element (G), and the signal corresponding to the blue light flux that has passed through the filter element (B), together with the signal corresponding to the light flux that has passed through the filter element (IR), which transmits near-infrared light, and identifies the subject by computation on these signals.
For example, the NDVI (Normalized Difference Vegetation Index) formula can be used for object recognition. With Rs denoting the luminance value of the signal corresponding to the red light flux that has passed through the filter element (R), and IRs denoting the luminance value of the signal corresponding to the near-infrared light flux that has passed through the filter element (IR), if the NDVI value expressed by formula (1) falls within a predetermined range, the imaged subject can be determined to be vegetation. In some cases the subject can be determined to be vegetation from the signal corresponding to the near-infrared light flux through the filter element (IR) alone, but using formula (1) enables object recognition with higher accuracy. Signals corresponding to the green light flux through the filter element (G) and the blue light flux through the filter element (B) may also be used as appropriate.
(NDVI) = (IRs − Rs) / (IRs + Rs)   (1)
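Formula (1) can be computed per pixel as below. The acceptance range for "vegetation" is an illustrative assumption; the publication only states that the NDVI value must lie within a predetermined range:

```python
def ndvi(ir_s, r_s):
    """Formula (1): NDVI = (IRs - Rs) / (IRs + Rs).
    Returns 0.0 when both channels are zero to avoid division by zero."""
    total = ir_s + r_s
    return (ir_s - r_s) / total if total else 0.0

def is_vegetation(ir_s, r_s, lo=0.2, hi=1.0):
    """Classify a pixel as vegetation when its NDVI lies in [lo, hi].
    The range (0.2, 1.0) is an assumed example, not from the publication."""
    return lo <= ndvi(ir_s, r_s) <= hi
```

Healthy vegetation has high NIR and low red reflectance, giving NDVI near +1; bare structures such as wires or buildings give values near or below zero.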
Further, for the regions other than the reflector determined in process P104, the image processing unit PR, in process P107, inputs from the memory MR the signal corresponding to the red light flux that has passed through the filter element (R), the signal corresponding to the green light flux that has passed through the filter element (G), and the signal corresponding to the blue light flux that has passed through the filter element (B), generates two sets of visible-light image data, forms a stereo image for each color, and performs distance measurement based on the parallax information. The distance-measurement technique using stereo images is disclosed in JP 2009-186228 A, and its details are therefore omitted here. Thereafter, in process P108, the image processing unit PR creates a visible-light distance map for each pixel and obtains three-dimensional information of the subject. A "distance map" is an image of the subject provided with distance information.
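The details of the stereo ranging are deferred to JP 2009-186228 A, but the underlying conversion from parallax to distance is the standard pinhole-stereo relation, sketched here with assumed parameter names (f in pixels, the CA1-CA2 baseline B in meters, disparity d in pixels):

```python
def distance_from_disparity(focal_px, baseline_m, disparity_px):
    """Standard pinhole-stereo triangulation Z = f * B / d: the distance
    to a point is the focal length times the baseline between the two
    imaging devices, divided by the measured disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite range")
    return focal_px * baseline_m / disparity_px
```

With f = 700 px and B = 0.1 m, a 7-pixel disparity corresponds to a 10 m range; the per-pixel distance map of process P108 is this relation evaluated at every matched pixel.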
In parallel with this, for the regions other than the reflector determined in process P104, the image processing unit PR, in process P109, inputs from the memory MR the signal corresponding to the light flux that has passed through the plain-glass filter element (NO), the signal corresponding to the light flux that has passed through the filter element (ND1) having the first attenuation factor, and the signal corresponding to the light flux that has passed through the filter element (ND2) having the second attenuation factor, forms a stereo image for the image with the greatest contrast, and performs distance measurement. Thereafter, in process P110, the image processing unit PR creates a backlight distance map for each pixel and obtains three-dimensional information of the subject. Note that any two of the filter elements (NO, ND1, ND2) may be selected and the same processing performed on the resulting signals. Alternatively, an image with a wide dynamic range may be created using the images of any two or all three of the filter elements (NO, ND1, ND2), a stereo image formed, and distance measurement performed.
In parallel with this, in process P111 the image processing unit PR obtains the boundary of the reflector determined in process P104, inputs for this boundary from the memory MR the signal corresponding to the light flux that has passed through the filter element (PF) or the signal corresponding to the light flux that has passed through the filter element (SF), forms a stereo image, and performs distance measurement. Thereafter, in process P112, the image processing unit PR creates a distance map including the reflector, using the distance value obtained for the boundary as the distance value of the reflector, and obtains three-dimensional information of the subject.
In parallel with this, for the regions other than the reflector determined in process P104, the image processing unit PR, in process P113, inputs from the memory MR the signal corresponding to the near-infrared light flux that has passed through the filter element (IR), forms a near-infrared stereo image, and performs distance measurement. Thereafter, in process P114, the image processing unit PR creates a near-infrared distance map for each pixel and obtains three-dimensional information of the subject.
In parallel with this, for the regions other than the reflector determined in process P104, the image processing unit PR, in process P115, inputs from the memory MR the signal corresponding to the red light flux that has passed through the filter element (R), the signal corresponding to the green light flux that has passed through the filter element (G), the signal corresponding to the blue light flux that has passed through the filter element (B), and the signal corresponding to the near-infrared light flux that has passed through the filter element (IR), forms visible-light and near-infrared stereo images, and performs distance measurement. Thereafter, in process P116, the image processing unit PR creates a visible-light and near-infrared distance map for each pixel and obtains three-dimensional information of the subject.
Thereafter, in process P117, the image processing unit PR superimposes the distance maps obtained in processes P108, P110, P112, P114, and P116, and performs selection such as discarding the distance information of the reflector itself, whose accuracy cannot be guaranteed (excluding the case where the distance information of the reflector has been replaced with the distance information of its boundary). Then, for example, when there is an object recognized in two or more distance maps, the image processing unit PR outputs subject information to the drive control unit DR so that the object is avoided. Alternatively, based on the distance maps obtained in processes P108, P110, P112, P114, and P116, the image processing unit PR may output subject information to the drive control unit DR so as to avoid the object closest to the unmanned air vehicle 100, if any.
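The "recognized in two or more distance maps" vote of process P117 can be sketched as below. The representation (a dict from object label to measured distance per map) and the choice of the minimum agreeing distance as the conservative estimate are illustrative assumptions:

```python
def confirmed_obstacles(distance_maps):
    """distance_maps: list of dicts mapping an object label to its
    measured distance in one distance map (reflector entries already
    removed, as in process P117). An object is confirmed when it appears
    in two or more maps; its reported distance is the minimum (most
    conservative) of the agreeing measurements."""
    counts, best = {}, {}
    for dmap in distance_maps:
        for label, dist in dmap.items():
            counts[label] = counts.get(label, 0) + 1
            best[label] = min(best.get(label, dist), dist)
    return {label: best[label] for label, n in counts.items() if n >= 2}
```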
FIG. 6 is a block diagram showing another example of the processing flow in the image processing unit PR. When the exposure time is to be determined from the red light flux that has passed through the filter element (R), the green light flux that has passed through the filter element (G), and the blue light flux that has passed through the filter element (B) according to a predetermined program or an operator's operation, the image processing unit PR, in processes P101A and P102A, inputs from the first imaging device CA1 and the second imaging device CA2, independently of each other, the signals corresponding to the red, green, and blue light fluxes that have passed through the filter elements (R), (G), and (B), computes a luminance value from these values to determine the exposure time, and feeds the result back to control the exposure times of the solid-state imaging elements SR individually. Processes P103 to P117 are the same as those in FIG. 5, and their description is therefore omitted.
FIG. 7 is a block diagram showing another example of the processing flow in the image processing unit PR. When the exposure time is to be determined from the near-infrared light flux that has passed through the filter element (IR) according to a predetermined program or an operator's operation, the image processing unit PR, in processes P101B and P102B, inputs from the first imaging device CA1 and the second imaging device CA2, independently of each other, the signal corresponding to the near-infrared light flux that has passed through the filter element (IR), determines the exposure time from this value, and feeds the result back to control the exposure times of the solid-state imaging elements SR individually. Processes P103 to P117 are the same as those in FIG. 5, and their description is therefore omitted.
FIG. 8 is a block diagram showing another example of the processing flow in the image processing unit PR. When the exposure time is to be determined from at least one of the light flux that has passed through the filter element (ND1) having the first attenuation factor and the light flux that has passed through the filter element (ND2) having the second attenuation factor according to a predetermined program or an operator's operation, the image processing unit PR, in processes P101C and P102C, inputs from the first imaging device CA1 and the second imaging device CA2, independently of each other, the signal corresponding to the light flux that has passed through the filter element (ND1) and/or the signal corresponding to the light flux that has passed through the filter element (ND2), determines the exposure time based on these values, and feeds the result back to control the exposure times of the solid-state imaging elements SR individually. Processes P103 to P117 are the same as those in FIG. 5, and their description is therefore omitted.
FIG. 9 is a block diagram showing another example of the processing flow in the image processing unit PR. When the exposure time is to be determined from the signal corresponding to the light flux that has passed through the filter element (PF) or the signal corresponding to the light flux that has passed through the filter element (SF) according to a predetermined program or an operator's operation, the image processing unit PR, in processes P101D and P102D, inputs from the first imaging device CA1 and the second imaging device CA2, independently of each other, the signal corresponding to the light flux that has passed through the filter element (PF) or the signal corresponding to the light flux that has passed through the filter element (SF), determines the exposure time based on this value, and feeds the result back to control the exposure times of the solid-state imaging elements SR individually. Processes P103 to P117 are the same as those in FIG. 5, and their description is therefore omitted.
For example, when the unmanned air vehicle 100 moves toward a water surface, it is conceivable that most of the imaged region is occupied by a reflector (the water surface). In such a situation, accurate distance measurement becomes extremely difficult and the unmanned air vehicle may come down onto the water surface; therefore, when the reflector occupies most of the subject (for example, 80% or more), it is preferable for the drive control unit DR to bring the unmanned air vehicle 100 to a temporary halt.
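This safety check can be sketched as a coverage test over the reflector mask from process P104; the 80% default follows the example given in the text:

```python
def should_pause(reflector_mask, fraction=0.8):
    """reflector_mask: 2-D boolean mask from the polarization-based
    reflector check. Returns True when reflector pixels cover at least
    the given fraction of the frame (the text suggests 80% or more),
    signalling the drive control unit DR to halt the vehicle."""
    flat = [v for row in reflector_mask for v in row]
    return sum(flat) / len(flat) >= fraction
```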
(Second processing mode)
In the second processing mode, image signals obtained with filter elements of the same type are used to perform distance measurement on all imaged subjects and form distance maps, and the necessary results are then extracted. When a reflector is detected, the distance information of the reflector itself, even if measured, is not used. FIG. 10 is a block diagram showing the processing flow in the image processing unit PR according to the second processing mode. First, when the exposure time is to be determined from the light flux that has passed through the plain-glass filter element (NO) according to a predetermined program or an operator's operation, the image processing unit PR, in processes P201 and P202, inputs from the first imaging device CA1 and the second imaging device CA2, independently of each other, the signals corresponding to the light flux that has passed through the plain-glass filter element (NO), determines the exposure time from those values, and feeds the result back to control the exposure times of the solid-state imaging elements SR individually. After imaging with the determined exposure times, the signals output from the photoelectric conversion regions Ia of the first imaging device CA1 and the second imaging device CA2 are stored in the memory MR.
Next, in process P203, the image processing unit PR inputs from the memory MR, for each of the first imaging device CA1 and the second imaging device CA2, the signal corresponding to the red light flux that has passed through the filter element (R), the signal corresponding to the green light flux that has passed through the filter element (G), and the signal corresponding to the blue light flux that has passed through the filter element (B), forms a stereo image for each color, and performs distance measurement to obtain RGB distance-measurement data. When the subject includes a reflector, the RGB distance-measurement data also include the actual distance-measurement data of the reflector.
In parallel with this, in process P204 the image processing unit PR inputs from the memory MR, for each of the first imaging device CA1 and the second imaging device CA2, the signal corresponding to the light flux that has passed through the plain-glass filter element (NO), the signal corresponding to the light flux that has passed through the filter element (ND1) having the first attenuation factor, and the signal corresponding to the light flux that has passed through the filter element (ND2) having the second attenuation factor, forms a stereo image from the values corresponding to each same type of filter element, and performs distance measurement to obtain ND distance-measurement data. When the subject includes a reflector, the ND distance-measurement data also include the actual distance-measurement data of the reflector. Note that any two of the filter elements (NO, ND1, ND2) may be selected and the same processing performed on the resulting signals. Alternatively, an image with a wide dynamic range may be created using the images of any two or all three of the filter elements (NO, ND1, ND2), a stereo image formed, and distance measurement performed.
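The wide-dynamic-range composition mentioned for the NO/ND1/ND2 channels can be pictured as a per-pixel merge: use the unattenuated NO value while it is linear, and fall back to the ND1 and then ND2 value, rescaled by the attenuation, wherever the brighter channel saturates. The gains of 4x and 16x for the two attenuation factors and the saturation level are illustrative assumptions, not values from the publication:

```python
ND1_GAIN, ND2_GAIN = 4.0, 16.0  # assumed attenuation factors of ND1, ND2

def compose_wide_dynamic_range(no_px, nd1_px, nd2_px, saturation=250):
    """Merge three flat pixel lists (NO, ND1, ND2 channels of the same
    scene) into one radiance estimate with extended dynamic range."""
    out = []
    for no, nd1, nd2 in zip(no_px, nd1_px, nd2_px):
        if no < saturation:
            out.append(float(no))       # NO channel still linear: use it
        elif nd1 < saturation:
            out.append(nd1 * ND1_GAIN)  # recover from the 1st ND channel
        else:
            out.append(nd2 * ND2_GAIN)  # brightest pixels: 2nd ND channel
    return out
```

The merged image keeps contrast in both shadowed and backlit regions, which is what makes it usable for stereo matching in processes P204 and P211.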
In parallel with the above, in process P205 the image processing unit PR inputs from the memory MR, for each of the first imaging device CA1 and the second imaging device CA2, the signal corresponding to the light flux that has passed through the filter element (PF) and the signal corresponding to the light flux that has passed through the filter element (SF), forms a stereo image from the values corresponding to each same type of filter element, and performs distance measurement to obtain polarization distance-measurement data. When the subject includes a reflector, the polarization distance-measurement data also include the actual distance-measurement data of the reflector.
In parallel with the above, in process P206 the image processing unit PR inputs from the memory MR, for each of the first imaging device CA1 and the second imaging device CA2, the signal corresponding to the near-infrared light flux that has passed through the filter element (IR), forms a stereo image, and performs distance measurement to obtain near-infrared distance-measurement data. When the subject includes a reflector, the near-infrared distance-measurement data also include the actual distance-measurement data of the reflector.
Next, in process P207 the image processing unit PR inputs from the memory MR the signal corresponding to the light flux that has passed through the filter element (PF) and the signal corresponding to the light flux that has passed through the filter element (SF), obtains the difference between the two signal values, and determines from these values whether the subject contains a reflector. That is, if the difference between the two signal values exceeds a threshold, it is known that reflected light has entered, so the image processing unit PR recognizes the region exceeding the threshold as a reflector. The signals used for reflector determination may be only those of the first imaging device CA1, only those of the second imaging device CA2, or those of both the first imaging device CA1 and the second imaging device CA2.
Further, in process P208 the image processing unit PR determines the boundary of the reflector within the subject and, using the polarization ranging data obtained in process P205, replaces the ranging value of the reflector with the ranging value at this boundary; then, in process P209, it creates a distance map for the reflector and obtains three-dimensional information of the subject.
Further, the image processing unit PR removes, from the visible-light ranging data obtained in process P203, the ranging data corresponding to the reflector identified in process P207, and in process P210 creates a distance map for visible light and obtains three-dimensional information of the subject.
Further, in process P211 the image processing unit PR reads from the memory MR, for each of the first imaging device CA1 and the second imaging device CA2, the signal corresponding to the light flux that has passed through the plain-glass filter element (NO), the signal corresponding to the light flux that has passed through the filter element (ND1) having a first attenuation factor, and the signal corresponding to the light flux that has passed through the filter element (ND2) having a second attenuation factor; it selects the image with the largest contrast and selects the corresponding ND ranging data from the ND ranging data obtained in process P204. In process P212, the image processing unit PR removes from the selected ND ranging data the ranging data corresponding to the reflector identified in process P207, creates a distance map for backlit conditions, and obtains three-dimensional information of the subject. Alternatively, any two of the filter elements (NO, ND1, ND2) may be selected and the same processing performed on the resulting signals. It is also possible to create an image with a wide dynamic range from the images of any two or three of the filter elements (NO, ND1, ND2), form a stereo image, and perform distance measurement.
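The contrast-based selection among the NO, ND1, and ND2 channels in process P211 could look like the following sketch. The max-minus-min contrast metric and the sample pixel values are illustrative assumptions, not taken from the patent.

```python
# Contrast-based channel selection as in process P211: among the plain-glass
# (NO) and ND channels, keep the image with the largest contrast.
def contrast(img):
    """Simple global contrast metric: max pixel value minus min pixel value."""
    flat = [v for row in img for v in row]
    return max(flat) - min(flat)

def select_best_channel(channels):
    """channels: dict name -> 2-D image; return the name with largest contrast."""
    return max(channels, key=lambda name: contrast(channels[name]))

channels = {
    "NO":  [[255, 255], [255, 250]],  # saturated under backlight, almost flat
    "ND1": [[240, 80], [200, 60]],    # attenuated once, detail preserved
    "ND2": [[60, 20], [50, 15]],      # attenuated strongly, dark
}
best = select_best_channel(channels)
```

The ND ranging data corresponding to the winning channel would then be carried forward to process P212.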
Further, the image processing unit PR removes, from the near-infrared ranging data obtained in process P206, the ranging data corresponding to the reflector identified in process P207, and in process P213 creates a distance map for near-infrared light and obtains three-dimensional information of the subject.
Further, in process P214 the image processing unit PR reads from the memory MR at least one of the signal corresponding to the red light flux that has passed through the filter element (R), the signal corresponding to the green light flux that has passed through the filter element (G), and the signal corresponding to the blue light flux that has passed through the filter element (B), together with the signal corresponding to the light flux that has passed through the filter element (IR) transmitting near-infrared light, and identifies the subject by computation on these signals.
Further, in process P215 the image processing unit PR reads from the memory MR the signal corresponding to the red light flux that has passed through the filter element (R), the signal corresponding to the green light flux that has passed through the filter element (G), the signal corresponding to the blue light flux that has passed through the filter element (B), and the signal corresponding to the near-infrared light flux that has passed through the filter element (IR), forms visible-light and near-infrared stereo images, and performs distance measurement. The image processing unit PR then removes from the resulting ranging data the ranging data corresponding to the reflector identified in process P207, creates a vegetation distance map, and obtains three-dimensional information of the subject.
Thereafter, in process P216 the image processing unit PR superimposes the distance maps obtained in processes P210, P212, P209, P213, and P215. At this point, the reflector ranging information that was missing from each ranging map has been replaced with the ranging information of the reflector's boundary, so the distance maps are complemented. Then, for example, when there is an object recognized in two or more distance maps, subject information is output to the drive control unit DR so as to avoid it. Alternatively, the image processing unit PR may output subject information to the drive control unit DR so as to avoid the object closest to the unmanned air vehicle 100, based on the distance maps obtained in processes P210, P212, P209, P213, and P215.
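The superposition and two-or-more-map criterion of process P216 can be sketched as follows. Representing each distance map as a sparse dict and keeping the nearest measured range are assumptions made for illustration only.

```python
# Superposition of distance maps as in process P216: a cell is treated as an
# obstacle only when at least min_votes maps measured it; the nearest range wins.
def overlay(maps, min_votes=2):
    """maps: list of dicts {(x, y): distance or None} (sparse distance maps)."""
    cells = set().union(*maps)
    out = {}
    for c in cells:
        vals = [m[c] for m in maps if m.get(c) is not None]
        if len(vals) >= min_votes:
            out[c] = min(vals)
    return out

vis = {(0, 0): 5.0, (1, 0): None}  # visible-light map: no range at (1, 0)
ir  = {(0, 0): 5.2, (1, 0): 3.0}   # near-infrared map
nd  = {(0, 0): None, (1, 0): 3.1}  # backlight (ND) map
obstacles = overlay([vis, ir, nd])
```

The resulting cells would be the subject information handed to the drive control unit DR for avoidance.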
FIG. 11 is a block diagram showing another example of the processing flow in the image processing unit PR. When a predetermined program or an operator's operation specifies that the exposure time is to be determined from the signal corresponding to the red light flux that has passed through the filter element (R), the signal corresponding to the green light flux that has passed through the filter element (G), and the signal corresponding to the blue light flux that has passed through the filter element (B), then in processes P201A and P202A the image processing unit PR reads these signals independently from each of the first imaging device CA1 and the second imaging device CA2, computes a luminance value from them to determine the exposure time, and feeds this back so as to control the exposure time of each solid-state imaging element SR individually. Processes P203 to P216 are the same as in FIG. 10, so their description is omitted.
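One feedback step of the RGB-based exposure control in processes P201A and P202A could look like the following sketch. The Rec. 601 luminance weighting and the target level of 118 are assumptions of this example, not values stated in the patent.

```python
# Exposure feedback as in processes P201A/P202A: compute a luminance value
# from the R, G, B channel signals and rescale the exposure time toward a
# target mean luminance.
def next_exposure(r, g, b, current_exposure_s, target=118.0):
    """One feedback step: scale exposure so mean luminance approaches target."""
    n = len(r)
    luma = sum(0.299 * ri + 0.587 * gi + 0.114 * bi   # Rec. 601 weights (assumed)
               for ri, gi, bi in zip(r, g, b)) / n
    return current_exposure_s * target / max(luma, 1.0)

# A frame twice as bright as the target roughly halves the exposure time.
t = next_exposure([236] * 4, [236] * 4, [236] * 4, 0.01)
```

Each solid-state imaging element SR would run this loop with its own signals, matching the "individually controlled" exposure described above.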
FIG. 12 is a block diagram showing another example of the processing flow in the image processing unit PR. When a predetermined program or an operator's operation specifies that the exposure time is to be determined from the signal corresponding to the near-infrared light flux that has passed through the filter element (IR), then in processes P201B and P202B the image processing unit PR reads this signal independently from each of the first imaging device CA1 and the second imaging device CA2, determines the exposure time from its value, and feeds this back so as to control the exposure time of each solid-state imaging element SR individually. Processes P203 to P216 are the same as in FIG. 10, so their description is omitted.
FIG. 13 is a block diagram showing another example of the processing flow in the image processing unit PR. When a predetermined program or an operator's operation specifies that the exposure time is to be determined from at least one of the signal corresponding to the light flux that has passed through the filter element (ND1) having the first attenuation factor and the signal corresponding to the light flux that has passed through the filter element (ND2) having the second attenuation factor, then in processes P201C and P202C the image processing unit PR reads these signals independently from each of the first imaging device CA1 and the second imaging device CA2, determines the exposure time on the basis of their values, and feeds this back so as to control the exposure time of each solid-state imaging element SR individually. Processes P203 to P216 are the same as in FIG. 10, so their description is omitted.
FIG. 14 is a block diagram showing another example of the processing flow in the image processing unit PR. When a predetermined program or an operator's operation specifies that the exposure time is to be determined from the signal corresponding to the light flux that has passed through the filter element (PF) or the signal corresponding to the light flux that has passed through the filter element (SF), then in processes P201D and P202D the image processing unit PR reads the corresponding signal independently from each of the first imaging device CA1 and the second imaging device CA2, determines the exposure time on the basis of its value, and feeds this back so as to control the exposure time of each solid-state imaging element SR individually. Processes P203 to P216 are the same as in FIG. 10, so their description is omitted.
According to the present embodiment described above, providing the first array optical system and the second array optical system makes the device smaller, lighter, and more multifunctional than a combination of separate units. By arranging the first imaging device having the first array optical system and the second imaging device having the second array optical system so that stereo imaging is possible, a longer baseline length can be secured than with an imaging device having a single array optical system, ensuring ranging accuracy at long distances. The single-eye optical system IL having the plain-glass filter element (NO) can condense light of all wavelength regions to which the solid-state imaging element SR is sensitive, and by combining it with the filter elements (ND1, ND2) having different attenuation factors, a subject image with a wide dynamic range from low to high luminance can be obtained in a single exposure. The red, green, and blue filter elements (R, G, B) permit distance measurement of objects in the visible-light region, provide a visible image for display to the operator of the unmanned air vehicle 100, and allow the subject to be identified from the intensity differences or ratios between the wavelengths. The near-infrared filter element (IR) makes it possible to extract vegetation such as forests and to discriminate overhead wires and the like. The two types of polarizing filter elements (SF, PF) with different transmission axes make it possible to detect a reflector by detecting a difference in light quantity. By combining several of these single-eye functions, a desired image for obstacle detection can be obtained while keeping the device small and light. In the above embodiment, a visible-light stereo image is formed using the signals obtained through the red, green, and blue filter elements (R, G, B), but a monochrome or two-color stereo image may instead be formed using filter elements of at least one color. The image processing unit PR may also first detect whether a reflector exists in the subject and, when a reflector is detected, refrain from performing distance measurement on the corresponding region.
FIG. 15 is a perspective view of an unmanned air vehicle as a moving body according to another embodiment. The unmanned air vehicle 100' of this embodiment has the function of imaging a structure such as a bridge while flying autonomously. In the unmanned air vehicle 100' shown in FIG. 15, in contrast to the embodiment shown in FIG. 1, a shaft 106 rotatable relative to the main body 101 is provided, and a frame 107 is attached to the upper end of the shaft 106. The frame 107 carries a high-pixel camera HCA and, on either side of it, a third imaging device CA3 and a fourth imaging device CA4, with their optical axes parallel. The third imaging device CA3 and the fourth imaging device CA4 have the same configuration as the first imaging device CA1 and the second imaging device CA2, but are desirably directed in a different direction (for example, at 90 degrees).
FIG. 16 is a block diagram showing the configuration of the unmanned air vehicle 100'. The image signals output from the third imaging device CA3 and the fourth imaging device CA4 are input to a sub-image processing unit PR' arranged in the main body 101 of the unmanned air vehicle 100'; an inspection object distance information acquisition unit PR3 in the sub-image processing unit PR' can acquire the distance to a structure serving as the inspection object. A self-position information acquisition unit PR4, which can acquire self-position information of the unmanned air vehicle 100' by pattern-matching the images captured by the third imaging device CA3 and the fourth imaging device CA4 (a GPS signal may be used instead), is also provided in the sub-image processing unit PR'. These outputs are supplied to the drive control unit DR.
As in the above embodiment, the unmanned air vehicle 100' flies so as to avoid obstacles ahead in its direction of travel based on the signals from the first imaging device CA1 and the second imaging device CA2. In addition, the inspection object distance information acquisition unit PR3 acquires the distance to the structure serving as the inspection object, and the drive control unit DR can perform feedback control so that this distance stays constant; the signal from the self-position information acquisition unit PR4 also enables flight along a predetermined route. Furthermore, by photographing the structure with the high-pixel camera HCA during flight, the image can be transmitted to the monitor MT, where an operator can observe and record it. The remaining configuration is the same as in the above-described embodiment, so its description is omitted.
For example, when the unmanned air vehicle 100' approaches an infrastructure structure such as a bridge for inspection and images the bridge with the high-pixel camera HCA to obtain an inspection image, a fixed distance must be maintained so that the vehicle does not collide with the bridge being inspected. The moving direction of the unmanned air vehicle 100' and the direction of the inspection object may also differ, for example being perpendicular to each other. According to this embodiment, by directing the first imaging device CA1 and the second imaging device CA2 forward in the traveling direction of the unmanned air vehicle 100' and directing the other pair, the third imaging device CA3 and the fourth imaging device CA4, toward the inspection object, the distance to the inspection object can be maintained appropriately while obstacles in the traveling direction are detected, realizing an autonomously controllable unmanned air vehicle 100'.
According to this embodiment, if the exposure time (exposure amount) is determined from the light flux transmitted through the plain-glass filter element, a relatively low-luminance subject can be imaged at a fast shutter speed, the image is less susceptible to blur from the moving body or the subject, and the image formed by the light flux transmitted through the plain-glass filter element is free of crushed shadows. Images formed by light fluxes transmitted through the other optical filter elements are slightly underexposed, giving images free of blown highlights. As a result, ranging accuracy improves for both low-luminance and high-luminance subjects.
If the exposure time (exposure amount) is determined from one or more of the light fluxes transmitted through the red, green, and blue filter elements, ranging performance in the visible-light region improves, and sharp visible image information can be transmitted to the operator of the unmanned air vehicle.
If the exposure time (exposure amount) is determined from the light flux transmitted through the near-infrared filter element, light reflected by vegetation and the like can be imaged accurately, improving the ability to extract vegetation and similar targets.
If the exposure time (exposure amount) is determined from the light flux transmitted through a polarizing filter element, object light from reflectors and the like can be imaged accurately, improving the ability to extract reflectors.
If the exposure time (exposure amount) is determined from the light flux transmitted through a filter element capable of attenuating incident light, the light quantity under backlit conditions or from a high-luminance subject can be suppressed, and blown highlights in the image can thus be avoided.
If an image reflected in a reflector such as a water surface, transparent acrylic, or glass is taken as a ranging target of the stereo image, the distance to a virtual object lying beyond the reflector is detected, and the actual position of the reflector cannot be determined. To prevent such erroneous detection, it is important to recognize the reflector. As a means of recognizing a reflector, if the reflector is photographed through two or more types of polarizing filter elements with different transmission axes, the reflected light is polarized, so the luminance of the images obtained through the polarizing filter elements with different transmission axes differs at the reflector region. By detecting this difference in image luminance between the polarizing filter elements, the region can be identified as a reflector. The combination of polarizing filter elements with mutually different transmission axes may be within the same array optical system or across different array optical systems.
If a reflector can be detected in this way, accurate three-dimensional information of the subject can be obtained either by not adopting the distance information based on the light flux from the reflector or by not performing distance measurement on the light flux from the reflector.
However, even when the reflector can be identified, the distance to the reflector itself remains unknown. Therefore, ranging is performed on the non-reflector region at the boundary of the reflector, using an imaging optical system having any of a plain-glass filter element, a filter element transmitting red, green, or blue light, a filter element transmitting near-infrared light, filter elements with different attenuation factors, or a polarizing filter element, or using a combination of signals from imaging optical systems having different types of filter elements; taking the distance information obtained there as the distance of the reflector avoids the situation in which no distance information exists.
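The boundary substitution described here (process P208) might be sketched in one dimension as follows. The row-wise nearest-boundary scan is a deliberate simplification of whatever 2-D boundary handling the patent intends, and the sample ranges are hypothetical.

```python
# Boundary substitution as in process P208, reduced to one dimension: inside
# the reflector mask, copy the nearest measured distance on the same row.
def fill_reflector(dist, mask):
    out = [row[:] for row in dist]
    for y, row in enumerate(mask):
        for x, is_refl in enumerate(row):
            if not is_refl:
                continue
            for d in range(1, len(row)):  # scan outward for a boundary pixel
                if x - d >= 0 and not mask[y][x - d]:
                    out[y][x] = dist[y][x - d]
                    break
                if x + d < len(row) and not mask[y][x + d]:
                    out[y][x] = dist[y][x + d]
                    break
    return out

dist = [[4.0, 9.9, 9.9, 5.0]]        # 9.9 = spurious "virtual image" range
mask = [[False, True, True, False]]  # middle pixels detected as reflector
filled = fill_reflector(dist, mask)
```

The spurious virtual-image ranges inside the mask are thus replaced by plausible boundary ranges, so no region is left without distance information.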
Further, when a forest lies in the background of an overhead wire such as a power line, it is difficult to detect the wire from an ordinary visible image. If an imaging optical system provided with a filter element transmitting near-infrared light is available, however, a corresponding vegetation image is obtained; in this image, a subject whose pixels on the solid-state imaging element have low luminance values (pixel values) can be presumed to be a wire, so extracting such subjects makes detection of overhead wires such as power lines easy and enables accurate obstacle detection.
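The wire-extraction idea can be sketched as a simple luminance threshold on the near-infrared channel. The threshold and pixel values are illustrative assumptions only.

```python
# Wire extraction on the near-infrared channel: vegetation reflects NIR
# strongly, so pixels well below the vegetation level are wire candidates.
def wire_candidates(nir, veg_threshold=120):
    return [[v < veg_threshold for v in row] for row in nir]

nir = [[200, 30, 210],
       [190, 25, 205]]          # a dark vertical streak crossing bright forest
wire_mask = wire_candidates(nir)
```

A real implementation would additionally check the candidates for line-like shape before treating them as wires.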
If the region that can be ranged is identified before the ranging process, the computational load is smaller than when ranging is performed over the entire pixel region, and processing can be completed in a shorter time.
Even under imaging conditions where pixel values would saturate, such as backlighting, using a filter element capable of attenuating incident light yields an image with contrast and without blown highlights; since an image of optimal luminance for obstacle detection is obtained, false detections can be prevented.
If, before computing distances from the parallax information of the stereo images, the subject is identified or subject information is obtained from the characteristics of the light fluxes transmitted through the respective filter elements, an optimal distance image can be obtained for each subject accordingly, reducing the computational load and time compared with performing the ranging process all at once.
However, even when reflectors or other objects that cannot be ranged are present, performing ranging all at once can simplify the processing structure of the image processing algorithm. In that case, after ranging all subjects at once, the subject is identified or subject information is obtained from the characteristics of the light fluxes transmitted through the respective filter elements, whereby an optimal distance image can still be obtained for each subject.
By displaying a color visible image obtained through the filter elements transmitting red, green, and blue light, effective information needed for the operator's control of the unmanned air vehicle can be conveyed. In addition, images obtained through the plain-glass filter element or through a filter element that attenuates incident light can be superimposed on the display, providing an image with a wide dynamic range free of blown highlights and crushed shadows even under backlighting. Displaying an image obtained through a polarizing filter element can further indicate the positions of reflectors requiring attention in obstacle detection. These images may be displayed in any appropriate combination.
By mounting a stereo imaging device with these various functions on an unmanned air vehicle or the like, obstacle detection under a wide range of conditions becomes possible.
It is also conceivable that most of the imaging region is occupied by a reflector (a water surface), for example when the unmanned air vehicle moves toward a body of water. In such a situation accurate ranging becomes extremely difficult and the unmanned air vehicle may land on the water, so it is advisable to temporarily stop the moving body.
The present invention is not limited to the embodiments described in this specification; it will be apparent to those skilled in the art from the embodiments and technical ideas described herein that other embodiments and modifications are included. For example, the filters of the second imaging optical systems of the second imaging device are of the same types as the filters of the first array optical system of the first imaging device, but the filters of the second imaging optical systems need not be completely identical to those of the first imaging optical systems: each second imaging optical system may have the same filters as the first imaging optical systems and, in addition, extra filters (filters present only in the second imaging optical systems and not in the first).
Also, for example, the stereo imaging device of the present invention can be mounted not only on an unmanned air vehicle but also on a vehicle. In FIG. 17, for example, the first imaging device CA1 and the second imaging device CA2 are arranged on either side of the room mirror RM of a vehicle VH and perform imaging with their optical axes directed ahead of the vehicle, so that obstacles can be detected as described above and driving assistance can be realized. The first imaging device CA1 and the second imaging device CA2 may instead be provided in a headlamp or a bumper.
100, 100'  Unmanned air vehicle
101        Main body
102        Leg portion
103A-103D  Arm
104A-104D  Motor
105A-105D  Propeller
106        Shaft
107        Frame
CA1        First imaging device
CA2        Second imaging device
CA3        Third imaging device
CA4        Fourth imaging device
CF         Optical filter
CFa        Filter element
DR         Drive control unit
HCA        High-pixel camera
Ia         Photoelectric conversion region
IL         Single-eye optical system
MR         Memory
MT         Monitor
PR         Image processing unit
PR'        Sub-image processing unit
PR1        Subject distance information acquisition unit
PR2        Visible image information acquisition unit for monitoring
PR3        Inspection object distance information acquisition unit
PR4        Self-position information acquisition unit
RM         Room mirror
SR         Solid-state imaging element
VH         Vehicle

Claims (23)

  1.  A stereo imaging device comprising:
     a first imaging device having a first array optical system that comprises a plurality of first imaging optical systems arranged with mutually different optical axes, and a first solid-state image sensor that photoelectrically converts the images formed by the respective first imaging optical systems and outputs image signals; and
     a second imaging device having a second array optical system that comprises a plurality of second imaging optical systems arranged with mutually different optical axes, and a second solid-state image sensor that photoelectrically converts the images formed by the respective second imaging optical systems and outputs image signals,
     wherein the optical axes of the first imaging optical systems and the second imaging optical systems are parallel, and the first imaging device and the second imaging device are spaced apart in a direction orthogonal to the optical axes,
     each of the first imaging optical systems includes any one of a filter without optical correction, a red filter, a green filter, a blue filter, an ND filter, a near-infrared filter, and a polarizing filter, and the first array optical system has at least two types of filters,
     the filters of the second imaging optical systems are of the same types as the filters of the first array optical system, and
     three-dimensional information of a subject is obtained on the basis of the image signal of an image formed by a first imaging optical system and the image signal of an image formed by a second imaging optical system that have the same type of filter.
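The three-dimensional information in claim 1 comes from the disparity between like-filtered image signals of the two spaced-apart imaging devices. As an illustrative sketch only (the claim does not specify the computation), rectified stereo triangulation recovers depth from the focal length in pixels, the baseline between the devices, and the pixel disparity; the function name and numerical values below are assumptions for illustration:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Triangulate depth (metres) from stereo disparity (pixels).

    Z = f * B / d for a rectified pair with parallel optical axes,
    as in claim 1. Illustrative sketch, not the patent's method.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# A point imaged 20 px apart by two devices 0.1 m apart, f = 1000 px:
z = depth_from_disparity(20.0, 1000.0, 0.1)  # -> 5.0 m
```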
  2.  The stereo imaging device according to claim 1, wherein the first array optical system has at least two types of filters: at least one of the filter without optical correction, the red filter, the green filter, and the blue filter, and at least one of the filter without optical correction, the ND filter, the near-infrared filter, and the polarizing filter.
  3.  The stereo imaging device according to claim 1 or 2, wherein the first array optical system and the second array optical system respectively include a first imaging optical system and a second imaging optical system provided with the filter without optical correction, and
     an exposure amount at the time of imaging is determined from the image signal of the image formed by the first imaging optical system provided with the filter without optical correction and/or the image signal of the image formed by the second imaging optical system provided with the filter without optical correction.
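Claims 3 to 7 each determine the exposure amount from the image signal of a designated reference channel (clear, RGB, ND, near-infrared, or polarized). One hedged way to sketch such a controller, where the function name and the 8-bit target level are assumptions rather than anything the claims specify, is to scale the exposure so the reference channel's mean luminance approaches a target:

```python
def next_exposure(current_exposure_s, reference_image, target_mean=118.0):
    """Scale exposure so the reference channel's mean luminance
    approaches target_mean.

    reference_image: iterable of 8-bit luminance samples from the
    imaging optical system chosen as the exposure reference.
    Target level and doubling rule are illustrative assumptions.
    """
    mean = sum(reference_image) / len(reference_image)
    if mean == 0:
        return current_exposure_s * 2.0  # fully dark frame: open up
    return current_exposure_s * (target_mean / mean)

# A reference image averaging 59 asks for roughly double the exposure:
e = next_exposure(0.01, [59] * 100)  # -> 0.02
```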
  4.  The stereo imaging device according to claim 1 or 2, wherein the first array optical system and the second array optical system respectively include first imaging optical systems and second imaging optical systems provided with the red filter, the green filter, and the blue filter, and
     an exposure amount at the time of imaging is determined from the image signals of the images formed by the respective first imaging optical systems provided with the red filter, the green filter, and the blue filter and/or the image signals of the images formed by the respective second imaging optical systems provided with the red filter, the green filter, and the blue filter.
  5.  The stereo imaging device according to claim 1 or 2, wherein the first array optical system and the second array optical system respectively include a first imaging optical system and a second imaging optical system provided with the ND filter, and
     an exposure amount at the time of imaging is determined from the image signal of the image formed by the first imaging optical system provided with the ND filter and/or the image signal of the image formed by the second imaging optical system provided with the ND filter.
  6.  The stereo imaging device according to claim 1 or 2, wherein the first array optical system and the second array optical system respectively include a first imaging optical system and a second imaging optical system provided with the near-infrared filter, and
     an exposure amount at the time of imaging is determined from the image signal of the image formed by the first imaging optical system provided with the near-infrared filter and/or the image signal of the image formed by the second imaging optical system provided with the near-infrared filter.
  7.  The stereo imaging device according to claim 1 or 2, wherein the first array optical system and the second array optical system respectively include a first imaging optical system and a second imaging optical system provided with the polarizing filter, and
     an exposure amount at the time of imaging is determined from the image signal of the image formed by the first imaging optical system provided with the polarizing filter and/or the image signal of the image formed by the second imaging optical system provided with the polarizing filter.
  8.  The stereo imaging device according to claim 3 or 4, wherein another first imaging optical system and another second imaging optical system respectively have at least one of the ND filter, the near-infrared filter, and the polarizing filter.
  9.  The stereo imaging device according to any one of claims 1 to 8, wherein the first array optical system and the second array optical system include first imaging optical systems and second imaging optical systems provided with the polarizing filters having mutually different transmission axes, and
     the image signals of two images formed by the first imaging optical systems provided with the polarizing filters having mutually different transmission axes, or the image signals of two images formed by the second imaging optical systems provided with the polarizing filters having mutually different transmission axes, are compared, and when the two signals have a luminance difference equal to or greater than a threshold, the region having the luminance difference is determined to be a reflector.
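Claim 9 flags a region as a reflector when two images taken through polarizing filters with different transmission axes differ in luminance by at least a threshold: specularly reflected light is strongly polarized, so it passes one filter and is attenuated by the other, while diffuse surfaces look alike in both channels. A minimal sketch, assuming equal-length 8-bit luminance arrays and an illustrative threshold:

```python
def reflector_mask(pol_a, pol_b, threshold=40):
    """Mark pixels whose luminance differs between two polarization
    channels by at least `threshold` as belonging to a reflector.

    pol_a, pol_b: equal-length sequences of luminance values taken
    through polarizing filters with different transmission axes.
    The threshold value is an assumption for illustration.
    """
    return [abs(a - b) >= threshold for a, b in zip(pol_a, pol_b)]

# Glare pixel (200 vs 80) is flagged; matte pixel (100 vs 95) is not:
mask = reflector_mask([200, 100], [80, 95])  # -> [True, False]
```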
  10.  The stereo imaging device according to claim 9, wherein, when the processing unit has detected the reflector, distance measurement is not performed on the subject, or a distance measurement performed on the subject is not used as distance information.
  11.  The stereo imaging device according to claim 9 or 10, wherein, when the region is determined to be the reflector, a boundary portion of the reflector is extracted, three-dimensional information of the boundary portion is obtained on the basis of the image signal of an image formed by a first imaging optical system and the image signal of an image formed by a second imaging optical system that have the same type of filter, and the three-dimensional information of the boundary portion is used as the three-dimensional information of the reflector.
  12.  The stereo imaging device according to any one of claims 1 to 11, wherein the first array optical system and the second array optical system respectively include a first imaging optical system and a second imaging optical system provided with the near-infrared filter, and
     a subject is discriminated by extracting, from the image signal of the image formed by the first imaging optical system provided with the near-infrared filter and/or the image signal of the image formed by the second imaging optical system provided with the near-infrared filter, a region whose luminance value is lower than a threshold.
  13.  The stereo imaging device according to claim 12, wherein the subject is an overhead line.
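Claims 12 and 13 discriminate a subject such as an overhead line by extracting regions whose near-infrared luminance falls below a threshold: a thin wire against bright sky appears as a short dark run in the NIR channel. A hedged sketch over one image row (function name and threshold are illustrative assumptions):

```python
def low_nir_regions(nir_row, threshold=30):
    """Return (start, end) index ranges of consecutive pixels whose
    near-infrared luminance is below `threshold`.

    A short, dark run in an otherwise bright NIR row is a candidate
    overhead line. Threshold chosen for illustration only.
    """
    regions, start = [], None
    for i, v in enumerate(nir_row):
        if v < threshold and start is None:
            start = i                       # dark run begins
        elif v >= threshold and start is not None:
            regions.append((start, i))      # dark run ends
            start = None
    if start is not None:
        regions.append((start, len(nir_row)))
    return regions

# Bright sky (200) with a two-pixel dark wire (10):
r = low_nir_regions([200, 200, 10, 10, 200])  # -> [(2, 4)]
```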
  14.  The stereo imaging device according to any one of claims 1 to 13, wherein the first array optical system and the second array optical system respectively include first imaging optical systems and second imaging optical systems provided with at least one of the red filter, the green filter, and the blue filter and with the near-infrared filter, and
     a subject is discriminated by comparing the image signals of the images formed by the first imaging optical systems provided with at least one of the red filter, the green filter, and the blue filter and with the near-infrared filter and/or the image signals of the images formed by the second imaging optical systems provided with at least one of the red filter, the green filter, and the blue filter and with the near-infrared filter.
  15.  The stereo imaging device according to claim 14, wherein the subject is a plant.
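Claims 14 and 15 compare a visible-channel signal with the near-infrared signal to discriminate vegetation: healthy leaves absorb red light strongly but reflect near-infrared strongly. One common way to express such a comparison (the specific index and threshold are assumptions; the claims only require comparing the two signals) is a normalized difference:

```python
def is_vegetation(red, nir, index_threshold=0.3):
    """Classify a pixel as vegetation by the normalized difference
    (nir - red) / (nir + red), which is high for leaves because
    chlorophyll absorbs red while leaf tissue reflects near-infrared.
    The 0.3 threshold is illustrative, not from the patent.
    """
    if red + nir == 0:
        return False
    return (nir - red) / (nir + red) >= index_threshold

veg = is_vegetation(red=40, nir=180)    # -> True  (leaf-like)
road = is_vegetation(red=120, nir=130)  # -> False (asphalt-like)
```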
  16.  The stereo imaging device according to any one of claims 1 to 13, wherein the first array optical system and the second array optical system respectively include first imaging optical systems and second imaging optical systems provided with at least two of the filter without optical correction and a plurality of the ND filters having mutually different attenuation factors, and
     the contrast of each image is individually obtained on the basis of the image signals of the images formed by those first imaging optical systems and the image signals of the images formed by those second imaging optical systems, and three-dimensional information of a subject is obtained using the image signal of the image having the highest contrast.
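Claim 16 captures the scene through a clear filter and ND filters of different attenuation, then ranges using the image with the highest contrast. A minimal sketch using RMS contrast as the measure (the contrast metric is an assumption; the claim does not name one):

```python
def rms_contrast(image):
    """Root-mean-square deviation of luminance from its mean."""
    mean = sum(image) / len(image)
    return (sum((v - mean) ** 2 for v in image) / len(image)) ** 0.5

def best_exposed(images):
    """Among images of one scene taken through a clear filter and ND
    filters of different attenuation, return the index of the image
    with the highest RMS contrast (claim 16 then uses that image's
    signal for obtaining 3D information)."""
    return max(range(len(images)), key=lambda i: rms_contrast(images[i]))

clipped = [255, 255, 255, 250]  # over-exposed, almost flat
good = [40, 120, 200, 90]       # well exposed
dark = [2, 3, 2, 4]             # under-exposed, almost flat
best = best_exposed([clipped, good, dark])  # -> 1
```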
  17.  The stereo imaging device according to any one of claims 1 to 16, wherein the first array optical system and the second array optical system respectively include first imaging optical systems and second imaging optical systems provided with at least two of the filter without optical correction and a plurality of the ND filters having mutually different attenuation factors, and
     images having a wide dynamic range are individually created on the basis of the image signals of the images formed by those first imaging optical systems and the image signals of the images formed by those second imaging optical systems, and three-dimensional information of a subject is obtained using the image signals of the created wide-dynamic-range images.
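Claim 17 instead composes the differently attenuated images into a single wide-dynamic-range image before stereo matching. A hedged sketch of one simple merge rule, where each pixel is taken from the brightest non-saturated channel and scaled back to a common radiance scale (the weighting scheme and saturation level are assumptions; the claim does not specify them):

```python
def merge_hdr(channels, gains, saturation=250):
    """Merge images of one scene taken through filters of different
    attenuation into one wide-dynamic-range image.

    channels: list of equal-length pixel rows, least attenuated first.
    gains: per-channel factor converting pixel values back to a common
    radiance scale (e.g. the reciprocal of each ND transmittance).
    For each pixel, the brightest non-saturated channel is used.
    """
    merged = []
    for px in zip(*channels):
        for value, gain in zip(px, gains):
            if value < saturation:
                merged.append(value * gain)
                break
        else:  # every channel saturated: fall back to most attenuated
            merged.append(px[-1] * gains[-1])
    return merged

# Clear channel saturates on pixel 1; the ND channel (1/8 light) covers it:
out = merge_hdr([[100, 255], [13, 64]], gains=[1.0, 8.0])  # -> [100.0, 512.0]
```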
  18.  The stereo imaging device according to any one of claims 1 to 17, comprising a processing unit,
     wherein the processing unit calculates three-dimensional information of a subject from the parallax information between the image signal of the image formed by the first imaging optical system and the image signal of the image formed by the second imaging optical system that have the same type of filter in the first imaging device and the second imaging device, and then selects a usable subject region from the calculated three-dimensional information of the subject on the basis of the image signal of the image formed by the first imaging optical system and/or the image signal of the image formed by the second imaging optical system that have a predetermined type of filter in the first imaging device and the second imaging device.
  19.  The stereo imaging device according to any one of claims 1 to 17, comprising a processing unit,
     wherein the processing unit selects a usable subject region on the basis of the image signal of the image formed by the first imaging optical system and/or the image signal of the image formed by the second imaging optical system that have a predetermined type of filter in the first imaging device and the second imaging device, and
     calculates three-dimensional information of the usable subject region from the parallax information between the image signal of the image formed by the first imaging optical system and the image signal of the image formed by the second imaging optical system that have the same type of filter in the first imaging device and the second imaging device.
  20.  A stereo imaging device that transmits an image obtained by imaging with the stereo imaging device according to any one of claims 1 to 19 to a device other than the stereo imaging device.
  21.  A moving body comprising the stereo imaging device according to any one of claims 1 to 19.
  22.  The moving body according to claim 21, wherein the moving body has a movement control unit that controls movement of the moving body, and
     the image signals of two images formed by the first imaging optical systems provided with the polarizing filters having mutually different transmission axes, or the image signals of two images formed by the second imaging optical systems provided with the polarizing filters having mutually different transmission axes, are compared, and when there is a luminance difference equal to or greater than a threshold, the moving body is temporarily stopped.
  23.  The moving body according to claim 21 or 22, further comprising:
     a third imaging device having a third array optical system that comprises a plurality of third imaging optical systems arranged with mutually different optical axes, and a third solid-state image sensor that photoelectrically converts the images formed by the respective third imaging optical systems and outputs image signals; and
     a fourth imaging device having a fourth array optical system that comprises a plurality of fourth imaging optical systems arranged with mutually different optical axes, and a fourth solid-state image sensor that photoelectrically converts the images formed by the respective fourth imaging optical systems and outputs image signals,
     wherein the optical axes of the third imaging optical systems and the fourth imaging optical systems are parallel, and the third imaging device and the fourth imaging device are spaced apart in a direction orthogonal to the optical axes and are arranged so as to be able to image a direction different from the imaging direction of the first imaging device and the second imaging device.
PCT/JP2015/085013 2014-12-26 2015-12-15 Stereo imaging device and moving body WO2016104235A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2016566133A JPWO2016104235A1 (en) 2014-12-26 2015-12-15 Stereo imaging apparatus and moving body

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014264014 2014-12-26
JP2014-264014 2014-12-26

Publications (1)

Publication Number Publication Date
WO2016104235A1 true WO2016104235A1 (en) 2016-06-30

Family

ID=56150260

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/085013 WO2016104235A1 (en) 2014-12-26 2015-12-15 Stereo imaging device and moving body

Country Status (2)

Country Link
JP (1) JPWO2016104235A1 (en)
WO (1) WO2016104235A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002171430A (en) * 2000-11-30 2002-06-14 Canon Inc Compound eye imaging system, imaging device and electronic apparatus
JP2007127431A (en) * 2005-11-01 2007-05-24 Fuji Xerox Co Ltd Method and apparatus for detecting end position
JP2008005488A (en) * 2006-06-19 2008-01-10 Samsung Electro Mech Co Ltd Camera module
JP2008157851A (en) * 2006-12-26 2008-07-10 Matsushita Electric Ind Co Ltd Camera module
JP2011164061A (en) * 2010-02-15 2011-08-25 Ricoh Co Ltd Transparent object detection system
JP2011176710A (en) * 2010-02-25 2011-09-08 Sharp Corp Imaging apparatus
JP2013044597A (en) * 2011-08-23 2013-03-04 Canon Inc Image processing device and method, and program
JP2013156109A (en) * 2012-01-30 2013-08-15 Hitachi Ltd Distance measurement device

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018013949A (en) * 2016-07-21 2018-01-25 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Mobile body, method for detecting obstacle of mobile body, and program for detecting obstacle of mobile body
WO2018188627A1 (en) * 2017-04-12 2018-10-18 普宙飞行器科技(深圳)有限公司 Omnidirectional obstacle avoidance apparatus, tripod head, tripod head control method, and obstacle avoidance control method
US10776938B2 (en) 2017-05-19 2020-09-15 Waymo Llc Camera systems using filters and exposure times to detect flickering illuminated objects
US11341667B2 (en) 2017-05-19 2022-05-24 Waymo Llc Camera systems using filters and exposure times to detect flickering illuminated objects

Also Published As

Publication number Publication date
JPWO2016104235A1 (en) 2017-10-05

Similar Documents

Publication Publication Date Title
US10564266B2 (en) Distributed LIDAR with fiber optics and a field of view combiner
US10408940B2 (en) Remote lidar with coherent fiber optic image bundle
KR101951318B1 (en) 3D image acquisition apparatus and method of obtaining color and depth images simultaneously
JP6878219B2 (en) Image processing device and ranging device
JP2018160228A (en) Route generation device, route control system, and route generation method
KR102119289B1 (en) Systems and methods for sample inspection and review
US20140009611A1 (en) Camera System and Method for Observing Objects at Great Distances, in Particular for Monitoring Target Objects at Night, in Mist, Dust or Rain
TWI781109B (en) System and method for stereo triangulation
WO2016104235A1 (en) Stereo imaging device and moving body
US11781913B2 (en) Polarimetric imaging camera
KR101545971B1 (en) System for sensing complex image
JP6971933B2 (en) Image processing equipment and imaging equipment
US20180098053A1 (en) Imaging device, endoscope apparatus, and imaging method
US20180092516A1 (en) Imaging device, endoscope apparatus, and imaging method
US11172108B2 (en) Imaging device
WO2016039053A1 (en) Surveying device
JP6756898B2 (en) Distance measuring device, head-mounted display device, personal digital assistant, video display device, and peripheral monitoring system
JP6202364B2 (en) Stereo camera and moving object
US11422264B2 (en) Optical remote sensing
KR20160139927A (en) Medical multi-modal imaging system
US10075646B2 (en) Sensor systems and methods
JP6847891B2 (en) Image processing equipment, imaging equipment and methods
JP2006024986A (en) Multiband imaging apparatus
KR102613150B1 (en) High resolution LiDAR system with LCoS panel
JP2007057386A (en) In-line three-dimensional measuring device and measuring method using one camera

Legal Events

Date Code Title Description
121   EP: the EPO has been informed by WIPO that EP was designated in this application
      Ref document number: 15872804; Country of ref document: EP; Kind code of ref document: A1
ENP   Entry into the national phase
      Ref document number: 2016566133; Country of ref document: JP; Kind code of ref document: A
NENP  Non-entry into the national phase
      Ref country code: DE
122   EP: PCT application non-entry in European phase
      Ref document number: 15872804; Country of ref document: EP; Kind code of ref document: A1