WO2016104235A1 - Stereoscopic imaging device and moving body - Google Patents


Info

Publication number
WO2016104235A1
WO2016104235A1 (PCT/JP2015/085013)
Authority
WO
WIPO (PCT)
Prior art keywords
optical system
filter
imaging
imaging device
image
Prior art date
Application number
PCT/JP2015/085013
Other languages
English (en)
Japanese (ja)
Inventor
敦司 山下
山本 信一
野崎 昭俊
修 丹内
Original Assignee
コニカミノルタ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by コニカミノルタ株式会社 filed Critical コニカミノルタ株式会社
Priority to JP2016566133A priority Critical patent/JPWO2016104235A1/ja
Publication of WO2016104235A1 publication Critical patent/WO2016104235A1/fr

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00Measuring distances in line of sight; Optical rangefinders
    • G01C3/02Details
    • G01C3/06Use of electric means to obtain final indication
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00Stereoscopic photography
    • G03B35/08Stereoscopic photography by simultaneous recording
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B7/00Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
    • G03B7/08Control effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
    • G03B7/091Digital circuits

Definitions

  • The present invention relates to a stereo imaging device including a plurality of compound-eye imaging devices, and to a moving body equipped with such a stereo imaging device.
  • Obstacle detection devices that detect obstacles ahead in the traveling direction have been developed for unmanned moving bodies and vehicles.
  • The obstacle detection apparatus disclosed in Patent Document 1 detects an obstacle by scanning infrared rays projected in the traveling direction, but the detection range remains within the infrared scan range and is therefore relatively narrow.
  • the obstacle detection device disclosed in Patent Document 2 detects an obstacle using a millimeter wave radar, but has a problem that it is difficult to identify the shape and size of the obstacle.
  • The obstacle detection device disclosed in Patent Document 3 uses a stereo camera and calculates, by template matching, the positional deviation (parallax) of the same object across a plurality of images captured at the same time. The position of the object in real space can then be calculated from the obtained parallax using a known conversion formula, which solves the problems described above.
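The known conversion formula referred to above is, for a rectified stereo pair, the pinhole relation Z = f · B / d. A minimal sketch follows; the focal length, baseline, and disparity values are illustrative assumptions, not figures from the patent:

```python
# For a rectified stereo pair, depth Z = f * B / d, where f is the focal
# length in pixels, B the baseline in metres, and d the disparity in pixels.
def disparity_to_depth(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Depth Z (metres) from disparity d (pixels)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

With an assumed focal length of 800 px and a 0.10 m baseline, a disparity of 8 px corresponds to a depth of 10 m; note that depth resolution degrades with the square of the distance.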
  • However, Patent Document 3 has the problem that it is difficult to identify a subject that is a reflecting object or is under backlight conditions. Mounting all three types of obstacle detection devices described above and switching between them according to conditions is conceivable, but this increases cost and weight. In particular, when an obstacle detection device is mounted on an unmanned air vehicle or the like, a device that is as small and light as possible is desired in order to secure a long cruising distance.
  • the present invention has been made in view of such problems, and an object of the present invention is to provide a stereo imaging device capable of recognizing various obstacles while being small and light and a moving body having the same.
  • a stereo imaging device reflecting one aspect of the present invention.
  • a first imaging device having a first array optical system with a plurality of first imaging optical systems arranged with mutually different optical axes, and a first solid-state imaging device that photoelectrically converts the image formed by each of the plurality of first imaging optical systems and outputs an image signal;
  • a second imaging device having a second array optical system with a plurality of second imaging optical systems arranged with mutually different optical axes, and a second solid-state imaging device that photoelectrically converts the image formed by each of the plurality of second imaging optical systems and outputs an image signal; wherein, in this stereo imaging device, the optical axes of the first imaging optical systems and the second imaging optical systems are parallel, and the first imaging device and the second imaging device are arranged apart from each other in a direction orthogonal to the optical axes;
  • Each of the first imaging optical systems includes any one of a filter without optical correction, a red filter, a green filter, a blue filter, an ND filter, a near-infrared filter, and a polarizing filter, and at least two types of filters are present in the first array optical system. Each filter of the second imaging optical systems is of the same type as a filter of the first array optical system. Three-dimensional information of the subject is obtained based on the image signal of the image formed by a first imaging optical system and the image signal of the image formed by the second imaging optical system having the same type of filter.
  • This makes it possible to provide a stereo imaging device capable of recognizing various obstacles while being small and light, and a moving body having the same.
  • FIG. 1 is a perspective view of an unmanned air vehicle 100. FIG. 2 is a diagram showing a schematic configuration of the first imaging device CA1. FIG. 3 is a diagram showing an example of the arrangement of filter elements in the optical filter. FIG. 4 is a block diagram showing the configuration of the unmanned air vehicle 100. FIG. 5 is a block diagram showing a processing flow in the image processing unit PR, and FIGS. 6 to 10 are block diagrams each showing another processing flow in the image processing unit PR.
  • FIG. 1 is a perspective view of an unmanned air vehicle as a moving body according to the present embodiment.
  • The main body 101, supported by the legs 102, has four arms 103A to 103D extending horizontally at equal intervals in the circumferential direction.
  • Motors 104A to 104D, which are propulsive force generators, are attached to the tips of the arms 103A to 103D, and propellers 105A to 105D are rotatably attached to the vertically oriented rotation shafts of the motors 104A to 104D.
  • The first image pickup device CA1 and the second image pickup device CA2 are installed with their optical axes parallel to each other, spaced apart in the direction perpendicular to the optical axes, and facing the traveling direction (arrow direction).
  • FIG. 2 is a diagram illustrating a schematic configuration of the first imaging device CA1.
  • The first imaging device CA1 has single-eye optical systems IL arranged in three rows and three columns with their optical axes parallel to each other, a solid-state imaging device (first solid-state imaging device) SR having nine photoelectric conversion regions Ia (which may be integrated) that photoelectrically convert the subject image formed by each single-eye optical system IL, and an optical filter CF disposed between the single-eye optical systems IL and the photoelectric conversion regions Ia.
  • The optical filter CF is divided into nine filter elements CFa corresponding to the individual single-eye optical systems IL.
  • the optical filter CF may be disposed on the subject side with respect to the single-eye optical system IL.
  • In the first imaging device CA1, the optical filter CF, which includes a plurality of types of filter elements, and the plurality of single-eye optical systems IL constitute a first array optical system. Likewise, in the second imaging device CA2, the optical filter CF and the plurality of single-eye optical systems IL constitute a second array optical system, and the solid-state imaging element SR that photoelectrically converts the subject images formed by it corresponds to the second solid-state imaging device.
  • FIG. 3 is a diagram illustrating an example of the arrangement of filter elements in the optical filter used in the first imaging device CA1.
  • The optical filter CF includes a filter element CFa of transparent glass (hereinafter referred to as NO), a filter element CFa that transmits red light (hereinafter R), a filter element CFa that transmits green light (hereinafter G), a filter element CFa that transmits blue light (hereinafter B), ND filter elements CFa having a first attenuation factor and a second attenuation factor (hereinafter ND1 and ND2), a filter element CFa that transmits near-infrared light (hereinafter IR), and polarizing filter elements CFa (hereinafter PF and SF).
  • An imaging optical system (first imaging optical system or second imaging optical system) is constituted by one filter element CFa and the corresponding single-eye optical system IL; it is sufficient that at least two types of filter elements are provided.
  • The second imaging device CA2 has the same configuration as the first imaging device CA1 shown in FIGS. 2 and 3.
  • the arrangement of the filter elements CFa may be the same as that of the first imaging device, an arrangement having a mirror image relationship, or a random arrangement.
  • Here, the “filter without optical correction” refers to a filter that applies substantially no optical correction to the transmitted light. For example, transparent glass or a transparent resin material may be used, or no filter element may be provided at all. In such cases, wavelength information over a wider band than each of R, G, and B can be captured.
  • FIG. 4 is a block diagram showing the configuration of the unmanned air vehicle 100.
  • Image signals output from the first imaging device CA1 and the second imaging device CA2 are input to an image processing unit PR, which is mounted on the main body 101 of the unmanned air vehicle 100 and connected to the first imaging device CA1 and the second imaging device CA2.
  • the subject distance information acquisition unit PR1 in the image processing unit PR obtains three-dimensional information of the imaging region, thereby obtaining subject information related to the obstacle.
  • a visible image is formed by the monitoring visible image information acquisition unit PR2 in the image processing unit PR, and the data is transmitted to an external monitor (display device) MT.
  • the visible image captured from the unmanned air vehicle 100 can be observed by the operator through the monitor MT.
  • A color visible-light image can be formed using the light beams transmitted through the filter elements (R, G, B) of the first imaging device CA1 and/or the second imaging device CA2, and this image is displayed on the monitor MT.
  • a monitor image may be displayed using a light beam that has passed through another filter element.
  • The subject information output from the image processing unit PR is output to a drive control unit (movement control unit) DR disposed in the main body 101, and based on this information, autonomous flight can be realized by controlling the motors 104A to 104D so as not to collide with obstacles.
  • FIG. 5 is a block diagram illustrating a processing flow in the image processing unit PR.
  • The unmanned air vehicle 100 detects an object ahead in the moving direction as an obstacle and flies so as to avoid it.
  • In the first imaging device CA1 and the second imaging device CA2, the light beams transmitted through the filter elements CFa and the single-eye optical systems IL are photoelectrically converted by the photoelectric conversion regions Ia of the solid-state imaging devices SR. After photoelectric conversion, each signal is output from the solid-state imaging device SR, stored in a memory MR (not shown) provided in the image processing unit PR, and read out as necessary.
  • First processing mode: targets for distance measurement are extracted in advance, and distance measurement is performed on those targets to create a distance map. When a reflector is detected, the distance information measured on the reflector itself is not used.
  • When the exposure time (exposure amount) is to be determined from the light flux that has passed through the transparent-glass filter element (NO), by a predetermined program or by the operator's operation, the image processing unit PR, in processes P101 and P102, independently receives from each of the first imaging device CA1 and the second imaging device CA2 a signal corresponding to the light flux that has passed through the filter element (NO), determines the exposure time from that value, and feeds it back to individually control the exposure time of each solid-state imaging device SR. After imaging with the determined exposure time, the signals output from the photoelectric conversion regions Ia of the first imaging device CA1 and the second imaging device CA2 are stored in the memory MR.
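The exposure feedback described above can be sketched as a simple proportional update on the mean brightness of the NO channel. The target level and the clamp limits below are illustrative assumptions; the patent does not specify how the exposure is computed from the signal:

```python
# One step of an exposure feedback loop: the mean pixel value of the NO
# (transparent) channel drives the next exposure time toward a target level.
def update_exposure(no_channel_mean: float, current_exposure_s: float,
                    target_mean: float = 118.0,
                    min_exp: float = 1e-5, max_exp: float = 1e-1) -> float:
    """Scale the exposure so the NO-channel mean approaches the target."""
    if no_channel_mean <= 0:
        return max_exp                      # no signal: expose as long as allowed
    new_exp = current_exposure_s * (target_mean / no_channel_mean)
    return min(max(new_exp, min_exp), max_exp)
```

Each imaging device would run this loop independently, matching the per-device control described above.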
  • The image processing unit PR receives from the memory MR, for each of the first imaging device CA1 and the second imaging device CA2, a signal corresponding to the light beam that has passed through the transparent-glass filter element (NO), a signal corresponding to the light beam that has passed through the filter element (ND1) having the first attenuation factor, and a signal corresponding to the light beam that has passed through the filter element (ND2) having the second attenuation factor; it then selects the image with the highest contrast and discriminates the subject from the signal values corresponding to that image.
  • Under imaging conditions such as backlighting, a subject with high brightness such as the sun may appear in the image, and the photoelectric conversion region Ia receiving the light beam that has passed through the transparent-glass filter element (NO) may be saturated. In that case, a subject image with contrast free of whiteout is obtained from the signal corresponding to the light beam that has passed through the filter element (ND1 or ND2) having the first or second attenuation factor, so an image with brightness optimal for obstacle detection can be obtained. This prevents erroneous detection.
  • Alternatively, any two of the filter elements (NO, ND1, ND2) may be selected and the same operation performed on the resulting signals, or any two or all three of the filter elements (NO, ND1, ND2) may be used to create an image with a wide dynamic range.
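Selecting the highest-contrast image among the NO, ND1, and ND2 exposures might look like the following sketch. The patent does not name a contrast metric; RMS contrast (the standard deviation of pixel values) is an assumed stand-in:

```python
# Choose the exposure that retains the most detail: a saturated (whiteout)
# frame has near-zero spread, while a properly attenuated frame does not.
def pick_highest_contrast(images):
    """Return the image (list of rows) with the largest RMS contrast."""
    def rms_contrast(img):
        flat = [p for row in img for p in row]
        mean = sum(flat) / len(flat)
        return (sum((p - mean) ** 2 for p in flat) / len(flat)) ** 0.5
    return max(images, key=rms_contrast)
```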
  • In process P104, the image processing unit PR receives from the memory MR a signal corresponding to the light beam that has passed through the filter element (PF) and a signal corresponding to the light beam that has passed through the filter element (SF), obtains the difference between the two signal values, and determines from these values whether the subject contains a reflector. If the difference between the two signal values exceeds a threshold value, reflected light has entered, so the image processing unit PR recognizes the region exceeding the threshold as a reflector.
  • the signal used for the determination of the reflector may be only the signal in the first imaging device CA1, only the signal in the second imaging device CA2, or from both the first imaging device CA1 and the second imaging device CA2. This signal may be used.
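The polarization-difference test described above can be sketched per pixel as follows; the threshold value is an illustrative assumption:

```python
# Pixels where the two polarization channels (PF, SF) differ by more than a
# threshold are flagged as belonging to a specular reflector.
def reflector_mask(pf, sf, threshold):
    """Per-pixel boolean mask: True where |PF - SF| exceeds the threshold."""
    return [[abs(p - s) > threshold for p, s in zip(prow, srow)]
            for prow, srow in zip(pf, sf)]
```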
  • In process P105, the image processing unit PR receives from the memory MR a signal corresponding to the light beam that has passed through the near-infrared-transmitting filter element (IR), and identifies and extracts subject regions whose signal is lower than a threshold value. For example, overhead power lines above a forest may be difficult to discern with visible light because their color tone is close to the background; however, plants generally reflect near-infrared light strongly, so only the signal in the overhead-line portion becomes low. An overhead line can therefore be distinguished from the forest by capturing near-infrared light.
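The extraction of low near-infrared regions, such as an overhead line against IR-bright vegetation, reduces to a simple threshold test; the threshold value is an illustrative assumption:

```python
# Flag pixels whose near-IR signal falls below a threshold: vegetation
# reflects near-IR strongly, so man-made lines stand out as low-IR regions.
def low_ir_mask(ir_image, threshold):
    """Per-pixel boolean mask: True where the IR signal is below the threshold."""
    return [[v < threshold for v in row] for row in ir_image]
```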
  • The image processing unit PR receives from the memory MR a signal corresponding to the red light beam that has passed through the filter element (R), and at least one of a signal corresponding to the green light beam that has passed through the filter element (G), a signal corresponding to the blue light beam that has passed through the filter element (B), and a signal corresponding to the light beam that has passed through the near-infrared-transmitting filter element (IR), and performs a calculation on them to identify the subject.
  • an NDVI (normalized difference vegetation index) formula can be used to recognize an object.
  • Here, the luminance value of the signal corresponding to the red light beam that has passed through the filter element (R) is denoted Rs, and the luminance value of the signal corresponding to the near-infrared light beam that has passed through the filter element (IR) is denoted IRs. A signal corresponding to the green light beam that has passed through the filter element (G) or the blue light beam that has passed through the filter element (B) may also be used as appropriate.
  • NDVI = (IRs − Rs) / (IRs + Rs) … (1)
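Formula (1) can be computed per pixel as follows. The guard against a zero denominator is an implementation assumption; the patent gives only the formula itself:

```python
# Per-pixel evaluation of NDVI = (IRs - Rs) / (IRs + Rs). Vegetation
# reflects near-IR strongly, so plant pixels score high, while man-made
# objects score low or negative.
def ndvi(ir_s: float, r_s: float) -> float:
    """Normalized difference vegetation index for one pixel."""
    denom = ir_s + r_s
    return 0.0 if denom == 0 else (ir_s - r_s) / denom
```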
  • For the regions other than the reflector determined in process P104, the image processing unit PR receives from the memory MR a signal corresponding to the red light beam that has passed through the filter element (R), a signal corresponding to the green light beam that has passed through the filter element (G), and a signal corresponding to the blue light beam that has passed through the filter element (B), generates two sets of visible-light image data, forms a stereo image of each color based on the parallax information, and performs distance measurement.
  • A distance measuring technique using stereo images is disclosed in Japanese Patent Application Laid-Open No. 2009-186228, so the details are omitted here.
  • the image processing unit PR creates a visible light distance map for each pixel in process P108, and obtains three-dimensional information of the subject.
  • Here, the “distance map” refers to a subject image in which each pixel has distance information.
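The patent defers the stereo ranging details to JP 2009-186228. Purely as a rough illustration, a per-row sum-of-absolute-differences (SAD) search over a rectified pair, with the window size and disparity range as assumed parameters, could look like:

```python
# For each pixel in the left row, find the disparity that minimizes the SAD
# cost against the right row (rectified pair, so the search is 1-D).
def sad_disparity_row(left_row, right_row, window=2, max_disp=16):
    """Return a per-pixel disparity estimate for one rectified scanline."""
    n = len(left_row)
    disp = [0] * n
    for x in range(window, n - window):
        best, best_d = float("inf"), 0
        # only disparities whose window stays inside the right row
        for d in range(0, min(max_disp, x - window) + 1):
            cost = sum(abs(left_row[x + k] - right_row[x - d + k])
                       for k in range(-window, window + 1))
            if cost < best:
                best, best_d = cost, d
        disp[x] = best_d
    return disp
```

Feeding these disparities into the f · B / d conversion yields a per-pixel distance, i.e. one row of a distance map.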
  • In process P109, for the regions other than the reflector determined in process P104, the image processing unit PR receives from the memory MR a signal corresponding to the light beam that has passed through the transparent-glass filter element (NO), a signal corresponding to the light beam that has passed through the filter element (ND1) having the first attenuation factor, and a signal corresponding to the light beam that has passed through the filter element (ND2) having the second attenuation factor, forms a stereo image from the image with the highest contrast, and performs distance measurement. Thereafter, in process P110, the image processing unit PR creates a backlight distance map for each pixel and obtains three-dimensional information of the subject.
  • Alternatively, any two or all three of the filter elements (NO, ND1, ND2) may be used to create an image with a wide dynamic range, form a stereo image, and perform distance measurement.
  • The image processing unit PR obtains the boundary portion of the reflector determined in process P104, and for this boundary portion receives from the memory MR a signal corresponding to the light beam that has passed through the filter element (PF) or a signal corresponding to the light beam that has passed through the filter element (SF), forms a stereo image, and performs distance measurement. Thereafter, in process P112, the image processing unit PR creates a distance map including the reflector, using the obtained distance measurement value of the boundary as the distance measurement value of the reflector, and obtains three-dimensional information of the subject.
  • In process P113, for the regions other than the reflector determined in process P104, the image processing unit PR receives from the memory MR a signal corresponding to the near-infrared light beam that has passed through the filter element (IR), forms a near-infrared stereo image, and performs distance measurement. Thereafter, in process P114, the image processing unit PR creates a near-infrared distance map for each pixel and obtains three-dimensional information of the subject.
  • In process P115, for the regions other than the reflector determined in process P104, the image processing unit PR receives from the memory MR a signal corresponding to the red light beam that has passed through the filter element (R), a signal corresponding to the green light beam that has passed through the filter element (G), a signal corresponding to the blue light beam that has passed through the filter element (B), and a signal corresponding to the near-infrared light beam that has passed through the filter element (IR), forms visible-light and near-infrared stereo images, and performs distance measurement. Thereafter, in process P116, the image processing unit PR creates a visible-light and near-infrared distance map for each pixel and obtains three-dimensional information of the subject.
  • The image processing unit PR superimposes the distance maps obtained in processes P108, P110, P112, P114, and P116, excludes the distance measurement information of the reflector itself, whose accuracy cannot be guaranteed, and outputs subject information to the drive control unit DR so as to avoid, for example, any object recognized in two or more of the distance maps.
  • Alternatively, the image processing unit PR may output subject information to the drive control unit DR so as to avoid the object closest to the unmanned air vehicle 100, based on the distance maps obtained in processes P108, P110, P112, P114, and P116.
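The superposition of the per-filter distance maps and the "two or more maps agree" criterion can be sketched as follows, treating each map as a flat list of per-pixel distances. The agreement count and tolerance are illustrative assumptions:

```python
# Per pixel, keep a distance only when at least `agree` maps report values
# within `tol` metres of each other; otherwise mark the pixel unknown (None).
def fuse_distance_maps(maps, agree=2, tol=0.5):
    """Fuse several distance maps (flat lists; None = no measurement)."""
    fused = []
    for pix_vals in zip(*maps):           # same pixel index across all maps
        vals = sorted(v for v in pix_vals if v is not None)
        best = None
        for v in vals:
            close = [u for u in vals if abs(u - v) <= tol]
            if len(close) >= agree:
                best = sum(close) / len(close)   # average the agreeing maps
                break
        fused.append(best)
    return fused
```

An outlier reported by a single map (such as a ranging error on a reflector) is thus dropped, while distances confirmed by two or more maps survive.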
  • FIG. 6 is a block diagram showing another example of the processing flow in the image processing unit PR.
  • When the exposure time is to be determined from the red light beam that has passed through the filter element (R), the green light beam that has passed through the filter element (G), and the blue light beam that has passed through the filter element (B), by a predetermined program or by the operator's operation, the image processing unit PR, in processes P101A and P102A, independently receives from each of the first imaging device CA1 and the second imaging device CA2 the signals corresponding to the red, green, and blue light beams, calculates a luminance value from these values, determines the exposure time, and feeds it back to individually control the exposure time of each solid-state imaging element SR. Processes P103 to P117 are the same as those in FIG. 5, and their description is omitted.
  • FIG. 7 is a block diagram showing another example of the processing flow in the image processing unit PR.
  • When the exposure time is to be determined from the near-infrared light beam that has passed through the filter element (IR), by a predetermined program or by the operator's operation, the image processing unit PR, in processes P101B and P102B, independently receives from each of the first imaging device CA1 and the second imaging device CA2 a signal corresponding to the near-infrared light beam that has passed through the filter element (IR), determines the exposure time from this value, and feeds it back to individually control the exposure time of each solid-state image sensor SR. Processes P103 to P117 are the same as those in FIG. 5, and their description is omitted.
  • FIG. 8 is a block diagram showing another example of the processing flow in the image processing unit PR.
  • When the exposure time is to be determined from the light beams that have passed through the filter element (ND1) having the first attenuation factor and the filter element (ND2) having the second attenuation factor, by a predetermined program or by the operator's operation, the image processing unit PR, in processes P101C and P102C, independently receives from each of the first imaging device CA1 and the second imaging device CA2 a signal corresponding to the light beam that has passed through the filter element (ND1) and/or a signal corresponding to the light beam that has passed through the filter element (ND2), determines the exposure time from these values, and feeds it back to individually control the exposure time of each solid-state imaging device SR. Processes P103 to P117 are the same as those in FIG. 5, and their description is omitted.
  • FIG. 9 is a block diagram showing another example of the processing flow in the image processing unit PR.
  • When the exposure time is to be determined from a signal corresponding to the light beam that has passed through the filter element (PF) or a signal corresponding to the light beam that has passed through the filter element (SF), by a predetermined program or by the operator's operation, the image processing unit PR independently receives from each of the first image pickup apparatus CA1 and the second image pickup apparatus CA2 a signal corresponding to the light beam that has passed through the filter element (PF) or the filter element (SF), determines the exposure time from this value, and feeds it back to individually control the exposure time of each solid-state imaging device SR. Processes P103 to P117 are the same as those in FIG. 5, and their description is omitted.
  • the drive control unit DR temporarily stops the unmanned air vehicle 100.
  • FIG. 10 is a block diagram illustrating a processing flow in the image processing unit PR according to the second processing mode.
  • In processes P201 and P202, the image processing unit PR independently receives from each of the first imaging device CA1 and the second imaging device CA2 a signal corresponding to the light beam that has passed through the transparent-glass filter element (NO), determines the exposure time from that value, and feeds it back to individually control the exposure time of each solid-state image sensor SR. After imaging with the determined exposure time, the signals output from the photoelectric conversion regions Ia of the first imaging device CA1 and the second imaging device CA2 are stored in the memory MR.
  • In process P203, for each of the first imaging device CA1 and the second imaging device CA2, the image processing unit PR receives from the memory MR a signal corresponding to the red light beam that has passed through the filter element (R), a signal corresponding to the green light beam that has passed through the filter element (G), and a signal corresponding to the blue light beam that has passed through the filter element (B), forms a stereo image of each color, and performs distance measurement to obtain RGB distance measurement data. If the subject includes a reflector, the RGB distance measurement data also includes distance measurement data for the reflector.
  • In process P204, for each of the first imaging device CA1 and the second imaging device CA2, the image processing unit PR receives from the memory MR a signal corresponding to the light beam that has passed through the transparent-glass filter element (NO), a signal corresponding to the light beam that has passed through the filter element (ND1) having the first attenuation factor, and a signal corresponding to the light beam that has passed through the filter element (ND2) having the second attenuation factor, forms a stereo image from the values corresponding to the same type of filter element, and performs distance measurement to obtain ND distance measurement data. If the subject includes a reflector, the ND distance measurement data also includes distance measurement data for the reflector.
  • Alternatively, any two of the filter elements (NO, ND1, ND2) may be selected and the same operation performed on the resulting signals, or any two or all three of the filter elements (NO, ND1, ND2) may be used to create an image with a wide dynamic range, form a stereo image, and perform distance measurement.
  • In process P205, for each of the first imaging device CA1 and the second imaging device CA2, the image processing unit PR receives from the memory MR a signal corresponding to the light beam that has passed through the filter element (PF) and a signal corresponding to the light beam that has passed through the filter element (SF), forms a stereo image from the values corresponding to the same type of filter element, and performs distance measurement to obtain polarization distance measurement data. If the subject includes a reflector, the polarization distance measurement data also includes distance measurement data for the reflector.
  • In process P206, for each of the first imaging device CA1 and the second imaging device CA2, the image processing unit PR receives from the memory MR a signal corresponding to the near-infrared light beam that has passed through the filter element (IR), forms a near-infrared stereo image, and performs distance measurement to obtain near-infrared distance measurement data.
  • In process P207, the image processing unit PR receives from the memory MR a signal corresponding to the light beam that has passed through the filter element (PF) and a signal corresponding to the light beam that has passed through the filter element (SF), obtains the difference between the two signal values, and determines from these values whether the subject contains a reflector. If the difference between the two signal values exceeds a threshold value, reflected light has entered, so the image processing unit PR recognizes the region exceeding the threshold as a reflector. The signal used for this determination may be only the signal from the first image pickup device CA1, only the signal from the second image pickup device CA2, or the signals from both.
  • In process P208, the image processing unit PR obtains the boundary portion of the reflector in the subject and takes the distance measurement value of that boundary from the polarization distance measurement data obtained in process P205 as the distance measurement value of the reflector; in process P209, it creates a distance map of the reflector and obtains three-dimensional information of the subject.
  • the image processing unit PR removes the distance measurement data corresponding to the reflector determined in process P207 from the visible-light distance measurement data obtained in process P203, and in process P210 creates a distance map for visible light and obtains three-dimensional information of the subject.
  • the image processing unit PR receives from the memory MR, for each of the first imaging device CA1 and the second imaging device CA2, a signal corresponding to the light beam that has passed through the transparent-glass filter element (NO), a signal corresponding to the light beam that has passed through the filter element (ND1) having the first attenuation factor, and a signal corresponding to the light beam that has passed through the filter element (ND2) having the second attenuation factor; the image with the highest contrast is selected, and the ND ranging data corresponding to that image is selected from the ND ranging data obtained in process P204.
  • the image processing unit PR removes the distance measurement data corresponding to the reflector determined in process P207 from the selected ND distance measurement data, creates a distance map for backlit scenes, and obtains three-dimensional information of the subject.
  • any two of the filter elements NO, ND1, and ND2 may be selected, and the same operation as described above may be performed on the obtained signals.
  • any two or three of these filter elements may also be used to create an image with a wide dynamic range, form a stereo image, and perform distance measurement.
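Choosing among the (NO), (ND1), and (ND2) channels requires a contrast measure. The publication does not specify one; a common proxy is the spread of pixel values, so the sketch below uses the population standard deviation (an assumption for illustration):

```python
from statistics import pstdev

def pick_highest_contrast(images):
    """Return the key (e.g. 'NO', 'ND1', 'ND2') of the image whose pixel
    values have the largest spread -- a simple proxy for contrast."""
    return max(images, key=lambda k: pstdev(images[k]))

# Toy flattened images: 'NO' is saturated, 'ND2' is too dark, 'ND1' keeps detail.
frames = {
    "NO":  [255, 255, 255, 250, 255],   # mostly blown out -> low spread
    "ND1": [40, 120, 200, 90, 160],     # detail preserved -> high spread
    "ND2": [10, 20, 25, 15, 18],        # underexposed -> low spread
}
print(pick_highest_contrast(frames))  # -> ND1
```

The ranging data associated with the winning channel would then be taken from process P204, as the text describes.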
  • the image processing unit PR removes the distance measurement data corresponding to the reflector determined in process P207 from the near-infrared distance measurement data obtained in process P206, and in process P213 creates a near-infrared distance map and obtains three-dimensional information of the subject.
  • the image processing unit PR receives from the memory MR at least one of a signal corresponding to the red light beam that has passed through the filter element (R), a signal corresponding to the green light beam that has passed through the filter element (G), and a signal corresponding to the blue light beam that has passed through the filter element (B), together with a signal corresponding to the light beam that has passed through the filter element (IR) that transmits near-infrared light, and performs calculations on these signals to identify the subject.
  • the image processing unit PR receives from the memory MR a signal corresponding to the red light beam that has passed through the filter element (R), a signal corresponding to the green light beam that has passed through the filter element (G), a signal corresponding to the blue light beam that has passed through the filter element (B), and a signal corresponding to the near-infrared light beam that has passed through the filter element (IR), forms visible-light and near-infrared stereo images, and performs distance measurement.
  • the image processing unit PR removes the distance measurement data corresponding to the reflector determined in process P207 from the obtained distance measurement data, creates a vegetation distance map, and obtains three-dimensional information of the subject.
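Identifying vegetation from red and near-infrared signals is commonly done with an NDVI-style index, since leaves reflect strongly in the near infrared and weakly in red. The publication does not name a specific formula, so the following is a sketch using the conventional normalized difference; the 0.3 threshold is an arbitrary illustrative value:

```python
def ndvi(ir, red):
    """Normalized difference vegetation index: healthy vegetation reflects
    strongly in near infrared and weakly in red, so NDVI approaches 1."""
    if ir + red == 0:
        return 0.0
    return (ir - red) / (ir + red)

def looks_like_vegetation(ir, red, threshold=0.3):
    # threshold is a hypothetical cut-off chosen only for this example
    return ndvi(ir, red) > threshold

# Illustrative pixel values on a 0-255 scale
print(looks_like_vegetation(ir=180.0, red=40.0))   # leafy area -> True
print(looks_like_vegetation(ir=90.0,  red=100.0))  # asphalt    -> False
```

This is one plausible realization of the "intensity difference or ratio" computation mentioned later in the text, not the patented method itself.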
  • the image processing unit PR superimposes the distance maps obtained in processes P210, P212, P209, P213, and P215.
  • the distance measurement information of the reflector that is missing in each distance map is replaced with the distance measurement information of the boundary portion of the reflector, and the distance map is thereby completed.
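The completion step — filling the missing ranging values of the reflector region with the distance measured at its boundary — can be sketched in one dimension as follows. The preference for the left-hand neighbour is an arbitrary simplification of "boundary portion":

```python
def fill_reflector_gaps(distances):
    """Replace missing (None) ranging values -- the reflector region -- with
    a valid value from the region's boundary.  Takes the nearest measured
    value on the left if one exists, otherwise the nearest on the right."""
    filled = list(distances)
    for i, d in enumerate(filled):
        if d is None:
            left = next((filled[j] for j in range(i - 1, -1, -1)
                         if filled[j] is not None), None)
            right = next((filled[j] for j in range(i + 1, len(filled))
                          if filled[j] is not None), None)
            filled[i] = left if left is not None else right
    return filled

row = [5.0, 5.1, None, None, 4.8, 4.9]   # None marks the mirror-like patch
print(fill_reflector_gaps(row))  # -> [5.0, 5.1, 5.1, 5.1, 4.8, 4.9]
```

A 2-D version would propagate from the reflector contour instead of a row neighbour, but the principle is the same: no pixel is left without distance information.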
  • when an obstacle is detected from the completed distance map, the subject information is output to the drive control unit DR so that the obstacle can be avoided.
  • alternatively, the image processing unit PR may output the subject information to the drive control unit DR so as to avoid the object closest to the unmanned air vehicle 100, based on the distance maps obtained in processes P210, P212, P209, P213, and P215.
  • FIG. 11 is a block diagram showing another example of the processing flow in the image processing unit PR.
  • when the image processing unit PR is to determine the exposure time based on the signals corresponding to the red, green, and blue light beams that have passed through the filter elements (R), (G), and (B), according to a predetermined program or an operator's operation, then in processes P201A and P202A the signal corresponding to the red light beam that has passed through the filter element (R), the signal corresponding to the green light beam that has passed through the filter element (G), and the signal corresponding to the blue light beam that has passed through the filter element (B) are input independently from each of the first imaging device CA1 and the second imaging device CA2, a brightness value is calculated from these values to determine the exposure time, and this is fed back to individually control the exposure time of the solid-state image sensor SR. Processes P203 to P216 are the same as the processing described above.
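One step of such a brightness-based exposure feedback loop might look like the following. The Rec. 601 luma weights and the target brightness of 118 are conventional assumptions introduced for this sketch, not values from the publication:

```python
def next_exposure(current_exposure_s, r, g, b, target_luma=118.0):
    """Compute a luma estimate from the mean R, G, B signal values and scale
    the exposure time toward a target brightness (proportional feedback)."""
    luma = 0.299 * r + 0.587 * g + 0.114 * b   # Rec. 601 weights (assumed)
    if luma <= 0:
        return current_exposure_s * 2.0        # frame is black: open up
    return current_exposure_s * target_luma / luma

# A frame averaging luma 59 at 10 ms asks for roughly 20 ms.
print(next_exposure(0.010, 59.0, 59.0, 59.0))  # -> 0.02
```

Because each imaging device feeds its own loop, the two solid-state image sensors SR can settle on different exposure times, exactly as the text describes.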
  • FIG. 12 is a block diagram showing another example of the processing flow in the image processing unit PR.
  • when the image processing unit PR is to determine the exposure time based on a signal corresponding to the near-infrared light beam that has passed through the filter element (IR), according to a predetermined program or an operator's operation, then in processes P201B and P202B the signal corresponding to the near-infrared light beam that has passed through the filter element (IR) is input independently from each of the first imaging device CA1 and the second imaging device CA2, the exposure time is determined from this value, and this is fed back to individually control the exposure time of the solid-state image sensor SR. Processes P203 to P216 are the same as the processing described above.
  • FIG. 13 is a block diagram illustrating another example of the processing flow in the image processing unit PR.
  • when the image processing unit PR is to determine the exposure time based on at least one of a signal corresponding to the light beam that has passed through the filter element (ND1) having the first attenuation factor and a signal corresponding to the light beam that has passed through the filter element (ND2) having the second attenuation factor, according to a predetermined program or an operator's operation, then in processes P201C and P202C these signals are input independently from each of the first imaging device CA1 and the second imaging device CA2, the exposure time is determined, and this is fed back to individually control the exposure time of the solid-state image sensor SR.
  • processes P203 to P216 are the same as the processing described above.
  • FIG. 14 is a block diagram showing another example of the processing flow in the image processing unit PR.
  • when the image processing unit PR is to determine the exposure time based on a signal corresponding to the light beam that has passed through the filter element (PF) or a signal corresponding to the light beam that has passed through the filter element (SF), according to a predetermined program or an operator's operation, then in processes P201D and P202D the signal corresponding to the light beam that has passed through the filter element (PF) or the filter element (SF) is input independently from each of the first imaging device CA1 and the second imaging device CA2, the exposure time is determined from this value, and this is fed back to individually control the exposure time of the solid-state image sensor SR.
  • processes P203 to P216 are the same as the processing described above.
  • according to the present embodiment described above, providing the first array optical system and the second array optical system makes it possible to reduce size and weight compared with providing these in separate units, and to achieve higher functionality.
  • a longer base line length can be ensured than with an imaging device having a single array optical system, so distance measurement accuracy at long distances can be ensured.
  • with the single-eye optical system IL having the transparent-glass filter element (NO), light in all wavelength regions to which the solid-state imaging element SR is sensitive can be condensed, and with the filter elements (ND1, ND2) having different attenuation factors, the amount of light from backlit or high-luminance subjects can be suppressed.
  • the red, green, and blue filter elements allow an object to be measured in the visible-light region, and a visible image to be displayed to the operator of the unmanned air vehicle 100 can be obtained.
  • furthermore, the subject can be identified from the intensity difference or ratio of these signals.
  • the near-infrared filter element (IR) makes it possible to extract vegetation such as forests and to discriminate overhead lines and the like.
  • a reflector can be detected by detecting a light amount difference using two types of polarizing filter elements (SF, PF) having different transmission axes.
  • in the embodiment described above, the visible-light stereo image is formed using the signals obtained through the red, green, and blue filter elements (R, G, B), but at least one color filter element may be used, and a single-color or two-color stereo image may be formed.
  • the image processing unit PR may first detect whether a reflector exists in the subject and, when a reflector is detected, omit distance measurement for the corresponding area.
  • FIG. 15 is a perspective view of an unmanned air vehicle as a moving body according to another embodiment.
  • the unmanned aerial vehicle 100′ of the present embodiment can capture an image of a structure such as a bridge while flying autonomously.
  • compared with the embodiment shown in FIG. 1, a shaft 106 that can rotate with respect to the main body 101 is provided, and a frame 107 is attached to the upper end of the shaft 106.
  • the frame 107 mounts the high-pixel camera HCA together with the third imaging device CA3 and the fourth imaging device CA4 arranged on both sides of it so that their optical axes are parallel.
  • the third imaging device CA3 and the fourth imaging device CA4 have the same configuration as the first imaging device CA1 and the second imaging device CA2, but are preferably directed in a different direction (for example, by 90 degrees).
  • FIG. 16 is a block diagram showing the configuration of the unmanned air vehicle 100 '.
  • the image signals output from the third imaging device CA3 and the fourth imaging device CA4 are input to the sub-image processing unit PR′ arranged in the main body 101 of the unmanned air vehicle 100′; with the inspection-object distance information acquisition unit PR3 in the sub-image processing unit PR′, the distance to the structure as the inspection object can be acquired.
  • a self-location information acquisition unit PR4, which can acquire self-location information of the unmanned air vehicle 100′ (GPS signals may also be used) by pattern-matching the images captured by the third imaging device CA3 and the fourth imaging device CA4, is also provided in the sub-image processing unit PR′. These outputs are sent to the drive control unit DR.
  • the unmanned air vehicle 100 ′ flies so as to avoid an obstacle ahead in the moving direction based on signals from the first imaging device CA1 and the second imaging device CA2, as in the above-described embodiment.
  • the inspection-object distance information acquisition unit PR3 can acquire the distance to the structure as the inspection object, and the drive control unit DR can perform feedback control so that this distance remains constant; with the signal from the self-position information acquisition unit PR4, it is possible to fly along a predetermined route. Further, during the flight, the structure is photographed by the high-pixel camera HCA, and the image is transmitted to the monitor MT so that the operator can observe and record it at the destination. Since the rest of the configuration is the same as in the above-described embodiment, its description is omitted.
  • when the unmanned air vehicle 100′ approaches an infrastructure structure such as a bridge and images it with the high-pixel camera HCA to obtain an inspection image, it must keep a certain distance so as not to collide with the bridge as the inspection object. Further, there are cases where the moving direction of the unmanned air vehicle 100′ and the direction of the inspection object differ, for example being perpendicular to each other.
  • therefore, the first imaging device CA1 and the second imaging device CA2 are directed forward in the traveling direction of the unmanned air vehicle 100′, and another pair, the third imaging device CA3 and the fourth imaging device CA4, is directed toward the inspection object.
  • in this way, an unmanned air vehicle 100′ capable of autonomous control can be realized that appropriately maintains the distance from the inspection object while detecting obstacles in the traveling direction.
  • when the exposure time (exposure amount) is determined by the light flux that has passed through the transparent-glass filter element, a relatively low-brightness subject can be captured at a high shutter speed, and blurring of moving bodies and subjects can be suppressed.
  • an image formed with a light beam that has passed through the transparent-glass filter element can be an image without crushed blacks.
  • an image formed with a light beam transmitted through another optical filter element is slightly underexposed, so an image without whiteout can be obtained.
  • as a result, ranging accuracy is improved for both low-luminance and high-luminance subjects.
  • when the exposure time (exposure amount) is determined by at least one of the light beams transmitted through the red, green, and blue filter elements, the ranging performance in the visible-light region is improved, and visible image information useful to the operator of the unmanned air vehicle can be transmitted.
  • when the exposure time (exposure amount) is determined by the light beam transmitted through the near-infrared filter element, light reflected by vegetation and the like, for example, can be accurately imaged, and the ability to extract vegetation and the like is improved.
  • when the exposure time (exposure amount) is determined by the light beam that has passed through a polarizing filter element, the object light from reflectors and the like can be accurately captured, and the ability to extract reflectors and the like is improved.
  • when the exposure time (exposure amount) is determined by the light beam that has passed through a filter element that attenuates the incident light, the amount of light from a backlit or high-luminance subject can be suppressed, and as a result whiteout of the image can be avoided.
  • when an image reflected on a reflector such as a water surface, transparent acrylic, or glass is captured, the distance of a virtual object existing farther away than the reflector is detected, and the position of the reflector itself cannot be detected.
  • however, if the reflector is photographed through two or more types of polarizing filter elements having different transmission axes, the reflected light is polarized, so the luminance of the images obtained through the polarizing filter elements with different transmission axes differs at the reflector portion. If such a luminance difference between polarizing filter elements is detected, the region can be identified as a reflector.
  • the combination of polarizing filter elements having different transmission axes may be in the same array optical system or in different array optical systems.
  • since the reflector can be detected in this way, accurate three-dimensional information of the subject can be obtained by not adopting distance information based on the light flux from the reflector, or by not performing distance measurement processing on the light flux from the reflector.
  • however, the distance to the reflector would then remain unknown, so for the non-reflector existing at the boundary of the reflector, distance measurement processing is performed based on signals from an imaging optical system having any one of a plain-glass filter element, filter elements that transmit red light, green light, or blue light, a filter element that transmits near-infrared light, filter elements having different attenuation factors, and polarizing filter elements, or based on a combination of signals from imaging optical systems having different types of filter elements; the distance information obtained there is used as the distance of the reflector, thereby avoiding a situation in which no distance information exists.
  • when distance measurement is limited in this way, the calculation load becomes smaller than measuring distance for all pixel areas, and the processing can be completed in a short time.
  • conversely, if the distance is measured for all areas collectively, the processing configuration of the image processing algorithm can be simplified.
  • alternatively, an optimum distance image can be obtained for each subject by first measuring all subjects at once and then identifying the subject, or obtaining subject information, based on the characteristics of the light flux that has passed through each filter element.
  • by means of a color visible image obtained through the filter elements that transmit red, green, and blue light, effective information necessary for operation can be transmitted to the operator of the unmanned air vehicle. Furthermore, images obtained through the plain-glass filter element or the filter elements that attenuate incident light can be superimposed and displayed, so that even in backlight an image with a wide dynamic range, without whiteout or crushed blacks, can be provided. In addition, if the image obtained through a polarizing filter element is displayed, the position of a reflector that requires attention in obstacle detection can be indicated. These images may be displayed in combination as appropriate.
  • it is obvious to those skilled in the art that the present invention is not limited to the embodiments described in the present specification, and includes other embodiments and modifications based on the embodiments and technical ideas described herein. For example, the filters included in each of the second imaging optical systems of the second imaging device are of the same type as the filters included in the first array optical system of the first imaging device; however, this is not limited to the case where the filters of the second imaging optical systems are completely identical to those of the first imaging optical systems. In addition to having the same filters as the first imaging optical systems, each of the second imaging optical systems of the second imaging device may have an extra filter (a filter that is not in the first imaging optical system but only in the second imaging optical system).
  • the stereo imaging device of the present invention can be mounted not only on an unmanned air vehicle but also on a vehicle.
  • for example, the first imaging device CA1 and the second imaging device CA2 are arranged on both sides of the room mirror RM of the vehicle VH and perform imaging with their optical axes directed toward the front of the vehicle; obstacles can thereby be detected and driving assistance realized.
  • alternatively, the first imaging device CA1 and the second imaging device CA2 may be provided in a headlamp or a bumper.

Abstract

The present invention relates to a stereoscopic imaging device that is compact and lightweight and capable of recognizing a variety of obstacles, and to a mobile body comprising it. Configuring a stereoscopic imaging device from a first imaging device comprising a first array optical system having various optical filters and a second imaging device comprising a second array optical system provides a longer baseline than that of an imaging device comprising a single array optical system, and thereby makes it possible to detect a variety of obstacles from the images passed through the various optical filters while maintaining distance measurement accuracy at long range.
PCT/JP2015/085013 2014-12-26 2015-12-15 Stereoscopic imaging device and mobile body WO2016104235A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2016566133A JPWO2016104235A1 (ja) 2014-12-26 2015-12-15 Stereo imaging device and moving body

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-264014 2014-12-26
JP2014264014 2014-12-26

Publications (1)

Publication Number Publication Date
WO2016104235A1 true WO2016104235A1 (fr) 2016-06-30

Family

ID=56150260

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/085013 WO2016104235A1 (fr) 2014-12-26 2015-12-15 Dispositif d'imagerie stéréoscopique et corps mobile

Country Status (2)

Country Link
JP (1) JPWO2016104235A1 (fr)
WO (1) WO2016104235A1 (fr)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002171430A (ja) * 2000-11-30 2002-06-14 Canon Inc Compound-eye imaging system, imaging apparatus, and electronic device
JP2007127431A (ja) * 2005-11-01 2007-05-24 Fuji Xerox Co Ltd Edge position detection method and edge position detection apparatus
JP2008005488A (ja) * 2006-06-19 2008-01-10 Samsung Electro Mech Co Ltd Camera module
JP2008157851A (ja) * 2006-12-26 2008-07-10 Matsushita Electric Ind Co Ltd Camera module
JP2011164061A (ja) * 2010-02-15 2011-08-25 Ricoh Co Ltd Transparent body detection system
JP2011176710A (ja) * 2010-02-25 2011-09-08 Sharp Corp Imaging apparatus
JP2013044597A (ja) * 2011-08-23 2013-03-04 Canon Inc Image processing apparatus and method, and program
JP2013156109A (ja) * 2012-01-30 2013-08-15 Hitachi Ltd Distance measuring apparatus

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018013949A (ja) * 2016-07-21 2018-01-25 SZ DJI Technology Co., Ltd. Moving body, obstacle detection method for a moving body, and obstacle detection program for a moving body
WO2018188627A1 (fr) * 2017-04-12 2018-10-18 普宙飞行器科技(深圳)有限公司 Omnidirectional obstacle avoidance apparatus, pan-tilt head, pan-tilt head control method, and obstacle avoidance control method
US10776938B2 (en) 2017-05-19 2020-09-15 Waymo Llc Camera systems using filters and exposure times to detect flickering illuminated objects
US11341667B2 (en) 2017-05-19 2022-05-24 Waymo Llc Camera systems using filters and exposure times to detect flickering illuminated objects

Also Published As

Publication number Publication date
JPWO2016104235A1 (ja) 2017-10-05

Similar Documents

Publication Publication Date Title
US10564266B2 (en) Distributed LIDAR with fiber optics and a field of view combiner
US10408940B2 (en) Remote lidar with coherent fiber optic image bundle
KR101951318B1 3D image acquisition apparatus and 3D image acquisition method capable of simultaneously obtaining a color image and a depth image
JP6878219B2 Image processing device and distance measuring device
JP2022001882A Distance measuring device and moving body
JP2018160228A Route generation device, route control system, and route generation method
KR102119289B1 Sample inspection and verification system and method
US20140009611A1 (en) Camera System and Method for Observing Objects at Great Distances, in Particular for Monitoring Target Objects at Night, in Mist, Dust or Rain
US11781913B2 (en) Polarimetric imaging camera
TWI781109B (zh) 立體三角測量的系統和方法
WO2016104235A1 Stereoscopic imaging device and mobile body
KR101545971B1 Composite image sensing system
JP6971933B2 Image processing device and imaging device
US20180098053A1 (en) Imaging device, endoscope apparatus, and imaging method
US11172108B2 (en) Imaging device
WO2016039053A1 Surveying device
JP6756898B2 Distance measuring device, head-mounted display device, portable information terminal, video display device, and surroundings monitoring system
JP6202364B2 Stereo camera and moving body
US11422264B2 (en) Optical remote sensing
US10075646B2 (en) Sensor systems and methods
JP6847891B2 Image processing device, imaging device, and method
JP2006024986A Multiband imaging device
KR102613150B1 High-resolution LiDAR system using LCoS
US20100253802A1 (en) Enhanced microscan apparatus and methods
KR20230124498A Test equipment and method for testing stereo cameras

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15872804

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2016566133

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15872804

Country of ref document: EP

Kind code of ref document: A1