WO2015198851A1 - Distance measuring device and method

Distance measuring device and method

Info

Publication number
WO2015198851A1
WO2015198851A1 (PCT/JP2015/066553)
Authority
WO
WIPO (PCT)
Prior art keywords
group
photoelectric conversion
optical systems
subject
value
Prior art date
Application number
PCT/JP2015/066553
Other languages
English (en)
Japanese (ja)
Inventor
壮功 北田
Original Assignee
Konica Minolta, Inc. (コニカミノルタ株式会社)
Priority date
Filing date
Publication date
Application filed by Konica Minolta, Inc. (コニカミノルタ株式会社)
Publication of WO2015198851A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00: Measuring distances in line of sight; Optical rangefinders
    • G01C3/02: Details
    • G01C3/06: Use of electric means to obtain final indication
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00: Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28: Systems for automatic generation of focusing signals
    • G: PHYSICS
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00: Special procedures for taking photographs; Apparatus therefor

Definitions

  • The present invention relates to a distance measuring apparatus and a distance measuring method that use a small, thin solid-state imaging device such as a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
  • Conventionally, a technique is known for measuring the distance to an object using a stereo camera or a multi-lens array camera.
  • In such a technique, corresponding points are searched for among the plurality of captured images, and the distance is measured by the principle of triangulation using the parallax calculated from those corresponding points.
  • In some cases, an image suitable for distance measurement cannot be acquired because of specular reflection occurring on the surface of glass, water, plastic, or the like. This is because, for example, when the subject is on the far side of glass and the specularly reflected light is strong, the diffusely reflected light from the subject cannot be properly separated and received.
  • Patent Document 1 exploits the fact that, when a polarizing filter portion having a plurality of different polarization angles is mounted and the subject has a specular reflection portion, the high-luminance portion changes depending on the polarization angle of the polarizing filter.
  • It discloses an imaging module that performs distance estimation using image data captured at the optimum polarization angle.
  • In Example 1 of Patent Document 1, when a plurality of images are captured simultaneously through polarizing filters with different polarization angles, a difference value from a reference image is obtained for each pixel and these difference values are summed; the polarization angle at which this sum changes from monotonically increasing to monotonically decreasing is taken to be the optimum polarization angle, and the ranging image data are obtained using the polarizing filter having that optimum polarization angle.
  • However, in Example 1 of Patent Document 1, because the difference value is calculated for each pixel and then integrated, the magnitude of the reflected light can only be judged for the screen as a whole. When, for example, there are reflecting surfaces with different angles, the intensity of the reflected light varies from surface to surface depending on the polarization angle of the polarizing filter, so there is a risk of erroneous determination. In addition, because the multi-lens camera takes differences in units of pixels even though parallax is present, luminance values of completely different areas may be compared, for example for a short-distance subject, and the ranging accuracy is poor.
  • In Example 2 of Patent Document 1, a block of 50 × 50 pixels is defined; a block is treated as an effective block if the luminance difference within the block is greater than or equal to a threshold, and as an invalid block if the luminance difference is smaller than the threshold, and an image having many invalid blocks is thereby identified.
  • The present invention has been made in view of such problems, and an object of the present invention is to provide a distance measuring device and a distance measuring method capable of accurately measuring the distance, with simple processing, even for a subject having specular reflection.
  • A distance measuring device reflecting one aspect of the present invention comprises the following.
  • a first group of single-eye optical systems (A1, A2, ..., An) having mutually different optical axes;
  • a second group of single-eye optical systems (B1, B2, ..., Bn), equal in number (n) to the first group, shifted in a direction perpendicular to the optical axes and having mutually different optical axes;
  • a solid-state imaging element including a first group of photoelectric conversion regions (C1, C2, ..., Cn), on each of which a subject image is formed by the corresponding single-eye optical system of the first group (A1, A2, ..., An), and a second group of photoelectric conversion regions (D1, D2, ..., Dn), on each of which a subject image is formed by the corresponding single-eye optical system of the second group (B1, B2, ..., Bn);
  • a first group of polarizing filters (E1, E2, ..., En) with mutually different polarization phase angles, arranged in the optical paths that pass through the first group of single-eye optical systems (A1, A2, ..., An) and reach the first group of photoelectric conversion regions (C1, C2, ..., Cn);
  • a second group of polarizing filters (F1, F2, ..., Fn), having the same polarization phase angles as the first group of polarizing filters (E1, E2, ..., En), arranged in the optical paths that pass through the second group of single-eye optical systems (B1, B2, ..., Bn) and reach the second group of photoelectric conversion regions (D1, D2, ..., Dn); and
  • a processing unit that processes the image signals output from the photoelectric conversion regions.
  • The processing unit divides each of the first group of photoelectric conversion regions (C1, C2, ..., Cn), on which subject light is incident through the first group of single-eye optical systems (A1, A2, ..., An), into a plurality of imaging areas, calculates an index value relating to the luminance value according to the subject light incident on each imaging area, and obtains the maximum value and the minimum value of the index value by comparing the imaging areas that correspond to one another between the photoelectric conversion regions.
  • An imaging area in which the difference between the maximum value and the minimum value is larger than a predetermined value is recognized as a specific imaging area; the sum of the pixel values in the specific imaging areas is obtained for each photoelectric conversion region, and the photoelectric conversion region (Ck) for which this sum is smallest is selected. The distance to the subject is then measured based on the image signal output from the selected photoelectric conversion region (Ck) of the first group and the image signal output from the photoelectric conversion region (Dk) on which the subject light is incident through the polarizing filter (Fk) of the second group having the same polarization phase angle as the polarizing filter (Ek) used for imaging onto the selected photoelectric conversion region (Ck).
  • A distance measuring method reflecting one aspect of the present invention is a distance measuring method using a device that comprises: a first group of single-eye optical systems (A1, A2, ..., An) having mutually different optical axes; a second group of single-eye optical systems (B1, B2, ..., Bn), equal in number (n) to the first group, shifted in a direction perpendicular to the optical axes and having mutually different optical axes; a solid-state imaging element including a first group of photoelectric conversion regions (C1, C2, ..., Cn), on each of which a subject image is formed by the first group of single-eye optical systems, and a second group of photoelectric conversion regions (D1, D2, ..., Dn), on each of which a subject image is formed by the second group of single-eye optical systems; a first group of polarizing filters (E1, E2, ..., En) with mutually different polarization phase angles arranged in the optical paths of the first group; and a second group of polarizing filters (F1, F2, ..., Fn) with the same polarization phase angles arranged in the optical paths of the second group.
  • In the method, each of the first group of photoelectric conversion regions (C1, C2, ..., Cn), on which subject light is incident through the first group of single-eye optical systems (A1, A2, ..., An), is divided into a plurality of imaging areas; an index value relating to the luminance value according to the subject light incident on each imaging area is calculated; the maximum value and the minimum value of the index value are obtained by comparing the imaging areas that correspond to one another between the photoelectric conversion regions; an imaging area in which the difference between the maximum value and the minimum value is larger than a predetermined value is identified as a specific imaging area; the sum of the pixel values in the specific imaging areas is obtained for each photoelectric conversion region; and the photoelectric conversion region (Ck) for which this sum is smallest is selected.
  • The distance to the subject is then measured based on the image signal output from the selected photoelectric conversion region (Ck) and the image signal output from the photoelectric conversion region (Dk) on which the subject light is incident through the polarizing filter (Fk) of the second group having the same polarization phase angle as the polarizing filter (Ek) used for imaging onto Ck.
  • In both aspects, an imaging area in which the difference between the maximum value and the minimum value is larger than the predetermined value is recognized as a specific imaging area, the sum of the pixel values in the specific imaging areas is obtained, the photoelectric conversion region (Ck) with the smallest sum is selected, and the distance to the subject is measured from the image signal of the selected region (Ck) and the image signal of the paired region (Dk) captured through the polarizing filter (Fk) having the same polarization phase angle as the filter (Ek) used for Ck.
  • Therefore, even when the subject includes reflecting surfaces with different angles, each imaging area that receives strong reflected light is recognized as a specific imaging area and the difference value can be evaluated for each reflecting surface, which avoids the influence of specular reflection as much as possible and enables accurate ranging.
  • According to the present invention, it is possible to provide a distance measuring device and a distance measuring method capable of accurately measuring the distance, with simple processing, even for a subject having specular reflection.
  • Among the drawings are a perspective view of an automobile as an example of a subject having specular reflection surfaces with different polarization angles, and a graph showing the luminance value of the image on the vertical axis against the phase angle on the horizontal axis.
  • FIG. 1 is a perspective view showing a part of the distance measuring apparatus DM according to the present embodiment in an exploded state.
  • The distance measuring device DM includes a lens unit LU, a solid-state image sensor SR on which light from the distance measurement object (also referred to as subject light) is imaged via the lens unit LU, and a lens frame HLD that holds the lens unit LU.
  • The lens unit LU is formed by laminating one or more (two in this embodiment) array lenses LA, each formed by integrally molding lenses of the same shape arranged in 3 rows and 4 columns.
  • Each lens constitutes a single-eye optical system.
  • The single-eye optical systems are divided into two groups of 3 rows and 2 columns; the first group of single-eye optical systems is denoted A1 to A6, and the second group B1 to B6.
  • The lens frame HLD has a casing shape, and the lens unit LU is assembled into and bonded to its inside.
  • The lens frame HLD has an object-side wall HLDa on the object side; the object-side wall HLDa has openings HLDb arranged in 3 rows and 4 columns corresponding to the single-eye optical systems, and a polarizing filter is arranged in each opening HLDb.
  • FIG. 2 is a view of the object-side wall HLDa of the lens frame HLD as seen from the object side, in which the phase angle (polarization phase angle) of each polarizing filter is shown schematically.
  • The polarizing filters are divided into two groups of 3 rows and 2 columns; the first group of polarizing filters is designated E1 to E6, and the second group F1 to F6.
  • The phase angles of the polarizing filters within the same group are shifted from one another by 30°, while between the groups the phase angles of the paired polarizing filters (Ek, Fk) are the same. Specifically, the six phase angles are 0°, 30°, 60°, 90°, 120°, and 150°.
  • The solid-state imaging device SR has an imaging surface SS. Although there is no physical separation on the imaging surface SS, it is assumed for convenience that photoelectric conversion regions are formed in 3 rows and 4 columns and that image signals can be extracted independently from each photoelectric conversion region. Physically independent photoelectric conversion regions may also be used.
  • The photoelectric conversion regions are likewise divided into two groups of 3 rows and 2 columns; the first group of photoelectric conversion regions is C1 to C6, and the second group D1 to D6.
  • For the first group, the subject light passes through the polarizing filters E1 to E6 and is imaged onto the photoelectric conversion regions C1 to C6 by the single-eye optical systems A1 to A6, respectively.
  • For the second group, the subject light passes through the polarizing filters F1 to F6 and is imaged onto the photoelectric conversion regions D1 to D6 by the single-eye optical systems B1 to B6, respectively.
  • Each polarizing filter may be disposed on the object side with respect to the corresponding single-eye optical system.
  • FIG. 3 is a block diagram of the distance measuring apparatus DM according to the present embodiment.
  • The distance measuring device DM further includes a drive circuit DR that drives the solid-state image sensor SR, an image selection unit IS that receives the signals from the solid-state image sensor SR, a distance measuring unit MS that measures the distance to the object to be measured, and a system control unit SCON that controls these.
  • The system control unit SCON performs control operations according to a program stored in the RAM.
  • The image selection unit IS and the distance measuring unit MS constitute the processing unit.
  • Next, the operation of this embodiment will be described.
  • Specular reflection light is reflected light generated on the surface of an object. If the specular reflection light is stronger than the diffuse reflection light emitted directly from the distance measurement object, there is a strong possibility that the distance to the object cannot be measured with high accuracy. Therefore, in order to measure the distance with high accuracy, some means of eliminating the specular reflection light is needed.
  • Specular reflection light is polarized, so it can be completely removed, or its component at least suppressed, by a polarizing filter.
  • To remove it, a polarizing filter whose polarization angle (phase angle) is perpendicular to the polarization direction of the specular reflection light is used. For example, to remove specularly reflected light polarized in the horizontal direction, the light may be passed through a polarizing filter with a vertical phase angle. Since the polarization direction of the specular reflection light varies depending on the subject, it is desirable to select the phase angle of the polarizing filter according to the subject.
  • As described above, the specular reflection component is polarized because of the nature of light, so it can be removed by a polarizing filter.
  • However, how much of the polarized light is removed depends on the polarization angle of the specular reflection light and the phase angle of the polarizing filter.
  • When the light from the subject is passed through a polarizing filter inclined by an angle θ with respect to the polarization direction of the specular reflection component, the light intensity I1 of the transmitted specular component is given by the following expression (2), where I_S is the light intensity of the specular reflection component:
  • I1 = I_S · cos² θ ... (2)
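  • Expression (2) is Malus's law. The short sketch below (Python; the function name, the 30° filter spacing taken from this embodiment, and the example numbers are illustrative assumptions rather than part of the patent) simply evaluates how much of the specular component each candidate filter angle would transmit:

```python
import math

def transmitted_specular(i_s: float, theta_deg: float) -> float:
    """Expression (2): I1 = I_S * cos^2(theta), where theta is the angle between
    the filter's phase angle and the polarization direction of the specular light."""
    theta = math.radians(theta_deg)
    return i_s * math.cos(theta) ** 2

# Specular light assumed polarized at 0 deg; filters spaced 30 deg apart as in FIG. 2.
for phase in range(0, 180, 30):
    print(phase, round(transmitted_specular(100.0, phase), 1))
# The filter oriented 90 deg from the polarization direction ideally transmits none of it.
```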
  • In practice, the polarization angle of the specular reflection light contained in the light from the subject is generally unknown, so it is not obvious to what value the inclination angle θ should be set. Therefore, in the distance measuring device DM according to this embodiment, the subject light is passed through polarizing filters with a plurality of phase angles to obtain a plurality of images with different specular reflection light intensities, and the image with the least specular reflection light is selected. By using the image from which the specular reflection light has been removed, it becomes possible, for example, to accurately measure the distance to a subject on the far side of glass.
  • FIG. 4B is a simple illustration of the subject of FIG. 4A.
  • A rubber mat GM is laid on a table TL placed on the floor FL, and an object DS in which water is stored is placed on the mat.
  • FIG. 5 is a flowchart showing a distance measuring method by the distance measuring device DM.
  • First, imaging is performed in step S101 of FIG. 5.
  • The light from the subject, including the specular reflection light from the water surface, passes through the first group of polarizing filters (E1 to E6) and is imaged onto the photoelectric conversion regions (C1 to C6) of the solid-state imaging device SR via the single-eye optical systems (A1 to A6) of the array lens LA.
  • The formed subject images are photoelectrically converted in the respective photoelectric conversion regions (C1 to C6) into image signals, which are input to the image selection unit IS.
  • Similarly, for the second group, the light from the subject passes through the polarizing filters (F1 to F6) and is imaged onto the photoelectric conversion regions (D1 to D6) of the solid-state imaging device SR via the single-eye optical systems (B1 to B6) of the array lens LA.
  • FIG. 6 shows an example of the images formed on all of the photoelectric conversion regions, based on the output image signals; however, it is not necessary to output image signals from all of the photoelectric conversion regions at this point, and the first group alone is sufficient.
  • Apart from parallax, the imaging conditions of the images shown in FIG. 6 differ only in the phase angle of the polarizing filter.
  • The arrangement of the images in FIG. 6 corresponds to the arrangement of the polarizing filters in FIG. 2.
  • The light intensity of the subject light appears as the luminance value of the image.
  • Because the specular reflection component from the water surface is polarized, the light intensity obtained varies depending on the phase angle of the polarizing filters (E1 to E6, F1 to F6), as can be seen from expression (2) above.
  • Because the diffuse reflection component from the subject is not polarized, its light intensity remains the same even when the phase angle of the polarizing filter is changed. Therefore, the luminance value changes between the plurality of images shown in FIG. 6 because the amount of the specular reflection component passing through each polarizing filter changes.
  • FIG. 7 is a graph in which the vertical axis indicates the luminance value of the image and the horizontal axis indicates the phase angle; the floor, the table, the water surface, and the ridge in the image (FIG. 4B) are plotted separately.
  • The water surface is a region with much specular reflection light.
  • The side surfaces of the floor and the ridge are regions with little specular reflection light.
  • Since the image obtained from the subject light that has passed through the polarizing filters (E1, F1) with a phase angle of 90° has the lowest luminance, the specular reflection component is suppressed the most in that image.
  • In this embodiment, the following method is employed when comparing the luminance values of the subject.
  • Because imaging is performed through single-eye optical systems with different optical axes, parallax occurs between the plurality of images. Therefore, when a two-dimensional coordinate system with its origin at the center is defined for each photoelectric conversion region, pixels at the same coordinates in different photoelectric conversion regions inherently image different parts of the subject, unless the subject is infinitely distant.
  • Consequently, comparing image signals pixel by pixel means comparing image signals of different parts of the subject, which lowers the reliability of the comparison.
  • In addition, pixel-by-pixel comparison is easily affected by noise. Therefore, in the present embodiment, the photoelectric conversion regions (C1 to C6) are divided into imaging areas each composed of a plurality of pixels, and the luminance values of those imaging areas are compared with one another.
  • As for the division size, it is preferable to divide into imaging areas of, for example, 3 × 3 pixels or 5 × 5 pixels, but the division size is not limited to these.
  • Specifically, in step S102 of FIG. 5, the image selection unit IS divides each photoelectric conversion region into imaging areas each composed of a plurality of pixels. Such a configuration is suitable when the parallax amounts to several pixels.
  • Next, the image selection unit IS obtains, for each of the photoelectric conversion regions (C1 to C6), the average luminance value of each divided imaging area as an index value relating to the luminance value.
  • FIG. 8 shows an example of an image in which the luminance values have been averaged in this way. Note that the integrated value of the luminance values of each divided imaging area may also be used as the index value relating to the luminance value.
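  • As a rough illustration of the division into imaging areas and the per-area averaging (a minimal sketch only; the use of NumPy, the function name, and the handling of incomplete edge blocks are assumptions, not the patented implementation):

```python
import numpy as np

def block_mean_luminance(image: np.ndarray, block: int = 3) -> np.ndarray:
    """Divide a 2-D luminance image into block x block imaging areas and
    return the average luminance of each area."""
    h, w = image.shape
    h_crop, w_crop = h - h % block, w - w % block  # drop incomplete edge blocks
    tiles = image[:h_crop, :w_crop].astype(np.float64)
    # reshape so every block x block tile becomes one cell, then average it
    tiles = tiles.reshape(h_crop // block, block, w_crop // block, block)
    return tiles.mean(axis=(1, 3))
```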
  • In step S104, the image selection unit IS extracts the maximum value and the minimum value of the average luminance value for each corresponding imaging area across the photoelectric conversion regions (C1 to C6). That is, each divided imaging area of the photoelectric conversion region C1 is associated with the imaging area at the same coordinates in the photoelectric conversion region C2, and so on; the imaging areas at the same coordinates in the photoelectric conversion regions (C1 to C6) are compared with one another to extract the maximum and minimum average luminance values.
  • In step S105, the image selection unit IS obtains a difference value by taking the difference between the extracted maximum and minimum average luminance values.
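  • Steps S104 and S105 can be pictured as an element-wise maximum/minimum over the stack of per-area average luminance maps of C1 to C6 (again only a hedged sketch; the stacked-array representation is an assumption):

```python
import numpy as np

def difference_map(block_means: list[np.ndarray]) -> np.ndarray:
    """Given the per-area average luminance maps of C1..C6 (all the same shape),
    return, for every imaging area, the difference between the maximum and the
    minimum average luminance across the photoelectric conversion regions."""
    stack = np.stack(block_means)  # shape: (number of regions, rows, cols)
    return stack.max(axis=0) - stack.min(axis=0)
```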
  • In step S106, the image selection unit IS evaluates the difference value of each imaging area.
  • Specular reflection light from glass or the like generally has a large luminance value. Using this fact, it is determined that specular reflection light is incident on an imaging area in which the luminance difference between the photoelectric conversion regions (C1 to C6) is large.
  • If the difference value of an imaging area is equal to or greater than a predetermined value (here, 100), that imaging area is recognized as a specific imaging area and is used as a comparison target between the photoelectric conversion regions.
  • In step S107, the image selection unit IS determines that specular reflection light is not incident on any imaging area whose difference value is less than the predetermined value (100) and excludes such areas from the comparison.
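  • The selection of the specific imaging areas in steps S106 and S107 then amounts to a simple threshold on that difference map (a sketch; the predetermined value of 100 is taken from the text, while the Boolean-mask representation is an assumption):

```python
import numpy as np

def specific_area_mask(diff_map: np.ndarray, threshold: float = 100.0) -> np.ndarray:
    """Imaging areas whose max-min difference reaches the predetermined value are
    kept as specific imaging areas; the rest are excluded from the comparison."""
    return diff_map >= threshold
```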
  • FIG. 9 shows an example of an image in which the imaging areas excluded from the comparison are represented with a luminance value of zero, so that only the luminance values of the specific imaging areas are shown.
  • The image selection unit IS may also use another criterion, or an auxiliary one, when identifying the specific imaging areas.
  • As shown in FIG. 7, when the change in the luminance value with the phase angle of the polarizing filter is measured, the imaging areas can be divided into areas whose luminance value changes along a cosine curve, areas whose luminance value does not change, and areas whose luminance value changes randomly. In an imaging area on which a large amount of specular reflection light is incident, the luminance value tends to change along a cosine curve; otherwise there is no such tendency.
  • Therefore, the image selection unit IS can select imaging areas whose luminance value changes along a cosine curve as comparison targets and exclude imaging areas whose luminance value does not.
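  • One way to implement this auxiliary criterion (a sketch under assumptions: least-squares fitting and the acceptance ratio are choices made here, not taken from the patent) follows from expression (2), which implies that the luminance of a specular area varies as a + b·cos 2θ + c·sin 2θ over the filter phase angle θ:

```python
import numpy as np

def follows_cosine_curve(luminance: np.ndarray, phase_deg: np.ndarray,
                         min_ratio: float = 3.0) -> bool:
    """Fit L(theta) ~ a + b*cos(2*theta) + c*sin(2*theta) by least squares and
    accept the imaging area only if the fitted cosine amplitude clearly exceeds
    the residual scatter (min_ratio is an assumed tuning knob)."""
    t = np.radians(phase_deg)
    design = np.column_stack([np.ones_like(t), np.cos(2 * t), np.sin(2 * t)])
    coeffs, *_ = np.linalg.lstsq(design, luminance, rcond=None)
    amplitude = np.hypot(coeffs[1], coeffs[2])
    residual = np.sqrt(np.mean((luminance - design @ coeffs) ** 2))
    return bool(amplitude > min_ratio * max(residual, 1e-6))
```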
  • In step S108, the image selection unit IS sums, for each photoelectric conversion region (C1 to C6), the pixel values of the specific imaging areas (a pixel value being the output value of a pixel, including a luminance value obtained by processing that output), and in step S109 the photoelectric conversion region in which this sum is minimum (here, C1) is selected.
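  • Steps S108 and S109 can be sketched as follows (for simplicity the sums are taken over the per-area average luminance maps rather than over raw pixel values; that simplification, like the function name, is an assumption):

```python
import numpy as np

def select_min_reflection_region(block_means: list[np.ndarray],
                                 specific_mask: np.ndarray) -> int:
    """For each photoelectric conversion region C1..Cn, sum the luminance over
    the specific imaging areas only, and return the index k of the region Ck
    with the smallest sum, i.e. the image in which the specular component is
    most strongly suppressed."""
    sums = [float(m[specific_mask].sum()) for m in block_means]
    return int(np.argmin(sums))
```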
  • Then, the distance measuring unit MS performs the distance measurement process.
  • The parallax between the paired images is calculated by general template matching (SSD, SAD, etc.), and the two images are associated based on the calculated parallax. If the focal length, image center, and lens distortion coefficients of each single-eye optical system, as well as the positional relationship (translation and rotation) between the photoelectric conversion regions (C1, D1), are obtained in advance, the distance to the object can be measured from these coefficients and the parallax value by the principle of triangulation. These coefficients may be obtained by a general stereo camera calibration method (for example, Zhang's method).
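  • The matching and triangulation step might look like the following sketch (a rectified image pair, a 1-D SAD search, and the pinhole relation Z = f·B/d are assumed; the real device would also apply the calibrated distortion and rotation parameters mentioned above):

```python
import numpy as np

def sad_disparity(ref_block: np.ndarray, target_strip: np.ndarray, max_disp: int) -> int:
    """1-D SAD search along the (rectified) epipolar line: return the horizontal
    shift of ref_block within target_strip with the smallest sum of absolute
    differences.  target_strip must be at least max_disp - 1 pixels wider than
    ref_block."""
    h, w = ref_block.shape
    scores = [np.abs(ref_block.astype(np.float64) -
                     target_strip[:, d:d + w].astype(np.float64)).sum()
              for d in range(max_disp)]
    return int(np.argmin(scores))

def distance_from_disparity(disparity_px: float, focal_px: float, baseline: float) -> float:
    """Triangulation for a rectified pair: Z = f * B / d, with the focal length in
    pixels and the baseline equal to the spacing between the optical axes of the
    paired single-eye systems Ak and Bk."""
    return focal_px * baseline / max(disparity_px, 1e-6)
```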
  • Because the image selection unit IS can select the image with the least specular reflection component and thereby avoid its influence, the distance measuring unit MS can perform highly accurate distance measurement by searching for corresponding points between the paired images, even for an object in the water.
  • The distance measuring device of the present embodiment can also be used as an imaging device.
  • In that case, for the area with strong specular reflection, the image signal of the imaging area captured through the polarizing filter that minimizes the specular reflection component is used, while for the other parts of the subject the image signals of the imaging areas captured through all of the polarizing filters are used; if these image signals are synthesized by super-resolution processing, a high-resolution image can be obtained while suppressing the reflected light from the water surface.
  • Depending on the subject, the polarization angles of the specular reflection components may differ from one surface to another.
  • For example, in the case of the automobile described above, the polarization angles of the specular reflection components reflected from the windshield FS and the side glass GS may be different.
  • In that case, two types of cosine-curve-like changes in the luminance value are obtained in the graph of luminance value versus phase angle.
  • In this case, the specific imaging areas in which the difference value is equal to or greater than the predetermined value and in which the average luminance value takes its minimum are classified according to the phase angle of the polarizing filter, for example as follows.
  • When polarizing filters with phase angles of 0°, 30°, 60°, 90°, 120°, and 150° are used, the numbers of specific imaging areas in which the difference value is equal to or greater than the predetermined value and the average luminance value is the minimum are 30, 10, 8, 40, 7, and 5, respectively.
  • For two of the phase angles the number of specific imaging areas is 30 or more, so it can be estimated that there are two specular reflection surfaces with different polarization angles. For the other phase angles, the number of specific imaging areas is small, so they are assumed to be noise; even if they are not noise, their influence on the distance measurement is considered to be small.
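  • A counting procedure of this kind could be sketched as below (the histogram-style tally and the data layout are assumptions made for illustration): for every specific imaging area, the phase angle that gives the minimum average luminance is recorded, and phase angles that win many areas indicate separate specular surfaces.

```python
import numpy as np
from collections import Counter

def count_minima_per_phase(block_means: list[np.ndarray], specific_mask: np.ndarray,
                           phases_deg: list[int]) -> Counter:
    """For each specific imaging area, find which filter phase angle gives the
    minimum average luminance, then count how many areas each phase angle wins.
    Large counts for two different angles suggest two specular surfaces with
    different polarization angles, as in the automobile example."""
    stack = np.stack(block_means)        # (number of filters, rows, cols)
    best = np.argmin(stack, axis=0)      # index of the minimizing filter per area
    return Counter(phases_deg[i] for i in best[specific_mask])
```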
  • In this way, for each specular reflection surface an image in which its specular reflection component is minimized can be identified, and the distance measurement can be performed using that image.
  • Specifically, the subject light that passed through the polarizing filter of one of the two dominant phase angles and was imaged onto imaging areas in which the difference value is equal to or greater than the predetermined value and the average luminance value is the minimum is extracted as a first image signal corresponding to the windshield FS.
  • Likewise, the subject light that passed through the polarizing filter with a phase angle of 90° and was imaged onto imaging areas in which the difference value is equal to or greater than the predetermined value and the average luminance value is the minimum is extracted as a second image signal corresponding to the side glass GS, and the remaining subject light is imaged onto the corresponding imaging areas after passing through the polarizing filters of all phase angles.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

In the present invention, each of a first group of photoelectric conversion regions (C1 ... Cn) exposed to subject light from a first group of single-eye optical systems (A1 ... An) is divided into a plurality of imaging areas; index values are calculated for the luminance values corresponding to the subject light incident on the imaging areas; imaging areas for which the difference between the maximum and minimum values, determined by comparing the corresponding imaging areas of the photoelectric conversion regions, is greater than a prescribed value are recognized as specific imaging areas; the pixel values in those areas are summed; the photoelectric conversion region (Ck) having the smallest sum is selected; and the distance to a subject is measured on the basis of the image signal output by the selected photoelectric conversion region (Ck) of the first group and the image signal output by the photoelectric conversion region (Dk) exposed to the subject light that has passed through a polarizing filter (Fk) of a second group having the same polarization phase angle as that of the polarizing filter (Ek) used for imaging onto the photoelectric conversion region (Ck). Even when there are reflecting surfaces with different angles, difference values can be determined for each reflecting surface, and the influence of specular reflection can be avoided as far as possible.
PCT/JP2015/066553 2014-06-23 2015-06-09 Distance measuring device and method WO2015198851A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-127914 2014-06-23
JP2014127914 2014-06-23

Publications (1)

Publication Number Publication Date
WO2015198851A1 true WO2015198851A1 (fr) 2015-12-30

Family

ID=54937945

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/066553 WO2015198851A1 (fr) 2014-06-23 2015-06-09 Distance measuring device and method

Country Status (1)

Country Link
WO (1) WO2015198851A1 (fr)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001174694A (ja) * 1999-12-21 2001-06-29 Olympus Optical Co Ltd 測距装置
US20060055811A1 (en) * 2004-09-14 2006-03-16 Frtiz Bernard S Imaging system having modules with adaptive optical elements
JP2008016918A (ja) * 2006-07-03 2008-01-24 Matsushita Electric Ind Co Ltd 画像処理装置、画像処理システムおよび画像処理方法
JP2010109562A (ja) * 2008-10-29 2010-05-13 Panasonic Corp 撮像装置
JP2010243463A (ja) * 2009-04-10 2010-10-28 Ricoh Co Ltd ステレオカメラ装置及び車外監視装置
JP2011085539A (ja) * 2009-10-19 2011-04-28 Ricoh Co Ltd 測距カメラ装置
JP2012247356A (ja) * 2011-05-30 2012-12-13 Canon Inc 撮像モジュール、撮像装置、画像処理装置及び画像処理方法。
JP2013044597A (ja) * 2011-08-23 2013-03-04 Canon Inc 画像処理装置および方法、プログラム

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020071253A1 (fr) * 2018-10-03 2020-04-09 富士フイルム株式会社 Dispositif d'imagerie
CN112805992A (zh) * 2018-10-03 2021-05-14 富士胶片株式会社 摄像装置
JPWO2020071253A1 (ja) * 2018-10-03 2021-09-09 富士フイルム株式会社 撮像装置
US11457202B2 (en) 2018-10-03 2022-09-27 Fujifilm Corporation Imaging device
JP7169363B2 (ja) 2018-10-03 2022-11-10 富士フイルム株式会社 撮像装置
CN112805992B (zh) * 2018-10-03 2024-04-23 富士胶片株式会社 摄像装置


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15812719

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 15812719

Country of ref document: EP

Kind code of ref document: A1