WO2015182225A1 - Imaging device - Google Patents

Imaging device

Info

Publication number
WO2015182225A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
polarization
pixel
imaging
degree
Prior art date
Application number
PCT/JP2015/058339
Other languages
English (en)
Japanese (ja)
Inventor
敏嗣 山本
Original Assignee
コニカミノルタ株式会社
Priority date
Filing date
Publication date
Application filed by コニカミノルタ株式会社 filed Critical コニカミノルタ株式会社
Publication of WO2015182225A1

Classifications

    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 11/00 Filters or other obturators specially adapted for photographic purposes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00 General purpose image data processing

Definitions

  • The present invention relates to an imaging device that acquires an image of a subject by imaging the subject, and more particularly to an imaging device that acquires a plurality of images of the subject with mutually different polarization characteristics.
  • An imaging apparatus is known that acquires a digital image of a subject by using a solid-state imaging device such as a CCD (Charge Coupled Device) type image sensor or a CMOS (Complementary Metal Oxide Semiconductor) type image sensor.
  • Such imaging apparatuses include, for example, an imaging device for personal use such as a compact camera or a single-lens reflex camera, an imaging device mounted on a so-called smartphone (high-function mobile phone) or a portable personal computer, an imaging device mounted on a moving body such as a vehicle or a robot for monitoring its surroundings, a monitoring imaging device for monitoring a predetermined space such as a production line or a place in a town, and the like.
  • There is also an imaging device that acquires a plurality of images of a subject with mutually different polarization characteristics and generates a composite image of the subject by combining the acquired plurality of images.
  • With images having mutually different polarization characteristics, for example, a polarization angle image focusing on the polarization angle makes it possible to estimate the direction and the time even from a part of the sky where the sun is not visible.
  • A polarization degree image focusing on the degree of polarization makes it possible to distinguish subjects that appear identical in terms of brightness, based on differences in the surface condition, the light source, the angle of the subject surface, and the positional relationship with the imaging device.
  • One structure for imaging a subject with different polarization characteristics is disclosed, for example, in Patent Document 1.
  • The imaging apparatus disclosed in Patent Document 1 includes an illumination light source of linearly polarized light, a photographing apparatus in which a linearly polarizing element (filter) is mounted in front of a lens or an imaging element, a matte black light-shielding box that houses the subject, and a subject stand on which the subject is placed. The subject is irradiated with linearly polarized light from the illumination light source, and the subject is photographed while the plane of polarization is changed by rotating the linearly polarizing element mounted on the photographing apparatus, whereby images having mutually different polarization characteristics are obtained. In Patent Document 1, by collecting minimum luminance values from these images, a final photographed image from which reflection on the subject surface and/or reflection around the subject has been removed is obtained.
  • the imaging unit is composed of an NTSC video camera, a linear polarization filter mounted on the camera optical axis, and a polarization plane changing stepping motor for rotating the linear polarization filter around the camera optical axis.
  • In such a configuration, the polarizing filter is switched mechanically, and a complicated mechanism is required for switching the polarizing filter.
  • Such a mechanical (dynamic) switching mechanism has a higher risk of failure than a static structure. Further, since the subject must be imaged a plurality of times while the polarizing filter is switched, it is difficult to image a subject that changes during the switching of the polarizing filter.
  • the present invention has been made in view of the above circumstances, and an object thereof is to provide an imaging apparatus that does not require switching of a polarizing filter.
  • An imaging apparatus generates a new composite image related to polarization by combining, based on corresponding points, a plurality of images captured by an array imaging unit having at least three imaging optical systems.
  • The array imaging unit includes a plurality of polarizing filters disposed on the optical paths of the respective imaging optical systems, and the plurality of polarizing filters include polarizing filters having mutually different polarization characteristics.
  • FIG. 1 is a block diagram illustrating a configuration of an imaging apparatus according to an embodiment.
  • FIG. 2 is a diagram illustrating a configuration of an array imaging unit in the imaging apparatus.
  • FIG. 2A is a perspective view thereof, and FIG. 2B is a sectional view thereof.
  • FIG. 3 is a diagram illustrating a configuration of an imaging unit in the array imaging unit.
  • FIG. 3A shows the imaging unit of the first aspect
  • FIG. 3B shows the imaging unit of the second aspect.
  • The imaging apparatus according to the embodiment is a device that acquires a plurality of images of an object (subject) with mutually different polarization characteristics and combines the acquired plurality of images to generate a new composite image of the object (subject) related to polarization.
  • Such an imaging apparatus D includes, for example, an array imaging unit 1A, a control image processing unit 2, a storage unit 3, an input unit 4, an output unit 5, and an interface unit (IF unit) 6, as shown in FIG. 1.
  • The array imaging unit 1A is a so-called compound-eye imaging device, and includes at least three imaging optical systems 12 and a plurality of imaging units 13A that correspond to the imaging optical systems 12, capture the optical images of the object formed by the respective imaging optical systems 12, and generate images.
  • The array imaging unit 1A further includes a plurality of polarizing filters 11 corresponding to the plurality of imaging optical systems 12 and arranged on the optical paths of the plurality of imaging optical systems 12.
  • One imaging optical system 12 is configured to include one or a plurality of optical lenses along its optical axis.
  • The plurality of imaging optical systems 12 are arranged so that their optical axes are substantially parallel to each other, and are arranged in a two-dimensional matrix in two linearly independent directions, more specifically in the X direction and the Y direction orthogonal to each other.
  • In the example shown in FIG. 2, the plurality of imaging optical systems 12 include four imaging optical systems 12-11, 12-12, 12-21, and 12-22 arranged in a two-dimensional matrix of two rows and two columns (hereinafter, such a matrix arrangement may be expressed as "12-11 to 12-22").
  • Although the plurality of imaging optical systems 12 may be configured individually, in the present embodiment they are, for example, an array lens configured integrally.
  • One imaging unit 13A includes a plurality of photoelectric conversion elements (a plurality of pixels) arranged in a two-dimensional matrix, and each photoelectric conversion element outputs an electrical signal corresponding to the amount of light received through the polarizing filter 11 as the data of the corresponding pixel in the image.
  • the plurality of imaging units 13A correspond to the plurality of imaging optical systems 12 and are arranged so that the imaging surfaces are on the same plane.
  • Like the plurality of imaging optical systems 12, the plurality of imaging units 13A are arranged in a two-dimensional matrix in two linearly independent directions, more specifically in the X direction and the Y direction orthogonal to each other. In the example shown in FIG. 2, the plurality of imaging units 13A include four imaging units 13A-11 to 13A-22 arranged in a two-dimensional matrix of two rows and two columns. As shown in FIG. 3A, the plurality of imaging units 13A are configured from a single solid-state imaging device: the effective pixel area of the single solid-state imaging device is divided into a plurality of areas arranged in a two-dimensional matrix so as to correspond to the respective imaging units 13A, and each of these areas is used as one imaging unit 13A.
  • a plurality of individual imaging units 13B may be used as illustrated in FIG. 3B.
  • the plurality of imaging units 13B include a plurality of solid-state imaging elements arranged in a two-dimensional matrix on the same substrate.
  • The plurality of polarizing filters 11 include polarizing filters having mutually different polarization characteristics. As described above, the plurality of polarizing filters 11 are disposed on the optical paths of the plurality of imaging optical systems 12; for example, in the present embodiment, as shown in FIG. 2B, they are arranged on the object side of the plurality of imaging optical systems 12. In FIG. 2A, the plurality of polarizing filters 11 are omitted.
  • Each of the plurality of polarization filters 11 is an optical element that emits incident light as linearly polarized light having a predetermined angle as the polarization characteristic.
  • Although the plurality of polarizing filters 11 may be configured individually, in the present embodiment they are, for example, an array filter configured integrally.
  • In the present embodiment, the plurality of polarizing filters 11 can take, for example, the forms of the following plurality of polarizing filters 11a to 11d of the first to fourth modes, corresponding to the corresponding point search units 22a to 22d of the first to fourth modes described later.
  • The plurality of polarizing filters 11a of the first mode are used in correspondence with the corresponding point search unit 22a of the first mode described later, and include a pair of polarizing filters 11a having the same polarization characteristic and a polarizing filter 11a having a polarization characteristic different from that of the pair.
  • The plurality of polarizing filters 11b of the second mode are used in correspondence with the corresponding point search unit 22b of the second mode described later, and include at least four polarizing filters 11b corresponding to at least four imaging optical systems 12; among them, polarizing filters 11b having mutually different polarization characteristics are included, and a pair of polarizing filters 11b having the same polarization characteristic is included.
  • The plurality of polarizing filters 11c of the third mode are used in correspondence with the corresponding point search unit 22c of the third mode described later, include at least four polarizing filters 11c corresponding to at least four imaging optical systems 12, and include two polarizing filter sets each consisting of first and second polarizing filters 11c having mutually different polarization characteristics.
  • The plurality of polarizing filters 11d of the fourth mode are used in correspondence with the corresponding point search unit 22d of the fourth mode described later, and include at least four polarizing filters 11d corresponding to at least four imaging optical systems 12. They include polarizing filters 11d having mutually different polarization characteristics, namely at least three types of polarizing filters 11d having first to third polarization characteristics different from one another, grouped into two polarizing filter sets, and the two polarizing filter sets share a common polarizing filter 11d.
  • One polarizing filter 11 of the plurality of polarizing filters 11, one imaging optical system 12 of the plurality of imaging optical systems 12, and one imaging unit 13A of the plurality of imaging units 13A, arranged along a common line (for example, the optical axis of the one imaging optical system 12), constitute a so-called individual eye. A light beam from the object (subject) is changed into predetermined polarized light by the one polarizing filter 11 and is imaged onto the light receiving surface (imaging surface) of the one imaging unit 13A by the one imaging optical system 12.
  • The array imaging unit 1A configured as described above captures an optical image of the object (subject) with each individual eye, and outputs the image data of each individual eye to the control image processing unit 2.
  • In the above description, the array imaging units 1A and 1B are configured from the array filter serving as the plurality of polarizing filters 11, the array lens serving as the plurality of imaging optical systems 12, and either a single solid-state imaging device serving as the plurality of imaging units 13A or a plurality of solid-state imaging devices serving as the plurality of imaging units 13B. In place of the array imaging units 1A and 1B, an array imaging unit 1 in which the individual eyes are arranged in a two-dimensional array so that their optical axes are parallel to each other may be used.
  • the storage unit 3 is a circuit that is connected to the control image processing unit 2 and stores various predetermined programs and various predetermined data under the control of the control image processing unit 2.
  • The various predetermined programs include, for example, an image composition program that searches for corresponding points between a plurality of images and synthesizes the plurality of images based on the searched corresponding points to create a new composite image related to polarization, and the like.
  • the storage unit 3 includes, for example, a ROM (Read Only Memory) that is a nonvolatile storage element, an EEPROM (Electrically Erasable Programmable Read Only Memory) that is a rewritable nonvolatile storage element, and the like.
  • The storage unit 3 also includes a RAM (Random Access Memory) serving as a so-called working memory of the control image processing unit 2, which stores data and the like generated during execution of the predetermined programs.
  • The control image processing unit 2 is a circuit that controls each unit of the imaging device D according to the function of each unit, generates a plurality of images by capturing optical images of the object with the array imaging unit 1A, and synthesizes the plurality of images to create a new composite image related to polarization.
  • the control image processing unit 2 includes, for example, a CPU (Central Processing Unit) and its peripheral circuits.
  • In the control image processing unit 2, a control unit 21, a corresponding point search unit 22, and an image composition unit 23 are functionally configured by executing the control processing program.
  • the control unit 21 is for controlling each unit of the imaging device D according to the function of each unit.
  • The corresponding point search unit 22 obtains corresponding points between the plurality of images acquired by the plurality of imaging units 13A in the array imaging unit 1A. More specifically, in the present embodiment, the corresponding point search unit 22 can take, for example, the forms of the corresponding point search units 22a to 22d of the following first to fourth modes.
  • In the first mode, the imaging device D includes the plurality of polarizing filters 11a of the first mode as the plurality of polarizing filters 11. The corresponding point search unit 22a of the first mode uses one of the pair of images generated by the pair of imaging units 13A that capture the optical image of the object via the pair of polarizing filters 11a as the standard image and the other as the reference image. While the reference pixel is scanned along the epipolar line from the pixel corresponding to infinity at a predetermined pixel interval, the reference pixel for which the difference between the standard pixel value related to the density of a predetermined standard pixel in the standard image and the reference pixel value related to the density of the reference pixel in the reference image becomes the smallest is taken as the corresponding pixel in the reference image corresponding to the standard pixel, and the corresponding point is thus obtained.
  • In the second mode, the imaging device D includes at least four imaging optical systems 12 as the plurality of imaging optical systems 12, and includes the plurality of polarizing filters 11b of the second mode as the plurality of polarizing filters 11.
  • The corresponding point search unit 22b of the second mode uses one of the pair of images generated by the pair of imaging units 13A that capture the optical image of the object via the pair of polarizing filters 11b as the standard image and the other as the first reference image, uses the image generated by the imaging unit 13A corresponding to one of the remaining imaging optical systems 12, excluding the pair of imaging optical systems 12 corresponding to the pair of polarizing filters 11b, as the second reference image, and uses the image generated by the imaging unit 13A corresponding to the other of the remaining imaging optical systems 12 as the third reference image. While the first to third reference pixels in the first to third reference images are made to correspond to one another and are scanned along the epipolar lines from the pixels corresponding to infinity at a predetermined pixel interval, the first to third reference pixels for which the sum of the difference between the standard pixel value related to the density of a predetermined standard pixel in the standard image and the first reference pixel value related to the density of the first reference pixel in the first reference image and the difference between the second reference pixel value related to the density of the second reference pixel in the second reference image and the third reference pixel value related to the density of the third reference pixel in the third reference image becomes the smallest are taken as the corresponding pixels in the first to third reference images corresponding to the standard pixel, and the corresponding points are thus obtained.
  • In the third mode, the imaging device D includes at least four imaging optical systems 12 as the plurality of imaging optical systems 12, and includes the plurality of polarizing filters 11c of the third mode as the plurality of polarizing filters 11.
  • The corresponding point search unit 22c of the third mode uses, for one of the two polarizing filter sets, the image generated by the imaging unit 13A that captures the optical image of the object via one of the first and second polarizing filters 11c as the standard image and the image generated by the imaging unit 13A that captures the optical image of the object via the other as the first reference image, and uses, for the other of the two polarizing filter sets, the image generated by the imaging unit 13A that captures the optical image of the object via one of the first and second polarizing filters 11c as the second reference image and the image generated by the imaging unit 13A that captures the optical image of the object via the other as the third reference image. While the first to third reference pixels in the first to third reference images are made to correspond to one another and are scanned along the epipolar lines from the pixels corresponding to infinity at a predetermined pixel interval, the first to third reference pixels for which the difference between a first degree of polarization, obtained based on the standard pixel value related to the density of a predetermined standard pixel in the standard image and the first reference pixel value related to the density of the first reference pixel in the first reference image, and a second degree of polarization, obtained based on the second and third reference pixel values related to the densities of the second and third reference pixels in the second and third reference images, becomes the smallest are taken as the corresponding pixels in the first to third reference images corresponding to the standard pixel, and the corresponding points are thus obtained.
  • In the fourth mode, the imaging device D includes at least four imaging optical systems 12 as the plurality of imaging optical systems 12, and includes the plurality of polarizing filters 11d of the fourth mode as the plurality of polarizing filters 11.
  • The corresponding point search unit 22d of the fourth mode uses, for one of the two polarizing filter sets, the image generated by the imaging unit 13A that captures the optical image of the object via the common polarizing filter 11d as the standard image, and uses the images generated by the respective imaging units 13A that capture the optical image of the object via the remaining polarizing filters 11d, other than the common polarizing filter 11d, among the first to third polarizing filters 11d as the first and second reference images; a first degree of polarization and a first polarization angle are obtained based on the standard pixel value related to the density of a predetermined standard pixel in the standard image and the first and second reference pixel values related to the densities of the first and second reference pixels in the first and second reference images. Likewise, for the other of the two polarizing filter sets, the images generated by the respective imaging units 13A that capture the optical image of the object via the remaining polarizing filters 11d other than the common polarizing filter 11d are used as the third and fourth reference images; a second degree of polarization and a second polarization angle are obtained based on the standard pixel value and the third and fourth reference pixel values related to the densities of the third and fourth reference pixels in the third and fourth reference images. While the first to fourth reference pixels in the first to fourth reference images are made to correspond to one another and are scanned along the epipolar lines from the pixels corresponding to infinity at a predetermined pixel interval, the first to fourth reference pixels for which the sum of the difference between the two degrees of polarization and the difference between the two polarization angles becomes the smallest are taken as the corresponding pixels in the first to fourth reference images corresponding to the standard pixel, and the corresponding points are thus obtained.
  • In each of the first to fourth modes, the corresponding point search unit 22 preferably obtains, with each pixel of the standard image taken in turn as the standard pixel, the corresponding pixel in each reference image corresponding to that standard pixel.
  • the image composition unit 23 synthesizes the plurality of images based on the corresponding points obtained by the corresponding point search unit 22 (22a to 22d) to create a new composite image related to polarization.
  • Preferably, in the corresponding point search units 22a to 22d of the first to fourth modes, the image composition unit 23 obtains, for each pixel of the standard image, the degree of polarization based on the pixel values related to the densities of the standard pixel and of the corresponding pixels associated with it by the corresponding point search units 22a to 22d, and creates, as the composite image, a polarization degree image representing the obtained degrees of polarization.
  • Alternatively, the image composition unit 23 obtains, for each pixel of the standard image in the corresponding point search units 22a to 22d of the first to fourth modes, the polarization angle based on the pixel values related to the densities of the standard pixel and of the corresponding pixels obtained by the corresponding point search units 22a to 22d, and creates, as the composite image, a polarization angle image representing the obtained polarization angles.
  • Alternatively, the image composition unit 23 obtains, for each pixel of the standard image in the corresponding point search units 22a to 22d of the first to fourth modes, the non-polarized component based on the pixel values related to the densities of the standard pixel and of the corresponding pixels obtained by the corresponding point search units 22a to 22d, and creates, as the composite image, a non-polarized component image representing the obtained non-polarized components.
  • The input unit 4 is connected to the control image processing unit 2 and is a device for inputting to the imaging device D various commands, such as a command for instructing imaging conditions and a command for instructing the start of imaging, and various data necessary for imaging, such as the setting of a date; it is, for example, a plurality of input switches to which predetermined functions are assigned.
  • The output unit 5 is connected to the control image processing unit 2 and, under the control of the control image processing unit 2, outputs the commands and data input from the input unit 4 and the images of the subject captured by the imaging device D; it is, for example, a display device such as an LCD or an organic EL display, or a printing device such as a printer.
  • a touch panel may be configured from the input unit 4 and the output unit 5.
  • In this case, the input unit 4 is a position input device, such as a resistive film type or a capacitance type, that detects and inputs an operated position, and the output unit 5 is a display device.
  • A position input device is provided on the display surface of the display device, one or more candidates of input content that can be input are displayed on the display device, and when the user touches the display position where the input content the user wishes to input is displayed, the position is detected by the position input device, and the display content displayed at the detected position is input to the imaging device D as the user's operation input.
  • the user can easily understand the input operation intuitively, and thus an imaging device D that is easy for the user to handle is provided.
  • the IF unit 6 is a circuit that is connected to the control image processing unit 2 and inputs / outputs data to / from an external device according to the control of the control image processing unit 2.
  • Examples of the IF unit 6 include an interface circuit of the RS-232C serial communication system, an interface circuit conforming to the Bluetooth (registered trademark) standard, an interface circuit for performing infrared communication such as the IrDA (Infrared Data Association) standard, and an interface circuit conforming to the USB (Universal Serial Bus) standard.
  • FIG. 4 is a flowchart illustrating the operation of the imaging apparatus according to the embodiment.
  • the control unit 21 of the control image processing unit 2 causes the array imaging unit 1A to perform an imaging operation.
  • The array imaging unit 1A images the object (subject) with each individual eye. That is, the light beam from the object (subject) is changed into predetermined polarized light according to the polarization characteristic of each of the plurality of polarizing filters 11, is imaged onto the light receiving surface (imaging surface) of each of the plurality of imaging units 13A by each of the plurality of imaging optical systems 12, and is captured by each of the plurality of imaging units 13A, whereby the image data of a plurality of images of the subject are generated as the image data of the respective individual eyes. The array imaging unit 1A then outputs the image data of each individual eye generated in this way to the control image processing unit 2.
  • Upon receiving the image data of each individual eye, the control image processing unit 2 searches for corresponding points in order to synthesize the plurality of images, and generates a composite image based on the corresponding points obtained by the corresponding point search. That is, as shown in FIG. 4, in process S1, one of the plurality of images to be combined is set as the standard image, the remaining images are set as the reference images, and a predetermined pixel in the standard image is set as the standard point (standard pixel).
  • In process S2, candidates for the corresponding points (corresponding pixels) corresponding to the standard point (standard pixel) are determined. For example, the points (pixels) in the respective reference images for the case where the distance of the standard point (standard pixel) is infinite are set as the corresponding point (corresponding pixel) candidates. For each pixel of the standard image, the point (pixel) in each reference image for the infinite-distance case is associated in advance based on the camera parameters of the array imaging unit 1A, and this correspondence between each pixel of the standard image and each pixel of each reference image at infinity is stored in advance in the storage unit 3.
  • In process S3, each point (each reference pixel) of the corresponding point candidates in the respective reference images corresponding to the standard point is determined from the correspondence relationship stored in advance in the storage unit 3.
  • In process S4, the image data at the standard point and at each corresponding point candidate are acquired; for example, the pixel values related to the densities of the standard pixel and of the corresponding point candidates in the respective reference images are acquired.
  • In process S5, a predetermined evaluation value for evaluating whether each corresponding point candidate is the actual corresponding point (corresponding pixel) for the standard point is calculated by a predetermined method based on the image data acquired at the standard point and at each corresponding point candidate.
  • In process S6, the evaluation value calculated in process S5 is evaluated by a predetermined method defined in advance.
  • In process S7, the evaluation result of process S6 is stored in the storage unit 3.
  • In process S8, it is determined whether all corresponding point candidates have been processed, that is, whether each of the processes described above has been completed for all corresponding point candidates. If all corresponding point candidates have been processed (Y), the next process S9 is executed; if not (N), the process returns to process S2.
  • In process S9, each corresponding point (each corresponding pixel) of each reference image that actually corresponds to the standard point (standard pixel) in the standard image is determined based on the evaluation results stored in the storage unit 3 in process S7.
  • In process S10, predetermined polarization data relating to polarization are calculated based on the corresponding points of the respective reference images determined in process S9, and the calculation result is stored in the storage unit 3.
  • In process S11, it is determined whether all the pixels of the standard image have been processed, that is, whether the above processes have been completed for all the pixels of the standard image. If all the pixels have been processed (Y), the next process S12 is executed; if not (N), the process returns to process S1. For example, each pixel of the standard image, from the upper-left pixel to the lower-right pixel, is set as the standard pixel in order, and processes S1 to S10 are executed.
  • In process S12, since the polarization data have been calculated for all the pixels of the standard image, a composite image is created with values representing the polarization data as the pixel values, the composite image is output to the output unit 5, and the process is terminated. If necessary, the control unit 21 of the control image processing unit 2 may output the composite image from the IF unit 6 to an external device.
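  • The flow of processes S1 to S12 described above can be summarized as the following minimal sketch; the helper callables candidate_generator, evaluate, and polarization_data are illustrative placeholders standing in for the mode-specific processing described later, and the function is not the document's implementation.

```python
import numpy as np

def create_composite_image(std_img, ref_imgs, candidate_generator, evaluate, polarization_data):
    """Sketch of the overall flow of FIG. 4 (processes S1 to S12): for every standard pixel,
    scan the synchronized corresponding-point candidates, keep the candidate set with the
    smallest evaluation value, compute the polarization data, and collect the results."""
    height, width = std_img.shape
    composite = np.zeros((height, width), dtype=float)
    for y in range(height):                                   # S1, S11: every standard pixel in turn
        for x in range(width):
            best_cands, best_value = None, float("inf")
            for cands in candidate_generator((x, y)):         # S2, S3, S8: candidates along the epipolar lines
                value = evaluate(std_img, ref_imgs, (x, y), cands)   # S4, S5: evaluation value
                if value < best_value:                        # S6, S7: keep the minimum
                    best_cands, best_value = cands, value
            composite[y, x] = polarization_data(std_img, ref_imgs, (x, y), best_cands)   # S9, S10
    return composite                                          # S12: composite image from the polarization data
```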
  • FIG. 5 is a diagram illustrating the plurality of polarizing filters of the first mode.
  • FIG. 5 also serves as an illustration of the plurality of polarizing filters of the second mode.
  • FIG. 6 is a diagram for explaining the parallax between individual eyes. FIG. 6A shows the parallax between the first eye and the second eye, FIG. 6B shows an example of an image obtained by superimposing the first eye image and the second eye image, FIG. 6C shows the parallax between the first eye and the third eye, and FIG. 6D shows an example of an image obtained by superimposing the first eye image and the third eye image.
  • FIG. 7 is a diagram for explaining the corresponding point search in the corresponding point search unit of the first aspect.
  • FIG. 8 is a graph showing an evaluation value (difference between the first density and the second density) when a corresponding point candidate (each reference pixel) is scanned on the epipolar line.
  • The horizontal axis in FIG. 8 represents 1/Z, where Z is the distance from the imaging device D to the object (subject), and the vertical axis represents the evaluation value (the difference between the first density and the second density).
  • FIG. 9 is a diagram illustrating the relationship between the parallax and the distance to the object (subject) in the stereo method.
  • As shown in FIG. 5, the plurality of polarizing filters 11a of the first mode include, at the positions of row 2, column 1 and row 2, column 2, a pair of polarizing filters 11a-21 and 11a-22 having the same first polarization characteristic, and include a polarizing filter 11a-12 having a second polarization characteristic different from the first polarization characteristic of the pair of polarizing filters 11a-21 and 11a-22.
  • each of the pair of polarizing filters 11a-21 and 11a-22 is a 0-degree polarizing filter that emits incident light as 0-degree linearly polarized light as the first polarization characteristic.
  • the polarizing filters 11a-21 and 11a-22 are disposed adjacent to each other along one direction (for example, the horizontal direction).
  • The polarizing filter 11a-12 having the second polarization characteristic is a 90-degree polarizing filter that emits incident light as 90-degree linearly polarized light as the second polarization characteristic, and is disposed adjacent to one of the pair of polarizing filters 11a-21 and 11a-22 along the direction orthogonal to the one direction (for example, the vertical direction). In the example shown in FIG. 5, the polarizing filter 11a-12 is arranged at the position of row 1, column 2 so as to be adjacent to the polarizing filter 11a-22.
  • In the example shown in FIG. 5, the imaging optical system 12-22 and the imaging unit 13A-22 arranged at the position of row 2, column 2 corresponding to the polarizing filter 11a-22, which is one of the pair of polarizing filters 11a-21 and 11a-22, are referred to as the first eye; the imaging optical system 12-21 and the imaging unit 13A-21 arranged at the position of row 2, column 1 corresponding to the polarizing filter 11a-21, which is the other of the pair, are referred to as the second eye; and the imaging optical system 12-12 and the imaging unit 13A-12 arranged at the position of row 1, column 2 corresponding to the polarizing filter 11a-12 having the second polarization characteristic are referred to as the third eye.
  • The polarizing filter 11a-11 is arranged at the position of row 1, column 1, and the imaging optical system 12-11 and the imaging unit 13A-11 correspondingly arranged at the position of row 1, column 1 are referred to as the fourth eye. In this first mode, however, the fourth eye is not necessarily required; the polarizing filter 11a-11 therefore need not be disposed, and the fourth eye may be used, for example, as a normal camera, as an infrared camera that images the subject in the infrared wavelength range, or as a camera for some other intended use.
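  • For reference, the first-mode two-row, two-column arrangement described above can be summarized as a small lookup table; the dictionary form and the value None for the row 1, column 1 position (reflecting that the fourth-eye filter 11a-11 is optional in this mode) are purely illustrative.

```python
# First-mode filter layout of FIG. 5, keyed by (row, column):
# polarizer angle in degrees and the individual-eye name used in the description.
FIRST_MODE_FILTERS = {
    (2, 2): {"angle_deg": 0,    "eye": "first eye"},    # polarizing filter 11a-22
    (2, 1): {"angle_deg": 0,    "eye": "second eye"},   # polarizing filter 11a-21
    (1, 2): {"angle_deg": 90,   "eye": "third eye"},    # polarizing filter 11a-12
    (1, 1): {"angle_deg": None, "eye": "fourth eye"},   # 11a-11 is optional in the first mode
}
```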
  • In the first eye, the polarizing filter 11a-22 transmits the 0-degree polarization component of the light flux from the object. This 0-degree polarization component is imaged by the imaging optical system 12-22 and captured by the imaging unit 13A-22. Therefore, the first eye generates image data from light of the 0-degree polarization component with the imaging unit 13A-22.
  • Similarly, the polarizing filter 11a-21 transmits the 0-degree polarization component of the light flux from the object, and this 0-degree polarization component is imaged by the imaging optical system 12-21 and captured by the imaging unit 13A-21. Therefore, the second eye generates image data from light of the 0-degree polarization component with the imaging unit 13A-21.
  • The polarizing filter 11a-12 transmits the 90-degree polarization component of the light flux from the object, and this 90-degree polarization component is imaged by the imaging optical system 12-12 and captured by the imaging unit 13A-12. Therefore, the third eye generates image data from light of the 90-degree polarization component with the imaging unit 13A-12.
  • each image by such first and second eyes produces a parallax corresponding to the arrangement positional relationship between the first and second eyes.
  • As shown in FIG. 6A, since the first and second eyes are arranged along the horizontal direction, the parallax between the images of the first and second eyes occurs along the horizontal direction. When the images of the first and second eyes are superimposed and displayed, the image shown in FIG. 6B is obtained in one example.
  • each image by the first and third eyes also generates a parallax corresponding to the arrangement positional relationship between the first and third eyes.
  • As shown in FIG. 6C, since the first and third eyes are arranged along the vertical direction, the parallax between the images of the first and third eyes occurs along the vertical direction. When the images of the first and third eyes are superimposed and displayed, the image shown in FIG. 6D is obtained in one example.
  • The corresponding point search unit 22a of the first mode searches for corresponding points according to the operation shown in FIG. 4 described above. More specifically, in process S1, the corresponding point search unit 22a sets one of the pair of images generated by the pair of imaging units 13A-21 and 13A-22, which capture the optical image of the object via the pair of polarizing filters 11a-21 and 11a-22, as the standard image and the other as the reference image, and sets a predetermined pixel in the standard image as the standard point (standard pixel). In the example shown in FIG. 7, the image of the first eye is used as the standard image, the image of the second eye is used as the reference image, and the pixel indicated by the white circle (○) in the standard image is set as the standard point (standard pixel).
  • In process S2, the corresponding point search unit 22a sets the point (pixel) in the reference image for the case where the distance of the standard point is infinite as the candidate for the corresponding point (corresponding pixel).
  • In process S3, the corresponding point search unit 22a determines the corresponding point candidate in the reference image corresponding to the standard point from the correspondence relationship stored in advance in the storage unit 3. In FIG. 7, the point (pixel) in the reference image for the case where the distance of the standard point (○) is infinite is indicated by a black circle (●).
  • In process S4, the corresponding point search unit 22a obtains the pixel values related to the densities at the standard point and at the corresponding point candidate in the respective images (the standard image and the reference image).
  • Usually, since the corresponding point candidate is not the actual corresponding point, the pixel value (first density) of the standard point and the pixel value (second density) of the corresponding point candidate differ from each other; however, when the corresponding point candidate comes to or near the actual corresponding point, the difference between the pixel value of the standard point and that of the corresponding point candidate becomes small. From the known epipolar geometry, the corresponding point in the reference image for the standard point exists on the epipolar line. Therefore, the corresponding point search unit 22a scans the corresponding point candidate in the reference image along the epipolar line at a predetermined pixel interval (for example, one pixel at a time), starting from the point in the reference image for the case where the distance of the standard point is infinite, searches for the corresponding point candidate that minimizes the difference between the pixel value of the standard point and that of the corresponding point candidate, and takes the corresponding point candidate giving the minimum as the actual corresponding point. In FIG. 7, an example of a corresponding point candidate being scanned on the epipolar line is indicated by a dotted circle, and the actual corresponding point is indicated by a white circle (○).
  • Epipolar geometry refers to a geometrical correspondence that occurs when the same point is viewed from two viewpoints in a three-dimensional space.
  • Accordingly, in process S5, the corresponding point search unit 22a calculates the difference between the first density of the standard point and the second density of the corresponding point candidate as the evaluation value; in process S6, it compares this with the minimum evaluation value (here, the minimum density difference) stored in the storage unit 3 so far; and in process S7, if the comparison shows that the previous minimum density difference is less than or equal to the current density difference, the current density difference is discarded, whereas if the previous minimum density difference is larger than the current density difference (that is, the current density difference is smaller than the previous minimum), the stored content of the storage unit 3 is updated with the current density difference.
  • In process S8, the corresponding point search unit 22a determines whether all corresponding point candidates have been processed by determining whether the entire epipolar line has been scanned. If the entire epipolar line has been scanned (Y), the next process S9 is executed; if not (N), the process returns to process S2.
  • In process S9, the corresponding point search unit 22a takes the corresponding point candidate stored in the storage unit 3 as the actual corresponding point (corresponding pixel). By scanning along the epipolar line as described above, the density difference between the first density of the standard point and the second density of the corresponding point candidate changes as shown in FIG. 8, and the candidate giving the minimum value, at 1/Z1, is the actual corresponding point (corresponding pixel) in the reference image corresponding to the standard point (standard pixel) of the standard image. The corresponding point search unit 22a then obtains the corresponding point (corresponding pixel) in the third eye image corresponding to the standard point of the standard image by using the corresponding point in the reference image corresponding to the standard point of the standard image.
  • Here, as shown in FIG. 9, where B is the baseline length between the pair of imaging units 13A that generate the pair of images for which the corresponding points are obtained, f is the focal length of the pair of imaging optical systems 12 that form the optical images of the object on the pair of imaging units 13A, d is the parallax based on the corresponding points, and Z is the distance to the object, the relation 1/Z = d/(B × f) holds. Accordingly, using the corresponding point in the second eye reference image (the pixel corresponding to 1/Z1) corresponding to the standard point of the first eye standard image, the corresponding point search unit 22a can obtain the corresponding point in the third eye image (the pixel corresponding to 1/Z1) corresponding to the standard point of the first eye standard image.
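  • The following is a minimal sketch of this stereo relation and of transferring the parallax found between the first and second eyes to the first and third eye pair; the function names and the assumption that all eyes share the same focal length are illustrative, not the document's implementation.

```python
def inverse_depth(parallax, baseline, focal_length):
    """Inverse depth 1/Z from the parallax d, baseline B and focal length f (1/Z = d / (B * f))."""
    return parallax / (baseline * focal_length)

def transfer_parallax(parallax_12, baseline_12, baseline_13, focal_length):
    """Parallax expected between the first and third eyes for the same point, obtained from the
    parallax found between the first and second eyes, since both eye pairs share the same 1/Z."""
    return inverse_depth(parallax_12, baseline_12, focal_length) * baseline_13 * focal_length
```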
  • In process S10, the image composition unit 23 obtains the degree of polarization as the predetermined polarization data based on the pixel values related to the densities at the standard point and at the actual corresponding points determined in process S9, and stores the obtained degree of polarization in the storage unit 3.
  • The degree of polarization is obtained by using the images obtained through the polarizing filters 11 having different polarization characteristics. For example, let E1 be the pixel value (first density) at the standard point of the first eye standard image, E3 be the pixel value (third density) at the corresponding point of the third eye image, A1 be the polarization component, θ1 be the polarization angle, and B1 be the non-polarization component; then the following equations (1) and (2) hold, and the degree of polarization A1/B1 is obtained by eliminating cos θ1 from these equations (1) and (2).
  • E1 = A1 · cos²(θ1 + 0°) + B1   (1)
  • E3 = A1 · cos²(θ1 + 90°) + B1   (2)
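  • For reference, summing and subtracting equations (1) and (2) and using the identity cos²x = (1 + cos 2x)/2 yields the relations below; this is only a sketch of the intermediate algebra, not the document's own derivation. If a 45-degree measurement E2 = A1·cos²(θ1 + 45°) + B1 were also available (as with the 45-degree fourth-eye filter mentioned later), then E1 + E3 − 2E2 = A1·sin 2θ1, so that θ1, A1, and B1 could all be recovered.

```latex
E_1 + E_3 = A_1 + 2B_1, \qquad E_1 - E_3 = A_1 \cos 2\theta_1
```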
  • By the above processes, the degree of polarization is obtained as the polarization data for each pixel of the standard image. In process S12, a new composite image (polarization degree image) relating to the degree of polarization is created by using a value representing each obtained degree of polarization as each pixel value, this composite image is output to the output unit 5, and the process is terminated.
  • Since each process is performed as described above, the imaging apparatus D according to the first mode includes the pair of polarizing filters 11a-21 and 11a-22 having the same polarization characteristic; corresponding points can therefore be obtained between the images captured through this pair of polarizing filters 11a-21 and 11a-22, and the corresponding point for the image captured through the other polarizing filter 11a-12 can be obtained from the corresponding point thus obtained and the positional relationship among the imaging units 13A-22, 13A-21, and 13A-12.
  • FIG. 10 is a diagram for explaining the corresponding point search in the corresponding point search unit of the second mode.
  • FIG. 11 is a graph showing the evaluation value (sum of the first density difference and the second density difference) when a corresponding point candidate (each reference pixel) is scanned on the epipolar line.
  • The horizontal axis in FIG. 11 represents 1/Z, where Z is the distance from the imaging device D to the object (subject), and the vertical axis represents the evaluation value (the sum of the first density difference and the second density difference).
  • As shown in FIG. 5, in the case of two rows and two columns, the plurality of polarizing filters 11b of the second mode include two pairs of polarizing filters, each pair having the same polarization characteristic: the pair of polarizing filters 11b-21 and 11b-22 at the positions of row 2, column 1 and row 2, column 2, and the pair of polarizing filters 11b-11 and 11b-12 at the positions of row 1, column 1 and row 1, column 2. These two pairs of polarizing filters 11b-21, 11b-22 and 11b-11, 11b-12 have mutually different polarization characteristics.
  • In the present embodiment, each of the one pair of polarizing filters 11b-21 and 11b-22 is a 0-degree polarizing filter having the first polarization characteristic, and the polarizing filters 11b-21 and 11b-22 are arranged adjacent to each other along one direction (for example, the horizontal direction). Each of the other pair of polarizing filters 11b-11 and 11b-12 is a 90-degree polarizing filter having the second polarization characteristic, and the polarizing filters 11b-11 and 11b-12 are arranged adjacent to each other along the one direction (for example, the horizontal direction). Thus, the first polarization characteristic of the pair of polarizing filters 11b-21 and 11b-22 and the second polarization characteristic of the pair of polarizing filters 11b-11 and 11b-12 differ from each other, and the one pair of polarizing filters 11b-21 and 11b-22 and the other pair of polarizing filters 11b-11 and 11b-12 are arranged adjacent to each other along the direction (for example, the vertical direction) orthogonal to the one direction.
  • In the example shown in FIG. 5, the imaging optical system 12-22 and the imaging unit 13A-22 arranged at the position of row 2, column 2 corresponding to the polarizing filter 11b-22, which is one of the pair of polarizing filters 11b-21 and 11b-22, are referred to as the first eye, and the imaging optical system 12-21 and the imaging unit 13A-21 arranged at the position of row 2, column 1 corresponding to the polarizing filter 11b-21, which is the other of the pair, are referred to as the second eye. Likewise, the imaging optical system 12-12 and the imaging unit 13A-12 arranged at the position of row 1, column 2 corresponding to the polarizing filter 11b-12, which is one of the other pair of polarizing filters 11b-11 and 11b-12, are referred to as the third eye, and the imaging optical system 12-11 and the imaging unit 13A-11 arranged at the position of row 1, column 1 corresponding to the polarizing filter 11b-11, which is the other of that pair, are referred to as the fourth eye.
  • Since the first eye includes the polarizing filter 11b-22, image data based on light of the 0-degree polarization component is generated by the imaging unit 13A-22. Since the second eye includes the polarizing filter 11b-21, image data based on light of the 0-degree polarization component is generated by the imaging unit 13A-21. Since the third eye includes the polarizing filter 11b-12, image data based on light of the 90-degree polarization component is generated by the imaging unit 13A-12. Since the fourth eye includes the polarizing filter 11b-11, image data based on light of the 90-degree polarization component is generated by the imaging unit 13A-11.
  • Such images of the first to fourth eyes produce parallax corresponding to the arrangement positional relationship of the first to fourth eyes.
  • The corresponding point search unit 22b of the second mode searches for corresponding points according to the operation shown in FIG. 4 described above. More specifically, in process S1, the corresponding point search unit 22b sets one of the pair of images generated by the pair of imaging units 13A-21 and 13A-22, which capture the optical image of the object via the one pair of polarizing filters 11b-21 and 11b-22, as the standard image and the other as the first reference image. Further, the corresponding point search unit 22b uses the image generated by the imaging unit 13A corresponding to one of the remaining imaging optical systems 12, excluding the pair of imaging optical systems 12 corresponding to the one pair of polarizing filters 11b, as the second reference image, and uses the image generated by the imaging unit 13A corresponding to the other of the remaining imaging optical systems 12 as the third reference image. In the present embodiment, since the plurality of polarizing filters 11b consist of the two pairs arranged in two rows and two columns, the images generated by the imaging units 13A that capture the optical image of the object via the other pair of polarizing filters 11b-11 and 11b-12 are used as the second and third reference images. The corresponding point search unit 22b then sets a predetermined pixel in the standard image as the standard point (standard pixel).
  • In the example shown in FIG. 10, the image of the first eye is used as the standard image, the image of the second eye as the first reference image, the image of the third eye as the second reference image, and the image of the fourth eye as the third reference image, and the pixel indicated by the white circle (○) in the standard image is set as the standard point (standard pixel).
  • In process S2, the corresponding point search unit 22b sets the points (pixels) in the respective reference images for the case where the distance of the standard point is infinite as the candidates for the corresponding points (corresponding pixels).
  • In process S3, the corresponding point search unit 22b determines the corresponding point candidates in the respective reference images corresponding to the standard point from the correspondence relationship stored in advance in the storage unit 3. In FIG. 10, the points (pixels) in the first to third reference images for the case where the distance of the standard point (○) is infinite are indicated by black circles (●).
  • In process S4, the corresponding point search unit 22b acquires the pixel values related to the densities at the standard point and at each of the corresponding point candidates in the respective images (the standard image and the first to third reference images).
  • Usually, since the corresponding point candidates are not the actual corresponding points, the pixel value (first density) of the standard point and the pixel values (second to fourth densities) at the corresponding point candidates (the first to third reference pixels) differ from one another; however, as the corresponding point candidates are scanned over the first to third reference images and come to or near the actual corresponding points, a first density difference between the first density and the second density and a second density difference between the third density and the fourth density both become small. Therefore, while keeping the corresponding point candidates in the first to third reference images in correspondence with one another (synchronized), the corresponding point search unit 22b scans the respective epipolar lines at a predetermined pixel interval (for example, one pixel at a time) from the points in the first to third reference images for the case where the distance of the standard point is infinite, searches for the corresponding point candidates that minimize the sum of the first density difference and the second density difference, and takes the corresponding point candidates giving the minimum as the actual corresponding points in the first to third reference images. In FIG. 10, an example of the corresponding point candidates being scanned on the epipolar lines is indicated by broken-line circles, and the actual corresponding points are indicated by white circles (○).
  • Accordingly, in process S5, the corresponding point search unit 22b calculates the sum of the first density difference and the second density difference as the evaluation value; in process S6, it compares this with the minimum evaluation value (here, the minimum sum of the first density difference and the second density difference) stored in the storage unit 3 so far; and in process S7, if the comparison shows that the previous minimum sum is less than or equal to the current sum, the current sum is discarded, whereas if the previous minimum sum is larger than the current sum, the stored content of the storage unit 3 is updated with the current sum of the first density difference and the second density difference.
  • In step S8, the corresponding point search unit 22b determines whether all the corresponding point candidates have been processed by determining whether all the epipolar lines have been scanned. If all the epipolar lines have been scanned (Y), the next step S9 is executed; if not (N), the process returns to step S2.
  • In step S9, the corresponding point search unit 22b sets the corresponding point candidates stored in the storage unit 3 as the actual corresponding points (corresponding pixels) in the first to third reference images.
  • That is, the sum of the absolute value of the first density difference and the absolute value of the second density difference changes as the candidates are scanned, and the candidates at the inverse distance 1/Z2 where this sum reaches its minimum value become the actual corresponding points (corresponding pixels) in the first to third reference images corresponding to the reference point (reference pixel) of the standard image (a sketch of this search loop is given below).
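The search loop of steps S2 to S9 can be illustrated with the following sketch. It is not code from the patent: it assumes rectified individual eyes so that each epipolar line can be parameterized by the inverse distance 1/Z and a signed per-eye baseline, grayscale NumPy images, and hypothetical names (`epipolar_candidate`, `search_corresponding_points`) chosen for this example.

```python
import numpy as np

def epipolar_candidate(ref_xy, inv_z, baseline, focal_px):
    """Candidate pixel on the epipolar line of one reference image.

    At inv_z = 0 (infinite distance) the candidate coincides with the
    pre-stored infinity point; as inv_z grows it shifts along the baseline
    direction by the disparity focal_px * baseline * inv_z.
    """
    bx, by = baseline
    return (int(round(ref_xy[0] + focal_px * bx * inv_z)),
            int(round(ref_xy[1] + focal_px * by * inv_z)))

def search_corresponding_points(standard_img, ref_imgs, baselines, focal_px,
                                ref_xy, inv_depths):
    """Steps S2-S9: scan the three epipolar lines in step and keep the
    candidate set whose evaluation value (sum of the first and second
    density differences) is minimal."""
    e_std = float(standard_img[ref_xy[1], ref_xy[0]])         # first density
    best_eval, best_pts = np.inf, None
    for inv_z in inv_depths:                                   # S2/S3: synchronized candidates
        pts = [epipolar_candidate(ref_xy, inv_z, b, focal_px) for b in baselines]
        if any(not (0 <= x < img.shape[1] and 0 <= y < img.shape[0])
               for img, (x, y) in zip(ref_imgs, pts)):
            continue                                           # candidate left the image
        d2, d3, d4 = (float(img[y, x]) for img, (x, y) in zip(ref_imgs, pts))  # S4
        evaluation = abs(e_std - d2) + abs(d3 - d4)            # S5: evaluation value
        if evaluation < best_eval:                             # S6/S7: keep the minimum
            best_eval, best_pts = evaluation, pts
    return best_pts                                            # S9: actual corresponding points
```

Scanning in 1/Z keeps the candidates in the three reference images synchronized with a single loop variable, which corresponds to the synchronized epipolar scan described above; `inv_depths` could be, for example, `numpy.linspace(0.0, 1.0 / z_min, num_steps)`.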
  • The image composition unit 23 then obtains the degree of polarization as the predetermined polarization data, as described above, based on the pixel values related to the density at the reference point and at the actual corresponding points determined in step S9.
  • The obtained degree of polarization is stored in the storage unit 3.
  • By carrying out each of the processes described above for every pixel of the standard image, a degree of polarization is obtained as the polarization data for each pixel.
  • Using a value representing the degree of polarization as each pixel value, a new composite image relating to the degree of polarization is created, this composite image is output to the output unit 5, and the process ends (a sketch of the degree-of-polarization computation follows).
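The formula the image composition unit 23 uses for the degree of polarization is not spelled out in this excerpt. The sketch below assumes one common definition for a pair of readings taken through mutually orthogonal (0-degree and 90-degree) polarizing filters at mutually corresponding pixels, namely the normalized difference of the two values; the helper name is hypothetical.

```python
def degree_of_polarization(e_0deg, e_90deg):
    """Degree of polarization assumed here as |E0 - E90| / (E0 + E90) for two
    orthogonal readings at corresponding pixels; returns 0 for dark pixels."""
    total = e_0deg + e_90deg
    return abs(e_0deg - e_90deg) / total if total > 0 else 0.0
```

Repeating this for every reference pixel and its matched corresponding pixels would yield the pixel values of the polarization degree image described above.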
  • The imaging device D according to the second aspect obtains the corresponding points, including the corresponding point in the second reference image by the third eye, based on the above-described idea, without using the relational expression described with reference to FIG. 9 as in the first aspect; therefore, the corresponding points can be obtained with higher accuracy than in the first aspect described above.
  • In the first and second aspects described above, the degree of polarization is obtained as the polarization data.
  • Furthermore, by configuring the polarizing filters 11a-11 and 11b-11 for the fourth eye with a polarizing filter having a third polarization characteristic different from the polarization characteristics of the polarizing filters 11a-12 to 11a-22 and 11b-12 to 11b-22 used for the first to third eyes, for example a 45-degree polarizing filter that emits incident light as 45-degree linearly polarized light as the third polarization characteristic, the imaging devices D according to the first and second aspects described above can also obtain the polarization angle as the polarization data and, by using a value representing the polarization angle as each pixel value, can create a new composite image relating to the polarization angle (a polarization angle image).
  • FIG. 12 is a diagram illustrating a plurality of polarizing filters according to the third aspect.
  • FIG. 13 is a diagram for explaining the corresponding point search in the corresponding point search unit of the third aspect.
  • FIG. 14 is a graph showing the evaluation value (the difference between the first polarization degree and the second polarization degree) when the corresponding point candidates (reference pixels) are scanned on the epipolar lines.
  • The horizontal axis in FIG. 14 is 1/Z, where Z is the distance from the imaging device D to the object (subject), and the vertical axis represents the evaluation value (the difference between the first and second polarization degrees).
  • As shown in FIG. 12, the plurality of polarizing filters 11c in the third aspect include two polarizing filter sets, each consisting of first and second polarizing filters (11c-22 and 11c-12; 11c-21 and 11c-11), and each of the two polarizing filter sets includes polarizing filters 11c having mutually different polarization characteristics.
  • The polarization filters 11c-22 and 11c-11, arranged at the position of 2 rows 2 columns and the position of 1 row 1 column, are 0-degree polarization filters having the first polarization characteristic, and the polarization filters 11c-12 and 11c-21, arranged at the position of 1 row 2 columns and the position of 2 rows 1 column, are 90-degree polarization filters having the second polarization characteristic.
  • The polarization filters 11c-22 and 11c-11 having the first polarization characteristic are arranged along the first diagonal direction of the 2-row, 2-column two-dimensional array, the polarization filters 11c-12 and 11c-21 having the second polarization characteristic are arranged along the second diagonal direction of the 2-row, 2-column two-dimensional array, and the first and second diagonal directions intersect each other (FIG. 12).
  • In the example shown in FIG. 12, the imaging optical system 12-22 and the imaging unit 13A-22 arranged at the position of 2 rows 2 columns corresponding to the polarizing filter 11c-22 are referred to as the first eye, the imaging optical system 12-21 and the imaging unit 13A-21 arranged at the position of 2 rows 1 column corresponding to the polarizing filter 11c-21 are referred to as the second eye, the imaging optical system 12-12 and the imaging unit 13A-12 arranged at the position of 1 row 2 columns corresponding to the polarizing filter 11c-12 are referred to as the third eye, and the imaging optical system 12-11 and the imaging unit 13A-11 arranged at the position of 1 row 1 column corresponding to the polarizing filter 11c-11 are referred to as the fourth eye.
  • Since the first eye includes the polarizing filter 11c-22, the imaging unit 13A-22 generates image data based on light of the 0-degree polarization component; likewise, since the fourth eye includes the polarization filter 11c-11, the imaging unit 13A-11 generates image data based on light of the 0-degree polarization component. Since the second eye includes the polarization filter 11c-21, the imaging unit 13A-21 generates image data based on light of the 90-degree polarization component, and since the third eye includes the polarization filter 11c-12, the imaging unit 13A-12 generates image data based on light of the 90-degree polarization component.
  • Such images of the first to fourth eyes exhibit parallax corresponding to the positional arrangement of the first to fourth eyes.
  • The corresponding point search unit 22c of the third aspect searches for the corresponding points according to the operation described above. More specifically, in step S1, for one of the two polarizing filter sets, the corresponding point search unit 22c uses the image generated by the imaging unit 13A that captured the optical image of the object via one of the first and second polarizing filters 11c-22 and 11c-12 as the standard image, and the image generated by the imaging unit 13A that captured the optical image of the object via the other as the first reference image.
  • For the other polarizing filter set, the image generated by the imaging unit 13A that captured the optical image of the object via one of the first and second polarizing filters 11c-21 and 11c-11 is used as the second reference image, and the image generated by the imaging unit 13A that captured the optical image of the object via the other is used as the third reference image.
  • The corresponding point search unit 22c then sets a predetermined pixel in the standard image as the reference point (reference pixel). For example, as shown in FIG. 13, the first eye image is used as the standard image, the third eye image as the first reference image, the second eye image as the second reference image, and the fourth eye image as the third reference image, and a pixel indicated by a white circle (○) in the standard image is set as the reference point (reference pixel).
  • In step S2, the corresponding point search unit 22c tentatively sets, as the candidate for the corresponding point (corresponding pixel), the point (reference pixel) in each reference image that would correspond to the reference point if its distance were infinity.
  • In step S3, the corresponding point search unit 22c determines these corresponding point candidates in the reference images corresponding to the reference point from the correspondence relationship stored in advance in the storage unit 3.
  • The points (pixels) in the first to third reference images when the distance of the reference point (○) is infinity are indicated by black circles (●).
  • In step S4, the corresponding point search unit 22c acquires the pixel values related to the density at the reference point and at each corresponding point candidate in each of the images (the standard image and the first to third reference images).
  • The first polarization degree, obtained based on the pixel value (first density) at the reference point in the standard image and the pixel value (second density) at the corresponding point candidate (first reference pixel) in the first reference image, and the second polarization degree, obtained based on the pixel value (third density) at the corresponding point candidate (second reference pixel) in the second reference image and the pixel value (fourth density) at the corresponding point candidate (third reference pixel) in the third reference image, usually differ from each other, since the candidates are not yet the actual corresponding points; however, as the candidates are scanned on the first to third reference images, the two polarization degrees come to match when the candidates reach, or come close to, the actual corresponding points.
  • Therefore, the corresponding point search unit 22c scans each epipolar line at a predetermined pixel interval (for example, one pixel at a time), starting from the points in the first to third reference images that correspond to an infinite reference-point distance, while keeping the corresponding point candidates in the first to third reference images synchronized with one another, searches for the candidates that minimize the difference between the first polarization degree and the second polarization degree, and sets the candidates at this minimum as the actual corresponding points in the first to third reference images.
  • The corresponding point search unit 22c calculates, as the evaluation value in step S5, the difference between the first polarization degree and the second polarization degree; in step S6, it compares this value with the minimum evaluation value stored in the storage unit 3 up to the previous time (here, the minimum difference between the first and second polarization degrees); and in step S7, if the result of this comparison shows that the stored minimum difference is less than or equal to the current difference between the first and second polarization degrees, the current difference is discarded.
  • Conversely, if the stored minimum difference between the first and second polarization degrees is greater than the current difference, the stored contents of the storage unit 3 are updated with the current difference between the first and second polarization degrees.
  • Here, the pixel values are modeled as:
    E1 = A1 · cos²(θ1 + 0°) + B1 (5)
    E3 = A1 · cos²(θ1 + 90°) + B1 (6)
    where the polarization component is A1, the polarization angle is θ1, and the non-polarization component is B1; and
    E2 = A2 · cos²(θ2 + 0°) + B2 (7)
    E4 = A2 · cos²(θ2 + 90°) + B2 (8)
    where the polarization component is A2, the polarization angle is θ2, and the non-polarization component is B2.
  • In step S8, the corresponding point search unit 22c determines whether all the corresponding point candidates have been processed by determining whether all the epipolar lines have been scanned. If all the epipolar lines have been scanned (Y), the next step S9 is executed; if not (N), the process returns to step S2.
  • In step S9, the corresponding point search unit 22c sets the corresponding point candidates stored in the storage unit 3 as the actual corresponding points (corresponding pixels) in the first to third reference images.
  • That is, the difference between the first polarization degree and the second polarization degree changes as shown in FIG. 14 as the candidates are scanned, and the candidates at the inverse distance 1/Z3 where this difference reaches its minimum value become the actual corresponding points (corresponding pixels) in the first to third reference images corresponding to the reference point (reference pixel) of the standard image (a sketch of this polarization-degree evaluation is given below).
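As an illustration of the evaluation value of the third aspect, the following sketch reuses the hypothetical `degree_of_polarization` helper defined earlier. The grouping of the pixel values into the first and second polarization degrees follows the description of step S4; the exact formula is an assumption, since it is not given in this excerpt.

```python
def evaluation_third_aspect(e_standard, e_ref1, e_ref2, e_ref3):
    """Evaluation value |first polarization degree - second polarization degree|.

    e_standard: reference point in the standard image (0-degree filter)
    e_ref1:     candidate in the first reference image (90-degree filter)
    e_ref2:     candidate in the second reference image (90-degree filter)
    e_ref3:     candidate in the third reference image (0-degree filter)
    """
    dop1 = degree_of_polarization(e_standard, e_ref1)   # first polarizing filter set
    dop2 = degree_of_polarization(e_ref3, e_ref2)       # second polarizing filter set
    return abs(dop1 - dop2)
```

In the third aspect, this value takes the place of the density differences used in the second aspect as the quantity minimized by the scan of steps S2 to S9.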
  • The image composition unit 23 then obtains the degree of polarization as the predetermined polarization data, as described above, based on the pixel values related to the density at the reference point and at the actual corresponding points determined in step S9.
  • The obtained degree of polarization is stored in the storage unit 3.
  • By carrying out each of the processes described above for every pixel of the standard image, a degree of polarization is obtained as the polarization data for each pixel.
  • Using a value representing the degree of polarization as each pixel value, a new composite image relating to the degree of polarization is created, this composite image is output to the output unit 5, and the process ends.
  • In this way, the imaging apparatus D according to the third aspect can obtain the corresponding points based on the above-mentioned idea; even when there is little difference in density between pixels and a density-based corresponding point search would be difficult, a corresponding point search is possible as long as the degree of polarization differs significantly between pixels.
  • FIG. 15 is a diagram illustrating a plurality of polarization filters according to the fourth aspect.
  • FIG. 16 is a diagram for explaining the corresponding point search in the corresponding point search unit of the fourth aspect.
  • FIG. 16 also shows a graph of the evaluation value (the sum of the difference between the first and second polarization degrees and the difference between the first and second polarization angles) when the corresponding point candidates (reference pixels) are scanned on the epipolar lines.
  • The horizontal axis of this graph is 1/Z, where Z is the distance from the imaging device D to the object (subject), and the vertical axis represents the evaluation value (the sum of the difference between the first and second polarization degrees and the difference between the first and second polarization angles).
  • As shown in FIG. 15, the plurality of polarization filters 11d in the fourth aspect include two polarizing filter sets, each consisting of first to third polarizing filters having three mutually different first to third polarization characteristics (11d-22, 11d-21, and 11d-11; and 11d-22, 11d-12, and 11d-11).
  • The polarizing filter 11d-22 arranged at the position of 2 rows 2 columns is common to the two polarizing filter sets, and the polarizing filter 11d-11 arranged at the position of 1 row 1 column is also common to the two polarizing filter sets.
  • The polarizing filter 11d-22 arranged at the position of 2 rows 2 columns is a 0-degree polarizing filter having the first polarization characteristic; the polarizing filters 11d-21 and 11d-12 arranged at the position of 2 rows 1 column and the position of 1 row 2 columns are 90-degree polarizing filters having the second polarization characteristic; and the polarizing filter 11d-11 arranged at the position of 1 row 1 column is a 45-degree polarizing filter that emits incident light as 45-degree linearly polarized light as the third polarization characteristic.
  • The imaging optical system 12-22 and the imaging unit 13A-22 arranged at the position of 2 rows 2 columns corresponding to the polarizing filter 11d-22 are referred to as the first eye, the imaging optical system 12-21 and the imaging unit 13A-21 arranged at the position of 2 rows 1 column corresponding to the polarizing filter 11d-21 as the second eye, the imaging optical system 12-12 and the imaging unit 13A-12 arranged at the position of 1 row 2 columns corresponding to the polarizing filter 11d-12 as the third eye, and the imaging optical system 12-11 and the imaging unit 13A-11 arranged at the position of 1 row 1 column corresponding to the polarizing filter 11d-11 as the fourth eye.
  • Since the first eye includes the polarizing filter 11d-22, the imaging unit 13A-22 generates image data based on light of the 0-degree polarization component. Since the second eye includes the polarization filter 11d-21, the imaging unit 13A-21 generates image data based on light of the 90-degree polarization component. Similarly, since the third eye includes the polarization filter 11d-12, the imaging unit 13A-12 generates image data based on light of the 90-degree polarization component, and since the fourth eye includes the polarizing filter 11d-11, the imaging unit 13A-11 generates image data based on light of the 45-degree polarization component.
  • Such images of the first to fourth eyes exhibit parallax corresponding to the positional arrangement of the first to fourth eyes.
  • The corresponding point search unit 22d of the fourth aspect searches for the corresponding points according to the operation described above. More specifically, in step S1, for one of the two polarizing filter sets, the corresponding point search unit 22d uses the image generated by the imaging unit 13A-22 that captured the optical image of the object through the common polarizing filter 11d-22 as the standard image, and uses the images generated by the imaging units 13A-21 and 13A-11 that captured the optical image of the object through the remaining polarizing filters 11d-21 and 11d-11 of the first to third polarizing filters 11d-22, 11d-21, and 11d-11, excluding the common polarizing filter 11d-22, as the first and second reference images.
  • For the other polarizing filter set, the images generated by the imaging units 13A-12 and 13A-11 that captured the optical image of the object through the remaining polarizing filters 11d-12 and 11d-11 of the first to third polarizing filters 11d-22, 11d-12, and 11d-11, excluding the common polarizing filter 11d-22, are used as the third and fourth reference images.
  • Accordingly, the imaging unit 13A-11, which captures the optical image of the object through the polarizing filter 11d-11, generates image data serving as both the second reference image and the fourth reference image.
  • The corresponding point search unit 22d then sets a predetermined pixel in the standard image as the reference point (reference pixel). For example, as shown in FIG. 16, the first eye image is used as the standard image, the second eye image as the first reference image, the fourth eye image as the second reference image (and also as the fourth reference image), and the third eye image as the third reference image, and a pixel indicated by a white circle (○) in the standard image is set as the reference point (reference pixel).
  • In step S2, the corresponding point search unit 22d tentatively sets, as the candidate for the corresponding point (corresponding pixel), the point (reference pixel) in each reference image that would correspond to the reference point if its distance were infinity.
  • In step S3, the corresponding point search unit 22d determines these corresponding point candidates in the reference images corresponding to the reference point from the correspondence relationship stored in advance in the storage unit 3.
  • The points (pixels) in the first to third reference images when the distance of the reference point (○) is infinity are indicated by black circles (●).
  • In step S4, the corresponding point search unit 22d acquires the pixel values related to the density at the reference point and at each corresponding point candidate in these images (the standard image and the first to fourth reference images; in this example, the second reference image and the fourth reference image are common).
  • The first polarization degree and the first polarization angle, obtained based on the pixel value (first density) at the reference point in the standard image, the pixel value (second density) at the corresponding point candidate (first reference pixel) in the first reference image, and the pixel value (third density) at the corresponding point candidate (second reference pixel) in the second reference image, and the second polarization degree and the second polarization angle, obtained based on the pixel value (first density) at the reference point in the standard image, the pixel value (fourth density) at the corresponding point candidate (third reference pixel) in the third reference image, and the pixel value (third density) at the corresponding point candidate (fourth reference pixel) in the fourth reference image, usually differ from each other, since the candidates are not yet the actual corresponding points.
  • However, as the corresponding point candidates are scanned on the first to fourth reference images (the second and fourth reference images being common in this example), the two polarization degrees and the two polarization angles come to match when the candidates reach, or come close to, the actual corresponding points.
  • Each corresponding point on the first to fourth reference images corresponding to the reference point exists on the epipolar line in each of the first to fourth reference images. Therefore, the corresponding point search unit 22d scans each epipolar line at a predetermined pixel interval (for example, one pixel at a time), starting from the points in the first to fourth reference images that correspond to an infinite reference-point distance, while keeping the corresponding point candidates in the first to fourth reference images synchronized with one another, searches for the candidates that minimize the sum of the polarization degree difference and the polarization angle difference, and sets the candidates at this minimum as the actual corresponding points in the first to fourth reference images.
  • An example of the corresponding point candidates being scanned on the epipolar lines is indicated by broken-line circles, and the actual corresponding points are indicated by white circles (○).
  • The corresponding point search unit 22d calculates, as the evaluation value in step S5, the sum of the polarization degree difference and the polarization angle difference; in step S6, it compares this value with the minimum evaluation value stored in the storage unit 3 up to the previous time (here, the minimum sum of the polarization degree difference and the polarization angle difference); and in step S7, if the result of this comparison shows that the stored minimum sum is less than or equal to the current sum of the polarization degree difference and the polarization angle difference, the current sum is discarded, whereas if the stored minimum sum is greater, the stored contents of the storage unit 3 are updated with the current sum.
  • Here, the pixel values are modeled as:
    E1 = A1 · cos²(θ1 + 0°) + B1 (9)
    E2 = A1 · cos²(θ1 + 90°) + B1 (10)
    E4 = A1 · cos²(θ1 + 45°) + B1 (11)
    where E1 is the pixel value (first density) at the reference point of the standard image by the first eye, E2 is the pixel value (second density) at the corresponding point of the first reference image by the second eye, E4 is the pixel value (third density) at the corresponding point of the second reference image by the fourth eye, the polarization component is A1, the polarization angle is θ1, and the non-polarization component is B1; and
    E1 = A2 · cos²(θ2 + 0°) + B2 (12)
    E3 = A2 · cos²(θ2 + 90°) + B2 (13)
    E4 = A2 · cos²(θ2 + 45°) + B2 (14)
    where E3 is the pixel value (fourth density) at the corresponding point of the third reference image by the third eye, the polarization component is A2, the polarization angle is θ2, and the non-polarization component is B2 (a closed-form solution of these equations is sketched below).
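Equations (9) to (11) (and likewise (12) to (14)) can be solved in closed form for the unknowns A, θ, and B. The sketch below is a standard rearrangement of the cos² model and is not code taken from the patent; under the common definition (Imax - Imin) / (Imax + Imin), the degree of polarization then follows as A / (A + 2B).

```python
import math

def solve_polarization(e_0deg, e_90deg, e_45deg):
    """Solve E(alpha) = A * cos(theta + alpha)**2 + B for readings at
    alpha = 0, 90 and 45 degrees (cf. equations (9)-(11) and (12)-(14)).

    Returns (A, theta_deg, B): polarization component, polarization angle in
    degrees, and non-polarization component.
    """
    s = e_0deg + e_90deg                      # = A + 2B
    c = e_0deg - e_90deg                      # = A * cos(2*theta)
    d = e_0deg + e_90deg - 2.0 * e_45deg      # = A * sin(2*theta)
    amp = math.hypot(c, d)                    # polarization component A
    theta_deg = 0.5 * math.degrees(math.atan2(d, c))   # polarization angle theta
    bias = (s - amp) / 2.0                    # non-polarization component B
    return amp, theta_deg, bias

# Example with assumed values: A = 2, theta = 30 deg, B = 1 gives readings
# E(0) = 2.5, E(90) = 1.5, E(45) ~= 1.134, and solve_polarization recovers
# approximately (2.0, 30.0, 1.0).
```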
  • In step S8, the corresponding point search unit 22d determines whether all the corresponding point candidates have been processed by determining whether all the epipolar lines have been scanned. If all the epipolar lines have been scanned (Y), the next step S9 is executed; if not (N), the process returns to step S2.
  • In step S9, the corresponding point search unit 22d sets the corresponding point candidates stored in the storage unit 3 as the actual corresponding points (corresponding pixels) in the first to fourth reference images.
  • That is, the sum of the polarization degree difference and the polarization angle difference changes as the candidates are scanned, and the candidates at the inverse distance 1/Z4 where this sum reaches its minimum value become the actual corresponding points (corresponding pixels) in the first to fourth reference images corresponding to the reference point (reference pixel) of the standard image.
  • The image composition unit 23 then obtains the polarization angle as the predetermined polarization data, as described above, based on the pixel values related to the density at the reference point and at the actual corresponding points determined in step S9.
  • The obtained polarization angle is stored in the storage unit 3.
  • By carrying out each of the processes described above for every pixel of the standard image, a polarization angle is obtained as the polarization data for each pixel.
  • Using a value representing the polarization angle as each pixel value, a new composite image relating to the polarization angle is created, this composite image is output to the output unit 5, and the process ends.
  • In the above description, the polarization angle image is created as the composite image, but the polarization degree image and a non-polarization component image relating to the non-polarization component may also be created as the composite image.
  • The polarization angle θ is obtained by solving, for the unknown variables A, θ, and B, the simultaneous equations consisting of the above three equations (9) to (11) for one set of polarizing filters (or the simultaneous equations consisting of the three equations (12) to (14) for the other set), as sketched above; the polarization component A and the non-polarization component B obtained at the same time can be used for the polarization degree image and the non-polarization component image.
  • One of the plurality of individual eyes corresponding to one set of polarizing filters may also be configured to image the incident light with its characteristics as they are.
  • In this way, the imaging device D according to the fourth aspect can obtain the corresponding points based on the above-mentioned idea; even when there is little difference between pixels in density and in degree of polarization and a corresponding point search would be difficult, a corresponding point search is possible as long as the polarization angle differs significantly between pixels.
  • The first to fourth eyes are named in common across the first to fourth aspects described above, but the assignment of the first to fourth reference images is not necessarily common to the first to fourth aspects; for example, in one aspect the image by the second eye is the first reference image, whereas in another aspect the image by the second eye is the second reference image.
  • As described above, in the present embodiment, the plurality of imaging units 13A each capture an optical image of the object through the plurality of polarization filters 11 fixedly arranged on the optical paths of the plurality of imaging optical systems 12. Therefore, the imaging apparatus D in the present embodiment does not need to switch polarization filters as in the conventional case; the conventional switching mechanism is unnecessary, and the risk of failure is reduced. Further, the imaging apparatus D in the present embodiment does not need to capture the subject a plurality of times as in the conventional case, and can therefore capture a changing subject.
  • The imaging device D obtains the corresponding points between the plurality of images by the corresponding point search unit 22, and synthesizes the plurality of images by the image synthesis unit 23 based on the obtained corresponding points, so that a new synthesized image can be created from the plurality of images having parallax.
  • Since the imaging device D in the present embodiment creates a polarization degree image as the composite image, referring to the created polarization degree image makes it possible to distinguish, even for objects of the subject that look the same in terms of luminance, differences in surface condition, light source, subject surface angle (object surface angle), and positional relationship with the imaging device, which are reflected in the created polarization degree image.
  • FIG. 18 is a diagram showing polarization degree images as an example.
  • FIG. 18A is an image captured and generated by the imaging unit 13A via the 0-degree polarizing filter (0-degree polarization image), and FIG. 18B is an image captured and generated by the imaging unit 13A via the 90-degree polarizing filter (90-degree polarization image).
  • FIG. 18C is an image of the difference between the 90-degree polarization image and the 0-degree polarization image (90-degree minus 0-degree polarization image), and FIG. 18D is an image obtained from the 90-degree polarization image and the 0-degree polarization image.
  • FIG. 18E is a polarization degree image synthesized based on the 0-degree polarization image and the 90-degree polarization image.
  • In FIG. 18, for example, the side surface of a cylindrical body (the barrel of a monocular imaging optical system), which is one of the subjects, shows no significant contrast in FIGS. 18A to 18D, so a corresponding point search there is difficult, whereas in FIG. 18E there is significant contrast and a corresponding point search is possible.
  • Likewise, a subject that is difficult to identify in FIGS. 18A to 18D, such as the substrate or the cylindrical body, can be identified in FIG. 18E.
  • When the imaging device D creates a polarization angle image as the composite image, referring to the created polarization angle image makes it possible to estimate, for example, the direction and the time even under a sky where the sun is not visible.
  • an imaging apparatus that creates a non-polarized component image as the composite image can be provided.
  • The plurality of polarizing filters 11 may be a 0-degree polarizing filter, a 90-degree polarizing filter, and a 45-degree polarizing filter.
  • In many cases, the object of the subject has a horizontal plane along the horizontal direction and a vertical plane along the vertical direction.
  • Since such an imaging apparatus D includes, as the plurality of polarizing filters 11, the 0-degree polarizing filter, the 90-degree polarizing filter, and the 45-degree polarizing filter between 0 and 90 degrees, it can appropriately image the objects of most subjects.
  • Alternatively, the plurality of polarizing filters 11 may be a 0-degree polarizing filter that emits incident light as 0-degree linearly polarized light as the polarization characteristic, a 60-degree polarizing filter that emits incident light as 60-degree linearly polarized light, and a 120-degree polarizing filter that emits incident light as 120-degree linearly polarized light.
  • Since such an imaging apparatus D includes polarizing filters that divide 180 degrees equally, the error sensitivity can be reduced regardless of the polarization angle of the optical image of the object (a sketch under this 0/60/120-degree configuration follows).
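For this 0/60/120-degree configuration, the same cos² model can be fitted in closed form because the three filter angles divide 180 degrees equally. The sketch below is a standard three-sample fit under that assumption, not code from the patent; the function name is hypothetical.

```python
import math

def solve_polarization_120(e_0deg, e_60deg, e_120deg):
    """Fit E(alpha) = A * cos(theta + alpha)**2 + B to readings at
    alpha = 0, 60 and 120 degrees and return (A, theta_deg, B)."""
    readings = ((e_0deg, 0.0), (e_60deg, 60.0), (e_120deg, 120.0))
    s0 = (2.0 / 3.0) * sum(e for e, _ in readings)       # = A + 2B
    s1 = (4.0 / 3.0) * sum(e * math.cos(math.radians(2.0 * a))
                           for e, a in readings)         # = A * cos(2*theta)
    s2 = -(4.0 / 3.0) * sum(e * math.sin(math.radians(2.0 * a))
                            for e, a in readings)        # = A * sin(2*theta)
    amp = math.hypot(s1, s2)                             # polarization component A
    theta_deg = 0.5 * math.degrees(math.atan2(s2, s1))   # polarization angle theta
    bias = (s0 - amp) / 2.0                              # non-polarization component B
    return amp, theta_deg, bias
```

Because the three angles are spread evenly over 180 degrees, no orientation of the incoming polarization falls between two nearly parallel filters, which is one way to read the reduced error sensitivity mentioned above.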
  • As indicated by the broken line in the figure, the control image processing unit 2 of the imaging device D may further be functionally provided with a distance calculation unit 24 that obtains the parallax based on the corresponding points obtained by the corresponding point search unit 22 and obtains the distance from the parallax.
  • The distance is obtained by the distance calculation unit 24 based on the principle of so-called triangulation, as described above; this provides an imaging device D that can measure the distance to a predetermined subject (a sketch of the triangulation relation follows).
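The triangulation used by the distance calculation unit 24 can be sketched as the usual rectified-stereo relation between disparity and distance; the focal length in pixels and the baseline are assumed known from calibration, and the function name is hypothetical.

```python
def distance_from_disparity(focal_px, baseline_m, disparity_px):
    """Distance Z to the point seen at a pair of corresponding pixels,
    from the parallax (disparity) between the standard image and a
    reference image: Z = f * B / d for a baseline B and focal length f."""
    if disparity_px <= 0.0:
        return float("inf")   # zero disparity corresponds to an object at infinity
    return focal_px * baseline_m / disparity_px
```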
  • In the above description, the array imaging units 1A and 1B are configured to include four individual eyes arranged in 2 rows and 2 columns, but the present invention is not limited thereto; it is sufficient to include at least three or more individual eyes.
  • For example, when an object in front of the imaging apparatus D hides an object behind it, causing an occlusion that cannot be seen from the imaging apparatus D, or when an object has a repetitive pattern such as a striped pattern, the corresponding point search may be difficult for an imaging apparatus D including four individual eyes arranged in 2 rows and 2 columns.
  • If the imaging apparatus D is configured to include an array imaging unit 1 having four individual eyes above, below, to the left of, and to the right of the individual eye that generates the standard image, an occluded object can be imaged with at least one of the four individual eyes, and the corresponding point search can be performed.
  • For a repetitive pattern, the epipolar lines are scanned at different angles for individual eyes arranged in different directions with respect to the individual eye that generates the standard image, so the period of the evaluation value differs and the corresponding point search becomes possible.
  • the array imaging unit 1 may be configured with nine individual eyes arranged in three rows and three columns.
  • one of the plurality of polarizing filters 11e to 11g in the fifth to seventh aspects shown in FIG. 19 is used in the array imaging unit 1 having nine eyes.
  • FIG. 19 is a diagram showing a plurality of polarizing filters in another mode.
  • FIG. 19A shows a plurality of polarizing filters 11e in the fifth embodiment
  • FIG. 19B shows a plurality of polarizing filters 11f in the sixth embodiment
  • FIG. 19C shows a plurality of polarizing filters 11g in the seventh embodiment.
  • the plurality of polarizing filters 11e in the fifth aspect includes nine polarizing filters 11e-11 to 11e-33 arranged in a two-dimensional array of three rows and three columns.
  • The polarizing filters 11e-11, 11e-13, 11e-22, 11e-31, and 11e-33, arranged at the position of 1 row 1 column, the position of 1 row 3 columns, the position of 2 rows 2 columns, the position of 3 rows 1 column, and the position of 3 rows 3 columns, are 0-degree polarizing filters, and the polarizing filters 11e-12, 11e-21, 11e-23, and 11e-32, arranged at the position of 1 row 2 columns, the position of 2 rows 1 column, the position of 2 rows 3 columns, and the position of 3 rows 2 columns, are 90-degree polarizing filters.
  • By using the image by the individual eye arranged at the position of 2 rows 2 columns as the standard image and the images by the individual eyes arranged at the position of 1 row 1 column, the position of 1 row 3 columns, the position of 3 rows 1 column, and the position of 3 rows 3 columns as reference images, the plurality of polarizing filters 11e in the fifth aspect are suitably used with the corresponding point search unit 22a in the first aspect.
  • the plurality of polarizing filters 11f in the sixth aspect includes nine polarizing filters 11f-11 to 11f-33 arranged in a two-dimensional array of three rows and three columns.
  • In the plurality of polarizing filters 11f according to the sixth aspect, similarly to the plurality of polarizing filters 11e according to the fifth aspect described above, the polarization filters 11f-11, 11f-13, 11f-22, 11f-31, and 11f-33 arranged at the position of 1 row 1 column, the position of 1 row 3 columns, the position of 2 rows 2 columns, the position of 3 rows 1 column, and the position of 3 rows 3 columns are 0-degree polarization filters, and the polarizing filters 11f-12, 11f-21, 11f-23, and 11f-32 arranged at the position of 1 row 2 columns, the position of 2 rows 1 column, the position of 2 rows 3 columns, and the position of 3 rows 2 columns are 90-degree polarizing filters.
  • By setting the image by the individual eye arranged at the position of 2 rows 2 columns as the standard image and the second reference image and using the images by the individual eyes arranged at the position of 2 rows 1 column and the position of 2 rows 3 columns as the other reference images, the plurality of polarizing filters 11f in the sixth aspect are suitably used with the corresponding point search unit 22c in the third aspect.
  • Likewise, by setting the image by the individual eye arranged at the position of 2 rows 2 columns as the standard image and the second reference image and using the images by the individual eyes arranged at the position of 1 row 2 columns and the position of 3 rows 2 columns as the other reference images, the plurality of polarizing filters 11f in the sixth aspect are suitably used with the corresponding point search unit 22c in the third aspect.
  • the plurality of polarizing filters 11g according to the seventh aspect includes nine polarizing filters 11g-11 to 11g-33 arranged in a two-dimensional array of three rows and three columns as shown in FIG. 19C.
  • In the plurality of polarizing filters 11g according to the seventh aspect, the polarizing filter 11g-22 arranged at the position of 2 rows 2 columns is a 0-degree polarizing filter; the polarizing filters 11g-11, 11g-13, 11g-31, and 11g-33 arranged at the position of 1 row 1 column, the position of 1 row 3 columns, the position of 3 rows 1 column, and the position of 3 rows 3 columns are 90-degree polarizing filters; and the polarizing filters 11g-12, 11g-21, 11g-23, and 11g-32 arranged at the position of 1 row 2 columns, the position of 2 rows 1 column, the position of 2 rows 3 columns, and the position of 3 rows 2 columns are 45-degree polarizing filters.
  • For example, by using the image by the individual eye arranged at the position of 2 rows 2 columns as the standard image, the images by the individual eyes arranged at the position of 2 rows 1 column and the position of 1 row 1 column as the first and second reference images, and the images by the individual eyes arranged at the position of 1 row 2 columns and the position of 1 row 1 column as the third and fourth reference images, the plurality of polarizing filters 11g in the seventh aspect are suitably used with the corresponding point search unit 22d in the fourth aspect.
  • Similarly, the image by the individual eye arranged at the position of 2 rows 2 columns may be used as the standard image, with the images by the individual eyes arranged at the position of 2 rows 3 columns and the position of 1 row 3 columns as the first and second reference images and the images by the individual eyes arranged at the position of 1 row 2 columns and the position of 1 row 3 columns as the third and fourth reference images.
  • Further, the image by the individual eye arranged at the position of 2 rows 2 columns may be used as the standard image, with the images by the individual eyes arranged at the position of 2 rows 1 column and the position of 3 rows 1 column as the first and second reference images and the images by the individual eyes arranged at the position of 3 rows 2 columns and the position of 3 rows 1 column as the third and fourth reference images, or with the images by the individual eyes arranged at the position of 2 rows 3 columns and the position of 3 rows 3 columns as the first and second reference images and the images by the individual eyes arranged at the position of 3 rows 2 columns and the position of 3 rows 3 columns as the third and fourth reference images; in these cases too, the plurality of polarizing filters 11g in the seventh aspect are suitably used with the corresponding point search unit 22d in the fourth aspect.
  • An imaging apparatus according to one aspect includes: an array imaging unit including at least three or more imaging optical systems and a plurality of imaging units that, corresponding to the plurality of imaging optical systems, each capture an optical image of an object formed by the respective imaging optical system; a corresponding point search unit for obtaining corresponding points between the plurality of images acquired by the plurality of imaging units; and an image combining unit that combines the plurality of images based on the corresponding points obtained by the corresponding point search unit to create a new combined image relating to polarization.
  • The array imaging unit further includes a plurality of polarizing filters, corresponding to the plurality of imaging optical systems and disposed on the optical path of each of the plurality of imaging optical systems, and the plurality of polarizing filters include polarizing filters having mutually different polarization characteristics.
  • In such an imaging apparatus, each of the plurality of imaging units captures an optical image of the object via the polarization filter fixedly disposed on the optical path of the corresponding imaging optical system.
  • Therefore, the apparatus does not need to switch polarization filters as in the prior art; the conventional switching mechanism is unnecessary, and the risk of occurrence of a failure is reduced. Further, the above imaging apparatus does not need to image a subject a plurality of times as in the conventional case, and can therefore image a changing subject.
  • the imaging device obtains corresponding points between the plurality of images by a corresponding point search unit. Since the plurality of images are combined by the image combining unit based on the obtained corresponding points, a new combined image can be created from the plurality of images having parallax.
  • In one aspect, the plurality of polarizing filters include a pair of polarizing filters having the same polarization characteristics, and the corresponding point search unit obtains the corresponding points between the images generated by the pair of imaging units that capture the optical image of the object via the pair of polarizing filters.
  • Since such an imaging apparatus includes a pair of polarizing filters having the same polarization characteristics, and the images captured through the pair of polarizing filters show the same object, the corresponding points can be obtained between these images.
  • In another aspect, the plurality of imaging optical systems number at least four, the plurality of polarizing filters include a pair of polarizing filters having the same polarization characteristics, and the corresponding point search unit uses one of the pair of images generated by the pair of imaging units that captured the optical image of the object through the pair of polarizing filters as the standard image and the other as the first reference image, uses an image generated by the imaging unit corresponding to one of the remaining imaging optical systems, excluding the pair of imaging optical systems corresponding to the pair of polarizing filters, as the second reference image, and uses an image generated by the imaging unit corresponding to another of the remaining imaging optical systems as the third reference image.
  • Then, while scanning the epipolar lines in the first to third reference images from the pixels corresponding to infinity with the first to third reference pixels kept in correspondence with one another, the corresponding point search unit obtains the corresponding points by setting, as the corresponding pixels in the first to third reference images corresponding to the reference pixel, the first to third reference pixels that minimize the sum of the difference between the reference pixel value relating to the density of a predetermined reference pixel in the standard image and the first reference pixel value relating to the density of the first reference pixel in the first reference image, and the difference between the second reference pixel value relating to the density of the second reference pixel in the second reference image and the third reference pixel value relating to the density of the third reference pixel in the third reference image.
  • the imaging apparatus can determine the corresponding point based on such an idea, and can determine the corresponding point with higher accuracy than in the case of using the above-described pair of polarizing filters.
  • In another aspect, the plurality of imaging optical systems number at least four, and the plurality of polarizing filters include two polarizing filter sets, each consisting of first and second polarizing filters having mutually different polarization characteristics.
  • For one of the two polarizing filter sets, the corresponding point search unit uses the image generated by the imaging unit that captured the optical image of the object via one of the first and second polarizing filters as the standard image and the image generated by the imaging unit that captured the optical image via the other as the first reference image, and obtains a first degree of polarization based on the reference pixel value relating to the density of a predetermined reference pixel in the standard image and the first reference pixel value relating to the density of the first reference pixel in the first reference image; for the other of the two polarizing filter sets, it uses the image generated by the imaging unit that captured the optical image of the object via one of the first and second polarizing filters as the second reference image and the image generated by the imaging unit that captured the optical image via the other as the third reference image, and obtains a second degree of polarization based on the second and third reference pixel values relating to the densities of the second and third reference pixels in the second and third reference images.
  • Then, while scanning the epipolar lines from the pixels corresponding to infinity with the first to third reference pixels kept in correspondence, the corresponding point search unit obtains the corresponding points by setting, as the corresponding pixels in the first to third reference images corresponding to the reference pixel, the first to third reference pixels that minimize the difference between the first polarization degree and the second polarization degree; this difference is considered to be minimal in the scan when the first to third reference pixels are located at or near the corresponding points of the reference pixel.
  • the imaging apparatus can determine the corresponding points based on such an idea, and even when there is not much difference between pixels in density and it is difficult to search for corresponding points, there is a significant difference between pixels in degree of polarization. It is possible to search for corresponding points.
  • In another aspect, the plurality of imaging optical systems number at least four, the plurality of polarizing filters include two polarizing filter sets, each consisting of first to third polarizing filters having at least three mutually different first to third polarization characteristics, and the two polarizing filter sets include a common polarizing filter.
  • For one of the two polarizing filter sets, the corresponding point search unit uses the image generated by the imaging unit that captured the optical image of the object through the common polarizing filter as the standard image and the images generated by the imaging units that captured the optical image through the remaining polarizing filters of the first to third polarizing filters, excluding the common polarizing filter, as the first and second reference images, and obtains a first polarization degree and a first polarization angle based on the reference pixel value relating to the density of a predetermined reference pixel in the standard image and the first and second reference pixel values relating to the densities of the first and second reference pixels in the first and second reference images; for the other set, it likewise uses the images generated through the remaining polarizing filters, excluding the common polarizing filter, as the third and fourth reference images and obtains a second polarization degree and a second polarization angle based on the reference pixel value and the third and fourth reference pixel values relating to the densities of the third and fourth reference pixels in the third and fourth reference images.
  • Then, while scanning the epipolar lines from the pixels corresponding to infinity with the first to fourth reference pixels kept in correspondence, the corresponding point search unit obtains the corresponding points by setting, as the corresponding pixels in the first to fourth reference images corresponding to the reference pixel, the first to fourth reference pixels that minimize the sum of the difference between the first and second polarization degrees and the difference between the first and second polarization angles.
  • Such an imaging device can obtain the corresponding points based on such an idea; even when there is little difference between pixels in density and in degree of polarization and a corresponding point search would be difficult, a corresponding point search can be performed as long as the polarization angle differs significantly between pixels.
  • In another aspect, the plurality of polarizing filters include a 0-degree polarizing filter that emits incident light as 0-degree linearly polarized light as the polarization characteristic, a 45-degree polarizing filter that emits incident light as 45-degree linearly polarized light, and a 90-degree polarizing filter that emits incident light as 90-degree linearly polarized light.
  • Since such an imaging apparatus includes the 0-degree polarizing filter, the 90-degree polarizing filter, and the 45-degree polarizing filter between 0 and 90 degrees, it can appropriately image the objects of most subjects.
  • In another aspect, the plurality of polarizing filters include a 0-degree polarizing filter that emits incident light as 0-degree linearly polarized light as the polarization characteristic, a 60-degree polarizing filter that emits incident light as 60-degree linearly polarized light, and a 120-degree polarizing filter that emits incident light as 120-degree linearly polarized light.
  • Since such an image pickup apparatus includes polarizing filters that divide 180 degrees equally, the error sensitivity can be reduced regardless of the polarization angle of the optical image of the object.
  • In another aspect, the corresponding point search unit obtains, for each pixel of the standard image taken as the reference pixel, the corresponding pixel in each reference image corresponding to that reference pixel, and the image synthesizing unit obtains, for each pixel of the standard image, the degree of polarization based on the pixel values related to the densities of the mutually corresponding reference pixel and corresponding pixels obtained by the corresponding point search unit, and creates, as the composite image, a polarization degree image representing each obtained degree of polarization.
  • In another aspect, the plurality of polarization filters include at least three types of polarization filters having mutually different polarization characteristics, the corresponding point search unit obtains, for each pixel of the standard image taken as the reference pixel, the corresponding pixel in each reference image corresponding to that reference pixel, and the image composition unit obtains, for each pixel of the standard image, the polarization angle based on the pixel values related to the densities of the mutually corresponding reference pixel and corresponding pixels obtained by the corresponding point search unit, and creates, as the composite image, a polarization angle image representing each obtained polarization angle.
  • Since such an imaging apparatus creates a polarization angle image as the composite image, it is possible, by referring to the created polarization angle image, to estimate, for example, the direction and the time even under a sky where the sun is not visible.
  • In another aspect, the plurality of polarization filters include at least three types of polarization filters having mutually different polarization characteristics, the corresponding point search unit obtains, for each pixel of the standard image taken as the reference pixel, the corresponding pixel in each reference image corresponding to that reference pixel, and the image composition unit obtains, for each pixel of the standard image, the non-polarized component based on the pixel values related to the densities of the mutually corresponding reference pixel and corresponding pixels obtained by the corresponding point search unit, and creates, as the composite image, a non-polarized component image representing each obtained non-polarized component.
  • The above-described imaging devices may further include a distance calculation unit that obtains the parallax based on the corresponding points obtained by the corresponding point search unit and obtains the distance to the subject appearing at the corresponding points.
  • According to the above, an imaging apparatus that acquires a plurality of images having mutually different polarization characteristics can be provided.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The present invention relates to an imaging device that synthesizes, on the basis of corresponding points, a plurality of images captured by an imaging array unit having at least three imaging optical systems, in order to generate a new composite image relating to polarization. The imaging array unit includes a plurality of polarizing filters arranged along the optical paths of each of the plurality of imaging optical systems. The plurality of polarizing filters include polarizing filters having mutually different polarization characteristics.
PCT/JP2015/058339 2014-05-30 2015-03-19 Dispositif d'imagerie WO2015182225A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014113101 2014-05-30
JP2014-113101 2014-05-30

Publications (1)

Publication Number Publication Date
WO2015182225A1 true WO2015182225A1 (fr) 2015-12-03

Family

ID=54698567

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/058339 WO2015182225A1 (fr) 2014-05-30 2015-03-19 Dispositif d'imagerie

Country Status (1)

Country Link
WO (1) WO2015182225A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012033149A (ja) * 2010-07-01 2012-02-16 Ricoh Co Ltd 物体識別装置
JP2013044597A (ja) * 2011-08-23 2013-03-04 Canon Inc 画像処理装置および方法、プログラム
WO2014119257A1 (fr) * 2013-01-29 2014-08-07 パナソニック株式会社 Système de capture d'image

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012033149A (ja) * 2010-07-01 2012-02-16 Ricoh Co Ltd 物体識別装置
JP2013044597A (ja) * 2011-08-23 2013-03-04 Canon Inc 画像処理装置および方法、プログラム
WO2014119257A1 (fr) * 2013-01-29 2014-08-07 パナソニック株式会社 Système de capture d'image

Similar Documents

Publication Publication Date Title
WO2018161466A1 (fr) Système et procédé d'acquisition d'image de profondeur
US20140307100A1 (en) Orthographic image capture system
JP6223169B2 (ja) 情報処理装置、情報処理方法およびプログラム
CN108700408B (zh) 三维形状数据及纹理信息生成系统、方法及拍摄控制方法
KR101737085B1 (ko) 3차원 카메라
JP2013207415A (ja) 撮像システム及び撮像方法
US10057563B2 (en) Image pick-up apparatus, portable terminal having the same and image pick-up method using apparatus
JP2015207861A (ja) 撮像装置および撮像方法
WO2013069292A1 (fr) Dispositif de correction de flou d'image
US20170155889A1 (en) Image capturing device, depth information generation method and auto-calibration method thereof
US20230017668A1 (en) Mobile communication terminal
JP2021141446A (ja) 撮像装置及びその制御方法
KR20180033128A (ko) 촬상 장치, 촬상 방법, 및 프로그램
TWI508554B (zh) 基於光場相機的影像對焦處理方法及其系統
US20140354777A1 (en) Apparatus and method for obtaining spatial information using active array lens
JP2015148498A (ja) 測距装置および測距方法
WO2020017377A1 (fr) Caméra de télémétrie
JP2020193820A (ja) 計測装置、撮像装置、制御方法及びプログラム
WO2015182225A1 (fr) Dispositif d'imagerie
CN112866546B (zh) 对焦方法和装置、电子设备、计算机可读存储介质
US9743069B2 (en) Camera module and apparatus for calibrating position thereof
WO2011148746A1 (fr) Dispositif d'imagerie en trois dimensions, dispositif de détection de visage et procédé de commande de fonctionnement pour ceux-ci
JP6045280B2 (ja) 撮像装置
JP2015207862A (ja) 撮像装置および撮像方法
JP2017215851A (ja) 画像処理装置および画像処理方法、造形システム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15800159

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 15800159

Country of ref document: EP

Kind code of ref document: A1