WO2011070708A1 - Image processing apparatus - Google Patents

Image processing apparatus

Info

Publication number
WO2011070708A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
luminance
pixel
polarization
camera
Prior art date
Application number
PCT/JP2010/006346
Other languages
English (en)
Japanese (ja)
Inventor
Katsuhiro Kanamori
Original Assignee
Panasonic Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Corporation
Priority to JP2010549980A priority Critical patent/JP4762369B2/ja
Priority to CN201080012155.7A priority patent/CN102356628B/zh
Priority to EP10835639.5A priority patent/EP2512124B1/fr
Priority to US13/102,308 priority patent/US8654246B2/en
Publication of WO2011070708A1 publication Critical patent/WO2011070708A1/fr

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00: General purpose image data processing
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56: Cameras or camera modules comprising electronic image sensors; Control thereof; provided with illuminating means
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70: Circuitry for compensating brightness variation in the scene
    • H04N23/75: Circuitry for compensating brightness variation in the scene by influencing optical camera components
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80: Camera processing pipelines; Components thereof

Definitions

  • The present invention relates to an image processing apparatus having a flexible zooming function capable of generating a high-resolution image with higher definition than the original pixels that can be acquired by an image sensor.
  • Techniques that achieve higher resolution by further subdividing each pixel of a conventional luminance image, while still using a small camera, are generally called "super-resolution" techniques.
  • In such techniques, a plurality of images obtained by continuous shooting or moving-image shooting of an object are input.
  • A correspondence between pixels that image the same part of the subject is established across the plurality of images, and the pattern of a high-resolution image is estimated from the corresponding pixel groups.
  • Product photographs for online shopping and medical images, however, are usually single still images.
  • Surveillance cameras also often provide only discrete still images. In these cases, the conventional high-resolution technology that relies on "motion" cannot be used.
  • In motion-based methods, the movement of corresponding portions of the subject across temporally different frame images is detected by luminance-based matching. This makes it difficult to detect the amount of movement in regions where many specular reflections occur, such as glossy surfaces.
  • Non-Patent Document 1 discloses a technique that enables subdivision within one pixel, even on a glossy surface, on the premise of a still image.
  • An object such as a metal coin or a stone sculpture is selected that exhibits specular reflection and obeys a microfacet physical reflection model.
  • A large number of images of the subject are taken with a fixed camera while changing the direction of the illumination light source.
  • Based on the microfacet model, the spatial distribution of fine normals within one pixel is estimated from the many images, and the luminance produced when the fine structure within one pixel is illuminated is calculated.
  • A histogram of the fine-region normals distributed within one pixel is constructed from the large number of shots taken with the light source moved.
  • Normal integrability means that, for any closed path on the subject surface, the surface height obtained by line-integrating the normals along the path returns to its starting value when the path returns to its starting position; that is, the recovered height at each position equals the original surface height there.
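The integrability constraint above can be stated compactly in the gradient-space notation used later in this document; p and q below denote the surface-height gradients recovered from the normals (a standard restatement, not the patent's own equation):

```latex
% Normal integrability: the height obtained by line-integrating the
% gradient field (p, q) around any closed path C returns to its start.
\oint_C \left( p\,\mathrm{d}x + q\,\mathrm{d}y \right) = 0
\quad \Longleftrightarrow \quad
\frac{\partial p}{\partial y} = \frac{\partial q}{\partial x},
\qquad
p = \frac{\partial h}{\partial x}, \;\; q = \frac{\partial h}{\partial y}
```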
  • Non-Patent Document 2 discloses a technique that captures a plurality of images while moving an illumination light source and increases the resolution of the images.
  • Subjects such as stuffed toys and sneakers, for which a complete diffuse reflection model holds, are selected, and 8 to 16 images are captured to increase the resolution by a factor of 2 × 2.
  • This technique combines resolution-degradation models of the normal and albedo images, normal-integrability constraints, and a Markov-random-field probabilistic model to simultaneously increase the resolution of both the subject's normals and its albedo (reflectance).
  • Patent Document 1 discloses a patterned polarizer.
  • Patent Document 2 and Non-Patent Document 3 disclose devices capable of rotating the polarization plane. The elements disclosed in these documents can be used in the embodiments of the present invention described later.
  • the conventional technology has the following problems.
  • The present invention has been made to solve the above-described problems, and its main object is to provide an image processing apparatus that can estimate the normals of minute regions within one pixel from images captured without large movements of the light source.
  • An image processing apparatus according to the present invention includes: a polarized light source that irradiates the surface of a subject with polarized illumination, i.e., linearly polarized light whose polarization plane changes sequentially; a camera having an imaging surface on which a plurality of photosensitive cells, each of a size corresponding to one pixel, are arranged, and which acquires a luminance image of the subject surface from light reflected by the subject surface; a unit that estimates the normals of a plurality of sub-pixel regions smaller than one pixel on the subject surface, using the luminance fluctuation of the luminance image that accompanies changes of the polarization plane; and a high-resolution processing unit that increases the resolution of the luminance image using the estimated normals. When the camera acquires the luminance image of the subject surface, the polarization plane of the linearly polarized light changes while the positional relationship between the polarized light source and the camera is fixed.
  • an angle between the optical axis of the polarized light source and the optical axis of the camera is 10 ° or less.
  • the camera is a polarization camera including a polarization filter that transmits linearly polarized light polarized in a specific direction.
  • the polarizing filter has a plurality of patterned polarizers disposed at positions facing the plurality of photosensitive cells, respectively.
  • the camera is a camera that acquires a monochrome luminance image or a color luminance image.
  • The polarized light source irradiates the subject with polarized illumination while changing the polarization plane during still-image shooting, and irradiates the subject with non-polarized light having a random polarization plane during moving-image shooting, increasing the resolution of the moving image.
  • The polarized light source includes a plurality of divided light sources, each capable of emitting light independently, and a plurality of patterned polarizers provided at positions facing the divided light sources.
  • The direction of the polarization plane of the linearly polarized light is defined by the direction of the polarization transmission axis of the patterned polarizer that transmits the light emitted from some of the plurality of divided light sources.
  • The high-resolution processing unit uses the luminance fluctuation accompanying changes of the polarization plane, for each color component, to separate the component of the subject surface that exhibits specular reflection twice (the multiple-reflection component) from the component that exhibits diffuse reflection at least once.
  • The high-resolution processing unit classifies one pixel of the luminance image into a plurality of regions according to the reflection state of the subject surface, and determines a normal histogram within the pixel by estimating the luminance, normal, and area ratio of each region.
  • the plurality of regions form at least one groove extending in one direction within each pixel.
  • The high-resolution processing unit estimates a normal histogram within one pixel of the luminance image by using the multiple-reflection phenomenon of light produced on the subject surface when polarized illumination strikes it from almost directly above.
  • the high resolution processing unit estimates a normal histogram in one pixel of the luminance image using a multiple reflection phenomenon of light on the subject surface.
  • The high-resolution processing unit determines the arrangement of the sub-pixel regions within the pixel from the estimated normal histogram and generates a high-resolution normal image.
  • The plurality of regions include an S region, in which the reflected light reaches the camera after a single reflection of the polarized illumination; a T region, in which the reflected light reaches the camera after multiple reflections of the polarized illumination; and a D region, in which the reflected light does not reach the camera.
  • The image processing method of the present invention includes: irradiating a subject surface with polarized illumination, i.e., linearly polarized light whose polarization plane changes sequentially; acquiring a luminance image of the subject surface from its reflected light; estimating the normals of sub-pixel regions smaller than one pixel on the subject surface using the luminance fluctuation of the luminance image that accompanies changes of the polarization plane; and increasing the resolution of the luminance image using the estimated normals.
  • While the camera acquires the luminance image of the subject surface, the polarization plane of the linearly polarized light is changed with the positional relationship between the polarized light source and the camera fixed.
  • The program of the present invention is a program for an apparatus comprising a polarized light source that irradiates a subject surface with polarized illumination, i.e., linearly polarized light whose polarization plane changes sequentially, and a camera having an imaging surface on which a plurality of photosensitive cells, each of a size corresponding to one pixel, are arranged, the camera acquiring a luminance image of the subject surface from its reflected light.
  • The program causes the apparatus to execute the steps of: irradiating the subject surface with the polarized illumination; acquiring a luminance image of the subject surface from its reflected light; estimating the normals of sub-pixel regions smaller than one pixel on the subject surface using the luminance fluctuation of the luminance image accompanying changes of the polarization plane; and increasing the resolution of the luminance image using the estimated normals.
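As a sketch of the claimed acquisition steps (illustrative only: `capture` is a hypothetical camera interface, and the normal-estimation and resolution-enhancement stages are omitted), the per-angle capture and the per-pixel luminance fluctuation could look like this:

```python
import numpy as np

def acquire_luminance_variation(capture, angles_deg=(0, 45, 90, 135)):
    """Capture one luminance image per polarization-plane angle of the
    illumination, then compute per-pixel statistics: the averaged
    luminance and the luminance fluctuation (max - min) that the
    sub-pixel normal estimation operates on."""
    imgs = np.stack([np.asarray(capture(a), dtype=float) for a in angles_deg])
    mean = imgs.mean(axis=0)                       # observed pixel luminance
    fluctuation = imgs.max(axis=0) - imgs.min(axis=0)
    return mean, fluctuation
```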
  • According to the present invention, information reflecting the concavo-convex structure of fine regions within one pixel of the subject surface is obtained by irradiation with polarized illumination in different polarization states. Based on this information, the normal distribution of the concavo-convex structure within one pixel is estimated. As a result, a large movement of the light source is not necessary.
  • A configuration diagram of the image processing apparatus according to Embodiment 1 of the present invention.
  • A diagram showing the configuration of the polarized light source; a diagram defining the angle of the polarization plane; a diagram showing an arrangement example.
  • A diagram schematically showing the surface shape in FIG. 8B; (a) to (c) are diagrams showing the meaning of the luminance variation.
  • A graph showing the Fresnel reflectance of the P wave and the S wave with the incident angle on the horizontal axis; (a) and (b) are diagrams in which incident light undergoes multiple reflection at a pair of microfacets.
  • A diagram showing the positional relationship of the paired microfacets; a diagram explaining the "three-region model" of the subject surface; (a) shows the unevenness of the three-region model within one pixel, and (b) shows the relationship between the angle of the polarization plane of the polarized illumination and the principal azimuth angle ψImax of the principal axis 1701; (a) and (b) show the relationship between the normal vector, the camera coordinate system, and gradient space.
  • A diagram showing how the polarization plane changes with respect to the groove; a diagram showing the normal, reflection luminance, and area ratio of each region of the "three-region model"; a diagram showing the positions of the normals in gradient space.
  • A diagram showing the histogram of normals within one pixel in gradient space; (a) to (c) are diagrams showing how to assign a normal arrangement (No. 1).
  • (a) to (c) are diagrams showing how to assign a normal arrangement (No. 2).
  • (a) to (c) are diagrams showing how to assign a normal arrangement (No. 3).
  • (a) to (c) are diagrams showing how to assign a normal arrangement (No. 4).
  • (a) and (b) are diagrams showing normal arrangements.
  • (a) and (b) are diagrams showing an eight-orientation arrangement.
  • A flowchart showing the processing procedure in Embodiment 1.
  • A configuration diagram of the image processing apparatus according to Embodiment 2; (a) and (b) are diagrams showing arrangement examples.
  • A flowchart showing the processing procedure in Embodiment 2.
  • A configuration diagram of the polarized illumination in Embodiment 4; a diagram showing the change of the illumination on the time axis in Embodiment 4.
  • A vector diagram showing color reflection-component separation in Embodiment 5.
  • A flowchart showing the processing procedure in Embodiment 5.
  • the image processing apparatus of the present invention includes a polarized light source that irradiates a subject surface with linearly polarized light whose polarization plane sequentially changes as polarized illumination, and a camera (imaging device).
  • This camera includes a plurality of photosensitive cells each having a size corresponding to one pixel, and acquires a luminance image of the subject surface from reflected light of the subject surface.
  • the image processing apparatus of the present invention further includes a high resolution processing unit that increases the resolution of the luminance image.
  • the high resolution processing unit estimates a normal line of a sub-pixel area smaller than one pixel on the surface of the subject using the luminance fluctuation amount of the luminance image accompanying the change of the polarization plane.
  • the luminance image is increased in resolution using the estimated normal line.
  • the polarization plane of linearly polarized light changes in a state where the positional relationship between the polarized light source and the camera is fixed.
  • "Higher resolution" in this specification means estimating the luminances of the plurality of sub-pixel regions included in a pixel whose luminance has been averaged, thereby increasing the resolution of the luminance image.
  • the luminance of each sub pixel area included in one pixel can be determined by obtaining the normal line of each sub pixel area.
  • the high resolution processing unit is not limited to being realized by dedicated hardware, and can be suitably realized by a combination of software and hardware.
  • the high resolution processing unit of the present invention can be configured by incorporating software (program) for executing the processing of the present invention into hardware (processor) having a known configuration. Such a program is preferably incorporated in the image processing apparatus while being recorded on a recording medium, but can also be incorporated into the image processing apparatus via a communication line or wirelessly.
  • a normal histogram in one pixel is obtained by using multiple reflections on the surface of an object irradiated with polarized illumination. Then, the arrangement of the sub-pixel regions corresponding to the individual normal lines constituting the normal histogram is determined.
  • the size of “one pixel” corresponds to the size of the individual photosensitive cells (photodiodes) arranged on the imaging surface of the imaging device. Normally, one photosensitive cell corresponds to one pixel, but the central portion of four adjacent photosensitive cells may function as one pixel. Even in such a case, the size of one pixel corresponds to the size of the photosensitive cell.
  • FIG. 1B is a diagram schematically illustrating the overall configuration of the image processing apparatus according to the first embodiment of the present invention.
  • the image processing apparatus 101 of this embodiment having a free zooming function includes a polarized light source 102 and a polarized camera 103.
  • the subject 104 is irradiated with the polarized illumination 105 from the polarized light source 102.
  • Polarized reflected light 106, produced when the polarized illumination is reflected by the surface of the subject 104, reaches the polarization camera 103, where its polarization information is recorded.
  • The polarization camera 103 can acquire both the luminance image of the subject 104 and the polarization information, as will be described later.
  • the angle LV107 formed by the polarized illumination 105 and the polarized reflected light 106 is desirably 10 ° or less.
  • By making the angle LV107 small, information on the concavo-convex structure can be acquired using the multiple reflection (interreflection) of light generated in the fine concavo-convex structure of the subject 104. Reducing the angle LV107 also reduces the influence of shadows on the surface of the subject 104. From Fresnel reflection theory, when the angle LV107 is 10° or less, the P-polarized and S-polarized components for natural objects with refractive indices of 1.4 to 2.0 behave almost identically. Details are described later.
  • FIG. 2 is a diagram illustrating a configuration example of the polarization camera 103 in the present embodiment.
  • the polarization camera 103 includes a polarization imaging device 201, a luminance / polarization information processing unit 202, a polarization plane control unit 204, and an imaging control unit 205.
  • From the outputs of these units, a high-resolution normal image 208 and a high-resolution luminance image 209 are generated.
  • the polarization camera 103 in the present embodiment can simultaneously acquire a monochrome image and a polarization image in real time.
  • the polarization camera 103 outputs three types of image data: a luminance image Y, a polarization degree image D, and a polarization phase image P.
  • the polarization image sensor 201 incorporated in the polarization camera 103 acquires a luminance image and a partially polarized image of the subject at the same time. For this reason, in the polarization image sensor 201, an array of finely patterned polarizers having a plurality of different polarization main axes is arranged on an image sensor such as a CCD or MOS sensor. As the fine patterned polarizer, a photonic crystal, a structural birefringent wave plate, a wire grid, or the like can be used. This configuration is disclosed in Patent Document 1.
  • FIG. 3 is a diagram illustrating a configuration example of the polarized light source 102.
  • the polarized light source 102 includes a polarizing filter unit 301, a light source 302 that generates non-polarized light, a movable mirror 308, a mirror 309, a movable mirror control unit 312, and shutters 310 and 311.
  • the polarizing filter unit 301 can be configured by a voltage application type liquid crystal device combining a twisted nematic liquid crystal cell and a polarizing film.
  • the polarization filter unit 301 converts non-polarized light generated from the light source 302 into linearly polarized light having a polarization plane at an arbitrary polarization angle. Examples of the configuration of a device capable of rotating the polarization plane are disclosed in Patent Documents 2 and 3, Non-Patent Document 3, and the like.
  • The movable mirror 308 can move between position A and position B in FIG. 3.
  • During still-image shooting, light from the light source 302 passes through the polarizing filter unit 301 and the shutter 311 and irradiates the subject while the polarization plane is rotated. An image is captured for each polarization state, multiple times in total: the first image is captured with the polarization plane at 0° (state 303 in the figure), the second at 45° (state 304), the third at 90° (state 305), and the fourth at 135° (state 306).
  • During moving-image shooting, the movable mirror 308 moves to position B in FIG. 3. Light is then guided from the light source 302 to the mirror 309, passes through the shutter 310, and irradiates the subject. In this case the polarization plane of the light is random, pointing in all directions, as indicated by reference numeral 307. The shutters 310 and 311 open and close in a mutually exclusive manner so that the subject is never irradiated with polarized and non-polarized light at the same time.
  • FIG. 4 is a diagram showing the definition of the angle ⁇ I of the polarization plane.
  • An X-Y coordinate system is set facing the subject, and the angle of the polarization plane is defined as a rotation angle, with the negative X-axis direction taken as 0° and rotation toward the positive Y-axis direction taken as positive.
  • Since the polarization-plane angle ψI is preserved upon reflection, the polarization-plane angle of the reflected light is the same as that of the incident light.
  • FIG. 5 is a diagram illustrating an imaging surface of the polarization imaging element 201.
  • A plurality of photosensitive cells (photodiodes) are arranged on the imaging surface.
  • Each photosensitive cell generates an electric signal according to the amount of incident light by photoelectric conversion.
  • the imaging surface is covered with an array of patterned polarizers.
  • One patterned polarizer faces each photosensitive cell of the polarization imaging device 201.
  • The four patterned polarizers corresponding to four adjacent photosensitive cells are set so that their polarization transmission planes are 0°, 45°, 90°, and 135°, respectively.
  • With this arrangement, polarization information and luminance information can be obtained with virtually no reduction in resolution. That is, when the resolution (number of pixels, i.e., of photodiodes) of the image sensor is 1120 × 868, Y, D, and P images of 1119 × 867 pixels can be obtained.
  • The polarization information of the region located at the center of these four cells is obtained.
  • The central region of the pixel set 501 (whose size corresponds to that of one photosensitive cell) functions as "one pixel".
  • The size of one pixel thus corresponds to the size of one photosensitive cell, but the center position of a pixel is not constrained to the center position of a photosensitive cell.
  • According to the present invention, it is possible to estimate the normal of an area smaller than one photosensitive cell and to obtain the luminance of such a fine area.
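The 2 × 2 sliding-window readout described above can be sketched as follows; the per-window mean is used here as a simple illustrative stand-in for the actual Y calculation:

```python
import numpy as np

def mosaic_luminance(mosaic):
    """From an H x W polarizer-mosaic image (transmission axes 0/45/90/135
    degrees repeating in 2x2 blocks), every overlapping 2x2 window of
    photosensitive cells yields one output pixel, so an H x W sensor
    produces an (H-1) x (W-1) image with almost no resolution loss."""
    m = np.asarray(mosaic, dtype=float)
    # mean of each overlapping 2x2 window
    return (m[:-1, :-1] + m[:-1, 1:] + m[1:, :-1] + m[1:, 1:]) / 4.0
```

For a 1120 × 868 sensor this yields 1119 × 867 output pixels, matching the figures quoted above.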
  • A luminance image Y 206 is generated by calculating, for each pixel, a luminance from the four luminance values of the four polarization pixels whose polarization angles differ in 45° steps.
  • The phase of the cosine function that minimizes the square error can be obtained from the following equation.
  • The angles giving the minimum and maximum values can then be calculated as follows, distinguishing cases by the magnitude relationship between a and c.
  • The value of ψmax that gives this maximum value may be used directly as the polarization phase image P 207.
  • the maximum value and the minimum value of the amplitude are obtained.
  • the square error is minimized using the following equation.
  • the maximum and minimum values of the amplitude are as follows.
  • In this way, two types of image information, the luminance image Y and the polarization image P, can be obtained from the captured polarization image.
  • The resolution of the luminance image Y is considered to be almost the same as that of a luminance image taken with no patterned polarizer installed on the image sensor.
  • In addition, the polarization image P can be acquired from the image sensor simultaneously in the present embodiment. According to this embodiment, the polarization phase and the polarization degree are therefore obtained as additional polarization information; that is, by using a polarization camera, more information about the surface reflection of the subject can be obtained than from conventional luminance image information alone.
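A minimal sketch of the fitting step, assuming the standard sinusoidal model I(ψ) = mean + amp · cos 2(ψ − phase) sampled at 0°, 45°, 90°, and 135°; the function name and return layout are illustrative, not the patent's notation:

```python
import math

def fit_polarization(i0, i45, i90, i135):
    """Fit I(psi) = mean + amp*cos(2*(psi - phase)) to four samples taken
    45 degrees apart; for a pure sinusoid the fit is exact. Returns the
    ingredients of the Y (mean), P (phase) and degree-of-polarization
    images."""
    mean = (i0 + i45 + i90 + i135) / 4.0
    s1 = (i0 - i90) / 2.0             # amp * cos(2*phase)
    s2 = (i45 - i135) / 2.0           # amp * sin(2*phase)
    amp = math.hypot(s1, s2)
    phase = 0.5 * math.atan2(s2, s1)  # angle of maximum luminance
    imax, imin = mean + amp, mean - amp
    dop = (imax - imin) / (imax + imin) if (imax + imin) > 0 else 0.0
    return mean, imax, imin, phase, dop
```

With four equally spaced samples the least-squares fit reduces to these closed-form expressions, which is why no iterative optimization is needed.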
  • In the present embodiment, photographing with the polarization camera is performed while additionally changing the polarization plane of the polarized illumination.
  • FIG. 7A and FIG. 7B show images obtained by polarization imaging using a ceramic cup with a smooth surface and a wooden board having fine irregularities on the surface as subjects.
  • Each of the four images in FIG. 7B schematically depicts the corresponding image in FIG. 7A.
  • Each of the images shown in these figures corresponds to the luminance image Y shown in FIG.
  • FIG. 8A is a graph showing the luminance variation of the same pixel when a luminance image is captured while changing the polarization plane of the light (polarized illumination) incident on the surface of a wooden board.
  • FIG. 8B is a luminance image of a wooden board to be imaged.
  • FIG. 8C is a diagram schematically showing the unevenness of the surface of the wooden board of FIG. 8B.
  • FIG. 8A shows the luminance Y at a specific pixel of the luminance images obtained when the angle ψI of the polarization plane of the polarized illumination is 0°, 45°, 90°, and 135°. The graph shows that the luminance Y varies periodically with the angle ψI. Such dependence of the luminance Y on ψI is new information that cannot be observed by ordinary polarization imaging performed without changing the polarization state of the illumination.
  • FIG. 9A shows the angle of the polarization plane of the illumination.
  • The spatial change in luminance obtained from the polarization pixels at each angle is shown by curve 901 in FIG. 9C.
  • The polarization phase of curve 901 corresponds to the spatial variation obtained with respect to the angle ψO of the polarization transmission axes at the positions of the four pixels of the polarizer mosaic on the light-receiving element shown in FIG. 5.
  • In contrast, the variation in luminance Y is a temporal variation (FIG. 9B) obtained along the ψI axis as the polarization plane of the illumination changes.
  • This function includes three types of information: amplitude, phase, and average value.
  • From the amplitude and the average value, the maximum value Ymax, the minimum value Ymin, and the angles ψImax and ψImin that give the maximum and minimum are acquired.
  • The method described in (Equation 1) to (Equation 12), which fits a cosine function to four equally spaced angle samples, can be used as-is to estimate these values.
  • the fine concavo-convex model employed in this embodiment is a “micro facet model” widely used in the field of physics-based computer graphics.
  • FIG. 10 is a perspective view showing a fine normal distribution in one pixel when an uneven surface is observed.
  • A region 1001 corresponding to one pixel of the camera has an average luminance, but the luminance distribution inside it is unknown. In other words, the structure within one pixel cannot be photographed directly.
  • each microfacet has a fine normal. Since each micro facet exhibits only specular reflection, when the illumination direction L and the viewpoint direction V are determined, there are a group of micro facets 1002 that shine and a group of micro facets 1003 that do not shine.
  • The total luminance of all the shining microfacets 1002 within one pixel corresponds to the average luminance of that pixel. The luminance from a non-shining microfacet is zero (true black).
  • The fine normal of a shining microfacet 1002 corresponds to the bisector vector H of the illumination vector L and the line-of-sight vector V. Estimating the concavo-convex structure within one pixel therefore amounts to estimating the fine normals of these microfacets.
  • FIG. 12 shows a state in which polarized light having an incident angle close to zero is incident on the micro facet 1201 and direct reflection is observed with a camera.
  • In FIGS. 12A and 12B, the polarization planes of the incident polarized light differ by 90°.
  • In both cases, the polarization information of the reflected light is almost the same as that of the incident light; only the traveling direction of the light changes. This is for the following reason.
  • FIG. 13 is a graph showing the incident angle dependence of the Fresnel reflectivity.
  • The horizontal axis represents the incident angle.
  • Incident angles in the vicinity of 0° to 10°, which can be regarded as substantially normal incidence, correspond to range 1301.
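The behavior in range 1301 can be checked directly from the Fresnel equations; the following sketch uses standard optics formulas (it is not code from the patent):

```python
import math

def fresnel_reflectance(theta_i_deg, n):
    """Intensity reflectance of the S and P polarization components at an
    air/dielectric interface of refractive index n (Fresnel equations)."""
    ti = math.radians(theta_i_deg)
    if ti == 0.0:
        r0 = ((n - 1.0) / (n + 1.0)) ** 2   # normal incidence: Rs == Rp
        return r0, r0
    tt = math.asin(math.sin(ti) / n)        # refraction angle (Snell's law)
    rs = (math.sin(ti - tt) / math.sin(ti + tt)) ** 2
    rp = (math.tan(ti - tt) / math.tan(ti + tt)) ** 2
    return rs, rp
```

For n between 1.4 and 2.0 and incident angles up to 10°, Rs and Rp differ by only a few percent, which is why P- and S-polarized light behave almost identically under near-normal illumination.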
  • FIGS. 14 (a) and 14 (b) show a state in which polarized illumination is similarly incident on the subject at an incident angle near zero and the reflected light is observed with a camera.
  • Two microfacets are paired and cause multiple reflection.
  • The two microfacets form a groove 1401, and two reflections occur on its inclined surfaces.
  • Specular reflection can be regarded as the dominant phenomenon in both the first and second reflections.
  • FIG. 15 shows the positional relationship of the paired micro facets when specular reflection is assumed for the first time and the second time.
  • The slopes 1501 and 1502 forming the groove make angles α and β with the macro surface 1503 of the subject.
  • The incident angle of the first reflection is θ1, and the incident angle of the second reflection is θ2; these angles, formed between the incident light, the reflected light, and the groove slopes, are determined by α and β.
  • the polarized illumination incident perpendicularly to the groove main axis direction 1402 is P-polarized light.
  • this P-polarized light becomes very weak through the first and second reflections.
  • the S-polarized light shown in FIG. 14B does not weaken so much even after two reflections.
  • when the incident polarization plane is P-polarized with respect to the groove, the reflected light loses most of its energy, and the luminance is lowered.
  • when the incident polarization plane is S-polarized, the reflected light is attenuated much less and retains high luminance.
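This asymmetry can be sketched by chaining two Fresnel reflectances: after two specular bounces in the groove, the surviving energy is the product of the per-bounce reflectances, and the P component is suppressed far more strongly than the S component. The bounce angles (45°) and n = 1.5 below are illustrative assumptions.

```python
import math

def fresnel_rs_rp(theta_i_deg, n=1.5):
    """Fresnel power reflectances (Rs, Rp) at one air/dielectric bounce."""
    ti = math.radians(theta_i_deg)
    tt = math.asin(math.sin(ti) / n)
    rs = ((math.cos(ti) - n * math.cos(tt)) / (math.cos(ti) + n * math.cos(tt))) ** 2
    rp = ((n * math.cos(ti) - math.cos(tt)) / (n * math.cos(ti) + math.cos(tt))) ** 2
    return rs, rp

def two_bounce_energy(theta1_deg, theta2_deg, n=1.5):
    """Energy surviving two specular bounces in a groove, for S- and
    P-polarized incident light (angles are illustrative assumptions)."""
    rs1, rp1 = fresnel_rs_rp(theta1_deg, n)
    rs2, rp2 = fresnel_rs_rp(theta2_deg, n)
    return rs1 * rs2, rp1 * rp2  # (S survives, P is strongly suppressed)

s_energy, p_energy = two_bounce_energy(45.0, 45.0)
```

For these example angles the S component survives two bounces with roughly two orders of magnitude more energy than the P component, matching the luminance contrast described above.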
  • FIG. 16 is a diagram for explaining the “three-region model” of the subject surface obtained from the above consideration.
  • the three-region model is a model in which the normal structure of the surface irregularities of one pixel is simplified from the viewpoint of multiple reflection and polarization phenomenon.
  • in the three-region model, it is assumed that the following three types of two-dimensional regions are mixed at a constant area ratio within one pixel.
  • the groove is divided into a T region and a D region as will be described later.
  • S (Specular) region (direct reflection): a region formed by microfacets on which incident light is directly reflected; its reflection luminance does not vary with rotation of the polarization plane of the incident light.
  • the microfacet normal in this region corresponds to the bisector of the incident light and the camera viewpoint vector.
  • T (Twice reflection) region: a region where incident light is reflected twice within the groove and reaches the camera after the second reflection. It is formed of microfacets whose reflected luminance changes with rotation of the incident polarization plane.
  • D (Dark) region: a region formed by microfacets that reflect incident light in directions unrelated to the camera and therefore do not contribute to brightness. This region can be further divided into two. The first consists of the first-reflection facets forming the groove: they illuminate the opposite slope of the groove but are not visible from the camera viewpoint. The second reflects light in entirely different, unknown directions. Since such unknown directions are inconvenient to handle, the preferred embodiment of the present invention assumes they share the microfacet normal of the first reflection that creates the T region. The reflected luminance of this region likewise does not vary with rotation of the polarization plane of the incident light.
  • a normal histogram can be estimated by obtaining a normal in each region, that is, a microfacet normal and an area ratio of each microfacet within one pixel.
  • the direction of the normal of the micro facet has two degrees of freedom. Specifically, the direction of the normal line is specified by the azimuth angle and the zenith angle.
  • FIG. 17A is a perspective view schematically showing a state in which the unevenness of the three-region model is aligned in a certain azimuth angle within one pixel on the subject surface.
  • the main axis 1701 of the groove corresponds to the main azimuth angle ⁇ Imax.
  • FIG. 17B is a diagram showing the relationship between the angle of the polarization plane of the polarized illumination and the main azimuth angle ⁇ Imax of the main axis 1701.
  • the main axis 1701 coincides with the direction of the maximum luminance when the polarized illumination is rotated. Therefore, hereinafter, the main azimuth angle is treated as being equal to ⁇ Imax.
  • the orientation of the microfacet normal in each of the S, T, and D regions has an angle ⁇ Imin that is 90 ° shifted from the main azimuth angle ⁇ Imax, and is expressed as an angle of two degrees of freedom that individually has a zenith angle.
  • the normal in the S region coincides with the bisector of the incident-light and reflected-light directions, which determines its zenith angle.
  • the normal is expressed as a point on the camera projection plane in the camera coordinate system. This has almost the same meaning as the later gradient space (p, q).
  • ⁇ I is the polarization plane angle of illumination.
  • the polarization plane angle ⁇ I of the polarized illumination and the angle ⁇ IN formed by the groove can be determined as follows because the value ⁇ Imax observed from the polarization plane rotation is equal to the principal axis angle of the groove.
  • the refractive index n is essentially unknown, since it is a quantity specific to the subject.
  • n = 1.5 to 1.8 may be used as an approximate value.
  • this value is the refractive index of typical dielectric materials found in nature, such as plastic, resin, and glass. In this way, θ1 that satisfies Equation 21 may be determined.
  • the zenith angle of the normal in the T region is then obtained.
  • the normal of the D region follows from the assumption in the three-region model.
  • next, the brightness of each region is considered. Since absolute luminance varies with the polarized-illumination intensity and camera settings under actual experimental conditions, a reference value must be obtained from the observed luminance image Y. As this reference value, the maximum luminance MAX_Y of the specular reflection area in the image is used. This maximum luminance is obtained when the S region of the three-region model occupies the whole (100%) of one pixel. MAX_Y is therefore set as the luminance Is of the S region.
  • similarly, the minimum luminance MIN_Y of the specular reflection region in the image is used as the luminance of the D region. This minimum luminance is obtained when the D region occupies the entire pixel (100%).
  • FIG. 20 is a diagram showing the microfacet normal, reflection luminance, and area ratio in each region of the three-region model.
  • the reflection luminances of the three regions are denoted Is, IPT, IST, and Id, as follows. Note that in the T region the reflected luminance differs depending on whether the P wave or the S wave is incident. 1) S region: Is. 2) T region: IPT when the P wave is incident, IST when the S wave is incident. 3) D region: Id.
  • the unknown area ratios As, AT, and AD can then be estimated by the following equations.
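The exact form of (Equation 26) is not reproduced in this excerpt; the sketch below is an assumed reading of the estimation: the observed maximum and minimum luminances under rotating polarized illumination, plus the constraint that the three area ratios tile the pixel, form a linear system in (As, AT, AD). All numeric values are illustrative.

```python
import numpy as np

def estimate_area_ratios(Ymax, Ymin, Is, IPT, IST, Id):
    """Solve for the area ratios (As, AT, AD) of the three-region model.
    The two luminance equations are an assumed reading of (Equation 26);
    only the area-sum constraint is stated explicitly in the text."""
    A = np.array([
        [Is, IST, Id],   # observed max: incident light S-polarized w.r.t. groove
        [Is, IPT, Id],   # observed min: incident light P-polarized w.r.t. groove
        [1.0, 1.0, 1.0], # the three areas tile the whole pixel
    ])
    b = np.array([Ymax, Ymin, 1.0])
    areas, *_ = np.linalg.lstsq(A, b, rcond=None)
    return areas

# Illustrative region luminances; Ymax/Ymin generated from As=0.5, AT=0.3, AD=0.2.
areas = estimate_area_ratios(Ymax=0.68, Ymin=0.56, Is=1.0, IPT=0.2, IST=0.6, Id=0.0)
```

With consistent inputs the system is square and nonsingular, so least squares recovers the area ratios exactly.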
  • the gradient space (p, q) may be considered substantially as the camera projection plane shown in FIG.
  • FIG. 21 is a diagram showing the position of the micro facet normal obtained in the two-dimensional gradient space.
  • the normal of the S region is located almost on the origin.
  • the normal lines of the T region and the D region are located on a line 2102 orthogonal to the main axis direction of the groove on the subject surface.
  • FIG. 22 is a histogram of micro facet normals obtained from a three-region model. On the main azimuth line, the three distributions are aligned with their respective area ratio heights.
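The positions in FIG. 21 and FIG. 22 follow from converting each normal's (azimuth, zenith) pair into gradient space. A minimal sketch, with an assumed sign convention and illustrative angles: the S-region normal (zenith 0) maps to the origin, while the T- and D-region normals share one azimuth and therefore fall on a single line through the origin (line 2102).

```python
import math

def normal_to_gradient(azimuth_deg, zenith_deg):
    """Convert a normal given by (azimuth, zenith) into gradient space (p, q),
    i.e. the surface slopes along x and y (sign convention assumed)."""
    az = math.radians(azimuth_deg)
    zen = math.radians(zenith_deg)
    slope = math.tan(zen)  # zenith 0 maps to the origin of gradient space
    return slope * math.cos(az), slope * math.sin(az)

# S-region normal points at the camera: (almost) the origin.
p_s, q_s = normal_to_gradient(0.0, 0.0)
# T- and D-region normals share the azimuth orthogonal to the groove axis,
# so they lie on one line through the origin (illustrative zenith angles).
p_t, q_t = normal_to_gradient(90.0, 30.0)
p_d, q_d = normal_to_gradient(90.0, 50.0)
```

Histogramming these (p, q) points with their area ratios as weights yields the three-spike distribution of FIG. 22.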
  • FIG. 23 to FIG. 26 show a method of assigning normal arrangement corresponding to the above three regions to sub-pixels obtained by dividing one pixel into four as an initial normal arrangement. Since this assignment method depends on the angle of the main azimuth angle ⁇ Imax of the groove, it is described with reference to FIGS.
  • FIG. 23A shows an example of a histogram of three regions.
  • the T region occupies 2 sub-pixels
  • the D region occupies 1 sub-pixel
  • the S region occupies 1 sub-pixel
  • the range of the main azimuth angle is a horizontal direction close to the X axis.
  • the normal lines of the T region and the D region are paired as much as possible, and are orthogonal to the main azimuth angle.
  • both a concave arrangement forming a valley and a convex arrangement forming a ridge, as shown in the figure, are possible, so there are two combinations of sequences.
  • the normal line of the S region is perpendicular to the paper surface and is facing forward.
  • FIG. 24, FIG. 25, and FIG. 26 depict a case where only the main azimuth is different in the same normal histogram.
  • FIGS. 27(a) and 27(b) show the case where the histogram lies entirely in the S region; there is only one possible arrangement, as shown in FIG. 27(b).
  • the optimization of the normals may basically be performed by the iterative method described in Non-Patent Document 1, which maximizes an energy evaluation function.
  • the process flow will be briefly described below.
  • FIG. 29A is a diagram for explaining optimization using normal integrability. As described in Non-Patent Document 1, the condition is that the line integral around a cell of 2 × 2 pixels be zero.
  • FIG. 29B represents a condition in the shift pixel 2901.
  • the condition of the integral value is
  • the texton in this overlapping pixel area should also be derived from the solution texton. However, since its array pattern does not appear in the original pixel, it lies at a position 2802 offset from the pixel in FIG. 28. Because the appearance probability can be expressed by the maximum value of the Gaussian component of the texton GMM model, it is weighted by this probability. That is, the condition for this pixel pair maximizes the following:
  • the solution texton is obtained as maximizing the product of two energy functions in the entire image.
  • s indicates all the pixels of the image
  • Neighbor (s, t) indicates a pair of all four neighboring pixels.
  • is the solution texton to be obtained.
  • FIGS. 30A to 30C are diagrams illustrating the effect of increasing the resolution of one pixel by combining the above processes.
  • the normal lines in the four pixels obtained by increasing the resolution of one pixel of the original image to 2 ⁇ 2 are optimally arranged as shown in FIG.
  • the normal direction of each of the S, T, and D regions uniquely corresponds to a luminance; as a result, the resolution within one pixel can be increased to a luminance of 2 × 2 pixels.
  • the pixel luminance should correspond to shooting under non-polarized illumination. Therefore, as shown in FIG. 30A, the luminance of the original image equals the average luminance, with each region contributing as follows: 1) S region: Is. 2) T region: (IPT + IST) / 2. 3) D region: Id (see FIG. 30C).
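The consistency check implied here can be written out directly: under non-polarized illumination the pixel luminance is the area-weighted sum of the region luminances, with the T region contributing the average of its P- and S-incidence values. The numeric values are illustrative.

```python
def average_luminance(As, AT, AD, Is, IPT, IST, Id):
    """Pixel luminance under non-polarized illumination in the three-region
    model: the T region averages its P- and S-incidence luminances."""
    return As * Is + AT * (IPT + IST) / 2.0 + AD * Id

# Illustrative area ratios and region luminances.
Ymean = average_luminance(As=0.5, AT=0.3, AD=0.2, Is=1.0, IPT=0.2, IST=0.6, Id=0.0)
```

Here Ymean = 0.5 · 1.0 + 0.3 · 0.4 + 0.2 · 0.0 = 0.62; the sub-pixel luminance arrangement must reproduce this average for the original pixel.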
  • FIG. 31 is a flowchart showing all procedures for estimating a normal distribution histogram and normal array within one pixel through subject image capturing and subject image processing in the present invention, and further estimating the luminance array within one pixel.
  • the part surrounded by a dotted line corresponds to the processing of the high resolution processing unit.
  • the high resolution processing unit includes a program that causes, for example, known hardware to execute the processing steps in the portion surrounded by a dotted line.
  • in step S3102, the subject is irradiated with polarized illumination while the angle ψI of the polarization plane is rotated, and the subject is imaged. A Y image and a P image are then acquired at each angle ψI.
  • step S3103 function fitting is performed using (Equation 14) for each pixel of the Y image whose luminance varies.
  • the method used in (Expression 1) to (Expression 12) may be applied as it is. From this, the phase angles ψImax and ψImin at which the brightness is maximum and minimum, and the corresponding luminances Ymax and Ymin, are obtained.
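The fitting step can be sketched as a linear least-squares fit of a 180°-period cosine to the per-pixel luminance samples, which directly yields Ymax, Ymin, and the peak phase ψImax. The synthetic pixel values are illustrative; (Equation 14) itself is not reproduced in this excerpt.

```python
import math
import numpy as np

def fit_cosine(psi_deg, y):
    """Least-squares fit of y(psi) = c0 + c1*cos(2*psi) + c2*sin(2*psi),
    equivalent to a 180-degree-period cosine model of the luminance."""
    P = np.radians(np.asarray(psi_deg))
    A = np.column_stack([np.ones(len(P)), np.cos(2 * P), np.sin(2 * P)])
    c0, c1, c2 = np.linalg.lstsq(A, np.asarray(y), rcond=None)[0]
    amp = math.hypot(c1, c2)
    psi_max = math.degrees(0.5 * math.atan2(c2, c1)) % 180.0  # angle of peak luminance
    return c0 + amp, c0 - amp, psi_max  # Ymax, Ymin, psi_Imax

# Synthetic pixel whose luminance peaks at psi = 30 degrees (illustrative).
psis = [0.0, 45.0, 90.0, 135.0]
ys = [0.5 + 0.2 * math.cos(2 * math.radians(p - 30.0)) for p in psis]
Ymax, Ymin, psi_max = fit_cosine(psis, ys)
```

Four samples at 45° steps are exactly enough to determine the three coefficients of this model, which matches the four polarization planes used by the illumination.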
  • in step S3104, the azimuth angle and zenith angle of the normal of each of the S, T, and D regions in the three-region model are obtained. These two degrees of freedom represent each normal.
  • the method uses (Expression 17) to (Expression 21).
  • in step S3105, the brightness of each of the S, T, and D regions in the three-region model is obtained according to (Formula 22) to (Formula 24).
  • step S3106 the area ratio of each of the S, T, and D regions in the three-region model is obtained using (Equation 26).
  • step S3107 after the normal expression method is converted from an angle into a gradient space (p, q) in the camera coordinate system, a normal histogram in one pixel shown in FIGS. 21 and 22 is obtained.
  • step S3108 as shown in FIGS. 23 to 28, an optimal arrangement of normals in one pixel is obtained from the histogram distribution of normals.
  • step S3109 the luminance is obtained from the normal line optimally arranged in one pixel as shown in FIG.
  • Embodiment 2. In the first embodiment, higher resolution was implemented using polarized illumination and a polarization camera. However, a polarization camera is expensive because it is a special device incorporating a patterned polarizer. Furthermore, the polarizing plate reduces the amount of light, which lowers sensitivity and may degrade image quality. In the present embodiment, polarized illumination with a rotating polarization plane is used as in the first embodiment, but an ordinary luminance camera is used for image capture, enabling free zooming.
  • FIG. 32 is a diagram illustrating the second embodiment.
  • the configuration of this embodiment is different from the configuration shown in FIG. 1 only in that the camera 3201 is not a polarization camera but a camera that observes normal luminance.
  • FIG. 33 is a diagram for explaining the camera 3201.
  • the camera 3201 includes a luminance imaging element 3301, a polarization plane control unit 204, and an imaging control unit 205.
  • an image is captured each time the polarization plane of the illumination is changed by the polarization plane control unit 204.
  • a plurality of luminance images Y (206) are obtained corresponding to each polarization state.
  • the high-resolution normal image 208 and the high-resolution luminance image 209 are generated by the high-resolution processing unit 3303.
  • FIG. 34 is a diagram illustrating the luminance imaging element 3301.
  • the luminance imaging element 3301 does not have a pattern polarizer as shown in FIG. For this reason, when the resolution of the original image sensor is 1120 ⁇ 868, a Y image of 1120 ⁇ 868 pixels having the same resolution can be obtained.
  • although a monochrome imaging device is depicted as the luminance imaging device in FIG. 34A, a color imaging device equipped with a known Bayer mosaic, as shown in FIG. 34B, may be used instead.
  • the image output from the camera 3201 is only the luminance image Y, and a P image that is polarization information cannot be acquired.
  • Equation 39 is thus obtained.
  • the normal of the D region is expressed by the following expression 40 based on the assumption in the three region model.
  • FIG. 35 is a flowchart showing a flow of processing according to the second embodiment of the present invention.
  • in step S3501, imaging is performed while the illumination polarization plane ψI is changed, obtaining a Y image for each ψI; in step S3502, the azimuth and zenith angles of the normals of the S, T, and D regions are obtained. These are the only parts that differ. The portion surrounded by a dotted line corresponds to the processing of the high-resolution processing unit.
  • the method may be either a method using a color camera as a luminance camera or a method of colorizing polarized illumination, but since both are known techniques, detailed description thereof is omitted.
  • a configuration in which the image of the second embodiment is a multiband color luminance image for medical use is also included here.
  • this method uses a multiband color camera as the luminance camera; since it is a known technique, detailed description thereof is omitted.
  • a configuration in which the image is an infrared image and the polarized light source is an infrared light source, for surveillance camera use, is also included here; since it is a known technique, description thereof is omitted.
  • Embodiment 3 an embodiment of an image processing apparatus capable of shooting while switching between a moving image and a still image will be described.
  • any of the configurations of Embodiments 1 and 2 can be adopted as a basic configuration.
  • the case of the configuration of Embodiment 2 is described as an example.
  • for moving images, a normal moving image is captured with a non-polarized light source and conventional moving-image high-resolution processing is performed.
  • for still images, the subject is captured under irradiation by the polarized light source, enabling the free-zoom high-resolution processing.
  • FIG. 36 is a diagram showing a configuration of the movie camera 3500 of the present embodiment.
  • a moving image / still image command input unit 3601 is mounted on the camera.
  • irradiation with non-polarized illumination is performed via the polarization plane control unit 204 and the imaging control unit 205.
  • the image signal from the image sensor 3301 is acquired continuously in time in the luminance moving image string memory buffer 3603 via the moving image still image selection unit 3602.
  • the image signal read from the luminance moving image string memory buffer 3603 is processed by the existing “motion” -based moving image resolution enhancement processing unit 3604 and accumulated as a high-resolution moving image 3605.
  • FIG. 37 is a flowchart showing a control method.
  • step S3700 it is determined whether or not a still image shooting command has been input. If no still image shooting command is input (NO) in step S3700, moving image shooting is performed. Specifically, the non-polarized illumination is turned on in step S3701, a moving image is recorded in step S3702, and the existing moving image high-resolution processing is performed in step S3703. For example, a conventional technique as disclosed in Patent Document 3 is used for the high-resolution moving image processing.
  • This moving image high resolution technology is a general technology that uses motion between a plurality of frames of moving images, and the following processing is executed.
  • interpolation is executed to generate the initial high-resolution estimated image, and alignment is performed to generate an alignment image.
  • the high-resolution estimated image is smeared, and a residual image is generated from the difference between the smeared image and the alignment image.
  • missing residuals are interpolated using the residual values of neighboring pixels that are not missing.
  • a back projection image is generated from the residual image using a point spread inverse function, and an enhancement coefficient is generated by combining the smoothed image and the back projection image.
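The loop described above can be condensed into a minimal single-frame iterative back-projection sketch: upscale, re-smear with an assumed point-spread (a box filter here), and push the residual back into the estimate. Real moving-image super-resolution additionally exploits motion between frames, which this sketch omits.

```python
import numpy as np

def upscale(lr, factor=2):
    """Nearest-neighbour interpolation: the initial high-resolution estimate."""
    return np.repeat(lr, factor)

def downsample(hr, factor=2):
    """Box-filter 'smear' plus decimation, standing in for the camera's
    point-spread function (an assumed simplification)."""
    return hr.reshape(-1, factor).mean(axis=1)

def back_project(lr, iters=20, factor=2):
    """Minimal iterative back-projection: push the residual between the
    observed low-res frame and the re-smeared estimate back into the estimate."""
    hr = upscale(lr, factor)
    for _ in range(iters):
        residual = lr - downsample(hr, factor)
        hr = hr + np.repeat(residual, factor)  # crude inverse point-spread
    return hr

lr = np.array([1.0, 3.0, 2.0])
hr = back_project(lr)
```

The fixed point of the iteration is any high-resolution estimate whose re-smeared version matches the observation; with this consistent toy input the residual is zero from the start.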
  • if a still-image shooting command is input (YES) in step S3700, still-image shooting is performed. Specifically, polarized illumination is turned on in step S3704, a still image is captured in step S3501 while the polarization plane of the polarized illumination is changed, and the high-resolution processing described in the second embodiment is then performed in step S3500.
  • the polarized light source 102 in the first and second embodiments is replaced with a light source 3800 that can rotate the plane of polarization at a higher speed.
  • the configuration other than the polarized light source 102 is the same as the configuration shown in FIGS. 1 and 2, 32, and 33.
  • FIG. 38 shows a configuration of a polarized light source for rotating the polarization plane by 45 ° at high speed.
  • This polarized light source includes a patterned polarizer filter 3801 and an LED surface light source 3802.
  • the patterned polarizer filter 3801 in addition to a film-type polarizing plate, a photonic crystal, a structural birefringent wave plate, a wire grid, or the like can be used.
  • the LED surface light source 3802 is divided so as to be lit independently with respect to each of the pattern polarizers 3701 having different polarization planes by 45 °. Then, the four divided light emitting areas A, B, C, and D in the LED surface light source 3802 corresponding to the patterned polarizers of 0 °, 45 °, 90 °, and 135 ° are sequentially turned on. For example, when the divided light emitting area A is lit, the other divided light emitting areas B, C, and D are not lit.
  • the light emitted from the divided light-emitting region A is not polarized, but is incident on the patterned polarizer having an angle of 0 ° in the patterned polarizer filter 3801. Only light polarized in the direction of the polarization transmission axis of the patterned polarizer can pass through the patterned polarizer filter 3801.
  • the polarization plane of the emitted light rotates as indicated by reference numerals 303 to 306 in FIG.
  • Each of the divided light emitting areas A, B, C, and D includes at least one LED element.
  • the patterned polarizer filter 3801 has a four-divided configuration.
  • a configuration may be adopted in which the light-emitting area allocated to each polarizer direction is divided more finely.
  • the time required to rotate the polarization plane can be reduced to about 10 μs, the LED lighting response time, so the change of polarization plane can be completed within the frame-switching time. This makes the polarization-plane change during movie shooting short enough not to be a problem.
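The segment-switching scheme of FIG. 38 can be sketched as a simple frame schedule: exactly one divided light-emitting area is lit per frame, cycling A, B, C, D so that the emitted polarization plane advances 45° between consecutive frames. The segment names and angle assignment follow the text; the scheduling function itself is illustrative.

```python
import itertools

# Polarizer angle assigned to each divided light-emitting area (FIG. 38).
SEGMENTS = {"A": 0, "B": 45, "C": 90, "D": 135}

def frame_schedule(n_frames):
    """Yield (frame_index, segment, polarization_angle): one segment lit per
    frame, cycling A -> B -> C -> D, so the polarization plane rotates 45
    degrees between consecutive frames."""
    cycle = itertools.cycle(SEGMENTS.items())
    for i in range(n_frames):
        seg, angle = next(cycle)
        yield i, seg, angle

schedule = list(frame_schedule(8))
```

One full set of four polarization states thus spans four consecutive frames, matching the time chart of FIG. 39.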
  • FIG. 39 shows a time chart when a moving image is shot using this polarized light source.
  • Time t1 (3901) indicates the polarization plane switching time
  • time t2 (3902) indicates the imaging exposure time for one frame.
  • one set of four images, one per polarization plane, requires 4T (sec).
  • high-speed shooting is therefore required.
  • such high-speed shooting becomes possible if the imaging device is made faster and the illumination of this embodiment is used to shorten the exposure time. As a result, high resolution can be achieved freely even for moving images.
  • FIG. 40 is a diagram illustrating the fifth embodiment.
  • the only difference from FIG. 33 of the second embodiment is that the camera 3201 is a normal color camera.
  • the color camera acquires an R image 4002, a G image 4003, and a B image 4004 for each of the R, G, and B wavelength bands by using the color imaging element 4001.
  • the point is that the attributes of light color and polarization change due to the reflection phenomenon.
  • regarding the diffused light: (i) since diffused light is emitted after once penetrating into the medium, it takes on the color of the object's medium when the illumination light is white; and (ii) although the polarization at emission depends on the emission angle, the angle in this embodiment is 45 to 50°, so the degree of polarization is very low and the light is almost non-polarized.
  • FIG. 41 depicts a double reflection corresponding to 1) above. Since the light that became non-polarized diffused light at the first reflection carries the color specific to the object's medium, the second, specular reflection becomes a colored reflection component and also exhibits the partial polarization characteristic of specular reflection. In the figure this is represented by ellipses 4101 and 4102; the ellipses are schematic and do not accurately represent the shape of the partial polarization.
  • when the polarization direction of the incident light is changed, the amount of light refracted into the medium changes, so the amount of light incident on the second reflection changes, and as a result the amount of light of the second reflection component changes.
  • accordingly, the size of ellipses such as 4102 varies, and the luminance varies.
  • FIG. 42 depicts a state of twice reflection corresponding to 2) above.
  • the diffused light colored by the first reflection is emitted as colored diffused light again at the second reflection.
  • when the polarization direction of the incident light is changed, the amount of light refracted into the medium changes, so the amount of light incident on the second reflection fluctuates.
  • the amount of light reflected at the second reflection therefore also fluctuates; the emitted light is non-polarized.
  • FIG. 43 depicts a double reflection corresponding to 3) above.
  • the light specularly reflected at the first reflection remains white and becomes specularly polarized light. It is then reflected the second time as colored, non-polarized diffused light; when the polarization direction of the incident light is changed, the amount of light reflected the first time changes, and as a result the light amount of the second reflection component varies.
  • a color camera is used to capture images while sequentially changing the polarization plane of the polarized illumination, and the function of changes in color luminance R, G, B is approximated as a cosine function with a period of 180 °.
  • the angle of the polarization plane of illumination is ⁇ I as follows.
  • the luminance I and the amplitude A are both vectors having three components (R, G, B).
  • the phase ⁇ I in this luminance variation is generally common to each color.
  • FIG. 44 shows a conceptual diagram of this vector relationship in the RGB color space.
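The per-channel cosine model can be sketched as three simultaneous least-squares fits sharing one phase, as the text assumes. The exact form of the patent's color equations is not reproduced in this excerpt; the shared-phase extraction below (taking the phase from the strongest channel) and all numeric values are illustrative assumptions.

```python
import math
import numpy as np

def fit_rgb_cosine(psi_deg, rgb):
    """Fit I_c(psi) = Iave_c + A_c * cos(2*(psi - psi0)) per colour channel c,
    with the phase psi0 assumed common to R, G and B."""
    P = np.radians(np.asarray(psi_deg))
    A = np.column_stack([np.ones(len(P)), np.cos(2 * P), np.sin(2 * P)])
    coeffs = np.linalg.lstsq(A, np.asarray(rgb), rcond=None)[0]  # 3 x 3, per channel
    iave = coeffs[0]
    amps = np.hypot(coeffs[1], coeffs[2])
    k = int(np.argmax(amps))  # take the common phase from the strongest channel
    psi0 = math.degrees(0.5 * math.atan2(coeffs[2, k], coeffs[1, k])) % 180.0
    return iave, amps, psi0

# Synthetic RGB samples sharing a common peak phase of 20 degrees (illustrative).
psis = [0.0, 45.0, 90.0, 135.0]
rgb = [[0.6 + 0.10 * math.cos(2 * math.radians(p - 20.0)),
        0.5 + 0.20 * math.cos(2 * math.radians(p - 20.0)),
        0.4 + 0.05 * math.cos(2 * math.radians(p - 20.0))] for p in psis]
iave, amps, psi0 = fit_rgb_cosine(psis, rgb)
```

Here the luminance I and amplitude A come out as three-component (R, G, B) vectors with a single shared phase, matching the vector relationship depicted in FIG. 44.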
  • FIG. 45 is a flowchart showing the above operation.
  • the differences from the flowchart described in the second embodiment are only the part that acquires a color RGB image in S4501 and the part that separates the once- and twice-reflected specular reflection components in S4502. Since the separated luminance component corresponds to white illumination, only luminance needs to be handled, and the other parts of the flowchart are the same. Since the high-resolution image obtained here contains only the specular reflection component, the final high-resolution color image is obtained by recombining it using Equation 46.
  • the polarization plane rotation angle of the polarized illumination has been described by taking 45 ° as an example, but this angle is arbitrary.
  • the present invention is widely applicable to consumer digital cameras, movie cameras, medical diagnostic imaging apparatuses, medical endoscopic cameras, surveillance cameras, robot vision, surface inspection apparatuses, and the like.


Abstract

The invention relates to an image processing apparatus (101) comprising a polarized light source (102) and a polarization camera (103). When an image of a subject (104) is to be captured, polarized illumination (105), whose polarization plane is rotated, is emitted toward the subject. Polarized reflected light (106), reflected from the surface of the subject, reaches the polarization camera (103), and the image is thereby recorded. The polarization camera (103) comprises a polarization image capture element (201), a luminance/polarized-light information processing unit (202), a polarization plane control unit (204), and an image capture control unit (205). Each time the polarized illumination is changed by the polarization plane control unit (204), an image is captured, thereby obtaining a plurality of luminance images (Y) and polarization phase images (P) corresponding to the respective polarization states. From these images, a high-resolution processing unit (203) generates a high-resolution normal image (208) and a high-resolution luminance image (209).
PCT/JP2010/006346 2009-12-08 2010-10-27 Appareil de traitement d'image WO2011070708A1 (fr)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2010549980A JP4762369B2 (ja) 2009-12-08 2010-10-27 画像処理装置
CN201080012155.7A CN102356628B (zh) 2009-12-08 2010-10-27 图像处理装置及图像处理方法
EP10835639.5A EP2512124B1 (fr) 2009-12-08 2010-10-27 Appareil de traitement d'image
US13/102,308 US8654246B2 (en) 2009-12-08 2011-05-06 High resolution images using polarized light source and polarization camera

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009278557 2009-12-08
JP2009-278557 2009-12-08

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/102,308 Continuation US8654246B2 (en) 2009-12-08 2011-05-06 High resolution images using polarized light source and polarization camera

Publications (1)

Publication Number Publication Date
WO2011070708A1 true WO2011070708A1 (fr) 2011-06-16

Family

ID=44145276

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/006346 WO2011070708A1 (fr) 2009-12-08 2010-10-27 Appareil de traitement d'image

Country Status (5)

Country Link
US (1) US8654246B2 (fr)
EP (1) EP2512124B1 (fr)
JP (1) JP4762369B2 (fr)
CN (1) CN102356628B (fr)
WO (1) WO2011070708A1 (fr)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102629986A (zh) * 2012-04-10 2012-08-08 广州市奥威亚电子科技有限公司 一种自动跟踪拍摄方法
JP2013065995A (ja) * 2011-09-16 2013-04-11 Ricoh Co Ltd 撮像装置及びこれを用いた物体識別装置
CN103206917A (zh) * 2012-01-11 2013-07-17 广州市奥威亚电子科技有限公司 一种室内定位方法
CN103874451A (zh) * 2011-10-12 2014-06-18 富士胶片株式会社 内窥镜系统以及图像生成方法
JP2017058383A (ja) * 2014-03-04 2017-03-23 パナソニックIpマネジメント株式会社 偏光画像処理装置
JP2018029279A (ja) * 2016-08-18 2018-02-22 ソニー株式会社 撮像装置と撮像方法
JPWO2017014137A1 (ja) * 2015-07-17 2018-04-26 ソニー株式会社 眼球観察装置、アイウェア端末、視線検出方法及びプログラム
JP2019139694A (ja) * 2018-02-15 2019-08-22 キヤノン株式会社 画像処理方法、画像処理装置、撮像装置、画像処理プログラム、および、記憶媒体
JP2020061130A (ja) * 2018-09-01 2020-04-16 タタ コンサルタンシー サービシズ リミテッドTATA Consultancy Services Limited グラフ信号処理を使用したオブジェクトの高密度な表面を再構築するためのシステムおよび方法
CN111095288A (zh) * 2019-12-02 2020-05-01 深圳市汇顶科技股份有限公司 屏下光学指纹识别装置及系统、液晶显示屏

Families Citing this family (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102742258B (zh) 2010-07-21 2016-10-26 松下知识产权经营株式会社 图像处理装置
CN103037752B (zh) 2010-09-24 2015-05-13 松下电器产业株式会社 图像处理装置
US20120092668A1 (en) * 2010-10-15 2012-04-19 The Hong Kong University Of Science And Technology Patterned polarization converter
WO2012073414A1 (fr) * 2010-11-30 2012-06-07 パナソニック株式会社 Dispositif de traitement d'image
JP5884367B2 (ja) 2011-09-27 2016-03-15 セイコーエプソン株式会社 プロジェクター
US9739448B2 (en) 2012-05-08 2017-08-22 The Hong Kong University Of Science And Technology Patterned polarization grating polarization converter
US9341855B2 (en) 2012-05-08 2016-05-17 The Hong Kong University Of Science And Technology Polarization converter by patterned polarization grating
JP5603508B2 (ja) 2012-05-22 2014-10-08 パナソニック株式会社 撮像処理装置および内視鏡
US20150374210A1 (en) * 2013-03-13 2015-12-31 Massachusetts Institute Of Technology Photometric stereo endoscopy
JP6376785B2 (ja) * 2014-03-14 2018-08-22 キヤノン株式会社 撮像装置、および、撮像システム
US9305948B2 (en) * 2014-05-30 2016-04-05 Raytheon Company Dynamic polarizer having material operable to alter its conductivity responsive to an applied stimulus
US10798367B2 (en) * 2015-02-27 2020-10-06 Sony Corporation Imaging device, image processing device and image processing method
JP2016213718A (ja) * 2015-05-11 2016-12-15 キヤノン株式会社 画像処理装置及び画像処理方法、プログラム、記憶媒体
KR102519803B1 (ko) * 2016-04-11 2023-04-10 삼성전자주식회사 촬영 장치 및 그 제어 방법
JP2017191572A (ja) * 2016-04-15 2017-10-19 キヤノン株式会社 画像処理装置及びその方法、プログラム
JP6818444B2 (ja) * 2016-06-22 2021-01-20 キヤノン株式会社 画像処理装置、撮像装置、画像処理プログラムおよび画像処理方法
WO2018034209A1 (fr) * 2016-08-18 2018-02-22 ソニー株式会社 Dispositif d'imagerie, et procédé d'imagerie
TWI595445B (zh) * 2016-08-31 2017-08-11 致茂電子股份有限公司 抗雜訊之立體掃描系統
US11336849B2 (en) * 2016-08-31 2022-05-17 Sony Corporation Image processing apparatus and image processing method
JP6807546B2 (ja) * 2016-11-15 2021-01-06 Panasonic Intellectual Property Management Co., Ltd. Image forming apparatus
CN106725250A (zh) * 2017-01-09 2017-05-31 Graduate School at Shenzhen, Tsinghua University Endoscopic polarization imaging system with head-end modulation, and measurement method
WO2018221025A1 (fr) * 2017-06-01 2018-12-06 Fujifilm Corporation Imaging device, image processing device, imaging system, image processing method, and recording medium
JP7006690B2 (ja) * 2017-06-13 2022-01-24 Sony Group Corporation Imaging device, imaging element, and image processing method
JP7207303B2 (ja) * 2017-07-12 2023-01-18 Sony Group Corporation Imaging device, image generation method, and imaging system
WO2019225080A1 (fr) * 2018-05-24 2019-11-28 Sony Corporation Information processing device and method, and associated program
US11067837B2 (en) * 2018-08-21 2021-07-20 Arizona Board Of Regents On Behalf Of The University Of Arizona Polarization state scrambler
CN109118919A (zh) * 2018-09-17 2019-01-01 Anhui Xinhua University Pixel mosaic system for demonstrating the polarization characteristics of light, and method therefor
US10721458B1 (en) * 2018-12-14 2020-07-21 Ambarella International Lp Stereoscopic distance measurements from a reflecting surface
JP7358970B2 (ja) * 2018-12-26 2023-10-11 Denso Wave Inc. Optical information reader
US11961224B2 (en) * 2019-01-04 2024-04-16 Stella Surgical Device for the qualitative evaluation of human organs
US11004253B2 (en) * 2019-02-21 2021-05-11 Electronic Arts Inc. Systems and methods for texture-space ray tracing of transparent and translucent objects
JP2021085788A (ja) * 2019-11-28 2021-06-03 Ricoh Co., Ltd. Evaluation device and evaluation method
DE102020203293B4 (de) * 2020-03-13 2022-09-08 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung eingetragener Verein Device for detecting water on a surface and method for detecting water on a surface
CN113497913B (zh) * 2020-03-20 2023-02-21 Zhejiang Uniview Technologies Co., Ltd. Vehicle monitoring method and monitoring system
KR102216999B1 (ko) * 2020-09-28 2021-02-18 Hive Vision Co., Ltd. Non-Lambertian surface inspection system for line scanning
US11290658B1 (en) 2021-04-15 2022-03-29 Boston Polarimetrics, Inc. Systems and methods for camera exposure control
US11689813B2 (en) * 2021-07-01 2023-06-27 Intrinsic Innovation Llc Systems and methods for high dynamic range imaging using crossed polarizers
US11644606B1 (en) * 2021-10-14 2023-05-09 Omnivision Technologies, Inc. Image sensor with sub-pixel photodiode array with multiple mini-wire grid polarizers for polarization imaging

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06275804A (ja) * 1993-03-22 1994-09-30 Olympus Optical Co Ltd Super-resolution imaging device
JPH11313242A (ja) 1998-04-28 1999-11-09 Nippon Hoso Kyokai <Nhk> Optical instrument for imaging, and imaging device
JP2007086720A (ja) 2005-08-23 2007-04-05 Photonic Lattice Inc Polarization imaging device
JP2007316161A (ja) 2006-05-23 2007-12-06 Matsushita Electric Ind Co Ltd Super-resolution processing method and device using residual interpolation
JP2008016918A (ja) * 2006-07-03 2008-01-24 Matsushita Electric Ind Co Ltd Image processing device, image processing system, and image processing method
WO2008026518A1 (fr) * 2006-08-31 2008-03-06 Panasonic Corporation Image processing device, method, and program

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102005063524B4 (de) * 2005-07-08 2011-01-27 Grau, Günter, Dr. Device for measuring and generating the polarization of light
JP3955616B2 (ja) * 2005-09-01 2007-08-08 Matsushita Electric Industrial Co., Ltd. Image processing method, image processing device, and image processing program
US7582857B2 (en) * 2006-04-18 2009-09-01 The Trustees Of The University Of Pennsylvania Sensor and polarimetric filters for real-time extraction of polarimetric information at the focal plane
US20090021598A1 (en) * 2006-12-06 2009-01-22 Mclean John Miniature integrated multispectral/multipolarization digital camera
EP2202688B1 (fr) * 2007-02-13 2013-11-20 Panasonic Corporation System, method and apparatus for image processing and image format
CN102316329B (zh) * 2007-05-31 2014-03-12 Panasonic Corporation Color polarization imaging device
JP4435865B2 (ja) * 2008-06-26 2010-03-24 Panasonic Corporation Image processing device, image division program, and image synthesis method

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
LAWRENCE B. WOLFF: "Polarization Vision: a New Sensory Approach to Image Understanding", IMAGE AND VISION COMPUTING, vol. 15, 1997, pages 81 - 93
MANJUNATH: "Simultaneous Estimation of Super-Resolved Depth Map and Intensity Field Using Photometric Cue", COMPUTER VISION AND IMAGE UNDERSTANDING, vol. 101, 2006, pages 31 - 44
PING TAN: "Resolution-Enhanced Photometric Stereo", EUROPEAN CONFERENCE ON COMPUTER VISION, vol. 3, 2006, pages 58 - 71, XP019036524
See also references of EP2512124A4

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013065995A (ja) * 2011-09-16 2013-04-11 Ricoh Co Ltd Imaging device and object identification device using the same
CN103874451A (zh) * 2011-10-12 2014-06-18 Fujifilm Corporation Endoscope system and image generation method
CN103206917A (zh) * 2012-01-11 2013-07-17 Guangzhou Aoweiya Electronic Technology Co., Ltd. Indoor positioning method
CN102629986A (zh) * 2012-04-10 2012-08-08 Guangzhou Aoweiya Electronic Technology Co., Ltd. Automatic tracking and shooting method
JP2017058383A (ja) * 2014-03-04 2017-03-23 Panasonic Intellectual Property Management Co., Ltd. Polarization image processing device
JPWO2017014137A1 (ja) * 2015-07-17 2018-04-26 Sony Corporation Eyeball observation device, eyewear terminal, gaze detection method, and program
JP2018029279A (ja) * 2016-08-18 2018-02-22 Sony Corporation Imaging device and imaging method
US10877288B2 (en) 2016-08-18 2020-12-29 Sony Corporation Imaging device and imaging method
JP2019139694A (ja) * 2018-02-15 2019-08-22 Canon Inc. Image processing method, image processing apparatus, imaging apparatus, image processing program, and storage medium
JP7286268B2 (ja) 2018-02-15 2023-06-05 Canon Inc. Image processing method, image processing apparatus, imaging apparatus, image processing program, and storage medium
JP2020061130A (ja) * 2018-09-01 2020-04-16 TATA Consultancy Services Limited System and method for reconstructing a dense surface of an object using graph signal processing
CN111095288A (zh) * 2019-12-02 2020-05-01 Shenzhen Goodix Technology Co., Ltd. Under-screen optical fingerprint recognition device and system, and liquid crystal display screen
CN111095288B (zh) * 2019-12-02 2023-09-05 Shenzhen Goodix Technology Co., Ltd. Under-screen optical fingerprint recognition device and system, and liquid crystal display screen

Also Published As

Publication number Publication date
JP4762369B2 (ja) 2011-08-31
EP2512124A4 (fr) 2013-11-20
JPWO2011070708A1 (ja) 2013-04-22
CN102356628B (zh) 2015-03-11
CN102356628A (zh) 2012-02-15
EP2512124A1 (fr) 2012-10-17
US20110267483A1 (en) 2011-11-03
EP2512124B1 (fr) 2016-07-06
US8654246B2 (en) 2014-02-18

Similar Documents

Publication Publication Date Title
JP4762369B2 (ja) Image processing apparatus
US7893971B2 (en) Light source estimation device that captures light source images when it is determined that the imaging device is not being used by the cameraman
JP4469021B2 (ja) Image processing method, image processing device, image processing program, image synthesis method, and image synthesis device
US8648902B2 (en) Image processing apparatus
US8648907B2 (en) Image processing apparatus
US7792367B2 (en) System, method and apparatus for image processing and image format
US8184194B2 (en) Image processing apparatus, image division program and image synthesising method
JP2009055624A (ja) Color polarization imaging device and image processing device
WO2022026342A1 (fr) Devices and methods for determining confidence in stereo matching using a classifier-based filter
CN208536839U (zh) Image acquisition device
EP4189639A1 (fr) Infrared and non-infrared channel blender for depth mapping using structured light
JPH1141514A (ja) Reflection-prevention imaging device
WO2021229984A1 (fr) Image processing device, image processing method, and program
Takiguchi et al. Designing flat-bed scanning system for spectral and glossiness recording
Xu Capturing real-world dynamic objects using temporally-coded photography

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 201080012155.7

Country of ref document: CN

WWE Wipo information: entry into national phase

Ref document number: 2010549980

Country of ref document: JP

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10835639

Country of ref document: EP

Kind code of ref document: A1

REEP Request for entry into the european phase

Ref document number: 2010835639

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2010835639

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE