WO2012073414A1 - Image processing apparatus - Google Patents
Image processing apparatus
- Publication number
- WO2012073414A1 (PCT/JP2011/005293)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- luminance
- unit
- subject
- polarization
- Prior art date
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
- G02B23/2407—Optical details
- G02B23/2461—Illumination
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00064—Constructional details of the endoscope body
- A61B1/00071—Insertion part of the endoscope body
- A61B1/0008—Insertion part of the endoscope body characterised by distal tip features
- A61B1/00096—Optical elements
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/045—Control thereof
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0646—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements with illumination filters
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/28—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for polarising
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/71—Circuitry for evaluating the brightness variation
Definitions
- the present invention relates to an image processing apparatus capable of obtaining surface unevenness information exceeding information obtained from a two-dimensional luminance image acquired by an image sensor.
- An endoscope captures images by illuminating the wall surface of a living organ covered with a mucous membrane.
- The angle between the optical axis of the illumination and the optical axis of the imaging is set to approximately 0° so that no shadow is cast on the subject.
- A conventional technique has been proposed in which surface irregularities are identified from the gray-level information of a color luminance image through devised image processing, using an existing endoscope imaging system.
- a polarization endoscope using polarization illumination and polarization imaging has also been proposed.
- The former technique is disclosed, for example, in Patent Document 1.
- This technique compares the signal level value of a specific pixel of a captured color image with the average signal level value of the surrounding eight pixels.
- When the signal level value of the specific pixel is low, the corresponding portion of the subject is determined to be concave (depressed) relative to its surroundings.
- In that case, the blue component is emphasized by reducing the signal level values of the red and green pixel signals.
- As a result, the color image reproduced on the monitor exhibits a color contrast similar to that of the blue dye solution applied by a doctor in a normal endoscopic examination, and the unevenness of the organ surface becomes clear.
- The latter endoscope includes a polarized light irradiation unit that irradiates an object with light of a specific polarization component, and a polarization-mosaic light receiving unit that receives both the specific polarization component and a polarization component different from it in the return light from the object and performs polarization imaging; it generates a shape change image indicating changes in the shape of the object surface.
- The polarization characteristic calculation unit can calculate the polarization direction and generate a two-dimensional distribution of surface-tilt information.
- Patent Document 4, Patent Document 5, and Non-Patent Document 1 disclose devices capable of rotating the plane of polarization.
- The conventional technology using polarized light can irradiate an object with circularly polarized light and calculate the surface inclination from the orientation of the elliptically polarized return light from the object.
- As a display technique for clarifying recessed areas (grooves), the conventional technology provides a two-dimensional display such as a “pseudo pigment distribution”.
- The present invention solves the above problems, and its object is to provide a compact, practical image processing apparatus capable of reproducing not only the luminance but also the fine irregularities on the surface of a subject with high fidelity.
- An image processing apparatus according to the present invention includes a polarization illumination unit that sequentially irradiates a subject with three or more types of linearly polarized light having different polarization-plane angles, an imaging unit that sequentially images the subject while it is irradiated with each of the three or more types of linearly polarized light, and an image processing unit.
- The image processing unit includes a variable luminance processing unit that processes the luminance of the images captured by the imaging unit to calculate the reflected polarization state on the subject surface; a reflection determination unit that, based on the output of the variable luminance processing unit, discriminates between multiple-reflection regions, where light is reflected twice in a concave region before returning, and one-time reflection regions, where light is reflected once on the subject surface before returning; and a mirror image search unit that determines pairs of multiple-reflection regions produced by two reflections in a concave region of the subject surface.
- The image processing unit generates an image indicating the concave regions on the subject surface based on the pairs of multiple-reflection regions.
- In one embodiment, the image processing unit includes a concave region connecting unit that generates a groove segment from each pair of multiple-reflection regions and connects the groove segments to estimate the position of the concave region, and a cross-sectional shape modeling unit that determines the cross-sectional shape of the concave region.
- In one embodiment, the image processing unit includes a normal reproduction unit that generates a normal image by integrating the surface normals in the multiple-reflection regions with the surface normals in the one-time reflection regions.
- In one embodiment, the apparatus further includes a light source direction setting unit that sets a virtual light source position, a light source variation image generation unit that generates a light source variation image as if the light source had been virtually moved, and an image display unit that superimposes the light source variation image and the luminance image for display.
- In one embodiment, the light source variation image generation unit generates a luminance image of the subject using the normal image and a physical reflection model.
- In one embodiment, the polarization illumination unit and the imaging unit are attached to an endoscope.
- In one embodiment, the polarization illumination unit produces, by transmitting non-polarized light through a polarization plane conversion element, linearly polarized light whose polarization plane is sequentially switched among three or more orientations.
- In one embodiment, the polarization illumination unit includes a light guide that guides the non-polarized light to the polarization plane conversion element.
- In one embodiment, the imaging unit includes a monochrome imaging device or a color imaging device.
- In one embodiment, the apparatus includes a luminance averaging unit that averages the plurality of luminance images acquired by the imaging unit while the subject is sequentially irradiated with the three or more types of linearly polarized light having different polarization-plane angles, thereby generating an average luminance image corresponding to an image captured under non-polarized illumination.
- In one embodiment, the variable luminance processing unit obtains, from the pixel signals output by the imaging unit, the relationship between the polarization-plane angle and the luminance value of each pixel, and generates a luminance maximum angle image defined by the polarization-plane angle that maximizes the luminance value for each pixel, together with a luminance modulation degree image.
- In one embodiment, an image display unit superimposes either the luminance maximum angle image or the luminance modulation degree image on the luminance image for display.
- In one embodiment, the reflection determination unit generates, from the luminance maximum angle image and the luminance modulation degree image, a pseudo-color image in which the luminance maximum angle is mapped to the hue angle and the luminance modulation degree to the saturation, and extracts pixels whose luminance modulation degree is equal to or greater than a preset value as pixels constituting a multiple-reflection region.
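- The hue/saturation mapping above can be sketched as follows; this is a minimal Python illustration, and the function name, the 2 × ψmax hue scaling, and the threshold value are assumptions rather than details from the patent:

```python
import colorsys

def pseudo_color(psi_max_deg, modulation, threshold=0.1):
    """Pseudo-color sketch: luminance maximum angle -> hue,
    luminance modulation degree -> saturation.  The polarization
    state repeats every 180 degrees, so the hue angle is taken as
    2 * psi_max to span the full hue circle (assumed scaling).
    Pixels whose modulation degree is below `threshold` (a
    hypothetical preset value) are left unsaturated, i.e. treated
    as one-time reflection."""
    rgb = []
    for angle_row, mod_row in zip(psi_max_deg, modulation):
        row = []
        for psi, m in zip(angle_row, mod_row):
            hue = ((2.0 * psi) % 360.0) / 360.0
            sat = min(m, 1.0) if m >= threshold else 0.0
            row.append(colorsys.hsv_to_rgb(hue, sat, 1.0))
        rgb.append(row)
    return rgb
```

Pixels below the threshold render as white (zero saturation), so only the candidate multiple-reflection pixels appear colored.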
- In one embodiment, the concave region connecting unit estimates the azimuth angle of the surface normal of the concave region.
- In one embodiment, the cross-sectional shape modeling unit models the cross-sectional shape of the concave region with a specific function and, using the property that the surface normal in a multiple-reflection region is inclined at approximately 45 degrees, estimates the zenith angle of the surface normal at an arbitrary position in the concave region.
- In one embodiment, the polarization illumination unit irradiates the subject, via a relay lens optical system, with light transmitted through a polarization plane conversion element whose polarization plane is sequentially switched among three or more orientations.
- In one embodiment, the polarization illumination unit has an optical system in which a plurality of concentric ring illuminations or surface illumination light sources are combined with a polarization plane conversion element.
- In one embodiment, the polarization illumination unit has an optical system in which a surface illumination light source, whose illumination optical axis is directed inward toward the optical axis of the imaging system, is combined with a polarization plane conversion element.
- In one embodiment, the polarization illumination unit has an optical system that combines a non-polarizing broadband beam splitter, a surface illumination light source, and a polarization plane conversion element.
- An image processing method according to the present invention includes a polarization illumination step of sequentially irradiating a subject with three or more types of linearly polarized light having different polarization-plane angles, an imaging step of imaging the subject while it is irradiated with each of the three or more types of linearly polarized light, and an image processing step.
- The image processing step includes a variable luminance processing step of calculating the reflected polarization state on the subject surface by processing the luminance of the images captured in the imaging step; a reflection determination step of discriminating, based on the result of the variable luminance processing step, between multiple-reflection regions, where light is reflected twice in a concave region before returning, and one-time reflection regions, where light is reflected once on the subject surface before returning; a mirror image search step of determining pairs of multiple-reflection regions produced by two reflections in a concave region of the subject surface; and a step of generating an image showing the concave regions of the subject surface based on the pairs of multiple-reflection regions.
- Another image processing apparatus according to the present invention includes a polarization illumination unit that sequentially irradiates a subject with three or more types of linearly polarized light having different polarization-plane angles, an imaging unit that images the subject while it is irradiated with each of the three or more types of linearly polarized light, an image processing unit, and a display unit that displays an image based on the output of the image processing unit.
- The image processing unit includes a variable luminance processing unit that processes the luminance of the images captured by the imaging unit to calculate the reflected polarization state on the subject surface, and a pseudo-color image conversion unit that generates a pseudo-color image based on the output of the variable luminance processing unit.
- The variable luminance processing unit obtains, from the pixel signals output by the imaging unit, the relationship between the polarization-plane angle and the luminance value of each pixel, and generates a luminance maximum angle image defined by the polarization-plane angle that maximizes the luminance value for each pixel, and a luminance modulation degree image defined by the ratio between the amplitude of the luminance variation accompanying the change of the polarization plane and the average luminance value for each pixel.
- The pseudo-color image conversion unit generates, based on the luminance maximum angle image and the luminance modulation degree image, a pseudo-color image with the luminance maximum angle as the hue angle and the luminance modulation degree as the saturation; the pseudo-color image and the luminance image are synthesized and displayed on the display unit.
- According to the present invention, the apparatus includes a polarization illumination unit that sequentially irradiates the subject with three or more types of linearly polarized light having different polarization-plane angles, and an imaging unit that sequentially images the subject while it is irradiated with each of them. Information on the reflected polarization state, which depends on the uneven shape of the surface, can therefore be acquired simultaneously with the color image without newly developing a special polarization imaging device. Because information on the surface irregularities is obtained from the reflected polarization state and an ordinary color image sensor can be used as it is, an expensive polarization image sensor is unnecessary, and information for visually recognizing the surface irregularities can be acquired.
- Diagram showing the basic configuration of the image processing apparatus according to the present invention
- Perspective view schematically showing the polarization directions of three types of linearly polarized light having different polarization-plane angles
- Diagram showing the operation of the polarization plane control element
- Diagram showing the definition of the polarization-plane angle
- (a) and (b): diagrams showing examples of photosensitive cell arrangements
- (a) and (b): diagrams in which incident light strikes the subject surface from directly above and is reflected once
- Graph showing the Fresnel energy reflectances of the P wave and the S wave versus the angle of incidence
- (a) and (b): graphs showing the per-pixel luminance variation caused by rotating the polarization plane of the polarized illumination
- (a) and (b): explanatory diagrams of the luminance variation
- (a) The simplest groove shape and (b) the manner in which incident light is reflected
- (a) A groove shape having a bottom face and slopes and (b) the manner in which incident light is reflected
- (a) A shape in which ridges (convex parts) are densely packed and (b) the manner in which incident light is reflected
- (a) A hole (concave) shape and (b) the manner in which incident light is reflected
- Block diagram showing the configuration of an image processor according to Embodiment 1 of the present invention
- Diagram of cosine-function fitting from the polarized luminance samples corresponding to the four types of polarized illumination
- Flowchart explaining the processing of the one-time/two-time reflection determination unit
- Diagram showing the photographed scene
- Diagram schematically showing the separated one-time reflection regions
- Diagrams explaining the processing of the mirror image search unit
- Flowchart explaining the processing of the mirror image search unit
- Diagram showing the data in a one-time reflection region
- Diagram explaining the details of the search of a search-target region
- Diagram showing the luminance image of a real subject
- Diagram in which the luminance maximum angle image YPH and the luminance modulation degree image YD are combined into a single pseudo-color image
- Diagram showing the minute reflection regions judged by quantization processing to be two-time reflections
- Diagram showing the groove segments extracted by the processing of the mirror image search unit
- Diagram explaining the cause of multiple mirror image pairs (cross-sectional view)
- Diagram explaining the cause of multiple mirror image pairs (reflection region diagram)
- Diagram showing the result of setting groove segments in a concave region (concave region connecting unit)
- Diagram showing the result of thinning in the concave region expansion unit
- Diagram showing the result of normal determination in the concave region expansion unit
- Diagram showing the result of concave region connection in the case of a hole (setting of groove segments)
- The image processing apparatus of this example includes a polarization illumination unit 120, an imaging unit 140, a variable luminance processing unit 1302, a reflection determination unit 1305, and a mirror image search unit 1306.
- The variable luminance processing unit 1302, the reflection determination unit 1305, and the mirror image search unit 1306 are included in the image processing unit 150.
- the polarization illumination unit 120 sequentially irradiates the subject 100 with three or more types of linearly polarized light having different angles of polarization planes.
- A plurality of concave regions 100a are observed on the surface of the subject 100, which is, for example, an organ.
- The linearly polarized light is reflected both by the concave regions 100a on the surface of the subject 100 and by areas other than the concave regions 100a, and enters the imaging unit 140.
- the imaging unit 140 sequentially images the subject 100 when the subject 100 is irradiated with each of three or more types of linearly polarized light.
- FIG. 1B is a perspective view schematically showing the polarization directions of three types of linearly polarized light having different polarization plane angles.
- the three polarization states 10, 12, and 14 shown in the figure each have polarization planes with different angles.
- A double-headed arrow is drawn inside each of the circles schematically representing the polarization states 10, 12, and 14 in FIG. 1B. This arrow indicates the oscillation direction of the electric field vector that defines the polarization plane of the linearly polarized light.
- FIG. 1B shows a right-handed XYZ coordinate system.
- the X axis and the Y axis are set in the image plane acquired by the imaging unit 140, and the negative direction of the Z axis is set in the line-of-sight (optical axis) direction.
- The polarization plane of linearly polarized light is the plane that contains the optical axis and is parallel to the oscillating electric field vector.
- the vibration direction of the electric field vector of linearly polarized light is parallel to the XY plane.
- The polarization-plane angle (ψI) is defined as the angle formed by the polarization direction (the oscillation direction of the electric field vector) with respect to the positive X-axis direction. This angle ψI will be described in more detail later with reference to FIG. 3.
- The polarization illumination unit 120 sequentially irradiates the subject 100 with three or more types of linearly polarized light having different polarization-plane angles, and the imaging unit 140 sequentially images the subject 100 while it is irradiated with each of them.
- The variable luminance processing unit 1302 obtains the relationship between the polarization-plane angle and the luminance value of each pixel based on the pixel signals output from the imaging unit 140, and generates a “luminance maximum angle image” and a “luminance modulation degree image”.
- the “maximum luminance angle image” is an image defined by the angle of the polarization plane that maximizes the luminance value for each pixel constituting the image obtained by imaging.
- If, for example, the luminance value of the pixel P(x, y) specified by a certain coordinate (x, y) is maximized when the subject 100 is irradiated with linearly polarized light whose polarization plane is at 45°, the value 45°, which is the luminance maximum angle, is set for the pixel P(x, y).
- The luminance maximum angle image is constructed by setting such a luminance maximum angle value for each pixel.
- The “luminance modulation degree image” is an image defined, for each pixel, by the ratio between the amplitude of the luminance variation accompanying the change of the polarization plane and the average luminance value.
- The luminance modulation degree image is constructed by setting such a luminance modulation degree value for each pixel.
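- Since the luminance at each pixel varies as Y(ψ) = A + B·cos(2(ψ − ψmax)) with a 180° period, four samples taken at 0°, 45°, 90°, and 135° determine the luminance maximum angle and the luminance modulation degree in closed form. The Python sketch below assumes exactly those four sampling angles; the function name is illustrative:

```python
import math

def fit_polarization_luminance(y0, y45, y90, y135):
    """Closed-form cosine fit Y(psi) = A + B*cos(2*(psi - psi_max))
    from four luminance samples at polarization-plane angles
    0, 45, 90, and 135 degrees.
    Returns (psi_max in degrees, modulation degree B/A)."""
    a = (y0 + y45 + y90 + y135) / 4.0   # average luminance A
    c = (y0 - y90) / 2.0                # cos(2*psi) component
    s = (y45 - y135) / 2.0              # sin(2*psi) component
    b = math.hypot(c, s)                # amplitude B of the variation
    psi_max = math.degrees(0.5 * math.atan2(s, c)) % 180.0
    return psi_max, (b / a if a > 0 else 0.0)
```

For example, samples (10, 12, 10, 8) yield a luminance maximum angle of 45° and a modulation degree of 0.2.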
- The term “image” in this specification means not only a luminance image directly recognized by human vision but also, more broadly, an array of numerical values assigned to each of a plurality of pixels.
- When a “luminance maximum angle image” is displayed, for example, it can be shown with a brightness corresponding to the luminance maximum angle value set for each of its pixels.
- The “luminance maximum angle image” expressed in this way contains a light-and-dark pattern recognizable by human vision, but it differs from a normal luminance image showing the luminance of the subject.
- In the following, the data itself representing such various “images” may also be referred to as “images”.
- The reflection determination unit 1305 shown in FIG. 1A discriminates between multiple-reflection regions, where light is reflected twice in a concave region and becomes return light, and one-time reflection regions, where light is reflected once on the subject surface and becomes return light.
- In a concave region, multiple reflection (typically two-time reflection) occurs, so the multiple-reflection regions form pairs exhibiting similar polarization reflection states.
- A typical example of such a multiple-reflection region is a groove having a V-shaped cross section.
- The groove with the simplest structure extends linearly in one direction, as shown in the figure.
- However, the concave region having a multiple-reflection region only needs to have a surface whose cross section is inclined or curved in a generally V or U shape, and may take other forms.
- Such pairs of multiple-reflection regions exhibiting the same polarization reflection state are continuous on the subject surface and form a wider concave region.
- In this specification, a concave region including a pair of multiple-reflection regions may be referred to as a “groove”.
- The “groove” in this specification is not limited to a groove-like concave region extending in one direction on the subject surface.
- It may also be a concave region with a shape that cannot strictly be called a groove (for example, the shapes shown in FIGS. 11 and 12).
- The mirror image search unit 1306 determines pairs of multiple-reflection regions that become return light through two reflections in a concave region of the subject surface. This pair determination method is described in detail later.
- the image processing unit 150 generates an image indicating a concave area on the surface of the subject based on the pair of multiple reflection areas.
- FIG. 1C is a diagram schematically illustrating the overall configuration of the image processing apparatus according to the first embodiment of the present invention.
- the image processing apparatus includes an endoscope 101 and a control device 102.
- the endoscope 101 includes a distal end portion 113 having an image sensor 110, a light guide 105, and an insertion portion 103 having a video signal line 111.
- The insertion portion 103 of the endoscope 101 is elongated and, as shown, can be bent flexibly.
- the light guide 105 can transmit light even in a bent state.
- the control device 102 includes a light source 104 and an image processor 108.
- White non-polarized light emitted from the light source 104 is guided through the light guide 105 to the polarization plane control element 106 at the distal end 113 and becomes the linearly polarized light 121 that irradiates the subject.
- The polarization plane control element 106 is composed of, for example, a polarizing plate and a liquid crystal element, and can convert non-polarized light into linearly polarized light with an arbitrary polarization plane by means of an applied voltage.
- the polarization plane control element 106 is a device that can rotate the polarization plane using liquid crystal. Examples of the configuration have already been disclosed in Patent Document 4, Patent Document 5, Non-Patent Document 1, and the like.
- the polarization plane control element 106 can be configured by a voltage application type liquid crystal device that combines, for example, a ferroelectric liquid crystal, a polarizing film, a quarter wavelength plate, and the like. Then, this polarized illumination is applied to the subject through the illumination lens 107.
- The synchronization device 112 sends an instruction signal to the polarization plane control element 106 to rotate the polarization plane of the illumination, and sends a shooting start signal to the imaging element 110 to acquire an image.
- The above processing is carried out a plurality of times.
- the image sensor 110 may be a monochrome image sensor or a single plate color image sensor having a color mosaic.
- the imaged video signal reaches the image processor 108 via the video signal line 111.
- the light source 104, the light guide 105, the polarization plane control element 106, and the illumination lens 107 implement the polarization illumination unit 120 of FIG. 1A.
- the imaging unit 140 of FIG. 1A is realized by the photographing lens 109 and the imaging element 110.
- the variable luminance processing unit 1302, the reflection determination unit 1305, and the mirror image search unit 1306 in FIG. 1A are realized by the image processor 108.
- The first image is captured when the polarization plane is in the 0° state 203, the second image in the 45° state 204, and the third image in the 90° state 205.
- The time required for rotating the polarization plane ranges from about 20 ms down to about 40 to 100 μs for high-speed types. If a high-speed liquid crystal is used and the sensitivity of the image sensor is raised enough to allow imaging within this time, sufficient performance for capturing moving images can be obtained even when shooting with the polarization rotated through four directions.
- The image processing spans at least the four captured frames; however, the processing time can be kept within one frame time by using pipeline processing.
- The optical axis of the illumination lens 107 and that of the photographing lens 109 are substantially aligned, so that as few shadows as possible are cast on the subject during endoscopic observation.
- An unpolarized average luminance image can be generated by adding (averaging) the separate polarized images from the first to the fourth image.
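- A sketch of this averaging, assuming the polarized frames are plain nested lists of per-pixel luminance values (a stand-in for real image buffers):

```python
def average_luminance(frames):
    """Pixel-wise average of the luminance images captured under
    the sequentially rotated polarization planes.  With the
    polarization angles evenly spaced over the 180-degree period,
    the average approximates an image taken under non-polarized
    illumination."""
    n = len(frames)
    height, width = len(frames[0]), len(frames[0][0])
    return [[sum(f[i][j] for f in frames) / n for j in range(width)]
            for i in range(height)]
```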
- FIG. 3 is a diagram showing the definition of the polarization-plane angle ψI of the polarized illumination.
- Here, the XY coordinate system is set facing the subject.
- The polarization-plane angle ψI is defined as shown in FIG. 3, with the X-axis direction taken as 0°.
- Because the angle ψI is preserved upon reflection, the polarization-plane angle of the reflected light is the same as that of the incident light.
- When the angle ψI is increased or decreased, the same polarization state recurs with a period of 180°; that is, any function of the polarization-plane angle ψI is periodic with a period of 180°.
- In the following, the polarization-plane angle ψI of the polarized illumination may be referred to as the “incident polarization-plane angle”.
- FIGS. 4A and 4B are diagrams each illustrating an example of the configuration of the imaging surface of the imaging device 110.
- In FIG. 4A, a plurality of photosensitive cells (photodiodes) are regularly arranged in rows and columns (the X and Y directions) on the imaging surface.
- A color mosaic filter that transmits light in the three RGB wavelength bands is installed over the cells, as shown in the figure.
- Each photosensitive cell generates an electrical signal according to the amount of incident light by photoelectric conversion.
- a general single-plate color image sensor can be used for this portion.
- An image sensor used for conventional luminance imaging can be used as the image sensor 110.
- FIG. 5 shows a state in which polarized light with an incident angle close to zero strikes the surface 801 and the specular reflection is observed with a camera.
- In FIGS. 5(a) and 5(b), the polarization planes of the incident polarized light differ by 90°.
- In this near-normal-incidence geometry, merely changing the polarization plane of the light does not change the luminance as energy. This is due to the following reason.
- FIG. 6 is a graph showing the dependence of the specular reflectance on the incident angle according to Fresnel theory.
- the horizontal axis represents the incident angle
- the vertical axis represents the Fresnel reflectance.
- Incident angles in the vicinity of 0° to 15°, which can be regarded as normal incidence, correspond to the range 601.
- In this range, the reflectances of the P wave and the S wave are substantially the same. Therefore, when polarized light is incident on the surface almost perpendicularly, it is reflected with the same behavior regardless of its polarization plane, because there is no distinction between P wave and S wave with respect to the surface.
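- The Fresnel behavior described above can be sketched numerically. The following is a minimal illustration using the standard Fresnel intensity formulas for a dielectric surface; the refractive index n = 1.5 is an assumed example value, not a value from this disclosure.

```python
import math

def fresnel_reflectance(theta_i_deg, n=1.5):
    """Intensity reflectances (Rs, Rp) for light in air hitting a dielectric of index n."""
    ti = math.radians(theta_i_deg)
    tt = math.asin(math.sin(ti) / n)  # transmitted angle via Snell's law
    rs = (math.cos(ti) - n * math.cos(tt)) / (math.cos(ti) + n * math.cos(tt))
    rp = (n * math.cos(ti) - math.cos(tt)) / (n * math.cos(ti) + math.cos(tt))
    return rs * rs, rp * rp

# Near-normal incidence (range 601): S and P reflectances almost coincide.
rs0, rp0 = fresnel_reflectance(5.0)
# Near the Brewster angle (about 56 degrees for n = 1.5): P reflectance collapses.
rs56, rp56 = fresnel_reflectance(56.0)
print(rs0, rp0, rs56, rp56)
```

This reproduces the two regimes the text relies on: at near-normal incidence the P/S distinction disappears, while on a tilted groove slope the P wave reflects far more weakly than the S wave.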
- FIG. 7 shows the variation of the luminance Y at a specific pixel of the luminance image obtained when the angle ψI of the polarization plane of the polarized illumination with respect to the uneven surface is set to 0°, 45°, 90°, and 135°.
- As shown, the luminance Y varies periodically with the angle ψI of the polarization plane of the polarized illumination. The reason for this is described in detail below.
- FIG. 8 shows a state in which a groove 801 is formed on an uneven surface and double reflection occurs on its inclined surfaces. Such multiple reflection can be considered to occur constantly on the surface of an object with many surface irregularities, and the distinction between single and double reflection is important. Depending on the geometrical arrangement, triple reflection may occur at almost the same position as the double reflection; however, since its frequency of occurrence is very low, only double reflection is considered hereinafter.
- Polarized illumination whose polarization plane is perpendicular to the groove main axis direction 802 acts as a P wave with respect to the groove slopes.
- As the Fresnel curves in FIG. 6 show, at these incident angles the P-wave reflectance is extremely weak compared with the S-wave reflectance.
- Consequently, the P wave is attenuated at each of the first and second reflections.
- In contrast, the S-polarized light shown in FIG. 8B is not attenuated very much even after the double reflection.
- Therefore, for an incident polarization plane that acts as a P wave, the reflected light becomes extremely weak in energy and the luminance decreases.
- For an incident polarization plane that acts as an S wave, the reflected light loses little energy and retains high luminance.
- In this way, double reflection in a groove can be detected as a polarization phenomenon, and it can be observed as a luminance fluctuation using polarization-rotating illumination.
- The groove model described above is somewhat artificial; unevenness on the organs and mucosal surfaces of living bodies takes various shapes. FIGS. 9 to 12 show models of such actual unevenness.
- FIG. 9 shows the simplest groove. This is formed of only two kinds of inclined surfaces 901, and light incident from substantially right above is reflected twice (902) on the inclined surfaces and becomes return light. In this modeling, there is only a double reflection phenomenon in the groove. Therefore, in a situation where the resolution is low to some extent, the groove center is often the darkest, and the unevenness can be detected even by using conventional image processing with luminance.
- FIG. 10 shows a case where the groove has a bottom surface 1001, which is a shape seen in a shallow wide depression.
- Light incident from directly above is reflected twice (1002) on the inclined surfaces 1004 and becomes return light, while light reaching the bottom surface is reflected only once (1103).
- As a result, the center of the groove becomes the brightest, so the concave portion does not simply appear darker than its surroundings, and it becomes difficult to detect the unevenness by image processing based on luminance.
- FIG. 11 shows a shape of a region where convex shapes 1105 on the flat portion are dense, and is a shape seen when a raised tumor or the like is generated.
- the convex portion is modeled as a hemisphere for simplicity, the gap portion 1101 of the convex portion 1105 can be regarded as a concave portion, and light incident from almost right above is adjacent to the surface as shown in FIG. And reflected twice (1102) to return light.
- At the top of each convex portion the light is reflected once, forming a very high luminance region, but there is also a region 1104 at the bottom of the concave gap where single reflection occurs.
- This region 1104 is often very bright. Therefore, when observed in luminance, the concave portion becomes brighter than its surroundings, and it becomes difficult to accurately detect the concave region.
- FIG. 12 shows a shape in which a hole-like concave portion 1204 exists alone on the flat portion.
- The recess 1204 is modeled as a hemispherical hole. In this case as well, since there is a pair of facing slopes, light incident from almost directly above is reflected twice (1201) and becomes return light. However, since incident light is also reflected once (1202) on the bottom surface, the center 1203 of the recess 1204 appears brightest when observed in luminance. As a result, the recess 1204 does not simply appear darker than its surroundings, and it becomes difficult to accurately detect the concave region.
- In the present embodiment, surface unevenness information can be detected correctly by separating the single reflection and double reflection phenomena using the information obtained with rotationally polarized illumination.
- FIG. 13 is a block diagram showing a configuration of the image processor 108.
- the image processing executed in this embodiment uses the principle of detecting irregularities from information obtained by irradiating polarized illumination with a rotating polarization plane and observing luminance fluctuations.
- While the polarization plane angle ψI of the illumination is changed to 0°, 45°, 90°, and 135°, the four luminance images 1301 captured at the respective angles are input to the image processor 108.
- The fluctuation luminance processing unit 1302 optimally fits these luminance variations to a cosine function.
- With the angle of the polarization plane of the illumination denoted ψI, the luminance variation is expressed as follows.
- FIG. 14 shows the cosine function of the luminance fluctuation, characterized by the amplitude AI, the phase ψo, and the average value YψI_ave.
- For simplicity, the four sample points are drawn so as to lie exactly on the cosine function.
- A method of estimating the above values by fitting a cosine function to the four equally spaced angle samples is as follows. First, the average luminance YψI_ave, corresponding to the original image under non-polarized illumination, is obtained by the following equation. This approximately reproduces a luminance image under non-polarized illumination, and the image can be used as a normal observation image of an endoscope.
- The fluctuation luminance processing unit 1302 performs an optimal least-squares fit of the sampled luminances to the cosine function. Here, the fit is carried out from samples in four directions: 0°, 45°, 90°, and 135°. Since a cosine function is determined by three quantities, amplitude, phase, and average value, any number of samples may be used as long as there are three or more. However, when the samples are spaced 45° apart, the optimal fitting can be simplified.
- The phase ψo of the cosine function that minimizes this squared error can be obtained from the following equation.
- The angle at which the minimum value is taken and the angle at which the maximum value is taken can then be calculated as follows, distinguishing cases by the magnitude relationship between a and c.
- The value ψ0max at which this maximum is attained may be used directly as the luminance maximum angle image YPH.
- The maximum and minimum values of the luminance are as follows.
- the luminance modulation degree image YD is obtained by the following Expression 12.
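- The simplified fit from four 45°-spaced samples can be sketched as follows. This is one standard closed-form estimate for a period-180° cosine; it is a plausible reading of (Equations 8 to 12), not a verbatim reproduction of them.

```python
import math

def fit_cosine(y0, y45, y90, y135):
    """Fit Y(psi) = Y_ave + A * cos(2 * (psi - psi_o)) to four samples taken
    at psi = 0, 45, 90, 135 degrees. Returns (Y_ave, A, psi_o_deg, YD)."""
    y_ave = (y0 + y45 + y90 + y135) / 4.0        # luminance under non-polarized light
    a = y0 - y90                                  # in-phase component
    b = y45 - y135                                # quadrature component
    amp = 0.5 * math.hypot(a, b)                  # amplitude A_I
    psi_o = 0.5 * math.degrees(math.atan2(b, a)) % 180.0  # phase, period 180 deg
    yd = amp / y_ave if y_ave > 0 else 0.0        # luminance modulation degree YD
    return y_ave, amp, psi_o, yd

# Synthetic pixel with Y_ave = 100, A = 20, psi_o = 30 degrees
samples = [100 + 20 * math.cos(2 * math.radians(p - 30)) for p in (0, 45, 90, 135)]
y_ave, amp, psi_o, yd = fit_cosine(*samples)
print(y_ave, amp, psi_o, yd)
```

The recovered (Y_ave, A, ψo, YD) correspond to the average luminance image, the amplitude, the luminance maximum angle image YPH, and the luminance modulation degree image YD described above.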
- the maximum luminance angle image YPH and the luminance modulation degree image YD are obtained. These pieces of information are sent to the reflection determination unit 1305 as shown in FIG.
- the two types of images, the maximum luminance angle image YPH and the luminance modulation degree image YD, are often expressed together as a pseudo color image. In that case, the luminance maximum angle image YPH represents the hue angle of the color, and the luminance modulation degree image YD represents the color saturation.
- the image group 1301 photographed under different polarized illuminations is added and averaged by the luminance addition averaging unit 1310 and becomes equivalent to an image photographed under non-polarized illumination.
- the reflection determination unit 1305 uses the maximum luminance angle image YPH, the luminance modulation degree image YD, and the luminance image Y to distinguish and determine the reflection generated on the subject surface as a single reflection and a double reflection.
- FIG. 15 is a flowchart for explaining the processing of the reflection determination unit 1305.
- In step S1501, the luminance modulation degree image YD and the luminance image Y are input.
- In step S1502, it is determined for each pixel whether the value of the luminance modulation degree image YD is at or above a predetermined threshold TH_YD.
- Only pixels whose luminance modulation degree is at or above the threshold TH_YD are separated out, as the pixel region of the double reflection region REF2.
- A region composed of pixels that are bright in luminance but below this modulation threshold is determined to be the single reflection region REF1.
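- The per-pixel decision of steps S1501 to S1502 can be sketched as below. The threshold values `th_yd` and `th_y` are illustrative placeholders, not values given in this disclosure.

```python
def classify_reflection(Y, YD, th_yd=0.15, th_y=200):
    """Classify a pixel as 'REF2' (double reflection: strong luminance
    modulation), 'REF1' (single reflection: bright but barely modulated),
    or 'other' (outside the specular reflection area)."""
    if YD >= th_yd:
        return "REF2"   # modulation at or above TH_YD -> double reflection
    if Y >= th_y:
        return "REF1"   # bright, unmodulated -> single specular reflection
    return "other"

print(classify_reflection(120, 0.4))   # strongly modulated pixel
print(classify_reflection(240, 0.02))  # bright, unmodulated pixel
print(classify_reflection(80, 0.02))   # neither
```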
- the once-reflection area REF1 and the twice-reflection area REF2 constitute a “specular reflection area”.
- the double reflection area REF2 may include an area where two or more reflections occur, and may be referred to as a “multiple reflection area”.
- the once-reflection area REF1 and the twice-reflection area REF2 are separated on the image.
- a region that is neither the one-time reflection region REF1 nor the two-time reflection region REF2 is distinguished from the “specular reflection region”.
- FIG. 16A schematically shows a result of processing of the reflection determination unit 1305 for a scene (FIG. 11) in which a protruding tumor or the like is generated on a flat part.
- the single reflection area is described as a hatched circular area.
- the one-time reflection region is a high-intensity region existing near the top of the raised portion and the bottom surface of the recess.
- each pair of the twice-reflecting regions 1601, 1602, 1603 is located on the slope of the concave portion where the raised portions contact each other.
- FIGS. 17A and 17B are diagrams that schematically represent the processing results. They show a state in which the separated and extracted minute reflection regions 171, 172, 173, 174, and 175, which have various shapes, each carry the luminance maximum angle H as an attribute.
- the maximum luminance angle H is an angle with a cycle of 180 degrees. Regions having the same maximum luminance angle H are represented by pattern textures in FIGS. 17A and 17B. It is assumed that the areas 171 and 172 shown in FIG. 17A have an attribute corresponding to the angle H1, and the areas 173, 174 and 175 shown in FIG. 17B have an attribute corresponding to the angle H2. In other words, the maximum luminance angle H of the areas 171 and 172 is H1, and the maximum luminance angle H of the areas 173, 174 and 175 is H2.
- A concave portion is determined by finding regions that form a mirror image pair, such as the double reflection region pairs 1601, 1602, and 1603 in FIG. 16C.
- For this search, the shape of each region in the binary image is unreliable and not very useful.
- Instead, the angle H1 carried by a double reflection region is a strong clue. This is because the angle H1 indicates the groove main axis angle, and the paired regions always lie in the direction perpendicular to the groove main axis.
- the search may be performed only on a straight line orthogonal to the groove main axis.
- a straight line orthogonal to the groove main axis is referred to as a search line 176.
- the search line 176 for the region 171 in FIG. 17A passes through the center of gravity 178 of the region 171 and is orthogonal to the angle H1.
- the mirror image corresponding to the region 173 passes through the center of gravity of the region 173 and is located on a straight line (search line 177) orthogonal to the angle H2. Therefore, the mirror image corresponding to the region 173 may be searched only on the search line 177.
- a region 175 having an angle value equal to the angle value H2 is located in the vicinity of the region 173, but this region 175 is not on the search line 177. That is, the region 173 and the region 175 do not constitute a pair of two-time reflection regions. According to the method of searching on the search line, the region 175 is not mistaken for being a region that forms a pair with the region 173.
- FIG. 18A is a flowchart illustrating an example of processing of the mirror image search unit 1306.
- a twice-reflection area image is input.
- This is data including a binary image showing a pair of twice-reflection areas as exemplified in FIG. 16C, a luminance maximum angle image YPH, and a luminance modulation degree image YD. Specific examples of data will be described later (FIG. 18B).
- the value of the maximum luminance angle in each pixel of the maximum luminance angle image YPH is referred to as a YPH value
- the degree of modulation in each pixel of the luminance modulation degree image YD is referred to as a YD value.
- In step S1802, the finely dispersed small regions produced by the binarization processing of (Equation 13) in the reflection determination unit 1305 are removed as noise. This can be done using ordinary binary image erosion/dilation processing.
- In step S1803, the binary image is labeled, and the number of pixels, the barycentric coordinates, and the average YD and YPH values are calculated for each region.
- pseudo color values R, G, B expressed by using the YD value as the color saturation and the YPH value as the hue of the color are used.
- FIG. 18B shows an example of pseudo color values (R, G, B) in one reflection region.
- This area is an area consisting of 11 pixels existing on the X-axis coordinates 275-278 and the Y-axis coordinates 183-186 on the image.
- In step S1804 of FIG. 18A, the search area number nr is set to 1 as an initial setting.
- In step S1805, a search line is calculated for the search area. This search line defines the search range used when searching for a mirror image as described above.
- In step S1806, the search target area number nr1 is set to 1, and the search is actually performed in step S1807.
- FIG. 19 is a diagram for explaining the details of the search for the search target region.
- a region serving as a reference for searching for the other region constituting the pair of two-time reflection regions is set as a search region 191, and the groove main axis direction is set to a direction represented by an arrow 192.
- Whether the search area 191 and a search target area have similar principal axis angles, and hence similar pseudo-colors, is determined using (Expression 15), and the search line 193 is determined at the same time.
- the search line equation is expressed as follows using the center-of-gravity coordinates of the current search region and the principal axis angle ⁇ of the twice-reflection groove.
- The condition on the perpendicular distance D between the search target area (number nr1) and the search line of the search area (number nr) is as follows.
- the search line distance Len is calculated as follows.
- In step S1809, it is determined whether the search over all search target areas for the search area nr is complete. If not, the next search target area nr1 is set (S1808).
- In step S1810, the search line distances Len are sorted and the shortest two are selected. These are the mirror image pair candidates, but a mirror image pair cannot be too far apart. Therefore, in step S1811, a candidate is recognized as a mirror image pair only while its distance, up to the second shortest, is smaller than the maximum distance LEN_TH.
- From each obtained mirror image pair of regions, a perpendicular bisector corresponding to the groove is generated.
- This line segment is referred to as a groove segment, and its length GRVLEN can be expressed as follows.
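- The core of the search (steps S1805 to S1811) can be sketched as follows. The tolerances `angle_tol`, `d_th`, and `len_th` are illustrative stand-ins for the conditions of (Expression 15) onward and for LEN_TH; the region representation is simplified to a centroid plus an axis angle.

```python
import math

def find_mirror_pair(ref, candidates, angle_tol=10.0, d_th=2.0, len_th=50.0):
    """For a reference double-reflection region ref = (cx, cy, axis_deg),
    find the nearest candidate (same tuple form) that (1) has a similar
    groove-axis angle, (2) lies near the search line through ref's centroid
    orthogonal to the groove axis, and (3) is closer than len_th.
    Returns (candidate_index, groove_segment_midpoint) or None."""
    cx, cy, axis = ref
    t = math.radians(axis)
    nx, ny = math.cos(t), math.sin(t)   # search-line normal = groove axis direction
    best = None
    for i, (px, py, pa) in enumerate(candidates):
        da = abs(pa - axis) % 180.0
        if min(da, 180.0 - da) > angle_tol:
            continue                     # principal-axis angles not similar
        d = abs(nx * (px - cx) + ny * (py - cy))
        if d > d_th:
            continue                     # not on the search line
        length = math.hypot(px - cx, py - cy)
        if length > len_th:
            continue                     # a mirror pair cannot be too far apart
        if best is None or length < best[1]:
            # midpoint anchors the perpendicular bisector (groove segment)
            best = (i, length, ((cx + px) / 2.0, (cy + py) / 2.0))
    return None if best is None else (best[0], best[2])

ref = (0.0, 0.0, 90.0)                   # region with a vertical groove axis
cands = [(10.0, 0.5, 88.0),              # valid mirror partner on the search line
         (3.0, 9.0, 89.0),               # similar angle but off the search line
         (8.0, 0.0, 30.0)]               # on the line but wrong axis angle
print(find_mirror_pair(ref, cands))
```

Restricting candidates to the search line is what prevents a nearby region with the same angle attribute (such as region 175 in FIG. 17B) from being mistaken for the mirror partner.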
- If it is confirmed in step S1812 that all search areas nr have been processed, the process ends; if not, the process proceeds to the next search area (step S1813).
- FIGS. 20A to 20D are diagrams showing experimental images in which groove segments are extracted from an actual subject.
- FIG. 20A is a luminance image generated from four images obtained by photographing a subject with rotationally polarized illumination.
- As the subject, fish eggs assumed to have shapes and reflection characteristics similar to the unevenness of an organ surface, specifically a cluster of salmon roe ("sujiko"), are used.
- On the surface, a large number of single-reflection images of the ring illumination used as the light source, together with ring-shaped images due to the translucency of the eggs, are observed; these constitute serious noise for luminance-based image processing.
- Even so, with the present method the processing can be performed normally with almost no problem.
- FIG. 20B illustrates the result of the processing of the fluctuation luminance processing unit 1302; the luminance maximum angle image YPH and the luminance modulation degree image YD are combined into one pseudo-color image.
- FIG. 20C illustrates the result of the reflection determination unit: the minute reflection regions determined to be double reflections by the binarization process are separated and extracted. These minute regions correspond to the respective regions in FIGS. 16C, 17A, and 17B.
- FIG. 20D is a diagram illustrating a result of performing the process of the mirror image search unit 1306, and a groove segment is extracted in the vicinity of each contour line of a fish egg.
- a portion from which a plurality of parallel lines 2001 are extracted corresponds to a portion where a plurality of mirror image area pairs are found.
- FIG. 21A and FIG. 21B are diagrams for explaining the cause of occurrence of the plurality of mirror image pairs.
- FIG. 21A is a cross-sectional view of the unevenness on the subject surface. Since the slope is a curve with inflection points, one double reflection occurs at points (A) and (D), and another double reflection occurs at points (B) and (C). Therefore, as shown in FIG. 21B, four double reflection regions are observed on the same search line, and as a result three groove segments are extracted. In this case, the true groove center may be determined by a separate algorithm to be the midpoint between (B) and (C).
- the concave area connecting unit 1307 generates a single concave area by connecting the estimated groove segments, and determines the azimuth angle of the normal line.
- FIG. 22 is a flowchart showing the processing flow of the concave area connecting unit 1307.
- In step S221 shown in FIG. 22, all the groove segments S in the image, obtained by the mirror image search unit 1306, are targeted.
- In step S222, a dilation process of binary image processing is executed according to the direction of each groove segment S.
- The direction of this dilation is determined by the direction of the individual groove segment: the segment is dilated almost evenly along its main axis and in the perpendicular direction, reproducing the groove slope area with the groove segment as the groove bottom.
- In step S223, dilated areas that are close to each other are connected by image processing, so that the discrete groove segments derived from the minute reflection areas become one large continuous connected groove area.
- In step S224, a thin line corresponding to the bottom of the continuous connected groove area is determined by thinning the connected groove area by binary image processing.
- In step S225, a vector perpendicular to the thin line and directed toward it is set, and this is estimated as the azimuth angle of the normal vector of the groove.
- In step S226, the zenith angle of the groove is estimated by fitting the cross-sectional shape in the connected groove region with a known functional form such as a quadratic function. At this time, the reflection intensity of the minute reflection regions serves as a constraint condition: as can be seen from FIGS. 10, 11, and 12, the slope angle that generates strong double reflection in each cross section lies in the vicinity of about 45 degrees with respect to the normal, and this fact is utilized. Through these processes, the connected groove region and the surface normal vector at that location, that is, the azimuth angle and the zenith angle, are estimated. The above processing is described below with an actual example.
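- The connection step (S222 to S223) can be sketched with a toy binary dilation. Note that this sketch dilates isotropically for brevity, whereas the embodiment dilates according to each segment's direction; pixels are represented as a set of (row, col) coordinates.

```python
def dilate(pixels, iters=1):
    """4-neighbour binary dilation (cf. step S222) on a set of (row, col)
    pixel coordinates; a pure-Python stand-in for binary image processing."""
    m = set(pixels)
    for _ in range(iters):
        grown = set(m)
        for (r, c) in m:
            grown |= {(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)}
        m = grown
    return m

# Two discrete groove segments on the same image row, separated by a gap
seg_a = {(3, c) for c in (1, 2, 3)}
seg_b = {(3, c) for c in (6, 7, 8)}
connected = dilate(seg_a | seg_b, iters=2)

# After dilation the gap pixels (3, 4) and (3, 5) are filled, so the two
# segments merge into one connected groove area (cf. step S223).
print((3, 4) in connected, (3, 5) in connected)
```

Thinning the merged area (step S224) would then recover a single line approximating the groove bottom.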
- FIG. 23A to FIG. 23D are diagrams showing examples of a curved groove, and this groove may be considered as a groove present around a raised convex region.
- FIG. 23A shows the micro-reflecting areas 151 and 152 and the set groove segment 179.
- FIG. 23B shows the result of the expansion process of the groove segment 179.
- FIG. 23C shows a state in which thinning processing is performed by connecting to another groove segment. At this stage, the connection groove region is curved centering on the lower left of the image, and the bottom surface position that is the valley is indicated by a thin line.
- FIG. 23D shows a result of estimating the normal azimuth angle with respect to the obtained groove.
- FIGS. 24A to 24D are diagrams showing examples of a depression (hole) in a flat plane. Since the shape of a hole can be regarded as a groove made point-symmetric, there are in theory innumerable mirror image pairs of double reflections.
- the minute reflection region 2000 and the minute reflection region 2001 form a mirror image pair, and the groove segment 2004 is estimated. Further, the minute reflection region 2002 and the minute reflection region 2003 form a mirror image pair, and the groove segment 2005 is estimated.
- FIG. 24B shows the result of expanding the groove segment. It can be seen that the expansion region forms one cross-shaped closed region. If thinning is performed from here, one point on the bottom surface is estimated.
- FIG. 24C illustrates the hole, which is the concave region obtained from the groove segments, together with one point on its bottom surface.
- FIG. 24D shows the result of estimating the azimuth angle of the normal to the obtained hole region.
- The cross-sectional shape modeling unit 1308 receives the azimuth angle estimated by the above processing, estimates the zenith angle in the concave region, and thereby determines the two angles representing the normal. Once the azimuth angle and zenith angle of the concave region are determined, a normal image of the concave region is generated.
- FIG. 25 is a diagram illustrating the azimuth angle and the zenith angle.
- The normal vector is a three-dimensional vector, but since its length is normalized to 1 it has two degrees of freedom; expressed as angles, these are the azimuth angle φ in the image plane and the zenith angle θ with respect to the line of sight.
- an XY axis is set in the image, and the negative direction of the Z axis is the line-of-sight (optical axis) direction.
- The relationship with the three normal components (Nx, Ny, Nz) is as shown in the figure. That is, once the azimuth angle φ and the zenith angle θ are obtained from the polarization information, the surface normal at that point is determined as follows.
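- The angle-to-normal conversion can be sketched as the standard spherical-coordinate form; the disclosure's (Expression 21) is assumed to be of this shape, with the camera looking along the Z axis.

```python
import math

def normal_from_angles(azimuth_deg, zenith_deg):
    """Surface normal (Nx, Ny, Nz) from the azimuth angle (in the image
    plane, about the viewing axis) and the zenith angle (tilt away from
    the viewing axis). Standard spherical-coordinate conversion."""
    phi = math.radians(azimuth_deg)
    theta = math.radians(zenith_deg)
    return (math.cos(phi) * math.sin(theta),
            math.sin(phi) * math.sin(theta),
            math.cos(theta))

n = normal_from_angles(0.0, 0.0)
print(n)  # zenith 0 -> the normal points straight back at the camera
```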
- FIG. 26 shows a processing flow of the cross-sectional shape modeling unit 1308.
- In step S2601, the zenith angle is estimated, the azimuth angle having already been estimated.
- The groove cross section may be fitted with any specific function; here a normal (Gaussian) distribution function is used.
- FIG. 27 shows a plan view and a cross section of the estimated groove as viewed from directly above the subject.
- The cross-sectional shape is expressed by a normal distribution function with a single parameter σ.
- The normal distribution function is selected as an approximation of the various cross-sectional shapes in FIGS. 10 to 12 because it is simple, being expressed with only the one parameter σ, and because it resembles the shapes of various grooves common in living bodies.
- The cross-sectional shape of the groove is expressed by the normal distribution function, and the zenith angle is determined according to the deviation W from the center line 2702 of the groove bottom. Note that the normal distribution function asymptotically approaches the flat portion with increasing distance from the groove center; this model is therefore suitable for a recess in a mucosal surface such as that of the stomach.
- Thus the zenith angle θ at a point a distance x away from the groove center line is obtained as follows. Needless to say, a cross-sectional shape model other than the normal distribution function may be used.
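- Deriving the zenith angle from a Gaussian cross-section can be sketched as below: the zenith angle is the tilt of the surface, i.e. the arctangent of the profile's slope. The depth and σ values are illustrative, and the embodiment's additional ~45° double-reflection intensity constraint is not reproduced here.

```python
import math

def zenith_from_gaussian_groove(x, depth=1.0, sigma=1.0):
    """Zenith angle (degrees) of the surface normal at signed distance x
    from the groove centre line, for a cross-section modelled as the
    normal-distribution depression h(x) = -depth * exp(-x^2 / (2 sigma^2))."""
    # slope of the profile: |dh/dx| = depth * (x / sigma^2) * exp(-x^2 / (2 sigma^2))
    slope = depth * (x / sigma**2) * math.exp(-x * x / (2.0 * sigma**2))
    return math.degrees(math.atan(abs(slope)))

print(zenith_from_gaussian_groove(0.0))  # flat at the groove centre -> 0
print(zenith_from_gaussian_groove(1.0))  # steepest near x = sigma
```

Consistent with the model's stated advantage, the zenith angle decays smoothly toward 0° far from the groove center, merging asymptotically into the flat portion.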
- In step S2602, the normal vector (Nx, Ny, Nz) of the subject surface in the camera coordinate system is obtained from the estimated azimuth angle φ and zenith angle θ using (Expression 21), and is stored as a two-dimensional normal image.
- the above processing is the normal estimation processing in the concave region where the reflection occurs twice on the surface of the subject. Next, the processing of the region where the reflection occurs once will be described.
- The high luminance area processing unit 1312 determines the normal in the portion of the subject surface extracted as the very bright region REF1 in FIG. 16.
- FIG. 28 is a diagram showing the principle of determining the normal line.
- Because the incident angle and the reflection angle are equal in single specular reflection, the surface normal N1 in the REF1 region can be calculated as the bisecting vector of the viewpoint vector V and the light source vector L.
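- The bisecting-vector computation is simply the normalized sum of the two unit vectors, as sketched below.

```python
import math

def ref1_normal(view, light):
    """Normal of a single-reflection (REF1) highlight: the bisecting
    (half) vector of the unit view vector V and unit light vector L."""
    h = tuple(v + l for v, l in zip(view, light))
    n = math.sqrt(sum(c * c for c in h))
    return tuple(c / n for c in h)

# View along +Z, light arriving along +X: the normal tilts 45 degrees
print(ref1_normal((0.0, 0.0, 1.0), (1.0, 0.0, 0.0)))
```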
- the surface normal in the REF2 (double reflection) region and the REF1 (single reflection) region can be determined.
- However, these normals are obtained only locally, so a global normal field over the entire surface of the subject must be determined.
- The normal reproduction unit 1313 obtains normals for the normal-undetermined region under a normal-continuity assumption, from the relationship between a theoretical surface reflection model and the observed luminance of a single image, and thereby determines the global normals. In this region double reflection hardly occurs, so the luminance can be considered to be determined by single reflection from the illumination direction, the normal, and the line of sight, and the conventionally well-used SFS (Shape From Shading) technique can be applied.
- The surface normal (Nx, Ny, Nz), a vector in three-dimensional space, is expressed in the (f, g) space via the (p, q) gradient space, which is useful for expressing the occluding-contour constraint of the subject.
- (Formula 26) and (Formula 27) may be used.
- FIG. 29A schematically shows how the normal line of the convex portion of the surface is gradually formed by this iterative method.
- the initial surface is flat as 2901, but when the iterative method is applied, convex regions are gradually formed as 2902 and 2903.
- the normal line group in the groove region and the normal line in the REF1 region can be used as constraint conditions.
- the normal image N can be generated by the above processing.
- FIG. 29B is a diagram schematically representing a normal image. Each pixel of the luminance image is accompanied by surface normal vector data at that location. The following description relates to a portion for combining and displaying the image generated in this way.
- the normal image N is sent to the light source fluctuation image generation unit 117.
- the light source fluctuation image generation unit 117 generates a luminance image using a physical reflection model formula by giving a camera viewpoint direction and an illumination light source direction to the obtained normal line image.
- Here the Cook-Torrance model is used as a model formula that expresses the specular reflection of the subject well. According to this model, the luminance Is is expressed by the following equation.
- In the above equation, α is the angle formed by the half vector H and the normal N,
- and θr is the angle formed by the line-of-sight vector and the normal N.
- the Fresnel coefficient F and the geometric damping factor G are expressed by the following equations.
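- A Cook-Torrance evaluation can be sketched as follows. The Beckmann distribution and Torrance-Sparrow geometry term are standard choices; Schlick's approximation is substituted for the exact Fresnel coefficient F for brevity, so this is not a verbatim transcription of the disclosure's equations. The roughness m and base reflectance f0 are illustrative values.

```python
import math

def cook_torrance(n, v, l, m=0.3, f0=0.04):
    """Specular luminance of the Cook-Torrance model with a Beckmann
    microfacet distribution D, Schlick Fresnel F, and geometry term G."""
    def dot(a, b):
        return max(sum(x * y for x, y in zip(a, b)), 1e-9)
    hlen = math.sqrt(sum((a + b) ** 2 for a, b in zip(v, l)))
    h = tuple((a + b) / hlen for a, b in zip(v, l))     # half vector H
    nh, nv, nl, vh = dot(n, h), dot(n, v), dot(n, l), dot(v, h)
    alpha = math.acos(min(nh, 1.0))                     # angle between H and N
    d = math.exp(-math.tan(alpha) ** 2 / m**2) / (m**2 * nh**4)  # Beckmann D
    f = f0 + (1.0 - f0) * (1.0 - vh) ** 5               # Fresnel (Schlick)
    g = min(1.0, 2.0 * nh * nv / vh, 2.0 * nh * nl / vh)         # geometry G
    return d * f * g / (math.pi * nv * nl)

# A highlight configuration: view and light mirrored about the normal
normal = (0.0, 0.0, 1.0)
view = (0.3, 0.0, math.sqrt(1 - 0.09))
light = (-0.3, 0.0, math.sqrt(1 - 0.09))
print(cook_torrance(normal, view, light))
```

Re-evaluating this for each pixel's estimated normal with a user-chosen light vector L is what produces the light source fluctuation image.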
- the light source direction setting unit 114 is a man-machine interface for setting the light source vector, and is freely set by a doctor who is an observing user in endoscopic diagnosis or the like.
- This is a virtual setting that illuminates the subject from locations other than the actual endoscope light source, for example from the left, the right, above, or below.
- FIG. 31 schematically shows images with the light source changed in this way. Since these images are based on the estimated normal image, the illumination can be changed freely on a computer, which solves one of the drawbacks of the endoscope: the difficulty of changing the illumination position.
- The image rendered by the Cook-Torrance model is a specular reflection image of the glossy parts only.
- Because the light source is white, the specular reflection image is monochrome, and by itself it lacks realism. Therefore, the image composition unit 115 composites it with the luminance image Y for display.
- FIG. 32A is a diagram for explaining this composition processing.
- the luminance image Y and the light source fluctuation image SY are sent to the image composition unit 115, and first, the diffuse reflection image DY is separated by the image component separation unit 3202. This is performed by subtracting from the luminance image Y a light source fluctuation image SY that is a specular reflection image in which the light source is set to the same state as when photographing. The color components and the like remain in the diffuse reflection image DY. Next, a light source fluctuation image SY when the light source is changed to an arbitrary position is generated, weighted by the weighting coefficient setting unit 3204, and added and synthesized by the adding unit 3203. The composite image is sent to the display unit.
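- The per-pixel relighting composition described above can be sketched in two steps, matching FIG. 32A: separate the diffuse component, then add back a weighted virtual specular image. The numeric values are illustrative only.

```python
def compose(Y, SY_shot, SY_virtual, w=1.0):
    """Relighting composition per pixel:
    1) separate the diffuse image DY = Y - SY_shot, where SY_shot is the
       specular image rendered with the light source as at capture time;
    2) add back a weighted specular image SY_virtual rendered with the
       virtually moved light source."""
    dy = Y - SY_shot          # diffuse component retains colour etc.
    return dy + w * SY_virtual

# One pixel: captured luminance 180, of which 60 was specular; the virtual
# light moves the highlight away, so the new specular term is only 5.
print(compose(180.0, 60.0, 5.0))
```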
- FIG. 32B is an example of a display image realized by the above synthesis process.
- the light source 3206 is an arbitrary parallel light source set virtually.
- the subject image changes as shown in images 3208A to 3208D.
- The unevenness can be perceived visually because the single-reflection specular portion 3207 and the shadow portion 3209 move as the position of the light source changes.
- the doctor can see an image in which the unevenness is emphasized in a realistic manner, and can obtain information useful for diagnosis.
- FIG. 33A is a diagram showing another form of the synthesis process.
- In this form, a pseudo-color image generated from the luminance maximum angle image YPH and the luminance modulation degree image YD, which are information at an earlier stage of processing, is synthesized with the luminance image.
- The pseudo-color conversion unit 3301 converts the luminance maximum angle image YPH and the luminance modulation degree image YD into the pseudo-color image C.
- For this conversion, a well-known HSV-to-RGB conversion or the like can be used.
- this image is weighted by the weighting unit 3303 and sent to the adding unit 3304 together with the luminance image weighted by the weighting unit 3302 to be added and synthesized.
- the hue indicates the azimuth angle of the surface groove.
- FIG. 33B is an example of a display image realized by the above synthesis process.
- the pseudo color image is synthesized here, the luminance maximum angle YPH image and the luminance modulation degree YD image can be synthesized as a monochrome image.
- this apparatus does not employ a configuration in which separate information is observed by switching the illumination light, and therefore all processes can be executed in parallel. That is, while observing the luminance image Y, which is a color image essential for normal endoscopy, a composite image with the light source fluctuation image SY or a composite image with the pseudo color image C can be displayed simultaneously on the display unit 116. Of course, the doctor operating the endoscope can switch the display as appropriate.
- in the first embodiment, the normal is determined on the assumption that the very bright REF1 (single reflection) region on the surface of the subject shown in FIG. 16 is a convex region (see FIG. 28).
- in reality, however, the REF1 (single reflection) region may be a concave region.
- the apparatus of this modification includes an unevenness determination unit that determines whether the REF1 (single reflection) region is a concave region or a convex region.
- in this modification, the surface of the subject is assumed to be an aggregate of spherical regions.
- FIG. 34 is a diagram showing the configuration of this modification.
- the apparatus of this modification includes an unevenness determination unit 1314.
- if the unevenness determination unit 1314 determines that the REF1 area detected by the high luminance area processing unit 1312 is a convex area, processing by the normal reproduction unit 1313 is executed. If it determines that the REF1 region is a concave region, this region is excluded from the REF1 regions before the subsequent processing is performed.
- FIG. 35A shows a case where the REF1 region is a convex region.
- the REF1 region 1324 exists in the spherical region 1320, and three REF2 regions 1321, 1322, and 1323 exist around it.
- let V be a two-dimensional position vector starting from the REF1 region 1324 and ending at each of the REF2 regions 1321, 1322, and 1323.
- let T be a vector expressing the axis of the groove in each of the REF2 regions 1321, 1322, and 1323.
- This vector T may be a groove segment found as a result of mirror image search, or may be obtained as a groove main axis without mirror image search.
- the axis of the vector T is determined, but its sign (direction along the axis) is indefinite.
- the absolute value of the outer product of the vector V and the vector T is used precisely so that the sign of the vector T need not be considered.
- E indicates the average value over the plurality of twice-reflection regions, and Thresh indicates a predetermined constant threshold. The value of Thresh may be set differently depending on the type of subject, and may be determined by experiment or calculation.
- the unevenness determination unit 1314 determines that the REF1 region 1324 is a convex region.
- FIG. 35B shows a case where the REF1 region is a concave region.
- the REF1 region 1325 exists in the center of the region surrounded by the plurality of REF2 regions 1326, 1327, and 1328.
- in this case, the vector T and the vector V are almost parallel. Therefore, when the REF1 region is a concave region, the following Expression 36 holds.
- the unevenness determination unit 1314 can identify whether the one-time reflection region is a concave region or a convex region based on the magnitude of the absolute value of the outer product of the vector T and the vector V.
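The concave/convex test can be sketched as follows. Normalizing V and T to unit length and the particular threshold value are assumptions made for illustration; the essential comparison of E[|V × T|] against Thresh follows the Expressions 35 and 36 discussed above:

```python
from math import hypot

def classify_ref1(ref1_pos, ref2_entries, thresh=0.5):
    """Decide whether a single-reflection (REF1) region is convex or concave.

    ref1_pos:     2-D position of the REF1 region.
    ref2_entries: list of ((x, y), (tx, ty)) pairs, one per surrounding
                  twice-reflection (REF2) region: its position and its groove
                  axis vector T. The sign of T is indefinite, so only
                  |V x T| is used, as in the text.
    thresh:       the constant "Thresh" (the value 0.5 is illustrative).
    """
    cross_mags = []
    for (x, y), (tx, ty) in ref2_entries:
        vx, vy = x - ref1_pos[0], y - ref1_pos[1]     # V: REF1 -> REF2
        nv, nt = hypot(vx, vy), hypot(tx, ty)
        vx, vy, tx, ty = vx / nv, vy / nv, tx / nt, ty / nt
        cross_mags.append(abs(vx * ty - vy * tx))     # |V x T| = |sin(angle)|
    mean = sum(cross_mags) / len(cross_mags)          # E[|V x T|]
    # T roughly perpendicular to V (large E) -> convex  (cf. FIG. 35A);
    # T roughly parallel to V (small E)      -> concave (cf. FIG. 35B).
    return "convex" if mean >= thresh else "concave"
```

With groove axes tangential around the REF1 region the mean cross product is large (convex); with axes radiating outward it is near zero (concave).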
- the image processing apparatus of the first embodiment is a so-called flexible endoscope type in which the imaging element is located at the distal end.
- the endoscope in the present invention is not limited to the flexible endoscope type, and may be a rigid endoscope type.
- FIG. 36 is a block diagram illustrating a configuration example of the endoscope 101 according to the present embodiment.
- since the configuration other than the endoscope 101 is the same as that of the image processing apparatus shown in FIG. 1C, the description of those components will not be repeated here.
- the endoscope 101 included in the image processing apparatus of the present embodiment includes an imaging relay lens optical system 3401 and an illumination relay lens optical system 3302.
- the imaging relay lens optical system 3401 guides the image of the subject obtained from the photographing lens 109 to the imaging element 110 located at the root of the endoscope, and has a general configuration in a rigid endoscope.
- in this embodiment, a polarization plane control element 106 is installed immediately in front of the light source 104. The non-polarized light emitted from the light source 104 is converted into rotating linearly polarized light, which passes through the illumination relay lens optical system 3302 while maintaining its polarization state and is irradiated from the illumination lens 107.
- with this configuration, it is not necessary to install the polarization plane control element 106 of FIG. 1C at the distal end portion of the endoscope, and the diameter of the distal end portion can be reduced.
- instead of the polarization plane control element 106, a mechanism that rotates a polarizer having a plurality of different transmission polarization planes can also be used.
- in the illumination relay lens optical system 3302, the polarization state must be maintained along the optical path, whereas in the imaging relay lens optical system 3401 only the luminance needs to be maintained. This facilitates adjustment of the imaging relay lens optical system 3401.
- the illumination relay lens optical system 3302 can be substituted by a light guide using a polarization maintaining fiber that maintains the polarization state.
- the luminance image is picked up using the rotationally polarized illumination, and the processing performed by the image processor is the same as in the first embodiment.
- FIGS. 37 to 41 show modifications of the distal end portion. A front view of the tip portion 113 is shown on the left side of each drawing, and a cross-sectional view of the tip portion 113 on the right side.
- FIG. 37 shows a configuration example provided with a tip portion close to the configuration of the tip portion in the first embodiment.
- the polarization plane control element 106 is mounted outside the illumination lens 107.
- FIG. 38 shows a configuration example having a ring illumination unit 306 at the tip.
- the light emitted from the ring illumination unit 306 is converted as it passes through the doughnut-shaped polarization plane control element 106 having a hole in the center.
- the ring illumination unit 306 can be configured by arranging a plurality of end faces of individual optical fibers branched from the light guide 105 on the circumference.
- FIG. 39 shows a configuration example having a wide ring illumination unit 307 at the tip.
- the ring illumination unit 307 has a plurality of LED chips arranged in a wide ring-shaped band region, so that ring-shaped surface light emission is obtained as a whole.
- the plane of polarization is controlled in the same manner as above.
- FIG. 40 shows a configuration example in which the center of the tip is recessed so that the optical axis of illumination is inclined with respect to the optical axis of imaging.
- a photographing lens 109 is located in the recessed central portion, and a plurality of LED light sources 308 are arranged around it.
- FIG. 41 shows a configuration example using coaxial epi-illumination in which the optical axis of illumination and the optical axis of imaging can be made the same.
- non-polarized light emitted from the circular surface-emitting LED illumination light source 309 passes through the circular polarization plane control element 106 and then enters the broadband non-polarizing BS (beam splitter) at an angle of 45 degrees.
- the broadband non-polarizing BS transmits half and reflects half of the incident polarized light over the visible range, while maintaining the polarization state. For this reason, the linearly polarized light is irradiated onto the subject as it is. Similarly, half of the return light is transmitted and enters the photographing lens 109.
- it is desirable to install a light absorbing plate 310 in order to minimize reflection of the portion of the light emitted from the light source that passes through the broadband non-polarizing BS.
- although a plate-type broadband non-polarizing BS is used here, a cube type may be used. In that case, since reflections of the light source become stronger, countermeasures are required.
- an LED light source that emits linearly polarized light may be used, and its polarization plane may be rotated by a polarization plane control element.
- all of the above embodiments are configured with a combination of rotationally polarized illumination and a luminance imaging device.
- however, the imaging element 110 may be replaced with a polarization imaging element, eliminating the need for the polarization plane control element 106 of FIG. 1C.
- as the polarization imaging element, an element capable of capturing polarization in a certain wavelength band simultaneously with visible color luminance can be used.
- the processing unit receives one polarized captured image instead of the luminance image group 1301 shown in FIG. 13 and performs processing.
- the maximum luminance angle image YPH and the luminance modulation degree image YD in FIG. 13 may then be replaced with a polarization principal axis angle (phase) image and a degree-of-polarization image, respectively.
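For reference, the maximum luminance angle (YPH) and luminance modulation degree (YD) used throughout can be estimated per pixel from three or more luminance samples by fitting a sinusoid in twice the polarization plane angle. The least-squares form below is a standard sketch, not taken verbatim from the patent, and assumes the sample angles are uniformly spaced over 180 degrees:

```python
from math import atan2, cos, degrees, radians, sin, sqrt

def fit_luminance(angles_deg, samples):
    """Fit I(phi) ~ mean + amp * cos(2*(phi - phi_max)) for one pixel.

    angles_deg: polarization plane angles (>= 3, uniformly spaced over 180 deg).
    samples:    luminance values observed at those angles.
    Returns (phi_max, modulation, mean), where phi_max is the maximum
    luminance angle (YPH) and modulation = amp / mean is the luminance
    modulation degree (YD), the amplitude-to-average ratio.
    """
    n = len(samples)
    mean = sum(samples) / n
    # Discrete Fourier coefficients of the 2*phi harmonic.
    a = 2.0 / n * sum(s * cos(2.0 * radians(p)) for p, s in zip(angles_deg, samples))
    b = 2.0 / n * sum(s * sin(2.0 * radians(p)) for p, s in zip(angles_deg, samples))
    phi_max = degrees(0.5 * atan2(b, a)) % 180.0
    modulation = sqrt(a * a + b * b) / mean
    return phi_max, modulation, mean
```

For samples generated by I(phi) = 10 + 2·cos(2(phi − 30°)) at 0°, 60°, and 120° (i.e. 11, 11, 8), the fit recovers phi_max = 30°, modulation = 0.2, and mean = 10.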
- the present invention can be widely applied in image processing fields requiring such measurement, including medical endoscopes, medical cameras for dermatology, dentistry, ophthalmology, and surgery, as well as industrial endoscopes, fingerprint imaging devices, and surface inspection devices.
- DESCRIPTION OF SYMBOLS 100 Subject 100a Concave area
Description
1) First reflection: diffuse; second reflection: specular
2) First reflection: diffuse; second reflection: diffuse
3) First reflection: specular; second reflection: diffuse
4) First reflection: specular; second reflection: specular
These four phenomena can be assumed.
A modification of the first embodiment will be described with reference to FIG. 34, FIG. 35A, and FIG. 35B.
The image processing apparatus of the first embodiment is a so-called flexible endoscope type in which the imaging element is located at the distal end. The endoscope in the present invention is not limited to the flexible type, and may be a rigid endoscope type.
Various modifications of the configuration of the polarization rotation illumination unit at the distal end are conceivable, common to the first and second embodiments. In order to generate many specular reflections on the unevenness of the subject, it is desirable to use illumination that is as uniform and wide in area as possible. Since the device is an endoscope, however, it is desirable that the diameter of the distal end portion be as small as possible and that the structure be simple.
100a Concave region of the subject
101 Endoscope
102 Control device
103 Insertion portion
104 Light source
105 Light guide
106 Polarization plane control element
107 Illumination lens
108 Image processor
109 Photographing lens
110 Imaging element
111 Video signal line
112 Synchronization device
113 Distal end portion
114 Light source direction setting unit
115 Image composition unit
116 Display unit
117 Light source fluctuation image generation unit
120 Polarized illumination unit
140 Imaging unit
150 Image processing unit
1302 Variable luminance processing unit
1305 Reflection determination unit
1306 Mirror image search unit
Claims (21)
- A polarized illumination unit that sequentially irradiates a subject with three or more types of linearly polarized light having different polarization plane angles;
an imaging unit that sequentially images the subject while the subject is irradiated with each of the three or more types of linearly polarized light; and
an image processing unit,
wherein the image processing unit comprises:
a variable luminance processing unit that processes the luminance of images captured by the imaging unit to calculate the reflected polarization state at the subject surface;
a reflection determination unit that discriminates, based on the output of the variable luminance processing unit, between multiple reflection regions that become return light by being reflected twice in a concave region and single reflection regions that become return light by being reflected once at the subject surface; and
a mirror image search unit that determines pairs of multiple reflection regions that become return light by twice-reflection in a concave region of the subject surface,
and the image processing unit generates an image showing the concave region of the subject surface based on the pairs of multiple reflection regions. An image processing apparatus. - The image processing apparatus according to claim 1, wherein the image processing unit comprises:
a concave region connection unit that generates groove segments from each pair of multiple reflection regions and connects the groove segments to estimate the position of the concave region; and
a cross-sectional shape modeling unit that determines the cross-sectional shape of the concave region. - The image processing apparatus according to claim 2, wherein the image processing unit comprises a normal reproduction unit that integrates the surface normals in the multiple reflection regions and the surface normals in the single reflection regions to generate a normal image.
- The image processing apparatus according to claim 3, comprising:
a light source direction setting unit that sets a light source position;
a light source fluctuation image generation unit that generates a light source fluctuation image obtained when the light source position is virtually moved; and
an image display unit that superimposes and displays the light source fluctuation image and a luminance image. - The image processing apparatus according to claim 4, wherein the light source fluctuation image generation unit generates the luminance image of the subject using the normal image and a physical reflection model.
- The image processing apparatus according to any one of claims 1 to 5, wherein the polarized illumination unit and the imaging unit are attached to an endoscope.
- The image processing apparatus according to claim 1, wherein the polarized illumination unit irradiates linearly polarized light whose polarization plane changes sequentially among three or more types, by passing non-polarized light through a polarization plane conversion element.
- The image processing apparatus according to claim 1, wherein the polarized illumination unit has a light guide that guides non-polarized light to the polarization plane conversion element.
- The image processing apparatus according to claim 1, wherein the imaging unit has a monochrome imaging element or a color imaging element.
- The image processing apparatus according to claim 1, comprising a luminance averaging unit that generates an average luminance image corresponding to an image under non-polarized illumination, by averaging a plurality of luminance images acquired by the imaging unit while sequentially irradiating the subject with the three or more types of linearly polarized light having different polarization plane angles.
- The image processing apparatus according to any one of claims 1 to 10, wherein the variable luminance processing unit obtains, based on pixel signals output from the imaging unit, the relationship between the angle of the polarization plane and the luminance value of each pixel, and generates a maximum luminance angle image defined, for each pixel, by the polarization plane angle at which the luminance value is maximized, and a luminance modulation degree image defined, for each pixel, by the ratio of the amplitude of the luminance variation accompanying the change of the polarization plane to the average luminance value.
- The image processing apparatus according to claim 11, comprising an image display unit that superimposes and displays either the maximum luminance angle image or the luminance modulation degree image together with a luminance image.
- The image processing apparatus according to claim 11, wherein the reflection determination unit extracts, based on the maximum luminance angle image and the luminance modulation degree image, pixels having a luminance modulation degree equal to or greater than a preset value in a pseudo color image having the maximum luminance angle as the hue angle and the luminance modulation degree as the saturation, as the pixels constituting the multiple reflection regions. - The image processing apparatus according to claim 2, wherein the concave region connection unit estimates the azimuth angle of the surface normal of the concave region.
- The image processing apparatus according to claim 2, wherein the cross-sectional shape modeling unit models the cross-sectional shape of the concave region with a specific function, and estimates the zenith angle of the surface normal at an arbitrary position in the concave region by utilizing the property that the azimuth angle of the surface normal in the multiple reflection region is approximately 45 degrees. - The image processing apparatus according to claim 1, wherein the polarized illumination unit irradiates the subject, via a relay lens optical system, with light that has passed through a polarization plane conversion element whose polarization plane changes sequentially among three or more types.
- The image processing apparatus according to claim 1, wherein the polarized illumination unit has an optical system in which a plurality of concentric ring illuminations or a surface illumination light source is combined with a polarization plane conversion element.
- The image processing apparatus according to claim 1, wherein the polarized illumination unit has an optical system in which a surface illumination light source whose illumination optical axis is directed inward relative to the optical axis of the imaging system is combined with a polarization plane conversion element.
- The image processing apparatus according to claim 1, wherein the polarized illumination unit has an optical system in which a non-polarizing broadband beam splitter, a surface illumination light source, and a polarization plane conversion element are combined.
- An image processing method comprising:
a polarized illumination step of sequentially irradiating a subject with three or more types of linearly polarized light having different polarization plane angles;
an imaging step of sequentially imaging the subject while the subject is irradiated with each of the three or more types of linearly polarized light; and
an image processing step,
wherein the image processing step includes:
a variable luminance processing step of processing the luminance of images captured in the imaging step to calculate the reflected polarization state at the subject surface;
a reflection determination step of discriminating, based on the result of the variable luminance processing step, between multiple reflection regions that become return light by being reflected twice in a concave region and single reflection regions that become return light by being reflected once at the subject surface;
a mirror image search step of determining pairs of multiple reflection regions that become return light by twice-reflection in a concave region of the subject surface; and
a step of generating an image showing the concave region of the subject surface based on the pairs of multiple reflection regions. An image processing method. - An endoscope apparatus comprising:
a polarized illumination unit that sequentially irradiates a subject with three or more types of linearly polarized light having different polarization plane angles;
an imaging unit that sequentially images the subject while the subject is irradiated with each of the three or more types of linearly polarized light;
an image processing unit; and
a display unit that displays an image based on the output of the image processing unit,
wherein the image processing unit comprises:
a variable luminance processing unit that processes the luminance of images captured by the imaging unit to calculate the reflected polarization state at the subject surface; and
a pseudo color image conversion unit that generates a pseudo color image based on the output of the variable luminance processing unit,
the variable luminance processing unit obtains, based on pixel signals output from the imaging unit, the relationship between the angle of the polarization plane and the luminance value of each pixel, and generates a maximum luminance angle image defined, for each pixel, by the polarization plane angle at which the luminance value is maximized, and a luminance modulation degree image defined, for each pixel, by the ratio of the amplitude of the luminance variation accompanying the change of the polarization plane to the average luminance value, and
the pseudo color image conversion unit generates, based on the maximum luminance angle image and the luminance modulation degree image, a pseudo color image having the maximum luminance angle as the hue angle and the luminance modulation degree as the saturation, synthesizes the pseudo color image with a luminance image, and causes the display unit to display the result.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012534172A JP5238098B2 (ja) | 2010-11-30 | 2011-09-20 | 画像処理装置および画像処理装置の作動方法 |
CN201180037504.5A CN103037751B (zh) | 2010-11-30 | 2011-09-20 | 图像处理装置 |
US13/571,686 US8648907B2 (en) | 2010-11-30 | 2012-08-10 | Image processing apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010267436 | 2010-11-30 | ||
JP2010-267436 | 2010-11-30 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/571,686 Continuation US8648907B2 (en) | 2010-11-30 | 2012-08-10 | Image processing apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012073414A1 true WO2012073414A1 (ja) | 2012-06-07 |
Family
ID=46171392
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/005293 WO2012073414A1 (ja) | 2010-11-30 | 2011-09-20 | 画像処理装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US8648907B2 (ja) |
JP (1) | JP5238098B2 (ja) |
CN (1) | CN103037751B (ja) |
WO (1) | WO2012073414A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014108193A (ja) * | 2012-11-30 | 2014-06-12 | Panasonic Corp | 画像処理装置および内視鏡 |
WO2014125529A1 (ja) * | 2013-02-15 | 2014-08-21 | パナソニック株式会社 | 画像処理装置および内視鏡 |
JP2018084572A (ja) * | 2016-11-15 | 2018-05-31 | パナソニックIpマネジメント株式会社 | 画像形成装置 |
JP2020512089A (ja) * | 2017-03-24 | 2020-04-23 | シーメンス ヘルスケア ゲゼルシヤフト ミツト ベシユレンクテル ハフツング | 奥行き知覚を高める仮想陰影 |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9536345B2 (en) * | 2012-12-26 | 2017-01-03 | Intel Corporation | Apparatus for enhancement of 3-D images using depth mapping and light source synthesis |
JP6107537B2 (ja) * | 2013-08-27 | 2017-04-05 | ソニー株式会社 | 撮像システムおよびその画像処理方法、画像処理装置およびその画像処理方法、並びに、プログラム |
EP2865988B1 (de) * | 2013-10-22 | 2018-09-19 | Baumer Electric Ag | Lichtschnittsensor |
US9664507B2 (en) | 2014-09-17 | 2017-05-30 | Canon Kabushiki Kaisha | Depth value measurement using illumination by pixels |
WO2016175084A1 (ja) | 2015-04-30 | 2016-11-03 | 富士フイルム株式会社 | 画像処理装置、方法及びプログラム |
JP6679289B2 (ja) * | 2015-11-30 | 2020-04-15 | キヤノン株式会社 | 処理装置、処理システム、撮像装置、処理方法、処理プログラムおよび記録媒体 |
US9958259B2 (en) | 2016-01-12 | 2018-05-01 | Canon Kabushiki Kaisha | Depth value measurement |
JP6735474B2 (ja) * | 2016-02-29 | 2020-08-05 | パナソニックIpマネジメント株式会社 | 撮像装置 |
CN110741632B (zh) * | 2017-06-13 | 2021-08-03 | 索尼公司 | 拍摄装置、拍摄元件以及图像处理方法 |
JP2019086310A (ja) * | 2017-11-02 | 2019-06-06 | 株式会社日立製作所 | 距離画像カメラ、距離画像カメラシステム、及びそれらの制御方法 |
EP3530179A1 (en) * | 2018-02-27 | 2019-08-28 | Koninklijke Philips N.V. | Obtaining images for use in determining one or more properties of skin of a subject |
WO2019215821A1 (ja) | 2018-05-08 | 2019-11-14 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置および被写体情報取得方法 |
WO2019215820A1 (ja) * | 2018-05-08 | 2019-11-14 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置および被写体情報取得方法 |
CN110600109B (zh) * | 2019-09-10 | 2023-12-26 | 安徽中科微因健康科技有限公司 | 一种彩色图像融合的诊断监控综合医疗系统及其融合方法 |
JP7272927B2 (ja) * | 2019-10-15 | 2023-05-12 | 日立Astemo株式会社 | 車載カメラシステム、車両、及び偏光素子 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10104524A (ja) * | 1996-08-08 | 1998-04-24 | Nikon Corp | 微分干渉顕微鏡 |
JPH10122829A (ja) * | 1996-10-25 | 1998-05-15 | Matsushita Electric Ind Co Ltd | 半田フィレットの外観検査装置 |
JP2010082271A (ja) * | 2008-09-30 | 2010-04-15 | Fujifilm Corp | 凹凸検出装置、プログラム、及び方法 |
JP2010104424A (ja) * | 2008-10-28 | 2010-05-13 | Fujifilm Corp | 撮像システムおよび撮像方法 |
JP4762369B2 (ja) * | 2009-12-08 | 2011-08-31 | パナソニック株式会社 | 画像処理装置 |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11313242A (ja) | 1998-04-28 | 1999-11-09 | Nippon Hoso Kyokai <Nhk> | 撮像用光学機器および撮像装置 |
JP3869698B2 (ja) | 2001-10-23 | 2007-01-17 | ペンタックス株式会社 | 電子内視鏡装置 |
CN101194202B (zh) * | 2005-05-10 | 2011-08-17 | 莱弗技术有限公司 | 电光滤波器 |
WO2008102294A2 (en) * | 2007-02-20 | 2008-08-28 | Koninklijke Philips Electronics N.V. | An optical device for assessing optical depth in a sample |
CN100455253C (zh) * | 2007-03-29 | 2009-01-28 | 浙江大学 | 用于在体光学活检的谱域oct内窥成像系统 |
JP4235252B2 (ja) | 2007-05-31 | 2009-03-11 | パナソニック株式会社 | 画像処理装置 |
US8004675B2 (en) | 2007-09-20 | 2011-08-23 | Boss Nova Technologies, LLC | Method and system for stokes polarization imaging |
JP4940440B2 (ja) | 2008-03-31 | 2012-05-30 | 富士フイルム株式会社 | 撮像装置、撮像方法、およびプログラム |
CN101449963B (zh) * | 2008-12-29 | 2011-12-28 | 浙江大学 | 激光共聚焦显微内窥镜 |
-
2011
- 2011-09-20 WO PCT/JP2011/005293 patent/WO2012073414A1/ja active Application Filing
- 2011-09-20 JP JP2012534172A patent/JP5238098B2/ja active Active
- 2011-09-20 CN CN201180037504.5A patent/CN103037751B/zh active Active
-
2012
- 2012-08-10 US US13/571,686 patent/US8648907B2/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10104524A (ja) * | 1996-08-08 | 1998-04-24 | Nikon Corp | 微分干渉顕微鏡 |
JPH10122829A (ja) * | 1996-10-25 | 1998-05-15 | Matsushita Electric Ind Co Ltd | 半田フィレットの外観検査装置 |
JP2010082271A (ja) * | 2008-09-30 | 2010-04-15 | Fujifilm Corp | 凹凸検出装置、プログラム、及び方法 |
JP2010104424A (ja) * | 2008-10-28 | 2010-05-13 | Fujifilm Corp | 撮像システムおよび撮像方法 |
JP4762369B2 (ja) * | 2009-12-08 | 2011-08-31 | パナソニック株式会社 | 画像処理装置 |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014108193A (ja) * | 2012-11-30 | 2014-06-12 | Panasonic Corp | 画像処理装置および内視鏡 |
WO2014125529A1 (ja) * | 2013-02-15 | 2014-08-21 | パナソニック株式会社 | 画像処理装置および内視鏡 |
JP5799264B2 (ja) * | 2013-02-15 | 2015-10-21 | パナソニックIpマネジメント株式会社 | 画像処理装置および内視鏡 |
US9808147B2 (en) | 2013-02-15 | 2017-11-07 | Panasonic Intellectual Property Management Co., Ltd. | Image processing apparatus and endoscope |
JP2018084572A (ja) * | 2016-11-15 | 2018-05-31 | パナソニックIpマネジメント株式会社 | 画像形成装置 |
JP2020512089A (ja) * | 2017-03-24 | 2020-04-23 | シーメンス ヘルスケア ゲゼルシヤフト ミツト ベシユレンクテル ハフツング | 奥行き知覚を高める仮想陰影 |
Also Published As
Publication number | Publication date |
---|---|
JP5238098B2 (ja) | 2013-07-17 |
US20120307028A1 (en) | 2012-12-06 |
JPWO2012073414A1 (ja) | 2014-05-19 |
US8648907B2 (en) | 2014-02-11 |
CN103037751A (zh) | 2013-04-10 |
CN103037751B (zh) | 2015-06-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5238098B2 (ja) | 画像処理装置および画像処理装置の作動方法 | |
JP4971531B2 (ja) | 画像処理装置 | |
JP4762369B2 (ja) | 画像処理装置 | |
JP4469021B2 (ja) | 画像処理方法、画像処理装置、画像処理プログラム、画像合成方法、および画像合成装置 | |
JP5450826B2 (ja) | 画像処理装置 | |
US8184194B2 (en) | Image processing apparatus, image division program and image synthesising method | |
US9392231B2 (en) | Imaging device and endoscope for detecting and displaying the shape of the micro-geometric texture of a transparent structure | |
US7274470B2 (en) | Optical 3D digitizer with enlarged no-ambiguity zone | |
JP4235252B2 (ja) | 画像処理装置 | |
KR102048793B1 (ko) | 표면 컬러를 이용한 표면 토포그래피 간섭측정계 | |
JP6084995B2 (ja) | 画像処理装置 | |
US20150085079A1 (en) | Dynamic range of color camera images superimposed on scanned three-dimensional gray-scale images | |
JP5799264B2 (ja) | 画像処理装置および内視鏡 | |
US20120099172A1 (en) | Microscope system | |
JP2012143363A (ja) | 画像処理装置 | |
JPH1069543A (ja) | 被写体の曲面再構成方法及び被写体の曲面再構成装置 | |
WO2015048053A1 (en) | Improved dynamic range of color camera images superimposed on scanned three dimensional gray scale images | |
JP2009236696A (ja) | 被写体の3次元画像計測方法、計測システム、並びに計測プログラム | |
JP5503573B2 (ja) | 撮像装置および画像処理情報生成プログラム | |
Clancy et al. | An endoscopic structured lighting probe using spectral encoding | |
JPS63246716A (ja) | 内視鏡画像取込み方法および内視鏡装置 | |
KR101469615B1 (ko) | 무작위검색 알고리즘을 이용한 비전시스템의 컬러 조명 제어방법 | |
Lapray et al. | Performance comparison of division of time and division of focal plan polarimeters | |
CA2475391C (en) | Optical 3d digitizer with enlarged non-ambiguity zone | |
JP2004325397A (ja) | 測距装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180037504.5 Country of ref document: CN |
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11844953 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 2012534172 Country of ref document: JP |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 11844953 Country of ref document: EP Kind code of ref document: A1 |