CN100515318C - Medical image processing method - Google Patents

Medical image processing method

Info

Publication number
CN100515318C
CN100515318C CNB2005800397547A CN200580039754A
Authority
CN
China
Prior art keywords
mucosa
image processing
processing method
image
detects
Prior art date
Legal status
Expired - Fee Related
Application number
CNB2005800397547A
Other languages
Chinese (zh)
Other versions
CN101060806A (en)
Inventor
西村博一
长谷川润
田中秀树
井上凉子
野波徹绪
Current Assignee
Olympus Corp
Original Assignee
Olympus Corp
Priority date
Filing date
Publication date
Application filed by Olympus Corp
Publication of CN101060806A
Application granted
Publication of CN100515318C

Landscapes

  • Image Analysis (AREA)
  • Endoscopes (AREA)
  • Image Processing (AREA)

Abstract

A medical image processing method for processing a medical image showing a mucosa, comprising: a boundary information detecting step of detecting, from the medical image, boundary information corresponding to a boundary portion of the mucosa; and a mucosa property detecting step of detecting a mucosa having different properties on the basis of the boundary information detected in the boundary information detecting step.

Description

Medical image processing method
Technical field
The present invention relates to a medical image processing method for identifying the properties of mucosal tissue and the like in the vicinity of the boundary from the esophagus to the stomach.
Background art
In recent years, endoscope apparatuses for performing endoscopic examination and the like using an endoscope have come into wide use in both the medical and industrial fields. In the medical field, for example, the insertion section of an endoscope is inserted into a body cavity to observe and examine a target site, thereby determining whether the examined site is in a normal state or in a property-changed state in which its properties have changed.
In this case, if image processing of the endoscopic image can determine whether a site is normal or property-changed, the surgeon can concentrate the diagnosis on the sites indicated by the determination result, enabling efficient diagnosis.
For example, as a conventional technique, Japanese Unexamined Patent Application Publication No. 2000-155840 describes calculating a feature value from information on the boundary shape between a property-changed portion and a normal portion while reducing the influence of imaging conditions.
However, in this conventional technique the information on the boundary shape is the direction of the density gradient at boundary changes, so particularly when the shape of the boundary itself is the distinguishing feature, it may be difficult to detect the property-changed portion that is the detection target.
Barrett's mucosa is formed at the junction (mucosal boundary) of the esophagus and the stomach when, under the influence of reflux esophagitis or the like, the squamous epithelium of the esophagus is replaced by gastric mucosa. When Barrett's mucosa extends 3 cm or more beyond the normal mucosal boundary over the entire circumference of the esophageal lumen, the condition is diagnosed as the disease known as Barrett's esophagus.
Barrett's esophagus is increasing, particularly in Europe and America, and progresses to adenocarcinoma with comparatively high probability, which has made it a significant problem; finding Barrett's mucosa as early as possible is therefore extremely important.
Barrett's mucosa does not always arise around the entire esophagus; it sometimes develops locally, producing a tongue-shaped or zigzag mucosal boundary appearance (also called the Z-line).
Even when no boundary finding such as a tongue shape can be observed, Barrett's mucosa can sometimes be diagnosed from residual esophageal squamous epithelium isolated as islands within the Barrett's mucosa. The squamous epithelium of the esophagus appears relatively whitish, whereas Barrett's mucosa appears reddish.
Furthermore, in the conventional technique described above, when a dark area exists in the image, it may be difficult to detect the property-changed portion that is the detection target.
In endoscopy of the vicinity of the boundary from the esophagus to the stomach, the portion from the lumen-shaped esophagus to the stomach in its depths often appears as a dark area in the image, and the conventional technique may fail to make an appropriate determination or detection because of such dark areas. That is, preprocessing such as inverse gamma correction and shading correction can brighten part of a dark area and bring some improvement, but it cannot be said to eliminate the dark area. Consequently, regions such as dark areas prevent the accuracy of the image processing result from improving.
Summary of the invention
The present invention has been made in view of the above circumstances, and its object is to provide a medical image processing method capable of detecting, by image processing, a property-changed mucosa to be detected, such as Barrett's mucosa.
A medical image processing method according to a first aspect of the present invention performs image processing on a medical image obtained by imaging a living body mucosa, and comprises: a boundary information detecting step of detecting, from the medical image, boundary information corresponding to a boundary portion of the living body mucosa; and a mucosa property detecting step of detecting whether a living body mucosa having different properties exists, on the basis of the boundary information detected in the boundary information detecting step.
With this configuration, whether a living body mucosa having different properties exists is detected on the basis of the boundary information detected in the boundary information detecting step, so a property-changed mucosa such as Barrett's mucosa, in which the esophageal mucosa has degenerated, can be detected.
A medical image processing method according to a second aspect of the present invention performs image processing on image signals corresponding to a medical image of two or more colors obtained by imaging a living body mucosa, and comprises: a first step of removing inappropriate pixels from the detection target of the medical image and setting a processing target region; a second step of calculating at least one tone feature value for each pixel of the processing target region; and a third step of detecting the existence of a specific living body mucosa on the basis of the calculated tone feature value.
With this configuration, a property-changed mucosa to be detected, such as Barrett's mucosa, can be detected by image processing from the shape of the boundary portion, the tone of the pixels, and the like, without being affected by inappropriate portions such as dark areas in the image.
Description of drawings
Fig. 1 is a block diagram showing the structure of an endoscopic system having the functions of the image processing method of Embodiment 1 of the present invention.
Fig. 2 is a block diagram showing the structure of the processing functions by which the CPU generates a boundary coordinate point sequence according to a processing program.
Fig. 3 is a block diagram showing the structure of the processing functions by which the CPU generates a mucosal boundary detection result according to a processing program.
Fig. 4 is a flowchart showing the processing steps executed by the CPU.
Fig. 5A is a diagram showing an example image in which edges extracted from an original image have been set.
Fig. 5B is a diagram showing an example image in which control points for shape determination have been set from the original image.
Fig. 6 is a flowchart showing the processing steps for determining inflection points from a boundary coordinate point sequence.
Fig. 7 is an explanatory diagram of calculating the vertex evaluation value of an inflection point in the processing steps of Fig. 6.
Fig. 8 is a flowchart showing the processing steps for determining inflection points from inflection point candidates.
Fig. 9 is an explanatory diagram of determining an inflection point from a sequence of inflection point candidates.
Fig. 10 is an explanatory diagram of smoothing a boundary coordinate point sequence by moving average.
Fig. 11 is a block diagram showing the structure of the processing functions by which the CPU generates a mucosal boundary detection result according to a processing program in Embodiment 2 of the present invention.
Fig. 12 is a flowchart showing the processing steps executed by the CPU.
Fig. 13A is a diagram showing an example of an original image.
Fig. 13B is a diagram showing an image in which a boundary coordinate point sequence has been set in the original image of Fig. 13A.
Fig. 14 is a block diagram showing the structure of the processing functions by which the CPU generates a mucosal boundary detection result according to a processing program in Embodiment 3 of the present invention.
Fig. 15 is a flowchart showing the processing steps executed by the CPU.
Fig. 16A is a diagram showing an image in which edges extracted from an original image have been set.
Fig. 16B is a diagram showing local regions and the like set on the boundary coordinate point sequence in the image of Fig. 16A.
Fig. 17 is a diagram showing the main part of an endoscopic system having a first modification.
Fig. 18 is a block diagram showing the structure of a capsule endoscope system of a second modification.
Fig. 19 is a block diagram showing the structure of an endoscopic system having the functions of the image processing method of Embodiment 4 of the present invention.
Fig. 20 is a diagram showing the processing functions for realizing the image processing method of Embodiment 4 and the images used.
Fig. 21 is a flowchart showing the processing contents of unnecessary-region deletion processing, which deletes unnecessary regions.
Fig. 22A is a diagram showing an example of an original image.
Fig. 22B is a diagram showing an unnecessary-region-deleted image generated from the original image of Fig. 22A.
Fig. 22C is a diagram showing a tone-processed image generated from the unnecessary-region-deleted image of Fig. 22B by calculation of the tone feature value.
Fig. 22D is a diagram showing a region-labeled image generated from the tone-processed image of Fig. 22C.
Fig. 23 is a flowchart showing the processing contents of region labeling.
Fig. 24 is a flowchart showing the processing contents of mucosa feature value calculation.
Fig. 25 is a diagram showing part of the structure of an endoscopic system having a first modification of Embodiment 4.
Fig. 26 is a diagram showing the processing functions for realizing the image processing method of a second modification of Embodiment 4 and the images used.
Fig. 27 is a diagram showing an example image divided into small rectangular regions.
Fig. 28 is a flowchart showing the processing contents of mucosa feature value calculation in the second modification.
Fig. 29 is a diagram showing the processing functions for realizing the image processing method of Embodiment 5 of the present invention and the images used.
Fig. 30 is a diagram showing a ring buffer for storing mucosa information.
Fig. 31 is a flowchart showing mucosa determination processing.
Fig. 32A is a diagram showing the processing functions for realizing the image processing method of Embodiment 6 of the present invention and the images used.
Fig. 32B is a diagram showing the processing functions for realizing the image processing performed after the processing of Fig. 32A and the images used.
Fig. 33 is a diagram showing an image divided into groups of small rectangular region images.
Fig. 34 is a flowchart showing the processing contents of processing target region determination.
Fig. 35 is an explanatory diagram of determining the processing target region from the detected dark area region.
Fig. 36 is a diagram showing an example image of a detected dark area.
Fig. 37 is a block diagram showing the structure of a capsule endoscope system having a capsule endoscope according to a modification of Embodiment 6.
Detailed description
Each embodiment of the present invention will be described below with reference to the drawings.
(Embodiment 1)
Embodiment 1 of the present invention will be described with reference to Figs. 1 to 10.
The endoscopic system 1 shown in Fig. 1 comprises an endoscopic observation apparatus 2, an image processing apparatus 3 constituted by a personal computer or the like, which performs image processing on images obtained by the endoscopic observation apparatus 2, and a display monitor 4, which displays the images processed by the image processing apparatus 3.
The endoscopic observation apparatus 2 has: an endoscope 6 to be inserted into a body cavity; a light source device 7 that supplies illumination light to the endoscope 6; a camera control unit (abbreviated as CCU) 8 that performs signal processing for the imaging unit of the endoscope 6; and a monitor 9, which receives the video signal output from the CCU 8 and displays the endoscopic image captured by the imaging element.
The endoscope 6 has an insertion section 11 to be inserted into the body cavity and an operation section 12 provided at the rear end of the insertion section 11. A light guide 13 for transmitting illumination light runs through the insertion section 11.
The rear end of the light guide 13 is connected to the light source device 7. Illumination light supplied from the light source device 7 is transmitted by the light guide 13 and emitted from its distal end face, which is attached to an illumination window provided in the distal end portion 14 of the insertion section 11. A subject such as an affected part is illuminated by this illumination light.
An imaging device 17 is constituted by an objective lens 15 attached to an observation window adjacent to the illumination window and, for example, a charge coupled device (abbreviated as CCD) 16 as a solid-state imaging element arranged at the image-forming position of the objective lens 15. The optical image formed on the imaging surface of the CCD 16 is photoelectrically converted by the CCD 16.
The CCD 16 is connected to the CCU 8 via a signal line, and when a CCD drive signal is applied from the CCU 8, the CCD 16 outputs the photoelectrically converted image signal. This image signal undergoes signal processing by an image processing circuit in the CCU 8 and is converted into a video signal. The video signal is output to the monitor 9, and the image captured by the CCD 16 is displayed on the display surface of the monitor 9 as an endoscopic image. The video signal is also input to the image processing apparatus 3.
In the present embodiment, the endoscope 6 is used for an endoscopic examination in which the distal end portion 14 of the insertion section 11 is inserted through the mouth down to the vicinity of the boundary between the esophagus and the stomach, and whether Barrett's mucosa exists near this boundary is examined. Barrett's mucosa is a mucosa (also referred to in this specification as a property-changed mucosa) in which the normal mucosa of the esophagus (specifically, squamous epithelium), the detection target mucosa, has degenerated so as to exhibit the properties of gastric mucosa.
In this case, the image signal corresponding to the endoscopic image of the imaged living body mucosal surface is also input to the image processing apparatus 3, and detection (determination) processing of whether Barrett's mucosa exists is applied to this image signal by the image processing method described below.
The image processing apparatus 3 has: an image input section 21, which receives the image signal corresponding to the endoscopic image input from the endoscopic observation apparatus 2; a central processing unit (abbreviated as CPU) 22, which performs image processing on the image data input from the image input section 21; and a processing program storage section 23, which stores the processing programs (control programs) that cause the CPU 22 to execute the image processing.
The image processing apparatus 3 also has: an image storage section 24, which stores the image data and the like input from the image input section 21; an analysis information storage section 25, which stores analysis information and the like processed by the CPU 22; a hard disk 27 as a storage device, which stores, via a storage device interface 26, the image data and analysis information processed by the CPU 22; a display processing section 28, which performs display processing for displaying the image data and the like processed by the CPU 22; and an input operation section 29, such as a keyboard, with which the user inputs parameters for the image processing and performs instruction operations.
The video signal generated by the display processing section 28 is output to the display monitor 4, and the processed image is displayed on the display surface of the display monitor 4. The image input section 21, CPU 22, processing program storage section 23, image storage section 24, analysis information storage section 25, storage device interface 26, display processing section 28, and input operation section 29 are interconnected via a data bus 30.
In the present embodiment, as shown in Figs. 2 and 3, by the processing of processing program A and processing program B stored in the processing program storage section 23, the CPU 22 performs image processing that detects (determines), for the detection target mucosa, Barrett's mucosa, a property-changed mucosa whose properties differ from those of normal mucosa. Processing program A of Fig. 2 generates an edge-detected image 32 from an original image 31 and then generates a boundary coordinate point sequence 33. Processing program B shown in Fig. 3 then generates a mucosal boundary detection result 34 from the boundary coordinate point sequence 33.
As shown in Fig. 2, the CPU 22 executing processing program A has the function of a contour extraction filtering section 35, which applies contour extraction filtering to the original image 31 to generate the edge-detected image 32. That is, at the site to be determined, the R, G, and B density values of the image data differ between normal mucosa and degenerated mucosa. Therefore, by detecting or extracting contours (edges), the boundary between normal mucosa and degenerated mucosa can be detected. The RGB density values of the image data are values after known inverse gamma correction.
The contour extraction filtering section 35 generates the edge-detected image 32 by applying a known band-pass filter and a known edge detection filter (for example, the Prewitt operator or the Sobel operator). Such edge detection filters are described in detail in Reference 1: Hideyuki Tamura, "Introduction to Computer Image Processing," Soken Shuppan, pp. 120-122.
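As a rough illustration of this contour-extraction step, the following is a minimal Python sketch, not the patent's implementation: the Gaussian sigma, the use of SciPy, and operation on a single color channel are all assumptions.

```python
# A minimal sketch of contour extraction filtering, assuming a single-channel
# (e.g. G) image: band-limit with Gaussian smoothing, then a Sobel edge filter.
import numpy as np
from scipy import ndimage

def edge_detect(channel: np.ndarray) -> np.ndarray:
    smoothed = ndimage.gaussian_filter(channel.astype(float), sigma=2.0)
    gx = ndimage.sobel(smoothed, axis=1)  # horizontal gradient
    gy = ndimage.sobel(smoothed, axis=0)  # vertical gradient
    return np.hypot(gx, gy)               # gradient magnitude as edge-detected image
```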
The CPU 22 executing processing program A further has: a binarization processing section 36, which binarizes the edge-detected image 32; a thinning processing section 37, which performs thinning after the binarization; an endpoint extraction section 38, which extracts endpoints after the thinning; and a connected line segment extraction section 39, which extracts connected line segments after the endpoint extraction and generates the boundary coordinate point sequence 33.
The binarization processing section 36 sets a prescribed threshold θ for the edge-detected image 32 and generates a binary image by setting pixels greater than or equal to the threshold θ to 1 and pixels below the threshold to 0.
The thinning processing section 37 generates, from the binarized image, a thinned image in which the pixel values of lines are set to 1 and the pixel values of the background are set to 0. The thinning processing section 37 can use a known method such as Hilditch's thinning algorithm, which reduces line width by setting a pixel's value to 0 only where the connectivity of the pixels can be preserved.
Hilditch's thinning algorithm is described in detail in Reference 2: "Algorithm Laboratory for Mastering Image Processing," C MAGAZINE, September 2000 issue, SoftBank Publishing, pp. 123-129.
The endpoint extraction section 38 scans the thinned image along the scanning direction of the picture, obtaining pixel data in 3 × 3 patterns. When the center pixel of an obtained pattern has the value "1" and exactly one of the surrounding pixels (above, below, left, right, and diagonal) has the value "1", the center pixel is determined to be an endpoint, in other words a boundary point.
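The endpoint test just described can be sketched as follows; this is an illustrative fragment under assumptions (a 0/1 thinned image, simple border handling), not the patent's code.

```python
# A minimal sketch of endpoint extraction: on a thinned 0/1 image, a pixel of
# value 1 with exactly one 8-connected neighbor of value 1 is an endpoint.
import numpy as np

def find_endpoints(thinned: np.ndarray) -> list:
    endpoints = []
    h, w = thinned.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if thinned[y, x] == 1:
                window = thinned[y - 1:y + 2, x - 1:x + 2]
                if window.sum() - 1 == 1:  # exactly one neighbor besides the center
                    endpoints.append((y, x))
    return endpoints
```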
The connected line segment extraction section 39 then takes one endpoint (boundary point) as a starting point and performs tracing processing on the thinned image, thereby obtaining the boundary coordinate point sequence.
This tracing processing can be realized by known boundary line tracing, described in detail in Reference 3: Hideyuki Tamura, "Introduction to Computer Image Processing," Soken Shuppan, pp. 84-85.
The CPU 22 executing processing program B shown in Fig. 3 has the functions of an inflection point detecting section 41 and a shape evaluation section 42. The inflection point detecting section 41 detects inflection points (bending points) in the boundary coordinate point sequence 33, and the shape evaluation section 42 evaluates the shape of the boundary coordinate point sequence 33 in which the inflection points were detected, thereby producing the mucosal boundary detection result 34.
When inflection points satisfying the condition described later are detected, the shape evaluation section 42 determines that the boundary is a property-changed mucosal boundary when the number of qualifying inflection points is greater than or equal to a prescribed value, and determines that it is not a property-changed mucosal boundary when there is no tortuous (bending) unevenness, that is, when the qualifying bending directions all point in one direction.
The image processing method for Barrett's boundary determination will now be described following the processing flow of Fig. 4. In accordance with processing programs A and B of Figs. 2 and 3, this method determines whether a boundary is that of the particular mucosa to be determined, specifically, whether it is the boundary of Barrett's mucosa (a degenerated mucosa formed by degeneration of the squamous epithelium, the normal mucosa of the esophagus).
When the image processing apparatus 3 starts operating, the CPU 22 reads the processing programs from the processing program storage section 23 and starts processing based on these programs.
That is, the CPU 22 executes the functions of processing program A shown in Fig. 2. More specifically, as shown in Fig. 4, in the first step S1 the CPU 22 obtains the image data as the original image 31, input from the CCU 8 of the endoscopic observation apparatus 2 through the image input section 21.
In the following step S2, the CPU 22 applies the known edge detection processing described above to this image data as contour extraction filtering, generating the edge-detected image 32. Fig. 5A shows an edge-detected image 32. In an image of the lumen-shaped interior of the esophagus captured by the direct-viewing endoscope 6 shown in Fig. 1, the deep side of the lumen (the stomach-side portion) is a dark area. The dark area in the image of Fig. 5A corresponds to the vicinity of the cardia, near the entrance of the stomach.
In the following step S3, the CPU 22 applies the binarization described above to the edge-detected image 32, then performs the thinning processing and endpoint extraction (obtaining the boundary).
In the following step S4, the CPU 22 performs tracing processing on the obtained boundary and obtains the coordinate point sequence along the boundary, that is, the coordinate point sequence of the boundary.
The CPU 22 then executes the processing of processing program B of Fig. 3. That is, for the boundary coordinate point sequence generated in step S4, as shown in step S5, the CPU 22 calculates, for example, N tortuous inflection points from the vertex evaluation value A of each point on the boundary.
The calculation of these inflection points is described later using Fig. 6; briefly, with reference to Fig. 7, a curve of length L centered at each point on the boundary (extending L/2 before and after it) is extracted (set), the straight-line distance (chord length) between the start and end points of the curve is calculated as the vertex evaluation value A(j), points whose value is smaller than the vertex evaluation values A(j-h) and A(j+h) of the points h positions before and after become inflection point candidates, and the inflection points are determined from these candidates.
In the following step S6, the CPU 22 initializes the parameter k, representing the number of control points, and the inflection point parameter i to 0 and 1, respectively, and begins the processing of determining whether a property-changed mucosa to be detected exists.
After initializing the parameters i and k, in the following step S7 the CPU 22 determines whether the evaluation value B of the calculated inflection point (i) is less than a prescribed threshold thr. Here, the CPU 22 calculates the vertex evaluation value divided by the curve length, A/L, as the evaluation value B, and determines whether this evaluation value B of the inflection point (i) is less than the prescribed threshold thr.
For the evaluation value B calculated at the inflection point (i), the determination between normal mucosa and property-changed mucosa is prepared in advance; more specifically, sample data taken respectively from squamous epithelium and Barrett's mucosa confirmed by diagnosis are used, and the determination is made by comparison with the threshold thr set as the reference value for discriminating between them.
When the condition evaluation value B < thr is satisfied, in step S8 the CPU 22 takes this inflection point (i) as a control point M and increments the value of the parameter k by 1. When the condition B < thr is not satisfied, processing proceeds to step S9.
In step S9, the CPU 22 determines from the parameter i whether this is the last inflection point, that is, whether i equals N, thereby determining whether the processing of steps S7 and S8 has been applied to all inflection points. When it has not, as shown in step S10, the parameter i is incremented by 1 and processing returns to step S7.
By applying the processing of steps S7 and S8 to all inflection points in this way, the CPU 22 sets the control points as shown in Fig. 5B, and processing then proceeds to step S11.
In step S11, the CPU 22 determines whether the control point count parameter k is greater than 1. When k is greater than 1, in the following step S12 the CPU 22 determines whether the control points are arranged irregularly in the radial direction of the lumen.
As shown in Fig. 5B, when the control points do not form concentric circles but are arranged in an uneven shape in the radial direction of the lumen, the CPU 22 determines in step S13 that the boundary is that of Barrett's mucosa, that is, a Barrett's boundary.
On the other hand, when the condition k > 1 is not satisfied in the determination of step S11, or when the determination of step S12 finds that the control points present no unevenness in the radial direction of the lumen and form concentric circles, the CPU 22 determines in step S14 that the boundary is that of squamous epithelium, the normal mucosa. In this way, the possibility that a Barrett's mucosa boundary (Barrett's boundary) exists in the image can be determined from the image data.
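The decision logic of steps S6 to S14 can be summarized in a short sketch. This is a hedged illustration: the placeholder threshold thr, the use of distances from the lumen (dark area) center, and a coefficient-of-variation test standing in for the "irregular radial arrangement" judgment of step S12 are all assumptions.

```python
# A minimal sketch of steps S6-S14: inflection points with B = A/L below thr
# become control points; an irregular radial arrangement of more than one
# control point is called a Barrett's boundary.
import numpy as np

def classify_boundary(A: np.ndarray, radii: np.ndarray, L: float,
                      thr: float = 0.9, radial_cv: float = 0.1) -> str:
    """A: vertex evaluation values of the inflection points.
    radii: distance of each inflection point from the lumen center (assumed given)."""
    control = A / L < thr                        # steps S7-S8: select control points
    k = int(control.sum())
    if k > 1:                                    # step S11
        r = radii[control]
        if np.std(r) / np.mean(r) > radial_cv:   # step S12: radially irregular?
            return "Barrett's boundary"          # step S13
    return "squamous epithelium boundary"        # step S14
```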
The processing for calculating the inflection points of the boundary in the above step S5 will now be described with reference to Fig. 6.
In step S21, the CPU 22 obtains the boundary coordinate point sequence consisting of N_p points.
In step S22, the CPU 22 compares N_p with the threshold Vmin, the minimum number of points for a boundary coordinate point sequence to be processed. If N_p > Vmin, processing proceeds to step S23; otherwise the sequence is regarded as not being a processing target, and the processing ends.
In the following step S23, the CPU 22 calculates the inflection point vertex evaluation value A(j) of each point (1 ≤ j ≤ N_p). The details of step S23 are described later.
In step S24, j is initialized, and in step S25 the CPU 22 takes the point P(j) as the target point and obtains the inflection point vertex evaluation value A(j) of this target point together with the vertex evaluation values A(j-h) and A(j+h) of the points P(j-h) and P(j+h), located h points before and after the target point. The value h is a parameter that selects the comparison targets in step S26; in the present embodiment, for example, h = 5. The details of step S25 are described later.
In step S26, the CPU 22 compares the vertex evaluation values A(j), A(j-h), and A(j+h). If A(j) < A(j-h) and A(j) < A(j+h), processing proceeds to step S27; otherwise it proceeds to step S29. The details of step S26 are described later.
In step S27, the CPU 22 performs a comparison with the threshold Vc, which limits the size of the tortuosity to be detected. If A(j)/L < Vc, processing proceeds to step S28; otherwise it proceeds to step S29.
In step S28, the CPU 22 detects the target point P(j) as an inflection point candidate and stores this information in the analysis information storage section 25 or the like.
In step S29, if the target point P(j) is the last point of the boundary coordinate point sequence, that is, j = N_p, the CPU 22 determines the inflection points in step S30 and ends the processing; if j ≠ N_p, processing proceeds to step S31.
In step S31, the CPU 22 updates the target point to j = j + 1 and repeats the series of processing shown in steps S25 to S29.
Fig. 8 shows the flow of the inflection point determination processing of step S30, which is described in detail later.
Next, the method of calculating the vertex evaluation value A in step S23, and the processing that uses the vertex evaluation value A in steps S25 and S26 to detect inflection point candidates, are described in detail.
Fig. 7 is an explanatory diagram of the method of calculating the vertex evaluation value A of an inflection point in the present embodiment and of the processing that uses the vertex evaluation value A to detect inflection point candidates.
As the method of calculating the vertex evaluation value A of an inflection point, the method disclosed in Reference 4: "A method for detecting bending points of thinned figures," IEICE Technical Report PRL80-107, pp. 80-90 (1980), is used.
For each point of the obtained boundary coordinate point sequence, the CPU 22 extracts the curve of length L (shown by the solid line in Fig. 7) centered at that point along the sequence, and calculates the distance (chord length) between the start and end points of this curve as the vertex evaluation value A(j) of the inflection point. However, when the point P(j) lies within [L/2] of either end of the boundary coordinate point sequence, so that a curve of length L cannot be extracted, A(j) is set to L. Here, [ ] is the Gauss symbol (floor).
In the detection processing of inflection point candidates, when the point P(j) lies within h of either end of the sequence, so that the points P(j-h) and P(j+h) cannot be points of the boundary coordinate point sequence, A(j-h) and A(j+h) are set to L. When the vertex evaluation value A(j) of the point P(j) is smaller than the vertex evaluation value A(j-h) of the point P(j-h), h points before it, and smaller than the vertex evaluation value A(j+h) of the point P(j+h), h points after it, P(j) exhibits a larger tortuous feature than P(j-h) and P(j+h), so the CPU 22 detects this P(j) as an inflection point candidate.
This comparative evaluation is performed successively for all the points forming the boundary coordinate point sequence. Points within [L/2] of the ends of the boundary coordinate point sequence have A(j) = L as described above and are therefore excluded from inflection point candidate detection.
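Under the definitions above, the vertex evaluation value A(j) and the candidate test of steps S25 to S27 might be sketched as follows; h = 5 follows the text, while the values of L and Vc and the array handling at the sequence ends are this illustration's assumptions.

```python
# A minimal sketch of the vertex evaluation value A(j) (chord length of the
# length-L curve centered at each point) and the candidate test of steps S25-S27.
import numpy as np

def vertex_evaluation(points: np.ndarray, L: int) -> np.ndarray:
    """points: (N_p, 2) boundary coordinates ordered along the boundary."""
    n, half = len(points), L // 2
    A = np.full(n, float(L))            # within [L/2] of the ends, A(j) = L
    for j in range(half, n - half):
        dy, dx = points[j + half] - points[j - half]
        A[j] = np.hypot(dx, dy)         # chord length of the length-L curve
    return A

def candidate_points(A: np.ndarray, L: int, h: int = 5, Vc: float = 0.9) -> list:
    padded = np.pad(A, h, constant_values=float(L))  # missing neighbors count as L
    return [j for j in range(len(A))
            if padded[j + h] < padded[j]             # A(j) < A(j-h)
            and padded[j + h] < padded[j + 2 * h]    # A(j) < A(j+h)
            and padded[j + h] / L < Vc]              # step S27 tortuosity limit
```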
The flow of the inflection point determination processing, shown in Fig. 8, proceeds as follows.
In step S41, the CPU 22 obtains from the analysis information storage section 25 the inflection point candidates detected by the candidate detection processing described above. When inflection point candidates are adjacent and continuous, the point sequence is extracted as one group (hereinafter called a pattern). Here, the CPU 22 takes the number of patterns generated in the boundary coordinate point sequence as N_pattern (1 ≤ N_pattern ≤ number of inflection point candidates) and calculates the number of inflection point candidates Cn (1 ≤ n ≤ N_pattern) forming each pattern.
In step S42, the CPU 22 initializes n, which indicates the first pattern, to 1, and in step S43 the CPU 22 obtains one of the patterns generated in step S41 as the target pattern, pattern(n) (1 ≤ n ≤ N_pattern).
If the number of inflection point candidates of the target pattern pattern(n) is Cn = 1, the CPU 22 judges that the candidate is not adjacent to other inflection point candidates but exists alone, and determines this inflection point candidate as an inflection point.
If the number of inflection point candidates of the target pattern pattern(n) is Cn ≥ 2, processing proceeds to step S44, and the CPU 22 determines one point from the candidate point sequence of the target pattern as an inflection point.
In step S44, the CPU 22 determines the point with the smallest A(j) in the target pattern pattern(n) as the inflection point. Fig. 9 illustrates determining an inflection point from the inflection point candidates.
In step S45, if the target pattern pattern(n) is the last pattern, that is, n = N_pattern, the CPU 22 ends the processing; if n ≠ N_pattern, processing proceeds to step S46.
In step S46, the CPU 22 updates the target pattern for inflection point determination to n = n + 1 and repeats the series of processing shown in steps S43 to S45.
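A compact sketch of this grouping (steps S41 to S46): consecutive candidate indices form a pattern, and each pattern contributes the candidate with the smallest A(j). The consecutive-index grouping idiom is an assumption of the illustration.

```python
# A minimal sketch of inflection point determination: group adjacent candidates
# into patterns, then keep the minimum-A(j) point of each pattern (step S44).
import numpy as np
from itertools import groupby

def determine_inflection_points(candidates: list, A: np.ndarray) -> list:
    inflection = []
    # indices that run consecutively along the boundary share the same key
    for _, run in groupby(enumerate(candidates), key=lambda t: t[1] - t[0]):
        pattern = [j for _, j in run]
        inflection.append(min(pattern, key=lambda j: A[j]))
    return inflection
```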
In addition, as described in Reference 4, when the boundary coordinate point sequence is not smooth, the coordinate values of the boundary coordinate point sequence obtained in step S21 are smoothed by the moving average method.
The moving average method here refers to setting the coordinate values of a point P to the mean of the coordinate values of the [m/2] points before and after it in the coordinate point sequence. The value m is the moving average point count, that is, a parameter for determining the degree of smoothing. Fig. 10 shows, for example, the case of m = 9, that is, a 9-point moving average.
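The moving-average smoothing might look as follows; the handling of the [m/2] points at each end, which the text leaves open, is an assumption here.

```python
# A minimal sketch of m-point moving-average smoothing of the coordinate point
# sequence (m = 9 in the text); endpoints are left unsmoothed as an assumption.
import numpy as np

def smooth_points(points: np.ndarray, m: int = 9) -> np.ndarray:
    """points: (N_p, 2) coordinates; each interior point becomes the mean of
    itself and the [m/2] points before and after it."""
    half = m // 2
    smoothed = points.astype(float).copy()
    for j in range(half, len(points) - half):
        smoothed[j] = points[j - half:j + half + 1].mean(axis=0)
    return smoothed
```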
Points within [L/2] of the ends of the boundary coordinate point sequence are assumed to have A(j) = L, so inflection point candidates cannot be detected there; when such points are to be included in the detection target, it suffices to extend both ends of the boundary coordinate point sequence obtained in step S21, as described in the reference cited above. The parameter h in step S25 may also be determined according to the number of points N_p forming the boundary coordinate point sequence.
In the present embodiment, in step S44 the CPU 22 determines the point with the smallest A(j) in the candidate point sequence of the inflection points in the target pattern as the inflection point, but the central point of the candidate point sequence in the target pattern, that is, the [Cn/2]-th point, may be determined instead.
As described above, in the present embodiment, mucosal boundaries of differing properties are detected from the original image 31 by edge detection processing and the like, inflection points are further detected in the shape of the mucosal boundary, and, according to whether each inflection point satisfies prescribed conditions, control points are set as representative points of the inflection points. From these control points, whether the boundary is a Barrett's boundary, the boundary of a property-changed mucosa, is then detected (determined).
In cases where Barrett's mucosa exists near the boundary between the esophagus and the stomach, the boundary of the Barrett's mucosa is not concentric; in most such cases it presents an uneven, star-like shape as shown in Fig. 5B.
Therefore, when control points are detected as representative points of the inflection points as described above, and the control points are arranged unevenly and their number is greater than or equal to a prescribed number, the boundary is determined to be that of Barrett's mucosa; the Barrett's boundary can thus be determined with high accuracy from a feature that appears frequently in Barrett's mucosa cases.
In determining the Barrett's boundary, whether the mucosa is Barrett's mucosa may also be determined from the complexity of the detected boundary. That is, the Barrett's boundary determination may be performed by detecting or evaluating whether the shape of the detected boundary is not a simple shape such as a circle but a complex shape, and using that evaluation result.
More specifically, for example, the radius of curvature near points set at prescribed intervals along the boundary coordinate point sequence 33 is obtained, the radius-of-curvature values are grouped, and whether the shape is complex is determined from the distribution of the values. The boundary may be determined to be that of Barrett's mucosa when the spread of the radius-of-curvature distribution is greater than or equal to a threshold.
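As one hedged reading of this curvature criterion, radii of curvature can be sampled by fitting circumscribed circles to point triples, with the spread of the resulting values standing in for "the distribution greater than or equal to a threshold"; the circumradius formula, sampling step, and spread measure are all assumptions.

```python
# A minimal sketch of the complexity test: sample radii of curvature along the
# boundary and call the shape complex when their distribution is broad.
import numpy as np

def is_complex_boundary(points: np.ndarray, step: int = 10,
                        spread_thr: float = 0.5) -> bool:
    radii = []
    for j in range(step, len(points) - step, step):
        p0, p1, p2 = points[j - step], points[j], points[j + step]
        a = np.linalg.norm(p1 - p0)
        b = np.linalg.norm(p2 - p1)
        c = np.linalg.norm(p2 - p0)
        area2 = abs((p1[0] - p0[0]) * (p2[1] - p0[1])
                    - (p1[1] - p0[1]) * (p2[0] - p0[0]))  # twice the triangle area
        if area2 > 1e-6:
            radii.append(a * b * c / (2 * area2))  # circumradius ~ radius of curvature
    radii = np.asarray(radii)
    if radii.size == 0:
        return False
    return radii.std() / radii.mean() > spread_thr  # broad distribution => complex
```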
(Embodiment 2)
Embodiment 2 of the present invention will now be described with reference to Figs. 11 to 13. Fig. 11 shows the structure of the functions realized by the CPU 22 according to processing program C of Embodiment 2.
The present embodiment shares the part shown in Fig. 2 with Embodiment 1, so its description is omitted; the processing functions shown in Fig. 3 are replaced by the processing functions shown in Fig. 11, that is, a region-of-interest setting section 51, a small region extraction section 52, and a mucosal boundary determination section 53.
As shown in Fig. 11, the region-of-interest setting section 51 sets, for the original image 31 and the boundary coordinate point sequence 33, a region of interest that becomes the detection target.
The small region extraction section 52 extracts small regions from the set region of interest, and the mucosal boundary determination section 53 determines whether each extracted small region is a mucosal boundary, outputting the mucosal boundary detection result 34.
The concrete processing will now be described with reference to the processing flow of Fig. 12.
As in Fig. 4, the CPU 22 obtains the image data (S1), performs edge detection (S2), generates the edge-detected image 32 and thins it to obtain the boundary (S3), and then obtains the coordinate point sequence of the boundary, generating the boundary coordinate point sequence 33 (S4). The following processing may also be performed in the state before the boundary coordinate point sequence 33 is generated (that is, after the processing of step S3).
In the following step S51, for the regions divided by the boundary in step S4 (S3), the CPU 22 sets the region of interest to the vicinity of the cardia on the lumen center side of the image (since the lumen center side becomes dark, it can be extracted by determination processing using a dark-area threshold) or to the region containing the cardia.
After the region of interest has been set in this way, as shown in step S52, the CPU 22 performs edge detection processing again on this region of interest, or detects the edges in the region of interest using a BPF (band-pass filter), generating from the original image of Fig. 13A an image containing small regions divided by edges 55 as in Fig. 13B.
In the following step S53, the CPU 22 initializes the parameter i representing a small region 55 (sets i = 1), and in the following step S54 calculates the average tone C(i) in each small region (i).
The average tone C(i) is, for example, the mean of the IHb values. IHb is the value 32·log2(R/G), obtained by extracting, for each pixel of the endoscopic image, the luminance signal Vg (the G color signal, reflecting light near 560 nm, which is readily absorbed by hemoglobin) and the luminance signal Vr (the R color signal, reflecting light near 650 nm, which is absorbed little by hemoglobin), and logarithmically converting the ratio of the two R and G color signals.
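The per-region tone feature can be sketched directly from this definition; the epsilon guard against zero-valued pixels is an assumption of the illustration.

```python
# A minimal sketch of the IHb tone feature, 32*log2(R/G), averaged over one
# small region given as a boolean mask.
import numpy as np

def mean_ihb(r: np.ndarray, g: np.ndarray, region_mask: np.ndarray) -> float:
    """r, g: R and G channels of the image; region_mask: mask of small region (i)."""
    eps = 1e-6                                  # guard against division/log of zero
    ihb = 32.0 * np.log2((r.astype(float) + eps) / (g.astype(float) + eps))
    return float(ihb[region_mask].mean())       # average tone C(i)
```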
In the following step S55, the CPU 22 determines whether the average tone C(i) is greater than a prescribed threshold thr2. Here, the prescribed threshold is set using, for example, the average tone of at least one color signal in images of the Barrett's mucosa portions of cases confirmed by diagnosis and the average tone of the same color signal in images of squamous epithelium portions.
When the determination result of step S55 satisfies the decision condition average tone C(i) > thr2, as shown in step S56, the CPU 22 regards this small region (i) as squamous epithelium remaining on the Barrett's mucosa, and when one or more such squamous epithelium regions exist, the region of interest containing the small regions is determined to be a Barrett's boundary (the boundary of Barrett's mucosa).
On the other hand, when the determination result of step S55 does not satisfy the decision condition average tone C(i) > thr2, as shown in step S57, the CPU 22 determines whether the parameter i equals the number N of small regions; when i is not equal to N, as shown in step S58, i is incremented by 1 and processing returns to step S54. The processing of steps S54 to S57 is then repeated.
In Barrett's mucosa cases, isolated small regions of island-like squamous epithelium often exist within the Barrett's mucosa.
Therefore, in the present embodiment, the boundary coordinate point sequence 33 is generated as described above, edges are detected in the region on the lumen side of the original image 31 using the boundary coordinate point sequence 33, or small regions (i) are detected using the boundary coordinate point sequence 33, and for each small region (i) it is determined from the tone feature value whether the region is squamous epithelium. When a small region (i) is determined to be squamous epithelium, it is judged that the small region is island-like isolated squamous epithelium, indicating isolated Barrett's mucosa or a Barrett's boundary.
By making this determination, whether Barrett's mucosa exists in the original image can be determined with high accuracy.
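Putting steps S53 to S58 together, a hedged sketch of the region-of-interest decision follows, reusing the mean_ihb sketch above; thr2 is the assumed, diagnosis-derived threshold.

```python
# A minimal sketch of the loop of steps S53-S58: any small region whose average
# IHb exceeds thr2 counts as residual squamous epithelium, and one or more such
# islands marks the region of interest as a Barrett's boundary.
def barrett_boundary_in_roi(r, g, region_masks, thr2: float) -> bool:
    squamous = [i for i, mask in enumerate(region_masks)
                if mean_ihb(r, g, mask) > thr2]   # steps S54-S56
    return len(squamous) >= 1
```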
In step S52 of Fig. 12, the small regions 55 divided by edges are detected; as a variation, the small regions 55 may instead be obtained by binarization with a prescribed threshold.
In steps S54 and S55, whether a region is Barrett's mucosa or a Barrett's boundary may also be determined from the tone change before and after (or on both sides of) the edge of the small region 55.
In the image processing of Fig. 12, the lumen-side region is set as the region of interest and the small regions 55 divided by edges and the like within this region of interest are judged, but the judgment may also be made by obtaining the average tone and the like of the region outside the small regions 55.
For example, the region of interest can be judged comprehensively from the determination results for the small regions 55 and for their outside regions. In this way, an even more accurate determination can be made.
A small region 55 divided by edges may be a completely isolated small region as shown in Fig. 13B, or it may have a shape that extends outward and connects to the squamous epithelium side.
In the former case, as shown in Fig. 13B, the small region 55 often appears as an isolated curve, so whether the region of interest is Barrett's mucosa or a Barrett's boundary may also be determined according to whether the detected small region 55 is an isolated curve.
(Embodiment 3)
Embodiment 3 of the present invention will now be described with reference to Figs. 14 to 18. Fig. 14 shows the structure of the functions realized by the CPU 22 according to processing program D of Embodiment 3.
The present embodiment has the processing functions shown in Fig. 2 of Embodiment 1, and has the processing functions shown in Fig. 14 in place of the processing functions shown in Fig. 3. Description of the processing functions shown in Fig. 2 is omitted.
The processing functions shown in Fig. 14 comprise: a local region setting section 63, which sets a local region group 61 from the boundary coordinate point sequence 33; and a mucosal boundary determination section 64, which generates a mucosal boundary detection result 62 from the local region group 61.
The mucosal boundary determination processing of the present embodiment will now be described with reference to Fig. 15.
As shown in Fig. 15, in the first step S61 the CPU 22 obtains the G image data; then, in the same way as in Embodiment 1 or 2, it performs the edge detection processing (S2) and the thinning, thereby obtaining the boundary (S3), and further obtains the coordinate point sequence of the boundary (S4). The edge-detected image of Fig. 16A is thereby obtained.
Next, the processing of processing program D shown in Fig. 14 is executed. That is, as shown in step S62 of Fig. 15, the CPU 22 sets, for example, N local regions along the boundary. In other words, a local region group 61 containing N regions is generated.
For the boundary coordinate point sequence of the edge-detected image of Fig. 16A obtained by the processing up to step S4, short line segments (or tangents) on the boundary, shown for example at regular intervals in Fig. 16B, are used along the boundary; in the direction locally orthogonal to each line segment (that is, the gradient direction), a local region is set straddling both sides of the short boundary line segment so that the region is roughly bisected by the segment.
In Fig. 16B, an example of a set local region is shown enlarged on the right side. Here the local regions are set as rectangular regions, but local regions of circular and other shapes may also be used. In the following step S63, the CPU 22 initializes the parameter i representing each local region, and in the following step S64 calculates the average pixel values E and F of the two regions divided by the boundary within the local region (i).
In the following step S65, the CPU 22 determines whether the ratio E/F of the average pixel values of the two regions on either side of the boundary is greater than a prescribed threshold thr3.
Here, the mean of a tone feature value, for example the IHb value, of pixels sampled from the Barrett's boundaries of cases confirmed by diagnosis is obtained, together with the mean IHb of pixels sampled from boundaries that are not Barrett's boundaries, and the threshold thr3 for discriminating between them is determined from these values.
When the ratio of the average pixel values E/F > thr3, in the following step S66 the CPU 22 identifies the portion of the boundary within this local region as a Barrett's boundary region. On the other hand, when the determination result of step S65 does not satisfy the decision condition E/F > thr3, as shown in step S67, it is determined not to be a Barrett's boundary.
After the processing of steps S66 and S67, in step S68 the CPU 22 determines whether the parameter i of the local region (i) equals N; when i is not equal to N, i is incremented by 1 in step S69, processing returns to step S64, and the processing of steps S64 to S68 is repeated.
After the above processing has been applied to all N local regions, the processing ends.
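For one local region, the test of steps S64 to S67 reduces to a ratio of side averages; how the two side masks are built from the boundary-segment geometry is left out here and is an assumption.

```python
# A minimal sketch of steps S64-S67 for one local region straddling the boundary:
# compare the ratio of the two sides' average pixel values with thr3.
import numpy as np

def is_barrett_region(image: np.ndarray, side_a: np.ndarray, side_b: np.ndarray,
                      thr3: float) -> bool:
    """side_a, side_b: boolean masks of the two halves of local region (i)."""
    E = float(image[side_a].mean())   # average pixel value on one side of boundary
    F = float(image[side_b].mean())   # average pixel value on the other side
    return E / F > thr3               # step S65: Barrett's boundary region if true
```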
Like this, according to present embodiment, use the regional area of setting along the border to judge whether be the Barret border, can judge whether be the Barret border thus.
In addition, also can carry out edge determination, carry out final edge determination according to these a plurality of result of determination to all regional areas (N).
And, in step S62, when setting regional area, also can utilize symmetry is set on the orthogonal direction in line segment (fine rule) center short on respect to this border line segment or to wait at 2 and form regional area along the border.
And the regional area of setting in step S62 also can be the regional area of taking a sample arbitrarily.And, in this step S66, also can follow the tracks of the border that is identified as the Barret borderline region, synthetically judge it is the Barret border according to these results.
Below, the endoscopic system with the 1st variation is described.Figure 17 represents to have the structure of major part of the endoscopic system 1B of the 1st variation.This endoscopic system 1B is in the endoscopic system 1 of Fig. 1, and the base end side in for example insertion section 11 of the endoscope 6 that constitutes endoscopic visualization device 2 is provided for detecting the insertion amount detecting device 71 that insertion section 11 is inserted into endoceliac insertion amount.This insertion amount detecting device 71 sends to the information of the insertion amount that detected for example CPU 22 of image processing apparatus 3.
Inserting amount detecting device 71 has: a plurality of rollers 72, and the outer peripheral face of its contact insertion section 11 is also free to rotate; And rotation amount detecting sensor 73 such as rotary encoder, it is installed on the rotating shaft of these rollers 72, detects rotation amount.
And the signal that detects by this rotation amount detecting sensor 73 is transfused to for example CPU 22 of image processing apparatus 3, and CPU 22 is according to the information of insertion amount, and whether carry out captured image is the judgement that comprises the image at detected object position.That is, CPU 22 has the function of the detected object spectral discrimination 22a of portion, and whether according to the information of insertion amount, carrying out captured image is the judgement that comprises the image at detected object position.
During this situation, after the insertion section 11 of endoscope 6 was inserted by the oral area from the patient, the value of rotation amount detecting sensor 73 that can be when inserting detected the position of the leading section 14 when being inserted into esophagus deep side.
And, according to shooting characteristic of the camera head 17 that comprises CCD 16 etc., and judge that in conjunction with following condition this condition is: judge whether be to have taken the above image of resolution that above-mentioned image processing method can judge and whether be the image that comprises the detected object position.
Promptly, sometimes can obtain to comprise the image at detected object position from afar, but part is regarded as being not enough to setting decision condition under the situation of the resolution of analysis image or illumination light quantity not sufficient a long way off, makes the judgement of the image that do not comprise the detected object position.
After the image that CCD by endoscope 6 takes carries out signal processing by CCU 8, be transfused to the image input part 21 of image processing apparatus 3 all the time.This image is stored in hard disk 27 grades successively.
During this situation, CPU 22 is kept at the information of insertion amount in the image that is stored in successively in hard disk 27 grades as incidental information.Whether perhaps, CPU 22 also can be according to the information of insertion amount, be the determination information (identifying information) of judgement that comprises the image at detected object position with carrying out captured image, preserves as incidental information.
And CPU 22 carries out the processing identical with the situation of the foregoing description 3 according to determination information to comprising the image of judging the object position.In addition, be not limited to embodiment 3, also go for embodiment 1 or embodiment 2.
That is, in this variation, need the original image of graphical analysis among the embodiment 1~3, become the selection image that is judged to be detected object according to location determination information automatically.
Therefore, according to this variation, whether be the image that character that character has changed changes the judgement (detections) of mucosa by graphical analysis, be restricted to the image that objectively meets predetermined conditions, so can obtain objective more result of determination.
And, can use insertion amount detecting device 71 grades suitably to detect or select to set the image that becomes detected object, so can reduce the situation of the unwanted picture of having taken non-detected object position being carried out Flame Image Process.
And; even taken the image at the position that becomes detected object; under situation away from camera head 17; precision based on graphical analysis is insufficient; but according to present embodiment; leave in the characteristic of also having considered camera head 17 under the situation more than the predetermined distance,, can guarantee precision based on graphical analysis by setting the not decision condition at shot detection object position.In addition, in the above description, the situation of utilizing the image that endoscope 6 with elongated insertion section obtains has been described, but as described below, has also utilized the image that uses capsule type endoscope to obtain.
The capsule endoscope system 81 of the variation shown in Figure 18 comprises: a capsule endoscope device (hereinafter simply called capsule endoscope) 82, which is swallowed by the patient and captures images of the interior of the body cavity; an extracorporeal device 83, placed outside the patient's body, which receives and records the image data from the capsule endoscope 82; and an image processing apparatus 84 that receives images from the extracorporeal device 83.
The capsule endoscope 82 contains, in a capsule-shaped container: an LED 85, for example, as an illumination unit; an objective lens 86 that forms an image of the illuminated subject; a CCD 87 placed at the image-forming position, constituting the image pickup unit; a control circuit 88 that performs signal processing and the like on the image pickup signals from the CCD 87; a radio circuit 89 that wirelessly transmits the captured images; and a battery 90 that supplies power to each circuit.
The extracorporeal device 83 receives the radio waves from the antenna 89a of the radio circuit 89 of the capsule endoscope 82 through a plurality of antennas 91a, 91b, 91c and a radio circuit 92, and sends the signal to a control circuit 93. The control circuit 93 converts it into a video signal and outputs it to the image processing apparatus 84.
The image processing apparatus 84 then performs the processing of each embodiment described above.
In addition, the control circuit 93 has a position detecting function 93a that estimates the position of the capsule endoscope 82 using the plurality of antennas 91a to 91c. This position detecting function 93a may also be used to select the images to be processed. That is, instead of taking every image as the original image 31, the position detecting function 93a detects whether an image was captured near the boundary from the esophagus to the stomach, and only an image captured near the boundary is used as the original image, in the same way as described above.
In this way, images of the living mucosa near the boundary from the esophagus to the stomach, which is the detection target, can be judged efficiently.
In each of the embodiments above, the IHb value is used as the feature quantity for evaluating tone; however, other feature quantities such as R/G or R/(R+G+B) may also be used. Hue and saturation in the HSI color space may also be used, for example.
According to embodiments 1 to 3 above, the presence of a living mucosa with different properties is detected by image processing based on the boundary information of the living mucosa surface; a property-changed mucosa, such as the Barrett mucosa in which the esophageal mucosa has degenerated, can thereby be detected.
(Embodiment 4)
Described below with reference to Figures 19 to 28 is an image processing method that can more reliably detect the presence of the specific living mucosa to be detected, in particular the Barrett mucosa, without being affected by unsuitable parts of the image such as dark areas.
The hardware configuration of the endoscope system 1C shown in Figure 19 is the same as that of the endoscope system shown in Figure 1. However, the processing program E stored in the processing program storage section 23 of Figure 19 differs from the processing program stored in the processing program storage section of Figure 1, and the CPU 22 executes this processing program E as shown in Figure 20. The elements in Figure 19 are therefore given the same reference numerals as the elements shown in Figure 1, and their description is omitted.
Figure 20 is a block diagram of the image processing functions by which the CPU 22 performs image processing according to the processing program E stored in the processing program storage section 23; this image processing detects the presence of the Barrett mucosa as the target mucosal tissue.
The CPU 22 shown in Figure 19 has the functions of an unneeded region deleting section 133 and a tone processing section 135. The unneeded region deleting section 133 deletes the unneeded regions from the original image 131 to be processed, generating an unneeded-region-deleted image 132; the tone processing section 135 applies tone processing to the unneeded-region-deleted image 132, generating a tone-processed image 134.
The CPU 22 further has the functions of an area labeling section 137 and a mucosa feature quantity calculating section 139. The area labeling section 137 applies labeling processing to the tone-processed image 134 and generates an area-labeled image 136; the mucosa feature quantity calculating section 139 calculates mucosa feature quantities from the area-labeled image 136 and outputs mucosa information 138.
The video signal input to the image input section 21 as the endoscopic image undergoes inverse-γ correction, a known technique performed together with A/D conversion for the purpose of linearity correction, and becomes digital original image data; the CPU 22 performs the following processing on it according to the processing program.
First, the CPU 22 performs unneeded region deletion processing, which deletes the unneeded regions from the digital original image data. This processing operates on the pixel values (R, G, B) of the color original image data, replaces the values of pixels unneeded for the mucosal tissue classification judgment with (0, 0, 0), and generates the unneeded-region-deleted image 132. The unneeded regions are the dark areas in the endoscopic image and the high-brightness halation parts.
Figure 22A shows a representative example of the original image 131.
Figure 21 shows the unneeded region deletion processing. After the processing starts, as shown in step S71, the CPU 22 judges, for each pixel constituting the color original image data, whether its pixel value (R, G, B) is a dark-area pixel. The description below uses an example in which the unneeded region deletion and related processing are performed pixel by pixel; performing them in units of multiple pixels, however, does not depart from the gist of the present invention.
The CPU 22 uses thresholds (reference values) Rcut, Gcut, Bcut for judging dark-area pixels, and judges whether the pixel satisfies R < Rcut and G < Gcut and B < Bcut. The thresholds Rcut, Gcut, Bcut are set values of low brightness used to judge dark-area pixels, and are set in the processing program E.
When the pixel is judged in step S71 to be a dark-area pixel, the CPU 22 replaces its pixel value (R, G, B) with (0, 0, 0) as shown in step S72, and proceeds to the subsequent step S73.
When the pixel is not a dark-area pixel in step S71, the processing proceeds directly to step S73.
In step S73, the CPU 22 judges, for each pixel constituting the color original image data, whether its pixel value (R, G, B) is a halation pixel.
That is, the CPU 22 uses thresholds Rhal, Ghal, Bhal for judging halation pixels, and judges whether the pixel satisfies R > Rhal and G > Ghal and B > Bhal. The thresholds Rhal, Ghal, Bhal are set values of high brightness used to judge halation pixels, and are set in the processing program E.
A user such as the operating surgeon may also change the dark-area thresholds Rcut, Gcut, Bcut and the halation thresholds Rhal, Ghal, Bhal from the input operation section 29.
When the pixel is judged in step S73 to be a halation pixel, the CPU 22 replaces its pixel value (R, G, B) with (0, 0, 0) as shown in step S74, and the unneeded region deletion processing ends. When the pixel is not a halation pixel in step S73, the processing likewise ends.
When the unneeded region deletion processing of Figure 21 is completed, the unneeded-region-deleted image 132 is generated. Figure 22B shows a concrete example of the unneeded-region-deleted image 132 generated by applying this processing to the original image 131 of Figure 22A.
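A minimal sketch of this masking step is given below, assuming an 8-bit RGB image held as a NumPy array. The numeric thresholds are placeholders; the patent only states that Rcut/Gcut/Bcut and Rhal/Ghal/Bhal are set inside processing program E.

```python
import numpy as np

RCUT, GCUT, BCUT = 30, 30, 30      # hypothetical dark-pixel thresholds
RHAL, GHAL, BHAL = 230, 230, 230   # hypothetical halation thresholds

def delete_unneeded_regions(rgb):
    """Replace dark-area and halation pixels with (0, 0, 0).

    rgb: uint8 array of shape (H, W, 3) holding the original image.
    Returns the unneeded-region-deleted image of Figure 22B.
    """
    out = rgb.copy()
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    dark = (r < RCUT) & (g < GCUT) & (b < BCUT)       # step S71
    halation = (r > RHAL) & (g > GHAL) & (b > BHAL)   # step S73
    out[dark | halation] = 0                          # steps S72 and S74
    return out
```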
Next, the tone processing section 135 calculates, for each pixel of the unneeded-region-deleted image 132, a tone feature quantity based on the pixel value.
However, for the pixels set to (0, 0, 0) as unneeded regions in the processing of Figure 21, the tone processing section 135 does not calculate the tone feature quantity.
As the tone feature quantity, the hemoglobin index Hbindex (abbreviated IHb), which is used in known techniques and correlates well with blood flow, is adopted. IHb is obtained for each pixel of the endoscopic image by extracting the luminance signal Vg (the G signal, reflecting light near 560 nm, which is readily absorbed by hemoglobin) and the luminance signal Vr (the R signal, reflecting light near 650 nm, which is only weakly absorbed by hemoglobin), and applying a logarithmic conversion to their ratio, using the formula log2(R/G). In Figure 21, this processing is shown as step S75.
The calculated IHb tone feature quantity is stored as a pixel value at the same position as the corresponding pixel of the original image, for example in the image storage section 24 or the analysis information storage section 25. The tone-processed image 134 is thereby generated. Figure 22C shows an example of the tone-processed image 134 obtained by calculating the tone feature quantities on the unneeded-region-deleted image 132 of Figure 22B; Figure 22C shows the image, for example, in pseudo-color.
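The per-pixel IHb computation can be sketched as follows. The small epsilon guard against division by zero and the convention of skipping (0, 0, 0) pixels are assumptions; the patent specifies only the log2(R/G) form.

```python
import numpy as np

def ihb_image(deleted):
    """Compute the tone feature IHb = log2(R/G) per pixel (step S75),
    skipping pixels marked (0, 0, 0) as unneeded regions."""
    r = deleted[..., 0].astype(np.float64)
    g = deleted[..., 1].astype(np.float64)
    valid = np.any(deleted != 0, axis=-1)   # unneeded pixels stay excluded
    ihb = np.zeros(r.shape, dtype=np.float64)
    eps = 1e-6                              # guard against division by zero
    ihb[valid] = np.log2((r[valid] + eps) / (g[valid] + eps))
    return ihb, valid
```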
In this case, when the IHb value in the tone-processed image 134 is larger than a prescribed threshold θ, the probability that the imaged object includes the Barrett mucosa (Barrett epithelium) is high.
Next, for the tone-processed image 134, the area labeling section 137 generates the area-labeled image 136 according to the processing flow described later.
The area-labeled image is a labeled image that stores, at each pixel position, the epithelium class of the imaged object. In the present embodiment, 1 is stored when the imaged object at that pixel position belongs to the squamous epithelium of normal mucosa, 2 is stored when it belongs to the property-changed Barrett mucosa as a degenerated mucosa, and 0 is stored when it belongs to an unneeded region.
The area labeling flow is described below with reference to Figure 23.
After the area labeling flow starts, in the first step S81, the CPU 22 obtains from the tone-processed image 134 the pixel value D of the pixel to be labeled.
In the subsequent step S82, the CPU 22 judges whether the pixel value D is 0. When the pixel value D is not 0, the processing proceeds to the subsequent step S83; when the pixel value D is 0, it proceeds to step S86.
In step S83, the CPU 22 compares the prescribed threshold θ with the pixel value D according to the processing program E. When the result is D > θ, the processing proceeds to the subsequent step S84; when D ≤ θ, it proceeds to step S85. The threshold θ used for this classification judgment is set from IHb sample data sampled in advance from normal mucosa and from the Barrett mucosa as a degenerated mucosa.
In step S84, the CPU 22 sets the corresponding pixel value of the area-labeled image 136 to 2, and proceeds to step S87. In step S85, the CPU 22 sets the corresponding pixel value of the area-labeled image 136 to 1, and proceeds to step S87.
In step S86, the CPU 22 sets the corresponding pixel value of the area-labeled image 136 to 0, and proceeds to step S87.
In step S87, the CPU 22 judges whether any pixels remain to be labeled; if so, the processing returns to step S81; if not, the area labeling processing ends.
In this way, the labeling processing generates the area-labeled image 136, in which each pixel of the tone-processed image 134 is labeled 0, 1, or 2. Figure 22D shows an example of the area-labeled image 136.
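Using the IHb image and validity mask from the previous sketch, the labeling of steps S81 to S87 reduces to two threshold comparisons. The vectorized formulation is an assumption; the patent describes the equivalent pixel-by-pixel loop.

```python
import numpy as np

def label_regions(ihb, valid, theta):
    """Area-labeled image of Figure 22D: 0 = unneeded region,
    1 = squamous epithelium (D <= theta), 2 = Barrett mucosa (D > theta)."""
    labels = np.zeros(ihb.shape, dtype=np.uint8)
    labels[valid & (ihb > theta)] = 2    # step S84
    labels[valid & (ihb <= theta)] = 1   # step S85
    return labels
```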
Next, the mucosa feature quantity calculating section 139 calculates the mucosa feature quantities of the various epithelia from the area-labeled image 136.
In the present embodiment, the pixel count of each mucosa class of the various epithelia, the pixel counts of the mucosa classes along the boundary pixel sequence of the area-labeled image 136 obtained by a known boundary tracing algorithm, and the epithelium ratio (pixel count of Barrett mucosa / pixel count of squamous epithelium) are obtained and saved as the mucosa information of the epithelium.
The mucosa feature quantity calculating section 139 then compares the mucosa-class pixel count ratio of the epithelium with a set threshold τ, judges whether the Barrett mucosa is present, and saves the judgment result as the mucosa information of the epithelium. In this case, when the pixel count ratio of the mucosa classes is ≥ τ, the image is judged to contain the Barrett mucosa.
On the other hand, when the pixel count ratio of the mucosa classes is < τ, the image is judged not to contain the Barrett mucosa.
Figure 24 shows the Barrett mucosa judgment processing performed by the mucosa feature quantity calculating section 139.
As shown in step S91, the CPU 22 counts the pixel count N1 of pixel value 1, labeled as squamous epithelium, and the pixel count N2 of pixel value 2, labeled as Barrett mucosa. In the subsequent step S92, the CPU 22 uses the threshold τ for judging the presence of the Barrett mucosa, and judges whether the mucosa-class pixel count ratio N2/N1 satisfies N2/N1 ≥ τ.
When the condition N2/N1 ≥ τ is satisfied in step S92, the CPU 22 judges, as shown in step S93, that the Barrett mucosa is present in the original image, and saves this result on the hard disk 27 or the like.
On the other hand, when N2/N1 < τ in step S92, the CPU 22 judges, as shown in step S94, that the Barrett mucosa is not present in the original image, and saves this result on the hard disk 27 or the like.
After step S93 or step S94, the processing proceeds to step S95, in which the CPU 22 sends the judgment result through the display processing section 28 to the display monitor 4, where it is shown on the display surface. The processing then ends.
In the above description, the mucosa-class pixel count ratio N2/N1 of the epithelium is used, but N2/(N1+N2) may be adopted instead. In this case, the decision condition N2/N1 ≥ τ becomes N2/(N1+N2) ≥ τ/(τ+1). That is, the same judgment can be made using the total pixel count (N1+N2) of the processing target region.
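The whole-image decision of steps S91 to S95 then amounts to two counts and one comparison. The handling of the degenerate case N1 = 0 (no squamous pixels at all) is an assumption not addressed by the patent.

```python
import numpy as np

def detect_barrett(labels, tau):
    """Judge the presence of the Barrett mucosa from the label image,
    using the pixel count ratio N2/N1 >= tau; the equivalent form
    N2/(N1+N2) >= tau/(tau+1) gives the same result."""
    n1 = int(np.count_nonzero(labels == 1))   # squamous epithelium pixels
    n2 = int(np.count_nonzero(labels == 2))   # Barrett mucosa pixels
    if n1 == 0:
        return n2 > 0   # assumed: an all-Barrett frame counts as positive
    return n2 / n1 >= tau
```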
Thus, according to the present embodiment, whether the Barrett mucosa, the specific living mucosa to be detected, is present in an image is judged by image analysis, so the Barrett mucosa can easily be found at an early stage, without relying on subjective judgment. Furthermore, by displaying the judgment result, the operating surgeon can be informed that the probability of the Barrett mucosa being present in the image is high, and can thus diagnose more efficiently whether the part is a degenerated portion.
In addition, for the purpose of removing noise in the area-labeled image 136, dilation and erosion processing, which are known techniques, may be applied immediately after the processing of the area labeling section 137. Also, IHb is used in the present embodiment as the tone feature quantity, but other feature quantities such as R/G or G/(R+G+B) may be used instead; hue and saturation in the HSI color space may also be adopted, for example.
Figure 25 shows the structure of the main part of the endoscope system 1D of the 1st variation. In this endoscope system 1D, an insertion amount detecting device 41, which detects the amount by which the insertion section 11 is inserted into the body cavity, is added to the endoscope system 1C of Figure 19, for example at the proximal end side of the insertion section 11 of the endoscope 6 constituting the endoscopic observation device 2. This insertion amount detecting device 41 is the same as that explained with reference to Figure 17, so most of its explanation is omitted.
The CPU 22 has the function of the detection target image judging section 22a, which judges from the insertion amount information whether a captured image includes the detection target region.
The images captured by the CCD of the endoscope 6 are signal-processed by the CCU 8 and then stored successively on the hard disk 27 or the like.
The CPU 22 saves the insertion amount information as incidental information attached to the images stored successively on the hard disk 27 or the like. Alternatively, the CPU 22 may judge from the insertion amount information whether a captured image includes the detection target region, and save that judgment information (identification information) as the incidental information.
The CPU 22 then applies, according to the judgment information, the unneeded region deletion processing and the subsequent steps to the images judged to include the target region, in the same way as in embodiment 4 above. That is, in this variation, the original images requiring image analysis in embodiment 4 are automatically narrowed down to the images judged, from the judgment information, to contain the detection target.
Therefore, according to this variation, the images requiring image analysis are restricted to images that objectively satisfy the predetermined conditions, so a more objective judgment result can be obtained.
Moreover, the insertion amount detecting device 41 and the like can be used to appropriately detect or select the images to be processed, which also reduces the cases where image processing is wastefully applied to images of regions other than the detection target.
Furthermore, even when an image of the target region has been captured, the accuracy of image analysis is insufficient if the region is far from the image pickup device 17. According to this variation, the characteristics of the image pickup device 17 are taken into account, and by setting a condition that no judgment is made when the region is more than a predetermined distance away, the accuracy of the image analysis can be guaranteed. The 2nd variation is described next. Part of it differs from the processing program E of embodiment 4; Figure 26 shows the processing functions of the CPU 22B, which operates according to the processing program F of this variation.
In the processing functions shown in Figure 26, a rectangular subregion dividing section 152, which generates a rectangular subregion group 151 from the area-labeled image 136, is added to the processing functions of Figure 20. The rectangular subregion dividing section 152 divides the area-labeled image 136 to generate the rectangular subregion group 151.
Figure 27 schematically shows an image of the rectangular subregions 153 generated by the division.
As shown in Figure 26, the rectangular subregion dividing section 152 is configured so that the rectangular subregion group 151 generated from the area-labeled image 136 is used in the mucosa feature quantity calculation.
In this variation, a mucosa feature quantity calculating section 139B calculates the mucosa information 138 for each rectangular subregion 153, compares the mucosa-class pixel count ratio of the epithelium with the preset threshold τ, and determines the attribute of each rectangular subregion 153.
For example, when the pixel count ratio of the mucosa classes is ≥ τ, the attribute of the rectangular subregion is set to Barrett mucosa; when the ratio is < τ, the attribute is set to squamous epithelium.
The mucosa feature quantity calculating section 139B then counts the rectangular subregions 153 having each determined attribute, and compares the attribute ratio (count of Barrett-mucosa subregions / count of squamous-epithelium subregions) with a set threshold ε.
For example, when the attribute ratio is ≥ ε, the image is judged to contain the Barrett mucosa; when the attribute ratio is < ε, the image is judged not to contain it.
Figure 28 shows the processing flow of the mucosa feature quantity calculation performed by the mucosa feature quantity calculating section 139B.
After the mucosa feature quantity calculation starts, as shown in step S101, the CPU 22 calculates, for each labeled rectangular subregion, the pixel count n1 of pixel value 1 and the pixel count n2 of pixel value 2.
In the subsequent step S102, the CPU 22 judges whether the pixel count ratio n2/n1 is greater than or equal to the set threshold τ, that is, whether n2/n1 ≥ τ. When this condition is satisfied, as shown in step S103, the CPU 22 sets the attribute of that rectangular subregion to Barrett mucosa.
On the other hand, when the condition n2/n1 ≥ τ is not satisfied, as shown in step S104, the CPU 22 sets the attribute of that rectangular subregion to squamous epithelium.
After the processing of step S103 or S104, in step S105 the CPU 22 judges whether any unprocessed rectangular subregions remain; if so, the processing returns to step S101, and steps S101 to S105 are repeated.
When steps S101 to S105 have been performed on all rectangular subregions, as shown in step S106, the CPU 22 counts the number m2 of rectangular subregions with the Barrett mucosa attribute and the number m1 of rectangular subregions with the squamous epithelium attribute.
In the subsequent step S107, the CPU 22 judges whether the subregion count ratio m2/m1 is greater than or equal to the set threshold ε, that is, whether m2/m1 ≥ ε. When this condition is satisfied, as shown in step S108, the CPU 22 judges that the Barrett mucosa is present in the image, saves this judgment result on the hard disk 27, and displays it on the display monitor 4.
On the other hand, when the condition m2/m1 ≥ ε is not satisfied, as shown in step S109, the CPU 22 judges that the Barrett mucosa is not present in the image, saves this judgment result on the hard disk 27, and displays it on the display monitor 4.
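A compact sketch of this two-stage decision, under the same conventions as the earlier sketches, is shown below. The block size and the tie-breaking when a subregion or the whole image contains no squamous pixels are assumptions.

```python
import numpy as np

def detect_barrett_by_subregions(labels, block, tau, epsilon):
    """Steps S101 to S109: assign each block x block subregion an
    attribute by n2/n1 >= tau, then judge the image by m2/m1 >= epsilon
    (m2 = Barrett-attribute count, m1 = squamous-attribute count)."""
    h, w = labels.shape
    m1 = m2 = 0
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            sub = labels[y:y + block, x:x + block]
            n1 = int(np.count_nonzero(sub == 1))
            n2 = int(np.count_nonzero(sub == 2))
            if n1 == 0 and n2 == 0:
                continue                # subregion is entirely unneeded pixels
            if n1 == 0 or n2 / n1 >= tau:
                m2 += 1                 # Barrett mucosa attribute (step S103)
            else:
                m1 += 1                 # squamous epithelium attribute (step S104)
    if m1 == 0:
        return m2 > 0                   # assumed degenerate-case handling
    return m2 / m1 >= epsilon           # steps S107 to S109
```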
According to this variation, the image is divided into subregions of a certain size; within each subregion, each pixel is judged as to whether it has the probability of being the Barrett mucosa, the detection target mucosa, and from these judgment results it is decided whether the subregion is Barrett mucosa. The presence of the detection target mucosa can therefore be judged more accurately.
That is, in embodiment 4, each pixel of the entire image is labeled as to whether it has the probability of being the detection target mucosa, and whether the image contains the Barrett mucosa is judged from the per-class pixel count ratio based on those labels; a judgment result for any given part is therefore reflected in the final result in the same way, regardless of whether that part is near or far.
In contrast, in this variation the judgment is made per subregion, so the judgment for each part can be made more reasonably, and the reliability of the final judgment result can be improved.
(Embodiment 5)
Embodiment 5 is described below with reference to Figures 29 to 31. Figure 29 shows the processing content of the functions executed by the CPU 22 according to the processing program G of embodiment 5. The hardware configuration of the present embodiment is the same as the image processing apparatus 3 shown in Figure 19, with a processing program G that differs in part from the processing program E stored in the processing program storage section 23.
In the present embodiment, the original images are, for example, a moving image, and image processing is applied successively to each frame of the moving image.
Elements explained in embodiment 4 are denoted by the same reference numerals, and their explanation is omitted.
As shown in Figure 29, the present embodiment uses a mucosa feature quantity processing section 160, which performs the mucosa feature quantity calculation, in place of the mucosa feature quantity calculating section 139 of the processing program E of Figure 20 that calculates the mucosa feature quantities from the area-labeled image 136. The mucosa feature quantity processing section 160 calculates the various mucosa feature quantities from the area-labeled image 136 and saves them successively in a mucosa information buffer 161.
The mucosa information buffer 161 is a ring buffer, as shown in Figure 30. Because the save position on the circumference advances each time an entry is saved, once more than the specified number of mucosa information entries have been generated, a new entry is saved over the oldest stored one. In the example of Figure 30, mucosa information is stored in time series, in the order of mucosa information 162a, 162b, 162c, 162d; when the next mucosa information is input, the oldest entry 162a is overwritten by the new one. Figure 30 shows the case of four entries 162a to 162d, but a ring buffer with a larger storage capacity may also be used.
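A minimal sketch of this buffer, using a fixed-length deque as the ring (the class name and default size are illustrative, not from the patent):

```python
from collections import deque

class MucosaInfoBuffer:
    """Ring buffer of Figure 30: holds the latest `size` per-frame mucosa
    determinations; appending past capacity overwrites the oldest entry."""

    def __init__(self, size=4):
        self.entries = deque(maxlen=size)  # maxlen gives the ring behaviour

    def push(self, is_barrett):
        """Save one frame's mucosa information (True = Barrett mucosa)."""
        self.entries.append(bool(is_barrett))
```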
The mucosa feature quantity processing section 160 is configured to judge, from the per-class pixel count ratio of the various epithelium classes in the area-labeled image 136, whether the Barrett mucosa is present in that image, and to save the judgment result in the mucosa information buffer 161 as mucosa information 162.
The mucosa class of the mucosa information 162 in the mucosa information buffer 161 is determined by a mucosa class determining section 163.
Figure 31 shows the processing flow of the mucosa class determining section 163.
In the first step S141, the CPU 22 obtains all the mucosa information 162 saved in the mucosa information buffer 161.
In the subsequent step S142, the CPU 22 counts, over the entries of mucosa information 162, the number whose mucosa class judgment result is Barrett mucosa, and sets this count to C'.
In the subsequent step S143, the CPU 22 compares the count C' with the set threshold θ' defined by the processing program G, that is, judges whether C' > θ'.
When C' > θ', as shown in step S144, the CPU 22 judges that the Barrett mucosa is present in the original images.
On the other hand, when C' ≤ θ', as shown in step S145, the CPU 22 judges that the Barrett mucosa is not present in the original images.
In step S146, following step S144 or S145, the CPU 22 makes the display monitor 4 show the judgment result, and the processing ends.
In addition to the judgment result, the count C' is also shown on the display monitor 4.
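The decision over the buffered frames can be sketched as a single count and comparison; the function signature is hypothetical and accepts any sequence of per-frame judgments, such as the entries of the deque above.

```python
def determine_mucosa_class(mucosa_info, theta_prime):
    """Steps S141 to S146: count the entries judged Barrett mucosa (C')
    and compare with the threshold theta' set by processing program G."""
    c_prime = sum(1 for v in mucosa_info if v)   # count C' (step S142)
    present = c_prime > theta_prime              # comparison of step S143
    return present, c_prime                      # C' is also displayed
```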
According to the present embodiment, the mucosa class is determined from the mucosa class results of a plurality of images, so erroneous mucosa class judgments caused by images with poor observation conditions can be prevented.
Furthermore, according to the present embodiment, the moving image captured by the image pickup unit provided at the distal end section 14 while the insertion section 11 of the endoscope 6 is being inserted can be used as the original images, and the presence of the Barrett mucosa is judged successively, so the operating surgeon can diagnose the vicinity of the detection target region efficiently (or quickly) from the judgment results.
Moreover, according to the present embodiment, when the insertion section 11 is inserted distal end first toward the deep side of the esophagus and the Barrett mucosa exists near the boundary with the stomach, the probability of displaying a judgment result indicating the presence of the Barrett mucosa increases, so the Barrett mucosa can be diagnosed efficiently from this display.
And if the count C' is also shown on the display monitor 4 as described above, then when the Barrett mucosa is present, the value of the count C' increases as the position where the Barrett mucosa exists is approached, so the Barrett mucosa can be diagnosed efficiently from this display as well.
Alternatively, rather than inputting every frame of the moving image successively, the original images may be input at a fixed frame interval.
The mucosa feature quantity processing section 160 may also be configured not to save the mucosa information 162 when the number of pixels with pixel value 0 contained in the area-labeled image 136 is greater than or equal to a set threshold, or, more generally, not to save information unsuitable for determining the mucosa class.
(Embodiment 6)
Embodiment 6 is described below with reference to Figures 32A to 37. The purpose of the present embodiment is to provide an image processing method that can accurately detect (judge) whether the detection target mucosa is present. To this end, the method performs the following processing: it detects, as dark areas, the subregions of the image that are unsuitable for detection and excludes them from the detection target; then, referring to the positional information of the detected dark areas within the image, it further excludes the subregions that are adjacent to the dark areas and judged to require exclusion, and generates the image of the processing target region.
Figure 32A shows the processing content of the functions performed by the CPU 22 according to the processing program H of embodiment 6. The hardware configuration of the present embodiment is the same as the image processing apparatus 3 shown in Figure 19, with processing programs H and I that differ in part from the processing program E stored in the processing program storage section 23.
The mucosa class judgment processing of the present embodiment consists of executing in succession the processing program H shown in Figure 32A and the processing program I shown in Figure 32B. The processing program I uses the processing target region information 174 produced as the result of the processing program H.
The processing blocks of the processing program H that characterize the present embodiment are explained below.
The original image 131 is input to a rectangular subregion dividing section 171, which divides the endoscopic image into a rectangular subregion image group 172 as shown in Figure 33. The subregions are not limited to rectangles.
The rectangular subregion image group 172 is input to a processing target region determining section 173, which executes the processing flow shown in Figure 34 and generates the processing target region information 174.
In step S151, the processing target region determining section 173 obtains a rectangular subregion 175 from the rectangular subregion image group 172 and counts the dark-area pixels in the rectangular subregion 175, in order to detect the subregions to be excluded from the processing target.
In the present embodiment, a dark-area pixel is a pixel whose value satisfies R < Rcut and G < Gcut and B < Bcut. This count is denoted Cn (where n is the serial number of the rectangular subregion). Here, Rcut and the other values are the thresholds explained in embodiment 4; thresholds different from those of embodiment 4 may also be set.
In step S152, the processing target region determining section 173 judges whether any rectangular subregions 175 remain to be processed (for which step S151 has not been executed); if so, the processing returns to step S151 and the counting continues. If no rectangular subregions 175 remain, the processing proceeds to step S153.
In step S153, the processing target region determining section 173 performs, based on the dark-area pixel counts Cn of the rectangular subregions 175, the judgment processing that determines the dark region to be excluded from the processing target. In the present embodiment, the rectangular subregion 175 with the largest calculated Cn is determined to be the dark region.
Then, in step S154, the processing target region determining section 173 determines, according to the position of this dark region within the original image 131, the processing target region from which the dark region has been excluded.
The determination method is explained using Figure 35.
The cross-hatched part 176a in Figure 35 represents the dark region determined in step S153; the nine patterns enumerate all the patterns that can occur with a 3 × 3 division. The hatched parts 176b represent the subregions excluded from the processing target.
In step S154, the processing target region determining section 173 judges, from the position of the dark region in the original image 131, which of the patterns in Figure 35 the dark region corresponds to. According to the corresponding pattern, it determines as the processing target only the subregions that remain after excluding the dark region and the excluded subregions.
For example, as shown in Figure 36, when a dark area is detected in the upper-right corner, the two subregions adjacent on both sides of the cross-hatched part 176a are excluded, as in the topmost pattern of the rightmost column of Figure 35.
The processing target region information 174 is then generated, taking as the processing target the unhatched subregions 176c in Figure 35, outside the cross-hatched part 176a and the hatched parts 176b. That is, the processing target region information 174 is generated (determined) from the information of the detected cross-hatched part 176a.
For this purpose, excluded-block information, which determines the blocks to be excluded from the processing target according to the block position that the detected dark-region block (subregion) occupies in the original image 131, is registered (stored) in advance in storage elements such as the processing program storage section 23 and the analysis information storage section 25. Specifically, the plural patterns shown in Figure 35 are stored as the excluded-block information. The processing target region can then be determined from the position of the dark region by means of this excluded-block information.
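A sketch of steps S151 to S154 is given below. The patent stores the nine Figure 35 patterns as excluded-block information; here, as an assumption that reproduces the Figure 36 example, the excluded blocks are derived as the edge-adjacent neighbours of the dark block. The thresholds are the hypothetical dark-pixel values used earlier.

```python
import numpy as np

def process_target_blocks(rgb, grid=3, rcut=30, gcut=30, bcut=30):
    """Return a grid x grid boolean map of processing target blocks."""
    h, w, _ = rgb.shape
    bh, bw = h // grid, w // grid
    counts = np.zeros((grid, grid), dtype=int)
    for i in range(grid):                       # step S151: count Cn per block
        for j in range(grid):
            blk = rgb[i*bh:(i+1)*bh, j*bw:(j+1)*bw]
            dark = ((blk[..., 0] < rcut) & (blk[..., 1] < gcut)
                    & (blk[..., 2] < bcut))
            counts[i, j] = int(dark.sum())
    di, dj = np.unravel_index(np.argmax(counts), counts.shape)  # step S153
    keep = np.ones((grid, grid), dtype=bool)
    keep[di, dj] = False                        # exclude the dark block itself
    for ni, nj in ((di-1, dj), (di+1, dj), (di, dj-1), (di, dj+1)):
        if 0 <= ni < grid and 0 <= nj < grid:
            keep[ni, nj] = False                # assumed adjacency rule (S154)
    return keep
```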
The unneeded region deleting section 133B in the processing program I of Figure 32B uses the above processing target region information 174 to generate the unneeded-region-deleted image 132, in which the pixels outside the processing target are replaced with (R, G, B) = (0, 0, 0).
The subsequent processing is the same as, for example, the generation of the mucosa information in Figure 20.
The processing target region determining section 173 may also be configured to count, for each subregion 175 of the processing target region, the number of pixels with R = 255 and, when this count is greater than or equal to a prescribed number, to regard the subregion as being too close to the imaged object and exclude it from the processing target region.
The size (pixel count) of the subregions 175 may also be selected and set. Furthermore, a plurality of dark areas may be detected according to their size, and the parts around them excluded from the processing target region according to their arrangement, distribution, and so on. In this way, the processing target region can be set more appropriately.
According to the present embodiment, in addition to the subregions containing dark areas, the subregions adjacent to the dark areas are also judged as to whether they are unsuitable for processing, and the unsuitable ones are excluded from the processing target. The remaining part can thus be regarded as a region suitable for detection, so the detection accuracy (judgment accuracy) for the specific living mucosa can be further improved.
It should be added that, for example, at positions in the body cavity where the dark region and the regions close to it are distributed concentrically or in other shapes, simply setting the subregions 175 and detecting the dark areas cannot in fact reliably exclude the dark areas and the regions close to them from the detection target.
In such cases, by detecting the dark areas as described above and using the region positions, arrangement, distribution, and so on, the regions unsuitable for detection can be appropriately and efficiently excluded. It would also be possible to exclude dark-area pixels and the like pixel by pixel, but that takes time; in the present embodiment the processing is performed in blocks, so the unsuitable parts can be excluded efficiently.
In the above description, the case of using images obtained with the endoscope 6 having an elongated insertion section has been described, but as described below, images obtained using a capsule endoscope may also be used.
The hardware configuration of the capsule endoscope system 81B of the variation shown in Figure 37 is the same as that of Figure 18, so the explanation of its structure is omitted.
The image processing apparatus 84 performs the image processing of one of embodiments 4 to 6 above.
According to embodiments 4 to 6, pixels unsuitable for detection, such as those of dark areas, are excluded, so the presence of the specific living mucosa to be detected can be detected more accurately.
Embodiments obtained by partially combining the above embodiments, by changing the order of the processing steps, and so on, also belong to the present invention.
Industrial Applicability
For an image of the mucosal surface near the boundary from the esophagus to the stomach, the boundary and the like are extracted, the shape of the boundary itself is computed together with feature quantities of the two sides demarcated by the boundary and so on, and it is detected whether the boundary is that of the Barrett mucosa, in which the mucosa has degenerated into gastric-type mucosa; diagnosis can thereby be assisted.

Claims (17)

1. A medical image processing method that performs image processing on a medical image in which a living mucosa has been imaged, characterized in that the medical image processing method comprises:
a boundary information detecting step of detecting, from the medical image, boundary information corresponding to a boundary portion of the living mucosa; and
a mucosa property detecting step of detecting, based on the boundary information detected in the boundary information detecting step, whether a living mucosa with different properties is present.
2. The medical image processing method according to claim 1, characterized in that it further comprises: an input step of inputting one or more medical images in which the living mucosa has been imaged to an image processing apparatus used to detect the boundary information.
3. The medical image processing method according to claim 1, characterized in that it further comprises: a shape information detecting step of detecting shape information of the boundary portion based on the boundary information detected in the boundary information detecting step.
4. The medical image processing method according to claim 1, characterized in that it further comprises: an isolated area detecting step of detecting, based on the boundary information detected in the boundary information detecting step, an isolated area that is present in a part of the region and is isolated from another region.
5. The medical image processing method according to claim 1, characterized in that it further comprises:
an area setting step of setting, based on the boundary information detected in the boundary information detecting step, a local area including the boundary; and
a feature quantity calculating step of calculating feature quantities from the two areas on either side of the boundary within the local area.
6. The medical image processing method according to claim 1, characterized in that the boundary information detecting step includes a contour detecting step for detecting the contour of the living mucosa.
7. The medical image processing method according to claim 3, characterized in that the shape information detecting step detects, within the boundary information, information based on the complexity of the boundary shape.
8. The medical image processing method according to claim 1, characterized in that the boundary information detecting step includes a binarization step for performing binarization and a thinning step for generating a thinned mucosa boundary image.
9. The medical image processing method according to claim 3, characterized in that the shape information detecting step detects whether a living mucosa with different properties is present according to whether one or more bending points are detected from the boundary shape of the boundary information.
10. The medical image processing method according to claim 6, characterized in that the contour detecting step includes a filtering step that applies filtering for extracting the boundary portion of the living mucosa.
11. The medical image processing method according to claim 1, characterized in that the boundary information detecting step detects information related to the boundary between the esophageal mucosa and the gastric mucosa as the boundary information.
12. The medical image processing method according to claim 1, characterized in that the mucosa property detecting step detects the Barrett mucosa present near the boundary of the esophagus and the gastric mucosa as the living mucosa with different properties.
13. The medical image processing method according to claim 4, characterized in that the isolated area detecting step includes a judging step of judging whether the mucosa of the detected isolated area is a mucosa of the same type as the mucosa of the other region.
14. The medical image processing method according to claim 13, characterized in that the judging step includes a feature quantity calculating step of calculating, for the detected isolated area, a feature quantity related to the brightness or tone of the mucosa of the other region.
15. The medical image processing method according to claim 14, characterized in that the judging step includes a comparing step of comparing the calculated feature quantity with a threshold.
16. The medical image processing method according to claim 1, characterized in that the medical image is an endoscopic image.
17. The medical image processing method according to claim 1, characterized in that the medical image is a capsule endoscope image.
CNB2005800397547A 2004-12-10 2005-12-08 Medical image processing method Expired - Fee Related CN100515318C (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2004359054A JP2006166939A (en) 2004-12-10 2004-12-10 Image processing method
JP359054/2004 2004-12-10
JP360319/2004 2004-12-13

Publications (2)

Publication Number Publication Date
CN101060806A CN101060806A (en) 2007-10-24
CN100515318C true CN100515318C (en) 2009-07-22

Family

ID=36668287

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2005800397547A Expired - Fee Related CN100515318C (en) 2004-12-10 2005-12-08 Medical image processing method

Country Status (2)

Country Link
JP (1) JP2006166939A (en)
CN (1) CN100515318C (en)


Also Published As

Publication number Publication date
JP2006166939A (en) 2006-06-29
CN101060806A (en) 2007-10-24


Legal Events

C06 / PB01: Publication
C10 / SE01: Entry into substantive examination / Entry into force of request for substantive examination
C14 / GR01: Grant of patent or utility model / Patent grant
CF01: Termination of patent right due to non-payment of annual fee (granted publication date: 20090722)