CN101115435A - Medical image processing device, lumen image processing device, lumen image processing method, and programs for them - Google Patents


Info

Publication number
CN101115435A
CN101115435A (publication) · CNA2006800044681A / CN200680004468A (application)
Authority
CN
China
Prior art keywords
image
processing apparatus
living body
test section
body feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CNA2006800044681A
Other languages
Chinese (zh)
Other versions
CN100563550C (en)
Inventor
西村博一
长谷川润
田中秀树
井上凉子
野波徹绪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corp
Publication of CN101115435A
Application granted
Publication of CN100563550C
Legal status: Active
Anticipated expiration

Abstract

A medical image processing device comprises an image extracting unit that extracts frame images from moving image data of the interior of the body captured by an in-vivo imaging device, or from continuously captured still image data, and an image analyzing unit that analyzes the extracted frame images and outputs the image analysis results. The image analyzing unit has a first biofeature detecting section that detects a first biofeature; a second biofeature detecting section that, according to the detection results of the first biofeature detecting section, detects a second biofeature from frame images temporally before or after the image used for detection by the first biofeature detecting section; and a state judging section that judges the state of the organism from the detection results of the second biofeature detecting section and outputs the judgment result.

Description

Medical image processing apparatus, lumen image processing apparatus, lumen image processing method, and programs for the apparatuses and methods
Technical field
The present invention relates to a medical image processing apparatus that efficiently judges properties of interest from a large amount of image data of a living body, to a lumen image processing apparatus and lumen image processing method that detect the cardia based on intraluminal images, and to programs for these apparatuses and methods.
Background technology
Conventionally, in endoscopy using an endoscope, image data of the interior of a living body captured by an endoscope apparatus or an endoscopic observation apparatus is played back in real time on a display device such as a CRT and is also stored externally as moving image data, and a physician examines it as a moving image during or after the examination, or examines frame images in the moving image as still images.
In recent years, swallowable capsule endoscopes have also come into use.
For example, as described in Japanese Unexamined Patent Application Publication No. 2004-645, image data captured in vivo using a capsule endoscope is successively stored externally as moving image data by wireless communication, and a physician examines it as a moving image or examines frame images in the moving image as still images.
Further, as described in Japanese Unexamined Patent Application Publication No. 2004-188026, there is a device that applies image analysis processing to still images and displays the analysis results on the endoscopic image or in another display area.
By using such image analysis results, a physician can make a diagnosis under objective criteria such as image analysis values for IHb, blood vessel analysis, and the like, without relying on subjective judgment.
However, when a physician observes a moving image after endoscopy, or observes a moving image obtained with a capsule endoscope, the number of frame images contained in the moving image is enormous. Finding the imaging positions of suspected lesion sites in the moving image, extracting the frame images at those positions one by one to apply image analysis processing, and making a diagnosis based on each image analysis result is therefore extremely labor-intensive.
To address this problem, the image analysis apparatus for still images described above could be used to build a system that applies the same image analysis processing to all frame images contained in the moving image at once and stores the results.
However, since the same image analysis processing is applied to all frame images at once, the processing time increases and one must wait a long time to obtain results. When the image analysis processing is applied while changing parameters, a long time is again needed before appropriate results are obtained. There is the further drawback that the amount of data that must be stored before an appropriate analysis result is obtained also increases.
In screening examinations of the esophagus using an endoscope apparatus, the presence of Barrett's mucosa, Barrett's esophagus, and the like is checked. Barrett's mucosa arises at the esophagogastric junction (EG junction), the connecting portion between the esophagus and the stomach, where the squamous epithelium forming the esophagus is replaced by gastric mucosa under the influence of reflux esophagitis and the like; it is therefore also called columnar epithelium. When this Barrett's mucosa extends 3 cm or more beyond the normal mucosal border and occurs over the full circumference of the cross section of the esophageal lumen, the condition is diagnosed as Barrett's esophagus.
Cases of Barrett's esophagus are increasing, especially among Westerners, and since it leads to adenocarcinoma with high probability it has become a serious problem; early detection of Barrett's mucosa is therefore extremely important.
It is therefore desirable to have a medical image processing apparatus that can objectively judge living body feature quantities for Barrett's esophagus, Barrett's mucosa, and the like, and provide the judgment results to the operating physician.
As mentioned above, in the medical field, medical equipment with an imaging function is widely used for observing and diagnosing organs within body cavities.
For example, in diagnosing esophageal disease, such as diagnosing Barrett's esophagus near the EG junction (esophagogastric junction) above the cardia, which forms the boundary between the stomach and the esophagus, there are cases in which Barrett's esophagus leads to adenocarcinoma as mentioned above, so endoscopy is extremely important in the diagnosis of esophageal disease. The physician inserts an endoscope through the patient's mouth and diagnoses esophageal disease from the endoscopic image displayed on an observation monitor screen.
As mentioned above, capsule endoscopes have also been developed in recent years, and a physician can diagnose esophageal disease while observing the images obtained by a capsule endoscope. A system that performs lesion detection based on living body images obtained by a capsule endoscope has been proposed (see, for example, WO 02/073507 A2).
However, the system of this proposal makes no mention of a technique for detecting the cardia or the boundary portion near the cardia from images taken from the esophagus toward the stomach.
If the cardia or the boundary portion of the cardia could be detected, the physician could observe in detail the living tissue images of the detected cardia or its boundary portion, and could thus rapidly diagnose lesions such as Barrett's esophagus.
Summary of the invention
The present invention has been made in view of the problems described above, and an object thereof is to provide a medical image processing apparatus capable of efficiently judging properties of interest from a large amount of image data.
A further object of the present invention is to provide a lumen image processing apparatus capable of detecting the cardia based on intraluminal images.
A medical image processing apparatus according to a first aspect of the present invention comprises: an image extraction unit that extracts frame images from moving image data of the interior of a body captured by an in-vivo imaging device, or from a plurality of continuously captured still image data; and an image analysis unit that performs image analysis on the frame images extracted by the image extraction unit and outputs image analysis results. The image analysis unit comprises: a first living body feature detecting section that detects a first living body feature; a second living body feature detecting section that, based on the detection result of the first living body feature detecting section, detects a second living body feature from frame images captured temporally before or after the image used for detection by the first living body feature detecting section; and a state judging section that judges the properties of the living body from the detection result of the second living body feature detecting section and outputs the judgment result.
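The two-stage analysis of the first aspect can be illustrated with a small sketch: a first feature is detected per frame, and only where it is found is a second feature sought in temporally neighboring frames. All function names, thresholds, and the dictionary frame representation below are hypothetical illustrations, not the patented detectors (which involve, e.g., palisade vessel extraction).

```python
def detect_first_feature(frame):
    # Placeholder first detector: "feature present" here simply means the
    # frame's mean red value exceeds a threshold (e.g. reddish mucosa).
    return frame["mean_red"] > 0.6

def detect_second_feature(frame):
    # Placeholder second detector applied to temporally adjacent frames.
    return frame["vessel_score"] > 0.5

def judge_state(frames, window=1):
    """Return indices of frames judged positive by the two-stage scheme."""
    positives = []
    for i, frame in enumerate(frames):
        if not detect_first_feature(frame):
            continue
        # The second feature is sought in frames captured temporally
        # before or after the frame used by the first detector.
        neighbors = frames[max(0, i - window):i] + frames[i + 1:i + 1 + window]
        if any(detect_second_feature(f) for f in neighbors):
            positives.append(i)
    return positives

frames = [
    {"mean_red": 0.4, "vessel_score": 0.2},
    {"mean_red": 0.7, "vessel_score": 0.1},  # first feature found here...
    {"mean_red": 0.5, "vessel_score": 0.8},  # ...second feature in a neighbor
]
print(judge_state(frames))  # -> [1]
```

The point of the split is efficiency: the cheaper first detector gates which frames ever receive the second, more expensive analysis.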
A medical image processing method according to a second aspect of the present invention comprises the steps of: extracting frame images from moving image data of the interior of a body captured by an in-vivo imaging device, or from a plurality of continuously captured still image data; performing image analysis on an extracted frame image to detect a first living body feature; based on the detection result of the first living body feature, detecting a second living body feature from frame images captured temporally before or after the image used for detection of the first living body feature; and judging the properties of the living body based on the detection result of the second living body feature and outputting the judgment result.
A program according to a third aspect of the present invention causes a computer to realize the following functions: a function of extracting frame images from moving image data of the interior of a body captured by an in-vivo imaging device, or from a plurality of continuously captured still image data; a function of performing image analysis on an extracted frame image to detect a first living body feature; a function of, based on the detection result of the first living body feature, detecting a second living body feature from frame images captured temporally before or after the image used for detection of the first living body feature; and a function of judging the properties of the living body based on the detection result of the second living body feature and outputting the judgment result.
A lumen image processing apparatus according to a fourth aspect of the present invention comprises: a feature quantity calculation unit that calculates a prescribed feature quantity by image processing from one or more intraluminal images obtained by imaging the digestive tract; and a boundary portion detecting section that detects a boundary portion of the digestive tract based on the calculated feature quantity.
A lumen image processing method according to a fifth aspect of the present invention comprises the steps of: calculating a prescribed feature quantity by image processing from one or more intraluminal images obtained by imaging the digestive tract; and detecting a boundary portion of the digestive tract based on the calculated feature quantity.
A program according to a sixth aspect of the present invention causes a computer to realize the following functions: a function of calculating a prescribed feature quantity from one or more intraluminal images obtained by imaging the digestive tract; and a function of detecting a boundary portion of the digestive tract based on the calculated feature quantity.
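As one hedged illustration of the feature-quantity-plus-boundary idea, a prescribed feature quantity such as the mean red ratio R/(R+G+B) could be computed per image, and a boundary reported where that quantity changes abruptly between consecutive images. The pixel values and the jump threshold below are invented for illustration only.

```python
def red_ratio(pixels):
    """Mean R/(R+G+B) over a list of (R, G, B) pixels."""
    ratios = [r / (r + g + b) for r, g, b in pixels if (r + g + b) > 0]
    return sum(ratios) / len(ratios)

def detect_boundary(images, jump=0.1):
    """Index of the first image whose feature quantity differs from the
    previous image's by more than `jump`, or None if no boundary is found."""
    feats = [red_ratio(img) for img in images]
    for i in range(1, len(feats)):
        if abs(feats[i] - feats[i - 1]) > jump:
            return i
    return None

esoph = [(90, 80, 70)] * 4     # whitish squamous mucosa (hypothetical values)
stomach = [(160, 60, 50)] * 4  # redder gastric mucosa (hypothetical values)
images = [esoph, esoph, stomach, stomach]
print(detect_boundary(images))  # -> 2
```

A redness-based feature is only one candidate; the aspect as claimed covers any prescribed feature quantity computed by image processing.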
Description of drawings
Fig. 1 is a block diagram showing the overall structure of an endoscopic system provided with a first embodiment.
Fig. 2 is a schematic diagram showing the parts of the digestive tract through which the distal end of an endoscope inserted from the mouth passes during endoscopy.
Fig. 3 is a diagram showing an example of an endoscopic image obtained by imaging the vicinity of the boundary between the esophagus and the stomach.
Fig. 4 is a diagram showing the functional structure of the main part of the image processing apparatus.
Fig. 5 is a diagram showing how the moving image data stored in the image storage unit is stored as still image data sets.
Fig. 6A is a diagram showing the analysis results stored in the analysis information storage unit.
Fig. 6B is a diagram showing an example of information used or set when analysis processing is performed with the processing program storage unit 23.
Fig. 7 is a diagram showing a display example in which analysis results are displayed on the monitor together with the endoscopic image.
Fig. 8 is a flowchart of the processing steps for judging the properties of Barrett's esophagus.
Fig. 9 is a flowchart showing the processing steps of the esophagogastric junction detection processing together with information such as the images used or generated.
Fig. 10A is a diagram showing the palisade vessel end point boundary.
Fig. 10B is a diagram showing the palisade vessel end point boundary and the epithelial boundary.
Fig. 10C is a diagram showing a state in which an image of the palisade vessel end point boundary and the like is divided by 8 radial lines.
Fig. 11 is a flowchart showing in detail the palisade vessel extraction processing in Fig. 9.
Fig. 12A is a diagram showing an image example used to explain the operation of the processing in Fig. 11.
Fig. 12B is a diagram showing an image example used to explain the operation of the processing in Fig. 11.
Fig. 12C is a diagram showing an image example used to explain the operation of the processing in Fig. 11.
Fig. 13 is a flowchart showing in detail the Barrett's mucosa judgment processing in Fig. 10.
Fig. 14 is a flowchart of a modification of Fig. 9.
Fig. 15A is a diagram showing an image example used in the operation explanation of Fig. 14 and the like.
Fig. 15B is a diagram showing an image example used in the operation explanation of Fig. 14 and the like.
Fig. 16 is a flowchart showing in detail the Barrett's mucosa judgment processing in Fig. 14.
Fig. 17 is a diagram showing the functional structure of the main part of the image processing apparatus of a second embodiment.
Fig. 18 is a flowchart of the processing steps for judging the properties of Barrett's esophagus in the second embodiment.
Fig. 19 is a flowchart showing the processing steps of the cardia detection processing together with information such as the images used or generated.
Fig. 20A is an operation explanation diagram for Fig. 19.
Fig. 20B is an operation explanation diagram for Fig. 19.
Fig. 21 is a flowchart showing in detail the concentration degree calculation processing of Fig. 19.
Fig. 22 is a flowchart of the closed cardia judgment processing of Fig. 19.
Fig. 23 is a flowchart showing, in a modification, the processing sequence of the cardia detection processing together with information such as the images used or generated.
Fig. 24A is an operation explanation diagram for Figs. 23 and 25.
Fig. 24B is an operation explanation diagram for Figs. 23 and 25.
Fig. 24C is an operation explanation diagram for Figs. 23 and 25.
Fig. 25 is a flowchart showing in detail the edge element formation angle calculation processing in Fig. 23.
Fig. 26 is a flowchart of the open cardia judgment processing of Fig. 23.
Fig. 27 is a diagram showing the functional structure of the main part of the image processing apparatus of a third embodiment.
Fig. 28 is a flowchart of the processing steps for judging the properties of Barrett's esophagus.
Fig. 29 is a flowchart of the processing steps for judging the properties of Barrett's esophagus in a modification.
Fig. 30A is a block diagram showing the schematic configuration of the capsule endoscope device of a fourth embodiment.
Fig. 30B is a block diagram showing the schematic configuration of a terminal device serving as the lumen image processing apparatus of the fourth embodiment.
Fig. 31 is an explanatory diagram of the schematic configuration of the capsule endoscope of the fourth embodiment.
Fig. 32 is a flowchart of an example processing flow, executed in the terminal device, for detecting the cardia by using the esophagogastric junction.
Fig. 33 is a schematic graph for explaining the change in hue in a series of obtained endoscopic images.
Fig. 34 is a flowchart of an example processing flow of step S203 in Fig. 32.
Fig. 35 is a flowchart of an example processing flow for detecting the change in the average hue feature quantity by calculating the differential value of the average hue feature quantity.
Fig. 36 is a graph for explaining the change in the standard deviation or variance of the hue feature quantity in a series of obtained endoscopic images in a seventh modification of the fourth embodiment.
Fig. 37 is a diagram showing an example of the region within each frame image in which image processing is performed in the fourth embodiment and its modifications.
Fig. 38 is a schematic graph for explaining the light-dark change, specifically the brightness change, in a series of endoscopic images obtained in a fifth embodiment.
Fig. 39 is a flowchart of an example processing flow, executed in the terminal device based on a series of obtained endoscopic images in the fifth embodiment, for detecting the cardia at the time of passing the esophagogastric junction.
Fig. 40 is a flowchart of an example processing flow of step S33 in Fig. 39.
Fig. 41 is a schematic graph of the change in the G or B pixel data in a series of endoscopic images, for explaining the case where the G or B pixel data is used as the light-dark information of the image instead of the brightness calculated from the three RGB pixel values.
Fig. 42 is a flowchart of an example processing flow in the fifth embodiment for detecting the light-dark change by calculating the differential value of the average brightness value.
Fig. 43 is a diagram showing an image example when the capsule endoscope is located just before the open cardia in a sixth embodiment.
Fig. 44 is a flowchart of an example processing flow for detecting the open cardia based on a series of obtained endoscopic images in the sixth embodiment.
Fig. 45 is a diagram showing an image example when the capsule endoscope passes through the open cardia in a seventh embodiment.
Fig. 46 is a flowchart of an example processing flow for detecting the open cardia based on a series of obtained endoscopic images in the seventh embodiment.
Fig. 47 is a diagram showing the filter characteristic of the bandpass filtering processing in the seventh embodiment.
Fig. 48 is a diagram showing an image example obtained by applying the bandpass filtering processing and a prescribed binarization processing to the image of Fig. 45.
Figure 52 is the flow chart that detects the handling process example of pars cardiaca in expression the 9th embodiment based on resulting a series of endoscopic images.
Figure 53 is the figure that handles the position of centre of gravity of calculating in expression the 9th embodiment by dark portion regional barycenter coordinate Calculation.
Figure 54 is the key diagram of full week property evaluation in the 9th embodiment.
Figure 55 is the key diagram that carries out full week property evaluation based on region rate in the 4th variation of the 9th embodiment.
Figure 56 is the key diagram that carries out full week property evaluation based on angular range in the 4th variation of the tenth embodiment.
Figure 57 is the figure of the image example when capsule type endoscope is positioned at the side nearby of the pars cardiaca of closing in expression the tenth embodiment.
Figure 58 is the flow chart that detects the handling process example of the pars cardiaca of closing in expression the tenth embodiment based on resulting a series of endoscopic images.
Figure 59 is the flow chart that detects the handling process example of the pars cardiaca of closing in expression the 11 embodiment based on resulting a series of endoscopic images.
Figure 60 is that expression is used for illustrating that the 11 embodiment is according to the image of the pars cardiaca of closing and the figure of the image example of the cardia shape of graph thinning.
Figure 61 is that the parameter of calculating concentration degree in expression the 11 embodiment is the flow chart of the handling process example of centrifugal pump.
Figure 62 is the figure that expression is used to illustrate the image example of branch point in the 11 embodiment.
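The dark-region centroid and full-circumference evaluation listed above (Figs. 53 and 54) can be sketched roughly as follows: compute the centroid of the dark pixels, then check whether dark pixels appear in every angular sector around that centroid. The sector count and the synthetic point sets are illustrative assumptions, not the patented procedure.

```python
import math

def centroid(points):
    """Centroid of a set of (x, y) pixel coordinates."""
    xs, ys = zip(*points)
    return sum(xs) / len(xs), sum(ys) / len(ys)

def full_circumference(dark_pixels, sectors=8):
    """True if dark pixels surround their centroid in every angular sector."""
    cx, cy = centroid(dark_pixels)
    covered = set()
    for x, y in dark_pixels:
        ang = math.atan2(y - cy, x - cx) % (2 * math.pi)
        covered.add(int(ang / (2 * math.pi / sectors)) % sectors)
    return len(covered) == sectors

# A ring of dark pixels around the centroid covers all 8 sectors...
ring = [(math.cos(a / 16 * 2 * math.pi), math.sin(a / 16 * 2 * math.pi))
        for a in range(16)]
print(full_circumference(ring))  # -> True
# ...whereas a half ring leaves some sectors empty.
half = ring[:8]
print(full_circumference(half))  # -> False
```

This is the kind of check that separates a dark region spanning the full lumen circumference from one that is merely an off-center shadow.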
Detailed description of the embodiments
Embodiments of the present invention will be described below with reference to the drawings.
(first embodiment)
Figs. 1 to 16 relate to a first embodiment. Fig. 1 shows the overall structure of an endoscopic system provided with the present embodiment. Fig. 2 schematically shows the parts of the digestive tract through which the distal end of an endoscope inserted from the mouth passes during endoscopy. Fig. 3 shows an example of an endoscopic image obtained by imaging the vicinity of the boundary between the esophagus and the stomach. Fig. 4 shows the functional structure of the image processing apparatus of the present embodiment. Fig. 5 shows how the moving image data stored in the image storage unit is stored as still image data sets.
Figs. 6A and 6B show the analysis results stored in the analysis information storage unit and the information stored in the processing program storage unit, respectively. Fig. 7 shows a display example in which analysis results are displayed on the monitor together with the endoscopic image. Fig. 8 is a flowchart of the processing steps for judging the properties of Barrett's esophagus in the present embodiment. Fig. 9 shows the processing steps of the esophagogastric junction detection processing together with information such as the images used or generated.
Figs. 10A to 10C show the palisade vessel end point boundary and the like. Fig. 11 is a flowchart of the palisade vessel extraction processing in Fig. 9. Figs. 12A to 12C show image examples used to explain the operation of the processing in Fig. 11. Fig. 13 is a flowchart of the Barrett's mucosa judgment processing in Fig. 10. Fig. 14 is a flowchart of a modification of Fig. 9. Figs. 15A and 15B show image examples used in the operation explanation of Fig. 14 and the like. Fig. 16 is a flowchart of the Barrett's mucosa judgment processing in Fig. 14.
The endoscopic system 1 shown in Fig. 1 comprises: an endoscopic observation apparatus 2; a medical image processing apparatus (hereinafter simply referred to as the image processing apparatus) 3, constituted by a personal computer or the like, which applies image processing to the images obtained by the endoscopic observation apparatus 2; and a monitor 4 that displays the images to which image processing has been applied by the image processing apparatus 3.
The endoscopic observation apparatus 2 has: an endoscope 6 that can be inserted into a body cavity and forms an in-vivo imaging device for imaging the interior of the body; a light source device 7 that supplies illumination light to the endoscope 6; a camera control unit (hereinafter simply referred to as the CCU) 8 that performs signal processing for the imaging unit of the endoscope 6; and a monitor 9 that, upon receiving the video signal output from the CCU 8, displays the endoscopic image captured by the image pickup element.
The endoscope 6 has an insertion portion 11 that can be inserted into a body cavity, and an operation portion 12 provided at the rear end of the insertion portion 11. A light guide 13 for transmitting illumination light is inserted through the insertion portion 11.
The rear end of the light guide 13 is connected to the light source device 7. The illumination light supplied by the light source device 7 is transmitted by the light guide 13 and emitted from the distal end face fitted to an illumination window provided in the distal end portion 14 of the insertion portion 11, to illuminate a subject such as an affected part.
An imaging device 17 is provided, comprising: an objective lens 15 fitted to an observation window adjacent to the illumination window; and, for example, a charge coupled device (hereinafter simply referred to as the CCD) 16 arranged as a solid-state image pickup element at the image-forming position of the objective lens 15. The optical image formed on the imaging surface of the CCD 16 is photoelectrically converted by the CCD 16.
The CCD 16 is connected to the CCU 8 via a signal line, and when a CCD drive signal is applied from the CCU 8, the CCD 16 outputs the photoelectrically converted image signal. This image signal is processed by a video processing circuit in the CCU 8 and converted into a video signal. The video signal is output to the monitor 9, and the endoscopic image is displayed on the display surface of the monitor 9. The video signal is also input to the image processing apparatus 3.
In the present embodiment, the endoscope 6 is used as follows: the distal end portion 14 of the insertion portion 11 is inserted from the mouth to the vicinity of the boundary between the esophagus and the stomach, and an endoscopy is performed to determine whether, near this boundary, part of the normal mucosa of the esophagus under examination (specifically, squamous epithelium) has degenerated into mucosa exhibiting the properties of gastric mucosa, that is, Barrett's mucosa.
In this case, the video signal corresponding to the endoscopic image obtained by imaging the mucosal surface within the living body is also input to the image processing apparatus 3, and processing by the image processing method described below is applied to this video signal to detect (judge) whether Barrett's mucosa exists or whether a disease state such as Barrett's esophagus has developed.
The image processing apparatus 3 has: an image input unit 21 that receives the video signal corresponding to the endoscopic image input from the endoscopic observation apparatus 2; a CPU 22, a central processing unit that applies image processing to the image data input from the image input unit 21; and a processing program storage unit 23 that stores the processing programs (control programs) by which the CPU 22 executes the image processing.
The image processing apparatus 3 further has: an image storage unit 24 that stores the image data input from the image input unit 21 and the like; an analysis information storage unit 25 that stores the analysis information processed by the CPU 22 and the like; a hard disk 27, a storage device that stores, via a storage device interface 26, the image data and analysis information processed by the CPU 22; a display processing unit 28 for display processing of the image data processed by the CPU 22 and the like; and an input operation unit 29, constituted by a keyboard and the like, with which the user inputs data such as image processing parameters and performs instruction operations.
The video signal generated by the display processing unit 28 is output to the display monitor 4, and the processed image that has undergone image processing is displayed on the display surface of the display monitor 4. The image input unit 21, CPU 22, processing program storage unit 23, image storage unit 24, analysis information storage unit 25, storage device interface 26, display processing unit 28, and input operation unit 29 are connected to one another via a data bus 30.
In the present embodiment, the examination or diagnosis target site is the periphery of the junction between the esophagus and the stomach, and by performing image analysis on the images obtained by the endoscope 6, it is judged whether this periphery has properties in which Barrett's esophagus may occur.
To this end, the insertion portion 11 of the endoscope 6 is inserted from its distal end into the patient's mouth, and imaging is performed. Fig. 2 shows the body cavity parts in which the endoscope distal end is located when the endoscope 6 is inserted into the patient's body lumen from the mouth. The distal end portion 14 of the endoscope 6 is inserted through the mouth 31, enters the esophagus 33 from the esophageal entrance 32, passes the epithelial boundary 34 and the esophagogastric junction 35 on the way, enters the stomach 36 side, and then reaches the interior of the stomach 36 through the cardia 37.
By the operation of inserting the endoscope 6, moving image data captured in this order can be obtained. The moving image data obtained in this way is stored in the image storage unit 24, and image analysis is performed on the frame images, that is, the still images constituting the moving image data.
Fig. 3 is a schematic diagram of an example of an endoscopic image captured near the boundary between the esophagus 33 and the stomach 36. In this endoscopic image, the cardia 37 opens and closes at the entrance toward the stomach.
The palisade vessels 38, which extend generally radially outside the cardia 37, are blood vessels that exist only on the esophagus 33 side and extend in the longitudinal direction of the lumen of the esophagus 33.
From the epithelial border 34 (indicated by a chain line in the figure), which is the boundary between the mucosal tissue on the esophagus 33 side and the mucosal tissue on the stomach 36 side, toward the cardia, the mucosa has a strongly red tone (the epithelium distributed there is called columnar epithelium), whereas in the opposite direction the mucosa has a whitish tone (the epithelium distributed there is called squamous epithelium); the epithelial border can therefore be distinguished by endoscopic observation.
The line connecting the end points of the palisade vessels 38 (indicated by a dotted line in the figure) is a boundary line that is not easy to identify in endoscopic observation (no actual line exists); it is called the gastroesophageal junction 35 and is the structural boundary between the stomach 36 and the esophagus 33.
Under normal circumstances, the epithelial border 34 is located near the gastroesophageal junction 35, but when the squamous epithelium forming the esophagus 33 is replaced by the mucosa of the stomach 36 (columnar epithelium, i.e., Barrett's mucosa) under the influence of reflux esophagitis or the like, the epithelial border 39 rises toward the esophagus 33 side.
Furthermore, when this Barrett's mucosa occurs over the full circumference of the esophageal lumen cross section and extends more than 3 cm beyond the normal mucosal border, the condition is diagnosed as Barrett's esophagus.
Fig. 4 shows the functional configuration of the main part of the image processing apparatus 3.
The moving image data captured by the endoscope 6 and input to the image processing apparatus 3 is stored as moving image data Vm1, Vm2, and so on in the image storage section 24, which serves as an image storage (image recording) unit.
Here, the moving image data Vm1, Vm2, ... has a data structure in which still images are stored in chronological order. As shown for example in Fig. 5, each item of moving image data Vm1, Vm2, ... is therefore stored in the image storage section 24 as still image data Vs0, Vs1, ..., VsM (where M = MAX_COUNT) allocated to frame numbers 0, 1, ..., MAX_COUNT.
The times of the stored frames are also stored in the image storage section 24 at the same time. The still image data may also be stored as image data compressed by JPEG or the like.
After the image processing begins, the still image data in a specified frame number range, read for example from the moving image data Vm1 in the image storage section 24, is extracted and read by an image extraction block 41 constituted by the CPU 22 and the software of the processing program. The image extraction block 41 constitutes an image extraction unit that extracts frame image data from moving image data captured inside the body or from a plurality of continuously captured items of still image data.
Each item of extracted still image data is then conveyed in turn to an image analysis block 42 and a display processing block 43.
The image analysis block 42 has: an epithelial border detection block 44 that detects the epithelial border; a gastroesophageal junction detection block 45 that detects the gastroesophageal junction; and a Barrett's esophagus determination block 46 that determines whether or not Barrett's esophagus is present. The image analysis block 42 constitutes an image analysis section that performs image analysis on the frame images extracted by the image extraction block 41 and outputs the image analysis results.
The epithelial border detection block 44 detects, for example, the edge formed by the difference in mucosal tone in the image, and thereby detects the epithelial boundary line present in the image as a point sequence.
The gastroesophageal junction detection block 45 detects, for example, the line connecting the end points of the palisade vessels as a point sequence (its detection method will be described in detail later).
The Barrett's esophagus determination block 46 calculates characteristic quantities such as the shape of the epithelial border, residual striations of squamous epithelium, the distance between the epithelial border and the gastroesophageal junction, the standard deviation of that distance, and the maximum/minimum of that distance, and determines whether the imaged target site is Barrett's esophagus.
The determination result information produced by this Barrett's esophagus determination block 46 is stored in the analysis information storage section 25 and is also sent to the display processing block 43, and the determination result obtained by the analysis of the image analysis block 42 is displayed on the monitor 4 in the still image passed through the image extraction block 41.
Fig. 6A shows an example of the analysis results stored in the analysis information storage section 25, and Fig. 6B shows an example of the information used during analysis processing or set by the processing program storage section 23.
Fig. 7 shows a display example in which the determination result information is displayed on the monitor 4 in the still image on which this analysis has been performed.
In the present embodiment, as will be described with reference to Fig. 8, when determining from the moving image data whether still image data containing Barrett's esophagus is present, processing is performed to determine whether a reference site having a first living body feature (quantity) — in the present embodiment, the gastroesophageal junction — was imaged temporally before or after (including roughly simultaneously with) the image of the determination target site on which the Barrett's esophagus property determination is to be performed. The gastroesophageal junction detection block 45 constitutes a first living body feature detection section that detects the first living body feature.
A further characteristic is that, when it is determined by this determination processing that the reference site has been imaged, a second living body feature (quantity) — in the present embodiment, a feature of the epithelial border — is detected in the still images of the frames after or before that frame, and an image processing step that performs the Barrett's esophagus determination based on the detection result of the second living body feature is employed, so that the property determination of the Barrett's esophagus as the property determination target is carried out efficiently. The epithelial border detection block 44 constitutes a second living body feature detection section that, based on the detection result of the gastroesophageal junction detection block 45, detects the second living body feature in frame images captured temporally before or after the image used for detection by the gastroesophageal junction detection block 45.
By performing this image analysis processing, the detection of the second living body feature and subsequent processing can be omitted for images that do not have the first living body feature, and the property determination result for the intended determination target can be obtained efficiently in a short time, even when a large amount of image data is present.
The operation of the image processing apparatus 3 of the present embodiment is described below with reference to the flowchart of Fig. 8.
When the user specifies the file name of the moving image data from the input operation section 29 to the CPU 22, which operates according to the processing program, the CPU 22 reads the maximum frame number of the specified moving image data from the image storage section 24, substitutes it into the parameter MAX_COUNT representing the maximum frame number as shown in Fig. 6B, and begins processing according to the processing program.
In the initial step S1, the CPU 22 initializes the frame number variable COUNT, i.e., sets COUNT = 0.
In the next step S2, the CPU 22 compares the frame number variable COUNT with MAX_COUNT, and if COUNT > MAX_COUNT, this processing ends.
When the determination at step S2 is negative, i.e., COUNT ≤ MAX_COUNT, the flow proceeds to step S3, in which the image extraction block 41 extracts the image of frame number = COUNT.
In the next step S4, the gastroesophageal junction detection block 45 performs processing for detecting the gastroesophageal junction in the present embodiment, as the processing for detecting the first living body feature (hereinafter simply called a feature) from the image of that frame number.
As shown in step S5, whether the point sequence representing the line of the gastroesophageal junction 35 has been obtained, i.e., whether the gastroesophageal junction 35 is present, is determined according to the result of the detection processing.
When it is determined in step S5 that the gastroesophageal junction 35 is not present, the processing of steps S3 and S4 is interrupted and the flow proceeds to the next step S6, in which the frame number variable COUNT is incremented by 1 before returning to step S2; the processing from step S2 to step S6 is then repeated.
On the other hand, when it is determined in step S5 that the gastroesophageal junction 35 is present, the flow moves to the side of detecting the second feature from step S7 onward and, based on that detection result, performing the property determination processing of whether the target is Barrett's esophagus.
In step S7, in order to begin the Barrett's esophagus determination processing, the variable N is set, specifically to N = 0.
Then, in the next step S8, N is compared with a prescribed constant MAX_N; specifically, MAX_N is the maximum number of frames for which the Barrett's esophagus determination processing is required. If the comparison result is N > MAX_N, this processing ends. In the present embodiment, the Barrett's esophagus determination is thus not performed for frame numbers beyond the maximum set in advance in this way.
On the other hand, when the comparison at step S8 is negative, i.e., N ≤ MAX_N, the flow proceeds to step S9, in which the image extraction block 41 extracts the image of frame number = COUNT + N. That is, the image N frames after the image in which the gastroesophageal junction 35 was detected is extracted. (At this point, since N is in its initial state of 0, the Barrett's esophagus determination processing first starts from the image in which the gastroesophageal junction 35 was detected; as will be understood from the subsequent processing, the images captured later in time are then subjected to the Barrett's esophagus determination processing in turn.)
Then, in step S10, the gastroesophageal junction detection block 45 performs detection processing of the gastroesophageal junction 35 on the image of that frame number.
In the next step S11, the epithelial border detection block 44 performs the second feature detection processing, namely detection of the epithelial border 34, on the image of that frame number. The epithelial border 34 is detected, for example, by the processing of steps S1 to S4 of Fig. 4 in Japanese Patent Application No. 2004-360319. Specifically, since the squamous epithelium on the esophagus side and the columnar epithelium on the stomach side differ in tone as described above, after edge processing and thinning processing are applied to the endoscope image data, the coordinates of the epithelial border 34 can be calculated (detected) by connecting the point sequence of the generated boundary and following the coordinate point sequence along the boundary.
In the next step S12, the Barrett's esophagus determination block 46 uses the point sequence representing the line of the gastroesophageal junction 35 detected in step S10 and the point sequence representing the line of the epithelial border 34 detected in step S11 to determine whether the site that is the property determination target in the captured image is Barrett's esophagus. The Barrett's esophagus determination block 46 constitutes a property determination section that determines the property of the living body and outputs the determination result based on the detection result of the epithelial border detection block 44.
Specifically, whether or not Barrett's esophagus is present can be determined by the processing described in the Barrett's mucosa determination processing of Fig. 13, described later.
Then, in step S13, the Barrett's esophagus determination block 46 sends the determination result of whether or not Barrett's esophagus is present, together with the frame number, to the display processing block 43. The display processing block 43 extracts the image data of the specified frame number from its internal buffer memory (not shown) and superimposes the determination result on that image data. The image data is then transmitted to the monitor 4, and the image and the determination result are displayed together on the display surface.
For example, when the image is determined to be Barrett's esophagus, "suspected Barrett's esophagus", for example, is displayed on the determination target image, as shown in Fig. 6B.
After the processing of step S13, the variable N is incremented by 1 in the next step S14 before returning to step S8, and the processing from step S8 to step S14 is repeated. When the variable N exceeds its maximum MAX_N, this processing ends.
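The control flow of steps S1 to S14 can be sketched as follows; the three detector callables are hypothetical stand-ins for the blocks 44 to 46, and the frames are any indexable sequence:

```python
def scan_frames(frames, detect_junction, detect_border, judge_barrett, max_n):
    """Two-stage scan after Fig. 8: steps S2-S6 skip frames until the
    gastroesophageal junction (first feature) is found; steps S8-S14 then
    run epithelial-border detection and the Barrett determination on up to
    max_n following frames."""
    results = []
    count, max_count = 0, len(frames) - 1
    while count <= max_count:                         # steps S2-S6
        if detect_junction(frames[count]) is None:    # steps S4-S5
            count += 1                                # step S6
            continue
        n = 0
        while n <= max_n and count + n <= max_count:  # steps S8-S14
            frame = frames[count + n]
            junction = detect_junction(frame)         # step S10
            border = detect_border(frame)             # step S11
            if junction is not None and border is not None:
                results.append((count + n, judge_barrett(junction, border)))
            n += 1
        break        # frames beyond MAX_N are deliberately not examined
    return results
```

Frames examined before the first junction detection never reach the epithelial border detector, which is the time saving described above.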
According to the configuration and the present embodiment realizing the above processing, when performing image analysis of whether the still image data constituting the captured moving endoscope image data — the analysis target — shows Barrett's esophagus, processing is performed in imaging order to detect an image having, as a feature, the gastroesophageal junction 35 formed by the palisade vessel end points present at the periphery of the site at which the Barrett's esophagus determination is performed. Then, for the image detected by this processing and the images following it, the feature detection processing of the epithelial border 34 required for the Barrett's esophagus property determination is performed, and whether Barrett's esophagus is present is determined from the detection result and its positional relation to the gastroesophageal junction 35; Barrett's esophagus and the like can therefore be determined efficiently.
In addition, as described below, Barrett's mucosa (Barrett's epithelium), a symptom of the early stage occurring before the Barrett's esophagus condition develops, can also be determined, which can contribute to decisions on early treatment and the like.
Furthermore, in the present embodiment, the maximum frame number up to which the Barrett's esophagus property determination is performed is set in advance, and the property determination is not performed for images with frame numbers beyond this maximum, so that no time is wasted on images for which the Barrett's esophagus property determination is unnecessary.
That is, when imaging is performed in order from the mouth 31 side to the esophagus 33 and on into the interior of the stomach 36 through the cardia 37 as shown in Fig. 2, the images of the stomach interior are images for which the Barrett's esophagus property determination is unnecessary. In this case, by setting the frame number of such images to MAX_N in advance, the Barrett's esophagus property determination is not performed on them.
The detection processing of the gastroesophageal junction 35 is described below with reference to Figs. 9 to 13. A case is described in which the image analysis processing proceeds further, after detection of the gastroesophageal junction 35, to detection of the epithelial border 34 and to the Barrett's mucosa determination processing. The purpose of this image analysis processing is to provide an apparatus and a method that appropriately determine whether Barrett's mucosa is present; by performing such image analysis processing, Barrett's mucosa can be determined appropriately.
The processing steps at this time and the data generated are shown in Fig. 9. The left side of Fig. 9 shows the processing contents, and the frames on the right side show the images used or generated, related information, and the like.
After the image analysis processing begins, edge extraction processing is performed on the processing target image in the initial step S21. This edge extraction processing applies, for example, a bandpass filter to the G color component image of the RGB image to generate an edge image.
The edge extraction method using a bandpass filter is a known technique. The edge image may also be generated from the luminance component of the processing target image. In a case where edges of shapes (contours) other than the edges of blood vessels would also be extracted, only the edges of the blood vessels can be extracted if the bandpass filter is applied to the R component of the processing target image so as to exclude the edges of the unwanted shapes.
The processing portion corresponding to the gastroesophageal junction detection processing of step S4 of Fig. 8 uses steps S21 to S26 in Fig. 9.
In the next step S22 of Fig. 9, binarization processing is performed on the edge image to generate a binary image. The binarization processing in the present embodiment compares the pixel value of each pixel of the edge image in magnitude with a prescribed threshold, thereby determining each pixel of the binary image to be 0 or 1.
In the next step S23, a known thinning method is applied to the binary image to perform thinning processing and generate a thinned image.
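The front of this chain — bandpass edge extraction on the G component (step S21) followed by fixed-threshold binarization (step S22) — can be sketched as below; the thinning of step S23 is left to a known thinning routine (e.g., a morphological skeletonization from an image library). The difference-of-box-blurs bandpass and the threshold value are illustrative assumptions, not the filter actually used:

```python
import numpy as np

def box_blur(img, k):
    """Simple separable box blur; a stand-in for the smoothing stages of
    the bandpass filter mentioned at step S21."""
    kernel = np.ones(k) / k
    out = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, out)

def edge_and_binarize(g_channel, k_small=3, k_large=9, threshold=10.0):
    """Steps S21-S22 sketched: a difference-of-blurs bandpass on the G
    component yields an edge image, which is then binarized against a
    fixed threshold so each pixel becomes 0 or 1."""
    band = box_blur(g_channel, k_small) - box_blur(g_channel, k_large)
    edge = np.abs(band)
    return (edge > threshold).astype(np.uint8)
```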
In the next step S24, palisade vessel extraction processing for extracting the palisade vessels peculiar to the esophagus 33 is performed on this thinned image, and the extracted palisade vessel information is saved. The flow of this processing is shown in Fig. 11 (it will be explained later).
In the next step S25, the end point coordinates of the palisade vessels saved in the above palisade vessel extraction processing are obtained, and in step S26, boundary line generation processing that connects the end point coordinate point sequence with line segments is performed to generate (obtain) boundary line information. The boundary line information generated by this processing, specifically the palisade vessel end point boundary, is shown in Fig. 10A.
Further, in step S27, boundary line image generation processing is performed to generate a boundary line image that includes the boundary line information obtained by the above boundary line generation processing (the palisade vessel end point boundary), the dark part obtained in advance, and the epithelial border 34. This image is shown in Fig. 10B.
In the next step S28, Barrett's esophagus determination processing for determining whether Barrett's esophagus or Barrett's mucosa is present is performed according to the positional relation information on the epithelial border 34 between the squamous epithelium and the columnar epithelium obtained in advance. This processing will be described in detail later with reference to Fig. 13.
In this way, the determination of Barrett's esophagus or Barrett's mucosa is performed, the determination result is displayed, and this processing ends.
The palisade vessel extraction processing of step S24 of Fig. 9 is described below with reference to Fig. 11.
After this palisade vessel extraction processing begins, an unprocessed line segment is obtained from the thinned image in the initial step S31. The image at this time is shown, for example, in Fig. 12A.
In the next step S32, the number of pixels of the line segment is calculated as the line segment length L. Then, in the next step S33, the calculated line segment length L is compared in magnitude with a prescribed threshold thre1. In this determination processing, if L > thre1 the flow proceeds to the next step S34, and if L ≤ thre1 the line segment is determined not to be a palisade vessel and the flow moves to step S41. In the present embodiment, for example, thre1 = 50.
In step S34, the branch/intersection count C and the bend count B of the line segment are calculated, and in step S35 they are compared in magnitude with the prescribed thresholds. If C ≤ Cth and B < ε, the flow proceeds to the next step S36; if C > Cth or B ≥ ε, the line segment is determined not to be a palisade vessel to be extracted but a dendritic vessel, and the flow moves to step S41. In the present embodiment, Cth = 0 and ε = 3.
In step S36, of the two end points of the line segment, the end point closer to the dark part of the image obtained in advance is obtained, and in step S37, the vector v connecting that end point and the dark part center is calculated.
In the next step S38, the angle θ formed by the vector v and the straight line connecting the end point and start point of the line segment is calculated. Then, in the next step S39, the calculated angle θ is compared in magnitude with a prescribed threshold thre2.
In the determination processing of step S39, if θ < thre2 (for example, θ1 in Fig. 12B) the flow proceeds to the next step S40; conversely, when θ ≥ thre2 (for example, θ2 in Fig. 12B), the line segment is determined not to be a palisade vessel and the flow moves to step S41. In the present embodiment, thre2 = 45 degrees.
In step S40, among the line segments extracted in step S31, a line segment that satisfies the decision condition of step S39 is determined to be a palisade vessel, and the information related to this line segment (the line segment length L, the branch/intersection count C, the bend count B, the coordinate point sequence of the line segment, the end point coordinates, and the angle θ) is saved as palisade vessel information. In this way, the palisade vessels can be extracted as shown in Fig. 12C.
As described above, the processing from step S31 to S35 constitutes, in the gastroesophageal junction detection block 45, a processing section that considers, for the palisade vessels, the branch count, intersection count, and bend count of the line segments obtained by applying thinning processing to the frame image data, and determines whether a line segment is a palisade vessel. In addition, the processing from step S36 to S39 constitutes, in the gastroesophageal junction detection block 45, a processing section that further considers, for the palisade vessels, the angle formed between the straight line connecting the two ends of a line segment obtained by applying thinning processing to the frame image data and the vector connecting the dark part center of the dark part of the frame image with the one of the two ends closer to that dark part, and determines whether the line segment is a palisade vessel.
In step S41, it is determined whether any unprocessed line segments remain; if there are unprocessed line segments, the loop processing returns to step S31, and if there are none, this processing ends.
In steps S36 to S39, a matched filter may also be used so that only the blood vessels extending in the direction of the dark part are extracted.
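The per-segment tests of Fig. 11 can be sketched as a single predicate, shown below under the embodiment's example thresholds (thre1 = 50, Cth = 0, ε = 3, thre2 = 45 degrees). The branch/intersection and bend counts are assumed to be precomputed for each thinned segment, and the function is an illustrative sketch rather than the patented implementation:

```python
import math

def is_palisade_vessel(segment, dark_center,
                       thre1=50, cth=0, eps=3, thre2_deg=45.0):
    """Fig. 11 tests on one thinned line segment: length (S32-S33),
    branch/bend counts (S34-S35), and the angle between the segment's
    axis and the vector toward the dark-part center (S36-S39)."""
    points, branches, bends = segment
    if len(points) <= thre1:                 # S33: too short
        return False
    if branches > cth or bends >= eps:       # S35: dendritic vessel
        return False
    # S36: pick the end point closer to the dark part of the image
    start, end = points[0], points[-1]
    p = min((start, end), key=lambda q: math.dist(q, dark_center))
    vx, vy = dark_center[0] - p[0], dark_center[1] - p[1]   # S37: vector v
    sx, sy = end[0] - start[0], end[1] - start[1]           # segment axis
    dot = vx * sx + vy * sy
    norm = math.hypot(vx, vy) * math.hypot(sx, sy)
    theta = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    theta = min(theta, 180.0 - theta)        # a line has no direction
    return theta < thre2_deg                 # S39
```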
The Barrett's mucosa determination processing of step S28 of Fig. 9 is described below with reference to Fig. 13. In step S51, the boundary line image generated by the boundary line image generation processing described above is obtained; it is shown in Fig. 10B.
In the next step S52, this entire image is divided by a prescribed number N of rays. For example, the division for N = 8 is shown in Fig. 10C.
In the next step S53, the variable i representing the i-th ray [i] is set to the initial value 1.
Then, in the next step S54, the intersection point P1 of the i-th ray [i] with the epithelial border 34 and the intersection point P2 of the i-th ray [i] with the generated boundary are calculated. The image for calculating the points P1 and P2 is shown in Fig. 10C.
In the next step S55, the distance Q[i] between the points P1 and P2 is calculated.
In the next step S56, it is determined whether all the rays have been processed. That is, it is determined whether i has reached the ray count N; when it has not reached N, i is incremented by 1 in step S57 before returning to step S54 to carry out the same processing, and after all the rays have been processed, the flow proceeds to step S58.
After the distance Q[i] between the points P1 and P2 has thus been calculated for all N rays, the dispersion value σ is calculated in step S58 using these N distances Q[i].
In the next step S59, the dispersion value σ is compared in magnitude with a prescribed threshold thre3. The flow then proceeds to step S60 when σ > thre3; otherwise, when σ ≤ thre3, the image is determined not to be an image in which Barrett's mucosa is observed, and this processing ends. In the present embodiment, thre3 = 5.
When the image obtained in step S51 satisfies the decision condition of step S59, it is determined in step S60 to be an image in which Barrett's mucosa is observed, and this determination result is displayed, or notified and saved, after which these processings end.
By determining whether Barrett's mucosa (Barrett's epithelium) is present according to the processing shown in Figs. 9 to 13, a highly accurate determination can be made.
That is, in such processing, the gastroesophageal junction 35 and the epithelial border 34 are detected separately, and whether Barrett's mucosa is present is determined from the detection results, so that an appropriate and highly accurate determination can be made.
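A minimal sketch of the Fig. 13 decision follows. It assumes the epithelial border and the palisade vessel end point boundary can each be sampled as a radius from the dark-part center along every ray, and it takes the dispersion value σ to be the standard deviation of the N distances Q[i] — both simplifying assumptions:

```python
import math

def barrett_mucosa_by_variance(border_radius, junction_radius,
                               n_rays=8, thre3=5.0):
    """Fig. 13 sketch: N rays radiate from the dark-part center; on each
    ray the distance Q[i] between the epithelial-border crossing P1 and
    the junction-boundary crossing P2 is measured (steps S54-S55), and
    Barrett's mucosa is flagged when the dispersion of those distances
    exceeds thre3 (steps S58-S59)."""
    q = []
    for i in range(n_rays):
        angle = 2 * math.pi * i / n_rays
        q.append(abs(border_radius(angle) - junction_radius(angle)))  # Q[i]
    mean = sum(q) / n_rays
    sigma = math.sqrt(sum((d - mean) ** 2 for d in q) / n_rays)       # step S58
    return sigma > thre3                                              # step S59
```

With a smooth border running parallel to the junction, σ stays near zero, while an irregular epithelial border drives σ above thre3.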
The following modification may also be made to part of the processing of Fig. 13: the radius (or diameter) of the esophagus 33 in the image is calculated (estimated), and by using a known statistical value that can be approximated by this radius value, whether Barrett's mucosa or Barrett's esophagus is included can be determined quantitatively.
In Fig. 13, for example, processing for calculating the distance from the dark part center O to the point P1 (or from O to P2) — expressed as R[i] for clarity — is added between steps S55 and S56.
Through the determination processing of step S56, the distance Q[i] between the points P1 and P2 and the distance R[i] are then calculated for all the rays [i]. Instead of the calculation of the dispersion value σ of the distances Q[i] in step S58 of Fig. 13, the mean value Rav of the distances R[i] is calculated and taken as the evaluation value (estimate) of the radius of the esophagus 33 near the epithelial border 34.
A statistical radius value Rs (cm) of the esophagus 33 of a normal adult, or of a person of the patient's build, is stored in advance in a memory or the like, and by taking the radius value Rs to correspond to the above mean value Rav, the mean value of the distances Q[i] between the points P1 and P2 is estimated.
Then, it is determined whether the mean value of the distances Q[i] is 3.0 cm or more, and when it is 3.0 cm or more, the image is determined to be Barrett's esophagus.
That is, whether Barrett's esophagus is present is determined by considering the respective distances between the dark part center O and the epithelial border intersected by the rays radiating radially from the prescribed point, namely the dark part center O, or the respective distances between the gastroesophageal junction and the dark part center O.
In addition, when the mean value of the distances Q[i] is, for example, about 1.5 cm, the state is determined to be one in which the Barrett's mucosa has developed considerably. When the mean value of the distances Q[i] is, for example, about 0.5 cm, an early symptom of Barrett's mucosa formation or the like can be determined.
Thus, according to this modification, whether Barrett's esophagus is present can be determined under approximately quantitative conditions, and in the case of Barrett's mucosa, the degree of development of the Barrett's mucosa symptom can be determined quantitatively; by displaying the determination result and the like, early treatment is facilitated.
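The quantitative modification can be sketched as below. The statistical radius Rs (here 2.0 cm), the pixel-to-cm scaling via Rav, and the staging labels attached to the 3.0/1.5/0.5 cm thresholds mentioned above are all illustrative assumptions:

```python
def barrett_stage_by_length(q_px, r_px, r_stat_cm=2.0):
    """Sketch of the Fig. 13 modification: pixel distances Q[i] are put on
    a cm scale by assuming the measured mean radius Rav corresponds to the
    statistical esophageal radius Rs, after which the 3.0 cm rule and the
    example 1.5/0.5 cm stages from the text are applied."""
    rav = sum(r_px) / len(r_px)                # mean radius estimate Rav
    scale = r_stat_cm / rav                    # cm per pixel, via Rs
    q_mean_cm = scale * sum(q_px) / len(q_px)  # estimated mean |P1 P2| in cm
    if q_mean_cm >= 3.0:
        return "Barrett's esophagus"
    if q_mean_cm >= 1.5:
        return "Barrett's mucosa, considerably developed"
    if q_mean_cm >= 0.5:
        return "early Barrett's mucosa"
    return "no finding"
```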
In addition, also can replace the Barrett mucosa determination processing of Fig. 9, and carry out the flow chart of variation shown in Figure 14.
This variation is with the difference of aforementioned processing flow chart, replace the blood vessel end points extraction processing of the step S25 among Fig. 9, the boundary line generation processing of step S26 and the boundary line image of step S27 to generate processing, and the epithelium border palisade blood-vessel image generation processing of execution in step S61 as shown in figure 14.Use by this and handle the epithelium border palisade blood-vessel image that generates, carry out the Barrett mucosa determination processing of step S62.
Generate in the processing at the epithelium border of step S61 palisade blood-vessel image, shown in Figure 15 A, generate epithelium border palisade blood-vessel image, this epithelium border palisade blood-vessel image comprises: the palisade blood vessel of obtaining in above-mentioned palisade vessel extraction is handled, and dark portion that obtains in advance and epithelium boundary line.
Then, in next step 62, use the epithelium border palisade blood-vessel image that in previous step S61, generates, carry out Barrett mucosa determination processing.The flow chart of this Barrett mucosa determination processing as shown in figure 16.
As shown in figure 16 in initial step S63, obtain and comprise the epithelium border 34 obtained in advance and the image of palisade blood vessel.
In next step S64, the palisade blood vessel that intersects with the epithelium boundary line is counted J carry out initialization, promptly be set at J=0.
In next step S65, from Q bar palisade blood vessel, obtain the process object blood vessel.And then in next step S66, whether determination processing object blood vessel intersects with epithelium border 34.When crossing, enter next step S67 and make the palisade blood vessel count J to increase by 1, disjoint situation next return the processing of step S65 and obtain next process object blood vessel lay equal stress on complex phase with processing.
After carrying out step S67, in next step S68, judge whether whole palisade blood vessels is handled, if untreated palisade blood vessel were arranged would return step S65 lay equal stress on complex phase with processing, then enter next step S69 if instead whole palisade blood vessels finished to handle.
Fig. 15B shows an example image for which the count J of palisade vessels intersecting the epithelial border 34 has been computed in this way. In the case of Fig. 15B, the palisade vessel count Q is 7, of which 6 (= J) intersect the epithelial border 34.
In step S69, the ratio J/Q is compared with a prescribed threshold thre4. When J/Q > thre4, the flow proceeds to step S70; if instead J/Q ≤ thre4, it is determined that no Barrett mucosa is observed in this image, and the processing ends. In the present embodiment, thre4 = 0.5.
In step S70, when the image acquired in step S63 satisfies the decision conditions of steps S66 and S69, it is determined to be an image in which Barrett mucosa is observed; this result is displayed on the monitor 4 or the like, and the processing ends.
In this modification, whether the mucosa is Barrett mucosa is determined according to how many palisade vessel end points lie inside the epithelial border 34.
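The counting loop of steps S64 to S70 can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes each palisade vessel and the epithelial boundary line are available as polylines (lists of (x, y) points), and the function and helper names are hypothetical.

```python
def _seg_intersect(p, q, r, s):
    """True if segment pq properly intersects segment rs (orientation test)."""
    def orient(a, b, c):
        v = (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
        return (v > 0) - (v < 0)
    return (orient(p, q, r) != orient(p, q, s)
            and orient(r, s, p) != orient(r, s, q))

def crosses(vessel, border):
    """True if the vessel polyline crosses the border polyline anywhere."""
    return any(_seg_intersect(vessel[i], vessel[i + 1], border[j], border[j + 1])
               for i in range(len(vessel) - 1)
               for j in range(len(border) - 1))

def detect_barrett_mucosa(vessels, border, thre4=0.5):
    """Steps S64-S70: count the vessels crossing the border (J) out of
    the Q palisade vessels and report Barrett mucosa when J/Q > thre4."""
    q = len(vessels)
    j = sum(crosses(v, border) for v in vessels)
    return q > 0 and j / q > thre4
```

With 6 of 7 vessels crossing the border, as in Fig. 15B, J/Q ≈ 0.86 exceeds thre4 = 0.5 and the image is classified as showing Barrett mucosa.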
As described above, according to the present embodiment, when a large amount of still image data constituting moving image data of endoscopic images is analyzed for the presence of a Barrett esophagus, an image is first detected that has the feature of the gastroesophageal junction 35 — the gastric-side end points of the palisade vessels, serving as the first characteristic region present at the periphery of the site at which the Barrett esophagus determination is made — and on the images following the detected image, detection processing of the epithelial border 34 as the second characteristic region and the like is performed to determine whether a Barrett esophagus is present. The property determination of whether the esophagus is a Barrett esophagus can thus be made efficiently, reducing the labor of extracting images by manual work.
In addition, the gastroesophageal junction 35, set as the first characteristic region whose feature is detected first as described above, can itself also be used in the property determination of the Barrett esophagus, so its detection is put to effective use.
Moreover, since Barrett mucosa (Barrett's epithelium), which can be an early-stage symptom preceding the disease of Barrett esophagus, can also be determined, the embodiment is also suitable for decisions such as early treatment.
In addition, this processing need not be performed on images for which the Barrett esophagus determination is unnecessary.
Although the present embodiment has been described using moving image data, it is equally applicable to a series of consecutive still image data acquired in one examination (the same applies to the other embodiments).
(second embodiment)
A second embodiment is described below with reference to Figs. 17 to 26. In the first embodiment described above, in order to determine whether a Barrett esophagus is present, detection processing of the gastroesophageal junction 35 is performed first, whereby images containing the gastroesophageal junction 35 are detected.
The detection processing of the gastroesophageal junction 35 imposes a heavy processing load, so its processing speed is slow, leaving room for improvement in processing and detection speed. As detection processing of a biological site that can be detected with a higher processing speed than the gastroesophageal junction 35 and with high detection accuracy, cardia detection processing is considered.
The present embodiment focuses on this point to improve the processing speed and so on.
The hardware configuration of the image processing apparatus of the present embodiment can adopt Fig. 1 in the same way as the first embodiment. The functional configuration realized by the CPU 22 through the processing program of the present embodiment is shown in Fig. 17. The configuration shown in Fig. 17 additionally includes a cardia detection block 47 in the image analysis block 42 of the configuration of Fig. 4. The cardia detection block 47 constitutes the first living body feature detection section that detects the first living body feature.
Fig. 18 shows a processing flowchart of the present embodiment. The Barrett esophagus determination is performed and the determination result is displayed according to this flowchart.
The cardia detection block 47 performs, for example, dark region detection, and detects the cardia 37 from the image data according to the shape of the detected dark region and the degree of brightness change near its edges. The details will be described later with reference to Fig. 19.
In the processing steps shown in Fig. 18 — that is, in the processing steps of the flowchart of Fig. 8 — the detection processing of the gastroesophageal junction 35 in step S4 is replaced by detection processing of the cardia 37 in step S4', and the presence determination of the gastroesophageal junction 35 in step S5 following step S4 is replaced by a presence determination of the cardia 37 in step S5'.
Furthermore, instead of extracting the image of frame number = COUNT + N in step S9 of the processing steps of Fig. 8, the image of frame number = COUNT − N is extracted as shown in step S9' of Fig. 18.
That is, in the first embodiment, the gastroesophageal junction 35 lies at the periphery of the site at which the Barrett esophagus determination is made; therefore, by examining images temporally later than the image in which this site is detected, images closer to the periphery of this site are obtained, corresponding to the imaging conditions in which the distal end portion 14 of the endoscope 6 captures images while being inserted.
In contrast, as shown in Fig. 2, the cardia 37 is the site forming the entrance into the stomach, lying beyond the gastroesophageal junction 35 and the epithelial border 34; therefore, a frame image N frames back in time is extracted from the image in which this cardia detection succeeded. Whether a Barrett esophagus is present is then determined sequentially in the direction going back from this image toward earlier-captured images.
According to the present embodiment, after detection of the cardia 37 — whose detection processing load is smaller than that of the gastroesophageal junction 35 — the Barrett esophagus determination processing is performed based on the image in which the cardia 37 was detected, so the Barrett esophagus determination can be made in a shorter time.
It is therefore possible to move on to the Barrett esophagus determination processing more efficiently, and to obtain the same effect as the first embodiment in a shorter time.
The detection processing of the cardia 37 is described below with reference to Figs. 19 to 22. Fig. 19 shows, together, the processing flow of cardia detection when the cardia is closed and the data used or generated.
When the detection processing of the cardia 37 starts, edge detection processing is applied to the image to be processed as shown in step S71, generating an edge image.
In the present embodiment, this edge extraction processing generates the edge image by applying a band-pass filter to the R color component image.
The edge extraction method using a band-pass filter is a known technique. Alternatively, the edge image may be generated from the luminance component of the image to be processed.
In the next step S72, binarization processing is applied to the edge image to generate a binary image. The binarization processing of the present embodiment compares the pixel value of each pixel of the edge image with a prescribed threshold, thereby determining each pixel of the binary image to be 0 or 1.
In the next step S73, a known thinning method is applied to the binary image to perform thinning processing and generate a thinned image. The generated thinned image is, for example, as shown in Fig. 20A.
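Steps S71 and S72 can be sketched as follows. The patent does not specify the band-pass filter, so this illustration uses a difference of Gaussians, one common band-pass choice; the function name and threshold values are assumptions for the example, not from the text.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def edge_binary_image(r_channel, sigma_lo=1.0, sigma_hi=3.0, thresh=5.0):
    """Steps S71-S72: band-pass filter the R color component image
    (difference of Gaussians here) and binarize the filter response
    against a fixed threshold, yielding a 0/1 edge image."""
    r = r_channel.astype(float)
    band = gaussian_filter(r, sigma_lo) - gaussian_filter(r, sigma_hi)
    return (np.abs(band) > thresh).astype(np.uint8)

# Step S73 would then thin the binary image, e.g. with a standard
# skeletonization routine such as skimage.morphology.skeletonize.
```

On a synthetic image containing a single bright line, the pixels on and near the line are set to 1 and the flat background to 0.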
In the next step S74, branch/crossing point calculation processing is performed to compute the branch/crossing points of all the thin lines in this thinned image. Fig. 20B shows an example of branch/crossing points computed for the thinned image of Fig. 20A; in Fig. 20B, the branch/crossing point count Nc is 5.
The coordinates of the branch/crossing points computed by the branch/crossing point calculation processing of step S74 are saved as branch/crossing point information.
In the next step S75, concentration degree calculation processing is performed, which computes the degree of concentration of the branch/crossing points from their coordinate values, yielding concentration degree information.
This concentration degree calculation processing is described according to the flowchart of Fig. 21. In the first step S77, the coordinate values of the Nc branch/crossing points are obtained. In the next step S78, the dispersion value σx of the x coordinate values and the dispersion value σy of the y coordinate values of the Nc branch/crossing points are computed. Then, in the next step S79, the two computed dispersion values σx and σy are saved as the concentration degree information, and this processing ends.
Instead of the dispersion values computed in this concentration degree calculation processing, the standard deviation, the coefficient of variation, or the mean distance from the centroid of the Nc branch/crossing points may be obtained and used as the concentration degree information.
Returning to Fig. 19, the concentration degree information computed by the concentration degree calculation processing of step S75 is used in the closed cardia determination processing of the next step S76.
As shown in step S76a of Fig. 22, this closed cardia determination processing compares the branch/crossing point count Nc and the concentration degree information (σx, σy) against the respective prescribed thresholds thre_N, thre_x, and thre_y.
Then, in step S76a, when the conditions Nc > thre_N and σx < thre_x and σy < thre_y are determined to be satisfied, it is determined in step S76b that the cardia 37 is closed. On the other hand, when the conditions of step S76a are not satisfied — that is, when Nc ≤ thre_N or σx ≥ thre_x or σy ≥ thre_y — it is determined in step S76c that a closed cardia 37 is not present.
After the cardia determination processing has been performed in this way, the cardia detection processing shown in Fig. 19 ends.
The cardia 37 can thus be detected.
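Steps S74 to S76 can be sketched as follows. Assumptions: branch/crossing points are taken as skeleton pixels with three or more 8-connected skeleton neighbors (a common convention the patent does not spell out), the dispersion values are computed as variances, and the threshold values are illustrative placeholders, since the text does not give thre_N, thre_x, or thre_y.

```python
import numpy as np

def branch_points(skel):
    """Step S74: branch/crossing points of a thinned 0/1 image, taken
    here as skeleton pixels with >= 3 skeleton neighbors (8-connected)."""
    pts = []
    h, w = skel.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if skel[y, x]:
                neighbors = int(skel[y - 1:y + 2, x - 1:x + 2].sum()) - 1
                if neighbors >= 3:
                    pts.append((x, y))
    return pts

def is_closed_cardia(pts, thre_n=4, thre_x=25.0, thre_y=25.0):
    """Steps S75-S76: concentration degree as the variances of the
    branch-point coordinates; a closed cardia is declared when many
    branch points (Nc > thre_N) are tightly clustered
    (sigma_x < thre_x and sigma_y < thre_y)."""
    nc = len(pts)
    if nc == 0:
        return False
    xs = np.array([p[0] for p in pts], dtype=float)
    ys = np.array([p[1] for p in pts], dtype=float)
    return nc > thre_n and xs.var() < thre_x and ys.var() < thre_y
```

A closed cardia appears in the thinned image as many mucosal-fold lines converging on one spot, which is exactly the "many branch points, small coordinate dispersion" condition tested here.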
As described above, according to the present embodiment, the processing of detecting the cardia 37 is performed first, and when the cardia 37 is detected, the Barrett esophagus determination is performed on images going back in time from that image; therefore, even for a large amount of image data, the Barrett esophagus determination can be performed efficiently.
Furthermore, as explained in the first embodiment, when the Barrett esophagus determination is performed, the Barrett mucosa determination can also be performed, and this determination is very effective for early treatment.
In addition, Fig. 23 shows a flowchart of cardia detection processing in a modification.
In this modification, in the processing flowchart shown in Fig. 19, the branch/crossing point calculation processing of step S74 and the concentration degree calculation processing of the next step S75 are replaced by edge element formed-angle calculation processing in step S81, aimed at detecting an open cardia, and by open cardia determination processing in step S82, which detects (determines) an open cardia from the formed-angle information computed by the edge element formed-angle calculation processing.
Steps S71 to S73 of the processing shown in Fig. 23 are the same as in Fig. 19.
As shown in step S71, edge extraction processing is applied to the image to be processed to generate an edge image; the binarization processing of step S72 is then applied to the edge image to generate a binary image; and the thinning processing of step S73 then generates the thinned image shown in Fig. 24A.
Next, the image binarized in advance with a dark region extraction threshold — applied to the image containing the dark region in order to extract it — is superimposed on the thinned image obtained by the thinning processing above, yielding the image shown in Fig. 24B. The edge element formed-angle calculation processing of step S81 is applied to this image to compute the formed-angle information, i.e., the angle over which the large edge extends.
Then, whether an open cardia is present is determined by the open cardia determination processing of step S82.
Fig. 25 is a flowchart showing in detail the edge element formed-angle calculation processing of step S81 of Fig. 23.
In the first step S83, one point is selected as the characteristic point of the dark region in the image. In the present embodiment, for example, the centroid of the dark region is computed and used as the characteristic point of the dark region in the image.
In the next step S84, the image is divided circumferentially by radial lines, centered on the characteristic point such as the computed centroid or center point, into a plurality of regions, for example M regions.
In the next step S85, a line segment i is extracted (obtained) from the thin lines in the thinned image shown in Fig. 24A.
In the next step S86, the angle θ[i] over which the extracted line segment i extends is computed. That is, the number Ni of regions in which the line segment i is present is counted, and from this count the angle over which the line segment i extends is computed as θ[i] = Ni × (360/M)°.
Fig. 24C shows an example of such a computation. In Fig. 24C, the number Ni of regions occupied by the line segment i, from the 0° dividing line to the 270° dividing line, is 6. That is, in Fig. 24C, the region in which the edge is present (hatched portion) spans a range of 270°.
In the next step S87, it is determined whether there are unprocessed line segments; if there are, the flow returns to step S85, where an unprocessed line segment is obtained and the same processing is performed.
On the other hand, when there is no unprocessed line segment, the edge element formed-angle calculation processing ends. Returning to Fig. 23, using the information on the angles θ[i] generated by the edge element formed-angle calculation processing of step S81, whether the cardia is open is determined by the open cardia determination processing of the next step S82.
This open cardia determination processing determines, for example as shown in step S82a of Fig. 26, the magnitude relationship between the angle θ[i] and a prescribed threshold thre5. That is, it determines whether the condition θ[i] > thre5 is satisfied. When θ[i] > thre5 is satisfied, it is determined in step S82b that the edge is an edge of an open cardia; otherwise, when the condition is not satisfied, it is determined that the edge is not an edge of an open cardia. The cardia detection processing then ends.
An open cardia can thus be detected.
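Steps S84 to S86 and the decision of step S82 can be sketched as follows. This is a minimal illustration under stated assumptions: each line segment is a list of (x, y) pixels, the sector count M and the threshold thre5 are placeholder values (the patent does not give thre5), and the function names are hypothetical.

```python
import math

def occupied_angle(segment, centroid, m=8):
    """Steps S84-S86: divide the plane into M angular sectors around the
    dark-region centroid, count the sectors Ni containing pixels of the
    line segment, and return theta = Ni * (360/M) degrees."""
    cx, cy = centroid
    sectors = set()
    for x, y in segment:
        ang = math.degrees(math.atan2(y - cy, x - cx)) % 360.0
        sectors.add(int(ang // (360.0 / m)))
    return len(sectors) * (360.0 / m)

def is_open_cardia(segments, centroid, m=8, thre5=180.0):
    """Step S82: the image shows an open cardia when some segment
    spans an angle theta[i] greater than thre5."""
    return any(occupied_angle(s, centroid, m) > thre5 for s in segments)
```

An open cardia produces a long edge wrapping most of the way around the dark region, so its segment occupies many sectors and yields a large θ[i], as in the 270° example of Fig. 24C.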
Further, by performing the Barrett esophagus detection processing on the basis of the image in which the cardia was detected, determinations such as that of the Barrett esophagus can be made efficiently.
(the 3rd embodiment)
A third embodiment of the present invention is described below with reference to Figs. 27 to 29. The hardware configuration of the image processing apparatus of the present embodiment is the same as in the first embodiment, and Fig. 1 can be adopted. The functional configuration of the main part realized by the CPU 22 executing the processing program in the present embodiment is shown in Fig. 27. The configuration shown in Fig. 27 additionally includes a processing continuation determination block 48 in the image analysis block 42 of the configuration of Fig. 4. The processing continuation determination block 48 constitutes the first living body feature detection section that detects the first living body feature.
The processing continuation determination block 48 determines whether a point sequence of the epithelial boundary line detected by the epithelial border detection block 44 is present, and controls the operation of the image analysis block 42 according to the determination result.
Fig. 28 shows a flowchart of the processing steps of the present embodiment. The Barrett esophagus determination is performed and the determination result and so on are displayed according to this flowchart.
In the processing method of the flowchart shown in Fig. 28, the processing of detecting the gastroesophageal junction 35 in the processing steps of Fig. 8 in the first embodiment is replaced by processing of detecting the epithelial border 34, which can be detected by simpler processing.
First, a search is made for an image in which the epithelial border 34 is detected; once such an image has been retrieved, the flow moves on to the processing for the Barrett esophagus determination.
These processing steps are described below with reference to the flowchart of Fig. 28. The initial steps S1 to S3 are the same processing as in the flowchart of Fig. 8, so their description is omitted.
After the processing of step S3 on the extracted image with frame number COUNT, detection processing of the epithelial border 34 is performed in the next step S91.
Then, for the image subjected to the epithelial border 34 detection processing of step S91, the processing continuation determination block 48 determines in the next step S92 whether a point sequence representing the line of the epithelial border 34 has been obtained, thereby determining whether the epithelial border 34 is present.
When the determination of step S92 finds that the epithelial border 34 is not present, the flow proceeds to step S6, increments the frame number variable COUNT by 1, returns to step S2, and continues the analysis operation of detecting the epithelial border 34 from step S2 through step S92.
On the other hand, when the determination of step S92 finds that the epithelial border 34 is present, the flow moves from the analysis operation of detecting the epithelial border 34 to the step S93 side, i.e., to the analysis operation of determining the Barrett esophagus.
In step S93, the gastroesophageal junction detection block 45 performs detection processing of the gastroesophageal junction 35 on the image of this frame number.
After the detection processing of the gastroesophageal junction 35, in the next step S94 the Barrett esophagus determination block 46 uses the point sequence representing the line of the gastroesophageal junction 35 detected in step S93 and the point sequence representing the line of the epithelial border 34 detected in step S91 to determine whether the target site in the captured image is a Barrett esophagus.
Then, when it is determined to be a Barrett esophagus, the Barrett esophagus determination block 46 passes the determination result and the frame number to the display processing block 43 in step S95. The display processing block 43 extracts the image of the specified frame number from the buffer memory and superimposes the determination result on this image data, for display as shown in Fig. 7, for example.
In the next step S96, COUNT is incremented by 1. Then, in the next step S97, the image of the next frame number (= COUNT) is obtained again, and detection processing of the epithelial border 34 is performed on this image.
In the next step S98, it is determined whether the epithelial border 34 has been detected by the preceding detection processing.
In this step S98, the processing continuation determination block 48 determines whether a point sequence representing the line of the epithelial border 34 has been obtained by the preceding step S97, thereby determining whether the epithelial border 34 is present.
According to the determination of step S98, when the epithelial border 34 is determined not to be present, the processing loop from step S93 to step S98 — that is, the analysis processing operation on the Barrett esophagus determination side — is exited, and the processing ends.
On the other hand, when the epithelial border 34 is determined to be present, the flow returns to step S93, performs the detection processing of the gastroesophageal junction 35, and continues the Barrett esophagus determination processing. In this way, while the presence of the epithelial border 34 is detected, the processing in the above processing loop is repeated, and when the presence of the epithelial border 34 can no longer be detected, the processing ends. That is, the determination of the biological property is performed only on frame images in which the epithelial border is detected.
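The control flow of Fig. 28 — search until the epithelial border appears, then judge consecutive frames until it disappears — can be sketched as follows. The detector and judgment callbacks here are placeholders standing in for the detection blocks 44 to 46 of the patent; their names and signatures are assumptions for the example.

```python
def analyze_sequence(frames, detect_border, detect_junction, judge_barrett):
    """Sketch of the Fig. 28 loop: scan frames until an epithelial
    border is found (steps S2-S92/S6), then keep judging Barrett
    esophagus on consecutive frames while the border remains
    detectable (steps S93-S98); stop when it disappears."""
    results = {}
    count = 0
    # Search phase: advance COUNT until a border point sequence appears.
    while count < len(frames) and detect_border(frames[count]) is None:
        count += 1
    # Determination phase: judge each frame while the border persists.
    while count < len(frames):
        border = detect_border(frames[count])
        if border is None:
            break  # border lost: exit the loop and end processing
        junction = detect_junction(frames[count])
        results[count] = judge_barrett(border, junction)
        count += 1
    return results
```

Note that a frame after the first border loss is never examined, which is exactly the early-termination behavior that saves processing time relative to the first embodiment.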
According to the present embodiment operating in this way, the detection processing of the epithelial border 34 is performed first; based on an image in which the epithelial border 34 is detected by this detection processing, the flow moves to the processing for determining whether a Barrett esophagus is present; and if the presence of the epithelial border 34 can no longer be detected on that processing side, the processing ends. The Barrett esophagus determination can thus be performed efficiently.
That is, according to the present embodiment, a processing continuation determination section is provided which performs the following control: only images of the periphery of the epithelial border 34 needed for the property determination of the Barrett esophagus are detected, and when the epithelial border 34 can no longer be detected in an image, the Barrett esophagus determination processing ends. The images required for the Barrett esophagus determination processing can thus be extracted without wasted effort, and the determination processing carried out.
That is, compared with the first embodiment, the Barrett esophagus determination processing can be performed in a shorter time and with less wasted effort.
In the present embodiment, after the epithelial border 34 is detected in a frame image, the temporally succeeding frame image is used as the target image of the next epithelial border detection; however, depending on the imaging direction and the like, frame images may also be obtained going back in time and used as detection target images.
Alternatively, the interval at which successive frame images are obtained may be set to N (where N is a natural number 1, 2, 3, ...), with COUNT ← COUNT + N in step S97 designating the next frame image to obtain.
A modification of the present embodiment is described below. If each image is judged one by one as to whether it shows a Barrett esophagus, there is the following possibility: owing to the influence of noise contained in the image data, halation, temporal variation of the light quantity, shadows, and the like, an image that actually captures a Barrett esophagus may be misjudged as not being a Barrett esophagus.
This modification therefore makes an improvement through the processing steps shown in Fig. 29. In the processing steps of the flowchart shown in Fig. 29, steps S1 through S94 are the same processing as in Fig. 28 (however, since the frame count variable Nb is also used in Fig. 29, this variable Nb is initialized to 0 in step S1).
In Fig. 28, the determination result of step S94 is displayed in the next step S95; in this modification, however, after the determination processing of step S94, the detection processing of the epithelial border 34, the detection processing of the gastroesophageal junction 35, and so on are performed for the next frame number, and the Barrett esophagus determination processing is carried out.
In this way, when an image without the epithelial border 34 is reached, whether the site is a Barrett esophagus is comprehensively determined from all the determination results obtained so far from the Barrett esophagus judgments, and this determination result is displayed.
The processing steps are described below with reference to Fig. 29. Steps S1 to S94 are the same as in Fig. 28, so their description is omitted (however, as mentioned above, the further frame count variable Nb must also be initialized to 0 in step S1).
In step S94, whether a Barrett esophagus is present is determined from the location of the epithelial border 34 in step S92 and the detection processing result of the gastroesophageal junction 35 in step S93. The determination result for the Barrett esophagus is then stored temporarily, and the frame count variable Nb is incremented by 1 in the next step S101. Then, in the next step S102, the image with frame number COUNT + Nb, reflecting the incremented Nb, is extracted.
In the next step S103, detection processing of the epithelial border 34 is performed. After this detection processing, it is determined in the next step S104 whether the epithelial border 34 is present; when it is present, the flow returns to step S93, performs the detection processing of the gastroesophageal junction 35, and similarly carries out processing such as the Barrett esophagus determination.
On the other hand, when the epithelial border 34 is determined not to be present, the flow moves to step S105, where all the Barrett esophagus determination results obtained by the processing of steps S93 to S104 are acquired.
Then, in the next step S106, whether the site is a Barrett esophagus is comprehensively determined from all these Barrett esophagus determination results. Further, in the next step S107, the comprehensive determination result is superimposed on the image and displayed, and the processing ends.
The Barrett esophagus determination block 46 performs the comprehensive Barrett esophagus determination of step S106 as follows.
For example, the ratio Na/Nb is computed, where Na is the number of images determined to show a Barrett esophagus and Nb is the number of images subjected to the Barrett esophagus determination. When this ratio Na/Nb is greater than 0.8, the Barrett esophagus determination block 46 determines that the imaged object is a Barrett esophagus, and in step S107 superimposes the information "suspected Barrett esophagus" on all Nb images used in the Barrett esophagus determination.
According to this modification operating in this way, the same effect as the third embodiment is obtained; in addition, the determination results of the individual images are used to make a further comprehensive determination of whether a Barrett esophagus is present, so the Barrett esophagus determination can be made in a state of higher reliability.
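The comprehensive determination of step S106 reduces to a simple majority-style vote over the per-frame results, which can be sketched as follows. The function name is illustrative; the 0.8 ratio threshold is the value given in the text.

```python
def comprehensive_judgement(frame_results, ratio_thresh=0.8):
    """Step S106: combine per-frame Barrett esophagus results.
    frame_results is one boolean per judged frame (Nb frames in all);
    the site is labelled 'suspected Barrett esophagus' when the
    fraction of positive frames, Na/Nb, exceeds ratio_thresh."""
    nb = len(frame_results)
    if nb == 0:
        return False
    na = sum(frame_results)
    return na / nb > ratio_thresh
```

Aggregating over Nb frames in this way makes a single frame corrupted by noise, halation, or shadow far less likely to flip the overall result than in per-frame judgment.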
In the above description, the case of performing image processing on endoscopic images obtained by inserting the endoscope 6, which has an elongated insertion portion, into the body has been described; however, the processing content of each embodiment and each modification above is equally applicable to endoscopic images captured by a capsule endoscope that is swallowed through the mouth and captures images inside the body.
A capsule endoscope is normally an in-vivo imaging device that captures still images continuously, for example at fixed intervals. In this case, after being swallowed, it does not return through the mouth, but captures images while moving through the esophagus 33, the stomach, the small intestine, and the large intestine. Each embodiment and modification above can also be applied in this case.
The present invention also encompasses embodiments configured by partially combining the embodiments and the like described above.
As described above, according to each embodiment and each modification above, when determining a biological property, the detection of the second living body feature can be performed after the first living body feature has been detected; when the first living body feature is not detected, the detection of the second living body feature is omitted, so the image analysis processing can be performed efficiently. Therefore, even with a large amount of image data, the determination of a property of concern such as Barrett esophagus can be performed efficiently.
A lumen image processing apparatus according to an embodiment is described below with reference to the drawings.
(the 4th embodiment)
A lumen image processing apparatus employing a capsule endoscope device according to the fourth embodiment, and its method, are first described with reference to Figs. 30A, 30B, and 31. Figs. 30A and 30B are block diagrams showing the schematic configuration of the capsule endoscope device 101 of the present embodiment and of a terminal device 107 serving as the lumen image processing apparatus.
The capsule endoscope device 101 using the image processing method of the present embodiment has, as shown in Fig. 30A, a capsule endoscope 103, an antenna unit 104, and an external device 105. The capsule endoscope 103, described in detail later, is shaped so that it can be swallowed from the mouth of the patient 102 as the subject into the body cavity and advance through the esophagus and digestive tract, and internally has an imaging function of capturing the interior of the esophagus and digestive tract and generating its captured image information, and a transmission function of transmitting this captured image information outside the body. The antenna unit 104 is placed on the body surface of the patient 102 and has a plurality of receiving antennas 111 that receive the captured image information transmitted from the capsule endoscope 103. The external device 105 has a box-shaped outer form, and has functions of performing various processing on the captured image information received by the antenna unit 104, recording the captured image information, and displaying captured images using the captured image information. On the surface of the housing of this external device 105 are provided a liquid crystal monitor 112 for displaying the captured images and an operation portion 113 for giving various operation instructions. The external device 105 is also provided with an LED for warning display regarding the remaining capacity of the battery serving as the driving power supply, and with a power switch as part of the operation portion 113.
This external device 105 is attached to the body of the patient 102 and, as shown in Fig. 30B, is connected to the terminal device 107 by being mounted on a cradle 106. As the terminal device 107 serving as the lumen image processing apparatus, e.g., as a cardia detection device, a personal computer, for example, is adopted, which comprises: a terminal body 109 having processing functions and memory functions for various data, a keyboard 108a and a mouse 108b for various operation inputs, and a display 108c for displaying various processing results. The basic function of this terminal device 107 is to perform the following image processing: the captured image information stored in the external device 105 is taken in via the cradle 106 and written to the rewritable memory built into the terminal body 109 or to portable memory, such as rewritable semiconductor memory, that is freely detachable from the terminal body 109, and the recorded captured image information is displayed on the display 108c. The terminal device 107 also performs cardia detection processing by the image processing method of the embodiment described later. The captured image information stored in the external device 105 may also be taken into the terminal device 107 by a USB cable or the like instead of via the cradle 106. The cradle 106 and the like constitute an image input section for inputting the images captured by the capsule endoscope 103.
The external shape and internal structure of the capsule endoscope 103 will now be described using Figure 31. The capsule endoscope 103 is formed in a capsule shape and comprises a case member 114 with a U-shaped cross section, and a roughly hemispherical cover 114a, formed of a transparent member, attached watertight by adhesive to the open end at the front of the case member 114.
In the hollow interior of the capsule formed by the case member 114 and the cover 114a, at the inside of the central portion of the hemispherical arc of the cover 114a, an objective lens 115 housed in a lens frame 116 is arranged; the objective lens 115 takes in the image of the observation site incident through the cover 114a. At the imaging position of the objective lens 115, a charge-coupled device (hereinafter CCD) 117 is arranged as the imaging element. Around the lens frame 116 housing the objective lens 115, four white LEDs 118 capable of emitting illumination light are arranged in the same plane (only two LEDs are shown in the figure). In the hollow portion of the case member 114 on the rear end side of the CCD 117 are arranged: a processing circuit 119, which drives and controls the CCD 117 to generate the image pickup signal obtained by photoelectric conversion, performs imaging processing that applies prescribed signal processing to this image pickup signal to generate a captured-image signal, and performs LED drive processing that controls the on/off operation of the LEDs 118; a communication processing circuit 120, which converts the captured-image signal generated by the imaging processing of the processing circuit 119 into a radio signal and transmits it; a transmitting antenna 123, which sends the radio signal from the communication processing circuit 120 to the outside; and a plurality of button batteries 121 that supply driving power to the processing circuit 119 and the communication processing circuit 120. The CCD 117, LEDs 118, processing circuit 119, communication processing circuit 120, and transmitting antenna 123 are arranged on substrates (not shown), and these substrates are connected to one another by flexible boards.
The capsule endoscope 103 moves through the body of the patient 102 while transmitting in-vivo images captured at prescribed time intervals to the external device 105, and the external device 105 records the received endoscopic images in its built-in storage device. The endoscopic images recorded in the external device 105 are transferred via the cradle 106 to the terminal device 107 and stored in a storage device (not shown), and the terminal device 107 performs detection processing for the cardia based on the transferred and stored endoscopic images. This cardia detection processing is executed by image processing software, i.e., a program that performs image processing on the image data of the endoscopic images. This image processing software is executed by a processing device such as the CPU of the terminal device 107.
Because the image processing described below is realized by software, it could be executed in the capsule endoscope 103, the external device 105, or the terminal device 107; here, it is described as being implemented in the terminal device 107, which is a PC. In the description of the image processing, one frame image has a size of ISX × ISY (1 ≤ ISX, ISY; for example ISX = 640, ISY = 480) and consists of three planes, red (R), green (G), and blue (B); the gray level of each pixel in each plane is 8 bits, i.e., takes values from 0 to 255.
When examining the esophagus, for example, the capsule endoscope 103 captures images at a rate of 15 to 30 frames per second (15 fps to 30 fps). That is, the capsule endoscope 103 controls its imaging function so that after passing through the esophagus it switches to a low-speed imaging state with few frames per second. This can be realized by providing a timing circuit (not shown) and controlling the imaging so that while the timer count of this timing circuit is within a prescribed time, imaging is performed in a high-speed imaging state with many frames per second, and after the prescribed time has elapsed, imaging switches to a low-speed imaging state with few frames per second.
Figure 32 is a flowchart showing an example of the processing flow, executed in the terminal device 107 on the obtained series of endoscopic images, for detecting the cardia through detection of the passage through the esophagogastric junction. The series of endoscopic images obtained by imaging after entry through the examinee's mouth consists of a plurality of frames, and the processing of Figure 32 is performed on each frame. Before the processing of Figure 32 is applied, preprocessing such as inverse gamma correction and noise removal is applied to the image data of each endoscopic image.
To begin processing from the first frame of the series of images to which the processing of Figure 32 is applied, the frame number i is first set to 1 (step S201). i is an integer from 1 to n.
Next, the image data of the image Fi with frame number i is read from the storage device (not shown) of the terminal device 107 (step S202). The image Fi consists of the three RGB planes.
Based on the read image data of the image Fi, a tone feature value is calculated; here, the average tone feature value μi is calculated as the prescribed feature value of the endoscopic image (step S203). The average tone feature value μi is the mean of the tone feature values of all pixels contained in each image. Step S203 constitutes a feature value calculation step or feature value calculation unit that calculates, from the values of all pixels of each image Fi, the average tone feature value as the tone feature value.
Next, it is judged whether the average tone feature value μi exceeds a prescribed threshold Th (step S204). When the value of R/(R+G+B), described later, is used as the tone feature value, the threshold Th is for example 0.5.
By adjusting the value of this threshold Th, it can be judged whether the read image Fi is an image captured near the esophagogastric junction (EG junction), which is the boundary portion leading into the digestive tract, an image captured at the central part of the esophagogastric junction, or an image captured after entering the stomach. This point will be described later using Figure 33.
When the result in step S204 is Yes, i.e., when the average tone feature value μi exceeds the prescribed threshold Th, the capsule endoscope 103 is about to enter the esophagogastric junction (EG junction) or is passing through it, and the read image Fi is therefore judged to be an image captured at that time (step S205). Steps S204 and S205 constitute a boundary detection unit that detects the boundary portion of the digestive tract based on the calculated average tone feature value μi, and a judgment unit that judges, based on the detection result that the boundary portion is the esophagogastric junction, that the image is an image of the portion of the lumen from the esophagus to the cardia.
When the result in step S204 is No, i.e., when the average tone feature value μi does not exceed the prescribed threshold Th, it is judged whether the processing of Figure 32 has been completed for the entire series of images to be processed (step S206); when all have been completed (Yes in step S206), the processing ends. When the result in step S206 is No, unprocessed images remain, so i = i+1 is performed (step S207), and thereafter the processing of steps S202 to S204 is repeated for the next image.
As described above, in step S205, when the average tone feature value μi exceeds the prescribed threshold Th, it is judged that this is the moment when the capsule endoscope 103 is passing through the esophagogastric junction (EG junction) or is about to enter it; in other words, it can be judged that the capsule will subsequently reach the cardia or the stomach. The cardia can therefore be detected.
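The per-frame loop of steps S201 to S207 can be sketched in Python as follows. This is a minimal illustration, not the patented implementation: the function names are our own, frames are assumed to be NumPy arrays of shape (H, W, 3), and the dark-pixel/halation exclusion described later in Figure 34 is omitted here.

```python
import numpy as np

TH = 0.5  # example threshold for R/(R+G+B), per the text (step S204)

def average_tone_feature(frame):
    """Mean of R/(R+G+B) over all pixels of an RGB frame (step S203)."""
    rgb = frame.astype(np.float64)
    total = np.maximum(rgb.sum(axis=2), 1e-9)  # guard against all-zero pixels
    return float(np.mean(rgb[:, :, 0] / total))

def detect_eg_junction(frames, th=TH):
    """Return the 1-based frame number of the first image whose average
    tone feature exceeds th (steps S201-S207), or None if none does."""
    for i, frame in enumerate(frames, start=1):
        if average_tone_feature(frame) > th:
            return i
    return None
```

A whitish esophageal image gives R/(R+G+B) near 1/3, while a reddish gastric image gives a value well above 0.5, so the first frame returned approximates the moment of passage through the EG junction.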
Here, the tone of the biological tissue from the esophagus to the stomach will be described. Figure 33 is a schematic graph for explaining the tone change in the obtained series of endoscopic images. In Figure 33, the horizontal axis represents the image number (frame number) along the time sequence of endoscopic images obtained from the esophageal squamous epithelium portion, through the esophagogastric junction, to the stomach, and the vertical axis represents the tone feature value of the endoscopic image corresponding to each image number.
The tone of each image obtained by the capsule endoscope 103 after entering through the examinee's mouth changes as shown in Figure 33. That is, the tone of the squamous epithelium portion RA of the esophagus differs from that of the columnar epithelium RC of the stomach, and in the esophagogastric junction RB between them the tone changes gradually. In Figure 33, for example in the tone feature value R/(R+G+B) calculated from the three RGB pixel values as described later, the esophageal squamous epithelium portion RA is whitish in tone, so the tone feature value is small, while the stomach RC is reddish in tone, so the tone feature value is large. In the esophagogastric junction RB between them, the tone feature value gradually changes from the whitish tone to the reddish tone.
Therefore, when the tone feature value exceeds the prescribed threshold Th in step S204 of Figure 32 during this gradual change (i.e., when the tone becomes red), the image is judged to be an image captured when the capsule endoscope 103 is about to enter the esophagogastric junction (EG junction) or is passing through it; in other words, the boundary portion of the digestive tract, i.e., the cardia, can be detected. That is, the cardia is detected based on the difference in tone between the esophageal mucosa and the gastric mucosa. In particular, the mean value of the tone feature is used in order to perform this judgment reliably.
A specific example of the average tone feature value μi described with Figure 32 is given below.
Figure 34 is a flowchart showing an example of the processing flow of step S203 of Figure 32, performed on the image of each frame when R/(R+G+B), calculated from the three RGB pixel values, is used as the average tone feature value μi. In the processing of Figure 34, the chromaticity rj/(rj+gj+bj) is calculated from the three RGB pixel values rj, gj, bj of each pixel of one frame, and the average tone feature value μi is calculated. j denotes the number used to identify a pixel within the image data of each frame.
First, j = 1, val = 0, and count = 0 are set (step S211). Here, val is a variable used to accumulate the sum of the tone feature values, and count is a variable used to count the pixels used in the calculation of the average tone feature value μi.
Next, in step S212, it is judged whether the j-th pixel belongs to a dark region. Specifically, when the values of the j-th pixel in the R, G, and B images are rj, gj, and bj, if rj ≤ thd, gj ≤ thd, and bj ≤ thd, the pixel is judged to belong to a dark region. Here, thd is the per-color threshold used to determine whether a pixel is a dark-region pixel, and in the present embodiment it is set to, for example, thd = 10. When the j-th pixel is judged to belong to a dark region, the processing proceeds to step S216; when it does not, the processing proceeds to step S213.
Next, in step S213, it is judged whether the j-th pixel is an extremely bright pixel, i.e., a pixel belonging to a halation region. Specifically, if rj ≥ thh, gj ≥ thh, and bj ≥ thh, the pixel is judged to be a halation pixel. Here, thh is the per-color threshold used to determine whether a pixel is a halation pixel, and in the present embodiment it is set to, for example, thh = 200. When the j-th pixel is judged to be a halation pixel, the processing proceeds to step S216; when it is not, the processing proceeds to step S214.
In steps S212 and S213, the thresholds thd and thh for the R, G, and B images are the same values for rj, gj, and bj, respectively; however, since the R image of living mucosa generally tends to be the brightest, the threshold for rj may, for example, be set higher than the thresholds for gj and bj. The thresholds for rj, gj, and bj may also each be set to different values.
In step S214, val = val + rj/(rj+gj+bj) and count = count + 1 are computed. That is, the tone feature value rj/(rj+gj+bj) is added to the variable val that accumulates the sum of the tone feature values, and the variable count is incremented by 1.
In step S215, it is judged whether the processing of steps S212 to S214 has been performed for all pixels. Specifically, when j < ISX × ISY, the pixel number j is incremented by 1 in step S216 (j = j+1), and steps S212 to S214 are performed for the next pixel. When j = ISX × ISY, i.e., when steps S212 to S214 have been performed for all pixels, it is judged whether count is larger than a prescribed threshold thc (step S217). The threshold thc represents the pixel count below which tone evaluation is considered insufficient; if the count is at or above this threshold, a sufficient number of pixels effective for tone evaluation exist. When the result in step S217 is Yes, i.e., when a sufficient number of pixels effective for tone evaluation exist, the average tone feature value μi is calculated by dividing the tone feature value sum val by the pixel count count used in the calculation (step S218). Specifically, μi = val/count. In this way, the average tone feature value μi is calculated from the pixels in the intraluminal image excluding dark-region pixels and halation pixels.
When the result in step S217 is No, i.e., when a sufficient number of pixels effective for tone evaluation do not exist, the image of this frame is treated as an erroneous, abnormal image (step S219), and the average tone feature value μi is set to, for example, 0 (zero), so that it is judged not to exceed the threshold Th in step S204 of Figure 32.
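The calculation of Figure 34 (steps S211 to S219) can be sketched as follows, vectorized rather than written as the per-pixel loop of the flowchart. The default thc value is an illustrative assumption; thd = 10 and thh = 200 follow the text.

```python
import numpy as np

THD = 10    # dark-region threshold (step S212)
THH = 200   # halation threshold (step S213)

def average_tone_feature_masked(frame, thd=THD, thh=THH, thc=16):
    """Average of R/(R+G+B) over pixels that are neither dark nor
    halation (steps S211-S219). Returns 0.0 for an abnormal image
    with too few valid pixels (count <= thc, step S219)."""
    rgb = frame.astype(np.float64)
    r, g, b = rgb[:, :, 0], rgb[:, :, 1], rgb[:, :, 2]
    dark = (r <= thd) & (g <= thd) & (b <= thd)
    halation = (r >= thh) & (g >= thh) & (b >= thh)
    valid = ~dark & ~halation
    count = int(valid.sum())
    if count <= thc:
        return 0.0                          # abnormal image (step S219)
    tone = r[valid] / (r[valid] + g[valid] + b[valid])
    return float(tone.sum() / count)        # mu_i = val / count (step S218)
```

The boolean masks play the role of the branch tests in steps S212 and S213, and the final division corresponds to step S218.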
Various modifications of the present embodiment are described below.
In the first modification: in Figures 32 to 34 above, the judgment of passage through the esophagogastric junction, the cardia, and so on is made from the image of each single frame; however, it may also be judged that the capsule endoscope 103 has passed through the esophagogastric junction or the like when the judgment result of step S204 is μi > Th for a plurality of consecutive images, or for at least a prescribed proportion (for example 80%) of a plurality of consecutive images.
In the second modification: although the above description processes a series of consecutive images, the processing of Figure 32 may also be performed on a single specific image.
In the third modification, the moving average of the average tone feature values μi over a plurality of consecutive images may be calculated, and whether the capsule endoscope 103 has passed through the esophagogastric junction or the like may be judged according to whether this moving average exceeds a prescribed threshold. For example, with m = 2, 3, 4, ... and i ≥ m+1 (m consecutive images are taken from the n images, so that m+1 ≤ i), the moving average is calculated from the average tone feature values μ obtained from the consecutive images F(i-m) through Fi, and it is judged whether this moving average exceeds the prescribed threshold. By using such a moving average, even when reddish images occur within the esophagus due to changes in illumination conditions caused by differences in viewing distance, angle, and so on, the influence of small variations in the average tone feature value can be removed, and whether the capsule endoscope 103 has passed through the esophagogastric junction or the like can be detected with higher accuracy.
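The third modification can be sketched as a smoothing pass over the per-frame feature values. This is an illustrative reading of the text: the window spans F(i-m) through Fi, and the threshold value is our own example.

```python
def moving_average_detect(mu, m=3, th=0.5):
    """Third modification (sketch): smooth the per-frame average tone
    feature values mu over the window F(i-m)..Fi and return the first
    1-based frame number whose smoothed value exceeds th, or None."""
    for i in range(m, len(mu)):          # 0-based i >= m, so F(i-m) exists
        window = mu[i - m : i + 1]       # m+1 consecutive values
        if sum(window) / len(window) > th:
            return i + 1                 # 1-based frame number
    return None
```

A single spuriously red frame in an otherwise white sequence no longer triggers detection, since its contribution is diluted across the window.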
In the fourth modification: in the above example, R/(R+G+B), a ratio of pixel values, is used as the tone feature value, but other parameters based on the three RGB pixel values may also be adopted. Other tone feature parameters include, for example, G/(R+G+B), IHb (= 32 log2(R/G)), hue, and saturation.
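The alternative per-pixel features named above can be computed as follows. The IHb formula follows the text; the zero-value guards and function name are our own assumptions.

```python
import math

def tone_features(r, g, b):
    """Fourth modification (sketch): alternative tone features for one
    pixel. IHb = 32*log2(R/G) as quoted in the text; guards against
    zero values are an implementation assumption."""
    total = r + g + b
    return {
        "r_ratio": r / total if total else 0.0,   # R/(R+G+B)
        "g_ratio": g / total if total else 0.0,   # G/(R+G+B)
        "ihb": 32.0 * math.log2(r / g) if r > 0 and g > 0 else 0.0,
    }
```

IHb grows with the redness of the mucosa, so thresholding it plays the same role as thresholding R/(R+G+B) in step S204.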
In the fifth modification, a plurality of tone feature values may also be used. For example, in step S203 of Figure 32, both R/(R+G+B) and G/(R+G+B), ratios of pixel values calculated from the three RGB pixel values, are used, and their mean values are calculated: the mean μ1i of the tone feature value R/(R+G+B) over all pixels in each image, and the mean μ2i of the tone feature value G/(R+G+B). Then, in step S204, it may be judged whether μ1i > Th1 and μ2i < Th2 hold for the mean values μ1i and μ2i.
In the sixth modification, whether the capsule endoscope 103 has passed through the esophagogastric junction or the like may also be detected from the amount of change of the average tone feature value. That is, instead of judging whether the average tone feature value obtained from each image of the consecutive series exceeds a prescribed threshold, it is judged whether the amount of change of the average tone feature value between two images exceeds a prescribed threshold. In other words, the average tone feature value of each image may be compared with, for example, that of a preceding or following image, and if the difference between these two average tone feature values exceeds a prescribed threshold, it is judged that, for example, the capsule endoscope 103 has entered the esophagogastric junction from the esophagus, or has entered the stomach from the esophagogastric junction. It is judged whether the difference (μi − μ(i-m1)) between the average tone feature value μ(i-m1) of the image F(i-m1) and the value μi of the image Fi shows a change equal to or greater than a prescribed threshold, where m1 is 1, 2, 3, ....
This is because, due to individual differences in mucosal color, the presence of lesions such as Barrett's esophagus, variations in the imaging system, and other causes, the tone of the mucosa is not always constant; by this method, whether the capsule endoscope 103 has passed through the esophagogastric junction or the like can be judged without being affected by such individual differences.
The change in the average tone feature value may also be detected by calculating the differential value of the average tone feature value.
Figure 35 is a flowchart showing an example of the processing flow for detecting the change in the average tone feature value by calculating its differential value.
As explained for the processing of Figure 32, preprocessing such as inverse gamma correction and noise removal is applied to the image data of each image before the processing of Figure 35 is performed. The processing of steps S201 to S203 is identical to steps S201 to S203 of Figure 32. That is, to begin processing from the first frame, the frame number i is first set to 1 (step S201). Next, the image data of the image Fi with frame number i is read from the storage device (not shown) of the terminal device 107 (step S202). The average tone feature value μi is calculated from the read image data of the image Fi (step S203).
It is then judged whether all images have been processed (step S221); if not, the result in step S221 is No, i = i+1 is performed (step S207), and the processing returns to step S202.
When all images have been processed (Yes in step S221), the moving average f(μi) is calculated for the obtained average tone feature values μi, smoothing them over a prescribed range, i.e., over a prescribed number of consecutive images (step S222). Then, the differential value Δf(μi) is calculated from the time course of the moving average f(μi) (step S223).
Then, the image Fi corresponding to a detected differential value Δf(μi) that exceeds a prescribed threshold thf is determined (step S224). Steps S203 to S224 constitute the detection unit that detects the boundary portion of the digestive tract.
As described above, images in which the amount of tone change across a plurality of images exceeds a prescribed threshold can be detected; therefore, even when there are individual differences in mucosal color and the like, whether the capsule endoscope 103 has passed through the esophagogastric junction or the like can be judged without being affected by such individual differences.
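The smoothing-and-differentiation flow of Figure 35 (steps S222 to S224) can be sketched as below. The window size and threshold thf are illustrative assumptions, and the returned indices use the position in the smoothed sequence as a stand-in for the corresponding frame Fi.

```python
def detect_by_derivative(mu, window=3, thf=0.1):
    """Figure 35 (sketch): smooth the per-frame average tone feature
    values mu with a moving average (step S222), take the discrete
    derivative of the smoothed curve (step S223), and return the
    1-based positions where the derivative exceeds thf (step S224)."""
    n = len(mu)
    if n < window + 1:
        return []
    smooth = [sum(mu[i : i + window]) / window
              for i in range(n - window + 1)]
    deriv = [smooth[i + 1] - smooth[i] for i in range(len(smooth) - 1)]
    return [i + 1 for i, d in enumerate(deriv) if d > thf]
```

Because the derivative responds to change rather than to absolute level, a uniformly redder mucosa (e.g., from individual differences) shifts the whole curve but leaves the detected positions unchanged.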
In the seventh modification, the standard deviation or the variance of the tone feature value may be used instead of its mean.
For example, Figure 36 is a graph for explaining the change in the standard deviation or variance of the tone feature value of the obtained series of endoscopic images. In Figure 36, the horizontal axis represents the image number (frame number) along the time sequence of endoscopic images obtained from the esophageal squamous epithelium portion, through the esophagogastric junction, to the stomach, and the vertical axis represents the standard deviation σi or variance vi of the tone feature value of the endoscopic image corresponding to each image number.
While the tone of each image obtained by the capsule endoscope 103 after entering through the examinee's mouth changes as shown in Figure 33, the standard deviation or variance of the calculated tone feature value R/(R+G+B) changes as shown in Figure 36. That is, since the tone within an image is respectively uniform in the squamous epithelium portion RA of the esophagus and in the columnar epithelium RC of the stomach, the standard deviation σi or variance vi of the tone feature value R/(R+G+B) is small there, whereas in the esophagogastric junction RB between them, σi or vi is large.
Therefore, from the standard deviation σi or variance vi of the tone feature value of each image, it can be judged whether the capsule endoscope 103 is about to pass through the esophagogastric junction or the like.
Instead of the standard deviation σi or variance vi of the tone feature value, their coefficient of variation (= standard deviation σi / average tone feature value μi) may also be used.
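The dispersion statistics of the seventh modification can be sketched per frame as follows; masking of dark and halation pixels (Figure 34) is omitted here for brevity, and the function name is our own.

```python
import numpy as np

def tone_dispersion(frame):
    """Seventh modification (sketch): standard deviation, variance, and
    coefficient of variation of the tone feature R/(R+G+B) over one
    RGB frame."""
    rgb = frame.astype(np.float64)
    total = np.maximum(rgb.sum(axis=2), 1e-9)
    tone = rgb[:, :, 0] / total
    mu = float(tone.mean())
    sigma = float(tone.std())
    return sigma, sigma ** 2, (sigma / mu if mu else 0.0)
```

A frame entirely of esophageal or gastric mucosa yields a near-zero σi, while a frame straddling the EG junction, mixing whitish and reddish tones, yields a clearly larger σi.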
In the eighth modification: although in the example described above the pixel data of all pixels of each frame image is used, it is also possible, as shown in Figure 37, not to take all pixels as the processing object but to sample only the pixels in prescribed regions of each frame. Figure 37 is a diagram showing an example of the regions within the image 131 of each frame on which the image processing of the present embodiment and its modifications is performed.
The image 131 of each frame is divided into prescribed regions. In Figure 37, each image 131 is divided into 16 rectangular regions, and of these divided regions, only the prescribed regions (R2, R3, R5, R8, R9, R12, R14, R15), i.e., only the regions of interest (ROI), are subjected to the above processing. In particular, since the esophagus is a luminal organ, the region of interest (ROI) may be set so as to exclude the central part of the visual field, in order to calculate the tone of the mucosal surface more accurately.
Therefore, if only the region of interest (ROI) is taken as the processing object, the amount of computation can be reduced and the processing can be speeded up.
Furthermore, to speed up the processing even more when only the region of interest (ROI) is the processing object, it is also possible not to process all frames but to take as the processing object only the ROI pixels of every k-th frame (k = 1, 2, 3, ...). This is because, particularly in the esophagus, the number of frames captured per second is large, so an accurate judgment can still be made even if the frames are sampled at intervals of several frames.
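The ROI restriction of the eighth modification can be sketched with a boolean mask over the 4×4 block grid of Figure 37. The row-major block numbering (R1 top-left through R16 bottom-right) and the use of a mask are our own implementation assumptions; the set of selected blocks follows the text.

```python
import numpy as np

# Selected blocks per Figure 37 (assumed row-major numbering R1..R16).
ROI_BLOCKS = {2, 3, 5, 8, 9, 12, 14, 15}

def roi_tone_feature(frame):
    """Eighth modification (sketch): average R/(R+G+B) computed only
    over the ROI blocks of a frame whose height and width are
    divisible by 4."""
    h, w, _ = frame.shape
    bh, bw = h // 4, w // 4
    mask = np.zeros((h, w), dtype=bool)
    for block in ROI_BLOCKS:
        row, col = divmod(block - 1, 4)
        mask[row * bh:(row + 1) * bh, col * bw:(col + 1) * bw] = True
    rgb = frame.astype(np.float64)
    tone = rgb[:, :, 0] / np.maximum(rgb.sum(axis=2), 1e-9)
    return float(tone[mask].mean())
```

Only half the blocks are examined, roughly halving the per-frame computation; frame skipping (every k-th frame) can be layered on top by slicing the frame list, e.g. `frames[::k]`.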
As described above, according to the present embodiment (including its modifications), it can be judged from the tone feature value of the intraluminal images whether each image is an image captured when the capsule endoscope is about to enter the esophagogastric junction (EG junction) or is passing through it.
In the present embodiment, threshold processing is applied to the calculated feature value to detect whether each image is an image captured when the capsule endoscope is about to enter or is passing through the esophagogastric junction (EG junction), but this detection may also be performed using a discriminant function such as a known linear discriminant function. Feature values from other embodiments may also be used in combination.
(Fifth Embodiment)
A cardia detection apparatus employing a capsule endoscope device, and a method therefor, according to the fifth embodiment are described below with reference to the drawings. The endoscopic images that are the object of image processing in the present embodiment are, as in the fourth embodiment, the consecutive series of endoscopic images obtained by the capsule endoscope apparatus 101; the structure of the cardia detection apparatus is the same as in the fourth embodiment, so its description is omitted.
Whereas the fourth embodiment described above uses the tone feature value, the lumen image processing apparatus of the present embodiment, i.e., the cardia detection apparatus, differs from the fourth embodiment in that it uses the brightness information of each image to judge whether each image is an image captured when the capsule endoscope is passing through the esophagogastric junction (EG junction) or the like.
Figure 38 is a schematic graph for explaining the change in brightness, specifically in luminance, of the obtained series of endoscopic images. In Figure 38, the horizontal axis represents the image number (frame number) along the time sequence of endoscopic images obtained from the esophageal squamous epithelium portion, through the esophagogastric junction, to the stomach, and the vertical axis represents the luminance of the endoscopic image corresponding to each image number.
The brightness of each image obtained by the capsule endoscope 103 after entering through the examinee's mouth, represented by the luminance of the image, changes as shown in Figure 38. That is, the luminance differs between the squamous epithelium portion RA of the esophagus and the columnar epithelium RC of the stomach, and the luminance in the esophagogastric junction RB between them differs in turn from both. As shown in Figure 38, for example in the luminance calculated from the three RGB pixel values, the esophageal squamous epithelium portion RA, being a narrow luminal organ in which the distance to the mucosal wall is short, has a large average luminance value once dark regions and halation regions are excluded, whereas the luminance in the stomach RC is relatively small. In the esophagogastric junction RB between them, since in a lumen such as the esophagus the closed cardia is viewed from the front, the luminance value is larger than in the esophageal squamous epithelium portion RA.
Therefore, when the brightness information of the images exceeds the prescribed threshold Th11 during this gradual change, it can be judged that the capsule is passing through the esophagogastric junction (EG junction), is about to enter it, and so on. That is, based on the difference in brightness information in the captured images, the closed cardia can be detected when the capsule endoscope 103 is about to enter or is passing through the esophagogastric junction (EG junction). In particular, in order to perform this judgment reliably, the average luminance value or the like can be used as the brightness information.
Figure 39, like Figure 32, is a flowchart showing an example of the processing flow, executed in the terminal device 107 on the obtained series of endoscopic images, for detecting the cardia at the time of passage through the esophagogastric junction. The processing of Figure 39 is basically the same as that of Figure 32: the series of endoscopic images obtained by imaging after entry through the examinee's mouth consists of a plurality of frames, and the processing of Figure 39 is performed on each frame. Before the processing of Figure 39 is applied, preprocessing such as inverse gamma correction and noise removal is applied to the image data of each endoscopic image. In Figure 39, processing identical to that of Figure 32 is given the same step numbers and its description is simplified. Although the following description uses the luminance value as the brightness information, when the G or B pixel data value is used as the brightness information, only the threshold values differ and whether the pixel data falls below the threshold is judged in the same manner, so that description is omitted.
First, to begin processing from the first frame of the series of images to which the processing of Figure 39 is applied, the frame number i is set to 1 (step S201), and then the image data of the image Fi with frame number i is read from the storage device (not shown) of the terminal device 107 (step S202).
The average luminance value Ii is calculated from the read image data of the image Fi (step S233). The average luminance value Ii is the mean of the luminance values of all pixels contained in each image. Here, the luminance value I is, as described above, a feature value representing the brightness of the image, and is calculated, for example, as I = 0.6R + 0.3G + 0.1B. Step S233 constitutes a feature value calculation step or feature value calculation unit that calculates, from the luminance values of all pixels of each image Fi, the average luminance value as the feature value of the brightness information.
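Step S233 can be sketched as follows. The weighting I = 0.6R + 0.3G + 0.1B follows the text; the exclusion of dark and halation pixels mentioned for reliable judgment is omitted here for brevity, and the function name is our own.

```python
import numpy as np

def average_luminance(frame):
    """Fifth embodiment (sketch): mean luminance of an RGB frame using
    I = 0.6R + 0.3G + 0.1B as quoted in the text (step S233)."""
    rgb = frame.astype(np.float64)
    lum = 0.6 * rgb[:, :, 0] + 0.3 * rgb[:, :, 1] + 0.1 * rgb[:, :, 2]
    return float(lum.mean())
```

Thresholding this value against Th11 (step S234) then takes the place of the tone-feature threshold test of step S204 in the fourth embodiment.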
It is then determined whether the average luminance value Ii exceeds a prescribed threshold Th11 (step S234).
By adjusting the value of this threshold Th11, it can be determined whether the read image Fi was captured near the entrance of the gastroesophageal junction (EG junction), or was captured at the central part of the gastroesophageal junction.
When step S234 yields "Yes", that is, when the average luminance value Ii exceeds the prescribed threshold Th11, the image Fi is determined to be an image captured when the capsule endoscope 103 is about to enter, or is passing through, the gastroesophageal junction (EG junction) (step S205). Steps S234 and S205 constitute a boundary detection section that detects a boundary of the digestive tract from the calculated average luminance value Ii, and a determination section that, from the detection result that the boundary is the gastroesophageal junction, determines that the image shows the portion of the lumen from the esophagus to the cardia.
When step S234 yields "No", that is, when the average luminance value Ii does not exceed the prescribed threshold Th11, it is determined whether the processing of Figure 39 has been completed for all images of the series to which it is to be applied (step S206); when all have been processed ("Yes" in step S206), the processing ends. When step S206 yields "No", unprocessed images remain, so i = i + 1 is performed (step S207) and the processing from step S202 to S234 is then repeated. Steps S233 to S205 constitute a detection section that detects a boundary of the digestive tract.
In the present embodiment as well, when in step S205 the brightness information, i.e. the average luminance value Ii, exceeds the prescribed threshold Th11, the image Fi is determined to be an image captured when the capsule endoscope 103 is about to enter, or is passing through, the gastroesophageal junction (EG junction); in other words, it can be determined that the endoscope will subsequently pass through the closed cardia or reach the stomach. The cardia can therefore be detected.
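The decision loop of steps S202 through S234 amounts to scanning the per-frame average luminance values against the threshold Th11. A hedged Python sketch (the threshold value and frame data in the usage example are invented for illustration):

```python
def detect_eg_junction_frames(avg_brightness, th11):
    """Return 1-based frame numbers whose average luminance exceeds Th11,
    i.e. frames judged to be at or passing the EG junction (steps S234/S205)."""
    return [i for i, ii in enumerate(avg_brightness, start=1) if ii > th11]
```

For instance, detect_eg_junction_frames([40, 55, 130, 142], 100) flags frames 3 and 4 as candidate EG-junction images.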
A concrete example of the average luminance value Ii used in Figure 39 is described below.
Figure 40 is a flowchart showing an example of the processing flow of step S233 of Figure 39 when the value (0.6R + 0.3G + 0.1B) calculated from the three RGB pixel values of each frame's image is used as the average luminance value Ii. In the processing of Figure 40, the luminance value (0.6R + 0.3G + 0.1B) is calculated from the three RGB pixel values rj, gj, bj of each pixel of a frame, and the average luminance value Ii is calculated from these values.
Since Figure 40 includes processing steps identical to those of Figure 34, identical steps are given the same step numbers and their description is simplified. j denotes the number identifying a pixel within the image data of each frame.
First, j = 1, val1 = 0, and count1 = 0 are set (step S241). Here, val1 is a variable for accumulating the sum of the brightness feature quantity, and count1 is a variable for counting the number of pixels used in the calculation of the average luminance value Ii.
Next, in step S212, it is determined whether the j-th pixel belongs to a dark region. When the j-th pixel is determined to belong to a dark region, the flow proceeds to step S216; otherwise, it proceeds to step S213.
Then, in step S213, it is determined whether the j-th pixel is extremely bright, that is, whether it belongs to a halation region. When the j-th pixel is a halation pixel, the flow proceeds to step S216; otherwise, it proceeds to step S244.
In steps S212 and S213, the thresholds thd and thh for the R, G, and B images are the same for each of the pixel values rj, gj, and bj; however, since the R image of living mucosa generally tends to be the brightest, the threshold for rj may, for example, be set higher than the thresholds for gj and bj. The thresholds for rj, gj, and bj may also each be set to different values.
In step S244, the operations val1 = val1 + (0.6rj + 0.3gj + 0.1bj) and count1 = count1 + 1 are performed. The brightness feature quantity, i.e. the luminance value (0.6rj + 0.3gj + 0.1bj), is added to the variable val1 that accumulates the sum of the brightness feature quantity, and the variable count1 is incremented by 1.
Then, in step S215, it is determined whether the processing from step S212 to step S244 has been performed for all pixels. When it has not yet been performed for all pixels, the pixel number j is incremented by 1 (j = j + 1) in step S216, and the processing from step S212 to step S244 is performed on the next pixel. When the processing from step S212 to step S244 has been performed for all pixels, it is determined whether count1 is larger than a prescribed threshold thc (step S217). When a sufficient number of pixels valid for the brightness evaluation exist, the average luminance value Ii is calculated by dividing the variable val1 by the variable count1 (step S218): specifically, Ii = val1/count1.
When a sufficient number of pixels valid for the brightness evaluation does not exist, the image of that frame is judged to be an erroneous, i.e. abnormal, image (step S219); the average luminance value Ii is set, for example, to 0 (zero), so that in step S234 of Figure 39 it is judged not to exceed the threshold Th11.
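The flow of Figure 40 — skip dark-region and halation pixels, accumulate luminance, and fall back to 0 for abnormal frames — can be sketched as follows. The threshold values thd, thh, and thc are illustrative assumptions (the patent does not fix them), as is the use of min/max over the three channels for the dark and halation tests:

```python
def average_brightness_filtered(pixels, thd=20, thh=240, thc=10):
    """Average luminance excluding dark-region and halation pixels (Figure 40).
    thd/thh/thc are assumed values, not taken from the patent."""
    val1, count1 = 0.0, 0
    for r, g, b in pixels:
        if min(r, g, b) <= thd:    # step S212: dark-region pixel, skip
            continue
        if max(r, g, b) >= thh:    # step S213: halation pixel, skip
            continue
        val1 += 0.6 * r + 0.3 * g + 0.1 * b   # step S244
        count1 += 1
    if count1 <= thc:              # steps S217/S219: abnormal frame
        return 0.0
    return val1 / count1           # step S218: Ii = val1 / count1
```

A frame dominated by dark or saturated pixels thus returns 0.0 and is never judged to exceed Th11 in step S234.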
The above description dealt with the case where the cardia is closed, but the processing of the present embodiment is also applicable to the case where it is open, and the cardia can be detected according to whether the luminance value exceeds a prescribed threshold.
Various modifications of the present embodiment are described below.
In the first modification: in Figures 39 and 40 above, the passage of the gastroesophageal junction, the cardia, and so on is judged from the image of each individual frame, but the capsule endoscope 103 may instead be judged to have passed the gastroesophageal junction and so on when the determination result of step S234 is Ii > Th11 for a plurality of consecutive images, or for at least a prescribed proportion (for example 80%) of a plurality of consecutive images.
In the second modification: although the above description processes a plurality of consecutive images, the processing of Figure 39 may also be performed on a specific single image.
In the third modification, a moving average of the average luminance values Ii over a plurality of consecutive images may be calculated, and whether the capsule endoscope 103 has passed the gastroesophageal junction and so on may be judged according to whether this moving average exceeds a prescribed threshold. For example, when m = 2, 3, 4, ... and i >= m + 1 (consecutive images are taken from the n images, with m + 1 <= i), the moving average is calculated from the average luminance values of the consecutive images F(i-m) through Fi, and it is judged whether this moving average exceeds the prescribed threshold. By using such a moving average, even for images such as strongly reddish intraesophageal images produced by changes in illumination conditions due to differences in viewing distance, angle, and so on, the influence of small variations in the average luminance value can be removed, and the passage of the gastroesophageal junction by the capsule endoscope 103 can be detected with higher accuracy.
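The moving average over F(i-m) through Fi described in the third modification can be sketched as below; the window parameter m follows the text, while the 0-based indexing of the output is an implementation choice:

```python
def moving_average_brightness(values, m):
    """Moving average of average luminance over the consecutive images
    F(i-m) .. Fi (third modification). Returns one value per index i >= m."""
    return [sum(values[i - m:i + 1]) / (m + 1) for i in range(m, len(values))]
```

For example, with m = 2 the sequence [10, 20, 30, 40] smooths to [20.0, 30.0], damping single-frame fluctuations before the threshold comparison.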
In the fourth modification, as the brightness feature quantity in the above example, the G or B pixel data may be used instead of the luminance calculated from the three RGB pixel values described above.
Figure 41 is a schematic graph showing the variation of the G or B pixel data of a series of endoscopic images, illustrating the case where the G or B pixel data is used as the brightness information of the image instead of the luminance calculated from the three RGB pixel values. In Figure 41, the horizontal axis represents the image number (frame number) in the chronological order of the endoscopic images obtained from the squamous epithelium of the esophagus, through the gastroesophageal junction, to the stomach, and the vertical axis represents the G or B pixel data value of the endoscopic image corresponding to each image number.
That is, as shown in Figure 41, the G or B pixel data values differ between the squamous epithelium region RA of the esophagus and the columnar epithelium region RC of the stomach; furthermore, at the gastroesophageal junction RB between them, the G or B pixel data value differs from both the squamous epithelium region RA of the esophagus and the columnar epithelium region RC of the stomach. Specifically, across the gastroesophageal junction RB, from the whitish squamous epithelium region RA of the esophagus with large pixel data values to the columnar epithelium and fundic gland mucosa region RC of the stomach with small pixel data values, the G or B pixel data value gradually decreases.
Therefore, when the brightness information of the image falls below a prescribed threshold Th12 in the course of this gradual change, it is determined that the capsule endoscope 103 is passing through, or is about to enter, the gastroesophageal junction (EG junction). That is, the passage of the gastroesophageal junction by the capsule endoscope 103 is judged from the difference in the brightness information of the captured images. In particular, to make this judgment reliably, the mean of the G or B pixel data values or the like can be used as the brightness information.
In the fifth modification, a plurality of pieces of brightness information may be used. For example, in step S233 of Figure 39, both the pixel luminance calculated from the three RGB pixel values and the G pixel data are used, and their means are calculated: the average luminance value I1i of all pixels in each image and the mean I2i of the G pixel data. Then, in step S234, it may be determined whether the average luminance value I1i and the mean G pixel data value I2i satisfy I1i > Th13 and I2i < Th14.
In the sixth modification, the passage of the gastroesophageal junction by the capsule endoscope 103 and so on may be detected from the amount of change in the brightness information. That is, instead of judging whether the brightness information obtained from each image of the consecutive series exceeds a prescribed threshold, it may be judged whether the amount of change in brightness information between two consecutive images exceeds a prescribed threshold. In other words, the brightness information of each image, for example its average luminance value, may be compared with that of the preceding or following image, and if the difference between these two average luminance values exceeds a prescribed threshold, it is judged that the capsule endoscope 103 has, for example, entered the gastroesophageal junction from the esophagus, or entered the stomach from the gastroesophageal junction. It is judged whether the difference (Ii - I(i-m1)) between the average luminance value I(i-m1) of image F(i-m1) and the average luminance value Ii of image Fi has changed by at least the prescribed threshold, where m1 = 1, 2, 3, ...
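The sixth modification's frame-to-frame difference test can be sketched as follows; the default values of m1 and th are illustrative assumptions:

```python
def detect_brightness_jumps(values, m1=1, th=50.0):
    """Indices i where |Ii - I(i-m1)| exceeds a prescribed threshold
    (sixth modification). m1/th defaults are assumed, not from the patent."""
    return [i for i in range(m1, len(values))
            if abs(values[i] - values[i - m1]) > th]
```

A jump in average luminance between adjacent frames, rather than its absolute level, then marks the transition, which is what makes the method robust to individual differences in mucosal color.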
This is because, owing to individual differences in mucosal color, the presence of lesions such as Barrett's esophagus, variations in the imaging system, and other factors, the color tone of the mucosa is not always constant; with this approach, whether the capsule endoscope 103 has passed the gastroesophageal junction and so on can be judged without being affected by such individual differences.
The change in brightness information may also be detected at this time by calculating the differential value of the average luminance value.
Figure 42 is a flowchart showing an example of the processing flow for detecting the change in brightness by calculating this differential value of the average luminance value. Steps identical to those of Figure 35 are given the same step numbers and their description is simplified.
As explained for the processing of Figure 39, the image data of each image is subjected to preprocessing such as inverse gamma correction and noise removal before the processing of Figure 42 is performed. The processing from step S201 to step S233 is identical to the processing from step S201 to step S233 of Figure 39.
It is determined whether all images have been processed, that is, whether all images are finished (step S221); if all images are not finished, step S221 yields "No", i = i + 1 is performed (step S207), and the processing moves to step S202.
When all images have been finished, step S221 yields "Yes", and for the obtained plurality of average luminance values Ii, a moving average f(Ii) is calculated for smoothing over a prescribed range, that is, over a range of a prescribed number of consecutive images (step S252). Then, the differential value Δf(Ii) is calculated from the time course of the moving average f(Ii) (step S253).
Then, the image Fi corresponding to a differential value Δf(Ii) detected as exceeding a prescribed threshold thf1 is identified (step S254). Steps S233 to S254 constitute a detection section that detects a boundary of the digestive tract.
As described above, an image at which the amount of change in brightness between a plurality of images exceeds a prescribed threshold can be detected, so even when individual differences in mucosal color and the like exist, the passage of the gastroesophageal junction by the capsule endoscope 103 and so on can be judged without being affected by such individual differences.
In the seventh modification, the cardia in the closed state may be detected from the distribution of brightness. For example, instead of the mean of the brightness information, the standard deviation or variance may be used, as in the fourth embodiment. From the standard deviation or variance of the brightness information in the obtained series of endoscopic images, it can be judged that the capsule endoscope 103 is about to pass the gastroesophageal junction and so on. Specifically, the standard deviation of the brightness of the R image data is obtained, and if this standard deviation is smaller than a prescribed threshold, it is determined that the closed cardia is being viewed from the front. This is because the light in the image is more uniform when the cardia in the closed state is viewed from the front. Instead of the standard deviation or variance of the average luminance value, their coefficient of variation (= standard deviation / average luminance value) may also be used.
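The brightness-distribution statistics of the seventh modification can be sketched as follows; the standard-deviation threshold is an illustrative assumption:

```python
def brightness_statistics(values):
    """Mean, standard deviation, and coefficient of variation of per-pixel
    (or per-frame) brightness values (seventh modification)."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    sd = var ** 0.5
    cv = sd / mean if mean else 0.0
    return mean, sd, cv

def is_closed_cardia_front_view(values, sd_threshold=5.0):
    # Uniform illumination (small spread) suggests a front view of the
    # closed cardia; sd_threshold is an assumed value.
    return brightness_statistics(values)[1] < sd_threshold
```

A near-uniform frame passes the test, while a frame mixing bright mucosa and a dark lumen does not.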
In the eighth modification: although the pixel data of all pixels of each frame's image is used in the examples described above, as explained for the fourth embodiment with reference to Figure 37, instead of taking all pixels as the processing target, only sampled pixels within a prescribed region of each frame may be taken as the processing target. Figure 37 shows an example of the region, within the image of each frame, on which the image processing of the present embodiment and its modifications is performed.
The image of each frame is divided into prescribed regions. In Figure 37, each image is divided into 16 rectangular regions, and among these divided regions, the above processing is performed only on preset regions (R2, R3, R5, R8, R9, R12, R14, R15), that is, only on the region of interest (ROI). Specifically, since the esophagus is a luminal organ, the somewhat dark central part of the visual field can be excluded when setting the region of interest (ROI) in order to calculate the color tone of the mucosal surface more accurately.
Therefore, if only the region of interest (ROI) is taken as the processing target, the amount of computation can be reduced and the processing can be sped up.
Furthermore, to speed up the processing still further when only the region of interest (ROI) is taken as the processing target, not every frame need be processed; instead, only the pixels of the region of interest (ROI) of frames taken at intervals of k frames (k = 1, 2, 3, ...) may be taken as the processing target. This is because the number of frames captured per second is particularly large in the esophagus, so an accurate judgment can be made even when frames are sampled at intervals of several frames.
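The ROI sampling and frame-interval sampling of the eighth modification can be sketched as below. The 16-way split follows the text, but details such as row-major region numbering and the exact interval convention are assumptions:

```python
def sample_roi_pixels(frame, roi_indices, grid=4):
    """Keep only pixels of the preset ROI sub-regions (Figure 37).
    frame: 2-D list of pixel values; roi_indices: region numbers 1..grid*grid,
    assumed to be numbered row-major (an illustrative assumption)."""
    h, w = len(frame), len(frame[0])
    rh, rw = h // grid, w // grid
    out = []
    for idx in roi_indices:
        r0 = ((idx - 1) // grid) * rh
        c0 = ((idx - 1) % grid) * rw
        for r in range(r0, r0 + rh):
            out.extend(frame[r][c0:c0 + rw])
    return out

def every_kth_frame(frames, k):
    # Process only one frame out of every k+1 (interval sampling).
    return frames[::k + 1]
```

Restricting the computation this way trades a small loss of data for a large reduction in the number of pixels and frames examined.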
As described above, according to the present embodiment (including its modifications), whether each image is an image captured when the capsule endoscope is about to enter, or is passing through, the gastroesophageal junction (EG junction) can be judged from the brightness information of the image.
In the present embodiment, whether each image is an image captured when the capsule endoscope is about to enter, or is passing through, the gastroesophageal junction (EG junction) is detected by applying threshold processing to the calculated feature quantity, but this detection may also be performed using, for example, a discriminant function such as a known linear discriminant function. The feature quantities of the other embodiments may also be used in combination.
(Sixth Embodiment)
A cardia detection apparatus using a capsule endoscope apparatus according to the sixth embodiment, and its method, are described below with reference to the drawings. As in the fourth embodiment, the endoscopic images that are the object of the image processing in the present embodiment are a consecutive series of endoscopic images obtained by the capsule endoscope apparatus 101; the structure of the cardia detection apparatus is the same as in the fourth embodiment, and its description is therefore omitted.
While the fourth embodiment described above uses a color tone feature quantity, the luminal image processing apparatus, i.e. the cardia detection apparatus, of the present embodiment is characterized in that, by detecting the cardia in the open state, it judges whether each image is an image captured when the capsule endoscope is in the vicinity of the near side of the cardia, heading from the esophagus toward the stomach.
Figure 43 shows an example of the image when the capsule endoscope 103 is located on the near side of the open cardia.
When the capsule endoscope 103 images the open cardia in the lumen, the luminance of the open cardia is significantly lower than that of its surroundings. As shown in Figure 43, while the cardia 132 is open, the open cardia 132 appears as a dark region in the image 131 captured by the capsule endoscope 103. Accordingly, as the capsule endoscope 103 passes from the esophagus through the gastroesophageal junction (EG junction) and approaches the cardia, the area of the cardia 132 in the image 131 increases. In the present embodiment, when the area of the open cardia region exceeds a prescribed size, it is determined that the capsule endoscope 103 is about to pass through the cardia from the esophageal side into the stomach.
Figure 44 is a flowchart showing an example of the processing flow for detecting the open cardia from the obtained series of endoscopic images. The series of endoscopic images captured after insertion through the subject's mouth consists of a plurality of frames, and the processing of Figure 44 is performed on each frame. Before the processing of Figure 44 is performed, the image data of each endoscopic image is subjected to preprocessing such as inverse gamma correction and noise removal.
To start processing from the first frame of the series of images to which the processing of Figure 44 is to be applied, the frame number i is first set to 1 (step S261). i is an integer from 1 to n.
Then, the R image data of the image Fi with frame number i is read from the storage device (not shown) of the terminal device 107 (step S262). The image Fi consists of three planes, R, G, and B; here, only the R image data is read.
Although the R image data is read here for the cardia judgment described later, the G image data or B image data of the image Fi may also be used.
Binarization for dark-region pixels is applied to all pixels of the R image data of the read image Fi (step S263). Specifically, the pixel value of each pixel is compared with a prescribed threshold Th2 for binarization: dark-region pixels are set to 1 and all other pixels are set to 0 (zero). Letting j be the number identifying a pixel within the image data of each frame, the value rj of each pixel is checked against the threshold Th2, so that pixels with values less than Th2 are set to 1 and all other pixels are set to 0 (zero).
Next, the ratio ε of dark-region pixels to all pixels is calculated (step S264). In other words, the ratio ε is the area ratio of the dark region in the image. Specifically, the number of pixels that became dark-region pixels in the binarization of step S263 is counted, and this count is divided by the total number of pixels of the R image data to calculate the ratio ε. Letting p1 be the number of dark-region pixels and letting the image Fi be of size ISX × ISY, the ratio ε is p1/(ISX × ISY). Step S264 constitutes a dark-region ratio calculation step, or a dark-region ratio calculation unit, that calculates the ratio of dark-region pixels for each image Fi.
Then, the ratio ε of dark-region pixels is compared with a prescribed threshold Thr, and it is determined whether the ratio ε exceeds the prescribed threshold Thr (step S265). The threshold Thr is, for example, 0.8.
When step S265 yields "Yes", that is, when more than 80% of the entire image is dark region, it is determined that the open cardia has been detected (step S266), and the processing ends.
When step S265 yields "No", that is, when the dark region is 80% of the entire image or less, it is determined that this is not the cardia (step S267), and it is determined whether the processing of Figure 44 has been completed for all images of the series to which it is to be applied (step S268); when all have been processed ("Yes" in step S268), the processing ends. When step S268 yields "No", unprocessed images remain, so i = i + 1 is performed (step S269), and the processing from step S262 to S265 is then repeated on the next image.
In step S266, when the ratio ε of dark-region pixels exceeds the prescribed threshold Thr, it is determined that the open cardia has been detected; in other words, it can be determined that the endoscope will subsequently pass through the cardia or reach the stomach. Steps S263 to S266 constitute a detection section that detects a boundary of the digestive tract.
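Steps S263 through S266 (binarize the R plane, compute the dark-region ratio ε, compare with Thr) can be sketched as below. Thr = 0.8 follows the text; the value of Th2 is an illustrative assumption since the patent leaves it unspecified:

```python
def dark_region_ratio(red_plane, th2=30):
    """Ratio eps of dark-region pixels (steps S263-S264).
    red_plane: flat list of R pixel values; th2 is an assumed threshold."""
    dark = sum(1 for v in red_plane if v < th2)   # step S263: rj < Th2 -> 1
    return dark / len(red_plane)                  # step S264: eps = p1/(ISX*ISY)

def open_cardia_detected(red_plane, th2=30, thr=0.8):
    # Steps S265/S266: dark region covering more than Thr of the frame.
    return dark_region_ratio(red_plane, th2) > thr
```

Lowering thr, as in the fifth modification below, detects the approaching cardia earlier, when its dark region still occupies a smaller fraction of the frame.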
Various modifications of the present embodiment are described below.
In the example above, only the R image data is read for the cardia judgment; however, the G image data or B image data, or both, may also be read, the dark-region-pixel binarization may be applied to two or more sets of image data, and the cardia may be judged to have been detected when the dark-region pixel ratio ε exceeds the prescribed threshold Thr in all of the two or more sets of image data.
In the second modification: in Figure 44 above, the cardia is detected from the image of each frame, but the cardia may instead be judged to have been detected when the determination result of step S265 is ε > Thr for a plurality of consecutive images, or for at least a prescribed proportion (for example 80%) of a plurality of consecutive images.
In the third modification: although the above description processes a plurality of consecutive images, the processing of Figure 44 may also be performed on a single image.
In the fourth modification, instead of calculating the ratio ε of dark-region pixels to all pixels as in the example above, the total number p0 of non-dark-region pixels, that is, pixels not less than the threshold Th2, may be counted and divided by the total number of pixels to calculate the ratio ε1 of non-dark-region pixels, or the ratio p0/p1 of the non-dark-region pixel count p0 to the dark-region pixel count p1 (where p1 is not 0 (zero)) may be calculated, and the cardia detection judgment may be performed accordingly.
In the fifth modification, the threshold Thr may be changed according to the distance between the capsule endoscope 103 and the cardia at the time the cardia is to be detected. In other words, the threshold Thr can be changed according to the distance between the capsule endoscope 103 and the cardia at which detection is desired. For example, setting Thr to 0.5 allows the cardia to be detected earlier than with the above Thr of 0.8.
As described above, according to the present embodiment, the open cardia can be detected from the area of the dark region in the image.
In the present embodiment, the open cardia is detected by applying threshold processing to the calculated feature quantity, but this detection may also be performed using, for example, a discriminant function such as a known linear discriminant function. The feature quantities of the other embodiments may also be used in combination.
(Seventh Embodiment)
A cardia detection apparatus using a capsule endoscope apparatus according to the seventh embodiment of the present invention, and its method, are described below with reference to the drawings. The present embodiment is characterized in that the open cardia is detected from its shape. As in the fourth embodiment, the endoscopic images that are the object of the image processing in the present embodiment are a consecutive series of endoscopic images obtained by the capsule endoscope apparatus 101; the structure of the cardia detection apparatus is the same as in the fourth embodiment, and its description is therefore omitted.
The luminal image processing apparatus, i.e. the cardia detection apparatus, of the present embodiment is characterized in that, by detecting the shape of the cardia in the open state, it judges whether each image is an image captured when the capsule endoscope, heading from the esophagus toward the stomach, is in the vicinity of the near side of the cardia.
Figure 45 shows an example of the image when the capsule endoscope 103 passes through the open cardia. The image 131A includes the open cardia 132A as a dark-region image.
Figure 46 is a flowchart showing an example of the processing flow for detecting the open cardia from the obtained series of endoscopic images. The series of endoscopic images captured after insertion through the subject's mouth consists of a plurality of frames, and the processing of Figure 46 is performed on each frame. Before the processing of Figure 46 is performed, the image data of each endoscopic image is subjected to preprocessing such as inverse gamma correction and noise removal.
To start processing from the first frame of the series of images to which the processing of Figure 46 is to be applied, the frame number i is first set to 1 (step S271). i is an integer from 1 to n.
Then, the R image data of the image Fi with frame number i is read from the storage device (not shown) of the terminal device 107 (step S272). The image Fi consists of three planes, R, G, and B; only the R image data is read here.
Bandpass filtering is applied to the R image data of the read image Fi (step S273). The bandpass filtering can be performed by convolution using a known digital filter, or in the Fourier domain.
The band-pass filter has, for example, the characteristic shown in Figure 47. Figure 47 shows the filter characteristic used in the bandpass filtering. As shown in Figure 47, this filter characteristic passes many components in a somewhat low frequency band so as not to be affected by fine edge components such as blood vessels. For example, over spatial frequencies from 0 (zero) to π (rad), the pass characteristic has its peak (1.0) at the frequency π/4.
Binarization of the edge components is then applied to the resulting bandpass-filtered image using a threshold (step S274). In the binarization, the prescribed threshold Th3 is set, for example, to 10.0, and edge components with variations larger than this threshold Th3 are extracted. Pixels having edge components with variations larger than the threshold Th3 are set to 1, and all other pixels are set to 0 (zero). Letting j be the number identifying a pixel within the image data of each frame, whether the edge component of each pixel exceeds the threshold Th3 is checked, so that pixels rj exceeding the threshold Th3 are set to 1 and all other pixels are set to 0 (zero). When the image includes the cardia, the image exhibits sharp changes in brightness as described above, so by setting the threshold Th3 to a relatively large value, other edge components, for example folds caused by deformation of the mucosa, can be excluded.
Then, cardia determination processing, which judges whether the image of the extracted edge components is an image produced by the cardia, is performed to decide whether it is the cardia (step S275). In the present embodiment, the extracted edge components are thinned, the degree of agreement with an approximating circle is used as an evaluation value, and whether the cardia is present is decided according to whether the edge components are circular. Whether the edge components are circular can be judged by Hough transform processing.
Figure 48 shows an example of the resulting image after the prescribed bandpass filtering and binarization are applied to the image of Figure 45. As shown in Figure 48, the open cardia has a roughly circular shape 132B. Therefore, a Hough transform or the like is applied to the edge-component image 131B of Figure 48 to judge whether the edge component 132B is circular.
When the result of the cardia determination processing is that it is the cardia, step S275 yields "Yes", it is determined that the cardia has been detected (step S276), and the processing ends.
When step S275 yields "No", that is, when it is not the cardia, it is determined that it is not the cardia (step S277), and it is determined whether the processing of Figure 46 has been completed for all images of the series to which it is to be applied (step S278); when all have been processed ("Yes" in step S278), the processing ends. When step S278 yields "No", unprocessed images remain, so i = i + 1 is performed (step S279), and the processing from step S272 to S274 is then repeated on the next image. Steps S273 to S276 constitute a feature quantity calculation unit and a detection section that detects a boundary of the digestive tract.
Various modifications of the present embodiment will now be described.
In the example above, only the R image data is read in order to perform the cardia determination; however, the G image data or the B image data may also be read, or both the G and B image data may be read. Binarization of the dark-region pixels is then performed on two or more sets of image data, and when the shape of the dark-region pixels is circular in the two or more sets of image data, the cardiac portion is judged to have been detected.
As a second modification, in Figure 46 above the cardiac portion is detected from the image of each frame; however, the cardiac portion may instead be judged to have been detected when the shape of the dark-region pixels is circular in a plurality of consecutive images, or in at least a prescribed proportion (for example, 80%) of a plurality of consecutive images.
As a third modification, although the above description handles a plurality of consecutive images, the processing of Figure 46 may also be performed on a single image.
As described above, according to the present embodiment, the cardiac portion can be detected from the shape of the open cardiac portion in the image.
Furthermore, in the present embodiment the open cardiac portion is detected by applying threshold processing to the calculated feature values; however, the detection may also be performed using a discriminant function such as a known linear discriminant function, for example. Feature values from other embodiments may also be used in combination.
(Eighth Embodiment)
A cardia detecting apparatus employing a capsule endoscope apparatus according to an eighth embodiment of the present invention, and a method therefor, will be described with reference to the drawings. The present embodiment is characterized in that, when detecting the open cardiac portion, both the dark-region boundary detection and the edge detection described in the seventh embodiment are used to judge from the shape whether the open cardiac portion appears in the captured image. As in the fourth embodiment, the endoscopic images to be processed in the present embodiment are a continuous series of endoscopic images obtained by the capsule endoscope apparatus 1; the lumen image processing apparatus, that is, the cardia detecting apparatus, therefore has the same structure as in the fourth embodiment, and its description is omitted.
Figure 49 is a flowchart showing an example of the processing flow for detecting the cardiac portion from the obtained series of endoscopic images. The series of endoscopic images obtained by imaging after insertion through the subject's mouth consists of a plurality of frames, and the processing of Figure 49 is performed on each frame. Before the processing of Figure 49 is performed, preprocessing such as inverse gamma correction and noise removal is applied to the image data of each endoscopic image.
To start the processing from the first frame of the series of images requiring the processing of Figure 49, the frame number i is first set to 1 (step S281). i is an integer from 1 to n.
Next, the R image data of the image Fi with frame number i is read from the storage device (not shown) of the terminal device 7 (step S282). The image Fi consists of three planes, R, G and B, and only the R image data is read here.
Binarization of the dark-region pixels is applied to all pixels of the read R image data of the image Fi (step S283). Specifically, the pixel value of each pixel is compared with a prescribed threshold value Th2, and binarization is performed so that dark-region pixels are set to 1 and the other pixels are set to 0 (zero). Letting j be the number identifying a pixel in the image data of each frame, the value rj of each pixel is checked against the threshold value Th2; a pixel below Th2 is set to 1, and the other pixels are set to 0 (zero).
Next, boundary extraction processing is performed to extract the boundary of the extracted dark region, in other words, its edge (step S284). In the boundary extraction processing, for example, a pixel whose value is 1 (a dark-region pixel) is taken as the pixel of interest, a 3 × 3 mask region centered on that pixel is set, and when at least one of the eight surrounding pixels in the mask region has the value 0 (zero), the pixel of interest is treated as a boundary pixel and its value is set to 1. Step S284 constitutes a dark-region boundary extraction step, or dark-region boundary extraction section, that extracts the boundary of the dark-region pixels in each image Fi. This boundary extraction processing is applied to all dark-region pixels. Figure 50 shows an image of the extracted boundary. In the image 131C of Figure 50, after the boundary extraction, a roughly circular shape 132C is formed along the boundary of the open cardiac portion.
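The 3 × 3 mask test of step S284 translates almost directly into code (a sketch assuming the binarized image is a NumPy array of 0s and 1s; function and variable names are illustrative):

```python
import numpy as np

def extract_boundary(binary):
    """Mark a dark-region pixel (value 1) as boundary when any of the
    eight neighbours in its 3x3 mask region is 0, as in step S284."""
    padded = np.pad(binary, 1, constant_values=0)  # outside counts as 0
    h, w = binary.shape
    boundary = np.zeros_like(binary)
    for y in range(h):
        for x in range(w):
            if binary[y, x] == 1:
                window = padded[y:y + 3, x:x + 3]  # 3x3 neighbourhood
                if (window == 0).any():
                    boundary[y, x] = 1
    return boundary

# A solid 3x3 dark block: the 8 perimeter pixels become the boundary,
# the single interior pixel does not.
dark = np.zeros((5, 5), dtype=np.uint8)
dark[1:4, 1:4] = 1
edge = extract_boundary(dark)
```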
Next, bandpass filtering is applied to the read R image data of the image Fi (step S285). The bandpass filtering can be performed, as described in the seventh embodiment, by convolution using a known digital filter or by processing in the Fourier domain.
Here, the characteristic of the bandpass filter is, for example, the characteristic shown in Figure 47 described above; in order to be less susceptible to the influence of fine edge components such as blood vessels, it passes more components in a slightly lower frequency band.
Binarization of the edge components is then applied to the result image of the bandpass filtering, using a threshold value (step S286). As also described in the seventh embodiment, this binarization extracts, for example, edge components whose variation is larger than the prescribed threshold value Th3. Pixels extracted as edge components are set to 1, and the other pixels are set to 0 (zero). Letting j be the number identifying a pixel in the image data of each frame, the edge component of each pixel is checked against the threshold value Th3; a pixel rj exceeding Th3 is set to 1, and the other pixels are set to 0 (zero). When the image contains the cardiac portion as described above, the image exhibits sharp light-dark variation; therefore, by setting Th3 to a high value, other edge components, such as folds caused by deformation of the mucosa, can be excluded. In addition, to be less susceptible to deviations in the detection of the dark-region boundary, dilation processing may be applied to the extracted pixels.
Next, the degree of coincidence between the boundary pixels extracted in step S284 and the edge-component pixels extracted in step S286 is calculated (step S287). Specifically, for each extracted boundary pixel ek1 (k1 = 1, 2, 3, ..., K, where K is the total number of pixels detected as boundary pixels), it is judged whether the pixel at the same coordinates in the image has also been extracted as an edge-component pixel, and the number n1 of pixels extracted both as boundary pixels ek1 and as edge-component pixels is counted. This judgment is made, for example, by taking the logical AND of the boundary pixel ek1 and the edge-component value of the pixel at the same coordinates in the image. The ratio (n1/K) of this number n1 to the total number K of boundary pixels is then calculated.
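The coincidence measure of step S287 — the logical AND of the boundary image and the edge-component image, followed by the ratio n1/K — reduces to a few lines (an illustrative sketch, not the patented implementation; names are assumptions):

```python
import numpy as np

def coincidence_ratio(boundary, edge):
    """Fraction n1/K of boundary pixels that are also edge-component
    pixels, computed via a pixelwise logical AND (step S287)."""
    K = int(boundary.sum())            # total number of boundary pixels
    n1 = int((boundary & edge).sum())  # pixels set in both images
    return n1 / K

# Toy example: 3 boundary pixels, 2 of which coincide with edge pixels.
b = np.array([[1, 1, 0],
              [0, 1, 0]], dtype=np.uint8)
e = np.array([[1, 0, 0],
              [0, 1, 1]], dtype=np.uint8)
ratio = coincidence_ratio(b, e)
```

Comparing such a ratio against a prescribed threshold corresponds to the "Yes"/"No" branch of step S288.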
Next, based on the ratio (n1/K) representing the degree of coincidence between the boundary pixels and the edge-component pixels, it is judged whether the image contains the cardiac portion (step S288). Figure 51 shows an example of the image resulting from applying the prescribed bandpass filtering and binarization to the image to be processed. The open cardiac portion has a roughly circular shape 32B, and in Figure 51 a portion of the circle is broken. In this case, the ratio between the boundary portion of the dark region of Figure 50 and the edge portion of Figure 51 is calculated to determine whether the image shows the cardia.
According to the result of the cardia determination processing, when the image shows the cardia, the determination in step S288 is "Yes", the cardiac portion is judged to have been detected (step S289), and the processing ends. Specifically, when the ratio (n1/K) exceeds a prescribed threshold value, the determination in step S288 is "Yes", the image is judged to show the cardia (step S289), and the processing ends.
When the determination in step S288 is "No", that is, when the ratio (n1/K) does not exceed the prescribed threshold value, it is judged not to be the cardia (step S290), and it is then judged whether the processing of Figure 49 has been completed for all of the series of images requiring it (step S291). When all have been completed, the determination in step S291 is "Yes" and the processing ends. When the determination in step S291 is "No", unprocessed images remain, so i is incremented (i = i + 1, step S292), and the processing of steps S282 to S288 is repeated for the next image. Steps S283 to S289 constitute the feature value calculating section, namely the detecting section that detects the boundary portion of the digestive tract.
Various modifications of the present embodiment will now be described.
In the example above, only the R image data is read in order to perform the cardia determination; however, the G image data or the B image data may also be read, or both the G and B image data may be read. Binarization of the dark-region pixels is then performed on two or more sets of image data, and when the degree of coincidence in the two or more sets of image data exceeds the prescribed threshold value, the cardiac portion is judged to have been detected.
As a second modification, in Figure 49 above the cardiac portion is detected from the image of each frame; however, the cardiac portion may instead be judged to have been detected when the degree of coincidence exceeds the prescribed threshold value in a plurality of consecutive images, or in at least a prescribed proportion (for example, 80%) of a plurality of consecutive images.
As a third modification, although the above description handles a plurality of consecutive images, the processing of Figure 49 may also be performed on a single image.
According to the present embodiment, whether the region extracted as a dark region is accompanied by a strong edge is judged, so that the cardia determination can be made with high accuracy.
Furthermore, even when the cardiac portion is not open in a circular state, or when the capsule endoscope 3 is far from the cardia and the cardiac portion appears as a distant view image, the cardia determination can be made with high accuracy.
In addition, in the present embodiment the open cardiac portion is detected by applying threshold processing to the calculated feature values; however, the detection may also be performed using a discriminant function such as a known linear discriminant function, for example. Feature values from other embodiments may also be used in combination.
(Ninth Embodiment)
A cardia detecting apparatus employing a capsule endoscope apparatus according to a ninth embodiment, and a method therefor, will be described with reference to the drawings. The present embodiment is characterized in that, when detecting the cardiac portion, both the dark-region centroid detection and the edge detection described in the seventh embodiment are used to judge from the shape whether the open cardiac portion appears in the captured image. As in the fourth embodiment, the endoscopic images to be processed in the present embodiment are a continuous series of endoscopic images obtained by the capsule endoscope apparatus 101; the lumen image processing apparatus, that is, the cardia detecting apparatus, therefore has the same structure as in the fourth embodiment, and its description is omitted.
Processing steps identical to those of the eighth embodiment are given the same step numbers, and their description is simplified.
Figure 52 is a flowchart showing an example of the processing flow for detecting the cardiac portion from the obtained series of endoscopic images. In Figure 52, processing steps identical to those of the eighth embodiment are given the same step numbers, and their description is simplified.
The series of endoscopic images obtained by imaging after insertion through the subject's mouth consists of a plurality of frames, and the processing of Figure 52 is performed on each frame. Before the processing of Figure 52 is performed, preprocessing such as inverse gamma correction and noise removal is applied to the image data of each endoscopic image.
To start the processing from the first frame of the series of images requiring the processing of Figure 52, the frame number i is first set to 1 (step S281). i is an integer from 1 to n.
Next, the R image data of the image Fi with frame number i is read from the storage device (not shown) of the terminal device 7 (step S282).
Binarization of the dark-region pixels is applied to all pixels of the read R image data of the image Fi (step S283). Specifically, the pixel value of each pixel is compared with a prescribed threshold value Th2, and binarization is performed so that dark-region pixels are set to 1 and the other pixels are set to 0 (zero). Letting j be the number identifying a pixel in the image data of each frame, the value rj of each pixel is checked against the threshold value Th2; a pixel below Th2 is set to 1, and the other pixels are set to 0 (zero).
Next, based on the coordinate data of the dark region detected by the binarization, the centroid coordinates of the dark region are calculated (step S301). Step S301 constitutes a dark-region centroid coordinate calculation step, or dark-region centroid coordinate calculation section, that calculates the centroid coordinates of the dark region in the image Fi. Figure 53 shows the centroid position calculated by the dark-region centroid coordinate calculation processing. In the image 131E of Figure 53, the centroid 132Ec of the roughly circular dark region 132E of the open cardiac portion is shown at the position of the calculated centroid coordinates.
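The centroid computation of step S301 is a mean over the coordinates of the binarized dark-region pixels; a minimal sketch (names are illustrative):

```python
import numpy as np

def dark_region_centroid(binary):
    """Centroid (row, column) of the pixels set to 1 in the binarized
    dark-region image, as in step S301."""
    ys, xs = np.nonzero(binary)
    return float(ys.mean()), float(xs.mean())

# A 3x3 dark block spanning rows 2-4 and columns 3-5 has its centroid
# at row 3, column 4.
img = np.zeros((7, 7), dtype=np.uint8)
img[2:5, 3:6] = 1
cy, cx = dark_region_centroid(img)
```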
Next, as described in the eighth embodiment, bandpass filtering is applied to the read R image data of the image Fi (step S285), and binarization of the edge components is applied to the result image of the bandpass filtering using a threshold value (step S286).
Next, the full-perimeter property, that is, whether the edge-component pixels extracted in step S286 exist so as to surround the dark region, is evaluated (step S307). Specifically, as shown in Figure 54, it is checked whether lines in prescribed directions extending radially from the calculated centroid coordinates 132Ec pass through the extracted edge-component pixels. Figure 54 is a diagram for explaining the evaluation of the full-perimeter property. In the example shown in Figure 54, when the number m2 (hereinafter referred to as the crossing number) of line segments that intersect edge-component pixels among the plural lines 132Ed extending radially from the centroid coordinates 132Ec is counted, the lines 132Ed intersect or pass through edge components in seven of the eight directions, so the crossing number m2 is 7.
Next, based on the crossing number m2, it is judged whether the image contains the cardiac portion (step S288). As shown in Figure 54, even when a portion of the circle described by the edge-component pixels is broken, the image is judged to contain the cardiac portion when the ratio (m2/m3) of the crossing number m2 to the number m3 of the plural lines 132Ed exceeds a prescribed threshold value Thm. In other words, whether the image shows the cardia is determined from the ratio (m2/m3) of the plural line segments extending radially from the centroid position of the dark region of Figure 53 that intersect the edge-component pixels of Figure 54.
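The ray-based full-perimeter check of steps S307 and S288 can be sketched as follows (an illustrative, assumption-laden simplification: rays are rasterized by rounding, m3 = 8 directions, and all names are invented for this sketch):

```python
import numpy as np

def ray_crossing_ratio(edge, centre, m3=8):
    """Cast m3 rays radially from the centroid and count how many (m2)
    cross an edge-component pixel; return the ratio m2/m3."""
    h, w = edge.shape
    cy, cx = centre
    m2 = 0
    for k in range(m3):
        ang = 2 * np.pi * k / m3
        for r in range(1, max(h, w)):          # walk outward along the ray
            y = int(round(cy + r * np.sin(ang)))
            x = int(round(cx + r * np.cos(ang)))
            if not (0 <= y < h and 0 <= x < w):
                break                           # left the image: no crossing
            if edge[y, x]:
                m2 += 1                         # ray crosses an edge pixel
                break
    return m2 / m3

# A full ring of edge pixels around the centre is crossed by every ray.
edge = np.zeros((21, 21), dtype=np.uint8)
theta = np.linspace(0, 2 * np.pi, 360)
edge[np.round(10 + 6 * np.sin(theta)).astype(int),
     np.round(10 + 6 * np.cos(theta)).astype(int)] = 1
ratio = ray_crossing_ratio(edge, (10, 10))
```

Comparing `ratio` with the threshold Thm (for example 0.7, as in the text) yields the step S288 decision.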
When the ratio of the crossing number m2 to the number m3 of the plural line segments 132Ed exceeds the prescribed threshold value Thm, the determination in step S288 is "Yes", the cardiac portion is judged to have been detected (step S289), and the processing ends. Specifically, when the ratio (m2/m3), which is the evaluation value of the full-perimeter property, exceeds the prescribed threshold value Thm, for example 0.7, the determination in step S288 is "Yes", the image is judged to show the cardia (step S289), and the processing ends.
When the determination in step S288 is "No", that is, when the ratio (m2/m3) does not exceed the prescribed threshold value Thm, it is judged not to be the cardia (step S290), and it is then judged whether the processing of Figure 52 has been completed for all of the series of images requiring it (step S291). When all have been completed, the determination in step S291 is "Yes" and the processing ends. When the determination in step S291 is "No", unprocessed images remain, so i is incremented (i = i + 1, step S292), and the processing of steps S282 to S288 of Figure 52 is repeated for the next image. Steps S283 to S289 constitute the feature value calculating section, namely the detecting section that detects the boundary portion of the digestive tract.
Various modifications of the present embodiment will now be described.
In the example above, only the R image data is read in order to perform the cardia determination; however, the G image data or the B image data may also be read, or both the G and B image data may be read. Binarization of the dark-region pixels is then performed on two or more sets of image data, and when the evaluation value of the full-perimeter property in the two or more sets of image data exceeds the prescribed threshold value, the cardiac portion is judged to have been detected.
As a second modification, in Figure 52 above the cardiac portion is detected from the image of each frame; however, the cardiac portion may instead be judged to have been detected when the evaluation value of the full-perimeter property exceeds the prescribed threshold value in a plurality of consecutive images, or in at least a prescribed proportion (for example, 80%) of a plurality of consecutive images.
As a third modification, although the above description handles a plurality of consecutive images, the processing of Figure 52 may also be performed on a single image.
As a fourth modification, the full-perimeter property may also be evaluated according to whether edge-component pixels exist in a prescribed number, or a certain proportion, of a plurality of regions. Figure 55 is a diagram for explaining the evaluation of the full-perimeter property according to the proportion of regions in the fourth modification. For example, as shown in Figure 55, the image is divided into eight angular regions centered on the obtained centroid 132Ec: 0 to 45 degrees, 45 to 90 degrees, 90 to 135 degrees, 135 to 180 degrees, 180 to 225 degrees, 225 to 270 degrees, 270 to 315 degrees, and 315 to 360 degrees. It may then be judged whether each of these regions contains an edge-component pixel obtained in step S286, and when the number of regions containing edge-component pixels 132G is equal to or greater than a prescribed threshold value, the cardiac portion is judged to have been detected. Figure 55 shows, with hatching, a case in which edge-component pixels exist in six of the regions. When the threshold value is 6, the cardiac portion is judged to have been detected in the case of Figure 55.
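The sector-counting variant can be sketched by bucketing edge-pixel angles around the centroid into eight 45-degree sectors (an illustrative sketch; names are assumptions):

```python
import numpy as np

def sectors_with_edges(edge, centre, n_sectors=8):
    """Number of 45-degree sectors around the centroid that contain at
    least one edge-component pixel (fourth modification)."""
    ys, xs = np.nonzero(edge)
    angles = np.arctan2(ys - centre[0], xs - centre[1]) % (2 * np.pi)
    sector_width = 2 * np.pi / n_sectors
    sectors = (angles / sector_width).astype(int)
    return len(set(sectors.tolist()))

# A full ring of edge pixels populates all eight sectors.
edge = np.zeros((21, 21), dtype=np.uint8)
theta = np.linspace(0, 2 * np.pi, 360)
edge[np.round(10 + 6 * np.sin(theta)).astype(int),
     np.round(10 + 6 * np.cos(theta)).astype(int)] = 1
count = sectors_with_edges(edge, (10, 10))
```

Comparing `count` against the region-count threshold (6 in the example above) yields the detection decision.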
Alternatively, the cardiac portion may be judged to have been detected not according to the number of regions, but according to the angular range over which edge-component pixels exist. For example, as shown in Figure 56, the angular range θ over which edge-component pixels exist around the centroid 132Ec may be obtained, and when this angular range θ is equal to or greater than a prescribed threshold value, for example 270 degrees, the cardiac portion is judged to have been detected. Figure 56 is a diagram for further explaining the evaluation of the full-perimeter property according to the angular range. When a plurality of lines of edge-component pixels exist, the maximum angular range of edge-component pixels may be compared with the prescribed threshold value to judge that the cardiac portion has been detected.
As described above, in the binarization of the dark region, the position of the boundary changes according to the threshold value Th2. According to the present embodiment, even when the boundary position varies in this way, the cardia determination can be made with high accuracy without being affected by the variation.
Furthermore, even when the cardiac portion is not open in a circular state, or when the capsule endoscope 3 is far from the cardia and the cardiac portion appears as a distant view image, the cardia determination can be made with high accuracy.
In addition, in the present embodiment the open cardiac portion is detected by applying threshold processing to the calculated feature values; however, the detection may also be performed using a discriminant function such as a known linear discriminant function, for example. Feature values from other embodiments may also be used in combination.
(Tenth Embodiment)
A cardia detecting apparatus employing a capsule endoscope apparatus according to a tenth embodiment, and a method therefor, will be described with reference to the drawings. As in the fourth embodiment, the endoscopic images to be processed in the present embodiment are a continuous series of endoscopic images obtained by the capsule endoscope apparatus 101; the lumen image processing apparatus, that is, the cardia detecting apparatus, therefore has the same structure as in the fourth embodiment, and its description is omitted.
In the sixth to ninth embodiments described above, the cardiac portion in the open state is detected; however, the cardiac portion is sometimes in the closed state. The cardia detecting apparatus of the present embodiment is characterized in that, by detecting the cardiac portion in the closed state, it judges whether each image is one captured when the capsule endoscope moving from the esophagus toward the stomach is near the front side of the cardiac portion.
Figure 57 is a diagram showing an example of the image captured when the capsule endoscope 103 is on the near side of the closed cardiac portion.
When the capsule endoscope 103 images the closed cardiac portion in the lumen, there is no distinct dark region in the captured image because the cardiac portion is closed, and because the periphery of the cardiac portion is viewed head-on from the end of the esophagus, the light and shade in the image is in a relatively uniform state. In the present embodiment, when the area of the closed cardiac region does not reach a prescribed size, the closed cardiac portion is judged to have been detected.
Figure 58 is a flowchart showing an example of the processing flow for detecting the closed cardiac portion from the obtained series of endoscopic images. The series of endoscopic images obtained by imaging after insertion through the subject's mouth consists of a plurality of frames, and the processing of Figure 58 is performed on each frame. Before the processing of Figure 58 is performed, preprocessing such as inverse gamma correction and noise removal is applied to the image data of each endoscopic image. Since the processing of Figure 58 includes processing steps identical to those of the processing of Figure 44, the processing identical to that of Figure 44 is given the same step numbers and its description is simplified.
To start the processing from the first frame of the series of images requiring the processing of Figure 58, the frame number i is first set to 1 (step S261). i is an integer from 1 to n.
Next, the R image data of the image Fi with frame number i is read from the storage device (not shown) of the terminal device 7 (step S262). The image Fi consists of three planes, R, G and B, and only the R image data is read here.
Here, the R image data is read in order to perform the cardia determination described later; however, the G image data or the B image data of the image Fi may also be used.
Binarization of the dark-region pixels is applied to all pixels of the read R image data of the image Fi (step S263). Specifically, the pixel value of each pixel is compared with a prescribed threshold value Th2, and binarization is performed so that dark-region pixels are set to 1 and the other pixels are set to 0 (zero). Letting j be the number identifying a pixel in the image data of each frame, the value rj of each pixel is checked against the threshold value Th2; a pixel below Th2 is set to 1, and the other pixels are set to 0 (zero).
Next, labeling processing of the plural regions is performed to label the dark-region pixels and determine the dark region with the maximum area D among them (step S314). Step S314 constitutes a maximum dark-region determining step, or maximum dark-region determining section, that determines the dark region with the maximum area D in each image Fi. In Figure 57, the dark region 132I with the maximum area D is shown in the image 131I.
Next, it is judged whether the maximum area D is smaller than a prescribed threshold value Thd1 (step S315). The threshold value Thd1 is, for example, 0.1.
When the determination in step S315 is "Yes", that is, when the dark region occupies less than 10% of the entire image, the cardiac portion is judged to have been detected (step S266) and the processing ends.
When the determination in step S315 is "No", that is, when the dark region occupies 10% or more of the entire image, it is judged not to be the cardia (step S267), and it is then judged whether the processing of Figure 58 has been completed for all of the series of images requiring it (step S268). When all have been completed, the determination in step S268 is "Yes" and the processing ends. When the determination in step S268 is "No", unprocessed images remain, so i is incremented (i = i + 1, step S269), and the processing of steps S262 to S264 is repeated for the next image.
In step S266, when the area D of the maximum dark region is smaller than the prescribed threshold value Thd1, the cardiac portion is judged to have been detected; in other words, the image may also be judged to be one captured just before the capsule endoscope passes through the cardiac portion, or one captured when it is about to reach the stomach. Steps S263 to S266 constitute the feature value calculating section, namely the detecting section that detects the boundary portion of the digestive tract.
Various modifications of the present embodiment will now be described.
In the example above, only the R image data is read in order to perform the cardia determination; however, the G image data or the B image data may also be read, or both the G and B image data may be read. Binarization of the dark-region pixels is then performed on two or more sets of image data, and when the area D of the maximum dark region in all of the two or more sets of image data is smaller than the prescribed threshold value Thd1, the cardiac portion is judged to have been detected.
As a second modification, in Figure 58 above the cardiac portion is detected from the image of each frame; however, the cardiac portion may instead be judged to have been detected when the determination result of step S315 is D < Thd1 in a plurality of consecutive images, or in at least a prescribed proportion (for example, 80%) of a plurality of consecutive images.
As a third modification, although the above description handles a plurality of consecutive images, the processing of Figure 58 may also be performed on a single image.
As a fourth modification, a region of interest (ROI) of a prescribed size, indicated by the broken line in Figure 57, may be set around the dark region of the maximum area described above; the standard deviation, variance, or coefficient of variation of the light and shade within this region of interest (ROI) is calculated, and the cardiac portion in the closed state is detected according to whether the calculated standard deviation or other value is smaller than a prescribed threshold value.
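The fourth modification's uniformity test reduces to a one-line statistic over the ROI (a minimal sketch assuming the ROI is already cropped out as a NumPy array; the threshold 0.01 is illustrative, not from the patent):

```python
import numpy as np

def coefficient_of_variation(roi):
    """Std/mean of the brightness in the region of interest; a small
    value suggests the uniform light-dark state of a closed cardia."""
    roi = np.asarray(roi, dtype=float)
    return roi.std() / roi.mean()

# A nearly uniform ROI: brightness 100 everywhere except one pixel.
uniform_roi = np.full((8, 8), 100.0)
uniform_roi[0, 0] = 102.0
cv = coefficient_of_variation(uniform_roi)
```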
As described above, according to the present embodiment, the cardiac portion in the closed state can be detected from the dark-region area in the image.
In addition, in the present embodiment the closed cardiac portion is detected by applying threshold processing to the calculated feature values; however, the detection may also be performed using a discriminant function such as a known linear discriminant function, for example. Feature values from other embodiments may also be used in combination.
(Eleventh Embodiment)
Cardia checkout gear and method thereof to the employing capsule type endoscope device of eleventh embodiment of the invention describes with reference to the accompanying drawings.Same as the endoscopic images and the 4th embodiment of Flame Image Process object in the present embodiment, be the successive a series of endoscopic images that obtains by capsule type endoscope device 101, so and the omission explanation identical with the 4th embodiment of the structure of cardia checkout gear.
The tube chamber image processing apparatus of present embodiment is that the cardia checkout gear is characterised in that, the detection of the pars cardiaca by carrying out closed condition judges that whether each image is near the isochronous image side nearby that is in pars cardiaca from esophagus towards the capsule type endoscope of stomach.
Figure 59 is that expression is according to resulting a series of endoscopic images, the flow chart of the example of the handling process of the detection of the pars cardiaca of closing.Enter the back resulting a series of endoscopic images of shooting from examinee's oral area and constitute, each frame is carried out the processing of Figure 59 by a plurality of frames.In addition, the view data of each endoscopic images was carried out pretreatment such as contrary Gamma proofreaies and correct, noise is removed before the processing of implementing Figure 59.
For the frame that begins most a series of images of the processing of implementing Figure 59 from needs begins to handle, at first making frame number i is 1 (step S321).I is 1 to n integer.
Then read the R view data (step S322) that frame number is the image Fi of i from the storage device (not shown) of termination 7.Image Fi is made of three planes of RGB, and only reads the R view data here.
In addition, read the R view data for the judgement of carrying out pars cardiaca described later here, but also can use G view data or the B view data of image Fi.
Whole pixels of the R view data of the image Fi that reads are implemented 2 values of dark portion pixel and handled (step S323).Particularly, the threshold value Th2 of the pixel value of each pixel and regulation is compared and implement 2 values handle, and to make dark portion pixel be that 1 to make pixel in addition be 0 (zero).If making j is the numbering that is used for determining in the view data of each frame pixel, then whether be lower than threshold value Th2 by the value rj that checks each pixel, be 1 and make the pixel that is lower than threshold value Th2, making pixel in addition is 0 (zero).
Then the image of having implemented 2 values processing is carried out graph thinning and handle (step S324).In Figure 60, the cardia shape 132J that has implemented graph thinning according to the image of the pars cardiaca of closing has been shown in image 131J.Figure 60 is used to illustrate the figure that has implemented the cardia shape of graph thinning according to the image of the pars cardiaca of closing.Then, and in each bar line of generating through graph thinning, the point (hereinafter referred to as branch point) (step S325) of Branch Computed or intersection, with the coordinate data of this branch point as the data storage of expression concentration degree in storage device.
The degree of concentration of the computed branch points is then calculated (step S326). Step S326 constitutes a concentration-degree calculating step, or concentration-degree calculating unit, that computes the degree of concentration of the branch points of each image Fi. The degree of concentration is computed using, for example, the variance of the branch-point coordinate values as a parameter.
Figure 61 is a flowchart showing an example of the flow of processing in which the parameter for computing the degree of concentration is the variance. First, the coordinate data of the computed branch points are obtained from the storage device (step S341). If, for example, N branch points have been computed, the variance vx of the x coordinates and the variance vy of the y coordinates of the N branch points are calculated (step S342). The variances vx and vy for the N branch points are stored in the storage device.
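Steps S341 and S342 amount to computing the coordinate variances of the branch-point set; small variances mean the branch points are concentrated, as expected for the thinned image of a closed cardia. A minimal sketch:

```python
import numpy as np

def concentration_degree(points):
    """Steps S341-S342 sketch: variance vx of the x coordinates and
    variance vy of the y coordinates of the N branch points."""
    pts = np.asarray(points, dtype=float)
    vx = pts[:, 0].var()
    vy = pts[:, 1].var()
    return vx, vy

clustered = [(10, 10), (11, 10), (10, 11), (11, 11)]  # tight cluster
scattered = [(0, 0), (50, 3), (7, 90), (99, 99)]      # spread out
vx_c, vy_c = concentration_degree(clustered)
vx_s, vy_s = concentration_degree(scattered)
# the clustered branch points give far smaller variances than the scattered ones
```

Comparing vx and vy against the thresholds thv1 and thv2 of step S327 then separates the two cases.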
Returning to Figure 59, whether the degree of concentration is high is judged from the data representing it, i.e. by checking whether the variances vx and vy are respectively below the prescribed thresholds thv1 and thv2 (step S327).
If step S327 is "Yes", i.e. the variances vx and vy are respectively below the prescribed thresholds thv1 and thv2, it is determined that the cardia has been detected (step S328), and the processing ends.
If step S327 is "No", i.e. the variances vx and vy are respectively greater than or equal to the prescribed thresholds thv1 and thv2, it is determined that the image is not the cardia (step S329), and whether the processing of Figure 59 has been completed for all images of the series is judged (step S330). When all images have been processed ("Yes" in step S330), the processing ends. When step S330 is "No", unprocessed images remain, so i is incremented (i = i + 1, step S331), and the processing of steps S322 to S327 is then repeated for the next image. Steps S323 to S328 constitute the feature-value calculating unit and the detecting section that detects the boundary portion of the digestive tract.
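The frame loop of steps S321 to S331 can be sketched as follows. The per-frame pipeline of steps S322 to S326 (read the R plane, binarize, thin, collect branch points, compute variances) is abstracted here into a caller-supplied `frame_variances` callable, which is a hypothetical stand-in, not part of the described apparatus:

```python
def detect_closed_cardia(frames, frame_variances, thv1, thv2):
    """Figure 59 loop sketch: scan frames in order (steps S321, S331)
    and return the 1-based number of the first frame whose branch-point
    variances both fall below the prescribed thresholds (steps S327-S328).
    Returns None when no frame matches (step S330 exhausts the series)."""
    for i, frame in enumerate(frames, start=1):
        vx, vy = frame_variances(frame)   # stands in for steps S322-S326
        if vx < thv1 and vy < thv2:       # step S327
            return i                      # step S328: cardia detected
    return None

# usage with stubbed per-frame variances
frames = ["F1", "F2", "F3"]
variances = {"F1": (9.0, 9.0), "F2": (1.0, 2.0), "F3": (0.5, 0.5)}
hit = detect_closed_cardia(frames, variances.get, thv1=3.0, thv2=3.0)
# hit == 2: F2 is the first frame below both thresholds
```

Returning early on the first match mirrors the flowchart, which ends processing as soon as step S328 is reached.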
Figure 62 shows an example image for explaining the branch points. In Figure 62, in the case of a closed cardia, the branch points 133J of the lines in image 131J are concentrated in one part of the image. This concentration is quantified by the variance described above: the variance is computed, and its degree of dispersion, for example the variance, is compared with a prescribed threshold, thereby judging whether a closed cardia has been detected. In the present embodiment, since the branch points of a thinned closed cardia are concentrated, the cardia is determined to have been detected when they are concentrated.
Moreover, when the variance is below the prescribed threshold in step S328, it is determined that a closed cardia has been detected; in other words, it may also be determined that the capsule endoscope 103 will subsequently pass through the cardia or will reach the stomach.
Various modifications of the present embodiment are described below.
In the above example, only the R image data is read in order to judge the cardia, but the G image data or the B image data may also be read, or both the G and B image data may be read; the degree of concentration may then be computed for two or more sets of image data, and the closed cardia determined to have been detected based on the degrees of concentration of the two or more sets of image data.
As a second modification: in Figure 59 above, the cardia is detected from the image of each frame, but the closed cardia may instead be determined to have been detected when the judgement result of step S327 is "Yes" for a plurality of consecutive images, or for at least a prescribed proportion (for example 80%) of a plurality of consecutive images.
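The second modification reduces to a majority vote over a window of consecutive per-frame results. A minimal sketch, where the 80% figure is the example proportion given in the text:

```python
def majority_detection(flags, ratio=0.8):
    """Second-modification sketch: declare the closed cardia detected
    only when at least `ratio` (e.g. 80%) of a run of consecutive
    per-frame step-S327 results are "Yes" (True)."""
    return sum(flags) / len(flags) >= ratio

window = [True, True, False, True, True]  # 4 of 5 frames positive
ok = majority_detection(window, ratio=0.8)
# ok is True: 80% of the consecutive frames were positive
```

Requiring agreement across consecutive frames suppresses spurious single-frame detections.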
As a third modification: although the above description processes a plurality of consecutive images, the processing of Figure 59 may also be performed on a single image.
Furthermore, in the present embodiment the closed cardia is detected by applying threshold processing to the computed feature values, but the detection may also be performed using a known classification function such as a linear discriminant function. Feature values from other embodiments may also be used in combination.
According to each of the embodiments described above, it is possible to detect, from a single image or a consecutive series of endoscopic images, the image captured when the gastro-esophageal junction was passed or when the cardia was about to be reached, so that the images required for diagnosing esophageal disease can be selected from a large number of captured endoscopic images, and the diagnosis of esophageal disease can be performed rapidly.
Therefore, according to the embodiments described above, a lumen image processing apparatus capable of detecting the cardia can be realized.
For example, the judgement of Barrett's esophagus is performed on images near the gastro-esophageal junction. Therefore, if the above cardia detection can be performed, a careful diagnosis can be made simply by observing that image or the images before and after it, which speeds up the diagnosis.
Although the above description takes the processing of images obtained by the capsule endoscope 3 as an example, the above processing can of course also be applied to images obtained by an ordinary endoscope, i.e. an endoscope having an elongated flexible insertion portion, since both a capsule endoscope and an ordinary endoscope can obtain intraluminal images.
Furthermore, although the cardia can be detected by each of the methods (including the modifications) of the above embodiments, a plurality of these methods may also be used in combination to detect the cardia.
According to the fourth to eleventh embodiments and their modifications described above, a lumen image processing apparatus that can detect the cardia from intraluminal images can be provided.
Given the nature of judging conditions such as Barrett's esophagus from the large amount of endoscopic image data obtained by imaging the inside of the esophagus and the like, the processing that detects a first feature, such as the gastro-esophageal junction or the epithelial border, is repeated while advancing the frame number in turn until the presence of the gastro-esophageal junction or epithelial border around that position is determined; then, starting from the image after the one judged to contain that feature, the detection processing for a second feature, such as Barrett's esophagus, which is the object of judgement, is performed. By performing such processing, the processing can be made more efficient than when the second-feature detection and judgement are performed from the very beginning.
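The two-stage scheme described above can be sketched as follows. `detect_first`, `detect_second`, and `judge` are hypothetical callables standing in for the first-feature detector (e.g. gastro-esophageal junction / epithelial border), the second-feature detector (e.g. Barrett's-esophagus findings), and the judging unit:

```python
def two_stage_analysis(frames, detect_first, detect_second, judge):
    """Two-stage sketch: advance frame by frame until the first feature
    is found, then run second-feature detection only on the frames after
    that point, skipping the earlier frames entirely."""
    for i, frame in enumerate(frames):
        if detect_first(frame):  # first living-body feature found
            features = [detect_second(f) for f in frames[i + 1:]]
            return judge(features)
    return None  # first feature never found: nothing to judge

result = two_stage_analysis(
    frames=[0, 0, 1, 5, 7],
    detect_first=lambda f: f == 1,   # junction appears at frame index 2
    detect_second=lambda f: f > 4,   # stub second-feature detector
    judge=lambda feats: sum(feats),  # stub judging unit
)
# result == 2: two of the later frames carry the second feature
```

Because the second-stage detector runs only on the tail of the sequence, the bulk of the frames incur only the (cheaper, earlier-terminating) first-stage check, which is the efficiency gain the paragraph describes.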

Claims (47)

1. A medical image processing apparatus, characterized in that the medical image processing apparatus comprises:
an image extracting unit that extracts frame images from in-vivo moving image data captured by an in-vivo imaging device or from a plurality of continuously captured still image data; and
an image analyzing unit that performs image analysis on the frame images extracted by the image extracting unit and outputs the image analysis result,
the image analyzing unit comprising:
a first living-body feature detecting section that detects a first living-body feature;
a second living-body feature detecting section that, based on the detection result of the first living-body feature detecting section, detects a second living-body feature from frame images captured temporally before or after the image used for detection by the first living-body feature detecting section; and
a condition judging section that judges the condition of the living body from the detection result of the second living-body feature detecting section and outputs the judgement result.
2. The medical image processing apparatus according to claim 1, characterized in that the second living-body feature detecting section obtains a prescribed number of frame images based on the detection result of the first living-body feature detecting section, and performs second living-body feature detection processing on those frame images.
3. The medical image processing apparatus according to claim 1, characterized in that the second living-body feature detecting section, based on the detection result of the first living-body feature detecting section, sequentially obtains frame images captured temporally before or after the frame image processed by the first living-body feature detecting section,
and interrupts the obtaining of frame images according to the detection result obtained when the first living-body feature detecting section processes an obtained frame image.
4. The medical image processing apparatus according to any one of claims 1 to 3, characterized in that the condition judging section judges, as the condition of the living body, whether Barrett's esophagus, a disease in the esophagus, is present.
5. The medical image processing apparatus according to claim 4, characterized in that the first living-body feature detected by the first living-body feature detecting section is the gastro-esophageal junction, which is the boundary between the stomach and the esophagus within the esophagus.
6. The medical image processing apparatus according to claim 5, characterized in that the first living-body feature detecting section extracts end-point coordinates of palisade blood vessels, and detects the gastro-esophageal junction from the palisade-vessel end-point boundary line formed by the line segments connecting the extracted end-point coordinates.
7. The medical image processing apparatus according to claim 6, characterized in that the first living-body feature detecting section judges, with respect to the palisade blood vessels, that a line segment is a palisade blood vessel when the length of the line segment obtained by thinning the frame image is equal to or greater than a specific length.
8. The medical image processing apparatus according to claim 7, characterized in that the first living-body feature detecting section, with respect to the palisade blood vessels, further considers the number of branches, the number of intersections, and the number of bends of the line segments obtained by thinning the frame image data in judging that a line segment is a palisade blood vessel.
9. The medical image processing apparatus according to claim 8, characterized in that the first living-body feature detecting section, with respect to the palisade blood vessels, further considers the angle formed between the line segment connecting the two ends of a line segment obtained by thinning the frame image data and the vector connecting the center of the dark portion of the frame image with the end, of the two ends, closer to the dark portion of the image, in judging that the line segment is a palisade blood vessel.
10. The medical image processing apparatus according to any one of claims 5 to 9, characterized in that the second living-body feature detected by the second living-body feature detecting section is the epithelial border within the esophagus.
11. The medical image processing apparatus according to claim 10, characterized in that the condition judging section judges whether Barrett's esophagus is present from the distances between the intersections at which a plurality of rays radiating from a prescribed point cross the epithelial border and the intersections at which they cross the gastro-esophageal junction.
12. The medical image processing apparatus according to claim 11, characterized in that the condition judging section judges whether Barrett's esophagus is present by considering the distances between the prescribed point and the respective intersections of the plurality of rays radiating from the prescribed point with the epithelial border, or the distances between the prescribed point and the respective intersections of the plurality of rays radiating from the prescribed point with the gastro-esophageal junction.
13. The medical image processing apparatus according to claim 10, characterized in that the condition judging section further judges whether Barrett's esophagus is present from the number of intersections between the palisade blood vessels and the epithelial border.
14. The medical image processing apparatus according to claim 13, characterized in that the first living-body feature detecting section judges, with respect to the palisade blood vessels, that a line segment is a palisade blood vessel when the length of the line segment obtained by thinning the frame image is equal to or greater than a specific length.
15. The medical image processing apparatus according to claim 14, characterized in that the first living-body feature detecting section, with respect to the palisade blood vessels, further considers the number of branches, the number of intersections, and the number of bends of the line segments obtained by thinning the frame image data in judging that a line segment is a palisade blood vessel.
16. The medical image processing apparatus according to claim 15, characterized in that the first living-body feature detecting section, with respect to the palisade blood vessels, further considers the angle formed between the line segment connecting the two ends of a line segment obtained by thinning the frame image data and the vector connecting the center of the dark portion of the frame image with the end, of the two ends, closer to the dark portion of the image, in judging that the line segment is a palisade blood vessel.
17. The medical image processing apparatus according to any one of claims 1 to 4, characterized in that:
the first living-body feature detected by the first living-body feature detecting section is the cardia, which is the entrance from the esophagus to the stomach; and
the second living-body feature detecting section, based on the detection result of the first living-body feature detecting section, detects the second living-body feature from frame images captured temporally before the image used for detection by the first living-body feature detecting section.
18. The medical image processing apparatus according to claim 17, characterized in that the first living-body feature detecting section detects the cardia as a closed cardia from the degree of concentration of the branch points or intersection points computed from the thin lines obtained by thinning the frame image.
19. The medical image processing apparatus according to claim 18, characterized in that the first living-body feature detecting section detects the cardia as an open cardia from a thinned image generated by edge-extraction processing and binarization processing of the frame image, and from a dark-portion image obtained by dark-portion extraction processing that extracts the dark portion of the frame image.
20. The medical image processing apparatus according to claim 19, characterized in that the cardia is detected from the region or angle over which the thin lines of the thinned image exist circumferentially around the characteristic point of the dark portion.
21. The medical image processing apparatus according to any one of claims 1 to 4, characterized in that the first living-body feature detected by the first living-body feature detecting section is the epithelial border, which is the boundary between the squamous epithelium on the esophagus side and the columnar epithelium on the stomach side.
22. The medical image processing apparatus according to claim 21, characterized in that the condition judging section, based on the detection result of the second living-body feature detecting section, judges the condition of the living body only for frame images in which the epithelial border has been detected.
23. A medical image processing method, characterized in that the medical image processing method comprises the following steps:
a step of extracting frame images from in-vivo moving image data captured by an in-vivo imaging device or from a plurality of continuously captured still image data;
a step of performing image analysis on the extracted frame images and detecting a first living-body feature;
a step of detecting, based on the detection result of the first living-body feature, a second living-body feature from frame images captured temporally before or after the image used for detection of the first living-body feature; and
a step of judging the condition of the living body based on the detection result of the second living-body feature and outputting the judgement result.
24. A program for causing a computer to execute the following functions:
a function of extracting frame images from in-vivo moving image data captured by an in-vivo imaging device or from a plurality of continuously captured still image data;
a function of performing image analysis on the extracted frame images and detecting a first living-body feature;
a function of detecting, based on the detection result of the first living-body feature, a second living-body feature from frame images captured temporally before or after the image used for detection of the first living-body feature; and
a function of judging the condition of the living body based on the detection result of the second living-body feature and outputting the judgement result.
25. A lumen image processing apparatus, characterized in that the lumen image processing apparatus comprises:
a feature-value calculating unit that calculates a prescribed feature value by image processing from one or more intraluminal images obtained by imaging the digestive tract; and
a boundary-portion detecting section that detects a boundary portion of the digestive tract from the calculated feature value.
26. The lumen image processing apparatus according to claim 25, characterized in that the lumen image processing apparatus further comprises a judging unit that judges, based on the detection result of the boundary-portion detecting section, that an intraluminal image is an image of the portion from the esophagus to the cardia.
27. The lumen image processing apparatus according to claim 25 or 26, characterized in that the prescribed feature value is the color tone of the one or more intraluminal images.
28. The lumen image processing apparatus according to claim 25 or 26, characterized in that the prescribed feature value is the brightness value of the one or more intraluminal images.
29. The lumen image processing apparatus according to any one of claims 25 to 28, characterized in that the boundary portion is the gastro-esophageal junction.
30. The lumen image processing apparatus according to any one of claims 25 to 29, characterized in that the feature-value calculating unit calculates the feature value for the pixels of the one or more intraluminal images excluding dark-portion pixels and halation pixels.
31. The lumen image processing apparatus according to any one of claims 25 to 30, characterized in that, when the prescribed feature value is calculated from a plurality of intraluminal images, the judging unit judges that the boundary portion is present when the differential value of the moving average of the calculated prescribed feature value is equal to or greater than a prescribed threshold.
32. The lumen image processing apparatus according to any one of claims 25 to 31, characterized in that the feature-value calculating unit performs image processing on the pixels within a prescribed region of each of the one or more intraluminal images to calculate the prescribed feature value.
33. The lumen image processing apparatus according to claim 25 or 26, characterized in that the prescribed feature value is the area of a dark-portion region or of a non-dark-portion region in the one or more intraluminal images.
34. The lumen image processing apparatus according to claim 33, characterized in that:
the boundary portion is an open cardia; and
the boundary-portion detecting section detects the boundary portion by comparing the area with a prescribed threshold.
35. The lumen image processing apparatus according to claim 25 or 26, characterized in that the prescribed feature value is the shape of a dark-portion region in the one or more intraluminal images.
36. The lumen image processing apparatus according to claim 35, characterized in that the boundary-portion detecting section detects the boundary portion according to whether the shape is circular.
37. The lumen image processing apparatus according to claim 35, characterized in that:
the shape of the dark-portion region includes the boundary shape and the edge portion of the dark-portion region; and
the boundary-portion detecting section detects the boundary portion by comparing the degree of agreement between the boundary and the edge portion with a prescribed threshold.
38. The lumen image processing apparatus according to claim 35, characterized in that:
the shape of the dark-portion region includes the edge portion of the dark-portion region; and
the boundary-portion detecting section detects the boundary portion by comparing the proportion of the edge portion present around a prescribed point of the dark-portion region with a prescribed threshold.
39. The lumen image processing apparatus according to claim 38, characterized in that the boundary-portion detecting section determines the proportion according to whether the edge portion is present in each of a plurality of prescribed regions of the one or more intraluminal images.
40. The lumen image processing apparatus according to claim 38, characterized in that the boundary-portion detecting section determines the proportion according to the angular range over which the edge portion exists around the prescribed point of the dark-portion region.
41. The lumen image processing apparatus according to claim 25 or 26, characterized in that the prescribed feature value is the area of a dark-portion region in the one or more intraluminal images.
42. The lumen image processing apparatus according to claim 33, characterized in that:
the boundary portion is a closed cardia; and
the boundary-portion detecting section detects the boundary portion by comparing the area with a prescribed threshold.
43. The lumen image processing apparatus according to claim 42, characterized in that the boundary-portion detecting section labels a plurality of regions of dark-portion pixels by labeling processing in each of the one or more intraluminal images, and detects the boundary portion according to whether the area of the region having the largest area among the labeled regions is less than the prescribed threshold.
44. The lumen image processing apparatus according to claim 25 or 26, characterized in that the boundary-portion detecting section performs thinning processing of dark-portion pixels on each of the one or more intraluminal images, computes the branch points or intersection points of the thin lines obtained by the thinning processing, and detects the boundary portion according to the degree of concentration of the branch points or intersection points in each image.
45. The lumen image processing apparatus according to claim 44, characterized in that:
the boundary portion is a closed cardia; and
the boundary-portion detecting section detects the boundary portion by comparing the degree of concentration with a prescribed threshold.
46. A lumen image processing method, characterized in that the lumen image processing method comprises the following steps:
a step of calculating a prescribed feature value by image processing from one or more intraluminal images obtained by imaging the digestive tract; and
a step of detecting a boundary portion of the digestive tract from the calculated feature value.
47. A program for causing a computer to execute the following functions:
a function of calculating a prescribed feature value from one or more intraluminal images obtained by imaging the digestive tract; and
a function of detecting a boundary portion of the digestive tract from the calculated feature value.
CNB2006800044681A 2005-02-15 2006-02-10 Medical image-processing apparatus Active CN100563550C (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP038117/2005 2005-02-15
JP038116/2005 2005-02-15
JP2005038116A JP4749732B2 (en) 2005-02-15 2005-02-15 Medical image processing device

Publications (2)

Publication Number Publication Date
CN101115435A true CN101115435A (en) 2008-01-30
CN100563550C CN100563550C (en) 2009-12-02

Family

ID=36985260

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2006800044681A Active CN100563550C (en) 2005-02-15 2006-02-10 Medical image-processing apparatus

Country Status (2)

Country Link
JP (1) JP4749732B2 (en)
CN (1) CN100563550C (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102243762A (en) * 2010-05-11 2011-11-16 奥林巴斯株式会社 Image processing apparatus and image processing method
CN102525381A (en) * 2010-12-16 2012-07-04 奥林巴斯株式会社 Image processing apparatus, image processing method and computer-readable recording device
CN103327883A (en) * 2011-02-22 2013-09-25 奥林巴斯医疗株式会社 Medical image processing device and medical image processing method
CN105072975A (en) * 2013-02-27 2015-11-18 奥林巴斯株式会社 Image processing device, image processing method and image processing program
CN109674493A (en) * 2018-11-28 2019-04-26 深圳蓝韵医学影像有限公司 Method, system and the equipment of medical supersonic automatic tracing carotid artery vascular
CN114511566A (en) * 2022-04-19 2022-05-17 武汉大学 Method and related device for detecting basement membrane positioning line in medical image

Families Citing this family (3)

Publication number Priority date Publication date Assignee Title
JP5028138B2 (en) * 2007-05-08 2012-09-19 オリンパス株式会社 Image processing apparatus and image processing program
JP5315376B2 (en) * 2011-04-11 2013-10-16 オリンパス株式会社 Fence-like blood vessel detection device and method of operating the fence-like blood vessel detection device
JP6188992B1 (en) 2015-09-28 2017-08-30 オリンパス株式会社 Image analysis apparatus, image analysis system, and operation method of image analysis apparatus

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
JP3895400B2 (en) * 1996-04-30 2007-03-22 オリンパス株式会社 Diagnosis support device
JP2001017379A (en) * 1999-07-09 2001-01-23 Fuji Photo Film Co Ltd Fluorescent diagnostic device

Cited By (11)

Publication number Priority date Publication date Assignee Title
CN102243762A (en) * 2010-05-11 2011-11-16 奥林巴斯株式会社 Image processing apparatus and image processing method
CN102243762B (en) * 2010-05-11 2016-06-29 奥林巴斯株式会社 Image processing apparatus and image processing method
CN102525381A (en) * 2010-12-16 2012-07-04 奥林巴斯株式会社 Image processing apparatus, image processing method and computer-readable recording device
CN102525381B (en) * 2010-12-16 2016-08-03 奥林巴斯株式会社 The recording equipment of image processing apparatus, image processing method and embodied on computer readable
CN103327883A (en) * 2011-02-22 2013-09-25 奥林巴斯医疗株式会社 Medical image processing device and medical image processing method
CN103327883B (en) * 2011-02-22 2016-08-24 奥林巴斯株式会社 Medical image-processing apparatus and medical image processing method
CN105072975A (en) * 2013-02-27 2015-11-18 奥林巴斯株式会社 Image processing device, image processing method and image processing program
CN109674493A (en) * 2018-11-28 2019-04-26 深圳蓝韵医学影像有限公司 Method, system and the equipment of medical supersonic automatic tracing carotid artery vascular
CN109674493B (en) * 2018-11-28 2021-08-03 深圳蓝韵医学影像有限公司 Method, system and equipment for medical ultrasonic automatic tracking of carotid artery blood vessel
CN114511566A (en) * 2022-04-19 2022-05-17 武汉大学 Method and related device for detecting basement membrane positioning line in medical image
CN114511566B (en) * 2022-04-19 2022-07-19 武汉大学 Method and related device for detecting basement membrane positioning line in medical image

Also Published As

Publication number Publication date
JP4749732B2 (en) 2011-08-17
JP2006223376A (en) 2006-08-31
CN100563550C (en) 2009-12-02

Similar Documents

Publication Publication Date Title
CN100563550C (en) Medical image-processing apparatus
EP1849402B1 (en) Medical image processing device, lumen image processing device, lumen image processing method, and programs for them
JP4615963B2 (en) Capsule endoscope device
EP1997074B1 (en) Device, system and method for automatic detection of contractile activity in an image frame
US7567692B2 (en) System and method for detecting content in-vivo
EP1997076B1 (en) Cascade analysis for intestinal contraction detection
US8693754B2 (en) Device, system and method for measurement and analysis of contractile activity
CN101150977B (en) Image processor
US7577283B2 (en) System and method for detecting content in-vivo
CN102639049B (en) Information processing device and capsule endoscope system
CN102984990A (en) Diagnosis assistance apparatus
CN103458765B (en) Image processing apparatus
CN110974179A (en) Auxiliary diagnosis system for stomach precancer under electronic staining endoscope based on deep learning
JP5057651B2 (en) Lumen image processing apparatus, lumen image processing method, and program therefor
KR102294738B1 (en) System and Method for sorting organ
EP3226745B1 (en) Capsule coating for image capture control
Phillips et al. Video capsule endoscopy: pushing the boundaries with software technology
US9854958B1 (en) System and method for automatic processing of images from an autonomous endoscopic capsule
TW200824636A (en) Image recognition method for detecting alimentary tract
Seguí et al. Diagnostic system for intestinal motility disfunctions using video capsule endoscopy
Malagelada Vilarino et a
Haritha et al. Saliency Based Hookworm and Infection Detection for Wireless Capsule Endoscopy Diagnosis
WO2022182500A1 (en) Method and apparatus for objective assessment of gastrointestinal conditions based on images captured in the gi tract

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant