US10561338B2 - Endoscope position identifying apparatus, endoscope position identifying method, and recording medium having an endoscope position identifying program recorded therein - Google Patents

Endoscope position identifying apparatus, endoscope position identifying method, and recording medium having an endoscope position identifying program recorded therein

Info

Publication number
US10561338B2
US10561338B2 (application US15/265,386)
Authority
US
United States
Prior art keywords
endoscope
images
path
image
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US15/265,386
Other languages
English (en)
Other versions
US20170071504A1 (en
Inventor
Caihua WANG
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujifilm Corp
Original Assignee
Fujifilm Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujifilm Corp filed Critical Fujifilm Corp
Assigned to FUJIFILM CORPORATION reassignment FUJIFILM CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: WANG, CAIHUA
Publication of US20170071504A1 publication Critical patent/US20170071504A1/en
Application granted granted Critical
Publication of US10561338B2 publication Critical patent/US10561338B2/en
Active legal-status Critical Current
Adjusted expiration legal-status Critical

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; illuminating arrangements therefor
    • A61B 1/00002: Operational features of endoscopes
    • A61B 1/00004: Operational features of endoscopes characterised by electronic signal processing
    • A61B 1/00009: Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B 1/000094: Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
    • A61B 1/000096: Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope using artificial intelligence
    • A61B 1/00043: Operational features of endoscopes provided with output arrangements
    • A61B 1/00045: Display arrangement
    • A61B 1/0005: Display arrangement combining images e.g. side-by-side, superimposed or tiled
    • A61B 1/04: Instruments as above combined with photographic or television appliances
    • A61B 1/267: Instruments as above for the respiratory tract, e.g. laryngoscopes, bronchoscopes
    • A61B 1/2676: Bronchoscopes
    • A61B 5/00: Measuring for diagnostic purposes; identification of persons
    • A61B 5/06: Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B 5/065: Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe
    • A61B 5/066: Superposing sensor position on an image of the patient, e.g. obtained by ultrasound or x-ray imaging
    • A61B 5/68: Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6846: Arrangements specially adapted to be brought in contact with an internal body part, i.e. invasive
    • A61B 5/6847: Arrangements mounted on an invasive device
    • A61B 5/74: Details of notification to user or communication with user or patient; user input means
    • A61B 5/742: Details of notification using visual displays
    • A61B 5/7425: Displaying combinations of multiple images regardless of image source, e.g. displaying a reference anatomical image with a live image
    • A61B 34/00: Computer-aided surgery; manipulators or robots specially adapted for use in surgery
    • A61B 34/20: Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046: Tracking techniques
    • A61B 2034/2055: Optical tracking systems
    • A61B 2034/2057: Details of tracking cameras

Definitions

  • The present disclosure is related to an apparatus, a method, and a program that identify the position of an endoscope within a lumen structure having branching structures, such as the bronchial tubes, when the endoscope is inserted into the lumen structure to observe it.
  • Although endoscopes are capable of obtaining images in which the colors and textures of the interiors of lumen structures are clearly represented by use of imaging elements such as a CCD (Charge Coupled Device), these images are two dimensional representations of the interiors of the lumen structures. For this reason, it is difficult to understand what portion of the interior of the lumen structure is being represented by an endoscope image. Particularly, because endoscopes for the bronchial tubes are thin and have narrow fields of view, it is difficult for the leading ends of such endoscopes to reach their target positions.
  • the branching positions within a virtual endoscope image can be identified by employing the methods of Japanese Unexamined Patent Publication No. 2013-150650, PCT Japanese Publication No. 2013-517909, PCT Publication No. 2012-505695, and International Patent Publication No. WO2011/102012.
  • an actual endoscope image only includes the inner walls of a lumen structure. That is, very few structural features are included in such an actual endoscope image. For this reason, it is extremely difficult to recognize which path from a previous branch an endoscope has been guided into in the case that the endoscope is guided beyond a branch position.
  • the endoscope may be retracted to the branch position, the path into which the endoscope is to be inserted can be reconfirmed, and the endoscope may be reinserted into the reconfirmed path.
  • Such operations pose great burdens both on an operator and on a patient.
  • the error in selecting the path will not be known until a next branch position appears.
  • the present disclosure has been developed in view of the foregoing circumstances.
  • The present disclosure enables identification of which lumen structure an endoscope has been inserted into at branch positions.
  • An endoscope position identifying apparatus of the present disclosure comprises:
  • actual endoscope image obtaining means for sequentially obtaining actual endoscope images that represent the inner walls of a lumen structure, generated by an endoscope which is inserted into a lumen structure having a plurality of branched structures within a subject;
  • virtual endoscope image generating means for generating a plurality of virtual endoscope images that include a plurality of virtual endoscope branch images that represent the inner walls of the lumen structure for each of a plurality of viewpoint positions from which a plurality of branch structures are viewed, from a three dimensional image that includes the lumen structure of the subject;
  • corresponding virtual endoscope image determining means for determining a corresponding virtual endoscope image that corresponds to a branch structure closest to the current position of the endoscope, through which the endoscope has passed;
  • matching means for matching a plurality of virtual endoscope path images generated from a three dimensional image that represent each of a plurality of paths which are present at least in the direction of movement of the endoscope from a corresponding branch viewpoint position for which the corresponding virtual endoscope image was generated, for each of a plurality of viewpoint positions, and a plurality of actual endoscope path images obtained along a path from the branch structure through which the endoscope has passed to the current position of the endoscope, for each of a plurality of paths;
  • position identifying means for identifying the current position of the endoscope from among the plurality of paths, based on the results of matching.
  • In the case that an endoscope is inserted into a lumen structure, only a single lumen structure is present prior to a branch structure. However, the lumen structure branches into a plurality of paths beyond the branch structure. For this reason, one path along which the endoscope can move is present prior to a branch position, and a plurality of paths along which the endoscope can move are present beyond the branch structure.
  • the “plurality of paths which are present at least in the direction of movement of the endoscope from a corresponding branch viewpoint position” may be beyond the corresponding branch viewpoint position, that is, only the plurality of paths which are present in the direction of movement of the endoscope, or may be the plurality of paths that include the one path prior to the corresponding branch viewpoint position.
  • the “matching” operation refers to calculating index values that represent the degree to which the plurality of virtual endoscope path images and the plurality of actual endoscope path images match, for each of the plurality of paths.
  • the index values are the results of matching. Note that the degree of similarity between the plurality of virtual endoscope path images and the plurality of actual endoscope path images may be employed as the index value.
  • the endoscope position identifying apparatus of the present disclosure may further comprise identification result output means, for outputting the results of identification obtained by the position identifying means.
  • the endoscope position identifying apparatus of the present disclosure may further comprise determining means, for determining whether the endoscope is positioned along a desired path within the lumen structure based on the identification results obtained by the position identifying means.
  • the corresponding virtual endoscope image determining means may determine the corresponding virtual endoscope image, by comparing at least one actual endoscope branch image, obtained at the position of the closest branch structure through which the endoscope has passed, and a plurality of virtual endoscope branch images.
  • the corresponding virtual endoscope image determining means may determine the corresponding virtual endoscope image, by comparing a virtual endoscope branch image generated at a branch viewpoint position toward the side of direction of movement of the endoscope from a corresponding branch viewpoint position corresponding to a branch structure of the lumen structure through which the endoscope has passed, and at least one actual endoscope image.
  • The endoscope position identifying apparatus of the present disclosure may further comprise actual endoscope image identifying means, for identifying an actual endoscope branch image.
  • the actual endoscope image identifying means may perform processes for identifying an actual endoscope branch image at predetermined temporal intervals.
  • the virtual endoscope image generating means may generate the virtual endoscope path images after the corresponding virtual endoscope image is determined.
  • the virtual endoscope image generating means may generate virtual endoscope images including a plurality of virtual endoscope branch images and virtual endoscope path images which are set at predetermined intervals along the paths of a lumen structure within a three dimensional image; the endoscope position identifying apparatus may further comprise first storage means for storing virtual endoscope images for a plurality of viewpoint positions; and the matching means may obtain a plurality of virtual endoscope path images from the first storage means.
  • the endoscope position identifying apparatus of the present disclosure may further comprise second storage means for storing a plurality of actual endoscope images from the current position of the endoscope to the position of a branch structure through which the endoscope has passed.
  • An endoscope position identifying method of the present disclosure comprises:
  • sequentially obtaining actual endoscope images that represent the inner walls of a lumen structure, generated by an endoscope which is inserted into a lumen structure having a plurality of branched structures within a subject;
  • generating a plurality of virtual endoscope images that include a plurality of virtual endoscope branch images that represent the inner walls of the lumen structure for each of a plurality of viewpoint positions from which a plurality of branch structures are viewed, from a three dimensional image that includes the lumen structure of the subject;
  • determining a corresponding virtual endoscope image that corresponds to a branch structure closest to the current position of the endoscope, through which the endoscope has passed;
  • matching a plurality of virtual endoscope path images, generated from the three dimensional image, that represent each of a plurality of paths which are present at least in the direction of movement of the endoscope from a corresponding branch viewpoint position for which the corresponding virtual endoscope image was generated, and a plurality of actual endoscope path images obtained along a path from the branch structure through which the endoscope has passed to the current position of the endoscope, for each of a plurality of paths; and
  • identifying the current position of the endoscope from among the plurality of paths, based on the results of matching.
  • The endoscope position identifying method of the present disclosure may also be provided as a program that causes a computer to execute the method.
  • According to the present disclosure, a corresponding virtual endoscope image that corresponds to a branch structure closest to the current position of the endoscope, through which the endoscope has passed, is determined. Then, matching of a plurality of virtual endoscope path images, generated from a three dimensional image for each of a plurality of viewpoint positions, that represent each of a plurality of paths which are present at least in the direction of movement of the endoscope from a corresponding branch viewpoint position for which the corresponding virtual endoscope image was generated, and a plurality of actual endoscope path images obtained along a path from the branch structure through which the endoscope has passed to the current position of the endoscope, is performed for each of a plurality of paths.
  • the current position of the endoscope is identified from among the plurality of paths, based on the results of matching. Therefore, which path the endoscope is positioned in, in the movement direction of the endoscope from the branch structure closest to the current position of the endoscope, through which the endoscope has passed, can be identified. Accordingly, whether the endoscope is in a correct path or an incorrect path after passing through the branch structure can be easily recognized. As a result, diagnosis employing the endoscope can be performed accurately.
  • FIG. 1 is a schematic diagram that illustrates the hardware configuration of a diagnosis assisting system to which an endoscope position identifying apparatus according to an embodiment of the present disclosure is applied.
  • FIG. 2 is a schematic diagram that illustrates the configuration of an endoscope position identifying apparatus according to a first embodiment of the present disclosure, which is realized by installing a branch structure determining program in a computer.
  • FIG. 3 is a diagram for explaining obtainment of a plurality of actual endoscope images.
  • FIG. 4 is a diagram for explaining obtainment of a plurality of virtual endoscope path images.
  • FIG. 5 is a diagram for explaining obtainment of a plurality of actual endoscope path images.
  • FIG. 6 is a diagram that illustrates identification results and determination results which are displayed on a display.
  • FIG. 7 is a diagram that illustrates identification results and determination results which are displayed on a display.
  • FIG. 8 is a flow chart that illustrates the processes which are performed by the first embodiment.
  • FIG. 9 is a schematic diagram that illustrates the configuration of an endoscope position identifying apparatus according to a second embodiment of the present disclosure, which is realized by installing a branch structure determining program in a computer.
  • FIG. 10 is a diagram that illustrates an image that prompts an operator to select a corresponding virtual endoscope image, which is displayed on a display.
  • FIG. 11 is a flow chart that illustrates the processes which are performed by a third embodiment.
  • FIG. 1 is a schematic diagram that illustrates the hardware configuration of a diagnosis assisting system to which an endoscope position identifying apparatus according to an embodiment of the present disclosure is applied.
  • In the diagnosis assisting system, an endoscope 3 , a three dimensional image obtaining apparatus 4 , an image storage server 5 , and an endoscope position identifying apparatus 6 are connected via a network 8 so as to be capable of communicating with each other, as illustrated in FIG. 1 .
  • the endoscope 3 is equipped with an endoscope scope 31 for imaging the interiors of lumen structures of subjects, and a processing device 32 for generating images of the interiors of lumen structures based on signals obtained by imaging.
  • the endoscope scope 31 is constituted by an insertion portion, which is to be inserted into lumen structures of subjects, mounted on an operating portion 3 A.
  • the endoscope scope 31 is connected to the processing device 32 via a universal cord which is detachably connected to the processing device 32 .
  • the operating portion 3 A includes various buttons for issuing commands to bend the leading end 3 B of the insertion portion within predetermined angular ranges in the vertical direction and the horizontal direction, to operate a piercing needle mounted on the leading end of the endoscope scope 31 to take tissue samples, and the like.
  • the endoscope scope 31 is a flexible scope for use in bronchial tubes, and is inserted into the bronchial tubes of a subject.
  • Light which is guided through an optical fiber from a light source device (not shown) provided in the processing device 32 is emitted from the leading end 3 B of the insertion portion of the endoscope scope 31 , and an imaging optical system of the endoscope scope 31 obtains images of the interior of the bronchial tubes of the subject.
  • the processing device 32 converts image signals obtained by the endoscope scope 31 into digital image signals, corrects image quality by digital signal processes such as white balance adjustment and shading correction, and generates endoscope images T 0 .
  • the generated images constitute a video formed by a plurality of endoscope images T 0 , which are obtained at a predetermined frame rate of 30 fps, for example.
  • the endoscope images T 0 are transmitted to the image storage server 5 or the endoscope position identifying apparatus 6 .
  • the endoscope images T 0 which are obtained by the endoscope will be referred to as “actual endoscope images T 0 ”, in order to distinguish them from virtual endoscope images to be described later.
  • the three dimensional image obtaining apparatus 4 images an examination target portion of the subject to generate a three dimensional image V 0 that represents the examination target portion.
  • the three dimensional image obtaining apparatus 4 is a CT apparatus, an MRI apparatus, a PET (Positron Emission Tomography) apparatus, an ultrasound diagnostic apparatus, or the like.
  • the three dimensional image V 0 generated by the three dimensional image obtaining apparatus 4 is transmitted to the image storage server 5 and stored therein.
  • the three dimensional image obtaining apparatus 4 generates a three dimensional image V 0 that represents the thoracic portion which includes the bronchial tubes.
  • the image storage server 5 is a computer that stores and manages various types of data, and is equipped with a large capacity external memory device and database management software.
  • the image storage server 5 communicates with the other components of the system via the network 8 , to transmit image data and the like.
  • In the image storage server 5 , image data such as the actual endoscope images T 0 obtained by the endoscope 3 and the three dimensional image V 0 which is generated by the three dimensional image obtaining apparatus 4 are obtained via the network 8 , then stored within a recording medium such as the large capacity external memory device and managed.
  • the actual endoscope images T 0 are video data which are sequentially obtained accompanying movement of the leading end 3 B of the endoscope.
  • Note that the actual endoscope images T 0 are transmitted to the endoscope position identifying apparatus 6 without passing through the image storage server 5 .
  • the storage format of image data and communications among each component of the system via the network 8 are based on a protocol such as the DICOM (Digital Imaging and Communication in Medicine) protocol.
  • The endoscope position identifying apparatus 6 is a computer, in which an endoscope position identifying program according to an embodiment of the present disclosure is installed.
  • the computer may be a work station or a personal computer which is directly operated by a physician who performs diagnosis, or may be a server computer connected to the work station or the personal computer via a network.
  • the endoscope position identifying program is recorded on recording media such as a DVD (Digital Versatile Disc) and a CD-ROM (Compact Disc Read Only Memory) which are distributed, and installed onto the computer from the recording medium.
  • Alternatively, the endoscope position identifying program is stored in a recording device of a server computer connected to a network or in a network storage, in a state accessible from the exterior, downloaded to the computer which is utilized by a physician who is the user of the endoscope position identifying apparatus 6 according to a request, and then installed therein.
  • FIG. 2 is a schematic diagram that illustrates the configuration of an endoscope position identifying apparatus, which is realized by installing an endoscope position identifying program in a computer.
  • the endoscope position identifying apparatus 6 is equipped with a CPU (Central Processing Unit) 11 , a memory 12 , and a storage 13 , as components of a standard work station.
  • A display 14 and an input section 15 such as a mouse are connected to the endoscope position identifying apparatus 6 .
  • the storage 13 has recorded therein images, such as the actual endoscope images T 0 and the three dimensional image V 0 which are obtained from the endoscope 3 and the three dimensional image obtaining apparatus 4 via the network 8 , as well as images which are generated by processes performed by the endoscope position identifying apparatus 6 .
  • the endoscope position identifying program is stored in the memory 12 .
  • The endoscope position identifying program defines an image obtaining process that sequentially obtains the actual endoscope images T 0 generated by the processing device 32 as well as image data that represents the three dimensional image V 0 generated by the three dimensional image obtaining apparatus 4 ; a virtual endoscope image generating process that generates virtual endoscope images including a plurality of virtual endoscope branch images that represent the inner walls of the bronchial tubes for each of a plurality of viewpoint positions from which a plurality of branch structures are viewed, from the three dimensional image V 0 ; a corresponding virtual endoscope image determining process that determines a corresponding virtual endoscope image that corresponds to a branch structure closest to the current position of the endoscope, through which the endoscope has passed; a matching process that matches a plurality of virtual endoscope path images, generated from the three dimensional image V 0 , that represent each of a plurality of paths which are present at least in the direction of movement of the endoscope from a corresponding branch viewpoint position for which the corresponding virtual endoscope image was generated, and a plurality of actual endoscope path images obtained along a path from the branch structure through which the endoscope has passed to the current position of the endoscope, for each of a plurality of paths; a position identifying process that identifies the current position of the endoscope from among the plurality of paths, based on the results of matching; an identification result output process that outputs the results of identification; and a determining process that determines whether the endoscope is positioned along a desired path within the bronchial tubes, based on the identification results.
  • the computer functions as an image obtaining unit 21 , a virtual endoscope image generating unit 22 , a corresponding virtual endoscope image determining unit 23 , a matching unit 24 , a position identifying unit 25 , an identification result output unit 26 , and a determining unit 27 , by the CPU 11 executing the above processes according to the endoscope position identifying program.
  • the computer may be equipped with a plurality of processors that respectively perform each of the image obtaining process, the virtual endoscope image generating process, the corresponding virtual endoscope image determining process, the matching process, the position identifying process, the identification result output process, and the determining process.
  • the image obtaining unit 21 corresponds to an actual endoscope image obtaining means
  • the storage 13 corresponds to a first and a second storage means.
  • the image obtaining unit 21 sequentially obtains actual endoscope images which are imaged by the endoscope 3 at predetermined viewpoint positions within the bronchial tubes, and also obtains the three dimensional image V 0 . In the case that the three dimensional image V 0 is already recorded in the storage 13 , the image obtaining unit 21 may obtain the three dimensional image V 0 from the storage 13 . The actual endoscope images T 0 are displayed on the display 14 . Note that the image obtaining unit 21 stores the obtained actual endoscope images T 0 and the three dimensional image V 0 in the storage 13 .
  • the virtual endoscope image generating unit 22 generates virtual endoscope images K 0 including a plurality of virtual endoscope branch images that represent the inner walls of the bronchial tubes for each of a plurality of viewpoint positions from which a plurality of branch structures are viewed, from the three dimensional image V 0 .
  • the generation of the virtual endoscope images K 0 will be described hereinafter.
  • the virtual endoscope image generating unit 22 extracts the bronchial tubes from the three dimensional image V 0 . Specifically, the virtual endoscope image generating unit 22 extracts a graph structure of a bronchial tube region included in an input three dimensional image V 0 as a three dimensional bronchial tube image, employing the technique disclosed in Japanese Unexamined Patent Publication No. 2010-220742, for example. Hereinafter, an example of the method for extracting the graph structure will be described.
  • Within the three dimensional image V 0 , pixels at the interiors of the bronchial tubes correspond to air regions, and are represented as regions having low pixel values.
  • the walls of the bronchial tubes are represented as cylindrical or linear structures having comparatively high pixel values. Therefore, structural analysis of shapes based on the distribution of pixel values is performed for each pixel, to extract the bronchial tubes.
  • Bronchial tubes branch at multiple steps, and the diameters of the bronchial tubes decrease at portions closer to the distal ends thereof.
  • the virtual endoscope image generating unit 22 administers multiple resolution conversion on the three dimensional image V 0 to generate a plurality of three dimensional images, in order to enable bronchial tubes of different sizes to be detected.
  • a detection algorithm is applied to the three dimensional images of each of the resolutions, to detect lumen structures having different sizes.
  • Specifically, Hessian matrices are calculated for each pixel of the three dimensional images of each resolution, and whether a pixel is one within a lumen structure is determined from the magnitude relationship of the eigenvalues of its Hessian matrix.
  • A Hessian matrix is a matrix having the second order partial derivatives of the density values along each axis (the x axis, the y axis, and the z axis of the three dimensional image) as its elements.
  • The Hessian matrices are 3×3 matrices, as represented by Formula (1) below.
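Formula (1) itself does not survive in this text. The standard form of the Hessian of a three dimensional density function I(x, y, z), consistent with the description above, is reconstructed here as an assumption:

\[
\nabla^2 I =
\begin{pmatrix}
\frac{\partial^2 I}{\partial x^2} & \frac{\partial^2 I}{\partial x \partial y} & \frac{\partial^2 I}{\partial x \partial z} \\
\frac{\partial^2 I}{\partial y \partial x} & \frac{\partial^2 I}{\partial y^2} & \frac{\partial^2 I}{\partial y \partial z} \\
\frac{\partial^2 I}{\partial z \partial x} & \frac{\partial^2 I}{\partial z \partial y} & \frac{\partial^2 I}{\partial z^2}
\end{pmatrix}
\tag{1}
\]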
  • When the eigenvalues of the Hessian matrix of an arbitrary pixel are designated as λ1, λ2, and λ3, it is known that the pixel represents a lumen structure in the case that two of the eigenvalues are large while one of the eigenvalues is close to 0, for example, in the case that λ3, λ2 >> λ1 and λ1 ≈ 0 are satisfied.
  • In addition, the eigenvector corresponding to the smallest eigenvalue (λ1 ≈ 0) of the Hessian matrix matches the direction of the principal axis of the lumen structure.
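As an illustration of this eigenvalue analysis, the following is a minimal sketch using numpy and scipy; the smoothing scale, the ratio test, and all names are assumptions for illustration, not values taken from the patent:

```python
import numpy as np
from scipy import ndimage


def hessian_eigenvalues(volume, sigma=1.0):
    """Per-voxel Hessian eigenvalues, sorted by absolute value."""
    smoothed = ndimage.gaussian_filter(volume.astype(np.float64), sigma)
    grads = np.gradient(smoothed)  # first derivatives along each array axis
    hessian = np.empty(volume.shape + (3, 3))
    for i, g in enumerate(grads):
        for j, g2 in enumerate(np.gradient(g)):
            hessian[..., i, j] = g2  # second derivative along axes i and j
    eigvals = np.linalg.eigvalsh(hessian)
    order = np.argsort(np.abs(eigvals), axis=-1)
    return np.take_along_axis(eigvals, order, axis=-1)  # |l1| <= |l2| <= |l3|


def is_lumen_voxel(eigvals, ratio=4.0, eps=1e-6):
    """Line-like structure: two large eigenvalues, one close to zero."""
    l1, l2, l3 = eigvals[..., 0], eigvals[..., 1], eigvals[..., 2]
    return (np.abs(l2) > ratio * (np.abs(l1) + eps)) & \
           (np.abs(l3) > ratio * (np.abs(l1) + eps))
```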
  • Bronchial tubes can be represented by a graph structure.
  • Note that lumen structures which are extracted in this manner may not necessarily be extracted as a single graph structure in which all of the lumen structures are connected, due to the influence of tumors or the like. Therefore, after detection of the lumen structures is completed within the entirety of the three dimensional image V 0 , whether the extracted lumen structures are within a predetermined distance of each other, and whether an angle formed between the direction of a baseline that connects arbitrary points within two extracted lumen structures and the direction of the principal axis of each lumen structure is within a predetermined angular range, are evaluated. Whether a plurality of lumen structures are connected is determined based on the results of these evaluations, and the connective relationship among the extracted lumen structures is reconstructed. Extraction of the graph structure of the bronchial tubes is completed by this reconstruction.
  • the virtual endoscope image generating unit 22 classifies the extracted graph structure into a starting point, end points, branch points, and edges.
  • a three dimensional graph structure that represents the bronchial tubes can be obtained as a bronchial tube image, by connecting the starting point, the end points, and the branch points with the edges.
  • the branch points are voxels having three or more links. Note that the method for generating the graph structure is not limited to that described above, and other methods may be applied.
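As an illustration of this classification, here is a sketch assuming the extracted graph is held as a networkx graph and that the starting point (the tracheal end) is known; the function and variable names are assumptions for illustration:

```python
import networkx as nx


def classify_bronchial_graph(g: nx.Graph, trachea_node):
    """Classify graph nodes into start, end points, and branch points."""
    end_points = [n for n in g.nodes
                  if g.degree(n) == 1 and n != trachea_node]
    branch_points = [n for n in g.nodes if g.degree(n) >= 3]  # 3+ links
    return {"start": trachea_node,
            "ends": end_points,
            "branches": branch_points,
            "edges": list(g.edges)}
```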
  • the virtual endoscope image generating unit 22 sets a plurality of viewpoint positions, which are set at predetermined intervals along paths along the graph structure of the bronchial tubes from the starting point to the end points thereof, as viewpoints.
  • the branch points are also set as viewpoints.
  • Projected images, which are formed by projecting the three dimensional image V 0 by the central projection method onto predetermined projection planes along sight lines that extend radially from the viewpoints in the direction of movement of the leading end 3 B of the endoscope, are obtained as virtual endoscope images K 0 .
  • a known technique such as the volume rendering technique may be employed as the specific method for central projection.
  • the angle of view (the range of the sight line) of each of the virtual endoscope images K 0 and the center of the field of view (the center of the projection direction) are set in advance by input from a user.
  • the generated virtual endoscope images K 0 are linked with each of the viewpoints along the graph structure of the bronchial tubes and stored in the storage 13 .
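As a sketch of setting viewpoints at predetermined intervals along a path, under the assumption that each path is available as an ordered polyline of 3D points (the 2 mm interval is an illustrative value, not taken from the patent):

```python
import numpy as np


def sample_viewpoints(path_points, spacing_mm=2.0):
    """Resample an (N, 3) polyline at fixed arc-length intervals."""
    pts = np.asarray(path_points, dtype=float)
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    arclen = np.concatenate([[0.0], np.cumsum(seg)])
    targets = np.arange(0.0, arclen[-1], spacing_mm)
    # Interpolate each coordinate against cumulative arc length.
    return np.stack([np.interp(targets, arclen, pts[:, d])
                     for d in range(3)], axis=1)
```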
  • the virtual endoscope images K 0 which are generated with the branch points as the viewpoints thereof are designated as virtual endoscope branch images Kb.
  • the virtual endoscope branch images Kb represent the inner walls of the bronchial tubes in the case that branching structures are viewed from the branch points.
  • the virtual endoscope branch images Kb are linked with the branch points of the graph structure.
  • the virtual endoscope images K 0 generated for viewpoints along the edges of the graph structure are designated as virtual endoscope path images Kp, and linked with the viewpoints along the edges of the graph structures.
  • the corresponding virtual endoscope image determining unit 23 determines a corresponding virtual endoscope image Kbc that corresponds to the branch structure closest to the leading end 3 B of the endoscope through which the leading end 3 B of the endoscope has passed, from among a plurality of virtual endoscope branch images Kb. Specifically, at least one actual endoscope image T 0 that includes an actual endoscope image at the position of the branch structure closest to the leading end 3 B of the endoscope through which the leading end 3 B of the endoscope has passed and a plurality of virtual endoscope branch images Kb are compared, to determine the corresponding virtual endoscope image Kbc.
  • First, the corresponding virtual endoscope image determining unit 23 obtains a plurality of actual endoscope images Tni (i = 1 to m, where m is the number of images), which were obtained within a predetermined amount of time prior to the current point in time, from the storage 13 .
  • FIG. 3 is a diagram for explaining obtainment of a plurality of actual endoscope images.
  • The endoscope which has been inserted into the subject moves, with the direction toward an end of the bronchial tubes as its direction of movement, to a desired position. Accordingly, in the case that the leading end 3 B has passed a branch structure, the actual endoscope image of the branch structure will be included within the plurality of actual endoscope images T 0 which were obtained within a predetermined amount of time prior to the current point in time.
  • For example, assume that the current position of the leading end 3 B of the endoscope is point 40 within the bronchial tube image illustrated in FIG. 3 , and that the leading end 3 B of the endoscope had been positioned at point 41 at a point in time which is a predetermined amount of time prior to the current point in time. A branch point 42 is included between the point 40 and the point 41 . Therefore, an actual endoscope image obtained at the position of the branch structure will be included within the plurality of actual endoscope images Tni.
  • the predetermined amount of time may be one minute, for example, but the present disclosure is not limited to such a configuration.
  • only one actual endoscope image may be obtained as the actual endoscope image obtained within the predetermined amount of time prior to the current point in time, as long as the actual endoscope image includes the branch structure.
  • In the present embodiment, however, a plurality of actual endoscope images Tni are obtained.
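A minimal sketch of retaining only the frames from the predetermined time window, using the one-minute window and the 30 fps frame rate mentioned above; the buffer structure itself is an assumption for illustration:

```python
from collections import deque

FPS = 30
WINDOW_SECONDS = 60
buffer = deque(maxlen=FPS * WINDOW_SECONDS)  # keeps the most recent minute


def on_new_frame(frame):
    buffer.append(frame)  # oldest frames are discarded automatically


def recent_images():
    return list(buffer)   # the candidate images Tni for branch detection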
  • The corresponding virtual endoscope image determining unit 23 compares the plurality of actual endoscope images Tni with the plurality of virtual endoscope branch images Kb, and determines the corresponding virtual endoscope image Kbc that corresponds to the branch structure closest to the current position of the leading end 3 B of the endoscope through which the leading end 3 B of the endoscope has passed. Specifically, all of the correlative values between each of the plurality of actual endoscope images Tni and each of the plurality of virtual endoscope branch images Kb are calculated.
  • The actual endoscope image for which the greatest correlative value has been calculated is determined to be the actual endoscope branch image Tbc which was obtained at the position of the branch structure, and the virtual endoscope branch image Kb for which the greatest correlative value has been calculated is determined to be the corresponding virtual endoscope image Kbc.
  • the viewpoint position at the branch within the graph structure of the bronchial tubes at which the corresponding virtual endoscope image Kbc was generated is the corresponding branch viewpoint position.
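The exhaustive comparison described above can be sketched as follows. Normalized cross-correlation is one plausible choice for the correlative value; the patent text does not fix the correlation measure, and all names here are illustrative:

```python
import numpy as np


def ncc(a, b):
    """Normalized cross-correlation between two equally sized images."""
    a = (a - a.mean()) / (a.std() + 1e-8)
    b = (b - b.mean()) / (b.std() + 1e-8)
    return float((a * b).mean())


def find_corresponding_branch_image(actual_images, branch_images):
    """Return indices of Tbc and Kbc: the best-correlated pair."""
    best = (-np.inf, None, None)
    for i, t in enumerate(actual_images):       # candidates Tni
        for j, k in enumerate(branch_images):   # branch images Kb
            score = ncc(t, k)
            if score > best[0]:
                best = (score, i, j)
    _, tbc_index, kbc_index = best
    return tbc_index, kbc_index
```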
  • the matching unit 24 performs matching of a plurality of virtual endoscope path images Kp that represent each of a plurality of paths which are present in the direction of movement of the endoscope from a corresponding branch viewpoint position for which the corresponding virtual endoscope image Kbc was generated, for each of a plurality of viewpoint positions, and a plurality of actual endoscope path images obtained along a path from the branch structure through which the leading end 3 B of the endoscope has passed to the current position of the endoscope, for each of a plurality of paths.
  • FIG. 4 is a diagram for explaining obtainment of a plurality of virtual endoscope path images. Note that in FIG. 4 and FIG. 5 to be described later, the downward direction in the drawing sheet is the direction of movement of the endoscope. As illustrated in FIG. 4 , if the position of the branch structure at which the corresponding virtual endoscope image Kbc was obtained, that is, the corresponding branch viewpoint position is designated as position 44 , the bronchial tubes branch into two paths 45 and 46 in the direction of movement of the endoscope from the position 44 .
  • the matching unit 24 obtains a plurality of virtual endoscope path images Kp linked to a plurality of predetermined viewpoint positions from the corresponding branch viewpoint position 44 with respect to each of the paths 45 and 46 from the storage 13 .
  • For example, a plurality of virtual endoscope path images Kp 1 are obtained for the path 45 , and a plurality of virtual endoscope path images Kp 2 from the corresponding branch viewpoint position 44 to a position 48 are obtained for the path 46 .
  • Note that virtual endoscope path images Kp may also be obtained for a path 50 to a position 49 opposite the direction of movement of the endoscope from the corresponding branch viewpoint position 44 , in addition to the virtual endoscope path images Kp along the paths 45 and 46 from the corresponding branch viewpoint position 44 .
  • FIG. 5 is a diagram for explaining obtainment of a plurality of actual endoscope path images.
  • The position at which the actual endoscope branch image Tbc was obtained, that is, the branch position, is designated as position 51 .
  • the matching unit 24 obtains a plurality of actual endoscope path images Tp along a path 53 from the branch position 51 to the current position 52 of the leading end 3 B of the endoscope from the storage 13 .
  • Note that actual endoscope path images Tp may also be obtained for a path 55 to a position 54 opposite the direction of movement of the endoscope from the branch position 51 , in addition to the actual endoscope path images Tp along the path 53 from the branch position 51 .
  • the matching unit 24 performs matching between each of the plurality of actual endoscope path images Tp and each of the plurality of virtual endoscope path images Kp 1 as well as each of the plurality of virtual endoscope path images Kp 2 .
  • the matching is performed by calculating the degree of similarity between each of the plurality of actual endoscope path images Tp and each of the plurality of virtual endoscope path images Kp 1 as well as each of the plurality of virtual endoscope path images Kp 2 .
  • Specifically, the degrees of similarity are calculated by maximizing the correlative values between the plurality of actual endoscope path images Tp and each of the plurality of virtual endoscope path images Kp 1 and the plurality of virtual endoscope path images Kp 2 , using Formula (2) below.
  • the technique that employs Formula (2) is a DP (Dynamic Programming) technique.
  • In Formula (2), v k is the kth virtual endoscope image from among the virtual endoscope path images Kp; r jk is the actual endoscope image that corresponds to the virtual endoscope image v k from among the plurality of actual endoscope path images Tp; K is the number of virtual endoscope path images; and S(v k , r jk ) is a function that calculates the correlative value between the virtual endoscope image v k and the actual endoscope image r jk . Note that the number of virtual endoscope path images and the number of actual endoscope path images may differ.
  • Formula (2) is employed to calculate the correlative values between all of the virtual endoscope images included in the virtual endoscope path images and all of the actual endoscope images included in the actual endoscope path images. In the case that these correlative values are arranged in a two dimensional matrix in the order in which the actual endoscope path images Tp and the virtual endoscope path images Kp were obtained, the degree of similarity is calculated as the sum of the maximum values within each row or within each column. Note that in the case that the virtual endoscope path images Kp are arranged in the horizontal direction and the actual endoscope path images Tp are arranged in the vertical direction within this matrix, Formula (2) is that which calculates the sum of the maximum values within each row.
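Formula (2) itself is likewise not reproduced in this text. A plausible reconstruction from the symbol definitions and the description above, stated as an assumption rather than as the published form, is:

\[
M \;=\; \max_{j_1 \le j_2 \le \dots \le j_K} \; \sum_{k=1}^{K} S(v_k,\, r_{j_k})
\tag{2}
\]

Under this reading, the monotonicity constraint on the indices j_k expresses that the endoscope advances along the path in one direction, and it is this constraint that makes the dynamic programming evaluation referred to above applicable.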
  • The matching unit 24 calculates the degree of similarity for each of the sets of virtual endoscope path images Kp employing Formula (2), and determines the virtual endoscope path images Kpmax, for which the degree of similarity is maximal. For example, in the case that the degree of similarity of the actual endoscope path images Tp to the virtual endoscope path images Kp 1 is greater than the degree of similarity of the actual endoscope path images Tp to the virtual endoscope path images Kp 2 , the matching unit 24 determines the virtual endoscope path images Kp 1 to be the virtual endoscope path images Kpmax, for which the degree of similarity is maximal.
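The following sketch evaluates the reconstruction of Formula (2) above by dynamic programming for each candidate path and identifies the path with the maximal score. It reuses the `ncc` helper from the earlier sketch; all names are illustrative:

```python
import numpy as np


def path_similarity(virtual_images, actual_images):
    """Degree of similarity under a monotone (DP) alignment of Tp to Kp."""
    s = np.array([[ncc(v, r) for r in actual_images]
                  for v in virtual_images])
    # After processing virtual image k, dp[j] holds the best score with the
    # k-th assignment at an actual-image index <= j (indices non-decreasing).
    dp = np.maximum.accumulate(s[0])
    for k in range(1, len(virtual_images)):
        dp = np.maximum.accumulate(dp + s[k])
    return float(dp[-1])


def identify_current_path(candidate_paths, actual_path_images):
    """candidate_paths: {path_id: [virtual endoscope path images Kp]}."""
    scores = {pid: path_similarity(kp, actual_path_images)
              for pid, kp in candidate_paths.items()}
    return max(scores, key=scores.get)  # path of the current position
```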
  • the position identifying unit 25 identifies the path, for which the virtual endoscope path image Kpmax having the maximal degree of similarity has been obtained, as the path of the current position of the leading end 3 B of the endoscope.
  • the identification result output unit 26 outputs the results of identification, by displaying the identification results of the current position of the leading end 3 B of the endoscope, which has been identified by the position identifying unit 25 , on the display 14 .
  • the determining unit 27 determines whether the leading end 3 B of the endoscope is positioned along a desired path within the bronchial tubes, based on the identification results of the current position of the leading end 3 B of the endoscope, which has been identified by the position identifying unit 25 .
  • FIG. 6 is a diagram that illustrates identification results and determination results which are displayed on the display 14 .
  • a current actual endoscope image G 0 and a bronchial tube image G 1 are displayed on the display 14 .
  • A path 60 in which the leading end 3 B of the endoscope is currently positioned is displayed in a different color than the other paths within the bronchial tubes.
  • FIG. 6 illustrates the difference in display colors by hatching. Note that the path to the current position of the leading end 3 B of the endoscope may be displayed by a broken line 61 as illustrated in FIG. 6 .
  • a target position 62 for the leading end 3 B of the endoscope may also be displayed.
  • the target position 62 is present along the path 60 beyond the current position of the leading end 3 B of the endoscope. Therefore, the determining unit 27 determines that the leading end 3 B of the endoscope is positioned along a desired path that enables the leading end 3 B of the endoscope to reach the target position 62 . Accordingly, a determination result 63 indicating “OK” is displayed. In this case, the operator can cause the leading end 3 B of the endoscope to progress further toward the target position 62 .
  • Conversely, in the case that the target position 62 is not present along the path in which the leading end 3 B of the endoscope is currently positioned, the determining unit 27 determines that the leading end 3 B of the endoscope is not positioned along a desired path that enables the leading end 3 B of the endoscope to reach the target position 62 . Accordingly, a determination result 63 indicating "NG (No Good)" is displayed. In this case, the operator can retract the endoscope to the closest branch position and insert the endoscope into the correct path, to cause the leading end 3 B of the endoscope to progress along the desired path.
  • the determination result 63 may be displayed only in cases that the leading end 3 B of the endoscope is not positioned along a desired path. Conversely, the determination result 63 may be displayed only in cases that the leading end 3 B of the endoscope is positioned along a desired path. In addition, the determination result may be output as an audio message or the like.
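A minimal sketch of this determination, assuming the bronchial tree is held as a directed graph rooted at the trachea and that both the identified path and the target position are graph nodes (all names are assumptions for illustration):

```python
import networkx as nx


def desired_path_determination(tree: nx.DiGraph, identified_path_node,
                               target_node):
    """Return "OK" if the target is reachable from the identified path."""
    reachable = nx.descendants(tree, identified_path_node)
    reachable.add(identified_path_node)
    return "OK" if target_node in reachable else "NG (No Good)"
```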
  • FIG. 8 is a flow chart that illustrates the processes which are performed by the first embodiment. Note that here, it is assumed that a three dimensional image V 0 has been obtained by the image obtaining unit 21 and is stored in the storage 13 .
  • the image obtaining unit 21 obtains actual endoscope images T 0 (step ST 1 ).
  • the virtual endoscope image generating unit 22 generates a bronchial tube image from a three dimensional image V 0 (step ST 2 ), and further generates virtual endoscope images K 0 that include a plurality of virtual endoscope branch images Kb (step ST 3 ).
  • the virtual endoscope image generating unit 22 links the virtual endoscope images K 0 with viewpoints within the graph structure of bronchial tubes, and stores the linked virtual endoscope images K 0 in the storage 13 (step ST 4 ).
  • the corresponding virtual endoscope image determining unit 23 obtains a plurality of actual endoscope images Tni, which were obtained within a predetermined amount of time prior to the current point in time, from the storage 13 (step ST 5 ). Then, the corresponding virtual endoscope image determining unit 23 compares a plurality of actual endoscope images Tni and the plurality of virtual endoscope branch images Kb, and determines a corresponding virtual endoscope image Kbc that corresponds to the branch structure closest to the current position of the leading end 3 B of the endoscope, through which the leading end 3 B of the endoscope has passed (step ST 6 ).
  • Next, the matching unit 24 performs matching among a plurality of actual endoscope path images Tp and a plurality of virtual endoscope path images Kp for each of a plurality of paths (step ST 7 ), the position identifying unit 25 identifies the path that the current position of the leading end 3 B of the endoscope is located in (step ST 8 ), and the determining unit 27 determines whether the leading end 3 B of the endoscope is positioned along a desired path within the bronchial tubes (Desired Path Determination: step ST 9 ). Further, the identification result and the determination result are displayed on the display 14 (step ST 10 ), and the process ends.
  • According to the first embodiment as described above, the path along which the leading end 3 B of the endoscope is positioned, from among the paths which are present in the direction of movement of the endoscope from the branch structure closest to the current position of the leading end 3 B of the endoscope, through which the leading end 3 B of the endoscope has passed, can be identified. Accordingly, whether the leading end 3 B of the endoscope is positioned in an incorrect path or a correct path after passing through the branch structure can be easily recognized. As a result, diagnosis using the endoscope can be performed with high accuracy.
  • FIG. 9 is a schematic diagram that illustrates the configuration of an endoscope position identifying apparatus according to the second embodiment. Configurations illustrated in FIG. 9 which are the same as those illustrated in FIG. 2 are denoted with the same reference numerals, and detailed descriptions thereof will be omitted.
  • the second embodiment differs from the first embodiment in that the second embodiment is equipped with an actual endoscope image identifying unit 28 that identifies an actual endoscope branch image which was obtained at the position of a branch structure closest to the endoscope, through which the endoscope has passed.
  • The actual endoscope image identifying unit 28 continuously determines whether the actual endoscope images T 0 which are sequentially obtained by the image obtaining unit 21 include branch structures, and designates an actual endoscope image T 0 that includes a branch structure as an actual endoscope branch image Tbc. Determination regarding whether branch structures are included may be performed by employing classifiers, for example.
  • the classifiers are generated by machine learning that employs a plurality of actual endoscope images that include branch structures as learning data. Actual endoscope images are input to these classifiers, which output scores that represent the degree to which the actual endoscope images include branch structures.
  • The actual endoscope image identifying unit 28 determines that an actual endoscope image includes a branch structure in the case that the score output from the classifiers exceeds a predetermined threshold value, and designates such an actual endoscope image as an actual endoscope branch image Tbc.
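A sketch of this classifier-based designation follows. The scikit-learn API is used only for illustration; the patent does not specify the learning method, the features, or the threshold value, so all of these are assumptions:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

THRESHOLD = 0.8  # illustrative score threshold


def train_branch_classifier(images, labels):
    """images: (N, H, W) arrays; labels: 1 if a branch structure is included."""
    x = np.asarray(images).reshape(len(images), -1)
    clf = RandomForestClassifier(n_estimators=100)
    return clf.fit(x, labels)


def is_branch_image(clf, image):
    """Designate the image as Tbc when the branch score exceeds the threshold."""
    score = clf.predict_proba(image.reshape(1, -1))[0, 1]
    return score > THRESHOLD
```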
  • the matching unit 24 employs a plurality of actual endoscope path images Tp beyond the branch position, that is, actual endoscope path images Tp in the direction of movement of the leading end 3 B of the endoscope from the branch position.
  • By designating the actual endoscope branch image Tbc in advance as in the second embodiment, the matching process can be performed using only the actual endoscope images T 0 at and beyond the latest actual endoscope branch image Tbc which are stored in the storage 13 . Accordingly, by designating the actual endoscope branch image Tbc in advance and only storing actual endoscope images T 0 at and beyond the latest actual endoscope branch image Tbc, the storage capacity for the actual endoscope images T 0 within the storage 13 can be decreased.
  • the processes for identifying the actual endoscope branch image Tbc may be performed at predetermined temporal intervals, every few seconds, for example. Thereby, the amount of calculations to be performed by the endoscope position identifying apparatus 6 can be reduced.
  • designation of the actual endoscope branch image Tbc in the second embodiment is not limited to a technique that employs classifiers, and a template matching technique may be employed as an alternative.
  • one or a plurality of images that include typical branch structures may be employed as templates, and correlative values may be calculated for the actual endoscope images T 0 by performing template matching.
  • An actual endoscope image T 0 for which the correlative value exceeds a predetermined threshold may be designated as the actual endoscope branch image Tbc.
  • a plurality of virtual endoscope branch images Kb may be employed as templates when performing template matching.
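A minimal sketch of this template matching alternative, reusing the `ncc` helper from the earlier sketch; the threshold value is an assumption for illustration:

```python
TEMPLATE_THRESHOLD = 0.6  # illustrative correlative-value threshold


def designate_by_template(actual_image, templates):
    """templates: typical branch images, or the virtual branch images Kb."""
    best = max(ncc(actual_image, t) for t in templates)
    return best > TEMPLATE_THRESHOLD  # True: designate as Tbc
```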
  • In this case, the corresponding virtual endoscope image determining unit 23 can immediately determine the virtual endoscope branch image Kb which had the highest correlative value to be the corresponding virtual endoscope image Kbc. For this reason, the amount of calculations required for the processes to be performed by the corresponding virtual endoscope image determining unit 23 can be reduced.
  • Further, the actual endoscope branch image Tbc may be designated by displaying actual endoscope images T 0 on the display 14 , and by receiving, from an operator via the input section 15 , input of a result of determination regarding whether a displayed actual endoscope image T 0 is the actual endoscope branch image Tbc.
  • a plurality of virtual endoscope branch images Kb may be displayed in addition to an actual endoscope image T 0 , as illustrated in FIG. 10 .
  • the operator is enabled to compare the actual endoscope image T 0 with the virtual endoscope branch images Kb, and to determine whether the displayed actual endoscope image T 0 is the actual endoscope branch image Tbc.
  • In this case, the corresponding virtual endoscope image Kbc can also be determined in addition to the actual endoscope branch image Tbc.
  • Next, a third embodiment of the present disclosure will be described. The configuration of the endoscope position identifying apparatus according to the third embodiment is the same as those of the endoscope position identifying apparatuses according to the first and second embodiments described above, and only the processes performed thereby differ. Therefore, detailed descriptions of the components of the apparatus will be omitted.
  • In the first and second embodiments, the virtual endoscope path images Kp are generated by the virtual endoscope image generating unit 22 in advance.
  • In contrast, in the third embodiment, the virtual endoscope image generating unit 22 generates only the virtual endoscope branch images Kb in advance, and generates the virtual endoscope path images Kp after a corresponding virtual endoscope image Kbc is determined.
  • the virtual endoscope path images Kp are generated for each of a plurality of paths which are present in the direction of movement of the leading end 3 B of the endoscope from the branch point to which the determined corresponding virtual endoscope image Kbc is linked.
  • FIG. 11 is a flow chart that illustrates the processes which are performed by the third embodiment. Note that here, it is assumed that a three dimensional image V 0 has been obtained by the image obtaining unit 21 and is stored in the storage 13 .
  • the image obtaining unit 21 obtains actual endoscope images T 0 (step ST 21 ).
  • the virtual endoscope image generating unit 22 generates a bronchial tube image from a three dimensional image V 0 (step ST 22 ), and further generates only a plurality of virtual endoscope branch images Kb from the three dimensional image V 0 (step ST 23 ).
  • the virtual endoscope image generating unit 22 links the virtual endoscope branch images Kb with branch points within the graph structure of bronchial tubes, and stores the linked virtual endoscope branch images Kb in the storage 13 (step ST 24 ).
  • The corresponding virtual endoscope image determining unit 23 obtains a plurality of actual endoscope images Tni, which were obtained within a predetermined amount of time prior to the current point in time, from the storage 13 (step ST 25 ). Then, the corresponding virtual endoscope image determining unit 23 compares the plurality of actual endoscope images Tni and the plurality of virtual endoscope branch images Kb, and determines a corresponding virtual endoscope image Kbc that corresponds to the branch structure closest to the current position of the leading end 3 B of the endoscope, through which the leading end 3 B of the endoscope has passed (step ST 26 ).
  • the virtual endoscope image generating unit 22 generates virtual endoscope path images Kp for each of a plurality of paths which are present in the direction of movement of the leading end 3 B of the endoscope from the branch point to which the determined corresponding virtual endoscope image Kbc is linked (step ST 27 ).
  • the matching unit 24 performs matching between a plurality of actual endoscope path images Tp and the plurality of virtual endoscope path images Kp for each of the plurality of paths (step ST 28 ), the position identifying unit 25 identifies the path in which the current position of the leading end 3 B of the endoscope is located (step ST 29 ), and the determining unit 27 determines whether the leading end 3 B of the endoscope is positioned along a desired path within the bronchial tubes (Desired Path Determination: step ST 30 ). Further, the identification result and the determination result are displayed on the display 14 (step ST 31 ), and the process ends.
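Read as code, steps ST 21 through ST 31 are a fixed sequence of calls into the units described above. The sketch below only fixes the order of operations; every collaborator and method name is a hypothetical stand-in:

    def third_embodiment_pipeline(obtainer, generator, determiner,
                                  matcher, identifier, judge, display):
        obtainer.obtain_actual_images()                        # ST21
        generator.generate_bronchial_tube_image()              # ST22
        kb = generator.generate_branch_images_only()           # ST23
        generator.link_and_store(kb)                           # ST24
        tni = obtainer.recent_actual_images()                  # ST25
        kbc = determiner.determine_corresponding(tni, kb)      # ST26
        kp = generator.generate_path_images_from(kbc)          # ST27
        scores = matcher.match_path_images(kp)                 # ST28
        path = identifier.identify_current_path(scores)        # ST29
        on_desired = judge.is_on_desired_path(path)            # ST30
        display.show(path, on_desired)                         # ST31
        return path, on_desired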
  • the virtual endoscope path images Kp are generated after the corresponding virtual endoscope image Kbc is determined in the third embodiment. Therefore, the volume of data of the virtual endoscope images K 0 which are stored in the storage 13 can be decreased compared to the first and second embodiments. In addition, the amount of calculations to be performed by the virtual endoscope image generating unit 22 can be reduced, because it is not necessary to generate virtual endoscope path images Kp for all paths within the bronchial tubes.
  • the current position of the leading end 3 B of the endoscope is identified when the endoscope is inserted toward a target position within the bronchial tubes from a branch position.
  • the present disclosure may be applied to cases in which a plurality of target positions are present in the direction of movement of the leading end 3 B of the endoscope from the branch position.
  • the present disclosure may be applied to a case in which a first target position at the end of a first path from a branch position is observed, the leading end 3 B of the endoscope returns to the branch position, and then a second target position at the end of a second path different from the first path is observed.
  • the determining unit 27 may designate the second path as the desired path and determine whether the leading end 3 B of the endoscope is positioned along the desired path, in order to guide the leading end 3 B of the endoscope to the second target position after the first target position is observed.
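A sketch of that bookkeeping for multiple target positions sharing a branch position is given below; the target and path identifiers are hypothetical:

    def next_desired_path(targets, observed):
        # `targets` maps target_id -> path_id leading to it from the shared
        # branch position, in the order the targets are to be observed.
        for target_id, path_id in targets.items():
            if target_id not in observed:
                return path_id
        return None  # every target position has been observed

    # Example: after the first target is observed, the desired path
    # switches to the one leading to the second target.
    targets = {"target1": "path1", "target2": "path2"}
    assert next_desired_path(targets, observed={"target1"}) == "path2"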
  • the bronchial tube image is extracted from the three dimensional image V 0 , and the virtual endoscope images K 0 are generated employing the bronchial tube image.
  • the virtual endoscope images K 0 may be generated from the three dimensional image V 0 without extracting the bronchial tube image.
  • the current position of the leading end 3 B of the endoscope is identified when the endoscope is actually inserted into a subject.
  • the actual endoscope images T 0 may be saved, and the saved actual endoscope images T 0 may be employed to investigate whether insertion of the endoscope was performed correctly, or utilized in education regarding insertion of endoscopes.
  • Whether an endoscope is positioned along a desired path within a lumen structure is determined based on the position identification result. Therefore, whether the endoscope is within an incorrect path or a correct path after passing through a branch structure can be more easily recognized.
  • a corresponding virtual endoscope image is determined by comparing at least one actual endoscope image with a virtual endoscope branch image generated at a branch viewpoint position toward the side of the direction of movement of the endoscope from a corresponding branch viewpoint position, which corresponds to a branch structure of the lumen structure through which the endoscope has passed.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Physics & Mathematics (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Optics & Photonics (AREA)
  • Pulmonology (AREA)
  • Signal Processing (AREA)
  • Otolaryngology (AREA)
  • Physiology (AREA)
  • Gynecology & Obstetrics (AREA)
  • Human Computer Interaction (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Endoscopes (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)
US15/265,386 2015-09-16 2016-09-14 Endoscope position identifying apparatus, endoscope position identifying method, and recording medium having an endoscope position identifying program recorded therein Active 2037-04-17 US10561338B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-182839 2015-09-16
JP2015182839A JP6594133B2 (ja) 2015-09-16 2015-09-16 Endoscope position identifying apparatus, operation method of endoscope position identifying apparatus, and endoscope position identifying program

Publications (2)

Publication Number Publication Date
US20170071504A1 US20170071504A1 (en) 2017-03-16
US10561338B2 true US10561338B2 (en) 2020-02-18

Family

ID=58259976

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/265,386 Active 2037-04-17 US10561338B2 (en) 2015-09-16 2016-09-14 Endoscope position identifying apparatus, endoscope position identifying method, and recording medium having an endoscope position identifying program recorded therein

Country Status (2)

Country Link
US (1) US10561338B2 (ja)
JP (1) JP6594133B2 (ja)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4070723A1 (en) * 2015-09-18 2022-10-12 Auris Health, Inc. Navigation of tubular networks
WO2018188466A1 (en) * 2017-04-12 2018-10-18 Bio-Medical Engineering (HK) Limited Automated steering systems and methods for a robotic endoscope
EP3612121A4 (en) * 2017-04-18 2021-04-07 Intuitive Surgical Operations, Inc. GRAPHIC USER INTERFACE TO MONITOR AN IMAGE GUIDED PROCEDURE
KR102014359B1 (ko) * 2018-02-20 2019-08-26 (주)휴톰 Method and apparatus for providing camera location based on surgical images
CN110831538B (zh) * 2018-05-31 2023-01-24 奥瑞斯健康公司 Image-based airway analysis and mapping
WO2020059377A1 (ja) * 2018-09-20 2020-03-26 日本電気株式会社 Position estimation device, position estimation method, and computer-readable recording medium
CN113518576A (zh) * 2019-03-25 2021-10-19 奥林巴斯株式会社 Movement assistance system, movement assistance method, and movement assistance program
MX2021012314A (es) * 2019-04-11 2021-11-12 Univ Pittsburgh Commonwealth Sys Higher Education Minimally invasive cell transplantation procedure to induce the development of in vivo organogenesis.
US10799090B1 (en) 2019-06-13 2020-10-13 Verb Surgical Inc. Method and system for automatically turning on/off a light source for an endoscope during a surgery
US20220084194A1 (en) * 2019-08-02 2022-03-17 Hoya Corporation Computer program, processor for endoscope, and information processing method
JP2021049314A (ja) * 2019-12-04 2021-04-01 株式会社Micotoテクノロジー Endoscope image processing system
CN112641514B (zh) * 2020-12-17 2022-10-18 杭州堃博生物科技有限公司 Minimally invasive interventional navigation system and method
CN116433874A (zh) * 2021-12-31 2023-07-14 杭州堃博生物科技有限公司 Bronchoscope navigation method, apparatus, device, and storage medium
WO2023172109A1 (ko) * 2022-03-11 2023-09-14 주식회사 림사이언스 Method, system, and non-transitory computer-readable recording medium for managing needle path by using digital twin

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004010857A1 (ja) * 2002-07-31 2004-02-05 Olympus Corporation Endoscope device
WO2004023986A1 (ja) * 2002-08-30 2004-03-25 Olympus Corporation Medical treatment system, endoscope system, endoscope insertion operation program, and endoscope device
US20090161927A1 (en) * 2006-05-02 2009-06-25 National University Corporation Nagoya University Medical Image Observation Assisting System
JP2010220742A (ja) 2009-03-23 2010-10-07 Fujifilm Corp Image processing apparatus and method, and program
US20110184238A1 (en) * 2010-01-28 2011-07-28 The Penn State Research Foundation Image-based global registration system and method applicable to bronchoscopy guidance
WO2011102012A1 (ja) 2010-02-22 2011-08-25 オリンパスメディカルシステムズ株式会社 Medical device
US20110282151A1 (en) * 2008-10-20 2011-11-17 Koninklijke Philips Electronics N.V. Image-based localization method and system
US20120203065A1 (en) * 2011-02-04 2012-08-09 The Penn State Research Foundation Global and semi-global registration for image-based bronchoscopy guidance
US20120203067A1 (en) * 2011-02-04 2012-08-09 The Penn State Research Foundation Method and device for determining the location of an endoscope
US20120287238A1 (en) * 2011-01-24 2012-11-15 Olympus Medical Systems Corp. Medical device
WO2013111535A1 (ja) * 2012-01-24 2013-08-01 富士フイルム株式会社 Diagnostic endoscopic imaging support apparatus and method, and program
WO2013150650A1 (ja) 2012-04-06 2013-10-10 不二精工 株式会社 Bead ring gripping device
US20140180063A1 (en) * 2012-10-12 2014-06-26 Intuitive Surgical Operations, Inc. Determining position of medical device in branched anatomical structure
WO2014156378A1 (ja) * 2013-03-27 2014-10-02 オリンパスメディカルシステムズ株式会社 Endoscope system
US9326660B2 (en) * 2013-03-12 2016-05-03 Olympus Corporation Endoscope system with insertion support apparatus
US9830737B2 (en) * 2012-09-26 2017-11-28 Fujifilm Corporation Virtual endoscopic image generation device, method, and medium containing program

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4171833B2 (ja) * 2002-03-19 2008-10-29 国立大学法人東京工業大学 Endoscope guiding device and method
JP4245880B2 (ja) * 2002-08-30 2009-04-02 オリンパス株式会社 Endoscope device
JP4922107B2 (ja) * 2007-09-03 2012-04-25 オリンパスメディカルシステムズ株式会社 Endoscope device
JP5748520B2 (ja) * 2011-03-25 2015-07-15 富士フイルム株式会社 Endoscope insertion support device, operating method thereof, and endoscope insertion support program

Patent Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7641609B2 (en) * 2002-07-31 2010-01-05 Olympus Corporation Endoscope device and navigation method for endoscope device
WO2004010857A1 (ja) * 2002-07-31 2004-02-05 Olympus Corporation Endoscope device
WO2004023986A1 (ja) * 2002-08-30 2004-03-25 Olympus Corporation Medical treatment system, endoscope system, endoscope insertion operation program, and endoscope device
US20090161927A1 (en) * 2006-05-02 2009-06-25 National University Corporation Nagoya University Medical Image Observation Assisting System
US20110282151A1 (en) * 2008-10-20 2011-11-17 Koninklijke Philips Electronics N.V. Image-based localization method and system
JP2012505695A (ja) 2008-10-20 2012-03-08 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Image-based localization method and system
JP2010220742A (ja) 2009-03-23 2010-10-07 Fujifilm Corp Image processing apparatus and method, and program
US20110184238A1 (en) * 2010-01-28 2011-07-28 The Penn State Research Foundation Image-based global registration system and method applicable to bronchoscopy guidance
JP2013517909A (ja) 2010-01-28 2013-05-20 ザ ペン ステイト リサーチ ファンデーション Image-based global registration applied to bronchoscopy guidance
US8102416B2 (en) * 2010-02-22 2012-01-24 Olympus Medical Systems Corp. Medical apparatus
WO2011102012A1 (ja) 2010-02-22 2011-08-25 オリンパスメディカルシステムズ株式会社 Medical device
US20110234780A1 (en) 2010-02-22 2011-09-29 Olympus Medical Systems Corp. Medical apparatus
US20120287238A1 (en) * 2011-01-24 2012-11-15 Olympus Medical Systems Corp. Medical device
US20120203065A1 (en) * 2011-02-04 2012-08-09 The Penn State Research Foundation Global and semi-global registration for image-based bronchoscopy guidance
US20120203067A1 (en) * 2011-02-04 2012-08-09 The Penn State Research Foundation Method and device for determining the location of an endoscope
WO2013111535A1 (ja) * 2012-01-24 2013-08-01 富士フイルム株式会社 Diagnostic endoscopic imaging support apparatus and method, and program
US20140336501A1 (en) * 2012-01-24 2014-11-13 Fujifilm Corporation Diagnostic endoscopic imaging support apparatus and method, and non-transitory computer readable medium on which is recorded diagnostic endoscopic imaging support program
WO2013150650A1 (ja) 2012-04-06 2013-10-10 不二精工 株式会社 Bead ring gripping device
US9830737B2 (en) * 2012-09-26 2017-11-28 Fujifilm Corporation Virtual endoscopic image generation device, method, and medium containing program
US20140180063A1 (en) * 2012-10-12 2014-06-26 Intuitive Surgical Operations, Inc. Determining position of medical device in branched anatomical structure
US9326660B2 (en) * 2013-03-12 2016-05-03 Olympus Corporation Endoscope system with insertion support apparatus
WO2014156378A1 (ja) * 2013-03-27 2014-10-02 オリンパスメディカルシステムズ株式会社 Endoscope system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Office Action dated Jan. 29, 2019 in Japanese Patent Application No. 2015-182839 (with English translation).

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11625825B2 (en) 2019-01-30 2023-04-11 Covidien Lp Method for displaying tumor location within endoscopic images
US11944422B2 (en) 2019-08-30 2024-04-02 Auris Health, Inc. Image reliability determination for instrument localization

Also Published As

Publication number Publication date
JP6594133B2 (ja) 2019-10-23
US20170071504A1 (en) 2017-03-16
JP2017055954A (ja) 2017-03-23

Similar Documents

Publication Publication Date Title
US10561338B2 (en) Endoscope position identifying apparatus, endoscope position identifying method, and recording medium having an endoscope position identifying program recorded therein
US20170296032A1 (en) Branching structure determination apparatus, method, and program
US20170340241A1 (en) Endoscopic examination support device, endoscopic examination support method, and endoscopic examination support program
US10085672B2 (en) Diagnostic endoscopic imaging support apparatus and method, and non-transitory computer readable medium on which is recorded diagnostic endoscopic imaging support program
EP2965263B1 (en) Multimodal segmentation in intravascular images
US10092216B2 (en) Device, method, and non-transitory computer-readable medium for identifying body part imaged by endoscope
US8767057B2 (en) Image processing device, image processing method, and program
JP6254053B2 (ja) 2017-12-27 Endoscopic image diagnosis support apparatus, system and program, and operating method of endoscopic image diagnosis support apparatus
US9723971B2 (en) Image processing apparatus, method, and program
JP5785120B2 (ja) 2015-09-30 Medical image diagnosis support apparatus and method, and program
US20180263527A1 (en) Endoscope position specifying device, method, and program
JP2013153883A (ja) 2013-08-15 Image processing apparatus, imaging system, and image processing method
US10939800B2 (en) Examination support device, examination support method, and examination support program
JP7270658B2 (ja) 2023-05-10 Image recording device, operation method of image recording device, and image recording program
US11847730B2 (en) Orientation detection in fluoroscopic images
US20200020127A1 (en) Examination support device, examination support method, and examination support program
JP7248098B2 (ja) 2023-03-29 Inspection device, inspection method, and storage medium
US11056149B2 (en) Medical image storage and reproduction apparatus, method, and program
JP6199267B2 (ja) 2017-09-20 Endoscope image display device, operation method thereof, and program
US20180263712A1 (en) Endoscope position specifying device, method, and program
US11003946B2 (en) Examination support device, examination support method, and examination support program
US20230419517A1 (en) Shape measurement system for endoscope and shape measurement method for endoscope

Legal Events

Date Code Title Description
AS Assignment

Owner name: FUJIFILM CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WANG, CAIHUA;REEL/FRAME:039794/0381

Effective date: 20160406

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4