CN107610109A - Image display method, apparatus and system for endoscopic minimally invasive surgery navigation - Google Patents

Image display method, apparatus and system for endoscopic minimally invasive surgery navigation

Info

Publication number
CN107610109A
CN107610109A (application CN201710795516.3A)
Authority
CN
China
Prior art keywords
endoscope
images
image
minimally invasive
cube
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710795516.3A
Other languages
Chinese (zh)
Inventor
杨峰 (Yang Feng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ari Mai Di Medical Technology (Beijing) Co., Ltd.
Original Assignee
Ari Mai Di Medical Technology (Beijing) Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ari Mai Di Medical Technology (Beijing) Co., Ltd.
Priority to CN201710795516.3A
Publication of CN107610109A
Legal status: Pending


Landscapes

  • Endoscopes (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The present invention discloses an image display method, apparatus and system for endoscopic minimally invasive surgery navigation. The image display method includes: acquiring CT images, and acquiring endoscopic images in real time; acquiring the position and direction of the endoscope tip in real time; performing cube cropping on the CT images according to the position and direction of the endoscope tip, to obtain cropped cube data; performing differentiated rendering on the cropped cube data using a distance-weighted ray casting method, to obtain rendered cube data; and fusing the rendered cube data with the endoscopic images to obtain a virtual-real fusion image, which is displayed. The image display apparatus and system use the image display method to realize image display during surgical navigation.

Description

Image display method, apparatus and system for endoscopic minimally invasive surgery navigation
Technical field
The present invention relates to the field of medical technology, and more particularly to an image display method, apparatus and system for endoscopic minimally invasive surgery navigation.
Background art
Skull base tumors are difficult to resect completely because they occur at deep locations, their adjacent structures are complex and hard to distinguish, and their diagnosis and treatment involve multiple disciplines such as neurosurgery, otorhinolaryngology, and head and neck oncology. Over more than a hundred years of development, skull base tumor treatment has evolved from naked-eye open craniotomy to the endoscopic minimally invasive stage. Endoscopic minimally invasive techniques offer concise procedures and fast postoperative recovery; guided by endoscopic images, the surgical approach can avoid damage to facial skin structures, thereby reducing the probability of various complications.
At present, conventional nasal and paranasal sinus malignant tumor operations and skull base tumor operations use simple endoscopic video image navigation. Surgical navigation systems integrating medical CT image data can generally provide accurate three-view information and simultaneously display the endoscope profile in the three views. However, the auxiliary information that the three views of CT data used in surgical navigation provide to the doctor is still limited and leaves room for improvement.
Summary of the invention
To solve the above technical problems, the present invention proposes an image display method, apparatus and system for endoscopic minimally invasive surgery navigation.
An embodiment of the present invention provides an image display method for endoscopic minimally invasive surgery navigation, comprising the following steps:
S1, acquiring CT images, and acquiring endoscopic images in real time;
S2, acquiring the position and direction of the endoscope tip in real time;
S3, performing cube cropping on the CT images according to the position and direction of the endoscope tip, to obtain cropped cube data;
S4, performing differentiated rendering on the cropped cube data using a distance-weighted ray casting method, to obtain rendered cube data;
S5, fusing the rendered cube data with the endoscopic images to obtain a virtual-real fusion image, and displaying it.
Optionally, before step S3 the method further comprises:
S6, performing 3D segmentation of predetermined critical anatomical structures in the CT images using region growing and fast marching methods, and labeling the 3D-segmented critical anatomical structures.
Optionally, after step S6 the method further comprises:
S7, performing color mapping on the critical anatomical structures obtained by the 3D segmentation.
Optionally, before step S3 the method further comprises:
S8, performing registration between the CT images and the patient's pose, to obtain registered CT images for the cube cropping of step S3.
Optionally, after step S8 the method further comprises:
S9, obtaining and displaying, according to the registered CT images and the position and direction of the endoscope tip, the relative position between the endoscope and the patient's body and the distance between the endoscope and the surgical target.
Optionally, before step S5 the method further comprises:
S10, applying to the endoscopic images a transparency mapping based on distance from the image center, and applying edge attenuation to the transparency-mapped endoscopic images, so that the edge-attenuated endoscopic images are fused with the cropped cube data.
Optionally, before step S10 the method further comprises:
S11, performing distortion correction on the endoscopic images.
Optionally, step S2 specifically includes: obtaining, by an optical tracking device, the positions of predetermined marker points on the operation tool of the endoscope, and calculating the position and direction of the endoscope tip from the positions of the predetermined marker points; the endoscope tip is the end inserted into the patient's body.
Optionally, step S3 specifically includes:
performing cube cropping on the CT images according to the position and direction of the endoscope; in the cube cropped from the CT images, one edge of the cube is formed by taking the endoscope axis as the depth direction with depth d, starting from the focal plane of the endoscope, while the other two edges m and n of the cube are set according to the size of the display range.
An embodiment of the present invention provides an image display apparatus for endoscopic minimally invasive surgery navigation, including a display screen, a processor and a data interface. The data interface is used to connect the endoscope and the CT equipment, to acquire the endoscopic images and the preoperative CT images; the processor is used to execute the image display method for endoscopic minimally invasive surgery navigation of any of the above embodiments, to obtain the virtual-real fusion image; and the display screen is used to display the virtual-real fusion image obtained by the processor.
Optionally, the processor includes a CPU processing unit and a GPU processing unit, wherein the CPU processing unit is used for the registration of the CT images with the patient's pose and for the 3D segmentation of the critical anatomical structures, and the GPU processing unit is used for the cube cropping of the CT images, the distance-weighted rendering of the cube data, and the edge attenuation of the endoscopic images.
Optionally, the processor is further used to obtain, according to the real-time position of the endoscope, the corresponding virtual-real fusion image and the relative position view of the endoscope and the patient's body, and to update them to the display screen for display.
An embodiment of the present invention provides an endoscopic minimally invasive surgery navigation system, including a computer device and an optical tracking device. The optical tracking device is used to acquire the position of the endoscopic surgical instrument in real time and to track the patient's pose. The computer device is used to acquire the endoscopic images and the CT images and, combining the position information tracked by the optical tracking device, to process the endoscopic images and the CT images using the image display method of any of the above embodiments, thereby obtaining and displaying the virtual-real fusion image of the endoscopic images and the CT images.
Optionally, the computer device includes the image display apparatus of any of the above embodiments.
Optionally, the endoscopic minimally invasive surgery navigation system is applied to navigation of nasal and paranasal sinus malignant tumor surgery and skull base tumor surgery.
In summary, compared with conventional endoscopic surgery navigation display methods, the image display method for endoscopic minimally invasive surgery navigation described above has the following advantages:
(1) The virtual-real fusion image of the embodiments of the present invention not only displays the images detected by the endoscope in real time, but also renders the cropped cube data differentially in a distance-weighted manner. This reduces computational complexity and accelerates rendering while providing more accurate depth perception, more effectively conveying the relative relationships between anatomical structures, and letting the doctor judge occlusion and front-to-back relationships of anatomical structures more clearly, thus offering the doctor more accurate diagnostic and treatment assistance;
(2) The endoscopic images are processed with a Gaussian edge attenuation algorithm, achieving a seamless, visually smooth transition in the virtual-real fusion of the endoscopic images with the CT images. The structures visible to the naked eye in the endoscopic images are matched and transitioned to the well-formed reconstruction; compared with conventional endoscopic images, the periphery is expanded so that more structural information can be shown, and lesion information behind the endoscopic view can be displayed in the same view, clearly improving the prompting effect of the real-time images in surgical navigation;
(3) The layered virtual-real fusion rendering realizes augmented reality guidance of the field-of-view display region of the endoscope, and a positioning cube is employed for cut-away display and rendering of the region; its position changes with the position and direction of the endoscope, improving distance perception and scene immersion.
Brief description of the drawings
Fig. 1 is a schematic flowchart of the image display method for endoscopic minimally invasive surgery navigation according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of tracking the position of the endoscopic surgical instrument with an optical tracking device during endoscopic surgery navigation according to the present invention;
Fig. 3 is a schematic diagram of cube cropping of the CT images according to the endoscope position during endoscopic surgery navigation according to the present invention;
Fig. 4 is a schematic flowchart of processing the endoscopic images in the image display method for endoscopic minimally invasive surgery navigation according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of edge Gaussian attenuation and transparency mapping of the endoscopic images according to an embodiment of the present invention;
Fig. 6 is a schematic flowchart of processing the CT images in the image display method for endoscopic minimally invasive surgery navigation according to an embodiment of the present invention;
Fig. 7 is a schematic flowchart of processing the CT images in the image display method for endoscopic minimally invasive surgery navigation according to another embodiment of the present invention;
Fig. 8 is a schematic diagram of the image display interface for endoscopic minimally invasive surgery navigation according to an embodiment of the present invention;
Fig. 9 is a functional block diagram of the image display apparatus for endoscopic minimally invasive surgery navigation according to an embodiment of the present invention;
Fig. 10 is a detailed functional block diagram of the processor in the image display apparatus for endoscopic minimally invasive surgery navigation according to an embodiment of the present invention;
Fig. 11 is a schematic structural diagram of the endoscopic minimally invasive surgery navigation system according to an embodiment of the present invention.
Detailed description of the embodiments
Embodiments of the invention are described below with reference to the accompanying drawings. Elements and features described in one drawing or one embodiment of the present invention may be combined with elements and features shown in one or more other drawings or embodiments. It should be noted that, for clarity, the drawings and descriptions omit representations and descriptions of components and processes that are unrelated to the invention and known to those of ordinary skill in the art.
The present invention is further described below with reference to the drawings.
An embodiment of the present invention provides an image display method for use during endoscopic minimally invasive surgery navigation. The endoscopic minimally invasive surgery includes, but is not limited to, nasal and paranasal sinus malignant tumor surgery and skull base tumor surgery, and may of course also include other operations using an endoscope.
Specifically, referring to Fig. 1, Fig. 1 shows the image display method during endoscopic minimally invasive surgery navigation according to an embodiment of the present invention. The image display method relates to virtual-real fusion augmented reality based on CT image navigation combined with endoscopic images, and comprises the following steps:
S101, acquiring endoscopic images and CT images;
The detecting lens of the endoscope is inserted into the patient's body to acquire endoscopic images. Since the endoscope may change position during the operation, the endoscopic images are acquired in real time. A predetermined body part of the patient, such as the head, is scanned preoperatively with CT equipment to acquire preoperative CT images; the CT images are three-dimensional images.
S102, acquiring the position and direction of the endoscope tip;
The endoscope tip is the end that the endoscope inserts into the patient's body, i.e., the detecting lens of the endoscope. Since the endoscope tip is inside the patient's body and its position and direction are difficult to acquire directly, they are obtained by transformation from the position of the endoscopic surgical instrument outside the body. As shown in Fig. 2, the operation tool 300 of the endoscope is provided with 4 marker points; the optical tracking device 200 tracks and monitors the 4 marker points to obtain their position information. The coordinate transformation between the two data sets can be registered by the following equation:
p_CT = R · p_track + t
where p_CT denotes the coordinate of a point in the CT data coordinate system, p_track denotes the coordinate of the corresponding point in the optical tracking device coordinate system, and R and t are the rotation matrix and the translation vector, respectively. From the position information of the 4 marker points, R and t can be calculated using the DLT (Direct Linear Transform) algorithm. In addition, the position and direction of the endoscope tip are acquired in real time in order to track changes in the endoscope position promptly, facilitating the image updates described below.
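As an illustration of this registration step, the following is a minimal sketch of recovering R and t from corresponding marker coordinates. The patent names the DLT algorithm; the sketch instead uses the standard SVD-based (Kabsch) least-squares solution as an illustrative stand-in, and all marker coordinates are hypothetical values.

```python
import numpy as np

def rigid_registration(p_track, p_ct):
    """Least-squares estimate of R, t with p_ct ~= R @ p_track + t,
    from corresponding 3D point sets of shape (N, 3), N >= 3."""
    c_track, c_ct = p_track.mean(axis=0), p_ct.mean(axis=0)
    H = (p_track - c_track).T @ (p_ct - c_ct)                    # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = c_ct - R @ c_track
    return R, t

# Hypothetical positions of the 4 marker points on the operation tool,
# in the tracker frame and in the CT frame (mm).
markers_track = np.array([[0, 0, 0], [50, 0, 0], [0, 40, 0], [0, 0, 30]], float)
markers_ct = markers_track + np.array([10.0, -5.0, 2.0])         # pure translation here
R, t = rigid_registration(markers_track, markers_ct)
```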
S103, performing cube cropping on the CT images according to the position of the endoscope tip, to obtain cropped cube data;
The parameters of the cube to be cropped are determined from the position information of the endoscope tip, a cube is constructed from those parameters, and the CT images are cropped accordingly to obtain the cube data. In one embodiment, referring to Fig. 3, the cropping cube parameters are as follows: within the space O_CT formed by the CT images, one edge of the cube is formed by starting from the focal plane O_V of the endoscope, taking the endoscope axis as the depth direction, with length d; the other two edges m and n of the cube are set according to the size of the endoscope display range. In this way, a cube is constructed from the determined parameters (i.e., the focal plane O_V and the 3 edges), the CT images are cropped according to the constructed cube, and the cropped cube data is obtained.
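For concreteness, a minimal sketch of the cropping step follows. It assumes the endoscope axis is aligned with the volume's z axis and that the focal-plane center has already been converted to a voxel index; an oblique endoscope pose would require resampling the volume along the tracked axis. All sizes are hypothetical.

```python
import numpy as np

def crop_cube(ct_volume, focal_idx, m, n, d):
    """Crop an m x n x d sub-volume whose front face (the endoscope
    focal plane) is centered on focal_idx and whose depth axis runs
    along z, matching the cube construction of Fig. 3."""
    x0, y0, z0 = focal_idx
    xs, ys = x0 - m // 2, y0 - n // 2     # center the m x n front face
    return ct_volume[xs:xs + m, ys:ys + n, z0:z0 + d]

ct = np.zeros((512, 512, 300), dtype=np.int16)   # placeholder CT volume
cube = crop_cube(ct, focal_idx=(256, 256, 100), m=128, n=128, d=64)
```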
S104, performing differentiated rendering on the cropped cube using the distance-weighted ray casting method, to obtain rendered cube data;
After the CT images are cropped by the cube constructed from the parameters in Fig. 3 and the cube data is obtained, differentiated rendering is applied to the cropped cube data using distance-weighted ray casting. Specifically, rays are cast starting from the front surface of the data cube (the focal plane O_V of the endoscope in Fig. 3); the distance to the rear surface of the data cube is d. As the distance increases (i.e., as the value along d grows), the sampling factor of each sample point on every ray, and hence its contribution to the rendered brightness, decreases, so that the cube data is rendered differentially according to the corresponding transparency value. With continued reference to Fig. 3, taking a surgical target at depth p as an example, the sampling weight factor of any point in the data cube under ray casting decreases with distance from the focal plane O_V: the farther a voxel is from the focal plane, the smaller its absorption factor in the ray casting function, so that anatomical structures at different positions within the cube data are rendered with visible distinction. The mapping from sampling location to sampling-factor transparency decreases monotonically with depth, where m, n and d are the edge lengths of the data cube and (x, y, z) is the coordinate position of the sampling location.
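The sketch below illustrates distance-weighted ray casting on the cropped cube. Since the explicit transparency formula is not reproduced above, the linear weight w(z) = 1 - z/d is an assumption standing in for any monotonically decreasing mapping; rays are composited front-to-back from the focal plane, and voxel values are assumed normalized to [0, 1].

```python
import numpy as np

def render_distance_weighted(cube, base_alpha=0.05):
    """Front-to-back compositing along the depth axis of an m x n x d
    cube, with each sample's opacity scaled by a depth weight."""
    m, n, d = cube.shape
    color = np.zeros((m, n))
    trans = np.ones((m, n))                  # accumulated transmittance per ray
    for z in range(d):
        w = 1.0 - z / d                      # assumed linear distance weight
        alpha = base_alpha * w * cube[:, :, z]
        color += trans * alpha * cube[:, :, z]
        trans *= 1.0 - alpha
    return color

rendered = render_distance_weighted(np.random.rand(128, 128, 64))
```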
Distance-weighted rendering refines the rendered structural texture while effectively improving the relative positional relationships between anatomical structures inside the cube, clearly presenting the anatomical structures and their position information.
S105, fusing the rendered cube data with the endoscopic images to obtain a virtual-real fusion image, and displaying it.
After the cube data is obtained and rendered, it is fused with the endoscopic images acquired in step S101 to obtain the virtual-real fusion image.
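A minimal sketch of the fusion step, assuming the rendered cube data has been projected to the same image plane and resolution as the endoscopic frame; the per-pixel alpha mask is the one produced by the transparency mapping and edge attenuation described below.

```python
import numpy as np

def fuse_virtual_real(endo_rgb, rendered_rgb, alpha):
    """Per-pixel alpha blend of the real endoscopic frame over the
    rendered CT cube; endo_rgb and rendered_rgb are H x W x 3 floats
    in [0, 1], alpha is H x W in [0, 1] (1 = pure endoscope pixel)."""
    return alpha[..., None] * endo_rgb + (1.0 - alpha[..., None]) * rendered_rgb
```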
The virtual-real fusion image of the embodiments of the present invention not only displays the images detected by the endoscope in real time, but also renders the cropped cube data differentially in a distance-weighted manner, which reduces computational complexity and accelerates rendering while providing more accurate depth perception, more effectively conveying the relative relationships between anatomical structures, letting the doctor judge occlusion and front-to-back relationships of anatomical structures more clearly, and offering the doctor more accurate diagnostic and treatment assistance.
Further, as shown in Fig. 4, after the endoscopic images are acquired in step S101, they are processed as follows:
Step S201, performing distortion correction on the endoscopic images;
After the endoscopic images are acquired, distortion correction is applied so that the heavily radially distorted endoscopic images can be recovered quickly, eliminating the mismatch with the real scene that image distortion would otherwise cause in the virtual-real fusion display.
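As an example of the correction, the sketch below uses OpenCV's standard pinhole undistortion; the intrinsic matrix and distortion coefficients are hypothetical and would in practice come from a prior calibration of the endoscope camera (e.g., cv2.calibrateCamera with a checkerboard).

```python
import cv2
import numpy as np

K = np.array([[800.0,   0.0, 320.0],             # hypothetical intrinsics
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
dist = np.array([-0.45, 0.25, 0.0, 0.0, 0.0])    # strong radial terms k1, k2

frame = cv2.imread("endoscope_frame.png")        # placeholder file name
undistorted = cv2.undistort(frame, K, dist)
```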
Step S202, applying to the endoscopic images a transparency mapping based on distance from the image center, and applying edge attenuation to the transparency-mapped endoscopic images.
The distortion-corrected endoscopic images are given a transparency mapping based on distance from the image center: taking the image center as the circle center and the radius as the mapping parameter, the farther a pixel is from the image center, the higher its transparency, i.e., the more transparent it is. This preserves the central region of the endoscopic image, so that when edge attenuation is applied to the endoscopic images, layered rendering can be realized, effectively improving the immersion of the fused display and making the fusion of the real foreground and the virtual background more lifelike.
Further, the edge attenuation is performed, for example but not limited to, with a Gaussian function. Fig. 5 shows a schematic diagram of edge Gaussian attenuation and transparency mapping applied to an endoscopic image. For an m × n endoscopic image, the distance of any point P(i, j) from the image center is r(i, j) = sqrt((i - m/2)² + (j - n/2)²), where 0 < i ≤ m-1 and 0 < j ≤ n-1. According to this distance from the image center, an opaque region of radius t can be set in the endoscopic image; with R the maximum image radius, the attenuation band is then R - t. The transparency of the attenuation band is then defined as a Gaussian function of the distance r that decays across the band from opaque at r = t toward transparent at r = R.
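The mask below sketches the combined mapping: fully opaque inside radius t, Gaussian falloff across the band t < r ≤ R, fully transparent outside. The patent specifies only a Gaussian-shaped decay, so the width parameter sigma is an assumption.

```python
import numpy as np

def edge_attenuation_mask(m, n, t, R, sigma=None):
    """Per-pixel alpha for an m x n endoscopic image: 1 inside the
    opaque core of radius t, Gaussian decay over the band up to R."""
    if sigma is None:
        sigma = (R - t) / 3.0                 # ~99.7% decayed at the rim (assumed)
    i, j = np.mgrid[0:m, 0:n]
    r = np.hypot(i - m / 2.0, j - n / 2.0)
    alpha = np.exp(-((r - t) ** 2) / (2.0 * sigma ** 2))
    alpha[r <= t] = 1.0                       # opaque central region
    alpha[r > R] = 0.0                        # outside the endoscope circle
    return alpha

mask = edge_attenuation_mask(480, 640, t=150.0, R=240.0)
```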
In the embodiments of the present invention, the endoscopic images are processed with the Gaussian edge attenuation algorithm, achieving a seamless, visually smooth transition in the virtual-real fusion of the endoscopic images with the CT images. The structures visible to the naked eye in the endoscopic images are matched and transitioned to the well-formed reconstruction; compared with conventional endoscopic images, the periphery is expanded so that more structural information can be shown, and lesion information behind the endoscopic view can be displayed in the same view, clearly improving the prompting effect of the real-time images in surgical navigation.
Further, as shown in Fig. 6, before the cube cropping of the CT images in step S103, the following processing is performed:
S301, performing 3D segmentation of predetermined critical anatomical structures in the CT images using region growing and fast marching methods, and labeling the 3D-segmented critical anatomical structures;
Taking the preoperatively acquired CT images as the reference, the predetermined critical anatomical structures are segmented in 3D using region growing and fast marching methods, and the 3D-segmented critical anatomical structures are labeled. The predetermined critical anatomical structures are determined by the specific surgical site, for example blood vessels, tumors and nerves, and their specific locations in the CT images are determined by the doctor.
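A minimal preoperative segmentation sketch using SimpleITK, which provides both building blocks named above (connected-threshold region growing and fast marching). The file name, seed point and thresholds are hypothetical and would be chosen per structure by the clinician.

```python
import SimpleITK as sitk

ct = sitk.ReadImage("preop_ct.nii.gz")             # placeholder CT series

seed = (120, 140, 60)                              # hypothetical voxel inside the tumor
tumor = sitk.ConnectedThreshold(ct, seedList=[seed], lower=30, upper=90)

# Fast marching on a speed image derived from the CT gradients; thresholding
# the arrival-time map can refine the region-grown result.
speed = sitk.BoundedReciprocal(sitk.GradientMagnitude(sitk.Cast(ct, sitk.sitkFloat32)))
fm = sitk.FastMarchingImageFilter()
fm.SetTrialPoints([seed])
fm.SetStoppingValue(200)
arrival = fm.Execute(speed)
refined = sitk.BinaryThreshold(arrival, 0, 100, insideValue=1, outsideValue=0)
```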
Further, since the predetermined critical anatomical structures have been 3D-segmented in the CT images, after the 3D-segmented critical anatomical structures undergo the rendering of step S104, the anatomical structures inside the cube data are displayed with differentiation, making it easy for the doctor to observe during the operation and to quickly identify the surgical target, such as the tumor to be resected.
S302, performing color mapping on the critical anatomical structures obtained by the 3D segmentation;
Color mapping is applied to the critical anatomical structures obtained by the 3D segmentation, for example red for blood vessels, green for tumors and yellow for nerves, so that the critical anatomical structures are differentiated more clearly in the image; this also speeds up the virtual-real fusion processing and guarantees the accuracy of distance perception during the fusion.
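A small lookup-table sketch of the color mapping; the label values are hypothetical, while the colors match the example given (red vessels, green tumor, yellow nerves).

```python
import numpy as np

COLOR_LUT = {1: (255, 0, 0),      # blood vessels: red
             2: (0, 255, 0),      # tumor: green
             3: (255, 255, 0)}    # nerves: yellow

def colorize_labels(label_volume):
    """Map a labeled segmentation volume to an RGB volume."""
    rgb = np.zeros(label_volume.shape + (3,), dtype=np.uint8)
    for label, color in COLOR_LUT.items():
        rgb[label_volume == label] = color
    return rgb
```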
When the color-mapped critical anatomical structures undergo the distance-weighted differentiated rendering of step S104, data farther from the focal plane of the endoscope is also rendered with color attenuation, i.e., the farther a structure is, the less easily it is observed. This more effectively conveys the relative relationships between critical anatomical structures, lets the doctor judge occlusion and front-to-back relationships between them more clearly, and offers the doctor more accurate diagnostic and treatment assistance.
S303, performing registration between the CT images and the patient's pose, to obtain registered CT images;
Specifically, positions corresponding to the critical anatomical structures in the CT images are determined from the predetermined critical anatomical structures and used as reference points. The optical tracking device then locates the marker points on the patient's body corresponding to those reference points, and the 3PCHM (3-Points Convex Hull Matching) fast registration method is used to compute the rotation matrix and translation vector between the CT images and the patient's pose, obtaining the transformed CT images.
S304, obtaining, according to the registered CT images and the position and direction of the endoscope tip, the relative position between the endoscope and the patient and the distance between the endoscope and the surgical target.
Since the registration of the patient's pose with the CT images unifies them into the same coordinate space, the relative position between the endoscope and the patient's body can be obtained from the real-time endoscope position acquired by the optical tracking device, and the distance between the endoscope and the surgical target (for example, the tumor to be resected) can also be determined, enabling the display of the relative position of the surgical instruments.
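Once everything shares one coordinate space, the tip-to-target distance reduces to a Euclidean norm, as in the short sketch below; the positions are hypothetical, with the target position taken from, e.g., the centroid of the segmented tumor.

```python
import numpy as np

tip_pos = np.array([12.3, -4.1, 87.0])      # endoscope tip in the common frame (mm)
target_pos = np.array([15.0, -2.5, 95.5])   # surgical target after registration (mm)
distance_mm = float(np.linalg.norm(target_pos - tip_pos))
```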
It should be noted that steps S303 and S301 are not limited to a particular order and can be executed in parallel.
Further, as shown in Fig. 7, Fig. 7 shows the virtual-real fusion display method for endoscopic minimally invasive surgery navigation according to a further embodiment of the present invention. In this embodiment, the following step is added after step S304 described above:
Step S305, performing orthogonal slicing of the CT images along the directions parallel and perpendicular to the endoscope, and applying differentiated rendering to the orthogonally sliced data using the distance-weighted ray casting method, to obtain orthogonal slice data for displaying the corresponding slice views.
By orthogonally slicing the CT images with the endoscope as the reference, the distance between the endoscope and the target location is displayed on the slicing planes, showing the position and posture of the endoscope and the operation tool more effectively and intuitively. In addition, applying distance-weighted ray casting to the orthogonally sliced data makes the distance between the endoscope and the target location clearer to see.
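A simplified sketch of the orthogonal slicing: with the endoscope axis assumed along the volume's z direction, the perpendicular slice is the plane at the tip depth and a parallel slice contains the axis; an oblique pose would require resampling along the tracked direction.

```python
import numpy as np

def orthogonal_slices(ct_volume, tip_idx):
    """Slices through the endoscope tip, parallel and perpendicular
    to the (z-aligned) endoscope axis."""
    x0, y0, z0 = tip_idx
    perpendicular = ct_volume[:, :, z0]   # plane perpendicular to the axis
    parallel = ct_volume[x0, :, :]        # one plane containing the axis
    return parallel, perpendicular

ct = np.zeros((512, 512, 300), dtype=np.int16)   # placeholder volume
par, perp = orthogonal_slices(ct, tip_idx=(256, 256, 100))
```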
As shown in Fig. 8, Fig. 8 shows the virtual-real fusion display interface for transnasal endoscopic surgery navigation according to an embodiment of the present invention. The display interface of Fig. 8 includes a view of the relative position of the endoscope and the patient's body, an axial-positioning slice view, a radial-positioning slice view, and a virtual-real fusion display view of the cropped cube data after distance-weighted differentiated rendering together with the endoscopic images after transparency mapping and edge attenuation. Each view in the display interface updates with changes in the position of the endoscope tip. Based on the display interface of Fig. 8, the distance and positional relationship between the endoscope and the target structure in the patient's body can be observed clearly and intuitively from the relative position view and the axial and radial positioning slice views. From the virtual-real fusion display view, the real-time cropped cube data rendered with distance-weighted differentiation, the endoscopic images with edge Gaussian attenuation and transparency mapping, and the color-mapped critical anatomical target information can all be observed simultaneously, while anatomical structures such as the nasal cavity in the endoscopic images extend naturally into the virtual scene, and the distance-weighted differentiated rendering provides effective prompting of the anatomical structures in the virtual scene.
It should be noted that since color cannot be displayed in the drawings, different lines are used instead; in the actual displayed images, different anatomical structures are shown in different colors. The axial-positioning and radial-positioning slice views shown in Fig. 8 are the orthogonal slice views obtained after the orthogonal slicing and rendering of the CT images in step S305.
In summary, compared with conventional endoscope navigation display methods, the virtual-real fusion display method for endoscopic minimally invasive surgery navigation described above has the following advantages:
(1) The virtual-real fusion image of the embodiments of the present invention not only displays the images detected by the endoscope in real time, but also renders the cropped cube data differentially in a distance-weighted manner, which reduces computational complexity and accelerates rendering while providing more accurate depth perception, more effectively conveying the relative relationships between anatomical structures, letting the doctor judge occlusion and front-to-back relationships of anatomical structures more clearly, and offering the doctor more accurate diagnostic and treatment assistance;
(2) The endoscopic images are processed in real time with the Gaussian edge attenuation algorithm, achieving a seamless, visually smooth transition in the virtual-real fusion of the endoscopic images with the CT images. The structures visible to the naked eye in the endoscopic images are matched and transitioned to the well-formed reconstruction; compared with conventional endoscopic images, the periphery is expanded so that more structural information can be shown, and lesion information behind the endoscopic view can be displayed in the same view, clearly improving the prompting effect of the real-time images in surgical navigation;
(3) Virtual-real fusion with layered rendering realizes augmented reality guidance of the field-of-view display region of the endoscope, and a positioning cube is employed for cut-away display and rendering of the region; its position changes with the position and direction of the endoscope, improving distance perception and scene immersion;
(4) Orthogonal slicing of the CT images along the directions parallel or perpendicular to the endoscope effectively avoids the display shortcomings of the three views with respect to distance, and the display of the relative position of the surgical instrument (such as the endoscope) with respect to the patient's body accurately prompts the distance relationship between the instrument and the body;
(5) The method realizes the display of the relative position view between the endoscope and the patient's body, the orthogonal slice views of the CT images referenced to the endoscope, and the virtual-real fusion display view of the endoscopic images and the CT images, so that the doctor can combine the views to accurately understand the endoscope position and the intraoperative process, improving the safety of endoscopic minimally invasive surgery.
Accordingly, the virtual-real fusion display method for endoscopic minimally invasive surgery navigation described above can be implemented in combined hardware and software, or in pure software code running on a computer. Specifically, as shown in Fig. 9, Fig. 9 shows the image display apparatus for endoscopic minimally invasive surgery navigation according to an embodiment of the present invention. The virtual-real fusion apparatus may include a display screen 10, a processor 20 and a data interface 30, wherein the data interface is used to connect the endoscope and the CT equipment, to acquire the endoscopic images and the CT images; the processor 20 is used to execute the minimally invasive surgery navigation virtual-real fusion display method of any one of the above embodiments, to obtain the virtual-real fusion image; and the display screen 10 is used to display the virtual-real fusion image obtained by the processor 20.
Further, as shown in Fig. 10, the processor 20 includes a CPU processing unit 21 and a GPU processing unit 22, wherein the CPU processing unit 21 mainly performs functions such as mathematical computation and image configuration, for example the registration of the CT images with the patient's pose and the 3D segmentation of the critical anatomical structures. Of course, the CPU processing unit also performs other processing, such as reading the endoscopic images and CT images from the data interface 30 and obtaining position information, such as the real-time position of the endoscope and the pose of the patient, from the optical tracking device 200.
The GPU processing unit 22 performs functions related to graphics processing, such as the cube cropping of the CT images, the distance-weighted rendering of the cube data, the transparency mapping and edge attenuation of the endoscopic images, and the orthogonal slicing of the CT images.
Further, the processor 20 is further used to obtain, according to the real-time position of the endoscope, the corresponding virtual-real fusion image, the relative position view of the endoscope and the patient's body, and the slice views of the orthogonal slicing of the CT images along the directions parallel and perpendicular to the endoscope, and to update them to the display screen 10 for display.
As shown in Fig. 11, an embodiment of the present invention further provides an endoscopic minimally invasive surgery navigation system, applied for example but not limited to nasal and paranasal sinus malignant tumor surgery and skull base tumor surgery navigation. The surgical navigation system specifically includes a computer device 100 and an optical tracking device 200. The optical tracking device 200 is used to acquire the position of the endoscopic surgical instrument 300 in real time and to track the patient's pose. The computer device 100 is used to acquire the endoscopic images and the CT images and, combining the position information tracked by the optical tracking device 200, to process the endoscopic images and the CT images using the image display method of any one of the above embodiments, so that the virtual-real fusion image of the endoscopic images and the CT images is obtained and displayed.
Optionally, the computer device includes the image display apparatus shown in Fig. 9.
It should be noted that the computer device in the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and of course also by hardware, though the former is the better implementation in many cases. Based on this understanding, the part of the technical solution of the embodiments of the present invention that in essence contributes over the prior art can be embodied in the form of a software product; that is, the method of any of the above embodiments is executed as a sequence of program instructions, and the computer software product executing the method is stored in a computer-readable storage medium (such as but not limited to ROM/RAM, magnetic disk, or optical disc) and includes instructions that cause a terminal device (which may be a computer, medical equipment, server, etc.) to execute the method of any embodiment of the present invention.
The foregoing are only preferred embodiments of the present invention and do not limit its scope; any equivalent structure or flow transformation made using the description and drawings of the present invention, or any direct or indirect application in other related technical fields, is likewise included within the protection scope of the present invention.

Claims (15)

1. An image display method for endoscopic minimally invasive surgery navigation, characterized by comprising the following steps:
S1, acquiring CT images, and acquiring endoscopic images in real time;
S2, acquiring the position and direction of the endoscope tip in real time;
S3, performing cube cropping on the CT images according to the position and direction of the endoscope tip, to obtain cropped cube data;
S4, performing differentiated rendering on the cropped cube data using a distance-weighted ray casting method, to obtain rendered cube data;
S5, fusing the rendered cube data with the endoscopic images to obtain a virtual-real fusion image, and displaying it.
2. The image display method for endoscopic minimally invasive surgery navigation as claimed in claim 1, characterized in that before step S3 the method further comprises:
S6, performing 3D segmentation of predetermined critical anatomical structures in the CT images using region growing and fast marching methods, and labeling the 3D-segmented critical anatomical structures.
3. The image display method for endoscopic minimally invasive surgery navigation as claimed in claim 2, characterized in that after step S6 the method further comprises:
S7, performing color mapping on the critical anatomical structures obtained by the 3D segmentation.
4. The image display method for endoscopic minimally invasive surgery navigation as claimed in claim 1, characterized in that before step S3 the method further comprises:
S8, performing registration between the CT images and the patient's pose, to obtain registered CT images for the cube cropping of step S3.
5. The image display method for endoscopic minimally invasive surgery navigation as claimed in claim 4, characterized in that after step S8 the method further comprises:
S9, obtaining and displaying, according to the registered CT images and the position and direction of the endoscope tip, the relative position between the endoscope and the patient's body and the distance between the endoscope and the surgical target.
6. The image display method for endoscopic minimally invasive surgery navigation as claimed in claim 1, characterized in that before step S5 the method further comprises:
S10, applying to the endoscopic images a transparency mapping based on distance from the image center, and applying edge attenuation to the transparency-mapped endoscopic images, so that the edge-attenuated endoscopic images are fused with the cropped cube data.
7. The image display method for endoscopic minimally invasive surgery navigation as claimed in claim 1, characterized in that before step S10 the method further comprises:
S11, performing distortion correction on the endoscopic images.
8. The image display method for endoscopic minimally invasive surgery navigation as claimed in claim 1, characterized in that step S2 specifically comprises:
obtaining, by an optical tracking device, the positions of predetermined marker points on the operation tool of the endoscope, and calculating the position and direction of the endoscope tip from the positions of the predetermined marker points; wherein the endoscope tip is the end inserted into the patient's body.
9. The image display method for endoscopic minimally invasive surgery navigation as claimed in claim 1, characterized in that step S3 specifically comprises:
performing cube cropping on the CT images according to the position and direction of the endoscope; wherein, in the cube cropped from the CT images, one edge of the cube is formed by taking the endoscope axis as the depth direction with depth d, starting from the focal plane of the endoscope, while the other two edges m and n of the cube are set according to the size of the display range.
10. An image display apparatus for endoscopic minimally invasive surgery navigation, characterized by including a display screen, a processor and a data interface; wherein the data interface is used to connect the endoscope and the CT equipment, to acquire endoscopic images and preoperative CT images; the processor is used to execute the image display method for endoscopic minimally invasive surgery navigation as claimed in any one of claims 1-9, to obtain a virtual-real fusion image; and the display screen is used to display the virtual-real fusion image obtained by the processor.
11. The image display apparatus for endoscopic minimally invasive surgery navigation as claimed in claim 10, characterized in that the processor includes a CPU processing unit and a GPU processing unit, wherein the CPU processing unit is used for the registration of the CT images with the patient's pose and the 3D segmentation of the critical anatomical structures, and the GPU processing unit is used for the cube cropping of the CT images, the distance-weighted rendering of the cube data, and the edge attenuation of the endoscopic images.
12. The image display apparatus for endoscopic minimally invasive surgery navigation as claimed in claim 11, characterized in that the processor is further used to obtain, according to the real-time position of the endoscope, the corresponding virtual-real fusion image and the relative position view of the endoscope and the patient's body, and to update them to the display screen for display.
13. An endoscopic minimally invasive surgery navigation system, characterized by including a computer device and an optical tracking device, wherein the optical tracking device is used to acquire the position of the endoscopic surgical instrument in real time and to track the patient's pose, and the computer device is used to acquire endoscopic images and CT images and, combining the position information tracked by the optical tracking device, to process the endoscopic images and the CT images using the image display method as claimed in any one of claims 1-9, so that the virtual-real fusion image of the endoscopic images and the CT images is obtained and displayed.
14. The endoscopic minimally invasive surgery navigation system as claimed in claim 13, characterized in that the computer device includes the image display apparatus as claimed in any one of claims 10-12.
15. The endoscopic minimally invasive surgery navigation system as claimed in claim 14, characterized in that the endoscopic minimally invasive surgery navigation system is applied to nasal and paranasal sinus malignant tumor surgery navigation and skull base tumor surgery navigation.
CN201710795516.3A 2017-09-06 2017-09-06 Image display method, apparatus and system for endoscopic minimally invasive surgery navigation Pending CN107610109A (en)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201710795516.3A | 2017-09-06 | 2017-09-06 | Image display method, apparatus and system for endoscopic minimally invasive surgery navigation


Publications (1)

Publication Number | Publication Date
CN107610109A | 2018-01-19

Family

ID=61057467


Country Status (1)

CN: CN107610109A (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN101322652A * | 2008-07-04 | 2008-12-17 | Zhejiang University | Computer simulation scaling biopsy method and apparatus
CN103313646A * | 2011-01-14 | 2013-09-18 | Koninklijke Philips Electronics N.V. | Virtual endoscopic imaging with high risk structure highlighting
CN102999902A * | 2012-11-13 | 2013-03-27 | Ruijin Hospital, Shanghai Jiao Tong University School of Medicine | Optical navigation positioning system based on CT registration results and navigation method thereof
CN103356155A * | 2013-06-24 | 2013-10-23 | Graduate School at Shenzhen, Tsinghua University | Virtual endoscope assisted cavity lesion examination system

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN109146829A * | 2018-06-20 | 2019-01-04 | Shenzhen University | Image fusion method, device, equipment and medium based on wavelet neural network
CN109223177A * | 2018-07-30 | 2019-01-18 | Ari Mai Di Medical Technology (Beijing) Co., Ltd. | Image display method, device, computer equipment and storage medium
CN113194866A * | 2018-10-03 | 2021-07-30 | CMR Surgical Ltd | Navigation assistance
CN109934831A * | 2019-03-18 | 2019-06-25 | Anhui Ziwei Dixing Digital Technology Co., Ltd. | Real-time surgical navigation method for tumor surgery based on indocyanine green fluorescence imaging
CN113646808A * | 2019-04-04 | 2021-11-12 | Centerline Biomedical, Inc. | Registration of a spatial tracking system with an augmented reality display
CN109998684A * | 2019-05-07 | 2019-07-12 | Ari Mai Di Technology Shijiazhuang Co., Ltd. | Guidance and early-warning method and device based on distance dynamic quantization
CN111789675A * | 2020-06-29 | 2020-10-20 | Beijing Tiantan Hospital, Capital Medical University | Intracranial hematoma operation positioning auxiliary method and device
CN111789675B * | 2020-06-29 | 2022-02-22 | Beijing Tiantan Hospital, Capital Medical University | Intracranial hematoma operation positioning auxiliary method and device
WO2022027878A1 * | 2020-08-04 | 2022-02-10 | Shenzhen Jingfeng Medical Technology Co., Ltd. | Image processing method for endoscope
CN112365417A * | 2020-11-10 | 2021-02-12 | Ezhou Institute of Industrial Technology, Huazhong University of Science and Technology | Confocal endoscope image correction and stitching method and device, and readable storage medium
CN112365417B * | 2020-11-10 | 2023-06-23 | Ezhou Institute of Industrial Technology, Huazhong University of Science and Technology | Confocal endoscope image correction and stitching method and device, and readable storage medium

Similar Documents

Publication Publication Date Title
CN107610109A (en) Method for displaying image, the apparatus and system of endoscope micro-wound navigation
CN107689045A (en) Method for displaying image, the apparatus and system of endoscope micro-wound navigation
US11883118B2 (en) Using augmented reality in surgical navigation
WO2017211087A1 (en) Endoscopic surgery navigation method and system
Bernhardt et al. The status of augmented reality in laparoscopic surgery as of 2016
CN110383345B (en) Flattened views for lumen navigation
Bichlmeier et al. Contextual anatomic mimesis hybrid in-situ visualization method for improving multi-sensory depth perception in medical augmented reality
Pessaux et al. Robotic duodenopancreatectomy assisted with augmented reality and real-time fluorescence guidance
RU2707369C1 (en) Method for preparing and performing a surgical operation using augmented reality and a complex of equipment for its implementation
CN110010249A (en) Augmented reality operation piloting method, system and electronic equipment based on video superposition
CN109259806A (en) A method of the accurate aspiration biopsy of tumour for image guidance
US20210065451A1 (en) Mixed reality system integrated with surgical navigation system
CN114145846B (en) Operation navigation method and system based on augmented reality assistance
CN103356155A (en) Virtual endoscope assisted cavity lesion examination system
EP2901935B1 (en) Method and device for generating virtual endoscope image, and program
US9808145B2 (en) Virtual endoscopic image generation device, method, and medium containing program
Kumar et al. Stereoscopic visualization of laparoscope image using depth information from 3D model
Zhu et al. A neuroendoscopic navigation system based on dual-mode augmented reality for minimally invasive surgical treatment of hypertensive intracerebral hemorrhage
Neubauer et al. STEPS-an application for simulation of transsphenoidal endonasal pituitary surgery
Li et al. A fully automatic surgical registration method for percutaneous abdominal puncture surgical navigation
CN112331311A (en) Method and device for fusion display of video and preoperative model in laparoscopic surgery
CN114931435B (en) Three-dimensional model processing method and device and electronic equipment
CN114334096A (en) Intraoperative auxiliary display method and device based on medical image and storage medium
Wu et al. Application of 3D navigation and vascular reconstruction technology in precision resection of lung segment
Palomar et al. MR in video guided liver surgery

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 2018-01-19